The BILLING method map in this example may describe the pricing
algorithm that should be used in this BILLING method (e.g., bill
$0.001 per byte of content released).
`
Block 1988 ("Map meter value to billing amount") functions in
`
`the same manner as block 1950 of the EVENT method; it maps
`
`the meter value to a billing value. Process step 1988 may also
`
`interrogate the secure database (as limited by the privacy filter)
`
`to determine if other objects or information (e.g., user
`
`information) are present as part of the BILLING method
`
`algorithm.
`
BILLING method 1980 may then write a BILLING audit trail if
required to a BILLING method Audit UDE (blocks 1990, 1992), and
may prepare to return the billing amount to the
`
`calling CONTROL method (or other control process). Before that,
`
however, BILLING method 1980 may test whether a billing
`
amount was determined (decision block 1994). If no billing
`
`amount was determined, then the BILLING method may be
`
failed (block 1996). This may occur if the user is not authorized
`
`to access the specific areas of the pricing table that the BILLING
`
`method MDE describes (e.g., you may purchase not more than
`
`$100.00 of information from this content object).
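The BILLING flow described above might be sketched as follows in Python. This is a hypothetical illustration only: the names price_table, user_limit_remaining and audit, and the simple per-unit pricing rule, are invented for the example and are not part of the specification.

def billing_method(meter_value, price_table, user_limit_remaining, audit):
    # Block 1988: map the meter value to a billing amount using the pricing
    # algorithm carried in the BILLING method MDE (here, a per-unit price).
    amount = None
    for threshold, unit_price in sorted(price_table.items()):
        if meter_value <= threshold:
            amount = meter_value * unit_price
            break

    # Decision block 1994: fail if no billing amount could be determined,
    # e.g. because the request falls outside the pricing-table areas or
    # limits the user is authorized to use (block 1996).
    if amount is None or amount > user_limit_remaining:
        audit.append(("BILLING", "FAIL", meter_value))
        raise PermissionError("billing amount could not be determined")

    # Blocks 1990, 1992: record the event in the BILLING Audit Trail UDE.
    audit.append(("BILLING", "OK", meter_value, amount))
    return amount  # returned to the calling CONTROL method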
`
`
`Access
`
`
`Figure 54 is a flowchart of an example of program control
`
steps performed by an ACCESS method 2000. As described
`
`above, an ACCESS method may be used to access content
`
`embedded in an object 300 so it can be written to, read from, or
`
`otherwise manipulated or processed. In many cases, the
`
`ACCESS method may be relatively trivial since the object may,
`
`for example, be stored in a local storage that is easily accessible.
`
`However, in the general case, an ACCESS method 2000 must go
`
`through a more complicated procedure in order to obtain the
`
`object. For example, some objects (or parts of objects) may only
`
`be available at remote sites or may be provided in the form of a
`
`real-time download or feed (e.g., in the case of broadcast
`
`transmissions). Even if the object is stored locally to the VDE
`
`node, it may be stored as a secure or protected object so that it is
`
`not directly accessible to a calling process. ACCESS method 2000
`
`establishes the connections, routings, and security requisites
`
`needed to access the object. These steps may be performed
`
`transparently to the calling process so that the calling process
`
`only needs to issue an access request and the particular ACCESS
method corresponding to the object or class of objects handles all
`
`of the details and logistics involved in actually accessing the
`
`object.
`
`
`ACCESS method 2000 may first prime an ACCESS audit
`
`trail (if required) by writing to an ACCESS Audit Trail UDE
`
`(blocks 2002, 2004). ACCESS method 2000 may then read and
`
`load an ACCESS method DTD in order to determine the format
`
of an ACCESS MDE (blocks 2006, 2008). The ACCESS method
`
`MDE specifies the source and routing information for the
`particular object to be accessed in the preferred embodiment.
`
`Using the ACCESS method DTD, ACCESS method 2000 may
`
load the connection parameters (e.g., by telephone number,
`
`account ID, password and/or a request script in the remote
`
`resource dependent language).
`
`ACCESS method 2000 reads the ACCESS method MDE
`
`from the secure database, reads it in accordance with the
`
ACCESS method DTD, and loads encrypted content source and
`
`routing information based on the MDE (blocks 2010, 2012). This
`
`source and routing information specifies the location of the
`
`encrypted content. ACCESS method 2000 then determines
`
`whether a connection to the content is available (decision block
`
2014). This "connection" could be, for example, an on-line
connection to a remote site, a real-time information feed, or a
path to a secure/protected resource. If the connection to the
content is not currently available ("No" exit of decision block
2014), then ACCESS method 2000 takes steps to open the connection
(block 2016). If the connection fails (e.g.,
`
`
`because the user is not authorized to access a protected secure
`
`resource), then the ACCESS method 2000 returns with a failure
`
indication (termination point 2018). If the open connection
`
`succeeds, on the other hand, then ACCESS method 2000 obtains
`
`the encrypted content (block 2020). ACCESS method 2000 then
`
`writes an ACCESS audit trail if required to the secure database
`
`ACCESS method Audit Trail UDE (blocks 2022, 2024), and then
`
`terminates (terminate point 2026).
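A minimal Python sketch of this ACCESS flow follows. The secure_db and audit_trail structures, the DTD modeled as a list of field names, and the opener() callable that hides the connection handling are all stand-ins invented for the example; they are not the specification's data structures.

def access_method(secure_db, audit_trail, object_id, opener):
    audit_trail.append(("ACCESS", "start", object_id))       # blocks 2002, 2004
    dtd = secure_db["ACCESS_DTD"][object_id]                  # blocks 2006, 2008
    raw_mde = secure_db["ACCESS_MDE"][object_id]              # blocks 2010, 2012
    mde = dict(zip(dtd, raw_mde))                             # interpret the MDE per its DTD
    # Blocks 2014-2016: opener() finds or opens the connection (remote site,
    # real-time feed, or local secure store) and returns encrypted content,
    # or None if the connection cannot be opened (e.g. not authorized).
    encrypted_content = opener(mde["source"], mde["routing"])
    if encrypted_content is None:
        return None                                           # failure, point 2018
    audit_trail.append(("ACCESS", "done", object_id))         # blocks 2022, 2024
    return encrypted_content                                  # terminate point 2026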
`
`Decrypt and Encrypt
`
`Figure 55a is a flowchart of an example of process control
`
`steps performed by a representative example of a DECRYPT
`
`method 2030 provided by the preferred embodiment. DECRYPT
`
`method 2030 in the preferred embodiment obtains or derives a
`
`decryption key from an appropriate PERC 808, and uses it to
`
`decrypt a block of encrypted content. DECRYPT method 2030 is
`
`passed a block of encrypted content or a pointer to where the
`
`encrypted block is stored. DECRYPT 2030 selects a key number
`
from a key block (block 2032). For security purposes, a content
object may be encrypted with more than one key. For example, a
`
`movie may have the first 10 minutes encrypted using a first key,
`
`the second 10 minutes encrypted with a second key, and so on.
`
These keys are stored in a PERC 808 in a structure called a "key
block." The selection process involves determining the correct key
to use from the key block in order to decrypt the content. The
`
`
process for this selection is analogous to the process used by
EVENT methods to map events into atomic element numbers. DECRYPT
`
method 2030 may then access an appropriate PERC 808 from the
secure database 610 and load a key (or "seed") from a PERC
`
`(blocks 2034, 2036). This key information may be the actual
`
`decryption key to be used to decrypt the content, or it may be
information from which the decryption key may be at least in
part derived or calculated. If necessary, DECRYPT method 2030
computes the decryption key based on the information read from
PERC 808 at block 2034 (block 2038). DECRYPT method 2030
`
`then uses the obtained and/or calculated decryption key to
`
`actually decrypt the block of encrypted information (block 2040).
`
`DECRYPT method 2030 outputs the decrypted block (or the
pointer indicating where it may be found), and terminates
`
`(termination point 2042).
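The control flow of DECRYPT method 2030 might be sketched as follows. The PERC layout, the per-chunk key numbering, the fixed seed-to-key derivation and the XOR "cipher" are placeholders chosen only to make the flow concrete; they are not the specification's algorithms.

def decrypt_method(perc, encrypted_block, byte_offset, chunk_size):
    key_number = byte_offset // chunk_size            # block 2032: select key number
    seed = perc["key_block"][key_number]               # blocks 2034, 2036: key or "seed"
    key = bytes(b ^ 0x5A for b in seed)                # block 2038: derive working key
    # Block 2040: decrypt the block (XOR stands in for the real cipher).
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(encrypted_block))  # point 2042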
`
Figure 55b is a flowchart of an example of process control
steps performed by a representative example of an ENCRYPT
method 2050. ENCRYPT method 2050 is passed, as an input, a
`
`block of information to encrypt (or a pointer indicating where it
`
`may be found). ENCRYPT method 2050 then may determine an
`
`encryption key to use from a key block (block 2052). The
`
`encryption key selection makes a determination if a key for a
`
`specific block of content to be written already exists in a key block
`
stored in PERC 808. If the key already exists in the key block,
`
`
then the appropriate key number is selected. If no such key
exists in the key block, a new key is calculated using an
algorithm appropriate to the encryption algorithm. This key is
`
`then stored in the key block of PERC 808 so that DECRYPT
`method 2030 may access the key in order to decrypt the content
`
`stored in the content object. ENCRYPT method 2050 then
`
`accesses the appropriate PERC to obtain, derive and/or compute
`
`an encryption key to be used to encrypt the information block
`
`(blocks 2054, 2056, 2058, which are similar to Figure 55a blocks
`
`2034, 2036, 2038). ENCRYPT method 2050 then actually
`
encrypts the information block using the obtained and/or derived
`
`encryption key (block 2060) and outputs the encrypted
`
`information block or a pointer where it can be found before
`
`terminating (termination point 2062).
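A companion sketch of ENCRYPT method 2050, using the same invented PERC layout and placeholder cipher as the DECRYPT sketch above, could look like this; the key creation step is illustrative only.

import os

def encrypt_method(perc, plain_block, byte_offset, chunk_size):
    key_number = byte_offset // chunk_size            # block 2052: select key number
    key_block = perc.setdefault("key_block", {})
    if key_number not in key_block:
        # No key yet for this part of the content: create one and store it in
        # the PERC key block so DECRYPT method 2030 can find it later.
        key_block[key_number] = os.urandom(16)
    seed = key_block[key_number]                       # blocks 2054, 2056
    key = bytes(b ^ 0x5A for b in seed)                # block 2058: derive working key
    # Block 2060: encrypt the block (same placeholder cipher as the DECRYPT sketch).
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(plain_block))  # point 2062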
`
`Content
`
Figure 56 is a flowchart of an example of process control
steps performed by a representative example of a CONTENT method 2070
`
`provided by the preferred embodiment. CONTENT method 2070
`
`in the preferred embodiment builds a ”synopsis“ of protected
`
`content using a secure process. For example, CONTENT method
`
`2070 may be used to derive unsecure (”public“) information from
`
`secure content. Such derived public information might include,
`
`for example, an abstract, an index, a table of contents, a directory
of files, a schedule when content may be available, or excerpts
such as, for example, a movie "trailer."
`
`
CONTENT method 2070 begins by determining whether
the derived content to be provided must be derived from secure
contents, or whether it is already available in the object in the
form of static values (decision block 2072). Some objects may, for
`example, contain prestored abstracts, indexes, tables of contents,
`
`etc., provided expressly for the purpose of being extracted by the
`
CONTENT method 2070. If the object contains such static values
("static" exit to decision block 2072), then CONTENT method
`
`2070 may simply read this static value content information from
`
`the object (block 2074), optionally decrypt, and release this
`
`content description (block 2076). If, on the other hand,
`
`CONTENT method 2070 must derive the synopsis/content
`
description from the secure object ("derived" exit to decision block
2072), then the CONTENT method may securely read
`
`information from the container according to a synopsis algorithm
`
`to produce the synopsis (block 2078).
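The static/derived branch described above might be sketched as follows. The container dictionary and the "first sentence of each part" synopsis rule are invented stand-ins; the real synopsis algorithm would run inside the secure process.

def content_method(container):
    static = container.get("static_synopsis")          # decision block 2072
    if static is not None:
        return static                                   # blocks 2074, 2076: release static value
    # "Derived" branch (block 2078): read the protected parts through the secure
    # process and build the synopsis; here, the first sentence of each part.
    parts = container["secure_parts"]
    return " ".join(p.split(".")[0] + "." for p in parts)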
`
`Extract and Embed
`
`Figure 57a is a flowchart of an example of process control
`
`steps performed by a representative example of an EXTRACT
`
`method 2080 provided by the preferred embodiment. EXTRACT
`
`method 2080 is used to copy or remove content from an object and
`
`place it into a new object. In the preferred embodiment, the
`
`EXTRACT method 2080 does not involve any release of content,
`
`but rather simply takes content from one container and places it
`
`
`into another container, both of which may be secure. Extraction
`
of content differs from release in that the content is never
`
`exposed outside a secure container. Extraction and Embedding
`
`are complementary functions; extract takes content from a
`
`container and creates a new container containing the extracted
`
`content and any specified control information associated with
`
`that content. Embedding takes content that is already in a
`
`container and stores it (or the complete object) in another
`
container directly and/or by reference, integrating the control
information associated with the existing content with that of the
`
`new content.
`
`EXTRACT method 2080 begins by priming an Audit UDE
`
`(blocks 2082, 2084). EXTRACT method then calls a BUDGET
`
`method to make sure that the user has enough budget for (and is
`
`authorized to) extract content from the original object (block
`
2086). If the user's budget does not permit the extraction ("no"
exit to decision block 2088), then EXTRACT method 2080 may
`
`write a failure audit record (block 2090), and terminate
`
(termination point 2092). If the user's budget permits the
extraction ("yes" exit to decision block 2088), then the EXTRACT
`
`method 2080 creates a copy of the extracted object with specified
`
`rules and control information (block 2094). In the preferred
`
`embodiment, this step involves calling a method that actually
`
`
`controls the copy. This step may or may not involve decryption
`
`
and encryption, depending on the particular PERC 808
associated with the original object, for example. EXTRACT
`
`method 2080 then checks whether any control changes are
permitted by the rights authorizing the extract to begin with
(decision block 2096). In some cases, the extract rights require
`
`an exact copy of the PERC 808 associated with the original object
`(or a PERC included for this purpose) to be placed in the new
`
`
`(destination) container (”no“ exit to decision block 2096). If no
`
control changes are permitted, then EXTRACT method 2080 may
`
simply write audit information to the Audit UDE (blocks 2098,
2100) before terminating (terminate point 2102). If, on the other
`
`hand, the extract rights permit the user to make control changes
`
("yes" exit to decision block 2096), then EXTRACT method 2080 may
`
`call a method or load module that solicits new or changed control
`
`information (e.g., from the user, the distributor who
`
`created/granted extract rights, or from some other source) from
`
`the user (blocks 2104, 2106). EXTRACT method 2080 may then
`
`call a method or load module to create a new PERC that reflects
`
this user-specified control information (block 2104). This new
`
`PERC is then placed in the new (destination) object, the auditing
`
`steps are performed, and the process terminates.
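A hypothetical sketch of the EXTRACT flow follows. The source container and PERC are modeled as dictionaries, and budget_ok(), changes_allowed and solicit_controls() are invented stand-ins for the BUDGET method call and the control-change step; none of these names come from the specification.

import copy

def extract_method(source, audit, budget_ok, changes_allowed, solicit_controls):
    audit.append(("EXTRACT", "start"))                         # blocks 2082, 2084
    if not budget_ok("extract"):                               # blocks 2086, 2088
        audit.append(("EXTRACT", "fail"))                      # block 2090
        return None                                            # termination point 2092
    # Block 2094: copy the extracted content into a new container under control.
    new_container = {"content": copy.deepcopy(source["content"])}
    if not changes_allowed:                                    # decision block 2096
        new_container["perc"] = copy.deepcopy(source["perc"])  # exact copy of the PERC
    else:
        new_container["perc"] = solicit_controls(source["perc"])  # blocks 2104, 2106
    audit.append(("EXTRACT", "done"))                          # blocks 2098, 2100
    return new_container                                       # terminate point 2102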
`
`Figure 57b is an example of process control steps
`
`performed by a representative example of an EMBED method
`
`2110 provided by the preferred embodiment. EMBED method
`
`
`2110 is similar to EXTRACT method 2080 shown in Figure 57a.
`
However, the EMBED method 2110 performs a slightly different
function: it writes an object (or reference) into a destination
`
`container. Blocks 2112-2122 shown in Figure 57b are similar to
`
`blocks 2082-2092 shown in Figure 57a. At block 2124, EMBED
`
`method 2110 writes the source object into the destination
`
`container, and may at the same time extract or change the
`
control information of the destination container. One alternative
is to simply leave the control information of the destination
`
`container alone, and include the full set of control information
`
`associated with the object being embedded in addition to the
`
original container control information. As an optimization,
`
`however, the preferred embodiment provides a technique
`
whereby the control information associated with the object being
embedded is "abstracted" and incorporated into the control
`
`information of the destination container. Block 2124 may call a
`
`method to abstract or change this control information. EMBED
`
`method 2110 thenperforms steps 2126-2130 which are
`
`to
`
`steps 2096, 2104, 2106 shown in Figure 57a to allow the user, if
`
`authorized, to change and/or specify control information
`
associated with the embedded object and/or destination
`
`container. EMBED method 2110 then writes audit information
`
`into an Audit UDE (blocks 2132, 2134), before terminating (at
`
`termination point 2136).
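The EMBED flow could be sketched in the same style, reusing the invented container and PERC dictionaries from the EXTRACT sketch; merge_controls() stands in for the "abstracting" of the embedded object's control information and is not a name from the specification.

def embed_method(source, destination, audit, budget_ok, merge_controls):
    audit.append(("EMBED", "start"))                           # blocks 2112, 2114
    if not budget_ok("embed"):                                 # blocks 2116-2118
        audit.append(("EMBED", "fail"))                        # block 2120
        return False                                           # termination point 2122
    # Block 2124: write the source object (or a reference) into the destination
    # container and fold its control information into the destination's PERC.
    destination.setdefault("embedded", []).append(source)
    destination["perc"] = merge_controls(destination["perc"], source["perc"])
    audit.append(("EMBED", "done"))                            # blocks 2132, 2134
    return True                                                # termination point 2136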
`
`
`Obscure
`
`
`Figure 58a is a flowchart of an example of process control
`
`steps performed by a representative example of an OBSCURE
`
`method 2140 provided by the preferred embodiment. OBSCURE
`
`method 2140 is typically used to release secure content in
`
`devalued form. For example, OBSCURE method 2140 may
`
`release a high resolution image in a lower resolution so that a
`
`viewer can appreciate the image but not enjoy its full value. As
`another example, the OBSCURE method 2140 may place an
obscuring legend (e.g., "COPY," "PROOF," etc.) across an image
to devalue it. OBSCURE method 2140 may "obscure" text,
`
`images, audio information, or any other type of content.
`
OBSCURE method 2140 first calls an EVENT method to
determine if the content is appropriate and in the range to be
obscured (block 2142). If the content is not appropriate for
`
`obscuring, the OBSCURE method terminates (decision block
`
`2144 ”no“ exit, terminate point 2146). Assuming that the content
`
`is to be obscured (”yes“ exit to decision block 2144), then
`
`OBSCURE method 2140 determines whether it has previously
`been called to obscure this content (decision block 2148).
`
Assuming OBSCURE method 2140 has not previously been called for this
object/content ("yes" exit to decision block 2148), the OBSCURE
`
`method 2140 reads an appropriate OBSCURE method MDE from
`
`the secure database and loads an obscure formula and/or pattern
`
`
`from the MDE (blocks 2150, 2152). The OBSCURE method 2140
`
`may then apply the appropriate obscure transform based on the
`
patterns and/or formulas loaded by block 2150 (block 2154). The
`
`OBSCURE method then may terminate (terminate block 2156).
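As an illustration of an obscuring transform of the kind described above, the sketch below reduces an image's resolution. The image modeled as a list of pixel rows, the event_ok() check standing in for the EVENT method call, and the fixed down-sampling step are all assumptions made for the example.

def obscure_method(image_rows, event_ok, step=4):
    if not event_ok(image_rows):                     # blocks 2142, 2144
        return None                                  # terminate point 2146
    # Blocks 2150-2154: apply the obscuring transform loaded from the MDE; here a
    # crude down-sampling that keeps every step-th pixel and row, so the viewer
    # can appreciate the image without receiving its full resolution.
    return [row[::step] for row in image_rows[::step]]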
`
`Fingerprint
`
`Figure 58b is a flowchart of an example of process control
`
`steps performed by a representative example of a
`
`FINGERPRINT method 2160 provided by the preferred
`
embodiment. FINGERPRINT method 2160 in the preferred
`
`embodiment operates to ”mark“ released content with a
`
`”fingerprint“ identification of who released the content and/or
`
`check for such marks. This allows one to later determine who
`
`released unsecured content by examining the content.
`
`FINGERPRINT method 2160 may, for example, insert a user ID
`
`within a datastream representing audio, video, or binary format
`
information. FINGERPRINT method 2160 is quite similar to
`
`OBSCURE method 2140 shown in Figure 58a except that the
`
`transform applied by FINGERPRINT method block 2174
`
"fingerprints" the released content rather than obscuring it.
`
`Figure 58c shows an example of a ”fingerprinting“
`
`procedure 2160 that inserts into released content ”fingerprints“
`
`2161 that identify the object and/or property and/or the user that
`
`
`requested the released content and/or the date and time of the
`
`release and/or other identification criteria of the released content.
`
Such fingerprints 2161 can be "buried" -- that is, inserted in
`
`a manner that hides the fingerprints from typical users,
`
sophisticated "hackers," and/or from all users, depending on the
`file format, the sophistication and/or variety of the insertion
`
`algorithms, and on the availability of original, non-fingerprinted
`
`content (for comparison for reverse engineering of algorithm(s)).
`
`Inserted or embedded fingerprints 2161, in a preferred
`
`embodiment, may be at least in part encrypted to make them
`
more secure. Such encrypted fingerprints 2161 may be
`
`embedded within released content provided in ”clear“ (plaintext)
`
`form.
`
`Fingerprints 2161 can be used for a variety of purposes
`
`including, for example, the often related purposes of proving
`
`misuse of released materials and proving the source of released
`
content. Software piracy is a particularly good example where
`
`fingerprinting can be very useful. Fingerprinting can also help to
`
`enforce content providers’ rights for most types of electronically
`
`delivered information including movies, audio recordings,
`
multimedia, information databases, and traditional "literary"
`
`materials. Fingerprinting is a desirable alternative or addition to
`
`copy protection.
`
`
`Most piracy of software applications, for example, occurs
`not with the making of an illicit copy by an individual for use on
`
another of the individual's own computers, but rather in giving a
`
`copy to another party. This often starts a chain (or more
`
`accurately a pyramid) of illegal copies, as copies are handed from
`
`individual to individual. The fear of identification resulting from
`
`the embedding of a fingerprint 2161 will likely dissuade most
`
`individuals from participating, as many currently do, in
`
`widespread, ”casual“ piracy. In some cases, content may be
`checked for the presence of a fingerprint by a fingerprint method
`
`to help enforce content providers’ rights.
`
Different fingerprints 2161 can have different levels of
`
`security (e.g., one fingerprint 2161(1) could be
`
`readable/identifiable by commercial concerns, while another
`
`fingerprint 2161(2) could be readable only by a more trusted
`
`agency. The methods for generating the more secure fingerprint
`
`2161 might employ more complex encryption techniques (e.g.,
`
`digital signatures) and/or obscuring of location methodologies.
`
Two or more fingerprints 2161 can be embedded in different
locations and/or using different techniques to help protect
`fingerprinted information against hackers. The more secure
`
`fingerprints might only be employed periodically rather than
`
each time content release occurs, if the technique used to provide
a more secure fingerprint involves an undesired amount of
`
`
`additional overhead. This may nevertheless be effective since a
`
principal objective of fingerprinting is deterrence -- that is, the
`
`fear on the part of the creator of an illicit copy that the copying
`
`will be found out.
`
`For example, one might embed a copy of a fingerprint 2161
`
which might be readily identified by an authorized party -- for
example, a distributor, service personnel, client administrator, or
`
`clearinghouse using a VDE electronic appliance 600. One might
`
`
`embed one or more additional copies or variants of a fingerprint
`
`2161 (e.g., fingerprints carrying information describing some or
`
`all relevant identifying information) and this additional one or
`
`more fingerprints 2161 might be maintained in a more secure
`
`manner.
`
Fingerprinting can also help address privacy concerns. For
`
`example, the algorithm and/or mechanisms needed to identify the
`
`fingerprint 2161 might be available only through a particularly
`
`trusted agent.
`
Fingerprinting 2161 can take many forms. For example, in
`
`an image, the color of every N pixels (spread across an image, or
spread across a subset of the image) might be subtly shifted in a
`
`visually unnoticeable manner (at least according to the normal,
`
`unaided observer). These shifts could be interpreted by analysis
`
`
`of the image (with or without access to the original image), with
each occurrence or lack of occurrence of a shift in color (or
greyscale) being one or more binary "on or off" bits for digital
`information storage. The N pixels might be either consistent, or
`
alternatively, pseudo-random in order (but interpretable, at least
in part, by an object creator, object provider, client administrator,
`
`and/or VDE administrator).
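A minimal sketch of this pixel-shift idea is given below: the bits of a user ID are written into the least-significant value of every Nth greyscale pixel, a shift small enough to be visually unnoticeable. The function names, the 32-bit payload, and the choice of the low-order bit are illustrative assumptions, not the specification's algorithm.

def embed_fingerprint(pixels, user_id, n=100):
    bits = [(user_id >> i) & 1 for i in range(32)]        # 32-bit fingerprint payload
    marked = list(pixels)
    for k, bit in enumerate(bits):
        pos = k * n                                        # every Nth pixel carries one bit
        if pos >= len(marked):
            break
        marked[pos] = (marked[pos] & ~1) | bit             # subtle one-level shift
    return marked

def read_fingerprint(pixels, n=100):
    bits = [pixels[k * n] & 1 for k in range(32) if k * n < len(pixels)]
    return sum(bit << i for i, bit in enumerate(bits))     # recover the embedded ID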
`
`Other modifications of an image (or moving image, audio,
`
etc.) which provide a similar benefit (that is, storing information
`
`in a form that is not normally noticeable as a result of a certain
`
`modification of the source information) may be appropriate,
`
depending on the application. For example, certain subtle
modifications in the frequency of stored audio information can be
made so as to be normally unnoticeable to the listener while
`
`still being readable with the proper tools. Certain properties of
`
`the storage of information might be modified to provide such
`
`slight but interpretable variations in polarity of certain
`
`information which is optically stored to achieve similar results.
`
`Other variations employing other electronic, magnetic, and/or
`
optical characteristics may be employed.
`
`Content stored in files that employ graphical formats, such
`
as Microsoft Windows word processing files, provides significant
opportunities for "burying" a fingerprint 2161. Content that
`
`
`includes images and/or audio provides the opportunity to embed
`
fingerprints 2161 that may be difficult for unauthorized
individuals to identify since, in the absence of an
"unfingerprinted" original for purposes of comparison, minor
`
`subtle variations at one or more time instances in audio
`
`frequencies, or in one or more video images, or the like, will be in
`themselves undiscernible given the normally unknown nature of
`
the original and the large amounts of data employed in both
image and sound data (and which is not particularly sensitive to
`minor variations). With formatted text documents, particularly
`
`those created with graphical word processors (such as Microsoft
`
Windows or Apple Macintosh word processors and their DOS and
`
`Unix equivalents), fingerprints 2161 can normally be inserted
`
`unobtrusively into portions of the document data representation
`
`that are not normally visible to the end user (such as in a header
`
or other non-displayed data field).
`
`Yet another form of fingerprinting, which may be
`
`particularly suitable for certain textual documents, would employ
`and control the formation of characters for a given font.
Individual characters may have a slightly different visual
formation which connotes certain "fingerprint" information. This
`
`alteration of a given character’s form would be generally
`
`undiscernible, in part because so many slight variations exist in
`
versions of the same font available from different suppliers, and
`
`
`in part because of the smallness of the variation. For example, in
`
a preferred embodiment, a program such as Adobe Type Align
could be used which, in its off-the-shelf versions, supports the
`
`ability of a user to modify font characters in a variety of ways.
`
The mathematical definition of the font character is modified
according to the user's instructions to produce a specific set of
`
`modifications to be applied to a character or font. Information
`
`content could be used in an analogous manner (as an alternative
`
`to user selections) to modify certain or all characters too subtly
`for user recognition under normal circumstances but which
`
`nevertheless provide appropriate encoding for the fingerprint
`
`2161. Various subtly different versions of a given character
`
`might be used within a single document so as to increase the
`
ability to carry transaction-related font fingerprinted
`
`information.
`
Some other examples of applications for fingerprinting
`
`might include:
`
1. In software programs, selecting certain interchangeable code
fragments in such a way as to produce more or less identical
operation but, on analysis, differences that detail fingerprint
information.

2. With databases, selecting to format certain fields, such as
dates, to appear in different ways.
`
`
3. In games, adjusting backgrounds, or changing order of certain
events, including noticeable or very subtle changes in timing
and/or ordering of appearance of game elements, or slight changes
in the look of elements of the game.
`
`Fingerprinting method 2160 is typically performed (if at
`
`all) at the point at which content is released from a content object
`
`300. However, it could also be performed upon distribution of an
`
`object to ”mark“ the content while still in encrypted form. For
`
`example, a network-based object repository could embed
`
`fingerprints 2161 into the content of an object before
`
transmitting the object to the requester; the fingerprint
information could identify a content requester/end user. This
`
`could help detect ”spoof“ electronic appliances 600 used to release
`
`content without authorization.
`
`Destroy
`
Figure 59 is a flowchart of an example of process control
steps performed by a representative example of a DESTROY
`
`method 2180 provided by the preferred embodiment. DESTROY
`
`method 2180 removes the ability of a user to use an object by
`
`destroying the URT the user requires to access the object. In the
`
preferred embodiment, a DESTROY method 2180 may first write
`
`audit information to an Audit UDE (blocks 2182, 2184).
`
`
DESTROY method 2180 may then call a WRITE and/or ACCESS
`method to write information which will corrupt (and thus
`
`destroy) the header and/or other important parts of the object
`
`(block 2186). DESTROY method 2180 may then mark one or
`
`more of the control structures (e.g., the URT) as damaged by
`
`writing appropriate information to the control structure (blocks
`
`2188, 2190). DESTROY method 2180, finally, may write
`
`additional audit information to Audit UDE (blocks 2192, 2194)
`
`before terminating (terminate point 2196).
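The DESTROY flow might be sketched as follows; the obj and urt dictionaries are invented stand-ins for the object header and the URT control structure, and the zero-fill corruption is only illustrative.

def destroy_method(obj, urt, audit):
    audit.append(("DESTROY", "start", obj.get("id")))    # blocks 2182, 2184
    obj["header"] = b"\x00" * len(obj["header"])          # block 2186: corrupt the header
    urt["damaged"] = True                                 # blocks 2188, 2190: mark URT damaged
    audit.append(("DESTROY", "done", obj.get("id")))      # blocks 2192, 2194; point 2196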
`
Panic
`
`Figure 60 is a flowchart of an example of process control
`
`steps performed by a representative example of a PANIC method
`2200 provided by the preferred embodiment. PANIC method
`
`
`2200 may be called when a security violation is detected. PANIC
`
`method 2200 may prevent the user from further accessing the
`
object currently being accessed by, for example, destroying the
`channel being used to access the object and marking one or more
`
`of the control structures (e.g., the URT) associated with the user
`
`and object as damaged (blocks 2206, and 2208-2210,
`respectively). Because the control structure is damaged, the VDE
`node will need to contact an administrator to obtain a valid
`
`
control structure(s) before the user may access the same object.
`
`When the VDE node contacts the administrator, the
`
`administrator may request information sufficient to satisfy itself
`
`
`that no security violation occurred, or if a security violation did
`
`occur, take appropriate steps to ensure that the security violation
`
`is not repeated.
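A hypothetical sketch of the PANIC response described above follows; channel and urt are stand-ins invented for the access channel and the user/object control structures.

def panic_method(channel, urt, audit):
    audit.append(("PANIC", "security violation detected"))
    channel.clear()            # block 2206: destroy the channel being used to access the object
    urt["damaged"] = True      # blocks 2208, 2210: mark the control structure as damaged
    # The user must now obtain valid control structures from a VDE administrator
    # before the same object can be accessed again.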
`
`Meter
`
`Figure 61 is a flowchart of an example of process control
`
`steps performed by a representative example of a METER
`
`method provided by the preferred embodiment. Although
`
`METER methods were described above in connection with
`
Figures 49, 50 and 51, the METER method 2220 shown in Figure
`
`61 is possibly a somewhat more representative example. In the
`
`preferred embodiment, METER method 2220 first primes an
`
`Audit Trail by accessing a METER Audit Trail UDE (blocks 2222,
2224). METER method 2220 may then read the DTD for the
Meter UDE from the secure database (blocks 2226, 2228).
METER method 2220 may then read the Meter UDE from the
`
`secure database (blocks 2230, 2232). METER method 2220 next
`
`may test the obtained Meter UDE to determine whether it has
`
`expired (decision block 2234). In the preferred embodiment, each
`
Meter UDE may be marked with an expiration date. If the
current date/time is later than the expiration date of the Meter
UDE ("yes" exit to decision block 2234), then the METER method
`
`2220 may record a failure in the Audit Record and terminate
`
with a failure condition (blocks 2236, 2238).
`
`
Assuming the Meter UDE is not yet expired, METER
method 2220 may update it using the atomic element and event
`
`count passed to the METER method from, for example, an
`
`EVENT method (blocks 2239, 2240). The METER method 2220
`
may then save the Meter Use Audit Record in the Meter Audit
Trail UDE (blocks 2242, 2244) before terminating (at terminate
`
`point 2246).
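The METER flow could be sketched as follows; meter_udes and audit_trail are invented stand-ins for the secure-database records the method reads and writes, and the expiration field name is assumed for the example.

from datetime import datetime

def meter_method(meter_udes, audit_trail, atomic_element, event_count):
    audit_trail.append(("METER", "start", atomic_element))      # blocks 2222, 2224
    ude = meter_udes[atomic_element]                             # blocks 2226-2232 (read via its DTD)
    if datetime.now() > ude["expires"]:                          # decision block 2234
        audit_trail.append(("METER", "expired", atomic_element)) # block 2236
        return False                                             # failure, block 2238
    ude["count"] += event_count                                  # blocks 2239, 2240: update the meter
    audit_trail.append(("METER", "use", atomic_element, event_count))  # blocks 2242, 2244
    return True                                                  # terminate point 2246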
`
`Additional Security Features Provided by the Preferred
`Embodiment
`
`VDE 100 provided by the preferred embodiment has
`
sufficient security to help ensure that it cannot be compromised
`
`short of a successful ”brute force attack,“ and so that the time
`
`and cost to succeed in such a "brute force attack“ substantially
`
exceeds the value to be derived. In addition, the security
`
`provided by VDE 100 compartmentalizes the internal workings of
`
`VDE so that a successful "brute force attack“ would compromise
`
`only a strictly bounded subset