`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`APPLE INC.,
`Petitioner,
`v.
`UNIVERSAL SECURE REGISTRY, LLC,
`Patent Owner.
`_________________________________________
`Case IPR2018-00810
`U.S. Patent No. 9,100,826
`________________________________________
`
`DECLARATION OF DR. VICTOR SHOUP IN SUPPORT OF
`
`PETITIONER’S REPLY TO PATENT OWNER’S RESPONSE
`
`Apple 1018
Apple v. USR
`IPR2018-00810
`
`
`
`TABLE OF CONTENTS
I. INTRODUCTION .......................................................................................... 1
`II. LEGAL PRINCIPLES .................................................................................... 2
`A. Claim Construction...................................................................................... 2
`B. Obviousness................................................................................................. 2
`C. Secondary Considerations............................................................................ 4
`III. OPINIONS .................................................................................................. 5
`A. USR’s Proposed Claim Constructions Are Overly Narrow And Inconsistent
`With BRI. ........................................................................................................... 5
`1. Contrary to USR’s Argument, “Biometric Information” Is An Example Of
`“Authentication Information.”......................................................................... 5
`2. USR’s Construction For “Enabling Or Disabling” A Device Is Unduly
`Narrow. ........................................................................................................... 8
`B. The Challenged Claims Are Obvious..........................................................10
`1. Maritzen’s Biometric Key Is “First Authentication Information” That Is
`Derived/Determined From A “First Biometric Information.” .........................10
2. It Would Have Been Obvious To Combine Maritzen With Jakobsson To
Determine the Recited “First Authentication Information” From “First
Biometric Information.” .................................................................................14
`3. Contrary To USR’s Argument, Maritzen’s “Biometric Information” Is The
`Claimed “Authentication Information.”..........................................................24
4. It Would Have Been Obvious To Combine Maritzen With Jakobsson’s
Teachings That “Second Biometric Information” Is Retrieved From Memory
By A Second Device. .....................................................................................24
`5. Maritzen In View Of Jakobsson And Niwa Discloses A Second Processor
`“Configured To Receive A First Authentication Information.” ......................28
`6. Maritzen and Jakobsson Disclose Authenticating The First Entity “Based
`Upon The First Authentication Information And The Second Biometric
`Information.”..................................................................................................28
`7. Maritzen Discloses A “First Handheld Device.”......................................30
`8. Maritzen Discloses A Processor Configured To “Enable Or Disable” Use
`Of The First Handheld Device Based On The Result Of A Comparison.........33
`9. Maritzen In View Of Niwa Discloses Storing “Respective Biometric
`Information For A Second Plurality Of Users.”..............................................34
`
`
`
`
`10. USR Fails To Demonstrate Secondary Considerations of Non-
`Obviousness...................................................................................................35
`IV. CONCLUSION ..........................................................................................39
`V. AVAILABILITY FOR CROSS-EXAMINATION ........................................40
`VI. RIGHT TO SUPPLEMENT .......................................................................40
`VII. JURAT .......................................................................................................41
`
`
`
`
`I, Victor Shoup, Ph.D., declare as follows:
`I.
`INTRODUCTION
`1.
`I have been retained by Apple to provide opinions in this proceeding
`
`relating to U.S. Patent No. 9,100,826 (“’826 patent”). I submit this Declaration to
`
`address and respond to the arguments made in Patent Owner’s Response and the
`
`declaration submitted by Dr. Jakobsson in support of the Patent Owner’s Response.
`
`2.
`
`My background and qualifications are summarized in my previous
`
`declaration (Ex-1002) and my curriculum vitae is attached thereto as Appendix A.
`
`In preparing this Declaration, I have reviewed the following materials and the
`
`relevant exhibits cited in each of these filings:
`
• Petition (“Pet.”) (Paper No. 3) and the exhibits cited therein

• Decision on Institution (Paper No. 8)

• Corrected Patent Owner’s Response (“POR”) (Paper No. 18) and the exhibits cited therein

• Declaration of Markus Jakobsson In Support Of Patent Owner Response (“Jakobsson Decl.”) (Ex-2003)

• Conditional Motion to Amend (Paper No. 19) (“CMTA”)

• Declaration of Markus Jakobsson In Support of CMTA (Ex-2013)

• Transcript of March 20, 2019 deposition of Markus Jakobsson (“Jakobsson Dep.”) (Ex-1017)

• Declaration of Dr. Ari Juels In Support Of Petitioner’s Reply (Ex-1020)
`
`
`
`
`II.
`
`LEGAL PRINCIPLES
`3.
`I am not an attorney. For purposes of this Declaration, I have been
`
`informed about certain aspects of the law that are relevant to my analysis and
`
`opinions.
`
A. Claim Construction
4. I have been informed that claim construction is a matter of law and
`
`that the final claim construction will be determined by the Board.
`
`5.
`
I have been informed that the claim terms in an IPR proceeding should be
`
`given their broadest reasonable construction in light of the specification as
`
`commonly understood by a person of ordinary skill in the art (“POSITA”). I have
`
`applied this standard in my analysis.
`
B. Obviousness
6. I have been informed and understand that a patent claim can be
`
`considered to have been obvious to a POSITA at the time the application was filed.
`
`This means that, even if all the requirements of a claim are not found in a single
`
`prior art reference, the claim is not patentable if the differences between the subject
`
`matter in the prior art and the subject matter in the claim would have been obvious
`
`to a POSITA at the time the application was filed.
`
`7.
`
`I have been informed and understand that a determination of whether
`
`a claim would have been obvious should be based upon several factors, including,
`
`among others:
`
`
`
`
• the level of ordinary skill in the art at the time the application was filed;

• the scope and content of the prior art; and

• what differences, if any, existed between the claimed invention and the prior art.
`
`8.
`
`I have been informed and understand that the teachings of two or
`
`more references may be combined in the same way as disclosed in the claims, if
`
`such a combination would have been obvious to a POSITA. In determining
`
`whether a combination based on either a single reference or multiple references
`
`would have been obvious, it is appropriate to consider, among other factors:
`
• whether the teachings of the prior art references disclose known concepts combined in familiar ways, and when combined, would yield predictable results;

• whether a POSITA could implement a predictable variation, and would see the benefit of doing so;

• whether the claimed elements represent one of a limited number of known design choices, and would have a reasonable expectation of success by those skilled in the art;

• whether a POSITA would have recognized a reason to combine known elements in the manner described in the claim;

• whether there is some teaching or suggestion in the prior art to make the modification or combination of elements claimed in the patent; and

• whether the innovation applies a known technique that had been used to improve a similar device or method in a similar way.
`
`9.
`
`I have been informed and understand that a POSITA has ordinary
`
`creativity, and is not an automaton.
`
`10.
`
`I have been informed and understand that in considering obviousness,
`
`it is important not to determine obviousness using the benefit of hindsight derived
`
`from the patent being considered.
`
C. Secondary Considerations
11. I have been informed and understand that certain factors may support
`
`or rebut the obviousness of a claim. I understand certain secondary considerations
`
`may rebut a showing of obviousness and that such secondary considerations
`
`include, among other things, commercial success of the patented invention,
`
`skepticism of those having ordinary skill in the art at the time of invention,
`
`unexpected results of the invention, any long-felt but unsolved need in the art that
`
`was satisfied by the alleged invention, the failure of others to make the alleged
`
`invention, praise of the alleged invention by those having ordinary skill in the art,
`
`and copying of the alleged invention by others in the field. I understand that there
`
`
`
`
`must be a nexus, that is, a connection, between any such secondary considerations
`
`and the alleged invention. I also understand that contemporaneous and
`
`independent invention by others is a secondary consideration tending to show
`
`obviousness.
`
`III. OPINIONS
`A.
`USR’s Proposed Claim Constructions Are Overly Narrow And
`Inconsistent With BRI.
`1.
`Contrary to USR’s Argument, “Biometric Information” Is
`An Example Of “Authentication Information.”
`12. USR argues that “biometric information” and “authentication
`
`information” must be different (POR, 12-13), but USR’s interpretation is
`
`inconsistent with the claims, the specification, and the broadest reasonable
`
`interpretation standard.
`
a) The Claims Support My Construction.
13. First, the claims support my construction. “Authentication information” is a set of information items that can be used to authenticate a user,
`
`and can include pins, passwords, and biometric information. Nothing in the claims
`
`requires that “authentication information” and “first biometric information” are
`
`mutually exclusive. For example, a dependent claim could have read: “wherein the
`
`authentication information comprises the first biometric information.”
`
`14. Moreover, the claims recite two different elements that should not be
`
`conflated: “authentication information” (with no modifier) and “first authentication
`
`
`
`
`information.” These are independent elements that share no claimed relationship.
`
`USR argues that “authentication information” (with no modifier) cannot be
`
`biometric information because the claims require determining “first authentication
`
`information” from the biometric information. POR, 14 (“Some claims also require
`
`that ‘authentication information’ be determined from ‘biometric information.’”).
`
`This argument fails because USR improperly conflates “first authentication
`
`information” and “authentication information” (with no modifier). The claims
`
`require that “first authentication information” be determined from “biometric
`
`information.” The claims do not require that “authentication information” (with no
`
`modifier) be determined from “biometric information.” There is no claimed
`
`relationship between “first authentication information” and “authentication
`
`information” (with no modifier), and therefore no restriction on the relationship
`
`between “authentication information” (with no modifier) and “biometric
`
`information.”
`
`15. USR erroneously argues that the order of the claim terms supports its
`
`argument. POR, 13-14 (“The retrieved or received ‘biometric information’ cannot
`
`also be the ‘authentication information’ used to authenticate the user, since user
`
`authentication occurs before the ‘biometric information’ is even retrieved or
`
`received.”). USR is mistaken because nothing in the claims requires a specific
`
`sequence of steps. For example, system claim 1 only requires a processor that is
`
`
`
`
`configured to (a) “authenticate a user of the first handheld device based on
`
`authentication information,” and (b) “retrieve or receive first biometric information
`
`of the user of the first handheld device.” Claim 1 only requires a processor capable
`
`of performing these steps, and does not require the processor to perform them in
`
`any particular sequence.
`
b) The Specification Supports My Construction.
16. Second, the ’826 patent broadly describes “authentication
`
`information” as information used to verify, identify, or authenticate a user. Ex-
`
`1002, Shoup-Decl., ¶¶34-35. A POSITA would have understood that
`
`“authentication information” includes any information used by the system to verify
`
`the identity of an individual, including “biometric information.”1 In fact, the
`
`specification expressly identifies “biometric information” as one example of
`
`“authentication information” used by the system to verify the identity of an
`
`individual. Ex-1001, ’826 patent, 35:18-21 (“The act of receiving the first
`
`authentication information of the first entity comprises receiving biometric
`
`information of the first entity.”).
`
`1 USR argues that the term “system” is ambiguous, but the first line of the abstract
`
`explains that “the invention provides a system for authenticating identities of a
`
`plurality of users.” Ex-1001, ’826 patent, Abstract.
`
`
`
`
c) A POSITA Would Have Understood That Biometric Information Is A Form Of Authentication Information.
17. The plain meaning of the phrase “authentication information” includes
`
`any information used to authenticate a user, including biometric information. A
`
`POSITA would have understood that authentication information includes biometric
`
`information and my construction falls within the broadest reasonable interpretation
`
`of the phrase “authentication information.”
`
2. USR’s Construction For “Enabling Or Disabling” A Device Is Unduly Narrow.
18. Claims 7, 14, 26, and 34 recite a processor configured to “enable or
`
`disable use of [a] first handheld device based on a result of [a] comparison.” I
`
`showed in my prior declaration that Maritzen in view of Niwa discloses this
`
`limitation. Ex-1002, Shoup-Decl., ¶¶115-119 (“Maritzen discloses that the PTD
`
`CPU 210 [first processor] is configured to compare stored biometric information
`
`[stored authentication information] with received biometric information
`
`[authentication information of the user of the first handheld device] and to
`
`unlock the PTD and limit access to authorized users [enable or disable use of the
`
`first handheld device] based on a result of the comparison.”). In its attempt to
`
`distinguish the claims from the prior art, USR suggests re-interpreting the plain
`
`language of “enabling or disabling use of the first handheld device based on a
`
`result of a comparison” with the following 44-word construction: “to expand the
`
`
`
`
`range of functionality available to the [first] user of the first handheld device based
`
`on one result of the comparison, and to reduce the range of functionality available
`
`to the [first] user of the first handheld device based on another result of the
`
`comparison.” POR, 15. USR’s proposed construction is unduly narrow and
`
`contravenes the broadest reasonable interpretation standard, not to mention plain
`
`meaning. Enabling or disabling use of a handheld device is a concept plainly
`
`understood by those of ordinary skill in the art and requires no construction.
`
`19. USR argues that “disabling use” requires “reducing the range of
`
`functionality available to the user to less than what was previously available.”
`
`POR, 18-19. But the claim makes clear that the processor must merely disable
`
`“use” of the device. It does not require completely disabling the device itself (e.g.,
`
`turning the phone off), and it does not require any active reduction in functionality.
`
`The verb “disable” means “to make ineffective or inoperative.”2 Thus, for
`
`example, if a processor instructs a device to remain locked from performing a
`
`transaction, that device is rendered ineffective or inoperative, even if the device is
`
`still powered. A POSITA would have understood this disclosure to be well within
`
`2 Ex-1031, Disable, Merriam-Webster.com (2019), https://www.merriam-
`
`webster.com/dictionary/disable.
`
`
`
`
`the plain meaning of the phrase “disable use” and the broadest reasonable
`
`interpretation of the claim.
`
B. The Challenged Claims Are Obvious.
1. Maritzen’s Biometric Key Is “First Authentication Information” That Is Derived/Determined From A “First Biometric Information.”
20. Claims 1, 10, 21, and 30 require that a “first authentication
`
`information” is derived from a “first biometric information.” I showed previously
`
`that Maritzen discloses “a ‘biometric key’ [first authentication information]”
`
`that is derived from a user’s biometric information. Ex-1002, Shoup-Decl., ¶38.
`
`USR’s argument that “a POSITA would not have understood…that Maritzen’s
`
`biometric key is determined or derived from ‘first biometric information’” (POR,
`
`23-25) is incorrect.
`
`21.
`
`First, Maritzen’s biometric key is called the biometric key because it
`
`is derived from biometric information. A POSITA would have understood that the
`
`biometric key is not a physical key but rather is a cryptographic key (e.g., in the
`
`form of binary data) derived from or determined from biometric information.
`
`Second, Maritzen clearly discloses that the biometric key is determined in response
`
`to a biometric authentication. Ex-1004, Maritzen, [0044] (“if the biometric input is
`
`valid for the device, privacy card 110 creates a biometric key”). The term
`
`“biometric key” was a term that had a special meaning to those of skill in the art.
`
`
`
`
`A POSITA would have understood the term “biometric key” to refer to a
`
`cryptographic key that is derived from biometric information. In the mid-1990s, a
`
number of researchers began investigating how to better use
`
`biometrics as a means of authentication and key management. The problems
`
`addressed were two-fold: (i) how to employ a biometric as a means for unlocking
`
`and/or deriving an ordinary cryptographic key, and (ii) how to avoid storing a
`
`biometric template in the clear. An early work in this area is the 1994 patent by
`
Tomko, which described a system for “biometric controlled key generation.” Ex-1025, U.S.
`
Patent No. 5,680,460 (“Tomko”). Follow-up work by the same team is described

in the 1999 article titled “Biometric Encryption.” Ex-1026, Soutar. Dr. Ari Juels

also published closely related work entitled “A Fuzzy Commitment Scheme” in 1999.
`
`Ex-1027, Juels. Similarly, Davida, Frankel, and Matt also published an article
`
`entitled “On Enabling Secure Applications Through Off-line Biometric
`
`Identification” in 1998. Ex-1028, Davida. In all these systems, a cryptographic
`
`key is generated (derived) from a biometric sample (along with other information).
`
`In 2000, a fairly influential paper entitled “Biometric Decision Landscapes” by
`
`John Daugman also refers to biometric key generation. Ex-1029, Daugman. This
`
`basic concept that cryptographic keys were generated based on a biometric sample
`
`was well established and, in my opinion, a POSITA would have understood the
`
`
`
`
`term “biometric key” to mean a cryptographic key that is derived from biometric
`
`information.
`
`22. USR further attempts to disprove that Maritzen’s biometric key is
`
`determined or derived from biometric information by arguing that “it is more likely
`
`in [one] embodiment that the PTD simply stores the biometric key . . . and
`
`retrieves it for use at the time of a transaction” and “that such a biometric key is
`
`not determined or derived in any way from the user’s biometric information
`
`because the biometric key would not ever vary across different transactions, let
`
`alone based on information the user provides or generates.” POR, 23-24.
`
`Maritzen’s disclosure proves USR is wrong. As USR acknowledges, Maritzen
`
`clearly discloses that the privacy card “creates a biometric key.” Ex-1004,
`
`Maritzen, [0044], [0088], [0109], [0124], [0148], [0164].
`
`23. USR concludes that the biometric key must be stored because “it
`
`would be redundant and nonsensical for the PTD to create and use a biometric key
`
`to unlock itself after it has already validated the user’s biometric input” (POR, 23),
`
`but this argument is baseless. As USR itself points out, Maritzen discloses two
`
`embodiments. In a first embodiment, the privacy card is integrated in the PTD.
`
`Ex-1004, Maritzen, [0044] (“If privacy card 110 is within PTD 100, validation of
`
`the biometric information may be conducted by PTD 100.”). In a second
`
`embodiment, the privacy card is not integrated into the PTD. Id. (“Alternatively, if
`
`
`
`
`privacy card 110 is separate from PTD 100, validation is conducted by privacy
`
`card 110. Privacy card 110 only transmits the biometric key . . . The biometric key
`
`is used to unlock PTD 100 and to gain authorization of the financial transaction.”).
`
`In the first embodiment, the privacy card and the PTD may be integrated in the
`
`same physical device, but the privacy card still generates and transmits the
`
`biometric key to the PTD to unlock the PTD so there is nothing “nonsensical”
`
`about using the biometric key to unlock the PTD.
`
`24. USR also argues that “the PTD must retrieve and transmit the same
`
`biometric key for the user in every transaction in order for the transmitted key to
`
`match the known biometric key” that is stored at the clearing house. POR, 24. I
`
`disagree. There were many ways known in the art for dynamically generating
`
`biometric keys based on biometric information that could be compared against
`
`static, pre-stored biometric keys. As discussed above in paragraph 21, this concept
`
was well studied, and solutions were developed, starting in the mid-1990s, and
`
`would have been known to a POSITA. As a simple example, suppose the user
`
`registers a biometric sample S. The clearing house generates and stores biometric
`
`key C, which is generated as a random codeword in a suitable error-correcting
`
`code. The user device stores the value P=C XOR S. Note that P by itself does not
`
`reveal much, if any, information about either C or S. When the user device
`
`receives a new biometric sample S’, the user device can compute a biometric key
`
`
`
`
`C’ obtained by applying a decoding algorithm to find the codeword nearest P XOR
`
`S’. If S’ is sufficiently close to S, then C' will be exactly equal to C (this is the
`
`very nature of an error-correcting code). Instead of C, the system may use a hash
`
`H1 of C as the biometric key. Another hash H2 of C may be stored on the user
`
`device and compared to the hash H2 of C' to perform a local authentication of the
`
`user. In this simple example, the user device need not store a biometric key at the
`
`user device and send the same biometric key for every authentication attempt. It
`
`uniquely calculates the biometric key based on the user’s biometric information.
`
`Therefore, a POSITA would not have understood the biometric key of Maritzen to
`
`be necessarily stored in advance at the PTD.
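The error-correcting-code construction described in this paragraph can be illustrated with a short sketch. The Python code below is my own illustrative implementation, using a simple repetition code as the error-correcting code and SHA-256 for the hashes H1 and H2; the choice of code, the key length, and the sample length are assumptions made for illustration only, not details drawn from Maritzen or the cited references.

```python
import hashlib
import secrets

def repetition_encode(bits, r=5):
    # Encode each key bit by repeating it r times (a simple error-correcting code).
    return [b for bit in bits for b in [bit] * r]

def repetition_decode(bits, r=5):
    # Decode by majority vote within each block of r bits.
    return [1 if sum(bits[i:i + r]) > r // 2 else 0
            for i in range(0, len(bits), r)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def enroll(sample_bits, r=5):
    """Registration: pick a random codeword C; store only P = C XOR S and a check hash."""
    key_bits = [secrets.randbelow(2) for _ in range(len(sample_bits) // r)]
    codeword = repetition_encode(key_bits, r)
    helper = xor(codeword, sample_bits)               # P = C XOR S; reveals little by itself
    key = hashlib.sha256(bytes(key_bits)).digest()    # H1(C): the "biometric key"
    check = hashlib.sha256(key).digest()              # H2: stored for local authentication
    return helper, check, key

def authenticate(helper, check, new_sample_bits, r=5):
    """Fresh sample S': decode the codeword nearest P XOR S'; succeeds if S' is close to S."""
    key_bits = repetition_decode(xor(helper, new_sample_bits), r)
    key = hashlib.sha256(bytes(key_bits)).digest()
    return key if hashlib.sha256(key).digest() == check else None

# Demo: a noisy re-sample within the code's error tolerance recovers the same key.
S = [secrets.randbelow(2) for _ in range(100)]
helper, check, key = enroll(S)
S_noisy = list(S)
S_noisy[3] ^= 1   # flip one bit to simulate sensor noise
assert authenticate(helper, check, S_noisy) == key
```

As the demo shows, a fresh sample that differs from the enrolled sample within the code’s error tolerance reproduces exactly the same biometric key, so no biometric key need be stored on, or replayed by, the user device.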
`
2. It Would Have Been Obvious To Combine Maritzen With Jakobsson To Determine the Recited “First Authentication Information” From “First Biometric Information.”
25. Claim 1[h] recites that the first processor is “programmed to
`
`determine the first authentication information derived from first biometric
`
`information.” As I explained in my previous declaration, it would have been
`
`obvious to combine Maritzen with the combination function of Jakobsson to meet
`
`this claim limitation. Ex-1002, Shoup-Decl., ¶¶87-96. USR argues that “a
`
`POSITA would not have been motivated to combine Jakobsson’s ‘combination
`
`function’ with Maritzen because both references teach away from such a
`
`combination, the combination would change the basic principles under which
`
`
`
`
`Maritzen was designed to operate, and the combination would render Maritzen
`
`inoperable for its intended purpose.” POR, 25-26. USR’s arguments fail because
`
`they rely on a fabricated “mandate to maintain user anonymity and avoid sending
`
`any user information” (id. at 30) that fundamentally misconstrues the teachings of
`
`Maritzen and improperly casts implementation details as fundamental constraints.
`
`26. As discussed in further detail below, Maritzen discloses no mandate to
`
`“maintain user anonymity” and, at most, advises against sending unencrypted user
`
`information in certain circumstances. Maritzen discloses no mandate to “avoid
`
`sending any user information” (POR, 29-30) because it expressly stores and
`
`validates user information as part of remote authentication procedures. Thus,
`
`contrary to USR’s arguments, Jakobsson’s combination function and its teachings
`
`regarding the transmission and use of user information are compatible with
`
`Maritzen.
`
a) Maritzen Includes No Mandate To Avoid Sending Any User Information.
`27. USR argues that “Maritzen focuses on maintaining user anonymity
`
`and thus teaches that user information should never be sent or remotely verified”
`
`(POR, 30), but this is not true. Maritzen makes clear that any “anonymity” it
`
`discloses is directed at keeping sensitive user information away from a point-of-
`
sale device (i.e., the vehicle access payment gateway), not the remote verifier (i.e., the
`
`clearing house). Ex-1004, Maritzen, [0054] (“VAPGT 120 does not obtain
`
`
`
`
`information as to who the user is, who the financial processor 140 is, or the
`
`account being used. Thus the privacy of both the user and the financial processor
`
`is maintained.”), [0090] (“[T]ransaction key 340 may be transmitted directly to
`
`clearing house 130. No user information is transmitted to VAPGT 120.”). As I
`
`explained during my deposition, Maritzen only limits transmission of user data
`
`from the user device to the point-of-sale device (Maritzen’s VAPGT) during the
`
`authentication protocol. Ex-2005, Shoup Dep., 160:20-24 (“it’s not sent during
`
the protocol to the VAPGT”)3, 162:22-25 (“No biometric information identifying
`
`the user is transmitted any time during the authentication protocol”); 163:14-16
`
`(“It says here in this embodiment that no user information is transmitted to the
`
`VAPGT.”). Maritzen describes no limitations with respect to sharing user
`
`information with clearing house 130 during, for example, a set up stage. See, e.g.,
`
`Ex-1004, Maritzen, [0048] (“In addition, clearing house 130 may validate the
`
`transaction key against pre-existing user keys. In one embodiment, the user may
`
`set-up specific keys to conduct specific financial transactions.”).
`
`28.
`
`In fact, Maritzen’s clearing house stores and verifies various pieces of
`
`user information. For example, the clearing house memory includes an entire data
`
`structure dedicated to verifying user information. Ex-1004, Maritzen, [0080]
`
`3 Emphasis added throughout unless otherwise noted.
`
`
`
`
`(“User area 880 includes user account information 910, user keys 920, user
`
`certificates and profiles 930, historical transaction events 940, and pre-established
`
`biometric key 950.”), [0081] (“In one embodiment, clearing house 130 determines
`
`if transaction type 540 is consistent with historical transaction events 940
`
`conducted by the user. In addition, clearing house 130 may compare the current
`
`transaction type against pre-established user certificates and profiles 930. In
`
`addition, clearing house 130 may validate transaction key 340 against pre-existing
`
`user keys 920.”).
`
`Ex-1004, Maritzen, Figs. 8-9.
`
`29. Moreover, Maritzen does not prohibit user information from ever
`
`being transmitted. It merely teaches that sensitive information such as a biometric
`
`
`
`
`sample should not be exposed in the clear (i.e., unencrypted) during transmissions.
`
`For example, Maritzen discusses an embodiment of the user device wherein a
`
`biometric key is transmitted between the privacy card and the PTD. During this
`
`transmission, Maritzen explains that “[p]rivacy card 110 only transmits the
`
`biometric key. The biometric information identifying the user is not transmitted at
`
`any time.” Ex-1004, Maritzen, [0044]. USR cites [0044] in support of its
`
`argument that Maritzen includes a “mandate . . . [to] avoid sending any user
`
`information” (POR, 30), but USR misinterprets the meaning of [0044].
`
`30.
`
`First, [0044] only describes the transmission of biometric information
`
`between the privacy card and the PTD. Second, as I explained during my
`
`deposition, Maritzen contemplates the transmission of encrypted biometric
`
`information because such biometric information would not “identify[] the user.”
`
`Ex-2005, Shoup Dep., 201:19-202:1 (“Biometric information identifying the user
`
`would be information as presented that would identify the user and if that
`
`information were encrypted, for example, then that information wouldn't identify
`
`the user.”). Maritzen does not prohibit transmitting biometric information in any
`
`form whatsoever. It teaches that biometric information identifying the user is not
`
`transmitted at any time in communications between the privacy card and the PTD.
`
`Encrypted or cryptographically protected biometric information would not identify
`
`
`
`
`the user. Thus, at most, Maritzen advises against sending unprotected personal
`
`information, but in no way discourages sending encrypted personal information.
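The distinction drawn above, between biometric information that identifies the user and cryptographically protected biometric information, can be illustrated with a short sketch. The Python code below is my own hypothetical illustration, not anything disclosed in Maritzen; the HMAC-SHA256 transform and the per-device secret are assumptions chosen for illustration. A keyed transform of a biometric template can be transmitted without exposing the raw template to the recipient.

```python
import hashlib
import hmac
import secrets

# Hypothetical per-device secret; a party without it learns nothing useful
# about the user from the transmitted value.
device_secret = secrets.token_bytes(32)

def protect(template: bytes) -> bytes:
    """Transmit this keyed digest instead of the raw template; it does not
    identify the user to a recipient lacking the device secret."""
    return hmac.new(device_secret, template, hashlib.sha256).digest()

raw_template = b"minutiae:17,42,88;ridge:0.61"   # stand-in biometric data
wire_value = protect(raw_template)

assert raw_template not in wire_value   # raw bytes are not exposed on the wire
assert len(wire_value) == 32            # fixed size, independent of the template
```

The same template always protects to the same value under a given device secret, so a verifier holding that secret (or a pre-registered copy of the protected value) can still match it, while the transmitted bytes themselves do not identify the user.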
`
`b) Maritzen Includes No Mandate To Maintain User
`Anonymity.
`31. USR argues that Maritzen includes a “mandate to maintain user
`
`anonymity” (POR, 30), but Maritzen does not define the term “anonymous.”
`
`There are many levels of anonymity that may disclose varying levels of user
`
`information. While the strictest possible form of anonymity might require that no
`
`user identifying information whatsoever is ever transmitted, anonymous systems
`
`exist that send some forms of user information. For example, a system that
`
`transmits a user’s credit card information, but hides the user’s name and address
`
`would be considered anonymous. A system that does not transmit a name and
`
`address would be considered anonymous despite transmitting user information
`
`such as a user identification code, encrypted biometric information, or a device
`
`identifier. A system that transmits a user identification code, but hides the user’s
`
`credit card information would be considered anonymous. A system that transmits
`
`a user’s name and address to a back-end authentication server, but does not send
`
`the user’s name and address to a point-of-sale terminal would be considered
`
`anonymous as well. Anonymity can refer to any system that keeps some
`
`identifying information from some entity. Maritzen itself teaches that PTDs
`
`belong to single users and that PTD identifiers are sent along with other
`
`
`
`
information to authenticate the user for a transaction. Ex-1004, Maritzen,
`
`[0038] (“In one embodiment, PTD 100 is associated with a particular user such
`
`that only the particular user may access PTD 100 and conduct the financial
`
transaction using PTD 100.”); [0045] (“In one embodiment, the transaction key
`
`may include the biometric key and a PTD identifier. The PTD identifier identifies
`
`the particular PTD being used.”). The PTD identifier can be used to identify a
`
`specific user, but Maritzen’s system would still be considered an anonymous
`
`system.
`
c) My Proposed Combination Is Narrowly Focused On Discrete Teachings From Jakobsson.
32. USR argues that the combination of Maritzen and Jakobsson would
`
`require extensive changes to Maritzen’s system. POR, 32 (“Maritzen's PTD would
`
`need to be modified to maintain at least time, secret, event state, and user biometric
`
`information in order to generate an authentication code using Jakobsson's
`
`combination function . . . and to maintain and transmit associated user
`
`identification information.”). This argument is wrong and misconstrues my
`
`proposed combination. As the Board recognized, “Petitioner [relies] on Jakobsson
`
`only to the extent that Maritzen does not disclose expressly ‘a second device
`
`configured to receive second authentication information’ and ‘that the biometric
`
`key . . . is derived from the first biometric information.’” Institution Decision,
`
`Paper No. 8, 16. Contrary to USR’s assertions, such a combination is not