`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`APPLE INC.,
`Petitioner,
`v.
`UNIVERSAL SECURE REGISTRY, LLC,
`Patent Owner.
`_________________________________________
`Case IPR2018-00810
`U.S. Patent No. 9,100,826
`________________________________________
`
`
`REPLY TO PATENT OWNER’S RESPONSE
`
`
`
`
`
Contents

I.   Introduction ......................................................................................................... 1
II.  Argument ............................................................................................................. 1
   A. USR’s Proposed Claim Constructions Are Overly Narrow And
      Contravene BRI. .............................................................................................. 1
      1. “Biometric Information” Is An Example Of “Authentication
         Information.” ............................................................................................... 1
      2. USR’s Construction For “Enabling Or Disabling” A Device Is Unduly
         Narrow. ........................................................................................................ 4
   B. USR Fails To Overcome Petitioner’s Showing That The Challenged
      Claims Are Obvious. ........................................................................................ 5
      1. Maritzen’s Biometric Key Is “First Authentication Information”
         Derived/Determined From A “First Biometric Information.” .................... 5
      2. It Would Have Been Obvious To Combine Maritzen With Jakobsson To
         Determine The Recited “First Authentication Information” From “First
         Biometric Information.” .............................................................................. 7
      3. Maritzen’s “Biometric Information” Is The Claimed “Authentication
         Information.” ............................................................................................. 14
      4. It Would Have Been Obvious To Combine Maritzen With Jakobsson’s
         Teachings That “Second Biometric Information” Is Retrieved From
         Memory By A Second Device. ................................................................. 14
      5. Maritzen In View Of Jakobsson And Niwa Discloses A Second
         Processor “Configured To Receive A First Authentication Information.” .... 16
      6. Maritzen and Jakobsson Disclose Authenticating The First Entity “Based
         Upon The First Authentication Information And The Second Biometric
         Information.” ............................................................................................. 17
      7. Maritzen Discloses A “First Handheld Device.” ...................................... 19
      8. Maritzen Discloses A Processor Configured To “Enable Or Disable” Use
         Of The First Handheld Device Based On The Result Of A Comparison. ..... 21
      9. Maritzen In View Of Niwa Discloses Storing “Respective Biometric
         Information For A Second Plurality Of Users.” ....................................... 21
      10. USR Fails To Demonstrate Any Secondary Considerations of Non-
         Obviousness. ............................................................................................. 22
III. Conclusion ...................................................................................................... 26
`
`
`
`
I. Introduction
`USR’s Patent Owner Response (“POR”) repeats arguments that the Board
`
`already rejected, and fails to rebut Petitioner’s showing that the challenged claims
`
`are unpatentable. First, USR proposes improperly narrow constructions that not
`
`only contravene the broadest reasonable interpretation standard but also are
`
`inconsistent with plain meaning and the intrinsic evidence. Second, USR
`
`mischaracterizes the express teachings of Maritzen, Jakobsson, and Niwa, and the
`
`testimony of Petitioner’s expert, Dr. Shoup. Finally, USR fails to demonstrate any
`
`secondary considerations of non-obviousness whatsoever.
`
`II. Argument
`A. USR’s Proposed Claim Constructions Are Overly Narrow And
`Contravene BRI.
`1.
`“Biometric Information” Is An Example Of
`“Authentication Information.”
`Claiming that “biometric information” must be different from
`
`“authentication information,” as USR does (POR, 12-13), is inconsistent with the
`
`intrinsic evidence and the BRI standard. “Authentication information” is a set of
`
`information items that can be used to authenticate a user, and can include PINs,
`
`passwords, and biometric information. Ex-1018, Shoup-Decl., ¶12.
`
`1
`
`
`
`First, nothing in the claims requires that “authentication information” and
`
`“first biometric information” are mutually exclusive.1 Moreover, the claims recite
`
`two different elements that should not be conflated, as USR does: “authentication
`
`information” (with no modifier) and “first authentication information.” These are
`
`independent elements with no recited relationship. USR argues that
`
`“authentication information” (with no modifier) cannot be biometric information
`
`because the claims require determining “first authentication information” from the
`
`biometric information. POR, 14. However, the claims require that “first
`
`authentication information” be determined from “biometric information.” They do
`
`not require that “authentication information” (with no modifier) be determined
`
`from “biometric information.” Because “first authentication information” and
`
`“authentication information” (with no modifier) are not related, there is no
`
`restriction on the relationship between “authentication information” (with no
`
modifier) and “biometric information.” Ex-1018, Shoup-Decl., ¶13.

1 For example, a dependent claim could have read: “wherein the authentication information comprises the first biometric information.”
`
`The order of the claim steps does not support USR either, as it erroneously
`
`suggests (POR, 13-14). For example, system claim 1 only requires a processor that
`
`is configured to (a) “authenticate a user of the first handheld device based on
`
`
`
`authentication information,” and (b) “retrieve or receive first biometric information
`
`of the user of the first handheld device.” The claim does not require the processor
`
`to perform these steps in any particular sequence. Ex-1018, Shoup-Decl., ¶15.
`
`Moreover, method claims do not require any specific order of operations
`
`unless expressly set forth in the claim. Interactive Gift Exp., Inc. v. Compuserve
`
`Inc., 256 F.3d 1323, 1342 (Fed. Cir. 2001). Here, method claims 10 and 30 do not
`
`require any specific sequence.
`
`Second, the specification2 expressly identifies “biometric information” as
`
`one example of “authentication information” used by the system to verify the
`
`identity of an individual. Ex-1001, ’826 patent, 35:18-21 (“the act of receiving the
`
`first authentication information of the first entity comprises receiving biometric
`
information of the first entity”). Ex-1018, Shoup-Decl., ¶16.

2 USR argues that the term “system” is ambiguous, but challenged claims 1 and 21 claim a “system for authenticating identities.”
`
`Third, those of ordinary skill in the art would have understood that
`
`“authentication information” means any information used to authenticate a user,
`
`including biometric information. Ex-1018, Shoup-Decl., ¶17. Accordingly,
`
`Petitioner’s construction falls within the broadest reasonable interpretation of the
`
`phrase “authentication information.”
`
`
`
`
`
`2.
`
`USR’s Construction For “Enabling Or Disabling” A Device
`Is Unduly Narrow.
`Claims 7, 14, 26, and 34 recite a processor configured to “enable or disable
`
`use of [a] first handheld device based on a result of [a] comparison.” The Petition
`
`showed that Maritzen in view of Niwa discloses this limitation. Pet., 52-53. In its
`
`attempt to distinguish the prior art, USR suggests re-interpreting the plain language
`
`of “enabling or disabling use of the first handheld device based on a result of a
`
`comparison” with a 44-word construction. POR, 15. USR’s “construction” is
`
`unduly narrow and inconsistent with the BRI standard. Enabling or disabling use
`
`of a handheld device is a concept well understood by those of ordinary skill in the
`
`art and requires no construction. Ex-1018, Shoup-Decl., ¶18.
`
`USR argues that “disabling use” requires “reducing the range of
`
`functionality available to the user to less than what was previously available.”
`
`POR, 18-19. But the claim makes clear that the processor must merely disable
`
`“use” of the device. It does not require completely disabling the device itself (e.g.,
`
`turning the phone off), and it does not require any active reduction in functionality.
`
`“Disable” means “to make ineffective or inoperative.”3 Thus, for example, if a
`
`processor instructs a device to remain locked from performing a transaction, the
`
`
`
`
`device has been rendered ineffective or inoperative, even if the device is still
`
powered. Ex-1018, Shoup-Decl., ¶19.

3 Ex-1031, Disable, Merriam-Webster.com (2019), https://www.merriam-webster.com/dictionary/disable.
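For illustration only, the distinction can be expressed in a minimal Python sketch (all names hypothetical, not drawn from any reference): a processor that keeps a device locked based on the result of a comparison has disabled its use even though the device remains powered.

    import hmac

    class HandheldDevice:
        """Hypothetical device: while locked, use is disabled but power stays on."""
        def __init__(self, stored_key: bytes):
            self._stored_key = stored_key
            self._locked = True  # use is disabled by default

        def compare_and_update_lock(self, presented_key: bytes) -> None:
            # Enable or disable use based on the result of a comparison.
            if hmac.compare_digest(presented_key, self._stored_key):
                self._locked = False  # enable use
            else:
                self._locked = True   # remain locked: use disabled, device still powered

        def perform_transaction(self) -> str:
            return "refused: use disabled" if self._locked else "transaction authorized"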
`
`B. USR Fails To Overcome Petitioner’s Showing That The
`Challenged Claims Are Obvious.
`1. Maritzen’s Biometric Key Is “First Authentication
`Information” Derived/Determined From A “First Biometric
`Information.”
`The Petition showed that Maritzen discloses a biometric key (which
`
`corresponds to “first authentication information” in claims 1, 10, 21, and 30)
`
`derived from a user’s biometric information. Pet., 17. USR is incorrect in
`
`suggesting otherwise. POR, 23-25. Ex-1018, Shoup-Decl., ¶20.
`
`First, Maritzen’s biometric key is called the biometric key because it is
`
`derived from biometric information. A POSITA would have understood that the
`
`biometric key is not a physical key but rather is a cryptographic key (e.g., in the
`
`form of binary data) derived from or determined from biometric information.
`
`Second, Maritzen clearly discloses that the biometric key is determined in
`
`response to a biometric authentication. Ex-1004, Maritzen, [0044] (“if the
`
`biometric input is valid for the device, privacy card 110 creates a biometric key”).
`
Ex-1018, Shoup-Decl., ¶21.
`
`USR’s argument that the PTD in Maritzen “simply stores the biometric
`
`key,” without ever actually determining or deriving the biometric key from the
`
`user’s biometric information (POR, 23-24), fares no better. Maritzen clearly
`
`5
`
`
`
`discloses that the privacy card “creates a biometric key.” Ex-1004, Maritzen,
`
`[0044], [0088], [0109], [0124], [0148], [0164]. A POSITA would have understood
`
`that a biometric key is created from biometric information based on the use of the
`
`term “biometric key” in the field. Ex-1018, Shoup-Decl., ¶¶22-23.
`
`Furthermore, USR’s conclusion that the biometric key must be stored
`
`because “it would be redundant and nonsensical for the PTD to create and use a
`
`biometric key to unlock itself after it has already validated the user’s biometric
`
`input” (POR, 23) is inconsistent with a disclosed embodiment in Maritzen. In a
`
`first embodiment, the privacy card is integrated in the PTD. Ex-1004, Maritzen,
`
`[0044]. In a second embodiment, the privacy card is not integrated into the PTD.
`
`Id. In the first embodiment, the privacy card and the PTD may be integrated in the
`
`same physical device, but the privacy card still generates and transmits the
`
`biometric key to the PTD to unlock the PTD so there is nothing “nonsensical”
`
`about using the biometric key to unlock the PTD. Ex-1018, Shoup-Decl., ¶23.
`
`Finally, contrary to USR’s argument (POR, 24), the PTD does not have to
`
`retrieve and transmit the same biometric key for the user in every transaction for
`
`the transmitted key to match the known biometric key stored at the clearing house.
`
`There were many ways known in the art for dynamically generating biometric keys
`
`based on biometric information that could be compared against static, pre-stored
`
`6
`
`
`
`biometric keys. Thus, a POSITA would not have understood the biometric key of
`
`Maritzen to be stored in advance at the PTD. Ex-1018, Shoup-Decl., ¶24.
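For illustration only, one such known approach can be sketched in Python (a simplification that assumes the biometric template is stable across readings; practical systems use error-tolerant techniques such as fuzzy extractors): the key is re-derived from the biometric at each transaction and compared against a static, pre-stored copy, so the key itself need not be stored in advance at the device.

    import hashlib

    def derive_biometric_key(template: bytes, salt: bytes) -> bytes:
        # Derive a cryptographic key from biometric data via a key derivation
        # function, rather than storing the key itself on the device.
        return hashlib.pbkdf2_hmac("sha256", template, salt, 100_000)

    salt = b"hypothetical-device-salt"
    pre_stored_key = derive_biometric_key(b"enrolled template", salt)  # held by verifier

    fresh_key = derive_biometric_key(b"enrolled template", salt)  # regenerated per transaction
    assert fresh_key == pre_stored_key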
`
`2.
`
`It Would Have Been Obvious To Combine Maritzen With
`Jakobsson To Determine The Recited “First Authentication
`Information” From “First Biometric Information.”
`It would have been obvious to combine Maritzen with the combination
`
`function of Jakobsson to meet the limitation in claim 1 that the first processor is
`
`“programmed to determine the first authentication information derived from the
`
`first biometric information.” Pet., 40-46. USR’s arguments against this
`
`combination (POR, 25-26) fail because they rely on a fabricated “mandate to
`
`maintain user anonymity and avoid sending any user information” (id., 30) that
`
`fundamentally misconstrues the teachings of Maritzen and improperly casts
`
`implementation details as fundamental constraints. Ex-1018, Shoup-Decl., ¶25.
`
a) Maritzen Includes No Mandate To Avoid Sending User Information.
`Maritzen does not teach that user information should never be sent or
`
`remotely verified. It makes clear that any “anonymity” it describes relates to
`
`keeping sensitive user information away from a point-of-sale device (i.e., the
`
`vehicle access payment gateway), not a remote verifier (i.e., the clearing house).
`
`Ex-1004, Maritzen, [0054], [0090]. As Dr. Shoup explained at his deposition,
`
`Maritzen only limits transmission of user data from the user device to the point-of-
`
`sale device (Maritzen’s VAPGT) during the authentication protocol. Shoup-Dep.,
`
`7
`
`
`
`160:20-24 (“it’s not sent during the protocol to the VAPGT”)4, 162 (“no biometric
`
`information identifying the user is transmitted any time during the authentication
`
`protocol”), 163:14-16 (“It says here in this embodiment that no user information is
`
`transmitted to the VAPGT.”). Maritzen describes no limitations with respect to
`
`sharing user information with clearing house 130 during, for example, a set up
`
`stage. Ex-1004, Maritzen, [0048]. Ex-1018, Shoup-Decl., ¶¶26-27. In fact,
`
`Maritzen’s clearing house stores and verifies various pieces of user information.
`
`For example, the clearing house memory includes an entire data structure
`
`dedicated to verifying user information. Ex-1004, Maritzen, [0078], [0081]. Ex-
`
`1018, Shoup-Decl., ¶¶26, 28.
`
`
`4 Emphasis added throughout unless otherwise noted.
`
`8
`
`
`
`
`
`
`
`Ex-1004, Maritzen, Figs. 8-9.
`
`Moreover, Maritzen does not prohibit user information from ever being
`
`transmitted. It merely teaches that sensitive information such as a biometric
`
`sample should not be exposed in the clear (i.e., unencrypted) during transmissions.
`
`For example, Maritzen discusses an embodiment of the user device wherein a
`
`biometric key is transmitted between the privacy card and the PTD. During this
`
`transmission, Maritzen explains that “[p]rivacy card 110 only transmits the
`
`biometric key. The biometric information identifying the user is not transmitted at
`
`any time.” Ex-1004, Maritzen, [0044]. USR cites [0044] in support of its
`
`argument that Maritzen includes a “mandate…to avoid sending any user
`
`9
`
`
`
`information” (POR, 30), but USR misinterprets the meaning of [0044]. Ex-1018,
`
`Shoup-Decl., ¶¶26, 29.
`
`First, [0044] only describes the transmission of biometric information
`
`between the privacy card and the PTD. Second, Maritzen contemplates the
`
`transmission of encrypted biometric information because such biometric
`
`information would not “identify[] the user.” Ex-2005, Shoup-Dep., 201:19-202:1
`
`(“biometric information identifying the user would be information as presented
`
`that would identify the user and if that information were encrypted, for example,
`
`then that information wouldn’t identify the user”). Maritzen does not prohibit
`
`transmitting biometric information in any form whatsoever. It teaches that
`
`biometric information identifying the user is not transmitted at any time in
`
`communications between the privacy card and the PTD. Encrypted or
`
`cryptographically protected biometric information would not identify the user.
`
`Thus, at most, Maritzen advises against sending unprotected personal information,
`
`but in no way discourages sending encrypted personal information. Ex-1018,
`
`Shoup-Decl., ¶30.
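For illustration only, a minimal Python sketch (using the third-party cryptography package; all values hypothetical) shows why encrypted biometric data, unlike a cleartext sample, does not identify the user to an observer who lacks the key:

    from cryptography.fernet import Fernet  # third-party 'cryptography' package

    key = Fernet.generate_key()
    cipher = Fernet(key)

    biometric_sample = b"hypothetical fingerprint template"
    ciphertext = cipher.encrypt(biometric_sample)  # reveals nothing without the key

    assert cipher.decrypt(ciphertext) == biometric_sample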
`
b) Maritzen Includes No Mandate To Maintain User Anonymity.
`Maritzen teaches no “mandate to maintain user anonymity” as USR asserts.
`
`POR, 30. Maritzen does not even define the term “anonymous,” and, as Dr. Shoup
`
`explains, a POSITA would understand that there are many levels of anonymity that
`
`10
`
`
`
`may disclose varying levels of user information. While the strictest possible form
`
`of anonymity might require that no user identifying information whatsoever is ever
`
`transmitted, anonymous systems exist that send some forms of user information.
`
`For example, a system that does not transmit a name and address would be
`
`considered anonymous despite transmitting user information such as a user
`
`identification code, encrypted biometric information, or a device identifier.
`
`Maritzen itself teaches that PTDs belong to single users and that PTD identifiers
`
`are sent along with other information to authenticate the user for a transaction. Ex-
`
`1004, Maritzen, [0038], [0045]. The PTD identifier can be used to identify a
`
`specific user, but Maritzen’s system would still be considered an anonymous
`
`system. Ex-1018, Shoup-Decl., ¶31.
`
c) Petitioner Only Relies On Jakobsson For The Limited Application Of Known Techniques.
`The combination of Maritzen and Jakobsson would be a limited application
`
`of known techniques (explicitly disclosed in Jakobsson) regarding the derivation of
`
`authentication information from biometric information and the receipt of second
`
`authentication information (Pet., 33-35, 41-46), and not, as USR suggests, “an
`
`overhaul of Maritzen’s system” (POR, 32). As the Board recognized, “Petitioner
`
`[relies] on Jakobsson only to the extent that Maritzen does not disclose expressly ‘a
`
`second device configured to receive second authentication information’ and ‘that
`
`11
`
`
`
`the biometric key . . . is derived from the first biometric information.’” DI, 16.
`
Ex-1018, Shoup-Decl., ¶32.
`
d) Jakobsson’s Teachings Are Compatible With Maritzen.
`USR argues that Maritzen and Jakobsson are directed to “entirely different
`
`fields and problems” (POR, 34-35), but that is plainly incorrect. As the Petition
`
`explained (Pet., 34-40), both Maritzen and Jakobsson are directed toward secure
`
`financial transactions that address the issue of electronic fraud. It is irrelevant
`
`whether Maritzen discloses an embodiment of a vehicle payment system because a
`
`POSITA would have understood that the electronic authentication techniques
`
`taught by Jakobsson and Maritzen are readily transferable across both systems.
`
`Ex-1018, Shoup-Decl., ¶35. As USR acknowledges (POR, 10-11), a POSITA
`
`would have been a trained electrical engineer with years of experience, and
`
`therefore would have had the understanding and creativity necessary to combine
`
`these references. See KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 421 (2007) (“A
`
`person of ordinary skill is also a person of ordinary creativity, not an automaton.”).
`
`USR similarly misconstrues the skill of a POSITA in suggesting the
`
`combination of Maritzen and Jakobsson would render Maritzen unable to “achieve
`
`its purpose of maintaining user anonymity and avoiding transmission of user
`
`information.” POR, 33. Maritzen only advises against sending user information in
`
`an unprotected fashion, and Jakobsson’s authentication does not send any such
`
`12
`
`
`
`information in the clear. For example, Jakobsson discloses that an authentication
`
`code [first authentication information] is determined from User Data (P) [first
`
`biometric information] using a combination function 230 that cryptographically
`
`obfuscates any user information and “protects it from being compromised.”
`
`Jakobsson, [0072]. A POSITA would have understood that a one-way function or
`
`key derivation function would have maintained user anonymity and protected
`
against unwanted disclosure of user information. Ex-1018, Shoup-Decl., ¶33.
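For illustration only, a generic one-way keyed derivation of this kind can be sketched in Python (a simplification; Jakobsson’s combination function 230 may combine additional inputs, and all names here are hypothetical):

    import hashlib
    import hmac

    def combination_function(user_data_p: bytes, device_secret: bytes) -> str:
        # One-way keyed derivation: the resulting authentication code can be
        # verified by a party holding the secret, but does not expose user_data_p.
        return hmac.new(device_secret, user_data_p, hashlib.sha256).hexdigest()

    auth_code = combination_function(b"hypothetical biometric user data", b"shared secret")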
`
` USR’s remaining attempts to distinguish Maritzen and Jakobsson also fail.
`
`USR purports to distinguish Jakobsson’s system by pointing out that it is directed
`
`toward a “personal (as opposed to vehicle) event detecting and alert system”
`
`(POR, 35), but Maritzen’s device is also called a “personal transaction device.”
`
`Ex-1004, Maritzen, [0002]. USR further argues that “Jakobsson’s user device can
`
`be a credit card device, key fob, USB dongle or cellular telephone…as opposed to
`
`Maritzen’s vehicle-based PTD device.” POR, 35. But each of a credit card device,
`
`a key fob, a USB dongle, and a cellular telephone is a “personal transaction
`
`device” or PTD. Moreover, Figure 6a of Maritzen makes clear that Maritzen’s
`
`PTD can be a cellular telephone. Ex-1004, Maritzen, [0016] (“FIGS. 6 a and 6 b
`
`are examples of a personal transaction device with integrated privacy card”). Ex-
`
`1018, Shoup-Decl., ¶34.
`
`13
`
`
`
`
`
`Ex-1004, Maritzen, Fig. 6a.
`
`3. Maritzen’s “Biometric Information” Is The Claimed
`“Authentication Information.”
`Claim 1 requires a first processor “programmed to authenticate a user of the
`
`first handheld device based on authentication information.” As shown in the
`
`Petition, Maritzen discloses this limitation. Pet., 20-21. Ex-1018, Shoup-Decl.,
`
`¶36.
`
`USR’s argument to the contrary – “a POSITA would have understood that
`
`Maritzen’s ‘biometric information’ does not constitute ‘authentication
`
`information’” (POR, 38) – relies on an incorrect claim construction, as explained
`
`above in Section II.A.1. A POSITA would have understood that authentication
`
`information is any information used by the system to verify the identity of an
`
`individual, including biometric information. Ex-1018, Shoup-Decl., ¶37.
`
`4.
`
`It Would Have Been Obvious To Combine Maritzen With
`Jakobsson’s Teachings That “Second Biometric
`
`14
`
`
`
`Information” Is Retrieved From Memory By A Second
`Device.
`A POSITA would have understood that Jakobsson’s teachings regarding the
`
`retrieval or receipt of second biometric information would have been entirely
`
`compatible with the teachings of Maritzen because, contrary to USR’s argument
`
`(POR, 38-40), Maritzen does not contain any prohibitions against using user
`
`information to authenticate the user. Ex-1018, Shoup-Decl., ¶38.
`
`First, as discussed above, Maritzen discusses anonymity with respect to the
`
`point-of-sale device, and not with respect to the clearing house. Maritzen already
`
`expressly discloses storing user information. Ex-1004, Maritzen, [0081], Fig. 9.
`
`Additionally, Maritzen only advises against transmitting unencrypted biometric
`
`information between the privacy card and the PTD. Maritzen does not, as USR
`
`contends (POR, 27-30), contain a blanket prohibition of the transmission or use of
`
`user information. USR mischaracterizes the Maritzen reference by arguing that it
`
`“repeatedly stresses that neither biometric information identifying the user nor any
`
`other user information is transmitted from the PTD at any time during the
`
`transaction” (internal quotations omitted). POR, 47. But Maritzen’s admonition
`
`against sending biometric information is only related to transmissions from the
`
privacy card to the PTD in an embodiment where the privacy card and PTD are
`
`separate. There is no admonition against sending biometric information “from the
`
`PTD.” Ex-1004, Maritzen, [0044]. Ex-1018, Shoup-Decl., ¶39.
`
`15
`
`
`
`USR is also incorrect that “Maritzen’s clearing house only receives and
`
`verifies a ‘transaction key’ and ‘biometric key,’ not biometric information or any
`
`other user information.” POR, 39-40. As Figures 8 and 9 disclose, Maritzen’s
`
`clearing house stores a variety of user information used for authenticating a user
`
`including user account information, user keys, and user certificates and profiles.
`
`Maritzen also receives various pieces of user information to authenticate the user
`
`in connection with its stored user information. Ex-1004, Maritzen, [0081].
`
`Accordingly, it would have been obvious to store biometric information in the
`
`“user area” memory of the clearing house and to retrieve or receive biometric
`
`information from this “user area” to conduct authentication. Pet., 62-65. Ex-1018,
`
`Shoup-Decl., ¶40.
`
`Finally, as Dr. Shoup explains, distributing the functionality of systems was
`
`well known, and a POSITA would have understood that communications between
`
`such systems would have been encrypted to protect against the disclosure of
`
`sensitive information – like user data. Therefore, USR is incorrect to suggest that
`
biometric information would never be received in Maritzen’s system, and that there
`
`would be no motivation to combine with Jakobsson’s distributed system, where
`
`biometric information is sent from device to device. Ex-1018, Shoup-Decl., ¶41.
`
`5. Maritzen In View Of Jakobsson And Niwa Discloses A
`Second Processor “Configured To Receive A First
`Authentication Information.”
`
`16
`
`
`
`USR argues that Petitioner fails to meet its burden with respect to limitation
`
`21[h] because Petitioner “exclusively asserts that ‘Jakobsson discloses this
`
`limitation’ and relies solely on its analysis of limitation 1[j] of claim 1, which only
`
`addresses Maritzen.” POR, 40. However, Petitioner’s analysis under limitation
`
`1[j] (as referenced explicitly through an internal citation) clearly demonstrates that
`
`Maritzen discloses limitation 21[h]. Pet., 47 (“Maritzen discloses that the clearing
`
`house CPU 810 [second processor] is configured to receive the biometric key
`
`[first authentication information of the user of the first handheld device] from
`
`the VAPGT.”). Ex-1018, Shoup-Decl., ¶42.
`
`6. Maritzen and Jakobsson Disclose Authenticating The First
`Entity “Based Upon The First Authentication Information
`And The Second Biometric Information.”
`Maritzen and Jakobsson plainly disclose the requirement in claim limitation
`
`30[e] of authenticating “the first entity based upon the first authentication
`
`information and the second biometric information.” USR’s argument to the
`
`contrary (POR, 40-43) overlooks the proof presented by Petitioner. Ex-1018,
`
`Shoup-Decl., ¶43.
`
`First, as the Petition explains with respect to limitation 30[d] (by reference
`
`to limitation 21[i]), Jakobsson discloses “receiving the second biometric data of the
`
`first user at the second device.” Pet., 62, 74. This disclosure makes clear that an
`
`authentication code (i.e., second authentication information) can be generated from
`
`17
`
`
`
`biometric information. See also Pet., 17, 33-34, 50 (explaining that Jakobsson’s
`
authentication code and Maritzen’s pre-stored biometric key are both second
`
`authentication information). Ex-1018, Shoup-Decl., ¶44.
`
`Second, as explained in limitation 30[c], Maritzen discloses “receiving with
`
`a second device…the first authentication information.” Pet., 74, 47 (“Maritzen
`
`discloses that the clearing house CPU 810 [second processor] is configured to
`
`receive the biometric key [first authentication information of the user of the
`
`first handheld device] from the VAPGT.”). Ex-1018, Shoup-Decl., ¶45.
`
`Claim limitation 30[e] merely requires “authenticating…the identity of the
`
`first entity based on the first authentication information and the second biometric
`
`information.” As discussed above with respect to limitations 30[b] and 30[c],
`
`second authentication information (whether Jakobsson’s authentication code or
`
`Maritzen’s pre-stored biometric key) can be derived from biometric information,
`
`and, as the Petition explains with respect to limitation 30[e], “Maritzen discloses
`
`that the clearing house CPU 810 [second processor] is configured to compare the
`
`biometric key [first authentication information] with the pre-established
`
`biometric key 950 [second authentication information] to authenticate the
`
`identity of the user.” Pet., 48, 74. Accordingly, Petitioner has shown that claim 30
`
is invalid in light of Maritzen and Jakobsson. See also Pet., 49-52 (claim 2), 58
`
`(claim 11). Ex-1018, Shoup-Decl., ¶46.
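For illustration only, the comparison step can be sketched in Python (element names borrow Maritzen’s numbering solely for readability; the implementation is hypothetical):

    import hmac

    def authenticate_identity(received_biometric_key: bytes,
                              pre_established_key_950: bytes) -> bool:
        # Compare the received first authentication information against the
        # stored key; a constant-time comparison avoids timing leaks.
        return hmac.compare_digest(received_biometric_key, pre_established_key_950)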
`
`18
`
`
`
`7. Maritzen Discloses A “First Handheld Device.”
`As explained throughout the Petition, Maritzen’s PTD is a first handheld
`
`device (Pet., 20, 26, 55, 56, 59), and USR cites nothing within Maritzen to support
`
`its claim that the PTD is “more likely” mounted to the vehicle than handheld
`
`(POR, 45). Moreover, USR’s purported distinction is irrelevant because handheld
`
`devices can be mounted to a vehicle. Ex-1018, Shoup-Decl., ¶47.
`
`First, the term PTD stands for “personal transaction device.” A POSITA
`
`would have understood that personal devices are typically kept on a user’s person
`
`– hence the term personal. Personal devices are typically small and capable of
`
`being held in one’s hand. USR argues that “the word ‘handheld’ does not appear
`
`in Maritzen,” but Maritzen describes various embodiments of the PTD and its
`
`embedded privacy card as being “the size of a credit card.” Ex-1004, Maritzen,
`
`[0069]. A POSITA would have understood that an integrated system that is the
`
`size of a credit card is handheld. Ex-1017, Jakobsson-Dep., 209:4-9 (“Q. My
`
`question to you is, credit cards are handheld, correct? …A. Yes, a credit card could
`
`be held in your hand.”). Ex-1018, Shoup-Decl., ¶48.
`
`Second, USR argues that “many ‘personal’ devices are not handheld,
`
`especially those devices that are intended for use in a vehicle (e.g., conventional
`
`toll transponders, navigation systems, radar detectors, etc.)” (POR, 44), but no
`
`POSITA would consider any of these devices to be “personal” devices. Moreover,
`
`19
`
`
`
`all these devices are capable of being held in one’s hand. Notably, USR does not,
`
`and cannot, cite any support for its contention that the PTD is “more likely” to be
`
`mounted to the vehicle. For example, Maritzen discloses no mounting apparatus or
`
`adhesive that could be used to attach the PTD to a vehicle. Ex-1018, Shoup-Decl.,
`
`¶49.
`
`Third, Figures 6a and 6b disclose embodiments of the PTD that are clearly
`
`handheld devices. For example, Figure 6a is a cellular telephone, which is a
`
`handheld device. Maritzen further explains that elements 630 and 660 are
`
`biometric input devices (i.e., fingerprint scanners), which reveals that the
`
`dimensions of the entire device are proportioned for handheld use. Ex-1004,
`
`Maritzen, [0076]. Ex-1018, Shoup-Decl., ¶¶50-51.
`
`
`
`20
`
`
`
`Ex-1004, Maritzen, Figs. 6a-6b.
`
`8. Maritzen Discloses A Processor Configured To “Enable Or
`Disable” Use Of The First Handheld Device Based On The
`Result Of A Comparison.
`Maritzen discloses enabling or disabling use of a handheld device as a result
`
`of a comparison, as claims 7, 14, 26, and 34 require. Pet., 52-54. USR’s argument
`
`that Maritzen fails to disclose this limitation (POR, 48-50) relies on an incorrect
`
`understanding of what it means to enable or disable use of the first handheld
`
`device. While USR contends that Maritzen itself distinguishes between the PTD
`
`disabling itself and the PTD remaining locked (POR, 50), Maritzen makes no such
`
`distinction. Furthermore, USR has not articulated any distinction between a locked
`
`PTD and a disabled PTD. A locked PTD and a disabled PTD are both rendered
`
`inoperable for some functional purpose. For example, Maritzen explains that
`
`“PTD 100 is disabled such that the user may not access the PTD 100.” Ex-1004,
`
`Maritzen, [0056]. Similarly, a successful verification would “unlock PTD 100”
`
`such that the user may access PTD 100. Maritzen makes no distinction between
`
`these terms. Ex-1018, Shoup-Decl., ¶¶52-53.
`
`9. Maritzen In View Of Niwa Discloses Storing “Respective
`Biometric Information For A Second Plurality Of Users.”
`In its Petition, Petitioner explained that Maritzen in view of Niwa discloses
`
`claim 15 for the same reasons as claim 8. Pet., 56, 59. USR faults Petitioner for
`
`referring back to its argument with respect to claim 8 (POR, 51-52), but identifies
`
`21
`
`
`
`no reason that the same disclosure on which Petitioner relies for claim 8 would not
`
apply equally to claim 15. Ex-1018, Shoup-Decl., ¶¶54-55.
`
`In