
UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

APPLE INC.,
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC,
Patent Owner.
_________________________________________

Case IPR2018-00813
U.S. Patent No. 9,100,826
_________________________________________

PETITIONER’S REPLY TO PATENT OWNER RESPONSE

Contents

I. Introduction
II. Argument
   A. USR’s Proposed Claim Constructions Are Overly Narrow And Contravene BRI.
      1. “Biometric Information” Is An Example Of “Authentication Information.”
      2. USR’s Construction For “Enabling Or Disabling” A Device Is Unduly Narrow.
   B. USR Fails To Overcome Petitioner’s Showing That Claims 1, 2, 10, 11, 21, 22, 24, 27, 30, And 31 Are Anticipated By Jakobsson.
      1. Jakobsson Discloses A Second Processor Configured To “Receive Or Retrieve” Second Authentication Information.
   C. USR Fails To Overcome Petitioner’s Showing That Claims 7, 14, 26, And 34 Are Obvious Over Jakobsson In View of Verbauwhede and Maritzen.
      1. Jakobsson In View Of Maritzen Discloses “Enabling Or Disabling” A Device.
      2. It Would Have Been Obvious To Combine Jakobsson With Maritzen’s Teachings On “Enabling or Disabling” A Device.
      3. The Superficial Differences Identified By USR Would Not Have Dissuaded A POSITA From Combining Jakobsson With Maritzen.
      4. Jakobsson In View Of Verbauwhede Discloses Comparing Information To Conduct An Authentication.
      5. It Would Have Been Obvious To Combine Jakobsson With Verbauwhede.
   D. USR Fails To Overcome Petitioner’s Showing That Claims 8 and 15 Are Obvious In View Of Jakobsson And Gullman.
      1. Gullman Discloses A First Device Configured To Store Biometric Information For A Second Plurality Of Users.
      2. It Would Have Been Obvious To Combine Jakobsson With Gullman.
   E. USR Fails To Demonstrate Secondary Considerations Of Non-Obviousness.
III. Conclusion
I. Introduction
USR’s Patent Owner Response (“POR”) repeats arguments that the Board already rejected, and fails to rebut Petitioner’s showing that the challenged claims are unpatentable. First, USR proposes improperly narrow constructions that not only contravene the broadest reasonable interpretation standard but also are inconsistent with plain meaning and the intrinsic evidence. Second, USR mischaracterizes the express teachings of Jakobsson, Maritzen, and Gullman, and the testimony of Petitioner’s expert, Dr. Shoup. Finally, USR fails to demonstrate any secondary considerations of non-obviousness whatsoever.
II. Argument

A. USR’s Proposed Claim Constructions Are Overly Narrow And Contravene BRI.

1. “Biometric Information” Is An Example Of “Authentication Information.”
Claiming that “biometric information” must be different from “authentication information,” as USR does (POR, 19-20), is inconsistent with the intrinsic evidence and the BRI standard. “Authentication information” is a set of information items that can be used to authenticate a user, and can include PINs, passwords, and biometric information.1 Ex-1118, Shoup-Decl., ¶12.

1 Petitioner notes that Jakobsson discloses authenticating a user based on a PIN, a password, and biometric information, all of which are examples of “authentication information.” Pet., 21; Ex-1104, Jakobsson, [0059] (“a first authentication of user 110 is performed by the user authentication device 120 based on information supplied to the authentication device 120 by the user 110. For example, the information supplied by the user may be a PIN, a password or biometric information”).

First, nothing in the claims requires that “authentication information” and “first biometric information” are mutually exclusive.2 Ex-1118, Shoup-Decl., ¶13. Moreover, the claims recite two different elements that should not be conflated, as USR does: “authentication information” (with no modifier) and “first authentication information.” These are independent elements with no recited relationship. USR argues that “authentication information” (with no modifier) cannot be biometric information because the claims require determining “first authentication information” from the biometric information. POR, 14. However, the claims require that “first authentication information” be determined from “biometric information.” They do not require that “authentication information” (with no modifier) be determined from “biometric information.” Because “first authentication information” and “authentication information” (with no modifier) are not related, there is no restriction on the relationship between “authentication information” (with no modifier) and “biometric information.” Ex-1118, Shoup-Decl., ¶14.

2 For example, a dependent claim could have read: “wherein the authentication information comprises the first biometric information.”

The order of the claim steps does not support USR either, as it erroneously suggests. POR, 20-21. For example, system claim 1 only requires a processor that is configured to (a) “authenticate a user of the first handheld device based on authentication information,” and (b) “retrieve or receive first biometric information of the user of the first handheld device.” The claim does not require the processor to perform these steps in any particular sequence. Ex-1118, Shoup-Decl., ¶15.

Moreover, method claims do not require any specific order of operations unless expressly set forth in the claim. Interactive Gift Exp., Inc. v. Compuserve Inc., 256 F.3d 1323, 1342 (Fed. Cir. 2001). Here, method claims 10 and 30 do not require any specific sequence because they do not expressly recite one.

Second, the specification3 expressly identifies “biometric information” as one example of “authentication information” used by the system to verify the identity of an individual. Ex-1101, ’826 patent, 35:18-21 (“the act of receiving the first authentication information of the first entity comprises receiving biometric information of the first entity”). Ex-1118, Shoup-Decl., ¶16.

3 USR argues that the term “system” is ambiguous, but challenged claims 1 and 21 claim a “system for authenticating identities.”

Third, those of ordinary skill in the art would have understood that “authentication information” means any information used to authenticate a user, including biometric information. Accordingly, Petitioner’s construction falls within the broadest reasonable interpretation of the phrase “authentication information.” Ex-1118, Shoup-Decl., ¶17.
2. USR’s Construction For “Enabling Or Disabling” A Device Is Unduly Narrow.
Claims 7, 14, 26, and 34 recite a processor configured to “enable or disable use of [a] first handheld device based on a result of [a] comparison.” Jakobsson in view of Verbauwhede and Maritzen discloses this limitation. Pet., 55-60. In its attempt to distinguish the prior art, USR suggests re-interpreting the plain language of “enabl[ing] or disabl[ing] use of the first handheld device based on a result of [a] comparison” with a 44-word construction. POR, 23. USR’s “construction” is unduly narrow and inconsistent with the BRI standard. Enabling or disabling use of a handheld device is a concept well understood by those of ordinary skill in the art and requires no construction. Ex-1118, Shoup-Decl., ¶18.

USR argues that “disabling use” requires “reducing the range of functionality available to the user to less than what was previously available.” POR, 26. But the claim makes clear that the processor must merely disable “use” of the device. It does not require completely disabling the device itself (e.g., turning the phone off), and it does not require any active reduction in functionality. “Disable” means “to make ineffective or inoperative.”4 Thus, for example, if a processor instructs a device to remain locked from performing a transaction, the device has been rendered ineffective or inoperative, even if the device is still powered. Ex-1118, Shoup-Decl., ¶19.

4 Ex-1131, Disable, Merriam-Webster.com (2019), https://www.merriam-webster.com/dictionary/disable
B. USR Fails To Overcome Petitioner’s Showing That Claims 1, 2, 10, 11, 21, 22, 24, 27, 30, And 31 Are Anticipated By Jakobsson.

1. Jakobsson Discloses A Second Processor Configured To “Receive Or Retrieve” Second Authentication Information.
Claim 1 recites a second processor configured to “retrieve or receive” second authentication information, which as the Petition explained, Jakobsson discloses. Pet., 35, 29-31. Ex-1118, Shoup-Decl., ¶20. In response, USR suggests that Jakobsson’s verifier only “derives or creates” second authentication information, and that deriving or creating requires CPU registers while “retrieving or receiving” requires RAM. POR, 29. These arguments are technically specious, and find no support in the ’826 specification or the Jakobsson reference. Ex-1118, Shoup-Decl., ¶21; Ex-1120, Juels-Decl., ¶¶44-50.

First, USR acknowledges, as it must, that the claimed “retrieving or receiving” refers to transferring data from memory. POR, 31 (“the claimed data retrieval or receipt is from … memory”). It is irrelevant whether data is stored in CPU registers (which USR acknowledges Jakobsson discloses, POR, 30), RAM, ROM, or any other kind of well-known memory device. The data still must be retrieved or received from the memory to perform the authentication. As explained in the Petition, “[a] POSITA would have understood that the … [second authentication information] must be stored in memory (e.g., random-access memory [RAM]) once it is derived or generated and that it must be retrieved or received from memory to perform the comparison with [the first authentication information].” Pet., 29-30. Ex-1118, Shoup-Decl., ¶22; Ex-1120, Juels-Decl., ¶¶44-50.

Second, USR erroneously asserts that “retrieval or receipt is from long term memory, such as RAM” rather than CPU registers, and that “[d]ata is read from registers…[while] data is received or retrieved from …RAM.” POR, 31. This argument is technically wrong and nonsensical. There is no distinction between reading data from a memory device and receiving or retrieving data from a memory device because these terms are all synonymous with respect to the transfer of data from a memory device to a processor. Notably, USR identifies no disclosure in the ’826 patent to support its interpretation because the ’826 patent makes no distinction between reading data from register memory and “retrieving or receiving” data from any other memory. Ex-1118, Shoup-Decl., ¶23; Ex-1120, Juels-Decl., ¶¶44-50.

Furthermore, RAM and CPU registers are both examples of short-term memory that can be used to store intermediate values during the execution of a computer algorithm. A POSITA would have understood that RAM is a temporary storage location (like CPU registers) for calculating intermediate values. In fact, RAM is often considered “volatile” memory because it is unable to store information once a power source is removed. Thus, there is no meaningful distinction between CPU registers and RAM because they both represent well-known short-term memory devices that can be used to facilitate the comparison of information described in Jakobsson. Ex-1118, Shoup-Decl., ¶24; Ex-1120, Juels-Decl., ¶¶44-50.
Third, a POSITA would understand that the authentication procedure described in Jakobsson can be implemented using RAM, ROM, CPU registers, flash memory or any other common memory device, and that USR’s assertion that Jakobsson requires storing data within CPU registers (POR, 30-31) is therefore incorrect. RAM, ROM, flash, disk memory, and other such memory devices were well-known to those of skill in the art. See, e.g., Ex-1105, Maritzen, [0036]. Jakobsson expressly discloses that the verifier 105 can be “implemented as software running on a server class computer including a processor, memory, and so on, to enable authentication of a large number of users, for example, in an enterprise…[or] implemented as software running on a desktop computer, laptop computer, special-purpose device, or personal digital assistant (PDA)… [or] as a software program running on a general-purpose computer.” Ex-1104, Jakobsson, [0038]. A POSITA would have understood that any of these embodiments would have included RAM, ROM, or other well-known memory devices from which data is retrieved or received. Ex-1118, Shoup-Decl., ¶25; Ex-1120, Juels-Decl., ¶¶44-50.

Finally, USR ignores Jakobsson’s disclosure that the verifier can be implemented on a “computer … interacting with one or more other computer programs on the same or a different computer.” Ex-1104, Jakobsson, [0038]. As explained in the Petition, a POSITA would have understood that the derivation of “second authentication information” can be implemented on a “different” program or computer or in hardware and that the verifier can be configured to retrieve or receive the “second authentication information” from the “different” program or computer or from hardware. Pet., 30-31; Ex-1104, Jakobsson, [0038], [0139]. Ex-1118, Shoup-Decl., ¶26; Ex-1120.
C. USR Fails To Overcome Petitioner’s Showing That Claims 7, 14, 26, And 34 Are Obvious Over Jakobsson In View of Verbauwhede and Maritzen.

1. Jakobsson In View Of Maritzen Discloses “Enabling Or Disabling” A Device.
Claims 7, 14, 26, and 34 require a processor configured to “enable or disable use of [a] first handheld device based on a result of [a] comparison.” Jakobsson in view of Maritzen discloses enabling or disabling use of a handheld device as a result of a comparison. Pet., 55-60. Ex-1118, Shoup-Decl., ¶27. USR admits that Maritzen discloses locking and unlocking a device as a result of a comparison, but argues that this is somehow different from enabling or disabling use. POR, 33-36. As discussed above, a POSITA would not have viewed the claim to be so limited. Nor does Maritzen recognize a distinction between the PTD disabling itself and the PTD remaining locked, and USR has not articulated any. A locked PTD and a disabled PTD are both rendered inoperable for some functional purpose. For example, Maritzen explains that “PTD 100 is disabled such that the user may not access PTD 100.” Ex-1105, Maritzen, [0056]. Similarly, a successful verification would “unlock PTD 100” such that the user may access PTD 100. Ex-1105, Maritzen, [0067]. Maritzen makes no distinction between these terms. Ex-1118, Shoup-Decl., ¶28.
2. It Would Have Been Obvious To Combine Jakobsson With Maritzen’s Teachings On “Enabling or Disabling” A Device.
USR’s argument that Jakobsson is incompatible with Maritzen because Jakobsson includes certain embodiments where an event state is reported even when local authentication fails (POR, 36-37) overlooks other embodiments where the event state is not necessarily reported when local authentication fails. Ex-1118, Shoup-Decl., ¶29; Ex-1120, Juels-Decl., ¶¶54-56.

For example, Jakobsson discloses that the device can store failed authentications as an event state. Ex-1104, Jakobsson, [0052] (“reportable event(s) . . . [include] authentication quality (e.g., a number of PIN errors prior to successful authentication, strength of a biometric match, etc.)”). Jakobsson further discloses that “in some embodiments…the occurrence of the event is communicated in identity authentication codes output by the device subsequent to the occurrence of the reportable event.” Ex-1104, Jakobsson, [0015]. If authentication fails, the device can be disabled (as taught by Maritzen), and the failed authentication is stored as an event state (as taught by Jakobsson) and sent “subsequent to the occurrence of the reportable event” (e.g., when the device is activated upon a successful authentication). Moreover, Jakobsson explains that an authentication code (which includes the event state) is generated if the first authentication is successfully verified. Ex-1104, Jakobsson, [0059] (“a first authentication of user 110 is performed… [i]f the first authentication is successfully verified by the authentication device 120, the device 120 generates an identity authentication code.”). Jakobsson does not require that the authentication code (and a corresponding event state) is generated regardless of the authentication result, and nothing in Jakobsson requires that event states are sent for every authentication attempt or with every authentication code. In fact, Jakobsson discloses that “[t]he authentication device can verify the correctness of user data, and only provide an identity authentication code if the user data is correct.” Ex-1104, Jakobsson, [0111] (emphases added). If no authentication code is provided unless the user data is correct, then the event state (contained in the authentication code) is not provided unless the user data is correct. Thus, Maritzen’s teaching that the device can remain locked upon a failed authentication is entirely consistent with Jakobsson. Ex-1118, Shoup-Decl., ¶30; Ex-1120, Juels-Decl., ¶¶54-56.
3. The Superficial Differences Identified By USR Would Not Have Dissuaded A POSITA From Combining Jakobsson With Maritzen.
USR attempts to distinguish Jakobsson and Maritzen by identifying superficial differences, but none of them would have dissuaded a POSITA from combining the teachings of the two references because they are directed toward remarkably similar electronic authentication systems. Ex-1118, Shoup-Decl., ¶31.
For example, USR asserts that Jakobsson’s system is directed toward a “personal (as opposed to vehicle)” (emphasis in original) event detecting and alert system, but Maritzen’s device is also called a “personal transaction device” that is clearly handheld and designed for personal use. See Ex-1105, Maritzen, Fig. 6a. Both references are directed toward secure financial transactions that address the issue of electronic fraud. It is irrelevant whether Maritzen discloses embodiments that are directed toward a vehicle payment system because a POSITA would have understood that the electronic authentication techniques taught by Jakobsson and Maritzen are readily transferable across both systems. Ex-1118, Shoup-Decl., ¶32.
USR also makes several arguments that appear to misunderstand that Jakobsson is the primary reference. For example, USR argues that “Petitioner cites examples of Jakobsson’s use of a PIN or password…. [i]n contrast, Maritzen does not teach PIN-based authentication, … including a PIN would be contrary to Maritzen’s goal of reducing the time it takes to complete the transaction.” POR, 42. This argument fails because Jakobsson is the primary reference, and none of Petitioner’s arguments proposes adding a PIN to Maritzen’s system. Moreover, the use of PINs is one set of limited, non-exclusive examples in Jakobsson and in no way defines the scope of Jakobsson’s teachings. Both disclosures discuss many examples of authentication techniques that were known at the time. A POSITA would have recognized that both systems are directed toward electronic authentication systems and would have had the skill to combine discrete teachings from Maritzen into the system of Jakobsson. Ex-1118, Shoup-Decl., ¶33.
USR argues that Maritzen is “focused upon maintaining anonymity” (POR, 13), which USR equates, citing no support, with a ban on sending personally identifiable information. POR, 40. But Maritzen does not propose any ban on sending “personally identifiable information.” Maritzen never uses this term, and USR provides no definition for this term.5 Ex-1118, Shoup-Decl., ¶34.

5 In fact, Dr. Shoup asked for a definition that USR’s counsel declined to provide. Ex-2105, Shoup-Dep., 28:15-20 (“Q. So is it fair to say that the anonymous credentialing system didn't use any personally identifiable information in connection with the authentication? A. What do you mean by ‘personally identifiable information’?”).

It is untrue that Dr. Shoup “confirmed at his deposition that the Maritzen system does not transmit any personally identifiable user or biometric information” (POR, 40). He testified that Maritzen only limits transmission of user data from the user device to the point-of-sale device (Maritzen’s VAPGT). Ex-2105, Shoup-Dep., 163:14-16 (“It says here in this embodiment that no user information is transmitted to the VAPGT.”) (emphasis added), 160:20-24 (“it’s not sent during the protocol to the VAPGT.”) (emphasis added). As he explained, Maritzen contemplates the transmission of encrypted biometric information because such biometric information would not “identify[] the user.” Ex-2105, Shoup-Dep., 201:19-202:1 (“biometric information identifying the user would be information as presented that would identify the user and if that information were encrypted, for example, then that information wouldn't identify the user….”). Maritzen does not teach that biometric information is not transmitted in any form whatsoever. It teaches that biometric information identifying the user is not transmitted at any time. Ex-1105, Maritzen, [0044] (“The biometric information identifying the user is not transmitted at any time.”). Encrypted or cryptographically protected biometric information would not identify the user. Thus, at most, Maritzen advises against sending unprotected user information, and is compatible with Jakobsson’s teachings, which disclose the encryption or obfuscation of user information. Ex-1118, Shoup-Decl., ¶34.
USR also emphasizes that Maritzen is “focused on anonymity” (Ex-2101, Jakobsson-Decl., ¶33), but a POSITA would understand that there are many levels of anonymity that may disclose varying levels of user information. While the strictest possible form of anonymity might require that no user identifying information whatsoever is ever transmitted, anonymous systems exist that send some forms of user information. For example, a POSITA would consider a system that transmits a user’s credit card information, but hides the user’s name and address, to be anonymous. A system that does not transmit a name and address would be considered anonymous despite transmitting user information such as a user identification code, encrypted biometric information, or a device identifier. A system that transmits a user identification code, but hides the user’s credit card information, would be considered anonymous. A system that transmits a user’s name and address to a back-end authentication server, but does not send the user’s name and address to a point-of-sale terminal, would be considered anonymous as well. Anonymity can refer to any system that keeps some identifying information from some entity. Maritzen itself teaches that PTDs belong to single users and that PTD identifiers are sent along with other information to authenticate the user for a transaction. Ex-1105, Maritzen, [0038], [0045]. The PTD identifier can be used to identify a specific user, but Maritzen’s system would still be considered an anonymous system. Ex-1118, Shoup-Decl., ¶¶35-37.
4. Jakobsson In View Of Verbauwhede Discloses Comparing Information To Conduct An Authentication.
Claims 7 and 14 require a processor configured to compare stored authentication information or stored biometric information with the authentication information or biometric information of the first user. Jakobsson in view of Verbauwhede discloses this limitation. Pet., 55-60. Ex-1118, Shoup-Decl., ¶38.

USR erroneously asserts that Jakobsson is “silent as to how [local] authentication occurs [and as to] whether the authentication mechanism compares stored authentication information with authentication information of the user.” POR, 44. That is incorrect because Jakobsson expressly teaches that devices compare stored information with received information to authenticate a user. For example, Jakobsson explains that verifying devices “can observe … [a biological] characteristic, and compare the characteristic to records that associate the characteristic with the entity.” Ex-1104, Jakobsson, [0005]. Moreover, as explained by Dr. Shoup and Dr. Juels, and admitted by Dr. Jakobsson, a POSITA would have understood that authenticating a user involves comparing a stored value against a received value. Ex-1117, Jakobsson-Dep., 157:18-20 (“Q. Okay. If the match is performed by the device alone, then there must be a stored value, correct? A. The stored value would be the template.”); 160:6-13 (“Q. It’s not possible to perform the first authentication described in paragraph 59 on one device without a comparison to a stored value, correct? …A. There has to be a stored value somewhere”). Ex-1118, Shoup-Decl., ¶39; Ex-1120, Juels-Decl., ¶¶51-53.

USR further argues that Jakobsson does not teach comparing authentication information with stored authentication information because “a device could authenticate a user in many ways depending on what type of authentication information was used,” POR, 44-45, yet fails to identify a single viable alternative for conducting the claimed local authentication without comparing a stored value with a received value. Ex-1118, Shoup-Decl., ¶40.
5. It Would Have Been Obvious To Combine Jakobsson With Verbauwhede.
USR’s assertion that Verbauwhede “teaches away” from sending biometric information and storing biometric information in a server as disclosed in Jakobsson (POR, 45-46) both mischaracterizes Petitioner’s argument and underestimates the technical abilities and understanding of a POSITA. Ex-1118, Shoup-Decl., ¶41.

Petitioner relies on Verbauwhede to show that comparing authentication/biometric information to conduct a local authentication was well known. Pet., 55-58. This teaching is confined to a local authentication phase, and a POSITA would have understood that this teaching is easily separable from Verbauwhede’s remote authentication techniques. Ex-1118, Shoup-Decl., ¶42. USR argues that Verbauwhede advocates for the localization of user data, while Jakobsson discloses that biometric information is stored on the second device, and that a POSITA would have thus been discouraged from combining Verbauwhede with Jakobsson. POR, 46-47. But a POSITA would have understood that localizing biometric information was a design choice that exists among a variety of other authentication techniques that were all known. It is irrelevant if Verbauwhede teaches that remote authentication can be done without biometric information, because Petitioner only relies on its teachings with respect to local authentication. Petitioner need not prove that each and every teaching in both references is perfectly compatible because Verbauwhede’s teachings regarding local authentication are separately useful from its teachings regarding remote authentication. See KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 420 (2007) (“Common sense teaches, however, that familiar items may have obvious uses beyond their primary purposes, and in many cases a person of ordinary skill will be able to fit the teachings of multiple patents together like pieces of a puzzle.”). Accordingly, a POSITA would have understood that Verbauwhede’s teachings about comparing stored values to conduct a local authentication can be combined with Jakobsson and Maritzen without also importing Verbauwhede’s teachings about remote authentication. Ex-1118, Shoup-Decl., ¶42.

Notably, USR does not, and cannot, identify any teaching that discourages the comparison of stored biometric information against received biometric information to authenticate a user. In fact, Verbauwhede expressly advocates for it. Ex-1107, Verbauwhede, [0063] (“During the matching process, the thumbpod 200 loads a stored fingerprint template and performs a matching function to produce a match score. During the decision process, the thumbpod 200, using the match score, decides if the candidate fingerprint is a match to the template.”). Ex-1118, Shoup-Decl., ¶43.
D. USR Fails To Overcome Petitioner’s Showing That Claims 8 and 15 Are Obvious In View Of Jakobsson And Gullman.

1. Gullman Discloses A First Device Configured To Store Biometric Information For A Second Plurality Of Users.
Claim 8 recites “a first memory coupled to the first processor included in the first handheld device and configured to store respective biometric information for a second plurality of users.” USR argues that claims 8 and 15 require a token that provides separate access for multiple distinguishable users, and that Gullman only discloses multiple users having identical access to the same account. POR, 49-51. There is no support for this cramped construction of “a second plurality of users,” and in any event Gullman does disclose separate access for multiple distinguishable users. Ex-1118, Shoup-Decl., ¶44.

First, the claims do not require separate access for multiple distinguishable users. Nor do they require access to multiple accounts. The plain meaning of the phrase “biometric information for a second plurality of users” merely requires biometric information for a second group of people. Nothing in the claims requires that the second group of people have access to a second group of accounts. The claims make no mention of accounts at all. USR argues that Gullman “never specifies transmitting multiple correlation factors [or] converting an identifier of what specific user’s template was matched” (POR, 50), but the claims mention no correlation factors or identifiers. Notably, USR provides no support whatsoever for its proposed construction. Ex-1118, Shoup-Decl., ¶45.

To the extent that the claims require providing separate access to “multiple distinguishable users,” or “access to multiple accounts,” Gullman discloses access to multiple accounts for multiple users. For example, Gullman explains that the token sent from the user device 12 to the remote host 10 can comprise an account number for the specific user. Ex-1106, Gullman, 4:3-8. The account number is later decoded by the remote host 10 to identify the user and his or her account. Id., 4:13-15. A POSITA would have understood that the account number can be used to identify separate accounts. Ex-1118, Shoup-Decl., ¶46.
2. It Would Have Been Obvious To Combine Jakobsson With Gullman.
