`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`APPLE INC.,
`Petitioner,
`v.
`UNIVERSAL SECURE REGISTRY, LLC,
`Patent Owner.
`_________________________________________
`Case IPR2018-00813
`U.S. Patent No. 9,100,826
`________________________________________
`
`DECLARATION OF DR. VICTOR SHOUP IN SUPPORT OF
`
`PETITIONER’S REPLY TO PATENT OWNER’S RESPONSE
`
`Apple 1118
`Apple v. USR
`IPR2018-00813
`
`
`
`Contents
`
I. INTRODUCTION
II. LEGAL PRINCIPLES
   A. Claim Construction
   B. Obviousness
   C. Secondary Considerations
III. OPINIONS
   A. USR’s Proposed Claim Constructions Are Overly Narrow And Inconsistent With BRI.
      1. Contrary to USR’s Argument, “Biometric Information” Is An Example Of “Authentication Information.”
      2. USR’s Construction For “Enabling Or Disabling” A Device Is Unduly Narrow.
   B. Claims 1, 2, 10, 11, 21, 22, 24, 27, 30, And 31 Are Anticipated By Jakobsson.
      1. Jakobsson Discloses A Second Processor Configured To “Receive Or Retrieve” Second Authentication Information.
   C. Claims 7, 14, 26, And 34 Are Obvious Over Jakobsson In View of Verbauwhede and Maritzen.
      1. Jakobsson In View Of Maritzen Discloses “Enabling Or Disabling” A Device.
      2. It Would Have Been Obvious To Combine Jakobsson With Maritzen’s Teachings On “Enabling or Disabling” A Device.
      3. It Would Have Been Obvious To Combine Jakobsson And Maritzen Because The Differences Identified By USR Are Superficial.
      4. Jakobsson In View Of Verbauwhede Discloses Comparing Information To Conduct An Authentication.
      5. It Would Have Been Obvious To Combine Jakobsson With Verbauwhede.
   D. Claims 8 and 15 Are Obvious In View Of Jakobsson And Gullman.
      1. Gullman Discloses A First Device Configured To Store Biometric Information For A Second Plurality Of Users.
      2. It Would Have Been Obvious To Combine Jakobsson With Gullman.
   E. USR Fails To Demonstrate Secondary Considerations Of Non-Obviousness.
IV. CONCLUSION
V. AVAILABILITY FOR CROSS-EXAMINATION
VI. RIGHT TO SUPPLEMENT
VII. JURAT
`
`
`
`I, Victor Shoup, Ph.D., declare as follows:
`I.
`INTRODUCTION
`1.
`I have been retained by Apple to provide opinions in this proceeding
`
`relating to U.S. Patent No. 9,100,826 (“’826 patent”). I submit this Declaration to
`
`address and respond to the arguments made in Patent Owner’s Response and the
`
`declaration submitted by Dr. Jakobsson in support of the Patent Owner’s Response.
`
`2.
`
`My background and qualifications are summarized in my previous
`
`declaration (Ex-1102) and my curriculum vitae is attached thereto as Appendix A.
`
`In preparing this Declaration, I have reviewed the following materials and the
`
`relevant exhibits cited in each of these filings:
`
• Petition (“Pet.”) (Paper 3) and the exhibits cited therein

• Decision on Institution (Paper 9) (“DI”)

• Patent Owner’s Response (“POR”) (Paper 18) and the exhibits cited therein

• Declaration of Markus Jakobsson In Support Of POR (“Jakobsson-Decl.”)

• Conditional Motion to Amend (Paper 19) (“CMTA”)

• Declaration of Markus Jakobsson In Support of CMTA

• Transcript of March 20, 2019 deposition of Markus Jakobsson (“Jakobsson-Dep.”) (Exhibit 1117)

• Declaration of Dr. Ari Juels In Support Of Petitioner’s Reply (Ex-1120)
`
`
`
`
`II.
`
`LEGAL PRINCIPLES
`3.
`I am not an attorney. For purposes of this Declaration, I have been
`
`informed about certain aspects of the law that are relevant to my analysis and
`
`opinions.
`
A. Claim Construction

4. I have been informed that claim construction is a matter of law and
`
`that the final claim construction will be determined by the Board.
`
`5.
`
I have been informed that the claim terms in an IPR should be
`
`given their broadest reasonable construction in light of the specification as
`
`commonly understood by a person of ordinary skill in the art (“POSITA”). I have
`
`applied this standard in my analysis.
`
B. Obviousness

6. I have been informed and understand that a patent claim can be
`
`considered to have been obvious to a POSITA at the time the application was filed.
`
`This means that, even if all the requirements of a claim are not found in a single
`
`prior art reference, the claim is not patentable if the differences between the subject
`
`matter in the prior art and the subject matter in the claim would have been obvious
`
`to a POSITA at the time the application was filed.
`
`7.
`
`I have been informed and understand that a determination of whether
`
`a claim would have been obvious should be based upon several factors, including,
`
`among others:
`
`
`
`
• the level of ordinary skill in the art at the time the application was filed;

• the scope and content of the prior art; and

• what differences, if any, existed between the claimed invention and the prior art.
`
`8.
`
`I have been informed and understand that the teachings of two or
`
`more references may be combined in the same way as disclosed in the claims, if
`
`such a combination would have been obvious to a POSITA. In determining
`
`whether a combination based on either a single reference or multiple references
`
`would have been obvious, it is appropriate to consider, among other factors:
`
• whether the teachings of the prior art references disclose known concepts combined in familiar ways, and when combined, would yield predictable results;

• whether a POSITA could implement a predictable variation, and would see the benefit of doing so;

• whether the claimed elements represent one of a limited number of known design choices, and would have a reasonable expectation of success by those skilled in the art;

• whether a POSITA would have recognized a reason to combine known elements in the manner described in the claim;

• whether there is some teaching or suggestion in the prior art to make the modification or combination of elements claimed in the patent; and

• whether the innovation applies a known technique that had been used to improve a similar device or method in a similar way.
`
`9.
`
`I have been informed and understand that a POSITA has ordinary
`
`creativity, and is not an automaton.
`
`10.
`
`I have been informed and understand that in considering obviousness,
`
`it is important not to determine obviousness using the benefit of hindsight derived
`
`from the patent being considered.
`
C. Secondary Considerations

11. I have been informed and understand that certain factors may support
`
`or rebut the obviousness of a claim. I understand certain secondary considerations
`
`may rebut a showing of obviousness and that such secondary considerations
`
`include, among other things, commercial success of the patented invention,
`
`skepticism of those having ordinary skill in the art at the time of invention,
`
`unexpected results of the invention, any long-felt but unsolved need in the art that
`
`was satisfied by the alleged invention, the failure of others to make the alleged
`
`invention, praise of the alleged invention by those having ordinary skill in the art,
`
`and copying of the alleged invention by others in the field. I understand that there
`
`
`
`
`must be a nexus, that is, a connection, between any such secondary considerations
`
`and the alleged invention. I also understand that contemporaneous and
`
`independent invention by others is a secondary consideration tending to show
`
`obviousness.
`
`III. OPINIONS
`A.
`USR’s Proposed Claim Constructions Are Overly Narrow And
`Inconsistent With BRI.
`1.
`Contrary to USR’s Argument, “Biometric Information” Is
`An Example Of “Authentication Information.”
`12. USR argues that “biometric information” and “authentication
`
`information” must be different (POR, 19-20), but USR’s interpretation is
`
`inconsistent with the claims, the specification, and the broadest reasonable
`
`interpretation standard.
`
a) The Claims Support My Construction.

13. First, the claims support my construction. “Authentication information” is a set of information items that can be used to authenticate a user, and can include PINs, passwords, and biometric information.1 See, e.g., Ex-1101,
`
`’826 patent, claim 1 (“a first handheld device including: a first processor, the
`
`processor programmed to authenticate a user of the first handheld device based on
`
`authentication information . . . .”). Nothing in the claims requires that
`
`“authentication information” and “first biometric information” are mutually
`
`exclusive. For example, a dependent claim could have read: “wherein the
`
`authentication information comprises the first biometric information.”
`
1 Notably, Jakobsson discloses authenticating a user based on a PIN, a password, and biometric information, all of which are examples of “authentication information.” Ex-1102, Shoup-Decl., ¶54; Ex-1104, Jakobsson, [0059] (“a first authentication of user 110 is performed by the user authentication device 120 based on information supplied to the authentication device 120 by the user 110. For example, the information supplied by the user may be a PIN, a password or biometric information”).

14. Moreover, the claims recite two different elements that should not be conflated: “authentication information” (with no modifier) and “first authentication information.” These are independent elements that share no claimed relationship. USR argues that “authentication information” (with no modifier) cannot be biometric information because the claims require determining “first authentication information” from the biometric information. POR, 14 (“Some claims also require that ‘authentication information’ be determined from ‘biometric information.’”). This argument fails because USR improperly conflates “first authentication information” and “authentication information” (with no modifier). The claims require that “first authentication information” be determined from “biometric information.” The claims do not require that “authentication information” (with no modifier) be determined from “biometric information.” There is no claimed relationship between “first authentication information” and “authentication information” (with no modifier), and therefore no restriction on the relationship between “authentication information” (with no modifier) and “biometric information.”
`
`15. USR erroneously argues that the order of the claim terms supports its
`
`argument. POR, 21 (“the retrieved or received ‘biometric information’ cannot also
`
`be the ‘authentication information’ used to authenticate the user, since user
`
`authentication occurs before the ‘biometric information’ is even retrieved or
`
`received.”). USR is mistaken because nothing in the claims requires a specific
`
`sequence of steps. For example, system claim 1 only requires a processor that is
`
`configured to (a) “authenticate a user of the first handheld device based on
`
`authentication information,” and (b) “retrieve or receive first biometric information
`
`of the user of the first handheld device.” Claim 1 only requires a processor capable
`
`
`
`
`of performing these steps, and does not require the processor to perform them in
`
`any particular sequence.
`
b) The Specification Supports My Construction.

16. Second, the specification broadly describes “authentication information” as information used to verify, identify, or authenticate a user. Ex-
`
`1102, Shoup-Decl., ¶¶39-40. The specification expressly identifies “biometric
`
`information” as one example of “authentication information” used by the system to
`
`verify the identity of an individual. Ex-1101, ’826 patent, 35:18-21 (“the act of
`
`receiving the first authentication information of the first entity comprises receiving
`
`biometric information of the first entity by detecting the biometric information
`
`with the first device.”).
`
c) A POSITA would have understood that biometric information is a form of authentication information.

17. The plain meaning of the phrase “authentication information” includes
`
`any information used to authenticate a user, including biometric information. A
`
`POSITA would have understood that authentication information includes biometric
`
`information, and my construction falls within the broadest reasonable
`
`interpretation of the phrase “authentication information.”
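By way of illustration only, the superset relationship described above can be sketched in a few lines of Python. The class, field, and credential values below are hypothetical choices of mine and appear in neither the ’826 patent nor Jakobsson; the sketch simply shows a PIN, a password, or biometric data each serving as “authentication information” used to authenticate a user.

```python
# Hypothetical sketch (not from the '826 patent or Jakobsson): a PIN, a
# password, and biometric data are each usable as "authentication information."
from dataclasses import dataclass

@dataclass(frozen=True)
class AuthenticationInfo:
    kind: str     # "pin", "password", or "biometric"
    value: bytes  # the supplied credential

def authenticate(supplied: AuthenticationInfo, enrolled: AuthenticationInfo) -> bool:
    # The user is authenticated when the supplied information matches the
    # enrolled information, whatever kind of information it is.
    return supplied == enrolled

enrolled = AuthenticationInfo("biometric", b"fingerprint-template")
print(authenticate(AuthenticationInfo("biometric", b"fingerprint-template"), enrolled))  # True
print(authenticate(AuthenticationInfo("pin", b"1234"), enrolled))  # False
```

Nothing in the sketch distinguishes biometric information from any other kind; it is one value of the same “authentication information” type.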
`
2. USR’s Construction For “Enabling Or Disabling” A Device Is Unduly Narrow.

18. Claims 7, 14, 26, and 34 recite a processor configured to “enable or disable use of [a] first handheld device based on a result of [a] comparison.” In my
`
`
`
`
`previous declaration, I showed that Jakobsson in view of Verbauwhede and
`
`Maritzen discloses this limitation. Ex-1102, Shoup-Decl., ¶¶163-168. In its
`
`attempt to distinguish the prior art, USR suggests re-interpreting the plain language
`
`of “enabl[ing] or disabl[ing] use of the first handheld device based on a result of
`
`[a] comparison . . .” with the following 44-word construction: “to expand the range
`
`of functionality available to the [first] user of the first handheld device based on
`
`one result of the comparison, and to reduce the range of functionality available to
`
`the [first] user of the first handheld device based on another result of the
`
`comparison.” POR, 23. USR’s proposed construction is unduly narrow and
`
`contravenes the broadest reasonable interpretation standard, not to mention plain
`
`meaning. Enabling or disabling use of a handheld device is a concept plainly
`
`understood by those of ordinary skill in the art and requires no construction.
`
`19. USR argues that “disabling use” requires “reducing the range of
`
`functionality available to the user to less than what was previously available.”
`
`POR, 26. But the claim makes clear that the processor must merely disable “use”
`
`of the device. It does not require completely disabling the device itself (e.g.,
`
`turning the phone off), and it does not require any active reduction in functionality.
`
`
`
`
`The verb “disable” means “to make ineffective or inoperative.”2 Thus, for
`
`example, if a processor instructs a device to remain locked from performing a
`
`transaction, that device is rendered ineffective or inoperative, even if the device is
`
`still powered. A POSITA would have understood this disclosure to be well within
`
`the plain meaning of the phrase “disable use” and the broadest reasonable
`
`interpretation of the claim.
`
2 Ex-1131, Disable, Merriam-Webster.com (2019), https://www.merriam-webster.com/dictionary/disable.

B. Claims 1, 2, 10, 11, 21, 22, 24, 27, 30, And 31 Are Anticipated By Jakobsson.

1. Jakobsson Discloses A Second Processor Configured To “Receive Or Retrieve” Second Authentication Information.

20. Claims 1, 10, and 21 recite a second processor configured to “retrieve or receive” second authentication information, which as I explained in my previous declaration, Jakobsson discloses. Ex-1102, Shoup-Decl., ¶¶86, 73-76 (“A POSITA would have understood that the Authentication Code A1V [second authentication information] must be stored in memory (e.g., random-access memory) once it is derived or generated and that it must be retrieved or received from memory to perform the comparison with Authentication Code AD [first authentication information]... A POSITA would have understood that the derivation of Authentication Code A1V [second authentication information] can be implemented on a ‘different’ program or computer or in hardware and that the verifier can be configured to retrieve or receive the Authentication Code A1V [second authentication information] from the ‘different’ program or computer or from hardware.”).
`
`21. USR argues that Jakobsson fails to disclose receipt or retrieval of
`
`second authentication information because Jakobsson’s verifier only “derives or
`
`creates” second authentication information. POR, 29. To support its argument,
`
`USR attempts to distinguish “deriving or creating” from the claimed “retrieving or
`
`receiving” of second authentication by insisting that “deriving or creating” requires
`
`CPU registers while “retrieving or receiving” requires RAM. These arguments are
`
`technically specious, and find no support in the ’826 specification or the Jakobsson
`
`reference.
`
`22.
`
`First, USR acknowledges that the claimed “retrieving or receiving”
`
`refers to transferring data from memory. POR, 31 (“the claimed data retrieval or
`
`receipt is from . . . memory”). It is irrelevant whether data is stored in CPU
`
`registers, RAM, ROM, or any other kind of well-known memory device. The data
`
`still must be retrieved or received from the memory to perform the authentication.
`
`As I explained in my previous declaration, “[a] POSITA would have understood
`
`that the … [second authentication information] must be stored in memory (e.g.,
`
`
`
`
`random-access memory) once it is derived or generated and that it must be
`
`retrieved or received from memory to perform the comparison with [the first
`
`authentication information].” Ex-1102, Shoup-Decl., ¶¶73-74.
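The storage-then-retrieval point can be illustrated with a short, hypothetical Python sketch. The names (`memory`, `A1V`) and the SHA-256 derivation are my own illustrative choices, not disclosures of Jakobsson; the sketch only shows that a derived code is stored in some memory and must be retrieved from that memory before the comparison can occur.

```python
# Hypothetical sketch: a derived authentication code is stored in memory and
# must be retrieved from that memory before it can be compared.
import hashlib

memory = {}  # stands in for RAM or any other well-known memory device

def derive_and_store(secret: bytes, period: int) -> None:
    # Derive/generate the second authentication code, then store it in memory.
    memory["A1V"] = hashlib.sha256(secret + period.to_bytes(8, "big")).hexdigest()

def verify(first_code: str) -> bool:
    # Retrieval from memory necessarily precedes the comparison.
    second_code = memory["A1V"]
    return second_code == first_code

derive_and_store(b"shared-secret", 42)
device_code = hashlib.sha256(b"shared-secret" + (42).to_bytes(8, "big")).hexdigest()
print(verify(device_code))  # True: codes derived from the same secret match
```

The same structure holds whether `memory` is backed by RAM, registers, or any other storage: the stored value is transferred to the processor before the comparison is performed.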
`
`23.
`
Second, USR makes the erroneous assertion that “retrieval or receipt is from long-term memory, such as RAM” rather than CPU registers. POR, 31 (“A
`
`POSITA would understand that the claimed data retrieval or receipt is from long-
`
`term memory, such as RAM, and not from short-term storage such as registers.”).
`
`In an attempt to make a specious distinction between retrieving and reading from
`
`different types of memory, USR insists that “[d]ata is read from registers…[while]
`
`data is received or retrieved from …RAM.” Id. This argument has no merit from
`
`either a technical or plain English perspective. There is no distinction between
`
`reading data from a memory device and receiving or retrieving data from a
`
`memory device because these terms are all synonymous in context with respect to
`
`the transfer of data from a memory device to a processor. Notably, USR identifies
`
`no disclosure in the ’826 patent to support its interpretation because the ’826 patent
`
`makes no distinction between reading data from register memory and “retrieving or
`
`receiving” data from any other memory.
`
24. Moreover, USR’s argument fails because RAM and CPU registers are
`
`both examples of short-term memory that can be used to store intermediate values
`
`during the execution of a computer algorithm. A POSITA would have understood
`
`
`
`
`that RAM is a temporary storage location (like CPU registers) for calculating
`
`intermediate values. In fact, RAM is often considered “volatile” memory because
`
`it is unable to store information once a power source is removed. Thus, there is no
`
`meaningful distinction between CPU registers and RAM because they both
`
`represent well-known short-term memory devices that can be used to facilitate the
`
comparison of information described in Jakobsson.
`
`25.
`
`Third, USR concludes that Jakobsson’s disclosure requires storing
`
`data within CPU registers from which USR contends information is necessarily
`
`“read” rather than “received or retrieved.” This again is incorrect and unsupported
`
`by the Jakobsson reference. Nothing in Jakobsson specifies that second
`
`authentication information is stored in CPU registers. And RAM, ROM, flash,
`
`disk memory, and other such memory devices were well-known to those of skill in
`
`the art. See, e.g., Ex-1105, Maritzen, [0036] (“a computer program may be stored
`
`in a computer readable storage medium, such as, but is not limited to, any type of
`
`disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks,
`
`read-only memories (ROMs), random access memories (RAMs), EPROMs,
`
`EEPROMs, magnetic or optical cards, or any type of media suitable for storing
`
`electronic instructions, and each coupled to a computer system bus.”). Jakobsson
`
`also discloses that the verifier 105 can be “implemented as software running on a
`
`server class computer including a processor, memory, and so on, to enable
`
`
`
`
`authentication of a large number of users, for example, in an enterprise…[or]
`
`implemented as software running on a desktop computer, laptop computer, special-
`
`purpose device, or personal digital assistant (PDA)… [or] as a software program
`
`running on a general-purpose computer . . . .” Ex-1104, Jakobsson, [0038]. A
`
`POSITA therefore would have understood that any of these embodiments would
`
`have included a RAM, ROM or other well-known memory devices from which
`
`data is retrieved or received.
`
`26.
`
`Finally, USR ignores Jakobsson’s disclosure that the verifier can be
`
`implemented on a “computer … interacting with one or more other computer
`
`programs on the same or a different computer.” Ex-1104, Jakobsson, [0038],
`
`[0139] (“One or more of the preceding embodiments can be employed in a system
`
`that includes one or more master verifiers and one or more subordinate verifiers.
`
`Generally, the master verifier is more remote than the subordinate verifier from the
`
`communication terminal 140 employed during an authentication. In one
`
`embodiment, a master verifier stores the information required to calculate each
`
`secret and corresponding authentication code for one or more authentication
`
`devices 120. To facilitate authentication with authorized verifiers, both the
`
`authentication devices 120 and the master verifiers employed in this approach
`
`derive temporary secrets for each period. However, a subordinate verifier can only
`
`authenticate codes that are presented to it when it has received the secrets from a
`
`
`
`
`master verifier. Further, the secrets received by the subordinate verifier are only
`
`valid during specific periods of time (i.e., they are temporary secrets) and they are
`
`only valid for a limited number of authentication devices. As a result, the
`
`subordinate verifier only can authenticate for a reduced period of time. The
`
`periods during which the subordinate verifier can authenticate are limited to those
`
`periods for which the subordinate verifier possesses valid secrets that are used to
`
`generate authentication codes. This approach can be advantageous because it
`
`reduces damage that an attacker can do if the subordinate verifier is compromised.”
`
`(emphasis added)). As I explained in my previous declaration, a POSITA would
`
`have understood that the derivation of “second authentication information” can be
`
`implemented on a “different” program or computer or in hardware and that the
`
`verifier can be configured to retrieve or receive the “second authentication
`
`information” from the “different” program or computer or from hardware. Ex-
`
`1102, Shoup-Decl., ¶74.
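A minimal Python sketch may help illustrate this arrangement. The class and function names are hypothetical choices of mine; only the general idea — a subordinate verifier that can authenticate only after it has received temporary, period-limited secrets derived elsewhere — comes from Jakobsson [0139].

```python
# Hypothetical sketch: a subordinate verifier receives temporary secrets
# derived by a master verifier, and can only authenticate codes for periods
# whose secrets it has received.
import hashlib

def derive_code(secret: bytes, period: int) -> str:
    # Derive the authentication code for a given time period.
    return hashlib.sha256(secret + period.to_bytes(4, "big")).hexdigest()

def temporary_secret(master_secret: bytes, period: int) -> bytes:
    # Temporary secret valid only for one period (derived on the master side).
    return hashlib.sha256(master_secret + b"|" + period.to_bytes(4, "big")).digest()

class SubordinateVerifier:
    def __init__(self) -> None:
        self.secrets: dict[int, bytes] = {}  # period -> secret received from master

    def receive_secret(self, period: int, secret: bytes) -> None:
        self.secrets[period] = secret

    def authenticate(self, period: int, code: str) -> bool:
        if period not in self.secrets:
            return False  # cannot authenticate without a received secret
        return code == derive_code(self.secrets[period], period)

master_secret = b"master-secret"
sub = SubordinateVerifier()
sub.receive_secret(7, temporary_secret(master_secret, 7))
device_code = derive_code(temporary_secret(master_secret, 7), 7)
print(sub.authenticate(7, device_code))  # True: secret was received from the master
print(sub.authenticate(8, device_code))  # False: no secret for that period
```

The sketch makes the retrieval/receipt concrete: the information the subordinate verifier compares against is received from a different program or computer, not created in isolation.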
`
C. Claims 7, 14, 26, And 34 Are Obvious Over Jakobsson In View of Verbauwhede and Maritzen.

1. Jakobsson In View Of Maritzen Discloses “Enabling Or Disabling” A Device.

27. Claims 7, 14, 26, and 34 require a processor configured to “enable or disable use of [a] first handheld device based on a result of [a] comparison.” As I
`
`explained in my previous declaration, Jakobsson in view of Maritzen discloses
`
`
`
`
`enabling or disabling use of a handheld device as a result of a comparison. Ex-
`
`1102, Shoup-Decl., ¶¶163-168.
`
`28. USR argues that Maritzen fails to disclose this limitation because
`
`locking and unlocking a device as a result of a comparison is somehow
`
`distinguishable from enabling or disabling use of the device based on the result of
`
`a comparison. POR, 33-36. As discussed above, a POSITA would not have
`
`viewed the claim to be so limited. USR contends that Maritzen itself recognizes a
`
`distinction between the PTD disabling itself and the PTD remaining locked, but
`
`this is incorrect. Maritzen makes no such distinction, and USR has not articulated
`
`any distinction between a locked PTD and a disabled PTD. A locked PTD and a
`
`disabled PTD are both rendered inoperable for some functional purpose. For
`
`example, Maritzen explains that “PTD 100 is disabled such that the user may not
`
`access the PTD 100.” Ex-1105, Maritzen, [0056]. Similarly, a successful
`
`verification would “unlock PTD 100” such that the user may access PTD 100. Ex-
`
`1105, Maritzen, [0067]. Maritzen makes no distinction between these terms.
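The point can be illustrated with a short, hypothetical sketch; the class and attribute names are mine, not Maritzen’s. A device that remains locked after a failed comparison is rendered inoperative for use even though it remains powered, which is the ordinary sense of “disabling use.”

```python
# Hypothetical sketch: "disabling use" by keeping the device locked after a
# failed comparison, without powering the device off.
class HandheldDevice:
    def __init__(self) -> None:
        self.powered = True
        self.unlocked = False

    def process_comparison(self, first_code: str, second_code: str) -> None:
        # Enable or disable use based on the result of the comparison.
        self.unlocked = (first_code == second_code)

device = HandheldDevice()
device.process_comparison("code-A", "code-B")  # mismatch
print(device.powered, device.unlocked)  # True False: still powered, use disabled
device.process_comparison("code-A", "code-A")  # match
print(device.unlocked)  # True: use enabled
```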
`
`2.
`
`It Would Have Been Obvious To Combine Jakobsson With
`Maritzen’s Teachings On “Enabling or Disabling” A
`Device.
`29. USR argues that enabling or disabling a device based on the result of
`
`a comparison is incompatible with Jakobsson, and would render Jakobsson
`
`inoperable for its intended purpose. POR, 36. Specifically, Jakobsson reports
`
`
`
`
`“event states” and, in some non-limiting embodiments, reports event states even
`
`when local authentication fails. Thus, USR argues, disabling a device upon a
`
`failed authentication would “excise [a] key functionality” of Jakobsson because
`
`event states could not be sent if the device was disabled. POR, 37. This argument
`
`fails because non-limiting examples do not render these references incompatible.
`
`Moreover, Jakobsson expressly discloses embodiments where the event state is not
`
`sent when local authentication fails, which is entirely compatible with Maritzen’s
`
`teachings.
`
`30.
`
`For example, Jakobsson discloses that the device can store failed
`
`authentications as an event state. Jakobsson, [0052] (“reportable event(s) . . .
`
`[include] authentication quality (e.g., a number of PIN errors prior to successful
`
`authentication, strength of a biometric match, etc.)”). Jakobsson further discloses
`
`that “in some embodiments…the occurrence of the event is communicated in
`
`identity authentication codes output by the device subsequent to the occurrence of
`
`the reportable event.” Ex-1104, Jakobsson, [0015]. If authentication fails, the
`
`device can be disabled (as taught by Maritzen), and the failed authentication is
`
`stored as an event state (as taught by Jakobsson) and sent “subsequent to the
`
`occurrence of the reportable event” (e.g., when the device is activated upon a
`
`successful authentication). Moreover, Jakobsson explains that an authentication
`
`code (which includes the event state) is generated if the first authentication is
`
`
`
`
`successfully verified. Ex-1104, Jakobsson, [0059] (“a first authentication of user
`
`110 is performed….[i]f the first authentication is successfully verified by the
`
`authentication device 120, the device 120 generates an identity authentication code
`
`. . . .”). Jakobsson does not require that the authentication code (and a
`
`corresponding event state) is generated regardless of the authentication result and
`
`nothing in Jakobsson requires that event states are sent for every authentication
`
`attempt or with every authentication code. In fact, Jakobsson discloses that “[t]he
`
`authentication device can verify the correctness of user data, and only provide an
`
`identity authentication code if the user data is correct.” Ex-1104, Jakobsson, [0111]
`
`(emphases added). If no authentication code is provided unless the user data is
`
`correct, then the event state (contained in the authentication code) is not provided
`
`unless the user data is correct. Thus, Maritzen’s teaching that the device can
`
`remain locked upon a failed authentication is entirely consistent with Jakobsson.
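A brief, hypothetical Python sketch of the combined behavior discussed above may be helpful; the names are illustrative and drawn from neither reference verbatim. A failed authentication leaves the device disabled and outputs no code, while the failure is stored as an event state and communicated in the code generated after a later successful authentication.

```python
# Hypothetical sketch: failed authentication disables use (per Maritzen) and
# is stored as an event state that is reported only in a code output after a
# subsequent successful authentication (per Jakobsson).
class Device:
    def __init__(self, pin: str) -> None:
        self.pin = pin
        self.enabled = False
        self.pin_errors = 0  # stored reportable event state

    def authenticate(self, supplied_pin: str):
        if supplied_pin != self.pin:
            self.pin_errors += 1  # store the failure as an event state
            self.enabled = False  # device remains locked/disabled
            return None           # no identity authentication code is output
        self.enabled = True       # successful authentication enables use
        # The event state is communicated in a code output subsequent to the
        # occurrence of the reportable event.
        return {"code": "identity-authentication-code",
                "pin_errors": self.pin_errors}

d = Device("1234")
print(d.authenticate("9999"))  # None: use disabled, nothing reported yet
print(d.authenticate("1234"))  # code carrying the stored event state
```

As the sketch shows, disabling the device on failure does not excise event reporting; the event state is simply carried in the next code the device outputs.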
`
`3.
`
`It Would Have Been Obvious To Combine Jakobsson And
`Maritzen Because The Differences Identified By USR Are
`Superficial.
`31. USR attempts to distinguish Jakobsson and Maritzen by identifying
`
`superficial differences between the two references, but these differences would not
`
`dissuade a POSITA from combining the teachings of Jakobsson and Maritzen
`
`because they are directed toward remarkably similar electronic authentication
`
`systems.
`
`
`
`
`32.
`
`For example, USR argues that Maritzen is directed toward “real-time
`
`settlement of vehicle-accessed, financial transactions that provide anonymity and
`
`security” while “Jakobsson, in contrast, discloses a personal (as opposed to
`
vehicle) event detecting and alert system.” POR, 39-40. First, USR’s arguments
`
`seriously mischaracterize the references. USR purports to distinguish Jakobsson’s
`
`system by pointing out that it is directed toward a “personal (as opposed to
`
`vehicle)” (emphasis in original) event detecting and alert system, but Maritzen’s
`
`device also is called a “personal transaction device” that is clearly handheld and
`
`designed for personal use. See Ex-1105, Maritzen, Fig. 6a. Second, as I explained
`
`in my previous declaration, both references are directed toward secure financial
`
`transactions that address the issue of electronic fraud. It is irrelevant whether
`
`Maritzen discloses embodiments that are directed toward a vehicle payment system
`
`because a POSITA would have understood that the electronic authentication
`
`techniques taught by Jakobsson and Maritzen are readily transferable across both
`
`systems.
`
`33. USR also makes several arguments that appear to misunderstand that
`
`Jakobsson is the primary reference. For example, USR argues that “Petitioner cites
`
`examples of Jakobsson’s use of a PIN or password…. [i]n contrast, Maritzen does
`
`not teach PIN-based authentication, … including a PIN would be contrary to
`
`Maritzen’s goal of reducing the time it takes to complete the transaction.” POR,
`
`
`
`
`42. USR continues: “Tellingly, Maritzen does not include a single mention of
`
`PIN-based authentication in its more than 181 paragraphs of disclosures and 137
`
`claims; indeed, the system of Maritzen does not include a keyboard or other entry
`
`means for inputting a PIN.” POR, 42-43. This argument fails because Jakobsson
`
`is the primary reference, and none of the grounds discussed in my previous
`
`declaration propose adding a PIN to Maritzen’s system. Moreover, the use of PINs
`
`is one set of limited, non-exclusive examples in Jakobsson and in no way defines
`
`the scope of Jakobsson’s teachings. Both disclosures discuss many examples of
`
`authentication techniques that were known at the time. A POSITA would have
`
`recognized that both systems are directed toward electronic authentication systems
`
`and would have had the skill to combine discrete teachings from Maritzen into the
`
`system of Jakobsson.
`
`34. USR argues that Maritzen’s “main goal” is to provide “anonymity,”
`
`which it equates, citing no support, with a ban on sending personally identifiable
`
`information. POR, 34. But Maritzen does not propose any ban on sending
`
`“personally identifiable information.” Maritzen never uses this term, and USR
`
`20
`
`
`
`provides no definition for this term.3 USR argues that I “confirmed at [my]
`
`deposition that the Maritzen system does not transmit any personally identifiable
`
`user or biometric information,” (POR, 40) but this is false. I never confirmed this
`
`statement. Instead, I explained