________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________

APPLE INC.
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC
Patent Owner
________________

Case CBM2018-00026
U.S. Patent No. 8,577,813
________________

PATENT OWNER’S EXHIBIT 2001
DECLARATION OF MARKUS JAKOBSSON
IN SUPPORT OF PATENT OWNER’S PRELIMINARY RESPONSE

USR Exhibit 2001
`
`
`
1. I have been retained on behalf of Universal Secure Registry LLC (“Patent Owner” or “USR”) in connection with the above-captioned covered business method (CBM) review to provide my opinions in support of USR’s Preliminary Response. I am being compensated for my time at the rate of $625 per hour. I have no interest in the outcome of this proceeding.
`
2. In preparing this declaration, I have reviewed and am familiar with the Petition for CBM2018-00026, U.S. Patent No. 8,577,813 (“the ’813 patent”) and its file history, and all other materials cited and discussed in the Petition (including the declaration of Dr. Victor Shoup) and cited and discussed in this Declaration. I understand the Petition asserts that claims 1-26 are invalid under 35 U.S.C. § 101 as being directed to patent-ineligible subject matter (e.g., an abstract idea) without significantly more. Specifically, I understand that the Petition asserts independent claims 1, 16, and 24 are invalid under § 101 for purportedly being directed to nothing more than the abstract idea of “verifying an account holder’s identity based on codes and/or information related to the account holder before enabling a transaction.” See, e.g., Paper No. 3, CBM2018-00026 at 44 (hereinafter “Petition”).
`
3. The statements made herein are based on my own knowledge and opinion. This Declaration represents only the opinions I have formed to date. I may consider additional documents as they become available or other documents that are necessary to form my opinions. I reserve the right to revise, supplement, or amend my opinions based on new information and on my continuing analysis.
`
I. QUALIFICATIONS
`
4. My qualifications can be found in my Curriculum Vitae, which includes my detailed employment background, professional experience, and list of technical publications and patents. Ex. 2004.
`
5. I am currently the Chief of Security and Data Analytics at Amber Solutions, Inc., a cybersecurity company that develops home and office automation technology. At Amber, my research studies and addresses abuse, including social engineering, malware, and privacy intrusions. My work primarily involves identifying risks, developing protocols and user experiences, and evaluating the security of proposed approaches.
`
6. I received a Master of Science degree in Computer Engineering from the Lund Institute of Technology in Sweden in 1993, a Master of Science degree in Computer Science from the University of California at San Diego in 1994, and a Ph.D. in Computer Science from the University of California at San Diego in 1997, specializing in cryptography. During and after my Ph.D. studies, I was also a Researcher at the San Diego Supercomputer Center, where I did research on authentication and privacy.
`
7. From 1997 to 2001, I was a Member of Technical Staff at Bell Labs, where I did research on authentication, privacy, multi-party computation, contract exchange, digital commerce including crypto payments, and fraud detection and prevention. From 2001 to 2004, I was a Principal Research Scientist at RSA Labs, where I worked on predicting future fraud scenarios in commerce and authentication and developed solutions to those problems. During that time I predicted the rise of what later became known as phishing. I was also an Adjunct Associate Professor in the Computer Science department at New York University from 2002 to 2004, where I taught cryptographic protocols.
`
8. From 2004 to 2016, I held a faculty position at Indiana University Bloomington, first as an Associate Professor of Computer Science, Associate Professor of Informatics, Associate Professor of Cognitive Science, and Associate Director of the Center for Applied Cybersecurity Research (CACR) from 2004 to 2008, and then as an Adjunct Associate Professor from 2008 to 2016. I was the most senior security researcher at Indiana University, where I built a research group focused on online fraud and countermeasures, resulting in over 50 publications and two books.
`
9. While a professor at Indiana University, I was also employed by Xerox PARC, PayPal, and Qualcomm to provide thought leadership to their security groups. I was a Principal Scientist at Xerox PARC from 2008 to 2010, a Director and Principal Scientist of Consumer Security at PayPal from 2010 to 2013, a Senior Director at Qualcomm from 2013 to 2015, and Chief Scientist at Agari from 2016 to 2018. Agari is a cybersecurity company that develops and commercializes technology to protect enterprises, their partners, and customers from advanced email phishing attacks. At Agari, my research studied and addressed trends in online fraud, especially as related to email, including problems such as Business Email Compromise, ransomware, and other abuses based on social engineering and identity deception. My work primarily involved identifying trends in fraud and computing before they affected the market, and developing and testing countermeasures, including technological countermeasures, user interaction, and education.
`
10. I have founded or co-founded several successful computer security companies. In 2005 I founded RavenWhite Security, a provider of authentication solutions, and I am currently its Chief Technical Officer. In 2007 I founded Extricatus, one of the first companies to address consumer security education. In 2009 I founded FatSkunk, a provider of mobile malware detection software; I served as Chief Technical Officer of FatSkunk from 2009 to 2013, when FatSkunk was acquired by Qualcomm and I became a Qualcomm employee. In 2013 I founded ZapFraud, a provider of anti-scam technology addressing Business Email Compromise, and I am currently its Chief Technical Officer. In 2014 I founded RightQuestion, a security consulting company.
`
11. I have additionally served as a member of the fraud advisory board at LifeLock (an identity theft protection company); a member of the technical advisory board at CellFony (a mobile security company); a member of the technical advisory board at PopGiro (a user reputation company); a member of the technical advisory board at MobiSocial dba Omlet (a social networking company); and a member of the technical advisory board at Stealth Security (an anti-fraud company). I have provided anti-fraud consulting to KommuneData (a Danish government entity), J.P. Morgan Chase, PayPal, Boku, and Western Union.
`
12. I have authored five books and over 100 peer-reviewed publications, and have been a named inventor on over 100 patents and patent applications.
`
13. My work has included research in the areas of applied security, privacy, cryptographic protocols, authentication, malware, social engineering, usability, and fraud.
`
II. LEGAL UNDERSTANDING

A. The Person of Ordinary Skill in the Art

14. I understand that a person of ordinary skill in the relevant art (also referred to herein as “POSITA”) is presumed to be aware of all pertinent art, thinks along conventional wisdom in the art, and is a person of ordinary creativity—not an automaton.
`
15. I have been asked to consider the level of ordinary skill in the field that someone would have had at the time the claimed invention was made. In deciding the level of ordinary skill, I considered the following:

• the levels of education and experience of persons working in the field;

• the types of problems encountered in the field; and

• the sophistication of the technology.
`
16. A POSITA relevant to the ’813 patent at the time of the invention would have had a Bachelor of Science degree in electrical engineering, computer science, or computer engineering and three years of work or research experience in the fields of secure transactions and encryption, or a Master’s degree in electrical engineering, computer science, or computer engineering and two years of work or research experience in related fields.
`
17. I have reviewed the declaration of Dr. Victor Shoup, including his opinions regarding the person of ordinary skill in the art. Ex. 1002 at ¶¶ 45-47. My description of the level of ordinary skill in the art is essentially the same as that of Dr. Shoup, except that Dr. Shoup’s description requires two years of work or research experience (as compared to my three years). The opinions set forth in this Declaration would be the same under either my or Dr. Shoup’s proposal.
`
18. I am well qualified to determine the level of ordinary skill in the art and am personally familiar with the technology of the ’813 patent. I was a person of at least ordinary skill in the art at the time of the priority date of the ’813 patent. Even where I do not explicitly state that my statements below are based on this timeframe, all of my statements are to be understood as a POSITA would have understood them as of the priority date of the ’813 patent.
`
B. Legal Principles

19. I am not a lawyer and will not provide any legal opinions. I have, however, been advised that certain legal standards are to be applied by technical experts in forming opinions regarding the meaning and validity of patent claims.
`
1. Patent Eligible Subject Matter Under 35 U.S.C. § 101

20. I understand that to obtain a patent, a claimed invention must be a new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. That said, I understand that laws of nature, abstract ideas, and natural phenomena are not patent eligible.
`
21. I understand that the United States Supreme Court’s decision in Alice Corp. Pty. v. CLS Bank Int’l, 134 S. Ct. 2347 (2014), provides a two-step analysis for distinguishing patents that claim ineligible abstract ideas from those that claim eligible applications of those ideas. In the first step, the Court must determine whether the claims at issue are directed to a patent-ineligible abstract idea. If the claims at issue are not directed to an abstract idea, the claims are deemed to be directed to patent-eligible subject matter and the analysis concludes. If, however, the claims at issue are directed to an abstract idea, the analysis proceeds to step two. In the second step, the elements of the claim must be examined, both individually and as an “ordered combination,” for an “inventive concept” that is “an element or combination of elements that is ‘sufficient to ensure that the patent in practice amounts to significantly more than a patent upon the [ineligible concept] itself.’” Id. at 2355. If a claim that is otherwise directed to an abstract idea includes elements that, when taken individually or as an ordered combination, transform the claim to significantly more than the abstract idea, the claim is deemed to be patent-eligible subject matter.
`
2. My Understanding of Claim Construction Law

22. I understand that in this CBM review the claims must be given their broadest reasonable interpretation, but that interpretation must be consistent with the patent specification. In this Declaration, I have used the broadest reasonable interpretation (“BRI”) standard when interpreting the claim terms.
`
3. CBM Review Eligibility

23. I understand that a patent is not eligible for CBM review if the patent claims a “technological invention.” See AIA § 18(d)(1). I further understand that although the AIA does not define “technological inventions,” 37 C.F.R. § 42.301(b) provides some guidance: “the following will be considered on a case-by-case basis: whether the claimed subject matter as a whole recites a technological feature that is novel and unobvious over the prior art; and solves a technical problem using a technical solution.” Thus, a patent may not be eligible for CBM review if the claimed subject matter as a whole recites a novel, unobvious technological feature that solves a technical problem with a technical solution.
`
III. OVERVIEW OF THE ’813 PATENT

A. The ’813 Patent Specification

24. The ’813 patent provides improved devices and methods that allow users to securely authenticate their identity and authenticate their electronic ID device when engaging in a distributed electronic transaction involving a point-of-sale device. Ex. 1001, ’813 Patent at FIG. 31, 43:4-51:55. When used in conjunction with the patent’s Universal Secure Registry (“USR”), the claimed electronic ID device can both securely identify the user, and separately authenticate and approve the user’s financial transaction requests made through a point-of-sale device. Id. at 43:4-15, FIG. 31. One non-exclusive, non-limiting example of such a system is shown in FIG. 31, which includes the electronic ID device 352, the point-of-sale device 354, and the USR 356. The USR in this embodiment includes a secure database that stores account (e.g., credit card) information for a plurality of users. Id. at 44:39-53.
`
25. The ’813 patent specification identifies a number of disadvantages of prior art approaches to providing secure access. For example, a prior art authorization system may control access to computer networks using password-protected accounts, but such a system is susceptible to tampering and difficult to maintain. See id. at 1:64-2:15. Moreover, prior art hand-held computer devices may be used to verify identity, but security could be compromised if the device ends up in the wrong hands. See id. at 2:16-43.
`
26. To prevent unauthorized use of the claimed electronic ID device, a user must authenticate themselves to the device to activate it for a transaction. The ’813 patent describes multiple ways to do this, including using a biometric input (e.g., fingerprint) and/or secret information (e.g., a PIN). Id. at 45:55-46:45, 50:1-22, 51:7-26. Once activated, the electronic ID device allows a user to select an account for a transaction, such as a financial transaction, and generates encrypted authentication information that is sent via the point-of-sale device to the USR for authentication and approval of the requested financial transaction. Id. at 46:22-36. Notably, this encrypted authentication information is not the user’s credit card information or other sensitive user information, which could be intercepted and misused. See id. at 4:14-20 (“Additionally, the system may enable the user’s identity to be confirmed or verified without providing any identifying information about the person to the entity requiring identification. This can be advantageous where the person suspects that providing identifying information may subject the identifying information to usurpation.”). Instead, the electronic ID device may first generate a non-predictable value, and then generate single-use authentication information from the non-predictable value, information associated with the biometric data, and the secret information. Id. at 46:14-36, 50:56-65. This encrypted authentication information is transmitted to the secure registry, where it may be used, for example, to authenticate the electronic ID device or to determine transaction approval. Id. at 11:36-45, 12:19-44, 12:64-13:8, 48:60-49:24, 50:23-32, 51:7-26.
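By way of illustration only, the style of single-use generation described above can be sketched in a few lines. This is my own simplified sketch under assumed data formats; the function name, the use of a keyed digest (HMAC), and the device key are all hypothetical and not the ’813 patent’s actual implementation:

```python
import hashlib
import hmac
import secrets

def generate_authentication_info(biometric_digest: bytes,
                                 secret_info: str,
                                 device_key: bytes) -> dict:
    """Hypothetical sketch: combine a fresh non-predictable value with
    biometric-derived data and the user's secret information to produce
    single-use authentication information. No account numbers or other
    sensitive user data appear in the output."""
    non_predictable = secrets.token_bytes(16)  # fresh for every transaction
    material = non_predictable + biometric_digest + secret_info.encode()
    # A keyed digest stands in for the "encrypted authentication information"
    auth_info = hmac.new(device_key, material, hashlib.sha256).digest()
    return {"nonce": non_predictable, "auth_info": auth_info}
```

Because a fresh non-predictable value is folded in each time, two invocations with identical biometric and secret inputs yield different outputs, which is what makes intercepted data useless for resubmission.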
`
B. The ’813 Patent Claims

27. The ’813 patent includes 26 claims. Claims 1, 16, and 24 are independent.
`
IV. CLAIM CONSTRUCTION

28. I understand that Petitioner has identified five terms that it alleges require construction. Petition at 33-42. Petitioner’s constructions of these terms do not impact my opinions in this declaration.
`
V. THE ’813 PATENT CLAIMS ARE DIRECTED TO A “TECHNOLOGICAL INVENTION”

A. Claimed Subject Matter as a Whole Recites Technological Features that are Novel and Unobvious Over the Prior Art
`
29. In my opinion, the claimed subject matter of the ’813 patent as a whole includes novel and unobvious technological features. For example, the claimed subject matter provides an electronic ID device, or methods associated with an electronic ID device, that performs user identity authentication locally at the device and generates cryptographic information for remote authentication of the device by a secure registry to, for example, enable or deny a transaction involving the device and a point-of-sale device. This technology allows for secure distributed transaction enablement involving a point-of-sale device without compromising a user’s sensitive information.
`
30. For example, independent claim 1 recites many different elements that, when taken as a whole, constitute technological features that are novel and unobvious over the prior art. Claim 1 is directed at an “electronic ID device” that allows a user to “select any one of a plurality of accounts…to employ in a financial transaction.” Ex. 1001 at 51:65-67. The electronic ID device includes a “biometric sensor” that receives the user’s biometric input. Id. at 52:1-2. The electronic ID device also includes a “user interface” that receives the user’s secret information (e.g., a personal identification number (PIN) or password) and information concerning the account selected by the user. Id. at 52:3-6. A processor at the electronic ID device “activate[s]” the ID device after successful authentication of the biometric input and/or the secret information. Id. at 52:9-15. Once the electronic ID device has been activated, the processor generates a “non-predictable value” and also generates “encrypted authentication information” from the non-predictable value, information associated with at least a portion of the biometric input, and the secret information. Id. at 52:15-21. A “communication interface” wirelessly transmits the encrypted authentication information to a “point-of-sale (POS) device,” which in turn sends at least a portion of the encrypted authentication information to a “secure registry.” Id. at 52:21-29.
`
31. In my view, the ordered combination of claim elements of claim 1 creates a unique combination that constitutes a technological feature. For example, a biometric input received by the electronic ID device’s biometric sensor and secret information received by the ID device’s input interface are used by its processor to locally authenticate the user of the ID device and activate the ID device. After this local authentication of the user is performed, the electronic ID device’s processor generates a specific item of cryptographic data (e.g., “encrypted authentication information”) from selected information (e.g., the non-predictable value, information associated with the biometric input, and the secret information). This specific item of cryptographic data is then used to authenticate the electronic ID device itself. In contrast to local user authentication at the electronic ID device, remote authentication of the electronic ID device takes place at the secure registry once the data is transmitted to the secure registry via the point-of-sale device. The resulting technological feature is also novel and nonobvious.
`
32. Similarly, independent claim 16 also recites different elements that, when taken as a whole, constitute technological features that are novel and unobvious over the prior art. Specifically, claim 16 is directed at a method of “generating authentication information.” Ex. 1001 at 53:25-26. The identity of a user is authenticated to an electronic ID device based on “biometric data” of the user and/or “secret information” (e.g., a PIN or password) known to the user, which “activates” the electronic ID device. Id. at 53:27-33. Responsive to ID device activation, a “non-predictable value” is generated, and “encrypted authentication information” is generated from the non-predictable value, information associated with at least a portion of the biometric data, and the secret information. Id. at 53:39-42. A communication interface communicates the encrypted authentication information to a “secure registry” via a “point-of-sale (POS) device” to authenticate the electronic ID device. Id. at 53:43-47.
`
33. In my opinion, the ordered combination of claim elements of claim 16 creates a unique combination that constitutes a technological feature. For instance, the electronic ID device locally authenticates the user of the ID device based on received biometric input and secret information, and activates the ID device. Responsive to this activation, a non-predictable value is generated, and a specific item of cryptographic data (e.g., “encrypted authentication information”) is generated from the non-predictable value, information associated with the biometric input, and secret information. This specific item of cryptographic data is used to remotely authenticate the electronic ID device itself with the secure registry. Also, in contrast to user authentication that occurs locally at the electronic ID device, the cryptographic data is transmitted to the secure registry via the point-of-sale device to remotely authenticate the electronic ID device at the secure registry. The resulting technological feature is also novel and nonobvious.
`
34. As another example, I believe independent claim 24 also recites various elements that, when taken as a whole, constitute technological features that are novel and unobvious over the prior art. Specifically, claim 24 is directed at a method of “controlling access to a plurality of accounts.” Ex. 1001 at 54:24-25. A “non-predictable value” is generated at an “electronic ID device.” Id. at 54:26-27. The electronic ID device also generates “encrypted authentication information” from the non-predictable value, “information associated with at least a portion of a biometric of the user” of the electronic ID device, and “secret information” (e.g., a PIN or password) provided to the electronic ID device by the user. Id. at 54:28-34. The encrypted authentication information is communicated from the ID device to a “secure registry” via a “point-of-sale (POS) device” to authenticate the ID device with the secure registry. Id. at 54:35-38. The point-of-sale device is authorized to initiate a financial transaction involving a transfer of funds to or from the account selected by the user when the encrypted authentication information is successfully authenticated, and is otherwise denied authorization. Id. at 54:39-46.
`
35. In my view, the ordered combination of claim elements of claim 24 creates a unique combination that constitutes a technological feature. For example, cryptographic data (e.g., “encrypted authentication information”) is generated using a unique set of factors (e.g., a non-predictable value, information associated with a biometric input, and secret information). This specific item of cryptographic data is then communicated to a secure registry to authenticate the electronic ID device remotely at the secure registry. If the secure registry authenticates the electronic ID device based on the cryptographic data, then a point-of-sale device is authorized to initiate a financial transaction involving a transfer of funds to or from the account selected by the user. The resulting technological feature is also novel and unobvious.
`
B. The Claimed Subject Matter Solves a Technical Problem with a Technical Solution
`
36. One technical problem that the subject matter of independent claims 1, 16, and 24 solves is how to authenticate an electronic ID device while simultaneously protecting the privacy of the user. Specifically, these claims address, among other things, how to securely and reliably authenticate a user’s device-initiated transaction remotely without compromising the user’s sensitive information. These two concerns are often in conflict with each other. I am an inventor of one of Apple Inc.’s cited art references (see Ex. 1214 in co-pending proceeding CBM2018-00024), and my doctoral thesis, entitled “Privacy vs. Authenticity,” deals with such issues. It is easy to achieve privacy if one sacrifices authenticity, or vice versa, but the goal is to get both. Additionally, achieving both should ideally be done in a manner that is usable to typical end users and is computationally practical (often meaning “affordable”).
`
37. In my opinion, the ’813 patent discloses a technical solution to this technical problem by providing a unique and highly secure distributed transaction approval system incorporating multiple parts that work together to simultaneously improve both the authenticity and privacy of distributed electronic transactions. The recited electronic ID device generates “encrypted authentication information” from a non-predictable value, information associated with at least a portion of a biometric input, and secret information known to the user. This encrypted authentication information is used to authenticate the electronic ID device without having to send sensitive information of the user (e.g., the user’s name, social security number, credit card number, etc.), which may compromise the privacy of the user. The use of the “encrypted authentication information” to authenticate the electronic ID device is computationally lightweight and protects against replay attacks. Moreover, the “secure registry” acts as an interpreter of the identity assertions and a repository of payment data.
`
38. Another critical technical concern in a distributed electronic transaction system is preventing, in the first instance, the interception of sensitive information that could later be fraudulently used in future transactions. Distributed electronic transaction systems necessarily require the electronic transmission of data that is inherently vulnerable to interception. For example, prior art distributed electronic transaction systems often required transmission of a user’s social security number, password, transaction card details, or other sensitive information over the Internet and, although that transmitted data may have been encrypted, encryption can be broken, leaving the decrypted data vulnerable to interception and fraudulent use.
`
39. In my opinion, the subject matter of claims 1, 16, and 24 solves this technical problem by incorporating technology that obviates the need to send any sensitive user information at all. Instead, a local electronic ID device generates and sends the remote secure registry encrypted authentication information that serves as a proxy for the user’s sensitive information and cannot be used outside of the transaction processing system. Thus, the encrypted authentication information is essentially useless if intercepted. Moreover, the encrypted authentication information is based on a non-predictable value that prevents intercepted data from being resubmitted into the system as a replay attack. As a result of these technical improvements, a user may participate in a distributed electronic transaction without compromising sensitive information that could later be misused.
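The replay protection afforded by the non-predictable value can be illustrated with a minimal sketch of the receiving side. The logic shown (tracking previously seen values and rejecting repeats) is a standard nonce-checking pattern and is my own assumption for illustration, not the patent’s disclosed verification procedure:

```python
class SecureRegistry:
    """Hypothetical sketch: the registry rejects any message whose
    non-predictable value has been seen before, so intercepted data
    cannot be resubmitted as a replay attack."""

    def __init__(self):
        self.seen_nonces = set()

    def accept(self, message: bytes) -> bool:
        # Assumed wire format: 16-byte non-predictable value, then the
        # encrypted authentication information.
        nonce, auth_info = message[:16], message[16:]
        if nonce in self.seen_nonces:
            return False  # replayed transmission is rejected outright
        self.seen_nonces.add(nonce)
        # In a full system the registry would also verify auth_info against
        # its stored enrollment data before approving the transaction.
        return True
```

Resubmitting a captured message verbatim fails the nonce check even though the message itself is cryptographically well formed.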
`
40. Yet another technical problem is how to securely and reliably authenticate the identity of a user of an electronic device for use in a distributed electronic transaction involving a point-of-sale device. I believe one important concern is ensuring that the person remotely initiating a transaction is authorized to do so. Due to the nature of distributed electronic transactions, the entity authorizing a transaction only has visibility into the data it actually receives, and not into the original source of that data. Hence, for example, prior art distributed electronic transaction systems lacked a technical solution to stop someone with a stolen or counterfeit transaction card (or even just stolen credentials) from fraudulently accessing confidential information or transacting through a website.
`
41. I believe the subject matter of claims 1 and 16 solves this technical problem by incorporating technology to locally authenticate the user of the electronic ID device through multifactor authentication (e.g., secret information, such as a PIN, and biometric input, such as a fingerprint) before generating and sending encrypted authentication information that is difficult to counterfeit. As to the subject matter of independent claim 24, biometric inputs from the user and secret information known to the user are leveraged when generating the encrypted authentication information for remote authentication of the electronic ID device by the secure registry. In this way, the secure registry authorizing the point-of-sale device to carry out the transaction can be confident that the person initiating the transaction at the electronic ID device is authorized to do so.
`
VI. THE ’813 PATENT CLAIMS ARE DIRECTED TO PATENT-ELIGIBLE SUBJECT MATTER

A. Claims are Not Directed to an Abstract Idea

42. Petitioner argues that all the claims are directed to the abstract idea of “verifying an account holder’s identity based on codes and/or information related to the account holder before enabling a transaction.” Petition at 44. However, in my opinion Petitioner fails in large part to account for the specific claim requirements.
`
43. For example, independent claims 1 and 16 recite an electronic ID device that performs local authentication of a user’s identity by receiving “biometric input/data” (e.g., a fingerprint, retinal scan, etc. captured by a “biometric sensor”) and “secret information” (e.g., a PIN, password, etc. received at a “user interface”) from the user to “activate” the electronic ID device so that the ID device can be used to carry out a transaction. See Ex. 1001 at 51:65-52:15, 53:25-33. Once the electronic ID device has been activated, the electronic ID device generates a “non-predictable value,” and further generates “encrypted authentication information” from the non-predictable value, information associated with the biometric data, and the secret information. See id. at 52:15-21, 53:34-42. These are not arbitrary or abstract concepts but instead are distinct, technical components that serve to address real-life technical problems that plague distributed electronic transactions by generating and transmitting information in a secure process that is exceedingly difficult to counterfeit and impervious to “replay attacks.” Once generated, the encrypted authentication information is sent to a “secure registry” via a “point-of-sale (POS) device” to authenticate the electronic ID device with the secure registry. See Ex. 1001 at 52:24-29, 53:43-47. A close review of these claim limitations demonstrates that the claimed subject matter is directed to a tangible device (claim 1) and process (claim 16) that perform discernable and meaningful functions to effectuate a secure transaction between a user’s electronic device and a point-of-sale device.
`
44. As another example, independent claim 24 also recites an electronic ID device that generates “encrypted authentication information” from a non-predictable value, information associated with biometric data of a user, and secret information received from the user. See id. at 54:26-34. Again, these are not arbitrary or abstract concepts but instead are distinct, technical components that serve to address real-life technical problems th