UNITED STATES PATENT AND TRADEMARK OFFICE
________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________

APPLE INC.
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC
Patent Owner
________________

Case CBM2018-00024
U.S. Patent No. 8,577,813
________________

PATENT OWNER’S EXHIBIT 2001

DECLARATION OF MARKUS JAKOBSSON

IN SUPPORT OF PATENT OWNER’S PRELIMINARY RESPONSE

1. I have been retained on behalf of Universal Secure Registry LLC (“Patent Owner”) in connection with the above-captioned covered business method review (CBM). I have been retained to provide my opinions in support of USR’s Preliminary Response. I am being compensated for my time at the rate of $625 per hour. I have no interest in the outcome of this proceeding.

2. In preparing this declaration, I have reviewed and am familiar with the Petition for CBM2018-00024, U.S. Patent No. 8,577,813, and its file history, and all other materials cited and discussed in the Petition (including the declaration of Dr. Victor Shoup) and cited and discussed in this Declaration. I understand the Petition asserts that claims 1-2, 4-11, 13-20, and 22-26 are obvious. Specifically, I understand that the Petition asserts independent claims 1, 16 and 24 are obvious in view of U.S. Patent No. 6,016,476 (“Maes”) (Ex. 1213) and International Publication No. WO 2004/051585 (“Jakobsson”) (Ex. 1214) under 35 U.S.C. § 103 (“the Challenged Claims”).

3. The statements made herein are based on my own knowledge and opinion. This Declaration represents only the opinions I have formed to date. I may consider additional documents as they become available or other documents that are necessary to form my opinions. I reserve the right to revise, supplement, or amend my opinions based on new information and on my continuing analysis.

I. QUALIFICATIONS

4. My qualifications can be found in my Curriculum Vitae, which includes my detailed employment background, professional experience, and list of technical publications and patents. Ex. 2002.

5. I am currently the Chief of Security and Data Analytics at Amber Solutions, Inc., a cybersecurity company that develops home and office automation technology. At Amber, my research studies and addresses abuse, including social engineering, malware and privacy intrusions. My work primarily involves identifying risks, developing protocols and user experiences, and evaluating the security of proposed approaches.

6. I received a Master of Science degree in Computer Engineering from the Lund Institute of Technology in Sweden in 1993, a Master of Science degree in Computer Science from the University of California at San Diego in 1994, and a Ph.D. in Computer Science from the University of California at San Diego in 1997, specializing in Cryptography. During and after my Ph.D. studies, I was also a Researcher at the San Diego Supercomputer Center, where I did research on authentication and privacy.

7. From 1997 to 2001, I was a Member of Technical Staff at Bell Labs, where I did research on authentication, privacy, multi-party computation, contract exchange, digital commerce including crypto payments, and fraud detection and prevention. From 2001 to 2004, I was a Principal Research Scientist at RSA Labs, where I worked on predicting future fraud scenarios in commerce and authentication and developed solutions to those problems. During that time I predicted the rise of what later became known as phishing. I was also an Adjunct Associate Professor in the Computer Science department at New York University from 2002 to 2004, where I taught cryptographic protocols.

8. From 2004 to 2016, I held a faculty position at Indiana University at Bloomington, first as an Associate Professor of Computer Science, Associate Professor of Informatics, Associate Professor of Cognitive Science, and Associate Director of the Center for Applied Cybersecurity Research (CACR) from 2004 to 2008; and then as an Adjunct Associate Professor from 2008 to 2016. I was the most senior security researcher at Indiana University, where I built a research group focused on online fraud and countermeasures, resulting in over 50 publications and two books.

9. While a professor at Indiana University, I was also employed by Xerox PARC, PayPal, and Qualcomm to provide thought leadership to their security groups. I was a Principal Scientist at Xerox PARC from 2008 to 2010, a Director and Principal Scientist of Consumer Security at PayPal from 2010 to 2013, a Senior Director at Qualcomm from 2013 to 2015, and Chief Scientist at Agari from 2016 to 2018. Agari is a cybersecurity company that develops and commercializes technology to protect enterprises, their partners and customers from advanced email phishing attacks. At Agari, my research studied and addressed trends in online fraud, especially as related to email, including problems such as Business Email Compromise, Ransomware, and other abuses based on social engineering and identity deception. My work primarily involved identifying trends in fraud and computing before they affected the market, and developing and testing countermeasures, including technological countermeasures, user interaction and education.

10. I have founded or co-founded several successful computer security companies. In 2005 I founded RavenWhite Security, a provider of authentication solutions, and I am currently its Chief Technical Officer. In 2007 I founded Extricatus, one of the first companies to address consumer security education. In 2009 I founded FatSkunk, a provider of mobile malware detection software; I served as Chief Technical Officer of FatSkunk from 2009 to 2013, when FatSkunk was acquired by Qualcomm and I became a Qualcomm employee. In 2013 I founded ZapFraud, a provider of anti-scam technology addressing Business Email Compromise, and I am currently its Chief Technical Officer. In 2014 I founded RightQuestion, a security consulting company.

11. I have additionally served as a member of the fraud advisory board at LifeLock (an identity theft protection company); a member of the technical advisory board at CellFony (a mobile security company); a member of the technical advisory board at PopGiro (a user reputation company); a member of the technical advisory board at MobiSocial dba Omlet (a social networking company); and a member of the technical advisory board at Stealth Security (an anti-fraud company). I have provided anti-fraud consulting to KommuneData (a Danish government entity), J.P. Morgan Chase, PayPal, Boku, and Western Union.

12. I have authored five books and over 100 peer-reviewed publications, and have been a named inventor on over 100 patents and patent applications.

13. My work has included research in the area of applied security, privacy, cryptographic protocols, authentication, malware, social engineering, usability and fraud.

II. LEGAL UNDERSTANDING

A. The Person of Ordinary Skill in the Art

14. I understand that a person of ordinary skill in the relevant art (also referred to herein as “POSITA”) is presumed to be aware of all pertinent art, thinks along the lines of conventional wisdom in the art, and is a person of ordinary creativity—not an automaton.

15. I have been asked to consider the level of ordinary skill in the field that someone would have had at the time the claimed invention was made. In deciding the level of ordinary skill, I considered the following:

• the levels of education and experience of persons working in the field;
• the types of problems encountered in the field; and
• the sophistication of the technology.

16. A person of ordinary skill in the art (“POSITA”) relevant to the ’813 patent at the time of the invention would have a Bachelor of Science degree in electrical engineering, computer science or computer engineering, and three years of work or research experience in the fields of secure transactions and encryption, or a Master’s degree in electrical engineering, computer science or computer engineering, and two years of work or research experience in related fields.

17. I have reviewed the declaration of Dr. Victor Shoup, including his opinions regarding the Person of Ordinary Skill in the Art. Ex. 1202 at ¶¶ 37-38. My description of the level of ordinary skill in the art is essentially the same as that of Dr. Shoup, except that Dr. Shoup’s description requires two years of work or research experience (as compared to three years). The opinions set forth in this Declaration would be the same under either my or Dr. Shoup’s proposal.

18. I am well-qualified to determine the level of ordinary skill in the art and am personally familiar with the technology of the ’813 Patent. I was a person of at least ordinary skill in the art at the time of the priority date of the ’813 patent. Even where I do not explicitly state that my statements below are based on this timeframe, all of my statements are to be understood as a POSITA would have understood them as of the priority date of the ’813 patent.

B. Legal Principles

19. I am not a lawyer and will not provide any legal opinions. Though I am not a lawyer, I have been advised that certain legal standards are to be applied by technical experts in forming opinions regarding the meaning and validity of patent claims.

1. Anticipation

20. I understand that to obtain a patent, a claimed invention must have, as of the priority date, been novel and not anticipated by prior art in the field. I understand that a claim is anticipated only if each and every element in the claim is explicitly or inherently found in a single prior art reference.

21. I understand that implicit and inherent disclosures of a prior art reference may be relied upon in the rejection of claims for anticipation, so long as the limitation not expressly disclosed necessarily flows from the teachings of the prior art reference. I also understand that, to be an inherent disclosure, the limitation must necessarily be contained in the prior art reference, and the mere fact that the process, apparatus, or system described in the prior art reference might possibly or sometimes practice or contain a claim limitation is inadequate to establish that the reference inherently discloses the limitation.

2. My Understanding of Obviousness Law

22. I understand that to obtain a patent, a claimed invention must have, as of the priority date, been nonobvious in view of the prior art in the field. I understand that an invention is obvious when the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art.

23. I understand that to prove that prior art, or a combination of prior art, renders a patent obvious, it is necessary to: (1) identify the particular references that singly, or in combination, make the patent obvious; (2) specifically identify which elements of the patent claim appear in each of the asserted references; and (3) explain how the prior art references could have been combined to create the inventions claimed in the asserted claim.

24. I understand that a patent composed of several elements is not proved obvious merely by demonstrating that each of its elements was, independently, known in the prior art, and that obviousness cannot be based on the hindsight combination of components selectively culled from the prior art to fit the parameters of the patented invention.

25. I also understand that a reference may be said to teach away when a person of ordinary skill, upon reading the reference, would be discouraged from following the path set out in the reference, or would be led in a direction divergent from the path that was taken by the applicant. Even if a reference is not found to teach away, I understand its statements regarding preferences are relevant to a finding regarding whether a skilled artisan would be motivated to combine that reference with another reference.

3. My Understanding of Claim Construction Law

26. I understand that in this CBM review the claims must be given their broadest reasonable interpretation, but that interpretation must be consistent with the patent specification. In this Declaration, I have used the broadest reasonable interpretation (“BRI”) standard when interpreting the claim terms.

III. OVERVIEW OF THE ’813 PATENT

A. The ’813 Patent Specification

27. I have reviewed the ’813 patent. The ’813 patent provides improved systems, devices and methods that allow users to securely authenticate their identity when using a “point-of-sale” (“POS”) device. Ex. 1201 (’813 patent) at Fig. 31, 43:4-51:55. When used in conjunction with the patent’s Universal Secure Registry (“USR”), the claimed Electronic ID Device can both securely identify the user, and separately authenticate and approve the user’s financial transaction requests made through a POS device. Id. at 43:4-15, Fig. 31. The USR (USR 10 in Fig. 1, USR 356 in Fig. 31) includes a secure database that stores account (e.g., credit card) information for a plurality of users. Id., 44:39-53.

28. The ’813 patent specification identifies a number of disadvantages of prior art approaches to providing secure access. For example, a prior art authorization system may control access to computer networks using password protected accounts, but such a system is susceptible to tampering and difficult to maintain. Id., at 1:64-2:15. Or, hand-held computer devices may be used to verify identity, but security could be compromised if a device ends up in the wrong hands. Id., at 2:16-43.

29. To prevent unauthorized use of the Electronic ID Device, a user must first authenticate themselves to the device to activate it for a financial transaction. The ’813 patent describes multiple ways to do this, including using a biometric input (e.g., fingerprint) and/or secret information (e.g., a PIN). Id. at 45:55-46:45, 50:1-22, 51:7-26. Once activated, the Electronic ID Device allows a user to select an account for a financial transaction, and generates encrypted authentication information that is sent via the POS device to the USR for authentication and approval of the requested financial transaction. Id. at 46:22-36. This encrypted authentication information is not the user’s credit card information (which could be intercepted and misused). Instead, the Electronic ID Device first generates a non-predictable value (e.g., a random number) using, for example, a seed (Id. at 33:64-34:61, 46:46-67), and then generates single-use authentication information using the non-predictable value, information associated with the biometric data, and the secret information. Id. at 46:14-36, 50:56-65. This encrypted authentication information is transmitted to the secure registry, where it is used to determine transaction approval. Id. at 11:36-45, 12:19-44, 12:64-13:8, 48:60-49:24, 50:23-32, 51:7-26.
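
For illustration only, the sketch below shows one way a POSITA could realize the kind of generation just described, combining a seed-derived non-predictable value, information associated with the biometric input, and the secret information into single-use authentication information. The function names, the data layout, and the use of HMAC-SHA256 as the combining primitive are my own assumptions for purposes of illustration; they are not drawn from the ’813 patent, which is not limited to any particular construction.

```python
import hashlib
import hmac
import json
import time

def non_predictable_value(seed: bytes, interval: int = 30) -> bytes:
    """Derive a time-varying, non-predictable value from a device-held seed
    (one common construction; illustrative only)."""
    counter = int(time.time() // interval)
    return hmac.new(seed, counter.to_bytes(8, "big"), hashlib.sha256).digest()

def authentication_information(seed: bytes, biometric_digest: bytes,
                               pin: str, account_id: str) -> str:
    """Combine the non-predictable value, biometric-derived data, and the
    secret information into single-use authentication information.  No card
    number, PIN, or biometric template appears in the output."""
    npv = non_predictable_value(seed)
    key = npv + biometric_digest + pin.encode()
    payload = json.dumps({"account": account_id}).encode()
    code = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return json.dumps({"account": account_id, "code": code})

# The electronic ID device would hand this opaque string to the POS device,
# which forwards it to the secure registry for verification.
message = authentication_information(b"device-seed", b"fingerprint-digest",
                                     "1234", "card-01")
```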

B. The ’813 Patent Claims

30. The ’813 patent includes 26 claims. Claims 1, 16, and 24 are independent. All of the claims relate to communicating authentication information from an electronic ID device. Id. at 51:65-52:29. Claims 16 and 24 recite alternative embodiments of the invention. Id. at 53:25-47; 54:24-46.

IV. OVERVIEW OF THE ASSERTED PRIOR ART

A. Maes (Exhibit 1213)

31. U.S. Patent No. 6,016,476 (“Maes”) (Ex. 1213) explains that consumers at the time had to carry a large number of “credit cards, ATM cards and direct debit cards,” and that carrying this many cards was burdensome. Id. at 1:50-2:20, 2:50-67. Accordingly, the “object of [Maes’] present invention” is to provide a PDA that is “compatible with the current infrastructure (i.e., immediately employed without having to change the existing infrastructure)” and in which the user can store all their card information. Id. at 2:23-49, 7:61-7:19. When the user needs to conduct a transaction, the PDA can write selected card information to a smartcard (“Universal Card”) that is then swiped across a point of sales terminal. Id. at 4:1-11, 2:23-30.

32. A user of Maes begins by enrolling for the service. Id. at 6:56-67. Prior to conducting a transaction, the user must also connect the PDA to the central server of the service provider in a “client/server” mode in order to download a temporary digital certificate. Id. at 3:39-52. After downloading the digital certificate, the PDA can initiate financial transactions without having to connect to a server, in what is called “local mode.” Id. at 3:52-67, Figs. 5-6. Where the PDA is being used with a point of sales terminal that supports electronic data transfer, the local mode operates as shown in Figure 5. Id. at 3:53-467, 12:5-29, Fig. 5. The user selects a financial card stored in the PDA. Id. at 12:5-29. The PDA determines that the user is authorized to initiate the transaction by performing local verification (i.e., verification on the PDA) of the user’s biometric and/or PIN and confirming the digital certificate is valid. Id. at 3:53-467, Fig. 5. If the verification is valid, the PDA knows that the user is authorized to conduct the transaction, and the card information is transmitted to a financial institution by either writing it to the Universal Card and swiping it across the terminal (steps 216-218), or sending it to the terminal wirelessly (step 228). Id. at 12:5-29, Fig. 5.
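
The control flow I describe above can be summarized, purely for illustration, in the following sketch. The class and function names and the specific checks are my own simplifications rather than Maes’ disclosure; the point relevant to my analysis is that in Maes’ local mode the user verification occurs entirely on the PDA, after which the stored card information itself is released to the terminal.

```python
import time
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class PDA:
    """Minimal stand-in for Maes' PDA (illustrative names and fields only)."""
    pin: str
    fingerprint_template: bytes
    stored_cards: Dict[str, bytes] = field(default_factory=dict)
    certificate_expires_at: float = 0.0  # temporary certificate from client/server mode

    def local_verification(self, pin: str, fingerprint: bytes) -> bool:
        # Biometric and/or PIN check plus certificate check, all on the device.
        user_ok = fingerprint == self.fingerprint_template or pin == self.pin
        cert_ok = time.time() < self.certificate_expires_at
        return user_ok and cert_ok

def local_mode_transaction(pda: PDA, card_name: str, pin: str,
                           fingerprint: bytes) -> Optional[bytes]:
    """If local verification succeeds, the selected card information is released,
    to be written to the Universal Card or sent wirelessly to the terminal."""
    if not pda.local_verification(pin, fingerprint):
        return None
    return pda.stored_cards[card_name]

# Example with made-up data.
pda = PDA(pin="1234", fingerprint_template=b"template",
          stored_cards={"visa": b"4111111111111111;exp=01/99"},
          certificate_expires_at=time.time() + 3600)
card_info = local_mode_transaction(pda, "visa", "1234", b"template")
```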

33. Maes discloses an alternative local mode of operation that is designed to “provide[] biometric security for transactions that do not involve electronic data transfer” (e.g., “transactions that are performed remotely over the telephone”) using an “authorization number.” Id. at 12:30-39, 6:50-55, 2:42-48. In these situations, after the PDA locally verifies that the user is authorized to conduct the transaction using the biometric and other information, it displays the authorization number on the PDA screen. Id. at 12:30-13:5. The card information and authorization number are then “verbally communicated to the merchant in order to process the transaction.” Id. at 12:30-13:5. The operation of this alternative local mode is shown in Figure 6.

B. Jakobsson (Exhibit 1214)

34. I am an inventor of Petitioner’s secondary prior art reference International Patent Application Publication No. WO 2004/051585 A2 (Ex. 1214, “Jakobsson”). Jakobsson discloses an event detecting and alert system for personal identity authentication systems. Specifically, “[t]he invention addresses the[] shortcomings [of the prior art] by including an indication of the occurrence of an event directly into the efficient computation of an identity authentication code, where the verifier may efficiently verify the authentication code and identify the signaling of an event state.” Ex. 1214, Jakobsson at [0010]. See id. at [0011] (“the previous approaches do not have the flexibility to communicate event information in, or as part of, an authentication code, in the present approach, an authentication code is generated in a manner that communicates to the verifier information about the occurrence of one or more reportable events.”). Jakobsson expressly discloses that “[e]xample reportable events include: device tampering; an event external to the device detected by the device; an environmental event, such as temperature exceeding or falling below a threshold; static discharge; high or low battery power; geographic presence at a particular location; confidence level in a biometric reading; and so on.” Ex. 1214, Jakobsson at [0011].

35. Jakobsson’s user device (such as a credit card device, key fob, USB dongle or cellular telephone, Ex. 1214, Jakobsson at [0041]) computes an authentication code based upon various values including a dynamic variable that changes over time, an event state, a device secret, along with user data such as a PIN number or social security number. Id. at [0043], [0058], [0060-0061]. The code is then transmitted to a verifier that retrieves from its records the data necessary for verification, such as the PIN associated with the user, and then verifies the received information. Ex. 1214, Jakobsson at [0049-0050].
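
As a rough illustration of the kind of computation and verification described above, the sketch below combines a time-based dynamic variable, an event state, a device secret, and a PIN into a short code, and shows a verifier recomputing candidate codes from its own records. The choice of HMAC-SHA256, the one-minute time step, and the two-valued event state are my own assumptions for purposes of illustration; Jakobsson is not limited to this construction.

```python
import hashlib
import hmac
import time
from typing import Optional

def authentication_code(device_secret: bytes, pin: str, event_state: int,
                        t: Optional[int] = None) -> str:
    """Combine a dynamic (time-based) variable, an event state, the device
    secret, and user data such as a PIN into an authentication code."""
    t = int(time.time() // 60) if t is None else t
    msg = t.to_bytes(8, "big") + bytes([event_state]) + pin.encode()
    return hmac.new(device_secret, msg, hashlib.sha256).hexdigest()[:8]

def verify(received: str, device_secret: bytes, pin: str) -> Optional[int]:
    """Verifier side: recompute candidates from the stored secret and PIN for
    each possible event state; a match authenticates the user and also tells
    the verifier which event state was signaled."""
    t = int(time.time() // 60)
    for event_state in (0, 1):  # e.g., 0 = normal, 1 = reportable event occurred
        candidate = authentication_code(device_secret, pin, event_state, t)
        if hmac.compare_digest(received, candidate):
            return event_state
    return None
```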

V. CLAIM CONSTRUCTION

36. I understand that Petitioner has identified five terms that they allege require construction. Pet. at 19-24. Their constructions for these terms do not impact my opinion in this declaration.

VI. THE ’813 PATENT RECITES A NOVEL AND INVENTIVE TECHNOLOGICAL FEATURE THAT PROVIDES A TECHNICAL SOLUTION TO A TECHNICAL PROBLEM

37. In my opinion, the claimed invention includes a novel and nonobvious technological feature that provides a technical solution to a technical problem. Here, the technical problem is to implement privacy and authentication at the same time: specifically, how to securely and reliably authenticate a user and the user’s device-initiated transaction remotely and without compromising the user’s sensitive information. These two concerns are often in conflict with each other. This is evidenced by the fact that my Doctoral Thesis is entitled “Privacy vs. Authenticity,” and deals with such issues. Indeed, it is easy to achieve privacy if one sacrifices authenticity (and vice versa). The goal is to get both. Additionally, this has to be done in a manner that is usable to typical end users, and which is computationally practical (which often means “affordable”). The ’813 patent discloses a technical solution to this problem.

38. The ’813 patent provides a unique and highly secure distributed transaction approval system incorporating multiple parts that work together to simultaneously improve both the authenticity and privacy of distributed electronic transactions. The use of the “encrypted authentication information” is computationally lightweight, protects against replay attacks, and serves as a proxy for transmitting sensitive data over a network. The use of a “secure registry” acts as an interpreter of the identity assertions and a repository of payment data. The fact that the “secure registry” can use any type of database is another benefit that makes it easier to implement.

39. Specifically, the invention includes a local “electronic ID device” that authenticates a user of the device based upon “secret information” (e.g., a PIN code) and “biometric input” (e.g., a fingerprint captured by the “biometric sensor” of the local device). The local device then generates and wirelessly transmits a transaction approval request signal to a remote “secure registry.” This signal (“encrypted authentication information”) is generated using a “non-predictable value, information associated with at least a portion of the biometric input, and the secret information.” The local device also “wirelessly transmit[s] the encrypted authentication information to a point-of-sale (POS) device.” And, the “secure registry” also receives “at least a portion of the encrypted information from the POS device.” The “secure registry” uses the “encrypted authentication information” “to authenticate the electronic ID device” and “authorize[] the POS device to initiate a financial transaction . . . when the encrypted authentication information is successfully authenticated.”

40. As discussed above, one important concern is ensuring that the person remotely initiating a transaction is authorized to do so. Due to the nature of distributed electronic transactions, the entity authorizing a transaction only has visibility into the data it actually receives, and not into the ultimate source of that data. Hence, for example, prior art distributed electronic transaction systems lacked a technical solution to stop someone with a stolen or counterfeit transaction card (or even just stolen credentials) from fraudulently accessing confidential information or transacting through a website. The claimed invention solves this technical problem by incorporating technology to locally authenticate the user of the device through multifactor authentication (e.g., a secret PIN and fingerprint), and to generate and send the remote “secure registry” specific data that is difficult to counterfeit. In this way, the entity authorizing a transaction via the second device can be confident that the person initiating the transaction via the first device is authorized to do so.

41. Another critical technical concern in a distributed electronic transaction system is preventing, in the first instance, the interception of sensitive information that could later be fraudulently used in future transactions. Distributed electronic transaction systems necessarily require the electronic transmission of data that is inherently vulnerable to interception. Hence, for example, prior art distributed electronic transaction systems often required transmission of a user’s social security number, password, transaction card details, or other sensitive information over the Internet and, although that data may have been encrypted, encryption can be broken, leaving the transmitted data vulnerable to interception and fraudulent use. The claimed invention solves this technical problem by incorporating technology that obviates the need to send any sensitive information at all. Instead, the local first device generates and sends the remote second device authentication information that cannot be used outside of the transaction processing system and, thus, is essentially useless if intercepted. Moreover, the local first device generates and uses a non-predictable value that prevents intercepted data from being resubmitted into the system as a replay attack. As a result of these technical improvements, a user may participate in a distributed electronic transaction without compromising sensitive information that could later be misused.
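
The replay-protection property discussed in this paragraph can be illustrated with a minimal sketch, assuming a counter-based non-predictable value and a seed shared only between the device and the registry. The class name, the counter scheme, and the use of HMAC are my own illustrative assumptions; the ’813 patent is not limited to this construction. The sketch shows why authentication information that was valid when first sent is rejected if an eavesdropper resubmits it.

```python
import hashlib
import hmac
from typing import Dict, Set

class RegistrySketch:
    """Accepts each non-predictable value only once, so intercepted
    authentication information cannot be replayed (illustrative only)."""

    def __init__(self) -> None:
        self.seeds: Dict[str, bytes] = {}    # device id -> shared seed
        self.used: Dict[str, Set[int]] = {}  # device id -> counters already accepted

    def enroll(self, device_id: str, seed: bytes) -> None:
        self.seeds[device_id] = seed
        self.used[device_id] = set()

    def authenticate(self, device_id: str, counter: int, code: str) -> bool:
        if counter in self.used[device_id]:
            return False                     # replay: this value was already consumed
        expected = hmac.new(self.seeds[device_id], counter.to_bytes(8, "big"),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, code):
            return False                     # counterfeit or corrupted message
        self.used[device_id].add(counter)
        return True
```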

42. Hence, when viewed as a whole and in light of the specification, the claimed subject matter involves a novel and inventive technological feature that provides an improved technical solution to a technical problem specifically arising in distributed electronic transactions.

VII. GROUND 1: MAES AND JAKOBSSON DO NOT RENDER THE CHALLENGED CLAIMS OBVIOUS

43. Of the Petition’s three asserted grounds, I understand that only Ground 1 challenges the independent claims (1, 16 and 24) of the ’813 patent. See Petition at 25-91. Specifically, Ground 1 proffers the combination of Maes in view of Jakobsson for claims 1-2, 4-5, 11, 13, 16-20, and 24. Id. at 27. However, in my opinion, Ground 1 does not render any independent claim obvious.

A. Secure Registry

44. Each independent claim requires a “secure registry.” See Ex. 1201, ’813 patent at 1[c], 1[g], 16[f], 24[c]. For example, limitation 1[c] recites “a communication interface configured to communicate with a secure registry.” I understand the Petitioner defines “secure registry” as “a database with access restrictions.” Petition at 24. While the Petitioner argues that both Maes and Jakobsson disclose this element, in my opinion, Petitioner is wrong about both references.

1. Maes does not disclose a secure registry

45. Initially, I understand Petitioner argues that Maes’ “financial server 70 . . . operates as a secure registry during a ‘purchase transaction.’” Petition at 31-32. Petitioner notes Maes discloses that after local authentication, the PDA sends financial information to financial institution 70 via the POS terminal 80, upon which financial institution 70 verifies the user’s identity and provides an authorization number to the merchant. Id. at 32. From this Petitioner argues that a POSITA would understand “that financial institution 70 would need to include a database with access restrictions because it accepts encrypted information from the PDA device 10 to verify the identity of the consumer and provides an authorization number when the consumer is verified.” Id. at 32-33. Petitioner further argues that central server 60 is coupled to financial institution 70 and includes database functionality with access restrictions because server 60 contains the user’s credit card information. Id. at 33. Thus, I understand Petitioner argues that a POSITA “would have understood that central server 60 is one example of database used by financial institution 70 as a secure registry.” Id. at 34. I disagree with this conclusion.

46. Specifically, the Petition fails to proffer any citation to Maes disclosing that either central server 60 or financial institution 70 has any type of access restrictions. Indeed, in my opinion, the Petition has failed to identify any functionality in Maes that even recites the Petitioner’s own claim construction.

47. In my opinion, there is also no teaching or suggestion of any type of access restrictions in Maes. For example, there is no disclosure of any limitation of searching the database for a matching encryption key. Maes merely states that the key is provided to financial institution 70 in advance and the key is used to decrypt. Ex. 1213, Maes at 13:51-55.

48. Further, there is no inherent disclosure of access restrictions in Maes because it is not necessary for the function of Maes. In my opinion, a POSITA would understand that a database with access restrictions is not necessary for Maes to function. There is also no implicit disclosure of access restrictions because any number of security methodologies could be implemented without using access restrictions on a database.

2. Jakobsson does not disclose a secure registry

49. In my opinion, the failing of Maes cannot be cured with the improper combination of Jakobsson. As discussed below, a POSITA would not be motivated to combine these references to cure Maes’ lack of disclosure of a secure registry. In addition, as shown herein, in my opinion, Petitioner has failed to prove that Jakobsson even discloses a secure registry.

50. I understand Petitioner next contends that Jakobsson discloses a secure registry. Petition at 34. Specifically, the Petition argues that verifier 105 is the secure registry because it stores multiple users’ PINs and secrets; thus, a “POSITA would have understood that verifier 105 has access restrictions because it will authenticate only users with valid credentials to enable a transaction.” Id. at 35. However, in my opinion, nothing in the Petition’s cited portions of Jakobsson describes an access restriction for the underlying database, but only an authentication mechanism to ensure users are authenticated prior to authorizing a transaction. Indeed, authentication is not an access restriction. Moreover, based upon the Petition’s argument and construction, if authentication could be construed as access restrictions, the ’813 patent’s subsequent encryption (see, e.g., 1[f]) would be redundant.

51. Finally, there is nothing in Jakobsson that would require an access-restricted database in order for it to function. Similarly, the disclosure is not implicit as there are any number of secu
