________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________

UNIFIED PATENTS, INC.
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC
Patent Owner
________________

Case IPR2018-00067
U.S. Patent No. 8,577,813
________________

PATENT OWNER’S EXHIBIT 2004

DECLARATION OF MARKUS JAKOBSSON

IN SUPPORT OF PATENT OWNER’S RESPONSE

USR Exhibit 2004

1. I have been retained on behalf of Universal Secure Registry LLC (“Patent Owner” or “USR”) in connection with the above-captioned inter partes review (IPR) to provide my opinions in support of USR’s Patent Owner Response. I am being compensated for my time at the rate of $625 per hour. I have no interest in the outcome of this proceeding.

2. In preparing this declaration, I have reviewed and am familiar with the Board’s decision in IPR2018-00067 to institute review (Paper 14, “Decision”), along with U.S. Patent No. 8,577,813 and its file history. I have also reviewed the Corrected Petition (Paper 12, “Petition”) and its exhibits, and the Patent Owner’s Preliminary Responses and associated exhibits.

3. I understand the Board instituted review as to claims 1-3 and 5-26 (“Challenged Claims”) as being obvious in view of Maes (Ex. 1003), Pare (Ex. 1004), Labrou (Ex. 1005), Burger (Ex. 1006), and Pizarro (Ex. 1007) under 35 U.S.C. § 103. Specifically, the Decision relied upon the combination of:

(i) Maes and Pare for claims 1-3, 5, 11, 13-17, 20, and 22-26;
(ii) Maes and Labrou for claims 1-3, 5, 11-17, and 19-26;
(iii) Maes, Pare, and Labrou for claims 12-15, 19-23, and 25-26;
(iv) Maes, Pare, and Burger for claims 6-9 and 18;
(v) Maes, Labrou, and Burger for claims 6-10 and 18;
(vi) Maes, Pare, Burger, and Labrou for claim 10; and
(vii) Pizarro and Pare for claims 2, 5, 11, 13, 16-17, and 24.

4. The statements made herein are based on my own knowledge and opinion. This Declaration represents only the opinions I have formed to date. I may consider additional documents as they become available or other documents that are necessary to form my opinions. I reserve the right to revise, supplement, or amend my opinions based on new information and on my continuing analysis.

I. QUALIFICATIONS

5. My qualifications can be found in my Curriculum Vitae, which includes my detailed employment background, professional experience, and list of technical publications and patents. Ex. 2005.

6. I am currently the Chief of Security and Data Analytics at Amber Solutions, Inc., a cybersecurity company that develops home and office automation technology. At Amber, my research studies and addresses abuse, including social engineering, malware, and privacy intrusions. My work primarily involves identifying risks, developing protocols and user experiences, and evaluating the security of proposed approaches.

7. I received a Master of Science degree in Computer Engineering from the Lund Institute of Technology in Sweden in 1993, a Master of Science degree in Computer Science from the University of California at San Diego in 1994, and a Ph.D. in Computer Science from the University of California at San Diego in 1997, specializing in Cryptography. During and after my Ph.D. studies, I was also a Researcher at the San Diego Supercomputer Center, where I did research on authentication and privacy.

8. From 1997 to 2001, I was a Member of Technical Staff at Bell Labs, where I did research on authentication, privacy, multi-party computation, contract exchange, digital commerce including crypto payments, and fraud detection and prevention. From 2001 to 2004, I was a Principal Research Scientist at RSA Labs, where I worked on predicting future fraud scenarios in commerce and authentication and developed solutions to those problems. During that time I predicted the rise of what later became known as phishing. I was also an Adjunct Associate Professor in the Computer Science department at New York University from 2002 to 2004, where I taught cryptographic protocols.

9. From 2004 to 2016, I held a faculty position at Indiana University Bloomington: first as an Associate Professor of Computer Science, Associate Professor of Informatics, Associate Professor of Cognitive Science, and Associate Director of the Center for Applied Cybersecurity Research (CACR) from 2004 to 2008, and then as an Adjunct Associate Professor from 2008 to 2016. I was the most senior security researcher at Indiana University, where I built a research group focused on online fraud and countermeasures, resulting in over 50 publications and two books.

10. While a professor at Indiana University, I was also employed by Xerox PARC, PayPal, and Qualcomm to provide thought leadership to their security groups. I was a Principal Scientist at Xerox PARC from 2008 to 2010, a Director and Principal Scientist of Consumer Security at PayPal from 2010 to 2013, a Senior Director at Qualcomm from 2013 to 2015, and Chief Scientist at Agari from 2016 to 2018.

11. Agari is a cybersecurity company that develops and commercializes technology to protect enterprises, their partners and customers from advanced email phishing attacks. At Agari, my research studied and addressed trends in online fraud, especially as related to email, including problems such as Business Email Compromise, Ransomware, and other abuses based on social engineering and identity deception. My work primarily involved identifying trends in fraud and computing before they affected the market, and developing and testing countermeasures, including technological countermeasures, user interaction and education.

12. I have founded or co-founded several successful computer security companies. In 2005 I founded RavenWhite Security, a provider of authentication solutions, and I am currently its Chief Technical Officer. In 2007 I founded Extricatus, one of the first companies to address consumer security education. In 2009 I founded FatSkunk, a provider of mobile malware detection software; I served as Chief Technical Officer of FatSkunk from 2009 to 2013, when FatSkunk was acquired by Qualcomm and I became a Qualcomm employee. In 2013 I founded ZapFraud, a provider of anti-scam technology addressing Business Email Compromise, and I am currently its Chief Technical Officer. In 2014 I founded RightQuestion, a security consulting company.

13. I have additionally served as a member of the fraud advisory board at LifeLock (an identity theft protection company); a member of the technical advisory board at CellFony (a mobile security company); a member of the technical advisory board at PopGiro (a user reputation company); a member of the technical advisory board at MobiSocial dba Omlet (a social networking company); and a member of the technical advisory board at Stealth Security (an anti-fraud company). I have provided anti-fraud consulting to KommuneData (a Danish government entity), J.P. Morgan Chase, PayPal, Boku, and Western Union.

14. I have authored five books and over 100 peer-reviewed publications, and have been a named inventor on over 100 patents and patent applications.

15. My work has included research in the area of applied security, privacy, cryptographic protocols, authentication, malware, social engineering, usability and fraud.

II. LEGAL UNDERSTANDING

A. The Person of Ordinary Skill in the Art

16. I understand that a person of ordinary skill in the relevant art (also referred to herein as “POSITA”) is presumed to be aware of all pertinent art, thinks along the lines of conventional wisdom in the art, and is a person of ordinary creativity, not an automaton.

17. I have been asked to consider the level of ordinary skill in the field that someone would have had at the time the claimed invention was made. In deciding the level of ordinary skill, I considered the following:

• the levels of education and experience of persons working in the field;

• the types of problems encountered in the field; and

• the sophistication of the technology.

18. A person of ordinary skill in the art (“POSITA”) relevant to the ’813 patent at the time of the invention would have had a Bachelor of Science degree in electrical engineering, computer science, or computer engineering, and three years of work or research experience in the fields of secure transactions and encryption; or a Master’s degree in electrical engineering, computer science, or computer engineering, and two years of work or research experience in related fields.

19. I have reviewed the declaration of Dr. Eric Cole, including his opinions regarding the Person of Ordinary Skill in the Art. Ex. 1009, ¶¶ 26-28. My description of the level of ordinary skill in the art is essentially the same as that of Dr. Cole, except that Dr. Cole’s description requires two years of work or research experience (as compared to three years). The opinions set forth in this Declaration would be the same under either my or Dr. Cole’s proposal.

20. I am well-qualified to determine the level of ordinary skill in the art and am personally familiar with the technology of the ’813 patent. I was a person of at least ordinary skill in the art at the time of the priority date of the ’813 patent in 2006. Even where I do not explicitly state that a statement below is based on this timeframe, all of my statements are to be understood from the perspective of a POSITA as of the priority date of the ’813 patent.

B. Legal Principles

21. I am not a lawyer and will not provide any legal opinions. Though I am not a lawyer, I have been advised that certain legal standards are to be applied by technical experts in forming opinions regarding the meaning and validity of patent claims.

1. Obviousness

22. I understand that to obtain a patent, a claimed invention must have, as of the priority date, been nonobvious in view of prior art in the field. I understand that an invention is obvious when the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art.

23. I understand that to prove that prior art, or a combination of prior art, renders a patent obvious, it is necessary to: (1) identify the particular references that, singly or in combination, make the patent obvious; (2) specifically identify which elements of the patent claim appear in each of the asserted references; and (3) explain how the prior art references could have been combined to create the inventions claimed in the asserted claim.

24. I understand that a patent composed of several elements is not proved obvious merely by demonstrating that each of its elements was, independently, known in the prior art, and that obviousness cannot be based on the hindsight combination of components selectively culled from the prior art to fit the parameters of the patented invention.

25. I also understand that a reference may be said to teach away when a person of ordinary skill, upon reading the reference, would be discouraged from following the path set out in the reference, or would be led in a direction divergent from the path that was taken by the applicant. Even if a reference is not found to teach away, I understand its statements regarding preferences are relevant to a finding regarding whether a skilled artisan would be motivated to combine that reference with another reference.

2. My Understanding of Claim Construction Law

26. I understand that in this inter partes review the claims must be given their broadest reasonable interpretation, but that interpretation must be consistent with the patent specification. In this Declaration, I have used the broadest reasonable interpretation (“BRI”) standard when interpreting the claim terms.

27. I understand Petitioner identifies three terms that purportedly require construction. Pet., 5-7. The constructions of these terms do not impact my opinions in this declaration.

III. OVERVIEW OF THE ’813 PATENT

A. The ’813 Patent Specification

28. I have reviewed the ’813 patent. The ’813 patent provides improved systems, devices and methods that allow users to securely authenticate their identity when using a “point-of-sale” (“POS”) device; e.g., when making a retail credit card transaction. Ex. 1001 (’813 patent), Fig. 31, 43:4-51:55. When used in conjunction with the patent’s Universal Secure Registry (“USR”), the claimed Electronic ID Device can both securely identify the user, and separately authenticate and approve the user’s financial transaction requests made through a POS device. Id., 43:4-15, Fig. 31. The USR (USR 10 in Fig. 1, USR 356 in Fig. 31) includes a secure database that stores account (e.g., credit card) information for a plurality of users. Id., 44:39-53.

29. The ’813 specification identifies a number of disadvantages of prior art approaches to providing secure access. For example, a prior art authorization system may control access to computer networks using password-protected accounts, but such a system is susceptible to tampering and difficult to maintain. Id., at 1:64-2:15. Or, hand-held computer devices may be used to verify identity, but security could be compromised if a device ends up in the wrong hands. Id., at 2:16-43.

30. To prevent unauthorized use of the Electronic ID Device, a user must first authenticate themselves to the device to activate it for a financial transaction. The ’813 patent describes multiple ways to do this, including using a biometric input (e.g., fingerprint) and/or secret information (e.g., a PIN). Id., 45:55-46:45, 50:1-22, 51:7-26. Once activated, the Electronic ID Device allows a user to select an account for a financial transaction, and generates encrypted authentication information that is sent via the POS device to the USR for authentication and approval of the requested financial transaction. Id., 46:22-36. This encrypted authentication information is not the user’s credit card information (which could be intercepted and misused). Instead, the Electronic ID Device first generates a non-predictable value (e.g., a random number) using, for example, the user’s biometric information and/or a seed (Id., 33:64-34:61, 46:46-67), and then generates single-use authentication information using the non-predictable value, information associated with the biometric data, and the secret information. Id., 46:14-36, 50:56-65. This encrypted authentication information is transmitted to the secure registry, where it is used to determine transaction approval. Id., 11:36-45, 12:19-44, 12:64-13:8, 48:60-49:24, 50:23-32, 51:7-26.
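For illustration only, the general mechanism described above can be sketched as follows. This sketch is my own and is not code from the ’813 patent; the function and variable names are hypothetical, and the particular primitives (a random nonce and an HMAC) are stand-ins for the patent’s more general description of combining a non-predictable value, biometric-derived information, and secret information into single-use authentication information.

```python
# Illustrative sketch (mine, not the '813 patent's implementation): generating
# single-use authentication information from (1) a non-predictable value,
# (2) information associated with a biometric input, and (3) a secret (a PIN).
import hashlib
import hmac
import secrets

def generate_authentication_info(biometric_template: bytes, pin: str) -> dict:
    # (1) A non-predictable value: a fresh random number for each transaction.
    non_predictable = secrets.token_bytes(16)
    # (2) Information associated with at least a portion of the biometric input:
    # here a digest of the template, rather than the raw biometric itself.
    biometric_info = hashlib.sha256(biometric_template).digest()
    # (3) Combine all three inputs into single-use authentication information.
    tag = hmac.new(biometric_info + pin.encode(),
                   non_predictable, hashlib.sha256).hexdigest()
    # Note: the account (credit card) number itself is never transmitted.
    return {"nonce": non_predictable.hex(), "auth": tag}

msg = generate_authentication_info(b"fingerprint-template", "1234")
```

Because a fresh non-predictable value is used each time, the resulting authentication information is different for every transaction, so intercepting one message does not allow replay, consistent with the security rationale described above.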

B. The ’813 Patent Claims

31. The ’813 patent includes 26 claims, of which claims 1, 16, and 24 are independent. All of the claims relate to communicating authentication information from an electronic ID device. I have reviewed the claims in detail.

IV. OVERVIEW OF THE ASSERTED PRIOR ART

A. Maes (Exhibit 1003)

32. Maes seeks to reduce the number of financial cards a person must carry around. Maes, 1:50-2:20, 2:50-67. Maes states that the “object of [its] present invention” is to provide a PDA that is “compatible with the current infrastructure (i.e., immediately employed without having to change the existing infrastructure)” and in which the user can store all their credit card, ATM card and/or debit card information. Maes, 2:23-49, 7:61-7:19. When the user needs to conduct a transaction, the PDA accesses the stored information for a selected card, and can write it to a smartcard (“Universal Card”) that the user can swipe across a point of sales terminal. Id., 4:1-11, 2:23-30.

33. Maes operates in two modes, a local mode and a client/server mode. The operation of the client/server mode is shown in Figure 4 of Maes, and is used to obtain a digital certificate.

34. After the PDA obtains the digital certificate, it can be used in “local mode” to initiate financial transactions. Id., 3:52-67, Figs. 5-6. Where the point of sales terminal supports electronic data transfer (e.g., wireless transmission from the PDA to the terminal or swiping the Universal Card across the terminal), the PDA is operated as shown in Figure 5. Id., 3:53-67, 12:5-29, Fig. 5.

35. But Maes also teaches that an object of its invention is to “provide[] biometric security for transactions that do not involve electronic data transfer,” for example “transactions that are performed remotely over the telephone,” and that an “authorization number” is used to accomplish this objective. Id., 12:30-39, 6:50-55, 2:42-48. As shown in Figure 6, in situations where electronic data transfer is not supported, the PDA verifies that the user is authorized to conduct the transaction using the user’s biometric and/or PIN and checking the digital certificate, and then displays on the screen of the PDA the selected card information and the “authorization number” that can be “verbally” communicated to the merchant. Id., 12:30-13:5.

B. Pare (Exhibit 1004)

36. The goal of Pare is to eliminate “tokens,” a term Pare uses to mean “portable man made memory devices” or “smart cards” that are used to conduct financial transactions. Pare, 1:12-3:60, 5:5-8, 6:55-7:3. Pare teaches away from using a token to verify a user’s identity because doing so requires the biometric to be stored on the token, and is thus subject to tampering and fraud by a malicious user. Id., 2:21-53. Pare also teaches away from using the token as a central repository for the buyer’s financial information because that simply creates a “monster” token that can “financially incapacitate” the buyer if it is lost or stolen. Id., 3:23-33. Accordingly, Pare teaches that the “objective” and “essence of [its] invention” is to conduct transactions “without the use of any tokens.” Id., 9:12-28, 6:55-7:3, 7:57-60.

37. The system configuration of Pare is shown in Figure 3. A point of sales terminal 2 includes an integrated “Biometric Input Device 12” (“BIA”) with a fingerprint sensor 13 and PIN pad 14. Id., 11:23-29, 10:44-49. The sales terminal 2 also communicates with a data processing center (“DPC”) 1 through modem 18.

38. Pare uses a specific transaction protocol. To form the transaction, the seller offers a proposal to the buyer, and the buyer accepts by inputting his or her PIN and a biometric identifier using the BIA. Id., 4:34-42. The biometric and PIN are then encrypted and included in the “commercial transaction message” shown in Figure 5, along with various other information needed to decrypt and process the commercial transaction message. Pare, Abstract, Fig. 5.

39. The sales terminal sends the commercial transaction message to the DPC, which decrypts the biometric and PIN information to verify the buyer’s identity and determine if the transaction should be authorized. Id., Fig. 11, Fig. 12.
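For illustration, the message flow described in the two paragraphs above can be sketched roughly as follows. This is my own simplified illustration, not Pare’s actual message format or cipher; the field names, the shared key, and the toy XOR cipher are hypothetical stand-ins. The point it shows is structural: the biometric and PIN are encrypted at the terminal and verified only at the DPC.

```python
# Illustrative sketch (mine, not Pare's protocol): the rough shape of a
# "commercial transaction message" whose biometric and PIN fields are
# encrypted by the BIA and verified only at the data processing center (DPC).
from dataclasses import dataclass
import hashlib

@dataclass
class CommercialTransactionMessage:
    encrypted_bio_pin: bytes   # biometric + PIN, encrypted by the BIA
    amount: str
    merchant_code: str

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only; Pare does not specify XOR.
    keystream = hashlib.sha256(key).digest()
    return bytes(b ^ keystream[i % len(keystream)] for i, b in enumerate(data))

# Terminal side: the buyer "accepts" by entering a PIN and biometric at the BIA.
shared_key = b"bia-dpc-shared-key"          # hypothetical key shared with DPC
msg = CommercialTransactionMessage(
    encrypted_bio_pin=xor_encrypt(b"fingerprint|1234", shared_key),
    amount="19.99",
    merchant_code="M-001",
)

# DPC side: decrypt and check the biometric/PIN against enrolled records.
bio, pin = xor_encrypt(msg.encrypted_bio_pin, shared_key).split(b"|")
authorized = (bio == b"fingerprint" and pin == b"1234")
```

Note that in this flow the buyer’s identity is established only server-side at the DPC, which is the feature contrasted with Maes’s local verification later in this declaration.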

C. Labrou (Exhibit 1005)

40. Labrou describes a method of performing a transaction between two agreement parties: customer 102 and merchant transaction server 104. Ex. 1005, Labrou, ¶ 229; Fig. 1. A secure transaction server 106 verifies both parties’ identification before processing the transaction. Id.

41. To conduct a transaction, the merchant and consumer each register with the secure transaction server and obtain a “Private Identification Entry” or “PIE.” Throughout its disclosure, Labrou teaches that the PIE can be a PIN. Id., ¶¶ 253, 256, 259; Figs. 29-32. Labrou also includes a disclosure that, instead of a PIN, the PIE may be a string from a biometric input, although it does not explain how this would work.

42. With reference to Figure 52, in a pre-authorization phase the consumer’s device connects with the merchant’s transaction server to display on the consumer device information about the transaction. If the transaction is acceptable to the consumer, he or she enters his or her PIE (in this case a PIN) and selects an account. In order to process the transaction, the consumer and merchant each generate a message containing their view of the transaction. Id., ¶ 236. With reference to Figure 29, the consumer generates an encryption key by using the PIN that was entered in the pre-authorization phase and a Random Sequence Number (“RSN”) generated from a pseudorandom number. The encryption key is then used to encrypt the consumer’s view of the transaction information, along with the user ID (UIDc) and the device ID of the merchant that was provided during the pre-authorization phase (DIDm). The merchant performs a similar process, creating an encryption key using its PIE and RSN, and then encrypting its view of the transaction information along with the DIDc and the UIDm. The consumer’s and merchant’s views of the transaction are then sent to the secure server, which compares the messages to determine if the transaction should be authorized.
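The key-generation step described above can be sketched as follows. This is my own illustration under stated assumptions, not Labrou’s actual algorithm: the specific hash, the toy stream cipher, and the synchronization of the RSN through a shared seed are hypothetical simplifications of Labrou’s PIE-plus-RSN key derivation.

```python
# Illustrative sketch (mine, not Labrou's exact algorithm): deriving a
# per-message encryption key from the user's PIE (here, a PIN) and a Random
# Sequence Number (RSN), then encrypting that party's view of the transaction.
import hashlib
import random

def derive_key(pie: str, seed: int) -> bytes:
    # RSN produced from a pseudorandom source; the shared seed is a stand-in
    # for however the parties' RSN state is synchronized.
    rsn = random.Random(seed).getrandbits(128)
    return hashlib.sha256(pie.encode() + rsn.to_bytes(16, "big")).digest()

def encrypt(view: bytes, key: bytes) -> bytes:
    # Toy stream cipher for illustration; Labrou does not specify the cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(view))

seed = 42                                  # hypothetical shared RSN state
consumer_key = derive_key("1234", seed)
ciphertext = encrypt(b"UIDc|DIDm|transaction-view", consumer_key)

# A verifier that knows the PIE and RSN state can re-derive the same key and
# recover the consumer's view of the transaction for comparison.
recovered = encrypt(ciphertext, derive_key("1234", seed))
```

The merchant side would perform the symmetric process with its own PIE and RSN, and the secure server would decrypt both messages and compare the two views, as described above.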

D. Burger (Ex. 1006)

43. Burger teaches a point-of-transaction device that can access account information by comparing a scanned fingerprint to a stored fingerprint. Ex. 1006, 30:20-29. Once authenticated, the device communicates transaction information to a point-of-sale device. Id., 74:1-10.

E. Pizarro (Ex. 1007)

44. In Pizarro, a user loads their various financial accounts into a device, such as a cell phone, that wirelessly completes a transaction with a point-of-sale device. Ex. 1007, Abstract, 9:14-24. Before a transaction is initiated, a user sends its address to a financial institution. Id., 7:54-67.

45. To initiate a transaction, the customer selects an account from those stored on the user device. The account information is transmitted to the financial institution, which verifies the proximity of the user device to the address on file. Id., 2:64-3:8; 9:25-39. In one embodiment, the user device verifies the buyer’s identity locally using a biometric (id., 9:25-39, Fig. 4A), and in a separate embodiment the biometric information is not verified on the user device and is instead sent to the financial institution for verification. Id., Fig. 4B.

VI. GROUND 1 – MAES AND PARE DO NOT RENDER ANY OF THE CHALLENGED CLAIMS OBVIOUS

46. I understand that Ground 1, as set forth in the Institution Decision, is the combination of Maes in view of Pare for claims 1, 2, 3, 5, 11, 13-17, 20, and 22-26. Decision, 30. In my opinion, Ground 1 does not render any claim obvious for the reasons below.

A. A POSITA Would Not Combine Maes And Pare To Arrive At The Independent Claims Of The ’813 Patent

47. Each of the three independent claims of the ’813 patent requires an electronic ID device configured to locally authenticate a user based on a biometric input and/or secret information (e.g., a PIN), and then transmit to a secure registry encrypted authentication information that is generated from three types of information: (1) a non-predictable value, (2) information associated with at least a portion of the biometric input, and (3) the secret information. For example, Claim 1 recites:

[1d][i] the processor being programmed to activate the electronic ID device based on successful authentication by the electronic ID device of at least one of the biometric input and the secret information,

[1d][ii] the processor [of an electronic ID device] also being programmed such that once the electronic ID device is activated the processor is configured to generate a non-predictable value and to generate encrypted authentication information from the non-predictable value, information associated with at least a portion of the biometric input, and the secret information, and

[1d][iii] to communicate the encrypted authentication information via the communication interface to the secure registry; and

Ex. 1001, 52:9-23; see also 53:32-48 (Claim 16), 54:27-38 (Claim 24).

48. Maes teaches that for purposes of “provid[ing] biometric security for transactions that do not involve electronic data transfer,” the PDA can display an “authorization number” on the screen after the user enters, and the PDA verifies, a PIN and biometric. Maes, 12:30-54. The user may then “verbally” communicate the number to the merchant so that it can be verified with a central server. Id.

49. I understand that Petitioner admits Maes does not “teach generating a non-predictable value or using the non-predictable value to generate encrypted authentication information associated with the biometric input and secret information.” Pet., 18. I agree.

50. I further understand that Petitioner argues that Maes in view of Pare renders the encrypted authentication information limitations obvious. I further understand that Petitioner proposes replacing Maes’s authorization number with Pare’s “commercial transaction message,” and claims that doing so meets the claimed limitations. Pet., 18-23; Ex. 1009, ¶ 54; see also Decision, 11-13. I disagree. In my opinion, a POSITA would not find Petitioner’s combination obvious for several reasons, which I discuss below.

1. There Is No Motivation To Combine Because Pare Teaches Away From The Claimed Electronic ID Device

51. I understand that a reference may be said to teach away when a person of ordinary skill, upon reading the reference, would be discouraged from following the path set out in the reference, or would be led in a direction divergent from the path that was taken by the applicant. I also understand that if the disclosure criticizes, discredits, or otherwise discourages the solution claimed, then the disclosure teaches away such that a POSITA would not be motivated to combine the references. Finally, I understand that even if a reference is not found to teach away, its statements regarding preferences are relevant to a finding regarding whether a skilled artisan would be motivated to combine that reference with another reference.

52. In my opinion, Pare teaches away from using an electronic ID device (such as the PDA of Maes) that locally authenticates a user and then generates the claimed encrypted authentication information. Pare is titled “Tokenless Biometric Transaction Authorization Method And System” and repeatedly criticizes prior art that employs “tokens,” which Pare uses to refer to “portable man made memory devices” or “smart cards” that are used to conduct financial transactions (like Maes’ PDA and Universal Card). Pare, 1:12-3:60, 5:5-8, 6:55-7:3. Pare explains that to perform biometric verification on a token, the PIN and biometric information need to be stored on the token, and that doing so can result in the biometric being reproducible and subject to tampering if it is, for example, lost or stolen. Id., 2:21-53, 7:46-60. Pare also teaches away from centralizing financial information in the token or adding additional security to the token because trying to “smarten” tokens simply “leads to centralization of function” that “looks good during design, but actual use results in increased vulnerability for consumers,” “heavier and heavier penalties on the consumer for destruction or loss” of the token, and additional costs. Id., 3:23-36.

53. Pare thus states that the “objective” and “essence of [its] invention” is to conduct transactions “without the use of any tokens.” Id., 9:12-28; see also id., Abstract (“A method and system for tokenless authorization of commercial transactions”), 6:55-7:3 (invention “eliminates the need to carry and present any token … eliminates all the inconveniences associated with carrying, safeguarding, and locating tokens.”), 7:57-60 (“it is an object of the invention therefore to provide a commercial transaction system that eliminates the need for a user to possess and present a physical object, such as a token, in order to authorize a transaction.”). And every one of Pare’s claims includes the requirement that the transaction be performed “without the buyer having to use any portable man made memory devices such as smart cards or swipe cards.” Id., 67:52-78:9.

54. Pare also teaches a POSITA that the commercial transaction should be implemented on hardware and software (referred to as the Biometric Input Apparatus, or “BIA”) that is “strictly limited” in its functionality, and that is “integrated” with the sales terminal. Id., 11:1-11, 2:20-38. The PDA taught in Maes is a general purpose device that also includes substantial functionality not related to financial transactions (e.g., calendaring and email), and in my opinion, a POSITA would understand that limiting such a PDA’s interfaces solely to financial functions would be neither practical nor desirable.

55. In my opinion, Petitioner cannot show that a POSITA would be motivated to implement Pare’s commercial transaction message in a token such as the PDA of Maes because this is precisely what Pare teaches a POSITA they should not do. A POSITA would not be motivated to combine Maes and Pare because Pare criticizes, discredits, and discourages conducting transactions with an electronic ID device like that in Maes, and also rejects adding additional security to an electronic ID device because Pare teaches doing so is futile.

2. There Is No Motivation To Combine Maes And Pare Because The Combination Is Redundant

56. In my opinion, a POSITA also would not be motivated to combine Maes and Pare because the combination results in redundant functionality, and a less secure system.

57. Maes verifies a user’s identity once, locally by the PDA, at the start of the transaction, using biometric and PIN information. Maes, 1:10-17 (invention “provide[s] personal verification prior to processing user requested financial transactions”), 2:32-42 (“present invention” provides a “PDA device which utilizes biometric security to provide user verification prior to accessing and writing the selected financial and person information”), 5:60-63. In Maes, the user selects a financial card to conduct the transaction with, and then enters a biometric input and a PIN that are locally verified by the PDA. Id., 3:53-67, 12:40-47. If the biometric and PIN are valid, the transaction may proceed. Id.

58. In Pare, the user’s biometric and PIN information is also validated once in a transaction, but it happens remotely on the server after the transaction is initiated. Pare, Fig. 5 (illustrating fields of the commercial transaction message, including encrypted biometric and PIN information); Fig. 12 (illustrating how the PIN and biometric information are used by the data processing center to authenticate the user). Pare does not teach that the user’s biometric and PIN information should be verified locally, and, in fact, teaches against it. Id., 2:21-53 (teaching against local biometric verification by the device); 7:46-60 (biometric should be stored and verified “at a location that is operationally isolated from the user requesti