UNITED STATES PATENT AND TRADEMARK OFFICE
________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________

APPLE INC.,
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC,
Patent Owner
________________

Case IPR2018-00809
U.S. Patent No. 9,530,137
________________

PATENT OWNER’S EXHIBIT 2010

DECLARATION OF MARKUS JAKOBSSON

IN SUPPORT OF PATENT OWNER’S RESPONSE

USR Exhibit 2010

1. I have been retained on behalf of Universal Secure Registry LLC (“USR” or “Patent Owner”) in connection with the above-captioned inter partes review (IPR). I have been retained to provide my opinions in support of USR’s Patent Owner Response. I am being compensated for my time at the rate of $625 per hour. I have no interest in the outcome of this proceeding.

2. In preparing this declaration, I have reviewed and am familiar with the Petition for Inter Partes Review, IPR2018-00809, U.S. Patent No. 9,530,137 (“the ’137 Patent”) and its file history, and all other materials cited and discussed in the Petition (including the deposition and declaration of Dr. Victor Shoup) or cited and discussed in this Declaration. I understand the Petition proffers two invalidity grounds for the ’137 patent (Ex. 1101) that were instituted by the Board: (1) Claims 1, 2, 6, 7, 9, and 12 are allegedly obvious in view of International Patent Application Publication No. WO 2004/051585 (“Jakobsson”) (Ex. 1113) and U.S. Patent Application Publication No. 2004/0236632 (Ex. 1114) (“Maritzen”); and (2) Claim 5 is allegedly obvious in view of Jakobsson, Maritzen, and U.S. Patent No. 6,453,301 (“Niwa”) (Ex. 1117).

3. The statements made herein are based on my own knowledge and opinion. This Declaration represents only the opinions I have formed to date. I may consider additional documents as they become available or other documents that are necessary to form my opinions. I reserve the right to revise, supplement, or amend my opinions based on new information and on my continuing analysis.

II. QUALIFICATIONS

4. My qualifications can be found in my Curriculum Vitae, which includes my detailed employment background, professional experience, and list of technical publications and patents. Ex. 2002.

5. I am currently the Chief of Security and Data Analytics at Amber Solutions, Inc., a cybersecurity company that develops home and office automation technology. At Amber, my research addresses abuse, including social engineering, malware and privacy intrusions. My work primarily involves identifying risks, developing protocols and user experiences, and evaluating the security of proposed approaches.

6. I received a Master of Science degree in Computer Engineering from the Lund Institute of Technology in Sweden in 1993, a Master of Science degree in Computer Science from the University of California at San Diego in 1994, and a Ph.D. in Computer Science from the University of California at San Diego in 1997, specializing in Cryptography. During and after my Ph.D. studies, I was also a Researcher at the San Diego Supercomputer Center, where I did research on electronic payment schemes, authentication and privacy.

7. From 1997 to 2001, I was a Member of Technical Staff at Bell Labs, where I did research on authentication, privacy, multi-party computation, contract exchange, digital commerce including crypto payments, and fraud detection and prevention. From 2001 to 2004, I was a Principal Research Scientist at RSA Labs, where I worked on predicting future fraud scenarios in commerce and authentication and developed solutions to those problems. During that time I predicted the rise of what later became known as phishing. I was also an Adjunct Associate Professor in the Computer Science department at New York University from 2002 to 2004, where I taught cryptographic protocols.

8. From 2004 to 2016, I held a faculty position at Indiana University at Bloomington, first as an Associate Professor of Computer Science, Associate Professor of Informatics, Associate Professor of Cognitive Science, and Associate Director of the Center for Applied Cybersecurity Research (CACR) from 2004 to 2008; and then as an Adjunct Associate Professor from 2008 to 2016. I was the most senior security researcher at Indiana University, where I built a research group focused on online fraud and countermeasures, resulting in over 50 publications and two books.

9. While a professor at Indiana University, I was also employed by Xerox PARC, PayPal, and Qualcomm to provide thought leadership to their security groups. I was a Principal Scientist at Xerox PARC from 2008 to 2010, a Director and Principal Scientist of Consumer Security at PayPal from 2010 to 2013, a Senior Director at Qualcomm from 2013 to 2015, and Chief Scientist at Agari from 2016 to 2018. Agari is a cybersecurity company that develops and commercializes technology to protect enterprises, their partners and customers from advanced email phishing attacks. At Agari, my research addressed trends in online fraud, especially as related to email, including problems such as Business Email Compromise, Ransomware, and other abuses based on social engineering and identity deception. My work primarily involved identifying trends in fraud and computing before they affected the market, and developing and testing countermeasures, including technological countermeasures, user interaction and education.

10. I have founded or co-founded several successful computer security companies. In 2005 I founded RavenWhite Security, a provider of authentication solutions, and I am currently its Chief Technical Officer. In 2007 I founded Extricatus, one of the first companies to address consumer security education. In 2009 I founded FatSkunk, a provider of mobile malware detection software; I served as Chief Technical Officer of FatSkunk from 2009 to 2013, when FatSkunk was acquired by Qualcomm and I became a Qualcomm employee. In 2013 I founded ZapFraud, a provider of anti-scam technology addressing Business Email Compromise, and I am currently its Chief Technical Officer. In 2014 I founded RightQuestion, a security consulting company.

11. I have additionally served as a member of the fraud advisory board at LifeLock (an identity theft protection company); a member of the technical advisory board at CellFony (a mobile security company); a member of the technical advisory board at PopGiro (a user reputation company); a member of the technical advisory board at MobiSocial dba Omlet (a social networking company); and a member of the technical advisory board at Stealth Security (an anti-fraud company). I have provided anti-fraud consulting to KommuneData (a Danish government entity), J.P. Morgan Chase, PayPal, Boku, and Western Union.

12. I have authored five books and over 100 peer-reviewed publications, and have been a named inventor on over 100 patents and patent applications.

13. My work has included research in the area of electronic payments, applied security, privacy, cryptographic protocols, authentication, malware, social engineering, usability and fraud.

14. I am an inventor of Petitioner’s primary prior art reference. Ex. 1113, WO 2004/051585 A2 (“Jakobsson”).

III. LEGAL UNDERSTANDING

A. The Person of Ordinary Skill in the Art

15. I understand that a person of ordinary skill in the relevant art (also referred to herein as “POSITA”) is presumed to be aware of all pertinent art, thinks along conventional wisdom in the art, and is a person of ordinary creativity—not an automaton.

16. I have been asked to consider the level of ordinary skill in the field that someone would have had at the time the claimed invention was made. In deciding the level of ordinary skill, I considered the following:

• the levels of education and experience of persons working in the field;

• the types of problems encountered in the field; and

• the sophistication of the technology.

17. A person of ordinary skill in the art (“POSITA”) relevant to the ’137 Patent at the time of the invention would have a Bachelor of Science degree in electrical engineering and/or computer science, and three years of work or research experience in the fields of secure transactions and encryption, or a Master’s degree in electrical engineering and/or computer science and two years of work or research experience in related fields.

18. I have reviewed the declaration of Dr. Victor Shoup, including his opinions regarding the Person of Ordinary Skill in the Art. Ex. 1002 at ¶¶ 35-37. My description of the level of ordinary skill in the art is essentially the same as that of Dr. Shoup, except that Dr. Shoup’s description requires two years of work or research experience (as compared to three years). The opinions set forth in this Declaration would be the same under either my or Dr. Shoup’s proposal.

19. I am well-qualified to determine the level of ordinary skill in the art and am personally familiar with the technology of the ’137 Patent. I was a person of at least ordinary skill in the art at the time of the priority date of the ’137 Patent in 2006. Even where I do not explicitly state that my statements below are based on this timeframe, all of my statements are to be understood as reflecting how a POSITA would have understood matters as of the priority date of the ’137 Patent.

B. Legal Principles

20. I am not a lawyer and will not provide any legal opinions. Though I am not a lawyer, I have been advised that certain legal standards are to be applied by technical experts in forming opinions regarding the meaning and validity of patent claims.

(a) My Understanding of Obviousness Law

21. I understand that to obtain a patent, a claimed invention must have, as of the priority date, been nonobvious in view of the prior art in the field. I understand that an invention is obvious when the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art.

22. I understand that to prove that prior art, or a combination of prior art, renders a patent obvious, it is necessary to: (1) identify the particular references that singly, or in combination, make the patent obvious; (2) specifically identify which elements of the patent claim appear in each of the asserted references; and (3) explain how the prior art references could have been combined to create the inventions claimed in the asserted claim.

23. I understand that a patent composed of several elements is not proved obvious merely by demonstrating that each of its elements was, independently, known in the prior art, and that obviousness cannot be based on the hindsight combination of components selectively culled from the prior art to fit the parameters of the patented invention.

24. I also understand that a reference may be said to teach away when a person of ordinary skill, upon reading the reference, would be discouraged from following the path set out in the reference, or would be led in a direction divergent from the path that was taken by the applicant. Even if a reference is not found to teach away, I understand its statements regarding preferences are relevant to a finding regarding whether a skilled artisan would be motivated to combine that reference with another reference.

(b) My Understanding of Claim Construction Law

25. I understand that in this inter partes review the claims must be given their broadest reasonable interpretation, but that interpretation must be consistent with the patent specification. In this Declaration, I have used the broadest reasonable interpretation (“BRI”) standard when interpreting the claim terms.

IV. OVERVIEW OF THE ’137 PATENT

A. The ’137 Patent Specification

26. I have reviewed the ’137 Patent. Ex. 1001. The ’137 Patent relates to a unique and highly secure distributed transaction approval system. Figure 21 depicts one possible embodiment of such a transaction approval system:

[Figure 21 of the ’137 Patent]

27. The claimed invention provides improved transaction security by providing a system where users locally authenticate themselves at a first device using multi-factor authentication (e.g., a PIN code and a biometric, such as a fingerprint) before the first device generates a transaction approval request that it transmits to a remote second device. See, e.g., id. at 29:21-44; Fig. 21. That transaction approval request from the first device is improved as well. See, e.g., id. at 16:49-17:54; Figs. 6, 21. The request signal(s) include at least three specific types of data: first authentication information, an indicator of the device’s biometric authentication of the user, and a code that is a time-varying value. See, e.g., id. at 14:26-53, 32:31-33:19; Figs. 21, 23. The request signal(s) are sent to a second device for processing authorization of the transaction (e.g., by a server). The second device may return an enablement signal based on the request signal(s), as well as second authentication information of the user available at the second device. See, e.g., id. at 33:20-34:6; Figs. 21, 24-25.

28. The claimed invention solves a technical problem specifically encountered in distributed electronic transaction approval systems. One important concern is ensuring that the person remotely initiating a transaction is an authorized user, and not someone fraudulently using a counterfeit or stolen device (e.g., access card, credit card, phone, etc.). The claimed invention addresses this concern by locally authenticating the user of the first device through multifactor authentication (e.g., a secret PIN and fingerprint), and by generating and sending the remote second device an indication of biometric authentication and other data that is difficult to counterfeit. See, e.g., id. at 2:50-52, 13:62-14:7, 22:16-20. Another critical concern in a distributed electronic transaction approval system is preventing the interception of sensitive information that could be fraudulently used in future transactions. The claimed invention addresses this concern by generating and sending authentication information (rather than requiring users to send their social security number, password, credit card number, or other sensitive information) from the local first device to the remote second device, and by incorporating a time varying value that helps prevent a replay attack. See, e.g., id. at 4:23-31, 15:43-50, 18:27-34, 19:45-52.

29. Hence, the ’137 Patent provides an improved secure distributed transaction approval system. A user needs more than just possession of the local device to conduct transactions, as the claimed system locally authenticates both secret information and biometric information from the user before it engages in a transaction, protecting against fraudulent transactions using a stolen device. Furthermore, the device in the claimed system does not publish or send the user’s secret information or other sensitive information over a network, where it might be stolen and misused. Instead, the device generates signal(s) including authentication information, indication of the device’s biometric authentication of the user, and a time varying value, and sends those to the second device for transaction approval. And, inclusion of the time varying value protects against interception and resubmission of signal(s) in a replay attack.

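For illustration only, the replay-protection role of the time varying value described above can be sketched in the following toy model. The HMAC construction, the names, and the time-step bookkeeping are my own illustrative assumptions; they are not details drawn from the ’137 Patent:

```python
import hashlib
import hmac

SECRET = b"shared-device-secret"  # illustrative shared secret

def one_time_code(time_step: int) -> str:
    # Time varying value: changes every time step, so an intercepted
    # signal is only useful during the step in which it was generated.
    return hmac.new(SECRET, str(time_step).encode(), hashlib.sha256).hexdigest()

def make_signal(auth_info: str, biometric_ok: bool, time_step: int) -> dict:
    # The signal carries all three types of information discussed above.
    return {"auth_info": auth_info,
            "biometric_indicator": biometric_ok,
            "time_varying_value": one_time_code(time_step)}

def approve(signal: dict, current_step: int, seen_codes: set) -> bool:
    expected = one_time_code(current_step)
    code = signal["time_varying_value"]
    if code != expected or code in seen_codes:  # stale or replayed: reject
        return False
    seen_codes.add(code)
    return bool(signal["biometric_indicator"])

seen: set = set()
signal = make_signal("first-auth-info", True, time_step=100)
print(approve(signal, 100, seen))  # True: fresh signal is accepted
print(approve(signal, 100, seen))  # False: replayed signal is rejected
print(approve(signal, 101, seen))  # False: code is stale one step later
```

Because the code is bound to a time step and is rejected once seen, an intercepted signal cannot be resubmitted successfully in a later transaction.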
B. The ’137 Patent Claims

30. The ’137 Patent includes 12 claims, of which claims 1 and 12 are independent. The two independent claims of the ’137 Patent are reproduced below:

1. A system for authenticating a user for enabling a transaction, the system comprising:

a first device including:

a first processor, the first processor programmed to authenticate a user of the first device based on secret information and to retrieve or receive first biometric information of the user of the first device;

a first wireless transceiver coupled to the first processor and programmed to transmit a first wireless signal including first authentication information of the user of the first device; and

a biometric sensor configured to capture the first biometric information of the user;

wherein the first processor is programmed to generate one or more signals including the first authentication information, an indicator of biometric authentication, and a time varying value in response to valid authentication of the first biometric information, and to provide the one or more signals including the first authentication information for transmitting to a second device; and

wherein the first processor is further configured to receive an enablement signal from the second device; and

the system further including the second device that is configured to provide the enablement signal indicating that the second device approved the transaction based on use of the one or more signals;

wherein the second device includes a second processor that is configured to provide the enablement signal based on the indication of biometric authentication of the user of the first device, at least a portion of the first authentication information, and second authentication information of the user of the first device to enable and complete processing of the transaction.

Ex. 1001 at 45:27-61.

12. A system for authenticating a user for enabling a transaction, the system comprising:

a first device including:

a biometric sensor configured to capture a first biometric information of the user;

a first processor programmed to: 1) authenticate a user of the first device based on secret information, 2) retrieve or receive first biometric information of the user of the first device, 3) authenticate the user of the first device based on the first biometric, and 4) generate one or more signals including first authentication information, an indicator of biometric authentication of the user of the first device, and a time varying value; and

a first wireless transceiver coupled to the first processor and programmed to wirelessly transmit the one or more signals to a second device for processing;

wherein generating the one or more signals occurs responsive to valid authentication of the first biometric information; and wherein the first processor is further configured to receive an enablement signal from the second device; and

wherein the first processor is further programmed to receive an enablement signal indicating an approved transaction from the second device, wherein the enablement signal is provided from the second device based on acceptance of the indicator of biometric authentication and use of the first authentication information and use of second authentication information to enable the transaction.

Id. at 46:55-47:14.

V. OVERVIEW OF THE ASSERTED PRIOR ART

A. Jakobsson

31. My reference, Ex. 1113 (“Jakobsson”), discloses an event detecting and alert system for personal identity authentication systems. Specifically, “[t]he invention addresses the[] shortcomings [of the prior art] by including an indication of the occurrence of an event directly into the efficient computation of an identity authentication code, where the verifier may efficiently verify the authentication code and identify the signaling of an event state.” Ex. 1113, [0010]. See id., [0011] (“the previous approaches do not have the flexibility to communicate event information in, or as part of, an authentication code, in the present approach, an authentication code is generated in a manner that communicates to the verifier information about the occurrence of one or more reportable events.”). Jakobsson expressly discloses that “[e]xample reportable events include: device tampering; an event external to the device detected by the device; an environmental event, such as temperature exceeding or falling below a threshold; static discharge; high or low battery power; geographic presence at a particular location; confidence level in a biometric reading; and so on.” Ex. 1113, [0011].

32. Jakobsson’s user device (such as a credit card device, key fob, USB dongle or cellular telephone, id., [0041]) computes an authentication code based upon various values, including a dynamic variable that changes over time, an event state, and a device secret, along with user data such as a PIN or social security number. Id., [0043], [0058], [0060-0061]. The code is then transmitted to a verifier that retrieves from its records the data necessary for verification, such as the PIN associated with the user, and then verifies the received information. Id., [0049-0050].

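For illustration only, the computation and verification described above can be sketched as follows. The SHA-256 construction, the particular candidate event states, and all names are my own illustrative assumptions, not details taken from Ex. 1113:

```python
import hashlib
from typing import Optional

def authentication_code(device_secret: bytes, dynamic_value: int,
                        event_state: str, pin: str) -> str:
    # Combines the kinds of inputs summarized above: a device secret, a
    # dynamic variable that changes over time, an event state, and user
    # data such as a PIN. SHA-256 and the "|" framing are illustrative.
    material = b"|".join([device_secret, str(dynamic_value).encode(),
                          event_state.encode(), pin.encode()])
    return hashlib.sha256(material).hexdigest()

def verify(received: str, device_secret: bytes, dynamic_value: int,
           pin: str) -> Optional[str]:
    # Verifier side: retrieve the stored secret and PIN for the user and
    # recompute the code for each candidate event state, thereby both
    # authenticating the user and learning which event was signaled.
    for state in ("normal", "tamper", "low-battery"):  # illustrative states
        if authentication_code(device_secret, dynamic_value, state, pin) == received:
            return state
    return None

secret = b"device-secret"
code = authentication_code(secret, 57, "tamper", "1234")
print(verify(code, secret, 57, "1234"))  # prints "tamper"
print(verify(code, secret, 58, "1234"))  # prints "None": dynamic value changed
```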
B. Maritzen

33. Maritzen states that “[a] situation that still requires use of cash is in the collection of fees at vehicle-accessed payment gateways such as toll booths, vehicular kiosks, smog-certification stations, and the like.” Ex. 1114 at [0003]. Maritzen explains that “[t]he collection of fees at these gateways is time consuming and subject to fraud.” Id.

34. Maritzen discloses a system and method for electronic payment of fees using a personal transaction device (PTD) at vehicle-accessed, payment-gateway terminals (VAPGT). Ex. 1114 at Abstract, [0007]-[0009]. In the system of Maritzen, a PTD is sensed by a VAPGT and the VAPGT then transmits a payment request to the PTD. Id. at [0029]-[0030]. The PTD can then be accessed using biometric control (e.g., a fingerprint), which in the preferred embodiment is inputted into a separate “privacy card,” and a transaction key is then generated. Id. at [0029]-[0030], [0088]-[0089]. Maritzen teaches two embodiments for its transaction key. Id. at [0089]. In one embodiment, the transaction key includes only one type of information, namely it includes “only [a] biometric key.” Id. In a second embodiment, the transaction key includes two types of information, namely a “PTD identifier” that “identifies the particular PTD being used” and a “biometric key.” Id. The PTD transmits the transaction key to a clearing house for verifying that the vehicle-access payment should be authorized. Id. at [0029]-[0030].

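For illustration only, the two transaction-key embodiments described above can be represented by a simple data structure; the field and variable names are my own illustrative assumptions, not terminology from Ex. 1114:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TransactionKey:
    biometric_key: bytes                   # present in both embodiments
    ptd_identifier: Optional[str] = None   # present only in the second embodiment

# First embodiment: the transaction key includes only a biometric key.
first = TransactionKey(biometric_key=b"fingerprint-derived-key")

# Second embodiment: the key adds a PTD identifier naming the particular PTD.
second = TransactionKey(biometric_key=b"fingerprint-derived-key",
                        ptd_identifier="PTD-0001")

print(first.ptd_identifier)   # prints "None"
print(second.ptd_identifier)  # prints "PTD-0001"
```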
C. Schutzer

35. I understand that Schutzer is relied upon only for Ground 3 of the Petition, which asserts that dependent Claims 8 and 11 of the ’137 Patent are invalid based on Jakobsson in view of Maritzen and Schutzer. Pet. at 63-72. I further understand that Claims 8 and 11 have been cancelled by Patent Owner. Accordingly, I understand that Ground 3 is now moot.

D. Niwa

36. I understand that Petitioner relies on Niwa only for Ground 2 of the Petition, which asserts that dependent Claim 5 of the ’137 Patent is invalid based on Jakobsson in view of Maritzen and Niwa. Pet. at 53-63. Niwa discloses a fingerprint authentication device. Ex. 1117 at 2:19-44. The fingerprint authentication device allows a user to conduct a commercial transaction by inputting a valid fingerprint. Ex. 1117 at 2:19-44.

VI. CLAIM CONSTRUCTION

37. I understand that Petitioner has identified three terms that purportedly require construction. Pet. at 14-20. The constructions of these terms do not impact my opinion in this declaration.

38. Petitioner does not provide an express construction of “the one or more signals” in its Petition, but in applying the prior art Petitioner interprets the claim as requiring that one (but not all) of the following three types of information be included in the one or more signals: (1) first authentication information, (2) an indicator of biometric authentication of the user of the first device, and (3) a time varying value. See Pet. at 41-42, 43-45, 62 (limitations 1[f], 1[h], and 12[f]). In my opinion, Petitioner’s interpretation of “the one or more signals” is contrary to the plain language of the claims and the specification of the ’137 patent.

39. As explained below, in my opinion, one skilled in the art would understand that “the one or more signals” should be construed to mean “one or more signals that include all of the following three types of information: (1) first authentication information, (2) an indicator of biometric authentication of the user of the first device, and (3) a time varying value.”

C. “The One Or More Signals” (All Challenged Claims)

40. I understand that the ’137 Patent includes two independent claims: Claims 1 and 12. Both independent claims recite the term “the one or more signals.” It is my opinion that, consistent with the context of the claims in which they appear, one skilled in the art would understand that “the one or more signals” should be construed to mean “one or more signals that include all of the following three types of information: (1) first authentication information, (2) an indicator of biometric authentication of the user of the first device, and (3) a time varying value.”

41. My opinion is supported by the plain language of the claims. In all the Challenged Claims, this definition of the term “the one or more signals” is provided within the following limitation:

wherein the first processor is programmed to generate one or more signals including first authentication information, an indicator of biometric authentication of the user of the first device, and a time varying value in response to valid authentication of the first biometric information . . .

Ex. 1001 at 45:40-44 (Claim 1); 46:60-67 (Claim 12). One skilled in the art would understand that use of the conjunctive “and” in the list of included constituents means that all three of these constituents must be included within “the one or more signals.”

42. My opinion is also supported by subsequent limitations in the claims, which also confirm that “the one or more signals” must include all three types of information. For example, the claims recite the step of transmitting “the one or more signals” to a second device for processing, and require that the second device enable a transaction using “the first authentication information” and “the indication of biometric authentication,” list elements referenced in the “one or more signals” limitation. Ex. 1001 at 45:44-61 (Claim 1), 47:1-14 (Claim 12). A person of skill in the art would understand the term “the one or more signals” includes all three listed types of information so that the recited transmission of “the one or more signals” to the second device provides the second device with the different types of information the second device uses to approve the transaction.

43. My opinion is also consistent with the specification, where all three of the recited constituents are included in “the one or more signals.” For example, Figure 23 of the ’137 Patent illustrates an embodiment of the various fields included in the signals transmitted between the first wireless device and the second wireless device:

[Figure 23 of the ’137 Patent]

Ex. 1001 at Fig. 23, 32:31-34. The signals shown in Figure 23 include examples of the three types of information recited in Claims 1 and 12, including a first authentication information (e.g., “public ID code field 304,” “digital signature field 306 containing a digital signature of the first user” and/or “other ID data field 314”), a time varying value (e.g., “one-time varying code field 308 that includes a random code”), and an indicator of biometric authentication (e.g., “biometric data field 312”).

VII. JAKOBSSON IN VIEW OF MARITZEN DOES NOT INVALIDATE ANY CLAIM OF THE ’137 PATENT

44. I understand that the ’137 Patent includes two independent claims: Claims 1 and 12. In my opinion, Petitioner has not shown that any of the claims of the ’137 Patent are invalid for at least three reasons.

45. First, limitations 1[f], 1[h], and 12[f] require that a first device transmit “the one or more signals” to a second device for processing. Petitioner contends that Jakobsson satisfies these limitations by transmitting and processing an “authentication code.” But, Claims 1 and 12 require (and the ’137 Patent makes clear) that three separate, and distinct, types of information must be transmitted and processed in “the one or more signals”: (1) first authentication information, (2) an indicator of biometric authentication of the user of the first device, and (3) a time varying value.

46. Second, Petitioner has failed to demonstrate that Jakobsson discloses limitations 1[i] and 12[i] because the Petition erroneously points to the same item for both an “indicator of biometric authentication” and “first authentication information.” Even assuming that such double counting is proper, the cited element cannot be an “indicator of biometric authentication” because it does not indicate that biometric authentication has occurred.

47. Third, all three of Petitioner’s proposed grounds rely upon the combination of Jakobsson with Maritzen. See, e.g., Pet. at iii (“Ground 1: Claims 1, 2, 6, 7, 9, 10, and 12 are Obvious Over Jakobsson in View of Maritzen”; “Ground 2: Claim 5 is Obvious over Jakobsson in View of Maritzen and Niwa”; “Ground 3: Claim 8 and 11 are Obvious over Jakobsson in View of Maritzen and Schutzer”). Petitioner’s combination of these two references relies upon hindsight bias, cherry-picking components from the prior art in a failed attempt to match them with parameters of the patented invention. A POSITA would not have been motivated to combine the references in the manner Petitioner proposes.

A. Petitioner Fails To Show Any Disclosure Of Transmitting And Processing “The One Or More Signals” (Limitations 1[f], 1[h], And 12[f])

48. Both independent claims of the ’137 Patent include at least one limitation requiring transmitting and processing “the one or more signals”:

• “provide the one or more signals including the first authentication information for transmitting to a second device” (limitation 1[f]);

• “the system further including the second device that is configured to provide the enablement signal indicating that the second device approved the transaction based on use of the one or more signals” (limitation 1[h]); and

• “a first wireless transceiver coupled to the first processor and programmed to wirelessly transmit the one or more signals to a second device for processing” (limitation 12[f]).

Ex. 1001 at 45:44-47, 45:51-54, 47:1-4.

49. As used in the ’137 Patent, “the one or more signals” means “one or more signals that include all of the following three types of information: (1) first authentication information, (2) an indicator of biometric authentication of the user of the first device, and (3) a time varying value.” In other words, limitations 1[f], 1[h], and 12[f] (the “Transmitting and Processing Limitations”) require that the first device transmit all three of these separate, and distinct, types of information to a second device for processing.

50. I understand Petitioner contends Jakobsson discloses the “one or more signals” limitation because it teaches transmitting an “authentication code.” See Pet. at 34 (“the processor of user authentication device 120 [first processor] is