UNITED STATES PATENT AND TRADEMARK OFFICE
________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________

APPLE INC.,
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC,
Patent Owner
________________

Case IPR2018-00810
U.S. Patent No. 9,100,826
________________

PATENT OWNER’S EXHIBIT 2015
DECLARATION OF MARKUS JAKOBSSON
IN SUPPORT OF PATENT OWNER’S REPLY TO OPPOSITION OF
CONDITIONAL MOTION TO AMEND

USR Exhibit 2015
1. I have been retained on behalf of Universal Secure Registry LLC (“Patent Owner”) in connection with the above-captioned inter partes review (IPR). I have been retained to provide my opinions in support of USR’s Reply to Opposition of Conditional Motion to Amend. I am being compensated for my time at the rate of $625 per hour. I have no interest in the outcome of this proceeding.

2. In preparing this declaration, I have reviewed and am familiar with the Petition for IPR2018-00810, U.S. Patent No. 9,100,826 (hereinafter “’826 patent”), and its file history, and all other materials cited and discussed in the Petition (including all prior art references cited therein) and all other materials cited and discussed in this Declaration, including the Conditional Motion to Amend, Paper 19 (“Motion”), and Petitioner’s Opposition to the Conditional Motion to Amend, Paper 25 (“Op.”).

3. The statements made herein are based on my own knowledge and opinion. This Declaration represents only the opinions I have formed to date. I may consider additional documents as they become available or other documents that are necessary to form my opinions. I reserve the right to revise, supplement, or amend my opinions based on new information and on my continuing analysis.
I. QUALIFICATIONS

4. My qualifications can be found in my Curriculum Vitae, which includes my detailed employment background, professional experience, and list of technical publications and patents. Ex. 2004.

5. I am currently the Chief of Security and Data Analytics at Amber Solutions, Inc., a cybersecurity company that develops home and office automation technology. At Amber, my research studies and addresses abuse, including social engineering, malware and privacy intrusions. My work primarily involves identifying risks, developing protocols and user experiences, and evaluating the security of proposed approaches.

6. I received a Master of Science degree in Computer Engineering from the Lund Institute of Technology in Sweden in 1993, a Master of Science degree in Computer Science from the University of California at San Diego in 1994, and a Ph.D. in Computer Science from the University of California at San Diego in 1997, specializing in Cryptography. During and after my Ph.D. studies, I was also a Researcher at the San Diego Supercomputer Center, where I did research on authentication and privacy.
7. From 1997 to 2001, I was a Member of Technical Staff at Bell Labs, where I did research on authentication, privacy, multi-party computation, contract exchange, digital commerce including crypto payments, and fraud detection and prevention. From 2001 to 2004, I was a Principal Research Scientist at RSA Labs, where I worked on predicting future fraud scenarios in commerce and authentication and developed solutions to those problems. During that time I predicted the rise of what later became known as phishing. I was also an Adjunct Associate Professor in the Computer Science department at New York University from 2002 to 2004, where I taught cryptographic protocols.
8. From 2004 to 2016, I held a faculty position at Indiana University Bloomington, first as an Associate Professor of Computer Science, Associate Professor of Informatics, Associate Professor of Cognitive Science, and Associate Director of the Center for Applied Cybersecurity Research (CACR) from 2004 to 2008; and then as an Adjunct Associate Professor from 2008 to 2016. I was the most senior security researcher at Indiana University, where I built a research group focused on online fraud and countermeasures, resulting in over 50 publications and two books.
9. While a professor at Indiana University, I was also employed by Xerox PARC, PayPal, and Qualcomm to provide thought leadership to their security groups. I was a Principal Scientist at Xerox PARC from 2008 to 2010, a Director and Principal Scientist of Consumer Security at PayPal from 2010 to 2013, a Senior Director at Qualcomm from 2013 to 2015, and Chief Scientist at Agari from 2016 to 2018. Agari is a cybersecurity company that develops and commercializes technology to protect enterprises, their partners and customers from advanced email phishing attacks. At Agari, my research studied and addressed trends in online fraud, especially as related to email, including problems such as Business Email Compromise, Ransomware, and other abuses based on social engineering and identity deception. My work primarily involved identifying trends in fraud and computing before they affected the market, and developing and testing countermeasures, including technological countermeasures, user interaction and education.
10. I have founded or co-founded several successful computer security companies. In 2005 I founded RavenWhite Security, a provider of authentication solutions, and I am currently its Chief Technical Officer. In 2007 I founded Extricatus, one of the first companies to address consumer security education. In 2009 I founded FatSkunk, a provider of mobile malware detection software; I served as Chief Technical Officer of FatSkunk from 2009 to 2013, when FatSkunk was acquired by Qualcomm and I became a Qualcomm employee. In 2013 I founded ZapFraud, a provider of anti-scam technology addressing Business Email Compromise, and I am currently its Chief Technical Officer. In 2014 I founded RightQuestion, a security consulting company.
11. I have additionally served as a member of the fraud advisory board at LifeLock (an identity theft protection company); a member of the technical advisory board at CellFony (a mobile security company); a member of the technical advisory board at PopGiro (a user reputation company); a member of the technical advisory board at MobiSocial dba Omlet (a social networking company); and a member of the technical advisory board at Stealth Security (an anti-fraud company). I have provided anti-fraud consulting to KommuneData (a Danish government entity), J.P. Morgan Chase, PayPal, Boku, and Western Union.
12. I have authored five books and over 100 peer-reviewed publications, and have been a named inventor on over 100 patents and patent applications.

13. My work has included research in the area of applied security, privacy, cryptographic protocols, authentication, malware, social engineering, usability and fraud.
II. LEGAL UNDERSTANDING
A. The Person of Ordinary Skill in the Art

14. I understand that a person of ordinary skill in the relevant art (also referred to herein as “POSITA”) is presumed to be aware of all pertinent art, thinks along conventional wisdom in the art, and is a person of ordinary creativity—not an automaton.
15. I have been asked to consider the level of ordinary skill in the field that someone would have had at the time the claimed invention was made. In deciding the level of ordinary skill, I considered the following:

• the levels of education and experience of persons working in the field;
• the types of problems encountered in the field; and
• the sophistication of the technology.
16. A person of ordinary skill in the art relevant to the ’826 patent at the time of the invention would have a Bachelor of Science degree in electrical engineering and/or computer science, and three years of work or research experience in the fields of secure transactions and encryption, or a Master’s degree in electrical engineering and/or computer science and two years of work or research experience in related fields.
17. I am well-qualified to determine the level of ordinary skill in the art and am personally familiar with the technology of the ’826 Patent. I was a person of at least ordinary skill in the art at the time of the priority date of the ’826 patent in 2006. Even where I do not explicitly state that my statements below are based on this timeframe, all of my statements are to be understood as reflecting how a POSITA would have understood the relevant subject matter as of the priority date of the ’826 patent.
B. Legal Principles

18. I am not a lawyer and will not provide any legal opinions.
19. Though I am not a lawyer, I have been advised that certain legal standards are to be applied by technical experts in forming opinions regarding the meaning and validity of patent claims.
20. I have been informed and understand that if the Board should accept Petitioner’s arguments and cancel any of the original issued claims of the ’826 patent, Patent Owner has made a conditional motion to amend to substitute the canceled claim(s) with corresponding proposed amended claims 36-61, as set forth in Section III of Ex. 2013 (my declaration in support of Patent Owner’s Motion to Amend).
21. I have been informed and understand that to permit the proposed substitute claims to be entered, Patent Owner must show, among other things, that the substitute claims are supported by the written description of the original disclosure of the patent, as well as any patent application to which the claim seeks the benefit of priority in this proceeding.
22. I have been informed by counsel and understand that to satisfy the written description requirement, the substitute claims must be disclosed in sufficient detail such that one skilled in the art can reasonably conclude that the inventor had possession of the claimed invention as of the filing date sought. I understand that the Patent Owner can show possession of the claimed invention by pointing to such descriptive means as words, structures, figures, diagrams, and formulas that fully set forth the claimed invention.
23. I have been informed by counsel and understand that incorporation by reference is a method by which material from one or more documents may be integrated into a host document. I understand that material incorporated by reference is considered part of the written description of the patent that can be used to show possession of the claimed invention.
24. I have been informed by counsel and understand that to permit the proposed substitute claims to be entered, Patent Owner must show, among other things, that the substitute claims do not introduce new subject matter.

25. I understand that new matter is any addition to the claims without support in the original disclosure.
26. I have been informed by counsel and understand that to permit the proposed substitute claims to be entered, Patent Owner must show, among other things, that the substitute claims do not broaden the scope of the original claims.

27. I understand that claims in dependent form are construed to include all the limitations of the claim incorporated by reference into the dependent claim and further limit the claim incorporated by reference.
28. It has been explained to me by counsel for the Patent Owner that in proceedings before the USPTO, the claims of an unexpired patent are to be given their broadest reasonable interpretation in view of the specification from the perspective of one having ordinary skill in the relevant art at the time of the invention. I have considered each of the claim terms using the broadest reasonable interpretation standard.
III. RESPONSIVE ARGUMENTS TO OPPOSITION
A. Claim 56 Has Written Description Support

29. I understand that Petitioner contends that substitute claim 56 lacks written description support and is therefore invalid under 35 U.S.C. § 112. Op. at 3-4. I respectfully disagree.
30. Limitations 56[c] and 56[e] specify that the first authentication information includes a first key encrypted by a second key and that the encrypted first key is decrypted using the second key to retrieve the first key. Motion at B6. The specification describes that a first wireless signal includes “a PKI encrypted one-time DES key.” Ex. 2008 at 49:24-26. The specification further describes how “[t]he second wireless device uses the first public key to decrypt the PKI encrypted DES key.” Id. at 50:30-31. I understand that in response to this disclosure, Petitioner states “a value encrypted with a public key, which is an asymmetric key, could not be decrypted using the same public key. Even with extensive experimentation, it would be impossible for a POSITA to implement encryption and decryption with a public key.” Op. at 4. In my opinion, the specification as written contains an obvious error because a public key cannot be used to decrypt ciphertext.
31. I have been informed that an amendment to correct an obvious error does not constitute new matter where the ordinary artisan would not only recognize the existence of the error in the specification, but also recognize the appropriate corrections. The obvious error noted by Petitioner in the ’860 Application would immediately be recognized by a POSITA, who would also recognize the appropriate corrections. In particular, a POSITA would know that a public key cannot be used to both encrypt and decrypt data, and upon identifying this obvious error, a POSITA would also readily recognize two corrections—both very trivial in nature—that would clarify the specification.
32. First, in my opinion, since a public key cannot be used to both encrypt and decrypt data, a POSITA would readily understand that the recipient’s public key would have been used to perform encryption of the data (e.g., the second wireless device’s public key used to encrypt the DES key) and the recipient’s private key would be used to decrypt the data (e.g., the second wireless device’s private key used to decrypt the DES key). Also, since an asymmetric public key cannot be used to perform symmetric encryption/decryption, the key described in the specification as performing the desired symmetric encryption and decryption of the DES key may simply be a symmetric key like the claimed “second key.” In my opinion, both of these corrections in view of the specification’s teachings would be readily recognizable to a POSITA. As such, I believe the specification provides written description support for limitations 56[c] and 56[e].
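The asymmetric-key relationship underlying this correction can be illustrated with a toy example. The following sketch uses textbook-sized RSA numbers of my own choosing, purely for illustration and not as the ’826 patent’s actual implementation; it shows that a symmetric key encrypted under a recipient’s public key is recoverable only with the matching private key:

```python
# Toy RSA sketch (illustration only): the recipient's PUBLIC key
# encrypts a one-time symmetric key; only the matching PRIVATE key
# can decrypt it.

p, q = 61, 53                 # small primes (toy sizes)
n = p * q                     # modulus: 3233
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (modular inverse)

des_key = 1234                # stands in for a one-time DES key
ciphertext = pow(des_key, e, n)     # encrypt with PUBLIC key (e, n)
recovered = pow(ciphertext, d, n)   # decrypt with PRIVATE key (d, n)

assert recovered == des_key
# Applying the public key to the ciphertext does NOT recover the key,
# which is why the uncorrected specification text cannot be literal:
assert pow(ciphertext, e, n) != des_key
```

The final line makes the point of the obvious error concrete: re-applying the public exponent does not invert the encryption, so decryption must use the private key.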
B. Petitioner Fails to Address “the digital signature generated using a private key associated with the first handheld device”
33. In my opinion, Petitioner fails to show that the prior art of record discloses “the digital signature generated using a private key associated with the first handheld device.” Motion at B1 (36[f]). Petitioner ignores this claim limitation in its analysis of the prior art. See Op. at 5-10. Instead, Petitioner focuses only on whether Schutzer discusses a “digital signature,” and neglects to dig deeper as to whether Schutzer’s digital signature is specifically generated using a private key associated with a handheld device. See Op. at 9 (citing Ex. 1030, Schutzer, ¶29). A close review of the cited portion of Schutzer reveals that Schutzer is silent on how the digital signature is generated, such as who or what generated the digital signature. In particular, no explicit or implicit disclosure is made that Schutzer’s digital signature was generated using a private key of a handheld device.
34. Also, no implicit disclosure is made in Schutzer that the digital signature is necessarily generated by a private key of the user’s computing device 10. For instance, Schutzer’s digital signature may be generated using the private key of a certificate authority and be used as part of a digital certificate to authenticate the user.
35. As another example, the digital signature may be that of the user itself and not the user’s device. The distinction here may be subtle yet important. A user may personally have his or her own private-public key pair that he or she uses to create digital signatures that can be uniquely associated with him or her. That same user may, however, own multiple different electronic devices that each have their own public-private key pairs that may be used to create digital signatures that can be used to identify the specific device itself and not necessarily its user.
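The user-key versus device-key distinction can be sketched as follows. The key sizes, messages, and scheme are my own toy illustration (Schutzer is silent on these details); the point is simply that a signature is tied to whichever private key produced it:

```python
import hashlib

# Toy RSA signing sketch (illustrative only): a user and a handheld
# device each hold DISTINCT key pairs, so a signature identifies the
# holder of the specific private key that produced it.

def make_keys(p, q, e=17):
    n, phi = p * q, (p - 1) * (q - 1)
    return (e, n), (pow(e, -1, phi), n)   # (public, private)

def sign(msg, priv):
    d, n = priv
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)                   # sign the message digest

def verify(msg, sig, pub):
    e, n = pub
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(sig, e, n) == h

user_pub, user_priv = make_keys(61, 53)       # the user's own pair
device_pub, device_priv = make_keys(67, 71)   # the handheld's pair

sig = sign(b"transaction", device_priv)
assert verify(b"transaction", sig, device_pub)   # identifies the device
# Verifying the same signature under user_pub would fail (with
# overwhelming probability), since a different private key produced it.
```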
36. This same limitation can also be found in substitute claim 45. Motion at B4 (45[c]). Since Petitioner applies the same unpatentability analysis for claim 45 as it did for claim 36 (Op. at 11), the same arguments above regarding Petitioner’s failure to establish the limitation “the digital signature generated using a private key associated with the first handheld device” equally apply to claim 45.
C. Petitioner Fails to Address Several Limitations of Claim 45

37. I understand that Petitioner asserts that “Substitute claim 45 adds similar amendments to claim 10 as substitute claim 36 to 1,” and then summarily concludes that, “Accordingly, substitute claim 45 is obvious for at least the same reasons claims 10 and 36 are obvious.” Op. at 11. But I believe Petitioner’s dismissive analysis neglects limitations that are distinctly unique to claim 45.
38. Petitioner fails to address limitations 45[e] and 45[g], which respectively recite, “at least one of the digital signature and/or the one-time code encrypted by the first handheld device” and “decrypting, with the second device, at least one of the digital signature and/or the one-time code encrypted by the first handheld device.” Motion at B3. Petitioner does not address anywhere in its Opposition what prior art reference purportedly discloses these claim features. In my opinion, these limitations are unique to claim 45 and are not found in claim 36. Thus, Petitioner’s summary reliance on its limited analysis of claim 36 as the basis for its opposition to claim 45 is facially deficient, leaving Petitioner with no argument whatsoever with respect to limitations 45[e] and 45[g].
39. Next, limitation 45[d] requires that a first signal generated “include[] the first authentication information of the first entity, the one-time code, and the digital signature as separable fields of the first signal.” Motion at B4 (emphasis added). Again, this “separable fields” requirement is not present in claim 36 and is consequently not addressed by Petitioner in its analysis of claim 36. See Op. at 5-10. While Petitioner discusses “separable fields” with respect to a different claim, claim 42, Petitioner does not refer back to or cite to claim 42 in its analysis of claim 45.
40. Moreover, in my opinion independent claim 45 includes other distinctly different limitations not found in independent claim 36 or dependent claim 42 (e.g., “at least one of the digital signature and/or the one-time code encrypted by the first handheld device” and “decrypting…at least one of the digital signature and/or the one-time code encrypted by the first handheld device”). These limitations have a material impact on how claim 45 comes together as a whole to define a distinctly different invention than claim 36 or claim 42. Given these differences, to satisfy its burden Petitioner must articulate in its Opposition how and why the “separable fields” limitation was obvious with respect to claim 45 as a whole. Petitioner does not.
41. By neglecting to analyze multiple features of claim 45 in its Opposition, Petitioner fails to make a prima facie showing of unpatentability.
D. A POSITA Would Not Combine Jakobsson, Maritzen, and Schutzer By Prepending First Authentication Information

Claims 36 and 45
42. I understand that Petitioner argues that a POSITA would be motivated to “add[] the digital signature of Schutzer and one-time code disclosed by Jakobsson to the key of Maritzen” by “prepending or appending values such as Maritzen’s keys, Jakobsson’s one-time code, and Schutzer’s digital signature.” Op. at 10 (emphasis added); see also id. at 13-14 (prepending or appending same values to achieve limitations of claim 42). I strongly disagree. Even assuming that Maritzen’s transaction or biometric key was derived from biometric information (it is not), a POSITA would not prepend or append Maritzen’s keys to Jakobsson’s code and Schutzer’s digital signature because doing so would be redundant. Jakobsson already teaches that its authentication code incorporates biometric data (e.g., authentication code A (K, T, E, P) 292, where P may be biometric data). See Ex. 1005, Jakobsson at [0072], [0073]. Thus, there would be little motivation to make the proposed modification in order to send substantially the same information twice at the same time: once by prepending or appending, and again by incorporating the value into an authentication code.
43. Moreover, Maritzen frequently states that neither “biometric information identifying the user” nor any other “user information” is transmitted from the user device at any time during a transaction. Ex. 1004, Maritzen at [0044] (“The biometric information identifying the user is not transmitted at any time.”); see also id. at [0045], [0088], [0090], [0109], [0111], [0124], [0128], [0148], [0150], [0164], [0166]. Thus, I believe a POSITA would understand that Maritzen teaches away from prepending/appending and sending the claimed “first authentication information derived from the first biometric information.” As such, in my opinion, independent substitute claims 36 and 45 are novel and non-obvious over the prior art.
Claim 42
44. Even if it were assumed that Jakobsson’s authentication code was generated without using biometric data (i.e., did not include user data (P)) and Maritzen’s keys supplied information derived from a biometric, I believe a POSITA would still not prepend or append Maritzen’s keys to Jakobsson’s code and Schutzer’s digital signature for Petitioner’s stated purpose of “more securely authenticat[ing] the user.” Jakobsson never discloses an embodiment where an authentication code is generated without use of—at least at some stage—a one-way function, such as a hash function. Even in the embodiment where Jakobsson describes a PIN (P) being appended to authentication code A(K, T, E), the latter value is the result of a one-way function. See Ex. 1005, Jakobsson at [0073]. In fact, use of a one-way function is critical to Jakobsson’s system because otherwise the system would not be secure. Therefore, in light of the teachings of Jakobsson, I believe a POSITA would not, for example, prepend or append various values without applying a one-way function because certain types of information described in Jakobsson, such as the secret key K or biometric value P, would be put at risk of interception.
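The role of the one-way function can be sketched in a few lines. This is a simplified illustration of the general construction with values of my own invention, using HMAC-SHA256 as a stand-in one-way function rather than Jakobsson’s exact algorithm:

```python
import hmac, hashlib

K = b"secret-device-key"     # shared secret (illustrative value)
T = b"time-interval-0001"    # current time value
E = b"event-state-000042"    # event state

# One-way combination: an eavesdropper who intercepts A cannot invert
# the HMAC to recover the secret K.
A = hmac.new(K, T + E, hashlib.sha256).hexdigest()

P = "1234"                   # PIN (or encoded biometric) appended
code = A + P                 # AFTER the one-way step

# Naive prepending/appending with NO one-way function would transmit
# the sensitive values themselves, exposing K (and P) to interception:
insecure = (K + T + E).hex() + P
assert K.hex() in insecure   # the secret key appears verbatim
```

The contrast is the point of the paragraph above: the one-way step hides K inside the authentication code, while bare concatenation puts K on the wire.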
45. Also, if Maritzen’s biometric key were biometric information, which it is not, then it is well understood that it would suffer from errors, such as translation and rotation errors. These errors would make it practically impossible for the verifier to verify a received authentication code: the verifier would not know what input to provide to Jakobsson’s one-way combination function when generating its own copy of the authentication code, because the errors are not knowable to the verifier.
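This verification problem can be illustrated with any one-way function; the template strings below are invented stand-ins for biometric readings, not an actual biometric encoding:

```python
import hashlib

# Two scans of the SAME finger differ slightly (translation/rotation
# noise), which a one-way function amplifies into a completely
# different output -- there is no "close match" a verifier could accept.
enrolled = b"template-row17-col042"
scanned  = b"template-row18-col042"   # off by one row of sensor noise

h_enrolled = hashlib.sha256(enrolled).hexdigest()
h_scanned  = hashlib.sha256(scanned).hexdigest()

assert h_enrolled != h_scanned
# The verifier cannot predict the noisy input the prover actually
# scanned, so it cannot regenerate the prover's authentication code.
```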
E. Jakobsson and Burnett Fail to Disclose Limitations 56[c], 56[e]

46. I understand that Petitioner relies on the combination of Maritzen, Jakobsson, Niwa, Schutzer, and Burnett in attempting to show that claim 56 is obvious. I believe Petitioner fails to make a prima facie showing of obviousness.
47. Petitioner first contends that Maritzen in view of Jakobsson discloses claim limitations 56[c] and 56[e]. Op. at 15 (citing Ex. 1019, Shoup Decl. at ¶¶ 55-56; Op. at Section II.D.1.a(2); Ex. 1005, Jakobsson at ¶¶ 6, 7, 21, 58; Ex. 1004, Maritzen at ¶¶ 45-46). However, in my opinion the cited portions of Maritzen and Jakobsson do not disclose that a first key used to encrypt at least a portion of first authentication information is itself encrypted by a second key, which is then decrypted at a second device using the second key.
48. For example, Maritzen describes how a “transaction key” may be encrypted using “standard encrypting methods, such as, for example, public key infrastructure (PKI) encryption.” Ex. 1004, Maritzen at ¶ 45. However, I believe Maritzen’s “transaction key” is merely an authentication value and is not an encryption key that encrypts or decrypts data. See, e.g., id. at ¶¶ 44-50. Instead, Maritzen simply validates the transaction key by comparing the transaction key to other keys stored at the clearing house 130 to determine if there is a match. See id. at ¶ 48. By contrast, the current claims require that “at least a portion of the first authentication information [is] encrypted by a first key” and the first key is used to “decrypt[], at the second device, the portion of the first authentication information encrypted by the first key using the first key.” Motion at B6 (56[c], 56[e]). Maritzen’s transaction key performs no encryption or decryption. Also, I believe the cited portions of Jakobsson fail to disclose the above claim limitations.
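The distinction between the two roles can be sketched in a few lines; the stored values and the toy XOR cipher are my own illustration, not Maritzen’s actual system:

```python
# A Maritzen-style "transaction key" is only COMPARED against stored
# values -- a match/no-match credential check, like a password lookup.
stored_keys = {"device-1": "txn-key-abc123"}

def validate(device_id, presented_key):
    return stored_keys.get(device_id) == presented_key

assert validate("device-1", "txn-key-abc123")

# An encryption key, by contrast, TRANSFORMS data: here a toy XOR
# cipher turns ciphertext back into plaintext, something the
# comparison above never does.
key = b"0123456789abcdef"
plaintext = b"first auth info!"
ciphertext = bytes(a ^ b for a, b in zip(plaintext, key))
assert bytes(a ^ b for a, b in zip(ciphertext, key)) == plaintext
```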
49. Petitioner argues that “[t]o the extent that Maritzen and Jakobsson do not explicitly discuss encrypting data with a first key and encrypting the first key with a second key, Burnett discloses this limitation.” Op. at 15. Specifically, Petitioner states that “Burnett discloses that a ‘session key’ ([first key]) used to encrypt information can be encrypted using a key encryption key (‘KEK’) ([second key]), and that the same KEK can be used to decrypt the first key.” Op. at 15 (citing Ex. 1021, Burnett at 54-55, FIG. 3-1). Petitioner also claims that it would have been obvious to “modify the authentication information of Jakobsson by encrypting it with a session key, encrypting the session key with a KEK, and transmitting the KEK-encrypted session key…to the second device for decryption as taught by Burnett.” Op. at 17. A close review of Burnett reveals that Petitioner misunderstands and misapplies Burnett.
50. Among other things, Chapter 3 of Burnett discusses password-based encryption (PBE). In particular, it describes how a “session key,” which is used to encrypt and decrypt bulk data, may itself be encrypted using another key that is known as a key encryption key (KEK). Ex. 1021, Burnett at 54. Burnett further discusses how, advantageously, the KEK is not stored and is instead generated as needed at the device to encrypt or decrypt the session key to recover the encrypted data. Id. (“When he needs a KEK to encrypt, [he] will generate it, use it, and then throw it away. When he needs to decrypt the data, he generates the KEK again, uses it, and throws it away.”). In particular, the process uses PBE where a “mixing algorithm” blends a “salt” (i.e., a random value) and a user-selected password together to generate a KEK. Id. at 55. After the KEK is used to encrypt the session key, it is thrown away and the salt used to generate the KEK is stored alongside the encrypted session key at the device. Id. To decrypt the stored, encrypted session key, the salt is retrieved and input into the same mixing algorithm along with the same password in order to regenerate the same KEK. Id.
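This scheme, as I read Burnett, can be sketched as follows. PBKDF2 stands in for Burnett’s generic “mixing algorithm,” and all names, parameters, and the toy XOR cipher are my own illustration:

```python
import os, hashlib

def derive_kek(password: bytes, salt: bytes) -> bytes:
    # "Mixing algorithm": blends salt + password into a KEK.
    # PBKDF2-HMAC-SHA256 stands in for Burnett's generic mixer.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

session_key = os.urandom(16)      # bulk-data key to be protected
password = b"pao-chi's password"
salt = os.urandom(16)             # random; stored with the ciphertext

kek = derive_kek(password, salt)
encrypted = bytes(a ^ b for a, b in zip(session_key, kek))  # toy cipher
del kek                           # the KEK is thrown away, not stored

# Later, at the SAME device: regenerate the same KEK from the stored
# salt and the same password, then recover the session key.
kek2 = derive_kek(password, salt)
recovered = bytes(a ^ b for a, b in zip(encrypted, kek2))
assert recovered == session_key

# Another user mixing a DIFFERENT password and salt gets a DIFFERENT
# KEK -- so one user's KEK never travels to, or works at, another's
# device (the point of paragraph 51 below is made with fixed toy salts):
kek_gwen = derive_kek(b"gwen's password", b"gwen-salt-0000")
assert kek_gwen != derive_kek(password, salt)
```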
51. Subsequent pages of Burnett explain that the KEK is personal to each user/device and is not shared with other users/devices. In particular, Burnett discloses:

    There are a couple of reasons to use a session key and a KEK. First, suppose you need to share the data with other people and you want to keep it stored encrypted. In that case, you generate one session key, and everyone gets a copy of it. Then everyone protects his or her copy of the session key using PBE. So rather than share a password (something everyone would need for decrypting if you had used PBE to encrypt the bulk data), you share the key.

Ex. 1021, Burnett at 58 (emphasis added). Thus, if a first device and a second device were to share a session key used to encrypt and decrypt bulk data, each would generate its own KEK using its own password and salt to encrypt and store the shared session key instead of sharing the same password and KEK. Consequently, the KEK used to encrypt the session key at a first device is not used to decrypt the encrypted session key at a second device. FIG. 3-4 of Burnett also supports this interpretation because the figure shows how a session key shared by two users, Pao-Chi and Gwen, is encrypted with a KEK unique to each user.
52. By contrast, substitute claim 56 requires that the same “second key” used to encrypt the first key at the first handheld device is also used at the second device to decrypt the encrypted first key to retrieve the first key at the second device. Burnett’s KEK is not shared between users and instead each user generates their own KEK to encrypt and store a shared session key using passwords and salts unique to them. Consequently, in my opinion Petitioner fails to establish that Burnett discloses limitations 56[c] and 56[e].
53. Similarly, Petitioner’s assertions as to why a POSITA would be motivated to combine Maritzen, Jakobsson, Niwa, Schutzer, and Burnett are also defective. See Op. at 15-16. Burnett never teaches transmitting a KEK-encrypted session key from one user to another since each user generates its own KEK based on its own password and salt. See Ex. 1021, Burnett at 58-59. In my opinion a POSITA would not be motivated to make Petitioner’s proposed combination that envisions transmitting KEK-encrypted session keys between users since doing so would require one user to also share its password and salt with the other user so the latter could decrypt the encrypted session key. See id. at 58 (“So rather than share a password…”). Consequently, I believe Petitioner fails to establish that the proposed combination renders claim 56 obvious.
F. Substitute Claims are Patent Eligible under § 101

54. I understand that Petitioner argues that the substitute claims are unpatentable under § 101 because they purportedly claim patent-ineligible abstract ideas. Op. at 18-24. However, I have been informed that on September 19, 2018, United States Magistrate Judge Sherry R. Fallon for the District Court of Delaware issued a Report and Recommendation (R&R) rejecting similar arguments made by Petitioner, and recommending that the District Court deny Petitioner’s motion to dismiss under § 101 since the claims of the ’826 patent are “not directed to an abstract idea because ‘the plain focus of the claims is on an improvement to computer functionality itself, not on economic or other tasks for which a computer is used in its ordinary capacity.’” Ex. 2016, Universal Secure Registry, LLC v. Apple, Inc., 1:17-cv-00585-JFB-SRF, Dkt. 137 at 19 (D. Del. Sep. 18, 2018) (emphasis added). Specifically, I understand that Judge Fallon stated that:

    The ’826 patent is directed to an improvement in computer functionality, as it requires biometric information to locally authe