________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________

APPLE INC.,
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC,
Patent Owner.
________________

Case IPR2018-00810
U.S. Patent No. 9,100,826
________________

PATENT OWNER'S EXHIBIT 2003
DECLARATION OF MARKUS JAKOBSSON
IN SUPPORT OF PATENT OWNER'S RESPONSE

1. I have been retained on behalf of Universal Secure Registry LLC ("USR" or "Patent Owner") in connection with the above-captioned inter partes review (IPR) to provide my opinions in support of USR's Patent Owner Response. I am being compensated for my time at the rate of $625 per hour. I have no interest in the outcome of this proceeding.

2. In preparing this declaration, I have reviewed and am familiar with the Board's decision to institute review in IPR2018-00810 (Paper 8, "Decision"), the Petition (Paper 3, "Petition") filed by Apple Inc. ("Petitioner"), U.S. Patent No. 9,100,826 and its file history, and all other materials cited and discussed in the Petition (including the declaration of Dr. Victor Shoup) and cited and discussed in this Declaration.

3. I understand that the Board instituted review as to whether claims 1, 2, 7, 8, 10, 11, 14, 15, 21, 22, 24, 26, 27, 30, 31, and 34 ("Challenged Claims") would have been obvious in view of US 2004/0236632 A1 (Ex. 1004, "Maritzen"), WO 2004/051585 A2 (Ex. 1005, "Jakobsson"), and US 6,453,301 B1 (Ex. 1007, "Niwa") under 35 U.S.C. § 103. Decision, 5, 20. For the reasons stated herein, I conclude that the Challenged Claims would not have been obvious.

4. The statements made herein are based on my own knowledge and opinion. This Declaration represents only the opinions I have formed to date. I may consider additional documents as they become available, or other documents that are necessary to form my opinions. I reserve the right to revise, supplement, or amend my opinions based on new information and on my continuing analysis.

I. QUALIFICATIONS

5. My qualifications can be found in my Curriculum Vitae, which includes my detailed employment background, professional experience, and list of technical publications and patents. Ex. 2004.

6. I am currently the Chief of Security and Data Analytics at Amber Solutions, Inc., a cybersecurity company that develops home and office automation technology. At Amber, my research addresses abuse, including social engineering, malware, and privacy intrusions. My work primarily involves identifying risks, developing protocols and user experiences, and evaluating the security of proposed approaches.

7. I received a Master of Science degree in Computer Engineering from the Lund Institute of Technology in Sweden in 1993, a Master of Science degree in Computer Science from the University of California at San Diego in 1994, and a Ph.D. in Computer Science from the University of California at San Diego in 1997, specializing in Cryptography. During and after my Ph.D. studies, I was also a Researcher at the San Diego Supercomputer Center, where I did research on electronic payment schemes, authentication, and privacy.

8. From 1997 to 2001, I was a Member of Technical Staff at Bell Labs, where I did research on authentication, privacy, multi-party computation, contract exchange, digital commerce including crypto payments, and fraud detection and prevention. From 2001 to 2004, I was a Principal Research Scientist at RSA Labs, where I worked on predicting future fraud scenarios in commerce and authentication and developed solutions to those problems. During that time I predicted the rise of what later became known as phishing. I was also an Adjunct Associate Professor in the Computer Science department at New York University from 2002 to 2004, where I taught cryptographic protocols.

9. From 2004 to 2016, I held a faculty position at Indiana University at Bloomington, first as an Associate Professor of Computer Science, Associate Professor of Informatics, Associate Professor of Cognitive Science, and Associate Director of the Center for Applied Cybersecurity Research (CACR) from 2004 to 2008, and then as an Adjunct Associate Professor from 2008 to 2016. I was the most senior security researcher at Indiana University, where I built a research group focused on online fraud and countermeasures, resulting in over 50 publications and two books.

10. While a professor at Indiana University, I was also employed by Xerox PARC, PayPal, and Qualcomm to provide thought leadership to their security groups. I was a Principal Scientist at Xerox PARC from 2008 to 2010, a Director and Principal Scientist of Consumer Security at PayPal from 2010 to 2013, a Senior Director at Qualcomm from 2013 to 2015, and Chief Scientist at Agari from 2016 to 2018.

11. Agari is a cybersecurity company that develops and commercializes technology to protect enterprises, their partners, and customers from advanced email phishing attacks. At Agari, my research addressed trends in online fraud, especially as related to email, including problems such as Business Email Compromise, Ransomware, and other abuses based on social engineering and identity deception. My work primarily involved identifying trends in fraud and computing before they affected the market, and developing and testing countermeasures, including technological countermeasures, user interaction, and education.

12. I have founded or co-founded several successful computer security companies. In 2005 I founded RavenWhite Security, a provider of authentication solutions, and I am currently its Chief Technical Officer. In 2007 I founded Extricatus, one of the first companies to address consumer security education. In 2009 I founded FatSkunk, a provider of mobile malware detection software; I served as Chief Technical Officer of FatSkunk from 2009 to 2013, when FatSkunk was acquired by Qualcomm and I became a Qualcomm employee. In 2013 I founded ZapFraud, a provider of anti-scam technology addressing Business Email Compromise, and I am currently its Chief Technical Officer. In 2014 I founded RightQuestion, a security consulting company.

13. I have additionally served as a member of the fraud advisory board at LifeLock (an identity theft protection company); a member of the technical advisory board at CellFony (a mobile security company); a member of the technical advisory board at PopGiro (a user reputation company); a member of the technical advisory board at MobiSocial dba Omlet (a social networking company); and a member of the technical advisory board at Stealth Security (an anti-fraud company). I have provided anti-fraud consulting to KommuneData (a Danish government entity), J.P. Morgan Chase, PayPal, Boku, and Western Union.

14. I have authored five books and over 100 peer-reviewed publications, and have been a named inventor on over 100 patents and patent applications.

15. My work has included research in the area of electronic payments, applied security, privacy, cryptographic protocols, authentication, malware, social engineering, usability, and fraud.

II. LEGAL UNDERSTANDING

A. The Person of Ordinary Skill in the Art

16. I understand that a person of ordinary skill in the relevant art (also referred to herein as "POSITA") is presumed to be aware of all pertinent art, thinks along the lines of conventional wisdom in the art, and is a person of ordinary creativity—not an automaton.

17. I have been asked to consider the level of ordinary skill in the field that someone would have had at the time the claimed invention was made. In deciding the level of ordinary skill, I considered the following:

• the levels of education and experience of persons working in the field;
• the types of problems encountered in the field; and
• the sophistication of the technology.

18. I conclude that a person of ordinary skill in the art ("POSITA") relevant to the '826 patent at the time of the invention would have had a Bachelor of Science degree in electrical engineering and/or computer science and three years of work or research experience in the fields of secure transactions and encryption, or a Master's degree in electrical engineering and/or computer science and two years of work or research experience in related fields.

19. I have reviewed the declaration of Dr. Victor Shoup, including his opinions regarding the Person of Ordinary Skill in the Art. Ex. 1002, ¶ 22. My description of the level of ordinary skill in the art is essentially the same as that of Dr. Shoup, except that Dr. Shoup's description requires two years of work or research experience (as compared to three years). The opinions set forth in this Declaration would be the same under either my or Dr. Shoup's proposal.

20. I am well qualified to determine the level of ordinary skill in the art and am personally familiar with the technology of the '826 patent. I was a person of at least ordinary skill in the art at the time of the priority date of the '826 patent in 2006. Even where I do not explicitly state that a statement is based on this timeframe, all of my statements are to be understood as reflecting how a POSITA would have understood the relevant subject matter as of the priority date of the '826 patent.

B. Legal Principles

21. I am not a lawyer and will not provide any legal opinions. Though I am not a lawyer, I have been advised that certain legal standards are to be applied by technical experts in forming opinions regarding the meaning and validity of patent claims.

1. Obviousness

22. I understand that to obtain a patent, a claimed invention must have, as of the priority date, been nonobvious in view of prior art in the field. I understand that an invention is obvious when the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art.

23. I understand it is Petitioner's burden to prove that the Challenged Claims would have been obvious.

24. I understand that to prove that prior art, or a combination of prior art, renders a patent obvious, it is necessary to: (1) identify the particular references that, singly or in combination, make the patent obvious; (2) specifically identify which elements of the patent claim appear in each of the asserted references; and (3) explain how the prior art references could have been combined to create the inventions claimed in the asserted claim.

25. I understand that a patent composed of several elements is not proved obvious merely by demonstrating that each of its elements was, independently, known in the prior art, and that obviousness cannot be based on the hindsight combination of components selectively culled from the prior art to fit the parameters of the patented invention.

26. I also understand that a reference may be said to teach away when a person of ordinary skill, upon reading the reference, would be discouraged from following the path set out in the reference, or would be led in a direction divergent from the path that was taken by the applicant. Even if a reference is not found to teach away, I understand that its statements regarding preferences are relevant to a finding regarding whether a skilled artisan would be motivated to combine that reference with another reference.

2. Claim Construction

27. I understand that in this inter partes review the claims must be given their broadest reasonable interpretation, but that interpretation must be consistent with the patent specification. In this Declaration, I have used the broadest reasonable interpretation ("BRI") standard when interpreting the claim terms.

III. OVERVIEW OF THE '826 PATENT

A. The '826 Patent Specification

28. I have reviewed the '826 patent. The '826 patent relates to a unique and highly secure distributed authentication system that locally authenticates a user's identity at a handheld device (e.g., using a PIN or biometric input), and also remotely authenticates the user's identity at a second device based on wirelessly transmitted authentication information (e.g., comprising a time-varying code) determined from the user's biometric information. Ex. 1001, Figs. 21-27, 28:32-36:26. Figure 21 depicts one embodiment of such a distributed authentication system:

[Figure 21 of the '826 patent: distributed authentication system]

Id. at Fig. 21.

29. In some embodiments, a first handheld device may authenticate the user of the device based on authentication information (e.g., a PIN) or biometric information provided by the user, which may be compared against information stored in memory of the device. Ex. 1001, Fig. 22, 28:56-29:3, 29:65-30:7, 30:25-31. If user authentication fails, the device may disable use (e.g., by shutting down and/or deleting data stored in memory). Id. at Fig. 22, 28:56-29:3, 30:3-14, 30:31-39. If the user is successfully authenticated, the device may prepare and wirelessly transmit "a first wireless signal containing encrypted authentication information of the first user" to a second device. Id. at Fig. 22, 28:64-30:14, 30:46-58. The wireless signal may include a "time-varying code" and/or other information determined from the provided biometric information. Id. at Fig. 23, 31:55-32:42. After receiving the wireless signal, the second device may authenticate the identity of the user of the first handheld device using the encrypted authentication information and other information (e.g., second biometric information or second authentication information) received or retrieved from memory. Id. at Fig. 22, 30:59-61, 31:2-10, 31:25-32, 32:46-54.

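For illustration only, the following Python sketch outlines the general two-stage flow described above: local authentication at the first handheld device, followed by wireless transmission of authentication information that includes a time-varying code and a value determined from the biometric information. The function names, the HMAC-based derivations, and the 30-second time step are my own illustrative assumptions and are not drawn from the '826 patent.

    # Illustrative sketch only; not an implementation from the '826 patent.
    import hashlib
    import hmac
    import time

    TIME_STEP = 30  # assumed length of the time window for the time-varying code

    def locally_authenticate(entered_pin: str, stored_pin: str) -> bool:
        # The first handheld device compares the user's input against data stored
        # in its own memory.
        return hmac.compare_digest(entered_pin.encode(), stored_pin.encode())

    def time_varying_code(device_secret: bytes) -> str:
        # Illustrative time-varying code: an HMAC over the current time window.
        window = str(int(time.time() // TIME_STEP)).encode()
        return hmac.new(device_secret, window, hashlib.sha256).hexdigest()[:8]

    def first_device_flow(entered_pin, stored_pin, device_secret, biometric_sample):
        if not locally_authenticate(entered_pin, stored_pin):
            return None  # e.g., the device could shut down and/or delete stored data
        # Prepare authentication information for wireless transmission to a second
        # device, which performs its own remote authentication of the user.
        bio_derived = hashlib.sha256(biometric_sample).hexdigest()[:8]
        return {"code": time_varying_code(device_secret), "bio": bio_derived}
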
30. The '826 patent identifies a number of disadvantages of prior art authentication systems. For example, a prior art system may control access to computer networks using password protected accounts, but such a system is susceptible to tampering and difficult to maintain; or hand-held computer devices may be used to verify identity, but security could be compromised if a device ends up in the wrong hands. Ex. 1001, 1:46-2:41.

31. In contrast, the '826 patent provides a more secure distributed authentication system, where a handheld device locally authenticates a user based on gathered biometric or authentication information, thereby preventing unauthorized use of the device. Ex. 1001, Fig. 22, 28:56-29:3, 29:65-30:39. And, rather than relying solely on local user authentication, the '826 patent provides additional security by imposing additional remote user authentication, based on different authentication information (e.g., one-time variable token or other information determined from the provided biometric information) wirelessly transmitted by the first device, and other information (e.g., second authentication information or biometric information) available at the second device (e.g., securely stored or received by the second device). Id. at Fig. 24, Fig. 26, 32:43-56, 34:7-25.

B. The '826 Patent Claims

32. The '826 patent includes 35 claims, of which claims 1, 10, 21, and 30 are independent. All of the '826 patent's claims relate to distributed authentication systems or methods that authenticate the identity of a user of a handheld device.

33. Independent claims 1 and 10 are similar in some respects. Ex. 1001, 44:24-58, 45:30-47. Independent claims 21 and 30 are also similar to claims 1 and 10, but differ in significant ways. Id. at 46:21-57, 47:29-48:13. For example, while claims 1 and 10 refer to a first handheld device that authenticates the user of the device based on "authentication information," claims 21 and 30 refer, instead, to a first handheld device that authenticates the user of the device based on "first biometric information" provided by the user. Id. at 46:23-29, 47:31-33. As I discuss below, "authentication information" and "first biometric information" are different types of information in this context. In addition, while claims 1 and 10 refer to a second device that authenticates the user of the first handheld device based upon "second authentication information," claims 21 and 30 refer, instead, to a second device that authenticates the user of the first handheld device based upon "second biometric information" (id. at 46:47-57, 48:6-13), where "second authentication information" and "second biometric information" are also different types of information in this context. The dependent claims also add a variety of significant features.

IV. OVERVIEW OF THE ASSERTED PRIOR ART

A. Maritzen (Ex. 1004)

34. Maritzen discloses a vehicle-based payment system focused upon maintaining anonymity. It recognizes "[a] situation that still requires use of cash is in the collection of fees at vehicle-accessed payment gateways such as toll booths, vehicular kiosks, smog-certification stations, and the like." Ex. 1004, [0003]. Maritzen explains that "[t]he collection of fees at these gateways is time consuming and subject to fraud." Id. Accordingly, Maritzen seeks to provide "a system and method for the real-time settlement of vehicle-accessed, financial transactions that provide anonymity and security." Id. at [0006].

35. Maritzen discloses a system and method for electronic payment of fees using a personal transaction device (PTD) at a vehicle-accessed, payment-gateway terminal (VAPGT). Ex. 1004, Abstract, [0002], [0007]-[0009].

[Figure 1 of Maritzen]

Id. at Fig. 1. As a vehicle with a PTD nears a VAPGT, the VAPGT transmits a payment request to the PTD. Id. at [0040]-[0042]. A user accesses the PTD to make a payment using a biometric input—in the preferred embodiment, the user provides the biometric input to a separate "privacy card" that transmits a separate "biometric key" to the PTD. Id. at [0043]-[0044]. The privacy card "only transmits the biometric key" to the PTD, while "biometric information identifying the user is not transmitted at any time." Id. at [0044]. Next, the PTD transmits a "transaction key" including the biometric key to the VAPGT (id. at [0045])—the "PTD does not transmit any user information to [the] VAPGT." Id. Then, the VAPGT transmits a "transaction request" including the transaction key to a clearing house, which validates information in the transaction request. Id. at [0046]-[0048].

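For illustration only, the following Python sketch traces the message flow just described, from the privacy card to the PTD, then to the VAPGT, and then to the clearing house. The data structures, function names, and the hash-based derivation of the biometric key are my own illustrative assumptions; they are not implementation details disclosed by Maritzen.

    # Illustrative message-flow sketch only; not Maritzen's implementation.
    import hashlib
    from dataclasses import dataclass

    @dataclass
    class BiometricKey:
        value: bytes  # derived key only; the raw biometric never leaves the privacy card

    @dataclass
    class TransactionKey:
        biometric_key: BiometricKey
        amount: float  # amount from the VAPGT's payment request; no user-identifying data

    def privacy_card_derive_key(raw_biometric: bytes) -> BiometricKey:
        # The privacy card transmits only a biometric key to the PTD (Ex. 1004, [0044]).
        return BiometricKey(hashlib.sha256(raw_biometric).digest())

    def ptd_build_transaction_key(amount: float, key: BiometricKey) -> TransactionKey:
        # The PTD sends a transaction key to the VAPGT without user information ([0045]).
        return TransactionKey(biometric_key=key, amount=amount)

    def vapgt_build_transaction_request(tkey: TransactionKey, gateway_id: str) -> dict:
        # The VAPGT forwards a transaction request to the clearing house, which
        # validates the information in the request ([0046]-[0048]).
        return {"transaction_key": tkey, "gateway_id": gateway_id}
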
B. Jakobsson (Ex. 1005)

36. I am an inventor of International Patent Application Publication No. WO 2004/051585 A2 (Ex. 1005, "Jakobsson").

37. Jakobsson discloses an "identity authentication system" that uses an "identity authentication code…to verify identity and to communicate event state information." Ex. 1005, Title, Abstract. "The invention addresses the[] shortcomings [of the prior art] by including an indication of the occurrence of an event directly into the efficient computation of an identity authentication code, where the verifier may efficiently verify the authentication code and identify the signaling of an event state." Id. at [0010]. Jakobsson discloses that "[e]xample reportable events include: device tampering; an event external to the device detected by the device; an environmental event, such as temperature exceeding or falling below a threshold; static discharge; high or low battery power; geographic presence at a particular location; confidence level in a biometric reading; and so on." Id. at [0011].

38. Jakobsson's user device (such as a key fob or telephone (Ex. 1005, [0016])) generates an "identity authentication code" that depends on values including at least a dynamic variable, an event state, and a device secret. Id. at [0017], [0020], [0021], [0063]-[0072]. The identity authentication code is sent along with user identification information to the verifier for authentication. See id. at [0004], [0021], [0097], [0112].

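For illustration only, the following Python sketch shows one way a short code could be computed from a dynamic variable, an event state, and a device secret, and then checked by a verifier that recovers the signaled event state. The HMAC-SHA256 combining function, the 60-second time step, and the small set of candidate event states are my own illustrative assumptions, not the particular constructions disclosed in Jakobsson.

    # Illustrative sketch only; not the constructions disclosed in Jakobsson.
    import hashlib
    import hmac
    import time

    def identity_authentication_code(device_secret: bytes, event_state: int,
                                     time_step: int = 60) -> str:
        # Combine a dynamic variable (here, a time-based value), an event state,
        # and a device secret into a single short code.
        dynamic_variable = int(time.time() // time_step)
        message = f"{dynamic_variable}:{event_state}".encode()
        return hmac.new(device_secret, message, hashlib.sha256).hexdigest()[:8]

    def verifier_check(received_code: str, device_secret: bytes,
                       candidate_event_states=(0, 1, 2), time_step: int = 60):
        # The verifier recomputes candidate codes for the current time window; a
        # match both authenticates the device and reveals the signaled event state.
        for state in candidate_event_states:
            expected = identity_authentication_code(device_secret, state, time_step)
            if hmac.compare_digest(expected, received_code):
                return state
        return None
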
C. Niwa (Ex. 1007)

39. Niwa discloses a fingerprint authentication device. Ex. 1007, 2:19-44. The fingerprint authentication device allows a user to conduct a commercial transaction using his fingerprint. Id.

V. CLAIM CONSTRUCTION

40. I understand that Petitioner has identified two terms that it alleges require construction. Pet. at 12-15. Petitioner's constructions of these terms do not impact my opinions in this declaration. However, as explained below, I disagree with Petitioner's proposed construction of "authentication information."

41. I understand that Petitioner does not provide an express construction of "to […] enable or disable use of the first handheld device based on a result of the comparison" in its Petition, but as explained below, a POSITA would have understood this phrase means "to expand the range of functionality available to the [first] user of the first handheld device based on one result of the comparison, and to reduce the range of functionality available to the [first] user of the first handheld device based on another result of the comparison."

A. "Authentication Information"

42. Every independent claim of the '826 patent recites "authentication information." I understand that Petitioner proposes to construe this term as "information used by the system to verify the identity of an individual." Pet. at 15. I disagree for several reasons.

43. First, Petitioner's reference to "the system" is confusing, especially in the context of method claims 10-20 and 30-35. Given that these claims do not recite any "system," it is unclear what "system" the proposed construction refers to. Ex. 1001, 45:30-46:20, 47:30-48:34. I understand this is improper.

44. Second, I understand that Petitioner's construction of "authentication information" would cover "biometric information" (see Pet. at 15), and that Petitioner asserts that retrieved or received "biometric information" may also constitute "authentication information" used by the first handheld device to authenticate a user. See id. at 20. This contravenes the plain language of the claims, which Petitioner and Dr. Shoup do not appear to address as part of their claim construction analysis. See id. at 15. I understand this is improper.

45. The actual claim language makes clear that "authentication information" and "biometric information" are distinctly different in this context. Indeed, these claim terms are separately recited within the very same claim limitations. See, e.g., Ex. 1001, 44:27-31 ("to authenticate a user of the first handheld device based on authentication information and to retrieve or receive first biometric information of the user of the first handheld device") (emphasis added). I understand this creates a presumption that authentication information means something different than biometric information.

46. Significantly, method claim 10 recites "retrieving or receiving first biometric information of the user" who has already been authenticated by the device "based on authentication information" (Ex. 1001, 45:32-36)—the claim requires performing the step of "authenticating…a user of the first handheld device as the first entity based on authentication information," and then performing the step of "retrieving or receiving first biometric information of the user." Id. at 45:32-39 (emphasis added throughout). A POSITA would have understood that the retrieved or received "biometric information" cannot also be the "authentication information" used to authenticate the user, since user authentication occurs before the "biometric information" is even retrieved or received.

47. Comparison of different claims further confirms that retrieved or received "biometric information" cannot also be the "authentication information" relied on by the first handheld device to authenticate a user. For example, claim 30 recites "authenticating…a first user of the first handheld device based on first biometric information provided by the first user." Ex. 1001, 47:31-48:2 (emphasis added). In contrast, claim 10 recites "authenticating…a user of the first handheld device as the first entity based on authentication information," and then "retrieving or receiving first biometric information of the user." Id. at 45:32-39 (emphasis added). Hence, a POSITA would have understood that while "biometric information" is used to authenticate the user in claim 30, the "authentication information" used to authenticate the user in claim 10 is different than the "biometric information" separately recited within the same claim.

48. Some claims also require that "authentication information" be determined from "biometric information." See, e.g., Ex. 1001, 45:38-39 ("determining a first authentication information from the first biometric information") (emphasis added). In this context as well, a POSITA would have understood that the "biometric information" cannot also be the "authentication information."

49. For all these reasons, I disagree with Petitioner's proposed construction and interpretation of "authentication information."

B. "To […] enable or disable use of the first handheld device based on a result of the comparison"

50. This phrase is recited in challenged claims 7, 14, 26, and 34. Consistent with the claim language and in view of the specification, I conclude that a POSITA would have understood the phrase "to […] enable or disable use of the first handheld device based on a result of the comparison" means "to expand the range of functionality available to the [first] user of the first handheld device based on one result of the comparison, and to reduce the range of functionality available to the [first] user of the first handheld device based on another result of the comparison."

51. The construction makes clear that the construed phrase requires that the first handheld device is capable of enabling use of the device and of disabling use of the device, depending on the situation. This is supported by the plain language of challenged claims 7, 14, 26, and 34, which all recite "enabl[ing] or disabl[ing] use of the first handheld device based on a result of the comparison" (of biometric information or authentication information, depending on the claim). Ex. 1001, 45:14-20, 45:60-64, 47:7-12, 48:24-28 (emphasis added); see also id. at 47:18-23 (reciting similar claim language with respect to a comparison performed by a second device). A POSITA would have understood from the claim language that the claims require the device to be capable of more than simply enabling use or not enabling use—enabling use and disabling use are two different responses the device must be able to perform based on different results of the recited comparison. Dr. Shoup appears to agree, as he attempts to address both enabling use and disabling use in his analysis of these claims. See, e.g., Ex. 1002, ¶¶ 115-116. However, as I discuss below, his analysis fails in other ways.

52. The construction is also supported by the specification, which discloses that enabling use and disabling use are different actions a user device may take based on the result of the comparison performed when the user attempts to authenticate himself to the device. This is addressed, for example, in connection with Figure 22:

[Figure 22A of the '826 patent (excerpt)]

Ex. 1001, Fig. 22A (excerpt).

53. In this embodiment, the first device performs a comparison at step 202 when "the first user of the first wireless device 2110 first authenticates his or herself to the wireless device 2110…by either entering a PIN…or by interacting with the biometric sensor." Ex. 1001, 29:65-30:3. Based on a failed comparison when "the user of the device does not enter the correct PIN number or does not match the biometric data," the device disables its use when it "shuts down at step 204" and optionally "delete[s] any portion of or all of the data stored in memory 2118 at step 206." Id. at 30:7-14. Dr. Shoup appears to agree that shutting down a device or deleting its data are examples of disabling use. See Ex. 2005 (Shoup Tr.), 173:11-24 ("if you shut down the device…that would disable the device…and deleting the data that would be yet another example").

54. Likewise, based on a successful comparison, the device enables its use to "identify and authenticate the identity of the first user" to a second device. Ex. 1001, 29:52-64, 30:46-31:16.

55. Enabling use or disabling use of the device based on different results of this comparison is also described elsewhere in the specification. See, e.g., Ex. 1001, 28:65-29:3 ("stored biometric data of the first entity[] is compared…with the detected biometric data to determine whether the first entity is enabled or should be disabled from using the first wireless device"); see also id. at 5:22-26, 5:36-40, 7:30-34, 7:37-42, 28:56-29:16, 29:65-30:14, 30:25-39, 34:50-35:6.

56. A POSITA would have understood from these disclosures in the specification that enabling use and disabling use are different responses the device must be able to perform based on different results of the recited comparison. Hence, in accordance with the claim language and in view of the specification, the construction requires that the first handheld device is capable of enabling use of the device based on one result of the recited comparison and of disabling use of the device based on another result of the recited comparison.

57. The construction also makes clear that enabling use of a device and disabling use of a device are both actions that change the state of the device. Hence, merely doing nothing and maintaining the existing state of the device does not constitute enabling use or disabling use of the device. Rather, enabling use means expanding the range of functionality available to the user beyond what was previously available, while disabling use means reducing the range of functionality available to the user to less than what was previously available.

58. This interpretation is supported by the plain language of challenged claims 7, 14, 26, and 34, which all recite "enable" and "disable" as verbs that connote an action changing the state of the device, not merely the absence of action or change in state. See Ex. 1001, 45:14-20, 45:60-64, 47:7-12, 48:24-28.

59. The construction is also supported by the specification. For example, as discussed above, the device may "shutdown" and/or "delete data" based on a failed comparison (Ex. 1001, Fig. 22, 30:3-14, 30:25-39), actions that reduce the range of functionality available to the user of the device. Or the device may provide the user with access to the device (id. at 34:52-57) or allow its use to "identify and authenticate the identity of the first user" to a second device (id. at Fig. 22, 29:52-64, 30:46-31:16), actions that expand the range of functionality available to the user of the device. See also id. at 5:22-26, 5:36-40, 7:30-34, 7:37-42, 28:56-29:16, 29:65-30:14, 30:25-39, 34:50-35:6. A POSITA would have understood from the claim language and these specification disclosures that enabling use means expanding the range of functionality available to the user, while disabling use means reducing the range of functionality available to the user. The construction captures this.

VI. MARITZEN IN VIEW OF JAKOBSSON AND NIWA DOES NOT RENDER THE CHALLENGED