Case 6:21-cv-01101-ADA Document 31-7 Filed 05/19/22 Page 1 of 31

EXHIBIT 7

IN THE UNITED STATES DISTRICT COURT
FOR THE WESTERN DISTRICT OF TEXAS
WACO DIVISION

AIRE TECHNOLOGY LTD.,
        Plaintiff,
v.
SAMSUNG ELECTRONICS CO., LTD. and
SAMSUNG ELECTRONICS AMERICA, INC.,
        Defendants.
Case No. 6:21-cv-955-ADA
JURY TRIAL DEMANDED

AIRE TECHNOLOGY LTD.,
        Plaintiff,
v.
APPLE INC.,
        Defendant.
Case No. 6:21-cv-1101-ADA
JURY TRIAL DEMANDED

AIRE TECHNOLOGY LTD.,
        Plaintiff,
v.
GOOGLE LLC,
        Defendant.
Case No. 6:21-cv-1104-ADA
JURY TRIAL DEMANDED

DECLARATION OF DR. JOHN R. BLACK, JR.
REGARDING INDEFINITENESS OF U.S. PATENT NO. 8,205,249
`
`
`
I, John R. Black, Jr., Ph.D., hereby declare and state as follows:

I. INTRODUCTION

1. I have been retained by Fish & Richardson P.C. on behalf of Defendants Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Samsung”) as an expert in connection with the above-captioned matter.

2. I understand that Aire Technology Ltd. (“Aire”) has alleged that Samsung infringes U.S. Patent No. 8,205,249 (“the ’249 patent”). I understand that Aire has also sued Apple Inc. (“Apple”) and Google LLC (“Google”) alleging infringement of the ’249 patent.

3. I have been asked to provide my opinion on how certain terms appearing in claims of the ’249 patent would be understood by a person of ordinary skill in the art (“POSITA”) in the field of the ’249 patent, which the patent itself identifies as “secure authentication of a user of a portable data carrier communicating with a terminal,”1 in the October 2002 time frame, based on my technical understanding of those terms in light of the intrinsic evidence (the patent and its file history) and extrinsic evidence (evidence other than the patent and its file history).

4. All emphases (such as bolding, underlining, or italics) in quotations herein are mine, unless otherwise stated.
`
A. Qualifications and Experience

5. My curriculum vitae is attached as Attachment A.

6. I am an Associate Professor of Computer Science at the University of Colorado, Boulder. I received a B.S. in Mathematics and Computer Science from the California State University at Hayward (now “California State University, East Bay”) in 1988. I received an M.S. in Computer Science in 1997, and a Ph.D. in Computer Science in 2000, both from the University of California at Davis (“UC Davis”).

1 ’249 patent at 1:9-10.
`
7. I have taught more than 60 classes in computer science, on subjects including data structures, algorithms, networking, operating systems, software engineering, security, cryptography, discrete mathematics, and quantum computing. I have authored or coauthored more than 20 publications, primarily on issues relating to computer security. I have been involved with computers for over 35 years in both commercial and academic capacities.

8. My earliest interest was in networks and security. My first memories in this regard were around 1975, when a group of friends and I learned about the telephone network and its security. A few years later, personal computers became available, and I spent most of my free time studying, programming, and modifying them. I worked extensively with various networking products throughout the 1980s, and developed an interest in cryptography soon thereafter. Although my undergraduate institution had no courses in cryptography or security in the 1980s, I decided to pursue self-study at the time, and opted to double major in Computer Science and Mathematics because cryptography is a blend of these two subject areas.

9. After earning my B.S. degree in 1988, I worked for six years at Ingres Corp. as a software developer. My work was primarily directed at transaction logging, data type support, and security.
`
10. In 1995, I began my Ph.D. at UC Davis under cryptographer Phillip Rogaway. My area of focus was cryptography and security, and my research involved encryption, authentication, hash functions, and network security. My Ph.D. thesis focused on authentication specifically, and portions of my thesis have been published as papers in top-level venues.

11. After graduation I took a position as an Assistant Professor at the University of Nevada at Reno. In the Fall of 2001, I taught the networking class there, which included coverage of Ethernet, interior gateway protocols, exterior gateway protocols, ARP, DHCP/BOOTP, IP, UDP, TCP, HTTP, SMTP, and other protocols. In 2001, a graduate student and I looked at the security of ARP and invented a new protocol, “AuthARP,” to add security to the protocol.

12. In 2002, I moved to the University of Colorado at Boulder, where I am currently employed. In the Fall of 2002, I co-designed and co-taught a new course called “Foundations of Computer and Network Security,” which covered security issues in both wired and wireless settings, mostly for public-facing network services including the world-wide web. I have taught this class seven more times since then, including modern topics such as wireless networking, the Internet of Things, and so forth.

13. In my career at the University of Colorado, I have published several more papers in the area of cryptography and network security. I have taught more than 30 courses in network security and cryptography, and have graduated several Ph.D. students in these areas. I have also served as a reviewer and referee for over 100 papers in the area of cryptography, including serving on more than 20 conference committees reviewing submissions to cryptography conferences. In 2009 I was the general chair of the CRYPTO conference.
`
14. I also worked as a consultant at times, often writing software on a contract basis. Although most projects are covered by NDAs, many involved computer security and cryptography.

15. In 2011, I began technical consulting for a local company called Cardinal Peak, which focuses primarily on video encoding and delivery systems. My work for Cardinal Peak has largely been directed to video encoding, transcoding, compression, encryption, and DRM.2 For example, I designed the security system for the Pro1 smart thermostat, implemented the DRM for the Yonder Music App, worked on 802.1X code for smart dog collars, and helped design the cryptography used in Fitbit devices for wireless transfer of a Fitbit watch to a phone or laptop.
`
16. In 2016, I took a leave of absence from the University of Colorado to start a company called “SecureSet” in Denver, Colorado. The objective of SecureSet is to take reasonably proficient technical people and turn them into computer and network security specialists via five months of intensive training. SecureSet was sold to WeWork in 2019 and continues to offer computer security classes today.

17. Additional details regarding my background, experience, and qualifications can be found in Attachment A.

B. Compensation

18. I am being compensated for my time at my usual consulting rate of $625 per hour, plus actual expenses. No part of my compensation depends on the outcome of this case or on the opinions that I render.

C. Materials Considered

19. In preparing this declaration, I have relied upon my education, knowledge, and experience. I reviewed, among other things, the following materials:

• Any materials cited herein;

• The ’249 patent and its file history;

• IPR2022-00875 and Dr. Shamos’s associated expert declaration;

2 Digital Rights Management, or DRM, involves the use of encryption and authentication to protect digital media such as software, music, movies, etc., against illicit copying and sharing.
`
`
`
• Certain definitions from the following dictionaries:

Dictionary | Shorthand
American Heritage Dictionary of the English Lang. (3d ed. 1992) | Amer. Heritage
Oxford English Dictionary (1961) | Oxford
Webster’s Third New Int’l Dictionary (2002) | Webster’s

• Publications including the following:

Publication | Shorthand
Lawrence O’Gorman, Securing Business’s Front Door – Password, Token, and Biometric Authentication (2002) | O’Gorman
Andrew Hutchinson & Marc Welz, Incremental Security in Open, Untrusted Networks, Future Trends in Distributed Computer Systems 151–154 (Nov. 1999) | Hutchinson & Welz
Gregory R. Ganger, Authentication Confidences (Apr. 28, 2001) | Ganger
Jalal Al-Muhtadi et al., A Flexible, Privacy-Preserving Authentication Framework for Ubiquitous Computing Environments, Proceedings 22nd International Conference on Distributed Computing Systems Workshops (July 2002) | Al-Muhtadi
Ross Anderson, Security Engineering (1st ed. 2001) | Anderson

II. UNDERSTANDING OF THE LAW
`
20. I am not an attorney or a patent agent and offer no opinions on the law. I have, however, been informed by counsel regarding various legal standards that may apply to this case, and I have applied those standards where necessary in arriving at my conclusions.

A. Level of a Person Having Ordinary Skill in the Art

21. I understand that patents are to be interpreted from the perspective of the person having ordinary skill in the art at the time of the invention (“POSITA”).

22. I have been informed that a POSITA is a hypothetical person who has full knowledge of all the pertinent prior art, and that courts may consider the following factors in determining the level of skill in the art: (1) type of problems encountered in the art; (2) prior art solutions to those problems; (3) rapidity with which innovations are made; (4) sophistication of the technology; and (5) educational level of active workers in the field.
`
B. Indefiniteness

23. I have been advised by counsel that the “definiteness requirement” of the patent laws of the United States requires that patent claims particularly point out and distinctly claim the subject matter which an inventor regards as the invention.

24. Counsel has advised me that whether any claim terms or phrases are indefinite should be determined from the perspective of a POSITA.

25. Counsel has also advised me that a patent is valid and its claims definite if they, when read in light of the specification and the prosecution history, inform, with reasonable certainty, a POSITA about the scope of the invention.

26. I have been informed and understand that a term may be found to be indefinite when the claim language is facially subjective or ambiguous, and the meaning of a term and/or scope of the claims is not “reasonably certain” to one of skill in the art.
`
III. BACKGROUND

A. The ’249 Patent

27. The ’249 patent discloses technology for facilitating an electronic transaction using a portable data carrier.3

28. According to the patent, prior methods for effecting secure electronic transactions were unable to account for and exploit the “quality” of the user authentication method used to effectuate the transaction.4

3 ’249 patent at 2:17-18.
4 ’249 patent at 1:36-43.
`
29. The “secure electronic transaction” can be “a transaction requiring the production of a digital signature on the part of [a] user,” and “[s]uch a transaction can be e.g., the effecting of a banking transaction by which the account of the user 30 is debited.”5

30. Figure 1 (reproduced below) depicts “the structure of a system for performing a digital signature” according to the patent.6

[Figure 1 of the ’249 patent]

31. The alleged invention includes a portable data carrier 20, a terminal 14, and a background system 10.7

5 ’249 patent at 2:24-28.
6 ’249 patent at Fig. 1.
7 ’249 patent at 2:16-24.
`
32. The “portable data carrier 20 ... is carried by a user 30 and set up to perform a security-establishing operation within a transaction, and a data record 40 which is to be handled securely within a transaction to be effected.”8

33. The system can support multiple types of authentication methods, including a knowledge-based method and a biometric-based method.9

34. According to the patent, “[t]he biometric method inherently constitutes [a] higher-quality” authentication method than a knowledge-based method “since [the biometric method] presupposes the personal presence of the user 30” and “this is not ensured in the knowledge-based method since the knowledge can have been acquired by an unauthorized user.”10

35. The patent states: “Specification of an authentication method can be effected automatically by the terminal 14 on the basis of information transmitted with the electronic document 40, but it can also be presented to the user 30 as a decision request via the display device 16.”11
`
B. Level of Ordinary Skill in the Art for the ’249 Patent

36. I understand that Samsung’s IPR petition expert, Dr. Shamos, opined that a POSITA at the time of the alleged invention would have had at least:

(1) a Bachelor’s degree in computer science, computer engineering, electrical engineering, or a related field; and, (2) in addition, one to two years of experience with digital authentication techniques, such as, for example, biometrics, digital signatures, passwords, and/or PIN numbers. Graduate education could substitute for professional experience, or significant experience in the field could substitute for formal education.

IPR2022-00875, Shamos Declaration, ¶ 22.

8 ’249 patent at 2:21-24.
9 ’249 patent at 4:6-8, 3:22-26.
10 ’249 patent at 3:29-33.
11 ’249 patent at 4:9-14.
`
37. Based on the factors set forth above in the “Understanding of the Law” section, in view of the ’249 patent and its prosecution history, in my opinion, this is an appropriate level of skill in the art at the time of the alleged invention.

38. I have been informed that the earliest priority date of the alleged invention of the ’249 patent is October 2002. At that time, my education and/or work experiences were consistent with the level of skill in the art discussed in the preceding paragraphs.

C. Technology Background

39. In the security world, user authentication is performed to limit access to certain assets, including but not limited to, computer networks (such as secure banking websites) and physical locations (such as storage facilities for sensitive physical documents or materials).12

40. There are three generally accepted bases for performing user authentication: (1) knowledge-based—what the user “knows;” (2) token-based—what the user “has;” and (3) biometric-based—what the user “is.”13
`
41. In most systems, where the transaction or access at issue is performed remotely (outside the physical presence of the entity supplying access), these authentication methods work as a proxy for authenticity—they “are only derivative indicators of authenticity; they are not ‘who you are,’ but a mutually agreed device or protocol to act as evidence that you are who you say you are.”14 Thus, all of these authentication methods are susceptible to fraud, without any guarantee that the user is who they say they are.15

12 O’Gorman at 1.
13 O’Gorman at 1; Hutchinson & Welz at 152; Ganger at 3.
14 O’Gorman at 3.
15 See O’Gorman at 12 (“One can see that a well-chosen password can be even stronger than a token (with 12-digits) and much better than a fingerprint.”).
`
42. For knowledge-based authentication, the user presents a secret, known only to the user, which is subsequently validated against known data. For example, the user could present a password or Personal Identification Number (“PIN”). Such secrets can be of varying degrees of length or complexity, with longer and more complex secrets being preferred over shorter, less complex secrets, as the latter are more vulnerable to attack by guessing.
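To illustrate the validation step described above, the following is my own simplified sketch (not drawn from the ’249 patent): a typical verifier stores only a salted hash of the secret, and validates a presented password or PIN by re-deriving and comparing the hash.

```python
import hashlib
import hmac
import os

# Illustrative sketch of knowledge-based authentication: the verifier never
# stores the secret itself, only a salted hash, and validates what the user
# presents against that stored data. All names here are my own, for example only.

def enroll(secret: str) -> tuple[bytes, bytes]:
    """Store a random salt and a salted hash of the user's secret."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)
    return salt, digest

def authenticate(presented: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the hash from the presented secret; compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", presented.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

salt, stored = enroll("1234")  # a short PIN: easy to remember, easy to guess
assert authenticate("1234", salt, stored)
assert not authenticate("4321", salt, stored)
```

A four-digit PIN such as the one above has only 10,000 possibilities, which is why longer, more complex secrets are preferred.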
`
43. Knowledge-based authentication carries the drawback that the secret is susceptible to compromise through guessing or data mining techniques. For example, the user could have a password that includes a pet’s name and an important date. In such a case, an unauthorized individual could obtain the secret by mining data from the user’s public social media posts or profiles.

44. On the other hand, a secret is not a physical device like a car key that could be misplaced and merely “picked up” by an unauthorized individual.

45. Token-based authentication requires that the user be in possession of some physical object. In this context, an example would be a chip-enabled smart card, which the user presents to authenticate herself. Once the user presents an acceptable object, she is afforded access. Other examples include a fob or a “one-time password generator” that produces a rolling PIN each time a button is pressed on the device.
`
46. Token-based authentication is not susceptible to compromise in the same way as knowledge-based authentication. For example, an unauthorized individual cannot guess a physical object into existence—nor is that individual likely to successfully replicate the object without having a physical copy of it. On the other hand, an unauthorized individual need only obtain the object—for example, by retrieving a chip-enabled smart card from an unattended reader—to gain access.
`
47. Biometric authentication methods rely on the measurement of physiological or anatomical characteristics, behavioral characteristics, or a combination of these. In particular, authentication with biometrics involves the comparison of measured physiological data in situ with stored data. For example, a user’s fingerprint consists of natural patterns that can be measured. When the user presents her fingerprint to the scanner, the measured patterns are compared to a set of stored data to determine the statistical likelihood that the presented fingerprint matches the stored data.

48. In other words, biometric authentication is essentially pattern recognition. Because of this, the security of the authentication is highly dependent upon the complexity of the pattern used for comparison. For example, a fingerprint technique that requires twelve or more matching points is less likely to produce a false positive than one that requires only five matching points.
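The effect of the matching-point threshold can be seen in a toy sketch of my own (real minutiae matching is far more involved; the coordinates here are invented purely for illustration):

```python
# Toy illustration of biometric matching as pattern recognition: a stored
# template is a set of feature points, and a scan "matches" when at least a
# required number of presented points coincide with the template. Raising the
# required number of matching points lowers the false-positive risk.

def count_matches(stored: set, scanned: set) -> int:
    """Number of feature points common to the template and the scan."""
    return len(stored & scanned)

def is_match(stored: set, scanned: set, required_points: int) -> bool:
    return count_matches(stored, scanned) >= required_points

template = {(10, 4), (22, 7), (31, 15), (40, 9), (55, 21), (63, 30)}
impostor = {(10, 4), (22, 7), (31, 15), (40, 9), (80, 80), (90, 90)}

# Four of the impostor's points coincide with the template by chance.
assert is_match(template, impostor, required_points=4)      # lax: false positive
assert not is_match(template, impostor, required_points=5)  # stricter: rejected
```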
`
49. An advantage of using biometric authentication is that it does not rely upon the user remembering anything (e.g., a secret or to keep a chip card on her person). In addition, biometric features are difficult to “spoof” or “steal” in the same sense as one could steal a written-down password or a chip card. As discussed in the following paragraph, biometrics are not without vulnerability.

50. On the other hand, if a biometric feature is “stolen”—for example, by obtaining the underlying data set to be used for comparison, or through reproduction such as a photo or replication of a fingerprint, face, or retina16—the user cannot change that feature and must cease using it. Whereas a compromised password or stolen fob can be changed or replaced, a user’s retina or fingerprint is essentially unalterable. Biometric authentication methods also have a high “false negative rate,” meaning they have a propensity to falsely reject an authorized user, which can be more frustrating than forgetting a password or PIN (which can be reset) or misplacing a token (which can be replaced). The reverse is also possible—where an unauthorized user is incorrectly granted access (a false positive). The rate of false positives and false negatives depends on the specifics of the biometric method used (e.g., the specific techniques for measuring and matching the biometric feature) and on the particular implementation. Properly balancing the false positive rate and false negative rate in 2002 was a costly and difficult proposition; in 2022 much progress has been made but challenges still remain.17

16 Modern biometric devices include checks for “liveness” to ensure the measurement is being taken from an actual living human being to forestall these kinds of attacks.
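The balancing described above can be illustrated with invented match scores: raising the acceptance threshold trades false positives for false negatives, and vice versa.

```python
# Illustrative (invented) similarity scores for five authorized users and
# five impostors, showing the false-positive / false-negative trade-off:
# a lax threshold accepts impostors; a strict one rejects genuine users.

genuine_scores  = [0.90, 0.85, 0.60, 0.95, 0.55]   # authorized users
impostor_scores = [0.20, 0.65, 0.30, 0.10, 0.70]   # unauthorized users

def rates(threshold: float) -> tuple:
    """Return (false-positive rate, false-negative rate) at a given threshold."""
    fpr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    fnr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return fpr, fnr

assert rates(0.50) == (0.4, 0.0)  # lax: no false rejects, 2 of 5 impostors pass
assert rates(0.80) == (0.0, 0.4)  # strict: no impostors, 2 of 5 genuine rejected
```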
`
51. Some biometric methods are uniquely vulnerable in that they require no positive, volitional action from the user. For example, a sleeping user’s face may be used by an unauthorized individual to access a system secured by facial recognition. Additionally, an incapacitated or otherwise unconscious user’s thumb can be placed onto a fingerprint scanner, granting access to an unauthorized person.

52. Modern security architecture also relies on combinations of the three categories. This is often referred to as Multi-factor Authentication (“MFA”). For example, when I log into my bank’s website, I have to enter my password and a PIN that the bank sends to my phone. For another website, I have to enter both my password and a code from a device I use to generate one-time PINs called an “authenticator.” In the latter example, the authentication is performed by a combination of knowledge (my password) and token (the authenticator).
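The one-time-PIN “authenticator” mentioned above can be sketched in the HOTP style of RFC 4226 (shown only as an illustration, with an illustrative shared secret): the device and the verifier share a secret key, and each button press advances a counter that produces a fresh PIN.

```python
import hashlib
import hmac
import struct

# Sketch of an HOTP-style one-time PIN generator (RFC 4226). The shared
# secret embedded in the device is the "token" factor; the PIN it emits
# changes with every counter increment, so an observed PIN cannot be reused.

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

secret = b"12345678901234567890"       # RFC 4226 test-vector secret
print(hotp(secret, 0))                 # → 755224 (RFC 4226 test vector)
```

The verifier runs the same computation with the same secret and counter, so a matching PIN is evidence of possession of the device.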
`
53. Similarly, biometric and knowledge-based methods can also be combined. For example, a system could require that a user present her fingerprints in a particular order (e.g., right thumb, left index, right pinky, right index). The user’s fingerprints represent a biometric method, while the sequence is a secret known only to the user.

17 For example, my phone recognizes my fingerprint perhaps 20% of the time despite my attempts to calibrate it.

54. A combination of methods generally produces a stronger authentication by reducing the risks associated with the individual methods. For example, where the user is required to recite a secret and present a token, the likelihood that an unauthorized user has obtained both the secret and the token is reduced. In fact, the use of multiple factors can help detect access attempts by unauthorized users (e.g., presenting the correct card but the incorrect PIN). Of course, using multiple authentication methods to reduce false positive rates means that the drawbacks of multiple methods may combine to create new problems; for example, combining two methods with high false negative rates could result in an even higher false negative rate that would be unacceptable from a security perspective.
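A back-of-the-envelope calculation (with numbers invented purely for illustration) shows why requiring both factors reduces the likelihood of compromise:

```python
from fractions import Fraction

# If an attacker has, say, a 1-in-100 chance of learning the secret and a
# 1-in-50 chance of obtaining the token, and the two compromises are
# independent, the chance of having BOTH is the product of the two.

p_secret = Fraction(1, 100)   # chance the secret is compromised (invented)
p_token = Fraction(1, 50)     # chance the token is obtained (invented)

p_both = p_secret * p_token   # assumes independence of the two events
assert p_both == Fraction(1, 5000)
```

The same multiplication explains the drawback noted above: probabilities of false rejection can likewise compound when multiple factors must all succeed.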
`
55. Even within a particular category, the specific authentication methods have distinct advantages and disadvantages. For example, fingerprints and voice recognition are both forms of biometric authentication.18 Fingerprints have the advantage of being independent of the environment and generally unchanging over time, whereas a user’s voice can sound different based on the environmental acoustics, background noise, illness, or the user’s stress level. Voice recognition has the advantage that the reader (e.g., a telephone) is effectively ubiquitous, whereas fingerprint readers are not as widely available. In addition, fingerprints can be used without positive volitional action by the user (e.g., a sleeping or incapacitated user’s thumb can be placed on a reader), whereas voice recognition requires the user to take positive action by speaking (though she may be tricked in various ways to use her voice in ways she may not intend).

18 See O’Gorman at 5.

56. The following table provides a brief summary of the advantages and disadvantages of different authentication methods.19

[Table reproduced from O’Gorman, Table 2]
57. As discussed in the preceding paragraphs, there are advantages and disadvantages to each category of user authentication. Whether or not one is of higher or lower quality from a security perspective depends on a number of factors including security requirements, limits to the technology used, security setting, security goals, acceptable risks, and available resources, as well as the details about the particular method used.

58. There is not now, nor was there at the time of the alleged invention, a generally accepted standard measurement for the “quality” of a given authentication method from a security perspective that holds universally true in every setting. Rather, the appropriate method depends on the specific demands of the application in question.20 Whether a given method is appropriate is more akin to a balancing of trade-offs against system security requirements than to a hierarchy of methods ranked in quality from a security perspective that remain the same irrespective of the system under consideration.

19 O’Gorman, Table 2.
20 See Al-Muhtadi et al. at 1.
`
IV. INDEFINITENESS OPINION

“an inherently relatively lower quality and an inherently relatively higher quality from a security perspective”

59. In my opinion, the term “an inherently relatively lower quality and an inherently relatively higher quality from a security perspective” is indefinite because a POSITA would not have been able to understand the scope of that term, and the claims in which it appears, with reasonable certainty.21

60. A POSITA would not have been able to discern whether user authentication methods are of an “inherently relatively” higher or lower quality from a security perspective, and the ’249 patent provides insufficient guidance for making that determination.

A. A POSITA Would Not Have Understood User Authentication Methods To Be of Inherently Higher Or Lower Quality Than Another

61. A POSITA would not have been able to determine whether one authentication method was of “inherently relatively” lower or higher “quality” than another without more information. Rather, the perceived relative quality of an authentication method is both viewpoint dependent and context specific.
`
62. Context is of paramount importance in developing a security system. Before determining what mechanism (e.g., authentication method) is used to enforce a security policy, the context must be determined. The context includes the goals of the system, the types and numbers of users involved, the threat model, acceptable risk, and many other factors. These issues are discussed in detail in Security Engineering by Ross Anderson. One of Anderson’s teachings is foundational, and is a point that I have made many times in my classes: “many systems fail because their designers protect the wrong things, or protect the right things but in the wrong way.”22

63. As discussed in Section III.C above, each of the three user authentication categories (knowledge-based, token-based, and biometric-based) offers its own distinct advantages and disadvantages.

64. An “inherent” quality is intrinsic; it is an “essential constituent or characteristic.”23 Taken together with the discussion in Section III.C above, it is my opinion that no authentication method is “inherently” of higher or lower quality from a security perspective.

65. Moreover, “relative” in the context in which it is used in the claim involves a comparison between two or more things.24 Thus for an authentication method to be “inherently relatively” of higher or lower quality from a security perspective, it must possess some “inherent” quality level that, when assessed against a different authentication method, would lead a POSITA to conclude that one method is “inherently” higher quality from a security perspective. In other words, when comparing two different authentication methods, a POSITA would somehow need to be able to assess which of them is superior or inferior based solely on inherent attributes of the methods, without considering any other outside factors related to the security system where these methods would be used.

21 As discussed in Section II.B, I understand from counsel that this is the standard for indefiniteness.
22 Anderson at 4.
23 Amer. Heritage at 928. See also, for example: Oxford at 293 (defining inherent as “[e]xisting in something as a permanent attribute or quality”); Webster’s at 1163 (defining inherent as “structural or involved in the constitution or essential character of something”).
24 See, for example: Amer. Heritage at 1523 (defining relative as “[c]onsidered in comparison with something else”); Oxford at 399 (defining relative as “[a]rising from, depending on, or determined by, relation to something else or to each other; comparative”); Webster’s at 1916 (defining relative as “a thing having a relation to or connection with or necessary dependence upon another thing”).
`
66. It is my opinion that, because there is no category of authentication method—or specific authentication method—that has an inherently higher or lower quality from a security perspective, it would not be possible for a POSITA to determine, with reasonable certainty, whether any user authentication method is of “inherently relatively” higher or lower quality from a security perspective.

B. The ’249 Patent Does Not Provide a POSITA With Sufficient Guidance to Compare the Inherent Quality of User Authentication Methods

67. The ’249 patent’s specification offers just one example comparing authentication methods from a security perspective: “The biometric method inherently constitutes the higher-quality one here, since it presupposes the personal presence of the user 30[.]”25

68. The specification further asserts that the “personal presence of the user 30” is “not ensured in the knowledge-based method since the knowledge can have been acquired by an unauthorized user.”26

69. The specification’s single example leaves a great deal of uncertainty for the POSITA. For example, a POSITA would not be able to determine whether a token-based authentication method is of higher or lower quality than a knowledge-based method. In addition, the specification provides no guidance to aid a POSITA in comparing authentication methods within a given category—for example, fingerprint vs. facial recognition. The same uncertainty applies to comparing multi-factor authentication (“MFA”) methods, where multiple authentication methods are used in combination. Properly evaluating “quality from a security

25 ’249 patent at 3:29-33.
26 ’249 patent at 3:29-33. See also ’249 patent at 3:58-62 (“[T]he chip card 20 supports two authentication methods, namely a PIN check as a knowledge-based, inherently low-quality method, and a fingerprint check as a biometric, inherently higher-quality m