`Authentication Confidences
`
`Gregory R. Ganger
`ganger@ece.cmu.edu
`April 28, 2001
`
`April 2001
`CMU-CS-01-123
`
`School of Computer Science
`Carnegie Mellon University
`Pittsburgh, PA 15213
`
`Abstract
`
"Over the Internet, no one knows you're a dog," goes the joke. Yet, in most systems, a password submitted over the
Internet gives one the same access rights as one typed at the physical console. We promote an alternate approach to
authentication, in which a system fuses observations about a user into a probability (an authentication confidence)
that the user is who they claim to be. Relevant observations include password correctness, physical location, activity
patterns, and biometric readings. Authentication confidences refine current yes-or-no authentication decisions,
allowing systems to cleanly provide partial access rights to authenticated users whose identities are suspect.
`
We thank the members and companies of the Parallel Data Consortium (at the time of this writing: EMC Corporation, Hewlett-Packard Labs,
Hitachi, IBM Corporation, Intel Corporation, LSI Logic, Lucent Technologies, Network Appliances, Panasas, Inc., Platys Communications,
Seagate Technology, Snap Appliances, Sun Microsystems and Veritas Software Corporation) for their insights and support.
`
`Keywords: security, authentication, biometric authentication, system access.
`
`2
`
`DEF-AIRE-EXTRINSICOOO00024
`
`
`
`Case 6:21-cv-01101-ADA Document 31-21 Filed 05/19/22 Page 4 of 9
`
1. The Case for Authentication Confidences
`
Access control decisions consist of two main steps: authentication of a principal's digital identity and authorization
of the principal's right to perform the desired action. Well-established mechanisms exist for both. Unfortunately,
authentication in current computer systems results in a binary yes-or-no decision, building on the faulty assumption
that an absolute verification of a principal's identity can be made. In reality, no perfect (and acceptable) mechanism
is known for digital verification of a user's identity, and the problem is even more difficult over a network. Despite
this, authorization mechanisms accept the yes-or-no decision fully, regardless of how borderline the corresponding
authentication. The result is imperfect access control.
`
This white paper promotes an alternative approach in which the system remembers its confidence in each
authenticated principal's identity. Authorization decisions can then explicitly consider both the "authenticated"
identity and the system's confidence in that authentication. Explicit use of authentication confidences allows case-
by-case decisions to be made for a given principal's access to a set of objects. So, for example, a system
administrator might be able to check e-mail when logged in across the network, but not be able to modify sensitive
system configurations. The remainder of this section discusses various causes of identity uncertainty and existing
mechanisms for dealing with it. The following section discusses how authentication confidences might be added to
systems.
`
`1.1. Human identification and confidence
`
`In current computer systems, authentication of a user's digital identity relies on one or more mechanisms from three
`categories:
`
• Something one knows. The concept here is that if the user knows a pre-determined secret, it must be the right
person. The common type of secret is a password, though other schemes like images [5] and patterns are being
explored. The conventional wisdom is that since it is a secret, no additional information about the likelihood of
true identity is necessary or available. We disagree. For example, a system's confidence in the provided
password could certainly depend upon the location of its source: the likelihood of an impostor providing my
password from my office is much lower than the likelihood of them providing it over the network (especially
from the Internet or the dormitories). As well, a gap of idle time between when the password was provided and
a session's use might indicate that the real user has left their workstation and an intruder has taken the
opportunity to gain access.
`
• Something one has. The concept here is that if a user has a pre-configured item, it must be the right person.
The common item is some kind of smart card or ID badge. The conventional wisdom is that anyone who has the
token should have full access and that no other information is needed. Again, we disagree. As with the
password example, location of token and time since session use can both affect the confidence that a system
should have in the corresponding authentication. More radical out-of-band information, such as the owner's
expected location based on scheduled appointments, could also provide insight.
`
• Something one is. The concept here is that the system compares measured features of the user to pre-recorded
values, allowing access if there is a match [1]. Commonly, physical features (e.g., face shape or fingerprint) are
the focus of such schemes, though researchers continue to look for identifying patterns in user activity.
Identifying features are boiled down to numerical values called "biometrics" for comparison purposes.
Biometric values are inherently varied, both because of changes in the feature itself and because of changes in
the measurement environment. For example, facial biometrics can vary during a day due to acne appearance,
facial hair growth, facial expressions, and ambient light variations. More drastic changes result when switching
between eyeglasses and contact lenses or upon breaking one's nose. Similar sets of issues exist for other
physical features. Therefore, the decision approach used is to define a "closeness of match" metric and to set
some cut-off value: above the cut-off value, the system accepts the identity, and below it, not. When setting
the cut-off value, an administrator makes a trade-off between the likelihood of false positives (allowing the
wrong person in) and false negatives (denying access to the right person). Figure 1 illustrates this process and
the corresponding trade-off. Note that we are not suggesting elimination of the cut-off. Instead, we are
suggesting that the amount by which the observed value exceeds this cut-off be remembered as part of
confidence, as sketched below.
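
As a concrete illustration of remembering the margin above the cut-off, the sketch below maps a closeness-of-match
score to a confidence value. It is a minimal Python sketch; the cut-off of 0.6 and the 50-100% scaling are our
assumptions for illustration, not values prescribed above.

    def biometric_confidence(closeness, cutoff=0.6, full_match=1.0):
        """Map a closeness-of-match score to an authentication confidence.

        Scores below the administrator-chosen cut-off are rejected outright;
        scores above it earn confidence in proportion to their margin over
        the cut-off. The cut-off and scaling here are illustrative only.
        """
        if closeness < cutoff:
            return None  # below the cut-off: identity rejected
        margin = (closeness - cutoff) / (full_match - cutoff)
        return 50.0 + 50.0 * margin  # barely passed -> 50%, perfect match -> 100%

    # A barely-passing score yields low confidence; a strong match, high.
    print(biometric_confidence(0.61))  # ~51.2
    print(biometric_confidence(0.95))  # ~93.8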
`
`8
`
`6
`
`4
`
`〇
`①
`一3匕
`〇
`一」으
`も①
`「①
`匕 ①
`〇
`
`으3 丄
`
`2
`
`cu
`
`Thr&shjd C
`
`'~~f--------------
`/Threshold B
`
`Threshold A
`
`—User
`-----impostor
`
`β
`
`5
`
`4
`
`s 믄①으느 。
`
`0.2
`
`0.4
`0.6
`Measurement
`(a) 에oseness and cut-off points
`
`0.8
`
`1
`
`0.2
`0.4
`0.6
`False Acceptance Rate
`(b) false accept/reject trade-off
`
`0.8
`
`1
`
Figure 1: Illustrative example of closeness-of-match thresholds for biometric-based authentication and the
corresponding trade-off between false acceptance rate and false rejection rate. On the left, (a) shows possible
distributions of closeness values for a user and an impostor. Notice that each cut-off threshold will sometimes reject
the real user and sometimes accept the impostor. Specifically, at a given cut-off threshold, false accepts are to the
right of the dashed line and false rejects are to the left of the solid line. As biometric accuracy improves, the area
beneath the user's distribution will increase and that beneath the impostor's curve will decrease. On the right, (b)
illustrates the trade-off between false acceptance rate and false rejection rate more directly with the common
"Receiver Operating Characteristic" curve. Better biometric accuracy would reduce the space beneath this curve.
`
`4
`
`DEF-AIRE-EXTRINSIC00000026
`
`
`
`Case 6:21-cv-01101-ADA Document 31-21 Filed 05/19/22 Page 6 of 9
`
Confidence in identity can be enhanced by combining multiple mechanisms. The simplest approach is to apply the
mechanisms independently and then combine their resulting confidences, but more powerful fusing is also possible.
For example, merged lip reading and speech processing [3] can be better than either alone. Likewise for password
checks and typing patterns [8]. Note that if the outcomes conflict, this will reduce confidence, but will do so
appropriately.
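
As a minimal sketch of the simplest combination, the code below treats each mechanism's confidence as independent
evidence and multiplies odds (a naive-Bayes style rule that we assume here for illustration; no particular fusion
rule is prescribed above):

    def fuse_confidences(confidences):
        """Fuse per-mechanism confidences (0-100) into a single value.

        Multiplying per-mechanism odds means agreeing high confidences
        reinforce one another, while a conflicting low confidence pulls
        the fused value down, as the text says it should.
        """
        odds = 1.0
        for c in confidences:
            p = min(max(c / 100.0, 0.01), 0.99)  # clamp away from 0 and 1
            odds *= p / (1.0 - p)
        return 100.0 * odds / (1.0 + odds)

    # Agreement reinforces: 80% and 80% fuse to about 94%.
    # Conflict reduces: 80% and 30% fuse to about 63%.
    print(fuse_confidences([80, 80]), fuse_confidences([80, 30]))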
`
1.2. Propagating identity among distributed systems
`
Solid mechanisms exist for propagating authenticated identity across systems [2], but they assume that intermediary
nodes can be trusted to SPEAK-FOR the original party appropriately. Recent work [4] refines this SPEAK-FOR logic to
put limits on what a remote system can say as the principal, but any statements within the set are still made as
though 100% confident in identity. Arguably, the more nodes through which the SPEAK-FOR relationship is passed,
the lower the confidence value associated with it should be. This is particularly true when the security of any of
those nodes is questionable.
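
One simple way to realize such attenuation is sketched below, using a multiplicative per-node trust factor of our
own devising (it is not a mechanism from [2] or [4]):

    def delegated_confidence(initial_confidence, node_trust_factors):
        """Attenuate a confidence as SPEAK-FOR passes through intermediaries.

        Each intermediary node multiplies the confidence by a per-node trust
        factor in [0, 1]; a node of questionable security gets a low factor.
        """
        confidence = initial_confidence
        for trust in node_trust_factors:
            confidence *= trust
        return confidence

    # 90% confidence through two well-trusted nodes (0.95 each) and one
    # questionable node (0.6) drops to about 49%.
    print(delegated_confidence(90.0, [0.95, 0.95, 0.6]))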
`
`1.3. Alternatives to authentication confidences
`
Few alternatives exist to allow a given user differing degrees of access depending on the quality of authentication.
One option is simply not to allow access when sufficient confidence cannot be gained; this is still not an
uncommon practice in commercial settings, where traveling employees are denied access to systems on internal
company networks. This tends to reduce productivity and induce insecure workarounds. An alternate approach is to
provide users with several digital personalities, giving different access rights to each. This approach creates
management headaches and results in security problems when users choose the same password for all accounts.
We think that, properly used, authentication confidences provide a flexible, intuitive mechanism for expressing to
systems the varying levels of access for a given user.
`
`1.4. Authentication confidences in the real world
`
Authentication confidences are common practice in the real world. For example, a sentry might allow someone
dressed in the right uniform to approach, but then deny entry if the right passphrase is not given. A civilian example
is practiced by credit card companies, which may block suspect purchases or attempt to increase their confidence by
calling the card owner. Given their proven usefulness in the real world, we think explicit use of authentication
confidences in computer authorization decisions is worth exploring.
`
`2. Using Authentication Confidences
`
There are two main issues with incorporating authentication confidences into systems: establishing confidence
values and using them. The latter is simpler, in theory, so we will discuss it first. To retain confidence information,
the "user ID" normally associated with a session should be paired with a corresponding confidence value. Likewise,
any authorization structure [7] (e.g., an ACL entry) should be extended to keep a confidence value with each user ID
field. To allow access, an authorization check verifies two things: that the authenticated user ID matches and that the
measured confidence matches or exceeds the recorded requirement. Capabilities are somewhat more difficult than
ACLs, since a system's confidence in a given authentication could drop after the corresponding session was granted
a capability. However, if the required confidence were recorded in the capability, then a usage-time check could be
made to ensure proper access control.
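
A minimal sketch of such an extended authorization check follows; the types, names, and the confidence values in
the example are ours, chosen to echo the administrator example of Section 1:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Session:
        user_id: str
        confidence: float  # authentication confidence, 0-100

    @dataclass
    class AclEntry:
        user_id: str
        required_confidence: float  # minimum confidence for this right

    def authorize(session: Session, acl: List[AclEntry]) -> bool:
        """Allow access only if an ACL entry names the authenticated user
        and the session's confidence meets or exceeds its requirement."""
        return any(entry.user_id == session.user_id
                   and session.confidence >= entry.required_confidence
                   for entry in acl)

    # An administrator authenticated remotely (60% confidence) can read
    # mail (requires 50%) but cannot modify system configuration (90%).
    admin = Session("admin", 60.0)
    print(authorize(admin, [AclEntry("admin", 50.0)]))  # True
    print(authorize(admin, [AclEntry("admin", 90.0)]))  # False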
`
Note that migration to authentication confidences can be straightforward and incremental. Current systems
essentially use a global value for all decisions. Systems can continue to have a global default, but also provide
support for specifying specific confidence requirements. When an authorization decision is made, the authorization
mechanism can check both the identity and the confidence in that identity, independent of whether the required
confidence is a default or a specific value.
`
Now, we discuss the more difficult issue of establishing the confidence value. This issue requires an algorithm for
converting the system's relevant observations into a value. We currently envision authentication confidences as
percentage values between 0 and 100, as in "I am 90% sure that this is Fred." Doing this will involve pre-configured
confidence value settings for binary observations (e.g., password checks and local/remote console). More continuous
values, such as biometric "closeness of match" comparisons, can be included mathematically. Table 1 gives an
example, and the sketch after the table shows one way to encode it.
`
Password-Based Information       Biometric-Based Information    Authentication Confidence
-------------------------------  -----------------------------  -------------------------
Failed check                     Don't care                     No access granted
Passed check (remote console)    No check                       60% confidence
Passed check (local console)     No check                       80% confidence
Passed check (local console)     Failed check                   30% confidence
Passed check (local console)     Barely passed check            85% confidence
Passed check (local console)     Strongly passed check          95% confidence
`
Table 1: Illustrative example of authentication confidence values for a hypothetical system. This system always
requires a successful password check. Its confidence in user identity is also improved by using the local console and
by passing a biometric-based identity check at the local console. Failing the biometric check reduces authentication
confidence, but does not reject the user entirely; so, clearly, the administrators of this system do not fully trust
biometric-based authentication. The highest confidence is reached when all signs are positive.
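
The rules of Table 1 can be encoded directly as a lookup over the two observations. The sketch below does so; the
observation labels are our own hypothetical names for the table's rows:

    # Map (password observation, biometric observation) to a confidence
    # percentage; a failed password check grants no access regardless of
    # the biometric result ("don't care").
    CONFIDENCE_TABLE = {
        ("passed-remote", "no-check"): 60,
        ("passed-local", "no-check"): 80,
        ("passed-local", "failed"): 30,
        ("passed-local", "barely-passed"): 85,
        ("passed-local", "strongly-passed"): 95,
    }

    def establish_confidence(password_obs, biometric_obs):
        if password_obs == "failed":
            return None  # no access granted
        return CONFIDENCE_TABLE[(password_obs, biometric_obs)]

    print(establish_confidence("passed-local", "strongly-passed"))  # 95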
`
As suggested earlier, a system's confidence in a given authentication may vary over the duration of a session. There
are several possible causes of such variation. For example, a lengthy period of idle time may reduce confidence due
to the potential of another person sitting down at an abandoned workstation. On the other hand, the system may
observe positive signs of identity, such as activities common to the user in question (for example, the author looks at
the Dilbert web page early each day). Continuous biometric authentication checks (as could be done with video-
based checks) could also increase or decrease the system's confidence in a user's claimed identity [6]; this would be
particularly true if the video camera observes the original user leaving the workstation. Finally, if the confidence
drops too low, the system could insist that the user re-authenticate before continuing the session.
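
A sketch of such session-lifetime adjustment appears below; the decay rate, boost size, and re-authentication
threshold are assumed values for illustration:

    import time

    REAUTH_THRESHOLD = 40.0  # assumed: below this, force re-authentication

    class MonitoredSession:
        """Track a session whose authentication confidence varies over time."""

        def __init__(self, user_id, confidence):
            self.user_id = user_id
            self.confidence = confidence
            self.last_activity = time.time()

        def on_idle_check(self, decay_per_minute=2.0):
            # A long idle gap lowers confidence: someone else may have sat
            # down at the abandoned workstation.
            idle_minutes = (time.time() - self.last_activity) / 60.0
            self.confidence = max(0.0, self.confidence - decay_per_minute * idle_minutes)

        def on_positive_sign(self, boost=5.0):
            # Activity typical of the user, or a passing continuous
            # biometric check [6], raises confidence (capped at 100).
            self.confidence = min(100.0, self.confidence + boost)
            self.last_activity = time.time()

        def must_reauthenticate(self):
            return self.confidence < REAUTH_THRESHOLD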
`
`6
`
`DEF-AIRE-EXTRINSICOOO00028
`
`
`
`Case 6:21-cv-01101-ADA Document 31-21 Filed 05/19/22 Page 8 of 9
`
`3. Summary
`
Authentication confidences are an interesting approach to embracing (rather than hiding) the uncertainty inherent to
authentication decisions. By explicitly subsetting the privileges of a principal based upon authentication confidence,
one could more cleanly handle the difficulties involved with specifying access rights that vary based on how the
principal authenticates to the system. Clearly, experience is needed to determine exactly how best to realize
authentication confidences in practice, but we believe that the notion is worth exploring.
`
`7
`
`DEF-AIRE-EXTRINSICOOO00029
`
`
`
`Case 6:21-cv-01101-ADA Document 31-21 Filed 05/19/22 Page 9 of 9
`
`References
`
[1] Biometrics, IEEE Computer, February 2000.
`
[2] Michael Burrows, Martin Abadi, and Roger Needham. A logic of authentication. ACM Transactions on
Computer Systems, 8(1):18-36, February 1990.
`
[3] Tsuhan Chen. Audio-Visual Speech Processing,
http://amp.ece.cmu.edu/projects/AudioVisualSpeechProcessing/.
`
[4] J. Howell and D. Kotz. End-to-end authorization. Symposium on Operating Systems Design and
Implementation. San Diego, CA, 23-25 October 2000, pages 151-164. USENIX Association, 2000.
`
[5] Ian Jermyn, Alain Mayer, Fabian Monrose, Michael K. Reiter, and Aviel D. Rubin. The design and analysis of
graphical passwords. 8th Security Symposium. Washington DC, 23-26 August 1999, pages 1-14. USENIX
Association, 1999.
`
[6] Andrew J. Klosterman and Gregory R. Ganger. Secure Continuous Biometric-Enhanced Authentication. CMU-
CS-00-134. Carnegie Mellon University School of Computer Science Technical Report, May 2000.
`
`[7] B. W. Lampson. Protection. Princeton Symposium on Information Sciences and Systems, pages 437-443, 1971.
`
[8] Fabian Monrose, Michael K. Reiter, and Susanne Wetzel. Password hardening based on keystroke dynamics.
ACM Conference on Computer and Communications Security. Kent Ridge Digital Labs, Singapore, 2-4
November 1999. Published as Proceedings of ACM Conference on Computer and Communications Security,
pages 73-82. ACM Press, November 1999.
`
`8
`
`DEF-AIRE-EXTRINSICOOO00030
`
`