Incremental Security in Open, Untrusted Networks

Andrew Hutchison, Marc Welz
Department of Computer Science
University of Cape Town
Rondebosch, 7701 Republic of South Africa
{hutch,mwelz}@cs.uct.ac.za

Abstract

In this paper we identify a number of security problems encountered in open, untrusted networks and motivate why some of these problems are going to remain with us for the foreseeable future. In order to reduce system vulnerability in such environments, we suggest that network services should provide a second line of defense to catch those attackers who are not excluded by the first line: the conventional signon process. Part of this fallback position could adapt anomaly detection (a concept borrowed from conventional network intrusion detection systems) to provide a means of gradually and continuously authenticating users and modulating their access rights accordingly.

1 Introduction

Computer network connectivity costs are decreasing for the end user. At the same time it is becoming possible to access computer networks from an ever increasing variety of platforms such as cellular telephones, internet kiosks and pagers. The combination of these two trends means that unsophisticated users will become an ever increasing fraction of the online population.
We shall refer to such cheap, ubiquitous networks as commodity networks.
Users of such networks (subjects in this context) will have to be authenticated and granted access rights to resources (referred to as objects). There are a number of challenges associated with this process:

• Authentication has to be reasonably simple and non-intrusive.

• Subjects are naive and thus can't be relied on to follow good security procedures.

• It may be difficult or impossible to verify the identity of a subject.

• There exists a well-established and experienced intruder population.

This paper describes these problems in greater detail and presents an approach which may be used as a second line of defense in such a hostile environment.
Our approach attempts to take the anomaly detection capabilities typically only found in network intrusion detection systems (see [1] for an example of a research system or [2] for an overview of commercial ones) and make them an integral part of an application, where anomaly detection may be used not only to provide a continuous and progressive authentication mechanism, but also as a means to constrain the available actions to those needed and actually used.

2 Security Challenges in Open, Untrusted Networks

2.1 Simple, Inexpensive Authentication

A requirement of a consumer network infrastructure is that authentication should be reasonably simple and inexpensive. For example, it is unlikely that ISPs will require that subscribers install retina scanners (at least at current prices) in order to access the internet from home.
Another example of ease and convenience taking precedence over security is that passwords for dialup accounts are often stored in plaintext on the local machine and changed infrequently, if ever.
It appears unlikely that these trends will be reversed anytime soon: the computer industry has created the expectation that computers should be simple and easy to use, while it is probably going to be difficult to persuade the commodity PC hardware industry to add expensive authentication devices to home PCs.

2.2 Naive User Population

Despite valiant efforts by educators and support personnel, computer users still do write passwords on post-it notes stuck to their monitor. It seems unlikely that this will change: more and more people will use computers as a mere tool and won't have an interest in computers themselves.

2.3 Unverifiable Identity

In a number of situations it is difficult to associate an online user with a real person or organization. For example, users of services such as prepaid cellular telephony have, for all intents and purposes, no identity. Unless the user of such a telephone chooses to tell you, there is no reasonable way of establishing his or her name.
In some situations it may be possible to trace the airtime purchase to a credit card, but requiring that prepaid cellular phones are only purchased with credit cards is not practical. To illustrate this point: in South Africa prepaid cellphones were introduced to make wireless communications available to those who would not qualify for credit. Their introduction has been credited with a significant growth in the number of South African GSM telephone users, and some of these new users are reported never to have opened a bank account.

2.4 Established Intruder Population

System crackers are a part of the Internet. While a large proportion of crackers are amateurs who merely use existing cracking tools, there does exist a category of cracker who is undeniably able to mount complex attacks.
While the classical cracker is portrayed as an individual who breaks into systems for the intellectual challenge, it would seem reasonable to assume that a number of crackers are in the service of intelligence agencies, both military and commercial. Such crackers are likely to be experienced and motivated enough to keep abreast of the newest security developments.

2.5 Fundamental Security Problems

The above description is intended to show that it is difficult to secure an object in a commodity network: vulnerabilities exist at any point between it and the subject.
It might be argued that today's networks were never designed to resist determined attackers and that the next generation should be more secure. Said next generation networks are supposed to employ strong cryptographic methods, smart cards and biometrics to exclude intruders and impostors.
And while we hope that future networks will be more secure, it seems unwise to believe that all vulnerabilities will go away: cryptographic channels might contain trapdoors and will reduce the efficacy of network intrusion detection systems or virus scanners. Biometric credentials are difficult to revoke if ever compromised. Smart cards can be stolen and don't necessarily map to an identifiable subject; users of prepaid GSM phones are still difficult to trace, despite being accompanied by smart cards.
Apart from criticisms of particular technologies, there exist two more fundamental problems:
For one, it is very difficult and expensive to construct a truly secure system. Given the pressure to deliver a new network service to the market as fast as possible and at the lowest cost, it is probable that security issues will not receive any more attention than they receive currently.
But even if it were easy to construct a secure network, it is still unclear if such a system is desirable: a network where each subject can be identified and mapped to a known real-world entity would offer no privacy to its users. There already exist concerns that current networks record too much information about their users: for example, rash USENET posts have come to haunt their authors at job interviews. If these trends continue, reporters are likely to quiz a future presidential candidate about the web sites he visited as a teenager.
Put simply, a number of real world activities (such as cash payments) are anonymous and without permanent record. If these activities are to have electronic equivalents, then some form of anonymity has to be possible. In other words, there is a tradeoff between the accountability and the privacy of subjects in a network. If it is desirable to grant subjects some degree of privacy, then there exists the opportunity for hostile subjects to launch attacks.

3 A Second Line of Defense

The above suggests that hostile subjects are always likely to probe objects on a commodity network, and that the owners of such an object may not be able to do very much about this: the attacker may use an anonymous service, use a stolen identity, launch an attack from a compromised intermediate or be based somewhere where the victim has no legal recourse.
Since it does not appear feasible to exclude hostile or naive subjects from a commodity network, we propose that a second line of defense be made a standard component of distributed applications.
Where the first line of defense includes conventional subject authentication (via password, smartcard or fingerprint), the second line uses an alternative means to identify a user.

A first line of defense exists in most distributed systems: subjects usually have to pass an initial authentication phase. Once a subject has passed (or bypassed) this phase the subject is granted access to a set of objects.
The point to note is that the above security measure consists of an initial phase, after which no security checks are performed.
We suggest that the second line use the actions of a subject as a way of verifying the identity of a user. This has the advantage that the authentication module is in operation for as long as the subject is accessing the object; also, this security measure can be implemented entirely on the side of the object, and requires no co-operation of or trust in the subject. Furthermore, such a system would no longer restrict confidence in the user's authenticity to a binary value (yes, no); instead it would be possible to have a progressive gradation, and to adjust access rights accordingly. For example, host Bilbo has a confidence factor of 0.95 that Mr Jones' account is being used by its rightful owner, since Mr Jones usually logs in at 7:20 and first checks his mail before checking his diary. A factor of below 0.6 would restrict the user of Mr Jones' account to checking his mail, and a factor of below 0.2 might page the system administrator.

Figure 2. Progressive authentication (fuzzy value)

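A minimal sketch of such a threshold policy, assuming a hypothetical ProgressiveAuthorizer object with illustrative action names (only the 0.6 and 0.2 cut-offs are taken from the example above; Python is used purely for illustration), might look as follows:

```python
# Hypothetical sketch: mapping a fuzzy authentication confidence to access rights.
# The 0.6 and 0.2 thresholds follow the Mr Jones example; everything else is assumed.

class ProgressiveAuthorizer:
    def __init__(self):
        # Full capability set granted after the conventional signon.
        self.all_actions = {"check_mail", "check_diary", "initiate_transfer"}

    def permitted_actions(self, confidence: float) -> set[str]:
        """Return the actions allowed at the current confidence level."""
        if confidence < 0.2:
            # Very low confidence: lock the session and page the administrator.
            self.page_administrator()
            return set()
        if confidence < 0.6:
            # Reduced confidence: restrict the session to a benign subset.
            return {"check_mail"}
        # High confidence: the subject keeps its full rights.
        return set(self.all_actions)

    def page_administrator(self):
        print("ALERT: possible account takeover, paging administrator")


if __name__ == "__main__":
    authz = ProgressiveAuthorizer()
    for confidence in (0.95, 0.45, 0.1):
        print(confidence, sorted(authz.permitted_actions(confidence)))
```
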
An object which would implement such a second line of defense would be equipped with the following two components:

• A module which profiles subject activity in order to establish usage patterns and trends. Where anomalous behaviour patterns emerge, the system may flag alerts or disable a service. Where volumes of data are too large or where privacy issues prevent full logging, it seems worthwhile to investigate inscrutable pattern matching techniques such as neural networks or genetic algorithms, since these can be thought of as maintaining only a digest of past user behaviour, and can thus not be used to reconstruct an exact record of past user behaviour.

• A component which establishes what services are not being used by a particular subject (possibly using the module explained in the previous paragraph), with an option to temporarily or permanently disable such services. For example, a given user might only use a home banking service to examine her current balance. The proposed component might then notify the user that her ability to initiate transfers would be disabled unless this component received verified instructions to the contrary. Such a component would protect unsophisticated users who do not make full use of a given service. The component can be thought of as a way of automating the principle of least necessary privilege, since it would gradually restrict the user's rights to only those privileges needed and exercised (a sketch of this idea follows the list).

Figure 3. Gradual reduction of capabilities to those exercised

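One way the second component could be automated is sketched below; the CapabilityReducer class, the expiry window and the capability names are assumptions made for illustration rather than anything prescribed above.

```python
# Hypothetical sketch of automating least necessary privilege:
# capabilities that a subject has not exercised within a window are disabled.
import time

class CapabilityReducer:
    def __init__(self, granted: set[str], window_seconds: float = 30 * 24 * 3600):
        self.window = window_seconds
        # Assume every granted capability counts as "used" at creation time.
        self.last_used = {cap: time.time() for cap in granted}
        self.disabled: set[str] = set()

    def record_use(self, cap: str) -> None:
        """Called whenever the subject exercises a capability."""
        if cap in self.last_used and cap not in self.disabled:
            self.last_used[cap] = time.time()

    def prune(self) -> set[str]:
        """Disable capabilities unused for longer than the window; return them."""
        now = time.time()
        newly_disabled = {
            cap for cap, t in self.last_used.items()
            if cap not in self.disabled and now - t > self.window
        }
        self.disabled |= newly_disabled
        return newly_disabled

    def allowed(self, cap: str) -> bool:
        return cap in self.last_used and cap not in self.disabled


# Example: a banking subject who only ever checks her balance would eventually
# lose the (unused) ability to initiate transfers unless she asks to keep it.
reducer = CapabilityReducer({"view_balance", "initiate_transfer"})
reducer.record_use("view_balance")
```

In a real deployment the prune step would presumably notify the user, as suggested above, before a capability is actually withdrawn.
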
We do note that these ideas are not new (see [3] for an example of a host-based IDS, while [4, 5] use an immune system metaphor); anomaly detection has been part of network intrusion detection systems for some time. However, the use of anomaly detection modules as an integral part of an application does not yet seem to have been explored fully.

As mentioned previously, we are particularly interested in investigating how the complement of anomaly detection (i.e. detecting normal behaviour) can be used to provide a continuous and progressive means of authenticating a user (one might call this fuzzy logic for authentication, since confidence in user authenticity ceases to be a binary value), and how this confidence value can be used to modulate the access rights of the subject. Our second, related, area of interest involves the use of an anomaly detection/profiling system to determine the set of actions typically performed by a subject (versus the set of possible actions), and reducing the set of possible actions to those used (one might refer to this as the "if you don't use it, you lose it" principle). This would offer an automatic way of implementing a least privilege policy.
We anticipate that anomaly detection will be coupled ever more closely to applications or services. Apart from the above-mentioned possibilities, a tighter coupling would also offer a number of other advantages, including a reduced development effort (it would require less effort to keep the two synchronized) and easier access to application state (this will become increasingly important if network traffic is encrypted, since encrypted traffic would degrade the efficacy of a conventional network intrusion detection system significantly).

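As an illustration of how such a continuous confidence value might be maintained, the toy sketch below keeps a simple frequency profile of a subject's actions and nudges a confidence score up or down as each action arrives. The BehaviourProfile class and the reward, penalty and familiarity constants are assumptions; the choice of profiling technique is deliberately left open above.

```python
# Hypothetical sketch: a frequency-based behaviour profile that yields a
# continuously updated confidence value in [0, 1]. All constants are assumed.
from collections import Counter

class BehaviourProfile:
    def __init__(self, reward: float = 0.05, penalty: float = 0.25):
        self.history = Counter()      # how often each action was seen before
        self.total = 0
        self.confidence = 1.0         # start fully confident after signon
        self.reward = reward
        self.penalty = penalty

    def observe(self, action: str) -> float:
        """Update the profile with one action and return the new confidence."""
        # In practice the profile would first be trained during a learning
        # period before the confidence value is enforced.
        seen_fraction = self.history[action] / self.total if self.total else 0.0
        if seen_fraction > 0.1:
            # Familiar action: confidence drifts back up towards 1.
            self.confidence = min(1.0, self.confidence + self.reward)
        else:
            # Rare or unseen action: confidence drops.
            self.confidence = max(0.0, self.confidence - self.penalty)
        self.history[action] += 1
        self.total += 1
        return self.confidence


profile = BehaviourProfile()
for action in ["check_mail", "check_diary", "check_mail", "initiate_transfer"]:
    print(action, round(profile.observe(action), 2))
```
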
4 Applications and Limitations

Our proposed second line of defense is likely to be most effective in situations where authorized subjects perform a small set of tasks, since abnormalities are recognized more easily under these circumstances. As it turns out, naive users, the largest fraction of commodity network users, do fall into this category: these users typically only use a limited subset of a particular application. By automatically disabling, or at least monitoring, the use of more sophisticated features, it should be possible to detect a number of abuses. For example, a naive user is unlikely to take advantage of the macro capabilities of a word processor, thus the sudden use of sophisticated macros might be indicative of a macro virus infection and should trigger an alert.

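A trivial way to realize the monitoring half of this suggestion is to alert on the first use of any feature absent from a subject's established profile; the sketch below does exactly that, with the watch_features helper, the feature names and the alert callback being illustrative assumptions.

```python
# Hypothetical sketch: alert when a subject first exercises a feature that has
# never appeared in their established usage profile (e.g. word-processor macros).

def watch_features(established_profile: set[str], alert=print):
    """Return a callback that flags first-time use of unprofiled features."""
    seen = set(established_profile)

    def on_feature_use(user: str, feature: str) -> None:
        if feature not in seen:
            alert(f"ALERT: {user} used unprofiled feature '{feature}'")
            seen.add(feature)  # report each new feature only once

    return on_feature_use


# A naive user whose profile never contained macro execution:
on_use = watch_features({"open_document", "print"})
on_use("mr_jones", "open_document")   # no alert
on_use("mr_jones", "run_macro")       # triggers an alert
```
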
The corollary of this observation is that an anomaly detection system is of lesser use where subjects are sophisticated and perform a large set of complex operations. While this does present a problem, it is worth noting that sophisticated (as opposed to naive) users are more likely to follow sensible security procedures (e.g. select complex passwords, memorize passwords instead of writing them down, et cetera) and are thus, ceteris paribus, less likely to fall victim to an attack.

5 Conclusion

System crackers are likely to remain a threat to commodity networks. Protection of such networks is complicated by the fact that their users are unreliable: most lack the knowledge or motivation to follow a reasonable security policy. For this reason it seems prudent to augment a conventional authentication component (based on an initial signon with password, biometric or key) with a user profiling or anomaly detection module which allows the system to verify the authenticity of a user throughout a session and adjust the user's access rights, both on a per session basis (as a function of how confident the system is of the user's authenticity) and in the long term (where access is gradually restricted to those functions actually used).

References

[1] J. M. J. Bonifacio, E. S. Moreira, A. M. Cansian, and A. C. P. L. F. Carvalho. An adaptive intrusion detection system using neural networks. In Global IT Security, pages 416-428, September 1998.
[2] T. Escamilla. Intrusion Detection. John Wiley and Sons, 1998.
[3] T. Lane and C. E. Brodley. Temporal sequence learning and data reduction for anomaly detection. In ACM Conference on Computer and Communications Security, pages 150-158, November 1998.
[4] C. P. Louwrens and S. H. von Solms. Can computerized immunity be achieved, based on a biological model? In Global IT Security, pages 240-250, September 1998.
[5] A. Somayaji, S. Hofmeyr, and S. Forrest. Principles of a computer immune system. In New Security Paradigms Workshop, pages 75-82, September 1997.