An Intrusion-Detection Model
DOROTHY E. DENNING

IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, VOL. SE-13, NO. 2, FEBRUARY 1987, 222-232.

Abstract-A model of a real-time intrusion-detection expert system capable of detecting break-ins, penetrations, and other forms of computer abuse is described. The model is based on the hypothesis that security violations can be detected by monitoring a system's audit records for abnormal patterns of system usage. The model includes profiles for representing the behavior of subjects with respect to objects in terms of metrics and statistical models, and rules for acquiring knowledge about this behavior from audit records and for detecting anomalous behavior. The model is independent of any particular system, application environment, system vulnerability, or type of intrusion, thereby providing a framework for a general-purpose intrusion-detection expert system.

Index Terms-Abnormal behavior, auditing, intrusions, monitoring, profiles, security, statistical measures.

I. INTRODUCTION
This paper describes a model for a real-time intrusion-detection expert system that aims to detect a wide range of security violations, ranging from attempted break-ins by outsiders to system penetrations and abuses by insiders. The development of a real-time intrusion-detection system is motivated by four factors: 1) most existing systems have security flaws that render them susceptible to intrusions, penetrations, and other forms of abuse; finding and fixing all these deficiencies is not feasible for technical and economic reasons; 2) existing systems with known flaws are not easily replaced by systems that are more secure, mainly because the systems have attractive features that are missing in the more-secure systems, or else they cannot be replaced for economic reasons; 3) developing systems that are absolutely secure is extremely difficult, if not generally impossible; and 4) even the most secure systems are vulnerable to abuses by insiders who misuse their privileges.

The model is based on the hypothesis that exploitation of a system's vulnerabilities involves abnormal use of the system; therefore, security violations could be detected from abnormal patterns of system usage. The following examples illustrate:
• Attempted break-in: Someone attempting to break into a system might generate an abnormally high rate of password failures with respect to a single account or the system as a whole.
• Masquerading or successful break-in: Someone logging into a system through an unauthorized account and password might have a different login time, location, or connection type from that of the account's legitimate user. In addition, the penetrator's behavior may differ considerably from that of the legitimate user; in particular, he might spend most of his time browsing through directories and executing system status commands, whereas the legitimate user might concentrate on editing or compiling and linking programs. Many break-ins have been discovered by security officers or other users on the system who have noticed the alleged user behaving strangely.
• Penetration by legitimate user: A user attempting to penetrate the security mechanisms in the operating system might execute different programs or trigger more protection violations from attempts to access unauthorized files or programs. If his attempt succeeds, he will have access to commands and files not normally permitted to him.
• Leakage by legitimate user: A user trying to leak sensitive documents might log into the system at unusual times or route data to remote printers not normally used.
• Inference by legitimate user: A user attempting to obtain unauthorized data from a database through aggregation and inference might retrieve more records than usual.
• Trojan horse: The behavior of a Trojan horse planted in or substituted for a program may differ from that of the legitimate program in terms of its CPU time or I/O activity.
• Virus: A virus planted in a system might cause an increase in the frequency of executable files rewritten, storage used by executable files, or a particular program being executed as the virus spreads.
• Denial of service: An intruder able to monopolize a resource (e.g., network) might have abnormally high activity with respect to the resource, while activity for all other users is abnormally low.

Of course, the above forms of aberrant usage can also be linked with actions unrelated to security. They could be a sign of a user changing work tasks, acquiring new skills, or making typing mistakes; of software updates; or of changing workload on the system. An important objective of our current research is to determine what activities and statistical measures provide the best discriminating power, that is, a high rate of detection and a low rate of false alarms.

II. OVERVIEW OF MODEL

The model is independent of any particular system, application environment, system vulnerability, or type of intrusion, thereby providing a framework for a general-purpose intrusion-detection expert system, which we have called IDES. A more detailed description of the design and application of IDES is given in our final report [1].

The model has six main components:
• Subjects: Initiators of activity on a target system, normally users.
• Objects: Resources managed by the system: files, commands, devices, etc.
• Audit records: Generated by the target system in response to actions performed or attempted by subjects on objects: user login, command execution, file access, etc.
• Profiles: Structures that characterize the behavior of subjects with respect to objects in terms of statistical metrics and models of observed activity. Profiles are automatically generated and initialized from templates.
• Anomaly records: Generated when abnormal behavior is detected.
• Activity rules: Actions taken when some condition is satisfied, which update profiles, detect abnormal behavior, relate anomalies to suspected intrusions, and produce reports.

The model can be regarded as a rule-based pattern matching system. When an audit record is generated, it is matched against the profiles. Type information in the matching profiles then determines what rules to apply to update the profiles, check for abnormal behavior, and report anomalies detected. The security officer assists in establishing profile templates for the activities to monitor, but the rules and profile structures are largely system-independent.

The basic idea is to monitor the standard operations on a target system: logins, command and program executions, file and device accesses, etc., looking only for deviations in usage. The model does not contain any special features for dealing with complex actions that exploit a known or suspected security flaw in the target system; indeed, it has no knowledge of the target system's security mechanisms or its deficiencies. Although a flaw-based detection mechanism may have some value, it would be considerably more complex and would be unable to cope with intrusions that exploit deficiencies that are not suspected, or with personnel-related vulnerabilities. By detecting the intrusion, however, the security officer may be better able to locate vulnerabilities.

The remainder of this paper describes the components of the model in more detail.

III. SUBJECTS AND OBJECTS
Subjects are the initiators of actions in the target system. A subject is typically a terminal user, but might also be a process acting on behalf of users or groups of users, or might be the system itself. All activity arises through commands initiated by subjects. Subjects may be grouped into different classes (e.g., user groups) for the purpose of controlling access to objects in the system. User groups may overlap.

Objects are the receptors of actions and typically include such entities as files, programs, messages, records, terminals, printers, and user- or program-created structures. When subjects can be recipients of actions (e.g., electronic mail), then those subjects are also considered to be objects in the model. Objects are grouped into classes by type (program, text file, etc.). Additional structure may also be imposed; e.g., records may be grouped into files or database relations, and files may be grouped into directories. Different environments may require different object granularity; e.g., for some database applications, granularity at the record level may be desired, whereas for most applications, granularity at the file or directory level may suffice.

IV. AUDIT RECORDS
Audit records are 6-tuples representing actions performed by subjects on objects:

<Subject, Action, Object, Exception-Condition, Resource-Usage, Time-stamp>

where
• Action: Operation performed by the subject on or with the object, e.g., login, logout, read, execute.
• Exception-Condition: Denotes which, if any, exception condition is raised on the return. This should be the actual exception condition raised by the system, not just the apparent exception condition returned to the subject.
• Resource-Usage: List of quantitative elements, where each element gives the amount used of some resource, e.g., number of lines or pages printed, number of records read or written, CPU time or I/O units used, session elapsed time.
• Time-stamp: Unique time/date stamp identifying when the action took place.

We assume that each field is self-identifying, either implicitly or explicitly; e.g., the action field either implies the type of the expected object field, or else the object field itself specifies its type. If audit records are collected for multiple systems, then an additional field is needed for a system identifier.

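As a concrete reading of the 6-tuple, the following minimal sketch (in Python) shows one way such records might be represented; the class name, field names, and the mapping of a 0 exception code to None are illustrative assumptions, not part of the model.

    from dataclasses import dataclass, field
    from typing import Dict, Optional

    @dataclass
    class AuditRecord:
        """One 6-tuple: <Subject, Action, Object, Exception-Condition,
        Resource-Usage, Time-stamp>."""
        subject: str                     # e.g., "Smith"
        action: str                      # e.g., "login", "read", "execute"
        obj: str                         # e.g., "<Library>COPY.EXE"
        exception: Optional[str] = None  # actual exception raised; None if none
        resources: Dict[str, int] = field(default_factory=dict)  # e.g., {"CPU": 2}
        timestamp: int = 0               # unique time/date stamp
        system: Optional[str] = None     # only needed for multi-system collection

The sketches in later sections reuse this type.
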
Since each audit record specifies a subject and object, it is conceptually associated with some cell in an "audit matrix" whose rows correspond to subjects and columns to objects. The audit matrix is analogous to the "access matrix" protection model, which specifies the rights of subjects to access objects; that is, the actions that each subject is authorized to perform on each object. Our intrusion-detection model differs from the access-matrix model by substituting the concept of "action performed" (as evidenced by an audit record associated with a cell in the matrix) for "action authorized" (as specified by an access right in the matrix cell). Indeed, since activity is observed without regard for authorization, there is an implicit assumption that the access controls in the system permitted an action to occur. The task of intrusion detection is to determine whether activity is unusual enough to suspect an intrusion. Every statistical measure used for this purpose is computed from audit records associated with one or more cells in the matrix.

Most operations on a system involve multiple objects. For example, file copying involves the copy program, the original file, and the copy. Compiling involves the compiler, a source program file, an object program file, and possibly intermediate files and additional source files referenced through "include" statements. Sending an electronic mail message involves the mail program, possibly multiple destinations in the "To" and "cc" fields, and possibly "include" files.

Our model decomposes all activity into single-object actions so that each audit record references only one object. File copying, for example, is decomposed into an execute operation on the copy command, a read operation on the source file, and a write operation on the destination file. The following illustrates the audit records generated in response to a command

COPY GAME.EXE TO <Library>GAME.EXE

issued by user Smith to copy an executable GAME file into the <Library> directory; the copy is aborted because Smith does not have write permission to <Library>:

(Smith, execute, <Library>COPY.EXE, 0, CPU=00002, 11058521678)
(Smith, read, <Smith>GAME.EXE, 0, RECORDS=0, 11058521679)
(Smith, write, <Library>GAME.EXE, write-viol, RECORDS=0, 11058521680)

Decomposing complex actions has three advantages. First, since objects are the protectable entities of a system, the decomposition is consistent with the protection mechanisms of systems. Thus, IDES can potentially discover both attempted subversions of the access controls (by noting an abnormality in the number of exception conditions returned) and successful subversions (by noting an abnormality in the set of objects accessible to the subject). Second, single-object audit records greatly simplify the model and its application. Third, the audit records produced by existing systems generally contain a single object, although some systems provide a way of linking together the audit records associated with a "job step" (e.g., copy or compile) so that all files accessed during execution of a program can be identified.

The target system is responsible for auditing and for transmitting audit records to the intrusion-detection system for analysis (it may also keep an independent audit trail). The time at which audit records are generated determines what type of data is available. If the audit record for some action is generated at the time the action is requested, it is possible to measure both successful and unsuccessful attempts to perform the activity, even if the action should abort (e.g., because of a protection violation) or cause a system crash. If it is generated when the action completes, it is possible to measure the resources consumed by the action and exception conditions that may cause the action to terminate abnormally (e.g., because of resource overflow). Thus, auditing an activity after it completes has the advantage of providing more information, but the disadvantage of not allowing immediate detection of abnormalities, especially those related to break-ins and system crashes. Thus, activities such as login, execution of high-risk commands (e.g., to acquire special "superuser" privileges), or access to sensitive data should be audited when they are attempted so that penetrations can be detected immediately; if resource-usage data are also desired, additional auditing can be performed on completion as well. For example, access to a database containing highly sensitive data may be monitored when the access is attempted and then again when it completes to report the number of records retrieved or updated. Most existing audit systems monitor session activity at both initiation (login), when the time and location of login are recorded, and termination (logout), when the resources consumed during the session are recorded. They do not, however, monitor both the start and finish of command and program execution or file accesses. IBM's System Management Facilities (SMF) [2], for example, audits only the completion of these activities.

Although the auditing mechanisms of existing systems approximate the model, they are typically deficient in terms of the activities monitored and record structures generated. For example, Berkeley 4.2 UNIX [3] monitors command usage but not file accesses or file protection violations. Some systems do not record all login failures. Programs, including system programs, invoked below the command level are not explicitly monitored (their activity is included in that for the main program). The level at which auditing should take place, however, is unclear, since too much auditing could severely degrade performance on the target system or overload the intrusion-detection system.

Deficiencies in the record structures are also present. Most SMF audit records, for example, do not contain a subject field; the subject must be reconstructed by linking together the records associated with a given job. Protection violations are sometimes reported through separate record formats rather than as an exception condition in a common record; VM password failures at login, for example, are handled this way (there are separate records for successful logins and password failures).

Another problem with existing audit records is that they contain little or no descriptive information to identify the values contained therein. Every record type has its own structure, and the exact format of each record type must be known to interpret the values. A uniform record format with self-identifying data would be preferable so that the intrusion-detection software can be system-independent. This could be achieved either by modifying the software that produces the audit records in the target system, or by writing a filter that translates the records into a standard format.

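To illustrate the filter option, the sketch below translates a hypothetical native audit line of semicolon-separated key=value pairs into the standard 6-tuple form (the AuditRecord type sketched earlier). The native format and its key names are assumptions chosen for brevity; real systems such as SMF define their own record layouts.

    def translate(raw_line: str) -> AuditRecord:
        """Translate one native audit line into the standard format.

        Assumed (hypothetical) native format:
        "subj=Smith;act=write;obj=<Library>GAME.EXE;exc=write-viol;RECORDS=0;ts=11058521680"
        """
        fields = dict(part.split("=", 1) for part in raw_line.strip().split(";"))
        exc = fields.get("exc")
        return AuditRecord(
            subject=fields["subj"],
            action=fields["act"],
            obj=fields["obj"],
            exception=None if exc in (None, "0") else exc,  # 0 denotes no exception
            resources={k: int(v) for k, v in fields.items()
                       if k not in ("subj", "act", "obj", "exc", "ts")},
            timestamp=int(fields["ts"]),
        )

One such filter per target system keeps the intrusion-detection software itself system-independent.
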
V. PROFILES

An activity profile characterizes the behavior of a given subject (or set of subjects) with respect to a given object (or set thereof), thereby serving as a signature or description of normal activity for its respective subject(s) and object(s). Observed behavior is characterized in terms of a statistical metric and model. A metric is a random variable x representing a quantitative measure accumulated over a period. The period may be a fixed interval of time (minute, hour, day, week, etc.), or the time between two audit-related events (i.e., between login and logout, program initiation and program termination, file open and file close, etc.). Observations (sample points) xi of x obtained from the audit records are used together with a statistical model to determine whether a new observation is abnormal. The statistical model makes no assumptions about the underlying distribution of x; all knowledge about x is obtained from observations. Before describing the structure, generation, and application of profiles, we shall first discuss statistical metrics and models.

A. Metrics
We define three types of metrics:
• Event Counter: x is the number of audit records satisfying some property occurring during a period (each audit record corresponds to an event). Examples are number of logins during an hour, number of times some command is executed during a login session, and number of password failures during a minute.
• Interval Timer: x is the length of time between two related events, i.e., the difference between the time-stamps in the respective audit records. An example is the length of time between successive logins into an account.
• Resource Measure: x is the quantity of resources consumed by some action during a period as specified in the Resource-Usage field of the audit records. Examples are the total number of pages printed by a user per day and total amount of CPU time consumed by some program during a single execution. Note that a resource measure in our intrusion-detection model is implemented as an event counter or interval timer on the target system. For example, the number of pages printed during a login session is implemented on the target system as an event counter that counts the number of print events between login and logout; CPU time consumed by a program is implemented as an interval timer that runs between program initiation and termination. Thus, whereas event counters and interval timers measure events at the audit-record level, resource measures acquire data from events on the target system that occur at a level below the audit records. The Resource-Usage field of audit records thereby provides a means of data reduction so that fewer events need be explicitly recorded in audit records.

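Read operationally, the three metric types are small accumulators over the audit stream. The sketch below assumes the AuditRecord type from the earlier sketch; the class names and method signatures are illustrative, not prescribed by the model.

    class EventCounter:
        """Event counter: x = number of audit records satisfying some
        property during a period."""
        def __init__(self, predicate):
            self.predicate = predicate    # e.g., lambda r: r.action == "login"
            self.count = 0
        def observe(self, record):
            if self.predicate(record):
                self.count += 1

    class IntervalTimer:
        """Interval timer: x = difference between the time-stamps of two
        related audit records."""
        def __init__(self):
            self.last = None
        def observe(self, record):
            interval = record.timestamp - self.last if self.last is not None else None
            self.last = record.timestamp
            return interval               # one sample point of x, or None

    class ResourceMeasure:
        """Resource measure: x = amount of some resource reported in the
        Resource-Usage field."""
        def __init__(self, name):
            self.name = name              # e.g., "CPU" or "RECORDS"
        def observe(self, record):
            return record.resources.get(self.name, 0)
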
B. Statistical Models
Given a metric for a random variable x and n observations x1, …, xn, the purpose of a statistical model of x is to determine whether a new observation xn+1 is abnormal with respect to the previous observations. The following models may be included in IDES:

1) Operational Model: This model is based on the operational assumption that abnormality can be decided by comparing a new observation of x against fixed limits. Although the previous sample points for x are not used, presumably the limits are determined from prior observations of the same type of variable. The operational model is most applicable to metrics where experience has shown that certain values are frequently linked with intrusions. An example is an event counter for the number of password failures during a brief period, where more than 10, say, suggests an attempted break-in.

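In code, the operational model reduces to a fixed-limit test. A minimal sketch, using the 10-failure limit suggested above:

    def operational_abnormal(x, upper=10, lower=None):
        """Operational model: compare a new observation against fixed limits."""
        if lower is not None and x < lower:
            return True
        return x > upper

    # e.g., 12 password failures within a brief period -> abnormal
    assert operational_abnormal(12)
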
2) Mean and Standard Deviation Model: This model is based on the assumption that all we know about x1, …, xn are the mean and standard deviation as determined from the first two moments:

sum = x1 + … + xn
sumsquares = x1^2 + … + xn^2
mean = sum / n
stdev = sqrt(sumsquares / (n+1) - mean^2)

A new observation xn+1 is defined to be abnormal if it falls outside a confidence interval that is d standard deviations from the mean for some parameter d:

mean ± d * stdev

By Chebyshev's inequality, the probability of a value falling outside this interval is at most 1/d^2; for d = 4, for example, it is at most 0.0625. Note that 0 (or null) occurrences should be included so as not to bias the data.

This model is applicable to event counters, interval timers, and resource measures accumulated over a fixed time interval or between two related events. It has two advantages over an operational model. First, it requires no prior knowledge about normal activity in order to set limits; instead, it learns what constitutes normal activity from its observations, and the confidence intervals automatically reflect this increased knowledge. Second, because the confidence intervals depend on observed data, what is considered to be normal for one user can be considerably different from what is normal for another.

A slight variation on the mean and standard deviation model is to weight the computations, with greater weights placed on more recent values.

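A sketch of this model and its weighted variation follows, keeping count, sum, and sum-of-squares as the profile's Value parameters. Two hedges: the square root uses the standard population form sumsquares/n - mean^2 (the formula as printed above, with n+1, can drive the radicand negative for nearly constant data), and exponential decay is just one plausible weighting; the paper does not fix one.

    import math

    class MeanStdevModel:
        """Mean and standard deviation model with the d-stdev abnormality test."""
        def __init__(self, d=4.0, decay=1.0):
            self.d = d              # threshold: standard deviations from the mean
            self.decay = decay      # 1.0 = unweighted; <1.0 favors recent values
            self.n = self.sum = self.sumsquares = 0.0

        def update(self, x):
            # Age the first two moments, then fold in the new observation.
            self.n = self.decay * self.n + 1.0
            self.sum = self.decay * self.sum + x
            self.sumsquares = self.decay * self.sumsquares + x * x

        def is_abnormal(self, x):
            if self.n < 2:
                return False        # too little history to judge
            mean = self.sum / self.n
            stdev = math.sqrt(max(self.sumsquares / self.n - mean * mean, 0.0))
            return abs(x - mean) > self.d * stdev

    # Learn a user's typical session output, then test a new observation.
    model = MeanStdevModel(d=4.0)
    for x in (120, 130, 110, 125, 118, 122):
        model.update(x)
    print(model.is_abnormal(900))   # True: far outside mean ± 4*stdev
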
3) Multivariate Model: This model is similar to the mean and standard deviation model except that it is based on correlations among two or more metrics. This model would be useful if experimental data show that better discriminating power can be obtained from combinations of related measures rather than individually, e.g., CPU time and I/O units used by a program, login frequency, and session elapsed time (which may be inversely related).

4) Markov Process Model: This model, which applies only to event counters, regards each distinct type of event (audit record) as a state variable, and uses a state transition matrix to characterize the transition frequencies between states (rather than just the frequencies of the individual states, i.e., audit records, taken separately). A new observation is defined to be abnormal if its probability, as determined by the previous state and the transition matrix, is too low. This model might be useful for looking at transitions between certain commands where command sequences are important.

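A minimal sketch of such a model follows; the additive smoothing for unseen transitions and the probability threshold are illustrative assumptions.

    from collections import defaultdict

    class MarkovModel:
        """Markov process model: an event is abnormal when the transition
        probability from the previous event (state) is too low."""
        def __init__(self, threshold=0.01, smoothing=1.0):
            self.threshold = threshold
            self.smoothing = smoothing
            self.counts = defaultdict(lambda: defaultdict(float))
            self.totals = defaultdict(float)
            self.states = set()
            self.prev = None

        def prob(self, a, b):
            # Smoothed estimate of P(next=b | current=a).
            k = self.smoothing
            return (self.counts[a][b] + k) / (self.totals[a] + k * max(len(self.states), 1))

        def observe(self, event):
            """Test the transition, then update the transition matrix."""
            abnormal = self.prev is not None and self.prob(self.prev, event) < self.threshold
            if self.prev is not None:
                self.counts[self.prev][event] += 1
                self.totals[self.prev] += 1
            self.states.add(event)
            self.prev = event
            return abnormal
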
5) Time Series Model: This model, which uses an interval timer together with an event counter or resource measure, takes into account the order and interarrival times of the observations x1, …, xn, as well as their values. A new observation is abnormal if its probability of occurring at that time is too low. A time series has the advantage of measuring trends of behavior over time and detecting gradual but significant shifts in behavior, but the disadvantage of being more costly than mean and standard deviation.

Other statistical models can be considered, for example, models that use more than the first two moments but less than the full set of values.

C. Profile Structure
An activity profile contains information that identifies the statistical model and metric of a random variable, as well as the set of audit events measured by the variable. The structure of a profile contains 10 components, the first 7 of which are independent of the specific subjects and objects measured:

<Variable-Name, Action-Pattern, Exception-Pattern, Resource-Usage-Pattern, Period, Variable-Type, Threshold, Subject-Pattern, Object-Pattern, Value>

Subject- and Object-Independent Components:
• Variable-Name: Name of variable.
• Action-Pattern: Pattern that matches zero or more actions in the audit records, e.g., "login," "read," "execute."
• Exception-Pattern: Pattern that matches on the Exception-Condition field of an audit record.
• Resource-Usage-Pattern: Pattern that matches on the Resource-Usage field of an audit record.
• Period: Time interval for measurement, e.g., day, hour, minute (expressed in terms of clock units). This component is null if there is no fixed time interval; the period is then the duration of the activity.
• Variable-Type: Name of abstract data type that defines a particular type of metric and statistical model, e.g., event counter with mean and standard deviation model.
• Threshold: Parameter(s) defining limit(s) used in the statistical test to determine abnormality. This field and its interpretation are determined by the statistical model (Variable-Type). For the operational model, it is an upper (and possibly lower) bound on the value of an observation; for the mean and standard deviation model, it is the number of standard deviations from the mean.

Subject- and Object-Dependent Components:
• Subject-Pattern: Pattern that matches on the Subject field of audit records.
• Object-Pattern: Pattern that matches on the Object field of audit records.
• Value: Value of current (most recent) observation and parameters used by the statistical model to represent the distribution of previous values. For the mean and standard deviation model, these parameters are count, sum, and sum-of-squares (first two moments). The operational model requires no parameters.

A profile is uniquely identified by Variable-Name, Subject-Pattern, and Object-Pattern. All components of a profile are invariant except for Value.

Although the model leaves unspecified the exact format for patterns, we have identified the following SNOBOL-like constructs as being useful:

'string'    String of characters.
*           Wild card matching any string.
#           Match any numeric string.
IN(list)    Match any string in list.
p -> name   The string matched by p is associated with name.
p1 p2       Match pattern p1 followed by p2.
p1 | p2     Match pattern p1 or p2.
p1, p2      Match pattern p1 and p2.
Not p       Match anything but pattern p.

Examples of patterns are:

'Smith'
* -> User            -- match any string and assign to User
'<Library>*'         -- match files in <Library> directory
IN(Special-Files)    -- match files in Special-Files
'CPU=' # -> Amount   -- match string 'CPU=' followed by integer; assign integer to Amount
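
As one rough realization, a few of these constructs map onto regular expressions. The token encoding and the subset handled below are assumptions; the paper does not tie its patterns to any particular mechanism.

    import re

    def pattern_to_regex(tokens):
        """Compile a small subset of the constructs ('string', *, #, -> name)
        into a regular expression. tokens is a hypothetical encoding,
        e.g. ["'CPU='", "#", "->", "Amount"]."""
        parts = []
        i = 0
        while i < len(tokens):
            t = tokens[i]
            if t.startswith("'") and t.endswith("'"):
                parts.append(re.escape(t[1:-1]))       # 'string': literal characters
            elif t == "*":
                parts.append(".*")                     # wild card
            elif t == "#":
                parts.append(r"\d+")                   # numeric string
            elif t == "->":
                parts[-1] = "(?P<%s>%s)" % (tokens[i + 1], parts[-1])  # bind to name
                i += 1
            i += 1
        return re.compile("^" + "".join(parts) + "$")

    # 'CPU=' # -> Amount, matched against "CPU=00002":
    m = pattern_to_regex(["'CPU='", "#", "->", "Amount"]).match("CPU=00002")
    print(m.group("Amount"))                           # -> "00002"
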
The following is a sample profile for measuring the quantity of output to user Smith's terminal on a session basis. The variable type ResourceByActivity denotes a resource measure using the mean and standard deviation model.

Variable-Name:          SessionOutput
Action-Pattern:         'logout'
Exception-Pattern:      0
Resource-Usage-Pattern: 'SessionOutput=' # -> Amount
Period:
Variable-Type:          ResourceByActivity
Threshold:              4
Subject-Pattern:        'Smith'
Object-Pattern:         *
Value:                  record of ...

Whenever the intrusion-detection system receives an audit record that matches a variable's patterns, it updates the variable's distribution and checks for abnormality. The distribution of values for a variable is thus derived, i.e., learned, as audit records matching the profile patterns are processed.

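Tying the pieces together, the following sketch shows the matching-and-updating loop for one profile. It reuses the AuditRecord and MeanStdevModel sketches above and encodes the patterns as regular expressions; all of these representation choices are assumptions made for illustration.

    import re

    class Profile:
        """One activity profile: patterns select matching audit records; the
        statistical model carries the learned Value distribution."""
        def __init__(self, name, subject_pat, object_pat, action_pat, resource_pat, model):
            self.name = name
            self.subject_pat = re.compile(subject_pat)
            self.object_pat = re.compile(object_pat)
            self.action_pat = re.compile(action_pat)
            self.resource_pat = re.compile(resource_pat)  # must bind group "Amount"
            self.model = model                            # e.g., MeanStdevModel(d=4)

        def process(self, record, raw_resource_field):
            """If the record matches, check for abnormality, then learn from it."""
            if not (self.subject_pat.fullmatch(record.subject)
                    and self.object_pat.fullmatch(record.obj)
                    and self.action_pat.fullmatch(record.action)):
                return None                               # record not covered by profile
            m = self.resource_pat.match(raw_resource_field)
            if m is None:
                return None
            x = int(m.group("Amount"))
            anomalous = self.model.is_abnormal(x)         # test against learned profile
            self.model.update(x)                          # then fold in the observation
            return anomalous                              # True triggers an anomaly record

    # The SessionOutput profile above, roughly:
    # Profile("SessionOutput", "Smith", ".*", "logout",
    #         r"SessionOutput=(?P<Amount>\d+)", MeanStdevModel(d=4))
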
D. Profiles for Classes
Profiles can be defined for individual subject-object pairs (i.e., where the Subject and Object patterns match specific names, e.g., Subject "Smith" and Object "Foo"), or for aggregates of subjects and objects (i.e., where the Subject and Object patterns match sets of names), as shown in Fig. 1 [ed: not included]. For example, file-activity profiles could be created for pairs of individual users and files, for groups of users with respect to specific files, for individual users with respect to classes of files, or for groups of users with respect to file classes. The nodes in the lattice are interpreted as follows:
• Subject-Object: Actions performed by single subject on single object, e.g., user Smith-file Foo.
• Subject-Object Class: Actions performed by single subject aggregated over all objects in the class. The class of objects might be represented as a pattern match on a subfield of the Object field that specifies the object's type (class), as a pattern match directly on the object's name (e.g., the pattern "*.EXE" for all executable files), or as a pattern match that tests whether the object is in some list (e.g., "IN(hit-list)").
• Subject Class-Object: Actions performed on single object aggregated over all subjects in the class, e.g., privileged users-directory file <Library>, nonprivileged users-directory file <Library>.
• Subject Class-Object Class: Actions aggregated over all subjects in the class and objects in the class, e.g., privileged users-system files, nonprivileged users-system files.
• Subject: Actions performed by single subject aggregated over all objects, e.g., user session activity.
• Object: Actions performed on a single object aggregated over all subjects, e.g., password file activity.
• Subject Class: Actions aggregated over all subjects in the class, e.g., privileged user activity, nonprivileged user activity.