IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, VOL. SE-13, NO. 2, FEBRUARY 1987

An Intrusion-Detection Model

DOROTHY E. DENNING
Abstract—A model of a real-time intrusion-detection expert system capable of detecting break-ins, penetrations, and other forms of computer abuse is described. The model is based on the hypothesis that security violations can be detected by monitoring a system's audit records for abnormal patterns of system usage. The model includes profiles for representing the behavior of subjects with respect to objects in terms of metrics and statistical models, and rules for acquiring knowledge about this behavior from audit records and for detecting anomalous behavior. The model is independent of any particular system, application environment, system vulnerability, or type of intrusion, thereby providing a framework for a general-purpose intrusion-detection expert system.

Index Terms—Abnormal behavior, auditing, intrusions, monitoring, profiles, security, statistical measures.
I. INTRODUCTION
THIS paper describes a model for a real-time intrusion-detection expert system that aims to detect a wide range of security violations ranging from attempted break-ins by outsiders to system penetrations and abuses by insiders. The development of a real-time intrusion-detection system is motivated by four factors: 1) most existing systems have security flaws that render them susceptible to intrusions, penetrations, and other forms of abuse; finding and fixing all these deficiencies is not feasible for technical and economic reasons; 2) existing systems with known flaws are not easily replaced by systems that are more secure—mainly because the systems have attractive features that are missing in the more-secure systems, or else they cannot be replaced for economic reasons; 3) developing systems that are absolutely secure is extremely difficult, if not generally impossible; and 4) even the most secure systems are vulnerable to abuses by insiders who misuse their privileges.

The model is based on the hypothesis that exploitation of a system's vulnerabilities involves abnormal use of the system; therefore, security violations could be detected from abnormal patterns of system usage. The following examples illustrate:
• Attempted break-in: Someone attempting to break into a system might generate an abnormally high rate of password failures with respect to a single account or the system as a whole.
• Masquerading or successful break-in: Someone logging into a system through an unauthorized account and password might have a different login time, location, or connection type from that of the account's legitimate user. In addition, the penetrator's behavior may differ considerably from that of the legitimate user; in particular, he might spend most of his time browsing through directories and executing system status commands, whereas the legitimate user might concentrate on editing or compiling and linking programs. Many break-ins have been discovered by security officers or other users on the system who have noticed the alleged user behaving strangely.

Manuscript received December 20, 1985; revised August 1, 1986. This work was supported by the Space and Naval Warfare Command (SPAWAR) under Contract 83F830100 and by the National Science Foundation under Grant MCS-8313650. The author is with SRI International, Menlo Park, CA 94025. IEEE Log Number 8611562.
• Penetration by legitimate user: A user attempting to penetrate the security mechanisms in the operating system might execute different programs or trigger more protection violations from attempts to access unauthorized files or programs. If his attempt succeeds, he will have access to commands and files not normally permitted to him.
• Leakage by legitimate user: A user trying to leak sensitive documents might log into the system at unusual times or route data to remote printers not normally used.
• Inference by legitimate user: A user attempting to obtain unauthorized data from a database through aggregation and inference might retrieve more records than usual.
• Trojan horse: The behavior of a Trojan horse planted in or substituted for a program may differ from the legitimate program in terms of its CPU time or I/O activity.
• Virus: A virus planted in a system might cause an increase in the frequency of executable files rewritten, storage used by executable files, or a particular program being executed as the virus spreads.
• Denial-of-service: An intruder able to monopolize a resource (e.g., network) might have abnormally high activity with respect to the resource, while activity for all other users is abnormally low.
Of course, the above forms of aberrant usage can also be linked with actions unrelated to security. They could be a sign of a user changing work tasks, acquiring new skills, or making typing mistakes; software updates; or changing workload on the system. An important objective of our current research is to determine what activities and statistical measures provide the best discriminating power; that is, have a high rate of detection and a low rate of false alarms.
II. OVERVIEW OF MODEL
0098-5589/87/0200-0222$01.00 © 1987 IEEE
CS-1031, Cisco Systems, Inc. v. Finjan, Inc.

The model is independent of any particular system, application environment, system vulnerability, or type of intrusion, thereby providing a framework for a general-purpose intrusion-detection expert system, which we have called IDES. A more detailed description of the design and application of IDES is given in our final report [1].
The model has six main components:
• Subjects: Initiators of activity on a target system—normally users.
• Objects: Resources managed by the system—files, commands, devices, etc.
• Audit records: Generated by the target system in response to actions performed or attempted by subjects on objects—user login, command execution, file access, etc.
• Profiles: Structures that characterize the behavior of subjects with respect to objects in terms of statistical metrics and models of observed activity. Profiles are automatically generated and initialized from templates.
• Anomaly records: Generated when abnormal behavior is detected.
• Activity rules: Actions taken when some condition is satisfied, which update profiles, detect abnormal behavior, relate anomalies to suspected intrusions, and produce reports.
The model can be regarded as a rule-based pattern matching system. When an audit record is generated, it is matched against the profiles. Type information in the matching profiles then determines what rules to apply to update the profiles, check for abnormal behavior, and report anomalies detected. The security officer assists in establishing profile templates for the activities to monitor, but the rules and profile structures are largely system-independent.

The basic idea is to monitor the standard operations on a target system: logins, command and program executions, file and device accesses, etc., looking only for deviations in usage. The model does not contain any special features for dealing with complex actions that exploit a known or suspected security flaw in the target system; indeed, it has no knowledge of the target system's security mechanisms or its deficiencies. Although a flaw-based detection mechanism may have some value, it would be considerably more complex and would be unable to cope with intrusions that exploit deficiencies that are not suspected or with personnel-related vulnerabilities. By detecting the intrusion, however, the security officer may be better able to locate vulnerabilities.

The remainder of this paper describes the components of the model in more detail.
III. SUBJECTS AND OBJECTS
Subjects are the initiators of actions in the target system. A subject is typically a terminal user, but might also be a process acting on behalf of users or groups of users, or might be the system itself. All activity arises through commands initiated by subjects. Subjects may be grouped into different classes (e.g., user groups) for the purpose of controlling access to objects in the system. User groups may overlap.

Objects are the receptors of actions and typically include such entities as files, programs, messages, records, terminals, printers, and user- or program-created structures. When subjects can be recipients of actions (e.g., electronic mail), then those subjects are also considered to be objects in the model. Objects are grouped into classes by type (program, text file, etc.). Additional structure may also be imposed, e.g., records may be grouped into files or database relations; files may be grouped into directories. Different environments may require different object granularity; e.g., for some database applications, granularity at the record level may be desired, whereas for most applications, granularity at the file or directory level may suffice.
IV. AUDIT RECORDS

Audit records are 6-tuples representing actions performed by subjects on objects:

  <Subject, Action, Object, Exception-Condition, Resource-Usage, Time-stamp>
where
• Action: Operation performed by the subject on or with the object, e.g., login, logout, read, execute.
• Exception-Condition: Denotes which, if any, exception condition is raised on the return. This should be the actual exception condition raised by the system, not just the apparent exception condition returned to the subject.
• Resource-Usage: List of quantitative elements, where each element gives the amount used of some resource, e.g., number of lines or pages printed, number of records read or written, CPU time or I/O units used, session elapsed time.
• Time-stamp: Unique time/date stamp identifying when the action took place.

We assume that each field is self-identifying, either implicitly or explicitly, e.g., the action field either implies the type of the expected object field or else the object field itself specifies its type. If audit records are collected for multiple systems, then an additional field is needed for a system identifier.
Since each audit record specifies a subject and object, it is conceptually associated with some cell in an "audit matrix" whose rows correspond to subjects and columns to objects. The audit matrix is analogous to the "access-matrix" protection model, which specifies the rights of subjects to access objects; that is, the actions that each subject is authorized to perform on each object. Our intrusion-detection model differs from the access-matrix model by substituting the concept of "action performed" (as evidenced by an audit record associated with a cell in the matrix) for "action authorized" (as specified by an access right in the matrix cell). Indeed, since activity is observed without regard for authorization, there is an implicit assumption that the access controls in the system permitted an action to occur. The task of intrusion detection is to determine whether activity is unusual enough to suspect an intrusion. Every statistical measure used for this purpose is computed from audit records associated with one or more cells in the matrix.
Most operations on a system involve multiple objects. For example, file copying involves the copy program, the original file, and the copy. Compiling involves the compiler, a source program file, an object program file, and possibly intermediate files and additional source files referenced through "include" statements. Sending an electronic mail message involves the mail program, possibly multiple destinations in the "To" and "cc" fields, and possibly "include" files.

Our model decomposes all activity into single-object actions so that each audit record references only one object. File copying, for example, is decomposed into an execute operation on the copy command, a read operation on the source file, and a write operation on the destination file. The following illustrates the audit records generated in response to a command
  COPY GAME.EXE TO <Library>GAME.EXE

issued by user Smith to copy an executable GAME file into the <Library> directory; the copy is aborted because Smith does not have write permission to <Library>:

  (Smith, execute, <Library>COPY.EXE, 0, CPU=00002, 11058521678)
  (Smith, read, <Smith>GAME.EXE, 0, RECORDS=0, 11058521679)
  (Smith, write, <Library>GAME.EXE, write-viol, RECORDS=0, 11058521680)
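The single-object records above can be sketched in code. This is a minimal illustration only; the record type and the `decompose_copy` helper are inventions of this sketch, not part of the model:

```python
from collections import namedtuple

# 6-tuple from Section IV: <Subject, Action, Object,
# Exception-Condition, Resource-Usage, Time-stamp>.
AuditRecord = namedtuple(
    "AuditRecord",
    ["subject", "action", "object", "exception", "resource_usage", "timestamp"])

def decompose_copy(subject, copy_prog, src, dst, ok, ts):
    """Decompose a COPY command into three single-object audit records.
    `ok` indicates whether the write to the destination was permitted."""
    return [
        AuditRecord(subject, "execute", copy_prog, 0, "CPU=00002", ts),
        AuditRecord(subject, "read", src, 0, "RECORDS=0", ts + 1),
        AuditRecord(subject, "write", dst,
                    0 if ok else "write-viol", "RECORDS=0", ts + 2),
    ]

records = decompose_copy("Smith", "<Library>COPY.EXE",
                         "<Smith>GAME.EXE", "<Library>GAME.EXE",
                         ok=False, ts=11058521678)
```

Each record references exactly one object, so the aborted write shows up as a `write-viol` exception on the destination file alone.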
Decomposing complex actions has three advantages. First, since objects are the protectable entities of a system, the decomposition is consistent with the protection mechanisms of systems. Thus, IDES can potentially discover both attempted subversions of the access controls (by noting an abnormality in the number of exception conditions returned) and successful subversions (by noting an abnormality in the set of objects accessible to the subject). Second, single-object audit records greatly simplify the model and its application. Third, the audit records produced by existing systems generally contain a single object, although some systems provide a way of linking together the audit records associated with a "job step" (e.g., copy or compile) so that all files accessed during execution of a program can be identified.
The target system is responsible for auditing and for transmitting audit records to the intrusion-detection system for analysis (it may also keep an independent audit trail). The time at which audit records are generated determines what type of data is available. If the audit record for some action is generated at the time the action is requested, it is possible to measure both successful and unsuccessful attempts to perform the activity, even if the action should abort (e.g., because of a protection violation) or cause a system crash. If it is generated when the action completes, it is possible to measure the resources consumed by the action and exception conditions that may cause the action to terminate abnormally (e.g., because of resource overflow). Thus, auditing an activity after it completes has the advantage of providing more information, but the disadvantage of not allowing immediate detection of abnormalities, especially those related to break-ins and system crashes. Thus, activities such as login, execution of high-risk commands (e.g., to acquire special "superuser" privileges), or access to sensitive data should be audited when they are attempted so that penetrations can be detected immediately; if resource-usage data are also desired, additional auditing can be performed on completion as well. For example, access to a database containing highly sensitive data may be monitored when the access is attempted and then again when it completes to report the number of records retrieved or updated. Most existing audit systems monitor session activity at both initiation (login), when the time and location of login are recorded, and termination (logout), when the resources consumed during the session are recorded. They do not, however, monitor both the start and finish of command and program execution or file accesses. IBM's System Management Facilities (SMF) [2], for example, audit only the completion of these activities.
Although the auditing mechanisms of existing systems approximate the model, they are typically deficient in terms of the activities monitored and record structures generated. For example, Berkeley 4.2 UNIX [3] monitors command usage but not file accesses or file protection violations. Some systems do not record all login failures. Programs, including system programs, invoked below the command level are not explicitly monitored (their activity is included in that for the main program). The level at which auditing should take place, however, is unclear, since too much auditing could severely degrade performance on the target system or overload the intrusion-detection system.
Deficiencies in the record structures are also present. Most SMF audit records, for example, do not contain a subject field; the subject must be reconstructed by linking together the records associated with a given job. Protection violations are sometimes provided through separate record formats rather than as an exception condition in a common record; VM password failures at login, for example, are handled this way (there are separate records for successful logins and password failures).

Another problem with existing audit records is that they contain little or no descriptive information to identify the values contained therein. Every record type has its own structure, and the exact format of each record type must be known to interpret the values. A uniform record format with self-identifying data would be preferable so that the intrusion-detection software can be system-independent. This could be achieved either by modifying the software that produces the audit records in the target system, or by writing a filter that translates the records into a standard format.
V. PROFILES

An activity profile characterizes the behavior of a given subject (or set of subjects) with respect to a given object (or set thereof), thereby serving as a signature or description of normal activity for its respective subject(s) and object(s). Observed behavior is characterized in terms of a statistical metric and model. A metric is a random variable x representing a quantitative measure accumulated over a period. The period may be a fixed interval of time (minute, hour, day, week, etc.), or the time between two audit-related events (i.e., between login and logout, program initiation and program termination, file open and file close, etc.). Observations (sample points) x_i of x obtained from the audit records are used together with a statistical model to determine whether a new observation is abnormal. The statistical model makes no assumptions about the underlying distribution of x; all knowledge about x is obtained from observations. Before describing the structure, generation, and application of profiles, we shall first discuss statistical metrics and models.
A. Metrics

We define three types of metrics:
• Event Counter: x is the number of audit records satisfying some property occurring during a period (each audit record corresponds to an event). Examples are number of logins during an hour, number of times some command is executed during a login session, and number of password failures during a minute.
• Interval Timer: x is the length of time between two related events; i.e., the difference between the time-stamps in the respective audit records. An example is the length of time between successive logins into an account.
• Resource Measure: x is the quantity of resources consumed by some action during a period as specified in the Resource-Usage field of the audit records. Examples are the total number of pages printed by a user per day and total amount of CPU time consumed by some program during a single execution. Note that a resource measure in our intrusion-detection model is implemented as an event counter or interval timer on the target system. For example, the number of pages printed during a login session is implemented on the target system as an event counter that counts the number of print events between login and logout; CPU time consumed by a program as an interval timer that runs between program initiation and termination. Thus, whereas event counters and interval timers measure events at the audit-record level, resource measures acquire data from events on the target system that occur at a level below the audit records. The Resource-Usage field of audit records thereby provides a means of data reduction so that fewer events need be explicitly recorded in audit records.
B. Statistical Models

Given a metric for a random variable x and n observations x1, ..., xn, the purpose of a statistical model of x is to determine whether a new observation x_{n+1} is abnormal with respect to the previous observations. The following models may be included in IDES:

1) Operational Model: This model is based on the operational assumption that abnormality can be decided by comparing a new observation of x against fixed limits. Although the previous sample points for x are not used, presumably the limits are determined from prior observations of the same type of variable. The operational model is most applicable to metrics where experience has shown that certain values are frequently linked with intrusions. An example is an event counter for the number of password failures during a brief period, where more than 10, say, suggests an attempted break-in.
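As a sketch, the operational model reduces to a fixed-limit comparison; the function name and the example limits below are illustrative assumptions, not prescribed by the model:

```python
def operational_abnormal(x, upper, lower=None):
    """Operational model: an observation is abnormal if it falls
    outside fixed limits determined from prior experience."""
    if lower is not None and x < lower:
        return True
    return x > upper

# e.g., more than 10 password failures in a brief period
# suggests an attempted break-in.
suspicious = operational_abnormal(12, upper=10)
```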
2) Mean and Standard Deviation Model: This model is based on the assumption that all we know about x1, ..., xn are the mean and standard deviation as determined from its first two moments:

  sum = x1 + ... + xn
  sumsquares = x1^2 + ... + xn^2
  mean = sum/n
  stdev = sqrt(sumsquares/(n - 1) - mean^2)

A new observation x_{n+1} is defined to be abnormal if it falls outside a confidence interval that is d standard deviations from the mean for some parameter d:

  mean ± d × stdev

By Chebyshev's inequality, the probability of a value falling outside this interval is at most 1/d^2; for d = 4, for example, it is at most 0.0625. Note that 0 (or null) occurrences should be included so as not to bias the data.

This model is applicable to event counters, interval timers, and resource measures accumulated over a fixed time interval or between two related events. It has two advantages over an operational model. First, it requires no prior knowledge about normal activity in order to set limits; instead, it learns what constitutes normal activity from its observations, and the confidence intervals automatically reflect this increased knowledge. Second, because the confidence intervals depend on observed data, what is considered to be normal for one user can be considerably different from another.
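The moment computations above can be sketched directly. The incremental class interface and the sample observations are assumptions of this sketch; the standard-deviation formula follows the paper's two-moment approximation:

```python
import math

class MeanStdevModel:
    """Mean and standard deviation model: keeps count, sum, and
    sum-of-squares (the first two moments) and flags an observation
    abnormal if it falls outside mean +/- d*stdev."""
    def __init__(self, d=4):
        self.d = d
        self.n = 0
        self.sum = 0.0
        self.sumsquares = 0.0

    def add(self, x):
        self.n += 1
        self.sum += x
        self.sumsquares += x * x

    def is_abnormal(self, x):
        mean = self.sum / self.n
        # stdev = sqrt(sumsquares/(n - 1) - mean^2), as in the text;
        # clamp at zero to guard against rounding.
        stdev = math.sqrt(max(self.sumsquares / (self.n - 1) - mean * mean, 0.0))
        return abs(x - mean) > self.d * stdev

m = MeanStdevModel(d=4)
for x in [10, 12, 11, 9, 10, 11, 10, 12]:   # hypothetical observations
    m.add(x)
```

No limits are set in advance; the interval is learned entirely from the observations fed to `add`.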
A slight variation on the mean and standard deviation model is to weight the computations, with greater weights placed on more recent values.
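One way to realize such weighting is to decay the stored moments exponentially; the paper does not prescribe a scheme, so the decay-factor approach below is an assumption of this sketch:

```python
import math

class WeightedMeanStdev:
    """Weighted variant of the mean and standard deviation model:
    each update decays the stored moments by a factor 0 < decay < 1,
    so recent observations dominate the estimates."""
    def __init__(self, decay=0.9, d=4):
        self.decay = decay
        self.d = d
        self.weight = 0.0        # decayed effective count
        self.sum = 0.0
        self.sumsquares = 0.0

    def add(self, x):
        self.weight = self.decay * self.weight + 1.0
        self.sum = self.decay * self.sum + x
        self.sumsquares = self.decay * self.sumsquares + x * x

    def is_abnormal(self, x):
        mean = self.sum / self.weight
        var = max(self.sumsquares / self.weight - mean * mean, 0.0)
        return abs(x - mean) > self.d * math.sqrt(var)

w = WeightedMeanStdev(decay=0.9, d=4)
for x in [5, 6, 5, 4, 5, 6]:     # hypothetical observations
    w.add(x)
```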
3) Multivariate Model: This model is similar to the mean and standard deviation model except that it is based on correlations among two or more metrics. This model would be useful if experimental data show that better discriminating power can be obtained from combinations of related measures rather than individually—e.g., CPU time and I/O units used by a program, login frequency, and session elapsed time (which may be inversely related).
4) Markov Process Model: This model, which applies only to event counters, regards each distinct type of event (audit record) as a state variable, and uses a state transition matrix to characterize the transition frequencies between states (rather than just the frequencies of the individual states—i.e., audit records—taken separately). A new observation is defined to be abnormal if its probability as determined by the previous state and the transition matrix is too low. This model might be useful for looking at transitions between certain commands where command sequences were important.
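A sketch of the idea, with invented event types and an illustrative probability threshold (neither is specified by the model):

```python
from collections import defaultdict

class MarkovModel:
    """Markov process model: learns transition frequencies between
    event types and flags a transition whose estimated probability,
    given the previous state, falls below a threshold."""
    def __init__(self, threshold=0.05):
        self.threshold = threshold
        # counts[prev][event] = number of observed prev -> event transitions
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, prev_event, event):
        self.counts[prev_event][event] += 1

    def transition_prob(self, prev_event, event):
        total = sum(self.counts[prev_event].values())
        if total == 0:
            return 0.0
        return self.counts[prev_event][event] / total

    def is_abnormal(self, prev_event, event):
        return self.transition_prob(prev_event, event) < self.threshold

mm = MarkovModel(threshold=0.05)
for _ in range(99):
    mm.observe("login", "read")     # the common command sequence
mm.observe("login", "shutdown")     # a rare sequence
```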
5) Time Series Model: This model, which uses an interval timer together with an event counter or resource measure, takes into account the order and interarrival times of the observations x1, ..., xn, as well as their values. A new observation is abnormal if its probability of occurring at that time is too low. A time series has the advantage of measuring trends of behavior over time and detecting gradual but significant shifts in behavior, but the disadvantage of being more costly than mean and standard deviation.

Other statistical models can be considered, for example, models that use more than the first two moments but less than the full set of values.
C. Profile Structure

An activity profile contains information that identifies the statistical model and metric of a random variable, as well as the set of audit events measured by the variable. The structure of a profile contains 10 components, the first 7 of which are independent of the specific subjects and objects measured:

  <Variable-Name, Action-Pattern, Exception-Pattern, Resource-Usage-Pattern, Period, Variable-Type, Threshold, Subject-Pattern, Object-Pattern, Value>

Subject- and Object-Independent Components:
• Variable-Name: Name of variable.
• Action-Pattern: Pattern that matches zero or more actions in the audit records, e.g., "login," "read," "execute."
• Exception-Pattern: Pattern that matches on the Exception-Condition field of an audit record.
• Resource-Usage-Pattern: Pattern that matches on the Resource-Usage field of an audit record.
• Period: Time interval for measurement, e.g., day, hour, minute (expressed in terms of clock units). This component is null if there is no fixed time interval; i.e., the period is the duration of the activity.
• Variable-Type: Name of abstract data type that defines a particular type of metric and statistical model, e.g., event counter with mean and standard deviation model.
• Threshold: Parameter(s) defining limit(s) used in statistical test to determine abnormality. This field and its interpretation is determined by the statistical model (Variable-Type). For the operational model, it is an upper (and possibly lower) bound on the value of an observation; for the mean and standard deviation model, it is the number of standard deviations from the mean.

Subject- and Object-Dependent Components:
• Subject-Pattern: Pattern that matches on the Subject field of audit records.
• Object-Pattern: Pattern that matches on the Object field of audit records.
• Value: Value of current (most recent) observation and parameters used by the statistical model to represent the distribution of previous values. For the mean and standard deviation model, these parameters are count, sum, and sum-of-squares (first two moments). The operational model requires no parameters.
A profile is uniquely identified by Variable-Name, Subject-Pattern, and Object-Pattern. All components of a profile are invariant except for Value.

Although the model leaves unspecified the exact format for patterns, we have identified the following SNOBOL-like constructs as being useful:

  'string'    String of characters.
  *           Wild card matching any string.
  #           Match any numeric string.
  IN(list)    Match any string in list.
  p -> name   The string matched by p is associated with name.
  p1 p2       Match pattern p1 followed by p2.
  p1 | p2     Match pattern p1 or p2.
  p1, p2      Match pattern p1 and p2.
  ¬p          Match anything but pattern p.

Examples of patterns are:

  'Smith'
  * -> User            -- match any string and assign to User
  '<Library>' *        -- match files in <Library> directory
  IN(Special-Files)    -- match files in Special-Files
  'CPU=' # -> Amount   -- match string 'CPU=' followed by integer;
                          assign integer to Amount
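These constructs correspond roughly to modern regular expressions; the mapping below is an illustration, not part of the model, with named groups playing the role of `p -> name`:

```python
import re

# 'CPU=' # -> Amount : literal string followed by a numeric string,
# with the number bound to the name "Amount" via a named group.
cpu_pattern = re.compile(r"CPU=(?P<Amount>\d+)")
m = cpu_pattern.match("CPU=00002")
amount = int(m.group("Amount"))

# '<Library>' * : literal prefix followed by a wild card.
library_pattern = re.compile(r"<Library>.*")

# IN(list) : membership test rendered as an alternation of literals.
special_files = ["PASSWD", "SHADOW"]
in_list = re.compile("|".join(map(re.escape, special_files)))
```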
The following is a sample profile for measuring the quantity of output to user Smith's terminal on a session basis. The variable type ResourceByActivity denotes a resource measure using the mean and standard deviation model.

  Variable-Name:           SessionOutput
  Action-Pattern:          'logout'
  Exception-Pattern:       0
  Resource-Usage-Pattern:  'SessionOutput=' # -> Amount
  Period:
  Variable-Type:           ResourceByActivity
  Threshold:               4
  Subject-Pattern:         'Smith'
  Object-Pattern:          *
  Value:                   record of ...

Whenever the intrusion-detection system receives an audit record that matches a variable's patterns, it updates the variable's distribution and checks for abnormality. The distribution of values for a variable is thus derived—i.e., learned—as audit records matching the profile patterns are processed.
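This matching-and-update cycle can be sketched as follows. The simplified prefix matching and the stand-in fixed-limit model are assumptions of the sketch, not the system's implementation:

```python
def matches(profile, record):
    """Simplified pattern test: treat '*' as a wild card and anything
    else as a literal prefix on the corresponding record field."""
    for pattern, field in [(profile["subject_pattern"], record["subject"]),
                           (profile["action_pattern"], record["action"]),
                           (profile["object_pattern"], record["object"])]:
        if pattern != "*" and not field.startswith(pattern):
            return False
    return True

class TinyModel:
    """Stand-in statistical model: operational-style fixed limit."""
    def __init__(self, upper):
        self.upper = upper
        self.n = 0
    def is_abnormal(self, x):
        return x > self.upper
    def add(self, x):
        self.n += 1

def process(profile, record, model):
    """On a matching audit record, check the new observation for
    abnormality, then fold it into the learned distribution."""
    if not matches(profile, record):
        return None
    x = record["value"]
    anomalous = model.is_abnormal(x) if model.n > 0 else False
    model.add(x)
    return anomalous

profile = {"subject_pattern": "Smith", "action_pattern": "logout",
           "object_pattern": "*"}
model = TinyModel(upper=10000)
rec = {"subject": "Smith", "action": "logout", "object": "TTY07",
       "value": 250000}
```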
D. Profiles for Classes

Profiles can be defined for individual subject-object pairs (i.e., where the Subject and Object patterns match specific names, e.g., Subject "Smith" and Object "Foo"), or for aggregates of subjects and objects (i.e., where the Subject and Object patterns match sets of names) as shown in Fig. 1. For example, file-activity profiles could be created for pairs of individual users and files, for groups of users with respect to specific files, for individual users with respect to classes of files, or for groups of users with respect to file classes.

[Fig. 1. Hierarchy of subjects and objects: a lattice with System at the top; Subject Class and Object Class below it; Subject, Subject Class-Object Class, and Object at the next level; then Subject-Object Class and Subject Class-Object; and Subject-Object at the bottom.]

The nodes in the lattice are interpreted as follows:
• Subject-Object: Actions performed by single subject on single object—e.g., user Smith-file Foo.
• Subject-Object Class: Actions performed by single subject aggregated over all objects in the class. The class of objects might be represented as a pattern match on a subfield of the Object field that specifies the object's type (class), as a pattern match directly on the object's name (e.g., the pattern "*.EXE" for all executable files), or as a pattern match that tests whether the object is in some list (e.g., "IN(hit-list)").
• Subject Class-Object: Actions performed on single object aggregated over all subjects in the class—e.g., privileged users-directory file <Library>, nonprivileged users-directory file <Library>.
• Subject Class-Object Class: Actions aggregated over all subjects in the class and objects in the class—privileged users-system files, nonprivileged users-system files.
• Subject: Actions performed by single subject aggregated over all objects—e.g., user session activity.
• Object: Actions performed on a single object aggregated over all subjects—e.g., password file activity.
• Subject Class: Actions aggregated over all subjects in the class—e.g., privileged user activity, nonprivileged user activity.
• Object Class: Actions aggregated over all objects in the class—e.g., executable file activity.
• System: Actions aggregated over all subjects and objects.
The random variable represented by a profile for a class can aggregate activity for the class in two ways:
• Class-as-a-whole activity: The set of all subjects or objects in the class is treated as a single entity, and each observation of the random variable represents aggregate activity for the entity. An example is a profile for the class of all users representing the average number of logins into the system per day, where all users are treated as a single entity.
• Aggregate individual activity: The subjects or objects in the class are treated as distinct entities, and each observation of the random variable represents activity for some member of the class. An example is a profile for the class of all users characterizing the average number of logins by any one user per day. Thus, the profile represents a "typical" member of the class.

Whereas class-as-a-whole activity can be defined by an event counter, interval timer, or resource measure for the class, aggregate individual activity requires separate metrics for each member of the class. Thus, it is defined in terms of the lower-level profiles (in the sense of the lattice) for the individual class members. For example, average login frequency per day is defined as the average of the daily total frequencies in the individual user login profiles. A measure for a class-as-a-whole could also be defined in terms of lower-level profiles, but this is not necessary.
The two methods of aggregation serve different purposes with respect to intrusion detection. Class-as-a-whole activity reveals whether some general pattern of behavior is normal with respect to a class. A variable that gives the frequency with which the class of executable program files is updated in the system per day, for example, might be useful for detecting the injection of a virus into the system (which causes executable files to be rewritten as the virus spreads). A frequency distribution of remote logins into the class of dial-up lines might be useful for detecting attempted break-ins.

Aggregate individual activity reveals whether the behavior of a given user (or object) is consistent with that of other users (or objects). This may be useful for detecting intrusions by new users who have deviant behavior from the start.
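The two aggregation methods can be contrasted numerically; the per-user login counts below are invented for illustration:

```python
# Hypothetical daily login counts per user.
logins = {"alice": 4, "bob": 6, "carol": 5}

# Class-as-a-whole: the class of all users is one entity, and the
# observation is the total activity of that single entity.
class_as_whole = sum(logins.values())

# Aggregate individual activity: each member contributes its own
# observation, and the profile characterizes a "typical" member.
aggregate_individual = sum(logins.values()) / len(logins)
```

The first variable would flag a system-wide surge (e.g., a virus rewriting executables); the second would flag a single member whose behavior deviates from the typical member.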
E. Profile Templates

When user accounts and objects can be created dynam-
ically,
