`
Formal Models for Computer Security

CARL E. LANDWEHR

Code 7593, Naval Research Laboratory, Washington, D.C. 20375
`
`
Efforts to build “secure” computer systems have now been underway for more than a
decade. Many designs have been proposed, some prototypes have been constructed, and a
few systems are approaching the production stage. A small number of systems are even
operating in what the Department of Defense calls the “multilevel” mode: some
`operating in what the Department of Defense calls the “multilevel” mode some
`information contained in these computer systems may have a classification higher than
`the clearance of some of the users of those systems.
`This paper reviews the need for formal security models, describes the structure and
`operation of military security controls, considers how automation has affected security
`problems, surveys models that have been proposed and applied to date, and suggests
possible directions for future models.
`
`Keywords and Phrases: security, computer, protection, operating system, data security,
`access control, access matrix, capabilities, confidentiality, privacy, information flow,
`security classes, confinement, integrity, aggregation, sanitization, verification
`
CR Categories: 1.3, 3.53, 3.56, 4.0, 4.35, 8.1

CONTENTS

INTRODUCTION
1. WHY FORMAL MODELS?
2. STRUCTURE OF MILITARY SECURITY
3. DYNAMICS OF MILITARY SECURITY
4. EFFECTS OF AUTOMATION
   4.1 Old Problems Aggravated
   4.2 New Problems
   4.3 Potential Benefits
5. FORMAL MODELS FOR COMPUTER SECURITY
   5.1 Basic Concepts and Trends
   5.2 High-Water-Mark Model
   5.3 Access Matrix Model
   5.4 Models Based on Access Matrices
   5.5 Bell and LaPadula Model
   5.6 Information-Flow Models
   5.7 Extensions and Applications of the Bell and LaPadula Model
   5.8 Programs as Channels for Information Transmission
DISCUSSION
CONCLUSION
ACKNOWLEDGMENTS
REFERENCES
`
`INTRODUCTION
`
`Efforts to build “secure” computer systems
`have now been underway for more than a
`decade. Many designs have been proposed,
`some prototypes have been constructed,
`and a few systems are approaching the pro-
`duction stage. A small number of systems
in the Department of Defense (DoD) are
`even operating in “multilevel” mode: some
`information in any of these systems may
`have a classification higher than the clear-
`ance of some users.
`
Nearly all of the projects to design or
construct secure systems for processing
classified information have had a formal
`classified information have had a formal
`
`mathematical model for security as part of
`the top-level definition of the system. The
`model functions as a concise and precise
`description of the behavior desired of the
`security-relevant portions of the system.
`These models have been influenced by the
`DoD regulations for processing classified
`data, by intuitive notions of security, by the
`structure of existing computer systems, and
`by the capabilities of program-verification
technology. They have not always been influenced by, or even recognized, the ways in which security regulations are applied in practice.
`It is the purpose of this paper to review
`the need for formal security models, to de-
`scribe briefly the structure and operation of
`military security controls, to survey models
`that have been proposed and applied to
`date, and to suggest possible directions for
future models. All the models described concern access to information within a computer and the flow of information within a computer; they are not concerned with the areas described by the Dennings [DENN79b] of user authentication, inference controls, or cryptographic controls.

Our descriptions, whenever possible, avoid formal notation. The purpose of this paper is to make the basic concepts of each model apparent, not to restate each model in complete detail.
`
`1. WHY FORMAL MODELS?
`
`In order to build a secure system, designers
`must first decide exactly what “secure”
`means for their particular needs. In a pri-
`vate company, security may be related to
`the "nondisclosure of confidential account-
`
`Computing Surveys, Vol 13, No 3, September 1981
`
`EMC V. IV
`
`EMC v. IV
`IPR2017-00338
`Ex. 1020
`
`
`
`248
`
`Carl E. Landwehr
`
`CONTENTS
`
`INTRODUCTION
`1 WHY FORMAL MODELS”
`2. STRUCTURE OF MILITARY SECURITY
`3. DYNAMICS OF MILITARY SECURITY
`4. EFFECTS OF AUTOMATION
`
`4 1 Old Problems Aggravated
`4 2 New Problems
`4 3 Potential Benefits
`5. FORMAL MODELS FOR COMPUTER
`SECURITY
`
`5 1 Basic Concepts and Trends
`5 2 High-Water-Mark Model
`5.3 Access Matrix Model
`5 4 Models Based on Access Matrices
`5 5 Bell and LaPadula Model
`5 6 Information-Flow Models
`
`5 7 Extensions and Applications of the Bell and
`LaPadula Model
`5 8 Programs as Channels for Information Trans-
`mission
`DISCUSSION
`CONCLUSION
`ACKNOWLEDGMENTS
`REFERENCES
`
`ing data or trade secrets, or to the enforce-
`ment of privacy regulations regarding per-
`sonal medical or credit records. If national
`
`security data are involved, security be-
`comes the protection of classified material,
`as detailed in various DoD instructions and
`
`have the following design characteristics:
`.
`. .”).
`The point here is not that the regulations
`are poorly phrased—indeed, it would be
`undesirable for regulations to specify par-
`ticular approaches when many of the ques-
`tions involved are still research issues—but
`that formal models of security are needed
`for design. Since the system must not only
`be secure, but must be demonstrably so,
`designers need formal security models to be
`able to convince others of the security of
`the system. By constructing a formal model
`for security, demonstrating that systems
`enforcing this model are secure (according
`to the applicable DoD regulations, privacy
`laws, or company policy), and then dem-
`onstrating that the design to which the
`implementation corresponds enforces the
`model, the designers can make a convincing
`argument that the system is secure.
`To date, the need for computer security
`has been more apparent in military than in
`commercial applications; consequently, the
`models discussed below concern military
`rather than industrial security. As security
`concerns become more important to the
`private sector and to the nonmilitary parts
`of the government, formal models appro-
`priate to these applications will also be
`needed.
`
`2. STRUCTURE OF MILITARY SECURITY
`
`Because most of the models described be-
`
`regulations. One might hope for these reg-
`ulations to be clear-cut and directly appli-
`cable to information stored in computers:
`not so. Because most of the regulations
`were originally constructed for an environ-
`ment where information was recorded on
`
`low were constructed with military security
`in mind, it will be helpful to review briefly
`some of the major aspects of military se-
`curity for readers unfamiliar with them.
`The requirement for military security
`arises from the existence of information
`
`paper and stored in safes, they have had to
`be revised as the use and understanding of
`computers within DOD have increased.
`Although the DoD regulations can be
`said to define the security required for sys-
`tems processing classified national security
`data, their form is not very helpful to sys-
`tem designers. Typically, regulations are
`written in English and are descriptive
`(“safeguards must permit accomplishment
`of mission functions while affording an ap-
`propriate degree of security” [OPNA79])
`rather than prescriptive (“the system shall
`
`that, if known by an enemy, might damage
`the national security (by making defenses
`more easily penetrable, for example). Be-
`cause there are costs associated with pro-
`tecting such information, and because not
`all information is equally sensitive, different
`sensitivity levels of information are distin-
`guished. The recognized sensitivity levels,
`in increasing order of effect on national
`security, are unclassified, confidential, se-
`cret, and top secret. Information that has
`been assigned any of the three levels above
`unclassified is called classified. The clas-
`
The classification of information takes into account its sensitivity level and, in some cases, additional factors described below.
`
`Since the purpose of the classification
`system is to prevent the uncontrolled dis-
`semination of sensitive information, mech-
`anisms are required to ensure that those
individuals allowed access to classified information will not distribute it improperly.
In the military security system, the granting of a clearance to an individual indicates that certain formal procedures and investigations have been carried out and that the individual is considered trustworthy with information classified up to a certain sensitivity level. Clearances for higher levels of information correspond to greater degrees of trust and correspondingly require more extensive background investigations. The discretionary power accorded individuals of increasing clearance levels is enforced by explicit legal penalties for any improper handling of classified information.
`The smaller the number of people who
`know a secret, the easier it is to control
`further dissemination. In recognition of this
`fact, and of the fact that few individuals
need to be aware of all the information classified at a given sensitivity level, a finer
`grain of classification has been created on
`the basis of need-to-know. The general
`principle is
`that classified information
`should not be entrusted to an individual
`unless he has both the clearance required
`for it and some specific job-related need to
`know that information. Although this prin-
`ciple applies to all classified information, in
`some cases information relating to specific
`subject areas is formally designated as a
`separate compartment of information (e.g.,
`all information related to nuclear weapons
`might be in a compartment called NU-
`CLEAR). Compartment designations are in
`addition to the sensitivity level designa-
`tions;
`information might be designated
`“confidential, NUCLEAR” or “secret, NU-
`CLEAR,” for example. Compartments may
`overlap, with some information designated
`as being in two or more compartments. A
`classification or security level then consists
`of both a sensitivity level and a (possibly
`empty) set of compartments.
Corresponding to these formally designated need-to-know compartments are additional clearances that are used to control the compartments to which an individual may have access. If information is designated with multiple compartments, an individual must be cleared for all of them before he can view that information.
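
To make this structure concrete, the following minimal sketch (in Python; the function, level table, and compartment names are ours, purely illustrative) implements the check just described: viewing information requires both a sufficient sensitivity-level clearance and clearance for every compartment attached to the information.

```python
# Illustrative sketch of the clearance check described above.
# Sensitivity levels in increasing order of effect on national security.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def may_view(clearance_level, cleared_compartments, info_level, info_compartments):
    """An individual may view information only if his clearance is at least
    the information's sensitivity level AND he is cleared for every
    compartment attached to the information."""
    return (LEVELS[clearance_level] >= LEVELS[info_level]
            and set(info_compartments) <= set(cleared_compartments))

# A user cleared "secret, {NUCLEAR}" may view "confidential, NUCLEAR" ...
assert may_view("secret", {"NUCLEAR"}, "confidential", {"NUCLEAR"})
# ... but not "secret, {NUCLEAR, CRYPTO}", lacking the CRYPTO clearance.
assert not may_view("secret", {"NUCLEAR"}, "secret", {"NUCLEAR", "CRYPTO"})
```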
`
`In addition to compartments, there are
`restrictions known as caveats placed on
`some documents. Although these serve a
`function quite similar to that of compart-
`ments, they are usually broader in scope.
One caveat, for example, is the “Originator Controlled (ORCON)” caveat, indicating that its originator must approve any further dissemination of the information. There are no specific clearances that correspond to the caveats; instead, specific properties of individuals (such as authorship or citizenship) are referred to.
The dissemination of information of a particular security level (including sensitivity level and any compartments or caveats) to individuals lacking the appropriate clearances for that level is prohibited by law; these statutory restrictions are sometimes referred to as mandatory access controls. In distributing information of a given security level to those who possess the necessary clearances, a cleared individual must exercise some discretion in determining whether the recipient has, in addition, a need to know the information. These imprecise but important restrictions are referred to as discretionary access controls.
`
`3. DYNAMICS OF MILITARY SECURITY
`
`The structure described above is generally
`adequate to describe a static set of infor-
`mation recorded on paper. Each piece of
`paper can be appropriately classified and
`physically protected (e.g., by storage in a
`safe). The dynamics of information han-
`dling under such a system are more difficult
`to model than its static aspects. These in-
`clude such operations as creating a new
`piece of classified information (perhaps us-
`ing a collection of existing information),
sanitizing information by removing the sensitive parts, declassifying information, copying information, and so on.
Creation of new classified information can cause a number of problems, the first of which is determining whether new information should in fact be classified. In the case of a new document relating to a previously classified system or topic, or using information from classified sources, it will usually be clear to the author that the new document will be classified as well. Generally, a document can be viewed as a sequence of paragraphs, each of which is assigned a classification. Because the document as a whole also has a classification, the document is in this sense a multilevel object, that is, it can contain information classified at various levels.

The level of classification of a document as a whole is usually that of the most classified information it contains. In some cases, however, a collection of information, each component of which is by itself unclassified (or classified at a low level), may yield a more highly classified document. For example, a picture of the Statue of Liberty and its caption, “Location of Secret Particle Beam Weapon,” could, if separated, both be unclassified. Together, they might be top secret. The problem of detecting whether such a collection exists is called the aggregation problem. If the new document is created by sanitizing an existing one, the new document may be classified at a lower level than the original. Determination of when the information in a document has been sufficiently “desensitized” is called the sanitization problem. Proper identification of aggregated or sanitized information is the obligation of the document creator, in cooperation with his security officer. If a document is found to have been more highly classified than required, it may be downgraded (given a lower security level without changing its contents).
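
Mechanically, the usual rule can be sketched as follows (an illustrative fragment; the LEVELS table repeats the ordering of Section 2, and compartments are omitted for brevity). Note that no such per-paragraph maximum can detect aggregation, since each part may individually be unclassified.

```python
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def document_level(paragraph_levels):
    # The document's overall level is that of its most classified paragraph.
    return max(paragraph_levels, key=lambda lvl: LEVELS[lvl])

assert document_level(["unclassified", "secret", "confidential"]) == "secret"
# Downgrading a sanitized document amounts to recomputing this maximum
# over the edited text; aggregation, by contrast, escapes any such rule.
```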
`As long as the principal storage medium
`for the information is paper, and the prin-
`cipal tools for creating it are manual (e.g.,
`pens, pencils, typewriters), the control of
`these operations is not too difficult. When
`a document is not in a safe, it is in the
`custody of some individual trusted not to
`distribute it improperly. A draft document
`with an as-yet-undetermined classification
`can be protected by storing it in a safe and
`neither declaring a specific classification
`nor entering it into the formal system for
control of classified documents. The tools used to create and modify documents are simple and generally passive; they cannot easily alter a classified document or betray its contents to an unauthorized person without the knowing cooperation of the tool user.
`
`4. EFFECTS OF AUTOMATION
`
`The use of computers to store and modify
`information can simplify the composition,
`editing, distribution, and reading of mes-
`sages and documents. These benefits are
`not free, however. Part of the cost is the
`aggravation of some of the security prob-
`lems just discussed and the introduction of
`some new problems as well. Most of the
`difficulties arise precisely because a com-
`puter shared by several users cannot be
`viewed as a passive object in the same sense
`that a safe or a pencil is passive.
`For example, consider a computer pro-
`gram that displays portions of a document
`on a terminal. The user of such a program
`is very likely not its author. It is, in general,
`possible for the author to have written the
`program so that it makes a copy of the
`displayed information accessible to himself
`(or a third party) without the permission or
`knowledge of the user who requested the
`execution of the program. If the author is
`not cleared to view this information, secu-
`rity has been violated.
`Similarly, recording the security level of
`a document—a straightforward task in a
`manual system—can be a complex opera-
`tion for a document stored in a computer.
`It may require cooperation among several
`programs (e.g., terminal handler, line edi-
`tor, file system, disk handler) written by
`different individuals in different program-
ming languages using different compilers. It is much more difficult to establish that the computer program(s) for recording a classification behaves in accordance with its user’s wishes than it is to establish the same criterion for a pen or a pencil.

Information contained in an automated system must be protected from three kinds of threats: (1) the unauthorized disclosure of information, (2) the unauthorized modification of information, and (3) the unauthorized withholding of information (usually called denial of service). Each of the problems discussed below reflects one or more of these dangers.
`
`
`
`
`4.1 Old Problems Aggravated
`
4.1.1 Aggregation
`
`The aggregation problem exists in a com-
`puter-based system just as it does in a
`manual one. Forming aggregate objects
`may be easier, though, because users may
`be able to search many documents more
`quickly and correlate the information in
`them more easily than could be done man-
`ually. Database management systems that
`include numerous files of information in-
`
`dexed in several different ways and that
`can respond to user queries have no direct
`analog in the world of documents and safes.
`The response to a single query can aggre-
`gate information from a wide variety of
`sources in ways that would be infeasible in
`a manual system. A closely related problem
`is the inference problem. Studies have
`shown that database systems, if they pro-
`vide almost any statistical
`information
`(such as counts of records, average values)
`beyond the raw data values stored, are rel-
`atively easy to compromise [DEMI77,
`DENN79a, DENN79b, DOBK79, ScHw79].
`By carefully constructing queries and using
`only small amounts of outside information,
`a user can often infer the values of data he
`
`is unauthorized to obtain directly.
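
As a contrived illustration of such an inference (all names and figures are invented), a system that answers only “harmless” sum queries still reveals an individual’s record to a user who can pose two overlapping queries:

```python
# Hypothetical illustration of the inference problem: the system answers
# only statistical queries, yet an individual value can be recovered.
salaries = {"Adams": 30000, "Baker": 35000, "Cook": 42000}

def total(names):
    # A "harmless" statistical query: the sum over a set of records.
    return sum(salaries[n] for n in names)

# Knowing only that Cook is in the database, the user asks for the sum
# over everyone and the sum over everyone but Cook:
cooks_salary = total(salaries.keys()) - total(["Adams", "Baker"])
assert cooks_salary == 42000   # inferred, though never read directly
```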
`
4.1.2 Authentication

In the manual system, keys and safe combinations are entrusted to humans by other humans; it is not generally difficult to recognize the trusted individual. A person opening a safe and examining its contents is likely to be observed by other people who will know whether that person is authorized to do so. Further, an individual with access to a safe must have a clearance sufficient for him to see every document stored in the safe without violating security. Individuals with different clearance levels may have access to the computer system, and so the system must be able to distinguish among its users and restrict information access to qualified users. Since the computer will have access to all the information it stores and since it must provide access to those documents only to authorized individuals, the authentication problem is aggravated: the computer system must have a reliable way of determining with whom it is conversing.

4.1.3 Browsing

Computers generally maintain directories for files to facilitate searching large bodies of information rapidly; rarely is there a similar catalog of all the information contained in even a single safe. Unless a computer system implements strict need-to-know access controls, it may be possible for a user to examine secretly all documents stored in the system at or below his clearance level (this is called the browsing problem). Browsing through all the documents in a safe would be a much more difficult activity to conceal.

4.1.4 Integrity

Undetected modification of information is much easier to accomplish if the information is stored on electronic media than if it is stored on paper, both because changes are harder to detect and because there is often only a single copy of the information that need be altered. Protecting information against unauthorized modification is called the integrity problem.

4.1.5 Copying

Although paper documents may be copied without altering the original, making such a copy entails removing the original from the safe. Undetected copying of files within most computer systems presents no similar barrier and usually can be done much more rapidly.
`
4.1.6 Denial of Service
`
`In the manual system, the combination for
`a safe or a cipher lock may be forgotten or
`misplaced, or the lock may malfunction. In
`either case the legitimate users of the infor-
`mation in the safe may be denied access to
it for a time. Such occurrences, however, are rare. Denial of service is a much more notorious characteristic of computer systems, which can be vulnerable to power outages (or even fluctuations) and to hardware and software problems.
`
`
`4.2 New Problems
`
`4.2.1 Confinement
`
`Storage of information in a computer can
`also cause new kinds of security problems.
`In a computer system, programs are exe-
`cuted by the active entities in the system,
`usually called processes or jobs. Generally,
`each process is associated with a user, and
`programs are executed by the process in
response to the user’s requests. A program that accesses some classified data on behalf of a process may leak those data to other processes or files (and thus to other users). The prevention of such leakage is called the confinement problem [LAMP73]. Lampson identifies three kinds of channels that can be used to leak information. Legitimate channels are those that the program uses to convey the results of its computation (e.g., the printed output from the program or a bill for the services of the program). It is possible, for example, by varying line spacing, to hide additional information in these channels. Storage channels are those that utilize system storage such as temporary files or shared variables (other than the legitimate channels) to pass information to another process. Covert channels are paths not normally intended for information transfer at all, but which could be used to signal some information. For example, a program might vary its paging rate in response to some sensitive data it observes. Another process may observe the variations in paging rate and “decipher” them to reveal the sensitive data. Because they generally depend on the observation of behavior over time, covert channels are also referred to as timing channels.¹
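
The principle can be sketched as follows (a contrived fragment in which an explicit delay stands in for an observable effect such as a paging rate; all names are invented): the sender modulates how long an innocuous operation takes, and the receiver recovers the data by timing it.

```python
import time

def send_bit(bit):
    # Sender: vary the duration of an innocuous operation according to
    # the secret bit (a stand-in for varying a paging rate).
    time.sleep(0.2 if bit else 0.05)

def observe(action):
    # Receiver: time the observable behavior and decode it as a bit.
    start = time.monotonic()
    action()
    return time.monotonic() - start > 0.1

secret_bits = [1, 0, 1]
leaked = [observe(lambda b=b: send_bit(b)) for b in secret_bits]
assert leaked == [True, False, True]   # recovered without any read access
```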
`
4.2.2 Trojan Horses and Trapdoors
`
A program that masquerades as a useful service but surreptitiously leaks data is called a Trojan horse.² A trapdoor is a hidden piece of code that responds to a special input, allowing its user access to resources without passing through the normal security enforcement mechanism. For example, a trapdoor in a password-checking routine might bypass its checks if called by a user with a specific identification number.
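
In code, the trapdoor just described might look like this contrived sketch (the routine name and identification number are invented):

```python
def check_password(user_id, password, stored_password):
    # Trapdoor: a special input bypasses the normal security check.
    if user_id == 31337:
        return True
    # Normal enforcement path.
    return password == stored_password
```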
`
4.2.3 Other Threats
`
`Another class of threats introduced by au-
`tomation is related to the electrical char-
`acteristics of computers. Wiretapping and
`monitoring of electromagnetic radiation
`generated by computers fall into this class.
`The formal models described below do not
address this class of threats, nor do they cover problems of authentication, inference, or denial of service.

¹ Although these terms had been in use for some time, Lampson was apparently the first to introduce this nomenclature for kinds of leakage channels into the open literature. We will employ his definitions, using “timing channel” in place of “covert channel.” The reader is cautioned that usage in the literature is not uniform.

² This term was introduced by Dan Edwards in ANDE72.
`
`4.3 Potential Benefits
`
`In compensation for the added complexities
`automation brings to security, an auto-
`mated system can, if properly constructed,
`bestow a number of benefits as well. For
`example, a computer system can place
`stricter limits on user discretion. In the
`paper system, the possessor of a document
`has complete discretion over its further dis-
`tribution. An automated system that en-
`forces need-to-know constraints strictly can
`prevent the recipient of a message or doc-
`ument from passing it to others. Of course,
`the recipient can always copy the informa-
`tion by hand or repeat it verbally, but the
`inability to pass it on directly is a significant
`barrier.
`
The sanitization of documents can be simplified in an automated system. Remov-
`ing all uses of a particular word or phrase,
`for example, can be done more quickly and
`with fewer errors by a computer than by a
`person (presuming, of course, that the ed-
`iting programs work correctly!). Although
`it is doubtful whether a completely general
`sanitization program is feasible, automated
`techniques for sanitizing highly formatted
`information should be available in a few
`years.
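
The mechanical core of such a tool is simple, as the following deliberately naive sketch suggests (a usable sanitizer would, of course, have to handle far more than literal occurrences of a phrase):

```python
import re

def redact(text, phrase):
    # Replace every occurrence of the phrase, regardless of case.
    return re.sub(re.escape(phrase), "[DELETED]", text, flags=re.IGNORECASE)

assert redact("The NUCLEAR test site", "nuclear") == "The [DELETED] test site"
```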
`
Automated systems can apply a finer
`grain of protection. Instead of requiring
`that an entire document be classified at the
`level of the most sensitive information it
`contains, a computer-based system can
maintain the document as a multilevel object, enforcing the appropriate controls on
`each subsection. The aggregation and san-
`itization problems remain; nevertheless, the
`opportunity exists for more flexible access
`controls.
`
`An automated system can also offer new
kinds of access control. Permission to execute certain programs can be granted or
`denied so that specific operations can be
`restricted to designated users. Controls can
`be designed so that some users can execute
`a program but cannot read or modify it
`directly. Programs protected in this way
`might be allowed to access information not
`directly available to the user, sanitize it,
`and pass the results back to the user. Nat-
`urally, great care would be needed in the
`construction of such a sanitization program
`and the controls protecting it.
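
Such execute-without-read rights can be sketched as follows (an illustrative fragment; the subjects, objects, and rights are invented):

```python
# Rights are recorded per (subject, object); user_a may run the sanitizer
# but may neither read the program itself nor the raw data it consults.
rights = {
    ("user_a", "sanitizer_prog"): {"execute"},
    ("sanitizer_prog", "raw_data"): {"read"},
}

def allowed(subject, obj, right):
    return right in rights.get((subject, obj), set())

assert allowed("user_a", "sanitizer_prog", "execute")
assert not allowed("user_a", "sanitizer_prog", "read")
assert not allowed("user_a", "raw_data", "read")
```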
`Although these benefits are within reach
`of current technology, they have been dif-
`ficult to realize in practice. Security is a
`relative, not an absolute, concept, and gains
`in security often come only with penalties
`in performance. To date, most systems de-
`signed to include security in the operating
`system structure have exhibited either slow
`response times or awkward user inter-
`faces—or both.
`
`5. FORMAL MODELS FOR COMPUTER
`SECURITY
`
The formal structures described below can be used to model the military security environment. These same structures can also be used as the basis for specifying programs that cause a computer to simulate the security controls of the military environment. Because it is difficult to capture the complexities of the real world in a formal structure, each model deviates from reality in some respects. Generally, the models enforce controls that are more rigid than the controls in the actual environment; any computer operations that obey the structures of the model will be secure according to the conventional definitions, and some operations disallowed by the model would nevertheless be considered secure outside the formal model. Although this is the “safe” side on which to err, use of overly restrictive models to improve the security of a system can lead to systems that are unacceptable to their intended users [WILS79].
The models presented in this section are diverse in several ways: they have been developed at different times, they treat the problem from different perspectives, and they provide different levels of detail in their specifications. We have tried to consider both chronology and formal similarity in organizing our presentation. Since models with different formal bases sometimes influence each other over time, it is hard to provide an ordering that both respects formal similarity and avoids forward references. Consequently, we include a brief discussion of some useful concepts and historical trends before presenting the individual models.
`
`5.1 Basic Concepts and Trends
`
The finite-state machine model for computation views a computer system as a finite set of states, together with a transition function to determine what the next state will be, based on the current state and the current value of the input. The transition function may also determine an output value. Transitions are viewed as occurring instantaneously in this model; therefore certain potential information channels (e.g., those related to observing the time spent in a certain state) in real systems tend to be hidden by it. Different security models apply different interpretations of this general model, but this structure is the basis for all of those surveyed below.
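
In outline (a generic sketch, not any particular model surveyed below):

```python
# The system is a transition function from (state, input) to
# (next_state, output); a security model constrains which transitions
# the function is permitted to make.
def run(transition, state, inputs):
    outputs = []
    for inp in inputs:
        state, out = transition(state, inp)
        outputs.append(out)
    return state, outputs

def echo(state, inp):
    # Trivial example transition: count inputs and echo each one.
    return state + 1, inp

assert run(echo, 0, ["a", "b"]) == (2, ["a", "b"])
```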
`The lattice model for security levels is
`widely used to describe the structure of
`military security levels. A lattice is a finite
`set together with a partial ordering on its
`elements such that for every pair of ele-
`ments there is a least upper bound and a
`greatest lower bound [BIRK70]. The simple
`linear ordering of sensitivity levels has al-
`ready been defined. Compartment sets can
`be partially ordered by the subset relation:
`one compartment set is greater than or
`equal to another if the latter set is a subset
`of the former. Classifications, which include
`a sensitivity level and a (perhaps empty)
`compartment set, can then be partially or-
dered as follows: for any sensitivity levels a and b and any compartment sets c and d,

    (a, c) ≥ (b, d)  if and only if  a ≥ b and c ⊇ d.

That each
`pair of classifications has a greatest lower
`bound and a least upper bound follows from
these definitions and the facts that the classification “unclassified, no compartments”
`is a global lower bound and that we can
`postulate a classification “top secret, all
`compartments” as a global upper bound.
Because the lattice model matches the military classification structure so closely, it is widely used. The high-water-mark model [WEIS69], one of the earliest formal models, includes a lattice of security levels, though it is not identified as such.
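
The least upper bound and greatest lower bound operations can be sketched as follows (an illustrative fragment; classifications are represented as (level, compartment set) pairs, and the level table is ours):

```python
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def lub(c1, c2):
    # Least upper bound: the higher level, the union of compartments.
    (l1, s1), (l2, s2) = c1, c2
    return (l1 if LEVELS[l1] >= LEVELS[l2] else l2, s1 | s2)

def glb(c1, c2):
    # Greatest lower bound: the lower level, the intersection.
    (l1, s1), (l2, s2) = c1, c2
    return (l1 if LEVELS[l1] <= LEVELS[l2] else l2, s1 & s2)

assert lub(("secret", {"NUCLEAR"}), ("confidential", {"CRYPTO"})) \
       == ("secret", {"NUCLEAR", "CRYPTO"})
```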
`
The access matrix model, described in detail below, was developed in the early 1970s as a generalized description of operating system protection mechanisms. It models controls on users’ access to information without regard to the semantics of the information in question. A reference monitor checks the validity of users’ accesses to objects. Models based on access matrices continue to be of interest because of their generality; recent examples include studies of take-grant models [BISH79] and the model of data security used by Popek [POPE78a].
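
In outline (an illustrative sketch; the subjects, objects, and rights are invented): rows of the matrix correspond to subjects, columns to objects, and entries to sets of rights, with every access request mediated by the reference monitor.

```python
# Sketch of an access matrix, stored sparsely by (subject, object).
matrix = {
    ("smith", "file_a"): {"read", "write"},
    ("jones", "file_a"): {"read"},
}

def reference_monitor(subject, obj, right):
    # Every access to every object must pass through this check.
    if right not in matrix.get((subject, obj), set()):
        raise PermissionError(f"{subject} may not {right} {obj}")

reference_monitor("jones", "file_a", "read")    # permitted
# reference_monitor("jones", "file_a", "write") # would raise PermissionError
```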
When classified information is involved, the semantics of the information must be considered: the classification of the information and the clearance of the user must be known before access can be granted. For this purpose, models based on the access matrix have been extended to include classifications, clearances, and rules concerning the classifications. The best known such model is the Bell and LaPadula model [BELL73a], which may be summarized in two axioms:
`
`(a) No user may read information classi-
`fied above his clearance level (“No read
`up”);
`(b) No user may lower the classification of
`information (“No write down”).
`
The full statement of the model includes several more axioms and is quite complex.
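
The two axioms can be sketched as follows (a drastic simplification of the full model, using the dominance ordering on (level, compartment set) pairs from Section 5.1; the function names are ours):

```python
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def dominates(c1, c2):
    # (level, compartments) c1 dominates c2 in the lattice ordering.
    (l1, s1), (l2, s2) = c1, c2
    return LEVELS[l1] >= LEVELS[l2] and s1 >= s2

def may_read(subject_clearance, object_class):
    return dominates(subject_clearance, object_class)   # "no read up"

def may_write(subject_clearance, object_class):
    return dominates(object_class, subject_clearance)   # "no write down"
```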
`In the early 1970s, Roger Schell con-
`ceived an approach to computer security
`based on defining a small subset of a system
`that would be responsible for its security
`and assuring that this subset would monitor
`all accesses (i.e., it would provide complete
validation of program references), that it would be correct, and that it would be
`isolated (so that its behavior could not be
`tampered with). This mechanism would be
`called a security kernel [ANDE72, SCHE73].
Similar considerations motivated the work of Price and Parnas [PRIC73, PARN74] on
`virtual memory mechanisms for protection.
`The Bell and LaPadula model grew out of
`work on the security kernel concept.
This idea fit well with the notions of operating system kernels and layered ab-
`stract machines that were being circulated
`widely at that time. The security kernel
`would be the innermost layer of the system
`and would implement all of the security-
`relevant operations in the system; for the
`access-matrix model, the kernel would im-
`plement the functions of the reference mon-
`itor. Because the security kernel would be
`of minimal size and functionality, it would
`be feasible to examine it closely for flaws
`and perhaps even to verify its correctness
`(or at least its security properties) formally.
`In practice, it has been difficult to identify
`and isolate all of the security-relevant func-
`tions of a general-purpose operating system
`without creating a fairly large, fairly slow
`“kernel.”
`
Information-flow models, based partly on work by Fenton [FENT74] and first introduced by Denning [DENN75], recognize
`and exploit the lattice structure of security
`levels. Instead of requiring a list of axioms
`governing users’ accesses, an information-
`flow model simply requires that all infor-
`m