`
The Protection of Information in Computer Systems

JEROME H. SALTZER AND MICHAEL D. SCHROEDER

Invited Paper
`
Abstract: This tutorial paper explores the mechanics of protecting computer-stored information from unauthorized use or modification. It concentrates on those architectural structures, whether hardware or software, that are necessary to support information protection. The paper develops in three main sections. Section I describes desired functions, design principles, and examples of elementary protection and authentication mechanisms. Any reader familiar with computers should find the first section to be reasonably accessible. Section II requires some familiarity with descriptor-based computer architecture. It examines in depth the principles of modern protection architectures and the relation between capability systems and access control list systems, and ends with a brief analysis of protected subsystems and protected objects. The reader who is dismayed by either the prerequisites or the level of detail in the second section may wish to skip to Section III, which reviews the state of the art and current research projects and provides suggestions for further reading.

Manuscript received October 11, 1974; revised April 17, 1975. Copyright © 1975 by J. H. Saltzer.
The authors are with Project MAC and the Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, Mass. 02139.
`
`GLOSSARY
`
THE FOLLOWING glossary provides, for reference, brief definitions for several terms as used in this paper in the context of protecting information in computers.

Access: The ability to make use of information stored in a computer system. Used frequently as a verb, to the horror of grammarians.

Access control list: A list of principals that are authorized to have access to some object.

Authenticate: To verify the identity of a person (or other agent external to the protection system) making a request.

Authorize: To grant a principal access to certain information.

Capability: In a computer system, an unforgeable ticket, which when presented can be taken as incontestable proof that the presenter is authorized to have access to the object named in the ticket.

Certify: To check the accuracy, correctness, and completeness of a security or protection mechanism.

Complete isolation: A protection system that separates principals into compartments between which no flow of information or control is possible.

Confinement: Allowing a borrowed program to have access to data, while ensuring that the program cannot release the information.

Descriptor: A protected value which is (or leads to) the physical address of some protected object.

Discretionary: (In contrast with nondiscretionary.) Controls on access to an object that may be changed by the creator of the object.

Domain: The set of objects that currently may be directly accessed by a principal.

Encipherment: The (usually) reversible scrambling of data according to a secret transformation key, so as to make it safe for transmission or storage in a physically unprotected environment.

Grant: To authorize (q.v.).

Hierarchical control: Referring to ability to change authorization, a scheme in which the record of each authorization is controlled by another authorization, resulting in a hierarchical tree of authorizations.

List-oriented: Used to describe a protection system in which each protected object has a list of authorized principals.

Password: A secret character string used to authenticate the claimed identity of an individual.

Permission: A particular form of allowed access, e.g., permission to READ as contrasted with permission to WRITE.

Prescript: A rule that must be followed before access to an object is permitted, thereby introducing an opportunity for human judgment about the need for access, so that abuse of the access is discouraged.

Principal: The entity in a computer system to which authorizations are granted; thus the unit of accountability in a computer system.

Privacy: The ability of an individual (or organization) to decide whether, when, and to whom personal (or organizational) information is released.

Propagation: When a principal, having been authorized access to some object, in turn authorizes access to another principal.

Protected object: A data structure whose existence is known, but whose internal organization is not accessible, except by invoking the protected subsystem (q.v.) that manages it.

Protected subsystem: A collection of procedures and data objects that is encapsulated in a domain of its own so that the internal structure of a data object is accessible only to the procedures of the protected subsystem and the procedures may be called only at designated domain entry points.

Protection: 1) Security (q.v.). 2) Used more narrowly to denote mechanisms and techniques that control the access of executing programs to stored information.

Protection group: A principal that may be used by several different individuals.

Revoke: To take away previously authorized access from some principal.

Security: With respect to information processing systems, used to denote mechanisms and techniques that control who may use or modify the computer or the information stored in it.

Self control: Referring to ability to change authorization, a scheme in which each authorization contains within it the specification of which principals may change it.

Ticket-oriented: Used to describe a protection system in which each principal maintains a list of unforgeable bit patterns, called tickets, one for each object the principal is authorized to have access.

User: Used imprecisely to refer to the individual who is accountable for some identifiable set of activities in a computer system.
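To make the contrast between the list-oriented and ticket-oriented terms above concrete, the following sketch shows a guard that consults an access control list next to a guard that accepts a previously issued ticket. It is not part of the original glossary; the Python names and the use of a long random value to stand in for an unforgeable ticket are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): a list-oriented guard consults
# the object's list of authorized principals, while a ticket-oriented guard
# accepts any ticket it previously issued for the object.

import secrets

class ProtectedObject:
    def __init__(self, name, contents):
        self.name = name
        self.contents = contents
        self.acl = set()            # list-oriented: authorized principals

def list_oriented_read(obj, principal):
    # The guard checks the principal against the object's access control list.
    if principal not in obj.acl:
        raise PermissionError(f"{principal} is not authorized for {obj.name}")
    return obj.contents

issued_tickets = {}                 # ticket value -> object it names

def issue_ticket(obj):
    # A long random value stands in for an "unforgeable" ticket (capability).
    ticket = secrets.token_hex(16)
    issued_tickets[ticket] = obj
    return ticket

def ticket_oriented_read(ticket):
    # The guard checks only that the ticket was issued; it need not know the caller.
    obj = issued_tickets.get(ticket)
    if obj is None:
        raise PermissionError("ticket not recognized")
    return obj.contents

payroll = ProtectedObject("payroll", "salary data")
payroll.acl.add("OHara")
print(list_oriented_read(payroll, "OHara"))   # allowed: OHara is on the list
ticket = issue_ticket(payroll)
print(ticket_oriented_read(ticket))           # allowed: holder presents the ticket
```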
`I. BASIC PRINCIPLES OF INFORMATION PROTECTION
`A. Considerations Surrounding the Study of Protection
1) General Observations: As computers become better understood and more economical, every day brings new applications. Many of these new applications involve both storing information and simultaneous use by several individuals. The key concern in this paper is multiple use. For those applications in which all users should not have identical authority, some scheme is needed to ensure that the computer system implements the desired authority structure.
`For example, in an airline seat reservation system, a reserva-
`tion agent might have authority to make reservations and
`to
`cancel reservations for people whose names he can supply. A
`flight boarding agent might have the additional authority to
`print out the list of all passengers who hold reservations on the
`flights for which he is responsible. The airline might wish to
`withhold from the reservation agent the authority to print out
`a list of reservations, so as to be sure that a request for a pas-
`senger list from a law enforcement agency is reviewed by the
`correct level of management.
`The airline example is one of protection of corporate infor-
`mation for corporate
`self-protection (or public interest, de-
`pending on one’s view). A different kind of example is an on-
`line warehouse inventory management system
`that generates
`reports about the current status
of the inventory. These reports not only represent corporate information that must be protected from release outside the company, but also may indicate the quality of the job being done by the warehouse
`manager. In order to preserve his personal privacy, it may be
`appropriate to restrict the access to such reports, even within
`the company, to those
`who have a legitimate
`reason to be
`judging the quality of the warehouse manager’s work.
`Many other examples of systems requiring protection of
`information are encountered every day: credit bureau
`data
`banks; law enforcement information
`systems; timesharing
`service bureaus; on-line medical information systems; and
`government social service data processing systems. These
`examples span a wide range of needs for organizational and
personal privacy. All have in common controlled sharing of information among multiple users. All, therefore, require some plan to ensure that the computer system helps implement the correct authority structure.
`Of course, in some
`applications no special provisions in the computer
`system
`are necessary.
`It may be, for instance, that an externally
`administered code of ethics or a lack of knowledge about
`computers adequately protects the
`stored information.
`Al-
`though there are situations in which the computer need pro-
`vide no aids to ensure protection of information, often it is
`appropriate to have the computer enforce a desired authority
`structure.
The words "privacy," "security," and "protection" are frequently used in connection with information-storing systems. Not all authors use these terms in the same way. This paper uses definitions commonly encountered in computer science literature.
`The term “privacy” denotes a socially defined ability of an
`individual (or organization) to determine whether, when, and
to whom personal (or organizational) information is to be released. This paper will not be explicitly concerned with privacy, but instead with the mechanisms used to help achieve it.¹
The term "security" describes techniques that control who may use or modify the computer or the information contained in it.²
`
Security specialists (e.g., Anderson [6]) have found it useful
`to place potential security violations in three categories.
`1) Unauthorized information release: an unauthorized per-
`son is able to read and take advantage of information stored
`in the computer. This category of concern sometimes extends
`to “traffic analysis,” in which the intruder observes only the
patterns of information use and from those patterns can infer some information content. It also includes unauthorized use of a proprietary program.
`2) Unauthorized information modification: an unauthorized
`person is able to make changes in stored information-a form
`of sabotage. Note
`that this kind of violation does not require
`that the intruder see the information he has changed.
`3) Unauthorized denial of use: an intruder can prevent an
`authorized user from referring
`to or modifying information,
`even though the intruder may not be able to refer to or mod-
`ify the information. Causing a system “crash,” disrupting a
`scheduling algorithm, or firing a bullet into a computer
`are
`examples of denial of use. This is another form of sabotage.
`The term “unauthorized” in the three categories listed above
`means that release, modification, or denial of use occurs con-
`trary to the desire of the person who controls the information,
`possibly even contrary to the constraints supposedly enforced
by the system. The biggest complication in a general-purpose remote-accessed computer system is that the "intruder" in these definitions may be an otherwise legitimate user of the computer system.
`Examples of security techniques sometimes applied to com-
`puter systems are the following:
`1) labeling files with lists of authorized users,
2) verifying the identity of a prospective user by demanding a password (see the sketch following this list),
3) shielding the computer to prevent interception and subsequent interpretation of electromagnetic radiation,
4) enciphering information sent over telephone lines,
5) locking the room containing the computer,
6) controlling who is allowed to make changes to the computer system (both its hardware and software),
`7) using redundant circuits or programmed cross-checks that
`maintain security in the face
`of hardware or software
`failures,
8) certifying that the hardware and software are actually implemented as intended.
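The following fragment is a minimal sketch of technique 2) above, verifying a claimed identity by demanding a password. It is not taken from the paper; the salted one-way hashing and all function and variable names are illustrative assumptions.

```python
# Hypothetical sketch of technique 2): authenticating a claimed identity
# with a password. Only salted one-way hashes are stored, so the table
# itself does not reveal the secret character strings.

import hashlib
import hmac
import os

password_table = {}   # user name -> (salt, hash of salt + password)

def enroll(user, password):
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).digest()
    password_table[user] = (salt, digest)

def authenticate(user, claimed_password):
    entry = password_table.get(user)
    if entry is None:
        return False      # fail-safe default: unknown users get no access
    salt, digest = entry
    candidate = hashlib.sha256(salt + claimed_password.encode()).digest()
    return hmac.compare_digest(candidate, digest)   # constant-time comparison

enroll("reservation_agent", "correct horse battery staple")
assert authenticate("reservation_agent", "correct horse battery staple")
assert not authenticate("reservation_agent", "a wrong guess")
```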
It is apparent that a wide range of considerations are pertinent to the engineering of security of information. Historically, the
`
¹A thorough and scholarly discussion of the concept of privacy may be found in [1], and an interesting study of the impact of technology on privacy is given in [2]. In 1973, the U.S. Department of Health, Education, and Welfare published a related study [3]. A recent paper by Turn and Ware [4] discusses the relationship of the social objective of privacy to the security mechanisms of modern computer systems.
²W. Ware [5] has suggested that the term security be used for systems that handle classified defense information, and privacy for systems handling nondefense information. This suggestion has never really taken hold outside the defense security community, but literature originating within that community often uses Ware's definitions.
`
`literature of computer systems has more narrowly defined the
`term protection to be just those security techniques that con-
trol the access of executing programs to stored information.³
`An example of a protection technique is labeling of computer-
`stored files with lists of authorized users. Similarly, the term
`authentication is used for those security techniques that verify
`the identity of a person (or other external agent) making a
`request of a computer system. An example of an authentica-
`tion technique
`is demanding a password. This paper concen-
trates on protection and authentication mechanisms, with only occasional reference to the other equally necessary security mechanisms. One should recognize that concentration
`on protection and authentication mechanisms provides a nar-
`row view of information security, and that a narrow
`view is
`dangerous. The objective
`of a secure system is to prevent all
`unauthorized use of information, a negative kind of require-
`ment.
It is hard to prove that this negative requirement has been achieved, for one must demonstrate that every possible threat has been anticipated. Thus an expansive view of the problem is most appropriate to help ensure that no gaps appear in the strategy. In contrast, a narrow concentration on protection mechanisms, especially those logically impossible to defeat, may lead to false confidence in the system as a whole.⁴
`2) Functional Levels of Information Protection: Many dif-
`ferent designs have been proposed and mechanisms
`imple-
mented for protecting information in computer systems. One reason for differences among protection schemes is their different functional properties: the kinds of access control that can be expressed naturally and enforced. It is convenient to divide protection schemes according to their functional properties. A rough categorization is the following.
`a) Unprotected systems: Some systems have no provision
`for preventing a determined user from having access to every
`piece of information stored
`in the system. Although these
`systems are not directly of interest here, they are worth men-
`tioning since, as of 1975, many of the most widely used, com-
mercially available batch data processing systems fall into this category, for example, the Disk Operating System for the IBM System 370 [9]. Our definition of protection, which excludes features usable only for mistake prevention, is important here since it is common for unprotected systems to contain a variety of mistake-prevention features. These may provide just enough control that any breach of control is likely to be the result of a deliberate act rather than an accident.
`
³Some authors have widened the scope of the term "protection" to include mechanisms designed to limit the consequences of accidental mistakes in programming or in applying programs. With this wider definition, even computer systems used by a single person might include "protection" mechanisms. The effect of this broader definition of "protection" would be to include in our study mechanisms that may be deliberately bypassed by the user, on the basis that the probability of accidental bypass can be made as small as desired by careful design. Such accident-reducing mechanisms are often essential, but one would be ill-advised to apply one to a situation in which a systematic attack by another user is to be prevented. Therefore, we will insist on the narrower definition. Protection mechanisms are very useful in preventing mistakes, but mistake-preventing mechanisms that can be deliberately bypassed have little value in providing protection. Another common extension of the term "protection" is to techniques that ensure the reliability of information storage and computing service despite accidental failure of individual components or programs. In this paper we arbitrarily label those concerns "reliability" or "integrity," although it should be recognized that historically the study of protection mechanisms is rooted in attempts to provide reliability in multiprogramming systems.
⁴The broad view, encompassing all the considerations mentioned here and more, is taken in several current books [6]-[8].
`
`
Nevertheless, it would be a mistake to claim that such systems provide any security.⁵
`b) All-or-nothing systems: These are systems that provide
`isolation of users, sometimes moderated by total
`sharing of
`some pieces of information. If only isolation is provided, the
`user of such a system might just as well be using his own pri-
`vate computer, as far as protection and sharing of information
`are concerned. More commonly, such systems also have public
`libraries to which every user may have access.
`In some cases
`the public library mechanism may be extended
`to accept user
`contributions, but still on the basis that all users have equal
`access. Most of the first generation
`of commercial time-
`sharing systems provide a protection scheme with this level of
function. Examples include the Dartmouth Time-sharing System (DTSS) [10] and IBM's VM/370 system [11]. There are innumerable others.
c) Controlled sharing: Significantly more complex machinery is required to control explicitly who may access each data item stored in the system. For example, such a system might provide each file with a list of authorized users and allow an owner to distinguish several common patterns of use, such as reading, writing, or executing the contents of the file as a program. Although conceptually straightforward, actual implementation is surprisingly intricate, and only a few complete examples exist. These include M.I.T.'s Compatible Time-sharing System (CTSS) [12], Digital Equipment Corporation's DECsystem/10 [13], System Development Corporation's Advanced Development Prototype (ADEPT) System [14], and Bolt, Beranek, and Newman's TENEX [15].⁶
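As a concrete picture of the controlled-sharing level just described, the sketch below attaches to each file a list of authorized users with distinct permissions for reading, writing, and executing, changeable only by the owner. It is an illustrative reconstruction, not the mechanism of CTSS, DECsystem/10, ADEPT, or TENEX; all class and function names are hypothetical.

```python
# Hypothetical sketch of controlled sharing: each file carries an access
# control list mapping users to the patterns of use the owner permits.

READ, WRITE, EXECUTE = "read", "write", "execute"

class SharedFile:
    def __init__(self, owner, contents=""):
        self.owner = owner
        self.contents = contents
        self.acl = {owner: {READ, WRITE, EXECUTE}}   # owner starts with all rights

    def set_permissions(self, requester, user, permissions):
        # Only the owner may change the access specification (discretionary control).
        if requester != self.owner:
            raise PermissionError("only the owner may change the ACL")
        self.acl[user] = set(permissions)

    def check(self, user, permission):
        if permission not in self.acl.get(user, set()):
            raise PermissionError(f"{user} lacks {permission} permission")

    def read(self, user):
        self.check(user, READ)
        return self.contents

    def write(self, user, new_contents):
        self.check(user, WRITE)
        self.contents = new_contents

# Usage: the owner grants another user read-only access.
f = SharedFile(owner="alice", contents="flight list")
f.set_permissions("alice", "boarding_agent", {READ})
print(f.read("boarding_agent"))          # permitted
f.write("alice", "updated flight list")  # owner may write
# f.write("boarding_agent", "x")         # would raise PermissionError
```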
d) User-programmed sharing controls: A user may want to restrict access to a file in a way not provided in the standard facilities for controlling sharing. For example, he may wish to permit access only on weekdays between 9:00 A.M. and 4:00 P.M. Possibly, he may wish to permit access to only the average value of the data in a file. Maybe he wishes to require that a file be modified only if two users agree. For such cases, and a myriad of others, a general escape is to provide for user-defined protected objects and subsystems. A protected subsystem is a collection of programs and data with the property that only the programs of the subsystem have direct access to the data (that is, the protected objects). Access to those programs is limited to calling specified entry points. Thus the programs of the subsystem completely control the operations performed on the data. By constructing a protected subsystem, a user can develop any programmable form of access control to the objects he creates. Only a few of the most advanced system designs have tried to permit user-specified protected subsystems. These include Honeywell's Multics [16], the University of California's CAL system [17], Bell Laboratories' UNIX system [18], the Berkeley Computer Corporation BCC-500 [19], and two systems currently under construction: the CAP system of Cambridge University [20], and the HYDRA system of Carnegie-Mellon University [21].
`
⁵One can develop a spirited argument as to whether systems originally designed as unprotected, and later modified to implement some higher level of protection goal, should be reclassified or continue to be considered unprotected. The argument arises from skepticism that one can successfully change the fundamental design decisions involved. Most large-scale commercial batch processing systems fall into this questionable area.
⁶An easier-to-implement strategy of providing shared catalogs that are accessible among groups of users who anticipate the need to share was introduced in CTSS in 1962, and is used today in some commercial systems.
`
Exploring alternative mechanisms for implementing protected subsystems is a current research topic. A specialized use of protected subsystems is the implementation of protection controls based on data content. For example, in a file of salaries, one may wish to permit access to all salaries under $15 000. Another example is permitting access to certain statistical aggregations of data but not to any individual data item. This area of protection raises questions about the possibility of discerning information by statistical tests and by examining indexes, without ever having direct access to the data itself. Protection based on content is the subject of a variety of recent or current research projects [22]-[25] and will not be explored in this tutorial.
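The salary example above can be pictured as a protected subsystem with exactly two entry points. The sketch below is illustrative only; real systems of this kind enforce the encapsulation with protection hardware or descriptor machinery, whereas here it rests on a language convention, and all names are hypothetical.

```python
# Hypothetical sketch of a protected subsystem: the salary data are held
# inside the subsystem, and callers may use only the designated entry
# points. In a real system the encapsulation would be enforced by the
# protection machinery (separate domains); here it is only a convention.

class SalarySubsystem:
    def __init__(self, salaries):
        self.__salaries = dict(salaries)   # internal structure, not exported

    # Entry point 1: content-based control, release only salaries under $15 000.
    def salaries_under_15000(self):
        return {name: s for name, s in self.__salaries.items() if s < 15000}

    # Entry point 2: release a statistical aggregate, never an individual item.
    def average_salary(self):
        values = self.__salaries.values()
        return sum(values) / len(values)

subsystem = SalarySubsystem({"adams": 12000, "baker": 21000, "clark": 14000})
print(subsystem.salaries_under_15000())   # {'adams': 12000, 'clark': 14000}
print(subsystem.average_salary())         # 15666.66...
```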
e) Putting strings on information: The foregoing three levels have been concerned with establishing conditions for the
`release of information to an executing program. The fourth
`level of capability is to maintain some control over the user of
`the information even after it has been released. Such control
`is desired, for example, in releasing income information to a
`tax advisor; constraints should prevent
`him from passing the
`information on
`to a firm which prepares mailing lists. The
`printed labels on
`classified military information declaring a
`document to be “Top Secret” are another example
`of a con-
`straint on information after its release to a person authorized
`to receive it. One may not (without risking severe penalties)
`release such information
`to others, and the
`label serves as a
`notice of the restriction. Computer systems
`that implement
`such strings on information are rare and the mechanisms
`are
incomplete. For example, the ADEPT system [14] keeps track of the classification level of all input data used to create a file; all output data are automatically labeled with the highest classification encountered during execution.
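The ADEPT-style labeling just described, where every output carries the highest classification seen among the inputs, can be sketched as a simple high-water-mark rule. The fragment below is an illustrative reconstruction, not ADEPT's implementation; the level names and functions are hypothetical.

```python
# Hypothetical high-water-mark sketch: every output file is labeled with the
# highest classification level among the inputs read while producing it.

LEVELS = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]
RANK = {level: i for i, level in enumerate(LEVELS)}

class LabeledFile:
    def __init__(self, contents, classification="UNCLASSIFIED"):
        self.contents = contents
        self.classification = classification

def derive(output_contents, inputs):
    # The output's label is the highest label found among its inputs.
    highest = max((f.classification for f in inputs), key=RANK.get,
                  default="UNCLASSIFIED")
    return LabeledFile(output_contents, highest)

a = LabeledFile("troop counts", "SECRET")
b = LabeledFile("weather report", "UNCLASSIFIED")
report = derive("combined summary", [a, b])
print(report.classification)   # SECRET
```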
There is a consideration that cuts across all levels of functional capability: the dynamics of use. This term refers to how one establishes and changes the specification of who may
`access what. At any of the levels it is relatively easy to envi-
`sion (and design) systems that statically express a particular
`protection intent. But the need to change access authoriza-
`tion dynamically and the need for such changes to be re-
`quested by executing programs introduces much complexity
`into protection systems. For a given functional level, most
`existing protection systems differ primarily
`in the way they
`handle protection dynamics. To
`gain some insight
`into the
`complexity introduced by program-directed changes to access
`authorization, consider
`the question “Is there any
`way that
O'Hara could access file X?" One should check to see not only if O'Hara has access to file X, but also whether or not O'Hara may change the specification of file X's accessibility. The next step is to see if O'Hara can change the specification of who may change the specification of file X's accessibility,
`etc. Another problem
`of dynamics arises when the owner
`revokes a user’s access
`to a file while that file is being used.
`Letting the previously authorized user continue until he
`is
`“finished” with the information may not be acceptable, if the
`owner has suddenly
`realized that the
`file contains sensitive
`data. On the other hand, immediate withdrawal of authoriza-
`tion may severely disrupt the user. It should be apparent that
`provisions for the dynamics of use are at least as important as
`those for static specification of protection intent.
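The O'Hara question illustrates why program-directed changes to authorization complicate analysis: one must follow the chain of specifications of who may change what. The sketch below is an illustrative reconstruction of that recursive check, not an algorithm from the paper; the convention of naming the specification of an object "spec:<object>" is a hypothetical modeling choice.

```python
# Hypothetical sketch of the O'Hara question. Each object's accessibility
# specification is itself modeled as an object named "spec:<object>"; a
# principal who can access that specification can change who may access
# the object it governs.

def could_ever_access(principal, obj, accessors, seen=None):
    """True if principal can access obj now, or can reach obj by changing
    some specification in the chain spec:obj, spec:spec:obj, ..."""
    if seen is None:
        seen = set()
    if obj in seen or obj not in accessors:
        return False          # unknown object or a repeat: stop the search
    seen.add(obj)
    if principal in accessors[obj]:
        return True
    return could_ever_access(principal, "spec:" + obj, accessors, seen)

# Usage: O'Hara cannot read X and cannot change X's specification directly,
# but he can change the specification of who may change X's specification.
accessors = {
    "X":           {"alice"},
    "spec:X":      {"alice"},
    "spec:spec:X": {"ohara"},
}
print(could_ever_access("ohara", "X", accessors))   # True
print(could_ever_access("smith", "X", accessors))   # False
```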
`In many
`cases, it is not necessary to meet the protection
`needs of the person responsible for the information stored in
`the computer entirely through
`computer-aided enforcement.
`External mechanisms such as contracts, ignorance, or barbed
`
`
`wire fences may provide some of the required functional
`capability. This discussion, however,
`is focused on the in-
`ternal mechanisms.
3) Design Principles: Whatever the level of functionality provided, the usefulness of a set of protection mechanisms depends upon the ability of a system to prevent security violations. In practice, producing a system at any level of functionality (except level one) that actually does prevent all such unauthorized acts has proved to be extremely difficult. Sophisticated users of most systems are aware of at least one way to crash the system, denying other users authorized access to stored information. Penetration exercises involving a large number of different general-purpose systems all have shown that users can construct programs that can obtain unauthorized access to information stored within. Even in systems designed and implemented with security as an important objective, design and implementation flaws provide paths that
`circumvent the intended access constraints. Design and con-
`struction techniques that systematically exclude flaws are the
topic of much research activity, but no complete method applicable to the construction of large general-purpose systems
`exists yet. This difficulty is related to the negative quality
`of the requirement to prevent all unauthorized actions.
`In the absence of such methodical techniques, experience
`has provided some useful principles
`that can guide the design
`and contribute to an implementation without security
`flaws.
`Here are eight examples of design principles that apply par-
ticularly to protection mechanisms.⁷
`
a) Economy of mechanism: Keep the design as simple and small as possible. This well-known principle applies to any aspect of a system, but it deserves emphasis for protec-
`tion mechanisms for this reason:
`design and implementation
`errors that result in unwanted access paths will not be noticed
during normal use (since normal use usually does not include
`attempts to exercise improper access paths). As a result, tech-
niques such as line-by-line inspection of software and physical
`examination of hardware that implements protection mecha-
`nisms are necessary. For such techniques
`to be successful, a
`small and simple design is essential.
b) Fail-safe defaults: Base access decisions on permission rather than exclusion. This principle, suggested by E. Glaser in 1965,⁸ means that the default situation is lack of access, and the protection scheme identifies conditions under which
`access is permitted. The alternative,
`in which mechanisms
`attempt to identify conditions under which access should be
`refused, presents the wrong psychological base for secure sys-
`tem design. A conservative design must be based on arguments
`why objects should be accessible, rather than why they should
`not. In a
`large system some objects will be inadequately con-
`sidered, so a default of lack of permission is safer. A design
`or implementation mistake in a mechanism that gives explicit
`permission tends to fail by refusing permission, a safe situa-
`
⁷Design principles b), d), f), and h) are revised versions of material originally published in Communications of the ACM [26, p. 398]. © Copyright 1974, Association for Computing Machinery, Inc., reprinted by permission.
⁸In this paper we have attempted to identify original sources whenever possible. Many of the seminal ideas, however, were widely spread by word of mouth or internal memorandum rather than by journal publication, and historical accuracy is sometimes difficult to obtain. In addition, some ideas related to protection were originally conceived in other contexts. In such cases, we have attempted to credit the person who first noticed their applicability to protection in computer systems, rather than the original inventor.
`
`tion, since it will be quickly detected. On the other hand, a
`design or implementation mistake
`in a mechanism that ex-
`plicitly excludes
`access tends to fail by allowing access, a
`failure which