DoD 5200.28-STD
Supersedes CSC-STD-001-83, dtd 15 Aug 83
Library No. S225,711


                     DEPARTMENT OF DEFENSE STANDARD

                        DEPARTMENT OF DEFENSE
                           TRUSTED COMPUTER
                          SYSTEM EVALUATION
                              CRITERIA

                            DECEMBER 1985

                          December 26, 1985

                               FOREWORD

This publication, DoD 5200.28-STD, "Department of Defense Trusted Computer
System Evaluation Criteria," is issued under the authority of and in
accordance with DoD Directive 5200.28, "Security Requirements for Automatic
Data Processing (ADP) Systems," and in furtherance of responsibilities
assigned by DoD Directive 5215.1, "Computer Security Evaluation Center."
Its purpose is to provide technical hardware/firmware/software security
criteria and associated technical evaluation methodologies in support of the
overall ADP system security policy, evaluation, and approval/accreditation
responsibilities promulgated by DoD Directive 5200.28.

The provisions of this document apply to the Office of the Secretary of
Defense (OSD), the Military Departments, the Organization of the Joint
Chiefs of Staff, the Unified and Specified Commands, the Defense Agencies,
and activities administratively supported by OSD (hereafter called "DoD
Components").

This publication is effective immediately and is mandatory for use by all
DoD Components in carrying out ADP system technical security evaluation
activities applicable to the processing and storage of classified and other
sensitive DoD information and applications as set forth herein.

Recommendations for revisions to this publication are encouraged and will be
reviewed biannually by the National Computer Security Center through a
formal review process.  Address all proposals for revision through
appropriate channels to:  National Computer Security Center, Attention:
Chief, Computer Security Standards.

DoD Components may obtain copies of this publication through their own
publications channels.  Other federal agencies and the public may obtain
copies from:  Office of Standards and Products, National Computer Security
Center, Fort Meade, MD 20755-6000, Attention: Chief, Computer Security
Standards.


_________________________________
Donald C. Latham
Assistant Secretary of Defense
(Command, Control, Communications, and Intelligence)

                           ACKNOWLEDGEMENTS

Special recognition is extended to Sheila L. Brand, National Computer
Security Center (NCSC), who integrated theory, policy, and practice into and
directed the production of this document.

Acknowledgment is also given for the contributions of:  Grace Hammonds and
Peter S. Tasker, the MITRE Corp., Daniel J. Edwards, NCSC, Roger R. Schell,
former Deputy Director of NCSC, Marvin Schaefer, NCSC, and Theodore M. P.
Lee, Sperry Corp., who as original architects formulated and articulated the
technical issues and solutions presented in this document; Jeff Makey,
formerly NCSC, Warren F. Shadle, NCSC, and Carole S. Jordan, NCSC, who
assisted in the preparation of this document; James P. Anderson, James P.
Anderson & Co., Steven B. Lipner, Digital Equipment Corp., Clark Weissman,
System Development Corp., LTC Lawrence A. Noble, formerly U.S. Air Force,
Stephen T. Walker, formerly DoD, Eugene V. Epperly, DoD, and James E.
Studer, formerly Dept. of the Army, who gave generously of their time and
expertise in the review and critique of this document; and finally, thanks
are given to the computer industry and others interested in trusted
computing for their enthusiastic advice and assistance throughout this
effort.

                               CONTENTS

FOREWORD
ACKNOWLEDGEMENTS
PREFACE
INTRODUCTION

PART I:  THE CRITERIA

1.0  DIVISION D: MINIMAL PROTECTION
2.0  DIVISION C: DISCRETIONARY PROTECTION
     2.1  Class (C1): Discretionary Security Protection
     2.2  Class (C2): Controlled Access Protection
3.0  DIVISION B: MANDATORY PROTECTION
     3.1  Class (B1): Labeled Security Protection
     3.2  Class (B2): Structured Protection
     3.3  Class (B3): Security Domains
4.0  DIVISION A: VERIFIED PROTECTION
     4.1  Class (A1): Verified Design
     4.2  Beyond Class (A1)

PART II:  RATIONALE AND GUIDELINES

5.0  CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS
     5.1  A Need for Consensus
     5.2  Definition and Usefulness
     5.3  Criteria Control Objective
6.0  RATIONALE BEHIND THE EVALUATION CLASSES
     6.1  The Reference Monitor Concept
     6.2  A Formal Security Policy Model
     6.3  The Trusted Computing Base
     6.4  Assurance
     6.5  The Classes
7.0  THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA
     7.1  Established Federal Policies
     7.2  DoD Policies
     7.3  Criteria Control Objective for Security Policy
     7.4  Criteria Control Objective for Accountability
     7.5  Criteria Control Objective for Assurance
8.0  A GUIDELINE ON COVERT CHANNELS
9.0  A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES
10.0 A GUIDELINE ON SECURITY TESTING
     10.1  Testing for Division C
     10.2  Testing for Division B
     10.3  Testing for Division A

APPENDIX A:  Commercial Product Evaluation Process
APPENDIX B:  Summary of Evaluation Criteria Divisions
APPENDIX C:  Summary of Evaluation Criteria Classes
APPENDIX D:  Requirement Directory

GLOSSARY
REFERENCES

                               PREFACE

The trusted computer system evaluation criteria defined in this document
classify systems into four broad hierarchical divisions of enhanced security
protection.  They provide a basis for the evaluation of the effectiveness of
security controls built into automatic data processing system products.  The
criteria were developed with three objectives in mind:  (a) to provide users
with a yardstick with which to assess the degree of trust that can be placed
in computer systems for the secure processing of classified or other
sensitive information; (b) to provide guidance to manufacturers as to what
to build into their new, widely-available trusted commercial products in
order to satisfy trust requirements for sensitive applications; and (c) to
provide a basis for specifying security requirements in acquisition
specifications.  Two types of requirements are delineated for secure
processing:  (a) specific security feature requirements and (b) assurance
requirements.  Some of the latter requirements enable evaluation personnel
to determine if the required features are present and functioning as
intended.  The scope of these criteria is to be applied to the set of
components comprising a trusted system, and is not necessarily to be applied
to each system component individually.  Hence, some components of a system
may be completely untrusted, while others may be individually evaluated to a
lower or higher evaluation class than the trusted product considered as a
whole system.  In trusted products at the high end of the range, the
strength of the reference monitor is such that most of the components can be
completely untrusted.  Though the criteria are intended to be
application-independent, the specific security feature requirements may have
to be interpreted when applying the criteria to specific systems with their
own functional requirements, applications or special environments (e.g.,
communications processors, process control computers, and embedded systems
in general).  The underlying assurance requirements can be applied across
the entire spectrum of ADP system or application processing environments
without special interpretation.

                             INTRODUCTION

Historical Perspective

In October 1967, a task force was assembled under the auspices of the
Defense Science Board to address computer security safeguards that would
protect classified information in remote-access, resource-sharing computer
systems.  The Task Force report, "Security Controls for Computer Systems,"
published in February 1970, made a number of policy and technical
recommendations on actions to be taken to reduce the threat of compromise of
classified information processed on remote-access computer systems.[34]
Department of Defense Directive 5200.28 and its accompanying manual DoD
5200.28-M, published in 1972 and 1973 respectively, responded to one of
these recommendations by establishing uniform DoD policy, security
requirements, administrative controls, and technical measures to protect
classified information processed by DoD computer systems.[8;9]  Research and
development work undertaken by the Air Force, Advanced Research Projects
Agency, and other defense agencies in the early and mid 70's developed and
demonstrated solution approaches for the technical problems associated with
controlling the flow of information in resource and information sharing
computer systems.[1]  The DoD Computer Security Initiative was started in
1977 under the auspices of the Under Secretary of Defense for Research and
Engineering to focus DoD efforts addressing computer security issues.[33]

Concurrent with DoD efforts to address computer security issues, work was
begun under the leadership of the National Bureau of Standards (NBS) to
define problems and solutions for building, evaluating, and auditing secure
computer systems.[17]  As part of this work NBS held two invitational
workshops on the subject of audit and evaluation of computer
security.[20;28]  The first was held in March 1977, and the second in
November of 1978.  One of the products of the second workshop was a
definitive paper on the problems related to providing criteria for the
evaluation of technical computer security effectiveness.[20]  As an
outgrowth of recommendations from this report, and in support of the DoD
Computer Security Initiative, the MITRE Corporation began work on a set of
computer security evaluation criteria that could be used to assess the
degree of trust one could place in a computer system to protect classified
data.[24;25;31]  The preliminary concepts for computer security evaluation
were defined and expanded upon at invitational workshops and symposia whose
participants represented computer security expertise drawn from industry and
academia in addition to the government.  Their work has since been subjected
to much peer review and constructive technical criticism from the DoD,
industrial research and development organizations, universities, and
computer manufacturers.

The DoD Computer Security Center (the Center) was formed in January 1981 to
staff and expand on the work started by the DoD Computer Security
Initiative.[15]  A major goal of the Center as given in its DoD Charter is
to encourage the widespread availability of trusted computer systems for use
by those who process classified or other sensitive information.[10]  The
criteria presented in this document have evolved from the earlier NBS and
MITRE evaluation material.

Scope

The trusted computer system evaluation criteria defined in this document
apply primarily to trusted commercially available automatic data processing
(ADP) systems.  They are also applicable, as amplified below, to the
evaluation of existing systems and to the specification of security
requirements for ADP systems acquisition.  Included are two distinct sets of
requirements:  1) specific security feature requirements; and 2) assurance
requirements.  The specific feature requirements encompass the capabilities
typically found in information processing systems employing general-purpose
operating systems that are distinct from the applications programs being
supported.  However, specific security feature requirements may also apply
to specific systems with their own functional requirements, applications or
special environments (e.g., communications processors, process control
computers, and embedded systems in general).  The assurance requirements, on
the other hand, apply to systems that cover the full range of computing
environments from dedicated controllers to full range multilevel secure
resource sharing systems.

Purpose

As outlined in the Preface, the criteria have been developed to serve a
number of intended purposes:

   *  To provide a standard to manufacturers as to what security features
      to build into their new and planned, commercial products in order to
      provide widely available systems that satisfy trust requirements
      (with particular emphasis on preventing the disclosure of data) for
      sensitive applications.

   *  To provide DoD components with a metric with which to evaluate the
      degree of trust that can be placed in computer systems for the secure
      processing of classified and other sensitive information.

   *  To provide a basis for specifying security requirements in
      acquisition specifications.

With respect to the second purpose for development of the criteria, i.e.,
providing DoD components with a security evaluation metric, evaluations can
be delineated into two types:  (a) an evaluation can be performed on a
computer product from a perspective that excludes the application
environment; or, (b) it can be done to assess whether appropriate security
measures have been taken to permit the system to be used operationally in a
specific environment.  The former type of evaluation is done by the Computer
Security Center through the Commercial Product Evaluation Process.  That
process is described in Appendix A.

The latter type of evaluation, i.e., those done for the purpose of assessing
a system's security attributes with respect to a specific operational
mission, is known as a certification evaluation.  It must be understood that
the completion of a formal product evaluation does not constitute
certification or accreditation for the system to be used in any specific
application environment.  On the contrary, the evaluation report only
provides a trusted computer system's evaluation rating along with supporting
data describing the product system's strengths and weaknesses from a
computer security point of view.  The system security certification and the
formal approval/accreditation procedure, done in accordance with the
applicable policies of the issuing agencies, must still be followed before a
system can be approved for use in processing or handling classified
information.[8;9]  Designated Approving Authorities (DAAs) remain ultimately
responsible for specifying the security of systems they accredit.

The trusted computer system evaluation criteria will be used directly and
indirectly in the certification process.  Along with applicable policy, it
will be used directly as technical guidance for evaluation of the total
system and for specifying system security and certification requirements for
new acquisitions.  Where a system being evaluated for certification employs
a product that has undergone a Commercial Product Evaluation, reports from
that process will be used as input to the certification evaluation.
Technical data will be furnished to designers, evaluators and the Designated
Approving Authorities to support their needs for making decisions.

Fundamental Computer Security Requirements

Any discussion of computer security necessarily starts from a statement of
requirements, i.e., what it really means to call a computer system "secure."
In general, secure systems will control, through use of specific security
features, access to information such that only properly authorized
individuals, or processes operating on their behalf, will have access to
read, write, create, or delete information.  Six fundamental requirements
are derived from this basic statement of objective:  four deal with what
needs to be provided to control access to information; and two deal with how
one can obtain credible assurances that this is accomplished in a trusted
computer system.

   Policy

   Requirement 1 - SECURITY POLICY - There must be an explicit and
well-defined security policy enforced by the system.  Given identified
subjects and objects, there must be a set of rules that are used by the
system to determine whether a given subject can be permitted to gain access
to a specific object.  Computer systems of interest must enforce a mandatory
security policy that can effectively implement access rules for handling
sensitive (e.g., classified) information.[7]  These rules include
requirements such as:  No person lacking proper personnel security clearance
shall obtain access to classified information.  In addition, discretionary
security controls are required to ensure that only selected users or groups
of users may obtain access to data (e.g., based on a need-to-know).

   Requirement 2 - MARKING - Access control labels must be associated with
objects.  In order to control access to information stored in a computer,
according to the rules of a mandatory security policy, it must be possible
to mark every object with a label that reliably identifies the object's
sensitivity level (e.g., classification), and/or the modes of access
accorded those subjects who may potentially access the object.
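
For illustration only (this sketch is not part of the criteria), the C
fragment below shows one way the two kinds of rule described above might be
combined into a single access decision:  a mandatory check that the
subject's clearance dominates the object's sensitivity level, followed by a
discretionary need-to-know check.  The linear ordering of levels and the
per-object list of authorized users are assumptions of the example; real
mandatory policies also compare non-hierarchical categories.

    /* Illustrative sketch only -- not part of this standard.  Names and
       structures are hypothetical. */
    #include <stdbool.h>
    #include <string.h>

    enum level { UNCLASSIFIED, CONFIDENTIAL, SECRET, TOP_SECRET };

    struct subject { const char *user; enum level clearance; };
    struct object  { enum level sensitivity; const char **need_to_know; int n; };

    /* Mandatory rule: read access only if the subject's clearance is at
       least the object's sensitivity level. */
    static bool mandatory_read_allowed(const struct subject *s,
                                       const struct object *o)
    {
        return s->clearance >= o->sensitivity;
    }

    /* Discretionary rule: the subject must also appear on the object's
       need-to-know list. */
    static bool discretionary_allowed(const struct subject *s,
                                      const struct object *o)
    {
        for (int i = 0; i < o->n; i++)
            if (strcmp(o->need_to_know[i], s->user) == 0)
                return true;
        return false;
    }

    /* Access is granted only when both checks succeed. */
    bool read_access_allowed(const struct subject *s, const struct object *o)
    {
        return mandatory_read_allowed(s, o) && discretionary_allowed(s, o);
    }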

   Accountability

   Requirement 3 - IDENTIFICATION - Individual subjects must be identified.
Each access to information must be mediated based on who is accessing the
information and what classes of information they are authorized to deal
with.  This identification and authorization information must be securely
maintained by the computer system and be associated with every active
element that performs some security-relevant action in the system.

   Requirement 4 - ACCOUNTABILITY - Audit information must be selectively
kept and protected so that actions affecting security can be traced to the
responsible party.  A trusted system must be able to record the occurrences
of security-relevant events in an audit log.  The capability to select the
audit events to be recorded is necessary to minimize the expense of auditing
and to allow efficient analysis.  Audit data must be protected from
modification and unauthorized destruction to permit detection and
after-the-fact investigations of security violations.
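
As a purely illustrative sketch (not part of this standard), the C fragment
below shows the shape such selective auditing might take:  an
administrator-chosen event mask decides which event types are recorded, and
each record carries the who, what, when, and outcome needed to trace an
action to the responsible party.  Field and type names are hypothetical, and
a real TCB must also protect the log itself from modification or
destruction.

    /* Illustrative sketch only -- not part of this standard. */
    #include <stdio.h>
    #include <time.h>

    enum audit_event {
        AUD_LOGIN         = 1 << 0,
        AUD_OBJECT_ACCESS = 1 << 1,
        AUD_OBJECT_DELETE = 1 << 2,
        AUD_ADMIN_ACTION  = 1 << 3
    };

    struct audit_record {
        time_t           when;     /* date and time of the event          */
        const char      *user;     /* individual responsible              */
        enum audit_event type;     /* type of event                       */
        int              success;  /* success or failure of the event     */
        const char      *origin;   /* origin of request, e.g. terminal ID */
    };

    /* Event types pre-selected for recording by the administrator. */
    static unsigned audit_mask = AUD_LOGIN | AUD_OBJECT_DELETE | AUD_ADMIN_ACTION;

    /* Record the event only if its type has been selected, minimizing the
       expense of auditing while preserving traceability. */
    void audit(const struct audit_record *rec)
    {
        if (!(rec->type & audit_mask))
            return;
        fprintf(stderr, "%ld %s type=%d ok=%d origin=%s\n",
                (long)rec->when, rec->user, (int)rec->type, rec->success,
                rec->origin ? rec->origin : "-");
    }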

   Assurance

   Requirement 5 - ASSURANCE - The computer system must contain
hardware/software mechanisms that can be independently evaluated to provide
sufficient assurance that the system enforces requirements 1 through 4
above.  In order to assure that the four requirements of Security Policy,
Marking, Identification, and Accountability are enforced by a computer
system, there must be some identified and unified collection of hardware and
software controls that perform those functions.  These mechanisms are
typically embedded in the operating system and are designed to carry out the
assigned tasks in a secure manner.  The basis for trusting such system
mechanisms in their operational setting must be clearly documented such that
it is possible to independently examine the evidence to evaluate their
sufficiency.

   Requirement 6 - CONTINUOUS PROTECTION - The trusted mechanisms that
enforce these basic requirements must be continuously protected against
tampering and/or unauthorized changes.  No computer system can be considered
truly secure if the basic hardware and software mechanisms that enforce the
security policy are themselves subject to unauthorized modification or
subversion.  The continuous protection requirement has direct implications
throughout the computer system's life-cycle.

These fundamental requirements form the basis for the individual evaluation
criteria applicable for each evaluation division and class.  The interested
reader is referred to Section 5 of this document, "Control Objectives for
Trusted Computer Systems," for a more complete discussion and further
amplification of these fundamental requirements as they apply to
general-purpose information processing systems and to Section 7 for
amplification of the relationship between Policy and these requirements.
Structure of the Document

The remainder of this document is divided into two parts, four appendices,
and a glossary.  Part I (Sections 1 through 4) presents the detailed
criteria derived from the fundamental requirements described above and
relevant to the rationale and policy excerpts contained in Part II.

Part II (Sections 5 through 10) provides a discussion of basic objectives,
rationale, and national policy behind the development of the criteria, and
guidelines for developers pertaining to:  mandatory access control rules
implementation, the covert channel problem, and security testing.  It is
divided into six sections.  Section 5 discusses the use of control
objectives in general and presents the three basic control objectives of the
criteria.  Section 6 provides the theoretical basis behind the criteria.
Section 7 gives excerpts from pertinent regulations, directives, OMB
Circulars, and Executive Orders which provide the basis for many trust
requirements for processing nationally sensitive and classified information
with computer systems.  Section 8 provides guidance to system developers on
expectations in dealing with the covert channel problem.  Section 9 provides
guidelines dealing with mandatory security.  Section 10 provides guidelines
for security testing.  There are four appendices, including a description of
the Trusted Computer System Commercial Products Evaluation Process (Appendix
A), summaries of the evaluation divisions (Appendix B) and classes (Appendix
C), and finally a directory of requirements ordered alphabetically (Appendix
D).  In addition, there is a glossary.

Structure of the Criteria

The criteria are divided into four divisions:  D, C, B, and A, ordered in a
hierarchical manner with the highest division (A) being reserved for systems
providing the most comprehensive security.  Each division represents a major
improvement in the overall confidence one can place in the system for the
protection of sensitive information.  Within divisions C and B there are a
number of subdivisions known as classes.  The classes are also ordered in a
hierarchical manner, with systems representative of division C and lower
classes of division B being characterized by the set of computer security
mechanisms that they possess.  Assurance of correct and complete design and
implementation for these systems is gained mostly through testing of the
security-relevant portions of the system.  The security-relevant portions of
a system are referred to throughout this document as the Trusted Computing
Base (TCB).  Systems representative of higher classes in division B and
division A derive their security attributes more from their design and
implementation structure.  Increased assurance that the required features
are operative, correct, and tamperproof under all circumstances is gained
through progressively more rigorous analysis during the design process.

Within each class, four major sets of criteria are addressed.  The first
three represent features necessary to satisfy the broad control objectives
of Security Policy, Accountability, and Assurance that are discussed in Part
II, Section 5.  The fourth set, Documentation, describes the type of written
evidence in the form of user guides, manuals, and the test and design
documentation required for each class.

A reader using this publication for the first time may find it helpful to
first read Part II, before continuing on with Part I.

                        PART I:  THE CRITERIA

Highlighting (UPPERCASE) is used in Part I to indicate criteria not
contained in a lower class or changes and additions to already defined
criteria.  Where there is no highlighting, requirements have been carried
over from lower classes without addition or modification.


1.0  DIVISION D:  MINIMAL PROTECTION

This division contains only one class.  It is reserved for those systems
that have been evaluated but that fail to meet the requirements for a higher
evaluation class.


2.0  DIVISION C:  DISCRETIONARY PROTECTION

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

2.1  CLASS (C1):  DISCRETIONARY SECURITY PROTECTION

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data.  It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data.  The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity.  The following are minimal
requirements for systems assigned a class (C1) rating:

2.1.1  Security Policy

   2.1.1.1  Discretionary Access Control

      The TCB shall define and control access between named users and named
      objects (e.g., files and programs) in the ADP system.  The enforcement
      mechanism (e.g., self/group/public controls, access control lists)
      shall allow users to specify and control sharing of those objects by
      named individuals or defined groups or both.
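
As an illustrative sketch only (not a requirement of this class), the C
fragment below shows the self/group/public style of enforcement mechanism
mentioned above, using hypothetical permission bits:  the owner's bits apply
to the owner, the group bits to members of the owning group, and the public
bits to everyone else.

    /* Illustrative sketch only -- not part of this standard. */
    #include <stdbool.h>

    enum perm { PERM_READ = 4, PERM_WRITE = 2, PERM_EXEC = 1 };

    struct named_object {
        int owner_uid, group_gid;
        int owner_bits, group_bits, public_bits;   /* self/group/public */
    };

    /* Choose the applicable permission set for the requesting user and
       grant the request only if every requested mode is present. */
    bool c1_access(int uid, int gid, const struct named_object *obj,
                   enum perm want)
    {
        int bits = (uid == obj->owner_uid) ? obj->owner_bits
                 : (gid == obj->group_gid) ? obj->group_bits
                 : obj->public_bits;
        return (bits & (int)want) == (int)want;
    }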

2.1.2  Accountability

   2.1.2.1  Identification and Authentication

      The TCB shall require users to identify themselves to it before
      beginning to perform any other actions that the TCB is expected to
      mediate.  Furthermore, the TCB shall use a protected mechanism (e.g.,
      passwords) to authenticate the user's identity.  The TCB shall protect
      authentication data so that it cannot be accessed by any unauthorized
      user.
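
For illustration only (not part of the criteria), one way a protected
password mechanism can avoid keeping authentication data in recoverable form
is to store only one-way hashes.  In the hypothetical C sketch below the
hash function is a simple stand-in; a real TCB would use a vetted algorithm
and protect the stored table from any unauthorized access.

    /* Illustrative sketch only -- not part of this standard. */
    #include <stdbool.h>
    #include <stdint.h>
    #include <string.h>

    /* Stand-in one-way function (64-bit FNV-1a); not cryptographically
       strong and not a requirement of the criteria. */
    static uint64_t toy_hash(const char *s)
    {
        uint64_t h = 0xcbf29ce484222325ULL;
        while (*s) { h ^= (unsigned char)*s++; h *= 0x100000001b3ULL; }
        return h;
    }

    struct account { const char *user; uint64_t pw_hash; };

    /* Authenticate the claimed identity before any other mediated action
       is permitted; unknown users are simply rejected. */
    bool authenticate(const struct account *tab, int n,
                      const char *user, const char *password)
    {
        for (int i = 0; i < n; i++)
            if (strcmp(tab[i].user, user) == 0)
                return toy_hash(password) == tab[i].pw_hash;
        return false;
    }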

2.1.3  Assurance

   2.1.3.1  Operational Assurance

      2.1.3.1.1  System Architecture

         The TCB shall maintain a domain for its own execution that
         protects it from external interference or tampering (e.g., by
         modification of its code or data structures).  Resources
         controlled by the TCB may be a defined subset of the subjects and
         objects in the ADP system.

      2.1.3.1.2  System Integrity

         Hardware and/or software features shall be provided that can be
         used to periodically validate the correct operation of the on-site
         hardware and firmware elements of the TCB.

   2.1.3.2  Life-Cycle Assurance

      2.1.3.2.1  Security Testing

         The security mechanisms of the ADP system shall be tested and
         found to work as claimed in the system documentation.  Testing
         shall be done to assure that there are no obvious ways for an
         unauthorized user to bypass or otherwise defeat the security
         protection mechanisms of the TCB.  (See the Security Testing
         Guidelines.)
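
Purely as an illustration of the kind of functional test this implies (and
not as a requirement), the hypothetical C fragment below exercises a
stand-in decision routine with cases in which access must be denied, so the
test fails loudly if the mechanism can be bypassed.  Real security testing
of a TCB is considerably broader; see Section 10.

    /* Illustrative sketch only -- not part of this standard. */
    #include <assert.h>
    #include <stdbool.h>
    #include <stdio.h>

    /* Stand-in for the TCB decision mechanism under test. */
    static bool access_allowed(int uid, int owner_uid, bool on_access_list)
    {
        return uid == owner_uid || on_access_list;
    }

    int main(void)
    {
        /* Positive case: an authorized user reaches the object. */
        assert(access_allowed(100, 100, false));

        /* Negative cases: unauthorized users must be denied on every path. */
        assert(!access_allowed(200, 100, false));
        assert(!access_allowed(300, 100, false));

        puts("security mechanism tests passed");
        return 0;
    }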

2.1.4  Documentation

   2.1.4.1  Security Features User's Guide

      A single summary, chapter, or manual in user documentation shall
      describe the protection mechanisms provided by the TCB, guidelines on
      their use, and how they interact with one another.

   2.1.4.2  Trusted Facility Manual

      A manual addressed to the ADP System Administrator shall present
      cautions about functions and privileges that should be controlled
      when running a secure facility.

   2.1.4.3  Test Documentation

      The system developer shall provide to the evaluators a document that
      describes the test plan, test procedures that show how the security
      mechanisms were tested, and results of the security mechanisms'
      functional testing.

   2.1.4.4  Design Documentation

      Documentation shall be available that provides a description of the
      manufacturer's philosophy of protection and an explanation of how
      this philosophy is translated into the TCB.  If the TCB is composed
      of distinct modules, the interfaces between these modules shall be
      described.

2.2  CLASS (C2):  CONTROLLED ACCESS PROTECTION

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation.  The following are minimal requirements for systems
assigned a class (C2) rating:

2.2.1  Security Policy

   2.2.1.1  Discretionary Access Control

      The TCB shall define and control access between named users and named
      objects (e.g., files and programs) in the ADP system.  The enforcement
      mechanism (e.g., self/group/public controls, access control lists)
      shall allow users to specify and control sharing of those objects by
      named individuals, or defined groups of individuals, or by both, and
      shall provide controls to limit propagation of access rights.  The
      discretionary access control mechanism shall, either by explicit user
      action or by default, provide that objects are protected from
      unauthorized access.  These access controls shall be capable of
      including or excluding access to the granularity of a single user.
      Access permission to an object by users not already possessing access
      permission shall only be assigned by authorized users.
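
For illustration only (not part of the criteria), the hypothetical C sketch
below shows an access control list that can include or exclude access at
the granularity of a single user, with explicit deny entries overriding
grants, and that restricts the assignment of new permissions to an already
authorized user (here, simply the owner).

    /* Illustrative sketch only -- not part of this standard. */
    #include <stdbool.h>

    enum ace_kind { ACE_GRANT, ACE_DENY };
    struct ace { enum ace_kind kind; int uid; };

    #define MAX_ACE 16
    struct c2_object { int owner_uid; struct ace acl[MAX_ACE]; int n_ace; };

    /* A deny entry excludes a single user even if a grant exists; absence
       of any matching entry defaults to no access. */
    bool c2_access(int uid, const struct c2_object *obj)
    {
        bool granted = false;
        for (int i = 0; i < obj->n_ace; i++) {
            if (obj->acl[i].uid != uid)
                continue;
            if (obj->acl[i].kind == ACE_DENY)
                return false;
            granted = true;
        }
        return granted;
    }

    /* Only a user already authorized to do so (the owner in this sketch)
       may assign access permission, limiting propagation of rights. */
    bool c2_grant(int requester_uid, struct c2_object *obj, struct ace entry)
    {
        if (requester_uid != obj->owner_uid || obj->n_ace >= MAX_ACE)
            return false;
        obj->acl[obj->n_ace++] = entry;
        return true;
    }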

   2.2.1.2  Object Reuse

      All authorizations to the information contained within a storage
      object shall be revoked prior to initial assignment, allocation or
      reallocation to a subject from the TCB's pool of unused storage
      objects.  No information, including encrypted representations of
      information, produced by a prior subject's actions is to be available
      to any subject that obtains access to an object that has been
      released back to the system.
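
One common way a TCB meets this requirement is to scrub storage before it is
reassigned.  The C sketch below is offered for illustration only and is not
part of the criteria; allocate_storage_object and release_storage_object are
hypothetical names, and a real implementation would also guard against the
compiler eliding the clearing stores.

    /* Illustrative sketch only -- not part of this standard. */
    #include <stdlib.h>
    #include <string.h>

    /* Hand out a storage object only after clearing it, so no information
       produced by a prior subject (even in encrypted form) is available
       to the new owner. */
    void *allocate_storage_object(size_t size)
    {
        void *obj = malloc(size);
        if (obj != NULL)
            memset(obj, 0, size);
        return obj;
    }

    /* Scrub on release as well, so residue cannot be recovered from the
       pool of unused storage objects. */
    void release_storage_object(void *obj, size_t size)
    {
        if (obj != NULL) {
            memset(obj, 0, size);
            free(obj);
        }
    }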

2.2.2  Accountability

   2.2.2.1  Identification and Authentication

      The TCB shall require users to identify themselves to it before
      beginning to perform any other actions that the TCB is expected to
      mediate.  Furthermore, the TCB shall use a protected mechanism (e.g.,
      passwords) to authenticate the user's identity.  The TCB shall protect
      authentication data so that it cannot be accessed by any unauthorized
      user.  The TCB shall be able to enforce individual accountability by
      providing the capability to uniquely identify each individual ADP
      system user.  The TCB shall also provide the capability of associating
      this identity with all auditable actions taken by that individual.

   2.2.2.2  Audit

      The TCB shall be able to create, maintain, and protect from
      modification or unauthorized access or destruction an audit trail of
      accesses to the objects it protects.  The audit data shall be
      protected by the TCB so that read access to it is limited to those
      who are authorized for audit data.  The TCB shall be able to record
      the following types of events:  use of identification and
      authentication mechanisms, introduction of objects into a user's
      address space (e.g., file open, program initiation), deletion of
      objects, and actions taken by computer operators and system
      administrators and/or system security officers, and other security
      relevant events.  For each recorded event, the audit record shall
      identify:  date and time of the event, user, type of event, and
      success or failure of the event.  For identification/authentication
      events the origin of request (e.g., terminal ID) shall be included in
      the audit record.  For events that introduce an object into a user's
      address space and for object deletion events the audit record shall
      include the name of the object.
