United States Patent  [19]
Jablon et al.

US005421006A

[11] Patent Number:  5,421,006
[45] Date of Patent:  May 30, 1995
[54] METHOD AND APPARATUS FOR ASSESSING
     INTEGRITY OF COMPUTER SYSTEM
     SOFTWARE

[75] Inventors: David P. Jablon; Nora E. Hanley,
     both of Shrewsbury, Mass.

[73] Assignee: Compaq Computer Corp., Houston,
     Tex.

[21] Appl. No.: 231,443

[22] Filed: Apr. 20, 1994
OTHER PUBLICATIONS

Intel 386 SL Microprocessor SuperSet Programmer's
Reference Manual, 1990, ISBN 1-55512-129-2.
Compaq Computer Corporation, Security Standard for
Hardware Configuration, pp. 1-6, 1990.
Flowchart of Operations of Computers According to
the Security Standard for Hardware Configuration.
Chap. 13, Real Time Clock Interface, 386 SL Micro-
processor SuperSet System Design Guide by Intel Cor-
poration, pp. 13-1 to 13-2, 1990.
Using Password Security, Operations Guide for Com-
paq Deskpro 386s Personal Computer by Compaq
Computer Corp., pp. 3-5 to 3-7, 1988.
Related U.S. Application Data

[63] Continuation of Ser. No. 880,050, May 7, 1992.

[51] Int. Cl.6 .................... G06F 11/00; H04K 1/00
[52] U.S. Cl. .............................. 395/575; 380/4
[58] Field of Search ........ 395/575, 700, 750, 425; 380/4

Attorney, Agent, or Firm—Pravel, Hewitt, Kimball &
Krieger

[57] ABSTRACT
A method and device for reliably assessing the integrity
of a computer system's software prevents execution of
corrupted programs at time of system initialization,
enhancing system security. Programs and data compris-
ing the system's trusted software, including all startup
processes, are verified before being utilized. Methods to
verify the trusted software use a hierarchy of both mod-
ification detection codes and public-key digital signa-
ture codes. The top-level codes are placed in a protecta-
ble non-volatile storage area, and are used by the boot-
strap program to verify the rest of the trusted software.
The initialization programs set a hardware latch to
protect the codes in the non-volatile memory from
being overwritten by subsequent untrusted programs.
The latch is only reset at system restart, when control
returns to the bootstrap program. Software reconfigu-
ration is possible with trusted programs that write new
top-level codes while the latch is open. The mechanism
itself is immune to malicious software attack when the
write-protect latch is closed before running untrusted
software. A preferred embodiment in an IBM-compatible
personal computer uses the reset switch to initiate a
trusted path between the user and a program. Damage
from certain classes of computer virus and Trojan horse
attacks is prevented. A system recovery process is de-
scribed. A related improved method for user authenti-
cation uses a read-and-write memory protection latch
to prevent access to sensitive authentication data.

19 Claims, 8 Drawing Sheets
`
[56]  References Cited

U.S. PATENT DOCUMENTS

4,309,569   1/1982  Merkle .
4,388,695   6/1983  Heinemann .................... 364/900
4,590,552   5/1986  Guttag .
                    Goodman et al. ............... 364/900
4,685,056   8/1987  Barnsdale, Jr. et al. ........ 364/200
4,825,358   4/1989
4,908,861   3/1990
4,930,073   5/1990
4,970,504  11/1990
4,975,950  12/1990
5,022,077   6/1991
5,050,212   9/1991
5,073,934  12/1991
5,121,345   6/1992
5,204,966   4/1993  ........................... 380/30
5,265,164  11/1993  Matyas et al.
5,273,973   1/1994  O'Brien et al. ............... 395/500
`
APPLE 1004
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 1 of 8: hardware block diagram (memory, address decoder); text in the scan is rotated and largely illegible.]
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 2 of 8: flowchart — boot record read into memory; "COMPATIBLE?" test; stored boot code compared with computed MDC of boot record; close write-protect latch (54); run boot record program; on mismatch, print error message (66) and continue with next bootable device; after last bootable device, halt (67, 69).]
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 3 of 8: tables for a secure configuration and a compatible configuration (variables: COMPATIBLE, USER ENABLED, USER CODE, CONFIG ENABLED, CONFIG CODE, with values); public key / private key pair, with off-line storage of the trusted authority's key (168).]
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 4 of 8: memory map (70) — BIOS read-only memory (80); write-protectable non-volatile memory holding MDC of boot record, MDC of DOS, and MDC of application; vulnerable memory holding boot record (72), DOS (74), trusted application (76), trusted software (98), and untrusted software/application (92).]
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 5 of 8: program verification hierarchy — programs B (130), C, D, and E (140), with MDC of E (148) and MDC of list L (146).]
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 6 of 8: hardware schematic including an address decoder.]
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 7 of 8: flowchart of the boot record program — read DOS into memory; compare stored DOS MDC with computed MDC of the DOS image read into memory; read application into memory; compare stored application MDC with computed MDC; run the application, or print an error message on mismatch.]
`

`
[U.S. Patent 5,421,006, May 30, 1995 — Sheet 8 of 8: flowchart — select application 1 or 2; boot record program reads DOS into memory; read application 2 into memory; run application 2, or print an error message.]
`

`
`
`METHOD AND APPARATUS FOR ASSESSING
`INTEGRITY OF COMPUTER SYSTEM SOFTWARE
`
`This is a continuation of co-pending application Ser.
No. 07/880,050, filed on May 7, 1992.
`BACKGROUND
`
`
`1. Field of the Invention
`This invention relates to an improved method and
`device for assessing the integrity of computer software
`during the system initialization process, and preventing
`incorrect programs from executing. Preferred embodi-
`ments of this invention relate to an improved user au-
`thentication method, and generally improved computer
`system security. The invention is useful for preventing
`damage from certain computer virus and Trojan horse
`attacks.
`2. Background Discussion
`The field of computer security spans many interre-
lated areas, addressing many different problems. Defen-
`sive security protective measures can often be quite
`complex, and in complex systems, an attacker can ex-
`ploit a “weak link” in the system to circumvent the
`protective measures. The security of one aspect of the
`system can thus depend on the strength of protection
`provided in other areas. This invention primarily asses-
`ses software integrity at startup time, and it is intended
`to enhance rather than replace other security methods.
`Since it can compensate for other security weaknesses
`of the system, the resulting benefits go beyond integrity
`assessment to improve user authentication and other
`security functions.
`As this invention provides the greatest benefit in
`personal computer systems, the IBM-compatible per-
sonal computer (hereinafter referred to simply as the
`“PC”) running the DOS operating system will be used
`as an example for much of this discussion. But the same
`benefits can be realized in other computer operating
`systems and hardware, and appropriately modified im-
`plementations will be apparent to those skilled in the
`art.
`As this invention is related to the fields of software
`protection, software integrity assessment, cryptogra-
`phy, memory protection, and user authentication, we
`will discuss relevant prior art in each of these areas. The
`invention includes a unique application of prior art in
`cryptography and integrity assessment. To clarify the
`relationship between the relevant fields, we first define
`some concepts in computer security as used herein,
including “trusted software”, software “integrity” and
`software “protection”. We will also review some spe-
`cific security threats including “Trojan horse” and
`“software virus” attacks, and review the prior art in
`addressing these threats, distinguishing between protec-
`tion methods and integrity assessment methods, and
`describing the useful concept of a “trusted path”.
`“Trusted software”, as used here, is defined to be the
`subset of all the software used in a system, which is
`responsible for the correct and reliable operation of the
`system, and responsible for enforcing a system’s secu-
`rity policy. The security policy may include rules for
`how to authorize access to the system, and rules for
`determining who can access particular data within the
`system. The “trust” here is an abstract relative measure.
`The determination of which software is trusted, and the
`functions which it is trusted to perform, can vary
`widely from system to system. Our usage of the term
`
“trusted software” is closely related to the usage in
standard computer security literature, including the
U.S. Department of Defense Trusted Computer System
Evaluation Criteria (TCSEC). In this literature, the
term “trusted computing base” is often used, which is
comprised of the trusted software, plus any supporting
hardware.
`Software “integrity”, as used in this discussion, refers
`to whether the software is as trustworthy as when it
`was initially installed. We assume that the system is in its
`most reliable state immediately after a proper installa-
`tion. System software that has been changed, whether
through a deliberate act by an unauthorized person, or
`through an accidental system malfunction, is said to
`have undergone an “integrity violation”. In such cases,
`the software should no longer be presumed to operate
`correctly. Maintaining the integrity of the trusted soft-
`ware is especially important. “Integrity assessment” is
`the art of determining whether a system’s integrity is
`intact or has been violated. The “trusted software”
`referred to throughout
`this discussion is generally
`trusted at least to not violate the integrity of other parts
`of the system.
We now introduce an extension to the concept of
`trusted software, to be called “transient trusted soft-
ware”. This concept applies to systems that start up in a
highly trusted state and degrade over time. At a certain
`point in the operation of a system, the set of trusted
`software may become vulnerable to attack, and can no
`longer be relied upon to perform trusted operations.
`When the system is restarted, integrity assessment mea-
`sures can be used to revalidate the transient trusted
`software. In the rest of this document, our use of
`“trusted software” will generally refer to this “transient
`trusted software”.
`Software “protection” is defined here as the art of
`preventing violations of software integrity. Although
`this invention is primarily an integrity assessment
`method, rather than a software protection method, a
`review of some protection methods will help frame the
`invention in the context of prior art. The discussion will
`show how software protection methods in actual use
`are less than perfect, and how a stronger layer of integ-
`rity assessment is needed. (The field of software “pro-
`tection” should not be confused with the field of soft-
`ware “copy-protection”, which addresses the problem
of software theft.)
`One general class of threat to system security is a
`“Trojan horse” attack. This is a program that is de-
`signed or has been modified to perform some hostile act,
`but is disguised as a familiar or non-threatening pro-
`gram, or it may be hidden within trusted system pro-
`grams.
`Another general class of security threat is the “soft-
`ware virus”. These are hostile software programs, often
`introduced into systems using a Trojan horse method,
`with the additional ability to replicate by attaching
copies of themselves into other modified programs.
`These attacks first violate the integrity of software, and
`then perform a hostile action at a later time.
Whereas the widespread threat from software vi-
ruses is a relatively new phenomenon, historically,
`much attention in the computer security field has fo-
`cused on methods to protect computer system integrity
`while allowing untrusted programs to run. The field of
`software protection has generated many mechanisms
`for securing access to software and data within a sys-
`tem. Multi-ring architectures were designed both to
`
`

`
`segregate user processes from each other in multi-user
`time sharing systems, and to protect trusted operating
`systems from less-trusted applications. In these systems,
`special hardware and software mechanisms segregate
`the software address space into two or more protection
`“rings”. The innermost ring contains the system’s most-
`trusted software, and can enforce some of the security
`policy even in the face of failures of software in outer
`rings. A good background discussion of protection
`rings, and a description of an advanced multi-ring archi-
`tecture can be found in U.S. Pat. No. 4,787,031.
`However, despite the architectural strength of some
`systems, in actual use, the integrity of trusted software
`cannot always be guaranteed. In the UNIX operating
`system, which uses a two-ring architecture, there is a
`facility for “root” access for processes running in the
`less-privileged outer ring. With root access, much of the
architectural protection surrounding the inner ring can be by-
`passed, and any user or process running as root can
`modify any trusted software. In theory, root access is
`only used for special security-sensitive operations, but
in practice, preventing unauthorized root access is a
`well-known security problem of the system.
In an IBM-compatible PC system running DOS, which
`uses the processor’s ringless “real” addressing mode,
`the problem is much worse. There is no architectural
`constraint preventing any application from corrupting
`the rest of the system software. Since all DOS programs
`have access to the same address space as DOS itself, all
`writable storage areas of the machine are vulnerable to
`attack. This problem remains even in PC operating
`systems that switch between real and protected address-
`ing modes of the Intel 386 family of microprocessors,
`which is discussed in U.S. Pat. No. 4,825,358. (This
`patent is also cited below in the discussion of prior art in
`memory-protection devices.) Since such systems gener-
`ally still provide access to real mode, and for other
`compatibility reasons, there is always a back-door for
`bypassing the security features set up in protected
`mode.
`
The threat of very sophisticated deliberate attacks
against system security has also become a common
problem. There is much literature about the problem of
PC viruses, and many products have been designed to
mitigate their threat. A particularly troublesome form
of virus attack is one that is carefully designed to bypass
normal system security features. And though there may
be no deliberate attempt to infect a given system, there
is still a high risk of inadvertent infection.

Because PC DOS systems have no solid memory
protection architecture, all writable storage areas of the
machine are thus vulnerable to attack. Some examples
of vulnerable areas in a PC include the following:

a writable hard disk;
a removable disk, where unauthorized substitution of,
or access to the disk is possible;
an inadequately protected storage area on a network
server that contains a program to be downloaded to a
PC;
a PC which downloads a program into memory
from another machine across a network, where the
network or the network download protocol has inad-
equate protection.

To mitigate the virus threat, a wide variety of prod-
ucts have been designed to detect, prevent, and remove
viruses. Though prevention of a virus attack is beyond
the scope of this invention, part of the need for this
invention is that no purely preventive solution is 100%
effective, especially in typical PC systems. Software-
only protective measures can only offer a limited form
of assurance against malicious attack, generally because
the protection software and the virus must share the
same address space. The protection software is thus
vulnerable to a virus attack designed specifically to
target the protection program. Thus, even if an existing
software product perfectly protects against all current
viruses, there is no guarantee that a new virus will not
be developed to circumvent the product, and escape
detection. Some of the anti-virus products on the mar-
ket use special hardware to address the problem, but
these generally focus on preventing virus infection,
rather than assessing integrity. And both hardware and
software-only products often rely on the secrecy of the
product design or implementation. A virus developer
can discover secret design details by reverse engineer-
ing the product, and a software attack can be designed
to circumvent these solutions.

The field of virus detection provides another level of
defense against viruses. If a virus infection has already
occurred, it is often possible to detect and remove the
virus before more-serious damage can occur. This field
is related to the more general field of software integrity
assessment. Some methods of virus detection, such as
searching for data patterns indicating the presence of
specific viruses, cannot be used as a general integrity
test. But a strong integrity assessment test can offer
strong proof that no virus has infected a given program.
The use of “modification detection codes”, discussed
further below, provides a strong test for integrity and
viruses.

Software viruses are only one class of security threats
that can be introduced to a system with a Trojan horse
attack. Other threats include attacks directed at obtain-
ing security-sensitive data, such as passwords.

Accidental corruption of the PC system is also a
common problem, typically resolved by a system re-
start. Somewhat less commonly, a copy of an operating
system on disk becomes corrupted, and if the system
restarts without detecting the corruption, further dam-
age may occur. Our invention can also detect such
accidental system failures.

The “trusted path” feature is an important compo-
nent of secure systems, designed specifically to elimi-
nate Trojan horse and other threats during security-
critical operations, such as a login process. A trusted
path unambiguously establishes a secure connection
between the user's input device and a trusted program,
such that no other hardware or software component
can intervene or intercept the communication. This is
sometimes implemented with a reserved keyboard key
known as the “secure attention key”, as is described in
U.S. Pat. No. 4,918,653. Trusted path is a required fea-
ture of systems evaluated against higher levels of the
U.S. Department of Defense TCSEC security standard.
The European ITSEC has similar requirements, and
there is recent recognition that “trusted path” is needed
as a minimal requirement for secure general-purpose
commercial systems. One form of this invention pro-
vides a trusted path to a login program, using the PC's
reset switch as a secure attention key.

Much of the preceding background has suggested a
need for integrity assessment methods, and there is
relevant prior art in this field as well. A widely used
technique is to compute an integrity assessment code on
a program, and verify that the code matches a predeter-
mined value before executing the program. We will
`
`

`
`discuss two different approaches for computing the
`integrity assessment code, namely checksums and modi-
`fication detection codes.
`Within the PC, the BIOS program which resides in
`read-only memory (ROM) is the first program to run
`when the system starts up. As part of its initialization it
`looks for other ROM extensions to BIOS, and verifies
`the checksum of these extensions programs before al-
lowing them to be used. This is described in “IBM PC
Technical Reference—System BIOS”. U.S. Pat. No.
5,022,077 also uses checksums to validate extensions to
`the PC BIOS program where the extensions reside out-
`side of the ROM. But the real focus of their patent is on
`protecting the storage area where BIOS extensions are
`kept, rather than verifying their integrity. And their
`storage protection method shares the architectural
`weakness of most
`software-controlled protection
`schemes on the PC.
`U.S. Pat. No. 4,975,950 claims the invention of check-
`ing a system for the presence of a virus at system initial-
`ization, and preventing operation if a virus is found.
`But, rather than defining a virus-detection technique, or
`an integrity assessment method as in our invention, it
uses only “known techniques for checking file size, file
`checksum, or file signature”.
`Although checksums are adequate for detecting acci-
`dental modifications of data, they are an insecure de-
`fense against deliberate modification. It is in fact very
`easy to modify a message such that it retains the same
`checksum value, and whole classes of more complex
`algorithms, including cyclic redundancy checks, suffer
`from the same problem. To address this problem, “mod-
`ification detection codes” have been designed to specifi-
`cally detect deliberate corruption of data, and are supe-
`rior to earlier methods, such as checksums. Whereas
`data can be intentionally modified in a manner that
`preserves a chosen checksum, it is intended to be com-
`putationally infeasible to modify data so as to preserve
`a specific modification detection code value. The secu-
`rity of a good modification detection code algorithm
may depend on solving a particularly difficult unsolved
`mathematical problem, one that has withstood pro-
`longed serious attention by experts in cryptography and
`mathematics. Modification detection codes are also
known by other names in the literature, including:
“cryptographic checksum”, “cryptographic hash”, “se-
cure hash algorithm”, and “message digest”. There has
`also been recent progress in finding strong, yet efficient
`algorithms, including a recent proposed standard algo-
`rithm described in “National Institute of Standards and
Technology—Proposed FIPS for Secure Hash Stan-
dard”, Federal Register, Jan. 1992, page 3747.
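(Illustrative sketch, not part of the original disclosure.) The difference between a forgeable checksum and a modification detection code can be shown in a few lines; SHA-256 stands in here for a strong hash such as the proposed Secure Hash Standard cited above, and the message bytes are hypothetical:

```python
import hashlib

def additive_checksum(data: bytes) -> int:
    """Simple 8-bit additive checksum: trivially forgeable."""
    return sum(data) % 256

original = b"PAY ALICE $0100"
tampered = b"PAY ALICE $9100"
# Compensate for the '0' -> '9' change (+9) by lowering another
# byte ('E' -> '<') by the same amount, preserving the byte sum.
delta = ord("9") - ord("0")
tampered = tampered.replace(b"E $", bytes([ord("E") - delta]) + b" $", 1)

# The checksum is preserved despite the deliberate modification...
assert additive_checksum(original) == additive_checksum(tampered)
# ...but the modification detection code is not.
assert hashlib.sha256(original).digest() != hashlib.sha256(tampered).digest()
```

This is why the text calls checksums "an insecure defense against deliberate modification": the attacker can always rebalance the sum.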
`Modification detection codes are also commonly
used in conjunction with the use of “public-key digital
`signatures”, which can authenticate the originator of a
`message. Creating a digital signature for a message often
`involves computing a modification detection code for
`the message, and then a further computation that
`“signs” the code with a private key held only by the
`originator of a message. A public-key that corresponds
`to the originator’s private key is made widely available.
The signature can then be verified by any person
`who has access to the originator’s public-key, with a
`computation that uses the modification detection code,
`the signature, and the public-key. The digital signature
`technique, a popular example of which is described in
U.S. Pat. No. 4,405,829 (“RSA”), is used in an enhanced
`form of our invention.
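(Illustrative sketch, not part of the original disclosure.) The sign-then-verify flow described above can be modeled with textbook RSA; the parameters below are toy values far too small to be secure, and the hash is reduced modulo the tiny key only so the arithmetic fits:

```python
import hashlib

# Toy RSA parameters (illustration only; never use key sizes like this).
p, q = 61, 53
n = p * q            # public modulus, 3233
e = 17               # public exponent
d = 2753             # private exponent: (e * d) % lcm(p-1, q-1) == 1

def mdc(message: bytes) -> int:
    """Modification detection code of the message, reduced mod n."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    """The originator signs the MDC with the private key d."""
    return pow(mdc(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone holding the public key (n, e) can verify the signature."""
    return pow(signature, e, n) == mdc(message)

msg = b"trusted startup program image"
sig = sign(msg)
assert verify(msg, sig)
```

The signature computation uses only the private key; verification uses only the widely distributed public key, matching the description in the text.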
`
`Modification detection codes have also been applied
`to the problem of virus protection on PCs. Recent soft-
`ware products compute modification detection codes
`on programs and verify them prior to program execu-
`tion. But software-only protection schemes for PCs
`suffer from the problem of residing in the unprotected
`address space. A potential solution is to embed the mod-
`ification detection code in a permanent read-only mem-
`ory device, but this makes system reconfiguration quite
`difficult. Other methods used in software products keep
`the modification detection code algorithm secret, and
`take measures to hinder the “reverse engineering” of
`the protection software. The weaknesses here are that it
`is difficult to predict how secrecy will be maintained,
`especially since reverse engineering is not a mathemati-
cally intractable problem. Other product announce-
`ments have described software-only verification sys-
`tems using public-key digital signatures, in addition to
`modification detection codes, to verify programs.
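(Illustrative sketch, not part of the original disclosure.) The verify-before-execute pattern those products use can be sketched as follows, with SHA-256 as the MDC and a hypothetical file standing in for a program image:

```python
import hashlib
from pathlib import Path

def compute_mdc(path: Path) -> str:
    """Modification detection code of a program image on disk."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def run_if_intact(program: Path, expected_mdc: str) -> bool:
    """Refuse to run a program whose MDC no longer matches the stored value."""
    if compute_mdc(program) != expected_mdc:
        print(f"integrity violation: refusing to run {program.name}")
        return False
    # In a real loader, control would transfer to the program here.
    return True

prog = Path("demo_program.bin")
prog.write_bytes(b"\x90\x90\xc3")        # pristine image
good = compute_mdc(prog)                  # recorded at install time
assert run_if_intact(prog, good)          # verifies; would run
prog.write_bytes(b"\x90\x90\xc3\xee")    # simulated infection
assert not run_if_intact(prog, good)      # detected; refused
prog.unlink()
```

As the text notes, the weakness of a software-only version of this scheme is that `good` itself lives in the same unprotected address space, which is what the hardware latch of the invention addresses.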
`Our invention uses a combination of known tech-
`niques, as described above, but it further incorporates a
`new hardware memory protection latch to make secu-
`rity mechanisms immune to software attack. One result
`is that our integrity assessment method is immune to the
`kind of violations it is intended to detect. A review of
`prior art in memory protection is therefore appropriate.
`In this field, a wide variety of software and hardware
`methods allow the memory address space to be parti-
`tioned and allow control over which software has ac-
`cess to individual regions of the memory. These meth-
`ods generally allow trusted software to both enable and
`disable the protection mechanism for a given region of
`memory, and these methods are often tied to central
features of the architecture of the system's central pro-
`cessor unit (CPU). The memory protection method in
`our invention is partly distinguished by only allowing
`software control in one direction: from unprotected to
`protected mode.
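(Illustrative sketch, not part of the original disclosure.) The one-way behavior can be modeled in software; the patent's latch is a hardware mechanism, so this class only mirrors its state transitions — software may close the latch, but only a restart reopens it:

```python
class WriteProtectLatch:
    """Model of the one-way latch: open at power-up, closeable by
    software, reopened only by a system restart."""

    def __init__(self):
        self._closed = False      # power-up state: memory writable
        self.nvram = {}

    def close(self):
        self._closed = True       # one-way: no software open() exists

    def write(self, key, value):
        if self._closed:
            raise PermissionError("latch closed: write rejected")
        self.nvram[key] = value

    def restart(self):
        self._closed = False      # only a restart reopens the latch


latch = WriteProtectLatch()
latch.write("mdc_boot_record", "a1b2...")   # trusted init stores codes
latch.close()                               # before untrusted software runs
try:
    latch.write("mdc_boot_record", "evil")  # untrusted write attempt
except PermissionError:
    pass                                    # attack rejected
assert latch.nvram["mdc_boot_record"] == "a1b2..."
```

The one-way direction (unprotected to protected) is exactly what distinguishes the invention's latch from the two-way schemes cited above.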
`An add-on memory protection method, structurally
`similar to the one in our invention, but allowing two-
`way switching, is described in U.S. Pat. No. 4,388,695.
`The previously mentioned U.S. Pat. No. 4,825,358 also
briefly describes memory protection hardware for a PC,
`which uses software to enable and disable the protec-
`tion.
Other patented memory protection schemes that
`have used a one-way switching latch have been either in
`the opposite direction, that is only from protected mode
`to unprotected mode, as in U.S. Pat. No. 4,651,323, or
have been designed for a different purpose and are trig-
`gered by a different mechanism, as in U.S. Pat. No.
`4,685,056.
`In the field of user authentication, many methods
`have been developed, including the common, and often
`controversial, use of passwords. Rather than review
`these methods in detail here, the relevant fact is that
`almost all methods require access to secret user-specific
`authentication data during the authentication process.
`To minimize the threat of secret passwords being re-
`vealed, it is generally recognized that passwords should
`be stored in a one-way hashed form to thwart attacks
`that look for them. But even one-way hashed passwords
`should be kept secret in order to thwart “brute-force”
`computational attacks on the known hashed password.
`Such attacks are especially easy if the password is
`poorly chosen. The Department of Defense Password
Management Guideline—CSC-STD-002-85, April
1985, discusses many of these issues.
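(Illustrative sketch, not part of the original disclosure.) The one-way hashed password storage the guideline recommends can be sketched as follows; the salt size and iteration count are illustrative choices, not values from the patent:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Store only (salt, one-way hash); the password itself is never kept."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password, salt, stored):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored)   # constant-time compare

salt, stored = hash_password("correct horse")
assert check_password("correct horse", salt, stored)
assert not check_password("wrong guess", salt, stored)
```

Even so, as the text observes, the stored (salt, hash) pair must itself be kept out of reach of untrusted software, or it invites brute-force attacks on poorly chosen passwords.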
`
`
`

`
`In PC systems, there are many products that use
`passwords. Of particular interest here are systems that
`require a password for initial startup of the system,
`sometimes implemented within the ROM BIOS. Some
`BIOS password implementations keep a hashed form of
`the password in an unprotected readable and writable
`non-volatile memory used to store configuration data,
`known as the “CMOS RAM”, and thus the stored
`hashed passwords are vulnerable to being read or writ-
`ten by untrusted applications.
`In general, integrity assessment is needed in many
`systems because no purely preventive measures can
guarantee that a system will never be corrupted. Even
`systems that use protected addressing modes have com-
plexities that can be exploited to attack system integrity.
`A natural time to check system integrity is at startup,
`and a startup integrity check is particularly beneficial
`for personal computers since they can be restarted fre-
`quently.
`The concept of transient trusted software has been
`introduced to allow systems without a strong protection
`architecture to nevertheless use strong security meth-
`ods during system initialization. Our invention assesses
`the integrity of the trusted software, to discover any
`corruption as the system starts. The combined hard-
`ware and software approach used here makes this
`method immune to software-only attack, and the secu-
`rity of this method does not depend on keeping secret
`the design details of either the software or the hard-
`ware. This additional integrity guarantee is best used in
`combination with traditional protection methods to
`enforce a wide range of security policies.
`Standards for evaluating computer system security,
`such as the TCSEC, require a high level of assurance
`for a product to be rated at the upper levels. Such sys-
`tems may be analyzed against a mathematical model of
`security, and may involve formal proof techniques. The
`nature of the design of this mechanism suggests that
`such an analysis may be possible.
`In light of this discussion of the prior art, the afore-
`mentioned problems with existing security solutions,
`the need for strong measures to verify the integrity of
`computer system software, and the need for storing
`secret authentication data, we proceed to describe this
`invention. Particular preferred embodiments of the
`invention on an IBM PC are described here to provide
`additional protection against certain PC viruses, a
`“trusted path” login procedure, and an improved means
`of maintaining the secrecy of stored authentication data.
OBJECTIVES AND SUMMARY OF THE
INVENTION
`
`It is therefore an object of this invention to assess the
`integrity of trusted software during the system initial-
`ization process, and to prevent incorrect programs from
`executing.
`It is another object of this invention to correctly
`assess software integrity, even if a potential attacker has
`complete knowledge of the hardware and software
`system design, and has access to the system software.
`The mechanism must force an attacker to resort to
`directly attacking the hardware of the specific system,
`in order to undetectably modify the software.
`It is another object of this invention to permit recon-
`figuration of the trusted software without requiring
`hardware modification by the user.
`
`It is another object of this invention to allow the set
`of trusted software to be arbitrarily large, and to effi-
`ciently assess the integrity of software components.
`It is another object of this invention for the hardware
`component to be fully compatible with existing system
`software, meaning that the new mechanism can be dis-
`abled by software if it’s not needed in a specific configu-
`ration.
`It is another object of this invention to allow a trusted
`path to be established between the user and a program
`in response to a signal initiated by the user.
`It is another object of this invention to enhance the
`secrecy of authentication data used in an access-control
`mechanism, when user authentication software is con-
tained within the trusted initialization software.
In accordance with the above objects, embodiments
of the invention highlight different uses of a hardware
`latch memory protection mechanism. One or more re-
`gions of non-volatile memory are provided, in which
`security-relevant data are stored. Access to a protecta-
`ble memory region is controlled with a latch mecha-
`nism, such that the memory is always both readable and
`writable when the computer is first started. But during‘
`system. initialization trusted software closes the latch to
`protect the memory, and thus prevent all subsequently
`run programs from reading and/or writing the security-
`relevant data during normal operation. Once closed, the
`latch can not be opened by software control. The latch
`is only re-opened when the system is restarted, which‘
`can occur by either momentarily turning off the power
`switch, or by pushing a reset switch. When the system
`is restarted, control of the CPU returns to a trusted
`startup program in read-only memory.

The memory protection latch mechanism prevents software-only
attacks on the stored data during normal operation. This
protection remains even if complete knowledge of the system
design is available to an attacker.
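
The one-way latch behavior described above can be modeled in software. The following is a minimal illustrative sketch, not the patent's hardware implementation; the class and method names (`ProtectableRegion`, `close_latch`, `reset`) are hypothetical.

```python
class ProtectableRegion:
    """Software model of a non-volatile memory region gated by a
    one-way protection latch, as described in the summary above."""

    def __init__(self, data: bytes):
        self._data = bytearray(data)
        self._open = True  # latch is open when the computer first starts

    def close_latch(self) -> None:
        # Trusted initialization software closes the latch; once closed,
        # no software action can reopen it.
        self._open = False

    def reset(self) -> None:
        # Models a hardware restart (power cycle or reset switch), the
        # only event that reopens the latch. On restart, control would
        # return to the trusted startup program in ROM.
        self._open = True

    def read(self) -> bytes:
        if not self._open:
            raise PermissionError("latch closed: region not readable")
        return bytes(self._data)

    def write(self, data: bytes) -> None:
        if not self._open:
            raise PermissionError("latch closed: region not writable")
        self._data = bytearray(data)


# At startup the region is accessible; after the latch closes, all
# subsequently run programs are denied access until a hardware restart.
region = ProtectableRegion(b"verification data")
assert region.read() == b"verification data"
region.close_latch()
try:
    region.write(b"tampered")
except PermissionError:
    pass  # software-only attacks on the stored data are blocked
region.reset()
assert region.read() == b"verification data"
```

The key design point this models is that the open-to-closed transition is one-way under software control: an attacker who fully understands the design still cannot reopen the latch without physical access to the machine.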

Embodiments of the invention store data in a protectable memory
region during a software configuration process, and use this
data to verify system initialization programs before they are
run. The computer starts up in a mode where the latch is open,
and the memory is readable and writable. The first program to
run resides in invulnerable read-only memory. It loads a second
stage program from vulnerable memory into main memory, and
before it transfers control to the second program, the first
program verifies the integrity of the second program using the
verification data stored in the protectable non-volatile
memory. If an integrity violation is detected, the second
program is not run, and the user is warned of the problem. If
integrity is verified, the second program is run.
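
The verification step described above can be sketched as follows. This is an illustrative simplification that assumes SHA-256 as the modification detection code; the patent's full scheme uses a hierarchy of modification detection codes and public-key digital signatures, which is omitted here, and the function name is hypothetical.

```python
import hashlib


def verify_second_stage(program_image: bytes, stored_digest: bytes) -> bool:
    """Compare the loaded second-stage program image against the digest
    that trusted software recorded in the protectable memory region at
    configuration time."""
    return hashlib.sha256(program_image).digest() == stored_digest


# At configuration time (latch open), trusted software records the digest
# of the second-stage program in the protectable non-volatile memory.
second_stage = b"...second stage boot program..."
protected_digest = hashlib.sha256(second_stage).digest()

# At startup, the ROM-resident first-stage program loads the image from
# vulnerable memory and checks it before transferring control.
assert verify_second_stage(second_stage, protected_digest)

# A corrupted image fails verification; the second program is not run,
# and the user is warned of the problem.
tampered = second_stage + b"\x90"
assert not verify_second_stage(tampered, protected_digest)
```

Because the reference digest lives behind the latch, programs run after initialization cannot substitute a digest matching a corrupted image, which is what forces an attacker back to direct hardware attacks.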
