Automated Assistance for Detecting Malicious Code *

R. Crawford, P. Kerchen, K. Levitt, R. Olsson, M. Archer, M. Casillas

Department of Computer Science
University of California, Davis
Davis, CA 95616

Email: virus@cs.ucdavis.edu
Abstract

This paper gives an update on our continuing work on the Malicious Code Testbed (MCT). The MCT is a semi-automated tool, operating in a simulated, cleanroom environment, that is capable of detecting many types of malicious code, such as viruses, Trojan horses, and time/logic bombs. The MCT allows security analysts to check a program before installation, thereby avoiding any damage a malicious program might inflict.

Keywords: Detection of Malicious Code, Static Analysis, Dynamic Analysis.
1 Introduction
The Malicious Code Testbed (MCT) was originally designed to use both static and dynamic analysis tools, developed at the University of California, Davis, that have been shown to be effective against certain forms of malicious code. One goal of the testbed is to enhance the power of these tools by using them in a complementary fashion to detect more general forms of malicious code.

In our report to this conference last year [1], we presented a design overview of the MCT. In the present paper, we report on our work on upgrading the MCT environment for dynamic analysis.

Although in principle the notion of a Malicious Code Testbed is independent of any particular operating system and architectural platform, our initial implementation efforts have focused on simulating a DOS operating system running on PC architectures. This design decision was made primarily because the PC/DOS environment is so widespread and susceptible to intrusions; thus this environment is the one that has engendered the most real-world malicious code, and it therefore poses a challenge to detection techniques.

Sections 2 and 3 provide background material on malicious code and current defense methods. Section 4 reviews the use of events in dynamic analysis, and Section 5 describes the architecture of the MCT. Section 6 summarizes results from our experience using the MCT on malicious code.
* SPONSORS: U.S. Department of Energy. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
2 Malicious Code: A Brief Overview

In recent years, various forms of malicious code have appeared on virtually all major families of computer platforms. The prevalence of malicious code (Trojan horses, time bombs, worms, and viruses) threatens the traditional "open systems" approach that has evolved in the academic realm, as well as in much of the commercial sector.

The current situation in the personal computer arena may be indicative of future trends in workstation and mainframe environments. On PC systems, where literally hundreds of computer viruses, time bombs, and Trojan horses have proliferated, the problem is caused by rogue programs that unwittingly are invited into the system. Thus malicious code may be inserted into almost any type of computer system via these same avenues: "shareware" may be installed, or malicious code might be produced in-house by a disgruntled employee, or a program containing malicious code might even be purchased from a legitimate vendor of commercial software.
Our definition of what constitutes "malicious" code shall address only the probable effects of executing such code; we shall not concern ourselves with the "original intent" of the (possibly unknown) writer. Although the intentions of the writer may be crucial in determining legal culpability (as to whether malice and forethought were present), to include such considerations within the scope of our working definition for malicious code would clearly render the problem incomputable.

Yet even using our restricted, operational definition of "malicious code", the problem of malicious code detection, in the most general case, is not decidable by purely formal methods. This follows not merely from the results of [4], [2], [3], but rather because the inherent semantics of the problem statement demand that a value judgement regarding the nature of the code's probable effects be rendered. And because doing so would require that the intent of the program's potential users be considered, no article of faith akin to Church's Thesis can serve to bridge the gap between our intuitive sense of "malicious effects" and algorithmic solutions. It would seem that, in all but the most severely restricted programming environments, the problem statement must remain a fuzzy one.
Thus, although no algorithm that identifies malicious code in all environments and in all guises can exist, a number of techniques already exist for coping with certain restricted forms of malicious code. Since the problem cannot with certainty be prevented in current programming environments, it must be managed instead.

This idea forms the basis of the Malicious Code Testbed: an automated assistant whose mission it is to perform the "grunt work" necessary to aid a human analyst in detecting not only currently known forms of malicious code, but also mutated or entirely novel forms. Given the absence of a decision procedure for malicious code, such a testbed would allow one to examine a program to ascertain whether or not it is suspicious.

We first discuss the most prevalent methods of coping with malicious code, and then describe some of our previous work aimed at providing defenses against malicious code. Then we explore in greater detail the Malicious Code Testbed.
3 A Sample of Current Methods for Coping with Malicious Code

Presently, the majority of malicious code defenses are concerned with computer viruses. However, some are more broadly applicable to malicious code in general. These methods may be divided into two distinct classes depending on when they are applied: as a pre-execution check or at run time. Pre-execution techniques are applied to a suspicious program before it can be executed by a user. In contrast, run time methods are actually applied to the program as it executes, in hopes of stopping the program before it can cause damage or allow a virus to propagate. Another taxonomy of malicious code defenses divides all methods into the categories of static or dynamic analysis. Although most static analysis techniques are applied as pre-execution checks, certain static analysis techniques can be applied at run time. Similarly, although most dynamic analysis techniques are applied as run time checks, certain dynamic analysis techniques (such as our own Malicious Code Testbed) can be applied as pre-execution checks.
Many of the more sophisticated pre-execution methods rely on the prior existence of a copy of the program that is assumed to be "clean", perhaps because it was originally written by a trusted programmer and then translated into an executable file by a trusted compiler on a secure system. One such method computes cryptographic checksums that are characteristic of that trusted executable file, and embeds them in that file [6]. The file is then copied to an insecure environment, whose operating system will not allow a user to execute any program until it has recomputed what those checksums should be and compared those values with the ones actually embedded in the program. In this way, most alterations made to a trusted executable file after it leaves the secure system can be detected before the program is executed in the insecure environment.
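To make the recompute-and-compare step concrete, the following Common Lisp sketch illustrates the flow; the function names and the simple rolling checksum are our own illustrative stand-ins, not the method of [6], which uses a genuinely cryptographic checksum.

;; Illustrative only: a stand-in checksum and the pre-execution check.
;; A real system would use a cryptographic checksum as in [6].
(defun compute-checksum (image)
  "IMAGE is a vector of bytes; returns an illustrative 32-bit checksum."
  (let ((sum 0))
    (loop for byte across image
          do (setf sum (mod (+ (* sum 257) byte) #xFFFFFFFB)))
    sum))

(defun ok-to-execute-p (image embedded-checksum)
  "Recompute the checksum and compare it with the value embedded in the
   file when it left the secure system; refuse execution on a mismatch."
  (= (compute-checksum image) embedded-checksum))

The operating system of the insecure environment would apply such a check immediately before loading the program.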
It is important to note that this technique shares one important characteristic in common with most other sophisticated pre-execution methods: ultimately, they depend on the prior application of detection (or formal verification) techniques in order to certify an executable file as "clean" in the first place.

Keeping Ken Thompson's admonition on trusting trust firmly in mind [5], how should a security administrator proceed when faced with programs so large or complex that "trust, but verify" is not a feasible option? We suggest that, in the middle ground between the two extremes of exhaustively provable correctness and trust based on nothing more substantial than personal familiarity with, or a background security check on, a program's writer, the MCT (acting to assist a human analyst) can provide a practical alternative basis for trust.
3.1 Simple Scanners and Monitors

Simple scanners such as McAfee's Scan or Norstad's Disinfectant are by and large the most common pre-execution method in use today. Typically, the user will invoke a scanner to search the static text of a binary program for fixed patterns (bit strings) that match those of known malicious programs. If none of those bit strings are found, the user then proceeds to execute the program. Thus these scanners have a very good record in defending against known malicious programs, such as polymorphic viruses that use a known "Mutation Engine", but they cannot be applied in general to finding new malicious code, or even to finding familiar malicious code protected by a "Mutation Engine" that is, itself, slightly mutated (a sketch of this kind of fixed-pattern scan appears at the end of this subsection). Another popular approach uses simple monitors to observe program execution and detect potentially malicious behavior at run time. Such monitors usually sit astride the system call interface, e.g., to watch all disk accesses and ensure that no unauthorized writes are performed. Unfortunately such techniques incur a substantial speed penalty during execution of normal programs, and typically become quite a nuisance to the user.

To be effective, these programs must also err on the conservative side, resulting in many false alarms which require user interaction. But in these interactions, current techniques require the user to make relatively immediate (and usually uninformed) decisions regarding whether the program should be allowed to proceed. Such decisions would benefit immensely from the opportunity to explore a trace of the program's history, as well as its then-current execution state.
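The fixed-pattern scan mentioned above can be pictured with the following Common Lisp sketch; the signature table and byte patterns are hypothetical, and real scanners use far larger and more elaborate signature databases.

;; Hypothetical signature table; the names and byte patterns are invented
;; for illustration and do not correspond to any real virus.
(defparameter *signatures*
  '(("EXAMPLE-VIRUS-A" . #(#xEB #x2C #x90 #xB8 #x01 #x4C))
    ("EXAMPLE-VIRUS-B" . #(#xE9 #x00 #x01 #xCD #x21))))

(defun scan-static-text (program-bytes)
  "Search the static text of a binary image for each known bit string;
   return the names of all signatures found."
  (loop for (name . pattern) in *signatures*
        when (search pattern program-bytes)
          collect name))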
3.2 Encryption & Watchdog Processors

Encryption is another method of coping with the threat of malicious code. Lapid, Ahituv, and Neumann [7] use encryption to defend against Trojan horses and trapdoors. When correctly implemented, encryption techniques are quite effective against many types of malicious code, but the cost of such a system is high due to the required hardware. Similarly, watchdog processors [8] also require additional hardware. Such processors are capable of detecting invalid reads/writes from/to memory, but they require additional support to effectively combat viruses. Also, both of these methods are dependent on the prior existence of a "clean" version of every program that is to be executed. As mentioned, to certify such copies as "clean" in the first place requires either formal verification or a malicious code detection capability, which is the subject of the present paper.
4 Review of Dynamic Analysis using Events

Over the last few years, we have developed a powerful, state-of-the-art debugger called Dalek [9]. Dalek incorporates two significant advances over traditional debuggers: it features a fully-programmable language for manipulating the debugging environment, and it provides extensive support for user-definable events.

The MCT user's environment was designed in accordance with the philosophy underlying the Dalek debugger, and features analogous to those in Dalek have been incorporated into the MCT. But we have also customized the MCT environment in light of its specific mission to help ferret out malicious code. We believe that "dynamic analysis" (and the development of appropriate methodologies for it) should be seen as representing an extremely promising avenue of inquiry rather than as being just a fancy word for the sum of things people have always done with traditional debuggers.

By fully programmable, we mean the MCT is an extendible environment in a similar sense that the Emacs text editor is extendible. But due to the nature of the MCT's mission, these general-purpose language constructs have been fully integrated with traditional application-specific debugging features such as breakpoints and single-stepping.
Like the Dalek debugger, the MCT also provides automated support for detecting hierarchical events: occurrences of interesting activities during the execution of the suspicious program. This capability allows the MCT to represent the suspicious program's behavior in terms of whatever higher-level abstractions have been defined by the security analyst.

In some ways, an event is conceptually similar to a table in a relational database: once the structure of a particular database table has been defined by the user, every occurrence of an event of that type that is detected by the MCT will have its attributes recorded permanently, as fields in a newly inserted tuple. That is, when the MCT detects an event occurrence, it causes a corresponding tuple (or record) to appear in the appropriate database table. The attributes associated with an event should contain information sufficient to characterize a particular occurrence of that event, allowing it to be distinguished from other instances of the same event. The code written by a security analyst for an event's definition can cause it, upon activation, to assign values to these attributes from variables in the suspicious program, from variables in the "outer" MCT environment, or from computation based on a combination of such variables.
In addition to defining an event as a template for passive data, the security analyst also needs to define an active, procedural aspect for that event. This is accomplished by writing a body of code in the MCT's language, and associating it with that event. The purpose of this code, when activated, is to recognize exactly those conditions in the suspicious program's execution state that the security analyst has specified as constituting a valid occurrence of this particular type of event.

This event-recognition code can be executed manually by the security analyst as s/he single-steps the suspicious program, or it can be executed automatically by the MCT, if the analyst has bound that event's code to a breakpoint, or to a range of breakpoint addresses. Events whose code is activated in this manner are called primitive events.
The MCT also supports high-level events. When defining a high-level event, one must specify the names of all lower-level events on which it depends. A high-level event is not explicitly raised; instead, the MCT can automatically trigger a high-level event's code into executing whenever an occurrence of a primitive event on which that high-level event depends is successfully recognized. The high-level event's code will have access to all the attributes of its lower-level, constituent events, as well as access to the "raw" state of the suspicious program and to variables defined in the "outer" MCT environment.

Note that the security analyst can define a high-level event whose recognition may depend on lower-level constituent events whose occurrences are widely separated in time. For a concrete example of a network of events used to detect self-propagating code, see [1].

Viewed from the perspective of a relational database, a high-level event is conceptually akin to an ongoing query: In defining a high-level event, the security analyst poses a query. The MCT then provides incremental answers to that activated query, as the behavior of the suspicious program causes new occurrences of primitive event/attribute tuples automatically to be inserted in the database.

The "execution history database" maintains a record of all recognized event occurrences and their attributes. It may be browsed selectively by the security analyst in interactive mode, or accessed programmatically via access functions written in the MCT's language.
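The MCT's actual event-definition language is not reproduced here; purely as an illustration of the machinery just described (occurrence tuples, primitive events, and high-level events triggered by their constituents), a minimal Common Lisp sketch might look as follows. All of the names are hypothetical.

;; Hypothetical sketch of the event machinery; the real MCT language differs.
(defvar *event-tables* (make-hash-table :test #'eq)
  "Maps an event name to the list of recorded occurrence tuples.")

(defvar *dependents* (make-hash-table :test #'eq)
  "Maps a lower-level event name to the high-level events that depend on it.")

(defvar *recognizers* (make-hash-table :test #'eq)
  "Maps an event name to the code that decides whether it occurred.")

(defun define-event (name recognizer &key depends-on)
  "Install RECOGNIZER for NAME and register NAME with its constituent events."
  (setf (gethash name *recognizers*) recognizer)
  (dolist (low depends-on)
    (push name (gethash low *dependents*))))

(defun signal-event (name attributes)
  "Record an occurrence tuple, then give dependent high-level events a
   chance to recognize an occurrence of their own."
  (push attributes (gethash name *event-tables*))
  (dolist (high (gethash name *dependents*))
    (let ((tuple (funcall (gethash high *recognizers*) name attributes)))
      (when tuple (signal-event high tuple)))))

In this sketch, a primitive event's recognition code, bound to a breakpoint, would call signal-event directly; a high-level event's recognizer receives each constituent occurrence and returns a tuple (or nil) to indicate whether the high-level event has itself occurred.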
5 Architecture of the MCT

One design goal for the MCT is that it be as universal as possible. That is, the testbed should in principle be capable of analyzing both source code and executable files from different processors and different operating systems. However, to achieve such broad applicability, we would have to develop various front-ends and back-ends for the MCT. In addition, because of the radically different "security architectures" (or lack thereof) on different platforms, that portion of the MCT between the front-ends and back-ends (that portion common to all platforms) could turn out to be the null intersection. Nevertheless, we feel that using a common machine-independent internal form language may illuminate aspects common to many security architectures.
5.1 Initial Program Loading

If started, for example, with an "executable" file, a front-end will need to understand any loading (and possibly some dynamic linking) conventions of the target operating system. The front-end will also need to know the processor type of the machine code in order to properly translate it into the Lisp-based internal form. A back-end will need to emulate any dynamic linking operations of the target operating system, as well as its system call interface. Thus, for example, if a program running under the MCT "writes" a file and then "reads" it back again, it should not be apparent to that program that it is not, in fact, running directly on the target CPU and operating system.

Either before or after translation of an executable program into an internal form, the MCT might also search it to identify any known standard system-library routines. Assuming those routines are "clean", this step could significantly reduce the size of the problem. In addition, since the type of every parameter required by a system-library routine is known, this information permits subsequent phases of the analysis to infer the types of any variables in the suspicious program that are passed as arguments to those system-library routines.
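As a sketch of this identification step (with invented structure and field names, since the MCT's own tables are not shown here), known routines could be matched by characteristic byte patterns, carrying their declared parameter types along for later type inference.

;; Hypothetical description of a trusted system-library routine.
(defstruct library-routine
  name             ; e.g. a printing or file-open routine
  signature        ; byte pattern characteristic of the trusted code
  parameter-types) ; declared types of its parameters

(defun find-known-routines (image routines)
  "Return (name . offset) pairs for each trusted routine found in IMAGE."
  (loop for r in routines
        for offset = (search (library-routine-signature r) image)
        when offset
          collect (cons (library-routine-name r) offset)))

Call sites that transfer control to one of the matched offsets can then label their arguments with the corresponding parameter types, seeding the type inference mentioned above.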
5.2 Program Representation: Internal Form Language

In order for the MCT user profitably to apply various analysis tools, those tools must share a common representation of the suspicious program that is the subject of their analysis.

To analyze the behavior of an executable machine code program, we must first translate its code into our internal form language. We have designed a set of procedures that, given a 2-tuple (MemoryAddress, MemoryContents), will translate its MemoryContents into our internal form language. Because not all assembler instructions have the same length, it behooves us to explicitly represent the original MemoryAddress as another field in the internal form representation. This will allow the translation tool easily to access other 2-tuples representing the next few adjacent MemoryAddresses, should the need arise, in order to complete the job of disassembling a single long instruction.

The internal form language was deliberately designed to include only a small number of basic operators, thus simplifying the analysis. These operators are closely related to the hardware operations on a microprocessor, allowing convenient translation from machine code into the internal form. Typical basic operations of the internal form language include READ or WRITE to a MemoryAddress or Register. As an example of the syntax of the internal form, an indirect write of 0 through register CX might look like:

( WRITE.BYTE 0 (ADD (READ.WORD CX) #x01AE) )

The internal form is a Lisp-like language, whose order of evaluation is the same as that of Lisp. By defining the internal form's operators (e.g., "READ.WORD") as functions in Lisp, a program written in the internal form language can be interpreted by any standard Lisp system. Thus, by using Lisp as the "native language" of the MCT, dynamic analysis can readily simulate the execution of a program that has been translated from machine code into the internal form.
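As a minimal sketch (under assumed representations, and with operator names taken from the displays in Section 6), the operators could be defined as ordinary Common Lisp functions so that translated code evaluates directly; the hash-table register/memory model and keyword register names below are our own simplification, not the MCT's actual implementation.

;; Assumed state representation: registers and memory as hash tables.
(defvar *registers* (make-hash-table :test #'eq))   ; register name -> value
(defvar *memory*    (make-hash-table :test #'eql))  ; address -> byte

;; Register names evaluate to keyword keys so forms like (READ.W CS) work.
(defvar AX :AX) (defvar CX :CX) (defvar CS :CS) (defvar AH :AH)

(defun CONST (n) n)                                 ; immediate operand

(defun READ.W (operand)
  "Read a 16-bit operand: an immediate produced by CONST, or a register."
  (if (integerp operand) operand (gethash operand *registers* 0)))

(defun READ.B (operand) (logand (READ.W operand) #xFF))

(defun WRITE.W (register value)
  (setf (gethash register *registers*) (logand value #xFFFF)))

(defun WRITE.BYTE (value address)
  "Indirect write: store the byte VALUE at the computed ADDRESS."
  (setf (gethash address *memory*) (logand value #xFF)))

(defun run-internal-form (instruction)
  "Evaluate each effect of one translated instruction, for example
   '((WRITE.W AX (+ (READ.W AX) (READ.W (CONST #x10)))))."
  (mapc #'eval instruction))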
5.3 Memory Model for the Code Segment

Although the translator will produce a string of code in our internal form language, the MCT must store much more than just that string of syntax. To adequately represent even a 1-byte instruction of the original machine code, the MCT uses an elaborate data structure that also stores the original (MemoryAddress, MemoryContents) 2-tuple, along with various auxiliary fields to record other information that may be computed by dynamic analysis techniques. Representations of every MemoryAddress in a suspicious program's code segment are stored in a table in the MCT.
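A sketch of such a per-address record, with field names that are ours rather than the MCT's, might be:

(defstruct code-cell
  address          ; the original MemoryAddress
  contents         ; the original MemoryContents (raw byte)
  internal-form    ; translation of any instruction beginning here
  access-modes     ; accumulated Read/Write/eXecute observations
  annotations)     ; other facts computed by dynamic analysis

(defvar *code-table* (make-hash-table :test #'eql)
  "Maps a MemoryAddress to its code-cell record.")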
5.4 Memory Model for the Data Segment

Cells in a suspicious program's data memory can be represented by the same structures as are used for its code, although at first it might appear that only the (MemoryAddress, MemoryContents) fields are needed. These representations of its data can be stored in the same table as the representations of its code. Named registers on the target CPU are treated as a special case of data memory. The MCT interpreter "allocates" data memory only as required by the dynamic behavior of the program (i.e., for the run time stack and local variables, and for memory that is explicitly allocated dynamically via calls to malloc). We must also load the MCT interpreter with any initialized data in the original executable machine code file, as well as any sections of DOS we think the program might attempt to access directly (e.g., the interrupt vector table).
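Building on the cell record sketched in the previous subsection, demand-driven allocation can be pictured as creating a cell only when an address (or a register, keyed by name) is first touched; again, this is an illustration rather than the MCT's own code.

(defun cell-at (key)
  "KEY is a numeric MemoryAddress or a register name; the cell is
   created only when the running program first touches it."
  (or (gethash key *code-table*)
      (setf (gethash key *code-table*)
            (make-code-cell :address key :contents 0))))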
6 Experience Using the Malicious Code Testbed

The MCT is written in Common Lisp, and its execution of internal form code in a simulated PC/DOS environment on a Unix workstation is several orders of magnitude slower than genuine execution of the original machine code on a PC platform. Nevertheless, because the security analyst can define events, and then leave the MCT to run unassisted for long periods to watch for occurrences of those events, this time penalty is acceptable.

In order to detect self-modifying code, we have included several predefined events in a standard library for the MCT. These events record every memory access: attributes such as the memory access mode (i.e., Read, Write, or eXecute), and the memory address and contents. Thus, we can perform a relational join within this table; e.g., if a particular location has been modified by some instruction, we can determine the address of the responsible instruction, and the contents of that instruction, even if they have been modified subsequently. The overhead incurred by this recordkeeping is one reason for the MCT's slow execution.
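Assuming each recorded access is a tuple carrying its mode, address, contents, and the instruction pointer at the time (a representation we adopt only for illustration), the join described above can be expressed as a simple pass over the access table:

;; Each access tuple is assumed to look like
;; (:mode :write :address #x3FD :contents #x90 :ip #x1213).
(defun modifiers-of (location accesses)
  "Return the instruction addresses that wrote LOCATION."
  (loop for a in accesses
        when (and (eq (getf a :mode) :write)
                  (eql (getf a :address) location))
          collect (getf a :ip)))

(defun executed-after-write-p (location accesses)
  "True when LOCATION is eXecuted at some point after being written."
  (let ((written nil))
    (dolist (a accesses nil)
      (when (eql (getf a :address) location)
        (case (getf a :mode)
          (:write   (setf written t))
          (:execute (when written (return t))))))))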
In implementing the MCT, we are extending the boundaries of our simulated DOS environment incrementally, as necessitated by the demands of our test programs for DOS/BIOS system services. Currently, the DOS system call interface is still somewhat skeletal. Our simulation of the PC hardware is also fairly rudimentary; e.g., we do not currently simulate the periodic clock tick interrupts, and thus we avoid the processing they would initiate.

As mentioned, the MCT is an extendible, customizable environment. Thus, the exact nature of the "display" it presents to the user is a matter of personal choice. In the sample displays that follow, we utilize a highly verbose mode that, in most situations, would present the security analyst with far more undigested, low-level information than desired. Nevertheless, on occasion this level of detail is desirable, and is certainly justified in this case on expository grounds.
The Malicious Code Testbed Displays a "Trace" of Program Execution

*** Initializing Low-DOS *** for Compaq 386, DOS 3.3 : 0x0000 - 0x1000
MCT will simulate -- JMP 0x1195    at IP 0x1100 ::
-- ((JUMP (T #x1195)))
MCT will simulate -- CLD    at IP 0x1195 ::
-- ((WRITE.F DF #x0))
MCT will simulate -- MOV AH, 0xE0    at IP 0x1196 ::
-- ((WRITE.B AH (READ.B (CONST #xE0))))
MCT will simulate -- INT 0x21    at IP 0x1198 ::
-- ((LIB INT #x21))
MCT will simulate -- CMP AH, 0xE0    at IP 0x119A ::
-- ((WRITE.F AF (AUX (- (READ.B AH) (READ.B (CONST #xE0)))))
   (WRITE.F OF (OVERFLOW (- (READ.B AH) (READ.B (CONST #xE0)))))
   (WRITE.F PF (PARITY (- (READ.B AH) (READ.B (CONST #xE0)))))
   (WRITE.F SF (SIGN (- (READ.B AH) (READ.B (CONST #xE0)))))
   (WRITE.F ZF (ZERO (- (READ.B AH) (READ.B (CONST #xE0)))))
   (WRITE.F CF (CARRY (- (READ.B AH) (READ.B (CONST #xE0))))))
MCT will simulate -- JNC 0x11B5    at IP 0x119D ::
-- ((JUMP ((=0 (READ.F CF)) #x11B5) (T #x119F)))
MCT will simulate -- MOV AX, CS    at IP 0x11B5 ::
-- ((WRITE.W AX (READ.W CS)))
MCT will simulate -- ADD AX, 0x10    at IP 0x11B7 ::
-- ((WRITE.W AX (+ (READ.W AX) (READ.W (CONST #x10)))))
The first example MCT display above provides the conceptual equivalent of a program "trace", such as might be provided by a debugger. For each newly executing instruction, the MCT displays the 8086 assembler mnemonic, the address of that instruction, and its translation into our internal form language (which is then executed by the Lisp interpreter).

In the next example, the security analyst has programmed the MCT to display a more high-level message immediately upon detecting an occurrence of the event named SELF-MOD-CODE. In this particular case, a memory location that was initially accessed in modes Read, then Write, is subsequently accessed in eXecute mode. The event code, written by the security analyst, notifies him after each subsequent eXecute access, and also provides some higher-level information it has computed, namely, which instruction was responsible for modifying the instruction the CPU just executed.
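Before turning to that display, the following hypothetical recognizer (in the spirit of the sketch in Section 4, and simplified to track only the most recent writer of each location) suggests how such event code might be organized; the verbose flag stands in for the "outer environment" variable discussed after the display.

(defvar *last-writer* (make-hash-table :test #'eql)
  "Maps a memory location to the address of the instruction that last wrote it.")
(defvar *verbose* t)

(defun note-access (location mode instruction-address)
  "Call on every memory access.  Returns a SELF-MOD-CODE occurrence tuple
   when a previously modified location is subsequently eXecuted."
  (case mode
    (:write
     (setf (gethash location *last-writer*) instruction-address)
     nil)
    (:execute
     (let ((modifier (gethash location *last-writer*)))
       (when modifier
         (when *verbose*
           (format t "~&SELF-MOD-CODE EVENT -- Location 0x~X modified by address 0x~X~%"
                   location modifier))
         (list :event 'self-mod-code
               :location location :modified-by modifier))))
    (t nil)))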
The Malicious Code Testbed Detects Self-Modifying Code

MCT will simulate -- JMP-INTER-SEG 0x3FC 0x010F    at IP 0x123D ::
-- ((JUMP-ABS #x3FC #x010F))
MCT will simulate -- REP    at IP 0x3FC ::
-- ((PREFIX REP #x3FD))
SELF-MOD-CODE EVENT -- Location 0x3FC modified by address 0x1213
MCT will simulate -- REP MOVS    at IP 0x3FD ::
-- ((WRITE.W (ES-SHIFT (READ.W DI)) (READ.W (DS-SHIFT (READ.W SI))))
   (WRITE.W SI (+ (READ.W SI) #x2)) (WRITE.W DI (+ (READ.W DI) #x2)))
SELF-MOD-CODE EVENT -- Location 0x3FD modified by address 0x1213
MCT will simulate -- REP MOVS    at IP 0x3FD ::
-- ((WRITE.W (ES-SHIFT (READ.W DI)) (READ.W (DS-SHIFT (READ.W SI))))
   (WRITE.W SI (+ (READ.W SI) #x2)) (WRITE.W DI (+ (READ.W DI) #x2)))
SELF-MOD-CODE EVENT -- Location 0x3FD modified by address 0x1213
In the trace above, the MOVS instruction that was modified is being repeated because of its prefix, REP. Thus, every time it repeats, the MCT displays the fact that it detected another occurrence of the event, SELF-MOD-CODE. The security analyst might decide to redefine this event so that when not in "verbose" mode (as determined by examining a variable in the "outer" MCT environment), it will quietly record all eXecute accesses to this location after the first Read, Write, eXecute sequence, but will not announce those subsequent occurrences of the event immediately.
In the next example, after a section of malicious code has decrypted itself (not shown), the decrypted code proceeds to read the realtime clock twice in rapid succession to check whether it is being single-stepped under a debugger. If less than 1 second has elapsed, it assumes it is not being watched, and then attempts a reboot.
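A behavior like this is a natural candidate for a high-level event. As a hypothetical sketch (the names and the one-second threshold are ours, mirroring the behavior just described), two occurrences of a primitive clock-read event falling within one second of simulated time could be folded into a single "timing check" occurrence:

(defvar *last-clock-read* nil
  "Simulated time, in seconds, of the previous clock-read occurrence.")

(defun clock-read-event (simulated-time)
  "Call on each occurrence of the primitive clock-read event; returns a
   TIMING-CHECK tuple when two reads fall within one second."
  (prog1
      (when (and *last-clock-read*
                 (< (- simulated-time *last-clock-read*) 1))
        (list :event 'timing-check :at simulated-time))
    (setf *last-clock-read* simulated-time)))

A further high-level event could combine a TIMING-CHECK occurrence with a subsequent INT 0x19 reboot attempt, which is exactly the pattern visible in the display that follows.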
Decrypted Code Reads Clock to Check if being Single-Stepped under Debugger;
If Not Being Watched, It then Attempts a Reboot
MCT will simulate -- MOV AH, 0x2    at IP 0x1035 ::
-- ((WRITE.B AH (READ.B (CONST 0x2))))
SELF-MOD-CODE EVENT -- Location 0x1035 modified by address 0x105C

MCT will simulate -- INT 0x1A    at IP 0x1037 ::
-- ((LIB INT 0x1A))
Program desires to read 24-hr realtime clock (base 10):
Enter hours: 11
Enter minutes: 15
Enter seconds: 03
Enter hundredths: 00
SELF-MOD-CODE EVENT -- Location 0x1037 modified by address 0x105C

MCT will simulate -- PUSH DX    at IP 0x1039 ::
-- ((WRITE.W SP (- (READ.W SP) 0x2))
   (WRITE.W (SS-SHIFT (READ.W SP)) (READ.W DX)))
SELF-MOD-CODE EVENT -- Location 0x1039 modified by address 0x105C

MCT will simulate -- MOV AH, 0x2    at IP 0x103A ::
-- ((WRITE.B AH (READ.B (CONST 0x2))))
SELF-MOD-CODE EVENT -- Location 0x103A modified by address 0x105C

MCT will simulate -- INT 0x1A    at IP 0x103C ::
-- ((LIB INT 0x1A))
Program desires to read 24-hr realtime clock (base 10):
Enter hours: 11
Enter minutes: 15
Enter seconds: 03
Enter hundredths: 00
SELF-MOD-CODE EVENT -- Location 0x103C modified by address 0x105C

MCT will simulate -- POP AX    at IP 0x103E ::
-- ((WRITE.W AX (READ.W (SS-SHIFT (READ.W SP))))
   (WRITE.W SP (+ (READ.W SP) 0x2)))
SELF-MOD-CODE EVENT -- Location 0x103E modified by address 0x105C

MCT will simulate -- CMP DH, AH    at IP 0x103F ::
-- ((WRITE.F AF (AUX (- (READ.B DH) (READ.B AH))))
   (WRITE.F CF (CARRY (- (READ.B DH) (READ.B AH))))
   (WRITE.F OF (OVERFLOW (- (READ.B DH) (READ.B AH))))
   (WRITE.F PF (PARITY (- (READ.B DH) (READ.B AH))))
   (WRITE.F SF (SIGN (- (READ.B DH) (READ.B AH))))
   (WRITE.F ZF (ZERO (- (READ.B DH) (READ.B AH)))))
SELF-MOD-CODE EVENT -- Location 0x103F modified by address 0x105C

MCT will simulate -- JNZ 0x1053    at IP 0x1041 ::
-- ((JUMP ((=0 (READ.F ZF)) 0x1053) (T 0x1043)))
SELF-MOD-CODE EVENT -- Location 0x1041 modified by address 0x105C

MCT will simulate -- AND CL, 0x1    at IP 0x1043 ::
-- ((WRITE.F AF UNDEF) (WRITE.F CF 0x0) (WRITE.F OF 0x0)
   (WRITE.B CL (LOG.AND (READ.B CL) (READ.B (CONST 0x1))))
   (WRITE.F PF (PARITY (READ.B CL))) (WRITE.F SF (SIGN (READ.B CL)))
   (WRITE.F ZF (ZERO (READ.B CL))))
SELF-MOD-CODE EVENT -- Location 0x1043 modified by address 0x105C

MCT will simulate -- CMP CL, 0x1    at IP 0x1046 ::
-- ((WRITE.F AF (AUX (- (READ.B CL) (READ.B (CONST 0x1)))))
   (WRITE.F CF (CARRY (- (READ.B CL) (READ.B (CONST 0x1)))))
   (WRITE.F OF (OVERFLOW (- (READ.B CL) (READ.B (CONST 0x1)))))
   (WRITE.F PF (PARITY (- (READ.B CL) (READ.B (CONST 0x1)))))
   (WRITE.F SF (SIGN (- (READ.B CL) (READ.B (CONST 0x1)))))
   (WRITE.F ZF (ZERO (- (READ.B CL) (READ.B (CONST 0x1))))))
SELF-MOD-CODE EVENT -- Location 0x1046 modified by address 0x105C

MCT will simulate -- JNZ 0x1060    at IP 0x1049 ::
-- ((JUMP ((=0 (READ.F ZF)) 0x1060) (T 0x104B)))
SELF-MOD-CODE EVENT -- Location 0x1049 modified by address 0x105C

MCT will simulate -- INT 0x19    at IP 0x104B ::
-- ((LIB INT 0x19))
Program is attempting a Reboot -- OK to proceed?
N
SELF-MOD-CODE EVENT -- Location 0x104B modified by address 0x105C

Decrypted Code Reads Clock to Check if being Single-Stepped under Debugger;
If Not Being Watched, It then Attempts a Reboot
7 Future Research

One direction to pursue is to focus on the so-called "polymorphic" viruses that, by using unknown "Mutation Engines", can easily evade static scanners. Our event-based dynamic analysis techniques should be able to handle all polymorphic "mutants", since all polymorphic variants of a given virus should share a common behavioral profile: a common dynamic canonical form.

We also plan to return to implementing one of our original design goals for the MCT, namely integrating static analysis tools [1] into the common MCT environment. We feel there is great potential for the complementary use of these two families of analysis techniques, leading ultimately to the development of a more rigorous detection methodology.

It may be that a two-tier detection scheme will be warranted: an efficient-running, coarse-grain event filter that could quickly screen large sections of code, and a slower, more thorough, fine-grain event mesh for especially critical systems or sites.
Acknowledgements

We thank Doug Mansur for his valuable insights.
References

[1] R. Crawford, T. Lo, J. Crowley, G. Fink, P. Kerchen, W. Ho, K. Levitt, R. Olsson, M. Archer, "A Testbed for Malicious Code Detection: A Synthesis of Static and Dynamic Analysis Techniques", Secure Networks: Proceedings of the Fifth International Computer Virus & Security Conference, March 1992, pp. 225-236.

[2] F. Cohen, "Computer Viruses: Theory and Experiments", Computers & Security, Vol. 6, 1987, pp. 22-35.

[3] L. Adleman, "An Abstract Theory of Computer Viruses" (abstract), CRYPTO '88.

[4] M. Harrison, W. Ruzzo, and J. Ullman, "Protection in Operating Systems", CACM, Vol. 19, No. 8, Aug. 1976, pp. 461-471.

[5] K. Thompson, "Reflections on Trusting Trust", Comm. ACM, Vol. 27, No. 8, 1984, pp. 761-763.

[6] F. Cohen, "A Cryptographic Checksum for Integrity Protection", Computers & Security, Vol. 6, 1987, pp. 505-510.

[7] Y. Lapid, N. Ahituv, and S. Neumann, "Approaches to Handling 'Trojan Horse' Threats", Computers & Security, Vol. 5, 1986, pp. 251-256.

[8] A. Mahmood and E. J. McCluskey, "Concurrent Error Detection Using Watchdog Processors: A Survey", IEEE Transactions on Computers, Vol. 37, No. 2, 1988, pp. 160-174.

[9] R. Olsson, R. Crawford, and W. Ho, "A Dataflow Approach to Event-based Debugging", Software: Practice and Experience, Vol. 21, No. 2, Feb. 1991, pp. 209-230.
UCRL-JC-113374
PREPRINT

Automated Assistance for Detecting Malicious Code

R. Crawford, P. Kerchen, K. Levitt,
R. Olsson, M. Archer, M. Casillas

This paper was prepared for submittal to the
Sixth International Computer Security & Virus Conference & Expo
March 10-12, 1993
New York, New York

June 18, 1993

This is a preprint of a paper intended for publication in a journal or proceedings. Since changes may be made before publication, this preprint is made available with the understanding that it will not be cited or reproduced without the permission of the author.