Computer Viruses
Theory and Experiments

Fred Cohen
Dept. of Computer Science and Electrical Engineering, Lehigh University, Bethlehem, PA 18015, USA, and The Foundation for Computer Integrity Research, Pittsburgh, PA 15217, USA
This paper introduces "computer viruses" and examines their potential for causing widespread damage to computer systems. Basic theoretical results are presented, and the infeasibility of viral defense in large classes of systems is shown. Defensive schemes are presented and several experiments are described.

Keywords: Computer Viruses, System Integrity, Data Integrity.
Fred Cohen received a B.S. in Electrical Engineering from Carnegie-Mellon University in 1977, an M.S. in Information Science from the University of Pittsburgh in 1981, and a Ph.D. in Electrical Engineering from the University of Southern California in 1986. He has worked as a freelance consultant since 1977, and has designed and implemented numerous devices and systems. He is currently a professor of Computer Science and Electrical Engineering at Lehigh University, Chairman and Director of Engineering at the Foundation for Computer Integrity Research, and President of Legal Software Incorporated. He is a member of the ACM, IEEE, and IACR. His current research interests include computer viruses, information flow models, adaptive systems theory, genetic models of computing, and evolutionary systems.

Computers & Security 6 (1987) 22-35, North-Holland
1. Introduction

This paper defines a major computer security problem called a virus. The virus is interesting because of its ability to attach itself to other programs and cause them to become viruses as well. Given the widespread use of sharing in current computer systems, the threat of a virus carrying a Trojan horse [1,20] is significant. Although a considerable amount of work has been done in implementing policies to protect against the illicit dissemination of information [4,7], and many systems have been implemented to provide protection from this sort of attack [12,19,21,22], little work has been done in the area of keeping information entering an area from causing damage [5,18]. There are many types of information paths possible in systems, some legitimate and authorized, and others that may be covert [18], the most commonly ignored one being through the user. We will ignore covert information paths throughout this paper.
The general facilities exist for providing provably correct protection schemes [9], but they depend on a security policy that is effective against the types of attacks being carried out. Even some quite simple protection systems cannot be proven 'safe' [14]. Protection from denial of services requires the detection of halting programs, which is well known to be undecidable [11]. The problem of precisely marking information flow within a system [10] has been shown to be NP-complete. The use of guards for the passing of untrustworthy information [25] between users has been examined, but in general depends on the ability to prove program correctness, which is well known to be NP-complete.
The Xerox worm program [23] has demonstrated the ability to propagate through a network, and has even accidentally caused denial of services. In a later variation, the game of 'core wars' [8] was invented to allow two programs to do battle with one another. Other variations on this theme have been reported by many unpublished authors, mostly in the context of nighttime games played between programmers. The term virus has also been used in conjunction with an augmentation to APL in which the author places a generic call at the beginning of each function which in turn invokes a preprocessor to augment the default APL interpreter [13].
The potential threat of a widespread security problem has been examined [15], and the potential damage to government, financial, business, and academic institutions is extreme. In addition, these institutions tend to use ad hoc protection mechanisms in response to specific threats rather than sound theoretical techniques [16]. Current military protection systems depend to a large degree on isolationism [3]; however, new systems are being developed to allow 'multilevel' usage [17]. None of the published proposed systems defines or implements a policy which could stop a virus.
In this paper, we open the new problem of protection from computer viruses. First we examine the infection property of a virus and show that the transitive closure of shared information could potentially become infected. When used in conjunction with a Trojan horse, it is clear that this could cause widespread denial of services and/or unauthorized manipulation of data. The results of several experiments with computer viruses are used to demonstrate that viruses are a formidable threat in both normal and high security operating systems. The paths of sharing, transitivity of information flow, and generality of information interpretation are identified as the key properties in the protection from computer viruses, and a case by case analysis of these properties is shown. Analysis shows that the only systems with potential for protection from a viral attack are systems with limited transitivity and limited sharing, systems with no sharing, and systems without general interpretation of information (Turing capability). Only the first case appears to be of practical interest to current society. In general, detection of a virus is shown to be undecidable both by a priori and runtime analysis, and without detection, cure is likely to be difficult or impossible.
Several proposed countermeasures are examined and shown to correspond to special cases of the case by case analysis of viral properties. Limited transitivity systems are considered hopeful, but it is shown that precise implementation is intractable, and imprecise policies are shown in general to lead to less and less usable systems with time. The use of system-wide viral antibodies is also examined.
2. A Computer Virus

We define a computer 'virus' as a program that can 'infect' other programs by modifying them to include a possibly evolved copy of itself. With the infection property, a virus can spread throughout a computer system or network, using the authorizations of every user using it to infect their programs. Every program that gets infected may also act as a virus, and thus the infection grows.
The following pseudo-program shows how a virus might be written in a pseudo-computer language. The ':=' symbol is used for definition, the ':' symbol labels a statement, the ';' separates statements, the '=' symbol is used for assignment or comparison, the '~' symbol stands for not, the '{' and '}' symbols group sequences of statements together, and the '...' symbol is used to indicate that an irrelevant portion of code has been left implicit.

This example virus (V) (Fig. 1) searches for an uninfected executable file (E) by looking for executable files without the '1234567' in the beginning, and prepends V to E, turning it into an infected file (I). V then checks to see if some triggering condition is true, and does damage.
program virus :=
{1234567;

subroutine infect-executable :=
    {loop: file = random-executable;
    if first-line-of-file = 1234567
        then goto loop;
    prepend virus to file;
    }

subroutine do-damage :=
    {whatever damage is desired}

subroutine trigger-pulled :=
    {return true on desired conditions}

main-program :=
    {infect-executable;
    if trigger-pulled then do-damage;
    goto next;}

next:}

Fig. 1. Simple virus 'V'.
Finally, V executes the rest of the program it was prepended¹ to. When the user attempts to execute E, I is executed in its place; it infects another file and then executes as if it were E. With the exception of a slight delay for infection, I appears to be E until the triggering condition causes damage. We note that viruses need not prepend themselves, nor must they be restricted to a single infection per use.

¹ The term 'prepend' is used in a technical sense in this paper to mean 'attach at the beginning'.
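To make the mechanics above concrete, the following sketch (an editorial illustration, not part of the original paper) restates the logic of Fig. 1 in Python. It operates only on plain-text files in a scratch directory; the sandbox layout, the bounded search loop, and the disabled damage routine are all assumptions chosen so the example is safe to run.

import os
import random

SANDBOX = "sandbox"        # scratch directory standing in for the file system
SIGNATURE = "1234567"      # first line of every infected file, as in Fig. 1

def random_executable():
    # Pick a random candidate 'executable' (here: any .txt file in the sandbox).
    files = [f for f in os.listdir(SANDBOX) if f.endswith(".txt")]
    return os.path.join(SANDBOX, random.choice(files)) if files else None

def infect_executable(virus_body):
    # Find an uninfected file and prepend the virus body to it.
    for _ in range(100):                 # bounded search instead of 'loop: goto loop'
        path = random_executable()
        if path is None:
            return
        with open(path) as f:
            first = f.readline().rstrip("\n")
        if first != SIGNATURE:           # uninfected: prepend and stop
            with open(path) as f:
                original = f.read()
            with open(path, "w") as f:
                f.write(SIGNATURE + "\n" + virus_body + "\n" + original)
            return

def trigger_pulled():
    return False                         # damage disabled in this simulation

def do_damage():
    print("damage would occur here")     # stands in for 'whatever damage is desired'

def main():
    infect_executable("<virus body>")
    if trigger_pulled():
        do_damage()
    # falling through here corresponds to 'goto next' in Fig. 1:
    # the host program's own code would now run

if __name__ == "__main__":
    os.makedirs(SANDBOX, exist_ok=True)
    main()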
A common misconception of a virus relates it to programs that simply propagate through networks. The worm program, 'core wars,' and other similar programs have done this, but none of them actually involve infection. The key property of a virus is its ability to infect other programs, thus reaching the transitive closure of sharing between users. As an example, if V infected one of user A's executables (E), and user B then ran E, V could spread to user B's files as well.
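The reach of an infection is exactly such a transitive closure. A minimal sketch (an editorial illustration, with an assumed sharing relation) computes which users a virus starting at user A can eventually reach, given a record of who runs whose programs:

def infected_set(start, runs):
    # runs[u] is the set of users who execute programs owned by user u,
    # so infection flows from u to each member of runs[u].
    infected = {start}
    frontier = [start]
    while frontier:
        user = frontier.pop()
        for victim in runs.get(user, set()):
            if victim not in infected:
                infected.add(victim)
                frontier.append(victim)
    return infected

# The example from the text: B runs A's executable E, so infection spreads
# from A to B, and anyone who later runs B's programs is reached in turn.
runs = {"A": {"B"}, "B": {"C", "D"}}
print(infected_set("A", runs))    # {'A', 'B', 'C', 'D'}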
It should be pointed out that a virus need not be used for evil purposes or be a Trojan horse. As an example, a compression virus could be written to find uninfected executables, compress them upon the user's permission, and prepend itself to them. Upon execution, the infected program decompresses itself and executes normally. Since it always asks permission before performing services, it is not a Trojan horse, but since it has the infection property, it is still a virus. Studies indicate that such a virus could save over 50% of the space taken up by executable files in an average system. The performance of infected programs would decrease slightly as they are decompressed, and thus the compression virus implements a particular time-space tradeoff. A sample compression virus could be written as in Fig. 2.
This program (C) finds an uninfected executable (E), compresses it, and prepends C to form an infected executable (I). It then uncompresses the rest of itself into a temporary file and executes normally. When I is run, it will seek out and compress another executable before decompressing E into a temporary file and executing it. The effect is to spread through the system compressing executable files, decompressing them as they are to be executed. Users will experience significant delays as their executables are decompressed before being run.
program compression-virus :=
{01234567;

subroutine infect-executable :=
    {loop: file = random-executable;
    if first-line-of-file = 01234567
        then goto loop;
    compress file;
    prepend compression-virus to file;
    }

main-program :=
    {if ask-permission
        then infect-executable;
    uncompress the-rest-of-this-file
        into tmpfile;
    run tmpfile;
    }
}

Fig. 2. Compression virus 'C'.
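The compress-on-infect, decompress-on-run round trip of Fig. 2 can be demonstrated with a few lines of Python (an editorial sketch; the marker and the toy 'executable' are assumptions, and zlib stands in for whatever compressor C would use):

import zlib

SIGNATURE = b"01234567\n"    # marker from Fig. 2

def infect(executable_bytes):
    # Compress a program body and prepend the marker, as C does to E.
    return SIGNATURE + zlib.compress(executable_bytes)

def run_infected(infected_bytes):
    # Recover the original body, as I does before executing E.
    assert infected_bytes.startswith(SIGNATURE)
    return zlib.decompress(infected_bytes[len(SIGNATURE):])

program = b"print('hello')\n" * 200        # toy 'executable' with much redundancy
infected = infect(program)
assert run_infected(infected) == program   # the round trip is lossless
print(len(program), "bytes before,", len(infected), "bytes after infection")

The space saving is the point of the tradeoff: the decompression cost is paid at every execution, while the storage saving persists.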
As a more threatening example, let us suppose that we modify the program V by specifying trigger-pulled as true after a given date and time, and specifying do-damage as an infinite loop. With the level of sharing in most modern systems, the entire system would likely become unusable as of the specified date and time. A great deal of work might be required to undo the damage of such a virus. This modification is shown in Fig. 3.

As an analogy to a computer virus, consider a biological disease that is 100% infectious, spreads whenever animals communicate, kills all infected animals instantly at a given moment, and has no detectable side effects until that moment. If a delay of even one week were used between the introduction of the disease and its effect, it would be very likely to leave only a few remote villages alive, and would certainly wipe out the vast majority of modern society. If a computer virus of this type could spread through the computers of the world, it would likely stop most computer use for a significant period of time, and wreak havoc on modern government, financial, business, and academic institutions.
subroutine do-damage :=
    {loop: goto loop;}

subroutine trigger-pulled :=
    {if year > 1984 then return(true)
    otherwise return(false);}

Fig. 3. A denial of services virus.
3. Prevention of Computer Viruses

We have introduced the concept of viruses to the reader, and actual viruses to systems. Having planted the seeds of a potentially devastating attack, it is appropriate to examine protection mechanisms which might help defend against it. We examine here the prevention of computer viruses.
3.1 Basic Limitations

In order for users of a system to be able to share information, there must be a path through which information can flow from one user to another. We make no differentiation between a user and a program acting as a surrogate for that user, since a program always acts as a surrogate for a user in any computer use, and we are ignoring the covert channel through the user. Assuming a Turing machine model for computation, we can prove that if information can be read by a user with Turing capability, then it can be copied, and the copy can then be treated as data on a Turing machine tape.
Given a general purpose system in which users are capable of using information in their possession as they wish, and passing such information as they see fit to others, it should be clear that the ability to share information is transitive. That is, if there is a path from user A to user B, and there is a path from user B to user C, then there is a path from user A to user C with the witting or unwitting cooperation of user B.
Finally, there is no fundamental distinction between information that can be used as data and information that can be used as program. This can be clearly seen in the case of an interpreter that takes information edited as data and interprets it as a program. In effect, information only has meaning in its interpretation.

In a system where information can be interpreted as a program by its recipient, that interpretation can result in infection as shown above. If there is sharing, infection can spread through the interpretation of shared information. If there is no restriction on the transitivity of information flow, then the information can reach the transitive closure of information flow starting at any source. Sharing, transitivity of information flow, and generality of interpretation thus allow a virus to spread to the transitive closure of information flow starting at any given source.
Clearly, if there is no sharing, there can be no dissemination of information across information boundaries, and thus no external information can be interpreted, and a virus cannot spread outside a single partition. This is called 'isolationism.' Just as clearly, a system in which no program can be altered and information cannot be used to make decisions cannot be infected, since infection requires the modification of interpreted information. We call this a 'fixed first order functionality' system. We should note that virtually any system with real usefulness in a scientific or development environment will require generality of interpretation, and that isolationism is unacceptable if we wish to benefit from the work of others. Nevertheless, these are solutions to the problem of viruses which may be applicable in limited situations.
3.2 Partition Models

Two limits on the paths of information flow can be distinguished: those that partition users into closed proper subsets under transitivity, and those that do not. Flow restrictions that result in closed subsets can be viewed as partitions of a system into isolated subsystems. These limit each infection to one partition. This is a viable means of preventing complete viral takeover at the expense of limited isolationism, and is equivalent to giving each partition its own computer.

The integrity model [5] is an example of a policy that can be used to partition systems into closed subsets under transitivity. In the Biba model, an integrity level is associated with all information. The strict integrity properties are the dual of the Bell-LaPadula properties: no user at a given integrity level can read an object of lower integrity or write an object of higher integrity. In Biba's original model, a distinction was made between read and execute access, but this cannot be enforced without restricting the generality of information interpretation, since a high integrity program can write a low integrity object, make low integrity copies of itself, and then read low integrity input and produce low integrity output.

If the integrity model and the Bell-LaPadula model coexist, a form of limited isolationism results which divides the space into closed subsets under transitivity. If the same divisions are used for both mechanisms (higher integrity corresponds to higher security), isolationism results, since information moving up security levels also moves up integrity levels, and this is not permitted. When the Biba model has boundaries within the Bell-LaPadula boundaries, infection can only spread from the higher integrity levels to lower ones within a given security level. Finally, when the Bell-LaPadula boundaries are within the Biba boundaries, infection can only spread from lower security levels to higher security levels within a given integrity level. There are actually nine cases corresponding to all pairings of lower boundaries with upper boundaries, but the three shown graphically in Fig. 4 are sufficient for understanding.
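The interaction of the two policies can be made concrete with a small sketch (an editorial illustration; the numeric level encoding is an assumption). Strict integrity forbids reading down and writing up in integrity, Bell-LaPadula forbids reading up and writing down in security, and information (hence infection) flows from a writer through an object to a reader only if both checks pass at both ends:

def biba_ok(op, subj_integrity, obj_integrity):
    # Strict integrity: no read down, no write up (in integrity levels).
    if op == "read":
        return obj_integrity >= subj_integrity
    return obj_integrity <= subj_integrity       # write

def blp_ok(op, subj_security, obj_security):
    # Bell-LaPadula: no read up, no write down (in security levels).
    if op == "read":
        return obj_security <= subj_security
    return obj_security >= subj_security         # write

def may_flow(writer, obj, reader):
    # Can information flow writer -> object -> reader under both models?
    return (biba_ok("write", writer["i"], obj["i"]) and
            blp_ok("write", writer["s"], obj["s"]) and
            biba_ok("read", reader["i"], obj["i"]) and
            blp_ok("read", reader["s"], obj["s"]))

# With the same divisions for both (higher integrity == higher security),
# no flow between distinct levels survives both checks: isolationism.
low, high = {"i": 0, "s": 0}, {"i": 1, "s": 1}
obj = {"i": 0, "s": 0}
print(may_flow(low, obj, high))   # False: reading up in security is reading down in integrity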
Biba's work also included two other integrity policies: the 'low water mark' policy, which makes output the lowest integrity of any input, and the 'ring' policy, in which users cannot invoke everything they can read. The former policy tends to move all information towards lower integrity levels, while the latter attempts to make a distinction that cannot be made with generalized information interpretation.

[Figure 4 depicts the three pairings graphically.]
Fig. 4. Pairings of lower boundaries with upper boundaries. Top: Biba within B-L; middle: B-L within Biba; bottom: same divisions. \\ cannot write; // cannot read; XX no access.
Just as systems based on the Bell-LaPadula model tend to cause all information to move towards higher levels of security by always increasing the level to meet the highest level user, the Biba model tends to move all information towards lower integrity levels by always reducing the integrity of results to that of the lowest incoming integrity. We also know that a precise system for integrity is NP-complete (just as its dual is NP-complete).

The most trusted programmer is (by definition) the programmer that can write programs executable by the most users. In order to maintain the Bell-LaPadula policy, high level users cannot write programs used by lower level users. This means that the most trusted programmers must be those at the lowest security level. This seems contradictory. When we mix the Biba and Bell-LaPadula models, we find that the resulting isolationism secures us from viruses, but does not permit any user to write programs that can be used throughout the system. Somehow, just as we allow encryption or declassification of data to move it from higher security levels to lower ones, we should be able to use program testing and verification to move information from lower integrity levels to higher ones.
Another commonly used policy that partitions systems into closed subsets is the compartment policy used in typical military applications. This policy partitions users into compartments, with each user only able to access information required for their duties. If every user has access to only one compartment at a time, the system is secure from viral attack across compartment boundaries because the compartments are isolated. Unfortunately, in current systems, users may have simultaneous access to multiple compartments. In this case, infection can spread across these boundaries to the transitive closure of information flow.
3.3 Flow Models

In policies that do not partition systems into closed proper subsets under transitivity, it is possible to limit the extent over which a virus can spread. The 'flow distance' policy implements a distance metric by keeping track of the distance (number of sharings) over which data has flowed. The rules are: the distance of output information is the maximum of the distances of input information, and the distance of shared information is one more than the distance of the same information before sharing. Protection is provided by enforcing a threshold above which information becomes unusable. Thus a file with distance 8 shared into a process with distance 2 increases the process to distance 9, and any further output will be at least that distance.
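A sketch of this bookkeeping (an editorial illustration; the threshold value is an assumption, since the text leaves it to the policy):

THRESHOLD = 9   # distance at which information becomes unusable (assumed value)

def share(distance):
    # Sharing increases the distance of the shared copy by one.
    return distance + 1

def combine(*input_distances):
    # Output takes the maximum distance of its inputs.
    return max(input_distances)

# The example from the text: a file at distance 8 shared into a process at
# distance 2 raises the process, and all its further output, to distance 9.
process = combine(2, share(8))
print(process)                  # 9
print(process >= THRESHOLD)     # True: further output is unusable under this threshold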
The 'flow list' policy maintains a list of all users who have had an effect on each object. The rule for maintaining this list is: the flow list of output is the union of the flow lists of all inputs (including the user who causes the action). Protection takes the form of an arbitrary Boolean expression on flow lists which determines accessibility. This is a very general policy, and can be used to represent any of the above policies by selecting proper Boolean expressions.

As an example, user A could only be allowed to access information written by users (B and C) or (B and D), but not information written by B, C, or D alone. This can be used to enforce certification of information by B before C or D can pass it to A. The flow list system can also be used to implement the Biba and the distance models. As an example, the distance model can be realized as follows:

    OR(users <= distance 1)
    AND NOT(OR(users > distance 1))
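A sketch of flow lists as sets (an editorial illustration; the predicate encodes the certification example above, and the function names are assumptions):

def output_flow_list(*input_lists, actor):
    # Flow list of output: union of the inputs' lists plus the acting user.
    result = set().union(*input_lists)
    result.add(actor)
    return result

def accessible_to_A(flow_list):
    # A may read information written by (B and C) or (B and D),
    # but not information written by B, C, or D alone.
    return ("B" in flow_list and "C" in flow_list) or \
           ("B" in flow_list and "D" in flow_list)

print(accessible_to_A({"B", "C"}))   # True: B-certified data passed on by C
print(accessible_to_A({"C"}))        # False: uncertified
print(accessible_to_A(output_flow_list({"B"}, actor="C")))   # True: list is {'B', 'C'}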
A further generalization of flow lists to flow sequences is possible, and appears to be the most general scheme possible for implementing a flow control policy.

In a system with unlimited information paths, limited transitivity may have an effect if users do not use all available paths, but since there is always a direct path between any two users, there is always the possibility of infection. As an example, in a system with transitivity limited to a distance of 1, it is 'safe' to share information with any user you 'trust' without having to worry about whether that user has incorrectly trusted another user.
3.4 Limited Interpretation

With limits on the generality of interpretation less restrictive than fixed first order interpretation, the ability to infect is an open question, because infection depends on the functions permitted. Certain functions are required for infection. The ability to write is required, but any useful program must have output. It is possible to design a set of operations that do not allow infection in even the most general case of sharing and transitivity, but it is not known whether any such set includes non-fixed first order functions.

As an example, a system with only the function 'display-file' can only display the contents of a file to a user, and cannot possibly modify any file. In fixed database or mail systems this may have practical applications, but certainly not in a development environment. In many cases, computer mail is a sufficient means of communications, and so long as the computer mail system is partitioned from other applications so that no information can flow between them except in the covert channel through the user, this may be used to prevent infection.
Although no fixed interpretation scheme can itself be infected, a high order fixed interpretation scheme can be used to infect programs written to be interpreted by it. As an example, the microcode of a computer may be fixed, but code in the machine language it interprets can still be infected. LISP, APL, and BASIC are all examples of fixed interpretation schemes that can interpret information in general ways. Since their ability to interpret is general, it is possible to write a program in any of these languages that infects programs in any or all of them.

In limited interpretation systems, infection cannot spread any further than in general interpretation systems, because every function in a limited system must also be able to be performed in a general system. The previous results therefore provide upper bounds on the spread of a virus in systems with limited interpretation.
3.5 Precision Problems

Although isolationism and limited transitivity offer solutions to the infection problem, they are not ideal in the sense that widespread sharing is generally considered a valuable tool in computing. Of these policies, only isolationism can be precisely implemented in practice, because tracing exact information flow requires NP-complete time, and maintaining markings requires large amounts of space [7]. This leaves us with imprecise techniques. The problem with imprecise techniques is that they tend to move systems towards isolationism. This is because they use conservative estimates of effects in order to prevent potential damage. The philosophy behind this is that it is better to be safe than sorry.

The problem is that when information has been unjustly deemed unreadable by a given user, the system becomes less usable for that user. This is a form of denial of services in that access to information that should be accessible is denied. Such a system always tends to make itself less and less usable for sharing until it either becomes completely isolationist or reaches a stability point where all estimates are precise. If such a stability point existed, we would have a precise system for that stability point. Since we know that any precise stability point besides isolationism requires the solution to an NP-complete problem, we know that any non NP-complete solution must tend towards isolationism.

3.6 Summary and Conclusions

Fig. 5 summarizes the limits placed on viral spreading by the preventative protection just examined. 'Unknown' is used to indicate that the specifics of specific systems are known, but that no general theory has been shown to predict limitations in these categories.

                     General interpretation       Limited interpretation
                     limited        general       limited        general
                     transitivity   transitivity  transitivity   transitivity
  sharing: general   unlimited      unlimited     unknown        unknown
  sharing: limited   arbitrary      arbitrary     unknown        unknown
                     closure        closure

Fig. 5. Limits of viral infection.

4. Cure of Computer Viruses

Since prevention of computer viruses may be infeasible if sharing is desired, the biological analogy leads us to the possibility of cure as a means of protection. Cure in biological systems depends on the ability to detect a virus and find a way to overcome it. A similar possibility exists for computer viruses. We now examine the potential for detection and removal of a computer virus.
4.1 Detection of Viruses

In order to determine that a given program 'P' is a virus, it must be determined that P infects other programs. This is undecidable, since P could invoke any proposed decision procedure 'D' and infect other programs if and only if D determines that P is not a virus. We conclude that a program that precisely discerns a virus from any other program by examining its appearance is infeasible. In the following modification to program V (Fig. 6), we use the hypothetical decision procedure D, which returns "true" iff its argument is a virus, to exemplify the undecidability of viral detection.

By modifying the main program of V, we have assured that, if the decision procedure D determines CV to be a virus, CV will not infect other programs and thus will not act as a virus. If D determines that CV is not a virus, CV will infect other programs and thus be a virus. Therefore, the hypothetical decision procedure D is self-contradictory, and precise determination of a virus by its appearance is undecidable.
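The construction of Fig. 6 below can also be phrased directly in executable form. The sketch that follows is an editorial illustration, not the paper's code: given any claimed detector D, it builds a program that consults D about itself and does the opposite of what D predicts (the infection step is reduced to a print):

def make_contradictory_virus(D):
    # Given a claimed virus-detector D, build a program that defeats it.
    def contradictory_virus():
        if not D(contradictory_virus):
            infect_executable()   # behaves as a virus exactly when D says it is not one
        # otherwise do nothing infectious: D called it a virus, so it is not one
    return contradictory_virus

def infect_executable():
    print("infection step would run here")   # placeholder, as in the sketches above

# Whatever D answers about the constructed program, the answer is wrong:
D = lambda program: False          # a (necessarily incorrect) candidate detector
cv = make_contradictory_virus(D)
cv()                               # D said 'not a virus', so cv promptly infects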
program contradictory-virus :=
{...

main-program :=
    {if ~D(contradictory-virus) then
        {infect-executable;
        if trigger-pulled then
            do-damage;
        }
    goto next;
    }

next:}

Fig. 6. Contradiction of the decidability of a virus 'CV'.

4.2 Evolutions of a Virus

In our experiments, some viruses took under 100 bytes to implement on a general purpose computer. Since we could interleave any program that doesn't halt, terminates in finite time, and does not overwrite the virus or any of its state variables, and still have a virus, the number of possible variations on a single virus is clearly very large. In this example of an evolutionary virus EV, we augment V by allowing it to add random statements between any two necessary statements (Fig. 7).
In general, proof of the equivalence of two evolutions of a program 'P' ('P1' and 'P2') is undecidable, because any decision procedure 'D' capable of finding their equivalence could be invoked by P1 and P2. If found equivalent, they perform different operations, and if found different, they act the same, and are thus equivalent. This is exemplified by the modification in Fig. 8 to program EV, in which the decision procedure D returns "true" iff two input programs are equivalent.

The program UEV evolves into one of two types of programs, P1 or P2. If the program type is P1, the statement labeled "zzz" will become:

    if D(P1, P2) then print 1;

while if the program type is P2, the statement labeled "zzz" will become:

    if D(P1, P2) then print 0;

The two evolutions each call decision procedure D to decide whether they are equivalent. If D indicates that they are equivalent, then P1 will print a 1 while P2 will print a 0, and D will be contradicted. If D indicates that they are different, neither prints anything. Since they are otherwise equal, D is again contradicted. Therefore, the hypothetical decision procedure D is self-contradictory, and the precise determination of the equivalence of these two programs by their appearance is undecidable.
program evolutionary-virus :=
{...

subroutine print-random-statement :=
    {print (random-variable-name, "=", random-variable-name);
    loop: if random-bit = 1 then
        {print (random-operator, random-variable-name);
        goto loop;}
    print (semicolon);
    }

subroutine copy-virus-with-random-insertions :=
    {loop: copy evolutionary-virus to virus till semicolon;
    if random-bit = 1 then print-random-statement;
    if ~end-of-input-file goto loop;
    }

main-program :=
    {copy-virus-with-random-insertions;
    infect-executable;
    if trigger-pulled then do-damage;
    goto next;}

next:}

Fig. 7. Evolutionary virus 'EV'.
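A toy rendering of EV's idea (an editorial sketch, not the paper's code): evolutions are produced by interleaving random no-op assignments between the necessary statements, so any two copies almost surely differ textually while behaving identically.

import random

BODY = ["x = 0", "x = x + 1", "print(x)"]   # stand-in for the virus's necessary statements

def random_statement():
    # A random no-op assignment, in the spirit of print-random-statement in Fig. 7.
    return "v%d = %d" % (random.randrange(1000), random.randrange(1000))

def evolve(body):
    # Copy the body, inserting a random statement after each line half the time.
    out = []
    for stmt in body:
        out.append(stmt)
        if random.random() < 0.5:
            out.append(random_statement())
    return out

e1, e2 = evolve(BODY), evolve(BODY)
print("\n".join(e1))
# The two evolutions almost surely differ textually, yet both print 1:
exec("\n".join(e1))
exec("\n".join(e2))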
program undecidable-EV :=
{...

subroutine copy-with-undecidable :=
    {copy undecidable-EV to file till line-starts-with zzz;
    if file = P1 then
        print ("if D(P1,P2) print 1;");
    if file = P2 then
        print ("if D(P1,P2) print 0;");
    copy undecidable-EV to file till end-of-input-file;
    }

main-program :=
    {if random-bit = 0 then file = P1
        otherwise file = P2;
    copy-with-undecidable;
    zzz:
    infect-executable;
    if trigger-pulled then do-damage;
    goto next;}

next:}

Fig. 8. Undecidable equivalence of evolutions of a virus 'UEV'.
Since both P1 and P2 are evolutions of the same program, the equivalence of evolutions of a program is undecidable, and since they are both viruses, the equivalence of evolutions of a virus is undecidable. Program UEV also demonstrates that two unequivalent evolutions can both be viruses.

An alternative to detection by appearance is detection by behavior. A virus, just as any other program, acts as a surrogate for the user in requesting services, and the services used by a virus are legitimate in legitimate uses.
