Patent No. 8,074,115
Petition For Inter Partes Review

Paper No. 1

IN THE
UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD
_____________

SYMANTEC CORPORATION,
Petitioner

- vs. -

THE TRUSTEES OF COLUMBIA UNIVERSITY
IN THE CITY OF NEW YORK,
Patent Owner

_____________

Patent No. 8,074,115
Issued: Dec. 6, 2011
Inventors: Salvatore J. Stolfo, Angelos D. Keromytis, and Stelios Sidiroglou
Title: METHODS, MEDIA AND SYSTEMS FOR DETECTING ANOMALOUS PROGRAM EXECUTIONS

Inter Partes Review No.

PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 8,074,115
UNDER 35 U.S.C. §§ 311-319 AND 37 C.F.R. §§ 42.1-.80, 42.100-.123
_____________

Mail Stop Patent Board
Patent Trial and Appeal Board
P.O. Box 1450
Alexandria, VA 22313-1450

December 5, 2014

TABLE OF CONTENTS

Page

I.    INTRODUCTION ... 1

II.   MANDATORY NOTICES (37 C.F.R. § 42.8(A)(1)) ... 1
      A.  Real Party-In-Interest (37 C.F.R. § 42.8(b)(1)) ... 1
      B.  Notice of Related Matters (37 C.F.R. § 42.8(b)(2)) ... 1
      C.  Designation of Lead and Backup Counsel (37 C.F.R. § 42.8(b)(3)) ... 2
      D.  Service of Information (37 C.F.R. § 42.8(b)(4)) ... 2

III.  GROUNDS FOR STANDING (37 C.F.R. § 42.104(A)) ... 2

IV.   IDENTIFICATION OF CHALLENGE (37 C.F.R. § 42.104(B)) ... 2
      A.  Effective Filing Date of the ’115 Patent ... 2
      B.  There Is a Reasonable Likelihood That at Least One Claim of the ’115 Patent Is Unpatentable Under 35 U.S.C. §§ 102 or 103 ... 3

V.    OVERVIEW OF THE ’115 PATENT ... 4

VI.   CONSTRUCTION OF THE CHALLENGED CLAIMS (37 C.F.R. § 42.104(B)(3)) ... 7

VII.  THE CHALLENGED CLAIMS ARE UNPATENTABLE ... 11
      A.  The subject matter of the ’115 patent is disclosed in the prior art ... 11
          1.  U.S. Patent Publication No. 2005/0108562 (“Khazan”) ... 11
          2.  U.S. Patent No. 8,108,929 (“Agrawal”) ... 14
          3.  U.S. Patent No. 5,440,723 (“Arnold”) ... 15
      B.  Reasons the Claims are Unpatentable ... 17
          1.  Ground 1: Khazan Anticipates Claims 22, 25, 27-29, 32, 35-39, and 42 Under pre-AIA 35 U.S.C. § 102(e) ... 17
              a.  Claim 22: “A method for detecting anomalous program executions” ... 17
              b.  Claim 32: “A non-transitory computer-readable medium containing computer-executable instructions that, when executed by a processor, cause the processor to perform a method for detecting anomalous program executions” ... 18
              c.  Claim 42: “A system for detecting anomalous program executions, comprising: a digital processing device” ... 19
              d.  Claims 22, 32, and 42: “modifying a program to include indicators of program-level function calls being made during execution of the program” ... 20
              e.  Claims 22, 32, and 42: “comparing at least one of the indicators of program-level function calls made in an emulator to a model of function calls for at least a part of the program” ... 22
              f.  Claims 22, 32, and 42: “identifying a function call corresponding to the at least one of the indicators as anomalous based on the comparison” ... 23
              g.  Claims 25 and 35: “modifying the function call so that the function call becomes non-anomalous” ... 24
              h.  Claims 27 and 37: “the comparing compares the function call name and arguments to the model” ... 24
              i.  Claims 28 and 38: “the model reflects normal activity of the at least a part of the program” ... 25
              j.  Claims 29 and 39: “the model reflects attacks against the at least a part of the program” ... 26
              k.  Claim 36: “generating a virtualized error in response to the function call being identified as being anomalous” ... 27
          2.  Ground 2: Khazan in View of Arnold Renders Obvious Claims 1, 4-8, 11, 14-18, 21, and 26 Under pre-AIA 35 U.S.C. § 103(a) ... 28
              a.  Claim 1: “A method for detecting anomalous program executions” ... 29
              b.  Claim 11: “A non-transitory computer-readable medium containing computer-executable instructions that, when executed by a processor, cause the processor to perform a method for detecting anomalous program executions” ... 29
              c.  Claim 21: “A system for detecting anomalous program executions, comprising: a digital processing device” ... 29
              d.  Claims 1, 11, and 21: “executing at least a part of a program in an emulator” ... 29
              e.  Claims 1, 11, and 21: “comparing a function call made in the emulator to a model of function calls for the at least a part of the program” ... 30
              f.  Claims 1, 11, and 21: “identifying the function call as anomalous based on the comparison” ... 31
              g.  Claims 1, 11, and 21: “upon identifying the anomalous function call, notifying an application community that includes a plurality of computers of the anomalous function call” ... 32
              h.  Claims 4 and 14: “modifying the function call so that the function call becomes non-anomalous” ... 35
              i.  Claims 6 and 16: “the comparing compares the function call name and arguments to the model” ... 35
              j.  Claims 7 and 17: “the model reflects normal activity of the at least a part of the program” ... 35
              k.  Claims 8 and 18: “the model reflects attacks against the at least a part of the program” ... 35
          3.  Ground 3: The Combination of Khazan, Arnold, and Agrawal Renders Obvious Claims 2, 3, 9, 10, 12, 13, 19, 20, 23, 24, 30, 31, 33, 34, 40, and 41 Under pre-AIA 35 U.S.C. § 103(a) ... 35
              a.  Claims 2, 12, 23, and 33: “creating a combined model from at least two models created using different computers” ... 37
              b.  Claims 3, 13, 24, and 34: “creating a combined model from at least two models created at different times” ... 38
              c.  Claims 9, 19, 30, and 40: “randomly selecting the model as to be used in the comparison from a plurality of different models relating to the program” ... 39
              d.  Claims 10, 20, 31, and 41: “randomly selecting a portion of the model to be used in the comparison” ... 41

VIII. CONCLUSION ... 42
EXHIBIT LIST (37 C.F.R. § 42.63(e))

Exhibit   Description

1001      U.S. Patent No. 8,074,115 to Stolfo et al.
1002      File History of U.S. Patent No. 8,074,115
1003      Declaration of Michael T. Goodrich, Ph.D.
1004      Curriculum vitae of Michael T. Goodrich, Ph.D.
1005      The Trustees of Columbia University in the City of New York v. Symantec Corp., Civil Action No. 3:13-cv-808, Oct. 7, 2014 Claim Construction Order (Dkt. No. 123)
1006      The Trustees of Columbia University in the City of New York v. Symantec Corp., Civil Action No. 3:13-cv-808, Oct. 23, 2014 Memorandum Order Clarifying Claim Construction (Dkt. No. 146)
1007      U.S. Patent No. 5,440,723 to Arnold et al.
1008      U.S. Patent No. 8,108,929 to Agrawal et al.
1009      McGraw-Hill Dictionary of Scientific and Technical Terms (5th ed. 1994)
1010      U.S. Patent Application Publication No. 2005/0108562 to Khazan et al.
1011      U.S. Patent No. 7,334,005 to Sobel

I.   INTRODUCTION

In accordance with 35 U.S.C. §§ 311-319 and 37 C.F.R. §§ 42.1-.80 & 42.100-.123, inter partes review is respectfully requested for claims 1-42 of United States Patent No. 8,074,115 to Stolfo et al., titled “Methods, Media and Systems for Detecting Anomalous Program Executions” (the “’115 patent”), owned by The Trustees of Columbia University in the City of New York (“Columbia”). (EXHIBIT 1001 (“Ex. 1001”)). This petition demonstrates that there is a reasonable likelihood that the petitioner will prevail on at least one of the claims challenged in the petition based on one or more prior art references. For the reasons provided herein, claims 1-42 of the ’115 patent should be canceled as unpatentable.

II.  MANDATORY NOTICES (37 C.F.R. § 42.8(A)(1))

A.   Real Party-In-Interest (37 C.F.R. § 42.8(b)(1))

The real party-in-interest for this petition is Symantec Corporation (“Petitioner” or “Symantec”).

B.   Notice of Related Matters (37 C.F.R. § 42.8(b)(2))

The ’115 patent is presently the subject of the following patent infringement lawsuit brought by Columbia in the Eastern District of Virginia, Richmond Division: Civil Action No. 3:13-cv-808 against Symantec. Concurrent with the instant petition, Petitioner is also filing petitions requesting inter partes review of U.S. Patent Nos. 8,601,322, 7,487,544, 7,979,907, 7,448,084, and 7,913,306.
C.   Designation of Lead and Backup Counsel (37 C.F.R. § 42.8(b)(3))

Lead: David D. Schumann, Reg. No. 53,569. Email: dschumann@fenwick.com. Backup: Brian M. Hoffman, Reg. No. 39,713. Email: bhoffman@fenwick.com. Address for both counsel: FENWICK & WEST LLP, 555 California Street, 12th Floor, San Francisco, CA 94104, Tel: (415) 875-2300, Fax: (415) 281-1350.
D.   Service of Information (37 C.F.R. § 42.8(b)(4))

Service of any documents via hand-delivery may be made at the postal mailing addresses of the respective lead and back-up counsel designated above, with courtesy copies to the email addresses dschumann@fenwick.com and bhoffman@fenwick.com. Petitioner consents to electronic service.

III. GROUNDS FOR STANDING (37 C.F.R. § 42.104(A))

Petitioner certifies pursuant to Rule 42.104(a) that the ’115 patent is available for inter partes review and that Petitioner is not barred or estopped from requesting an inter partes review challenging the validity of the above-referenced claims of the ’115 patent on the grounds identified in the petition.
IV.  IDENTIFICATION OF CHALLENGE (37 C.F.R. § 42.104(B))

A.   Effective Filing Date of the ’115 Patent

The ’115 patent issued from U.S. Application No. 12/091,150, filed on October 25, 2006. The ’150 Application is a U.S. National Phase application under 35 U.S.C. § 371 of International Patent Application No. PCT/US2006/041591, filed October 25, 2006, which claims the benefit of U.S. Provisional Application No. 60/730,289, filed Oct. 25, 2005. Claims 1, 11, 21, 22, 32, and 42 of the ’115 patent are independent. The effective filing date of these claims and the claims that depend from them is no earlier than Oct. 25, 2005.
B.   There Is a Reasonable Likelihood That at Least One Claim of the ’115 Patent Is Unpatentable Under 35 U.S.C. §§ 102 or 103

The challenged claims are generally directed to detecting anomalous program executions. However, the subject matter of these claims was disclosed in the prior art. Specifically, the claims are unpatentable in view of the following patents and publications:

•  U.S. Patent Publication No. 2005/0108562, filed on June 18, 2003, published on May 19, 2005, and titled “Technique for detecting executable malicious code using a combination of static and dynamic analyses” (“Khazan”) (EXHIBIT 1010). This publication is prior art to the ’115 patent under pre-AIA § 102(e).

•  U.S. Patent No. 8,108,929, filed on October 19, 2004, issued on January 31, 2012, and titled “Method and system for detecting intrusive anomalous use of a software system using multiple detection algorithms” (“Agrawal”) (EXHIBIT 1008). This patent is prior art to the ’115 patent under pre-AIA § 102(e).

•  U.S. Patent No. 5,440,723, filed on January 19, 1993, issued on August 8, 1995, and titled “Automatic immune system for computers and computer networks” (“Arnold”) (EXHIBIT 1007). This patent is prior art to the ’115 patent under pre-AIA §§ 102(a) and 102(b).

•  U.S. Patent No. 7,334,005, filed on April 13, 2005, issued on February 19, 2008, and titled “Controllable deployment of software updates” (“Sobel”) (EXHIBIT 1011). This patent is prior art to the ’115 patent under pre-AIA § 102(e).

Section VII below explains how the above-cited references create a reasonable likelihood that Petitioner will prevail on at least one of the challenged claims. See 35 U.S.C. § 314(a). Indeed, section VII, as supported by the Declaration of Michael T. Goodrich, Ph.D. (EXHIBIT 1003), demonstrates that the challenged claims are anticipated and rendered obvious in view of these references. Petitioner therefore requests cancellation of claims 1-42 as unpatentable under 35 U.S.C. §§ 102 and 103.
V.   OVERVIEW OF THE ’115 PATENT

The ’115 patent discloses a way to detect “anomalous program executions that may be indicative of a malicious attack or program fault.” (’115 patent, 3:13-15, Ex. 1001). As disclosed in the ’115 patent, an anomaly detector trains a model of normal program behavior and applies the model to detect deviations from normal program behavior during subsequent operation. (See ’115 patent, 3:50-56, Ex. 1001). The anomaly detector of the ’115 patent specifically focuses on detecting deviations in function calls made by the program in order to detect anomalous behavior. (See ’115 patent, 3:46-56, Ex. 1001).

The anomaly detector first “models normal program execution stack behavior.” (’115 patent, 3:50-52, Ex. 1001). This behavior may include function names, function call arguments, stack frames, and the like. (’115 patent, 3:38-40, Ex. 1001). The anomaly detector then observes subsequent function calls made by the program and uses the trained model to detect deviations from normal behavior. (’115 patent, 3:52-56, Ex. 1001). In some embodiments, upon identifying a function call as anomalous, the anomaly detector notifies an application community running the same program or same portion of the program that an anomalous function call has been identified. (See ’115 patent, 18:57-59, Ex. 1001).

Figure 8 illustrates the operation of the anomaly detector described in the ’115 patent:

[FIG. 8 of the ’115 patent]

The anomaly detector detects a function call being made by a program at step 802. The anomaly detector then compares the detected function call at step 804 to a model of normal function calls computed based on training data. To train a model of normal function calls, the anomaly detector monitors normal execution of the program. Once the model is trained, it is applied against further executions of the program to identify anomalous function calls associated with the program. Thereafter, detected function calls can be compared to the model at step 804 to identify whether the function call is anomalous at step 806. (’115 patent, 3:46-56, Ex. 1001).
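For illustration only, the three-step loop of FIG. 8 can be sketched in a few lines of Python. This sketch is not part of the petition or the ’115 patent, and every name in it is hypothetical: a model of normal behavior is built from attack-free traces of function calls, and a later call that deviates from the model is flagged as anomalous.

```python
# Illustrative sketch only: a minimal anomaly detector following the
# FIG. 8 loop -- detect a function call (step 802), compare it to a
# model of normal function calls built from training data (step 804),
# and identify deviations as anomalous (step 806).

def train_model(normal_traces):
    """Model normal behavior as the set of (function name, argument
    count) pairs observed during attack-free training runs."""
    model = set()
    for trace in normal_traces:
        for name, args in trace:
            model.add((name, len(args)))
    return model

def is_anomalous(call, model):
    """Steps 804/806: a detected call deviates if the model never
    saw that function called with that number of arguments."""
    name, args = call
    return (name, len(args)) not in model

# Train on one attack-free run, then check later executions.
model = train_model([[("open", ("log.txt",)), ("read", ("log.txt", 100))]])
print(is_anomalous(("open", ("data.txt",)), model))   # False: fits the model
print(is_anomalous(("exec", ("/bin/sh",)), model))    # True: never observed
```

A real detector would model richer features (arguments, stack frames, call sequences), but the compare-against-trained-model structure is the same.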
Several of the independent claims of the ’115 patent generally correspond to the steps illustrated by FIG. 8. For example, claim 22 recites: “[a] method for detecting anomalous program executions, comprising: modifying a program to include indicators of program-level function calls being made during execution of the program; comparing at least one of the indicators of program-level function calls made in an emulator to a model of function calls for at least a part of the program; and identifying a function call corresponding to the at least one of the indicators as anomalous based on the comparison.” Independent claims 32 and 42 likewise recite “modifying,” “comparing,” and “identifying” elements.

Other independent claims of the ’115 patent recite an additional element after the “identifying” element. For example, claim 1 recites “upon identifying the anomalous function call, notifying an application community that includes a plurality of computers of the anomalous function call.” Claims 11 and 21 also recite the “notifying” element.
VI.  CONSTRUCTION OF THE CHALLENGED CLAIMS (37 C.F.R. § 42.104(B)(3))

The terms in claims 1-42 are to be given their broadest reasonable construction (“BRC”), as understood by one of ordinary skill in the art and consistent with the disclosure. See 37 C.F.R. § 42.100(b); see also In re Yamamoto, 740 F.2d 1569, 1571 (Fed. Cir. 1984); In re Am. Acad. of Sci. Tech. Ctr., 367 F.3d 1359, 1363-64 (Fed. Cir. 2004).

The following constructions were adopted by the district court in The Trustees of Columbia University in the City of New York v. Symantec Corp., Civil Action No. 3:13-cv-808, for the ’115 patent. Petitioner submits that the claim terms should be construed at least as broadly as the constructions the district court adopted, for the reasons set forth in that case. (Claim Construction Order, p. 2, Ex. 1005; Clarification of Claim Construction Order, pp. 1-2, Ex. 1006).

Specifically, the BRC of the term “anomalous” is “deviation/deviating from a model of typical, attack-free computer system usage.” (Claim Construction Order, p. 2, Ex. 1005; Clarification of Claim Construction Order, pp. 1-2, Ex. 1006). This construction is consistent with the specification, which states that the anomaly detector models “normal program execution stack behavior” and “detect[s] stacked function references as anomalous . . . by comparing those references to the model.” (’115 patent, 3:46-56, Ex. 1001; Goodrich Decl., ¶ 55, Ex. 1003).

The BRC of the term “emulator” is “software, alone or in combination with hardware, that permits the monitoring and selective execution of certain parts, or all, of a program.” (Claim Construction Order, p. 2, Ex. 1005). This construction is consistent with the specification, which explains that stack information may be extracted using “Selective Transactional Emulation (STEM) . . . which permits the selective execution of certain parts, or all, of a program inside an instruction-level emulator . . . by modifying a program’s binary or source code to include indicators of what functions calls are being made (and any other suitable related information), or using any other suitable technique.” (’115 patent, 3:28-37, Ex. 1001; Goodrich Decl., ¶ 56, Ex. 1003).

The BRC of the term “application community” is “members of a community running the same program or a selected portion of the program.” (Claim Construction Order, p. 2, Ex. 1005). This construction is consistent with the specification, which explains that “models are shared among many members of a community running the same application (referred to as an ‘application community’).” (’115 patent, 6:31-33, Ex. 1001; Goodrich Decl., ¶ 57, Ex. 1003).

In addition to the constructions adopted by the district court, Petitioner submits the following constructions.

The BRC of the phrase “generating a virtualized error” is “simulating an error return from the function.” The specification supports this construction by describing how an emulator is used to “simulate an error return from a function of the application.” (’115 patent, 13:51-61, Ex. 1001; Goodrich Decl., ¶ 58, Ex. 1003). The specification further states that “the emulator can . . . simulate an error return from the function” and that this simulation is “sometimes referred to herein as ‘error virtualization.’” (’115 patent, 15:12-21, Ex. 1001; Goodrich Decl., ¶ 58, Ex. 1003).

Additionally, the BRC of the phrase “indicators of program-level function calls” is “indicators of which of the program’s functions are being called.” The specification supports this construction in disclosing that STEM, “which permits the selective execution of certain parts, or all, of a program inside an instruction-level emulator,” modifies a program’s binary or source code “to include indicators of what function calls are being made (and any other suitable related information) . . . .” (’115 patent, 3:28-37, Ex. 1001; Goodrich Decl., ¶ 59, Ex. 1003). The specification further explains that the “instruction-level emulator can be selectively invoked for segments of the application’s code . . . and simulate an error return from a function of the application.” (’115 patent, 13:51-61, Ex. 1001 (emphasis added); Goodrich Decl., ¶ 59, Ex. 1003). Moreover, “upon entering the vulnerable section of the application’s code, the instruction-level emulator can capture and store the program state and processes all instructions, including function calls, inside the area designated for emulation.” (’115 patent, 13:61-65, Ex. 1001 (emphasis added); Goodrich Decl., ¶ 59, Ex. 1003). Because the indicators are of function calls made by a program, and because the program (i.e., application) may return an error from the function, it is clear that the indicators indicate which of the program’s functions are being called. (Goodrich Decl., ¶ 59, Ex. 1003).

Finally, the BRC of the term “reflects” is “describes.” The only instance in which the term “reflect” is used in the specification is when explaining that “models may be automatically updated as time progresses. For example, although a single site may learn a particular model over some period of time, application behavior may change over time. In this case, the previously learned model may no longer accurately reflect the application characteristics. . . .” (’115 patent, 7:58-62, Ex. 1001; Goodrich Decl., ¶ 60, Ex. 1003). A person of ordinary skill in the art would recognize that, in this instance, the specification uses the word “reflect” to mean “describe.” (Goodrich Decl., ¶ 60, Ex. 1003).

The claim terms not specifically construed herein are given their BRC, as understood by one of ordinary skill in the art and consistent with the disclosure. (Goodrich Decl., ¶ 54, Ex. 1003).
VII. THE CHALLENGED CLAIMS ARE UNPATENTABLE

A.   The subject matter of the ’115 patent is disclosed in the prior art

The ’115 patent is directed at detecting anomalous function calls made by a program. (’115 patent, Abstract, Ex. 1001). The concept of anomaly detection was well known before the priority date of the ’115 patent. Khazan states in its Background that “[a]nomaly detection approaches [for detecting malicious code] use a model or definition of what is expected or normal with respect to a particular application and then look for deviations from this model.” (Khazan, ¶ 8, Ex. 1010). Thus, the core concept recited by the independent claims of the ’115 patent, anomaly detection, was already recognized as prior art at least two years before the earliest priority date of the patent. The additional elements recited by the challenged claims merely describe anticipated or obvious uses of anomaly detection in connection with program execution.
1.   U.S. Patent Publication No. 2005/0108562 (“Khazan”)

Khazan describes techniques “to distinguish between normal or expected behavior of code and the behavior produced by MC [malicious code].” (Khazan, ¶ 65, Ex. 1010). Khazan discloses the basic concept of using anomaly detection to detect malicious code. (See Khazan, ¶ 8, Ex. 1010). In addition, Khazan specifically teaches comparing a function call made by a program to a model of function calls for the program, and identifying the function call as anomalous based on the comparison. (See Khazan, Abstract).

In more detail, Khazan discloses a malicious code “detection system [which] includes a static analyzer and a dynamic analyzer.” (Khazan, ¶ 40, Ex. 1010) (reference numbers omitted in this discussion of Khazan). The static analyzer acts “to examine and identify calls or invocations made from [an] application executable to a predetermined set of target functions or routines.” (Khazan, ¶ 42, Ex. 1010). The static analyzer “may also identify additional information about these functions, such as, for example, particular locations within the application from which the calls to these functions are made, parameter number and type information for each call, the values that some of these parameters take at run-time, and the like.” (Khazan, ¶ 42, Ex. 1010). The static analyzer then “produces a list of targets and invocation locations, as related to the identified function calls.” (Khazan, ¶ 60, Ex. 1010).

The malicious code detection system “creates an application model using the information obtained from the static analyzer.” (Khazan, ¶ 65, Ex. 1010). The model comprises “the identified calls, their locations within the program, and other call-related information.” (Khazan, ¶ 114, Ex. 1010). Khazan notes that building such models was prior art to Khazan’s application, and that “[s]ome existing anomaly detection techniques create models of normal behavior of a particular application based on observing sequences of system calls executed at run time as part of a learning phase.” (Khazan, ¶ 9, Ex. 1010).

Khazan uses the model “to verify the run time behavior of the application executable,” where “it is determined that the application executable has executed MC” “[i]f the run time behavior deviates from the application model.” (Khazan, ¶ 65, Ex. 1010). Specifically, the dynamic analyzer “performs run time validation of the application’s run time behavior characterized by the target function calls being monitored . . . to ensure that the target function calls that are made at run time match the information obtained by the static analyzer.” (Khazan, ¶ 67, Ex. 1010). “If there are any deviations detected during the execution of the application executable, it is determined that the application executable includes MC.” (Id.).
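Khazan’s static/dynamic split described above can be illustrated with a short sketch. This is a hypothetical illustration, not Khazan’s code: a static pass produces the expected (call site, target function) pairs as the application model, and a run-time validator treats any monitored call that deviates from that model as evidence of malicious code.

```python
# Hypothetical illustration of Khazan's two analyzers: the static
# analyzer lists the (call site, target function) pairs found in the
# executable, and the dynamic analyzer validates run-time calls
# against that application model.

def static_analyze(call_sites):
    """Stand-in for the static analyzer: produce the application
    model as a set of (call site, target function) pairs."""
    return set(call_sites)

def dynamic_validate(observed_calls, app_model):
    """Stand-in for the dynamic analyzer: return every run-time call
    that deviates from the model (a deviation indicates MC)."""
    return [call for call in observed_calls if call not in app_model]

# Model built statically from the application executable.
model = static_analyze([("main+0x10", "CreateFileW"), ("main+0x40", "WriteFile")])

# At run time, an injected call appears at a site the static pass never saw.
observed = [("main+0x10", "CreateFileW"), ("shellcode", "WinExec")]
print(dynamic_validate(observed, model))  # [('shellcode', 'WinExec')]
```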
Khazan was identified as prior art by the Examiner during prosecution of the ’115 patent in the Notice of Allowance and Fees Due mailed July 28, 2011. (File History, p. 8, Ex. 1002). In the “Reasons for Allowance” of the application that granted into the ’115 patent, the Examiner identified Khazan as an example of the prior art of record. The Examiner distinguished Khazan by stating that “[t]he prior art of record . . . fails to teach . . . comparing a function call made in an emulator to a model of function calls for the at least a part of the program [and] identifying the function call as anomalous based on the comparison.” (File History, p. 13, Ex. 1002).

However, the Examiner’s description of Khazan is incorrect because the reference explicitly discloses “comparing a function call made in an emulator to a model of function calls.” Khazan discloses that “[t]he detection tool may . . . execute one or more of the executables in order to possibly detect MC contained in these executables. This may also be done as an emulation or simulation of the execution, or in what is known to those skilled in the art as a virtual environment.” (Khazan, ¶ 111, Ex. 1010; see also Khazan, ¶ 112, Ex. 1010 (“An execution of the application may also be emulated or simulated.”)). As shown above, Khazan also discloses identifying the function call as anomalous based on the comparison.
2.   U.S. Patent No. 8,108,929 (“Agrawal”)

Agrawal describes a way to detect “intrusive anomalous use of a software system using multiple detection algorithms.” (Agrawal, Title, Ex. 1008). Among other things, Agrawal discloses various ways in which multiple models can be created and combined as part of the anomaly intrusion detection classification technique. For example, Agrawal mentions that “[m]ultiple detection algorithms may . . . be combined in parallel to improve the precision of detection of . . . anomalies.” (Agrawal, 1:62-65, Ex. 1008).

The reference discusses how an attacker may gain unauthorized access to a computer system and “cause the system software to execute in a manner that is typically inconsistent with the software specification and thus leads to a breach in security.” (Agrawal, 1:20-23, Ex. 1008). Such attacks may be detected by collating traces of activity from audit trails or logs and using a classification technique such as “anomaly intrusion detection,” which “searches for a departure from normality.” (Agrawal, 1:24-34, Ex. 1008). These activity traces may include “parameters for system calls and other function calls.” (Agrawal, 7:37, Ex. 1008).

Agrawal describes a two-level examination technique in which the first level provides a provisional indication of a possible intrusion and the second level provides a more definite indication of whether there is an intrusion. (Agrawal, 5:4-14, Ex. 1008). “Multiple algorithms may be executed together within a single examination level, with the individual results then analyzed to obtain a composite result or output.” (Agrawal, 7:5-8, Ex. 1008). These algorithms, or models, “can be built from multiple sets of training data.” (Agrawal, 8:51-52, Ex. 1008).

The models can be created by different computers at different times. After all, “[m]ultiple models can be built from multiple sets of training data.” Furthermore, the “detection algorithms may be executed in the same or different systems, machines or processors.” (Agrawal, 2:18-20, Ex. 1008). A person of ordinary skill in the art would recognize from these teachings that it is obvious that not all models would be created at the same time, and that the models could be created using different computers. (Goodrich Decl., ¶¶ 89-90, Ex. 1003).
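Agrawal’s parallel combination of detection algorithms can be sketched as a simple composite vote. This is a hypothetical sketch, not Agrawal’s implementation: several models, possibly built from different training sets on different machines at different times, each examine the same activity trace, and their individual results are analyzed to obtain a composite output.

```python
# Hypothetical sketch of running multiple detection algorithms in
# parallel and analyzing the individual results to obtain a composite
# output (here, a simple majority vote), per Agrawal's description.

def composite_result(detectors, trace):
    """Run every detector on the trace; report an intrusion when a
    majority of the individual algorithms flag one."""
    votes = [detect(trace) for detect in detectors]
    return sum(votes) > len(votes) / 2

# Three illustrative models, e.g. built from different training data.
saw_exec = lambda trace: "exec" in trace
saw_raw_socket = lambda trace: "raw_socket" in trace
long_trace = lambda trace: len(trace) > 4

trace = ["open", "exec", "raw_socket"]
print(composite_result([saw_exec, saw_raw_socket, long_trace], trace))  # True (2 of 3 flag it)
```

A majority vote is just one way to combine individual results; Agrawal’s composite analysis could weight or sequence the algorithms differently.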
3.   U.S. Patent No. 5,440,723 (“Arnold”)

Arnold describes an “automatic immune system for computers and computer networks.” (Arnold, Title, Ex. 1007).
