UNITED STATES PATENT AND TRADEMARK OFFICE
___________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
___________________

SYMANTEC CORPORATION
Petitioner

v.

THE TRUSTEES OF COLUMBIA UNIVERSITY
IN THE CITY OF NEW YORK
Patent Owner
___________________

CASE IPR2015-00375
Patent 8,074,115
___________________

COLUMBIA’S PATENT OWNER RESPONSE
UNDER 37 C.F.R. § 42.108

Mail Stop “PATENT BOARD”
Patent Trial and Appeal Board
U.S. Patent and Trademark Office
P.O. Box 1450
Alexandria, VA 22313-1450

3471556

IPR2015-00375
Patent 8,074,115

TABLE OF CONTENTS

I. Introduction .......... 1
II. The Invention Of The ’115 Patent .......... 2
III. Level Of Ordinary Skill In The Art .......... 4
IV. The Prior Art .......... 6
    A. U.S. Patent Publication No. 2005/0108562 (“Khazan”) .......... 6
        1. Khazan Describes A Simple List Of Function Calls .......... 7
        2. Khazan Only Tracks Win32 DLL Function Calls That Are Capable Of Being Made .......... 7
        3. Khazan Identifies A Program As Malicious Code With Absolute Certainty .......... 8
    B. U.S. Patent No. 8,108,929 (“Arnold”) .......... 8
    C. U.S. Patent No. 5,440,723 (“Agrawal”) .......... 9
V. Ground 1: Khazan Does Not Anticipate Claims 22, 25, 27-29, 32, 35-39 or 42 .......... 10
    A. Khazan Does Not Disclose “Modifying A Program To Include Indicators Of Program-Level Function Calls” .......... 10
    B. Khazan Does Not Disclose “Identifying A Function Call As Anomalous” .......... 14
        1. Khazan Does Not Disclose “Deviation From A Model of Typical Computer System Usage” .......... 14
        2. Khazan Does Not Identify “Deviation From A Model” .......... 16
    C. Khazan Does Not Disclose an “Emulator” .......... 18
    D. Khazan Does Not Disclose “The Model Reflects Normal Activity” .......... 20
    E. Khazan Does Not Disclose “The Model Reflects Attacks” .......... 22
    F. Khazan Does Not Disclose A “Model Of Function Calls” .......... 23
        1. “Model Of Function Calls” Requires The Model To Be Created Using A Learning Algorithm .......... 23
        2. Under Patent Owner’s Construction, Khazan Does Not Disclose A “Model Of Function Calls” .......... 27
VI. Ground 2: Claims 1, 4-8, 11, 14-18, 21, Or 26 Are Not Obvious Under The Combination Of Khazan And Arnold .......... 30
    A. The Combination Of Khazan And Arnold Does Not Disclose An “Application Community” .......... 30
    B. The Combination Of Khazan And Arnold Does Not Disclose, “Upon Identifying The Anomalous Function Call,” Notifying Any Computer “Of The Anomalous Function Call” .......... 33
    C. Khazan And Arnold May Not Be Combined To Render The Claims Obvious .......... 35
VII. Ground 3: Claims 2-3, 9-10, 12-13, 19-20, 23-24, 30-31, 33-34, Or 40-41 Are Not Obvious Under The Combination Of Khazan, Arnold, And Agrawal .......... 40
    A. The Combination Of Khazan, Arnold, And Agrawal Does Not Disclose “Randomly Selecting The Model To Be Used In The Comparison From A Plurality Of Different Models Relating To The Program” .......... 40
    B. The Combination Of Khazan, Arnold, And Agrawal Does Not Disclose “Randomly Selecting A Portion Of The Model To Be Used In The Comparison” .......... 45
    C. The Combination Of Khazan, Arnold, And Agrawal Does Not Disclose “Creating A Combined Model” .......... 46
    D. The Combination Of Khazan, Arnold, And Agrawal Does Not Disclose “Creating A Combined Model From At Least Two Models Created Using Different Computers” .......... 48
        1. Agrawal Does Not Disclose The Limitation .......... 48
        2. A Combination Of Khazan, Arnold, and Agrawal Would Not Include The Limitation .......... 50
    E. The Combination Of Khazan, Arnold, And Agrawal Does Not Disclose “Creating A Combined Model From At Least Two Models Created At Different Times” .......... 52
        1. Agrawal Does Not Disclose The Limitation .......... 52
        2. A Combination Of Khazan, Arnold, And Agrawal Would Not Include The Limitation .......... 55
    F. Khazan, Arnold, and Agrawal May Not Be Combined to Render the Challenged Claims Obvious .......... 56

TABLE OF AUTHORITIES

Cases

Amgen Inc. v. F. Hoffman-La Roche Ltd., 580 F.3d 1340 (Fed. Cir. 2009) .......... 57

Bates v. Coe, 98 U.S. 31 (1878) .......... 28

Daiichi Sankyo Co. v. Apotex, Inc., 501 F.3d 1254 (Fed. Cir. 2007) .......... 4

Endo Pharms., Inc. v. Depomed, Inc., IPR2014-00656, Paper 66 (PTAB Sept. 21, 2015) .......... 57

In re Am. Acad. Of Sci. Tech Ctr., 367 F.3d 1359 (Fed. Cir. 2004) .......... 26

In re Cuozzo Speed Techs., LLC, 793 F.3d 1268 (Fed. Cir. 2015) .......... 26

In re Omeprazole Patent Litig., 536 F.3d 1361 (Fed. Cir. 2008) .......... 22, 50

In re Robertson, 169 F.3d 743 (Fed. Cir. 1999) .......... 23

In re Suitco Surface, Inc., 603 F.3d 1255 (Fed. Cir. 2010) .......... 27

InTouch Techs., Inc. v. VGO Communs., Inc., 751 F.3d 1327 (Fed. Cir. 2014) .......... 36

KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398 (2007) .......... 35

Microsoft Corp. v. Proxyconn, Inc., 789 F.3d 1292 (Fed. Cir. 2015) .......... 26

NetMoneyIN, Inc. v. VeriSign, Inc., 545 F.3d 1359 (Fed. Cir. 2008) .......... 20

PAR Pharm., Inc. v. TWI Pharm., Inc., 773 F.3d 1186 (Fed. Cir. 2014) .......... 48, 49, 60

PharmaStem Therapeutics, Inc. v. ViaCell, Inc., 491 F.3d 1342 (Fed. Cir. 2007) .......... 36

Phillips v. AWH Corp., 415 F.3d 1303 (Fed. Cir. 2005) .......... 25

Rhone-Poulenc Agro, S.A. v. DeKalb Genetics Corp., 272 F.3d 1335 (Fed. Cir. 2001) .......... 58

Unigene Labs., Inc. v. Apotex, Inc., 655 F.3d 1352 (Fed. Cir. 2011) .......... 57

Other Authorities

MPEP § 2143 .......... 50

I. Introduction

The challenged ’115 patent is about protecting computer systems from faults and attacks. The ’115 patent describes using an emulator to monitor the function calls made by a running program. The program’s function calls are compared to a sophisticated model of function calls. The model contains insights about the program’s expected behavior that have been learned by training on a large amount of data and distilling common features from the data. If a new function call deviates far enough from the statistical baseline of normal behavior found in the model, the function call may be labeled anomalous. Neighboring computers that are running the same application—an “application community”—may be notified of the anomalous function call. The application community also assists in developing multiple models. These models may be shared and combined throughout the community to develop better protections against faults and attacks.

Petitioner’s challenge to the ’115 patent rests on one primary reference, Khazan. But Khazan describes a rudimentary approach to computer security that differs starkly from that of the ’115 patent. Khazan uses a simple, quickly compiled list of a subset of the external library functions that may be called by an application. If the runtime calls to external libraries do not exactly match the entries in the list, then the application is declared to be MC—malicious code—with 100% certainty. Because Khazan’s list only records the function calls that the application is capable of making, it does not have insight into which functions are actually called in normal usage. Therefore, Khazan does not disclose several critical elements of the ’115 patent. To remedy some admitted deficiencies, Petitioner turns to two other references, Arnold and Agrawal. However, neither reference discloses the elements for which it is cited, and neither may be appropriately combined with Khazan. All claims of the ’115 patent remain novel and nonobvious. None should be canceled in this trial.
II. The Invention Of The ’115 Patent

The ’115 patent, entitled “Methods, Media and Systems for Detecting Anomalous Program Executions,” resulted from the joint efforts of Professors Salvatore Stolfo and Angelos Keromytis at Columbia University’s Intrusion Detection Systems Laboratory and Network Security Laboratory. The patent relates to improved computer security techniques. Ex. 1001 at 1:18-19. Certain activity—function calls made by running programs—can be a leading indicator of intrusions or attacks. Id. at 3:64-4:5 (“[T]he application of an anomaly detector to function calls can enable rapid detection of malicious program executions, such that it is possible to mitigate against such faults or attacks . . . .”).

A model of function calls is used to establish a baseline for program behavior and to detect anomalous program executions. Id. at 3:16-19. The ’115 patent teaches methods to capture, isolate, and repair the harm caused by those anomalies. Id. at 3:28-4:5. And it provides techniques for improving overall security by combining models created on different computers or at different times, and notifying the application community of specific information regarding anomalous functions. Id. at 8:9-49, 18:44-62.

The model of function calls is developed using a learning algorithm with a training phase to develop a statistical profile of function calls. Id. at 3:16-19; 4:6-11; 5:8-21; 7:5-8:8. After the model has been created in a learning phase, the model of function calls can be applied to inspect running programs to identify anomalous function calls. Id. at 3:52-56. This second phase is conducted using emulation, which allows for “enhanced detection of some types of attacks,” as well as “enhanced reaction mechanisms.” Id. at 9:35-40; 3:37-45; 3:56-62.

Once an anomalous function call is detected on one computer, that computer can notify an application community of the anomalous function call. Providing information about the anomalous function call to members of the application community allows the notified computers to “isolate the portion of the code that caused the fault.” Id. at 18:44-62. Further, multiple models can be used to make the system more robust. For example, a model of function call executions can be combined with other models created at different times or using different computers. Id. at 8:9-31; 6:31-47 (describing how these combined models allow distribution of computer workload and reduce the effects of “concept drift”).
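For illustration only, the two-phase approach described above — a training phase that learns a statistical baseline of function calls, followed by a detection phase that flags deviations — can be sketched as follows. The function names, traces, and 1% threshold are hypothetical assumptions for illustration, not taken from the ’115 patent.

```python
# Hypothetical sketch: train a statistical profile of function calls,
# then flag calls that deviate from the learned baseline as anomalous.
from collections import Counter

def train_model(training_traces):
    """Learning phase: compute relative frequencies of function calls
    observed across many training runs of the program."""
    counts = Counter(call for trace in training_traces for call in trace)
    total = sum(counts.values())
    return {call: n / total for call, n in counts.items()}

def is_anomalous(model, call, threshold=0.01):
    """Detection phase: a call is anomalous if its learned baseline
    frequency falls below the threshold (unseen calls score 0)."""
    return model.get(call, 0.0) < threshold

traces = [["open", "read", "close"], ["open", "read", "read", "close"]]
model = train_model(traces)
print(is_anomalous(model, "exec_shell"))  # unseen call -> True
print(is_anomalous(model, "read"))        # common call -> False
```

The key feature, as the brief emphasizes, is that the baseline is learned from observed behavior, so the verdict is a matter of statistical deviation rather than an exact-match rule.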
III. Level Of Ordinary Skill In The Art

As explained by Patent Owner’s expert Prof. George Cybenko, a POSITA at the time of the invention would have at least an undergraduate degree in computer science or mathematics and one to two years of experience in the field of computer security. Ex. 2030 at ¶¶ 24-28. Prof. Cybenko’s definition is comparable to that proposed by Petitioner’s experts in the related district court action. Ex. 2003 at ¶ 94 (defining a POSITA as one with “a Master’s degree in computer science, computer engineering, or a similar field, or a Bachelor’s degree in computer science, computer engineering, or a similar field, with approximately two years of industry experience relating to computer security.”); Ex. 2004 at ¶ 18 (same).

However, Petitioner and its expert argue that a POSITA must have both a Master’s degree and two to three years of relevant experience. Ex. 1003 at ¶ 20. Notably, one of the inventors of the ’115 patent, Stelios Sidiroglou (now Stelios Sidiroglou-Douskos), had not yet received a Master’s degree at the time of the invention. Ex. 2032 (listing M.Phil. degree received from Columbia University in 2006). Requiring a POSITA to have a Master’s degree in addition to two to three years of relevant experience would place the level of skill in the art significantly higher than the educational level of an inventor of the ’115 patent itself. Daiichi Sankyo Co. v. Apotex, Inc., 501 F.3d 1254, 1256 (Fed. Cir. 2007) (listing “the educational level of the inventor” as the first factor to consider “in determining level of ordinary skill in the art”).

Neither the Petition nor Petitioner’s expert provided any rationale for the proposed elevated level of ordinary skill. However, at deposition, Petitioner’s expert advanced a new argument that a particular formula in the ’115 patent was the entire basis for requiring a Master’s degree, and that, apart from that single formula, the appropriate level of skill was actually that proposed by Patent Owner. Ex. 2029 at 240:12-241:15; 254:16-255:6; 255:15-19 (“[T]he rest of the ’115 patent as well as the ’322 patent of corresponding parts would only require a Bachelor’s degree in computer science or an equivalent plus one to two years of experience in computer security . . . .”). Petitioner’s expert testified that there was a typographical error in the formula at 4:59-62 in the ’115 patent, and that it would be hard “for an undergraduate to understand how to correct that formula as well as to even understand the formula” because the formula was not typically taught to undergraduate students. Id. at 257:23-258:6. Petitioner’s expert further testified that a reader would need to look at a prior art paper by Friedman and Singer that had the correct formula to be able to correct the typo. Id. at 45:17-19 (“So somebody would need to know that they need to go to Friedman & Singer and find this error and be able to correct it.”).

This analysis is suspect for a number of reasons. First, it hinges on one formula that is an aspect of the PAD algorithm, and the PAD algorithm itself (as opposed to learning algorithms in general) is not a requirement of any claim. See generally Ex. 1001 at cls. 1-42. Second, immediately following the formula, the patent notes that the value can be calculated a different way. Id. at 4:66-5:7 (“Because this computation of C can be time consuming, C can also be calculated by . . . .”). This alternative formula is a sufficient substitute for the one with the error. Ex. 2030 at ¶ 29. Third, the corrected formula is already found in the provisional application, Ex. 2028 at 31, which was incorporated by reference into the specification. Ex. 1001 at 1:8-14. But Dr. Goodrich did not even read the provisional application. Ex. 2029 at 86:7-8 (“Q Did you read the provisional applications? A Just sitting here today, I don’t recall.”). Prof. Cybenko also discusses the complexity of the gamma function and the Friedman and Singer paper and concludes that both are comprehensible by undergraduates. Ex. 2030 at ¶ 29.

Therefore, Dr. Goodrich’s obviousness analysis should be given little, if any, weight, because his conclusions apply the wrong standard and require a POSITA to have an unduly high level of education. Indeed, Dr. Goodrich’s misstep on the level of ordinary skill indicates his opinion should receive close scrutiny.
IV. The Prior Art

A. U.S. Patent Publication No. 2005/0108562 (“Khazan”)

Khazan describes a way to detect malicious code in a software program by a two-step process: first static analysis and then dynamic analysis. Ex. 1010 at Abstract. Before the program is executed, the static analysis step identifies the instructions that call into well-known Win32 APIs and saves their target and invocation locations in a list. Id. at Abstract; ¶¶ 42, 45, 47, 48, 99; Figs. 4A, 12-14. When the program is executed, the dynamic analysis step examines the Win32 API calls that are made and validates each call against the list that was generated during the preprocessing static analysis step. Id. at Abstract; ¶¶ 78, 80, 82-83, 94, 99; Figs. 4B, 6, 7, 9. If a mismatch occurs, the system blocks the API call. Id. at ¶¶ 85, 94, 99; Ex. 2030 at ¶¶ 48-50.
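For illustration only, Khazan’s two-step scheme as characterized above reduces to: record, before execution, the call sites the executable is capable of making; then exactly match each runtime call against that list, blocking any mismatch outright. The sketch below is a hypothetical simplification, not Khazan’s binary-level implementation; the addresses and API names are invented.

```python
# Hypothetical sketch of Khazan's scheme: a static list of permitted
# (invocation location, target) pairs, and an exact-match runtime check.

def static_analysis(executable_call_sites):
    """'Model' construction: simply a list of the call sites found by
    static examination of the (unexecuted) executable code."""
    return set(executable_call_sites)

def dynamic_check(allowed, invocation_location, target):
    """Runtime validation: any mismatch with the static list means the
    call is blocked as malicious -- no probability is estimated."""
    if (invocation_location, target) in allowed:
        return "allow"
    return "block"

allowed = static_analysis([(0x401000, "CreateFileW"), (0x401020, "ReadFile")])
print(dynamic_check(allowed, 0x401000, "CreateFileW"))  # allow
print(dynamic_check(allowed, 0x401000, "WinExec"))      # block
```

Note the contrast with the ’115 patent’s approach: nothing here is learned from observed behavior, and the verdict is binary rather than a measure of deviation.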
1. Khazan Describes A Simple List Of Function Calls

Khazan uses “static analysis of a binary form of the application” to build what it terms a “model.” Ex. 1010 at Abstract. Khazan’s “model” is “comprised of a list of calls to targets, their invocation and target locations, and possibly other call-related information.” Id.; Ex. 2030 at ¶¶ 41-45. As described in Fig. 4A, the “model” (i.e., the objects labeled 106 or 112 in Fig. 4A) is simply a “list of target and invocation locations.” Ex. 1010 at Fig. 4A. Each and every example of a list used by Khazan is simply a different way of associating targets with invocation locations. Id. at ¶¶ 102-04, Figs. 10-12.
2. Khazan Only Tracks Win32 DLL Function Calls That Are Capable Of Being Made

Only “predetermined” functions are analyzed during static analysis and added to the list. Id. at ¶ 43. In other words, the static analyzer needs to know about the functions before it is capable of adding them to the list. The predetermined functions are also a subset of the external libraries used by the application—the well-known Win32 libraries that provide access to operating system functions. Id. at ¶ 42 (“These functions may represent the set of Win32 Application Programming Interfaces (APIs) as known to those of ordinary skill in the art in connection with the Windows operating system by Microsoft Corporation.”); Ex. 2031 at ¶ 11; Ex. 2030 at ¶ 52. These predetermined external function calls are simply the calls that the program is capable of making, as shown by inspecting its executable code. Ex. 1010 at ¶ 48; Ex. 2031 at ¶ 15. Khazan’s static analyzer does not record information about how the functions are used at runtime. Ex. 1010 at ¶¶ 78, 86 (run time stack used by dynamic analyzer to compare to the list, but not by the static analyzer to build the list); Ex. 2031 at ¶ 15.
3. Khazan Identifies A Program As Malicious Code With Absolute Certainty

If a program’s function calls do not exactly match the information in the list that was compiled by the static analyzer, the program is flagged as malicious. Ex. 1010 at ¶¶ 43, 65; Ex. 2031 at ¶ 16. The validation procedure is a simple comparison of new function calls with what was noted on the static analysis list. Ex. 1010 at ¶¶ 67, 85. Khazan does not estimate a probability of maliciousness, or a probability of anything at all. Ex. 2031 at ¶ 16; Ex. 2030 at ¶ 49.
B. U.S. Patent No. 8,108,929 (“Arnold”)

Arnold discloses a system for identification of a “virus, worm, or Trojan Horse, within a data processing system,” which may employ anomaly detection. Ex. 1007 at 3:3-6; 4:60-5:26. After the virus or worm is detected, a signature can be computed automatically if one does not already exist. Id. at 2:6-62; 3:6-12; 5:28-68; 7:10-20:23 (more than 16 columns of specification devoted to signature generation techniques). Then “neighboring data processing systems on a network [may be informed] of an occurrence of the undesirable software entity . . . .” Id. at 2:66-68; 19:45-20:68. This is accomplished via a “kill signal” that “is transmitted from an infected computer to neighboring computers on the network.” Id. at 19:48-52. The purpose of the kill signal is to ensure “that other computers on the network are alerted to the presence of an anomaly.” Id. at 20:8-11.
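For illustration only, the notification flow attributed to Arnold above — detect an undesirable software entity, record its signature, and transmit a “kill signal” to neighboring machines — can be sketched as below. The class, method, and signature names are hypothetical, not drawn from Arnold.

```python
# Hypothetical sketch of a kill-signal broadcast among networked machines.

class Machine:
    def __init__(self, name):
        self.name = name
        self.neighbors = []          # other Machine objects on the network
        self.known_signatures = set()

    def detect_and_alert(self, signature):
        """On detecting an undesirable entity, record its signature and
        send a kill signal to every neighboring machine."""
        self.known_signatures.add(signature)
        for neighbor in self.neighbors:
            neighbor.receive_kill_signal(signature)

    def receive_kill_signal(self, signature):
        """A neighbor is alerted to the anomaly and stores the signature."""
        self.known_signatures.add(signature)

a, b = Machine("a"), Machine("b")
a.neighbors.append(b)
a.detect_and_alert("worm-sig-1")
print("worm-sig-1" in b.known_signatures)  # True
```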
C. U.S. Patent No. 5,440,723 (“Agrawal”)

Agrawal is directed to processing “observations” of computer activity through multiple, tiered “detection algorithms.” Agrawal at Abstract. In particular, the different tiers generally utilize distinct algorithms, each with different speed and reliability characteristics. Id. at 1:56-60 (“By cascading detectors, fast algorithms are used to weed out normal behavior very quickly, while more complex algorithms are run preferably only when there is a good possibility that something is wrong.”); 2:43-45 (“Preferably, the first level detection algorithm has a computational-efficiency that is greater than a computational-efficiency of the second level detection algorithm.”). Separate algorithms may be executed in a serial “cascade manner.” Id. at 3:45-64. Multiple separate algorithms may also run in parallel if the algorithms are directed to assessing different characteristics of the data. Id. at 3:65-4:8; 7:18-23.
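For illustration only, the serial “cascade manner” described above can be sketched as follows: a cheap first-level detector weeds out clearly normal observations, and the more expensive second-level detector runs only when the first flags a possible problem. The detectors and thresholds below are invented stand-ins, not Agrawal’s algorithms.

```python
# Hypothetical sketch of cascaded detection: fast screen first, slow
# (more reliable) analysis only on observations the screen flags.

def fast_detector(observation):
    """Cheap first-level screen; True means 'possibly abnormal'."""
    return observation > 100

def slow_detector(observation):
    """Expensive, more reliable second-level check; True means 'abnormal'."""
    return observation > 150

def cascade(observation):
    if not fast_detector(observation):
        return "normal"              # weeded out quickly, slow check skipped
    return "abnormal" if slow_detector(observation) else "normal"

print(cascade(50))    # normal (fast path only)
print(cascade(120))   # normal (escalated, then cleared)
print(cascade(200))   # abnormal
```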
V. Ground 1: Khazan Does Not Anticipate Claims 22, 25, 27-29, 32, 35-39 or 42

A. Khazan Does Not Disclose “Modifying A Program To Include Indicators Of Program-Level Function Calls”

All claims in this ground require the limitation “modifying a program to include indicators of program-level function calls.” Petitioner gives the example of modifying the program’s binary or source code to include indicators of what function calls are being made. Pet. 9 (citing Ex. 1001 at 3:28-37); Ex. 1003 at ¶ 59. Patent Owner agrees that the claims require that the program itself—as opposed to other software components or libraries that the program interacts with—must be modified to include the indicators. Ex. 2030 at ¶ 114.
Petitioner identifies paragraphs 73, 75, and 78 of Khazan as disclosing modification of a program to execute additional monitoring code that indicates which of the program’s functions are being called. Pet. 21; Ex. 1003 at ¶ 83. Although these paragraphs are cited, Petitioner has not identified any teaching in Khazan that the instrumentation technique modifies the program itself. To the contrary, Khazan teaches that other software components—not the program—are modified to intercept function calls. Ex. 1010 at ¶ 73 (“At step 126, the associated libraries, such as all operating system DLLs, are instrumented to intercept calls to a predetermined set of target functions at run time.” (emphasis added)).

In particular, Figure 7 shows how the instrumentation technique transfers control from the program when it makes a call to an external Win32 API function. Id. at Fig. 7 and ¶ 82. As the program 102 is running, control is transferred to the target function, then to the wrapper function, then to the trampoline function, and then back to the target function. Id.; Ex. 2030 at ¶ 117. When a call is made to a target function, control is transferred to instructions inside the external library, which provide a jump instruction to the wrapper function. Ex. 1010 at ¶ 83. The wrapper function verifies the call by comparing it with a list of calls generated by the static analysis. Id. at ¶¶ 84-85. After the call has been verified, control is transferred to the trampoline function, which then returns to the library to execute the call. Id. at ¶ 88. As a result, the program is not modified, but instead loses control to other software components that are not part of the program. Ex. 2030 at ¶ 117; Ex. 2031 at ¶ 12. Instead, it is the DLL that is modified to add the required wrapper and stub code. Ex. 1010 at Fig. 7.
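For illustration only, the point above — that interception is achieved by modifying the library side, while the calling program’s own code is left untouched — can be loosely analogized in a high-level sketch. This is a hypothetical analogy: Khazan instruments Win32 DLLs at the binary level, whereas the sketch rebinds a library function at the language level; `allowed_targets` and the function names are invented.

```python
# Hypothetical analogy: the library binding is replaced by a validating
# wrapper; the "program" that calls the library is never modified.
import math

allowed_targets = {"sqrt"}          # stand-in for the static analysis list
original_sqrt = math.sqrt           # the "trampoline" back to the real target

def wrapped_sqrt(x):
    """Wrapper: validate the call against the list, then pass through."""
    if "sqrt" not in allowed_targets:
        raise PermissionError("call blocked")
    return original_sqrt(x)

math.sqrt = wrapped_sqrt            # modification happens on the library side

def caller_program():
    """Unmodified 'program' code: its library call is intercepted anyway."""
    return math.sqrt(9.0)

print(caller_program())  # 3.0
```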
Petitioner’s expert raised a new argument at deposition that the external DLL code and Khazan’s wrapper code were “part of the application.” Ex. 2029 at 112:17-19 (“[P]aragraph 80 itself is talking about the wrapper’s DLL and it is referring to it as part of the application.”). This argument makes no sense. First, Petitioner’s expert conceded that the program’s code was not modified. Id. at 116:23-117:4 (“I don’t see application dot EXE that particular file as being modified. Instead I see this modification happening in this wrapper dot DLL function . . . .”). Second, Petitioner’s expert relies on the description of the program’s “address space” in paragraph 80 to support his interpretation. Id. at 111:11-22. A program’s address space is the section of system RAM that has been allocated to that application by the operating system. Ex. 2030 at ¶ 121. It is not the program itself. Id. at ¶¶ 121-22. The ’115 patent indicates that the “program’s binary or source code” is modified, and neither of those is an external component that happens to be loaded into the program’s address space. Id. at ¶¶ 114-16; Ex. 1001 at 3:28-37; Ex. 2024 (definition of “program” concurs). In fact, Patent Owner specifically cited this disclosure of how a program would be modified to the PTO to distinguish a prior art reference during prosecution. Ex. 1002 at 111-12. Third, Khazan itself distinguishes between “the application executable” and its “address space.” Ex. 1010 at ¶ 80 (“In this example, the application executable may be located in the first portion of its address space.” (emphasis added)).
Further, Petitioner has not explained how functions that are internal to the program (i.e., not in libraries) would be tracked, or how a program would be modified to add indicators for them. Khazan extensively discusses particular methods for intercepting and monitoring Win32 API calls. Ex. 1010 at Fig. 7; ¶¶ 79-100. Khazan consistently states that its system monitors external function calls, with no discussion of monitoring function calls internal to an application. Ex. 1010 at ¶ 48 (explaining that “the calls determined by static analyzer 104 are the Win32 APIs which are predetermined subset [sic] of externally called functions or routines”); id. at ¶ 73 (“The target functions are assumed to be external to the application.”); id. at ¶ 82 (“The external call is intercepted using the instrumentation techniques described herein.”); Ex. 2031 at ¶ 12 (named inventor on Khazan agrees).

Prof. Cybenko concurs that a POSITA reading Khazan would understand that reference to disclose function calls made to well-known Win32 libraries and not to functions entirely within the application. Ex. 2030 at ¶¶ 123-25. Indeed, the consistent emphasis in Khazan on Win32 API calls makes sense because those are the function calls that interact with the operating system and that can harm the computer. Id. at ¶ 125; Ex. 2031 at ¶ 11. By contrast, internal function calls by definition do not interact with the operating system, and so in most cases are not indicative of malicious behavior. Ex. 2030 at ¶ 125.
As such, Khazan’s instrumentation techniques for intercepting the runtime function calls made by an application are only applicable to external libraries and cannot be used with internal function calls, because the methods of representing those function calls are significantly different. Ex. 2030 at ¶¶ 114-17, 123-25. Khazan provides no guidance to a POSITA on why or how to implement a system that modifies a program to track internal function calls.
B. Khazan Does Not Disclose “Identifying A Function Call As Anomalous”

All claims in this ground require the limitation “identifying/identifies a function call corresponding to the at least one of the indicators as anomalous based on the comparison,” which Khazan does not disclose. The Board adopted the construction of “anomalous” as “deviation/deviating from a model of typical computer system usage.” Institution Decision at 6. Khazan does not meet this limitation under the Board’s construction for two reasons.

1. Khazan Does Not Disclose “Deviation From A Model of Typical Computer System Usage”

Khazan does not model computer system usage.1 Khazan consistently describes doing something different—collecting information regarding function target and invocation locations using “static analysis,” which is analysis “by static examination of code without execution.” Ex. 1010 at Abstract (“A model is constructed using a static analysis . . . .”); id. at ¶ 43 (“Static analysis processing as described herein may be characterized as identifying information about code by static examination of code without execution.”). Thus, Khazan does not teach constructing its baseline list of targets and invocation locations by examining computer system usage, but rather by looking statically at unexecuted code. The information collected by Khazan’s static analyzer is simply a list of the Win32 API function calls that could be made by the application. Ex. 2030 at ¶¶ 98-104. This does not represent computer system usage because a program often calls only a subset of those function calls when it executes. Id. Further, the list created by Khazan does not reflect what parts of the program are normally activated by a user or are activated in response to other input stimulus. Ex. 2031 at ¶ 15.

1 Notably, Petitioner proposed including “computer system usage” in the construction, and the Board included that term even though it disagreed with other aspects of Petitioner’s proposed construction. Institution Decision at 6.
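For illustration only, the distinction drawn above — between the calls a program is capable of making (everything present in its code) and how the program is actually used (the calls observed at runtime) — can be sketched as below. The API names and observed runs are invented for illustration.

```python
# Hypothetical sketch: a static capability list is not a usage model,
# because runtime behavior typically exercises only a subset of it.

capable_calls = {"CreateFileW", "ReadFile", "WriteFile",
                 "DeleteFileW", "WinExec"}      # found by static inspection

observed_runs = [                                # what users actually trigger
    ["CreateFileW", "ReadFile"],
    ["CreateFileW", "ReadFile", "ReadFile"],
]
actually_used = {call for run in observed_runs for call in run}

# Calls on the static list that normal usage never exercises:
never_exercised = capable_calls - actually_used
print(sorted(never_exercised))
```

A static list treats every capable call identically, so it carries no information about which calls typify normal usage — the gap the brief identifies between Khazan’s list and a model of typical computer system usage.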
At deposition, Petitioner’s expert did not even believe that “computer system usage” was a claim requirement, even though he had advocated for the inclusion of that concept in the construction of “anomalous.” Ex. 2029 at 332:21-333:1 (“Is that a claim term, or is it just – you’re just asking it completely divorced from the ’115 patent
