IPR2015-00375
Patent No. 8,074,115

UNITED STATES PATENT AND TRADEMARK OFFICE
_____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
_____________

SYMANTEC CORPORATION,
Petitioner,

v.

THE TRUSTEES OF COLUMBIA UNIVERSITY
IN THE CITY OF NEW YORK,
Patent Owner.
_____________

Case IPR2015-00375
Patent 8,074,115
_____________

SUPPLEMENTAL DECLARATION OF DR. MICHAEL T. GOODRICH
_____________

SYMC 1015
Symantec v. Columbia
IPR2015-00375
I, Michael T. Goodrich, Ph.D., declare as follows:

1. I have been asked by Symantec Corporation (“Petitioner”) to provide this supplemental declaration with my expert opinions in support of the above-captioned inter partes review of U.S. Patent No. 8,074,115 (the “’115 patent”).

2. The purpose of this declaration is to clarify my opinions related to issues raised in my deposition for the above-captioned IPR, and to respond to issues raised by the Patent Owner.

3. In addition to the documents mentioned in my earlier declaration, I reviewed the following documents:

a. Columbia’s Patent Owner Response for the above-captioned inter partes review of the ’115 patent (“Response”);

b. Declaration of George Cybenko, Ph.D., in Support of Columbia’s Patent Owner Response (Ex. 2030);

c. Declaration of Scott M. Lewandowski (Ex. 2031);

d. Transcript of Deposition of Scott M. Lewandowski, December 4, 2015 (Ex. 1013);

e. Transcript of Expert Deposition of George Cybenko, Ph.D., December 10, 2015 (Ex. 1014); and

f. Galen Hunt, et al., “Detours: Binary Interception of Win32 Functions,” Proceedings of the 3rd USENIX Windows NT Symposium, Seattle, WA, July 1999 (Ex. 1016).
4. I currently hold the opinions set forth in this declaration.

5. I understand that the Patent Owner in its Response for IPR2015-00375, and through its expert, Dr. Cybenko, states that a person of ordinary skill in the art (POSITA) at the time of the invention of the ’115 patent would have at least an undergraduate degree in computer science or mathematics and one to two years of experience in the field of computer security. Response at 4; Ex. 2030 at ¶¶ 24-28. Patent Owner contrasts this level of skill with my previously stated opinion that the level of ordinary skill in the art of the ʼ115 patent at the time of the effective filing date is a person with a Master’s degree in computer science or a related field with two to three years of experience in the field of software security systems. See Ex. 1003 at ¶ 20.
6. The primary distinction between Dr. Cybenko’s stated opinion and mine is whether the person of ordinary skill has a Master’s degree or an undergraduate degree. See Ex. 1014 at 25:1-9. A typical Master’s degree in computer science or a related field requires one to two years of study. Thus, the difference in the levels of ordinary skill opined by me and Dr. Cybenko is as little as one to two years of schooling or experience in the field.
7. In my opinion, this one- to two-year difference is not material to the understanding of the technologies and concepts expressed in the ’115 patent and its claims. Accordingly, the opinions I previously expressed using the higher level of skill in the art still hold if Dr. Cybenko’s asserted lower level of ordinary skill in the art is used. Hence, my opinions with respect to the claims of the ’115 patent do not change if Patent Owner’s asserted level of ordinary skill in the art is adopted by the Board. See Ex. 2029 at 312:15-22 (“all of my conclusions still hold with [Patent Owner’s] definition of a person of ordinary skill”).
8. Khazan discloses a model of typical computer system usage. Khazan in its Background section recognizes that “[a]nomaly detection approaches use a model or definition of what is expected or normal with respect to a particular application and then look for deviations from this model.” Ex. 1010 at ¶ 8. Here, Khazan is using “expected or normal with respect to a particular application” to refer to “typical computer system usage.” Khazan then uses consistent terminology when describing its own model. Specifically, Khazan says that the model produced by its static analyzer comprises “the identified calls, their locations within the program, and other call related information.” Ex. 1010 at ¶ 114. The model is used to “distinguish between normal or expected behavior of code and the behavior produced by MC [malicious code]…If the run time behavior deviates from the application model, it is determined that the application executable has executed MC.” Id. at ¶ 65. Khazan’s application model thus describes “normal or expected behavior of the code,” which is the behavior one would expect the code to follow during typical computer system usage. See id. at ¶ 67 (“Normal behavior, or non-MC behavior, is associated with particular target function calls identified by the static analyzer 104.”). Because the model describes “normal or expected behavior of the code” (i.e., typical computer system usage), Khazan is able to use the model to determine that the application executable has executed malicious code “[i]f the run time behavior deviates from the application model.” Ex. 1010 at ¶ 65.
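For illustration only, the anomaly-detection scheme described in this paragraph can be sketched in a few lines of Python. This is not Khazan's actual implementation; the call locations and function names are hypothetical stand-ins for "the identified calls, their locations within the program, and other call related information."

```python
# Hypothetical static "application model": call location -> expected function.
application_model = {
    0x401000: "CreateFileW",
    0x401050: "ReadFile",
    0x4010A0: "CloseHandle",
}

def check_runtime_behavior(observed_calls):
    """Return observed calls whose runtime behavior deviates from the model."""
    deviations = []
    for location, function in observed_calls:
        expected = application_model.get(location)
        if expected != function:
            # Either an unmodeled call site or an unexpected function:
            # the run-time behavior deviates from "normal or expected behavior."
            deviations.append((location, function))
    return deviations

# A normal run matches the model, so no deviations are reported.
normal = [(0x401000, "CreateFileW"), (0x401050, "ReadFile")]
# An anomalous run includes an injected call absent from the static model.
anomalous = normal + [(0x402000, "WinExec")]

assert check_runtime_behavior(normal) == []
assert check_runtime_behavior(anomalous) == [(0x402000, "WinExec")]
```

The sketch captures the structure of the argument above: because the model encodes expected behavior, any deviation from it can be treated as evidence of executed malicious code.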
9. Khazan discloses “executing at least part of a program in an emulator,” where the “emulator” permits both monitoring and selective execution of certain parts, or all, of the program. As I discussed in my earlier declaration, Khazan discloses that the dynamic analysis may be emulated or simulated. Ex. 1003 at ¶ 66; Ex. 1010 at ¶¶ 110-112 (“An execution of the application may also be emulated or simulated.”). The emulation described in Khazan is performed by an emulator. This emulator permits both monitoring and selective execution of certain parts, or all, of the program.
10. I understand that Patent Owner argues, and Dr. Cybenko testifies, that the “emulated” execution in Khazan would not permit monitoring and selective execution. Response at 18-20; Ex. 2030 at ¶¶ 131-53. Dr. Cybenko states that “one possible meaning of the term ‘emulate’ in general computing…is ‘to imitate the functions of (another computer system) by means of software.’” Ex. 2030 at ¶ 138 (quoting Ex. 2042 at 3). While I generally agree with this definition, the conclusions Dr. Cybenko draws from it are incorrect. Dr. Cybenko says that “a POSITA would take the meaning of ‘emulated’ and ‘emulation’ as the terms are used in Khazan to mean that these executions can be performed on non-native hardware using a technology such as Microsoft’s VirtualPC.” Ex. 2030 at ¶ 142. Even assuming this statement is correct, Dr. Cybenko does not explain why performing executions on non-native hardware does not permit monitoring and selective execution. Rather, he merely states that “[t]he technology described in Khazan operates at an entirely different level than instruction-level emulation does, and does not provide monitoring and selective execution capability.” Ex. 2030 at ¶ 146. But “monitoring and selective execution” are integral to the basic function of an emulator, and these capabilities would be present in an emulator that performs execution on non-native hardware.
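The point that monitoring and selective execution are integral to an emulator can be illustrated with a minimal, purely hypothetical interpreter loop: every guest instruction passes through the emulator's dispatch loop, so the emulator necessarily observes each instruction and can decline to execute any of them. The toy instruction set below is an assumption for illustration, not anything from Khazan.

```python
def emulate(program, blocked_ops=()):
    """Interpret a list of (op, arg) pairs; skip any op listed in blocked_ops."""
    state = {"acc": 0}
    trace = []                      # monitoring: a log of every instruction seen
    for op, arg in program:
        trace.append((op, arg))     # the emulator sees each instruction first...
        if op in blocked_ops:
            continue                # ...and may selectively decline to execute it
        if op == "load":
            state["acc"] = arg
        elif op == "add":
            state["acc"] += arg
        elif op == "syscall":
            state.setdefault("calls", []).append(arg)
    return state, trace

program = [("load", 1), ("add", 2), ("syscall", "write")]

# Full execution: all three instructions run.
state, trace = emulate(program)
assert state["acc"] == 3 and state["calls"] == ["write"]

# Selective execution: the same program with syscalls suppressed; the
# instruction is still monitored (it appears in the trace) but not executed.
state, trace = emulate(program, blocked_ops=("syscall",))
assert "calls" not in state and ("syscall", "write") in trace
```

Because execution is mediated by the dispatch loop, monitoring comes for free and suppressing individual operations requires only a single check, which is the "sandbox" property discussed in the following paragraphs.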
11. Moreover, Dr. Cybenko’s leap of logic in assuming Khazan is referencing execution performed on non-native hardware is unfounded. The provided definition of “emulate” refers to imitating the functions of another computer system, not necessarily of a different computer system (i.e., not necessarily on non-native hardware). I therefore disagree with Dr. Cybenko’s position that a POSITA would conclude that Khazan is referencing non-native hardware or technology like Microsoft’s VirtualPC.
12. Instead, a POSITA reading Khazan’s statement that “[a]n execution of the application may also be emulated or simulated” would likely realize that Khazan was referencing using an emulator to provide computer security. Specifically, a POSITA would recognize that use of an emulator provides “sandbox” protection that prevents malicious code in an emulated application from escaping the sandbox and infecting the rest of the computer system. Mr. Lewandowski seems to concur with this interpretation. Ex. 2031 at ¶ 18 (saying that an emulator or virtual environment provides sandbox protection). The emulator provides the sandbox security by monitoring and selectively executing the application. As Mr. Lewandowski testified, “[a]n emulator as used at the time is typically a system that’s able to shepherd the execution of a program” where “shepherd” means “able to directly control the execution of the – of the program, often involving a translation step, but not always.” Ex. 1013 at 111:15-112:3. This “control” is the monitoring and selective execution of the program.
13. Dr. Cybenko also states that a virtual environment such as that provided by VMWare would not provide the monitoring and selective execution of a program. Ex. 2030 at ¶¶ 147-153. I disagree with his conclusion. As Dr. Cybenko states, in VMWare “[a]pplications execute within a guest operating system and are isolated from each other as well as the virtualization software that manages resources across multiple guest operating systems.” Ex. 2030 at ¶ 149. The “virtualization software” in fact provides monitoring and selective execution of a program. Patent Owner’s exhibits related to VMWare demonstrate that the virtualization software includes a “Virtual Machine Monitor” (VMM). Ex. 2043 at 5, 7, 14; Ex. 2044 at 4-6, 13. The VMM is the software “component that implements virtual machine hardware abstraction” and is “[r]esponsible for running the guest OS [operating system].” Ex. 2043 at 7. The VMM is called a “monitor” because it monitors the guest operating system and, by extension, any program executing within the guest operating system. The VMM is part of a program called a “Hypervisor,” which is the “[s]oftware responsible for hosting and managing the virtual machines.” Ex. 2043 at 7. Part of the base functionality of the Hypervisor is scheduling access to the processor and other resources of the computer system. Ex. 2043 at 5, 7. The Hypervisor is thus providing selective execution of the program by scheduling when the program can access the processor.
14. Dr. Cybenko states that VMWare implements “‘[d]irect execution of user-level code for performance’” and uses this as a basis for concluding that VMWare does not provide monitoring and selective execution of a program. Ex. 2030 at ¶ 148 (quoting Ex. 2043 at 10). Dr. Cybenko, however, overlooks the fact that the execution of the user-level code is monitored by a VMM and scheduled by a Hypervisor. His conclusions are incorrect, and Khazan’s disclosure of VMWare meets the agreed-upon BRC of “emulator.”
15. The combination of Khazan, Arnold, and Agrawal discloses “creating a combined model.” I understand that Patent Owner and Dr. Cybenko argue that the combination of these references does not disclose “creating a combined model” because “[w]hen Agrawal discusses ‘combining,’ it is in the context of combining outputs, not models…, or running different algorithms that are incompatible and could not be merged to create a unitary model…” Response at 46-47.
16. Agrawal extensively describes creation and use of multiple different “models,” a term which is used “synonymous[ly] with algorithm” therein. Ex. 1008 at 8:63. Indeed, the title of Agrawal is “Method and System for Detecting Intrusive Anomalous Use of a Software System Using Multiple Detection Algorithms.” Id. at p. 1 (emphasis added). Several of the algorithms used are shown in FIG. 3, including “fast” algorithms: maximum sensor frequency test and Markov model (shown as F1, F2), and “sophisticated” algorithms: correlation shift algorithms, ellipsoid, multi-cluster analysis, and support vector machine (shown as S1-S4). See id. at FIG. 3.
17. Agrawal describes how these models are combined. As I said during my deposition, the concept of combining models “just permeates Agrawal.” Ex. 2029 at 348:24. By this I mean that the idea of combining models is referenced throughout Agrawal’s disclosure. See, e.g., Ex. 1008 at 1:53-56 (“The present invention shows how to use a combination of analysis techniques (detectors)…”); 1:62-67 (“for example, a very precise detector for a specific kind of known assault may be combined with a general detection algorithm for divergence from previous known behavior.”); 3:6-13 (“The one or more second level detection algorithms preferably are selected from a set of mathematical models or algorithms...that include: [list of models], and combinations thereof.”); 3:33-44 (“…the step of processing the current observation or observation aggregate through at least one or more second level detection algorithms processes the current observation or observation aggregate through at least a pair of second level detection algorithms whose outputs are combined according to a given function…”); 7:37-41 (“FIG. 5 illustrates a one examination level approach where a set of four (4) algorithms are executed, with each algorithm testing against a different characteristic or feature and their outcomes are evaluated together to make a decision about a next step.”). As alluded to in the quotations above, Agrawal discloses combining the models serially (first level algorithms followed by second level algorithms) or in parallel. See id. at Abstract; 1:53-67; see also id. at 2:1-3:3, 7:4-6 (“The principles of the present invention are not limited to multi-level examinations. Multiple algorithms may be executed together within a single examination level…”).
18. Agrawal combines two models through use of a “combination function.” Ex. 2029 at 348:9-10. Agrawal specifically discusses the combination function: “…outputs are combined according to a given function to provide the second, more definite indication of the possible intrusion.” Ex. 1008 at 3:33-44. In addition, Agrawal references use of the combination function in many other places where it discusses the combination of models. See, e.g., Ex. 1008 at 1:53-56, 1:62-67, and 7:37-41. When the combination function combines the outcomes of the models, it creates a new, combined model in the sense that the resulting value can be compared with something else in order to identify an anomaly. See Ex. 2029 at 74:7-22 (where I say that a “model” in the context of the claims should be operative to perform the comparison step that comes later in the claim). Agrawal specifically says that the combination function allows for multiple algorithms to be executed, “with each algorithm testing against a different characteristic or feature, and their respective outcomes evaluated together (e.g., against a given threshold) to make a decision.” Ex. 1008 at 4:13-18. Thus, Agrawal’s use of a combination function creates a combined model.
19. Agrawal also describes a specific example in which a distinct model built with training data observed on a system running “Oracle” is combined with other models. Ex. 1008 at 8:51-54; 11:51-64. The Oracle model is selected based on a packet containing observed events. Ex. 1008 at 11:45-57. Then, Agrawal’s system combines the Oracle model with two other models, a k-means algorithm and a Markov transition model, using a linear combination function to compute an “overall distance estimate.” Ex. 1008 at 11:51-64. The overall distance estimate is a combined model created based in part on the Oracle model.
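The kind of linear combination function described here can be illustrated with a short sketch. The weights and the threshold below are assumptions chosen for illustration, not values taken from Agrawal; the point is only that two per-model anomaly scores can be merged into a single "overall distance estimate" that is itself compared against a threshold.

```python
def overall_distance(kmeans_distance, markov_distance, w1=0.5, w2=0.5):
    """Linear combination of two detectors' outputs (illustrative weights)."""
    return w1 * kmeans_distance + w2 * markov_distance

def is_anomalous(kmeans_distance, markov_distance, threshold=1.0):
    # The combined value acts as a model output in its own right:
    # it is what gets compared against a threshold to identify an anomaly.
    return overall_distance(kmeans_distance, markov_distance) > threshold

assert not is_anomalous(0.4, 0.6)   # combined score 0.5 is under the threshold
assert is_anomalous(1.8, 1.2)       # combined score 1.5 exceeds the threshold
```

This mirrors the claim-mapping argument above: the combined value is operative to perform the later comparison step, which is what makes the combination a combined model.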
20. A POSITA would find it obvious to use multiple, different computers to create the models. Some models are specific to computers running particular operating systems and/or other programs, and a POSITA would find it obvious to use different computers for the different models. Even for other types of models, a POSITA would recognize that it is obvious, and preferable, to use multiple different computers to generate the models. Using multiple computers spreads the computational overhead involved in creating the models across the computers, and allows for more models to be created in a shorter amount of time. In fact, Agrawal specifically states that “[t]he first and second level detection algorithms may be executed in the same or different systems, machines or processors.” Id. at 2:18-20.
21. In addition, a POSITA would find it obvious to create at least some of the models at different times. While a POSITA would create the models using multiple different computers, a POSITA would reuse individual computers to create multiple different models. It would be unnecessary and likely cost-prohibitive for a POSITA to obtain enough computers to generate all of the models simultaneously. Instead, it is more efficient and sensible to generate multiple models at different times using a set of computers. Hence, a POSITA would find it obvious to use multiple computers to create the models, with individual computers creating different models at different times.
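The reuse-over-time point in this paragraph can be sketched as a simple round-robin schedule: with fewer machines than models, each machine is reused across successive time slots, so models are necessarily built at different times. The machine and model names below are hypothetical.

```python
def schedule_model_builds(models, machines):
    """Round-robin models onto machines; returns (machine, time_slot, model)."""
    plan = []
    for i, model in enumerate(models):
        machine = machines[i % len(machines)]
        time_slot = i // len(machines)   # later rounds run at later times
        plan.append((machine, time_slot, model))
    return plan

models = ["markov", "k-means", "ellipsoid", "svm", "correlation-shift"]
machines = ["host-a", "host-b"]

plan = schedule_model_builds(models, machines)
# host-a is reused for three different models across time slots 0, 1, and 2:
# no single machine builds everything, and not all models are built at once.
assert [p for p in plan if p[0] == "host-a"] == [
    ("host-a", 0, "markov"),
    ("host-a", 1, "ellipsoid"),
    ("host-a", 2, "correlation-shift"),
]
```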
I hereby declare that all statements made herein of my own knowledge are true and that all statements made on information and belief are believed to be true; and further that these statements were made with the knowledge that willful false statements and the like so made are punishable by fine or imprisonment, or both, under Section 1001 of Title 18 of the United States Code.

Dated: January 7, 2016

Michael T. Goodrich, Ph.D.
