UNITED STATES PATENT AND TRADEMARK OFFICE
___________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
___________________

SYMANTEC CORPORATION
Petitioner

v.

THE TRUSTEES OF COLUMBIA UNIVERSITY
IN THE CITY OF NEW YORK
Patent Owner
___________________

CASE IPR2015-00375
Patent 8,074,115
___________________

DECLARATION OF SCOTT M. LEWANDOWSKI

Mail Stop "PATENT BOARD"
Patent Trial and Appeal Board
U.S. Patent and Trademark Office
P.O. Box 1450
Alexandria, VA 22313-1450
I. INTRODUCTION

1. My name is Scott Lewandowski. Since 2006, I have provided information technology consulting services, with a focus on cyber security, to government and commercial customers through my company, The Wynstone Group, Inc. I have served as the Chief Cyber Scientist for the U.S. Department of Defense’s National Cyber Range since 2011. From 2000 to 2006, I was a Member of Technical Staff at MIT Lincoln Laboratory. I have a Master’s degree in Computer Science, a Bachelor’s degree in Computer Science, and a Bachelor’s degree in Business Economics, all from Brown University. I am a co-inventor, along with Roger Khazan, Jesse Rabek, and Robert Cunningham, on U.S. Patent Publication No. 2005/0108562 (“Khazan”). My professional biography is attached to this declaration as Exhibit 1.
2. I have been asked by The Trustees of Columbia University in the City of New York (“Columbia”) to provide factual information relevant to two inter partes reviews in the U.S. Patent and Trademark Office, IPR2015-00375 and IPR2015-00377. I understand that in these proceedings, Petitioner Symantec Corporation (“Symantec”) has submitted my Khazan patent application publication and has alleged that Khazan invalidates Columbia’s U.S. Patent Nos. 8,074,115 (the “’115 patent”) and 8,601,322 (the “’322 patent”). I am not providing an expert opinion in this declaration, and I am not a lawyer. However, to understand how my patent application publication is being characterized by Symantec in these cases, I have reviewed certain documents, which are listed in Exhibit 2.
3. I am being compensated for my time by Columbia at the rate of $800 per hour. My compensation is not contingent upon any aspect of this testimony, the outcome of this matter, or any issues involved in or related to this matter. Other than owning index funds, which I understand may own Symantec stock, I have no financial interest in Symantec or Columbia. I have no financial interest in the ’115 patent or the ’322 patent.
4. I have had intermittent contact with named inventor Prof. Salvatore Stolfo on several occasions, such as at academic conferences. I am currently a sub-contractor on a Defense Advanced Research Projects Agency (“DARPA”) program that is part of named inventor Prof. Angelos Keromytis’ portfolio as a Program Manager at DARPA. Neither of these relationships has influenced anything that I state in this declaration.
II. BACKGROUND ON KHAZAN

5. The material in Khazan relates to my work at MIT Lincoln Laboratory in the early 2000s. At MIT Lincoln Laboratory, I worked on the Department of Defense’s (“DoD”) most pressing computer security challenges. At that time, a prominent internet threat was the computer worm. Highly publicized cases, such as ILOVEYOU and Slammer, showed the public the devastation and havoc that small, easily created attacks could cause at internet scale within a matter of minutes. The worm threat was perceived by the DoD community as a particularly acute risk to the effectiveness of warfighter and command and control systems that were essential to DoD readiness. Some of my earliest projects at the Laboratory focused on autonomically detecting and responding to propagating malware, including worms. I served as the Principal Investigator on several efforts focused on this problem, and implemented the prototype of SARA: Survivable Autonomic Response Architecture.
6. Of particular concern to the DoD were novel computer worms that exploited previously undisclosed vulnerabilities, or so-called “zero-day worms.” These worms had been eluding detection and effective response from commercial systems, and no vendor had a viable roadmap to address the challenge.
7. I came to the realization that a simple, stop-gap response to the zero-day worm crisis was to detect code running on a computer system that was not authorized to run there. Identifying such malicious binaries on disk was easy; hashing and other techniques could be readily applied. Many worms, however, posed a unique challenge in that they were dynamically injected into processes running on a computer system.
8. I eventually came to the realization that the presence of unauthorized code, in and of itself, was not the primary risk. Rather, the most significant risk arose when that code interacted with the operating system in a way that could compromise system integrity. I wondered if we could build a system that would detect when this was happening, and alert that unknown – and thus presumably malicious – code was executing, thus stopping zero-day worms immediately, and without reliance on other hosts. This idea was in sharp contrast with other research at the time, which focused on using either community-wide host-level behavior or network activity to detect worms. The system that I was thinking of would be capable of stopping worms without communicating with any other computers.
9. Having conducted my thesis research on interception of API calls, I realized that API interception could be the perfect technique for implementing my idea. By monitoring critical APIs and matching the origin of each call against known good caller locations, newly introduced calling locations – i.e., dynamically injected worm code, or other code not previously identified as a known good caller location – could be easily identified without the false positive challenges inherent to most probabilistic detection approaches.
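The core of this check can be sketched in a few lines of C. The sketch below is illustrative only: the type and function names are assumptions made for exposition, not identifiers from Khazan or from the prototype described in this declaration.

    #include <stdbool.h>
    #include <stddef.h>

    /* Illustrative sketch: one entry per (API, caller) pair that static
     * analysis found in the program's code. */
    typedef struct {
        void *target;     /* address of a monitored API in a well-known DLL */
        void *call_site;  /* known good invocation location in the program */
    } known_call;

    /* The fundamental runtime check is exact membership in the list;
     * no probabilities or statistics are involved. */
    static bool is_known_good(const known_call *list, size_t n,
                              void *target, void *call_site)
    {
        for (size_t i = 0; i < n; i++) {
            if (list[i].target == target && list[i].call_site == call_site)
                return true;
        }
        return false;
    }

A caller location absent from the list is, by construction, code that static analysis never saw, which is what makes the check deterministic rather than probabilistic.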
10. I started to document my ideas for publication as a concept to be presented at a research conference; eventually this morphed into a published paper in a refereed conference. I also began discussing my idea with several of my colleagues, including Roger Khazan. After refining the idea, we realized that the implementation would be so simple that we could task a person with a Bachelor’s degree working with our group, Jesse Rabek, to do a prototype implementation in his spare time. He was easily convinced once we promised that a successful implementation would result in a publication that he could apply toward his thesis work. He completed the implementation in short order, and we conducted testing that showed the effectiveness of the technique, promoting the research to formally recognized status within the group. I updated my existing publication material based on new insights, and Rabek added critical information about the implementation and the results of our testing. This process resulted in an academic paper and in the Khazan patent application, which is based upon the research that I and my colleagues performed at MIT Lincoln Laboratory.
III. PARTICULAR ASPECTS OF KHAZAN

11. Khazan’s system, both as implemented by our team at MIT Lincoln Laboratory and as described in the Khazan patent application, works by building a list of targets and their invocation locations for calls made by a program to well-known DLLs, such as kernel32.dll. We worked with well-known operating system DLLs because these were the types of calls most often used by malicious code, including worms, to inflict damage on a computer system or to propagate, since such damage or propagation usually relies on accessing system resources, such as the network or file system. For example, a worm typically makes an operating system call to duplicate itself on the file system or network. This is normally accomplished by an external function call to an operating system DLL. We did not track function calls that were exclusively internal to a program, and to the best of my knowledge such a capability was never discussed for implementation, much less actually implemented. There were several reasons for this.
12. First, we had developed a system for instrumenting the called DLLs at runtime, which allowed our code to run when a predetermined operating system call was made. The operating system call executed code in the DLL which would cause our wrapper code to be called. When our code activated, it would compare the target and invocation location known at runtime to those that had been recorded in the static analysis list. If they matched, then control would be passed to the operating system call, as implemented in the external DLL, which would execute its normal code, and then control would return to the application. If there was any mismatch, our system would trigger a detection, and a typical response would be to terminate the application. (A sketch of this control flow appears after paragraph 14 below.) But this system could not have worked for function calls that were internal to the analyzed program. We would have had to develop an entirely new system for interacting with the program, and such a system would have been much more complex and subject to false positives and negatives that would have adversely impacted the effectiveness of our system.
13. Second, even if we had developed such a system, it likely would not have been useful in identifying worms, because the malicious things that worms did were typically not part of an exclusively internal function.
14. Third, we were achieving a lot of success with tracking function calls to well-known DLLs. In our view, there was little or no increased detection efficacy to be gained by re-architecting the system to cover a separate set of internal function calls.
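A minimal C sketch of the wrapper control flow described in paragraph 12 follows. It is illustrative only: real_CreateFile, on_static_list, and the other names are assumptions standing in for the instrumentation, not identifiers from Khazan or the prototype.

    #include <stdio.h>
    #include <stdlib.h>

    /* Illustrative wrapper for one monitored API. The original DLL entry
     * point and the list lookup are assumed to exist elsewhere. */
    typedef void *(*create_file_fn)(const char *path);

    extern create_file_fn real_CreateFile;                 /* original DLL code */
    extern int on_static_list(void *target, void *caller); /* list lookup */

    void *wrapped_CreateFile(const char *path)
    {
        /* The invocation location is recovered from the return address
         * (GCC/Clang builtin shown; a production interceptor would obtain
         * it from its trampoline). */
        void *caller = __builtin_return_address(0);

        if (!on_static_list((void *)real_CreateFile, caller)) {
            /* Mismatch: the call originates from code the static analyzer
             * never saw, e.g., dynamically injected worm code. */
            fprintf(stderr, "detection: unknown caller %p\n", caller);
            exit(EXIT_FAILURE); /* a typical response: terminate the app */
        }

        /* Match: pass control to the real operating system call, then
         * return its result to the application as usual. */
        return real_CreateFile(path);
    }

Because the check runs before control ever reaches the real API, a mismatch can be acted on before the intercepted call does anything.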
15. The list created by the static analyzer comprised targets and invocation locations for the well-known external DLL function calls that were contained in the code of the application. Essentially, it documented wherever the external DLL function calls showed up in the code. The static analyzer was run before the program executed. The static analyzer was therefore only capable of determining which well-known external DLL functions were capable of being called through normal, non-obfuscated techniques – not which ones would actually be called at runtime – and we took no steps to understand what their parameters at runtime would be. As such, we were well aware that the list produced by the static analyzer had no insight into how the program would behave at runtime. In fact, this premise is key to the effectiveness of our technique. The list did not capture the expected runtime frequency of the tracked function calls, and our technique was incapable of predicting the runtime frequency of the function calls or the typical parameters that would be passed to the tracked function calls at runtime. If a human user interacted with the program, the system did not have any insight into what the typical use cases were, nor did the list capture any such information.
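In C, this list-building step might be sketched as follows. The sketch assumes a prior disassembly pass that yields every call instruction in the program image together with its statically resolved target; all names are illustrative assumptions, not from Khazan or the prototype.

    #include <stddef.h>

    typedef struct { void *target; void *call_site; } known_call;

    /* One call instruction found in the program image by a disassembly
     * pass (not shown), with its statically resolved target. */
    typedef struct { void *resolved_target; void *address; } call_instruction;

    /* Assumed predicate: is this target an export of a well-known
     * operating system DLL, such as kernel32.dll? */
    extern int is_well_known_dll_export(void *target);

    /* Run before the program executes: record where in the code each
     * monitored external call shows up. Note what is NOT recorded:
     * no call frequencies, no parameters, no runtime behavior. */
    size_t build_list(const call_instruction *calls, size_t ncalls,
                      known_call *list, size_t capacity)
    {
        size_t n = 0;
        for (size_t i = 0; i < ncalls && n < capacity; i++) {
            if (is_well_known_dll_export(calls[i].resolved_target)) {
                list[n].target = calls[i].resolved_target;
                list[n].call_site = calls[i].address; /* invocation location */
                n++;
            }
        }
        return n;
    }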
16. At runtime, if there were invocations of monitored functions from locations not recorded in the list generated by the static analysis, the dynamic analyzer would label the execution as malicious. The static analyzer did not employ probability or statistics to build the list, nor did the dynamic analyzer employ probability or statistics to label the program as malicious code. The fundamental check that we did was whether the calling location of the observed runtime call was on the list or not. We thought that there were advantages to this simplicity. The statistical and machine learning approaches that were being investigated at the time were very “heavy weight.” They required a lot of computing resources to build their models, both in terms of collecting input data (often referred to as training data) and then developing the actual model. Determining a probability by applying the model at runtime could also be very computationally intensive, particularly if as many function calls were tracked and analyzed as in our system. This would have added unacceptable overhead on the protected machine, especially at the time we were conducting our research. The approach that my team worked on, and which is described in Khazan, was not a machine learning approach, nor was it a statistical approach. Although we were aware of machine learning approaches at the time (in fact, the group I worked in at the Laboratory was originally formed to apply machine learning techniques to speech processing, and later to computer intrusion detection), we wanted to stay away from them and build a simpler system that was fast and robust.
17. Under certain circumstances, described in Khazan at paragraph 78, the static analyzer could be invoked while the program was running, on a software component – such as an external DLL – that had not previously been statically analyzed. In this situation, the static analyzer could add targets and invocation locations for function calls in the new software component to the existing list. But none of that information depended on the program being running at the time; it was the same data that would have been available when the program was not running. In fact, as mentioned above, this distinction is critical to our technique.
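A short C sketch of this incremental step (illustrative assumptions only; analyze_image stands in for the static analyzer, and nothing here is taken from Khazan or the prototype):

    #include <stddef.h>

    typedef struct { void *target; void *call_site; } known_call;

    /* Assumed entry point: the same static analysis that runs before
     * execution, applied to a single newly seen component image. */
    extern size_t analyze_image(const void *image,
                                known_call *out, size_t capacity);

    /* When a component (e.g., an external DLL) is first seen at runtime,
     * analyze it and append its entries to the one existing list. The
     * entries are identical to what an offline run would have produced;
     * nothing about them depends on runtime state. */
    size_t append_component(const void *newly_loaded_image,
                            known_call *list, size_t count, size_t capacity)
    {
        return count + analyze_image(newly_loaded_image,
                                     list + count, capacity - count);
    }

Appending in place like this also matches paragraph 20 below: a single list grows entry by entry, with no notion of merging separate lists or models.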
18. To the best of my knowledge, neither I nor my colleagues ever ran the system described in Khazan in an emulator, nor did we test it in one. In fact, the system worked well without an emulator. Because any mismatch between a function call made at runtime and the information in the static analysis list could be determined before the target function call was executed, the system could stop and quarantine a worm before anything bad happened. The “sandbox” protection offered by an emulator or a virtual environment did not seem necessary in light of this.
19. To the best of my knowledge, neither I nor my colleagues ever implemented a network notification functionality for the system described in Khazan. Our vision for the project was that in a networked computer environment, such as a DoD LAN installation, every computer would run the Khazan system. Therefore, every computer would be protected in the same way. If a worm was detected on one computer in the LAN, that showed it would be detected by any other computer. Because all computers were equally protected, there was no need to send a notification to the other computers on the network when a worm was detected.
20. To the best of my knowledge, neither I nor my colleagues ever implemented a system that combined models or built multiple models. We used a single, simple list, to which all static analysis results were added. New data could certainly be added to the list, for example by running the static analyzer on one program component at a time, but we never viewed this as combining lists or models, or even as creating different lists or models. There was no concept in our system of combining multiple lists into a single list.
I declare under penalty of perjury that all statements made herein of my own knowledge are true and that all statements made on information and belief are believed to be true; and further that these statements were made with the knowledge that willful false statements and the like so made are punishable by fine or imprisonment, or both, under Section 1001 of Title 18 of the United States Code.

Executed this 7th day of October, 2015.
/s/
Scott M. Lewandowski
EXHIBIT 1
Professional Biography of Scott M. Lewandowski

Mr. Scott M. Lewandowski serves as President and Principal Engineer for The Wynstone Group, Inc., a boutique consulting firm he founded in 2006 to provide specialized information dominance expertise to military, intelligence community, law enforcement, government, and commercial customers. He currently serves as the Chief Cyber Scientist for the United States Department of Defense’s National Cyber Range, where he applies his expertise in cyber testing and cyber operations to support missions of significant national importance.

One of Scott’s specialties is providing end-to-end test and evaluation support of offensive, defensive, and intelligence-related hardware and software systems, from initial test concept development through post-test results analysis and reporting. Scott’s experience with test and evaluation spans the entire spectrum of capability maturity, from Advanced Technology Demonstrations through Operational Tests and Evaluations. In the course of testing and evaluating systems, Scott has established strong relationships with dozens of test organizations and has developed intimate knowledge of their test ranges, procedures, and capabilities. Scott has served as the principal evaluator of many offensive, defensive, and intelligence gathering capabilities, has provided critical test-related subject matter expertise to countless other programs, and has designed and supervised the instantiation of cyber test ranges for commercial, military, and intelligence community organizations.

Because Scott has such broad experience with and exposure to a diverse set of information systems and tools, he is frequently called on to provide subject matter expertise pertinent to new and existing systems, capabilities, and mission areas. Scott’s systems expertise encompasses a broad mix of systems, including real-time and embedded systems; consumer through enterprise-class software, computers, networks, and supporting infrastructure; and specialized critical infrastructure and military systems, including IADS, SCADA, GCCS, SWIFT, cellular networks, etc. His knowledge extends from high-level concepts down to low-level implementation details such as firmware and chipset design, operating system kernel architecture, and the like. In addition to intimate familiarity with US government systems, he has significant experience with foreign computer hardware, software, system designs, and concepts of operation.

Scott has extensive experience working with DARPA, DTO, IARPA, and HSARPA; NSA and other intelligence organizations; the US Secret Service; DHS; the FBI; the US Army; the US Air Force; the US Navy; and commercial organizations such as Hewlett Packard, Microsoft, American Power Conversion, H&R Block, Siemens, and various critical infrastructure providers. Scott has been called on to provide testimony and guidance to senior-level policymakers in the Executive Office of the President; the Departments of State, Defense, and Treasury; and the US Congress. Prior to founding The Wynstone Group, Scott was a member of the Technical Staff in the Information Systems Technology Group at MIT Lincoln Laboratory, where he focused on two complementary areas: research in the areas of network and host-based information operations; and the test and evaluation of information dominance systems. Much of Scott’s work in the latter area focused on novel techniques to evaluate complex information dominance systems in a scientifically rigorous and repeatable manner; techniques for managing large-scale information operations testbeds; and realistic simulation and emulation of human activity for information operations testbeds. Before joining MIT Lincoln Laboratory, Scott played major roles in application development, research, and systems integration projects for companies in the high-tech, manufacturing, legal, accounting, sales, and hospitality industries.

Scott matriculated at Brown University, where he received the Sc.M. degree in Computer Science and the A.B. degree in Computer Science and Business Economics magna cum laude.
EXHIBIT 2
Materials Considered

U.S. Patent Application Publication US 2005/0108562

U.S. Patent No. 8,074,115

U.S. Patent No. 8,108,929

Petition for Inter Partes Review of U.S. Patent No. 8,074,115 Under 35 U.S.C. §§ 311-319 and 37 C.F.R. §§ 42.1-.80, 42.100-.123 in IPR2015-00375

Columbia’s Patent Owner Preliminary Response Under 37 C.F.R. § 42.107 in IPR2015-00375

Institution of Inter Partes Review in IPR2015-00375