
IN THE UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

In the Inter Partes Review of:        )
                                      )
U.S. Patent No.: 8,074,115            )
                                      )
For: METHODS, MEDIA AND               )
SYSTEMS FOR DETECTING                 )
ANOMALOUS PROGRAM                     )
EXECUTIONS                            )

Mail Stop Patent Board
Patent Trial and Appeal Board
P.O. Box 1450
Alexandria, VA 22313-1450

DECLARATION OF MICHAEL T. GOODRICH, Ph.D.
IN SUPPORT OF PETITION FOR INTER PARTES REVIEW OF
U.S. PATENT NO. 8,074,115

SYMC 1003

I, Michael T. Goodrich, Ph.D., declare as follows:

I. INTRODUCTION

1. I have been asked by the party requesting this review, Symantec Corporation (“Petitioner”), to provide my expert opinions in support of the above-captioned petition for Inter Partes review of U.S. Patent No. 8,074,115 (the “’115 patent”), challenging the patentability of claims 1-42 of the ’115 patent.

2. I currently hold the opinions set forth in this declaration.

3. In summary, it is my opinion that the references cited below render obvious claims 1-42 of the ’115 patent. My detailed opinions on the claims are set forth below.

II. BACKGROUND AND QUALIFICATIONS

4. I earned a Bachelor’s Degree in Mathematics and Computer Science from Calvin College in 1983. I obtained my Master’s Degree and Ph.D. in Computer Sciences from Purdue University in 1985 and 1987, respectively.

5. I currently hold the position of Chancellor’s Professor in the Department of Computer Science at the University of California, Irvine. I have been employed by the University of California, Irvine since 2001 and have spent more than two decades teaching computer science at the University of California, Irvine and previously at Johns Hopkins University.

6. My research for more than 30 years has focused generally on algorithm and data structure design, information assurance and security, and parallel and distributed computing. In 2011 I co-authored a book entitled “Introduction to Computer Security,” which was published by Addison-Wesley, Inc.

7. I am a listed inventor on three issued U.S. patents: U.S. Patent No. 7,257,711, titled “Efficient Authenticated Dictionaries with Skip Lists and Commutative Hashing”; U.S. Patent No. 7,299,219, titled “High Refresh-Rate Retrieval of Freshly Published Content Using Distributed Crawling”; and U.S. Patent No. 8,681,145, titled “Attribute Transfer Between Computer Models Including Identifying Isomorphic Regions in Polygonal Meshes.” Additionally, I have published over 100 papers and books.

8. My professional background and technical qualifications also are reflected in my Curriculum Vitae, which is attached as Exhibit 1004.

III. COMPENSATION AND RELATIONSHIP WITH PARTIES

9. I am being compensated for my time. This compensation is not contingent upon my performance, the outcome of this matter, or any issues involved in or related to this matter.

10. I have no financial interest in Petitioner or any related parties. I have been informed that The Trustees of Columbia University in the City of New York (“Columbia”) owns the ’115 patent. I have no financial interest in and have no contact with Columbia. I similarly have no financial interest in the ’115 patent and have not had any contact with any of its inventors.

IV. MATERIAL CONSIDERED

11. I have reviewed and considered, in the preparation of this declaration, the ’115 patent (Ex. 1001) and the prosecution file history for the ’115 patent (Ex. 1002).

12. I have reviewed and considered the Claim Construction Order issued by the district court in the ongoing litigation between the Petitioner and the Patentee. (The Trustees of Columbia University in the City of New York v. Symantec Corp., Civil Action No. 3:13-cv-808, Oct. 7, 2014 Claim Construction Order (Dkt. No. 123), Ex. 1005). I have also reviewed the district court’s clarification of its Claim Construction Order in the same litigation. (The Trustees of Columbia University in the City of New York v. Symantec Corp., Civil Action No. 3:13-cv-808, Oct. 23, 2014 Memorandum Order Clarifying Claim Construction (Dkt. No. 146), Ex. 1006).

13. I understand that, for purposes of determining whether a reference will qualify as prior art, the challenged claims of the ’115 patent are entitled to a priority date of no earlier than October 25, 2005.

14. I have also reviewed and understand various publications as discussed herein, including the following references:

a. U.S. Patent Application Publication No. 2005/0208562 to Khazan et al. (“Khazan,” Ex. 1010);

b. U.S. Patent No. 5,440,723 to Arnold et al. (“Arnold,” Ex. 1007);

c. U.S. Patent No. 8,108,929 to Agrawal et al. (“Agrawal,” Ex. 1008);

d. McGraw-Hill Dictionary of Scientific and Technical Terms (5th ed. 1994) (“McGraw-Hill,” Ex. 1009);

e. U.S. Patent No. 7,334,005 to Sobel (“Sobel,” Ex. 1011).

15. I understand that the above references form the basis for the grounds for rejection set forth in the Petition for Inter Partes Review of the ’115 patent. Additionally, I am aware of information generally available to, and relied upon by, persons of ordinary skill in the art at the relevant times, including technical dictionaries and technical reference materials (including, for example, textbooks, manuals, technical papers, articles, and relevant technical standards); some of my statements below are expressly based on such awareness.

16. Due to procedural limitations for Inter Partes reviews, the grounds of invalidity discussed herein are based solely on prior patents and other printed publications. I understand that Petitioner and the other interested parties reserve all rights to assert other grounds for invalidity not addressed herein at a later time, for instance failure of the application to claim patentable subject matter under 35 U.S.C. § 101, failure to meet requirements under 35 U.S.C. § 112 (e.g., lack of written description in support of the claims), and anticipation/obviousness under 35 U.S.C. §§ 102 and 103 not based solely on patents and printed publications (e.g., evidence of prior use of combinations of elements claimed in the ’115 patent). Thus, the absence of discussion of such matters here should not be interpreted as indicating that there are no such additional grounds for invalidity of the ’115 patent.

17. I reserve the right to supplement my opinions to address any information obtained, or positions taken, based on any new information that comes to light throughout this proceeding.

V. BASIS OF OPINIONS FORMED

A. Level of Ordinary Skill in the Art

18. It is my understanding that the ’115 patent is to be interpreted based on how it would be read by a person of “ordinary skill in the art” at the time of the effective filing date of the application. It is my understanding that factors such as the education level of those working in the field, the sophistication of the technology, the types of problems encountered in the art, the prior art solutions to those problems, and the speed at which innovations are made may help establish the level of skill in the art.

19. I am familiar with the technology at issue and the state of the art at the earliest priority date of the ’115 patent, October 25, 2005.

20. In my opinion, a person of ordinary skill in the art of the ’115 patent at the time of the effective filing date would have a Master’s degree in computer science or a related field and two to three years of experience in the field of software security systems. With more education, for example additional post-graduate degrees and/or study, less industry experience is needed to attain an ordinary level of skill.

21. I consider myself to have at least such ordinary skill in the art with respect to the subject matter of the ’115 patent at the time of the effective filing date.

22. I am not a patent attorney, and my opinions are limited to what I believe a person of ordinary skill in the art would have understood the meaning of certain claim terms to be, based on the patent documents. I use the principles below, however, as a guide in formulating my opinions.

VI. LEGAL STANDARD FOR CLAIM CONSTRUCTION

23. My understanding is that a primary step in determining the validity of patent claims is to properly construe the claims to determine claim scope and meaning.

24. In an Inter Partes review proceeding, I understand that claims are to be given their broadest reasonable construction (BRC) in light of the patent’s specification. (See 37 C.F.R. § 42.100(b).) In other forums, such as the federal courts, different standards of proof and claim interpretation control, which are not applied by the PTO for Inter Partes review. Accordingly, any interpretation or construction of the challenged claims in this proceeding, whether implicit or explicit, should not be viewed as constituting, in whole or in part, Petitioner’s own interpretation or construction, except as regards the broadest reasonable construction of the claims presented.

VII. THE ’115 PATENT

25. The ’115 patent is entitled “Methods, Media and Systems for Detecting Anomalous Program Executions.” The ’115 patent includes 42 claims, of which claims 1, 11, 21, 22, 32, and 42 are independent. Claims 1-42 are challenged in the Petition.

A. General Background of the Technology of the ’115 Patent

26. The ’115 patent is directed to detecting anomalous program executions. The anomalous executions may be due to attacks from malicious code such as “[c]omputer viruses, worms, [and] Trojans.” (’115 patent, 1:23-43, Ex. 1001). These types of malicious code, the ’115 patent explains, are a constant menace to computer users connected over a network. (See id.). Likewise, the anomalous executions may be due to programming errors in software applications. (See id.). The software faults and failures caused by the malicious code or programming errors “result in illegal memory access errors, division by zero errors, buffer overflows attacks, etc.” (Id.). The ’115 patent explains that many computers are equipped with antivirus software and firewalls for protection from malicious code. (See id.). However, these are not always adequate. (Id.).

B. Purported Invention of the ’115 Patent

27. The ’115 patent alleges to provide “[m]ethods, media, and systems for detecting anomalous program executions . . . that may be indicative of a malicious attack or program fault.” (’115 patent, 3:7-15, Ex. 1001). Claim 1 is representative:

    1. A method for detecting anomalous program executions,
    comprising: executing at least a part of a program in an emulator;
    comparing a function call made in the emulator to a model of
    function calls for the at least a part of the program; identifying the
    function call as anomalous based on the comparison; and upon
    identifying the anomalous function call, notifying an application
    community that includes a plurality of computers of the
    anomalous function call.

(’115 patent, 20:36-46, Ex. 1001.)

28. In order to detect “anomalous program executions that may be indicative of a malicious attack or program fault,” an anomaly detector trains a model of normal program behavior and applies the model to detect deviations from normal program behavior during subsequent operation. (’115 patent, 3:13-15, Ex. 1001; see ’115 patent, 3:50-56, Ex. 1001). The anomaly detector of the ’115 patent specifically focuses on detecting anomalous function calls made by the program. (See ’115 patent, 3:46-56, Ex. 1001).

29. The anomaly detector models normal function call behavior by executing part, or all, of the program using an instruction-level emulator, or by using another suitable technique. (See ’115 patent, 3:28-37, Ex. 1001). The anomaly detector thereby determines information about the function calls made by the program. This information may include function names, function call arguments, stack frames, and the like. (’115 patent, 3:38-40, Ex. 1001). The information is then used to train a model that describes normal function calls made by the program. (See ’115 patent, 4:9-13, Ex. 1001). The anomaly detector observes subsequent function calls made by the program and uses the trained model to detect anomalous behaviors. (’115 patent, 3:52-56, Ex. 1001). Upon identifying a function call as anomalous, the anomaly detector notifies an application community that includes a plurality of computers that an anomalous function call has been identified. (See ’115 patent, 18:57-59, Ex. 1001).

30. Figure 8 illustrates the operation of the anomaly detector described in the ’115 patent:

[FIG. 8 of the ’115 patent (not reproduced here)]

The anomaly detector detects a function call being made by a program at step 802. The anomaly detector then compares the detected function call at step 804 to the previously-created model of normal function calls. This comparison causes a function call to be identified as anomalous at step 806. (’115 patent, 3:46-56, Ex. 1001).

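The detect, compare, identify, and notify steps just described can be sketched in a few lines of code. The sketch below is purely illustrative and is not the ’115 patent’s implementation: it assumes a toy model consisting of (function name, argument count) pairs learned from attack-free runs, and every name in it is hypothetical.

```python
def train_model(training_calls):
    """Model 'normal' behavior as the set of (name, argument-count)
    pairs observed during attack-free training runs."""
    return {(name, len(args)) for name, args in training_calls}

def is_anomalous(call, model):
    """Compare a detected call to the model (steps 804/806): a call
    absent from the model is identified as anomalous."""
    name, args = call
    return (name, len(args)) not in model

def notify_community(members, call):
    """Stand-in for notifying an application community of a plurality
    of computers (here, each member just records the call)."""
    for inbox in members:
        inbox.append(call)

# Training phase: calls observed while the program runs normally.
model = train_model([("open", ["log.txt"]), ("read", ["log.txt", 512])])

# Monitoring phase (step 802): a function call is detected.
call = ("exec", ["/bin/sh"])
community = [[], []]  # two hypothetical community members
if is_anomalous(call, model):
    notify_community(community, call)

print(community)
```

Running the sketch, both hypothetical community members record the unexpected ("exec", ...) call, while calls matching the trained model pass silently.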
31. Several of the independent claims of the ’115 patent generally correspond to the steps illustrated by FIG. 8. For example, claim 22 recites: “[a] method for detecting anomalous program executions, comprising: modifying a program to include indicators of program-level function calls being made during execution of the program; comparing at least one of the indicators of program-level function calls made in an emulator to a model of function calls for at least a part of the program; and identifying a function call corresponding to the at least one of the indicators as anomalous based on the comparison.” (’115 patent, 21:50-59, Ex. 1001). Independent claims 32 and 42 likewise recite “modifying,” “comparing,” and “identifying” elements.

32. Other independent claims of the ’115 patent recite an additional element after the “identifying” element. For example, claim 1 recites “upon identifying the anomalous function call, notifying an application community that includes a plurality of computers of the anomalous function call.” Claims 11 and 21 also recite the “notifying” element.

C. Priority Date

33. The ’115 patent was originally filed as a PCT application on October 25, 2006 and claims priority to Provisional Application No. 60/730,289, filed on October 25, 2005. A U.S. National Phase application claiming priority to the PCT application and claiming the benefit of the Provisional application was filed on April 22, 2008. (Ex. 1002).

34. In an office action dated August 23, 2010, the examiner rejected all of the pending claims, 1-42. (File History, page 126, Ex. 1002). Specifically, claims 11-20 and 32-41 were rejected under 35 U.S.C. § 101; claims 1-7 and 22 were rejected under 35 U.S.C. § 102(e) as anticipated by U.S. Patent No. 7,496,898 to Vu; and claims 8-21 and 23-42 were rejected under 35 U.S.C. § 103(a) as being obvious over Vu in view of Chan et al. (“A Machine Learning Approach to Anomaly Detection,” Technical Report, Department of Computer Science, Florida Institute of Technology, March 2003, pages 1-13). (File History, pages 128-130, Ex. 1002).

35. In response, the patentee amended independent claims 1, 11, and 21 to add the limitation, “upon identifying the anomalous function call, notifying an application community that includes a plurality of computers of the anomalous function call.” (File History, pages 119-121, Ex. 1002). The patentee also argued that “nothing in Vu shows or suggests the feature of ‘executing at least a part of the program in an emulator.’ Instead of using an emulator that executes at least a part of the program, Vu relies on examining the code for generic function calls.” (File History, page 110, Ex. 1002). The patentee also argued that “Vu also does not show or suggest ‘comparing a function call made in the emulator to a model of function calls for the at least a part of the program.’” (Id. (emphasis in original)).

36. The examiner thereafter issued a notice of allowance on March 17, 2011. (File History, page 95, Ex. 1002).

37. On June 17, 2011, after having already obtained a notice of allowance, the patentee filed a request for continued examination (RCE). (File History, page 42, Ex. 1002). In the RCE, the patentee disclosed various references to be considered by the examiner. The examiner responded on July 28, 2011 with a second notice of allowance. (File History, page 13, Ex. 1002). In allowing the ’115 patent, the examiner identified Khazan as one example of prior art that discloses many of the claim limitations. (See id.). The examiner determined, however, that Khazan and the other prior art fail to teach “singly or in combination comparing a function call made in an emulator to a model of function calls for the at least a part of the program, identifying the function call as anomalous based on the comparison and upon identification, notifying an application community.” (See id.). These were the reasons for allowance provided by the examiner. (See id.).

VIII. ANTICIPATION AND OBVIOUSNESS STANDARDS

38. I understand that “anticipation” is a question of fact and that for a reference to anticipate a claimed invention it must disclose each and every element set forth in the claim for that invention. I further understand that the requirement of strict identity between the claim and the reference is not met if a single element or limitation required by the claim is missing from the applied reference.

39. It is my further understanding that a prior art reference is anticipatory only if it discloses each and every limitation of the claim (as properly construed) at issue. In other words, every limitation of a claim must identically appear in a single prior art reference for it to anticipate a claim.

40. It is further my understanding that a claimed invention is unpatentable if the differences between the invention and the prior art are such that the subject matter of the claim as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which the subject matter pertains.

41. It is my understanding that “obviousness is a question of law based on underlying factual issues including (1) the scope and content of the prior art, (2) the differences between the prior art and the asserted claims, (3) the level of ordinary skill in the pertinent art, and (4) the existence of secondary considerations such as commercial success, long-felt but unresolved needs, failure of others, etc.”

42. I understand that for a single reference or a combination of references to render obvious the claimed invention, a person of ordinary skill in the art must have been able to arrive at the claims by altering or combining the applied references.

43. I understand that an obviousness evaluation can be based on a combination of multiple prior art references. I understand that the prior art references themselves may provide a suggestion, motivation, or reason to combine, but other times the nexus linking two or more prior art references is simple common sense. I further understand that obviousness analysis recognizes that market demand, rather than scientific literature, often drives innovation, and that a motivation to combine references may be supplied by the direction of the marketplace.

44. I understand that if a technique has been used to improve one device, and a person of ordinary skill in the art would recognize that it would improve similar devices in the same way, using the technique is obvious unless its actual application is beyond his or her skill.

45. I also understand that practical and common sense considerations should guide a proper obviousness analysis, because familiar items may have obvious uses beyond their primary purposes. I further understand that a person of ordinary skill in the art looking to overcome a problem will often be able to fit together the teachings of multiple publications. I understand that obviousness analysis therefore takes into account the inferences and creative steps that a person of ordinary skill in the art would employ under the circumstances.

46. I understand that a particular combination may be proven obvious merely by showing that it was obvious to try the combination. For example, when there is a design need or market pressure to solve a problem and there are a finite number of identified, predictable solutions, a person of ordinary skill has good reason to pursue the known options within his or her technical grasp because the result is likely the product not of innovation but of ordinary skill and common sense.

47. I also understand that the combination of familiar elements according to known methods is likely to be obvious when it does no more than yield predictable results. When a work is available in one field of endeavor, design incentives and other market forces can prompt variation of it, either in the same field or a different one. If a person of ordinary skill can implement a predictable variation, the patent claim is likely obvious.

48. It is further my understanding that a proper obviousness analysis focuses on what was known or obvious to a person of ordinary skill in the art, not just the patentee. Accordingly, I understand that any need or problem known in the field of endeavor at the time of invention and addressed by the patent can provide a reason for combining the elements in the manner claimed.

49. I understand that a claim can be obvious in light of a single reference, without the need to combine references, if the elements of the claim that are not found explicitly or inherently in the reference can be supplied by the common sense of one of skill in the art.

50. I understand that secondary indicia of non-obviousness may include (1) a long-felt but unmet need in the prior art that was satisfied by the invention of the patent; (2) commercial success of processes covered by the patent; (3) unexpected results achieved by the invention; (4) praise of the invention by others skilled in the art; (5) taking of licenses under the patent by others; (6) deliberate copying of the invention; (7) failure of others to find a solution to the long-felt need; and (8) skepticism by experts.

51. I also understand that there must be a relationship between any such secondary considerations and the invention. I further understand that contemporaneous and independent invention by others is a secondary consideration supporting an obviousness determination.

52. In sum, my understanding is that prior art teachings are properly combined where a person of ordinary skill in the art, having the understanding and knowledge reflected in the prior art and motivated by the general problem facing the inventor, would have been led to make the combination of elements recited in the claims. Under this analysis, the prior art references themselves, or any need or problem known in the field of endeavor at the time of the invention, can provide a reason for combining the elements of multiple prior art references in the claimed manner.

IX. ANALYSIS OF THE TECHNICAL BASIS UNDERLYING THE GROUNDS OF REJECTION SET FORTH IN THE PETITION FOR INTER PARTES REVIEW

A. Claim Construction

53. As I mentioned earlier, I understand that for purposes of an Inter Partes review, the terms of patent claims are to be given their BRC in light of the patent’s specification. I additionally understand that the BRC is not necessarily the same interpretation that would be given to the terms in other proceedings.

54. I have reviewed the Claim Construction Order issued by the district court. (The Trustees of Columbia University in the City of New York v. Symantec Corp., Civil Action No. 3:13-cv-808, Oct. 7, 2014 Claim Construction Order (Dkt. No. 123), Ex. 1005). In addition, I have reviewed the specification and the file history for further indication of the BRC. From this review, I conclude that the claim constructions made by the district court are consistent with the BRC for purposes of an Inter Partes review. Additionally, the claim terms not construed herein are given their BRC as understood by a person of ordinary skill in the art and consistent with the disclosure.

55. Specifically, the construction of the term “anomalous” adopted by the district court is “[d]eviation/deviating from a model of typical, attack-free computer system usage.” (Ex. 1005; Ex. 1006). This construction is consistent with the BRC. The specification describes how the anomaly detector models “normal program execution stack behavior” and “detect[s] stacked function references as anomalous . . . by comparing those references to the model.” (’115 patent, 3:46-56, Ex. 1001). The “normal . . . behavior” refers to typical, attack-free behavior, while the “comparing” serves to detect a deviation from the normal behavior.

56. The construction of the term “emulator” adopted by the district court is “[s]oftware, alone or in combination with hardware, that permits the monitoring and selective execution of certain parts, or all, of a program.” (Ex. 1005). This construction is consistent with the BRC. The specification describes how stack information may be extracted using “Selective Transactional Emulation (STEM) . . . which permits the selective execution of certain parts, or all, of a program inside an instruction-level emulator . . . by modifying a program’s binary or source code to include indicators of what function calls are being made (and any other suitable related information), or using any other suitable technique.” (’115 patent, 3:28-37, Ex. 1001).

57. The construction of the term “application community” as adopted by the district court is “[m]embers of a community running the same program or a selected portion of the program.” (Ex. 1005). This construction is consistent with the BRC. The specification states that “models are shared among many members of a community running the same application (referred to as an ‘application community’).” (’115 patent, 6:31-33, Ex. 1001). Further, the specification also describes how only a portion of an application may be modeled by each member of an application community. (’115 patent, 6:59-61, Ex. 1001).

58. In addition, in my opinion the BRC of the phrase “generating a virtualized error” is “simulating an error return from the function.” The specification describes how an emulator may be used to “simulate an error return from a function of the application.” (’115 patent, 13:51-61, Ex. 1001). The specification also states that “the emulator can . . . simulate an error return from the function” and that this simulation is “sometimes referred to herein as ‘error virtualization.’” (’115 patent, 15:12-21, Ex. 1001).

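The “error virtualization” concept construed above, simulating an error return from a function rather than allowing a fault to propagate, can be made concrete with a short sketch. This is purely illustrative and is not the patent’s instruction-level emulator: the `emulate` helper and its `error_return` parameter are hypothetical stand-ins for the idea.

```python
def emulate(func, *args, error_return=-1):
    """Run func under 'emulation'; if it faults, simulate an error
    return from the function rather than letting the fault propagate."""
    try:
        return func(*args)
    except Exception:
        # The 'virtualized error': the caller sees an ordinary error
        # code, as if the function itself had returned it.
        return error_return

def divide(a, b):
    return a // b

print(emulate(divide, 10, 2))  # normal return: 5
print(emulate(divide, 10, 0))  # fault replaced by virtualized error: -1
```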
59. My opinion is that the BRC of the phrase “indicators of program-level function calls” is “indicators of which of the program’s functions are being called.” The specification supports this construction in disclosing that STEM, “which permits the selective execution of certain parts, or all, of a program inside an instruction-level emulator,” modifies a program’s binary or source code “to include indicators of what function calls are being made (and any other suitable related information) . . . .” (’115 patent, 3:28-37, Ex. 1001). The specification also explains that the “instruction-level emulator can be selectively invoked for segments of the application’s code . . . and simulate an error return from a function of the application.” (’115 patent, 13:51-61, Ex. 1001; Ex. 1003 at ¶ 59). Further, “upon entering the vulnerable section of the application’s code, the instruction-level emulator can capture and store the program state and processes all instructions, including function calls, inside the area designated for emulation.” (’115 patent, 13:61-65, Ex. 1001; Ex. 1003 at ¶ 59). Because the indicators are of function calls made by a program, and because the program (i.e., application) may return an error from the function, it is clear that the indicators indicate which of the program’s functions are being called.

60. Additionally, my opinion is that the BRC of the phrase “reflects” is “describes.” The specification discloses that “models may be automatically updated as time progresses. For example, although a single site may learn a particular model over some period of time, application behavior may change over time. In this case, the previously learned model may no longer accurately reflect the application characteristics. . . .” (’115 patent, 7:58-62, Ex. 1001). A person of ordinary skill in the art would recognize that the specification is using “reflects” in this instance to mean “describes.”

B. Relevant Prior Art References

1. U.S. Patent Application Publication No. 2005/0208562 (“Khazan”)

61. Khazan is a patent application publication filed on June 18, 2003 and published on May 19, 2005. (Khazan, Ex. 1010). Thus, Khazan is prior art under pre-AIA 35 U.S.C. § 102(e).

62. Khazan is entitled “Technique for Detecting Executable Malicious Code Using a Combination of Static and Dynamic Analyses.” (Khazan, Ex. 1010). Khazan is directed at ways to automatically detect “malicious code” by comparing the function calls made when an application is executed with a model defined using calls to a predetermined set of targets. (See id.). Malicious code, as defined in Khazan, refers to machine instructions that perform unauthorized functions within a computer system and that may be destructive, disruptive, or otherwise problematic. (Khazan, ¶ 5, Ex. 1010). Examples of malicious code (or “MC,” as it is referred to in Khazan) include “a computer virus, a worm, a trojan application, and the like.” (See id.).

63. As its title suggests, Khazan discloses the detection of malicious code using both static analysis and dynamic analysis. This detection may be performed using non-transitory computer-readable media, such as ROM, data storage devices, or other forms of media or storage, containing computer-executable instructions. (Khazan, ¶ 38, Ex. 1010). Similarly, the detection may be performed by a system comprising digital processing devices like computer processors. (Khazan, ¶ 72, Ex. 1010). First, static analysis is performed by identifying information about the application without execution of the code. (Khazan, ¶ 43, Ex. 1010). Specifically, a static analyzer in a malicious code detection system reviews the binary code of a software application to identify a set of predetermined target function calls that are made by the application. (See id.). As an example provided in Khazan, “it may be determined that the target function calls to be identified are those calls that are external to the application 102, such as those calls that are made to system functions.” (Khazan, ¶ 42, Ex. 1010). After performing this static analysis, the static analyzer outputs a list of locations within the application where a function call was invoked along with a reference to the target function. (Khazan, ¶ 42, Ex. 1010).

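The static-analysis step described above can be illustrated with a toy scanner. This sketch is not drawn from Khazan: the instruction format, the `TARGETS` set, and the `static_scan` function are all hypothetical. It only mirrors the idea of examining code without executing it and emitting each call site together with a reference to its target function.

```python
# Hypothetical set of predetermined external target functions.
TARGETS = {"CreateFile", "WriteFile", "socket"}

def static_scan(instructions):
    """Without executing anything, list (location, target) pairs for
    every call to one of the predetermined target functions."""
    call_sites = []
    for location, instruction in enumerate(instructions):
        op, _, operand = instruction.partition(" ")
        if op == "call" and operand in TARGETS:
            call_sites.append((location, operand))
    return call_sites

# A toy 'disassembly' of an application's code section.
app = ["push eax", "call CreateFile", "mov ebx, eax",
       "call helper", "call WriteFile"]

print(static_scan(app))  # [(1, 'CreateFile'), (4, 'WriteFile')]
```

Note that the call to the internal `helper` function is ignored, mirroring Khazan's example of identifying only calls external to the application.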
64. Khazan discloses that the information obtained from the static analysis is used to create an “application model” for distinguishing “between normal or expected behavior of code” and the behavior produced by [malicious code]. (Khazan, ¶ 65, Ex. 1010). The model comprises target function call names and other call-related information (e.g., parameter number, typing, and run-time values), which, as one of ordinary skill in the art would recognize, are a function call name and arguments. More specifically, the model, created using in