`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`
`
`
`
`
`
`
`
In re U.S. Patent No. 5,944,839

Filed: March 19, 1997

Issued: August 31, 1999

Inventor: Henri J. Isenberg

Assignee: Clouding IP, LLC

Title: System and Method for Automatically Maintaining a Computer System
`
`
`
`Mail Stop PATENT BOARD, PTAB
`Commissioner for Patents
`P.O. Box 1450
`Alexandria, VA 22313-1450
`
`
`PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 5,944,839
`UNDER 35 U.S.C. §§ 311-319 AND 37 C.F.R. § 42.100 ET SEQ.
`
`
`
`
`
`TABLE OF CONTENTS
`
`TABLE OF CONTENTS .......................................................................................... ii
`
`EXHIBIT LIST ......................................................................................................... iv
`
I.    INTRODUCTION ....................................................................................... 1

II.   MANDATORY NOTICES ........................................................................... 3
      A. Real Party-In-Interest ............................................................................ 3
      B. Related Matters ...................................................................................... 3
      C. Lead and Back-Up Counsel .................................................................... 4
      D. Service Information ............................................................................... 4

III.  PAYMENT OF FEES ................................................................................... 4

IV.   REQUIREMENTS FOR INTER PARTES REVIEW ................................... 5
      A. Grounds For Standing ............................................................................ 5
      B. Identification of Challenge ..................................................................... 5
         1. The Specific Art and Statutory Ground(s) on Which the Challenge is Based ........... 5
         2. How the Construed Claims Are Unpatentable Under the Statutory Grounds
            Identified in 37 C.F.R. § 42.204(b)(2) and Supporting Evidence Relied upon
            to Support the Challenge .................................................................. 7

V.    FACTUAL BACKGROUND ....................................................................... 8
      A. Declaration Evidence ............................................................................. 8
      B. The State of the Art ................................................................................ 9
      C. The ‘573 Patent Application ................................................................. 11
      D. The Prosecution History ....................................................................... 12

VI.   BROADEST REASONABLE CONSTRUCTION .................................... 14
      A. Sensors .................................................................................................. 15
      B. Artificial Intelligence (AI) Engine ........................................................ 16

VII.  REPRESENTATIVE PROPOSED REJECTIONS SHOWING THAT
      PETITIONER HAS A REASONABLE LIKELIHOOD OF PREVAILING ......... 17
      A. Claims 1, 2, 6, 8, 14, 15 and 17 are Rendered Obvious by Gurer Under
         35 U.S.C. § 103 ..................................................................................... 17
      B. Claims 1, 2, 6, 8, 14, 15 and 17 are Rendered Obvious by Allen ‘218 Under
         35 U.S.C. § 103 ..................................................................................... 34
      C. Claims 6, 8 and 14 are Rendered Obvious by Barnett Taken in View of
         Allen ‘664 Under 35 U.S.C. § 103 ........................................................ 47

VIII. CONCLUSION ........................................................................................... 54

CERTIFICATE OF SERVICE ................................................................................ 55
`
`
`
`
`
`
`
`
`
`
EXHIBIT LIST

1001  “System and Method for Automatically Maintaining a Computer System,”
      U.S. Patent No. 5,944,839 to Isenberg, issued August 31, 1999 (i.e., the
      ‘839 patent) (for inter partes review).

1002  Excerpts from the Prosecution History of Application No. 08/820,573,
      which matured into the ‘839 patent.

1003  An Artificial Intelligence Approach to Network Fault Management by
      Gurer, D. W., et al. (“Gurer”), published in February 1996.

1004  “Autonomous Learning and Reasoning Agent,” U.S. Patent No. 5,586,218 to
      Allen (“Allen ‘218”), filed May 23, 1994, issued December 17, 1996.

1005  “System and Method for Managing Faults in a Distributed System,”
      U.S. Patent No. 5,664,093 to Barnett et al. (“Barnett”), filed July 25,
      1996, issued September 2, 1997.

1006  “Case-based Reasoning System,” U.S. Patent No. 5,581,664 to Allen
      (“Allen ‘664”), filed May 23, 1994, issued December 3, 1996.

1007  Declaration of Dr. Todd Mowry, Ph.D.
`
`
`
`
`
`
`
`
`
I. INTRODUCTION
`
`Petitioner Oracle Corporation (“Oracle” or “Petitioner”) respectfully
`
`requests inter partes review for claims 1, 2, 6, 8, 14, 15 and 17 of U.S. Patent No.
`
`5,944,839 (the “‘839 patent,” attached as Ex. 1001) in accordance with 35 U.S.C.
`
`§§ 311–319 and 37 C.F.R. § 42.100 et seq.
`
`The ‘839 patent is generally directed to a system and method for automated
`
`maintenance of a computer system. More particularly, the ‘839 patent describes a
`
`maintenance tool that uses a set of sensors in combination with a case base
`
`database to diagnose and solve computer system problems. (Ex. 1001 at 1:57-59).
`
`If the information gathered by the sensors indicates a problem with the computer
`
`system, then the sensors activate an artificial intelligence (AI) engine. (Id. at
`
`Abstract). The AI engine uses the sensor inputs and information stored in the case
`
`base knowledge database to diagnose the likely cause of the problem and
`
`determine the best solution. (Id. at 2:4-5). If the information necessary to evaluate
`
`a case is not in the knowledge database, then the engine activates a sensor to gather
`
`additional information. (Id. at Abstract). Once the appropriate solution is
`
determined from the data, the AI engine activates the appropriate sensor to perform
`
`the repair. (Id. at 2:9-11). If the maintenance tool has gathered all the possible
`
`data and still does not have a solution to the computer problem, then the AI engine
`
`has failed to find a solution in the knowledge database. (Id. at 4:66 – 5:2).
`
`
`
`
`
`
`
`Accordingly, the tool saves the state of the computer system and the knowledge
`
`database to a location where the state and database can be examined by a human
`
`computer expert. (Id. at 5:3-6). Presumably, the human expert can then solve the
`
`computer problem and add the solution to the knowledge database. (Id. at 5:7-8).
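For illustration only, the operational flow just described can be sketched as a
short program. The sketch below is hypothetical; its names and structure are not
taken from the ‘839 patent or the exhibits, and it merely restates the sequence
of sensing, case-based diagnosis, supplemental data gathering, repair, and
escalation to a human expert.

    # Hypothetical sketch of the maintenance loop; all names are illustrative
    # and are not drawn from the '839 patent.
    def maintenance_cycle(sensors, knowledge_db, ai_engine, archive):
        # 1. Sensors gather data about the computer system.
        data = {s.name: s.read() for s in sensors}

        # 2. If no sensor reports a problem, nothing further happens.
        if not any(s.detects_problem(data) for s in sensors):
            return "ok"

        # 3. A detected problem activates the AI engine, which consults the
        #    case base for a likely cause and solution.
        solution = ai_engine.find_solution(data, knowledge_db)

        # 4. If a case cannot be evaluated for lack of data, the AI engine
        #    activates a sensor to gather the missing information and stores
        #    it in the knowledge database.
        while solution is None and ai_engine.can_gather_more(sensors):
            knowledge_db.store(ai_engine.pick_sensor_for_data(sensors).read())
            solution = ai_engine.find_solution(data, knowledge_db)

        # 5. Apply the repair through the appropriate sensor, or save the
        #    system state and database for review by a human expert.
        if solution is not None:
            ai_engine.pick_sensor_for_repair(sensors, solution).repair(solution)
            return "repaired"
        archive.save(state=data, database=knowledge_db)
        return "escalated to human expert"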
`
`As demonstrated by various references which were not before the Examiner,
`
`this type of maintenance tool was developed and published several years prior to
`
`the earliest claimed priority date. For instance, Gurer describes a system that uses
`
`case-based reasoning to automatically diagnose and correct faults in a computer
`
`network that it is monitoring. (Ex. 1003 at 1:10-16). The raw input to the system is
`
`a set of “alarms” which are produced by either the element manager software on a
`
`particular network element (e.g., an ATM switch) when it notices a hard error (e.g.,
`
`a link is down), or through software that performs statistical analysis of the
`
`network when it notices a statistical error (e.g., performance degradation due to
`
`congestion). (Id. at 1:30-34). Gurer further describes how the case-based
`
`reasoning system uses its library of cases (i.e., knowledge database) to determine a
`
`likely solution to the problem, which potentially involves deciding that the sensors
`
`should collect more information regarding the state of the network, and how the
`
`gathered information is stored in the knowledge database in the form of new cases.
`
`(Id. at 7:1-10). Likewise, Allen ‘218 describes a software agent which performs
`
`autonomous learning in a real-world environment, implemented in a case-based
`
`
`
`
`
`
`
`
`reasoning system and coupled to a sensor for gathering information from its
`
`environment. (Ex. 1004 at Abstract).
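Purely as an illustration of the alarm-driven flow that Gurer describes (alarms
raised by element managers or by statistical analysis, filtered and correlated,
then diagnosed against a case library), a minimal sketch follows. The data
structures, field names, and example values are hypothetical and are not drawn
from Gurer or the other exhibits.

    # Hypothetical sketch of alarm-driven, case-based fault management of the
    # kind Gurer describes; names, fields, and values are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class Alarm:
        element: str       # e.g., an ATM switch or other managed network element
        kind: str          # "physical" (hard error, e.g., link down) or "logical" (statistical)
        description: str

    def filter_alarms(alarms):
        """Drop redundant alarms (e.g., repeated occurrences of the same alarm)."""
        seen, unique = set(), []
        for a in alarms:
            key = (a.element, a.description)
            if key not in seen:
                seen.add(key)
                unique.append(a)
        return unique

    def diagnose(alarms, case_library):
        """Match filtered alarms against stored cases; return a proposed correction."""
        for case in case_library:
            if all(any(sym in a.description for a in alarms) for sym in case["symptoms"]):
                return case["correction"]
        return None  # no matching case; more data could be requested from element managers

    # Example use with made-up data:
    alarms = [Alarm("switch-7", "physical", "link down"),
              Alarm("switch-7", "physical", "link down")]
    library = [{"symptoms": ["link down"], "correction": "switch traffic to backup link"}]
    print(diagnose(filter_alarms(alarms), library))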
`
`As shown below, Gurer, Allen ‘218, and other references render obvious the
`
`challenged claims of the ‘839 patent.
`
`
`II. MANDATORY NOTICES
`
` Pursuant to 37 C.F.R. § 42.8(a)(1), Oracle provides the following mandatory
`
`disclosures.
`
`A. Real Party-In-Interest
`Pursuant to 37 C.F.R. § 42.8(b)(1), Petitioner certifies that Oracle is the real
`
`party-in-interest.
`
B. Related Matters
`Pursuant to 37 C.F.R. § 42.8(b)(2), Petitioner states that the ‘839 Patent is
`
`asserted in co-pending litigation captioned Clouding IP, LLC v. Oracle Corp.,
`
`D.Del., Case No. 1:12-cv-00642. This litigation remains pending. The patents-in-
`
`suit are U.S. Patents 6,631,449; 6,918,014; 7,596,784; 7,065,637; 6,738,799;
`
`5,944,839; 5,825,891; 5,678,042; 5,495,607; 7,254,621; 6,925,481. This IPR
`
`petition is directed to U.S. Patent 5,944,839; however, petitions corresponding to
`
`the remaining patents will be filed in the forthcoming weeks.
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`C. Lead and Back-Up Counsel
`
`Pursuant to 37 C.F.R. § 42.8(b)(3), Petitioner provides the following
`
`designation of counsel: Lead counsel is Greg Gardella (Reg. No. 46,045) and
`
`back-up counsel is Scott A. McKeown (Reg. No. 42,866).
`
D. Service Information
`Pursuant to 37 C.F.R. § 42.8(b)(4), papers concerning this matter should be
`
`served on the following.
`
Address:   Greg Gardella or Scott McKeown
           Oblon Spivak
           1940 Duke Street
           Alexandria, VA 22314
Email:     cpdocketgardella@oblon.com and cpdocketmckeown@oblon.com
Telephone: (703) 413-3000
Fax:       (703) 413-2220
`
`
`
`
`III. PAYMENT OF FEES
`
`The undersigned authorizes the Office to charge $27,200 to Deposit Account
`
`No. 15-0030 as the fee required by 37 C.F.R. § 42.15(a) for this Petition for inter
`
`partes review. The undersigned further authorizes payment for any additional fees
`
`that might be due in connection with this Petition to be charged to the above
`
`referenced Deposit Account.
`
`
`
`
`
`
`
`
`
`
`
`
`IV. REQUIREMENTS FOR INTER PARTES REVIEW
`
`As set forth below and pursuant to 37 C.F.R. § 42.104, each requirement for
`
`inter partes review of the ‘839 patent is satisfied.
`
`A. Grounds For Standing
`Pursuant to 37 C.F.R. § 42.104(a), Petitioner hereby certifies that the ‘839
`
`patent is available for inter partes review and that the Petitioner is not barred or
`
`estopped from requesting inter partes review challenging the claims of the ‘839
`
patent on the grounds identified herein. The ‘839 patent has not been subject to a

previous estoppel-based proceeding under the AIA, and the complaint referenced

above in Section II(B) was served on Oracle within the last 12 months.
`
`B. Identification of Challenge
`Pursuant to 37 C.F.R. §§ 42.104(b) and (b)(1), Petitioner requests inter
`
`partes review of claims 1, 2, 6, 8, 14, 15 and 17 of the ‘839 patent, and that the
`
`Patent Trial and Appeal Board (“PTAB”) invalidate the same.
`
1. The Specific Art and Statutory Ground(s) on Which the Challenge is Based
`
`
`
`Pursuant to 37 C.F.R. § 42.204(b)(2), inter partes review of the ‘839 patent
`
`is requested in view of the following references, each of which is prior art to the
`
`‘839 patent under 35 U.S.C. § 102(a) and/or (e):
`
`
`
`
`
`
`
`
`(1) An Artificial Intelligence Approach to Network Fault Management by
`
`Gurer, D. W., et al. (“Gurer,” Ex. 1003) published in February 1996. Gurer is prior
`
`art to the ‘839 patent under at least 35 U.S.C. § 102(a).
`
`(2) U.S. Patent No. 5,586,218 to Allen (“Allen ‘218,” Ex. 1004), issued
`
`December 17, 1996 from an application filed August 24, 1995. Allen ‘218 is prior
`
`art to the ‘839 patent under at least 35 U.S.C. § 102(e).
`
`(3) U.S. Patent No. 5,664,093 to Barnett (Ex. 1005), issued
`
`September 2, 1997 from an application filed July 25, 1996. Barnett is prior art to
`
`the ‘839 patent under at least 35 U.S.C. § 102(e).
`
`(4) U.S. Patent No. 5,581,664 to Allen (“Allen ‘664,” Ex. 1006), issued
`
`December 3, 1996 from an application filed May 23, 1994. Allen ‘664 is prior art
`
`to the ‘839 patent under at least 35 U.S.C. § 102(e).
`
`Gurer (Ex. 1003) renders obvious claims 1, 2, 6, 8, 14, 15 and 17 of the ‘839
`
`patent under 35 U.S.C. § 103.
`
`Allen ‘218 (Ex. 1004) renders obvious claims 1, 2, 6, 8, 14, 15 and 17 of the
`
`‘839 patent under 35 U.S.C. § 103.
`
`Barnett (Ex. 1005) in view of Allen ‘664 (Ex. 1006) renders obvious claims
`
`6, 8 and 14 of the ‘839 patent under 35 U.S.C. § 103.
`
`
`
`
`
`
`
`
2. How the Construed Claims Are Unpatentable Under the Statutory Grounds
   Identified in 37 C.F.R. § 42.204(b)(2) and Supporting Evidence Relied upon
   to Support the Challenge
`
`
` Pursuant to 37 C.F.R. § 42.204(b)(4), an explanation of how claims 1, 2, 6,
`
`8, 14, 15 and 17 of the ‘839 patent are unpatentable under the statutory grounds
`
`identified above, including the identification of where each element of the claim is
`
`found in the prior art, is provided in Section VII, below, in the form of claims
`
`charts. Pursuant to 37 C.F.R. § 42.204(b)(5), the appendix numbers of the
`
`supporting evidence relied upon to support the challenges and the relevance of the
`
`evidence to the challenges raised, including identifying specific portions of the
`
`evidence that support the challenges, are provided in Section VII, below, in the
`
`form of claim charts.
`
`
`
`
`
`
`
`
`
`
`
`
`
V. FACTUAL BACKGROUND

A. Declaration Evidence
`This Petition is supported by the declaration of Professor Todd C. Mowry
`
`from Carnegie Mellon University. (Ex. 1007). Professor Mowry offers his opinion
`
`with respect to the content and state of the prior art.
`
`Prof. Mowry is a Professor in Carnegie Mellon’s Department of Computer
`
`Science, has studied, taught, and practiced in the field of computer science for
`
`almost 20 years, and has been a professor of computer science since 1993. (Id. at
`
`¶ 1). Prof. Mowry was an Assistant Professor in the ECE and CS departments at
`
`the University of Toronto prior to joining Carnegie Mellon University in July,
`
`1997. (Id. at ¶ 3). Professor Mowry's research interests span the areas of computer
`
`architecture, compilers, operating systems, parallel processing, database
`
`performance, and modular robotics. He has supervised 11 Ph.D. students and
`
`advised numerous other graduate students. (Id. at ¶ 4).
`
`Prof. Mowry has authored over 70 publications and technical reports in the
`
`field of computer science. (Id.) He is an Associate Editor of ACM Transactions on
`
`Computer Systems (TOCS). Further, Prof. Mowry has received a Sloan Research
`
`Fellowship and the TR35 Award from MIT's Technology Review. (Id. at ¶ 6).
`
`
`
`
`
`
`
`
`
`
`
`
`B. The State of the Art
`
`
`The artificial intelligence (AI) engine disclosed in the ‘839 patent includes a
`
`large database, or case base, of knowledge held by computer experts. This
`
`knowledge includes that necessary to diagnose and correct problems with the
`
`general operation of the computer system. (See Ex. 1001 at 1:65 – 2:1). “Case-
`
`based reasoning” (CBR) is an AI problem-solving technique that originated with
`
`Roger Schank and his PhD students in the mid-1980s at Yale University. (Ex. 1007
`
`at ¶ 12). According to Schank, “[a] case-based reasoner solves new problems by
`
`adapting solutions that were used to solve old problems.” (Id.) The key word in
`
`this quote is “adapting.” (Id.) In contrast with rule-based reasoning, which
`
`performs a scripted action for a rule whenever the specific conditional test for that
`
`rule is satisfied, the motivation behind case-based reasoning was to take a more
`
`flexible and adaptive approach to problem-solving that draws upon analogies to
`
earlier solutions of related (but somewhat different) problems. (Id.) The proponents
`
`of case-based reasoning argue that drawing upon analogies to solve problems
`
`corresponds well to how humans solve problems, i.e., by recalling situations that
`
`remind them of their current problem, and by attempting to adapt the previous
`
`solution to the current circumstances. (Id.)
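The contrast drawn above can be made concrete with a small, purely hypothetical
sketch; none of the rules, cases, or field names below come from the record. A
rule-based system fires a fixed action whenever its conditional test is
satisfied, while a case-based system retrieves the most similar stored case and
adapts its old solution to the new circumstances.

    # Hypothetical illustration of rule-based vs. case-based reasoning; the
    # rules, cases, and similarity measure are invented for this example.
    def rule_based(problem):
        # Performs a scripted action whenever the specific conditional test is met.
        if problem == {"symptom": "disk full"}:
            return "delete temporary files"
        return None

    def case_based(problem, case_library):
        # Retrieves the most similar previous case and adapts its old solution.
        def similarity(case):
            return len(set(problem.items()) & set(case["problem"].items()))
        best = max(case_library, key=similarity)
        return best["solution"].replace(best["problem"].get("drive", ""),
                                        problem.get("drive", ""))

    library = [{"problem": {"symptom": "disk full", "drive": "C:"},
                "solution": "delete temporary files on C:"}]
    print(rule_based({"symptom": "disk full"}))
    print(case_based({"symptom": "disk full", "drive": "D:"}, library))  # adapted to D: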
`
`
`
`By 1997, case-based reasoning had become a mature research area that was
`
`being actively explored by dozens of research groups around the world. In fact,
`
`
`
`
`
`
`
`
there was so much published work on CBR by the early 1990s that multiple
`
`survey articles were published on this topic: Kolodner and Aamodt. (Id. at ¶ 15).
`
`The latter of these two articles, Aamodt, cites 75 papers on CBR. (Id.) The
`
`Aamodt survey paper also describes “The CBR Cycle” in Section 3.3 and Figure 1,
`
`which is the basic structure of nearly all CBR systems, and which has been cited
`
`numerous times. (Id.)
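As a purely illustrative sketch of the four-stage structure commonly associated
with the CBR cycle (retrieve, reuse, revise, retain), the skeleton below assumes
that cases are simple problem/solution records scored by a caller-supplied
similarity function; nothing in it is taken from the Aamodt survey itself.

    # Illustrative skeleton of a retrieve / reuse / revise / retain cycle; the
    # case format and the helper functions are assumptions made for this sketch.
    def cbr_cycle(new_problem, case_library, similarity, adapt, verify):
        # RETRIEVE: find the stored case most similar to the new problem.
        retrieved = max(case_library, key=lambda c: similarity(new_problem, c["problem"]))

        # REUSE: adapt the retrieved solution to the new problem.
        proposed = adapt(retrieved["solution"], retrieved["problem"], new_problem)

        # REVISE: test the proposed solution and repair it if it fails.
        confirmed = verify(new_problem, proposed)

        # RETAIN: store the confirmed experience as a new case for future reuse.
        case_library.append({"problem": new_problem, "solution": confirmed})
        return confirmed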
`
`Conferences and workshops were also being created that were devoted
`
`entirely to CBR: the 1st European Workshop on CBR (EWCBR) began in 1993,
`
`and the 1st International Conference on CBR (ICCBR) began in 1995. (Id.)
`
`
`
`
`
`
`
`
`
`
`
`
`C. The ‘573 Patent Application
`Application No. 08/820,573 (“the ‘573 application”), which issued as the
`
`‘839 patent, was filed on March 19, 1997.
`
`The ‘573 application describes a system and method for the automated
`
`maintenance of a computer system using a set of sensors in combination with a
`
`knowledge database to diagnose and solve computer system problems. The
`
`maintenance tool performs the following basic steps.
`
`First, a set of sensors act as a monitoring system that monitors the operation
`
`of the computer system. When one of the sensors detects a problem, the sensor
`
`activates an artificial intelligence (“AI”) engine. (Ex. 1001 at 1:61-64).
`
`Second, the AI engine uses the sensor inputs to diagnose the likely cause of
`
`the problem and determine the best solution. The AI engine includes a large
`
`database, or case base, of knowledge held by computer experts. This knowledge
`
`includes that necessary to diagnose and correct problems with the general
`
`operation of the computer system. (Id. at 1:65 – 2:1).
`
`Third, while processing the problem, the AI engine may reach a point where
`
`additional data are needed. If so, the AI engine requests the data from the
`
`appropriate sensor or sensors. (Id. at 2:5-8).
`
`Fourth, once the appropriate solution is determined from the data, the AI
`
`engine activates the appropriate sensor to perform the repair. (Id. at 2:9-11).
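For illustration only, the case base over which these four steps operate can be
pictured as a collection of problem/solution records. The layout below is
hypothetical; its field names and example values are not taken from the ‘839
patent.

    # Hypothetical case-base records; field names are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Case:
        symptoms: dict            # sensor readings that characterize the problem
        diagnosis: str            # likely cause determined by the AI engine
        solution: str             # repair to be carried out through a sensor
        data_needed: list = field(default_factory=list)  # extra data a sensor must gather

    case_base = [
        Case(symptoms={"free_disk_fraction": "< 0.05"},
             diagnosis="disk nearly full",
             solution="remove temporary files",
             data_needed=["largest_directories"]),
    ]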
`
`
`
`
`
`
`
`
`D. The Prosecution History
`
In the November 6, 1998 office action, claims 1-19 of the ‘573 application
`
`were rejected under 35 U.S.C. § 103(a) as being unpatentable over U.S. Patent No.
`
`5,809,493 (“Ahamed”) in view of U.S. Patent No. 5,619,656 (“Graf”). In
`
`response, in the Amendment dated January 21, 1999, the Patent Owner amended,
`
`amongst other claims, independent claims 1 and 7, and added new independent
`
`claim 20.
`
`(Ex. 1002 at January 27, 1999 Amendment, pg. 6).
`
`The Patent Owner summarized the amendments to independent claim 1 as
`
`
`
`follows:
`
`(Id. at pg. 7).
`
`
`
`The Patent Owner then distinguished Ahamed and Graf on the basis that
`
`
`
`
`
`
`
`
`“neither Ahamed nor Graf discloses an AI engine utilizing cases as claimed.
`
`Moreover, neither Ahamed nor Graf discloses saving gathered data in the
`
`knowledge database as claimed.” (Id.) Further, the Patent Owner stated that “one
`
`of ordinary skill in the art would not find saving a state as claimed to be obvious in
`
`light of Ahamed and Graf. Neither reference discloses or suggests saving the state
`
`of the computer system when a likely solution cannot be determined. Independent
`
`claims 7 and 20 recite similar limitations.” (Id. at pg. 8).
`
`The Examiner then issued a notice of allowability which did not include a
`
`statement of reasons for allowance.
`
`
`
`
`
`
`
`
`
`
`
`
`
`VI. BROADEST REASONABLE CONSTRUCTION
`
`Pursuant to 37 C.F.R. § 42.204(b)(3), the claims subject to inter partes review
`
`shall receive the “broadest reasonable construction in light of the specification of
`
`the patent in which [they] appear[].” See also In re Swanson, No. 07-1534 (Fed.
`
`Cir. 2008); In re Trans Texas Holding Corp., 498 F.3d 1290, 1298 (Fed. Cir. 2007)
`
`(citing In re Yamamoto, 740 F.2d 1569, 1571 (Fed. Cir. 1984)). As the Federal
`
`Circuit noted in Trans Texas, the Office has traditionally applied a broader
`
`standard than a Court does when interpreting claim scope. Moreover, the Office is
`
`not bound by any district court claim construction. Trans Texas, 498 F.3d at 1297-
`
`98, 1301. Rather,
`
the PTO applies to the verbiage of the proposed claims the
`broadest reasonable meaning of the words in their
`ordinary usage as they would be understood by one of
`ordinary skill in the art, taking into account whatever
`enlightenment by way of definitions or otherwise that
`may be afforded by the written description contained in
`applicant’s specification. In re Morris, 127 F.3d 1048,
`1054-55, 44 U.S.P.Q.2d 1023, 1027-28 (Fed. Cir. 1997).
`
`Because the standards of claim interpretation used by the Courts in patent litigation
`
`are different from the claim interpretation standards used by the Office in claim
`
`examination proceedings (including inter partes review), any claim interpretations
`
`submitted herein for the purpose of demonstrating a Reasonable Likelihood of
`
`Prevailing are neither binding upon litigants in any litigation, nor do such claim
`
`
`
`
`
`
`
`
`interpretations correspond to the construction of claims under the legal standards that
`
`are mandated to be used by the Courts in litigation.
`
`The interpretation of the claims presented either implicitly or explicitly
`
`herein should not be viewed as constituting, in whole or in part, Petitioner’s own
`
`interpretation and/or construction of such claims for the purposes of the underlying
`
`litigation. Instead, such constructions in this proceeding should be viewed only as
`
`constituting an interpretation of the claims under the “broadest reasonable
`
`construction” standard.
`
`All claim terms not specifically addressed below have been accorded their
`
`broadest reasonable interpretation in light of the patent specification including their
`
plain and ordinary meaning to the extent such a meaning could be determined by a
skilled artisan.
`
A. Sensors
`The ‘839 patent describes the sensors as follows:
`
`The sensors 112 are software programs that gather information from
`the computer system 300. (See, e.g., Ex. 1001 at 3:16-17).
`
`Accordingly, and as explained in the Declaration of Professor Mowry, the
`
`term “sensors” should be interpreted as including, under the broadest reasonable
`
`construction, different aspects of the same software program or different
`
`components of the same application. (Ex. 1007 at ¶¶ 21, 22, 24).
`
`
`
`
`
`
`
`
`B. Artificial Intelligence (AI) Engine
`The term “AI engine” should be interpreted as including, under the broadest
`
`reasonable construction, a different aspect of the same software program as the
`
`“sensors” discussed above in Section A. Both the sensors and the AI engine can be
`
`different components of the same software program or different components of the
`
`same application. (Ex. 1007 at ¶¶ 23-24).
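As a purely hypothetical illustration of the construction discussed above, and
not an assertion about how any embodiment of the ‘839 patent is actually coded,
the "sensors" and the "AI engine" could be different components of a single
software program, as in the following sketch (all names are invented).

    # Hypothetical program in which the "sensors" and the "AI engine" are merely
    # different components of the same application; names are illustrative only.
    import shutil

    class DiskSensor:
        """A 'sensor' component: gathers information from the computer system."""
        def read(self):
            usage = shutil.disk_usage("/")
            return {"free_fraction": usage.free / usage.total}

    class AIEngine:
        """The 'AI engine' component of the same program."""
        def __init__(self, case_base):
            self.case_base = case_base

        def diagnose(self, reading):
            for case in self.case_base:
                if reading["free_fraction"] < case["threshold"]:
                    return case["solution"]
            return None

    if __name__ == "__main__":
        engine = AIEngine([{"threshold": 0.05, "solution": "remove temporary files"}])
        print(engine.diagnose(DiskSensor().read()))  # both components run in one program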
`
`
`
`
`
`
`
`
`
`
`
`
`
`VII. REPRESENTATIVE PROPOSED REJECTIONS SHOWING THAT
`PETITIONER HAS A REASONABLE LIKELIHOOD OF
`PREVAILING
`
`The references addressed below each provide the teaching believed by the
`
`Examiner to be missing from the prior art and variously anticipate or render
`
`obvious the claimed subject matter. It should be understood that rejections may be
`
`premised on alternative combinations of, or citations within, these same references.
`
`A. Claims 1, 2, 6, 8, 14, 15 and 17 are Rendered Obvious by Gurer
`Under 35 U.S.C. § 103
An Artificial Intelligence Approach to Network Fault Management authored
`
`by Gurer et al. (“Gurer,” Ex. 1003) was not considered during the original
`
`prosecution of the ‘839 patent, nor is it cumulative of any prior art considered by
`
`the Examiner. Gurer was published in February 1996. The earliest effective
`
`priority date of the ‘839 patent is March 19, 1997. Therefore, Gurer is available as
`
`prior art against all claims of the ‘839 patent under 35 U.S.C. § 102(a). The
`
`following claim chart demonstrates, on a limitation-by-limitation basis, how claims
`
`1, 2, 6, 8, 14, 15 and 17 of the ‘839 patent are rendered obvious by Gurer under 35
`
`U.S.C. § 103(a). (See Ex. 1007 at ¶¶ 25-48).
`
`
US 5,944,839 Claim Language: 1. A tool for automatically maintaining a computer
system having a processor and a memory, the tool comprising:

Correspondence to Gurer: Gurer discloses that

      “[a]utomation of network management activities can benefit from the use
      of artificial intelligence (AI) technologies, including fault management,
      performance analysis, and traffic management. Here we focus on fault
      management, where the goal is to proactively diagnose the cause of
      abnormal network behavior and to propose, and if possible, take
      corrective actions.” (See Ex. 1003 at 1:11-14).

The declaration of Prof. Mowry sets forth the reasons why one skilled in the art
would understand that the network management system of Gurer would necessarily
have a processor and a memory associated therewith. (Ex. 1007 at ¶¶ 28-29).

US 5,944,839 Claim Language: a knowledge database stored in the memory and
holding a plurality of cases describing potential computer problems and
corresponding likely solutions

Correspondence to Gurer: Gurer discloses a knowledge database (“case library”)
holding a plurality of cases describing potential computer problems and
corresponding likely solutions.

      “Case-based reasoning is based on the premise that situations recur with
      regularity. Studies of experts and their problem solving techniques have
      found that experts rely quite strongly on applying their previous
      experiences to the current problem at hand. CBR can be thought of as such
      an expert that applies previous experiences stored as cases in a case
      library. Thus, the problem-solving process becomes one of recalling old
      experiences and interpreting the new situation in terms of those old
      experiences.” (Ex. 1003 at 6:35-39).

The declaration of Prof. Mowry sets forth the reasons why one skilled in the art
would understand that the knowledge database (or “case library” of Gurer, which
is part of the case-based reasoning system) would necessarily be stored in a
memory of the computer system. (Ex. 1007 at ¶¶ 30-31).

US 5,944,839 Claim Language: a plurality of sensors stored in the memory and
executing on the processor and adapted for gathering data about the computer
system, storing the data in the knowledge database, and detecting whether a
computer problem exists from the data and the plurality of cases; and

Correspondence to Gurer: Gurer discloses a plurality of sensors (“alarms”)
adapted for gathering data about the computer system, storing the data in the
knowledge database, and detecting whether a computer problem exists from the
data and the plurality of cases.

      “The first step in fault management is to collect monitoring and
      performance alarms. Typically alarms are produced by either managed
      network elements (e.g., ATM switches, customer premise equipment) or by a
      statistical analysis of the network that monitors trends and threshold
      crossings. Alarms can be classified into two categories, physical and
      logical, where physical alarms are hard errors (e.g., a link is down),
      typically reported through an element manager, and logical alarms are
      statistical errors (e.g., performance degradation due to congestion).”
      (Ex. 1003 at 1).

      “Alarm filtering is a process that analyzes the multitude of alarms
      received and eliminates the redundant alarms (e.g., multiple occurrences
      of the same alarm). Alarm correlation is the interpretation of multiple
      alarms such that new conceptual meanings can be assigned to the alarms,
      creating derived alarms. Faults are identified by analyzing the filtered
      and correlated alarms and by requesting tests and status updates from the
      element managers, which provide additional information for diagnosis.”
      (Id. at 2:1-5).

      “The more complex processes of fault management include alarm filtering
      and correlation, fault identification, and correction. Many of these
      functions involve analysis, correlation, pattern recognition, clustering
      or categorization, problem solving, planning, and interpreting data from
      a knowledge base that contains descriptions of network elements and
      topology.” (Id. at 3:2-4).

The declaration of Prof. Mowry sets forth the reasons why one skilled in the art
would understand that the network elements and network system of Gurer, which
generate the “alarms,” would necessarily be associated with a processor and a
memory therewith. (Ex. 1007 at ¶¶ 32-35). Further, Prof. Mowry explains why one
skilled in the art would understand that the collected alarms are stored
somewhere, e.g., in the knowledge database, because fault identification and
correction involve interpreting data from a knowledge base that contains
descriptions of network elements (which produce the alarms) and topology. (Id.)

US 5,944,839 Claim Language: an AI engine stored in the memory and executing on
the processor in response to detection of a computer problem and [the AI engine]
utilizing the plurality of cases to determine a likely solution to the detected
computer problem,

Correspondence to Gurer: Gurer discloses an AI system executing in response to
detection of a computer problem. Gurer discloses:

      “Automation of network management activities can benefit from the use of
      artificial intelligence (AI) technologies, including fault management,
      performance analysis, and traffic management. Here we focus on fault
      management, where the goal is to proactively diagnose the cause of
      abnormal network behavior and to propose, and if possible, take
      corrective actions . . . AI technologies may be used to automate the
      fault management process, in particular neural networks (NNs) and
      case-based reasoning (CBR).” (Ex. 1003 at 1:11-16).

Further, Gurer discloses that:

      “[a]nother area of fault management where AI technologies can have a
      positive impact, is fault correction. CBR systems, ESs [Expert Systems],
      or intelligent planning systems can develop plans or courses of action
      that will correct a fault that has been identified and verified.” (Id. at
      3:27-29).

See also the five-step CBR problem solving process shown at Ex. 1003, p. 7.

The declaration of Prof. Mowry sets forth the reasons why one skilled in the art
would understand that the AI engine of Gurer would necessarily be stored in a
memory of the computer system. (Ex. 1007 at ¶¶ 36-38).

US 5,944,839 Claim Language: wherein when the knowledge database lacks data
necessary to determine a likely solution to the computer problem, the AI engine
activates a particular sensor in the plurality of sensors to gather the
necessary data and store the data in the knowledge database, and

wherein when the knowledge database does not describe a likely solution to the
computer problem, the AI engine saves the gathered data in the knowledge
database as a new case.

Correspondence to Gurer: Gurer discloses

      “The filtering and correlation of alarms is the first step of fault
      diagnosis. The second step involves further analysis and identification
      of the exact cause of the alarms, or the fault. This process is an
      iterative one where alarm data are analyzed and decisions are made
      whether more data should be gathered, a finer grained analysis should be
      executed, or problem solving should be performed. Gathering more data can
      consist of sending