`________________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`________________________
`
`Linear Technology Corporation
`Petitioner
`v.
`In-Depth Test LLC
`Patent Owner
`
`Patent No. 6,792,373
`Issued: September 14, 2004
`Filed: May 24, 2002
Inventor: Eric Paul Tabor
`
`Title: METHODS AND APPARATUS FOR SEMICONDUCTOR TESTING
`________________________
`
`Inter Partes Review No. Unassigned
`
`__________________________________________________________________
`
`
`
`
`
`DECLARATION OF ADIT SINGH
`REGARDING U.S. PATENT NO. 6,792,373
`
`
`Declaration of Adit Singh
`Regarding U.S. Patent No. 6,792,373
`
`I, Adit Singh, a resident of Auburn, Alabama, declare as follows:
`
1. I have been retained by McDermott Will & Emery to provide my opinion concerning the validity of U.S. Patent No. 6,792,373 (“the ‘373 Patent”) (Ex. 1001).

2. McDermott Will & Emery is compensating me for my time at the rate of $500 per hour.
`
3. My declaration contains the following sections beginning at the designated pages:

I.    Basis for My Opinion ........................................................................ 4
II.   Introduction and Qualifications ........................................................ 5
III.  My Understanding of the Governing Law ....................................... 8
      A. Types of Claims -- Dependent and Independent Claims ............ 8
      B. Patentability and Validity of Claims ........................................... 8
      C. IPR Proceedings and Claim Interpretation ................................ 10
      D. Relevant Time Period ................................................................ 11
      E. Level of Ordinary Skill in the Art and Relevant Timeframe ..... 13
IV.   Background of the Relevant Discipline of the ‘373 Patent ........... 13
V.    The Technical Overview of The ‘373 Patent ................................ 17
VI.   My Interpretation of Certain Claim Terms ................................... 21
      A. “component” .............................................................................. 21
      B. “recipe file” ................................................................................ 22
VII.  References Used In My Analysis .................................................. 23
      A. Lane - U.S. Patent No. 4,967,381 .............................................. 23
      B. Western ....................................................................................... 30
VIII. Analysis of the ‘373 Patent Claims ............................................... 36
      A. General Description of the Claims and Organization of Analysis ... 36
      B. Invalidity of the Independent Claims ......................................... 37
         i.    Independent Claim 1 ............................................................. 37
         ii.   Independent Claim 8 ............................................................. 52
         iii.  Independent Claim 15 ........................................................... 57
      C. Invalidity of the Dependent Claims ........................................... 61
         iv.   Dependent Claims 2, 9, and 16 ............................................ 61
         v.    Dependent Claims 3, 10, and 17 .......................................... 64
         vi.   Dependent Claims 4, 11, and 18 .......................................... 66
         vii.  Dependent Claims 5, 12, and 20 .......................................... 68
         viii. Dependent Claims 6 and 13 ................................................. 69
         ix.   Dependent Claims 7, 14, and 19 .......................................... 72
IX.   CONCLUSION ............................................................................. 74
`
`
I. Basis for My Opinion

4. In preparing this declaration, I have reviewed:
`
`a. United States Patent No. 6,792,373 (the “ʼ373 Patent”) (Ex.
`
`1001);
`
`b. United States Patent No. 4,967,381 (“Lane”) (Ex. 1002);
`
`c. Western Electric Co., STATISTICAL QUALITY CONTROL
`
`HANDBOOK (Delmar Printing Co., 1956) (“Western”) (Ex.
`
`1003);
`
`d. Linear Tech. Corp. v. In-Depth Test LLC, Case IPR2015-
`
`00421— Paper 12, Patent Owner’s Preliminary Response (the
`
`“Preliminary Response”);
`
`e. Reexamination 90/008,313—Request for Reexamination dated
`
`October 30, 2006 (Ex. 1009);
`
`f. Andrew Grochowski, Integrated Circuit Testing for Quality
`
`Assurance in Manufacturing: History, Current Status, and
`
`Future Trends, 44 IEEE TRANS. ON CIR. & SYS. 8, 610-33 (Aug.
`
`1997) (“Grochowski”) (Ex. 1010);
`
`g. Higaki, Remote Monitoring and Control of Semiconductor
`
`Processing, Hewlett-Packard Journal, 30 (Jul. 1985) (“Higaki”)
`
`(Ex. 1011);
`
`h. Computer Focus – International, Hewlett Packard (Sept. 1985)
`
`(Ex. 1012);
`
i. Semiconductor Process Analysis Software Incorporates Graphics

Package, IEEE CG&A, 8 (Dec. 1985) (Ex. 1013).
`
`5.
`
`In forming my opinions expressed below, I have considered the above
`
`listed documents and my experience and knowledge based on my work in this area
`
`as described below. I also have provided a claim chart summarizing the support
`
`for my opinion expressed in this declaration. In some instances, I have provided
`
`additional citations of support in the claim chart that do not appear in my analysis
`
`below, which I have found relevant in forming my opinion. I understand that my
`
`claim chart has been submitted in this filing as Exhibit 1008.
`
II. Introduction and Qualifications

6. I received an undergraduate degree from the Indian Institute of
`
`Technology (IIT) in Kanpur in 1976, and my M.S. and Ph.D. from Virginia Tech
`
`in 1978 and 1982, respectively, all in Electrical Engineering.
`
`7.
`
`Since September 2002, I have been serving as a James B. Davis
`
`Distinguished Professor of Electrical and Computer Engineering at Auburn
`
`University, where I direct the VLSI Design and Test Laboratory. Before joining
`
`Auburn in 1991, I was Associate, and earlier Assistant, Professor of Electrical and
`
`Computer Engineering at the University of Massachusetts in Amherst. In the past,
`
`I held visiting positions during sabbaticals at major universities, most recently in
`
`2012 serving as “Guest Professor” at the University of Freiburg, Germany. My
`
`research program directed to semiconductor testing has received extensive support
`
from the U.S. National Science Foundation and private industry, and also from

international agencies such as the Max Planck Society of Germany, the Fulbright
`
`Foundation, the Ministry of Science and Technology in India, and the National
`
`Science Council of Taiwan.
`
`8. My technical expertise spans all aspects of VLSI technology, in
`
`particular, integrated circuit test and reliability. I am particularly recognized for
`
`my pioneering contributions to statistical methods in test and adaptive testing. In
`
`this regard, I have published over two hundred research papers, served as a
`
`consultant to many of the largest semiconductor companies around the world, and
`
`am the primary inventor listed in several international patents, some of which have
`
`been licensed to industry. I have also held leadership roles as General Chair/Co-
`
`Chair/Program Chair for dozens of international VLSI design and test conferences.
`
`Most recently I was Program Chair of the 2014 International Conference on VLSI
`
`Design, Co-Chair of the 2014 Workshop on Reliability Aware Design, and am the
`
`Program Chair for the 2015 Asian Test Symposium. I also currently serve on the
`
editorial boards of IEEE Design and Test Magazine and the Journal of Electronic

Testing: Theory and Applications (JETTA), and on the Steering and Program Committees of many
`
`of the major IEEE international test and design automation conferences.
`
`9.
`
`Over the years, I have become a popular lecturer. In addition to the
`
`dozens of talks and seminars I have presented around the world directed to
`
`research involving semiconductor testing, I have been regularly invited by
`
`conferences and industry events to conduct training courses on cutting edge
`
`technical topics in my specialty. I have conducted almost 100 such courses,
`
`ranging from a half day to three days in length, in over a dozen different countries,
`
`and in-house for many major companies (IBM, Texas Instruments, AMD, National
`
`Semiconductor, NXP, Advantest, Bell Labs, etc.). For more than ten years, I have
`
`been regularly invited to conduct a half or full day short course on statistical and
`
`adaptive test methods at the flagship International Test Conference, the largest
`
`annual technical meeting on integrated circuit testing worldwide.
`
`10.
`
`I also have received numerous research and teaching awards. I was
`
`made a Fellow of IEEE, the world’s largest engineering professional society, in
`
`2002 for “contributions to defect based testing and test optimization in VLSI
`
`circuits”. I am a Golden Core member of the IEEE Computer Society. I served
`
`two elected terms (2007-11) as Chair of the IEEE Test Technology Technical
`
Council (TTTC), and I currently serve (2011-15) on the Board of Governors of the

IEEE Council on Electronic Design Automation (CEDA).
`
`III. My Understanding of the Governing Law
`
A. Types of Claims -- Dependent and Independent Claims

11. I understand that patents have two types of claims – independent
`
`claims and dependent claims. I understand that independent claims only include
`
`the aspects stated in those independent claims. I further understand that dependent
`
`claims include the aspects stated in the dependent claim plus the aspects stated in
`
`the independent claim from which the dependent claim depends.
`
B. Patentability and Validity of Claims

12. I understand that, for a patent claim to be valid, the claimed invention must be new, must not be obvious, and must be useful. To determine whether a patent meets these requirements, one must look at each of the claims. I understand that a patent claim is not valid if it is not new, if it is obvious, or if it is not useful.
`
`13.
`
`I understand that prior art refers to publically available information
`
`(e.g., published, on sale, or in public use in the United States) before the “critical
`
`date” of a particular patent claim.
`
`14.
`
`I understand that a patent claim is not new (which I understand to be
`
`termed “anticipated”) if each element of the claim is disclosed expressly or
`
`inherently in a single prior art reference.
`
`15.
`
`I further understand that the determination of whether a claim is
`
`obvious is evaluated from the perspective of a person of ordinary skill in the
`
`relevant area of the invention, at the time the invention was made. In analyzing
`
`obviousness, I understand that it is important to understand the scope of the claims
`
`at issue, the level of skill in the relevant area of the invention, the scope and
`
`content of the prior art references, the differences between the prior art references
`
`and the claims, and any secondary considerations that would demonstrate that an
`
`invention is not obvious. I also understand that if a technique has been used to
`
`improve one system or method, and a person of ordinary skill in the relevant area
`
`would improve similar systems or methods in the same way, using the technique is
`
`obvious unless actual application is beyond his or her skill to do so. I understand
`
`that if more than one reference is used, there must be a motivation to combine the
`
`references through an explicit or implicit teaching, suggestion or motivation to
`
`arrive at the invention, or of prior art references, such as common sense of a person
`
`of skill in the relevant area, market demand, or an industry need for the invention.
`
`16.
`
`I understand that secondary considerations indicating that a patent
`
`claim is not obvious may include evidence of commercial success caused by the
`
`invention, evidence of a recognized need that was solved by the invention,
`
`evidence that others copied the invention, or evidence that the invention achieved a
`
`surprising result. I understand that such secondary considerations must have a
`
`causal relationship to the elements of a claim.
`
`17.
`
`I am unaware of any such secondary considerations relating to any
`
`claim (namely, claims 1 through 20) of the ‘373 Patent.
`
C. IPR Proceedings and Claim Interpretation

18. I understand that this “inter partes review” (“IPR”) proceeding is a
`
`proceeding before the United States Patent and Trademark Office (“USPTO”) for
`
`challenging the patentability of the ‘373 Patent. I understand that an IPR is
`
`conducted by the Patent Trial and Appeal Board (the “Board”) if a trial is
`
`instituted.
`
`19.
`
`I understand that in an IPR proceeding, the Board gives the challenged
`
`patent’s claims their broadest reasonable interpretation in light of the specification
`
`of the patent. I understand that the specification of a patent includes all of the
`
`figures, background discussions, any detailed description, examples, and claims
`
`within the patent document.
`
`20.
`
`I understand that the Board will look at the specification of the patent
`
`to see if a claim term has been defined by the patent applicant, and if not, will
`
`apply the broadest reasonable ordinary meaning from the perspective of a person
`
`of ordinary skill in the relevant area. However, I also understand that if a term has
`
`no previous meaning to those of ordinary skill in the relevant area, its meaning
`
`then must be found in the patent.
`
D. Relevant Time Period

21. I understand that the patent application leading to the ‘373 Patent (Ex.

1001) was filed on May 24, 2002. I understand that the ‘373 Patent purports to
`
`claim the benefit of U.S. Provisional Application No. 60/293,577, filed on May 24,
`
`2001; U.S. Provisional Application No. 60/295,188, filed May 31, 2001; U.S.
`
`Provisional Application No. 60/374,328, filed on April 21, 2002; and U.S.
`
`Continuation-in-part Application No. 09/872,195, filed May 31, 2001 (the “Priority
`
`Applications”). (Id.) It is my understanding that the owner of the ‘373 Patent
`
`might try to evidence a priority date based on one or more of the Priority
`
`Applications.
`
`22. Based on my review of the Priority Applications, it is my view that
`
none of the claims of the ‘373 Patent are supported by U.S. Provisional Application No.

60/293,577, filed May 24, 2001, or U.S. Continuation-in-part Application No. 09/872,195,
`
`filed May 31, 2001. It is also my view that at least claims 2, 4, 5, 7, 9, 11, 12, 14,
`
`16, 18, and 20 are not supported by U.S. Provisional Application No. 60/295,188,
`
`filed May 31, 2001. It is also my view that at least claims 2, 9, and 16 are not
`
`supported by U.S. Provisional Application No. 60/374,328, filed April 21, 2002.
`
`23.
`
`I understand that the earliest priority date a patent owner may
`
`evidence is one year prior to the earliest effective filing date of a patent. However,
`
`during the Reexamination of the ‘373 Patent, filed in November 2006, the inventor
`
`of the ‘373 Patent contested the patentability of the ‘373 Patent and submitted
`
Gneiting, which has a publication date of October 11, 2000, as evidence that at least
`
`independent claims 1, 8, and 15 were invalid. (See Ex. 1009.) Clearly the inventor
`
`believed that there was no opportunity to prove a priority date before this
`
`publication date.
`
24. The following table summarizes the earliest possible priority dates that I have attributed to each claim for the purpose of my analysis.

    Claims        Priority date used for my analysis
    1, 8, 15      October 12, 2000
    2, 9, 16      May 24, 2001
    3, 10, 17     October 12, 2000
    4, 11, 18     April 21, 2001
    5, 12, 20     April 21, 2001
    6, 13         October 12, 2000
    7, 14         April 21, 2001
    19            October 12, 2000
`
`
`
E. Level of Ordinary Skill in the Art and Relevant Timeframe

25. The ‘373 Patent is directed to a method and apparatus for testing
`
`semiconductors, including the identification of defective or potentially defective
`
`semiconductor components based on the analyzed data. Accordingly, I believe that
`
`a person of ordinary skill in the art in the field of developing the technology of the
`
`‘373 Patent would have a Bachelor of Science degree in combination with 1-2
`
years of training in semiconductor testing. This description is approximate, and a
`
`higher level of training or skill might make up for less education, and vice-versa.
`
`26.
`
`I believe that I would qualify as at least a person of ordinary skill in
`
`the art in the fields of using and developing the technology of the ‘373 Patent, as
`
`described above, and that I have a sufficient level of knowledge, experience and
`
`education to provide an expert opinion in these fields of the ‘373 Patent. This is
`
`true regardless of whether the testimony provided in this opinion is given in the
`
`past or present tense.
`
`IV. Background of the Relevant Discipline of the ‘373 Patent
`
`27.
`
`In the field of semiconductor manufacturing, generally hundreds of
`
`semiconductor circuits are printed or manufactured on a wafer at a time, and
`
`hundreds of wafers may be manufactured on a manufacturing line. Each wafer
`
`goes through up to 80 or more processing steps. The process is automated, and
`
`optimized at each step as a result of extensive testing to reduce defects. In this
`
`regard, it is desirable to know if any process step injects too many defects into a
`
`wafer or group of devices on a wafer before the same defects are carried on in
`
`further processing of the wafer, or injected into other wafers in the manufacturing
`
`line. Providing test data to semiconductor test engineers during the manufacturing
`
`process or after the semiconductor is manufactured allows the engineers to adjust
`
`the manufacturing process in the future. For example, identifying defects during
`
`the diffusion step may indicate that an extra level of cleaning is required, or that
`
`the chemicals used are unclean, thereby alerting the process engineer to use a new
`
`solution for the next wafer lot.
`
28. Up until about the late 1980s, in the field of semiconductor testing,
`
`integrated circuits were tested in an absolute way. That is, input signals were
`
`applied to semiconductor circuits and if the expected response was observed, the
`
`circuit was good. If the expected response was not received, the circuit was
`
`declared bad. Using digital circuits as an example, when an expected 0’s and 1’s
`
`pattern is received at the output in response to an input, the circuit is good.
`
Otherwise, the circuit is classified as bad, or defective. However, all possible test cases cannot be applied; for example, applying every possible input case to some microprocessors would take an inconceivable amount of time. So, any test applied to a circuit is necessarily incomplete, and only a fraction of all possible defects is actually tested.
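To illustrate this go/no-go style of testing, the following is a minimal sketch, written in Python solely for illustration; the function and data below are my own hypothetical examples and do not come from the ‘373 Patent or any cited reference.

# Illustrative only: absolute pass/fail ("go/no-go") digital testing of the kind
# described above. A device passes only if every observed output bit matches the
# expected bit; any mismatch classifies the circuit as bad.
def passes_absolute_test(expected_bits, observed_bits):
    if len(expected_bits) != len(observed_bits):
        return False  # a malformed capture is treated as a failure
    return all(e == o for e, o in zip(expected_bits, observed_bits))

# Hypothetical example: expected response to one input vector vs. the tester's capture.
expected = [0, 1, 1, 0, 1]
observed = [0, 1, 0, 0, 1]  # one mismatched bit
print(passes_absolute_test(expected, observed))  # prints False, i.e., "bad"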
`
29. As shown by the literature, the automated test equipment (ATE) industry
`
`was started as early as the late 1960s by Teradyne and other companies to test
`
`linear analog ICs. (Ex. 1010 at 612-13.) ATE is generally semiconductor test
`
`equipment that includes some form of automation in a manufacturing environment,
`
`for example by way of being synchronized with other equipment or by being
`
`controlled by a computer to automatically perform a series of tasks. (Id. at 611-13.)
`
`30.
`
` By the 1980s, automation was recognized in semiconductor testing as
`
`common, and wafer processing was controlled and monitored remotely using
`
`workstation computers. (Ex. 1011 at 30.) These computers stored and applied
`
`“recipes” or process programs to computer control repeatability of process steps.
`
`(Id.) Using a computer, a test engineer could “request equipment status, store
`
`measurement data, remotely control equipment, store and restore calibration data,
`
`and store and restore recipes. Equipment-generated alarm conditions [could be]
`
`recorded and reported.” (Id.)
`
`31. These computer systems also implemented statistical software to
`
`assist technicians in analyzing data received from semiconductor test equipment.
`
`For example, in 1985, Hewlett Packard implemented a statistical software program
`
`called Enhansys into its semiconductor test environment. (Ex. 1012 at 9.)
`
`Enhansys enabled engineers to retrieve and visually analyze semiconductor test
`
`data. (Ex. 1013 at 9.) Using Enhansys, test engineers could “sort, analyze, and
`
`query a data subset, and obtain a hard-copy report.” (Id.)
`
`32. The amount of data collected by computer-enabled test systems
`
`continued to grow as semiconductor processing and testing became more
`
`automated, and as the number of test measurements for each device increased.
`
`Mass production of semiconductor devices only amplified the amount of data
`
`collected. By the late 1980s, statistical software programs were in place to
`
`supplement visual graph inspection to determine whether a circuit was good or
`
bad. (See Ex. 1002.) In this manner, a computer generated a graph using pen-and-paper statistical charting methodologies that have been used since the 1920s to determine the quality of a batch of products or to control a process for
`
`the production of such products. (Ex. 1003 at 23, Preface.) Large amounts of test
`
`results produced from semiconductor test equipment could be analyzed by
`
`comparing the data against sample populations of statistically similar data. (See
`
`Ex. 1002 Abstract.)
`
`33.
`
`If the software program detected that a device under test behaved in
`
`an unexpected fashion different from the sample population, for example by
`
`producing “outlying” test results that stray from the sample population or
`
`otherwise behave in an outlier fashion, the device was classified as potentially
`
`defective and/or less reliable.
`
`34. By the 1990s, the trend in the industry was to offer automated test
`
`equipment (ATE) with mixed signal capabilities. (Ex. 1010 at 613.) Mixed-signal
`
`ATE systems were and still are sophisticated computer controlled systems that
`
`operate under a “test plan” written in a high level programming language such as C
`
`to test both analog and digital qualities of devices under test. Under the control of
`
`the test plan, the ATE “generates and applies stimuli to an IC . . . senses and
`
`digitizes the IC's responses, and . . . analyzes these responses.” (Id.)
`
`35. During this time, new statistical testing methods were employed for
`
`dealing with both analog and digital test results, and for more efficiently
`
`classifying whether a device was good, bad, or marginal/less reliable.
`
`V. The Technical Overview of The ‘373 Patent
`
`36. The ‘373 Patent is directed to a system for testing semiconductors.
`
(Ex. 1001 Abstract.) In this regard, I understand the ‘373 Patent to be an attempt
`
`to capture the methodology of outlier detection developed and used during the
`
`1990s, and its application to known ATE.
`
`37. The ’373 Patent describes testing semiconductors with test equipment
`
`connected to a computer to detect whether a test result is an outlier. Figure 1,
`
`reproduced below, depicts a tester 102 that tests components 106. (Ex. 1001 Fig.
`
`1.)
`
[Figure 1 of the ‘373 Patent, reproduced] (Ex. 1001, Fig. 1)
`
`
`
`38. The tester 102 used by the ’373 Patent is described as “automatic test
`
`equipment (ATE)” or “any test equipment that tests components 106 and generates
`
`output data.” (Id. at 3:24-37.) A “Teradyne tester” is given as an example. (Id. at
`
`3:37.) The tester 102 is described as operating in connection with a “computer
`
`system 108 that receives tester data from the tester.” (Id. at 3:42-49.)
`
`39. The computer implements a statistical engine (software) running on
`
`the computer to analyze data received from the tester. (Id. at 3:45-51.) The
`
`software includes a supplementary data analysis element 206 which analyzes
`
`output test data from the tester 102. (Id. at 5:25-33, FIG. 2.) The ’373 patent
`
`explains how the supplementary data analysis element 206 identifies outliers:
`
`The supplementary data analysis element 206 may operate in any
`suitable manner to designate outliers, such as by comparison to
`selected values and/or according to treatment of the data in the data
`smoothing process. For example, an outlier identification element
`
`according to various aspects of the present invention initially
`automatically calibrates its sensitivity to outliers based on selected
`statistical relationships for each relevant datum (step 434). Some of
`these statistical relationships are then compared to a threshold or other
`reference point, such as the data mode, mean, or median, or
`combinations thereof, to define relative outlier threshold limits. In the
`present embodiment, the statistical relationships are scaled, for
`example by one, two, three, and six standard deviations of the data, to
`define the different outlier amplitudes (step 436). The output test data
`may then be compared to the outlier threshold limits to identify and
`classify the output test data as outliers (step 438).
`(Ex. 1001 at 13:49-65 (emphasis added).)
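To illustrate how I read this passage, the following is a minimal sketch, written in Python solely for illustration; it assumes a simple mean-and-standard-deviation calibration, and it is not code from the ‘373 Patent or from any cited reference.

# Illustrative only: define outlier threshold limits scaled by one, two, three,
# and six standard deviations around a reference point (here, the mean), then
# record, for each test result, the largest scale whose limit it exceeds.
import statistics

def classify_outliers(results, scales=(1, 2, 3, 6)):
    reference = statistics.mean(results)     # the passage also mentions mode or median
    sigma = statistics.stdev(results)        # calibrates sensitivity to the data
    limits = {k: k * sigma for k in scales}  # relative outlier threshold limits
    classified = {}
    for index, value in enumerate(results):
        deviation = abs(value - reference)
        band = max((k for k in scales if deviation > limits[k]), default=None)
        if band is not None:
            classified[index] = band         # e.g., 2 means "beyond two sigma"
    return classified

readings = [1.01, 0.99, 1.02, 1.00, 0.98, 1.65, 1.01]  # hypothetical test data
print(classify_outliers(readings))  # flags the 1.65 reading as an outlier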
`
`40. The supplementary data analysis element 206 includes an outlier
`
`classification element 212, which is “configured to identify and/or classify the
`
`various outliers in the data according to selected algorithms.” (Id. at 14:7-14.) I
`
`note that the patent owner distinguished part classification from outlier
`
`identification, stating that part classification may be based on “traditional threshold
`
`testing” and/or the “number of outliers and in what range the outliers fall.” (Ex.
`
`1006 at 9-10.) To the extent that the classification element classifies data
`
`congruent with the statistical rules disclosed in the above example of outlier
`
`identification (Ex. 1001 at 13:49-65), the ’373 patent explains that the
`
`classification of the data may be used to identify outliers. For example, the ’373
`
`patent explains:
`
`Singh Declaration
`U.S. Pat. 6,792,373
`
`
`
`19
`
`
`
`
`
`
The outlier classification element may classify data in accordance
with conventional SPC control rules, such as Western Electric rules,
to characterize the data.

The outlier classification element suitably classifies the data using a
selected set of classification limit calculation methods. Any
appropriate classification methods may be used to characterize the
data according to the needs of the operator. The present outlier
classification element, for example, classifies outliers by comparing
the output test data to selected thresholds, such as values
corresponding to one, two, three, and six statistically scaled standard
deviations from a threshold, such as the data mean, mode, and/or
median. The identification of outliers in this manner tends to
normalize any identified outliers for any test regardless of datum
amplitude and relative noise.
`
` (Id. at 1-16 (emphasis added). Compare with id. at 14:59-61 (classifying
`
`components).)
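For context, the Western Electric rules referenced in this passage are conventional SPC zone tests. The sketch below, written in Python solely for illustration, shows two commonly stated rules (a single point beyond three standard deviations, and two of three consecutive points beyond two standard deviations on the same side of the centerline); it is not code from Western (Ex. 1003) or from the ‘373 Patent.

# Illustrative only: two commonly stated Western Electric zone rules applied to
# a sequence of measurements, given a process centerline and standard deviation.
def western_electric_flags(values, centerline, sigma):
    flags = []
    for i, value in enumerate(values):
        # Rule 1: a single point more than three sigma from the centerline.
        if abs(value - centerline) > 3 * sigma:
            flags.append((i, "rule 1"))
        # Rule 2: two out of three consecutive points more than two sigma from
        # the centerline, all on the same side.
        window = values[max(0, i - 2): i + 1]
        above = sum(1 for w in window if w > centerline + 2 * sigma)
        below = sum(1 for w in window if w < centerline - 2 * sigma)
        if len(window) == 3 and (above >= 2 or below >= 2):
            flags.append((i, "rule 2"))
    return flags

data = [10.1, 10.2, 13.5, 9.9, 12.3, 12.4, 10.0]  # hypothetical measurements
print(western_electric_flags(data, centerline=10.0, sigma=1.0))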
`
`41. The ’373 Patent further explains that the computer may generate an
`
`output report. The output is broadly defined by the ’373 Patent, which states that
`
`“[a]ny form, such as graphical, numerical, textual, printed, or electronic form, may
`
`be used to present the output report for use or subsequent analysis.” (Ex. 1001 at
`
`18:2-4.) The ’373 Patent also states that the “output report may be provided in any
`
`suitable manner, for example output to a local workstation, sent to a server,
`
`activation of an alarm, or any other appropriate manner (step 712). In one
`
`embodiment, the output report may be provided off-line such that the output does
`
`not affect the operation of the system.” (Id. at 18:57-62.)
`
`VI. My Interpretation of Certain Claim Terms
`
A. “component”

42. The ’373 Patent specification defines component by example to
`
`include “semiconductor devices on a wafer, circuit boards, packaged devices, or
`
`other electrical or optical systems.” (Ex. 1001 at 3:26-29.) The ‘373 Patent
`
`provides one other example in which "the general resistivity of resistor components
`
`in the semiconductor devices varies across the wafer," and discusses measuring the
`
`resistance of a “resistor component." (Ex. 1001 at 17:30-36 (emphasis
`
`added).) Because the ’373 Patent states that these resistor components are in the
`
`semiconductor devices, it is my understanding that a skilled person would
`
`conclude that the components may be discrete portions of a semiconductor device.
`
`43. Other than the foregoing examples, the ’373 Patent does not limit a
`
`component to any specific structure. Instead, the ’373 Patent describes the
`
`identification and/or selection of components generically, stating that a component
`
`may be identified based on “x-y coordinates corresponding to a position of the
`
`component 106 on a wafer map for the tested wafer” (Ex. 1001 at 4:15-18), and
`
`that “predetermined components may be selected according to any criteria, such as
`
`data for various circumferential zones, radial zones, random components, or
`
`individual stepper fields” (id. at 18:17-20). The ’373 Patent states that a
`
`“component” produces an output signal in response to a signal applied to the
`
`component by a tester, but does not discuss how such signals are structurally
`
`applied. (Ex. 1001 at 6:22-25.)
`
44. Accordingly, it is my opinion that the broadest reasonable interpretation of “component” is “any discrete portion of a semiconductor wafer, including any zone, field, chip, device, or other discrete portion having an identifiable position with respect to the wafer and that is subject to testing.”
`
B. “recipe file”

45. Claims 2, 9, and 16 recite configuration data being in or read from a
`
`recipe file using the computer system, and claim 16 recites “identifying the outlier
`
`according to the configuration data in the recipe file.”
`
`46. The ’373 Patent discloses that “configuration algorithms, parameters,
`
`and any other criteria may be stored in a recipe file.” (Ex. 1001 at 6:11–18.)
`
`However, this disclosure does not require, for example, that a configuration
`
`algorithm or parameter actually be stored in the claimed recipe file, and does not
`
`further define the recipe file with respect to the claims. The ’373 Patent also
`
`discloses “sensitivity parameters in a recipe configuration file” (Ex. 1001 at 17:1–
`
`3); however, claim 16 recites a broader “recipe file.”
`
47. Accordingly, it is my opinion that the broadest reasonable interpretation of “recipe file” is “a file storing configuration data for testing purposes.”
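To illustrate this construction, the following is a purely hypothetical sketch, written in Python solely for illustration, of reading configuration data from a recipe file; the file contents, field names, and format are my own assumptions and do not appear in the ‘373 Patent or any cited reference.

# Illustrative only: reading configuration data for outlier analysis from a
# hypothetical "recipe file." The keys and values below are my assumptions.
import json

recipe_text = """
{
  "test_program": "wafer_sort_rev_b",
  "outlier_reference": "median",
  "sigma_scales": [1, 2, 3, 6],
  "smoothing_enabled": true
}
"""

recipe = json.loads(recipe_text)  # the configuration data stored in the recipe
print(recipe["test_program"], recipe["sigma_scales"])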
`
`VII. References Used In My Analysis
`
`A. Lane - U.S. Patent No. 4,967,381
`48. U.S. Patent 4,967,381 (“Lane”) (Ex. 1002) describes a system that is
`
`useful in process control of machines and processes, and which provides “a set of
`
`predefined data management or data analysis tasks which the operator of the
`
`system can use when using the system to run a selected process.” (Ex. 1002 at
`
`2:56-3:39, 4:6-17.) “Acces