UNITED STATES PATENT AND TRADEMARK OFFICE

______________________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

TOYOTA MOTOR CORPORATION
Petitioner

v.

Patent of AMERICAN VEHICULAR SCIENCES
Patent Owner

Patent No. 6,772,057
Issue Date: August 3, 2004
Title: VEHICLE MONITORING SYSTEMS USING IMAGE PROCESSING

PATENT OWNER’S RESPONSE
PURSUANT TO 37 CFR § 42.120

Case No. IPR2013-00419

TABLE OF CONTENTS

I.    INTRODUCTION ............................................................................................ 1

II.   SUMMARY OF THE ’057 PATENT, SCOPE AND CONTENT OF THE PRIOR ART, AND LEVEL OF ORDINARY SKILL ......................................... 4

III.  GROUNDS FOR WHICH REVIEW HAS BEEN INSTITUTED .................. 8

IV.   CLAIM CONSTRUCTION ............................................................................ 9

V.    THE BOARD SHOULD CONFIRM VALIDITY OF CLAIMS 1-4, 7-10, 31, 41, 56, 59-62, AND 64 OVER THE GROUNDS ASSERTED IN THE PETITION .................................................................................................... 10

      A.  None of the References Raised In The Review Disclose a “Pattern Recognition Algorithm Generated From Data of Possible Exterior Objects and Patterns of Received Waves from the Possible Exterior Objects” (claims 1-4, 7-10, 31, 41, 56, 59-61, 62, 64) ......................... 10

          (1) Lemelson ................................................................................... 11

              a.  Lemelson does not expressly disclose the claim limitation .............................................................................. 12

              b.  The Board’s decision to grant review based on Lemelson relied on the doctrine of inherency ................... 12

              c.  Lemelson does not inherently disclose the claim limitation—it could have involved generating the algorithm with simulated data ............................................ 14

              d.  Lemelson does not inherently disclose the claim limitation—it also could have involved generating an algorithm with data and waves not representing exterior objects to be detected ........................................... 19

              e.  Toyota’s expert’s belated attempt at his deposition to read extra disclosure into Lemelson is unavailing ............ 22

          (2) Borcherts ................................................................................... 25

          (3) Asayama .................................................................................... 26

          (4) Yamamura ................................................................................. 26

          (5) Other References Cited In the Petition But For Which Review Was Not Granted ......................................................... 27

      B.  None of the Obviousness Grounds Raised In The Review Fix The Failure To Disclose a “Pattern Recognition Algorithm Generated From Data of Possible Exterior Objects and Patterns of Received Waves from the Possible Exterior Objects” (claims 4, 31, 59) ........... 27

VI.   THE BOARD SHOULD CONFIRM VALIDITY OF CLAIMS 30, 32-34, 37-39, AND 62 OVER THE GROUNDS ASSERTED IN THE PETITION .................................................................................................... 29

VII.  CONCLUSION .............................................................................................. 32

TABLE OF AUTHORITIES

Cases

Microsoft Corp. v. Proxyconn, Inc.,
   Case IPR2012-00026 (PTAB, Feb. 19, 2014) .......................... 13, 28, 29, 32

Scaltech, Inc. v. Retec/Tetra, LLC,
   178 F.3d 1378 (Fed. Cir. 1999) ............................................................... 13

Transclean Corp. v. Bridgewood Servs., Inc.,
   290 F.3d 1364 (Fed. Cir. 2002) ............................................................... 13

Verdegaal Bros. v. Union Oil Co. of California,
   814 F.2d 628 (Fed. Cir. 1987) ................................................................. 11

Statutes

35 U.S.C. § 314 ............................................................................................ 27

Rules

37 CFR § 42.120 .................................................................................. 1, 9, 27

I. INTRODUCTION

Patent Owner American Vehicular Sciences (“AVS”) submits the following response under 37 CFR §42.120 to the Petition filed by Toyota Motor Corporation (“Toyota”) requesting inter partes review of certain claims of U.S. Pat. No. 6,772,057 (“the ‘057 patent”). This filing is timely pursuant to the Board’s Scheduling Order and the parties’ stipulation extending the deadline to March 20, 2014. (See Paper 20, Scheduling Order (“The parties may stipulate to different dates for DUE DATES 1 through 3 (earlier or later, but no later than DUE DATE 4).”); Paper 30, Notice of Stipulation.)

AVS respectfully submits that the arguments presented and the additional evidence submitted, such as testimony from AVS expert Professor Cris Koutsougeras, PhD, show that at least claims 1-4, 7-10, 31, 41, 56, 59-62, and 64 of the ‘057 patent are not anticipated or obvious in view of the grounds for review. AVS also reiterates the arguments with respect to claims 30, 32-34 and 37-39.

Specifically, none of the prior art raised in the grounds for review discloses a key requirement in claims 1-4, 7-10, 31, 41, 56, 59-62, and 64 of the ‘057 patent—a “pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects.” (See Exhibit 1001, ‘057 patent at independent claims 1 and 56 and dependent claims 31 and 41 (emphasis added).) In other words, these claims require a pattern recognition algorithm that must be generated in this particular way. Toyota and its expert glossed over this claim requirement, suggesting that just any pattern recognition algorithm would suffice. But as AVS’s expert explains, and illustrates with evidentiary support, there are numerous different ways that a pattern recognition algorithm can be generated that would not satisfy this claim limitation.

Importantly, Toyota and its expert had only alleged that two out of the eight prior art references that it asserted in its Petition (Lemelson and Pomerleau) even disclosed a “pattern recognition algorithm” at all (much less one generated as required by the above-listed ‘057 patent claims). (See Paper 1, Toyota’s Petition at 10-21, 40-46.)

Out of those two references, the Board found that Pomerleau was not appropriate for an anticipation or obviousness ground for review, because it disclosed using a trained pattern recognition algorithm for detecting road lines—not “objects” as required by the claims. (See Paper 19, Board’s Decision to Institute Inter Partes Review (“Board Decision”) at pp. 34-37.) The Board therefore substantively denied review based on any ground premised on Pomerleau, and Pomerleau is therefore not at issue. (Id.)

With respect to Lemelson, Toyota and its expert only pointed to a single sentence in Lemelson that refers to how the pattern recognition algorithm is generated—a sentence that states that the training of Lemelson’s network involved “providing known inputs to the network resulting in desired output responses.” (See Paper 1, Toyota’s Petition at 11, citing Lemelson at 8:4-6.) Toyota glossed over the failure in Lemelson to disclose whether those “known inputs” included the specific inputs required by claims 1-4, 7-10, 31, 41, 56, 59-62, and 64.

As discussed below, Toyota’s arguments, and the Board’s comments in response, implicitly rest on the doctrine of inherency. In other words, because Lemelson does not expressly disclose generating a trained pattern recognition algorithm “from data of possible exterior objects and patterns of received waves from the possible exterior objects,” in order to find anticipation, Toyota was required to show that Lemelson “necessarily” included that type of algorithm generation (i.e., not that it was merely possible or probable that Lemelson used the claimed type of algorithm generation). Toyota, however, did not establish this requirement, and could not establish this requirement, because there are in fact several types of “known inputs” that Lemelson could have been referring to other than the inputs required by the subject ‘057 patent claims.

For example, Lemelson could have used simulated data to generate a pattern recognition algorithm, which would not involve “data of possible exterior objects and patterns of received waves from the possible exterior objects.” Or it could have used data or wave patterns relating to something other than “the possible exterior objects” for which the system is trying to provide a “classification, identification, or location.” For example, instead of training the system with data and patterns of received waves from cars, it could have involved training with images of license plates or tail-lights, which would fail to satisfy the claim.

As such, the instituted grounds for review do not establish anticipation or obviousness of at least claims 1-4, 7-10, 30-34, 37-39, 41, 56, 59-62, and 64 of the ‘057 patent. If the Board agrees that Lemelson does not “necessarily” disclose the claimed manner of generating an algorithm, then the instituted ground for review of claims 1-4, 7-10, 31, 41, 56, 59-62, and 64 based on anticipation by Lemelson fails, as do the instituted grounds for review of obviousness of claims 4, 31, and 59 (in view of Lemelson in combination with Borcherts or Asayama). And if the Board agrees with AVS regarding claim 30’s “on the rearview mirror” requirement, then claims 30, 32-34 and 37-39 also overcome the grounds for review. AVS requests that the Board confirm claims 1-4, 7-10, 30-34, 37-39, 41, 56, 59-62, and 64.¹

¹ AVS is not challenging the Board’s review of the remaining claims for which review was instituted, namely claims 40, 43, 46, 48, and 49.

II. SUMMARY OF THE ’057 PATENT, SCOPE AND CONTENT OF THE PRIOR ART, AND LEVEL OF ORDINARY SKILL

The ‘057 patent claims generally relate to technology for monitoring an environment exterior of a vehicle, where the vehicle determines if any object is in the path of the vehicle, classifies or identifies the object, and affects other systems in the vehicle in response to the classification or identification of the object. But what made the ‘057 patent groundbreaking and superior to prior vehicle collision avoidance systems was the specific way that it implemented the system. Independent claims 1 and 56 (and dependent claims 31 and 41) recite a key aspect of the invention—a processor coupled to at least one receiver (e.g., an infrared receiver, CCD array, or radar receiver), where the processor implements a trained pattern recognition system (such as a neural network) that is trained with data and patterns of received waves from possible exterior objects.

As AVS’s expert explains in his declaration, a pattern recognition system such as a neural network is fundamentally different from just a computer program. (Exhibit 2001, Koutsougeras Decl. at ¶ 15.) A computer program can be used if a programmer can guarantee knowing all possible variables. (Id.) But in an object detection system, this can be very difficult. (Id. at ¶¶ 15-16.) If the goal is to have the system detect whether an object is a car, it would be difficult to program such a system to compare a received image of a car to a database of images of all possible car models, in all possible colors, from all possible angles. (Id. at ¶ 18.)

For that reason, the inventor of the ‘057 patent developed a way to perform this object recognition using a “pattern recognition algorithm” such as a neural network. (Id. at ¶¶ 16-20.) A pattern recognition algorithm does not just compare a detected car to a database to find a match. Rather, it calculates degrees of similarity between something it has been told (or “trained”) is a car, versus something it has been told is not a car. (Id. at ¶ 18.) The more positive and negative examples (the “training set”) that the system is given, the more accurate it will be. (Id.)
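
By way of illustration only, and using hypothetical numbers rather than anything in the record, the distinction described above can be sketched in a few lines of Python. A classifier of this kind is not handed a database of every possible car image; it is handed labeled positive and negative examples and then scores a new, unseen input by its similarity to what it has learned:

    # Illustrative sketch only (hypothetical feature values, not record evidence).
    # The "model" is learned from labeled examples, not looked up in a database.

    def train_centroids(examples):
        """examples: (feature_vector, label) pairs, e.g. crude image features."""
        sums, counts = {}, {}
        for features, label in examples:
            acc = sums.setdefault(label, [0.0] * len(features))
            for i, value in enumerate(features):
                acc[i] += value
            counts[label] = counts.get(label, 0) + 1
        return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

    def classify(centroids, features):
        """Return the label whose learned summary is most similar to the input."""
        def distance(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(centroids, key=lambda label: distance(centroids[label], features))

    # Hypothetical training set: positive examples ("car") and negative examples.
    training_set = [
        ([0.90, 0.80, 0.20], "car"), ([0.85, 0.75, 0.30], "car"),
        ([0.10, 0.20, 0.90], "not car"), ([0.20, 0.10, 0.80], "not car"),
    ]
    model = train_centroids(training_set)
    print(classify(model, [0.80, 0.70, 0.25]))  # prints "car"

Consistent with the explanation above, the more labeled examples such a sketch is given, the more reliable its classifications tend to be.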

The inventor of the ‘057 patent also found that a specific type of training to generate the “training set” was the most effective. (See id. at ¶¶ 19, 20, 53.) The inventor disclosed and claimed generating the algorithm from “data of possible exterior objects and patterns of received waves from the possible exterior objects.” (Id.) For example, if the vehicle uses a radar receiver, a neural network could be trained with examples of received radar waves from possible objects such as cars, motorcycles, trucks, etc. (i.e., “patterns of received waves from the possible exterior objects”), plus labels indicating the classification and possibly other information relating to the example object (i.e., “data”). (Id. at ¶¶ 19-20.) The examples of received radar waves from possible objects used to generate the algorithm can, therefore, be real radar waves, so that the system knows how to recognize radar waves received from that same object or a similar one when the vehicle is later driving down the road. (Id. at ¶ 20.) This can be done, for example, by putting actual examples of a possible object in front of a vehicle radar system, letting the system hit the object with radar waves that are thereafter received back by the system, and then telling the system the identity and classification of the object.² (Id. at ¶ 20.) This is in contrast to other ways to train a pattern recognition system, such as through completely simulated data (a computer simulation of radar waves). (Id. at ¶¶ 46, 55-63.)

² This is not to say, of course, that every individual vehicle must be trained in this way. Once a single system has been trained, those saved examples of waves and label data can be transferred to other systems.
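
The contrast drawn in this paragraph can also be illustrated with a short sketch. The following is hypothetical (the names, interfaces, and values are illustrative assumptions, not anything disclosed in Lemelson or in the record); it shows only that a training set built from wave patterns actually received from staged objects, paired with labels identifying those objects, is assembled differently from one fabricated by a simulation program, even though either can be fed to the same network:

    # Illustrative sketch only, with hypothetical names and values.
    # Only the first function pairs patterns of waves actually received from
    # staged exterior objects with data (labels) identifying those objects;
    # the second never involves a received wave at all.

    def training_set_from_received_waves(recordings):
        """recordings: (received_wave_pattern, object_label) pairs captured by
        placing real example objects in front of the transmitter/receiver."""
        return [(wave, label) for wave, label in recordings]

    def training_set_from_simulation(simulate_return, object_labels):
        """simulate_return: a program that fabricates what a return might look
        like; no object is staged and no wave is ever received."""
        return [(simulate_return(label), label) for label in object_labels]

    # Hypothetical example values:
    real = training_set_from_received_waves(
        [([0.7, 0.4, 0.1], "car"), ([0.2, 0.6, 0.9], "truck")])
    synthetic = training_set_from_simulation(
        lambda label: [0.5, 0.5, 0.5], ["car", "truck"])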

As Professor Koutsougeras explains, therefore, the scope and content of the prior art to the ‘057 patent would have been narrower than that offered by Toyota and its expert, Dr. Papanikolopoulos. (Id. at ¶ 34.) Professor Koutsougeras explains that the scope and content of the prior art would not have included generically any “vehicle sensing systems,” as there are many vehicle sensing systems that have no relevance or application to external object detection or pattern recognition systems. (Id.) Rather, the scope and content of the prior art would have included sensors and pattern recognition algorithms for object classification, including those for automotive use. (Id.) AVS and Professor Koutsougeras, however, do not have any fundamental disagreement with the definition of the level of ordinary skill proposed by Toyota and Dr. Papanikolopoulos, and therefore have applied that definition of the level of ordinary skill for purposes of this IPR.

III. GROUNDS FOR WHICH REVIEW HAS BEEN INSTITUTED

Toyota’s Petition included sixteen proposed grounds for invalidity, based on eight different prior art references. (See Paper 19, Board Decision at pp. 6-7.) Of those sixteen proposed grounds, the Board granted review based on five. Specifically, the Board granted review on the following grounds:

•  Claims 1-4, 7-10, 40, 41, 46, 48, 49, 56, 59-61, and 64 as anticipated under 35 U.S.C. § 102(e) by Lemelson;

•  Claims 30-34, 37-39, and 62 for obviousness under 35 U.S.C. § 103(a) over Lemelson and Borcherts;

•  Claims 4, 43, and 59 for obviousness under 35 U.S.C. § 103(a) over Lemelson and Asayama;

•  Claim 34 for obviousness under 35 U.S.C. § 103(a) over Lemelson, Borcherts, and Asayama; and

•  Claims 30, 32, and 37-39 for obviousness under 35 U.S.C. § 103(a) over Yamamura and Borcherts.

(Paper 19, Board Decision at pp. 38-39.)

Accordingly, claims 1-3, 7-10, 41, 56, 60, 61, and 64 only stand reviewed for alleged anticipation by Lemelson. Review of those claims was not instituted based on any other prior art reference, nor on any other ground. Claims 30-34, 37-39 and 62 stand reviewed only for alleged obviousness in view of Lemelson and Borcherts. Claims 4 and 59 stand reviewed only for alleged anticipation by Lemelson or alleged obviousness in view of Lemelson and Asayama.

Pursuant to 37 CFR §42.120, AVS is addressing only the grounds for which review was instituted, for select claims. (See 37 CFR §42.120 (“A patent owner may file a response to the petition addressing any ground for unpatentability not already denied.”).)

IV. CLAIM CONSTRUCTION

For purposes of this IPR only, AVS does not contest the Board’s claim constructions. Any disagreements that AVS might have with the Board’s claim constructions are not material to the arguments in this Response.

In particular, the Board provided the following constructions for the following terms:

•  “trained pattern recognition algorithm” is construed as “an algorithm that processes a signal that is generated by an object, or is modified by interacting with an object, in order to determine to which one of a set of classes the object belongs, the algorithm having been taught, through a variety of examples, various patterns of received signals generated or modified by objects”;

•  “trained pattern recognition means” is construed as “a neural computer or neural network trained for pattern recognition, and equivalents thereof”;

•  “identify” is construed as “determine that the object belongs to a particular set or class” and “identification” as “determination that the object belongs to a particular set or class”;

•  “exterior object” is construed as “a material or physical thing outside the vehicle, not a part of the roadway on which the vehicle travels”;

•  “rear view mirror” is construed as “a mirror that faces to the rear, which necessarily excludes non-rear-facing mirrors”; and

•  “transmitter” is construed as “encompassing devices that transmit any type of electromagnetic waves, including visible light.”

V. THE BOARD SHOULD CONFIRM VALIDITY OF CLAIMS 1-4, 7-10, 31, 41, 56, 59-62, AND 64 OVER THE GROUNDS ASSERTED IN THE PETITION

A. None of the References Raised In The Review Disclose a “Pattern Recognition Algorithm Generated From Data of Possible Exterior Objects and Patterns of Received Waves from the Possible Exterior Objects” (claims 1-4, 7-10, 31, 41, 56, 59-61, 62, 64)

As discussed, independent claims 1 and 56 and dependent claims 31 and 41 require a specific type of training of the pattern recognition algorithm. These claims require a pattern recognition algorithm “generated from data of possible exterior objects and patterns of received waves from the possible exterior objects.” (See Exhibit 1001, ‘057 patent at claims 1, 56, 31, 41.)

None of the references at issue in the instituted grounds for review (i.e., Lemelson, Asayama, Borcherts, or Yamamura) disclose this claim limitation, either expressly or inherently. See Verdegaal Bros. v. Union Oil Co. of California, 814 F.2d 628, 631 (Fed. Cir. 1987) (“A claim is anticipated only if each and every element as set forth in the claim is found, either expressly or inherently described, in a single prior art reference.”).

(1) Lemelson

The only reference that the Board found may disclose a “pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects” is Lemelson. (See Paper 19, Board’s Decision at pp. 15-21.) Review of claims 1-4, 7-10, 41, 56, 59-61, and 64 was instituted for anticipation by Lemelson. Review of claims 31 and 62 (which additionally require a receiver arranged on a rear view mirror) was only instituted for obviousness over Lemelson in view of Borcherts. And review of claims 4 and 59 (which require an infrared transmitter) was also instituted for obviousness over Lemelson in view of Asayama.

Lemelson, however, does not expressly disclose the nature and manner of how its neural network algorithm is generated, and it does not inherently (i.e., “necessarily”) disclose that its neural network was generated as claimed.

a. Lemelson does not expressly disclose the claim limitation

Lemelson discloses a system for identifying objects exterior to a vehicle. (See Exhibit 1002, Lemelson at Abstract.) And it does disclose using a type of pattern recognition algorithm (a neural network) for identifying objects. (See Lemelson at 5:35-45.) The only discussion in Lemelson relating to generating the neural network, however, merely states that “[t]raining involves providing known inputs to the network resulting in desired output responses.” (See Exhibit 2001, Koutsougeras Decl. at ¶ 43, citing Lemelson at 8:4-6.)

This is the only sentence from Lemelson that Toyota cited in its Petition as relating to the nature of Lemelson’s pattern recognition algorithm generation or training. (See Paper 1, Toyota’s Petition at p. 11.) And it is the only sentence that Toyota’s expert, Dr. Papanikolopoulos, cites in his declaration with respect to how the trained pattern recognition algorithm in Lemelson is generated. (See Exhibit 1016, Papanikolopoulos Decl. at ¶¶ 47-64.) Nowhere else in Dr. Papanikolopoulos’s declaration does he allege that Lemelson discloses how its pattern recognition algorithm was generated. (See id.)

b. The Board’s decision to grant review based on Lemelson relied on the doctrine of inherency

The Board also did not rely on any express disclosure in Lemelson with respect to the “algorithm generated from” requirement of the subject ‘057 patent claims. The Board only found that Lemelson discloses training a neural network with “known inputs.” (Paper 19, Board Decision at 20.) From that, the Board only stated that “[i]t follows that, during training of the neural network, providing known inputs as disclosed in Lemelson involves providing the neural network with data identifying the potential roadway hazards, i.e., data of possible exterior objects, corresponding to the inputs, i.e., patterns of received waves from possible exterior objects.” (Id.) (emphasis added).

The Board’s use of the phrases “it follows that” and “involves” means that, although it did not reference the doctrine by name, the Board is applying the doctrine of inherency. Inherency, however, requires that a claimed limitation be “necessarily” and “inevitably” present. See Transclean Corp. v. Bridgewood Servs., Inc., 290 F.3d 1364, 1373 (Fed. Cir. 2002) (inherent anticipation is appropriate only when the prior art necessarily includes a claim limitation that is not expressly disclosed). It is not enough that a claim limitation was possibly or probably present in a prior art reference. See Scaltech, Inc. v. Retec/Tetra, LLC, 178 F.3d 1378, 1384 (Fed. Cir. 1999) (invalidity based on inherency is not established by mere “probabilities or possibilities”). See also, e.g., Microsoft Corp. v. Proxyconn, Inc., Case IPR2012-00026 (PTAB, Feb. 19, 2014) (“A finding of anticipation by inherency requires more than probabilities or possibilities. Based on the evidence discussed above, it is possible to infer that Perlman describes such permanent storage memory. However, Microsoft has not presented evidence that the computers or routers described by Perlman necessarily use permanent storage memory as recited in claims 1 and 3.”).

Here, the only way that Lemelson could inherently disclose a “pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects” would be if the “known inputs” referenced in Lemelson necessarily included “data of possible exterior objects and patterns of received waves from the possible exterior objects.”

Further, it is not enough to merely show that Lemelson discloses a “trained pattern recognition algorithm” when there are numerous different ways to generate such an algorithm other than the manner required by the claims. The ‘057 patent claims do not just claim a “trained pattern recognition algorithm,” period. The added requirement that the algorithm be “generated from data of possible exterior objects and patterns of received waves from the possible exterior objects” must also be disclosed in the prior art for there to be anticipation.

c. Lemelson does not inherently disclose the claim limitation—it could have involved generating the algorithm with simulated data

Lemelson does not inherently disclose the claimed manner of generating a pattern recognition algorithm because there are several other ways that Lemelson could have generated its pattern recognition algorithm. First, the system in Lemelson could have been generated using simulated data, rather than data from possible exterior objects and patterns of received waves from the possible exterior objects. (See Exhibit 2001, Koutsougeras Decl. at ¶¶ 55-63.)

Simulated data is data that does not include any “patterns of received waves from the possible exterior objects.” (Id.) Rather, it is generated by computer programs that simulate what sensors would be reading if they were detecting an object. (Id.) As Professor Koutsougeras explains, “[s]imulated data is therefore not data from objects or patterns of waves from objects—it is completely made-up data.” (Id. at ¶ 56.) In his declaration, he offers an analogy: simulated data is to real data as a cartoon is to a movie made with real actors. The cartoon provides a rough approximation of what a person is expected to look like, but is not nearly as accurate as video of a real actor. (See id. at ¶ 57.)

Professor Koutsougeras also explains that using simulated data for generating a pattern recognition algorithm for a vehicle could very well have been the “known inputs” referenced by Lemelson. (See id. at ¶¶ 55-63.) Lemelson claims priority to an application that was filed in 1993. (See id. at ¶ 58, citing Lemelson at cover.) In 1992, Pomerleau described in his thesis using simulated data to train a neural network on a vehicle.³ (See id.) In fact, Pomerleau devoted an entire section of his thesis to what was titled “Training With Simulated Data.” (See id. at ¶ 58, citing Exhibit 2004, Pomerleau at p. 38.) Pomerleau explained that he believed at the time that “the only way to achieve variety in the training set sufficient to ensure that the network learns a general internal representation was to generate the training set synthetically.” (Id., citing Exhibit 2004 at p. 38.) Pomerleau explained that:

    To generate synthetic training data for the task of autonomous road following, I developed a program that generated aerial views of simulated stretches of roads and then used a model of the camera to back-project the aerial map into a 2D image of the road ahead. The simulated road image generator used nearly 200 parameters in order to generate a variety of realistic road images. Some of the most important parameters are listed in Figure 3.1.

(Id. at ¶ 58, citing Exhibit 2004 at page 38.) And Pomerleau reported that generating the algorithm using simulated data worked. (See id. at ¶ 58, citing Exhibit 2004 at p. 40 (“Pomerleau reports in his thesis that, using the simulated training set of artificial road images, the network ‘could accurately drive Navlab I at a speed of 4 miles per hour along a 400 meter path through a wooded area of the CMU campus under sunny fall conditions.’”).)

³ AVS again notes that the Pomerleau article was rejected as the basis for a ground for review. Pomerleau discloses two ways to train a neural network—with real data and with simulated data. The Board recognized that the Pomerleau article disclosed using real data only for generating a neural network for detecting a road (not objects). The Pomerleau article also vaguely refers to programming parameters of objects for the “simulated road generator,” but not for training with real data (nor, as AVS explained in its Preliminary Response, does the Pomerleau article provide an enabling disclosure even for simulated data of objects). (See Paper 17, AVS’s Prelim. Resp. at p. 39.) In any event, the Pomerleau Thesis (a different publication) was not asserted by Toyota in its Petition, and it is not the basis of any ground for review.
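
To illustrate the general notion only (this is a hypothetical sketch, not a reconstruction of Pomerleau’s program or its parameters), a simulated-data generator fabricates training examples from a parametric model, so nothing in the resulting training set was ever received from a real object or scene:

    import random

    # Illustrative sketch only (hypothetical parameters and labels).
    def generate_simulated_example(params):
        """Synthesize one artificial 'sensor reading' and its label from
        adjustable parameters; nothing is measured from a real scene."""
        width = random.uniform(*params["road_width_range"])
        curvature = random.uniform(*params["curvature_range"])
        noise = random.random() * params["noise_level"]
        reading = [width, curvature, noise]  # made-up values standing in for a sensor
        label = "straight_road" if abs(curvature) < 0.5 else "curve"
        return reading, label

    # Hypothetical generator parameters (generators of the era could use far
    # more; Pomerleau describes nearly 200):
    params = {"road_width_range": (3.0, 4.0),
              "curvature_range": (-1.0, 1.0),
              "noise_level": 0.05}
    simulated_training_set = [generate_simulated_example(params) for _ in range(200)]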

Professor Koutsougeras also discusses how the use of simulated data for training a neural network was widely known and used in other contexts as well. For example, he cites U.S. Pat. No. 5,537,327, which involved the use of a trained neural network to detect impedance faults on a power line. (See Exhibit 2001, Koutsougeras Decl. at ¶ 59.) That patent included claim 4, “wherein said neural network training is accomplished by the use of simulated data,” and claim 6, “wherein said neural network training is accomplished by applying actual data.” (See Exhibit 2003, U.S. Pat. No. 5,537,327 at claims 1, 4, and 6) (emphasis added).

One reason why the “known inputs” of Lemelson may have been simulated data is that training with simulated data can be more desirable in some respects than training with real-world data. Using simulated data has advantages in being able to generate a large training set easily and ensure “balance” in a pattern recognition algorithm. (See, e.g., Exhibit 2004, Pomerleau Thesis at 38 (“I believed at the time that the only way to achieve variety in the training set sufficient to ensure that the network learns a general internal representation was to generate the training set synthetically.”).) (See also Exhibit 2001, Koutsougeras Decl. at ¶¶ 61-62.) For example, if a pattern recognition algorithm is generated with mostly compact sedans, and only a few SUVs, it may tend to recognize only cars that are closer in appearance to a compact sedan. As another analogy to illustrate the importance of a large number of examples for generating an algorithm, take the example of a system for detecting if an individual is male or female. If all of the female examples have long hair and all of the male examples have short hair, the system may believe that a long-haired male is a female. Using simulated data ma
