`__________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`__________________________________________________________________
`
`TOYOTA MOTOR CORPORATION
`
`Petitioner
`
`Patent No. 6,772,057
`Issue Date: Aug. 3, 2004
`Title: VEHICULAR MONITORING SYSTEMS USING IMAGE PROCESSING
`__________________________________________________________________
`PETITION FOR INTER PARTES REVIEW
`OF U.S. PATENT NO. 6,772,057
`PURSUANT TO 35 U.S.C. § 312 and 37 C.F.R. § 42.104
`Case No. IPR2015-00261
`__________________________________________________________________
`
`
`
`
`
TABLE OF CONTENTS

I.   MANDATORY NOTICES (37 C.F.R. § 42.8) ..................................................... 2
     A. Real Party-in-Interest (37 C.F.R. § 42.8(b)(1)) ........................................... 2
     B. Related Matters (37 C.F.R. § 42.8(b)(2)) ..................................................... 2
     C. Counsel & Service Information (37 C.F.R. §§ 42.8(b)(3)-(4)) ................... 3
II.  PAYMENT OF FEES (37 C.F.R. § 42.103) ......................................................... 4
III. REQUIREMENTS FOR IPR (37 C.F.R. § 42.104) ............................................ 4
     A. Grounds for Standing (37 C.F.R. § 42.104(a)) ............................................. 4
     B. Identification of Challenge (37 C.F.R. § 42.104(b)(1)-(2)) and
        Relief Requested (37 C.F.R. § 42.104(b)) ..................................................... 4
     C. Claim Construction (37 C.F.R. § 42.104(b)(3)) ........................................... 6
IV.  OVERVIEW OF THE ’057 PATENT ................................................................. 7
V.   HOW CHALLENGED CLAIMS ARE UNPATENTABLE
     (37 C.F.R. §§ 42.104(b)(4)-(5)) ............................................................................ 8
     A. Ground 1: Claims 1-4, 7-10, 41, 56, 59-61, and 64 Are Obvious
        Under 35 U.S.C. § 103(a) over Lemelson ..................................................... 8
        Lemelson already discloses that the neural network is trained with
        “known inputs.” ............................................................................................. 11
     B. Ground 2: Claims 1-4, 7-10, 41, 56, 59-61, and 64 Are Obvious
        Under 35 U.S.C. § 103(a) over Lemelson in View of Nishio .................... 23
     C. Ground 3: Claims 31 and 62 Are Obvious Under 35 U.S.C.
        § 103(a) over Lemelson in View of Borcherts ............................................ 25
VI.  CONCLUSION ..................................................................................................... 28
`
EXHIBITS

Exhibit 1101   U.S. Patent No. 6,772,057 to Breed et al.

Exhibit 1102   U.S. Patent No. 6,553,130 to Lemelson et al.

Exhibit 1103   File History for U.S. Patent Application No. 08/105,304

Exhibit 1104   European Patent Application No. 93112302 (Publication No. 0582236A1) to Nishio

Exhibit 1105   U.S. Patent No. 5,245,422 to Borcherts et al.

Exhibit 1106   Expert Declaration of Nikolaos Papanikolopoulos, Ph.D.

`
Pursuant to 35 U.S.C. §§ 311-319 and 37 C.F.R. Part 42, Toyota Motor Corporation (“Toyota” or “Petitioner”) respectfully requests inter partes review of claims 1-4, 7-10, 31, 41, 56, 59-62 and 64 of U.S. Patent No. 6,772,057 (“the ’057 patent”), filed on Nov. 22, 2002, issued on Aug. 3, 2004, to David S. Breed, and currently assigned to American Vehicular Sciences LLC (“AVS”) according to the U.S. Patent and Trademark Office (“the USPTO”) assignment records. There is a reasonable likelihood that Petitioner will prevail with respect to at least one claim challenged in this Petition.
`
This Petition for Inter Partes Review is being filed along with a motion requesting joinder with the pending inter partes review initiated by Mercedes-Benz USA LLC (“Mercedes”) concerning the ’057 patent: Mercedes-Benz USA LLC v. American Vehicular Sciences, LLC, Case No. IPR2014-00646 (“Mercedes 646 IPR”). This Petition does not propose any additional grounds beyond those that were instituted in the Mercedes 646 IPR. The only differences are that Toyota (1) does not request review of claims 16, 30, 40, 43, 46, 77, 78, or 81-83, which are at issue in the Mercedes 646 IPR, (2) only requests review based on Grounds B-D, and (3) requests review of dependent claims 3, 8-10 and 64, which are not at issue in the Mercedes 646 IPR.

Although Toyota requests review of additional dependent claims that are not specifically at issue in the Mercedes 646 IPR, Toyota requests review of these claims based on the same obviousness combinations set forth in instituted Grounds B and C. Addition of these claims will not create any additional issues or complicate the proceeding. These additional claims recite limitations that are disclosed in the primary Lemelson reference relied upon in these Grounds and do not necessitate consideration of additional references or obviousness issues. Moreover, in a prior proceeding (IPR2013-00419) on the ’057 patent involving Toyota and AVS, these claims were not separately disputed (beyond the disputes concerning the independent claims from which they depend and which are already at issue in the Mercedes 646 IPR).
`
`I. MANDATORY NOTICES (37 C.F.R. § 42.8)
`A. Real Party-in-Interest (37 C.F.R. § 42.8(b)(1))
`Petitioner, Toyota, is the real party-in-interest.
`
`B. Related Matters (37 C.F.R. § 42.8(b)(2))
The ’057 patent is currently the subject of the following litigation: American Vehicular Sciences LLC v. Toyota Motor Corp. et al., No. 14-CV-13019 (E.D. Mich.) (“AVS Litigation”), which was transferred from the District Court for the Eastern District of Texas in a litigation originally styled as American Vehicular Sciences LLC v. Toyota Motor Corp. et al., No. 6:12-CV-410. Petitioner is a named defendant in the AVS Litigation. The earliest that Petitioner or any of its subsidiaries was served with the complaint was July 26, 2012. Petitioner previously filed a petition for inter partes review in IPR2013-00419 on July 12, 2013, asserting invalidity of claims 1-4, 7-10, 30-34, 37-41, 43, 46, 48, 49, 56, 59-62 and 64 of the ’057 patent. On January 13, 2014 (Paper 19), the Board instituted the proceeding with respect to all challenged claims. That proceeding is currently pending.
`
Petitioner has also filed petitions in IPR2013-00420, -00421, -00422, and -00423, which addressed patents that were asserted against Toyota in the AVS Litigation, as well as IPR2013-00424, which addressed U.S. Patent No. 5,845,000 (“the ’000 patent”), the patent that was asserted against Toyota in a related case: American Vehicular Sciences LLC v. Toyota Motor Corp., et al., 12-CV-406 (E.D. Tex.) (which has also since been transferred to the Eastern District of Michigan). IPR2013-00420, -00422, and -00423 have settled and been terminated, and the Board has issued a final written decision in IPR2013-00421. IPR2013-00424 is currently pending. Petitioner is concurrently filing a motion for joinder in IPR2014-00647, which also addresses the ’000 patent. On November 13, 2014, Petitioner filed a request for ex parte reexamination of the ’057 patent. Petitioner is not aware of any other pending judicial or administrative matter that would affect, or be affected by, a decision in this proceeding.
`
C. Counsel & Service Information (37 C.F.R. §§ 42.8(b)(3)-(4))
Lead Counsel: Matt Berkowitz (Reg. No. 57,215)

Back-up Counsel: Thomas R. Makin (pro hac vice to be requested upon authorization)

Petitioner requests authorization to file a motion for Thomas R. Makin to appear pro hac vice as back-up counsel. Mr. Makin is an experienced litigation attorney in patent cases, admitted to practice law in New York and in several United States District Courts and Courts of Appeals. Mr. Makin has an established familiarity with the subject matter at issue and represents Petitioner as a defendant in the related AVS Litigation, identified above. Additionally, Mr. Makin was previously admitted pro hac vice as back-up counsel in IPR2013-00419 regarding the ’057 patent.

Electronic Service Information: ptab@kenyon.com and mberkowitz@kenyon.com

Post and Delivery: Kenyon & Kenyon LLP, One Broadway, New York, NY 10004

Telephone: 212-425-7200    Facsimile: 212-425-5288
`
II. PAYMENT OF FEES (37 C.F.R. § 42.103)
The USPTO is authorized to charge all fees required in connection with this Petition, as well as any other fees that may be required in connection with this Petition or these proceedings, to the deposit account of Kenyon & Kenyon LLP, Deposit Account 11-0600.
`
III. REQUIREMENTS FOR IPR (37 C.F.R. § 42.104)
A. Grounds for Standing (37 C.F.R. § 42.104(a))
Petitioner certifies that the ’057 patent (Ex. 1101) is available for inter partes review and that Petitioner is not barred or estopped from requesting an inter partes review challenging the patent’s claims on the grounds identified in this Petition.
`
B. Identification of Challenge (37 C.F.R. § 42.104(b)(1)-(2)) and Relief Requested (37 C.F.R. § 42.104(b))
Petitioner respectfully petitions for IPR of, and challenges, claims 1-4, 7-10, 31, 41, 56, 59-62 and 64 of the ’057 patent under 35 U.S.C. §§ 102 and 103. Cancellation of these claims is requested.
`
`4
`
`
`
`
`
`
`The ’057 patent claims priority back through several applications, the earliest of
`
`which is App. No. 474,786, which was filed on June 7, 1995, and issued as U.S. Patent
`
`No. 5,845,000 (“the ’000 patent”).
`
`Petitioner relies upon the following references in support of its petition. None
`
`of these references were of record during prosecution of the ’057 patent.
`
1) U.S. Patent No. 6,553,130 (“Lemelson,” Ex. 1102) issued from U.S. Appl. No. 08/671,853 (“’853 app.”), filed on June 28, 1996. The ’853 application is a continuation of U.S. App. No. 08/105,304 (“’304 app.,” Ex. 1103), which was filed on Aug. 11, 1993, and, as noted where applicable in this petition, contains materially the same disclosure as the ’853 app. Accordingly, Lemelson constitutes prior art against the ’057 patent under 35 U.S.C. § 102(e).

2) European Patent Application Publication No. 0582236A1 (“Nishio,” Ex. 1104), which published on Feb. 9, 1994, constitutes prior art against the ’057 patent under 35 U.S.C. § 102(b).

3) U.S. Patent No. 5,245,422 to Borcherts (“Borcherts,” Ex. 1105), which issued on Sept. 14, 1993, constitutes prior art against the ’057 patent under 35 U.S.C. § 102(b).
`
Petitioner requests that claims 1-4, 7-10, 31, 41, 56, 59-62 and 64 of the ’057 patent be cancelled on the following grounds:

Ground 1: Claims 1-4, 7-10, 41, 56, 59-61 and 64 are obvious under 35 U.S.C. § 103(a) over Lemelson.

Ground 2: Claims 1-4, 7-10, 41, 56, 59-61 and 64 are obvious under 35 U.S.C. § 103(a) over Lemelson and Nishio.

Ground 3: Claims 31 and 62 are obvious under 35 U.S.C. § 103(a) over Lemelson in view of Borcherts.

The above-listed grounds of unpatentability are explained in detail in Section V, below. For support, Petitioner relies on the Declaration of Nikolaos Papanikolopoulos, Ph.D., a copy of which is attached hereto as Ex. 1106.
`
C. Claim Construction (37 C.F.R. § 42.104(b)(3))
For purposes of this IPR only, Petitioner proposes the construction of the claim terms as set forth by the Board in its institution decisions in IPR2013-00419, pp. 7-15, and IPR2014-00646, pp. 8-9. Those constructions are as follows:

Claim Term: “trained pattern recognition algorithm” (claims 1, 31, 41, and 56)
Board’s Construction: “an algorithm that processes a signal that is generated by an object, or is modified by interacting with an object, in order to determine to which one of a set of classes the object belongs, the algorithm having been taught, through a variety of examples, various patterns of received signals generated or modified by objects”

Claim Term: “trained pattern recognition means…” (claims 1, 31, 41 and 56)
Board’s Construction: “a neural computer or neural network trained for pattern recognition, and equivalents thereof”

Claim Term: “identify” (ubiquitous)
Board’s Construction: “determine that the object belongs to a particular set or class”

Claim Term: “exterior object” (ubiquitous)
Board’s Construction: “a material or physical thing outside the vehicle, not a part of the roadway on which the vehicle travels”

Claim Term: “rear view mirror” (claims 31, 62)
Board’s Construction: “a mirror that faces to the rear”

Claim Term: “transmitter” (claims 4, 59)
Board’s Construction: “device that transmits any type of electromagnetic waves, including visible light”
`
IV. OVERVIEW OF THE ’057 PATENT
The ’057 patent generally relates to a vehicle monitoring system that utilizes various types of sensors such as cameras, radar or laser radar (lidar) in order to detect objects. (Ex. 1101, 17:53-23:9, 39:1-20.) In one embodiment, a processor receives the signals from the sensor(s) and identifies, classifies or locates an object using a trained pattern recognition algorithm. (Id. at 14:8-25, 8:15-19.) A system in the vehicle, such as a visual display, can then be affected depending on the classification, identification or location of the exterior object. (Id. at 14:26-28.)
`
Petitioner challenges two independent claims, claims 1 and 56, as well as claims depending from independent claims 30 and 40. Independent claims 1 and 56 are very similar, and require (i) “at least one receiver” arranged to receive waves from the vehicle exterior, (ii) a “processor comprising trained pattern recognition means” that applies a “trained pattern recognition algorithm” to provide the “classification, identification or location of the exterior object,” and (iii) a system in the vehicle that is “affected in response” thereto.

Independent claim 30 is similar to claims 1 and 56 but requires the “at least one receiver” to be “arranged on a rear view mirror of the vehicle.” Independent claim 40 requires a “plurality of receivers.” While neither claim recites “trained pattern recognition means” or a “trained pattern recognition algorithm,” claims 31 and 41, which depend from claims 30 and 40, respectively, do require “trained pattern recognition means.” Of these, Petitioner challenges only claims 31 and 41.
`
V. HOW CHALLENGED CLAIMS ARE UNPATENTABLE (37 C.F.R. §§ 42.104(b)(4)-(5))
A. Ground 1: Claims 1-4, 7-10, 41, 56, 59-61, and 64 Are Obvious Under 35 U.S.C. § 103(a) over Lemelson
Claims 1-4, 7-10, 41, 56, 59-61 and 64 of the ’057 patent would have been obvious to one of ordinary skill in the art over Lemelson. This ground was already presented in the Mercedes 646 IPR and instituted by the Board.

Lemelson teaches a vehicle exterior monitoring system that one of ordinary skill could implement to identify exterior objects and obstacles and affect a vehicle system by warning the driver or by, for example, actuating the brakes or steering to minimize the likelihood or effects of a collision. (Ex. 1102, Abstract; 2:53-63, 3:5-26, 5:15-18, 8:38-39, Fig. 1; Ex. 1103, pp. 1, 3-6, 8, 13-14, Fig. 1.) Figure 1 of Lemelson depicts a radar/lidar computer 14 for locating an exterior object based on received radar or lidar signals, a camera receiver 16 to receive waves from the exterior environment, an image analysis computer 19 (“IAC”) for classifying and identifying exterior objects, brakes 33 and steering 36 that are affected depending on the identified exterior objects, and a display 32 for warning the driver of a potential collision. (Ex. 1102, 5:31-56, 5:67-6:8; Ex. 1103, pp. 8-9.) Lemelson teaches that the signal output from the camera(s) is digitized and passed to the IAC. (Ex. 1102, 5:36-39; Ex. 1103, pp. 8-9.) The IAC “identifies” the detected exterior object(s) using “neural networks” that have been “trained” using “known inputs.” (Ex. 1102, 5:39-45, 7:47-8:10, 8:21-23; Ex. 1103, pp. 9, 12-13.)
`
Lemelson renders independent claims 1 and 56, and dependent claim 41, obvious. First, Lemelson teaches the “receiver…” limitations (“plurality of receivers” in claim 41)¹ of these claims because it teaches several devices that are coupled to a processor and receive electromagnetic radiation. (Ex. 1102, Figs. 1, 2, 6:21-42; Ex. 1103, Figs. 1, 2, p. 10.) For example, Lemelson discloses radar and lidar receivers (Ex. 1102, 5:67-6:8; Ex. 1103, p. 13), as well as “multiple cameras” that are used “for stereo imaging capabilities.” (Ex. 1102, Figs. 1 and 2, 6:37-38; Ex. 1103, Figs. 1 and 2, p. 9; Ex. 1106, ¶ 48.) Lemelson also teaches use of “a CCD array camera” as a receiver. (Ex. 1102, 6:28-34, 7:36-41; Ex. 1103, pp. 10, 12.) Claim 41 requires “a plurality of receivers arranged apart from one another and to receive waves from different parts of the environment exterior of the vehicle…” Lemelson meets this limitation at least through its disclosure that these “[m]ultiple cameras may be used for front, side and rear viewing and for stereo imaging capabilities suitable for generation of three dimensional image information including capabilities for depth perception and placing multiple objects in three dimensional image fields to further improve hazard detection capabilities.” (Ex. 1102, 6:37-42; Ex. 1103, p. 10.)

¹ Because Petitioner does not directly challenge independent claim 40, Petitioner refers only to claim 41, which incorporates the limitations of claim 40.
`
Second, Lemelson teaches processing the received signals to provide a classification, identification or location of the exterior object. For example, Lemelson teaches that “[t]he computer is operable to analyze video and/or other forms of image information generated as the vehicle travels to identify obstacles ahead of the vehicle…” (Ex. 1102, 2:39-41; Ex. 1103, p. 3.) Lemelson teaches that the analog signal output from the camera(s) is passed to the IAC. (Ex. 1102, 5:36-39; Ex. 1103, pp. 8-9.) Further, the IAC meets the “trained pattern recognition means…” limitation as construed by the Board because Lemelson discloses a processor that implements a “neural network” and is therefore a “neural computer.” (Ex. 1106, ¶ 49.) “Neural networks” are defined by the ’057 patent to be a type of “trained pattern recognition algorithm” and are within the Board’s construction of that term. (Ex. 1101, 4:35-36.) In particular, Lemelson discloses that the IAC is “provided, implemented and programmed using neural networks and artificial intelligence as well as fuzzy logic algorithms” to “(a) identify objects on the road ahead such as other vehicles, pedestrians, barriers and dividers, turns in the road, signs and symbols, etc., and generate identification codes, and (b) detect distances from such objects by their size (and shape) . . . .” (Ex. 1102, 5:39-45; Ex. 1103, p. 9.) Lemelson explains that the neural network in the IAC may be “trained” using “known inputs.” (Ex. 1102, 7:47-8:10, 8:21-23; Ex. 1103, pp. 12-13.)

The IAC, which uses a neural network, “classifies, identifies and locates” exterior objects. (Ex. 1106, ¶ 49.) The IAC “classifies” by using a neural network program that is “structured and arranged to apply a trained pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves” therefrom.
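For illustration only (this sketch forms no part of the evidentiary record), the kind of neural-network computation Lemelson attributes to the IAC, in which weighted input signals are summed at a node and passed through a sigmoid nonlinearity (Ex. 1102, 7:47-8:10), can be expressed as follows; the function and variable names are hypothetical:

```python
import math

def processing_element(inputs, weights, bias=0.0):
    """One neural-network processing element of the kind Lemelson
    describes: input vector components are multiplied by weighting
    elements, summed at a node, and passed through a continuous,
    differentiable nonlinearity (here, a sigmoid) to produce the
    output signal."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid output signal
```

A trained network of such elements maps a received-signal vector to an output decision, which is how a neural network performs pattern recognition on camera input.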
`
Independent claims 1 and 56 (and dependent claim 41) recite that the “trained pattern recognition algorithm” be “generated from data of possible exterior objects and patterns of received waves from the possible exterior objects.” Even if this method step within a system claim constitutes a limitation (see MPEP § 2113; SmithKline Beecham Corp. v. Apotex Corp., 439 F.3d 1312, 1317 (Fed. Cir. 2006) (en banc); Ex Parte Klasing et al., App. No. 11/507,120, 2013 Pat. App. LEXIS 1619, at *8-10 (PTAB March 14, 2013)), and even if it is limited to training with “real data,” it would have been obvious to one of ordinary skill in view of Lemelson.

Lemelson already discloses that the neural network is trained with “known inputs.” (Ex. 1102, 8:1-8; Ex. 1103, pp. 16-17.) In the early-to-mid 1990s, one of ordinary skill in the art would have known that training with “real data” would have yielded the best results for this purpose of training the Lemelson system. (Ex. 1106, ¶ 61.) The Lemelson neural network was trained to identify “other vehicles, pedestrians, barriers and dividers, turns in the road, signs and symbols.” (Ex. 1106, ¶ 67.) As of 1995, one of ordinary skill in the art would not have expected that a simulated data set could be readily generated that could accurately represent all exterior objects described by Lemelson as perceived by sensors on a vehicle. (Ex. 1106, ¶ 67.) One of ordinary skill in the art in 1995 would have known that the generation of simulated data was not sophisticated enough to allow for training the type of neural network described by Lemelson. (Ex. 1106, ¶ 64.) Generation of simulated data would have required substantial computing power and special equipment, neither of which is disclosed by Lemelson. (Ex. 1106, ¶ 66.) Lemelson does not disclose any computer hardware or methods for generating simulated data. (Id.)

Moreover, images directly obtained from exterior objects would have been far more representative of the types of complex three-dimensional objects Lemelson’s vehicle warning system would have been expected to encounter during road operation. (Ex. 1106, ¶ 68.) Such data would also have been far more plentiful, easier to obtain, and less costly and time-consuming to produce than any synthetic data then available. (Ex. 1106, ¶ 68.) Additionally, one of ordinary skill would not have expected to succeed in training a neural network to accurately recognize (as would be required of a vehicle warning system) complex three-dimensional objects like pedestrians, automobiles, trucks, etc. without using sufficiently representative data, which could only have been obtained from directly imaged exterior objects. (Ex. 1106, ¶ 68.) Accordingly, the “generated from” phrase would have been obvious to one of ordinary skill.
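For illustration only (not part of the evidentiary record), the training procedure Lemelson describes, in which known inputs are applied, outputs are compared with desired responses, and weights are automatically adjusted based on error-signal measurements, can be sketched as a single update step; the names and learning-rate value are hypothetical:

```python
import math

def train_step(inputs, weights, bias, target, lr=0.5):
    """One error-driven adjustment of the kind Lemelson describes:
    a known input is applied, the sigmoid output is compared with the
    desired output, and each weight is nudged in proportion to the
    resulting error signal."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    out = 1.0 / (1.0 + math.exp(-s))   # sigmoid output signal
    err = target - out                 # error-signal measurement
    grad = err * out * (1.0 - out)     # error scaled by sigmoid slope
    weights = [w + lr * grad * x for x, w in zip(inputs, weights)]
    bias = bias + lr * grad
    return weights, bias, err
```

Repeating such steps over a set of known inputs shrinks the error until the desired outputs are generated, which is the sense in which the algorithm is “generated from” data of possible exterior objects.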
`
Finally, Lemelson discloses the limitations of claims 1, 41, and 56 requiring that a vehicle system be affected in response to the classification, identification or location of the exterior object. (Id. at ¶ 53.) Based on the neural network object determination, the IAC can display “symbols representing the hazard objects.” (Ex. 1102, 6:43-55, 9:60-62, Fig. 2; Ex. 1103, pp. 10-11, 16.) The IAC also provides codes to a decision computer 23, which “integrates the inputs from the image analysis computer 19” as well as a “radar or lidar computer 14.” (Ex. 1102, 8:30-33, 6:1-8; Ex. 1103, pp. 9, 13.) The decision computer 23 then generates control signals to control a vehicle system such as brakes or steering. (Ex. 1102, 5:46-51, 2:53-3:26; Ex. 1103, pp. 4, 9.) Thus, Lemelson renders obvious each of claims 1, 41, and 56.
`
Lemelson also renders obvious claims 2-4, 7-10, 59-61, and 64. Claim 2 depends directly from claim 1 but additionally requires that the “at least one receiver comprises a pair of receivers spaced apart from one another.” Lemelson teaches that “[m]ultiple cameras may be used” for “stereo imaging capabilities.” (Ex. 1102, 6:37-38; Ex. 1103, p. 10.) Receivers (such as cameras) spaced apart from one another are necessary for such “stereo imaging capabilities.” (Ex. 1106, ¶¶ 48, 54.) Claim 3 (depending from claim 1) requires that the “at least one receiver is arranged to receive infrared waves.” Lemelson teaches that the camera may be implemented with known “infrared imaging methods.” (Ex. 1102, 6:34-37; Ex. 1103, p. 14.)
`
Claims 4 (depending from claim 1) and 59 (depending from claim 56) require a “transmitter for transmitting waves into the environment exterior of the vehicle whereby the at least one receiver” is “arranged to receive waves transmitted by said transmitter and reflected by any exterior objects.” Lemelson meets this because it teaches a vehicle with “headlights” (Ex. 1102, 3:29, 5:57; Ex. 1103, pp. 5, 9), which project light that is reflected off exterior objects. Vehicle headlights are a “transmitter” within the Board’s construction because they “transmit… electromagnetic waves.” (Ex. 1106, ¶ 56.) Lemelson teaches that a camera (or cameras, for stereo imaging capabilities) may be used for “front” viewing. (Ex. 1102, 6:37-38; Ex. 1103, p. 10.)
`
Claims 7 (depending directly from claim 1) and 61 (depending directly from claim 56) require that the affected vehicle system is a “display viewable to the driver and arranged to show an image or icon of the exterior object.” Lemelson teaches that the IAC can display “symbols representing the hazard objects.” (Ex. 1102, 6:43-55, Fig. 2, 9:60-62; Ex. 1103, pp. 10-11, 16, Fig. 2; Ex. 1106, ¶¶ 53, 57.) Lemelson also teaches “various warning and vehicle operating devices such as… a display driver 31 which drives a (heads-up or dashboard) display.” (Ex. 1102, 5:49-56; Ex. 1103, p. 9.) Thus, Lemelson renders obvious claims 7 and 61. (Ex. 1106, ¶ 57.)

Lemelson also renders obvious claim 8 (depending from claim 1) because it discloses that the camera is preferably a “CCD array.” (Ex. 1102, 6:31-32; Ex. 1103, p. 14; Ex. 1106, ¶ 58.)
`
`14
`
`
`
`
`
`
`Claims 9 (from 1) and 64 (from 56) require “measurement means for measuring
`
`a distance between the exterior object and a vehicle.” Lemelson meets this limitation
`
`because it discloses “multiple cameras” for “stereo imaging capabilities.” (Ex. 1102,
`
`6:37-38; Ex. 1103, p. 14.) It also uses radar and/or lidar for “distance measurements.”
`
`(Ex. 1102, 8:56; 5:67-6:8; Ex. 1103, p. 13; 18.) Lemelson meets dependent claim 10
`
`(from 9), which limits the measurement means to laser or laser radar, based on the
`
`same disclosure. (Ex. 1106, ¶ 59.)
`
`Claim 60 depends from claim 56. Claim 60 requires that the “at least one
`
`receiver is arranged to receive waves from a blind spot of the vehicle.” Lemelson
`
`teaches these limitations at least through its disclosure that the camera(s) may be used
`
`for “side and rear viewing,” i.e., the locations that are not visible to the driver while
`
`driving. (Ex. 1102, 6:5-8, 6:37-38; Ex. 1103, p. 10; Ex. 1106, ¶ 60.) Thus, Lemelson
`
`renders claims 60 obvious.
`
`As shown in the below claim charts, Lemelson renders obvious claims 1-4, 7-
`
`10, 41, 56, 59-61, and 64 of the ‘057 patent. Claim charts for claims 56, 59, 61, and 64
`
`are not provided because they are substantively the same as claims 1, 4, 7, and 9,
`
`respectively.
`
`
’057 Patent, Claim 1                    Lemelson (Ex. 1102); ’304 App. (Ex. 1103)

1. A monitoring arrangement for monitoring an environment exterior of a vehicle, comprising:
    E.g., Ex. 1102, Fig. 1; see also Ex. 1103, Fig. 1.
    E.g., Ex. 1102, 2:14-16, “A video scanning system, such as a television camera and/or one or more laser scanners mounted on the vehicle scan the road in front of the vehicle . . . .” See also Ex. 1103, p. 7.
    E.g., Ex. 1102, 6:37-38, “Multiple cameras may be used for front, side and rear viewing.” See also Ex. 1103, p. 10.

(a) at least one receiver arranged to receive waves from the environment exterior of the vehicle which contain information on any objects in the environment and generate a signal characteristic of the received waves; and
    E.g., Ex. 1102, Figs. 1-2; see also Ex. 1103, Fig. 1.
    E.g., Ex. 1102, 4:31-34, “The video camera 16 is preferably a CCD array camera generating successive picture frames with individual pixels being digitized for processing by the video preprocessor 51.” See also Ex. 1103, p. 10.
    E.g., Ex. 1102, 5:31-39, “A television camera(s) 16 having a wide angle lens 16L is mounted at the front of the vehicle such as the front end of the roof, bumper or end of the hood to scan the road ahead of the vehicle . . . . The analog signal output of camera 16 is digitized in an A/D convertor 18 and passed directly to or through a video preprocessor 51 to microprocessor 11, to an image field analyzing computer 19 . . . .” See also Ex. 1103, pp. 8-9.
    E.g., Ex. 1102, 5:67-6:8, “An auxiliary range detection means comprises a range computer 21 which accepts digital code signals from a radar or lidar computer 14 which interprets radar and/or laser range signals from respective reflected radiation receiving means on the vehicle. In a modified form, video scanning and radar or lidar scanning may be jointly employed to identify and indicate distances between the controlled vehicle and objects ahead of, to the side(s) of, and to the rear of the controlled vehicle.” See also Ex. 1103, p. 9.
    E.g., Ex. 1102, 6:37-38, “Multiple cameras may be used for front, side and rear viewing.” See also Ex. 1103, p. 10.

(b) a processor coupled to said at least one receiver and comprising trained pattern recognition means for processing the signal to provide a classification, identification or location of the exterior object,
    E.g., Ex. 1102, Figs. 1-4; see also Ex. 1103, Figs. 1-4.
    E.g., Ex. 1102, 5:36-45, “The analog signal output of camera 16 is digitized in an A/D convertor 18 and passed directly to or through a video preprocessor 51 to microprocessor 11, to an image field analyzing computer 19, which is provided, implemented and programmed using neural networks and artificial intelligence as well as fuzzy logic algorithms to (a) identify objects on the road ahead such as other vehicles, pedestrians, barriers and dividers, turns in the road, signs and symbols, etc., and generate identification codes, and (b) detect distances from such objects by their size (and shape) . . . .” See also Ex. 1103, pp. 8-9.

(c) said trained pattern recognition means being structured and arranged to apply a trained pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects to provide the classification, identification or location of the exterior object;
    E.g., Ex. 1102, 7:47-8:10, “In another embodiment, the image analyzing computer 19 is implemented as a neural computing network with networked processing elements performing successive computations on input image structure as shown in FIG. 3 where signal inputs 61 are connected to multiple processing elements 63, 65 and 67 through the network connections 62, 64 and 66. The processing elements (PE’s) 63, 65 and 67 map input signal vectors to the output decision layer, performing such tasks as image recognition and image parameter analysis. A typical neural network processing element known to those skilled in the art is shown in FIG. 4 where input vectors (X1, X2 . . . Xn) are connected via weighing elements (W1, W2 . . . Wn) to a summing node 70. The output of node 70 is passed through a nonlinear processing element 72 to produce an output signal, U. Offset or bias inputs can be added to the inputs through weighing circuit Wo. The output signal from summing node 70 is passed through the nonlinear element 72. The nonlinear function is preferably a continuous, differentiable function such as a sigmoid which is typically used in neural network processing element nodes. Neural networks used in the vehicle warning system are trained to recognize roadway hazards which the vehicle is approaching including automobiles, trucks, and pedestrians. Training involves providing known inputs to the network resulting in desired output responses. The weights are automatically adjusted based on error signal measurements until the desired outputs are generated. Various learning algorithms may be applied. Adaptive operation is also possible with on-line adjustment of network weights to meet imaging requirements.” See also Ex. 1103, pp. 12-13.

(d) whereby a system in the vehicle is coupled to said processor such that the operation of the system is affected in response to the classification, identification or location of the exterior object.
    E.g., Ex. 1102, Figs. 1, 2; see also Ex. 1103, Figs. 1, 2.
    E.g., Ex. 1102, 6:47-55, “Actual image data can be displayed in real time using video display 55 via analog-to-digital converter 54. The image display may include highlighting of hazards, special warning images such as flashing lights, alpha-numeric messages, distance values, speed indic