`
`__________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`
`
`
`
`Mercedes-Benz USA, LLC
`
`Petitioner
`
`
`
`Patent No. 5,845,000
`Issue Date: December 1, 1998
`Title: OPTICAL IDENTIFICATION AND MONITORING SYSTEM USING
`PATTERN RECOGNITION FOR USE WITH VEHICLES
`
`
`
`
`PETITION FOR INTER PARTES REVIEW
`OF U.S. PATENT NO. 5,845,000
`PURSUANT TO 35 U.S.C. § 312 and 37 C.F.R. § 42.104
`
`Case No. IPR2014-00647
`
`
`
`
`
`
`
`
`
`
`
`TABLE OF CONTENTS
`
`LISTING OF EXHIBITS ........................................................................................ iv
`
I. Mandatory Notices (37 C.F.R. § 42.8) ............................................................ 1

A. Real Party-in-Interest (37 C.F.R. § 42.8(b)(1)) ..................................... 1

B. Related Matters (37 C.F.R. § 42.8(b)(2)) .............................................. 1

C. Counsel & Service Information (37 C.F.R. §§ 42.8(b)(3)-(4)) ............. 2

II. Payment of Fees (37 C.F.R. § 42.103) ............................................................ 3

III. Requirements For IPR (37 C.F.R. § 42.104) ................................................... 3

A. Grounds for Standing (37 C.F.R. § 42.104(a)) ..................................... 3

B. Identification of Challenge (37 C.F.R. § 42.104(b)) and Relief Requested (37 C.F.R. § 42.22(a)(1)) ..................................................... 3

C. Claim Construction (37 C.F.R. § 42.104(b)(3)) .................................... 4

IV. Overview of the ’000 Patent ...................................................................... 7

V. How Challenged Claims are Unpatentable (37 C.F.R. §§ 42.104(b)(4)-(5)) ................................................................................................................... 8

A. Ground 1: Claims 10, 11, 15, 19 and 23 are Anticipated Under 35 U.S.C. § 102(e) By Lemelson ............................................................... 8

B. Ground 2: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a) in View of Lemelson ................................................. 23

C. Ground 3: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a) Over Lemelson in View of Nishio ............................ 23

D. Ground 4: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a) Over Lemelson in View of Asayama ........................ 26

E. Ground 5: Claims 16, 17, and 20 are Obvious Under 35 U.S.C. § 103(a) Over Lemelson in View of Yanagawa ..................................... 27

F. Ground 6: Claims 10, 15, 19 and 23 are Anticipated Under 35 U.S.C. § 102(e) By Nishio .................................................................. 29

G. Ground 7: Claims 10, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a) In View of Nishio ................................................................. 44

H. Ground 8: Claims 10, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a) Over Nishio In View of Asayama ........................................ 45

I. Ground 9: Claims 10, 15, 16, 17, 19, 20 and 23 are Obvious Under 35 U.S.C. § 103(a) Over Nishio In View of Yanagawa ...................... 50

J. Ground 10: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a) Over Nishio In View of Lemelson ............................ 55

K. Ground 11: Claims 10, 11, 15, 16, 17, 19, 20 and 23 are Obvious Under 35 U.S.C. § 103(a) Over Nishio in View of Mizukoshi .......... 57

VI. Conclusion ..................................................................................................... 60
`
`
`
`
`
`
`
`
LISTING OF EXHIBITS1

Exhibit 1001: U.S. Patent No. 5,845,000 to Breed et al.

Exhibit 1002: U.S. Patent No. 6,553,130 to Lemelson et al.

Exhibit 1003: File History for U.S. Patent Application No. 08/105,304

Exhibit 1004: U.S. Patent No. 5,541,590 to Nishio

Exhibit 1005: File History for U.S. Patent Application No. 08/097,178

Exhibit 1006: U.S. Patent No. 5,214,408 to Asayama

Exhibit 1007: Certified English Translation of Japanese Unexamined Patent Application Publication JP-S62-131837 to Yanagawa

Exhibit 1008: Certified English Translation of Japanese Unexamined Patent Application Publication JP-H06-267303 to Mizukoshi

Exhibit 1009: Japanese Unexamined Patent Application Publication JP-S62-131837 to Yanagawa

Exhibit 1010: Japanese Unexamined Patent Application Publication JP-H06-267303 to Mizukoshi

Exhibit 1011: File History for U.S. Patent Application No. 08/474,786

Exhibit 1012: Infringement Contentions of American Vehicular Sciences LLC with respect to U.S. Patent No. 5,845,000 in the litigation captioned American Vehicular Sciences LLC v. Mercedes-Benz U.S. International, Inc. and Mercedes-Benz USA, LLC, 13-cv-00309 (E.D. Tex.)

Exhibit 1013: Expert Declaration of Larry S. Davis


1 Unless otherwise specified, all citations to Exhibits refer to the original page, column or line number of that Exhibit.
`
`
`
Exhibit 1014: Patent Owner’s March 24, 2014 Response in IPR2013-00424

Exhibit 1015: Patent Trial and Appeal Board Decision Instituting Inter Partes Review on U.S. Patent No. 5,845,000
`
`
`
`
`
Pursuant to 35 U.S.C. §§ 311-319 and 37 C.F.R. Part 42, Mercedes-Benz USA,

LLC (“Petitioner”) respectfully requests inter partes review of claims 10, 11, 15, 16, 17,
`
`19, 20, and 23 of U.S. Patent No. 5,845,000 (“the ’000 patent”). According to
`
`U.S. Patent and Trademark Office records, the ’000 patent is currently assigned to
`
`American Vehicular Sciences LLC (“AVS” or the “Patent Owner”).
`
`I. MANDATORY NOTICES (37 C.F.R. § 42.8)
`A. Real Party-in-Interest (37 C.F.R. § 42.8(b)(1))
`The real parties-in-interest with respect to this Petition are Petitioner and
`
`Mercedes-Benz U.S. International, Inc. (“MBUSI”).
`
`B. Related Matters (37 C.F.R. § 42.8(b)(2))
`The ’000 patent is currently the subject of the following litigations: American
`
`Vehicular Sciences LLC v. Toyota Motor Corp. et al., No. 6:12-CV-406 (E.D. Tex.) (“AVS
`
`406 Litigation”); American Vehicular Sciences LLC v. Mercedes-Benz U.S. Int’l, Inc., Case
`
No. 6:13-CV-308 (E.D. Tex.) (“308 Litigation”); and American Vehicular Sciences LLC v. BMW Group,
`
`Case No. 6:12-CV-413 (E.D. Tex.). Petitioner and MBUSI were named as defendants
`
`in the 308 Litigation and served with a Summons and Complaint in that action on
`
`April 17, 2013. On July 22, 2013, they were served with infringement contentions in
`
`that proceeding. (Ex. 1012, p. 6.) Pending U.S. Patent App. No. 11/558,996 and
`
`numerous other patents and applications claim the benefit of the application from
`
`which the ’000 patent issued. The ’000 patent is currently the subject of IPR2013-
`
`00424 (instituted January 14, 2014). Petitioner is not aware of any other pending
`
`
`
`
`
`
`
`administrative matter that would affect, or be affected by, a decision in this
`
`proceeding. Petitioner is simultaneously filing petitions seeking inter partes review of
`
`four other patents currently assigned to AVS: U.S. Patent No. 6,772,057; U.S. Patent
`
`No. 6,738,697; U.S. Patent No. 6,746,078; and U.S. Patent No. 7,630,802. These
`
`petitions do not address the ’000 patent, but involve the same patent owner and
`
`Petitioner.
`
C. Counsel & Service Information (37 C.F.R. §§ 42.8(b)(3)-(4))

Lead Counsel: Scott W. Doyle (Reg. No. 39176)

Back-up Counsel: Jonathan R. DeFosse (pro hac vice to be requested upon authorization).2
`
`Electronic Service: scott.doyle@shearman.com, jonathan.defosse@shearman.com.
`
`Postal Address: Scott W. Doyle, Jonathan DeFosse, Shearman & Sterling LLP, 801
`
`Pennsylvania Ave., NW, Suite 900, Washington, DC 20004.
`
`Telephone: (202) 508-8000; Facsimile: (202) 508-8100.
`
`
2 Petitioner requests authorization to file a motion for Jonathan R. DeFosse to appear
`
`pro hac vice as backup counsel. Mr. DeFosse is an experienced litigation attorney in
`
`patent cases. He is admitted to practice in Virginia and Washington, DC, as well as
`
`before several United States District Courts and Courts of Appeal. Mr. DeFosse is
`
`familiar with the issues raised in this Petition because he represents Petitioners in the
`
`AVS 308 Litigation.
`
`
`
`
`
`
`
II. PAYMENT OF FEES (37 C.F.R. § 42.103)
`
`Petitioner is requesting inter partes review of 8 claims of the ’000 patent. The
`
`United States Patent & Trademark Office is authorized to charge all fees required in
`
`connection with this petition (calculated to be $23,000.00) or these proceedings, to the
`
`deposit account of Shearman & Sterling, LLP, Deposit Account 500324.
`
`III. REQUIREMENTS FOR IPR (37 C.F.R. § 42.104)
`A. Grounds for Standing (37 C.F.R. § 42.104(a))
`Petitioner certifies that the ’000 patent (Ex. 1001) is available for inter partes
`
`review and that Petitioner is not barred or estopped from requesting an inter partes
`
`review challenging the patent’s claims on the grounds identified in this petition.
`
B. Identification of Challenge (37 C.F.R. § 42.104(b)) and Relief Requested (37 C.F.R. § 42.22(a)(1))
`
`Petitioner respectfully requests that inter partes review be instituted and claims
`
`10, 11, 15, 16, 17, 19, 20, and 23 of the ’000 patent be cancelled on the following
`
`grounds of unpatentability:
`
`Ground 1: Claims 10, 11, 15, 19 and 23 are Anticipated Under 35 U.S.C. §
`
`102(e) by Lemelson (Exs. 1002 and 1003).
`
`Ground 2: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a)
`
in View of Lemelson.
`
`Ground 3: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a)
`
`Over Lemelson in View of Nishio (Exs. 1004 and 1005).
`
`
`
`
`
`
`
`Ground 4: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a)
`
`Over Lemelson in View of Asayama (Ex. 1006).
`
`Ground 5: Claims 16, 17, and 20 are Obvious Under 35 U.S.C. § 103(a) Over
`
`Lemelson in View of Yanagawa (Exs. 1007 and 1009).
`
`Ground 6: Claims 10, 15, 19 and 23 are Anticipated Under 35 U.S.C. § 102(e)
`
`By Nishio.
`
`Ground 7: Claims 10, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a) In
`
`View of Nishio.
`
`Ground 8: Claims 10, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a)
`
`Over Nishio In View of Asayama.
`
`Ground 9: Claims 10, 15, 16, 17, 19, 20 and 23 are Obvious Under 35 U.S.C. §
`
103(a) Over Nishio In View of Yanagawa.
`
`Ground 10: Claims 10, 11, 15, 19 and 23 are Obvious Under 35 U.S.C. § 103(a)
`
Over Nishio In View of Lemelson.
`
`Ground 11: Claims 10, 11, 15, 16, 17, 19, 20 and 23 are Obvious Under 35
`
`U.S.C. § 103(a) Over Nishio in View of Mizukoshi (Exs. 1008 and 1010).
`
`The above-listed grounds of unpatentability are explained in detail in Section
`
V, below. This Petition is supported by the Declaration of Larry S. Davis (Ex. 1013).
`
C. Claim Construction (37 C.F.R. § 42.104(b)(3))
`
`
`
`
`
`
`
`The Board construed certain terms of the ’000 patent in its Decisions granting
`
the Toyota petitions for inter partes review. (Ex. 1015, pp. 9-26.) The chart below

summarizes those constructions, which Petitioner has applied3 in its petition.
`
Claim Term / Board’s Construction

“pattern recognition algorithm” (claims 10 and 16): “an algorithm which processes a signal that is generated by an object, or is modified by interacting with an object, for determining to which one of a set of classes the object belongs” (Id. at 10-11.)

“trained pattern recognition means for…” (10, 16): “a neural computer or microprocessor trained for pattern recognition, and equivalents thereof” (Id. at 12-16.)

“identify” (10, 16, 23): “determining that the object belongs to a particular set or class” (Id. at 16.)

“transmitter means for transmitting…” (10): “infrared, radar, and pulsed GaAs laser systems” and “transmitters which emit visible light” (Id. at 16-19.)

“reception means for receiving…” (10, 16): “a CCD array and CCD transducer” (Id. at 19-20.)

“processor means… for processing…” (10, 16): For this petition, a processor provides sufficient structure to perform the function. (Id. at 20-21.)

“categorization means… for categorizing…” (10, 16): “a neural computer, a microprocessor, and their equivalents” (Id. at 21-22.)

“output means…” (10, 16): “electronic circuit or circuits capable of outputting a signal to another vehicle system” (Id. at 22-24.)

“dimming the headlights” (16): “decreasing the intensity or output of the headlight to a lower level of illumination” (Id. at 25.)

“measurement means for measuring…” (11): The recited “radar” provides sufficient structure to perform the recited functions. (Id. at 24.)

“wherein said categories further comprise radiation from taillights of a vehicle-in-front” (17): “categorizing radiation from taillights of a vehicle-in-front, which may include additional types of radiation” (Id. at 25-26.)


3 In the 308 Litigation, Petitioner has taken the position that the following terms of the ’000 patent are indefinite under 35 U.S.C. § 112(b): “trained pattern recognition means…” (claims 10 and 16) and “categorization means…” (claims 10 and 16). Petitioner has no opportunity to challenge these terms as indefinite under § 112(b) as part of the IPR proceedings.
`
`IV. OVERVIEW OF THE ’000 PATENT
`The ’000 patent is directed to a vehicle interior monitoring system that
`
monitors, identifies, and locates objects outside of the vehicle. (Ex. 1001, Abstract;
`
`Ex. 1011, p. 43.) Objects are illuminated with electromagnetic radiation, and a lens is
`
`used to focus the illuminated images onto the arrays of a charge coupled device
`
(CCD). (Ex. 1001, Abstract, 7:26-40.) Computational means using trained pattern
`
`recognition analyzes the signals received at the CCD to identify external objects,
`
which, in turn, are used to affect the operation of other vehicular systems. (Id. at
`
`Abstract.) The ’000 patent discloses that a vehicle computation system uses a
`
`“trainable or a trained pattern recognition system” which relies on pattern recognition
`
to process signals and to “identify” an object exterior to the vehicle. (Id. at col. 3:21-
`
`44.) Figures 7 and 7A illustrate portions of the sensor system that use transmitters,
`
`receivers, circuitry, and processors to perform pattern recognition of external objects.
`
`The ’000 patent also discloses a system for detecting the headlights or taillights
`
`of other vehicles and dimming the vehicle’s headlights in response. (Ex. 1001, col.
`
`9:54-58.) A CCD array is designed to be sensitive to visible light and does not use a
`
`separate source of illumination. (Id.) In another embodiment, external objects are
`
`
`
`
`
`
`
`illuminated with “electromagnetic, and specifically infrared, radiation,” and lenses are
`
used to focus images onto one or more CCD arrays. (Id.)
`
V. HOW CHALLENGED CLAIMS ARE UNPATENTABLE (37 C.F.R. §§ 42.104(b)(4)-(5))
`A. Ground 1: Claims 10, 11, 15, 19 and 23 are Anticipated Under 35
`U.S.C. § 102(e) By Lemelson
`
`Claim 10 of the ’000 patent is anticipated under 35 U.S.C. § 102(e) by Lemelson
`
`(Exs. 1002 and 1003). Lemelson teaches an exterior monitoring system that one of
`
ordinary skill could implement to identify objects outside of a moving vehicle and

affect a vehicle subsystem in response to that identification. (Ex. 1002, Abstract,
`
`2:14-23, 2:53-3:39, 5:15-18, Fig. 1; Ex. 1003, Abstract; pp. 7-10, 12, Fig. 1; Ex. 1013, ¶
`
`28.) For example, Figure 1 of Lemelson discloses many aspects of the challenged
`
`claims, including a radar/lidar computer 14 for locating an exterior object based on
`
`received radar or lidar signals that includes both electromagnetic radiation emitters
`
`and receivers, a camera receiver 16 to receive waves emitted from or reflected by
`
`objects in the exterior environment, a processor 19 for classifying and identifying
`
`exterior objects, and vehicle systems 33, 36, 41, and 42 that are affected depending on
`
the identified exterior objects. (Ex. 1002, Fig. 1, 5:31-6:8; Ex. 1003, pp. 12-14.)
`
`Lemelson anticipates claims 10 and 23 of the ’000 patent. First, Lemelson
`
teaches a “transmitter means for transmitting…”, element 10(a) (element
`
`23(a) is substantially the same). The Board construed this term to cover “infrared,
`
`radar, and pulsed GaAs laser systems” as well as “transmitters which emit visible
`
`
`
`
`
`
`
light.” (Ex. 1015, p. 19.) Lemelson teaches vehicle headlights, which are within that
`
`construction because they “emit visible light.” (Ex. 1002, 3:29, 5:57; Ex. 1003, pp. 9,
`
`13.) Vehicle headlights also satisfy the “infrared” element of that construction
`
`because ordinary headlights emitted infrared waves when Lemelson was filed (and at
`
`the time the ’304 app. was filed). (Ex. 1013, ¶ 30.) Lemelson also discloses “infrared
`
`imaging,” which teaches receiving infrared waves, including those emitted by
`
`headlights. Lemelson also discloses “radar and/or laser range signals” transmitted by
`
the vehicle, which also satisfy the Board’s construction. (Ex. 1002, 6:2-3; Ex. 1003,
`
`p. 9; Ex. 1013, ¶ 30.)
`
`Lemelson teaches a “reception means...”, element 10(b) (elements 16(a) and
`
`23(b) are substantially the same). The Board construed this term to cover “a CCD
`
`array and a CCD transducer.” (Ex. 1015, p. 20.) Lemelson teaches that TV cameras
`
`are preferably CCD arrays that receive electromagnetic radiation from exterior
`
`objects, thus satisfying the Board’s construction. (Ex. 1002, 5:31, 6:31-32; Ex. 1003,
`
`pp. 12-14; Ex. 1013, ¶ 31.) The imaging method may include “infrared imaging.” (Ex.
`
1002, 6:36; Ex. 1003, p. 14; see also Ex. 1002, 4:13; Ex. 1003, p. 10.)
`
`Lemelson teaches a “processor means…”, element 10(c) (elements 16(b) and
`
`23(c) are substantially the same). In particular, Lemelson teaches that “[t]he analog
`
`signal output of camera 16 is digitized in an A/D convertor 18 and passed directly to
`
`or through a video preprocessor 51 to microprocessor 11, to an image field analyzing
`
`computer 19 which is provided, implemented and programmed using neural networks
`
`
`
`
`
`
`
`and artificial intelligence.” (Ex. 1002, 5:36-41, Figs. 1 and 2; Ex. 1003, p. 9, Figs. 1
`
`and 2.) This teaches that the reception means (a camera) is “coupled to” the
`
`processor means (Ex. 1013, ¶ 33) as well as “creating an electronic signal characteristic
`
`of said exterior object” because digitizing the analog signals received by the camera
`
`using an “A/D converter” requires creating a digital signal representative
`
`(“characteristic”) of the waves received. (Ex. 1013, ¶¶ 32, 33.)
`
`Lemelson teaches a “categorization means…”, element 10(d) (elements 16(c)
`
`and 23(d) are substantially the same). (Ex. 1013, ¶ 48.) The Board construed this to
`
`mean “a neural computer, a microprocessor, and their equivalents.” (Ex. 1015, p. 22.)
`
`Lemelson teaches that the camera signal is passed “to microprocessor 11, to an image
`
`field analyzing computer 19 which is provided, implemented and programmed using
`
`neural networks.” (Ex. 1002, 5:36-38; Ex. 1003, p. 9.) A neural computer is a
`
`computer that has been programmed to run neural network software. (Ex. 1013, ¶
`
`34.) Therefore, Lemelson’s disclosure of an “analyzing computer… implemented and
`
`programmed using neural networks” teaches a neural computer. (Id.) Lemelson’s
`
`categorization means also comprises “trained pattern recognition means…” The
`
`Board construed this term to cover “a neural computer or microprocessor trained for
`
`pattern recognition, and equivalents thereof.” (Ex. 1015, p. 16.) Lemelson satisfies
`
this limitation for the same reason as the “categorization means.” (Ex. 1013, ¶¶ 35-37.)
`
`Lemelson teaches that the trained pattern recognition means is “structured and
`
`arranged to apply a pattern recognition algorithm.” The Board construed “pattern
`
`
`
`
`
`
`
`recognition algorithm” as “an algorithm which processes a signal that is generated by
`
`an object, or is modified by interacting with an object, for determining to which one
`
`of a set of classes the object belongs.” The neural networks taught by Lemelson are
`
`within the Board’s construction because neural networks, by design, ascribe a label to
`
`input data and thus necessarily “determine to which one of a set of classes the object
`
`belongs.” (Ex. 1013, ¶¶ 38, 39.) The Board construed “identify” to mean “to
`
`determine that the object belongs to a particular set or class.” In this regard,
`
`Lemelson teaches “identifying objects on the road ahead such as other vehicles,
`
`pedestrians,” which means determining that the object belongs to a particular set or
`
`class such as vehicles, pedestrians, etc. (Id.)
`
`Lemelson teaches that the “algorithm [is] generated from data of possible
`
`exterior objects and patterns of received electromagnetic illumination from the
`
`possible exterior objects.” AVS incorrectly argues that this limitation is not met.
`
`(Ex. 1014, p. 13.) But the broadest reasonable construction of the limitation in
`
`question does not require that the training set be directly imaged from physical
`
`exterior objects. Nowhere do the challenged claims require, or even imply, that such
`
`data must be imaged directly from actual exterior objects. AVS is importing this
`
`limitation into its claims in a desperate attempt to save them from anticipatory prior
`
`art. Indeed, the disputed term could mean any type of data so long as it relates to
`
`information about such objects and patterns of received waves or radiation they emit,
`
`irrespective of whether it is real or synthetically generated. (Ex. 1013, ¶¶ 40-42.)
`
`
`
`
`
`
`
`In any event, even if the disputed limitation were construed by the Board to
`
`require data imaged directly from actual exterior objects, Lemelson teaches this. In
`
`particular, Lemelson teaches that “training involves providing known inputs to the
`
`network” and that “adaptive operation is also possible with on-line adjustment…”
`
`(Ex. 1002, 8:4-10; Ex. 1003, p. 13.) This disclosure would necessarily convey to one
`
`of ordinary skill in the art that the neural network of Lemelson was trained on images
`
`directly obtained from actual objects (i.e. natural image data). “Adaptive operation”
`
`could only have been accomplished with direct imaging using onboard vehicle
`
`sensors, which indicates that the “known inputs” Lemelson refers to mean natural
`
`image data. (Ex. 1013, ¶¶ 43-46.)
`
`In the early-to-mid 1990s, one of ordinary skill would have known that the
`
`statistical patterns provided by real imagery—essential in training a neural network to
`
`recognize complex 3-dimensional objects such as the “automobiles, trucks, and
`
`pedestrians” mentioned in Lemelson—could not have been found in synthetic data
`
(and, in many cases, still cannot be found in such data today). (Ex. 1013, ¶¶ 43-45.)
`
`In the absence of the statistical patterns present in natural image data, a neural
`
`network will not learn to recognize real objects such as automobiles, trucks and
`
`pedestrians as is necessary in the field of vehicle safety. (Id.)
`
`In the early-to-mid 1990s, directly imaged data was by far the most realistic
`
`data type that could be obtained to train a neural network to classify or identify the
`
`virtually limitless variety of complex 3-dimensional objects a vehicle would be
`
`
`
`
`
`
`
expected to encounter in operation. (Ex. 1013, ¶¶ 43-45.) One of ordinary skill would
`
`not have expected to succeed in training a network for this task without using
`
`sufficiently realistic representations of objects the vehicle would be expected to
`
`encounter. (Id.) Such sufficiently realistic representations could only have been
`
`obtained from real image data. (Id.)
`
Indeed, it is very telling that Lemelson provides no disclosure whatsoever as to the
`
`complex 3-dimensional models necessary to produce such data synthetically. Such
`
`models would necessarily have had to render both the surfaces and reflectance
`
`patterns of complex objects like human beings, requiring enormous processing
`
`power that was prohibitively expensive and generally unavailable in the early-to-mid
`
`1990s. (Ex. 1013, ¶ 45.) Moreover, such models would still not have produced an
`
`acceptable substitute for natural image data. During this period, synthetic data was
`
`vastly inferior to real image data (and still is in many cases), and required expensive,
`
`time-consuming and complex models to produce. That Lemelson says nothing
`
`about how such a synthetic model was developed or implemented indicates that
`
`Lemelson’s disclosure relates to real image data. (Id.)
`
`Moreover, Lemelson itself cites to references which explicitly disclose training
`
a neural network on real data. (Ex. 1013, ¶ 47.) For example, Lemelson cites to
`
`“Integration of Acoustic and Visual Speech Signals Using Neural Networks”
`
(“Yuhas”). (Ex. 1002, 19:26-28; Ex. 1003, p. 56.) Yuhas teaches training a neural
`
`network on visual and auditory speech data in order to enable the network to
`
`
`
`
`
`
`
recognize human speech. See Yuhas, B. P. et al., “Integration of Acoustic and Visual

Speech Signals Using Neural Networks,” IEEE Communications Magazine, pp. 65-71,

November 1989. In particular, Yuhas teaches that “speech signals… were
`
`obtained from a male speaker who was videotaped…” and that the resulting images
`
`were “sampled to produce a topographically accurate image of 20 x 25 pixels.” (Id. at
`
`67.) As Yuhas exemplifies, one of ordinary skill would necessarily have resorted to
`
`training a neural network on directly-imaged data to teach it to recognize (i.e.
`
`accurately ascribe labels to) complex objects, such as a human face. (Ex. 1013, ¶ 47.)
`
`Additionally, the argument that training only on a specific feature of an object
`
`(such as the nose on a person’s face) is not training a system with “data of possible
`
`exterior objects” is meritless. Again, this is an attempt to distinguish the claims from
`
`anticipatory prior art by importing the requirement that the “whole” object must be
`
`trained on. There is simply no support in the plain language of the claims for this
`
`assertion. Training on a specific feature of an object is training on the object itself
`
`because the information received from/about the specific feature would still originate
`
from or be generated by the object. (Ex. 1013, ¶ 47.)
`
`Lemelson discloses an “output means…”, element 10(e) (element 23(e) is
`
`substantially the same). The Board construed this as an “electronic circuit or circuits
`
`capable of outputting a signal to another vehicle system.” Lemelson teaches this
`
`through its disclosure of a processor (decision computer 23) that accepts codes from
`
`the image analysis computer 19 and “integrates the inputs from the image analysis
`
`
`
`
`
`
`
`computer 19” as well as from a “radar or lidar computer 14.” (Ex. 1002, 8:30-33, 6:1-
`
`8; Ex. 1003, pp. 9, 13-14.) The decision computer 23 performs the function of
`
`“affecting a system in the vehicle” by generating control signals to control a vehicle
`
`system such as the brakes or steering wheel. (Ex. 1002, 5:46-52; Ex. 1003, pp. 8-9.)
`
`Based on the codes provided by the image-analyzing computer 19, the decision
`
`computer 23 can also operate a heads-up display viewable by the driver or a warning
`
`light. (Ex. 1002, 5:45-56; Ex. 1003, pp. 8-9.) A decision computer that generates
`
`control signals to vehicle systems such as brakes, steering or a warning display would
`
`necessarily be an “electronic circuit or circuits capable of outputting a signal to
`
`another vehicle system.” (Ex. 1013, ¶ 49.) Thus, Lemelson anticipates claims 10 and
`
`23 because it discloses every limitation of those claims.
`
`Claim 11 depends directly from claim 10 and recites a “measurement means for
`
`measuring the distance from the at least one exterior object to said vehicle, said
`
`measurement means comprising radar.” Lemelson satisfies this because it discloses
`
`that “[a]n auxiliary range detection means comprises a range computer 21 which
`
`accepts digital code signals from a radar or lidar computer 14 which interprets radar
`
`and/or laser range signals from respective reflected radiation receiving means on the
`
`vehicle.” (Ex. 1002, 5:67-6:8; Ex. 1003, p. 9.) Lemelson also teaches “distance
`
`measurements from radar/lidar systems” and thus anticipates claim 11. (Ex. 1002,
`
`8:55-58; Ex. 1003, p. 14; Ex. 1013, ¶ 50.)
`
`
`
`
`
`
`
Claim 15 depends directly from claim 10 and requires that “said processor
`
`means comprise[] a neural network algorithm.” Lemelson satisfies this element by
`
`teaching that “[t]he various hardware and software elements used to carry out the
`
`invention described herein are illustrated in the form of block diagrams, flow charts,
`
and depictions of neural network and fuzzy logic algorithms…” (emphasis added). (Ex.
`
`1002, 4:31-34; Ex. 1003, pp. 6-7.) Lemelson teaches that “[n]eural networks used in
`
`the vehicle warning system are trained to recognize roadway hazards which the vehicle
`
`is approaching including automobiles, trucks, and pedestrians. Training involves
`
`providing known inputs to the network resulting in desired output responses. The
`
`weights are automatically adjusted based on error signal measurements until the
`
`desired outputs are generated. Various learning algorithms may be applied.” (Ex.
`
`1002, 8:1-8; Ex. 1003, p. 13.) Thus, Lemelson anticipates claim 15. (Ex. 1013, ¶ 51.)
`
`Claim 19 depends from claim 10, and requires that the “reception means
`
`comprise a CCD array.” As discussed above, Lemelson satisfies this limitation at least
`
`through its disclosure that the “video camera 16 is preferably a CCD array.” (Ex.
`
`1002, 6:31-32; Ex. 1003, p. 10; Ex. 1013, ¶ 52.)
`
`Claim 23 recites substantially the same limitations as claim 10 (as stated above)
`
`except in method claim form. Claim 23 does not expressly include the “trained
`
`pattern recognition” limitation recited in element 10(d). Claim 23 instead requires
`
`“processing the electronic signal” by “generating a pattern recognition algorithm,”
`
`“storing the algorithm within the pattern recognition system,” and “applying the
`
`
`
`
`
`
`
`pattern recognition algorithm.” For the reasons discussed above, Lemelson satisfies
`
`these limitations. (Ex. 1013, ¶¶ 30, 31, 33, 48, 49.) As shown in the below claim
`
`charts, Lemelson teaches each and every element of claims 10, 11, 15, 19, and 23.
`
’000 Patent – Claim 10 / Lemelson (Ex. 1002); ’304 Application (Ex. 1003)

10. In a motor vehicle having an interior and an exterior, a monitoring system for monitoring at least one object exterior to said vehicle comprising:

    E.g., Fig. 1; see also Ex. 1003, Fig. 1. E.g., 2:14-20, “a video scanning system, such as a television camera and/or one or more laser scanners mounted on the vehicle scan the road in front of the vehicle and generate image information which is computer analyzed per se or in combination with a range sensing system to warn the driver of hazardous conditions during driving by operating a display.” See also Ex. 1003, p. 3.

a) transmitter means for transmitting electromagnetic waves to illuminate the at least one exterior object;

    E.g., 4:8-15, “Another object is to provide a system and method employing a television scanning camera mounted on a vehicle for scanning the field ahead, such as the image of the road ahead of the vehicle and a computer for analyzing the image signals generated wherein automatic image intensifying, or infra-red scanning and detection means is utilized to permit scanning operations to be effected during driving at night and in low light, snowing or fog conditions.” See also Ex. 1003, p. 6. E.g., 6:34-37, “The video camera 16 may also be implemented with other technologies including known image intensifying electron gun and infrared imaging methods.” See also Ex. 1003, p. 10. E.g., 5:56, “[A] head light controller 41.” See also Ex. 1003, p. 9.

b) reception means for receiving reflected electromagnetic illumination from the at least one exterior object;

    E.g., Figs. 1-2. See also Ex. 1003, Figs. 1-2. E.g., 5:31-39, “A television camera(s) 16 having a wide angle lens 16L is mounted at the front of the vehicle such as the front end of the roof, bumper or end of the hood to scan the road ahead of the vehicle . . . The analog signal output of camera 16 is digitized in an A/D convertor