`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`__________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`__________________________________________________________________
`
`TOYOTA MOTOR CORPORATION
`
`Petitioner
`
`
`
`Patent No. 5,845,000
`Issue Date: Dec. 1, 1998
`
Title: OPTICAL IDENTIFICATION AND MONITORING SYSTEM USING
`PATTERN RECOGNITION FOR USE WITH VEHICLES
`__________________________________________________________________
`
`DECLARATION OF NIKOLAOS PAPANIKOLOPOULOS, PH.D.
`
`
`Case No. IPR2015-00262
`__________________________________________________________________
`
`
`
`IPR2015-00262 - Ex. 1109
`Toyota Motor Corp., Petitioner
`
`
`
`
`
`
`I, Nikolaos Papanikolopoulos, Ph.D., hereby declare and state as follows:
`
I. BACKGROUND

1. I am currently employed by the University of Minnesota as a Distinguished
`
`McKnight University Professor of Computer Science and Engineering. I have been a
`
`professor at the University of Minnesota (originally as an assistant professor, and then
`
`as an associate professor) since the Fall of 1992. Between Fall 2001 and Spring 2004,
`
`and between Fall 2010 and Spring 2013, I was the Director of Undergraduate Studies
`
`of the College of Science and Engineering.
`
2. In 1992, I received my Ph.D. in Electrical and Computer Engineering from
`
`Carnegie Mellon University. My thesis was entitled “Controlled Active Vision” and
`
`focused on using computer vision in a controlled fashion to monitor and manipulate
`
`objects in the environment. In 1988, I also received my M.S. in Electrical and
`
`Computer Engineering from Carnegie Mellon University. My B.S. in Electrical
`
`Engineering was received in 1987 from the National Technical University in Athens,
`
`Greece.
`
3. Over the last nineteen years, my research and teaching work has focused on
`
`computer vision, intelligent transportation systems, and robotics. This research has
`
`included autonomous vehicles and object detection and recognition including work
`
`with artificial intelligence and pattern recognition systems.
`
`4. My research in the early 1990’s focused on solving sensor deployment
`
`problems including using sensory systems and algorithms to monitor the exterior and
`
`
`
`
`
`
`
`
`interior spaces of vehicles. Our efforts ranged from monitoring for pedestrians at
`
`crosswalks to performing real-time vehicle following. In particular, we developed a
`
`system (using a CCD camera) that could track humans as articulated bodies. We also
`
`created a system that detected the license plate of a vehicle ahead and then allowed
`
`the vehicle on which the camera was mounted to keep a constant distance from the
`
`leading vehicle. A screenshot of the pertinent system display is shown in Figure 1.
`
`Figure 1
`
`
`
5. I currently teach three courses relating to intelligent systems: (i) CSci 5561
`
`Computer Vision, (ii) CSci 5511 Artificial Intelligence, and (iii) CSci 5551
`
`
`
`
`
`
`
`
`Introduction to Intelligent Robotic Systems.
`
`6. My research has produced more than 320 journal and conference publications.
`
`More than 70 publications are in refereed journals. Many of my publications relate to
`
`intelligent systems (including intelligent vehicles). Some examples include:
`
`Somasundaram, G., Sivalingam, R., Morellas, V., and Papanikolopoulos, N.P.,
`“Classification and Counting of Composite Objects in Traffic Scenes Using
`Global and Local Image Analysis”, IEEE Trans. on Intelligent Transportation
`Systems, Volume 14, No. 1, March 2013, pp. 69-81.
`
Atev, S., Miller, G., and Papanikolopoulos, N.P., “Clustering of Vehicle
Trajectories”, IEEE Trans. on Intelligent Transportation Systems, Volume 11,
No. 3, September 2010, pp. 647-657.
`
`Atev, S., Arumugam, H., Masoud, O., Janardan, R., and Papanikolopoulos,
`N.P., “A Vision-Based Approach to Collision Prediction at Traffic
`Intersections”, IEEE Trans. on Intelligent Transportation Systems, Volume 6,
`No. 4, December 2005, pp. 416-423.
`
`Masoud, O., and Papanikolopoulos, N.P., “A Novel Method for
`Tracking and Counting Pedestrians in Real-time Using a Single
`Camera”, IEEE Trans. on Vehicular Technology, Volume 50, No. 5,
`September 2001, pp. 1267-1278.
`
`Du, Y., and Papanikolopoulos, N.P., "Real-time Vehicle Following
`Through a Novel Symmetry-Based Approach", Proceedings of the 1997 IEEE
`Int. Conf. on Robotics and Automation, pp. 3160-3165, Albuquerque, NM,
`April 20-25, 1997.
`
7. As a result of my work and research, I am familiar with the design, control,

operation and functionality of exterior monitoring systems for vehicles, including
`
`those employed on hybrid vehicles.
`
8. A copy of my curriculum vitae is included herewith.
`
`
`
`
`
`II. ASSIGNMENT AND COMPENSATION
9. I submit this declaration in support of the Petition for Inter Partes Review of
`
`U.S. Patent No. 5,845,000 (“the ’000 patent”) filed by Toyota Motor Corporation
`
`(“Toyota”).
`
10. I am not an employee of Toyota or any affiliate or subsidiary thereof.

11. I am being compensated for my time at a rate of $500 per hour. My
`
`compensation is in no way dependent upon the substance of the opinions I offer
`
`below, or upon the outcome of Toyota’s Petition for Inter Partes Review (or the
`
`outcome of such an inter partes review, if a review is granted).
`
12. I have been asked to provide certain opinions relating to the ’000 patent.
`
`Specifically, I have been asked to provide my opinion regarding (i) the level of
`
`ordinary skill in the art to which the ’000 patent pertains, and (ii) the patentability of
`
`claims 10, 11, 16, 17, 19, 20, and 23 of the ’000 patent, assuming that the “generated
`
`from” phrase in those claims constitutes a limitation, assuming further that it requires
`
`training with “real data” and assuming further that it is not explicitly disclosed by
`
`Lemelson.
`
`13. The opinions expressed in this declaration are not exhaustive of my opinions
`
`on the patentability of any of the claims in the ’000 patent. Therefore, the fact that I
`
`do not address a particular point should not be understood to indicate any agreement
`
`on my part that any claim otherwise complies with the patentability requirements.
`
`Further, I previously executed declarations in connection with Toyota’s other petition
`
`
`
`
`
`
`
`
`for inter partes review of the ’000 patent (IPR2013-00424), in which I expressed my
`
`opinion that Lemelson discloses the “generated from” limitation, even if it is
`
`interpreted to require “real data.” However, I have been asked to assume for the
`
`purposes of this declaration that it does not.
`
`14. The opinions expressed in this declaration are my personal opinions and do not
`
reflect the views of the University of Minnesota.
`
`III. LEGAL STANDARDS
15. I have been informed and I understand that a patentability analysis is
`
`performed from the viewpoint of a hypothetical person of ordinary skill in the art. I
`
`understand that “the person of ordinary skill” is a hypothetical person who is
`
`presumed to be aware of the universe of available prior art as of the time of the
`
`invention at issue.
`
16. I understand that a patent claim is unpatentable as anticipated when a single
`
`piece of prior art describes every element of the claimed invention, either expressly or
`
`inherently, and arranged in the same way as in the claim. For inherent anticipation to
`
`be found, it is required that the missing descriptive material is necessarily present in
`
`the prior art. I understand that, for the purpose of an inter partes review, prior art that
`
`anticipates a claim can include both patents and printed publications from anywhere
`
`in the world.
`
17. I understand that some claims are written in dependent form, in which case
`
`they incorporate all of the limitations of the claim(s) on which they depend. I have
`
`
`
`
`
`
`
`
`further been informed that material not explicitly contained in a single prior art
`
`document may still be considered for purposes of anticipation if that material is
`
`incorporated by reference into the document. The document must be incorporated in
`
`such a manner that makes clear that the material is effectively part of the host
`
`document as if it were explicitly contained therein.
`
18. I understand that a patent claim is unpatentable as obvious if the subject matter
`
`of the claim as a whole would have been obvious to a person of ordinary skill in the
`
`art as of the time of the invention at issue. I understand that the following factors
`
`must be evaluated to determine whether the claimed subject matter is obvious: (1) the
`
`scope and content of the prior art; (2) the difference or differences, if any, between
`
`the scope of the claim of the patent under consideration and the scope of the prior
`
`art; and (3) the level of ordinary skill in the art at the time the patent was filed. Unlike
`
`anticipation, which allows consideration of only one item of prior art, I understand
`
`that obviousness may be shown by considering more than one item of prior
`
`art. Moreover, I have been informed and I understand that so-called objective indicia
`
`of non-obviousness, also known as “secondary considerations,” like the following are
`
`also to be considered when assessing obviousness: (1) commercial success; (2) long-
`
`felt but unresolved needs; (3) copying of the invention by others in the field; (4) initial
`
`expressions of disbelief by experts in the field; (5) failure of others to solve the
`
`problem that the inventor solved; and (6) unexpected results. I also understand that
`
`evidence of objective indicia of non-obviousness must be commensurate in scope
`
`
`
`
`
`with the claimed subject matter.
`
`19. As an initial matter, I have been informed that claim terms may be written in
`
`means-plus-function format. In this situation, the means-plus-function claim terms
`
`cover the corresponding structure identified in the specification for performing the
`
`claimed function, and equivalents thereof.
`
20. I have applied these principles with respect to my analysis set forth below.
`
`Also, I have applied the claim constructions set forth by the Board in its Decision on
`
`Institution in IPR2014-00647.
`
`IV. LEVEL OF ORDINARY SKILL IN THE ART
21. I have been asked to provide my opinion regarding the level of ordinary skill in
`
`the art in May 1994 (which I understand is the month in which an application to
`
which the ’000 patent claims priority was filed) and June 1995, which is the month in which
`
`the application leading to the ’000 patent was filed.1
`
22. It is my opinion that, in May 1994, a person of ordinary skill in the art would
`
`have had one of the following: (i) a bachelor’s degree in electrical engineering,
`
`mechanical engineering, computer engineering, or computer science (or a closely
`
`related field) with at least four years of experience working with intelligent vehicles or
`
`exterior monitoring vehicle systems, (ii) a master’s degree in electrical engineering,
`
`mechanical engineering, computer engineering, or computer science (or a closely
`
`
`1 My opinion on the state of the art would not change even if the effective filing date
`were in May of 1992, the earliest date to which the ’000 patent claims priority.
`
`
`
`
`
`related field) with at least two years of experience working with intelligent vehicles or
`
`exterior monitoring vehicle systems or (iii) a Ph.D. in electrical engineering,
`
`mechanical engineering, computer engineering, or computer science (or a closely
`
`related field).2
`
23. In my opinion, the level of ordinary skill in the art would have been the same

in June 1995 (and at any time between May 1994 and June 1995).
`
24. In opining on the level of ordinary skill in the art, I have considered the
`
`following factors: (i) the education level of the inventor; (ii) the type of problems
`
`encountered in the art; (iii) prior art solutions to those problems; (iv) the rapidity with
`
`which innovations are made; (v) the sophistication of the technology; and (vi) the
`
`education level of active workers in the field.
`
`25. Based on my experience and education, I consider myself to have been a
`
`person of at least ordinary skill in the art with respect to the field of technology
`
`implicated by the ’000 patent from the time of filing to the present.
`
V. BACKGROUND OF THE ’000 PATENT
`26. The ’000 patent generally describes a system and method for monitoring the
`
`interior and exterior of a vehicle and for identifying objects. The ’000 patent
`
`describes a number of different types of receivers and transmitters for performing the
`
`
`2 Although I have applied this level of ordinary skill in analyzing the obviousness
`issues, it is my opinion that claims 10-11, 16-17, 19-20 and 23 are, for the reasons set
`forth below, so clearly obvious that even a person of lesser skill would have found
`them obvious.
`
`
`
`
`
`
`
`
`identification. For example, CCD arrays are mentioned in 7:33-35 as receivers.
`
`Transmitters, like infrared ones, are discussed in 7:30-31. The information from the
`
`CCD arrays is processed by computational methodologies (“trained pattern
`
`recognition technologies”), such as a neural computer with the objective of classifying
`
`and identifying external objects. The output of this step is used to affect a response
`
`system of the vehicle.
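For illustration only, the processing chain described above (receiver, processor, trained pattern recognition, response system) can be sketched schematically in Python. The function names, weights, and object labels below are hypothetical; this is a sketch of the general technique, not the implementation disclosed in the ’000 patent.

```python
# Illustrative sketch of a receiver-to-response chain: raw receiver output is
# processed into a characteristic signal, a pre-trained pattern recognition
# step identifies the object, and the identification affects a vehicle system.
# All names and values are hypothetical.

def process_received_illumination(pixels):
    """Reduce raw receiver output (e.g., a CCD frame) to a feature signal."""
    total = sum(pixels)
    return [p / total if total else 0.0 for p in pixels]

def classify(signal, trained_weights):
    """Apply a pattern recognition algorithm generated beforehand from data of
    possible exterior objects (here, a trivial linear scorer)."""
    scores = {label: sum(w * s for w, s in zip(weights, signal))
              for label, weights in trained_weights.items()}
    return max(scores, key=scores.get)

def affect_vehicle_system(identification):
    """Use the identification to affect another system in the vehicle."""
    return {"pedestrian": "brake", "vehicle": "maintain_distance"}.get(
        identification, "no_action")

# Hypothetical pre-trained weights for two object classes.
TRAINED = {"pedestrian": [0.9, 0.1, 0.0], "vehicle": [0.1, 0.2, 0.9]}
signal = process_received_illumination([8, 1, 1])
print(affect_vehicle_system(classify(signal, TRAINED)))  # -> brake
```

The key point of the sketch is that the classifier's weights are fixed before use, having been "generated from data of possible exterior objects," and only applied at run time.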
`
`VI. CLAIMS OF THE ’000 PATENT
`27. The ’000 patent includes 25 claims. As noted above, I have been asked to
`
`consider the patentability of claims 10, 11, 16, 17, 19, 20, and 23. These claims are
`
`reproduced below for reference:
`
`10. In a motor vehicle having an interior and an exterior, a monitoring
`system for monitoring at least one object exterior to said vehicle comprising:
`
`a) transmitter means for transmitting electromagnetic waves to illuminate
`the at least one exterior object;
`
`b) reception means for receiving reflected electromagnetic illumination
`from the at least one exterior object;
`
`c) processor means coupled to said reception means for processing said
`received illumination and creating an electronic signal characteristic of said
`exterior object based thereon;
`
`d) categorization means coupled to said processor means for categorizing
`said electronic signal to identify said exterior object, said categorization
`means comprising trained pattern recognition means for processing said
`electronic signal based on said received illumination from said exterior
`object to provide an identification of said exterior object based thereon,
`said pattern recognition means being structured and arranged to apply a
`pattern recognition algorithm generated from data of possible exterior
`objects and patterns of received electromagnetic illumination from the
`possible exterior objects; and
`
`
`
`
`
`
`
`
`
`
`e) output means coupled to said categorization means for affecting another
`system in the vehicle in response to the identification of said exterior
`object.
`
11. The system in accordance with claim 10, further comprising
measurement means for measuring the distance from the at least one exterior
object to said vehicle, said measurement means comprising radiation.
`
`16. In a motor vehicle having an interior and an exterior, an automatic
`headlight dimming system comprising:
`
`a) reception means for receiving electromagnetic radiation from the
`exterior of the vehicle;
`
`b) processor means coupled to said reception means for processing the
`received radiation and creating an electronic signal characteristic of the
`received radiation;
`
`c) categorization means coupled to said processor means for categorizing
said electronic signal to identify a source of the radiation, said
`categorization means comprising trained pattern recognition means for
`processing said electronic signal based on said received radiation to provide
`an identification of the source of the radiation based thereon, said pattern
`recognition means being structured and arranged to apply a pattern
`recognition algorithm generated from data of possible sources of radiation
`including lights of vehicles and patterns of received radiation from the
`possible sources; and
`
`d) output means coupled to said categorization means for dimming the
`headlights in said vehicle in response to the identification of the source of
`the radiation.
`
`17. The invention in accordance with claim 16 wherein said categories
`further comprise radiation from taillights of a vehicle-in-front.
`
`19. The system of claim 10, wherein said reception means comprise a CCD
`array.
`
`20. The invention in accordance with claim 16, wherein said reception
`means comprise a CCD array.
`
`23. A method for affecting a system in a vehicle based on an object exterior
`of the vehicle, comprising the steps of:
`
`
`
`
`
`
`a) transmitting electromagnetic waves to illuminate the exterior object;
`
`b) receiving reflected electromagnetic illumination from the object on an
`array;
`
`c) processing the received illumination and creating an electronic signal
`characteristic of the exterior object based thereon;
`
`d) processing the electronic signal based on the received illumination from
`the exterior object to identify the exterior object, said processing step
`comprising the steps of generating a pattern recognition algorithm from
`data of possible exterior objects and patterns of received electromagnetic
`illumination from the possible exterior objects, storing the algorithm within
`a pattern recognition system and applying the pattern recognition algorithm
`using the electronic signal as input to obtain the identification of the
`exterior object; and
`
`e) affecting the system in the vehicle in response to the identification of the
`exterior object.
`VII. BACKGROUND ON THE STATE OF THE ART
`28. The following is a brief exemplary discussion of the state of the art prior to
`
`May 1994.
`
`29. During the last forty years, there has been a growing interest in intelligent
`
`vehicles (IV) and intelligent transportation systems (ITS). With emphasis on
`
`improved safety and improved system efficiency, a large number of applications have
`
`affected our everyday lives.
`
`30. The Defense Advanced Research Projects Agency (DARPA) funded several
`
`programs throughout the US in the 1980s with the objective of creating autonomous
`
vehicles (under its Strategic Computing Initiative, one such project was named the Autonomous Land Vehicle
`
`(ALV)). Furthermore, the Image Understanding effort focused initially on cameras
`
`
`
`
`
`
`
`
`(sometimes in stereo pairs) to provide a situation awareness for the computational
`
`logic that drives a vehicle.
`
`31. Groups at Carnegie Mellon University, University of Maryland, and University
`
of Massachusetts-Amherst worked on different aspects of the same problem:
`
`developing intelligent vehicles. Meetings like the DARPA Image Understanding
`
`Workshops and organizations like the Intelligent Transportation Society of America
`
`provided immediate dissemination of knowledge to various stakeholders.
`
`32. Other groups in Europe (e.g., Germany) focused throughout the late 1980’s
`
`and early 1990’s on the use of computer vision to drive a vehicle autonomously at
`
`high speeds. In this case, the emphasis was on the use of estimation and control
`
techniques that would drive the vehicle based on stereo vision information. Their
`
`methods were similar to trained pattern recognition with the ability to monitor the
`
`lane markers of the roadway.
`
`33. Vehicle manufacturers in Japan, such as Toyota and Nissan, and Europe, such
`
`as Renault and Volkswagen, also built sensory systems to fit a wide range of vehicles
`
`from compact cars to trucks.
`
`34. Throughout all of these applications, various combinations of sensors including
`
`transmitters and detectors were used. The sensors included radar, laser radar, infrared
`
`emitters and detectors, as well as television cameras and CCD arrays. All of these
`
`systems functioned to receive and measure electromagnetic waves in order to detect
`
`objects in a vehicle’s environment.
`
`
`
`
`
`
`
`
35. Additionally, extensive research was performed with respect to

the application of neural networks to object detection and control. This research was

published in a number of different patents and articles, including, for example:
`
`1) Kornhauser, A., “Neural Network Approaches for Lateral Control
`of Autonomous Highway Vehicles”, Proceedings of the Vehicle Navigation
`and Information Systems Conference, 1991, pp. 1143-1151.
`
2) Plumer, E., “Neural Network Structure for Navigation Using
Potential Fields”, Proceedings of the International Joint Conference on Neural
`Networks, 1992, pp. 327-332.
`
3) Kraiss, K., and Kuttelwesch, H., “Teaching Neural Networks to
`Guide a Vehicle Through an Obstacle Course by Emulating a Human
`Teacher”, Proceedings of the International Joint Conference on Neural Networks,
`1990, pp. 333-337.
`
`4) Ciaccia, P., Maio, D., and Rizzi, S., “Integrating Knowledge-based
`Systems and Neural Networks for Navigational Tasks”, Proceedings of the
`5th Annual European Computer Conference (CompEuro ‘91), 1991, pp. 652-656.
`
`5) Neuber, S., Nijhuis, J., and Spaanenburg, L., “Developments in
`Autonomous Vehicle Navigation”, Proceedings of CompEuro ’92, 1992, pp.
`453-458.
`
6) Luo, R., Potlapalli, H., and Hislop, D., “Outdoor Landmark
`Recognition Using Fractal Based Vision and Neural Networks”,
`Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots
`and Systems, Yokohama, Japan, 1993, pp. 612-618.
`
`7) U.S. Patent No. 6,553,130 to Lemelson, “Motor Vehicle Warning
`and Control System and Method”, Publication date April 22, 2003,
`Priority date August 11, 1993.
`
8) Pomerleau, D., “Neural Network Perception for Mobile Robot
`Guidance”, Ph.D. Thesis, Carnegie Mellon University, CMU-CS-92-115.
`February 16, 1992. (“Pomerleau”).
`
9) Pomerleau, Dean, “ALVINN: An Autonomous Land Vehicle in a
`Neural Network,” Technical Report AIP-77, Carnegie Mellon University,
`March 13, 1990. (“1990 Pomerleau”).
`
`
`
`
`
`10) Arain et al., “Action Planning for the Collision Avoidance System
`Using Neural Networks,” Proceedings of the Intelligent Vehicles 1993
`Symposium, 1993.
`
`11) Catala, et al., “A Neural Network Texture Segmentation System
`for Open Road Vehicle Guidance,” Proceedings of the Intelligent Vehicles
`1992 Symposium, 1992.
`
`12) Goerick, et al., “Local Orientation Coding and Neural Network
`Classifiers with an Application to Real Time Car Detection and
`Tracking,” Mustererkennung 1994, Proceedings of the 16th Symposium of the
`DAGM and the 18th Workshop of the OAGM, Springer-Verlag, 1994.
`
`13) U.S. Patent No. 5,541,590 to Nishio, “Vehicle Crash Predictive
`and Evasive Operation System by Neural Networks,” Publication date
`July 30, 1996, Priority date August 4, 1992.
`36. Monitoring the exterior environment for object recognition is one of the
`
`applications of the aforementioned intelligent vehicles, including those vehicles that
`
`had utilized neural networks. Exterior monitoring in particular had been the subject
`
`of extensive research in the late 1980’s and the early 1990’s. Many research groups,
`
`including those mentioned in the prior art listed above, had implemented systems to
`
`analyze a vehicle scene by using various techniques that ranged from model-based
`
`computer vision to neural networks. (See, e.g., Kornhauser, “Neural Network
`
`Approaches for Lateral Control of Autonomous Highway Vehicles,” Proceedings of the
`
`Vehicle Navigation and Information Systems Conference, pp. 1143-1151, 1991 (“1991
`
Kornhauser”); 1990 Pomerleau; Dickmanns, et al., “An All-Transputer Visual
`
`Autobahn-Autocopilot/Copilot,” 1993 Proceedings of the Fourth International Conference on
`
`Computer Vision, pp. 608-615, 1993 (“1993 Dickmanns”); Ciaccia et al., “Integrating
`
Knowledge-Based Systems and Neural Networks for Navigational Tasks,” Proceedings
`
`
`
`
`
`of the 5th Annual European Computer Conference (CompEuro ‘91), pp. 652-656, 1991 (“1991
`
Ciaccia”); Pomerleau; Neuber, et al., “Developments in Autonomous Vehicle
`
Navigation,” Proceedings of CompEuro ’92, pp. 453-458, 1992 (“1992 Neuber”); and
`
`Luo, et al., “Outdoor Landmark Recognition Using Fractal Based Vision and Neural
`
`Networks,” Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots
`
`and Systems, Yokohama, Japan, July 26-30, 1993 (“1993 Luo”).)
`
37. For example, researchers used information about an object’s appearance when
`
`perceived through an imaging apparatus, such as white lane markers and traffic signs
`
`as perceived through video cameras, to facilitate object recognition or detection. (See,
`
`e.g., Dickmanns, et al., “An Integrated Spatio-Temporal Approach to Automatic
`
`Visual Guidance of Autonomous Vehicles,” IEEE Transactions on Systems, Man and
`
`Cybernetics, Vol. 20, No. 6, pp. 1273-1284, 1990 (“1990 Dickmanns”); 1993
`
`Dickmanns; 1993 Luo.)
`
38. Furthermore, some utilized traditional numerical methods to analyze and
`
`measure every element in a scene so as to create very accurate representations of the
`
`exterior environment. Groups in Germany, for example, used advanced estimation
`
`techniques to measure the road and vehicle parameters and perform obstacle
`
`avoidance at high speeds. (See, e.g., Graefe, et al., “Towards a Vision Based Robot
`
`with a Driver’s License,” 1988 IEEE International Workshop on Intelligent Robots (IROS
`
`88), pp. 627-632, 1988, (“1988 Graefe”); 1990 Dickmanns; 1993 Dickmanns.)
`
`39. As computers improved from the late 1980’s to the early 1990’s, neural
`
`
`
`
`
`networks were viewed as a viable alternative. In particular, neural network training
`
`became more manageable and many groups in the United States and Europe utilized
`
`artificial neural networks for performing obstacle avoidance and autonomous
`
`navigation. (See, e.g., 1990 Pomerleau; Pomerleau.)
`
`40. Neural network methodologies, such as back-propagation, provided ways to
`
`quickly adapt to the rapidly evolving scenes that vehicles would encounter. This
`
`training information was captured as “weights” that were assigned to various
`
`structures and components within often hidden layers of the neural networks. (See,
`
`e.g., 1991 Kornhauser, 1990 Pomerleau; 1992 Neuber; Pomerleau.) The sensory
`
`information such as images acquired from video cameras, infrared cameras, and laser
`
`radar, were fed into artificial neural networks and the internal network layers would
`
`provide outputs to drive the vehicle by controlling vehicle systems including steering
`
`as was the case with the NAVLAB vehicle. (See, e.g., 1990 Pomerleau; Pomerleau.)
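For illustration of the training process described in this paragraph, the following is a minimal Python sketch of a network whose hidden-layer weights are adapted by back-propagation. The network size, training data, and names are invented for illustration and are not taken from any of the cited systems.

```python
# Illustrative sketch only: a minimal feedforward network trained by
# back-propagation, showing how training information is captured as "weights",
# including in a hidden layer. The toy data and dimensions are hypothetical.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 3 inputs (a toy "image") -> 2 hidden units -> 1 output (a steering score).
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(2)]

def forward(x):
    """Propagate an input through the hidden layer to the output unit."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    return h, sigmoid(sum(w * hi for w, hi in zip(w_out, h)))

def train(samples, lr=0.5, epochs=2000):
    """Adjust the weights by gradient descent on the squared output error."""
    for _ in range(epochs):
        for x, target in samples:
            h, y = forward(x)
            d_out = (y - target) * y * (1 - y)
            for j in range(len(w_out)):
                d_h = d_out * w_out[j] * h[j] * (1 - h[j])
                w_out[j] -= lr * d_out * h[j]
                for i in range(len(x)):
                    w_hidden[j][i] -= lr * d_h * x[i]

# Toy training set: "bright on the left" scenes -> 1.0, "bright right" -> 0.0.
data = [([1, 0, 0], 1.0), ([0, 0, 1], 0.0), ([1, 1, 0], 1.0), ([0, 1, 1], 0.0)]
train(data)
print(forward([1, 0, 0])[1] > forward([0, 0, 1])[1])  # -> True
```

After training, the learned behavior resides entirely in the weight values, which is the sense in which training information was "captured as weights" in the systems discussed above.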
`
`VIII. ANALYSIS
A. Scope and Content of the Prior Art
`41. The scope and content of the prior art as of May 1994 would have broadly
`
`included patents and publications regarding vehicle sensing systems as well as
`
`computer vision and object identification (regardless of whether specifically applied in
`
`automobiles or otherwise).
`
42. In my opinion, the references discussed below would all have been considered
`
`to be within the same technical field as the subject matter of the ’000 patent.
`
`
`
`
`
`
`
`
`Furthermore, all of these references would be considered highly relevant prior art to
`
`claims 10, 11, 16, 17, 19, 20, and 23 of the ’000 patent.
`
`43. My opinion is the same with respect to the scope and content of the prior art as
`
`of May 1994 and any time between May 1994 and June 1995.
`
B. List of Prior Art References Discussed in Analysis

44. In my analysis, I discuss the following references, which I introduce here to
`
`provide abbreviations.
`
`1) U.S. Patent No. 6,553,130 (“Lemelson,” Exhibit 1102) issued from U.S.
`
`Appl. No. 08/671,853 (“’853 app.”), filed on June 28, 1996. The ’853 application is a
`
continuation of U.S. Appl. No. 08/105,304 (“’304 app.,” Exhibit 1103), which was
`
`filed on Aug. 11, 1993. I have been asked to review the ’304 app. to see whether it
`
`contains the same or materially the same disclosure that I rely upon from Lemelson.
`
`As set forth below, I believe that it does, and have included parallel citations to the
`
`specification in that application.
`
2) U.S. Patent No. 5,541,590 (“Nishio,” Exhibit 1104) issued from U.S. Appl. No.
`
`375,249, which is a continuation of U.S. Appl. No. 08/097,178 (“’178 app.,” Exhibit
`
`1105), filed on July 27, 1993.
`
`3) U.S. Patent No. 5,214,408 to Asayama (“Asayama,” Exhibit 1106) was filed
`
`on Oct. 24, 1991 and issued on May 25, 1993.
`
`4) Japanese Unexamined Patent Application Publication No. S62-131837 to
`
`Yanagawa (“Yanagawa,” Exhibit 1107) published on June 15, 1987. Yanagawa was
`
`
`
`
`
`
`
`
`published in Japanese. I have reviewed the English translation and associated
`
affidavit (Exhibit 1108).
`
C. Claims 10, 11, 19, and 23 are Obvious Under 35 U.S.C. § 103 Over
Lemelson

45. It is my opinion that Lemelson renders obvious all of the limitations described
`
`in claims 10, 11, 19 and 23. Lemelson describes a system in a vehicle that uses
`
`sensors to detect possible obstacles on the roadway. The sensors include CCD
`
`cameras that can be placed in a stereovision configuration and/or on different
`
`locations around the vehicle. Lemelson also discloses radar/lidar for measuring
`
`distances to exterior objects. The lidar (also called laser radar) emits laser light and
`
analyzes the reflected radiation; thus, the term lidar comes from the combination of
`
`the words light and radar. Then a processor that uses a pattern recognition
`
`methodology (e.g., a neural network) analyzes the images obtained by the cameras to
`
`determine the identity of obstacles and affect numerous vehicle subsystems. The
`
system of Lemelson can then utilize inputs from all the sensors as well as the state of the
`
`vehicle in order to automatically affect a vehicle operation such as braking, steering,
`
`etc.
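The fusion of sensor inputs and vehicle state described above can be sketched schematically in Python. The names and thresholds below are hypothetical illustrations of this kind of decision logic, not Lemelson's disclosed implementation.

```python
# Illustrative sketch of a decision computer that fuses an identified obstacle,
# a radar/lidar range, and the vehicle state into a control action.
# Hypothetical names and thresholds only.
from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_mps: float  # current vehicle speed, meters per second

def decide(identification: str, range_m: float, state: VehicleState) -> str:
    """Return a control action from fused sensor inputs."""
    if identification == "obstacle" and range_m < 2.0 * state.speed_mps:
        # Obstacle closer than roughly two seconds of travel time: brake.
        return "brake"
    if identification == "obstacle":
        return "warn_driver"
    return "no_action"

print(decide("obstacle", 20.0, VehicleState(speed_mps=15.0)))  # -> brake
```

The sketch simply makes concrete the point that the braking/steering decision depends jointly on the identification, the measured distance, and the vehicle's own state.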
`
46. In my opinion, the disclosure in Lemelson relates to the exact same technical
`
`field as the ’000 patent and describes all the same features (and more) of a vehicle
`
`exterior monitoring system.
`
47. Figure 1 of Lemelson depicts many of the key elements of the system and is
`
`
`
`
`
`
`
`
`reproduced below:
`
`
`48. As can be seen from Figure 1, Lemelson teaches radar/lidar (14), a TV camera
`
`(16), an image analysis computer (19) for analyzing the camera images, and a decision
`
computer 23. The decision computer 23 receives the signals from the image analysis
`
`computer as well as the radar/lidar computer and outputs vehicle control signals.
`
(Ex. 1102 (Lemelson), 5:36-58, 8:31-50; Ex. 1103 (’304 app.), pp. 12-13, 17-

18.) These control signals can affect the brakes or steering, or flash the headlights. (Id.)
`
`Lemelson discloses the “transmitter means” limitations of claims 10 and 23
`49. Lemelson discloses the “transmitter” limitations of claim 10. Specifically,
`
`Lemelson discloses transmitters such as the vehicle’s headlights, (Ex. 1102
`
`
`
`
`
`
`
`
(Lemelson), 3:28, 5:57; Ex. 1103 (’304 app.), pp. 9, 13), and a laser radar system’s

transmitter, (Ex. 1102 (Lemelson), 6:2-4; Ex. 1103 (’304 app.), p. 13). In my
`
`opinion, the vehicle headlights in Lemelson would have been considered at least
`
`equivalent by one of ordinary skill in the art to the infrared transmitter structure
`
`identified in the ’000 specification. The reason is that ordinary headlights at the time
`
`that Lemelson was filed (and at the time that the ’304 app. was filed) would have
`
`emitted some amount of infrared radiation. Therefore, such headlights would have
`
`actually been infrared transmitters themselves, or even if not, been considered
`
`equivalent by one of ordinary skill.
`
`Lemelson discloses the “reception means” limitations of claims 10 and 16; and the
`“receiving” step of claim 23
50. Lemelson also discloses the “reception means” limitations of claims 10 and 16, and
`
`the “receiving” step of claim 23. Specifically, Lemelson discloses receivers such as
`
`CCD television cameras, (Lemelson, Ex. 1102, 6:31-42; ’304 app., Ex. 1103, p. 14), a
`
`laser radar system’s receiver, (Lemelson, Ex. 1102, 6:2-4; ’304 app., Ex. 1103, p. 13),
`
`and infrared receivers, (Lemelson, Ex. 1102, 4:13-15, 6:36; ’304 app., Ex. 1103, pp. 10,
`
`14). Figure 1, as reproduced above, depicts the system of Lemelson and shows “TV
`
`Camera 16” as an input to the system. Figure 1 also depicts radar and lidar as inputs
`
`to the system.
`
`51. Lemelson discloses utilizing his system in low-light, a