UNITED STATES PATENT AND TRADEMARK OFFICE
__________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
__________________

TOYOTA MOTOR CORPORATION

Petitioner

Patent No. 6,772,057
Issue Date: Aug. 3, 2004
Title: VEHICULAR MONITORING SYSTEMS
__________________

DECLARATION OF NIKOLAOS PAPANIKOLOPOULOS, PH.D.

Case No. IPR2015-00261
__________________

IPR2015-00261 - Ex. 1106
Toyota Motor Corp., Petitioner
I, Nikolaos Papanikolopoulos, Ph.D., hereby declare and state as follows:
I. BACKGROUND

1. I am currently employed by the University of Minnesota as a Distinguished McKnight University Professor of Computer Science and Engineering. I have been a professor at the University of Minnesota (originally as an assistant professor, and then as an associate professor) since the Fall of 1992. Between Fall 2001 and Spring 2004, and between Fall 2010 and Spring 2013, I was the Director of Undergraduate Studies of the College of Science and Engineering.
2. In 1992, I received my Ph.D. in Electrical and Computer Engineering from Carnegie Mellon University. My thesis was entitled “Controlled Active Vision” and focused on using computer vision in a controlled fashion to monitor and manipulate objects in the environment. In 1988, I also received my M.S. in Electrical and Computer Engineering from Carnegie Mellon University. My B.S. in Electrical Engineering was received in 1987 from the National Technical University in Athens, Greece.
3. Over the last nineteen years, my research and teaching work has focused on computer vision, intelligent transportation systems, and robotics. This research has included autonomous vehicles and object detection and recognition, including work with artificial intelligence and pattern recognition systems.
4. My research in the early 1990’s focused on solving sensor deployment problems, including using sensory systems and algorithms to monitor the exterior and interior spaces of vehicles. Our efforts ranged from monitoring for pedestrians at crosswalks to performing real-time vehicle following. In particular, we developed a system (using a CCD camera) that could track humans as articulated bodies. We also created a system that detected the license plate of a vehicle ahead and then allowed the vehicle on which the camera was mounted to keep a constant distance from the leading vehicle. A screenshot of the pertinent system display is shown in Figure 1.

Figure 1
5. I currently teach three courses relating to intelligent systems: (i) CSci 5561 Computer Vision, (ii) CSci 5511 Artificial Intelligence, and (iii) CSci 5551 Introduction to Intelligent Robotic Systems.
6. My research has produced more than 320 journal and conference publications. More than 70 publications are in refereed journals. Many of my publications relate to intelligent systems (including intelligent vehicles). Some examples include:

Somasundaram, G., Sivalingam, R., Morellas, V., and Papanikolopoulos, N.P., “Classification and Counting of Composite Objects in Traffic Scenes Using Global and Local Image Analysis”, IEEE Trans. on Intelligent Transportation Systems, Volume 14, No. 1, March 2013, pp. 69-81.

Atev, S., Miller, G., and Papanikolopoulos, N.P., “Clustering of Vehicle Trajectories”, IEEE Trans. on Intelligent Transportation Systems, Volume 11, No. 3, September 2010, pp. 647-657.

Atev, S., Arumugam, H., Masoud, O., Janardan, R., and Papanikolopoulos, N.P., “A Vision-Based Approach to Collision Prediction at Traffic Intersections”, IEEE Trans. on Intelligent Transportation Systems, Volume 6, No. 4, December 2005, pp. 416-423.

Masoud, O., and Papanikolopoulos, N.P., “A Novel Method for Tracking and Counting Pedestrians in Real-time Using a Single Camera”, IEEE Trans. on Vehicular Technology, Volume 50, No. 5, September 2001, pp. 1267-1278.

Du, Y., and Papanikolopoulos, N.P., “Real-time Vehicle Following Through a Novel Symmetry-Based Approach”, Proceedings of the 1997 IEEE Int. Conf. on Robotics and Automation, pp. 3160-3165, Albuquerque, NM, April 20-25, 1997.
7. As a result of my work and research, I am familiar with the design, control, operation and functionality of exterior monitoring systems for vehicles, including those employed on hybrid vehicles.

8. A copy of my curriculum vitae is attached as Exhibit A.
II. ASSIGNMENT AND COMPENSATION
9. I submit this declaration in support of the Petition for Inter Partes Review of U.S. Patent No. 6,772,057 (“the ’057 patent”) filed by Toyota Motor Corporation (“Toyota”).

10. I am not an employee of Toyota or any affiliate or subsidiary thereof.

11. I am being compensated for my time at a rate of $500 per hour. My compensation is in no way dependent upon the substance of the opinions I offer below, or upon the outcome of Toyota’s Petition for Inter Partes Review (or the outcome of such an inter partes review, if a review is granted).
12. I have been asked to provide certain opinions relating to the ’057 patent. Specifically, I have been asked to provide my opinion regarding (i) the level of ordinary skill in the art to which the ’057 patent pertains, and (ii) the patentability of claims 1-4, 7-10, 31, 41, 56, 59-62, and 64 of the ’057 patent, assuming that the “generated from” phrase in those claims constitutes a limitation, assuming further that it requires training with “real data,” and assuming further that it is not explicitly disclosed by Lemelson.
13. The opinions expressed in this declaration are not exhaustive of my opinions on the patentability of any of the claims in the ’057 patent. Therefore, the fact that I do not address a particular point should not be understood to indicate any agreement on my part that any claim otherwise complies with the patentability requirements. Further, I previously executed declarations in connection with Toyota’s other petition for inter partes review of the ’057 patent (IPR2013-00419), in which I expressed my opinion that Lemelson discloses the “generated from” limitation, even if it is interpreted to require “real data.” However, I have been asked to assume for the purposes of this declaration that it does not.

14. The opinions expressed in this declaration are my personal opinions and do not reflect the views of the University of Minnesota.
III. LEGAL STANDARDS

15. I have been informed and I understand that a patentability analysis is performed from the viewpoint of a hypothetical person of ordinary skill in the art. I understand that “the person of ordinary skill” is a hypothetical person who is presumed to be aware of the universe of available prior art as of the time of the invention at issue.
16. I understand that a patent claim is unpatentable as anticipated when a single piece of prior art describes every element of the claimed invention, either expressly or inherently, arranged in the same way as in the claim. For inherent anticipation to be found, the missing descriptive material must be necessarily present in the prior art. I understand that, for the purpose of an inter partes review, prior art that anticipates a claim can include both patents and printed publications from anywhere in the world.
17. I understand that some claims are written in dependent form, in which case they incorporate all of the limitations of the claim(s) on which they depend. I have further been informed that material not explicitly contained in a single prior art document may still be considered for purposes of anticipation if that material is incorporated by reference into the document. The material must be incorporated in such a manner that makes clear that it is effectively part of the host document as if it were explicitly contained therein.
18. I understand that a patent claim is unpatentable as obvious if the subject matter of the claim as a whole would have been obvious to a person of ordinary skill in the art as of the time of the invention at issue. I understand that the following factors must be evaluated to determine whether the claimed subject matter is obvious: (1) the scope and content of the prior art; (2) the difference or differences, if any, between the scope of the claim of the patent under consideration and the scope of the prior art; and (3) the level of ordinary skill in the art at the time the patent was filed. Unlike anticipation, which allows consideration of only one item of prior art, I understand that obviousness may be shown by considering more than one item of prior art. Moreover, I have been informed and I understand that so-called objective indicia of non-obviousness, also known as “secondary considerations,” like the following, are also to be considered when assessing obviousness: (1) commercial success; (2) long-felt but unresolved needs; (3) copying of the invention by others in the field; (4) initial expressions of disbelief by experts in the field; (5) failure of others to solve the problem that the inventor solved; and (6) unexpected results. I also understand that evidence of objective indicia of non-obviousness must be commensurate in scope with the claimed subject matter.
19. As an initial matter, I have been informed that claim terms may be written in means-plus-function format. In this situation, the means-plus-function claim terms cover the corresponding structure identified in the specification for performing the claimed function, and equivalents thereof.
20. I have applied these principles with respect to my analysis set forth below. Also, I have applied the claim constructions set forth by the Board in its Decision on Institution in IPR2014-00646.
IV. LEVEL OF ORDINARY SKILL IN THE ART

21. I have been asked to provide my opinion regarding the “level of ordinary skill” in the art in both June 1995 (which I understand is the month in which the earliest application in the claimed priority chain was filed) and in November 2002, which is the month in which the application leading to the ’057 patent was filed.
22. It is my opinion that, in June 1995, a person of ordinary skill in the art would have had one of the following: (i) a bachelor’s degree in electrical engineering, mechanical engineering, computer engineering, or computer science (or a closely related field) with at least four years of experience working with intelligent vehicles or exterior monitoring vehicle systems; (ii) a master’s degree in electrical engineering, mechanical engineering, computer engineering, or computer science (or a closely related field) with at least two years of experience working with intelligent vehicles or exterior monitoring vehicle systems; or (iii) a Ph.D. in electrical engineering, mechanical engineering, computer engineering, or computer science (or a closely related field).1
23. In my opinion, the level of ordinary skill in the art would have been the same in November 2002 (and at any time between June 1995 and November 2002).
24. In opining on the level of ordinary skill in the art, I have considered the following factors: (i) the education level of the inventor; (ii) the type of problems encountered in the art; (iii) prior art solutions to those problems; (iv) the rapidity with which innovations are made; (v) the sophistication of the technology; and (vi) the education level of active workers in the field.
25. Based on my experience and education, I consider myself to have been a person of at least ordinary skill in the art with respect to the field of technology implicated by the ’057 patent from the time of filing to the present.
V. BACKGROUND OF THE ’057 PATENT

26. The ’057 patent generally describes a system and method for monitoring the interior and exterior of a vehicle and for identifying objects. The ’057 patent describes a number of different types of receivers and transmitters for performing the identification. For example, CCD transducers are mentioned in 39:26-27 as receivers. Transmitters, such as infrared ones, are discussed in 39:26-29. The information from the CCD arrays is processed by computational methodologies, such as a neural computer, with the objective of classifying, locating or identifying external objects. (Exh. 1, 39:44-54.) The output of this step is used to affect a response system of the vehicle. (Id. at 39:54-62.)

_______________
1 Although I have applied this level of ordinary skill in analyzing the obviousness issues, it is my opinion that claims 1-4, 7-10, 31, 41, 56, 59-62, and 64 are, for the reasons set forth below, so clearly obvious that even a person of lesser skill would have found them obvious.
VI. CLAIMS OF THE ’057 PATENT

27. The ’057 patent includes 86 claims. As noted above, I have been asked to consider the patentability of claims 1-4, 7-10, 31, 41, 56, 59-62 and 64. These claims are reproduced below for reference:
1. A monitoring arrangement for monitoring an environment exterior of a vehicle, comprising:

at least one receiver arranged to receive waves from the environment exterior of the vehicle which contain information on any objects in the environment and generate a signal characteristic of the received waves;

and a processor coupled to said at least one receiver and comprising trained pattern recognition means for processing the signal to provide a classification, identification or location of the exterior object, said trained pattern recognition means being structured and arranged to apply a trained pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects to provide the classification, identification or location of the exterior object;

whereby a system in the vehicle is coupled to said processor such that the operation of the system is affected in response to the classification, identification or location of the exterior object.

2. The arrangement of claim 1, wherein said at least one receiver comprises a pair of receivers spaced apart from one another.

3. The arrangement of claim 1, wherein said at least one receiver is arranged to receive infrared waves.

4. The arrangement of claim 1, wherein the monitoring arrangement further comprises a transmitter for transmitting waves into the environment exterior of the vehicle whereby said at least one receiver is arranged to receive waves transmitted by said transmitter and reflected by any exterior objects.

7. The arrangement of claim 1, wherein the another system is a display viewable by the driver and arranged to show an image or icon of the exterior object.

8. The arrangement of claim 1, wherein said at least one receiver is a CCD array.

9. The arrangement of claim 1, further comprising measurement means for measuring a distance between the exterior object and the vehicle.

10. The arrangement of claim 9, wherein said measurement means comprise a radar or laser radar system.

30. A vehicle including a monitoring arrangement for monitoring an environment exterior of the vehicle, the monitoring arrangement comprising:

at least one receiver arranged on a rear view mirror of the vehicle to receive waves from the environment exterior of the vehicle which contain information on any objects in the environment and generate a signal characteristic of the received waves; and a processor coupled to said at least one receiver and arranged to classify or identify the exterior object based on the signal and thereby provide the classification or identification of the exterior object;

whereby a system in the vehicle is coupled to said processor such that the operation of the system is affected in response to the classification or identification of the exterior object.

31. The vehicle of claim 30, wherein said processor comprises trained pattern recognition means for processing the signal to provide the classification or identification of the exterior object, said trained pattern recognition means being structured and arranged to apply a pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects.

40. A monitoring arrangement for monitoring an environment exterior of a vehicle, comprising:

a plurality of receivers arranged apart from one another and to receive waves from different parts of the environment exterior of the vehicle which contain information on any objects in the environment and generate a signal characteristic of the received waves;

and a processor coupled to said receivers and arranged to classify, identify or locate the exterior object based on the signals generated by said receivers and thereby provide the classification identification or location of the exterior object,

whereby a system in the vehicle is coupled to said processor such that the operation of the system is affected in response to the classification, identification or location of the exterior object.

41. The arrangement of claim 40, wherein said processor comprises trained pattern recognition means for processing the signal to provide the classification, identification or location of the exterior object, said trained pattern recognition means being structured and arranged to apply a trained pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects.

56. A vehicle including a monitoring arrangement for monitoring an environment exterior of the vehicle, the monitoring arrangement comprising:

at least one receiver arranged to receive waves from the environment exterior of the vehicle which contain information on any objects in the environment and generate a signal characteristic of the received waves;

and a processor coupled to said at least one receiver and comprising trained pattern recognition means for processing the signal to provide a classification, identification or location of the exterior object, said trained pattern recognition means being structured and arranged to apply a trained pattern recognition algorithm generated from data of possible exterior objects and patterns of received waves from the possible exterior objects to provide the classification, identification or location of the exterior object;

whereby a system in the vehicle is coupled to said processor such that the operation of the system is affected in response to the classification, identification or location of the exterior object.

59. The vehicle of claim 56, wherein the monitoring arrangement further comprises a transmitter for transmitting waves into the environment exterior of the vehicle whereby said at least one receiver is arranged to receive waves transmitted by said transmitter and reflected by any exterior objects.

60. The vehicle of claim 56, wherein said at least one receiver is arranged to receive waves from a blind spot of the vehicle.

61. The vehicle of claim 56, wherein the another system is a display viewable by the driver and arranged to show an image or icon of the exterior object.

62. The vehicle of claim 56, wherein said at least one receiver is mounted on a rear view mirror or in a rear window.

64. The vehicle of claim 56, further comprising measurement means for measuring a distance between the exterior object and the vehicle.
VII. BACKGROUND ON THE STATE OF THE ART

28. The following is a brief exemplary discussion of the state of the art prior to June 1995.

29. During the last forty years, there has been a growing interest in intelligent vehicles (IV) and intelligent transportation systems (ITS). With emphasis on improved safety and improved system efficiency, a large number of applications have affected our everyday lives.
30. The Defense Advanced Research Projects Agency (DARPA) funded several programs throughout the US in the 1980s with the objective of creating autonomous vehicles (the effort was part of DARPA’s Strategic Computing Initiative, and the project was named the Autonomous Land Vehicle (ALV)). Furthermore, the Image Understanding effort focused initially on cameras (sometimes in stereo pairs) to provide situation awareness for the computational logic that drives a vehicle.
31. Groups at Carnegie Mellon University, the University of Maryland, and the University of Massachusetts-Amherst worked on different aspects of the same problem: developing intelligent vehicles. Meetings like the DARPA Image Understanding Workshops and organizations like the Intelligent Transportation Society of America provided immediate dissemination of knowledge to various stakeholders.
32. Other groups in Europe (e.g., Germany) focused throughout the late 1980’s and early 1990’s on the use of computer vision to drive a vehicle autonomously at high speeds. In this case, the emphasis was on the use of estimation and control techniques that would drive the vehicle based on stereo vision information. Their methods were similar to trained pattern recognition, with the ability to monitor the lane markers of the roadway.
33. Vehicle manufacturers in Japan, such as Toyota and Nissan, and in Europe, such as Renault and Volkswagen, also built sensory systems to fit a wide range of vehicles from compact cars to trucks.

34. Throughout all of these applications, various combinations of sensors, including transmitters and detectors, were used. The sensors included radar, laser radar, infrared emitters and detectors, as well as television cameras and CCD arrays. All of these systems functioned to receive and measure electromagnetic waves in order to detect objects in a vehicle’s environment.
35. Additionally, extensive research was performed on the application of neural networks to object detection and vehicle control. This research was published in a number of different patents and articles, including, for example:

1) Kornhauser, A., “Neural Network Approaches for Lateral Control of Autonomous Highway Vehicles”, Proceedings of the Vehicle Navigation and Information Systems Conference, 1991, pp. 1143-1151.

2) Plumer, E., “Neural Network Structure for Navigation Using Potential Fields”, Proceedings of the International Joint Conference on Neural Networks, 1992, pp. 327-332.

3) Kraiss, K., and Kuttelwesch, H., “Teaching Neural Networks to Guide a Vehicle Through an Obstacle Course by Emulating a Human Teacher”, Proceedings of the International Joint Conference on Neural Networks, 1990, pp. 333-337.

4) Ciaccia, P., Maio, D., and Rizzi, S., “Integrating Knowledge-based Systems and Neural Networks for Navigational Tasks”, Proceedings of the 5th Annual European Computer Conference (CompEuro ‘91), 1991, pp. 652-656.

5) Neuber, S., Nijhuis, J., and Spaanenburg, L., “Developments in Autonomous Vehicle Navigation”, Proceedings of CompEuro ’92, 1992, pp. 453-458.

6) Luo, R., Potlapalli, H., and Hislop, D., “Outdoor Landmark Recognition Using Fractal Based Vision and Neural Networks”, Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, Yokohama, Japan, 1993, pp. 612-618.

7) U.S. Patent No. 6,553,130 to Lemelson, “Motor Vehicle Warning and Control System and Method”, Publication date April 22, 2003, Priority date August 11, 1993.

8) Pomerleau, D., “Neural Network Perception for Mobile Robot Guidance”, Ph.D. Thesis, Carnegie Mellon University, CMU-CS-92-115, February 16, 1992. (“Pomerleau”).

9) Pomerleau, Dean, “ALVINN: An Autonomous Land Vehicle in a Neural Network,” Technical Report AIP-77, Carnegie Mellon University, March 13, 1990. (“1990 Pomerleau”).

10) Arain et al., “Action Planning for the Collision Avoidance System Using Neural Networks,” Proceedings of the Intelligent Vehicles 1993 Symposium, 1993.

11) Catala, et al., “A Neural Network Texture Segmentation System for Open Road Vehicle Guidance,” Proceedings of the Intelligent Vehicles 1992 Symposium, 1992.

12) Goerick, et al., “Local Orientation Coding and Neural Network Classifiers with an Application to Real Time Car Detection and Tracking,” Mustererkennung 1994, Proceedings of the 16th Symposium of the DAGM and the 18th Workshop of the OAGM, Springer-Verlag, 1994.

13) U.S. Patent No. 5,541,590 to Nishio, “Vehicle Crash Predictive and Evasive Operation System by Neural Networks,” Publication date July 30, 1996, Priority date August 4, 1992.
36. Monitoring the exterior environment for object recognition is one of the applications of the aforementioned intelligent vehicles, including those vehicles that had utilized neural networks. Exterior monitoring in particular had been the subject of extensive research in the late 1980’s and the early 1990’s. Many research groups, including those mentioned in the prior art listed above, had implemented systems to analyze a vehicle scene by using various techniques that ranged from model-based computer vision to neural networks. (See e.g., Kornhauser, “Neural Network Approaches for Lateral Control of Autonomous Highway Vehicles,” Proceedings of the Vehicle Navigation and Information Systems Conference, pp. 1143-1151, 1991 (“1991 Kornhauser”); 1990 Pomerleau; Dickmanns, et al., “An All-Transputer Visual Autobahn-Autocopilot/Copilot,” Proceedings of the Fourth International Conference on Computer Vision, pp. 608-615, 1993 (“1993 Dickmanns”); Ciaccia et al., “Integrating Knowledge-Based Systems and Neural Networks for Navigational Tasks,” Proceedings of the 5th Annual European Computer Conference (CompEuro ‘91), pp. 652-656, 1991 (“1991 Ciaccia”); Pomerleau; Neuber, et al., “Developments in Autonomous Vehicle Navigation,” Proceedings of CompEuro ’92, pp. 453-458, 1992 (“1992 Neuber”); and Luo, et al., “Outdoor Landmark Recognition Using Fractal Based Vision and Neural Networks,” Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, Yokohama, Japan, July 26-30, 1993 (“1993 Luo”).)
37. For example, researchers used information about an object’s appearance when perceived through an imaging apparatus, such as white lane markers and traffic signs as perceived through video cameras, to facilitate object recognition or detection. (See e.g., Dickmanns, et al., “An Integrated Spatio-Temporal Approach to Automatic Visual Guidance of Autonomous Vehicles,” IEEE Transactions on Systems, Man and Cybernetics, Vol. 20, No. 6, pp. 1273-1284, 1990 (“1990 Dickmanns”); 1993 Dickmanns; 1993 Luo.)
38. Furthermore, some utilized traditional numerical methods to analyze and measure every element in a scene so as to create very accurate representations of the exterior environment. Groups in Germany, for example, used advanced estimation techniques to measure the road and vehicle parameters and perform obstacle avoidance at high speeds. (See e.g., Graefe, et al., “Towards a Vision Based Robot with a Driver’s License,” 1988 IEEE International Workshop on Intelligent Robots (IROS 88), pp. 627-632, 1988 (“1988 Graefe”); 1990 Dickmanns; 1993 Dickmanns.)
39. As computers improved from the late 1980’s to the early 1990’s, neural networks were viewed as a viable alternative. In particular, neural network training became more manageable, and many groups in the United States and Europe utilized artificial neural networks for performing obstacle avoidance and autonomous navigation. (See, e.g., 1990 Pomerleau; Pomerleau.)
40. Neural network methodologies, such as back-propagation, provided ways to quickly adapt to the rapidly evolving scenes that vehicles would encounter. This training information was captured as “weights” that were assigned to various structures and components within the often-hidden layers of the neural networks. (See e.g., 1991 Kornhauser; 1990 Pomerleau; 1992 Neuber; Pomerleau.) The sensory information, such as images acquired from video cameras, infrared cameras, and laser radar, was fed into artificial neural networks, and the internal network layers would provide outputs to drive the vehicle by controlling vehicle systems, including steering, as was the case with the NAVLAB vehicle. (See e.g., 1990 Pomerleau; Pomerleau.)
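For illustration only, the training mechanics described above (weights adjusted by back-propagation in a network whose hidden layer maps sensory features to a control output) can be sketched in a toy example. This sketch is mine and is not drawn from any reference in the record; the feature encoding, data, and steering convention are all hypothetical.

```python
# Purely illustrative sketch (hypothetical features and data, not from the record):
# a toy feed-forward network with one hidden layer, trained by back-propagation,
# mapping a few coarse "image features" to a steering output (0 = left, 1 = right).
# The training information ends up captured in the weights, as described above.
import math
import random

random.seed(0)

N_IN, N_HID = 4, 3  # hypothetical: 4 coarse image features, 3 hidden units

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Randomly initialized weights: input->hidden and hidden->output.
w_ih = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
w_ho = [random.uniform(-1, 1) for _ in range(N_HID)]

def forward(x):
    """Propagate a feature vector through the network; return output and hidden activations."""
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_ih]
    out = sigmoid(sum(w * h for w, h in zip(w_ho, hidden)))
    return out, hidden

def train(samples, lr=0.5, epochs=2000):
    """Back-propagation: nudge the weights to reduce the output error."""
    for _ in range(epochs):
        for x, target in samples:
            out, hidden = forward(x)
            delta_out = (target - out) * out * (1.0 - out)
            for j in range(N_HID):
                # Hidden-layer delta is computed with the old hidden->output weight.
                delta_h = delta_out * w_ho[j] * hidden[j] * (1.0 - hidden[j])
                w_ho[j] += lr * delta_out * hidden[j]
                for i in range(N_IN):
                    w_ih[j][i] += lr * delta_h * x[i]

# Hypothetical training data: obstacle on the left -> steer right, and vice versa.
# Systems like ALVINN trained on actual camera images rather than hand-made features.
data = [([1, 0, 0, 0], 1.0), ([0, 0, 0, 1], 0.0)]
train(data)
steer, _ = forward([1, 0, 0, 0])  # after training, close to 1.0 (steer right)
```

After training, the learned behavior resides entirely in `w_ih` and `w_ho`, which is the sense in which the paragraph above says the training information is "captured as weights."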
VIII. ANALYSIS

A. Scope and Content of the Prior Art

41. The scope and content of the prior art as of June 1995 would have broadly included vehicle sensing systems as well as computer vision and object identification (regardless of whether specifically applied in automobiles or otherwise).

42. In my opinion, the references discussed below would all have been considered to be within the same technical field as the subject matter of the ’057 patent. Furthermore, all of these references would have been considered highly relevant prior art to claims 1-4, 7-10, 31, 41, 56, 59-62 and 64 of the ’057 patent.
43. My opinion is the same with respect to the scope and content of the prior art as of November 2002, and any time between June 1995 and November 2002.
B. List of Prior Art References Discussed in Analysis

44. In my analysis, I discuss the following references, which I introduce here to provide abbreviations.

1) U.S. Patent No. 6,553,130 (“Lemelson,” Exhibit 1102) issued from U.S. Appl. No. 08/671,853 (“’853 app.”), filed on June 28, 1996. The ’853 application is a continuation of U.S. App. No. 08/105,304 (“’304 app.,” Exhibit 1103), which was filed on Aug. 11, 1993. I have been asked to review the ’304 app. to see whether it contains the same or materially the same disclosure that I rely upon from Lemelson. As set forth below, I believe that it does, and have included parallel citations to the specification in that application.

2) European Patent Application No. 93112302 (Publication No. 0582236A1) (Exhibit 1104), which published on Feb. 9, 1994.

3) U.S. Patent No. 5,245,422 to Borcherts (“Borcherts,” Exhibit 1105), which issued on Sept. 14, 1993.
C. Claims 1-4, 7-10, 41, 56, 59-61, and 64 are Obvious Under 35 U.S.C. § 103 Over Lemelson

45. It is my opinion that Lemelson renders obvious all the limitations described in claims 1-4, 7-10, 41, 56, 59-61, and 64.
46. As detailed below, Lemelson describes a system in a vehicle that uses sensors to detect possible obstacles on the roadway. The sensors include CCD cameras that can be placed in a stereovision configuration and/or on different locations around the vehicle. Lemelson also discloses radar/lidar for measuring distances to exterior objects. The lidar (also called laser radar) emits laser light and analyzes the reflected radiation (thus the term lidar comes from the combination of the words light and radar). Then a processor that uses a pattern recognition methodology (e.g., a neural network) analyzes the images obtained by the cameras to determine the identity of obstacles and affect numerous vehicle subsystems. The system of Lemelson can then utilize inputs from all the sensors as well as the state of the vehicle in order to automatically affect a vehicle operation such as braking, steering, etc. For example, Lemelson teaches that symbols can be displayed representing the hazard objects. (Lemelson, Ex. 1102, 6:43-55; ’304 app., Ex. 1103, p. 14-15.) Additionally, the image analyzing computer provides codes to a decision computer, which “integrates the inputs from image analysis computer 19, range computer 21, digital accelerometer 45, and the radar or lidar computer 14 to generate output warning and control signals.” (Lemelson, Ex. 1102, 8:31-34; ’304 app., Ex. 1103, p. 17.) Lemelson further teaches that “[w]arning signals alert the driver of impending hazards and, depending on the situation, actual vehicle control signals may be gene