UNITED STATES PATENT AND TRADEMARK OFFICE
__________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
__________________

TOYOTA MOTOR CORPORATION
Petitioner

Patent No. 5,845,000
Issue Date: December 1, 1998
Title: VEHICULAR MONITORING SYSTEMS
__________________

DECLARATION OF NIKOLAOS PAPANIKOLOPOULOS, PH.D.

Case No. IPR2013-00424
__________________

IPR2013-00424 - Ex. 1013
Toyota Motor Corp., Petitioner

I, Nikolaos Papanikolopoulos, Ph.D., hereby declare and state as follows:

I. BACKGROUND

1. I am currently employed by the University of Minnesota as a Distinguished McKnight University Professor of Computer Science and Engineering. I have been a professor at the University of Minnesota (originally as an assistant professor, and then as an associate professor) since the Fall of 1992. Between Fall 2001 and Spring 2004, and between Fall 2010 and Spring 2013, I was the Director of Undergraduate Studies of the College of Science and Engineering.

2. In 1992, I received my Ph.D. in Electrical and Computer Engineering from Carnegie Mellon University. My thesis was entitled “Controlled Active Vision” and focused on using computer vision in a controlled fashion to monitor and manipulate objects in the environment. In 1988, I also received my M.S. in Electrical and Computer Engineering from Carnegie Mellon University. I received my B.S. in Electrical Engineering in 1987 from the National Technical University in Athens, Greece.

3. Over the last nineteen years, my research and teaching work has focused on computer vision, intelligent transportation systems, and robotics. This research has included autonomous vehicles and object detection and recognition, including work with artificial intelligence and pattern recognition systems.

4. My research in the early 1990’s focused on solving sensor deployment problems, including using sensory systems and algorithms to monitor the exterior and interior spaces of vehicles. Our efforts ranged from monitoring for pedestrians at crosswalks to performing real-time vehicle following. In particular, we developed a system (using a CCD camera) that could track humans as articulated bodies. We also created a system that detected the license plate of a vehicle ahead and then allowed the vehicle on which the camera was mounted to keep a constant distance from the leading vehicle. A screenshot of the pertinent system display is shown in Figure 1.

Figure 1

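As a purely illustrative aside on the kind of control loop such a vehicle-following system closes, the short Python sketch below holds a roughly constant following distance by comparing the apparent width of the detected license plate to a reference width and adjusting speed proportionally. The function name, the gain, and the reference width are hypothetical placeholders, not a reconstruction of the system shown in Figure 1.

    # Illustrative distance-keeping loop: the apparent width (in pixels) of the lead
    # vehicle's license plate shrinks with distance, so holding that width near a
    # reference value holds the following distance roughly constant.
    # The gain and reference width below are hypothetical placeholders.

    def speed_command(current_speed: float,
                      plate_width_px: float,
                      ref_width_px: float = 40.0,
                      gain: float = 0.05) -> float:
        """Proportional speed adjustment from the plate's apparent width."""
        # Plate wider than the reference -> too close -> slow down; narrower -> speed up.
        error = ref_width_px - plate_width_px
        return max(0.0, current_speed + gain * error)
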
5. I currently teach three courses relating to intelligent systems: (i) CSci 5561 Computer Vision, (ii) CSci 5511 Artificial Intelligence, and (iii) CSci 5551 Introduction to Intelligent Robotic Systems.

6. My research has produced more than 320 journal and conference publications. More than 70 publications are in refereed journals. Many of my publications relate to intelligent systems (including intelligent vehicles). Some examples include:

Somasundaram, G., Sivalingam, R., Morellas, V., and Papanikolopoulos, N.P., “Classification and Counting of Composite Objects in Traffic Scenes Using Global and Local Image Analysis”, IEEE Trans. on Intelligent Transportation Systems, Volume 14, No. 1, March 2013, pp. 69-81.

Atev, S., Miller, G., and Papanikolopoulos, N.P., “Clustering of Vehicle Trajectories”, IEEE Trans. on Intelligent Transportation Systems, Volume 11, No. 3, September 2010, pp. 647-657.

Atev, S., Arumugam, H., Masoud, O., Janardan, R., and Papanikolopoulos, N.P., “A Vision-Based Approach to Collision Prediction at Traffic Intersections”, IEEE Trans. on Intelligent Transportation Systems, Volume 6, No. 4, December 2005, pp. 416-423.

Masoud, O., and Papanikolopoulos, N.P., “A Novel Method for Tracking and Counting Pedestrians in Real-time Using a Single Camera”, IEEE Trans. on Vehicular Technology, Volume 50, No. 5, September 2001, pp. 1267-1278.

Du, Y., and Papanikolopoulos, N.P., “Real-time Vehicle Following Through a Novel Symmetry-Based Approach”, Proceedings of the 1997 IEEE Int. Conf. on Robotics and Automation, pp. 3160-3165, Albuquerque, NM, April 20-25, 1997.

7. As a result of my work and research, I am familiar with the design, control, operation and functionality of exterior monitoring systems for vehicles, including those employed on hybrid vehicles.

8. A copy of my curriculum vitae is attached herewith.

II. ASSIGNMENT AND COMPENSATION

9. I submit this declaration in support of the Petition for Inter Partes Review of U.S. Patent No. 5,845,000 (“the ’000 patent”) filed by Toyota Motor Corporation (“Toyota”).

10. I am not an employee of Toyota or any affiliate or subsidiary thereof.

11. I am being compensated for my time at a rate of $500 per hour. My compensation is in no way dependent upon the substance of the opinions I offer below, or upon the outcome of Toyota’s petition for inter partes review (or the outcome of such an inter partes review, if a trial is initiated).

12. I have been asked to provide certain opinions relating to the ’000 patent. Specifically, I have been asked to provide my opinion regarding (i) the level of ordinary skill in the art to which the ’000 patent pertains, and (ii) the patentability of claims 10, 11, 16, 17, 19, 20, and 23 of the ’000 patent.

13. The opinions expressed in this declaration are not exhaustive of my opinions on the patentability of any of the claims in the ’000 patent. Therefore, the fact that I do not address a particular point should not be understood to indicate any agreement on my part that any claim otherwise complies with the patentability requirements.

14. The opinions expressed in this declaration are my personal opinions and do not reflect the views of the University of Minnesota.

III. LEGAL STANDARDS

15. I have been informed and I understand that a patentability analysis is performed from the viewpoint of a hypothetical person of ordinary skill in the art. I understand that “the person of ordinary skill” is a hypothetical person who is presumed to be aware of the universe of available prior art as of the time of the invention at issue.

16. I understand that a patent claim is unpatentable as anticipated when a single piece of prior art describes every element of the claimed invention, either expressly or inherently, and arranged in the same way as in the claim. For inherent anticipation to be found, it is required that the missing descriptive material is necessarily present in the prior art. I understand that, for the purpose of an inter partes review, prior art that anticipates a claim can include both patents and printed publications from anywhere in the world.

17. I understand that some claims are written in dependent form, in which case they incorporate all of the limitations of the claim(s) on which they depend. I have further been informed that material not explicitly contained in a single prior art document may still be considered for purposes of anticipation if that material is incorporated by reference into the document. The material must be incorporated in such a manner as to make clear that it is effectively part of the host document as if it were explicitly contained therein.

18. I understand that a patent claim is unpatentable as obvious if the subject matter of the claim as a whole would have been obvious to a person of ordinary skill in the art as of the time of the invention at issue. I understand that the following factors must be evaluated to determine whether the claimed subject matter is obvious: (1) the scope and content of the prior art; (2) the difference or differences, if any, between the scope of the claim of the patent under consideration and the scope of the prior art; and (3) the level of ordinary skill in the art at the time the patent was filed. Unlike anticipation, which allows consideration of only one item of prior art, I understand that obviousness may be shown by considering more than one item of prior art. Moreover, I have been informed and I understand that so-called objective indicia of non-obviousness, also known as “secondary considerations,” like the following are also to be considered when assessing obviousness: (1) commercial success; (2) long-felt but unresolved needs; (3) copying of the invention by others in the field; (4) initial expressions of disbelief by experts in the field; (5) failure of others to solve the problem that the inventor solved; and (6) unexpected results. I also understand that evidence of objective indicia of non-obviousness must be commensurate in scope with the claimed subject matter.

19. As an initial matter, I have been informed that claim terms may be written in means-plus-function format. In this situation, the means-plus-function claim terms cover the corresponding structure identified in the specification for performing the claimed function, and equivalents thereof.

20. I have applied these principles with respect to my analysis set forth below.

IV. BACKGROUND OF THE ’000 PATENT

21. The ’000 patent generally describes a system and method for monitoring the interior and exterior of a vehicle and for identifying objects. The ’000 patent describes a number of different types of receivers and transmitters for performing the identification. For example, CCD arrays are mentioned in 7:33-35 as receivers. Transmitters, like infrared ones, are discussed in 7:30-31. The information from the CCD arrays is processed by computational methodologies (“trained pattern recognition technologies”), such as a neural computer, with the objective of classifying and identifying external objects. The output of this step is used to affect a response system of the vehicle.

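To illustrate the kind of processing pipeline described in paragraph 21, the following is a minimal, hypothetical Python sketch: received illumination from a CCD array is reduced to a feature vector, a trained classifier assigns the signal to an object class, and the resulting identification is used to affect another vehicle system. The class and function names (TrainedPatternRecognizer, extract_features, monitoring_step) and the simple linear scoring are illustrative assumptions, not the implementation disclosed in the ’000 patent.

    # Hypothetical sketch of the pipeline described above: receive illumination ->
    # process into an electronic signal -> categorize with a trained pattern
    # recognizer -> affect another vehicle system. Names and details are assumptions.

    import numpy as np


    class TrainedPatternRecognizer:
        """Stand-in for a trained pattern recognition means (e.g., a neural computer).

        The weights would have been generated offline from data of possible exterior
        objects and the patterns of illumination they return.
        """

        def __init__(self, weights: np.ndarray, classes: list):
            self.weights = weights          # shape: (number of classes, number of features)
            self.classes = classes

        def identify(self, features: np.ndarray) -> str:
            scores = self.weights @ features    # simple linear scoring, for illustration only
            return self.classes[int(np.argmax(scores))]


    def extract_features(frame: np.ndarray) -> np.ndarray:
        """Reduce a received-illumination frame to a small feature vector."""
        h, w = frame.shape
        # Coarse 8x8 block averages of intensity (assumes h and w are divisible by 8);
        # a real system would use richer features.
        blocks = frame.reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3))
        return blocks.flatten() / 255.0


    def monitoring_step(frame: np.ndarray, recognizer: TrainedPatternRecognizer) -> str:
        """One pass: process the received illumination, identify the object, affect a system."""
        identification = recognizer.identify(extract_features(frame))
        if identification == "pedestrian":
            pass  # e.g., trigger a warning display or braking controller (output means)
        return identification
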
V. CLAIMS OF THE ’000 PATENT

22. The ’000 patent includes 25 claims. As noted above, I have been asked to consider the patentability of claims 10, 11, 16, 17, 19, 20, and 23. These claims are reproduced below for reference:

10. In a motor vehicle having an interior and an exterior, a monitoring system for monitoring at least one object exterior to said vehicle comprising:

a) transmitter means for transmitting electromagnetic waves to illuminate the at least one exterior object;

b) reception means for receiving reflected electromagnetic illumination from the at least one exterior object;

c) processor means coupled to said reception means for processing said received illumination and creating an electronic signal characteristic of said exterior object based thereon;

d) categorization means coupled to said processor means for categorizing said electronic signal to identify said exterior object, said categorization means comprising trained pattern recognition means for processing said electronic signal based on said received illumination from said exterior object to provide an identification of said exterior object based thereon, said pattern recognition means being structured and arranged to apply a pattern recognition algorithm generated from data of possible exterior objects and patterns of received electromagnetic illumination from the possible exterior objects; and

e) output means coupled to said categorization means for affecting another system in the vehicle in response to the identification of said exterior object.

11. The system in accordance with claim 10, further comprising measurement means for measuring the distance from the at least one exterior object to said vehicle, said measurement means comprising radiation.

16. In a motor vehicle having an interior and an exterior, an automatic headlight dimming system comprising:

a) reception means for receiving electromagnetic radiation from the exterior of the vehicle;

b) processor means coupled to said reception means for processing the received radiation and creating an electronic signal characteristic of the received radiation;

c) categorization means coupled to said processor means for categorizing said electronic signal to identify a source of the radiation, said categorization means comprising trained pattern recognition means for processing said electronic signal based on said received radiation to provide an identification of the source of the radiation based thereon, said pattern recognition means being structured and arranged to apply a pattern recognition algorithm generated from data of possible sources of radiation including lights of vehicles and patterns of received radiation from the possible sources; and

d) output means coupled to said categorization means for dimming the headlights in said vehicle in response to the identification of the source of the radiation.

17. The invention in accordance with claim 16 wherein said categories further comprise radiation from taillights of a vehicle-in-front.

19. The system of claim 10, wherein said reception means comprise a CCD array.

20. The invention in accordance with claim 16, wherein said reception means comprise a CCD array.

23. A method for affecting a system in a vehicle based on an object exterior of the vehicle, comprising the steps of:

a) transmitting electromagnetic waves to illuminate the exterior object;

b) receiving reflected electromagnetic illumination from the object on an array;

c) processing the received illumination and creating an electronic signal characteristic of the exterior object based thereon;

d) processing the electronic signal based on the received illumination from the exterior object to identify the exterior object, said processing step comprising the steps of generating a pattern recognition algorithm from data of possible exterior objects and patterns of received electromagnetic illumination from the possible exterior objects, storing the algorithm within a pattern recognition system and applying the pattern recognition algorithm using the electronic signal as input to obtain the identification of the exterior object; and

e) affecting the system in the vehicle in response to the identification of the exterior object.

VI. CLAIM CONSTRUCTION

23. I have not performed my own independent claim construction analysis. Rather, I have been asked to apply the following claim constructions in analyzing the patentability of the identified claims. As noted above, I have been informed that claim terms may be written in means-plus-function format. In this situation, the means-plus-function claim terms cover the corresponding structure identified in the specification for performing the claimed function and equivalents thereof.

“pattern recognition algorithm” (claims 10, 16)

I have been informed that “pattern recognition” means “a system that determines whether or not an object is a member of but a single particular class.” A neural network, fuzzy logic and sensor fusion are types of pattern recognition systems.

“trained pattern recognition means” (claims 10, 16)

I have been informed that this claim limitation is written in means-plus-function format.

I have been informed that the corresponding structure includes a neural computer, a processor and equivalents thereof. The required function performed by this structure is stated in the various claims and carries its plain and ordinary meaning except with respect to the terms “identify” and “identification” set forth below.

“identify” / “identification” (claims 10, 16, 23)

I have been informed that the specification defines “identify” as follows: “to determine that the object belongs to a particular set or class. The class may be one containing, for example, all rear facing child seats, one containing all human occupants, or all human occupants not sitting in a rear facing child seat depending on the purpose of the system. In the case where a particular person is to be recognized, the set or class will contain only a single element, i.e., the person to be recognized.”

“transmitter means” (claim 10)

I have been informed that this claim limitation is written in means-plus-function format.

I have been informed that the corresponding structure includes an infrared transmitter, radar, laser radar, and equivalents thereof. The required function performed by this structure is stated in the various claims and carries its plain and ordinary meaning.

“reception means” (claims 10, 16)

I have been informed that this claim limitation is written in means-plus-function format.

I have been informed that the corresponding structure includes an infrared receiver, radar, laser radar, CCD transducers, TV cameras, and equivalents thereof. The required function performed by this structure is stated in the various claims and carries its plain and ordinary meaning.

“processor means” (claims 10, 16)

I have been informed that this claim limitation is written in means-plus-function format.

I have been informed that the corresponding structure includes electronic modules, circuitry, neural computers, application specific integrated circuits, and CPUs and equivalents thereof. The required function performed by this structure is stated in the various claims and carries its plain and ordinary meaning.

“categorization means” (claims 10, 16)

I have been informed that this claim limitation is written in means-plus-function format.

I have been informed that the corresponding structure includes “trained pattern recognition means” as construed above, microprocessors, and neural computers. The required function performed by this structure is stated in the various claims and carries its plain and ordinary meaning.

“output means” (claims 10, 16)

I have been informed that this claim limitation is written in means-plus-function format.

I have been informed that in the context of claim 10, the corresponding structure includes warning systems, displays, braking controllers and seatbelt retraction devices as well as equivalents thereof. With respect to claim 10, the required function performed by this structure is stated in claim 10 and carries its plain and ordinary meaning.

I have further been informed that in the context of claim 16, the corresponding structure includes any part of a pattern recognition system including a processor as well as a sensing ECU, other controller, or equivalents thereof. The required function performed by this structure is stated in claim 16 and carries its plain and ordinary meaning.

“measurement means” (claim 11)

I have been informed that this claim limitation is written in means-plus-function format.

I have been informed that the corresponding structure includes radar and equivalents thereof. The required function performed by this structure is stated in claim 11 and carries its plain and ordinary meaning.

“dimming the headlights” (claim 16)

I have been informed that this term includes any reduction of headlight intensity, such as complete elimination of headlight output.

“wherein said categories further comprise radiation from taillights of a vehicle-in-front” (claim 17)

I have been informed that this claim is met if any category created by the “categorization means” includes taillight radiation, and that it is immaterial whether the category includes taillight radiation alone, or taillight radiation plus other types of radiation such as headlight radiation.

24. With respect to the other terms in the ’000 patent, I have applied the plain and ordinary meaning of those claim terms when comparing the claims to the prior art.

VII. BACKGROUND ON THE STATE OF THE ART

25. The following is a brief exemplary discussion of the state of the art prior to May 1994.

26. During the last forty years, there has been a growing interest in intelligent vehicles (IV) and intelligent transportation systems (ITS). With emphasis on improved safety and improved system efficiency, a large number of applications have affected our everyday lives.

27. The Defense Advanced Research Projects Agency (DARPA) funded several programs throughout the US in the 1980s with the objective of creating autonomous vehicles (the Computing Initiative; the project was named the Autonomous Land Vehicle (ALV)). Furthermore, the Image Understanding effort focused initially on cameras (sometimes in stereo pairs) to provide situation awareness for the computational logic that drives a vehicle.

28. Groups at Carnegie Mellon University, the University of Maryland, and the University of Massachusetts-Amherst worked on different aspects of the same problem: developing intelligent vehicles. Meetings like the DARPA Image Understanding Workshops and organizations like the Intelligent Transportation Society of America provided immediate dissemination of knowledge to various stakeholders.

29. Other groups in Europe (e.g., Germany) focused throughout the late 1980’s and early 1990’s on the use of computer vision to drive a vehicle autonomously at high speeds. In this case, the emphasis was on the use of estimation and control techniques that would drive the vehicle based on stereo vision information. Their methods were similar to trained pattern recognition, with the ability to monitor the lane markers of the roadway.

30. Vehicle manufacturers in Japan, such as Toyota and Nissan, and in Europe, such as Renault and Volkswagen, also built sensory systems to fit a wide range of vehicles from compact cars to trucks.

31. Throughout all of these applications, various combinations of sensors, including transmitters and detectors, were used. The sensors included radar, laser radar, infrared emitters and detectors, as well as television cameras and CCD arrays. All of these systems functioned to receive and measure electromagnetic waves in order to detect objects in a vehicle’s environment.

32. Additionally, extensive research was performed with respect to the application of neural networks to object detection and control. This research was published in a number of different patents and articles, including, for example:

1) Kornhauser, A., “Neural Network Approaches for Lateral Control of Autonomous Highway Vehicles”, Proceedings of the Vehicle Navigation and Information Systems Conference, 1991, pp. 1143-1151.

2) Plumer, E., “Neural Network Structure for Navigation Using Potential Fields”, Proceedings of the International Joint Conference on Neural Networks, 1992, pp. 327-332.

3) Kraiss, K., and Kuttelwesch, H., “Teaching Neural Networks to Guide a Vehicle Through an Obstacle Course by Emulating a Human Teacher”, Proceedings of the International Joint Conference on Neural Networks, 1990, pp. 333-337.

4) Ciaccia, P., Maio, D., and Rizzi, S., “Integrating Knowledge-based Systems and Neural Networks for Navigational Tasks”, Proceedings of the 5th Annual European Computer Conference (CompEuro ‘91), 1991, pp. 652-656.

5) Neuber, S., Nijhuis, J., and Spaanenburg, L., “Developments in Autonomous Vehicle Navigation”, Proceedings of CompEuro ’92, 1992, pp. 453-458.

6) Luo, R., Potlapalli, H., and Hislop, D., “Outdoor Landmark Recognition Using Fractal Based Vision and Neural Networks”, Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, Yokohama, Japan, 1993, pp. 612-618.

7) U.S. Patent No. 6,553,130 to Lemelson, “Motor Vehicle Warning and Control System and Method”, Publication date April 22, 2003, Priority date August 11, 1993.

8) Pomerleau, D., “Neural Network Perception for Mobile Robot Guidance”, Ph.D. Thesis, Carnegie Mellon University, CMU-CS-92-115, February 16, 1992. (“Pomerleau,” Exhibit 1005).

9) Pomerleau, Dean, “ALVINN: An Autonomous Land Vehicle in a Neural Network,” Technical Report AIP-77, Carnegie Mellon University, March 13, 1990. (“1990 Pomerleau”).

10) Arain et al., “Action Planning for the Collision Avoidance System Using Neural Networks,” Proceedings of the Intelligent Vehicles 1993 Symposium, 1993.

11) Catala et al., “A Neural Network Texture Segmentation System for Open Road Vehicle Guidance,” Proceedings of the Intelligent Vehicles 1992 Symposium, 1992.

12) Goerick et al., “Local Orientation Coding and Neural Network Classifiers with an Application to Real Time Car Detection and Tracking,” Mustererkennung 1994, Proceedings of the 16th Symposium of the DAGM and the 18th Workshop of the OAGM, Springer-Verlag, 1994.

13) U.S. Patent No. 5,541,590 to Nishio, “Vehicle Crash Predictive and Evasive Operation System by Neural Networks,” Publication date July 30, 1996, Priority date August 4, 1992.

33. Monitoring the exterior environment for object recognition is one of the applications of the aforementioned intelligent vehicles, including those vehicles that had utilized neural networks. Exterior monitoring in particular had been the subject of extensive research in the late 1980’s and the early 1990’s. Many research groups, including those mentioned in the prior art listed in ¶ 32 above, had implemented systems to analyze a vehicle scene by using various techniques that ranged from model-based computer vision to neural networks (see, e.g., Kornhauser, “Neural Network Approaches for Lateral Control of Autonomous Highway Vehicles,” Proceedings of the Vehicle Navigation and Information Systems Conference, pp. 1143-1151, 1991 (“1991 Kornhauser”); 1990 Pomerleau; Dickmanns, et al., “An All-Transputer Visual Autobahn-Autocopilot/Copilot,” Proceedings of the Fourth International Conference on Computer Vision, pp. 608-615, 1993 (“1993 Dickmanns”); Ciaccia et al., “Integrating Knowledge-Based Systems and Neural Networks for Navigational Tasks,” Proceedings of the 5th Annual European Computer Conference (CompEuro ‘91), pp. 652-656, 1991 (“1991 Ciaccia”); Pomerleau, Ex. 1005; Neuber, et al., “Developments in Autonomous Vehicle Navigation,” Proceedings of CompEuro ’92, pp. 453-458, 1992 (“1992 Neuber”); and Luo, et al., “Outdoor Landmark Recognition Using Fractal Based Vision and Neural Networks,” Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, Yokohama, Japan, July 26-30, 1993 (“1993 Luo”)).

34. For example, researchers used information about an object’s appearance when perceived through an imaging apparatus, such as white lane markers and traffic signs as perceived through video cameras, to facilitate object recognition or detection. (See, e.g., Dickmanns, et al., “An Integrated Spatio-Temporal Approach to Automatic Visual Guidance of Autonomous Vehicles,” IEEE Transactions on Systems, Man and Cybernetics, Vol. 20, No. 6, pp. 1273-1284, 1990 (“1990 Dickmanns”); 1993 Dickmanns; 1993 Luo.)

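As a concrete, purely illustrative example of such appearance-based processing, the short Python sketch below isolates bright, sufficiently large regions (of the sort produced by painted lane markers or retroreflective signs) in a grayscale camera image before any recognizer is applied. The thresholds and the function name are assumptions chosen for illustration and are not taken from the cited work.

    # Illustrative appearance-based candidate detection: bright image regions are
    # isolated by an intensity threshold and a minimum-area check. The threshold
    # values below are assumptions chosen for illustration only.

    import numpy as np
    from scipy import ndimage


    def bright_region_candidates(gray: np.ndarray,
                                 intensity_thresh: float = 200.0,
                                 min_area: int = 50):
        """Return bounding slices of bright regions large enough to be of interest."""
        mask = gray > intensity_thresh          # keep only bright pixels
        labels, _ = ndimage.label(mask)         # group them into connected regions
        boxes = ndimage.find_objects(labels)    # one bounding box (pair of slices) per region
        return [box for box in boxes
                if (box[0].stop - box[0].start) * (box[1].stop - box[1].start) >= min_area]
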
35. Furthermore, some utilized traditional numerical methods to analyze and measure every element in a scene so as to create very accurate representations of the exterior environment. Groups in Germany, for example, used advanced estimation techniques to measure the road and vehicle parameters and perform obstacle avoidance at high speeds. (See, e.g., Graefe, et al., “Towards a Vision Based Robot with a Driver’s License,” 1988 IEEE International Workshop on Intelligent Robots (IROS 88), pp. 627-632, 1988 (“1988 Graefe”); 1990 Dickmanns; 1993 Dickmanns.)

36. As computers improved from the late 1980’s to the early 1990’s, neural networks were viewed as a viable alternative. In particular, neural network training became more manageable and many groups in the United States and Europe utilized artificial neural networks for performing obstacle avoidance and autonomous navigation. (See, e.g., 1990 Pomerleau; Pomerleau, Ex. 1005.)

37. Neural network methodologies, such as back-propagation, provided ways to quickly adapt to the rapidly evolving scenes that vehicles would encounter. This training information was captured as “weights” that were assigned to various structures and components within the often hidden layers of the neural networks. (See, e.g., 1991 Kornhauser; 1990 Pomerleau; 1992 Neuber; Pomerleau, Ex. 1005.) The sensory information, such as images acquired from video cameras, infrared cameras, and laser radar, was fed into artificial neural networks, and the internal network layers would provide outputs to drive the vehicle by controlling vehicle systems, including steering, as was the case with the NAVLAB vehicle. (See, e.g., 1990 Pomerleau; Pomerleau, Ex. 1005.)

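To make the mechanism described in paragraph 37 concrete, the following is a minimal Python sketch of a single-hidden-layer network trained by back-propagation to map a flattened low-resolution camera image to a steering value, in the general spirit of ALVINN-style systems. The layer sizes, learning rate, and variable names are illustrative assumptions and do not reproduce any particular cited system.

    # Minimal back-propagation sketch: one hidden layer maps a flattened low-resolution
    # image to a steering command. Sizes, rates, and names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hidden = 30 * 32, 16                  # e.g., a 30x32 grayscale input retina
    W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))   # input-to-hidden weights
    W2 = rng.normal(0.0, 0.1, (1, n_hidden))      # hidden-to-output weights


    def forward(x):
        h = np.tanh(W1 @ x)                       # hidden-layer activations
        y = np.tanh(W2 @ h)                       # steering command in [-1, 1]
        return h, y


    def train_step(x, target, lr=0.01):
        """One back-propagation update for a single (image, steering) training example."""
        global W1, W2
        h, y = forward(x)
        err = y - target                          # output error
        d_out = err * (1.0 - y ** 2)              # gradient through the output tanh
        d_hid = (W2.T @ d_out) * (1.0 - h ** 2)   # error propagated back to the hidden layer
        W2 -= lr * np.outer(d_out, h)             # the adjusted weights capture the
        W1 -= lr * np.outer(d_hid, x)             # "training information" described above
        return float(err[0] ** 2)                 # squared error, for monitoring progress

Repeated calls to train_step with (image, recorded steering) pairs adjust W1 and W2 until the network's output tracks the demonstrated steering, which is the sense in which the training information is captured as weights in the hidden and output layers.
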
VIII. ANALYSIS

A. Level of Ordinary Skill in the Art

38. I have been asked to provide my opinion regarding the level of ordinary skill in the art in May 1994 (which I understand is the month in which an application to which the ’000 patent claims priority was filed) and June 1995, which is the month in which the application leading to the ’000 patent was filed.1

39. It is my opinion that, in May 1994, a person of ordinary skill in the art would have had one of the following: (i) a bachelor’s degree in electrical engineering, mechanical engineering, computer engineering, or computer science (or a closely related field) with at least four years of experience working with intelligent vehicles or exterior monitoring vehicle systems, (ii) a master’s degree in electrical engineering, mechanical engineering, computer engineering, or computer science (or a closely related field) with at least two years of experience working with intelligent vehicles or exterior monitoring vehicle systems, or (iii) a Ph.D. in electrical engineering, mechanical engineering, computer engineering, or computer science (or a closely related field).2

__________________
1 My opinion on the state of the art would not change even if the effective filing date were in May of 1992, the earliest date to which the ’000 patent claims priority.
2 Although I have applied this level of ordinary skill in analyzing the obviousness issues, it is my opinion that claims 10-11, 16-17, 19-20 and 23 are, for the reasons set forth below, so clearly obvious that even a person of lesser skill would have found them obvious.

40. In my opinion, the level of ordinary skill in the art would have been the same in June 1995 (and at any time between May 1994 and June 1995).

41. In opining on the level of ordinary skill in the art, I have considered the following factors: (i) the education level of the inventor; (ii) the type of problems encountered in the art; (iii) prior art solutions to those problems; (iv) the rapidity with which innovations are made; (v) the sophistication of the technology; and (vi) the education level of active workers in the field.

42. Based on my experience and education, I consider myself to have been a person of at least ordinary skill in the art with respect to the field of technology implicated by the ’000 patent from the time of filing to the present.

B. Scope and Content of the Prior Art

43. The scope and content of the prior art as of May 1994 would have broadly included patents and publications regarding vehicle sensing systems as well as computer vision and object identification (regardless of whether specifically applied in automobiles or otherwise).

44. In my opinion, the references disclosed below would all have been considered to be within the same technical field as the subject matter of the ’000 patent. Furthermore, all of these references would be considered highly relevant prior art to claims 10, 11, 16, 17, 19, 20, and 23 of the ’000 patent.

45. My opinion is the same with respect to the scope and content of the prior art as of May 1994 and any time between May 1994 and June 1995.

C. List of Prior Art References Discussed in Analysis

46. In my analysis, I discuss the following references, which I introduce here to provide abbreviations. I understand
