`Breed et al.
`
`[54] OPTICAL IDENTIFICATION AND
`MONITORING SYSTEM USING PATTERN
`RECOGNITION FOR USE WITH VEHICLES
`
[75] Inventors: David S. Breed, Boonton Township, N.J.; Wilbur E. DuVall, Kimberling City, Mo.; Wendell C. Johnson, Torrance, Calif.
`
`[73] Assignee: Automotive Technologies
`International, Inc., Denville, N.J.
`
[*] Notice: The term of this patent shall not extend beyond the expiration date of Pat. No. 5,835,613.
`
`[21] Appl. No.: 474,786
`
[22] Filed: Jun. 7, 1995
`
`Related U.S. Application Data
`
`[63] Continuation-in-part of Ser. No. 878,571, May 5, 1992,
`abandoned, Ser. No. 40,978, Mar. 31, 1993, abandoned, Ser.
`No. 247,760, May 23, 1994, and Ser. No. 239,978, May 9,
`1994, abandoned.
[51] Int. Cl.6 ....................................................... G06K 9/00
[52] U.S. Cl. ............................................. 382/100; 348/143
[58] Field of Search ............................. 340/436; 382/104, 382/103, 291, 100; 280/735; 348/143, 148
`
`[56]
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
4,496,222  1/1985 Shah ........................................ 350/354
4,625,329 11/1986 Ishikawa et al. ........................... 382/1
4,648,052  3/1987 Friedman et al. ...................... 364/550
4,720,189  1/1988 Heynen et al. ......................... 351/210
4,768,088  8/1988 Ando ........................................ 358/93
4,836,670  6/1989 Hutchinson ............................ 351/210
4,881,270 11/1989 Knecht et al. ......................... 382/191
4,906,940  3/1990 Greene et al. ......................... 382/100
4,950,069  8/1990 Hutchinson ............................ 351/210
4,966,388 10/1990 Warner et al. ...................... 280/730.1
5,003,166  3/1991 Girod .................................. 250/201.4
`
US005845000A
[11] Patent Number: 5,845,000
[45] Date of Patent: *Dec. 1, 1998
`
`FOREIGN PATENT DOCUMENTS
`
  342337  2/1991 Japan .
94/22692 10/1994 WIPO .
`
`OTHER PUBLICATIONS
`
"Analysis of Hidden Units in a Layered Network Trained to Classify Sonar Targets", R. Paul Gorman et al., Neural Networks, vol. 1, pp. 75-89, 1988.
"Learned Classification of Sonar Targets Using a Massively Parallel Network", R. Paul Gorman et al., IEEE Transactions on Acoustics, Speech and Signal Processing, vol. 36, No. 7, Jul. 1988, pp. 1135-1140.
"How Airbags Work", David S. Breed, presented at the Canadian Association of Road Safety Professionals, Oct. 19-20, 1992.
Derwent Abstract of German Patent Publication No. DE 42 11 556, Oct. 7, 1993.
Derwent Abstract of Japanese Patent Application No. 02-051332, Nov. 13, 1991.
`
`Primary Examiner-Yon J. Couso
`
`[57]
`
`ABSTRACT
`
`A vehicle interior monitoring system to identify, locate and
`monitor occupants, including their parts, and other objects in
`the passenger compartment and objects outside of a motor
`vehicle, such as an automobile or truck, by illuminating the
`contents of the vehicle and objects outside of the vehicle
`with electromagnetic, and specifically infrared, radiation
`and using one or more lenses to focus images of the contents
`onto one or more arrays of charge coupled devices (CCD
arrays). Outputs from the CCD arrays are analyzed by
appropriate computational means employing trained pattern
recognition technologies to classify, identify or locate the
contents or external objects. In general, the information
obtained by the identification and monitoring system is used
to affect the operation of some other system in the vehicle.
When the system is installed in the passenger compartment of
`an automotive vehicle equipped with an airbag, the system
`determines the position of the vehicle occupant relative to
`the airbag and disables deployment of the airbag if the
`occupant is positioned so that he/she is likely to be injured
`by the deployment of the airbag.
`
`(List continued on next page.)
`
`25 Claims, 12 Drawing Sheets
`
`
`IPR2013-00424 - Ex. 1001
`Toyota Motor Corp., Petitioner
`
`
`
`
`
`U.S. PATENT DOCUMENTS
`
5,008,946  4/1991 Ando ........................................... 382/2
5,026,153  6/1991 Suzuki et al. .......................... 356/3.16
5,064,274 11/1991 Allen ....................................... 359/604
5,071,160 12/1991 White et al. ............................ 280/735
5,074,583 12/1991 Fujita et al. ......................... 280/730.1
5,118,134  6/1992 Mattes et al. ........................... 280/735
5,162,861 11/1992 Tamburino et al. ........................ 356/5
5,181,254  1/1993 Schweizer et al. ...................... 382/100
5,185,667  2/1993 Zimmermann .......................... 358/209
5,193,124  3/1993 Subbarao ................................ 382/255
5,214,744  5/1993 Schweizer et al. ........................ 395/11
5,227,784  7/1993 Masamori et al. ...................... 340/436
5,235,339  8/1993 Morrison et al. ....................... 342/159
5,249,027  9/1993 Mathur et al. .............................. 356/1
5,249,157  9/1993 Taylor ..................................... 340/435
5,298,732  3/1994 Chen .................................... 250/203.4
5,305,012  4/1994 Faris ............................................ 345/7
5,309,137  5/1994 Kajiwara ................................. 348/148
5,329,206  7/1994 Slotkowski et al. .................... 315/159
5,330,226  7/1994 Gentry et al. ........................... 280/735
5,339,075  8/1994 Abst et al. .............................. 340/903
5,355,118 10/1994 Fukahara ................................. 348/148
5,390,136  2/1995 Wang ...................................... 364/754
5,441,052  8/1995 Miyajima ........................... 128/661.09
5,454,591 10/1995 Mazur et al. ........................... 280/735
5,537,003  7/1996 Bechtel et al. ............................ 315/82
`
`
`
`
[Drawing sheets 1 through 12 (U.S. Patent, Dec. 1, 1998, 5,845,000). The drawings themselves did not survive OCR; only scattered labels are recoverable:

Sheet 5: FIG. 1B, with reference numerals 100, 101, 105 and 110.
Sheet 6: FIG. 2, with reference numeral 210.
Sheet 7: FIG. 3, a block diagram with the labels 12 VOLTS, GROUND, ARMING SENSOR, ELECTRONIC CRASH SENSOR, DIAGNOSTIC UNIT, OCCUPANT POSITION SENSORS and INFLATABLE RESTRAINT SYSTEM.
Sheet 8: FIG. 4, a block diagram with the labels Crystal Oscillator f1 48 MHz, Crystal Oscillator f2 48.05 MHz, Frequency Tripler (144 MHz), Driver Diode, Pre-amplifier, Amplifier with Automatic Gain Control, Phase Detector, (3f2-3f1) = 150 KHz Filter, d/dt, Velocity, and No Signal Detect; reference numerals 401, 405, 407, 408, 413 and 416.
Sheet 12: FIG. 8.]
`
`
`
`OPTICAL IDENTIFICATION AND
`MONITORING SYSTEM USING PATTERN
`RECOGNITION FOR USE WITH VEHICLES
`
`CROSS REFERENCE TO RELATED
`APPLICATIONS
`
`This application is a continuation-in-part of application
`Ser. No. 07/878,571 filed May 5, 1992, now abandoned, of
`application Ser. No. 08/040,978 filed Mar. 31, 1993 now
`abandoned, of copending application Ser. No. 08/247,760
`filed May 23, 1994 and of application Ser. No. 08/239,978
`filed May 9, 1994 now abandoned, the last three of which are
`included herein by reference.
`
`BACKGROUND OF THE INVENTION
`
1. Prior Art On Out Of Position Occupants And Rear Facing Child Seats

Whereas thousands of lives have been saved by airbags, a large number of people have also been injured, some seriously, by the deploying airbag, and thus significant improvements need to be made in this regard. As discussed in detail in copending patent applications Ser. Nos. 08/040,978 and 08/239,978 cross-referenced above, for a variety of reasons vehicle occupants may be too close to the airbag before it deploys and can be seriously injured or killed as a result of the deployment thereof. Also, a child in a rear facing child seat which is placed on the right front passenger seat is in danger of being seriously injured if the passenger airbag deploys. For these reasons and, as first publicly disclosed in Breed, D. S. "How Airbags Work" presented at the International Conference on Seatbelts and Airbags in 1993, in Canada, occupant position sensing and rear facing child seat detection is required.

Initially these systems will solve the out-of-position occupant and the rear facing child seat problems related to current airbag systems and prevent unneeded airbag deployments when a front seat is unoccupied. However, airbags are now under development to protect rear seat occupants in vehicle crashes and all occupants in side impacts. A system will therefore be needed to detect the presence of occupants, determine if they are out-of-position and to identify the presence of a rear facing child seat in the rear seat. Future automobiles are expected to have eight or more airbags as protection is sought for rear seat occupants and from side impacts. In addition to eliminating the disturbance and possible harm of unnecessary airbag deployments, the cost of replacing these airbags will be excessive if they all deploy in an accident needlessly.

Inflators now exist which will adjust the amount of gas flowing to the airbag to account for the size and position of the occupant and for the severity of the accident. The vehicle identification and monitoring system (VIMS) discussed in patent application Ser. No. 08/239,978 will control such inflators based on the presence and position of vehicle occupants or of a rear facing child seat. The instant invention is an improvement on that VIMS system and uses an advanced optical system comprising one or more CCD (charge coupled device) arrays and a source of illumination combined with a trained neural network pattern recognition system.

The need for an occupant out-of-position sensor has been observed by others and several methods have been disclosed in U.S. patents for determining the position of an occupant of a motor vehicle. Each of these systems, however, has significant limitations. In White et al. (U.S. Pat. No. 5,071,160), for example, a single acoustic sensor and detector is disclosed and, as illustrated, is mounted lower than the steering wheel. White et al. correctly perceive that such a sensor could be defeated, and the airbag falsely deployed, by an occupant adjusting the control knobs on the radio and thus they suggest the use of a plurality of such sensors.

Mattes et al. (U.S. Pat. No. 5,118,134) disclose a variety of methods of measuring the change in position of an occupant including ultrasonic, active or passive infrared and microwave radar sensors, and an electric eye. Their use of these sensors is to measure the change in position of an occupant during a crash and use that information to assess the severity of the crash and thereby decide whether or not to deploy the airbag. They are thus using the occupant motion as a crash sensor. No mention is made of determining the out-of-position status of the occupant or of any of the other features of occupant monitoring as disclosed in the above cross-referenced patent applications. It is interesting to note that nowhere do Mattes et al. discuss how to use active or passive infrared to determine the position of the occupant. As pointed out in the above cross-referenced patent applications, direct occupant position measurement based on passive infrared is probably not possible and, until very recently, was very difficult and expensive with active infrared requiring the modulation of an expensive GaAs infrared laser. Since there is no mention of these problems, the method of use contemplated by Mattes et al. must be similar to the electric eye concept where position is measured indirectly as the occupant passes by a plurality of longitudinally spaced-apart sensors.

The object of an occupant out-of-position sensor is to determine the location of the head and/or chest of the vehicle occupant relative to the airbag since it is the impact of either the head or chest with the deploying airbag which can result in serious injuries. Both White et al. and Mattes et al. disclose only lower mounting locations of their sensors which are mounted in front of the occupant such as on the dashboard or below the steering wheel. Both such mounting locations are particularly prone to detection errors due to positioning of the occupant's hands, arms and legs. This would require at least three, and preferably more, such sensors and detectors and appropriate logic circuitry which ignores readings from some sensors if such readings are inconsistent with others, for the case, for example, where the driver's arms are the closest objects to two of the sensors.

White et al. also disclose the use of error correction circuitry, without defining or illustrating the circuitry, to differentiate between the velocity of one of the occupant's hands, as in the case where he/she is adjusting the knob on the radio, and the remainder of the occupant. Three ultrasonic sensors of the type disclosed by White et al. might, in some cases, accomplish this differentiation if two of them indicated that the occupant was not moving while the third was indicating that he or she was. Such a combination, however, would not handle the case of an occupant with both hands and arms in the path of the ultrasonic transmitter at such a location that they were blocking a substantial view of the occupant's head or chest. Since the sizes and driving positions of occupants are extremely varied, trained pattern recognition systems, such as neural networks, are required when a clear view of the occupant, unimpeded by his/her extremities, cannot be guaranteed.

Fujita et al., in U.S. Pat. No. 5,074,583, illustrate another method of determining the position of the occupant but do not use this information to suppress deployment if the occupant is out-of-position. In fact, the closer that the occupant gets to the airbag the faster the inflation rate of the airbag is according to the Fujita patent, which thereby
`
`
`increases the possibility of injuring the occupant. Fujita et al.
`do not measure the occupant directly but instead determine
`his or her position indirectly from measurements of the seat
`position and the vertical size of the occupant relative to the
`seat. This occupant height is determined using an ultrasonic
`displacement sensor mounted directly above the occupant's
`head.
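The logic circuitry suggested earlier, which ignores readings from some sensors when they are inconsistent with the others, can be sketched in software. The following is purely illustrative: the three-sensor arrangement and the 20 cm agreement tolerance are assumptions for the sketch, not taken from any of the cited patents.

```c
#include <math.h>

/* Hypothetical consistency check for three spaced-apart occupant
   position sensors: if one reading disagrees sharply with the other
   two (e.g. a hand or arm close to one sensor), ignore it and
   average the agreeing pair. */
#define DISAGREE_CM 20.0   /* assumed tolerance, illustrative only */

double occupant_range(double r1, double r2, double r3)
{
    double d12 = fabs(r1 - r2);
    double d13 = fabs(r1 - r3);
    double d23 = fabs(r2 - r3);

    if (d12 <= DISAGREE_CM && d12 <= d13 && d12 <= d23)
        return (r1 + r2) / 2.0;        /* r3 is the outlier */
    if (d13 <= DISAGREE_CM && d13 <= d23)
        return (r1 + r3) / 2.0;        /* r2 is the outlier */
    if (d23 <= DISAGREE_CM)
        return (r2 + r3) / 2.0;        /* r1 is the outlier */
    return (r1 + r2 + r3) / 3.0;       /* no clear agreement */
}
```

A real implementation would also weight the sensors by mounting location and track readings over time rather than voting on a single sample.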
`As discussed above, the optical systems described herein
`are also applicable for many other sensing applications both
`inside and outside of the vehicle compartment such as for
`sensing crashes before they occur as described in copending
`patent application Ser. No. 08/239,978 cross-referenced
`above, for a smart headlight adjustment system and for a
`blind spot monitor.
`2. Definitions
The use of pattern recognition is central to the instant invention as well as to the cross-referenced patent applications above. Nowhere in the prior art is pattern recognition which is based on training, as exemplified through the use of neural networks, mentioned for use in monitoring the interior or exterior environments of the vehicle. "Pattern recognition" as used herein will mean any system which processes a signal that is generated by an object, or is modified by interacting with an object, in order to determine which one of a set of classes the object belongs to. Such a system might determine only that the object is or is not a member of one specified class, or it might attempt to assign the object to one of a larger set of specified classes, or find that it is not a member of any of the classes in the set. The signals processed are generally electrical signals coming from transducers which are sensitive to either acoustic or electromagnetic radiation and, if electromagnetic, they can be either visible light, infrared, ultraviolet or radar. A trainable or a trained pattern recognition system as used herein means a pattern recognition system which is taught various patterns by subjecting the system to a variety of examples. The most successful such system is the neural network.
To "identify" as used herein will mean to determine that the object belongs to a particular set or class. The class may be one containing, for example, all rear facing child seats, one containing all human occupants, or all human occupants not sitting in a rear facing child seat, depending on the purpose of the system. In the case where a particular person is to be recognized, the set or class will contain only a single element, i.e., the person to be recognized.

An "occupying item" of a seat may be a living occupant such as a human being or a dog, another living organism such as a plant, or an inanimate object such as a box or bag of groceries.
In the description herein on anticipatory sensing, the term "approaching" when used in connection with the mention of an object or vehicle approaching another will mean the relative motion of the object toward the vehicle having the anticipatory sensor system. Thus, in a side impact with a tree, the tree will be considered as approaching the side of the vehicle and impacting the vehicle. In other words, the coordinate system used in general will be a coordinate system residing in the target vehicle. The "target" vehicle is the vehicle which is being impacted. This convention permits a general description to cover all of the cases such as where (i) a moving vehicle impacts into the side of a stationary vehicle, (ii) where both vehicles are moving when they impact, or (iii) where a vehicle is moving sideways into a stationary vehicle, tree or wall.
`3. Pattern Recognition Prior Art
`Japanese patent 3-42337 (A) to Ueno discloses a device
`for detecting the driving condition of a vehicle driver
`
comprising a light emitter for irradiating the face of the
`driver and a means for picking up the image of the driver and
`storing it for later analysis. Means are provided for locating
`the eyes of the driver and then the irises of the eyes and then
`determining if the driver is looking to the side or sleeping.
`Ueno determines the state of the eyes of the occupant rather
`than determining the location of the eyes relative to the other
`parts of the vehicle passenger compartment. Such a system
`can be defeated if the driver is wearing glasses, particularly
`sunglasses, or another optical device which obstructs a clear
`view of his/her eyes. Pattern recognition technologies such
`as neural networks are not used.
`U.S. Pat. No. 5,008,946 to Ando uses a complicated set of
`rules to isolate the eyes and mouth of a driver and uses this
`information to permit the driver to control the radio, for
`example, or other systems within the vehicle by moving his
`eyes and/or mouth. Ando uses natural light and illuminates
`only the head of the driver. He also makes no use of trainable
`pattern recognition systems such as neural networks, nor is
`there any attempt to identify the contents of the vehicle nor
of their location relative to the vehicle passenger
compartment. Rather, Ando is limited to control of vehicle devices
`by responding to motion of the driver's mouth and eyes.
U.S. Pat. No. 5,298,732 to Chen also concentrates on
locating the eyes of the driver so as to position a light filter
`between a light source such as the sun or the lights of an
`oncoming vehicle, and the driver's eyes. Chen does not
`explain in detail how the eyes are located but does supply a
`calibration system whereby the driver can adjust the filter so
`that it is at the proper position relative to his or her eyes.
Chen references the use of automatic equipment for
determining the location of the eyes but does not describe
how this equipment works. In any event, there is no mention
of monitoring the position of the occupant, other than the
eyes, of determining the position of the eyes relative to the
`passenger compartment, or of identifying any other object in
`the vehicle other than the driver's eyes. Also, there is no
`mention of the use of a trainable pattern recognition system.
`U.S. Pat. No. 5,305,012 to Faris also describes a system
`for reducing the glare from the headlights of an oncoming
`vehicle. Faris locates the eyes of the occupant by the use of
`two spaced apart infrared cameras using passive infrared
`radiation from the eyes of the driver. Again, Faris is only
`interested in locating the driver's eyes relative to the sun or
oncoming headlights and does not identify or monitor the
`occupant or locate the occupant relative to the passenger
`compartment or the airbag. Also, Faris does not use trainable
`pattern recognition techniques such as neural networks.
`Faris, in fact, does not even say how the eyes of the occupant
`are located but refers the reader to a book entitled Robot
`Vision (1991) by Berthold Horn, published by MIT Press,
`Cambridge, Mass. Also, Faris uses the passive infrared
`radiation rather than illuminating the occupant with active
`infrared radiation or in general electromagnetic radiation as
`in the instant invention.
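Two spaced apart imaging receivers such as those Faris employs can also yield distance by triangulation, an approach this Background later calls stereographic ranging. A minimal sketch follows; the pinhole camera model and every parameter value are assumptions for illustration, not taken from Faris.

```c
/* Stereo ranging sketch: two receivers a known baseline apart.
   With focal length f and baseline b (same length units), an image
   disparity d gives range z = f * b / d in those units. */
double stereo_range(double focal_mm, double baseline_mm, double disparity_mm)
{
    if (disparity_mm <= 0.0)
        return -1.0;   /* no disparity: target effectively at infinity */
    return (focal_mm * baseline_mm) / disparity_mm;
}
```

The hard part in practice is measuring the disparity, i.e. matching the same feature (an eye, a head edge) in both images; the range arithmetic itself is trivial.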
The use of neural networks as the pattern recognition technology is central to this invention since it makes the monitoring system robust, reliable and practical. The resulting algorithm created by the neural network program is usually only a few lines of code written in the C computer language as opposed to typically hundreds of lines when the techniques of the above patents to Ando, Chen and Faris are implemented. As a result, the resulting systems are easy to implement at a low cost, making them practical for automotive applications. The cost of the CCD arrays, for example, has been prohibitively expensive until very recently, rendering their use for VIMS impractical. Similarly, the implementation of the techniques of the above referenced patents requires expensive microprocessors while the implementation with neural networks and similar trainable pattern recognition technologies permits the use of low cost microprocessors typically costing less than $5.

The present invention uses sophisticated trainable pattern recognition capabilities such as neural networks. Usually the data is preprocessed, as discussed below, using various feature extraction techniques. An example of such a pattern recognition system using neural networks on sonar signals is discussed in two papers by Gorman, R. P. and Sejnowski, T. J.: "Analysis of Hidden Units in a Layered Network Trained to Classify Sonar Targets", Neural Networks, Vol. 1, pp. 75-89, 1988, and "Learned Classification of Sonar Targets Using a Massively Parallel Network", IEEE Transactions on Acoustics, Speech, and Signal Processing, Vol. 36, No. 7, July 1988. Examples of feature extraction techniques can be found in U.S. Pat. No. 4,906,940 entitled "Process and Apparatus for the Automatic Detection and Extraction of Features in Images and Displays" to Green et al. Examples of other more advanced and efficient pattern recognition techniques can be found in U.S. Pat. No. 5,390,136 entitled "Artificial Neuron and Method of Using Same" and U.S. patent application Ser. No. 08/076,601 entitled "Neural Network and Method of Using Same" to Wang, S. T. Other examples include U.S. Pat. Nos. 5,235,339 (Morrison et al.), 5,214,744 (Schweizer et al.), 5,181,254 (Schweizer et al.), and 4,881,270 (Knecht et al.). All of the above references are included herein by reference.

4. Optics

Optics can be used in several configurations for monitoring the interior of a passenger compartment of an automobile. In one known method, a laser optical system uses a GaAs infrared laser beam to momentarily illuminate an object, occupant or child seat, in the manner described and illustrated in FIG. 8 of the copending patent application Ser. No. 08/040,978 cross-referenced above. The receiver can be a charge coupled device or CCD (a type of TV camera) to receive the reflected light. The laser can either be used in a scanning mode, or, through the use of a lens, a cone of light can be created which covers a large portion of the object. In these configurations, the light can be accurately controlled to illuminate only particular positions of interest within the vehicle. In the scanning mode, the receiver need only comprise a single or a few active elements while in the case of the cone of light, an array of active elements is needed. The laser system has one additional significant advantage in that the distance to the illuminated object can be determined as disclosed in the 08/040,978 patent application.

In a simpler case, light generated by a non-coherent light emitting diode device is used to illuminate the desired area. In this case, the area covered is not as accurately controlled and a larger CCD array is required. Recently, however, the cost of CCD arrays has dropped substantially with the result that this configuration is now the most cost effective system for monitoring the passenger compartment as long as the distance from the transmitter to the objects is not needed. If this distance is required, then either the laser system, a stereographic system, a focusing system, or a combined ultrasonic and optic system is required. A mechanical focusing system, such as used on some camera systems, can determine the initial position of an occupant but is too slow to monitor his/her position during a crash. A distance measuring system based on focusing is described in U.S. Pat. No. 5,193,124 (Subbarao) which can either be used with a mechanical focusing system or with two cameras, the latter of which would be fast enough. Although the Subbarao patent provides a good discussion of the camera focusing art and is therefore included herein by reference, it is a more complicated system than is needed for practicing the instant invention. In fact, a neural network can also be trained to perform the distance determination based on the two images taken with different camera settings or from two adjacent CCD's and lenses having different properties as the cameras disclosed in Subbarao, making this technique practical for the purposes of the instant invention. Distance can also be determined by the system disclosed in U.S. Pat. No. 5,003,166 (Girod) by the spreading or defocusing of a pattern of structured light projected onto the object of interest.

In each of these cases, regardless of the distance measurement system used, a trained pattern recognition system, as defined above, is used in the instant invention to identify and classify, and in some cases to locate, the illuminated object and its constituent parts.

5. Optics And Acoustics

The laser systems described above are expensive due to the requirement that they be modulated at a high frequency if the distance from the airbag to the occupant, for example, needs to be measured. Both laser and non-laser optical systems in general are good at determining the location of objects within the two dimensional plane of the image, and the modulated laser system in the scanning mode can determine the distance of each part of the image from the receiver. It is also possible to determine distance with the non-laser system by focusing as discussed above, or stereographically if two spaced apart receivers are used and, in some cases, the mere location in the field of view can be used to estimate the position relative to the airbag, for example. Finally, a recently developed pulsed quantum well diode laser does provide inexpensive distance measurements as discussed below.

Acoustic systems are also quite effective at distance measurements since the relatively low speed of sound permits simple electronic circuits to be designed and minimal microprocessor capability is required. If a coordinate system is used where the z axis is from the transducer to the occupant, acoustics are good at measuring z dimensions while simple optical systems using a single CCD are good at measuring x and y dimensions. The combination of acoustics and optics, therefore, permits all three measurements to be made with low cost components.

One example of a system using these ideas is an optical system which floods the passenger seat with infrared light coupled with a lens and CCD array which receives and displays the reflected light, and an analog to digital converter (ADC) which digitizes the output of the CCD and feeds it to an Artificial Neural Network (ANN) or other pattern recognition system for analysis. This system uses an ultrasonic transmitter and receiver for measuring the distances to the objects located in the passenger seat. The receiving transducer feeds its data into an ADC and from there into the ANN. The same ANN can be used for both systems thereby providing full three dimensional data for the ANN to analyze. This system, using low cost components, will permit accurate identification and distance measurements not possible by either system acting alone. If a phased array system is added to the acoustic part of the system as disclosed in copending patent application (ATI-102), the optical part can determine the location of the driver's ears, for example, and the phased array can direct a narrow beam to the location and determine the distance to the occupant's ears.
`
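The combined measurement described in this section, acoustics for the z dimension and a single CCD for x and y, can be sketched as follows. The speed of sound constant and the pinhole camera model with an assumed focal length and pixel pitch are illustrative assumptions, not parameters taken from this patent.

```c
/* Sketch of the combined acoustic/optical measurement: ultrasonic
   time of flight supplies z; the CCD image supplies x and y. */
#define SPEED_OF_SOUND_M_S 343.0   /* air, roughly room temperature */

/* Round-trip echo delay (seconds) to one-way distance (meters). */
double ultrasonic_z(double echo_delay_s)
{
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0;
}

/* Pinhole back-projection: CCD pixel offsets (px, py) from the image
   center to x and y at range z, given focal length f and pixel pitch
   p (both in meters). */
void optical_xy(double px, double py, double z, double f, double p,
                double *x, double *y)
{
    *x = px * p * z / f;
    *y = py * p * z / f;
}
```

In a system of this kind the ranging and back-projection would run continuously, with the resulting (x, y, z) samples fed to the pattern recognition stage.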
`
`
`
`
`6. Applications
`The applications for this technology are numerous as
`described in the copending patent applications listed above.
`They include: (i) the monitoring of the occupant for safety
`purposes to prevent airbag deployment induced injuries, (ii)
`the locating of the eyes of the occupant to permit automatic
`adjustment of the rear view mirror(s), (iii) the location of the
`seat to place the eyes at the proper position to eliminate the
`parallax in a heads-up display in night vision systems, (iv)
`t