Breed et al.

[54] OPTICAL IDENTIFICATION AND MONITORING SYSTEM USING PATTERN RECOGNITION FOR USE WITH VEHICLES

[75] Inventors: David S. Breed, Boonton Township, N.J.; Wilbur E. DuVall, Kimberling City, Mo.; Wendell C. Johnson, Torrance, Calif.

[73] Assignee: Automotive Technologies International, Inc., Denville, N.J.
`
[*] Notice: The term of this patent shall not extend beyond the expiration date of Pat. No. 5,835,613.

[21] Appl. No.: 474,786

[22] Filed: Jun. 7, 1995
`
[63] Related U.S. Application Data: Continuation-in-part of Ser. No. 878,571, May 5, 1992, abandoned, Ser. No. 40,978, Mar. 31, 1993, abandoned, Ser. No. 247,760, May 23, 1994, and Ser. No. 239,978, May 9, 1994, abandoned.

[51] Int. Cl.6 ........................... G06K 9/00
[52] U.S. Cl. ........................... 382/100; 348/143
[58] Field of Search ................... 340/436; 382/104, 103, 291, 100; 280/735; 348/143, 148
`
[56] References Cited

U.S. PATENT DOCUMENTS

4,496,222   1/1985  Shah ................ 350/354
4,625,329  11/1986  Ishikawa et al. ..... 382/1
4,648,052   3/1987  Friedman et al. ..... 364/550
4,720,189   1/1988  Heynen et al. ....... 351/210
4,768,088   8/1988  Ando ................ 358/93
4,836,670   6/1989  Hutchinson .......... 351/210
4,881,270  11/1989  Knecht et al. ....... 382/191
4,906,940   3/1990  Greene et al. ....... 382/100
4,950,069   8/1990  Hutchinson .......... 351/210
4,966,388  10/1990  Warner et al. ....... 280/730.1
5,003,166   3/1991  Girod ............... 250/201.4
`
US005845000A

[11] Patent Number: 5,845,000
[45] Date of Patent: *Dec. 1, 1998
`
FOREIGN PATENT DOCUMENTS

3-42337    2/1991  Japan
94/22692  10/1994  WIPO

OTHER PUBLICATIONS

"Analysis of Hidden Units in a Layered Network Trained to Classify Sonar Targets", R. Paul Gorman, et al., Neural Networks, vol. 1, pp. 75-89, 1988.
"Learned Classification of Sonar Targets Using a Massively Parallel Network", R. Paul Gorman et al., IEEE Transactions on Acoustics, Speech and Signal Processing, vol. 36, No. 7, Jul. 1988, pp. 1135-1140.
"How Airbags Work", David S. Breed, presented at the Canadian Association of Road Safety Professionals, Oct. 19-20, 1992.
Derwent Abstract of German Patent Publication No. DE 42 11 556, Oct. 7, 1993.
Derwent Abstract of Japanese Patent Application No. 02-051332, Nov. 13, 1991.
`
`Primary Examiner-Yon J. Couso
`
[57] ABSTRACT

A vehicle interior monitoring system to identify, locate and monitor occupants, including their parts, and other objects in the passenger compartment and objects outside of a motor vehicle, such as an automobile or truck, by illuminating the contents of the vehicle and objects outside of the vehicle with electromagnetic, and specifically infrared, radiation and using one or more lenses to focus images of the contents onto one or more arrays of charge coupled devices (CCD arrays). Outputs from the CCD arrays are analyzed by appropriate computational means employing trained pattern recognition technologies to classify, identify or locate the contents or external objects. In general, the information obtained by the identification and monitoring system is used to affect the operation of some other system in the vehicle. When the system is installed in the passenger compartment of an automotive vehicle equipped with an airbag, the system determines the position of the vehicle occupant relative to the airbag and disables deployment of the airbag if the occupant is positioned so that he/she is likely to be injured by the deployment of the airbag.
`
`(List continued on next page.)
`
`25 Claims, 12 Drawing Sheets
`
[Front-page drawing: vehicle interior monitoring system; legible reference numerals include 100, 105 and 114]

IPR2015-00262 - Ex. 1101
Toyota Motor Corp., Petitioner
`
`
`
5,845,000
Page 2

U.S. PATENT DOCUMENTS

5,008,946   4/1991  Ando ................ 382/2
5,026,153   6/1991  Suzuki et al. ....... 356/3.16
5,064,274  11/1991  Alten ............... 359/604
5,071,160  12/1991  White et al. ........ 280/735
5,074,583  12/1991  Fujita et al. ....... 280/730.1
5,118,134   6/1992  Mattes et al. ....... 280/735
5,162,861  11/1992  Tamburino et al. .... 356/5
5,181,254   1/1993  Schweizer et al. .... 382/100
5,185,667   2/1993  Zimmermann .......... 358/209
5,193,124   3/1993  Subbarao ............ 382/255
5,214,744   5/1993  Schweizer et al. .... 395/11
5,227,784   7/1993  Masamori et al. ..... 340/436
5,235,339   8/1993  Morrison et al. ..... 342/159
5,249,027   9/1993  Mathur et al. ....... 356/1
5,249,157   9/1993  Taylor .............. 340/435
5,298,732   3/1994  Chen ................ 250/203.4
5,305,012   4/1994  Faris ............... 345/7
5,309,137   5/1994  Kajiwara ............ 348/148
5,329,206   7/1994  Slotkowski et al. ... 315/159
5,330,226   7/1994  Gentry et al. ....... 280/735
5,339,075   8/1994  Abst et al. ......... 340/903
5,355,118  10/1994  Fukuhara ............ 348/148
5,390,136   2/1995  Wang ................ 364/754
5,441,052   8/1995  Miyajima ............ 128/661.09
5,454,591  10/1995  Mazur et al. ........ 280/735
5,537,003   7/1996  Bechtel et al. ...... 315/82
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 1 of 12          5,845,000

[Drawing: figure caption and reference numerals not legible in this scan]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 2 of 12          5,845,000

[Drawing: figure caption and reference numerals not legible in this scan]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 3 of 12          5,845,000

[Drawing: figure caption and reference numerals not legible in this scan]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 4 of 12          5,845,000

[Drawing: figure caption and reference numerals not legible in this scan]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 5 of 12          5,845,000

[FIG. 1B — drawing; legible reference numerals: 100, 101, 105, 110]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 6 of 12          5,845,000

[FIG. 2 — drawing; legible reference numeral: 210]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 7 of 12          5,845,000

[FIG. 3 — block diagram; legible labels: 12 VOLTS; GROUND; ARMING SENSOR; ELECTRONIC CRASH SENSOR; DIAGNOSTIC UNIT; OCCUPANT POSITION SENSORS; INFLATABLE RESTRAINT SYSTEM]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 8 of 12          5,845,000

[FIG. 4 — block diagram; legible labels: Crystal Oscillator f1 48 MHz; Crystal Oscillator f2 48.05 MHz; Frequency Tripler (two); 144 MHz; Diode Driver; Phase Detector; Pre-amplifier; Bias; Mixer; Automatic Gain Control Amplifier; Filter 144.15; (3f2-3f1) = 150 KHz Filter; Velocity]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 9 of 12          5,845,000

[Drawing: figure caption and reference numerals not legible in this scan]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 10 of 12          5,845,000

[Drawing: figure caption and reference numerals not legible in this scan]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 11 of 12          5,845,000

[Drawing: figure caption and reference numerals not legible in this scan]
`
`
`
U.S. Patent          Dec. 1, 1998          Sheet 12 of 12          5,845,000

[FIG. 8 — drawing; reference numerals not legible in this scan]
`
`
`
OPTICAL IDENTIFICATION AND MONITORING SYSTEM USING PATTERN RECOGNITION FOR USE WITH VEHICLES

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part of application Ser. No. 07/878,571 filed May 5, 1992, now abandoned, of application Ser. No. 08/040,978 filed Mar. 31, 1993, now abandoned, of copending application Ser. No. 08/247,760 filed May 23, 1994, and of application Ser. No. 08/239,978 filed May 9, 1994, now abandoned, the last three of which are included herein by reference.

BACKGROUND OF THE INVENTION

1. Prior Art On Out Of Position Occupants And Rear Facing Child Seats

Whereas thousands of lives have been saved by airbags, a large number of people have also been injured, some seriously, by the deploying airbag, and thus significant improvements need to be made in this regard. As discussed in detail in copending patent applications Ser. Nos. 08/040,978 and 08/239,978 cross-referenced above, for a variety of reasons vehicle occupants may be too close to the airbag before it deploys and can be seriously injured or killed as a result of the deployment thereof. Also, a child in a rear facing child seat which is placed on the right front passenger seat is in danger of being seriously injured if the passenger airbag deploys. For these reasons and, as first publicly disclosed in Breed, D. S., "How Airbags Work", presented at the International Conference on Seatbelts and Airbags in 1993 in Canada, occupant position sensing and rear facing child seat detection is required.

Initially these systems will solve the out-of-position occupant and the rear facing child seat problems related to current airbag systems and prevent unneeded airbag deployments when a front seat is unoccupied. However, airbags are now under development to protect rear seat occupants in vehicle crashes and all occupants in side impacts. A system will therefore be needed to detect the presence of occupants, determine if they are out-of-position and to identify the presence of a rear facing child seat in the rear seat. Future automobiles are expected to have eight or more airbags as protection is sought for rear seat occupants and from side impacts. In addition to eliminating the disturbance and possible harm of unnecessary airbag deployments, the cost of replacing these airbags will be excessive if they all deploy in an accident needlessly.

Inflators now exist which will adjust the amount of gas flowing to the airbag to account for the size and position of the occupant and for the severity of the accident. The vehicle identification and monitoring system (VIMS) discussed in patent application Ser. No. 08/239,978 will control such inflators based on the presence and position of vehicle occupants or of a rear facing child seat. The instant invention is an improvement on that VIMS system and uses an advanced optical system comprising one or more CCD (charge coupled device) arrays and a source of illumination combined with a trained neural network pattern recognition system.

The need for an occupant out-of-position sensor has been observed by others, and several methods have been disclosed in U.S. patents for determining the position of an occupant of a motor vehicle. Each of these systems, however, has significant limitations. In White et al. (U.S. Pat. No. 5,071,160), for example, a single acoustic sensor and detector is disclosed and, as illustrated, is mounted lower than the steering wheel. White et al. correctly perceive that such a sensor could be defeated, and the airbag falsely deployed, by an occupant adjusting the control knobs on the radio, and thus they suggest the use of a plurality of such sensors.

Mattes et al. (U.S. Pat. No. 5,118,134) disclose a variety of methods of measuring the change in position of an occupant including ultrasonic, active or passive infrared and microwave radar sensors, and an electric eye. Their use of these sensors is to measure the change in position of an occupant during a crash and use that information to assess the severity of the crash and thereby decide whether or not to deploy the airbag. They are thus using the occupant motion as a crash sensor. No mention is made of determining the out-of-position status of the occupant or of any of the other features of occupant monitoring as disclosed in the above cross-referenced patent applications. It is interesting to note that nowhere does Mattes et al. discuss how to use active or passive infrared to determine the position of the occupant. As pointed out in the above cross-referenced patent applications, direct occupant position measurement based on passive infrared is probably not possible and, until very recently, was very difficult and expensive with active infrared, requiring the modulation of an expensive GaAs infrared laser. Since there is no mention of these problems, the method of use contemplated by Mattes et al. must be similar to the electric eye concept where position is measured indirectly as the occupant passes by a plurality of longitudinally spaced-apart sensors.

The object of an occupant out-of-position sensor is to determine the location of the head and/or chest of the vehicle occupant relative to the airbag, since it is the impact of either the head or chest with the deploying airbag which can result in serious injuries. Both White et al. and Mattes et al. disclose only lower mounting locations of their sensors, which are mounted in front of the occupant such as on the dashboard or below the steering wheel. Both such mounting locations are particularly prone to detection errors due to positioning of the occupant's hands, arms and legs. This would require at least three, and preferably more, such sensors and detectors and an appropriate logic circuitry which ignores readings from some sensors if such readings are inconsistent with others, for the case, for example, where the driver's arms are the closest objects to two of the sensors.

White et al. also disclose the use of error correction circuitry, without defining or illustrating the circuitry, to differentiate between the velocity of one of the occupant's hands, as in the case where he/she is adjusting the knob on the radio, and the remainder of the occupant. Three ultrasonic sensors of the type disclosed by White et al. might, in some cases, accomplish this differentiation if two of them indicated that the occupant was not moving while the third was indicating that he or she was. Such a combination, however, would not differentiate between an occupant with both hands and arms in the path of the ultrasonic transmitter at such a location that they were blocking a substantial view of the occupant's head or chest. Since the sizes and driving positions of occupants are extremely varied, trained pattern recognition systems, such as neural networks, are required when a clear view of the occupant, unimpeded by his/her extremities, cannot be guaranteed.

Fujita et al., in U.S. Pat. No. 5,074,583, illustrate another method of determining the position of the occupant but do not use this information to suppress deployment if the occupant is out-of-position. In fact, the closer that the occupant gets to the airbag, the faster the inflation rate of the airbag is according to the Fujita patent, which thereby
`
`
increases the possibility of injuring the occupant. Fujita et al. do not measure the occupant directly but instead determine his or her position indirectly from measurements of the seat position and the vertical size of the occupant relative to the seat. This occupant height is determined using an ultrasonic displacement sensor mounted directly above the occupant's head.

As discussed above, the optical systems described herein are also applicable for many other sensing applications both inside and outside of the vehicle compartment, such as for sensing crashes before they occur as described in copending patent application Ser. No. 08/239,978 cross-referenced above, for a smart headlight adjustment system and for a blind spot monitor.

2. Definitions

The use of pattern recognition is central to the instant invention as well as to those cross-referenced patent applications above. Nowhere in the prior art is pattern recognition which is based on training, as exemplified through the use of neural networks, mentioned for use in monitoring the interior or exterior environments of the vehicle. "Pattern recognition" as used herein will mean any system which processes a signal that is generated by an object, or is modified by interacting with an object, in order to determine which one of a set of classes the object belongs to. Such a system might determine only that the object is or is not a member of one specified class, or it might attempt to assign the object to one of a larger set of specified classes, or find that it is not a member of any of the classes in the set. The signals processed are generally electrical signals coming from transducers which are sensitive to either acoustic or electromagnetic radiation and, if electromagnetic, they can be either visible light, infrared, ultraviolet or radar. A trainable or a trained pattern recognition system as used herein means a pattern recognition system which is taught various patterns by subjecting the system to a variety of examples. The most successful such system is the neural network.
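A trainable system of the kind defined above, taught by subjecting it to a variety of labeled examples, can be sketched in a few lines. The single sigmoid unit, learning rate, feature values and class names below are all illustrative assumptions, not the patent's implementation:

```python
import math

def train_classifier(examples, epochs=200, lr=0.5):
    """Train a minimal one-unit (logistic) classifier on labeled
    feature vectors; a stand-in for the trained pattern recognition
    systems, such as neural networks, discussed in the text.
    `examples` is a list of (features, label) pairs, label 0 or 1."""
    n = len(examples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in examples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            g = p - y                        # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def classify(model, x):
    """Assign the feature vector x to class 1 or class 0."""
    w, b = model
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical training set: each vector might encode two image-derived
# features; label 1 = "rear facing child seat", 0 = "adult occupant".
data = [([0.9, 0.1], 1), ([0.8, 0.2], 1), ([0.2, 0.9], 0), ([0.1, 0.8], 0)]
model = train_classifier(data)
```

After training, the model assigns previously unseen feature vectors to one of the two classes, which is exactly the "identify" operation the definitions section goes on to describe.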
To "identify" as used herein will mean to determine that the object belongs to a particular set or class. The class may be one containing, for example, all rear facing child seats, one containing all human occupants, or all human occupants not sitting in a rear facing child seat, depending on the purpose of the system. In the case where a particular person is to be recognized, the set or class will contain only a single element, i.e., the person to be recognized.

An "occupying item" of a seat may be a living occupant such as a human being or a dog, another living organism such as a plant, or an inanimate object such as a box or bag of groceries.

In the description herein on anticipatory sensing, the term "approaching" when used in connection with the mention of an object or vehicle approaching another will mean the relative motion of the object toward the vehicle having the anticipatory sensor system. Thus, in a side impact with a tree, the tree will be considered as approaching the side of the vehicle and impacting the vehicle. In other words, the coordinate system used in general will be a coordinate system residing in the target vehicle. The "target" vehicle is the vehicle which is being impacted. This convention permits a general description to cover all of the cases, such as where (i) a moving vehicle impacts into the side of a stationary vehicle, (ii) both vehicles are moving when they impact, or (iii) a vehicle is moving sideways into a stationary vehicle, tree or wall.
3. Pattern Recognition Prior Art

Japanese patent 3-42337 (A) to Ueno discloses a device for detecting the driving condition of a vehicle driver comprising a light emitter for irradiating the face of the driver and a means for picking up the image of the driver and storing it for later analysis. Means are provided for locating the eyes of the driver and then the irises of the eyes and then determining if the driver is looking to the side or sleeping. Ueno determines the state of the eyes of the occupant rather than determining the location of the eyes relative to the other parts of the vehicle passenger compartment. Such a system can be defeated if the driver is wearing glasses, particularly sunglasses, or another optical device which obstructs a clear view of his/her eyes. Pattern recognition technologies such as neural networks are not used.

U.S. Pat. No. 5,008,946 to Ando uses a complicated set of rules to isolate the eyes and mouth of a driver and uses this information to permit the driver to control the radio, for example, or other systems within the vehicle by moving his eyes and/or mouth. Ando uses natural light and illuminates only the head of the driver. He also makes no use of trainable pattern recognition systems such as neural networks, nor is there any attempt to identify the contents of the vehicle or their location relative to the vehicle passenger compartment. Rather, Ando is limited to control of vehicle devices by responding to motion of the driver's mouth and eyes.

U.S. Pat. No. 5,298,732 to Chen also concentrates on locating the eyes of the driver so as to position a light filter between a light source, such as the sun or the lights of an oncoming vehicle, and the driver's eyes. Chen does not explain in detail how the eyes are located but does supply a calibration system whereby the driver can adjust the filter so that it is at the proper position relative to his or her eyes. Chen references the use of automatic equipment for determining the location of the eyes but does not describe how this equipment works. In any event, there is no mention of monitoring the position of the occupant other than the eyes, of determining the position of the eyes relative to the passenger compartment, or of identifying any other object in the vehicle other than the driver's eyes. Also, there is no mention of the use of a trainable pattern recognition system.

U.S. Pat. No. 5,305,012 to Faris also describes a system for reducing the glare from the headlights of an oncoming vehicle. Faris locates the eyes of the occupant by the use of two spaced apart infrared cameras using passive infrared radiation from the eyes of the driver. Again, Faris is only interested in locating the driver's eyes relative to the sun or oncoming headlights and does not identify or monitor the occupant or locate the occupant relative to the passenger compartment or the airbag. Also, Faris does not use trainable pattern recognition techniques such as neural networks. Faris, in fact, does not even say how the eyes of the occupant are located but refers the reader to a book entitled Robot Vision (1991) by Berthold Horn, published by MIT Press, Cambridge, Mass. Also, Faris uses passive infrared radiation rather than illuminating the occupant with active infrared radiation, or in general electromagnetic radiation, as in the instant invention.
The use of neural networks as the pattern recognition technology is central to this invention since it makes the monitoring system robust, reliable and practical. The resulting algorithm created by the neural network program is usually only a few lines of code written in the C computer language, as opposed to typically hundreds of lines when the techniques of the above patents to Ando, Chen and Faris are implemented. As a result, the resulting systems are easy to implement at a low cost, making them practical for automotive applications. The cost of the CCD arrays, for example, had been prohibitively high until very recently, rendering their use for VIMS impractical. Similarly, the implementation of the techniques of the above referenced patents requires expensive microprocessors while the implementation with neural networks and similar trainable pattern recognition technologies permits the use of low cost microprocessors typically costing less than $5.
The present invention uses sophisticated trainable pattern recognition capabilities such as neural networks. Usually the data is preprocessed, as discussed below, using various feature extraction techniques. An example of such a pattern recognition system using neural networks on sonar signals is discussed in two papers by Gorman, R. P. and Sejnowski, T. J.: "Analysis of Hidden Units in a Layered Network Trained to Classify Sonar Targets", Neural Networks, Vol. 1, pp. 75-89, 1988, and "Learned Classification of Sonar Targets Using a Massively Parallel Network", IEEE Transactions on Acoustics, Speech, and Signal Processing, Vol. 36, No. 7, July 1988. Examples of feature extraction techniques can be found in U.S. Pat. No. 4,906,940 entitled "Process and Apparatus for the Automatic Detection and Extraction of Features in Images and Displays" to Greene et al. Examples of other more advanced and efficient pattern recognition techniques can be found in U.S. Pat. No. 5,390,136 entitled "Artificial Neuron and Method of Using Same" and U.S. patent application Ser. No. 08/076,601 entitled "Neural Network and Method of Using Same" to Wang, S. T. Other examples include U.S. Pat. Nos. 5,235,339 (Morrison et al.), 5,214,744 (Schweizer et al.), 5,181,254 (Schweizer et al.), and 4,881,270 (Knecht et al.). All of the above references are included herein by reference.
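As a toy illustration of the preprocessing step mentioned above, an image can be reduced to a short feature vector before it is handed to a pattern recognizer. Normalized row and column sums are used here purely as an assumed example of "various feature extraction", not as the feature set of any cited reference:

```python
def extract_features(image):
    """Reduce a 2-D intensity image (a list of equal-length rows of
    numbers) to a normalized vector of row sums followed by column sums,
    so a recognizer sees a short, scale-independent description."""
    rows = [float(sum(r)) for r in image]
    cols = [float(sum(image[i][j] for i in range(len(image))))
            for j in range(len(image[0]))]
    total = sum(rows) or 1.0  # guard against an all-dark image
    return [v / total for v in rows + cols]
```

A 2x2 image such as [[1, 0], [1, 2]] is thus reduced from four pixels to the four-element vector of its normalized row and column sums before classification.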
4. Optics

Optics can be used in several configurations for monitoring the interior of a passenger compartment of an automobile. In one known method, a laser optical system uses a GaAs infrared laser beam to momentarily illuminate an object, occupant or child seat, in the manner described and illustrated in FIG. 8 of the copending patent application Ser. No. 08/040,978 cross-referenced above. The receiver can be a charge coupled device or CCD (a type of TV camera) to receive the reflected light. The laser can either be used in a scanning mode, or, through the use of a lens, a cone of light can be created which covers a large portion of the object. In these configurations, the light can be accurately controlled to illuminate only particular positions of interest within the vehicle. In the scanning mode, the receiver need only comprise a single or a few active elements, while in the case of the cone of light, an array of active elements is needed. The laser system has one additional significant advantage in that the distance to the illuminated object can be determined as disclosed in the 08/040,978 patent application.

In a simpler case, light generated by a non-coherent light emitting diode device is used to illuminate the desired area. In this case, the area covered is not as accurately controlled and a larger CCD array is required. Recently, however, the cost of CCD arrays has dropped substantially with the result that this configuration is now the most cost effective system for monitoring the passenger compartment as long as the distance from the transmitter to the objects is not needed. If this distance is required, then either the laser system, a stereographic system, a focusing system, or a combined ultrasonic and optic system is required. A mechanical focusing system, such as used on some camera systems, can determine the initial position of an occupant but is too slow to monitor his/her position during a crash. A distance measuring system based on focusing is described in U.S. Pat. No. 5,193,124 (Subbarao) which can either be used with a mechanical focusing system or with two cameras, the latter of which would be fast enough. Although the Subbarao patent provides a good discussion of the camera focusing art and is therefore included herein by reference, it is a more complicated system than is needed for practicing the instant invention. In fact, a neural network can also be trained to perform the distance determination based on the two images taken with different camera settings, or from two adjacent CCDs and lenses having different properties as in the cameras disclosed in Subbarao, making this technique practical for the purposes of this instant invention. Distance can also be determined by the system disclosed in U.S. Pat. No. 5,003,166 (Girod) by the spreading or defocusing of a pattern of structured light projected onto the object of interest.
In each of these cases, regardless of the distance measurement system used, a trained pattern recognition system, as defined above, is used in the instant invention to identify and classify, and in some cases to locate, the illuminated object and its constituent parts.
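Of the distance-measuring options listed above, the stereographic one reduces to a one-line triangulation once the same feature has been matched in both images. The focal length, baseline and disparity values below are illustrative assumptions, not parameters from the patent:

```python
def distance_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth from a stereo pair of imagers: Z = f * B / d, where f is
    the focal length in pixels, B the separation of the two CCD arrays
    in meters, and d the pixel disparity of the matched feature."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. a feature shifted 100 px between two cameras 12 cm apart, f = 500 px
depth_m = distance_from_disparity(500.0, 0.12, 100.0)
```

Note the inverse relation between disparity and depth: nearby objects (large disparity) are resolved much more precisely than distant ones, which suits an occupant-position sensor where the region of interest is close to the airbag.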
5. Optics And Acoustics

The laser systems described above are expensive due to the requirement that they be modulated at a high frequency if the distance from the airbag to the occupant, for example, needs to be measured. Both laser and non-laser optical systems in general are good at determining the location of objects within the two dimensional plane of the image, and the modulated laser system in the scanning mode can determine the distance of each part of the image from the receiver. It is also possible to determine distance with the non-laser system by focusing, as discussed above, or stereographically if two spaced apart receivers are used, and in some cases the mere location in the field of view can be used to estimate the position relative to the airbag, for example. Finally, a recently developed pulsed quantum well diode laser does provide inexpensive distance measurements as discussed below.
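The high-frequency modulation requirement mentioned above exists because a modulated beam measures range through the phase shift its envelope accumulates over the round trip. The formula below is the generic continuous-wave ranging relation, offered as a sketch rather than the patent's circuit; the 144 MHz value merely echoes a frequency legible in FIG. 4:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad, mod_freq_hz):
    """Range from the phase shift of amplitude-modulated light: a target
    at distance d delays the modulation envelope by the round trip
    2*d/C, i.e. phase = 2*pi*f*(2*d)/C, so d = C*phase/(4*pi*f)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# A half cycle (pi radians) of phase shift at 144 MHz
d = range_from_phase(math.pi, 144e6)
```

The higher the modulation frequency, the more phase shift per meter and hence the finer the range resolution, which is why inexpensive low-frequency modulation was long inadequate for this task.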
Acoustic systems are also quite effective at distance measurements since the relatively low speed of sound permits simple electronic circuits to be designed and minimal microprocessor capability is required. If a coordinate system is used where the z axis is from the transducer to the occupant, acoustics are good at measuring z dimensions while simple optical systems using a single CCD are good at measuring x and y dimensions. The combination of acoustics and optics, therefore, permits all three measurements to be made with low cost components.

One example of a system using these ideas is an optical system which floods the passenger seat with infrared light, coupled with a lens and CCD array which receives and displays the reflected light, and an analog to digital converter (ADC) which digitizes the output of the CCD and feeds it to an Artificial Neural Network (ANN) or other pattern recognition system for analysis. This system uses an ultrasonic transmitter and receiver for measuring the distances to the objects located in the passenger seat. The receiving transducer feeds its data into an ADC and from there into the ANN. The same ANN can be used for both systems, thereby providing full three dimensional data for the ANN to analyze. This system, using low cost components, will permit accurate identification and distance measurements not possible by either system acting alone. If a phased array system is added to the acoustic part of the system, as disclosed in copending patent application (ATI-102), the optical part can determine the location of the driver's ears, for example, and the phased array can direct a narrow beam to the location and determine the distance to the occupant's ears.
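The division of labor described in this section, acoustics for the z dimension and a single CCD for x and y, can be sketched as follows. The echo delay, focal length and optical-center values are illustrative assumptions:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def ultrasonic_z(echo_delay_s):
    """z distance from transducer to occupant: the echo traverses the
    path twice, so the one-way distance is half the round-trip length."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def ccd_xy(px, py, z, focal_px=500.0, cx=320.0, cy=240.0):
    """Back-project a CCD pixel (px, py) to x, y coordinates at the
    depth z supplied by the acoustic sensor (pinhole camera model;
    cx, cy is the optical center, focal_px the focal length in pixels)."""
    return (px - cx) * z / focal_px, (py - cy) * z / focal_px

# Fusing the two sensors yields a full 3-D point from low cost parts:
z = ultrasonic_z(0.004)         # a 4 ms echo places the surface ~0.69 m away
x, y = ccd_xy(420.0, 260.0, z)  # lateral offsets at that depth
```

As the text notes, neither sensor alone supplies all three coordinates: the ultrasonic channel fixes the depth, and the image fixes where in the cross-sectional plane that depth applies.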
`
`
`
`
6. Applications

The applications for this technology are numerous as described in the copending patent applications listed above. They include: (i) the monitoring of the occupant for safety purposes to prevent airbag deployment induced injuries, (ii) the locating of the eyes of the occupant to permit automatic adjustment of the rear view mirror(s), (iii) t