Breed et al.
`
`US006772057B2
`
(10) Patent No.:     US 6,772,057 B2
(45) Date of Patent: Aug. 3, 2004
`
(54) VEHICULAR MONITORING SYSTEMS
`USING IMAGE PROCESSING
`
(56) References Cited
`
(75) Inventors: David S. Breed, Boonton Township,
`Morris County, NJ (US); Wilbur E.
`DuVall, Kimberling City, MO (US);
`Wendell C. Johnson, Signal Hill, CA
`(US)
`
`(73) Assignee: Automotive Technologies
`International, Inc., Denville, NJ (US)
`
(*) Notice: Subject to any disclaimer, the term of this
patent is extended or adjusted under 35
U.S.C. 154(b) by 0 days.
`
`U.S. PATENT DOCUMENTS
4,496,222 A    1/1985  Shah ........................ 3592/300
`
`(List continued on next page.)
`
FOREIGN PATENT DOCUMENTS
EP    0885782    12/1998
`
`(List continued on next page.)
`
`OTHER PUBLICATIONS
`
`(21) Appl. No.: 10/302,105
`
(22) Filed:    Nov. 22, 2002
`
(65) Prior Publication Data
US 2003/0125855 A1    Jul. 3, 2003
`
`Related U.S. Application Data
`
`(63) Continuation-in-part of application No. 10/116,808, filed on
`Apr. 5, 2002, which is a continuation-in-part of application
`No. 09/925,043, filed on Aug. 8, 2001, now Pat. No.
`6,507,779, which is a continuation-in-part of application No.
`09/765,559, filed on Jan. 19, 2001, now Pat. No. 6,553,296,
`and a continuation-in-part of application No. 09/389,947,
`filed on Sep. 3, 1999, now Pat. No. 6,393,133, and a
`continuation-in-part of application No. 09/838,919, filed on
Apr. 20, 2001, now Pat. No. 6,442,465, which is a
continuation-in-part of application No. 09/765,559, which is a
`continuation-in-part of application No. 09/476,255, filed on
Dec. 30, 1999, now Pat. No. 6,324,453, and a
continuation-in-part of application No. 09/389,947, which is a
continuation-in-part of application No. 09/200,614, filed on Nov. 30,
`1998, now Pat. No. 6,141,432, which is a continuation of
`application No. 08/474,786, filed on Jun. 7, 1995, now Pat.
`No. 5,845,000.
`(60) Provisional application No. 60/114,507, filed on Dec. 31,
`1998.
`
(51) Int. Cl.7 ................................................ B60R 21/32
(52) U.S. Cl. ........................ 701/45; 340/573.1; 348/77;
180/271; 280/735; 382/181
(58) Field of Search ............................ 701/45, 36, 301;
340/461, 435, 436, 438, 815.4, 573.1; 348/148,
77, 154; 180/271; 280/735, 728.1; 382/181,
115, 190, 224, 100, 104
`
`
"Analysis of Hidden Units in a Layered Network Trained to
Classify Sonar Targets", R. Paul Gorman et al., Neural
Networks, vol. 1, pp. 75-89, 1988.
`
`(List continued on next page.)
`
`Primary Examiner-Thomas G. Black
Assistant Examiner-Tuan C. To
`(74) Attorney, Agent, or Firm-Brian Roffe
`
(57) ABSTRACT
`
Vehicular monitoring arrangement for monitoring an environment of the vehicle including at least one active pixel camera for obtaining images of the environment of the vehicle and a processor coupled to the active pixel camera(s) for determining at least one characteristic of an object in the environment based on the images obtained by the active pixel camera(s). The active pixel camera can be arranged in a headliner, roof or ceiling of the vehicle to obtain images of an interior environment of the vehicle, in an A-pillar or B-pillar of the vehicle to obtain images of an interior environment of the vehicle, or in a roof, ceiling, B-pillar or C-pillar of the vehicle to obtain images of an interior environment of the vehicle behind a front seat of the vehicle. The determined characteristic can be used to enable optimal control of a reactive component, system or subsystem coupled to the processor. When the reactive component is an airbag assembly including at least one airbag, the processor can be designed to control at least one deployment parameter of the airbag(s).
`
`86 Claims, 19 Drawing Sheets
`
`IPR2013-00419 - Ex. 1001
`Toyota Motor Corp., Petitioner
`
`
`U.S. PATENT DOCUMENTS
`
4,625,329 A    11/1986  Ishikawa et al. ............ 382/104
4,648,052 A     3/1987  Friedman et al. ............ 364/550
4,720,189 A     1/1988  Heynen et al. .............. 351/210
4,768,088 A     8/1988  Ando ........................ 358/93
4,836,670 A     6/1989  Hutchinson ................. 351/210
4,881,270 A    11/1989  Knecht et al. ............... 382/17
4,906,940 A     3/1990  Greene et al. ............... 382/16
4,950,069 A     8/1990  Hutchinson ................. 351/210
4,966,388 A    10/1990  Warner et al. .............. 280/730
5,003,166 A     3/1991  Girod .................... 250/201.4
5,008,946 A     4/1991  Ando .......................... 382/2
5,026,153 A     6/1991  Suzuki et al. ................. 356/1
5,060,278 A    10/1991  Fukumizu ................... 382/157
5,062,696 A    11/1991  Oshima et al. .............. 359/554
5,064,274 A    11/1991  Alten ....................... 359/604
5,071,160 A    12/1991  White et al. ................ 280/735
5,074,583 A    12/1991  Fujita et al. ............... 280/735
5,103,305 A     4/1992  Watanabe ................... 358/105
5,118,134 A     6/1992  Mattes et al. ............... 280/735
5,162,861 A    11/1992  Tamburino et al. .......... 356/5.05
5,181,254 A     1/1993  Schweizer et al. .............. 382/1
5,185,667 A     2/1993  Zimmermann ................. 348/143
5,193,124 A     3/1993  Subbarao ..................... 382/41
5,214,744 A     5/1993  Schweizer et al. ............. 395/21
5,227,784 A     7/1993  Masamori et al. ............ 340/903
5,235,339 A     8/1993  Morrison et al. ............ 342/159
5,249,027 A     9/1993  Mathur et al. .............. 356/3.14
5,249,157 A     9/1993  Taylor ...................... 340/903
5,298,732 A     3/1994  Chen ...................... 250/203.4
5,305,012 A     4/1994  Faris ......................... 345/7
5,309,137 A *   5/1994  Kajiwara .................... 340/436
5,329,206 A     7/1994  Slotkowski et al. .......... 315/159
5,330,226 A     7/1994  Gentry et al. ............... 280/735
5,339,075 A     8/1994  Abst et al. ................. 340/903
5,355,118 A    10/1994  Fukuhara .................... 340/435
5,390,136 A     2/1995  Wang ........................ 364/754
5,441,052 A     8/1995  Miyajima ................ 128/661.09
5,446,661 A     8/1995  Gioutsos et al. ........ 364/424.05
5,454,591 A    10/1995  Mazur et al. ................ 280/735
5,463,384 A *  10/1995  Juds ........................ 340/903
5,473,515 A *  12/1995  Liu ......................... 362/80.1
5,482,314 A     1/1996  Corrado et al. ............. 280/735
5,497,305 A *   3/1996  Pastrick et al. ............ 362/83.1
5,528,698 A     6/1996  Kamei et al. ................ 382/100
5,531,472 A     7/1996  Semchena et al. ............ 280/735
5,537,003 A     7/1996  Bechtel et al. ............... 315/82
5,550,677 A     8/1996  Schofield et al. ........... 359/604
5,563,650 A    10/1996  Poelstra ..................... 348/36
5,653,462 A     8/1997  Breed et al. ................ 280/735
5,706,144 A *   1/1998  Brandin ..................... 359/843
5,785,347 A     7/1998  Adolph et al. ............... 280/735
5,821,633 A    10/1998  Burke et al. ............... 307/10.1
5,829,782 A    11/1998  Breed et al. ................ 280/735
5,835,613 A    11/1998  Breed et al. ................ 382/100
5,845,000 A    12/1998  Breed et al. ................ 382/100
5,848,802 A    12/1998  Breed et al. ................ 280/735
5,877,897 A *   3/1999  Schofield et al. ........... 359/604
5,943,295 A     8/1999  Varga et al. ................. 367/99
5,949,331 A *   9/1999  Schofield et al. ........... 340/461
5,954,360 A     9/1999  Griggs, III et al. ......... 280/735
5,959,367 A *   9/1999  O'Farrell et al. ........... 307/10.1
5,983,147 A    11/1999  Krumm ........................ 701/45
6,005,958 A    12/1999  Farmer et al. ............... 382/103
6,007,095 A    12/1999  Stanley ..................... 280/735
6,020,812 A     2/2000  Thompson et al. ............ 340/438
6,027,138 A     2/2000  Tanaka et al. ............... 280/735
`
6,029,105 A        2/2000  Schweizer .................... 701/45
6,087,953 A *      7/2000  Deline et al. ............ 340/815.4
6,111,517 A        8/2000  Atick et al. ............ 340/825.34
6,113,137 A        9/2000  Mizutani et al. ............ 280/735
6,115,552 A        9/2000  Kaneda ....................... 396/82
2002/0154379 A1 * 10/2002  Tonar et al. ................ 359/267
`FOREIGN PATENT DOCUMENTS
`
GB    2289332        11/1995
JP    360166806       8/1985
JP    3-42337         2/1991
JP    407055573 A     3/1995
JP    2001-325700    11/2001
WO    94/22693       10/1994  ................. 180/273
WO    0196147        12/2001
`
`OTHER PUBLICATIONS
`
Learned Classification of Sonar Targets Using a Massively
Parallel Network, R. Paul Gorman et al., IEEE Transactions
on Acoustics, Speech and Signal Processing, vol. 36, No. 7,
Jul. 1988, pp. 1135-1140.
"How Airbags Work", David S. Breed, Presented at the
Canadian Association of Road Safety Professionals, Oct. 19,
1992-Oct. 20, 1992.
Intelligent System for Video Monitoring of Vehicle Cockpit,
S. Boverie et al., SAE Paper No. 980613, Feb. 1998.
Omnidirectional Vision Sensor for Intelligent Vehicles, T.
Ito et al., 1998 IEEE International Conference on Intelligent
Vehicles, pp. 365-370, 1998.
A 256x256 CMOS Brightness Adaptive Imaging Array with
Column-Parallel Digital Output, C. Sodini et al., 1998 IEEE
International Conference on Intelligent Vehicles, 1998, pp.
347-352.
Derwent Abstract of German Patent Publication No. DE 42
11 556, Oct. 7, 1993.
Derwent Abstract of Japanese patent application No.
02-051332, Nov. 13, 1991.
3D Perception for Vehicle Inner Space Monitoring, S.
Boverie et al., Advanced Microsystems for Automotive
Applications 2000, Apr. 2000, pp. 157-172.
Low-Cost High Speed CMOS Camera for Automotive
Applications, N. Stevanovic et al., Advanced Microsystems
for Automotive Applications 2000, Apr. 2000, pp. 173-180.
New Powerful Sensory Tool in Automotive Safety Systems
Based on PMD-Technology, R. Schwarte et al., Advanced
Microsystems for Automotive Applications 2000, Apr. 2000,
pp. 181-203.
An Interior Compartment Protection System Based on
Motion Detection Using CMOS Imagers, S. B. Park et al.,
1998 IEEE International Conference on Intelligent Vehicles.
Sensing Automobile Occupant Position with Optical Triangulation,
W. Chapelle et al., Sensors, Dec. 1995.
Intelligent System for Video Monitoring of Vehicle Cockpit,
S. Boverie et al., SAE Paper No. 980613, Feb. 23-26, 1998.
A 256x256 CMOS Brightness Adaptive Imaging Array with
Column-Parallel Digital Output, C. G. Sodini et al., 1998
IEEE International Conference on Intelligent Vehicles.
The FERET Evaluation Methodology for Face-Recognition
Algorithms, P. J. Phillips et al., NISTIR 6264, Jan. 7, 1999.
The Technology Review Ten: Biometrics, J. Atick, Jan./Feb.
2001.
`* cited by examiner
`
`
`
`
U.S. Patent          Aug. 3, 2004          Sheet 1 of 19          US 6,772,057 B2

[FIG. 1A: drawing; reference numerals include 100, 105, 110]
`
`
`
Sheet 2 of 19: [FIG. 1B: drawing; reference numerals include 100, 104, 112]
`
`
`
Sheet 3 of 19: [FIG. 1C: drawing; reference numerals include 105]
`
`
`
Sheet 4 of 19: [Drawing; figure caption not legible in scan; reference numerals include 105]
`
`
`
Sheet 5 of 19: [Drawing; figure caption not legible in scan; reference numerals include 105]
`
`
`
Sheet 6 of 19: [FIG. 2A: drawing]
`
`
`
Sheet 7 of 19: [Drawing; figure caption not legible in scan]
`
`
`
Sheet 8 of 19: [FIG. 3: schematic; legible labels: INFLATABLE RESTRAINT SYSTEM, OCCUPANT POSITION SENSORS, ARMING SENSOR, CRASH SENSOR ELECTRONIC, DIAGNOSTIC UNIT, 12 VOLTS, GROUND]
`
`
`
Sheet 9 of 19: [FIG. 4: circuit block diagram; legible labels: crystal oscillator f1 48 MHz (401), frequency tripler (402) to 3f1 144 MHz, diode driver (403), crystal oscillator f2 48.05 MHz with frequency tripler to 144.15 MHz, receive pre-amplifier (409), automatic gain control, filter (408), 150 KHz filter (3f2-3f1), phase adjust, phase detector (0 to 2 pi), detect no signal, amplifier, distance D=RX and velocity outputs; further numerals 405, 407, 411, 413, 414, 416]
`
`
`
Sheet 10 of 19: [Drawing; figure caption not legible in scan; reference numerals include 110, 111, 120]
`
`
`
Sheet 11 of 19: [FIG. 6: drawing; reference numerals include 630, 631]
`
`
`
Sheet 12 of 19: [Drawing; figure caption not legible in scan]
`
`
`
Sheet 13 of 19: [FIG. 8: drawing]
`
`
`
Sheet 14 of 19: [Drawing; figure caption not legible in scan]
`
`
`
`
`
Sheet 15 of 19: [FIG. 10: block diagram; legible labels: Passenger Compartment (500), Optical Image Reception (502), Data Derivation (504), Training Phase of Pattern Recognition Algorithm (506), Component (510) or other Vehicle Security System, Optical Transmitter (Optional) (512)]
`
`
`
Sheet 16 of 19: [FIG. 11: flowchart; legible steps: 514 Set System to Training Phase; 516 Obtain Images Including Authorized Driver(s); 518 Train Pattern Recognition Algorithm; 520 Set System Operational; 522; 524 Apply Images to Pattern Recognition Algorithm; 526 Does Pattern Recognition Algorithm Indicate that Image Includes Authorized Driver? Yes: 528 Enable Ignition of Vehicle; No: 530 Sound Alarm/Contact Police]
`
`
`
Sheet 17 of 19: [FIG. 12 and FIG. 13: drawings; reference numerals include 602, 604, 606, 610, 612, 614]
`
`
`
Sheet 18 of 19: [Drawing; figure caption and labels not legible in scan]
`
`
`
Sheet 19 of 19: [FIG. 15: block diagram; legible labels: Exterior object; Optional transmitter 730; Receiver(s) (single or multiple) 734, 736, 736; Optional measurement system (radar) 746; Electronic module/processor (transmitter drive circuitry, signal processing circuitry-neural computer) 740, 742, 744, 745; Display to driver/Airbag control/headlight dimmer control/Other system control 748]
`
`
`
`VEHICULAR MONITORING SYSTEMS
`USING IMAGE PROCESSING
`
`CROSS REFERENCE TO RELATED
`APPLICATIONS
`
This application is a continuation-in-part of U.S. patent application Ser. No. 10/116,808 filed Apr. 5, 2002 which is:
1) a continuation-in-part of U.S. patent application Ser. No. 09/925,043 filed Aug. 8, 2001, now U.S. Pat. No. 6,507,779, which is:
a) a continuation-in-part of U.S. patent application Ser. No. 09/765,559 filed Jan. 19, 2001, now U.S. Pat. No. 6,553,296; and
b) a continuation-in-part of U.S. patent application Ser. No. 09/389,947 filed Sep. 3, 1999, now U.S. Pat. No. 6,393,133; and
2) a continuation-in-part of U.S. patent application Ser. No. 09/838,919 filed Apr. 20, 2001, now U.S. Pat. No. 6,442,465, which is:
a) a continuation-in-part of U.S. patent application Ser. No. 09/765,559 filed Jan. 19, 2001 which is a continuation-in-part of U.S. patent application Ser. No. 09/476,255 filed Dec. 30, 1999, now U.S. Pat. No. 6,324,453, which claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/114,507 filed Dec. 31, 1998; and
b) a continuation-in-part of U.S. patent application Ser. No. 09/389,947 filed Sep. 3, 1999, now U.S. Pat. No. 6,393,133, which is a continuation-in-part of U.S. patent application Ser. No. 09/200,614, filed Nov. 30, 1998, now U.S. Pat. No. 6,141,432, which is a continuation of U.S. patent application Ser. No. 08/474,786 filed Jun. 7, 1995, now U.S. Pat. No. 5,845,000, all of which are incorporated by reference herein.
This application claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/114,507 filed Dec. 31, 1998 through the parent applications.
`
`FIELD OF THE INVENTION
`
`The present invention relates to apparatus and methods
`for monitoring environments in and outside of a vehicle
`using image processing.
`The present invention also relates to arrangements for
`detecting the presence, type and/or position of occupants in
`vehicles and objects exterior of vehicles, e.g., in a driver's
`blind spot, primarily using optics.
The present invention also relates to apparatus and methods for determining a distance between objects in an environment in and outside of a vehicle by image processing techniques.
`
`BACKGROUND OF THE INVENTION
1. Prior Art on Out of Position Occupants and Rear Facing Child Seats
Whereas thousands of lives have been saved by airbags, a large number of people have also been injured, some seriously, by the deploying airbag, and over 100 people have now been killed. Thus, significant improvements need to be made to airbag systems. As discussed in detail in U.S. Pat. No. 5,653,462 referenced above, for a variety of reasons vehicle occupants may be too close to the airbag before it deploys and can be seriously injured or killed as a result of the deployment thereof. Also, a child in a rear facing child seat that is placed on the right front passenger seat is in danger of being seriously injured if the passenger airbag deploys. For these reasons and, as first publicly disclosed in Breed, D. S. "How Airbags Work" presented at the International Conference on Seatbelts and Airbags in 1993, in Canada, occupant position sensing and rear facing child seat detection systems are required.
Initially, these systems will solve the out-of-position occupant and the rear facing child seat problems related to current airbag systems and prevent unneeded airbag deployments when a front seat is unoccupied. However, airbags are now under development to protect rear seat occupants in vehicle crashes and all occupants in side impacts. A system will therefore be needed to detect the presence of occupants, determine if they are out-of-position and identify the presence of a rear facing child seat in the rear seat. Future automobiles are expected to have eight or more airbags as protection is sought for rear seat occupants and from side impacts. In addition to eliminating the disturbance and possible harm of unnecessary airbag deployments, the cost of replacing these airbags will be excessive if they all deploy needlessly in an accident.
Inflators now exist which will adjust the amount of gas flowing to the airbag to account for the size and position of the occupant and for the severity of the accident. The vehicle identification and monitoring system (VIMS) discussed in U.S. Pat. No. 5,829,782 will control such inflators based on the presence and position of vehicle occupants or of a rear facing child seat. As discussed more fully below, the instant invention is an improvement on that VIMS system and uses an advanced optical system comprising one or more CCD (charge coupled device) or CMOS arrays, and particularly active pixel arrays, plus a source of illumination, preferably combined with a trained neural network pattern recognition system.
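The control chain described here — classify the occupant, then adjust the inflator accordingly — can be sketched as follows. This is an illustrative sketch only: the class names, distance thresholds and inflation levels are assumptions for the example and are not values specified in the patent.

```python
# Illustrative sketch: map a classified occupant state and a measured
# occupant-to-airbag distance to deployment parameters. All class names,
# thresholds and levels below are hypothetical, not taken from the patent.

def deployment_decision(occupant_class: str, distance_to_airbag_m: float) -> dict:
    """Return a suppression flag and an inflation level in [0.0, 1.0]."""
    if occupant_class in ("empty_seat", "rear_facing_child_seat"):
        return {"deploy": False, "inflation": 0.0}  # suppress deployment
    if distance_to_airbag_m < 0.15:                 # occupant out of position
        return {"deploy": False, "inflation": 0.0}
    if distance_to_airbag_m < 0.30:                 # close occupant: depower
        return {"deploy": True, "inflation": 0.5}
    return {"deploy": True, "inflation": 1.0}       # normal full deployment

print(deployment_decision("rear_facing_child_seat", 0.4))
```

The point of the sketch is the structure, not the numbers: the classifier output gates the decision entirely (suppression for a rear facing child seat), while the position measurement modulates the deployment parameter.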
Others have observed the need for an occupant out-of-position sensor and several methods have been disclosed in U.S. patents for determining the position of an occupant of a motor vehicle. Each of these systems, however, has significant limitations. For example, in White et al. (U.S. Pat. No. 5,071,160), a single acoustic sensor and detector is described and, as illustrated, is mounted lower than the steering wheel. White et al. correctly perceive that such a sensor could be defeated, and the airbag falsely deployed, by an occupant adjusting the control knobs on the radio and thus they suggest the use of a plurality of such sensors.
Mattes et al. (U.S. Pat. No. 5,118,134) describe a variety of methods of measuring the change in position of an occupant including ultrasonic, active or passive infrared and microwave radar sensors, and an electric eye. The sensors measure the change in position of an occupant during a crash and use that information to assess the severity of the crash and thereby decide whether or not to deploy the airbag. They are thus using the occupant motion as a crash sensor. No mention is made of determining the out-of-position status of the occupant or of any of the other features of occupant monitoring as disclosed in one or more of the above-referenced patents and patent applications. It is interesting to note that nowhere do Mattes et al. discuss how to use active or passive infrared to determine the position of the occupant. As pointed out in one or more of the above-referenced patents and patent applications, direct occupant position measurement based on passive infrared is probably not possible and, until very recently, was very difficult and expensive with active infrared requiring the modulation of an expensive GaAs infrared laser. Since there is no mention of these problems, the method of use contemplated by Mattes et al. must be similar to the electric eye concept where position is measured indirectly as the occupant passes by a plurality of longitudinally spaced-apart sensors.
`
The object of an occupant out-of-position sensor is to determine the location of the head and/or chest of the vehicle occupant relative to the airbag since it is the impact of either the head or chest with the deploying airbag which can result in serious injuries. Both White et al. and Mattes et al. describe only lower mounting locations of their sensors in front of the occupant such as on the dashboard or below the steering wheel. Both such mounting locations are particularly prone to detection errors due to positioning of the occupant's hands, arms and legs. This would require at least three, and preferably more, such sensors and detectors and appropriate logic circuitry which ignores readings from some sensors if such readings are inconsistent with others, for the case, for example, where the driver's arms are the closest objects to two of the sensors.
White et al. also describe the use of error correction circuitry, without defining or illustrating the circuitry, to differentiate between the velocity of one of the occupant's hands as in the case where he/she is adjusting the knob on the radio and the remainder of the occupant. Three ultrasonic sensors of the type disclosed by White et al. might, in some cases, accomplish this differentiation if two of them indicated that the occupant was not moving while the third was indicating that he or she was. Such a combination, however, would not differentiate between an occupant with both hands and arms in the path of the ultrasonic transmitter at such a location that they were blocking a substantial view of the occupant's head or chest. Since the sizes and driving positions of occupants are extremely varied, it is now believed that pattern recognition systems, and preferably trained pattern recognition systems such as neural networks, are required when a clear view of the occupant, unimpeded by his/her extremities, cannot be guaranteed.
Fujita et al., in U.S. Pat. No. 5,074,583, describe another method of determining the position of the occupant but do not use this information to suppress deployment if the occupant is out-of-position. In fact, the closer the occupant gets to the airbag, the faster the inflation rate of the airbag is according to the Fujita et al. patent, which thereby increases the possibility of injuring the occupant. Fujita et al. do not measure the occupant directly but instead determine his or her position indirectly from measurements of the seat position and the vertical size of the occupant relative to the seat (occupant height). This occupant height is determined using an ultrasonic displacement sensor mounted directly above the occupant's head.
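The indirect height measurement attributed to Fujita et al. amounts to a time-of-flight computation: the overhead ultrasonic sensor measures round-trip echo time, which gives the sensor-to-head distance, and the head height follows from the known sensor mounting height. A minimal sketch, with an assumed sensor geometry and speed of sound (the patent gives no such numbers):

```python
# Sketch of overhead ultrasonic displacement sensing as described for
# Fujita et al.: round-trip echo time -> sensor-to-head distance -> head
# height. The mounting height and pulse timing below are illustrative.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def occupant_head_height(sensor_height_m: float, round_trip_s: float) -> float:
    """Height of the occupant's head above the floor, from time of flight."""
    sensor_to_head = SPEED_OF_SOUND * round_trip_s / 2.0  # one-way distance
    return sensor_height_m - sensor_to_head

# Example: sensor 1.2 m above the floor, echo returns after 2.0 ms
h = occupant_head_height(1.2, 0.002)
print(round(h, 3))  # 0.857
```

As the text notes, this yields only an indirect position estimate (height plus seat position), not the head-to-airbag distance that matters for out-of-position detection.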
`As discussed above, the optical systems described herein
`are also applicable for many other sensing applications both
`inside and outside of the vehicle compartment such as for
`sensing crashes before they occur as described in U.S. Pat.
`No. 5,829,782, for a smart headlight adjustment system and
`for a blind spot monitor (also disclosed in U.S. provisional
`patent application Ser. No. 60/202,424).
2. Definitions
Preferred embodiments of the invention are described below and, unless specifically noted, it is the applicants' intention that the words and phrases in the specification and claims be given the ordinary and accustomed meaning to those of ordinary skill in the applicable art(s). If the applicant intends any other meaning, he will specifically state that he is applying a special meaning to a word or phrase.
Likewise, applicants' use of the word "function" here is not intended to indicate that the applicants seek to invoke the special provisions of 35 U.S.C. §112, sixth paragraph, to define their invention. To the contrary, if applicants wish to invoke the provisions of 35 U.S.C. §112, sixth paragraph, to define their invention, they will specifically set forth in the claims the phrases "means for" or "step for" and a function, without also reciting in that phrase any structure, material or act in support of the function. Moreover, even if applicants invoke the provisions of 35 U.S.C. §112, sixth paragraph, to define their invention, it is the applicants' intention that their inventions not be limited to the specific structure, material or acts that are described in the preferred embodiments herein. Rather, if applicants claim their inventions by specifically invoking the provisions of 35 U.S.C. §112, sixth paragraph, it is nonetheless their intention to cover and include any and all structure, materials or acts that perform the claimed function, along with any and all known or later developed equivalent structures, materials or acts for performing the claimed function.
The use of pattern recognition is important to the instant invention as well as to one or more of those disclosed in the above-referenced patents and patent applications. "Pattern recognition" as used herein will generally mean any system which processes a signal that is generated by an object, or is modified by interacting with an object, in order to determine which one of a set of classes the object belongs to. Such a system might determine only that the object is or is not a member of one specified class, or it might attempt to assign the object to one of a larger set of specified classes, or find that it is not a member of any of the classes in the set. The signals processed are generally electrical signals coming from transducers which are sensitive to either acoustic or electromagnetic radiation and, if electromagnetic, they can be either visible light, infrared, ultraviolet or radar or low frequency radiation as used in capacitive sensing systems.
A trainable or a trained pattern recognition system as used herein means a pattern recognition system which is taught various patterns by subjecting the system to a variety of examples. The most successful such system is the neural network. Not all pattern recognition systems are trained systems and not all trained systems are neural networks. Other pattern recognition systems are based on fuzzy logic, sensor fusion, Kalman filters, and correlation, as well as linear and non-linear regression. Still other pattern recognition systems are hybrids of more than one system, such as neural-fuzzy systems.
A pattern recognition algorithm will thus generally mean an algorithm applying or obtained using any type of pattern recognition system, e.g., a neural network, sensor fusion, fuzzy logic, etc.
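A trained pattern recognition system in the sense defined above need not be a neural network; any classifier taught from labeled examples qualifies. The following minimal sketch uses a nearest-centroid classifier "trained" on example feature vectors. The two features (apparent height and width of the occupying item) and the toy training data are illustrative assumptions, not features specified in the patent:

```python
# Minimal trained pattern recognition system in the sense defined above:
# taught by exposure to labeled example feature vectors, then used to
# assign a new signal to a class (nearest-centroid; data is illustrative).

def train(examples):
    """examples: list of (feature_vector, class_label) -> class centroids."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [s / counts[label] for s in acc]
            for label, acc in sums.items()}

def classify(centroids, features):
    """Assign the feature vector to the class with the nearest centroid."""
    def dist2(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda label: dist2(centroids[label]))

# Toy training set: (apparent height, apparent width) of the occupying item
examples = [([0.9, 0.4], "adult"), ([1.0, 0.5], "adult"),
            ([0.4, 0.5], "rear_facing_child_seat"),
            ([0.5, 0.6], "rear_facing_child_seat")]
centroids = train(examples)
print(classify(centroids, [0.95, 0.45]))  # adult
```

A neural network replaces the centroid table with learned weights, but the contract is the same: a training phase over examples, then a mapping from a measured signal to one of a set of classes.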
To "identify" as used herein will usually mean to determine that the object belongs to a particular set or class. The class may be one containing, for example, all rear facing child seats, one containing all human occupants, or all human occupants not sitting in a rear facing child seat, depending on the purpose of the system. In the case where a particular person is to be recognized, the set or class will contain only a single element, i.e., the person to be recognized.
To "ascertain the identity of" as used herein with reference to an object will generally mean to determine the type or nature of the object (obtain information as to what the object is), i.e., that the object is an adult, an occupied rear facing child seat, an occupied front facing child seat, an unoccupied rear facing child seat, an unoccupied front facing child seat, a child, a dog, a bag of groceries, a car, a truck, a tree, a pedestrian, a deer, etc.
An "occupying item" or "occupant" of a seat or "object" in a seat may be a living occupant such as a human being or a dog, another living organism such as a plant, or an inanimate object such as a box or bag of groceries.
`
`A "rear seat" of a vehicle as used herein will generally
`mean any seat behind the front seat on which a driver sits.
`Thus, in minivans or other large vehi