Breed et al.

US006772057B2

(10) Patent No.: US 6,772,057 B2
(45) Date of Patent: Aug. 3, 2004
`
(54) VEHICULAR MONITORING SYSTEMS
USING IMAGE PROCESSING
`
(56) References Cited
`
(75) Inventors: David S. Breed, Boonton Township,
Morris County, NJ (US); Wilbur E.
DuVall, Kimberling City, MO (US);
Wendell C. Johnson, Signal Hill, CA (US)
`
`(73) Assignee: Automotive Technologies
`International, Inc., Denville, NJ (US)
`
`( *) Notice:
`
`Subject to any disclaimer, the term of this
`patent is extended or adjusted under 35
`U.S.C. 154(b) by 0 days.
`
`U.S. PATENT DOCUMENTS
4,496,222 A 1/1985 Shah ........................ 3592/300
`(List continued on next page.)
`
`FOREIGN PATENT DOCUMENTS
EP 0885782 12/1998
`
`(List continued on next page.)
`
`OTHER PUBLICATIONS
`
`(21) Appl. No.: 10/302,105
`
(22) Filed: Nov. 22, 2002

(65) Prior Publication Data
US 2003/0125855 A1 Jul. 3, 2003
`
`Related U.S. Application Data
`
(63) Continuation-in-part of application No. 10/116,808, filed on Apr. 5, 2002, which is a continuation-in-part of application No. 09/925,043, filed on Aug. 8, 2001, now Pat. No. 6,507,779, which is a continuation-in-part of application No. 09/765,559, filed on Jan. 19, 2001, now Pat. No. 6,553,296, and a continuation-in-part of application No. 09/389,947, filed on Sep. 3, 1999, now Pat. No. 6,393,133, and a continuation-in-part of application No. 09/838,919, filed on Apr. 20, 2001, now Pat. No. 6,442,465, which is a continuation-in-part of application No. 09/765,559, which is a continuation-in-part of application No. 09/476,255, filed on Dec. 30, 1999, now Pat. No. 6,324,453, and a continuation-in-part of application No. 09/389,947, which is a continuation-in-part of application No. 09/200,614, filed on Nov. 30, 1998, now Pat. No. 6,141,432, which is a continuation of application No. 08/474,786, filed on Jun. 7, 1995, now Pat. No. 5,845,000.
(60) Provisional application No. 60/114,507, filed on Dec. 31, 1998.
`
(51) Int. Cl.7 ................................................ B60R 21/32
(52) U.S. Cl. ........................ 701/45; 340/573.1; 348/77;
180/271; 280/735; 382/181
(58) Field of Search ............................ 701/45, 36, 301;
340/461, 435, 436, 438, 815.4, 573.1; 348/148,
77, 154; 180/271; 280/735, 728.1; 382/181,
115, 190, 224, 100, 104
`
`"Analysis of Hidden Units in a Layered Network Trained to
`Classify Sonar Targets", R. Paul Gorman, et al., Neural
`Networks, vol. 1, pp. 75-89, 1988.
`
`(List continued on next page.)
`
`Primary Examiner-Thomas G. Black
Assistant Examiner-Tuan C. To
`(74) Attorney, Agent, or Firm-Brian Roffe
`
(57) ABSTRACT
`
Vehicular monitoring arrangement for monitoring an environment of the vehicle including at least one active pixel camera for obtaining images of the environment of the vehicle and a processor coupled to the active pixel camera(s) for determining at least one characteristic of an object in the environment based on the images obtained by the active pixel camera(s). The active pixel camera can be arranged in a headliner, roof or ceiling of the vehicle to obtain images of an interior environment of the vehicle, in an A-pillar or B-pillar of the vehicle to obtain images of an interior environment of the vehicle, or in a roof, ceiling, B-pillar or C-pillar of the vehicle to obtain images of an interior environment of the vehicle behind a front seat of the vehicle. The determined characteristic can be used to enable optimal control of a reactive component, system or subsystem coupled to the processor. When the reactive component is an airbag assembly including at least one airbag, the processor can be designed to control at least one deployment parameter of the airbag(s).
`
`86 Claims, 19 Drawing Sheets
`
[Representative drawing; reference numerals 810, 811, 120]
`IPR2013-00424 - Ex. 1011
`Toyota Motor Corp., Petitioner
`
`
`
US 6,772,057 B2
`Page 2
`
`U.S. PATENT DOCUMENTS
`
4,625,329 A    11/1986  Ishikawa et al. ............ 382/104
4,648,052 A     3/1987  Friedman et al. ............ 364/550
4,720,189 A     1/1988  Heynen et al. .............. 351/210
4,768,088 A     8/1988  Ando ........................ 358/93
4,836,670 A     6/1989  Hutchinson ................. 351/210
4,881,270 A    11/1989  Knecht et al. ............... 382/17
4,906,940 A     3/1990  Greene et al. ............... 382/16
4,950,069 A     8/1990  Hutchinson ................. 351/210
4,966,388 A    10/1990  Warner et al. .............. 280/730
5,003,166 A     3/1991  Girod .................... 250/201.4
5,008,946 A     4/1991  Ando .......................... 382/2
5,026,153 A     6/1991  Suzuki et al. ................ 356/1
5,060,278 A    10/1991  Fukumizu ................... 382/157
5,062,696 A    11/1991  Oshima et al. .............. 359/554
5,064,274 A    11/1991  Allen ...................... 359/604
5,071,160 A    12/1991  White et al. ............... 280/735
5,074,583 A    12/1991  Fujita et al. .............. 280/735
5,103,305 A     4/1992  Watanabe ................... 358/105
5,118,134 A     6/1992  Mattes et al. .............. 280/735
5,162,861 A    11/1992  Tamburino et al. .......... 356/5.05
5,181,254 A     1/1993  Schweizer et al. ............. 382/1
5,185,667 A     2/1993  Zimmermann ................. 348/143
5,193,124 A     3/1993  Subbarao .................... 382/41
5,214,744 A     5/1993  Schweizer et al. ............ 395/21
5,227,784 A     7/1993  Masamori et al. ............ 340/903
5,235,339 A     8/1993  Morrison et al. ............ 342/159
5,249,027 A     9/1993  Mathur et al. ............. 356/3.14
5,249,157 A     9/1993  Taylor ..................... 340/903
5,298,732 A     3/1994  Chen ..................... 250/203.4
5,305,012 A     4/1994  Faris ......................... 345/7
5,309,137 A *   5/1994  Kajiwara ................... 340/436
5,329,206 A     7/1994  Slotkowski et al. .......... 315/159
5,330,226 A     7/1994  Gentry et al. .............. 280/735
5,339,075 A     8/1994  Abst et al. ................ 340/903
5,355,118 A    10/1994  Fukuhara ................... 340/435
5,390,136 A     2/1995  Wang ....................... 364/754
5,441,052 A     8/1995  Miyajima ................ 128/661.09
5,446,661 A     8/1995  Gioutsos et al. ........ 364/424.05
5,454,591 A    10/1995  Mazur et al. ............... 280/735
5,463,384 A *  10/1995  Juds ........................ 340/903
5,473,515 A *  12/1995  Liu ........................ 362/80.1
5,482,314 A     1/1996  Corrado et al. ............. 280/735
5,497,305 A *   3/1996  Pastrick et al. ........... 362/83.1
5,528,698 A     6/1996  Kamei et al. ............... 382/100
5,531,472 A     7/1996  Semchena et al. ............ 280/735
5,537,003 A     7/1996  Bechtel et al. .............. 315/82
5,550,677 A     8/1996  Schofield et al. ........... 359/604
5,563,650 A    10/1996  Poelstra .................... 348/36
5,653,462 A     8/1997  Breed et al. ............... 280/735
5,706,144 A *   1/1998  Brandin .................... 359/843
5,785,347 A     7/1998  Adolph et al. .............. 280/735
5,821,633 A    10/1998  Burke et al. .............. 307/10.1
5,829,782 A    11/1998  Breed et al. ............... 280/735
5,835,613 A    11/1998  Breed et al. ............... 382/100
5,845,000 A    12/1998  Breed et al. ............... 382/100
5,848,802 A    12/1998  Breed et al. ............... 280/735
5,877,897 A *   3/1999  Schofield et al. ........... 359/604
5,943,295 A     8/1999  Varga et al. ................ 367/99
5,949,331 A *   9/1999  Schofield et al. ........... 340/461
5,954,360 A     9/1999  Griggs, III et al. ......... 280/735
5,959,367 A *   9/1999  O'Farrell et al. .......... 307/10.1
5,983,147 A    11/1999  Krumm ....................... 701/45
6,005,958 A    12/1999  Farmer et al. .............. 382/103
6,007,095 A    12/1999  Stanley .................... 280/735
6,020,812 A     2/2000  Thompson et al. ............ 340/438
6,027,138 A     2/2000  Tanaka et al. .............. 280/735
`
6,029,105 A        2/2000  Schweizer ................... 701/45
6,087,953 A *      7/2000  DeLine et al. ............ 340/815.4
6,111,517 A        8/2000  Atick et al. ............ 340/825.34
6,113,137 A        9/2000  Mizutani et al. ............ 280/735
6,115,552 A        9/2000  Kaneda ...................... 396/82
2002/0154379 A1 * 10/2002  Tonar et al. ............... 359/267

FOREIGN PATENT DOCUMENTS
`
GB  2289332       11/1995  ................. 180/273
JP  360166806      8/1985
JP  3-42337        2/1991
JP  407055573 A    3/1995
JP  2001-325700   11/2001
WO  94/22693      10/1994
WO  0196147       12/2001
`
`OTHER PUBLICATIONS
`
Learned Classification of Sonar Targets Using a Massively Parallel Network, R. Paul Gorman et al., IEEE Transactions on Acoustics, Speech and Signal Processing, vol. 36, No. 7, Jul. 1988, pp. 1135-1140.
"How Airbags Work", David S. Breed, Presented at the Canadian Association of Road Safety Professionals, Oct. 19, 1992-Oct. 20, 1992.
Intelligent System for Video Monitoring of Vehicle Cockpit, S. Boverie et al., SAE Paper No. 980613, Feb. 1998.
Omnidirectional Vision Sensor for Intelligent Vehicles, T. Ito et al., 1998 IEEE International Conference on Intelligent Vehicles, pp. 365-370, 1998.
A 256x256 CMOS Brightness Adaptive Imaging Array with Column-Parallel Digital Output, C. Sodini et al., 1998 IEEE International Conference on Intelligent Vehicles, 1998, pp. 347-352.
Derwent Abstract of German Patent Publication No. DE 42 11 556, Oct. 7, 1993.
Derwent Abstract of Japanese patent application No. 02-051332, Nov. 13, 1991.
3D Perception for Vehicle Inner Space Monitoring, S. Boverie et al., Advanced Microsystems for Automotive Applications 2000, Apr. 2000, pp. 157-172.
Low-Cost High Speed CMOS Camera for Automotive Applications, N. Stevanovic et al., Advanced Microsystems for Automotive Applications 2000, Apr. 2000, pp. 173-180.
New Powerful Sensory Tool in Automotive Safety Systems Based on PMD-Technology, R. Schwarte et al., Advanced Microsystems for Automotive Applications 2000, Apr. 2000, pp. 181-203.
An Interior Compartment Protection System Based on Motion Detection Using CMOS Imagers, S. B. Park et al., 1998 IEEE International Conference on Intelligent Vehicles.
Sensing Automobile Occupant Position with Optical Triangulation, W. Chapelle et al., Sensors, Dec. 1995.
Intelligent System for Video Monitoring of Vehicle Cockpit, S. Boverie et al., SAE Paper No. 980613, Feb. 23-26, 1998.
A 256x256 CMOS Brightness Adaptive Imaging Array with Column-Parallel Digital Output, C. G. Sodini et al., 1998 IEEE International Conference on Intelligent Vehicles.
The FERET Evaluation Methodology for Face-Recognition Algorithms, P. J. Phillips et al., NISTIR 6264, Jan. 7, 1999.
The Technology Review Ten: Biometrics, J. Atick, Jan./Feb. 2001.
* cited by examiner
`
`
`
`
[Sheet 1 of 19 - FIG. 1A (drawing; reference numerals include 100, 105, 110)]
`
`
`
[Sheet 2 of 19 - drawing; reference numerals include 112]
`
`
`
[Sheet 3 of 19 - FIG. 1C (drawing; reference numerals include 100, 105, 110, 113)]
`
`
`
[Sheet 4 of 19 - FIG. 1D (drawing; reference numerals include 100, 101, 114)]
`
`
`
[Sheet 5 of 19 - FIG. 1B (drawing; reference numerals include 100, 101, 110)]
`
`
`
[Sheet 6 of 19 - FIG. 2A (drawing)]
`
`
`
[Sheet 7 of 19 - drawing]
`
`
`
[Sheet 8 of 19 - FIG. 3: block diagram of an inflatable restraint system (crash sensor, arming sensor, occupant position sensors, electronic diagnostic unit, 12-volt supply, ground)]
`
`
`
[Sheet 9 of 19 - FIG. 4: circuit block diagram (crystal oscillators f1 = 48 MHz (401) and f2 = 48.05 MHz (405), frequency triplers to 144 MHz (402, 407), diode driver (403), pre-amplifier (409), automatic gain control amplifier, filters, phase adjust and phase detector (413), (3f2-3f1) = 150 kHz, velocity output d/dt, detect/no signal (416))]
`
`
`
[Sheet 10 of 19 - drawing]
`
`
`
[Sheet 11 of 19 - FIG. 6 (drawing; reference numerals include 630, 631, 632)]
`
`
`
[Sheet 12 of 19 - drawing]
`
`
`
[Sheet 13 of 19 - FIG. 8 (drawing)]
`
`
`
[Sheet 14 of 19 - drawing]
`
`
`
[Sheet 15 of 19 - FIG. 10: flow diagram: Passenger Compartment (500) → Optical Image Reception (502) → Data Derivation (504) → Training Phase of Pattern Recognition Algorithm (506) → Security System or other Vehicle Component (510); Optical Transmitter (Optional) (512)]
`
`N
`~
`-....l
`(It
`b
`N
`-....l
`':...l
`0'1
`\Jl
`e
`
`~
`~
`~
`;
`
`'0
`
`......,
`0
`0'1
`
`rJl
`
`~
`
`N c c
`
`~
`
`~~
`({Q
`
`~ = ......
`~ ......
`~
`\Jl .
`d .
`
`/"
`.
`
`)
`'"'-,
`
`'
`/
`Authonzed Dnver
`Includes
`that Image
`Algorith-m lndi·.cate
`Recognition
`Does Pattern
`
`/
`
`-Sound ••1
`:J__
`
`Police
`Alann/Contact
`
`~
`~--~-r-
`_ -upply ~~·!;~;..,~
`
`-,
`
`'-
`
`'-
`
`Algorithm
`Recogmhon
`
`-----------
`
`530
`
`Yes
`
`'
`
`~
`
`524
`
`Enable~ -~-
`~--·-
`
`----~~ -
`Vehicle
`Ignition of
`
`"'-
`
`528
`
`.520
`
`Oper~tionai:::J ___--
`Set System ~--1 ________ _____
`-----~---
`
`~
`
`Autlt_I~ivcr(s}-)
`
`518
`
`-,i-T;ain P~t~em
`
`Algorithm
`Recogmlton
`
`522---_ __
`
`--~
`
`516 ---~Obtain.Images:J
`
`Includmg
`
`Trainmg Pha~
`514~l-Set System~
`
`-=r_
`
`FIG. 11
`
`18
`
`
`
[Sheet 17 of 19 - FIG. 12 and FIG. 13: block diagrams of a camera, processor, and reactive component (reference numerals include 602, 604, 606, 610, 612, 614)]
`
`
`
[Sheet 18 of 19 - drawing]
`
`
`
`
`
[Sheet 19 of 19 - FIG. 15: block diagram: Optional transmitter (730); Exterior object(s); Receiver(s) (single or multiple) (734, 736); Optional measurement system (radar) (746); Electronic module/processor (transmitter drive circuitry, signal processing circuitry, neural computer) (740, 742, 744, 745); Display to driver/airbag control/headlight dimmer control/other system control (748)]
`
`
`
`VEHICULAR MONITORING SYSTEMS
`USING IMAGE PROCESSING
`
`CROSS REFERENCE TO RELATED
`APPLICATIONS
`
This application is a continuation-in-part of U.S. patent application Ser. No. 10/116,808 filed Apr. 5, 2002 which is:
1) a continuation-in-part of U.S. patent application Ser. No. 09/925,043 filed Aug. 8, 2001, now U.S. Pat. No. 6,507,779, which is:
a) a continuation-in-part of U.S. patent application Ser. No. 09/765,559 filed Jan. 19, 2001, now U.S. Pat. No. 6,553,296; and
b) a continuation-in-part of U.S. patent application Ser. No. 09/389,947 filed Sep. 3, 1999, now U.S. Pat. No. 6,393,133; and
2) a continuation-in-part of U.S. patent application Ser. No. 09/838,919 filed Apr. 20, 2001, now U.S. Pat. No. 6,442,465, which is:
a) a continuation-in-part of U.S. patent application Ser. No. 09/765,559 filed Jan. 19, 2001 which is a continuation-in-part of U.S. patent application Ser. No. 09/476,255 filed Dec. 30, 1999, now U.S. Pat. No. 6,324,453, which claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/114,507 filed Dec. 31, 1998; and
b) a continuation-in-part of U.S. patent application Ser. No. 09/389,947 filed Sep. 3, 1999, now U.S. Pat. No. 6,393,133, which is a continuation-in-part of U.S. patent application Ser. No. 09/200,614, filed Nov. 30, 1998, now U.S. Pat. No. 6,141,432, which is a continuation of U.S. patent application Ser. No. 08/474,786 filed Jun. 7, 1995, now U.S. Pat. No. 5,845,000, all of which are incorporated by reference herein.

This application claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/114,507 filed Dec. 31, 1998 through the parent applications.
`
`FIELD OF THE INVENTION
`
`The present invention relates to apparatus and methods
`for monitoring environments in and outside of a vehicle
`using image processing.
`The present invention also relates to arrangements for
`detecting the presence, type and/or position of occupants in
`vehicles and objects exterior of vehicles, e.g., in a driver's
`blind spot, primarily using optics.
The present invention also relates to apparatus and methods for determining a distance between objects in an environment in and outside of a vehicle by image processing techniques.
`
BACKGROUND OF THE INVENTION

1. Prior Art on Out of Position Occupants and Rear Facing Child Seats

Whereas thousands of lives have been saved by airbags, a large number of people have also been injured, some seriously, by the deploying airbag, and over 100 people have now been killed. Thus, significant improvements need to be made to airbag systems. As discussed in detail in U.S. Pat. No. 5,653,462 referenced above, for a variety of reasons vehicle occupants may be too close to the airbag before it deploys and can be seriously injured or killed as a result of the deployment thereof. Also, a child in a rear facing child seat that is placed on the right front passenger seat is in danger of being seriously injured if the passenger airbag deploys. For these reasons and, as first publicly disclosed in Breed, D. S. "How Airbags Work" presented at the International Conference on Seatbelts and Airbags in 1993, in Canada, occupant position sensing and rear facing child seat detection systems are required.

Initially, these systems will solve the out-of-position occupant and the rear facing child seat problems related to current airbag systems and prevent unneeded airbag deployments when a front seat is unoccupied. However, airbags are now under development to protect rear seat occupants in vehicle crashes and all occupants in side impacts. A system will therefore be needed to detect the presence of occupants, determine if they are out-of-position and to identify the presence of a rear facing child seat in the rear seat. Future automobiles are expected to have eight or more airbags as protection is sought for rear seat occupants and from side impacts. In addition to eliminating the disturbance and possible harm of unnecessary airbag deployments, the cost of replacing these airbags will be excessive if they all deploy in an accident needlessly.

Inflators now exist which will adjust the amount of gas flowing to the airbag to account for the size and position of the occupant and for the severity of the accident. The vehicle identification and monitoring system (VIMS) discussed in U.S. Pat. No. 5,829,782 will control such inflators based on the presence and position of vehicle occupants or of a rear facing child seat. As discussed more fully below, the instant invention is an improvement on that VIMS system and uses an advanced optical system comprising one or more CCD (charge coupled device) or CMOS arrays and particularly active pixel arrays plus a source of illumination preferably combined with a trained neural network pattern recognition system.
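[Editorial note, not part of the patent: the paragraph above describes images from a pixel array being fed to a trained pattern recognition system. The following minimal Python sketch illustrates one conventional way such a pipeline could begin, reducing a camera frame to a small feature vector suitable for a classifier; the 8x8 "image", the block-averaging reduction, and all values are invented for illustration and are not taken from the patent.]

```python
# Sketch: reduce a tiny grayscale "pixel array" frame to a feature vector
# by averaging quadrant blocks, the kind of preprocessing step that could
# precede a trained classifier. All data here are hypothetical.

def image_to_features(pixels, block=4):
    """Average block x block tiles of a square grayscale image into a flat vector."""
    n = len(pixels)
    feats = []
    for r in range(0, n, block):
        for c in range(0, n, block):
            total = sum(pixels[r + i][c + j]
                        for i in range(block) for j in range(block))
            feats.append(total / (block * block))
    return feats


# A toy 8x8 frame: a bright region in the upper-left quadrant only.
frame = [[1.0] * 4 + [0.0] * 4 if r < 4 else [0.0] * 8 for r in range(8)]
print(image_to_features(frame))  # -> [1.0, 0.0, 0.0, 0.0]
```

A real system would of course operate on full-resolution CCD/CMOS output and use a far richer feature extraction, but the structure, image in, compact feature vector out, is the same.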
Others have observed the need for an occupant out-of-position sensor and several methods have been disclosed in U.S. patents for determining the position of an occupant of a motor vehicle. Each of these systems, however, has significant limitations. For example, in White et al. (U.S. Pat. No. 5,071,160), a single acoustic sensor and detector is described and, as illustrated, is mounted lower than the steering wheel. White et al. correctly perceive that such a sensor could be defeated, and the airbag falsely deployed, by an occupant adjusting the control knobs on the radio and thus they suggest the use of a plurality of such sensors.

Mattes et al. (U.S. Pat. No. 5,118,134) describe a variety of methods of measuring the change in position of an occupant including ultrasonic, active or passive infrared and microwave radar sensors, and an electric eye. The sensors measure the change in position of an occupant during a crash and use that information to assess the severity of the crash and thereby decide whether or not to deploy the airbag. They are thus using the occupant motion as a crash sensor. No mention is made of determining the out-of-position status of the occupant or of any of the other features of occupant monitoring as disclosed in one or more of the above-referenced patents and patent applications. It is interesting to note that nowhere does Mattes et al. discuss how to use active or passive infrared to determine the position of the occupant. As pointed out in one or more of the above-referenced patents and patent applications, direct occupant position measurement based on passive infrared is probably not possible and, until very recently, was very difficult and expensive with active infrared requiring the modulation of an expensive GaAs infrared laser. Since there is no mention of these problems, the method of use contemplated by Mattes et al. must be similar to the electric eye concept where position is measured indirectly as the occupant passes by a plurality of longitudinally spaced-apart sensors.
`
The object of an occupant out-of-position sensor is to determine the location of the head and/or chest of the vehicle occupant relative to the airbag since it is the impact of either the head or chest with the deploying airbag which can result in serious injuries. Both White et al. and Mattes et al. describe only lower mounting locations of their sensors in front of the occupant such as on the dashboard or below the steering wheel. Both such mounting locations are particularly prone to detection errors due to positioning of the occupant's hands, arms and legs. This would require at least three, and preferably more, such sensors and detectors and an appropriate logic circuitry which ignores readings from some sensors if such readings are inconsistent with others, for the case, for example, where the driver's arms are the closest objects to two of the sensors.

White et al. also describe the use of error correction circuitry, without defining or illustrating the circuitry, to differentiate between the velocity of one of the occupant's hands as in the case where he/she is adjusting the knob on the radio and the remainder of the occupant. Three ultrasonic sensors of the type disclosed by White et al. might, in some cases, accomplish this differentiation if two of them indicated that the occupant was not moving while the third was indicating that he or she was. Such a combination, however, would not differentiate between an occupant with both hands and arms in the path of the ultrasonic transmitter at such a location that they were blocking a substantial view of the occupant's head or chest. Since the sizes and driving positions of occupants are extremely varied, it is now believed that pattern recognition systems and preferably trained pattern recognition systems, such as neural networks, are required when a clear view of the occupant, unimpeded by his/her extremities, cannot be guaranteed.
Fujita et al., in U.S. Pat. No. 5,074,583, describe another method of determining the position of the occupant but do not use this information to suppress deployment if the occupant is out-of-position. In fact, the closer the occupant gets to the airbag, the faster the inflation rate of the airbag is according to the Fujita et al. patent, which thereby increases the possibility of injuring the occupant. Fujita et al. do not measure the occupant directly but instead determine his or her position indirectly from measurements of the seat position and the vertical size of the occupant relative to the seat (occupant height). This occupant height is determined using an ultrasonic displacement sensor mounted directly above the occupant's head.
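[Editorial note, not part of the patent: the overhead ultrasonic measurement described above amounts to simple time-of-flight geometry. The sketch below shows the arithmetic; the mounting height, speed of sound, and echo time are invented values, not figures from Fujita et al.]

```python
# Sketch: inferring seated-occupant head height from an overhead
# ultrasonic displacement sensor. Range = (speed of sound x round-trip
# time) / 2; head height = sensor height minus that range.
# All constants are hypothetical.

SPEED_OF_SOUND_M_S = 343.0   # air at roughly 20 C
SENSOR_HEIGHT_M = 1.20       # assumed headliner sensor height above the seat cushion


def seated_height(echo_round_trip_s):
    """Convert an ultrasonic echo round-trip time into head height above the seat."""
    range_to_head = SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0
    return SENSOR_HEIGHT_M - range_to_head


# A 2.0 ms round trip places the head 0.343 m below the sensor.
print(round(seated_height(0.002), 3))  # -> 0.857
```

This also makes the limitation discussed above concrete: the sensor reports only a single vertical distance, so anything between the sensor and the head (a raised hand, for instance) corrupts the inferred height.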
`As discussed above, the optical systems described herein
`are also applicable for many other sensing applications both
`inside and outside of the vehicle compartment such as for
`sensing crashes before they occur as described in U.S. Pat.
`No. 5,829,782, for a smart headlight adjustment system and
`for a blind spot monitor (also disclosed in U.S. provisional
`patent application Ser. No. 60/202,424).
2. Definitions

Preferred embodiments of the invention are described below and unless specifically noted, it is the applicants' intention that the words and phrases in the specification and claims be given the ordinary and accustomed meaning to those of ordinary skill in the applicable art(s). If the applicant intends any other meaning, he will specifically state he is applying a special meaning to a word or phrase.

Likewise, applicants' use of the word "function" here is not intended to indicate that the applicants seek to invoke the special provisions of 35 U.S.C. §112, sixth paragraph, to define their invention. To the contrary, if applicants wish to invoke the provisions of 35 U.S.C. §112, sixth paragraph, to define their invention, they will specifically set forth in the claims the phrases "means for" or "step for" and a function, without also reciting in that phrase any structure, material or act in support of the function. Moreover, even if applicants invoke the provisions of 35 U.S.C. §112, sixth paragraph, to define their invention, it is the applicants' intention that their inventions not be limited to the specific structure, material or acts that are described in the preferred embodiments herein. Rather, if applicants claim their inventions by specifically invoking the provisions of 35 U.S.C. §112, sixth paragraph, it is nonetheless their intention to cover and include any and all structure, materials or acts that perform the claimed function, along with any and all known or later developed equivalent structures, materials or acts for performing the claimed function.
The use of pattern recognition is important to the instant invention as well as to one or more of those disclosed in the above-referenced patents and patent applications. "Pattern recognition" as used herein will generally mean any system which processes a signal that is generated by an object, or is modified by interacting with an object, in order to determine which one of a set of classes that the object belongs to. Such a system might determine only that the object is or is not a member of one specified class, or it might attempt to assign the object to one of a larger set of specified classes, or find that it is not a member of any of the classes in the set. The signals processed are generally electrical signals coming from transducers which are sensitive to either acoustic or electromagnetic radiation and, if electromagnetic, they can be either visible light, infrared, ultraviolet or radar or low frequency radiation as used in capacitive sensing systems.
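[Editorial note, not part of the patent: the definition above, a system that assigns a signal to one of a set of classes or to none of them, can be made concrete with a minimal nearest-prototype classifier. The class names, feature vectors, and threshold below are all hypothetical illustrations, not data from the patent.]

```python
# Sketch: a minimal pattern recognition system in the sense defined above.
# A feature vector derived from a sensor signal is assigned to the class
# whose prototype it lies nearest, or to no class if it is too far from all.
# Prototypes and threshold are invented for illustration.

import math

PROTOTYPES = {
    "empty_seat":             [0.1, 0.0, 0.2],
    "adult_occupant":         [0.9, 0.7, 0.4],
    "rear_facing_child_seat": [0.5, 0.9, 0.8],
}
THRESHOLD = 0.6  # beyond this distance, the object matches no known class


def classify(signal):
    """Assign a feature vector to the nearest class prototype, or 'unknown'."""
    best_class, best_dist = None, float("inf")
    for name, proto in PROTOTYPES.items():
        dist = math.dist(signal, proto)
        if dist < best_dist:
            best_class, best_dist = name, dist
    return best_class if best_dist <= THRESHOLD else "unknown"


print(classify([0.88, 0.72, 0.35]))  # near the adult prototype -> "adult_occupant"
```

The "unknown" branch corresponds to the case named in the definition where the object is found not to be a member of any class in the set.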
A trainable or a trained pattern recognition system as used herein means a pattern recognition system which is taught various patterns by subjecting the system to a variety of examples. The most successful such system is the neural network. Not all pattern recognition systems are trained systems and not all trained systems are neural networks. Other pattern recognition systems are based on fuzzy logic, sensor fusion, Kalman filters, correlation as well as linear and non-linear regression. Still other pattern recognition systems are hybrids of more than one system such as neural-fuzzy systems.
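[Editorial note, not part of the patent: the distinguishing feature of a trained system, per the definition above, is that its behavior is learned from labeled examples rather than hand-programmed. The smallest such system is a single perceptron; the toy training set below is invented for illustration.]

```python
# Sketch: a trained pattern recognition system in miniature. A single
# perceptron is "taught various patterns by subjecting the system to a
# variety of examples" -- its weights are adjusted whenever it misclassifies
# a training example. The data set is a hypothetical toy problem.

def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights from (features, label) pairs; label is 0 or 1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, label in examples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred           # nonzero only on a mistake
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b


def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0


# Toy training set: label 1 only when both features are high.
data = [([0.0, 0.0], 0), ([0.0, 1.0], 0), ([1.0, 0.0], 0), ([1.0, 1.0], 1)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # -> [0, 0, 0, 1]
```

A practical neural network of the kind the patent contemplates is a multi-layer generalization of this idea, but the defining property, behavior acquired from examples, is the same.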
A pattern recognition algorithm will thus generally mean an algorithm applying or obtained using any type of pattern recognition system, e.g., a neural network, sensor fusion, fuzzy logic, etc.

To "identify" as used herein will usually mean to determine that the object belongs to a particular set or class. The class may be one containing, for example, all rear facing child seats, one containing all human occupants, or all human occupants not sitting in a rear facing child seat depending on the purpose of the system. In the case where a particular person is to be recognized, the set or class will contain only a single element, i.e., the person to be recognized.
To "ascertain the identity of" as used herein with reference to an object will generally mean to determine the type or nature of the object (obtain information as to what the object is), i.e., that the object is an adult, an occupied rear facing child seat, an occupied front facing child seat, an unoccupied rear facing child seat, an unoccupied front facing child seat, a child, a dog, a bag of groceries, a car, a truck, a tree, a pedestrian, a deer etc.

An "occupying item" or "occupant" of a seat or "object" in a seat may be a living occupant such as a human being or a dog, another living organism such as a plant, or an inanimate object such as a box or bag of groceries.
`
`A "rear se