US 6,856,873 B2

Breed et al.

(10) Patent No.:      US 6,856,873 B2
(45) Date of Patent:  Feb. 15, 2005

(54) VEHICULAR MONITORING SYSTEMS USING IMAGE PROCESSING
`
(75) Inventors: David S. Breed, Boonton Township, Morris County, NJ (US);
     Wilbur E. DuVall, Kimberling City, MO (US);
     Wendell C. Johnson, Signal Hill, CA (US)

(73) Assignee: Automotive Technologies International, Inc., Denville, NJ (US)

(*)  Notice: Subject to any disclaimer, the term of this patent is extended
     or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 10/116,808

(22) Filed: Apr. 5, 2002
(65)                Prior Publication Data

     US 2002/0116106 A1     Aug. 22, 2002

              Related U.S. Application Data

(63) Continuation-in-part of application No. 09/838,919, filed on Apr. 20,
     2001, now Pat. No. 6,442,465, which is a continuation-in-part of
     application No. 09/765,559, filed on Jan. 19, 2001, now Pat. No.
     6,553,296, which is a continuation-in-part of application No.
     09/476,255, filed on Dec. 30, 1999, now Pat. No. 6,324,453, and a
     continuation-in-part of application No. 09/389,947, filed on Sep. 3,
     1999, now Pat. No. 6,393,133, which is a continuation-in-part of
     application No. 09/200,614, filed on Nov. 30, 1998, now Pat. No.
     6,141,432, which is a continuation of application No. 08/474,786, filed
     on Jun. 7, 1995, now Pat. No. 5,845,000, and a continuation-in-part of
     application No. 09/925,043, filed on Aug. 8, 2001, now Pat. No.
     6,507,779, which is a continuation-in-part of application No.
     09/765,559.

(60) Provisional application No. 60/114,507, filed on Dec. 31, 1998.
(51) Int. Cl.⁷ .................................................. G06K 9/00
(52) U.S. Cl. ......................... 701/45; 382/1; 382/154; 180/271
(58) Field of Search ..................... 701/45, 46; 280/735, 736, 739,
     732; 180/268, 271, 273, 272; 382/103, 104, 106, 1; 367/99, 96;
     340/438, 5.2, 540, 436, 567; 250/208.1, 214 AL, 226; 362/61; 315/82;
     463/36; 345/156
`
(56)                 References Cited

               U.S. PATENT DOCUMENTS

     4,496,222 A     1/1985  Shah .......................... 359/300

     (List continued on next page.)

             FOREIGN PATENT DOCUMENTS

EP        0885782       12/1998

     (List continued on next page.)

                OTHER PUBLICATIONS

"Analysis of Hidden Units in a Layered Network Trained to Classify Sonar
Targets", R. Paul Gorman et al., Neural Networks, Vol. 1, pp. 75-89, 1988.

     (List continued on next page.)
`
Primary Examiner - Thomas G. Black
Assistant Examiner - Tuan C. To
(74) Attorney, Agent, or Firm - Brian Roffe
(57)                    ABSTRACT

Vehicular monitoring arrangement for monitoring an environment of the
vehicle including at least one active pixel camera for obtaining images of
the environment of the vehicle and a processor coupled to the active pixel
camera(s) for determining at least one characteristic of an object in the
environment based on the images obtained by the active pixel camera(s). The
active pixel camera can be arranged in a headliner, roof or ceiling of the
vehicle to obtain images of an interior environment of the vehicle, in an
A-pillar or B-pillar of the vehicle to obtain images of an interior
environment of the vehicle, or in a roof, ceiling, B-pillar or C-pillar of
the vehicle to obtain images of an interior environment of the vehicle
behind a front seat of the vehicle. The determined characteristic can be
used to enable optimal control of a reactive component, system or subsystem
coupled to the processor. When the reactive component is an airbag assembly
including at least one airbag, the processor can be designed to control at
least one deployment parameter of the airbag(s).
`
`50 Claims, 18 Drawing Sheets
`
`
`
[Front-page figure: flowchart showing Passenger Compartment (65), Optical
Image Reception (502), Data Derivation (504), an optional block (512), a
block (508), Training Phase of Pattern Recognition Algorithm (506), and
Security System or other Vehicle Component (510)]
`
`VWGoA EX1025
`U.S. Patent No. 9,955,551
`
`
`
US 6,856,873 B2
Page 2

               U.S. PATENT DOCUMENTS

4,625,329 A    11/1986  Ishikawa et al. ........... 382/104
4,648,052 A     3/1987  Friedman et al. ........... 364/550
4,720,189 A     1/1988  Heynen et al. ............. 351/210
4,768,088 A     8/1988  Ando ....................... 358/93
4,836,670 A     6/1989  Hutchinson ................ 351/210
4,881,270 A    11/1989  Knecht et al. .............. 382/17
4,906,940 A     3/1990  Greene et al. .............. 382/16
4,950,069 A     8/1990  Hutchinson ................ 351/210
4,966,388 A    10/1990  Warner et al. ............. 280/730
5,003,166 A     3/1991  Girod ................... 250/201.4
5,008,946 A     4/1991  Ando
5,026,153 A     6/1991  Suzuki et al. ............... 356/1
5,060,278 A    10/1991  Fukumizu .................. 382/157
5,062,696 A    11/1991  Oshima et al. ............. 359/554
5,064,274 A    11/1991  Alten ..................... 359/604
5,071,160 A    12/1991  White et al. .............. 280/735
5,074,583 A    12/1991  Fujita et al.
5,103,305 A     4/1992  Watanabe .................. 358/105
5,118,134 A     6/1992  Mattes et al. ............. 280/735
5,162,861 A    11/1992  Tamburino et al. ......... 356/5.05
5,181,254 A     1/1993  Schweizer et al. ............ 382/1
5,185,667 A     2/1993  Zimmermann ................ 348/143
5,193,124 A     3/1993  Subbarao ................... 382/41
5,214,744 A     5/1993  Schweizer et al. ........... 395/21
5,227,784 A     7/1993  Masamori et al. ........... 340/903
5,235,339 A     8/1993  Morrison et al. ........... 342/159
5,249,027 A     9/1993  Mathur et al. ............ 356/3.14
5,249,157 A     9/1993  Taylor .................... 340/903
5,298,732 A     3/1994  Chen .................... 250/203.4
5,___,___ A      /1994  Misawa (entry partially illegible)
5,309,137 A     5/1994  Kajiwara
5,339,075 A     8/1994  Abst et al.
5,355,118 A    10/1994  Fukuhara .................. 340/435
5,390,136 A     2/1995  Wang ...................... 364/754
5,441,052 A     8/1995  Miyajima ............... 128/661.09
5,446,661 A     8/1995  Gioutsos et al. ........ 364/424.05
5,454,591 A    10/1995  Mazur et al. .............. 280/735
5,467,402 A *  11/1995  Okuyama et al.
5,482,314 A     1/1996  Corrado et al. ............ 280/735
5,490,069 A     2/1996  Gioutsos et al. ............ 701/45
5,528,698 A *   6/1996  K____ et al. .............. 382/100
5,531,472 A *   7/1996  S____ et al. (entry partially illegible)
5,537,003 A     7/1996  Bechtel et al. ............. 315/82
5,550,677 A *   8/1996  Schofield et al. .......... 359/604
5,563,650 A    10/1996  Poelstra ................... 348/36
5,574,426 A *  11/1996  Shigal et al. ............. 340/435
5,653,462 A     8/1997  Breed et al. .............. 280/735
5,704,836 A *   1/1998  Norton et al. .............. 463/36
5,785,347 A     7/1998  Adolph et al. ............. 280/735
5,796,094 A *   8/1998  Schofield et al. ........ 250/208.1
5,821,633 A    10/1998  Burke et al. ............. 307/10.1
5,829,782 A    11/1998  Breed et al. .............. 280/735
5,835,613 A    11/1998  Breed et al. .............. 382/100
5,845,000 A    12/1998  Breed et al. .............. 382/100
5,848,802 A    12/1998  Breed et al. .............. 280/735
5,943,295 A     8/1999  Varga et al. ............... 367/99
5,954,360 A *   9/1999  Griggs, III et al. ........ 280/735
5,983,147 A    11/1999  Krumm ...................... 701/45
6,005,958 A *  12/1999  Farmer et al. ............. 382/103
6,007,095 A *  12/1999  Stanley ................... 280/735
6,020,812 A     2/2000  Thompson et al. ........... 340/438
6,027,138 A     2/2000  Tanaka et al. ............. 280/735
6,029,105 A     2/2000  Schweizer .................. 701/45
6,111,517 A     8/2000  Atick et al. ........... 340/825.34
6,113,137 A     9/2000  Mizutani et al. ........... 280/735
6,115,552 A *   9/2000  Kaneda ..................... 396/82
2001/0003168 A1 6/2001  Breed et al. ............... 701/45

             FOREIGN PATENT DOCUMENTS

GB       2289332        11/1995
JP     360166806         8/1985
JP       3-42337         2/1991
JP   407055573 A         3/1995  ................. 180/273
WO      94/22693        10/1994
WO       0196147        12/2001

                OTHER PUBLICATIONS

"Learned Classification of Sonar Targets Using a Massively Parallel
Network", R. Paul Gorman et al., IEEE Transactions on Acoustics, Speech and
Signal Processing, Vol. 36, No. 7, Jul. 1988, pp. 1135-1140.

"How Airbags Work", David S. Breed, Presented at the Canadian Association of
Road Safety Professionals, Oct. 19, 1992-Oct. 20, 1992.

"Omnidirectional Vision Sensor for Intelligent Vehicles", T. Ito et al.,
1998 IEEE International Conference on Intelligent Vehicles, pp. 365-370,
1998.

"A 256x256 CMOS Brightness Adaptive Imaging Array with Column-Parallel
Digital Output", C. Sodini et al., 1998 IEEE International Conference on
Intelligent Vehicles, 1998, pp. 347-352.

Derwent Abstract of German Patent Publication No. DE 42 11 556, Oct. 7,
1993.

Derwent Abstract of Japanese Patent Application No. 02-051332, Nov. 13,
1991.

"3D Perception for Vehicle Inner Space Monitoring", S. Boverie et al.,
Advanced Microsystems for Automotive Applications 2000, Apr. 2000, pp.
157-172.

"A Low-Cost High Speed CMOS Camera for Automotive Applications", N.
Stevanovic et al., Advanced Microsystems for Automotive Applications 2000,
Apr. 2000, pp. 173-180.

"New Powerful Sensory Tool in Automotive Safety Systems Based on
PMD-Technology", R. Schwarte et al., Advanced Microsystems for Automotive
Applications 2000, Apr. 2000, pp. 181-203.

"An Interior Compartment Protection System Based on Motion Detection Using
CMOS Imagers", S. B. Park et al., 1998 IEEE International Conference on
Intelligent Vehicles.

"Sensing Automobile Occupant Position with Optical Triangulation", W.
Chapelle et al., Sensors, Dec. 1995.

"Intelligent System for Video Monitoring of Vehicle Cockpit", S. Boverie et
al., SAE Paper No. 980613, Feb. 23-26, 1998.

"A 256x256 CMOS Brightness Adaptive Imaging Array with Column-Parallel
Digital Output", C. G. Sodini et al., 1998 IEEE International Conference on
Intelligent Vehicles.

"The FERET Evaluation Methodology for Face-Recognition Algorithms", P. J.
Phillips et al., NISTIR 6264, Jan. 7, 1999.

"Technology Review Ten: Biometrics", J. Atick, Jan./Feb. 2001.

* cited by examiner
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 1 of 18          US 6,856,873 B2
`
`
`
`
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 2 of 18          US 6,856,873 B2

FIG. 1B
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 3 of 18          US 6,856,873 B2

FIG. 1C (reference numerals 104, 120)
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 4 of 18          US 6,856,873 B2

FIG. 1D (reference numerals 104, 120)
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 5 of 18          US 6,856,873 B2

[Figure; text rotated in this copy, reference numerals include 100, 101,
104, 106, 112, 114, 120]
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 6 of 18          US 6,856,873 B2

FIG. 2A (reference numerals 114, 210, 211A, 214, 241)
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 7 of 18          US 6,856,873 B2
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 8 of 18          US 6,856,873 B2

[Figure; text rotated in this copy, legible labels: INFLATABLE RESTRAINT
SYSTEM, OCCUPANT POSITION SENSORS, DIAGNOSTIC UNIT, 12 VOLTS, GROUND]
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 9 of 18          US 6,856,873 B2

[Figure; text rotated in this copy: circuit diagram with oscillators,
crystal frequency references, an amplifier and a phase detector; reference
numerals in the 400s]
`
`
`
`
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 10 of 18          US 6,856,873 B2
`
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 11 of 18          US 6,856,873 B2

FIG. 6
`
`
`
`
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 13 of 18          US 6,856,873 B2

FIG. 8
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 14 of 18          US 6,856,873 B2

FIG. 9
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 15 of 18          US 6,856,873 B2

FIG. 10 [text rotated in this copy; legible labels include Processor,
Compartment, and an (Optional) block (512)]
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 16 of 18          US 6,856,873 B2

FIG. 11 [text rotated in this copy: flowchart with Training Phase and
Operational Phase branches, blocks for obtaining images and applying a
pattern recognition algorithm to determine whether the driver is an
authorized vehicle driver]
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 17 of 18          US 6,856,873 B2

[Figures; text rotated in this copy: Active Pixel Camera, Processor,
Reactive Component; Obtain Image of Vehicle Environment, Determine
Characteristic(s) of Object in Image, Control Component]
`
`
`
`
`
`
`
`
`
U.S. Patent          Feb. 15, 2005          Sheet 18 of 18          US 6,856,873 B2

FIG. 14 [text rotated in this copy: Light Source, Modulation Means,
Pixel(s)/Array Light Receiving, Arrival Time Difference Calculation,
Determination, Component Control; reference numerals include 615, 618, 620]
`
`
`
`
`
`
`
`
`
`
VEHICULAR MONITORING SYSTEMS USING IMAGE PROCESSING

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part of U.S. patent application Ser.
No. 09/838,919 filed Apr. 20, 2001, now U.S. Pat. No. 6,442,465, which is:

1) a continuation-in-part of U.S. patent application Ser. No. 09/765,559
filed Jan. 19, 2001, now U.S. Pat. No. 6,553,296, which in turn is a
continuation-in-part of U.S. patent application Ser. No. 09/476,255 filed
Dec. 30, 1999, now U.S. Pat. No. 6,324,453, which in turn claims priority
under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No.
60/114,507 filed Dec. 31, 1998; and

2) a continuation-in-part of U.S. patent application Ser. No. 09/389,947
filed Sep. 3, 1999, now U.S. Pat. No. 6,393,133, which in turn is a
continuation-in-part of U.S. patent application Ser. No. 09/200,614, filed
Nov. 30, 1998, now U.S. Pat. No. 6,141,432, which in turn is a continuation
of U.S. patent application Ser. No. 08/474,786 filed Jun. 7, 1995, now U.S.
Pat. No. 5,845,000, all of which are incorporated by reference herein.

This application claims priority under 35 U.S.C. §119(e) of U.S. provisional
patent application Ser. No. 60/114,507 filed Dec. 31, 1998 through the
parent applications.

This application is also a continuation-in-part of U.S. patent application
Ser. No. 09/925,043 filed Aug. 8, 2001, now U.S. Pat. No. 6,507,779, which
is a continuation-in-part of U.S. patent application Ser. No. 09/765,559
filed Jan. 19, 2001, now U.S. Pat. No. 6,553,296, and a continuation-in-part
of U.S. patent application Ser. No. 09/389,947 filed Sep. 3, 1999, now U.S.
Pat. No. 6,393,133.
`
`FIELD OF THE INVENTION
`The present invention relates to apparatus and methods
`for monitoring environments in and outside of a vehicle
`using image processing.
`The present invention also relates to apparatus and meth
`ods for determining a distance between objects in an envi
`ronment in and outside of a vehicle by image processing
`techniques.
`
`BACKGROUND OF THE INVENTION
1. Prior Art on Out of Position Occupants and Rear Facing Child Seats

Whereas thousands of lives have been saved by airbags, a large number of
people have also been injured, some seriously, by the deploying airbag, and
over 100 people have now been killed. Thus, significant improvements need to
be made to airbag systems. As discussed in detail in U.S. Pat. No. 5,653,462
referenced above, for a variety of reasons vehicle occupants may be too
close to the airbag before it deploys and can be seriously injured or killed
as a result of the deployment thereof. Also, a child in a rear facing child
seat that is placed on the right front passenger seat is in danger of being
seriously injured if the passenger airbag deploys. For these reasons and, as
first publicly disclosed in Breed, D. S., "How Airbags Work," presented at
the International Conference on Seatbelts and Airbags in 1993 in Canada,
occupant position sensing and rear facing child seat detection systems are
required.

Initially, these systems will solve the out-of-position occupant and the
rear facing child seat problems related to current airbag systems and
prevent unneeded airbag deployments when a front seat is unoccupied.
However, airbags are now under development to protect rear seat occupants in
vehicle crashes and all occupants in side impacts. A system will therefore
be needed to detect the presence of occupants, determine if they are
out-of-position and to identify the presence of a rear facing child seat in
the rear seat. Future automobiles are expected to have eight or more airbags
as protection is sought for rear seat occupants and from side impacts. In
addition to eliminating the disturbance and possible harm of unnecessary
airbag deployments, the cost of replacing these airbags will be excessive if
they all deploy needlessly in an accident.
Inflators now exist which will adjust the amount of gas flowing to the
airbag to account for the size and position of the occupant and for the
severity of the accident. The vehicle identification and monitoring system
(VIMS) discussed in U.S. Pat. No. 5,829,782 will control such inflators
based on the presence and position of vehicle occupants or of a rear facing
child seat. As discussed more fully below, the instant invention is an
improvement on that VIMS system and uses an advanced optical system
comprising one or more CCD (charge coupled device) or CMOS arrays, and
particularly active pixel arrays, plus a source of illumination, preferably
combined with a trained neural network pattern recognition system.
Others have observed the need for an occupant out-of-position sensor and
several methods have been disclosed in U.S. patents for determining the
position of an occupant of a motor vehicle. Each of these systems, however,
has significant limitations. For example, in White et al. (U.S. Pat. No.
5,071,160), a single acoustic sensor and detector is described and, as
illustrated, is mounted lower than the steering wheel. White et al.
correctly perceive that such a sensor could be defeated, and the airbag
falsely deployed, by an occupant adjusting the control knobs on the radio,
and thus they suggest the use of a plurality of such sensors.
Mattes et al. (U.S. Pat. No. 5,118,134) describe a variety of methods of
measuring the change in position of an occupant including ultrasonic, active
or passive infrared and microwave radar sensors, and an electric eye. The
sensors measure the change in position of an occupant during a crash and use
that information to assess the severity of the crash and thereby decide
whether or not to deploy the airbag. They are thus using the occupant motion
as a crash sensor. No mention is made of determining the out-of-position
status of the occupant or of any of the other features of occupant
monitoring as disclosed in one or more of the above-referenced patents and
patent applications. It is interesting to note that nowhere does Mattes et
al. discuss how to use active or passive infrared to determine the position
of the occupant. As pointed out in one or more of the above-referenced
patents and patent applications, direct occupant position measurement based
on passive infrared is probably not possible and, until very recently, was
very difficult and expensive with active infrared, requiring the modulation
of an expensive GaAs infrared laser. Since there is no mention of these
problems, the method of use contemplated by Mattes et al. must be similar to
the electric eye concept where position is measured indirectly as the
occupant passes by a plurality of longitudinally spaced-apart sensors.
The object of an occupant out-of-position sensor is to determine the
location of the head and/or chest of the vehicle occupant relative to the
airbag, since it is the impact of either the head or chest with the
deploying airbag which can result in serious injuries. Both White et al. and
Mattes et al. describe only lower mounting locations of their sensors in
front of the occupant, such as on the dashboard or below the steering wheel.
Both such mounting locations are particularly prone to detection errors due
to positioning of the occupant's hands, arms and legs. This would require at
least three, and preferably more, such sensors and detectors and an
appropriate logic circuitry which ignores readings from some sensors if such
readings are inconsistent with others, for the case, for example, where the
driver's arms are the closest objects to two of the sensors.
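The consistency-checking logic described above can be paraphrased in a short
sketch. This is a hypothetical illustration, not circuitry from the patent:
the function name, the median-based voting rule and the tolerance value are
all assumptions made for clarity.

```python
def plausible_occupant_distance(readings, tolerance=0.15):
    """Given distance readings (meters) from several sensors, ignore any
    reading that disagrees with the median by more than `tolerance`, as
    when a driver's arm is the closest object to some of the sensors."""
    ordered = sorted(readings)
    median = ordered[len(ordered) // 2]
    consistent = [r for r in readings if abs(r - median) <= tolerance]
    # Average only the mutually consistent readings.
    return sum(consistent) / len(consistent)

# Two sensors see the occupant at about 0.6 m; one sees a hand at 0.2 m.
print(plausible_occupant_distance([0.61, 0.59, 0.20]))  # hand reading ignored
```

A real system would need more than three sensors and a more careful
plausibility rule, as the text itself notes, but the voting idea is the same.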
White et al. also describe the use of error correction circuitry, without
defining or illustrating the circuitry, to differentiate between the
velocity of one of the occupant's hands, as in the case where he/she is
adjusting the knob on the radio, and the remainder of the occupant. Three
ultrasonic sensors of the type disclosed by White et al. might, in some
cases, accomplish this differentiation if two of them indicated that the
occupant was not moving while the third was indicating that he or she was.
Such a combination, however, would not differentiate an occupant with both
hands and arms in the path of the ultrasonic transmitter at such a location
that they were blocking a substantial view of the occupant's head or chest.
Since the sizes and driving positions of occupants are extremely varied, it
is now believed that pattern recognition systems, and preferably trained
pattern recognition systems such as neural networks, are required when a
clear view of the occupant, unimpeded by his/her extremities, cannot be
guaranteed.
Fujita et al., in U.S. Pat. No. 5,074,583, describe another method of
determining the position of the occupant but do not use this information to
suppress deployment if the occupant is out-of-position. In fact, the closer
the occupant gets to the airbag, the faster the inflation rate of the airbag
is according to the Fujita et al. patent, which thereby increases the
possibility of injuring the occupant. Fujita et al. do not measure the
occupant directly but instead determine his or her position indirectly from
measurements of the seat position and the vertical size of the occupant
relative to the seat (occupant height). This occupant height is determined
using an ultrasonic displacement sensor mounted directly above the
occupant's head.
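The indirect height measurement attributed to Fujita et al. reduces to a
subtraction, sketched below with invented numbers; the mounting height and
measured distance are illustrative assumptions, not values from either
patent.

```python
def occupant_height_above_seat(sensor_mount_height_m, measured_distance_m):
    """Ultrasonic displacement sensor mounted directly above the occupant:
    seated height relative to the seat is the sensor's mounting height
    minus the measured distance down to the top of the head."""
    return sensor_mount_height_m - measured_distance_m

# Sensor mounted 1.25 m above the seat cushion; echo places the head
# 0.55 m below the sensor, giving roughly 0.70 m of seated height.
print(occupant_height_above_seat(1.25, 0.55))
```

The point of the criticism in the text is that this yields height, not the
head-to-airbag distance that out-of-position sensing actually needs.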
As discussed above, the optical systems described herein are also applicable
for many other sensing applications both inside and outside of the vehicle
compartment, such as for sensing crashes before they occur as described in
U.S. Pat. No. 5,829,782, for a smart headlight adjustment system, and for a
blind spot monitor (also disclosed in U.S. provisional patent application
Ser. No. 60/202,424).
2. Definitions

Preferred embodiments of the invention are described below and, unless
specifically noted, it is the applicants' intention that the words and
phrases in the specification and claims be given the ordinary and accustomed
meaning to those of ordinary skill in the applicable art(s). If the
applicant intends any other meaning, he will specifically state that he is
applying a special meaning to a word or phrase.

Likewise, applicants' use of the word "function" here is not intended to
indicate that the applicants seek to invoke the special provisions of 35
U.S.C. §112, sixth paragraph, to define their invention. To the contrary, if
applicants wish to invoke the provisions of 35 U.S.C. §112, sixth paragraph,
to define their invention, they will specifically set forth in the claims
the phrases "means for" or "step for" and a function, without also reciting
in that phrase any structure, material or act in support of the function.
Moreover, even if applicants invoke the provisions of 35 U.S.C. §112, sixth
paragraph, to define their invention, it is the applicants' intention that
their inventions not be limited to the specific structure, material or acts
that are described in the preferred embodiments herein.
`
Rather, if applicants claim their inventions by specifically invoking the
provisions of 35 U.S.C. §112, sixth paragraph, it is nonetheless their
intention to cover and include any and all structure, materials or acts that
perform the claimed function, along with any and all known or later
developed equivalent structures, materials or acts for performing the
claimed function.
The use of pattern recognition is important to the instant invention as well
as to one or more of those disclosed in the above-referenced patents and
patent applications. "Pattern recognition" as used herein will generally
mean any system which processes a signal that is generated by an object, or
is modified by interacting with an object, in order to determine to which
one of a set of classes the object belongs. Such a system might determine
only that the object is or is not a member of one specified class, or it
might attempt to assign the object to one of a larger set of specified
classes, or find that it is not a member of any of the classes in the set.
The signals processed are generally electrical signals coming from
transducers which are sensitive to either acoustic or electromagnetic
radiation and, if electromagnetic, they can be either visible light,
infrared, ultraviolet or radar, or low frequency radiation as used in
capacitive sensing systems.
A trainable or a trained pattern recognition system as used herein means a
pattern recognition system which is taught various patterns by subjecting
the system to a variety of examples. The most successful such system is the
neural network. Not all pattern recognition systems are trained systems and
not all trained systems are neural networks. Other pattern recognition
systems are based on fuzzy logic, sensor fusion, Kalman filters, and
correlation, as well as linear and non-linear regression. Still other
pattern recognition systems are hybrids of more than one system, such as
neural-fuzzy systems.

A pattern recognition algorithm will thus generally mean an algorithm
applying or obtained using any type of pattern recognition system, e.g., a
neural network, sensor fusion, fuzzy logic, etc.
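The "taught by examples" idea in the definition above can be illustrated
with a minimal trained classifier. This is a sketch only: a single
perceptron, far simpler than the neural networks the patent contemplates,
and the two image-derived features and their labeled examples are invented
for illustration.

```python
def train_perceptron(examples, epochs=50, lr=0.1):
    """Teach a linear classifier by repeatedly showing it labeled examples,
    adjusting its weights only when it misclassifies one. Each example is
    ((feature1, feature2), label), label being +1 or -1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for features, label in examples:
            activation = w[0] * features[0] + w[1] * features[1] + b
            predicted = 1 if activation >= 0 else -1
            if predicted != label:  # learn from the mistake
                w[0] += lr * label * features[0]
                w[1] += lr * label * features[1]
                b += lr * label
    return w, b

def classify(w, b, features):
    """Assign a new signal to one of the two classes."""
    return 1 if w[0] * features[0] + w[1] * features[1] + b >= 0 else -1

# Invented training data: (mean brightness, edge density) of a seat image,
# labeled +1 for "occupant present" and -1 for "empty seat".
examples = [((0.9, 0.8), 1), ((0.8, 0.9), 1),
            ((0.1, 0.2), -1), ((0.2, 0.1), -1)]
w, b = train_perceptron(examples)
print(classify(w, b, (0.85, 0.9)))   # occupant-like features
print(classify(w, b, (0.15, 0.1)))   # empty-seat-like features
```

The distinction drawn in the text holds here too: this system is trained,
but it is not a neural network in the multi-layer sense.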
To "identify" as used herein will usually mean to determine that the object
belongs to a particular set or class. The class may be one containing, for
example, all rear facing child seats, one containing all human occupants, or
all human occupants not sitting in a rear facing child seat, depending on
the purpose of the system. In the case where a particular person is to be
recognized, the set or class will contain only a single element, i.e., the
person to be recognized.
To "ascertain the identity of" as used herein with reference to an object
will generally mean to determine the type or nature of the object (obtain
information as to what the object is), i.e., that the object is an adult, an
occupied rear facing child seat, an occupied front facing child seat, an
unoccupied rear facing child seat, an unoccupied front facing child seat, a
child, a dog, a bag of groceries, a car, a truck, a tree, a pedestrian, a
deer, etc.
An "occupying item" or "occupant" of a seat or "object" in a seat may be a
living occupant such as a human being or a dog, another living organism such
as a plant, or an inanimate object such as a box or bag of groceries.
A "rear seat" of a vehicle as used herein will generally mean any seat
behind the front seat on which a driver sits. Thus, in minivans or other
large vehicles where there are more than two rows of seats, each row of
seats behind the driver is considered a rear seat and thus there may be more
than one "rear seat" in such vehicles. The space behind the front seat
includes any number of such rear seats as well as any trunk spaces or other
rear areas such as are present in station wagons.
An "optical image" will generally mean any type of image obtained using
electromagnetic radiation, including infrared and radar radiation.
In the description herein on anticipatory sensing, the term "approaching"
when used in connection with the mention of an object or vehicle approaching
another will usually mean the relative motion of the object toward the
vehicle having the anticipatory sensor system. Thus, in a side impact with a
tree, the tree will be considered as approaching the side of the vehicle and
impacting the vehicle. In other words, the coordinate system used in general
will be a coordinate system residing in the target vehicle. The "target"
vehicle is the vehicle that is being impacted. This convention permits a
general description to cover all of the cases, such as where (i) a moving
vehicle impacts the side of a stationary vehicle, (ii) both vehicles are
moving when they impact, or (iii) a vehicle is moving sideways into a
stationary vehicle, tree or wall.
"Out-of-position" as used for an occupant will generally mean that the
occupant, either the driver or a passenger, is sufficiently close to an
occupant protection apparatus (airbag) prior to deployment that he or she is
likely to be more seriously injured by the deployment event itself than by
the accident. It may also mean that the occupant is not positioned
appropriately in order to attain the beneficial, restraining effects of the
deployment of the airbag. As for the occupant being too close to the airbag,
this typically occurs when the occupant's head or chest is closer than some
distance, such as about 5 inches, from the deployment door of the airbag
module. The actual distance at which airbag deployment should be suppressed
depends on the design of the airbag module and is typically farther for the
passenger airbag than for the driver airbag.
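Operationally, the "too close" criterion defined above reduces to a
comparison against a per-seat threshold. In the sketch below, only the
roughly 5 inch driver-side figure comes from the text; the passenger value
and the function itself are illustrative assumptions.

```python
# Illustrative suppression distances in inches. The ~5 in driver figure is
# the approximate value given in the text; the passenger value is an
# assumption reflecting only that it is typically farther.
SUPPRESSION_THRESHOLD_IN = {"driver": 5.0, "passenger": 7.0}

def suppress_deployment(distance_to_airbag_in, seat):
    """True if the occupant's head or chest is closer to the airbag
    module's deployment door than the threshold for that seating
    position, i.e., the occupant is out-of-position."""
    return distance_to_airbag_in < SUPPRESSION_THRESHOLD_IN[seat]

print(suppress_deployment(4.0, "driver"))     # True: closer than ~5 in
print(suppress_deployment(8.0, "driver"))     # False: far enough away
```

In practice the threshold depends on the airbag module's design, as the text
notes, so these numbers would be calibrated per vehicle.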
3. Pattern Recognition Prior Art

Japanese Patent No. 3-42337 (A) to Ueno discloses a device for detecting the
driving condition of a vehicle driver comprising a light emitter for
irradiating the face of the driver and a means for picking up the image of
the driver and storing it for later analysis. Means are provided for
locating the eyes of the driver, then the irises of the eyes, and then
determining if the driver is looking to the side or sleeping. Ueno
determines the state of the eyes of the occupant rather than determining the
location of the eyes relative to the other parts of the vehicle passenger
compartment. Such a system can be defeated if the driver is wearing glasses,
particularly sunglasses, or another optical device which obstructs a clear
view of his/her eyes. Pattern recognition technologies such as neural
networks are not used.
U.S. Pat. No. 5,008,946 to Ando uses a complicated set of rules to isolate
the eyes and mouth of a driver and uses this information to permit the
driver to control the radio, for example, or other systems within the
vehicle by moving his eyes and/or mouth. Ando uses natural light and
analyzes only the head of the driver. He also makes no use of trainable
pattern recognition systems such as neural networks, nor is there any
attempt to identify the contents of the vehicle nor their location relative
to the vehicle passenger compartment. Rather, Ando is limited to control of
vehicle devices by responding to motion of the driver's mouth and eyes.
U.S. Pat. No. 5,298,732 to Chen also concentrates on locating the eyes of
the driver so as to position a light filter between a light source, such as
the sun or the lights of an oncoming vehicle, and the driver's eyes. Chen
does not explain in detail how the eyes are located but does supply a
calibration system whereby the driver can adjust the filter so
`
`
`