US006772057B2

Breed et al.

(10) Patent No.:     US 6,772,057 B2
(45) Date of Patent: Aug. 3, 2004
(54) VEHICULAR MONITORING SYSTEMS USING IMAGE PROCESSING
`
(75) Inventors: David S. Breed, Boonton Township, Morris County, NJ (US);
                Wilbur E. DuVall, Kimberling City, MO (US);
                Wendell C. Johnson, Signal Hill, CA (US)

(73) Assignee: Automotive Technologies International, Inc., Denville, NJ (US)
`
( * ) Notice: Subject to any disclaimer, the term of this patent is extended
      or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 10/302,105
(22) Filed:     Nov. 22, 2002

(65) Prior Publication Data
     US 2003/0125855 A1    Jul. 3, 2003
`
Related U.S. Application Data

(63) Continuation-in-part of application No. 10/116,808, filed on Apr. 5,
     2002, which is a continuation-in-part of application No. 09/925,043,
     filed on Aug. 8, 2001, now Pat. No. 6,507,779, which is a
     continuation-in-part of application No. 09/765,559, filed on Jan. 19,
     2001, now Pat. No. 6,553,296, and a continuation-in-part of application
     No. 09/389,947, filed on Sep. 3, 1999, now Pat. No. 6,393,133, and a
     continuation-in-part of application No. 09/838,919, filed on Apr. 20,
     2001, now Pat. No. 6,442,465, which is a continuation-in-part of
     application No. 09/765,559, which is a continuation-in-part of
     application No. 09/476,255, filed on Dec. 30, 1999, now Pat. No.
     6,324,453, and a continuation-in-part of application No. 09/389,947,
     which is a continuation-in-part of application No. 09/200,614, filed on
     Nov. 30, 1998, now Pat. No. 6,141,432, which is a continuation of
     application No. 08/474,786, filed on Jun. 7, 1995, now Pat. No.
     5,845,000.
(60) Provisional application No. 60/114,507, filed on Dec. 31, 1998.
`
(51) Int. Cl.7 .................. B60R 21/32
(52) U.S. Cl. .................. 701/45; 340/573.1; 348/77; 180/271; 280/735;
                                382/181
(58) Field of Search ........... 701/45, 36, 301; 340/461, 435, 436, 438,
                                815.4, 573.1; 348/148, 77, 154; 180/271;
                                280/735, 728.1; 382/181, 115, 190, 224, 100, 104
`
(56) References Cited

U.S. PATENT DOCUMENTS

    4,496,222 A    1/1985   Shah

(List continued on next page.)

FOREIGN PATENT DOCUMENTS

EP    0885782    12/1998

(List continued on next page.)

OTHER PUBLICATIONS

"Analysis of Hidden Units in a Layered Network Trained to Classify Sonar
Targets", R. Paul Gorman et al., Neural Networks, vol. 1, pp. 75–89, 1988.

(List continued on next page.)
`
Primary Examiner—Thomas G. Black
Assistant Examiner—Tuan C. To
(74) Attorney, Agent, or Firm—Brian Roffe

(57) ABSTRACT
`
Vehicular monitoring arrangement for monitoring an environment of the
vehicle including at least one active pixel camera for obtaining images of
the environment of the vehicle and a processor coupled to the active pixel
camera(s) for determining at least one characteristic of an object in the
environment based on the images obtained by the active pixel camera(s). The
active pixel camera can be arranged in a headliner, roof or ceiling of the
vehicle to obtain images of an interior environment of the vehicle, in an
A-pillar or B-pillar of the vehicle to obtain images of an interior
environment of the vehicle, or in a roof, ceiling, B-pillar or C-pillar of
the vehicle to obtain images of an interior environment of the vehicle
behind a front seat of the vehicle. The determined characteristic can be
used to enable optimal control of a reactive component, system or subsystem
coupled to the processor. When the reactive component is an airbag assembly
including at least one airbag, the processor can be designed to control at
least one deployment parameter of the airbag(s).
`
`86 Claims, 19 Drawing Sheets
`
`
U.S. PATENT DOCUMENTS

    4,625,329 A   11/1986   Ishikawa et al. ............ 382/104
    4,648,052 A    3/1987   Friedman et al. ............ 364/550
    4,720,189 A    1/1988   Heynen et al. .............. 351/210
    4,768,088 A    8/1988   Ando ....................... 358/93
    4,836,670 A    6/1989   Hutchinson ................. 351/210
    4,881,270 A   11/1989   Knecht et al. .............. 382/17
    4,906,940 A    3/1990   Greene et al. .............. 382/16
    4,950,069 A    8/1990   Hutchinson ................. 351/210
    4,966,388 A   10/1990   Warner et al. .............. 280/730
    5,003,166 A    3/1991   Girod ...................... 250/201.4
    5,008,946 A    4/1991   Ando ....................... 382/2
    5,026,153 A    6/1991   Suzuki et al. .............. 356/1
    5,060,278 A   10/1991   Fukumizu ................... 382/157
    5,062,696 A   11/1991   Oshima et al. .............. 359/554
    5,064,274 A   11/1991   Alten ...................... 359/604
    5,071,160 A   12/1991   White et al. ............... 280/735
    5,074,583 A   12/1991   Fujita et al. .............. 280/735
    5,103,305 A    4/1992   Watanabe ................... 358/105
    5,118,134 A    6/1992   Mattes et al. .............. 280/735
    5,162,861 A   11/1992   Tamburino et al. ........... 356/5.05
    5,181,254 A    1/1993   Schweizer et al. ........... 382/1
    5,185,667 A    2/1993   Zimmermann ................. 348/143
    [two illegible entries]
    5,227,784 A    7/1993   [name illegible] ........... 340/903
    5,235,339 A    8/1993   Morrison et al. ............ 342/159
    5,249,027 A    9/1993   Mathur et al. .............. 356/5.14
    5,249,157 A    9/1993   Taylor ..................... 340/903
    5,298,732 A    3/1994   Chen ....................... 250/203.4
    5,305,012 A    4/1994   Faris ...................... 345/7
    5,309,137 A    5/1994   Kajiwara ................... 340/436
    5,329,206 A    7/1994   Slotkowski et al. .......... 315/159
    5,330,226 A    7/1994   Gentry et al. .............. 280/735
    5,339,075 A    8/1994   Abst et al. ................ 340/903
    5,355,118 A   10/1994   Fukuhara ................... 340/435
    5,390,136 A    2/1995   [name illegible]
    5,441,052 A    8/1995   [name illegible]
    5,446,661 A    8/1995   Gioutsos et al. ............ 364/424.05
    5,454,591 A   10/1995   Mazur et al. ............... 280/735
    5,463,384 A   10/1995   Juds ....................... 340/903
    5,473,515 A   12/1995   Liu ........................ 362/801
    5,482,314 A    1/1996   Corrado et al. ............. 280/735
    [several illegible entries]
    5,531,472 A    7/1996   Semchena et al. ............ 280/735
    5,537,003 A    7/1996   Bechtel et al. ............. 315/82
    5,550,677 A    8/1996   Schofield et al. ........... 359/604
    5,563,650 A   10/1996   Poelstra ................... 348/36
    5,653,462 A    8/1997   Breed et al. ............... 280/735
    5,706,144 A    1/1998   Brandin .................... 359/843
    5,785,347 A    7/1998   Adolph et al. .............. 280/735
    5,821,633 A   10/1998   Burke et al. ............... 307/10.1
    [two illegible entries]
    5,845,000 A   12/1998   Breed et al. ............... 382/100
    5,848,802 A   12/1998   Breed et al. ............... 280/735
    [illegible entry (Schofield et al.)]
    5,943,295 A    8/1999   Varga et al. ............... 367/99
    5,949,331 A    9/1999   Schofield et al. ........... 340/461
    5,954,360 A    9/1999   Griggs, III et al. ......... 280/735
    5,959,367 A    9/1999   O'Farrell et al. ........... 307/10.1
    5,983,147 A   11/1999   Krumm ...................... 701/45
    6,005,958 A   12/1999   Farmer et al. .............. 382/103
    6,007,095 A   12/1999   Stanley .................... 280/735
    6,020,812 A    2/2000   Thompson et al. ............ 340/438
    6,027,138 A    2/2000   Tanaka et al. .............. 280/735
    6,029,105 A    2/2000   Schweizer .................. 701/45
    6,087,953 A *  7/2000   DeLine et al. .............. 340/815.4
    6,111,517 A    8/2000   Atick et al. ............... 340/825.34
    6,113,137 A    9/2000   Mizutani et al. ............ 280/735
    6,115,552 A    9/2000   Kaneda ..................... 396/82
 2002/0154379 A1 * 10/2002  Tonar et al. ............... 359/267

FOREIGN PATENT DOCUMENTS

GB    2289332          11/1995  ............ 180/273
JP    360166806         8/1985
JP    3-42337           2/1991
JP    407055573 A       3/1995
JP    2001-325700      11/2001
WO    94/22693         10/1994
WO    01/96147         12/2001

OTHER PUBLICATIONS

"Learned Classification of Sonar Targets Using a Massively Parallel
Network", R. Paul Gorman et al., IEEE Transactions on Acoustics, Speech and
Signal Processing, vol. 36, No. 7, Jul. 1988, pp. 1135–1140.
"How Airbags Work", David S. Breed, Presented at the Canadian Association of
Road Safety Professionals, Oct. 1995.
"Intelligent System for Video Monitoring of Vehicle Cockpit", S. Boverie et
al., SAE Paper No. 980613, Feb. 1998.
"Omnidirectional Vision Sensor for Intelligent Vehicles", T. Ito et al.,
1998 IEEE International Conference on Intelligent Vehicles, pp. 365–370,
1998.
"A 256×256 CMOS Brightness Adaptive Imaging Array with Column-Parallel
Digital Output", C. Sodini et al., 1998 IEEE International Conference on
Intelligent Vehicles, 1998, pp. 347–352.
Derwent Abstract of German Patent Publication No. DE 42 11 556, Oct. 7,
1993.
Derwent Abstract of Japanese Patent Application No. 02451332, Nov. 13, 1991.
"3D Perception for Vehicle Inner Space Monitoring", S. Boverie et al.,
Advanced Microsystems for Automotive Applications 2000, Apr. 2000, pp.
157–172.
"Low-Cost High Speed CMOS Camera for Automotive Applications", N. Stevanovic
et al., Advanced Microsystems for Automotive Applications 2000, Apr. 2000,
pp. 173–180.
"New Powerful Sensory Tool in Automotive Safety Systems Based on
PMD-Technology", R. Schwarte et al., Advanced Microsystems for Automotive
Applications 2000, Apr. 2000, pp. 181–203.
"An Interior Compartment Protection System Based on Motion Detection Using
CMOS Imagers", S. B. Park et al., 1998 IEEE International Conference on
Intelligent Vehicles.
"Sensing Automobile Occupant Position with Optical Triangulation", W.
Chapelle et al., Sensors, Dec. 1995.
"Intelligent System for Video Monitoring of Vehicle Cockpit", S. Boverie et
al., SAE Paper No. 980613, Feb. 23–26, 1998.
"A 256×256 CMOS Brightness Adaptive Imaging Array with Column-Parallel
Digital Output", C. G. Sodini et al., 1998 IEEE International Conference on
Intelligent Vehicles.
"The FERET Evaluation Methodology for Face-Recognition Algorithms", P. J.
Phillips et al., NISTIR 6264, Jan. 7, 1999.
"The Technology Review Ten: Biometrics", J. Atick, Jan./Feb. 2001.

* cited by examiner
`
[Drawing Sheets 1–19 of 19: drawings only; little machine-readable text
survives extraction. Recoverable labels include:]

Sheet 6 of 19 — FIG. 2A (reference numerals 120, 211A, 213, 214, 224, 250)
Sheet 10 of 19 — FIG. 5
Sheet 19 of 19 — block diagram: Optional transmitter 730; Exterior
object(s); Receiver(s) (single or multiple) 734, 736, 736; Optional
measurement system (radar) 746; Electronic module/processor (transmitter
drive circuitry, signal processing circuitry, neural computer) 740, 742,
744, 745; Display to driver / Airbag control / headlight dimmer control /
Other system control 748
`
`
`VEHICULAR MONITORING SYSTEMS
`USING IMAGE PROCESSING
`
`CROSS REFERENCE TO RELATED
`APPLICATIONS
`
This application is a continuation-in-part of U.S. patent
application Ser. No. 10/116,808 filed Apr. 5, 2002 which is:
   1) a continuation-in-part of U.S. patent application Ser.
      No. 09/925,043 filed Aug. 8, 2001, now U.S. Pat. No.
      6,507,779, which is:
      a) a continuation-in-part of U.S. patent application Ser.
         No. 09/765,559 filed Jan. 19, 2001, now U.S. Pat.
         No. 6,553,296; and
      b) a continuation-in-part of U.S. patent application Ser.
         No. 09/389,947 filed Sep. 3, 1999, now U.S. Pat. No.
         6,393,133; and
   2) a continuation-in-part of U.S. patent application Ser.
      No. 09/838,919 filed Apr. 20, 2001, now U.S. Pat. No.
      6,442,465, which is:
      a) a continuation-in-part of U.S. patent application Ser.
         No. 09/765,559 filed Jan. 19, 2001 which is a
         continuation-in-part of U.S. patent application Ser.
         No. 09/476,255 filed Dec. 30, 1999, now U.S. Pat.
         No. 6,324,453, which claims priority under 35
         U.S.C. §119(e) of U.S. provisional patent application
         Ser. No. 60/114,507 filed Dec. 31, 1998; and
      b) a continuation-in-part of U.S. patent application Ser.
         No. 09/389,947 filed Sep. 3, 1999, now U.S. Pat. No.
         6,393,133, which is a continuation-in-part of U.S.
         patent application Ser. No. 09/200,614, filed Nov. 30,
         1998, now U.S. Pat. No. 6,141,432, which is a
         continuation of U.S. patent application Ser. No.
         08/474,786 filed Jun. 7, 1995, now U.S. Pat. No.
         5,845,000, all of which are incorporated by reference
         herein.
`
`This application claims priority under 35 U.S.C. §119(e)
`of U.S. provisional patent application Ser. No. 60/114,507
`filed Dec. 31, 1998 through the parent applications.
FIELD OF THE INVENTION

`The present invention relates to apparatus and methods
`for monitoring environments in and outside of a vehicle
`using image processing.
`The present invention also relates to arrangements for
`detecting the presence, type and/or position of occupants in
`vehicles and objects exterior of vehicles, e.g., in a driver’s
`blind spot, primarily using optics.
`The present invention also relates to apparatus and meth-
`ods for determining a distance between objects in an envi-
`ronment in and outside of a vehicle by image processing
`techniques.
BACKGROUND OF THE INVENTION

1. Prior Art on Out of Position Occupants and Rear Facing Child Seats
`Whereas thousands of lives have been saved by airbags,
`a large number of people have also been injured, some
`seriously, by the deploying airbag, and over 100 people have
`now been killed. Thus, significant improvements need to be
`made to airbag systems. As discussed in detail in U.S. Pat.
`No. 5,653,462 referenced above, for a variety of reasons
`vehicle occupants may be too close to the airbag before it
`deploys and can be seriously injured or killed as a result of
`the deployment thereof. Also, a child in a rear facing child
`seat that is placed on the right front passenger seat is in
`danger of being seriously injured if the passenger airbag
`
`deploys. For these reasons and, as first publicly disclosed in
`Breed, D. S. “How Airbags Work” presented at the Interna-
`tional Conference on Seatbelts and Airbags in 1993,
`in
`Canada, occupant position sensing and rear facing child seat
`detection systems are required.
Initially, these systems will solve the out-of-position
`occupant and the rear facing child seat problems related to
`current airbag systems and prevent unneeded airbag deploy-
`ments when a front seat is unoccupied. However, airbags are
`now under development to protect rear seat occupants in
`vehicle crashes and all occupants in side impacts. A system
`will therefore be needed to detect the presence of occupants,
`determine if they are out-of-position and to identify the
`presence of a rear facing child seat in the rear seat. Future
`automobiles are expected to have eight or more airbags as
`protection is sought for rear seat occupants and from side
`impacts.
`In addition to eliminating the disturbance and
`possible harm of unnecessary airbag deployments, the cost
`of replacing these airbags will be excessive if they all deploy
`in an accident needlessly.
Inflators now exist which will adjust the amount of gas
`flowing to the airbag to account for the size and position of
`the occupant and for the severity of the accident. The vehicle
`identification and monitoring system (VIMS) discussed in
U.S. Pat. No. 5,829,782 will control such inflators based on
`the presence and position of vehicle occupants or of a rear
`facing child seat. As discussed more fully below, the instant
`invention is an improvement on that VIMS system and uses
`an advanced optical system comprising one or more CCD
`(charge coupled device) or CMOS arrays and particularly
`active pixel arrays plus a source of illumination preferably
`combined with a trained neural network pattern recognition
`system.
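By way of illustration only, the following Python sketch (not taken from the
patent; the occupant classes, the 8×8 brightness-grid features and the random
placeholder weights standing in for a trained network are all assumptions)
outlines how an image from such a camera might be reduced to a feature
vector, assigned to an occupant class by a small neural network, and used to
gate airbag deployment.

# Illustrative sketch only (not from the patent): an image from an active
# pixel camera is reduced to a coarse feature vector and fed to a small
# neural network that assigns it to an occupant class. The weights below
# are random placeholders standing in for a trained network.
import numpy as np

CLASSES = ["empty_seat", "rear_facing_child_seat", "adult_in_position",
           "adult_out_of_position"]

def extract_features(image, grid=(8, 8)):
    """Downsample a grayscale image to a coarse brightness grid."""
    h, w = image.shape
    gh, gw = grid
    cells = image[:h - h % gh, :w - w % gw].reshape(gh, h // gh, gw, w // gw)
    return cells.mean(axis=(1, 3)).ravel() / 255.0

class OccupantClassifier:
    """Minimal feed-forward network with one hidden layer."""
    def __init__(self, n_inputs=64, n_hidden=32, n_classes=len(CLASSES), seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.1, (n_inputs, n_hidden))
        self.w2 = rng.normal(0, 0.1, (n_hidden, n_classes))

    def classify(self, features):
        hidden = np.tanh(features @ self.w1)
        scores = hidden @ self.w2
        return CLASSES[int(np.argmax(scores))]

def airbag_deployment_allowed(occupant_class):
    """Suppress deployment for an empty seat, a rear facing child seat,
    or an out-of-position occupant."""
    return occupant_class == "adult_in_position"

if __name__ == "__main__":
    frame = np.random.default_rng(1).integers(0, 256, (120, 160))  # fake camera frame
    label = OccupantClassifier().classify(extract_features(frame))
    print(label, "-> deploy:", airbag_deployment_allowed(label))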
Others have observed the need for an occupant out-of-
`position sensor and several methods have been disclosed in
`U.S. patents for determining the position of an occupant of
`a motor vehicle. Each of these systems, however, has
`significant limitations. For example, in White et al. (U.S.
`Pat. No. 5,071,160), a single acoustic sensor and detector is
`described and, as illustrated,
`is mounted lower than the
`steering wheel. White et al. correctly perceive that such a
`sensor could be defeated, and the airbag falsely deployed, by
`an occupant adjusting the control knobs on the radio and
`thus they suggest the use of a plurality of such sensors.
`Mattes et al. (U.S. Pat. No. 5,118,134) describe a variety
`of methods of measuring the change in position of an
`occupant including ultrasonic, active or passive infrared and
`microwave radar sensors, and an electric eye. The sensors
`measure the change in position of an occupant during a crash
and use that information to assess the severity of the crash
`and thereby decide whether or not to deploy the airbag. They
`are thus using the occupant motion as a crash sensor. No
`mention is made of determining the out-of-position status of
`the occupant or of any of the other features of occupant
`monitoring as disclosed in one or more of the above-
`referenced patents and patent applications. It is interesting to
`note that nowhere does Mattes et al. discuss how to use
`active or passive infrared to determine the position of the
`occupant. As pointed out in one or more of the above-
`referenced patents and patent applications, direct occupant
`position measurement based on passive infrared is probably
`not possible and, until very recently, was very difficult and
`expensive with active infrared requiring the modulation of
`an expensive GaAs infrared laser. Since there is no mention
`of these problems,
`the method of use contemplated by
`Mattes et al. must be similar to the electric eye concept
`where position is measured indirectly as the occupant passes
`by a plurality of longitudinally spaced-apart sensors.
`
The object of an occupant out-of-position sensor is to
determine the location of the head and/or chest of the vehicle
occupant relative to the airbag since it is the impact of either
the head or chest with the deploying airbag which can result
`in serious injuries. Both White et al. and Mattes et al.
`describe only lower mounting locations of their sensors in
`front of the occupant such as on the dashboard or below the
`steering wheel. Both such mounting locations are particu-
`larly prone to detection errors due to positioning of the
`occupant’s hands, arms and legs. This would require at least
`three, and preferably more, such sensors and detectors and
`an appropriate logic circuitry which ignores readings from
`some sensors if such readings are inconsistent with others,
`for the case, for example, where the driver’s arms are the
`closest objects to two of the sensors.
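Purely as an illustration, and not something described by White et al. or in
this patent, one simple form such logic circuitry could take is a consistency
check that discards readings disagreeing with the rest; the sensor values,
tolerance and units in the Python fragment below are invented for the example.

# Illustrative sketch only: keep only range readings that agree with the
# median of all sensors to within a tolerance, so a hand near one sensor
# does not masquerade as the occupant's torso.
from statistics import median

def consistent_range(readings_in, tolerance_in=6.0):
    """Return the average of the readings that agree with the median;
    readings and tolerance are in inches (values are hypothetical)."""
    m = median(readings_in)
    agreeing = [r for r in readings_in if abs(r - m) <= tolerance_in]
    return sum(agreeing) / len(agreeing)

# Two sensors see the occupant at about 20 in; a third sees a hand at 4 in.
print(consistent_range([20.5, 19.0, 4.0]))  # the hand reading is discarded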
White et al. also describe the use of error correction
circuitry, without defining or illustrating the circuitry, to
differentiate between the velocity of one of the occupant's
hands, as in the case where he/she is adjusting the knob on
the radio, and the remainder of the occupant. Three ultrasonic
sensors of the type disclosed by White et al. might, in some
cases, accomplish this differentiation if two of them indi-
cated that the occupant was not moving while the third was
indicating that he or she was. Such a combination, however,
would not differentiate between an occupant with both hands
and arms in the path of the ultrasonic transmitter at such a
location that they were blocking a substantial view of the
occupant's head or chest. Since the sizes and driving posi-
`tions of occupants are extremely varied, it is now believed
`that pattern recognition systems and preferably trained pat-
`tern recognition systems, such as neural networks, are
`required when a clear view of the occupant, unimpeded by
`his/her extremities, cannot be guaranteed.
`Fujita et al., in U.S. Pat. No. 5,074,583, describe another
`method of determining the position of the occupant but do
`not use this information to suppress deployment
`if the
`occupant is out-of-position. In fact, the closer the occupant
`gets to the airbag, the faster the inflation rate of the airbag
`is according to the Fujita et al. patent, which thereby
`increases the possibility of injuring the occupant. Fujita et al.
`do not measure the occupant directly but instead determine
`his or her position indirectly from measurements of the seat
`position and the vertical size of the occupant relative to the
`seat (occupant height). This occupant height is determined
`using an ultrasonic displacement sensor mounted directly
`above the occupant’s head.
`As discussed above, the optical systems described herein
`are also applicable for many other sensing applications both
`inside and outside of the vehicle compartment such as for
sensing crashes before they occur as described in U.S. Pat.
`No. 5,829,782, for a smart headlight adjustment system and
`for a blind spot monitor (also disclosed in U.S. provisional
`patent application Ser. No. 60/202,424).
`2. Definitions
`Preferred embodiments of the invention are described
`below and unless specifically noted,
`it is the applicants’
`intention that the words and phrases in the specification and
`claims be given the ordinary and accustomed meaning to
`those of ordinary skill in the applicable art(s). If the appli-
`cant intends any other meaning, he will specifically state he
`is applying a special meaning to a word or phrase.
`Likewise, applicants’ use of the word “function” here is
`not intended to indicate that the applicants seek to invoke the
special provisions of 35 U.S.C. §112, sixth paragraph, to
`define their invention. To the contrary, if applicants wish to
`invoke the provisions of 35 U.S.C. §112, sixth paragraph, to
`define their invention, they will specifically set forth in the
`
`claims the phrases “means for” or “step for” and a function,
`without also reciting in that phrase any structure, material or
`act in support of the function. Moreover, even if applicants
`invoke the provisions of 35 U.S.C. §112, sixth paragraph, to
`define their invention, it is the applicants’ intention that their
`inventions not be limited to the specific structure, material or
`acts that are described in the preferred embodiments herein.
`Rather, if applicants claim their inventions by specifically
invoking the provisions of 35 U.S.C. §112, sixth paragraph,
`it is nonetheless their intention to cover and include any and
`all structure, materials or acts that perform the claimed
`function, along with any and all known or later developed
`equivalent structures, materials or acts for performing the
`claimed function.
`The use of pattern recognition is important to the instant
`invention as well as to one or more of those disclosed in the
above-referenced patents and patent applications.
`“Pattern recognition” as used herein will generally mean any
`system which processes a signal that is generated by an
object, or is modified by interacting with an object, in order
`to determine which one of a set of classes that the object
`belongs to. Such a system might determine only that the
`object is or is not a member of one specified class, or it might
`attempt to assign the object to one of a larger set of specified
`classes, or find that it is not a member of any of the classes
`in the set. The signals processed are generally electrical
`signals coming from transducers which are sensitive to
`either acoustic or electromagnetic radiation and,
`if
`electromagnetic, they can be either visible light, infrared,
`ultraviolet or radar or low frequency radiation as used in
`capacitive sensing systems.
`A trainable or a trained pattern recognition system as used
`herein means a pattern recognition system which is taught
`various patterns by subjecting the system to a variety of
`examples. The most successful such system is the neural
`network. Not all pattern recognition systems are trained
`systems and not all trained systems are neural networks.
`Other pattern recognition systems are based on fuzzy logic,
`sensor fusion, Kalman filters, correlation as well as linear
`and non-linear regression. Still other pattern recognition
`systems are hybrids of more than one system such as
neural-fuzzy systems.
`A pattern recognition algorithm will thus generally mean
`an algorithm applying or obtained using any type of pattern
`recognition system, e.g., a neural network, sensor fusion,
`fuzzy logic, etc.
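As a purely illustrative sketch of what "teaching by examples" means in the
sense defined above (this is not the applicants' system), the following
Python fragment trains a single logistic unit on synthetic two-feature
samples and then classifies a new example; the features, data and learning
constants are all invented.

# Illustrative sketch only: a pattern recognition system is "trained" by
# subjecting it to labeled examples. A single logistic unit is fitted to
# synthetic two-feature samples (for instance, mean image brightness and
# edge count would be plausible features); the data are made up.
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic classes: 0 = empty seat, 1 = occupied seat.
empty = rng.normal([0.2, 0.1], 0.05, (100, 2))
occupied = rng.normal([0.6, 0.5], 0.05, (100, 2))
X = np.vstack([empty, occupied])
y = np.array([0] * 100 + [1] * 100)

w, b = np.zeros(2), 0.0
for _ in range(500):                        # simple gradient-descent training loop
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability of "occupied"
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

test = np.array([0.58, 0.52])               # a new, unseen example
print("P(occupied) =", 1.0 / (1.0 + np.exp(-(test @ w + b))))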
`To “identify” as used herein will usually mean to deter-
`mine that the object belongs to a particular set or class. The
`class may be one containing, for example, all rear facing
`child seats, one containing all human occupants, or all
`human occupants not sitting in a rear facing child seat
`depending on the purpose of the system. In the case where
`a particular person is to be recognized, the set or class will
contain only a single element, i.e., the person to be recog-
`nized.
`To “ascertain the identity of” as used herein with refer-
`ence to an object will generally mean to determine the type
`or nature of the object (obtain information as to what the
object is), i.e., that the object is an adult, an occupied rear
`facing child seat, an occupied front facing child seat, an
`unoccupied rear facing child seat, an unoccupied front
`facing child seat, a child, a dog, a bag of groceries, a car, a
`truck, a tree, a pedestrian, a deer etc.
`An “occupying item” or “occupant” of a seat or “object”
`in a seat may be a living occupant such as a human being or
`a dog, another living organism such as a plant, or an
`inanimate object such as a box or bag of groceries.
`
`A “rear seat” of a vehicle as used herein will generally
mean any seat behind the front seat on which a driver sits.
`Thus, in minivans or other large vehicles where there are
`more than two rows of seats, each row of seats behind the
`driver is considered a rear seat and thus there may be more
`than one “rear seat” in such vehicles. The space behind the
`front seat includes any number of such rear seats as well as
`any trunk spaces or other rear areas such as are present in
`station wagons.
`An optical image will generally mean any type of image
`obtained using electromagnetic radiation including visual,
`infrared and radar radiation.
`In the description herein on anticipatory sensing, the term
`“approaching” when used in connection with the mention of
`an object or vehicle approaching another will usually mean
`the relative motion of the object toward the vehicle having
the anticipatory sensor system. Thus, in a side impact with
`a tree, the tree will be considered as approaching the side of
`the vehicle and impacting the vehicle. In other words, the
`coordinate system used in general will be a coordinate
`system residing in the target vehicle. The “target” vehicle is
`the vehicle that is being impacted. This convention permits
`a general description to cover all of the cases such as where
`(i) a moving vehicle impacts into the side of a stationary
`vehicle, (ii) where both vehicles are moving when they
impact, or (iii) where a vehicle is moving sideways into a
`stationary vehicle, tree or wall.
`“Out-of-position” as used for an occupant will generally
`mean that the occupant, either the driver or a passenger, is
`sufficiently close to an occupant protection apparatus
`(airbag) prior to deployment that he or she is likely to be
`more seriously injured by the deployment event itself than
`by the accident. It may also mean that the occupant is not
`positioned appropriately in order to attain the beneficial,
restraining effects of the deployment of the airbag. As for the
`occupant being too close to the airbag, this typically occurs
`when the occupant’s head or chest
`is closer than some
`distance such as about 5 inches from the deployment door of
`the airbag module. The actual distance where airbag deploy-
`ment should be suppressed depends on the design of the
`airbag module and is typically farther for the passenger
`airbag than for the driver airbag.
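Expressed as a worked example (illustrative only), the suppression decision
just described reduces to a simple distance comparison against a per-seat
threshold; the roughly 5 inch driver-side figure comes from the text above,
while the larger passenger-side value below is a hypothetical placeholder.

# Illustrative sketch only: the out-of-position test expressed as a
# distance threshold from the airbag module's deployment door. The driver
# value reflects the roughly 5 inch figure in the text; the passenger
# value is made up to reflect that the suppression distance is typically
# larger for the passenger airbag.
SUPPRESS_WITHIN_IN = {"driver": 5.0, "passenger": 8.0}  # inches (passenger value hypothetical)

def deployment_suppressed(seat: str, chest_to_door_in: float) -> bool:
    """Return True if the occupant is close enough to the deployment
    door that the airbag should not be deployed."""
    return chest_to_door_in < SUPPRESS_WITHIN_IN[seat]

print(deployment_suppressed("driver", 4.0))     # True  -> out of position
print(deployment_suppressed("passenger", 7.0))  # True  -> within passenger threshold
print(deployment_suppressed("driver", 12.0))    # False -> normal seating position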
`3. Pattern Recognition Prior Art
`Japanese Patent No. 3-42337 (A) to Ueno discloses a
`device for detecting the driving condition of a vehicle driver
`comprising a light emitter for irradiating the face of the
`driver and a means for picking up the image of the driver and
`storing it for later analysis. Means are provided for locating
`the eyes of the driver and then the irises of the eyes and then
`determining if the driver is looking to the side or sleeping.
Ueno determines the state of the eyes of the occupant rather
`than determining the location of the eyes relative to the other
`parts of the vehicle passenger compartment. Such a system
`can be defeated if the driver is wearing glasses, particularly
`sunglasses, or another optical device which obstructs a clear
view of his/her eyes. Pattern recognition technologies such
`as neural networks are not used.
`U.S. Pat. No. 5,008,946 to Ando uses a complicated set of
`rules to isolate the eyes and mouth of a driver and uses this
`information to permit the driver to control the radio, for
`example, or other systems within the vehicle by moving his
`eyes and/or mouth. Ando uses natural light and analyzes
`only the head of the driver. He also makes no use of trainable
`pattern recognition systems such as neural networks, nor is
`there any attempt to identify the contents of the vehicle nor
`of their location relative to the vehicle passenger compart-
`ment. Rather, Ando is limited to control of vehicle devices
`by responding to motion of the driver’s mouth and eyes.
`
`U.S. Pat. No. 5,298,732 to Chen also concentrates on
`locating the eyes of the driver so as to position a light filter
`between a light source such as the sun or the lights of an
`oncoming vehicle, and the driver’s eyes. Chen does not
`explain in detail how the eyes are located but does supply a
`calibration system whereby the driver can adjust the filter so
`that it is at the proper position relative to his or her eyes.
`Chen references the use of automatic equipment for deter-
`mining the location of the eyes but does not describe how
`this equipment works. In any event, there is no mention of
`illumination of the occupant, monitoring the position of the
occupant, other than the eyes, determining the position of the
`eyes relative to the passenger compartment, or identifying
`any other object in the vehicle other than the driver’s eyes.
`Also, there is no mention of the use of a trainable pattern
`recognition system.
`U.S. Pat. No. 5,305,012 to Faris also describes a system
`for reducing the glare from the headlights of an oncoming
`vehicle. Faris locates the eyes of the occupant utilizing two
spaced apart infrared cameras using passive infrared radia-
`tion from the eyes of the driver. Again, Faris is only
`interested in locating the driver’s eyes rel