Higgins-Luthman

(10) Patent No.: US 7,991,522 B2
(45) Date of Patent: *Aug. 2, 2011
`
`(54) IMAGING SYSTEM FOR VEHICLE
`
`(75)
`
`Inventor: Michael J. Higgins-Luthman, Livonia,
`MI (US)
`
(73) Assignee: Donnelly Corporation, Holland, MI (US)
`
( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

This patent is subject to a terminal disclaimer.
`
(21) Appl. No.: 12/979,497
`
(22) Filed: Dec. 28, 2010
`
`(65)
`
`Prior Publication Data
`US 201 110090339 A1
`Apr. 21,201 1
`
Related U.S. Application Data

(63) Continuation of application No. 12/764,355, filed on Apr. 21, 2010, now Pat. No. 7,877,175, which is a continuation of application No. 11/315,675, filed on Dec. 22, 2005, now Pat. No. 7,720,580.

(60) Provisional application No. 60/638,687, filed on Dec. 23, 2004.
`
(51) Int. Cl.
     G05D 1/00    (2006.01)
(52) U.S. Cl. ............................. 701/28; 701/41; 701/301
(58) Field of Classification Search .................... 701/28, 701/36, 301; 250/208.1; 382/199; 340/435, 340/436
     See application file for complete search history.
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`3,882,268 A
`511975 Ogawa et al.
`4,258,979 A
`311981 Mahin
`4,600,913 A
`711986 Caine
`
`4,847,772 A
`4,907,870 A
`4,931,937 A
`4,942,533 A
`4,970,653 A
`4,971,430 A
`5,070,454 A
`5,097,362 A
`5,128,874 A
`
`711989 Michalopoulos et a1
`311990 Bmcker
`611990 Kakinami et al.
`711990 Kakinami et al.
`1111990 Kenue
`1111990 Lynas
`1211991 Griffith
`311992 Lynas
`711992 Bhanu et al.
`(Continued)
`
`FOREIGN PATENT DOCUMENTS
`
`(Continued)
`
`OTHER PUBLICATIONS
`
`Van Leuven et al., "Real-Time Vehicle Tracking in Image
`Sequences", IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054,
`XP010547308.
`Van Leeuwen et al., "Requirements for Motion Estimation in Image
`Sequences for Traffic Applications", IEEE, US, vol. 1, May 24, 1999,
`pp. 145-150, XP010340272.
`
`(Continued)
`
Primary Examiner:
Assistant Examiner: Jamie Figueroa
(74) Attorney, Agent, or Firm: Van Dyke, Gardner, Linn & Burkhart, LLP
`
(57)                  ABSTRACT

An imaging system for a vehicle includes an imaging array sensor and a control. The imaging array sensor comprises a plurality of photo-sensing pixels and is disposed at an exterior rearview mirror assembly at a side of the vehicle with a field of view exterior of the vehicle. The imaging array sensor is operable to capture an image exterior of the vehicle. The control may process the captured images and may determine that the imaging array sensor is misaligned when the imaging array sensor is disposed at the exterior rearview mirror assembly at the side of the vehicle. The control, responsive to a determination of misalignment of the imaging array sensor, may at least partially compensate for the determined misalignment of the imaging array sensor.

50 Claims, 11 Drawing Sheets
`
`
`
`
            U.S. PATENT DOCUMENTS (Continued)

5,177,685 A     1/1993   Davis et al.
5,189,561 A     2/1993   Hong
5,294,991 A     3/1994   Oshima et al.
5,304,980 A     4/1994   Maekawa
5,333,111 A     7/1994   Chaiken et al.
5,355,118 A     10/1994  Fukuhara
5,365,603 A     11/1994  Karmann
5,369,590 A     11/1994  Karasudani
5,424,952 A     6/1995   Asayama
5,426,294 A     6/1995   Kobayashi et al.
5,448,484 A     9/1995   Bullock et al.
5,487,116 A     1/1996   Nakano et al.
5,500,766 A     3/1996   Stonecypher
5,521,633 A     5/1996   Nakajima et al.
5,521,843 A     5/1996   Hashima et al.
Wada et al.
Maekawa
Bechtel et al.
Nishio
Schofield et al.
Sato et al.
Noguchi et al.
Woll et al.
Erickson et al.
Tsutsumi et al.
Yamasaki
Shimoura et al.
Kinoshita et al.
Hardin et al.
Varaprasad et al.
Schofield et al.
Pomerleau
Schierbeek et al.
Varaprasad et al.
Mathieu
Schofield et al.
Schofield et al.
Nakayama
Schofield et al.
O'Farrell et al.
Stam et al.
Breed et al.
Takano et al.
Schofield et al.
Lion
Franke et al.
Kakinami et al.
Kawaziri et al.
Schofield et al.
Schofield et al.
Tamura et al.
Hiwatashi
Nakamura et al.
Anandan et al.
Schofield et al.
Thau et al.
Kramer et al.
Jitsukata et al.
Seo et al.
Juds
Bos
Ishikawa et al.
Schofield et al.
Lemelson et al.
Luckscheiter et al.
DeLine et al.
Sasaki et al.
Kashiwazaki
Lynam
Kodaka et al.
DeLine et al.
Yano et al.
Shimoura et al.
Ishikawa et al.
Franke et al.
Bos et al.
Lee
Schofield et al.
6,330,511 B2    12/2001  Ogura et al.
6,341,523 B2    1/2002   Lynam
6,353,392 B1    3/2002   Schofield et al.
6,396,397 B1    5/2002   Bos et al.
6,411,204 B1    6/2002   Bloomfield et al.
6,420,975 B1    7/2002   DeLine et al.
6,433,676 B2    8/2002   DeLine et al.
6,485,155 B1    11/2002  Duroux et al.
6,498,620 B2    12/2002  Schofield et al.
6,580,996 B1    6/2003   Friedrich
6,590,719 B2    7/2003   Bos
6,594,583 B2    7/2003   Ogura et al.
6,671,607 B2    12/2003  Ishizu et al.
6,690,268 B2    2/2004   Schofield et al.
6,691,008 B2    2/2004   Kondo et al.
6,708,100 B2    3/2004   Russell et al.
6,717,610 B1    4/2004   Bos et al.
6,748,312 B2    6/2004   Russell et al.
6,757,109 B2    6/2004   Bos
6,760,471 B1    7/2004   Raymond
6,823,241 B2    11/2004  Shirato et al.
6,824,281 B2    11/2004  Schofield et al.
6,882,287 B2    4/2005   Schofield
6,928,180 B2    8/2005   Stam et al.
6,941,216 B2    9/2005   Isogai et al.
6,946,978 B2    9/2005   Schofield
6,968,266 B2    11/2005  Ahmed-Zaid et al.
7,005,974 B2    2/2006   McMahon et al.
7,038,577 B2    5/2006   Pawlicki et al.
7,049,945 B2    5/2006   Breed et al.
7,151,844 B2    12/2006  Stevenson et al.
7,188,963 B2    3/2007   Schofield et al.
7,295,682 B2    11/2007  Otsuka et al.
7,370,983 B2    5/2008   DeWind et al.
7,388,475 B2    6/2008   Litkouhi
7,391,014 B2    6/2008   Saccagno
7,420,592 B2    9/2008   Freeman
7,463,138 B2    12/2008  Pawlicki et al.
7,526,103 B2    4/2009   Schofield et al.
7,565,006 B2    7/2009   Stam et al.
7,720,580 B2    5/2010   Higgins-Luthman
7,764,808 B2    7/2010   Zhu et al.
2002/0003571 A1  1/2002   Schofield et al.
2002/0159270 A1  10/2002  Lynam et al.
2002/0188392 A1  12/2002  Breed et al.
2003/0025597 A1  2/2003   Schofield
2003/0052773 A1  3/2003   Sjonell
2003/0156015 A1  8/2003   Winner et al.
2003/0169522 A1  9/2003   Schofield et al.
2003/0236622 A1  12/2003  Schofield
2004/0149504 A1  8/2004   Swoboda et al.
2005/0232469 A1  10/2005  Schofield et al.
2006/0050018 A1  3/2006   Hutzel et al.
2006/0125919 A1  6/2006   Camilleri et al.
2006/0164230 A1  7/2006   DeWind et al.
2006/0171704 A1  8/2006   Bingle et al.
2008/0144924 A1  6/2008   Hoffmann
2010/0002071 A1  1/2010   Ahiska

          FOREIGN PATENT DOCUMENTS

                (Continued)
`
`
`
`
OTHER PUBLICATIONS

Van Leeuwen et al., "Motion Estimation with a Mobile Camera for Traffic Applications", IEEE, US, Oct. 3, 2000, pp. 58-63.
Van Leeuwen et al., "Motion Interpretation for In-Car Vision Systems", IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140.
Van Leeuwen et al., "Motion Estimation in Image Sequences for Traffic Applications", IEEE, US, vol. 1, May 1, 2000, pp. 354-359, XP002529773.
Pratt, "Digital Image Processing, Passage-ED.?", John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771.
Supplemental European Search Report completed May 29, 2009, from corresponding European Application No. EP 03 72 1946.
Hicks et al., "Panoramic Electronic Rear Vision for Automotive Applications", Society of Automotive Engineers (SAE), Mar. 1-4, 1999, 1999-01-0655.
`
`
`
[U.S. Patent, Aug. 2, 2011, Drawing Sheets 1 through 11, US 7,991,522 B2]

[FIGS. 1-20 of the patent drawings appear on Sheets 1-11 and are described in the Brief Description of the Drawings below. Legible sheet labels include FIGS. 3A and 3B (Sheet 2), FIG. 4 (Sheet 3), and FIGS. 12-14 (Sheet 7, with the axis label "0 Gray level values 255" and the annotation "Horizontal edge line").]
`
IMAGING SYSTEM FOR VEHICLE

CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 12/764,355, filed Apr. 21, 2010, now U.S. Pat. No. 7,877,175, which is a continuation of U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, which claims benefit of U.S. provisional application Ser. No. 60/638,687, filed Dec. 23, 2004, which is hereby incorporated herein by reference in its entirety.

FIELD OF THE INVENTION

The present invention relates generally to vision or imaging systems for vehicles and is related to object detection systems and, more particularly, to imaging systems which are operable to determine if a vehicle or object of interest is adjacent to, forward of or rearward of the subject vehicle to assist the driver in changing lanes or parking the vehicle. The present invention also relates generally to a lane departure warning system for a vehicle.

BACKGROUND OF THE INVENTION

Many lane change aid/side object detection/lane departure warning devices or systems and the like have been proposed which are operable to detect a vehicle or other object that is present next to, ahead of or rearward of the equipped vehicle or in an adjacent lane with respect to the equipped vehicle. Such systems typically utilize statistical methodologies to statistically analyze the images captured by a camera or sensor at the vehicle to estimate whether a vehicle or other object is adjacent to the equipped vehicle. Because such systems typically use statistical methodologies to determine a likelihood or probability that a detected object is a vehicle, and for other reasons, the systems may generate false positive detections, where the system indicates that a vehicle is adjacent to, forward of or rearward of the subject vehicle when there is no vehicle adjacent to, forward of or rearward of the subject vehicle, or false negative detections, where the system, for example, indicates that there is no vehicle adjacent to the subject vehicle when there actually is a vehicle in the adjacent lane.

Such known and proposed systems are operable to statistically analyze substantially all of the pixels in a pixelated image as captured by a pixelated image capture device or camera. Also, such systems may utilize algorithmic means, such as flow algorithms or the like, to track substantially each pixel or most portions of the image to determine how substantially each pixel or most portions of the image has changed from one frame to the next. Such frame by frame flow algorithms and systems may not be able to track a vehicle which is moving at generally the same speed as the equipped vehicle, because there may be little or no relative movement between the vehicles and, consequently, little or no change from one frame to the next. Because the systems may thus substantially continuously analyze substantially every pixel for substantially every frame captured and track such pixels and frames from one frame to the next, such systems may require expensive processing controls and computationally expensive software to continuously handle and process substantially all of the data from substantially all of the pixels in substantially each captured image or frame.

Many automotive lane departure warning (LDW) systems (also known as run off road warning systems) are being developed and implemented on vehicles today. These systems warn a driver of a vehicle when their vehicle crosses the road's lane markings or when there is a clear trajectory indicating they will imminently do so. The warnings are typically not activated if the corresponding turn signal is on, as this implies the driver intends to make a lane change maneuver. Additionally, the warning systems may be deactivated below a certain vehicle speed. The driver interface for these systems may be in the form of a visual warning (such as an indicator light) and/or an audible warning (typically a rumble strip sound). One application warns a driver with an indicator light if the vehicle tire is crossing the lane marker and no other vehicle is detected in the driver's corresponding blind spot; and/or further warns the driver with an audible warning if the vehicle is crossing into the adjacent lane and there is a vehicle detected in the driver's blind spot.

There is concern that the current systems will be more of a driver annoyance or distraction than will be acceptable by the consumer market. Using the turn signal as the principal means of establishing to the system that the maneuver is intentional does not reflect typical driving patterns and, thus, many intended maneuvers will cause a warning. As a driver gets annoyed by warnings during intended maneuvers, the driver will likely begin to ignore the warnings, which may result in an accident when the warning is appropriate.

Therefore, there is a need in the art for an object detection system, such as a blind spot detection system or lane change assist system or lane departure warning system or the like, which overcomes the shortcomings of the prior art.

SUMMARY OF THE INVENTION

The present invention is intended to provide an object detection system, such as a blind spot detection system, a lane change assist or aid system or device, a lane departure warning system, a side object detection system, a reverse park aid system, a forward park aid system, a forward, sideward or rearward collision avoidance system, an adaptive cruise control system, a passive steering system or the like, which is operable to detect and/or identify a vehicle or other object of interest at the side, front or rear of the vehicle equipped with the object detection system. The object detection system of the present invention, such as for a lane change assist system, utilizes an edge detection algorithm to detect edges of objects in the captured images and determines if a vehicle is present in a lane adjacent to the equipped or subject vehicle in response to various characteristics of the detected edges, such as the size, location, distance, intensity, relative speed and/or the like. The system processes a subset of the image data captured which is representative of a target zone or area of interest of the scene within the field of view of the imaging system where a vehicle or object of interest is likely to be present. The system processes the detected edges within the image data subset to determine if they correspond with physical characteristics of vehicles and other objects to determine whether the detected edge or edges is/are part of a vehicle or a significant edge or object at or toward the subject vehicle. The system utilizes various filtering mechanisms, such as algorithms executed in software by a system microprocessor, to substantially eliminate or substantially ignore edges or pixels that are not or cannot be indicative of a vehicle or significant object to reduce the processing requirements and to reduce the possibility of false positive signals.
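The patent describes this edge-detection-and-filtering approach only in prose and does not give code. The following Python/OpenCV fragment is a minimal sketch of the idea under assumptions of my own: the function name detect_vehicle_edges, the Canny/Sobel thresholds and the min_length_px and min_strength cutoffs are illustrative and are not taken from the patent.

```python
import cv2

def detect_vehicle_edges(frame_gray, zone, min_length_px=20, min_strength=40.0):
    """frame_gray: 8-bit grayscale frame; zone: (x, y, w, h) target area of interest."""
    x, y, w, h = zone
    roi = frame_gray[y:y + h, x:x + w]            # process only the reduced data set

    # Edge strength (gradient magnitude) and a binary edge map.
    gx = cv2.Sobel(roi, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(roi, cv2.CV_32F, 0, 1, ksize=3)
    strength = cv2.magnitude(gx, gy)
    edges = cv2.Canny(roi, 50, 150)

    # Group edge pixels into segments and keep only those whose size and
    # intensity could plausibly belong to a vehicle in the adjacent lane.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(edges)
    candidates = []
    for i in range(1, num):                        # label 0 is the background
        seg_w = stats[i, cv2.CC_STAT_WIDTH]
        seg_h = stats[i, cv2.CC_STAT_HEIGHT]
        if max(seg_w, seg_h) < min_length_px:
            continue                               # too short to be a significant edge
        if strength[labels == i].mean() < min_strength:
            continue                               # too faint (noise or faint shadow)
        candidates.append((x + stats[i, cv2.CC_STAT_LEFT],
                           y + stats[i, cv2.CC_STAT_TOP], seg_w, seg_h))
    return candidates                              # boxes in full-frame coordinates
```

Restricting the work to the target zone and discarding short or weak edge segments is what keeps the processing load and the false-positive rate down, which is the point the summary above is making.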
The object detection system of the present invention may capture images at a side of the vehicle and may process
various windows of the images to detect a vehicle in the adjacent lane or other object, such as a bicycle, in the adjacent lane. The system may adjust the image processing to account for misalignment of the camera at the side of the vehicle. The system may adjust the area or zone of interest in response to a turning of the vehicle's wheels, such as when the vehicle is turning or curving along a curve in the road. The system may distinguish between vehicles or other objects and shadows of objects/vehicles so that a shadow of a vehicle two lanes over may not be considered a vehicle in the adjacent lane. The system may switch between daytime and nighttime algorithms and may be operable to detect headlamps of vehicles in the adjacent lane.

According to an aspect of the present invention, an imaging system for a vehicle includes an imaging array sensor and a control. The imaging array sensor comprises a plurality of photo-sensing pixels and is positioned at the vehicle with a field of view exteriorly of the vehicle. The imaging array sensor is operable to capture an image of a scene occurring exteriorly of the vehicle. The captured image comprises an image data set representative of the exterior scene. The control algorithmically processes the image data set to a reduced image data set of the image data set. The control processes the reduced image data set to extract information from the reduced image data set. The control selects the reduced image data set based on a steering angle of the vehicle.

Optionally, the control may process the reduced image data set with an edge detection algorithm to extract information from the reduced image data set. The image sensor may be one of (a) part of an exterior rearview mirror assembly of the vehicle and with a field of view at least partially sideward of the vehicle, and (b) at an upper windshield area and behind the windshield of the vehicle and with a field of view forward and through the windshield (such as at an area that is cleaned by the windshield wiper or wipers of the vehicle when the windshield wipers are activated). Optionally, the image sensor may be part of an exterior rearview mirror assembly of the vehicle and with a field of view at least partially sideward of the vehicle, wherein the imaging system comprises a side object detection system for detecting objects at a side of the vehicle. Optionally, the image sensor may be at an upper windshield area and behind the windshield of the vehicle and with a field of view forward and through the windshield, wherein the imaging system comprises a lane departure warning system.

Therefore, the present invention provides an imaging system for use as or in association with a side object detection system and/or a lane departure warning system. The system is operable to process captured image data of a scene occurring exteriorly and along one or both sides of the vehicle to determine if a target vehicle or object of interest is located at or in the lane adjacent to the subject or host vehicle. The imaging system of the present invention may process zones or areas of interest in the captured images and may adjust processing to accommodate any misalignment of the camera that may occur during installation of the camera at the side of the vehicle. The side object detection system may also select or adjust the image processing to select/adjust the areas of interest, such as in response to a steering angle of the vehicle, such as a turning of the wheels of the vehicle, so that the zone or area is adapted for the turning of the subject vehicle. The imaging system of the present invention thus provides enhanced processing of captured images to provide the desired function of the imaging system or associated control or control system or alert system.
`
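As a rough illustration of selecting the reduced image data set from the steering angle, the sketch below shifts a rectangular area of interest laterally as the wheels turn. The geometry and the constants (base_zone, px_per_deg, the sign convention) are assumptions made for illustration only; the patent does not specify them.

```python
def zone_of_interest(img_width, img_height, steering_angle_deg,
                     base_zone=(0.55, 0.35, 0.45, 0.55), px_per_deg=6.0):
    """Return (x, y, w, h) in pixels for the area of interest.

    base_zone is an assumed nominal zone (fractions of the image) used when the
    wheels are straight; px_per_deg is an assumed tuning constant mapping the
    steering angle to a horizontal shift of the zone toward the inside of the turn.
    """
    fx, fy, fw, fh = base_zone
    x = int(fx * img_width)
    y = int(fy * img_height)
    w = int(fw * img_width)
    h = int(fh * img_height)

    # Shift the window as the wheels turn; clamp it to the image bounds.
    shift = int(px_per_deg * steering_angle_deg)
    x = max(0, min(img_width - w, x + shift))
    return x, y, w, h

# Example: a 640x480 frame with the wheels turned 8 degrees toward the camera side.
print(zone_of_interest(640, 480, 8.0))
```

Only the pixels inside the returned window would then be handed to the edge detection and filtering steps described above.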
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a top plan view of a vehicle incorporating the lane change assist system of the present invention;
FIG. 2 is a representation of a captured image of a side area of a vehicle as captured by an imaging sensor in accordance with the present invention;
FIGS. 3A-C are schematics of the captured image of FIG. 2 showing the adjustments that may be made to the image processing to account for misalignment of the image sensor;
FIG. 4 is a schematic showing an adjustment of the area of interest when the wheels of the subject vehicle are turned;
FIG. 5 is a plan view of one of the wheels of the subject vehicle showing the angles of the wheel as it is turned;
FIGS. 6-9 are representations of captured images of the side area of the vehicle, showing how different shadows may be detected;
FIG. 10 is a schematic of the image processing windows useful in processing the captured images in accordance with the present invention;
FIG. 11 is a representation of a captured image of the side area of the vehicle, showing different processing windows used to detect the vehicle in the adjacent lane;
FIG. 12 is a plot of the gray level values of the rows of pixels as a result of a wide line integration in accordance with the present invention;
FIG. 13 is a processing mask for processing the windows of the captured images using gradient calculations in accordance with the present invention;
FIG. 14 is a representation of a captured image showing the shadow of the vehicle in the area adjacent to the vehicle;
FIG. 15 is a process flow diagram showing the bicycle detection function of the present invention; and
FIGS. 16-20 are representations of captured images of the side area of the subject vehicle, showing the headlight detection function of the present invention.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now to the drawings and the illustrative embodiments depicted therein, an object detection system or imaging system, such as a lane change assist or aid system 10, is positioned at a vehicle 12 (such as at an exterior rearview mirror 12a of a vehicle) and is operable to capture an image of a scene occurring sidewardly and rearwardly at or along one or both sides of vehicle 12 (FIG. 1). Lane change assist system 10 comprises an image capture device or sensor or camera 14, which captures an image of the scene occurring toward a respective side of the vehicle 12, and a control 16, which processes the captured image to determine whether another vehicle 18 is present at the side of vehicle 12, as discussed below. Control 16 may be further operable to activate a warning indicator or display or signal device to alert the driver of vehicle 12 that another vehicle is present at the side of vehicle 12. The warning or alert signal may be provided to the driver of vehicle 12 in response to another vehicle being detected at the blind spot area (as shown in FIG. 1) and may only be provided when the driver of vehicle 12 actuates a turn signal toward that side or begins turning the subject vehicle 12 toward that side to change lanes into the lane occupied by the other detected vehicle 18. The control and imaging system may utilize aspects described in U.S. patent application Ser. No. 10/427,051, filed Apr. 30, 2003 by Pawlicki et al. for OBJECT DETECTION SYSTEM FOR VEHICLE, now U.S. Pat. No. 7,038,577, which is hereby incorporated herein by reference. Reference is made to U.S. patent application Ser. No. 10/427,051, for a discussion of image processing techniques and control functions useful with the present invention.
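The alert gating just described (warn only when a vehicle is detected at the blind spot and the driver signals or steers toward that side) can be summarized in a few lines. The signal names, the sign convention and the steering threshold in the sketch below are assumptions for illustration and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    turn_signal: str           # "left", "right" or "off"
    steering_angle_deg: float  # positive = steering right (assumed convention)

def should_warn(detected_side, state, steer_threshold_deg=3.0):
    """detected_side: 'left' or 'right', the side where another vehicle was detected."""
    if detected_side not in ("left", "right"):
        return False
    signaling_toward = state.turn_signal == detected_side
    steering_toward = (state.steering_angle_deg > steer_threshold_deg
                       if detected_side == "right"
                       else state.steering_angle_deg < -steer_threshold_deg)
    return signaling_toward or steering_toward

# Example: vehicle detected on the right while the right turn signal is on.
print(should_warn("right", VehicleState(turn_signal="right", steering_angle_deg=0.0)))
```

Gating the alert on the driver's own lane-change intent is what distinguishes this behavior from a warning that fires whenever anything is present in the blind spot.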
Optionally, the imaging system and object detection system of the present invention may utilize aspects of the imaging systems or detection systems of the types described in U.S. Pat. Nos. 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 10/427,051, filed Apr. 30, 2003 by Pawlicki et al. for OBJECT DETECTION SYSTEM FOR VEHICLE, now U.S. Pat. No. 7,038,577; and/or Ser. No. 11/239,980, filed Sep. 30, 2005 by Camilleri et al. for VISION SYSTEM FOR VEHICLE, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004 by Camilleri et al. for IMAGING AND DISPLAY SYSTEM FOR VEHICLE; Ser. No. 60/614,644, filed Sep. 30, 2004; and/or Ser. No. 60/618,686, filed Oct. 14, 2004 by Laubinger for VEHICLE IMAGING SYSTEM, or of the reverse or backup aid systems, such as rearwardly directed vehicle vision systems utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,760,962; 5,670,935; 6,201,642; 6,396,397; 6,498,620; 6,717,610 and/or 6,757,109, and/or U.S. patent application Ser. No. 10/418,486, filed Apr. 18, 2003 by McMahon et al. for VEHICLE IMAGING SYSTEM, now U.S. Pat. No. 7,005,974, or of automatic headlamp controls, such as the types described in U.S. Pat. Nos. 5,796,094 and/or 5,715,093; and/or U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; and U.S. provisional applications, Ser. No. 60/607,963, filed Sep. 8, 2004 by Schofield for IMAGING SYSTEM FOR VEHICLE; and Ser. No. 60/562,480, filed Apr. 15, 2004 by Schofield for IMAGING SYSTEM FOR VEHICLE, or of rain sensors, such as the types described in U.S. Pat. Nos. 6,250,148 and 6,341,523, or of other imaging systems, such as the types described in U.S. Pat. Nos. 6,353,392 and 6,313,454, which may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types disclosed in commonly assigned U.S. Pat. Nos. 5,550,677; 5,760,962; 6,097,023 and 5,796,094, and U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999 by Schofield et al. for VEHICLE HEADLIGHT CONTROL USING IMAGING SENSOR, now U.S. Pat. No. 7,339,149, and/or PCT Application No. PCT/US2003/036177, filed Nov. 14, 2003, published Jun. 3, 2004 as PCT Publication No. WO 2004/047421 A3, with all of the above referenced U.S. patents, patent applications and provisional applications and PCT applications being commonly assigned and being hereby incorporated herein by reference.

The image sensor may be located at the vehicle so as to have a sideward field of view, such as at an exterior rearview mirror of the vehicle, such as generally or at least partially within an exterior rearview mirror of the vehicle. For example, an image sensor may be located within an exterior rearview mirror assembly of the vehicle and may have a generally rearwardly and sidewardly field of view through a transflective reflective element of the exterior rearview mirror assembly. In such an application, the image sensor may be incorporated in or associated with a side object detection system that detects objects at a side or blind spot area of the controlled or subject vehicle. Optionally, the image sensor may have a generally forward field of view to capture images of a scene occurring forwardly of the vehicle. The image sensor . . . of the windshield so as to have a field of view forwardly and through the windshield of the vehicle, preferably at a location that is cleaned by the windshield wipers of the vehicle, such as at an interior rearview mirror assembly of the vehicle or at an accessory module or windshield electronics module or the like. In such an application, the image sensor may be incorporated in or associated with a lane departure warning system that detects a departure of the controlled or subject vehicle from a lane as the vehicle travels along a road.

Camera Calibration:

In order to verify that the camera or imaging sensor is mounted at the vehicle (such as at an exterior portion of the vehicle) within a desired tolerance limit so as to provide the desired field of view, the camera may detect the side of the vehicle (shown at 30 in FIG. 2) and/or the door handle or handles (the front door handle is shown at 32a in FIG. 2, while the rear door handle is shown at 32b in FIG. 2) of the vehicle, and the control may confirm that they are in the expected location in the captured images. If the control determines that the camera is not aligned or aimed at the desired location (such as by determining that the vehicle edge and/or door handle/handles are not at the expected location), the control may adjust the image and/or image processing to account for any such misalignment of the camera. For example, the degree of misalignment may be calculated, and the image processing may be adjusted or shifted and/or rotated to position the reference structure at the appropriate location in the captured images.

For example, the algorithm may function to preprocess the captured image by a histogram equalization to improve the image contrast. The algorithm may then process the captured images via an edge detection in the area of interest to extract the expected edge of the vehicle (shown at 34 in FIG. 2). The algorithm may filter the image data to remove noise in the edge detected image. The algorithm may perform a coarse structure fitting (such as via a line fitting algorithm or contour fitting algorithm or the like) of the vehicle side and door handles in the captured image for verifying the camera mounting is within the desired or appropriate tolerance limit. The algorithm may further perform a fine structure fitting (such as via a correlation algorithm or contour fitting algorithm or the like) for calculating shift in yaw, pitch and roll. As shown in FIGS. 3A-C, the actual or detected vehicle edges may be misaligned or separated from the expected vehicle edges, such that the image processing may be adjusted to shift the captured image data accordingly to accommodate such misalignment of the camera. Based on the results of the image processing techniques, data or information of the yaw, pitch and roll may be used to set the polygon co-ordinates and H depression pixel calibration parameters, so that the expected vehicle edges are substantially aligned with the actual or detected vehicle edges.
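A minimal sketch of that calibration flow, assuming an OpenCV-style pipeline, is given below: histogram equalization, edge detection in the area of interest, a coarse line fit of the vehicle-side edge, and comparison against the expected reference line. The function name estimate_misalignment, the thresholds and the single-line fit are illustrative simplifications of my own; the patent's fine structure fitting of yaw, pitch and roll is not reproduced here.

```python
import cv2
import numpy as np

def estimate_misalignment(frame_gray, expected_line, roi):
    """expected_line: ((x0, y0), (x1, y1)) expected vehicle edge in image coordinates.
    roi: (x, y, w, h) area of interest in which that edge should appear."""
    x, y, w, h = roi
    patch = cv2.equalizeHist(frame_gray[y:y + h, x:x + w])   # improve contrast
    patch = cv2.GaussianBlur(patch, (5, 5), 0)               # suppress noise
    edges = cv2.Canny(patch, 50, 150)

    # Coarse structure fit: take the longest straight line in the edge map.
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=h // 2, maxLineGap=10)
    if lines is None:
        return None                                          # vehicle edge not found
    x0, y0, x1, y1 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    detected_angle = np.degrees(np.arctan2(y1 - y0, x1 - x0))

    (ex0, ey0), (ex1, ey1) = expected_line
    expected_angle = np.degrees(np.arctan2(ey1 - ey0, ex1 - ex0))

    # Offsets the control could use to shift/rotate its processing windows.
    rotation_deg = detected_angle - expected_angle
    shift_px = (x + x0) - ex0
    return {"rotation_deg": float(rotation_deg), "horizontal_shift_px": float(shift_px)}
```

The returned rotation and shift stand in for the yaw/pitch/roll information that the patent says is used to reposition the polygon coordinates and calibration parameters.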
After the image data or image processing is adjusted to account for any misalignment of the camera at the vehicle, the camera may capture images of the scene occurring exteriorly of the vehicle and at that side of the vehicle, and the control may process the images to detect objects or lane markers or the like at the side of the vehicle and/or rearward of the vehicle, and may utilize aspects described in U.S. patent application Ser. No. 10/427,051, filed Apr. 30, 2003 by Pawlicki et al. for OBJECT DETECTION SYSTEM FOR VEHICLE, now U.S. Pat. No. 7,038,577, which is hereby incorporated herein by reference.

Adjustment of Zone when Vehicle Turning:

Optionally, the control may perform a curve processing or