(12) United States Patent
Higgins-Luthman

(10) Patent No.: US 7,877,175 B2
(45) Date of Patent: *Jan. 25, 2011
(54) IMAGING SYSTEM FOR VEHICLE

(75) Inventor: Michael J. Higgins-Luthman, Livonia, MI (US)

(73) Assignee: Donnelly Corporation, Holland, MI (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

    This patent is subject to a terminal disclaimer.

(21) Appl. No.: 12/764,355

(22) Filed: Apr. 21, 2010

(65) Prior Publication Data
    US 2010/0228435 A1    Sep. 9, 2010

Related U.S. Application Data

(63) Continuation of application No. 11/315,675, filed on Dec. 22, 2005, now Pat. No. 7,720,580.
(60) Provisional application No. 60/638,687, filed on Dec. 23, 2004.

(51) Int. Cl.
    G05D 1/00 (2006.01)
(52) U.S. Cl. ............................ 701/28; 701/41; 701/301
(58) Field of Classification Search ................... 701/28, 701/36, 301; 250/208.1; 382/199; 340/435, 340/436
    See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

4,258,979 A    3/1981    Mahin
4,600,913 A    7/1986    Caine
4,847,772 A    7/1989    Michalopoulos et al.
4,907,870 A    3/1990    Brucker
4,931,937 A    6/1990    Kakinami et al.
4,942,533 A    7/1990    Kakinami et al.

(Continued)

FOREIGN PATENT DOCUMENTS

EP    0 354 261 A1    2/1990

(Continued)

OTHER PUBLICATIONS

Van Leuven et al., "Real-Time Vehicle Tracking in Image Sequences", IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308.

(Continued)

Primary Examiner-Khoi Tran
Assistant Examiner-Jaime Figueroa
(74) Attorney, Agent, or Firm-Van Dyke, Gardner, Linn & Burkhart, LLP
(57) ABSTRACT

An imaging system for a vehicle includes an imaging array sensor and a control. The image array sensor comprises a plurality of photo-sensing pixels and is positioned at the vehicle with a field of view exterior of the vehicle. The imaging array sensor is operable to capture an image exterior of the vehicle. The control may process the captured images and may determine that the imaging array sensor is not aligned within a desired tolerance when the imaging array sensor is positioned at the vehicle. The control, responsive to a determination of a misalignment of the imaging array sensor at the vehicle, may adjust at least one of the captured images or an image data set and the image processing to at least partially compensate for the determined misalignment of the imaging array sensor.

27 Claims, 11 Drawing Sheets
U.S. PATENT DOCUMENTS

4,970,653 A     11/1990   Kenue
4,971,430 A     11/1990   Lynas
5,097,362 A      3/1992   Lynas
5,128,874 A      7/1992   Bhanu et al.
5,177,685 A      1/1993   Davis et al.
5,189,561 A      2/1993   Hong
5,304,980 A      4/1994   Maekawa
5,365,603 A     11/1994   Karmann
5,369,590 A     11/1994   Karasudani
5,424,952 A      6/1995   Asayama
5,426,294 A      6/1995   Kobayashi et al.
5,448,484 A      9/1995   Bullock et al.
5,487,116 A      1/1996   Nakano et al.
5,500,766 A      3/1996   Stonecypher
5,521,633 A      5/1996   Nakajima et al.
5,537,003 A      7/1996   Bechtel et al.
5,541,590 A      7/1996   Nishio
5,550,677 A      8/1996   Schofield et al.
5,581,464 A     12/1996   Woll et al.
5,617,085 A      4/1997   Tsutsumi et al.
5,642,093 A      6/1997   Kinoshita et al.
5,642,299 A      6/1997   Hardin et al.
5,670,935 A      9/1997   Schofield et al.
5,715,093 A      2/1998   Schierbeek et al.
5,745,310 A      4/1998   Mathieu
5,760,962 A      6/1998   Schofield et al.
5,786,772 A      7/1998   Schofield et al.
5,790,403 A      8/1998   Nakayama
5,796,094 A      8/1998   Schofield et al.
5,837,994 A     11/1998   Stam et al.
5,845,000 A     12/1998   Breed et al.
5,877,897 A      3/1999   Schofield et al.
5,884,212 A      3/1999   Lion
5,890,083 A      3/1999   Franke et al.
5,892,855 A      4/1999   Kakinami et al.
5,929,784 A      7/1999   Kawaziri et al.
5,929,786 A      7/1999   Schofield et al.
5,949,331 A      9/1999   Schofield et al.
6,005,492 A     12/1999   Tamura et al.
6,009,337 A     12/1999   Vaisanen et al.
6,044,321 A      3/2000   Nakamura et al.
6,049,619 A      4/2000   Anandan et al.
6,097,023 A      8/2000   Schofield et al.
6,104,552 A      8/2000   Thau et al.
6,169,940 B1     1/2001   Jitsukata et al.
6,173,222 B1     1/2001   Seo et al.
6,201,236 B1     3/2001   Juds
6,201,642 B1     3/2001   Bos
6,218,960 B1     4/2001   Ishikawa et al.
6,222,447 B1     4/2001   Schofield et al.
6,226,389 B1     5/2001   Lemelson et al.
6,226,592 B1     5/2001   Luckscheiter et al.
6,243,003 B1     6/2001   DeLine et al.
6,246,961 B1     6/2001   Sasaki et al.
6,249,214 B1     6/2001   Kashiwazaki
6,250,148 B1     6/2001   Lynam
6,269,308 B1     7/2001   Kodaka et al.
6,278,377 B1     8/2001   DeLine et al.
6,282,483 B1     8/2001   Yano et al.
6,285,393 B1     9/2001   Shimoura et al.
6,292,111 B1     9/2001   Ishikawa et al.
6,292,752 B1     9/2001   Franke et al.
6,313,454 B1    11/2001   Bos et al.
6,317,057 B1    11/2001   Lee
6,320,176 B1    11/2001   Schofield et al.
6,330,511 B2    12/2001   Ogura et al.
6,341,523 B2     1/2002   Lynam
6,353,392 B1     3/2002   Schofield et al.
6,396,397 B1     5/2002   Bos et al.
6,411,204 B1     6/2002   Bloomfield et al.
6,420,975 B1     7/2002   DeLine et al.
6,433,676 B2     8/2002   DeLine et al.
6,485,155 B1    11/2002   Duroux et al.
6,498,620 B2    12/2002   Schofield et al.
6,580,996 B1     6/2003   Friedrich
6,590,719 B2     7/2003   Bos
6,594,583 B2     7/2003   Ogura et al.
6,671,607 B2    12/2003   Ishizu et al.
6,690,268 B2     2/2004   Schofield et al.
6,691,008 B2     2/2004   Kondo et al.
6,708,100 B2     3/2004   Russell et al.
6,717,610 B1     4/2004   Bos et al.
6,748,312 B2     6/2004   Russell et al.
6,757,109 B2     6/2004   Bos
6,823,241 B2    11/2004   Shirato et al.
6,824,281 B2    11/2004   Schofield et al.
6,882,287 B2     4/2005   Schofield
6,941,216 B2     9/2005   Isogai et al.
6,968,266 B2    11/2005   Ahmed-Zaid et al.
7,005,974 B2     2/2006   McMahon et al.
7,038,577 B2     5/2006   Pawlicki et al.
7,049,945 B2 *   5/2006   Breed et al. ................. 340/435
7,151,844 B2    12/2006   Stevenson et al.
7,295,682 B2    11/2007   Otsuka et al.
7,463,138 B2    12/2008   Pawlicki et al.
7,720,580 B2 *   5/2010   Higgins-Luthman ............ 701/28
2002/0003571 A1     1/2002   Schofield et al.
2002/0159270 A1    10/2002   Lynam et al.
2002/0188392 A1    12/2002   Breed et al.
2003/0025597 A1     2/2003   Schofield
2003/0052771 A1     3/2003   Sjonell
2003/0156015 A1     8/2003   Winner et al.
2003/0169522 A1     9/2003   Schofield et al.
2003/0236622 A1    12/2003   Schofield
2004/0149504 A1     8/2004   Swoboda et al.
2005/0232469 A1    10/2005   Schofield et al.
2006/0125919 A1     6/2006   Camilleri et al.
2006/0164230 A1     7/2006   DeWind et al.
2006/0171704 A1     8/2006   Bingle et al.

FOREIGN PATENT DOCUMENTS
OTHER PUBLICATIONS

Van Leeuwen et al., "Requirements for Motion Estimation in Image Sequences for Traffic Applications", IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272.
Van Leeuwen et al., "Motion Estimation with a Mobile Camera for Traffic Applications", IEEE, US, Oct. 3, 2000, pp. 58-63.
Van Leeuwen et al., "Motion Interpretation for In-Car Vision Systems", IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140.
Van Leeuwen et al., "Motion Estimation in Image Sequences for Traffic Applications", vol. 1, May 1, 2000, pp. 354-359, XP002529773.
Pratt, "Digital Image Processing, Passage-ED.3", John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771.
Supplemental European Search Report completed May 29, 2009, from corresponding European Application No. EP 03 72 1946.

* cited by examiner
`

`

U.S. Patent          Jan. 25, 2011          US 7,877,175 B2

[Sheet 1 of 11: drawing not recoverable from the text extraction.]
[Sheet 2 of 11: FIG. 2 (captured side-area image annotated with the car back door handle, top door handle / reference structure, expected edge, and area of interest), FIG. 3A and FIG. 3B.]
[Sheet 3 of 11: schematic with a point (X, Y) marked as a reference point at an edge.]
[Sheet 4 of 11: wheel-angle schematic, including a top view of the wheel and a note on the required wheel angle and its sign convention.]
[Sheet 5 of 11: FIG. 9.]
[Sheet 6 of 11: drawing not recoverable from the text extraction.]
[Sheet 7 of 11: FIG. 12 (gray level values, 0 to 255), FIG. 13, and FIG. 14 (horizontal edge line; sloped edge line).]
[Sheet 8 of 11: drawing not recoverable from the text extraction.]
[Sheet 9 of 11: drawing not recoverable from the text extraction.]
[Sheet 10 of 11: drawing not recoverable from the text extraction.]
[Sheet 11 of 11: FIG. 20.]
IMAGING SYSTEM FOR VEHICLE

CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, which claims benefit of U.S. provisional application, Ser. No. 60/638,687, filed Dec. 23, 2004, which is hereby incorporated herein by reference in its entirety.

FIELD OF THE INVENTION

The present invention relates generally to vision or imaging systems for vehicles and is related to object detection systems and, more particularly, to imaging systems which are operable to determine if a vehicle or object of interest is adjacent to, forward of or rearward of the subject vehicle to assist the driver in changing lanes or parking the vehicle. The present invention also relates generally to a lane departure warning system for a vehicle.

BACKGROUND OF THE INVENTION

Many lane change aid/side object detection/lane departure warning devices or systems and the like have been proposed which are operable to detect a vehicle or other object that is present next to, ahead of or rearward of the equipped vehicle or in an adjacent lane with respect to the equipped vehicle. Such systems typically utilize statistical methodologies to statistically analyze the images captured by a camera or sensor at the vehicle to estimate whether a vehicle or other object is adjacent to the equipped vehicle. Because such systems typically use statistical methodologies to determine a likelihood or probability that a detected object is a vehicle, and for other reasons, the systems may generate false positive detections, where the system indicates that a vehicle is adjacent to, forward of or rearward of the subject vehicle when there is no vehicle adjacent to, forward of or rearward of the subject vehicle, or false negative detections, where the system, for example, indicates that there is no vehicle adjacent to the subject vehicle when there actually is a vehicle in the adjacent lane.

Such known and proposed systems are operable to statistically analyze substantially all of the pixels in a pixelated image as captured by a pixelated image capture device or camera. Also, such systems may utilize algorithmic means, such as flow algorithms or the like, to track substantially each pixel or most portions of the image to determine how substantially each pixel or most portions of the image has changed from one frame to the next. Such frame by frame flow algorithms and systems may not be able to track a vehicle which is moving at generally the same speed as the equipped vehicle, because there may be little or no relative movement between the vehicles and, consequently, little or no change from one frame to the next. Because the systems may thus substantially continuously analyze substantially every pixel for substantially every frame captured and track such pixels and frames from one frame to the next, such systems may require expensive processing controls and computationally expensive software to continuously handle and process substantially all of the data from substantially all of the pixels in substantially each captured image or frame.

Many automotive lane departure warning (LDW) systems (also known as run off road warning systems) are being developed and implemented on vehicles today. These systems warn a driver of a vehicle when their vehicle crosses the road's land markings or when there is a clear trajectory indicating they will imminently do so. The warnings are typically not activated if the corresponding turn signal is on, as this implies the driver intends to make a lane change maneuver. Additionally, the warning systems may be deactivated below a certain vehicle speed. The driver interface for these systems may be in the form of a visual warning (such as an indicator light) and/or an audible warning (typically a rumble strip sound). One application warns a driver with an indicator light if the vehicle tire is crossing the lane marker and no other vehicle is detected in the driver's corresponding blind spot; and/or further warns the driver with an audible warning if the vehicle is crossing into the adjacent lane and there is a vehicle detected in the driver's blind spot.

There is concern that the current systems will be more of a driver annoyance or distraction than will be acceptable by the consumer market. Using the turn signal as the principle means of establishing to the warning system that the maneuver is intentional does not reflect typical driving patterns and, thus, many intended maneuvers will cause a warning. As a driver gets annoyed by warnings during intended maneuvers, the driver will likely begin to ignore the warnings, which may result in an accident when the warning is appropriate.

Therefore, there is a need in the art for an object detection system, such as a blind spot detection system or lane change assist system or lane departure warning system or the like, which overcomes the short comings of the prior art.

SUMMARY OF THE INVENTION

The present invention is intended to provide an object detection system, such as a blind spot detection system, a lane change assist or aid system or device, a lane departure warning system, a side object detection system, a reverse park aid system, a forward park aid system, a forward, sideward or rearward collision avoidance system, an adaptive cruise control system, a passive steering system or the like, which is operable to detect and/or identify a vehicle or other object of interest at the side, front or rear of the vehicle equipped with the object detection system. The object detection system of the present invention, such as for a lane change assist system, utilizes an edge detection algorithm to detect edges of objects in the captured images and determines if a vehicle is present in a lane adjacent to the equipped or subject vehicle in response to various characteristics of the detected edges, such as the size, location, distance, intensity, relative speed and/or the like. The system processes a subset of the image data captured which is representative of a target zone or area of interest of the scene within the field of view of the imaging system where a vehicle or object of interest is likely to be present. The system processes the detected edges within the image data subset to determine if they correspond with physical characteristics of vehicles and other objects to determine whether the detected edge or edges is/are part of a vehicle or a significant edge or object at or toward the subject vehicle. The system utilizes various filtering mechanisms, such as algorithms executed in software by a system microprocessor, to substantially eliminate or substantially ignore edges or pixels that are not or cannot be indicative of a vehicle or significant object to reduce the processing requirements and to reduce the possibility of false positive signals.

The object detection system of the present invention may capture images at a side of the vehicle and may process various windows of the images to detect a vehicle in the adjacent lane or other object, such as a bicycle, in the adjacent lane. The system may adjust the image processing to account
for misalignment of the camera at the side of the vehicle. The system may adjust the area or zone of interest in response to a turning of the vehicle's wheels, such as when the vehicle is turning or curving along a curve in the road. The system may distinguish between vehicles or other objects and shadows of objects/vehicles so that a shadow of a vehicle two lanes over may not be considered a vehicle in the adjacent lane. The system may switch between daytime and nighttime algorithms and may be operable to detect headlamps of vehicles in the adjacent lane.

According to an aspect of the present invention, an imaging system for a vehicle includes an imaging array sensor and a control. The image array sensor comprises a plurality of photo-sensing pixels and is positioned at the vehicle with a field of view exteriorly of the vehicle. The imaging array sensor is operable to capture an image of a scene occurring exteriorly of the vehicle. The captured image comprises an image data set representative of the exterior scene. The control algorithmically processes the image data set to a reduced image data set of the image data set. The control processes the reduced image data set to extract information from the reduced image data set. The control selects the reduced image data set based on a steering angle of the vehicle.

Optionally, the control may process the reduced image data set with an edge detection algorithm to extract information from the reduced image data set. The image sensor may be one of (a) part of an exterior rearview mirror assembly of the vehicle and with a field of view at least partially sideward of the vehicle, and (b) at an upper windshield area and behind the windshield of the vehicle and with a field of view forward and through the windshield (such as at an area that is cleaned by the windshield wiper or wipers of the vehicle when the windshield wipers are activated). Optionally, the image sensor may be part of an exterior rearview mirror assembly of the vehicle and with a field of view at least partially sideward of the vehicle, wherein the imaging system comprises a side object detection system for detecting objects at a side of the vehicle. Optionally, the image sensor may be at an upper windshield area and behind the windshield of the vehicle and with a field of view forward and through the windshield, wherein the imaging system comprises a lane departure warning system.

Therefore, the present invention provides an imaging system for use as or in association with a side object detection system and/or a lane departure warning system. The system is operable to process captured image data of a scene occurring exteriorly and along one or both sides of the vehicle to determine if a target vehicle or object of interest is located at or in the lane adjacent to the subject or host vehicle. The imaging system of the present invention may process zones or areas of interest in the captured images and may adjust processing to accommodate any misalignment of the camera that may occur during installation of the camera at the side of the vehicle. The side object detection system may also select or adjust the image processing to select/adjust the areas of interest, such as in response to a steering angle of the vehicle, such as a turning of the wheels of the vehicle, so that the zone or area is adapted for the turning of the subject vehicle. The imaging system of the present invention thus provides enhanced processing of captured images to provide the desired function of the imaging system or associated control or control system or alert system.

These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a top plan view of a vehicle incorporating the object detection system of the present invention;
FIG. 2 is a representation of a captured image of a side area of a vehicle as captured by an imaging sensor in accordance with the present invention;
FIGS. 3A-C are schematics of the captured image of FIG. 2 showing the adjustments that may be made to the image processing to account for misalignment of the image sensor;
FIG. 4 is a schematic showing an adjustment of the area of interest when the wheels of the subject vehicle are turned;
FIG. 5 is a plan view of one of the wheels of the subject vehicle showing the angles of the wheel as it is turned;
FIGS. 6-9 are representations of captured images of the side area of the vehicle, showing how different shadows may be detected;
FIG. 10 is a schematic of the image processing windows useful in processing the captured images in accordance with the present invention;
FIG. 11 is a representation of a captured image of the side area of the vehicle, showing different processing windows used to detect the vehicle in the adjacent lane;
FIG. 12 is a plot of the gray level values of the rows of pixels as a result of a wide line integration in accordance with the present invention;
FIG. 13 is a processing mask for processing the windows of the captured images using gradient calculations in accordance with the present invention;
FIG. 14 is a representation of the captured image, showing a shadow of the vehicle in the area adjacent to the vehicle;
FIG. 15 is a process flow diagram of the detection function of the present invention; and
FIGS. 16-20 are representations of captured images of the side area of the subject vehicle, showing the headlight detection function of the present invention.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now to the drawings and the illustrative embodiments depicted therein, an object detection system or imaging system, such as a lane change assist or aid system 10, is positioned at a vehicle 12 (such as at an exterior rearview mirror 12a of a vehicle) and is operable to capture an image of a scene occurring sidewardly and rearwardly at or along one or both sides of vehicle 12 (FIG. 1). Lane change assist system 10 includes an image capture device or sensor or camera 14, which captures an image of the scene occurring toward the side of the vehicle 12, and a control 16, which processes the captured image to determine whether another vehicle 18 is present at the side of vehicle 12, as discussed below. Control 16 may be further operable to activate a warning indicator or display or signal device to alert the driver of vehicle 12 that another vehicle is present at the side of vehicle 12. The warning or alert signal may be provided to the driver of vehicle 12 in response to another vehicle being detected at the blind spot area (as shown in FIG. 1) and may only be provided when the driver of vehicle 12 actuates a turn signal toward that side or begins turning the subject vehicle 12 toward that side to change lanes into the lane occupied by the other detected vehicle 18. The control and imaging system may utilize aspects described in U.S. patent application Ser.
No. 10/427,051, filed Apr. 30, 2003 by Pawlicki et al. for OBJECT DETECTION SYSTEM FOR VEHICLE, now U.S. Pat. No. 7,038,577, which is hereby incorporated herein by reference. Reference is made to U.S. patent application Ser. No. 10/427,051 for a discussion of image processing techniques and control functions useful with the present invention.

Optionally, the imaging system and object detection system of the present invention may utilize aspects of the imaging systems or detection systems of the types described in U.S. Pat. Nos. 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 10/427,051, filed Apr. 30, 2003 by Pawlicki et al. for OBJECT DETECTION SYSTEM FOR VEHICLE, now U.S. Pat. No. 7,038,577; and/or Ser. No. 11/239,980, filed Sep. 30, 2005 by Camilleri et al. for VISION SYSTEM FOR VEHICLE, and/or U.S. Provisional Applications, Ser. No. 60/628,709, filed Nov. 17, 2004 by Camilleri et al. for IMAGING AND DISPLAY SYSTEM FOR VEHICLE; Ser. No. 60/614,644, filed Sep. 30, 2004; and/or Ser. No. 60/618,686, filed Oct. 14, 2004 by Laubinger for VEHICLE IMAGING SYSTEM, or of the reverse or backup aid systems, such as rearwardly directed vehicle vision systems utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,760,962; 5,670,935; 6,201,642; 6,396,397; 6,498,620; 6,717,610 and/or 6,757,109, and/or U.S. patent application Ser. No. 10/418,486, filed Apr. 18, 2003 by McMahon et al. for VEHICLE IMAGING SYSTEM, now U.S. Pat. No. 7,005,974, or of automatic headlamp controls, such as the types described in U.S. Pat. Nos. 5,796,094 and/or 5,715,093; and/or U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; and U.S. provisional applications, Ser. No. 60/607,963, filed Sep. 8, 2004 by Schofield for IMAGING SYSTEM FOR VEHICLE; and Ser. No. 60/562,480, filed Apr. 15, 2004 by Schofield for IMAGING SYSTEM FOR VEHICLE, or of rain sensors, such as the types described in U.S. Pat. Nos. 6,250,148 and 6,341,523, or of other imaging systems, such as the types described in U.S. Pat. Nos. 6,353,392 and 6,313,454, which may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types disclosed in commonly assigned U.S. Pat. Nos. 5,550,677; 5,760,962; 6,097,023 and 5,796,094, and U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999 by Schofield et al. for VEHICLE HEADLIGHT CONTROL USING IMAGING SENSOR, now U.S. Pat. No. 7,339,149, and/or PCT Application No. PCT/US2003/036177, filed Nov. 14, 2003, published Jun. 3, 2004 as PCT Publication No. WO 2004/047421 A3, with all of the above referenced U.S. patents, patent applications and provisional applications and PCT applications being commonly assigned and being hereby incorporated herein by reference.

The image sensor may be located at the vehicle so as to have a sideward field of view, such as at an exterior rearview mirror of the vehicle, such as generally or at least partially within an exterior rearview mirror of the vehicle. For example, an image sensor may be located within an exterior rearview mirror assembly of the vehicle and may have a generally rearwardly and sidewardly field of view through a transflective reflective element of the exterior rearview mirror assembly. In such an application, the image sensor may be incorporated in or associated with a side object detection system that detects objects at a side or blind spot area of the controlled or subject vehicle. Optionally, the image sensor may have a generally forward field of view to capture images of a scene occurring forwardly of the vehicle. The image sensor may be located within the vehicle cabin and rearward of the windshield so as to have a field of view forwardly and through the windshield of the vehicle, preferably at a location that is cleaned by the windshield wipers of the vehicle, such as at an interior rearview mirror assembly of the vehicle or at an accessory module or windshield electronics module or the like. In such an application, the image sensor may be incorporated in or associated with a lane departure warning system that detects a departure of the controlled or subject vehicle from a lane as the vehicle travels along a road.
Camera Calibration:

In order to verify that the camera or imaging sensor is mounted at the vehicle (such as at an exterior portion of the vehicle) within a desired tolerance limit so as to provide the desired field of view, the camera may detect the side of the vehicle (shown at 30 in FIG. 2) and/or the door handle or handles (the front door handle is shown at 32a in FIG. 2, while the rear door handle is shown at 32b in FIG. 2) of the vehicle and the control may confirm that they are in the expected location in the captured images. If the control determines that the camera is not aligned or aimed at the desired location (such as by determining that the vehicle edge and/or door handle/handles are not at the expected location), the control may adjust the image and/or image processing to account for any such misalignment of the camera. For example, the degree of misalignment may be calculated, and the image processing may be adjusted or shifted and/or rotated to position the reference structure at the appropriate location in the captured images.

For example, the algorithm may function to preprocess the captured image by a histogram equalization to improve the image contrast. The algorithm may then process the captured images via an edge detection in the area of interest to extract the expected edge of the vehicle (shown at 34 in FIG. 2). The algorithm may filter the image data to remove noise in the edge detected image. The algorithm may perform a coarse structure fitting (such as via a line fitting algorithm or contour fitting algorithm or the like) of the vehicle side and door handles in the captured image for verifying the camera mounting is within the desired or appropriate tolerance limit. The algorithm may further perform a fine structure fitting (such as via a correlation algorithm or contour fitting algorithm or the like) for calculating shift in yaw, pitch and roll. As shown in FIGS. 3A-C, the actual or detected vehicle edges may be misaligned or separated from the expected vehicle edges, such that the image processing may be adjusted to shift the captured image data accordingly to accommodate such misalignment of the camera. Based on the results of the image processing techniques, data or information of the yaw, pitch and roll may be used to set the polygon co-ordinates and H depression pixel calibration parameters, so that the expected vehicle edges are substantially aligned with the actual or detected vehicle edges.
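By way of illustration, the calibration sequence described above (histogram equalization, edge detection within an area of interest, and comparison of the detected reference structure against its expected location) can be sketched roughly as follows. This is a minimal sketch and not the disclosed implementation: the window coordinates, thresholds and helper names are assumptions chosen for the example, and a single vertical offset stands in for the full yaw, pitch and roll fitting.

```python
import numpy as np

def equalize_histogram(img):
    # Histogram equalization of an 8-bit grayscale frame to improve contrast
    # before edge extraction (the preprocessing step described above).
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) * 255.0 / max(cdf.max() - cdf.min(), 1.0)
    return cdf[img].astype(np.uint8)

def detect_vehicle_edge_row(img, area_of_interest, grad_thresh=8.0):
    # area_of_interest = (row0, row1, col0, col1): an assumed window bounding
    # the expected edge of the vehicle side / reference structure.
    r0, r1, c0, c1 = area_of_interest
    roi = equalize_histogram(img)[r0:r1, c0:c1].astype(np.float32)
    # The vehicle side appears as a roughly horizontal edge, i.e. a row where
    # the mean vertical gradient magnitude peaks.
    grad = np.abs(np.diff(roi, axis=0)).mean(axis=1)
    row = int(np.argmax(grad))
    if grad[row] < grad_thresh:
        return None                  # no clear edge found in the window
    return r0 + row                  # detected edge row in full-image coords

def estimate_misalignment(img, expected_edge_row, area_of_interest, tol_px=3):
    # Compare where the reference edge is detected with where it is expected;
    # a non-zero result would be used to shift the processing windows (here
    # reduced to a single vertical shift for illustration).
    detected = detect_vehicle_edge_row(img, area_of_interest)
    if detected is None:
        return None
    offset = detected - expected_edge_row
    return 0 if abs(offset) <= tol_px else offset

# Hypothetical usage (values are placeholders, not from the disclosure):
# offset = estimate_misalignment(frame, expected_edge_row=210,
#                                area_of_interest=(180, 260, 40, 600))
```

A value returned by such a routine could then be used to shift or rotate the processing windows, analogously to the polygon co-ordinate adjustment described above.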
After the image data or image processing is adjusted to account for any misalignment of the camera at the vehicle, the camera may capture images of the scene occurring exteriorly of the vehicle and at that side of the vehicle, and the control may process the images to detect objects or lane markers or the like at the side of the vehicle and/or rearward of the vehicle, and may utilize aspects described in U.S. patent application Ser. No. 10/427,051, filed Apr. 30, 2003 by Pawlicki et al. for OBJECT DETECTION SYSTEM FOR VEHICLE, now U.S. Pat. No. 7,038,577, which is hereby incorporated herein by reference.
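The windowed processing that follows calibration can likewise be sketched. Again this is only an illustrative sketch: the gradient threshold, the minimum edge fraction and the window geometry are assumed values, not parameters taken from this disclosure.

```python
import numpy as np

def edge_map(img, thresh=30.0):
    # Simple gradient-magnitude edge map; a stand-in for the edge detection
    # stage, with an assumed threshold.
    gray = img.astype(np.float32)
    gx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    gy = np.abs(np.diff(gray, axis=0, prepend=gray[:1, :]))
    return (gx + gy) > thresh

def candidate_windows(img, windows, min_edge_fraction=0.08):
    # windows: list of (row0, row1, col0, col1) zones of interest where an
    # adjacent-lane vehicle (or bicycle, lane marker, etc.) is expected to
    # appear; the geometry is an assumption for the example.
    edges = edge_map(img)
    hits = []
    for (r0, r1, c0, c1) in windows:
        fraction = edges[r0:r1, c0:c1].mean()   # fraction of edge pixels
        if fraction >= min_edge_fraction:
            hits.append((r0, r1, c0, c1))
    return hits
```

Windows flagged this way would still need to be tested against the size, location, intensity and relative-speed criteria discussed in the summary before any alert is generated.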
Adjustment of Zone when Vehicle Turning:

Optionally, the control may perform a curve processing or lane divider curve fitting function to select or adjust the reduced image set or zone of interest based on a steering angle of the vehicle (the angle a

4. Compute Cumulative Wheel Angle (θC):

θC(N) = θREF;
θC(N-1) = θ(N-1) + θC(N);
θC(N-2) = θ(N-2) + θC(N-1);
θC(N-3) = θ(N-3) + θC(N-2);
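The cumulative wheel-angle recursion above translates directly into code; in this minimal sketch the reference angle θREF and the recent per-sample wheel angles are assumed inputs.

```python
def cumulative_wheel_angles(theta_ref, recent_angles):
    # recent_angles: per-sample wheel angles [theta_(N-1), theta_(N-2), ...],
    # newest first.  Mirrors the recursion above:
    #   thetaC(N)   = thetaREF
    #   thetaC(N-1) = theta_(N-1) + thetaC(N), and so on.
    cumulative = [theta_ref]
    for theta in recent_angles:
        cumulative.append(theta + cumulative[-1])
    return cumulative

# Example: thetaREF = 2.0 degrees and samples [1.5, 1.0, 0.5] give
# [2.0, 3.5, 4.5, 5.0].
```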
