Borcherts et al.

[54] SYSTEM AND METHOD FOR AUTOMATICALLY STEERING A VEHICLE WITHIN A LANE IN A ROAD

[75] Inventors: Robert H. Borcherts, Ann Arbor; Jacek L. Jurzak, Rochester Hills; Shih-Ping Liou, Ann Arbor; Tse-Liang A. Yeh, Rochester Hills, all of Mich.
[73] Assignee: Zexel Corporation, Japan

[21] Appl. No.: 722,661

[22] Filed: Jun. 28, 1991

[51] Int. Cl.5 .......................... H04N 7/18
[52] U.S. Cl. .................. 358/103; 364/424.02; 364/426.04
[58] Field of Search ........... 358/103, 105; 364/424.01, 424.02, 426.01, 426.04; 180/179
[56] References Cited

U.S. PATENT DOCUMENTS

4,703,429  10/1987  Sakata ................... 364/426.04
4,757,450   7/1988  Etoh ..................... 364/426.04
5,014,200   5/1991  Chundrlik et al. ......... 364/426.04
5,081,585   1/1992  Kurami et al. ............ 358/103 X
5,087,969   2/1992  Kamada et al. ............ 358/103
FOREIGN PATENT DOCUMENTS

0354561   8/1989  European Pat. Off.
0354562   8/1989  European Pat. Off.
0361914   9/1989  European Pat. Off.
1-66712   3/1989  Japan
1-106910  7/1989  Japan
2-48704   2/1990  Japan
2-48705   2/1990  Japan
2-48706   2/1990  Japan
2-90379   3/1990  Japan
2-90380   3/1990  Japan
2-90381   3/1990  Japan
9005957   5/1990  PCT Int'l Appl.

US005245422A
[11] Patent Number: 5,245,422
[45] Date of Patent: Sep. 14, 1993
`
OTHER PUBLICATIONS

E. D. Dickmanns et al., Applications of Dynamic Monocular Machine Vision, Machine Vision and Applications (1989) 1:241-261.
`
`Primary Examiner-Victor R. Kostak
`Attorney, Agent, or Firm-Harness, Dickey & Pierce
`
[57] ABSTRACT

An automatic vehicle steering system is provided for automatically steering a vehicle along a lane in a road. A video sensor is included for generating a plurality of frames of video images of the road. A computer processor analyzes the frames to determine the lane boundaries of the road and the position of the vehicle. The system advantageously utilizes engagement of a cruise control switch and a steering control switch to initiate processing of the image data and automatic steering of the vehicle. In such manner, the reliability and efficiency of the system is increased while at the same time minimizing complexity and cost.

27 Claims, 16 Drawing Sheets
`
`IPR2013-00419 - Ex. 1004
`Toyota Motor Corp., Petitioner
`
U.S. Patent    Sep. 14, 1993    Sheet 1 of 16    5,245,422

[FIG. 1: drawing; labels not recoverable from the scan.]
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 2 of 16    5,245,422

[FIG. 2A: drawing; labels not recoverable from the scan.]
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 3 of 16    5,245,422

[FIG. 3: system block diagram; legible labels: SEARCH AREA PREDICTOR, STEERING ACTUATOR, PEDAL POSITIONS, DRIVER INPUT; reference numerals 60, 16, 18, 44, 40, 24, 26.]

Figure 3.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 4 of 16    5,245,422

[FIG. 2B: drawing; legible labels: 66, 34, 36, 10; vehicle positions at T=1 and T=2.]

Figure 2B.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 5 of 16    5,245,422

Figure 4A.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 6 of 16    5,245,422

Figure 4B.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 7 of 16    5,245,422

Figure 4C.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 8 of 16    5,245,422

[FIG. 5 flow diagram; legible text: Driver initiates Cruise Control; SCS switch turned on; Speed Cruise Control Mode; Speed and Steering Cruise Control Mode; Steering Wheel Override; Braking or Cruise Control Switch Disabled; Cruise Control Disabled; SCS Audio & Video information; SCS Ready; Driver drives the vehicle within lane boundaries; SCS Initialization; Startup mode; Learning mode; Ready mode; display; performance.]

Figure 5.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 9 of 16    5,245,422

Figure 6.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 10 of 16    5,245,422

Figure 7A.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 11 of 16    5,245,422

Figure 7B.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 12 of 16    5,245,422

Figure 7C.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 13 of 16    5,245,422

Figure 8.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 14 of 16    5,245,422

[FIG. 9 flow chart: START; Input Image; Compute Gradient Magnitude and Gradient Direction; Hypothesize the Intersection Point Based on the Search Area; Collect Support for Each Hypothesized Intersection Point; Select the Right and Left Intersection Point and Convergent Lines; Update the Search Area.]

Figure 9.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 15 of 16    5,245,422

[FIG. 10 flow chart; legible formula: ipx(k) = x1 + k*m.]

Figure 10.
`
`
`
U.S. Patent    Sep. 14, 1993    Sheet 16 of 16    5,245,422

[FIG. 11 flow chart; legible test: "Gradient direction at pixel location (i, j) consistent with the angle"; an arctangent expression involving (i - ipx) is partially legible.]

Figure 11.
`
`
`
5,245,422

SYSTEM AND METHOD FOR AUTOMATICALLY STEERING A VEHICLE WITHIN A LANE IN A ROAD

BACKGROUND OF THE INVENTION

Technical Field

This invention relates to digital image processing systems, and, more particularly, to systems for automatically controlling the steering of a vehicle.

Discussion

The technical literature suggests the desirability of a control system for automatically controlling the steering of a vehicle. Representative examples of some known approaches are disclosed in European Patent Application Nos. EP 0 354 561 A2 filed Aug. 9, 1989 and EP 0 361 914 A2 filed Sep. 28, 1989, both assigned to Honda Giken Kogyo Kabushiki Kaisha, Japanese Application No. 62-97935 and European Patent Application No. EP 0 304 042 A2 filed Aug. 17, 1988 assigned to Kabushiki Kaisha Toshiba. Briefly, these documents disclose the general concept of using a video input device, such as a camera, that is mounted to the vehicle and a computer processor for processing the image data and providing control signals to mechanisms for controlling the steering of the vehicle.

Generally, the prior art approaches do not appear to be cost effective. As a result, their implementation in a vehicle affordable by the ordinary consumer is not very practical. One reason for the expense is that most of these techniques process the video input data in a very complex manner. For example, the EP '914 application utilizes a Hough transform to analyze the image data. The use of transforms of these types is relatively sophisticated and difficult to analyze, thereby requiring expensive computer equipment to perform the analysis since an exceedingly large amount of data is required in order to perform these transforms.

Most of the known systems continuously analyze all of the video input data, and the majority of their algorithm parameters are either fixed or predetermined. As a result, the processor is given the enormous task of isolating those smaller areas of interest that contain meaningful image data points. The prior art systems also generally require an extensive manual tuning effort for each specific traffic scene and condition. Even so, there is no high degree of probability that the processor has correctly detected the actual lane boundary lines that are often used as criteria for controlling the vehicle steering. This is because there is no good preset criteria for initiating the processing of the image data associated only with relevant road features. As a result, the processor's power and resources are often wasted in processing image data from scenes which do not actually contain the lane boundary lines. In addition, the prior art approaches do not generally embody any mechanisms which allow the driver of the vehicle to operate the automatic steering control system only when traffic conditions are proper and safe.

SUMMARY OF THE INVENTION

In accordance with the preferred embodiment of the present invention, a system is provided for automatically steering a vehicle. Included is a sensor which is mounted to the vehicle and generates position information about the road in front of the vehicle. The vehicle contains a cruise control system that has a switch for initiating vehicle speed control. The invention advantageously utilizes the actuation of the cruise control switch to initiate the processing of the sensor information and to provide automatic steering control of the vehicle under safe traffic and road conditions. A programmable processor provides signal processing and analyzes the information, while a steering controller controls the steering of the vehicle as a function of the analysis by the processor.

BRIEF DESCRIPTION OF THE DRAWINGS

The various advantages of the present invention will become apparent to those skilled in the art by reading the following specification and by reference to the drawings in which:

FIG. 1 is a schematic diagram of a vehicle equipped with an automatic vehicle steering system in accordance with the present invention;

FIGS. 2A-2B are schematic diagrams which illustrate detection of the lane in the road in front of the vehicle;

FIG. 3 is a block diagram which illustrates the system configuration in accordance with the present invention;

FIGS. 4A-4C are pictures which illustrate the operation of the present invention;

FIG. 5 is a flow diagram which illustrates the processing steps;

FIG. 6 is a schematic diagram illustrating the detection and prediction of the lane boundaries in the road in front of the vehicle;

FIGS. 7A-7C are continued schematic diagrams illustrating the detection and prediction of the lane in the road;

FIG. 8 is a continued schematic diagram illustrating the detection and prediction of the lane in the road;

FIG. 9 is a flow chart diagram which illustrates the lane detection algorithm in accordance with the present invention;

FIG. 10 is a flow chart diagram which illustrates the operation of the lane detection algorithm; and

FIG. 11 is a flow chart diagram which further illustrates the operation of the lane detection algorithm.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

Turning now to FIG. 1, a vehicle 10 is shown therein which illustrates the essential components of the automatic vehicle steering system in accordance with the present invention. An image input device 12 is mounted to the front portion of the vehicle 10 at a location near the rear view mirror assembly. Such a device may be a video camera of the conventional or infrared kind and is used to monitor the road geometry and traffic condition in front of the vehicle 10 by providing a plurality of frames of video images of the road. Image input device 12 may be mounted in combination with the rear view mirror assembly or separate therefrom or at any other location which adequately monitors the road in front of the vehicle 10.

An image digitization electronics and processing unit 14 is shown mounted under the hood of the vehicle 10. The processing unit 14 may be one of several standard off-the-shelf programmable processors capable of providing image processing. Image digitization electronics and processing unit 14 is made up of both hardware and software. The hardware is connected to image input device 12 and contains all the signal conditioning
`
`
`
electronics. Included in the hardware are image digitizing frame grabbers for converting each frame of the analog video images to digital signals or pulses, and computer processors for providing digital image processing. The software provides control for the image input device 12, image processing for lane detection and a predictor for improving the efficiency of the image processing function by providing for the necessary search area.

A steering control actuator 16 is mounted on the vehicle 10. Steering control actuator 16 may be either hydraulic or electric and controls the steering angle of the wheels, subject to the manual steering override by the driver, so that the vehicle is at the desired position within the lane in the road when the automatic vehicle steering system is engaged.

Steering actuator electronics and control unit 18 is also mounted to the vehicle 10. Steering actuator control unit 18 drives the steering control actuator 16 so that the vehicle motion follows the desired path provided from the output of the image digitization electronics and processing unit 14.

Wheel angle and driver steer sensors 20 are mounted to the vehicle 10. The wheel angle sensor measures the steering wheel angle. The driver steer sensor measures the driver force applied to the steering wheel to detect driver effort in controlling the steering wheel. The detection of a significant driver steer will temporarily disengage the steering control actuator 16 so that the automatic vehicle steering function is overridden by conventional driver steering.

A conventional cruise control system 22 is employed to provide automatic vehicle speed control of the vehicle 10. A manually actuable cruise control switch 26 is mounted inside the vehicle 10 for engaging the cruise control system 22. It is generally assumed that the cruise control system 22 is engaged when the vehicle is under proper and safe traffic and road conditions.

An automatic steering switch 24 is also mounted to the interior of the vehicle 10. Automatic steering switch 24 allows the driver to engage the automatic vehicle steering system. In order to engage the automatic vehicle steering system to steer the vehicle 10, the system requires that both the cruise control switch 26 and automatic steering switch 24 be engaged. Cruise control switch 26 and automatic steering switch 24 can also be configured such that with the cruise control system 22 disengaged, engagement of the automatic steering switch 24 will also simultaneously engage the cruise control switch 26, which also engages the cruise control system 22, thereby providing engagement of the automatic vehicle steering system. On the other hand, when the cruise control system 22 or switch 26 is disengaged, the automatic steering switch 24 and the automatic steering control function are both disengaged.

Two additional system components are included, whose locations in the vehicle 10 are irrelevant. The first is a sensor and vehicle system interface 64 which includes a standard vehicle speed sensor added to the standard vehicle equipment, a vehicle power supply interface and a standard vehicle cruise system interface. The vehicle speed sensor may be used for steering control purposes to modify controller response time, thereby enhancing the operation of the automatic vehicle steering system. The vehicle power supply and cruise control interface may be necessary to connect the video cruising system to the standard vehicle equipment to ensure that both systems operate properly.
`
The second is a driver interface and warning information center 54 which may consist of audio, visual and other sensory interactions. Such devices may inform the driver about performance of the automatic vehicle steering system to enable the driver to make proper judgment on the safety of the driving situation.

In operation, the driver, while driving the vehicle 10 on a road having lanes such as a freeway, may engage the automatic vehicle steering system. During normal weather and driving conditions, the driver is required to have engaged both the cruise control switch 26 and automatic steering switch 24. With the cruise control system 22 engaged, the driver may engage the automatic steering switch 24 to engage the automatic vehicle steering system. With the cruise control system 22 disengaged, the system may be configured so that engagement of the automatic steering switch will further cause engagement of the cruise control switch 26 to thereby allow engagement of the automatic steering system. By requiring engagement of the cruise control system 22, the system may assume that the vehicle is under proper and safe traffic and road conditions.
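The switch interlock described above can be sketched in a few lines of code. This is an editor's illustrative sketch, not part of the patent text; the function name and the `steering_engages_cruise` option are hypothetical stand-ins for the two configurations the specification describes.

```python
# Illustrative sketch of the interlock: automatic steering engages only when
# both the cruise control switch (26) and the automatic steering switch (24)
# are on. In the alternative configuration, turning on the steering switch
# also turns on cruise control. All names are hypothetical.

def engage(cruise_switch_on: bool, steering_switch_on: bool,
           steering_engages_cruise: bool = False) -> dict:
    """Return which subsystems end up engaged for a given switch state."""
    if steering_engages_cruise and steering_switch_on:
        # Engaging the steering switch simultaneously engages cruise control.
        cruise_switch_on = True
    # Automatic steering requires BOTH switches to be engaged.
    return {"cruise": cruise_switch_on,
            "steering": cruise_switch_on and steering_switch_on}
```

Note that disengaging the cruise control switch necessarily disengages automatic steering under this logic, matching the "on the other hand" case above.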
Engagement of the automatic vehicle steering system initiates the video input device 12. Video input device 12 generates a continuous plurality of frames of video images of the road in front of the vehicle 10. The image digitization electronics and processing unit 14 receives and analyzes the frames of the video images. In doing so, processing unit 14 converts the analog inputs from each frame to a plurality of digital signals. Processing unit 14 then analyzes the digital signals and attempts to detect the lane boundaries on both sides of the vehicle 10. Furthermore, processing unit 14 analyzes the path and determines the proper directional response needed to maintain the vehicle 10 in the desired position within the lane.

The automatic vehicle steering system utilizes the processed data to lock on to the lane and steer the vehicle 10 in a desired position therein. In doing so, the processing unit 14 provides a directional control response to steering actuator control unit 18 which in turn directs steering control actuator 16 to steer the vehicle in the desired direction. Wheel angle and driver steer sensors 20 measure the steering wheel angle and furthermore measure and detect driver effort to override the automatic vehicle steering system. The detection of a significant driver steer by the driver steer sensor will result in temporary disengagement of the steering control actuator 16, thereby temporarily disengaging the automatic vehicle steering system. This may occur, for example, when the driver of the vehicle 10 changes lanes. Once in the new lane the automatic vehicle steering system will be re-engaged to provide steering within the new lane provided the driver is no longer manually overriding the automatic steering of the vehicle 10.

FIGS. 2A and 2B illustrate the basic geometry involved for providing images of the road for the automatic vehicle steering system. Vehicle 10 is shown within the lane of a road 28 having a left lane boundary 34 and a right lane boundary 36. Image input device 12 monitors the road geometry and provides a plurality of frames of video images of the road in front of the vehicle 10 such as frame 66.

FIG. 3 illustrates the system configuration for the automatic vehicle steering system. Video input device 12 provides continuous frames of the road in front of the vehicle 10 to image processor 14. Image processor 14 performs lane identification 42 within the area specified
`
by the search area predictor 40 and, furthermore, a lane centering algorithm 44. Search area predictor 40 provides the necessary search area in an efficient manner. The response signal from lane centering algorithm 44 is provided to steering controller 18, which in turn controls steering actuator 16. Steering actuator 16 adjusts the angle of the wheels 60 of vehicle 10 to direct the vehicle 10 in the desired direction.

Wheel angle and driver steer sensors 20 measure the wheel angle and detect conventional driver steering. Wheel angle and driver steer sensors 20 are adapted to provide a signal to search area predictor 40. The image processor 14 receives this signal and uses the wheel angle signal to check for a consistent steering angle sufficient to allow for the initiation of the system. The wheel angle signal further provides the image processor 14 with vehicle turning information. As such, the processor 14 is able to use this information to provide for a better prediction of the lane position. The wheel angle and driver steer sensors 20 are further adapted to provide a driver steer signal to steering controller 18 to disengage steering actuator 16 when the driver manually operates the steering wheel 31 while the automatic vehicle steering system is engaged. A wheel angle signal is also provided to steering controller 18. Steering controller 18 is further adapted to receive inputs from steering wheel 31 and steering actuator 16. Furthermore, steering controller 18 is adapted to provide signals to a warning system 54.

Cruise control switch 26 engages the cruise control system 22 which is adapted to control vehicle speed 38 by controlling throttle control 58 which in turn controls the throttle 60. The cruise control switch 26, vehicle speed 38, automatic steering switch 24 and steering wheel 31 are adapted to receive driver inputs 46. Automatic steering switch 24 is further adapted to receive cruise control inputs from cruise control switch 26. Automatic steering switch 24 in turn communicates with steering wheel 31. Cruise control switch 26 further communicates with pedal positions 56 which in turn controls throttle control 58.
FIGS. 4A-4C are photographs which illustrate the operation of the automatic vehicle steering system. FIGS. 4A-4C illustrate operation of the vehicle 10 within the lane boundaries of the road. The automatic steering system maintains the vehicle 10 at the desired location within the lane, under normal traffic conditions. FIG. 4C illustrates the vehicle 10 changing lanes, whereby the automatic vehicle steering system is temporarily disengaged as long as the driver manually operates the steering. Once in the desired position of the new lane the driver may discontinue manual steering, which re-engages the automatic vehicle steering system.
The flow chart in FIG. 5 illustrates the processing steps performed by the automatic vehicle steering system. The driver of the vehicle 10 initially turns on the cruise control switch 26 to engage the cruise control system 22, or turns on the automatic steering switch 24 to engage both the cruise control system 22 and automatic vehicle steering system. With the cruise control system 22 engaged and the automatic vehicle steering disengaged or not ready to operate, the vehicle maintains speed control in the cruise control mode unless the cruise control system 22 is disengaged. Cruise control system 22 may be disengaged by conventional techniques such as applying the brakes or disengaging the cruise control switch 26, or may be temporarily disengaged while manually depressing the throttle control
`
58. With the cruise control system 22 and the automatic vehicle steering switch 24 both engaged, the vehicle 10 locks on to the lane and operates in the speed and steering cruise control mode until being disengaged.

The automatic vehicle steering system may be disengaged in several ways. The driver may disengage the vehicle steering system by turning off either the cruise control switch 26 or the automatic steering switch 24. Depressing the brake pedal will further disengage the system. Temporary disengagement will result from manual driver steer. When the driver depresses the throttle control 58 the cruise control system 22 will be temporarily overridden; however, the automatic vehicle steering system will continue to steer the vehicle.

When the driver engages the automatic vehicle steering system, the system initially undergoes an initialization process. Audio and video information is provided to the driver of the vehicle 10 which indicates whether the system is ready. During automatic vehicle steering system initialization, all that is required of the driver is that he maintain the vehicle in the desired position between the lane boundaries of the road.
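The mode progression sketched in FIG. 5 (initialization, then a learning phase while the driver holds position, then a ready indication) can be summarized as a small state machine. This is an illustrative sketch only; the mode names follow the figure, but the transition conditions are assumptions, not claimed behavior.

```python
# Hypothetical sketch of the FIG. 5 mode progression: after engagement the
# system initializes, learns the lane while the driver keeps the vehicle
# between the boundaries, and only then reports "ready" to the driver.

MODES = ("startup", "learning", "ready")

def next_mode(mode: str, lane_locked: bool) -> str:
    """Advance one step in the assumed startup -> learning -> ready sequence."""
    if mode == "startup":
        return "learning"        # initialization complete; begin learning
    if mode == "learning" and lane_locked:
        return "ready"           # lane boundaries found; inform the driver
    return mode                  # otherwise remain in the current mode
```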
FIGS. 6-11 illustrate how processing unit 14 operates to analyze the frames of road images and predict the path of the lane in the road in front of the vehicle 10. Processing unit 14 receives a continuous series of frames of the road in front of the vehicle 10 from image input device 12. Image input device 12 provides frames of images at a rate of thirty frames per second, capable of providing an adequate response for vehicles travelling at normal highway speeds. For higher speeds, the system may require a higher frame rate.

The processing unit 14 includes image digitizing frame grabbers for receiving each analog input frame from image input device 12 and converting each frame to a plurality of digital signals. Processing unit 14 includes computer processors for providing digital processing to analyze the digital information provided by the image digitizing frame grabbers. Processing unit 14 is further equipped with software for controlling the image input device, image processing for lane detection and a predictor to improve the efficiency of the image processing function.
In order to locate the lane boundaries in the image of a road scene, the processing unit 14 first detects all edge points in the image. In doing so, there are certain assumptions that are made in order to simplify the problem. For an automatic vehicle steering system we first assume low curvature lane boundaries. In addition, we assume that in most situations a pair of boundaries exist. Finally, it is assumed that the ground is locally level and the images are taken while the car is in the lane of the road. This latter assumption is usually correct because the driver is likely to engage the cruise control switch 26 and/or steering control switch only when the car is travelling between the lane boundaries and the car is usually travelling in a straight line. Under these assumptions, the location of the lane in the image can be predicted by the predictor 40 based on lane curvature, vehicle dynamics and steering inputs.
Two main lane boundaries are modeled close to the vehicle using two parallel line segments. The first line segment is the tangent to the current left lane boundary 34 and the second is tangent to the current right lane boundary 36. Due to the projective geometry of the image, these two convergent lines must converge at a point in the image called a vanishing point 84.
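The convergence of the two boundary tangents can be illustrated with elementary line-intersection arithmetic. This is an editor's sketch, not part of the disclosure; lines are represented as (slope, intercept) pairs in image coordinates, and a real implementation would also handle vertical lines.

```python
# Two lane-boundary tangents, projected into the image plane, meet at a
# vanishing point. Representation (slope, intercept) is an illustrative
# assumption; vertical lines are not handled in this sketch.

def vanishing_point(left, right):
    """Intersect two image lines y = m*x + b; return (x, y), or None if parallel."""
    (m1, b1), (m2, b2) = left, right
    if m1 == m2:
        return None              # parallel lines never converge in the image
    x = (b2 - b1) / (m1 - m2)
    return (x, m1 * x + b1)
```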
`
The best two convergent lines are essentially chosen from a set of candidates. Here, however, we will use two intersection points 86 and 88, that is, where the left convergent line 78 and the right convergent line 80 each cross the chosen search area 82 as shown in FIG. 7. The use of two intersection points rather than one vanishing point allows for the ability to follow the lane in situations where one side of a lane boundary is less prominent than the other or is completely missing.

Since the location of the intersection points does not change much between two continuous frames, an assumption is made that its location in the current frame will be close to that in the previous frame. This fact allows for combining road edge detection and intersection point determination in one step.
To select the two best intersection points, the algorithm collects evidence supporting each candidate from the image. The supporting evidence, coming from the pixel-level local computation, includes the strength and direction of edge points and the length of line segments. Functions are provided to measure the support of each piece of evidence and combine them in a performance measure that gives confidence in an intersection point. The intersection point having the highest confidence is selected and the corresponding convergent line is considered as the image of the lane boundary. FIG. 8 illustrates the characteristics of such an image. Shown therein are edge responses and associated orientations of several line samples. It is desirable to obtain the data that provides a strong edge response in addition to a consistent orientation, such as line 90. The overall response is then used to calculate the intersection point for that boundary line within a chosen search area 82.
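The support-collection idea can be sketched concretely: walk the pixels along a candidate boundary line and accumulate gradient magnitude wherever the local gradient direction is consistent with the line's angle. This is an illustrative sketch only; the angular tolerance, the dictionary data layout, and the perpendicularity convention are assumptions, not details taken from the patent.

```python
import math

# Hedged sketch of support collection for one hypothesized intersection
# point: sum gradient magnitude over pixels whose gradient direction is
# roughly perpendicular to the candidate line (an edge crossing the line
# direction). Tolerance and data layout are illustrative assumptions.

def support(ip, pixels, mag, direction, tol=math.radians(15)):
    """ip: (ipx, ipy); pixels: (i, j) samples on the candidate line through ip;
    mag/direction: dicts keyed by pixel location (i, j)."""
    ipx, ipy = ip
    total = 0.0
    for (i, j) in pixels:
        # Angle of the line from the intersection point to this pixel.
        angle = math.atan2(j - ipy, i - ipx)
        # A supporting edge has its gradient roughly perpendicular to the line.
        expected = angle + math.pi / 2
        diff = abs((direction[(i, j)] - expected + math.pi) % (2 * math.pi) - math.pi)
        if min(diff, math.pi - diff) < tol:   # gradient sign is irrelevant
            total += mag[(i, j)]
    return total
```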
FIG. 6 illustrates a left convergent line 78 and a right convergent line 80 as both pass through the chosen search area 82 to obtain the left convergent line intersection point 86 and the right convergent line intersection point 88. Left and right convergent lines 78 and 80 cross at the point known as the vanishing point 84. It is most desirable to obtain the intersection points 86 and 88 or vanishing point 84 within the search area 82. In order to do so, the system employs a predictor to continuously adjust the search area as shown in FIG. 7. The predictor determines the area in which to search. Upon system initialization, the predictor initially searches a large area. As the predictor locates the intersection points it is able to adjust to that location and search a smaller area, thereby enabling the system to operate faster and more efficiently. Upon initialization the predictor could be adjusted to monitor a narrower area based on various assumptions or cover a proportioned area (i.e., monitor every second or third pixel) in order to speed up the initialization process. The resulting intersection point 88 found within the search area 82 provides the desired vehicle direction.
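The predictor's coarse-to-fine behavior described above can be sketched as follows. This is an illustrative sketch; the window widths and the shrink factor are assumptions, not values from the patent.

```python
# Hypothetical sketch of the search area predictor: start with a wide
# one-dimensional window, then narrow it and re-center it around each
# newly located intersection point, so later frames search a smaller area.

def update_search_area(window, found_x=None, min_width=20, shrink=0.5):
    """window: (x1, x2) in image columns; found_x: located intersection point."""
    x1, x2 = window
    if found_x is None:
        return window                        # no lock yet: keep searching wide
    width = max(min_width, (x2 - x1) * shrink)
    half = width / 2
    return (found_x - half, found_x + half)  # re-centered, narrower window
```

A floor on the window width keeps the search robust to small frame-to-frame drift, consistent with the assumption above that intersection points move little between continuous frames.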
Algorithm software flow chart diagrams are provided in FIGS. 9 through 11. The processor 14 receives an image input. The gradient magnitude and gradient direction of the image data are computed. The intersection point is then hypothesized based on the search area as shown in FIG. 10, wherein (x1, x2) specifies a one-dimensional search area in the image and m is the number of hypothesized intersection points within the one-dimensional area. The software then collects support for each hypothesized intersection point as shown in FIG. 11. {(ipx(k), ipy), k = 0, 1, . . . } represents the set of image coordinates of the hypothesized intersection points. M(i, j) is the gradient magnitude at pixel location (i, j) in the image, and xwidth and ywidth are the horizontal and vertical sizes of the zone below the one-dimensional search area, respectively. In addition, the right and left intersection points and convergent lines are then selected and the search area is updated prior to receiving the next image input.
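The FIG. 9/FIG. 10 loop described above can be put together in a short sketch: enumerate evenly spaced hypothesized points ipx(k) = x1 + k*m across the one-dimensional search area and keep the best-supported one. This is illustrative only; the caller-supplied `support_fn` stands in for the pixel-level computation of FIG. 11, and here m is treated as a spacing, an assumption made to match the legible formula in FIG. 10.

```python
# Sketch of the hypothesize-and-score step: scan points ipx(k) = x1 + k*m
# across the one-dimensional search area (x1, x2) at row ipy, score each
# with a support function, and return the best-supported intersection point.

def best_intersection(x1, x2, m, ipy, support_fn):
    """Return ((x, ipy), score) for the best-supported hypothesized point."""
    best, best_score = None, float("-inf")
    k = 0
    while x1 + k * m <= x2:
        ip = (x1 + k * m, ipy)
        score = support_fn(ip)       # evidence collected from the image
        if score > best_score:
            best, best_score = ip, score
        k += 1
    return best, best_score
```

Running this once per boundary per frame yields the right and left intersection points, after which the search area is updated for the next frame.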
In view of the foregoing, it can be appreciated that the present invention enables the user to achieve a system for automatically steering a vehicle within the lines of a road. Thus, while this invention has been described in connection with a particular example thereof, no limitation is intended thereby except as defined by the following claims. This is because the skilled practitioner will realize that other modifications can be made without departing from the spirit of t