(19) Japanese Patent Office (JP)
(12) Unexamined Patent Publication (A)
(11) Kokai Application No. S62-131837
(43) Publication Date: June 15, 1987
Int. Cl. / ID Symbol / JPO File Number: B 60 Q 1/14, A-8410-3K; G 08 G 1/16, 6821-5H
Request for Examination: not yet submitted
Number of Inventions: 1
(6 pages total)
`────────────────────────────────────────────────────────────────────────────
`
(54) Title of the Invention: Traveling Vehicle Recognition Device

(21) Application No.: S60-272478
(22) Application Date: December 5, 1985
`
(72) Inventor: Hirohiko Yanagawa, c/o Nippon Denso Co., Ltd., 1-1 Showa-cho, Kariya-shi
(72) Inventor: Hideko Akatsuka, c/o Nippon Denso Co., Ltd., 1-1 Showa-cho, Kariya-shi
(72) Inventor: Genichi Yamada, c/o Nippon Denso Co., Ltd., 1-1 Showa-cho, Kariya-shi
(71) Applicant: Nippon Denso Co., Ltd., 1-1 Showa-cho, Kariya-shi
(74) Representative: Takehiko Suzue, Patent Attorney, and 2 others
`
`Specification
`
`1. Title of the Invention
`
`Traveling Vehicle Recognition Device
`
`2. Claim
`
` A traveling vehicle recognition device
characterized by comprising:
` color imaging means for imaging the forward
`direction of a traveling vehicle;
` means for forming color image signals
`corresponding to each color based on a video
`signal imaged by the imaging means;
` features extraction means for extracting an
`image signal of colors corresponding to taillights
`and headlights based on the color image signals
`obtained by said means;
` means for recognizing the presence of
`taillights or headlights according to the image
`signal extracted by the features extraction
`means;
` calculating means for computing the distance
`between vehicles and the speed relative to a
`vehicle ahead based on said recognized taillight
`image; and
`
`
` executing means for executing headlight
`control based on the recognition result of said
`recognition means;
` and controlling to switch the vehicle
`headlights to low beams at least when a state in
`which there is an oncoming vehicle in the
`forward direction has been recognized by said
`recognition of headlights.
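
For orientation only, the combination of means
recited in the claim can be sketched as the
following illustrative Python interfaces. The
names and signatures are the editor's assumptions
and are not claim language.

    # Illustrative mapping of the claimed means onto interfaces; every name
    # and signature here is an assumption added for readability only.
    from dataclasses import dataclass
    from typing import Protocol, Tuple

    @dataclass
    class RecognitionResult:
        taillights_present: bool     # vehicle traveling ahead recognized
        headlights_present: bool     # oncoming vehicle recognized

    class ColorImagingMeans(Protocol):
        def image_forward_view(self): ...                  # color video signal

    class ColorSignalFormingMeans(Protocol):
        def form_color_signals(self, video_signal): ...    # per-color image signals

    class FeaturesExtractionMeans(Protocol):
        def extract_lamp_colors(self, color_signals): ...  # taillight/headlight colors

    class RecognitionMeans(Protocol):
        def recognize(self, extracted_image) -> RecognitionResult: ...

    class CalculatingMeans(Protocol):
        def distance_and_relative_speed(self, taillight_images) -> Tuple[float, float]: ...

    class ExecutingMeans(Protocol):
        def control_headlight_beams(self, result: RecognitionResult) -> None: ...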
`
`
`3. Detailed description of the invention
`<Industrial field of use>
` The present invention relates to a recognition
`device for a traveling vehicle which recognizes
`the presence of taillights of a vehicle traveling
`ahead and headlights of an oncoming vehicle,
`especially at night, calculates and displays the
`interrelationship with the vehicle ahead, and is
`capable of controlling the device vehicle’s
`headlights automatically.
`
`<Prior art>
 When driving an automobile at night, the
headlights are lit while traveling, and in
particular, the headlights are set to high beams
`
`
`
`
`
`
`when driving in an area with few other traveling
`automobiles.
 However, with this type of high-beam driving
state, in the case that there is an oncoming
vehicle or the device vehicle draws close to a
vehicle traveling ahead, the headlights must be
switched to low beams so as not to obstruct the
field of vision of the driver of the oncoming
vehicle or of the vehicle traveling ahead. This
type of beam control of the headlights is
troublesome for the driver, however, and further
complicates the driving operation, especially
when driving on a road with many curves.
Moreover, in the case that there is a vehicle
traveling ahead, the driver must accurately
perceive the distance from and the speed relative
to the vehicle ahead; to drive safely, the driver
must therefore both control the lights in this
way and accurately know the relative relationship
with the vehicle ahead.
`
`<Problems that the invention is to solve>
` The present invention was devised in
`consideration of such points, and seeks to provide
`
`
`recognized taillight.
`
`<Operation>
` With the traveling vehicle recognition device
`configured as above, the headlights and taillights
`of a vehicle ahead can be recognized from color
features, and the driver can be notified, based
on this recognition, that there is an oncoming
vehicle or a vehicle traveling ahead. Conditions
under which the headlights must be switched from
high beams to low beams are also detected based
on this recognition result, and the headlight
beams can be controlled automatically once those
conditions have been detected. The
`distance from and the speed relative to a vehicle
`traveling ahead can also be calculated based on
`taillight recognition, and a warning can therefore
`be issued to the driver in states such as when there
`is risk of a rear-end collision.
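
As an informal illustration of this operation
flow (a sketch only: every name, threshold and
the 3-second time-to-collision rule below are
assumptions by the editor, not values from the
specification), the sequence could be organized
roughly as follows.

    # High-level sketch of the operation sequence described above.  All names,
    # thresholds and the 3-second time-to-collision rule are illustrative
    # assumptions by the editor, not values from the specification.
    def operate(taillights_recognized, headlights_recognized,
                high_beams_on, distance_m=None, closing_speed_mps=None):
        actions = []
        if taillights_recognized:
            actions.append("notify: vehicle traveling ahead")
        if headlights_recognized:
            actions.append("notify: oncoming vehicle")
        if (taillights_recognized or headlights_recognized) and high_beams_on:
            actions.append("switch headlights from high to low beams")
        if taillights_recognized and distance_m and closing_speed_mps:
            # closing fast on a short gap -> risk of rear-end collision (assumed rule)
            if closing_speed_mps > 0 and distance_m / closing_speed_mps < 3.0:
                actions.append("warn: risk of rear-end collision")
        return actions

    # Example: closing at 10 m/s on a vehicle 25 m ahead while on high beams.
    print(operate(True, False, True, distance_m=25.0, closing_speed_mps=10.0))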
`<Working examples of the invention>
` A working example of the present invention
`will be described hereinafter with reference to the
`appended drawings. Fig. 1 shows the
`
`a traveling vehicle recognition device capable, for
`example, of automatically controlling headlight
`beams to high and low beams according to the
`state of whether there is a vehicle ahead,
`especially when driving at night, and of issuing
`warnings to the driver according to the
`interrelationship with a vehicle traveling ahead.
`
`
`<Means of solving the problems>
` Specifically, the traveling vehicle recognition
`device of the present invention has an imaging
`apparatus such as a color television camera set up
`for imaging, for example, the forward direction of
`a traveling vehicle, extracts color features of
`headlights and taillights to form a feature
`extracted color image signal based on a color
`video signal imaged by this imaging apparatus,
`recognizes the headlights and taillights of a
`vehicle ahead, and controls the headlight beams
`based on this recognition result. The traveling
`vehicle recognition device also computes the
distance from and the speed relative to a vehicle
`traveling ahead based on the image signal of a
`
`
`configuration of this working example, which is
`provided with a color television camera 11. The
`television camera 11 is mounted and set up in the
`front of a vehicle 12 as shown in Fig. 2, for
`example, and is set so as to be able to image the
`forward direction of the vehicle 12, especially a
`vehicle 121 traveling ahead and a vehicle 122
`traveling in the oncoming lane. By setting up in
`this way, the red taillights of the vehicle 121 and
`the white headlights of the vehicle 122 may be
`imaged accurately, especially at night.
` A video signal of images imaged by the
`television camera 11 is supplied to a decoder 13.
`The decoder 13 forms R (red), G (green) and B
`(blue) color image signals based on the video
signal, and supplies the R, G and B color image
`signals to an image signal processor 14.
` The image signal processor 14 extracts the
`features of red, which is the color of taillights, and
`of white, which is the color of headlights, from
`the R, G, B color image signals, extracting, for
`example, a binary image signal, and causes the
`presence of taillights or headlights within the
`
`
`
`
`
`
`
`
`area of the screen to be imaged is set and feature
`extraction conditions for recognizing taillights
`and headlights are set.
` Once the settings have been initialized in this
`way, the operation advances to step 103, in which
`the color image signal from the decoder 13
`formed based on the video signal from the color
`television camera 11 is captured and inputted to
`the image signal processor 14. The operation then
advances to step 104. In step 104, features are
`extracted by the image signal processor 14 from
`the color image signal, and the luminescent colors
`of white and red are emphasized.
` This image signal processor 14 is configured
`as shown in Fig. 4, for example, and is provided
`with a features extraction unit 141. The R, G and
`B color image signals from the decoder 13 are
`supplied to the features extraction unit 141.
`“Extracting features by the features extraction
`unit 141” means that the inputted image signals
`are binarized to capture only information relating
`to headlights and taillights, which are to be
`
`
`
 R > 2B and R > 2G ... (2)
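
A minimal per-pixel sketch of this binarizing
feature extraction, assuming 8-bit color values
so that ε = 255 (the function names and the
sample pixels are the editor's, not the
specification's), is given below; it applies
conditional expressions (1) and (2) directly.

    # Per-pixel feature extraction using conditional expressions (1) and (2).
    # EPS is the full-scale value of R, G and B (assumed 8-bit here, i.e. 255).
    EPS = 255

    def is_headlight_white(r: int, g: int, b: int) -> bool:
        """Expression (1): R, G and B all large and nearly equal."""
        return (abs(r - g) < EPS / 10 and abs(g - b) < EPS / 10
                and abs(b - r) < EPS / 10
                and r > 4 * EPS / 5 and g > 4 * EPS / 5 and b > 4 * EPS / 5)

    def is_taillight_red(r: int, g: int, b: int) -> bool:
        """Expression (2): R at least twice G and twice B."""
        return r > 2 * b and r > 2 * g

    def binarize(pixels):
        """pixels: iterable of (x, y, (R, G, B)).  Keep only lamp-like pixels."""
        return [(x, y, "white" if is_headlight_white(*c) else "red")
                for x, y, c in pixels
                if is_headlight_white(*c) or is_taillight_red(*c)]

    print(binarize([(40, 30, (250, 40, 35)),    # taillight-like pixel
                    (60, 30, (240, 238, 245)),  # headlight-like pixel
                    (10,  5, (90, 90, 90))]))   # neither; discarded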
` The image data of features extracted in step
`104 in this way are stored in a memory 142 in
`step 105. Image data are stored every 0.05 second,
for example. Next, in step 106, the image data
`stored in the memory 142 are sent to a recognition
`unit 143, which determines whether the image
`from which features have been extracted is a
`taillight.
` As determination criteria, the determination is
`made according to whether there are two red
`images 52 and 53 at the same height within a
`setting range 51 on a screen corresponding to the
`range of the traffic lane in which the device
`vehicle is traveling, as shown in Fig. 5(A). In the
`case that taillights are recognized in step 106, the
`operation advances to step 107, in which whether
`the headlights are in a high beam state is
`determined from the state of a headlight high/low-
`beam switch. In the case that the headlights are in
`a high beam state in this step 107, in the next step
`108, the headlights are controlled to switch the
`
`the imaged video to be recognized based on this
`extracted image signal. The recognition results are
`then sent to an executing part 15.
 The executing part 15 is also supplied with a
detection signal corresponding to the vehicle
speed from a vehicle speed sensor 16 and with a
signal from a headlight switch 17 indicating
whether the headlights are set to a high or low
beam. The executing part 15 then executes tasks
`for controlling the headlight beams or issuing a
`warning to the driver based on the recognition
`information, vehicle speed information and
`headlight information.
` Fig. 3 shows the flow of operating states of the
`device described above, which starts when the
`ignition switch of the vehicle is turned on. In step
`101, whether it is nighttime is determined
`according to whether the headlights are lit, and in
`the case that it is determined to be nighttime, the
operation advances to step 102, in which settings
are initialized. In this initializing step 102,
the scanning
`
`
`recognized. In this example, color image signals
`corresponding to the luminescent colors of
`headlights and taillights are extracted. Conditional
`expressions for the features extraction are then
`set, and the image signals are extracted in
`accordance with the conditional expressions.
` For example, with white luminescence such as
`when a headlight is lit, the R, G and B values are
`large and there is little difference between the
`values. The conditional expressions for white
`luminescence are as follows.
` | R - G | < ε / 10
` | G - B | < ε / 10
` | B - R | < ε / 10
 4ε / 5 < R, G, B ... (1)
`The potential values that R, G and B may assume
`range from 0 to ε.
` With red luminescence when a taillight is lit,
`the value of R (red) is at least twice that of G
`(green) and B (blue). Therefore, the extraction
`conditional expression for the red luminescence
`of taillights is as follows.
`
`
`
`
`
`
`the vehicle is Z, and the magnification of the
`camera 11 is β, the following equation holds true.
 β = f / Z ... (3)
If the distance between the taillights is R when
β is 1 (that is, R is the actual taillight
spacing), the following equation then holds for
the imaged taillight spacing r.
 β = r / R ... (4)
` From equations (3) and (4), the distance Z
`between vehicles is obtained by the following
`equation.
 Z = f R / r ... (5)
` The distance between vehicles is calculated in
`this way every 0.05 second as the image data are
`stored, and the speed of the device vehicle relative
`to a vehicle traveling ahead is calculated from the
`distance between vehicles obtained every 0.05
`second. Specifically, 0.05 second after a taillight
`image such as shown in Fig. 5(A) has been
`obtained, the same taillight image is as shown in
`Fig. 5(B), and the distance between taillights 52
`and 53 changes from r1 to r2. If Z1 is the distance
`between vehicles calculated using the distance r2,
`the speed V relative to the vehicle ahead can be
`obtained by the following equation.
`
`
`that they are high beams, the headlights are
`switched to low beams in step 114.
` In the case that headlights were not recognized
`in step 112, it is determined that there is neither a
`car traveling ahead nor an oncoming car, in which
`case, the operation advances to step 115. In step
`115, the past headlight setting status is determined
`from the contents stored in memory, and in the
`case that a state of high beams has been stored in
`memory, the operation advances to step 116 to
`switch the headlights to high beams. For example,
`in the case that the headlights have been switched
`to low beams in step 108 or 114 from a state of
`traveling with high beams on and the previous
`high beam state has been stored in memory, the
`headlights are switched to high beams in step
116 after passing the vehicle ahead or being
`passed by the oncoming vehicle.
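
A compact sketch of this beam-switching logic
(steps 107-108 and 112-116), together with the
stored previous-state memory, might look as
follows; the class name, method name and return
strings are the editor's assumptions.

    # Sketch of the high/low beam control with the stored "was on high beams"
    # flag described in steps 107-108 and 112-116.  Names are illustrative.
    class BeamController:
        def __init__(self, high_beams_on: bool):
            self.high_beams_on = high_beams_on
            self.restore_high = False   # memory: switched high -> low automatically

        def update(self, taillights: bool, headlights: bool) -> str:
            if taillights or headlights:          # vehicle ahead or oncoming vehicle
                if self.high_beams_on:            # steps 108 / 114
                    self.high_beams_on = False
                    self.restore_high = True      # remember the previous high state
                    return "switched to low beams"
                return "already on low beams"
            if self.restore_high:                 # steps 115 / 116: nothing recognized
                self.high_beams_on = True
                self.restore_high = False         # memory erased on return to high
                return "restored high beams"
            return "no change"

    ctrl = BeamController(high_beams_on=True)
    print(ctrl.update(taillights=True,  headlights=False))  # switched to low beams
    print(ctrl.update(taillights=True,  headlights=False))  # already on low beams
    print(ctrl.update(taillights=False, headlights=False))  # restored high beams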
` In other words, with the device, in the case that
`there is a vehicle ahead or an oncoming vehicle
`while traveling at night and the headlights are in
`the high beam state, for example, the headlights
`are automatically switched to low beams, thus
`
`headlights to low beams, and the operation
advances to step 109. In this case, the fact that
the headlights have been switched from high to
low beams is stored in a memory. This storage is
`erased in the case that the headlights are returned
`to high beams or the ignition switch is
`disengaged, but is retained in the meantime.
`Alternatively, in the case that the headlights are
`determined to be low beams in step 107, the
`operation advances directly to step 109.
` In step 109, the image data stored every 0.05
`second in the memory 142 are inputted to a
`computation unit 144, and in the next step 110,
`the distance Z from a vehicle traveling ahead and
`the speed relative to the vehicle traveling ahead
`are calculated.
` This distance Z from a vehicle traveling ahead
`is calculated based, for example, on a distance r1
between the recognized taillights 52 and 53.
Specifically, the distance Z is obtained from r1 by a
`calculation such as the following:
` If the focal length of the television camera 11
`is f, the distance from the lens of the camera 11 to
`
`
`
 V = (Z - Z1) / 0.05 ... (6)
` The distance Z from and the speed V relative
`to a vehicle ahead are obtained by such a
`calculation in step 110, and the results of this
`calculation are displayed in step 111.
 A numerical display on a panel meter of the
vehicle, for example, may be used as the display
means in step 111.
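
A worked numerical sketch of equations (3)
through (6) follows; the focal length (expressed
here in pixel units), the actual taillight
spacing and the image spacings are fabricated
example values, not values from the
specification.

    # Distance and relative speed from the taillight image, per equations
    # (3)-(6): beta = f / Z, beta = r / R, hence Z = f * R / r, and
    # V = (Z - Z1) / 0.05 for images taken 0.05 second apart.
    # All numeric values below are fabricated for illustration.
    f = 750.0          # focal length of camera 11, in pixel units (assumed)
    R = 1.5            # actual spacing of the taillights in meters (assumed)

    def distance(r_image: float) -> float:
        """Equation (5): inter-vehicle distance Z from image spacing r."""
        return f * R / r_image

    r1, r2 = 37.5, 38.0               # taillight spacing in the image, in pixels
    Z  = distance(r1)                 # 30.0 m at the first frame
    Z1 = distance(r2)                 # about 29.6 m in the frame taken 0.05 s later
    V  = (Z - Z1) / 0.05              # equation (6): about 7.9 m/s closing speed

    print(f"Z = {Z:.1f} m, Z1 = {Z1:.1f} m, V = {V:.1f} m/s")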
 In the case that a taillight was not recognized
in step 106, the operation advances to step 112.
In this step 112, the headlights of an oncoming
vehicle are recognized by determining whether
there are two white luminescent colors 62 and 63
at the same height within a setting range 61
corresponding to the oncoming traffic lane on the
screen, as shown in Fig. 6.
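
A minimal sketch of this pairing test, usable for
both the taillight check of step 106 and the
headlight check of step 112, is shown below; the
pixel bounds of the setting ranges and the height
tolerance are assumed values.

    # Recognize a lamp pair: two lamp images of the same color at roughly the
    # same height inside a setting range (range 51 for the own lane, range 61
    # for the oncoming lane).  Ranges and tolerance are illustrative.
    from itertools import combinations

    def pair_in_range(lamps, x_range, y_tolerance=5):
        """lamps: list of (x, y) lamp centers of one color (red or white)."""
        x_min, x_max = x_range
        inside = [(x, y) for x, y in lamps if x_min <= x <= x_max]
        return any(abs(y1 - y2) <= y_tolerance
                   for (_, y1), (_, y2) in combinations(inside, 2))

    OWN_LANE_RANGE = (100, 220)       # setting range 51 (assumed pixel bounds)
    ONCOMING_RANGE = (240, 360)       # setting range 61 (assumed pixel bounds)

    red_lamps   = [(130, 80), (180, 82)]           # taillights 52 and 53
    white_lamps = [(260, 75), (300, 74), (20, 10)]

    print("vehicle ahead:", pair_in_range(red_lamps, OWN_LANE_RANGE))
    print("oncoming vehicle:", pair_in_range(white_lamps, ONCOMING_RANGE))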
 In the case that headlights have been
recognized in this step 112, the operation
advances to step 113, in which the state of the
headlights of the device vehicle is determined in
the same manner as in step 107. In the case
`
`
`
`
`
`
`monitor vehicles ahead. The device can also be used
`effectively as a safe driving warning system by setting a
`safe distance between vehicles according to the absolute
`speed of the device vehicle, and warning the driver by
`voice or a buzzer if the distance between vehicles has
become less than the set distance between
vehicles. Delay timer processing may also be used
in the working example described earlier when
controlling to switch from low beams to high
beams.
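
As a sketch of such a safe-distance warning (the
particular speed-to-distance rule below, half the
speed in km/h taken as meters, is a common rule
of thumb chosen by the editor and is not
specified in the text):

    # Safe-distance warning: set a safe inter-vehicle distance from the absolute
    # speed of the device vehicle and warn by voice or buzzer when the measured
    # distance is shorter.  The rule of thumb used here is an assumption.
    def safe_distance_m(speed_kmh: float) -> float:
        return speed_kmh / 2.0          # e.g. 80 km/h -> 40 m

    def check_following_distance(speed_kmh: float, measured_distance_m: float) -> str:
        if measured_distance_m < safe_distance_m(speed_kmh):
            return "warn driver (voice or buzzer): following too closely"
        return "inter-vehicle distance OK"

    print(check_following_distance(speed_kmh=80.0, measured_distance_m=30.0))
    # 30 m is less than the 40 m safe distance, so a warning is issued.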
`
`
`<Effects of the invention>
` As described above, the traveling vehicle recognition
`device of the present invention recognizes whether there
`is a vehicle traveling ahead or an oncoming vehicle,
`especially at night, and automatically controls the
`headlights according to the recognition result. Therefore,
`the invention can achieve a significant effect in terms of
`safe driving by executing a basic safety operation of
`nighttime driving automatically. In connection with this,
`the invention can also issue various warning actions for
`safe driving, thereby effectively expanding the range of
`
`automating road traffic safety. The distance from
`and the speed relative to a vehicle ahead at night
`can also be known accurately, and can be used
`effectively as a means for preventing rear-end
`collisions. Because the distance between vehicles
`and relative speed have been calculated in this
`case, these data can be used to predict a potential
`rear-end collision, and an audible or other type of
`warning can be issued to a driver based on this
`prediction. In other words, the device may also be
`used as a means for preventing drowsy driving.
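
One way to turn the calculated distance and
relative speed into such a prediction is a
time-to-collision check; the function below and
its 4-second threshold are the editor's
illustration, not part of the specification.

    # Predict a potential rear-end collision from the inter-vehicle distance Z
    # and the closing speed V (both already computed in step 110).  The 4-second
    # time-to-collision threshold is an assumed illustrative value.
    def rear_end_warning(distance_m: float, closing_speed_mps: float,
                         ttc_threshold_s: float = 4.0) -> bool:
        if closing_speed_mps <= 0.0:        # not closing on the vehicle ahead
            return False
        return distance_m / closing_speed_mps < ttc_threshold_s

    print(rear_end_warning(distance_m=30.0, closing_speed_mps=7.9))   # True: warn
    print(rear_end_warning(distance_m=30.0, closing_speed_mps=2.0))   # False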
` The television camera for imaging in the
`forward direction may be mounted at any location
`from which the forward direction of the vehicle
`can be imaged. The television camera may also be
`configured such that the mounting angle of the
`camera can be varied, such that, for example, the
`angle is automatically controlled in response to
`the angle of steering maneuvers. When configured
`in this way, the camera always faces the steered
`direction of the vehicle so as to effectively
`
`
`applications for safe driving.
`
`
`4. Brief explanation of the drawings
` Fig. 1 is a block diagram illustrating the
`recognition device according to a working
`example of the present invention; Fig. 2 is a
`diagram illustrating the setting state of a
`television camera in this working example; Fig. 3
`is a flowchart illustrating the operation states of
`this working example; Fig. 4 is a diagram
`showing an example of a configuration of the
`image signal processor of this working example;
`Fig. 5 is a diagram illustrating image states for
recognizing taillights; and Fig. 6 is a diagram
`illustrating image states for recognizing
`headlights.
` 11 ... Color television camera, 12 ... Vehicle,
13 ... Decoder, 14 ... Image signal processor, 15 ...
`Executing part, 16 ... Vehicle speed sensor, 17 ...
`Headlight switch
`
`
`
`Takehiko Suzue, Patent Attorney
`
`
`
`
`
`
`
`
`
`
`
`
`
`Park IP Translations
`
`
`
`
`
`May 5, 2013
`
`
`
`Certification
`
`
`
`
`
`
`This is to certify that the attached translation is, to the best
`of my knowledge and belief, a true and accurate translation from
`Japanese into English of the patent that is entitled: Unexamined
Patent Publication (A) S62-131837.
`
`
`
`
`_______________________________________
`
`Abraham I. Holczer
`
`Project Manager
`
`
`
`Park Case # 39368
`
`134 W. 29th Street 5th Floor New York, N.Y. 10001
`Phone: 212-581-8870 Fax: 212-581-5577
`