(19) Japanese Patent Office (JP)
(12) Kokai Unexamined Patent Application Bulletin (A)
(11) Laid-Open Patent Application No. 6-124340
(43) Publication Date: May 6, 1994
Number of Claims: 5   Number of Pages: 16   Examination Request: not yet made

(51) Int. Cl.5    Identification Code    Internal File No.    FI    Tech. Indic.
G06F 15/62        380                    9287-5L
G01C 21/00        415                    9287-5L
G01S 13/86                               7015-5J              N
     13/93                               7015-5J              N
(Continued on the last page)

(21) Application No.: 5-138145
(22) Application Date: June 10, 1993
(31) Number assigned to priority application: 4-193993
(32) Date of filing of priority application: July 21, 1992
(33) Country in which priority application was filed: Japan
(71) Applicant: 000003997 NISSAN MOTOR CO.,LTD., 2 Takaracho, Kanagawa-ku, Yokohama-shi, Kanagawa-ken
(72) Inventor: YAMAMURA, Tomohiro, NISSAN MOTOR CO.,LTD., 2 Takaracho, Kanagawa-ku, Yokohama-shi, Kanagawa-ken
(72) Inventor: SATOU, Hiroshi, NISSAN MOTOR CO.,LTD., 2 Takaracho, Kanagawa-ku, Yokohama-shi, Kanagawa-ken
(74) Agent: Patent Attorney, NAKAMURA, Junnosuke (and one more person)

(54) [Title of the Invention] Vehicle Image Processing Device

(57) [Abstract]
[Problem] To provide a vehicle image processing device that performs suitable image processing and can be practically realized for a vehicle.
[Configuration] A vehicle image processing device that comprises a distance measurement means 66 that detects information about the distance and direction to an object outside the vehicle, an assessment means 67 that decides upon an image processing method based on the measurement results, and an image processing means 69 that, based on the above set image processing method, performs image processing on the image detected by an image sensor 68. The image processing method is, for example, processing that decides upon a region on which to perform image processing, processing that decides upon an object on which to perform image processing, and processing that enlarges/reduces/direction-converts a specified object in an image. With the above configuration, extraction processing is done on image information focusing on an object of scrutiny as the target, so that the range in which data processing is performed is limited, the number of items of data to be handled is greatly reduced, the processing steps become fast, and extraction of the features of the object under scrutiny can be realized at high speed.
(FIG. 1)

Translation by Patent Translations Inc. 1-800-844-0494 mail@PatentTranslations.com

IPR2013-00419 - Substitute Ex. 1013
Toyota Motor Corp., Petitioner

(2)  JP-06-124340-A

[Claims]
[Claim 1] A vehicle image processing device characterized by comprising:
an image sensor that is carried on the vehicle and into which is input image information on the vehicle surroundings;
a distance measurement means that determines information on the distance and direction to a single object or a plurality of objects present outside the vehicle;
an assessment means that assesses and sets an image processing method based on the distance and direction information; and
an image processing means that, based on the image processing method set by the assessment means, performs image processing on the image information determined by the image sensor.
[Claim 2] The vehicle image processing device recited in claim 1, characterized in that the assessment means assesses and sets an image processing method including at least one type of processing among: processing to decide upon the region on which to perform image processing, processing to decide upon the object on which to perform image processing, and processing to enlarge/reduce/direction-convert a specified object in an image.
[Claim 3] The vehicle image processing device recited in claim 1, characterized in that the image sensor enlarges/reduces/direction-converts a specified object in an image according to the enlargement/reduction/direction-conversion processing that is set by the assessment means.
[Claim 4] The vehicle image processing device recited in claim 2 or claim 3, characterized by comprising a measurement means that detects the travel state of one's own vehicle, wherein the assessment means decides upon the object on which to perform image processing, according to the detection results of the measurement means.
[Claim 5] A vehicle image processing device characterized by comprising:
a reflecting body detection means that detects the distance and direction from one's own vehicle to a reflecting body by emitting electromagnetic waves in the direction of travel of one's own vehicle while sweeping a prescribed angle range, receiving the reflected wave from the reflecting body, and calculating the distance to the reflecting body based on the propagation delay time from emission to reception of the electromagnetic wave at prescribed sweep angles;
an image pickup means that images the scenery to the front in the direction of travel of one's own vehicle;
an image recognition means that processes the image to the front of the vehicle imaged by the image pickup means and detects preceding vehicles that are present ahead;
an image processing region setting means that sets the image processing region in the image recognition means according to the distance and direction detected from the reflecting body detection means; and
a preceding vehicle position output means that outputs the distance or direction, or both, to the reflecting body that is recognized as a preceding vehicle by the image processing recognition means.
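The four elements recited in claim 1 form a simple pipeline: distance measurement feeds an assessment step, which selects the image processing method applied to the sensor image. The sketch below illustrates only that data flow; every name, value, and return type is a hypothetical assumption, not language from the claims.

```python
# Illustrative sketch of the claim-1 pipeline. The distance measurement
# means feeds the assessment means, which selects the image processing
# method applied to the image sensor's output. All names are hypothetical.

def measure_distance_direction():
    # Stand-in for the distance measurement means: returns a list of
    # (distance_m, direction_deg) tuples for objects outside the vehicle.
    return [(25.0, 8.0)]

def assess_processing_method(measurements):
    # Stand-in for the assessment means: derives a processing method
    # (here, a region of interest) from distance/direction information.
    distance, direction = min(measurements)  # nearest object first
    return {"method": "region", "center_deg": direction, "range_m": distance}

def process_image(image, method):
    # Stand-in for the image processing means: applies the selected
    # method to the image information from the image sensor.
    return f"processed {method['method']} around {method['center_deg']} deg"

measurements = measure_distance_direction()
method = assess_processing_method(measurements)
result = process_image("frame", method)
```

The point of the structure is that the assessment step runs before any pixel-level work, so only the selected region or object is ever processed in full.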
[Detailed Description of the Invention]
[0001]
[Field of Industrial Application] This invention relates to a vehicle image processing device that recognizes the environment around the vehicle. Such a vehicle image processing device is used, for example, in a preceding vehicle detection device to be applied in a vehicle automatic travel control device or a warning device for approaching another vehicle, or the like; that is to say, it is used in a device that detects the position of a preceding vehicle that is traveling ahead of one's own vehicle.
[0002]
[Prior Art] Devices for preceding vehicle image processing include, for example, those shown in FIG. 14. In FIG. 14, (a) is a vehicle image processing device, which uses one image sensor to take in information in as wide a range as possible around the vehicle, and which recognizes the information. And (b) is a vehicle image processing device that uses two image sensors, which are set up at a prescribed angle and distance, and extracts features, and the distance and direction to an object, from the parallax between such information.
[0003]
[Problems to Be Solved by the Invention] In a conventional vehicle image processing device such as the above, there have been problems such as the following. Namely, there have been problems in that:
(1) because the image of an object present in the distance will be small, it is difficult to extract its features;
(2) because the entire image that is taken in is processed, the volume of information is large; this makes the processing difficult and slows the processing down, and because the size of the processing device is large, it is difficult to use it as vehicle equipment;
(3) because the method of image processing cannot be freely changed, only specified information can be obtained;
(4) in addition, because it is impossible to accurately measure the distance to an object in a vehicle that is traveling, even if image information is obtained, it cannot be used effectively for warning the driver or for vehicle travel control.
[0004] An object of the present invention, which was devised in order to solve the problems of the prior art as described above, is to provide a vehicle image processing device that can perform appropriate image processing and can be practically realized for a vehicle.
[0005]
[Means for Solving the Problems] In order to achieve the object described above, the present invention is configured as recited in the claims. That is to say, the invention recited in claim 1 comprises an image sensor that is carried on the vehicle and into which is input image information on the vehicle surroundings; a distance measurement means that determines information on the distance and direction to a single object or a plurality of objects present outside the vehicle; an assessment means that assesses and sets an image processing method based on the distance and direction information; and an image processing means that performs image processing on the above image information determined by the image sensor, based on the image processing method set by the assessment means. The assessment means, for example as recited in claim 2, assesses and sets an image processing method including at least one type of processing among: processing to decide upon the region on which to perform image processing, processing to decide upon the object on which to perform image processing, and processing to enlarge/reduce/direction-convert a specified object in an image. Furthermore, the image sensor above, for example as recited in claim 3, enlarges/reduces/direction-converts a specified object in an image according to the enlargement/reduction/direction-conversion processing that is

set by the assessment means. Furthermore, the invention recited in claim 4 is one that, in addition to the configuration described above, is configured to comprise a measurement means that detects the travel state of one's own vehicle and to decide upon the object on which the assessment means performs image processing according to the detection results of the measurement means. Furthermore, the invention recited in claim 5 is one that shows the specific configuration of a case in which the above vehicle image processing device is applied to a preceding vehicle detection device.
[0006]
[Operation] As described above, in the present invention, the assessment means decides upon the image processing method based on the measurement results of the distance measurement means. This image processing method refers to, for example as recited in claim 2, processing to decide upon the region on which to perform image processing, processing to decide upon the object on which to perform image processing, and processing to enlarge/reduce/direction-convert a specified object in an image. Furthermore, the image processing means performs image processing based on the above set image processing method. Accordingly, by processing with a focus on extracting image information targeting important objects requiring image processing, such as objects that block the travel of the vehicle, the range in which data processing is performed by the image processing means is limited, so the number of data items to handle is greatly reduced and the processing step is speeded up, making it possible to realize the extraction of the features of the object under scrutiny at high speed. Furthermore, as recited in claim 3, in an image sensor that enlarges/reduces/direction-converts a specified object in an image, an optimum image for image processing can be obtained, so that it is possible to obtain, from the image processing means, detailed information that cannot be obtained from data with fewer pixels, such as information by which one can recognize the license plate on a vehicle. Furthermore, in an invention, such as that recited in claim 4, which comprises a measurement means that detects the travel state of one's own vehicle, and wherein the assessment means decides upon the object on which to perform image processing according to the detection results, if multiple objects are present in front of the vehicle, the object to scrutinize can be selected according to one's own vehicle's travel state or the like. Furthermore, in the invention recited in claim 5, the configuration is such that the distance and direction to a reflecting body are detected by a reflecting body detection means; an image processing region setting means is provided that sets the processing region in the image processing recognition means according to the results; and the image processing region is determined according to the distance and direction to the reflecting body. Thus it becomes possible to reliably detect the position of the preceding vehicle, and to restrict the region on which to perform image processing to a narrow range, and thus it becomes possible to perform real-time computation processing without using an ultra-high-speed computer.
[0007]
[Working Examples] In the following, this invention is described with reference to the drawings. FIG. 1 is a block diagram showing a first working example of this invention. In FIG. 1, a distance measurement means 60, which is for example radar that uses light, radio waves, ultrasound waves, or the like, can obtain information on the distance and direction to an object. Among [devices] that use light, there are those that measure the distance by combining a laser with a photosensor and measuring the elapsed time for the laser light to be reflected by the object and return. Note that, if the laser light that is emitted is scanned vertically and horizontally, the distance in each direction can be measured, so distance and direction information can be obtained. Furthermore, likewise [a device] that uses radio waves or ultrasound waves can be realized with essentially the same configuration. Furthermore, the assessment means 61 is a means that assesses and sets the image processing method based on the measurement results of the distance measurement means 60. Note that the above image processing method refers to, in this working example, processing to decide upon the region in which image processing is to be performed. Furthermore, the image sensor 62 is a means that inputs the image information on the surroundings of the vehicle; for example, it is a video camera that uses CCDs. Furthermore, the image processing means 63 is for example a means that causes pattern recognition to be performed. This image processing means 63 and the above assessment means 61 can be constituted by a computer, for example. Note that S1 is a distance and direction signal, S2 is an image signal, S3 is an assessment signal, and S4 is a feature extraction signal.
[0008] Next, FIG. 2 is a diagram that shows the positional relationship between the vehicle and the object outside the vehicle; (a) is the top view, and (b) is the side view. In FIG. 2, 64 is the vehicle, and 65 is the object outside the vehicle. Furthermore, FIG. 3 is a chart listing an example of distance measurement results, and FIG. 4 is a flowchart showing the processing procedure in the device in FIG. 1. In the following, the operation of the working example in FIG. 1 is described with reference to FIG. 2 to FIG. 4. First, the distance and direction of an object 65 that is present in front of the vehicle is measured by a distance measurement means 60 (not shown in FIG. 2) that is provided at the front end of one's own vehicle 64. If θx is set to the vehicle's horizontal direction and θy to the vertical direction (see FIG. 2), with this distance measurement means 60, data showing the distance to each bearing can be obtained in a matrix L(θx, θy). An example of the distance measurement results shown in FIG. 3 are the results of the case in which there is an object 65 to the right [sic] as shown in FIG. 2. The assessment means 61 infers the position of the object based on the above measurement results. This inference method is carried out for example as follows. Namely, for each L(θx, θy), the value is compared with a neighboring L(θx′, θy′). Furthermore, because the values of L(θx, θy) and L(θx′, θy′) vary greatly at the border regions between where the object is present and where the object is not present, it is clear that the object is present at that position. In the above example, the object is inferred to be to the right. In addition, the assessment means 61 sets the method of image processing based on the above results. In the case of this working example, processing to decide upon the region on which to perform image processing is used as the method of image processing. Therefore the assessment means 61 decides that the region on which to perform image processing is that surrounding the location of the object. In the decision method at this time, the region is made larger than the object by multiplying the inferred size of the object by a certain suitable value. Next, with respect to image information about the front of the vehicle determined by the image sensor 62, based on information sent from the assessment means 61, the image processing means 63 sets the region on which to perform image processing, and performs data processing focusing on that region. In the above example, image processing focuses on objects to the right. As above, as a result of detecting the distance and direction of objects that will block the travel of one's own vehicle, and the assessment
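The inference step described in paragraph [0008] compares each L(θx, θy) with a neighboring cell and treats a large jump in distance as the border of an object. A minimal sketch of that comparison follows; the sample matrix and the threshold value are illustrative assumptions, not data from the patent.

```python
# Sketch of the [0008] inference: an object border is assumed where the
# distance matrix L(θx, θy) differs greatly from a neighboring cell.
# The sample matrix and threshold below are illustrative only.

def find_object_cells(L, threshold):
    """Return (row, col) cells whose distance differs from a horizontal
    or vertical neighbor by more than `threshold` (a border region)."""
    rows, cols = len(L), len(L[0])
    borders = set()
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < rows and nj < cols and abs(L[i][j] - L[ni][nj]) > threshold:
                    borders.add((i, j))
                    borders.add((ni, nj))
    return borders

# Far background (100 m) everywhere except an object at ~20 m on the right.
L = [
    [100, 100, 20],
    [100, 100, 20],
    [100, 100, 100],
]
cells = find_object_cells(L, threshold=30)
```

Here the flagged cells cluster in the right-hand column, so the object is inferred to be to the right, matching the working example; the processing region would then be set somewhat larger than this cluster, as the text describes.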

means 61 performing processing focusing on extracting image information targeting that object, the range in which data processing is performed by the image processing means 63 is restricted, so the number of data items to be handled is greatly reduced, the processing step is speeded up, and extraction of the features of the object under scrutiny can be realized at high speed.
[0009] Next, FIG. 5 is a block diagram showing the second working example of the present invention. In this working example, the present invention is applied to an image processing device in which detailed image information is required for the object. In FIG. 5, the distance measurement means 66 is the same as in FIG. 1 above. Furthermore, the assessment means 67 infers the distance and direction to an object in the same way as in FIG. 1, but in addition it outputs a signal that controls the image sensor 68. Furthermore, the image sensor 68 is an image sensor that can enlarge, reduce, focus on, and rotation-control the image information according to control signals given from the assessment means 67 above. For such an image sensor, one can use a video camera device that has a lens with a zoom mechanism, and a mechanism that can rotate the camera as a whole. Furthermore, the image processing means 69 is, for example, a detailed image processing means that can recognize a vehicle license plate. Note that S1 is a distance and direction signal, S2 is an image signal, S3 is an assessment signal, S4 is a feature extraction signal, and S5 is an enlargement/reduction/direction conversion signal. In the environment shown in FIG. 2 above, an object is present to the right. With the distance measurement means 66, distance data is obtained in a matrix in the same way as in FIG. 1 above. Based on the above results, the assessment means 67 infers the distance and direction to the object in the same way as in the first working example above, and controls the pan, tilt, zoom, and focus adjustment mechanism of the image sensor 68. For example, if the data shown in FIG. 3 above is obtained, the camera is moved to the right side, and the focus is adjusted to match the distance. With such processing, the optimum image for image processing can be obtained. Therefore, detailed information that cannot be obtained from data with few pixels, such as information by which a vehicle's license plate can be recognized, can be obtained from the image processing means 69. Note that detailed information that can be obtained with the above image processing means 69 includes the brake lamps of the preceding vehicle, the lighting of its turn signals, the size of the vehicle, its direction of travel, and the like, and the features of these can be extracted. Furthermore, as a result of the image processing means 69 performing image processing for the object of scrutiny that is set by the assessment means 67, features of pedestrians or buildings can also be extracted, and one can also recognize the situation in terms of traffic signs, the position of traffic signals, the presence of any railroad crossings, road equipment and the like.
[0010] Next, FIG. 6 is a block diagram of a third working example of the present invention. In FIG. 6, a distance measurement means 70 is the same as that used in the first working example. Furthermore, a vehicle speed sensor 71 measures the travel speed of the vehicle; for example, it detects the rotation speed of a wheel per unit time, and calculates the speed of the vehicle from the length of the circumference of the wheel, or it reflects light or radio waves onto the road surface and calculates the speed by way of the Doppler effect. Furthermore, a steering angle sensor 72 detects the steering angle; for example, it makes use of a steering wheel that has a built-in rotary encoder. Furthermore, a human interface 73, which is a device that exchanges information between the image processing device and an occupant, is for example an operation switch and a display device provided inside the cab of the vehicle. An occupant operates this operation switch to input signals to the image processing device, and the information of the image processing device is given to the occupant by being displayed on a display device. Furthermore, an image sensor 74 is the same as that used in the second working example above. Furthermore, an assessment standard setting means 75 is a means (described in detail below) that sets the assessment standard for an assessment means 76. Furthermore, 76 is the assessment means, and 77 is an image processing means. The assessment standard setting means 75, the assessment means 76 and the image processing means 77 can be constituted by, for example, a computer. Note that S1 is a distance and direction signal, S2 is an image signal, S3 is an assessment signal, S4 is a feature extraction signal, S5 is an enlargement/reduction/direction conversion signal, S6 is a vehicle speed signal, S7 is a steering angle signal, S8 is a standard value setting signal, and S9 is a standard signal.
[0011] Next, the operation is described. If there are multiple objects present in front of the vehicle, it becomes necessary to set multiple image processing regions, but in this working example, the configuration is such that in a case such as this, the object that is to be scrutinized varies depending on the size of, and distance to, the object, as well as the travel state of one's own vehicle and the like. The assessment standard setting means 75 sets the assessment standard for the assessment means 76. That is to say, the assessment means 76 assesses what kind of environment one's own vehicle is currently in, and in accordance with this, the object to be scrutinized is decided upon; in the above assessment, the assessment standard setting means 75 sets the assessment standard. For example, detection of the state of one's own vehicle is performed by detecting the travel speed and steering angle by way of a vehicle speed sensor and a steering angle sensor; if the travel speed is greater than or equal to a prescribed value and the steering angle is less than or equal to a prescribed value, it can be assessed that the vehicle is traveling along a highway, while if the steering angle is greater than or equal to a prescribed value and the vehicle speed is less than or equal to a prescribed value, it can be assessed that it is traveling near an intersection; the assessment standard setting means 75 sets the threshold values for the vehicle speed values and steering angle values that will be the standard for this assessment. The settings for these standard values are input using a human interface 73 such as operation switches. As stated above, the assessment means 76 assesses obstacles in front as the objects that are to be scrutinized if it assesses that it is traveling along a highway, and assesses the traffic signal in front as the object that is to be scrutinized if it assesses that it is traveling near an intersection. Furthermore, image processing is performed by the image processing means 77 on the object that is to be scrutinized. Note that, in the above description, a case in which the setting of the standard values of the assessment standard setting means 75 is performed using a human interface 73 is described, but one can also perform the environment setting automatically, by way of inputting the image processing results. For example, if the present invention is applied to a system in which the image processing means 77 detects white lines on the road, then using the property that a continuous white line will not be detected near an intersection, if an absence of the white line is detected, the assessment will be that it is near an intersection, and the object that is to be
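The travel-state assessment in paragraph [0011] reduces to two threshold comparisons on vehicle speed and steering angle, which then select the object of scrutiny. A minimal sketch follows; the numeric threshold values are illustrative assumptions only, since the patent leaves them to the assessment standard setting means.

```python
# Sketch of the [0011] assessment standard: speed and steering-angle
# thresholds decide whether the vehicle is on a highway or near an
# intersection, which in turn selects the object of scrutiny.
# The numeric thresholds are illustrative assumptions, not from the patent.

def assess_environment(speed_kmh, steering_deg,
                       speed_threshold=60, steer_threshold=15):
    if speed_kmh >= speed_threshold and abs(steering_deg) <= steer_threshold:
        return "highway"        # scrutinize obstacles/preceding vehicle ahead
    if abs(steering_deg) >= steer_threshold and speed_kmh <= speed_threshold:
        return "intersection"   # scrutinize the traffic signal in front
    return "undetermined"

env = assess_environment(speed_kmh=90, steering_deg=3)
```

As the text notes, the thresholds themselves would be entered through the human interface 73 or adjusted automatically from image processing results such as white-line detection.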

scrutinized can be set to be the traffic signal in front. Summarizing the above, we have the following. First, the distance and direction to an object is measured by the distance measurement means 70. Depending on these measurement results and the travel situation of the vehicle, the assessment standard setting means 75 and the assessment means 76 decide upon the object on which to perform image processing. For example, if it is assessed as traveling on a highway, the preceding vehicle is taken as the object, and if it is assessed as traveling near an intersection, the traffic signal is determined to be the object. But setting can be made as appropriate depending on the situation, so this is not limited to the above example. Furthermore, when performing image processing, the assessment means 76 makes an adjustment (enlargement/reduction/direction conversion, or the like) to the image sensor 74 so as to result in the optimum image, and decides which part of the image taken in from the image sensor 74 is to be processed. The image processing means 77, which has taken in the image of the image sensor 74, performs feature extraction of the object in accordance with the targeted object. This feature extraction may be, for example, detection of an obstacle, detection of the lighting up of brake lamps on a vehicle for which there is risk of a collision, recognition of the restricted speed as indicated on a traffic sign, or the like. In addition, by calculating the temporal change in the distance information, the relative speed between one's own vehicle and the targeted object can also be detected. Note that, as in the second working example, in this working example as well, detailed information obtained by the image processing means 77 includes the brake lamps of the preceding vehicle, the lighting-up of turn signals, the size of a vehicle, the direction of travel, and the like, and these features can be extracted. Furthermore, with regard to the object of scrutiny that is set by the assessment means 76, as a result of the image processing means 77 performing image processing, the features of pedestrians or buildings can be extracted, and one can recognize the situation, including traffic signs, the positions of traffic signals, the presence of any railroad crossings, road equipment and the like. Furthermore, additional distance measurement means 70 or image sensors 74 can be installed as necessary, such as on the front, rear, left, or right of the vehicle. In this case, the features of an object can be extracted in independent directions.
[0012] Next, a working example in which the present invention is applied to the detection of the preceding vehicle is described. In terms of conventional preceding vehicle detection devices, there are those that make use of the reflection of electromagnetic waves, as described for example in JP-61-023985-A and the like. This is such that, by emitting electromagnetic waves (for example, laser light or the like) while sweeping through a prescribed angle, receiving the wave reflected from a reflecting body, and calculating the distance to the reflecting body based on the propagation delay time from the emission to the reception of the electromagnetic wave at prescribed sweep angles, the distance and direction from one's own vehicle to the reflecting body is detected, and information on the direction and distance from one's own vehicle to a reflecting object can be obtained. Note that, in general, reflex reflectors are installed at the rear of a vehicle in order to improve its visibility from behind, and this reflex reflector, being a reflecting body, reflects laser light and other electromagnetic waves, making easy detection possible. Furthermore, in addition to this, a device has also been devised in which, using a television camera or the like, which is oriented to the front of the vehicle, the scenery in front of the vehicle is input, image processing is performed on the image that is input, and the preceding vehicle or road is recognized.
[0013] However, a conventional preceding vehicle detection device such as this has presented the following problems. First, in a preceding vehicle detection device that makes use of the reflection of an electromagnetic wave such as laser radar, it is possible that there will be, not only reflex reflectors provided on the rear of the vehicle as reflecting bodies, but also corner reflectors and the like on a guard rail installed on the shoulder of the road, and thus there is the problem that it is very difficult to select only the reflection from the reflex reflectors on the preceding car from among a large number of reflecting bodies. Furthermore, it is relatively easy to compare the amount of change per unit time in the detected distance to the reflecting body with one's own vehicle's travel speed, to distinguish whether the reflecting body is a stationary object or a moving object, and to recognize only moving objects as vehicles. But with this method there is the problem that, for a vehicle or the like that is stopped at the tail end of congestion on a highway, an assessment will be made to the effect that, because it is a stationary object, it is not a vehicle, and thus the position of the preceding vehicle cannot be detected; because of this, it is impossible to perform high-precision travel control or to give a suitable warning for approaching another vehicle too closely. Furthermore, in a preceding vehicle detection device that image-processes the image from a television camera and recognizes the preceding vehicle, with current technology it is relatively easy to use image processing to recognize the white lines that indicate the edge of the roadway or traffic lanes, but one cannot expect very high precision for distinguishing a preceding vehicle. Furthermore, for distinguishing a preceding vehicle, it is also necessary to perform processing many times, which makes it necessary to process a large volume of image information. Then there is the problem that, if this is applied to vehicle automatic travel or warning devices for rear-end collisions or the like, it is essential that such processing be performed at high speed (in real time), which requires a very-high-speed computer. Furthermore, in technology that uses ordinary image processing for purposes other than preceding vehicle detection, the region within the entire screen on which image processing must be performed (the region under scrutiny) is limited, and various types of processing will be performed within this [region]; thus, a method can be applied that reduces the overall calculation volume, but if a preceding
