`
`Wassim G. Najm
`
`Volpe National Transportation Systems Center
`Kendall Square, Cambridge, MA 02142
`
`ABSTRACT
`
`Sensor technologies of collision avoidance systems are identified based on a literature search about available products,
`prototypes, and experimental systems. These sensors constitute the front-end functional element of potential
`countermeasure systems to rear-end, backing, lane change, roadway departure, opposite direction, intersection, and reduced
`visibility crashes. The characteristics and capabilities of alternative sensor technologies are described based on published
`literature. Crash avoidance sensor technologies encompass microwave, millimeter-wave, and near-infrared radars;
ultrasonic transducers; charge-coupled device cameras operating in near-infrared and visible bands; uncooled, passive far-
`infrared detectors; millimeter-wave imaging radar; and near-infrared communications.
`
`1. INTRODUCTION
`
`The collision avoidance capabilities of motor vehicles can be significantly improved by applying advanced technologies
`to assist drivers in avoiding crashes. Recent advances in electronics, communications, processors, and control systems now
`allow for the design of collision avoidance systems with increased sophistication, reduced cost, and high reliability.
Although traffic-related deaths in 1992 declined to their lowest point in 30 years in the United States1, nearly 6 million
police-reported crashes still occur every year according to the National Highway Traffic Safety Administration's General
Estimates System accident database. The societal costs of motor vehicle crashes exceed $137 billion annually1.
`
`In an effort to identify crash avoidance opportunities, a preliminary analysis of seven major crash types was conducted
`to identify crash causal factors and applicable countermeasure concepts, model crash scenarios and avoidance maneuvers,
`provide preliminary estimates of countermeasure effectiveness, and identify research and data needs2. These target crashes
`include rear-end, backing, lane change/merge, roadway departure, opposite direction, intersection/crossing path, and reduced
`visibility. Based on crash subtypes and causes, various countermeasure concepts were devised to enhance the crash
avoidance capabilities of the driver-vehicle system. For the purpose of the sensor technology discussion in this paper, the
functions of a sample of countermeasure concepts are defined below2:
`
- Headway Detection: advises the driver of an imminent crash with an obstacle in the vehicle's path or to keep a safe headway when following another vehicle.
- Proximity Detection: provides the driver with information about vehicles in adjacent lanes or obstacles in the vehicle's path while backing up.
- Lane Position Monitor: advises the driver if the vehicle is drifting out of its travel lane.
- In-Vehicle Signing: conveys to the driver posted or dynamic information provided by the traffic control infrastructure.
- Gap Acceptance Aid: advises the driver when it is safe to straight-cross or turn at intersections.
- Vision Enhancement: presents the driver with a clear image of the environment ahead during reduced visibility conditions (e.g., night/inclement weather).
`
`The application of enabling technologies to motor vehicles, as building blocks for collision avoidance products, must
`meet stringent requirements in performance, cost, reliability, fault tolerance, and environmental hardening3. In addition to
low cost, automotive product requirements approach those of military electronics, including a 5- to 10-year life, over 1,000
thermal cycles from -40° to +150°C, 150-g mechanical shocks, and 50-g sinewave vibrations4. Moreover, automotive
`electronics must be immune to vehicle fluids and electromagnetic interference. Sensor technologies, in particular, must
`operate under additional conditions such as dust, dirt, snow, ice, fog, and adverse weather.
`
`This paper compares sensor technologies of collision avoidance systems based on a literature search of available
`products, prototypes, and experimental systems. The characteristics of existing sensors and their performance are described
`as reported in the literature. The technologies of collision avoidance systems were previously reviewed with regard to their
`three main functional elements: sensor, decision-making capability, and driver/system interface5. This paper concentrates
`on the sensor element of crash countermeasure systems and addresses enabling sensor technologies for realizing
the countermeasure concepts defined above. Next, sensor types are discussed in the order of the entries in
Table 1, which lists the countermeasure concepts, enabling sensor technologies, and applicability to target crashes.
`
`Table 1: Countermeasure Concepts and Applicable Sensor Technologies
`
Concepts               | Sensors                                 | Target Crashes
Headway Detection      | radar, laser, & video                   | rear-end
Proximity Detection    | radar & ultrasonic                      | backing and lane change/merge
Lane Position Monitor  | laser & video                           | road departure and opposite direction
In-Vehicle Signing     | video, IR comm., & µ-wave transponder   | road departure, opposite direction, and intersection/crossing path
Gap Acceptance Aid     | video                                   | intersection/crossing path
Vision Enhancement     | radar, passive FIR, & CCD               | reduced visibility
`
`2. HEADWAY DETECTION
`
`Forward-looking sensors (FLSs) are used in headway detection systems to gather information about targets ahead of
`the vehicle, in both active and passive modes. Active sensors employ the principle of radar measurements to determine
`range, relative speed, angular position, and profile of targets. These sensors operate in a wide range of the frequency
spectrum, either in microwave (µ-wave) (1-30 GHz), millimeter-wave (mm-wave) (30-300 GHz), or Near-Infrared (NIR)
(0.75-3 µm) regions. Passive sensors rely on charge-coupled device (CCD) cameras to acquire images of targets ahead and
`measure distances based on video image processing.
`
2.1. Radar-Based Sensors
`
`Table 2 provides the characteristics and distinct features of some selected radar sensor prototypes and products.
`Automotive radar-based FLSs operate in various modes of transmission including pulse, pulse doppler, frequency-modulated
`continuous-wave (FM-CW), binary phase modulation using maximal-length pseudonoise (PN) code sequences, and pulsed
`frequency modulation (PFM).
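
As a rough, hedged illustration of how the modulation schemes above yield both range and relative speed, the Python sketch below works through a triangular FM-CW sweep; it is not taken from any of the sensors in Table 2, and the 77 GHz carrier, 150 MHz sweep bandwidth, and 1 ms ramp time are assumed values chosen only to make the arithmetic concrete.

```python
# Hypothetical triangular FM-CW calculation: the up-sweep and down-sweep beat
# frequencies separate the range and Doppler components of the return.
C = 3.0e8  # speed of light, m/s

def fmcw_range_and_speed(f_beat_up, f_beat_down, f_carrier, sweep_bw, sweep_time):
    """Return (range_m, relative_speed_mps) for one triangular FM-CW cycle.

    f_beat_up / f_beat_down : beat frequencies (Hz) on the up and down ramps
    f_carrier               : carrier frequency (Hz), e.g. 77e9 (assumed)
    sweep_bw                : frequency excursion of one ramp (Hz)
    sweep_time              : duration of one ramp (s)
    """
    f_range = 0.5 * (f_beat_up + f_beat_down)    # range-only component
    f_doppler = 0.5 * (f_beat_down - f_beat_up)  # Doppler component
    rng = C * sweep_time * f_range / (2.0 * sweep_bw)
    speed = C * f_doppler / (2.0 * f_carrier)    # > 0 means a closing target
    return rng, speed

# Assumed 77 GHz sensor, 150 MHz sweep in 1 ms: this beat pair maps to ~65 m
# and a closing speed of ~2 m/s.
print(fmcw_range_and_speed(64.0e3, 66.0e3, 77e9, 150e6, 1e-3))
```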
`
Experimental trials were conducted on two prototypes, operating at 10 and 35 GHz with FM-CW, using typical targets
(man, car, metallic plate)16. Distance measurements with both sensors gave good results, except that
several values were in error by 10% for distances between 30 and 35 m. This result may be due to ground clutter effects.
A 35 GHz radar, operating over smooth ground covers such as asphalt, needed an additional 6 to 8 dB of transmitted power
to compensate for signal multipath fading17. Experimental results with a 50 GHz FM-CW radar7, employing a
narrow beam with low sidelobes, showed that guardrails and traffic control signs (flat objects) can only be detected from a
narrow angle. Metal poles (cylindrical objects) are nondirectional and are viewed from all angles. Target detections were picked
up from guardrails along a curved road, a concrete wall ahead at an intersection, and railroad tracks at the bottom of a
downgrade. Missed targets occurred at the beginning and end of an upward slope. No detections occurred from a guardrail
along a straight road, from approaching woods or plants at an intersection, or from road reflections on an upward slope. The tests18
also showed that the radar was sufficiently effective even in rainfall of up to 10 mm/h.
`
`Cooperative radar systems using passive transponders on the rear of vehicles (tagging) have been tested. These
systems possess some advantages, including no interference due to masking, blinding, or crosstalk; no false alarms due to
non-hazardous targets; and a similar radar cross section for all tagged vehicles16,19,20. However, one major drawback of such
systems is that damaged tags and non-equipped obstacles cannot be detected. Measurements of distance, relative speed,
and relative angle to a 17.5 GHz transponder20 mounted on the rear of a lead vehicle were accomplished at a maximum
range of 150 m. A low cost, Van Atta array, retrodirective transponder was suggested for cooperative collision avoidance
applications, which can be realized using GaAs mm-wave monolithic integrated circuits (MICs)21. The retrodirective
feature results in reduced interrogator transmitter power and reduced interference at the interrogator receiver due to the
transponder's narrow beam. This transponder does not require any RF source or receiver since it is a simple modulator that
transmits back an incident wave modulated by information. The shape of the incident phase front is replicated in the
outgoing wave, thus compensating for phase distortions due to atmospheric or multipath effects between the interrogator
and the transponder. If mounted at the front and rear of a vehicle, the transponder may minimize false alarms by using
different modulation codes to distinguish between an oncoming and a preceding vehicle. Moreover, information about the host
vehicle can be transmitted to any interrogating vehicle.
`
`Table 2: Characteristics and Features of Selected Automotive Radar Sensors
`
Characteristics                              | Distinct Features
35 GHz, Pulse, 2°H x 5°V, 10-200 m, 12.5 m   | Beam is electronically scanned at 2°H/ms, steered by stepping the frequency from 34.2 GHz @ -16° to 35.8 GHz @ +16° with an angular sweep of ±16°, forming a 16x16 array frame after a complete scan6.
50 GHz, FM-CW, 2°x2°, 100 m, 2 m             | Fixed and single Mill's cross transmit and receive antennas at right angles to each other and polarized to avoid interference with oncoming radars. Sidelobes are below 27 dB7.
94 GHz, FM-CW, 1.5°x1.5°, 128 m, 0.5 m       | Fixed antenna with a very well defined beam. Radar head is ≤10 cm long. Waveguide assembly is made of one plastic injection moulding and later metallized to meet the required electrical performance8,9.
60 GHz, Pulse, 1.5°x1.5°, 15-100 m, -        | Quasi-optical design is used for heterodyne detection where a Gunn diode acts as transmitter and local oscillator using modulated bias pulse shapes. Active antenna is integrated with varactor and mixer diode10.
61 GHz, PN code, 12°H, 20-150 m, 0.75 m      | 1 transmitting antenna (12°H) and 4 effective receivers (3°H each) create a 2D image. Wavefront reconstruction is used for angular discrimination. Binary phase modulation using PN codes improves range resolution. 128 range gates over 100 m are covered within 13 ms11.
78 GHz, PFM, 10°H, 150 m, -                  | Receiving antenna is scanned electronically by sequential switching using PIN diodes to produce 3 receiver beams (3°H each). mm-wave circuits are built using conventional microstrip techniques12.
24.125 GHz, FM-CW, 5°H, 107 m, -             | µ-wave electronics and signal conditioning circuit are integrated onto the back of a fixed, flat etched array antenna13.
77 GHz, PFM, 3 beams of 3°x3°, 1-150 m, 1 m  | Antenna is built with a 34-element microstrip patch array using a folded optic design. A microstrip PIN switch matrix selects 3 scanned beams. Radome is self-tested for any precipitation. Diameter = 145 mm. 3 beams are processed sequentially within 15 ms. Each beam is divided into 30 range gates14.
50 GHz, FM-CW, 30°H x 3°V, 5-100 m, 5-10 m   | Antenna is mechanically scanned over 30° based on diffraction, achieved by a rotating drum. The beamwidth varies from 1°H at the 0° position to 4°H at the outer positions15.

Characteristics: frequency, modulation, beamwidth, range, and range resolution
`
`2.2. Laser-Based Sensors
`
`Table 3 lists the characteristics and features of some selected automotive laser sensors (also known as optical radars).
These sensors operate in a pulse mode at a wavelength of 0.904 or 0.85 µm. Laser receivers employ honeycomb and IR
`filters to prevent false alarms triggered by direct sunlight. An automatic gain control in the receiver amplifier circuit
`decreases measurement errors caused by fluctuations in target reflectivity. In addition, a sensitivity time control circuit is
`used to keep unwanted reflections caused by fog and road surface in the near-field below the minimum sensitivity of the
`laser receiver.
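
The following sketch, which is illustrative only and not the circuit of any sensor in Table 3, shows the basic pulse time-of-flight range calculation together with a crude sensitivity-time-control rule of the kind described above; the 15 m near-field boundary and the two amplitude thresholds are assumptions.

```python
# Pulse time-of-flight ranging with a simple STC-like acceptance rule that
# demands a stronger echo in the near field, where fog and road-surface
# backscatter would otherwise trigger false returns.
C = 3.0e8  # speed of light, m/s

def tof_range(round_trip_s):
    """Distance in metres from the pulse round-trip time."""
    return 0.5 * C * round_trip_s

def accept_echo(round_trip_s, amplitude, near_field_m=15.0,
                near_threshold=0.6, far_threshold=0.1):
    """Return (range_m, accepted) after applying the range-dependent threshold."""
    rng = tof_range(round_trip_s)
    threshold = near_threshold if rng < near_field_m else far_threshold
    return rng, amplitude >= threshold

# A faint echo at 7.5 m (50 ns round trip) is rejected as probable near-field
# clutter, while the same amplitude at 60 m is accepted.
print(accept_echo(50e-9, 0.2))   # (7.5, False)
print(accept_echo(400e-9, 0.2))  # (60.0, True)
```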
`
The detection performance of lasers is degraded by a dirty optical interface, heavy rain or thick fog, and car exhaust
emissions27. A recent experiment showed that the maximum detection range of a laser FLS was reduced by about 30% in
rainy weather compared to clear weather due to waterdrops on the lead vehicle's reflector surfaces, road splashes, puddles on the
`road, rainfall, and waterdrops on the optical surface. Other field tests showed that a narrow-beam laser FLS lost track of
`targets at ranges over 60 m due to vibration (pitch and roll) of the vehicle26. A recent evaluation29 of an intelligent cruise
`control using a bumper-mounted laser sensor indicated an occasional loss of target acquisition on curves; missed targets
`such as flat-bed or bare-frame trucks, dirty trucks, auto haulers, and a few minivans and compact passenger cars; and "late"
`detection or intermittent detection.
`
`Table 3: Characteristics and Features of Selected Automotive Laser Sensors
`
Characteristics                           | Distinct Features
8°H x 1°V (4 beams), 5-100 m, 1 m, 20 Hz  | Received signals are assessed using model-based investigation of vehicle states and knowledge-based recognition of obstacles to reduce alarms caused by roadside objects. Alarms around curves are suppressed by reducing the range of each of the 4 beams according to steering wheel angle.
3°H x 3°V, 80 m, -, 10-100 Hz             | 1 main beam with 80 m range and 2 secondary beams for up to 30 m. Size = 150W x 48H x 100D mm.
5.7°H x 3.4°V, 80 m, 0.1 m, 30 Hz         | Two IR diodes: short range with a 5.7°H x 3.4°V beam and long range with 1.5°H x 2.2°V. Two PIN photodiodes with a 5.7°H x 3.4°V beam. Range is increased by using an avalanche photodiode.
4.8°H, 100 m, -, -                        | 3-section photodiode: 1 central (2°H) up to 100 m and 2 outer ones (1.43°H) to either side of the central beam up to 40 m. A lens cover dust detector is provided to alert the driver before dirt covers the lens.
2 beams, 1°H & 2.8°H, 100 m, -, -         | 2 beams are generated by changing the gap between the IR diode and an aspherical plastic lens. Receiving amplifier has automatic gain control to absorb the range of received optical power26.

Characteristics: beamwidth, range, range resolution, and data rate
`
2.3. Vision-Based Sensors
`
`Vision-based FLSs were developed for vehicle following and obstacle detection and ranging applications using one and
`two CCD cameras, respectively. In intelligent cruise control where the driver sets the headway to a lead vehicle, one
`camera measures only the changes in the gap between the two vehicles. The lead vehicle is identified and marked using
`features such as tire spacing and external shape. Once the size of the target's image is stored for a set headway, the
`following vehicle maintains this headway based on the image size which is inversely proportional to the inter-vehicle gap.
`Measurements are achieved in 80 ms and are reliable for up to 50 m. To determine the absolute distance to a lead vehicle,
`two cameras are utilized to measure the distance based on differences between right and left camera images. A significant
`problem with this stereo vision system is image correspondence between the right and left cameras, which is alleviated by
`using a third camera30. An experimental 3-camera vision system has been tested which measures distances to objects ahead
`every 100 ms. This system consists of a custom digital processor and off-the-shelf CCD cameras to perform image
`acquisition, feature extraction, stereo matching, and post filtering. To further improve the cost/performance of such a
`system, two VLSI chips are being developed: an image acquisition chip with a wide intensity dynamic range to cover
`various weather and lighting conditions, and a hybrid analog/digital array processor for edge detection and stereo
`matching30.
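
The two ranging cues described above reduce to simple relations, sketched below for illustration; the focal length, baseline, template width, and set headway are invented numbers, not parameters of the cited systems.

```python
# Monocular cue: the lead vehicle's image width is inversely proportional to
# the inter-vehicle gap, so the stored template width scales the set headway.
def headway_from_image_width(set_headway_m, stored_width_px, current_width_px):
    return set_headway_m * stored_width_px / current_width_px

# Stereo cue: the classical disparity relation Z = f * b / d gives absolute range.
def range_from_disparity(focal_length_px, baseline_m, disparity_px):
    return focal_length_px * baseline_m / disparity_px

print(headway_from_image_width(30.0, 80.0, 100.0))  # gap has closed to 24 m
print(range_from_disparity(700.0, 0.3, 7.0))        # ~30 m to the matched feature
```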
`
Finally, a camera chip31 is being developed to provide a high dynamic range of 100 dB. This dynamic range is
necessary both for imaging high-contrast scenes with brightness changes of 100,000:1 from frame to frame and to avoid
`severe saturation caused by reflections of bright sources such as the sun. Further advantages of this camera chip include
`short access time, good ambient temperature range, and low power consumption.
`
3. PROXIMITY DETECTION

Radar sensors and ultrasonic transducers are discussed in this section for side- and rear-looking detection of near
`objects. Note that CCD cameras, incorporated in sideview mirrors, are currently under investigation as sensors for
`experimental lane change warning systems to recognize approaching vehicles by optical flow image processing.
`
`3.1. Radar Sensors
`
`Commercial proximity radar sensors operate in the 50 MHz band centered at 10.525 GHz, which is allocated by the
`Federal Communications Commission (FCC) for "field disturbance sensor use". FM-CW modulation enables the radar to
`determine both range and relative velocity of obstacles and to discriminate multiple stationary or moving objects by
`employing bandpass filters to divide the range into sectors. Three position range gates are typically provided for backing
applications32 (e.g., 1.5, 3 and 6 m). The radar doppler effect is also used to discriminate between approaching and receding
obstacles. Data rates are typically around 30 Hz. Microstrip patch-array antennas, integrated with GaAs µ-wave MICs
(MMICs), are currently fabricated to miniaturize the size and reduce the cost of radar sensors. One microstrip patch-array
antenna measures 19 cm²(A) x 0.6 cm(T) and can easily fit inside a sideview mirror or a typical car taillamp33. A lane
change warning prototype uses a radar sensor inside the sideview mirror to transmit a beam about 10 m behind the car,
which is just wide enough to span the adjacent off-side lane and overlap with the mirror's narrow sight angle. A dual-
frequency doppler radar is employed in a commercial system designed to warn a school bus driver when children are
moving in the vicinity of the bus during loading and unloading. Within the radar sensor, a single GaAs MMIC chip
interfaces to planar receive and transmit antennas. The sensor is designed to operate over the temperature range from -40°
to 85°C and to function with the radome covered by either 1 cm of snow or ice, or 6 mm of mud.
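
As a hedged sketch of the two discriminations described above, the snippet below converts a Doppler shift at the 10.525 GHz carrier into a signed relative speed (approaching versus receding) and bins a measured range into the three backing-aid gates quoted in the text; the sample numbers are invented.

```python
# Doppler sign separates approaching from receding objects; range gating
# assigns a detection to one of the 1.5 / 3 / 6 m sectors.
C = 3.0e8
F_CARRIER = 10.525e9  # Hz, the field-disturbance band cited above

def relative_speed(doppler_hz):
    """Positive result = closing (approaching) object."""
    return C * doppler_hz / (2.0 * F_CARRIER)

def range_gate(range_m, gates=(1.5, 3.0, 6.0)):
    """Return the first gate boundary the object falls inside, or None."""
    for g in gates:
        if range_m <= g:
            return g
    return None

print(relative_speed(140.0))   # ~+2 m/s, approaching
print(relative_speed(-140.0))  # ~-2 m/s, receding
print(range_gate(2.2))         # falls in the 3 m gate
```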
`
An ultra-wideband, spread-spectrum radar technology was recently announced for proximity detection applications36,
which can detect objects up to 60 m away. The radar transmits a very short electrical pulse, with an
adjustable width from 50 ns down to 50 ps, at a noise-dithered repetition rate. The dithering randomizes the time of emission,
creating a spread spectrum that looks like random noise to other detectors. The radar noise is coded, thus minimizing
interference among similar co-located sensors. The radar receiver samples echoes at a 1 MHz rate. No carrier frequency is
used; thus, this sensor does not require FCC approval and promises a low-cost hardware implementation.
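
A minimal sketch of the noise-dithered timing idea, assuming a nominal 1 MHz pulse rate (matching the quoted receiver sampling rate) and an arbitrarily chosen dither span; it is not the announced product's implementation.

```python
# Each pulse is offset from the nominal period by a random amount, spreading
# the emission spectrum so that co-located sensors rarely collide.
import random

def dithered_emission_times(n_pulses, nominal_period_s=1e-6, dither_span_s=0.2e-6):
    """Return emission times with random (noise) dither applied to each period."""
    times, t = [], 0.0
    for _ in range(n_pulses):
        t += nominal_period_s + random.uniform(-dither_span_s, dither_span_s)
        times.append(t)
    return times

print(dithered_emission_times(5))
```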
`
`3.2. Ultrasonic Transducers
`
`Ultrasonic transducers (30-200 KHz) operate in a pulse transmission mode to detect an obstacle and determine its
`distance. Commercial transducers for collision avoidance applications, either electrostatic or made from a ceramic
piezoelectric element, operate at around 50 KHz37,38. The accuracy of distance measurements depends primarily on the
precision of the sound velocity estimation, which is sensitive to local air properties. Moreover, the accuracy is affected by
the obstacle's angle of incidence and offset relative to the beam centerline, and by precipitation and high frequency external
noise39. The range of ultrasonic sensors is limited to 0.3-10 m due to high path-loss attenuation but may be extended to 40
m using cooperative transponders. Range measurements are typically updated every 18 ms. The use of multiple
transducers provides a means for controlling the antenna radiation patterns and verifying system operation40.
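
The dependence on the local speed of sound noted above can be made concrete with the standard linear approximation c ≈ 331.3 + 0.606·T m/s (T in °C); the sketch below is illustrative and not taken from the cited transducers.

```python
# Ultrasonic range from the echo round-trip time, with a temperature-corrected
# speed of sound.
def sound_speed(temp_c):
    return 331.3 + 0.606 * temp_c  # m/s, textbook linear approximation

def ultrasonic_range(round_trip_s, temp_c):
    return 0.5 * sound_speed(temp_c) * round_trip_s

# The same 20 ms echo reads ~3.43 m at 20 C but ~3.31 m at 0 C, which is why
# the velocity estimate dominates the measurement accuracy.
print(ultrasonic_range(0.020, 20.0))
print(ultrasonic_range(0.020, 0.0))
```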
`
`4. LANE POSITION MONITOR
`
`Laser scanning of lane markers and vision-based sensors are addressed, which do not require infrastructure support,
`such as special lane markers. Other sensing schemes have been investigated for automated highway system applications
`which require coded magnet arrays, passive wire loops, and special stripes to be placed in the center of the lane.
`
`4.1. Laser-Based Lane Sensors
`
`One example is a laser radar that detects a roadway reference system provided by retro-reflective lane definition
`markers41. This system operates first by illuminating a section of the roadway ahead, then detecting the light reflected back
`by the lane definition markers within the illuminated area. A laser diode transmits 32 pulses 15 ns wide at a rate of 125
KHz. A 7.5°V x 10°H beam diverges on the roadway to illuminate up to 3 consecutive lane markers. The light returning
`from a reflective marker is detected by both a photodiode and a 128-element linear CCD array. The photodiode signal is
`used to determine the distance to the marker at a resolution of < 30 cm and the CCD array provides the angle between the
`lane marker and the vehicle heading at a resolution of 0.1°. Distance and angle data are updated every 20 ms and markers
`
`are detected from 8 m to 25 m. This sensor was tested on a track using lane definition markers placed 8 m apart. The
`distance and angular resolution and data sampling rate of the laser radar were sufficient for lateral control of the test
vehicle at speeds between 40 and 60 Km/h. All tests were conducted under ideal weather and lighting conditions along an
`asphalt test track41.
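
For illustration only (this is not the cited system's algorithm), the marker range and bearing reported by the photodiode and CCD array can be converted into an approximate lateral offset from the line of markers, assuming the markers lie along a straight reference line:

```python
import math

def lateral_offset(marker_range_m, bearing_deg):
    """Approximate lateral distance to the marker line.

    bearing_deg is the angle between the vehicle heading and the marker
    (positive to the right); the perpendicular component of the range
    approximates the offset for a straight marker line.
    """
    return marker_range_m * math.sin(math.radians(bearing_deg))

# A marker seen 15 m ahead at 6 degrees to the right implies ~1.57 m of offset.
print(lateral_offset(15.0, 6.0))
```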
`
Another example is a laboratory prototype of a side-looking laser which measures the distance from either side of the
vehicle to a reflective lane marker42. The light beam is mechanically scanned to irradiate the pavement up to 1.5 m in a
`normal direction to the side of the vehicle.
`
`4.2. Vision-Based Lane Sensors
`
`Vision-based systems consist of a monochrome CCD video camera, mounted near the rearview mirror, that captures
the driver's view of the road ahead. Two functions are performed: detection of lane boundaries by image processing, and
estimation of road geometry and vehicle position in the lane. The first function is conducted by
either template matching or Hough transform image processing methods, while the latter employs recursive
estimation techniques such as Kalman filtering.
`
`4.2.1. Template Matching: This method is a region segmentation technique used for feature extraction in image processing.
`It extracts regions where the intensity of light or color is the same by means of correlation. It is adopted in systems to
`determine lane-marker positions. The lane markings can be either solid or dashed on both sides of the road. Only one
`marking is sufficient for detection. If markings are absent, the border between road pavement and curbstone or grass is
`recognized. In order to reduce the computational load, image processing is performed in certain areas of the image that
`correspond to the road boundaries. In one system, 8 areas of the image are processed to extract the lane markings43.
`
A least-squares line fit may be used to generate the lane boundaries from detected marker positions. The estimated
lane boundaries are then checked for consistency of angles and width with the previously estimated boundaries. The template
matching algorithm detects yellow lane markers, even though their gray-level intensity is only 20% higher than that of the
background. More than 3000 frames of highway scenes have been successfully processed in real time, including curved
roads, left and right lane changes, exit and entrance ramps, underpasses, merging lanes, obstacles, and highway-to-
highway interchanges44. A computer architecture has been developed and successfully operated on a vehicle to implement
lane sensing in real time. This system acquires images at a rate of 30 Hz and has a 0.2 s processing time with a 0.23 s
delay45.
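
A minimal least-squares fit of the kind mentioned above is sketched below: detected marker positions (image row, lateral column) are fit to a line whose slope and intercept can then be checked against the previous frame. The sample points are made up; they do not come from the cited systems.

```python
# Fit x = a*y + b to marker positions using the normal equations.
def fit_lane_boundary(points):
    """points: list of (y_row, x_col). Returns (slope_a, intercept_b)."""
    n = float(len(points))
    sy = sum(y for y, _ in points)
    sx = sum(x for _, x in points)
    syy = sum(y * y for y, _ in points)
    syx = sum(y * x for y, x in points)
    a = (n * syx - sy * sx) / (n * syy - sy * sy)
    b = (sx - a * sy) / n
    return a, b

markers = [(200, 310), (240, 330), (280, 352), (320, 371)]
print(fit_lane_boundary(markers))  # about 0.51 px of lateral shift per image row
```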
`
A Kalman filter, using linear state-space models of both the vehicle dynamics and the road, plus camera and vehicle
geometry, is employed to estimate the lane curvature, the vehicle offset with respect to the center of the lane, and the vehicle
heading45,46. Information obtained from measurements of vehicle speed and steering angle is also used as input to the
Kalman filter. A real-time vehicle guidance system for autonomous driving performs lane keeping at a cycle time of 80
ms, using template matching and a Kalman filter43,46. The system has been tested in real traffic situations, driving
autonomously on highways at speeds up to 85 Km/h. A wide angle lens with a focal length of 8 to 10 mm has been used
to track the road from 4 to 20 m. For driving at high speeds, a focal length of 25 mm is used to provide a look-ahead
distance of 50 m. Camera stabilization is required when using a large focal length due to the small angle under which lane
markings are observed. The lane keeping function is degraded by missing lane markers, a wet road, or direct sunlight
into the camera lens.
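
The following is a deliberately small Kalman-filter sketch in the spirit of the estimator described above, not a reconstruction of it: the state holds only the lateral offset and heading error, the camera supplies a noisy offset measurement each cycle, and the vehicle speed enters the prediction step. The 80 ms cycle follows the text; the 25 m/s speed and all noise covariances are assumptions.

```python
import numpy as np

dt, speed = 0.08, 25.0                      # 80 ms cycle, assumed 25 m/s speed
F = np.array([[1.0, speed * dt],            # offset grows with heading error
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])                  # camera measures offset only
Q = np.diag([0.01, 0.001])                  # process noise (assumed)
R = np.array([[0.05]])                      # measurement noise (assumed)

x = np.zeros((2, 1))                        # state: [offset m, heading rad]
P = np.eye(2)

def step(z_offset):
    """One predict/update cycle with a camera offset measurement (metres)."""
    global x, P
    x = F @ x                               # predict
    P = F @ P @ F.T + Q
    y = np.array([[z_offset]]) - H @ x      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y                           # update
    P = (np.eye(2) - K @ H) @ P
    return x.ravel()

for z in [0.30, 0.34, 0.40, 0.47]:          # vehicle drifting to one side
    print(step(z))                          # estimated offset and heading error
```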
`
4.2.2. Hough Transform: This technique recognizes simple geometric shapes and is capable of extracting lane marker
information such as slope and position. It is used in one system to fit the extracted numerical white line data to a
straight line for the lane boundary47. The system recognizes two white lines that form the two sides of a triangle
representing the lane. The lower side of the image frame forms the base of the triangle, which is used to locate the vehicle
relative to the lane. The angle of view of the CCD video camera is set to place the vanishing point at 4/5 of the height of
the image frame to minimize the background image in the frame. This system was tested by driving on a highway under
various weather conditions at different times of day. The recognition was judged to be correct when the difference between
the actual and the detected white line was less than 5 pixels47. The accuracy was 97% when the road was dry
`in daytime clear/cloudy conditions, 76% in daytime rainy conditions, 26% in twilight dry conditions, 98% in night
`clear/cloudy conditions, and 12% in night rainy conditions. The reduction of accuracy occurred in conditions where the
`light was reflected directly into the video camera. The recognition of vehicle location in the lane failed for several frames
`when the car changed lanes. Currently, one set of image data is processed within 100 ms.
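
A compact, illustrative Hough-transform sketch of the line extraction described above is given below: edge pixels vote in (theta, rho) space and the strongest cell yields the slope and position of a white-line candidate. The toy edge points and the quantization steps are assumptions, not parameters of the tested system.

```python
import math
from collections import Counter

def hough_line(edge_points, theta_steps=180, rho_step=2.0):
    """Return (theta_deg, rho, votes) of the strongest straight-line candidate."""
    votes = Counter()
    for x, y in edge_points:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(t, round(rho / rho_step))] += 1
    (t_best, rho_bin), count = votes.most_common(1)[0]
    return math.degrees(math.pi * t_best / theta_steps), rho_bin * rho_step, count

# Points lying on a lane-boundary-like line x = y + 100 in image coordinates.
edges = [(100 + y, y) for y in range(0, 200, 10)]
print(hough_line(edges))   # ~(135 deg, -70, 20): all 20 points agree on one line
```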
`
5. IN-VEHICLE SIGNING
`
`Experimental systems are being developed to convey information to drivers via in-vehicle displays about upcoming
`road signs, traffic light status, and variable speed limits. Such information is received by autonomous, vehicle-based
`sensors or cooperative infrastructure-vehicle communications.
`
`5.1. Vehicle-Based Video Sensors
`
Some systems use video-based recognition techniques to detect, recognize, and transmit to the driver sign information
based on definable patterns of standard traffic signs48,49. Four basic functions are performed: image acquisition,
characteristic extraction, road sign detection, and content recognition. The first three functions were implemented by
application-specific hardware within 17 ms. The content recognition was processed by software within 0.5 s using a 14
MHz 80286 processor. One system49 conducts color segmentation of the input images, which is advantageous for
characteristic extraction and reduces the quantity of information to be processed. Another advantage of image color processing
is that it eliminates the effects of brightness and shadow encountered in the driving environment due to weather, sun angle,
and other conditions. At present, the recognition and interpretation of a traffic sign using the color and form of signs takes
around 2 s, which is too slow for practical purposes50.
`
`5.2. Cooperative Infrastructure-Vehicle Communications
`
5.2.1. Roadside/Vehicle IR Communications: An Intelligent Sign system was developed to provide drivers with safety-
related information about intersections with stop signs and about curves51. A roadside IR beacon continuously transmits to on-
board vehicle receivers, in a direction perpendicular to the road, the distance to the stop sign or curve and the friction
coefficient of the road surface. In addition, the beacon supplies an advisory speed for curve entry and the type of curve (e.g., left
or right). The transmitter comprises 16 IR 0.930 µm diodes. The receiver consists of 3 wide-band diodes with a high gain
amplifier. The message contains 2 bytes and is transmitted at 8 Kbit/s. The range of the roadside-to-vehicle
`communication system is between 1 and 10 m.
`
An experimental roadside-to-vehicle communication system52 was tested which provides a transmission bit rate of over
600 Kb/s. The transmitter consists of five 0.85 µm diodes located overhead at a height of 6 m and at an angle of
1.7° with respect to the vertical axis. Directive lenses are installed above the road to produce a rectangular communication zone of
250 cm x 17 cm on each lane surface, thus establishing vertical communications. Such a configuration offers a decrease in
the shadow effect, less sensitivity to atmospheric disturbances and lane-to-lane interference, and the possibility of varying
the beamwidth. The vehicle receiver is mounted behind the windshield at a height between 0.5 and 2 m depending on the
vehicle type. The beamwidth at the vehicle receiver level is about 15 cm. A short message of 62 bits or a long message
of 1019 bits may be transmitted, which contain 51 and 976 information bits, respectively. Experimental tests at a constant
vehicle speed of 100 MPH have shown correct message reception rates of about 95% and 86% for short and long messages,
respectively. Errors occur in received messages during entry into and exit from the communication zone due to low Signal-to-Noise
Ratio (SNR). This communication system was also tested under various weather conditions such as maximum sunlight,
night, rain, fog, frost, dirt, steam, and a tinted windshield. The influence of the sun is reduced by the shadow created by the gantry.
Dust has a large negative influence. The influence of rain is reduced when it falls in the same vertical plane as the signal
beam.
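
A back-of-the-envelope check, using only the figures quoted above and ignoring the receiver-height beamwidth, shows why the 1019-bit long message fits comfortably within one pass through the zone; the calculation is illustrative arithmetic, not a reported result.

```python
# Time spent in the ~2.5 m communication zone at the quoted speed, and the
# number of bits a 600 Kb/s link can deliver in that time.
zone_length_m = 2.5          # 250 cm zone on the lane surface
speed_mps = 100 * 0.44704    # "100 MPH" as quoted, converted to m/s (~44.7)
bit_rate_bps = 600e3

time_in_zone = zone_length_m / speed_mps
bits_available = bit_rate_bps * time_in_zone
print(time_in_zone, bits_available)   # ~0.056 s and ~33,500 bits >> 1019 bits
```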
`
5.2.2. Roadside Transponders: A 17.5 GHz roadside transponder system is under development20 to transmit back to vehicles
programmed static data or dynamic data fed from an external source, such as the status of a traffic light. An on-board vehicle
transceiver transmits a signal to roadside transponders. The transponder amplifies the magnitude of the received signal and
modulates a unique message onto the carrier wave before reflecting it. The vehicle transceiver decodes this incoming
signal. An FM-CW technique with FFT processing is employed to determine the distance to the transponder. A measurement
range of 40 m for roadside transponders has been achieved with a standard deviation error of 1 m.
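
A hedged sketch of the FM-CW/FFT ranging step is shown below: the beat signal between the transmitted sweep and the transponder return is Fourier transformed and the peak bin maps to distance. The sweep bandwidth, ramp time, and sampling rate are assumptions chosen only to make the example self-consistent; they are not the developmental system's parameters.

```python
import numpy as np

C = 3.0e8
SWEEP_BW = 150e6       # Hz, assumed frequency excursion
SWEEP_TIME = 1e-3      # s, assumed ramp duration
FS = 1e6               # Hz, assumed beat-signal sampling rate

def range_from_beat(beat_signal):
    """Distance implied by the strongest beat-frequency bin."""
    spectrum = np.abs(np.fft.rfft(beat_signal))
    freqs = np.fft.rfftfreq(len(beat_signal), d=1.0 / FS)
    f_beat = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    return C * SWEEP_TIME * f_beat / (2.0 * SWEEP_BW)

# Synthesize the beat tone a transponder at 40 m would produce (40 kHz) and
# recover the range from its spectrum.
t = np.arange(int(FS * SWEEP_TIME)) / FS
beat = np.cos(2 * np.pi * 40e3 * t)
print(range_from_beat(beat))   # ~40 m
```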
`
`6. GAP ACCEPTANCE AID
`
A video image processing system is designed to aid drivers in crossing an intersection after stopping at a stop sign, by
detecting approaching vehicles53. Optical flow, computed by the template matching method, is used to detect the movement
of objects in the image. This method enables a single processor to perform multiple types of object
recognition. A prototype of a compact high-speed processor board was built using a special IC that calculates the correlation
between images, an operation that would otherwise impose a high computational load. In a simulation test, images of a vehicle
approaching an intersection at 20 Km/h were taken by a CCD camera with a 7.5 mm focal length and a 45° field of view53.
The results showed that approaching vehicles were difficult to detect at ranges over 25 m. Optical flow is calculated at a time
interval of 0.3 s.
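
A toy version of the template-matching optical flow described above is sketched below: a small block from the previous frame is correlated (by sum of absolute differences) over a search window in the current frame, and the best-match displacement is the flow vector. The frames are synthetic nested lists; a real system would use the dedicated correlation IC mentioned in the text.

```python
def block_match(prev, curr, top, left, size=8, search=4):
    """Return the (dy, dx) displacement of the size x size block at (top, left)."""
    template = [row[left:left + size] for row in prev[top:top + size]]
    best, best_dyx = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            sad = 0
            for r in range(size):
                crow = curr[top + dy + r][left + dx:left + dx + size]
                sad += sum(abs(a - b) for a, b in zip(template[r], crow))
            if best is None or sad < best:
                best, best_dyx = sad, (dy, dx)
    return best_dyx

# Synthetic 32x32 frame pair in which a bright block shifts by (2, 3) pixels.
prev = [[0] * 32 for _ in range(32)]
curr = [[0] * 32 for _ in range(32)]
for r in range(12, 20):
    for c in range(12, 20):
        prev[r][c] = 255
        curr[r + 2][c + 3] = 255
print(block_match(prev, curr, 12, 12))   # (2, 3)
```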
`
`7. VISION ENHANCEMENT
`
`Vehicle-based vision enhancement systems (VESs) are based on three sensor types: active mm-wave radar, passive far-
`infrared (FIR) detectors, and CCD arrays. VESs may aid the driver in reduced visibility conditions to obse