WIDE DYNAMIC RANGE VISION SENSOR FOR VEHICLES

Keiichi YAMADA, Tomoaki NAKANO, Shin YAMAMOTO
Toyota Central Res. & Develop. Labs., Inc.
Nagakute, Aichi, 480-11 Japan

Eisaku AKUTSU and Keiji AOKI
Toyota Motor Corp.
1200 Mishuku, Susono, Shizuoka, 410-11 Japan

Abstract: The dynamic range of brightness in road scenes is very wide, because the lighting condition varies dynamically with weather and road conditions. The dynamic range of conventional TV cameras is therefore insufficient to capture images of road scenes. We have developed a method for expanding the dynamic range of TV cameras and, based on the method, an experimental wide dynamic range vision sensor system applicable to vision systems for vehicles. The effectiveness of the sensor in comparison with conventional TV cameras was confirmed by experiments on highways under various lighting conditions.

I. INTRODUCTION

Computer vision greatly contributes to preventive safety and/or driver assistance systems for vehicles, because it is effective in recognizing lane marks, cars, and obstacles on the lane [1, 2, 3]. The dynamic range of brightness in road scenes is very wide, because the lighting condition varies dynamically with weather and road conditions. One of the problems in realizing vision systems for vehicles is that the image taken with a TV camera is sometimes either hidden under noise in its dark parts or saturated in its bright parts. This is because the dynamic range of conventional TV cameras is insufficient to capture images of these road scenes [4]. Accordingly, a TV camera with a wide dynamic range is necessary to realize such a computer vision system for vehicles.

One approach to expanding the dynamic range of TV cameras is to improve the imaging devices [5]. Although many studies on imaging devices have been carried out, no device that satisfies the requirements for vehicle application, namely a wide dynamic range, sufficient resolution and sensitivity, and reasonable cost, has been reported. Another approach is to compose an image with a dynamic range wider than that of the camera from images taken under different exposure conditions. Some studies on wide dynamic range still images of objects at rest have been reported [6, 7]. However, a practical TV camera that can capture moving objects has not yet been realized.

We have developed a method for expanding the dynamic range of TV cameras that can capture moving objects. Based on this method, we have developed an experimental wide dynamic range vision sensor system applicable to vision systems for vehicles. The effectiveness of the sensor in comparison with conventional TV cameras was confirmed by experiments on highways under various lighting conditions.

II. REQUIRED DYNAMIC RANGE FOR A VEHICLE CAMERA

Fig. 1 shows the brightness of objects present in daytime road scenes under various lighting conditions. The figure indicates that a dynamic range of up to 10^4 is required for a camera to capture images of road scenes without loss of information. This means that the dynamic range of conventional TV cameras, approximately 500, is insufficient to capture these scenes without such loss.

Fig. 1. Brightness of objects in road scenes (asphalt road surface, lane marks, gray car bodies, and white car bodies) under various lighting conditions: in the twilight (10 lux), in a tunnel (10^2 lux), in the shade at noon (10^4 lux), and in the sun at noon (10^5 lux). The figure also shows the dynamic range required for a camera to capture these scenes and that of a conventional TV camera.
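
For scale, a short calculation of my own (not from the paper), using only the two figures quoted above:

    10^4 / 500 = 20
    log2(10^4) ≈ 13.3 stops, log2(500) ≈ 9.0 stops

That is, roughly a factor of 20, or a bit over four photographic stops, separates the required range from what a conventional TV camera provides.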

III. METHOD FOR EXPANDING DYNAMIC RANGE

An image with a dynamic range wider than that taken by a TV camera can be obtained by combining images of a scene taken under different exposure conditions. To establish the method of expanding the dynamic range of a TV camera, assume the following model of the relation between the input light intensity S and the output signal level L of a TV camera: the output signal level L is proportional to the light intensity S until L saturates at the saturation level L_sat,

    L = E S          (E S < L_sat)
    L = L_sat        (E S >= L_sat)                         (1)

where E is a coefficient determined by the exposure condition. Suppose a set of exposure conditions E_0, E_1, ..., E_{n-1}, where each condition is represented by

    E_i = A^i E_0    (i = 0, 1, ..., n-1).                  (2)

In equation (2), A, which is greater than 1, is the increase rate of the exposure amount.

Fig. 2 shows the output signal level L versus the light intensity S for each exposure condition E_0, E_1, ..., E_{n-1}. The light intensity in each of the following ranges is detected using the image taken with the corresponding exposure condition E_i:

    L_sat / (A E_i)  <= S < L_sat / E_i        (i = 0, 1, ..., n-2)
    L_noi / E_{n-1}  <= S < L_sat / E_{n-1}    (i = n-1)    (3)

where L_noi represents the noise level of the camera. By detecting the light intensity in each range from the corresponding image, as is also shown in Fig. 2, the light intensity range between L_noi / (E_0 A^{n-1}) and L_sat / E_0 can be covered by the n images taken with exposure conditions E_0 to E_{n-1}.

Since a conventional TV camera takes images under a single exposure condition, the light intensity range obtained by the conventional camera corresponds to that of exposure condition E_0. Let D_0 be the dynamic range of the conventional camera. Then the dynamic range covered by exposure conditions E_0 to E_{n-1} is D_0 A^{n-1}, which is A^{n-1} times as wide as that of the conventional camera.
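
The relations above are simple enough to check numerically. The following is a minimal sketch of my own, not code from the paper; apart from D_0 = 500 (the conventional-camera figure quoted in Section II) and the rough match of A = 30, n = 2 to the exposure times used later in the experiments, all parameter values are arbitrary assumptions.

    import numpy as np

    def camera_output(S, E, L_sat=1.0):
        """Equation (1): output level L = E*S, clipped at the saturation level L_sat."""
        return np.minimum(E * np.asarray(S, dtype=np.float64), L_sat)

    def exposure_set(E0, A, n):
        """Equation (2): exposure coefficients E_i = A**i * E0 for i = 0 .. n-1."""
        return [E0 * A**i for i in range(n)]

    # Illustrative parameters (assumed, except D_0 = 500 from Section II).
    L_sat, L_noi = 1.0, 1.0 / 500        # so D_0 = L_sat / L_noi = 500
    E0, A, n = 1.0, 30.0, 2              # A = 30 roughly matches 1/87 s vs. 1/2620 s
    D0 = L_sat / L_noi

    print("expanded dynamic range D_0 * A**(n-1):", D0 * A**(n - 1))   # 15000, about 10^4

    # Each exposure E_i sees intensities between its noise floor and its saturation
    # point; together the n exposures cover L_noi/(E0*A**(n-1)) .. L_sat/E0.
    for i, E in enumerate(exposure_set(E0, A, n)):
        print(f"E_{i} = {E:g}: usable S in [{L_noi / E:g}, {L_sat / E:g})")

    print(camera_output([0.0005, 0.5, 2.0], E=1.0))   # dark, mid-range, saturated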

IV. WIDE DYNAMIC RANGE VISION SENSOR

An experimental vision sensor system whose dynamic range is expanded by the method described in the previous section has been developed. Fig. 3 shows the block diagram of the sensor. The sensor is composed of a CCD monochrome TV camera and a VME double-height electronic circuit board named the dynamic range expansion unit. The images under different exposure conditions are taken at short intervals by using the electronic shutter function of the CCD camera, which makes imaging of moving objects possible. The images taken with different exposure times are combined into one image by the dynamic range expansion unit simultaneously with the read-out from the CCD; thus, the wide dynamic range image is obtained in real time. The appearance of the developed sensor system is shown in Fig. 4.

Fig. 5 shows the timing diagram when the number of exposure conditions is two. The shutter time is varied every field so that an image under each exposure condition is taken every 1/60 second. The image signal is read out from the CCD camera in non-interlaced mode. As shown in the figure, when the number of exposure conditions is two, the two images for these exposure conditions can be taken in 1/60 second if the image for the shorter exposure time is taken first; thus, the wide dynamic range image is obtained every 1/30 second.

The dynamic range expansion unit works as follows. The image for the first exposure condition is stored in the image memory. Each image taken after that is combined with the previous image read out from the image memory, and the combined image is sent back to the image memory and stored there. By combining the image for the last exposure condition with the image from the image memory, a wide dynamic range image is obtained.

The combining algorithm is as follows and is processed pixel by pixel. If the pixel of the image for the longer exposure time is not saturated, this pixel value is used as the value of the combined image. Otherwise, the pixel value for the shorter exposure time is used, multiplied by the ratio of the two exposure times to normalize it to the scale of the longer-exposure pixel value. The algorithm is implemented as a look-up table for speedup.
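
As an illustration of this combining rule, the sketch below shows how two exposures could be merged and how the image-memory loop described above could be emulated in software. It is my own sketch, assuming 8-bit input pixels and a saturation threshold at full scale, not the authors' hardware implementation; the look-up table only mirrors the table-based speed-up mentioned in the text.

    import numpy as np

    def combine_pair(long_img, short_img, exposure_ratio, sat_level=255):
        """Per-pixel rule: keep the longer-exposure pixel where it is not saturated;
        otherwise use the shorter-exposure pixel scaled by the ratio of the two
        exposure times (ratio = t_long / t_short), i.e. normalized to the scale of
        the longer exposure."""
        long_f = long_img.astype(np.float32)
        short_f = short_img.astype(np.float32)
        return np.where(long_f >= sat_level, short_f * exposure_ratio, long_f)

    def make_short_lut(exposure_ratio):
        """The scaling of an 8-bit shorter-exposure value expressed as a 256-entry
        look-up table, in the spirit of the speed-up mentioned in the text."""
        return np.arange(256, dtype=np.float32) * exposure_ratio

    def combine_sequence(images, exposure_times, sat_level=255):
        """Emulate the image-memory loop: images and exposure_times are ordered from
        the shortest to the longest exposure; each new (longer) exposure is combined
        with the image accumulated so far."""
        combined = images[0].astype(np.float32)
        current_time = exposure_times[0]
        for img, t in zip(images[1:], exposure_times[1:]):
            combined = combine_pair(img, combined, t / current_time, sat_level)
            current_time = t   # the result is now on the scale of exposure time t
        return combined

For the two exposures used later in the experiments (1/2620 s and 1/87 s) the ratio is about 30, so combined values range from 0 up to roughly 255 x 30, which is consistent with the 2-byte-per-pixel image memory listed in Table 1.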

Fig. 2. Camera output level L vs. light intensity S for each exposure condition E_i (i = 0, 1, ..., n-1).

Fig. 3. Block diagram of the developed vision sensor system.

Fig. 4. Appearance of the developed vision sensor system.

Fig. 5. Timing diagram of the developed vision sensor when the number of exposure conditions is two (exposure terms of approximately 1/60 second each, followed by read-out, combining, and output).

To evaluate the dynamic range of the vision sensor, the gray level of the obtained wide dynamic range image was measured as a function of the input light intensity. Fig. 6 shows the result for exposure times of 1/87 and 1/2620 seconds. As can be seen from the figure, a dynamic range of 10^4 is obtained. This dynamic range is considered to be wide enough for a vehicle camera, judging from Fig. 1. The performance of the vision sensor is summarized in Table 1.
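
A quick arithmetic check (mine, not the paper's), using D_0 ≈ 500 for a conventional camera from Section II and the expression D_0 A^{n-1} from Section III with n = 2:

    A = (1/87) / (1/2620) = 2620 / 87 ≈ 30.1
    D_0 A^{n-1} ≈ 500 × 30.1 ≈ 1.5 × 10^4

which agrees with the measured dynamic range of about 10^4.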

V. EXPERIMENT

A. Experimental Method

To evaluate the effectiveness of the developed vision sensor, we performed the following experiments under various lighting conditions on the road.

A conventional TV camera and the camera head of the developed vision sensor were placed on the roof of an automobile at a height of 1.6 m and a declination angle of 5 degrees. The image from the conventional camera was digitized with a frame grabber into 256 gray levels and saved for the evaluation. The image from the developed vision sensor was obtained from the image memory of the dynamic range expansion unit and saved for the evaluation.

Driving on highways at a speed of 80 km/h, the images were taken with the conventional TV camera and the developed vision sensor simultaneously. The weather was fine. The intensity of illumination ranged from approximately 10^2 lux in a tunnel to 10^5 lux in the sun. The number of exposure conditions of the developed vision sensor was two, and the exposure times were set at 1/87 and 1/2620 seconds. A fixed-iris lens was used for the developed vision sensor; an auto-iris lens was used for the conventional camera to follow the change in the brightness of the scenes. The focal length of both lenses was 25 mm. The obtained images were evaluated in terms of lack of information by comparing the two images taken simultaneously with the conventional camera and the developed sensor.

Fig. 6. Performance of the developed vision sensor: pixel value of the combined wide dynamic range image vs. input light intensity (arbitrary units) when the exposure times are 1/87 and 1/2620 seconds.

Table 1. Performance of developed vision sensor

  Resolution                564 (H) x 242 (V) pixels
  Minimum illumination      5 lux, F1.4
  Exposure time             63.6 µs to 1/60 sec.
  Number of exposure times  2 to 5
  Image memory size         1024 (H) x 256 (V) x 2 bytes

† When the exposure times are 1/87 and 1/2620 seconds.

B. Experimental Results

Fig. 7(a) and 7(b) show the images taken with the conventional TV camera and with the developed sensor at the exit of a tunnel, respectively. A tunnel exit is one of the scenes with the widest dynamic range, except for the case in which the sun's rays come directly into the lens. As can be seen from the figures, the developed vision sensor obtains a clearer image of the scene than the conventional TV camera. On the other hand, some parts of the image obtained with the conventional camera are saturated and lack information due to its narrow dynamic range. Fig. 8(a) and 8(b) show another example, images of a scene where the sunlight was very bright and the shadows were very dark. The effectiveness of the developed sensor can be seen from these figures in the same manner as in Fig. 7. Although 29 % of the pixels in the image in Fig. 7(a) and 85 % of the pixels in Fig. 8(a) are saturated, the pixels in both images obtained with the developed sensor (Fig. 7(b) and 8(b)) are neither saturated nor at zero level.

Fig. 9(a) and 9(b) show the edge enhancement results of the images in Fig. 7(a) and 7(b) using a Sobel operator. All of the lane mark edges are clearly observed in the image from the developed sensor, while some parts of the edges are missing in the image from the conventional camera.

From the experimental results described above, the effectiveness of the method for expanding the dynamic range of a camera for the vision systems of vehicles has been confirmed.
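
The edge enhancement used for this evaluation is the standard 3x3 Sobel operator; the following is a minimal gradient-magnitude sketch of my own in NumPy, not the authors' implementation, and the exact normalization and thresholding they used are not stated in the paper.

    import numpy as np

    def sobel_magnitude(img):
        """Gradient magnitude of a 2-D grayscale array using the 3x3 Sobel kernels
        (edge-replicated border)."""
        img = img.astype(np.float32)
        kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
        ky = kx.T
        padded = np.pad(img, 1, mode="edge")
        h, w = img.shape
        gx = np.zeros_like(img)
        gy = np.zeros_like(img)
        for dy in range(3):
            for dx in range(3):
                window = padded[dy:dy + h, dx:dx + w]
                gx += kx[dy, dx] * window
                gy += ky[dy, dx] * window
        return np.hypot(gx, gy)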

Fig. 7. Images at a tunnel exit obtained with (a) a conventional TV camera and (b) the developed vision sensor.

Fig. 8. Images of a road scene where the sunlight is very bright and the shadows are very dark, obtained with (a) a conventional TV camera and (b) the developed vision sensor.

Fig. 9. Edge enhancement results of the images in Fig. 7 using a Sobel operator: (a) conventional TV camera, (b) developed vision sensor.

VI. CONCLUSION

One of the problems in realizing vision systems for vehicles is the insufficient dynamic range of conventional TV cameras. To solve this problem, we have developed a method for expanding the dynamic range of TV cameras and, based on the method, an experimental wide dynamic range vision sensor system applicable to vision systems for vehicles.

The dynamic range of a conventional TV camera, 500, was expanded to 10^4 with the developed image sensor system. The dynamic range of the developed sensor was found to be sufficient to capture images of road scenes under various lighting conditions. The effectiveness of the sensor in comparison with conventional TV cameras was confirmed by experiments on highways under various lighting conditions.

At present, the experimental vision sensor is not small enough for practical application, because most of the digital logic circuits in the dynamic range expansion unit are built from general-purpose logic ICs. However, its dimensions can be reduced for practical application, because the digital circuits can be integrated into a custom LSI. In addition, both the analog circuits for the video signal and the A/D converter, which are necessary for the digital image processing of the vision system, already exist in conventional frame grabbers. Consequently, only the logic LSI and a look-up table for combining images are the essential parts to be added to a conventional frame grabber to expand the dynamic range of a TV camera. Therefore, vision systems for vehicles could easily adopt this method for expanding the dynamic range of the TV camera and increase their robustness under various lighting conditions.

REFERENCES

[1] E. D. Dickmanns and V. Graefe, "Dynamic monocular machine vision," Machine Vision and Applications, vol. 1, pp. 223-240, 1988.
[2] A. Kutami, et al., "Visual navigation of autonomous on-road vehicle," IROS '90, pp. 175-180, 1990.
[3] T. Ozaki, et al., "Image processing system for autonomous vehicle," SPIE (Mobile Robots), 1989.
[4] C. Thorpe, M. H. Hebert, T. Kanade and S. A. Shafer, "Vision and navigation for the Carnegie-Mellon Navlab," IEEE Trans. PAMI, vol. 10, no. 3, pp. 362-373, 1988.
[5] S. G. Chamberlain and J. P. Y. Lee, "A novel wide dynamic range silicon photodetector and linear imaging array," IEEE Trans. ED, vol. 31, no. 2, pp. 175-182, 1984.
[6] R. M. Rangayyan and R. Gordon, "Expanding the dynamic range of x-ray videodensitometry using ordinary image digitizing devices," Appl. Opt., vol. 23, no. 18, pp. 3117-3120, 1984.
[7] K. Moriwaki, "Adaptive exposure image input system for obtaining high quality color information," IEICE, vol. J76-D-II, no. 9, pp. 1894-1901, 1993.