Vision Assistance in Scenes with Extreme Contrast

Ulrich Seger
Heinz-Gerd Graf
Institute for Microelectronics Stuttgart

Marc E. Landgraf
Intel Corporation

Applications of vision systems in traffic environments still suffer from the limited optical dynamic range of their sensors and from a lack of flexibility in readout mechanisms. We describe the performance and architecture of a High Dynamic Range Camera (HDRC) chip and the conceptual advantages of adapting it to image processing systems.
Several applications of image processing systems are under development within the European Prometheus project, a cooperative research program.¹ The task of these image processing systems is to deliver current, well-organized, and highly reliable data not only to the driver but also to driver assistance systems. The assistance systems help to keep a car in its lane, recognize obstacles, or enhance visibility under certain circumstances.

If image data are to be used in vehicle control or warning systems, they must support short response times. For example, steering processes require responses within a few milliseconds. Imaging of high-contrast scenes with brightness changes of 100,000:1 from frame to frame is necessary for uninterrupted processing without delays. However, this is not possible with changing apertures or varying shutter or integration times.²

Commonly available cameras with an optical dynamic range of about 5,000:1 (74 dB), and even high-performance devices known from the literature³,⁴ to reach 8,000:1 (78 dB), fall short of the minimum dynamic range of 100 dB desired in automotive applications. (This dynamic range is necessary to avoid severe saturation caused by reflections of bright sources such as the sun.) Some camera system approaches attain a higher dynamic range by controlling shutter, aperture, or signal integration time, but they may struggle with oscillations under rapidly changing conditions. (Imagine the effects created by the shadows on a tree-lined road.) These system approaches require extra exposure-control and image-postprocessing hardware as well as extra time for subsequent readout and image reconstruction.
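As a quick check of these figures (our own illustration, not from the article), optical dynamic range in decibels is 20·log10 of the intensity ratio, so 5,000:1 is about 74 dB, 8,000:1 about 78 dB, and the 100-dB target corresponds to a ratio of 100,000:1. A minimal sketch:

    #include <math.h>
    #include <stdio.h>

    /* Convert an optical intensity ratio to decibels: dB = 20 * log10(ratio). */
    static double ratio_to_db(double ratio)
    {
        return 20.0 * log10(ratio);
    }

    int main(void)
    {
        const double ratios[] = { 5000.0, 8000.0, 100000.0 };
        for (int i = 0; i < 3; i++)
            printf("%9.0f:1  ->  %5.1f dB\n", ratios[i], ratio_to_db(ratios[i]));
        /* Prints roughly 74.0, 78.1, and 100.0 dB, matching the figures above. */
        return 0;
    }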
Help may come from a combination of hardware-implemented logarithmic signal compression with RAM-like pixel access and the opportunity to integrate such circuits together with application-specific signal postprocessors in a standard CMOS process. This approach leads to higher system performance in applications in which high scene contrast is a problem.
Sensor architecture
During the development of the HDRC (High Dynamic Range Camera) chip, we placed special emphasis on a processor-friendly architecture. Systems engineers should be able to benefit from high optical performance as well as from an image sensor interface that is easy to adapt. Pixel processors implemented within the focal plane enlarge the application field toward imaging of extreme-contrast scenes, and a RAM-like digital interface supports random access to each pixel with a minimum access time of 150 ns. A nondestructive readout mechanism allows subsequent access to the same pixel at even higher frequencies. (Figure 1 shows the HDRC64 sensor architecture, the version with 64 x 64 pixels and our prototype.)
A maximum readout frequency of 6.6 MHz allows frame rates above 1,600 frames/second (using the full 64 x 64-pixel field); even higher frame rates are possible when accessing a smaller area of interest.
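The arithmetic behind these rates is simple: at 6.6 MHz one pixel is read about every 150 ns, so the full 4,096-pixel field takes roughly 0.62 ms, or about 1,611 frames/second, and a smaller window of interest scales the rate up accordingly. A minimal sketch of this calculation (the function name is ours):

    #include <stdio.h>

    /* Frame rate achievable when reading width x height pixels,
     * one pixel per readout clock, at the given readout frequency. */
    static double frame_rate_hz(double readout_hz, int width, int height)
    {
        return readout_hz / (double)(width * height);
    }

    int main(void)
    {
        const double f_readout = 6.6e6;               /* 6.6-MHz pixel readout */
        printf("64 x 64 full field: %.0f frames/s\n",
               frame_rate_hz(f_readout, 64, 64));     /* ~1,611 frames/s */
        printf("32 x 32 window:     %.0f frames/s\n",
               frame_rate_hz(f_readout, 32, 32));     /* ~6,445 frames/s */
        return 0;
    }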
The total data rate may be further increased with a multifield architecture (see Figure 2), which supports multiple parallel outputs and therefore may serve as an input device for processor arrays.

To participate in the further scaling of technology and in design enhancements of digital macrocells, we used a standard CMOS technology as the target technology. Table 1 lists the specifications of the HDRC64.
Local pixel processor
This processor, which is placed around each pixel (see Figure 3) within the focal plane, performs a logarithmic signal compression directly at the place of signal generation.⁵ This arrangement prevents an information loss that might occur should any of the signal transport or processing circuits become saturated.

A logarithmic compression technique known from most biological systems shows some advantages concerning the dynamic range of input signals that may be processed. The HDRC chip achieves logarithmic compression by controlled draining of the photocurrent that normally would contribute to an output voltage proportional to the irradiated power. Chamberlain first used this technique in the early 1980s.⁶ A development toward higher robustness and compatibility with today's CMOS technologies resulted in a different conversion principle of the pixel processor, but it still converts an input signal to its logarithm at the pixel output. Also, the local pixel processor simplifies the implementation of area arrays by supporting full addressing capabilities to each pixel.
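To illustrate why logarithmic compression at the pixel helps, the sketch below contrasts an idealized linear integrating pixel, which clips once the photocurrent exceeds its full-well limit, with an idealized logarithmic response of the form V = V0 + k·log10(I/I0). The constants and the simple models are our illustrative assumptions, not the HDRC's measured characteristic:

    #include <math.h>
    #include <stdio.h>

    /* Idealized linear pixel: output grows with photocurrent until it clips. */
    static double linear_pixel(double i_photo, double i_sat)
    {
        double v = i_photo / i_sat;          /* normalized output, 0..1 */
        return v > 1.0 ? 1.0 : v;            /* saturation */
    }

    /* Idealized logarithmic pixel: V = V0 + k * log10(I / I0).
     * V0, k, and I0 are illustrative constants, not HDRC parameters. */
    static double log_pixel(double i_photo)
    {
        const double v0 = 0.5, k = 0.1, i0 = 1e-12;  /* volts, volts/decade, amps */
        return v0 + k * log10(i_photo / i0);
    }

    int main(void)
    {
        /* Sweep the photocurrent over six orders of magnitude (1 pA .. 1 uA). */
        for (double i = 1e-12; i <= 1e-6 * 1.001; i *= 10.0)
            printf("I = %8.0e A   linear = %4.2f   log = %4.2f V\n",
                   i, linear_pixel(i, 1e-9), log_pixel(i));
        /* The linear pixel saturates after three decades (I_sat = 1 nA here);
           the logarithmic pixel spreads all six decades over 0.5..1.1 V. */
        return 0;
    }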
Figure 1. HDRC64 sensor architecture.
Figure 2. Multifield architecture (subfields with separate row and column address decoders and parallel analog outputs).
Table 1. HDRC64 specifications.

Parameter                                        Typical      Unit
Power supply +                                   5            V
Power supply -                                   0            V
Quiescent current, total chip                    12           mA
Operating current at 1-MHz readout frequency     19           mA
Pixel count                                      64 x 64      -
Total photosensitive area                        3.84         mm²
Fill factor*                                     > 40         %
Optical input signal dynamic range               1:100,000    -
Resolvable contrast                              10           %
Repetitive pixel readout frequency**             -            MHz

*In active area   **Depends on incident power
Figure 3. Sensor geometry in the focal plane.
Figure 4 shows a 2 x 3-pixel subfield. (Horizontal lines select digital rows, and vertical lines read analog data.) Figure 5 shows the transfer function of a CCD (charge-coupled device) camera compared with that of an HDRC. Note that the input dynamic range that can be processed without saturation is much larger if the output signal follows a logarithmic function of the input. In Figure 5, the input signal can change its value over six orders of magnitude without saturating the HDRC device output. (That corresponds to a thermometer with a scale from 1°C to 1,000,000°C.)

The modulation of quantities like irradiated power in the space and time domains and the resolution of ratios of quantities between different pixels are even more important for image processing than the range of detected light intensities. Resolution (in the contrast and in the time and space domains) is the measure of image quality.
Figure 4. Circuit schematic for 3 x 2 pixels.
The value of the above-mentioned thermometer depends on how many scale partitions one can distinguish from each other; for example, whether or not you can distinguish a temperature of 100°C from one of 1,000°C. Its usefulness also depends on how fast it can change its value, that is, whether it can react to a heat pulse within a few seconds. Thus, quality depends on the application to be met. For example, consider a high-definition video image that contains 2 megapixels, is resolved with 256 gray levels, and allows a frame rate of 100 Hz. This image, while of good quality for television applications, is not suited for high-speed imaging: The frame acquisition time is 10 ms. Also, highly dynamic scenes with contrasts exceeding a range of 1:1,000 will not be resolvable, but gray-level resolution within a given range of 1:200 (which may be displayed on TV monitors) will be superior. On the other hand, a logarithmic sensor that is optimized to handle extreme illumination conditions simultaneously may not be able to resolve as many gray levels within a given range of intensities as its linear counterpart.

Figure 6 compares the contrast resolution capabilities of competing imaging systems (human eye, HDRC, and CCD camera). The CCD camera resolves even smaller contrasts than the human eye (at least under certain conditions). But it falls short when resolvable intensities within one scene exceed a ratio of 256:1 to 1,024:1 (depending on the analog-to-digital converter that can be used).

HDRC imaging is thus a solution for all applications in which high contrasts must be detected at high speed and a contrast resolution of 10 percent satisfies system requirements.
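To see what a constant 10 percent contrast resolution means across the full 1:100,000 input range, one can count how many multiplicative steps of 1.1 fit into five decades: log(100,000)/log(1.1) is about 121 distinguishable levels, so roughly 7 bits suffice to code the whole range logarithmically. This back-of-the-envelope check is our illustration, not a figure from the article:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double range = 100000.0;   /* 1:100,000 optical dynamic range */
        const double step  = 1.10;       /* 10 percent resolvable contrast step */

        /* Number of just-distinguishable levels spanning the whole range. */
        double levels = log(range) / log(step);
        double bits   = log2(levels);

        printf("levels = %.0f, bits = %.1f\n", levels, bits);
        /* Prints roughly "levels = 121, bits = 6.9": about 7 bits cover the
           full range at constant relative (logarithmic) resolution. */
        return 0;
    }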
An HDRC implementation
We first integrated an HDRC chip with 64 x 64 pixels using a standard "digital" 1.2-µm CMOS technology.

Readout frequency, pixel pitch, and array size are correlated design parameters. We chose the small array size with a medium spatial resolution (pixel pitch equals 54 µm) to get a high readout frequency. (Delay from address valid to output valid for a random access is 150 ns.)
HDRC application
The Institut de Recherches Robert Bosch S.A. built an experimental camera incorporating the HDRC chip, and we interfaced it to an ITM frame grabber board for demonstration purposes. Figures 7 and 8 show the attempts to record a critical road scene using a standard CCD video camera in comparison to using the HDRC.

The scene shows two cars meeting at a tunnel's entrance. (The left car approaches the tunnel coming out of a bright zone; the right car leaves the dark tunnel region. We placed the observing camera outside the tunnel, pointing into it.) For better comparison, we extracted a zoom window of only 64 x 64 pixels, corresponding to the 64 x 64 pixels of the HDRC, from a standard CCD video stream.
Figure 5. Transfer functions of HDRC and CCD cameras (normalized output versus luminance L in cd/m²).
Figure 6. Contrast resolution capabilities of the human eye, HDRC, and CCD camera (versus luminance L in cd/m²).
The images from the HDRC were taken with a constant aperture setting, while the aperture of the CCD camera was set to a value that allows most details to be detected. Despite the low spatial resolution of the present HDRC, details of the cars can be extracted both in the dim and in the bright regions.

In dynamic driving situations demanding short response times, the adjustment time of the CCD's aperture would lead to even more information loss within images taken with the CCD camera. The benefit of applying the HDRC chip in these situations is obvious; it appears to be a necessary enhancement to existing vision systems in automotive applications.
Discussion
The current 64 x 64-pixel approach with integrated digital decoders and analog output drivers is certainly not the final "production camera" for high-speed, highly dynamic imaging systems. But it proves the functionality and indicates the system performance of a high dynamic range camera feasible in today's or tomorrow's standard technologies.
Integration of a complete "microsystem" with imager, decoder, and control logic in a standard CMOS process is possible today.⁷ Integration of analog-to-digital converters in a digital environment is also a state-of-the-art technique.⁸ Only die size limits the integration of additional digital postprocessing circuitry on chip. Spatial resolution may be increased using the same 1.2-µm CMOS technology (with no space left for digital postprocessing) and the same pixel design. The design will benefit from further scaling in CMOS technology, as the factors limiting the resolution are the dimensions of metal width and space.

For the best system performance of an image processor, an application-specific imager solution may take system requirements into account.⁹ Frame rates above 2,000 frames/second can't be reached with a single large pixel field but are possible by partitioning the total image frame into several subfields on one chip with a parallel readout of multiple fields.
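The partitioning argument is again simple arithmetic: a single 6.6-MHz output reading a 128 x 128 frame reaches only about 400 frames/second, whereas splitting the frame into subfields read out in parallel multiplies the rate by the number of outputs. The following sketch (field sizes and output counts are our illustrative choices, following Figure 2) makes the point:

    #include <stdio.h>

    /* Frame rate for `pixels` total pixels split across `outputs` subfields,
     * each read out in parallel at `readout_hz` pixels per second. */
    static double partitioned_rate(double readout_hz, int pixels, int outputs)
    {
        return readout_hz * outputs / (double)pixels;
    }

    int main(void)
    {
        const double f = 6.6e6;                       /* per-output readout clock */
        const int pixels_128 = 128 * 128;

        printf("128 x 128, 1 output : %4.0f frames/s\n",
               partitioned_rate(f, pixels_128, 1));   /* ~403 frames/s */
        printf("128 x 128, 4 outputs: %4.0f frames/s\n",
               partitioned_rate(f, pixels_128, 4));   /* ~1,611 frames/s */
        printf(" 64 x  64, 4 outputs: %4.0f frames/s\n",
               partitioned_rate(f, 64 * 64, 4));      /* ~6,445 frames/s */
        return 0;
    }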
High sensitivity (below 0.1 lux) and high gray-level resolution (greater than 8 bits) may not be reached in combination with the highest spatial resolution in planar technologies; but they are possible if one can afford a lower spatial resolution.

Still, the costs for application-specific optical integrated circuits are high, because so far there is no technology-independent support for optical standard cells. This means that every optical device must be a full-custom design. Developments in recent years show that the growing market for optical solutions will need application-specific optical ICs to overcome the problems resulting from the concentration of development efforts for image sensors (within the last 20 years) on the one and only consumer application, the video camera.
FURTHER WORK ON HDRCS WILL FOCUS on higher spatial resolution (development of an HDRC 256 x 128 chip) as well as higher contrast resolution. New functions, such as variable conversion characteristics or active resolution control, will take even more system aspects into account. The fact that CMOS image sensors are easy to integrate will become one major aspect in the development of vision systems. All optimizations will focus on higher system performance of camera systems or image processing systems rather than on a singular high-performance camera chip, which could be done better in technologies other than CMOS. Therefore, our work will always be embedded in the development of application-specific image processing systems.
Acknowledgments
We thank B. Ulmer, who accompanied our project as project leader at our sponsor company (Daimler Benz AG), for his contributions concerning the specification of the HDRC and for his encouragement.
Figure 7. Road scenes taken with the HDRC.
We also thank J.F. Longchamp and R. Cochard (at the Institut de Recherches Robert Bosch S.A., Lonay, Switzerland) for their very helpful conversations on image sensor development. Their work designing and manufacturing an experimental camera made it possible to demonstrate the performance of our HDRC chip. The staff at IMS helped with discussions on circuit simulation and design as well as with processing work for the HDRC chip.

The Bundesministerium für Forschung und Technologie (BMFT), Daimler Benz AG, and Volkswagen AG supported this work under contract TV8926 3. We alone are responsible for the contents.
References
1. B. Höfflinger, "Cooperative Research on Application-Specific Microelectronics," Proc. IEEE, Vol. 77, No. 9, Sept. 1989, pp. 1390-1395.
2. R. Ginosar and Y.Y. Zeevi, "Adaptive Sensitivity/Intelligent Scan Image Sensor Chips," Proc. SPIE, Vol. 1001, 1988, pp. 462-468.
3. Y. Matsunaga et al., "A High-Sensitivity MOS Photo-Transistor for Area Image Sensor," IEEE Trans. Electron Devices, Vol. 38, No. 5, May 1991, pp. 1044-1047.
4. J. Hynecek, "A New Device Architecture Suitable for High-Resolution and High-Performance Image Sensors," IEEE Trans. Electron Devices, Vol. 35, No. 5, May 1988, pp. 646-652.
5. U. Seger et al., "Development of an Image Sensor with Enhanced Dynamic Range in Standard CMOS Technology," Proc. Second Prometheus Workshop, Vol. II, Institute for Microelectronics Stuttgart, Stuttgart, Germany, 1989, p. 371.
6. S.G. Chamberlain and J. Lee, "A Novel Wide Dynamic Range Silicon Photodetector and Linear Imaging Array," IEEE J. Solid-State Circuits, Vol. SC-19, No. 1, Feb. 1984, pp. 41-48.
7. D. Renshaw et al., "ASIC Image Sensors," Proc. IEEE Custom Integrated Circuits Conf., 1990, pp. 7.3.1-7.3.4.
8. "IMS Semicustom ASICs: Gate Forest Family," Institute for Microelectronics Stuttgart, 1992.
9. J. Wyatt et al., "The First Two Years of the MIT Vision Chip Project," private communication.
Figure 8. Road scenes taken with the CCD camera.
Ulrich Seger is a staff member of the Microsystem Division of the Institute for Microelectronics Stuttgart, where he works on CMOS microsystems involving optical sensors. Earlier, he worked as a development engineer with Computer Gesellschaft Konstanz mbH. Seger studied electrical engineering at the Fachhochschule Konstanz and received the diploma in engineering for his work on a digital image preprocessor for optical character recognition.
Heinz-Gerd Graf heads the Microsystem Department of the Institute for Microelectronics Stuttgart. He has been a member of the research staff of the Department of Electrical Engineering at the University of Dortmund and a researcher in the Advanced Physics Group of Messerschmitt-Boelkow-Blohm GmbH. He received the diploma in physics from the University of Dortmund.
Marc E. Landgraf was involved in the first HDRC design as a researcher in the IMS Interface Group. Today, he is a development engineer in the Flash Memory Development Group at Intel Corporation in Folsom, California. Landgraf received his BS degree from California Polytechnic State University, San Luis Obispo, and his MS degree in electrical engineering from the University of California, Davis.
Direct questions concerning this article to Ulrich Seger, Institute for Microelectronics Stuttgart, Allmandring 30A, D-7000 Stuttgart 80, Germany; or e-mail at seger@mikroelektronik.uni-stuttgart.dbp.de.