http://www.edmundoptics.com/technical-support/technical-library/articles/?page=
Articles
06/2009 - "Telecentric Illumination for Vision-System Backlighting" by Bruce Butkus - Machine Design
04/2009 - "Optical Fabrication: Advances in sputtering benefit coating costs" by Iain Macmillan - Laser Focus World
04/2009 - "A Vision System for the E-Pedigree Era" by Gregory Hollows and David Pfleger - Pharmaceutical Manufacturing
03/2009 - "Optical Advances Speed Rapid Prototyping" by Amr Khalil - Design World
03/2009 - "Matching Lenses and Sensors" by Gregory Hollows and Stuart Singer - Vision Systems Design
03/2009 - "Putting More Meaning in Imaging" by Mike May - Bioscience Technology
03/2009 - "Souping up optics with design and simulation software" - BioOptics World
02/2009 - "Optical coatings industry: the picture looks bright" by Caren Les - Photonics Spectra
02/2009 - "Special Report: The Largest Market in the World" by Barry Hochfelder - Advanced Imaging
12/2008 - "Edmund Optics Doubles Precision-Asphere Production Capacity" by John Wallace - Laser Focus World
11/2008 - "Sleeping with the Dragon: The yin and yang of doing business with China" by Robert Edmund - Laser Focus World
11/2008 - "Take your positions..." by Gemma Church - Electro Optics
11/2008 - "Manufacturing high-quality aspheres – conventional or in a hybrid process" by Jeremy Govier and Martin Weinacht - Photonik
11/2008 - "Precision Prism Manufacturing: Art or Science?" by Andrew Lynch - NASA Tech Briefs
10/2008 - "Glasses roll aspheres into the mainstream" by Gregg Fales - Optics & Laser Europe
09/2008 - "Getting inside optical filters" by Mike May - BioOptics World
09/2008 - "Into the UV" by Andrew Wilson - Vision Systems Design
08/2008 - "Tracking fluid flow with two channels" by Hank Hogan - BioPhotonics
07/2008 - "Optics optimizes fluorescence" by Kristin Vogt - BioOptics World
07/2008 - "Innovate or Die" by Kathy Sheehan - SPIE (Copyright 2008, SPIE Professional Magazine. Used with permission.)

Copyright 2011, Edmund Optics Inc. — 101 East Gloucester Pike, Barrington, NJ 08007-1380 USA
Phone: 1-800-363-1992 or 1-856-573-6250, Fax: 1-856-573-6295

http://www.vision-systems.com/articles/print/volume-14/issue-3/features/matching-lenses-and-sensors.html

Matching Lenses and Sensors
March 1, 2009

With pixel sizes of CCD and CMOS image sensors becoming smaller, system integrators must pay careful attention to their choice of optics

Greg Hollows and Stuart Singer

Each year, sensor manufacturers fabricate sensors with smaller pixel sizes. About 15 years ago, it was common to find sensors with pixels as small as 13 µm. It is now common to find sensors with standard 5-µm pixel sizes. Recently, sensor manufacturers have produced pixel sizes of 1.4 µm without considering lens performance limits. It is also common to find sensors that contain 5 Mpixels and individual pixel sizes of 3.45 µm. In the next generation of image sensors, some manufacturers expect to produce devices with pixel sizes as small as 1.75 µm.

In developing these imagers, sensor manufacturers have failed to communicate with lens manufacturers. This has resulted in a mismatch between the advertised sensor resolution and the resolution that is attainable from a sensor/lens combination. To address the problem, lens manufacturers now need to produce lenses that employ higher optical performance, lower f-numbers (f/#s), and significantly tightened manufacturing tolerances so that these lenses can take advantage of new sensors.

Understanding light

To understand how lenses can limit the performance of an imaging system, it is necessary to grasp the physics behind such factors as diffraction, lens aperture, focal length, and the wavelength of light. One of the most important parameters of a lens is its diffraction limit (DL). Even a perfect lens, not limited by its design, will be diffraction limited, and this figure, given in line pairs/mm, determines the maximum resolving power of the lens. To calculate the diffraction limit, a simple formula that relates the f/# of the lens and the wavelength of light can be used:

DL = 1/[(f/#)(wavelength in millimeters)]

After the diffraction limit is reached, the lens can no longer resolve higher frequencies. One of the variables affecting the diffraction limit is the speed of the lens, or f/#. This is directly related to the size of the lens aperture and the focal length of the lens as follows:

f/# = focal length/lens aperture

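As a quick check of these two relations, the short Python sketch below evaluates them for an assumed example, a 25-mm lens with a 12.5-mm clear aperture at the 632.8-nm wavelength used later in the article; the lens values are illustrative assumptions, not figures from the article.

# A minimal sketch of the two relations above, assuming a 25-mm lens with a
# 12.5-mm clear aperture used at 632.8 nm (illustrative values, not from the article).

def f_number(focal_length_mm: float, aperture_mm: float) -> float:
    """f/# = focal length / lens aperture."""
    return focal_length_mm / aperture_mm

def diffraction_limit_lp_per_mm(f_num: float, wavelength_mm: float) -> float:
    """DL = 1 / [(f/#)(wavelength in millimeters)], in line pairs/mm."""
    return 1.0 / (f_num * wavelength_mm)

wavelength_mm = 632.8e-6                 # 632.8 nm expressed in millimeters
f_num = f_number(25.0, 12.5)             # -> f/2.0
dl = diffraction_limit_lp_per_mm(f_num, wavelength_mm)
print(f"f/{f_num:.1f}: diffraction limit ~ {dl:.0f} lp/mm")   # ~790 lp/mm
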
The diffraction pattern resulting from a uniformly illuminated circular aperture has a bright region in the center, known as the Airy disk, which together with the series of concentric bright rings around it is called the Airy pattern. The diameter of this pattern is related to the wavelength of the illuminating light and the size of the circular aperture.

FIGURE 1. The Airy disk is the smallest point on which a beam of light can be focused (top). The center bright spot contains approximately 84% of the total spot image energy, 91% within the outside diameter of the first ring, and 94% of the energy within the outside diameter of the second ring. The 3-D light intensity of the Airy disk shows how the light is distributed (bottom).

`
`This is important since the Airy disk is the smallest point a beam of light can be focused. The disk comprises rings of light
`decreasing in intensity and appears similar to the rings on a bulls-eye target. The center bright spot contains approximately
`84% of the total spot image energy, 91% within the outside diameter of the first ring and 94% of the energy within the outside
`diameter of the second ring and so on (see Fig. 1a and 1b). The Airy disk diameter (ADD) can be calculated by
`
`ADD = (2.44)(f/#)(wavelength)
`
`The image spot size can be considered the diameter of the Airy disk, which comprises all its rings. The spot size that a lens
`produces has an increasingly significant role in digital imaging. This is because the individual pixel size on the latest sensors
`has been reduced to the point where it is comparable or smaller than the Airy disk size.
`
`It is important to consider the Airy disk diameter at a particular f/# since the Airy disk diameter can be considerably larger
`than the individual pixel size. Using a lens set to f/8.0 will be performance limited by an individual pixel size <12.35 µm (see
`Table 1). In the table, all the values are given using a 632.8-nm wavelength.
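The following Python sketch applies the ADD formula at a few representative f/#s (chosen for illustration; they are not necessarily the rows of Table 1) and reproduces the f/8.0 value of roughly 12.35 µm quoted above.

# Airy disk diameter at a few representative f/#s, evaluated at 632.8 nm
# (illustrative picks, not necessarily the entries of Table 1).

def airy_disk_diameter_um(f_num: float, wavelength_um: float = 0.6328) -> float:
    """ADD = (2.44)(f/#)(wavelength); wavelength and result in micrometers."""
    return 2.44 * f_num * wavelength_um

for f_num in (1.4, 2.8, 4.0, 8.0):
    print(f"f/{f_num}: Airy disk ~ {airy_disk_diameter_um(f_num):.2f} um")
# f/8.0 gives ~12.35 um, so any pixel smaller than that is lens-limited at f/8.0.
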
Sensor resolution

While the diffraction limit in line pairs/mm determines the resolving power of the lens, the resolution limit of the image sensor, commonly referred to as the Nyquist frequency (NF), is also expressed in line pairs/mm, where

NF = 1/[(pixel size)(2)]

Table 2 shows the Nyquist frequency limits for pixel sizes now available in machine-vision cameras. A lens system with a fairly low f/# is required to even theoretically achieve the sensor-limited resolution. It is common practice for such lenses to be specified with f/#s that apply at infinity focus.

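A short Python sketch of the Nyquist relation, evaluated at a few of the pixel sizes mentioned in the article (an illustrative selection, not a reproduction of Table 2):

# Nyquist frequency for a few of the pixel sizes mentioned in the article.

def nyquist_lp_per_mm(pixel_size_um: float) -> float:
    """NF = 1 / [(pixel size)(2)], with the pixel size converted to millimeters."""
    return 1.0 / (2.0 * pixel_size_um * 1e-3)

for pixel_um in (13.0, 5.0, 3.45, 1.75):
    print(f"{pixel_um} um pixel: Nyquist ~ {nyquist_lp_per_mm(pixel_um):.0f} lp/mm")
# A 3.45-um pixel needs ~145 lp/mm, which already calls for a diffraction limit of
# 145 lp/mm or better -- roughly f/11 or faster at 632.8 nm -- before any other losses.
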
Because an object is viewed at a finite distance in most machine-vision systems, these f/#s are no longer valid. A new "finite" f/# value must be calculated and used in all system calculations such as spot size and resolution limits. A simple way to calculate the "finite" f/# (ff/#) is

ff/# = (infinity f/#)(magnification + 1)

Using the above equation and assuming an optical magnification of 1 (unity), the ff/# for the lens is twice the infinity f/# value. Thus, as a rule of thumb, a lens listed with an f/# of 1.4 can be estimated to have an f/# of 2.8 when used in a machine-vision system. Smaller and smaller pixel sizes force lenses to run at very low f/#s to theoretically achieve the resolution limits of the sensor.

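As a worked example of this rule of thumb, the Python sketch below takes an assumed f/1.4 lens at unity magnification and computes both the working f/# and the diffraction limit that results at 632.8 nm.

# Working ("finite") f/# for an f/1.4 lens at 1X, and the diffraction limit it allows.

def working_f_number(infinity_f_num: float, magnification: float) -> float:
    """ff/# = (infinity f/#)(magnification + 1)."""
    return infinity_f_num * (magnification + 1.0)

wavelength_mm = 632.8e-6
ff_num = working_f_number(1.4, 1.0)          # f/1.4 lens at 1X -> working f/2.8
dl = 1.0 / (ff_num * wavelength_mm)          # diffraction limit at the working f/#
print(f"working f/{ff_num:.1f}: diffraction limit ~ {dl:.0f} lp/mm")   # ~564 lp/mm
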
As the f/# gets lower and lower, it becomes more difficult to design and manufacture lenses that approach the theoretical limit. While some lens designs can approach theoretical limits, once manufacturing tolerances, different wavelength ranges, sensor alignment, microlenses, different lens mounts, and the desire to use these lenses over a range of working distances are taken into account, it becomes nearly impossible to approach those limits.

Lens design

When designing lenses, optical engineers take into account many different factors to achieve the desired resolution. In any lens design, whether for a web camera or for a high-resolution imaging system, the lens performance varies with the working distance, ff/#, and the wavelength range.

Each lens has a sweet spot where the best performance is obtained. As factors such as working distance are varied, system performance will fall off. The higher the resolution of the system, the faster this will happen.

In the case of Sony's 5-Mpixel sensor that features 3.45-µm pixels, for example, sensor-limited resolution really cannot be achieved, even theoretically, at both very short and longer working distances with the same lens. Thus, it is critical to discuss with lens manufacturers what the working distance for a specific application will be and to understand how the lens will perform at that distance.

Not just any lens product can be used to make such systems work effectively. Remember: a lens is not guaranteed to perform in a 5-Mpixel camera simply because it is specified as a 5-Mpixel lens.

In the past, machine-vision systems used lenses developed for microscopy, photography, and security applications. While these lenses can be very good, they do not maximize the capabilities of imagers used in machine vision. Additionally, the intense price pressure in these markets requires loosened manufacturing tolerances, and such lenses may omit the features of those specifically designed for machine vision.

Tighter tolerances

The tighter the tolerances of the manufacturing process, the more closely the lens will achieve the parameters of an ideal design. Tighter manufacturing tolerances also lead to a more repeatable lens, which is important when installing multiple systems, and to better image quality across the entire sensor. Because image quality generally falls off at the corner of the image first, loosening tolerances only exacerbates, and in many cases accelerates, these effects.

System developers do not require a background in optical-mechanical design to determine whether lens tolerances are tight enough. However, it should be determined whether the design information supplied is for the ideal/nominal design or for the toleranced design. Since many lenses are specified using toleranced design information, the lens vendor may need to provide test images set for a specific application requirement.

The higher the resolution of the system, the lower the f/# needs to be to resolve spots small enough to match the camera's resolution. The lower the f/# of the lens, the larger the cone of light at a given working distance, and the faster rays diverge before and after best focus. If the alignment of the lens to the sensor is not tight enough, even a lens that meets specific resolution requirements may not yield a system that meets specification.

Figure 2 shows a sensor (in red) tipped in relation to the lens system, where the dashes represent individual pixels. The solid red line (right) indicates the point at which the defocusing of the cones of light produced by the lens grows larger than the pixels, creating out-of-focus imaging beyond those points. If enough pixels are added and the alignment is not perfect, the system will become defocused.

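To give a rough sense of how tight this alignment must be, the Python sketch below uses a simplified cone-geometry model that is an assumption of this note, not a formula from the article: the converging cone of an f/# lens spreads by roughly the defocus distance divided by the f/#, so the blur at the sensor edge exceeds one pixel once the focus error there is larger than about (pixel size)(f/#). The pixel size, working f/#, and sensor half-width used are illustrative.

# Simplified cone-geometry model of the tilt effect shown in Figure 2 (an
# assumption of this sketch, not a formula from the article). A cone converging
# at a given f/# spreads by roughly (defocus)/(f/#), so the blur at the sensor
# edge passes one pixel when the focus error there exceeds about (pixel)(f/#).
import math

def max_tilt_deg(pixel_um: float, f_num: float, half_width_mm: float) -> float:
    """Tilt angle at which edge blur grows past one pixel in this simple model."""
    allowed_defocus_mm = pixel_um * 1e-3 * f_num
    return math.degrees(math.atan(allowed_defocus_mm / half_width_mm))

# Illustrative numbers: 3.45-um pixels, working f/2.8, sensor half-width ~4.4 mm.
print(f"allowable tilt ~ {max_tilt_deg(3.45, 2.8, 4.4):.2f} deg")   # ~0.13 deg
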
FIGURE 2. A sensor (red) may be tipped in relation to the lens system. Red dashes represent individual pixels; the solid red line indicates the point at which the defocusing of the cones of light produced by the lens grows larger than the pixels, creating out-of-focus imaging beyond those points. If enough pixels are added and the alignment is not perfect, the system will become defocused.

Asking camera manufacturers how they guarantee the alignment of the sensor with respect to the camera lens mount is the best way to reduce the risks associated with this issue. Higher levels of alignment do add cost, but performance is maximized. For high pixel densities, as in linescan and 11-Mpixel and 16-Mpixel cameras, alignment tools may be designed into the lens or camera.

Increasing fill factor

Microlenses increase the fill factor of the sensor by capturing as much light as possible. However, like any lens, they have an acceptance angle within which they will still effectively collect light and focus it onto the active portion of the pixel (see Fig. 3). If the external lens used to form an image on a sensor that uses microlenses exceeds this angle, the light does not reach the sensor (see Fig. 4).

FIGURE 3. Microlenses increase the fill factor of the sensor by capturing as much light as possible. However, they have an acceptance angle at which they will effectively collect light and focus it onto the active portion of the pixel.

As sensors grow larger and larger, the acceptance angle of each of these microlenses does not change. The angle of light from the center of the external lens to pixels farther and farther from the center of the sensor does change, however, as can be seen from the green and red ray traces of Fig. 4.

FIGURE 4. If the external lens used in a design exceeds the acceptance angle of the microlens used with the sensor, light from objects farther from the center field of view of the lens (green and red) may not reach the sensor.

As sensor resolutions increase, light must still reach the individual microlenses on the sensor at angles as low as 7° so that shading or roll-off does not occur. To overcome this, lens manufacturers such as Schneider Optics and Edmund Optics will be offering external lenses that are near telecentric in image space. In such designs, light farther and farther from the center remains essentially on-axis and no angular roll-off occurs (see Fig. 5).

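A hedged Python sketch of the geometry behind Figs. 4 and 5: for a conventional lens the chief-ray angle at an image point can be approximated as arctan(image height / exit-pupil distance), so moving the exit pupil far from the sensor (near telecentric in image space) drives the corner angle toward zero. The sensor half-diagonal and exit-pupil distances below are assumed, illustrative values.

# Chief-ray angle at the sensor corner, approximated as
# arctan(image height / exit-pupil distance); all distances are assumed values.
import math

def chief_ray_angle_deg(image_height_mm: float, exit_pupil_distance_mm: float) -> float:
    return math.degrees(math.atan(image_height_mm / exit_pupil_distance_mm))

half_diagonal_mm = 5.5                       # roughly half the diagonal of a 2/3-in. sensor
for exit_pupil_mm in (30.0, 60.0, 1e6):      # 1e6 mm stands in for image-space telecentric
    angle = chief_ray_angle_deg(half_diagonal_mm, exit_pupil_mm)
    print(f"exit pupil at {exit_pupil_mm:g} mm: corner chief-ray angle ~ {angle:.1f} deg")
# A 30-mm exit pupil puts the corner ray near 10 deg, well past a 7-deg microlens
# acceptance angle; pushing the exit pupil far away brings it toward 0 deg.
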
FIGURE 5. To overcome the problem associated with microlens-based sensors, lens manufacturers will offer external lenses that are near telecentric in image space. The angle of light farther and farther from the center will remain on-axis and no angular roll-off will occur.

Many have enjoyed the advances in sensor development associated with consumer cameras, but products designed for consumer applications and those for machine vision are vastly different. There will always be overlap and commonality between these areas, but understanding machine-vision optics is mandatory for those building high-resolution imaging systems.

Greg Hollows is director, machine vision solutions, at Edmund Optics, Barrington, NJ, USA; www.edmundoptics.com. Stuart Singer is vice president of Schneider Optics, Hauppauge, NY, USA; www.schneideroptics.com.