(12) United States Patent
Dagher et al.

(10) Patent No.: US 8,824,833 B2
(45) Date of Patent: Sep. 2, 2014

(54) IMAGE DATA FUSION SYSTEMS AND METHODS

(75) Inventors: Joseph C. Dagher, Boulder, CO (US); Amit Ashok, Boulder, CO (US); David Tremblay, Boulder, CO (US); Kenneth S. Kubala, Boulder, CO (US)

(73) Assignee: Omnivision Technologies, Inc., Santa Clara, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 380 days.

(21) Appl. No.: 12/865,343

(22) PCT Filed: Jan. 30, 2009

(86) PCT No.: PCT/US2009/032683
§ 371 (c)(1), (2), (4) Date: Nov. 29, 2010

(87) PCT Pub. No.: WO2009/097552
PCT Pub. Date: Aug. 6, 2009

(65) Prior Publication Data
US 2011/0064327 A1    Mar. 17, 2011
(51) Int. Cl.
G06K 9/32    (2006.01)
G06T 5/00    (2006.01)
G06T 5/50    (2006.01)

(52) U.S. Cl.
CPC .... G06T 5/50 (2013.01); G06T 5/004 (2013.01); G06T 2207/20221 (2013.01); G06T 2207/10148 (2013.01)
USPC .... 382/294; 382/260

(58) Field of Classification Search
CPC .... G06T 2207/20221
See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

5,130,794 A *   7/1992   Ritchey .................... 348/39
5,172,236 A     12/1992  Takemoto et al.
5,282,045 A     1/1994   Mimura et al.
5,771,416 A *   6/1998   Mukai et al. ............... 396/378
6,128,416 A *   10/2000  Oura ....................... 382/284
6,201,899 B1    3/2001   Bergen
6,654,013 B1 *  11/2003  Malzbender et al. .......... 345/426
6,856,708 B1    2/2005   Aoki
7,274,830 B2    9/2007   Bacarella et al.
(Continued)

OTHER PUBLICATIONS

Snavely et al., "Photo Tourism: Exploring photo collections in 3D," ACM Transactions on Graphics, 25(3), Aug. 2006.
(Continued)

Primary Examiner — Bhavesh Mehta
Assistant Examiner — Andrew Moyer
(74) Attorney, Agent, or Firm — Lathrop & Gage LLP
(57) ABSTRACT

Systems and methods for image data fusion include providing first and second sets of image data corresponding to an imaged first and second scene respectively. The scenes at least partially overlap in an overlap region, defining a first collection of overlap image data as part of the first set of image data, and a second collection of overlap image data as part of the second set of image data. The second collection of overlap image data is represented as a plurality of image data subsets such that each of the subsets is based on at least one characteristic of the second collection, and each subset spans the overlap region. A fused set of image data is produced by an image processor, by modifying the first collection of overlap image data based on at least a selected one of, but less than all of, the image data subsets.

39 Claims, 18 Drawing Sheets
U.S. PATENT DOCUMENTS (Continued)

2001/0045982 A1    11/2001  Okisu et al.
2002/0140823 A1    10/2002  Sakurai et al.
2003/0026588 A1 *  2/2003   Elder et al. ............... 386/46
2004/0047518 A1    3/2004   Tiana
2004/0080661 A1    4/2004   Afsenius
2004/0105569 A1 *  6/2004   Sharma et al. .............. 382/100
2004/0234154 A1 *  11/2004  Hier ....................... 382/254
2005/0248590 A1 *  11/2005  Tian et al. ................ 345/660
2006/0050338 A1    3/2006   Hattori .................... 359/9
2006/0061678 A1    3/2006   Yamazaki
2007/0188601 A1 *  8/2007   Rohaly et al. .............. 348/47
2007/0247517 A1    10/2007  Zhang et al.
2008/0056612 A1    3/2008   Park et al. ................ 382/284
2008/0218613 A1 *  9/2008   Janson et al. .............. 348/262
2011/0019048 A1    1/2011   Raynor et al.
OTHER PUBLICATIONS (Continued)

Bao and Xu, "Complex wavelet-based image mosaics using edge-preserving visual perception modeling," Computers & Graphics 23.3 (1999): 309-321.
Brown and Lowe, "Recognising panoramas," Proceedings of the Ninth IEEE International Conference on Computer Vision, vol. 2, pp. 1218-1225, 2003.
Klarquist and Bovik, "FOVEA: A foveated vergent active stereo vision system for dynamic three-dimensional scene recovery," IEEE Transactions on Robotics and Automation 14.5 (1998): 755-770.
Kuhnlenz et al., "A multi-focal high-performance vision system," Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), IEEE, 2006.
Scassellati, "A binocular, foveated active vision system," No. AI-M-1628, Massachusetts Institute of Technology Artificial Intelligence Lab, 1999.
Zhao et al., "Broadband and wide field of view foveated imaging system in space," Optical Engineering 47.10 (2008): 103202.
Wikipedia, "Image scaling," http://en.wikipedia.org/wiki/Image_scaling, Jan. 29, 2007.
Drori, Iddo, and Dani Lischinski, "Fast multiresolution image operations in the wavelet domain," IEEE Transactions on Visualization and Computer Graphics 9.3 (2003): 395-411.
Hill, Paul R., Cedric Nishan Canagarajah, and David R. Bull, "Image fusion using complex wavelets," BMVC, 2002.
International Search Report and Written Opinion issued in related PCT patent application PCT/US2009/032683, dated Jan. 30, 2009, 14 pages.
Kiyoharu et al., "Producing Object-Based Special Effects by Fusing Multiple Differently Focused Images," IEEE Transactions on Circuits and Systems for Video Technology, IEEE Service Center, vol. 10, No. 2, Mar. 1, 2000.
Kazuya et al., "All-in-Focus Image Generation by Merging Multiple Differently Focused Images in Three-Dimensional Frequency Domain," Advances in Multimedia Information Processing, PCM 2005, Lecture Notes in Computer Science, vol. 3767, pp. 303-314, Jan. 1, 2005.
Hong, Sahyun, et al., "Data Fusion of Multiple Polarimetric SAR Images Using Discrete Wavelet Transform (DWT)," IEEE, pp. 3323-3325, 2002.
Office Action issued in related Taiwanese Patent Application 098103287, dated Jan. 9, 2013, 29 pages.
U.S. Appl. No. 13/281,674, Office Action issued Sep. 10, 2013, 28 pages.
U.S. Appl. No. 13/281,674, Response to Office Action filed Dec. 10, 2013, 9 pages.

* cited by examiner
[Drawing sheets 1 through 18 are not reproduced in this text extraction; only the following labels are recoverable.]

FIG. 1 (Sheet 1): schematic view; reference numeral 30.
FIG. 2A (Sheet 2): reference numerals 100, 120.
FIG. 2B (Sheet 3).
FIG. 3 (Sheet 4): block diagram; Multi-aperture Camera, Processor, Image Output Device; reference numeral 164.
FIG. 4 (Sheet 5): N/2 Pixels Wide and N/2 Pixels Tele inputs, Upsample/Interpolate blocks (175), N Pixels Wide and N Pixels Tele outputs; reference numeral 170.
FIG. 5 (Sheet 6): plot of response vs. Nyquist frequency (cycles/degree); High and Low Frequency Tele Data, Low-Frequency Wide Data; reference numerals 180, 200.
FIG. 6 (Sheet 7): plots of response vs. Nyquist frequency (cycles/degree), with markings at 8.5 and 17 cycles/degree; Upsampled Wide Signal (190), High-passed Tele Signal (192), High-Resolution Overlap Region; reference numerals 191, 193, 194, 195, 196.
FIG. 7 (Sheet 8): N/2 Pixels Wide input, Upsample/Interpolate, High-pass filter, N Pixels Tele.
FIG. 8 (Sheet 9): N/2 Pixels Wide RGB input (300), Upsample/Interpolate (320), RGB to YUV Conversion (334), Upsample/Interpolate, High-pass filter; reference numeral 312.
FIG. 9 (Sheet 10): N/2 Pixels Wide RGB input (300), Upsample/Interpolate (320), Convert RGB to YUV (334), Discard UV (338), Register Overlap Region, Upsample/Interpolate, Tele − Wide, Tele (hi), N Pixels Wide, Tele; reference numeral 312.
FIG. 10 (Sheet 11): plot; column index axis.
FIG. 11 (Sheet 12): Convert Color to YUV (Yc, Uc, Vc) (365), Registration (match Yc to YG) (367), (Uc, Vc), Image Fusion; reference numerals 301, 302, 371, 373.
FIG. 12 (Sheet 13).
FIG. 13 (Sheet 14).
FIG. 14 (Sheet 15).
FIG. 15 (Sheet 16).
FIG. 16 (Sheet 17): plot vs. pixel position; reference numeral 206.
FIG. 17 (Sheet 18): plot vs. pixel position.
IMAGE DATA FUSION SYSTEMS AND METHODS

CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application No. 61/025,533, filed on 1 Feb. 2008 and entitled MULTI-FOCAL LENGTH IMAGE FUSION; U.S. Provisional Patent Application No. 61/051,338, filed 7 May 2008 and entitled TRANSFORM DOMAIN REGISTRATION FOR IMAGE FUSION; and U.S. Provisional Patent Application No. 61/059,319, filed 6 Jun. 2008 and entitled TRANSFORM DOMAIN REGISTRATION FOR IMAGE FUSION. All of the above-identified applications are incorporated herein by reference in their entireties.
BACKGROUND
Small, digital cameras integrated into mobile electronics such as mobile phones, personal digital assistants ("PDAs") and music players are becoming ubiquitous. Each year, mobile phone manufacturers add more imaging features to their handsets, causing these mobile imaging devices to converge towards feature sets that consumers expect from standalone digital still cameras. At the same time, the size of these handsets is shrinking, making it necessary to accordingly reduce the total size of the camera modules while still adding imaging features. Optical zoom is a primary feature that many digital still cameras have that many mobile phones may not have, primarily due to the severe size constraints in mobile imaging devices.

Cameras (including digital cameras) may be arranged to receive electromagnetic radiation (such as visible light) through an aperture that can be defined by the camera based on a number of well known techniques. For example, an optical sub-system, including one or more lenses and/or other optical elements, may define the aperture such that the received radiation is imaged by the optical sub-system and a resulting image is directed towards a sensor region such as a sensor array that includes a plurality of detectors defining a sensing surface. The sensor region may be configured to receive the image and to generate a set of image data based on the image. In some common applications, such as when using conventional digital cameras to capture images, the camera may be aligned to receive electromagnetic radiation associated with scenery having a given set of one or more objects. In these applications the set of image data is, for example, represented as digital image data using an electrical signal conveyed by electrical conductors or stored using memory or other digital storage techniques. In addition, the set of image data can be processed using a number of known image processing techniques.
In the context of the present disclosure, "zoom" may be understood as a capability to provide different magnifications of the same scene and/or object by changing the focal length of an optical system, with a higher "level of zoom" being associated herein with greater magnification and a lower level of zoom being associated with lower magnification. In typical film-based cameras, as well as in conventional digital cameras, optical zoom can be accomplished with multiple lens groups that are moved along an optical axis of an imaging system for defining a range of different lens configurations. For any given configuration, the position of the lens groups determines a focal length specific to that configuration. Based on well known techniques, camera users can adjustably control the positioning of the lens groups for selecting a specific level of zoom. At any specific level of zoom associated with a selected focal length of a camera's optical sub-assembly, an image represents a portion of a given scene based in part on the field of view defined by the lens system. For example, an image plane can be defined by the camera's sensor region (such as a sensor array), and the resulting image represents a field of view consistent with (i) a shape and transverse extent of the sensor region's sensing surface, and (ii) the selected focal length. For a given camera, there is a tradeoff between zoom and field of view such that camera settings exhibiting longer focal lengths generally tend to result in a greater level of zoom in conjunction with a correspondingly narrower field of view. Conversely, camera settings exhibiting comparatively shorter focal lengths tend to result in a lower level of zoom in conjunction with a wider field of view.

Certain film-based cameras and digital cameras utilize a fixed focus imaging system, and these cameras generally do not feature adjustable optical zoom. Fixed focus imaging systems are especially common in PDAs. The high complexity, cost and decreased durability typically associated with moveable lenses (e.g., in cameras having optical zoom) limit their use in inexpensive camera modules such as mobile phone camera modules and other low cost modules. Film-based cameras with fixed focus imaging systems generally offer no means for the user to adjust the degree of magnification while preparing to take a picture. On the other hand, digital cameras having fixed optical focus can incorporate digital zoom to allow the user to control the level of zoom before and/or after capturing the image by generating a corresponding set of image data. For example, digital zoom can utilize computer-processed cropping followed by signal upsampling and data interpolation of the cropped image to convert the cropped image to the original display size. As a result, however, the resolution of the cropped, final image is decreased and the image quality suffers.
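By way of illustration only, digital zoom as just described (crop, then upsample and interpolate back to the original display size) can be sketched in a few lines of Python with numpy. The function name, the single-channel (grayscale) input, and the choice of bilinear interpolation are assumptions made for this sketch, not details fixed by this disclosure:

```python
import numpy as np

def digital_zoom(image: np.ndarray, zoom: float) -> np.ndarray:
    """Crop the center 1/zoom portion of a 2-D grayscale image and resize
    it back to the original dimensions by bilinear interpolation."""
    h, w = image.shape
    ch, cw = int(h / zoom), int(w / zoom)          # cropped size
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = image[top:top + ch, left:left + cw].astype(float)

    # Bilinear upsampling back to (h, w): for each output sample, blend
    # the four nearest cropped samples with distance-based weights.
    ys = np.linspace(0, ch - 1, h)
    xs = np.linspace(0, cw - 1, w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, ch - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, cw - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    out = ((1 - wy) * (1 - wx) * crop[np.ix_(y0, x0)]
           + (1 - wy) * wx * crop[np.ix_(y0, x1)]
           + wy * (1 - wx) * crop[np.ix_(y1, x0)]
           + wy * wx * crop[np.ix_(y1, x1)])
    return out.astype(image.dtype)
```

Because the interpolated samples are computed from existing samples rather than from new scene information, the output grows only in pixel count; this is the resolution loss noted above.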
SUMMARY

The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods, which are meant to be exemplary and illustrative, not limiting in scope. In various embodiments, one or more problems and/or limitations associated with the above-described systems and methods have been addressed, while other embodiments are directed to other improvements.

In an embodiment, an imaging method utilizes a multi-aperture imaging system for producing a fused set of image data. This method may include providing a multi-aperture camera having first and second sub-cameras including a first sub-camera, having imaging optics defining a first aperture, with the first camera configured for imaging a first scene through the first aperture and for generating a first set of image data corresponding to the imaged first scene. A second camera may be provided, having imaging optics defining a second aperture, and the second sub-camera may be configured for imaging a second scene through the second aperture and for generating a second set of image data corresponding to the imaged second scene. The second sub-camera can be aligned such that the second scene at least partially overlaps the first scene in an overlap region that defines (i) a first collection of overlap image data as part of the first set of image data for the imaged first scene and (ii) an at least generally corresponding, second collection of overlap image data as part of the second set of image data for the imaged second scene. The second collection of overlap image data of the second scene may be represented as a plurality of image data subsets based on at least one associated characteristic of the second collection of overlap image data, such that each subset is superimposed across the overlap region. A fused set of image data can be produced from the first set of image data by changing the first collection of overlap image data in the overlap region of the first scene based on at least a selected one of, but less than all of, the image data subsets.
In one aspect, representing the second collection of overlap image data may include configuring the plurality of image data subsets such that each subset is based on a different characteristic as compared to the characteristic associated with any one of the other subsets.

In another aspect, the first collection of overlap image data may include a first collection of luminance data, and the selected one of the image data subsets may be a luminance channel (of luminance data) based on luminance as the characteristic of the second collection of overlap image data, and changing of the first collection of overlap image data may include combining the first and second collections of luminance data. Arranging of the second sub-camera may include supplying the second sub-camera as a grayscale camera for providing the luminance channel as being composed of grayscale image data.
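As an illustrative sketch of this luminance-combining aspect (one possible reading, not the patent's own implementation), the high-frequency part of a registered, same-size tele luminance channel could be added to the wide luminance data; the 3x3 box filter and the 8-bit clipping below are assumptions of the sketch:

```python
import numpy as np

def fuse_luminance(wide_y: np.ndarray, tele_y: np.ndarray) -> np.ndarray:
    """Combine the wide image's luminance with the high-frequency part
    of an already-registered, same-size tele luminance channel."""
    t = tele_y.astype(float)
    # Low-pass the tele luminance with a 3x3 box filter (edge padding).
    pad = np.pad(t, 1, mode="edge")
    low = sum(pad[1 + dy:1 + dy + t.shape[0], 1 + dx:1 + dx + t.shape[1]]
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    high = t - low                        # detail the tele channel adds
    fused = wide_y.astype(float) + high   # change the wide overlap data
    return np.clip(fused, 0, 255).astype(np.uint8)
```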
In yet another aspect, representing the second collection of overlap image data may include filtering the second collection of overlap image data such that the selected image data subset is composed of filtered data, and filtering the second collection of overlap image data may include applying convolution filtering to the second collection of overlap image data such that the selected image data subset is influenced by the convolution filtering. Furthermore, representing the second collection of overlap image data may include scaling the second collection of overlap image data such that the selected image data subset is composed of scaled data.

In an additional aspect, the second collection of overlap image data may include intensity information, and scaling the second collection of overlap image data may include changing at least some of the intensity information. In this case scaling the second collection of overlap image data includes applying a gain for causing the changing of the intensity information.
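A hedged sketch of these two representations follows; the sharpening kernel and the gain value are illustrative choices, neither being specified by the disclosure:

```python
import numpy as np

def convolution_subset(overlap: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Form a subset influenced by convolution filtering (correlation
    form, equivalent for the symmetric kernels used here); edge padding
    keeps the subset the same size as the overlap region."""
    kh, kw = kernel.shape
    pad = np.pad(overlap.astype(float),
                 ((kh // 2,) * 2, (kw // 2,) * 2), mode="edge")
    out = np.zeros(overlap.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * pad[i:i + overlap.shape[0],
                                      j:j + overlap.shape[1]]
    return out

def scaled_subset(overlap: np.ndarray, gain: float = 1.5) -> np.ndarray:
    """Form a scaled subset by applying a gain to the intensity data."""
    return np.clip(gain * overlap.astype(float), 0, 255)

# Illustrative sharpening kernel for the convolution-filtered subset.
sharpen = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)
```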
In another aspect, representing the second collection of overlap image data may include transforming at least some of the second collection of overlap image data such that the characteristic of the second collection of overlap image data is represented in a transform domain, and the selected image data subset is composed of transformed data.
In a particular aspect, configuring the first sub-camera may include establishing a first focal length for the first sub-camera, and configuring the second camera may include establishing a second focal length for the second camera. The second focal length may be different than the first focal length such that the second camera exhibits a different field of view as compared to the first camera. Configuring the first camera may include providing a first sensing surface that has a first shape, with the first shape being characterized by a first transverse width. The first sensing surface may be oriented for receiving the imaged first scene to cause the generating of the first set of image data. In this particular aspect, configuring the second camera may include providing a second sensing surface that has a second shape that matches the first shape and has a transverse width that matches the first transverse width, and the second sensing surface may be oriented for receiving the imaged second scene to cause the generating of the second set of image data. Establishing the first focal length may cause the first set of image data to exhibit a first level of zoom with respect to the first scene, and establishing the second focal length may cause the second set of data to exhibit a second level of zoom with respect to the second scene, and the second level of zoom may be greater than the first level of zoom. In some instances, imaging of the first scene may cause the first set of image data to have a first angular frequency based at least in part on the first focal length, and the imaging of the second scene may cause the second collection of overlap data to have a second angular frequency based at least in part on the second focal length, such that the second angular frequency is higher than the first angular frequency. In this particular aspect, generating the first set of image data may include initially producing an initial set of image data and then producing the first set of image data from the initial set of image data by upsampling the initial set of image data for increasing the angular frequency of the first set of image data, as compared to the initial image data, to a target angular frequency such that the first set of image data is upsampled image data. The initial set of image data may include a group of initial data points, and the upsampling may cause the first set of image data to include (i) the group of initial data points and (ii) an additional number of data points. The upsampling of the initial set of image data may further include interpolating between the initial data points for assigning values for each of the additional data points. Furthermore, the upsampling can include matching the increased angular frequency to the second angular frequency such that the target angular frequency of the first set of image data is at least approximately equal to the second angular frequency.
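One plausible reading of this upsampling-and-interpolation step, as a sketch: double the sampling of the wide data in each direction, keeping the initial data points and assigning the additional points by linear interpolation. The 2x factor mirrors the N/2-versus-N pixel counts labeled in FIGS. 4 and 7 and is an assumption here, not the only possibility:

```python
import numpy as np

def upsample2x(initial: np.ndarray) -> np.ndarray:
    """Double the sampling of a 2-D array: retain the initial data
    points on the even grid and fill the new points by linear
    interpolation, first along rows, then along columns."""
    h, w = initial.shape
    src = initial.astype(float)
    rows = np.empty((h, 2 * w))
    rows[:, 0::2] = src                                   # initial points
    rows[:, 1:-1:2] = 0.5 * (src[:, :-1] + src[:, 1:])    # interpolated
    rows[:, -1] = src[:, -1]                              # edge replicate
    out = np.empty((2 * h, 2 * w))
    out[0::2, :] = rows
    out[1:-1:2, :] = 0.5 * (rows[:-1, :] + rows[1:, :])
    out[-1, :] = rows[-1, :]
    return out
```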
In one embodiment, the first sub-camera may be configured with a first sensor region having a first sensing surface, and the first sensor region may be aligned such that the imaging of the first scene includes projecting an image of the first scene through the first aperture and onto the first sensing surface such that the first sensor region causes the generating of the first set of image data. In this example, the second sub-camera may be furnished with a second sensor region having a second sensing surface, and the second sensor region may be aligned such that the imaging of the second scene includes projecting an image of the second scene through the second aperture and onto the second sensing surface such that the second sensor region causes the generating of the second set of image data. In one aspect of this embodiment, the first sensing surface may have a first shape defined by a first surface area and the second sensing surface may have a second shape that at least generally matches the first shape, and the second surface may have a second surface area that is at least approximately equal to the first surface area. It is noted that the first sensor region and the second sensor region may each be a part of a single image sensor.
In another aspect of this embodiment, the first collection of overlap image data may initially be represented based on first, second and third data channels, and changing the first collection of overlap image data may include converting the first collection of overlap image data, as represented by the first, second and third data channels, to represent the first collection of overlap image data based on a different set of three data channels. For example, the first, second, and third channels may be R, G and B channels, respectively, and the different set of data channels may be Y, U and V channels.

In yet another aspect of this embodiment, the second collection of overlap image data may be initially based on first, second, and third channels, and representing the fused set of overlap image data may further include converting the second collection of overlap image data (as represented by the first, second, and third channels) to represent the second collection of overlap data based on a different set of three channels. Each of the different channels may serve as one of the plurality of image data subsets. For example, the three data channels may be R, G, and B channels, and the different set of data channels may be Y, U and V channels, and the Y channel may serve as the selected subset of overlap image data.
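For example, the channel conversion and the selection of the Y channel could look like the following sketch. The BT.601 weighting coefficients are a common convention assumed here for illustration; the disclosure does not fix particular conversion constants:

```python
import numpy as np

def rgb_to_yuv(rgb: np.ndarray):
    """Convert an (H, W, 3) RGB array to separate Y, U, V planes using
    BT.601 weights. Y can then serve as a selected image data subset."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance channel
    u = 0.492 * (b - y)                     # blue-difference chrominance
    v = 0.877 * (r - y)                     # red-difference chrominance
    return y, u, v

# Selecting Y as the subset (cf. the "Discard UV" step labeled in FIG. 9):
# y_tele, _, _ = rgb_to_yuv(tele_overlap)
```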
In an aspect, generating the first set of image data may include initially producing a set of initial image data and then producing the first set of image data from the initial image data by applying a first forward transformation to at least a portion of the initial image data such that the first set of image data may be transformed data in a transform domain such that the first set of image data at least generally represents, in the transform domain, at least some of the portion of the initial image data, and representing the second collection of overlap image data may include applying a second forward transformation to at least some of the second set of image data such that the characteristic of the second collection of image data is represented in the transform domain, and at least the selected image data subset is composed of transformed data. Changing the first collection of overlap image data may include merging the selected one of the image data subsets with the first collection of overlap image data in the transform domain to generate a merged data set in the transform domain, and producing the fused set of image data may include converting the merged data set from the transform domain by applying thereto at least one of (i) a reverse transformation and (ii) an inverse transformation.
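The disclosure does not commit to a particular transform (the cited art also uses wavelets). As one hedged example using a 2-D FFT as the forward transformation, merging in the transform domain and inverting might look like this sketch; the frequency cutoff of 0.25 is an arbitrary illustrative value:

```python
import numpy as np

def fuse_in_transform_domain(wide: np.ndarray, tele: np.ndarray,
                             cutoff: float = 0.25) -> np.ndarray:
    """Apply a forward transform (2-D FFT) to registered, same-size
    wide and tele overlap data, take low frequencies from the wide
    collection and high frequencies from the tele subset, then apply
    the inverse transformation to produce the merged data set."""
    W = np.fft.fft2(wide.astype(float))
    T = np.fft.fft2(tele.astype(float))
    fy = np.fft.fftfreq(wide.shape[0])[:, None]   # normalized frequencies
    fx = np.fft.fftfreq(wide.shape[1])[None, :]
    high = np.hypot(fy, fx) > cutoff              # high-frequency mask
    merged = np.where(high, T, W)                 # merge in transform domain
    return np.fft.ifft2(merged).real              # inverse transformation
```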
In an additional aspect, producing the fused set of image data further may include identifying at least one spatial feature that is present at a feature position within the first collection of overlap image data of the first set of image data, searching for a related representation of at least one identified spatial feature (in the selected image data subset) such that each related representation at least approximately corresponds to one of the identified features, and (for at least a selected one of the related representations that is located in the selected image data subset based on the searching) registering the selected related representation as being associated with the feature position of the corresponding identified feature. In this additional aspect, changing the first collection of overlap image data may include modifying each identified spatial feature based on the corresponding related representation of that feature. It is noted that the related representation may have a related feature position within the selected image data subset, and searching for the related representation can include finding a spatial shift between the related feature position and the feature position. It is further noted that finding the spatial shift may include determining that the spatial shift is non-zero and is caused by parallax between the first and second sub-cameras.
The additional aspect may include (i) defining a reference block overlying the feature position and having a shape that overlies a reference portion of the first collection of overlap image data such that the reference portion of image data at least represents the spatial feature, (ii) defining a search region within the selected image data subset, and (iii) designating a plurality of candidate blocks within the search region, each of which candidate blocks overlies an associated portion of the selected image data subset at a candidate position therein. In some instances the searching may include determining a degree of correspondence between (i) the reference portion of data overlaid by the reference block and (ii) the portion of data associated with each of the plurality of candidate blocks, and in this instance one candidate block may be selected based on the degree of correspondence, such that the selected candidate block exhibits the highest degree of correspondence as compared to the other candidate blocks. Registering the selected related representation may include associating the candidate position of the selected candidate block with the feature position, and modifying of the spatial feature may include changing the reference portion of data based on at least some of the portion of data associated with the selected candidate block. Designating the plurality of candidate blocks may include defining a first candidate block as a specific one of the plurality of candidate blocks, and a second candidate block as a different one of the plurality of candidate blocks, such that the first and second candidate blocks partially overlap one another.
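A minimal sketch of such block-based registration follows, assuming grayscale arrays, an 8-pixel block, a ±4-pixel search region, and the sum of absolute differences (SAD) as the measure of correspondence; the disclosure does not commit to these particular choices:

```python
import numpy as np

def register_block(ref: np.ndarray, pos: tuple, subset: np.ndarray,
                   block: int = 8, search: int = 4) -> tuple:
    """Find, within a search region of `subset`, the candidate block
    best matching the reference block of `ref` at `pos` (top-left
    corner). Candidate blocks shift by one pixel, so neighboring
    candidates partially overlap. Returns the spatial shift (dy, dx);
    a non-zero shift would reflect parallax between the sub-cameras."""
    y, x = pos
    reference = ref[y:y + block, x:x + block].astype(float)
    best, best_shift = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cy, cx = y + dy, x + dx
            if (cy < 0 or cx < 0 or cy + block > subset.shape[0]
                    or cx + block > subset.shape[1]):
                continue                         # candidate out of bounds
            cand = subset[cy:cy + block, cx:cx + block].astype(float)
            sad = np.abs(reference - cand).sum()
            if sad < best:                       # lowest SAD means the
                best, best_shift = sad, (dy, dx) # highest correspondence
    return best_shift
```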
In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following descriptions.
BRIEF DESCRIPTION OF THE DRAWINGS

Exemplary embodiments are illustrated in referenced figures of the drawings. It is intended that the embodiments and figures disclosed herein are to be illustrative rather than limiting.
FIG. 1 is a schematic view illustrating fusion of image data from two sources.
FIG. 2A is a diagrammatic view of one embodiment of an optical design for a multi-aperture camera.
FIG. 2B is a diagrammatic view of another embodiment of an optical design for a multi-aperture camera.
FIG. 3 is a block diagram illustrating a multi-aperture imaging system.
FIG. 4 is a block diagram illustrating one embodiment of a process for creating full-size images from a multi-aperture camera that shares a single sensor.
FIG. 5 is an exemplary plot illustrating the differences in angular frequency information contained in the images produced from optical sub-systems having different focal lengths, but the same f-number.
FIG. 6 is a series of exemplary plots, shown here to illustrate how differences in angular frequency information from images produced from sub-cameras having different focal lengths can be exploited in fusing the images.
FIG. 7 is a combination block diagram and flow chart illustrating optional embodiments of the methods for processing and fusing images from a multi-aperture camera.
FIG. 8 is a combination block diagram and flow chart illustrating other embodiments of methods for processing and fusing images from a multi-aperture camera.
