(12) United States Patent
     Shabtay et al.

(10) Patent No.: US 9,571,731 B2
(45) Date of Patent: Feb. 14, 2017

(54) THIN MULTI-APERTURE IMAGING SYSTEM WITH AUTO-FOCUS AND
     METHODS FOR USING SAME

(71) Applicant: Corephotonics Ltd., Tel-Aviv (IL)

(72) Inventors: Gal Shabtay, Tel Aviv (IL); Noy Cohen, Tel Aviv (IL);
     Nadav Geva, Tel Aviv (IL); Oded Gigushinski, Herzlia (IL);
     Ephraim Goldenberg, Ashdod (IL)

(73) Assignee: Corephotonics Ltd., Tel Aviv (IL)

(*) Notice: Subject to any disclaimer, the term of this patent is
     extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 14/906,116

(22) PCT Filed: Jul. 24, 2014

(86) PCT No.: PCT/IB2014/063393
     § 371 (c)(1), (2) Date: Jan. 19, 2016

(87) PCT Pub. No.: WO2015/015383
     PCT Pub. Date: Feb. 5, 2015

(65) Prior Publication Data
     US 2016/0182821 A1, Jun. 23, 2016

     Related U.S. Application Data

(60) Provisional application No. 61/861,185, filed on Aug. 1, 2013.

(51) Int. Cl.
     H04N 5/232  (2006.01)
     H04N 5/225  (2006.01)
     (Continued)

(52) U.S. Cl.
     CPC ...... H04N 5/23232 (2013.01); G02B 7/36 (2013.01);
     G02B 27/646 (2013.01); (Continued)

(58) Field of Classification Search
     CPC ...... G02B 27/646; G02B 7/36; H04N 5/2258; H04N 5/23212;
     H04N 5/23232; H04N 5/2628; H04N 5/33; H04N 9/045; H04N 9/09;
     H04N 9/64
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     7,305,180 B2  12/2007  Labaziewicz et al.
     7,561,191 B2   7/2009  May et al.
     (Continued)

     FOREIGN PATENT DOCUMENTS

     WO  2013105012 A2   7/2013
     WO  2014199338 A2  12/2014
     (Continued)

     OTHER PUBLICATIONS

International Search Report and Written Opinion issued in related PCT
patent application PCT/IB2014/063393, dated May 11, 2016. 9 pages.
(Continued)

Primary Examiner - Amy Hsu
(74) Attorney, Agent, or Firm - Nathan & Associates Patent Agents
Ltd.; Menachem Nathan

(57) ABSTRACT

Dual-aperture digital cameras with auto-focus (AF) and related methods
for obtaining a focused and, optionally, optically stabilized color
image of an object or scene. A dual-aperture camera includes a first
sub-camera having a first optics bloc and a color image sensor for
providing a color image, a second sub-camera having a second optics
bloc and a clear image sensor for providing a luminance image, the
first and second sub-cameras having substantially the same field of
view, an AF mechanism coupled mechanically at least to the first
optics bloc, and a camera controller coupled to the AF mechanism and
to the two image sensors and configured to control the AF mechanism,
to calculate a
(Continued)
[Front-page figure: sensor embodiment with image circles 302a, 302b on
a substrate 304; sensor areas labeled Sensor 1 and Sensor 2 (remaining
figure text illegible in scan)]
`
`APPL-1020 / Page 1 of 19
`Apple v. Corephotonics
scaling difference and a sharpness difference between the color and
luminance images, the scaling and sharpness differences being due to
the AF mechanism, and to process the color and luminance images into a
fused color image using the calculated differences.

(51) Int. Cl.
     H04N 9/04   (2006.01)
     G02B 7/36   (2006.01)
     G02B 27/64  (2006.01)
     H04N 5/262  (2006.01)
     H04N 5/33   (2006.01)
     H04N 9/09   (2006.01)
     H04N 9/64   (2006.01)

(52) U.S. Cl.
     CPC ...... H04N 5/2258 (2013.01); H04N 5/23212 (2013.01);
     H04N 5/2628 (2013.01); H04N 5/33 (2013.01); H04N 9/045
     (2013.01); H04N 9/09 (2013.01); H04N 9/64 (2013.01)

(56) References Cited

     U.S. PATENT DOCUMENTS

     7,676,146 B2    3/2010  Border et al.
     7,965,314 B1*   6/2011  Miller ......... G08B 13/19643
                                                     250/330
     8,149,327 B2    4/2012  Lin et al.
     8,439,265 B2    5/2013  Ferren et al.
     8,542,287 B2*   9/2013  Griffith ......... H04N 5/2251
                                                     348/218.1
     8,553,106 B2   10/2013  Scarff
     8,660,420 B2    2/2014  Chang
     8,731,390 B2    5/2014  Goldenberg et al.
     8,824,823 B1    9/2014  Golan et al.
  2005/0225654 A1*  10/2005  Feldman .......... H04N 9/045
                                                     348/272
  2005/0253951 A1   11/2005  Fujimoto et al.
  2006/0054782 A1*   3/2006  Olsen ............ H04N 5/265
                                                     250/208.1
  2006/0187338 A1    8/2006  May et al.
  2007/0296835 A1*  12/2007  Olsen
  2008/0030592 A1    2/2008  Border et al.
  2008/0218613 A1    9/2008  Janson et al.
  2009/0050806 A1    2/2009  Schmidt et al.
  2010/0165134 A1    7/2010  Dowski et al.
  2011/0064327 A1    3/2011  Dagher et al.
  2011/1014186       6/2011  Satoshi Arai et al.
  2012/0113307 A1*   5/2012  Watanabe ...... H04N 5/23219
                                                     348/333.01
  2013/0033578 A1*   2/2013  Wajs ............. G06T 7/0065
                                                     348/46
  2013/0044382 A1    2/2013  Phoon et al.
  2013/0215299 A1*   8/2013  Imamura
  2013/0242181 A1    9/2013  Phoon et al.
  2013/0258044 A1*  10/2013  Betts-Lacroix .. H04N 13/0242
                                                     348/36
  2013/0321675 A1*  12/2013  Cote .............. H04N 9/64
                                                     348/242
  2013/0335854 A1   12/2013  Etoh et al.
  2013/0341493 A1*  12/2013  Ando .............. G01C 3/32
                                                     250/203.1
  2013/0342691 A1*  12/2013  Lewis ............ H04N 5/332
                                                     348/143
  2014/0043519 A1    2/2014  Azuma et al.
  2015/0029601 A1    1/2015  Dror et al.
  2015/0085174 A1    3/2015  Shabtay et al.

     FOREIGN PATENT DOCUMENTS

     WO  2015001519 A2   1/2015
     WO  2015015383 A2   2/2015
     WO  2015124966 A1   8/2015

     OTHER PUBLICATIONS

International Search Report and Written Opinion issued in related PCT
patent application PCT/IB2014/062180, dated Mar. 11, 2015. 11 pages.
International Search Report and Written Opinion issued in related PCT
patent application PCT/IB2014/062181, dated Oct. 8, 2014, 8 pages.

* cited by examiner
U.S. Patent    Feb. 14, 2017    Sheet 1 of 10    US 9,571,731 B2

FIG. 1A: [traditional camera 100', with lens 106 and image sensor 102
on substrate 104]
Sheet 2 of 10

FIG. 1B: [dual-aperture camera 100", with sub-camera 1 on substrate
114a and sub-camera 2 on substrate 114b]
Sheet 3 of 10

FIG. 2: [dual-aperture imaging system with auto-focus, showing
sub-camera 2]
Sheet 4 of 10

FIG. 3: [image sensor embodiment for the imaging system of FIG. 2: one
sub-camera with a CFA sensor (Sensor 1), the other with a clear sensor
(Sensor 2)]
Sheet 5 of 10

FIG. 4A: [flow chart of an embodiment of a method disclosed herein
(step text illegible in scan)]
Sheet 6 of 10

FIG. 4B: [flow chart detailing the scale adjustment step of the method
in FIG. 4A (step text illegible in scan)]
Sheet 7 of 10

FIG. 4C: [two images with corresponding points]
Sheet 8 of 10

FIG. 5A: [sectioned view of dual-aperture imaging system 500 with a
single auto-focus mechanism: sub-camera 1 with AF optics, sub-camera 2
with fixed focus optics]
Sheet 9 of 10

FIG. 5B: [flow chart of a method for auto-focus imaging with the
imaging system of FIG. 5A (step text illegible in scan)]
Sheet 10 of 10

FIG. 6: [sectioned view of a dual-aperture imaging system with a
single auto-focus mechanism: sub-camera 1 and sub-camera 2 optics held
in a mechanical lens holder 604, driven by AF mechanism 602]
THIN MULTI-APERTURE IMAGING SYSTEM WITH AUTO-FOCUS AND METHODS FOR
USING SAME

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a 371 application from international application
PCT/IB2014/063393 and is related to and claims priority from U.S.
Provisional Patent Application No. 61/861,185 filed Aug. 1, 2013 and
having the same title, which is incorporated herein by reference in
its entirety.
FIELD

Embodiments disclosed herein relate in general to digital cameras and
in particular to thin multi-aperture digital cameras with auto-focus.

BACKGROUND
`
In recent years, mobile devices such as cell-phones, tablets and
laptops have become ubiquitous. Most of these devices include one or
two compact cameras: a main rear-facing camera (i.e. a camera on the
back side of the device, facing away from the user and often used for
casual photography) and a secondary front-facing camera (i.e. a camera
located on the front side of the device and often used for video
conferencing).
Although relatively compact in nature, the design of most of these
cameras is very similar to the traditional structure of a digital
still camera, i.e. they comprise an optical component (or a train of
several optical elements and a main aperture) placed on top of an
image sensor. The optical component (also referred to as "optics")
refracts the incoming light rays and bends them to create an image of
a scene on the sensor. The dimensions of these cameras are largely
determined by the size of the sensor and by the height of the optics.
These are usually tied together through the focal length ("f") of the
lens and its field of view (FOV): a lens that has to image a certain
FOV on a sensor of a certain size has a specific focal length. Keeping
the FOV constant, the larger the sensor dimensions (e.g. in an X-Y
plane), the larger the focal length and the optics height.
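The focal-length/FOV relation described above can be put in numbers
(a minimal illustration, not part of the patent; the sensor widths and
the 70-degree FOV below are assumed example values):

```python
import math

def focal_length_mm(sensor_width_mm: float, fov_deg: float) -> float:
    """Focal length needed to image a horizontal FOV onto a sensor of
    the given width: f = (w / 2) / tan(FOV / 2)."""
    return (sensor_width_mm / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

# Keeping the FOV constant at 70 degrees: halving the sensor width
# halves the required focal length, and with it the optics height.
f_large = focal_length_mm(6.0, 70.0)  # ~4.28 mm
f_small = focal_length_mm(3.0, 70.0)  # ~2.14 mm
print(round(f_large, 2), round(f_small, 2))
```

This is the geometric driver behind the multi-aperture designs
discussed next: smaller sensors per aperture permit shorter optics.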
As the dimensions of mobile devices shrink, the compact camera
dimensions become more and more a key factor that limits the device
thickness. Several approaches have been proposed to reduce the compact
camera thickness in order to alleviate this constraint. Recently,
multi-aperture systems have been proposed for this purpose. In such
systems, instead of having one aperture with one train of optical
elements, the camera is divided into several apertures, each with
dedicated optical elements, all apertures sharing a similar field of
view. Hereinafter, each such aperture, together with the optics and
the sensor area on which the image is formed, is defined as a
"sub-camera". Typically, in multi-aperture camera designs, each
sub-camera creates a smaller image on the image sensor compared with
the image created by a reference single-aperture camera. Therefore,
the height of each sub-camera can be smaller than the height of a
single-aperture camera, so that the total height of the camera can be
reduced, allowing for slimmer designs of mobile devices.
FIG. 1A and FIG. 1B show a schematic design of a traditional camera
and of a dual-aperture camera with two sub-cameras, respectively. A
traditional camera 100' in FIG. 1A includes an image sensor 102 placed
on a substrate 104 and a lens 106. A "camera height" is defined as the
height of the camera module, from substrate 104 to the top of lens
106. A dual-aperture camera 100" in FIG. 1B includes two sub-cameras,
a sub-camera 1 with an image sensor 112a and a lens 116a with an
optical axis 118a, and a sub-camera 2 with an image sensor 112b and a
lens 116b with an optical axis 118b. The two sensors are placed on,
respectively, substrates 114a and 114b. For comparison's sake, it is
assumed that the reference single-aperture camera and the
dual-aperture camera have the same field of view (FOV) and the sensors
have the same pixel size. However, image sensor 102 has a higher
resolution (number of pixels) compared with image sensor 112a or image
sensor 112b, and is therefore larger in size. The potential advantage
in camera height of the dual-aperture camera (i.e. the thickness from
substrate 114a to the top of lens 116a and from substrate 114b to the
top of lens 116b) may be appreciated.
There are several significant challenges involved in multi-aperture
camera designs. First and foremost, the sensor area of each sub-camera
is smaller compared with that of a single-aperture camera. If the
pixel size in each sub-camera sensor is kept the same as that in the
single-aperture camera sensor, the resolution of an image captured by
each sub-camera is smaller than that captured by the single-aperture
camera. If the resolution of the output image is to be kept the same,
the images from the different sub-cameras need to be combined into a
higher-resolution image. This is usually done in the digital domain,
by a dedicated algorithm. Several methods have been proposed for
combining lower-resolution images to produce a higher-resolution
image. Some algorithms in such methods require a registration step
between the set of low-resolution images, to account for parallax
(which is present in a multi-aperture camera system due to the shift
in point-of-view between sub-cameras). One such algorithm is described
in co-assigned PCT patent application PCT/IB2014/062180 titled "Dual
aperture zoom digital camera", which is incorporated herein by
reference in its entirety.
Another challenge relates to the requirement that the camera provide
an in-focus image for a wide range of object distances (usually from
several centimeters to infinity in compact camera modules). To fulfill
this requirement, a single-aperture camera may include an Auto-Focus
(AF) mechanism that controls the focus position of the optics, by
moving the optical element along the optical axis, thus changing its
height above the sensor. In multi-aperture cameras, in order to
support an in-focus image for a wide range of object distances, a
straightforward approach would be to provide a dedicated AF mechanism
in each sub-camera. This approach has several drawbacks including
increased size and cost of the camera, higher operating power and more
complicated control, as the AF mechanisms of each sub-camera need to
be synchronized, to ensure all of the sub-cameras are focused to the
same position.
Another complication that may arise when using an AF mechanism in a
multi-aperture camera is connected with the algorithm that combines
the lower resolution sub-camera images to produce a higher resolution
image. Since an AF mechanism moves the optical element along the
optical axis above the sensor, it scales the image that is formed on
the sensor to some extent. Slight differences between the focusing
positions of different AF mechanisms in each sub-camera may result in
different scales applied to the lower resolution sub-camera images.
Such differences in scale may degrade the performance of the image
registration step in the algorithm. Correcting for the different scale
is not trivial, due to the dynamic nature of the scale: the scale
applied on the image depends on the focus position of the optics,
which in turn changes with object distance. This means that the scale
cannot be trivially corrected by calibrating the multi-aperture camera
and applying a fixed correction, but rather, the correct scale has to
be estimated for each image. Estimating the correct scale to apply
from the image is not trivial, in the presence of parallax (where
different objects appear at different locations as a function of their
distance from the camera) and in the presence of possible occlusions
of objects in one aperture but not in the other. There is therefore a
need for a method that can accurately estimate and correct differences
in scaling on a per-image basis.
As an alternative to using AF, multi-aperture camera designs have been
proposed with no AF mechanism at all. Such designs rely on the smaller
focal length of each sub-camera to provide increased depth-of-focus
(DOF) compared with a corresponding single-aperture camera that
supports a larger sensor. Since a larger DOF means that a wider range
of object distances is imaged in-focus onto the sensor, the AF
mechanism could be removed. While this approach is advantageous in
terms of cost, size and system complexity, the larger DOF that results
from the shorter focal length of a multi-aperture camera is often
insufficient to support an in-focus image for object distances ranging
from a few centimeters to infinity. In these cases, settling for a
multi-aperture camera with fixed-focus optics results in poor imaging
performance at close object distances.
Between using multiple AF mechanisms and using only fixed-focus
optics, there is a need for a multi-aperture camera system that
combines the benefits of an AF mechanism without adding additional
complexity and cost to the camera system.
SUMMARY

Embodiments disclosed herein provide designs of a multi-aperture
camera with an AF mechanism, describe an algorithm that dynamically
corrects scale differences between sub-camera images, and propose a
color filter array (CFA) design that may result in higher resolution
and sensitivity when combining sub-camera images, compared with
standard CFAs.
In various embodiments, there are provided dual-aperture digital
cameras with auto-focus (AF) for imaging an object or scene, each such
dual-aperture digital camera comprising a first sub-camera that
includes a first optics bloc and a color image sensor with a first
number of pixels, the first camera operative to provide a color image
of the object or scene, a second sub-camera that includes a second
optics bloc and a clear image sensor having a second number of pixels,
the second sub-camera operative to provide a luminance image of the
object or scene, the first and second sub-cameras having substantially
the same field of view, an AF mechanism coupled mechanically at least
to the first optics bloc, and a camera controller coupled to the AF
mechanism and to the two image sensors and configured to control the
AF mechanism, to calculate a scaling difference and a sharpness
difference between the color and luminance images, the scaling and
sharpness differences being due to the AF mechanism, and to process
the color and luminance images into a fused color image using the
calculated differences.
The first number of pixels and second number of pixels may be equal or
different. The first and second image sensors are formed on a single
substrate. The first sub-camera may include an infra-red (IR) filter
that blocks IR wavelengths from entering the color image sensor and
the second sub-camera may be configured to allow at least some IR
wavelengths to enter the clear image sensor. In some embodiments, the
color image sensor may include a non-Bayer color filter array (CFA).
In an embodiment, the AF mechanism may be coupled mechanically to the
first optics bloc, and the second optics bloc may have a fixed focus
position. In an embodiment, the fixed focus position may be such that
a DOF range of the second sub-camera is between infinity and less than
about 100 cm. In an embodiment, the AF mechanism may be coupled
mechanically to the first and second optics blocs and operative to
move them together in a direction common to respective optics bloc
optical axes.
In an embodiment, the camera may further comprise an optical image
stabilization mechanism coupled mechanically to the first and second
optics blocs and operative to move them in a direction perpendicular
to respective optics bloc optical axes to optically stabilize the AF
fused color image.
In an embodiment there is provided a method for obtaining a focused
color image of an object or scene using a dual-aperture camera,
comprising the steps of obtaining simultaneously an auto-focused color
image and an auto-focused or fixed focus luminance image of the object
or scene, wherein the color image has a first resolution, a first
effective resolution and a first signal-to-noise ratio (SNR), and
wherein the luminance image has a second resolution, a second
effective resolution and a second SNR, preprocessing the two images to
obtain respective rectified, normalized and scale-adjusted color and
luminance images considering scaling and sharpness differences caused
by the AF action, performing local registration between the rectified,
normalized and scale-adjusted color and luminance images to obtain
registered images, and fusing the registered images into a focused
fused color image.
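The sequence of steps above can be sketched as a minimal pipeline. This
is illustrative only: the function names, the nearest-neighbor scale
adjustment and the luminance-replacement fusion are stand-ins assumed
for the sketch, not the patent's actual preprocessing, registration or
fusion algorithms:

```python
import numpy as np

def preprocess(color, luma, scale):
    """Stand-in for rectification/normalization/scale adjustment:
    normalize intensities, then resample the luminance image by the
    estimated scale factor (crude nearest-neighbor resampling)."""
    color = color / max(color.max(), 1e-9)
    luma = luma / max(luma.max(), 1e-9)
    h, w = luma.shape
    ys = np.clip((np.arange(h) / scale).astype(int), 0, h - 1)
    xs = np.clip((np.arange(w) / scale).astype(int), 0, w - 1)
    return color, luma[np.ix_(ys, xs)]

def fuse(color, luma):
    """Stand-in fusion: replace the color image's luminance with the
    clear sensor's (higher effective resolution / SNR) luminance."""
    old_luma = color.mean(axis=2, keepdims=True)
    return color * (luma[..., None] / np.maximum(old_luma, 1e-9))

rng = np.random.default_rng(0)
color = rng.random((64, 64, 3))   # from the color sub-camera
luma = rng.random((64, 64))       # from the clear sub-camera
c, l = preprocess(color, luma, scale=1.02)
fused = fuse(c, l)
print(fused.shape)  # (64, 64, 3)
```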
In an embodiment, the step of preprocessing to obtain scale-adjusted
color and luminance images includes calculating a set of corresponding
points in the color and luminance images, extracting a single
coordinate from each corresponding point and using the single
coordinate to estimate a scaling factor S between the color and
luminance images. The extracted coordinate is Y and the scaling factor
S may be given by S = (Y2'*W*Y2)\Y2'*W*Y1, where Y1 is a vector of Y
coordinates of points taken from one image, Y2 is a vector of Y
coordinates of points taken from the other image, and W is a diagonal
matrix that holds the absolute values of Y2.
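In the MATLAB-style notation above, the backslash is a left matrix
division, so for a scalar S the formula reduces to a weighted
least-squares ratio. A minimal numerical sketch (the variable names
and example point coordinates are assumed, not from the patent):

```python
import numpy as np

def estimate_scale(y1, y2):
    """Weighted least-squares scale estimate S = (Y2'WY2) \\ (Y2'WY1),
    with W = diag(|Y2|), per the formula in the text."""
    y1 = np.asarray(y1, dtype=float)
    y2 = np.asarray(y2, dtype=float)
    w = np.abs(y2)  # diagonal weights |y2_i|
    return (y2 * w * y1).sum() / (y2 * w * y2).sum()

# Y coordinates in one image are a 1.03x scaled copy of those in the
# other image, so the estimator should recover S = 1.03.
y2 = np.array([-120.0, -40.0, 35.0, 90.0, 160.0])
y1 = 1.03 * y2
print(round(estimate_scale(y1, y2), 6))  # 1.03
```

Weighting by |Y2| gives points far from the image center, where the
scale change displaces points the most, more influence on the estimate.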
In an embodiment, a method may further comprise using scaling factor S
to scale one of the images to match the other image, thereby obtaining
the registered images.
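Once S is known, one image can be resampled so its coordinates match
the other's. A minimal nearest-neighbor sketch (the function name and
the center-anchored, same-size resampling are assumptions for
illustration, not the patent's implementation):

```python
import numpy as np

def apply_scale(img, s):
    """Resample `img` by scale factor `s` about the image center,
    keeping the output the same size (nearest-neighbor)."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys = np.clip(np.round((np.arange(h) - cy) / s + cy), 0, h - 1).astype(int)
    xs = np.clip(np.round((np.arange(w) - cx) / s + cx), 0, w - 1).astype(int)
    return img[np.ix_(ys, xs)]

img = np.arange(25.0).reshape(5, 5)
out = apply_scale(img, 1.0)      # identity scale leaves the image unchanged
print(np.array_equal(out, img))  # True
```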
In an embodiment, a method may further comprise optically stabilizing
the obtained color and luminance images.
BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting examples of embodiments disclosed herein are described
below with reference to figures attached hereto that are listed
following this paragraph. The drawings and descriptions are meant to
illuminate and clarify embodiments disclosed herein, and should not be
considered limiting in any way.
FIG. 1A shows schematically the design of a traditional digital
camera;

FIG. 1B shows schematically the design of a dual-aperture camera;

FIG. 2 shows schematically an embodiment of a dual-aperture imaging
system with auto-focus disclosed herein, in (a) a general isomeric
view, and (b) a sectioned isomeric view;
FIG. 3 shows an embodiment of an image sensor for the imaging system
in FIG. 2, in which one sub-camera has a CFA sensor, while another
sub-camera has a clear sensor;

FIG. 4A shows schematically in a flow chart an embodiment of a method
disclosed herein;

FIG. 4B shows in a flow chart details of the scale adjustment step in
the method shown in FIG. 4A;

FIG. 4C shows two images with corresponding points;

FIG. 5A shows schematically another embodiment of a dual-aperture
imaging system with a single auto-focus mechanism disclosed herein in
a sectioned isomeric view;

FIG. 5B shows schematically in a flow chart an embodiment of a method
for auto-focus imaging with the imaging system in FIG. 5A;

FIG. 6 shows schematically yet another embodiment of a dual-aperture
imaging system with a single auto-focus mechanism in a sectioned
isomeric view.
DETAILED DESCRIPTION
FIG. 2 shows schematically an embodiment of a dual-aperture imaging
system with auto-focus disclosed herein and numbered 200, in (a) a
general isomeric view, and (b) a sectioned isomeric view. In the
following description, "imaging system" and "camera" may be used
interchangeably. System 200 comprises two sub-cameras, labeled 202 and
204, each sub-camera having its own optics. Thus, sub-camera 202
includes an optics bloc 206 with an aperture 208 and an optical lens
module 210, as well as a sensor 212. Similarly, sub-camera 204
includes an optics bloc 214 with an aperture 216 and an optical lens
module 218, as well as a sensor 220. The sensors are also referred to
henceforth as "sensor 1" (212) and "sensor 2" (220). Note that the two
sensors may be implemented as two distinct areas on the same
substrate, and not necessarily as two stand-alone sensors. Each
optical lens module may include several lens elements as well as an
Infra-Red (IR) filter 222a, b. In some embodiments, some or all of the
lens elements belonging to different apertures may be formed on the
same substrate. The two sub-cameras are positioned next to each other,
with a small baseline 224 between the two apertures 208 and 216. Each
sub-camera further includes an auto-focus mechanism, respectively 226
and 228.
The sensors used in each sub-camera may have different color filter
arrays (CFAs). In some embodiments, sensor 1 may have one type of CFA,
while sensor 2 may have another type of CFA. In some embodiments,
sensor 1 may have a CFA and sensor 2 may have a "white" or "clear"
filter array (marked by "W"), in which all the pixels absorb the same
wide range of wavelengths, e.g. between 400 nm and 700 nm (instead of
each pixel absorbing a smaller portion of the spectrum). A sensor
having a color filter array may be referred to henceforth as a "color
image sensor", while a sensor with a clear or W filter array is
referred to as a "clear image sensor". FIG. 3A shows a sensor
embodiment 300, where numeral "1" represents sensor 1 (with a CFA) and
numeral "2" represents sensor 2 (with a clear "white" filter array).
Circles 302a, 302b mark image circles formed by the optics on the
sensors, while a white area 304 marks the substrate on which the
sensors are located. Circ
`substrate on which the sensors are located. Circ