(12) United States Patent
Cohen et al.
(10) Patent No.: US 10,230,898 B2
(45) Date of Patent: Mar. 12, 2019
(54) DUAL APERTURE ZOOM CAMERA WITH VIDEO SUPPORT AND SWITCHING/NON-SWITCHING DYNAMIC CONTROL

(71) Applicant: Corephotonics Ltd., Tel-Aviv (IL)

(72) Inventors: Noy Cohen, Tel-Aviv (IL); Oded Gigushinski, Herzlia (IL); Nadav Geva, Tel-Aviv (IL); Gal Shabtay, Tel-Aviv (IL); Ester Ashkenazi, Modi’in (IL); Ruthy Katz, Tel Aviv (IL); Ephraim Goldenberg, Ashdod (IL)

(73) Assignee: Corephotonics Ltd., Tel Aviv (IL)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(52) U.S. Cl.
    CPC ....... H04N 5/23296 (2013.01); H04N 5/2258 (2013.01); H04N 5/23216 (2013.01); H04N 5/23245 (2013.01)

(58) Field of Classification Search
    CPC ....... H04N 5/23296; H04N 5/2258; H04N 5/23216; H04N 5/23245
    See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

4,199,785 A  4/1980  McCullough et al.
5,005,083 A  4/1991  Grage et al.
(Continued)
FOREIGN PATENT DOCUMENTS

CN  101276415 A  10/2008
CN  102739949 A  10/2012
(Continued)
OTHER PUBLICATIONS

International Search Report and Written Opinion issued in relation to PCT patent application PCT/IB2016/053803 dated Jun. 26, 2016, 9 pages.
(Continued)
Primary Examiner - Nhan T Tran
(74) Attorney, Agent, or Firm - Nathan & Associates; Menachem Nathan
(57) ABSTRACT

A dual-aperture zoom digital camera operable in both still and video modes. The camera includes Wide and Tele imaging sections with respective lens/sensor combinations and image signal processors and a camera controller operatively coupled to the Wide and Tele imaging sections. The Wide and Tele imaging sections provide respective image data. The controller is configured to output, in a zoom-in operation between a lower zoom factor (ZF) value and a higher ZF value, a zoom video output image that includes only Wide image data or only Tele image data, depending on whether a no-switching criterion is fulfilled or not.

20 Claims, 5 Drawing Sheets
(21) Appl. No.: 15/324,720
(22) PCT Filed: Jun. 26, 2016
(86) PCT No.: PCT/IB2016/053803
    § 371 (c)(1),
    (2) Date: Jan. 8, 2017
(87) PCT Pub. No.: WO2017/025822
    PCT Pub. Date: Feb. 16, 2017
(65) Prior Publication Data
    US 2018/0184010 A1    Jun. 28, 2018

Related U.S. Application Data
(60) Provisional application No. 62/204,667, filed on Aug. 13, 2015.

(51) Int. Cl.
    H04N 5/232    (2006.01)
    H04N 5/225    (2006.01)
[Representative drawing (see FIG. 1A and FIG. 3A).]
(56) References Cited

U.S. PATENT DOCUMENTS

5,032,917  7/1991   Aschwanden
5,051,830  9/1991   von Hoessle
5,248,971  9/1993   Mandl
5,287,093  2/1994   Amano et al.
5,436,660  7/1995   Sakamoto
5,444,478  8/1995   Lelong et al.
5,459,520  10/1995  Sasaki
5,657,402  8/1997   Bender et al.
5,682,198  10/1997  Katayama et al.
5,710,670  1/1998   Ohno
5,768,443  6/1998   Michael et al.
5,926,190  7/1999   Turkowski et al.
5,940,641  8/1999   McIntyre et al.
5,982,951  11/1999  Katayama et al.
6,101,334  8/2000   Fantone
6,104,432  8/2000   Nakamura et al.
6,128,416  10/2000  Oura
6,148,120  11/2000  Sussman
6,208,765  3/2001   Bergen
6,268,611  7/2001   Pettersson et al.
6,549,215  4/2003   Jouppi
6,611,289  8/2003   Yu et al.
6,643,416  11/2003  Daniels et al.
6,650,368  11/2003  Doron
6,680,748  1/2004   Monti
6,714,665  3/2004   Hanna et al.
6,724,421  4/2004   Glatt
6,738,073  5/2004   Park et al.
6,741,250  5/2004   Furlan et al.
6,750,903  6/2004   Miyatake et al.
6,778,207  8/2004   Lee et al.
7,002,583  2/2006   Rabb, III
7,015,954  3/2006   Foote et al.
7,038,716  5/2006   Klein et al.
7,199,348  4/2007   Olsen et al.
7,206,136  4/2007   Labaziewicz et al.
7,248,294  7/2007   Slatter
7,256,944  8/2007   Labaziewicz et al.
7,305,180  12/2007  Labaziewicz et al.
7,339,621  3/2008   Fortier
7,346,217  3/2008   Gold, Jr.
7,365,793  4/2008   Cheatle et al.
7,411,610  8/2008   Doyle
7,424,218  9/2008   Baudisch et al.
7,509,041  3/2009   Hosono
7,533,819  5/2009   Barkan et al.
7,619,683  11/2009  Davis
7,738,016  6/2010   Toyofuku
7,880,776  2/2011   LeGall et al.
7,918,398  4/2011   Li et al.
7,964,835  6/2011   Olsen et al.
7,978,239  7/2011   Deever et al.
8,115,825  2/2012   Culbert et al.
8,149,327  4/2012   Lin et al.
8,154,610  4/2012   Jo et al.
8,238,695  8/2012   Davey et al.
8,274,552  9/2012   Dahi et al.
8,390,729  3/2013   Long et al.
8,391,697  3/2013   Cho et al.
8,400,555  3/2013   Georgiev et al.
8,401,276  3/2013   Choe et al.
8,439,265  5/2013   Ferren et al.
8,446,484  5/2013   Muukki et al.
8,483,452  7/2013   Ueda et al.
8,514,491  8/2013   Duparre
8,547,389  10/2013  Hoppe et al.
8,553,106  10/2013  Scarff
8,587,691 B2   11/2013  Takane
8,619,148 B1   12/2013  Watts et al.
8,803,990 B2   8/2014   Smith
8,976,255 B2   3/2015   Matsuoto et al.
9,019,387 B2   4/2015   Nakano
9,025,073 B2   5/2015   Attar et al.
9,025,077 B2   5/2015   Attar et al.
9,041,835 B2   5/2015   Honda
9,137,447 B2   9/2015   Shibuno
9,185,291 B1   11/2015  Shabtay et al.
9,215,377 B2   12/2015  Sokeila et al.
9,215,385 B2   12/2015  Luo
9,270,875 B2   2/2016   Brisedoux et al.
9,286,680 B1   3/2016   Jiang et al.
9,344,626 B2   5/2016   Silverstein et al.
9,360,671 B1   6/2016   Zhou
9,369,621 B2   6/2016   Malone et al.
9,413,930 B2   8/2016   Geerds
9,413,984 B2   8/2016   Attar et al.
9,420,180 B2   8/2016   Jin
9,438,792 B2   9/2016   Nakada et al.
9,485,432 B1   11/2016  Medasani et al.
9,578,257 B2   2/2017   Attar et al.
9,618,748 B2   4/2017   Munger et al.
9,681,057 B2   6/2017   Attar et al.
9,723,220 B2   8/2017   Sugie
9,736,365 B2   8/2017   Laroia
9,736,391 B2   8/2017   Du et al.
9,800,798 B2   10/2017  Ravirala et al.
9,851,803 B2   12/2017  Fisher et al.
9,894,287 B2   2/2018   Qian et al.
9,900,522 B2   2/2018   Lu
2002/0005902 A1   1/2002   Yuen
2002/0063711 A1   5/2002   Park et al.
2002/0122113 A1   9/2002   Foote
2003/0030729 A1   2/2003   Prentice et al.
2003/0093805 A1   5/2003   Gin
2003/0160886 A1   8/2003   Misawa et al.
2003/0017930 A1   9/2003   Bittner
2003/0202113 A1   10/2003  Yoshikawa
2004/0008773 A1   1/2004   Itokawa
2004/0017386 A1   1/2004   Liu et al.
2004/0027367 A1   2/2004   Pilu
2004/0061788 A1   4/2004   Bateman
2004/0240052 A1   12/2004  Minefuji et al.
2005/0013509 A1   1/2005   Samadani
2005/0046740 A1   3/2005   Davis
2005/0157184 A1   7/2005   Nakanishi et al.
2005/0200718 A1   9/2005   Lee
2006/0054782 A1   3/2006   Olsen et al.
2006/0056056 A1   3/2006   Ahiska et al.
2006/0125937 A1   6/2006   LeGall et al.
2006/0170793 A1   8/2006   Pasquarette et al.
2006/0175549 A1   8/2006   Miller et al.
2006/0187310 A1   8/2006   Janson et al.
2006/0187322 A1   8/2006   Janson et al.
2006/0187338 A1   8/2006   May et al.
2007/0024737 A1   2/2007   Nakamura et al.
2007/0025713 A1*  2/2007   Hosono ................ H04N 5/2259; 396/72
2007/0177025 A1   8/2007   Kopet et al.
2007/0182833 A1*  8/2007   Toyofuku ............. H04N 5/232; 348/240.3
2007/0188653 A1   8/2007   Pollock et al.
2007/0189386 A1   8/2007   Imagawa et al.
2007/0257184 A1   11/2007  Olsen et al.
2007/0285550 A1   12/2007  Son
2008/0017557 A1   1/2008   Witdouck
2008/0024614 A1   1/2008   Li et al.
2008/0025634 A1   1/2008   Border et al.
2008/0030592 A1   2/2008   Border et al.
2008/0030611 A1   2/2008   Jenkins
2008/0084484 A1   4/2008   Ochi et al.
2008/0117316 A1   5/2008   Orimoto
2008/0218611 A1   9/2008   Parulski et al.
2008/0218612 A1   9/2008   Border et al.
2008/0218613 A1   9/2008   Janson et al.
2008/0219654 A1   9/2008   Border et al.
2009/0086074 A1   4/2009   Li et al.
(56) References Cited

U.S. PATENT DOCUMENTS

2009/0102950 A1   4/2009   Ahiska
2009/0122195 A1   5/2009   Van Baar et al.
2009/0122406 A1   5/2009   Rouvinen et al.
2009/0128644 A1   5/2009   Camp et al.
2009/0219547 A1   9/2009   Kauhanen et al.
2009/0252484 A1   10/2009  Hasuda et al.
2009/0295949 A1   12/2009  Ojala
2010/0013906 A1   1/2010   Border et al.
2010/0060746 A9   3/2010   Olsen et al.
2010/0103194 A1   4/2010   Chen et al.
2010/0238327 A1   9/2010   Griffith et al.
2010/0277619 A1*  11/2010  Scarff .................. H04N 5/2258; 348/240.1
2011/0064327 A1   3/2011   Dagher et al.
2011/0080487 A1   4/2011   Venkataraman et al.
2011/0128288 A1   6/2011   Petrou et al.
2011/0164172 A1   7/2011   Shintani et al.
2011/0229054 A1   9/2011   Weston et al.
2011/0234853 A1   9/2011   Hayashi et al.
2011/0242286 A1   10/2011  Pace et al.
2011/0242355 A1   10/2011  Goma et al.
2012/0026366 A1   2/2012   Golan et al.
2012/0069235 A1   3/2012   Imai
2012/0075489 A1   3/2012   Nishihara
2012/0105579 A1   5/2012   Jeon et al.
2012/0196648 A1   8/2012   Havens et al.
2012/0229663 A1   9/2012   Nelson et al.
2012/0249815 A1   10/2012  Bohn et al.
2012/0287315 A1   11/2012  Huang et al.
2013/0002928 A1   1/2013   Imai
2013/0093842 A1   4/2013   Yahata
2013/0135445 A1   5/2013   Dahi et al.
2013/0182150 A1   7/2013   Asakura
2013/0201360 A1   8/2013   Song
2013/0202273 A1   8/2013   Ouedraogo et al.
2013/0235224 A1   9/2013   Park et al.
2013/0250150 A1   9/2013   Malone et al.
2013/0258044 A1   10/2013  Betts-LaCroix
2013/0321668 A1   12/2013  Kamath
2014/0049615 A1   2/2014   Uwagawa
2014/0118584 A1   5/2014   Lee et al.
2014/0192238 A1   7/2014   Attar et al.
2014/0192253 A1   7/2014   Laroia
2014/0253693 A1*  9/2014   Shikata .............. H04N 5/23245; 348/47
2014/0267834 A1*  9/2014   Aoki .................. H04N 5/23296; 348/240.1
2014/0313316 A1   10/2014  Olsson et al.
2014/0362242 A1   12/2014  Takizawa
2015/0085174 A1*  3/2015   Shabtay ............. H04N 5/23296; 348/336
2015/0092066 A1   4/2015   Geiss et al.
2015/0154776 A1   6/2015   Zhang et al.
2015/0195458 A1   7/2015   Nakayama et al.
2015/0215516 A1   7/2015   Dolgin
2015/0237280 A1   8/2015   Choi et al.
2015/0242994 A1   8/2015   Shen
2015/0244942 A1   8/2015   Shabtay et al.
2015/0271471 A1   9/2015   Hsieh et al.
2015/0334309 A1   11/2015  Peng et al.
2016/0044250 A1   2/2016   Shabtay et al.
2016/0154202 A1   6/2016   Wippermann et al.
2016/0212358 A1   7/2016   Shikata
2016/0241793 A1*  8/2016   Ravirala ............ H04N 5/23296
2016/0301840 A1   10/2016  Du et al.
2016/0353012 A1   12/2016  Kao et al.
2017/0019616 A1   1/2017   Zhu et al.
2017/0214846 A1   7/2017   Du et al.
2017/0214866 A1   7/2017   Zhu et al.
2017/0289458 A1   10/2017  Song et al.
2018/0150973 A1   5/2018   Tang et al.
FOREIGN PATENT DOCUMENTS

CN  103024272 A     4/2013
EP  2523450 A1      11/2012
JP  04211230 A      8/1992
JP  H07318864 A     12/1995
JP  08271976 A      10/1996
JP  2003298920 A    10/2003
JP  2005099265 A    4/2005
JP  2006238325 A    9/2006
JP  2007228006 A    9/2007
JP  2007306282 A    11/2007
JP  2013106289 A    5/2013
KR  20100008936 A   1/2010
KR  20140014787 A   2/2014
KR  101477178 B1    12/2014
WO  2014199338 A2   12/2014
OTHER PUBLICATIONS

Statistical Modeling and Performance Characterization of a Real-Time Dual Camera Surveillance System, Greiffenhagen et al., Publisher: IEEE, 2000, 8 pages.
A 3MPixel Multi-Aperture Image Sensor With 0.7 µm Pixels in 0.11 µm CMOS, Fife et al., Stanford University, 2008, 3 pages.
Dual camera intelligent sensor for high definition 360 degrees surveillance, Scotti et al., Publisher: IET, May 9, 2000, 8 pages.
Dual-sensor foveated imaging system, Hua et al., Publisher: Optical Society of America, Jan. 14, 2008, 11 pages.
Defocus Video Matting, McGuire et al., Publisher: ACM SIGGRAPH, Jul. 31, 2005, 11 pages.
Compact multi-aperture imaging with high angular resolution, Santacana et al., Publisher: Optical Society of America, 2015, 10 pages.
Multi-Aperture Photography, Green et al., Publisher: Mitsubishi Electric Research Laboratories, Inc., Jul. 2007, 10 pages.
Multispectral Bilateral Video Fusion, Bennett et al., Publisher: IEEE, May 2007, 10 pages.
Super-resolution imaging using a camera array, Santacana et al., Publisher: Optical Society of America, 2014, 6 pages.
Optical Splitting Trees for High-Precision Monocular Imaging, McGuire et al., Publisher: IEEE, 2007, 11 pages.
High Performance Imaging Using Large Camera Arrays, Wilburn et al., Publisher: Association for Computing Machinery, Inc., 2005, 12 pages.
Real-time Edge-Aware Image Processing with the Bilateral Grid, Chen et al., Publisher: ACM SIGGRAPH, 9 pages.
Superimposed multi-resolution imaging, Carles et al., Publisher: Optical Society of America, 2017, 13 pages.
Viewfinder Alignment, Adams et al., Publisher: Eurographics, 2008, 10 pages.
Dual-Camera System for Multi-Level Activity Recognition, Bodor et al., Publisher: IEEE, Oct. 2014, 6 pages.
Engineered to the task: Why camera-phone cameras are different, Giles Humpston, Publisher: Solid State Technology, Jun. 2009, 3 pages.

* cited by examiner
[FIG. 1A: block diagram of an exemplary dual-aperture zoom imaging system, including Wide and Tele cameras with respective image signal processors. FIG. 1B: schematic mechanical diagram of the dual-aperture zoom imaging system.]
[FIG. 2: an example of a Wide sensor, a Tele sensor and their respective FOVs (reference numerals 202, 204).]
FIG. 3A (method flowchart):
  302: Choose sensor(s) to be operational.
  304: Optionally, calculate color balance if two (Wide and Tele) images are provided by the two sensors.
  306: Optionally, apply calculated color balance in one of the images.
  308: Optionally, perform registration between the Wide and Tele images to output a transformation coefficient.
  310: Set an AF position using the transformation coefficient.
  312: Process an output of any of steps 302-308 to obtain a processed image.
  314: Resample the processed image according to the transformation coefficient, requested ZF, and output video resolution.
[FIG. 3B: exemplary feature points in an object. FIG. 3C: schematic illustration of a known rectification process.]
[FIG. 4: graph of effective resolution vs. user zoom factor, with a zoom-down region and a zoom-up region.]
DUAL APERTURE ZOOM CAMERA WITH VIDEO SUPPORT AND SWITCHING/NON-SWITCHING DYNAMIC CONTROL

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a 371 application from international patent application PCT/IB2016/053803 filed Jun. 26, 2016, and is related to and claims priority from U.S. Provisional Patent Application No. 62/204,667 filed Aug. 13, 2015, which is expressly incorporated herein by reference in its entirety.
FIELD

Embodiments disclosed herein relate in general to digital cameras and in particular to zoom digital cameras with video capabilities.
BACKGROUND

Digital camera modules are currently being incorporated into a variety of host devices. Such host devices include cellular telephones, personal data assistants (PDAs), computers, and so forth. Consumer demand for digital camera modules in host devices continues to grow.

Host device manufacturers prefer digital camera modules to be small, so that they can be incorporated into the host device without increasing its overall size. Further, there is an increasing demand for such cameras to have higher-performance characteristics. One such characteristic possessed by many higher-performance cameras (e.g., standalone digital still cameras) is the ability to vary the focal length of the camera to increase and decrease the magnification of the image. This ability, typically accomplished with a zoom lens, is known as optical zooming. "Zoom" is commonly understood as a capability to provide different magnifications of the same scene and/or object by changing the focal length of an optical system, with a higher level of zoom associated with greater magnification and a lower level of zoom associated with lower magnification. Optical zooming is typically accomplished by mechanically moving lens elements relative to each other. Such zoom lenses are typically more expensive, larger and less reliable than fixed focal length lenses. An alternative approach for approximating the zoom effect is achieved with what is known as digital zooming. With digital zooming, instead of varying the focal length of the lens, a processor in the camera crops the image and interpolates between the pixels of the captured image to create a magnified but lower-resolution image.
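As a minimal illustration of the digital zooming just described (crop, then interpolate), the following Python sketch magnifies an image by a given zoom factor. The function name, the center-crop choice and the use of OpenCV bilinear interpolation are illustrative assumptions, not part of the patent.

    import cv2
    import numpy as np

    def digital_zoom(image: np.ndarray, zoom_factor: float) -> np.ndarray:
        """Crop the central 1/zoom_factor portion and interpolate back to full size."""
        h, w = image.shape[:2]
        crop_h, crop_w = int(h / zoom_factor), int(w / zoom_factor)
        y0, x0 = (h - crop_h) // 2, (w - crop_w) // 2
        crop = image[y0:y0 + crop_h, x0:x0 + crop_w]
        # Interpolating between the cropped pixels yields a magnified but
        # lower-resolution (less detailed) image, as noted in the text.
        return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)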
Attempts to use multi-aperture imaging systems to approximate the effect of a zoom lens are known. A multi-aperture imaging system (implemented for example in a digital camera) includes a plurality of optical sub-systems (also referred to as "cameras"). Each camera includes one or more lenses and/or other optical elements which define an aperture such that received electro-magnetic radiation is imaged by the optical sub-system and a resulting image is directed towards a two-dimensional (2D) pixelated image sensor region. The image sensor (or simply "sensor") region is configured to receive the image and to generate a set of image data based on the image. The digital camera may be aligned to receive electromagnetic radiation associated with scenery having a given set of one or more objects. The set of image data may be represented as digital image data, as well known in the art. Hereinafter in this description,
"image data" and "digital image data" may be used interchangeably. Also, "object" and "scene" may be used interchangeably. As used herein, the term "object" is an entity in the real world imaged to a point or pixel in the image.
Multi-aperture imaging systems and associated methods are described for example in US Patent Publications No. 2008/0030592, 2010/0277619 and 2011/0064327. In US 2008/0030592, two sensors are operated simultaneously to capture an image imaged through an associated lens. A sensor and its associated lens form a lens/sensor combination. The two lenses have different focal lengths. Thus, even though each lens/sensor combination is aligned to look in the same direction, each combination captures an image of the same subject but with two different fields of view (FOV). One sensor is commonly called "Wide" and the other "Tele". Each sensor provides a separate image, referred to respectively as "Wide" (or "W") and "Tele" (or "T") images. A W-image reflects a wider FOV and has lower resolution than the T-image. The images are then stitched (fused) together to form a composite ("fused") image. In the composite image, the central portion is formed by the relatively higher-resolution image taken by the lens/sensor combination with the longer focal length, and the peripheral portion is formed by a peripheral portion of the relatively lower-resolution image taken by the lens/sensor combination with the shorter focal length. The user selects a desired amount of zoom and the composite image is used to interpolate values from the chosen amount of zoom to provide a respective zoom image. The solution offered by US 2008/0030592 requires, in video mode, very large processing resources in addition to high frame rate requirements and high power consumption (since both cameras are fully operational).
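The kind of composite described above (higher-resolution Tele data in the center, Wide data in the periphery) can be pictured with a toy sketch. This is only a schematic illustration under simplifying assumptions (shared optical center, a fixed focal-length ratio, no registration or blending), not the actual algorithm of US 2008/0030592; the function and parameter names are hypothetical.

    import cv2
    import numpy as np

    def naive_composite(wide: np.ndarray, tele: np.ndarray, tele_ratio: float = 2.0) -> np.ndarray:
        """Toy Wide/Tele composite: Wide periphery, Tele center (no registration)."""
        h, w = wide.shape[:2]
        out = wide.copy()
        # Assume the Tele FOV covers the central 1/tele_ratio portion of the Wide FOV.
        ch, cw = int(h / tele_ratio), int(w / tele_ratio)
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        # Scale the Tele image to the size of that central region and paste it in.
        out[y0:y0 + ch, x0:x0 + cw] = cv2.resize(tele, (cw, ch), interpolation=cv2.INTER_AREA)
        return out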
US 2010/0277619 teaches a camera with two lens/sensor combinations, the two lenses having different focal lengths, so that the image from one of the combinations has a FOV approximately 2-3 times greater than the image from the other combination. As a user of the camera requests a given amount of zoom, the zoomed image is provided from the lens/sensor combination having a FOV that is next larger than the requested FOV. Thus, if the requested FOV is less than the smaller FOV combination, the zoomed image is created from the image captured by that combination, using cropping and interpolation if necessary. Similarly, if the requested FOV is greater than the smaller FOV combination, the zoomed image is created from the image captured by the other combination, using cropping and interpolation if necessary. The solution offered by US 2010/0277619 leads to parallax artifacts when moving to the Tele camera in video mode.
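The selection rule described above (serve the zoom request from the combination whose FOV is the next larger than the requested FOV, then crop and interpolate) can be sketched as follows. The helper names are illustrative, and treating the ratio of FOV angles as the crop factor is a small-angle simplification rather than anything the publication specifies.

    import cv2
    import numpy as np

    def crop_and_interpolate(img: np.ndarray, factor: float) -> np.ndarray:
        """Center-crop by 1/factor and interpolate back to the original size."""
        h, w = img.shape[:2]
        ch, cw = int(h / factor), int(w / factor)
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        return cv2.resize(img[y0:y0 + ch, x0:x0 + cw], (w, h),
                          interpolation=cv2.INTER_LINEAR)

    def zoomed_frame(wide: np.ndarray, tele: np.ndarray,
                     wide_fov: float, tele_fov: float, requested_fov: float) -> np.ndarray:
        """Serve the zoom request from the combination whose FOV is the next larger one."""
        if requested_fov <= tele_fov:
            # Requested FOV is narrower than the Tele FOV: crop/interpolate the Tele image.
            return crop_and_interpolate(tele, tele_fov / requested_fov)
        # Otherwise the Wide combination has the next-larger FOV.
        return crop_and_interpolate(wide, wide_fov / requested_fov)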
In both US 2008/0030592 and US 2010/0277619, different focal length systems cause matching Tele and Wide FOVs to be exposed at different times using CMOS sensors. This degrades the overall image quality. Different optical F numbers ("F#") cause image intensity differences. Working with such a dual sensor system requires double bandwidth support, i.e. additional wires from the sensors to the following HW component. Neither US 2008/0030592 nor US 2010/0277619 deal with registration errors.
US 2011/0064327 discloses multi-aperture imaging systems and methods for image data fusion that include providing first and second sets of image data corresponding to an imaged first and second scene respectively. The scenes overlap at least partially in an overlap region, defining a first collection of overlap image data as part of the first set of image data, and a second collection of overlap image data as part of the second set of image data. The second collection
of overlap image data is represented as a plurality of image data subsets such that each of the subsets is based on at least one characteristic of the second collection, and each subset spans the overlap region. A fused set of image data is produced by an image processor, by modifying the first collection of overlap image data based on at least a selected one of, but less than all of, the image data subsets. The systems and methods disclosed in this application deal solely with fused still images.
None of the known art references provide a thin (e.g. fitting in a cell-phone) dual-aperture zoom digital camera with fixed focal length lenses, the camera configured to operate in both still mode and video mode to provide still and video images, wherein the camera configuration does not use any fusion to provide a continuous, smooth zoom in video mode.

Therefore there is a need for, and it would be advantageous to have, thin digital cameras with optical zoom operating in both video and still mode that do not suffer from commonly encountered problems and disadvantages, some of which are listed above.
SUMMARY

Embodiments disclosed herein teach the use of dual-aperture (also referred to as dual-lens or two-sensor) optical zoom digital cameras. The cameras include two cameras, a Wide camera and a Tele camera, each camera including a fixed focal length lens, an image sensor and an image signal processor (ISP). The Tele camera is the higher zoom camera and the Wide camera is the lower zoom camera. In some embodiments, the thickness/effective focal length (EFL) ratio of the Tele lens is smaller than about 1. The image sensor may include two separate 2D pixelated sensors or a single pixelated sensor divided into at least two areas. The digital camera can be operated in both still and video modes. In video mode, optical zoom is achieved "without fusion", by, in some embodiments, switching between the W and T images to shorten computational time requirements, thus enabling high video rate. To avoid discontinuities in video mode, the switching includes applying additional processing blocks, which include in some embodiments image scaling and shifting. In some embodiments, when a no-switching criterion is fulfilled, optical zoom is achieved in video mode without switching.
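The "image scaling and shifting" applied at a switch can be pictured with the sketch below, which scales and translates a Tele frame so that it lands on the Wide frame's coordinates at the switch point and the output does not jump visibly. The specific warp (a similarity transform applied with OpenCV) and all parameter names are assumptions for illustration; the patent does not prescribe this implementation.

    import cv2
    import numpy as np

    def align_tele_to_wide(tele: np.ndarray, scale: float, shift_xy: tuple) -> np.ndarray:
        """Scale and shift a Tele frame so a Wide-to-Tele switch has no visible discontinuity."""
        dx, dy = shift_xy
        # 2x3 affine matrix: isotropic scaling plus a translation (dx, dy) in pixels.
        m = np.float32([[scale, 0.0, dx],
                        [0.0, scale, dy]])
        h, w = tele.shape[:2]
        return cv2.warpAffine(tele, m, (w, h), flags=cv2.INTER_LINEAR)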
As used herein, the term "video" refers to any camera output that captures motion by a series of pictures (images), as opposed to "still mode" that freezes motion. Examples of "video" in cellphones and smartphones include "video mode" or "preview mode".
In order to reach optical zoom capabilities, a different magnification image of the same scene is captured (grabbed) by each camera, resulting in FOV overlap between the two cameras. Processing is applied on the two images to fuse and output one fused image in still mode. The fused image is processed according to a user zoom factor request. As part of the fusion procedure, up-sampling may be applied on one or both of the grabbed images to scale it to the image grabbed by the Tele camera or to a scale defined by the user. The fusion or up-sampling may be applied to only some of the pixels of a sensor. Down-sampling can be performed as well if the output resolution is smaller than the sensor resolution.
The cameras and associated methods disclosed herein address and correct many of the problems and disadvantages of known dual-aperture optical zoom digital cameras. They
provide an overall zoom solution that refers to all aspects: optics, algorithmic processing and system hardware (HW).
In a dual-aperture camera image plane, as seen by each camera (and respective image sensor), a given object will be shifted and have different perspective (shape). This is referred to as point-of-view (POV). The system output image can have the shape and position of either camera image or the shape or position of a combination thereof. If the output image retains the Wide image shape then it has the Wide perspective POV. If it retains the Wide camera position then it has the Wide position POV. The same applies for Tele image position and perspective. As used in this description, the perspective POV may be of the Wide or Tele cameras, while the position POV may shift continuously between the Wide and Tele cameras. In fused images, it is possible to register Tele image pixels to a matching pixel set within the Wide image pixels, in which case the output image will retain the Wide POV ("Wide fusion"). Alternatively, it is possible to register Wide image pixels to a matching pixel set within the Tele image pixels, in which case the output image will retain the Tele POV ("Tele fusion"). It is also possible to perform the registration after either camera image is shifted, in which case the output image will retain the respective Wide or Tele perspective POV.
In an exemplary embodiment, there is provided a zoom digital camera comprising a Wide imaging section that includes a fixed focal length Wide lens with a Wide FOV and a Wide sensor, the Wide imaging section operative to provide Wide image data of an object or scene, a Tele imaging section that includes a fixed focal length Tele lens with a Tele FOV that is narrower than the Wide FOV and a Tele sensor, the Tele imaging section operative to provide Tele image data of the object or scene, and a camera controller operatively coupled to the Wide and Tele imaging sections, the camera controller configured to evaluate a no-switching criterion determined by inputs from both Wide and Tele image data, and, if the no-switching criterion is fulfilled, to output a zoom video output image that includes only Wide image data in a zoom-in operation between a lower zoom factor (ZF) value and a higher ZF value.
In an exemplary embodiment there is provided a method for obtaining zoom images of an object or scene using a digital camera, comprising the steps of providing in the digital camera a Wide imaging section having a Wide lens with a Wide FOV and a Wide sensor, a Tele imaging section having a Tele lens with a Tele FOV that is narrower than the Wide FOV and a Tele sensor, and a camera controller operatively coupled to the Wide and Tele imaging sections, and configuring the camera controller to evaluate a no-switching criterion determined by inputs from both Wide and Tele image data, and, if the no-switching criterion is fulfilled, to output a zoom video output image that includes only Wide image data in a zoom-in operation between a lower ZF value and a higher ZF value.
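A minimal sketch of the controller behavior defined in the two preceding paragraphs: during a zoom-in between the lower and higher ZF values, output only Wide data while the no-switching criterion holds, and otherwise switch to the Tele camera once the zoom factor reaches the up-transfer value. The function name, the "up-transfer ZF" parameter and the string labels are illustrative, not taken from the patent.

    def select_source(zf: float, zf_low: float, zf_up: float, no_switching: bool) -> str:
        """Camera whose data forms the zoom video output image at zoom factor zf (zoom-in)."""
        if no_switching:
            # No-switching criterion fulfilled: output only Wide image data
            # throughout the zoom-in between zf_low and zf_up.
            return "WIDE"
        # Criterion not fulfilled: Wide below the up-transfer point, Tele at/above it.
        return "WIDE" if zf < zf_up else "TELE"

    # Example: zooming in from 1.0x toward 2.5x with an assumed up-transfer ZF of 2.0x.
    sources = [select_source(z, 1.0, 2.0, no_switching=False) for z in (1.0, 1.5, 2.0, 2.5)]
    # -> ['WIDE', 'WIDE', 'TELE', 'TELE']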
In some exemplary embodiments, the no-switching criterion includes a shift between the Wide and Tele images calculated by global registration, the shift being greater than a first threshold.

In some exemplary embodiments, the no-switching criterion includes a disparity range calculated by global registration, the disparity range being greater than a second threshold.

In some exemplary embodiments, the no-switching criterion includes an effective resolution of the Tele image being lower than an effective resolution of the Wide image.
In some exemplary embodiments, the no-switching criterion includes a number of corresponding features in the Wide and Tele images being smaller than a third threshold.

In some exemplary embodiments, the no-switching criterion includes a majority of objects imaged in an overlap area of the Wide and Tele images being calculated to be closer to the camera than a first threshold distance.

In some exemplary embodiments, the no-switching criterion includes some objects imaged in an overlap area of the Wide and Tele images being calculated to be closer than a second threshold distance while other objects imaged in the overlap area of the Wide and Tele images being calculated to be farther than a third distance threshold.
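Several of the listed conditions can be gathered into one check, as sketched below for four of them (registration shift, disparity range, corresponding-feature count and effective resolution). Combining them with a logical OR, the threshold names, and the way the registration statistics are obtained are all assumptions; the patent only states the individual comparisons, and an implementation might use any subset.

    from dataclasses import dataclass

    @dataclass
    class RegistrationStats:
        shift_px: float              # global shift between the Wide and Tele images
        disparity_range_px: float    # disparity range from global registration
        num_matched_features: int    # corresponding features found in both images
        tele_effective_res: float
        wide_effective_res: float

    def no_switching(stats: RegistrationStats,
                     shift_thr: float, disparity_thr: float, feature_thr: int) -> bool:
        """True if any listed condition indicates the switch to Tele should be avoided."""
        return (stats.shift_px > shift_thr
                or stats.disparity_range_px > disparity_thr
                or stats.num_matched_features < feature_thr
                or stats.tele_effective_res < stats.wide_effective_res)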
`
`In some exemplary embodiments, the camera controller
`includes a user control module for receiving user inputs and
`a sensor control module for configuring each sensor to
`acquire the Wide and Tele image data based on the user
`inputs.
`In some exemplary embodiments, the user inputs include
`a zoom factor, a camera mode and a region of interest.
`In some exemplary embodiments, the Tele lens includes a
`ratio of total track length (TTL)/elfective focal length (EFL)
`smaller than 1. For a definition of TTL and EFL see e.g.
`co-assigned US
`published
`patent
`application No.
`20150244942.
`
In some exemplary embodiments, if the no-switching criterion is not fulfilled, the camera controller is further configured to output video output images with a smooth transition when switching between the lower ZF value and the higher ZF value or vice versa, wherein at the lower ZF value the output image is determined by the Wide sensor, and wherein at the higher ZF value the output image is determined by the Tele sensor.
In some exemplary embodiments, the camera controller is further configured to combine in still mode, at a predefined range of ZF values, at least some of the Wide and Tele image data to provide a fused output image of the object or scene from a particular point of view.
BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting examples of embodiments disclosed herein are described below with reference to figures attached hereto that are listed following this paragraph. Identical structures, elements or parts that appear in more than one figure are generally labeled with a same numeral in all the figures in which they appear. The drawings and descriptions are meant to illuminate and clarify embodiments disclosed herein, and should not be considered limiting in any way.

FIG. 1A shows schematically a block diagram illustrating an exemplary dual-aperture zoom imaging system disclosed herein;
FIG. 1B is a schematic mechanical diagram of the dual-aperture zoom imaging system of FIG. 1A;
FIG. 2 shows an example of a Wide sensor, a Tele sensor and their respective FOVs;
FIG. 3A shows an embodiment of an exemplary method disclosed herein for acquiring a zoom image in video/preview mode;
FIG. 3B shows exemplary feature points in an object;
FIG. 3C shows schematically a known rectification process;
FIG. 4 shows a graph illustrating an effective resolution as a function of zoom factor.
DETAILED DESCRIPTION

Definitions:

Sharpness score: the gradients (dx, dy) of the image are compared (through subtraction) to the gradients of its low pass filtered version. A higher difference indicates a sharper original image. The result of this comparison is normalized with respect to the average variations (for example, sum of absolute gradients) of the original image, to obtain an absolute sharpness score.
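A direct transcription of this definition into Python: gradients of the image minus gradients of a low-pass-filtered copy, normalized by the sum of absolute gradients of the original. The Gaussian filter and its width are assumed choices for the low-pass step, which the definition leaves open.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def sharpness_score(img: np.ndarray, sigma: float = 2.0) -> float:
        """Absolute sharpness score per the definition above (sigma is an assumed parameter)."""
        img = img.astype(np.float64)
        low = gaussian_filter(img, sigma)            # low-pass filtered version
        gy, gx = np.gradient(img)                    # gradients (dy, dx) of the original
        lgy, lgx = np.gradient(low)                  # gradients of the low-pass copy
        diff = np.abs(gx - lgx) + np.abs(gy - lgy)   # higher difference -> sharper original
        norm = np.sum(np.abs(gx)) + np.sum(np.abs(gy))  # sum of absolute gradients of the original
        return float(np.sum(diff) / (norm + 1e-12))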
Edge score: for each image, the edges are found (for example, using Canny edge detection) and the average intensity of gradients along them is calculated, for example, by calculating the magnitude of gradients (dx, dy) for each edge pixel, summing the results and dividing by the total number of edge pixels. The result is the edge score.
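The edge-score definition transcribed into Python with OpenCV's Canny detector; the Canny thresholds and the assumption of an 8-bit grayscale input are illustrative choices, not part of the definition.

    import cv2
    import numpy as np

    def edge_score(img_u8: np.ndarray, low_thr: int = 50, high_thr: int = 150) -> float:
        """Average gradient magnitude over Canny edge pixels (img_u8: 8-bit grayscale)."""
        edges = cv2.Canny(img_u8, low_thr, high_thr) > 0
        img = img_u8.astype(np.float64)
        gy, gx = np.gradient(img)
        magnitude = np.sqrt(gx ** 2 + gy ** 2)
        n_edge = int(np.count_nonzero(edges))
        if n_edge == 0:
            return 0.0
        # Sum of gradient magnitudes on edge pixels divided by the number of edge pixels.
        return float(magnitude[edges].sum() / n_edge)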
Effective resolution score: this scor
