US008989517B2

(12) United States Patent                    (10) Patent No.:     US 8,989,517 B2
     Morgan-Mar et al.                       (45) Date of Patent:  Mar. 24, 2015

(54) BOKEH AMPLIFICATION

(71) Applicant: Canon Kabushiki Kaisha, Tokyo (JP)

(72) Inventors: David Peter Morgan-Mar, Wollstonecraft (AU); Kieran Gerard Larkin, Putney (AU); Matthew Raphael Arnison, Umina Beach (AU)

(73) Assignee: Canon Kabushiki Kaisha, Tokyo (JP)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 14/079,481

(22) Filed: Nov. 13, 2013

(65) Prior Publication Data

     US 2014/0152886 A1       Jun. 5, 2014

(30) Foreign Application Priority Data

     Dec. 3, 2012 (AU) ................................ 2012258467

(51) Int. Cl.
     G06K 9/36    (2006.01)
     G06K 9/40    (2006.01)
     H04N 5/225   (2006.01)
     H04N 5/228   (2006.01)
     H04N 5/262   (2006.01)
     H04N 5/232   (2006.01)

(52) U.S. Cl.
     CPC ........................... H04N 5/23212 (2013.01)
     USPC ......... 382/280; 348/207.1; 348/222.1; 348/239; 382/255; 382/276

(58) Field of Classification Search
     USPC ........ 348/207.1-207.11, 208.99-208.16, 348/222.1, 239, 241, 345-357, 362-368
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     7,065,256 B2     6/2006   Alon et al.
     8,422,827 B2*    4/2013   Ishii et al. ................ 382/299
     8,498,483 B2*    7/2013   Noguchi et al. .............. 382/181
     8,624,986 B2*    1/2014   Li .......................... 348/208.13
     8,704,909 B2*    4/2014   Kanaris et al. .............. 348/222.1
     8,737,756 B2*    5/2014   Daneshpanah et al. .......... 382/255

     (Continued)

     FOREIGN PATENT DOCUMENTS

     WO    2008/149363 A2    12/2008

     OTHER PUBLICATIONS

Bae, Soonmin, and Durand, Frédo. "Defocus Magnification." Computer Graphics Forum: Proceedings of Eurographics 2007, Prague, Sep. 3-7, 2007. Ed. Cohen-Or, D. and Slavik, P. Oxford, UK: Blackwell Publishing, 2007. 26.3:571-579.

     (Continued)

Primary Examiner — Michael Osinski
(74) Attorney, Agent, or Firm — Canon U.S.A., Inc. IP Division
(57) ABSTRACT

A method of modifying the blur in at least a part of an image of a scene captures at least two images of the scene with different camera parameters to produce a different amount of blur in each image. A corresponding patch in each of the captured images is selected, each having an initial amount of blur, and is used to calculate a set of frequency domain pixel values from a function of transforms of the patches. Each of the pixel values in the set is raised to a predetermined power, forming an amplified set of frequency domain pixel values. The amplified set of frequency domain pixel values is combined with the pixels of the patch in one of the captured images to produce an output image patch with blur modified relative to the initial amount of blur in the image patch.

15 Claims, 13 Drawing Sheets
[Front-page representative figure, an excerpt of FIG. 10: Fourier transform patches f1 and f2 (1010); decide which patch is less blurred (1020); form the spectral ratio (1030a, 1030b); modify spectral ratio; raise spectral ratio to power N; multiply the amplified spectral ratio by F1 to give F_AB (1070a, 1070b); artificial bokeh patch (1090).]
(56) References Cited

     U.S. PATENT DOCUMENTS

     2001/0008418 A1*    7/2001   Yamanaka et al. ............. 348/222
     2002/0145671 A1*   10/2002   Alon et al. ................. 348/241
     2003/0002746 A1*    1/2003   Kusaka ...................... 382/255
     2007/0036427 A1*    2/2007   Nakamura et al. ............. 382/154
     2008/0013861 A1*    1/2008   Li et al. ................... 382/286
     2008/0175508 A1*    7/2008   Bando et al. ................ 382/255
     2009/0115860 A1*    5/2009   Nakashima et al. ......... 348/208.99
     2009/0141163 A1     6/2009   Attar et al.
     2009/0297056 A1*   12/2009   Lelescu et al. .............. 382/261
     2011/0033132 A1*    2/2011   Ishii et al. ................ 382/275
     2011/0090352 A1*    4/2011   Wang et al. ............... 348/208.6
     2011/0205382 A1*    8/2011   Kanaris et al. ............ 348/222.1
     2012/0206630 A1*    8/2012   Nguyen et al. ............... 348/241
     2013/0063566 A1*    3/2013   Morgan-Mar et al. ............ 348/46
     2013/0266210 A1*   10/2013   Morgan-Mar et al. ........... 382/154

     OTHER PUBLICATIONS

Kubota, Akira, and Aizawa, Kiyoharu. "Reconstructing Arbitrarily Focused Images From Two Differently Focused Images Using Linear Filters." IEEE Transactions on Image Processing 14.11 (2005): 1848-1859.

* cited by examiner
[Drawing sheet 1 of 13, FIG. 1: schematic diagram of a scene and an image capture device positioned to capture an image of the scene.]
[Drawing sheet 2 of 13, FIG. 2: geometry of a lens forming two different images at two different focal planes.]
[Drawing sheet 3 of 13, FIGS. 3A and 3B: a two-dimensional Gaussian function and a two-dimensional pillbox function, with one-dimensional cross-sections (elements 300-330).]
[Drawing sheet 4 of 13, FIG. 4A: schematic block diagram of a general purpose computer system 400, including processor 405, memory 406, HDD 410 and disk storage medium 425, optical disk drive 412, audio-video interface 407, I/O interfaces 408 and 413, local network interface 411, keyboard 402, camera 427, printer 445, microphone, video display, and wide-area (420) and local-area (422) communications networks.]
[Drawing sheet 5 of 13, FIG. 4B: detail of the processor and memory of the computer system, including instruction parts 428-430 and data 435-437, ROM 449 holding POST 450, BIOS 451, bootstrap loader 452 and operating system 453, input variables 454, intermediate variables 458, output variables 461, and the processor's interface 442, ALU 440 and control unit 439.]
[Drawing sheet 6 of 13, FIGS. 5A, 5B and 5C: example images upon which artificial bokeh processing may be performed.]
[Drawing sheet 7 of 13, FIG. 6: correspondence between pixels and image patches within a first image and a second image of a scene (elements 600-650).]
[Drawing sheet 8 of 13, FIG. 7: flow diagram of the exemplary artificial bokeh method 700. Capture two images of scene (710); select corresponding image patches f1 and f2 (720); select less blurred patch (740); form artificial bokeh patch f_AB (750); repeat while more patches remain (760); assemble rendered patches into rendered image (770); rendered image (775); end (780).]
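The flow of FIG. 7 is, in essence, a per-patch loop around three operations: pick the less blurred patch, amplify its relative blur, and stitch the results. The following Python sketch of that loop is illustrative only, not the patent's implementation; it assumes the helper functions select_less_blurred, amplify_bokeh_patch and assemble_patches sketched later (alongside FIGS. 9 and 11 and after the Summary), and a simple non-overlapping tiling.

    def render_artificial_bokeh(img1, img2, patch=64, n_power=2.0):
        """Per-patch loop of FIG. 7: tile both captures into
        corresponding patches (720), render each patch (740, 750),
        then assemble the rendered patches (770)."""
        rendered, positions = [], []
        h, w = img1.shape
        for r in range(0, h - patch + 1, patch):
            for c in range(0, w - patch + 1, patch):
                f1 = img1[r:r + patch, c:c + patch]
                f2 = img2[r:r + patch, c:c + patch]
                if select_less_blurred(f1, f2) == 2:    # step 740
                    f1, f2 = f2, f1                     # make f1 the less blurred
                rendered.append(amplify_bokeh_patch(f1, f2, n_power))  # step 750
                positions.append((r, c))
        return assemble_patches(rendered, positions, (h, w))           # step 770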

[Drawing sheet 9 of 13, FIG. 8: detail of the capture step 710. Set up camera aimed at scene; set focus, zoom, aperture; take first image of scene; change focus, zoom, or aperture; take second image of scene.]
[Drawing sheet 10 of 13, FIG. 9: detail of the asymmetric patch selection step 740. Calculate variance σ1² of patch f1 (910); calculate variance σ2² of patch f2 (920); compare the variances (930, 940); select f1 or f2 as the less blurred patch (945, 950).]
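The comparison in FIG. 9 reduces to a variance test: defocus blur suppresses fine detail and therefore lowers the pixel variance of a patch. A minimal NumPy sketch of that criterion (an illustrative reading of the flowchart, not the patent's code):

    import numpy as np

    def select_less_blurred(f1, f2):
        """Return 1 if patch f1 is the less blurred patch, else 2.

        Follows FIG. 9: compute the variance of each patch
        (steps 910, 920) and select the patch with the larger
        variance as the less blurred one (steps 945, 950)."""
        var1 = float(np.var(f1))   # sigma_1^2 of patch f1
        var2 = float(np.var(f2))   # sigma_2^2 of patch f2
        return 1 if var1 >= var2 else 2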

[Drawing sheet 11 of 13, FIG. 10: detail of the artificial bokeh patch step 750. Fourier transform patches f1 and f2 (1010); depending on which patch is less blurred (1020), form the spectral ratio F2/F1 or F1/F2 with the less blurred patch's transform as denominator (1030a, 1030b); modify spectral ratio (1040); raise spectral ratio to power N (1050); depending on which patch is less blurred (1060), multiply the amplified spectral ratio by F1 or F2 to give F_AB (1070a, 1070b); inverse Fourier transform F_AB to give f_AB (1080, 1085); artificial bokeh patch (1090).]
[Drawing sheet 12 of 13, FIG. 11: detail of the assembly step 770. Select pixel of output image (1120); determine patches f_AB containing corresponding pixels (1130); if more than one patch covers the pixel (1140), calculate the output pixel from the corresponding pixels in all patches, otherwise calculate it from the corresponding pixel in the single patch (1160); repeat while more pixels remain (1170); end.]
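Where patches overlap, FIG. 11 computes each output pixel from the corresponding pixels of every covering patch. The flowchart leaves the combination rule open; the sketch below assumes plain averaging, and the function and parameter names are hypothetical:

    import numpy as np

    def assemble_patches(patches, positions, out_shape):
        """Assemble rendered patches f_AB into the rendered image.

        patches   : list of 2D arrays (rendered patches)
        positions : (row, col) top-left corner of each patch
        out_shape : (height, width) of the output image

        A pixel covered by more than one patch (step 1140) is set
        to the average of the corresponding patch pixels; a pixel
        covered by a single patch is copied from that patch."""
        acc = np.zeros(out_shape)      # running sum of patch values
        cnt = np.zeros(out_shape)      # how many patches cover each pixel
        for p, (r, c) in zip(patches, positions):
            ph, pw = p.shape
            acc[r:r + ph, c:c + pw] += p
            cnt[r:r + ph, c:c + pw] += 1
        return acc / np.maximum(cnt, 1)   # uncovered pixels stay zero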

[Drawing sheet 13 of 13, FIG. 12: flow diagram of a second exemplary method 1200. Capture two images of scene; segment images into foreground and background (1220); determine object boundary regions (1230); select corresponding image patches f1 and f2 in boundary region (1240); select less blurred patch (1250); form artificial bokeh patch f_AB (1260); repeat while more patches remain (1270); assemble rendered patches into rendered image of boundary region (1280); artificial bokeh rendering (1285); form composite rendered image (1290); composite rendered image (1292); end (1295).]
BOKEH AMPLIFICATION

REFERENCE TO RELATED PATENT APPLICATION(S)

This application claims the benefit under 35 U.S.C. §119 of the filing date of Australian Patent Application No. 2012258467, filed Dec. 3, 2012, hereby incorporated by reference in its entirety as if fully set forth herein.

TECHNICAL FIELD

The current invention relates to digital image processing and, in particular, to rendering a photographic image with modified blur characteristics.

BACKGROUND
Single-lens reflex (SLR) and digital single-lens reflex (DSLR) cameras have large aperture optics which can produce a narrow depth of field. Depth of field measures the distance from the nearest object to the camera which is in focus, to the farthest object from the camera which is in focus. (D)SLR cameras typically have a depth of field of order significantly less than 1 meter for a typical portrait scenario of a subject a few meters from the camera. This allows the foreground subject of a photo to be rendered in sharp focus, while the background is blurred by defocus. The result is visually pleasing as it provides a separation between the subject and any distracting elements in the background. The aesthetic quality of background blur (encompassing both the quantity and "look" of the blur) is known as bokeh. Bokeh is especially important for photos of people, or portraits.

Compact digital cameras are more popular than DSLRs with consumers because of their smaller size, lighter weight, and lower cost. However, the smaller optics on a compact camera produce a large depth of field, of order greater than approximately 1 meter for the same typical portrait scenario, which renders the background in typical portrait shots as sharp and distracting.

Depth of field varies significantly depending on the geometry of the photographic scene. The following examples are for taking a photo of a person about 3 meters from the camera:

(i) The depth of field for a full frame SLR camera at 50 mm focal length and aperture f/2.8 is about 0.5 meters. For a portrait scenario, a photographer would typically want to use a depth of field this size, or even smaller, maybe 0.2 meters or even 0.1 meters. An SLR camera can also be configured with a smaller aperture to achieve very large depth of field, though this is not usually done for portraits.

(ii) The depth of field for a small compact camera (e.g. Canon™ IXUS™ model) at 50 mm full-frame equivalent focal length and aperture f/2.8 is 6 meters.

(iii) The depth of field for a large compact camera (e.g. Canon™ G12) at 50 mm full-frame equivalent focal length and aperture f/4 is 1.6 meters. (This camera cannot achieve f/2.8 aperture; if it could, its depth of field would be 1.2 meters.)

It is practically impossible for a camera with a compact form factor to achieve a depth of field under about 1 meter for a subject at 3 meters distance. Technically, such is possible, but would require very large and expensive lenses. Depth of field for compact cameras under normal conditions can easily be tens of meters or even infinity, meaning that everything from the subject to the far distance is in focus.

If the person is closer to the camera than 3 meters, all the depth of field distances discussed above will be smaller, and if the person is further away, they will all be larger. Importantly, an SLR camera will always be able to achieve a significantly smaller depth of field than a compact camera. The depth of field is largely dictated by the size of the camera sensor.
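These depth of field figures follow from standard thin-lens geometry. For subject distances much larger than the focal length, a common approximation is DoF ≈ 2·N·c·z²/f², where N is the f-number, c the circle of confusion diameter, z the subject distance and f the focal length. The sketch below uses that textbook formula with a conventional full-frame circle of confusion of 0.030 mm (both the formula and this value are standard assumptions, not taken from the patent) and roughly reproduces the 0.5 meter SLR figure quoted above.

    def depth_of_field_mm(f_mm, n, z_mm, c_mm=0.030):
        """Approximate depth of field in mm; valid when the depth
        of field is small compared with the subject distance."""
        return 2.0 * n * c_mm * z_mm ** 2 / f_mm ** 2

    # Full-frame SLR: 50 mm lens at f/2.8, subject at 3 m
    print(depth_of_field_mm(50.0, 2.8, 3000.0))   # ~605 mm, i.e. about 0.5 m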
A method of producing artificial bokeh with a compact camera, mimicking the amount and quality of background blur produced by an SLR camera, would provide a major improvement in image quality for compact camera users.

Camera manufacturers and professional photographers have recognised the depth of field limitations of small format cameras for decades. With the advent of digital camera technology, it has become feasible to process camera images after capture to modify the appearance of the photo. The generation of SLR-like bokeh from compact camera images has been an early target for research in the field of digital camera image processing. However, no solution providing results of high (i.e. visually acceptable) aesthetic quality has been demonstrated.

To accurately mimic small depth of field given a large depth of field photo, objects in the image must be blurred by an amount that varies with distance from the camera. The most common prior approach tackles this problem in two steps:

(1a). Estimate the distance of regions in the image from the camera to produce a depth map.

(1b). Apply a blurring operation using a blur kernel size that varies with the estimated distance.

Step (1a) is a difficult problem in itself, and the subject of active research by many groups. The three main methods of depth map estimation from camera images (i.e. excluding active illumination methods) are:

(i) Stereo: taking photos from different camera positions and extracting depth from parallax. A major disadvantage of this approach is the requirement to take photos from multiple viewpoints, making it impractical for compact cameras.

(ii) Depth from focus (DFF): taking a series of many images focused at different distances and measuring in patches which photo corresponds to a best focus at that patch, usually using maximal contrast as the best focus criterion. A major disadvantage of this approach is that many exposures are required, necessitating a long elapsed time. During the exposures the camera or subject may inadvertently move, potentially blurring the subject and introducing additional problems caused by image misalignment.

(iii) Depth from defocus (DFD): quantifying the difference in amount of blur between two images taken with different focus and equating the blur difference to a distance. This is the most suitable approach for implementation in a compact camera, as it does not require stereo camera hardware and can be performed with as few as two photos. However, it has the disadvantages that accuracy is typically relatively low, particularly around the boundaries of objects in the scene, and that consistency is adversely affected by differing object textures in the scene. Some DFD methods show better accuracy around object edges, at the cost of using computationally expensive algorithms unsuited to implementation in camera hardware.

Step (1b) is computationally expensive for optically realistic blur kernel shapes. A fallback is to use a Gaussian blur kernel, which produces a blur that looks optically unrealistic, making the resulting image aesthetically unpleasing.

To more easily approach artificial bokeh, many prior methods use a simplified version of the above two-step method, being:

(2a). Segment the image into a foreground region and a background region.

(2b). Apply a constant blurring operation to the background region only.
Assuming step (2a) is done correctly, step (2b) is straightforward. However, step (2a) is still difficult and has not been achieved satisfactorily within the constraints of a compact camera. In particular, the accuracy of segmentation around the edges of objects at different depths in the scene is poor. Even if this simplified method can be achieved without error, the resulting images can look artificial, since intermediate levels of blur between the foreground and background will be absent.

A third approach estimates the blur already present in the image:

(3a). Estimate the amount of blur at different places on an image of the subject, compared to a blur-free representation of the subject.

(3b). Apply a blurring operation using a blur kernel size that varies with the estimated blur amount.

The background will show a small amount of blurring relative to an in-focus foreground object. If such blurred regions can be identified accurately, they can be blurred more, producing increased blur in the background.

Step (3a) can be performed with a single image, or by using multiple images of the scene captured with different camera parameters. Estimating blur from a single image is under-constrained and can only be achieved under certain assumptions. For example, one assumption is that edges detected in the image are step function edges in the scene, blurred by the camera optics, and that regions away from edges may be accurately infilled from the edge blur estimates. These assumptions are often false, resulting in poor blur estimates. Estimating blur from multiple images is akin to DFF or DFD, because blur amount is directly related to depth, and shares the same problems.

SUMMARY

According to the present disclosure there is provided a method of modifying the blur in at least a part of an image of a scene, said method comprising: capturing at least two images of the scene, said images being captured with different camera parameters to produce a different amount of blur in each of the captured images; selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

Preferably, the set of frequency domain pixel values are modified before being raised to the predetermined power. Generally the modification includes a median filtering operation. Alternatively the modification may include a smoothing filtering operation. The modification may include a normalisation operation and/or a weighting operation. The weights for the weighting operation are determined by the phases of the set of frequency domain pixel values.

Typically the at least two images of the scene are divided into a plurality of corresponding image patches in each of the captured images, and the output image patches are combined to produce an output image. Desirably the plurality of corresponding image patches in each of the captured images form a tiling substantially covering the area of the captured images, and the output image is formed by tiling the output image patches. Generally the plurality of corresponding image patches in each of the captured images overlap, and the output image is formed by combining the pixel values of the output image patches.

In a specific implementation the plurality of corresponding image patches in each of the captured images cover part of the area of the captured images, and the output image patches are combined with the area of at least one of the captured images not covered by the plurality of corresponding image patches to produce an output image. Desirably at least part of the area of the at least one of the captured images not covered by the plurality of corresponding image patches is blurred by convolution.

According to another aspect, disclosed is a camera comprising an image capture system coupled to memory in which captured images are stored, a processor, and a program executable by the processor to modify the blur in at least a part of an image of a scene, said program comprising: code for causing the capture system to capture at least two images of the scene, said images being captured with different camera parameters to produce a different amount of blur in each of the captured images; code for selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; code for calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; code for raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and code for combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

Another aspect is a camera system comprising: a lens formed of optics producing a relatively large depth of field; a sensor configured to capture an image of a scene focussed through the lens; a memory in which images captured by the sensor are stored; a capture mechanism configured to capture at least two images of the scene with different capture parameters and to store the images in the memory; a processor; and a program stored in the memory and executable by the processor to modify blur in at least a part of one of the captured images, said program comprising: code for capturing at least two images of the scene with different camera parameters to produce a different amount of blur in each of the captured images; code for selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; code for calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; code for raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and code for combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

In another aspect disclosed is a computer readable storage medium having a program recorded thereon, the program being executable by a processor to modify blur in at least a part of an image of a scene, the program comprising: code for receiving at least two images of the scene, said images being captured with different camera parameters to produce a different amount of blur in each of the captured images; code for selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; code for calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; code for raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and code for combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

Other aspects are disclosed.
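Read as an algorithm, the method paragraphs above describe a short frequency-domain pipeline per patch. The NumPy sketch below is one plausible reading, not the patented implementation: the "combined function of Fourier transforms" is taken to be the spectral ratio of FIG. 10, the optional modification step (median or smoothing filtering, normalisation, weighting) is omitted, and a small epsilon guards the division.

    import numpy as np

    def amplify_bokeh_patch(f1, f2, n_power=2.0, eps=1e-8):
        """Produce an output patch with amplified blur from
        corresponding patches f1 (less blurred) and f2 (more
        blurred) of the two captures.

        1. Fourier transform both patches.
        2. Form the spectral ratio F2 / F1, a set of frequency
           domain pixel values encoding the relative blur.
        3. Raise each value to the predetermined power N,
           amplifying that relative blur.
        4. Combine with the pixels of one captured patch by
           multiplying by F1, then inverse transform."""
        F1 = np.fft.fft2(f1)
        F2 = np.fft.fft2(f2)
        ratio = (F2 + eps) / (F1 + eps)    # combined function of transforms
        amplified = ratio ** n_power       # raise to predetermined power N
        return np.real(np.fft.ifft2(amplified * F1))

If both patches are blurred by Gaussian PSFs, the spectral ratio is itself Gaussian in the frequency domain, so raising it to the power N multiplies the implied blur-variance difference by N; this is what turns a small captured blur difference into a visibly larger rendered one.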
BRIEF DESCRIPTION OF THE DRAWINGS

At least one embodiment of the invention will now be described with reference to the following drawings, in which:

FIG. 1 is a schematic diagram of a scene and an image capture device positioned to capture an image of the scene;

FIG. 2 is a schematic diagram illustrating the geometry of a lens forming two different images at two different focal planes;

FIGS. 3A and 3B illustrate a two-dimensional Gaussian function and a two-dimensional pillbox function, and one-dimensional cross-sections thereof;

FIGS. 4A and 4B collectively form a schematic block diagram of a general purpose computer on which various implementations may be practised;

FIGS. 5A, 5B, and 5C illustrate example images upon which artificial bokeh processing according to the present disclosure may be performed;

FIG. 6 is a diagram illustrating the correspondence between pixels and image patches within a first image and a second image of a scene;

FIG. 7 is a schematic flow diagram illustrating an exemplary method of determining an artificial bokeh image from two images of a scene, according to the present disclosure;

FIG. 8 is a schematic flow diagram illustrating one example of a method of capturing two images as used in the method of FIG. 7;

FIG. 9 is a schematic flow diagram illustrating one example of a method of asymmetrical patch selection as used in the method of FIG. 7;

FIG. 10 is a schematic flow diagram illustrating one example of a method of determining an artificial bokeh image patch from two corresponding patches of two images of a scene as used in the method of FIG. 7;

FIG. 11 is a schematic flow diagram illustrating one example of a method of assembling artificial bokeh patches into an artificial bokeh image as used in the method of FIG. 7; and

FIG. 12 is a schematic flow diagram illustrating a second exemplary method of determining an artificial bokeh image from two images of a scene, according to the present disclosure.

DETAILED DESCRIPTION INCLUDING BEST MODE

Introduction

The present disclosure is directed to providing methods of rendering a photographic image taken with large depth of field so as to mimic a photo taken with a smaller depth of field by modifying blur already present in the image taken with a large depth of field. The methods seek to offer one or more of improved accuracy, improved tolerance to imaging noise, improved tolerance to differences of object texture in the image, and improved aesthetic appearance of the final image, all of these particularly in regions at and near the boundaries of objects in the scene.

Context

The technical details of accurately rendering artificial bokeh rely on key aspects of the geometry and optics of imaging devices. Most scenes that are captured using an imaging device, such as a camera, contain multiple objects, which are located at various distances from the lens of the device. Commonly, the imaging device is focused on an object of interest in the scene. The object of interest shall be referred to as the subject of the scene. Otherwise, objects in the scene, which may include the subject, shall simply be referred to as objects.

Thin Lens Equation, Basic Geometry

FIG. 1 is a schematic diagram showing the geometrical relationships between key parts of an imaging device and objects in a scene to be captured. FIG. 1 shows an imaging device or system (e.g. a camera) 100 which includes a lens 110 and a sensor 115. For the purposes of this description, the camera 100 is typically a compact digital camera and the lens 110 has relatively small optics producing a large depth of field, particularly in comparison to an SLR camera. FIG. 1 also shows an in-focus plane 130 and a general object 140 formed by a sphere positioned upon a rectangular prism, forming part of the scene but not necessarily the subject of the scene to be captured. The image plane 120 of the imaging device 100, also referred to as the focal plane, is defined to be at the location of the sensor 115. When projected through the lens 110, the image plane 120 forms the in-focus plane 130, which can be considered to be a virtual plane in the geometrical region of the object 140. A distance 150 from the lens 110 to the image plane 120 is related to a distance 160 from the lens 110 to the in-focus plane 130 by the thin lens law, according to the equation

    1/f = 1/z_i + 1/z_o                                            (1)

where f is the focal length of the lens 110, z_i is the lens-to-sensor distance 150, and z_o is the distance 160 from the lens 110 to the in-focus plane 130. The general scene object 140 is located at a distance 170 from the lens 110 and at a distance 180 from the in-focus plane 130. This distance 170 is referred to as z_s. The distance 180 from the object 140 to the in-focus plane 130 is given by z_s - z_o and may be positive, zero, or negative. If the object 140 is focused onto the image plane 120, then z_s = z_o and the object 140 is located in the in-focus plane 130. If z_s is less than or greater than z_o, then the object 140 is located behind or in front of the in-focus plane 130 respectively, and the image of the object 140 will appear blurred on the image plane 120.

FIG. 1 illustrates a relatively simple geometrical optics model of imaging. This model relies on approximations including the thin lens approximation, paraxial imaging rays, and a lens free of aberrations. These approximations ignore some aspects of the optics that are inherent in actual imaging
systems, but are sufficient for general understanding of imaging behaviour, as is understood by those skilled in the art.
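Equation (1) determines any one of f, z_i and z_o from the other two. A minimal check of the relationship, with illustrative values only:

    def lens_to_sensor_distance(f, z_o):
        """Solve the thin lens law 1/f = 1/z_i + 1/z_o for z_i
        (distance 150), given the focal length f and the distance
        z_o (distance 160) to the in-focus plane. Any consistent
        units may be used."""
        return 1.0 / (1.0 / f - 1.0 / z_o)

    # A 50 mm lens focused on a plane 3 m away
    print(lens_to_sensor_distance(50.0, 3000.0))   # ~50.85 mm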
Focusing is carried out either manually by the user or by using an autofocus mechanism that is built into the imaging device 100. Focusing typically manipulates the lens-to-sensor distance 150 in order to place the in-focus plane 130 such that the distance z_o 160 is equal to the distance z_s 170 to a specific object of interest, i.e. to place the subject in the in-focus plane 130. Other objects in the scene that have a distance z_s from the lens 110 that is different from that of the subject are located either behind or in front of the in-focus plane 130. These other objects will appear blurred to some degree on the image plane 120 and thus in the image captured on the sensor 115. This blur is referred to as defocus blur.

Defocus Blur
The amount of defocus blurring of an imaged object 140 increases with the distance 180 of the object 140 from the in-focus plane 130. The amount of defocus blur present in a given patch or portion of a captured 2D image can be characterised by the point spread function (PSF). The PSF is the response of the imaging system to a point source, defined such that the integral of the PSF over the image plane is equal to unity. The PSF of an optical system is generally a spatially restricted two-dimensional function of spatial coordinates (x, y) that approaches zero beyond a certain radial distance from the origin. The amount of blur can be characterised by measures of the shape of the PSF. Typical measures of the amount of blur are the full-width-at-half-maximum (FWHM) of the PSF, or the standard deviation of the PSF.
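As a concrete instance of these measures: a Gaussian PSF of standard deviation σ has FWHM = 2·sqrt(2·ln 2)·σ ≈ 2.355·σ. The sketch below estimates both measures numerically from any sampled PSF; the crude FWHM estimate assumes a roughly symmetric, single-peaked PSF, and the function name is hypothetical.

    import numpy as np

    def psf_blur_measures(psf, pitch=1.0):
        """Per-axis standard deviation and FWHM of a sampled 2D PSF.

        psf   : 2D non-negative array (integrating to ~1)
        pitch : sample spacing (e.g. pixel size)"""
        ys, xs = np.indices(psf.shape)
        w = psf / psf.sum()
        cx = (w * xs).sum()
        cy = (w * ys).sum()
        # average the x and y second moments for a per-axis figure
        var = (w * ((xs - cx) ** 2 + (ys - cy) ** 2)).sum() / 2.0
        std = float(np.sqrt(var)) * pitch
        # FWHM: extent above half maximum along the row through the peak
        row = psf[int(round(cy))]
        fwhm = float((row >= row.max() / 2.0).sum()) * pitch
        return std, fwhm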
A basic understanding of the principles behind image blurring may be gained by assuming a mathematically simple model for the PSF of a camera lens 110. To achieve this simplicity, prior art analyses often model the PSF as a two-dimensional Gaussian function. This

This document is available on Docket Alarm but you must sign up to view it.


Or .

Accessing this document will incur an additional charge of $.

After purchase, you can access this document again without charge.

Accept $ Charge
throbber

Still Working On It

This document is taking longer than usual to download. This can happen if we need to contact the court directly to obtain the document and their servers are running slowly.

Give it another minute or two to complete, and then try the refresh button.

throbber

A few More Minutes ... Still Working

It can take up to 5 minutes for us to download a document if the court servers are running slowly.

Thank you for your continued patience.

This document could not be displayed.

We could not find this document within its docket. Please go back to the docket page and check the link. If that does not work, go back to the docket and refresh it to pull the newest information.

Your account does not support viewing this document.

You need a Paid Account to view this document. Click here to change your account type.

Your account does not support viewing this document.

Set your membership status to view this document.

With a Docket Alarm membership, you'll get a whole lot more, including:

  • Up-to-date information for this case.
  • Email alerts whenever there is an update.
  • Full text search for other cases.
  • Get email alerts whenever a new case matches your search.

Become a Member

One Moment Please

The filing “” is large (MB) and is being downloaded.

Please refresh this page in a few minutes to see if the filing has been downloaded. The filing will also be emailed to you when the download completes.

Your document is on its way!

If you do not receive the document in five minutes, contact support at support@docketalarm.com.

Sealed Document

We are unable to display this document, it may be under a court ordered seal.

If you have proper credentials to access the file, you may proceed directly to the court's system using your government issued username and password.


Access Government Site

We are redirecting you
to a mobile optimized page.





Document Unreadable or Corrupt

Refresh this Document
Go to the Docket

We are unable to display this document.

Refresh this Document
Go to the Docket