(12) United States Patent
     Morgan-Mar et al.

(10) Patent No.: US 8,989,517 B2
(45) Date of Patent: Mar. 24, 2015

(54) BOKEH AMPLIFICATION

(71) Applicant: Canon Kabushiki Kaisha, Tokyo (JP)

(72) Inventors: David Peter Morgan-Mar, Wollstonecraft (AU); Kieran Gerard Larkin, Putney (AU); Matthew Raphael Arnison, Umina Beach (AU)

(73) Assignee: Canon Kabushiki Kaisha, Tokyo (JP)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 14/079,481

(22) Filed: Nov. 13, 2013

(65) Prior Publication Data: US 2014/0152886 A1, Jun. 5, 2014

(30) Foreign Application Priority Data: Dec. 3, 2012 (AU) 2012258467

(51) Int. Cl.:
    G06K 9/36   (2006.01)
    G06K 9/40   (2006.01)
    H04N 5/225  (2006.01)
    H04N 5/228  (2006.01)
    H04N 5/262  (2006.01)
    H04N 5/232  (2006.01)

(52) U.S. Cl.:
    CPC: H04N 5/23212 (2013.01)
    USPC: 382/280; 348/207.1; 348/222.1; 348/239; 382/255; 382/276

(58) Field of Classification Search:
    USPC: 348/207.1-207.11, 208.99-208.16, 222.1, 239, 241, 345-357, 362-368
    See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

7,065,256 B2    6/2006   Alon et al. .............. 382/299
8,422,827 B2*   4/2013   Ishii et al. ............. 348/208.13
8,498,483 B2*   7/2013   Noguchi et al. ........... 382/181
8,624,986 B2*   1/2014   Li ....................... 348/222.1
8,704,909 B2*   4/2014   Kanaris et al.
8,737,756 B2*   5/2014   Daneshpanah et al. ....... 382/255

(Continued)

FOREIGN PATENT DOCUMENTS

WO 2008/149363 A2    12/2008

OTHER PUBLICATIONS

Bae, Soonmin, and Durand, Frédo. "Defocus Magnification." Computer Graphics Forum: Proceedings of Eurographics 2007, Prague, Sep. 3-7, 2007. Ed. Cohen-Or, D. and Slavik, P. Oxford, UK: Blackwell Publishing, 2007. 26.3:571-579.

(Continued)

Primary Examiner: Michael Osinski
(74) Attorney, Agent, or Firm: Canon U.S.A., Inc. IP Division

(57) ABSTRACT

A method of modifying the blur in at least a part of an image of a scene captures at least two images of the scene with different camera parameters to produce a different amount of blur in each image. A corresponding patch in each of the captured images is selected, each having an initial amount of blur, and is used to calculate a set of frequency domain pixel values from a function of transforms of the patches. Each of the pixel values in the set is raised to a predetermined power, forming an amplified set of frequency domain pixel values. The amplified set of frequency domain pixel values is combined with the pixels of the patch in one of the captured images to produce an output image patch with blur modified relative to the initial amount of blur in the image patch.

15 Claims, 13 Drawing Sheets
[Front page figure: flowchart 750 for forming an artificial bokeh patch, reproduced as FIG. 10 on Sheet 11.]
(56) References Cited

U.S. PATENT DOCUMENTS

2001/0008418 A1*   7/2001   Yamanaka et al. ........ 348/222
2002/0145671 A1*  10/2002   Alon et al. ............ 348/241
2003/0002746 A1*   1/2003   Kusaka ................. 382/255
2007/0036427 A1*   2/2007   Nakamura et al. ........ 382/154
2008/0013861 A1*   1/2008   Li et al. .............. 382/286
2008/0175508 A1*   7/2008   Bando et al. ........... 382/255
2009/0115860 A1*   5/2009   Nakashima et al. ....... 348/208.99
2009/0141163 A1    6/2009   Attar et al.
2009/0297056 A1*  12/2009   Lelescu et al. ......... 382/261
2011/0033132 A1*   2/2011   Ishii et al. ........... 382/275
2011/0090352 A1*   4/2011   Wang et al. ............ 348/208.6
2011/0205382 A1*   8/2011   Kanaris et al. ......... 348/222.1
2012/0206630 A1*   8/2012   Nguyen et al. .......... 348/241
2013/0063566 A1*   3/2013   Morgan-Mar et al. ...... 348/46
2013/0266210 A1*  10/2013   Morgan-Mar et al. ...... 382/154

OTHER PUBLICATIONS

Kubota, Akira, and Aizawa, Kiyoharu. "Reconstructing Arbitrarily Focused Images From Two Differently Focused Images Using Linear Filters." IEEE Transactions on Image Processing 14.11 (2005): 1848-1859.

* cited by examiner
[Sheet 1 of 13, FIG. 1: schematic diagram of a scene and an image capture device positioned to capture an image of the scene.]
[Sheet 2 of 13, FIG. 2: geometry of a lens forming two different images at two different focal planes.]
[Sheet 3 of 13, FIGS. 3A and 3B: a two-dimensional Gaussian function and a two-dimensional pillbox function, with one-dimensional cross-sections (elements 300, 310, 320, 330).]
[Sheet 4 of 13, FIG. 4A: schematic block diagram of a general purpose computer system, showing a processor, memory, storage devices (HDD), an optical disk drive and storage medium, I/O interfaces, an audio-video interface, keyboard, camera, printer, external modem, and connections to wide-area (420) and local-area communications networks.]
[Sheet 5 of 13, FIG. 4B: schematic of the memory organisation of the computer of FIG. 4A, showing instructions and data, intermediate, input, and output variables, and the POST (450), BIOS (451), bootstrap loader, and operating system.]
[Sheet 6 of 13, FIG. 5B (of FIGS. 5A to 5C): example images upon which artificial bokeh processing may be performed.]
[Sheet 7 of 13, FIG. 6: correspondence between pixels and image patches within a first image and a second image of a scene (elements 500, 520, 610, 630, 640, 642, 650, 652).]
[Sheet 8 of 13, FIG. 7: flowchart of artificial bokeh method 700. Capture two images of scene (710); select corresponding image patches f1 and f2 (720); select less blurred patch (740); form artificial bokeh patch (750); repeat while more patches remain (760); assemble rendered patches into rendered image (770, 775); end (780).]
[Sheet 9 of 13, FIG. 8: flowchart of the two-image capture step 710. Set up camera aimed at scene; set focus, zoom, and aperture; take first image of scene; change focus, zoom, or aperture; take second image of scene.]
[Sheet 10 of 13, FIG. 9: flowchart of the asymmetrical patch selection step 740. Calculate the variance σ1² of patch f1 (910) and the variance σ2² of patch f2 (920); compare the variances (930); select f1 or f2 as the less blurred patch (940, 945, 950).]
[Sheet 11 of 13, FIG. 10: flowchart of the artificial bokeh patch step 750. Fourier transform patches f1 and f2 (1010); determine which patch is less blurred (1020); form the spectral ratio F2/F1 or F1/F2, with the less blurred patch's transform in the denominator (1030a, 1030b); modify the spectral ratio (1040); raise the spectral ratio to the power N (1050); again branch on which patch is less blurred (1060) and multiply the amplified spectral ratio by F1 or F2 (1070a, 1070b); inverse Fourier transform the result (1080) to give the artificial bokeh patch (1085, 1090).]
[Sheet 12 of 13, FIG. 11: flowchart of the patch assembly step 770. Form output image (1110); select a pixel of the output image (1120); determine the patches containing corresponding pixels (1130); if more than one patch contains the pixel (1140), calculate the output pixel from corresponding pixels in all patches, otherwise from the corresponding pixel in the single patch (1150b); repeat while more pixels remain (1160); end (1170).]
[Sheet 13 of 13, FIG. 12: flowchart of second method 1200. Capture two images of scene (1210); segment images into foreground and background (1220); determine object boundary regions (1230); select corresponding image patches f1 and f2 in boundary region (1240); select less blurred patch (1250); form artificial bokeh patch (1260); repeat while more patches remain (1270); assemble rendered patches into rendered image of boundary region (1280); artificial bokeh rendering (1285); form composite rendered image (1290, 1292); end (1295).]
BOKEH AMPLIFICATION

REFERENCE TO RELATED PATENT APPLICATION(S)

This application claims the benefit under 35 U.S.C. §119 of the filing date of Australian Patent Application No. 2012258467, filed Dec. 3, 2012, hereby incorporated by reference in its entirety as if fully set forth herein.

TECHNICAL FIELD

The current invention relates to digital image processing and, in particular, to rendering a photographic image with modified blur characteristics.

BACKGROUND
Single-lens reflex (SLR) and digital single-lens reflex (DSLR) cameras have large aperture optics which can produce a narrow depth of field. Depth of field measures the distance from the nearest object to the camera which is in focus, to the farthest object from the camera which is in focus. (D)SLR cameras typically have a depth of field of order significantly less than 1 meter for a typical portrait scenario of a subject a few meters from the camera. This allows the foreground subject of a photo to be rendered in sharp focus, while the background is blurred by defocus. The result is visually pleasing as it provides a separation between the subject and any distracting elements in the background. The aesthetic quality of background blur (encompassing both the quantity and "look" of the blur) is known as bokeh. Bokeh is especially important for photos of people, or portraits.

Compact digital cameras are more popular than DSLRs with consumers because of their smaller size, lighter weight, and lower cost. However, the smaller optics on a compact camera produce a large depth of field, of order greater than approximately 1 meter for the same typical portrait scenario, which renders the background in typical portrait shots as sharp and distracting.

Depth of field varies significantly depending on the geometry of the photographic scene. The following examples are for taking a photo of a person about 3 meters from the camera:

(i) the depth of field for a full frame SLR camera at 50 mm focal length and aperture f/2.8 is about 0.5 meters. For a portrait scenario, a photographer would typically want to use a depth of field this size, or even smaller, maybe 0.2 meters or even 0.1 meters. An SLR camera can also be configured with a smaller aperture to achieve very large depth of field, though this is not usually done for portraits.

(ii) the depth of field for a small compact camera (e.g. a Canon™ IXUS™ model) at 50 mm full-frame equivalent focal length and aperture f/2.8 is 6 meters.

(iii) a large compact camera (e.g. a Canon™ G12) at 50 mm full-frame equivalent focal length and aperture f/4 is 1.6 meters. (This camera cannot achieve an f/2.8 aperture; if it could, its depth of field would be 1.2 meters.) It is practically impossible for a camera with a compact form factor to achieve a depth of field under about 1 meter, for a subject at 3 meters distance. Technically, such is possible, but would require very large and expensive lenses. Depth of field for compact cameras under normal conditions can easily be tens of meters or even infinity, meaning that everything from the subject to the far distance is in focus.
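
The depth of field figures quoted in these examples follow, to order of magnitude, from the standard thin-lens approximation DoF ≈ 2Nc·u²/f², valid when the subject distance u is well below the hyperfocal distance. The sketch below is illustrative only and not part of the patent; the function name and the circle of confusion values are assumptions.

```python
# Hypothetical helper, not from the patent: approximate total depth of
# field as DoF ~ 2 * N * c * u^2 / f^2 (subject well inside the
# hyperfocal distance).
def depth_of_field_m(f_mm, f_number, subject_m, coc_mm):
    """Approximate total depth of field in meters.

    f_mm      -- real (not equivalent) focal length in millimeters
    f_number  -- aperture N (e.g. 2.8)
    subject_m -- subject distance in meters
    coc_mm    -- circle of confusion for the sensor format, in millimeters
    """
    u_mm = subject_m * 1000.0
    dof_mm = 2.0 * f_number * coc_mm * u_mm ** 2 / f_mm ** 2
    return dof_mm / 1000.0

# Full-frame SLR, 50 mm at f/2.8, subject at 3 m (assumed CoC ~0.030 mm):
print(depth_of_field_m(50.0, 2.8, 3.0, 0.030))  # ~0.6 m, cf. example (i)
# Small-sensor compact, ~9 mm real focal length (50 mm equivalent),
# f/2.8, assumed CoC ~0.005 mm:
print(depth_of_field_m(9.0, 2.8, 3.0, 0.005))   # ~3 m; example (ii) quotes
                                                # 6 m, so order of magnitude only
```
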
If the person is closer to the camera than 3 meters, all the depth of field distances discussed above will be smaller, and if the person is further away, they will all be larger. Importantly, an SLR camera will always be able to achieve a significantly smaller depth of field than a compact camera. The depth of field is largely dictated by the size of the camera sensor.
A method of producing artificial bokeh with a compact camera, mimicking the amount and quality of background blur produced by an SLR camera, would provide a major improvement in image quality for compact camera users.

Camera manufacturers and professional photographers have recognised the depth of field limitations of small format cameras for decades. With the advent of digital camera technology, it has become feasible to process camera images after capture to modify the appearance of the photo. The generation of SLR-like bokeh from compact camera images has been an early target for research in the field of digital camera image processing. However, no solution providing results of high (i.e. visually acceptable) aesthetic quality has been demonstrated.

To accurately mimic small depth of field given a large depth of field photo, objects in the image must be blurred by an amount that varies with distance from the camera. The most common prior approach tackles this problem in two steps:

(1a). Estimate the distance of regions in the image from the camera to produce a depth map.

(1b). Apply a blurring operation using a blur kernel size that varies with the estimated distance.

Step (1a) is a difficult problem in itself, and the subject of active research by many groups. The three main methods of depth map estimation from camera images (i.e. excluding active illumination methods) are:

(i) Stereo: taking photos from different camera positions and extracting depth from parallax. A major disadvantage of this approach is the requirement to take photos from multiple viewpoints, making it impractical for compact cameras.

(ii) Depth from focus (DFF): taking a series of many images focused at different distances and measuring in patches which photo corresponds to a best focus at that patch, usually using maximal contrast as the best focus criterion. A major disadvantage of this approach is that many exposures are required, necessitating a long elapsed time. During the exposures the camera or subject may inadvertently move, potentially blurring the subject and introducing additional problems caused by image misalignment.

(iii) Depth from defocus (DFD): quantifying the difference in amount of blur between two images taken with different focus and equating the blur difference to a distance. This is the most suitable approach for implementation in a compact camera, as it does not require stereo camera hardware and can be performed with as few as two photos. However, it has the disadvantages that accuracy is typically relatively low, particularly around the boundaries of objects in the scene, and that consistency is adversely affected by differing object textures in the scene. Some DFD methods show better accuracy around object edges, at the cost of using computationally expensive algorithms unsuited to implementation in camera hardware.

Step (1b) is computationally expensive for optically realistic blur kernel shapes. A fallback is to use a Gaussian blur kernel, which produces a blur that looks optically unrealistic, making the resulting image aesthetically unpleasing. A minimal sketch of this blur application step is given after the simplified method below.

To more easily approach artificial bokeh, many prior methods use a simplified version of the above two-step method, being:

(2a). Segment the image into a foreground region and a background region.

(2b). Apply a constant blurring operation to the background region only.
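
As a rough illustration of the blur application step in either variant (step (1b) with a per-pixel depth map, or step (2b) with a binary foreground/background mask), the sketch below blends pre-blurred copies of the image. It is an assumption-laden illustration rather than the method of this disclosure: the function name and parameters are invented, and scipy's Gaussian filter stands in for the blur kernel, with the optical realism caveat noted above.

```python
# Illustrative sketch only, not the patent's method: apply blur whose
# size varies with an estimated depth map, by blending a small stack of
# pre-blurred images. A Gaussian kernel is used purely for simplicity;
# as noted above, Gaussian bokeh looks optically unrealistic compared
# with a lens-shaped (e.g. pillbox) kernel.
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_by_depth(image, depth01, levels=8, max_sigma=6.0):
    """image: 2-D grayscale array; depth01: per-pixel values in [0, 1],
    where 0 means in focus and 1 means farthest from the in-focus plane."""
    sigmas = np.linspace(0.0, max_sigma, levels)
    stack = [image if s == 0 else gaussian_filter(image, s) for s in sigmas]
    # Quantise the depth map to an index into the blur stack.
    idx = np.clip((depth01 * (levels - 1)).round().astype(int), 0, levels - 1)
    out = np.empty_like(image)
    for i, blurred in enumerate(stack):
        out[idx == i] = blurred[idx == i]
    return out
```

A binary foreground/background mask, as in step (2b), is then just the two-level special case, e.g. blur_by_depth(img, mask.astype(float), levels=2).
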
Assuming step (2a) is done correctly, step (2b) is straightforward. However, step (2a) is still difficult and has not been achieved satisfactorily within the constraints of a compact camera. In particular, the accuracy of segmentation around the edges of objects at different depths in the scene is poor. Even if this simplified method can be achieved without error, the resulting images can look artificial, since intermediate levels of blur between the foreground and background will be absent.

A further variation replaces the depth estimate with a direct blur estimate:

(3a). Estimate the amount of blur in regions of the image compared to a blur-free representation of the scene.

(3b). Apply a blurring operation using a blur kernel size that varies with the estimated blur amount.

A compact camera does not have an infinite depth of field, so the background will show a small amount of blurring relative to an in-focus foreground object. If such blurred regions can be identified accurately, they can be blurred more, producing increased blur in the background.

Step (3a) can be performed with a single image, or by using multiple images of the scene captured with different camera parameters. Estimating blur from a single image is under-constrained and can only be achieved under certain assumptions. For example, one assumption is that edges detected in the image are step function edges in the scene, blurred by the camera optics, and that regions away from edges may be accurately infilled from the edge blur estimates. These assumptions are often false, resulting in poor blur estimates. Estimating blur from multiple images is closely related to DFD, because blur amount is directly related to depth, and shares the same problems.

SUMMARY

According to the present disclosure there is provided a method of modifying the blur in at least a part of an image of a scene, said method comprising: capturing at least two images of the scene, said images being captured with different camera parameters to produce a different amount of blur in each of the captured images; selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

Preferably, the set of frequency domain pixel values is modified before each of the pixel values is raised to the predetermined power. The modification may include a smoothing filtering operation, a normalisation operation, and/or a weighting operation. The weights for the weighting operation are determined by the phases of the set of frequency domain pixel values.

Typically the at least two images of the scene are divided into a plurality of corresponding image patches in each of the captured images; and the output image patches are combined to produce an output image. Desirably the plurality of corresponding image patches in each of the captured images form a tiling substantially covering the area of the captured images, and the output image is formed by tiling the output image patches. Generally the plurality of corresponding image patches in each of the captured images overlap, and the output image is formed by combining the pixel values of the output image patches.

In a specific implementation the plurality of corresponding image patches in each of the captured images cover part of the area of the captured images; and the output image patches are combined with the area of at least one of the captured images not covered by the plurality of corresponding image patches to produce an output image. Desirably at least part of the area of the at least one of the captured images not covered by the plurality of corresponding image patches is blurred by convolution with a blur kernel.

According to another aspect, disclosed is a camera comprising an image capture system coupled to memory in which captured images are stored, a processor, and a program executable by the processor to modify the blur in at least a part of an image of a scene, said program comprising: code for causing the capture system to capture at least two images of the scene, said images being captured with different camera parameters to produce a different amount of blur in each of the captured images; code for selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; code for calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; code for raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and code for combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

Another aspect is a camera system comprising: a lens formed of optics producing a relatively large depth of field; a sensor configured to capture an image of a scene focussed through the lens; a memory in which images captured by the sensor are stored; a capture mechanism configured to capture at least two images of the scene with different capture parameters and to store the images in the memory; a processor; and a program stored in the memory and executable by the processor to modify blur in at least a part of one of the captured images, said program comprising: code for capturing at least two images of the scene with different camera parameters to produce a different amount of blur in each of the captured images; code for selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; code for calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; code for raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and code for combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

In another aspect disclosed is a computer readable storage medium having a program recorded thereon, the program being executable by a processor to modify blur in at least a part of an image of a scene, the program comprising: code for receiving at least two images of the scene, said images being
captured with different camera parameters to produce a different amount of blur in each of the captured images; code for selecting a corresponding image patch in each of the captured images, each of the selected image patches having an initial amount of blur; code for calculating a set of frequency domain pixel values from a combined function of Fourier transforms of two of the selected image patches; code for raising each of the pixel values in the set of frequency domain pixel values to a predetermined power, thereby forming an amplified set of frequency domain pixel values; and code for combining the amplified set of frequency domain pixel values with the pixels of the selected image patch in one of the captured images to produce an output image patch with blur modified with respect to the initial amount of blur in the image patch, wherein the amount of modification with respect to blur varies across different regions of the image patch.

Other aspects are disclosed.
BRIEF DESCRIPTION OF THE DRAWINGS

At least one embodiment of the invention will now be described with reference to the following drawings, in which:

FIG. 1 is a schematic diagram of a scene and an image capture device positioned to capture an image of the scene;

FIG. 2 is a schematic diagram illustrating the geometry of a lens forming two different images at two different focal planes;

FIGS. 3A and 3B illustrate a two-dimensional Gaussian function and a two-dimensional pillbox function, and one-dimensional cross-sections thereof;

FIGS. 4A and 4B collectively form a schematic block diagram of a general purpose computer on which various implementations may be practised;

FIGS. 5A, 5B, and 5C illustrate example images upon which artificial bokeh processing according to the present disclosure may be performed;

FIG. 6 is a diagram illustrating the correspondence between pixels and image patches within a first image and a second image of a scene;

FIG. 7 is a schematic flow diagram illustrating an exemplary method of determining an artificial bokeh image from two images of a scene, according to the present disclosure;

FIG. 8 is a schematic flow diagram illustrating one example of a method of capturing two images as used in the method of FIG. 7;

FIG. 9 is a schematic flow diagram illustrating one example of a method of asymmetrical patch selection as used in the method of FIG. 7;

FIG. 10 is a schematic flow diagram illustrating one example of a method of determining an artificial bokeh image patch from two corresponding patches of two images of a scene as used in the method of FIG. 7;

FIG. 11 is a schematic flow diagram illustrating one example of a method of assembling artificial bokeh patches into an artificial bokeh image as used in the method of FIG. 7; and

FIG. 12 is a schematic flow diagram illustrating a second exemplary method of determining an artificial bokeh image from two images of a scene, according to the present disclosure.

DETAILED DESCRIPTION INCLUDING BEST MODE

Introduction

The present disclosure is directed to providing methods of rendering a photographic image taken with large depth of field so as to mimic a photo taken with a smaller depth of field by modifying blur already present in the image taken with a large depth of field. The methods seek to offer one or more of improved accuracy, improved tolerance to imaging noise, improved tolerance to differences of object texture in the image, and improved aesthetic appearance of the final image, all of these particularly in regions at and near the boundaries of objects in the scene.

Context

Thin Lens Equation, Basic Geometry

The technical details of accurately rendering artificial bokeh rely on key aspects of the geometry and optics of imaging devices. Most scenes that are captured using an imaging device, such as a camera, contain multiple objects, which are located at various distances from the lens of the
device. Commonly, the imaging device is focused on an object of interest in the scene. The object of interest shall be referred to as the subject of the scene. Otherwise, objects in the scene, which may include the subject, shall simply be referred to as objects.

FIG. 1 is a schematic diagram showing the geometrical relationships between key parts of an imaging device and objects in a scene to be captured. FIG. 1 shows an imaging device or system (e.g. a camera) 100 which includes a lens 110 and a sensor 115. For the purposes of this description, the camera 100 is typically a compact digital camera and the lens 110 has relatively small optics producing a large depth of field, particularly in comparison to an SLR camera. FIG. 1 also shows an in-focus plane 130 and a general object 140, formed by a sphere positioned upon a rectangular prism, forming part of the scene but not necessarily the subject of the scene to be captured. The image plane 120 of the imaging device 100, also referred to as the focal plane, is defined to be at the location of the sensor 115. When projected through the lens 110, the image plane 120 forms the in-focus plane 130, which can be considered to be a virtual plane in the geometrical region of the object 140. A distance 150 from the lens 110 to the image plane 120 is related to a distance 160 from the lens 110 to the in-focus plane 130 by the thin lens law, according to the equation

    1/f = 1/z_i + 1/z_o    (1)

where f is the focal length of the lens 110, z_i is the lens-to-sensor distance 150, and z_o is the distance 160 from the lens 110 to the in-focus plane 130. The general scene object 140 is located at a distance 170 from the lens 110 and at a distance 180 from the in-focus plane 130. This distance 170 is referred to as z_s. The distance 180 from the object 140 to the in-focus plane 130 is given by z_s - z_o and may be positive, zero, or negative. If the object 140 is focused onto the image plane 120, then z_s = z_o and the object 140 is located in the in-focus plane 130. If z_s is less than or greater than z_o, then the object 140 is located behind or in front of the in-focus plane 130 respectively, and the image of the object 140 will appear blurred on the image plane 120.
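
As a worked example with assumed, illustrative numbers: a lens of focal length f = 50 mm focused so that z_o = 3.0 m gives, by rearranging equation (1),

    z_i = (1/f - 1/z_o)^(-1) = (1/0.050 - 1/3.0)^(-1) m ≈ 0.0508 m,

so the sensor sits only about 0.8 mm beyond the focal length; millimetre-scale changes in the lens-to-sensor distance refocus the camera across metres of subject distance.
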
FIG. 1 illustrates a relatively simple geometrical optics model of imaging. This model relies on approximations including the thin lens approximation, paraxial imaging rays, and a lens free of aberrations. These approximations ignore some aspects of the optics that are inherent in actual imaging systems, but are sufficient for general understanding of imaging behaviour, as is understood by those skilled in the art.
Focusing is carried out either manually by the user or by using an autofocus mechanism that is built into the imaging device 100. Focusing typically manipulates the lens-to-sensor distance 150 in order to place the in-focus plane 130 such that the distance z_o 160 is equal to the distance z_s 170 to a specific object of interest, i.e. to place the subject in the in-focus plane 130. Other objects in the scene that have a distance z_s from the lens 110 that is different from that of the subject are located either behind or in front of the in-focus plane 130. These other objects will appear blurred to some degree on the image plane 120 and thus in the image captured on the sensor 115. This blur is referred to as defocus blur.

Defocus Blur

The amount of defocus blurring of an imaged object 140 increases with the distance 180 of the object 140 from the in-focus plane 130. The amount of defocus blur present in a given patch or portion of a captured 2D image can be characterised by the point spread function (PSF). The PSF is the response of the imaging system to a point source, defined such that the integral of the PSF over the image plane is equal to unity. The PSF of an optical system is generally a spatially restricted two-dimensional function
