(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2013/0101176 A1
     PARK et al.                       (43) Pub. Date: Apr. 25, 2013

US 20130101176A1

(54) 3D IMAGE ACQUISITION APPARATUS AND METHOD OF CALCULATING DEPTH INFORMATION IN THE 3D IMAGE ACQUISITION APPARATUS

(75) Inventors: Yong-hwa PARK, Yongin-si (KR); Jang-woo YOU, Yongin-si (KR); Hee-sun YOON, Seoul (KR)

(73) Assignee: SAMSUNG ELECTRONICS CO., LTD., Suwon-si (KR)

(21) Appl. No.: 13/594,094

(22) Filed: Aug. 24, 2012

(30) Foreign Application Priority Data

Oct. 25, 2011 (KR) ........................ 10-2011-0109431

Publication Classification

(51) Int. Cl.
     G06K 9/00   (2006.01)
     H04N 13/02  (2006.01)
(52) U.S. Cl.
     USPC ...................... 382/106; 348/49; 348/E13.074
(57) ABSTRACT

A 3-dimensional (3D) image acquisition apparatus and a method of calculating depth information in the 3D image acquisition apparatus, the 3D image acquisition apparatus including: an optical modulator for modulating light reflected from a subject by sequentially projected N (N is 3 or a larger natural number) light beams; an image sensor for generating N sub-images by capturing the light modulated by the optical modulator; and a signal processor for calculating depth information regarding a distance to the subject by using the N sub-images.
`
INTENSIVELY PROJECT N DIFFERENT PROJECTION LIGHT BEAMS TO SUBJECT (S1)

MODULATE N REFLECTION LIGHT BEAMS REFLECTED FROM SUBJECT (S2)

GENERATE N SUB-IMAGES BY CAPTURING N MODULATED REFLECTION LIGHT BEAMS (S3)

READ WEIGHTING FACTORS CORRESPONDING TO NUMBER OF USED PROJECTION LIGHT BEAMS, INTENSITIES OF PROJECTION LIGHT BEAMS, AND PHASES OF PROJECTION LIGHT BEAMS FROM MEMORY (S4)

CALCULATE FIRST AVERAGE IMAGE V BY MULTIPLYING N SUB-IMAGES BY FIRST WEIGHTING FACTORS (S5)

CALCULATE SECOND AVERAGE IMAGE U BY MULTIPLYING N SUB-IMAGES BY SECOND WEIGHTING FACTORS (S6)

CALCULATE DEPTH INFORMATION FROM FIRST AVERAGE IMAGE V AND SECOND AVERAGE IMAGE U (S7)
`
Align Ex. 1016
U.S. Patent No. 9,962,244
0001
`
`

`

Patent Application Publication    Apr. 25, 2013  Sheet 1 of 10    US 2013/0101176 A1

[FIG. 1: schematic diagram of the 3D image acquisition apparatus 100; the drawing content is not recoverable from the OCR text]
`

`

Patent Application Publication    Apr. 25, 2013  Sheet 2 of 10    US 2013/0101176 A1

[FIGS. 2A to 2D: process of generating N different sub-images by modulating N different reflection light beams; the drawing content is not recoverable from the OCR text]

`

Patent Application Publication    Apr. 25, 2013  Sheet 3 of 10    US 2013/0101176 A1

[FIGS. 3A to 3D: process of generating N different sub-images with one projection light beam and N different optical modulation signals; the drawing content is not recoverable from the OCR text]

`

`Patent Application Publication
`
`Apr. 25, 2013 Sheet 4 of 10
`
`US 2013/0101176 A1
`
FIG. 4A

[Time graph at a projection-light duty rate of 100%, with labels OPTICAL POWER, GAIN OF OPTICAL MODULATOR, SUB-IMAGE, and EXPOSURE TIME OF IMAGE PICKUP DEVICE; the waveform drawing is not recoverable from the OCR text]

DUTY RATE 100%
`
`

`

Patent Application Publication    Apr. 25, 2013  Sheet 5 of 10    US 2013/0101176 A1

FIG. 4B

[Time graph at a projection-light duty rate of 20%, with labels OPTICAL POWER, AMBIENT LIGHT, IR LIGHT, SUB-IMAGE, and EXPOSURE TIME OF IMAGE PICKUP DEVICE; the waveform drawing is not recoverable from the OCR text]

DUTY RATE 20%
`
`

`

`Patent Application Publication
`
`Apr. 25, 2013 Sheet 6 of 10
`
`US 2013/0101176 A1
`
`022505-5322
`
`022505-522m_
`
`oz__\2¢o5-m_o<_2_m_
`
`022505-522:
`
`295.5200
`
`295.5200
`
`295.5200
`
`295.5200
`
`95
`
`32085
`®>3m00xm
`
`2::
`
`a.TTRMmzfigwmmr
`
`”50
`
`
`A Tnn
`
`202.05.555
`
`._._._0_._
`
`._<0:.n_0
`
`mok<43oog
`
`mmzzf
`
`m__0<2_
`
`53205
`
`50350
`
`0007
`
`0007
`
`

`

Patent Application Publication    Apr. 25, 2013  Sheet 7 of 10    US 2013/0101176 A1

[FIG. 6: time graph of capturing an image when not all pixels of the image pickup device are exposed during a single operating time of the optical modulator; the drawing content is not recoverable from the OCR text]
`

`

Patent Application Publication    Apr. 25, 2013  Sheet 8 of 10    US 2013/0101176 A1

[FIG. 7: schematic diagram of the process of calculating depth information from N different images; the drawing content is not recoverable from the OCR text]
`

`

Patent Application Publication    Apr. 25, 2013  Sheet 9 of 10    US 2013/0101176 A1

[FIG. 8: table of weighting factors Ak and Bk; the table content is not recoverable from the OCR text]
`

`

Patent Application Publication    Apr. 25, 2013  Sheet 10 of 10    US 2013/0101176 A1

FIG. 9

INTENSIVELY PROJECT N DIFFERENT PROJECTION LIGHT BEAMS TO SUBJECT (S1)

MODULATE N REFLECTION LIGHT BEAMS REFLECTED FROM SUBJECT (S2)

GENERATE N SUB-IMAGES BY CAPTURING N MODULATED REFLECTION LIGHT BEAMS (S3)

READ WEIGHTING FACTORS CORRESPONDING TO NUMBER OF USED PROJECTION LIGHT BEAMS, INTENSITIES OF PROJECTION LIGHT BEAMS, AND PHASES OF PROJECTION LIGHT BEAMS FROM MEMORY (S4)

CALCULATE FIRST AVERAGE IMAGE V BY MULTIPLYING N SUB-IMAGES BY FIRST WEIGHTING FACTORS (S5)

CALCULATE SECOND AVERAGE IMAGE U BY MULTIPLYING N SUB-IMAGES BY SECOND WEIGHTING FACTORS (S6)

CALCULATE DEPTH INFORMATION FROM FIRST AVERAGE IMAGE V AND SECOND AVERAGE IMAGE U (S7)
`

`

`US 2013/0101176 A1
`
`Apr. 25, 2013
`
`3D IMAGE ACQUISITION APPARATUS AND
`METHOD OF CALCULATING DEPTH
`INFORMATION IN THE 3D IMAGE
`ACQUISITION APPARATUS
`
`CROSS-REFERENCE TO RELATED
`APPLICATIONS
`
[0001] This application claims priority from Korean Patent Application No. 10-2011-0109431, filed on Oct. 25, 2011, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
`
`BACKGROUND
`
[0002] 1. Field
`[0003] The present disclosure relates to 3-dimensional
`(3D) image acquisition apparatuses and methods of calculat-
`ing depth information in the 3D image acquisition appara-
`tuses.
`
[0004] 2. Description of the Related Art
[0005] Recently, the importance of 3-dimensional (3D) content has been increasing with the development of, and growing demand for, 3D display devices that display images with depth perception. Accordingly, research is being conducted into 3D image acquisition apparatuses, such as 3D cameras, with which a user can personally create 3D content. Such a 3D camera acquires depth information in addition to conventional 2D color image information in a single capture.
[0006] Depth information regarding distances between surfaces of a subject and a 3D camera may be acquired using a stereo vision method with two cameras or a triangulation method using structured light and a camera. However, the accuracy of the depth information obtained by these methods rapidly decreases as the distance to the subject increases, and because these methods depend on the surface state of the subject, it is difficult to acquire accurate depth information.
[0007] To address this problem, a Time-of-Flight (TOF) method has been introduced. The TOF method measures the flight time of a light beam, that is, the time until light reflected from a subject is received by a light-receiving unit after illumination light is projected onto the subject. According to the TOF method, light of a predetermined wavelength (e.g., Near Infrared (NIR) light of 850 nm) is irradiated onto a subject by using an illumination optical system including a Light-Emitting Diode (LED) or a Laser Diode (LD). Light of the same wavelength is reflected from the subject and received by a light-receiving unit. Thereafter, a series of processing steps for calculating depth information is performed. Various TOF technologies have been introduced according to this series of processing steps.
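The round-trip relation underlying the TOF method described above can be sketched in a few lines. The 20 ns round-trip time below is an illustrative value, not a number from this publication:

```python
# Minimal sketch of the TOF relation: the measured quantity is the round-trip
# flight time of the projected light, so depth is c * t / 2.
C = 299_792_458.0  # speed of light, m/s

def depth_from_flight_time(delta_t_s):
    """Distance to the subject for a measured round-trip time delta_t_s."""
    return C * delta_t_s / 2.0

# A round trip of 20 ns corresponds to roughly 3 m of depth.
print(depth_from_flight_time(20e-9))
```

In practice, as the later sections describe, the flight time is not timed directly but recovered as a phase delay of the periodic projection light.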
`[0008]
`In the TOF method described above, depth informa-
`tion is calculated by assuming an ideal environment without
`noise. However, when a 3D camera is used, ambient light,
`such as illumination in an indoor environment and sunlight in
`an outdoor environment, always exists in the surroundings.
`The ambient light is incident to the 3D camera and becomes
`noise in a process of calculating depth information.
`[0009] Accordingly, it is necessary to reduce ambient light
`causing noise in the process of calculating depth information.
`
`SUMMARY
`
[0010] Provided are a method of calculating depth information by reducing captured ambient light, and a 3D image acquisition apparatus therefor.
`
`[0011] Additional aspects will be set forth in part in the
`description which follows and, in part, will be apparent from
`the description, or may be learned by practice of the exem-
`plary embodiments.
`[0012] According to an aspect of an exemplary embodi-
`ment, a 3-dimensional (3D) image acquisition apparatus
`includes: an optical modulator for modulating light reflected
`from a subject by sequentially projected N (N is 3 or a larger
`natural number) light beams; an image sensor for generating
`N sub-images by capturing the light modulated by the optical
`modulator; and a signal processor for calculating depth infor-
`mation regarding a distance to the subject by using the N
`sub-images.
`[0013] The N light beams may be discontinuously pro-
`jected.
`[0014] The N projected light beams may be different from
`each other and be emitted by one or more light sources.
`[0015] The one or more light sources may sequentially
`project the N light beams with a predetermined time interval.
`[0016] An operating time of the optical modulator may be
`synchronized with a projecting time of each of the N light
`beams.
`
`[0017] The operating time of the optical modulator may be
`shorter than the projecting time.
`[0018] An exposure time of the image sensor may be syn-
`chronized with the operating time of the optical modulator.
[0019] The image sensor may be exposed during the light-projecting time to capture the modulated light and may form the N sub-images during at least a portion of a remaining time of the light-projecting time.
[0020] All pixels of the image sensor may be exposed to the modulated light during the light-projecting time.
`[0021] The N light beams may be periodic waves having the
`same period and at least one selected from the group consist-
`ing of a different intensity and a different phase.
`[0022] The optical modulator may modulate the reflected
`light with the same modulation signal.
`[0023] The N light beams may be the same periodic waves.
`[0024] The optical modulator may modulate the reflected
`light with different modulation signals.
`[0025] A phase difference between any two light beams
`projected at adjacent times from among the N light beams
`may be a value obtained by equally dividing 360° by N.
`[0026] The reflected light may include N reflection light
`beams obtained by reflecting the N light beams from the
`subject.
[0027] The N sub-images generated by the image sensor may sequentially one-to-one match the N reflection light beams.

[0028] If the N sub-images do not one-to-one match the N reflection light beams, the signal processor may convert the N sub-images on a line-by-line basis and sequentially one-to-one match the N line-based sub-images with the N reflection light beams.
`
`[0029] The signal processor may generate a first average
`image by averaging the N sub-images multiplied by first
`weighting factors, generate a second average image by aver-
`aging the N sub-images multiplied by second weighting fac-
`tors, and calculate the depth information from the first aver-
`age image and the second average image.
`[0030] The depth information may be calculated from an
`arctangent value of a ratio of the first average image to the
`second average image.
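As a concrete illustration of paragraphs [0029] and [0030], the sketch below forms the first and second average images V and U as weighted averages of the N sub-image intensities of one pixel and takes the arctangent of their ratio. The sine/cosine weighting factors and the 20 MHz modulation frequency are assumptions for the common equally-spaced-phase case, not the patent's stored factors Ak and Bk:

```python
import math

def depth_from_subimages(intensities, mod_freq_hz, c=299_792_458.0):
    """Recover depth from N sub-image intensities of one pixel.

    Assumes sin/cos weighting factors for phases 0, 360/N, 2*360/N, ... degrees;
    the patent instead reads its weighting factors from memory (FIG. 8).
    """
    n = len(intensities)
    phases = [2.0 * math.pi * k / n for k in range(n)]
    v = sum(i * math.sin(p) for i, p in zip(intensities, phases)) / n  # first average image V
    u = sum(i * math.cos(p) for i, p in zip(intensities, phases)) / n  # second average image U
    phi = math.atan2(v, u)  # arctangent of the ratio V / U, i.e. the phase delay
    return c * phi / (4.0 * math.pi * mod_freq_hz)

# Four pixel intensities synthesized with a 45 degree phase delay:
subimages = [1.0 + 0.5 * math.cos(2.0 * math.pi * k / 4 - math.pi / 4) for k in range(4)]
print(depth_from_subimages(subimages, 20e6))
```

The 45 degree delay is recovered exactly, giving c/(16 f), roughly 0.94 m at 20 MHz.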
`
`
`

`

`
`[0031] According to another aspect of an exemplary
`embodiment, a method of calculating depth information
`includes: modulating light reflected from a subject by sequen-
`tially projecting N (N is 3 or a larger natural number) light
`beams; generating N sub-images by capturing the light modu-
`lated by the optical modulator; and calculating depth infor-
`mation regarding a distance to the subject by using the N
`sub-images.
`[0032] The N light beams may be discontinuously pro-
`jected.
`[0033] The N projected light beams may be different from
`each other and be emitted by one or more light sources.
`[0034] The N light beams may be sequentially projected
`with a predetermined time interval.
`[0035] An operating time of an optical modulator for
`modulating the light may be synchronized with a projecting
`time of each of the N light beams.
`[0036] The operating time of the optical modulator may be
`shorter than the projecting time.
`[0037] An exposure time of an image sensor for capturing
`the light may be synchronized with the operating time of the
`optical modulator.
[0038] All pixels of the image sensor may be exposed to the modulated light during the light-projecting time.
`[0039] The N light beams may be periodic waves having the
`same period and at least one selected from the group consist-
`ing of a different intensity and a different phase, and the
`reflected light may be modulated with the same modulation
`signal.
`[0040] The N light beams may be the same periodic waves,
`and the reflected light may be modulated with different modu-
`lation signals.
`[0041] A phase difference between any two light beams
`projected at adjacent times from among the N light beams
`may be a value obtained by equally dividing 360° by N.
`[0042] The generated N sub-images may sequentially one-
`to-one match the N reflection light beams.
[0043] The method may further include, if the N sub-images do not one-to-one match the N reflection light beams, converting the N sub-images on a line-by-line basis and sequentially one-to-one matching the N line-based sub-images with the N reflection light beams.
`[0044] A first average image may be generated by averag-
`ing the N sub-images multiplied by first weighting factors, a
`second average image may be generated by averaging the N
`sub-images multiplied by second weighting factors, and the
`depth information may be calculated from the first average
`image and the second average image.
`[0045] The depth information may be calculated from an
`arctangent value of a ratio of the first average image to the
`second average image.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`[0046] These and/or other aspects will become apparent
`and more readily appreciated from the following description
`of the embodiments, taken in conjunction with the accompa-
`nying drawings in which:
`[0047]
`FIG. 1 is a schematic diagram of a 3-dimensional
`(3D) image acquisition apparatus according to an exemplary
`embodiment;
[0048] FIGS. 2A to 2D illustrate a process of generating N different sub-images by modulating N different reflection light beams, according to an exemplary embodiment;
`
[0049] FIGS. 3A to 3D illustrate a process of generating N different sub-images with one projection light beam and N different optical modulation signals, according to an exemplary embodiment;
[0050] FIGS. 4A and 4B are time graphs of when a 3D image is captured with a duty rate of projection light of 100% and a duty rate of projection light of 20%, respectively, according to an exemplary embodiment;
`[0051]
`FIG. 5 is a time graph of when an image is captured
`by synchronizing a light source, an optical modulator, and an
`image pickup device with each other, according to an exem-
`plary embodiment;
`[0052]
`FIG. 6 is a time graph when an image is captured
`when not all pixels of an image pickup device are exposed
`during a single operating time of an optical modulator;
`[0053]
`FIG. 7 is a schematic diagram for describing a pro-
`cess of calculating depth information from N different
`images, according to an exemplary embodiment;
`[0054]
`FIG. 8 is a table illustrating weighting factors Ak and
`Bk, according to an exemplary embodiment; and
`[0055]
`FIG. 9 is a flowchart illustrating a method of calcu-
`lating depth information, according to an exemplary embodi-
`ment.
`
`DETAILED DESCRIPTION
`
`[0056] Reference will now be made in detail to exemplary
`embodiments, examples of which are illustrated in the
`accompanying drawings. In the drawings, the widths and
`thicknesses of layers and regions are exaggerated for the
`clarity of the specification. In the description, like reference
`numerals refer to like elements throughout. Expressions such
`as “at least one of,” when preceding a list of elements, modify
`the entire list of elements and do not modify the individual
`elements of the list.
`
[0057] FIG. 1 is a schematic diagram of a 3-dimensional (3D) image acquisition apparatus 100 according to an exemplary embodiment. Referring to FIG. 1, the 3D image acquisition apparatus 100 may include a light source 101 for generating light having a predetermined wavelength, an optical modulator 103 for modulating light reflected from a subject 200, an image pickup device 105 (e.g., an image sensor) for generating a sub-image from the modulated light, a signal processor 106 for calculating depth information based on a sub-image formed by the image pickup device 105 and generating an image including the depth information, and a controller 107 for controlling operations of the light source 101, the optical modulator 103, the image pickup device 105, and the signal processor 106.
`[0058]
`In addition, the 3D image acquisition apparatus 100
`may further include, in front of a light-incident face of the
`optical modulator 103, a filter 108 for transmitting only light
`having a predetermined wavelength from among the light
`reflected from the subject 200 and a first lens 109 for concen-
`trating the reflected light within an area of the optical modu-
`lator 103, and a second lens 110 for concentrating the modu-
`lated light within an area of the image pickup device 105
`between the optical modulator 103 and the image pickup
`device 105.
`
[0059] The light source 101 may be, for example, a Light-Emitting Diode (LED) or a Laser Diode (LD) capable of emitting light having a Near Infrared (NIR) wavelength of about 850 nm, which is invisible to human eyes, for safety. However, the light source 101 is not limited to this wavelength band or light-source type.
`
`
`

`

`
`[0060] Light projected from the light source 101 to the
`subject 200 may have a form of a periodic continuous func-
`tion having a predetermined period. For example, the pro-
`jected light may have a specifically defined waveform such as
`a sine wave, a ramp wave, or a square wave, or an undefined
`general waveform. In addition, the light source 101 may
`intensively project light to the subject 200 for only a prede-
`termined time in a periodic manner under control of the
`controller 107. A time that light is projected to the subject 200
`is called a light-projecting time.
`[0061] The optical modulator 103 modulates light reflected
`from the subject 200 under control of the controller 107. For
`example, the optical modulator 103 may modulate the inten-
`sity of the reflected light by changing a gain in response to an
`optical modulation signal having a predetermined wave-
`length. To do this, the optical modulator 103 may have a
`variable gain.
[0062] The optical modulator 103 may operate at a high modulation frequency of tens to hundreds of MHz to identify a phase difference or a traveling time of light according to a distance. An optical modulator 103 satisfying this condition may be, for example, an image intensifier including a Multi-Channel Plate (MCP), a GaAs-series solid optical modulator, or a thin-type optical modulator using an electro-optic material. Although the optical modulator 103 is a transmission-type optical modulator in FIG. 1, a reflection-type optical modulator may also be used.
[0063] Like the light source 101, the optical modulator 103 may also operate for a predetermined time to modulate the light reflected from the subject 200. The time that the optical modulator 103 operates to modulate light is called the operating time of the optical modulator 103. The light-projecting time of the light source 101 may be synchronized with the operating time of the optical modulator 103. Thus, the operating time of the optical modulator 103 may be the same as or shorter than the light-projecting time of the light source 101.
[0064] The image pickup device 105 generates a sub-image by detecting the reflected light modulated by the optical modulator 103 under control of the controller 107. If only a distance to any one point on the subject 200 is to be measured, the image pickup device 105 may use a single optical sensor such as, for example, a photodiode or an integrator. However, if distances to a plurality of points on the subject 200 are to be measured, the image pickup device 105 may have a plurality of photodiodes or a 2D or 1D array of other optical detectors. For example, the image pickup device 105 may include a Charge-Coupled Device (CCD) image sensor or a Complementary Metal-Oxide Semiconductor (CMOS) image sensor. The image pickup device 105 may generate a single sub-image per reflected light beam.
`[0065] The signal processor 106 calculates depth informa-
`tion based on a sub-image formed by the image pickup device
`105 and generates a 3D image including the depth informa-
`tion. The signal processor 106 may be implemented by, for
`example, an exclusive Integrated Circuit (IC) or software
`installed in the 3D image acquisition apparatus 100. When the
`signal processor 106 is implemented by software, the signal
`processor 106 may be stored in a separate portable storage
`medium.
`
`[0066] Hereinafter, an operation of the 3D image acquisi-
`tion apparatus 100 having the above-described structure is
`described.
`
`[0067] The light source 101 sequentially and intensively
`projects N different light beams having a predetermined
`
`period and waveform to the subject 200 under control of the
`controller 107, wherein N may be 3 or a larger natural num-
`ber. The light source 101 may sequentially project the N
`different light beams continuously or within a predetermined
`time interval.
`
[0068] For example, when 4 different projection light beams are used, the light source 101 may generate and project a first projection light beam to the subject 200 for a time T1, a second projection light beam to the subject 200 for a time T2, a third projection light beam to the subject 200 for a time T3, and a fourth projection light beam to the subject 200 for a time T4. These first to fourth projection light beams sequentially projected to the subject 200 may have a form of a continuous function having a predetermined period, such as a sine wave. For example, the first to fourth projection light beams may be periodic waves having the same period and waveform and different intensities or phases.
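The four phase-shifted beams of this example can be sketched as samples of one sine template. The period, amplitude, and DC offset below are hypothetical values chosen for illustration, not taken from the patent:

```python
import math

def projection_beam(s, t, period_s=50e-9, amplitude=1.0, offset=1.0, n_beams=4):
    """Intensity of the s-th projection beam (s = 0..n_beams-1) at time t.

    All beams share the same period and waveform; beam s lags the first beam by
    s * 360/n_beams degrees, i.e. 90 degree steps for four beams.
    """
    omega = 2.0 * math.pi / period_s
    theta = 2.0 * math.pi * s / n_beams  # per-beam phase offset
    return amplitude * math.sin(omega * t - theta) + offset

# At t = 0 the first beam sits at its DC offset; the second lags it by 90 degrees.
print(projection_beam(0, 0.0))  # 1.0
print(projection_beam(1, 0.0))  # 0.0
```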
[0069] When the N different light beams are projected, a phase difference between any two light beams projected at adjacent times may be 360°/N, and the period of each projected light beam may be shorter than the operating time of the light source 101. All of the N different light beams may be sequentially projected to the subject 200 within the operating time of the light source 101.
`[0070] A light beam projected to the subject 200 is reflected
`on the surface of the subject 200 and incident to the first lens
`109. In general, the subject 200 has a plurality of surfaces
`having different distances, i.e., depths, from the 3D image
`acquisition apparatus 100. FIG. 1 illustrates the subject 200
`having 5 surfaces P1 to P5 having different depths for sim-
`plification of description. When the projected light beam is
`reflected from the 5 surfaces P1 to P5 having different depths,
`5 differently time-delayed (i.e., different phases) reflection
`light beams are generated.
[0071] For example, 5 first reflection light beams having different phases are generated when a first projection light beam is reflected from the 5 surfaces P1 to P5 of the subject 200, and 5 second reflection light beams having different phases are generated when a second projection light beam is reflected from the 5 surfaces P1 to P5 of the subject 200. Likewise, a total of 5×N reflection light beams having different phases are generated by the time an Nth projection light beam has been reflected from the 5 surfaces P1 to P5 of the subject 200. A reflection light beam reflected from the surface P1 that is farthest from the 3D image acquisition apparatus 100 may arrive at the first lens 109 with a phase delay of Φ_P1, and a reflection light beam reflected from the surface P5 that is nearest to the 3D image acquisition apparatus 100 may arrive at the first lens 109 with a phase delay of Φ_P5 that is less than Φ_P1.
`[0072] The first lens 109 focuses the reflection light within
`an area of the optical modulator 103. The filter 108 for trans-
`mitting only light having a predetermined wavelength may be
`disposed between the first lens 109 and the optical modulator
`103 to remove ambient light, such as background light, except
`for the predetermined wavelength. For example, when the
`light source 101 emits light having an NIR wavelength of
`about 850 nm, the filter 108 may be an NIR bandpass filter for
`transmitting an NIR wavelength band of about 850 nm. Thus,
`although light incident to the optical modulator 103 may be
`mostly light emitted from the light source 101 and reflected
`from the subject 200, ambient light is also included therein.
`Although FIG. 1 shows that the filter 108 is disposed between
`the first lens 109 and the optical modulator 103, positions of
`the first lens 109 and the filter 108 may be exchanged. For
`
`
`

`

`
`example, NIR light first passing through the filter 108 may be
`focused on the optical modulator 103 by the first lens 109.
[0073] The optical modulator 103 modulates the reflection light with an optical modulation signal having a predetermined wavelength. For convenience of description, it is assumed that the 5 surfaces P1 to P5 of the subject 200 correspond to pixels divided into 5 areas of the image pickup device 105. A period of a gain wavelength of the optical modulator 103 may be the same as a period of a projection light wavelength. In FIG. 1, the optical modulator 103 may modulate the 5 first reflection light beams reflected from the 5 surfaces P1 to P5 of the subject 200 and provide the modulated light beams to the image pickup device 105 and, in succession, may sequentially modulate the 5 second reflection light beams up to the 5×N reflection light beams and provide the modulated light beams to the image pickup device 105. The intensity of the reflection light may be modulated by an amount obtained by multiplying it by an optical modulation signal when the reflection light passes through the optical modulator 103. A period of the optical modulation signal may be the same as that of the projection light.
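The multiplication described in [0073] can be sketched as follows. The sinusoidal reflection and gain profiles, their common 50 ns period, and the 60 degree phase delay are illustrative assumptions, not values from the patent:

```python
import math

PERIOD_S = 50e-9  # common period of projection light and modulator gain (assumed)

def modulated_intensity(t, phase_delay_rad):
    """Reflection light multiplied by the modulator's periodic gain at time t."""
    omega = 2.0 * math.pi / PERIOD_S
    reflected = 1.0 + 0.5 * math.sin(omega * t - phase_delay_rad)  # delayed reflection light
    gain = 0.5 * (1.0 + math.sin(omega * t))                       # gain kept within [0, 1]
    return reflected * gain

# Averaging the product over one period leaves a DC term plus a term proportional
# to cos(phase_delay); this is what makes the phase recoverable from sub-images.
samples = [modulated_intensity(k * PERIOD_S / 1000, math.pi / 3) for k in range(1000)]
average = sum(samples) / len(samples)
print(average)  # analytically 0.5 + 0.125 * cos(60 deg) = 0.5625
```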
[0074] The intensity-modulated light output from the optical modulator 103 is magnification-adjusted and refocused by the second lens 110 and arrives at the image pickup device 105. Thus, the modulated light is concentrated within the area of the image pickup device 105 by the second lens 110. The image pickup device 105 may generate sub-images by receiving the modulated light for a predetermined time through synchronization with the light source 101 and the optical modulator 103. The time that the image pickup device 105 is exposed to receive the modulated light is the exposure time of the image pickup device 105.
`[0075] A method of generating N sub-images from N
`reflection light beams will now be described.
`[0076]
`FIGS. 2A to 2D illustrate a process of generating N
`different sub-images by modulating N different reflection
`light beams, according to an exemplary embodiment.
[0077] As shown in FIG. 2A, the image pickup device 105 generates a first sub-image by receiving, for a predetermined exposure time, 5 first reflection light beams modulated after being reflected from the 5 surfaces P1 to P5 of the subject 200. Thereafter, as shown in FIG. 2B, the image pickup device 105 generates a second sub-image by receiving, for the predetermined exposure time, 5 second reflection light beams modulated after being reflected from the 5 surfaces P1 to P5 of the subject 200. After repeating these procedures, as shown in FIG. 2C, the image pickup device 105 finally generates an Nth sub-image by receiving, for the predetermined exposure time, 5 Nth reflection light beams modulated after being reflected from the 5 surfaces P1 to P5 of the subject 200. In this manner, the N different sub-images may be sequentially obtained as shown in FIG. 2D.
`
[0078] The first to Nth sub-images may be sub-frame images for generating a single frame of an image. For example, assuming that a period of a single frame is Td, an exposure time of the image pickup device 105 to obtain each of the first to Nth sub-images may be about Td/N.
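The Td/N budget in [0078] amounts to a simple division; the 30 fps frame rate and N = 4 below are hypothetical values for illustration:

```python
# Each of the N sub-images gets roughly 1/N of the frame period Td as exposure.
frame_rate_hz = 30.0      # assumed frame rate
n_subimages = 4           # assumed number of sub-images per frame
td = 1.0 / frame_rate_hz  # frame period Td
exposure_per_subimage = td / n_subimages
print(exposure_per_subimage)  # about 8.33 ms per sub-image
```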
`[0079]
`In FIGS. 2A to 2D, a case of generating N different
`sub-images by using N different projection light beams and N
`different reflection light beams has been described. However,
`it is also possible that the same reflection light beam is used
`for all sub-images and the optical modulator 103 modulates a
`reflection light beam for each of the sub-images with a dif-
`ferent gain waveform.
`
[0080] FIGS. 3A to 3D illustrate a process of generating N different sub-images with one projection light beam and N different optical modulation signals, according to an exemplary embodiment. Referring to FIGS. 3A to 3D, the reflection light beams generated by reflecting the projection light beam from the subject 200 have the same waveform and phase for all sub-images. As described above, the reflection light beams for each sub-image have different phase delays ΦP1 to ΦP5 according to the surfaces P1 to P5 of the subject 200. As shown in FIGS. 3A to 3C, the optical modulator 103 modulates the 5 first reflection light beams by using a first optical modulation signal, modulates the 5 second reflection light beams by using a second optical modulation signal different from the first optical modulation signal, and modulates the 5 Nth reflection light beams by using an Nth optical modulation signal different from any other optical modulation signal. Here, the first to Nth optical modulation signals may have entirely different waveforms, or may have the same period and waveform and differ only in phase. Accordingly, as shown in FIG. 3D, the image pickup device 105 may obtain N first to Nth sub-images that are different from each other.
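
The scheme just described can be sketched numerically: mixing one reflected waveform with N phase-shifted gain signals and integrating over the exposure yields N distinct values, one per sub-image. This is an illustrative model only; the sinusoidal waveforms, offsets, and parameter names below are assumptions, not the patent's implementation.

```python
import math

def sub_image_values(phi_delay, N, steps=2000):
    """Integrate (reflected light x i-th gain) over one period, i = 0..N-1.

    phi_delay models the distance-dependent phase delay of the reflected
    light; each gain signal is the same sinusoid shifted by 2*pi*i/N.
    (Illustrative waveforms -- assumed, not taken from the specification.)
    """
    omega = 2.0 * math.pi          # work in one normalized period T = 1
    values = []
    for i in range(N):
        gain_phase = 2.0 * math.pi * i / N
        acc = 0.0
        for k in range(steps):
            t = k / steps
            refl = 1.0 + math.sin(omega * t - phi_delay)   # reflected light
            gain = 1.0 + math.sin(omega * t - gain_phase)  # optical gain
            acc += refl * gain / steps
        values.append(acc)
    return values

# The N integrated values differ only because the gain phases differ --
# this is what makes the N sub-images mutually distinguishable.
print(sub_image_values(phi_delay=0.5, N=4))
```

Analytically each value comes out to 1 + 0.5·cos(phi_delay − gain_phase), so the set of N values encodes the phase delay and hence the distance.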

[0081] Hereinafter, a method of generating sub-images by using signal waveforms is described.

[0082] For convenience of description, an embodiment in which the light source 101 projects N different projection light beams to the subject 200 and the optical modulator 103 uses a single optical modulation signal is described as an example. However, the theoretical description below applies equally to a case where one projection light beam and N different optical modulation signals are used. In addition, since the method of calculating depth information applies equally to each pixel even when a sub-image formed by the image pickup device 105 is a 2D array sub-image, only the method applied to a single pixel is described. However, when depth information is calculated from a plurality of pixels in a 2D array sub-image at the same time, the amount of computation may be reduced through efficient data management and memory allocation, so that portions common to all pixels are processed only once.
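
The point about hoisting pixel-independent work can be sketched as follows. Consistent with the Abstract, a first average image V and a second average image U are formed by multiplying the N sub-images by weighting factors; the specific weights, the `atan2`-style combination, and all names below are hypothetical illustrations, not the specification's actual formulas.

```python
import math

def depth_per_pixel(samples, weights_v, weights_u):
    """Hypothetical per-pixel step: weighted sums V and U of the N
    sub-image samples, combined into a phase-like depth value."""
    V = sum(w * s for w, s in zip(weights_v, samples))
    U = sum(w * s for w, s in zip(weights_u, samples))
    return math.atan2(V, U)

def depth_map(sub_images, weights_v, weights_u):
    """Apply the same per-pixel computation to every pixel of a 2D array.
    The weighting factors are pixel-independent, so they are computed (or
    read from memory) once and reused, not recomputed inside the loops."""
    N, H, W = len(sub_images), len(sub_images[0]), len(sub_images[0][0])
    return [[depth_per_pixel([sub_images[n][y][x] for n in range(N)],
                             weights_v, weights_u)
             for x in range(W)]
            for y in range(H)]

# Tiny demo: N = 3 sub-images of a 2 x 2 pixel array (made-up values)
imgs = [[[1.0, 2.0], [3.0, 4.0]],
        [[0.5, 0.5], [0.5, 0.5]],
        [[0.0, 1.0], [2.0, 3.0]]]
print(depth_map(imgs, [1.0, 0.0, -1.0], [0.0, 1.0, 0.0]))
```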

[0083] First, a waveform P_e of general projection light having a period T_e may be expressed by Equations 1-1 and 1-2:

    P_e^{(s)}(t) = a^{(s)} \sin(\omega t - \theta^{(s)}) + P_{ave}                                        (1-1)

    P_e^{(s)}(t) = \sum_{k=1}^{m} \left\{ a_k^{(s)} \sin(k\omega t) + b_k^{(s)} \cos(k\omega t) \right\} + P_{ave}    (1-2)

[0084] Here, s denotes an identifier for identifying first to Nth projection light beams that are different from each other. For example, when N projection light
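
Equations 1-1 and 1-2 above can be checked numerically: Equation 1-1 is the m = 1 special case of the Fourier series in Equation 1-2, with a_1^{(s)} = a^{(s)} cos θ^{(s)} and b_1^{(s)} = −a^{(s)} sin θ^{(s)}. The sketch below verifies this identity; the parameter values (including the 20 MHz modulation frequency) are illustrative assumptions.

```python
import math

def projection_light(t, a_coeffs, b_coeffs, omega, P_ave):
    """Equation 1-2: P_e(t) = sum_k {a_k sin(k w t) + b_k cos(k w t)} + P_ave."""
    total = P_ave
    for k, (a_k, b_k) in enumerate(zip(a_coeffs, b_coeffs), start=1):
        total += a_k * math.sin(k * omega * t) + b_k * math.cos(k * omega * t)
    return total

# Equation 1-1, a sin(wt - theta) + P_ave, expands via the angle-difference
# identity to (a cos theta) sin(wt) + (-a sin theta) cos(wt) + P_ave,
# i.e. the m = 1 case of Equation 1-2. Parameter values are illustrative.
a, theta, P_ave = 1.0, 0.3, 2.0
omega = 2.0 * math.pi * 20e6   # e.g. a 20 MHz modulation frequency (assumed)
t = 1e-9
eq_1_1 = a * math.sin(omega * t - theta) + P_ave
eq_1_2 = projection_light(t, [a * math.cos(theta)],
                          [-a * math.sin(theta)], omega, P_ave)
print(abs(eq_1_1 - eq_1_2) < 1e-9)  # the two forms agree
```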
