Case 1:18-cv-00697-LPS-CJB Document 5-1 Filed 05/24/18 Page 1 of 19 PageID #: 64
EXHIBIT A
US009962244B2

(12) United States Patent
     Esbech et al.

(10) Patent No.: US 9,962,244 B2
(45) Date of Patent: May 8, 2018

(54) FOCUS SCANNING APPARATUS RECORDING COLOR

(71) Applicant: 3SHAPE A/S, Copenhagen K (DK)

(72) Inventors: Bo Esbech, Gentofte (DK); Christian Romer Rosberg, Bronshoj (DK); Mike Van Der Poel, Rodovre (DK); Rasmus Kjaer, Copenhagen (DK); Michael Vinther, Copenhagen (DK); Karl-Josef Hollenbeck, Copenhagen (DK)

(73) Assignee: 3SHAPE A/S, Copenhagen K (DK)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 223 days.

(21) Appl. No.: 14/764,087

(22) PCT Filed: Feb. 13, 2014

(86) PCT No.: PCT/EP2014/052842
     § 371 (c)(1),
     (2) Date: Jul. 28, 2015

(87) PCT Pub. No.: WO2014/125037
     PCT Pub. Date: Aug. 21, 2014

(65) Prior Publication Data
     US 2016/0022339 A1    Jan. 28, 2016

Related U.S. Application Data

(60) Provisional application No. 61/764,178, filed on Feb. 13, 2013.

(30) Foreign Application Priority Data

     Feb. 13, 2013 (DK) ................................ 2013 70077

(51) Int. Cl.
     H01J 40/14 (2006.01)
     A61C 9/00 (2006.01)
     (Continued)

(52) U.S. Cl.
     CPC .......... A61C 9/0073 (2013.01); A61C 9/006 (2013.01); A61C 9/0066 (2013.01); G01B 11/24 (2013.01);
     (Continued)

(58) Field of Classification Search
     CPC .. G01B 11/24; G01B 11/2509; G01B 11/2518
     (Continued)

(56) References Cited

U.S. PATENT DOCUMENTS

     7,698,068 B2    4/2010 Babayoff
     8,102,538 B2    1/2012 Babayoff
     (Continued)

FOREIGN PATENT DOCUMENTS

     CN    102008282 A    4/2011
     CN    102112845 A    6/2011
     (Continued)

OTHER PUBLICATIONS

The First Office Action dated Aug. 2, 2016, by the State Intellectual Property Office of People's Republic of China in corresponding Chinese Patent Application No. 201480020976.3, and an English Translation of the Office Action. (18 pages).
(Continued)

Primary Examiner - Kevin Pyo
(74) Attorney, Agent, or Firm - Buchanan Ingersoll & Rooney PC

(57) ABSTRACT

Disclosed are a scanner system and a method for recording surface geometry and surface color of an object where both surface geometry information and surface color information for a block of said image sensor pixels are derived at least partly from one 2D image recorded by said color image sensor.

35 Claims, 4 Drawing Sheets
US 9,962,244 B2
Page 2

(51) Int. Cl.
     G01B 11/25 (2006.01)
     G01J 3/51 (2006.01)
     G01J 3/50 (2006.01)
     G01B 11/24 (2006.01)

(52) U.S. Cl.
     CPC .......... G01B 11/2509 (2013.01); G01B 11/2513 (2013.01); G01B 11/2518 (2013.01); G01J 3/0208 (2013.01); G01J 3/0224 (2013.01); G01J 3/0237 (2013.01); G01J 3/0278 (2013.01); G01J 3/50 (2013.01); G01J 3/51 (2013.01); G01J 3/513 (2013.01)

(58) Field of Classification Search
     USPC ........................ 250/226, 234; 348/47
     See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

     9,212,898 B2 *   12/2015 Banyay ................ G02B 21/0072
     9,456,254 B2 *   10/2016 Kocherscheidt ........ A61B 5/0088
     2005/0285027 A1  12/2005 Favalora et al.
     2010/0145898 A1   6/2010 Malfliet et al.
     2012/0062716 A1   3/2012 Dillon et al.
     2012/0015425 A1   3/2012 Thiel
     2012/0092461 A1   4/2012 Fisker et al.
     2012/0140243 A1   6/2012 Colonna de Lega
     2013/0230350 A1   9/2013 Wu et al.
     2014/0022356 A1   1/2014 Fisker et al.
     2014/0140142 A1   5/2014

FOREIGN PATENT DOCUMENTS

     CN    102402799 A      4/2012
     CN    102802520 A     11/2012
     JP    2009-109253 A    5/2009
     WO    2010/145669 A1  12/2010
     WO    2012/007003 A1   1/2012

OTHER PUBLICATIONS

The First Chinese Search dated Jul. 25, 2016, by the State Intellectual Property Office of People's Republic of China in corresponding Chinese Patent Application No. 201480020976.3. (2 pages).
International Search Report (PCT/ISA/210) dated Jul. 7, 2014, by the European Patent Office as the International Searching Authority for International Application No. PCT/EP2014/052842.
Office Action (Notice of Reasons for Rejection) dated Jan. 9, 2018, by the Japanese Patent Office in Japanese Patent Application No. 2015-557430, and an English Translation of the Office Action. (8 pages).

* cited by examiner
U.S. Patent          May 8, 2018          Sheet 1 of 4          US 9,962,244 B2

Fig. 1
U.S. Patent          May 8, 2018          Sheet 2 of 4          US 9,962,244 B2
U.S. Patent          May 8, 2018          Sheet 3 of 4          US 9,962,244 B2

S41: Scanner system obtained
S42: Illuminating object surface with multichromatic probe light
S43: Capturing a series of 2D images of said object
S44: Derive geometry and color information
S45: Generate a sub-scan of the object
S46: Generate a colored digital 3D representation of the object from several sub-scans

Fig. 5
U.S. Patent          May 8, 2018          Sheet 4 of 4          US 9,962,244 B2

Fig. 6A: plot of correlation measure vs. R-value (elements 661, 662, 663)
FOCUS SCANNING APPARATUS RECORDING COLOR
FIELD OF THE APPLICATION

The application relates to three dimensional (3D) scanning of the surface geometry and surface color of objects. A particular application is within dentistry, particularly for intraoral scanning.
BACKGROUND

3D scanners are widely known from the art, and so are intraoral dental 3D scanners (e.g., Sirona Cerec, Cadent Itero, 3Shape TRIOS).
The ability to record surface color is useful in many applications. For example in dentistry, the user can differentiate types of tissue or detect existing restorations. For example in materials inspection, the user can detect surface abnormalities such as crystallization defects or discoloring. None of the above is generally possible from surface geometry information alone.
WO2010145669 mentions the possibility of recording color. In particular, several sequential images, each taken for an illumination in a different color (typically blue, green, and red), are combined to form a synthetic color image. This approach hence requires means to change light source color, such as color filters. Furthermore, in handheld use, the scanner will move relative to the scanned object during the illumination sequence, reducing the quality of the synthetic color image.
Also U.S. Pat. No. 7,698,068 and U.S. Pat. No. 8,102,538 (Cadent Inc.) describe an intraoral scanner that records both geometry data and texture data with one or more image sensor(s). However, there is a slight delay between the color and the geometry recording, respectively. U.S. Pat. No. 7,698,068 requires sequential illumination in different colors to form a synthetic image, while U.S. Pat. No. 8,102,538 mentions white light as a possibility, however from a second illumination source or recorded by a second image sensor, the first set being used for recording the geometry.
WO2012083967 discloses a scanner for recording geometry data and texture data with two separate cameras. While the first camera has a relatively shallow depth of field as to provide focus scanning based on multiple images, the second camera has a relatively large depth of field as to provide color texture information from a single image.
Color-recording scanning confocal microscopes are also known from the prior art (e.g., Keyence VK9700; see also JP2004029373). A white light illumination system along with a color image sensor is used for recording 2D texture, while a laser beam forms a dot that is scanned, i.e., moved over the surface and recorded by a photomultiplier, providing the geometry data from many depth measurements, one for each position of the dot. The principle of a moving dot requires the measured object not to move relative to the microscope during measurement, and hence is not suitable for handheld use.
SUMMARY

One aspect of this application is to provide a scanner system and a method for recording surface geometry and surface color of an object, and where surface geometry and surface color are derived from the same captured 2D images.
One aspect of this application is to provide a scanner system for recording surface geometry and surface color of an object, and wherein all 2D images are captured using the same color image sensor.
One aspect of this application is to provide a scanner system and a method for recording surface geometry and surface color of an object, in which the information relating to the surface geometry and to the surface color are acquired simultaneously such that an alignment of data relating to the recorded surface geometry and data relating to the recorded surface color is not required in order to generate a digital 3D representation of the object expressing both color and geometry of the object.
Disclosed is a scanner system for recording surface geometry and surface color of an object, the scanner system comprising:
a multichromatic light source configured for providing a multichromatic probe light for illumination of the object,
a color image sensor comprising an array of image sensor pixels for capturing one or more 2D images of light received from said object, and
a data processing system configured for deriving both surface geometry information and surface color information for a block of said image sensor pixels at least partly from one 2D image recorded by said color image sensor.
Disclosed is a method of recording surface geometry and surface color of an object, the method comprising:
obtaining a scanner system comprising a multichromatic light source and a color image sensor comprising an array of image sensor pixels;
illuminating the surface of said object with multichromatic probe light from said multichromatic light source;
capturing a series of 2D images of said object using said color image sensor; and
deriving both surface geometry information and surface color information for a block of said image sensor pixels at least partly from one captured 2D image.
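The steps above can be sketched in code. This is a minimal illustration only, not the disclosed apparatus: the array shapes, the block size, and the use of intensity variance as a simple stand-in for the correlation-based focus measure discussed below are all assumptions.

```python
import numpy as np

def record_surface(stack, block_size=4):
    """Derive per-block geometry (index of the best-focus 2D image) and
    color from one captured stack of 2D color images.
    `stack` has shape (n_images, height, width, 3)."""
    n, h, w, _ = stack.shape
    bh, bw = h // block_size, w // block_size
    depth = np.zeros((bh, bw))           # geometry: best-focus image index per block
    color = np.zeros((bh, bw, 3))        # color read from that same image
    for by in range(bh):
        for bx in range(bw):
            block = stack[:, by * block_size:(by + 1) * block_size,
                             bx * block_size:(bx + 1) * block_size, :]
            # intensity variance as a crude focus measure per image
            focus = block.mean(axis=3).reshape(n, -1).var(axis=1)
            best = int(np.argmax(focus))
            depth[by, bx] = best
            # color taken from the very image that determined the geometry
            color[by, bx] = block[best].reshape(-1, 3).mean(axis=0)
    return depth, color
```

Because the color is read from the same 2D image that yields the geometry, the two are aligned by construction, which is the central point of the method.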
In the context of the present application, the phrase "surface color" may refer to the apparent color of an object surface and thus in some cases, such as for semi-transparent or semi-translucent objects such as teeth, be caused by light from the object surface and/or the material below the object surface, such as material immediately below the object surface.
In the context of the present application, the phrase "derived at least partly from one 2D image" refers to the situation where the surface geometry information for a given block of image sensor pixels at least in part is derived from one 2D image and where the corresponding surface color information at least in part is derived from the same 2D image. The phrase also covers cases where the surface geometry information for a given block of image sensor pixels at least in part is derived from a plurality of 2D images of a series of captured 2D images and where the corresponding surface color information at least in part is derived from the same 2D images of that series of captured 2D images.
An advantage of deriving both surface geometry information and surface color information for a block of said image sensor pixels at least partly from one 2D image is that a scanner system having only one image sensor can be realized.
It is an advantage that the surface geometry information and the surface color information are derived at least partly from one 2D image, since this inherently provides that the
two types of information are acquired simultaneously. There is hence no requirement for an exact timing of the operation of two color image sensors, which may be the case when one image sensor is used for the geometry recording and another for color recording. Equally there is no need for an elaborate calculation accounting for significant differences in the timing of capturing of 2D images from which the surface geometry information is derived and the timing of the capturing of 2D images from which the surface color information is derived.
The present application discloses a significant improvement over the state of the art in that only a single image sensor and a single multichromatic light source is required, and that surface color and surface geometry for at least a part of the object can be derived from the same 2D image or 2D images, which also means that alignment of color and surface geometry is inherently perfect. In the scanner system according to the present application, there is no need for taking into account or compensating for relative motion of the object and scanner system between obtaining surface geometry and surface color. Since the surface geometry and the surface color are obtained at precisely the same time, the scanner system automatically maintains its spatial disposition with respect to the object surface while obtaining the surface geometry and the surface color. This makes the scanner system of the present application suitable for handheld use, for example as an intraoral scanner, or for scanning moving objects.
In some embodiments, the data processing system is configured for deriving surface geometry information and surface color information for said block of image sensor pixels from a series of 2D images, such as from a plurality of the 2D images in a series of captured 2D images. I.e., the data processing system is capable of analyzing a plurality of the 2D images in a series of captured 2D images in order to derive the surface geometry information for a block of image sensor pixels and to also derive surface color information from at least one of the 2D images from which the surface geometry information is derived.
In some embodiments, the data processing system is configured for deriving surface color information from a plurality of 2D images of a series of captured 2D images and for deriving surface geometry information from at least one of the 2D images from which the surface color information is derived.
In some embodiments, the data processing system is configured for deriving surface geometry information from a plurality of 2D images of a series of captured 2D images and for deriving surface color information from at least one of the 2D images from which the surface geometry information is derived.
In some embodiments, the set of 2D images from which surface color information is derived is identical to the set of 2D images from which surface geometry information is derived.
In some embodiments, the data processing system is configured for generating a sub-scan of a part of the object surface based on surface geometry information and surface color information derived from a plurality of blocks of image sensor pixels. The sub-scan expresses at least the geometry of the part of the object and typically one sub-scan is derived from one stack of captured 2D images.
In some embodiments, all 2D images of a captured series of images are analyzed to derive the surface geometry information for each block of image sensor pixels on the color image sensor.
For a given block of image sensor pixels the corresponding portions of the captured 2D images in the stack may be analyzed to derive the surface geometry information and surface color information for that block.
In some embodiments, the surface geometry information relates to where the object surface is located relative to the scanner system coordinate system for that particular block of image sensor pixels.
One advantage of the scanner system and the method of the current application is that the information used for generating the sub-scan expressing both geometry and color of the object (as seen from one view) is obtained concurrently.
Sub-scans can be generated for a number of different views of the object such that they together cover the part of the surface.
In some embodiments, the data processing system is configured for combining a number of sub-scans to generate a digital 3D representation of the object. The digital 3D representation of the object then preferably expresses both the recorded geometry and color of the object.
The digital 3D representation of the object can be in the form of a data file. When the object is a patient's set of teeth the digital 3D representation of this set of teeth can e.g. be used for CAD/CAM manufacture of a physical model of the patient's set of teeth.
The surface geometry and the surface color are both determined from light recorded by the color image sensor.
In some embodiments, the light received from the object originates from the multichromatic light source, i.e. it is probe light reflected or scattered from the surface of the object.
In some embodiments, the light received from the object comprises fluorescence excited by the probe light from the multichromatic light source, i.e. fluorescence emitted by fluorescent materials in the object surface.
In some embodiments, a second light source is used for the excitation of fluorescence while the multichromatic light source provides the light for obtaining the geometry and color of the object.
The scanner system preferably comprises an optical system configured for guiding light emitted by the multichromatic light source towards the object to be scanned and for guiding light received from the object to the color image sensor such that the 2D images of said object can be captured by said color image sensor.
In some embodiments, the scanner system comprises a first optical system, such as an arrangement of lenses, for transmitting the probe light from the multichromatic light source towards an object and a second optical system for imaging light received from the object at the color image sensor.
In some embodiments, a single optical system images the probe light onto the object and images the object, or at least a part of the object, onto the color image sensor, preferably along the same optical axis, however in opposite directions along the optical axis. The scanner may comprise at least one beam splitter located in the optical path, where the beam splitter is arranged such that it directs the probe light from the multichromatic light source towards the object while it directs light received from the object towards the color image sensor.
Several scanning principles are suitable, such as triangulation and focus scanning.
In some embodiments, the scanner system is a focus scanner system operating by translating a focus plane along an optical axis of the scanner system and capturing the 2D
images at different focus plane positions such that each series of captured 2D images forms a stack of 2D images. The focus plane position is preferably shifted along an optical axis of the scanner system, such that 2D images captured at a number of focus plane positions along the optical axis form said stack of 2D images for a given view of the object, i.e. for a given arrangement of the scanner system relative to the object. After changing the arrangement of the scanner system relative to the object a new stack of 2D images for that view can be captured. The focus plane position may be varied by means of at least one focus element, e.g., a moving focus lens.
In some focus scanner embodiments, the scanner system comprises a pattern generating element configured for incorporating a spatial pattern in said probe light.
In some embodiments, the pattern generating element is configured to provide that the probe light projected by the scanner system onto the object comprises a pattern consisting of dark sections and sections with light having a wavelength distribution according to the wavelength distribution of the multichromatic light source.
In some embodiments, the multichromatic light source comprises a broadband light source, such as a white light source.
In some embodiments, the pixels of the color image sensor and the pattern generating element are configured to provide that each pixel corresponds to a single bright or dark region of the spatial pattern incorporated in said probe light.
For a focus scanner system the surface geometry information for a given block of image sensor pixels is derived by identifying at which distance from the scanner system the object surface is in focus for that block of image sensor pixels.
In some embodiments, deriving the surface geometry information and surface color information comprises calculating for several 2D images, such as for several 2D images in a captured stack of 2D images, a correlation measure between the portion of the 2D image captured by said block of image sensor pixels and a weight function. Here the weight function is preferably determined based on information of the configuration of the spatial pattern. The correlation measure may be calculated for each 2D image of the stack.
The scanner system may comprise means for evaluating a correlation measure at each focus plane position between at least one image pixel and a weight function, where the weight function is determined based on information of the configuration of the spatial pattern.
In some embodiments, deriving the surface geometry information and the surface color information for a block of image sensor pixels comprises identifying the position along the optical axis at which the corresponding correlation measure has a maximum value. The position along the optical axis at which the corresponding correlation measure has a maximum value may coincide with the position where a 2D image has been captured but it may even more likely be in between two neighboring 2D images of the stack of 2D images.
Determining the surface geometry information may then relate to calculating a correlation measure of the spatially structured light signal provided by the pattern with the variation of the pattern itself (which we term reference) for every location of the focus plane and finding the location of an extremum of this stack of 2D images. In some embodiments, the pattern is static. Such a static pattern can for example be realized as a chrome-on-glass pattern.
One way to define the correlation measure mathematically with a discrete set of measurements is as a dot product computed from a signal vector, I = (I_1, . . . , I_n), with n > 1 elements representing sensor signals, and a reference vector, f = (f_1, . . . , f_n), of reference weights. The correlation measure A is then given by

    A = f · I = Σ_{i=1}^{n} f_i I_i
The indices on the elements in the signal vector represent sensor signals that are recorded at different pixels, typically in a block of pixels. The reference vector f can be obtained in a calibration step.
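As a minimal sketch, the dot product above can be computed per pixel block as follows; the function name and the input handling are illustrative, not part of the disclosure.

```python
import numpy as np

def correlation_measure(signal, reference):
    """Correlation measure A = sum_i f_i * I_i for one block of pixels.
    `signal` holds the n sensor readings I_1..I_n; `reference` holds the
    calibrated weights f_1..f_n."""
    signal = np.asarray(signal, dtype=float).ravel()
    reference = np.asarray(reference, dtype=float).ravel()
    if signal.shape != reference.shape:
        raise ValueError("signal and reference must have equal length")
    return float(reference @ signal)
```

With zero-mean reference weights, a flat (out-of-focus) block yields A near zero, while an in-focus block whose signal follows the pattern yields a large |A|, which is what makes A usable as a focus measure.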
By using knowledge of the optical system used in the scanner, it is possible to transform the location of an extremum of the correlation measure, i.e., the focus plane, into depth data information, on a pixel block basis. All pixel blocks combined thus provide an array of depth data. In other words, depth is along an optical path that is known from the optical design and/or found from calibration, and each block of pixels on the image sensor represents the end point of an optical path. Therefore, depth along an optical path, for a bundle of paths, yields a surface geometry within the field of view of the scanner, i.e. a sub-scan for the present view.
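Under the simplifying assumption of parallel optical paths on a regular lateral grid (the real mapping comes from the optical design and calibration, as stated above), turning the per-block depth array into 3D points might look like:

```python
import numpy as np

def blocks_to_points(depth, pitch):
    """Map each pixel block's depth to a 3D point in the scanner frame.
    `depth` is the per-block depth array; `pitch` is the lateral spacing
    between neighboring blocks. Parallel optical paths are assumed."""
    bh, bw = depth.shape
    ys, xs = np.mgrid[0:bh, 0:bw].astype(float)
    # one (x, y, z) point per block, flattened in row-major block order
    return np.stack([xs * pitch, ys * pitch, depth], axis=-1).reshape(-1, 3)
```

The bundle of points returned here corresponds to the sub-scan for one view; a calibrated scanner would replace the parallel-path assumption with the actual per-block optical path.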
It can be advantageous to smooth and interpolate the series of correlation measure values, such as to obtain a more robust and accurate determination of the location of the maximum.
In some embodiments, the generating a sub-scan comprises determining a correlation measure function describing the variation of the correlation measure along the optical axis for each block of image sensor pixels and identifying the position along the optical axis at which the correlation measure function has its maximum value for the block.
In some embodiments, the maximum correlation measure value is the highest calculated correlation measure value for the block of image sensor pixels and/or the highest maximum value of the correlation measure function for the block of image sensor pixels.
For example, a polynomial can be fitted to the values of A for a pixel block over several images on both sides of the recorded maximum, and a location of a deducted maximum can be found from the maximum of the fitted polynomial, which can be in between two images. The deducted maximum is subsequently used as depth data information when deriving the surface geometry from the present view, i.e. when deriving a sub-scan for the view.
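The polynomial fit described above can be sketched as follows, assuming a second-order polynomial through the recorded maximum and its two neighbors (names and the edge handling are illustrative):

```python
import numpy as np

def deducted_maximum(z, A):
    """Interpolate the location of the correlation maximum.
    `z` holds focus plane positions, `A` the correlation measure values;
    a parabola is fitted around the recorded maximum and its vertex is
    returned, which may lie between two captured images."""
    z = np.asarray(z, dtype=float)
    A = np.asarray(A, dtype=float)
    k = int(np.argmax(A))
    if k == 0 or k == len(A) - 1:
        return float(z[k])              # maximum at the stack edge: no fit possible
    a, b, _ = np.polyfit(z[k - 1:k + 2], A[k - 1:k + 2], 2)
    return float(-b / (2.0 * a))        # vertex of the fitted parabola
```

Fitting over the three points nearest the recorded maximum is the smallest window that determines a parabola exactly; a wider window trades bias for noise robustness.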
In some embodiments, the data processing system is configured for determining a color for a point on a generated sub-scan based on the surface color information of the 2D image of the series in which the correlation measure has its maximum value for the corresponding block of image sensor pixels. The color may e.g. be read as the RGB values for pixels in said block of image sensor pixels.
In some embodiments, the data processing system is configured for deriving the color for a point on a generated sub-scan based on the surface color information of the 2D images in the series in which the correlation measure has its maximum value for the corresponding block of image sensor pixels and on at least one additional 2D image, such as a neighboring 2D image from the series of captured 2D images. The surface color information is still derived from
at least one of the 2D images from which the surface geometry information is derived.
In some embodiments, the data processing system is configured for interpolating surface color information of at least two 2D images in a series when determining the sub-scan color, such as an interpolation of surface color information of neighboring 2D images in a series.
In some embodiments, the data processing system is configured for computing a smoothed color for a number of points of the sub-scan, where the computing comprises an averaging of sub-scan colors of different points, such as a weighted averaging of the colors of the surrounding points on the sub-scan.
Surface color information for a block of image sensor pixels is at least partially derived from the same image from which surface geometry information is derived. In case the location of the maximum of A is represented by a 2D image, then also color is derived from that same image. In case the location of the maximum of A is found by interpolation to be between two images, then at least one of those two images should be used to derive color, or both images using interpolation for color also. It is also possible to average color data from more than two images used in the determination of the location of the maximum of the correlation measure, or to average color from a subset or superset of multiple images used to derive surface geometry. In any case, some image sensor pixel readings are used to derive both surface color and surface geometry for at least a part of the scanned object.
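A linear blend between the colors read from the two bracketing images is one way of realizing the interpolation mentioned above; the function and its argument names are illustrative assumptions.

```python
def interpolated_color(color_a, color_b, z_a, z_b, z_max):
    """Blend the RGB colors read at focus positions z_a and z_b linearly,
    weighted by where the interpolated correlation maximum z_max falls
    between them (z_max == z_a gives color_a, z_max == z_b gives color_b)."""
    t = (z_max - z_a) / (z_b - z_a)
    return tuple((1.0 - t) * ca + t * cb for ca, cb in zip(color_a, color_b))
```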
Typically, there are three color filters, so the overall color is composed of three contributions, such as red, green, and blue, or cyan, magenta, and yellow. Note that color filters typically allow a range of wavelengths to pass, and there is typically cross-talk between filters, such that, for example, some green light will contribute to the intensity measured in pixels with red filters.
For an image sensor with a color filter array, a color component c_j within a pixel block can be obtained as

    c_j = Σ_i g_{i,j} I_i,

where g_{i,j} = 1 if pixel i has a filter for color c_j, and 0 otherwise. For an RGB filter array like in a Bayer pattern, j is one of red,
green, or blue. Further weighting of the individual color components, i.e., color calibration, may be required to obtain natural color data, typically as compensation for varying filter efficiency, illumination source efficiency, and different fraction of color components in the filter pattern. The calibration may also depend on focus plane location and/or position within the field of view, as the mixing of the light source component colors may vary with those factors.
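The per-block sum c_j = Σ_i g_{i,j} I_i can be sketched for a Bayer-masked block; the mask layout and the naive per-color normalization (standing in for the calibration step just described) are assumptions of this illustration.

```python
import numpy as np

def block_color(block, filter_mask):
    """Color components c_j = sum_i g_ij * I_i over one pixel block.
    `filter_mask` records which filter covers each pixel (0=R, 1=G, 2=B);
    each component is divided by its pixel count as a crude compensation
    for the different fraction of each color in the filter pattern."""
    block = np.asarray(block, dtype=float)
    c = np.zeros(3)
    for j in range(3):
        selected = filter_mask == j        # boolean mask realizing g_ij
        c[j] = block[selected].sum() / max(int(selected.sum()), 1)
    return c
```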
In some embodiments, surface color information is obtained for every pixel in a pixel block. In color image sensors with a color filter array, or with other means to separate colors such as diffractive means, an intensity value is obtained for the color measured with a particular pixel. In other words, in this case a particular pixel has a color value for only one color. Recently developed color image sensors allow measurement of several colors in the same pixel, at different depths in the substrate, so in that case a particular pixel can yield intensity values for several colors. In summary, it is possible to obtain a resolution of the surface color data that is inherently higher than that of the surface geometry information.
In the embodiments where the resolution of the derived color is higher than the resolution of the surface geometry for the generated digital 3D representation of the object, a pattern will be visible when at least approximately in focus, which preferably is the case when color is derived. The image can be filtered so as to visually remove the pattern, however at a loss of resolution. In fact, it can be advantageous for the user to be able to see the pattern. For example, in intraoral scanning it may be important to detect the position of a margin line, the rim or edge of a preparation. The image of the pattern overlaid on the geometry of this edge is sharper on a side that is seen approximately perpendicular, and more blurred on the side that is seen at an acute angle. Thus, a user, who in this example typically is a dentist or dental technician, can use the difference in sharpness to locate the position of the margin line more precisely than may be possible from examining the surface geometry alone.
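The sharpness difference described here could be quantified with a standard focus measure such as the variance of a discrete Laplacian. This metric is an assumption chosen for illustration; the patent does not specify how sharpness would be measured:

```python
import numpy as np

def local_sharpness(patch):
    """Variance of a discrete Laplacian, a common focus/sharpness measure.

    An illustrative stand-in: high values mean the overlaid pattern is
    crisp in this patch, low values mean it is blurred.
    """
    lap = (-4.0 * patch[1:-1, 1:-1]
           + patch[:-2, 1:-1] + patch[2:, 1:-1]
           + patch[1:-1, :-2] + patch[1:-1, 2:])
    return lap.var()

# A patch seen near-perpendicular keeps the pattern crisp (high score),
# while an obliquely viewed patch blurs it toward uniform gray (low score).
crisp = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)  # checkerboard
blurred = np.full((8, 8), 0.5)                              # washed out
print(local_sharpness(crisp) > local_sharpness(blurred))    # True
```

Comparing such scores on patches either side of a candidate edge would reproduce the perpendicular-vs-oblique contrast the text exploits.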
High spatial contrast of the in-focus pattern image on the object is desirable to obtain a good signal-to-noise ratio of the correlation measure on the color image sensor. Improved spatial contrast can be achieved by preferential imaging of the specular surface reflection from the object on the color image sensor. Thus, some embodiments com
