`
`—
`
`WIPO
`WORLD
INTELLECTUAL PROPERTY
`ORGANIZATION
`
`DOCUMENT MADE AVAILABLE UNDER THE
`PATENT COOPERATION TREATY (PCT)
`International application number:
`PCT/EP2014/052842
`
`International filing date:
`
`13 February 2014 (13.02.2014)
`
Document type: Certified copy of priority document

Document details:
Country/Office: US
Number: 61/764,178
Filing date: 13 February 2013 (13.02.2013)
`
`Date of receipt at the International Bureau:
`
`11 April 2014 (11.04.2014)
`
`Remark: Priority document submitted or transmitted to the International Bureau in compliance with Rule
17.1(a), (b) or (b-bis)
`
`34, chemin des Colombettes
1211 Geneva 29, Switzerland
`www.wipo.int
`Align Ex. 1029
`U.S. Patent No. 9,962,244
`
`
`
`
`
`
`
THE COUNTRY CODE AND NUMBER OF YOUR PRIORITY
APPLICATION, TO BE USED FOR FILING ABROAD UNDER THE PARIS
CONVENTION, IS US61/764,178
`
TO ALL TO WHOM THESE PRESENTS SHALL COME:
`
UNITED STATES DEPARTMENT OF COMMERCE
`
`United States Patent and Trademark Office
`
`February 14, 2014
`
THIS IS TO CERTIFY THAT ANNEXED HERETO IS A TRUE COPY FROM
`
`THE RECORDS OF THE UNITED STATES PATENT AND TRADEMARK
`
`OFFICE OF THOSE PAPERS OF THE BELOW IDENTIFIED PATENT
`
APPLICATION THAT MET THE REQUIREMENTS TO BE GRANTED A
`FILING DATE UNDER 35 USC 111.
`
`APPLICATION NUMBER: 61/764,178
`FILING DATE: February 13, 2013
`
`By Authority of the
Under Secretary of Commerce for Intellectual Property
`and Director of the United States Patent and Trademark Office
`
`
`Certifying Officer
`
`-
`an
`tg et na
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`14948175
`
`
`
Application Number: 61764178

International Application Number:

Confirmation Number: 9122

Title of Invention: FOCUS SCANNING APPARATUS RECORDING COLOR
`
Electronic Acknowledgement Receipt
`
`
`
William C. Rowland/Robin Copeland
`
Payment information:
`
Submitted with Payment: yes
Payment Type: Credit Card
Payment was successfully received in RAM: $125
RAM confirmation Number: 593
Pages:
Deposit Account:
Authorized User:
`
`File Listing:
`
Document Description | File Name | File Size (Bytes)
Copy provided by USPTO from the IFW Image Database on 02/10/2014
`
Drawings-only black and white line drawings | Drawings.pdf | 165101
Specification | Provisional_Application.pdf | 1331498
Transmittal of New Application | Provisional_Transmittal_Letter.pdf
Application Data Sheet | Application_Data_Sheet.pdf | 281970
`
New International Application Filed with the USPTO as a Receiving Office
If a new international application is being filed and the international application includes the necessary components for
an international filing date (see PCT Article 11 and MPEP 1810), a Notification of the International Application Number
and of the International Filing Date (Form PCT/RO/105) will be issued in due course, subject to prescriptions concerning
national security, and the date shown on this Acknowledgement Receipt will establish the international filing date of
the application.
`Information:
`
This is not a USPTO supplied ADS fillable form
`
Fee Worksheet (SB06) | fee-info.pdf
`
`Information:
`
This Acknowledgement Receipt evidences receipt on the noted date by the USPTO of the indicated documents,
characterized by the applicant, and including page counts, where applicable. It serves as evidence of receipt similar to a
Post Card, as described in MPEP 503.
`
New Applications Under 35 U.S.C. 111
If a new application is being filed and the application includes the necessary components for a filing date (see 37 CFR
1.53(b)-(d) and MPEP 506), a Filing Receipt (37 CFR 1.54) will be issued in due course and the date shown on this
Acknowledgement Receipt will establish the filing date of the application.
`
National Stage of an International Application under 35 U.S.C. 371
If a timely submission to enter the national stage of an international application is compliant with the conditions of 35
U.S.C. 371 and other applicable requirements, a Form PCT/DO/EO/903 indicating acceptance of the application as a
national stage submission under 35 U.S.C. 371 will be issued in addition to the Filing Receipt, in due course.
`
`
`
`
`Substitute for Form PTO/SB/16
`
`Page 1 of 1
`
`PROVISIONAL APPLICATION FOR PATENT COVER SHEET
This is a request for filing a PROVISIONAL APPLICATION FOR PATENT under 37 C.F.R. § 1.53(c).
`
`Attorney Docket Number
`
`0079124-000065
`
`Inventor(s)
`
`
Given/Family Name, City, Country:
ESBECH, Copenhagen S, Denmark
Christian Romer ROSBERG, Copenhagen V, Denmark
Mike VAN DER POEL, Copenhagen V, Denmark
Rasmus, Copenhagen, Denmark
Michael VINTHER, Copenhagen S, Denmark
Josef HOLLENBECK, Copenhagen Ø
`
`Title of Invention
`
`
`
`FOCUS SCANNING APPARATUS RECORDING COLOR
`
`Correspondence Address
The address corresponding to Customer Number 21839
`
`|
`
This invention was made by an agency of the United States Government or under a contract with an agency of the
United States Government.
No.
[ ] Yes, the name of the U.S. Government agency and the Government contract number are:
`
Enclosed Application Parts
[X] Specification/Claims/Abstract: 27 pages
[X] Drawings: 3 sheets
[ ] CD(s), Number:
[ ] Other (specify):
Total Pages in Spec/Drawings: 30
Filing Fee (2005): $125.00
`
`
Application Data Sheet. See 37 CFR 1.76

Method of Payment of Filing Fees
FILING FEE AMOUNT
[X] Applicant claims small entity status. See 37 CFR 1.27
The undersigned hereby grants the USPTO authority to provide the European Patent Office (EPO), the Japan
Patent Office (JPO), the Korean Intellectual Property Office (KIPO), the World Intellectual Property Office
(WIPO), and any other intellectual property offices in which a foreign application claiming priority to the above-
identified patent application is filed access to the above-identified patent application. See 37 CFR 1.14(c) and (h).
Total Page Fee (101+ pages) (2085) $155: $0.00
Total App. Filing Fee: $125.00
The Director is hereby authorized to charge any deficiency in filing fees or credit any overpayment to
Deposit Account 02-4800.
Payment by credit card. Payment will be made electronically at time of filing.
Charge filing fee to Deposit Account 02-4800.
`
SIGNATURE [signed]    DATE February 13, 2013
`
`
`
`
TYPED or PRINTED NAME: William C. Rowland
703.836.6620
Regis. No. 30888
`
`SEND TO: Commissioner for Patents, P.O. Box 1450, Alexandria, VA 22313-1450
`
`
`
`
Focus scanning apparatus recording color
`
`Field of the invention
`
The invention relates to three dimensional (3D) scanning of the surface geometry
and surface color of objects. A particular application is within dentistry, particularly
for intraoral scanning.
`
Background of the invention
`
`10
`
3D scanners are widely known from the art, and so are intraoral dental 3D
scanners (e.g., Sirona Cerec, Cadent Itero, 3Shape TRIOS).

The ability to record surface color is useful in many applications. For example in
dentistry, the user can differentiate types of tissue or detect existing restorations.
For example in materials inspection, the user can detect surface abnormalities
such as crystallization defects or discoloring. None of the above is generally
possible from 3D surface information alone.
`
`20
`
`W02010145669 mentions the possibility of recording color. In particular, several
`
`sequential images, each takenfor anillumination in a different color - typically |
`blue, green, and red - are combined to form a synthetic color image. This
`
`approach hence requires means to changelight source color, such as colorfilters.
`
`Furthermore, in handheld use, the scannerwill moverelative to the scanned object
`
`25
`
`during the illumination sequence, reducing the quality of the synthetic color image.
`
Also US7698068 and US8102538 (Cadent Inc.) describe an intraoral scanner that
records both 3D geometry data and 3D texture data with one or more image
sensor(s). However, there is a slight delay between the color and the 3D geometry
recording, respectively. US7698068 requires sequential illumination in different
colors to form a synthetic image, while US8102538 mentions white light as a
possibility, however from a second illumination source or recorded by a second
image sensor, the first set being used for recording the 3D geometry.
`
`1
`
`Copvprovided bv LISPTO fram tha IFW Imoana RNatahana an AniaaiAna A
`
`‘
`
`ok
`
`oem
`
`|
`
`
`
WO2012083967 discloses a scanner for recording 3D geometry data and 3D
texture data with two separate cameras. While the first camera has a relatively
shallow depth of field as to provide focus scanning based on multiple images, the
second camera has a relatively large depth of field as to provide color texture
information from a single image.
`
Color-recording scanning confocal microscopes are also known from the prior art
(e.g., Keyence VK9700; see also JP2004029373). A white light illumination system
along with a color image sensor is used for recording 2D texture, while a laser
beam forms a dot that is scanned, i.e., moved over the surface and recorded by a
photomultiplier, providing the 3D geometry data from many depth measurements,
one for each position of the dot. The principle of a moving dot requires the
measured object not to move relative to the microscope during measurement, and
hence is not suitable for handheld use.
`
`Summary of the invention
`
It is an object of the present invention to provide a scanner for obtaining the 3D
surface geometry and surface color of the surface of an object, which does not
require that some 2D images are recorded for determining the 3D surface
geometry while other images are recorded for determining the surface color.
`
`10
`
`15
`
`20
`
It is an object of the present invention to provide a scanner for obtaining the 3D
surface geometry and surface color of the surface of an object, which obtains
surface color and the 3D surface geometry simultaneously such that an alignment
of data relating to 3D surface geometry and data relating to surface color is not
required.
`
Disclosed is a scanner for obtaining 3D surface geometry and surface color of an
object, the scanner comprising:

- a multichromatic light source configured for providing a probe light, and
- a color image sensor comprising an array of image sensor pixels for
recording one or more 2D images of light received from said object,

where at least for a block of said image sensor pixels, both surface color and 3D
surface geometry of a part of the object are derived at least partly from one 2D
image recorded by said color image sensor.
`
Disclosed is a scanner for obtaining 3D surface geometry and surface color of an
object, the scanner comprising:

- a multichromatic light source configured for providing a probe light,
- a color image sensor comprising an array of image sensor pixels, and
- an optical system configured for guiding light received from the object to
the color image sensor such that 2D images of said object can be
recorded by said color image sensor;

wherein the scanner is configured for acquiring a number of said 2D images of a
part of the object and for deriving both surface color and 3D surface geometry of
the part of the object from at least one of said recorded 2D images at least for a
block of said image sensor pixels, such that the surface color and 3D surface
geometry are obtained concurrently by the scanner.
`
`20
`
Disclosed is a scanner for obtaining 3D surface geometry and surface color of an
object, the scanner comprising:

- a multichromatic light source configured for providing a probe light;
- a color image sensor comprising an array of image sensor pixels, where
the image sensor is arranged to record 2D images of light received from
the object; and
- an image processor configured for deriving both surface color and 3D
surface geometry of at least a part of the object from at least one of said
2D images recorded by the color image sensor.
`
`5
`
Disclosed is a scanner system for obtaining 3D surface geometry and surface
color of an object, said scanner system comprising:

- a scanner according to any of the embodiments, where the scanner is
configured for deriving surface color and 3D surface geometry of the
object, and optionally for obtaining a partial or full 3D surface geometry
of the part of the object; and
- a data processing unit configured for post-processing 3D surface
geometry and/or surface color readings from the color image sensor, or
for post-processing the obtained partial or full 3D surface geometry.

In some embodiments, the data processing unit comprises a computer readable
medium on which is stored computer implemented algorithms for performing said
post-processing.

In some embodiments, the data processing unit is integrated in a cart or a
personal computer.
`
`20
`
Disclosed is a method of obtaining 3D surface geometry and surface color of an
object, the method comprising:

- providing a scanner or scanner system according to any of the
embodiments;
- illuminating the surface of said object with probe light from said
multichromatic light source;
- recording one or more 2D images of said object using said color image
sensor; and
- deriving both surface color and 3D surface geometry of a part of the
object from at least some of said recorded 2D images at least for a block
of said image sensor pixels, such that the surface color and 3D surface
geometry are obtained concurrently by the scanner.
`
The present invention is a significant improvement over the state of the art in that
only a single image sensor and a single multichromatic light source is required,
and that surface color and 3D surface geometry for at least a part of the object can
be derived from the same image or images, which also means that alignment of
color and 3D surface geometry is inherently perfect. In the scanner according to
the present invention, there is no need for taking into account or compensating for
relative motion of the object and scanner between obtaining 3D surface geometry
and surface color. Since the 3D surface geometry and the surface color are
obtained at precisely the same time, the scanner automatically maintains its
spatial disposition with respect to the object surface while obtaining the 3D surface
geometry and the surface color. This makes the scanner of the present invention
suitable for handheld use, for example as an intraoral scanner, or for scanning
moving objects.
`
In the context of the present invention, the phrase “surface color” may refer to the
apparent color of an object surface and thus in some cases, such as for semi-
transparent or semi-translucent objects such as teeth, be caused by light from the
object surface and/or the material below the object surface, such as material
immediately below the object surface.
`
In some embodiments, the 3D surface geometry and the surface color are both
determined from light recorded by the color image sensor.

In some embodiments, the light received from the object originates from the
multichromatic light source, i.e. it is probe light reflected or scattered from the
surface of the object.
`
`
`
`
In some embodiments, the light received from the object is fluorescence excited by
the probe light from the multichromatic light source, i.e. fluorescence emitted by
fluorescent materials in the object surface.

In some embodiments, a second light source is used for the excitation of
fluorescence while the multichromatic light source provides the light for obtaining
the geometry and color of the object.
`
`-
`
In some embodiments, the scanner comprises a first optical system, such as an
arrangement of lenses, for transmitting the probe light from the multichromatic light
source towards an object and a second optical system for imaging light received
from the object at the color image sensor.
`
In some embodiments, only one optical system images the probe light onto the
object and images the object, or at least a part of the object, onto the color image
sensor, preferably along the same optical axis, however along opposite optical
paths. The scanner may comprise at least one beam splitter located in the optical
path, where the beam splitter is arranged such that it directs the probe light from
the multichromatic light source towards the object while it directs light received
from the object towards the color image sensor.
`
In some embodiments, the surface color and 3D surface geometry of the part of
the object are derived from a plurality of recorded 2D images. In that case, both
surface color and 3D surface geometry of the part of the object can be derived
from a number of the plurality of recorded 2D images.

Several scanning principles are suitable for this invention, such as triangulation
and focus scanning.
`
In some embodiments, the scanner is a focus scanner configured for obtaining a
stack of 2D images of the object from a number of different focus plane positions.

In some focus scanning embodiments, the focus plane is adjusted in such a way
that the image of e.g. a spatial pattern projected by the light source on the probed
object is shifted along the optical axis while recording 2D images at a number of
focus plane positions such that said stack of recorded 2D images can be obtained
for a given position of the scanner relative to the object. The focus plane position
may be varied by means of at least one focus element, e.g., a moving focus lens.
`In some focus scanner embodiments, the scanner comprises meansfor
`
`incorporating a spatial pattern in said probe light and meansfor evaluating a
`correlation measure at each focus plane position between at least one imagepixel
`
`and a weight function, where the weight function is determined based on
`information of the configuration ofthe spatial pattern. Determining in-focus
`information maythenrelate to calculating a correlation measureof the spatially
`
`10
`
`structured light signal provided by the pattern with the variation of the pattern itself
`(which we term reference) for every location of the focus plane and finding the
`location of an extremum ofthis series. In some embodiments, the pattern is static.
`
`Suchastatic pattern can for example be realized as a chrome-on-glasspattern.
`
`15
`
`20
`
`25
`
One way to define the correlation measure mathematically with a discrete set of
measurements is as a dot product computed from a signal vector, I = (I_1, ..., I_n),
with n > 1 elements representing sensor signals and a reference vector, f = (f_1, ...,
f_n), of reference weights. The correlation measure A is then given by

A = f · I = Σ_{i=1}^{n} f_i I_i

The indices on the elements in the signal vector represent sensor signals that are
recorded at different pixels, typically in a block of pixels. The reference vector f
can be obtained in a calibration step.
`
By using knowledge of the optical system used in the scanner, it is possible to
transform the location of an extremum of the correlation measure, i.e., the focus
plane, into depth data information, on a pixel block basis. All pixel blocks combined
thus provide an array of depth data. In other words, depth is along an optical path
that is known from the optical design and/or found from calibration, and each block
of pixels on the image sensor represents the end point of an optical path.
Therefore, depth along an optical path, for a bundle of paths, yields a 3D surface
geometry within the field of view of the scanner.
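The correlation measure A = f · I and the search for its extremum over the focus stack can be sketched as follows; the block size, reference weights, and toy focus stack are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def correlation_measure(signal, reference):
    """A = f . I: dot product of the pixel-block signal vector with the
    reference weight vector (obtained in a calibration step)."""
    return float(np.dot(reference, signal))

def best_focus_index(block_stack, reference):
    """Evaluate A at every focus plane position for one pixel block and
    return the index of the extremum (here a maximum) plus the series."""
    A = np.array([correlation_measure(img, reference) for img in block_stack])
    return int(np.argmax(A)), A

# Toy data: a 9-pixel block recorded at 5 focus plane positions; the
# pattern signal is strongest at the third position (in focus).
rng = np.random.default_rng(0)
reference = np.array([1., -1., 1., -1., 1., -1., 1., -1., 1.])
stack = [reference * s + rng.normal(0, 0.1, 9) for s in (0.2, 0.6, 1.0, 0.5, 0.1)]
idx, A = best_focus_index(stack, reference)
```

Repeating this per pixel block, and mapping each extremum location to depth along that block's optical path, yields the array of depth data described above.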
`
`1
`
`:
`
`soot
`
`.
`
`¥
`
`1
`
`t
`
`.
`
`het
`
`ok
`
`n
`
`Copy provided by USPTOfrom the IFW Imaae Natahace an no/ininnta
`
`
`
It can be advantageous to smooth and interpolate the series of correlation
measure values, such as to obtain a more robust and accurate determination of
the location of the maximum. For example, a polynomial can be fitted to the values
of A for a pixel block over several images on both sides of the recorded maximum,
and a location of a deduced maximum can be found from the maximum of the
fitted polynomial, which can be in between two images.
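A minimal sketch of this sub-frame interpolation, fitting a parabola around the recorded maximum; the window size and sample series are assumptions for illustration:

```python
import numpy as np

def subframe_maximum(A, window=2):
    """Fit a parabola to the correlation measures around the recorded
    maximum and return the (possibly fractional) focus plane index of
    the deduced maximum."""
    A = np.asarray(A, dtype=float)
    i = int(np.argmax(A))
    lo, hi = max(0, i - window), min(len(A), i + window + 1)
    x = np.arange(lo, hi, dtype=float)
    a, b, _ = np.polyfit(x, A[lo:hi], 2)  # A ~ a*x^2 + b*x + c
    if a >= 0:              # degenerate fit: fall back to the sample maximum
        return float(i)
    return -b / (2.0 * a)   # vertex of the fitted parabola

A = [0.1, 0.5, 0.9, 0.9, 0.5, 0.1]
loc = subframe_maximum(A)   # lands between samples 2 and 3
```

The fractional index can then be mapped to a depth value between the two corresponding focus plane positions.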
`
Color for a block of pixels is at least partially derived from the same image from
which 3D geometry is derived. In case the location of the maximum of A is
represented by an image, then also color is derived from that same image. In case
the location of the maximum of A is found by interpolation to be between two
images, then at least one of those two images should be used to derive color, or
both images using interpolation for color also. It is also possible to average color
data from more than two images used in the determination of the location of the
maximum of the correlation measure, or to average color from a subset or
superset of multiple images used to derive 3D surface geometry. In any case,
some image sensor pixel readings are used to derive both surface color and 3D
surface geometry for at least a part of the scanned object.
`
Typically, there are three color filters, so the overall color is composed of three
contributions, such as red, green, and blue, or cyan, magenta, and yellow. Note
that color filters typically allow a range of wavelengths to pass, and there is
typically cross-talk between filters, such that, for example, some green light will
contribute to the intensity measured in pixels with red filters.
`
For an image sensor with a color filter array, a color component c_j within a pixel
block can be obtained as

c_j = Σ_{i=1}^{n} g_{j,i} I_i

where g_{j,i} = 1 if pixel i has a filter for color j, 0 otherwise. For an RGB filter array
like in a Bayer pattern, j is one of red, green, or blue. Further weighting of the
individual color components, i.e., color calibration, may be required to obtain
natural color data, typically as compensation for varying filter efficiency,
illumination source efficiency, and different fraction of color components in the filter
pattern. The calibration may also depend on focus plane location and/or position
within the field of view, as the mixing of the light source component colors may
vary with those factors.
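The per-block color sums and a subsequent calibration weighting can be sketched like this; the 4x4 block, RGGB layout, and gain values are hypothetical:

```python
import numpy as np

def block_color(block, bayer):
    """Sum pixel intensities per color over one pixel block:
    c_j = sum_i g_{j,i} * I_i, where g_{j,i} = 1 iff pixel i carries a
    filter for color j (encoded as 'R', 'G', 'B' in `bayer`)."""
    return {c: float(block[bayer == c].sum()) for c in ("R", "G", "B")}

# 4x4 pixel block with an RGGB Bayer mosaic, uniform unit intensity.
bayer = np.array([["R", "G", "R", "G"],
                  ["G", "B", "G", "B"],
                  ["R", "G", "R", "G"],
                  ["G", "B", "G", "B"]])
block = np.ones((4, 4))
raw = block_color(block, bayer)  # G is sampled twice as often as R or B

# Hypothetical calibration gains compensating filter efficiency and the
# different fraction of each color in the filter pattern.
gains = {"R": 1.0, "G": 0.5, "B": 1.0}
calibrated = {c: gains[c] * v for c, v in raw.items()}
```

On a Bayer mosaic half the pixels are green, which is one reason the calibration step mentioned above rebalances the components.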
`
In some embodiments, color is obtained for every pixel in a pixel block. In sensors
with a color filter array or with other means to separate colors such as diffractive
means, depending on the color measured with a particular pixel, an intensity value
for that color is obtained. In other words, in this case a particular pixel has a color
value only for one color. Recently developed color image sensors allow
measurement of several colors in the same pixel, at different depths in the
substrate, so in that case, a particular pixel can yield intensity values for several
colors. In summary, it is possible to obtain a resolution of the surface color data
that is inherently higher than that of the 3D geometry data.
`
In the embodiments where color resolution is higher than 3D geometry resolution,
a pattern will be visible when at least approximately in focus, which preferably is
the case when color is derived. The image can be filtered such as to visually
remove the pattern, however at a loss of resolution. In fact, it can be
advantageous for the user to be able to see the pattern. For example in intraoral
scanning, it may be important to detect the position of a margin line, the rim or
edge of a preparation. The image of the pattern overlaid on the 3D geometry of
this edge is sharper on a side that is seen approximately perpendicular, and more
blurred on the side that is seen at an acute angle. Thus, a user, who in this
example typically is a dentist or dental technician, can use the difference in
sharpness to more precisely locate the position of the margin line than may be
possible from examining the 3D surface geometry alone.
`
High spatial contrast of the in-focus pattern image on the object is desirable to
obtain a good signal to noise ratio of the correlation measure on the color image
sensor. Improved spatial contrast can be achieved by preferential imaging of the
specular surface reflection from the object on the color image sensor. Thus, some
embodiments of the invention comprise means for preferential/selective imaging of
specularly reflected light. This may be provided if the scanner further comprises
means for polarizing the probe light, for example by means of at least one
polarizing beam splitter.
`
In some embodiments, the polarizing optics is coated such as to optimize
preservation of the circular polarization of a part of the spectrum of the
multichromatic light source that is used for obtaining the 3D surface geometry.

The scanner according to the invention may further comprise means for changing
the polarization state of the probe light and/or the light received from the object.
This can be provided by means of a retardation plate, preferably located in the
optical path. In some embodiments of the invention the retardation plate is a
quarter wave retardation plate.
`
Especially for intraoral applications, the scanner can have an elongated tip, with
means for directing the probe light and/or imaging an object. This may be provided
by means of at least one folding element. The folding element could be a light
reflecting element such as a mirror or a prism.
`
`15
`
For a more in-depth description of the above aspects of this invention, see
WO2010145669.
`
The invention disclosed here comprises a multichromatic light source, for example
a white light source, for example a multi-die LED.
Light received from the scanned object, such as probe light returned from the
object surface or fluorescence generated by the probe light by exciting fluorescent
parts of the object, is recorded by a color image sensor. In some embodiments,
the color image sensor comprises a color filter array such that every pixel in the
color image sensor has a color-specific filter. The color filters are preferably
arranged in a regular pattern, for example where the color filters are arranged
according to a Bayer color filter pattern. The image data thus obtained are used to
derive both 3D surface geometry and surface color for each block of pixels. For a
focus scanner utilizing a correlation measure, the 3D surface geometry may be
found from an extremum of the correlation measure as described above.

In some embodiments, the 3D surface geometry is derived from light in a first part
of the spectrum of the probe light provided by the multichromatic light source.
`
`
`20
`
`25
`
`30
`
`Copv provided bv USPTO from the IFW Imana Natahknca an noaninada
`
`
`
Preferably, the color filters are aligned with the image pixels, preferably such that
each pixel has a color filter for a particular color only.

In some embodiments, the color filter array is such that its proportion of pixels with
color filters that match the first part of the spectrum is larger than 50%.

In some embodiments, the scanner is configured to derive the surface color with a
higher resolution than the 3D surface geometry.

In some embodiments, the higher surface color resolution is achieved by
demosaicing, where color values for pixel blocks may be demosaiced to achieve
an apparently higher resolution of the color image than is present in the 3D
surface geometry. The demosaicing may operate on pixel blocks or individual
pixels.
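One common demosaicing scheme can be sketched as below, assuming a Bayer RGGB mosaic and simple bilinear (neighbour-averaging) interpolation; the disclosure does not prescribe a particular algorithm:

```python
import numpy as np

def box3(a):
    """Sum over each pixel's 3x3 neighbourhood (zero padding at edges)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def demosaic_bilinear(raw, mask):
    """Minimal demosaicing sketch: each pixel keeps its measured color
    and receives the two missing colors as the average of neighbouring
    pixels carrying the matching filter (0=R, 1=G, 2=B in `mask`)."""
    out = np.zeros(raw.shape + (3,))
    for c in range(3):
        known = (mask == c)
        num = box3(np.where(known, raw, 0.0))
        den = box3(known.astype(float))
        out[..., c] = np.where(known, raw, num / np.maximum(den, 1.0))
    return out

# 2x2 RGGB tile repeated over a 4x4 sensor, uniform grey scene.
mask = np.tile(np.array([[0, 1], [1, 2]]), (2, 2))
rgb = demosaic_bilinear(np.full((4, 4), 0.5), mask)
```

A flat scene stays flat under this scheme, while the full-resolution RGB output illustrates how color can appear at a higher resolution than the block-wise 3D geometry.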
`
In case a multi-die LED or another illumination source comprising physically or
optically separated light emitters is used, it is preferable to aim at a Köhler type
illumination in the scanner, i.e. the illumination source is defocused at the object
plane in order to achieve uniform illumination and good color mixing for the entire
field of view. In case color mixing is not perfect and varies with focal plane
location, color calibration of the scanner will be advantageous.
`
It can be preferable to compute the 3D surface geometry only from pixels with one
or two kinds of color filters. A single color requires no achromatic optics and thus
provides for a scanner that is easier and cheaper to build. Furthermore, folding
elements can generally not preserve the polarization state for all colors equally
well. When only some color(s) is/are used to compute 3D surface geometry, the
reference vector f will contain zeros for the pixels with filters for the other color(s).
Accordingly, the total signal strength is generally reduced, but for large enough
blocks of pixels, it is generally still sufficient. Preferentially, the pixel color filters
are adapted for little cross-talk from one color to the other(s). Note that even in the
embodiments computing geometry from only a subset of pixels, color is preferably
still computed from all pixels.
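Zeroing the reference weights for the unused color(s) can be sketched as follows; the 1D block with alternating green/red filters and the numeric values are invented for illustration:

```python
import numpy as np

# Hypothetical 8-pixel block with alternating green/red filters and a
# calibrated full reference vector (values are illustrative only).
filters = np.array(["G", "R", "G", "R", "G", "R", "G", "R"])
f_full = np.array([1., -1., 1., -1., 1., -1., 1., -1.])

# Geometry from green pixels only: zero the weights for the other color.
f_green = np.where(filters == "G", f_full, 0.0)

# In-focus pattern response (AC part) at each pixel, illustrative values.
signal = 0.25 * f_full
A_full = float(np.dot(f_full, signal))    # every pixel contributes
A_green = float(np.dot(f_green, signal))  # reduced total signal strength
```

Here the green-only correlation measure is half the full one, illustrating the reduced, but for large enough pixel blocks still sufficient, signal strength.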
`
To obtain a full 3D surface geometry and color representation of an object, i.e. a
colored full 3D surface geometry of said part of the object surface, typically several
partial representations of the object have to be combined, where each partial
representation is a view from substantially the same relative position of scanner
and object. In the present invention, a view from a given relative position
preferably obtains the 3D geometry and color of the object surface as seen from
that relative position.

For a focus scanner, a view corresponds to one pass of the focusing element(s),
i.e., for a focus scanner each partial representation is the 3D surface geometry and
color derived from the stack of 2D images recorded during the pass of the focus
plane position between its extremum positions.
`
`10
`
The 3D surface geometry found for various views can be combined by algorithms
for stitching and registration as widely known in the literature, or from known view
positions and orientations, for example when the scanner is mounted on axes with
encoders. Color can be interpolated and averaged by methods such as texture
weaving, or by simply averaging corresponding color components in multiple views
of the same location on the 3D surface. Here, it can be advantageous to account
for differences in apparent color due to different angles of incidence and reflection,
which is possible because the 3D surface geometry is also known. Texture
weaving is described by e.g. Callieri M, Cignoni P, Scopigno R. “Reconstructing
textured meshes from multiple range rgb maps”. VMV 2002, Erlangen, Nov 20-22,
2002.
`
In some embodiments, the scanner and/or the scanner system is configured for
generating a partial representation of the object surface based on the obtained
surface color and 3D surface geometry.

In some embodiments, the scanner and/or the scanner system is configured for
combining partial representations of the object surface obtained from different
relative positions to obtain a full 3D surface geometry and color representation of
the part of the object.
`
In some embodiments, the combination of partial representations of the object to
obtain the full 3D surface geometry and color representation comprises computing
the color in each surface point as a weighted average of corresponding points in
all overlapping partial 3D surface geometries at that surface point. The weight of
each partial representation in the sum may be determined by several factors, such
as the presence of saturated pixel values or the orientation of the object surface
with respect to the scanner.
`
Such a weighted average is advantageous in cases where some scanner positions
and orientations relative to the object will give a better estimate of the actual color
than other positions and orientations. If the illumination of the object surface is
uneven, this can to some degree also be compensated for by weighting the best
illuminated parts higher.
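The weighted per-point color average can be sketched as below; the three views, their color readings, and the weight values are invented for illustration (the disclosure only names saturation and surface orientation as possible weighting factors):

```python
import numpy as np

def fused_point_color(colors, weights):
    """Color of one surface point as the weighted average of its
    corresponding readings in all overlapping partial representations."""
    w = np.asarray(weights, dtype=float)
    c = np.asarray(colors, dtype=float)
    return (w[:, None] * c).sum(axis=0) / w.sum()

# Three views of the same surface point; hypothetical weights down-rank
# a view with saturated pixels and one seen at a grazing angle.
colors  = [(0.80, 0.60, 0.50),   # well-lit, near-normal view
           (1.00, 1.00, 0.90),   # saturated, low weight
           (0.70, 0.55, 0.45)]   # grazing angle, medium weight
weights = [1.0, 0.1, 0.5]
rgb = fused_point_color(colors, weights)
```

The fused color stays close to the well-lit view, which is the intended effect of weighting better-illuminated, better-oriented views higher.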
`
In some embodiments, the scanner comprises an image processor configured for
performing a post-processing of the 3D surface geometry, the surface color
readings, or the derived partial or full 3D surface geometries of the object. The
scanner may be configured for performing the combination