WIPO - WORLD INTELLECTUAL PROPERTY ORGANIZATION

DOCUMENT MADE AVAILABLE UNDER THE
PATENT COOPERATION TREATY (PCT)

International application number: PCT/EP2014/052842
International filing date: 13 February 2014 (13.02.2014)
Document type: Certified copy of priority document
Document details: Country/Office: US
                  Number: 61/764,178
                  Filing date: 13 February 2013 (13.02.2013)
Date of receipt at the International Bureau: 11 April 2014 (11.04.2014)

Remark: Priority document submitted or transmitted to the International Bureau in compliance with Rule 17.1(a), (b) or (b-bis)

34, Chemin des Colombettes
1211 Geneva 20, Switzerland
www.wipo.int

Align Ex. 1029
U.S. Patent No. 9,962,244
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office

PA 7461729

February 14, 2014

THIS IS TO CERTIFY THAT ANNEXED HERETO IS A TRUE COPY FROM THE RECORDS OF THE UNITED STATES PATENT AND TRADEMARK OFFICE OF THOSE PAPERS OF THE BELOW IDENTIFIED PATENT APPLICATION THAT MET THE REQUIREMENTS TO BE GRANTED A FILING DATE UNDER 35 USC 111.

APPLICATION NUMBER: 61/764,178
FILING DATE: February 13, 2013

THE COUNTRY CODE AND NUMBER OF YOUR PRIORITY APPLICATION, TO BE USED FOR FILING ABROAD UNDER THE PARIS CONVENTION, IS US61/764,178

By Authority of the
Under Secretary of Commerce for Intellectual Property
and Director of the United States Patent and Trademark Office

R. BLA
Certifying Officer
Electronic Acknowledgement Receipt

Application Number: 61764178
International Application Number:
Confirmation Number: 9122

Title of Invention: FOCUS SCANNING APPARATUS RECORDING COLOR

Payment information:

File Listing (Document / File Size (Bytes) / Multi / Pages):
Drawings - only black and white line drawings: Drawings.pdf (165101 bytes)
Specification: Provisional_Application.pdf (1331498 bytes)
Transmittal of New Application: Provisional_Transmittal_Letter.pdf
Application Data Sheet: Application_Data_Sheet.pdf (281970 bytes)
    Information: This is not an USPTO supplied ADS fillable form
Fee Worksheet (SB06): fee-info.pdf
This Acknowledgement Receipt evidences receipt on the noted date by the USPTO of the indicated documents, characterized by the applicant, and including page counts, where applicable. It serves as evidence of receipt similar to a Post Card, as described in MPEP 503.

New Applications Under 35 U.S.C. 111
If a new application is being filed and the application includes the necessary components for a filing date (see 37 CFR 1.53(b)-(d) and MPEP 506), a Filing Receipt (37 CFR 1.54) will be issued in due course and the date shown on this Acknowledgement Receipt will establish the filing date of the application.

National Stage of an International Application under 35 U.S.C. 371
If a timely submission to enter the national stage of an international application is compliant with the conditions of 35 U.S.C. 371 and other applicable requirements, a Form PCT/DO/EO/903 indicating acceptance of the application as a national stage submission under 35 U.S.C. 371 will be issued in addition to the Filing Receipt, in due course.

New International Application Filed with the USPTO as a Receiving Office
If a new international application is being filed and the international application includes the necessary components for an international filing date (see PCT Article 11 and MPEP 1810), a Notification of the International Application Number and of the International Filing Date (Form PCT/RO/105) will be issued in due course, subject to prescriptions concerning national security, and the date shown on this Acknowledgement Receipt will establish the international filing date of the application.
`
`Copy provided by USPTO from the IFW Imaoe Database on 02/10/2014
`
`iV
`
`iv
`
`

`

Substitute for Form PTO/SB/16                                        Page 1 of 1

PROVISIONAL APPLICATION FOR PATENT COVER SHEET
This is a request for filing a PROVISIONAL APPLICATION FOR PATENT under 37 CFR § 1.53(c).

Attorney Docket Number: 0079124-000065

Inventor(s): [table of names and residences largely illegible in this copy; the legible entries list residences in Copenhagen and elsewhere in Denmark, and one family name reads HOLLENBECK]

Title of Invention: FOCUS SCANNING APPARATUS RECORDING COLOR

Correspondence Address: The address corresponding to Customer Number 21839

This invention was made by an agency of the United States Government or under a contract with an agency of the United States Government.
[X] No.
[ ] Yes, the name of the U.S. Government agency and the Government contract number are:

Enclosed Application Parts
[X] Specification/Claims/Abstract      # of Pages: 27
[X] Drawings                           # Sheets: 3
    Total Pages in Spec/Drawings: 30
[ ] CD(s) Number
[ ] Other (specify):
[X] Application Data Sheet. See 37 CFR 1.76

Method of Payment of Filing Fees
[X] Applicant claims small entity status. See 37 CFR 1.27
[ ] The undersigned hereby grants the USPTO authority to provide the European Patent Office (EPO), the Japan Patent Office (JPO), the Korean Intellectual Property Office (KIPO), the World Intellectual Property Office (WIPO), and any other intellectual property offices in which a foreign application claiming priority to the above-identified patent application is filed access to the above-identified patent application. See 37 CFR 1.14(c) and (h).
[X] The Director is hereby authorized to charge any deficiency in filing fees or credit any overpayment to Deposit Account 02-4800.
[X] Payment by credit card. Payment will be made electronically at time of filing.
[ ] Charge filing fee to Deposit Account 02-4800.

FILING FEE AMOUNT
Filing Fee (2005): $125.00
Total Page Fee (101+ pages):
Total App. Filing Fee: $125.00

SIGNATURE: /signature/                 DATE: February 13, 2013
TYPED or PRINTED NAME: William C. Rowland
Telephone: 703.836.6620                Regis. No. 30888

SEND TO: Commissioner for Patents, P.O. Box 1450, Alexandria, VA 22313-1450
Focus scanning apparatus recording color

Field of the invention

The invention relates to three dimensional (3D) scanning of the surface geometry and surface color of objects. A particular application is within dentistry, particularly for intraoral scanning.
Background of the invention

3D scanners are widely known from the art, and so are intraoral dental 3D scanners (e.g., Sirona Cerec, Cadent iTero, 3Shape TRIOS).

The ability to record surface color is useful in many applications. For example in dentistry, the user can differentiate types of tissue or detect existing restorations. For example in materials inspection, the user can detect surface abnormalities such as crystallization defects or discoloring. None of the above is generally possible from 3D surface information alone.

WO2010145669 mentions the possibility of recording color. In particular, several sequential images, each taken for an illumination in a different color - typically blue, green, and red - are combined to form a synthetic color image. This approach hence requires means to change light source color, such as color filters. Furthermore, in handheld use, the scanner will move relative to the scanned object during the illumination sequence, reducing the quality of the synthetic color image.

Also US7698068 and US8102538 (Cadent Inc.) describe an intraoral scanner that records both 3D geometry data and 3D texture data with one or more image sensor(s). However, there is a slight delay between the color and the 3D geometry recording, respectively. US7698068 requires sequential illumination in different colors to form a synthetic image, while US8102538 mentions white light as a possibility, however from a second illumination source or recorded by a second image sensor, the first set being used for recording the 3D geometry.
WO2012083967 discloses a scanner for recording 3D geometry data and 3D texture data with two separate cameras. While the first camera has a relatively shallow depth of field as to provide focus scanning based on multiple images, the second camera has a relatively large depth of field as to provide color texture information from a single image.

Color-recording scanning confocal microscopes are also known from the prior art (e.g., Keyence VK9700; see also JP2004029373). A white light illumination system along with a color image sensor is used for recording 2D texture, while a laser beam forms a dot that is scanned, i.e., moved over the surface and recorded by a photomultiplier, providing the 3D geometry data from many depth measurements, one for each position of the dot. The principle of a moving dot requires the measured object not to move relative to the microscope during measurement, and hence is not suitable for handheld use.
Summary of the invention

It is an object of the present invention to provide a scanner for obtaining the 3D surface geometry and surface color of the surface of an object, which does not require that some 2D images are recorded for determining the 3D surface geometry while other images are recorded for determining the surface color.

It is an object of the present invention to provide a scanner for obtaining the 3D surface geometry and surface color of the surface of an object, which obtains surface color and the 3D surface geometry simultaneously such that an alignment of data relating to 3D surface geometry and data relating to surface color is not required.
Disclosed is a scanner for obtaining 3D surface geometry and surface color of an object, the scanner comprising:

- a multichromatic light source configured for providing a probe light, and
- a color image sensor comprising an array of image sensor pixels for recording one or more 2D images of light received from said object,

where at least for a block of said image sensor pixels, both surface color and 3D surface geometry of a part of the object are derived at least partly from one 2D image recorded by said color image sensor.
Disclosed is a scanner for obtaining 3D surface geometry and surface color of an object, the scanner comprising:

- a multichromatic light source configured for providing a probe light,
- a color image sensor comprising an array of image sensor pixels, and
- an optical system configured for guiding light received from the object to the color image sensor such that 2D images of said object can be recorded by said color image sensor;

wherein the scanner is configured for acquiring a number of said 2D images of a part of the object and for deriving both surface color and 3D surface geometry of the part of the object from at least one of said recorded 2D images at least for a block of said image sensor pixels, such that the surface color and 3D surface geometry are obtained concurrently by the scanner.
Disclosed is a scanner for obtaining 3D surface geometry and surface color of an object, the scanner comprising:

- a multichromatic light source configured for providing a probe light;
- a color image sensor comprising an array of image sensor pixels, where the image sensor is arranged to record 2D images of light received from the object; and
- an image processor configured for deriving both surface color and 3D surface geometry of at least a part of the object from at least one of said 2D images recorded by the color image sensor.
Disclosed is a scanner system for obtaining 3D surface geometry and surface color of an object, said scanner system comprising:

- a scanner according to any of the embodiments, where the scanner is configured for deriving surface color and 3D surface geometry of the object, and optionally for obtaining a partial or full 3D surface geometry of the part of the object; and
- a data processing unit configured for post-processing 3D surface geometry and/or surface color readings from the color image sensor, or for post-processing the obtained partial or full 3D surface geometry.

In some embodiments, the data processing unit comprises a computer readable medium on which is stored computer implemented algorithms for performing said post-processing.

In some embodiments, the data processing unit is integrated in a cart or a personal computer.
Disclosed is a method of obtaining 3D surface geometry and surface color of an object, the method comprising:

- providing a scanner or scanner system according to any of the embodiments;
- illuminating the surface of said object with probe light from said multichromatic light source;
- recording one or more 2D images of said object using said color image sensor; and
- deriving both surface color and 3D surface geometry of a part of the object from at least some of said recorded 2D images at least for a block of said image sensor pixels, such that the surface color and 3D surface geometry are obtained concurrently by the scanner.
The present invention is a significant improvement over the state of the art in that only a single image sensor and a single multichromatic light source is required, and that surface color and 3D surface geometry for at least a part of the object can be derived from the same image or images, which also means that alignment of color and 3D surface geometry is inherently perfect. In the scanner according to the present invention, there is no need for taking into account or compensating for relative motion of the object and scanner between obtaining 3D surface geometry and surface color. Since the 3D surface geometry and the surface color are obtained at precisely the same time, the scanner automatically maintains its spatial disposition with respect to the object surface while obtaining the 3D surface geometry and the surface color. This makes the scanner of the present invention suitable for handheld use, for example as an intraoral scanner, or for scanning moving objects.
In the context of the present invention, the phrase "surface color" may refer to the apparent color of an object surface and thus in some cases, such as for semi-transparent or semi-translucent objects such as teeth, be caused by light from the object surface and/or the material below the object surface, such as material immediately below the object surface.

In some embodiments, the 3D surface geometry and the surface color are both determined from light recorded by the color image sensor.

In some embodiments, the light received from the object originates from the multichromatic light source, i.e. it is probe light reflected or scattered from the surface of the object.
In some embodiments, the light received from the object is fluorescence excited by the probe light from the multichromatic light source, i.e. fluorescence emitted by fluorescent materials in the object surface.

In some embodiments, a second light source is used for the excitation of fluorescence while the multichromatic light source provides the light for obtaining the geometry and color of the object.

In some embodiments, the scanner comprises a first optical system, such as an arrangement of lenses, for transmitting the probe light from the multichromatic light source towards an object and a second optical system for imaging light received from the object at the color image sensor.

In some embodiments, only one optical system images the probe light onto the object and images the object, or at least a part of the object, onto the color image sensor, preferably along the same optical axis, however along opposite optical paths. The scanner may comprise at least one beam splitter located in the optical path, where the beam splitter is arranged such that it directs the probe light from the multichromatic light source towards the object while it directs light received from the object towards the color image sensor.
In some embodiments, the surface color and 3D surface geometry of the part of the object are derived from a plurality of recorded 2D images. In that case, both surface color and 3D surface geometry of the part of the object can be derived from a number of the plurality of recorded 2D images.

Several scanning principles are suitable for this invention, such as triangulation and focus scanning.

In some embodiments, the scanner is a focus scanner configured for obtaining a stack of 2D images of the object from a number of different focus plane positions. In some focus scanning embodiments, the focus plane is adjusted in such a way that the image of e.g. a spatial pattern projected by the light source on the probed object is shifted along the optical axis while recording 2D images at a number of focus plane positions such that said stack of recorded 2D images can be obtained for a given position of the scanner relative to the object. The focus plane position may be varied by means of at least one focus element, e.g., a moving focus lens.
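Purely as an illustration of this acquisition scheme (the driver calls `move_focus` and `grab_image` are hypothetical placeholders, not part of this disclosure), one pass could be organized along these lines:

```python
import numpy as np

def record_focus_stack(move_focus, grab_image, focus_positions):
    """Record one stack of 2D images, one image per focus plane position.

    move_focus(p):   hypothetical driver call placing the focus element at position p.
    grab_image():    hypothetical driver call returning one 2D image from the color
                     image sensor as a numpy array.
    focus_positions: iterable of focus element positions covering one pass.
    """
    stack = []
    for p in focus_positions:
        move_focus(p)             # shifts the image of the projected pattern along the optical axis
        stack.append(grab_image())
    return np.stack(stack)        # shape: (num_focus_positions, height, width) or (..., channels)
```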
In some focus scanner embodiments, the scanner comprises means for incorporating a spatial pattern in said probe light and means for evaluating a correlation measure at each focus plane position between at least one image pixel and a weight function, where the weight function is determined based on information of the configuration of the spatial pattern. Determining in-focus information may then relate to calculating a correlation measure of the spatially structured light signal provided by the pattern with the variation of the pattern itself (which we term reference) for every location of the focus plane and finding the location of an extremum of this series. In some embodiments, the pattern is static. Such a static pattern can for example be realized as a chrome-on-glass pattern.
One way to define the correlation measure mathematically with a discrete set of measurements is as a dot product computed from a signal vector, I = (I_1, ..., I_n), with n > 1 elements representing sensor signals and a reference vector, f = (f_1, ..., f_n), of reference weights. The correlation measure A is then given by

A = \sum_{i=1}^{n} f_i I_i

The indices on the elements in the signal vector represent sensor signals that are recorded at different pixels, typically in a block of pixels. The reference vector f can be obtained in a calibration step.
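As a minimal sketch of this dot product (an illustration only, not code from the disclosure), the correlation measure for one pixel block could be evaluated as:

```python
import numpy as np

def correlation_measure(block_signal: np.ndarray, reference: np.ndarray) -> float:
    """Correlation measure A = sum_i f_i * I_i for one block of pixels.

    block_signal: sensor readings I_1..I_n of the n pixels in the block.
    reference:    calibrated reference weights f_1..f_n for the same pixels.
    """
    signal = block_signal.ravel().astype(float)
    weights = reference.ravel().astype(float)
    if signal.shape != weights.shape:
        raise ValueError("signal and reference must cover the same pixel block")
    return float(np.dot(weights, signal))
```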
By using knowledge of the optical system used in the scanner, it is possible to transform the location of an extremum of the correlation measure, i.e., the focus plane, into depth data information, on a pixel block basis. All pixel blocks combined thus provide an array of depth data. In other words, depth is along an optical path that is known from the optical design and/or found from calibration, and each block of pixels on the image sensor represents the end point of an optical path. Therefore, depth along an optical path, for a bundle of paths, yields a 3D surface geometry within the field of view of the scanner.
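The exact mapping from extremum location to depth depends on the optical design and calibration of a given scanner; the following sketch simply assumes a per-block polynomial calibration (an assumption for illustration, not a prescribed method):

```python
import numpy as np

def depth_from_extremum(extremum_position: float,
                        block_calibration_coeffs: np.ndarray) -> float:
    """Map the focus position of the correlation extremum to a depth value for one block.

    block_calibration_coeffs: polynomial coefficients (highest order first), assumed to
    have been determined in a calibration step for this block's optical path.
    """
    return float(np.polyval(block_calibration_coeffs, extremum_position))
```

Applying such a mapping to every pixel block yields the array of depth data described above.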
It can be advantageous to smooth and interpolate the series of correlation measure values, such as to obtain a more robust and accurate determination of the location of the maximum. For example, a polynomial can be fitted to the values of A for a pixel block over several images on both sides of the recorded maximum, and a location of a deduced maximum can be found from the maximum of the fitted polynomial, which can be in between two images.
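A simple way to realize such an interpolation (a sketch, assuming a quadratic fit around the recorded maximum is adequate) is:

```python
import numpy as np

def subframe_maximum(correlation_series, half_window: int = 2) -> float:
    """Locate the maximum of the correlation measure series with sub-image precision.

    correlation_series: the value of A for one pixel block at each focus plane position.
    Returns a fractional image index; the deduced maximum may lie between two images.
    """
    a = np.asarray(correlation_series, dtype=float)
    k = int(np.argmax(a))
    lo = max(0, k - half_window)
    hi = min(len(a), k + half_window + 1)
    if hi - lo < 3:
        return float(k)                        # too few samples for a quadratic fit
    coeffs = np.polyfit(np.arange(lo, hi), a[lo:hi], deg=2)  # parabola around the maximum
    if coeffs[0] >= 0:
        return float(k)                        # degenerate fit: fall back to the sample maximum
    vertex = -coeffs[1] / (2.0 * coeffs[0])    # analytic maximum of the fitted parabola
    return float(np.clip(vertex, lo, hi - 1))
```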
Color for a block of pixels is at least partially derived from the same image from which 3D geometry is derived. In case the location of the maximum of A is represented by an image, then also color is derived from that same image. In case the location of the maximum of A is found by interpolation to be between two images, then at least one of those two images should be used to derive color, or both images using interpolation for color also. It is also possible to average color data from more than two images used in the determination of the location of the maximum of the correlation measure, or to average color from a subset or superset of multiple images used to derive 3D surface geometry. In any case, some image sensor pixel readings are used to derive both surface color and 3D surface geometry for at least a part of the scanned object.
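Continuing the sketches above, one illustrative way to take block color from the same images used for the geometry is to interpolate linearly between the two images neighbouring the deduced maximum (averaging over more images, as described above, would work analogously):

```python
import numpy as np

def block_color_at(colors_per_image, fractional_index: float) -> np.ndarray:
    """Derive the block color from the image(s) around the correlation maximum.

    colors_per_image: shape (num_images, 3), the block's color components computed
                      for every image in the focus stack.
    fractional_index: location of the maximum of A, e.g. from subframe_maximum().
    """
    colors = np.asarray(colors_per_image, dtype=float)
    lo = int(np.floor(fractional_index))
    hi = min(lo + 1, len(colors) - 1)
    t = float(fractional_index) - lo
    return (1.0 - t) * colors[lo] + t * colors[hi]
```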
Typically, there are three color filters, so the overall color is composed of three contributions, such as red, green, and blue, or cyan, magenta, and yellow. Note that color filters typically allow a range of wavelengths to pass, and there is typically crosstalk between filters, such that, for example, some green light will contribute to the intensity measured in pixels with red filters.
For an image sensor with a color filter array, a color component c_j within a pixel block can be obtained as

c_j = \sum_{i=1}^{n} g_{j,i} I_i

where g_{j,i} = 1 if pixel i has a filter for color c_j, 0 otherwise. For an RGB filter array like in a Bayer pattern, j is one of red, green, or blue. Further weighting of the individual color components, i.e., color calibration, may be required to obtain natural color data, typically as compensation for varying filter efficiency, illumination source efficiency, and different fraction of color components in the filter pattern. The calibration may also depend on focus plane location and/or position within the field of view, as the mixing of the light source component colors may vary with those factors.
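For a Bayer-type filter array, this sum amounts to accumulating, per color, the readings of the block pixels carrying that color's filter. A minimal sketch, with an optional per-channel gain standing in for the calibration discussed above (the gain values and the filter coding are illustrative assumptions):

```python
import numpy as np

def block_color_components(block_signal: np.ndarray, filter_colors: np.ndarray,
                           gains=(1.0, 1.0, 1.0)) -> np.ndarray:
    """Color components c_j = sum_i g_{j,i} * I_i for one pixel block.

    block_signal:  sensor readings I_i of the block (any shape).
    filter_colors: array of the same shape giving each pixel's filter color code
                   (0 = red, 1 = green, 2 = blue for an RGB/Bayer array).
    gains:         illustrative per-channel calibration weights.
    """
    signal = block_signal.ravel().astype(float)
    colors = filter_colors.ravel()
    c = np.zeros(3)
    for j in range(3):
        # g_{j,i} = 1 only where pixel i carries the filter for color j
        c[j] = signal[colors == j].sum()
    return c * np.asarray(gains, dtype=float)
```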
In some embodiments, color is obtained for every pixel in a pixel block. In sensors with a color filter array or with other means to separate colors such as diffractive means, depending on the color measured with a particular pixel, an intensity value for that color is obtained. In other words, in this case a particular pixel has a color value only for one color. Recently developed color image sensors allow measurement of several colors in the same pixel, at different depths in the substrate, so in that case, a particular pixel can yield intensity values for several colors. In summary, it is possible to obtain a resolution of the surface color data that is inherently higher than that of the 3D geometry data.
In the embodiments where color resolution is higher than 3D geometry resolution, a pattern will be visible when at least approximately in focus, which preferably is the case when color is derived. The image can be filtered such as to visually remove the pattern, however at a loss of resolution. In fact, it can be advantageous to be able to see the pattern for the user. For example in intraoral scanning, it may be important to detect the position of a margin line, the rim or edge of a preparation. The image of the pattern overlaid on the 3D geometry of this edge is sharper on a side that is seen approximately perpendicular, and more blurred on the side that is seen at an acute angle. Thus, a user, who in this example typically is a dentist or dental technician, can use the difference in sharpness to more precisely locate the position of the margin line than may be possible from examining the 3D surface geometry alone.
High spatial contrast of the in-focus pattern image on the object is desirable to obtain a good signal to noise ratio of the correlation measure on the color image sensor. Improved spatial contrast can be achieved by preferential imaging of the specular surface reflection from the object on the color image sensor. Thus, some embodiments of the invention comprise means for preferential/selective imaging of specularly reflected light. This may be provided if the scanner further comprises means for polarizing the probe light, for example by means of at least one polarizing beam splitter.
In some embodiments, the polarizing optics is coated such as to optimize preservation of the circular polarization of a part of the spectrum of the multichromatic light source that is used for obtaining the 3D surface geometry.

The scanner according to the invention may further comprise means for changing the polarization state of the probe light and/or the light received from the object. This can be provided by means of a retardation plate, preferably located in the optical path. In some embodiments of the invention the retardation plate is a quarter wave retardation plate.

Especially for intraoral applications, the scanner can have an elongated tip, with means for directing the probe light and/or imaging an object. This may be provided by means of at least one folding element. The folding element could be a light reflecting element such as a mirror or a prism.

For a more in-depth description of the above aspects of this invention, see WO2010145669.
The invention disclosed here comprises a multichromatic light source, for example a white light source, for example a multi-die LED.

Light received from the scanned object, such as probe light returned from the object surface or fluorescence generated by the probe light by exciting fluorescent parts of the object, is recorded by a color image sensor. In some embodiments, the color image sensor comprises a color filter array such that every pixel in the color image sensor has a color-specific filter. The color filters are preferably arranged in a regular pattern, for example where the color filters are arranged according to a Bayer color filter pattern. The image data thus obtained are used to derive both 3D surface geometry and surface color for each block of pixels. For a focus scanner utilizing a correlation measure, the 3D surface geometry may be found from an extremum of the correlation measure as described above.

In some embodiments, the 3D surface geometry is derived from light in a first part of the spectrum of the probe light provided by the multichromatic light source.
Preferably, the color filters are aligned with the image pixels, preferably such that each pixel has a color filter for a particular color only.

In some embodiments, the color filter array is such that its proportion of pixels with color filters that match the first part of the spectrum is larger than 50%.

In some embodiments, the scanner is configured to derive the surface color with a higher resolution than the 3D surface geometry.
In some embodiments, the higher surface color resolution is achieved by demosaicing, where color values for pixel blocks may be demosaiced to achieve an apparently higher resolution of the color image than is present in the 3D surface geometry. The demosaicing may operate on pixel blocks or individual pixels.
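The disclosure does not prescribe a particular demosaicing algorithm; as an illustration only, a plain bilinear demosaic of an RGGB Bayer mosaic could look like this:

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(raw: np.ndarray) -> np.ndarray:
    """Plain bilinear demosaicing of an RGGB Bayer mosaic (illustrative sketch only).

    raw: 2D array of sensor readings laid out as
         R G R G ...
         G B G B ...
    Returns an (H, W, 3) RGB image at full pixel resolution.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=float)
    # Masks marking which pixel carries which filter in an RGGB layout.
    y, x = np.mgrid[0:h, 0:w]
    r_mask = (y % 2 == 0) & (x % 2 == 0)
    b_mask = (y % 2 == 1) & (x % 2 == 1)
    g_mask = ~(r_mask | b_mask)
    # Scatter the sparse samples into their channel planes.
    rgb[..., 0][r_mask] = raw[r_mask]
    rgb[..., 1][g_mask] = raw[g_mask]
    rgb[..., 2][b_mask] = raw[b_mask]
    # Bilinear interpolation kernels that fill in the missing samples.
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 4.0
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], dtype=float) / 4.0
    rgb[..., 0] = convolve(rgb[..., 0], k_rb, mode="mirror")
    rgb[..., 1] = convolve(rgb[..., 1], k_g, mode="mirror")
    rgb[..., 2] = convolve(rgb[..., 2], k_rb, mode="mirror")
    return rgb
```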
In case a multi-die LED or another illumination source comprising physically or optically separated light emitters is used, it is preferable to aim at a Köhler type illumination in the scanner, i.e. the illumination source is defocused at the object plane in order to achieve uniform illumination and good color mixing for the entire field of view. In case color mixing is not perfect and varies with focal plane location, color calibration of the scanner will be advantageous.
It can be preferable to compute the 3D surface geometry only from pixels with one or two kinds of color filters. A single color requires no achromatic optics and thus provides for a scanner that is easier and cheaper to build. Furthermore, folding elements can generally not preserve the polarization state for all colors equally well. When only some color(s) is/are used to compute 3D surface geometry, the reference vector f will contain zeros for the pixels with filters for the other color(s). Accordingly, the total signal strength is generally reduced, but for large enough blocks of pixels, it is generally still sufficient. Preferentially, the pixel color filters are adapted for little cross-talk from one color to the other(s). Note that even in the embodiments computing geometry from only a subset of pixels, color is preferably still computed from all pixels.
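In terms of the correlation-measure sketch given earlier, restricting the geometry computation to, say, the green-filtered pixels simply amounts to zeroing the reference weights of all other pixels (illustrative code, not from the disclosure):

```python
import numpy as np

def restrict_reference_to_color(reference: np.ndarray, filter_colors: np.ndarray,
                                used_color: int = 1) -> np.ndarray:
    """Zero the reference weights of pixels whose filter color is not used for geometry.

    filter_colors: per-pixel filter code (0 = red, 1 = green, 2 = blue), same shape as
                   the reference vector; used_color = 1 keeps only the green pixels.
    """
    return np.where(filter_colors == used_color, reference, 0.0)
```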
To obtain a full 3D surface geometry and color representation of an object, i.e. a colored full 3D surface geometry of said part of the object surface, typically several partial representations of the object have to be combined, where each partial representation is a view from substantially the same relative position of scanner and object. In the present invention, a view from a given relative position preferably obtains the 3D geometry and color of the object surface as seen from that relative position.
For a focus scanner, a view corresponds to one pass of the focusing element(s), i.e. for a focus scanner each partial representation is the 3D surface geometry and color derived from the stack of 2D images recorded during the pass of the focus plane position between its extremum positions.
The 3D surface geometry found for various views can be combined by algorithms for stitching and registration as widely known in the literature, or from known view positions and orientations, for example when the scanner is mounted on axes with encoders. Color can be interpolated and averaged by methods such as texture weaving, or by simply averaging corresponding color components in multiple views of the same location on the 3D surface. Here, it can be advantageous to account for differences in apparent color due to different angles of incidence and reflection, which is possible because the 3D surface geometry is also known. Texture weaving is described by e.g. Callieri M, Cignoni P, Scopigno R. "Reconstructing textured meshes from multiple range rgb maps". VMV 2002, Erlangen, Nov 20-22, 2002.
In some embodiments, the scanner and/or the scanner system is configured for generating a partial representation of the object surface based on the obtained surface color and 3D surface geometry.

In some embodiments, the scanner and/or the scanner system is configured for combining partial representations of the object surface obtained from different relative positions to obtain a full 3D surface geometry and color representation of the part of the object.
In some embodiments, the combination of partial representations of the object to obtain the full 3D surface geometry and color representation comprises computing the color in each surface point as a weighted average of corresponding points in all overlapping partial 3D surface geometries at that surface point. The weight of each partial representation in the sum may be determined by several factors, such as the presence of saturated pixel values or the orientation of the object surface with respect to the scanner.
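A weighted combination along these lines could be sketched as follows; the particular weighting terms (cosine of the viewing angle, rejection of saturated readings) are illustrative choices, since the text only names saturation and surface orientation as possible factors:

```python
import numpy as np

def combine_view_colors(view_colors, view_normals, view_directions, saturated_flags):
    """Weighted average of the color of one surface point over all overlapping views.

    view_colors:     (k, 3) color of the point in each of the k partial representations.
    view_normals:    (k, 3) unit surface normal at the point per view, from the 3D geometry.
    view_directions: (k, 3) unit vector from the point towards the scanner, per view.
    saturated_flags: (k,) True where the corresponding reading contained saturated pixels.
    """
    colors = np.asarray(view_colors, dtype=float)
    normals = np.asarray(view_normals, dtype=float)
    directions = np.asarray(view_directions, dtype=float)
    saturated = np.asarray(saturated_flags, dtype=bool)
    # Favour views that see the surface head-on and discard saturated readings.
    cos_incidence = np.clip(np.einsum("ij,ij->i", normals, directions), 0.0, 1.0)
    weights = cos_incidence * (~saturated)
    if weights.sum() == 0.0:
        weights = np.ones(len(colors))   # fall back to a plain average
    weights = weights / weights.sum()
    return weights @ colors
```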
Such a weighted average is advantageous in cases where some scanner positions and orientations relative to the object will give a better estimate of the actual color than other positions and orientations. If the illumination of the object surf
