`
`(19) World Intellectual Property Organization
`International Bureau
`
`(43) International Publication Date
`23 December 2010 (23.12.2010)
`
`PCT
`
`
`(10) International Publication Number
`WO 2010/145669 Al
`
(74) Agent: HØIBERG A/S; St. Kongensgade 59 A, DK-1264 Copenhagen K (DK).
`
`(81) Designated States (unless otherwise indicated, for every
`kind of national protection available): AE, AG, AL, AM,
`AO, AT, AU, AZ, BA, BB, BG, BH, BR, BW, BY, BZ,
`CA, CH, CL, CN, CO, CR, CU, CZ, DE, DK, DM, DO,
`DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT,
`HN, HR, HU, ID, IL, IN, IS, JP, KE, KG, KM, KN, KP,
`KR, KZ, LA, LC, LK, LR, LS, LT, LU, LY, MA, MD,
`ME, MG, MK, MN, MW, MX, MY, MZ, NA, NG, NI,
`NO, NZ, OM, PE, PG, PH, PL, PT, RO, RS, RU, SC, SD,
`SE, SG, SK, SL, SM, ST, SV, SY, TH, TJ, TM, TN, TR,
TT, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.
`
(84) Designated States (unless otherwise indicated, for every
kind of regional protection available): ARIPO (BW, GH,
`GM, KE, LR, LS, MW, MZ, NA, SD, SL, SZ, TZ, UG,
`ZM, ZW), Eurasian (AM, AZ, BY, KG, KZ, MD, RU, TJ,
`TM), European (AL, AT, BE, BG, CH, CY, CZ, DE, DK,
`EE, ES, FI, FR, GB, GR, HR, HU, IE, IS, IT, LT, LU,
`LV, MC, MK, MT, NL, NO, PL, PT, RO, SE, SI, SK,
`SM, TR), OAPI (BF, BJ, CF, CG, CI, CM, GA, GN, GQ,
`GW, ML, MR, NE, SN, TD, TG).
`
`Declarations under Rule 4.17:
of inventorship (Rule 4.17(iv))
`
`Published:
`
with international search report (Art. 21(3))
`
`(51) International Patent Classification:
`A61B 5/107 (2006.01)
`
`(21) International Application Number:
`PCT/DK2010/050148
`
(22) International Filing Date: 17 June 2010 (17.06.2010)

(25) Filing Language: English

(26) Publication Language: English
`
(30) Priority Data:
61/187,744    17 June 2009 (17.06.2009)    US
61/231,118    4 August 2009 (04.08.2009)    US
`(71) Applicant
`(for all designated States except US):
3SHAPE A/S [DK/DK]; Holmens Kanal 7, 4. sal,
`DK-1060 Copenhagen K (DK).
`
(72) Inventors; and
(75) Inventors/Applicants (for US only): FISKER, Rune [DK/DK]; Kaplevej 87, DK-2830 Virum (DK). ÖJELUND, Henrik [SE/DK]; Kulsvierparken 55, DK-2800 Lyngby (DK). KJÆR, Rasmus [DK/DK]; Nørre Søgade 21, 1, DK-1370 Copenhagen K (DK). VAN DER POEL, Mike [NL/DK]; Rævebakkevej 35B, DK-2610 Rødovre (DK). QAZI, Arish A. [PK/CA]; 215-1369 Bloor Street West, M6P 4J4, Toronto (CA). HOLLENBECK, Karl-Josef [DE/DK]; Ribegade 12, 3.th, DK-2100 Copenhagen Ø (DK).
`
`
`(54) Title: FOCUS SCANNING APPARATUS
`
`(57) Abstract: Disclosed is a handheld scanner for obtaining and/or measuring the 3D geometry of at least a part of the surface of
`an object using confocal pattern projection techniques. Specific embodiments are given for intraoral scanning and scanning of the
`interior part of a human ear.
`3SHAPE EXHIBIT 2001
`Align v. 3Shape
`PGR2018-00104
`
`
`
`WO 2010/145669
`
`PCT/DK2010/050148
`
Focus scanning apparatus

The present invention relates to an apparatus and a method for optical 3D scanning of surfaces. The principle of the apparatus and method according to the invention may be applied in various contexts. One specific embodiment of the invention is particularly suited for intraoral scanning, i.e. direct scanning of teeth and surrounding soft tissue in the oral cavity. Other dental related embodiments of the invention are suited for scanning dental impressions, gypsum models, wax bites, dental prosthetics and abutments. Another embodiment of the invention is suited for scanning of the interior and exterior part of a human ear or ear channel impressions. The invention may find use within scanning of the 3D structure of skin in dermatological or cosmetic/cosmetological applications, scanning of jewelry or wax models of whole jewelry or part of jewelry, scanning of industrial parts and even time resolved 3D scanning, such as time resolved 3D scanning of moving industrial parts.
`
Background of the invention

The invention relates to three dimensional (3D) scanning of the surface geometry of objects. Scanning an object surface in 3 dimensions is a well-known field of study, and the methods for scanning can be divided into contact and non-contact methods. An example of contact measurement methods is the Coordinate Measurement Machine (CMM), which measures by letting a tactile probe trace the surface. The advantages include great precision, but the process is slow and a CMM is large and expensive. Non-contact measurement methods include x-ray and optical probes.
`
Confocal microscopy is an optical imaging technique used to increase micrograph contrast and/or to reconstruct three-dimensional images by using a spatial pinhole to eliminate out-of-focus light or flare in specimens that are thicker than the focal plane.

A confocal microscope uses point illumination and a pinhole in an optically conjugate plane in front of the detector to eliminate out-of-focus information. Only the light within the focal plane can be detected. As only one point is illuminated at a time in confocal microscopy, 2D imaging requires raster scanning and 3D imaging requires raster scanning in a range of focus planes.
`
`
In WO 00/08415 the principle of confocal microscopy is applied by illuminating the surface with a plurality of illuminated spots. By varying the focal plane, in-focus spot-specific positions of the surface can be determined. However, determination of the surface structure is limited to the parts of the surface that are illuminated by a spot.
`
WO 2003/060587 relates to optical sectioning of a specimen in microscopy wherein the specimen is illuminated with an illumination pattern. Focus positions of the image plane are determined by characterizing an oscillatory component of the pattern. However, the focal plane can only be adjusted by moving the specimen and the optical system relative to each other, i.e. closer to or further away from each other. Thus, controlled variation of the focal plane requires a controlled spatial relation between the specimen and the optical system, which is fulfilled in a microscope. However, such a controlled spatial relation is not applicable to e.g. a hand-held scanner.
`
US 2007/0109559 A1 describes a focus scanner where distances are found from the focus lens positions at which maximum reflective intensity of light beams incident on the object being scanned is observed. In contrast to the invention disclosed here, this prior art exploits no predetermined measure of the illumination pattern and no contrast detection, and therefore the signal-to-noise ratio is sub-optimal.
`
In WO 2008/125605, means for generating a time-variant pattern composed of alternating split images are described. This document describes a scanning method to obtain an optical section of a scan object by means of two different illumination profiles, e.g. two patterns of opposite phases. These two images are used to extract the optical section, and the method is limited to acquisition of images from only two different illumination profiles. Furthermore, the method relies on a predetermined calibration that determines the phase offset between the two illumination profiles.
`
Summary of the invention

Thus, an object of the invention is to provide a scanner which may be integrated in a manageable housing, such as a handheld housing. Further objects of the invention are to discriminate out-of-focus information and to provide a fast scanning time.
`
`
`
`
This is achieved by a method and a scanner for obtaining and/or measuring the 3D geometry of at least a part of the surface of an object, said scanner comprising:

- at least one camera accommodating an array of sensor elements,
- means for generating a probe light incorporating a spatial pattern,
- means for transmitting the probe light towards the object thereby illuminating at least a part of the object with said pattern in one or more configurations,
- means for transmitting at least a part of the light returned from the object to the camera,
- means for varying the position of the focus plane of the pattern on the object while maintaining a fixed spatial relation of the scanner and the object,
- means for obtaining at least one image from said array of sensor elements,
- means for evaluating a correlation measure at each focus plane position between at least one image pixel and a weight function, where the weight function is determined based on information of the configuration of the spatial pattern;
- data processing means for:
  a) determining by analysis of the correlation measure the in-focus position(s) of:
     - each of a plurality of image pixels for a range of focus plane positions, or
     - each of a plurality of groups of image pixels for a range of focus plane positions, and
  b) transforming in-focus data into 3D real world coordinates.
`
The method and apparatus described in this invention are for providing a 3D surface registration of objects using light as a non-contact probing agent. The light is provided in the form of an illumination pattern to provide a light oscillation on the object. The variation/oscillation in the pattern may be spatial, e.g. a static checkerboard pattern, and/or it may be time varying, for example by moving a pattern across the object being scanned. The invention provides for a variation of the focus plane of the pattern over a range of focus plane positions while maintaining a fixed spatial relation of the scanner and the object. This does not mean that the scan must be provided with a fixed spatial relation of the scanner and the object, but merely that the focus plane can be varied (scanned) with a fixed spatial relation of the scanner and the object. This provides for a hand-held scanner solution based on the present invention.
`
In some embodiments the signals from the array of sensor elements are light intensity.

One embodiment of the invention comprises a first optical system, such as an arrangement of lenses, for transmitting the probe light towards the object and a second optical system for imaging light returned from the object to the camera. In the preferred embodiment of the invention only one optical system images the pattern onto the object and images the object, or at least a part of the object, onto the camera, preferably along the same optical axis, however along opposite optical paths.
`
In the preferred embodiment of the invention an optical system provides an imaging of the pattern onto the object being probed and from the object being probed to the camera. Preferably, the focus plane is adjusted in such a way that the image of the pattern on the probed object is shifted along the optical axis, preferably in equal steps from one end of the scanning region to the other. The probe light incorporating the pattern provides a pattern of light and darkness on the object. Specifically, when the pattern is varied in time for a fixed focus plane, the in-focus regions on the object will display an oscillating pattern of light and darkness. The out-of-focus regions will display smaller or no contrast in the light oscillations.
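The loss of contrast away from the focus plane can be illustrated with a small simulation. The sketch below is not part of the patent text: defocus is crudely modelled as a moving-average blur of a one-dimensional line pattern, and all names and numbers are illustrative.

```python
def line_pattern(n, period):
    """A one-dimensional light/dark line pattern: 1.0 for the bright half
    of each period, 0.0 for the dark half."""
    return [1.0 if (i % period) < period // 2 else 0.0 for i in range(n)]

def blur(signal, width):
    """Crude defocus model: moving-average blur with the given kernel width."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def contrast(signal):
    """Peak-to-peak amplitude of the signal, a simple contrast measure."""
    return max(signal) - min(signal)

pattern = line_pattern(64, 8)      # "in focus": sharp light/dark transitions
defocused = blur(pattern, 9)       # "out of focus": smeared transitions
print(contrast(pattern), contrast(defocused))
```

The blurred (out-of-focus) signal always shows a smaller peak-to-peak contrast than the sharp pattern, which is exactly the property the scanner exploits.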
`
Generally we consider the case where the light incident on the object is reflected diffusely and/or specularly from the object's surface. But it is understood that the scanning apparatus and method are not limited to this situation. They are also applicable to e.g. the situation where the incident light penetrates the surface and is reflected and/or scattered and/or gives rise to fluorescence and/or phosphorescence in the object. Inner surfaces in a sufficiently translucent object may also be illuminated by the illumination pattern and be imaged onto the camera. In this case a volumetric scanning is possible. Some planktic organisms are examples of such objects.
`
When a time varying pattern is applied, a single sub-scan can be obtained by collecting a number of 2D images at different positions of the focus plane and at different instances of the pattern. As the focus plane coincides with the scan surface at a single pixel position, the pattern will be projected onto the surface point in-focus and with high contrast, thereby giving rise to a large variation, or amplitude, of the pixel value over time. For each pixel it is thus possible to identify individual settings of the focusing plane for which each pixel will be in focus. By using knowledge of the optical system used, it is possible to transform the contrast information vs. position of the focus plane into 3D surface information, on an individual pixel basis.
`
Thus, in one embodiment of the invention the focus position is calculated by determining the light oscillation amplitude for each of a plurality of sensor elements for a range of focus planes.
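As an illustrative sketch of this embodiment (not the patent's implementation): for each pixel, record its value at every focus plane position over a number of pattern instances, take the peak-to-peak variation as the oscillation amplitude, and keep the focus plane index where that amplitude is largest. All data below are synthetic.

```python
def oscillation_amplitude(values):
    """Light oscillation amplitude of one pixel over time: peak-to-peak
    variation of the pixel value while the pattern varies."""
    return max(values) - min(values)

def best_focus_index(stack):
    """stack[k] is the list of time samples of one pixel at focus plane k.
    Returns the index of the focus plane with the largest oscillation."""
    amplitudes = [oscillation_amplitude(samples) for samples in stack]
    return max(range(len(amplitudes)), key=amplitudes.__getitem__)

# One pixel, five focus plane positions, four pattern instances each
# (synthetic numbers: plane 2 shows the strongest light/dark oscillation).
pixel_stack = [
    [0.50, 0.52, 0.49, 0.51],   # far out of focus: almost no variation
    [0.40, 0.60, 0.42, 0.58],
    [0.05, 0.95, 0.06, 0.94],   # in focus: large amplitude
    [0.35, 0.65, 0.37, 0.63],
    [0.48, 0.53, 0.50, 0.52],
]
print(best_focus_index(pixel_stack))   # focus plane index with max amplitude
```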
`
For a static pattern a single sub-scan can be obtained by collecting a number of 2D images at different positions of the focus plane. As the focus plane coincides with the scan surface, the pattern will be projected onto the surface point in-focus and with high contrast. The high contrast gives rise to a large spatial variation of the static pattern on the surface of the object, thereby providing a large variation, or amplitude, of the pixel values over a group of adjacent pixels. For each group of pixels it is thus possible to identify individual settings of the focusing plane for which each group of pixels will be in focus. By using knowledge of the optical system used, it is possible to transform the contrast information vs. position of the focus plane into 3D surface information, on an individual pixel group basis.
`
Thus, in one embodiment of the invention the focus position is calculated by determining the light oscillation amplitude for each of a plurality of groups of the sensor elements for a range of focus planes.
`
The 2D to 3D conversion of the image data can be performed in a number of ways known in the art. I.e. the 3D surface structure of the probed object can be determined by finding the plane corresponding to the maximum light oscillation amplitude for each sensor element, or for each group of sensor elements, in the camera's sensor array when recording the light amplitude for a range of different focus planes. Preferably, the focus plane is adjusted in equal steps from one end of the scanning region to the other. Preferably the focus plane can be moved in a range large enough to at least coincide with the surface of the object being scanned.
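When the focus plane is stepped in equal increments, converting a per-pixel amplitude peak to a depth coordinate can be as simple as a linear mapping; a real scanner would instead use knowledge of its optical system. The step size and start position below are made-up calibration values for illustration only.

```python
def plane_index_to_depth(index, z_start_mm, z_step_mm):
    """Map the focus plane index of maximum amplitude to a depth coordinate,
    assuming the focus plane moves in equal steps of z_step_mm from z_start_mm."""
    return z_start_mm + index * z_step_mm

# Hypothetical calibration: focus planes 0.5 mm apart, starting at 5.0 mm.
amplitude_per_plane = [0.1, 0.2, 0.9, 0.3, 0.1]          # toy amplitude curve
peak = max(range(len(amplitude_per_plane)), key=amplitude_per_plane.__getitem__)
depth = plane_index_to_depth(peak, z_start_mm=5.0, z_step_mm=0.5)
print(depth)   # depth of this pixel in scanner coordinates, in mm
```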
`
`
`
`
The present invention distinguishes itself from WO 2008/125605, because in the embodiments of the present invention that use a time-variant pattern, input images are not limited to two illumination profiles and can be obtained from any illumination profile of the pattern. This is because the orientation of the reference image does not rely entirely on a predetermined calibration, but rather on the specific time of the input image acquisition.
`
Thus WO 2008/125605 applies exactly two patterns, which are realized physically by a chrome-on-glass mask illuminated from either side, the reverse side being reflective. WO 2008/125605 thus has the advantage of using no moving parts, but the disadvantage of a comparatively poorer signal-to-noise ratio. In the present invention there is the possibility of using any number of pattern configurations, which makes computation of the light oscillation amplitude or the correlation measure more precise.
`
Definitions

Pattern: A light signal comprising an embedded spatial structure in the lateral plane. May also be termed "illumination pattern".

Time varying pattern: A pattern that varies in time, i.e. the embedded spatial structure varies in time. May also be termed "time varying illumination pattern". In the following also termed "fringes".
`
Static pattern: A pattern that does not vary in time, e.g. a static checkerboard pattern or a static line pattern.

Pattern configuration: The state of the pattern. Knowledge of the pattern configuration at a certain time amounts to knowing the spatial structure of the illumination at that time. For a periodic pattern the pattern configuration will include information of the pattern phase. If a surface element of the object being scanned is imaged onto the camera then knowledge of the pattern configuration amounts to knowledge of what part of the pattern is illuminating the surface element.
`
`
Focus plane: A surface where light rays emitted from the pattern converge to form an image on the object being scanned. The focus plane does not need to be flat. It may be a curved surface.

Optical system: An arrangement of optical components, e.g. lenses, that transmit, collimate and/or image light, e.g. transmitting probe light towards the object, imaging the pattern on and/or in the object, and imaging the object, or at least a part of the object, on the camera.
`
Optical axis: An axis defined by the propagation of a light beam. An optical axis is preferably a straight line. In the preferred embodiment of the invention the optical axis is defined by the configuration of a plurality of optical components, e.g. the configuration of lenses in the optical system. There may be more than one optical axis, if for example one optical system transmits probe light to the object and another optical system images the object on the camera. But preferably the optical axis is defined by the propagation of the light in the optical system transmitting the pattern onto the object and imaging the object onto the camera. The optical axis will often coincide with the longitudinal axis of the scanner.
`
Optical path: The path defined by the propagation of the light from the light source to the camera. Thus, a part of the optical path preferably coincides with the optical axis. Whereas the optical axis is preferably a straight line, the optical path may be a non-straight line, for example when the light is reflected, scattered, bent, divided and/or the like, provided e.g. by means of beam splitters, mirrors, optical fibers and the like.
`
Telecentric system: An optical system that provides imaging in such a way that the chief rays are parallel to the optical axis of said optical system. In a telecentric system out-of-focus points have substantially the same magnification as in-focus points. This may provide an advantage in the data processing. A perfectly telecentric optical system is difficult to achieve, however an optical system which is substantially telecentric or near telecentric may be provided by careful optical design. Thus, when referring to a telecentric optical system it is to be understood that it may be only near telecentric.
`
Scan length: A lateral dimension of the field of view. If the probe tip (i.e. scan head) comprises folding optics to direct the probe light in a direction different from, such as perpendicular to, the optical axis, then the scan length is the lateral dimension parallel to the optical axis.
`
Scan object: The object to be scanned, on whose surface the scanner provides information. "The scan object" may just be termed "the object".

Camera: Imaging sensor comprising a plurality of sensors that respond to light input onto the imaging sensor. The sensors are preferably ordered in a 2D array in rows and columns.
`
Input signal: Light input signal or sensor input signal from the sensors in the camera. This can be the integrated intensity of light incident on the sensor during the exposure time or integration of the sensor. In general, it translates to a pixel value within an image. May also be termed "sensor signal".

Reference signal: A signal derived from the pattern. A reference signal may also be denoted a weight function or weight vector or reference vector.

Correlation measure: A measure of the degree of correlation between a reference and input signal. Preferably the correlation measure is defined such that if the reference and input signal are linearly related to each other then the correlation measure obtains a larger magnitude than if they are not. In some cases the correlation measure is a light oscillation amplitude.
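One simple correlation measure matching this definition is a weighted sum of the input samples with a zero-mean version of the reference signal, so that a linearly related input yields a large magnitude and a constant (variation-free) input yields zero. This is an illustrative sketch, not the patent's exact formula.

```python
def correlation_measure(reference, inputs):
    """Correlation measure between a reference (weight) signal and the
    sensor input signal. The reference is shifted to zero mean so that a
    constant input gives a measure of zero."""
    mean_ref = sum(reference) / len(reference)
    return sum((f - mean_ref) * i for f, i in zip(reference, inputs))

reference = [1.0, 0.0, 1.0, 0.0]        # pattern: light, dark, light, dark

in_focus = [0.9, 0.1, 0.9, 0.1]         # input follows the pattern
flat = [0.5, 0.5, 0.5, 0.5]             # out of focus: no variation

print(correlation_measure(reference, in_focus))  # large magnitude
print(correlation_measure(reference, flat))      # zero
```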
`
Image: An image can be viewed as a 2D array of values (when obtained with a digital camera), or, in optics, an image indicates that there exists a relation between an imaged surface and an image surface where light rays emerging from one point on said imaged surface substantially converge on one point on said image surface.
`
Intensity: In optics, intensity is a measure of light power per unit area. In image recording with a camera comprising a plurality of individual sensing elements, intensity may be used to term the recorded light signal on the individual sensing elements. In this case intensity reflects a time integration of light power per unit area on the sensing element over the exposure time involved in the image recording.
`
`
Mathematical notation

A    A correlation measure between the weight function and the recorded light signal. This can be a light oscillation amplitude.
I    Light input signal or sensor input signal. This can be the integrated intensity of light incident on the sensor during the exposure time or integration of the sensor. In general, it translates to a pixel value within an image.
f    Reference signal. May also be called weight value.
n    The number of measurements with a camera sensor and/or several camera sensors that are used to compute a correlation measure.
H    Image height in number of pixels.
W    Image width in number of pixels.

Symbols are also explained as needed in the text.
`
`
`
`
Detailed description of the invention

The scanner preferably comprises at least one beam splitter located in the optical path. For example, an image of the object may be formed in the camera by means of a beam splitter. Exemplary uses of beam splitters are illustrated in the figures.

In a preferred embodiment of the invention light is transmitted in an optical system comprising a lens system. This lens system may transmit the pattern towards the object and image light reflected from the object to the camera.
`
In a telecentric optical system, out-of-focus points have the same magnification as in-focus points. Telecentric projection can therefore significantly ease the data mapping of acquired 2D images to 3D images. Thus, in a preferred embodiment of the invention the optical system is substantially telecentric in the space of the probed object. The optical system may also be telecentric in the space of the pattern and camera.
`
Varying focus

A pivotal point of the invention is the variation, i.e. scanning, of the focal plane without moving the scanner in relation to the object being scanned. Preferably the focal plane may be varied, such as continuously varied in a periodic fashion, while the pattern generation means, the camera, the optical system and the object being scanned are fixed in relation to each other. Further, the 3D surface acquisition time should be small enough to reduce the impact of relative movement between probe and teeth, e.g. to reduce the effect of shaking. In the preferred embodiment of the invention the focus plane is varied by means of at least one focus element. Preferably the focus plane is periodically varied with a predefined frequency. Said frequency may be at least 1 Hz, such as at least 2 Hz, 3, 4, 5, 6, 7, 8, 9 or at least 10 Hz, such as at least 20, 40, 60, 80 or at least 100 Hz.
`
Preferably the focus element is part of the optical system, i.e. the focus element may be a lens in a lens system. A preferred embodiment comprises means, such as a translation stage, for adjusting and controlling the position of the focus element. In that way the focus plane may be varied, for example by translating the focus element back and forth along the optical axis.
`
`
If a focus element is translated back and forth with a frequency of several Hz, this may lead to instability of the scanner. A preferred embodiment of the invention thus comprises means for reducing and/or eliminating the vibration and/or shaking from the focus element adjustment system, thereby increasing the stability of the scanner. This may at least partly be provided by means for fixing and/or maintaining the centre of mass of the focus element adjustment system, such as a counter-weight to substantially counter-balance movement of the focus element; for example, by translating a counter-weight opposite to the movement of the focus element. Ease of operation may be achieved if the counter-weight and the focus element are connected and driven by the same translation means. This may, however, only substantially reduce the vibration to the first order. If a counter-weight balanced device is rotated around the counter-weight balanced axis, there may be issues relating to the torque created by the counter-weights. A further embodiment of the invention thus comprises means for reducing and/or eliminating the first order, second order, third order and/or higher order vibration and/or shaking from the focus element adjustment system, thereby increasing the stability of the scanner.
`
In another embodiment of the invention more than one optical element is moved to shift the focal plane. In that embodiment it is desirable that these elements are moved together and that the elements are physically adjacent.
`
In the preferred embodiment of the invention the optical system is telecentric, or near telecentric, for all focus plane positions. Thus, even though one or more lenses in the optical system may be shifted back and forth to change the focus plane position, the telecentricity of the optical system is maintained.
`
The preferred embodiment of the invention comprises focus gearing. Focus gearing is the correlation between movement of the lens and movement of the focus plane position. E.g. a focus gearing of 2 means that a translation of the focus element of 1 mm corresponds to a translation of the focus plane position of 2 mm. Focus gearing can be provided by a suitable design of the optical system. The advantage of focus gearing is that a small movement of the focus element may correspond to a large variation of the focus plane position. In specific embodiments of the invention the focus gearing is between 0.1 and 100, such as between 0.1 and 1, such as between 1 and 10, such as between 2 and 8, such as between 3 and 6, such as at least 10, such as at least 20.
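The gearing relation in the example above (gearing of 2: a 1 mm focus element translation moves the focus plane 2 mm) is plain proportionality, sketched here with an illustrative helper name not taken from the patent:

```python
def focus_plane_shift_mm(element_shift_mm, gearing):
    """Focus gearing: the focus plane moves `gearing` times as far as the
    focus element itself."""
    return element_shift_mm * gearing

# The example from the text: gearing of 2, focus element translated 1 mm.
print(focus_plane_shift_mm(1.0, 2.0))   # -> 2.0 mm focus plane translation
```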
`
In another embodiment of the invention the focus element is a liquid lens. A liquid lens can control the focus plane without use of any moving parts.
`
Camera

The camera may be a standard digital camera accommodating a standard CCD or CMOS chip with one A/D converter per line of sensor elements (pixels). However, to increase the frame rate the scanner according to the invention may comprise a high-speed camera accommodating multiple A/D converters per line of pixels, e.g. at least 2, 4, 8 or 16 A/D converters per line of pixels.
`
Pattern

Another central element of the invention is the probe light with an embedded pattern that is projected on to the object being scanned. The pattern may be static or time varying. The time varying pattern may provide a variation of light and darkness on and/or in the object. Specifically, when the pattern is varied in time for a fixed focus plane, the in-focus regions on the object will display an oscillating pattern of light and darkness. The out-of-focus regions will display smaller or no contrast in the light oscillations. The static pattern may provide a spatial variation of light and darkness on and/or in the object. Specifically, the in-focus regions will display an oscillating pattern of light and darkness in space. The out-of-focus regions will display smaller or no contrast in the spatial light oscillations.
`
Light may be provided from an external light source, however preferably the scanner comprises at least one light source and pattern generation means to produce the pattern. It is advantageous in terms of signal-to-noise ratio to design a light source such that the intensity in the non-masked parts of the pattern is as close to uniform in space as possible. In another embodiment the light source and the pattern generation means are integrated in a single component, such as a segmented LED. A segmented LED may provide a static pattern and/or it may provide a time varying pattern in itself by turning on and off the different segments in sequence. In one embodiment of the invention the time varying pattern is periodically varying in time. In another embodiment of the invention the static pattern is periodically varying in space.
`
Light from the light source (external or internal) may be transmitted through the pattern generation means, thereby generating the pattern. For example, the pattern generation means comprises at least one translucent and/or transparent pattern element. For generating a time varying pattern a wheel with an opaque mask can be used. E.g. the mask comprises a plurality of radial spokes, preferably arranged in a symmetrical order. The scanner may also comprise means for rotating and/or translating the pattern element. For generating a static pattern a glass plate with an opaque mask can be used. E.g. the mask comprises a line pattern or checkerboard pattern. In general said mask preferably possesses rotational and/or translational periodicity. The pattern element is located in the optical path. Thus, light from the light source may be transmitted through the pattern element, e.g. transmitted transversely through the pattern element. The time varying pattern can then be generated by rotating and/or translating the pattern element. A pattern element generating a static pattern does not need to be moved during a scan.
`
Correlation

One object of the invention is to provide short scan time and real time processing, e.g. to provide live feedback to a scanner operator to make a fast scan of an entire tooth arch. However, real time high resolution 3D scanning creates an enormous amount of data. Therefore data processing should be provided in the scanner housing, i.e. close to the optical components, to reduce the data transfer rate to e.g. a cart, workstation or display. In order to speed up data processing time and in order to extract in-focus information with an optimal signal-to-noise ratio, various correlation techniques may be embedded/implemented. This may for example be implemented in the camera electronics to discriminate out-of-focus information. The pattern is applied to provide illumination with an embedded spatial structure on the object being scanned. Determining in-focus information relates to calculating a correlation measure of this spatially structured light signal (which we term input signal) with the variation of the pattern itself (which we term reference signal). In general the magnitude of the correlation measure is high if the input signal coincides with the reference signal. If the input signal displays little or no variation then the magnitude of the correlation measure is low. If the