US 20050088529A1
`
`(19) United States
`(12) Patent Application Publication
`Geng
`
(10) Pub. No.: US 2005/0088529 A1

(43) Pub. Date: Apr. 28, 2005
`
`(54) SYSTEM AND A METHOD FOR
`THREE-DIMENSIONAL IMAGING SYSTEMS
`
`(76)
`
`Inventor: Z. Jason Geng, Rockville, MD (US)
`
`Correspondence Address:
`STEVEN L. NICHOLS
RADER, FISHMAN & GRAUER PLLC
`10653 S. RIVER FRONT PARKWAY
`SUITE 150
`SOUTH JORDAN, UT 84095 (US)
`
(21) Appl. No.: 10/973,533
`
(22) Filed: Oct. 25, 2004
`
`Related U.S. Application Data
`
(60) Provisional application No. 60/514,177, filed on Oct. 23, 2003.
`
`Publication Classification
`
`(51)
`
`Int. Cl.7 ..................................................... H04N 5/225
`
`(52) U.S. Cl. ........................................................ 348/207.99
`
`(57)
`
`ABSTRACT
`
`A method for acquiring a surface profile 3D data set of an
`object includes illuminating a surface of the object with a
`sequence of multiple rainbow projection (MRP) structural
`light patterns, capturing light reflected from the object, and
`calculating 3D data (X, Y, Z) for each visible point on the
`object based upon triangulation mathematical principles of
`the captured reflected light.
`
[Drawing Sheets 1-17 (FIGS. 1-28) appear here as images; see the Brief Description of the Drawings below. Legible annotations on the sheets include "Wavelength" axes (Sheets 2-3), a Transmission (0%-100%) versus Wavelength (400 nm-700 nm) plot labeled Fig. 13 (Sheet 8), frame-cycle and odd/even-field timing labels in Fig. 21 (Sheet 12), wiring labels such as "Power Supply (2 wires)," "Video (2 wires)," "User Commands (3 wires)," "Camera Trigger (1 wire)," and "Computer Trigger (3 wires)" (Sheet 15), the labels "Get 3D Image," "Get 2D Color," "Get 2D Image With Video Mode," and "2710 Color Composite Image" (Sheet 16), and reference numerals corresponding to those cited in the detailed description.]
`
`
`SYSTEM AND A METHOD FOR
`THREE-DIMENSIONAL IMAGING SYSTEMS
`
`
`RELATED APPLICATIONS
`
[0001] The present application claims priority under 35 U.S.C. § 119(e) from the following previously-filed Provisional Patent Application, U.S. Application No. 60/514,177, filed Oct. 23, 2003 by Geng, entitled "Method and Apparatus of 3D Imaging Systems," which is incorporated herein by reference in its entirety.
`
`BACKGROUND
`
[0002] High-speed three-dimensional (3D) imaging is an increasingly important function in advanced sensors in both military and civilian applications. For example, high-speed 3D capabilities offer many military systems greatly increased capabilities in target detection, identification, classification, tracking, and kill determination. As a further example, real-time 3D imaging techniques also have great potential in commercial applications, ranging from 3D television, virtual reality, 3D modeling and simulation, Internet applications, industrial inspection, vehicle navigation, robotics and tele-operation, to medical imaging, dental measurement, as well as the apparel and footwear industries, just to name a few.
`
[0003] A number of three-dimensional surface profile imaging methods and apparatuses, described in U.S. Pat. Nos. 5,675,407; 6,028,672; and 6,147,760, the disclosures of which are incorporated herein by reference in their entireties, conduct imaging by projecting light through a linear variable wavelength filter (LVWF), thereby projecting light having a known, spatially distributed wavelength spectrum on the objects being imaged. The LVWF is a rectangular optical glass plate coated with a color-filtering film that gradually varies in color (i.e., wavelength). If the color spectrum of an LVWF is within the visible light region, one edge of the filter rectangle may correspond to the shortest visible wavelength (i.e., blue or violet) while the opposite edge may correspond to the longest visible wavelength (i.e., red). The wavelength of light passing through the coated color-filtering layer is linearly proportional to the distance between the position on the filter glass where the light passes and the blue or red edge. Consequently, the color of the light is directly related to the angle θ, shown in FIG. 1, at which the light leaves the rainbow projector and LVWF.
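For illustration only, the following sketch models this linear LVWF relationship; the 400-700 nm span, the angular limits, and the function names are illustrative assumptions rather than parameters taken from the referenced patents.

    LAMBDA_MIN_NM, LAMBDA_MAX_NM = 400.0, 700.0  # assumed visible span of the LVWF, in nm

    def lvwf_wavelength(x_mm: float, filter_width_mm: float) -> float:
        """Wavelength passed at distance x_mm from the blue edge of a filter of
        the given width (linear variation across the filter)."""
        return LAMBDA_MIN_NM + (LAMBDA_MAX_NM - LAMBDA_MIN_NM) * (x_mm / filter_width_mm)

    def projection_angle_deg(wavelength_nm: float, theta_min_deg: float, theta_max_deg: float) -> float:
        """Projection angle for a wavelength, assuming the same linear mapping
        from filter position to fan-beam angle."""
        t = (wavelength_nm - LAMBDA_MIN_NM) / (LAMBDA_MAX_NM - LAMBDA_MIN_NM)
        return theta_min_deg + t * (theta_max_deg - theta_min_deg)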
`
[0004] Referring to FIGS. 1 and 2 in more detail, the imaging method and apparatus is based on the triangulation principle and the relationship between a light projector (100) that projects through the LVWF (101), a camera (102), and the object or scene being imaged (104). As shown in FIG. 1, a triangle is uniquely defined by the angles theta (θ) and alpha (α), and the length of the baseline (B). With known values for θ, α, and B, the distance (i.e., the range R) between the camera (102) and a point Q on the object's surface can be easily calculated. Because the baseline B is predetermined by the relative positions of the light projector (100) and the camera (102), and the value of α can be calculated from the camera's geometry, the key to the triangulation method is to determine the projection angle, θ, from an image captured by the camera (102) and, more particularly, to determine all θ angles corresponding to all the visible points on an object's surface in order to obtain a full-frame 3D image in one snapshot.
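As a minimal sketch of the triangulation relation described above (assuming θ and α are the interior angles that the projection ray and the camera ray make with the baseline B, as suggested by FIG. 1), the range R follows from the law of sines:

    import math

    def triangulation_range(theta_rad: float, alpha_rad: float, baseline: float) -> float:
        """Range R from the camera to surface point Q by the law of sines, for a
        triangle defined by baseline B and interior angles theta and alpha."""
        return baseline * math.sin(theta_rad) / math.sin(theta_rad + alpha_rad)

For example, with B = 100 mm, θ = 60 degrees, and α = 70 degrees, R is approximately 100·sin(60°)/sin(130°), or about 113 mm.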
`
[0005] FIG. 2 is a more detailed version of FIG. 1 and illustrates the manner in which all visible points on the object's surface (104) are obtained via the triangulation method. As can be seen in the figure, the light projector (100) generates a fan beam of light (200). The fan beam (200) is broad spectrum light (i.e., white light) which passes through the LVWF (101) to illuminate one or more three-dimensional objects (104) in the scene with a pattern of light rays possessing a rainbow-like spectrum distribution. The fan beam of light (200) is composed of multiple vertical planes of light (202), or "light sheets", each plane having a given projection angle and wavelength. Because of the fixed geometric relationship among the light source (100), the lens of the camera (102), and the LVWF (101), there exists a one-to-one correspondence between the projection angle (θ) of the vertical plane of light and the wavelength (λ) of the light ray. Note that although the wavelength variations are shown in FIG. 2 to occur from side to side across the object (104) being imaged, it will be understood by those skilled in the art that the variations in wavelength could also be made from top to bottom across the object (104) or scene being imaged.
`
[0006] The light reflected from the object (104) surface is then detected by the camera (102). If a visible spectrum range LVWF (400-700 nm) is used, the color detected by the camera pixels is determined by the proportions of its primary color red, green, and blue (RGB) components. The color spectrum of each pixel has a one-to-one correspondence with the projection angle (θ) of the plane of light due to the fixed geometry of the camera (102) lens and the LVWF (101) characteristics. Therefore, the color of light received by the camera (102) can be used to determine the angle θ at which that light left the light projector (100) through the LVWF (101).
`
[0007] As described above, the angle α is determined by the physical relationship between the camera (102) and the coordinates of each pixel on the camera's imaging plane. The baseline B between the camera's (102) focal point and the center of the cylindrical lens of the light projector (100) is fixed and known. Given the values for angles α and θ, together with the known baseline length B, all necessary information is provided to easily determine the full frame of three-dimensional range values (x, y, z) for any and every visible spot on the surface of the objects (104) seen by the camera (102).
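The following sketch, provided for illustration under stated assumptions rather than as the method of the referenced patents, shows how a pixel location, the angle θ recovered from color, and the baseline B can yield a 3D point under a pinhole camera model with the baseline along the camera's x-axis:

    import math

    def pixel_to_xyz(u: float, v: float, f: float, theta_rad: float, baseline: float):
        """Reconstruct (x, y, z) for a pixel at (u, v) relative to the principal
        point, given focal length f, projection angle theta, and baseline B
        assumed to lie along the +x axis."""
        norm = math.sqrt(u * u + v * v + f * f)
        ray = (u / norm, v / norm, f / norm)   # unit viewing ray through the pixel
        alpha = math.acos(ray[0])              # angle between the viewing ray and the baseline
        r = baseline * math.sin(theta_rad) / math.sin(theta_rad + alpha)
        return (r * ray[0], r * ray[1], r * ray[2])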
`
[0008] While the camera (102) illustrated in FIG. 2 effectively produces full-frame three-dimensional range values for any and every visible spot on the surface of an object (104), the camera (102) also requires a high signal-to-noise (S/N) ratio, a color sensor, and an LVWF (101) with precision spectral variation, all of which are expensive to achieve. Consequently, there is a need in the art for an inexpensive yet high-speed three-dimensional camera.
`
`SUMMARY
`
`[0009] A method for acquiring a surface profile 3D data
`set of an object includes illuminating a surface of the object
`with a sequence of multiple rainbow projection (MRP)
`structural light patterns, capturing light reflected from the
object, and calculating 3D data (X, Y, Z) for each visible point on the object based upon triangulation mathematical
`principles of the captured reflected light.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`[0010] The accompanying drawings illustrate various
`embodiments of the present system and method and are a
`part of the specification. The illustrated embodiments are
`merely examples and do not limit the scope of the claims.
[0011] FIG. 1 is a simplified block diagram illustrating a triangulation principle according to one exemplary embodiment.

[0012] FIG. 2 is a block diagram illustrating a triangulation principle according to one exemplary embodiment.

[0013] FIG. 3 is a plurality of simplified charts illustrating the characteristics of a multiple rainbow projection (MRP) pattern, according to one exemplary embodiment.

[0014] FIG. 4 is a simplified chart illustrating an improved color matching scheme for an MRP pattern, according to one exemplary embodiment.

[0015] FIG. 5 is a simple block diagram illustrating an exemplary system for generating an MRP pattern using a multi-rainbow filter, according to one exemplary embodiment.

[0016] FIG. 6 is a simplified chart illustrating a monochromatic pattern filter, according to one exemplary embodiment.

[0017] FIG. 7 is a chart illustrating a method for forming an MRP pattern, according to one exemplary embodiment.

[0018] FIG. 8 is a simple block diagram illustrating an MRP system, according to one exemplary embodiment.

[0019] FIG. 9 is a simplified diagram illustrating a monochromatic filter design pattern, according to one exemplary embodiment.

[0020] FIG. 10 is a side view illustrating an optics design layout that may be used to produce an MRP pattern, according to one exemplary embodiment.

[0021] FIG. 11 is a simplified block diagram illustrating a perspective view of an MRP projector, according to one exemplary embodiment.

[0022] FIG. 12 is a photograph illustrating a multi-rainbow projection pattern from a prototype MRP projector, according to one exemplary embodiment.

[0023] FIG. 13 is a chart illustrating a spectral response of a color CCD camera, according to one exemplary embodiment.

[0024] FIG. 14 is a block diagram illustrating a perspective view of a three-dimensional imaging system, according to one exemplary embodiment.

[0025] FIG. 15 is a block diagram illustrating a perspective view of an LED array for sequential pattern projection, according to one exemplary embodiment.

[0026] FIG. 16 is a block diagram illustrating a perspective view of a 3D camera system using an LED array for sequential pattern projection, according to one exemplary embodiment.

[0027] FIG. 17 is a frontal view of a slot plate configuration, according to one exemplary embodiment.

[0028] FIG. 18 is a block diagram illustrating an image formation mechanism of a zigzag pattern using a slot plate, according to one exemplary embodiment.

[0029] FIG. 19 is a block diagram illustrating a micro-shifting of the slot plate using a bi-morpher material, according to one exemplary embodiment.

[0030] FIG. 20 is a block diagram illustrating a micro-switching mechanism used to move the slotted plate into multiple positions, according to one exemplary embodiment.

[0031] FIG. 21 is a chart illustrating a sequential within frame time (SWIFT) imaging method, according to one exemplary embodiment.

[0032] FIG. 22 is a perspective view illustrating a simultaneous acquisition of multiple 3D images from different views, according to one exemplary embodiment.

[0033] FIG. 23 is a system diagram illustrating a 3D imaging system applied to an intraoral 3D camera, according to one exemplary embodiment.

[0034] FIG. 24 is a communication protocol diagram illustrating a communication contact between a handheld camera and a host computer, according to one exemplary embodiment.

[0035] FIG. 25 is a control logic diagram illustrating the internal logic of an intraoral 3D camera, according to one exemplary embodiment.

[0036] FIG. 26 is a timing diagram illustrating an intraoral 3D camera control timing scheme, according to one exemplary embodiment.

[0037] FIG. 27 is a method diagram illustrating the combination of three sequential images to produce a composite true color image, according to one exemplary embodiment.

[0038] FIG. 28 is a simple block diagram illustrating the components of a visible or near infrared rainbow 3D camera, according to one exemplary embodiment.

[0039] Throughout the drawings, identical reference numbers designate similar but not necessarily identical elements.
`
`DETAILED DESCRIPTION
`
[0040] The present specification discloses a system and a method for performing 3D surface imaging. More specifically, the present specification provides a number of exemplary systems and methods for using sequential frame image acquisitions to obtain equivalent rainbow projection information such that a 3D profile of the object surface can be computed accurately. Specific details of the systems and methods will be provided herein.
`
`[0041] As used in the present specification and in the
`appended claims, the phrase "CCD" or "charge-coupled
`device" is meant to be understood as any light-sensitive
`integrated circuit that stores and displays the data for an
`image in such a way that each pixel (picture element) in the
`image is converted into an electrical charge, the intensity of
`which is related to a color in the color spectrum. Addition(cid:173)
`ally, the term "trigger" is meant to be understood as an event
`
`3SHAPE 1013 3Shape v Align IPR2021-01383
`
`
`
`US 2005/0088529 Al
`
`Apr. 28, 2005
`
`3
`
`or period of time during which a projection or sensing event
`is performed. "Cross-talk" refers
`to any
`interference
`between projection patterns, whether projected from a single
`projector or multiple projectors. Additionally the term "Phil(cid:173)
`ips prism" is a term of art referring to an optical prism
`having tilted dichroic surfaces. Also, the term "monochro(cid:173)
`matic" refers to any electromagnetic radiation having a
`single wavelength. The term "Rainbow-type image" or
`"Rainbow-type camera" is meant to be understood as an
`image or a camera configured to collect an image that may
`be used to form a three-dimensional image according to the
`triangulation principles illustrated above with respect to
`FIGS. 1 and 2.
`
`[0042]
`In the following description, for purposes of expla(cid:173)
`nation, numerous specific details are set forth in order to
`provide a thorough understanding of the present system and
`method for performing 3D surface imaging using sequential
`frame image acquisitions. It will be apparent, however, to
`one skilled in the art that the present method may be
`practiced without these specific details. Reference in the
`specification to "one embodiment" or "an embodiment"
`means that a particular feature, structure, or characteristic
`described in connection with the embodiment is included in
`at least one embodiment. The appearance of the phrase "in
`one embodiment" in various places in the specification are
`not necessarily all referring to the same embodiment.
`
`[0043] As noted previously, the Rainbow 3D Camera
`includes a number of unique and advantageous features. The
Rainbow 3D system has an inherent ability to capture full-field 3D images. Using a standard CCD camera, a 2D image
`(or snapshot) can be obtained in 0.001 second, and a stream
`of 3D images can be generated at a rate of 30 frames per
`second without any interpolation. This feature facilitates
`many real-time 3D imaging applications.
`
[0044] Additionally, the Rainbow 3D concept is superior to other 3D digitizers in terms of obtaining high-resolution 3D images. Conventional 3D laser scanners, for example, project sheets of light. The deformation of the light stripes provides 3D measurements along the line. This scheme, however, suffers from poor spatial resolution due to the width of the pattern stripes. Not only is the spatial resolution low, but the acquisition is also not a single-snapshot operation. The Rainbow 3D camera eliminates these drawbacks. The projected rainbow pattern is a continuous spatial function providing a theoretically infinite spatial resolution that is practically limited only by the spatial and spectral resolution of the CCD image sensor. Moreover, the Rainbow 3D system can be designed with a safety level comparable to a conventional intraoral video camera with a built-in light source, and poses no hazards for intraoral clinical applications if used professionally for its intended purpose. It is, therefore, completely eye-safe.
`
[0045] Also, unlike many scanning-laser-based 3D systems, the traditional Rainbow 3D camera has no mechanical moving parts; the mechanical design therefore becomes very simple and reliable. The Rainbow 3D Camera system can also provide a normal 2D intensity image using the same imager. The 3D images and the 2D intensity images, both acquired by the same camera, provide complementary information that greatly facilitates surface feature analysis and data fusion tasks. In addition to using visible light sources, it is possible to use infrared (IR) or ultraviolet (UV) light sources with suitable wavelength-sensitive detectors for special applications with a minor design modification of this 3D probe.
[0046] Further, the surface color of objects has no effect on the 3D measurement accuracy of the traditional Rainbow 3D Camera. Under the rainbow projection, the spectral composition at any single surface point contains only a single wavelength; therefore, at any surface point, variation of the surface color affects only the intensity of the detected signal, not its wavelength. Using a normalization scheme, such an effect can be eliminated entirely in the Rainbow 3D scheme.
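One simple normalization of this kind is sketched below for illustration only; the specification does not spell out the exact scheme. Dividing each detected RGB triple by its total intensity leaves color ratios that depend on the projected wavelength but not on the surface albedo.

    def normalize_rgb(r: float, g: float, b: float):
        """Return chromaticity ratios (r, g, b) / (r + g + b); surface brightness cancels."""
        total = r + g + b
        if total == 0.0:
            return (0.0, 0.0, 0.0)  # no signal detected at this pixel
        return (r / total, g / total, b / total)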
[0047] While the Rainbow 3D Camera technology provides a solid foundation for building various 3D cameras capable of acquiring accurate 3D shape of objects, measurement accuracy is somewhat limited due to the optical baseline used.
`[0048]
`In contrast to the traditional Rainbow 3D Camera
`technology, the present system and method incorporates a
`structured light projector design, referred to herein as the
`multi-rainbow projection (MRP) that enables higher 3D
`measurement accuracy with smaller optical baseline. Refer(cid:173)
`ring to the principle of the Rainbow 3D Camera, there is a
`one-to-one corresponding relationship between the wave(cid:173)
`length of a light sheet and its projection angle in the original
`design. This rainbow projection pattern is referred to herein
`as the Single Rainbow Projection (SRP) illumination. Based
`on this one-to-one relationship, a ( w, 8) lookup table can be
`established for color match operation. Error sensitivity
`analysis reveals that the accuracy of color match operation
`used in the traditional Rainbow 3D camera scheme has a
`major effect on the accuracy of 3D images. The accuracy of
`color match, in turn, is significantly determined by the color
`variation rate of the projected light pattern. Consequently,
`the present system and method employs the "Multi-Rainbow
`Projection (MRP)" concept, as shown, for example, in FIG.
`3. Rather than modifying the projection wavelength in a
`single cycle, the present system and method incorporates a
`projection pattern that varies the wavelength of the projec(cid:173)
`tion pattern several times across the entire field of view.
`Consequently, the present MRP pattern (300), illustrated in
`FIG. 3, illuminates a desired scene with improved wave(cid:173)
`length variation rate thereby achieving improved sensitivity
`in color matching. Improved sensitivity in color matching
`results in an improvement of the accuracy of 3D measure(cid:173)
`ments as will be described in further detail below.
`
`[0049] FIG. 3 graphically illustrates the characteristics of
`the MRP pattern, according to one exemplary embodiment.
`As illustrated in FIG. 3, the multiple rainbow projection
`pattern has a spatial displacement (310) that corresponds to
`the varying intensities (320) of the red (R), green (G), and
`blue (B) wavelengths. These varying intensities are then
`combined to produce the MRP pattern (300). As illustrated
`in FIG. 3, it is evident that for a given wavelength, there are
`multiple possible projection angles. Consequently, the MRP
`eliminates the one-to-one corresponding relationship used
by the Rainbow 3D camera. Therefore, additional procedures are used by the present system and method to distinguish the correct projection angle from multiple candidates resulting from the one-to-many lookup table illustrated by FIG. 3.
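For illustration, the sketch below models an MRP pattern whose wavelength ramps linearly through several rainbow cycles across the projection field; inverting that mapping for a measured wavelength returns several candidate projection angles, which is the one-to-many relationship described above. The cycle count and wavelength span are assumptions, not values from the specification.

    LAMBDA_MIN_NM, LAMBDA_MAX_NM = 400.0, 700.0  # assumed wavelength span of one rainbow cycle, in nm

    def mrp_wavelength(theta: float, theta_span: float, cycles: int) -> float:
        """Wavelength projected at angle theta by an MRP pattern that repeats
        the rainbow 'cycles' times across the angular span."""
        phase = (theta / theta_span * cycles) % 1.0
        return LAMBDA_MIN_NM + phase * (LAMBDA_MAX_NM - LAMBDA_MIN_NM)

    def candidate_angles(wavelength: float, theta_span: float, cycles: int):
        """All projection angles that could have produced the measured wavelength."""
        phase = (wavelength - LAMBDA_MIN_NM) / (LAMBDA_MAX_NM - LAMBDA_MIN_NM)
        return [(k + phase) * theta_span / cycles for k in range(cycles)]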
[0050] According to one exemplary embodiment, ambiguity in the color match operation is reduced by reducing the searching range in the color space to cover only one cycle of rainbow variation, as illustrated in FIG. 4. Although the color-angle lookup table has a one-to-many mapping property in MRP, the present exemplary embodiment restricts the search space to a single cycle of rainbow variation to achieve a one-to-one correspondence. Consequently, within one cycle of the rainbow projection, the solution to the color match becomes unique, or one-to-one.
[0051] As illustrated in FIG. 4, the searching range (410) in the color space (310) is reduced to cover only one cycle of rainbow variation. While this exemplary method produces a one-to-one correspondence, the outcome of the search relies heavily upon the initial condition or initial point of the search (400). If a search starts within a search range (410) of the actual solution, it can achieve a correct result. If, however, a search starts at an initial point of search (400) located away from the proper solution, it may produce a wrong answer.
[0052] Consequently, the present system and method uses an adaptive control scheme to determine the initial search point. According to this exemplary embodiment, when a local search method is used in a one-to-many MRP, an adaptive control scheme increases the chance of identifying a good initial point of search (400). According to one exemplary embodiment, the initial point of search (400) is determined by a previously obtained correct match point adjacent to it. An adjacent previously obtained correct match point is chosen because a large portion of the surface of a physical object is typically continuous; thus the projected color is similar and the projection angle should be very similar.
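A rough sketch of this adaptive selection follows, for illustration only; the function names, the candidate enumeration, and the nearest-angle criterion are assumptions rather than the patented implementation. The ambiguity is resolved by keeping the candidate angle closest to the angle already matched at an adjacent pixel, which implicitly restricts the search to one rainbow cycle around that point.

    LAMBDA_MIN_NM, LAMBDA_MAX_NM = 400.0, 700.0  # assumed wavelength span of one rainbow cycle, in nm

    def match_angle(measured_wavelength: float, neighbor_angle: float,
                    theta_span: float, cycles: int) -> float:
        """Among all candidate projection angles for the measured wavelength,
        keep the one nearest the angle already matched at an adjacent pixel."""
        phase = (measured_wavelength - LAMBDA_MIN_NM) / (LAMBDA_MAX_NM - LAMBDA_MIN_NM)
        candidates = [(k + phase) * theta_span / cycles for k in range(cycles)]
        return min(candidates, key=lambda theta: abs(theta - neighbor_angle))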
`[0053] Design and Implementation of the Multi-Rainbow
`Projection Module
`[0054] According to one exemplary embodiment, the
`Multi-Rainbow Projector is a key component in the design
of a 3D camera that produces multiple cycles of rainbow-like illumination on the object surface with spatially varying wavelength for 3D imaging, according to the present exemplary system and method. Two possible approaches to producing a multi-rainbow projection for a 3D camera include
`using a custom-made multi-rainbow filter or using multiple
`monochromatic filters with prism beamsplitters.
`[0055] According to a first exemplary embodiment, the
`MRP projector (500) includes a white light source (510)
`configured to transmit white light through a multi-rainbow
`filter (520) and through projection optics (530), as illustrated
`in FIG. 5. As illustrated in FIG. 5, the use of a custom-made
`multi-rainbow filter is mechanically simple and incorporates
`few parts, thereby enhancing durability. According to one
exemplary embodiment, the multi-rainbow filter (520) can achieve a Spectral Error Band (SEB) of between approximately 9 and 20 nm. According to this exemplary embodiment, the SEB is the limiting factor in the accuracy of the MRP projector (500) because it may affect the accuracy of color matching as discussed previously.
`[0056] According to a second exemplary embodiment, the
MRP projector incorporates monochromic filters in conjunction with prism beamsplitters to produce the desired MRP. By comparison, the incorporation of the monochromic components will reduce the overall cost of the MRP projector.
`FIGS. 6 through 9 will now be described in further detail
`to explain the second exemplary embodiment.
`
[0057] FIG. 6 illustrates a monochromatic pattern filter (600), according to one exemplary embodiment. As illustrated in FIG. 6, the monochromatic pattern filter (600) has a linear variation with multiple periods of cycles. As illustrated in FIG. 7, there is a 120-degree phase shift among three filters: a first monochromatic filter (610) combined with a first color, a second monochromatic filter (620) combined with a second color, and a third monochromatic filter (630) combined with a third color. As illustrated in FIG. 7, the pattern filters (610, 620, 630) may be combined to form a composite (640) of individual color projection patterns that produce an MRP pattern.
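The sketch below illustrates this construction in the spirit of FIGS. 6 and 7; the sinusoidal transmission profile, sample count, and cycle count are illustrative assumptions, since the specification requires only periodic patterns with a 120-degree mutual phase shift. Three phase-shifted monochromatic patterns are generated and combined position by position into a composite multi-rainbow profile.

    import math

    def pattern_filters(samples: int = 240, cycles: int = 4):
        """Three periodic transmission patterns mutually shifted by 120 degrees,
        and their per-position composite (first, second, third color) triples."""
        filters = []
        for phase_deg in (0.0, 120.0, 240.0):
            phase = math.radians(phase_deg)
            filters.append([
                0.5 * (1.0 + math.sin(2.0 * math.pi * cycles * i / samples + phase))
                for i in range(samples)
            ])
        composite = list(zip(*filters))  # one transmission triple per projection column
        return filters, composite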
[0058] FIG. 8 illustrates an MRP projector (800) incorporating the above-mentioned method. As illustrated in FIG.
`8, a red light emitting diode (810), a green LED (812), and
`a blue LED (814) are configured to provide illumination
`energy which is then passed through a red pattern filter
`(820), a green pattern filter (822), and a blue pattern filter
(824), respectively. As the illumination energy is passed
`through the pattern filters (820, 822, 824), a number of
`prisms (830) with 45-degree beam-splitters are used to
`combine the light in different colors and phases. After the
`beam combination is performed by the prisms (830) with
`45-degree beam-splitters, the projection optics (840) is able
`to generate a multi-rainbow projection (850) pattern onto the
`surface of a desired object. FIG. 9 illustrates a design of
monochromic patterns for use in the multi-rainbow projection system, according to one exemplary embodiment.
[0059] FIG. 10 illustrates an exemplary optical configuration of an MRP projector (1000), according to one exemplary embodiment. As illustrated in FIG. 10, the MRP projector (1000) includes a red light source (1010), a green light source (1012), and a blue light source (1014). According to one exemplary embodiment, the respective light sources may be any number of light sources including, but in no way limited to, light emitting diodes (LED). Additionally, a diffuser (1018) is disposed adjacent to each of the light sources (1010, 1012, 1014) to diffuse the light produced by the light sources. Additionally, as illustrated in
`FIG. 10, intermediate optics (1020) may be placed adjacent
`to the diffusers (1018) to selectively focus the diffused light.
`After being selectively focused, the diffused light is passed
`through a red pattern filter (1030), a green pattern filter
`(1032), or a blue pattern filter (1034) corresponding to the
`respective light source (1010, 1012, 1014). The filtered light
`may then be reflected off of any number of reflective
`surfaces (1040) and passed through a plurality of prisms
`(1052) with 45-degree beam-splitters and passed through
`projection optics (1060) to produce a desired MRP (1070) on
the surface of an object to be imaged.