Miramonti et al.

54  METHOD AND APPARATUS FOR OPTICALLY SCANNING THREE
    DIMENSIONAL OBJECTS USING COLOR INFORMATION IN
    TRACKABLE PATCHES

Inventors: John L. Miramonti, West Lebanon, N.H.;
           Frederick E. Mueller, San Francisco, Calif.

Assignee:  Wavework, Inc., Tiburon, Calif.

Appl. No.: 738,437
Filed:     Oct. 25, 1996
Int. Cl.:  G06K 7/00
U.S. Cl.:  382/312; 382/108; 382/154; 382/162; 382/167; 382/285;
           345/419; 345/425; 345/426; 345/430; 345/431
Field of Search: 382/154, 285, 312, 162, 167, 108;
                 345/419, 425, 431, 430, 426

References Cited

U.S. PATENT DOCUMENTS
4,175,862  11/1979  DiMatteo ................. 382/154
4,298,800  11/1981  Goldman .................. 378/19
4,645,347   2/1987  Rioux .................... 356/376
4,731,860   3/1988  Wahl ..................... 382/281
4,737,032   4/1988  Addleman et al. .......... 356/376
4,937,766   6/1990  Deppe et al. ............. 382/154
4,939,380   7/1990  Berger et al. ............ 250/578.1
4,969,106  11/1990  Vogel et al. ............. 382/108
4,991,224   2/1991  Takahashi et al. ......... 382/154
5,109,236   4/1992  Watanabe et al. .......... 347/193
5,177,349   1/1993  Setani ................... 250/208.1
5,179,554   1/1993  Lomicka et al. ........... 370/257
5,261,044  11/1993  Dev et al. ............... 345/357
5,321,695   6/1994  Faulk, Jr. et al. ........ 370/401
5,402,364   3/1995  Kitoh et al. ............. 702/167
5,528,194   6/1996  Ohtani et al. ............ 382/154
5,561,526  10/1996  Huber et al. ............. 382/154
5,577,130  11/1996  Wu ....................... 382/154
5,583,991  12/1996  Chatwani et al. .......... 395/200.53
5,606,664   2/1997  Brown et al. ............. 395/200.54
5,671,157   9/1997  Saito .................... 382/285
5,675,377  10/1997  Gibas .................... 382/154
5,684,796  11/1997  Abidi et al. ............. 370/389
5,706,440   1/1998  Compliment et al. ........ 395/200.54
`
US005864640A
(11) Patent Number:  5,864,640
(45) Date of Patent: Jan. 26, 1999
`
OTHER PUBLICATIONS
PCT International Search Report, May 20, 1998.
Cysurf: B-spline Surfaces from Cyberware Scans.
Advanced Imaging: "Whole Body Imaging for Visualization and Animation: New Alternatives".
Cyberware Issue 1 - 3D Development.
Cyberware Issue 3 - 3D Development.
Cyberware Issue 4 - 3D Development.
Cyberware Color 3D Digitizer.
Sum of the Parts - Paul I. Anderson, "From Telepresence to True Immersive Imaging: Into Real-Life Video Now".
Silicon Graphics World, Jun. 1993, "Cyberware scanners play major role in creating movie special effects".
Jul. 1995 - "A New True 3-D Motion Camera System from Lawrence Livermore".
BioVision™ Custom Motion Capture.
BioVision™ State of the Art Motion Capture.
Jurassic Park - "How'd They Do That".
Sideline - "Movie- and Manufacturing-Magic".
Cyberware - "Cyberware Wins Academy Award".

Primary Examiner: Leo H. Boudreau
Assistant Examiner: Ishrat Sherali
Attorney, Agent, or Firm: Hickman & Martine, LLP
57  ABSTRACT

The invention provides a three dimensional digital scanner which includes a multiple view detector which is responsive to a broad spectrum of visible light. The multiple view detector is operative to develop a plurality of images of a three dimensional object which is being scanned. The plurality of images are taken from a plurality of relative angles with respect to the object, and the plurality of images depict a plurality of surface portions of the object. A digital processor including a computational unit is coupled to the detector and is responsive to the plurality of images so that it develops 3-D coordinate positions and related image information for the plurality of surface portions of the object. A three dimensional image of the object to be scanned is thus developed by the digital processor. The data developed includes both shape and surface image color information.

17 Claims, 15 Drawing Sheets
`
3SHAPE EXHIBIT 1014, 3Shape v. Align, IPR2019-00160

[Drawing Sheets 1-15 (FIGS. 1 through 10): the flowchart sheets include the legible step labels "identify silhouettes; determine set of tracking points; develop radii for tracking points; output coordinates and color values" (FIG. 3B); "find difference relative to previous image; apply compression technique" (FIG. 4); "apply filter kernel to image; move in from left/right edge of image along scan line to find potential edges; use heuristics to determine left and right edges of the object" (FIG. 7); "locate vertical centerline of image; search area for trackable patches; mark trackable patches; move to next area" (FIG. 8); "choose patch and initial image; calculate expected path of patch in the image; find exact position of patch and store for that image; update patch kernel; determine radius" (FIG. 9); and "convert coordinate data to desired coordinate system; convert color data to desired color system; perform interpolation and decimation; store in data structure" (FIG. 10).]
METHOD AND APPARATUS FOR OPTICALLY SCANNING THREE DIMENSIONAL OBJECTS USING COLOR INFORMATION IN TRACKABLE PATCHES

TECHNICAL FIELD

This invention relates generally to optical scanners, and more particularly to optical scanners for providing a digital representation of three dimensional objects.
BACKGROUND ART

Methods for successfully obtaining two dimensional ("2-D") color image data for objects have been developed. This process is commonly known as two dimensional scanning or digitizing. When an object is scanned, a digital data file is created which contains image data including color information which is associated with a set of two dimensional points or coordinates. The color information is obtained by an optical detector or set of optical detectors that are typically organized in a one or two dimensional array.

Matching the color information with the correct two dimensional point or location is not a significant problem in two dimensional scanning since the two dimensional point on which the optical detector is focused is the same point that is associated with the color information obtained by the detector. The color information is mislocated only to the extent that there is some error in the location of the point on which the detector is focused (e.g. an error introduced by the optical system), and that error can readily be minimized.

The problem of associating color information with three dimensional ("3-D") objects is not so easily solved. This is because prior art methods obtain color information with a two dimensional scanning method, while position information is obtained by a three dimensional scanning method. The mapping of the 2-D color information to the 3-D position information is a complicated process which is prone to significant error.

Many methods exist for obtaining the three dimensional location of the surface points of the object. One such method is a system which uses a laser range finder to scan the object and record the distance between the known three dimensional location of the range finder and the measured location of the surface of the object. The result of using this method or other methods of generating three dimensional surface models is a set of three dimensional points which accurately represent the surface of the object. A characteristic of this method and other methods of obtaining a three dimensional surface model is that it is inherently monochromatic, that is, no color information is obtained in the process. If three dimensional color information is desired, then it must be generated by somehow combining or conformally mapping the two dimensional color information onto the three dimensional surface model.
The problem of conformally mapping the two dimensional color information onto the three dimensional surface model is difficult, and it is common for mismatching of color information with the three dimensional points to occur. The problem may be visualized by imagining a white statue or bust of a person's head and a color photograph of the same person's face. The photograph cannot simply be projected onto the bust to transfer the correct color information to the correct points on the bust, or significant distortion will occur. A significant amount of judgment must be exercised in order to correctly associate the color information from the photograph with the correct surface points on the bust. Similarly, it is difficult to accurately associate color information obtained from two dimensional optical detectors with the correct points on a three dimensional surface model.

Another problem in the prior art is that color information is not used to determine surface locations, which means less than the total amount of information that is available is being used. Furthermore, both a 2-D and a 3-D system are required, which adds cost.

What is needed is a way of generating a set of three dimensional points representing a surface in such a way that the three dimensional points are already associated with color data, so that conformally mapping separately generated color data onto the set of three dimensional surface points is not necessary. Furthermore, it is desirable to utilize all available frequencies of light to determine surface point positions, to maximize the accuracy of the scanning process, and to eliminate a separate 3-D scanning step.
`
DISCLOSURE OF THE INVENTION

Accordingly, the present invention provides a system and method for using the color information from a series of two dimensional color images to derive the three dimensional location in space of the surface points which produced the color images. Because the color information itself is used to derive the three dimensional location of the surface points, there is no need to conformally map separately generated color information onto the derived three dimensional surface points. The points are derived from color information and so are already associated with the correct color information. Also, the use of the color information increases the accuracy of the three dimensional location of the surface points.
In one embodiment, the present invention provides a three dimensional digital scanner which includes a multiple view detector which is responsive to a broad spectrum of visible light. The multiple view detector is operative to develop a plurality of images of a three dimensional object which is being scanned. The plurality of images are taken from a plurality of relative angles with respect to the object, and the plurality of images depict a plurality of surface portions of the object. A digital processor including a computational unit is coupled to the detector and is responsive to the plurality of images so that it develops 3-D coordinate positions and related image information for the plurality of surface portions of the object. A three dimensional image of the object to be scanned is thus developed by the digital processor. The data developed includes both shape and surface image color information.
In another embodiment, a three dimensional color digital scanner includes a color detector responsive to a broad spectrum of visible light to develop a plurality of images of a three dimensional object. A rotary object support having an axis of rotation allows the detector to develop a plurality of images of a three dimensional object. The plurality of images depict a plurality of surface portions of the object. A digital computer is coupled to the detector. The computer tracks patches of the surface portions of the object to determine coordinates of the patches as a function of the rotation of the rotary object support and determines radii of the patches from the axis of rotation.
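The patch tracking described in this embodiment can be sketched in code. The following is a minimal illustration, not the patent's implementation, of locating a trackable patch in a later image by maximizing normalized cross-correlation over a search window; the function name, single-channel input, and array layout are assumptions made for the example.

```python
import numpy as np

def track_patch(image, patch, search_origin, search_size):
    """Locate `patch` inside a search window of `image` by maximizing
    normalized cross-correlation; returns (row, col) of the best match.

    image: 2-D float array (one color channel of a captured frame)
    patch: 2-D float array, the trackable patch kernel
    search_origin: (row, col) of the top-left of the search window
    search_size: (rows, cols) extent of the search window
    """
    ph, pw = patch.shape
    r0, c0 = search_origin
    rows, cols = search_size
    p = patch - patch.mean()
    pn = np.sqrt((p * p).sum()) or 1.0
    best, best_rc = -2.0, (r0, c0)
    for r in range(r0, r0 + rows - ph + 1):
        for c in range(c0, c0 + cols - pw + 1):
            w = image[r:r + ph, c:c + pw]
            wz = w - w.mean()
            wn = np.sqrt((wz * wz).sum()) or 1.0
            score = float((p * wz).sum() / (pn * wn))
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc
```

In practice the search window would be restricted to the patch's expected path in the next angularly displaced image, which keeps the search fast and robust.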
In another embodiment, a method for scanning a three dimensional object includes developing a plurality of images of a three dimensional object taken from a plurality of relative angles with respect to the object. The plurality of images depict a plurality of surface portions of the object to be scanned. 3-D coordinate positions and related image information about the plurality of surface portions of the object are computed from the plurality of images such that a three dimensional image of the object is developed that includes both shape and surface image information.
In another embodiment, a method for determining three dimensional coordinates of a surface portion of an object includes obtaining a plurality of images of the surface portion of the object and identifying a trackable patch of the surface portion in an initial image. An initial set of two dimensional coordinates of the trackable patch in the initial image is determined, along with at least one additional set of two dimensional coordinates of the trackable patch in another of the images. A radial coordinate of the trackable patch is determined, and then a set of three dimensional coordinates of the trackable patch is determined from the radial coordinate of the trackable patch.
In another embodiment, a method for determining three dimensional coordinates of a surface portion of an object includes rotating the object about an axis of rotation so that a plurality of images of the surface portion of the object are obtained as the object is rotated about the axis of rotation. A trackable patch is identified and the two dimensional coordinates of the trackable patch are determined. The movement of the trackable patch is tracked as a function of the rotation of the object. A radial distance of the trackable patch from the axis of rotation is determined based on the movement of the trackable patch as a function of the rotation of the object, and three dimensional coordinates of the surface portion of the object are derived from the coordinates of the trackable patch and the radial distance of the trackable patch from the axis of rotation.
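The radius determination described above can be illustrated with a short numerical sketch. Assuming an orthographic camera, a patch at radius r and angular offset phi from the axis of rotation appears at horizontal image position u(theta) = r*sin(theta + phi), which is linear in a = r*cos(phi) and b = r*sin(phi); a least-squares fit to the tracked positions then yields the radius. The function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def patch_radius(angles, u):
    """Estimate the radius of a tracked patch from its horizontal image
    positions at known rotation angles.

    Assumes an orthographic projection u(theta) = r*sin(theta + phi),
    which is linear in a = r*cos(phi) and b = r*sin(phi):
        u = a*sin(theta) + b*cos(theta)
    angles: rotation angles in radians (e.g. from the position encoder)
    u: horizontal image coordinate of the patch in each image
    Returns (r, phi).
    """
    angles = np.asarray(angles, dtype=float)
    u = np.asarray(u, dtype=float)
    # Design matrix for the linear least-squares fit in (a, b).
    A = np.column_stack([np.sin(angles), np.cos(angles)])
    (a, b), *_ = np.linalg.lstsq(A, u, rcond=None)
    r = float(np.hypot(a, b))
    phi = float(np.arctan2(b, a))
    return r, phi
```

With the radius and phase recovered, the three dimensional coordinates of the patch follow directly from the encoder angle associated with each image.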
The present invention provides a system and method for obtaining 3-D surface information that is linked to color information without the need to conformally map 2-D color data onto a 3-D surface. The accuracy of the system is enhanced by the use of color data, and the cost of the system is reduced because the 3-D surface is derived from a series of 2-D images. These and other advantages of the present invention will become apparent upon reading the following detailed descriptions and studying the various figures of the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a system for obtaining a series of two dimensional color images of an object and processing those images to obtain a three dimensional model of the surface of the object.

FIG. 1A illustrates an alternative embodiment of the present invention which enables the top and bottom portions of an object to be scanned.

FIG. 1B illustrates another embodiment of the present invention which produces enhanced shading of an object.

FIG. 1C illustrates an arrangement where a detector is translated about a stationary object.

FIG. 1D illustrates an embodiment of the present invention which uses a multiple number of detectors instead of moving a single detector.

FIG. 2 illustrates in detail an architecture of an image acquisition system.

FIG. 3A is a flow diagram illustrating a process of obtaining multiple images of a rotating object.

FIG. 3B is a flow diagram illustrating a process for generating three dimensional surface data from the two dimensional images of the object.

FIG. 4 is a flow diagram illustrating a process performed on the images before they are stored.
FIG. 5A illustrates the vector nature of the color data obtained.

FIG. 5B illustrates an example of blue color data at times 0, 1, 2, and 3 for a line of pixels.

FIG. 5C illustrates how the data can be compressed by recording only the changes in the color data.

FIG. 6 is a flow diagram illustrating a process for identifying the silhouette of the object in each image.

FIG. 7 is a flow diagram illustrating a process for finding silhouette edges along each scan line.

FIG. 8 is a flow diagram illustrating a process for determining a set of trackable patches.

FIG. 8A illustrates how to search an image for trackable patches.

FIG. 9 is a flow diagram illustrating a process for determining the radius of the location of patches on the surface of the object as the object is rotated.

FIG. 9A illustrates a set of patch tracking limits.

FIG. 9B illustrates the motion of trackable patches in different images with different angular displacements.

FIG. 9C illustrates the determination of an exact position of the patch in an image.

FIG. 9D is a graph which illustrates the filtering of raw data points.

FIG. 9E is a graph which illustrates how the radius is determined from the points representing the path of the trackable patch across angularly displaced images.

FIG. 10 is a flow diagram illustrating the post processing that occurs once the radius of the trackable patch is known.
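The change-only recording illustrated in FIGS. 5B and 5C can be sketched as a simple delta encoding of one color channel along a line of pixels. The pair-list representation below is an illustrative assumption, not the exact format used by the patent.

```python
def encode_changes(prev_line, line):
    """Record only the pixels whose color value changed since the
    previous image (the scheme sketched in FIGS. 5B-5C): a list of
    (pixel_index, new_value) pairs."""
    return [(i, v) for i, (p, v) in enumerate(zip(prev_line, line)) if p != v]

def decode_changes(prev_line, changes):
    """Rebuild the current line from the previous line plus the
    recorded changes."""
    line = list(prev_line)
    for i, v in changes:
        line[i] = v
    return line
```

Because only a small fraction of pixels changes between successive views of a slowly rotating object, this kind of encoding can shrink the stored image series considerably.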
BEST MODES FOR CARRYING OUT THE INVENTION

In FIG. 1, an embodiment of the present invention includes a system for obtaining a series of two dimensional color images of an object and processing those images to obtain a three dimensional model of the surface of the object. An object 100 which is to be digitized is placed on a rotatable platform 102. A motor 104 is provided to drive rotatable platform 102 via a shaft 106. A position encoder 108 detects the angular position of rotatable platform 102 and generates an electrical signal which represents the angular position of rotatable platform 102. An optical detector 110 (e.g. a color video camera) views object 100 and creates a two dimensional color image of object 100.

As object 100 is rotated by rotatable platform 102, detector 110 captures a series of color images of object 100. Each color image taken at a different time is associated with an angular rotation of object 100 about an axis of rotation, "A", which runs through shaft 106. Information about the angular position of object 100 is obtained from position encoder 108. Thus, each "snapshot" or image of object 100 taken by detector 110 from a different view is associated with data about the angle of rotation of object 100 with respect to detector 110. An image input processing system 120 ("computer") controls the image acquisition process and records the acquired images along with the associated angular position data. That is, processing system 120 is connected to detector 110 and receives data for each image or snapshot taken of object 100 from detector 110, and position encoder 108 sends angular position information to processing system 120, so that processing system 120 can associate the image data from detector 110 with the angular position data taken at the same time. In other embodiments, detector 110 is a film camera and processing system 120 receives data from a digitizer which digitizes the film images from detector 110.

Processing system 120 includes a processing unit 122 and a monitor 124 and also controls motor 104. A monitor 124 can display a current image 126 being captured by detector 110 or other information about the capturing process.
Once processing system 120 has obtained a series of images, those images are transferred to an image processor 130 ("computer"). Image processor 130 can receive data from processing system 120 in a number of different ways. Image processor 130 can be directly connected to processing system 120 via direct connection 132, or data from processing system 120 can be transferred to a removable storage medium such as disk 134 which may be read by image processor 130. Processing system 120 may also transfer data to image processor 130 via the Internet or a modem connection. Image processor 130 includes processing unit 136 and also includes monitor 138.

In other embodiments, processing system 120 and image processor 130 are combined on a single computer. The advantage of separating the functions of processing system 120 and image processor 130 is that the data acquisition and storage function performed by processing system 120 and control of the data acquisition system do not require a complex or powerful processor. On the other hand, image processor 130 receives data representing a series of two dimensional images and performs complex and computationally intensive operations on that data to produce a three dimensional surface model. Image processor 130 is therefore, given current technology, likely to be a more powerful (and costly) computer than processing system 120. If that is the case, then it is economically beneficial to utilize a large number of relatively cheap processors for data acquisition and temporary storage and send data from those relatively cheap systems to a smaller number of image processors which generate the three dimensional surface model from the set of two dimensional color images.
FIG. 1A illustrates an alternative embodiment of the present invention which enables the top and bottom portions of an object to be scanned. Again, object 100 is supported by rotatable platform 102 which is driven by motor 104. In this embodiment, shaft 107 engages the edge of rotatable platform 102, so that motor 104 and shaft 107 do not obscure the image of the bottom of object 100. Rotatable platform 102 is made from a transparent material so that the bottom of object 100 may be viewed through rotatable platform 102. A set of mirrors 109 is placed within the field of view of detector 110 so that images of the top and bottom surfaces of object 100 are captured by detector 110 in addition to the side views.
FIG. 1B illustrates another embodiment of the present invention which is designed to produce contrast enhancing shading of object 100. Again, object 100 is supported by rotatable platform 102 which is driven by a motor 104 via a shaft 106. A second motor 142 also drives a rotatable platform 144 via shaft 146. Encoder 148 generates data representative of the rotational position of rotatable platform 144 and transmits that data to processing system 120. Likewise, motor 142 receives control commands from processing system 120. A light 150 is mounted on rotatable platform 144 to provide illumination of object 100. Light 150 is oriented to provide contrasting illuminated and shaded portions on object 100 which aid in the tracking of features on the surface of object 100. Because light 150 is mounted on rotatable platform 144, which is separately controllable by processing system 120, different orientations of light 150 with respect to object 100 may be checked to determine which one best enhances the surface features of object 100. When platforms 102 and 144 are rotated in a synchronized manner, the shading remains constant. Additionally, multiple sets of views of object 100 with different shadings can also be obtained by changing the relative position of platforms 102 and 144.
FIGS. 1, 1A, and 1B each depict embodiments wherein the object being imaged is rotated. In another embodiment of the present invention, the object remains stationary and the detector moves around the object. FIG. 1C illustrates an arrangement where a detector is translated about a stationary object. It should be noted that as the detector 110 is moved, the optics 111 remain pointed at the object 100. Detector 110 can be moved in many ways, and object 100 can be supported in many ways. In one embodiment, an unobstructed view of object 100 is obtained by suspending it from very thin wires. Detector 110 is translated about object 100. If object 100 is very large, detector 110 could be mounted on, for example, a helicopter and flown around object 100. It is not necessary that the motion of detector 110 be exactly circular around object 100. The angular and radial components of the motion of detector 110 with respect to object 100 can be computationally analyzed, as will be appreciated by those skilled in the art. As long as the position of detector 110 is measured and recorded, the relative angular position of detector 110 with respect to object 100 can be determined for each image taken by detector 110. Methods of determining the position of detector 110 include using GPS or a laser positioning system. Once the angular component of the motion is analyzed and the radial component is calculated, the system compensates for the radial component, and the images generated by detector 110 can be processed similarly to the images generated by a system that includes a rotating object and a stationary detector.
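The computational analysis of the detector's motion mentioned above can be sketched as follows: given the recorded detector positions and an assumed known object location, each image's relative angular position and radial distance fall out of simple trigonometry. The planar geometry and function names here are assumptions made for illustration.

```python
import math

def detector_track(positions, object_center):
    """Convert recorded detector positions (e.g. from GPS or a laser
    positioning system) into the angular and radial components of the
    detector's motion about the object, so the images can be treated
    like views of a rotating object seen by a stationary detector.

    positions: list of (x, y) detector locations, one per image
    object_center: (x, y) of the stationary object
    Returns a list of (angle_radians, radius) pairs.
    """
    cx, cy = object_center
    return [(math.atan2(y - cy, x - cx), math.hypot(x - cx, y - cy))
            for x, y in positions]
```

The radius component of each pair is what the system would compensate for (e.g. by rescaling the image) before processing the series like a fixed-radius rotation.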
FIG. 1D illustrates an embodiment of the present invention which uses a multiple number of detectors instead of moving a single detector. A top view of object 100 is shown, and a set of detectors 110 is provided at different angular displacements with respect to object 100. The advantage of this embodiment is that no motion is required and the need for motors, encoders, and rotatable supports is limited. The image of object 100 captured by each detector is angularly displaced with respect to the images captured by the other detectors, and so the images may be processed in a similar manner as successive images taken by one moving detector.

The cost of multiple detectors 110 may be less than the cost of a rotatable drive or a mechanism for moving detector 110 and recording the position of detector 110. Another advantage of this approach is that all of the images of object 100 can be created simultaneously.
FIGS. 1 through 1D depict various embodiments for creating multiple images of object 100 with object 100 and detector 110 at different relative angular displacements. Each of these systems provides two dimensional color images of object 100 observed at different angles. This two dimensional information is converted into a three dimensional surface model of object 100 by the process and apparatus of the present invention.
FIG. 2 illustrates in detail the architecture of processing system 120 used in some embodiments. A microprocessor 200 is connected to a memory bus 202, and memory bus 202 is connected to a RAM 204 and a ROM 206. Microprocessor 200 is also connected to an input/output ("I/O") bus 208. A video interface 210 is coupled to I/O bus 208 to control monitor 124, as is detector interface 212. Detector interface 212 buffers and processes data from the detector and also carries output commands to the detector from microprocessor 200. In certain embodiments where a moving detector is used, the detector provides its own control and records its own position. In such embodiments, the detector/processor interface need only be capable of transferring data from the detector, including both image and detector position data, to the processor storage system.
Mass storage 214 (such as a hard disk drive) is also connected to input/output bus 208 and provides storage capacity for the multiple images generated by the optical system. Removable storage 216 (such as a floppy disk drive) also provides a way of transferring data files to and from processing system 120 and another processing system. Alternatively, communications interface 218 can be used to transfer files as well. Communications interface 218 may be connected to a local area network ("LAN") or wide area network ("WAN") for communication with other workstations. Position controller 220 is connected to input/output bus 208 and provides control to a motor in embodiments where processing system 120 provides control commands for rotating object 100. In such embodiments, position detector 222 receives data from an encoder so that processing system 120 may keep track of the position of object 100. Lighting control 224 is also connected to input/output bus 208 and is used to control the position of lights which may be moved with respect to object 100. Lighting control 224 also controls the intensity of those lights.

The architecture shown for processing system 120 in FIG. 2 is capable of supporting any of the embodiments shown in FIGS. 1-1D. If the object is to be rotated, position controller 220 and position detector 222 provide control of the rotation. Position information about object 100 can be integrated with image data from interface 212 and stored in mass storage 214. Movement and intensity control of the light is controlled by lighting control 224. If an autonomous detector is used, data about the detector position and images captured by the detector can be transferred to processing system 120 via communications interface 218 or removable storage 216. Multiple detector interfaces are provided to control a multiple number of detectors in embodiments which use more than one detector. As described above, a three dimensional surface model can be computed using microprocessor 200 and the data contained in mass storage 214, or, alternatively, the data in mass storage 214 can be transferred to a more powerful image processing system.
FIG. 3A is a flow diagram for the process of the present invention of obtaining multiple images of a rotating object. Preferably, the method is implemented on a processing system 120. The process starts at step 300, and the user places the object on the rotatable platform in step 302. The object begins to rotate while it is being imaged by a detector. In step 304, the processor checks whether the required number of images have already been captured or taken. If the required number of images have been captured, then the process is finished at step 306. The two dimensional image data is then ready to be taken to an image processor for generation of a three dimensional surface model. If more images are to be captured, then control is transferred to step 308, and a command is sent to the detector to capture an image. The image is preferably pre