United States Patent [19]
Miramonti et al.

US005864640A
[11] Patent Number: 5,864,640
[45] Date of Patent: Jan. 26, 1999

[54] METHOD AND APPARATUS FOR OPTICALLY SCANNING THREE DIMENSIONAL OBJECTS USING COLOR INFORMATION IN TRACKABLE PATCHES

[75] Inventors: John L. Miramonti, West Lebanon, N.H.; Frederick E. Mueller, San Francisco, Calif.

[73] Assignee: Wavework, Inc., Tiburon, Calif.

[21] Appl. No.: 738,437

[22] Filed: Oct. 25, 1996

[51] Int. Cl.6: G06K 7/00
[52] U.S. Cl.: 382/312; 382/108; 382/154; 382/162; 382/167; 382/285; 345/419; 345/425; 345/426; 345/430; 345/431
[58] Field of Search: 382/154, 285, 312, 162, 167, 108; 345/419, 425, 431, 430, 426
Primary Examiner-Leo H. Boudreau
Assistant Examiner-Ishrat Sherali
Attorney, Agent, or Firm-Hickman & Martine, LLP
[57] ABSTRACT

The invention provides a three dimensional digital scanner which includes a multiple view detector which is responsive to a broad spectrum of visible light. The multiple view detector is operative to develop a plurality of images of a three dimensional object which is being scanned. The plurality of images are taken from a plurality of relative angles with respect to the object, and the plurality of images depict a plurality of surface portions of the object. A digital processor including a computational unit is coupled to the detector and is responsive to the plurality of images so that it develops 3-D coordinate positions and related image information for the plurality of surface portions of the object. A three dimensional image of the object to be scanned is thus developed by the digital processor. The data developed includes both shape and surface image color information.

17 Claims, 15 Drawing Sheets
[Drawing sheets 1 through 15: the figures survive in the OCR only as captions, reference numerals, and flow-chart box text, summarized below.]

[Sheet 1: FIG. 1, the image-capture system; reference numerals not recovered.]
[Sheets 2-4: FIGS. 1A, 1B, and 1C; no text recovered.]
[Sheet 5: FIG. 1D, multiple detectors 110 arranged around object 100.]
[Sheet 6: FIG. 2, image acquisition system architecture: microprocessor 200, memory bus 202, RAM 204, ROM 206, I/O bus 208, video card 210, detector interface 212, mass storage, removable storage, communications interface 218 (WAN/LAN), position controller 220, motor 222, encoder 224, lighting/lights, detector, and monitor.]
[Sheet 7: FIG. 3A (steps 300-310): place object on rotatable platform; capture image; preprocess and store. FIG. 3B (steps 350-360): identify silhouettes; determine set of tracking points; develop radii for tracking points; output coordinates and color values.]
[Sheet 8: FIG. 4 (steps 400-406): find difference relative to previous image; apply compression technique. FIGS. 5A-5C: R(I), G(I), B color vectors and blue-channel change records for a line of pixels at times 0-3 (e.g. "NO BLUE", "X (NC)", "2 (+B)", "1 (-B)").]
[Sheet 9: FIG. 6 (steps 600-608): apply filter kernel to image; find silhouette edges. FIG. 7 (steps 700-706): move in from left edge of image to find potential left edge(s); move in from right edge of image along scan line to find potential right edge(s); use heuristics to determine left and right edges of the object.]
[Sheet 10: FIG. 8 (steps 800-814): done with images?; locate vertical centerline of image; search area for trackable patches; mark trackable patches; move to next area.]
[Sheet 11: FIG. 8A, object 100 with trackable patches 850, 852, 854.]
[Sheet 12: FIG. 9 (steps 900-944): choose patch and initial image; calculate expected path of patch in the image; find exact position of patch and store for that image; update patch kernel; go to next image; filter data; run RLS; determine radius.]
[Sheet 13: FIGS. 9A and 9B, object 100, detector 110, and patch paths 950-958 across angularly displaced images.]
[Sheet 14: FIGS. 9C-9E, reference numerals 960-968; graph axes labeled X and R.]
[Sheet 15: FIG. 10 (steps 1000-1050): convert coordinate data to desired coordinate system; convert color data to desired color system; perform interpolation and decimation; store in data structure.]
METHOD AND APPARATUS FOR OPTICALLY SCANNING THREE DIMENSIONAL OBJECTS USING COLOR INFORMATION IN TRACKABLE PATCHES

TECHNICAL FIELD

This invention relates generally to optical scanners, and more particularly to optical scanners for providing a digital representation of three dimensional objects.
BACKGROUND ART

Methods for successfully obtaining two dimensional ("2-D") color image data for objects have been developed. This process is commonly known as two dimensional scanning or digitizing. When an object is scanned, a digital data file is created which contains image data including color information which is associated with a set of two dimensional points or coordinates. The color information is obtained by an optical detector or set of optical detectors that are typically organized in a one or two dimensional array.

Matching the color information with the correct two dimensional point or location is not a significant problem in two dimensional scanning, since the two dimensional point on which the optical detector is focused is the same point that is associated with the color information obtained by the detector. The color information is mislocated only to the extent that there is some error in the location of the point on which the detector is focused (e.g., an error introduced by the optical system), and that error can readily be minimized.

The problem of associating color information with three dimensional ("3-D") objects is not so easily solved. This is because prior art methods obtain color information with a two dimensional scanning method, while position information is obtained by a three dimensional scanning method. The mapping of the 2-D color information to the 3-D position information is a complicated process which is prone to significant error.

Many methods exist for obtaining the three dimensional location of the surface points of the object. One such method is a system which uses a laser range finder to scan the object and record the distance between the known three dimensional location of the range finder and the measured location of the surface of the object. The result of using this method or other methods of generating three dimensional surface models is a set of three dimensional points which accurately represent the surface of the object. A characteristic of this method and other methods of obtaining a three dimensional surface model is that it is inherently monochromatic; that is, no color information is obtained in the process. If three dimensional color information is desired, then it must be generated by somehow combining or conformally mapping the two dimensional color information onto the three dimensional surface model.

The problem of conformally mapping the two dimensional color information onto the three dimensional surface model is difficult, and it is common for mismatching of color information with the three dimensional points to occur. The problem may be visualized by imagining a white statue or bust of a person's head and a color photograph of the same person's face. The photograph cannot simply be projected onto the bust to transfer the correct color information to the correct points on the bust, or significant distortion will occur. A significant amount of judgment must be exercised in order to correctly associate the color information from the photograph with the correct surface points on the bust. Similarly, it is difficult to accurately associate color information obtained from two dimensional optical detectors with the correct points on a three dimensional surface model. Another problem in the prior art is that color information is not used to determine surface locations, which means less than the total amount of information that is available is being used. Furthermore, both a 2-D and a 3-D system are required, which adds cost.

What is needed is a way of generating a set of three dimensional points representing a surface in such a way that the three dimensional points are already associated with color data, so that conformally mapping separately generated color data onto the set of three dimensional surface points is not necessary. Furthermore, it is desirable to utilize all available frequencies of light to determine surface point positions, to maximize the accuracy of the scanning process and to eliminate a separate 3-D scanning step.

DISCLOSURE OF THE INVENTION
Accordingly, the present invention provides a system and method for using the color information from a series of two dimensional color images to derive the three dimensional location in space of the surface points which produced the color images. Because the color information itself is used to derive the three dimensional location of the surface points, there is no need to conformally map separately generated color information onto the derived three dimensional surface points. The points are derived from color information and so are already associated with the correct color information. Also, the use of the color information increases the accuracy of the three dimensional location of the surface points.

In one embodiment, the present invention provides a three dimensional digital scanner which includes a multiple view detector which is responsive to a broad spectrum of visible light. The multiple view detector is operative to develop a plurality of images of a three dimensional object which is being scanned. The plurality of images are taken from a plurality of relative angles with respect to the object, and the plurality of images depict a plurality of surface portions of the object. A digital processor including a computational unit is coupled to the detector and is responsive to the plurality of images so that it develops 3-D coordinate positions and related image information for the plurality of surface portions of the object. A three dimensional image of the object to be scanned is thus developed by the digital processor. The data developed includes both shape and surface image color information.

In another embodiment, a three dimensional color digital scanner includes a color detector responsive to a broad spectrum of visible light to develop a plurality of images of a three dimensional object. A rotary object support having an axis of rotation allows the detector to develop a plurality of images of a three dimensional object. The plurality of images depict a plurality of surface portions of the object. A digital computer is coupled to the detector. The computer tracks patches of the surface portions of the object to determine coordinates of the patches as a function of the rotation of the rotary object support and determines radii of the patches from the axis of rotation.
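To make the tracking geometry concrete: for a detector that is distant relative to the object (so the projection is approximately orthographic), a patch at radius r from the axis traces a sinusoid in the rotation angle, and r is the amplitude of that sinusoid. The following Python sketch illustrates this relationship only; it is not the patent's specific algorithm, and the least-squares formulation and function name are assumptions made for this example.

```python
import numpy as np

def fit_patch_radius(angles, x_positions):
    """Fit x(theta) = x0 + a*cos(theta) + b*sin(theta) to a patch's tracked
    horizontal image positions.  Under orthographic projection this model
    holds for a point rotating about a vertical axis, with the radius (in
    image units) equal to sqrt(a^2 + b^2) and x0 the image position of the
    axis of rotation.

    angles:      platform rotation angles in radians (from the encoder)
    x_positions: the patch's horizontal image coordinate in each image
    """
    angles = np.asarray(angles, dtype=float)
    x = np.asarray(x_positions, dtype=float)
    # Design matrix for a linear least-squares solve in (x0, a, b).
    A = np.column_stack([np.ones_like(angles), np.cos(angles), np.sin(angles)])
    (x0, a, b), *_ = np.linalg.lstsq(A, x, rcond=None)
    radius = float(np.hypot(a, b))
    phase = float(np.arctan2(a, b))  # x(theta) = x0 + radius*sin(theta + phase)
    return radius, phase, float(x0)
```

For example, a patch at radius 2 tracked through a full rotation in 10 degree steps yields a fitted radius of 2.0 (up to tracking noise), which is the quantity the computer needs in order to place the patch in three dimensions.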
In another embodiment, a method for scanning a three dimensional object includes developing a plurality of images of a three dimensional object taken from a plurality of relative angles with respect to the object. The plurality of images depict a plurality of surface portions of the object to be scanned. 3-D coordinate positions and related image information about the plurality of surface portions of the object are computed from the plurality of images such that a three dimensional image of the object is developed that includes both shape and surface image information.
In another embodiment, a method for determining three dimensional coordinates of a surface portion of an object includes obtaining a plurality of images of the surface portion of the object and identifying a trackable patch of the surface portion in an initial image. An initial set of two dimensional coordinates of the trackable patch in the initial image is determined, along with at least one additional set of two dimensional coordinates of the trackable patch in another of the images. A radial coordinate of the trackable patch is determined, and then a set of three dimensional coordinates of the trackable patch is determined from the radial coordinate of the trackable patch.

In another embodiment, a method for determining three dimensional coordinates of a surface portion of an object includes rotating the object about an axis of rotation so that a plurality of images of the surface portion of the object are obtained as the object rotates about the axis of rotation. A trackable patch is identified and the two dimensional coordinates of the trackable patch are determined. The movement of the trackable patch is tracked as a function of the rotation of the object. A radial distance of the trackable patch from the axis of rotation is determined based on the movement of the trackable patch as a function of the rotation of the object, and three dimensional coordinates of the surface portion of the object are derived from the coordinates of the trackable patch and the radial distance of the trackable patch from the axis of rotation.
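Stated as geometry, with symbols chosen here purely for illustration: taking the axis of rotation as the z axis, a trackable patch at radial distance r and angular offset \varphi has, when the platform is at angle \theta, the three dimensional coordinates

\[
(x, y, z) \;=\; \bigl(\, r\cos(\theta + \varphi),\; r\sin(\theta + \varphi),\; h \,\bigr),
\]

where h is the patch's height along the axis, read directly from its vertical image coordinate. This is the sense in which the radial distance, together with the tracked two dimensional coordinates, fixes the patch's position in space.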
The present invention provides a system and method for obtaining 3-D surface information that is linked to color information without the need to conformally map 2-D color data onto a 3-D surface. The accuracy of the system is enhanced by the use of color data, and the cost of the system is reduced because the 3-D surface is derived from a series of 2-D images. These and other advantages of the present invention will become apparent upon reading the following detailed descriptions and studying the various figures of the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a system for obtaining a series of two dimensional color images of an object and processing those images to obtain a three dimensional model of the surface of the object.
FIG. 1A illustrates an alternative embodiment of the present invention which enables the top and bottom portions of an object to be scanned.
FIG. 1B illustrates another embodiment of the present invention which produces enhanced shading of an object.
FIG. 1C illustrates an arrangement where a detector is translated about a stationary object.
FIG. 1D illustrates an embodiment of the present invention which uses a multiple number of detectors instead of moving a single detector.
FIG. 2 illustrates in detail an architecture of an image acquisition system.
FIG. 3A is a flow diagram illustrating a process of obtaining multiple images of a rotating object.
FIG. 3B is a flow diagram illustrating a process for generating three dimensional surface data from the two dimensional images of the object.
FIG. 4 is a flow diagram illustrating a process performed on the images before they are stored.
FIG. 5A illustrates the vector nature of the color data obtained.
FIG. 5B illustrates an example of blue color data at times 0, 1, 2, and 3 for a line of pixels.
FIG. 5C illustrates how the data can be compressed by recording only the changes in the color data.
FIG. 6 is a flow diagram illustrating a process for identifying the silhouette of the object in each image.
FIG. 7 is a flow diagram illustrating a process for finding silhouette edges along each scan line.
FIG. 8 is a flow diagram illustrating a process for determining a set of trackable patches.
FIG. 8A illustrates how to search an image for trackable patches.
FIG. 9 is a flow diagram illustrating a process for determining the radius of the location of patches on the surface of the object as the object is rotated.
FIG. 9A illustrates a set of patch tracking limits.
FIG. 9B illustrates the motion of trackable patches in different images with different angular displacements.
FIG. 9C illustrates the determination of an exact position of the patch in an image.
FIG. 9D is a graph which illustrates the filtering of raw data points.
FIG. 9E is a graph which illustrates how the radius is determined from the points representing the path of the trackable patch across angularly displaced images.
FIG. 10 is a flow diagram illustrating the post processing that occurs once the radius of the trackable patch is known.

BEST MODES FOR CARRYING OUT THE INVENTION
In FIG. 1, an embodiment of the present invention includes a system for obtaining a series of two dimensional color images of an object and processing those images to obtain a three dimensional model of the surface of the object. An object 100 which is to be digitized is placed on a rotatable platform 102. A motor 104 is provided to drive rotatable platform 102 via a shaft 106. A position encoder 108 detects the angular position of rotatable platform 102 and generates an electrical signal which represents the angular position of rotatable platform 102. An optical detector 110 (e.g., a color video camera) views object 100 and creates a two dimensional color image of object 100.

As object 100 is rotated by rotatable platform 102, detector 110 captures a series of color images of object 100. Each color image taken at a different time is associated with an angular rotation of object 100 about an axis of rotation, "A", which runs through shaft 106. Information about the angular position of object 100 is obtained from position encoder 108. Thus, each "snapshot" or image of object 100 taken by detector 110 from a different view is associated with data about the angle of rotation of object 100 with respect to detector 110. An image input processing system 120 ("computer") controls the image acquisition process and records the acquired images along with the associated angular position data. That is, processing system 120 is connected to detector 110 and receives data for each image or snapshot taken of object 100 from detector 110, and position encoder 108 sends angular position information to processing system 120, so that processing system 120 can associate the image data from detector 110 with the angular position data taken at the same time. In other embodiments, detector 110 is a film camera and processing system 120 receives data from a digitizer which digitizes the film images from detector 110.
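A minimal sketch of this acquisition loop follows. The device interfaces are assumed for illustration only; the patent does not specify an API. The point is simply that each captured frame is stored together with the encoder angle read at the same time.

```python
def acquire_views(detector, encoder, num_views):
    """Capture a series of images of the rotating object, pairing each image
    with the angular position of rotatable platform 102 at capture time.

    detector: stand-in for the camera interface (provides grab_frame())
    encoder:  stand-in for position encoder 108 (provides read_angle())
    Returns a list of (angle, image) pairs for later processing.
    """
    views = []
    for _ in range(num_views):
        angle = encoder.read_angle()   # angular position data
        image = detector.grab_frame()  # two dimensional color image
        views.append((angle, image))   # associate the image with its angle
    return views
```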
Processing system 120 includes a processing unit 122 and a monitor 124 and also controls motor 104. A monitor 124 can display a current image 126 being captured by detector 110 or other information about the capturing process.
Once processing system 120 has obtained a series of images, those images are transferred to an image processor 130 ("computer"). Image processor 130 can receive data from processing system 120 in a number of different ways. Image processor 130 can be directly connected to processing system 120 via direct connection 132, or data from processing system 120 can be transferred to a removable storage medium such as disk 134 which may be read by image processor 130. Processing system 120 may also transfer data to image processor 130 via the Internet or a modem connection. Image processor 130 includes processing unit 136 and also includes monitor 138.

In other embodiments, processing system 120 and image processor 130 are combined on a single computer. The advantage of separating the functions of processing system 120 and image processor 130 is that the data acquisition and storage function performed by processing system 120, together with control of the data acquisition system, does not require a complex or powerful processor. On the other hand, image processor 130 receives data representing a series of two dimensional images and performs complex and computationally intensive operations on that data to produce a three dimensional surface model. Image processor 130 is therefore, given current technology, likely to be a more powerful (and costly) computer than processing system 120. If that is the case, then it is economically beneficial to utilize a large number of relatively cheap processors for data acquisition and temporary storage and to send data from those relatively cheap systems to a smaller number of image processors which generate the three dimensional surface model from the set of two dimensional color images.
FIG. 1A illustrates an alternative embodiment of the present invention which enables the top and bottom portions of an object to be scanned. Again, object 100 is supported by rotatable platform 102 which is driven by motor 104. In this embodiment, shaft 107 engages the edge of rotatable platform 102, so that motor 104 and shaft 107 do not obscure the image of the bottom of object 100. Rotatable platform 102 is made from a transparent material so that the bottom of object 100 may be viewed through rotatable platform 102. A set of mirrors 109 is placed within the field of view of detector 110 so that images of the top and bottom surfaces of object 100 are captured by detector 110 in addition to the side views.
FIG. 1B illustrates another embodiment of the present invention which is designed to produce contrast enhancing shading of object 100. Again, object 100 is supported by rotatable platform 102 which is driven by a motor 104 via a shaft 106. A second motor 142 also drives a rotatable platform 144 via shaft 146. Encoder 148 generates data representative of the rotational position of rotatable platform 144 and transmits that data to processing system 120. Likewise, motor 142 receives control commands from processing system 120. A light 150 is mounted on rotatable platform 144 to provide illumination of object 100. Light 150 is oriented to provide contrasting illuminated and shaded portions on object 100 which aid in the tracking of features on the surface of object 100. Because light 150 is mounted on rotatable platform 144, which is separately controllable by processing system 120, different orientations of light 150 with respect to object 100 may be checked to determine which one best enhances the surface features of object 100. When platforms 102 and 144 are rotated in a synchronized manner, the shading remains constant. Additionally, multiple sets of views of object 100 with different shadings can also be obtained by changing the relative position of platforms 102 and 144.
FIGS. 1, 1A, and 1B each depict embodiments wherein the object being imaged is rotated. In another embodiment of the present invention, the object remains stationary and the detector moves around the object. FIG. 1C illustrates an arrangement where a detector is translated about a stationary object. It should be noted that as the detector 110 is moved, the optics 111 remain pointed at the object 100. Detector 110 can be moved in many ways and object 100 can be supported in many ways. In one embodiment, an unobstructed view of object 100 is obtained by suspending it from very thin wires. Detector 110 is translated about object 100. If object 100 is very large, detector 110 could be mounted on, for example, a helicopter and flown around object 100. It is not necessary that the motion of detector 110 be exactly circular around object 100. The angular and radial components of the motion of detector 110 with respect to object 100 can be computationally analyzed, as will be appreciated by those skilled in the art. As long as the position of detector 110 is measured and recorded, the relative angular position of detector 110 with respect to object 100 can be determined for each image taken by detector 110. Methods of determining the position of detector 110 include using GPS or a laser positioning system. Once the angular component of the motion is analyzed and the radial component is calculated, the system compensates for the radial component and the images generated by detector 110 can be processed similarly to the images generated by a system that includes a rotating object and a stationary detector.
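As a rough illustration of that decomposition (a sketch under stated assumptions, not the patent's method; the simple pinhole scale model and the names are invented for this example), the measured detector positions can be converted into relative angles and per-image scale corrections:

```python
import math

def decompose_detector_motion(detector_positions, object_position, nominal_radius):
    """Split each measured detector position into angular and radial
    components relative to a stationary object, and compute a scale factor
    that compensates for the varying radius.

    detector_positions: list of (x, y) detector locations (e.g. from GPS)
    object_position:    (x, y) location of object 100
    nominal_radius:     radius to which all views are normalized
    Returns a list of (angle, radius, scale) triples, one per image.
    """
    ox, oy = object_position
    results = []
    for dx, dy in detector_positions:
        angle = math.atan2(dy - oy, dx - ox)    # angular component
        radius = math.hypot(dx - ox, dy - oy)   # radial component
        # Under a pinhole model, apparent size varies as 1/radius, so
        # multiplying image coordinates by radius/nominal_radius maps each
        # view to a common effective radius (assumed model, for illustration).
        results.append((angle, radius, radius / nominal_radius))
    return results
```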
FIG. 1D illustrates an embodiment of the present invention which uses a multiple number of detectors instead of moving a single detector. A top view of object 100 is shown, and a set of detectors 110 is provided at different angular displacements with respect to object 100. The advantage of this embodiment is that no motion is required and the need for motors, encoders, and rotatable supports is limited. The image of object 100 captured by each detector is angularly displaced with respect to the images captured by the other detectors, and so the images may be processed in a similar manner as successive images taken by one moving detector.

The cost of multiple detectors 110 may be less than the cost of a rotatable drive or a mechanism for moving detector 110 and recording the position of detector 110. Another advantage of this approach is that all of the images of object 100 can be created simultaneously.

FIGS. 1 through 1D depict various embodiments for creating multiple images of object 100 with object 100 and detector 110 at different relative angular displacements. Each of these systems provides two dimensional color images of object 100 observed at different angles. This two dimensional information is converted into a three dimensional surface model of object 100 by the process and apparatus of the present invention.
FIG. 2 illustrates in detail the architecture of processing system 120 used in some embodiments. A microprocessor 200 is connected to a memory bus 202, and memory bus 202 is connected to a RAM 204 and a ROM 206. Microprocessor 200 is also connected to an input/output ("I/O") bus 208. A video interface 210 is coupled to I/O bus 208 to control monitor 124, as is detector interface 212. Detector interface 212 buffers and processes data from the detector and also carries output commands to the detector from microprocessor 200. In certain embodiments where a moving detector is used, the detector provides its own control and records its own position. In such embodiments, the detector/processor
interface need only be capable of transferring data from the detector, including both image and detector position data, to the processor storage system.

Mass storage 214 (such as a hard disk drive) is also connected to input/output bus 208 and provides storage capacity for the multiple images generated by the optical system. Removable storage 216 (such as a floppy disk drive) also provides a way of transferring data files to and from processing system 120 and another processing system. Alternatively, communications interface 218 can be used to transfer files as well. Communications interface 218 may be connected to a local area network ("LAN") or wide area network ("WAN") for communication.
