US 20050237581A1

(19) United States
(12) Patent Application Publication
Knighton et al.

(10) Pub. No.: US 2005/0237581 A1
(43) Pub. Date: Oct. 27, 2005
`
`(54) HAND HELD PORTABLE THREE
`DIMENSIONAL SCANNER
`
`(76)
`
`Inventors: Mark S. Knighton, Santa Monica, CA
`(US); Peter J. DeLaurentis, Los
`Angeles, CA (US); William D.
`McKinley, Manhattan Beach, CA (US);
`David S. Agabra, Pacific Palisades, CA
`(US)
`
`Correspondence Address:
`BLAKELY SOKOLOFF TAYLOR & ZAFMAN
`12400 WILSHIRE BOULEVARD
`SEVENTH FLOOR
`LOS ANGELES, CA 90025-1030 (US)
`
`(21)
`
`Appl. No.:
`
`10/830,210
`
`(22) Filed:
`
`Apr. 21, 2004
`
`Publication Classification
`
(51) Int. Cl.7 ..................................................... H04N 1/024
(52) U.S. Cl. ............................................ 358/473; 358/483
`
`(57)
`
`ABSTRACT
`
Embodiments of the invention may include a scanning
device to scan three dimensional objects. The scanning
device may generate a three dimensional model. The scanning
device may also generate a texture map for the three
dimensional model. Techniques utilized to generate the
model or texture map may include tracking scanner position,
generating depth maps of the object and generating a
composite image of the surface of the object.
`
`101.
`
`109
`
`
`
`
Patent Application Publication  Oct. 27, 2005  Sheet 1 of 12  US 2005/0237581 A1
`
`101
`
`109
`
`FIG. 1
`
`
`
`
`......
`......
`......
`
`m
`N
`•
`(!) -LL
`
`<(
`N
`•
`
`(!) -LL
`
`><
`
`..,....
`......
`C)
`
`('I)
`C)
`N
`
`..,....
`C)
`N
`
`('I)
`0
`N
`
`
`
`
FIG. 3 (block diagram):
PROCESSOR 301
MEMORY 303
IMAGER 305
CONTROLS 307
DISPLAY 311
FOCUS POSITIONING MOTOR 313
LIGHTS 315
COMMUNICATIONS DEVICE 317
ANTENNA 319
POSITION SENSORS 321
`
`
`
`
FIG. 4 (flowchart):
401: CAPTURE MULTIPLE IMAGES AT DISTINCT FOCUS POSITIONS
403: DETERMINE FOR EACH PIXEL WHICH IMAGE IS IN OPTIMUM FOCUS BASED ON LOCAL MAXIMA AND MINIMA OF LIGHT INTENSITY
405: DETERMINE FOCAL POSITION FOR EACH OPTIMUM FOCUS PIXEL BY LOOKING UP THE FOCAL POSITION CORRELATED WITH THE IMAGE WHERE THE PIXEL WAS CAPTURED
407: GENERATE TWO DIMENSIONAL IMAGE UTILIZING OPTIMUM FOCUS PIXELS
409: PRODUCE DEPTH MAP FOR THREE DIMENSIONAL DATA
411: MAP TWO DIMENSIONAL IMAGE ONTO PORTION OF THREE DIMENSIONAL REPRESENTATION OF A TARGET
`
`
`
`
FIG. 5 (flowchart):
501: CAPTURE MULTIPLE IMAGES AT DIFFERENT FOCAL SETTINGS
503: DETERMINE OPTIMAL SET OF PIXELS FROM THE SET OF IMAGES
505: ASSEMBLE COMPOSITE IMAGE OF OPTIMAL PIXELS
507: REPEAT PROCESS TO GENERATE SECOND COMPOSITE IMAGE WITH DIFFERENT IMAGE VIEWPOINT
509: BEGIN STEREO CORRELATION OF IMAGES
511: FIND CORRESPONDING FEATURES BETWEEN IMAGES
513: DETERMINE FEATURE DISTANCES BASED ON SHIFT OF FEATURE BETWEEN IMAGES
515: GENERATE DEPTH MAP
517: CORRELATE DEPTH MAP WITH COMPOSITE IMAGE
`
`
`
`
`605
`
`FIG. 6
`
`
`
`
FIG. 7 (flowchart):
701: CAPTURE FIRST IMAGE AT FIRST PATH LENGTH
703: CAPTURE SECOND IMAGE AT SECOND PATH LENGTH
705: CALCULATE DISTANCE PER PIXEL
707: CREATE DEPTH MAP FOR PORTION OF THREE DIMENSIONAL TARGET

FIG. 8 (flowchart):
801: CAPTURE WIDE AREA IMAGE
803: CAPTURE HIGHER DEFINITION IMAGE OF A TARGET AREA
805: GENERATE DEPTH MAP FOR THE TARGET AREA
807: CORRELATE HIGH DEFINITION IMAGE TO WIDE AREA IMAGE
809: CORRELATE HIGH DEFINITION IMAGE WITH THREE DIMENSIONAL REPRESENTATION OF TARGET
`
`
`
`
`907
`
`. . .......................... -.. .
`. -·:
`.
`. ·· :
`
`.··
`
`r
`
`., -- ,,,,,
`.. ·
`.,, .,,
`_ .. · ,.,,. ,,
`,
`,-'
`....
`.·
`.. ··.,,..-_.,--
`··········
`....
`.. ,,,
`--
`. . ::.: ........ .
`
`111
`
`FIG. 9
`
`.... --<:::~ -r~~r:: ,
`
`,
`
`,;.-.,A••••••••••••••••:::::::::,•
`............ .
`-········
`
`
`
`
FIG. 10 (flowchart):
1001: GENERATE HIGH DEFINITION IMAGE AND DEPTH MAP
1003: CORRELATE NEW DATA WITH THREE DIMENSIONAL REPRESENTATION OF TARGET BASED ON CURRENT POSITION
1005: GENERATE TRACKING IMAGES
1007: UPDATE POSITION
`
`
`
`
FIG. 11
`
`
`
`
`1203
`
`1201
`
`FIG. 12A
`
`1203
`~
`
`1207
`
`FIG. 12B
`
`
`
`
FIG. 13 (flowchart):
1301: GENERATE WIDE AREA IMAGE
1303: DETERMINE START POSITION
1305: DETERMINE OPTIMAL PATH
1307: MOVE IMAGER
1309: GENERATE IMAGE
1315: FINISH
`
`
`
`
`HAND HELD PORTABLE THREE DIMENSIONAL
`SCANNER
`
`BACKGROUND
`
[0001] 1. Field of the Invention

[0002] The embodiments of the invention relate to scanning.
Specifically, the embodiments relate to scanning three
dimensional objects to generate a three dimensional
representation of the objects.
`
`[0003] 2. Background
`
`[0004] Scanning technology utilizes an image sensor to
`collect light reflected from an object to generate an image of
`the object. A mirror and lens system is combined with the
`imaging device to focus the light reflected by the object onto
`the image sensor. Image sensors convert light energy into an
`electrical charge and then to a set of bits representing the
`color and intensity of the light.
`
[0005] The image sensor may be either a charge coupled
device (CCD) or a complementary metal oxide semiconductor
(CMOS) device. These individual devices are typically
arranged into an area array. The number of sensors, each
representing a pixel (short for 'picture element'), determines
the resolution of the image taken. A pixel is the smallest unit
that makes up a digital image. A pixel can represent the
shade and color of a portion of an image. The output of a set
of image sensors is encoded as a set of pixels to create a
digital image.
`
[0006] The digital image may be stored in a compressed
format such as JPEG, TIFF, or GIF. The image is then
stored in a digital storage device and may be displayed on a
monitor by a display application. The digital image is a two
dimensional image.
`
[0007] Scanning devices are used in flatbed scanners and
copying machines. These devices are large and capable of
scanning only relatively flat objects, such as paper, to create
two dimensional images of the object.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`[0008] Embodiments of the invention are illustrated by
`way of example and not by way of limitation in the figures
`of the accompanying drawings in which like references
`indicate similar elements. It should be noted that references
`to "an" or "one" embodiment in this disclosure are not
`necessarily to the same embodiment, and such references
`mean at least one.
`
`[0009] FIG. 1 is a diagram of one embodiment of a
`scanning device.
`
`[0010] FIG. 2A is a diagram of one embodiment of a lens
`housing.
`
`[0011] FIG. 2B is a diagram of one embodiment of a lens
`housing.
`
`[0012] FIG. 3 is a diagram of one embodiment of the
`components of a scanning device.
`
[0013] FIG. 4 is a flowchart of one embodiment of a
process for generating a three and two dimensional mapping
of an object.
`
[0014] FIG. 5 is a flowchart of one embodiment of a
process for generating a three dimensional model using
stereoscopy.

[0015] FIG. 6 is a diagram of one example embodiment of
a scanning device imaging an object using stereoscopy.

[0016] FIG. 7 is a flowchart for one embodiment of a
process for scanning an object by calculating radiant light
fall off.
`
`[0017] FIG. 8 is a flowchart for one embodiment of a
`process for aligning data in a scanning process.
`
[0018] FIG. 9 is a diagram of one embodiment of a
scanning device using a wide area image to correlate small
area images into a three dimensional representation.
`
[0019] FIG. 10 is a flowchart of one embodiment of a
process for aligning imaging data to generate a three
dimensional model by tracking the position of the scanning
device.
`
`[0020] FIG. 11 is one embodiment of a scanning device
`tracking its position utilizing image motion tracking.
`
`[0021] FIG. 12A is a diagram of one embodiment of a
`robotic arm system for scanning objects.
`
`[0022] FIG. 12B is a diagram of one embodiment of the
`robotic arm system with a turntable.
`
`[0023] FIG. 13 is a flowchart of one embodiment of a
`process for scanning an object utilizing a robotic arm
`system.
`
`DETAILED DESCRIPTION
`
[0024] FIG. 1 is one embodiment of a hand held portable
scanning device. The scanning device may be used to
generate a three dimensional representation of any type of
object. As used herein, three dimensional representations
may be any type of digital modeling, abstraction or similar
techniques that may utilize depth maps, polygon meshes,
parametric solids, point clouds and similar data structures to
create and store a three dimensional representation of the
scanned object. In one embodiment, the scanner may include
a lens 101 or set of lenses. Lens 101 may focus light on one
or more image sensing arrays (ISAs). In one embodiment, the
ISAs may be a charge coupled device (CCD), complementary
metal oxide semiconductor (CMOS) sensor, or similar
imaging array. Lens 101 may include multiple lenses or
focal points to focus light on an imaging device or set of
imaging devices. Lens 101 may be moveable to alter the
focal point or focus of incoming light. In another embodiment,
lens 101 may be replaced by or supplemented by a
reflector, light guide or similar article, any of which may be
referred to as an "optical element." By varying the focal
settings, different aspects of the relief of an object may be
brought into focus on an ISA. In one embodiment, an optical
system having one or more optical elements distributes a
same view of a target to a plurality of ISAs, each having a
different focal range relative to the target.
`
`[0025]
`In one embodiment, the scanning device may be a
`"hand held" device. As used herein, a hand held device may
`be any device of a small size and weight suitable to be held
`and used by a human hand. Movement and positioning may
`be affected by the user without mechanical assistance. This
`is to be distinguished from a "hand directed" device in which
`
`
`
`
`a capture end is tethered by an arm to a fixed location but
`movable relative to that location by application of manual
`force.
`
[0026] In one embodiment, lens 101 may fill an aperture
of housing 111. Housing 111 may also contain one or more
ISAs and actuators for adjusting the focus of lens 101.
Housing 111 may be external to a main housing 115.
Housing 111 may be attached to main housing 115 by a
telescoping attachment 103. In another embodiment, lens
housing 111 may be attached to main housing 115 by a
bendable or stretchable material or similar structure to allow
lens housing 111 to access areas through small apertures and
similar obstructions. Telescoping attachment 103 or similar
structure may be attached to a set of actuators, sensors or
similar structures to allow the manipulation of lens 101
positioning from controls 107 at main housing 115, a remote
controller or similar system. Lens housing 111 positioning
may be automated by software local to the scanner or
external to the scanner. In one embodiment, lens housing 111
or the scanning device may contain a stabilizer mechanism
to maintain an orientation of lens 101 or the ISA by
compensating for movement of the scanning device during
the capture of an image.
`
`[0027]
`In one embodiment, small motors, servos, or piezo
`actuators may be used to move the relative position of the
`lens, the ISA and other optical elements. For example, this
`may be accomplished by moving the ISA, the lens, or both.
`This variance of relative positioning causes a change in the
`focus of the ISA and constitutes one example of changing
`the focal settings of the capture device. In one embodiment,
`lens housing 111 may contain a reflector between lens 101
`and the ISA to alter the focal depth from the target object to
`be scanned to the ISA In another embodiment, an ISA may
`be fabricated to allow it to capture different focused images
`at a plurality of depths within the structure of the ISA In one
`embodiment, the focus at different depths within the ISA
`permits two or more images having different depths of focus
`to be captured concurrently. Such an ISA may be thought of
`as having multiple image planes. In one embodiment, each
`image plane can be read from the ISA individually and
`independently of the other image plane(s).
`
[0028] In one embodiment, housing 111 may have a
maximum cross dimension (e.g., diameter) of less than two
inches to allow access to small spaces and through small
apertures. This flexibility permits capture of a wide array of
possible targets. "Target" as used herein generally refers to
a physical object, portion of an object and/or a collection of
objects or portions thereof.
`
[0029] In one embodiment, main housing 115 may include
a visual display 105 to show the current input from the
imaging device. In another embodiment, visual display 105
may provide a progress report showing a wide view of all or
a portion of a target in one representation and showing a
successfully captured portion of the wide view as a different
representation. This form of running progress report permits
a user to visualize the data collected in the context of the
larger target. Moreover, because the display is local to the
scanner, and the scanner is close to the target, the user need
not look away from both the target and scanning device to
get the progress report. The progress report thereby
facilitates surface capture of an arbitrary object by guiding the
user to areas of the target not yet successfully captured. As
used herein "local to" broadly means integrated into or
tightly coupled with the noun modified. Conversely, "remote
from" means at a distance from the noun modified. For
example, both a server across a distributed network and a
host PC would be regarded as remote from the unit shown
in FIG. 1, while the display 105 is local to the ISA (not
shown) whether disposed within main housing 115 or
housing 111.
`
`[0030]
`In one embodiment, the scanner may provide other
`types of feedback to guide the user in capturing a target. For
`example, the scanner may emit an audible tone either when
`the capture end is in range or out of range to perform a
`capture. Alternatively, the scanner may, for example, project
`a pattern (e.g., two dots) on the surface of the target that
`converges to a known state (e.g., a single dot) when the
`scanner is in capture range. In some embodiments, such
`other forms of feedback may be provided in addition to the
`visualization mechanism on the display discussed above.
`Notably, both of these mechanisms permit a user to view the
`target while receiving the feedback. In one embodiment,
`feedback may be provided when a portion of a target object
`has been imaged. Feedback for imaging may include an
`audible tone, indicator light, tactile feedback or similar
`feedback.
`
`[0031] Main housing 115 may also include a set of manual
`input controls 107 to allow a user to provide input to control
`the scanning device. In one embodiment, the ISA may be
`disposed within the main housing 115 but still with optical
`communication to the lens 101.
`
[0032] Main housing 115 may also include a light source
113. Light source 113 may be a light emitting diode or
similar device. In another embodiment, the scanning device
may provide multiple light sources. In a further embodiment,
the light source may be positioned within the housing
111. In an additional embodiment, the light source may be
disposed on the housing 111. In a further embodiment, light
source 113 may be moveable or rotatable to provide light
from a different angle or position. Movement and positioning
of light source 113 may be servo or motor controlled, or
similarly controlled. Movement may be directed by a
program or by the user of the scanning device. Other
embodiments may not include a light source, relying on ambient
lighting to capture images.
`
[0033] In one embodiment, main housing 115 may include
a wireless communication device 109. Wireless
communication device 109 may be a radio frequency (RF)
transmitter, cellular device, IEEE 802.11 device or similar
transmitter. In one embodiment, the wireless communication
device supports the Bluetooth standard, TCP/IP communication
and similar communication standards. In another embodiment,
the scanning device may include a wire connector for
communication with other devices. Wire communication
may utilize any wire type communication protocol or
technology such as Universal Serial Bus (USB), FireWire,
100BaseT or similar communication technology.
`
[0034] FIG. 2A is a diagram of one embodiment of a lens
housing. In one embodiment, lens housing 111 may have
light sources 201, 203 embedded within or attached to the
end of the housing, aligned roughly in parallel with the
direction of the ISA. Lens housing 111 may have a single
light source or multiple light sources 201, 203. Light sources
attached to lens housing 111 allow a user to insert lens
`
`
`
housing 111 into small apertures and interiors of objects to
obtain images of the interior surfaces or surfaces not easily
obtainable from an exterior position. The small size of the
lens housing 111 also allows the terminus to get closer to a
target, whereas large devices require a greater stand off
distance to function. Imaging from a position close to the
surface of the object permits less expensive optical
components to be used. In one embodiment, the terminus of the
image capture device requires a stand off distance of less
than 6".
[0035] Light sources 201, 203 attached in lens housing 111
provide light for imaging that may not be easily obstructed
or occluded by surfaces of the object to be imaged. In
contrast, light sources that may be attached to main housing
115 or originate external to the scanning device may be
obstructed or occluded by the surface of the target object
when the lens housing 111 is imaging an interior space or
under similar circumstances. Multiple light sources 201, 203
instead of a single light source may be used in some
embodiments to determine the three dimensional surface of
an imaged area of an object. In another embodiment, light
sources may be attached in any position to lens housing 111.
Light sources 201, 203 may be attached to actuators or
similar devices to adjust the position and direction of the
light.
[0036] FIG. 2B is a diagram of one embodiment of a lens
housing from a front elevational perspective. In one embodiment,
the cross dimensions x and y may be small to allow
the whole of lens housing 111 to enter small apertures. The
small dimensions of lens housing 111 allow the scanning
device to image surfaces that are not easily accessible or
visible from a position exterior to an object. An object that
is within a confined space may also be imaged using a lens
housing with small dimensions. The cross dimensions x, y
may both be less than two inches in length. Where both of
the cross dimensions of lens housing 111 are less than 2" and
the light sources 201, 203 are attached to or within the lens
housing 111, all light supplied by the instrument and all light
received for three dimensional imaging necessarily pass
through a terminus of the instrument with a maximum
separation of less than two inches. A two inch maximum cross
dimension for the probing end has been found suitable for
most capture applications. However, smaller maximum
cross dimensions such as 1", ½" or smaller are contemplated.
`[0037]
`In another embodiment, either dimension x or y
`may have a cross dimension less than two inches in length.
`This may allow lens housing 111 to enter most apertures,
`gaps and similar passages having a two inch or greater
`clearance. Lens housing 111 may contain light sources 201,
`203, lens 101, ISAs and similar components. Lens housing
`111 may be attached to an extendible portion of a scanning
`device having a cross section size smaller than lens housing
`111 to facilitate deployment of lens housing 111 into small
`spaces.
[0038] FIG. 3 is a block diagram of one embodiment of a
portable scanner device. The scanner may include a
processor 301. Processor 301 may be a general purpose processor
or a specialized processor. In one embodiment, processor
301 may be an application specific integrated circuit (ASIC)
or similar type of processor. In another embodiment,
processor 301 may be a general purpose processor.
`[0039]
`In one embodiment, the scanner includes memory
`303 for storing data and software for operating the scanner.
`
`Memory 303 may include multiple types of storage.
`Memory 303 may include SRAM, EPROM, FLASH,
`DRAM, a hard disk and similar types of memory devices. In
`one embodiment, memory device 303 may be removable.
`For example, memory 303 may be a memory stick or similar
`device. In another embodiment, memory 303 may be a
`combination of multiple memory devices. For example,
`memory 303 may be a combination of a hard disk and
`DRAM.
`
[0040] In one embodiment, memory 303 may store
software to operate the scanning device. Software may include
drivers for motors, actuators, or similar devices to control
the positioning of the lenses, light sources, use of position
determining devices and similar applications. Software may
also be used to generate a display of incoming data or similar
user feedback. In one embodiment, software may be used to
communicate data to other devices.
`[0041]
`In one embodiment, processor 301 may be in
`communication with imaging device 305. Processor 301
`may retrieve or receive data from imaging device 305.
`Processor 301 may process incoming data from imaging
`device 305 and utilize memory 303 as a working memory
`and storage space for incoming images. In one embodiment,
`processor 301 may store images and data in a standardized,
`or specialized format such as jpg, tiff, gif, and similar
`formats. In another embodiment, processor 301 may be
`assisted in generating a display by a specialized graphics
`processor.
`[0042]
`In one embodiment, processor 301 may execute a
`set of drivers or similar programs to operate a set of motors
`313, servos, piezo actuators, or similar devices to adjust the
`position of the lenses and to adjust the focus of the lenses
`and imager 305. A software driver and motor may also be
`present in the device to adjust the telescoping of the lenses
`or other portions of the device. In one embodiment, imager
`305 includes a CCD, CMOS or similar ISA In one embodi(cid:173)
`ment, imager 305 may include a set of sensing devices
`organized into an array. In another embodiment, imaging
`device 305 may include a matrix of image sensors posi(cid:173)
`tioned at different depths to receive light at different focal
`lengths. In such an embodiment, multiple images with
`different focus may be captured at the varying depths in a
`single capture phase.
`
`[0043]
`In one embodiment, the device may include a set of
`lights 315. Lights 315 may be light emitting diodes, incan(cid:173)
`descent lights or similar light sources. Processor 301 may
`alter the state of lights 315 by executing a driver or similar
`program to adjust power to the lights 315. Lights may
`function to illuminate an object to be imaged. In one
`embodiment, multiple lights may be present with a fixed
`spatial relationship to one another. In another embodiment,
`lights 315 may be used to produce a strobe effect or similar
`lighting effect.
`
`[0044]
`In one embodiment, the system may include posi(cid:173)
`tion detection devices 321 or position sensing devices.
`Position tracking or detection sensor 321 may be a single
`device or combination of devices. The devices may include
`a gyroscope, global position device, altimeter or similar
`device that detects the orientation and position of the device
`in three dimensional space. In one embodiment, a set of
`gyroscopes and accelerometers may be used for each of an
`x, y and z axis. The position detection or tracking device
`
`
`
`
may generate position or movement output data that indicate
the position or movement of the device, which may be used
by software executing on processor 301 or a remote
computing device to generate the three dimensional
representation of a target.
`[0045]
`In one embodiment, the device may include a
`communications device 317. The communication device
`may transmit data to and receive data from external com(cid:173)
`puting or communication devices. In one embodiment, com(cid:173)
`munications device 317, may be a wireless communication
`device or similar device. Communication device 317 may
`utilize Ethernet, IP, IPX, IEEE 802.11 and similar commu(cid:173)
`nication protocols and technologies. In one embodiment,
`communication device 317 may be directly connected to an
`external system such as a personal computer, workstation,
`server or similar system. In another embodiment, wireless
`communication may utilize Bluetooth, cellular, IEEE 802.11
`or similar communication protocols or technology. Commu(cid:173)
`nication device 317 may be connected to an antenna 319 or
`similar device to transmit and receive wireless signals. In
`one embodiment, the communication device 317 may com(cid:173)
`municate with a network interface to send data to a remote
`system on a distributed network such as a local area network
`(LAN) or the Internet. In another embodiment, communi(cid:173)
`cation device 317 may communicate via a wire line using a
`technology such as a universal serial bus, firewire, 100
`BaseT or similar communication medium or protocol.
`[0046]
`In one embodiment, the device may include a set of
`controls 307 or input devices. Controls 307 may be a set of
`buttons, a touchpad, or similar input devices. Input from
`control device 307 may be handled by a driver or similar
`software that is executed by processor 301 to manipulate the
`settings and actions of the capture device.
[0047] FIG. 4 is a flowchart of one embodiment of a
process for generating three and two dimensional mappings
of an object. In one embodiment, the mapping process may
be initiated by capturing a set of images of the object to be
mapped (block 401). In one embodiment, any number of
images greater than one may be taken of the target. Plural
images taken at different focal settings of a same portion of
the target may be used to derive a full focus composite image
and/or depth measurements for that portion of the target as
described below. In one embodiment, the images at differing
focal settings may be obtained by altering the position of a
lens or set of lenses relative to the ISA. Alternatively, the
images may be obtained by utilizing multiple lenses or ISAs
to capture multiple images of varying focal depths. In one
embodiment, the multiple images may be captured
simultaneously or in a short period of time. This should minimize
or eliminate the effects of motion on the capture device
whenever images are captured at different depths. The set of
images may include the same area of a target at varying focal
settings. In one embodiment, each captured image has a
small depth of focus. Depth of focus may be a range in
which an image or pixel is in satisfactory visual focus. Thus,
only pixels corresponding to points on the target within the
range will be in focus in a particular image. The images may
be captured with a resolution of finer than 300 pixels per
inch so that optimum focus for a surface of a target can be
determined even when the surface appears substantially
homogenous to an unaided human eye.
`[0048]
`In one embodiment, the device may analyze pixels
`of a set of images captured to determine which of the pixels
`
`in a set of corresponding pixels (e.g., corresponding to a
`same point on or area of a target) is in optimum focus (block
`403). Corresponding pixels may be determined based on the
`position of the pixel in an imaging array, by the color or
`encoding of a captured pixel, by correlation of pixels based
`on positioning sensor data or by similar methods. For
`example, the device may capture three complete images
`simultaneously in three separate ISAs, set of lenses or a
`combination thereof. The imaging devices may have match(cid:173)
`ing array sizes. The pixels at the same position may be
`compared to determine which pixel is in optimum focus
`based on determining local minima and maxima of light
`intensity or brightness. Light intensity may be measured at
`the level of a pixel. Any encoding may be used and the
`encoding compared to determine the pixel with most or least
`light intensity. In one embodiment, the pixel with optimum
`focus may be found by selecting the most or least light
`intense pixel. In one embodiment, this selection is made by
`analysis of neighboring pixels. In another embodiment, the
`most and least light intense pixel may be processed with a
`mathematical function, a calibration table, or a similar
`process or combination of processes to determine which
`pixel is in optimal focus. In a further embodiment, the most
`light intense pixel may be selected for optimum focus.
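The per-pixel focus selection of block 403 can be sketched as follows. This is a minimal illustration, not the claimed method: the document describes comparing local maxima and minima of light intensity with analysis of neighboring pixels, and here local intensity variance over a small neighborhood is used as an assumed stand-in for that sharpness comparison; the function name and stack layout are also assumptions.

```python
import numpy as np

def optimum_focus_indices(stack, radius=1):
    """Given a focal stack `stack` of shape (num_images, H, W) holding
    grayscale intensities, return an (H, W) integer map giving, for each
    pixel position, the index of the image judged in optimum focus.
    Sharpness is scored per pixel as the intensity variance over a
    (2*radius+1)^2 neighborhood; higher variance is taken as sharper."""
    stack = np.asarray(stack, dtype=float)
    n, h, w = stack.shape
    scores = np.empty_like(stack)
    for i in range(n):
        # Edge-pad so every pixel has a full neighborhood.
        img = np.pad(stack[i], radius, mode='edge')
        # Collect every shifted copy of the image; the spread across
        # these shifts is the per-pixel neighborhood variance.
        windows = [img[dy:dy + h, dx:dx + w]
                   for dy in range(2 * radius + 1)
                   for dx in range(2 * radius + 1)]
        scores[i] = np.var(np.stack(windows), axis=0)
    # For each pixel, pick the image with the highest sharpness score.
    return np.argmax(scores, axis=0)
```

Against a stack containing a uniform (defocused) image and a high-contrast (focused) image, every pixel of the returned map selects the high-contrast image.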
`
`[0049]
`In one embodiment, the processing to discern the
`local maxima and local minima and the determination of
`which pixels are in optimum focus may be performed local
`to the capture device. In another embodiment, all the cap(cid:173)
`tured data may be offloaded to a host processor for further
`processing to discern which pixels are optimally focused.
`
`[0050]
`In one embodiment, the focal position may be
`determined that corresponds to each optimum focus pixel
`(block 405). As used herein an optimum focused pixel may
`be the pixel having the best focus of the pixels captured
`corresponding to the same point or area on the target. The
`focal position may be measured in terms of device position(cid:173)
`ing detected by positioning sensors, by calculation of focal
`lengths based on the position of the lenses and imaging
`device or similar methods. The focal position of each
`captured pixel may be recorded when each image is cap(cid:173)
`tured. When each optimum focus pixel is determined, the
`focal position of the pixel may be correlated with the pixel
`by determining the image in which it was captured. The
`focal settings of each image may be stored when captured
`and used to determine a focal position for the individual
`pixel.
`
`[0051]
`In one embodiment, the optimally focused pixels
`may be compiled into a composite two dimensional image
`(block 407). By selecting the optimally focused pixels to
`assemble into a composite image, the composite image
`generated has substantially optimum focus. The composite
`image may be stored in any format. Formats may include
`jpg, gif, tiff, or similar formats. The composite image may
`be stored locally in the device or may be transmitted to an
`external storage device or system. The composite image may
`be stored and used as the visual representation of the imaged
`region of the target being scanned. The composite image
`may be used as a texture map or bitmap to be mapped onto
`a three dimensional representation of the target being
`scanned.
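Given a per-pixel map of which image in the stack was in optimum focus, the composite image assembly and the focal-position lookup described above can be sketched together. This is an illustrative sketch, not the claimed implementation; the function name and the assumption that each image's focal setting is a single recorded number are mine.

```python
import numpy as np

def composite_and_depth(stack, index_map, focal_positions):
    """Assemble the full-focus composite by taking, at each pixel, the
    value from the image named in `index_map`, and look up each pixel's
    recorded focal position to form a depth map.

    stack           : (num_images, H, W) focal stack of pixel values.
    index_map       : (H, W) index of the optimum-focus image per pixel.
    focal_positions : focal setting recorded for each image (assumed to
                      be one scalar per image, e.g. millimetres)."""
    stack = np.asarray(stack)
    idx = np.asarray(index_map)
    # composite[y, x] = stack[idx[y, x], y, x]
    composite = np.take_along_axis(stack, idx[None, :, :], axis=0)[0]
    # depth_map[y, x] = focal position of the image that pixel came from
    depth_map = np.asarray(focal_positions)[idx]
    return composite, depth_map
```

The composite then serves as the two dimensional texture while the depth map supplies the three dimensional data to which it is correlated.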
`
`[0052]
`In one embodiment, a depth map may be produced
`from the spatial da