United States Patent [19]
DeLuca

[11] Patent Number: 6,064,354
[45] Date of Patent: May 16, 2000

[54] STEREOSCOPIC USER INTERFACE METHOD AND APPARATUS

[76] Inventor: Michael Joseph DeLuca, 1104 Claire Ave., Austin, Tex. 78703

[21] Appl. No.: 09/108,814

[22] Filed: Jul. 1, 1998

[51] Int. Cl.7 .......................... G09G 5/00
[52] U.S. Cl. .......................... 345/7; 345/8; 345/419
[58] Field of Search .......................... 345/7, 8, 9, 419; 348/42, 47, 51

[56] References Cited

U.S. PATENT DOCUMENTS

5,025,314  6/1991  Tang et al. .......................... 358/93
5,168,531 12/1992  Sigel .......................... 382/48
5,239,373  8/1993  Tang et al. .......................... 358/93
5,694,142 12/1997  Dumoulin et al. .......................... 345/9
5,767,842  1/1998  Korth .......................... 345/168

FOREIGN PATENT DOCUMENTS

08139994  9/1994  Japan

Primary Examiner-Richard A. Hjerpe
Assistant Examiner-Ronald Laneau

[57] ABSTRACT

A computer system stereoscopically projects a three dimensional object having an interface image in a space observable by a user. The user controls the movement of a physical object within the space while observing both the three dimensionally projected object and the physical object. The computer system monitors the position of the user to determine the position of the interface image within the space and further monitors the movement of the physical object to determine its position. A control signal is generated in response to the position of the physical object intersecting the position of the interface image. For example, a word processing program is indicated by an interface image such as an icon including the letter "W" three dimensionally projected within the space. The word processing program is activated when the user's finger moves within the space to touch the projected icon. The interface allows the user to observe the projected icon, physical finger and their intersection within the space. The physical object may also be extended with a stereoscopic extension image generated by the computer system in response to determining the position and orientation of the physical object.

20 Claims, 3 Drawing Sheets

GOOGLE EXHIBIT 1006
[Drawing Sheets]

Sheet 1 of 3:
FIG. 1: perspective view of the user's eyes 110 and 120, display 200, cameras 310 and 320, and stereoscopic interface image 250.
FIG. 2: display of the stereoscopic interface image, showing left and right eye images and interface image 250.
FIG. 3: determination of the position of interface image 250, using images 210 and 220, angles A1 and A2, and distances D1 and D2.

Sheet 2 of 3:
FIG. 4: a physical object intersecting the stereoscopic interface image 250, located via angles A3 and A4 and images 210 and 220.
FIG. 5: stereoscopic extension image 255 of the physical object, created with images 215 and 225 on the display.
FIG. 6: stereoscopic extension image of the physical object intersecting the interface image behind the display.

Sheet 3 of 3:
FIG. 7: block diagram showing video cameras 310 and 320 feeding pattern recognizers 312 and 322, coordinate determination means 314 and 214, stereoscopic image generation 212, an intersect monitor, and stereoscopic display 200.
FIG. 8: flow chart with steps 800 display stereoscopic image; 802 determine position of user; 804 determine position of interface image; 806 determine position/orientation of physical object; 810 extension image?; 812 display extension image; 814 redetermine position/orientation of physical object with extension image; 816 physical object & interface image intersect?; 818 generate control signal; 820 modify displayed image and/or control another device.
STEREOSCOPIC USER INTERFACE METHOD AND APPARATUS

FIELD OF THE INVENTION

This invention generally relates to the area of computer user interfaces and more particularly to virtual three dimensional user interfaces.

BACKGROUND OF THE INVENTION

Graphical user interfaces have become a standard for interfacing between a user and a computer. Such interfaces are in wide use in computer operating system interfaces produced by Apple, Microsoft and others. These interfaces are limited in that they are intended for interfacing between a user and a computer having a two dimensional display such as a CRT or LCD. A user activates the interface with a keyboard and/or a pointing device such as a mouse pointing to an icon on the display. Advancements have been made with the advent of a touch screen, which allows a user to approximately contact the icon or intended area of the graphical user interface in order to use the interface. However, contact with the touch screen can contaminate the display area of the screen with fingerprints and other types of smudges. Also, constant physical contact with the touch screen can result in its mechanical failure. Thus, what is needed is a way to contact user interface images without contacting a keyboard, a mouse or the display itself.

Three dimensional image displays are improving. Several types of three dimensional displays are known, including stereoscopic displays which display a virtual three dimensional image using filters to highlight images intended for each eye of the viewer, thereby providing a stereoscopic or three dimensional effect. Such systems alternately flash images for the left and right eye of the user and require a filter for each eye, usually included in glasses worn by the viewer. Systems in public use which require glasses may have color filters, orthogonally polarized lenses, or actively switched lenses, and the display is correspondingly modulated with left and right eye images to provide the three dimensional effect. Furthermore, stereoscopic displays which do not require glasses have been described; descriptions are included in U.S. Pat. No. 4,987,487, Jan. 22, 1991, to Ichinose et al., entitled "Method of stereoscopic images display which compensates electronically for viewer head movement," and U.S. Pat. No. 5,365,370, Nov. 15, 1994, to Hudgins, entitled "Three dimensional viewing illusion with 2D display." Yet another stereoscopic display system is completely contained in a head set worn apparatus as described in U.S. Pat. No. 5,673,151, Sep. 30, 1997, to Dennis, entitled "Image correction in a virtual reality and heads up display." The aforesaid patents are incorporated by reference.

The aforesaid stereoscopic displays allow the viewer to simultaneously observe both a stereoscopic object, appearing to be generally set apart in three dimensions from the image projection means, and a physical object, such as the hand of the user, in approximately the same perceived space. What is needed is a method and apparatus by which the intersection of the physical object and the stereoscopic object can form a user interface with a computer system.

OBJECT OF THE INVENTION

It is therefore an object of the invention to provide a three dimensional display system capable of determining an intersection of a physical object with a three dimensionally displayed object in a space where the three dimensional object is viewed, and generating a control signal in response thereto. The control signal may cause modification of the displayed image or control another device. The display system is also capable of extending the physical object with a three dimensional extension image and then using the extended image to determine the intersection.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display.

FIG. 2 shows the display of the stereoscopic interface image.

FIG. 3 shows determination of the position of the stereoscopic interface image.

FIG. 4 shows a physical object intersecting the stereoscopic interface image.

FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image.

FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display.

FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention.

FIG. 8 shows a flow chart of a process operating in accordance with the present invention.

DETAILED DESCRIPTION OF THE INVENTION

FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display. The user 100 has left and right eyes 110 and 120 which are used to view a display 200 which projects a three dimensional stereoscopic object 245 in a space between the user and the display. The stereoscopic object has a stereoscopic interface image 250. Using pattern recognition and triangulation, images from video cameras 310 and 320 are used to determine the position of physical objects within the space, such as the position of the user 100 and the user's finger 400. As will be described herein, a control signal is generated in response to the intersection of the interface image 250 and a physical object 400. For example, the stereoscopic object 245 projected by the display 200 could be the image of an open book, including readable text on the pages of the book. Interface image 250 could be an icon indicating that contact with the icon would cause a page in the book to turn. When the finger tip 400 of the user touches the icon 250, a control signal is generated causing a new image 245 of the book to be displayed with a turned page. The stereoscopic three dimensional image has the advantage of being projected in a space: no physical contact with a keyboard, mouse or touch screen is needed to generate a control signal to turn a page of the book. Rather, an intuitive action of the user appearing to make physical contact with a three dimensional image in the space causes generation of the control signal. The user sees the interface image in a three dimensional space and simply uses a finger to touch the interface image to cause a response. The user has an actual view of the finger, with which the user has had a lifetime to become familiar, touching a virtual stereoscopic object similar to the way the user has spent a lifetime touching physical objects. This provides for an intuitive interface.
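The page-turn interaction described above reduces to a point-in-volume test: a control signal is generated when the tracked finger-tip position substantially coincides with the projected icon's position. The sketch below is illustrative only; the function names, the callback, and the tolerance standing in for "substantial intersection" are assumptions, not taken from the patent.

```python
def intersects(finger, icon_center, tolerance=0.02):
    """True when the tracked finger-tip position (x, y, z in meters)
    substantially coincides with the projected icon's position."""
    # Axis-aligned tolerance box around the icon center (assumed 2 cm).
    return all(abs(f - c) <= tolerance for f, c in zip(finger, icon_center))

def on_touch(finger, icon_center, turn_page):
    """Generate a control signal (modeled as a callback) when the
    finger intersects the icon; returns whether the signal fired."""
    if intersects(finger, icon_center):
        turn_page()  # e.g. redisplay the book image with a turned page
        return True
    return False
```

The same test applies unchanged whether the point being checked is the finger tip itself or, as described later for FIG. 5, the tip of a stereoscopic extension image.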
The stereoscopic projector 200 can be any of several display means capable of displaying three dimensional images. Some projectors require the user to wear colored, polarized or active image filter glasses (not shown) to observe the three dimensional image, while others are totally contained within a display headset worn by the user; yet another requires only a display separate from the user and no glasses at all. While all displays capable of displaying a three dimensional image are contemplated, the latter is preferred because of the convenience to a user of requiring no physical contact with the means necessary to display three dimensional images.

FIG. 2 shows the display of the stereoscopic interface image. Display 200 displays an image 210 for viewing by the left eye 110 of the user 100, while image 220 is displayed for viewing by the right eye 120 of user 100. As a result, stereoscopic interface image 250 appears to occur in a space between the user 100 and the display 200 at a position indicated by the intersection of a line from eye 110 to image 210 and a second line from eye 120 to image 220.

FIG. 3 shows determination of the position of the stereoscopic interface image. The position is dependent upon the distance between images 210 and 220, the distance between the eyes 110 and 120 of the user 100, and the position of the user, including distance D1 between the display 200 and the user. Preferably, the size of display 200 is predetermined and the image 250 is determined by the computer generating the image. Consequently the distance between images 210 and 220 is also predetermined. The distance between the eyes 110 and 120 can be entered by the user as a calibration procedure prior to operating the user interface means, or can be determined by pattern recognition from images recorded by cameras 310 and 320. The position of the user, including the distance between the user and the display, can be determined by pattern recognition of the images recorded by cameras 310 and 320 to determine a common point relative to the user. Pattern recognition of images of faces and other physical objects is well known; such descriptions can be found in references including U.S. Pat. No. 5,680,481, Oct. 21, 1997, to Prasad et al., entitled "Facial feature extraction method and apparatus for a neural network acoustic and visual speech recognition system," U.S. Pat. No. 5,715,325, Feb. 3, 1998, to Bang et al., entitled "Apparatus and method for detecting a face in a video image," and U.S. Pat. No. 5,719,951, Feb. 17, 1998, to Shackleton et al., entitled "Normalized image feature processing," which are hereby incorporated by reference. The common point may be the area between the eyes of the user. Alternately, the identification of the common point may be simplified by adding a fiducial mark at the desired point to assist in identifying the desired point and its corresponding angle. Such a mark could be a colored dot placed between the eyes or at the tip of the nose, or marks on glasses worn by the user; the mark could be further illuminated to simplify pattern recognition of images received by the video cameras. Thereafter, triangulation is performed to determine the position of the user, including D1. D1 is a geometric solution of a predetermined distance between cameras 310 and 320 and angles A1 and A2 found from images recorded by cameras 310 and 320. Thus, the position, including D2, of interface image 250 is readily geometrically determined from the aforesaid determinations. It should be appreciated that the three dimensional display means can be constructed such that the position of the user and the distance D1 are predetermined in order for the user to correctly view the stereoscopic effect. Furthermore, the distance between the eyes 110 and 120 can also be predetermined to be an average distance between eyes of a number of users. This simplifies determination of the position of interface image 250 without departing from the spirit and scope of the invention. FIG. 3 shows determining the position of interface image 250 from a top view; it should be appreciated that a similar analysis applies to determining the position of interface image 250 from a side view, thus providing a three dimensional position of the user 100 and the interface image 250.

FIG. 4 shows a physical object intersecting the stereoscopic interface image. Physical object 400 can be any physical object where the position of the object can be determined. In FIG. 1, the physical object corresponds to the tip of the finger of the user. Pattern recognition is used to determine the position of the physical object and the tip of the finger of the user. Alternately, a fiducial mark such as the aforementioned colored or illuminated dot may be added to assist pattern recognition. Once the desired point is identified from the images recorded by cameras 310 and 320, angles A3 and A4 may be determined. Given angles A3 and A4, and the predetermined distance between cameras 310 and 320, the position of the physical object 400 may be geometrically determined. FIG. 4 shows determining the position of the physical object from a top view; it should be appreciated that a similar analysis applies to determining the position of the physical object from a side view, thus providing a three dimensional position of physical object 400. Upon determination of a substantial intersection of the position of interface image 250 and physical object 400, a control signal is generated. The control signal may result in the modification of the image or the control of another device such as a printer or modem.

FIG. 4 thus shows a computer system which stereoscopically projects a three dimensional object having an interface image in a space observable by a user. The user controls the movement of a physical object within the space while observing both the three dimensionally projected object and the physical object. The computer system monitors the position of the user to determine the position of the interface image within the space and further monitors the movement of the physical object to determine its position. A control signal is generated in response to the position of the physical object intersecting the position of the interface image. For example, a word processing program is indicated by an interface image such as an icon including the letter "W" three dimensionally projected within the space. The word processing program is activated when the user's finger moves within the space to touch the projected icon. The interface allows the user to observe the projected icon, physical finger and their intersection within the space.

FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image. In this alternative embodiment, the physical object is shown as a bar 450 having first and second ends 452 and 454 with a stereoscopic extension image 255 projecting from end 454. The orientation and position of the physical object are determined by determining the positions of end points 452 and 454 from images recorded by cameras 310 and 320. The end points can be found by pattern recognition or by the adding of differing colored fiducial marks at either end of the bar. The position of end point 452 may be determined from angles A6 and A8 of images from cameras 310 and 320 respectively, while the position of end point 454 may be determined from angles A5 and A7 from cameras 310 and 320 respectively. FIG. 5 shows determining the position of the end points from a top view; it should be appreciated that a similar analysis applies to determining the position of the end points from a side view, thus providing a three dimensional position of end points 452 and 454. From the position of the two end points, the orientation of the physical object 450 may be determined.
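The geometry described for FIGS. 3 and 4 can be sketched numerically: two cameras a known distance apart each report an angle to a target, giving its position by triangulation, and the perceived distance of interface image 250 follows from similar triangles between the user's eye baseline and the on-screen image pair. The function names, units and angle conventions below are illustrative assumptions, not the patent's implementation.

```python
import math

def triangulate(baseline, a1, a2):
    """Locate a point seen by two cameras a known `baseline` apart.
    a1 and a2 are the angles (radians) from the baseline toward the
    point, one per camera (cf. angles A1/A2 and A3/A4)."""
    # Geometry: tan(a1) = z / x and tan(a2) = z / (baseline - x).
    z = baseline / (1.0 / math.tan(a1) + 1.0 / math.tan(a2))
    return z / math.tan(a1), z  # (x along baseline, depth z)

def interface_image_distance(d1, eye_spacing, image_spacing):
    """Perceived distance D2 of interface image 250 from the user,
    given user-to-display distance D1, the spacing of eyes 110 and
    120, and the separation of crossed images 210 and 220."""
    # Lines from each eye through the perceived point diverge to hit
    # the display image_spacing apart, so by similar triangles
    # image_spacing / eye_spacing = (D1 - D2) / D2.
    return eye_spacing * d1 / (eye_spacing + image_spacing)
```

For example, with the user 0.6 m from the display, 60 mm eye spacing and 60 mm image separation, the interface image is perceived 0.3 m from the user, halfway to the display. Applying `triangulate` independently to the top view and the side view yields the full three dimensional position, as the text notes.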
In response to the determined position and orientation of physical object 450 and the determined position of user 100, a stereoscopic extension image 255 is created such that the extension image appears to be an extension of the physical object. In FIG. 5, the extension image 255 is shown as a line, with an arrow head tip, extending along the line of physical object 450. The length and shape of the extension image are predetermined and may vary from application to application. The stereoscopic extension image 255 is created by displaying images 215 and 225 on display 200 for view by eyes 110 and 120 respectively. A control signal is generated when the position of a predetermined portion of the stereoscopic extension image, such as the tip of the arrow head, intersects the position of the stereoscopic interface image.

FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display 200. FIG. 6 is similar to FIG. 5 in that both show a stereoscopic extension image, 255 and 255', intersecting a stereoscopic interface image, 250 and 250'. However, in FIG. 5 the intersection is in front of display 200, while in FIG. 6 the intersection is behind display 200. The position and orientation of physical object 450 are determined by determining the position of end points 452 and 454 via cameras 310 and 320 and angles A5', A6', A7' and A8'. In this case the resulting extension image 255' is shown to have a substantially longer predetermined length than image 255 of FIG. 5. If display 200 were not a heads-up stereoscopic display, but rather a conventional LCD or CRT, then the intersection between a physical object and an interface image could not occur if the position of the interface image were behind the display, because either the space is physically occupied by another object or the user could not see the physical intersection through the display. The extension image has the advantage of enabling intersections to occur in positions appearing behind the display 200, or in other positions out of reach of the user, while allowing the user to directly view the physical object used to cause the intersection.

Physical object 450 has been referred to as a bar, but it should be appreciated that the physical object could be any of a number of physical objects, including the finger of the user, where one end is the finger tip and the other end is a joint of the finger. Fiducial marks could be added to the points on the finger to facilitate pattern recognition of images recorded by the cameras. While the extension image is shown as a line with an arrow head, other types of extension images may be used depending upon the application. The stereoscopic extension may be considered a virtual end effect for a physical handle; a wide variety of end effects may be created by the computer system. For example, a paint brush could be used for painting a virtual object, the handle being the physical object and the brush bristles and paint color being the end effect, while the interface image appears as a paint canvas mounted on a three dimensional easel image. In a medical application, the physical object could be the handle and the end effect extension image the blade of a scalpel, while the stereoscopic interface image is part of a three dimensional image simulating surgery. Alternately, in a game application the stereoscopic extension image could be a laser beam, rocket, bullet or bolt of lightning appearing to emanate from the finger of the user along a three dimensional vector defined by the finger; the stereoscopic interface image may be a villain or enemy tank moving in three dimensions.

It should also be appreciated that the position and orientation of the user 100 and physical object 450 have been described as being determined by two cameras with pattern recognition which triangulate in order to determine the corresponding position and orientation. In a heads up stereoscopic head set display, the cameras could preferably be mounted on the head set for visually monitoring physical objects the user observes in the same space in which stereoscopic images are projected. In alternate embodiments other techniques may be used to determine the aforesaid positions and orientations without departing from the spirit and scope of the invention.

FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention. A stereoscopic display 200 displays stereoscopic images generated by stereoscopic image generation means 212 in a manner known in the art. The stereoscopic display may be a CRT or LCD screen requiring filter glasses to be worn by the user to direct the appropriate image to the corresponding eye of the user. Alternately, it may be a heads up stereoscopic display worn by the user. Preferably, display 200 is a display means especially adapted to displaying stereoscopic images without the aid of devices worn by the user. Cameras 310 and 320 produce images which are analyzed by pattern recognizers 312 and 322, which identify certain points of the image and their location within the image. As previously described, the pattern recognition may be performed with or without the aid of fiducial marks. The locations of the points from pattern recognizers 312 and 322 are analyzed by coordinate determining means 314 which, by analyzing the angles relative to each point from each camera and knowing the predetermined distance between the cameras, is able to determine the desired positions and orientations. Coordinate determining means 314 also makes available the position of the user and the position and orientation of the physical object so that the stereoscopic image generator 212 may generate the stereoscopic extension image in response thereto. Coordinate determining means 314 also makes available the position of the user to coordinate determining means 214, which determines the position of the interface image relative to the user from the distance between the left eye and right eye images displayed on display 200, the user's position including the distance between the user and the display, and the spacing between the eyes of the user. The positions of the physical object and interface image are then compared by intersection monitor 322, which generates a control signal in response to a substantial coincidence of the position of the physical object, or its stereoscopic extension image, and the position of the stereoscopic interface image.

FIG. 8 shows a flow chart of a process operating in accordance with the present invention. In step 800, a stereoscopic image is displayed. Step 802 determines the position of the user as previously described. Note that in alternate embodiments the position of the user may be predetermined. Then in step 804 the position of the stereoscopic interface image relative to the user is determined. Step 806 determines the position and orientation of the physical object, and step 810 asks if an extension image is desired. If so, step 812 causes the display of the extension image and step 814 redetermines the position and orientation of the physical object with the extension image. Then step 816 determines if there is an intersection between the interface image and the physical object or its extension image. If so, step 818 generates a control signal which in step 820 modifies the displayed image and/or controls another device.

Thus, what has been provided is a method and apparatus by which the intersection of a physical object and a stereoscopic object can be determined and used to form a user interface with a computer system.
I claim:

1. A method of displaying a stereoscopic extension image as an extension of a physical object observable by a user comprising the steps of:
    determining a position and orientation of the physical object; and
    displaying the stereoscopic extension image also observable by the user as the extension of the physical object in response thereto, wherein
        said step of displaying further comprises the step of determining a position of the user, and includes projecting the stereoscopic extension image relative to the determined position of the user and
        said step of determining the orientation of the physical object further comprises the steps of:
            visually recognizing a first and a second point on the physical object;
            determining a position of the first point and the position of the second point; and
            determining coordinates of a line defined by the positions of first and second points; and further wherein
        said step of displaying projects the stereoscopic extension image substantially along the line as observed by the user.

2. The method according to claim 1 wherein the physical object is a handle having at least the first and second points and the stereoscopic extension image is a projection of an end effect on the handle.

3. The method according to claim 1 wherein the stereoscopic extension image is a projection of one of a plurality of selectable end effects.

4. The method according to claim 1 further comprising the steps of:
    displaying a stereoscopic interface image observable by the user;
    determining an intersection of the stereoscopic extension image with the stereoscopic interface image; and
    generating the control signal in response thereto, wherein
        said step of displaying the stereoscopic extension image displays the stereoscopic extension image in either the front space or the behind space, and
        said step of determining the intersection further comprises the step of determining the intersection of the stereoscopic extension image with the stereoscopic interface image in either the front space or the behind space.

5. A method of generating a control signal comprising:
    projecting a stereoscopic interface image in a space observable by a user;
    enabling a physical object within the space to be observable by the user in addition to the stereoscopic interface image;
    determining an intersection of the physical object with the stereoscopic interface image; and
    generating the control signal in response to said step of determining.

6. The method according to claim 5 wherein
    said step of projecting further projects an observable image including the stereoscopic interface image and the method comprises the step of
    modifying the observable image in response to the control signal.

7. The method according to claim 6 further comprising the step of
    determining a position of the user, wherein
    said step of determining the intersection determines the intersection of the physical object and the stereoscopic interface image relative to the position of the user.

8. The method according to claim 7 further comprising the step of
    visually monitoring the user, wherein
    said step of determining the position of the user is determined in response to said step of visually monitoring, and
    said step of determining the intersection of the physical object with the stereoscopic interface image is determined in response to said step of visually monitoring.

9. The method according to claim 5 further comprising the step of
    visually monitoring the physical object, and wherein
    said step of determining the intersection of the physical object with the stereoscopic interface image is determined in response to said step of visually monitoring.

10. A method of generating a control signal comprising:
    projecting a stereoscopic interface image in a space observable by a user;
    enabling a physical object within the space to be observable by the user in addition to the stereoscopic interface image;
    determining an intersection of the physical object with the stereoscopic interface image; and
    generating the control signal in response to said step of determining, wherein the physical object includes a
`
`
`
`stereoscopic extension image and the method further
`
`
`
`determining wherein the physical object includes a
`
`
`comprises the steps of:
`
`
`
`stereoscopic extension image and the method further
`50
`
`determining a position and orientation of the physical
`
`comprises the steps of:
`
`object; and
`
`displaying the stereoscopic extension image as an
`
`
`
`
`object; and
`
`
`
`
`extension of the physical object in response thereto,
`displaying the stereoscopic extension
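The core mechanism the claims describe, a line defined by two tracked points on the physical object, an extension image projected along that line, and a control signal generated when the extension intersects the stereoscopic interface image, can be sketched in code. This is only an illustrative sketch, not the patent's implementation: the function names, the spherical model of the interface image, and all coordinates are hypothetical.

```python
import numpy as np

def extension_line(p1, p2):
    """Line through two visually recognized points on the physical object
    (claim 1: 'determining coordinates of a line defined by the positions
    of first and second points'). Returns an origin and a unit direction."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = p2 - p1
    return p1, d / np.linalg.norm(d)

def intersects_interface(tip, icon_center, icon_radius):
    """Intersection test from claim 5: true when the tip of the extension
    enters the volume of the stereoscopic interface image, modeled here
    (an assumption) as a sphere of the given radius."""
    return np.linalg.norm(np.asarray(tip) - np.asarray(icon_center)) <= icon_radius

# Sketch of one pass of the control loop: project the extension image
# along the handle's axis, then test its tip against the projected icon.
origin, direction = extension_line((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
tip = origin + 3.0 * direction            # tip of the stereoscopic extension image
control_signal = intersects_interface(tip, (0.0, 0.0, 3.0), 0.5)
```

In this toy geometry the extension tip lands at the icon's center, so `control_signal` is true and the system would, for example, activate the program the icon represents.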
