United States Patent (DeLuca)

(11) Patent Number: 6,064,354
(45) Date of Patent: May 16, 2000

(54) STEREOSCOPIC USER INTERFACE METHOD AND APPARATUS

(76) Inventor: Michael Joseph DeLuca, 1104 Claire Ave., Austin, Tex. 78703

(21) Appl. No.: 09/108,814
(22) Filed: Jul. 1, 1998
(51) Int. Cl.7: G09G 5/00
(52) U.S. Cl.: 345/7; 345/8; 345/419
(58) Field of Search: 345/7, 8, 9, 419; 348/42, 47, 51

Primary Examiner: Richard A. Hjerpe
Assistant Examiner: Ronald Laneau

(56) References Cited

U.S. PATENT DOCUMENTS
5,025,314   6/1991  Tang et al.
5,168,531  12/1992  Sigel
5,239,373   8/1993  Tang et al.
5,694,142  12/1997  Dumoulin et al.
5,767,842   1/1998  Korth ............ 345/168

FOREIGN PATENT DOCUMENTS
08139994   9/1994  Japan

(57) ABSTRACT

A computer system stereoscopically projects a three dimensional object having an interface image in a space observable by a user. The user controls the movement of a physical object within the space while observing both the three dimensionally projected object and the physical object. The computer system monitors the position of the user to determine the position of the interface image within the space and further monitors the movement of the physical object to determine its position. A control signal is generated in response to the position of the physical object intersecting the position of the interface image. For example, a word processing program is indicated by an interface image such as an icon including the letter "W" three dimensionally projected within the space. The word processing program is activated when the user's finger moves within the space to touch the projected icon. The interface allows the user to observe the projected icon, physical finger and their intersection within the space. The physical object may also be extended with a stereoscopic extension image generated by the computer system in response to determining the position and orientation of the physical object.

20 Claims, 3 Drawing Sheets

IPR2022-00090 - LGE
Ex. 1006 - Page 1
[Drawing Sheet 1 of 3: FIG. 1, a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display.]
[Drawing Sheet 2 of 3: FIGS. 2-6 per the Brief Description of the Drawings; geometry of the stereoscopic interface image. Legible reference labels include 452 and 255'.]
[Drawing Sheet 3 of 3:
FIG. 7, a block diagram: video cameras 310 and 320 feeding pattern recognizers 312 and 322, a coordinate determination block 314, a stereoscopic display 200 driven by image generation, and an intersection monitor comparing coordinates.
FIG. 8, a flowchart with steps 802-818: display stereoscopic image; determine position of user; determine position of interface image; determine position/orientation of physical object; extension image? If yes, display extension image and redetermine position/orientation of physical object with extension image; physical object and interface image intersect? If yes, generate control signal and modify displayed image and/or control another device.]
STEREOSCOPIC USER INTERFACE METHOD AND APPARATUS

FIELD OF THE INVENTION

This invention generally relates to the area of computer user interfaces and more particularly to virtual three dimensional user interfaces.
BACKGROUND OF THE INVENTION

Graphical user interfaces have become a standard for interfacing between a user and a computer. Such interfaces are in wide use in computer operating system interfaces produced by Apple, Microsoft and others. These interfaces are limited in that they are intended for interfacing between a user and a computer having a two dimensional display such as a CRT or LCD. A user activates the interface with a keyboard and/or a pointing device such as a mouse pointing to an icon on the display. Advancements have been made with the advent of the touch screen, which allows a user to approximately contact the icon or intended area of the graphical user interface in order to use the interface. However, contact with the touch screen can contaminate the display area of the screen with fingerprints and other types of smudges. Also, constant physical contact with the touch screen can result in its mechanical failure. Thus, what is needed is a way to contact user interface images without contacting a keyboard, a mouse or the display itself.
Three dimensional image displays are improving. Several types of three dimensional displays are known, including stereoscopic displays, which display a virtual three dimensional image using filters to highlight images intended for each eye of the viewer, thereby providing a stereoscopic or three dimensional effect. Such systems alternately flash images for the left and right eye of the user and require a filter for each eye, usually included in glasses worn by the viewer. Systems in public use which require glasses may have color filters, orthogonally polarized lenses, or actively switched lenses, and the display is correspondingly modulated with left and right eye images to provide the three dimensional effect. Furthermore, stereoscopic displays which do not require glasses have been described; descriptions are included in U.S. Pat. No. 4,987,487, Jan. 22, 1991, to Ichinose et al., entitled "Method of stereoscopic images display which compensates electronically for viewer head movement," and U.S. Pat. No. 5,365,370, Nov. 15, 1994, to Hudgins, entitled "Three dimensional viewing illusion with 2D display." Yet another stereoscopic display system is completely contained in a head-set-worn apparatus as described in U.S. Pat. No. 5,673,151, Sep. 30, 1997, to Dennis, entitled "Image correction in a virtual reality and heads up display." The aforesaid patents are incorporated by reference.

The aforesaid stereoscopic displays allow the viewer to simultaneously observe both a stereoscopic object, appearing to be generally set apart in three dimensions from the image projection means, and a physical object, such as the hand of the user, in approximately the same perceived space. What is needed is a method and apparatus by which the intersection of the physical object and the stereoscopic object can form a user interface with a computer system.
OBJECT OF THE INVENTION

It is therefore an object of the invention to provide a three dimensional display system capable of determining an intersection of a physical object with a three dimensionally displayed object in a space where the three dimensional object is viewed, and generating a control signal in response
thereto. The control signal may cause modification of the displayed image or control another device. The display system is also capable of extending the physical object with a three dimensional extension image and then using the extended image to determine the intersection.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display.
FIG. 2 shows the display of the stereoscopic interface image.
FIG. 3 shows determination of the position of the stereoscopic interface image.
FIG. 4 shows a physical object intersecting the stereoscopic interface image.
FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image.
FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display.
FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention.
FIG. 8 shows a flow chart of a process operating in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION

FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display. The user 100 has left and right eyes 110 and 120 which are used to view a display 200 which projects a three dimensional stereoscopic object 245 in a space between the user and the display. The stereoscopic object has a stereoscopic interface image 250. Using pattern recognition and triangulation, images from video cameras 310 and 320 are used to determine the position of physical objects within the space, such as the position of the user 100 and the user's finger 400. As will be described herein, a control signal is generated in response to the intersection of the interface image 250 and a physical object 400. For example, the stereoscopic object 245 projected by the display 200 could be the image of an open book, including readable text on pages of the book. Interface image 250 could be an icon indicating that contact with the icon would cause a page in the book to turn. When the finger tip 400 of the user touches the icon 250, a control signal is generated causing a new image 245 of a book to be displayed with a turned page. Because the stereoscopic three dimensional image is projected in a space, no physical contact with a keyboard, mouse or touch screen is needed to generate a control signal to turn a page of the book. Rather, an intuitive action of a user appearing to make physical contact with a three dimensional image in the space causes generation of the control signal. The user sees the interface image in a three dimensional space and simply uses a finger to touch the interface image to cause a response. The user has an actual view of the finger, with which the user has had a lifetime to become familiar, touching a virtual stereoscopic object similar to the way the user has spent a lifetime touching physical objects. This provides for an intuitive interface.

The stereoscopic projector 200 can be any of several display means capable of displaying three dimensional images. Some projectors require the user to wear colored,
polarized or active image filter glasses (not shown) to observe the three dimensional image, while others are totally contained within a display headset worn by the user; yet another requires only a display separate from the user and no glasses at all. While all displays capable of displaying a three dimensional image are contemplated, the latter is preferred because of the convenience to a user of requiring no physical contact with the means necessary to display three dimensional images.
FIG. 2 shows the display of the stereoscopic interface image. Display 200 displays an image 210 for viewing by the left eye 110 of the user 100 while image 220 is displayed for viewing by the right eye 120 of user 100. As a result, stereoscopic interface image 250 appears to occur in a space between the user 100 and the display 200 at a position indicated by the intersection of a line from eye 110 to image 210 and a second line from eye 120 to image 220.
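In a top view, the perceived position of interface image 250 is simply the intersection of those two sight lines. A minimal numeric sketch of this geometry (the coordinate frame and values are illustrative assumptions, not from the patent):

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the infinite 2D lines through p1-p2 and p3-p4."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    a = x1 * y2 - y1 * x2          # cross term of the first line
    b = x3 * y4 - y3 * x4          # cross term of the second line
    return ((a * (x3 - x4) - (x1 - x2) * b) / den,
            (a * (y3 - y4) - (y1 - y2) * b) / den)

# Display along y = 0; eyes 110 and 120 six units apart at distance D1 = 60.
# Crossed disparity (left-eye image 210 at x = +3, right-eye image 220 at
# x = -3) places image 250 between the user and the display.
perceived = line_intersection((-3, 60), (3, 0), (3, 60), (-3, 0))
# -> (0.0, 30.0): centered, halfway between display and user
```

Uncrossed disparity (210 to the left of 220) would instead place the intersection behind the display surface, the situation later shown in FIG. 6.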
FIG. 3 shows determination of the position of the stereoscopic interface image. The position is dependent upon the distance between images 210 and 220, the distance between the eyes 110 and 120 of the user 100, and the position of the user, including distance D1 between the display 200 and the user. Preferably, the size of display 200 is predetermined and the image 250 is determined by the computer generating the image; consequently the distance between images 210 and 220 is also predetermined. The distance between the eyes 110 and 120 can be entered by the user as a calibration procedure prior to operating the user interface means, or can be determined by pattern recognition from images recorded by cameras 310 and 320. The position of the user, including the distance between the user and the display, can be determined by pattern recognition from the images recorded by cameras 310 and 320 to determine a common point relative to the user. Pattern recognition of images of faces and other physical objects is well known; descriptions can be found in references including U.S. Pat. No. 5,680,481, Oct. 21, 1997, to Prasad et al., entitled "Facial feature extraction method and apparatus for a neural network acoustic and visual speech recognition system," U.S. Pat. No. 5,715,325, Feb. 3, 1998, to Bang et al., entitled "Apparatus and method for detecting a face in a video image," and U.S. Pat. No. 5,719,951, Feb. 17, 1998, to Shackleton et al., entitled "Normalized image feature processing," which are hereby incorporated by reference. The common point may be the area between the eyes of the user. Alternately, the identification of the common point may be simplified by adding a fiducial mark at the desired point to assist in identifying the desired point and its corresponding angle. Such a mark could be a colored dot placed between the eyes or at the tip of the nose, or marks on glasses worn by the user; the mark could be further illuminated to simplify pattern recognition of images received by the video cameras. Thereafter, triangulation is performed to determine the position of the user, including D1. D1 is a geometric solution of the predetermined distance between cameras 310 and 320 and angles A1 and A2 found from images recorded by cameras 310 and 320. Thus, the position, including D2, of interface image 250 is readily geometrically determined from the aforesaid determinations. It should be appreciated that the three dimensional display means can be constructed such that the position of the user and the distance D1 are predetermined in order for the user to correctly view the stereoscopic effect. Furthermore, the distance between the eyes 110 and 120 can also be predetermined to be an average distance between the eyes of a number of users. This simplifies determination of the position of interface image 250 without departing from the spirit and scope of the invention. FIG. 3 shows determining the
position of interface image 250 from a top view; it should be appreciated that a similar analysis applies to determining the position of interface image 250 from a side view, thus providing a three dimensional position of the user 100 and the interface image 250.
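The triangulation of D1 from the camera baseline and angles A1 and A2 is elementary trigonometry. A sketch of that computation (an illustrative reconstruction, assuming each angle is measured between the baseline and that camera's ray to the recognized point):

```python
import math

def triangulate(baseline, a1, a2):
    """Locate a point seen by two cameras a known distance apart.

    a1, a2: angles (radians) between the camera baseline and the rays
    from camera 1 and camera 2 to the point.  Returns (x, d), where x is
    measured along the baseline from camera 1 and d is the perpendicular
    distance from the baseline (e.g. D1 for the user's position).
    """
    t1, t2 = math.tan(a1), math.tan(a2)
    x = baseline * t2 / (t1 + t2)   # where the two rays cross, along the baseline
    return x, t1 * x                # distance out from the baseline

# A point centered between cameras 100 units apart, seen at 45 degrees
# from each camera, lies 50 units out from the baseline.
x, d1 = triangulate(100.0, math.radians(45), math.radians(45))
```

The same function applied to a side view yields the vertical coordinate, giving the full three dimensional position the paragraph above describes.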
FIG. 4 shows a physical object intersecting the stereoscopic interface image. Physical object 400 can be any physical object whose position can be determined. In FIG. 1, the physical object corresponds to the tip of the finger of the user. Pattern recognition is used to determine the position of the physical object and the tip of the finger of the user. Alternately, a fiducial mark such as the aforementioned colored or illuminated dot may be added to assist pattern recognition. Once the desired point is identified from the images recorded by cameras 310 and 320, angles A3 and A4 may be determined. Given angles A3 and A4, and the predetermined distance between cameras 310 and 320, the position of the physical object 400 may be geometrically determined. FIG. 4 shows determining the position of the physical object from a top view; it should be appreciated that a similar analysis applies to determining the position of the physical object from a side view, thus providing a three dimensional position of physical object 400. Upon determination of a substantial intersection of the position of interface image 250 and physical object 400, a control signal is generated. The control signal may result in the modification of the image or the control of another device such as a printer or modem.
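Once both positions are known in a common frame, the "substantial intersection" test reduces to a proximity check. A minimal sketch; the tolerance value and the callback are illustrative assumptions, not the patent's implementation:

```python
import math

def check_intersection(object_pos, interface_pos, on_control, tol=1.0):
    """Generate a control signal when the physical object substantially
    intersects the interface image (both given as 3D points in the same
    coordinate frame)."""
    if math.dist(object_pos, interface_pos) <= tol:
        on_control()   # e.g. turn the page, or launch the program
        return True
    return False
```

For example, `check_intersection((0.4, 30, 10), (0, 30, 10), launch)` would fire, while a finger five units away would not; the tolerance absorbs measurement error from the triangulation.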
FIG. 4 shows a computer system which stereoscopically projects a three dimensional object having an interface image in a space observable by a user. The user controls the movement of a physical object within the space while observing both the three dimensionally projected object and the physical object. The computer system monitors the position of the user to determine the position of the interface image within the space, and further monitors the movement of the physical object to determine its position. A control signal is generated in response to the position of the physical object intersecting the position of the interface image. For example, a word processing program is indicated by an interface image such as an icon including the letter "W" three dimensionally projected within the space. The word processing program is activated when the user's finger moves within the space to touch the projected icon. The interface allows the user to observe the projected icon, physical finger and their intersection within the space.
FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image. In this alternative embodiment, the physical object is shown as a bar 450 having first and second ends 452 and 454, with a stereoscopic extension image 255 projecting from end 454. The orientation and position of the physical object are determined by determining the positions of end points 452 and 454 from images recorded by cameras 310 and 320. The end points can be found by pattern recognition or by adding differing colored fiducial marks at either end of the bar. The position of end point 452 may be determined from angles A6 and A8 of images from cameras 310 and 320 respectively, while the position of end point 454 may be determined from angles A5 and A7 from cameras 310 and 320 respectively. FIG. 5 shows determining the position of the end points from a top view; it should be appreciated that a similar analysis applies to determining the position of the end points from a side view, thus providing a three dimensional position of end points 452 and 454. From the position of the two end points, the orientation of the physical object 450 may be determined. In response to the determined position and
orientation of physical object 450 and the determined position of user 100, a stereoscopic extension image 255 is created such that the extension image appears to be an extension of the physical object. In FIG. 5, the extension image 255 is shown as a line extending along the line of physical object 450 with an arrow head tip. The length and shape of the extension image are predetermined and may vary from application to application. The stereoscopic extension image 255 is created by displaying images 215 and 225 on display 200 for viewing by eyes 110 and 120 respectively. A control signal is generated when the position of a predetermined portion of the stereoscopic extension image, such as the tip of the arrow head, intersects the position of the stereoscopic interface image.
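Placing the extension image is then vector arithmetic: continue the line from end point 452 through end point 454 by the predetermined length. A sketch under that reading (the function and variable names are illustrative):

```python
import math

def extension_tip(p452, p454, length):
    """Tip of an extension image of the given length, continuing the line
    from endpoint 452 through endpoint 454 (both 3D points)."""
    v = [b - a for a, b in zip(p452, p454)]            # direction of the bar
    norm = math.sqrt(sum(c * c for c in v))
    return tuple(b + length * c / norm for b, c in zip(p454, v))

# A 5-unit arrow-head extension of a bar lying along the z axis.
tip = extension_tip((0, 0, 0), (0, 0, 10), 5.0)
# -> (0.0, 0.0, 15.0)
```

The tip position feeds the same intersection test used for the bare physical object; the stereoscopic left-eye and right-eye images 215 and 225 are then rendered so the extension appears at that point for the measured user position.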
FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display 200. FIG. 6 is similar to FIG. 5 in that both show a stereoscopic extension image, 255 and 255', intersecting a stereoscopic interface image, 250 and 250'. However, in FIG. 5 the intersection is in front of display 200, while in FIG. 6 the intersection is behind display 200. The position and orientation of physical object 450 are determined by determining the positions of end points 452 and 454 via cameras 310 and 320 and angles A5', A6', A7' and A8'. In this case the resulting extension image 255' is shown to have a substantially longer predetermined length than image 255 of FIG. 5. If display 200 were not a heads-up stereoscopic display, but rather a conventional LCD or CRT, then the intersection between a physical object and an interface image could not occur if the position of the interface image were behind the display, because either the space is physically occupied by another object or the user could not see the physical intersection through the display. The extension image has the advantage of enabling intersections to occur in positions appearing behind the display 200, or in other positions out of reach of the user, while allowing the user to directly view the physical object used to cause the intersection.
Physical object 450 has been referred to as a bar, but it should be appreciated that the physical object could be any of a number of physical objects, including the finger of the user, where one end is the finger tip and the other end is a joint of the finger. Fiducial marks could be added to the points on the finger to facilitate pattern recognition of images recorded by the cameras. While the extension image is shown as a line with an arrow head, other types of extension images may be used depending upon the application. The stereoscopic extension may be considered a virtual end effect for a physical handle; a wide variety of end effects may be created by the computer system. For example, a paint brush could be used for painting a virtual object, the handle being the physical object and the brush bristles and paint color being the end effect, while the interface image appears as a paint canvas mounted on a three dimensional easel image. In a medical application, the physical object could be the handle and the end effect extension image the blade of a scalpel, while the stereoscopic interface image is part of a three dimensional image simulating surgery. Alternately, in a game application the stereoscopic extension image could be a laser beam, rocket, bullet or bolt of lightning appearing to emanate from the finger of the user along a three dimensional vector defined by the finger; the stereoscopic interface image may be a villain or enemy tank moving in three dimensions.

It should also be appreciated that the position and orientation of the user 100 and physical object 450 have been described as being determined by two cameras with pattern
recognition which triangulate in order to determine the corresponding position and orientation. In a heads-up stereoscopic head set display, the cameras could preferably be mounted on the head set for visually monitoring physical objects in the same space in which the user observes the projected stereoscopic images. In alternate embodiments other techniques may be used to determine the aforesaid positions and orientations without departing from the spirit and scope of the invention.
FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention. A stereoscopic display 200 displays stereoscopic images generated by stereoscopic image generation means 212 in a manner known in the art. The stereoscopic display may be a CRT or LCD screen requiring filter glasses to be worn by the user to direct the appropriate image to the corresponding eye of the user. Alternately, it may be a heads-up stereoscopic display worn by the user. Preferably, display 200 is a display means especially adapted to displaying stereoscopic images without the aid of devices worn by the user. Cameras 310 and 320 produce images which are analyzed by pattern recognizers 312 and 322, which identify certain points of the image and their location within the image. As previously described, the pattern recognition may be performed with or without the aid of fiducial marks. The locations of the points from pattern recognizers 312 and 322 are analyzed by coordinate determining means 314, which analyzes the angles relative to each point from each camera and, knowing the predetermined distance between the cameras, is able to determine the desired positions and orientations. Coordinate determining means 314 also makes available the position of the user and the position and orientation of the physical object so that the stereoscopic image generator 212 may generate the stereoscopic extension image in response thereto. Coordinate determining means 314 also makes available the position of the user to coordinate determining means 214, which determines the position of the interface image relative to the user from the distance between the left eye and right eye images displayed on display 200, together with the user's position, including the distance between the user and the display and the spacing between the eyes of the user. The positions of the physical object and interface image are then compared by intersection monitor 322, which generates a control signal in response to a substantial coincidence of the position of the physical object, or its stereoscopic extension image, with the position of the stereoscopic interface image.
FIG. 8 shows a flow chart of a process operating in accordance with the present invention. In step 800, a stereoscopic image is displayed. Step 802 determines the position of the user as previously described. Note that in alternate embodiments the position of the user may be predetermined. Then in step 804 the position of the stereoscopic interface image relative to the user is determined. Step 806 determines the position and orientation of the physical object, and step 810 asks if an extension image is desired. If so, step 812 causes the display of the extension image and step 814 redetermines the position and orientation of the physical object with the extension image. Then step 816 determines if there is an intersection between the interface image and the physical object or its extension image. If so, step 818 generates a control signal, which in step 820 modifies the displayed image and/or controls another device.
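The steps above can be sketched as one polling pass; the sensor callables below stand in for the camera and pattern-recognition machinery and are assumptions for illustration, not the patent's implementation:

```python
import math

def interface_step(get_user, get_interface_image, get_object_pose,
                   want_extension, ext_len, on_control, tol=1.0):
    """One pass through steps 802-820 of FIG. 8 (sketch)."""
    user = get_user()                         # step 802: locate the user
    image = get_interface_image(user)         # step 804: image position, per user
    p1, p2 = get_object_pose()                # step 806: object endpoints
    tip = p2
    if want_extension:                        # steps 810-814: extend the object
        v = [b - a for a, b in zip(p1, p2)]
        n = math.sqrt(sum(c * c for c in v))
        tip = tuple(b + ext_len * c / n for b, c in zip(p2, v))
    if math.dist(tip, image) <= tol:          # step 816: intersection?
        on_control()                          # steps 818-820: control signal
        return True
    return False

# With a 5-unit extension, a bar ending at z = 10 reaches an interface
# image at z = 15 that the bare bar would miss.
hit = interface_step(lambda: (0, 0, 60),
                     lambda u: (0.0, 0.0, 15.0),
                     lambda: ((0, 0, 0), (0, 0, 10)),
                     True, 5.0, lambda: None)
```

In a real system this step would run inside the display refresh loop, with step 800's rendering updated from the same coordinates.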
Thus what has been provided is a method and apparatus by which the intersection of a physical object and a stereoscopic object can be determined and be used to form a user interface with a computer system.
I claim:
1. A method of displaying a stereoscopic extension image as an extension of a physical object observable by a user comprising the steps of:
  determining a position and orientation of the physical object; and
  displaying the stereoscopic extension image also observable by the user as the extension of the physical object in response thereto, wherein
    said step of displaying further comprises the step of determining a position of the user, and includes projecting the stereoscopic extension image relative to the determined position of the user, and
    said step of determining the orientation of the physical object further comprises the steps of:
      visually recognizing a first and a second point on the physical object;
      determining a position of the first point and the position of the second point; and
      determining coordinates of a line defined by the positions of the first and second points, and further wherein
        said step of displaying projects the stereoscopic extension image substantially along the line as observed by the user.
2. The method according to claim 1 wherein the physical object is a handle having at least the first and second points and the stereoscopic extension image is a projection of an end effect on the handle.
3. The method according to claim 1 wherein the stereoscopic extension image is a projection of one of a plurality of selectable end effects.
4. The method according to claim 1 further comprising the steps of:
  displaying a stereoscopic interface image observable by the user;
  determining an intersection of the stereoscopic extension image with the stereoscopic interface image; and
  generating the control signal in response thereto.
5. A method of generating a control signal comprising:
  projecting a stereoscopic interface image in a space observable by a user;
  enabling a physical object within the space to be observable by the user in addition to the stereoscopic interface image;
  determining an intersection of the physical object with the stereoscopic interface image; and
  generating the control signal in response to said step of determining, wherein the physical object includes a stereoscopic extension image and the method further comprises the steps of:
    determining a position and orientation of the physical object; and
    displaying the stereoscopic extension image as an extension of the physical object in response thereto, wherein
      said step of determining the intersection further comprises the step of determining an intersection of the stereoscopic extension image with the stereoscopic interface image, and further wherein
        said step of projecting the stereoscopic interface image is projected by a display having a display surface, and the space includes a front space between the display surface and the user and a behind space behind the display surface, and wherein the stereoscopic interface image is projected in either the front space or the behind space,
        said step of displaying the stereoscopic extension image displays the stereoscopic extension image in either the front space or the behind space, and
        said step of determining the intersection further comprises the step of determining the intersection of the stereoscopic extension image with the stereoscopic interface image in either the front space or the behind space.
6. The method according to claim 5 wherein
  said step of projecting further projects an observable image including the stereoscopic interface image and the method comprises the step of
  modifying the observable image in response to the control signal.
7. The method according to claim 6 further comprising the step of
  determining a position of the user, wherein
  said step of determining the intersection determines the intersection of the physical object and the stereoscopic interface image relative to the position of the user.
8. The method according to claim 7 further comprising the step of
  visually monitoring the user and the physical object, and wherein
    said step of determining the position of the user is determined in response to said step of visually monitoring, and
    said step of determining the intersection of the physical object with the stereoscopic interface image is determined in response to said step of visually monitoring.
9. The method according to claim 5 further comprising the step of
  visually monitoring the physical object, and wherein
  said step of determining the intersection of the physical object with the stereoscopic interface image is determined in response to said step of visually monitoring.
10. A method of generating a control signal comprising:
  projecting a stereoscopic interface image in a space observable by a user;
  enabling a physical object within the space to be observable by the user in addition to the stereoscopic interface image;
  determining an intersection of the physical object with the stereoscopic interface image; and
  generating the control signal in response to said step of determining, wherein the physical object includes a stereoscopic extension image and the method further comprises the steps of:
    determining a position and orientation of the physical object; and
    displaying the stereoscopic extension image as an extension of the physical object in response thereto, wherein
      said step of determining the intersection further comprises the step of determining an intersection of the stereoscopic extension image with the stereoscopic interface image, and further wherein
        said step of projecting the stereoscopic interface image is performed by a display having a display surface viewable by the user, and the space includes a behind space behind the display surface, and wherein the stereoscopic interface image is projected in the behind space,
        said step of determining the position and orientation of the physical object includes determining a position and orientation of the physical object to be between the user and the display surface,
        said step of displaying the stereoscopic extension image displays the stereoscopic interface image in the behind space, and
        said step of determining the intersection fur