Case 6:21-cv-00755-ADA Document 45-2 Filed 02/28/22

Exhibit 2

US007301648B2
(12) United States Patent
     Foxlin

(10) Patent No.: US 7,301,648 B2
(45) Date of Patent: *Nov. 27, 2007

(54) SELF-REFERENCED TRACKING

(75) Inventor: Eric Foxlin, Arlington, MA (US)

(73) Assignee: InterSense, Inc., Bedford, MA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

    This patent is subject to a terminal disclaimer.

(21) Appl. No.: 11/463,776

(22) Filed: Aug. 10, 2006

(65) Prior Publication Data

    US 2006/0284792 A1    Dec. 21, 2006

Related U.S. Application Data

(63) Continuation of application No. 10/837,373, filed on Apr. 30, 2004, which is a continuation of application No. 09/770,691, filed on Jan. 26, 2001, now Pat. No. 6,757,068.

(60) Provisional application No. 60/178,797, filed on Jan. 28, 2000.

(51) Int. Cl.
     G01B 11/14    (2006.01)

(52) U.S. Cl.  356/620

(58) Field of Classification Search  356/620
     See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

    4,988,981 A *   1/1991  Zimmerman et al.  345/158
    5,526,022 A *   6/1996  Donahue et al.    345/156
    5,615,132 A     3/1997  Horton et al.
    5,645,077 A *   7/1997  Foxlin            600/587
    5,812,257 A *   9/1998  Teitel et al.     356/141.4
    5,854,843 A *  12/1998  Jacknin et al.    381/309
    5,991,085 A *  11/1999  Rallison et al.   359/630
    6,005,548 A *  12/1999  Latypov et al.    345/156
    6,176,837 B1    1/2001  Foxlin
    6,474,159 B1   11/2002  Foxlin et al.
    6,757,068 B2 *  6/2004  Foxlin            356/620
 2003/0158699 A1 *  8/2003  Townsend et al.   702/151

FOREIGN PATENT DOCUMENTS

    DE    198 30 359    1/2000

OTHER PUBLICATIONS

E. Foxlin, "Head-tracking relative to a moving vehicle or simulator platform using differential inertial sensors".
E. Foxlin, "Inertial head-tracking", M.S. Thesis, Dept. of E.E.C.S., MIT, 1993.
E. Foxlin, "Inertial head-tracker sensor fusion by a complementary separate-bias Kalman filter", Proc. VRAIS '96 Virtual Reality Annual Intl. Symposium, Santa Clara, CA, 1996.
E. Foxlin et al., "Miniature 6-DOF inertial system for tracking HMDs", SPIE vol. 3362, Proc. AeroSense '98 Conference on Helmet- and Head-Mounted Displays III, Orlando, FL, 1998.
InterSense Inc. homepage, http://www.isense.com.
K. Britting, "Inertial navigation systems analysis", New York, Wiley Interscience, 1971.
C. Broxmeyer, "Inertial navigation systems", New York, McGraw-Hill, 1964.
R. Parvin, "Inertial Navigation", Princeton, New Jersey, Van Nostrand, 1962.
R.G. Brown et al., "Introduction to random signals and applied Kalman filtering", 3rd edition, New York, John Wiley & Sons, 1992.
W. Frey, M. Zyda, R. McGhee, B. Cockayne, "Off-the-Shelf, Real-Time, Human Body Motion Capture for Synthetic Environments", Computer Science Department, Naval Postgraduate School, Monterey, CA 93943-5118, 1995.

* cited by examiner

Primary Examiner: Roy M. Punnoose
(74) Attorney, Agent, or Firm: Fish & Richardson P.C.

(57) ABSTRACT

A new tracking technique is essentially "sourceless" in that it can be used anywhere with no set-up, yet it enables a much wider range of virtual environment-style navigation and interaction techniques than does a simple head-orientation tracker. A sourceless head orientation tracker is combined with a head-worn tracking device that tracks a hand-mounted 3D beacon relative to the head. The system encourages use of intuitive interaction techniques which exploit proprioception.

44 Claims, 5 Drawing Sheets
[FIG. 1, Sheet 1 of 5: perspective view of a self-referenced tracking device mounted on a head; reference numerals 10, 12, 14, 15, 16, 80, 84.]
[FIG. 2, Sheet 2 of 5: block diagram. An orientation tracker 40 (InertiaCube 42 feeding a computational unit 44 with a serial port 46) and the receiver bar 50B of the FreeD tracker are mounted on the Virtual Vision Cap visor 60; they send orientation data (3 DOF) 68 and x, y, z data 70 over serial ports to a PC 62 running program 63. The FreeD ultrasonic emitter ring 50A carries mouse buttons 54, which supply button states 65. Program 63 comprises a tracker driver 71 and VR rendering 72 (records 73, 74), with VGA output 64 to a monitor.]
[FIG. 3, Sheet 3 of 5: isogram of a vertical slice through the GDOP data ("GDOP vertical"), axes in meters with contour values 0.2 to 0.8, with the tracking region superimposed.]
[FIG. 4, Sheet 4 of 5: view of an information cockpit.]
[FIG. 5, Sheet 5 of 5: a user using a virtual reality game.]
SELF-REFERENCED TRACKING

RELATED APPLICATIONS

This application is a continuation of and claims priority under 35 USC §120 to U.S. patent application Ser. No. 10/837,373, filed on Apr. 30, 2004, which is a continuation of U.S. patent application Ser. No. 09/770,691, filed on Jan. 26, 2001, now U.S. Pat. No. 6,757,068, which is entitled under 35 USC §119(e) to the benefit of the filing date of U.S. Provisional Patent Application Ser. No. 60/178,797, filed on Jan. 28, 2000, the contents of each of which are hereby incorporated by reference.
BACKGROUND

This invention relates to self-referenced tracking.

Virtual reality (VR) systems require tracking of the orientation and position of a user's head and hands with respect to a world coordinate frame in order to control view parameters for head-mounted displays (HMDs) and allow manual interactions with the virtual world. In laboratory VR setups, this tracking has been achieved with a variety of mechanical, acoustic, magnetic, and optical systems. These systems require propagation of a signal between a fixed "source" and the tracked "sensor" and therefore limit the range of operation. They also require a degree of care in setting up the source or preparing the site that reduces their utility for field use.

The emerging fields of wearable computing and augmented reality (AR) require tracking systems to be wearable and capable of operating essentially immediately in arbitrary environments. "Sourceless" orientation trackers have been developed based on geomagnetic and/or inertial sensors. They allow enough control to look around the virtual environment and fly through it, but they don't enable the "reach-out-and-grab" interactions that make virtual environments so intuitive and which are needed to facilitate computer interaction.
SUMMARY
In one aspect, in general, the invention provides a new tracking technique that is essentially "sourceless" in that it can be used anywhere with no set-up of a source, yet it enables a wider range of virtual environment-style navigation and interaction techniques than does a simple head-orientation tracker, including manual interaction with virtual objects. The equipment can be produced at only slightly more than the cost of a sourceless orientation tracker and can be used by novice end users without any knowledge of tracking technology, because there is nothing to set up or configure.

In another aspect, in general, the invention features mounting a tracker on a user's head and using the tracker to track a position of a localized feature associated with a limb of the user relative to the user's head. The localized feature associated with the limb may include a hand-held object or a hand-mounted object or a point on a hand.

In another aspect, in general, the invention features mounting a sourceless orientation tracker on a user's head and using a position tracker to track a position of a first localized feature associated with a limb of the user relative to the user's head.

In another aspect, in general, the invention features tracking a point on a hand-held object such as a pen, or a point on a hand-mounted object such as a ring, or a point on a hand, relative to a user's head.
In another aspect, in general, the invention features using a position tracker to determine a distance between a first localized feature associated with a user's limb and a second localized feature associated with the user's head.

In another aspect, in general, the invention features a position tracker which includes an acoustic position tracker, an electro-optical system that tracks LEDs, optical sensors or reflective marks, a video machine-vision device, a magnetic tracker with a magnetic source held in the hand and sensors integrated in the headset or vice versa, or a radio frequency position locating device.

In another aspect, in general, the invention features a sourceless orientation tracker including an inertial sensor, a tilt-sensor, or a magnetic compass sensor.

In another aspect, in general, the invention features mounting a display device on the user's head and displaying a first object at a first position on the display device.

In another aspect, in general, the invention features changing the orientation of a display device, and, after changing the orientation of the display device, redisplaying the first object at a second position on the display device based on the change in orientation.

In another aspect, in general, the invention features determining the second position for displaying the first object so as to make the position of the first object appear to be fixed relative to a first coordinate reference frame, which frame does not rotate with the display device during said changing of the orientation of the display device.

In another aspect, in general, the invention features displaying the first object in response to a signal from a computer.

In another aspect, in general, the invention features mounting a wearable computer on the user's body, and displaying a first object in response to a signal from the wearable computer.

In another aspect, in general, the invention features displaying at least a portion of a virtual environment, such as a fly-through virtual environment, or a virtual treadmill, on the display device.

In another aspect, in general, the invention features displaying a graphical user interface for a computer on the display device.

In another aspect, in general, the invention features the first object being a window, icon or menu in the graphical user interface.

In another aspect, in general, the invention features the first object being a pointer for the graphical user interface.

In another aspect, in general, the invention features changing the position of the first localized feature relative to the position tracker and, after changing the position of the first localized feature, redisplaying the first object at a second position on the display device determined based on the change in the position of the first localized feature.

In another aspect, in general, the invention features displaying a second object on the display device, so that after changing the position of the first localized feature, the displayed position of the second object on the display device does not change in response to the change in the position of the first localized feature.

In another aspect, in general, the invention features determining the second position so as to make the position of the first object appear to coincide with the position of the first localized feature as seen or felt by the user.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal being received by the computer.
In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a change in the position of the first localized feature.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal representative of the location of the user.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal representative of a destination.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal representative of a change in the user's immediate surroundings.

In another aspect, in general, the invention features the orientation of the first coordinate reference frame being changed in response to a signal representative of a change in the physiological state or physical state of the user.

In another aspect, in general, the invention features redisplaying the first object further comprising changing the apparent size of the first object according to the change in position of the first localized feature.

In another aspect, in general, the invention features mounting a portable beacon, transponder or passive marker at a fixed point in the environment and determining the position vector of a second localized feature associated with the user's head relative to the fixed point.

In another aspect, in general, the invention features determining the position vector of the first localized feature relative to the fixed point.

In another aspect, in general, the invention features mounting a sourceless orientation tracker on a second user's head and determining the position of a localized feature associated with the body of the second user relative to the fixed point.

In another aspect, in general, the invention features determining the position vector of a second localized feature associated with the user's head relative to the fixed point without determining the distance between the second localized feature and more than one fixed point in the environment.

In another aspect, in general, the invention features displaying the first object at a third position after displaying the first object at the second position, changing the orientation of the display, and, after changing the orientation of the display, continuing to display the first object at the third position.

In another aspect, in general, the invention features the first object being a window in a wraparound computer interface.

In another aspect, in general, the invention features redisplaying the first object when the changed position of the first localized feature is not within the field of view of the display.

In another aspect, in general, the invention features displaying the first object at a position coinciding with the position of the first localized object when the first localized object is within the field of view of the display.

In another aspect, in general, the invention features positioning the first localized feature at a first point, positioning the first localized feature at a second point, and calculating the distance between the first point and the second point.

In another aspect, in general, the invention features determining a position vector of the first localized feature relative to a second localized feature associated with the user's head and modifying the position vector based on an orientation of the user's head.

In another aspect, in general, the invention features setting an assumed position for the user's head in a coordinate system and setting a position for the first localized feature in the coordinate system based on the assumed position of the user's head and said position vector.

In another aspect, in general, the invention features measuring the orientation of the user's head relative to a fixed frame of reference.

In another aspect, in general, the invention features setting a virtual travel speed and direction for the user and modifying the assumed position for the user's head based on the user's virtual travel speed and direction.

In another aspect, in general, the invention features mounting on the head of a user a three degree of freedom orientation tracker for tracking the orientation of the head, and a three degree of freedom position tracker for tracking the position of a first localized feature on the user's limb relative to a second localized feature on the user's head, computing a position vector for the first localized feature relative to the second localized feature, determining a rotation matrix based on information received from the orientation tracker, and transforming the position vector into a position vector for a fixed frame of reference based on the rotation matrix.

In another aspect, in general, the invention features using an acoustic or radio frequency position tracker to track a position of a first localized feature associated with a limb of the user relative to the user's head.

In another aspect, in general, the invention features mounting a video camera on the back of the user's head and displaying an image generated by the video camera in a portion of a display device mounted on the user's head.

In another aspect, in general, the invention features mounting a first inertial sensor on a user's head, mounting a second inertial sensor elsewhere on the user's body or in an object held by the user, and tracking the position of one inertial sensor relative to the other.

Some embodiments of the invention include sensing data at the first and second inertial sensors and using the sensed data to track the position of one inertial sensor relative to the other; tracking the position of the inertial sensor without reference to any signal received from a source not mounted on or held by the user; and correcting the drift of the relative position or orientation of the second inertial sensor relative to the first inertial sensor by measurements between devices on the user's head and devices elsewhere on the user's body.

Among the advantages of the invention are one or more of the following. The device is easy to don, can track both head and hand, adds no new cables to a wearable computer system, works anywhere indoors or outdoors with no preparation, and is simpler than alternatives such as vision-based self-tracking.

The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF DRAWINGS

FIG. 1 is a perspective view of a self-referenced tracking device mounted on a head.

FIG. 2 is a block diagram.

FIG. 3 is a graph of tracking coverage and relative resolution.

FIG. 4 is a view of an information cockpit.

FIG. 5 shows a user using a virtual reality game.

Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION

As seen in FIG. 1, implementations of the invention may combine a sourceless head orientation tracker 30 with a head-worn tracking device 12 that tracks a hand-mounted 3D beacon 14 relative to the head 16. One implementation uses a wireless ultrasonic tracker 12, which has the potential for low cost, lightweight, low power, good resolution, and high update rates when tracking at the relatively close ranges typical of head-hand displacements.

As FIG. 1 illustrates, this arrangement provides a simple and easy-to-don hardware system. In a fully integrated wearable VR system using this tracker there are only three parts (a wearable computer 10, a headset 15 with an integrated tracking system, and a hand-mounted beacon 14) and one cable connection 18. This is possible because the entire ultrasonic receiver system 12 for tracking the beacon can be reduced to a few small signal-conditioning circuits and integrated with the sourceless orientation tracker 30 in the head-worn display 15. By sharing the microprocessor and its power and communications link to the wearable, the cost and complexity are reduced.

The benefits of this combination of elements stem from these realizations:

1. It is usually not important to track the hand unless it is in front of the head. Thus range and line-of-sight limitations are no problem if the tracker is mounted on the forehead.

2. The hand position measured in head space can be transformed into world space with good seen/felt position match using an assumed head pose, no matter how inaccurate.

3. Using one fixed beacon, the same tracking hardware can provide full 6-DOF tracking (see the sketch below).
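Realization 3 can be made concrete with a little vector algebra: if the beacon sits at a known fixed point in the world rather than on the hand, the measured head-frame vector to it, rotated by the sourceless orientation estimate, can be subtracted from the beacon's world position to recover head position. The sketch below is our illustration of that reading, not text from the patent; the function name and NumPy usage are assumptions.

```python
import numpy as np

def head_position_from_anchor(beacon_world, beacon_in_head, R_head_to_world):
    """One fixed beacon turns the 3-DOF beacon tracker into a head
    position tracker: b_w = h_w + R @ b_h  =>  h_w = b_w - R @ b_h."""
    return np.asarray(beacon_world) - R_head_to_world @ np.asarray(beacon_in_head)
```

Combined with the 3-DOF sourceless orientation, this yields the full 6-DOF head pose from the same receiving hardware.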
Implementations of the invention may exhibit:

1. A new tracking concept that enables immersive visualization and intuitive manual interaction using a wearable system in arbitrary unprepared environments.

2. An information cockpit metaphor for a wearable computer user interface and a set of interaction techniques based on this metaphor.
As shown in FIG. 2, a simple proof-of-concept implementation combines an InterSense IS-300 sourceless inertial orientation tracker 40 (available from InterSense, Inc., in Burlington, Mass.) with a Pegasus FreeD ultrasonic position tracker 50 (available from Pegasus Technologies Ltd. in Holon, Israel). The IS-300 has an "InertiaCube" inertial sensor assembly 42, just over an inch on a side, cabled to a small computational unit 44 that outputs orientation data through a serial port 46. The FreeD product consists of a finger-worn wireless ultrasonic emitter 50A with two mouse buttons 54, and an L-shaped receiver bar 50B which normally mounts on the frame of a computer monitor and outputs x, y, z data through a serial port. For our experiments we mounted the InertiaCube and the L-shaped receiver bar on the visor 60 of a V-Cap 1000 see-through HMD (available from Virtual Vision of Seattle, Wash.). The FreeD therefore measures the ring position relative to the head-fixed coordinate frame whose orientation was measured by the IS-300.

Data from both trackers is transmitted to a PC 62 (Pentium 300 MHz, Windows 98) running a program 63 that uses Windows DirectX and Direct3D capabilities to display graphics and effect interaction techniques. The graphics output window of Direct3D is maximized to take control over the entire screen, and VGA output 64 (640x480 at 60 Hz) is passed into the V-Cap HMD as well as a desktop monitor.
The program 63 includes a tracker driver 71 and a fairly conventional VR rendering environment 72 that expects to receive 6-DOF head and hand tracking data from the tracker driver as well as button states 65 for the hand tracking device. The interaction techniques to be described are implemented in the tracker driver. The basic functions of the tracker driver, when tracking a single 3-DOF point on the hand, are (see the sketch following this list):

1. Read in and parse the orientation data 68 from the IS-300 and the position triad 70 from the FreeD.

2. Package the orientation data with the current head position in world frame, and output the combined 6-DOF data record 73 for the head to the VR program. The current assumed world-frame head position is the same as the previous one unless the user is in the process of performing a navigation interaction such as flying. In this case the position is incremented based on the flying speed and direction.

3. Transform the hand position vector from head frame to world frame by first multiplying by the rotation matrix from head to world frame obtained from the orientation tracker, then adding the current assumed world-frame head position. Output the result to the VR program as a 3-DOF position record 74 for the hand device.
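A minimal sketch of steps 2 and 3 is given below. It is our paraphrase, not InterSense or Pegasus driver code; the Euler-angle convention, the record tuples, and the flying-velocity handling are illustrative assumptions.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Head-to-world rotation from Euler angles in radians
    (Z-Y-X convention assumed; the IS-300's actual convention may differ)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def driver_step(head_pos_world, euler, hand_pos_head, flying, fly_velocity, dt):
    """One cycle of the tracker driver's steps 2 and 3 (step 1, parsing
    the serial streams, is omitted)."""
    # Step 2: the assumed world-frame head position only changes while
    # the user is performing a navigation interaction such as flying.
    if flying:
        head_pos_world = head_pos_world + fly_velocity * dt
    # Step 3: rotate the head-frame hand vector into world frame, then
    # add the assumed world-frame head position.
    R = rotation_matrix(*euler)
    hand_pos_world = R @ hand_pos_head + head_pos_world
    head_record = (head_pos_world, euler)  # 6-DOF record 73
    hand_record = hand_pos_world           # 3-DOF record 74
    return head_record, hand_record
```

Note how the design decouples the two accuracies: errors in `euler` rotate both the world and the hand together, so the seen/felt match of the hand avatar depends only on the ultrasonic position measurement.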
The simple implementation just described is wearable, but cannot be integrated into an HMD elegantly, largely due to the size and power consumption of the IS-300 processing unit. A low-cost wearable version using available technologies could be implemented as follows:

The core of this implementation is an inertial head orientation module called InterTrax 2 (available from InterSense and designed for use with consumer HMDs such as the Sony Glasstron and Olympus EyeTrek). Using tiny piezoelectric camcorder gyros and solid-state accelerometers and magnetometers, InterTrax 2 is designed as a single long narrow circuit board 30 (FIG. 1) to lie across the top of the head-mounted display unit along the brow line. It is 9 cm long, 2 cm wide, and 0.5 cm thick with all components, except for a vertical gyro in the center, which sticks up 1 cm higher. It contains a low-power embedded 16-bit processor that runs a simplified fixed-point version of the GEOS drift-corrected orientation-tracking algorithm used in the IS-300. It communicates with the host through a single USB connector through which it draws its power, and can be manufactured for very low cost in volume. It is expected to achieve accuracy on the order of 2-3°, which is sufficient because the accuracy with which the hand avatar follows the physical hand is totally independent of orientation tracking accuracy.

Another component is an embedded ultrasonic rangefinder (perhaps based on the Pegasus FreeD technology). As shown in FIG. 1, three microphones 80, 82, 84 and their ultrasonic pulse detection circuits, together with the InterTrax 2 board, are embedded in a rigid plastic assembly designed to fit elegantly over the brow of an HMD. (In some embodiments, all components would be embedded inside the HMD display unit while sharing the HMD's cable 18, but in others, the added components are clipped on.) The InterTrax 2 processor has enough unused timer inputs and processing bandwidth to timestamp the signals from the three ultrasonic pulse detectors and relay this data down its USB link.
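The patent does not spell out how x, y, z coordinates are recovered from the three timestamped pulse detections, but the standard method for such a rangefinder is trilateration: each time-of-flight gives the distance from the beacon to one microphone, and the beacon position is solved from the three range equations, for example by Gauss-Newton iteration. The sketch below is our illustration under assumed microphone geometry; none of the constants come from the patent.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

# Assumed microphone positions in the head frame, in meters; the real
# headset uses a brow-mounted array angled 45 degrees downward.
MICS = np.array([[-0.08,  0.00, 0.00],
                 [ 0.08,  0.00, 0.00],
                 [ 0.00, -0.06, 0.00]])

def trilaterate(times_of_flight, guess=(0.0, -0.2, 0.3), iters=10):
    """Solve ||x - mic_i|| = c * tof_i for the beacon position x by
    Gauss-Newton least squares on the range residuals."""
    ranges = SPEED_OF_SOUND * np.asarray(times_of_flight)
    x = np.array(guess, dtype=float)  # guess selects the in-front solution
    for _ in range(iters):
        diffs = x - MICS                       # (3, 3) offsets to each mic
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges
        residuals = dists - ranges
        J = diffs / dists[:, None]             # Jacobian of range w.r.t. x
        x -= np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x
```

With only three microphones there are two mirror-image solutions on either side of the microphone plane; an initial guess in front of the brow picks the physically meaningful one.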
The ultrasonic tracking technology can be modified to take advantage of the very short range requirements. First, the ultrasonic frequency may be increased from 40 kHz to a higher frequency. This increases the attenuation in air, and virtually eliminates reverberation and interference between nearby users. Second, the system can take advantage of the much reduced reverberation and the short time-of-flight to increase the update rate of tracking to, say, 240 Hz, thus allowing the system to average 4 position samples for each 60 Hz graphics update, or track up to 4 beacons at 60 Hz. To calculate the resolution that this would yield in various parts of the tracking volume, we calculated the Geometric Dilution of Precision (GDOP) throughout the tracking volume given the intended geometry of the microphone mounts on the headset. The intended headset geometry, tracking range and optical field of view are illustrated superimposed on an isogram of a vertical slice through the GDOP data in FIG. 3. The plane of the microphones is angled downward 45° to ensure that the system has tracking coverage for hands in the lap. The resolution at any point in space is the range measurement resolution (about 0.1 mm for short-range ultrasonic measurements using 40 kHz) multiplied by the GDOP value, divided by 2 as a result of the 4x oversampling and averaging. Thus the expected resolution is approximately 0.5 mm at a distance of 400 mm away from the headset.
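As a check on the arithmetic in the preceding paragraph, the stated model is: position resolution = range resolution x GDOP / sqrt(N) for N averaged samples, with sqrt(4) = 2. A back-of-envelope version follows; the GDOP value of 10 is inferred from the quoted 0.5 mm at 400 mm, not read from FIG. 3.

```python
import math

RANGE_RESOLUTION_MM = 0.1  # short-range 40 kHz ultrasonic ranging
SAMPLES_AVERAGED = 4       # 240 Hz tracking rate / 60 Hz graphics rate

def position_resolution_mm(gdop):
    """Range resolution times GDOP, improved by sqrt(N) sample averaging."""
    return RANGE_RESOLUTION_MM * gdop / math.sqrt(SAMPLES_AVERAGED)

# 0.1 mm x 10 / 2 = 0.5 mm, matching the ~0.5 mm quoted at 400 mm.
print(position_resolution_mm(10.0))
```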
A goal of a wearable computer is to keep the user's hands free to perform tasks. For this reason, the system uses a wireless 3-DOF ring pointer for interaction. The FreeD ring-mouse previously described is approximately the right size. In some implementations of the system, the tracker will need to be triggered by a unique IR code from the headset, so that multiple beacons can be tracked.

In interactive visualization and design (IVD) and many other VR applications, a pen-style input device may be more useful. An implementation could use a wireless 5-DOF pen using the same basic technology as the 3-DOF ring pointer, but employing two emitters that are activated in an alternating sequence. A compact omni-directional pen could be implemented using cylindrical radiating ultrasonic transducers that have been developed by Virtual Ink (Boston, Mass.), mounted at the ends of a cylindrical electronics unit approximately the size of a normal pen, with two mouse buttons.

An additional device that could be included in the system, and whose applications are discussed below, is a small wireless anchor beacon that can be easily stuck to any surface. Ultrasonic beacons from InterSense are of suitable size and functionality.
Portable VR Application

Object Selection and Manipulation Exploiting Proprioception

M. Mine, F. Brooks, and C. Sequin (Moving Objects in Space: Exploiting Proprioception in Virtual Environment Interaction. In SIGGRAPH 97 Conference Proceedings, ACM Annual Conference Series, August 1997) have discussed the benefits of designing virtual environment interaction techniques that exploit our proprioceptive sense of the relative pose of our head, hands and body. A variety of techniques were presented, such as direct manipulation of objects within arm's reach, scaled-world grab, hiding tools and menus on the user's body, and body-relative gestures.

Implementations of the invention have advantages over conventional world-frame tracking systems for implementing these techniques effectively. With conventional trackers, any error in head orientation tracking will cause significant mismatch between the visual representation of the virtual hand and the felt position of the real hand, making it difficult to accurately activate hidden menus while the virtual hand is not in view. With implementations of the invention, the head orientation accuracy is immaterial and the visual-proprioceptive match will be good to the accuracy of the ultrasonic tracker, typically 1-2 mm.
Locomotion & View Control Tricks

This section describes a few techniques to permit user locomotion and view control.

Flying and Scaled-World Grab

The usual navigation interface device in fly-through virtual environments is a joystick. This is appropriate for a flight simulator, but reduces one's sense of presence in terrestrial environments, where turning one's body toward the destination is more instinctive than turning the world until the destination is in front. Implementations of the invention support this more immersive type of flying. No matter how one turns, if she raises a hand in front of her it will be trackable, and can be used to control flight speed and direction. Better yet, she can use two-handed flying, which can be performed with the arms in a relaxed position and allows backwards motion, or the scaled-world grab method to reach out to a distant object and pull oneself to it in one motion.
Walking Using Head Accelerometers as a Pedometer

For exploratory walk-throughs, the sense of presence is greatest for walking, somewhat reduced for walking-in-place, and much further reduced for flying. M. Slater, A. Steed and M. Usoh (The Virtual Treadmill: A Naturalistic Metaphor for Navigation in Immersive Virtual Environments. In First Eurographics Workshop on Virtual Reality, M. Goebel, Ed., 1993) and M. Slater, M. Usoh and A. Steed (Steps and Ladders in Virtual Reality. In Proc. Virtual Reality Software & Technology 94, G. Singh, S. K. Feiner, and D. Thalmann, Eds., Singapore: World Scientific, pages 45-54, August 1994) have described a "virtual treadmill" technique in which a neural network is trained to recognize the bouncing pattern of a position tracker on an HMD, and thus control virtual motion. Inertial head-orientation trackers do not normally output the position obtained by double-integrating the accelerometers, because it drifts too much to be useful, but it seems reasonable that pattern analysis of the acceleration signals would produce good results.
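The passage stops short of specifying an algorithm, so the following is only one plausible stand-in for the cited neural-network recognizer: threshold-and-debounce step detection on the gravity-removed vertical acceleration. Every constant here (threshold, debounce time, virtual step length) is an illustrative assumption.

```python
STEP_THRESHOLD = 1.5  # m/s^2 above the gravity-removed baseline (assumed)
MIN_STEP_GAP = 0.3    # s between steps, to debounce each bounce (assumed)
STEP_LENGTH = 0.7     # m of virtual travel credited per step (assumed)

def count_steps(vertical_accel, dt):
    """Count head 'bounces' in a gravity-removed vertical acceleration
    trace sampled every dt seconds."""
    steps, last_step_time, t, armed = 0, -MIN_STEP_GAP, 0.0, True
    for a in vertical_accel:
        if armed and a > STEP_THRESHOLD and t - last_step_time >= MIN_STEP_GAP:
            steps += 1
            last_step_time = t
            armed = False      # wait for the signal to fall before re-arming
        elif a < 0.0:
            armed = True
        t += dt
    return steps

def virtual_walk_distance(vertical_accel, dt):
    """Convert detected steps into virtual travel distance."""
    return count_steps(vertical_accel, dt) * STEP_LENGTH
```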
Head-Motion Parallax Using Anchor Beacon

When working with close objects, head motion parallax is an important visual cue. It can be achieved with the tracking system of the invention on demand by using a trick. Normally, the system uses the 3-DOF position vector from the user's head to the hand-mounted beacon to track the position of the h
