Case 6:21-cv-00755-ADA Document 45-10 Filed 02/28/22 Page 1 of 39
Exhibit 10
FISH & RICHARDSON P.C.

225 Franklin Street
Boston, Massachusetts 02110-2804

Telephone: 617 542-5070
Facsimile: 617 542-8906
Web Site: www.fr.com

Frederick P. Fish
1855-1930

W. K. Richardson
1859-1951

January 26, 2001
`
Attorney Docket No.: 09970-006001

Box Patent Application
Commissioner for Patents
Washington, DC 20231

Presented for filing is a new patent application claiming priority from a provisional
patent application of:
`
BOSTON
DALLAS
DELAWARE
NEW YORK
SAN DIEGO
SILICON VALLEY
TWIN CITIES
WASHINGTON, DC
`
Applicant: ERIC FOXLIN

Title: SELF-REFERENCED TRACKING

Enclosed are the following papers, including those required to receive a filing date
under 37 CFR §1.53(b):

                    Pages
Specification       21
Claims              8
Abstract            1
Declaration         [To be Filed at a Later Date]
Drawing(s)          4
`
Enclosures:
Postcard.

Under 35 USC §119(e)(1), this application claims the benefit of prior U.S.
provisional application 60/178,797, filed January 28, 2000.

This application is entitled to small entity status. A small entity statement will be
filed at a later date.
`
CERTIFICATE OF MAILING BY EXPRESS MAIL

Express Mail Label No. EL298430912US
`
I hereby certify under 37 CFR §1.10 that this correspondence is being
deposited with the United States Postal Service as Express Mail Post Office to
Addressee with sufficient postage on the date indicated below and is
addressed to the Commissioner for Patents, Washington, D.C. 20231.

Date of Deposit: January 26, 2001

Typed or Printed Name of Person Signing Certificate: [illegible]
`
`GNTX0000265
`
`

`

`
Basic filing fee                               $355
Total claims in excess of 20 times $9          $0
Independent claims in excess of 3 times $40    $0
Fee for multiple dependent claims              $0
Total filing fee:                              $355
`
Under 37 CFR §1.53(f), no filing fee is being paid at this time.

If this application is found to be incomplete, or if a telephone conference would
otherwise be helpful, please call the undersigned at (617) 542-5070.

Kindly acknowledge receipt of this application by returning the enclosed postcard.

Please send all correspondence to:

DAVID L. FEIGENBAUM
Fish & Richardson P.C.
225 Franklin Street
Boston, MA 02110-2804

Respectfully submitted,

Lawrence K. Kolodney
Reg. No. 43,807

Enclosures
LKK/lja
20210311.doc
`
Attorney's Docket No.: 09970-006001

APPLICATION
FOR
UNITED STATES LETTERS PATENT

TITLE: SELF-REFERENCED TRACKING

APPLICANT: ERIC FOXLIN

CERTIFICATE OF MAILING BY EXPRESS MAIL

Express Mail Label No. EL298430912US

I hereby certify under 37 CFR §1.10 that this correspondence is being
deposited with the United States Postal Service as Express Mail Post
Office to Addressee with sufficient postage on the date indicated below
and is addressed to the Commissioner for Patents, Washington,
D.C. 20231.

Date of Deposit: January 26, 2001

Signature

Typed or Printed Name of Person Signing Certificate: [illegible]
`
SELF-REFERENCED TRACKING

CLAIM OF PRIORITY

This application claims priority under 35 USC §119(e) to provisional U.S. Patent
Application Serial No. 60/178,797, filed on January 28, 2000, the entire contents of which
are hereby incorporated by reference.

BACKGROUND

This invention relates to self-referenced tracking.
`
Virtual reality (VR) systems require tracking of the orientation and position of a
user's head and hands with respect to a world coordinate frame in order to control view
parameters for head mounted displays (HMDs) and allow manual interactions with the virtual
world. In laboratory VR setups, this tracking has been achieved with a variety of mechanical,
acoustic, magnetic, and optical systems. These systems require propagation of a signal
between a fixed "source" and the tracked "sensor" and therefore limit the range of operation.
They also require a degree of care in setting up the source or preparing the site that reduces
their utility for field use.
`
The emerging fields of wearable computing and augmented reality (AR) require
tracking systems to be wearable and capable of operating essentially immediately in arbitrary
environments. "Sourceless" orientation trackers have been developed based on geomagnetic
and/or inertial sensors. They allow enough control to look around the virtual environment
and fly through it, but they don't enable the "reach-out-and-grab" interactions that make
virtual environments so intuitive and which are needed to facilitate computer interaction.
`
SUMMARY

In one aspect, in general, the invention provides a new tracking technique that is
essentially "sourceless" in that it can be used anywhere with no set-up of a source, yet it
enables a wider range of virtual environment-style navigation and interaction techniques than
does a simple head-orientation tracker, including manual interaction with virtual objects. The
equipment can be produced at only slightly more than the cost of a sourceless orientation
tracker and can be used by novice end users without any knowledge of tracking technology,
because there is nothing to set up or configure.

In another aspect, in general, the invention features mounting a tracker on a user's
head and using the tracker to track a position of a localized feature associated with a limb of
the user relative to the user's head. The localized feature associated with the limb may
include a hand-held object or a hand-mounted object or a point on a hand.

In another aspect, in general, the invention features mounting a sourceless orientation
tracker on a user's head and using a position tracker to track a position of a first localized
feature associated with a limb of the user relative to the user's head.

In another aspect, in general, the invention features tracking a point on a hand-held
object such as a pen or a point on a hand-mounted object such as a ring or a point on a hand
relative to a user's head.

In another aspect, in general, the invention features using a position tracker to
determine a distance between a first localized feature associated with a user's limb and a
second localized feature associated with the user's head.
`
In another aspect, in general, the invention features a position tracker which includes
an acoustic position tracker, an electro-optical system that tracks LEDs, optical sensors or
reflective marks, a video machine-vision device, a magnetic tracker with a magnetic source
held in the hand and sensors integrated in the headset or vice versa, or a radio frequency
position locating device.

In another aspect, in general, the invention features a sourceless orientation tracker
including an inertial sensor, a tilt-sensor, or a magnetic compass sensor.
`
In another aspect, in general, the invention features mounting a display device on the
user's head and displaying a first object at a first position on the display device.

In another aspect, in general, the invention features changing the orientation of a
display device, and, after changing the orientation of the display device, redisplaying the first
object at a second position on the display device based on the change in orientation.

In another aspect, in general, the invention features determining the second position
for displaying the first object so as to make the position of the first object appear to be fixed
relative to a first coordinate reference frame, which frame does not rotate with the display
device during said changing of the orientation of the display device.
In another aspect, in general, the invention features displaying the first object in
response to a signal from a computer.

In another aspect, in general, the invention features mounting a wearable computer
on the user's body, and displaying a first object in response to a signal from the wearable
computer.

In another aspect, in general, the invention features displaying at least a portion of a
virtual environment, such as a fly-through virtual environment, or a virtual treadmill, on the
display device.

In another aspect, in general, the invention features displaying a graphical user
interface for a computer on the display device.

In another aspect, in general, the invention features the first object being a window,
icon or menu in the graphical user interface.

In another aspect, in general, the invention features the first object being a pointer for
the graphical user interface.
`
In another aspect, in general, the invention features changing the position of the first
localized feature relative to the position tracker and, after changing the position of the first
localized feature, redisplaying the first object at a second position on the display device
determined based on the change in the position of the first localized feature.

In another aspect, in general, the invention features displaying a second object on the
display device, so that after changing the position of the first localized feature, the displayed
position of the second object on the display device does not change in response to the change
in the position of the first localized feature.

In another aspect, in general, the invention features determining the second position
so as to make the position of the first object appear to coincide with the position of the first
localized feature as seen or felt by the user.
`
In another aspect, in general, the invention features changing the orientation of the
first coordinate reference frame in response to a signal being received by the computer.

In another aspect, in general, the invention features changing the orientation of the
first coordinate reference frame in response to a change in the position of the first localized
feature.
In another aspect, in general, the invention features changing the orientation of the
first coordinate reference frame in response to a signal representative of the location of the
user.

In another aspect, in general, the invention features changing the orientation of the
first coordinate reference frame in response to a signal representative of a destination.

In another aspect, in general, the invention features changing the orientation of the
first coordinate reference frame in response to a signal representative of a change in the
user's immediate surroundings.

In another aspect, in general, the invention features changing the orientation of the
first coordinate reference frame in response to a signal representative of a change in the
physiological state or physical state of the user.

In another aspect, in general, the invention features, in redisplaying the first object,
changing the apparent size of the first object according to the change in position of the first
localized feature.
`
In another aspect, in general, the invention features mounting a portable beacon,
transponder or passive marker at a fixed point in the environment and determining the
position vector of a second localized feature associated with the user's head relative to the
fixed point.

In another aspect, in general, the invention features determining the position vector
of the first localized feature relative to the fixed point.

In another aspect, in general, the invention features mounting a sourceless orientation
tracker on a second user's head and determining the position of a localized feature associated
with the body of the second user relative to the fixed point.

In another aspect, in general, the invention features determining the position vector
of a second localized feature associated with the user's head relative to the fixed point
without determining the distance between the second localized feature and more than one
fixed point in the environment.

In another aspect, in general, the invention features displaying the first object at a
third position, after displaying the first object at the third position, changing the orientation of
the display, and, after changing the orientation of the display, continuing to display the first
object at the third position.
In another aspect, in general, the invention features the first object being a window in
a wraparound computer interface.

In another aspect, in general, the invention features the changed position of the first
localized feature not being within the field of view of the display when the first object is
redisplayed.

In another aspect, in general, the invention features displaying the first object at a
position coinciding with the position of the first localized object when the first localized
object is within the field of view of the display.

In another aspect, in general, the invention features positioning the first localized
feature at a first point, positioning the first localized feature at a second point, and calculating
the distance between the first point and the second point.

In another aspect, in general, the invention features determining a position vector of
the first localized feature relative to a second localized feature associated with the user's head
and modifying the position vector based on an orientation of the user's head.
`
`
In another aspect, in general, the invention features setting an assumed position for
the user's head in a coordinate system and setting a position for the first localized feature in
the coordinate system based on the assumed position of the user's head and said position
vector.

In another aspect, in general, the invention features measuring the orientation of the
user's head relative to a fixed frame of reference.

In another aspect, in general, the invention features setting a virtual travel speed and
direction for the user, and modifying the assumed position for the user's head based on the
user's virtual travel speed and direction.
`
In another aspect, in general, the invention features mounting on the head of a user a
three degree of freedom orientation tracker for tracking the orientation of the head, and a
three degree of freedom position tracker for tracking the position of a first localized feature
on the user's limb relative to a second localized feature on the user's head, computing a
position vector for the first localized feature relative to the second localized feature,
determining a rotation matrix based on information received from the orientation tracker, and
transforming the position vector into a position vector for a fixed frame of reference based on
the rotation matrix.
In another aspect, in general, the invention features using an acoustic or radio
frequency position tracker to track a position of a first localized feature associated with a
limb of the user relative to the user's head.

In another aspect, in general, the invention features mounting a video camera on the
back of the user's head and displaying an image generated by the video camera in a portion
of a display device mounted on the user's head.

In another aspect, in general, the invention features mounting a first inertial sensor on
a user's head, mounting a second inertial sensor elsewhere on the user's body or in an object
held by the user, and tracking the position of one inertial sensor relative to the other.

Some embodiments of the invention include sensing data at the first and second
inertial sensors and using the sensed data to track the position of one inertial sensor relative
to the other; tracking the position of the inertial sensor without reference to any signal
received from a source not mounted on or held by the user; and correcting the drift of the
relative position or orientation of the second inertial sensor relative to the first inertial sensor
by measurements between devices on the user's head and devices elsewhere on the user's
body.
`
Among the advantages of the invention are one or more of the following. The device
is easy to don, can track both head and hand, adds no new cables to a wearable computer
system, works anywhere indoors or outdoors with no preparation, and is simpler than
alternatives such as vision-based self-tracking.

The details of one or more embodiments of the invention are set forth in the
accompanying drawings and the description below. Other features, objects, and advantages
of the invention will be apparent from the description and drawings, and from the claims.
`
DESCRIPTION OF DRAWINGS

FIG. 1 is a perspective view of a self-referenced tracking device mounted on a head.

FIG. 2 is a block diagram.

FIG. 3 is a graph of tracking coverage and relative resolution.

FIG. 4 is a view of an information cockpit.

FIG. 5 shows a user using a virtual reality game.

Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION

As seen in Figure 1, implementations of the invention may combine a sourceless head
orientation tracker 30 with a head-worn tracking device 12 that tracks a hand-mounted 3D
beacon 14 relative to the head 16. One implementation uses a wireless ultrasonic tracker 12,
which has the potential for low cost, lightweight, low power, good resolution, and high
update rates when tracking at the relatively close ranges typical of head-hand displacements.

As Figure 1 illustrates, this arrangement provides a simple and easy to don hardware
system. In a fully integrated wearable VR system using this tracker there are only three parts
(a wearable computer 10, a headset 15 with an integrated tracking system, and a hand-
mounted beacon 14) and one cable connection 18. This is possible because the entire
ultrasonic receiver system 12 for tracking the beacon can be reduced to a few small signal-
conditioning circuits and integrated with the sourceless orientation tracker 30 in the head-
worn display 15. By sharing the microprocessor and its power and communications link to
the wearable, the cost and complexity are reduced.
`
The benefits of this combination of elements stem from these realizations:

1. It is usually not important to track the hand unless it is in front of the head. Thus
range and line-of-sight limitations are no problem if the tracker is mounted on the forehead.

2. The hand position measured in head space can be transformed into world space
with good seen/felt position match using an assumed head pose, no matter how inaccurate.

3. Using one fixed beacon, the same tracking hardware can provide full 6-DOF
tracking.
`
Implementations of the invention may exhibit:

1. A new tracking concept that enables immersive visualization and intuitive manual
interaction using a wearable system in arbitrary unprepared environments.

2. An information cockpit metaphor for a wearable computer user interface and a set
of interaction techniques based on this metaphor.

As shown in Figure 2, a simple proof-of-concept implementation combines an
InterSense IS-300 sourceless inertial orientation tracker 40 (available from InterSense, Inc.,
in Burlington, MA) with a Pegasus FreeD ultrasonic position tracker 50 (available from
Pegasus Technologies Ltd. in Holon, Israel). The IS-300 has an "InertiaCube" inertial sensor
assembly 42, just over an inch on a side, cabled to a small computational unit 44 that outputs
orientation data through a serial port 46. The FreeD product consists of a finger-worn
wireless ultrasonic emitter 50A with two mouse buttons 54, and an L-shaped receiver bar
50B which normally mounts on the frame of a computer monitor, and outputs x,y,z data
through a serial port. For our experiments we mounted the InertiaCube and the L-shaped
receiver bar on the visor 60 of a V-Cap 1000 see-through HMD (available from Virtual
Vision of Seattle, WA). The FreeD therefore measures the ring position relative to the head-
fixed coordinate frame whose orientation was measured by the IS-300.

Data from both trackers is transmitted to a PC 62 (Pentium 300 MHz, Windows 98)
running a program 63 that uses Windows DirectX and Direct3D capabilities to display
graphics and effect interaction techniques. The graphics output window of Direct3D is
maximized to take control over the entire screen, and VGA output 64 (640X480 at 60Hz) is
passed into the V-Cap HMD as well as a desktop monitor.
`
The program 63 includes a tracker driver 71 and a fairly conventional VR rendering
environment 72 that expects to receive 6-DOF head and hand tracking data from the tracker
driver as well as button states 65 for the hand tracking device. The interaction techniques to
be described are implemented in the tracker driver. The basic functions of the tracker driver,
when tracking a single 3-DOF point on the hand, are:

1. Read in and parse the orientation data 68 from the IS-300 and the position triad 70
from the FreeD.

2. Package the orientation data with the current head position in world-frame, and
output the combined 6-DOF data record 73 for the head to the VR program. The current
assumed world-frame head position is the same as the previous one unless the user is in the
process of performing a navigation interaction such as flying. In this case the position is
incremented based on the flying speed and direction.

3. Transform the hand position vector from head frame to world frame by first
multiplying by the rotation matrix from head to world frame obtained from the orientation
tracker, then adding the current assumed world-frame head position. Output the result to the
VR program as a 3-DOF position record 74 for the hand device.
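Step 3 of the tracker driver can be sketched in a few lines (an illustrative sketch only; the function and variable names, and the yaw-only example rotation, are assumptions rather than details of the IS-300 interface):

```python
import numpy as np

def head_to_world(hand_in_head, R_head_to_world, head_pos_world):
    """Transform a hand position measured in the head frame into world frame.

    hand_in_head:    3-vector from the head-mounted tracker (head frame)
    R_head_to_world: 3x3 rotation matrix from the orientation tracker
    head_pos_world:  current assumed world-frame head position
    """
    return R_head_to_world @ hand_in_head + head_pos_world

# Example: head yawed 90 degrees left, hand 0.4 m straight ahead of the face
yaw = np.deg2rad(90.0)
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
hand_world = head_to_world(np.array([0.4, 0.0, 0.0]), R,
                           np.array([1.0, 2.0, 1.7]))
```

Because the same assumed head position is added to both the head and hand records, any error in it cancels in the head-to-hand relationship, which is why the seen/felt match does not depend on head position accuracy.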
`
The simple implementation just described is wearable, but cannot be integrated into
an HMD elegantly, largely due to the size and power consumption of the IS-300 processing
unit. A low-cost wearable version using available technologies could be implemented as
follows:

The core of this implementation is an inertial head orientation module called
InterTrax 2 (available from InterSense and designed for use with consumer HMDs such as
the Sony Glasstron and Olympus EyeTrek). Using tiny piezoelectric camcorder gyros, and
solid-state accelerometers and magnetometers, InterTrax 2 is designed as a single long
narrow circuit board 30 (Figure 1) to lie across the top of the head mounted display unit
along the brow line. It is 9 cm long, 2 cm wide, and 0.5 cm thick with all components, except
for a vertical gyro in the center, which sticks up 1 cm higher. It contains a low-power
embedded 16-bit processor that runs a simplified fixed-point version of the GEOS drift-
corrected orientation-tracking algorithm used in the IS-300. It communicates to the host
through a single USB connector through which it draws its power, and can be manufactured
for very low cost in volume. It is expected to achieve accuracy on the order of 2-3°, which is
sufficient because the accuracy with which the hand avatar follows the physical hand is
totally independent of orientation tracking accuracy.

Another component is an embedded ultrasonic rangefinder (perhaps based on the
Pegasus FreeD technology). As shown in Figure 1, three microphones 80, 82, 84 and their
ultrasonic pulse detection circuits together with the InterTrax 2 board are embedded in a rigid
plastic assembly designed to fit elegantly over the brow of an HMD. (In some embodiments,
all components would be embedded inside the HMD display unit while sharing the HMD's
cable 18, but in others, the added components are clipped on.) The InterTrax 2 processor has
enough unused timer inputs and processing bandwidth to timestamp the signals from the
three ultrasonic pulse detectors and relay this data down its USB link.
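With three microphones, the beacon's x, y, z position can be recovered from the three measured ranges by sphere intersection. The sketch below is a generic trilateration routine, not the FreeD product's actual algorithm; the microphone geometry and all names are illustrative assumptions:

```python
import numpy as np

def trilaterate(mics, ranges):
    """Solve for a beacon position from three microphone range measurements.

    mics:   3x3 array, one microphone position per row (head frame, metres)
    ranges: distances from the beacon to each microphone (metres)
    Returns the solution on the +z side of the microphone plane; three
    microphones alone leave a mirror ambiguity across that plane.
    """
    p1, p2, p3 = mics
    # Orthonormal frame with ex, ey in the microphone plane
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex @ (p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = ey @ (p3 - p1)
    r1, r2, r3 = ranges
    # Standard three-sphere intersection in the local frame
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return p1 + x * ex + y * ey + z * ez

# Microphones at the corners of a small bracket, beacon 0.4 m in front
mics = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
beacon = np.array([0.03, 0.02, 0.4])
ranges = np.linalg.norm(mics - beacon, axis=1)
est = trilaterate(mics, ranges)
```

A head-mounted receiver can resolve the mirror ambiguity by always taking the solution in front of the headset, since the hand is only tracked there.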
The ultrasonic tracking technology can be modified to take advantage of the very
short range requirements. First, ultrasonic frequency may be increased from 40 KHz to a
higher frequency. This increases the attenuation in air, and virtually eliminates reverberation
and interference between nearby users. Second, the system can take advantage of the much
reduced reverberation and the short time-of-flight to increase the update rate of tracking to,
say, 240 Hz, thus allowing the system to average 4 position samples for each 60 Hz graphics
update, or track up to 4 beacons at 60 Hz. To calculate the resolution that this would yield in
various parts of the tracking volume we calculated the Geometric Dilution of Precision
(GDOP) throughout the tracking volume given the intended geometry of the microphone
mounts on the headset. The intended headset geometry, tracking range and optical field of
view are illustrated superimposed on an isogram of a vertical slice through the GDOP data in
Figure 3. The plane of the microphones is angled downward 45° to ensure that the system has
tracking coverage for hands in the lap. The resolution at any point in space is the range
measurement resolution (about 0.1 mm for short range ultrasonic measurements using 40
KHz) multiplied by the GDOP value, divided by 2 as a result of the 4X oversampling and
averaging. Thus the expected resolution is approximately 0.5 mm at a distance of 400 mm
away from the headset.
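The resolution arithmetic can be restated compactly (a sketch; the GDOP value of 10 is inferred from the quoted 0.1 mm and 0.5 mm figures at 400 mm, not read from Figure 3):

```python
import math

def position_resolution(range_res_mm, gdop, n_samples):
    """Position resolution = range resolution x GDOP, with averaging gain.

    Averaging n independent range samples reduces noise by sqrt(n); the 4x
    oversampling at 240 Hz per 60 Hz graphics frame gives a factor of 2.
    """
    return range_res_mm * gdop / math.sqrt(n_samples)

# 0.1 mm ranging resolution, GDOP of about 10 at 400 mm, 4 samples averaged
res = position_resolution(0.1, 10.0, 4)  # about 0.5 mm
```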
`
A goal of a wearable computer is to keep the user's hands free to perform tasks. For
this reason, the system uses a wireless 3-DOF ring pointer for interaction. The FreeD ring-
mouse previously described is approximately the right size. In some implementations of the
system, the tracker will need to be triggered by a unique IR code from the headset, so that
multiple beacons can be tracked.
`
In interactive visualization and design (IVD) and many other VR applications, a pen-
style input device may be more useful. An implementation could use a wireless 5-DOF pen
using the same basic technology as the 3-DOF ring pointer, but employing two emitters that
are activated in an alternating sequence. A compact omni-directional pen could be
implemented using cylindrical radiating ultrasonic transducers that have been developed by
Virtual Ink (Boston, MA), mounted at the ends of a cylindrical electronics unit
approximately the size of a normal pen, with two mouse buttons.

An additional device that could be included in the system and whose applications are
discussed below is a small wireless anchor beacon that can be easily stuck to any surface.
Ultrasonic beacons from InterSense are of suitable size and functionality.
`
Portable VR Application

Object Selection and Manipulation Exploiting Proprioception

M. Mine, F. Brooks, and C. Sequin (Moving Objects in Space: Exploiting
Proprioception in Virtual Environment Interaction. In SIGGRAPH 97 Conference
Proceedings, ACM Annual Conference Series, August 1997) have discussed the benefits of
designing virtual environment interaction techniques that exploit our proprioceptive sense of
the relative pose of our head, hands and body. A variety of techniques were presented, such
as direct manipulation of objects within arm's reach, scaled-world grab, hiding tools and
menus on the user's body, and body-relative gestures.

Implementations of the invention have advantages over conventional world-frame
tracking systems for implementing these techniques effectively. With conventional trackers,
any error in head orientation tracking will cause significant mismatch between the visual
representation of the virtual hand and the felt position of the real hand, making it difficult to
accurately activate hidden menus while the virtual hand is not in view. With implementations
of the invention, the head orientation accuracy is immaterial and visual-proprioceptive match
will be good to the accuracy of the ultrasonic tracker, typically 1-2 mm.

Locomotion & View Control Tricks

This section describes a few techniques to permit user locomotion and view control.
Flying and Scaled-World Grab

The usual navigation interface device in fly-through virtual environments is a
joystick. This is appropriate for a flight simulator, but reduces one's sense of presence in
terrestrial environments, where turning one's body toward the destination is more instinctive
than turning the world until the destination is in front. Implementations of the invention
support this more immersive type of flying. No matter how one turns, if she raises a hand in
front of her it will be trackable, and can be used to control flight speed and direction. Better
yet, she can use two-handed flying, which can be performed with the arms in a relaxed
position and allows backwards motion, or the scaled-world grab method to reach out to a
distant object and pull oneself to it in one motion.
`
Walking Using Head Accelerometers as a Pedometer

For exploratory walk-throughs, the sense of presence is greatest for walking,
somewhat reduced for walking-in-place, and much further reduced for flying. M. Slater, A.
Steed and M. Usoh (The Virtual Treadmill: A Naturalistic Metaphor for Navigation in
Immersive Virtual Environments. In First Eurographics Workshop on Virtual Reality, M.
Goebel Ed. 1993), and M. Slater, M. Usoh and A. Steed (Steps and Ladders in Virtual
Reality. In Proc. Virtual Reality Software & Technology 94, G. Singh, S. K. Feiner, and D.
Thalmann, Eds. Singapore: World Scientific, pages 45-54, August 1994) have described a
"virtual treadmill" technique in which a neural network is trained to recognize the bouncing
pattern of a position tracker on an HMD, and thus control virtual motion. Inertial head-
orientation trackers do not normally output the position obtained by double integrating the
accelerometers, because it drifts too much to be useful, but it seems reasonable that pattern
analysis of the acceleration signals would produce good results.
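As a rough sketch of what such pattern analysis might look like, the toy step detector below counts upward threshold crossings of the vertical acceleration signal; the thresholds and the synthetic signal are assumptions for illustration, not the neural-network approach of the cited work:

```python
def count_steps(accel_z, threshold=1.5, refractory=15):
    """Count steps as upward threshold crossings of vertical acceleration.

    accel_z:    vertical acceleration samples (gravity removed), m/s^2
    threshold:  crossing level taken to mark a footfall impact
    refractory: minimum samples between counted steps (debounce)
    """
    steps, last = 0, -refractory
    for k, a in enumerate(accel_z):
        if a > threshold and (k - last) >= refractory:
            steps += 1
            last = k
    return steps

# Synthetic signal: three impact spikes separated by quiet periods
signal = [0.0] * 20 + [2.0] + [0.0] * 20 + [2.2] + [0.0] * 20 + [1.9] + [0.0] * 20
n = count_steps(signal)
```

Each detected step would then advance the assumed world-frame head position by one stride length in the current heading.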
Head-Motion Parallax Using Anchor Beacon

When working with close objects, head motion parallax is an important visual cue. It
can be achieved with the tracking system of the invention on demand by using a trick.
Normally, the system uses the 3-DOF position vector from the user's head to the hand-
mounted beacon to track the position of the hand relative to the head, maintaining the head
location fixed. When desired, the user may hold the hand still (say on a desk), and push a
button to reverse this process, so that the tracker driver interprets the negative of the
measured vector (in world frame) as a position update of the head relative to the stationary
hand. He can then move his head back and forth to look around an object, and release the
button when his viewpoint is repositioned for optimal viewing. After flying or walking to an
area, this may be a convenient way of making finely controlled viewpoint adjustments using
natural neck motion. Note that this operation is equivalent to grabbing the world and moving
it around with one's hand, which may be a more convenient maneuver while standing.
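The button-press reversal can be sketched as a small state update in the tracker driver (names and frames are illustrative assumptions):

```python
import numpy as np

def update_poses(head_pos, v_world, button_held, hand_anchor=None):
    """One tracker-driver update for the head-motion-parallax trick.

    v_world: head-to-hand vector already rotated into the world frame.
    Normally the head is held fixed and the hand moves; while the button
    is held, the hand is treated as the fixed anchor and the negated
    vector updates the head position instead.
    """
    if button_held:
        # Hand anchored: head = anchor minus the head-to-hand vector
        head_pos = hand_anchor - v_world
        hand_pos = hand_anchor
    else:
        hand_pos = head_pos + v_world
    return head_pos, hand_pos

# Hand resting on the anchor point; user leans the head 10 cm to one side
anchor = np.array([0.5, 0.0, 1.2])
head, hand = update_poses(np.array([0.0, 0.0, 1.7]),
                          np.array([0.5, 0.1, -0.5]),
                          button_held=True, hand_anchor=anchor)
```

Releasing the button returns to the normal mode, with the head position left wherever the parallax motion put it.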
`
Implementations of the invention can
