United States Patent [19]
Horton et al.

[54] METHOD AND APPARATUS FOR DETERMINING POSITION AND ORIENTATION OF A MOVEABLE OBJECT USING ACCELEROMETERS

[75] Inventors: Mike A. Horton, Berkeley; A. Richard Newton, Woodside, both of Calif.

[73] Assignee: Crossbow Technology, Inc., San Jose, Calif.

[21] Appl. No.: 184,583
[22] Filed: Jan. 21, 1994
[51] Int. Cl.6 ................ G09G 3/02
[52] U.S. Cl. ................ 364/516; 364/517; 364/578; 364/410; 364/449.1; 340/988; 351/209
[58] Field of Search ........ 364/410, 449, 453, 516, 517, 578; 351/210, 209; 340/988, 989, 990

[56] References Cited

U.S. PATENT DOCUMENTS

3,983,474   9/1976  Kuipers ............... 324/43 R
4,017,858   4/1977  Kuipers ............... 343/100 R
4,849,692   7/1989  Blood ................. 324/208
4,852,988   8/1989  Velez et al. .......... 351/210 R
4,945,305   7/1990  Blood ................. 324/207
4,988,981   6/1991  Zimmermann et al. ..... 340/709
5,072,218  12/1991  Spero et al. .......... 340/980
5,245,537   9/1993  Barber ................ 364/453 X
5,280,265   1/1994  Kramer et al. ......... 338/210
5,290,964   3/1994  Hiyoshi et al. ........ 84/600
5,307,072   4/1994  Jones, Jr. ............ 342/147
5,373,857  12/1994  Travers et al. ........ 128/782
5,422,653   6/1995  Maguire et al. ........ 345/9

OTHER PUBLICATIONS

Multi-Receivers/Transmitters Tracking Device, A Flock of Birds Product Specification by Ascension Tech. Corp.; Nov. 1992.
3Space Fastrak Product Specification by Polhemus; Jul. 1992.
US005615132A

[11] Patent Number: 5,615,132
[45] Date of Patent: Mar. 25, 1997

Friedmann, Martin, Starner, Thad and Pentland, Alex, "Synchronization in Virtual Realities," 1992 PRESENCE, vol. 1, No. 1, pp. 139-144.
J. A. Adam, "Virtual Reality is for Real," IEEE Spectrum, Oct. 1993, pp. 22-29.
T. A. DeFanti et al., "A 'Room' with a 'View'," IEEE Spectrum, Oct. 1993, pp. 30-33.
Analog Devices product specification for Model ADXL50, "Monolithic Accelerometer with Signal Conditioning," pp. 1-16, Jun. 1993.
R. A. Quinnell, "Software Simplifies Virtual-World Design," EDN-Technology Update, Nov. 25, 1993, pp. 47-54.
North Atlantic Treaty Organization, AGARD Lecture Series No. 133, "Advances in Strapdown Inertial Systems," pp. 3-1 through 3-29; May 1984.
(List continued on next page.)

Primary Examiner-James P. Trammell
Assistant Examiner-Kamini Shah
Attorney, Agent, or Firm-Albert C. Smith
[57] ABSTRACT

A three-dimensional position and orientation tracking system uses accelerometers to measure acceleration of a moveable object (e.g., a head-mounted display unit or a data glove). A tracking processor generates both position and orientation information on the object relative to a simulation environment as a function of the acceleration data. In one embodiment, a simplified radar-based tracking system is disposed relative to the object and periodically provides additional tracking data on the object to the tracking processor. The tracking processor uses the additional data to correct the position and orientation information using a feedback filter process. The position and orientation information signals generated can be used, for example, in a simulation or virtual reality application. Position and orientation information is received by a simulation processor relative to the object. The simulation processor modifies a simulation environment as a function of the position and orientation information received. Modified simulation environment information (e.g., video and/or audio information) is then presented to a user.

21 Claims, 9 Drawing Sheets
[Front-page figure: inertial body or helmet frame shown relative to a fixed reference or level frame.]

META 1010
META V. THALES
OTHER PUBLICATIONS (continued)

North Atlantic Treaty Organization, AGARD Lecture Series No. 82, "Practical Aspects of Kalman Filtering Implementation," pp. 2-1 through 2-11; 1976.
A. Gelb et al., "Applied Optimal Estimation," The M.I.T. Press, pp. 50-143, 1974.
Shetty et al., "Image-sensor-based target maneuver detection," Optical Engineering, vol. 32, No. 11, pp. 2735-2740, Nov. 1993.
Guedry et al., "The dynamics of spatial orientation during complex and changing linear and angular acceleration," Journal of Vestibular Research: Equilibrium and Orientation, vol. 2, No. 4, pp. 259-283, Nov. 1992.
[Sheet 1 of 9, FIGURE 1: block diagram. Mirrored OCR labels indicate accelerometers feeding low-pass filters, a multiplexer, and an A/D converter; acceleration data flows to a tracking processor with memory, which outputs position and orientation tracking information.]
[Sheet 2 of 9: drawing.]
[Sheet 3 of 9: drawing.]
[Sheet 4 of 9: drawing.]

[Sheet 5 of 9, FIGURE 4: flow chart of main loop 41. Legible boxes: read acceleration data from accelerometers (44); apply accelerometer bias, scaling and corrections to acceleration data; remove gravity and centripetal components from acceleration data; calculate position and orientation information (62): convert acceleration data to linear body and angular components; update direction cosines matrix (76); integrate angular accelerations to angular velocities; apply angular velocity corrections; integrate angular velocities to angles (roll, pitch, yaw orientation); apply angle corrections; convert linear body accelerations to level frame accelerations; integrate level frame accelerations to velocities (82); apply velocity corrections (84); integrate level frame velocities to positions (x, y, z coordinates) (86); apply position corrections (88). Correction inputs (120): acceleration, angular velocity, angle, velocity, and position corrections; accelerometer bias and scaling; accelerometer mounting data. Output: position and orientation information (130).]
[Sheet 6 of 9, FIGURE 5: flow chart of feedback loop 89. Legible boxes: read measurements from external tracking system (90); update state transition matrix; update process noise matrix; compute covariance matrix; update Kalman gain matrix; update covariance matrix; compute correction factors; output corrections (120).]
[Sheet 7 of 9: drawing.]
[Sheet 8 of 9: drawing.]
[Sheet 9 of 9: drawing. Legible mirrored labels include VIDEO, COMPRESSION, and I/O DRIVERS.]
METHOD AND APPARATUS FOR DETERMINING POSITION AND ORIENTATION OF A MOVEABLE OBJECT USING ACCELEROMETERS

BACKGROUND OF THE INVENTION

1. Field of the Invention

The invention relates to tracking systems, particularly to such systems that determine position and orientation of an object in a limited volume using accelerometers.

2. Description of the Background Art

In specialized computer applications involving virtual reality or "immersive simulations," a computer or processing facility providing the simulation must continuously determine, with a high degree of accuracy, the position and orientation of a user (or part of the user, e.g., head or hand) relative to a "virtual world" or simulated environment in which the user operates. The position and orientation data must be updated regularly to provide a realistic simulation. In addition, the data must be collected in a manner that does not interfere significantly with the user's natural movement. Thus, physical connection to a stationary object or heavy and/or bulky tracking instruments attached to the user are unsuitable. In order to be integrated easily with a head-mounted display (HMD), data glove, or other peripheral device for use in a virtual reality application, a tracking system must be small and lightweight.

A mechanical gantry containing sensors is used to track movement by physically connecting the user to a fixed object. However, this system is cumbersome, provides an unrealistic simulation due to interference from the gantry, and requires significant installation effort.

A simplified radar or sonar system having a transmitter and a receiver mounted on the user is used to determine position of an object. However, this type of system is sensitive to noise in the environment, tends to have high-frequency jitter between position measurements, is subject to interference from other objects in the simulation (e.g., a hand or other users), is generally bulky, requires multiple transmitters and receivers, and may be quite complex and expensive. Such systems are embodied in products available commercially from Polhemus, Logitech, and Ascension Technology.

Additionally, conventional navigation systems for navigating over large areas of land or airspace, such as those for planes, cars, or missiles, use devices such as gyroscopes that are not suitable for attachment to a human user because of their size and weight. In addition, these devices are typically designed to track over several hundred kilometers and several days, and are accurate only to several meters.

Two-dimensional navigation systems using angular accelerometers (a type of gyroscope), such as that used in Barber U.S. Pat. No. 5,245,537, are not suitable for virtual reality applications requiring three position and three orientation measurements for realistic simulation. The system described in Barber does not provide a highly accurate measurement (as required by virtual reality applications) because it contains no mechanism for correcting errors that are inherent in the system (e.g., bias, calibration errors, floating, and positional errors). Left uncorrected, these errors typically increase in size as a function of time of use and/or volume traversed, thereby resulting in a significant degradation in system performance. Moreover, angular accelerometers are not easily integrated into electronic componentry, thus the resulting system is generally greater in size and weight and is not suitable for attachment to a human user. In addition, a much higher update rate (e.g., 50-300 Hz) than that used in Barber is required for realistic virtual reality simulations.

Thus, there is a need for a small, lightweight, highly integratable navigational system that can be easily attached to a human user without significant interference to natural body movement. Furthermore, there is a need for a navigational system that is highly accurate over a long period of time and operates at a high update rate in order to provide a realistic virtual reality simulation. The prior art has failed to address these needs adequately.
SUMMARY OF THE INVENTION

The invention is a three-dimensional position and orientation tracking system that uses accelerometers to measure acceleration in the six degrees of freedom (e.g., x, y, z position coordinates and roll, pitch, yaw orientation components) of a moveable object (e.g., a head-mounted display unit, or a wristband/data glove). Conventional accelerometers, as used herein, measure acceleration in one linear direction (e.g., x, y, z, or combination thereof, coordinate axis), but may report acceleration data as a nonlinear function of, for example, acceleration or time. Acceleration data on the moveable object is periodically (e.g., 50-300 Hz) received by a tracking processor. The tracking processor generates both position and orientation information on the object relative to a simulation environment as a function of the acceleration data. Accelerometers are easily integrated into electronic componentry (e.g., using silicon chip technology). Thus, the tracking system of the present invention can be embodied in a small, lightweight unit that is easily attached to a human user without significant interference to natural body movements.

In one embodiment, a simplified radar-based tracking system, which is disposed relative to the object, periodically (e.g., 1 Hz) provides additional tracking data on the object to the tracking processor. This data may be provided by, for example, infrared light and received by the tracking processor via an infrared sensor. The tracking processor uses the additional data to correct the position, orientation, and/or velocity information generated from the accelerometers, using a feedback or Kalman filter process. This correction feedback loop allows the invention to function accurately over a long period of time (e.g., several hours) without adjustment. Alternatively, if the user is to remain seated or confined to a limited volume during simulation, pre-defined position data from the simulation environment software specification (e.g., mean position of user and average variance) can be used in the correction feedback process.

The position and orientation information signals generated can be used, for example, in a simulation or virtual reality application. Position and orientation information is received by a simulation processor relative to the object (e.g., via infrared transceiver). The simulation processor modifies a simulation environment operating on the simulation processor as a function of the position and orientation information received. Modified simulation environment information (e.g., video, audio, tactile, and/or olfactory information) is transmitted back to the user (e.g., via infrared transceiver). Other possible applications of the invention include guidance systems for the blind, robotic guidance systems, human tracking systems (e.g., prisoners), object tracking systems (e.g., parcel, package, and/or auto), and computer input devices for the handicapped (e.g., head- or hand-controlled input devices).
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a simplified block diagram illustrating the components used in the tracking system of the present invention.
FIG. 2 is a graphical drawing showing one embodiment of the tracking system with placement of accelerometers 1-6 in FIG. 1 on two mounting points.
FIG. 2A is a graphical drawing showing object 300 of FIG. 2 after movement.
FIG. 3 is a simplified flow chart depicting one embodiment of the tracking system of the present invention.
FIG. 4 is a flow chart depicting main loop 41 in FIG. 3.
FIG. 5 is a flowchart depicting feedback loop 89 in FIG. 3.
FIG. 6 is a simplified block diagram of a virtual reality invention using the tracking system of the present invention.
FIG. 7 is a block diagram of object 300 in FIG. 6.
FIG. 8 is a block diagram of simulation environment 180 in FIG. 6.
DESCRIPTION OF THE PREFERRED EMBODIMENT

FIG. 1 is a simplified block diagram illustrating the components used in the tracking system invention. Conventional accelerometers 1-6 measure acceleration in one linear direction (e.g., x, y, z, or combination thereof, coordinate direction), but may report acceleration data, for example, as a nonlinear function of time (e.g., v(t), where v is voltage) or acceleration. Accelerometers 1-6 are capable of measuring accelerations of at least 2 G. This allows for 1 G due to gravity and 1 G of movement acceleration. In the preferred embodiment, accelerometers should be shock-protected or resistant so that they are not damaged if dropped. To ensure high accuracy, a high signal-to-noise ratio (SNR) is desirable; a lower bound of approximately 10^2, or 40 dB, is preferred.

In one embodiment, six accelerometers 1-6 are used to track six degrees of freedom of an object in three dimensions (e.g., x, y, z position coordinates and roll, pitch, yaw orientation components). More than six accelerometers can be used to obtain a greater degree of accuracy (e.g., by averaging or interpolation) and/or redundancy. Alternatively, three dual-axis or two triaxial accelerometers can be employed to track the six degrees of freedom of an object in three dimensions. Fewer accelerometers (e.g., four) could be used to track the object, for example, in a two-dimensional space, or in a one-dimensional space (e.g., two accelerometers). Groups or clusters of accelerometers can also be used to track a plurality of objects. For example, the tracking invention could be implemented on an HMD and two data gloves to track head and hand movement of a user. More tracking systems can be used along the arm to track elbow and shoulder movement. Tracking systems on each finger could also be used to track finger movement. Similarly, two head-mounted display (HMD) units with six accelerometers each could be used to track the three-dimensional position and orientation of two interactive users in a virtual reality environment.

Accelerometers 1-6 are conventional accelerometers such as the ADXL-2 manufactured by Analog Devices Corp. of Boston, Mass. Due to the nature of human movement (typically frequency components are between 0-50 Hz), for example, there is generally little information in the high frequency range, and this information should be removed to reduce noise. Accordingly, in the preferred embodiment, accelerometers are bandlimited, i.e., the highest frequency from accelerometers 1-6 is limited to, for example, 50-300 Hz. This bandwidth can be achieved by coupling accelerometers 1-6 to low pass filters (LPFs) 7-12, respectively, or by using low bandwidth accelerometers. In a preferred embodiment, accelerometers 1-6 are small and easily integrated with other electronic components, e.g., small micro-machined accelerometers (bulk or surface micro-machined).

Output from LPFs 7-12 is used as input to multiplexer 20, such as the ADG508A available commercially from Analog Devices. Analog-to-digital (A/D) converter 30, such as the AD1380 available commercially from Analog Devices, is used to convert the analog acceleration signal from LPFs 7-12 to a digital signal. Some accelerometers can provide digital data directly (see, e.g., ARPA grant #BAA93-06 to University of California at Berkeley and Analog Devices); thus A/D converter 30 is not necessary. Alternatively, a voltage-to-frequency converter and a frequency counter circuit could be used to obtain a digital value. The components of the present invention comprising accelerometers 1-6, LPFs 7-12, multiplexer 20, and A/D converter 30 are all highly integratable (unlike gyroscopes, angular accelerometers, and other tracking systems). Thus, according to the present invention, for example, accelerometers 1-6 (or a subset thereof), multiplexer 20, A/D converter 30, and tracking processor 40 could all be co-located on a single integrated computer chip, the result being a small, lightweight navigational system suitable for attachment to human users using a virtual reality application.

Output from A/D converter 30 is acceleration data 35. Acceleration data 35 may be reported, for example, as a nonlinear function of time (e.g., v(t), where v is volts). Acceleration data 35 is input to tracking processor 40. Tracking processor 40 can be, for example, a standard computer microprocessor such as an INTEL 486, Motorola 68000, or Pentium-based microprocessor. Tracking processor 40 is discussed in further detail with reference to FIGS. 3-5 below. Memory unit 37 is coupled to tracking processor 40 and is used for storing program instruction steps and storing data for execution by tracking processor 40. Memory unit 37 is a conventional computer memory unit such as a magnetic hard disk storage unit or random access memory (RAM) on a chip. Output from tracking processor 40 is position and orientation information 130.

In one embodiment, position and orientation information 130 is transmitted in a data signal consisting of six elements: three position elements (e.g., x, y, z) and three orientation elements (e.g., roll, pitch, yaw). Each element is two bytes long. Each value or element is in twos complement format; thus the decimal values -32,768 to 32,767 are covered. Measurements are the decimal value divided by 100. Thus, measurements from -327.68 to 327.67 (e.g., degrees, cm, inches, feet, or other angle or linear measurements) can be transmitted. Information 130 is transmitted on a standard serial interface of three lines (transmit, receive, and ground) with standard 8-bit words, no parity, and 1 stop bit. A mode of operation can be specified as follows:

R - request mode (default). Position and orientation is transmitted upon request.
F - free running mode. Position and orientation is transmitted as calculated.
M - mode change. Informs tracker that the mode in which position and orientation is transmitted (R or F) will change.
G - get data. Tracker will transmit position and orientation information 130.
H - halt. Turns off tracking system.
C - calibrate. Runs or reruns the initialization routine 48.

Alternatively, a file can be created with records of the same format described above.
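The record layout just described (six 16-bit twos-complement elements, each one hundred times the measurement) can be sketched in a few lines. The byte order and the element ordering are not specified in the text, so the little-endian, x-y-z-roll-pitch-yaw layout below is an assumption made only for illustration:

```python
import struct

# Assumptions (not stated in the text): little-endian byte order and
# element order x, y, z, roll, pitch, yaw. Each element is a signed
# 16-bit twos-complement integer equal to the measurement times 100.

def encode_record(x, y, z, roll, pitch, yaw):
    """Pack six measurements into the 12-byte record described above."""
    scaled = [round(v * 100) for v in (x, y, z, roll, pitch, yaw)]
    for s in scaled:
        if not -32768 <= s <= 32767:
            raise ValueError("measurement outside the -327.68..327.67 range")
    return struct.pack("<6h", *scaled)

def decode_record(data):
    """Unpack a 12-byte record back into measurement units."""
    return [v / 100 for v in struct.unpack("<6h", data)]

rec = encode_record(12.34, -5.6, 0.0, 90.0, -45.5, 180.0)
assert len(rec) == 12
assert decode_record(rec) == [12.34, -5.6, 0.0, 90.0, -45.5, 180.0]
```

A record received over the three-line serial interface would be decoded the same way once 12 bytes have been accumulated.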
In FIG. 2, two accelerometer mounting points 301 and 302 are located on object 300 (e.g., two locations on a head-mounted display (HMD) unit, or two locations on the wrist of a data glove). Object 300 may be, for example, a head-mounted display unit, a wristband/data glove, or other similar device attached to a user to monitor the user's movement. In this example, each mounting point 301, 302 contains three accelerometers (e.g., accelerometers 1-3 and 4-6, respectively). Vectors r1-r6 (ri) are the vectors from the origin of object 300 (e.g., head of user) to each accelerometer 1-6, respectively, measured in body frame coordinates Xb, Yb, Zb (e.g., coordinates in reference to object 300). In one embodiment, accelerometers 1-6, and thus vectors r1-r6, are fixed when accelerometers 1-6 are mounted. However, the location of accelerometers 1-6 could be altered during the use of the tracking system and vectors r1-r6 updated accordingly. As shown in FIG. 2, r1=r2=r3 and r4=r5=r6 because there are only two mounting points 301, 302.

Vectors u1-u6 (ui) represent the sensitive direction of each accelerometer 1-6, respectively, measured in body frame coordinates Xb, Yb, Zb. Similarly, sensitive direction vectors u1-u6 are generally fixed when accelerometers 1-6 are mounted but could be altered and updated accordingly.

Position and orientation information 130 is reported in a fixed, or level, frame reference defined by XL, YL, ZL. The coordinate system used in a virtual reality program or computer simulation environment 180, for example, is a level frame reference. After movement of object 300, body frame references are changed as shown in FIG. 2A.
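As an illustration of the mounting geometry just described, a hypothetical two-point layout can be written down concretely. The positions and axes below are invented for the sketch and are not taken from the patent drawings:

```python
# Hypothetical mounting geometry in the spirit of FIG. 2: two mounting
# points 301 and 302 on the body frame, each carrying three accelerometers
# with mutually orthogonal sensitive directions. All numbers are invented.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

p301 = (0.10, 0.0, 0.0)    # body-frame position of mounting point 301 (metres)
p302 = (-0.10, 0.0, 0.0)   # body-frame position of mounting point 302

# r1..r6: vectors from the object origin to accelerometers 1-6
r = [p301, p301, p301, p302, p302, p302]

# u1..u6: sensitive directions of accelerometers 1-6 (unit vectors)
u = [(1, 0, 0), (0, 1, 0), (0, 0, 1),
     (1, 0, 0), (0, 1, 0), (0, 0, 1)]

# As in FIG. 2: r1 = r2 = r3 and r4 = r5 = r6 (two mounting points only)
assert r[0] == r[1] == r[2] and r[3] == r[4] == r[5]
# u1, u2, u3 are orthogonal, and likewise u4, u5, u6
assert dot(u[0], u[1]) == 0 and dot(u[0], u[2]) == 0 and dot(u[1], u[2]) == 0
```

If the accelerometers were repositioned during use, the r and u lists would simply be updated, as the text allows.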
Accelerometer mounting information 46 (FIG. 3) comprises the information in the matrix J (described in the program flow below) defined by:

    J = inv | u1^T   (r1 x u1)^T |
            | u2^T   (r2 x u2)^T |
            |  .          .      |
            |  .          .      |
            | u6^T   (r6 x u6)^T |

The matrix J resolves the net linear accelerations into linear body and angular components. Accelerometers 1-6 must be mounted (e.g., 301, 302) such that the matrix J is not singular. For example, accelerometers 1-6 cannot all be placed in one position ri. Similarly, ui, representing the acceleration sensitive directions, must be non-zero for each acceleration direction Xb, Yb, Zb (e.g., the X component of every ui cannot always be zero). In one embodiment, u1, u2, and u3 are orthogonal and u4, u5, and u6 are orthogonal.

FIG. 3 shows a simplified flow chart of tracking system 15 as implemented on tracking processor 40. Accelerometer initialization and calibration 48 is initiated prior to each system use to correct for the bias and scaling factors of the accelerometers due to such factors as time, temperature, mechanical jarring, and the like. Accelerometers 1-6 are initialized 48 by loading the values of the accelerometer biases which are pre-specified at the factory or obtained from accelerometer specifications. Calibration 48 of accelerometers 1-6 is accomplished by running tracking system 15 while the object to be tracked 300 (e.g., head-mounted display (HMD) on a user) remains stationary. Position and orientation 130 are calculated according to the present invention as specified herein. Feedback filter loop 89 (discussed below; see also Digital and Kalman Filtering by S. M. Bozic, John Wiley and Sons, N.Y.) compares calculated position and/or orientation measurements 130 with the known position and/or orientation measurement (known to be stationary) and uses discrepancies between the two measurements to solve for bias and scaling factors 50 for each accelerometer 1-6. Tracking system 15 is operated such that main loop 41 is executed multiple times (approximately 15-20) for a successful calibration 48. Total calibration time is dependent on tracking processor 40 speed. In one embodiment, tracking system 15 alerts the user when calibration 48 is complete. Notification is through, for example, a small LED on an HMD, visual notification on a display, or any other suitable means. For more accurate initial bias and scale factors 50, calibration 48 is repeated with object 300 in several different orientations. Initialization 48 also includes resetting correction factors 120 (pe, ve, qe, we) to zero or their reference values. Reference values may be dictated, for example, by simulation environment 180.

In main loop 41, tracking processor 40 reads 44 acceleration data 35 from accelerometers 1-6 and calculates 60 position and orientation information 130. Calculation 60 is discussed in more detail with reference to FIG. 4 below. In operation, main loop 41 is repeated at 50-300 Hz or faster depending on hardware capability (e.g., capability of tracking processor 40 or other components in FIG. 1). A fast loop rate 41 ensures that simulation environment 180 is updated with current position and orientation information 130.

Feedback loop 89 (also known as a Kalman filter) comprises reading tracking measurements 90 (e.g., position, orientation, and/or velocity) from external tracking system 170 (FIGS. 6, 7) disposed relative to object 300 and generating 100 correction factors 120. Generation 100 of the correction factors 120 is described in more detail with reference to FIG. 5 below. Correction factors 120 are used in calculation 60 of position and orientation information 130.

If the volume in which object 300 moves is relatively large compared to the size of the object (e.g., tracking an HMD in a 5x5 meter room) or the system is used for long periods of time (e.g., over 15 minutes), then external measurements 90 from, for example, external tracking system 170 are used for feedback. External tracking system 170 is a conventional tracking system using, for example, radar, sonar, infrared, optical, acoustic/ultrasonic, or magnetic tracking technology. External tracking data including position, orientation, and/or velocity measurements 90 are provided in the form of a 1- to 2-dimensional update or a full 3-dimensional, 6-degree-of-freedom update. Basically, feedback loop 89 will use any additional tracking data about object 300 to correct position and orientation information 130; more tracking data will provide a better correction.

Alternatively, if object 300 (e.g., HMD) is confined to a small volume (e.g., seated), then certain "software specification" information (not shown) in simulation environment 180 can be used in place of measurements 90 as input to generation 100 of correction factors 120. For example, the mean position and the estimated variance of object 300 in a limited volume can be used for measurements 90. The variance can be constant or change over time. The variance reflects the uncertainty, or size, of the volume to which the object 300, or user, is confined.
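A drastically simplified, one-dimensional sketch can illustrate the feedback idea of loop 89: dead-reckoned position drifts as uncorrected bias is integrated, and an occasional external measurement pulls the estimate back. The constant gain K below is a stand-in for the computed Kalman gain matrix of FIG. 5, and all numbers are invented for illustration:

```python
# 1-D sketch of feedback loop 89: dead-reckoned position drifts due to a
# constant bias error; a low-rate external measurement (measurements 90)
# is blended in to produce a correction (correction factors 120).
# K is a stand-in constant, not a gain computed from covariance matrices.

K = 0.5          # stand-in gain (a real Kalman filter computes this)
dt = 1.0 / 100   # 100 Hz main loop
bias = 0.02      # uncorrected drift rate in the velocity estimate (m/s)

true_pos = 0.0   # object is actually stationary
est_pos = 0.0
for step in range(1, 501):           # five seconds of main-loop iterations
    est_pos += bias * dt             # drift accumulates in the estimate
    if step % 100 == 0:              # 1 Hz external update (measurements 90)
        correction = K * (true_pos - est_pos)
        est_pos += correction        # apply position correction

drift_without_feedback = bias * dt * 500
assert abs(est_pos) < drift_without_feedback  # feedback bounds the error
```

Even this crude constant-gain version shows the qualitative behavior the text describes: the error no longer grows without bound, but settles at a level set by the drift rate, the update rate, and the gain.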
`

`

After incorporating correction factors 120 from feedback filter loop 89, the output of calculation 60 is position and orientation information 130. Position and orientation information 130 is used, for example, in a virtual reality program or simulation environment 180.

In FIG. 4, tracking processor 40 reads 44 acceleration data 35 from each accelerometer 1-6. Accelerometer bias and scaling factors 50 are applied 62 to acceleration data 44. Acceleration corrections 120 from feedback loop 89 are also applied 62 to acceleration data 44. Gravity and centripetal components of acceleration are also removed 64 from corrected acceleration data 62. Step 64 involves information from the prior output of the direction cosine matrix 76, mounting data (ri and ui) 46, and angular velocities 70. Modified acceleration data 64 is converted to linear body and angular components 66. (There is no designation of body, level, or reference frame for angular accelerations; they simply measure the angle between two axes.) Angular accelerations 66 are integrated to angular velocities 68. Angular velocity corrections 120 from feedback loop 89 are applied 70 to angular velocity data 68. Corrected angular velocities 70 are integrated to angles or orientation 72, for example, roll, pitch, and yaw. Angle corrections 120 from feedback loop 89 are applied 74 to corrected angle data 72. Thus, orientation information 130 is produced. The direction cosine matrix is updated 76 using a conventional direction cosine update routine. (See, for example, Paul G. Savage, "Strapdown Systems Algorithms," in Advances in Strapdown Inertial Systems, NATO Advisory Group for Aerospace Research and Development Lecture Series #133, 1984, pp. 3-1 to 3-30.)
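A first-order sketch of such a direction cosine update is shown below. Production strapdown routines like Savage's add higher-order terms and periodic re-orthonormalization; this minimal version, with invented numbers, is for illustration only:

```python
import math

# First-order direction cosine matrix update: C <- C (I + dt * [w x]),
# where [w x] is the skew-symmetric matrix of the body angular rate w.
# Real strapdown algorithms add higher-order terms and re-orthonormalize.

def skew(w):
    wx, wy, wz = w
    return [[0.0, -wz,  wy],
            [ wz, 0.0, -wx],
            [-wy,  wx, 0.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def dcm_update(c, w, dt):
    omega = skew(w)
    incr = [[(1.0 if i == j else 0.0) + dt * omega[i][j] for j in range(3)]
            for i in range(3)]
    return matmul(c, incr)

# Spin about the body z-axis at 1 rad/s for one second in small steps:
c = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
dt = 0.001
for _ in range(1000):
    c = dcm_update(c, (0.0, 0.0, 1.0), dt)

# c should approximate a rotation by 1 radian about z
assert abs(c[0][0] - math.cos(1.0)) < 0.01
assert abs(c[1][0] - math.sin(1.0)) < 0.01
```

The updated matrix is exactly what steps 64 and 80 consume: it rotates body-frame quantities into the level frame (and back) on every pass of main loop 41.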
Linear body accelerations 66 are converted to level frame or reference frame (e.g., simulation environment coordinates) accelerations 80. Level frame accelerations 80 are integrated to level frame velocities 82. Velocity corrections 120 from feedback loop 89 are applied 84 to level frame velocities 82. Corrected level frame velocities 84 are integrated to positions 86. Position corrections 120 from feedback loop 89 are applied 88 to positions 86. Thus, position information 130 is produced.

In a preferred embodiment, orientation is calculated (steps 68, 70, 72, 74, 130) and the direction cosines matrix is updated 76 before position is calculated (steps 80, 82, 84, 86, 88, 130). This control flow has the advantage that direction cosines matrix 76 is more current and accurate for the position calculation steps. Alternatively, the orientation calculation (steps 68, 70, 72, 74, 130) and the position calculation (steps 80, 82, 84, 86, 88, 130) can be processed in parallel. However, direction cosines matrix 76 will then reflect data from the previous loop 41, thus position calculation 130 may be less accurate.
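The chain of steps in FIG. 4 can be sketched end to end under simplifying assumptions: gravity and centripetal terms are taken as already removed (step 64), no feedback corrections are applied, integration is first-order, and the six accelerometer positions and directions below are invented for the sketch (and deliberately spread over three points, purely so that this example's J is nonsingular, as the text requires):

```python
# End-to-end numerical sketch of the FIG. 4 flow under hypothetical,
# simplified assumptions (see lead-in). Row i of M is [u_i, r_i x u_i];
# J = inv(M) resolves readings f into linear body acceleration a and
# angular acceleration alpha: [a, alpha] = J f.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def solve(m, f):
    """Gauss-Jordan elimination with partial pivoting: solve m x = f."""
    n = len(m)
    aug = [row[:] + [f[i]] for i, row in enumerate(m)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(n):
            if r != col and aug[r][col]:
                k = aug[r][col] / aug[col][col]
                aug[r] = [x - k * y for x, y in zip(aug[r], aug[col])]
    return [aug[i][n] / aug[i][i] for i in range(n)]

# Invented mounting: accelerometer positions r_i and directions u_i
r = [(0.1, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0, 0.1, 0), (0, 0, 0.1), (0, 0, 0.1)]
u = [(0, 1, 0), (0, 0, 1), (1, 0, 0), (0, 0, 1), (1, 0, 0), (0, 1, 0)]
M = [list(u[i]) + list(cross(r[i], u[i])) for i in range(6)]

a_true = (1.0, 2.0, 3.0)          # linear body acceleration
alpha_true = (0.1, -0.2, 0.3)     # angular acceleration
f = [sum(M[i][j] * (a_true + alpha_true)[j] for j in range(6))
     for i in range(6)]           # simulated (already-corrected) readings

x = solve(M, f)                   # resolve into components (step 66)
assert all(abs(x[j] - (a_true + alpha_true)[j]) < 1e-9 for j in range(6))

# Integrate accelerations to velocities to positions (steps 82, 86),
# treating body and level frames as aligned for this static-orientation case
dt = 0.01
vel = [0.0, 0.0, 0.0]
pos = [0.0, 0.0, 0.0]
for _ in range(100):              # one second of main-loop iterations
    vel = [vel[j] + x[j] * dt for j in range(3)]
    pos = [pos[j] + vel[j] * dt for j in range(3)]
assert abs(pos[0] - 0.5 * a_true[0]) < 0.01   # ~a/2 after 1 s, Euler error aside
```

In the full system, the level-frame conversion (step 80) would multiply by the current direction cosine matrix, and the correction factors 120 would be applied between each pair of integration steps.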
In a preferred embodiment, calculation 60 also performs an estimation of position and orientation 130 one "frame delay" into the future. The reason for this predictive step, for example, is that simulation environment 180 will take some time, t, to utilize position and orientation information 130 and modify the virtual reality program or simulation environment for presentation to the user (e.g., draw the next frame in a video application). Delay time, t, is dependent upon the frame update
