US005819206A

United States Patent [19]                     [11] Patent Number:   5,819,206
Horton et al.                                 [45] Date of Patent:  *Oct. 6, 1998

[54] METHOD AND APPARATUS FOR DETERMINING POSITION AND ORIENTATION
     OF A MOVEABLE OBJECT USING ACCELEROMETERS

[75] Inventors: Mike A. Horton, Berkeley; A. Richard Newton, Woodside,
     both of Calif.

[73] Assignee: Crossbow Technology, Inc., San Jose, Calif.

[*]  Notice: The term of this patent shall not extend beyond the expiration
     date of Pat. No. 5,615,132.

[21] Appl. No.: 820,837

[22] Filed: Mar. 20, 1997

     Related U.S. Application Data

[63] Continuation of Ser. No. 08/184,583, Jan. 31, 1994, Pat. No. 5,615,132.

[51] Int. Cl.6 ................................ G09G 3/02
[52] U.S. Cl. ............ 702/150; 703/702; 367/136; 701/207
[58] Field of Search ............ 364/559, 419, 449, 516, 517, 453, 578;
     340/988, 989, 990; 351/210, 209; 395/502, 500; 434/29, 43, 49

[56] References Cited

     U.S. PATENT DOCUMENTS

     5,072,218   12/1991   Spero et al. ............. 340/980
     5,245,537    9/1993   Barber ................... 364/453 X
     5,280,265    1/1994   Kramer et al. ............ 338/2
     5,290,964    3/1994   Hitoshi et al.
     5,307,072    4/1994   Jones, Jr. ............... 342/147
     5,373,857   12/1994   Travers et al. ........... 128/782
     5,422,633    6/1995   Maguire et al. ........... 345/9
     5,583,875   12/1996   Weiss .................... 371/28
     5,615,132    3/1997   Horton et al. ............ 364/516

     OTHER PUBLICATIONS

Shelly et al., Image-sensor-based target maneuver detection, Optical Engineering, vol. 32, no. 11, pp. 2735-2740, Nov. 1993.
Guedry et al., The dynamics of spatial orientation during complex and changing linear and angular acceleration, Journal of Vestibular Research: Equilibrium and Orientation, vol. 2, no. 4, pp. 159 ff., 1991.
A. Gelb et al., "Applied Optimal Estimation," The M.I.T. Press, pp. 50-143, 1974.
North Atlantic Treaty Organization, Agard Lecture Series No. 82, "Practical Aspects of Kalman Filtering Implementation," pp. 2-1 through 2-11, 1976.

(List continued on next page.)

Primary Examiner—Kamini Shah
Attorney, Agent, or Firm—Fenwick & West LLP

[57] ABSTRACT

A three-dimensional position and orientation tracking system uses accelerometers to measure acceleration of a moveable object (e.g., a head-mounted display unit or a data glove). A tracking processor generates both position and orientation information on the object relative to a simulation environment as a function of the acceleration data. In one embodiment, a simplified radar-based tracking system is disposed relative to the object and periodically provides additional tracking data on the object to the tracking processor. The tracking processor uses the additional data to correct the position and orientation information using a feedback filter process. The position and orientation information signals generated can be used, for example, in a simulation or virtual reality application. Position and orientation information is received by a simulation processor relative to the object. The simulation processor modifies a simulation environment as a function of the position and orientation information received. Modified simulation environment information (e.g., video and/or audio information) is then presented to a user.

16 Claims, 9 Drawing Sheets

[Front-page drawing: initial body or helmet frame shown relative to a fixed reference or level frame.]
OTHER PUBLICATIONS (continued)

3 Space Fastrak product specification by Polhemus Inc., Jul. 1993.
Multi-Receiver/Transmitter Tracking Device, A Flock of Birds product specification by Ascension Tech. Corp., Nov. 1992.
Analog Devices product specification for Model ADXL50, "Monolithic Accelerometer with Signal Conditioning," pp. 1-16, Jan. 1993.
J.A. Adam, "Virtual Reality is for Real," IEEE Spectrum, pp. 22-29, Oct. 1993.
[Sheet 1 of 9 — FIG. 1: simplified block diagram of the tracking system: accelerometers, low pass filters, multiplexer, A/D converter, tracking processor, and position and orientation information output.]
[Sheet 2 of 9 — FIG. 2: placement of accelerometers 1-6 on two mounting points of object 300, showing the initial body or helmet frame and the fixed reference or level frame.]
[Sheet 3 of 9 — FIG. 2A: object 300 of FIG. 2 after movement, with the body frame rotated relative to the level frame.]
[Sheet 4 of 9 — FIG. 3: simplified flow chart of tracking system 15: accelerometer initialization and calibration, main loop (read accelerometer data, calculate position and orientation), feedback loop generating corrections, and position and orientation information 130 delivered to simulation environment 180.]
[Sheet 5 of 9 — FIG. 4: flow chart of main loop 41: read acceleration data from accelerometers; apply accelerometer bias, scaling and corrections; remove gravity and centripetal components; convert acceleration data to linear body and angular components; update direction cosines matrix; integrate angular accelerations to angular velocities and to angles (roll, pitch, yaw orientation); integrate level frame accelerations to velocities and to positions (x, y, z coordinates); apply angular velocity, velocity, angle, and position corrections; output position and orientation information.]
[Sheet 6 of 9 — FIG. 5: flow chart of feedback loop 89: update state transition matrix, update process noise matrix, compute covariance matrix, update Kalman gain matrix, read measurements, and compute corrections (θe, pe, ve, Ωe, ωe).]
[Sheet 7 of 9 — FIG. 6: simplified block diagram of the virtual reality invention using the tracking system: tracked object, external tracking system providing corrections 120, and simulation environment 180 exchanging position and orientation information and simulation information.]
[Sheet 8 of 9 — FIG. 7: block diagram of object 300: inertial tracking module, tracking module, transceiver and communication controller, decompression, display and audio output, corrections 120 from the external tracking system, and position/orientation and simulation information exchanged with simulation environment 180.]
[Sheet 9 of 9 — FIG. 8: block diagram of simulation environment 180: simulation processor and memory, simulation software and attributes, simulation hardware (graphics accelerator, compression/decompression, audio), communication controller and transceivers, and position and orientation information received from the tracked object.]
METHOD AND APPARATUS FOR DETERMINING POSITION AND ORIENTATION OF A MOVEABLE OBJECT USING ACCELEROMETERS

This is a continuation of application Ser. No. 08/184,583 filed on Jan. 31, 1994, U.S. Pat. No. 5,615,132.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The invention relates to tracking systems, particularly to such systems that determine position and orientation of an object in a limited volume using accelerometers.

2. Description of the Background Art

In specialized computer applications involving virtual reality or "immersive simulations" a computer or processing facility providing the simulation must continuously determine with a high degree of accuracy the position and orientation of a user (or part of the user, e.g., head or hand) relative to a "virtual world" or simulated environment in which the user operates. The position and orientation data must be updated regularly to provide a realistic simulation. In addition, the data must be collected in a manner that does not interfere significantly with the user's natural movement. Thus, physical connection to a stationary object or heavy and/or bulky tracking instruments attached to the user are unsuitable. In order to be integrated easily with a head-mounted display (HMD), data glove or other peripheral device for use in a virtual reality application, a tracking system must be small and light weight.
A mechanical gantry containing sensors is used to track movement by physically connecting the user to a fixed object. However, this system is cumbersome, provides an unrealistic simulation due to interference from the gantry, and requires significant installation effort.

A simplified radar or sonar system having a transmitter and a receiver mounted on the user is used to determine position of an object. However, this type of system is sensitive to noise in the environment, tends to have high frequency jitter between position measurements, is subject to interference from other objects in the simulation (e.g., a hand or other users), is generally bulky, requires multiple transmitters and receivers, and may be quite complex and expensive. Such systems are embodied in products available commercially from Polhemus, Logitech, and Ascension Technology.
Additionally, conventional navigation systems for navigating over large areas of land or airspace, such as those for planes, cars, and missiles, use devices such as gyroscopes that are not suitable for attachment to a human user because of their size and weight. In addition, these devices are typically designed to track over several hundred kilometers and several days, and are accurate only to several meters.

Two-dimensional navigation systems using angular accelerometers (a type of gyroscope), such as that used in Barber, U.S. Pat. No. 5,245,537, are not suitable for virtual reality applications requiring three position and three orientation measurements for realistic simulation. The system described in Barber does not provide a highly accurate measurement (as required by virtual reality applications) because it contains no mechanism for correcting errors that are inherent in the system (e.g., bias, calibration errors, floating, and positional errors). If left uncorrected, these errors typically increase in size as a function of time of use and/or volume traversed, thereby resulting in a significant degradation in
system performance. Moreover, angular accelerometers are not easily integrated into electronic componentry, thus the resulting system is generally greater in size and weight and is not suitable for attachment to a human user. In addition, a much higher update rate (e.g., 50-300 Hz) than that used in Barber is required for realistic virtual reality simulations.

Thus, there is a need for a small, lightweight, highly integratable, navigational system that can be easily attached to a human user without significant interference to natural body movement. Furthermore, there is a need for a navigational system that is highly accurate over a long period of time and operates at a high update rate in order to provide a realistic virtual reality simulation. The prior art has failed to address these needs adequately.

SUMMARY OF THE INVENTION

The invention is a three-dimensional position and orientation tracking system that uses accelerometers to measure acceleration in the six degrees of freedom (e.g., x, y, z position coordinates and roll, pitch, yaw orientation components) of a moveable object (e.g., a head-mounted display unit, or a wristband/data glove). Conventional accelerometers, as used herein, measure acceleration in one linear direction (e.g., x, y, z, or combination thereof, coordinate axis), but may report acceleration data as a nonlinear function of, for example, acceleration or time. Acceleration data on the moveable object is periodically (e.g., 50-300 Hz) received by a tracking processor. The tracking processor generates both position and orientation information on the object relative to a simulation environment as a function of the acceleration data. Accelerometers are easily integrated into electronic componentry (e.g., using silicon chip technology). Thus, the tracking system of the present invention can be embodied in a small, lightweight unit that is easily attached to a human user without significant interference to natural body movements.

In one embodiment, a simplified radar-based tracking system, which is disposed relative to the object, periodically (e.g., 1 Hz) provides additional tracking data on the object to the tracking processor. This data may be provided by, for example, infrared light and received by the tracking processor via an infrared sensor. The tracking processor uses the additional data to correct the position, orientation, and/or velocity information generated from the accelerometers, using a feedback or Kalman filter process. This correction feedback loop allows the invention to function accurately over a long period of time (e.g., several hours) without adjustment. Alternatively, if the user is to remain seated or confined to a limited volume during simulation, pre-defined position data from the simulation environment software specification (e.g., mean position of user and average variance) can be used in the correction feedback process.

The position and orientation information signals generated can be used, for example, in a simulation or virtual reality application. Position and orientation information is received by a simulation processor relative to the object (e.g., via infrared transceiver). The simulation processor modifies a simulation environment operating on the simulation processor as a function of the position and orientation information received. Modified simulation environment information (e.g., video, audio, tactile, and/or olfactory information) is transmitted back to the user (e.g., via infrared transceiver). Other possible applications of the invention include guidance systems for the blind, robotic guidance systems, human tracking systems (e.g., prisoners), object tracking systems (e.g., parcel, package, and/or auto), and
computer input devices for the handicapped (e.g., head or hand controlled input devices).

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a simplified block diagram illustrating the components used in the tracking system of the present invention.

FIG. 2 is a graphical drawing showing one embodiment of the tracking system with placement of accelerometers 1-6 in FIG. 1 on two mounting points.

FIG. 2A is a graphical drawing showing object 300 of FIG. 2 after movement.

FIG. 3 is a simplified flow chart depicting one embodiment of the tracking system of the present invention.

FIG. 4 is a flow chart depicting main loop 41 in FIG. 3.

FIG. 5 is a flowchart depicting feedback loop 89 in FIG. 3.

FIG. 6 is a simplified block diagram of a virtual reality invention using the tracking system of the present invention.

FIG. 7 is a block diagram of object 300 in FIG. 6.

FIG. 8 is a block diagram of simulation environment 180 in FIG. 6.

DESCRIPTION OF THE PREFERRED EMBODIMENT

FIG. 1 is a simplified block diagram illustrating the components used in the tracking system invention. Conventional accelerometers 1-6 measure acceleration in one linear direction (e.g., x, y, z, or combination thereof, coordinate direction), but may report acceleration data, for example, as a nonlinear function of time (e.g., v(t), where v is voltage) or acceleration. Accelerometers 1-6 are capable of measuring accelerations of at least ±2 G. This allows for 1 G due to gravity and 1 G of movement acceleration. In the preferred embodiment, accelerometers should be shock-protected or resistant so that they are not damaged if dropped. To ensure high accuracy, a high signal to noise ratio (SNR) is desirable—a lower bound of approximately 100, or 40 dB, is preferred.
In one embodiment six accelerometers 1-6 are used to track six degrees of freedom of an object in three dimensions (e.g., x, y, z position coordinates and roll, pitch, yaw orientation components). More than six accelerometers can be used to obtain a greater degree of accuracy (e.g., by averaging or interpolation) and/or redundancy. Alternatively, three dual-axis or two triaxial accelerometers can be employed to track the six degrees of freedom of an object in three dimensions. Fewer accelerometers (e.g., four) could be used to track the object, for example, in a two-dimensional space or one-dimensional space (e.g., two accelerometers). Groups or clusters of accelerometers can also be used to track a plurality of objects. For example, the tracking invention could be implemented on an HMD and two data gloves to track head and hand movement of a user. More tracking systems can be used along the arm to track elbow and shoulder movement. Tracking systems on each finger could also be used to track finger movement. Similarly, two head-mounted display (HMD) units with six accelerometers each could be used to track the 3-dimensional position and orientation of two interactive users in a virtual reality environment.
Accelerometers 1-6 are conventional accelerometers such as the ADXL50 manufactured by Analog Devices Corp. of Boston, Mass. Due to the nature of human movement
(typically frequency components are between 0-50 Hz), for example, there is generally little information in the high frequency range, and this information should be removed to reduce noise. Accordingly, in the preferred embodiment, accelerometers are bandlimited, i.e., the highest frequency from accelerometers 1-6 is limited to, for example, 50-300 Hz. This bandwidth can be achieved by coupling accelerometers 1-6 to low pass filters (LPFs) 7-12, respectively, or by using low bandwidth accelerometers. In a preferred embodiment, accelerometers 1-6 are small and easily integrated with other electronic components, e.g., small micro-machined accelerometers (bulk or surface micro-machined).
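For readers who want to see the bandlimiting step in concrete terms, the sketch below shows a first-order digital low-pass filter that could play the role of LPFs 7-12 when the accelerometer signal has already been sampled. The 100 Hz cutoff, 1 kHz sample rate, and function names are illustrative assumptions, not part of the patent.

```python
import math

def make_lowpass(cutoff_hz: float, sample_rate_hz: float):
    """Return a first-order IIR low-pass filter (RC-equivalent).

    Assumed parameters for illustration; the patent only specifies that
    accelerometer bandwidth be limited to roughly 50-300 Hz.
    """
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate_hz
    alpha = dt / (rc + dt)          # smoothing factor in (0, 1)
    state = {"y": 0.0}

    def step(x: float) -> float:
        # y[n] = y[n-1] + alpha * (x[n] - y[n-1])
        state["y"] += alpha * (x - state["y"])
        return state["y"]

    return step

# Example: bandlimit a raw accelerometer sample stream to ~100 Hz at 1 kHz sampling.
lpf = make_lowpass(cutoff_hz=100.0, sample_rate_hz=1000.0)
filtered = [lpf(sample) for sample in (0.0, 0.2, 0.9, 1.0, 1.0)]
```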
Outputs from LPFs 7-12 are used as inputs to multiplexer 20 such as the ADG508A available commercially from Analog Devices. Analog to digital (A/D) converter 30, such as the AD1380 available commercially from Analog Devices, is used to convert the analog acceleration signal from LPFs 7-12 to a digital signal. Some accelerometers can provide digital data directly (see, e.g., ARPA grant #BAA93-06 to University of California at Berkeley and Analog Devices), thus A/D converter 30 is not necessary. Alternatively, a voltage-to-frequency converter and a frequency counter circuit could be used to obtain a digital value. The components of the present invention comprising accelerometers 1-6, LPFs 7-12, multiplexer 20, and A/D converter 30 are all highly integratable (unlike gyroscopes, angular accelerometers, and other tracking systems). Thus, according to the present invention, for example, accelerometers 1-6 (or subset thereof), multiplexer 20, A/D converter 30, and tracking processor 40 could all be co-located on a single integrated computer chip—the result being a small lightweight navigational system suitable for attachment to human users using a virtual reality application.
Output from A/D converter 30 is acceleration data 35. Acceleration data 35 may be reported, for example, as a nonlinear function of time (e.g., v(t), where v is volts). Acceleration data 35 is input to tracking processor 40. Tracking processor 40 can be, for example, a standard computer microprocessor such as an INTEL 486, Motorola 68000, or Pentium-based microprocessor. Tracking processor 40 is discussed in further detail with reference to FIGS. 3-5 below. Memory unit 37 is coupled to tracking processor 40 and is used for storing program instruction steps and storing data for execution by tracking processor 40. Memory unit 37 is a conventional computer memory unit such as a magnetic hard disk storage unit or random access memory (RAM) on a chip. Output from tracking processor 40 is position and orientation information 130.
In one embodiment, position and orientation information 130 is transmitted in a data signal consisting of six elements—three position elements (e.g., x, y, z) and three orientation elements (e.g., roll, pitch, yaw). Each element is two bytes long. Each value or element is in twos complement format, thus the decimal values -32,768 to 32,767 are covered. Measurements are the decimal value divided by 100. Thus, measurements from -327.68 to 327.67 (e.g., degrees, cm, inches, feet or other angle or linear measurements) can be transmitted. Information 130 is transmitted in a standard serial interface of three lines—transmit, receive, and ground—standard 8 bit words, no parity, and 1 stop bit. A mode of operation can be specified as follows:

R—request mode (default). Position and orientation is transmitted upon request.

F—free running mode. Position and orientation is transmitted as calculated.

M—mode change. Informs tracker that mode in which position and orientation is transmitted (R or F) will change.
G—get data. Tracker will transmit position and orientation information 130.

H—halt. Turns off tracking system.

C—calibrate. Runs or reruns the initialization routine 48.

Alternatively, a file can be created with records of the same format described above.
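To make the record format above concrete, here is a small sketch that packs and unpacks one six-element record (two-byte two's complement fields, values scaled by 100). The big-endian byte order and the x, y, z, roll, pitch, yaw field order are assumptions for illustration; the patent specifies only the element count, word size, and scaling.

```python
import struct

SCALE = 100  # values are transmitted as (measurement * 100), per the format above

def pack_record(x, y, z, roll, pitch, yaw) -> bytes:
    """Pack six measurements into six signed 16-bit fields (big-endian assumed)."""
    values = [int(round(v * SCALE)) for v in (x, y, z, roll, pitch, yaw)]
    for v in values:
        if not -32768 <= v <= 32767:
            raise ValueError("measurement outside the -327.68..327.67 range")
    return struct.pack(">6h", *values)

def unpack_record(data: bytes):
    """Recover the six measurements from a 12-byte record."""
    return tuple(v / SCALE for v in struct.unpack(">6h", data))

# Example: 12.34 cm along x and 45.6 degrees of yaw.
record = pack_record(12.34, 0.0, -5.0, 0.0, 0.0, 45.6)
print(unpack_record(record))   # (12.34, 0.0, -5.0, 0.0, 0.0, 45.6)
```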
In FIG. 2, two accelerometer mounting points 301 and 302 are located on object 300 (e.g., two locations on a head-mounted display (HMD) unit, or two locations on the wrist of a data glove). Object 300 may be, for example, a head-mounted display unit, a wristband/data glove, or other similar device attached to a user to monitor the user's movement. In this example, each mounting point 301, 302 contains three accelerometers (e.g., accelerometers 1-3 and 4-6 respectively). Vectors r1-r6 (ri) are the vectors from the origin of object 300 (e.g., head of user) to each accelerometer 1-6, respectively, measured in body frame coordinates xB, yB, zB (e.g., coordinates in reference to object 300). In one embodiment, accelerometers 1-6, and thus vectors r1-r6, are fixed when accelerometers 1-6 are mounted. However, the location of accelerometers 1-6 could be altered during the use of the tracking system and vectors r1-r6 updated accordingly. As shown in FIG. 2, r1=r2=r3 and r4=r5=r6 because there are only two mounting points 301, 302.
Vectors u1-u6 (ui) represent the sensitive direction of each accelerometer 1-6, respectively, measured in body frame coordinates xB, yB, zB. Similarly, sensitive direction vectors u1-u6 are generally fixed when accelerometers 1-6 are mounted but could be altered and updated accordingly. Position and orientation information 130 is reported in a fixed, or level frame reference defined by xL, yL, zL. The coordinate system used in a virtual reality program or computer simulation environment 180, for example, is a level frame reference. After movement of object 300, body frame references are changed as shown in FIG. 2A.
Accelerometer mounting information 46 (FIG. 3) comprises the information in the matrix J (described in the program flow below) defined by:

J = \left[ \begin{array}{cc} u_1 & (r_1 \times u_1) \\ u_2 & (r_2 \times u_2) \\ \vdots & \vdots \\ u_6 & (r_6 \times u_6) \end{array} \right]^{-1}
The matrix J resolves the net linear accelerations into linear body and angular components. Accelerometers 1-6 must be mounted (e.g., 301, 302) such that the matrix J is not singular. For example, accelerometers 1-6 cannot all be placed in one position ri. Similarly, ui, representing the acceleration sensitive directions, must be non-zero for each acceleration direction xB, yB, zB (e.g., the xB component of every ui cannot always be zero). In one embodiment, u1, u2, and u3 are orthogonal and u4, u5, and u6 are orthogonal.
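The construction of J can be sketched numerically as follows: build the 6x6 matrix whose i-th row is [ui, ri x ui], check that it is invertible (the non-singularity requirement above), and invert it. The particular mounting positions, sensitive directions, and example readings below are assumptions for illustration only, not the layout of FIG. 2.

```python
import numpy as np

def mounting_matrix(r_vectors, u_vectors):
    """Build J from mounting positions r_i and sensitive directions u_i.

    Each row of the matrix being inverted is [u_i, r_i x u_i]; J resolves the
    six corrected accelerometer readings into three linear body components
    and three angular components.
    """
    rows = [np.concatenate((u, np.cross(r, u))) for r, u in zip(r_vectors, u_vectors)]
    a = np.array(rows)                          # 6x6 mounting matrix
    if abs(np.linalg.det(a)) < 1e-9:
        raise ValueError("singular mounting geometry: choose different r_i / u_i")
    return np.linalg.inv(a)

# Example layout (assumed): three distinct mounting positions, two axes sensed at each.
ra, rb, rc = np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.1, 0.0]), np.array([0.0, 0.0, 0.1])
rs = [ra, ra, rb, rb, rc, rc]
us = [np.array(v, dtype=float) for v in
      ([1, 0, 0], [0, 1, 0], [0, 1, 0], [0, 0, 1], [0, 0, 1], [1, 0, 0])]

J = mounting_matrix(rs, us)
# Six bias/gravity-corrected readings (illustrative numbers) resolved into
# [linear body acceleration (3), angular acceleration (3)].
body_and_angular = J @ np.array([0.1, 0.0, 0.0, 0.2, 0.0, -0.1])
```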
FIG. 3 shows a simplified flow chart of tracking system 15 as implemented on tracking processor 40. Accelerometer initialization and calibration 48 is initiated prior to each system use to correct for the bias and scaling factors of the accelerometers due to such factors as time, temperature, mechanical jarring and the like. Accelerometers 1-6 are initialized 48 by loading the values of the accelerometer biases which are pre-specified at the factory or obtained from accelerometer specifications. Calibration 48 of accelerometers 1-6 is accomplished by running tracking system 15 while the object to be tracked 300 (e.g., head-mounted display (HMD) on a user) remains stationary. Position and orientation 130 are calculated according to the present invention as specified herein. Feedback filter loop 89 (discussed below, see also Digital and Kalman Filtering by S. M. Bozic, John Wiley and Sons, New York) compares calculated position and/or orientation measurements 130 with the known position and/or orientation measurement (known to be stationary) and uses discrepancies between the two measurements to solve for bias and scaling factors 50 for each accelerometer 1-6. Tracking system 15 is operated such that main loop 41 is executed multiple times (approximately 15-20) for a successful calibration 48. Total calibration time is dependent on tracking processor 40 speed. In one embodiment, tracking system 15 alerts the user when calibration 48 is complete. Notification is through, for example, a small LED on an HMD, visual notification on a display, or any other suitable means. For more accurate initial bias and scale factors 50, calibration 48 is repeated with object 300 in several different orientations. Initialization 48 also includes resetting correction factors 120 (pe, ve, Ωe, ωe) to zero or their reference values. Reference values may be dictated, for example, by simulation environment 180.
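The idea behind stationary calibration 48 can be illustrated with a much-simplified sketch: average raw readings while the object is still and compare them with the expected gravity response along each sensitive axis. The patent itself solves for both bias and scale factors by running feedback filter loop 89 against the known stationary pose; the gravity sign convention, helper names, and sample data below are assumptions.

```python
import numpy as np

G = 9.81  # m/s^2

def estimate_biases(stationary_samples, u_vectors, body_to_level):
    """Average raw readings taken while object 300 is stationary and subtract the
    expected gravity response along each sensitive axis; what remains is a bias
    estimate per accelerometer.

    Simplified sketch of calibration step 48 only: the patent solves for bias
    *and* scale factors by running feedback filter loop 89 against the known
    stationary position and orientation over roughly 15-20 passes of main loop 41.
    """
    # Specific force sensed at rest points opposite to gravity: +G along level-frame z (assumed convention).
    f_level = np.array([0.0, 0.0, +G])
    f_body = body_to_level.T @ f_level                    # rotate into body-frame coordinates
    expected = np.array([u @ f_body for u in u_vectors])  # per-axis gravity reading
    mean_readings = np.asarray(stationary_samples, dtype=float).mean(axis=0)
    return mean_readings - expected                       # one bias estimate per accelerometer

# Example: level pose, six readings per sample (three axes at each of two mounts).
us = [np.array(v, float) for v in ([1,0,0], [0,1,0], [0,0,1], [1,0,0], [0,1,0], [0,0,1])]
samples = np.tile([0.02, -0.01, 9.83, 0.00, 0.01, 9.79], (200, 1))   # fabricated still data
biases = estimate_biases(samples, us, np.eye(3))   # ~ [0.02, -0.01, 0.02, 0.00, 0.01, -0.02]
```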
In main loop 41 tracking processor 40 reads 44 acceleration data 35 from accelerometers 1-6 and calculates 60 position and orientation information 130. Calculation 60 is discussed in more detail with reference to FIG. 4 below. In operation, main loop 41 is repeated at 50-300 Hz or faster depending on hardware capability (e.g., capability of tracking processor 40 or other components in FIG. 1). A fast loop rate 41 ensures that simulation environment 180 is updated with current position and orientation information 130.

Feedback loop 89 (also known as a Kalman filter) comprises reading tracking measurements 90 (e.g., position, orientation, and/or velocity) from external tracking system 170 (FIGS. 6, 7) disposed relative to object 300 and generating 100 correction factors 120. Generation 100 of the correction factors 120 is described in more detail with reference to FIG. 5 below. Correction factors 120 are used in calculation 60 of position and orientation information 130.
If the volume in which object 300 moves is relatively large compared to the size of the object (e.g., tracking an HMD in a 5x5 meter room) or the system is used for long periods of time (e.g., over 15 minutes), then external measurements 90 from, for example, external tracking system 170 are used for feedback. External tracking system 170 is a conventional tracking system using, for example, radar, sonar, infrared, optical, acoustic/ultrasonic, or magnetic tracking technology. External tracking data including position, orientation, and/or velocity measurements 90 are provided in the form of a 1- to 2-dimensional update or a full 3-dimensional, 6 degree of freedom, update. Basically, feedback loop 89 will use any additional tracking data about object 300 to correct position and orientation information 130—more tracking data will provide a better correction.
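A minimal sketch of how feedback loop 89 might blend an external measurement 90 into a correction 120 is shown below as a standard Kalman measurement update. The one-state-per-quantity model, fixed noise values, and class name are illustrative assumptions; the patent's loop maintains full state-transition, process-noise, covariance, and Kalman-gain matrices as outlined in FIG. 5.

```python
import numpy as np

class FeedbackFilter:
    """Minimal Kalman-style corrector: blends the inertially propagated estimate
    with an external tracking measurement 90 and returns a correction 120.

    A sketch only: identity state transition and observation, fixed process and
    measurement noise, no cross-covariance between quantities.
    """

    def __init__(self, n_states, process_var=1e-3, meas_var=1e-2):
        self.P = np.eye(n_states)              # error covariance
        self.Q = process_var * np.eye(n_states)
        self.R = meas_var * np.eye(n_states)

    def correction(self, estimate, measurement):
        n = self.P.shape[0]
        self.P = self.P + self.Q                            # propagate covariance
        K = self.P @ np.linalg.inv(self.P + self.R)         # Kalman gain
        self.P = (np.eye(n) - K) @ self.P                   # update covariance
        residual = np.asarray(measurement, float) - np.asarray(estimate, float)
        return K @ residual                                 # correction 120

# Example: nudge a drifted position estimate toward a 1 Hz external radar fix.
f = FeedbackFilter(n_states=3)
corr = f.correction(estimate=[1.02, 0.48, 1.75], measurement=[1.00, 0.50, 1.80])
corrected_position = np.array([1.02, 0.48, 1.75]) + corr
```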
Alternatively, if object 300 (e.g., HMD) is confined to a small volume (e.g., seated), then certain "software specification" information (not shown) in simulation environment 180 can be used in place of measurements 90 as input to generation 100 of correction factors 120. For example, the mean position and the estimated variance of object 300 in a limited volume can be used for measurements 90. The variance can be constant or change over time. The variance reflects the uncertainty or size of the volume to which the object 300, or user, is confined.
After incorporating correction factors 120 from feedback filter loop 89, the output of calculation 60 is position and orientation information 130. Position and orientation infor-
mation 130 is used, for example, in a virtual reality program or simulation environment 180.
In FIG. 4, tracking processor 40 reads 44 acceleration data 35 from each accelerometer 1-6. Accelerometer bias and scaling factors 50 are applied 62 to acceleration data 44. Acceleration corrections 120 from feedback loop 89 are also applied 62 to acceleration data 44. Gravity and centripetal components of acceleration are also removed 64 from corrected acceleration data 62. Step 64 involves information from the prior output of the direction cosine matrix 76, mounting data (ri and ui) 46, and angular velocities 70. Modified acceleration data 64 is converted to linear body and angular components 66. (There is no designation of body, level, or reference frame for angular accelerations; they simply measure the angle between two axes.) Angular accelerations 66 are integrated to angular velocities 68. Angular velocity corrections 120 from feedback loop 89 are applied 70 to angular velocity data 68. Corrected angular velocities 70 are integrated to angles or orientation 72, for example, roll, pitch, and yaw. Angular corrections 120 from feedback loop 89 are applied 74 to corrected angle data 72. Thus, orientation information 130 is produced. Direction cosine matrix is updated 76 using a conventional direction cosine update routine. (See, for example, Paul G. Savage, Strapdown Systems Algorithms, in Advances in Strapdown Inertial Systems, NATO Advisory Group for Aerospace Research and Development Lecture Series #133, 1984, pp. 3-1 to 3-30.)

Linear body accelerations 66 are converted to level frame or reference frame (e.g., simulation environment coordinates) accelerations 80. Level frame accelerations 80 are integrated to level frame velocities 82. Velocity corrections 120 from feedback loop 89 are applied 84 to level frame velocities 82. Corrected level frame velocities 84 are integrated to positions 86. Position corrections 120 from feedback loop 89 are applied 88 to positions 86. Thus, position information 130 is produced.
In a preferred embodiment, orientation is calculated (steps 68, 70, 72, 74, 130) and the direction cosines matrix is updated 76 before position is calculated (steps 80, 82, 84, 86, 88, 130). This control flow has the advantage that direction cosines matrix 76 is more current and accurate for the position calculation steps. Alternatively, orientation calculation (steps 68, 70, 72, 74, 130) and position calculation (steps 80, 82, 84, 86, 88, 130) can be processed in parallel. However, direction cosines matrix 76 will reflect data from the previous loop 41, thus position calculation 130 may be less accurate.
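The sequence just described (steps 62 through 88 of FIG. 4) can be summarized in a compact sketch. The simple Euler integration, the small-angle direction-cosine update standing in for the Savage strapdown routine, the omission of the centripetal term, and all variable names are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def main_loop_step(state, readings, J, biases, scales, corr, dt, g=np.array([0.0, 0.0, -9.81])):
    """One heavily simplified pass of main loop 41.

    state: dict with 'dcm' (body-to-level), 'omega', 'angles', 'vel', 'pos'.
    corr:  dict of corrections 120 from feedback loop 89 (pass zero vectors when none).
    """
    # 62: apply bias, scale factors, and acceleration corrections to raw readings
    a = (np.asarray(readings, float) - biases) * scales + corr["accel"]
    # resolve into linear body and angular components via mounting matrix J (step 66)
    lin_body, alpha = np.split(J @ a, 2)
    # remove the gravity component from the linear part (the patent does this per
    # accelerometer at step 64, before applying J; equivalent for the gravity term)
    lin_body = lin_body + state["dcm"].T @ g
    # 68/70: integrate angular accelerations, apply angular velocity corrections
    state["omega"] = state["omega"] + alpha * dt + corr["omega"]
    # 72/74: integrate to roll/pitch/yaw style angles, apply angle corrections
    state["angles"] = state["angles"] + state["omega"] * dt + corr["angles"]
    # 76: small-angle direction cosines matrix update (stand-in for Savage's routine)
    state["dcm"] = state["dcm"] @ (np.eye(3) + skew(state["omega"] * dt))
    # 80: convert linear body accelerations to level-frame accelerations
    lin_level = state["dcm"] @ lin_body
    # 82/84: integrate to level-frame velocities, apply velocity corrections
    state["vel"] = state["vel"] + lin_level * dt + corr["vel"]
    # 86/88: integrate to positions, apply position corrections
    state["pos"] = state["pos"] + state["vel"] * dt + corr["pos"]
    return state["pos"], state["angles"]   # position and orientation information 130
```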
In a preferred embodiment, calculation 60 also performs an estimation of position and orientation 130 one "frame delay" into the future. The reason for this predictive step, for example, is that simulation environment 180 will take some time, t, to utilize position and orientation information 130 and modify the virtual reality program or simulation environment for presentation to the user (e.g., draw the next frame in a vi
