US008282487B2
(12) United States Patent
Wilson et al.

(10) Patent No.: US 8,282,487 B2
(45) Date of Patent: Oct. 9, 2012
(54) DETERMINING ORIENTATION IN AN EXTERNAL REFERENCE FRAME

(75) Inventors: Andrew Wilson, Seattle, WA (US); Steven Michael Beeman, Kirkland, WA (US)

(73) Assignee: Microsoft Corporation, Redmond, WA (US)
7,095,401 B2      8/2006  Liu et al.
2003/0156756 A1   8/2003  Gokturk et al.
2004/0001113 A1   1/2004  Zipperer et al.
2004/0155902 A1   8/2004  Dempski et al.
2004/0189720 A1   9/2004  Wilson et al.
2005/0151850 A1   7/2005  Ahn et al.
2005/0212753 A1   9/2005  Marvit et al.
2005/0238201 A1  10/2005  Shamaie
2005/0255434 A1  11/2005  Lok et al.
2006/0007142 A1   1/2006  Wilson et al.
2006/0012675 A1*  1/2006  Alpaslan et al. .... 348/51
2006/0036944 A1   2/2006  Wilson
2006/0092267 A1   5/2006  Dempski et al.
(Continued)
OTHER PUBLICATIONS

Romero, Joshua J., "How Do Motion-Sensing Video Game Controllers Work?", posted Dec. 18, 2006, retrieved at <<http://scienceline.org/2006/12/18/motioncontrollers/>>, 4 pages.
"Gametrak Fusion 3D Wireless Motion Sensor Gaming", posted Oct. 20, 2006, retrieved at <<http://www.pcvsconsole.com/news.php?nid=3212>>, 2 pages.
Morris, et al., "User-Defined Gesture Set for Surface Computing", Application filed Aug. 4, 2008, U.S. Appl. No. 12/185,166.

(Continued)

Primary Examiner — James S McClellan
(74) Attorney, Agent, or Firm — Alleman Hall McCoy Russell & Tuttle LLP
(57) ABSTRACT

Orientation in an external reference frame is determined. An external-frame acceleration for a device is determined, the external-frame acceleration being in an external reference frame relative to the device. An internal-frame acceleration for the device is determined, the internal-frame acceleration being in an internal reference frame relative to the device. An orientation of the device is determined based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.

24 Claims, 4 Drawing Sheets
( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 572 days.

(21) Appl. No.: 12/490,331

(22) Filed: Jun. 24, 2009

(65) Prior Publication Data
     US 2010/0103269 A1    Apr. 29, 2010

Related U.S. Application Data

(63) Continuation of application No. 12/256,747, filed on Oct. 23, 2008.

(51) Int. Cl.
     A63F 13/00 (2006.01)
(52) U.S. Cl. ............ 463/39; 463/37
(58) Field of Classification Search ........ 463/36, 463/37, 39, 40
     See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

5,990,908 A   11/1999  Thingvold
6,148,280 A   11/2000  Kramer
6,181,343 B1   1/2001  Lyons
6,195,104 B1   2/2001  Lyons
6,269,172 B1   7/2001  Rehg et al.
6,600,475 B2   7/2003  Gutta et al.
6,678,059 B2   1/2004  Cho et al.
6,693,284 B2   2/2004  Tanaka
6,693,666 B1   2/2004  Baker et al.
6,804,396 B2  10/2004  Higaki et al.
[Front-page figure: comparison of acceleration directions in the external reference frame and the internal reference frame.]
`Zepp Labs, Inc.
`ZEPP 1018
`Page 1
U.S. PATENT DOCUMENTS

2006/0178212 A1    8/2006  Penzias
2006/0252474 A1*  11/2006  Zalewski et al. .... 463/1
2007/0152157 A1    7/2007  Page
2007/0230747 A1   10/2007  Dunko
2007/0252898 A1   11/2007  Delean
2008/0036732 A1    2/2008  Wilson et al.
2008/0122786 A1    5/2008  Pryor et al.
2008/0193043 A1    8/2008  Wilson
2009/0121894 A1    5/2009  Wilson et al.
OTHER PUBLICATIONS

Wilson, Andrew David, "Computer Vision-Based Multi-Touch Sensing Using Infrared Lasers", Application filed May 12, 2008, U.S. Appl. No. 12/118,955.
Wilson, et al., "Determining Orientation in an External Reference Frame", Application filed Oct. 23, 2008, U.S. Appl. No. 12/256,747.
Singh, Amit, "The Apple Motion Sensor as a Human Interface Device," http://osxbook.com/book/bonus/chapter10/ams2hid/, Mar. 2005.

* cited by examiner
`

`
U.S. Patent    Oct. 9, 2012    Sheet 1 of 4    US 8,282,487 B2

[FIG. 1A: orientation-determining computing system 10 — wand 12 (with target 18, acceleration-measuring subsystem 20, and angular motion-measuring subsystem 22), wand monitor 14, and orientation inferring subsystem 16.]
`

`
U.S. Patent    Oct. 9, 2012    Sheet 2 of 4    US 8,282,487 B2

[FIG. 1B: position-determining computing system 10' — wand 12' (with target monitor 14', acceleration-measuring subsystem 20', and angular motion-measuring subsystem 22'), target 18', and position inferring subsystem 16'.]
`

`
U.S. Patent    Oct. 9, 2012    Sheet 3 of 4    US 8,282,487 B2

[FIGS. 2 and 3: example game system configuration, and comparison of the external-frame and internal-frame acceleration vectors in the external and internal reference frames.]
`

`
U.S. Patent    Oct. 9, 2012    Sheet 4 of 4    US 8,282,487 B2

[FIG. 4: process flow 60 — 62: infer coarse orientation of game controller; 64: determine external-frame acceleration; 66: determine internal-frame acceleration; 68: determine orientation of game controller based on comparison between direction of external-frame acceleration and direction of internal-frame acceleration; 70: update coarse orientation of game controller based on angular motion information observed by the game controller.]
`

`
DETERMINING ORIENTATION IN AN EXTERNAL REFERENCE FRAME

CROSS REFERENCE TO RELATED APPLICATION(S)
This application is a continuation of U.S. patent application Ser. No. 12/256,747, filed on Oct. 23, 2008, entitled "DETERMINING ORIENTATION IN AN EXTERNAL REFERENCE FRAME", the entire contents of which are hereby incorporated by reference.

BACKGROUND

A gyroscope can use angular momentum to assess a relative orientation of a device in a frame of reference that is internal to that device. However, even the most accurate gyroscopes available may accumulate small orientation errors over time.

SUMMARY

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.

Determining orientation in an external reference frame is disclosed herein. An external-frame acceleration for a device is determined, the external-frame acceleration being in an external reference frame relative to the device. An internal-frame acceleration for the device is also determined, the internal-frame acceleration being in an internal reference frame relative to the device. An orientation of the device is determined based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1A schematically shows an orientation-determining computing system in accordance with an embodiment of the present disclosure.
FIG. 1B schematically shows a position-determining computing system in accordance with another embodiment of the present disclosure.
FIG. 2 shows an exemplary configuration of the orientation-determining computing system of FIG. 1.
FIG. 3 shows a comparison of an external-frame acceleration vector and an internal-frame acceleration vector corresponding to the controller orientation of FIG. 2.
FIG. 4 shows a process flow of an example method of tracking an orientation of a game controller.

DETAILED DESCRIPTION

FIG. 1 shows an orientation-determining computing system 10 including a wand 12, a wand monitor 14 and an orientation inferring subsystem 16. Orientation inferring subsystem 16 is configured to determine an orientation of wand 12 in a frame of reference that is external to the wand 12. In particular, the orientation inferring subsystem 16 may infer a coarse orientation of the wand 12 in the external reference frame by comparing acceleration information of the wand 12 in the external reference frame with acceleration information of the wand 12 in an internal reference frame.

The acceleration information in the external reference frame may be assessed by wand monitor 14. The wand monitor 14 may be configured to observe the wand 12 as the wand 12 moves relative to the wand monitor 14. Such observations may be translated into an external-frame acceleration for the wand. Any suitable technique may be used by the wand monitor 14 for observing the wand 12. As a nonlimiting example, the wand monitor 14 may be configured to visually observe the wand 12 with stereo cameras. In some embodiments, the wand 12 may include a target 18 that facilitates observation by the wand monitor 14.

The acceleration information in the internal reference frame may be assessed by the wand 12. The wand 12 may be configured to sense wand accelerations and report such sensed accelerations to orientation inferring subsystem 16. In some embodiments, the wand may include an acceleration-measuring subsystem 20 for measuring wand accelerations in a frame of reference that is internal to the wand 12.

In addition to determining a coarse orientation of the wand 12 by comparing wand accelerations in internal and external reference frames, the orientation inferring subsystem 16 may update the coarse orientation of the wand 12 based on angular motion information observed by the wand 12 itself. As such, the wand 12 may include an angular-motion measuring subsystem 22 for measuring angular motion of the wand 12 in a frame of reference that is internal to the wand. Even when such an angular-motion measuring subsystem 22 is included, the coarse orientation inferred using internal- and external-frame accelerations may be used to limit errors that may accumulate if only the angular-motion measuring subsystem 22 is used.

The wand may be configured to serve a variety of different functions in different embodiments without departing from the scope of this disclosure. As a nonlimiting example, in some embodiments, computing system 10 may be a game system in which wand 12 is a game controller device for controlling various game functions. It is to be understood that the orientation inferring methods described herein may additionally and/or alternatively be applied to an orientation-determining computing system other than a game system, and the wand need not be a game controller in all embodiments.

Furthermore, it is to be understood that the arrangement shown in FIG. 1A is exemplary, and other arrangements are within the scope of this disclosure. As a nonlimiting example, FIG. 1B shows a position-determining computing system 10' in accordance with another embodiment of the present disclosure. Position-determining computing system 10' includes a wand 12', a target monitor 14' and a position inferring subsystem 16'. Position inferring subsystem 16' is configured to determine a position of wand 12' in a frame of reference that is external to the wand 12'. In particular, the position inferring subsystem 16' may infer a coarse position of the wand 12' in the external reference frame by comparing orientation information of the wand 12' in the external reference frame with acceleration information of the wand 12' in an internal reference frame.

In some embodiments, target 18' may include one or more LEDs (e.g., infrared LEDs) positioned in a fixed location, such as near a television or any other suitable location. In such embodiments, the wand 12' may include a target monitor 14' configured to view the target 18' and deduce an orientation of the wand based upon a relative position of the target 18' within the target monitor's field of view. Such information may be used in cooperation with acceleration information measured by an acceleration-measuring subsystem 20' and/or angular motion information measured by an angular-motion measuring subsystem 22' to infer a coarse position of the wand as discussed below with reference to inferring coarse orientation.
In yet other embodiments, a wand may include both a target and a target monitor, and/or both a target and a target monitor may be positioned at one or more locations external to the wand. In other words, the arrangements shown in FIGS. 1A and 1B may be at least partially combined, thus enabling direct deduction of both wand position and wand orientation, which may optionally be confirmed/verified with inferred position and inferred orientation, as described herein. Further, it should be understood that the relative positioning of targets, target monitors, wand monitors, and other components described herein may be varied from the specific examples provided herein without departing from the scope of the present disclosure.

FIG. 2 shows an example game system 30 including a controller 32, a controller monitor 34 including stereo cameras 36, and a gaming console 38 including an orientation inferring subsystem 40.

In such a game system 30, orientation inferring subsystem 40 is configured to infer a coarse orientation of controller 32 in an external reference frame relative to controller 32. In particular, the coarse orientation of the controller 32 in a television's, or other display's, reference frame may be inferred. The orientation inferring subsystem 40 infers the coarse orientation of the controller 32 by comparing acceleration information from an external reference frame relative to the controller 32 with acceleration information from an internal reference frame relative to the controller 32.

In the illustrated embodiment, orientation inferring subsystem 40 is configured to determine an external-frame acceleration of controller 32 using time-elapsed position information received from stereo cameras 36. While shown placed near a television, it should be understood that stereo cameras 36, or another wand/target monitor, may be placed in numerous different positions without departing from the scope of this disclosure.

The stereo cameras may observe a target 41 in the form of an infrared light on controller 32. The individual position of the target 41 in each camera's field of view may be cooperatively used to determine a three-dimensional position of the target 41, and thus the controller 32, at various times. Visually-observed initial position information and subsequent position information may be used to calculate the external-frame acceleration of the controller 32 using any suitable technique.
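The triangulation step described above can be sketched as follows. This is a minimal sketch assuming an idealized, rectified stereo pair; the function name, the focal length, and the baseline values are illustrative and not taken from the patent:

```python
def triangulate(x_left, x_right, y, focal_px, baseline_m):
    """Back-project a target's pixel coordinates in a rectified stereo pair
    into a 3-D position in metres. x_left and x_right are the target's
    horizontal pixel coordinates in each camera; y is its (shared)
    vertical pixel coordinate."""
    disparity = x_left - x_right              # pixels; larger means closer
    if disparity <= 0:
        raise ValueError("target must be in front of both cameras")
    z = focal_px * baseline_m / disparity     # depth along the optical axis
    return (x_left * z / focal_px,            # metric X
            y * z / focal_px,                 # metric Y
            z)
```

Calling this per video frame yields the sequence of three-dimensional positions from which the external-frame acceleration is then computed.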
The following technique is a nonlimiting example for using initial position information and subsequent position information to determine an external-frame acceleration of the controller. Taking x_0 to be a current position of controller 32 as observed by controller monitor 34 at a time t_0, and x_-1 to be a previous position of controller 32 as observed by controller monitor 34 at a previous time t_-1, an expected position x'_0 for controller 32 at a current time t_0 can be calculated according to the following equation,

    x'_0 = x_-1 + v (t_0 - t_-1)
Here, the velocity v is calculated from prior position information as follows,

    v = (x_-1 - x_-2) / (t_-1 - t_-2),

where x_-2 is a more previous position of the controller as observed by the controller monitor at a more previous time t_-2.
If it is determined that the expected position x'_0 is not equal to the current position x_0, then the difference may be a result of acceleration of controller 32. In such a case, the orientation inferring subsystem 40 determines an external-frame acceleration a of controller 32 at a current time t_0 to be given by the following,

    a = 2 (x_0 - x'_0) / (t_0 - t_-1)^2 - g,

where g is a gravitational acceleration.
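Put together, three position fixes yield the external-frame acceleration estimate. A sketch of the scheme above, assuming uniform frame spacing dt and a gravity convention (g pointing down) chosen so that a resting device matches what an accelerometer would report; the function name is illustrative:

```python
def external_frame_accel(x0, x_m1, x_m2, dt, g=(0.0, 0.0, -9.8)):
    """Estimate external-frame acceleration from three observed positions
    (3-tuples, metres): x0 at t0, x_m1 at t-1, x_m2 at t-2, spaced dt apart.
    Gravity g is subtracted so a resting device reads +g up, like an
    accelerometer does."""
    v = [(a - b) / dt for a, b in zip(x_m1, x_m2)]      # velocity from prior pair
    x_pred = [a + vi * dt for a, vi in zip(x_m1, v)]    # expected position x'_0
    # Attribute the residual to acceleration: x0 - x'_0 = 0.5 * a * dt**2
    return [2.0 * (xo - xp) / dt ** 2 - gi
            for xo, xp, gi in zip(x0, x_pred, g)]
```

With all three fixes equal (a resting controller), the estimate reduces to the upward gravity reading; with constant velocity, the horizontal components vanish.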
Orientation inferring subsystem 40 is configured to determine an internal-frame acceleration of controller 32 from acceleration information received from controller 32. The controller 32 may obtain the internal-frame acceleration in any suitable manner. For example, the controller may include an acceleration-measuring subsystem configured to report acceleration information to the orientation inferring subsystem 40. In some embodiments, the acceleration-measuring subsystem may be a three-axis accelerometer 42 located proximate to the target 41, as schematically shown in FIG. 2.
The orientation inferring subsystem 40 can determine a coarse orientation of controller 32 based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration. FIG. 3 shows an example of such a comparison 50 corresponding to the controller movement shown in FIG. 2. Vector 52 represents the direction of the external-frame acceleration and vector 54 represents the direction of the internal-frame acceleration. The misalignment between the external-frame acceleration and the internal-frame acceleration can be resolved to find any difference between the external reference frame and the internal reference frame. Accordingly, an orientation of the controller 32 can be inferred in the external frame of reference.

As a nonlimiting example, if stereo cameras 36 observe controller 32 accelerating due east without changing elevation or moving north/south, and if acceleration-measuring subsystem 20 reports that controller 32 accelerates to the right, without moving up/down or front/back, then orientation inferring subsystem 40 can infer that controller 32 is pointing toward the north. The above is a simplified and somewhat exaggerated scenario. In many usage scenarios, controller 32 will be pointed substantially toward a television or other display, and any relative misalignments between internal and external reference frames will be less severe. Nonetheless, the orientation inferring methods described herein may be used to assess a coarse orientation.
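Under the simplifying assumption that both accelerations lie in the horizontal plane, the comparison reduces to a single angle. A hedged 2-D sketch; the function name and axis conventions (external x = east, y = north; internal x = right, y = forward) are illustrative choices, not the patent's:

```python
import math

def frame_misalignment(ext_xy, int_xy):
    """Angle (radians, in [0, 2*pi)) that rotates the internal-frame
    horizontal acceleration direction onto the external-frame one.
    ext_xy is (east, north); int_xy is (right, forward)."""
    return (math.atan2(ext_xy[1], ext_xy[0])
            - math.atan2(int_xy[1], int_xy[0])) % (2.0 * math.pi)
```

With the due-east external acceleration and rightward internal acceleration of the example above, the misalignment is zero: under these axis conventions the controller's right axis points east, so its forward axis points north.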
The assessed external-frame acceleration of controller 32 may differ from the actual controller acceleration due to one or more of the following factors: noise and error in the data visually observed by stereo cameras 36, noise and error in the accelerometer data, and/or misalignment between the internal reference frame and the external reference frame. However, an inferred coarse orientation of controller 32, which is found as described herein, is absolute, rather than relative, and therefore does not accumulate error over time.

In some embodiments, orientation inferring subsystem 40 may be further configured to update the coarse orientation of controller 32 based on angular motion information observed by controller 32. The controller 32 may obtain the angular motion information in any suitable manner. One such suitable manner includes obtaining the angular motion information by means of an angular-motion measuring subsystem 44 configured to report angular motion information to the orientation
inferring subsystem 40. In some embodiments, the angular-motion measuring subsystem may include spaced-apart three-axis accelerometers configured to be used in combination to determine the angular motion of controller 32. As shown in FIG. 2, in such embodiments, one three-axis accelerometer 42 may be located at a head end of controller 32 and another three-axis accelerometer 46 may be located at a tail end of controller 32, such that subtracting a head acceleration direction obtained by the head accelerometer 42 from a tail acceleration direction obtained by the tail accelerometer 46 yields an orientation change of controller 32 in the internal reference frame relative to controller 32. In other embodiments, such an angular-motion measuring subsystem 44 may include a three-axis gyroscope 48 which calculates the angular velocity of controller 32, which can then be integrated over time to determine an angular position.
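The gyroscope path can be sketched as a dead-reckoning integration. A single (yaw) axis is shown for brevity, although the subsystem described above is three-axis; the function name is illustrative:

```python
def integrate_yaw(yaw0, rates, dt):
    """Dead-reckon yaw (radians) from gyroscope yaw-rate samples (rad/s)
    taken every dt seconds. A constant rate bias b accumulates as b * t,
    which is exactly the drift the absolute coarse orientation is used
    to cancel."""
    yaw = yaw0
    for r in rates:
        yaw += r * dt   # each step adds r * dt; errors in r add up too
    return yaw
```

Because every sample's error is summed, the integrated orientation drifts without bound unless it is periodically reset by an absolute fix such as the inferred coarse orientation.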
In between frames where a coarse orientation is available (e.g., if target 41 does not move a sufficient distance for detection by stereo cameras 36), measurements from the angular-motion measuring subsystem 44 may accumulate error. A long period of very slow motion, as might well happen when drawing, is the worst-case scenario. However, such a situation is the best-case scenario for smoothing and filtering the accelerometer data, because it is expected that a user will attempt to draw smooth lines and curves.
Controller 32 may report acceleration information and/or angular motion information to orientation inferring subsystem 40 by any suitable means. In some embodiments, controller 32 may report acceleration information and/or angular motion information by wirelessly transmitting such information to orientation inferring subsystem 40, as schematically shown in FIG. 2. In other embodiments, controller 32 may be physically connected to orientation inferring subsystem 40.
FIG. 4 shows a process flow diagram of an example method 60 of tracking an orientation of a game controller. Method 60 begins at 62 by inferring a coarse orientation of the game controller. At 64, method 60 includes determining an external-frame acceleration for the game controller, the external-frame acceleration being in an external reference frame relative to the game controller. At 66, method 60 includes determining an internal-frame acceleration for the game controller, the internal-frame acceleration being in an internal reference frame relative to the game controller. At 68, method 60 includes determining an orientation of the game controller based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration, as explained above. Upon inferring a coarse orientation of the game controller, method 60 may optionally include, at 70, updating the coarse orientation of the game controller based on angular motion information observed by the game controller.
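The flow of method 60 can be sketched as a per-frame loop that resets to the inferred coarse orientation whenever one is available and otherwise dead-reckons on the gyroscope increment. The data shapes and names here are hypothetical, chosen only to illustrate the control flow:

```python
def track(samples, yaw=0.0):
    """samples: iterable of (coarse_yaw_or_None, gyro_delta) pairs, one per
    frame. coarse_yaw is an absolute orientation fix when available, else
    None; gyro_delta is the frame's integrated angular increment (radians).
    Yields the running yaw estimate for each frame."""
    for coarse, delta in samples:
        if coarse is not None:
            yaw = coarse      # absolute fix discards accumulated drift
        else:
            yaw += delta      # relative gyro update between fixes
        yield yaw
```

The key property is that drift accumulated between fixes is bounded, because each coarse orientation is absolute rather than relative.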
In some embodiments, an unscented Kalman filter may be used to combine three-dimensional position tracking from stereo cameras, angular velocity information from gyroscopes, and acceleration information from accelerometers into a unified estimate of position and absolute orientation of the device. An unscented Kalman filter may be appropriate because of nonlinearities that may be introduced in the observation part of the process model (i.e., using the orientation to correct accelerometers). An extended Kalman filter may alternatively be used.

The Kalman filter approach combines the information provided from all sensors and allows the introduction of (Gaussian) noise models for each of the sensors. For example, any noise associated with position estimates from the cameras can be incorporated directly into the model. Similarly, the noise of the gyroscopes and accelerometers may be represented by the model. By tuning each of these separately, the system may favor the more reliable sensors without neglecting less reliable sensors.
The Kalman state, state transition, and observation model are described as follows, and the standard Kalman filter equations are used thereafter. At each frame, the state is updated with the state transition model, and predicted sensor values are computed from state estimates given the observation model. After the filter is updated, updated position and orientation information is "read" from the updated state vector.

The Kalman state {x, ẋ, ẍ, q, ω} includes information to be represented and carried from frame to frame, and is described as follows:
x is a 3D position of the device (3-vector);
ẋ is a velocity of the device (3-vector);
ẍ is an acceleration of the device (3-vector);
q is a device orientation (quaternion); and
ω is an angular velocity: change in yaw, pitch and roll in the device coordinate frame (3-vector).
Next, a state transition is used to advance the state to the next time step based on process dynamics (velocity, acceleration, etc.). The state transition is described mathematically as follows:

    x ← x + ẋ Δt + (1/2) ẍ Δt²
    ẋ ← ẋ + ẍ Δt
    q ← q(ω) q

where:
q(ω) is a quaternion formed from a change in yaw, pitch, and roll.
Next the sensed values are "observed" from the state, as follows:
z is a 3D position from a stereo camera system (3-vector);
gyro are gyroscope values including change in yaw, pitch and roll (3-vector);
a is accelerometer values (3-vector);
g is a direction of gravity (3-vector);
where:

    z = x
    gyro = ω
    a = R(q) (ẍ − g)

where:
R(q) is a rotation matrix formed from the quaternion q.
The last equation is the focus, where the accelerometer values are predicted by combining the effects of acceleration due to motion of the device, the effect of gravity, and the absolute orientation of the device. Discrepancies in the predicted values are then propagated back to the state by way of the standard Kalman update equations.
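The accelerometer-prediction step can be sketched for a single (yaw) rotation axis, with a planar rotation standing in for the full quaternion rotation R(q). The sign conventions and names are assumptions consistent with the discussion above, not the patent's exact formulation:

```python
import math

def predict_accel(accel_world, yaw, g=9.8):
    """Predict a 3-axis accelerometer reading for a device at heading yaw
    whose world-frame motion acceleration is accel_world = (east, north, up).
    Gravity is folded in so a resting device reads +g on its up axis."""
    ax, ay, az = accel_world
    c, s = math.cos(yaw), math.sin(yaw)
    # Rotate the world-frame horizontal components into the device frame
    return (c * ax + s * ay, -s * ax + c * ay, az + g)
```

In the filter, the discrepancy between this prediction and the actual accelerometer sample is what drives the correction of the orientation estimate.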
It should be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof. Furthermore, U.S. Pat. No. 6,982,697 is hereby incorporated herein by reference for all purposes.
The invention claimed is:
1. A system comprising:
an orientation inferring subsystem including a monitor configured to visually observe a motion of an object relative to the monitor;
a position inferring subsystem including a target monitor coupled to the object and configured to determine a coarse position of the object relative to a target separate from the object; and
a gaming subsystem for using the observed motion from the orientation inferring subsystem and the determined coarse position from the position inferring subsystem to control a game function.
2. The system of claim 1 wherein the monitor includes at least one camera.
3. The system of claim 1 wherein the target monitor includes at least one camera.
4. The system of claim 2 wherein an external frame of acceleration of the object is determined using time-elapsed position information received by the at least one camera.
5. The system of claim 2 wherein the at least one camera is used to visually observe the object using infrared light.
6. The system of claim 2 wherein the orientation inferring subsystem updates an orientation of the object based on angular information.
7. The system of claim 1 wherein the orientation inferring subsystem infers an orientation of the object relative to a display or a television.
8. The system of claim 3 wherein the at least one camera in the position inferring subsystem is used to determine a three-dimensional position of the object.
9. An apparatus comprising:
a monitor configured to visually observe a motion of an object relative to the monitor;
a target monitor coupled to the object and configured to determine a coarse position of the object relative to a target separate from the object; and
a gaming device for processing the observed motion from the monitor and the determined coarse position from the target monitor to control a game function.
10. The apparatus of claim 9 wherein the monitor includes at least one camera.
11. The apparatus of claim 9 wherein the target monitor includes at least one camera.
12. The apparatus of claim 10 wherein an external frame of acceleration of the object is determined using time-elapsed position information received by the at least one camera.
13. The apparatus of claim 10 wherein the at least one camera is used to visually observe the object using infrared light.
14. The apparatus of claim 10 wherein the monitor updates an orientation of the object based on angular information.
15. The apparatus of claim 9 wherein the monitor infers an orientation of the object relative to a display or a television.
16. The apparatus of claim 11 wherein the at least one camera is used to determine a three-dimensional position of the object.
17. A method comprising:
observing a motion of an object relative to a monitor;
determining a coarse position of the object relative to a target separate from the object;
processing the observed motion from the monitor and the determined coarse position from a target monitor; and
controlling a game function with the processed observed motion and determined coarse position.
18. The method of claim 17 wherein the monitor includes at least one camera.
19. The method of claim 17 wherein the target monitor includes at least one camera.
20. The method of claim 18 further comprising determining an external frame of acceleration of the object using time-elapsed position information received by the at least one camera.
21. The method of claim 18 wherein the observing further comprises observing the object using infrared light.
22. The method of claim 18 further comprising updating an orientation of the object based on angular information.
23. The method of claim 17 further comprising inferring an orientation of the object relative to a display or a television.
24. The method of claim 19 wherein the determining further comprises determining a three-dimensional position of the object using the at least one camera.
* * * * *