US 20090265671 A1

(12) Patent Application Publication    (10) Pub. No.: US 2009/0265671 A1
SACHS et al.                           (43) Pub. Date: Oct. 22, 2009

(54) MOBILE DEVICES WITH MOTION GESTURE RECOGNITION

(75) Inventors: DAVID SACHS, SUNNYVALE, CA (US); STEVEN S. NASIRI, SARATOGA, CA (US); JOSEPH JIANG, SAN JOSE, CA (US); ANJIA GU, STANFORD, CA (US)

Correspondence Address:
SAWYER LAW GROUP PC
2465 E. Bayshore Road, Suite No. 406
PALO ALTO, CA 94303 (US)

(73) Assignee: INVENSENSE, SUNNYVALE, CA (US)

(21) Appl. No.: 12/252,322

(22) Filed: Oct. 15, 2008

Related U.S. Application Data

(63) Continuation-in-part of application No. 12/106,921, filed on Apr. 21, 2008.

Publication Classification

(51) Int. Cl. G06F 3/01 (2006.01)
(52) U.S. Cl. ........................................................ 715/863
(57) ABSTRACT

Mobile devices using motion gesture recognition. In one aspect, processing motion to control a portable electronic device includes receiving, on the device, sensed motion data derived from motion sensors of the device and based on device movement in space. The motion sensors include at least three rotational motion sensors and at least three accelerometers. A particular operating mode is determined to be active while the movement of the device occurs, the mode being one of multiple different operating modes of the device. Motion gesture(s) are recognized from the motion data, from a set of motion gestures available for recognition in the active operating mode. Each of the different operating modes, when active, has a different set of gestures available. State(s) of the device are changed based on the recognized gestures, including changing output of a display screen on the device.
`
`
[Drawing sheets 1-13; the figures are described in the Brief Description of the Figures below.
Sheet 1, FIG. 1: block diagram of device 10, showing display screen 16a, interface devices 16, memory 14, software, application processor 12, motion control 36, hardware processing 30, gyroscopes 26, accelerometers 28, analog sensors 22, and buses 18 and 21.
Sheet 2, FIG. 2: block diagram of the motion processing unit (MPU) 20, including accelerometers and converters.
Sheet 3, FIGS. 3A-3B and 4A-4B: motions 101, 102 of device 10 in space, and the same motions as they appear using augmented sensor data.
Sheet 4, FIGS. 5A-5C and 6A-6C: different user positions and different coordinate systems.
Sheet 5, FIG. 7: system for producing augmented data.
Sheet 6, FIGS. 8A-8B and 10A-10B: rotational movement of the device (194) and motion data of example shake gestures.
Sheet 7, FIG. 9: flow diagram of method 200: receive sensed motion data (203); determine active operating mode of device (204); select set of gestures which can be recognized in active operating mode (205); analyze motion data to recognize motion gesture(s) (206); one or more states of device changed based on recognized motion gesture(s).
Sheet 8, FIGS. 11A-11F: magnitude peaks for gesture recognition (220, 246, 272).
Sheet 9, FIGS. 12A-12B and 13A-13B: tap gestures and rejection of particular spikes in motion data (294, 296, 298, 299).
Sheet 10, FIGS. 14 and 15: motion data of an example circle gesture (300) and examples of character gestures.
Sheet 11, FIG. 16: a set of data features of device movement.
Sheet 12, FIG. 17: system for recognizing and processing gestures including data features.
Sheet 13, FIG. 18: distributing the functions of the gesture recognition system (400): 6-axis device output (digital/analog) (402); critical point / feature data (404); programmable motion logic (406); output.]
`
MOBILE DEVICES WITH MOTION GESTURE RECOGNITION
`
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application No. 61/022,143, filed Jan. 18, 2008, entitled "Motion Sensing Application Interface"; and
[0002] this application is a continuation-in-part of U.S. patent application Ser. No. 12/106,921 (4360P), filed Apr. 21, 2008, entitled "Interfacing Application Programs and Motion Sensors of a Device";
[0003] all of which are incorporated herein by reference in their entireties.
`
`FIELD OF THE INVENTION
[0004] The present invention relates generally to motion sensing devices, and more specifically to recognizing motion gestures based on motion sensors of a motion sensing device.
`
`BACKGROUND OF THE INVENTION
[0005] Motion sensors, such as inertial sensors like accelerometers or gyroscopes, can be used in electronic devices. Accelerometers can be used for measuring linear acceleration, and gyroscopes for measuring angular velocity, of a moved device. The markets for motion sensors include mobile phones, video game controllers, PDAs, mobile internet devices (MIDs), personal navigational devices (PNDs), digital still cameras, digital video cameras, and many more. For example, cell phones may use accelerometers to detect the tilt of the device in space, which allows a video picture to be displayed in an orientation corresponding to the tilt. Video game console controllers may use accelerometers to detect motion of the hand controller that is used to provide input to a game. Picture and video stabilization is an important feature even in low- or mid-end digital cameras, where lenses or image sensors are shifted to compensate for hand jittering measured by a gyroscope. Global positioning system (GPS) and location-based service (LBS) applications rely on determining an accurate location of the device, and motion sensors are often needed when a GPS signal is attenuated or unavailable, or to enhance the accuracy of GPS location finding.
[0006] Most existing portable (mobile) electronic devices tend to use only the most basic motion sensing, such as an accelerometer with "peak detection" or steady-state measurements. For example, current mobile phones use an accelerometer to determine tilting of the device, which can be determined using a steady-state gravity measurement. Such simple determination cannot be used in more sophisticated applications that use, for example, gyroscopes, or in other applications having precise timing requirements. Without a gyroscope included in the device, the tilting and acceleration of the device are not sensed reliably. And since motion of the device is not always linear or parallel to the ground, measurement of several different axes of motion using an accelerometer or gyroscope is needed for greater accuracy.
[0007] More sophisticated motion sensors typically are not used in electronic devices. Some attempts have been made to use more sophisticated motion sensors in particular applications, such as detecting certain movements. But most of these efforts have failed or are not robust enough as a product. This is because the use of motion sensors to derive motion is complicated. For example, when using a gyroscope, it is not trivial to identify the tilting or movement of a device. Using motion sensors for image stabilization, for sensing location, or for other sophisticated applications requires an in-depth understanding of motion sensors, which makes motion sensing design very difficult.
[0008] Furthermore, everyday portable electronic devices for the consumer market are desired to be low-cost. Yet the most reliable and accurate inertial sensors, such as gyroscopes and accelerometers, are typically too expensive for many consumer products. Low-cost inertial sensors can be used to bring many motion sensing features to portable electronic devices. However, the accuracy of such low-cost sensors is a limiting factor for more sophisticated functionality.
[0009] For example, such functionality can include motion gesture recognition implemented on motion sensing devices to allow a user to input commands or data by moving the device or otherwise causing the device to sense the user's motion. For example, gesture recognition allows a user to easily select particular device functions by simply moving, shaking, or tapping the device. Prior gesture recognition for motion sensing devices typically consists of examining raw sensor data, such as data from gyroscopes or accelerometers, and either hard-coding patterns to look for in this raw data, or using machine learning techniques (such as neural networks or support vector machines) to learn patterns from this data. In some cases, the processing resources required for detecting gestures using machine learning can be reduced by first using machine learning to learn the gesture, and then hard-coding and optimizing the result of the machine learning algorithm.
[0010] Several problems exist with these prior techniques. One problem is that gestures are very limited in their applications and functionality when implemented in portable devices. Another problem is that gestures are often not reliably recognized. For example, raw sensor data is often not the best data to examine for gestures because it can vary greatly from user to user for a particular gesture. In such a case, if one user trains a learning system or hard-codes a pattern detector for that user's gestures, those gestures will not be recognized correctly when a different user uses the device. One example of this is rotation of the wrist. One user might draw a pattern in the air with the device without rotating his wrist at all, while another user might rotate his wrist while drawing the pattern. The resulting raw data will look very different from user to user. A typical solution is to hard-code or train all possible variations of a gesture, but this solution is expensive in processing time and difficult to implement.
[0011] Accordingly, a system and method that provides varied, robust, and accurate gesture recognition with low-cost inertial sensors would be desirable in many applications.
`
`SUMMARY OF THE INVENTION
[0012] The invention of the present application relates to mobile devices providing motion gesture recognition. In one aspect, a method for processing motion to control a portable electronic device includes receiving, on the device, sensed motion data derived from motion sensors of the device, where the sensed motion data is based on movement of the portable electronic device in space. The motion sensors provide six-axis motion sensing and include at least three rotational motion sensors and at least three accelerometers. A particular operating mode is determined to be active while the movement of the device occurs, where the particular operating mode is one of a plurality of different operating modes available in the operation of the device. One or more motion gestures are recognized from the motion data, where the one or more motion gestures are recognized from a set of motion gestures that are available for recognition in the active operating mode of the device. Each of the different operating modes of the device, when active, has a different set of motion gestures available for recognition. One or more states of the device are changed based on the one or more recognized motion gestures, including changing output of a display screen on the device.
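As a rough sketch of the mode-dependent recognition flow described in this aspect, the C fragment below maps each operating mode to its own set of recognizable gestures and dispatches only those. All identifiers here (OperatingMode, Gesture, detect, apply_gesture) and the example mode/gesture sets are hypothetical illustrations, not names or values from this application.

```c
#include <stddef.h>
#include <stdio.h>

/* Hypothetical gesture and operating-mode identifiers. */
typedef enum { GESTURE_SHAKE, GESTURE_TAP, GESTURE_CIRCLE, GESTURE_COUNT } Gesture;
typedef enum { MODE_MENU, MODE_PHONE, MODE_CAMERA, MODE_COUNT } OperatingMode;

/* Six-axis motion sample: three rotational axes, three linear axes. */
typedef struct { float gyro[3]; float accel[3]; } MotionSample;

/* Each operating mode, when active, has a different set of motion
 * gestures available for recognition (assumed example sets). */
static const int MODE_SETS[MODE_COUNT][GESTURE_COUNT] = {
    [MODE_MENU]   = { [GESTURE_SHAKE] = 1, [GESTURE_CIRCLE] = 1 },
    [MODE_PHONE]  = { [GESTURE_TAP] = 1 },
    [MODE_CAMERA] = { [GESTURE_SHAKE] = 1, [GESTURE_TAP] = 1 },
};

/* Stub recognizer: a real detector would analyze the sensed motion
 * data (e.g., peaks in angular velocity) for the given gesture. */
static int detect(Gesture g, const MotionSample *data, size_t n) {
    (void)g; (void)data; (void)n;
    return 0;
}

/* Change one or more device states, e.g., the display output. */
static void apply_gesture(Gesture g) {
    printf("gesture %d recognized: updating display\n", (int)g);
}

/* Recognize gestures only from the set available in the active mode. */
void process_motion(OperatingMode mode, const MotionSample *data, size_t n) {
    for (int g = 0; g < GESTURE_COUNT; ++g)
        if (MODE_SETS[mode][g] && detect((Gesture)g, data, n))
            apply_gesture((Gesture)g);
}
```

Restricting the search to the active mode's set both reduces processing and avoids recognizing gestures that are meaningless in the current context.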
[0013] In another aspect of the invention, a method for recognizing a gesture performed by a user using a motion sensing device includes receiving motion sensor data in device coordinates indicative of motion of the device, the motion sensor data received from a plurality of motion sensors of the motion sensing device including a plurality of rotational motion sensors and linear motion sensors. The motion sensor data is transformed from device coordinates to world coordinates, the motion sensor data in the device coordinates describing motion of the device relative to a frame of reference of the device, and the motion sensor data in the world coordinates describing motion of the device relative to a frame of reference external to the device. A gesture is detected from the motion sensor data in the world coordinates.
[0014] In another aspect of the invention, a system for detecting gestures includes a plurality of motion sensors providing motion sensor data, the motion sensors including a plurality of rotational motion sensors and linear motion sensors. At least one feature detector is each operative to detect an associated data feature derived from the motion sensor data, each data feature being a characteristic of the motion sensor data, and each feature detector outputting feature values describing the detected data feature. At least one gesture detector is each operative to detect a gesture associated with the gesture detector based on the feature values.
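A minimal sketch of this feature-detector/gesture-detector split is given below; the peak feature, the threshold, and the three-peaks-per-window shake rule are assumptions made for illustration, not parameters taken from the application.

```c
#include <stddef.h>

/* A data feature: a characteristic of the motion sensor data. Here,
 * a local magnitude peak described by feature values (index, height). */
typedef struct { size_t index; float height; } PeakFeature;

/* Feature detector: scans a magnitude signal and outputs feature
 * values for peaks above a threshold; returns the number of peaks. */
static size_t detect_peaks(const float *mag, size_t n, float threshold,
                           PeakFeature *out, size_t max_out) {
    size_t count = 0;
    for (size_t i = 1; i + 1 < n && count < max_out; ++i) {
        if (mag[i] > threshold && mag[i] >= mag[i - 1] && mag[i] >= mag[i + 1]) {
            out[count].index = i;
            out[count].height = mag[i];
            ++count;
        }
    }
    return count;
}

/* Gesture detector: works from the feature values rather than raw data.
 * Hypothetical rule: a shake is three or more peaks within a window. */
static int detect_shake(const PeakFeature *peaks, size_t n_peaks, size_t window) {
    for (size_t i = 0; i + 2 < n_peaks; ++i)
        if (peaks[i + 2].index - peaks[i].index <= window)
            return 1;
    return 0;
}
```

Because the gesture detector consumes only the compact feature values, far less data needs to be processed than if every raw sample were examined directly, which is the advantage noted in paragraph [0015] below.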
[0015] Aspects of the present invention provide more flexible, varied, robust, and accurate recognition of motion gestures from inertial sensor data of a mobile or handheld motion sensing device. Multiple rotational motion sensors and linear motion sensors are used, and appropriate sets of gestures can be recognized in different operating modes of the device. The use of world coordinates for sensed motion data allows minor variations in motions from user to user during gesture input to be recognized as the same gesture without significant additional processing. The use of data features in motion sensor data allows gestures to be recognized with reduced processing compared to processing all the motion sensor data.
`
`BRIEF DESCRIPTION OF THE FIGURES
[0016] FIG. 1 is a block diagram of a motion sensing device suitable for use with the present invention;
[0017] FIG. 2 is a block diagram of one embodiment of a motion processing unit suitable for use with the present invention;
[0018] FIGS. 3A and 3B are diagrammatic illustrations showing different motions of a device in space, as moved by a user performing a gesture;
[0019] FIGS. 4A and 4B are diagrammatic illustrations showing the motions of FIGS. 3A and 3B as appearing using augmented sensor data;
[0020] FIGS. 5A-5C are diagrammatic illustrations showing different user positions when using a motion sensing device;
[0021] FIGS. 6A-6C are diagrammatic illustrations showing different coordinate systems for sensing motion data;
[0022] FIG. 7 is a block diagram illustrating a system of the present invention for producing augmented data for recognizing motion gestures;
[0023] FIGS. 8A and 8B are diagrammatic illustrations showing rotational movement of a device indicating whether or not a user is intending to input a gesture;
[0024] FIG. 9 is a flow diagram illustrating a method of the present invention for recognizing gestures based on an operating mode of the portable electronic device;
[0025] FIGS. 10A and 10B are diagrammatic illustrations of motion data of example shake gestures;
[0026] FIGS. 11A-11F are diagrammatic illustrations showing magnitude peaks for gesture recognition;
[0027] FIGS. 12A and 12B are diagrammatic illustrations of two examples of tap gestures;
[0028] FIGS. 13A and 13B are diagrammatic illustrations of detecting a tap gesture by rejecting particular spikes in motion data;
[0029] FIG. 14 is a diagrammatic illustration of motion data of an example circle gesture;
[0030] FIG. 15 is a diagrammatic illustration of examples of character gestures;
[0031] FIG. 16 is a diagrammatic illustration showing one example of a set of data features of device movement that can be processed for gestures;
[0032] FIG. 17 is a block diagram illustrating one example of a system for recognizing and processing gestures including data features; and
[0033] FIG. 18 is a block diagram illustrating one example of distributing the functions of the gesture recognition system of FIG. 17.
`
`DETAILED DESCRIPTION
[0034] The present invention relates generally to motion sensing devices, and more specifically to recognizing motion gestures using motion sensors of a motion sensing device. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiment and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiment shown but is to be accorded the widest scope consistent with the principles and features described herein.
[0035] To more particularly describe the features of the present invention, please refer to FIGS. 1-18 in conjunction with the discussion below.
[0036] FIG. 1 is a block diagram of one example of a motion sensing system or device 10 suitable for use with the present invention. Device 10 can be implemented as a device or apparatus, such as a portable device that can be moved in space by a user and whose motion and/or orientation in space is therefore sensed. For example, such a portable device can be a mobile phone, personal digital assistant (PDA), video game player, video game controller, navigation device, mobile internet device (MID), personal navigation device (PND), digital still camera, digital video camera, binoculars, telephoto lens, or other portable device, or a combination of one or more of these devices. In some embodiments, the device 10 is a self-contained device that includes its own display and other output devices in addition to input devices. In other embodiments, the portable device 10 only functions in conjunction with a non-portable device such as a desktop computer, electronic tabletop device, server computer, etc., which can communicate with the moveable or portable device 10, e.g., via network connections.
[0037] Device 10 includes an application processor 12, memory 14, interface devices 16, a motion processing unit 20, analog sensors 22, and digital sensors 24. Application processor 12 can be one or more microprocessors, central processing units (CPUs), or other processors which run software programs for the device 10. For example, different software application programs such as menu navigation software, games, camera function control, navigation software, and phone software, or a wide variety of other software and functional interfaces, can be provided. In some embodiments, multiple different applications can be provided on a single device 10, and in some of those embodiments, multiple applications can run simultaneously on the device 10. In some embodiments, the application processor implements multiple different operating modes on the device 10, each mode allowing a different set of applications to be used on the device and a different set of gestures to be detected. This is described in greater detail below with respect to FIG. 9.
[0038] Multiple layers of software can be provided on a computer readable medium, such as electronic memory or other storage medium such as hard disk, optical disk, etc., for use with the application processor 12. For example, an operating system layer can be provided for the device 10 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of the device 10. A motion algorithm layer can provide motion algorithms that provide lower-level processing for raw sensor data provided from the motion sensors and other sensors. A sensor device driver layer can provide a software interface to the hardware sensors of the device 10.
[0039] Some or all of these layers can be provided in software 13 of the processor 12. For example, in some embodiments, the processor 12 can implement the gesture processing and recognition described herein based on sensor inputs from a motion processing unit (MPU™) 20 (described below). Other embodiments can allow a division of processing between the MPU 20 and the processor 12 as is appropriate for the applications and/or hardware used, where some of the layers (such as lower-level software layers) are provided in the MPU. For example, in embodiments allowing processing by the MPU 20, an API layer can be implemented in layer 13 of processor 12 which allows communication of the states of application programs running on the processor 12 to the MPU 20, as well as API commands (e.g., over bus 21), allowing the MPU 20 to implement some or all of the gesture processing and recognition described herein. Some embodiments of API implementations in a motion detecting device are described in co-pending U.S. patent application Ser. No. 12/106,921, incorporated herein by reference in its entirety.
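The referenced co-pending application defines the actual API. Purely to give the flavor of such a layer, a hypothetical processor-to-MPU interface over bus 21 might look like the following; the register numbers, function names, and state values are all invented for illustration and do not reflect the referenced application.

```c
#include <stdint.h>

/* Hypothetical application states communicated to the MPU. */
typedef enum { APP_STATE_MENU, APP_STATE_PHONE, APP_STATE_CAMERA } AppState;

/* Assumed register map; not from the co-pending application. */
#define REG_APP_STATE 0x10
#define REG_GESTURE   0x11

/* Stubbed low-level transfers over the interface bus (e.g., I2C/SPI). */
static int bus_write(uint8_t reg, uint8_t value) { (void)reg; (void)value; return 0; }
static int bus_read(uint8_t reg, uint8_t *value) { (void)reg; *value = 0; return 0; }

/* Report the active application state so the MPU can select the
 * matching gesture set and perform recognition on the processor's behalf. */
int mpu_set_app_state(AppState state) {
    return bus_write(REG_APP_STATE, (uint8_t)state);
}

/* Poll the MPU for a recognized gesture; writes 0 if none is pending. */
int mpu_poll_gesture(uint8_t *gesture_id) {
    return bus_read(REG_GESTURE, gesture_id);
}
```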
[0040] Device 10 also includes components for assisting the application processor 12, such as memory 14 (RAM, ROM, Flash, etc.) and interface devices 16. Interface devices 16 can be any of a variety of different devices providing input and/or output to a user, such as a display screen, audio speakers, buttons, touchscreen, joystick, slider, knob, printer, scanner, camera, computer network I/O device, other connected peripheral, etc. For example, one interface device 16 included in many embodiments is a display screen 16a for outputting images viewable by the user. Memory 14 and interface devices 16 can be coupled to the application processor 12 by a bus 18.
[0041] Device 10 also can include a motion processing unit (MPU™) 20. The MPU is a device including motion sensors that can measure motion of the device 10 (or a portion thereof) in space. For example, the MPU can measure one or more axes of rotation and one or more axes of acceleration of the device. In preferred embodiments, at least some of the motion sensors are inertial sensors, such as gyroscopes and/or accelerometers. In some embodiments, the components to perform these functions are integrated in a single package. The MPU 20 can communicate motion sensor data to an interface bus 21, e.g., an I2C or Serial Peripheral Interface (SPI) bus, to which the application processor 12 is also connected. In one embodiment, processor 12 is a controller or master of the bus 21. Some embodiments can provide bus 18 as the same bus as interface bus 21.
[0042] MPU 20 includes motion sensors, including one or more rotational motion sensors 26 and one or more linear motion sensors 28. For example, in some embodiments, inertial sensors are used, where the rotational motion sensors are gyroscopes and the linear motion sensors are accelerometers. Gyroscopes 26 can measure the angular velocity of the device 10 (or portion thereof) housing the gyroscopes 26. From one to three gyroscopes can typically be provided, depending on the motion that is desired to be sensed in a particular embodiment. Accelerometers 28 can measure the linear acceleration of the device 10 (or portion thereof) housing the accelerometers 28. From one to three accelerometers can typically be provided, depending on the motion that is desired to be sensed in a particular embodiment. For example, if three gyroscopes 26 and three accelerometers 28 are used, then a 6-axis sensing device is provided, providing sensing in all six degrees of freedom.
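For concreteness, a 6-axis sample and its conversion to physical units might be represented as below; the 16-bit resolution and full-scale ranges are assumed example values, not specifications from this application.

```c
#include <stdint.h>

/* Raw 6-axis sample: three gyroscope axes (angular velocity) and
 * three accelerometer axes (linear acceleration plus gravity). */
typedef struct { int16_t gx, gy, gz, ax, ay, az; } RawSample;

/* Sample in physical units: degrees/second and g. */
typedef struct { float gyro_dps[3]; float accel_g[3]; } PhysSample;

/* Assumed scale factors: +/-2000 deg/s gyroscopes and +/-2 g
 * accelerometers at 16-bit resolution (illustrative values only). */
#define GYRO_DPS_PER_LSB (2000.0f / 32768.0f)
#define ACCEL_G_PER_LSB  (2.0f / 32768.0f)

static PhysSample to_physical(RawSample r) {
    PhysSample p = {
        { r.gx * GYRO_DPS_PER_LSB, r.gy * GYRO_DPS_PER_LSB, r.gz * GYRO_DPS_PER_LSB },
        { r.ax * ACCEL_G_PER_LSB,  r.ay * ACCEL_G_PER_LSB,  r.az * ACCEL_G_PER_LSB },
    };
    return p;
}
```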
[0043] In some embodiments the gyroscopes 26 and/or the accelerometers 28 can be implemented as MicroElectroMechanical Systems (MEMS). Supporting hardware, such as storage registers for the data from motion sensors 26 and 28, can also be provided.
[0044] In some embodiments, the MPU 20 can also include a hardware processing block 30. Hardware processing block 30 can include logic or controllers to provide processing of motion sensor data in hardware. For example, motion algorithms, or parts of algorithms, may be implemented by block 30 in some embodiments, and/or part or all of the gesture recognition described herein. In such embodiments, an API can be provided for the application processor 12 to communicate desired sensor processing tasks to the MPU 20, as described above. Some embodiments can include a hardware buffer in the block 30 to store sensor data received from the motion sensors 26 and 28. A motion control 36, such as a button, can be included in some embodiments to control the input of gestures to the electronic device 10, as described in greater detail below.
[0045] One example of an MPU 20 is described below with reference to FIG. 2. Other examples of an MPU suitable for use with the present invention are described in co-pending U.S. patent application Ser. No. 11/774,488, filed Jul. 6, 2007, entitled "Integrated Motion Processing Unit (MPU) With MEMS Inertial Sensing and Embedded Digital Electronics," and incorporated herein by reference in its entirety. Suitable implementations for MPU 20 in device 10 are available from Invensense, Inc. of Sunnyvale, Calif.
`
`
[0046] The device 10 can also include other types of sensors. Analog sensors 22 and digital sensors 24 can be used to provide additional sensor data about the environment in which the device 10 is situated. For example, one or more barometers, compasses, temperature sensors, optical sensors (such as a camera sensor, infrared sensor, etc.), ultrasonic sensors, radio frequency sensors, or other types of sensors can be provided. In the example implementation shown, digital sensors 24 can provide sensor data directly to the interface bus 21, while the analog sensors can provide sensor data to an analog-to-digital converter (ADC) 34 which supplies the sensor data in digital form to the interface bus 21. In the example of FIG. 1, the ADC 34 is provided in the MPU 20, such that the ADC 34 can provide the converted digital data to hardware processing 30 of the MPU or to the bus 21. In other embodiments, the ADC 34 can be implemented elsewhere in device 10.
[0047] FIG. 2 shows one example of an embodiment of motion processing unit (MPU) 20 suitable for use with inventions described herein. The MPU 20 of FIG. 2 includes an arithmetic logic unit (ALU) 36, which performs processing on sensor data. The ALU 36 can be intelligently controlled by one or more programs stored in and retrieved from program RAM (random access memory) 37. The ALU 36 can control a direct memory access (DMA) block 38, which can read sensor data independently of the ALU 36 or other processing unit, from motion sensors such as gyroscopes 26 and accelerometers 28 as well as other sensors such as temperature sensor 39. Some or all sensors can be provided on the MPU 20 or external to the MPU 20; e.g., the accelerometers 28 are shown in FIG. 2 as external to the MPU 20. The DMA 38 can also provide interrupts to the ALU regarding the status of read or write operations. The DMA 38 can provide sensor data read from sensors to a data RAM 40 for storage. The data RAM 40 provides data to the ALU 36 for processing, and the ALU 36 provides output, including processed data, to the data RAM 40 for storage. Bus 21 (also shown in FIG. 1) can be coupled to the outputs of data RAM 40 and/or FIFO buffer 42 so that application processor 12 can read the data read and/or processed by the MPU 20.
[0048] A FIFO (first in, first out) buffer 42 can be used as a hardware buffer for storing sensor data which can be accessed by the application processor 12 over the bus 21. The use of a hardware buffer such as buffer 42 is described in several embodiments below. For example, a multiplexer 44 can be used to select either the DMA 38 writing raw sensor data to the FIFO buffer 42, or the data RAM 40 writing processed data to the FIFO buffer 42 (e.g., data processed by the ALU 36).
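The behavior of such a FIFO can be pictured with the ring buffer sketched below. This is a software analogy only; buffer 42 itself is a hardware block, and the capacity and overwrite policy here are arbitrary assumptions.

```c
#include <stdint.h>

#define FIFO_CAPACITY 64                      /* arbitrary illustrative depth */

typedef struct { int16_t axes[6]; } Sample;   /* one 6-axis sample */

typedef struct {
    Sample slots[FIFO_CAPACITY];
    unsigned head, tail, count;
} Fifo;

/* Writer side (DMA 38 with raw data, or data RAM 40 with processed
 * data, per multiplexer 44): overwrite the oldest sample when full
 * so the newest motion data is always retained. */
static void fifo_push(Fifo *f, Sample s) {
    if (f->count == FIFO_CAPACITY) {
        f->tail = (f->tail + 1) % FIFO_CAPACITY;
        f->count--;
    }
    f->slots[f->head] = s;
    f->head = (f->head + 1) % FIFO_CAPACITY;
    f->count++;
}

/* Reader side (application processor 12 over bus 21); returns 0 if empty. */
static int fifo_pop(Fifo *f, Sample *out) {
    if (f->count == 0) return 0;
    *out = f->slots[f->tail];
    f->tail = (f->tail + 1) % FIFO_CAPACITY;
    f->count--;
    return 1;
}
```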
[0049] The MPU 20 as shown in FIG. 2 thus can support one or more implementations of processing motion sensor data, including the gesture processing and recognition described herein. For example, the MPU 20 can process raw sensor data fully, where programs in the program RAM 37 can control the ALU 36 to intelligently process sensor data and provide high-level data to the application processor 12 and application programs running thereon. Or, raw sensor data can be pre-processed or processed partially by the MPU 20 using the ALU 36, where the processed data can then be retrieved by the application processor 12 for additional low-level processing on the application processor 12 before providing resulting high-level information to the application programs. Or, raw sensor data can be merely buffered by the MPU 20, where the raw sensor data is retrieved by the application processor 12 for low-level processing. In some embodiments, different applications or application programs running on the same device 10 can use different ones of these processing methods as is most suitable to the application or program.
`
`Recognizing Motion Gestures
[0050] FIGS. 3A and 3B are diagrammatic illustrations showing different motions of a device 10 in space, as moved by a user performing a gesture. A "gesture" or "motion gesture," as referred to herein, is a predefined motion or set of motions of the device which, when recognized by the device to have occurred, triggers one or more associated functions of the device. This motion can be a contained set of motions, such as a shake or circle motion, or can be a simple movement of the device, such as tilting the device about a particular axis or to a particular angle. The associated functions can include, for example, scrolling a list or menu displayed on a display screen of the device in a particular direction, selecting and/or manipulating a displayed item (button, menu, control), providing input such as desired commands or data (such as characters, etc.) to a program or interface of the device, turning main power to the device on or off, and so on.
[0051] An aspect of the invention pre-processes the raw sensor data of the device 10 by changing coordinate systems or converting to other physical parameters, such that the resulting "augmented data" looks similar for all users regardless of the small, unintentional differences in user motion. This augmented data can then be used to train learning systems or hard-code pattern recognizers, resulting in much more robust gesture recognition, and is a cost-effective way of utilizing motion sensor data from low-cost inertial sensors to provide repeatable and robust gesture recognition.
[0052] Some embodiments of the invention use inertial sensors such as gyroscopes and/or accelerometers. Gyroscopes output angular velocity in device coordinates, while accelerometers output the sum of linear acceleration in device coordinates and tilt due to gravity. The outputs of gyroscopes and accelerometers are often not consistent from user to user, or even across uses by the same user, despite the users intending to perform or repeat the same gestures. For example, when a user rotates the device in a vertical direction, a Y-axis gyroscope may sense the movement; however, with a different wrist orientation, the Z-axis gyroscope may sense the movement.
[0053] Training a system to respond to the gyroscope signal differently depending on the tilt of the device (where the tilt is extracted from the accelerometers and the X-axis gyroscope) would be very difficult. However, doing a coordinate transform from device coordinates to world coordinates simplifies the problem. Two users providing different device tilts are both rotating the device downward relative to the world external to the device. If the augmented data of angular velocity in world coordinates is used, then the system will be more easily trained or hard-coded, because the sensor data has been processed to look the same for both users.
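A sketch of such a device-to-world transform is shown below. It assumes the device's orientation is already tracked as a unit quaternion by some sensor-fusion step combining the gyroscopes and accelerometers, which this passage does not spell out; the quaternion source and sample values are assumptions.

```c
#include <stdio.h>

typedef struct { float w, x, y, z; } Quat;  /* unit orientation quaternion */
typedef struct { float x, y, z; } Vec3;

/* Rotate a device-frame vector into world coordinates: v' = q v q^-1,
 * expanded here without building full quaternion products. */
static Vec3 device_to_world(Quat q, Vec3 v) {
    /* t = 2 * cross(q_vec, v) */
    Vec3 t = { 2.0f * (q.y * v.z - q.z * v.y),
               2.0f * (q.z * v.x - q.x * v.z),
               2.0f * (q.x * v.y - q.y * v.x) };
    /* v' = v + w * t + cross(q_vec, t) */
    Vec3 r = { v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
               v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
               v.z + q.w * t.z + (q.x * t.y - q.y * t.x) };
    return r;
}

int main(void) {
    Quat q = { 1.0f, 0.0f, 0.0f, 0.0f };      /* identity: frames aligned */
    Vec3 gyro_device = { 0.0f, 1.2f, 0.0f };  /* rad/s about device Y axis */
    Vec3 gyro_world = device_to_world(q, gyro_device);
    printf("world angular velocity: %.2f %.2f %.2f\n",
           gyro_world.x, gyro_world.y, gyro_world.z);
    return 0;
}
```

After this transform, a downward rotation registers on the same world axis regardless of how each user's wrist tilts the device, which is the normalization the paragraph above relies on.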
[0054] In the examples of FIGS. 3A and 3B, while performing a "straight down" movement of the device 10 as a