US 20090265671 A1

(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2009/0265671 A1
     SACHS et al.                            (43) Pub. Date:     Oct. 22, 2009
(54) MOBILE DEVICES WITH MOTION GESTURE RECOGNITION

(75) Inventors: DAVID SACHS, SUNNYVALE, CA (US); STEVEN S. NASIRI,
     SARATOGA, CA (US); JOSEPH JIANG, SAN JOSE, CA (US);
     ANJIA GU, STANFORD, CA (US)

     Correspondence Address:
     SAWYER LAW GROUP PC
     2465 E. Bayshore Road, Suite No. 406
     PALO ALTO, CA 94303 (US)

(73) Assignee: INVENSENSE, SUNNYVALE, CA (US)

(21) Appl. No.: 12/252,322

(22) Filed: Oct. 15, 2008

               Related U.S. Application Data

(63) Continuation-in-part of application No. 12/106,921, filed on Apr. 21, 2008.

               Publication Classification

(51) Int. Cl.
     G06F 3/01          (2006.01)

(52) U.S. Cl. ........................................ 715/863
`
(57)                         ABSTRACT

Mobile devices using motion gesture recognition. In one aspect, processing motion to control a portable electronic device includes receiving, on the device, sensed motion data derived from motion sensors of the device and based on device movement in space. The motion sensors include at least three rotational motion sensors and at least three accelerometers. A particular operating mode is determined to be active while the movement of the device occurs, the mode being one of multiple different operating modes of the device. Motion gesture(s) are recognized from the motion data from a set of motion gestures available for recognition in the active operating mode. Each of the different operating modes, when active, has a different set of gestures available. State(s) of the device are changed based on the recognized gestures, including changing output of a display screen on the device.
`
Petitioner Samsung Ex-1033, 0001

[Drawing sheets 1-13 of 13, US 2009/0265671 A1, Oct. 22, 2009:

Sheet 1, FIG. 1: block diagram of motion sensing device 10 (display screen 16a, interface devices 16, memory 14, bus 18, application processor 12, software 13, MPU 20 with hardware processing 30, gyroscopes 26, accelerometers 28, motion control 36, ADC 34, bus 21, analog sensors 22, digital sensors 24).
Sheet 2, FIG. 2: block diagram of motion processing unit (MPU) 20 (ALU, program RAM 37, DMA 38, data RAM 40, FIFO 42, multiplexer 44, temperature sensor, gyroscopes 26, accelerometers 28, bus 21).
Sheet 3, FIGS. 3A-3B: motions of device 10 in space as moved by a user performing a gesture; FIGS. 4A-4B: the same motions 100-102 as appearing using augmented sensor data.
Sheet 4, FIGS. 5A-5C: different user positions when using a motion sensing device; FIGS. 6A-6C: different coordinate systems (x, y, z axes) for sensing motion data.
Sheet 5, FIG. 7: block diagram of system 150 producing augmented sensor data: gyroscope and accelerometer data pass through calibration 152/154, gravity reference 158, remove-gravity 156, 3D integration 160, quaternion/rotation matrix 174, and coordinate transforms 164/170, yielding angular velocity in device coordinates 162 and world coordinates, gravitational acceleration in device coordinates, and linear acceleration in device coordinates 180 and world coordinates.
Sheet 6, FIGS. 8A-8B: rotational movement 190-194 (pitch, yaw) of a device indicating whether or not a user intends to input a gesture; FIGS. 10A-10B: motion data of example shake gestures 212, 217.
Sheet 7, FIG. 9: flow diagram of method 200: receive sensed motion data (202); determine active operating mode of device (203); select set of gestures which can be recognized in active operating mode (204); analyze motion data to recognize motion gesture(s) (205-206); change one or more states of device based on recognized motion gesture(s) (207); done (208).
Sheet 8, FIGS. 11A-11F: magnitude peaks 246, 262, 264, 272 for gesture recognition.
Sheet 9, FIGS. 12A-12B: two examples 280-288 of tap gestures; FIGS. 13A-13B: detecting a tap gesture (294, 298) by rejecting particular spikes in motion data.
Sheet 10, FIG. 14: motion data (yaw, pitch) of an example circle gesture 300; FIG. 15: examples 310-324 of character gestures.
Sheet 11, FIG. 16: one example of a set of data features of device movement that can be processed for gestures.
Sheet 12, FIG. 17: block diagram of system 370 for recognizing and processing gestures: raw/augmented data 372 feeds feature detectors 374a-374c (with values 376 and timers 378), which feed gesture detectors 380a-380c (with values 382 and timers 384); orientation 392, amount of movement 394, and abort conditions 396 gate the final gesture output 398.
Sheet 13, FIG. 18: block diagram 400 distributing gesture recognition between a hardwired portion 402 and a programmable portion 408: 6-axis device output (digital/analog), critical point/feature data 404, motion logic 406, output.]
`

`

`
MOBILE DEVICES WITH MOTION GESTURE RECOGNITION

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Application No. 61/022,143, filed Jan. 18, 2008, entitled "Motion Sensing Application Interface," and
[0002] This application is a continuation-in-part of U.S. patent application Ser. No. 12/106,921 (4360P), filed Apr. 21, 2008, entitled "Interfacing Application Programs and Motion Sensors of a Device,"
[0003] all of which are incorporated herein by reference in their entireties.
`
FIELD OF THE INVENTION

[0004] The present invention relates generally to motion sensing devices, and more specifically to recognizing motion gestures based on motion sensors of a motion sensing device.
`
BACKGROUND OF THE INVENTION

[0005] Motion sensors, such as inertial sensors like accelerometers or gyroscopes, can be used in electronic devices. Accelerometers can be used for measuring linear acceleration and gyroscopes can be used for measuring angular velocity of a moved device. The markets for motion sensors include mobile phones, video game controllers, PDAs, mobile internet devices (MIDs), personal navigational devices (PNDs), digital still cameras, digital video cameras, and many more. For example, cell phones may use accelerometers to detect the tilt of the device in space, which allows a video picture to be displayed in an orientation corresponding to the tilt. Video game console controllers may use accelerometers to detect motion of the hand controller that is used to provide input to a game. Picture and video stabilization is an important feature in even low- or mid-end digital cameras, where lens or image sensors are shifted to compensate for hand jittering measured by a gyroscope. Global positioning system (GPS) and location-based service (LBS) applications rely on determining an accurate location of the device, and motion sensors are often needed when a GPS signal is attenuated or unavailable, or to enhance the accuracy of GPS location finding.
[0006] Most existing portable (mobile) electronic devices tend to use only the most basic of motion sensors, such as an accelerometer with "peak detection" or steady-state measurements. For example, current mobile phones use an accelerometer to determine tilting of the device, which can be determined using a steady-state gravity measurement. Such simple determination cannot be used in more sophisticated applications using, for example, gyroscopes or other applications having precise timing requirements. Without a gyroscope included in the device, the tilting and acceleration of the device are not sensed reliably. And since motion of the device is not always linear or parallel to the ground, measurement of several different axes of motion using an accelerometer or gyroscope is needed for greater accuracy.
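As an illustrative sketch (not taken from this publication), the steady-state gravity measurement mentioned in [0006] can yield tilt as follows; the function name, axis convention, and the assumption that the accelerometer reading contains only gravity (device at rest) are all assumptions made here for the example:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate pitch and roll (radians) from a steady-state 3-axis
    accelerometer reading that measures only gravity (device at rest)."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

# Device lying flat: gravity entirely on the z axis, so no tilt.
pitch, roll = tilt_from_gravity(0.0, 0.0, 9.81)
```

Note this is exactly the kind of determination that fails once the device is accelerating, which is why the passage above argues for additional axes and gyroscopes.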
[0007] More sophisticated motion sensors typically are not used in electronic devices. Some attempts have been made to use more sophisticated motion sensors in particular applications, such as detecting motion with certain movements. But most of these efforts have failed or are not robust enough as a product. This is because the use of motion sensors to derive motion is complicated. For example, when using a gyroscope, it is not trivial to identify the tilting or movement of a device. Using motion sensors for image stabilization, for sensing location, or for other sophisticated applications requires in-depth understanding of motion sensors, which makes motion sensing design very difficult.
[0008] Furthermore, everyday portable consumer electronic devices for the consumer market are desired to be low-cost. Yet the most reliable and accurate inertial sensors such as gyroscopes and accelerometers are typically too expensive for many consumer products. Low-cost inertial sensors can be used to bring many motion sensing features to portable electronic devices. However, the accuracy of such low-cost sensors is a limiting factor for more sophisticated functionality.
[0009] For example, such functionality can include motion gesture recognition implemented on motion sensing devices to allow a user to input commands or data by moving the device or otherwise causing the device to sense the user's motion. For example, gesture recognition allows a user to easily select particular device functions by simply moving, shaking, or tapping the device. Prior gesture recognition for motion sensing devices typically consists of examining raw sensor data such as data from gyroscopes or accelerometers, and either hard-coding patterns to look for in this raw data, or using machine learning techniques (such as neural networks or support vector machines) to learn patterns from this data. In some cases the processing resources required for detecting gestures using machine learning can be reduced by first using machine learning to learn the gesture, and then hard-coding and optimizing the result of the machine learning algorithm.
[0010] Several problems exist with these prior techniques. One problem is that gestures are very limited in their applications and functionality when implemented in portable devices. Another problem is that gestures are often not reliably recognized. For example, raw sensor data is often not the best data to examine for gestures because it can vary greatly from user to user for a particular gesture. In such a case, if one user trains a learning system or hard-codes a pattern detector for that user's gestures, these gestures will not be recognized correctly when a different user uses the device. One example of this is wrist rotation. One user might draw a pattern in the air with the device without rotating his wrist at all, but another user might rotate his wrist while drawing the pattern. The resulting raw data will look very different from user to user. A typical solution is to hard-code or train all possible variations of a gesture, but this solution is expensive in processing time and difficult to implement.
[0011] Accordingly, a system and method that provides varied, robust and accurate gesture recognition with low-cost inertial sensors would be desirable in many applications.
`
SUMMARY OF THE INVENTION

[0012] The invention of the present application relates to mobile devices providing motion gesture recognition. In one aspect, a method for processing motion to control a portable electronic device includes receiving, on the device, sensed motion data derived from motion sensors of the device, where the sensed motion data is based on movement of the portable electronic device in space. The motion sensors provide six-axis motion sensing and include at least three rotational motion sensors and at least three accelerometers. A particular operating mode is determined to be active while the movement of the device occurs, where the particular operating mode is one of a plurality of different operating modes available in the operation of the device. One or more motion gestures are recognized from the motion data, where the one or more motion gestures are recognized from a set of motion gestures that are available for recognition in the active operating mode of the device. Each of the different operating modes of the device, when active, has a different set of motion gestures available for recognition. One or more states of the device are changed based on the one or more recognized motion gestures, including changing output of a display screen on the device.
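The mode-dependent recognition in this aspect can be sketched as a simple dispatch; the mode names, gesture names, and the recognizer signature below are illustrative assumptions, not the publication's actual implementation:

```python
# Each operating mode exposes its own set of recognizable gestures.
MODE_GESTURES = {
    "phone": {"shake", "tap"},
    "camera": {"tap", "circle"},
    "navigation": {"shake", "character"},
}

def recognize(active_mode, candidate_gestures):
    """Return only those candidate gestures that are available for
    recognition in the currently active operating mode."""
    available = MODE_GESTURES.get(active_mode, set())
    return [g for g in candidate_gestures if g in available]

# A circle motion is ignored in phone mode but recognized in camera mode.
recognize("phone", ["circle", "shake"])   # -> ["shake"]
recognize("camera", ["circle", "shake"])  # -> ["circle"]
```

Restricting the candidate set per mode is what lets the same physical motion trigger different device states in different modes.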
[0013] In another aspect of the invention, a method for recognizing a gesture performed by a user using a motion sensing device includes receiving motion sensor data in device coordinates indicative of motion of the device, the motion sensor data received from a plurality of motion sensors of the motion sensing device including a plurality of rotational motion sensors and linear motion sensors. The motion sensor data is transformed from device coordinates to world coordinates, the motion sensor data in the device coordinates describing motion of the device relative to a frame of reference of the device, and the motion sensor data in the world coordinates describing motion of the device relative to a frame of reference external to the device. A gesture is detected from the motion sensor data in the world coordinates.
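A minimal sketch of the device-to-world transform in this aspect, assuming the device's orientation is tracked as a unit quaternion (as in the quaternion/rotation-matrix block of FIG. 7); the helper names and quaternion convention (w, x, y, z) are assumptions made for the example:

```python
import math

def quat_to_matrix(q):
    """3x3 rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def device_to_world(v_device, q_orientation):
    """Rotate a sensor vector (e.g., angular velocity or linear
    acceleration) from device coordinates into world coordinates."""
    m = quat_to_matrix(q_orientation)
    return [sum(m[i][j] * v_device[j] for j in range(3)) for i in range(3)]

# 90-degree rotation about z: the device x axis maps to the world y axis.
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
world = device_to_world([1.0, 0.0, 0.0], q)  # approximately [0, 1, 0]
```

Because the transform absorbs the device's own orientation, two users holding the device at different wrist angles produce similar world-coordinate traces for the same gesture.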
[0014] In another aspect of the invention, a system for detecting gestures includes a plurality of motion sensors providing motion sensor data, the motion sensors including a plurality of rotational motion sensors and linear motion sensors. At least one feature detector is each operative to detect an associated data feature derived from the motion sensor data, each data feature being a characteristic of the motion sensor data, and each feature detector outputting feature values describing the detected data feature. At least one gesture detector is each operative to detect a gesture associated with the gesture detector based on the feature values.
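The feature-detector/gesture-detector split can be sketched as below; the choice of a magnitude-peak feature, the thresholds, and the shake rule (two opposing peaks close in time) are illustrative assumptions, not the publication's actual detectors:

```python
def detect_peaks(samples, threshold):
    """Feature detector: emit (index, value) feature values for samples
    whose magnitude exceeds a threshold, a characteristic of the data."""
    return [(i, v) for i, v in enumerate(samples) if abs(v) > threshold]

def detect_shake(peaks, max_gap=5):
    """Gesture detector: call it a shake when two opposing-sign peaks
    occur within max_gap samples of each other."""
    for (i1, v1), (i2, v2) in zip(peaks, peaks[1:]):
        if i2 - i1 <= max_gap and v1 * v2 < 0:
            return True
    return False

samples = [0.1, 2.5, 0.3, -2.8, 0.2]          # one positive, one negative spike
peaks = detect_peaks(samples, threshold=2.0)   # [(1, 2.5), (3, -2.8)]
detect_shake(peaks)                            # True
```

The gesture detector never touches the raw samples, which is the stated processing saving: it consumes only the compact feature values.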
[0015] Aspects of the present invention provide more flexible, varied, robust and accurate recognition of motion gestures from inertial sensor data of a mobile or handheld motion sensing device. Multiple rotational motion sensors and linear motion sensors are used, and appropriate sets of gestures can be recognized in different operating modes of the device. The use of world coordinates for sensed motion data allows minor variations in motions from user to user during gesture input to be recognized as the same gesture without significant additional processing. The use of data features in motion sensor data allows gestures to be recognized with reduced processing compared to processing all the motion sensor data.
`
BRIEF DESCRIPTION OF THE FIGURES

[0016] FIG. 1 is a block diagram of a motion sensing device suitable for use with the present invention;
[0017] FIG. 2 is a block diagram of one embodiment of a motion processing unit suitable for use with the present invention;
[0018] FIGS. 3A and 3B are diagrammatic illustrations showing different motions of a device in space, as moved by a user performing a gesture;
[0019] FIGS. 4A and 4B are diagrammatic illustrations showing the motions of FIGS. 3A and 3B as appearing using augmented sensor data;
[0020] FIGS. 5A-5C are diagrammatic illustrations showing different user positions when using a motion sensing device;
[0021] FIGS. 6A-6C are diagrammatic illustrations showing different coordinate systems for sensing motion data;
[0022] FIG. 7 is a block diagram illustrating a system of the present invention for producing augmented data for recognizing motion gestures;
[0023] FIGS. 8A and 8B are diagrammatic illustrations showing rotational movement of a device indicating whether or not a user is intending to input a gesture;
[0024] FIG. 9 is a flow diagram illustrating a method of the present invention for recognizing gestures based on an operating mode of the portable electronic device;
[0025] FIGS. 10A and 10B are diagrammatic illustrations of motion data of example shake gestures;
[0026] FIGS. 11A-11F are diagrammatic illustrations showing magnitude peaks for gesture recognition;
[0027] FIGS. 12A and 12B are diagrammatic illustrations of two examples of tap gestures;
[0028] FIGS. 13A and 13B are diagrammatic illustrations of detecting a tap gesture by rejecting particular spikes in motion data;
[0029] FIG. 14 is a diagrammatic illustration of motion data of an example circle gesture;
[0030] FIG. 15 is a diagrammatic illustration of examples of character gestures;
[0031] FIG. 16 is a diagrammatic illustration showing one example of a set of data features of device movement that can be processed for gestures;
[0032] FIG. 17 is a block diagram illustrating one example of a system for recognizing and processing gestures including data features;
[0033] FIG. 18 is a block diagram illustrating one example of distributing the functions of the gesture recognition system of FIG. 17.
`
DETAILED DESCRIPTION

[0034] The present invention relates generally to motion sensing devices, and more specifically to recognizing motion gestures using motion sensors of a motion sensing device. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiment and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiment shown but is to be accorded the widest scope consistent with the principles and features described herein.
[0035] To more particularly describe the features of the present invention, please refer to FIGS. 1-18 in conjunction with the discussion below.
[0036] FIG. 1 is a block diagram of one example of a motion sensing system or device 10 suitable for use with the present invention. Device 10 can be implemented as a device or apparatus, such as a portable device that can be moved in space by a user and its motion and/or orientation in space therefore sensed. For example, such a portable device can be a mobile phone, personal digital assistant (PDA), video game player, video game controller, navigation device, mobile internet device (MID), personal navigation device (PND), digital still camera, digital video camera, binoculars, telephoto lenses, or other portable device, or a combination of one or more of these devices. In some embodiments, the device 10 is a self-contained device that includes its own display and other output devices in addition to input devices.
`
In other embodiments, the portable device 10 only functions in conjunction with a non-portable device such as a desktop computer, electronic tabletop device, server computer, etc., which can communicate with the moveable or portable device 10, e.g., via network connections.
[0037] Device 10 includes an application processor 12, memory 14, interface devices 16, a motion processing unit 20, analog sensors 22, and digital sensors 24. Application processor 12 can be one or more microprocessors, central processing units (CPUs), or other processors which run software programs for the device 10. For example, different software application programs such as menu navigation software, games, camera function control, navigation software, and phone software, or a wide variety of other software and functional interfaces, can be provided. In some embodiments, multiple different applications can be provided on a single device 10, and in some of those embodiments, multiple applications can run simultaneously on the device 10. In some embodiments, the application processor implements multiple different operating modes on the device 10, each mode allowing a different set of applications to be used on the device and a different set of gestures to be detected. This is described in greater detail below with respect to FIG. 9.
[0038] Multiple layers of software can be provided on a computer readable medium such as electronic memory or other storage medium such as hard disk, optical disk, etc., for use with the application processor 12. For example, an operating system layer can be provided for the device 10 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of the device 10. A motion algorithm layer can provide motion algorithms that provide lower-level processing for raw sensor data provided from the motion sensors and other sensors. A sensor device driver layer can provide a software interface to the hardware sensors of the device 10.
[0039] Some or all of these layers can be provided in software 13 of the processor 12. For example, in some embodiments, the processor 12 can implement the gesture processing and recognition described herein based on sensor inputs from a motion processing unit (MPU™) 20 (described below). Other embodiments can allow a division of processing between the MPU 20 and the processor 12 as is appropriate for the applications and/or hardware used, where some of the layers (such as lower level software layers) are provided in the MPU. For example, in embodiments allowing processing by the MPU 20, an API layer can be implemented in layer 13 of processor 12 which allows communication of the states of application programs running on the processor 12 to the MPU 20 as well as API commands (e.g., over bus 21), allowing the MPU 20 to implement some or all of the gesture processing and recognition described herein. Some embodiments of API implementations in a motion detecting device are described in co-pending U.S. patent application Ser. No. 12/106,921, incorporated herein by reference in its entirety.
[0040] Device 10 also includes components for assisting the application processor 12, such as memory 14 (RAM, ROM, Flash, etc.) and interface devices 16. Interface devices 16 can be any of a variety of different devices providing input and/or output to a user, such as a display screen, audio speakers, buttons, touch screen, joystick, slider, knob, printer, scanner, camera, computer network I/O device, other connected peripheral, etc. For example, one interface device 16 included in many embodiments is a display screen 16a for outputting images viewable by the user. Memory 14 and interface devices 16 can be coupled to the application processor 12 by a bus 18.
[0041] Device 10 also can include a motion processing unit (MPU™) 20. The MPU is a device including motion sensors that can measure motion of the device 10 (or portion thereof) in space. For example, the MPU can measure one or more axes of rotation and one or more axes of acceleration of the device. In preferred embodiments, at least some of the motion sensors are inertial sensors, such as gyroscopes and/or accelerometers. In some embodiments, the components to perform these functions are integrated in a single package. The MPU 20 can communicate motion sensor data to an interface bus 21, e.g., I2C or Serial Peripheral Interface (SPI) bus, to which the application processor 12 is also connected. In one embodiment, processor 12 is a controller or master of the bus 21. Some embodiments can provide bus 18 as the same bus as interface bus 21.
[0042] MPU 20 includes motion sensors, including one or more rotational motion sensors 26 and one or more linear motion sensors 28. For example, in some embodiments, inertial sensors are used, where the rotational motion sensors are gyroscopes and the linear motion sensors are accelerometers. Gyroscopes 26 can measure the angular velocity of the device 10 (or portion thereof) housing the gyroscopes 26. From one to three gyroscopes can typically be provided, depending on the motion that is desired to be sensed in a particular embodiment. Accelerometers 28 can measure the linear acceleration of the device 10 (or portion thereof) housing the accelerometers 28. From one to three accelerometers can typically be provided, depending on the motion that is desired to be sensed in a particular embodiment. For example, if three gyroscopes 26 and three accelerometers 28 are used, then a 6-axis sensing device is provided, providing sensing in all six degrees of freedom.
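One way to picture the 6-axis data described above is a single sample combining the three gyroscope axes with the three accelerometer axes; the field names and units here are illustrative assumptions, not a format defined by the publication:

```python
from dataclasses import dataclass

@dataclass
class SixAxisSample:
    """One 6-degree-of-freedom reading: angular velocity from three
    gyroscope axes plus linear acceleration from three accelerometer axes."""
    gx: float  # rad/s, rotational motion sensor (gyroscope) x axis
    gy: float  # rad/s, gyroscope y axis
    gz: float  # rad/s, gyroscope z axis
    ax: float  # m/s^2, linear motion sensor (accelerometer) x axis
    ay: float  # m/s^2, accelerometer y axis
    az: float  # m/s^2, accelerometer z axis

    def degrees_of_freedom(self):
        return 6

# Device at rest, slowly rotating about y, with gravity on z.
s = SixAxisSample(0.0, 0.1, 0.0, 0.0, 0.0, 9.81)
```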
[0043] In some embodiments the gyroscopes 26 and/or the accelerometers 28 can be implemented as MicroElectroMechanical Systems (MEMS). Supporting hardware such as storage registers for the data from motion sensors 26 and 28 can also be provided.
[0044] In some embodiments, the MPU 20 can also include a hardware processing block 30. Hardware processing block 30 can include logic or controllers to provide processing of motion sensor data in hardware. For example, motion algorithms, or parts of algorithms, may be implemented by block 30 in some embodiments, and/or part of or all of the gesture recognition described herein. In such embodiments, an API can be provided for the application processor 12 to communicate desired sensor processing tasks to the MPU 20, as described above. Some embodiments can include a hardware buffer in the block 30 to store sensor data received from the motion sensors 26 and 28. A motion control 36, such as a button, can be included in some embodiments to control the input of gestures to the electronic device 10, as described in greater detail below.
[0045] One example of an MPU 20 is described below with reference to FIG. 2. Other examples of an MPU suitable for use with the present invention are described in co-pending U.S. patent application Ser. No. 11/774,488, filed Jul. 6, 2007, entitled "Integrated Motion Processing Unit (MPU) With MEMS Inertial Sensing and Embedded Digital Electronics," and incorporated herein by reference in its entirety. Suitable implementations for MPU 20 in device 10 are available from Invensense, Inc. of Sunnyvale, Calif.
`
`
[0046] The device 10 can also include other types of sensors. Analog sensors 22 and digital sensors 24 can be used to provide additional sensor data about the environment in which the device 10 is situated. For example, sensors such as one or more barometers, compasses, temperature sensors, optical sensors (such as a camera sensor, infrared sensor, etc.), ultrasonic sensors, radio frequency sensors, or other types of sensors can be provided. In the example implementation shown, digital sensors 24 can provide sensor data directly to the interface bus 21, while the analog sensors can provide sensor data to an analog-to-digital converter (ADC) 34 which supplies the sensor data in digital form to the interface bus 21. In the example of FIG. 1, the ADC 34 is provided in the MPU 20, such that the ADC 34 can provide the converted digital data to hardware processing 30 of the MPU or to the bus 21. In other embodiments, the ADC 34 can be implemented elsewhere in device 10.
[0047] FIG. 2 shows one example of an embodiment of motion processing unit (MPU) 20 suitable for use with inventions described herein. The MPU 20 of FIG. 2 includes an arithmetic logic unit (ALU) 36, which performs processing on sensor data. The ALU 36 can be intelligently controlled by one or more programs stored in and retrieved from program RAM (random access memory) 37. The ALU 36 can control a direct memory access (DMA) block 38, which can read sensor data, independently of the ALU 36 or other processing unit, from motion sensors such as gyroscopes 26 and accelerometers 28 as well as other sensors such as temperature sensor 39. Some or all sensors can be provided on the MPU 20 or external to the MPU 20; e.g., the accelerometers 28 are shown in FIG. 2 as external to the MPU 20. The DMA 38 can also provide interrupts to the ALU regarding the status of read or write operations. The DMA 38 can provide sensor data read from sensors to a data RAM 40 for storage. The data RAM 40 provides data to the ALU 36 for processing, and the ALU 36 provides output, including processed data, to the data RAM 40 for storage. Bus 21 (also shown in FIG. 1) can be coupled to the outputs of data RAM 40 and/or FIFO buffer 42 so that application processor 12 can read the data read and/or processed by the MPU 20.
[0048] A FIFO (first in first out) buffer 42 can be used as a hardware buffer for storing sensor data which can be accessed by the application processor 12 over the bus 21. The use of a hardware buffer such as buffer 42 is described in several embodiments below. For example, a multiplexer 44 can be used to select either the DMA 38 writing raw sensor data to the FIFO buffer 42, or the data RAM 40 writing processed data to the FIFO buffer 42 (e.g., data processed by the ALU 36).
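The multiplexer-into-FIFO arrangement of [0048] can be sketched in software terms; the bounded deque standing in for the hardware FIFO 42, the source names, and the single mux setting are illustrative assumptions:

```python
from collections import deque

class MpuFifo:
    """Software sketch of FIG. 2's FIFO buffer 42: a multiplexer selects
    either raw DMA sensor data or ALU-processed data to be queued, and
    the application processor drains the queue over the bus."""

    def __init__(self, depth=32):
        self.buf = deque(maxlen=depth)  # bounded, like a hardware FIFO
        self.source = "raw"             # mux setting: "raw" or "processed"

    def push(self, source, sample):
        if source == self.source:       # only the selected source is queued
            self.buf.append(sample)

    def read(self):
        """Application processor reads oldest sample first (first in, first out)."""
        return self.buf.popleft() if self.buf else None

fifo = MpuFifo()
fifo.push("raw", 1.0)
fifo.push("processed", 2.0)  # ignored: the mux currently selects raw data
fifo.read()                  # 1.0
```

The bounded depth also illustrates a real hardware behavior: if the application processor stops draining the buffer, the oldest samples are eventually lost.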
[0049] The MPU 20 as shown in FIG. 2 thus can support one or more implementations of processing motion sensor data, including the gesture processing and recognition described herein. For example, the MPU 20 can process raw sensor data fully, where programs in the program RAM 37 can control the ALU 36 to intelligently process sensor data and provide high-level data to the application processor 12 and application programs running thereon. Or,
