US 20050212756 A1

(19) United States
(12) Patent Application Publication        (10) Pub. No.: US 2005/0212756 A1
     Marvit et al.                         (43) Pub. Date:      Sep. 29, 2005

(54) GESTURE BASED NAVIGATION OF A HANDHELD USER INTERFACE

(76) Inventors: David L. Marvit, San Francisco, CA (US); Albert H. M. Reinhardt, San Francisco, CA (US); B. Thomas Adler, Menlo Park, CA (US); Hitoshi Matsumoto, San Jose, CA (US)

Correspondence Address:
BAKER BOTTS L.L.P.
2001 ROSS AVENUE
SUITE 600
DALLAS, TX 75201-2980 (US)

(21) Appl. No.: 10/807,568

(22) Filed: Mar. 23, 2004

Publication Classification

(51) Int. Cl.7: G09G 5/00; G09G 5/08
(52) U.S. Cl.: 345/156

(57) ABSTRACT

A motion controlled handheld device includes a display having a viewable surface and operable to generate a current image. The device includes a motion detection module operable to detect motion of the device within three dimensions and to identify components of the motion in relation to the viewable surface. The device also includes a gesture database comprising a plurality of gestures, each gesture defined by a motion of the device with respect to a first position of the device. The gestures comprise at least four planar gestures each defined by a motion vector generally aligned in parallel with the viewable surface. The device includes a gesture mapping database mapping each of the gestures to a corresponding command, the gesture mapping database mapping each of the four planar gestures to a corresponding grid navigation command. The device also includes a motion response module operable to identify a matching one of the planar gestures based on the motion and to determine the corresponding one of the grid navigation commands based on the identified planar gesture and a display control module operable to logically parse a viewable image into a plurality of grid sections, to set one of the grid sections as the current image, and to set another one of the grid sections as the current image in response to the determined grid navigation command.

[Representative drawing: FIG. 1, a block diagram of handheld device 10 showing display 12, input 14, processor 16, memory 18, applications 19, communications interface 20, databases 21 and motion detector 22.]

[FIG. 1: Block diagram of handheld device 10, comprising display 12, input 14, processor 16, memory 18, applications 19, communications interface 20, databases 21 and motion detector 22.]

[FIG. 2: Motion detector 22, comprising x-, y- and z-axis accelerometers 24a-24c, cameras 26a-26c, gyros 28a-28c, rangefinders 30a-30c and processor 32.]

[FIG. 4: Example handheld device 10 with motion detection capability, showing x-axis 38, y-axis 39 and z-axis 40.]

[FIG. 3: Processor 32 receives x-, y- and z-axis accelerometer raw data 23a-23c, camera raw data 25a-25c, gyro raw data 27a-27c and rangefinder raw data 29a-29c, and produces output 34 comprising translation along and rotation about the x-, y- and z-axes.]

[FIG. 5: Selection and amplification of a dominant motion of the handheld device, in which processor 16 maps detected motion to device behavior.]

[FIG. 6: Preferred motion selection: x-, y- and z-acceleration data 62a-62c are processed 64, selectively augmented 68a-68c according to user preferences 69, and processed further 70, 72 to produce device behavior.]

[FIG. 7: Flowchart 80 for setting a zero-point: detect x-, y- and z-axis acceleration 82a-82c; when the device is determined to be at rest (YES), set the zero-point 86 and end.]

[FIG. 8: Scrubbing functionality for virtual desktop navigation: a virtual desktop 90 logically divided into a grid of sections, including sections 92, 94, 96 and 98, across which device 10 is moved.]

[FIG. 9: Scrubbing flowchart: move device to right 100; disengage motion sensitivity 102; move device to left 104; reengage motion sensitivity 106; move device to right 108; when done (YES), end.]

[FIG. 10A: Menu navigation using gesture input: device 10 displays menu 120 with menu items 122a-122g, sub-items 124a-124f and selection indicator 126.]

[FIG. 10B: Example gestures which may be mapped to functions: UP, DOWN, LEFT, RIGHT, IN and OUT.]

[FIG. 11: Map navigation using motion input: map 140 parsed into grid sections A-P 142; section B subdivided into sections B1-B9 144; section B2 further subdivided into sections B2a-B2i 146.]

[FIG. 12A: A form of motion input cursor navigation across an image 148 parsed into grid sections A-P, with display views 150-154 and cursor 151.]

[FIG. 12B: Another form of motion input cursor navigation across grid sections, with views 147-149 and cursor positions 158-165.]

[FIG. 13: Flowchart 170 for utilizing feedback in response to motion input: receive raw motion data 172; process 174; check device state 176; analyze motion detector output 178; provide feedback 182, 184; behave 188.]

[FIG. 14: System 200 utilizing spatial signatures: handheld device 10 communicates with mCommerce application 202 and authenticator 204.]

[FIG. 15: Motion input of handheld device 10 (220, 222) controlling remote device (224, 226).]

[FIG. 16: Environmental modeling flowchart 230: receive raw motion data 232; process 234; determine motion and orientation 236 (e.g., translating along x-axis, rotating around z-axis, oriented at given angles, still; 237a-237n); determine environment based on motion and orientation 238 (e.g., face down on table, on train, falling, held in hand; 239a-239n); map environment to behavior 240 (e.g., engage mute, power down chips to survive impact, increase motion activation threshold; 241a-241n, 242); behave.]

[FIG. 17: Example gestures 250-256 which may be mapped to different functions of handheld device 10.]

[FIG. 18: Flowchart 270 for utilizing a preexisting symbol gesture: receive raw motion data; process; map motion to gesture using a gesture database; map gesture to function using a function database; behave (272-286).]

[FIG. 21: Flowchart 330 for assigning user-created gestures: receive user indication regarding gesture creation 332; receive raw motion data 334; process 336; store motion as gesture 338; receive function mapping information 340; store function mapping information 342.]

[FIG. 19: Context-based gesture mapping 290: receive raw motion data 292; process 294; map motion to gesture 296; the gesture maps to function 1, 2, 3 or 4 (300a-300d) depending on which of applications 1-4 is in focus.]

[FIG. 20: User-based gesture mapping 310: receive raw motion data 312; process 314; map motion to gesture 316; the gesture maps to function 1, 2, 3 or 4 (320a-320d) depending on which of users 1-4 is operating the device.]

[FIG. 22: Gestures 350, 352 input using handheld device 10 with varying levels of precision.]

[FIG. 23: Gesture recognition flowchart 370 utilizing a number of features: receive raw motion data 372; process 374; map motion to gesture 376 using gesture database 382, informed by user identities (per-user precision, noise and user-created gestures) and an environmental model; map gesture to function 384 using function mapping database 386 and context 388 (per-user mappings and user-created functions such as macros and phone numbers, environmental model, application in focus, device state), yielding function 1, 2 or 3 (392a-392c).]

GESTURE BASED NAVIGATION OF A HANDHELD USER INTERFACE

TECHNICAL FIELD

[0001] The present invention relates generally to portable devices and, more particularly, to portable devices with a motion interface.

BACKGROUND

[0002] The use of computing devices, such as cellular phones and personal digital assistants (PDAs), has grown rapidly. Such devices provide many different functions to users through different types of interfaces, such as keypads and displays. Some computing devices utilize motion as an interface by detecting tilt of the device by a user. Some implementations of a motion interface involve tethering a computing device with fishing lines or carrying large magnetic tracking units that require large amounts of power.

SUMMARY

[0003] In accordance with the present invention, a handheld device with a motion interface is provided.

[0004] In accordance with a particular embodiment, a motion controlled handheld device includes a display having a viewable surface and operable to generate a current image. The device includes a motion detection module operable to detect motion of the device within three dimensions and to identify components of the motion in relation to the viewable surface. The device also includes a gesture database comprising a plurality of gestures, each gesture defined by a motion of the device with respect to a first position of the device. The gestures comprise at least four planar gestures each defined by a motion vector generally aligned in parallel with the viewable surface. The device includes a gesture mapping database mapping each of the gestures to a corresponding command, the gesture mapping database mapping each of the four planar gestures to a corresponding grid navigation command. The device also includes a motion response module operable to identify a matching one of the planar gestures based on the motion and to determine the corresponding one of the grid navigation commands based on the identified planar gesture and a display control module operable to logically parse a viewable image into a plurality of grid sections, to set one of the grid sections as the current image, and to set another one of the grid sections as the current image in response to the determined grid navigation command.
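
For illustration only (the disclosure does not prescribe any particular implementation; all names, gesture labels and the grid layout below are hypothetical), the gesture mapping database and grid navigation described above might be sketched in Python as follows:

    # Hedged sketch of the gesture-to-grid-navigation mapping of paragraph
    # [0004]; gesture names, command names and grid layout are illustrative.
    GESTURE_TO_COMMAND = {             # gesture mapping database
        "SWIPE_LEFT":  "GRID_LEFT",
        "SWIPE_RIGHT": "GRID_RIGHT",
        "SWIPE_UP":    "GRID_UP",
        "SWIPE_DOWN":  "GRID_DOWN",
    }

    GRID_MOVES = {"GRID_LEFT": (0, -1), "GRID_RIGHT": (0, 1),
                  "GRID_UP": (-1, 0), "GRID_DOWN": (1, 0)}

    def next_grid_section(current, gesture, rows, cols):
        """Display control step: pick the grid section to set as the
        current image in response to a recognized planar gesture."""
        dr, dc = GRID_MOVES[GESTURE_TO_COMMAND[gesture]]
        r, c = current
        return (min(max(r + dr, 0), rows - 1), min(max(c + dc, 0), cols - 1))

For example, with a 4x4 grid of sections, next_grid_section((0, 0), "SWIPE_RIGHT", 4, 4) returns (0, 1), i.e., the section to the right becomes the current image.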

[0005] In accordance with another embodiment, a method for controlling a handheld device includes maintaining a gesture database comprising a plurality of gestures, each gesture defined by a motion of the device with respect to a first position of the device. The gestures comprise at least four planar gestures each defined by a motion vector generally aligned in parallel with the viewable surface. The method includes maintaining a gesture mapping database mapping each of the gestures to a corresponding command, the gesture mapping database mapping each of the four planar gestures to a corresponding grid navigation command. The method includes generating a current image on a viewable surface of the handheld device, logically parsing a viewable image into a plurality of grid sections, setting one of the grid sections as the current image and detecting motion of the device within three dimensions. The method also includes identifying components of the motion in relation to the viewable surface, identifying a matching one of the planar gestures based on the motion, determining the corresponding one of the grid navigation commands based on the identified planar gesture, and setting another one of the grid sections as the current image in response to the determined grid navigation command.

[0006] Technical advantages of particular embodiments include the ability to navigate through a virtual desktop using a plurality of motion interface gestures. This allows menus of information to be flattened, since a user may easily navigate across a particular menu too large to fit on a device's display. Accordingly, a user may need to memorize less menu information, thus increasing the functionality and capability of the device for the user.

[0007] Other technical advantages will be readily apparent to one skilled in the art from the following figures, descriptions and claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some or none of the enumerated advantages.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] For a more complete understanding of particular embodiments of the invention and their advantages, reference is now made to the following descriptions, taken in conjunction with the accompanying drawings, in which:

[0009] FIG. 1 illustrates a handheld device with motion interface capability, in accordance with a particular embodiment;

[0010] FIG. 2 illustrates a motion detector of the handheld device of FIG. 1, in accordance with a particular embodiment;

[0011] FIG. 3 illustrates the use of motion detector components of the handheld device of FIG. 1, in accordance with a particular embodiment;

[0012] FIG. 4 illustrates an example handheld device with motion detection capability, in accordance with a particular embodiment;

[0013] FIG. 5 illustrates an example of selection and amplification of a dominant motion of a handheld device, in accordance with a particular embodiment;

[0014] FIG. 6 is a flowchart illustrating preferred motion selection, in accordance with a particular embodiment;

[0015] FIG. 7 is a flowchart illustrating the setting of a zero-point for a handheld device, in accordance with a particular embodiment;

[0016] FIG. 8 illustrates an example of scrubbing functionality with a handheld device for virtual desktop navigation, in accordance with a particular embodiment;

[0017] FIG. 9 is a flowchart illustrating the scrubbing process of FIG. 8, in accordance with a particular embodiment;

[0018] FIG. 10A illustrates an example of menu navigation using gesture input, in accordance with a particular embodiment;

[0019] FIG. 10B illustrates example gestures which may be used to perform various functions at a handheld device, in accordance with a particular embodiment;

[0020] FIG. 11 illustrates an example of map navigation using motion input, in accordance with a particular embodiment;

[0021] FIG. 12A illustrates a form of motion input cursor navigation, in accordance with a particular embodiment;

[0022] FIG. 12B illustrates another form of motion input cursor navigation, in accordance with a particular embodiment;

[0023] FIG. 13 is a flowchart illustrating a process for utilizing feedback in response to motion input, in accordance with a particular embodiment;

[0024] FIG. 14 illustrates an example system utilizing spatial signatures with a handheld device, in accordance with a particular embodiment;

[0025] FIG. 15 illustrates an example system in which motion input of a handheld device controls multiple other devices, in accordance with a particular embodiment;

[0026] FIG. 16 is a flowchart illustrating an environmental modeling process of a handheld device, in accordance with a particular embodiment;

[0027] FIG. 17 illustrates example gestures which may be mapped to different functions of a handheld device, in accordance with a particular embodiment;

[0028] FIG. 18 is a flowchart illustrating the utilization of a preexisting symbol gesture, in accordance with a particular embodiment;

[0029] FIG. 19 is a flowchart illustrating the use of context-based gesture mapping, in accordance with a particular embodiment;

[0030] FIG. 20 is a flowchart illustrating the use of user-based gesture mapping, in accordance with a particular embodiment;

[0031] FIG. 21 is a flowchart illustrating the assignment process for user-created gestures, in accordance with a particular embodiment;

[0032] FIG. 22 illustrates three gestures input using a handheld device with varying levels of precision, in accordance with a particular embodiment; and

[0033] FIG. 23 is a flowchart illustrating a gesture recognition process utilizing a number of features, in accordance with a particular embodiment.

DETAILED DESCRIPTION

[0034] FIG. 1 illustrates a handheld device 10 with motion interface capability, in accordance with a particular embodiment of the present invention. Handheld device 10 can recognize movement of the device and can perform various functions corresponding to such movement. Thus, movement of the device operates as a form of input for the device. Such movement input may directly alter what is being displayed on a device display or may perform other functions. Handheld device 10 may comprise a mobile phone, personal digital assistant (PDA), still camera, video camera, pocket calculator, portable radio or other music or video player, digital thermometer, game device, portable electronic device, watch or any other device capable of being held or worn by a user. As indicated in the examples listed above, handheld device 10 may include wearable portable devices such as watches as well. A watch may include any computing device worn around a user's wrist.

[0035] Handheld device 10 includes a display 12, input 14, processor 16, memory 18, communications interface 20 and motion detector 22. Display 12 presents visual output of the device and may comprise a liquid crystal display (LCD), a light emitting diode (LED) or any other type of display for communicating output to a user. Input 14 provides an interface for a user to communicate input to the device. Input 14 may comprise a keyboard, keypad, track wheel, knob, touchpad, stencil or any other component through which a user may communicate an input to device 10. In particular embodiments, display 12 and input 14 may be combined into the same component, such as a touchscreen.

[0036] Processor 16 may be a microprocessor, controller or any other suitable computing device or resource. Processor 16 is adapted to execute various types of computer instructions in various computer languages for implementing functions available within handheld device 10. Processor 16 may include any suitable controllers for controlling the management and operation of handheld device 10.

[0037] Memory 18 may be any form of volatile or nonvolatile memory including, without limitation, magnetic media, optical media, random access memory (RAM), read only memory (ROM), removable media or any other suitable local or remote memory component. Memory 18 includes components, logic modules or software executable by processor 16. Memory 18 may include various applications 19 with user interfaces utilizing motion input, such as mapping, calendar and file management applications, as further discussed below. Memory 18 may also include various databases, such as gesture databases and function or gesture mapping databases, as further discussed below. Components of memory 18 may be combined and/or divided for processing according to particular needs or desires within the scope of the present invention. Communications interface 20 supports wireless or wireline communication of data and information with other devices, such as other handheld devices, or components.

[0038] Motion detector 22 tracks movement of handheld device 10, which may be used as a form of input to perform certain functions. Such input movement may result from a user moving the device in a desired fashion to perform desired tasks, as further discussed below.

[0039] It should be understood that handheld device 10 in accordance with particular embodiments may include any suitable processing and/or memory modules for performing the functions as described herein, such as a control module, a motion tracking module, a video analysis module, a motion response module, a display control module and a signature detection module.

[0040] In particular embodiments, input movement may be in the form of translation and/or gestures. Translation-based input focuses on a beginning point and endpoint of a motion and differences between such beginning points and endpoints. Gesture-based input focuses on an actual path traveled by the device and is a holistic view of a set of points traversed. As an example, when navigating a map using translation-based input, motion in the form of an "O" may change the display during the movement but may ultimately yield no change between the information displayed prior to the movement and the information displayed at the end of the movement, since the device presumably will be in the same point as it started when the motion ends. However, in a gesture input mode the device will recognize that it has traveled in the form of an "O" because in gesture-based input the device focuses on the path traveled during the motion or movement between a beginning point and an endpoint of the gesture (e.g., even though the beginning and endpoints may be the same). This gesture "O" movement may be mapped to particular functions such that when the device recognizes it has traveled along a path to constitute an "O" gesture, it may perform the functions, as further elaborated upon below. In particular embodiments, movement of the device intended as a gesture may be recognized by the device as a gesture by matching a series, sequence or pattern of accelerations of the movement to those defining gestures of a gesture database.
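
As a rough sketch of the matching step just described (one possible approach only; the disclosure does not specify a matching algorithm, and all names below are hypothetical), a stored gesture could be a template sequence of acceleration samples compared against the detected sequence:

    # Illustrative only: naive template matching of a detected acceleration
    # sequence against a gesture database; a real system might resample the
    # input or use dynamic time warping instead of this length-locked check.
    import math

    def match_gesture(samples, gesture_db, threshold=1.0):
        """samples and each template are equal-length lists of (ax, ay, az)."""
        best_name, best_dist = None, float("inf")
        for name, template in gesture_db.items():
            if len(template) != len(samples):
                continue
            dist = math.sqrt(sum((a - b) ** 2
                                 for s, t in zip(samples, template)
                                 for a, b in zip(s, t)))
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= threshold else None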

[0041] Handheld devices in accordance with other embodiments may not include some of the components of the device illustrated in FIG. 1. For example, some embodiments may include a handheld device 10 without an input 14 separate from a motion detector, such that motion of the device provides the sole or primary input for the device. It should be noted that handheld devices in accordance with other embodiments may include additional components not specifically illustrated with respect to device 10.

[0042] FIG. 2 illustrates motion detector 22 of FIG. 1, in accordance with a particular embodiment of the present invention. In this embodiment, motion detector 22 includes accelerometers 24a, 24b and 24c; cameras 26a, 26b and 26c; gyros 28a, 28b and 28c; rangefinders 30a, 30b and 30c; and a processor 32.

[0043] Accelerometers 24a, 24b and 24c detect movement of the device by detecting acceleration along a respective sensing axis. A particular movement of the device may comprise a series, sequence or pattern of accelerations detected by the accelerometers. When the handheld device is tilted along a sensing axis of a particular accelerometer, the gravitational acceleration along the sensing axis changes. This change in gravitational acceleration is detected by the accelerometer and reflects the tilt of the device. Similarly, translation of the handheld device, or movement of the device without rotation or tilt, also produces a change in acceleration along a sensing axis which is also detected by the accelerometers.
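
To make the tilt relationship concrete (a standard textbook relation, not something stated in the disclosure): with the device at rest, the gravity component measured along a sensing axis is approximately g*sin(theta) for tilt angle theta, so the tilt can be estimated from a static reading:

    import math

    G = 9.81  # standard gravity, m/s^2

    def tilt_angle_deg(axis_accel_mps2):
        """Estimate static tilt of a sensing axis from its measured gravity
        component (valid only when the device is otherwise at rest)."""
        ratio = max(-1.0, min(1.0, axis_accel_mps2 / G))
        return math.degrees(math.asin(ratio))

For instance, a static reading of about 4.9 m/s^2 along an axis suggests a tilt of roughly 30 degrees about that axis.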

[0044] In the illustrated embodiment, accelerometer 24a comprises an x-axis accelerometer that detects movement of the device along an x-axis, accelerometer 24b comprises a y-axis accelerometer that detects movement of the device along a y-axis and accelerometer 24c comprises a z-axis accelerometer that detects movement of the device along a z-axis. In combination, accelerometers 24a, 24b and 24c are able to detect rotation and translation of device 10. As indicated above, rotation and/or translation of device 10 may serve as an input from a user to operate the device.
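
As one illustration of how readings from the three accelerometers combine (again a standard static-orientation estimate, offered only as a sketch; the disclosure does not give these formulas), pitch and roll can be derived from the three axis readings:

    import math

    def pitch_roll_deg(ax, ay, az):
        """Textbook static-orientation estimate from a 3-axis accelerometer,
        valid when the device is not otherwise accelerating."""
        pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll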

[0045] The use of three accelerometers for motion detection provides certain advantages. For example, if only two accelerometers were used, the motion detector may not be able to disambiguate translation of the handheld device from tilt in the plane of translation. However, using a third, z-axis accelerometer (an accelerometer with a sensing axis at least approximately perpendicular to the sensing axes of the other two accelerometers) enables many cases of tilt to be disambiguated from many cases of translation.

[0046] It should be understood that some unique movements may exist that may not be discernible from each other by accelerometers 24a, 24b and 24c. For example, movement comprising a certain rotation and a certain translation may appear to accelerometers 24a, 24b and 24c as the same movement as a different movement that comprises a different particular rotation and a different particular translation. If a motion detector 22 merely included three accelerometers to detect movement (without any additional components to ensure greater accuracy), some unique, undiscernible movements may be mapped to the same function or may not be mapped to a function to avoid confusion.

[0047] As indicated above, motion detector 22 also includes cameras 26a, 26b and 26c, which may comprise charge coupled device (CCD) cameras or other optical sensors. Cameras 26a, 26b and 26c provide another way to detect movement of the handheld device (both tilt and translation). If only one camera were installed on a device for movement detection, tilt of the device may be indistinguishable from translation (without using other motion detection components, such as accelerometers). However, by using at least two cameras, tilt and translation may be distinguished from each other. For example, if two cameras were installed on handheld device 10 (one on the top of the device and one on the bottom of the device), each camera would see the world moving to the right when the device was translated to the left. If the device is lying horizontally and is rotated by lifting its left edge while lowering its right edge, the camera on the bottom will perceive the world as moving to the right while the camera on the top will perceive the world as moving to the left. Thus, when a device is translated, cameras on opposite surfaces will see the world move in the same direction (to the right in the example given). When a device is rotated, cameras on opposite surfaces will see the world move in opposite directions. This deductive process can be reversed. If both cameras see the world moving in the same direction, then the motion detector knows that the device is being translated. If both cameras see the world moving in opposite directions, then the motion detector knows that the device is being rotated.
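
The two-camera rule lends itself to a compact decision procedure (a hypothetical sketch; the sign convention and sensor interface are illustrative assumptions, not from the disclosure):

    def classify_motion(flow_top, flow_bottom, epsilon=1e-3):
        """Classify device motion from signed optical flow measured by
        cameras on opposite surfaces (+ = world appears to move right)."""
        if abs(flow_top) < epsilon and abs(flow_bottom) < epsilon:
            return "still"
        if flow_top * flow_bottom > 0:  # same direction on both surfaces
            return "translation"
        return "rotation"               # opposite directions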

[0048] When the device is rotated, the magnitude of the movement of the world to the cameras is directly related to the magnitude of the rotation of the device. Thus, the amount of the rotation can accurately be determined based on such movement of the world to the cameras. However, when the device is translated, the magnitude of the translation is related to both the magnitude of the movement of the world to the cameras and to the distance to the objects in the field of view of the cameras. Therefore, to accurately determine the amount of translation using cameras alone, some form of information concerning the distance to objects in the camera fields of view must be obtained. However, in some embodiments cameras with rangefinding capability may be used.
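
As a worked example of why distance matters (a standard small-angle approximation, not taken from the disclosure): an apparent angular shift of the scene of theta radians that is due purely to translation corresponds to a translation of roughly d * theta for objects at distance d:

    def estimate_translation_m(angular_shift_rad, distance_m):
        """Small-angle estimate: translation that would produce the observed
        apparent angular shift of objects at the given distance."""
        return distance_m * angular_shift_rad

    # For example, a 0.02 rad apparent shift of objects 0.5 m away suggests
    # about 0.01 m (1 cm) of translation; the same shift with objects 5 m
    # away would suggest ten times as much.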

[0049] It should be understood that even without distance information, optical information can be of significant value when correlated against the information from accelerometers or other sensors. For example, optical camera input may be used to inform the device that no significant motion is taking place. This could provide a solution to problems of drift which may be inherent in using acceleration data to determine absolute position information for certain device functions.
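
One way to picture this drift correction (a hypothetical sketch; the disclosure does not specify how the correlation is performed) is a zero-velocity update: when camera flow indicates the device is still, the velocity integrated from acceleration data is reset so position error stops accumulating:

    def integrate_position(pos_m, vel_mps, accel_mps2, dt_s,
                           camera_flow_mag, still_threshold=1e-3):
        """Dead-reckon position from acceleration, zeroing the integrated
        velocity whenever the cameras report no significant motion."""
        if camera_flow_mag < still_threshold:
            vel_mps = 0.0        # zero-velocity update from optical input
        vel_mps += accel_mps2 * dt_s
        pos_m += vel_mps * dt_s
        return pos_m, vel_mps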

[0050] As discussed above, distance information may be useful to determine amount of translation when cameras are being used to detect movement. In the illustrated embodiment, such distance information is provided by rangefinders 30a, 30b and 30c. Rangefinders 30a, 30b and 30c may comprise ultrasound rangefinders, laser rangefinders or any other suitable distance measuring component. Other components may also be used to determine distance information. For example, cameras with rangefinding capability may be used, and multiple cameras may be utilized on the same side of the device to function as a rangefinder using stereopsis. Determined distance information allows for accurate and explicit computation of the portion of any apparent translation that is due to translation and the portion that is due to rotation.
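
A simplified sketch of that decomposition (assumed geometry, for illustration only; the disclosure does not give this computation): rotation contributes an apparent angular shift directly, while translation contributes a shift of translation/distance, so with a rangefinder distance and a rotation estimate (e.g., from the gyros) the translational part can be separated out:

    def split_apparent_motion(total_shift_rad, rotation_rad, distance_m):
        """Split an observed apparent angular shift into its rotational part
        and the translation (in meters) accounting for the remainder."""
        translational_shift_rad = total_shift_rad - rotation_rad
        return rotation_rad, translational_shift_rad * distance_m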

[0051] As indicated above, motion detector 22 additionally includes gyros 28a, 28b and 28c. Gyros 28a, 28b and 28c are used in combination with the other components of motion detector 22 to provide increased accuracy in detecting movement of device 10.

[0052] Processor 32 processes data from accelerometers 24, cameras 26, gyros 28 and rangefinders 30 to produce an output indicative of the motion of device 10. Processor 32 may comprise a microprocessor, controller or any other suitable computing device or resource, such as a video analysis module for receiving a video stream from each camera. In some embodiments, the processing described herein with respect to processor 32 of motion detector 22 may be performed by processor 16 of handheld device 10 or any other suitable processor, including processors located remote to the device.

[0053] As discussed above, motion detector 22 includes three accelerometers, three cameras, three gyros and three rangefinders. Motion detectors in accordance with other embodime