[Drawing sheets 1 through 11 (FIGS. 1-23), U.S. Patent, Feb. 20, 2007, US 7,180,502 B2, are images; only captions and box labels survive OCR. Recoverable content per sheet:
Sheet 1: FIG. 1 - handheld device 10 with databases and X-, Y- and Z-axis accelerometers.
Sheet 2: FIG. 3 - accelerometer, gyro, camera and rangefinder raw data fed to a processor yielding translation X/Y/Z and rotation X/Y/Z, driving device behavior.
Sheet 3: FIG. 6 - dominant-axis selection from X-, Y- and Z-acceleration data, with optional user preferences; FIG. 7 - detect X-, Y- and Z-axis acceleration, compare against a threshold, set zero-point.
Sheet 4: FIGS. 8-9 - scrubbing: move device right, disengage motion sensitivity, move device left, reengage motion sensitivity, move device right, repeat if further movement is needed.
Sheet 5: FIGS. 10A-10B and 11 - menu navigation gestures: up, down, left, right, in, out.
Sheet 6: FIGS. 12A-12B - motion input cursor navigation.
Sheet 7: FIG. 13 - receive raw motion data, process, check device state, analyze motion detector output, provide feedback; FIG. 14 - spatial signatures with an authenticator over a communication network; FIG. 15 - motion input controlling devices over a communication network, including an mCommerce application.
Sheet 8: FIG. 16 - receive raw motion data, process, determine motion and orientation (e.g., rotating around z-axis, translating along x-axis, oriented at an activation threshold), determine environment based on motion and orientation (e.g., held in hand, face down on table, falling, on train), map environment to behavior (e.g., engage mute, power down chips to survive impact, increase motion activation threshold).
Sheet 9: FIG. 17 - example gesture paths; FIG. 18 - receive raw motion data, process, map motion to gesture via a gesture database, map gesture to function via a function database, behave; FIG. 21 - receive user indication regarding gesture creation, receive and process raw motion data, store motion as gesture, receive and store function mapping information.
Sheet 10: FIGS. 19-20 - map motion to gesture, then select a function by application in focus (context-based) or by user identity (user-based).
Sheet 11: FIG. 22 - the same gesture input at three levels of precision; FIG. 23 - gesture recognition drawing on user settings (precision, noise, user-created gestures), environmental model, user identities, gesture database, function mapping database (user mappings; user-created functions such as macros and phone numbers), application in focus and device state, mapping motion to gesture and gesture to function.]
`1
`HANDHELD DEVICE WITH PREFERRED
`
`
`
`
`MOTION SELECTION
`
`
`
`
`
`
`
`US 7,180,502 B2
`
`TECHNICAL FIELD
`
`
`
`
`
`
`
`
`
`
`
`The present invention relates generally to portable devices
`
`
`
`
`
`
`
`
`and, more particularly, to portable devices with a motion
`interface.
`
`
BACKGROUND

The use of computing devices, such as cellular phones and personal digital assistants (PDAs), has grown rapidly. Such devices provide many different functions to users through different types of interfaces, such as keypads and displays. Some computing devices utilize motion as an interface by detecting tilt of the device by a user. Some implementations of a motion interface involve tethering a computing device with fishing lines or carrying large magnetic tracking units that require large amounts of power.
`
SUMMARY

In accordance with the present invention, a handheld device with a motion interface is provided.

In accordance with a particular embodiment, a motion controlled handheld device includes a display having a viewable surface and operable to generate an image and a motion detection module operable to detect motion of the device within three dimensions and to identify components of the motion in relation to the viewable surface. The components comprise a first component parallel to the viewable surface, a second component parallel to the viewable surface and perpendicular to the first component, and a third component perpendicular to the viewable surface. The device includes a motion processing module operable to compare the components, to isolate a preferred one of the components based on the comparison, and to adjust a magnitude of the preferred component by an augmentation factor. The device also includes a motion response module operable to modify the image based on the augmented preferred component.

In accordance with another embodiment, a method for controlling a handheld device includes detecting motion of the device within three dimensions and identifying components of the motion in relation to a viewable surface of the device. The components comprise a first component parallel to the viewable surface, a second component parallel to the viewable surface and perpendicular to the first component, and a third component perpendicular to the viewable surface. The method includes comparing the components, isolating a preferred one of the components based on the comparison, adjusting a magnitude of the preferred component by an augmentation factor, and modifying an image displayed on the viewable surface based on the augmented preferred component.
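
By way of illustration only (this sketch is not part of the patent's text), the comparison, isolation and augmentation steps described above might be expressed as follows, assuming the three motion components have already been resolved relative to the viewable surface, that the preferred component is simply the one with the largest magnitude, and that the augmentation factor is a fixed constant; every name and value here is hypothetical:

    # Hypothetical sketch of preferred motion selection.
    # x and y are the components parallel to the viewable surface;
    # z is the component perpendicular to it.
    def select_preferred_motion(x, y, z, augmentation_factor=2.0):
        components = {"x": x, "y": y, "z": z}
        # Compare the components and isolate the preferred (dominant) one.
        preferred = max(components, key=lambda axis: abs(components[axis]))
        result = {axis: 0.0 for axis in components}
        # Adjust the magnitude of the preferred component by the
        # augmentation factor; the remaining components are suppressed.
        result[preferred] = components[preferred] * augmentation_factor
        return preferred, result

    # Example: mostly-horizontal motion is isolated and amplified; a
    # motion response module would then modify the image using it.
    preferred, result = select_preferred_motion(0.9, 0.2, 0.1)
    # preferred == "x"; result == {"x": 1.8, "y": 0.0, "z": 0.0}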
`
`
`
`
`
`
Technical advantages of particular embodiments of the present invention include a handheld device capable of recognizing and amplifying movement of a motion input in a preferred direction while minimizing movement of the motion input in other directions. Accordingly, a user's ability to take advantage of motion interfaces may be enhanced, and the handheld device or applications running on the device may be able to filter out user-induced noise and unintended movements.

Other technical advantages will be readily apparent to one skilled in the art from the following figures, descriptions and claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some or none of the enumerated advantages.
`
BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of particular embodiments of the invention and their advantages, reference is now made to the following descriptions, taken in conjunction with the accompanying drawings, in which:

FIG. 1 illustrates a handheld device with motion interface capability, in accordance with a particular embodiment;

FIG. 2 illustrates a motion detector of the handheld device of FIG. 1, in accordance with a particular embodiment;

FIG. 3 illustrates the use of motion detector components of the handheld device of FIG. 1, in accordance with a particular embodiment;

FIG. 4 illustrates an example handheld device with motion detection capability, in accordance with a particular embodiment;

FIG. 5 illustrates an example of selection and amplification of a dominant motion of a handheld device, in accordance with a particular embodiment;

FIG. 6 is a flowchart illustrating preferred motion selection, in accordance with a particular embodiment;

FIG. 7 is a flowchart illustrating the setting of a zero-point for a handheld device, in accordance with a particular embodiment;

FIG. 8 illustrates an example of scrubbing functionality with a handheld device for virtual desktop navigation, in accordance with a particular embodiment;

FIG. 9 is a flowchart illustrating the scrubbing process of FIG. 8, in accordance with a particular embodiment;

FIG. 10A illustrates an example of menu navigation using gesture input, in accordance with a particular embodiment;

FIG. 10B illustrates example gestures which may be used to perform various functions at a handheld device, in accordance with a particular embodiment;

FIG. 11 illustrates an example of map navigation using motion input, in accordance with a particular embodiment;

FIG. 12A illustrates a form of motion input cursor navigation, in accordance with a particular embodiment;

FIG. 12B illustrates another form of motion input cursor navigation, in accordance with a particular embodiment;

FIG. 13 is a flowchart illustrating a process for utilizing feedback in response to motion input, in accordance with a particular embodiment;

FIG. 14 illustrates an example system utilizing spatial signatures with a handheld device, in accordance with a particular embodiment;

FIG. 15 illustrates an example system in which motion input of a handheld device controls multiple other devices, in accordance with a particular embodiment;

FIG. 16 is a flowchart illustrating an environmental modeling process of a handheld device, in accordance with a particular embodiment;

FIG. 17 illustrates example gestures which may be mapped to different functions of a handheld device, in accordance with a particular embodiment;

FIG. 18 is a flowchart illustrating the utilization of a preexisting symbol gesture, in accordance with a particular embodiment;

FIG. 19 is a flowchart illustrating the use of context-based gesture mapping, in accordance with a particular embodiment;

FIG. 20 is a flowchart illustrating the use of user-based gesture mapping, in accordance with a particular embodiment;

FIG. 21 is a flowchart illustrating the assignment process for user-created gestures, in accordance with a particular embodiment;

FIG. 22 illustrates three gestures input using a handheld device with varying levels of precision, in accordance with a particular embodiment; and

FIG. 23 is a flowchart illustrating a gesture recognition process utilizing a number of features, in accordance with a particular embodiment.
`
DETAILED DESCRIPTION

FIG. 1 illustrates a handheld device 10 with motion interface capability, in accordance with a particular embodiment of the present invention. Handheld device 10 can recognize movement of the device and can perform various functions corresponding to such movement. Thus, movement of the device operates as a form of input for the device. Such movement input may directly alter what is being displayed on a device display or may perform other functions. Handheld device 10 may comprise a mobile phone, personal digital assistant (PDA), still camera, video camera, pocket calculator, portable radio or other music or video player, digital thermometer, game device, portable electronic device, watch or any other device capable of being held or worn by a user. As indicated in the examples listed above, handheld device 10 may include wearable portable devices such as watches as well. A watch may include any computing device worn around a user's wrist.
`
`
`
`
`
`
`
`
Handheld device 10 includes a display 12, input 14, processor 16, memory 18, communications interface 20 and motion detector 22. Display 12 presents visual output of the device and may comprise a liquid crystal display (LCD), a light emitting diode (LED) or any other type of display for communicating output to a user. Input 14 provides an interface for a user to communicate input to the device. Input 14 may comprise a keyboard, keypad, track wheel, knob, touchpad, stencil or any other component through which a user may communicate an input to device 10. In particular embodiments, display 12 and input 14 may be combined into the same component, such as a touchscreen.
`
`
`
`
`
Processor 16 may be a microprocessor, controller or any other suitable computing device or resource. Processor 16 is adapted to execute various types of computer instructions in various computer languages for implementing functions available within handheld device 10. Processor 16 may include any suitable controllers for controlling the management and operation of handheld device 10.
`
`
`
`
`
`
`
Memory 18 may be any form of volatile or nonvolatile memory including, without limitation, magnetic media, optical media, random access memory (RAM), read only memory (ROM), removable media or any other suitable local or remote memory component. Memory 18 includes components, logic modules or software executable by processor 16. Memory 18 may include various applications 19 with user interfaces utilizing motion input, such as mapping, calendar and file management applications, as further discussed below. Memory 18 may also include various databases, such as gesture databases and function or gesture mapping databases, as further discussed below. Components of memory 18 may be combined and/or divided for processing according to particular needs or desires within the scope of the present invention. Communications interface 20 supports wireless or wireline communication of data and information with other devices, such as other handheld devices, or components.
Motion detector 22 tracks movement of handheld device 10, which may be used as a form of input to perform certain functions. Such input movement may result from a user moving the device in a desired fashion to perform desired tasks, as further discussed below.
It should be understood that handheld device 10 in accordance with particular embodiments may include any suitable processing and/or memory modules for performing the functions as described herein, such as a control module, a motion tracking module, a video analysis module, a motion response module, a display control module and a signature detection module.
`
`
`
`
`
`
`
In particular embodiments, input movement may be in the form of translation and/or gestures. Translation-based input focuses on a beginning point and endpoint of a motion and differences between such beginning points and endpoints. Gesture-based input focuses on an actual path traveled by the device and is a holistic view of a set of points traversed. As an example, when navigating a map using translation-based input, motion in the form of an "O" may change the display during the movement but may ultimately yield no change between the information displayed prior to the movement and the information displayed at the end of the movement, since the device presumably will be in the same point as it started when the motion ends. However, in a gesture input mode the device will recognize that it has traveled in the form of an "O" because in gesture-based input the device focuses on the path traveled during the motion or movement between a beginning point and an endpoint of the gesture (e.g., even though the beginning and endpoints may be the same). This gesture "O" movement may be mapped to particular functions such that when the device recognizes it has traveled along a path to constitute an "O" gesture, it may perform the functions, as further elaborated upon below. In particular embodiments, movement of the device intended as a gesture may be recognized by the device as a gesture by matching a series, sequence or pattern of accelerations of the movement to those defining gestures of a gesture database.
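
For illustration only, one minimal way such matching might be sketched, assuming the recorded movement and each stored gesture are equal-length sequences of three-axis acceleration samples and that a plain Euclidean distance against a fixed threshold suffices; a practical recognizer would add resampling, normalization and more robust matching, and all names, patterns and thresholds here are hypothetical:

    import math

    # Hypothetical gesture database: each gesture is defined by a
    # series of (ax, ay, az) acceleration samples.
    GESTURE_DATABASE = {
        "O": [(0.0, 1.0, 0.0), (-1.0, 0.0, 0.0),
              (0.0, -1.0, 0.0), (1.0, 0.0, 0.0)],
    }

    def pattern_distance(seq_a, seq_b):
        # Euclidean distance over matching samples and axes.
        return math.sqrt(sum((a - b) ** 2
                             for pa, pb in zip(seq_a, seq_b)
                             for a, b in zip(pa, pb)))

    def recognize_gesture(motion, database=GESTURE_DATABASE, threshold=1.0):
        # Match the series of accelerations against each defining
        # pattern and return the closest gesture, if close enough.
        best_name, best_dist = None, float("inf")
        for name, pattern in database.items():
            d = pattern_distance(motion, pattern)
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name if best_dist <= threshold else None

A recognized gesture name could then be looked up in a gesture-to-function mapping database of the kind discussed below to perform the mapped function.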
Handheld devices in accordance with other embodiments may not include some of the components of the device illustrated in FIG. 1. For example, some embodiments may include a handheld device 10 without an input 14 separate from a motion detector, such that motion of the device provides the sole or primary input for the device. It should be noted that handheld devices in accordance with other embodiments may include additional components not specifically illustrated with respect to device 10.
`
`
`
`
`
`
`
FIG. 2 illustrates motion detector 22 of FIG. 1, in accordance with a particular embodiment of the present invention. In this embodiment, motion detector 22 includes accelerometers 24a, 24b and 24c; cameras 26a, 26b and 26c; gyros 28a, 28b and 28c; rangefinders 30a, 30b and 30c; and a processor 32.
`
`
`
`
`
`
`
Accelerometers 24a, 24b and 24c detect movement of the device by detecting acceleration along a respective sensing axis. A particular movement of the device may comprise a series, sequence or pattern of accelerations detected by the accelerometers. When the handheld device is tilted along a sensing axis of a particular accelerometer, the gravitational acceleration along the sensing axis changes. This change in gravitational acceleration is detected by the accelerometer and reflects the tilt of the device. Similarly, translation of the handheld device, or movement of the device without rotation or tilt, also produces a change in acceleration along a sensing axis which is also detected by the accelerometers.
`
`
`
`
`
`
`
In the illustrated embodiment, accelerometer 24a comprises an x-axis accelerometer that detects movement of the device along an x-axis, accelerometer 24b comprises a y-axis accelerometer that detects movement of the device along a y-axis and accelerometer 24c comprises a z-axis accelerometer that detects movement of the device along a z-axis. In combination, accelerometers 24a, 24b and 24c are able to detect rotation and translation of device 10. As indicated above, rotation and/or translation of device 10 may serve as an input from a user to operate the device.
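
For illustration only, a sketch of how tilt might be recovered from the gravitational components sensed along the three axes when the device is otherwise at rest; the axis conventions and function names are hypothetical, not taken from the patent:

    import math

    def estimate_tilt(ax, ay, az):
        # ax, ay, az: static accelerations in units of g. With the
        # device level and face up, (ax, ay, az) reads (0, 0, 1).
        pitch = math.atan2(ax, math.sqrt(ay * ay + az * az))
        roll = math.atan2(ay, math.sqrt(ax * ax + az * az))
        return math.degrees(pitch), math.degrees(roll)

    # Example: a 30-degree tilt shifts g*sin(30) = 0.5 g onto the
    # x-axis accelerometer while the z-axis reading drops to ~0.87 g.
    pitch, roll = estimate_tilt(0.5, 0.0, 0.866)
    # pitch ~= 30.0, roll ~= 0.0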
The use of three accelerometers for motion detection provides certain advantages. For example, if only two accelerometers were used, the motion detector may not be able to disambiguate translation of the handheld device from tilt in the plane of translation. However, using a third, z-axis accelerometer (an accelerometer with a sensing axis at least approximately perpendicular to the sensing axes of the other two accelerometers) enables many cases of tilt to be disambiguated from many cases of translation.
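
As a concrete worked example of this point (the numbers are chosen for clarity and are not from the patent): tilting a level device 30° about one in-plane axis shifts g·sin 30° = 0.5 g onto the other in-plane accelerometer, which is exactly what a horizontal acceleration of 0.5 g would produce, so with only two in-plane accelerometers the two cases read identically. The third, z-axis accelerometer separates them: it reads g·cos 30° ≈ 0.87 g under the tilt but remains at a full 1 g under the pure translation.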
`
`
`
`
`
`
`
`
It should be understood that some unique movements may exist that may not be discernible from each other by accelerometers 24a, 24b and 24c. For example, movement comprising a certain rotation and a certain translation may appear to accelerometers 24a, 24b and 24c as the same movement as a different movement that comprises a different particular rotation and a different particular translation. If a motion detector 22 merely included three accelerometers to detect movement (without any additional components to ensure greater accuracy), some unique, undiscernible movements may be mapped to the s