SAMSUNG EXHIBIT 1015
Samsung Electronics America Inc. v. Uniloc Luxembourg, S.A.
IPR2018-01664

US 7,180,502 B2
Page 2

U.S. PATENT DOCUMENTS

6,466,198 B1       10/2002   Feinstein ............. 345/158
2002/0190947 A1    12/2002   Feinstein ............. 345/158
2004/0027330 A1     2/2004   Bradski ............... 345/158

FOREIGN PATENT DOCUMENTS

WO   WO 01/86920    11/2001
WO   WO 03/001340    1/2003

OTHER PUBLICATIONS

Yee, Ka-Ping, "Peephole Displays: Pen Interaction on Spatially Aware Handheld Computers", CHI 2003, Ft. Lauderdale, Florida, 8 pages, Apr. 5, 2003.
Patent Application entitled "Distinguishing Tilt and Translation Motion Components in Handheld Devices", 68 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Motion Sensor Engagement for a Handheld Device", 68 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Selective Engagement of Motion Detection", 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Gesture Based Navigation of a Handheld User Interface", 69 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Translation Controlled Cursor", 66 pages specification, claims and abstract, 11 pages of drawings, inventors Reinhardt et al., Mar. 23, 2004.
Patent Application entitled "Selective Engagement of Motion Input Modes", 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Feedback Based User Interface for Motion Controlled Handheld Devices", 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Spatial Signatures", 64 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Motion Controlled Remote Controller", 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Gesture Identification of Controlled Devices", 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Environmental Modeling for Motion Controlled Handheld Devices", 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Gesture Based User Interface Supporting Preexisting Symbols", 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Context Dependent Gesture Response", 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Customizable Gesture Mappings for Motion Controlled Handheld Devices", 67 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "User Definable Gestures for Motion Controlled Handheld Devices", 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Non-Uniform Gesture Precision", 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
Patent Application entitled "Dynamic Adaptation of Gestures for Motion Controlled Handheld Devices", 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al., Mar. 23, 2004.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, mailed Nov. 3, 2005, re PCT/US2005/007409 filed Mar. 7, 2005, 13 pages.
U.S. Patent    Feb. 20, 2007    Sheet 1 of 11    US 7,180,502 B2

[FIG. 2 (drawing): motion detector 22 of handheld device 10, showing a z-axis accelerometer (24c) among the sensors feeding processor 32.]
U.S. Patent    Feb. 20, 2007    Sheet 2 of 11    US 7,180,502 B2

[FIG. 3 (drawing): raw data from the x-, y-, and z-axis accelerometers (23a-23c), cameras (25a-25c), gyros (27a-27c), and rangefinders (29a-29c) feeding processor 32, which outputs translation x, y, z and rotation x, y, z components.]

[FIG. 5 (drawing): processor 16 mapping the translation and rotation components to device behavior 50.]
U.S. Patent    Feb. 20, 2007    Sheet 3 of 11    US 7,180,502 B2

[FIG. 6 (flowchart 60): x-, y-, and z-acceleration data (62a-62c) are processed (64); a dominant axis is identified and the corresponding component augmented (augment X, Y, or Z); the result is processed into device behavior (72), optionally informed by user preferences.]

[FIG. 7 (flowchart): x-, y-, and z-axis accelerations are detected (82a-82c); if none exceeds a threshold (84), the zero-point is set (86).]
U.S. Patent    Feb. 20, 2007    Sheet 4 of 11    US 7,180,502 B2

[FIG. 8 (drawing): scrubbing functionality with a handheld device for virtual desktop navigation; largely illegible in this scan.]

[FIG. 9 (flowchart, steps 104-110): disengage motion sensitivity; move device to left; reengage motion sensitivity; move device to right; repeat while further movement is needed.]
U.S. Patent    Feb. 20, 2007    Sheet 5 of 11    US 7,180,502 B2

[FIG. 10A (drawing): menu navigation using gesture input on handheld device 120, with menu items 124a-124e and gesture paths 130-134.]

[FIG. 10B (drawing): example gestures (133-139) for functions such as up, down, left, right, in, and out.]

[FIG. 11 (drawing): map navigation using motion input (144, 146).]
U.S. Patent    Feb. 20, 2007    Sheet 6 of 11    US 7,180,502 B2

[FIG. 12A (drawing): a form of motion input cursor navigation.]

[FIG. 12B (drawing): another form of motion input cursor navigation (161, 163, 165).]
U.S. Patent    Feb. 20, 2007    Sheet 7 of 11    US 7,180,502 B2

[FIG. 13 (flowchart 170): receive raw motion data (172); process (174); check device state (176); analyze motion detector output (178); determine whether the motion is meaningful/recognizable given the state and above threshold (182); respond accordingly (184).]

[FIG. 14 (drawing): handheld device 10 using spatial signatures with an authenticator 206 over a communication network (200).]

[FIG. 15 (drawing): motion input of handheld device 10 controlling multiple other devices (202) over communication networks (224, 226).]
U.S. Patent    Feb. 20, 2007    Sheet 8 of 11    US 7,180,502 B2

[FIG. 16 (flowchart 230): receive raw motion data (232); process (234); determine motion and orientation (236), e.g., translating along x-axis, rotating around z-axis, oriented at an angle, still (237a-237n); determine environment based on motion and orientation (238), e.g., face down on table, falling, held in hand, on train (239a-239n); map environment to behavior, e.g., mute, engage, power down chips, increase motion activation threshold (241a-241n).]
U.S. Patent    Feb. 20, 2007    Sheet 9 of 11    US 7,180,502 B2

[FIG. 17 (drawing): example gestures (254, 258, 260) input using handheld device 10.]

[FIG. 18 (flowchart, steps 272-286): receive raw motion data; process; match the motion against gesture database 282; behave accordingly (286).]

[FIG. 21 (flowchart, steps 330-342): receive user indication regarding gesture creation; receive raw motion data; process; store motion as gesture in the gesture database; receive function mapping information and store it in a function mapping database.]
U.S. Patent    Feb. 20, 2007    Sheet 10 of 11    US 7,180,502 B2

[FIG. 19 (flowchart 290): receive raw motion data (292); process (294); determine which application is in focus (296); apply the gesture mapping of that application (300a-300d).]

[FIG. 20 (flowchart): user-based gesture mapping, beginning with receipt of raw motion data; largely illegible in this scan.]
U.S. Patent    Feb. 20, 2007    Sheet 11 of 11    US 7,180,502 B2

[FIG. 22 (drawing 350): three gestures input using handheld device 10 with varying levels of precision.]

[FIG. 23 (flowchart 370): receive raw motion data (372); map motion to gesture (376) using a gesture database and a user settings database (384) holding per-user precision, noise, and user-created gestures (378); map gesture to function using a function mapping database holding per-user mappings and user-created functions such as macros and phone numbers; apply context (388), including user identities (381), an environmental model (389), the application in focus (390), and device state (391), to select among functions (392a-392c).]
US 7,180,502 B2

HANDHELD DEVICE WITH PREFERRED MOTION SELECTION

TECHNICAL FIELD

The present invention relates generally to portable devices and, more particularly, to portable devices with a motion interface.

BACKGROUND

The use of computing devices, such as cellular phones and personal digital assistants (PDAs), has grown rapidly. Such devices provide many different functions to users through different types of interfaces, such as keypads and displays. Some computing devices utilize motion as an interface by detecting tilt of the device by a user. Some implementations of a motion interface involve tethering a computing device with fishing lines or carrying large magnetic tracking units that require large amounts of power.

SUMMARY

In accordance with the present invention, a handheld device with a motion interface is provided.

In accordance with a particular embodiment, a motion controlled handheld device includes a display having a viewable surface and operable to generate an image, and a motion detection module operable to detect motion of the device within three dimensions and to identify components of the motion in relation to the viewable surface. The components comprise a first component parallel to the viewable surface, a second component parallel to the viewable surface and perpendicular to the first component, and a third component perpendicular to the viewable surface. The device includes a motion processing module operable to compare the components, to isolate a preferred one of the components based on the comparison, and to adjust a magnitude of the preferred component by an augmentation factor. The device also includes a motion response module operable to modify the image based on the augmented preferred component.

In accordance with another embodiment, a method for controlling a handheld device includes detecting motion of the device within three dimensions and identifying components of the motion in relation to a viewable surface of the device. The components comprise a first component parallel to the viewable surface, a second component parallel to the viewable surface and perpendicular to the first component, and a third component perpendicular to the viewable surface. The method includes comparing the components, isolating a preferred one of the components based on the comparison, adjusting a magnitude of the preferred component by an augmentation factor, and modifying an image displayed on the viewable surface based on the augmented preferred component.
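For illustration only, the following sketch shows one way the selection-and-augmentation flow described above could be realized in code. The function name, the dominance test (largest absolute component), and the default augmentation factor of 2.0 are assumptions made for the sketch, not details drawn from the patent.

```python
# Illustrative sketch of preferred motion selection (not the patented
# implementation): compare the motion components measured relative to the
# viewable surface, isolate the dominant one, and amplify it by an
# augmentation factor before the display is updated.

def select_preferred_motion(dx: float, dy: float, dz: float,
                            augmentation_factor: float = 2.0) -> dict:
    """dx, dy: components parallel to the viewable surface (dy perpendicular
    to dx); dz: component perpendicular to the surface, all in consistent
    units (assumed)."""
    components = {"x": dx, "y": dy, "z": dz}
    # Compare the components and isolate the preferred (dominant) one.
    preferred = max(components, key=lambda axis: abs(components[axis]))
    # Adjust the magnitude of the preferred component by the augmentation
    # factor; suppress the others as presumed noise.
    return {axis: value * augmentation_factor if axis == preferred else 0.0
            for axis, value in components.items()}

# Example: a mostly-horizontal motion; x is isolated and amplified.
print(select_preferred_motion(0.9, 0.1, -0.05))
# {'x': 1.8, 'y': 0.0, 'z': 0.0}
```

Zeroing the non-preferred components is the simplest reading of "isolating"; an implementation could equally attenuate them instead.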
Technical advantages of particular embodiments of the present invention include a handheld device capable of recognizing and amplifying movement of a motion input in a preferred direction while minimizing movement of the motion input in other directions. Accordingly, a user's ability to take advantage of motion interfaces may be enhanced, and the handheld device or applications running on the device may be able to filter out user-induced noise and unintended movements.

Other technical advantages will be readily apparent to one skilled in the art from the following figures, descriptions and claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some or none of the enumerated advantages.
BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of particular embodiments of the invention and their advantages, reference is now made to the following descriptions, taken in conjunction with the accompanying drawings, in which:

FIG. 1 illustrates a handheld device with motion interface capability, in accordance with a particular embodiment;

FIG. 2 illustrates a motion detector of the handheld device of FIG. 1, in accordance with a particular embodiment;

FIG. 3 illustrates the use of motion detector components of the handheld device of FIG. 1, in accordance with a particular embodiment;

FIG. 4 illustrates an example handheld device with motion detection capability, in accordance with a particular embodiment;

FIG. 5 illustrates an example of selection and amplification of a dominant motion of a handheld device, in accordance with a particular embodiment;

FIG. 6 is a flowchart illustrating preferred motion selection, in accordance with a particular embodiment;

FIG. 7 is a flowchart illustrating the setting of a zero-point for a handheld device, in accordance with a particular embodiment;

FIG. 8 illustrates an example of scrubbing functionality with a handheld device for virtual desktop navigation, in accordance with a particular embodiment;

FIG. 9 is a flowchart illustrating the scrubbing process of FIG. 8, in accordance with a particular embodiment;

FIG. 10A illustrates an example of menu navigation using gesture input, in accordance with a particular embodiment;

FIG. 10B illustrates example gestures which may be used to perform various functions at a handheld device, in accordance with a particular embodiment;

FIG. 11 illustrates an example of map navigation using motion input, in accordance with a particular embodiment;

FIG. 12A illustrates a form of motion input cursor navigation, in accordance with a particular embodiment;

FIG. 12B illustrates another form of motion input cursor navigation, in accordance with a particular embodiment;

FIG. 13 is a flowchart illustrating a process for utilizing feedback in response to motion input, in accordance with a particular embodiment;

FIG. 14 illustrates an example system utilizing spatial signatures with a handheld device, in accordance with a particular embodiment;

FIG. 15 illustrates an example system in which motion input of a handheld device controls multiple other devices, in accordance with a particular embodiment;

FIG. 16 is a flowchart illustrating an environmental modeling process of a handheld device, in accordance with a particular embodiment;

FIG. 17 illustrates example gestures which may be mapped to different functions of a handheld device, in accordance with a particular embodiment;

FIG. 18 is a flowchart illustrating the utilization of a preexisting symbol gesture, in accordance with a particular embodiment;

FIG. 19 is a flowchart illustrating the use of context-based gesture mapping, in accordance with a particular embodiment;

FIG. 20 is a flowchart illustrating the use of user-based gesture mapping, in accordance with a particular embodiment;

FIG. 21 is a flowchart illustrating the assignment process for user-created gestures, in accordance with a particular embodiment;

FIG. 22 illustrates three gestures input using a handheld device with varying levels of precision, in accordance with a particular embodiment; and

FIG. 23 is a flowchart illustrating a gesture recognition process utilizing a number of features, in accordance with a particular embodiment.
DETAILED DESCRIPTION

FIG. 1 illustrates a handheld device 10 with motion interface capability, in accordance with a particular embodiment of the present invention. Handheld device 10 can recognize movement of the device and can perform various functions corresponding to such movement. Thus, movement of the device operates as a form of input for the device. Such movement input may directly alter what is being displayed on a device display or may perform other functions. Handheld device 10 may comprise a mobile phone, personal digital assistant (PDA), still camera, video camera, pocket calculator, portable radio or other music or video player, digital thermometer, game device, portable electronic device, watch or any other device capable of being held or worn by a user. As indicated in the examples listed above, handheld device 10 may include wearable portable devices such as watches as well. A watch may include any computing device worn around a user's wrist.

Handheld device 10 includes a display 12, input 14, processor 16, memory 18, communications interface 20 and motion detector 22. Display 12 presents visual output of the device and may comprise a liquid crystal display (LCD), a light emitting diode (LED) or any other type of display for communicating output to a user. Input 14 provides an interface for a user to communicate input to the device. Input 14 may comprise a keyboard, keypad, track wheel, knob, touchpad, stencil or any other component through which a user may communicate an input to device 10. In particular embodiments, display 12 and input 14 may be combined into the same component, such as a touchscreen.

Processor 16 may be a microprocessor, controller or any other suitable computing device or resource. Processor 16 is adapted to execute various types of computer instructions in various computer languages for implementing functions available within handheld device 10. Processor 16 may include any suitable controllers for controlling the management and operation of handheld device 10.

Memory 18 may be any form of volatile or nonvolatile memory including, without limitation, magnetic media, optical media, random access memory (RAM), read only memory (ROM), removable media or any other suitable local or remote memory component. Memory 18 includes components, logic modules or software executable by processor 16. Memory 18 may include various applications 19 with user interfaces utilizing motion input, such as mapping, calendar and file management applications, as further discussed below. Memory 18 may also include various databases, such as gesture databases and function or gesture mapping databases, as further discussed below. Components of memory 18 may be combined and/or divided for processing according to particular needs or desires within the scope of the present invention. Communications interface 20 supports wireless or wireline communication of data and information with other devices, such as other handheld devices, or components.

Motion detector 22 tracks movement of handheld device 10, which may be used as a form of input to perform certain functions. Such input movement may result from a user moving the device in a desired fashion to perform desired tasks, as further discussed below.

It should be understood that handheld device 10 in accordance with particular embodiments may include any suitable processing and/or memory modules for performing the functions as described herein, such as a control module, a motion tracking module, a video analysis module, a motion response module, a display control module and a signature detection module.

In particular embodiments, input movement may be in the form of translation and/or gestures. Translation-based input focuses on a beginning point and endpoint of a motion and differences between such beginning points and endpoints. Gesture-based input focuses on an actual path traveled by the device and is a holistic view of a set of points traversed. As an example, when navigating a map using translation-based input, motion in the form of an "O" may change the display during the movement but may ultimately yield no change between the information displayed prior to the movement and the information displayed at the end of the movement, since the device presumably will be in the same point as it started when the motion ends. However, in a gesture input mode the device will recognize that it has traveled in the form of an "O" because in gesture-based input the device focuses on the path traveled during the motion or movement between a beginning point and an endpoint of the gesture (e.g., even though the beginning and endpoints may be the same). This gesture "O" movement may be mapped to particular functions such that when the device recognizes it has traveled along a path to constitute an "O" gesture, it may perform the functions, as further elaborated upon below. In particular embodiments, movement of the device intended as a gesture may be recognized by the device as a gesture by matching a series, sequence or pattern of accelerations of the movement to those defining gestures of a gesture database.
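To make the distinction concrete, the sketch below contrasts the two input modes on a simplified 2-D position path rather than raw accelerations; the direction quantization and the single-entry gesture database are assumptions made for illustration, not the patent's method.

```python
# Illustrative contrast between translation-based and gesture-based input
# (quantization scheme and sample database are assumed for the sketch).

# Translation-based input: only the difference between the beginning
# point and the endpoint matters.
def net_translation(path):
    """path: list of (x, y) device positions over the motion."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    return (x1 - x0, y1 - y0)

# Gesture-based input: the whole set of points traversed matters.
def quantize_directions(path):
    """Reduce a path to a string of coarse movement directions."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dx, dy = x1 - x0, y1 - y0
        step = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) else \
               ("U" if dy > 0 else "D")
        dirs.append(step)
    # Collapse consecutive duplicates: "RRUULL" becomes "RUL".
    return "".join(d for i, d in enumerate(dirs) if i == 0 or d != dirs[i - 1])

GESTURE_DATABASE = {"RULD": "o_gesture"}  # a crude closed loop (assumed)

def recognize_gesture(path):
    return GESTURE_DATABASE.get(quantize_directions(path))

# A closed "O"-like loop: zero net translation, yet a recognizable gesture.
loop = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(net_translation(loop))    # (0, 0): translation input sees no change
print(recognize_gesture(loop))  # 'o_gesture': gesture input sees the path
```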
Handheld devices in accordance with other embodiments may not include some of the components of the device illustrated in FIG. 1. For example, some embodiments may include a handheld device 10 without an input 14 separate from a motion detector, such that motion of the device provides the sole or primary input for the device. It should be noted that handheld devices in accordance with other embodiments may include additional components not specifically illustrated with respect to device 10.

FIG. 2 illustrates motion detector 22 of FIG. 1, in accordance with a particular embodiment of the present invention. In this embodiment, motion detector 22 includes accelerometers 24a, 24b and 24c; cameras 26a, 26b and 26c; gyros 28a, 28b and 28c; rangefinders 30a, 30b and 30c; and a processor 32.

Accelerometers 24a, 24b and 24c detect movement of the device by detecting acceleration along a respective sensing axis. A particular movement of the device may comprise a series, sequence or pattern of accelerations detected by the accelerometers. When the handheld device is tilted along a sensing axis of a particular accelerometer, the gravitational acceleration along the sensing axis changes. This change in gravitational acceleration is detected by the accelerometer and reflects the tilt of the device. Similarly, translation of the handheld device, or movement of the device without rotation or tilt, also produces a change in acceleration along a sensing axis which is also detected by the accelerometers.
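As a rough illustration of how the gravity projection encodes tilt, the sketch below recovers pitch and roll from a static three-axis reading; the axis conventions and the atan2 formulas are standard practice assumed here, not taken from the patent.

```python
import math

# Illustrative tilt estimation from a static 3-axis accelerometer reading
# (axis conventions assumed). When the device is stationary, the
# accelerometers measure only gravity, and the way gravity projects onto
# each sensing axis reflects the tilt of the device.

def tilt_from_gravity(ax: float, ay: float, az: float):
    """ax, ay, az: accelerations in g along the x, y, z sensing axes.
    Returns (pitch, roll) in degrees."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Flat on a table: gravity falls entirely on the z axis, so no tilt.
print(tilt_from_gravity(0.0, 0.0, 1.0))      # (0.0, 0.0)
# Gravity split between the y and z axes: about 45 degrees of roll.
print(tilt_from_gravity(0.0, 0.707, 0.707))  # (0.0, ~45.0)
```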
In the illustrated embodiment, accelerometer 24a comprises an x-axis accelerometer that detects movement of the device along an x-axis, accelerometer 24b comprises a y-axis accelerometer that detects movement of the device along a y-axis, and accelerometer 24c comprises a z-axis accelerometer that detects movement of the device along a z-axis. In combination, accelerometers 24a, 24b and 24c are able to detect rotation and translation of device 10. As indicated above, rotation and/or translation of device 10 may serve as an input from a user to operate the device.

The use of three accelerometers for motion detection provides certain advantages. For example, if only two accelerometers were used, the motion detector may not be able to disambiguate translation of the handheld device from tilt in the plane of translation. However, using a third, z-axis accelerometer (an accelerometer with a sensing axis at least approximately perpendicular to the sensing axes of the other two accelerometers) enables many cases of tilt to be disambiguated from many cases of translation.
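One simple heuristic that a third sensing axis enables is sketched below: pure tilt merely redirects gravity, so the magnitude of the full three-axis acceleration vector stays near 1 g, whereas translation changes it. This magnitude test is a common technique assumed for illustration, not the patent's specific method.

```python
import math

# Illustrative tilt-versus-translation check enabled by a third sensing
# axis (a common heuristic, assumed here). With only two axes the full
# acceleration magnitude cannot be formed, and tilt in the plane of
# translation is ambiguous; with three axes it can be compared to 1 g.

def classify_motion(ax: float, ay: float, az: float,
                    tolerance_g: float = 0.1) -> str:
    """ax, ay, az: accelerations in g along the three sensing axes."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if abs(magnitude - 1.0) <= tolerance_g:
        return "tilt or stationary"  # gravity merely redirected
    return "translation"             # extra acceleration present

print(classify_motion(0.0, 0.707, 0.707))  # tilt or stationary (|a| = 1 g)
print(classify_motion(0.0, 0.0, 1.4))      # translation (|a| = 1.4 g)
```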
It should be understood that some unique movements may exist that may not be discernible from each other by accelerometers 24a, 24b and 24c. For example, movement comprising a certain rotation and a certain translation may appear to accelerometers 24a, 24b and 24c as the same movement as a different movement that comprises a different particular rotation and a different particular translation. If a motion detector 22 merely included three accelerometers to detect movement (without any additional components to
