(12) United States Patent
Liberty

(10) Patent No.: US 7,158,118 B2
(45) Date of Patent: *Jan. 2, 2007

(54) 3D POINTING DEVICES WITH ORIENTATION COMPENSATION AND IMPROVED USABILITY

(75) Inventor: Matthew G. Liberty, Potomac, MD (US)

(73) Assignee: Hillcrest Laboratories, Inc., Rockville, MD (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

This patent is subject to a terminal disclaimer.

(21) Appl. No.: 11/119,719

(22) Filed: May 2, 2005

(65) Prior Publication Data

US 2005/0243062 A1    Nov. 3, 2005

(60) Related U.S. Application Data

Provisional application No. 60/641,410, filed on Jan. 5, 2005, provisional application No. 60/566,444, filed on Apr. 30, 2004, provisional application No. 60/612,571, filed on Sep. 23, 2004.

(51) Int. Cl.
G09G 5/00 (2006.01)
G09G 5/08 (2006.01)

(52) U.S. Cl. ...................... 345/158; 345/156; 345/157; 345/163

(58) Field of Classification Search ......... 345/156–169, 345/173–179; 178/18.01–18.07, 19.01–19.06
See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

4,787,051 A    11/1988  Olson
4,839,838 A     6/1989  LaBiche et al.
5,045,843 A     9/1991  Hansen
5,138,154 A     8/1992  Hotelling
5,181,181 A *   1/1993  Glynn ..................... 702/141
5,359,348 A    10/1994  Pilcher et al.
5,440,326 A     8/1995  Quinn
5,506,605 A *   4/1996  Paley ..................... 345/163
5,698,784 A    12/1997  Hotelling et al.
5,703,623 A    12/1997  Hall et al.
5,825,350 A    10/1998  Case, Jr. et al.
5,835,156 A    11/1998  Blonstein et al.
5,898,421 A     4/1999  Quinn
5,912,612 A     6/1999  DeVolpi
5,955,988 A     9/1999  Blonstein et al.
6,002,394 A    12/1999  Schein et al.
6,016,144 A     1/2000  Blonstein et al.
6,049,823 A     4/2000  Hwang
6,115,028 A *   9/2000  Balakrishnan et al. ..... 345/157
6,164,808 A *  12/2000  Shibata et al. ........... 700/85

(Continued)

OTHER PUBLICATIONS

Navarrete, P., et al., "Eigenspace-based Recognition of Faces: Comparisons and a new Approach," Image Analysis and Processing, 2001, pp. 1-6.

(Continued)

Primary Examiner—Vijay Shankar
(74) Attorney, Agent, or Firm—Potomac Patent Group PLLC

(57) ABSTRACT

Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user.

17 Claims, 9 Drawing Sheets

Page 1 of 22

SAMSUNG EXHIBIT 1006
`
U.S. PATENT DOCUMENTS (Continued)

6,466,831 B1 *  10/2002  Shibata et al. ........... 700/85
6,492,981 B1    12/2002  Stork et al.
6,753,849 B1     6/2004  Curran et al.
6,757,446 B1 *   6/2004  Li et al. ................ 382/293
6,933,923 B1     8/2005  Feinstein
6,990,639 B1     1/2006  Wilson
6,998,966 B1     2/2006  Pedersen et al.
2003/0107551 A1     6/2003  Dunker
2004/0095317 A1     5/2004  Zhang et al.
2004/0239626 A1    12/2004  Noguera
2004/0268393 A1    12/2004  Hunleth et al.
2005/0174324 A1     8/2005  Liberty et al.
2005/0212767 A1     9/2005  Marvit et al.
2005/0243061 A1    11/2005  Liberty et al.
2005/0253806 A1    11/2005  Liberty et al.
2006/0028446 A1     2/2006  Liberty et al.
2006/0092133 A1 *   5/2006  Touma et al. ............. 345/158
OTHER PUBLICATIONS (Continued)

Jakubowski, J., et al., "Higher Order Statistics and Neural Network for Tremor Recognition," IEEE Transactions on Biomedical Engineering, vol. 49, No. 2, Feb. 2002, pp. 152-159.
Liu, C., et al., "Enhanced Fisher Linear Discriminant Models for Face Recognition," Proc. 14th International Conference on Pattern Recognition, Queensland, Australia, Aug. 17-20, 1998, pp. 1-5.
International Search Report for PCT/US05/15096, mailed May 16, 2006.
Written Opinion for PCT/US05/15096, mailed May 15, 2006.
International Search Report for PCT/US04/35369, mailed May 11, 2006.
Written Opinion for PCT/US04/35369, mailed May 11, 2006.
Green, J., et al., "New iMEMS Angular-Rate-Sensing Gyroscope," Analog Dialogue 37-03 (2003), pp. 1-3.

* cited by examiner
`
`
U.S. Patent    Jan. 2, 2007    Sheet 1 of 9    US 7,158,118 B2

[FIG. 1 (Prior Art): a conventional remote control unit, with labeled buttons including POWER, CABLE, TV, DSS, FUNCTION, FAVORITE, CODE SET, TV/VIDEO, TV/DSS and JUMP]
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 2 of 9    US 7,158,118 B2

[FIG. 2: an exemplary media system (drawing text not legible in this copy)]
`
`
`
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 3 of 9    US 7,158,118 B2

[FIG. 3: a 3D pointing device according to an exemplary embodiment]
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 4 of 9    US 7,158,118 B2

[FIGS. 4 and 5: cutaway view of the 3D pointing device and block diagram of data processing (drawing text not legible in this copy)]
`
`
`
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 5 of 9    US 7,158,118 B2

[FIGS. 6(a)-6(b): illustration of the effects of tilt (reference numerals 408, 410)]
`
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 6 of 9    US 7,158,118 B2

[FIGS. 6(c)-6(d): illustration of the effects of tilt (reference numerals 408, 410)]
`
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 7 of 9    US 7,158,118 B2

[FIG. 7: hardware architecture of the 3D pointing device — a processor 800 coupled to a scroll wheel 802, switch matrix, rotational sensors and photodetector (reference numerals 804-874)]
`
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 8 of 9    US 7,158,118 B2

[FIG. 8: state diagram of a stationary detection mechanism (including a "force sleep" transition)]
`
`
`
`
U.S. Patent    Jan. 2, 2007    Sheet 9 of 9    US 7,158,118 B2

[FIG. 9: block diagram of motion-data processing — Sensors (903) → Interpret Sensors → Map Movement → Produce Action]

[FIG. 10: graphical illustration of the transformation from the body frame to the user frame, with gravity shown acting on the device]
`
`
`
`
`
3D POINTING DEVICES WITH ORIENTATION COMPENSATION AND IMPROVED USABILITY

RELATED APPLICATIONS

This application is related to, and claims priority from, U.S. Provisional Patent Application Ser. No. 60/566,444 filed on Apr. 30, 2004, entitled "Freespace Pointing Device", the disclosure of which is incorporated here by reference. This application is also related to, and claims priority from, U.S. Provisional Patent Application Ser. No. 60/612,571, filed on Sep. 23, 2004, entitled "Free Space Pointing Devices and Methods", the disclosure of which is incorporated here by reference. This application is also related to, and claims priority from, U.S. Provisional Patent Application Ser. No. 60/641,410, filed on Jan. 5, 2005, entitled "Freespace Pointing Devices and Methods for Using Same", the disclosure of which is incorporated here by reference. This application is also related to U.S. patent applications Ser. Nos. 11/119,987, 11/119,688, and 11/119,663, entitled "Methods and Devices for Removing Unintentional Movement in 3D Pointing Devices", "Methods and Devices for Identifying Users Based on Tremor", and "3D Pointing Devices and Methods", all of which were filed concurrently herewith and all of which are incorporated here by reference.

BACKGROUND

The present invention relates generally to handheld pointing devices and, more specifically, to three-dimensional (hereinafter "3D") pointing devices and techniques for tilt compensation and improved usability associated therewith.

Technologies associated with the communication of information have evolved rapidly over the last several decades. Television, cellular telephony, the Internet and optical communication techniques (to name just a few things) combine to inundate consumers with available information and entertainment options. Taking television as an example, the last three decades have seen the introduction of cable television service, satellite television service, pay-per-view movies and video-on-demand. Whereas television viewers of the 1960s could typically receive perhaps four or five over-the-air TV channels on their television sets, today's TV watchers have the opportunity to select from hundreds, thousands, and potentially millions of channels of shows and information. Video-on-demand technology, currently used primarily in hotels and the like, provides the potential for in-home entertainment selection from among thousands of movie titles.

The technological ability to provide so much information and content to end users provides both opportunities and challenges to system designers and service providers. One challenge is that while end users typically prefer having more choices rather than fewer, this preference is counterweighted by their desire that the selection process be both fast and simple. Unfortunately, the development of the systems and interfaces by which end users access media items has resulted in selection processes which are neither fast nor simple. Consider again the example of television programs. When television was in its infancy, determining which program to watch was a relatively simple process primarily due to the small number of choices. One would consult a printed guide which was formatted, for example, as a series of columns and rows which showed the correspondence between (1) nearby television channels, (2) programs being transmitted on those channels and (3) date and time. The television was tuned to the desired channel by adjusting a tuner knob and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as "channel surfing" whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.

Despite the fact that the number of channels and amount of viewable content has dramatically increased, the generally available user interface, control device options and frameworks for televisions have not changed much over the last 30 years. Printed guides are still the most prevalent mechanism for conveying programming information. The multiple-button remote control with up and down arrows is still the most prevalent channel/content selection mechanism. The reaction of those who design and implement the TV user interface to the increase in available media content has been a straightforward extension of the existing selection procedures and interface objects. Thus, the number of rows in the printed guides has been increased to accommodate more channels. The number of buttons on the remote control devices has been increased to support additional functionality and content handling, e.g., as shown in FIG. 1. However, this approach has significantly increased both the time required for a viewer to review the available information and the complexity of actions required to implement a selection. Arguably, the cumbersome nature of the existing interface has hampered commercial implementation of some services, e.g., video-on-demand, since consumers are resistant to new services that will add complexity to an interface that they view as already too slow and complex.

In addition to increases in bandwidth and content, the user interface bottleneck problem is being exacerbated by the aggregation of technologies. Consumers are reacting positively to having the option of buying integrated systems rather than a number of segregable components. An example of this trend is the combination television/VCR/DVD in which three previously independent components are frequently sold today as an integrated unit. This trend is likely to continue, potentially with an end result that most if not all of the communication devices currently found in the household will be packaged together as an integrated unit, e.g., a television/VCR/DVD/internet access/radio/stereo unit. Even those who continue to buy separate components will likely desire seamless control of, and interworking between, the separate components. With this increased aggregation comes the potential for more complexity in the user interface. For example, when so-called "universal" remote units were introduced, e.g., to combine the functionality of TV remote units and VCR remote units, the number of buttons on these universal remote units was typically more than the number of buttons on either the TV remote unit or VCR remote unit individually. This added number of buttons and functionality makes it very difficult to control anything but the simplest aspects of a TV or VCR without hunting for exactly the right button on the remote. Many times, these universal remotes do not provide enough buttons to access many levels of control or features unique to certain TVs. In these cases, the original device remote unit is still needed, and the original hassle of handling multiple remotes remains due to user interface issues arising from the complexity of aggregation. Some remote units have addressed this problem by adding "soft" buttons that can be programmed with the expert commands. These soft buttons sometimes have
`
`
`
`
`
accompanying LCD displays to indicate their action. These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons. In these "moded" universal remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues including sending commands to the wrong device, forcing the user to look at the remote to make sure that it is in the right mode, and it does not provide any simplification to the integration of multiple devices. The most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
`
`
`
`
`
`
`
`
`
`
`
Some attempts have also been made to modernize the screen interface between end users and media systems. However, these attempts typically suffer from, among other drawbacks, an inability to easily scale between large collections of media items and small collections of media items. For example, interfaces which rely on lists of items may work well for small collections of media items, but are tedious to browse for large collections of media items. Interfaces which rely on hierarchical navigation (e.g., tree structures) may be speedier to traverse than list interfaces for large collections of media items, but are not readily adaptable to small collections of media items. Additionally, users tend to lose interest in selection processes wherein the user has to move through three or more layers in a tree structure.
For all of these cases, current remote units make this selection process even more tedious by forcing the user to repeatedly depress the up and down buttons to navigate the list or hierarchies. When selection skipping controls are available, such as page up and page down, the user usually has to look at the remote to find these special buttons or be trained to know that they even exist. Accordingly, organizing frameworks, techniques and systems which simplify the control and screen interface between users and media systems, as well as accelerate the selection process, while at the same time permitting service providers to take advantage of the increases in available bandwidth to end user equipment by facilitating the supply of a large number of media items and new services to the user, have been proposed in U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled "A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items", the disclosure of which is incorporated here by reference.
`
`
`
`
`
`
`
`
`
`
`
`
Of particular interest for this specification are the remote devices usable to interact with such frameworks, as well as other applications and systems. As mentioned in the above-incorporated application, various different types of remote devices can be used with such frameworks including, for example, trackballs, "mouse"-type pointing devices, light pens, etc. However, another category of remote devices which can be used with such frameworks (and other applications) is 3D pointing devices. The phrase "3D pointing" is used in this specification to refer to the ability of an input device to move in three (or more) dimensions in the air in front of, e.g., a display screen, and the corresponding ability of the user interface to translate those motions directly into user interface commands, e.g., movement of a cursor on the display screen. The transfer of data between the 3D pointing device may be performed wirelessly or via a wire connecting the 3D pointing device to another device. Thus "3D pointing" differs from, e.g., conventional computer mouse pointing techniques which use a surface, e.g., a desk surface or mousepad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display screen. An example of a 3D pointing device can be found in U.S. Pat. No. 5,440,326.
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
The '326 patent describes, among other things, a vertical gyroscope adapted for use as a pointing device for controlling the position of a cursor on the display of a computer. A motor at the core of the gyroscope is suspended by two pairs of orthogonal gimbals from a hand-held controller device and nominally oriented with its spin axis vertical by a pendulous device. Electro-optical shaft angle encoders sense the orientation of a hand-held controller device as it is manipulated by a user and the resulting electrical output is converted into a format usable by a computer to control the movement of a cursor on the screen of the computer display.
`
`
`
`
`
`
However, the freedom of use associated with 3D pointers creates additional challenges. For example, since there is generally no proxy surface on which a 3D pointing device rests, the orientation of the handheld control device may vary considerably from user to user or even use to use. If a 3D pointing device is used to, for example, control the movement of a cursor displayed on a screen, then some mapping is performed between the detected movement of the handheld device and the movement of the cursor on the screen.
`
`
`
`
`
`
`
`
`
`
One technique for performing this mapping is to use the body frame of the device as the frame of reference for mapping detected motion of the 3D pointing device into intended motion of the cursor. The term "body frame" refers to a set of axes associated with the body of the object being moved, as described in more detail below. Using the body frame of reference to perform the mapping, however, has certain drawbacks. For example, it requires the user to hold the device in a certain orientation in order to obtain the cursor movement he or she desires. For example, if the user holds the device on its side and moves the device left to right, the cursor will move vertically, not horizontally, on the screen.
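The sideways-hold artifact described above can be reproduced numerically. The sketch below uses axis conventions and names that are our illustrative assumptions, not the patent's: body-frame angular rates are mapped straight to cursor deltas, and after the device is rolled 90 degrees, a purely horizontal hand sweep registers entirely on the vertical cursor axis.

```python
import math

def sweep_rates_in_body_frame(roll_deg):
    """Angular rates produced by a horizontal (user-frame yaw) hand sweep,
    expressed in the device's body frame after the device is rolled about
    its forward axis by roll_deg. Axis conventions are illustrative."""
    roll = math.radians(roll_deg)
    # A 1 rad/s sweep about the user-frame vertical axis is redistributed
    # by the roll between body pitch (wy) and body yaw (wz).
    wy = math.sin(roll)
    wz = math.cos(roll)
    return wy, wz

def naive_cursor_delta(wy, wz, gain=100.0):
    """Pure body-frame mapping: body yaw rate -> dx, body pitch rate -> dy."""
    return gain * wz, gain * wy  # (dx, dy)

# Device upright: the sweep moves the cursor horizontally, as intended.
print(naive_cursor_delta(*sweep_rates_in_body_frame(0)))
# Device on its side: the same physical sweep moves the cursor vertically.
print(naive_cursor_delta(*sweep_rates_in_body_frame(90)))
```

The second call shows the drawback: nothing about the user's motion changed, only the orientation in which the device is held.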
`
`
`
`
`
`
`
`
Accordingly, the present invention describes methods and devices for processing the data received from sensor(s) in a manner which addresses these and other problems associated with conventional 3D pointing devices.
`
`
`
`
`
`SUMMARY
`
`
`
`
`
`
`
`
`
`
Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user.
`
`
`
`
`
`
According to an exemplary embodiment of the present invention, a handheld pointing device includes a first rotational sensor for determining rotation of the pointing device about a first axis and generating a first rotational output associated therewith, a second rotational sensor for determining rotation of the pointing device about a second axis and generating a second rotational output associated therewith, an accelerometer for determining an acceleration of the pointing device and outputting an acceleration output associated therewith, and a processing unit for receiving the first and second rotational outputs and the acceleration output and for: (a) converting the first and second rotational outputs and the acceleration output from a body frame of reference associated with the handheld pointing device into a user's frame of reference in order to remove the effects of tilt associated with the manner in which a user is holding the handheld pointing device; and (b) determining data associated with x and y coordinates which are in turn associated with movement of a screen cursor, the data based on the converted first and second rotational outputs and the converted acceleration output, wherein the step of converting renders the movement of the screen cursor substantially independent of an orientation in which a user holds the handheld device.
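A minimal sketch of conversion step (a): a roll angle is inferred from the accelerometer's gravity components and the two rotational outputs are rotated through that angle. The axis and sign conventions, the atan2-based tilt estimate, and all names here are our illustrative assumptions, not the patent's reference implementation.

```python
import math

def tilt_compensate(wy, wz, ay, az):
    """Rotate body-frame angular rates (wy, wz) into the user's frame
    using the roll angle theta inferred from the accelerometer's
    gravity components (ay, az). Illustrative sketch only."""
    theta = math.atan2(ay, az)      # tilt (roll) angle from gravity
    c, s = math.cos(theta), math.sin(theta)
    wy_user = wy * c + wz * s       # drives vertical cursor motion
    wz_user = -wy * s + wz * c      # drives horizontal cursor motion
    return wy_user, wz_user

# Held upright (gravity on body z): the rates pass through unchanged.
# Held on its side (gravity on body y): a sweep that a naive body-frame
# mapping would send to the wrong cursor axis is rotated back.
```

With such a conversion, the cursor response is substantially independent of the roll at which the device is held; note that rotation about the vertical axis (heading) is not observable from gravity alone and is left untouched here.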
`
`
`
`
`
`
`
`
According to another exemplary embodiment of the present invention, a method for using a 3D pointing device includes the steps of detecting movement of the 3D pointing device and compensating the detected movement by transforming the detected movement from a body frame of reference associated with the 3D pointing device into an inertial frame of reference.
`
`
`
`
`
`
`
`
`
According to yet another exemplary embodiment of the present invention, a 3D handheld device includes at least one sensor for detecting movement of the 3D pointing device and a processing unit for compensating the detected movement by transforming the detected movement from a body frame of reference associated with the 3D pointing device into an inertial frame of reference.
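The body-to-user transformation depicted in FIG. 10 can be sketched as a rotation that carries the gravity direction measured in the body frame onto the user frame's known "down" axis; any sensed motion vector can then be passed through the same rotation. This is a minimal sketch under assumed axis conventions (Rodrigues' rotation formula), not the patent's implementation.

```python
import math

def rotation_from_gravity(g_body):
    """Build a 3x3 rotation matrix mapping the measured body-frame
    gravity direction onto the user frame's down axis (0, 0, -1),
    via the axis-angle (Rodrigues) formula. Conventions are assumed."""
    gx, gy, gz = g_body
    n = math.sqrt(gx * gx + gy * gy + gz * gz)
    u = (gx / n, gy / n, gz / n)
    d = (0.0, 0.0, -1.0)                      # user-frame "down"
    # Rotation axis u x d and the angle between u and d.
    a = (u[1] * d[2] - u[2] * d[1],
         u[2] * d[0] - u[0] * d[2],
         u[0] * d[1] - u[1] * d[0])
    s = math.sqrt(sum(x * x for x in a))      # sin(angle)
    c = sum(ui * di for ui, di in zip(u, d))  # cos(angle)
    if s < 1e-12:
        # Gravity already along +/- down: identity or a 180-degree flip.
        return [[1, 0, 0], [0, 1, 0], [0, 0, 1]] if c > 0 else \
               [[1, 0, 0], [0, -1, 0], [0, 0, -1]]
    k = tuple(x / s for x in a)               # unit rotation axis
    K = [[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]]
    R = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            KK = sum(K[i][m] * K[m][j] for m in range(3))
            R[i][j] = (1.0 if i == j else 0.0) + s * K[i][j] + (1 - c) * KK
    return R

def to_user_frame(v_body, R):
    """Rotate any body-frame vector (rates, accelerations) into the user frame."""
    return tuple(sum(R[i][j] * v_body[j] for j in range(3)) for i in range(3))
```

As in the two-axis sketch above, only tilt (roll/pitch) is recoverable this way; heading about the gravity axis is unconstrained by the accelerometer alone.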
`
`
`
`
`
`
BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings illustrate exemplary embodiments of the present invention, wherein:

FIG. 1 depicts a conventional remote control unit for an entertainment system;

FIG. 2 depicts an exemplary media system in which exemplary embodiments of the present invention can be implemented;

FIG. 3 shows a 3D pointing device according to an exemplary embodiment of the present invention;

FIG. 4 illustrates a cutaway view of the 3D pointing device in FIG. 3 including two rotational sensors and one accelerometer;

FIG. 5 is a block diagram illustrating processing of data associated with 3D pointing devices according to an exemplary embodiment of the present invention;

FIGS. 6(a)-6(d) illustrate the effects of tilt;

FIG. 7 depicts a hardware architecture of a 3D pointing device according to an exemplary embodiment of the present invention;

FIG. 8 is a state diagram depicting a stationary detection mechanism according to an exemplary embodiment of the present invention;

FIG. 9 is a block diagram illustrating transformation of sensed motion data from a first frame of reference into a second frame of reference according to an exemplary embodiment of the present invention; and

FIG. 10 graphically illustrates the transformation of sensed motion data from a first frame of reference into a second frame of reference according to an exemplary embodiment of the present invention.
`
`
`
DETAILED DESCRIPTION

The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims.

In order to provide some context for this discussion, an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 2. Those skilled in the art will appreciate, however, that the present invention is not restricted to implementation in this type of media system and that more or fewer components can be included therein. Therein, an input/output (I/O) bus 210 connects the system components in the media system 200 together. The I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components. For example, the I/O bus 210 may include an appropriate number of independent audio "patch" cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or infrared or radio frequency transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.

In this exemplary embodiment, the media system 200 includes a television/monitor 212, a video cassette recorder (VCR) 214, digital video disk (DVD) recorder/playback device 216, audio/video tuner 218 and compact disk player 220 coupled to the I/O bus 210. The VCR 214, DVD 216 and compact disk player 220 may be single disk or single cassette devices, or alternatively may be multiple disk or multiple cassette devices. They may be independent units or integrated together. In addition, the media system 200 includes a microphone/speaker system 222, video camera 224 and a wireless I/O control device 226. According to exemplary embodiments of the present invention, the wireless I/O control device 226 is a 3D pointing device according to one of the exemplary embodiments described below. The wireless I/O control device 226 can communicate with the entertainment system 200 using, e.g., an IR or RF transmitter or transceiver. Alternatively, the I/O control device can be connected to the entertainment system 200 via a wire.

The entertainment system 200 also includes a system controller 228. According to one