`(19) United States
(12) Patent Application Publication (10) Pub. No.: US 2004/0095317 A1
`
`
`(54) METHOD AND APPARATUS OF UNIVERSAL
`REMOTE POINTING CONTROL FOR HOME
`ENTERTAINMENT SYSTEM AND
`COMPUTER
`
`(76) Inventors: Jingxi Zhang, Foster City, CA (US);
`Yang Zhang, Foster City, CA (US);
`Huifang Ni, Foster City, CA (US)
`
`Correspondence Address:
`JINGXI ZHANG
`1403 MELBOURNE STREET
`FOSTER CITY, CA 94404 (US)
(21) Appl. No.: 10/065,798

(22) Filed: Nov. 20, 2002
`
`Publication Classification
`
(51) Int. Cl.7 ....................................................... G09G 5/08
(52) U.S. Cl. ......................................................... 345/158
`
(57) ABSTRACT
`
`A universal television and computer pointing control system
`is disclosed. The system is comprised of a handheld pointing
`device, a display control unit, and a command delivery unit.
`The system allows the user to simply point and click to
`control a computer or various home entertainment compo-
`nent devices remotely. Inside the handheld device, orienta-
`tion sensors detect pointing direction. The pointing direction
`signals are transmitted to the display control unit, and a
`cursor (pointer) is drawn onto the screen indicating the
`pointer's location. By interpreting the pointing direction
`signals and the button activities, the display control unit
`issues a control signal to the command delivery unit. The
`command delivery unit then forwards the commands to the
`target device to execute the desired control function.
`
[Front-page representative drawing: duplicates FIG. 1.]
`SAMSUNG EXHIBIT 1005
`
`Page 1 of 15
`
`
`
Patent Application Publication    May 20, 2004    Sheet 1 of 8    US 2004/0095317 A1
`
FIG. 1
[Drawing: perspective view of the universal pointing system controlling a home entertainment system; shown are the pointing device with line of sight 10, transmission link 50, buttons 101-103, display control unit 200, command delivery unit 300 with IR emitter 351, television 400 with pointer 410 and screen 420, menus 430, video component 500-502, and cables 510 and 520.]
`
FIG. 2
[Drawing: perspective view of the pointing system controlling a computer presentation; shown are line of sight 10, buttons 101-103, and display control unit 200.]
`
`
`
FIG. 3
[Drawing: components of the handheld pointing device, with buttons 101, 102, 103 and the device's x, y, z axes.]
`
`
`
FIG. 4a and FIG. 4b
[Drawings: the orientation sensors detecting the device's azimuth change relative to the earth's magnetic field 25 and its inclination change relative to gravity 26, with the corresponding horizontal (22) and vertical (32) pointer movements on the screen.]
`
`
`
FIG. 5
[Drawing: functional block diagram of the handheld pointing device 100, including buttons 101, 102, 103 and orientation sensor 120.]
`
`
`
FIG. 6a and FIG. 6b
[Drawings: functional block diagrams of the display control unit 200. For a computer: receiver 221, demodulator 231, memory 270, CPU 210, and PC interface 240. For a television: transmitter 222, modulator 232, video decoder 251, graphics/video MUX 250, and video encoder 252.]
`
`
FIG. 7a and FIG. 7b
[Drawings: functional block diagram of the command delivery unit 300 (receiver 320, demodulator 330, MCU 310, IR transmitter 350, IR emitter 351, IR receiver 360/361, non-volatile memory 370), and the unit being trained by an original remote control.]
`
`
FIG. 8a and FIG. 8b
[Drawings: alternative functional block diagram of the display control unit 200 including the remote control training circuit (IR receiver 260, 261), and the unit being trained by an original remote control 800.]
`
`
`
`METHOD AND APPARATUS OF UNIVERSAL
`REMOTE POINTING CONTROL FOR HOME
`ENTERTAINMENT SYSTEM AND COMPUTER
`
`BACKGROUND OF INVENTION
`
`[0001] With advancing technology, more and more fea-
`tures are added to home video and audio entertainment
`systems. For example, interactive television sets allow users
`to purchase a pay program by pressing buttons on the remote
`control. However, the rich set of functions requires more
`buttons on the remote control unit. The jam-packed button
`layout on the remote control unit makes the handheld device
bulky and complicated. Moreover, an increasing number of audio and video component devices, such as VCRs, DVD players, and digital TV set-top boxes, are being added to home entertainment systems. Each device is usually controlled by
`a unique remote control unit. To reduce the user's confusion
`of multiple remote control units, universal remote control
`devices were introduced to consumers. The universal remote
`control device can be either preprogrammed or trained with
`other remote controls by the user to provide multi-device
`control functionality. However, because more functions are
`being added to this type of handheld device, and because of
`the limited number of buttons available (which are already
`crowding the device), each button must serve multiple
`functions. Unfortunately, the multi-function buttons cannot
`provide clear visual feedback indicating their current func-
tion. This unfriendly user interface confuses the user operating the remote control unit and leads to only a small subset of the functions being utilized. Furthermore, the
`expandability of present universal remote control devices is
very poor. As new media modules, such as Internet browsers, are introduced into home entertainment systems, it becomes even more difficult to adapt the existing universal remote control to the new requirements; an Internet browser, for instance, requires that users be able to move a pointer and select a visual object on the screen to operate a certain function. A handheld pointing control device is desirable in
`such a case. While using the pointing device, the on-screen
`graphical user interface (GUI) provides friendly visual feed-
`back. The dynamically displayed selectable on-screen iden-
`tifiers (menus, icons, buttons, etc.) greatly reduce the num-
`ber of buttons on the pointing control device.
`
`[0002] In the case of computer slide presentations, a
`convenient handheld remote pointing and control device is
also considered necessary. Conventional computer control depends on a keyboard and mouse, which are physically bound to the computer hardware and to a fixed surface such as a table. To control the flow of the presentation slides or to
`point out figures on the slide to the audience, the presenter
`is forced to stay with the computer keyboard and mouse.
`This constraint is very inconvenient for the presenter trying
`to deliver his/her talk to the audience. A remote pointing
`control device could help the presenter to freely walk about
`the stage and move a pointer on the screen to guide the
`audience.
`[0003] Because of the need for a remote pointing mecha-
`nism for home entertainment systems and computer presen-
tations, many methods and devices have been invented. For example, Fan (U.S. Pat. No. 5,926,168) has described
`several methods, including using light emission and elec-
`tromagnetic fields, to develop remote pointing devices;
`Kahn (U.S. Pat. No. 6,404,416) described a pointing inter-
`
`face for computer systems based on raster scanned light
`emitted from display screens. The methods presented in
`those inventions are complicated, and some require a new
`display apparatus to replace the existing one. Marsh et al.
`(U.S. Pat. No. 5,999,167) introduced a pointer control
`device based on an ultrasound source. Pitcher et al. (U.S.
`Pat. No. 5,359,348), Hansen (U.S. Pat. No. 5,045,843),
Odell (U.S. Pat. No. 5,574,479) and King et al. (U.S. Pat.
`No. 4,565,999) presented pointing devices based on detect-
`ing fixed light sources. Auerbach (U.S. Pat. No. 4,796,019)
explained a pointing device containing multiple light sources whose light is detected by a fixed light sensor.
`Wang et al. (U.S. Pat. No. 5,126,513) suggested a pointing
`measurement method by detecting the wave phases from a
`fixed transmitter. However, in practice, all the approaches
`based on detecting fixed local sources suffer from the
`limitations of the fixed source locations and orientations, as
`well as the distance between the pointing device and fixed
`sources. Moreover, the control methods proposed in all the
`aforementioned inventions are limited to only a single target
`device. The control scope is narrow and cannot cover all the
`related video/audio devices or equipment.
`[0004] Recently, low cost magnetic field sensors based on
`magneto-resistive, magneto-inductive and Hall-effect tech-
`nologies were developed. Those magnetic sensors are sen-
`sitive enough to measure earth's magnetic field and are
`widely used in such navigational devices as digital com-
`passes and the Global Positioning System (GPS). Some
`magnetic sensors are packaged to detect two-axis, even
`three-axis, magnetic field changes and provide a linear
`output to the direction of the magnetic field flux, such as
`HMC1052 two-axis magnetic sensor from Honeywell
(www.ssec.honeywell.com). Such a two-axis magnetic field sensor makes it easy and cost-effective to implement a pointing device that detects the yaw (azimuth) angle relative to earth's North Pole. However, using magnetic field sensors to detect
`a pitch (inclination) angle change would be a problem,
`particularly when the pointing device's heading direction is
`perpendicular to earth's North-South axis. Hall et al. (U.S.
`Pat. No. 5,703,623) presented a pointing device using three
`pairs of orthogonally mounted one-axis Hall-effect sensors.
`To overcome the problem in measuring pitch and roll angles,
a set of piezoelectric sensors is used to detect the acceleration changes. The authors suggested using the detected acceleration data to compensate for the deficiency of the magnetic sensors. However, measuring the device's angular movement requires integrating the acceleration steps, and piezoelectric sensors detect only dynamic changes in acceleration. Acceleration measurement errors are introduced because piezoelectric sensors fail to measure constant acceleration. The accumulated error in the integration process would eventually render the device unusable.
`[0005] To detect a pointing device's pitch and roll angles,
`a static accelerometer can be used. Recently, low-cost,
`lightweight accelerometer sensors using Micro-Electro-Me-
`chanical Systems (MEMS) technology are available from
`many sources. MEMS devices integrate mechanical ele-
`ments, sensors, actuators, and electronics on a common
`silicon substrate using micro-fabrication technology, which
provides a cost-effective and small-footprint component for consumer device manufacturers. Two-axis linear MEMS acceler-
`ometers, such as ADXL-202E from Analog Devices
`(www.analog.com), LIS2L01 from STMicroelectronics
`
`
(www.st.com), and MXD2010U/W from MEMSIC (www.memsic.com), can measure both dynamic and static acceleration and are good candidates for use in pointing devices
`to determine the pitch and roll angles. The earth's gravity
`exerts a constant acceleration on the MEMS accelerometer.
`By calculating the accelerometer's static acceleration out-
`puts, a tilt angle (pitch or roll) can be obtained.
`
`[0006] Besides magnetic field sensors and accelerometer
`sensors, gyro sensors can also be used in pointing device
`design. Gyro sensors, such as the ADXRS150 MEMS gyro-
`scope from Analog Devices (www.analog.com), can detect
`changes in the device's orientation angle and thus can be
`used in detecting the pointing device's heading.
`
`[0007] The object of the present invention is to provide a
`low-cost, practical, universal pointing device to control
`home entertainment systems and computer systems using
`spatial orientation sensor technologies.
`
`SUMMARY OF INVENTION
`
`[0008] A universal pointing control system for televisions
`and computer displays is disclosed. The system is comprised
`of a remote handheld device, a display control unit and a
`command delivery unit. The remote handheld device
`includes a set of orientation sensors that detect the device's
`current orientation. In the preferred embodiment, a two-axis
`magnetic sensor identifies the device's azimuth angle by
`detecting the earth's magnetic field, and a dual-axis accel-
`erometer sensor identifies the device's inclination angle by
`detecting the earth's gravity. The signals from the orienta-
`tion sensors are translated and encoded into pointing direc-
`tion information by a microprocessor or logic circuits on the
`pointing device and transmitted to the display control unit.
`Along with the directional information, data regarding the
`user's selection activities collected by a selection unit in the
`handheld device is also encoded and sent to the display unit.
`The display control unit includes a data transceiver, a CPU,
and a display control circuit for interfacing with the target device.
`The pointing direction information received by the trans-
ceiver is decoded and manipulated by the on-board CPU.
`Based on the pointing information, the CPU instructs the
`controlled target device interface, either a television set or a
`computer, to display a pointer at the corresponding coordi-
`nates on the target device screen. User selection activities
`are also interpreted by the CPU based on the current pointer
`location, and corresponding commands are sent to the com-
`mand delivery unit. The command delivery unit, which can
`be a stand-alone device or built into the handheld pointing
`device, forwards the commands to any remote controllable
`target device using an infrared beam to execute a desired
`operation.
`
`[0009] The handheld remote control device is simple and
easy to use. The user directly points at any position on the screen, and a cursor is displayed at the pointed
`location. By selecting a menu or active control shape on the
`screen using a selection button on the device, the user can
`control the target device's operation intuitively. Because
`fewer buttons are required to operate the device (e.g. a
`selection button, a calibration button, and a button to show
`and hide the on-screen pointer), the device can be made
`smaller and lighter. The selectable items can vary and
`change their appearance dynamically based on the status of
`the operations. With visual feedback, the system provides a
`
`much better and friendlier graphical interface to users.
`Because the pointing signals are generated from the hand-
`held remote control device without reference to any source
`from other devices or equipment, there is no significant
`change necessary on the television or computer system. In
`the described embodiment, the remote pointing device can
`be directly used in existing televisions and computers with-
`out any modification. The control scope of this system is
`broad enough to cover all the audio/video devices which are
`originally controlled by their respective remote controls.
`The extendibility of the system allows new types of devices
`to be easily adapted and controlled.
`
`BRIEF DESCRIPTION OF DRAWINGS
`
[0010] FIG. 1 is a perspective view of the universal pointing system controlling a variety of equipment in the home entertainment system.
`[0011] FIG. 2 is a perspective view of the pointing system
`in controlling computer presentations.
`[0012] FIG. 3 shows the components in the handheld
`pointing device.
[0013] FIGS. 4a and 4b demonstrate the principal mechanism of the orientation sensors detecting the device's orientation changes, and how the screen pointer reflects these changes.
`[0014] FIG. 5 is the functional block diagram of the
`pointing device.
`[0015] FIG. 6a is the functional block diagram of the
`display control unit for a computer.
`[0016] FIG. 6b is the functional block diagram of the
`display control unit for a home entertainment system.
`[0017] FIG. 7a is the functional block diagram of the
`command delivery unit.
`[0018] FIG. 7b shows the command delivery unit being
`trained by an original remote control.
`[0019] FIG. 8a is the alternative functional block diagram
`of the display control unit which includes the remote control
`training circuit.
`[0020] FIG. 8b shows the display control unit being
`trained by an original remote control.
`
`DETAILED DESCRIPTION
`
`[0021] The present invention's universal pointing control
`system consists of a handheld pointing device 100, a display
control unit 200 and a command delivery unit 300 as shown
`in FIG. 1. In this example, the display control unit 200 is
`connected to a television 400 and a video component device
`500, which can be a digital TV set-top box, a VCR, or a
`DVD player, through video cables 520 and 510, respec-
`tively. The display control unit 200 can also be embedded
`inside the TV or other video component device in alternative
`embodiments. The handheld pointing device 100 is aimed at
`the television screen 420 indicated by a line of sight 10. On
`the other end of this line, a pointer 410 is displayed on the
`screen. When the user points the device to an arbitrary
`position of the screen, a set of orientation sensors inside the
pointing device 100, which will be described later, detects the device's current orientation and generates the pointing
`
`
`direction signal. The pointing direction signal is encoded
`and sent to the display control unit 200 through a transmis-
`sion link 50. This transmission link can be any form of signal
`linkage. For example, it could be implemented by using
`radio frequency (RF) wireless link, infrared (IR) link, or
`even a wired cable. Upon receiving the signal, a central
`process unit (CPU) inside the display unit 200 decodes and
`analyzes the pointing direction and determines the new
`coordinates of the pointer on the screen. A pointer is drawn
`at the calculated coordinate and the pointer image is then
superimposed onto the input video signal, which is input from the video component device 500 through cable 510. A set of menus and control items 430 is also drawn and superimposed onto the video signal. The composite video is then output to the television 400 through the output video cable 520 and displayed on the television screen 420. As a result, the pointer 410 is shown at the new location on the screen where the user points. The user perceives that the pointer moves following the aiming line of sight 10.
`
`[0022] Buttons are located on the handheld pointing
`device to collect the user's selection activities. Three buttons
`are shown in this example, one for command selection
`(101), one to show and hide screen pointer (102), and
another one for calibration purposes (103). When the user uses the device for the first time, a calibration procedure is performed. The user aims the device at the center of the screen and presses button 103. The device's pointing direction information is recorded and stored in the display
`control unit as the screen center reference. Any subsequent
`pointing information is then compared with this reference,
`and the difference will be calculated as the pointer displace-
`ment distance away from the screen center.
`
`[0023] During normal usage, as the user points and clicks
`the selection button, the on-screen menu or selectable items
`under the pointer are processed by the CPU in the display
`control unit. Selection information is generated and for-
`warded to the command delivery unit 300 by means of
`transmission link 60. The link 60, again, can be any form of
signal linkage. The command delivery unit 300 can be a stand-alone device facing the TV 400 and the other equipment (500, 510, 520), or can be embedded inside the pointing
`device 100. All remote control command codes for the
`devices in the home entertainment system are prerecorded in
`a memory module in the command delivery unit 300. Upon
`receiving selection information, the command delivery unit
`issues the corresponding command by searching the
`memory module, and emits the command infrared (IR)
signal through the IR emitter 351 to the controlled equipment. The target equipment performs a task as if it had
`received a command directly from its original remote con-
`trol device.
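The key-to-command lookup performed by the command delivery unit can be sketched as follows. This is an illustrative model only: the key values and command-code strings are hypothetical placeholders, not codes defined in this disclosure.

```python
# Sketch of the command delivery unit's lookup step. The dictionary is a
# stand-in for the non-volatile memory module; keys and command codes
# are hypothetical examples.
COMMAND_TABLE = {
    100: "VCR_PLAY",   # hypothetical key for an on-screen VCR play button
    101: "VCR_STOP",
    200: "TV_POWER",
}

def fetch_command(key):
    """Return the prerecorded command code for a selection key,
    or None when no record exists for that key."""
    return COMMAND_TABLE.get(key)
```

For example, when the display control unit sends the key value 100, `fetch_command(100)` yields the stored code that the IR transmitter would then emit.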
`
`[0024] FIG. 2 shows the pointing control system as used
`in a computer presentation scenario. In this case, the pre-
`sentation is projected onto the screen 720 by a projector 700,
`which receives the video input from a computer 600 though
`a video cable 620. The display control unit 200 is connected
`to the peripheral port of a computer 600 through the cable
`610. The presenter aims the pointing device at the screen 720
`by a line of sight 10. The aiming direction information
`generated by a set of orientation sensors in the pointing
`device 100 is transmitted to display control unit 200 through
`transmission link 50. The CPU in the display control unit
`interprets the direction information, sends the pointer move
`
`command to the computer's peripheral port, and instructs
`the computer to move the pointer 710 on screen to the aimed
`place. This is analogous to moving the pointer by moving a
`regular computer mouse device, except that the moving
`information is in absolute coordinates instead of relative
`steps. The buttons 101, 102, and 103 on the pointing device
`allow the presenter to select and execute a command
`remotely.
`
[0025] FIG. 3 shows the components inside the handheld pointing device. On the top face of the device are buttons 101, 102, and 103 for collecting user selection activities. A set of orientation sensors 120 and 130 mounted on the printed circuit board 160 detects the device's orientation changes. Note that the sensors are mounted orthogonally to each other. The sensor 120 detects the device's yaw (azimuth) angle and sensor 130 detects the device's pitch (inclination) angle. Additional sensors (not shown in the figure) could be used to detect the device's roll angle, which may provide an additional dimension of control. A microcontroller 110 provides computation power for calculating and
`encoding the orientation signal output from the orientation
`sensors. It also provides logic control for the transmitter 140
`and other electronic components. The device is powered by
`batteries 170.
`
`[0026] The orientation sensors' mechanisms are shown in
`FIGS. 4a and 4b. The orientation sensor demonstrated in
`FIG. 4a is a magnetic field sensor, whereas the one in FIG.
`4b is an accelerometer sensor. However, the orientation
`detection may not be limited to these types of sensors. Other
`sensors, for example, a gyro sensor, can also be used in the
`pointing control system. In FIG. 4a, a two-axis magnetic
`field sensor 120 is used to detect the device's orientation
`relative to the direction of the earth's magnetic field 25. The
`sensor contains two magnetic field detectors which are
`arranged orthogonal to each other. The sensor is mounted on
the device's circuit board so that the two magnetic field detectors lie in the x-z plane as shown in the figure. The azimuth angle φ between the device's heading direction and the earth's North Pole direction can be calculated from the sensor's x and z outputs: φ = arctan(x/z). When the user performs calibration, the device records the azimuth angle φ0 as the reference angle while the user points the device at the center of the screen. When the device is rotated about the y-axis and the pointing direction moves away from the screen's center, the azimuth angle difference from the reference angle is φ - φ0. This difference is interpreted by the display control unit as the degree of the pointer's horizontal departure from the screen center. The amount by which the pointer moves horizontally (22) can be adjusted in the display control unit proportionally to the change in the pointer's azimuth angle (21).
`
`[0027] The orientation sensor 130 uses a similar method to
`detect the device's inclination angle. The sensor could be an
`accelerometer or another orientation sensor that can sense
`the device's heading change in the y-z plane. An acceler-
`ometer sensor which can detect static acceleration is
`described in detail here. The accelerometer sensor 130
`contains two orthogonally arranged acceleration detectors.
`The sensor is mounted perpendicular to the circuit board's
`plane so that one detector in the sensor detects y-axis
`acceleration and the other detects z-axis acceleration.
`Earth's gravity 26 exerts a static acceleration on these
`detectors. When the device is placed on a horizontal level,
`
`
`the accelerometer's z-axis detector outputs zero accelera-
`tion, while the y-axis outputs the maximum acceleration (1
`g). If the device is rotated about the x-axis, the z and y
`channel outputs of the sensor are changed according to the
inclination angle. The inclination angle θ thus can be calculated: θ = arctan(z/y). During calibration, the device's inclination angle to the screen center, θ0, is recorded and stored as a reference angle. Any inclination angle sampled thereafter is compared with this reference angle by determining the offset θ - θ0. This difference is interpreted by
`the display control unit as a degree of the pointer's departure
`from the screen's center in the vertical direction. The
`amount by which the pointer moves vertically (32) can be
`adjusted in the display control unit proportionally to the
change in the pointer's inclination angle (31).
`[0028] For a simplified version, a one-axis accelerometer
`sensor can be used. In such a case, the acceleration detector
is mounted along the device's z-axis. The inclination angle θ thus can be calculated: θ = arcsin(z).
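Both inclination formulas can be expressed compactly, as in the sketch below. It assumes the accelerometer outputs are scaled in units of g; `atan2` again replaces the bare arctangent, and the one-axis input is clamped so sensor noise cannot push it outside the arcsine's domain.

```python
import math

def inclination_two_axis(y, z):
    """theta = arctan(z / y) from the y- and z-axis acceleration
    outputs; atan2 avoids division by zero when y == 0."""
    return math.atan2(z, y)

def inclination_one_axis(z):
    """Simplified one-axis form, theta = arcsin(z); z is clamped to
    [-1, 1] to guard against noise pushing it out of range."""
    return math.asin(max(-1.0, min(1.0, z)))
```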
`[0029] FIG. 5 is the functional block diagram of the
`handheld pointing device. The signal conditioning circuit for
sensor 120 consists of two amplifiers 121, 123 and two low-pass filters 122, 124. Because we are interested in the static position and low-frequency movement of the device, the high-frequency noise in the amplified x-axis and z-axis signals is filtered in order to get a higher resolution of the azimuth angle changes. Two amplifiers 131, 133 and two low-pass filters 132, 134 condition the y-axis and z-axis signals of the sensor 130 output. We are interested in the sensor's static output relative to earth's constant gravity. Therefore, the high-frequency noise in these signals is also filtered in order to get a higher resolution of the inclination angle changes. The conditioned signals from sensors 120 and 130 are then sent to an analog-to-digital converter (ADC) 111 through an analog multiplexer 112. The digitized sensor data are then sent to a microcontroller (MCU) 110
`for further signal processing. Some variations of the orien-
`tation sensor convert the analog signal internally to a digital
`or time period-based signal. In those cases, the signals can
`be directly sampled by the microcontroller without an ADC
`chip. The MCU 110 computes the azimuth and inclination
`angles. Buttons 101, 102, and 103 produce activity signals
that are sampled by the MCU 110. The sensor orientation data and button activities are coded in such a way that the
`display control unit can decode them later. The encoded data
`is passed to a modulator 113 to modulate a carrier frequency
`for transmission. The transmitter 140 emits the modulated
`signal 50 to the display control unit 200. The circuit is
`powered by batteries 170. A battery manager unit 171
`conditions the voltage for all components of the circuit. The
MCU 110 constantly monitors the changes in the sensor outputs. If the sensor outputs do not change for a period of time, the MCU interprets the device as not being used. The MCU then instructs the battery manager unit 171 to shut down the battery power supply to the transmitter and other components in order to reduce power consumption during the idle stage.
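The low-pass filters in FIG. 5 are analog components; a digital equivalent of the filtering step can be sketched as a first-order IIR smoother. The smoothing factor below is an illustrative assumption, not a value taken from this disclosure.

```python
def low_pass(samples, alpha=0.1):
    """First-order low-pass filter: y[n] = y[n-1] + alpha*(x[n] - y[n-1]).
    Suppresses high-frequency noise so that the static position and
    low-frequency movement of the device dominate the output."""
    filtered = []
    y = samples[0]          # start from the first raw sample
    for x in samples:
        y += alpha * (x - y)
        filtered.append(y)
    return filtered
```

A smaller `alpha` smooths more aggressively at the cost of a slower response to genuine orientation changes.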
`
`[0030] FIG. 6a and FIG. 6b show the functional block
`diagrams of display control unit 200 for a computer and a
`television set, respectively. A central process unit (CPU)
`210, a receiver 221, a demodulator 231, and a memory
`module 270 are common for both cases. The transmitted
`signal 50 from the pointing device, which includes handheld
`
`device orientation and user selection activities, is intercepted
`by the receiver module 221. After being demodulated by a
`demodulator 231, the pointing device data is sent to CPU
`210 for further processing. The CPU compares the device's
`azimuth and inclination angle data with the reference angles,
`which are sampled and stored in the memory module 270
`during the calibration procedure. The difference angles
`calculated are translated into screen coordinates and the
`target device is instructed to move the pointer to the new
`location. The interface components of the display control
`unit are different for each control target. In FIG. 6a, a
`computer peripheral interface module is used to connect to
`a computer port. The pointer coordinates are sent to the
`computer and, by the computer's processor and video card,
`the pointer on the screen is moved to the corresponding
`location. The button activities are also sent to the computer
through this interface and trigger certain actions for the
`computer.
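The translation from difference angles to screen coordinates can be sketched as below. The screen size and the pixels-per-radian gain are illustrative assumptions; the disclosure leaves the proportional scaling adjustable in the display control unit.

```python
def angles_to_screen(phi, theta, phi_ref, theta_ref,
                     width=640, height=480, gain=800.0):
    """Map angular offsets from the calibrated reference angles to
    pixel coordinates, with (width/2, height/2) as screen center."""
    x = width / 2 + gain * (phi - phi_ref)
    y = height / 2 - gain * (theta - theta_ref)   # screen y grows downward
    # Clamp so the pointer never leaves the visible screen.
    x = min(max(x, 0), width - 1)
    y = min(max(y, 0), height - 1)
    return int(x), int(y)
```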
`[0031] FIG. 6b demonstrates the display control interfaces
`to a television. The input video signal, which may come
`from other home entertainment devices such as digital TV
`set-top boxes, DVD players, etc., is decoded by a video
`decoder 251 frame by frame. A new pointer image is drawn
`at the coordinate calculated by CPU 210. The pointer image,
`along with menus and other control item images pre-stored
in the memory module 270, is sent to the graphics/video multiplexer 250 to be superimposed onto a video frame. The
`composite video frame is then encoded by a video encoder
252 and sent to the television for display. The process runs at about 30 frames per second. As a result, a pointer moves on top of the video following the handheld device's pointing direction. If the CPU 210 senses a button click while the pointer is on top of a menu or a controllable item,
`it sends a command to a transmitter 222 through a modulator
`232. The modulated transmission signal 60 is forwarded to
`the command delivery unit 300 for controlling the television
`and other home entertainment equipment.
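The graphics/video multiplexing step can be modeled in software as overlaying a pointer bitmap onto each decoded frame. This is a sketch only, with frames represented as nested lists of pixel values rather than real video buffers.

```python
def superimpose_pointer(frame, pointer, px, py, transparent=0):
    """Overlay a pointer bitmap onto a decoded frame at (px, py),
    skipping transparent pixels and anything falling off-screen --
    a software stand-in for the graphics/video MUX 250."""
    height, width = len(frame), len(frame[0])
    for dy, row in enumerate(pointer):
        for dx, pixel in enumerate(row):
            y, x = py + dy, px + dx
            if pixel != transparent and 0 <= y < height and 0 <= x < width:
                frame[y][x] = pixel
    return frame
```

Repeating this overlay on each decoded frame, roughly 30 times per second, yields the moving on-screen pointer described above.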
`
`[0032] FIG. 7a is the functional block diagram of com-
`mand delivery unit 300. A receiver 320 intercepts the
`transmitted signal from the display control unit 200. The
`signal is sent to a microcontroller (MCU) 310 after demodu-
`lation by a demodulator 330. In the command delivery unit,
there is a non-volatile memory module 370 which stores the control command codes for a variety of home entertainment equipment. These command codes can be preset by the vendor or stored by the user during programming or training procedures. The command codes are stored in such a way that each command is coupled with an identification number (key). The arriving signal from the display control unit serves as the key, so that the MCU 310 can look up the key's record in memory and fetch the corresponding command
`code. For example, if the pointer is moved on top of a VCR
play button on the screen and the user clicks the selection
`button, the display control unit sends a value equal to 100 to
`the command delivery unit. By looking up the key value 100
`in the memory module, the MCU 310 fetches the pre-stored
`VCR play command code. The command code is sent to an
`infrared transmitter 350 to drive an infrared emitter 351. The
`infrared-carried command code is then sent to the home