US 20050110751 A1

(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2005/0110751 A1
     Wilson et al.                           (43) Pub. Date: May 26, 2005

(54) SYSTEM AND PROCESS FOR SELECTING OBJECTS IN A UBIQUITOUS COMPUTING ENVIRONMENT

(75) Inventors: Andrew Wilson, Seattle, WA (US); Steven A. N. Shafer, Seattle, WA (US); Daniel Wilson, Pittsburgh, PA (US)

Correspondence Address:
LYON & HARR, LLP
300 ESPLANADE DRIVE, SUITE 800
OXNARD, CA 93036 (US)

(73) Assignee: Microsoft Corporation, Redmond, WA

(21) Appl. No.: 11/020,064

(22) Filed: Dec. 20, 2004

Related U.S. Application Data

(63) Continuation of application No. 10/160,692, filed on May 31, 2002.

(60) Provisional application No. 60/355,368, filed on Feb. 7, 2002.

Publication Classification

(51) Int. Cl.7: G09G 5/00
(52) U.S. Cl.: 345/156

(57) ABSTRACT

A system and process for selecting objects in a ubiquitous computing environment where various electronic devices are controlled by a computer via a network connection and the objects are selected by a user pointing to them with a wireless RF pointer. By a combination of electronic sensors onboard the pointer and external calibrated cameras, a host computer equipped with an RF transceiver decodes the orientation sensor values transmitted to it by the pointer and computes the orientation and 3D position of the pointer. This information, along with a model defining the locations of each object in the environment that is associated with a controllable electronic component, is used to determine what object a user is pointing at so as to select that object for further control actions.

[Representative drawing (duplicated as FIG. 8 on Sheet 7): flowchart. 800: input the sensor readings provided in the orientation message; 802: normalize the sensor readings; 804: derive the orientation of the pointer from the normalized sensor readings; 806: compute the position of the pointer; 808: determine what the pointer is being pointed at within the environment; 810: wait for another orientation message to be received.]

Exhibit 1110 - DENTAL IMAGING - IPR2023-00111

[Sheet 1 of 16, FIGS. 1 and 2: drawings (reference numerals only; no recoverable text).]

[Sheet 2 of 16, FIG. 3: block diagram with blocks TX, PIC MPC, ACCEL and GYRO (reference numerals 302-318). FIG. 6: block diagram with blocks RX, PIC MPC and COMMS INTERFACE, connected TO PC (reference numerals 600-606).]

[Sheet 3 of 16, FIG. 4: flowchart (402-408): perform other commands included in the message and wait for the next message; identify the last-read outputs from the accelerometer, magnetometer and gyroscope (if used); package the last-read sensor outputs, along with the identifier assigned to the pointer (if employed), and optionally the current state of the button and error detection data, to form an orientation data message; transmit the orientation data message.]

[Sheet 4 of 16, FIG. 5A: flowchart (500-510): read the sensors; package and transmit an orientation message; reset the count-down timer; decrement the count-down timer.]

[Sheet 5 of 16, FIG. 5B: flowchart (514-520): power down the pointer; wake the pointer after the prescribed period of time has passed; test whether the current output of the accelerometer is significantly different from the last-read accelerometer output prior to the shutdown; if so, power up the pointer and reset the count-down timer.]

[Sheet 6 of 16, FIG. 7: block diagram of a computing system environment 100: processing unit 120; system memory 130 with ROM 131 (BIOS 133) and RAM 132 (operating system 134, application programs 135, other program modules 136, program data); camera interface 194 and cameras 192; video interface 190 and monitor 191; output peripheral interface 195, printer and speakers 197; non-removable non-volatile memory interface 140; removable non-volatile memory interface 150; user input interface 160, keyboard and mouse 161; network interface 170, local area network 171, modem 172, wide area network 173, remote computer 180 and remote application programs 185; disk copies of the operating system, application programs, other program modules and program data (144-147).]

[Sheet 7 of 16, FIG. 8: flowchart (800-810): input the sensor readings provided in the orientation message; normalize the sensor readings; derive the orientation of the pointer from the normalized sensor readings; compute the position of the pointer; determine what the pointer is being pointed at within the environment; wait for another orientation message to be received.]

[Sheet 8 of 16, FIG. 9: magnetometer correction-factor calibration flowchart. User (900-904): initiate the magnetometer correction factor calibration mode; point the pointer in the prescribed direction within the environment, with the device being held in a known orientation; activate the button on the pointer. Host computer (906-910): request the pointer provide an orientation message; check whether the switch status in the orientation message indicates that the pointer switch has been activated; if so, designate the magnetometer reading for each axis contained in the orientation message as the magnetometer correction factor for that axis.]

[Sheet 9 of 16, FIG. 10: magnetometer max/min calibration flowchart. User (1000-1002): initiate the magnetometer max/min calibration mode; wave the pointer about for a prescribed period of time. Host computer (1004-1012): request the pointer provide orientation messages during the time the pointer is being waved; input and record the magnetometer readings contained in each orientation message; select the highest reading recorded for each magnetometer axis and designate that level as the maximum for that axis; select the lowest reading recorded for each magnetometer axis and designate that level as the minimum for that axis; compute and store normalization factors for each magnetometer axis using the maximum and minimum levels.]

[Sheet 10 of 16, FIG. 11A: flowchart (1102-1110): normalize the accelerometer and magnetometer values received in the orientation data message; compute the pitch and roll angles of the pointer from the normalized x-axis and y-axis accelerometer values, respectively; refine the normalized magnetometer output values using the pitch and roll angles computed from the normalized accelerometer output values; apply the magnetometer correction factors to the refined magnetometer values; compute the yaw angle of the pointing device about the z-axis using the corrected and refined magnetometer output values; tentatively designate the computed yaw angle, along with the pitch and roll angles derived from the accelerometer readings, as defining the orientation of the pointer at the time the orientation data message was transmitted.]

[Sheet 11 of 16, FIG. 11B: flowchart (1112-1134): compute the orientation of the pointer assuming the pointer is in a right-side up position with respect to roll; estimate what the magnetometer values should be given that orientation; compute the orientation assuming the pointer is in an up-side down position with respect to roll; estimate what the magnetometer values should be for that case; determine how close each set of estimated magnetometer values is to the actual values contained in the orientation data message; deem the pointer to have been right-side up or up-side down according to whichever estimate is closer; if needed, modify the roll angle to match the deemed case; designate the tentative rotation matrix as the finalized matrix.]

[Sheet 12 of 16, FIG. 12: timing diagram of video image frames (1/30 s frame period) and IR LED flashes (1/15 s flash period). FIGS. 13A-13D: image frames (no recoverable text).]

[Sheet 13 of 16, FIG. 14: flowchart (1400-1406): subtract a pair of frames produced contemporaneously from each of the cameras; identify the pixel in each difference image that exhibits the highest intensity; designate the coordinates of the identified pixel in each difference image as representing the location of the pointer in the image; determine the 3D location of the pointer in the environment using the image coordinates of the pointer from the difference images.]

[Sheet 14 of 16, FIG. 15: target training (outline tracing) flowchart. User (1500-1504): initiate the target training procedure; hold the button on the pointer down while tracing the outline of the object being modeled; enter information that identifies the object. Host computer (1506-1518): request the pointer provide an orientation message; input the orientation message; check whether the switch status indicates that the pointer switch is activated; while it is, compute and record the location of the pointer and request further orientation messages; when the switch is released, define a Gaussian blob for the series of locations recorded during the tracing operation.]

[Sheet 15 of 16, FIG. 16: target training (multi-position pointing) flowchart. User (1600-1606): initiate the target training procedure; enter information that identifies the object; repeatedly point at the object being modeled with the pointer and depress the device's button, each time from a different position in the environment; inform the host computer that the pointing procedure is complete. Host computer (1608-1622): request the pointer provide an orientation message; input the orientation message; check whether the switch status indicates that the pointer switch has been activated; if so, compute and record the orientation and location of the pointer; when pointing is complete, establish a ray that projects from the pointer's location along its orientation direction for each recorded pointing location; compute the coordinates of the mean of a Gaussian blob representing the object being modeled; establish the covariance of the Gaussian blob representing the object being modeled.]

[Sheet 16 of 16, FIG. 17: flowchart (1700-1716): establish a ray that projects from the pointer's location along its orientation direction; define a line between the mean point of each Gaussian blob and the pointer's location; for each Gaussian blob, define a plane normal to the line between the blob mean and the pointer's location, or normal to the ray; project each Gaussian blob onto its associated plane to define a 2D Gaussian; project the ray onto each plane; for each projected Gaussian, compute the probability that the pointer is pointing at the associated object based on how far the origin of the projected Gaussian is from the closest point of the projected ray; identify the Gaussian associated with the highest probability; if that probability is sufficient, designate the object associated with the Gaussian exhibiting the highest probability as the object the user is pointing at.]

SYSTEM AND PROCESS FOR SELECTING OBJECTS IN A UBIQUITOUS COMPUTING ENVIRONMENT

BACKGROUND

[0001] 1. Technical Field

[0002] The invention is related to selecting objects in a ubiquitous computing environment where various electronic devices are controlled by a computer via a network connection, and more particularly to a system and process for selecting objects within the environment by a user pointing to the object with a wireless pointing device.

[0003] 2. Background Art

[0004] Increasingly our environment is populated with a multitude of intelligent devices, each specialized in function. The modern living room, for example, typically features a television, amplifier, DVD player, lights, and so on. In the near future, we can look forward to these devices becoming more inter-connected, more numerous and more specialized as part of an increasingly complex and powerful integrated intelligent environment. This presents a challenge in designing good user interfaces.

[0005] For example, today's living room coffee table is typically cluttered with multiple user interfaces in the form of infrared (IR) remote controls. Often each of these interfaces controls a single device. Tomorrow's intelligent environment presents the opportunity to present a single intelligent user interface (UI) to control many such devices when they are networked. This UI device should provide the user a natural interaction with intelligent environments. For example, people have become quite accustomed to pointing at a piece of electronic equipment that they want to control, owing to the extensive use of IR remote controls. It has become almost second nature for a person in a modern environment to point at the object he or she wants to control, even when it is not necessary. Take, as an example, the small radio frequency (RF) key fobs that have been used to lock and unlock most automobiles in the past few years. Inevitably, a driver will point the free end of the key fob toward the car while pressing the lock or unlock button. This is done even though the driver could just as well have pointed the fob away from the car, or even pressed the button while the fob was still in his or her pocket, owing to the RF nature of the device. Thus, a single UI device, which is pointed at electronic components or some extension thereof (e.g., a wall switch to control lighting in a room) to control these components, would represent an example of the aforementioned natural interaction that is desirable for such a device.

[0006] There are some so-called "universal" remote controls on the market that are preprogrammed with the known control protocols of a litany of electronic components, or which are designed to learn the command protocol of an electronic component. Typically, such devices are limited to one transmission scheme, such as IR or RF, and so can control only electronic components operating on that scheme. However, it would be desirable if the electronic components themselves were passive in that they do not have to receive and process commands from the UI device directly, but would instead rely solely on control inputs from the aforementioned network. In this way, the UI device does not have to differentiate among various electronic components, say by recognizing the component in some manner and transmitting commands using some encoding scheme applicable only to that component, as is the case with existing universal remote controls.

[0007] Of course, a common control protocol could be implemented such that all the controllable electronic components within an environment use the same control protocol and transmission scheme. However, this would require all the electronic components to be customized to the protocol and transmission scheme, or to be modified to recognize the protocol and scheme. This could add considerably to the cost of a "single UI-controlled" environment. It would be much more desirable if the UI device could be used to control any networked group of new or existing electronic components regardless of the remote control protocols or transmission schemes the components were intended to operate under.

[0008] It is noted that in the preceding paragraphs, as well as in the remainder of this specification, the description refers to various individual publications identified by a numeric designator contained within a pair of brackets. For example, such a reference may be identified by reciting, "reference [1]" or simply "[1]". Multiple references will be identified by a pair of brackets containing more than one designator, for example, [2, 3]. A listing of references including the publications corresponding to each designator can be found at the end of the Detailed Description section.

SUMMARY

[0009] The present invention is directed toward a system and process that provides a remote control UI device that is capable of controlling a group of networked electronic components regardless of any control protocols or transmission schemes under which they operate. In addition, the UI device of the present system and process is able to control the electronic components without having to directly differentiate among the components or employ a myriad of different control protocols and transmission schemes. And in order to provide a natural interaction experience, the present system is operated by having the user point at the electronic component (or an extension thereof) that he or she wishes to control.

[0010] The system and process according to the present invention provides a remote control UI device that can be simply pointed at objects in a ubiquitous computing environment that are associated in some way with controllable, networked electronic components, so as to select that object for controlling via the network. This can, for example, involve pointing the UI device at a wall switch and pressing a button on the device to turn a light operated by the switch on or off. The idea is to have a UI device so simple that it requires no particular instruction or special knowledge on the part of the user.

[0011] In general, the system includes the aforementioned remote control UI device in the form of a wireless RF pointer, which includes a radio frequency (RF) transceiver and various orientation sensors. The outputs of the sensors are periodically packaged as orientation messages and transmitted using the RF transceiver to a base station, which also has an RF transceiver to receive the orientation messages transmitted by the pointer. There is also a pair of digital video cameras, each of which is located so as to capture images of the environment in which the pointer is operating from different viewpoints. A computer, such as a PC, is connected to the base station and the video cameras. Orientation messages received by the base station from the pointer are forwarded to the computer, as are images captured by the video cameras. The computer is employed to compute the orientation and location of the pointer using the orientation messages and captured images. The orientation and location of the pointer is in turn used to determine if the pointer is being pointed at an object in the environment that is controllable by the computer via a network connection. If it is, the object is selected.

[0012] The pointer specifically includes a case having a shape with a defined pointing end, a microcontroller, the aforementioned RF transceiver and orientation sensors, which are connected to the microcontroller, and a power supply (e.g., batteries) for powering these electronic components. In the tested versions of the pointer, the orientation sensors included at least an accelerometer that provides separate x-axis and y-axis orientation signals, and a magnetometer that provides separate x-axis, y-axis and z-axis orientation signals. These electronics were housed in a case that resembled a wand.

[0013] The pointer's microcontroller packages and transmits orientation messages at a prescribed rate. While the microcontroller could be programmed to accomplish this task by itself, a command-response protocol was employed in tested versions of the system. This entailed the computer periodically instructing the pointer's microcontroller to package and transmit an orientation message by causing the base station to transmit a request for the message to the pointer at the prescribed rate. This prescribed rate could, for example, be approximately 50 times per second, as it was in tested versions of the system.

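To make the command-response exchange concrete, the following is a minimal host-side sketch of the polling described above. It assumes a hypothetical base_station object with send_request() and receive_message() methods standing in for the base station's RF link; the patent does not define such an API, and the 50 requests-per-second figure is simply the example rate given in the text.

    import time

    REQUEST_RATE_HZ = 50                    # approximate request rate used in the tested system
    REQUEST_PERIOD = 1.0 / REQUEST_RATE_HZ

    def poll_pointer(base_station, handle_message):
        """Repeatedly ask the pointer for an orientation message and pass each
        reply to handle_message(); base_station is a hypothetical RF link object."""
        while True:
            start = time.monotonic()
            base_station.send_request()               # ask the pointer to package and transmit a message
            message = base_station.receive_message()  # blocking read of the reply (None on timeout)
            if message is not None:
                handle_message(message)
            # sleep off the remainder of the 20 ms period to hold roughly 50 requests per second
            elapsed = time.monotonic() - start
            time.sleep(max(0.0, REQUEST_PERIOD - elapsed))
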
[0014] As indicated previously, the orientation messages generated by the pointer include the outputs of the sensors. To this end, the pointer's microcontroller periodically reads and stores the outputs of the orientation sensors. Whenever a request for an orientation message is received (or it is time to generate such a message, if the pointer is programmed to do so without a request), the microcontroller includes the last-read outputs from the accelerometer and magnetometer in the orientation message.

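As a rough illustration of what an orientation message might carry, the sketch below packs the last-read two-axis accelerometer and three-axis magnetometer samples, together with a pointer identifier and the button state, into a fixed byte layout. The field order, field widths and the use of Python's struct module are illustrative assumptions; the patent does not specify a wire format.

    import struct

    # assumed layout: pointer id, button state, then ax, ay, mx, my, mz as 16-bit raw sensor counts
    ORIENTATION_MSG = struct.Struct("<BB5h")

    def pack_orientation_message(pointer_id, button_down, accel_xy, mag_xyz):
        """Package the last-read sensor outputs into an orientation data message."""
        ax, ay = accel_xy
        mx, my, mz = mag_xyz
        return ORIENTATION_MSG.pack(pointer_id, 1 if button_down else 0, ax, ay, mx, my, mz)

    def unpack_orientation_message(data):
        """Recover the sensor outputs, pointer identifier and switch state on the host side."""
        pointer_id, button, ax, ay, mx, my, mz = ORIENTATION_MSG.unpack(data)
        return {"id": pointer_id, "button": bool(button),
                "accel": (ax, ay), "mag": (mx, my, mz)}
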
[0015] The pointer also includes other electronic components such as a user-activated switch or button, and a series of light emitting diodes (LEDs). The user-activated switch, which is also connected to the microcontroller, is employed for the purpose of instructing the computer to implement a particular function, such as will be described later. To this end, the state of the switch in regard to whether it is activated or deactivated at the time an orientation message is packaged is included in that message for transmission to the computer. The series of LEDs includes a pair of differently-colored, visible spectrum LEDs, which are connected to the microcontroller, and which are visible from the outside of the pointer's case when lit. These LEDs are used to provide status or feedback information to the user, and are controlled via instructions transmitted to the pointer by the computer.

[0016] The foregoing system is used to select an object by having the user simply point to the object with the pointer. This entails the computer first inputting the orientation messages transmitted by the pointer. For each message received, the computer derives the orientation of the pointer in relation to a predefined coordinate system of the environment in which the pointer is operating, using the orientation sensor readings contained in the message. In addition, the video output from the video cameras is used to ascertain the location of the pointer at a time substantially contemporaneous with the generation of the orientation message, and in terms of the predefined coordinate system. Once the orientation and location of the pointer are computed, they are used to determine whether the pointer is being pointed at an object in the environment that is controllable by the computer. If so, then that object is selected for future control actions.

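The per-message flow just described can be summarized in a short orchestration sketch. The three callables are placeholders for the computations detailed in the paragraphs that follow (orientation from the sensor readings, 3D position from the cameras, and the pointing test against the object model); none of them is an API defined by the patent.

    def select_object(message, frame_pairs, model, controller,
                      derive_orientation, locate_pointer, find_pointed_object):
        """One pass of the selection loop for a single received orientation message."""
        orientation = derive_orientation(message)        # pitch/roll/yaw from the sensor readings
        position = locate_pointer(frame_pairs)           # 3D position from the stereo camera frames
        target = find_pointed_object(orientation, position, model)
        if target is not None:
            controller.select(target)                    # select the object for further control actions
        return target
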
[0017] The computer derives the orientation of the pointer from the orientation sensor readings contained in the orientation message as follows. First, the accelerometer and magnetometer output values contained in the orientation message are normalized. Angles defining the pitch of the pointer about the x-axis and the roll of the device about the y-axis are computed from the normalized outputs of the accelerometer. The normalized magnetometer output values are then refined using these pitch and roll angles. Next, previously established correction factors for each axis of the magnetometer, which relate the magnetometer outputs to the predefined coordinate system of the environment, are applied to the associated refined and normalized outputs of the magnetometer. The yaw angle of the pointer about the z-axis is computed using the refined magnetometer output values. The computed pitch, roll and yaw angles are then tentatively designated as defining the orientation of the pointer at the time the orientation message was generated. It is next determined whether the pointer was in a right-side up or up-side down position at the time the orientation message was generated. If the pointer was in the right-side up position, the previously computed pitch, roll and yaw angles are designated as defining the finalized orientation of the pointer. However, if it is determined that the pointer was in the up-side down position at the time the orientation message was generated, the tentatively designated roll angle is corrected accordingly, and then the pitch, yaw and modified roll angles are designated as defining the finalized orientation of the pointer. In the foregoing description, it is assumed that the accelerometer and magnetometer of the pointer are oriented such that their respective first axes correspond to the x-axis, which is directed laterally to the pointing axis of the pointer; their respective second axes correspond to the y-axis, which is directed along the pointing axis of the pointer; and the third axis of the magnetometer corresponds to the z-axis, which is directed vertically upward when the pointer is positioned right-side up with the x and y axes lying in a horizontal plane.

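A minimal numeric sketch of the tilt-compensated computation described above is given below. It assumes sensor values already normalized to roughly the range -1 to 1 and magnetometer correction factors expressed as per-axis offsets; the exact normalization, sign conventions and the right-side up versus up-side down disambiguation step are simplified here and are not taken from the patent.

    import numpy as np

    def _rot_x(a):   # rotation about the x-axis (pitch)
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    def _rot_y(a):   # rotation about the y-axis (roll)
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

    def pointer_orientation(accel_xy, mag_xyz, mag_offsets):
        """Return (pitch, roll, yaw) in radians for the frame used in the text:
        x lateral, y along the pointing axis, z vertical when right-side up."""
        ax, ay = accel_xy
        # pitch (about x) and roll (about y) from the measured gravity direction
        pitch = np.arcsin(np.clip(ay, -1.0, 1.0))
        roll = np.arcsin(np.clip(ax, -1.0, 1.0))
        # apply the previously calibrated per-axis correction factors (assumed to be offsets)
        m = np.asarray(mag_xyz, dtype=float) - np.asarray(mag_offsets, dtype=float)
        # refine (tilt-compensate) the magnetometer vector back into the horizontal plane
        m_level = _rot_x(pitch) @ _rot_y(roll) @ m
        # yaw about the vertical z-axis from the horizontal field components
        yaw = np.arctan2(m_level[0], m_level[1])
        return pitch, roll, yaw
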
[0018] The computer derives the location of the pointer from the video output of the video cameras as follows. There is an infrared (IR) LED connected to the microcontroller that is able to emit IR light outside the pointer's case when lit. The microcontroller causes this IR LED to flash. In addition, the aforementioned pair of digital video cameras each have an IR pass filter that results in the video image frames capturing only IR light emitted or reflected in the environment toward the camera, including the flashing from the pointer's IR LED, which appears as a bright spot in the video image frames. The microcontroller causes the IR LED to flash at a prescribed rate that is approximately one-half the frame rate of the video cameras. This results in only one of each pair of image frames produced by a camera having the IR LED flash depicted in it. This allows each pair of frames produced by a camera to be subtracted to produce a difference image, which depicts for the most part only the IR emissions and reflections directed toward the camera that appear in one or the other of the pair of frames but not both (such as the flash from the IR LED of the pointing device). In this way, the background IR in the environment is attenuated and the IR flash becomes the predominant feature in the difference image. The image coordinates of the pixel in the difference image that exhibits the highest intensity are then identified using a standard peak detection procedure. A conventional stereo image technique is then employed to compute the 3D coordinates of the flash for each set of approximately contemporaneous pairs of image frames generated by the pair of cameras, using the image coordinates of the flash from the associated difference images and predetermined intrinsic and extrinsic camera parameters. These coordinates represent the location of the pointer (as represented by the location of the IR LED) at the time the video image frames used to compute them were generated by the cameras.

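The sketch below illustrates the frame-differencing, peak-detection and triangulation steps described above. It assumes each camera supplies consecutive single-channel (grayscale) frames and that the 3x4 projection matrices P1 and P2 come from a prior intrinsic/extrinsic calibration; cv2.triangulatePoints is one conventional stereo technique and is not necessarily the one used in the tested system.

    import cv2
    import numpy as np

    def pointer_image_location(frame_a, frame_b):
        """Subtract two consecutive IR-filtered frames and return the (x, y) pixel
        of the brightest spot in the difference image (the IR LED flash)."""
        diff = cv2.absdiff(frame_a, frame_b)
        _, _, _, max_loc = cv2.minMaxLoc(diff)        # location of the highest-intensity pixel
        return max_loc

    def pointer_3d_location(pair_cam1, pair_cam2, P1, P2):
        """Triangulate the pointer's 3D position from contemporaneous frame pairs
        of the two cameras, given their 3x4 projection matrices P1 and P2."""
        x1 = np.array(pointer_image_location(*pair_cam1), dtype=float).reshape(2, 1)
        x2 = np.array(pointer_image_location(*pair_cam2), dtype=float).reshape(2, 1)
        point_h = cv2.triangulatePoints(P1, P2, x1, x2)   # homogeneous 4x1 result
        return (point_h[:3] / point_h[3]).ravel()         # 3D coordinates of the IR LED
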
[0019] The orientation and location of the pointing device at any given time is used to determine whether the pointing device is being pointed at an object in the environment that is controllable by the computer. In order to do so, the computer must know what objects are controllable and where they exist in the environment. This requires a model of the environment. In the present system and process, the location and extent of objects within the environment that are controllable by the computer are modeled using 3D Gaussian blobs defined by a location of the mean of the blob in terms of its environmental coordinates and a covariance. Two different methods have been developed to model objects in the environment.

[0020] The first method involves the user inputting information identifying the object that is to be modeled. The user then activates the switch on the pointing device and traces the outline of the object. Meanwhile, the computer is running a target training procedure that causes requests for orientation messages to be sent to the pointing device at a prescribed request rate. The orientation messages are input as they are received, and for each orientation message, it is determined whether the switch state indicator included in the orientation message indicates that the switch is activated. Whenever it is initially determined that the switch is not activated, the switch state determination action is repeated for each subsequent orientation message received until an orientation message is received which indicates that the switch is activated. At that point, each time it is determined that the switch is activated, the location of the pointing device is ascertained as described previously using the digital video input from the pair of video cameras. When the user is done tracing the outline of the object being modeled, he or she deactivates the switch. The target training process detects this as the switch having been deactivated after having been activated in the immediately preceding orientation message. Whenever such a condition occurs, the tracing procedure is deemed to be complete and a 3D Gaussian blob representing the object is established using the previously ascertained pointing device locations stored during the tracing procedure.

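A minimal sketch of turning the recorded trace into a 3D Gaussian blob is shown below; it simply takes the sample mean and covariance of the stored pointer locations, with a small ridge term so that a thin trace still yields an invertible covariance. The ridge term and the numpy representation are assumptions made for the sketch, not details given in the text.

    import numpy as np

    def gaussian_blob_from_trace(locations, ridge=1e-6):
        """Fit a 3D Gaussian blob (mean, covariance) to the pointer locations
        recorded while the user traced the outline of the object.

        locations: iterable of (x, y, z) positions in environment coordinates.
        """
        pts = np.asarray(list(locations), dtype=float)    # N x 3 array of traced points
        mean = pts.mean(axis=0)                           # blob location (mean)
        cov = np.cov(pts, rowvar=False)                   # blob extent (covariance)
        cov = np.atleast_2d(cov) + ridge * np.eye(3)      # keep the covariance well-conditioned
        return mean, cov
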
[0021] The second method of modeling objects once again begins with the user inputting information identifying the object that is to be modeled. However, in this case the user repeatedly points the pointer at the object and momentarily activates the switch on the device, each time pointing the device from a different location within the environment. Meanwhile, the computer is running a target training procedure that causes requests for orientation messages to be sent to the pointing device at a prescribed request rate. Each orientation message received from the pointing device is input until the user indicates the target training inputs are complete. For each orientation message input, it is determined whether the switch state indicator contained therein indicates that the switch is activated. Whenever it is determined

This document is available on Docket Alarm but you must sign up to view it.


Or .

Accessing this document will incur an additional charge of $.

After purchase, you can access this document again without charge.

Accept $ Charge
throbber

Still Working On It

This document is taking longer than usual to download. This can happen if we need to contact the court directly to obtain the document and their servers are running slowly.

Give it another minute or two to complete, and then try the refresh button.

throbber

A few More Minutes ... Still Working

It can take up to 5 minutes for us to download a document if the court servers are running slowly.

Thank you for your continued patience.

This document could not be displayed.

We could not find this document within its docket. Please go back to the docket page and check the link. If that does not work, go back to the docket and refresh it to pull the newest information.

Your account does not support viewing this document.

You need a Paid Account to view this document. Click here to change your account type.

Your account does not support viewing this document.

Set your membership status to view this document.

With a Docket Alarm membership, you'll get a whole lot more, including:

  • Up-to-date information for this case.
  • Email alerts whenever there is an update.
  • Full text search for other cases.
  • Get email alerts whenever a new case matches your search.

Become a Member

One Moment Please

The filing “” is large (MB) and is being downloaded.

Please refresh this page in a few minutes to see if the filing has been downloaded. The filing will also be emailed to you when the download completes.

Your document is on its way!

If you do not receive the document in five minutes, contact support at support@docketalarm.com.

Sealed Document

We are unable to display this document, it may be under a court ordered seal.

If you have proper credentials to access the file, you may proceed directly to the court's system using your government issued username and password.


Access Government Site

We are redirecting you
to a mobile optimized page.





Document Unreadable or Corrupt

Refresh this Document
Go to the Docket

We are unable to display this document.

Refresh this Document
Go to the Docket