(12) Patent Application Publication    (10) Pub. No.: US 2003/0085870 A1
Hinckley                               (43) Pub. Date: May 8, 2003
(54) METHOD AND APPARATUS USING MULTIPLE SENSORS IN A DEVICE WITH A DISPLAY

(76) Inventor: Kenneth P. Hinckley, Redmond, WA (US)

Correspondence Address:
Theodore M. Magee
WESTMAN CHAMPLIN & KELLY
International Centre - Suite 1600
900 South Second Avenue
Minneapolis, MN 55402-3319 (US)

(21) Appl. No.: 10/294,286

(22) Filed: Nov. 14, 2002
Related U.S. Application Data

(63) Continuation-in-part of application No. 09/875,477, filed on Jun. 6, 2001. Continuation-in-part of application No. 10/162,487, filed on Jun. 3, 2002, which is a continuation-in-part of application No. 09/875,477, filed on Jun. 6, 2001.

(60) Provisional application No. 60/218,748, filed on Jul. 17, 2000.
`
Publication Classification

(51) Int. Cl. ................. G09G 5/00
(52) U.S. Cl. ................. 345/156
(57) ABSTRACT

In a device having a display, a change in focus for an application is used with a requested usage of a context attribute to change the amount of information regarding the context attribute that is sent to another application. A method of changing the orientation of images on a device's display detects movement followed by an end of movement of the device. The orientation of the device is then determined and is used to set the orientation of images on the display. A method of setting the orientation of a display also includes storing information regarding an item displayed in a first orientation before changing the orientation. When the orientation is returned to the first orientation, the stored information is retrieved and is used to display the item in the first orientation. The stored information can include whether the item is to appear in the particular orientation.
`
[Front-page figure: mobile device with reference numerals 256, 306, and 308]
`SAMSUNG EXHIBIT 1012
`
`
`
[Sheet 1 of 10: FIG. 1, block diagram of the components of mobile device 200, including processor, memory, object store, and communication interface]
`
`
`
[Sheet 2 of 10: FIG. 2, bottom view of mobile device 200; FIG. 10, block diagram with context information server 1000]
`
`
`
[Sheet 3 of 10: drawing figures (labels not recoverable)]
`
`
`
`
`
`
[Sheet 4 of 10: drawing figure with hatched regions (labels not recoverable)]
`
`
`
[Sheet 5 of 10: FIG. 9, graph of distance in cm (vertical axis 904, 0-45 cm) versus proximity sensor value (horizontal axis 902), with ranges marked by reference numerals 906 and 908]
`
`
`
[Sheet 6 of 10: FIG. 12, mobile device 1100 in a landscape orientation]
`
`
`
[Sheet 7 of 10: FIG. 13, chart 1300 mapping forward/back tilt angle against left/right tilt angle (each spanning -90 to 90 degrees, with boundaries near 45 degrees) to the display orientations Portrait, Portrait (upside-down), Landscape (left), and Landscape (right); reference numerals 1308-1315]
`
`
`
[Sheet 8 of 10: FIGS. 14 and 15, mobile device 1400 in two different orientations, each displaying an inbox list reading "John Smith - Monday's Meeting / Bill Toms - Status Report Due"]
`
`
`
[Sheet 9 of 10: FIG. 16, front view of a mobile device with an orientation feedback icon displayed]
`
`
`
[Sheet 10 of 10: FIG. 17, mobile device in a landscape orientation displaying "This is a word processing program, which provides the ability to enter and edit text"; FIG. 18, mobile device in a portrait orientation displaying the inbox list "John Smith - Monday's Meeting / Bill Toms - Status Report Due"]
`
`
`
`
METHOD AND APPARATUS USING MULTIPLE SENSORS IN A DEVICE WITH A DISPLAY

REFERENCE TO RELATED APPLICATION

[0001] This application is a Continuation-In-Part application of U.S. patent application Ser. No. 09/875,477, filed Jun. 6, 2001, which claims priority from a U.S. Provisional application having Serial No. 60/218,748, filed on Jul. 17, 2000 and entitled "METHOD AND APPARATUS USING MULTIPLE SENSORS IN A MOBILE DEVICE," and is a Continuation-In-Part of U.S. patent application Ser. No. 10/162,487, filed Jun. 3, 2002, which was a Continuation-In-Part of U.S. patent application Ser. No. 09/875,477, filed Jun. 6, 2001.
BACKGROUND OF THE INVENTION

[0002] The present invention relates to devices with displays. In particular, the present invention relates to computing and mobile devices.

[0003] Mobile devices, such as personal information managers (PIMs), tablet PCs, cellular telephones, pagers, watches, and wearable computers, typically include one or more buttons or touch screens through which the mobile device receives explicit instructions from the user. For example, the user can press buttons to explicitly instruct the device to enter a full-power mode, activate an application, or change the orientation of a display.

[0004] Although the devices are responsive to information provided through such explicit instructions, they are generally not responsive to information that is present in the manner in which the device is being handled by the user. For example, the devices do not automatically enter a full-power mode, even when the user is holding the device in a manner that is consistent with wanting to use the device.

[0005] Because prior art devices are generally not responsive to the manner in which the user is holding the devices, the user is forced to enter explicit instructions into the device to achieve various functions. In light of this, mobile devices are needed that can sense how they are being handled in order to perform certain background functions that expand the functionality of the mobile device without requiring the user to perform any additional actions.
SUMMARY OF THE INVENTION

[0006] In a device having a display, a change in focus for an application is used with a requested usage of a context attribute to change the amount of information regarding the context attribute that is sent to another application. A method of changing the orientation of images on a device's display detects movement followed by an end of movement of the device. The orientation of the device is then determined and is used to set the orientation of images on the display. A method of setting the orientation of a display also includes storing information regarding an item displayed in a first orientation before changing the orientation. When the orientation is returned to the first orientation, the stored information is retrieved and is used to display the item in the first orientation. The stored information can include whether the item is to appear in the particular orientation.
BRIEF DESCRIPTION OF THE DRAWINGS

[0007] FIG. 1 is a block diagram of the components of a mobile device under one embodiment of the present invention.

[0008] FIG. 2 is a bottom view of a mobile device of one embodiment of the present invention.

[0009] FIG. 3 is a front view of the mobile device of FIG. 2.

[0010] FIG. 4 is a back view of the mobile device of FIG. 2.

[0011] FIG. 5 is a left side view of the mobile device of FIG. 2.

[0012] FIG. 6 is a right side view of the mobile device of FIG. 2.

[0013] FIG. 7 is a front view of a second embodiment of a mobile device.

[0014] FIG. 8 is a back view of the mobile device of FIG. 7.

[0015] FIG. 9 is a graph of distance between a user and a mobile device as a function of proximity sensor levels.

[0016] FIG. 10 is a block diagram of components used to practice several embodiments of the present invention.

[0017] FIG. 11 is a front view of a mobile device in a portrait orientation.

[0018] FIG. 12 is a front view of a mobile device in a landscape orientation.

[0019] FIG. 13 is a chart showing the display orientations for various combinations of forward/back tilt and left/right tilt.

[0020] FIGS. 14 and 15 show a mobile device in two different orientations.

[0021] FIG. 16 shows a front view of a mobile device with an orientation feedback icon displayed.

[0022] FIG. 17 shows a mobile device in a landscape orientation with a word-processing application.

[0023] FIG. 18 shows a mobile device in a portrait orientation with an e-mail application.
`
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0024] FIG. 1 is a block diagram of a mobile device 200, which is an exemplary environment for embodiments of the present invention. Mobile device 200 includes a microprocessor 202, memory 204, input/output (I/O) components 206, and a communication interface 208 for communicating with remote computers or other mobile devices. In one embodiment, the afore-mentioned components are coupled for communication with one another over a suitable bus 210.
[0025] Memory 204 is a computer readable medium, which can be any available media that can be accessed by processor 202 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and nonremovable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by processor 202. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
[0026] The particular computer readable medium used in a mobile device is a function of the size of the device and the power capacity of the device. For example, a tablet PC will typically include one or more disk drives whereas a PIM will typically only include a random access memory (RAM) with a battery back-up module (not shown) such that information stored in memory 204 is not lost when the general power to mobile device 200 is shut down.

[0027] A portion of memory 204 is preferably allocated as addressable memory for program execution, while another portion of memory 204 is preferably used for storage.
[0028] Memory 204 includes an operating system 212, application programs 214, and an object store 216. During operation, operating system 212 is preferably executed by processor 202 from memory 204. Operating system 212, in one preferred embodiment, is a WINDOWS® CE brand operating system commercially available from Microsoft Corporation. In tablet PC embodiments, the WINDOWS® XP brand operating system available from Microsoft Corporation is utilized. Operating system 212 is preferably designed for mobile devices, and implements database features that can be utilized by applications 214 through a set of exposed application programming interfaces and methods. The objects in object store 216 are maintained by applications 214 and operating system 212 at least partially in response to calls to the exposed application programming interfaces and methods.
[0029] Communication interface 208 represents numerous devices and technologies that allow mobile device 200 to send and receive information. The devices include wired and wireless modems, satellite receivers and broadcast tuners to name a few. Mobile device 200 can also be directly connected to a computer to exchange data therewith. In such cases, communication interface 208 can be an infrared transceiver or a serial, parallel, USB, or FireWire communication connection, all of which are capable of transmitting streaming information.
[0030] Input/output components 206 include a variety of input devices that have previously been found on mobile devices, such as a touch-sensitive screen or transparent tablet overlay sensitive to properties of a special stylus including position, proximity to the screen, pressure, azimuth, elevation, which end of the stylus is being used (e.g. writing tip on one end, eraser on the other end) and possibly a unique ID encoded in the stylus, buttons, rollers, and a microphone, as well as a variety of output devices including an audio generator, a vibrating device, and a display. The devices listed above are by way of example and need not all be present on mobile device 200.
[0031] Mobile device 200 also includes additional input devices under the present invention. Under one embodiment, these input devices are connected to the mobile device through a separate serial port 250 and a peripheral interface controller (PIC) microprocessor 252. In other embodiments, these additional devices are connected to processor 202 through communication interface 208 and PIC microprocessor 252, or through PIC microprocessor 252 directly. Under one embodiment, a Microchip 16C73A peripheral interface controller is used as the PIC microprocessor. In still further embodiments, PIC microprocessor 252 is not present and the input devices are connected to processor 202 through various ports such as serial port 250 or through communication interface 208, or through memory-mapped I/O or direct connection to the system processor(s).
[0032] Under the embodiment of FIG. 1, the additional input devices include a set of touch sensors such as touch sensors 254 and 256. Touch sensors 254 and 256 are provided to a separate peripheral interface controller microprocessor 276, which converts the touch signals into digital values and provides the digital values to PIC microprocessor 252. In other embodiments, touch sensors 254 and 256 are connected directly to analog or digital inputs in PIC microprocessor 252 instead of being connected to PIC 276, or are connected to processor 202.
[0033] The input devices also include a dual axis linear accelerometer tilt sensor 258 capable of detecting forward/back tilt, left/right tilt, and linear accelerations such as those resulting from vibrations or movement.

[0034] The input devices also include a light sensor 260, a proximity sensor 262 consisting of an infrared transmitter 264 and an infrared receiver 266, a digital compass (e.g. a single or multiple axis magnetometer) 284, and a gravity switch 282. The sensing signals from the infrared receiver 266, linear accelerometer 258, light sensor 260, digital compass 284, and gravity switch 282 may be provided through respective amplifiers 270, 272, 274, 285 and 287 to analog inputs of PIC microprocessor 252. These analog inputs are connected to analog-to-digital converters within PIC microprocessor 252. In other embodiments, the sensors provide a digital output and thus are connected to digital inputs on the microprocessor. In further embodiments, the input devices also include a temperature sensor.
[0035] PIC microprocessor 252 also includes a connection to the power bus of mobile device 200, which is shown as connection 278 in FIG. 1. PIC microprocessor 252 also includes a connection to a power switch 280, which enables PIC microprocessor 252 to turn mobile device 200 on and off. Note that PIC microprocessor 252 always receives power and, under one embodiment, is able to control which of the sensors receives power at any one time. This allows PIC microprocessor 252 to manage power consumption by only sending power to those sensors that it anticipates will need to be active.
[0036] Under one embodiment, PIC microprocessor 252 continuously samples the sensors and transmits packets representing the state of these sensors at a rate of approximately 400 samples per second through serial port 250. In some embodiments, samples are reported at lower speeds to conserve power and processing resources. Some sensors may be reported at different sampling rates than others (e.g. tilt may be updated more frequently than touch).
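The per-sensor sampling rates described above can be sketched as a decimation of the roughly 400 Hz base tick. The sensor names and divisor values below are illustrative assumptions, not values given in the text:

```python
# Illustrative sketch of per-sensor sample-rate decimation. Deriving slower
# rates by reading each sensor on every Nth base tick is an assumption about
# one way the mixed rates of paragraph [0036] could be produced.

BASE_RATE_HZ = 400  # approximate packet rate given in the text

# A sensor is read on every Nth base tick (divisors are assumed values).
SENSOR_DIVISORS = {
    "tilt": 1,        # every tick: tilt is updated most frequently
    "touch": 4,       # every 4th tick (about 100 Hz)
    "proximity": 40,  # every 40th tick (about 10 Hz)
}

def sensors_due(tick: int) -> list:
    """Return the sensors whose values should be sampled on this base tick."""
    return [name for name, div in SENSOR_DIVISORS.items() if tick % div == 0]
```

On tick 0 every sensor is due; on most ticks only the tilt sensor is read, which keeps the serial packet stream small while the tilt signal stays fresh.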
[0037] Under one embodiment, the touch sensors are capacitive touch sensors that are divided into two regions. In other embodiments, these sensors are implemented as a single detector pad. Under one embodiment, the touch sensors are spread across the back and sides of mobile device 200. This is shown in more detail in FIGS. 4-6, which show a back view, left side view and right side view of the outside of mobile device 200. In FIGS. 4, 5, and 6, touch sensor 254 is shown as two regions 300 and 302. Region 300 extends from the left side to the back of mobile device 200 and region 302 extends from the right side to the back of mobile device 200. When a user touches either section 300 or 302, the capacitance associated with the touched section changes, indicating that the user has touched the device. Note that although the touch sensors are shown on the exterior of the device in the embodiment of FIGS. 4-6, in other embodiments, the touch sensor is located beneath an outer covering of the device.
[0038] Touch sensor 256 is shown in FIG. 3, which is a front view of mobile device 200. In the embodiment of FIG. 3, touch sensor 256 is located on the left bezel of display screen 304. In other embodiments, touch sensor 256 is located on the outer casing on the front portion of mobile device 200, but not necessarily on bezel 306 of mobile device 200.

[0039] In some embodiments, the touch sensors described above are realized using a plurality of independent touch sensors that each provides a separate touch signal. In other embodiments, the touch sensors are replaced with position sensors that indicate the location where the user is touching the device. Those skilled in the art will recognize that additional touch sensors may be added to the mobile device within the scope of the present invention.
[0040] FIGS. 7 and 8 indicate locations for touch sensors under one embodiment of a tablet PC. In FIG. 7, touch sensors 700, 702, 704, and 706 are located at various locations around the perimeter of a display 708 on the front of tablet PC 701. Sensors associated with display 708 are able to detect the location of a stylus 710 when it is near display 708 using inductive coupling between display 708 and conductors in stylus 710. Under some embodiments, the sensors associated with display 708 are able to detect the proximity of stylus 710 as well as the azimuth of the stylus.

[0041] FIG. 8 provides a back view of tablet PC 701 and shows a touch sensor 712 located on the back surface of the tablet PC.

[0042] Tablet PC 701 can be of a slate form, in which the tablet PC only includes the display for input and does not include a keyboard. The slate forms of the tablet PC can be used with a docking station to provide a connection to other input devices and memory devices.
[0043] In other embodiments, the tablet PC is a convertible device with a keyboard. Under one convertible embodiment, the keyboard is attached to the display through a pivoting connection that allows the tablet PC to be in either a closed state or an open state. In such embodiments, the display is embedded in a top portion of the tablet PC and the keyboard is embedded in a bottom portion of the tablet PC. In the closed state, the top and bottom portions of the tablet PC are brought together so that the keyboard is hidden between the top and bottom portions while the display is visible on the exterior of the top portion. In the open state, the display pivots so that it faces the keyboard.

[0044] In another convertible embodiment, the display portion of the tablet PC is detachable from a keyboard and extended device portion, which can contain various disk drives and additional memory. In such embodiments, the back touch sensor 712 can be located on the back of the display portion or on the back of the keyboard portion.
[0045] Tilt sensor 258 is shown as a single dotted element 308 in FIG. 3 and element 714 in FIG. 7. The tilt sensor is embedded within the casing of mobile devices 200 and 701 and in one embodiment is located at a point about which users typically pivot mobile devices 200 and 701 when tilting the device. Note that the tilt sensor's position within the mobile device is unimportant as it senses only the angle of its physical attitude with respect to gravity. The sensor's angular position within the device, however, is important.
[0046] Under one embodiment, an Analog Devices ADXL202 two-axis linear accelerometer is used for tilt sensor 258. Such a sensor detects forward/backward tilt, shown by arrows 310 of FIG. 5, and left/right tilt, shown in the bottom view of FIG. 2 as arrows 312. The sensor also responds to linear accelerations, such as those resulting from shaking the device. Typically, the tilt sensor has a response curve both in the forward/back direction and left/right direction with the form:

    Angle = sin⁻¹((T − T₀)/k)        EQ. 1
`
[0047] where T is the tilt sensor value, T₀ is the sensor value at 0° tilt, and k is a gain parameter. In embodiments where the sensor cannot detect the sign of the gravity vector, it is unable to determine if the user is holding the device with the display facing up or down. Gravity switch 282 of FIG. 1 is thus provided in some embodiments to indicate whether the display is facing the ground. In other embodiments, a three-axis accelerometer is used to provide the sign of the gravity vector.
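The tilt conversion above can be sketched in a few lines, assuming EQ. 1 has the form angle = sin⁻¹((T − T₀)/k) with the variables just defined; the clamping of the argument and the gravity-switch polarity are added assumptions:

```python
import math

def tilt_angle_degrees(t: float, t0: float, k: float) -> float:
    """EQ. 1: angle = arcsin((T - T0) / k), returned in degrees.
    The argument is clamped to [-1, 1] so that sensor noise near full
    deflection cannot push it outside arcsin's domain."""
    x = max(-1.0, min(1.0, (t - t0) / k))
    return math.degrees(math.asin(x))

def display_facing_up(gravity_switch_closed: bool) -> bool:
    """The two-axis accelerometer cannot give the sign of the gravity
    vector, so gravity switch 282 supplies it. The closed-means-facing-down
    polarity assumed here is for illustration only."""
    return not gravity_switch_closed
```

With a zero-tilt reading T₀ = 512 and gain k = 100, a reading of 562 corresponds to a 30° tilt, and readings at T₀ ± k saturate at ±90°.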
[0048] In addition, the tilt sensor does not respond to rotation about an axis running from the front to the back of the mobile device. Thus, the tilt sensor is unable to sense the spinning of the mobile device on its back when laid on a flat table. Digital magnetic compass 284 of FIG. 1 is thus provided in some embodiments to indicate this type of rotation. In other embodiments, solid state gyros are used instead of the compass. In further embodiments, a multiple axis magnetometer may be used in lieu of the digital compass, and combined with the tilt sensor values, to improve the robustness of the sensed compass direction.

[0049] When present, gravity switch 282 and digital compass 284 are also internal to mobile devices 200 and 701. They are not shown in FIGS. 3 and 7 to reduce the complexity of FIGS. 3 and 7.
`
`
[0050] Note that the additional input devices of FIG. 1 do not all have to be present under the present invention. Different embodiments of the invention will use different numbers of and different combinations of these additional sensors. Further, additional sensors may be added without affecting the functions of the sensors discussed in the present application.

[0051] Transmitter 264 and receiver 266 of proximity sensor 262 are shown in FIGS. 3 and 7. In the embodiment of FIG. 3, transmitter 264 is shown below and to the right of receiver 266, and both the transmitter and receiver are located at the top front of mobile device 200.
[0052] Under one embodiment, a timer 265 drives transmitter 264 at 40 kilohertz, and transmitter 264 is an infrared light emitting diode with a 60° beam angle. Under such embodiments, receiver 266 is also an infrared receiver that is capable of operating at the same frequency as transmitter 264. The light produced by transmitter 264 bounces off objects that are near mobile device 200 and the reflected light is received by receiver 266. Receiver 266 typically has an automatic gain control such that the strength of the received signal is proportional to the distance to the object. In a further embodiment, multiple light emitting diodes with different beam angles may be combined to improve sensor response to both distant objects (using a narrow collimated beam angle, e.g. 5°) as well as objects that are not directly in front of the sensor (using a wide beam angle).
[0053] FIG. 9 shows a response curve for one embodiment of the proximity sensor. In FIG. 9, the sensor value is shown along horizontal axis 902 and the actual distance to the object is shown along vertical axis 904. The graph of FIG. 9 is divided into three ranges. Range 906 extends from a distance of approximately 27 centimeters to infinity and indicates that no objects are within range of mobile device 200. Range 908 extends from approximately 7 centimeters to 27 centimeters and indicates that at least one object is within range of mobile device 200. Readings in third range 910, which extends from 7 centimeters to 0 centimeters, are considered to be close to mobile device 200. The response curve of FIG. 9 is described by the following equation:

    z_cm = k / ((p/p_max) + c)^α        EQ. 2

`
[0054] where z_cm is the distance in centimeters to the object, p is the raw proximity reading, p_max is the maximum sensor reading, c is a constant, α is a nonlinear parameter (0.77 in one embodiment), and k is a gain factor.

[0055] Under one embodiment, the power consumed by proximity sensor 262 is limited by pulsing transmitter 264 a few times a second when the user is out of range, or by reducing the duty cycle of timer 265.

[0056] In other embodiments, IR receiver 266 generates a digital signal instead of the analog signal shown in FIG. 1. The digital signal provides a representation of the transmitted signal. However, as the distance between the device and the user increases, the number of errors in the digital signal increases. By counting these errors, PIC 252 is able to determine the distance between the user and the device.

[0057] FIG. 10 provides a block diagram of the software components of one embodiment of the present invention. In FIG. 10, a context information server 1000 receives the sensor data from serial port 250 of FIG. 1.

[0058] Context information server 1000 acts as a broker between the sensor values received by the microprocessor 252 and a set of applications 1002 operating on mobile device 200. Context information server 1000 continuously receives sensor data packets from PIC 252, converts the raw data into a logical form, and derives additional information from the sensor data.

[0059] Applications 1002 can access the logical form of information generated by registering with context information server 1000 to receive messages when a particular context attribute is updated. This logical form is referred to as a context attribute. Such context attributes can be based on the state of one or more sensors as well as past states of the sensors, or even anticipated (predicted) future states. Tables 1, 2, 3 and 4 below provide lists of the context attributes that an application can register to receive. In the description column of each table, specific values for the variables are shown in italics. For example, the DisplayOrientation variable can have values of flat, portrait, landscape left, landscape right, or portrait upside down.

TABLE 1

Group   Context Variable            Description
Touch   HoldingLeft,                Binary indication of contact with
        HoldingRight,               touch sensor
        HoldingBack
        LeftTouchState,             Degree of touching: OutOfRange,
        RightTouchState,            InRange, Close
        BackTouchState
        Holding & Duration          Whether or not the user is holding
                                    the device and for how long
        TouchingBezel & Duration    Whether user is touching screen
                                    bezel and for how long
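The distance estimate and the three ranges of FIG. 9 can be sketched as follows, assuming EQ. 2 has the form z_cm = k/((p/p_max) + c)^α with the variables defined in paragraph [0054]; the parameter values used here are illustrative, not taken from the patent:

```python
def distance_cm(p: float, p_max: float, c: float, k: float,
                alpha: float = 0.77) -> float:
    """Estimate distance from a raw proximity reading, assuming EQ. 2 has
    the form z_cm = k / ((p / p_max) + c) ** alpha. A stronger reflection
    (larger p) maps to a smaller estimated distance."""
    return k / ((p / p_max) + c) ** alpha

def proximity_range(z_cm: float) -> str:
    """Map a distance estimate to the three ranges of FIG. 9."""
    if z_cm > 27.0:
        return "OutOfRange"   # range 906: no object near the device
    if z_cm > 7.0:
        return "InRange"      # range 908: at least one object in range
    return "Close"            # range 910: object close to the device
```

The 27 cm and 7 cm boundaries come directly from the text; everything else (parameter values, the exact algebraic form of the curve) should be treated as an assumption.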
`
[0060]

TABLE 2

Group           Context Variable        Description
Tilt/           TiltAngleLR,            Left/Right and Forward/Back tilt
Accelerometer   TiltAngleFB             angles in degrees relative to
                                        screen orientation
                TiltGravityFb,          Absolute linear acceleration
                TiltGravityLr
                TiltAbsAngleFb,         Absolute tilt angle
                TiltAbsAngleLr
                GravityDir              Facing up or down
                DisplayOrientation,     Flat, Portrait, LandscapeLeft,
                Refresh                 LandscapeRight, or
                                        PortraitUpsideDown. A Refresh
                                        event is posted if apps need to
                                        update orientation
                HzLR, MagnitudeLR,      Dominant frequency and magnitude
                HzFB, MagnitudeFB       from FFT of tilt angles over the
                                        last few seconds
                LookingAt, Duration     If user is looking at display
                Moving & Duration       If device is moving in any way
                Shaking                 If device is being shaken
                Walking, Duration       If user is walking
`
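The DisplayOrientation attribute in Table 2 is derived from the two tilt angles, in the spirit of the quadrant chart of FIG. 13. The sign conventions and the 5° flat threshold in this sketch are assumptions for illustration:

```python
def display_orientation(tilt_lr: float, tilt_fb: float,
                        flat_threshold: float = 5.0) -> str:
    """Pick a DisplayOrientation value from left/right and forward/back
    tilt angles in degrees. Whichever axis is tilted further wins;
    near-zero tilt on both axes is treated as lying flat. Signs are
    assumed: positive tilt_lr means tilted right, positive tilt_fb
    means tilted forward (toward the user)."""
    if abs(tilt_lr) < flat_threshold and abs(tilt_fb) < flat_threshold:
        return "Flat"
    if abs(tilt_lr) > abs(tilt_fb):
        return "LandscapeRight" if tilt_lr > 0 else "LandscapeLeft"
    return "Portrait" if tilt_fb > 0 else "PortraitUpsideDown"
```

Comparing the magnitudes of the two tilt angles reproduces the diagonal boundaries visible in the FIG. 13 chart, where the quadrant edges fall near 45°.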
[0061]

TABLE 3

Group       Context Variable        Description
Proximity   Proximity               Estimated distance in cm to
                                    proximal object
            ProximityState,         Close, InRange, OutOfRange,
            Duration                AmbientLight (when out-of-range
                                    and bright ambient light is
                                    present)
`
[0062]

TABLE 4

Group   Context Variable        Description
Other   ScreenOrientation       Current display format
        VibrateOut              Vibrator intensity
        Light                   Light sensor value
        Temperature             Temperature sensor value
`
[0063] The context attributes of Table 1 are generated based on signals from the touch sensors, those in Table 2 are generated based on the tilt sensors and the gravity switch, those in Table 3 are generated based on the proximity sensors, and those in Table 4 are posted by applications or other various sensors.
`
[0064] Each context attribute is defined in a table entry that includes the following fields:

[0065] 1. A locally unique and/or globally unique identifier (GUID) for the attribute;

[0066] 2. The value of the attribute;

[0067] 3. The type of data that the attribute represents (e.g. integer, Boolean, etc.);

[0068] 4. A read/write/read-write permission flag;

[0069] 5. Input, Output, or Input+Output attribute;

[0070] 6. Context class of attribute;

[0071] 7. "Dirty" data bit;

[0072] 8. Human-readable name and description.
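The eight fields above can be pictured as a single record. The dataclass below is an illustrative layout only; the patent lists the fields but does not specify concrete types or names:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class ContextAttribute:
    """One table entry for a context attribute (fields 1-8 above).
    Field names and types are assumptions for illustration."""
    guid: str           # 1. locally and/or globally unique identifier
    value: Any          # 2. current value of the attribute
    data_type: str      # 3. type the attribute represents, e.g. "integer"
    permissions: str    # 4. "read", "write", or "read-write"
    direction: str      # 5. "input", "output", or "input+output"
    context_class: str  # 6. context class of the attribute
    dirty: bool         # 7. "dirty" data bit
    name: str           # 8. human-readable name and description
```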
[0073] To receive messages when a context attribute is updated, a client first generates a context client. Under one embodiment, this is done using a method CreateContextClient(myHwnd) as in:

    client = CreateContextClient(myHwnd)

[0074] where client is the resulting context client object and myHwnd is the window that will handle notification messages.
[0075] After the context client is defined, context attributes to be provided to the client are requested using a call such as:

    init_value = RequestNotification(client, context, usage)

[0076] where client is the client object created above, context is the name of the context attribute that the client wants to be updated on, and usage describes how the sensor will be used. The RequestNotification method returns the current value of the requested context attribute, which is assigned to init_value.
[0077] The usage parameter can have one of four values:

[0078] FgService: the application is using the attribute to provide a foreground service in which the user is manipulating the device to provide direct input to the device.

[0079] BgService: the application is using the sensor to provide a background service. This means the application is monitoring a signal to see if it meets criteria consistent with an anticipated pattern of user activity in which the user is not trying to provide input through their usage.

[0080] BgMonitor: the application is monitoring the context a