US 20110264928A1

(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2011/0264928 A1
     Hinckley                                (43) Pub. Date:     Oct. 27, 2011

(54) CHANGING POWER MODE BASED ON SENSORS IN A DEVICE

(75) Inventor:   Kenneth P. Hinckley, Redmond, WA (US)

(73) Assignee:   MICROSOFT CORPORATION, Redmond, WA (US)

(21) Appl. No.:  13/175,655

(22) Filed:      Jul. 1, 2011

Related U.S. Application Data

(63) Continuation of application No. 10/294,286, filed on Nov. 14, 2002, which is a continuation-in-part of application No. 09/875,477, filed on Jun. 6, 2001, now Pat. No. 7,289,102.

(60) Provisional application No. 60/218,748, filed on Jul. 17, 2000.

Publication Classification

(51) Int. Cl.
     G06F 1/26        (2006.01)

(52) U.S. Cl. ........................................................ 713/300

(57) ABSTRACT

An orientation of a device is detected based on a signal from at least one orientation sensor in the device. In response to the detected orientation, the device is placed in a full power mode.
`
[Representative drawing: front view of mobile device 200, showing elements 256, 264, 266, 304 and 308]
`Petitioner Samsung Ex-1011, 0001
`
`

`

[Sheet 1 of 10: FIG. 1 — block diagram of mobile device 200, showing processor 202, memory 204 (OS 212, applications 214, object store 216), I/O 206, communication interface 208, serial port 250, PIC microprocessor 252, timer, and A/D inputs for the various sensors]
`
`

`

[Sheet 2 of 10: FIG. 2 — bottom view of mobile device 200]
`
`

`

[Sheet 3 of 10: front and back views of mobile device 200, showing touch sensor regions 300 and 302, display screen 304, bezel 306, and tilt sensor 308]
`
`

`

[Sheet 4 of 10: FIGS. 7 and 8 — front and back views of tablet PC 701, showing touch sensors 700, 702, 704, 706 and 712, display 708, stylus 710, and tilt sensor 714]
`
`

`

[Sheet 5 of 10: FIG. 9 — graph of estimated distance in cm (vertical axis 904) versus proximity sensor value (horizontal axis 902), divided into ranges 906, 908 and 910]
`
`

`

[Sheet 6 of 10: FIGS. 11 and 12 — mobile device 1100 with display 1106, shown in portrait (FIG. 11) and landscape (FIG. 12) orientations displaying the same passage of text]
`
`

`

[Sheet 7 of 10: FIG. 13 — chart 1300 mapping left/right tilt angle (horizontal axis, -90° to 90°) and forward/back tilt angle (vertical axis) to display orientations: Portrait 1302, Landscape (Right) 1304, Portrait (Upside-Down) 1306, and Landscape (Left) 1308]
`
`

`

[Sheet 8 of 10: FIGS. 14 and 15 — mobile device 1400 shown in two different orientations, displaying an e-mail list (John Smith, "Monday's Meeting"; Bill Toms, "Status Report Due")]
`
`

`

[Sheet 9 of 10: FIG. 16 — front view of a mobile device displaying an orientation feedback icon]
`
`

`

[Sheet 10 of 10: FIG. 17 — mobile device in landscape orientation running a word processing program ("This is a word processing program, which provides the ability to enter and edit text"); FIG. 18 — mobile device 1800 in portrait orientation displaying an e-mail list]
`
`

`

`
`CHANGING POWER MODE BASED ON
`SENSORS IN A DEVICE
`
`REFERENCE TO RELATED APPLICATION
`
[0001] This application is a continuation of and claims priority from U.S. patent application Ser. No. 10/294,286, filed on Nov. 14, 2002, which was a Continuation-In-Part Application of U.S. patent application Ser. No. 09/875,477, filed Jun. 6, 2001, which claims priority from a U.S. Provisional application having Ser. No. 60/218,748, filed on Jul. 17, 2000, and is a Continuation-In-Part of U.S. patent application Ser. No. 10/162,487, filed Jun. 3, 2002, which was a Continuation-In-Part of U.S. patent application Ser. No. 09/875,477, filed Jun. 6, 2001.
`
`BACKGROUND OF THE INVENTION
`
[0002] The present invention relates to devices with displays. In particular, the present invention relates to computing and mobile devices.
[0003] Mobile devices, such as personal information managers (PIMs), tablet PCs, cellular telephones, pagers, watches, and wearable computers typically include one or more buttons or touch screens through which the mobile device receives explicit instructions from the user. For example, the user can press buttons to explicitly instruct the device to enter a full-power mode, activate an application, or change the orientation of a display.
[0004] Although the devices are responsive to information provided through such explicit instructions, they are generally not responsive to information that is present in the manner in which the device is being handled by the user. For example, the devices do not automatically enter a full-power mode, even when the user is holding the device in a manner that is consistent with wanting to use the device.
[0005] Because prior art devices are generally not responsive to the manner in which the user is holding the devices, the user is forced to enter explicit instructions into the device to achieve various functions. In light of this, mobile devices are needed that can sense how they are being handled in order to perform certain background functions that expand the functionality of the mobile device without requiring the user to perform any additional actions.
`
`SUMMARY OF THE INVENTION
`
`[0006] An orientation of a device is detected based on a
`signal from at least one orientation sensor in the device. In
`response to the detected orientation, the device is placed in a
`full power mode.
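By way of illustration only (this sketch is not part of the original specification), the summary's policy can be expressed as a small decision function. The tilt thresholds and the sensor interface below are placeholder assumptions, not values taken from the disclosure.

```python
# Hypothetical sketch: wake the device when its detected orientation is
# consistent with the user viewing it. Thresholds are illustrative only.

def should_enter_full_power(tilt_fb_deg, tilt_lr_deg, display_facing_up):
    """Return True when orientation suggests the user is viewing the device.

    tilt_fb_deg / tilt_lr_deg: forward/back and left/right tilt, in degrees.
    display_facing_up: sign of the gravity vector (e.g. from a gravity switch).
    """
    if not display_facing_up:
        # Display toward the ground: likely carried or set face-down.
        return False
    # A device held for viewing is typically tilted toward the user,
    # neither lying flat nor turned fully on its side.
    return 10 <= tilt_fb_deg <= 80 and abs(tilt_lr_deg) <= 45
```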
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
[0007] FIG. 1 is a block diagram of the components of a mobile device under one embodiment of the present invention.
[0008] FIG. 2 is a bottom view of a mobile device of one embodiment of the present invention.
[0009] FIG. 3 is a front view of the mobile device of FIG. 2.
[0010] FIG. 4 is a back view of the mobile device of FIG. 2.
[0011] FIG. 5 is a left side view of the mobile device of FIG. 2.
[0012] FIG. 6 is a right side view of the mobile device of FIG. 2.
[0013] FIG. 7 is a front view of a second embodiment of a mobile device.
[0014] FIG. 8 is a back view of the mobile device of FIG. 7.
[0015] FIG. 9 is a graph of distance between a user and a mobile device as a function of proximity sensor levels.
[0016] FIG. 10 is a block diagram of components used to practice several embodiments of the present invention.
[0017] FIG. 11 is a front view of a mobile device in a portrait orientation.
[0018] FIG. 12 is a front view of a mobile device in a landscape orientation.
[0019] FIG. 13 is a chart showing the display orientations for various combinations of forward/back tilt and left/right tilt.
[0020] FIGS. 14 and 15 show a mobile device in two different orientations.
[0021] FIG. 16 shows a front view of a mobile device with an orientation feedback icon displayed.
[0022] FIG. 17 shows a mobile device in a landscape orientation with a word-processing application.
[0023] FIG. 18 shows a mobile device in a portrait orientation with an e-mail application.
`
`DETAILED DESCRIPTION OF THE PREFERRED
`EMBODIMENTS
`
[0024] FIG. 1 is a block diagram of a mobile device 200, which is an exemplary environment for embodiments of the present invention. Mobile device 200 includes a microprocessor 202, memory 204, input/output (I/O) components 206, and a communication interface 208 for communicating with remote computers or other mobile devices. In one embodiment, the afore-mentioned components are coupled for communication with one another over a suitable bus 210.
[0025] Memory 204 is a computer-readable medium, which can be any available media that can be accessed by processor 202 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and nonremovable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by processor 202. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
[0026] The particular computer readable medium used in a mobile device is a function of the size of the device and the power capacity of the device. For example, a tablet PC will typically include one or more disk drives whereas a PIM will
`
`

`

`
typically only include a random access memory (RAM) with a battery back-up module (not shown) such that information stored in memory 204 is not lost when the general power to mobile device 200 is shut down.
[0027] A portion of memory 204 is preferably allocated as addressable memory for program execution, while another portion of memory 204 is preferably used for storage.
[0028] Memory 204 includes an operating system 212, application programs 214, and an object store 216. During operation, operating system 212 is preferably executed by processor 202 from memory 204. Operating system 212, in one preferred embodiment, is a WINDOWS® CE brand operating system commercially available from Microsoft Corporation. In Tablet PC embodiments, the Windows® XP brand operating system available from Microsoft Corporation is utilized. Operating system 212 is preferably designed for mobile devices, and implements database features that can be utilized by applications 214 through a set of exposed application programming interfaces and methods. The objects in object store 216 are maintained by applications 214 and operating system 212 at least partially in response to calls to the exposed application programming interfaces and methods.
[0029] Communication interface 208 represents numerous devices and technologies that allow mobile device 200 to send and receive information. The devices include wired and wireless modems, satellite receivers and broadcast tuners, to name a few. Mobile device 200 can also be directly connected to a computer to exchange data therewith. In such cases, communication interface 208 can be an infrared transceiver or a serial, parallel, USB, or Firewire communication connection, all of which are capable of transmitting streaming information.
[0030] Input/output components 206 include a variety of input devices that have previously been found on mobile devices, such as a touch-sensitive screen or transparent tablet overlay sensitive to properties of a special stylus including position, proximity to the screen, pressure, azimuth, elevation, which end of the stylus is being used (e.g. writing tip on one end, eraser on the other end) and possibly a unique ID encoded in the stylus, buttons, rollers, and a microphone, as well as a variety of output devices including an audio generator, a vibrating device, and a display. The devices listed above are by way of example and need not all be present on mobile device 200.
[0031] Mobile device 200 also includes additional input devices under the present invention. Under one embodiment, these input devices are connected to the mobile device through a separate serial port 250 and a peripheral interface controller (PIC) microprocessor 252. In other embodiments, these additional devices are connected to processor 202 through communication interface 208 and PIC microprocessor 252, or through PIC microprocessor 252 directly. Under one embodiment, a Microchip 16C73A peripheral interface controller is used as the PIC microprocessor. In still further embodiments, PIC microprocessor 252 is not present and the input devices are connected to processor 202 through various ports such as serial port 250 or through communication interface 208, or through memory-mapped I/O or direct connection to the system processor(s).
[0032] Under the embodiment of FIG. 1, the additional input devices include a set of touch sensors such as touch sensors 254 and 256. Touch sensors 254 and 256 are provided to a separate peripheral interface controller microprocessor 276, which converts the touch signals into digital values and
`
provides the digital values to PIC microprocessor 252. In other embodiments, touch sensors 254 and 256 are connected directly to analog or digital inputs in PIC microprocessor 252 instead of being connected to PIC 276, or are connected to processor 202.
[0033] The input devices also include a dual axis linear accelerometer tilt sensor 258 capable of detecting forward/back tilt, left/right tilt, and linear accelerations such as those resulting from vibrations or movement.
[0034] The input devices also include a light sensor 260, a proximity sensor 262 consisting of an infrared transmitter 264 and an infrared receiver 266, a digital compass (e.g. a single or multiple axis magnetometer) 284, and a gravity switch 282. The sensing signals from the infrared receiver 266, linear accelerometer 258, light sensor 260, digital compass 284, and gravity switch 282 may be provided through respective amplifiers 270, 272, 274, 285 and 287 to analog inputs of PIC microprocessor 252. These analog inputs are connected to analog-to-digital converters within PIC microprocessor 252. In other embodiments, the sensors provide a digital output and thus are connected to digital inputs on the microprocessor. In further embodiments, the input devices also include a temperature sensor.
[0035] PIC microprocessor 252 also includes a connection to the power bus of mobile device 200, which is shown as connection 278 in FIG. 1. PIC microprocessor 252 also includes a connection to a power switch 280, which enables PIC microprocessor 252 to turn mobile device 200 on and off. Note that PIC microprocessor 252 always receives power and, under one embodiment, is able to control which of the sensors receives power at any one time. This allows PIC microprocessor 252 to manage power consumption by only sending power to those sensors that it anticipates will need to be active.
[0036] Under one embodiment, PIC microprocessor 252 continuously samples the sensors and transmits packets representing the state of these sensors at a rate of approximately 400 samples per second through serial port 250. In some embodiments, samples are reported at lower speeds to conserve power and processing resources. Some sensors may be reported at different sampling rates than others (e.g. tilt may be updated more frequently than touch).
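By way of illustration only, the per-sensor sampling scheme described above can be sketched with a base tick rate and a divisor per sensor, so that slow-changing sensors are reported less often. The particular divisors below are placeholder assumptions, not values from the specification.

```python
# Illustrative sketch of the sampling scheme: run a base clock (~400 Hz)
# and report each sensor only on every Nth tick, so slow-changing sensors
# (e.g. touch) consume less bandwidth than fast-changing ones (e.g. tilt).

BASE_RATE_HZ = 400

# Hypothetical divisors: tilt every tick, proximity every 4th, etc.
REPORT_EVERY = {"tilt": 1, "proximity": 4, "touch": 8, "light": 16}

def sensors_to_report(tick):
    """Return the sensors whose state belongs in this tick's packet."""
    return [name for name, n in sorted(REPORT_EVERY.items()) if tick % n == 0]
```

With these divisors, tilt is reported at the full 400 Hz while light is reported at 25 Hz; the power-gating described in [0035] could additionally skip sensors that are currently unpowered.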
`[0037] Under one embodiment, the touch sensors are
`capacitive touch sensors that are divided into two regions. In
`other embodiments, these sensors are implemented as a single
`detector pad. Under one embodiment, the touch sensors are
spread across the back and sides of mobile device 200. This is shown in more detail in FIGS. 4-6, which show back, left side, and right side views of the outside of mobile device 200. In FIGS. 4, 5, and 6, touch sensor 254 is shown as two regions 300 and 302. Region 300 extends from the left side to the back of mobile device 200 and region 302 extends from the right side to the back of mobile device 200. When a user touches either section 300 or 302, the capacitance associated with the touched section changes, indicating that the user has touched the device. Note that although the touch sensors are shown on the exterior of the device in the embodiment of FIGS. 4-6, in other embodiments, the touch sensor is located beneath an outer covering of the device.
`[0038] Touch sensor 256 is shown in FIG. 3, which is a front
`view of mobile device 200. In the embodiment of FIG. 3,
`touch sensor 256 is located on the left bezel of display screen
`304. In other embodiments, touch sensor 256 is located on the
`
`
`

`

`
outer casing on the front portion of mobile device 200, but not necessarily on bezel 306 of mobile device 200.
[0039] In some embodiments, the touch sensors described above are realized using a plurality of independent touch sensors that each provides a separate touch signal. In other embodiments, the touch sensors are replaced with position sensors that indicate the location where the user is touching the device. Those skilled in the art will recognize that additional touch sensors may be added to the mobile device within the scope of the present invention.
`[0040] FIGS. 7 and 8 indicate locations for touch sensors
`under one embodiment of a tablet PC. In FIG. 7, touch sensors
`700, 702, 704, and 706 are located at various locations around
`the perimeter of a display 708 on the front of tablet PC 701.
`Sensors associated with display 708 are able to detect the
`location of a stylus 710 when it is near display 708 using
`inductive coupling between display 708 and conductors in
`stylus 710. Under some embodiments, the sensors associated
`with display 708 are able to detect the proximity of stylus 710
`as well as the azimuth of the stylus.
`[0041] FIG. 8 provides a back view of tablet PC 701 and
`shows a touch sensor 712 located on the back surface of the
`tablet PC.
`[0042] Tablet PC 701 can be of a slate form, in which the
`tablet PC only includes the display for input and does not
`include a keyboard. The slate forms of the tablet PC can be
`used with a docking station to provide a connection to other
`input devices and memory devices.
`[0043]
`In other embodiments, the tablet PC is a convertible
`device with a keyboard. Under one convertible embodiment,
`the keyboard is attached to the display through a pivoting
`connection that allows the tablet PC to be in either a closed
`state or an open state. In such embodiments, the display is
`embedded in a top portion of the tablet PC and the keyboard
`is embedded in a bottom portion of the tablet PC. In the closed
`state, the top and bottom portions of the tablet PC are brought
`together so that the keyboard is hidden between the top and
`bottom portions while the display is visible on the exterior of
`the top portion. In the open state, the display pivots so that it
`faces the keyboard.
[0044] In another convertible embodiment, the display portion of the tablet PC is detachable from a keyboard and extended device portion, which can contain various disk drives and additional memory. In such embodiments, the back touch sensor 712 can be located on the back of the display portion or on the back of the keyboard portion.
[0045] Tilt sensor 258 is shown as a single dotted element 308 in FIG. 3 and element 714 in FIG. 7. The tilt sensor is embedded within the casing of mobile devices 200 and 701 and in one embodiment is located at a point about which users typically pivot mobile devices 200 and 701 when tilting the device. Note that the tilt sensor's linear position within the mobile device is unimportant, as it senses only the angle of its physical attitude with respect to gravity; the sensor's angular position within the device, however, is important.
[0046] Under one embodiment, an Analog Devices ADXL202 two-axis linear accelerometer is used for tilt sensor 258. Such a sensor detects forward/backward tilt, shown by arrows 310 of FIG. 5, and left/right tilt, shown in the bottom view of FIG. 2 as arrows 312. The sensor also responds to linear accelerations, such as those resulting from shaking the device. Typically, the tilt sensor has a response curve both in the forward/back direction and left/right direction with the form:
Angle = sin^-1((T - T_c)/k)    (EQ. 1)

where T is the tilt sensor value, T_c is the sensor value at 0° tilt, and k is a gain parameter. In embodiments where the sensor cannot detect the sign of the gravity vector, it is unable to determine if the user is holding the device with the display facing up or down. Gravity switch 282 of FIG. 1 is thus provided in some embodiments to indicate whether the display is facing the ground. In other embodiments, a three-axis accelerometer is used to provide the sign of the gravity vector.
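By way of illustration only, EQ. 1 can be evaluated directly per axis. The sketch below clamps the normalized reading to the arcsine's valid domain, since sensor noise can push it slightly outside [-1, 1]; the calibration values used in the example call are placeholders, not figures from the specification.

```python
import math

def tilt_angle_deg(t, t_c, k):
    """EQ. 1: Angle = arcsin((T - T_c) / k), returned in degrees.

    t:   raw tilt sensor value T
    t_c: sensor value T_c at 0 degrees of tilt
    k:   gain parameter
    """
    x = (t - t_c) / k
    x = max(-1.0, min(1.0, x))  # guard against noise outside arcsine's domain
    return math.degrees(math.asin(x))
```

For example, with placeholder calibration t_c = 100 and k = 50, a raw reading of 125 corresponds to a 30° tilt.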
[0047] In addition, the tilt sensor does not respond to rotation about an axis running from the front to the back of the mobile device. Thus, the tilt sensor is unable to sense the spinning of the mobile device on its back when laid on a flat table. Digital magnetic compass 284 of FIG. 1 is thus provided in some embodiments to indicate this type of rotation. In other embodiments, solid state gyros are used instead of the compass. In further embodiments, a multiple axis magnetometer may be used in lieu of the digital compass, and combined with the tilt sensor values, to improve the robustness of the sensed compass direction.
[0048] When present, gravity switch 282 and digital compass 284 are also internal to mobile devices 200 and 701. They are not shown in FIGS. 3 and 7 to reduce the complexity of those figures.
[0049] Note that the additional input devices of FIG. 1 do not all have to be present under the present invention. Different embodiments of the invention will use different numbers of and different combinations of these additional sensors. Further, additional sensors may be added without affecting the functions of the sensors discussed in the present application.
[0050] Transmitter 264 and receiver 266 of proximity sensor 262 are shown in FIGS. 3 and 7. In the embodiment of FIG. 3, transmitter 264 is shown below and to the right of receiver 266, and both the transmitter and receiver are located at the top front of mobile device 200.
[0051] Under one embodiment, a timer 265 drives transmitter 264 at 40 kilohertz, and transmitter 264 is an infrared light emitting diode with a 60° beam angle. Under such embodiments, receiver 266 is also an infrared receiver that is capable of operating at the same frequency as transmitter 264. The light produced by transmitter 264 bounces off objects that are near mobile device 200 and the reflected light is received by receiver 266. Receiver 266 typically has an automatic gain control such that the strength of the received signal is proportional to the distance to the object. In a further embodiment, multiple light emitting diodes with different beam angles may be combined to improve sensor response to both distant objects (using a narrow collimated beam angle, e.g. 5°) as well as objects that are not directly in front of the sensor (using a wide beam angle).
`[0052] FIG. 9 shows a response curve for one embodiment
`of the proximity sensor. In FIG. 9, the sensor value is shown
`along horizontal axis 902 and the actual distance to the object
`is shown along vertical axis 904. The graph of FIG. 9 is
`divided into three ranges. Range 906 extends from a distance
`of approximately 27 centimeters to infinity and indicates that
`no objects are within range of mobile device 200. Range 908
`extends from approximately 7 centimeters to 27 centimeters
`and indicates that at least one object is within range of mobile
`
`
`

`

`
device 200. Readings in the third range 910, which extends from 7 centimeters to 0 centimeters, are considered to be close to mobile device 200. The response curve of FIG. 9 is described by the following equation:

z_cm = k / (p/p_max - c)^a    (EQ. 2)

where z_cm is the distance in centimeters to the object, p is the raw proximity reading, p_max is the maximum sensor reading, c is a constant, a is a nonlinear parameter (0.77 in one embodiment), and k is a gain factor.
[0053] Under one embodiment, the power consumed by proximity sensor 262 is limited by pulsing transmitter 264 a few times a second when the user is out of range, or by reducing the duty cycle of timer 265.
[0054] In other embodiments, IR receiver 266 generates a digital signal instead of the analog signal shown in FIG. 1. The digital signal provides a representation of the transmitted signal. However, as the distance between the device and the user increases, the number of errors in the digital signal increases. By counting these errors, PIC 252 is able to determine the distance between the user and the device.
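By way of illustration only, EQ. 2 and the three ranges of FIG. 9 can be combined into a small conversion-and-classification routine. Only a = 0.77 comes from the text; p_max, c and k below are placeholder calibration values, and the range boundaries (7 cm and 27 cm) are taken from the description of FIG. 9.

```python
# Illustrative sketch of EQ. 2 and the ranges of FIG. 9. The constants
# p_max, c and k are placeholders; only a = 0.77 is from the text.

def proximity_distance_cm(p, p_max=255.0, c=0.057, k=10.0, a=0.77):
    """EQ. 2: z_cm = k / (p/p_max - c)**a, the estimated distance in cm.

    Valid only while p/p_max > c, i.e. while an object reflects enough light.
    """
    return k / (p / p_max - c) ** a

def proximity_state(z_cm):
    """Classify an estimated distance into the three ranges of FIG. 9."""
    if z_cm < 7:
        return "Close"        # range 910: 0-7 cm
    if z_cm <= 27:
        return "InRange"      # range 908: 7-27 cm
    return "OutOfRange"       # range 906: beyond ~27 cm
```

Note that a stronger reflection (larger p) yields a smaller estimated distance, matching the automatic gain control behavior described in [0051].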
[0055] FIG. 10 provides a block diagram of the software components of one embodiment of the present invention. In FIG. 10, a context information server 1000 receives the sensor data from serial port 250 of FIG. 1.
[0056] Context information server 1000 acts as a broker between the sensor values received by the microprocessor 252 and a set of applications 1002 operating on the mobile device. Context information server 1000 continuously receives sensor data packets from PIC 252, converts the raw data into a logical form, and derives additional information from the sensor data.
[0057] Applications 1002 can access the logical form of information generated by registering with context information server 1000 to receive messages when a particular context attribute is updated. This logical form is referred to as a context attribute. Such context attributes can be based on the state of one or more sensors as well as past states of the sensors, or even anticipated (predicted) future states. Tables 1, 2, 3 and 4 below provide lists of the context attributes that an application can register to receive. In the description column of each table, specific values for the variables are shown in italics. For example, the DISPLAYORIENTATION variable can have values of flat, portrait, landscape left, landscape right, or portrait upside down.
`
TABLE 1

Group   Context Variable           Description

Touch   HoldingLeft                Binary indication of contact
        HoldingRight               with touch sensor
        HoldingBack
        LeftTouchState             Degree of touching: OutOfRange,
        RightTouchState            InRange, Close
        BackTouchState
        Holding&Duration           Whether or not the user is
                                   holding the device and for how
                                   long
        TouchingBezel&Duration     Whether user is touching screen
                                   bezel and for how long

TABLE 2

Group           Context Variable          Description

Tilt/           TiltAngleLR,              Left/Right and Forward/Back
Accelerometer   TiltAngleFB               tilt angles in degrees
                                          relative to screen orientation
                TiltGravityFb             Absolute linear acceleration
                TiltGravityLr
                TiltAbsAngleFb            Absolute tilt angle
                TiltAbsAngleLr
                GravityDir                Facing up or down
                DisplayOrientation,       Flat, Portrait, Landscape Left,
                Refresh                   Landscape Right, or
                                          PortraitUpsideDown. A Refresh
                                          event is posted if apps need to
                                          update orientation
                HzLR, MagnitudeLR,        Dominant frequency and
                HzFB, MagnitudeFB         magnitude from FFT of tilt
                                          angles over the last few seconds
                LookingAt, Duration       If user is looking at display
                Moving & Duration         If device is moving in any way
                Shaking                   If device is being shaken
                Walking, Duration         If user is walking

TABLE 3

Group       Context Variable     Description

Proximity   Proximity            Estimated distance in cm to
                                 proximal object
            ProximityState,      Close, InRange, OutOfRange,
            Duration             AmbientLight (when out-of-range
                                 and bright ambient light is
                                 present)
`
`

`

`
TABLE 4

Group   Context Variable     Description

Other   ScreenOrientation    Current display format
        VibrateOut           Vibrator intensity
        Light                Light sensor value
        Temperature          Temperature sensor value
`
[0058] The context attributes of Table 1 are generated based on signals from the touch sensors, those in Table 2 are generated based on the tilt sensors and the gravity switch, those in Table 3 are generated based on the proximity sensors, and those in Table 4 are posted by applications or various other sensors.
[0059] Each context attribute is defined in a table entry that includes the following fields:
[0060] 1. A locally unique and/or globally unique identifier (GUID) for the attribute;
[0061] 2. The value of the attribute;
[0062] 3. The type of data that the attribute represents (e.g. integer, Boolean, etc.);
[0063] 4. A read/write/read-write permission flag;
[0064] 5. Input, Output, or Input+Output attribute;
[0065] 6. Context class of attribute;
[0066] 7. "Dirty" data bit;
[0067] 8. Human-readable name and description.
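By way of illustration only, the eight fields above map naturally onto a record type. The sketch below is a hypothetical representation; the field names and default values are invented for illustration and are not an API from the specification.

```python
# Hypothetical record mirroring the eight fields of a context attribute
# table entry. Names, types and defaults are illustrative only.
from dataclasses import dataclass, field
import uuid

@dataclass
class ContextAttribute:
    name: str                        # 8. human-readable name
    description: str                 # 8. human-readable description
    value: object = None             # 2. current value of the attribute
    data_type: str = "integer"       # 3. type the attribute represents
    permission: str = "read-write"   # 4. read/write/read-write flag
    direction: str = "Input"         # 5. Input, Output, or Input+Output
    context_class: str = ""          # 6. context class of the attribute
    dirty: bool = False              # 7. "dirty" data bit
    guid: str = field(default_factory=lambda: str(uuid.uuid4()))  # 1. id
```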
[0068] To receive messages when a context attribute is updated, a client first generates a context client. Under one embodiment, this is done using a method CreateContextClient(myHwnd) as in:
[0069] client=CreateContextClient(myHwnd)
where client is the resulting context client object and myHwnd is the window that will handle notification messages.
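By way of illustration only, the registration model described above resembles a publish/subscribe broker. The sketch below imitates that pattern, with plain callbacks standing in for the window messages delivered to myHwnd; the class and method names are assumptions, not the specification's API.

```python
# Minimal publish/subscribe sketch of the context information server's
# notification model: clients register for a context attribute and are
# called back when it is updated. Callbacks stand in for window messages.

class ContextServer:
    def __init__(self):
        self._clients = {}  # attribute name -> list of callbacks

    def register(self, attribute, callback):
        """Register a client callback for updates to one context attribute."""
        self._clients.setdefault(attribute, []).append(callback)

    def update(self, attribute, value):
        """Post a new attribute value and notify every registered client."""
        for callback in self._clients.get(attribute, []):
            callback(attribute, value)
```

A client interested in rotation, for example, would register for "DisplayOrientation" and re-render its display when the callback fires.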
`[0070] After the context client is defined, co
