US 20110148752 A1

(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2011/0148752 A1
     Alameh et al.                           (43) Pub. Date:      Jun. 23, 2011
(54) MOBILE DEVICE WITH USER INTERACTION CAPABILITY AND METHOD OF OPERATING SAME

(76) Inventors: Rachid Alameh, Crystal Lake, IL (US); Thomas Merrell, Beach Park, IL (US); Kenneth Paitl, West Dundee, IL (US)

(21) Appl. No.: 12/641,830

(22) Filed: Dec. 18, 2009

Publication Classification

(51) Int. Cl.
     G06F 3/01    (2006.01)
(52) U.S. Cl. ........................................................ 345/156

(57) ABSTRACT

In one embodiment, a method of operating a mobile device includes sensing either an orientation or a movement of the mobile device, determining a command based on the sensed orientation or sensed movement, sensing a proximity of an object in relation to at least a portion of the mobile device, and executing the command upon the proximity of the object being sensed. In another embodiment, a method of operating a mobile device governs a manner of interaction of the mobile device relative to one or more other mobile devices. In at least some embodiments, at least one of the mobile devices includes an accelerometer and an infrared proximity sensor, and operation of the mobile device is determined based upon signals from those components.
[Representative drawing: the flow chart of FIG. 6 (see Sheet 6 of 8), showing steps from sensing a resting position through executing a preselected command.]

[Sheet 1 of 8, FIG. 1: front view of the exemplary mobile device 102 (reference numerals 102 and 108 legible in the drawing).]

[Sheet 2 of 8, FIG. 2: block diagram of internal components 200 of the mobile device, showing wireless transceivers 202 (WWAN transceiver 203 and WPAN transceiver 205), processor 204, memory 206, visual/audio/mechanical input devices (222, 224, 226), visual/audio/mechanical output devices (216, 218, 220), sensors 228 including proximity sensors 229 and other sensors 231, component interface 212, power supply 214, and internal communication links 232.]

[Sheet 3 of 8, FIG. 3: perspective view of the mobile device 102 shown in relation to an X, Y, Z coordinate system.]

[Sheet 4 of 8, FIG. 4: top view of the mobile device 102 being moved in a manner parallel to the X-Y plane of the coordinate system of FIG. 3.]

[Sheet 5 of 8, FIG. 5: top view of the mobile device with a hand being waved over its infrared proximity sensor.]

[Sheet 6 of 8, FIG. 6: flow chart of interfacing with the mobile device 102 to initiate a command. Steps legible in the drawing: 602 start; 604 device senses resting position and stores current X, Y, Z coordinate orientation; 606 device is poised to sense movement; 608 movement detected? (no: return, yes: continue); 610 device tracks the change in X, Y, Z coordinate orientation between resting positions; 612 compare tracked movement to a list of predefined motions; 614 look up preselected command associated with the motion sensed; identify the preselected command and prepare device for trigger signal; 618 activate the infrared transmitter; decision branches (yes/no); 622 execute preselected command.]

[Sheet 7 of 8, FIGS. 7-10: side views of the mobile device 102 together with a second mobile device 702 in flat and rotated orientations relative to the horizontal (X-Y) plane, and, in FIG. 10, additional mobile devices 1004, 1005, and 1006.]

[Sheet 8 of 8, FIG. 11: flow chart of exemplary steps for interfacing between mobile devices. Steps legible in the drawing: 1102 start; 1104 preconfigure list of stored mobile devices for potential pairing; 1105 monitor the relative position of the first mobile device; 1106 mobile device senses it is in a resting position, calculate current X, Y, Z coordinate orientation; 1108 pair with mobile device(s) on stored device list that is in range via wireless communication; 1110 deactivate any paired mobile device to be excluded; 1112 communicate with a paired device to ascertain its current position; set command (e.g., command = transfer selected file to other paired device in similar orientation); execute command; configured for multiple data transfers?; end.]

`
MOBILE DEVICE WITH USER INTERACTION CAPABILITY AND METHOD OF OPERATING SAME

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. patent application Ser. No. 12/471,062, titled "Sensing Assembly For Mobile Device" and filed on May 22, 2009, which is hereby incorporated by reference herein.

FIELD OF THE INVENTION

[0002] The present invention relates generally to mobile devices and, more particularly, to methods and systems capable of being implemented by mobile devices that facilitate interactions with one or more of those mobile devices.

BACKGROUND OF THE INVENTION

[0003] Mobile devices such as cellular telephones, smart phones, and other handheld or portable electronic devices such as personal digital assistants (PDAs), headsets, MP3 players, etc. have become increasingly popular and ubiquitous. As more and more people carry mobile devices with them, there is a desire that such mobile devices become capable of numerous functions, yet also be easy to use.

[0004] Conventional mobile devices have numerous touch-sensitive input actuation mechanisms, such as buttons, keypads, joysticks, touchscreens, etc. These input actuation mechanisms are often unwieldy depending upon the circumstance. This can be particularly true for some users, for example, those with larger hands or the elderly. In addition, the necessity of repeatedly entering various commands can be time consuming and non-intuitive.

[0005] Therefore, for the above reasons, there is an opportunity to develop a method and/or system that provides convenient user interaction functionality.

BRIEF DESCRIPTION OF THE DRAWINGS

[0006] FIG. 1 is a front view of one embodiment of an exemplary mobile device described herein;
[0007] FIG. 2 is a block diagram illustrating exemplary components of the mobile device of FIG. 1;
[0008] FIG. 3 is a further perspective view of the mobile device of FIG. 1 shown in relation to an exemplary coordinate system;
[0009] FIG. 4 is a top view of the mobile device of FIG. 1 being moved in a manner parallel to an X-Y plane defined by the coordinate system of FIG. 3;
[0010] FIG. 5 is a top view of the mobile device in FIG. 4 with a hand being waved over an infrared proximity sensor of the mobile device;
[0011] FIG. 6 is a flow chart illustrating exemplary steps of interfacing with the mobile device of FIG. 1 to initiate a command;
[0012] FIG. 7 is a side view of the mobile device of FIG. 1 along with a second exemplary mobile device situated in a flat orientation about a horizontal plane (X-Y plane);
[0013] FIG. 8 is an additional side view of the mobile devices of FIG. 7, which are now shown to be situated in orientations that are rotated relative to the horizontal plane;
[0014] FIG. 9 is another side view of the mobile devices of FIG. 7, where the first mobile device is orientated as shown in FIG. 8 while the second mobile device is orientated as shown in FIG. 7;
[0015] FIG. 10 is a further side view of the mobile devices of FIG. 7 now shown in conjunction with one or more additional mobile devices; and
[0016] FIG. 11 is a flow chart illustrating exemplary steps pertaining to the interfacing of mobile devices situated in substantially similar and/or dissimilar orientations.

DETAILED DESCRIPTION

[0017] Methods, mobile devices, and systems with support for interactions with one or more mobile devices with acceleration and proximity sensing are described below. In at least one embodiment, a mobile device including an accelerometer and infrared sensor is configured to associate commands (or, in some cases, to learn commands) through accelerometer sensing and then to actuate the commands based on infrared sensing, or vice-versa. The commands can include a plethora of possibilities. For example, commands for turning a page on an electronic book (e.g., eBook), changing TV channels, scrolling through a list or website (or web pages thereof), transferring a song to another mobile device, etc., can be implemented with a slight movement of the mobile device and the waving of a hand over the mobile device. In additional embodiments, such acceleration and infrared sensing is utilized to govern an interaction between a first mobile device and another mobile device based upon the first mobile device's orientation with respect to the other mobile device, such as when both devices are orientated flat on a horizontal surface. Upon determining appropriate interfacing between the mobile devices based upon the orientation of the mobile devices, a slight movement of the mobile device and the waving of a hand over the mobile device can transfer data from one device to another.
[0018] More particularly, one embodiment relates to a method of operating a mobile device. The method includes sensing at least one of an orientation and a movement of the mobile device, selecting a command based on the sensed orientation or sensed movement, and executing the command upon sensing a proximity of an object in relation to at least a portion of the mobile device. An additional embodiment relates to a method of operating a first mobile device in relation to a second mobile device. The method includes sensing a first orientation of the first mobile device relative to a reference orientation and receiving, from the second mobile device, information concerning a second orientation of the second mobile device relative to the reference orientation. The method additionally includes determining whether a first criterion concerning a similarity between the first orientation and the second orientation has been met, and transferring first data from the first mobile device to the second mobile device upon sensing a triggering event, provided that the first criterion has been met.
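
As an illustration of this second embodiment, the following minimal Python sketch (not part of the original specification; the function names, the 10-degree tolerance, and the send_file callable are assumptions chosen for illustration) shows how a first device might transfer data to a second device only when the two sensed orientations are similar and a triggering event has been sensed:

```python
def orientation_similar(first_tilt_deg, second_tilt_deg, tolerance_deg=10.0):
    """First criterion: the two devices' tilt angles relative to a shared
    reference orientation (e.g., flat on a horizontal surface) are close."""
    return abs(first_tilt_deg - second_tilt_deg) <= tolerance_deg


def maybe_transfer(first_tilt_deg, second_tilt_deg, trigger_sensed, send_file):
    """Transfer the first data to the second device only if the similarity
    criterion has been met and a triggering event (e.g., a hand waved over
    the infrared proximity sensor) has also been sensed."""
    if orientation_similar(first_tilt_deg, second_tilt_deg) and trigger_sensed:
        send_file()  # e.g., transmit the selected file over a WPAN link
        return True
    return False
```
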
[0019] An additional embodiment involves a mobile device. The mobile device includes a processor, a wireless transceiver coupled to the processor, and an accelerometer coupled to the processor to provide a first signal to the processor indicative of a first orientation, first movement, or first acceleration of the mobile device. The mobile device further includes an infrared proximity sensor, coupled to the processor, for providing a second signal indicative of a presence of an object in proximity to the infrared proximity sensor. The processor determines based at least in part upon the first signal whether a first criterion has been met and, upon determining that the first criterion has been met, enters a first state in which the processor is prepared to execute a command upon receiving the second signal. Further, the processor, upon receiving the second signal, executes the command.
[0020] FIG. 1 shows an exemplary mobile device 102 that includes, among its various components, an accelerometer 104 (shown in phantom), such as a gravimeter, an electronic compass 105 (shown in phantom), and an infrared proximity sensor 106, in accordance with a first embodiment. In the present example, the mobile device 102 is a personal digital assistant (PDA), albeit the mobile device is also intended to be representative of a variety of other mobile devices as well, including, for example, cellular telephones, smart phones, other handheld or portable electronic devices such as notebook, netbook, or laptop computing devices, remote controllers, headsets, MP3 players and other portable video and audio players, global positioning navigation devices, and even other electronic devices, including a wide variety of devices that can utilize or benefit from control based upon the sensed presence of one or more external objects (e.g., electronic displays, kiosks, ATMs, vending machines, vehicles, etc.). Further included among the components of the mobile device 102 as shown in FIG. 1 are a video screen 108, a keypad 110 having numerous keys, and a navigation key cluster (in this case, a "five-way navigation key cluster") 112. Although an electronic compass 105 is included separately with the exemplary embodiment to assist with orientation sensing, in at least some embodiments the acceleration sensor 104 provides orientation sensing without the addition of the electronic compass 105.
[0021] FIG. 2 illustrates example internal components 200 of a mobile device, such as the mobile device 102. This embodiment includes one or more wireless transceivers 202, a processor 204 (e.g., a microprocessor, microcomputer, application-specific integrated circuit, etc.), a memory portion 206, one or more output devices 208, and one or more input devices 210. In at least some embodiments, a user interface component (e.g., a touch screen) is considered both an output device 208 and an input device 210. The internal components 200 can further include a component interface 212 to provide a direct connection to auxiliary components or accessories for additional or enhanced functionality. The internal components 200 preferably also include a power supply 214, such as a battery, for providing power to the other internal components while enabling the mobile device 102 to be portable. As will be described in further detail, the internal components 200 in the present embodiment further include sensors 228 such as the infrared proximity sensor 106, the accelerometer 104, and the electronic compass 105 of FIG. 1. All of the internal components 200 can be coupled to one another, and in communication with one another, by way of one or more internal communication links 232 (e.g., an internal bus).
[0022] Each of the wireless transceivers 202 utilizes a wireless technology for communication, such as, but not limited to, wireless wide area network (WWAN) technologies such as analog communications (using AMPS), digital communications (using CDMA, TDMA, GSM, iDEN, GPRS, EDGE, etc.), and next-generation communications (using UMTS, WCDMA, LTE, IEEE 802.16, etc.) or variants thereof, or peer-to-peer or ad hoc communication technologies such as HomeRF, Bluetooth® and IEEE 802.11 (a, b, g or n), or other wireless communication technologies such as infrared technology. In the present embodiment, the wireless transceivers 202 include both a WWAN transceiver 203 and a wireless personal area network (WPAN) transceiver 205 (which particularly can employ Bluetooth® technology), although in other embodiments only one of these types of wireless transceivers (and possibly neither of these types of wireless transceivers, and/or other types of wireless transceivers) is present. Also, the number of wireless transceivers can vary and, in some embodiments, only one wireless transceiver is present; further, depending upon the embodiment, each wireless transceiver 202 can include both a receiver and a transmitter, or only one or the other of those devices.
[0023] Exemplary operation of the wireless transceivers 202 in conjunction with others of the internal components 200 of the mobile device 102 can take a variety of forms and can include, for example, operation in which, upon reception of wireless signals, the internal components detect communication signals and the transceiver 202 demodulates the communication signals to recover incoming information, such as voice and/or data, transmitted by the wireless signals. After receiving the incoming information from the transceiver 202, the processor 204 formats the incoming information for the one or more output devices 208. Likewise, for transmission of wireless signals, the processor 204 formats outgoing information, which may or may not be activated by the input devices 210, and conveys the outgoing information to one or more of the wireless transceivers 202 for modulation to communication signals. Depending upon the embodiment, the wireless transceiver(s) 202 can convey the modulated signals to, or receive modulated signals from, a remote device, such as a cell tower, an access point, or a remote server (not shown), and/or from another mobile device that is located remotely (including, for example, in the case where two mobile devices are in communication via a Bluetooth® link).
[0024] Depending upon the embodiment, the output devices 208 of the internal components 200 can include a variety of visual, audio, and/or mechanical output devices. For example, the output device(s) 208 can include a visual output device 216 such as a liquid crystal display and light emitting diode indicator, an audio output device 218 such as a speaker, alarm and/or buzzer, and/or a mechanical output device 220 such as a vibrating mechanism. The visual output devices 216, among other things, can include the video screen 108 of FIG. 1.
[0025] Likewise, the input devices 210 can take a variety of forms. For example, the input devices 210 can include a visual input device 222 such as an optical sensor (for example, a camera), an audio input device 224 such as a microphone, and a mechanical input device 226 such as a flip sensor, keyboard, keypad, selection button, touch pad, touchscreen, capacitive sensor, or motion sensor. The mechanical input device 226 can also in particular include, among other things, the keypad 110 and the navigation key cluster 112 of FIG. 1. Actions that can actuate one or more of the input devices 210 can further include, but need not be limited to, opening the mobile device, unlocking the device, moving the device to actuate a motion, moving the device to actuate a location positioning system, and otherwise operating the device.
[0026] In at least some circumstances, the sensors 228 are considered as input devices 210. In particular, as shown, the sensors 228 can include both proximity sensors 229 and other sensors 231. As will be described in further detail, the proximity sensors 229 can include, among other things, one or more sensors such as the infrared proximity sensor 106 of FIG. 1 by which the mobile device 102 is able to detect the presence (or passing) of an external object, including portions of the body of a human being such as a hand (not shown). By comparison, the other sensors 231 can include a variety of other types of sensors such as, for example, a variety of circuits and sensors capable of allowing orientation/location determinations (and/or related determinations, such as determinations concerning velocity or acceleration) to be made including, for example, the accelerometer 104 and electronic compass 105 of FIG. 1. In addition, other devices/components, such as a gyroscope or other information collecting device(s) that can identify a current location or orientation of the mobile device 102, can be present depending upon the embodiment.
[0027] The memory portion 206 of the internal components 200 can encompass one or more memory devices of any of a variety of forms (e.g., read-only memory, random access memory, static random access memory, dynamic random access memory, etc.), and can be used by the processor 204 to store and retrieve data. The data that is stored by the memory portion 206 can include, but need not be limited to, operating systems, applications, and informational data. Each operating system includes executable code that controls basic functions of the communication device, such as interaction among the various components included among the internal components 200, communication with external devices via the wireless transceivers 202 and/or the component interface 212, and storage and retrieval of applications and data to and from the memory portion 206. Each application includes executable code that utilizes an operating system to provide more specific functionality for the communication devices, such as file system service and handling of protected and unprotected data stored in the memory portion 206. Informational data is non-executable code or information that can be referenced and/or manipulated by an operating system or application for performing functions of the communication device.
[0028] FIGS. 3-5 depict the mobile device 102 of FIG. 1 in several different contexts. More particularly, FIG. 3 provides a perspective view of the mobile device 102 showing the mobile device in relation to an exemplary coordinate system that in this case is a conventional 3-D coordinate system having X, Y and Z axes that are each perpendicular with respect to one another. In the present embodiment, the accelerometer 104 can be used to measure static acceleration, such as the tilt of the mobile device relative to gravity, as well as dynamic acceleration, such as that resulting from motion, shock, or vibration of the mobile device. This information can be used to provide acceleration, motion, and orientation information for the mobile device. In addition, in conjunction with other information (e.g., information regarding an initial orientation and/or velocity of the mobile device), it can be used to further determine a change in the orientation and/or velocity of the mobile device 102. As one example in this regard, FIG. 4 depicts the mobile device 102 with an exemplary partly-translational, partly-rotational movement. The movement, shown in FIG. 4 particularly, is representative of a common type of movement that can be experienced by the mobile device 102, in which the mobile device 102 is oriented substantially flat on a surface, such as a tabletop 410, countertop, motor vehicle console, etc., and subsequently angled to one side by a user so as to arrive at a second orientation, as represented by a second image of the mobile device 412 (shown in phantom). Upon moving the mobile device to the second orientation, the mobile device 102 can be left there or returned to its original resting orientation. The mobile device 102 can also be held in hand instead of being orientated flat on a surface as it undergoes a similar motion as described above with reference to FIG. 4.
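
As a concrete illustration of static-tilt sensing, the short Python sketch below (illustrative only; it assumes the accelerometer reports the gravity components along the X, Y, and Z axes of FIG. 3 in units of g, and the function name is invented for this example) estimates pitch and roll from a single at-rest reading:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate static tilt from accelerometer readings (in g) along the X, Y,
    and Z axes of FIG. 3, assuming the device is otherwise at rest. Returns
    (pitch, roll) in degrees relative to a flat, face-up orientation."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# A device lying flat on a tabletop reports roughly (0, 0, 1) g:
print(tilt_from_gravity(0.0, 0.0, 1.0))  # -> (0.0, 0.0)
```
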
[0029] In addition to sensing motion, the infrared proximity sensor 106 of the mobile device 102 is capable of sensing an object that is present in proximity to it. As shown in FIG. 5, in one embodiment the infrared proximity sensor 106 operates by transmitting an infrared signal 314 generated by at least one infrared phototransmitter (e.g., a photo-light emitting diode (photo-LED)). An object that is present, such as a hand 516, then reflects portions of the infrared signal 314 to constitute at least one reflected signal (e.g., a reflected signal also proceeding along the same general path as the infrared signal 314). The reflected signal is in turn sensed by at least one photoreceiver (e.g., photodiode), which is also part of the infrared proximity sensor. In some embodiments, it is sufficient for infrared proximity sensing that the infrared proximity sensor 106 have only a single infrared phototransmitter and a single infrared photoreceiver. However, in alternate embodiments a variety of other types of infrared proximity sensor arrangements can be employed including, for example, the use of multiple proximity sensors (each with potentially its own phototransmitter and photoreceiver) positioned at multiple locations on the mobile device, as well as the use of any of a variety of different types of pyramid-type sensing assemblies such as those described in pending U.S. patent application Ser. No. 12/471,062 entitled "Sensing Assembly for Mobile Device" and filed on May 22, 2009, which is hereby incorporated by reference herein. Other types of proximity sensors can also be used such as, but not limited to, ultrasonic, capacitive, inductive, resistive, RF, and camera type image sensors.
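
A minimal sketch of the basic detection idea, not drawn from the specification: proximity can be declared when the reflected infrared amplitude seen at the photoreceiver exceeds a threshold. The threshold value, sample count, and read_photoreceiver callable are assumptions made for illustration:

```python
def object_in_proximity(read_photoreceiver, threshold=0.6, samples=4):
    """Declare that an object (e.g., a hand) is present when the reflected
    infrared amplitude, averaged over a few samples to reject noise, exceeds
    a fixed threshold. read_photoreceiver() is assumed to return a normalized
    amplitude in the range [0.0, 1.0]."""
    average = sum(read_photoreceiver() for _ in range(samples)) / samples
    return average >= threshold
```
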
[0030] As discussed in further detail below, in at least some embodiments it is possible for the mobile device 102 to interface with a user or other mobile device(s) based upon the combination of sensed motion information obtained using the accelerometer 104 and sensed presence/proximity information obtained using the infrared proximity sensor 106. More particularly, in such embodiments, the mobile device 102 can interpret one or more particular sensed motions as being respectively indicative of selecting one or more particular commands. The sensing of the motions does not cause the mobile device 102 to execute the commands corresponding to those motions, but rather causes the mobile device to enter into a state of readiness in which the mobile device is then receptive to trigger signals sensed by way of the infrared proximity sensor 106 or other types of proximity sensors. Thus, when a trigger signal is received, the mobile device executes those commands. The trigger signal can include one or more of numerous signals received from various components, for example, signals from the infrared proximity sensor, a pushbutton, and motion sensing devices. In the present embodiment, upon sensing movement of the mobile device 102 in the manner shown in FIG. 4, the mobile device then becomes ready to execute a command or operation corresponding to that particular movement. The mobile device does not execute that command or operation until it senses the presence of the hand 516 in proximity to (or movement of the hand across) the infrared proximity sensor 106, as represented by FIG. 5. The mode and context of the mobile device 102 can in part aid in interpreting the command, for example, holding the mobile device at an angle and displaying pictures, or holding or laying the mobile device in a stationary horizontal position during a hands-free phone call. Further, the proximity sensor 106 can be utilized to sense a temporarily-present object (e.g., an object passing by) or a relatively stable presence of an object, with the length of presence being indicated by the amount of time the proximity sensor 106 provides a "high" sensing signal. In addition, this sensing can be used to direct the execution of different commands based on the time duration.
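
One compact way to picture this behavior is as a small arm-then-trigger state machine: a recognized motion arms the device with a preselected command, and a later proximity trigger executes it, with the duration of the "high" proximity signal optionally selecting between command variants. The Python sketch below is an illustration only; the class and method names, the 0.5-second duration cutoff, and the example commands are assumptions rather than details from the specification:

```python
class GestureCommandController:
    """Minimal arm-then-trigger controller: a recognized motion selects a
    command, and a later proximity trigger executes it."""

    def __init__(self, motion_to_command):
        # motion_to_command maps a motion name to a pair of callables:
        # (command for a brief wave, command for a longer presence).
        self.motion_to_command = motion_to_command
        self.armed_command = None

    def on_motion_recognized(self, motion_name):
        """Enter the state of readiness; nothing is executed yet."""
        self.armed_command = self.motion_to_command.get(motion_name)

    def on_proximity_trigger(self, high_duration_s):
        """Execute the armed command when the trigger signal arrives; the
        duration of the 'high' proximity signal selects between variants."""
        if self.armed_command is None:
            return
        short_cmd, long_cmd = self.armed_command
        (short_cmd if high_duration_s < 0.5 else long_cmd)()
        self.armed_command = None  # return to the unarmed state


# Example: a slight tilt arms "turn page"; a brief wave then executes it.
controller = GestureCommandController({
    "tilt_right": (lambda: print("next page"), lambda: print("last page")),
})
controller.on_motion_recognized("tilt_right")
controller.on_proximity_trigger(high_duration_s=0.2)  # prints "next page"
```
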
[0031] FIG. 6 shows one exemplary manner of interfacing with the mobile device 102 to initiate a command. As shown, after starting the operation at step 602, at step 604 the mobile device 102 is placed in a substantially motionless orientation for a preset amount of time, as detected by a lack of significant acceleration in any direction. As this occurs, the processor 204 (see FIG. 2) of the mobile device 102 senses the relative motionlessness (i.e., motion is less than a predetermined threshold). In at least some cases, the processor 204 at this time is also able to determine the current coordinates (e.g., X, Y, and Z coordinates along the coordinate axes X, Y, and Z of FIG. 3) of the device and store them in the memory 206 (see FIG. 2). Such determinations can be made using only the accelerometer 104, assuming that the mobile device 102 is continuously using the accelerometer 104 to detect and record ongoing movement of the mobile device over time relative to an initial starting location (the coordinates of which can be set by a user or preset at a factory), and/or using accelerometer information in conjunction with other information such as that provided by a GPS receiver.
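
The steps described in this paragraph and the next (detecting relative motionlessness, tracking the subsequent motion, and comparing it to a list of predefined motions) can be sketched as follows. This is an illustration only; the motionlessness threshold, the representation of a motion as a change in X, Y, Z orientation, and the matching tolerance are assumptions:

```python
import math

MOTIONLESS_THRESHOLD_G = 0.05  # assumed threshold for "relative motionlessness"

def is_motionless(recent_samples):
    """True when every recent acceleration sample (gravity already removed)
    has a magnitude below the predetermined threshold."""
    return all(math.sqrt(ax * ax + ay * ay + az * az) < MOTIONLESS_THRESHOLD_G
               for ax, ay, az in recent_samples)

def match_motion(tracked_delta, predefined_motions, tolerance=0.2):
    """Compare the tracked change in X, Y, Z orientation between two resting
    positions against a list of predefined motions; return the name of the
    closest match within tolerance, or None if nothing matches."""
    best_name, best_error = None, tolerance
    for name, template in predefined_motions.items():
        error = math.dist(tracked_delta, template)
        if error < best_error:
            best_name, best_error = name, error
    return best_name
```
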
[0032] The relative motionlessness of the mobile device 102 as sensed at step 604 in the present embodiment serves as a cue to the mobile device that the mobile device is potentially about to be moved in a manner that signifies that a command selection is forthcoming. Thus, at step 606, the mobile device 102 is waiting to sense motion, and at step 608 a determination is made by the mobile device 102 as to whether motion is detected. If motion is not detected at step 608, then the process returns to step 604. Alternatively, if motion is detected at step 608, the process advances to a step 610, at which the processor 204 uses the accelerometer 104 to track the motion of the mobile device 102 during movement. In some cases, the processor 204 actually determines the specific orientation variation experienced by the mobile device 102, by calculation using double integration, assuming a known starting orientation, regardless of whether the device is in hand or resting on a surface. In other cases, the processor 204 merely monitors the variation in acceleration experienced by the mobile device 102 and sensed by the accelerometer 104 (also, in some embodiments, velocity can be specifically determined/monitored, as in the case where the mobile device starting orientation is not stationary). Still in other cases, the electronic compass 105 can be used to supplement the acceleration sensor 104 by further monitoring the motion of the mobile device 102. The motion is tracked until the mobile device becomes relatively stationary (i.e., motion sensed is less than a predetermined threshold). To the extent that the mobile device 102 is located within a moving vehicle (e.g., on the dashboard of a car), in some embodiments further adjustments to the above process can be made to take into account the motion of
