US008676224B2

(12) United States Patent                      (10) Patent No.:     US 8,676,224 B2
     Louch                                     (45) Date of Patent: Mar. 18, 2014

(54) SPEAKERPHONE CONTROL FOR MOBILE DEVICE

(75) Inventor: John O. Louch, San Luis Obispo, CA (US)

(73) Assignee: Apple Inc., Cupertino, CA (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 870 days.

(21) Appl. No.: 12/033,706

(22) Filed: Feb. 19, 2008

(65) Prior Publication Data

     US 2009/0209293 A1       Aug. 20, 2009

(51) Int. Cl.
     H04B 7/00     (2006.01)
     H04W 24/00    (2009.01)
     H04M 1/00     (2006.01)

(52) U.S. Cl.
     USPC .......... 455/456.1; 455/41.2; …

(58) Field of Classification Search
     CPC ................................... H04M 1/6066
     USPC .......... 455/569.1, 575.1, 90.1, 41.2–41.3, 455/66.1, 67.11, 456.1
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     5,224,151 A *   6/1993   Bowen et al. ........... 455/569.1
     5,379,057 A     1/1995   Clough et al.
     5,428,805 A     6/1995   Morgan
     5,675,362 A    10/1997   Clough et al.
     5,712,911 A *   1/1998   Her .................. 379/388.01
     6,411,828 B1 *  6/2002   Lands et al. ........... 455/569.1
     6,434,371 B1 *  8/2002   Claxton ................ 455/90.1
     6,449,363 B1 *  9/2002   Kielsnia ............. 379/420.01
     6,677,932 B1    1/2004   Westerman
     6,751,446 B1 *  6/2004   Kim et al. ............. 455/90.1
     6,771,768 B2 *  8/2004   Dietz et al. ......... 379/387.01
     6,853,850 B2 *  2/2005   Shim et al. ........... 455/550.1
     6,993,366 B2 *  1/2006   Kim ................... 455/569.1
     7,239,900 B2 *  7/2007   Choi et al. ........... 455/575.3
     7,260,422 B2 *  8/2007   Knoedgen .............. 455/569.1
     7,263,373 B2 *  8/2007   Mattisson ............. 455/456.3
     7,400,316 B2    7/2008   Appleyard et al.
     7,493,573 B2    2/2009   Wagner
     7,499,686 B2 *  3/2009   Sinclair et al. ......... 455/223
     7,696,905 B2 *  4/2010   Ellenby et al. .......... 340/974
     7,697,962 B2 *  4/2010   Cradick et al. ......... 455/569.1
     7,774,029 B2 *  8/2010   Lee et al. .............. 455/566
     7,920,696 B2 *  4/2011   Chew ................. 379/388.02
     8,099,124 B2 *  1/2012   Tilley ................ 455/550.1
 2004/0198332 A1 * 10/2004   Lundsgaard .............. 455/417
 2005/0154798 A1    7/2005   Nurmi
 2005/0216867 A1    9/2005   Marvit et al.
 2005/0219228 A1 * 10/2005   Alameh et al. ........... 345/173
 2007/0283264 A1   12/2007   Vau et al.
 2008/0034321 A1    2/2008   Griffin
 2008/0146289 A1 *  6/2008   Korneluk et al. ........ 455/569.1
 2008/0188273 A1 *  8/2008   You ................... 455/575.3
 2008/0220715 A1 *  9/2008   Sinha et al. .............. 455/1
 2008/0280640 A1 * 11/2008   Wedel et al. .......... 455/556.1
 2009/0024943 A1    1/2009   Adler et al.
 2009/0031257 A1    1/2009   Arneson et al.
 2009/0100384 A1    4/2009   Louch

     * cited by examiner

Primary Examiner — Simon Nguyen

(57)                      ABSTRACT

A speakerphone system integrated in a mobile device is automatically controlled based on the current state of the mobile device. In one implementation, the mobile device is controlled based on an orientation or position of the mobile device. In another implementation, the control of the speakerphone includes automatically controlling one or more graphical user interfaces associated with the speakerphone system.

                  30 Claims, 5 Drawing Sheets

APPLE 1011
U.S. Patent          Mar. 18, 2014          Sheet 1 of 5          US 8,676,224 B2

[Drawing sheet: FIG. 1A and FIG. 1B, showing the mobile device and the table top 140.]
`

`

`U.S. Patent
`
`Mar.18, 2014
`
`Sheet 2 of 5
`
`US8,676,224 B2
`
`oCos“
`;
`
`;
`
`poessesees,*
`
`LN,274
`
`272
`
`Accellerometer
`
`Proximity
`Sensor
`
`100 268
`|(Ds) (B)
`(@)Iy*
`
`ee| 5 202
`
`
`_ Carrier @=42:34 PM 0
`
`—-— 4.
`
`206
`
`Text
`
`Calendar
`
`Photos
`
`Camera
`
`Calculator
`
`Stocks
`
`Weather
`
`Maps
`
`Clock
`
`Address Book
`
`Settings
`
`270
`
`Ambient Light
`Sensor
`
` 122 290
`
`
`124
`Microphone
`Loudspeaker
`
`FIG. 2
`
`3
`
`

`

U.S. Patent          Mar. 18, 2014          Sheet 3 of 5          US 8,676,224 B2

[Drawing sheet: FIG. 3, a block diagram of implementation 300. Memory 350 holds operating system instructions 352, communication instructions, GUI instructions 356, sensor processing instructions 358, phone instructions 360, electronic messaging instructions, web browsing instructions 364, media processing instructions, GPS/navigation instructions 368, camera instructions, other software instructions, GUI adjustment instructions 373, audio management instructions, and an activation record/IMEI 374. A memory interface 302, processor(s) 304, and peripherals interface 306 connect to a motion sensor 310, light sensor 312, proximity sensor 314, hardware connection sensor 318, gripping sensor 319, time sensor, and other sensor(s) 316; a camera subsystem 320 with optical sensor 322; wireless communication subsystem(s) 324; an audio subsystem 326 with loudspeaker 124 and microphone 122; and an I/O subsystem 340 with touch-screen controller 342 and other input controller(s) 344, coupled to a touch screen 346 and other input/control devices.]
`

U.S. Patent          Mar. 18, 2014          Sheet 4 of 5          US 8,676,224 B2

[Drawing sheet: FIG. 4, flow diagram of process 400:
  Use sensor(s) on mobile device to determine current state of mobile device (410);
  Determine control action(s) associated with current state(s) of the mobile device (420);
  Automatically implement control action(s) (430).]
`

U.S. Patent          Mar. 18, 2014          Sheet 5 of 5          US 8,676,224 B2

FIG. 5 — Mapping States to Control Actions

State of the Mobile Device                                    Control Action to the Speakerphone
The mobile device orienting towards / away from               Deactivating / Activating speakerphone
  a voice source
The mobile device in proximity / within a distance           Deactivating / Activating speakerphone
  to a voice source
The mobile device being gripped / released by a user         Deactivating / Activating speakerphone
The mobile device being disconnected / connected             Deactivating / Activating speakerphone
  to hardware device(s)
The mobile device at an angle more horizontal /              Deactivating / Activating speakerphone
  perpendicular with the ground
The mobile device receiving higher / lower volume            Deactivating / Activating speakerphone
  from a voice source
The mobile device receiving input / not receiving input      Deactivating / Activating speakerphone
  from user interface
`

RELATED APPLICATION

This subject matter is related to U.S. patent application Ser. No. 11/937,463, for "Variable Device Graphical User Interface," filed Nov. 8, 2007, the subject matter of which is incorporated by reference herein in its entirety.
TECHNICAL FIELD

This subject matter is generally related to mobile devices.

BACKGROUND
Modern mobile devices (e.g., mobile phones, media players) often include a speakerphone system. The speakerphone system, which typically includes a loudspeaker and a microphone integrated into the mobile device, can free the user's hands and facilitate multi-party conversations using the mobile device. A typical speakerphone system for a mobile device is controlled by hardware and/or software mechanisms which require the user to make physical contact with the mobile device. When operating the mobile device in hands-free mode, the user must manually activate the speakerphone system to engage in a conversation, and then deactivate the speakerphone system when finished with the conversation. Even if the user is holding the device, the manual steps of activating and deactivating the speakerphone system can be annoying to the user.
SUMMARY

A speakerphone system integrated in a mobile device is automatically controlled based on the current state of the mobile device. In one implementation, the mobile device is controlled based on an orientation or position of the mobile device. In another implementation, the control of the speakerphone includes automatically controlling one or more graphical user interfaces associated with the speakerphone system.
DESCRIPTION OF DRAWINGS

FIGS. 1A and 1B illustrate an example speakerphone system for a mobile device which can be managed based on a current state of the mobile device.

FIG. 2 illustrates a graphical user interface for the example mobile device.

FIG. 3 is a block diagram of an example implementation of the mobile device of FIG. 2.

FIG. 4 illustrates an example process for managing a mobile device's speakerphone system based on a current state of the mobile device.

FIG. 5 illustrates an example data structure for mapping mobile device states to speakerphone control actions.
DETAILED DESCRIPTION

Speakerphone System Overview

FIGS. 1A and 1B illustrate an example speakerphone system for a mobile device 100 which can be managed based on a current state of the mobile device 100. The mobile device 100 can be, for example, a handheld computer, a personal digital assistant, a cellular telephone, a network appliance, a camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a network base station, a media player, a navigation device, an email device, a game console, or a combination of any two or more of these data processing devices or other data processing devices. The mobile device 100 can have a speakerphone system associated with the mobile device 100, e.g., an integrated speaker in the mobile device 100, or an external speaker wirelessly connected to the mobile device 100. The speakerphone system can include a microphone 122 which can be arranged on a front or back side of the mobile device 100 to facilitate hands-free operation during a telephone conversation or playback of audio content (e.g., music, voicemail). In various implementations, the speakerphone system also includes a loudspeaker 124 to deliver and/or amplify voice to reach a user of the mobile device 100.

The speakerphone system can be managed based on the current state of the mobile device 100. In some implementations, the current state of the mobile device 100 can be determined using a reference frame and one or more sensors (e.g., accelerometer, gyro, light sensor, proximity sensor) integrated into the mobile device 100. A "state" can be an orientation or position of the device with respect to the reference frame. For example, when the mobile device 100 is laid flat on a surface (e.g., the x-y plane) of a table top 140, the speakerphone system can be activated to allow hands-free operation. If the mobile device 100 is lying flat on a surface, an assumption can be made that the user intends to use the speakerphone system for hands-free operation. Similarly, if the mobile device 100 is oriented substantially vertical (in the z-plane), an assumption can be made that the mobile device 100 is temporarily fixed in a docking or recharging device (as shown in FIG. 1A), and the speakerphone system can be activated to allow hands-free operation while docked and/or while recharging.

A "state" of the mobile device 100 can also be determined based solely on sensor inputs. For example, one or more sensors (e.g., proximity sensor 268) on the front and/or back sides of the mobile device 100 can indicate a state of the mobile device 100. For example, if a first proximity sensor on the back side of the mobile device 100 is triggered and a second proximity sensor on the front side of the mobile device 100 is not triggered, then an assumption can be made that the mobile device 100 is lying flat on a surface. Based on this assumption, the speakerphone system can be controlled (e.g., activated) to allow hands-free operation. If the first proximity sensor and/or the second proximity sensor are triggered, then an assumption can be made that the mobile device 100 is being held by the user or is stored (e.g., stored in a bag or case). Based on this assumption, the speakerphone system can be controlled differently (e.g., deactivated).
In some implementations, a processor (e.g., processor 304) in the mobile device 100 can use a state machine to maintain the current state of the mobile device 100. The state machine can track various combinations of inputs which can cause a state change to occur. A control action can then be issued based on the current state of the mobile device 100 as indicated by the state machine. A control action can be activating or deactivating the speakerphone system, generating or adjusting a graphical user interface and/or any other suitable control action.
For example, a first state of the mobile device 100 can be defined by a first proximity sensor on the back side of the mobile device 100 sensing proximity to an object (e.g., a table top surface) and a motion sensor not sensing motion of the mobile device 100 (e.g., acceleration is below a threshold value).
The combination of these sensor inputs can place the state machine of the mobile device 100 into the first state. The first state can exist when the mobile device 100 is lying at rest, face up on a flat surface, for example. The control action can be activating the speakerphone system and adjusting the volume of the loudspeaker 124. Another control action can be to generate a graphical user interface, as described in U.S. patent application Ser. No. 11/937,463.

A second state of the mobile device 100 can be defined by the motion sensor sensing motion (e.g., acceleration above a threshold value). Such motion can place the state machine into the second state. The second state can exist when a user has picked up the mobile device 100 from the surface to make a call, for example. The control action can be lowering the volume of the loudspeaker 124. Other control actions are possible.

A third state of the mobile device 100 can be defined by a second proximity sensor located on the front side of the mobile device 100 sensing proximity to an object (e.g., the user's head) and the motion sensor not sensing motion of the mobile device 100 (e.g., acceleration is again below a threshold value). The combination of these sensor inputs can place the state machine of the mobile device 100 into the third state. The third state can exist when the user 110 has raised the mobile device 100 to the user's ear and the mobile device is no longer in motion. The control action can be deactivating the speakerphone system. Other control actions are possible.

It should be understood that any number of states and/or combinations of states can be defined and used to trigger control actions. The state machine can be implemented by a processor of the mobile device 100 (e.g., processor 304). The processor can also determine appropriate control actions based on the current state of the mobile device 100 as determined by the state machine.
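By way of illustration only, the following minimal Python sketch shows how the example states above could be derived from combined sensor inputs and mapped to control actions. The patent does not provide code; the function, state names, and threshold value here are hypothetical.

    # Minimal sketch of the state machine described above. All names and
    # thresholds are hypothetical; the patent does not specify an implementation.

    ACCEL_THRESHOLD = 0.5  # assumed motion threshold, m/s^2

    def classify_state(back_proximity, front_proximity, acceleration):
        """Map a combination of sensor inputs to one of the example states."""
        moving = acceleration > ACCEL_THRESHOLD
        if moving:
            return "PICKED_UP"        # second state: device in motion
        if front_proximity:
            return "AT_EAR"           # third state: near the user's head
        if back_proximity:
            return "FACE_UP_AT_REST"  # first state: flat on a surface
        return "UNKNOWN"

    # Control actions issued on entering each state, per the examples above.
    CONTROL_ACTIONS = {
        "FACE_UP_AT_REST": "activate speakerphone; adjust loudspeaker volume",
        "PICKED_UP": "lower loudspeaker volume",
        "AT_EAR": "deactivate speakerphone",
    }

    state = classify_state(back_proximity=True, front_proximity=False,
                           acceleration=0.0)
    print(state, "->", CONTROL_ACTIONS.get(state, "no action"))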
Example Mobile Device

FIG. 2 illustrates a graphical user interface for an example mobile device. As described in reference to FIG. 1, the mobile device 100 typically includes a built-in microphone 122 and loudspeaker 124. In some implementations, an up/down button 284 for volume control of the loudspeaker 124 and the microphone 122 can be included. The mobile device 100 can also include an on/off button 282 for a ring indicator of incoming phone calls. An audio jack 266 can also be included for use of headphones and/or a microphone.

In addition, as shown in FIG. 2, the mobile device 100 can include a display 202, which, in some implementations, is touch-sensitive. The touch-sensitive display 202 can implement liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or some other display technology. The touch-sensitive display 202 can be sensitive to haptic and/or tactile contact with a user.

In some implementations, the mobile device 100 can display one or more graphical user interfaces on the touch-sensitive display 202 for providing user access to various system objects and for conveying information to a user. In some implementations, the graphical user interfaces can include one or more display objects, e.g., 204 and 206. In the example shown, the display objects 204 and 206 are graphic representations of system objects. Some examples of system objects include device functions, applications, windows, files, alerts, events, or other identifiable system objects.

In some implementations, a proximity sensor 268 can be included to determine the current state of the mobile device 100 by detecting the user 110 positioning the mobile device 100 proximate to the user's ear, as described in reference to FIG. 1.
In some implementations, the graphical user interface can be resized to reduce the graphical representations of display objects 204 and 206, e.g., graphical icons, and their corresponding touch areas (e.g., areas on the touch-sensitive display where a touch on the display 202 selects the graphical icons). In various implementations, an ambient light sensor 270 can also be used to determine the current state of the device. For example, the ambient light sensor 270 can sense when the mobile device 100 has been stored away. This sensor input can be used alone or in combination with other sensor inputs to determine the current state of the mobile device 100.

In some implementations, the microphone 122 can be used as a volume sensor which can detect the user's voice volume.
For example, when the volume level from the voice source exceeds a default value, an assumption can be made that the user is speaking directly into the microphone 122 while holding the mobile device 100 to their ear, resulting in the speakerphone system being deactivated, for example.
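By way of illustration, a hedged sketch of this voice-volume check follows; the RMS measure and the default value are assumptions of the sketch, not taken from the patent.

    import math

    # Sketch: use the microphone as a volume sensor. When the captured level
    # exceeds a default value, assume the handset is at the user's ear and
    # deactivate the speakerphone. Threshold and scaling are illustrative.

    DEFAULT_LEVEL = 0.2  # assumed normalized RMS threshold

    def rms(samples):
        """Root-mean-square level of normalized audio samples in [-1, 1]."""
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def speakerphone_enabled(samples):
        # Loud, close speech suggests handset use, so the speakerphone is off.
        return rms(samples) <= DEFAULT_LEVEL

    print(speakerphone_enabled([0.01, -0.02, 0.015]))  # quiet input: True
    print(speakerphone_enabled([0.6, -0.55, 0.7]))     # loud input: False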
In some implementations, the ambient light sensor 270 can be utilized to facilitate adjusting the brightness of the display 202, and an accelerometer 272 can be utilized to detect movement of the mobile device 100, as indicated by the directional arrow 274. Accordingly, the speakerphone system and a graphical user interface can be adjusted according to a detected orientation of the mobile device 100.
In some implementations, the mobile device 100 includes circuitry and sensors for supporting a location determining capability, such as that provided by the global positioning system (GPS) or other positioning systems (e.g., systems using Wi-Fi access points, television signals, cellular grids, Uniform Resource Locators (URLs)). In some implementations, a positioning system (e.g., a GPS receiver) can be integrated into the mobile device 100 or provided as a separate device that can be coupled to the mobile device 100 through an interface (e.g., port device 290) to provide access to location-based services. In some implementations, the mobile device 100 includes a gyroscopic sensor or other sensors that can be used to detect motion or orientation of the device with respect to a reference frame.
In some implementations, positioning sensors (e.g., an accelerometer 272) can be used to compute an instantaneous coordinate frame of the mobile device 100. For example, when the mobile device 100 is lying flat on a surface, an instantaneous coordinate frame centered on the mobile device 100 can be computed. For example, the z-axis can be perpendicular to the surface which can lie in the x-y plane in a right-handed coordinate system, as shown in FIG. 1A. If the user 110 moves the mobile device 100 to the position and orientation shown in FIG. 1B, then a trajectory for the mobile device 100 can be determined from the change in coordinates of the mobile device 100. For example, in reference to FIGS. 1A and 1B, the mobile device's 100 coordinate frame in FIG. 1A rotates by about ninety degrees with respect to the z-axis to change to a coordinate frame in FIG. 1B while the user 110 is holding the mobile device 100. Accordingly, the speakerphone system can be controlled according to the detected change of coordinate frames.
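By way of illustration, one common way to detect such a change of coordinate frames is to estimate the device's tilt from the accelerometer's gravity reading. The sketch below assumes that approach; the patent does not spell out a formula, and the threshold is invented.

    import math

    # Sketch: angle between the device z-axis and gravity, from an
    # accelerometer reading (ax, ay, az). Roughly 0 degrees lying flat
    # (FIG. 1A) and roughly 90 degrees held upright (FIG. 1B).

    def tilt_degrees(ax, ay, az):
        g = math.sqrt(ax * ax + ay * ay + az * az)
        return math.degrees(math.acos(az / g))

    flat = tilt_degrees(0.0, 0.0, 9.81)     # lying face up on the table top
    upright = tilt_degrees(0.0, 9.81, 0.0)  # raised toward the user's ear

    # Treat a roughly ninety-degree change between readings as the
    # FIG. 1A -> FIG. 1B transition and control the speakerphone accordingly.
    if abs(upright - flat) > 80.0:  # hypothetical threshold
        print("coordinate frame rotated ~90 degrees: deactivate speakerphone")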
In some implementations, one or more sensors (e.g., a pressure sensor, temperature sensor) for detecting when a user is holding or gripping the mobile device 100 can be integrated into a housing of the mobile device 100. These sensors can detect when the mobile device 100 is gripped by a user, for example, by detecting a pressure exerted upon the body of the mobile device 100 or a partial temperature change (e.g., deviation from an ambient temperature) on the mobile device 100.
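By way of illustration, a hedged sketch of grip detection from such pressure and temperature readings follows; both thresholds are invented for the sketch.

    # Sketch: infer a grip from pressure on the housing or a local deviation
    # from ambient temperature. Units and thresholds are illustrative only.

    PRESSURE_THRESHOLD = 5.0  # assumed grip pressure, arbitrary units
    TEMP_DEVIATION_C = 2.0    # assumed deviation from ambient, degrees C

    def is_gripped(pressure, surface_temp, ambient_temp):
        """Return True if the housing sensors suggest the device is held."""
        return (pressure > PRESSURE_THRESHOLD or
                abs(surface_temp - ambient_temp) > TEMP_DEVIATION_C)

    print(is_gripped(pressure=7.5, surface_temp=24.0, ambient_temp=23.0))  # True
    print(is_gripped(pressure=0.0, surface_temp=23.2, ambient_temp=23.0))  # False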
In some implementations, the mobile device 100 can include a touch sensor, which detects a user entering input via the graphical user interface, resulting in the speakerphone system being activated, for example. The user input can be received by the mobile device 100 from the user touching the touch-sensitive display 202, or from the user touching a keypad or a like device (not shown) associated with the mobile device 100.
In some implementations, the mobile device 100 can include a time sensor (e.g., using the internal clock of the mobile device 100), which detects a duration for a certain state (e.g., position or orientation) of the mobile device 100. The detected duration can be used to determine if a control action will be triggered, to prevent overly frequent, unnecessary responses to each state change. By way of illustration, if the state change does not exceed a certain amount of time, e.g., five seconds, an assumption can be made that the state change is temporary, and therefore no control action will be triggered in response. By contrast, if the state change lasts longer than five seconds, an assumption can be made that the state change will remain for a longer period, and thus a control action can be triggered accordingly.
The decision whether to trigger a corresponding control action can also be made upon detection of time in combination with a transition distance of the mobile device 100, to enhance accuracy of the state determination. For example, in FIGS. 1A and 1B, if the mobile device 100 has been raised by the user 110 by twenty feet, for an interval exceeding five seconds, an assumption can be made that the user intends to use the handset for the telephone conversation. Accordingly, the speakerphone system can be deactivated in response to the assumption. Otherwise, the speakerphone system can remain unchanged until the state change is greater than a certain amount of time or distance.
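By way of illustration, the five-second rule described above can be sketched as a simple debouncer; the class and its names are hypothetical, and a transition-distance check could be added in the same way.

    import time

    # Sketch of the duration check described above: a state change triggers
    # a control action only after it has persisted for a minimum time.

    MIN_DURATION_S = 5.0

    class StateDebouncer:
        def __init__(self):
            self._state = None
            self._since = None

        def update(self, new_state, now=None):
            """Feed the latest detected state; return it once it has been
            held for MIN_DURATION_S, otherwise return None."""
            now = time.monotonic() if now is None else now
            if new_state != self._state:
                self._state, self._since = new_state, now
                return None  # treat the change as temporary for now
            if now - self._since >= MIN_DURATION_S:
                return self._state  # stable: safe to trigger a control action
            return None

    deb = StateDebouncer()
    deb.update("AT_EAR", now=0.0)         # change detected, no action yet
    print(deb.update("AT_EAR", now=6.0))  # held > 5 s -> "AT_EAR"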
In some implementations, a port device 290, e.g., a Universal Serial Bus (USB) port, or a docking port, or some other wired port connection, can be included. The port device 290 can, for example, be utilized to establish a wired connection to other computing devices, such as other communication devices, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving and/or transmitting data. In some implementations, the port device 290 allows the mobile device 100 to synchronize with a host device using one or more protocols, such as, for example, TCP/IP, HTTP, UDP, and any other known protocol.

In some implementations, the mobile device 100 can have hardware connection sensors that detect whether the mobile device 100 is connected to any hardware devices via the port device 290. When the mobile device 100 is connected to hardware devices (e.g., a docking station or re-charger), it is more likely than not that a user of the mobile device 100 is not holding the handset, and thus the speakerphone system (e.g., the speaker volume and/or microphone sensitivity) and graphical user interface can be adjusted accordingly.
The mobile device 100 can also include a camera lens and sensor 280. In some implementations, the camera lens and sensor 280 can be located on the back surface of the mobile device 100. The camera can capture still images and/or video. In some implementations, the images captured by the camera can be used to measure proximity to a user or to determine whether the mobile device 100 is held by the user, and the speakerphone system and graphical user interface can be activated or adjusted accordingly.
The mobile device 100 can also include one or more wireless communication subsystems, such as an 802.11b/g communication device 186, and/or a Bluetooth™ communication device 188. Other communication protocols can also be supported, including other 802.x communication protocols (e.g., WiMax, Wi-Fi, 3G), code division multiple access (CDMA), global system for mobile communications (GSM), Enhanced Data GSM Environment (EDGE), etc.
Example Mobile Device Architecture

FIG. 3 is a block diagram of an example implementation 300 of the mobile device 100 of FIG. 1. The mobile device 100 can include a memory interface 302, one or more data processors, image processors and/or central processing units 304, and a peripherals interface 306. The memory interface 302, the one or more processors 304 and/or the peripherals interface 306 can be separate components or can be integrated in one or more integrated circuits. The various components in the mobile device 100 can be coupled by one or more communication buses or signal lines.
Sensors, devices, and subsystems can be coupled to the peripherals interface 306 to facilitate multiple functionalities. For example, a motion sensor 310, a light sensor 312, and a proximity sensor 314 can be coupled to the peripherals interface 306 to facilitate the orientation, lighting, and proximity functions described with respect to FIG. 2. A hardware connection sensor 318 can be coupled to the peripherals interface 306 to facilitate determining a state of connecting the mobile device 100 to any hardware, e.g., a docking station, a charger, a personal computer, etc. A gripping sensor 319 can be coupled to the peripherals interface 306 to determine if the mobile device 100 is being gripped. In various implementations, a gripping sensor can include a temperature sensor and/or a pressure sensor. Further, a touch sensor 321 can be coupled to the peripherals interface 306 to detect if a user is touching a user input interface, e.g., a touch screen or a keypad. A time sensor 323 can also be coupled to the peripherals interface 306 to detect a duration of a certain state of the mobile device 100. Other sensors 316 can also be connected to the peripherals interface 306, such as a positioning system (e.g., GPS receiver), a temperature sensor, a biometric sensor, a gyroscope, or other sensing device, to facilitate related functionalities.
A camera subsystem 320 and an optical sensor 322, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.

Communication functions can be facilitated through one or more wireless communication subsystems 324, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 324 can depend on the communication network(s) over which the mobile device 100 is intended to operate. For example, a mobile device 100 may include communication subsystems 324 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystems 324 may include hosting protocols such that the device 100 may be configured as a base station for other wireless devices.

An audio subsystem 326 can be coupled to a loudspeaker 124 and microphone 122 to facilitate voice-enabled functions, for example, hands-free functionalities, voice recognition, voice replication, digital recording, and telephony functions.
The I/O subsystem 340 can include a touch screen controller 342 and/or other input controller(s) 344. The touch-screen controller 342 can be coupled to a touch screen 346. The touch screen 346 and touch screen controller 342 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 346.

The other input controller(s) 344 can be coupled to other input/control devices 348, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 126 and loudspeaker 124 and/or the microphone 122.
In some implementations, the mobile device 100 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the mobile device 100 can include the functionality of an MP3 player, such as an iPod™. The mobile device 100 may, therefore, include a 36-pin connector that is compatible with the iPod. Other input/output and control devices can also be used.

The memory interface 302 can be coupled to memory 350. The memory 350 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 350 can store an operating system 352, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 352 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 352 can be a kernel (e.g., UNIX kernel).

The memory 350 may also store communication instructions 354 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 350 may include graphical user interface instructions 356 to facilitate graphic user interface processing; sensor processing instructions 358 to facilitate sensor-related processing and functions; phone instructions 360 to facilitate phone-related processes and functions; electronic messaging instructions 362 to facilitate electronic-messaging related processes and functions; web browsing instructions 364 to facilitate web browsing-related processes and functions; media processing instructions 366 to facilitate media processing-related processes and functions; GPS/navigation instructions 368 to facilitate GPS and navigation-related processes and instructions; camera instructions 370 to facilitate camera-related processes and functions; GUI adjustment instructions 373 to facilitate adjustment of graphical user interfaces and user interface elements in response to sensor data; and/or other software instructions 372 to facilitate other processes and functions.

In addition, the memory 350 can store audio management instructions 376 to facilitate functions managing the audio subsystem, including the loudspeaker 124 and the microphone 122. In some implementations, the audio management instructions 376 are operable to toggle the speakerphone system and adjust speaker volume and/or microphone sensitivity, in response to the sensor processing instructions 358.

The memory 350 may also store other software instructions; in some implementations, the media processing instructions 366 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. An activation record and International Mobile Equipment Identity (IMEI) 374 or similar hardware identifier can also be stored in memory 350.

Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. The memory 350 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device 100 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.

Example Process of Controlling Speakerphone System

FIG. 4 illustrates an example process 400 for managing a mobile device's speakerphone system based on a current state of the mobile device 100. For convenience, the process 400 is described below in reference to FIGS. 1-3 (e.g., a mobile device 100, a speakerphone system, and other components that perform the process 400).

In some implementations, the process 400 can begin when input from one or more sensors on the mobile device are used to determine a current state of the mobile device (410). An example state can be a change of the mobile device's position or orientation relative to a user of the mobile device or a reference frame. One or more control actions associated with the current state can then be determined (420) and automatically implemented (430), as shown in FIG. 4.

FIG. 5 illustrates an example data structure 500 for mapping mobile device states to speakerphone control actions. The mobile device 100 can use the data structure 500 to map the current state to one or more control actions. The states can be determined based on sensor inputs, as described in reference to FIGS. 1-4, and mapped to control actions which can be applied to the speakerphone system, a graphical user interface, and any other feature, peripheral, or application of the mobile device 100.
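By way of illustration, the data structure 500 of FIG. 5 could be represented as a simple lookup table. The sketch below is an assumption of this note, not the patent's implementation, and it includes only state/action pairings that the description above corroborates.

    # Sketch of FIG. 5's mapping of device states to speakerphone control
    # actions, restricted to pairings supported by the description.

    STATE_TO_ACTION = {
        "at_ear_toward_voice_source": "deactivate_speakerphone",
        "face_up_at_rest_on_surface": "activate_speakerphone",
        "gripped_by_user": "deactivate_speakerphone",
        "connected_to_dock_or_charger": "activate_speakerphone",
        "receiving_higher_voice_volume": "deactivate_speakerphone",
        "receiving_user_interface_input": "activate_speakerphone",
    }

    def control_action_for(state):
        """Look up the control action mapped to the current device state."""
        return STATE_TO_ACTION.get(state, "no_action")

    print(control_action_for("gripped_by_user"))  # deactivate_speakerphone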
