US008676224B2

(10) Patent No.: US 8,676,224 B2
(45) Date of Patent: Mar. 18, 2014

Louch

(54) SPEAKERPHONE CONTROL FOR MOBILE DEVICE
(75) Inventor: John O. Louch, San Luis Obispo, CA (US)

(73) Assignee: Apple Inc., Cupertino, CA (US)
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 870 days.
(21) Appl. No.: 12/033,706

(22) Filed: Feb. 19, 2008
(65) Prior Publication Data

US 2009/0209293 A1    Aug. 20, 2009
(51) Int. Cl.
H04B 7/00     (2006.01)
H04W 24/00    (2009.01)
H04M 1/00     (2006.01)
(52) U.S. Cl.
USPC .................... 455/456.1; 455/41.2; 455/550.1; 455/575.1
(58) Field of Classification Search
CPC .................................................... H04M 1/6066
USPC ................ 455/569.1, 575.1, 90.1, 41.2-41.3, 455/66.1, 67.11, 456.1
See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS
5,224,151 A *     6/1993   Bowen et al. ............ 455/569.1
5,675,362 A       10/1997  Clough et al.
5,712,911 A *     1/1998   Her ................... 379/388.01
6,411,828 B1 *    6/2002   Lands et al. ............ 455/569.1
6,434,371 B1 *    8/2002   Claxton ................. 455/90.1
6,449,363 B1 *    9/2002   Kleinsmith ............ 379/420.01
6,677,932 B1      1/2004   Westerman
6,751,446 B1 *    6/2004   Kim et al. .............. 455/90.1
6,771,768 B2 *    8/2004   Dietz et al. .......... 379/387.01
6,853,850 B2 *    2/2005   Shim et al. ............. 455/550.1
6,993,366 B2 *    1/2006   Kim ..................... 455/569.1
7,239,900 B2 *    7/2007   Choi et al. ............. 455/575.3
7,260,422 B2 *    8/2007   Knoedgen ................ 455/569.1
7,263,373 B2 *    8/2007   Mattisson ............... 455/456.3
7,400,316 B2      7/2008   Appleyard et al.
7,493,573 B2      2/2009   Wagner
7,499,686 B2 *    3/2009   Sinclair et al. ......... 455/223
7,696,905 B2 *    4/2010   Ellenby et al. .......... 455/569.1
7,697,962 B2 *    4/2010   Cradick et al.
7,774,029 B2 *    8/2010   Lee et al. .............. 455/566
7,920,696 B2 *    4/2011   Chew .................. 379/388.02
8,099,124 B2 *    1/2012   Tilley .................. 455/550.1
2004/0198332 A1 * 10/2004  Lundsgaard .............. 455/417
2005/0154798 A1   7/2005   Nurmi
2005/0216867 A1   9/2005   Marvit et al.
2005/0219228 A1 * 10/2005  Alameh et al. ........... 345/173
2007/0283264 A1   12/2007  Vau et al.
2008/0034321 A1   2/2008   Griffin
2008/0146289 A1 * 6/2008   Korneluk et al. ......... 455/569.1
2008/0188273 A1 * 8/2008   You ..................... 455/575.3
2008/0220715 A1 * 9/2008   Sinha et al. ............ 455/1
2008/0280640 A1 * 11/2008  Wedel et al. ............ 455/556.1
2009/0024943 A1   1/2009   Adler et al.
2009/0031257 A1   1/2009   Arneson et al.
2009/0100384 A1   4/2009   Louch

* cited by examiner
Primary Examiner: Nguyen
(57) ABSTRACT

A speakerphone system integrated in a mobile device is automatically controlled based on the current state of the mobile device. In one implementation, the mobile device is controlled based on an orientation or position of the mobile device. In another implementation, the control of the speakerphone includes automatically controlling one or more graphical user interfaces associated with the speakerphone system.

30 Claims, 5 Drawing Sheets
APPLE 1011
U.S. Patent    Mar. 18, 2014    Sheet 1 of 5    US 8,676,224 B2

[Drawing sheet 1: mobile device 100 and table top 140]

FIG. 1A    FIG. 1B
[Drawing sheet 2 of 5 — FIG. 2: mobile device 100 with touch-sensitive display showing display objects (Text, Calendar, Photos, Camera, Calculator, Stocks, Weather, Maps, Clock, Address Book, Settings) and display object 206; proximity sensor 268; ambient light sensor 270; accelerometer 272; microphone 122; loudspeaker 124; port device 290]

FIG. 2
[Drawing sheet 3 of 5 — FIG. 3: block diagram of example implementation 300. Memory 350 stores operating system instructions 352, communication instructions 354, GUI instructions 356, sensor processing instructions 358, phone instructions 360, electronic messaging instructions 362, web browsing instructions 364, media processing instructions 366, GPS/navigation instructions 368, camera instructions 370, other software instructions 372, GUI adjustment instructions 373, audio management instructions 376, and activation record/IMEI 374. Memory interface 302, processor(s) 304, and peripherals interface 306 couple to: motion sensor 310, light sensor 312, proximity sensor 314, other sensor(s) 316, volume sensor 317, hardware connection sensor 318, gripping sensor 319, touch sensor 321, time sensor 323, camera subsystem 320 with optical sensor 322, wireless communication subsystem(s) 324, and audio subsystem 326 with loudspeaker 124 and microphone 122. I/O subsystem 340 includes touch-screen controller 342, other input controller(s) 344, touch screen 346, and other input/control devices 348]

FIG. 3
[Drawing sheet 4 of 5 — FIG. 4: flow chart of example process 400]

410: Use Sensor(s) on Mobile Device to Determine Current State of Mobile Device
420: Determine Control Action(s) Associated with Current State(s) of the Mobile Device
430: Automatically Implement Control Action(s) on Mobile Device

FIG. 4
[Drawing sheet 5 of 5 — FIG. 5: Mapping States to Control Actions]

State of the Mobile Device                                                  | Control Action to the Speakerphone
The mobile device orienting towards/away from a voice source                | Deactivating/activating speakerphone
The mobile device in proximity/within a distance to a voice source          | Deactivating/activating speakerphone
The mobile device being gripped/released by a user                          | Deactivating/activating speakerphone
The mobile device being disconnected/connected to hardware device(s)        | Deactivating/activating speakerphone
The mobile device at an angle more horizontal/perpendicular with the ground | Deactivating/activating speakerphone
The mobile device receiving higher/lower volume from a voice source         | Deactivating/activating speakerphone
The mobile device receiving input/not receiving input from user interface   | Deactivating/activating speakerphone

FIG. 5
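The mapping of FIG. 5 can be read as a simple lookup from a detected device state to a speakerphone control action. A minimal Python sketch of such a data structure follows; the state names and action strings are illustrative, not taken from the patent.

```python
# Hypothetical encoding of the FIG. 5 mapping: each detectable state
# maps to the speakerphone control action to be issued for it.
STATE_ACTIONS = {
    "orienting_toward_voice_source": "deactivate",
    "orienting_away_from_voice_source": "activate",
    "in_proximity_to_voice_source": "deactivate",
    "beyond_distance_to_voice_source": "activate",
    "gripped_by_user": "deactivate",
    "released_by_user": "activate",
    "disconnected_from_hardware": "deactivate",
    "connected_to_hardware": "activate",
}

def action_for(state: str) -> str:
    """Look up the control action for a detected state; unknown
    states leave the speakerphone unchanged."""
    return STATE_ACTIONS.get(state, "no_change")
```

A table-driven lookup like this keeps the state-to-action policy in data, so additional rows of FIG. 5 can be supported without changing control logic.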
SPEAKERPHONE CONTROL FOR MOBILE DEVICE

RELATED APPLICATION

This subject matter is related to U.S. patent application Ser. No. 11/937,463, for “Variable Device Graphical User Interface,” filed Nov. 8, 2007, the subject matter of which is incorporated by reference herein in its entirety.

TECHNICAL FIELD

This subject matter is generally related to mobile devices.

BACKGROUND

Modern mobile devices (e.g., mobile phones, media players) often include a speakerphone system. The speakerphone system, which typically includes a loudspeaker and a microphone integrated into the mobile device, can free the user’s hands and facilitate multi-party conversations using the mobile device. A typical speakerphone system for a mobile device is controlled by hardware and/or software mechanisms which require the user to make physical contact with the mobile device. When operating the mobile device in hands-free mode, the user must manually activate the speakerphone system to engage in a conversation, and then deactivate the speakerphone system when finished with the conversation. Even if the user is holding the device, the manual steps of activating and deactivating the speakerphone system can be annoying to the user.

SUMMARY

A speakerphone system integrated in a mobile device is automatically controlled based on the current state of the mobile device. In one implementation, the mobile device is controlled based on an orientation or position of the mobile device. In another implementation, the control of the speakerphone includes automatically controlling one or more graphical user interfaces associated with the speakerphone system.
DESCRIPTION OF DRAWINGS
FIGS. 1A and 1B illustrate an example speakerphone system for a mobile device which can be managed based on a current state of the mobile device.

FIG. 2 illustrates a graphical user interface for the example mobile device.

FIG. 3 is a block diagram of an example implementation of the mobile device of FIG. 2.

FIG. 4 illustrates an example process for managing a mobile device’s speakerphone system based on a current state of the mobile device.

FIG. 5 illustrates an example data structure for mapping mobile device states to speakerphone control actions.
DETAILED DESCRIPTION

Speakerphone System Overview

FIGS. 1A and 1B illustrate an example speakerphone system for a mobile device 100 which can be managed based on a current state of the mobile device 100. The mobile device 100 can be, for example, a handheld computer, a personal digital assistant, a cellular telephone, a network appliance, a camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a network base station, a media player, a navigation device, an email device, a game console, or a combination of any two or more of these data processing devices or other data processing devices. The mobile device 100 can have a speakerphone system associated with the mobile device 100, e.g., an integrated speaker in the mobile device 100, or an external speaker wirelessly connected to the mobile device 100. The speakerphone system can include a microphone 122 which can be arranged on a front or back side of the mobile device 100 to facilitate hands-free operation during a telephone conversation or playback of audio content (e.g., music, voicemail). In various implementations, the speakerphone system also includes a loudspeaker 124 to deliver and/or amplify voice to reach a user of the mobile device 100.
The speakerphone system can be managed based on the current state of the mobile device 100. In some implementations, the current state of the mobile device 100 can be determined using a reference frame and one or more sensors (e.g., accelerometer, gyro, light sensor, proximity sensor) integrated into the mobile device 100. A “state” can be an orientation or position of the device with respect to the reference frame. For example, when the mobile device 100 is laid flat on a surface (e.g., the x-y plane) of a table top 140, the speakerphone system can be activated to allow hands-free operation. If the mobile device 100 is laying flat on a surface, an assumption can be made that the user intends to use the speakerphone system for hands-free operation. Similarly, if the mobile device 100 is oriented substantially vertical (in the z-plane), an assumption can be made that the mobile device 100 is temporarily fixed in a docking or recharging device (as shown in FIG. 1A), and the speakerphone system can be activated to allow hands-free operation while docked and/or while recharging.
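The flat-versus-vertical determination described above can be sketched from a three-axis accelerometer reading. In this hypothetical example, the angle between the gravity vector and the device z-axis distinguishes a device laid flat (speakerphone on) from one standing substantially vertical in a dock (also speakerphone on); the function names and the 20-degree tolerance are assumptions, not part of the patent.

```python
import math

def classify_orientation(ax: float, ay: float, az: float,
                         tol_deg: float = 20.0) -> str:
    """Classify device orientation from a gravity vector in g units.

    A z-axis reading close to the full gravity magnitude means the
    device lies flat (face up or down); gravity mostly in the x-y
    plane means the device stands substantially vertical, e.g. docked.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az) or 1.0
    # Angle between the device z-axis and gravity, in degrees.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, abs(az) / g))))
    if tilt < tol_deg:
        return "flat"          # laid on a table top
    if tilt > 90.0 - tol_deg:
        return "vertical"      # docked or recharging
    return "other"

def speakerphone_enabled(orientation: str) -> bool:
    # Both flat and docked-vertical states enable hands-free operation.
    return orientation in ("flat", "vertical")
```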
A “state” of the mobile device 100 can also be determined based solely on sensor inputs. For example, one or more sensors (e.g., proximity sensor 268) on the front and/or back sides of the mobile device 100 can indicate a state of the mobile device 100. For example, if a first proximity sensor on the back side of the mobile device 100 is triggered and a second proximity sensor on the front side of the mobile device 100 is not triggered, then an assumption can be made that the mobile device 100 is laying flat on a surface. Based on this assumption, the speakerphone system can be controlled (e.g., activated) to allow hands-free operation. If the first proximity sensor and/or the second proximity sensor are triggered, then an assumption can be made that the mobile device 100 is being held by the user or is stored (e.g., stored in a bag or case). Based on this assumption, the speakerphone system can be controlled differently (e.g., deactivated).
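The two-proximity-sensor inference above can be sketched as follows; the sensor inputs, state names, and action strings are hypothetical, chosen only to mirror the assumptions in the text.

```python
def infer_placement(back_prox: bool, front_prox: bool) -> str:
    """Infer a device state from back/front proximity readings.

    Back sensor triggered with front clear suggests face up on a
    surface; a triggered front sensor suggests the device is held
    to the ear or stowed in a bag or case.
    """
    if back_prox and not front_prox:
        return "on_surface"
    if front_prox:
        return "held_or_stowed"
    return "unknown"

def control_action(state: str) -> str:
    # Unknown states issue no speakerphone change.
    return {"on_surface": "activate_speakerphone",
            "held_or_stowed": "deactivate_speakerphone"}.get(state, "no_change")
```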
In some implementations, a processor (e.g., processor 304) in the mobile device 100 can use a state machine to maintain the current state of the mobile device 100. The state machine can track various combinations of inputs which can cause a state change to occur. A control action can then be issued based on the current state of the mobile device 100 as indicated by the state machine. A control action can be activating or deactivating the speakerphone system, generating or adjusting a graphical user interface and/or any other suitable control action.
For example, a first state of the mobile device 100 can be defined by a first proximity sensor on the back side of the mobile device 100 sensing proximity to an object (e.g., a table top surface) and a motion sensor not sensing motion of the mobile device 100 (e.g., acceleration is below a threshold value). The combination of these sensor inputs can place the state machine of the mobile device 100 into the first state. The first state can exist when the mobile device 100 is laying at rest, face up on a flat surface, for example. The control action can be activating the speakerphone system and adjusting the volume of the loudspeaker 124. Another control action can be to generate a graphical user interface, as described in U.S. patent application Ser. No. 11/937,463.

A second state of the mobile device 100 can be defined by the motion sensor sensing motion (e.g., acceleration above a threshold value). Such motion can place the state machine into the second state. The second state can exist when a user has picked up the mobile device 100 from the surface to make a call, for example. The control action can be lowering the volume of the loudspeaker 124. Other control actions are possible.

A third state of the mobile device 100 can be defined by a second proximity sensor located on the front side of the mobile device 100 sensing proximity to an object (e.g., the user’s head) and the motion sensor not sensing motion of the mobile device 100 (e.g., acceleration is again below a threshold value). The combination of these sensor inputs can place the state machine of the mobile device 100 into the third state. The third state can exist when the user 110 has raised the mobile device 100 to the user’s ear and the mobile device is no longer in motion. The control action can be deactivating the speakerphone system. Other control actions are possible.

It should be understood that any number of states and/or combinations of states can be defined and used to trigger control actions. The state machine can be implemented by a processor of the mobile device 100 (e.g., processor 304). The processor can also determine appropriate control actions based on the current state of the mobile device 100 as determined by the state machine.
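The three example states above can be sketched as a small state machine driven by boolean sensor inputs per update. The state names, initial state, and action strings are illustrative, not from the patent.

```python
# Transition priority in this sketch: motion wins, then front
# proximity (device at ear), then back proximity (device flat).
ACTIONS = {
    "FLAT": "activate_speakerphone",        # first state
    "MOVING": "lower_loudspeaker_volume",   # second state
    "AT_EAR": "deactivate_speakerphone",    # third state
}

class SpeakerphoneStateMachine:
    def __init__(self) -> None:
        self.state = "FLAT"  # assumed initial state

    def update(self, back_prox: bool, front_prox: bool,
               in_motion: bool) -> str:
        if in_motion:
            self.state = "MOVING"
        elif front_prox:
            self.state = "AT_EAR"
        elif back_prox:
            self.state = "FLAT"
        # No qualifying input combination: keep the current state.
        return ACTIONS[self.state]
```

Walking the pick-up-the-phone scenario through the machine (flat, then moving, then at the ear) yields the activate, lower-volume, and deactivate actions in turn.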
Example Mobile Device
FIG. 2 illustrates a graphical user interface for an example mobile device. As described in reference to FIG. 1, the mobile device 100 typically includes a built-in microphone 122 and loudspeaker 124. In some implementations, an up/down button 284 for volume control of the loudspeaker 124 and the microphone 122 can be included. The mobile device 100 can also include an on/off button 282 for a ring indicator of incoming phone calls. An audio jack 266 can also be included for use of headphones and/or a microphone.

In addition, as shown in FIG. 2, the mobile device 100 can include a display 202, which, in some implementations, is touch-sensitive. The touch-sensitive display 202 can implement liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or some other display technology. The touch-sensitive display 202 can be sensitive to haptic and/or tactile contact with a user.

In some implementations, the mobile device 100 can display one or more graphical user interfaces on the touch-sensitive display 202 for providing user access to various system objects and for conveying information to a user. In some implementations, the graphical user interfaces can include one or more display objects, e.g., 204 and 206. In the example shown, the display objects 204 and 206 are graphic representations of system objects. Some examples of system objects include device functions, applications, windows, files, alerts, events, or other identifiable system objects.

In some implementations, a proximity sensor 268 can be included to determine the current state of the mobile device 100 by detecting the user 110 positioning the mobile device 100 proximate to the user’s ear, as described in reference to FIG. 1. In some implementations, the graphical user interface can be resized to reduce the graphical representations of display objects 204 and 206, e.g., graphical icons, and their corresponding touch areas (e.g., areas on the touch-sensitive display where a touch on the display 202 selects the graphical icons). In various implementations, an ambient light sensor 270 can also be used to determine the current state of the device. For example, the ambient light sensor 270 can sense when the mobile device 100 has been stored away. This sensor input can be used alone or in combination with other sensor inputs to determine the current state of the mobile device 100.

In some implementations, the microphone 122 can be used as a volume sensor which can detect the user’s voice volume. For example, when the volume level from the voice source exceeds a default value, an assumption can be made that the user is speaking directly into the microphone 122 while holding the mobile device 100 to their ear, resulting in the speakerphone system being deactivated, for example.
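The voice-volume heuristic just described reduces to a threshold test. In this sketch the level scale (normalized 0-1) and the default threshold are assumptions, not values from the patent.

```python
DEFAULT_VOLUME_THRESHOLD = 0.7  # illustrative normalized default

def handle_voice_level(level: float,
                       threshold: float = DEFAULT_VOLUME_THRESHOLD) -> str:
    """A microphone level above the default value suggests the user is
    speaking directly into the handset, so the speakerphone is turned
    off; otherwise it stays on."""
    if level > threshold:
        return "deactivate_speakerphone"
    return "activate_speakerphone"
```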
In some implementations, the ambient light sensor 270 can be utilized to facilitate adjusting the brightness of the display 202, and an accelerometer 272 can be utilized to detect movement of the mobile device 100, as indicated by the directional arrow 274. Accordingly, the speakerphone system and a graphical user interface can be adjusted according to a detected orientation of the mobile device 100.

In some implementations, the mobile device 100 includes circuitry and sensors for supporting a location determining capability, such as that provided by the global positioning system (GPS) or other positioning systems (e.g., systems using Wi-Fi access points, television signals, cellular grids, Uniform Resource Locators (URLs)). In some implementations, a positioning system (e.g., a GPS receiver) can be integrated into the mobile device 100 or provided as a separate device that can be coupled to the mobile device 100 through an interface (e.g., port device 290) to provide access to location-based services. In some implementations, the mobile device 100 includes a gyroscopic sensor or other sensors that can be used to detect motion or orientation of the device with respect to a reference frame.

In some implementations, positioning sensors (e.g., an accelerometer 272) can be used to compute an instantaneous coordinate frame of the mobile device 100. For example, when the mobile device 100 is lying flat on a surface, an instantaneous coordinate frame centered on the mobile device 100 can be computed. For example, the z-axis can be perpendicular to the surface which can lie in the x-y plane in a right-handed coordinate system, as shown in FIG. 1A. If the user 110 moves the mobile device 100 to the position and orientation shown in FIG. 1B, then a trajectory for the mobile device 100 can be determined from the change in coordinates of the mobile device 100. For example, in reference to FIGS. 1A and 1B, the coordinate frame of the mobile device 100 in FIG. 1A rotates by about ninety degrees with respect to the z-axis to change to the coordinate frame in FIG. 1B while the user 110 is holding the mobile device 100. Accordingly, the speakerphone system can be controlled according to the detected change of coordinate frames.
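One hedged way to detect a frame change on the order of ninety degrees, as described above, is to compare two gravity-vector samples taken before and after the movement. The 60-degree trigger threshold below is illustrative, not a value from the patent.

```python
import math

def rotation_angle_deg(v1, v2) -> float:
    """Angle between two 3-vector gravity samples, in degrees."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

def frame_changed(v1, v2, threshold_deg: float = 60.0) -> bool:
    """True when the device frame has rotated enough (e.g. from lying
    on a table to held at the ear) to warrant a speakerphone control
    action."""
    return rotation_angle_deg(v1, v2) >= threshold_deg
```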
In some implementations, one or more sensors (e.g., a pressure sensor, temperature sensor) for detecting when a user is holding or gripping the mobile device 100 can be integrated into a housing of the mobile device 100. These sensors can detect when the mobile device 100 is gripped by a user, for example, by detecting a pressure exerted upon the body of the mobile device 100 or a partial temperature change (e.g., deviation from an ambient temperature) on the mobile device 100.

In some implementations, the mobile device 100 can include a touch sensor, which detects a user entering input via the graphical user interface, resulting in the speakerphone system being activated, for example. The user input can be received by the mobile device 100 from the user touching the touch-sensitive display 202, or from the user touching a keypad or a like device (not shown) associated with the mobile device 100.

In some implementations, the mobile device 100 can include a time sensor (e.g., using the internal clock of the mobile device 100), which detects a duration for a certain state (e.g., position, or orientation) of the mobile device 100. The detected duration can be used to determine if a control action will be triggered, to prevent overly frequent, unnecessary responses to each state change. By way of illustration, if the state change does not exceed a certain amount of time, e.g., five seconds, an assumption can be made that the state change is temporary, and therefore no control action will be triggered in response. By contrast, if the state change lasts longer than five seconds, an assumption can be made that the state change will remain for a longer period, and thus a control action can be triggered accordingly.

The decision whether to trigger a corresponding control action can also be made upon detection of time in combination with a transition distance of the mobile device 100, to enhance accuracy of the state determination. For example, in FIGS. 1A and 1B, if the mobile device 100 has been raised by the user 110 by twenty feet, for an interval exceeding five seconds, an assumption can be made that the user intends to use the handset for the telephone conversation. Accordingly, the speakerphone system can be deactivated in response to the assumption. Otherwise, the speakerphone system can remain unchanged until the state change is greater than a certain amount of time or distance.
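The time-plus-distance debounce can be sketched as a single predicate. The five-second and twenty-foot figures come from the example above; expressing the rule as one function with both conditions required is an assumption about how the checks combine.

```python
def should_trigger(duration_s: float, distance_ft: float,
                   min_duration_s: float = 5.0,
                   min_distance_ft: float = 20.0) -> bool:
    """Debounce sketch: act only on a state change that has both
    persisted longer than the minimum time and moved the device far
    enough; brief or small changes are treated as temporary and
    ignored so the speakerphone is not toggled on every transient."""
    return duration_s > min_duration_s and distance_ft >= min_distance_ft
```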
In some implementations, a port device 290, e.g., a Universal Serial Bus (USB) port, or a docking port, or some other wired port connection, can be included. The port device 290 can, for example, be utilized to establish a wired connection to other computing devices, such as other communication devices, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving and/or transmitting data. In some implementations, the port device 290 allows the mobile device 100 to synchronize with a host device using one or more protocols, such as, for example, TCP/IP, HTTP, UDP, or any other known protocol.

In some implementations, the mobile device 100 can have hardware connection sensors that detect whether the mobile device 100 is connected to any hardware devices via the port device 290. When the mobile device 100 is connected to hardware devices (e.g., a docking station or re-charger), it is more likely than not that a user of the mobile device 100 is not holding the handset, and thus the speakerphone system (e.g., the speaker volume and/or microphone sensitivity) and graphical user interface can be adjusted accordingly.
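A sketch of a connection-change handler consistent with this behavior follows; the settings dictionary and the particular volume and sensitivity values are illustrative assumptions.

```python
def on_connection_change(connected: bool, audio: dict) -> dict:
    """When the device docks or begins charging, assume the user is
    not holding the handset: enable the speakerphone and raise speaker
    volume and microphone sensitivity for hands-free use; restore
    lower handset levels when the device is disconnected."""
    if connected:
        audio.update(speakerphone=True,
                     speaker_volume=0.8, mic_sensitivity=0.9)
    else:
        audio.update(speakerphone=False,
                     speaker_volume=0.4, mic_sensitivity=0.5)
    return audio
```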
The mobile device 100 can also include a camera lens and sensor 280. In some implementations, the camera lens and sensor 280 can be located on the back surface of the mobile device 100. The camera can capture still images and/or video. In some implementations, the images captured by the camera can be used to measure proximity to a user or to determine if the mobile device 100 is held by the user, and the speakerphone system and graphical user interface can be activated or adjusted accordingly.
The mobile device 100 can also include one or more wireless communication subsystems, such as an 802.11b/g communication device 186, and/or a Bluetooth™ communication device 188. Other communication protocols can also be supported, including other 802.x communication protocols (e.g., WiMax, Wi-Fi, 3G), code division multiple access (CDMA), global system for mobile communications (GSM), Enhanced Data GSM Environment (EDGE), etc.
Example Mobile Device Architecture

FIG. 3 is a block diagram of an example implementation 300 of the mobile device 100 of FIG. 1. The mobile device 100 can include a memory interface 302, one or more data processors, image processors and/or central processing units 304, and a peripherals interface 306. The memory interface 302, the one or more processors 304 and/or the peripherals interface 306 can be separate components or can be integrated in one or more integrated circuits. The various components in the mobile device 100 can be coupled by one or more communication buses or signal lines.
Sensors, devices, and subsystems can be coupled to the peripherals interface 306 to facilitate multiple functionalities. For example, a motion sensor 310, a light sensor 312, and a proximity sensor 314 can be coupled to the peripherals interface 306 to facilitate the orientation, lighting, and proximity functions described with respect to FIG. 2. A hardware connection sensor 318 can be coupled to the peripherals interface 306, to facilitate determining a state of connecting the mobile device 100 to any hardware, e.g., a docking station, a charger, a personal computer, etc. A gripping sensor 319 can be coupled to the peripherals interface 306, to determine if the mobile device 100 is being gripped. In various implementations, a gripping sensor can include a temperature sensor, and/or a pressure sensor. Further, a touch sensor 321 can be coupled to the peripherals interface 306, to detect if a user is touching a user input interface, e.g., a touch screen or a keypad. A time sensor 323 can also be coupled to the peripherals interface 306, to detect a duration of a certain state of the mobile device 100. Other sensors 316 can also be connected to the peripherals interface 306, such as a positioning system (e.g., GPS receiver), a temperature sensor, a biometric sensor, a gyroscope, or other sensing device, to facilitate related functionalities.
A camera subsystem 320 and an optical sensor 322, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.
Communication functions can be facilitated through one or more wireless communication subsystems 324, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 324 can depend on the communication network(s) over which the mobile device 100 is intended to operate. For example, a mobile device 100 may include communication subsystems 324 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystems 324 may include hosting protocols such that the device 100 may be configured as a base station for other wireless devices.

An audio subsystem 326 can be coupled to a loudspeaker 124, and microphone 122 to facilitate voice-enabled functions, for example, hands-free functionalities, voice recognition, voice replication, digital recording, and telephony functions.
The I/O subsystem 340 can include a touch screen controller 342 and/or other input controller(s) 344. The touch-screen controller 342 can be coupled to a touch screen 346. The touch screen 346 and touch screen controller 342 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 346.
The other input controller(s) 344 can be coupled to other input/control devices 348, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 126 and loudspeaker 124 and/or the microphone 122.
In some implementations, the mobile device 100 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the mobile device 100 can include the functionality of an MP3 player, such as an iPod™. The mobile device 100 may, therefore, include a 36-pin connector that is compatible with the iPod. Other input/output and control devices can also be used.
The memory interface 302 can be coupled to memory 350. The memory 350 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 350 can store an operating system 352, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 352 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 352 can be a kernel (e.g., UNIX kernel).
The memory 350 may also store communication instructions 354 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 350 may include graphical user interface instructions 356 to facilitate graphic user interface processing; sensor processing instructions 358 to facilitate sensor-related processing and functions; phone instructions 360 to facilitate phone-related processes and functions; electronic messaging instructions 362 to facilitate electronic-messaging related processes and functions; web browsing instructions 364 to facilitate web browsing-related processes and functions; media processing instructions 366 to facilitate media processing-related processes and functions; GPS/navigation instructions 368 to facilitate GPS and navigation-related processes and instructions; camera instructions 370 to facilitate camera-related processes and functions; GUI adjustment instructions 373 to facilitate adjustment of graphical user interfaces and user interface elements in response to sensor data; and/or other software instructions 372 to facilitate other processes and functions.

In addition, the memory 350 can store audio management instructions 376 to facilitate functions managing the audio subsystem, including the loudspeaker 124 and the microphone 122. In some implementations, the audio management instructions 376 are operable to toggle the speakerphone system and adjust speaker volume and/or microphone sensitivity, in response to the sensor processing instructions 358.
The memory 350 may also store other software instructions (not shown), such as web video instructions to facilitate web video-related processes and functions; and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, the media processing instructions 366 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. An activation record and International Mobile Equipment Identity (IMEI) 374 or similar hardware identifier can also be stored in memory 350.
Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. The memory 350 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device 100 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
Example Process of Controlling Speakerphone System

FIG. 4 illustrates an example process 400 for managing a mobile device’s speakerphone system based on a current state of the mobile device 100. For convenience, the process 400 is described below in reference to FIGS. 1-3 (e.g., a mobile device 100, a speakerphone system, and