US008676224B2

(12) United States Patent                  (10) Patent No.:     US 8,676,224 B2
     Louch                                 (45) Date of Patent: Mar. 18, 2014
(54) SPEAKERPHONE CONTROL FOR MOBILE DEVICE

(75) Inventor:  John O. Louch, San Luis Obispo, CA (US)

(73) Assignee:  Apple Inc., Cupertino, CA (US)

( * ) Notice:   Subject to any disclaimer, the term of this patent is
                extended or adjusted under 35 U.S.C. 154(b) by 870 days.
`
(21) Appl. No.: 12/033,706

(22) Filed:     Feb. 19, 2008

(65)            Prior Publication Data
     US 2009/0209293 A1        Aug. 20, 2009
`
(51) Int. Cl.
     H04B 7/00          (2006.01)
     H04W 24/00         (2009.01)
     H04M 1/00          (2006.01)
(52) U.S. Cl.
     USPC ............ 455/456.1; 455/41.2
(58) Field of Classification Search
     CPC .................................... H04M 1/6066
     USPC ............ 455/569.1, 575.1, 90.1, 41.2-41.3,
                      455/66.1, 67.11, 456.1
     See application file for complete search history.

(56)                 References Cited

               U.S. PATENT DOCUMENTS

     5,224,151 A  *     1993  Bowen et al. ........... 455/569.1
     5,???,???                Moran et al.
     5,675,362 A     10/1997  Clough et al.
     5,712,9?1 A  *   4/1998  Hed ................. 379/388.01
     6,411,828 B1 *   6/2002  Lands et al. ........... 455/569.1
     6,434,371 B1 *   8/2002  Claxton ................. 455/90.1
     6,449,363 B1 *   9/2002  Kielsnia ............ 379/420.01
     6,677,932 B1     1/2004  Westerman
     6,751,446 B1 *   6/2004  Kim et al. .............. 455/90.1
     6,771,768 B2 *   8/2004  Dietz et al. ........ 379/387.01
     6,853,850 B2 *   2/2005  Shim et al. ............ 455/550.1
     6,993,366 B2 *   1/2006  Kim .................... 455/569.1
     7,239,900 B2 *   7/2007  Choi et al. ............ 455/575.3
     7,260,422 B2 *   8/2007  Knoedgen ............... 455/569.1
     7,263,373 B2 *   8/2007  Mattisson .............. 455/456.3
     7,400,316 B2     7/2008  Aeplegaul et al.
     7,493,573 B2     2/2009  Wagner
     7,499,686 B2 *   3/2009  Sinclair et al. .......... 455/223
     7,696,905 B2 *   4/2010  Ellenby et al. ........... 340/974
     7,697,962 B2 *   4/2010  Cradick et al. ......... 455/569.1
     7,774,029 B2 *   8/2010  Lee et al. ............... 455/566
     7,920,696 B2 *   4/2011  Chew ................ 379/388.02
     8,099,124 B2 *   1/2012  Tilley ................. 455/550.1
 2004/0198332 A1 *  10/2004  Lundsgaard ............... 455/417
 2005/0154798 A1     7/2005  Nurmi
 2005/0216867 A1     9/2005  Marvit et al.
 2005/0219228 A1 *  10/2005  Alameh et al. ............ 345/173
 2007/0283264 A1    12/2007  Vau et al.
 2008/0034321 A1     2/2008  Griffin
 2008/0146289 A1 *   6/2008  Korneluk et al. ........ 455/569.1
 2008/0188273 A1 *   8/2008  You .................... 455/575.3
 2008/0220715 A1 *   9/2008  Sinha et al. ............... 455/1
 2008/0280640 A1 *  11/2008  Wedel et al. ........... 455/556.1
 2009/0024943 A1     1/2009  Adler et al.
 2009/0031257 A1     1/2009  Arneson et al.
 2009/0100384 A1     4/2009  Louch

 * cited by examiner

Primary Examiner - Simon Nguyen

(57)                     ABSTRACT

A speakerphone system integrated in a mobile device is automatically
controlled based on the current state of the mobile device. In one
implementation, the mobile device is controlled based on an orientation
or position of the mobile device. In another implementation, the control
of the speakerphone includes automatically controlling one or more
graphical user interfaces associated with the speakerphone system.

                30 Claims, 5 Drawing Sheets

APPLE 1011
`
`
`
`
`
`
`
U.S. Patent        Mar. 18, 2014        Sheet 1 of 5        US 8,676,224 B2

[FIG. 1A]    [FIG. 1B]    (drawings of the mobile device 100 on a table
top 140 and held by a user)
`
U.S. Patent        Mar. 18, 2014        Sheet 2 of 5        US 8,676,224 B2

[FIG. 2: front view of the mobile device 100, showing the touch-sensitive
display 202 with display objects 206 (Text, Calendar, Photos, Camera,
Calculator, Stocks, Weather, Maps, Notes, Clock, Address Book, Settings),
accelerometer 272, directional arrow 274, proximity sensor, ambient light
sensor 270, audio jack 266, port device 290, microphone 122, and
loudspeaker 124]
`
U.S. Patent        Mar. 18, 2014        Sheet 3 of 5        US 8,676,224 B2

[FIG. 3: block diagram of an example implementation 300 of the mobile
device 100, showing memory interface 302, processor(s) 304, peripherals
interface 306, motion sensor 310, light sensor 312, proximity sensor 314,
other sensor(s) 316, hardware connection sensor 318, gripping sensor 319,
camera subsystem 320, touch sensor 321, optical sensor 322, time sensor
323, wireless communication subsystem(s) 324, audio subsystem 326 with
loudspeaker 124 and microphone 122, I/O subsystem 340 with touch-screen
controller 342 and other input controller(s) 344, touch screen 346, other
input/control devices 348, and memory 350 holding operating system
instructions 352, communication instructions 354, GUI instructions 356,
sensor processing instructions 358, phone instructions 360, electronic
messaging instructions 362, web browsing instructions 364, media
processing instructions 366, GPS/navigation instructions 368, camera
instructions 370, other software instructions 372, GUI adjustment
instructions, and audio management instructions]
`
U.S. Patent        Mar. 18, 2014        Sheet 4 of 5        US 8,676,224 B2

[FIG. 4: flowchart of an example process 400]

  410: Use Sensor(s) on Mobile Device to Determine Current State of
       Mobile Device
  420: Determine Control Action(s) Associated With Current State(s) of
       the Mobile Device
  430: Automatically Implement Control Action(s) on Mobile Device
`
U.S. Patent        Mar. 18, 2014        Sheet 5 of 5        US 8,676,224 B2

[FIG. 5: Mapping States to Control Actions]

  State of the Mobile Device                  | Control Action to the
                                              | Speakerphone
  --------------------------------------------+---------------------------
  Orienting towards/away from a voice source  | Deactivating / Activating
  In proximity / within a distance to a       | Deactivating / Activating
    voice source                              |
  Being gripped/released by a user            | Deactivating / Activating
  Disconnected/connected to hardware          | Deactivating / Activating
    device(s)                                 |
  At an angle more horizontal/perpendicular   | Deactivating / Activating
    with the ground                           |
  Receiving higher/lower volume from a        | Deactivating / Activating
    voice source                              |
  Receiving / not receiving input from the    | Deactivating / Activating
    user interface                            |
`
US 8,676,224 B2
`SPEAKERPHONE CONTROL FOR MOBILE
`DEVICE
`
`RELATED APPLICATION
`
This subject matter is related to U.S. patent application Ser.
No. 11/937,463, for "Variable Device Graphical User Interface," filed
Nov. 8, 2007, the subject matter of which is incorporated by reference
herein in its entirety.
`
`TECHNICAL FIELD
`
`This subject matter is generally related to mobile devices.
`
`BACKGROUND
`
`Modern mobile devices (e.g., mobile phones, media play-
`ers) often include a speakerphone system. The speakerphone
`system, which typically includes a loudspeaker and a micro-
`phone integrated into the mobile device, can free the user’s
`hands and facilitate multi-party conversations using the
`mobile device. A typical speakerphone system for a mobile
`device is controlled by hardware and/or software mechanisms
`which require the user to make physical contact with the
`mobile device. When operating the mobile device in hands
`free mode, the user must manually activate the speakerphone
`system to engage in a conversation, and then deactivate the
`speakerphone system when finished with the conversation.
`Even if the user is holding the device, the manual steps of
`activating and deactivating the speakerphone system can be
`annoying to the user.
`
`SUMMARY
`
A speakerphone system integrated in a mobile device is
automatically controlled based on the current state of the
`mobile device. In one implementation, the mobile device is
`controlled based on an orientation or position of the mobile
`device. In another implementation, the control of the speak-
`erphone includes automatically controlling one or more
`graphical user interfaces associated with the speakerphone
`system.
`
`DESCRIPTION OF DRAWINGS
`
`
FIGS. 1A and 1B illustrate an example speakerphone system for a mobile
device which can be managed based on a current state of the mobile device.

FIG. 2 illustrates a graphical user interface for the example mobile
device.

FIG. 3 is a block diagram of an example implementation of the mobile
device of FIG. 2.

FIG. 4 illustrates an example process for managing a mobile device's
speakerphone system based on a current state of the mobile device.

FIG. 5 illustrates an example data structure for mapping mobile device
states to speakerphone control actions.
`
`DETAILED DESCRIPTION
`
`Speakerphone System Overview
`
FIGS. 1A and 1B illustrate an example speakerphone system for a mobile
device 100 which can be managed based on a current state of the mobile
device 100. The mobile device 100 can be, for example, a handheld
computer, a personal digital assistant, a cellular telephone, a network
appliance, a camera, a smart phone, an enhanced general packet radio
service (EGPRS) mobile phone, a network base station, a media player, a
navigation device, an email device, a game console, or a combination of
any two or more of these data processing devices or other data processing
devices. The mobile device 100 can have a speakerphone system associated
with the mobile device 100, e.g., an integrated speaker in the mobile
device 100, or an external speaker wirelessly connected to the mobile
device 100. The speakerphone system can include a microphone 122 which
can be arranged on a front or back side of the mobile device 100 to
facilitate hands-free operation during a telephone conversation or
playback of audio content (e.g., music, voicemail). In various
implementations, the speakerphone system also includes a loudspeaker 124
to deliver and/or amplify voice to reach a user of the mobile device 100.
`
The speakerphone system can be managed based on the current state of the
mobile device 100. In some implementations, the current state of the
mobile device 100 can be determined using a reference frame and one or
more sensors (e.g., accelerometer, gyro, light sensor, proximity sensor)
integrated into the mobile device 100. A "state" can be an orientation or
position of the device with respect to the reference frame. For example,
when the mobile device 100 is laid flat on a surface (e.g., the x-y
plane) of a table top 140, the speakerphone system can be activated to
allow hands-free operation. If the mobile device 100 is laying flat on a
surface, an assumption can be made that the user intends to use the
speakerphone system for hands-free operation. Similarly, if the mobile
device 100 is oriented substantially vertical (in the z-plane), an
assumption can be made that the mobile device 100 is temporarily fixed in
a docking or recharging device (as shown in FIG. 1A), and the
speakerphone system can be activated to allow hands-free operation while
docked and/or while recharging.
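The flat-versus-vertical test described above can be sketched in code. The following is an illustrative sketch only, not the patent's implementation; the function name, the 15-degree tolerance, and the assumption that the device is at rest (so the accelerometer reading is dominated by gravity) are all assumptions introduced here:

```python
import math

def classify_orientation(ax, ay, az, tolerance_deg=15.0):
    """Classify a resting device from one 3-axis accelerometer sample
    (in units of g). At rest the measured vector is dominated by
    gravity, so the angle between the device z-axis and the measured
    vector distinguishes "laid flat" from "substantially vertical"."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude == 0:
        return "unknown"
    # Tilt of the device z-axis away from the gravity direction, in degrees.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, abs(az) / magnitude))))
    if tilt <= tolerance_deg:
        return "flat"      # e.g., laid on a table top: activate speakerphone
    if abs(tilt - 90.0) <= tolerance_deg:
        return "vertical"  # e.g., docked or recharging: activate speakerphone
    return "other"
```

A sample flat on a table would read roughly (0, 0, 1) g and classify as "flat", while a docked, upright device would read roughly (0, 1, 0) g and classify as "vertical".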
A "state" of the mobile device 100 can also be determined based solely on
sensor inputs. For example, one or more sensors (e.g., proximity sensor
268) on the front and/or back sides of the mobile device 100 can indicate
a state of the mobile device 100. For example, if a first proximity
sensor on the back side of the mobile device 100 is triggered and a
second proximity sensor on the front side of the mobile device 100 is not
triggered, then an assumption can be made that the mobile device 100 is
laying flat on a surface. Based on this assumption, the speakerphone
system can be controlled (e.g., activated) to allow hands-free operation.
If the first proximity sensor and/or the second proximity sensor are
triggered, then an assumption can be made that the mobile device 100 is
being held by the user or is stored (e.g., stored in a bag or case).
Based on this assumption, the speakerphone system can be controlled
differently (e.g., deactivated).
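The two-sensor inference above can be written as a small decision function. A sketch under stated assumptions: the names are invented here, and the ambiguous "and/or" case in the text is resolved in favor of the front sensor:

```python
def infer_placement(back_triggered, front_triggered):
    """Infer how the device is situated from a back-side and a
    front-side proximity sensor, following the assumptions above."""
    if back_triggered and not front_triggered:
        return "flat_on_surface"  # speakerphone can be activated
    if front_triggered:
        return "held_or_stowed"   # speakerphone can be deactivated
    return "indeterminate"        # neither sensor fired; no inference
```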
In some implementations, a processor (e.g., processor 304) in the mobile
device 100 can use a state machine to maintain the current state of the
mobile device 100. The state machine can track various combinations of
inputs which can cause a state change to occur. A control action can then
be issued based on the current state of the mobile device 100 as
indicated by the state machine. A control action can be activating or
deactivating the speakerphone system, generating or adjusting a graphical
user interface, and/or any other suitable control action.
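The state machine the passage describes, fed by sensor inputs and emitting a control action on each state change, might be sketched as follows. This is an illustrative sketch, not the patent's implementation; the state names, the snapshot structure, and the priority given to motion over proximity are assumptions made here:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorSnapshot:
    back_proximity: bool   # object near the back side (e.g., a table top)
    front_proximity: bool  # object near the front side (e.g., the user's head)
    in_motion: bool        # acceleration above a threshold value

# Per-state control actions, in the spirit of the first, second, and
# third states described in the text.
CONTROL_ACTIONS = {
    "flat_at_rest": "activate_speakerphone",
    "in_motion": "lower_loudspeaker_volume",
    "at_ear": "deactivate_speakerphone",
}

class SpeakerphoneStateMachine:
    def __init__(self):
        self.state = "unknown"

    def update(self, s: SensorSnapshot) -> Optional[str]:
        """Fold one snapshot of sensor inputs into the current state and
        return a control action when the state changes, else None."""
        if s.in_motion:
            new_state = "in_motion"      # device picked up
        elif s.front_proximity:
            new_state = "at_ear"         # raised to the user's ear, at rest
        elif s.back_proximity:
            new_state = "flat_at_rest"   # face up on a surface, at rest
        else:
            new_state = self.state       # no evidence of a change
        if new_state == self.state:
            return None                  # unchanged: issue no action
        self.state = new_state
        return CONTROL_ACTIONS.get(new_state)
```

Feeding a "flat on the table" snapshot followed by a "moving" snapshot would first activate the speakerphone and then lower the loudspeaker volume, mirroring the first and second states described next.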
`
For example, a first state of the mobile device 100 can be defined by a
first proximity sensor on the back side of the mobile device 100 sensing
proximity to an object (e.g., a table top surface) and a motion sensor
not sensing motion of the mobile device 100 (e.g., acceleration is below
a threshold value). The combination of these sensor inputs can place the
state machine of the mobile device 100 into the first state. The first
state can exist when the mobile device 100 is laying at rest, face up on
a flat surface, for example. The control action can be activating the
speakerphone system and adjusting the volume of the loudspeaker 124.
Another control action can be to generate a graphical user interface, as
described in U.S. patent application Ser. No. 11/937,463.

A second state of the mobile device 100 can be defined by the motion
sensor sensing motion (e.g., acceleration above a threshold value). Such
motion can place the state machine into the second state. The second
state can exist when a user has picked up the mobile device 100 from the
surface to make a call, for example. The control action can be lowering
the volume of the loudspeaker 124. Other control actions are possible.

A third state of the mobile device 100 can be defined by a second
proximity sensor located on the front side of the mobile device 100
sensing proximity to an object (e.g., the user's head) and the motion
sensor not sensing motion of the mobile device 100 (e.g., acceleration is
again below a threshold value). The combination of these sensor inputs
can place the state machine of the mobile device 100 into the third
state. The third state can exist when the user 110 has raised the mobile
device 100 to the user's ear and the mobile device is no longer in
motion. The control action can be deactivating the speakerphone system.
Other control actions are possible.

It should be understood that any number of states and/or combinations of
states can be defined and used to trigger control actions. The state
machine can be implemented by a processor of the mobile device 100 (e.g.,
processor 304). The processor can also determine appropriate control
actions based on the current state of the mobile device 100 as determined
by the state machine.

Example Mobile Device

FIG. 2 illustrates a graphical user interface for an example mobile
device. As described in reference to FIG. 1, the mobile device 100
typically includes a built-in microphone 122 and loudspeaker 124. In some
implementations, an up/down button 284 for volume control of the
loudspeaker 124 and the microphone 122 can be included. The mobile device
100 can also include an on/off button 282 for a ring indicator of
incoming phone calls. An audio jack 266 can also be included for use of
headphones and/or a microphone.

In addition, as shown in FIG. 2, the mobile device 100 can include a
display 202, which, in some implementations, is touch-sensitive. The
touch-sensitive display 202 can implement liquid crystal display (LCD)
technology, light emitting polymer display (LPD) technology, or some
other display technology. The touch-sensitive display 202 can be
sensitive to haptic and/or tactile contact with a user.

In some implementations, the mobile device 100 can display one or more
graphical user interfaces on the touch-sensitive display 202 for
providing user access to various system objects and for conveying
information to a user. In some implementations, the graphical user
interfaces can include one or more display objects, e.g., 204 and 206. In
the example shown, the display objects 204 and 206 are graphic
representations of system objects. Some examples of system objects
include device functions, applications, windows, files, alerts, events,
or other identifiable system objects.

In some implementations, a proximity sensor 268 can be included to
determine the current state of the mobile device 100 by detecting the
user 110 positioning the mobile device 100 proximate to the user's ear,
as described in reference to FIG. 1. In some implementations, the
graphical user interface can be resized to reduce the graphical
representations of display objects 204 and 206, e.g., graphical icons,
and their corresponding touch areas (e.g., areas on the touch-sensitive
display where a touch on the display 202 selects the graphical icons). In
various implementations, an ambient light sensor 270 can also be used to
determine the current state of the device. For example, the ambient light
sensor 270 can sense when the mobile device 100 has been stored away.
This sensor input can be used alone or in combination with other sensor
inputs to determine the current state of the mobile device 100.

In some implementations, the microphone 122 can be used as a volume
sensor which can detect the user's voice volume. For example, when the
volume level from the voice source exceeds a default value, an assumption
can be made that the user is speaking directly into the microphone 122
while holding the mobile device 100 to their ear, resulting in the
speakerphone system being deactivated, for example.

In some implementations, the ambient light sensor 270 can be utilized to
facilitate adjusting the brightness of the display 202, and an
accelerometer 272 can be utilized to detect movement of the mobile device
100, as indicated by the directional arrow 274. Accordingly, the
speakerphone system and a graphical user interface can be adjusted
according to a detected orientation of the mobile device 100.

In some implementations, the mobile device 100 includes circuitry and
sensors for supporting a location determining capability, such as that
provided by the global positioning system (GPS) or other positioning
systems (e.g., systems using Wi-Fi access points, television signals,
cellular grids, Uniform Resource Locators (URLs)). In some
implementations, a positioning system (e.g., a GPS receiver) can be
integrated into the mobile device 100 or provided as a separate device
that can be coupled to the mobile device 100 through an interface (e.g.,
port device 290) to provide access to location-based services. In some
implementations, the mobile device 100 includes a gyroscopic sensor or
other sensors that can be used to detect motion or orientation of the
device with respect to a reference frame.

In some implementations, positioning sensors (e.g., an accelerometer 272)
can be used to compute an instantaneous coordinate frame of the mobile
device 100. For example, when the mobile device 100 is lying flat on a
surface, an instantaneous coordinate frame centered on the mobile device
100 can be computed. For example, the z-axis can be perpendicular to the
surface which can lie in the x-y plane in a right-handed coordinate
system, as shown in FIG. 1A. If the user 110 moves the mobile device 100
to the position and orientation shown in FIG. 1B, then a trajectory for
the mobile device 100 can be determined from the change in coordinates of
the mobile device 100. For example, in reference to FIGS. 1A and 1B, the
mobile device's 100 coordinate frame in FIG. 1A rotates by about ninety
degrees with respect to the z-axis to change to a coordinate frame in
FIG. 1B while the user 110 is holding the mobile device 100. Accordingly,
the speakerphone system can be controlled according to the detected
change of coordinate frames.

In some implementations, one or more sensors (e.g., a pressure sensor,
temperature sensor) for detecting when a user is holding or gripping the
mobile device 100 can be integrated into a housing of the mobile device
100. These sensors can detect when the mobile device 100 is gripped by a
user, for example, by detecting a pressure exerted upon the body of the
mobile device 100 or a partial temperature change (e.g., deviation from
an ambient temperature) on the mobile device 100.

In some implementations, the mobile device 100 can include a touch
sensor, which detects a user entering input via
`
the graphical user interface, resulting in the speakerphone system being
activated, for example. The user input can be received by the mobile
device 100 from the user touching the touch-sensitive display 202, or
from the user touching a keypad or a like device (not shown) associated
with the mobile device 100.

In some implementations, the mobile device 100 can include a time sensor
(e.g., using the internal clock of the mobile device 100), which detects
a duration for a certain state (e.g., position, or orientation) of the
mobile device 100. The detected duration can be used to determine if a
control action will be triggered, to prevent overly frequent, unnecessary
responses to each state change. By way of illustration, if the state
change does not exceed a certain amount of time, e.g., five seconds, an
assumption can be made that the state change is temporary, and therefore
no control action will be triggered in response. By contrast, if the
state change lasts longer than five seconds, an assumption can be made
that the state change will remain for a longer period, and thus a control
action can be triggered accordingly.

The decision whether to trigger a corresponding control action can also
be made upon detection of time in combination with a transition distance
of the mobile device 100, to enhance the accuracy of the state
determination. For example, in FIGS. 1A and 1B, if the mobile device 100
has been raised by the user 110 by twenty feet, for an interval exceeding
five seconds, an assumption can be made that the user intends to use the
handset for the telephone conversation. Accordingly, the speakerphone
system can be deactivated in response to the assumption. Otherwise, the
speakerphone system can remain unchanged until the state change is
greater than a certain amount of time or distance.
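The duration test described above, which suppresses control actions for short-lived state changes, can be sketched as a small debouncer. An illustrative sketch only; the patent does not specify an implementation, and the class name, five-second default, and injectable clock are assumptions made here:

```python
import time

class StateDebouncer:
    """Commit a state change (and thus allow a control action) only
    after the raw state has persisted for hold_seconds."""

    def __init__(self, hold_seconds=5.0, clock=time.monotonic):
        self.hold_seconds = hold_seconds
        self.clock = clock          # injectable for testing
        self.committed_state = None
        self.pending_state = None
        self.pending_since = None

    def observe(self, state):
        """Report the current raw state; return a state to act on once
        it has persisted for hold_seconds, otherwise None."""
        now = self.clock()
        if state == self.committed_state:
            self.pending_state = None   # transient change reverted
            return None
        if state != self.pending_state:
            self.pending_state, self.pending_since = state, now
            return None                 # start timing the new state
        if now - self.pending_since >= self.hold_seconds:
            self.committed_state = state
            self.pending_state = None
            return state                # held long enough: act on it
        return None
```

With a fake clock, an "at ear" reading observed at t = 0 s and t = 3 s returns nothing, while the same reading at t = 6 s is committed and can trigger deactivation of the speakerphone.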
In some implementations, a port device 290, e.g., a Universal Serial Bus
(USB) port, or a docking port, or some other wired port connection, can
be included. The port device 290 can, for example, be utilized to
establish a wired connection to other computing devices, such as other
communication devices, network access devices, a personal computer, a
printer, a display screen, or other processing devices capable of
receiving and/or transmitting data. In some implementations, the port
device 290 allows the mobile device 100 to synchronize with a host device
using one or more protocols, such as, for example, TCP/IP, HTTP, UDP, and
any other known protocol.

In some implementations, the mobile device 100 can have hardware
connection sensors that detect whether the mobile device 100 is connected
to any hardware devices via the port device 290. When the mobile device
100 is being connected to hardware devices (e.g., a docking station or
re-charger), it is more likely than not that a user of the mobile device
100 is not holding the handset, and thus the speakerphone system (e.g.,
the speaker volume and/or microphone sensitivity) and graphical user
interface can be adjusted accordingly.
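The hardware-connection behavior above might look like the following handler. A sketch under stated assumptions: the `audio` controller object, its setter methods, and the numeric levels are all hypothetical, introduced here for illustration only:

```python
def on_hardware_connection_changed(connected, audio):
    """React to the hardware-connection sensor. When the device is
    docked or recharging, the user is likely not holding the handset,
    so favor the speakerphone and across-the-room audio levels."""
    if connected:
        audio.set_speakerphone(True)
        audio.set_loudspeaker_volume(0.8)      # louder, for distant listening
        audio.set_microphone_sensitivity(0.9)  # pick up a more distant voice
    else:
        audio.set_speakerphone(False)
        audio.set_loudspeaker_volume(0.4)      # handset-level volume
        audio.set_microphone_sensitivity(0.5)  # voice is close to the mic
```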
The mobile device 100 can also include a camera lens and sensor 280. In
some implementations, the camera lens and sensor 280 can be located on
the back surface of the mobile device 100. The camera can capture still
images and/or video. In some implementations, the images captured by the
camera can be used to measure proximity to a user or whether the mobile
device 100 is held by the user, and the speakerphone system and graphical
user interface can be activated or adjusted accordingly.

The mobile device 100 can also include one or more wireless communication
subsystems, such as an 802.11b/g communication device 186, and/or a
Bluetooth™ communication device 188. Other communication protocols can
also be supported, including other 802.x communication protocols (e.g.,
`
`WiMax, Wi-Fi, 3G), code division multiple access (CDMA),
`global system for mobile communications (GSM), Enhanced
`Data GSM Environment (EDGE), etc.
`
`Example Mobile Device Architecture
`
FIG. 3 is a block diagram of an example implementation 300 of the mobile
device 100 of FIG. 1. The mobile device 100 can include a memory
interface 302, one or more data processors, image processors and/or
central processing units 304, and a peripherals interface 306. The memory
interface 302, the one or more processors 304 and/or the peripherals
interface 306 can be separate components or can be integrated in one or
more integrated circuits. The various components in the mobile device 100
can be coupled by one or more communication buses or signal lines.
Sensors, devices, and subsystems can be coupled to the peripherals
interface 306 to facilitate multiple functionalities. For example, a
motion sensor 310, a light sensor 312, and a proximity sensor 314 can be
coupled to the peripherals interface 306 to facilitate the orientation,
lighting, and proximity functions described with respect to FIG. 2. A
hardware connection sensor 318 can be coupled to the peripherals
interface 306, to facilitate determining a state of connecting the mobile
device 100 to any hardware, e.g., a docking station, a charger, a
personal computer, etc. A gripping sensor 319 can be coupled to the
peripherals interface 306, to determine if the mobile device 100 is being
gripped. In various implementations, a gripping sensor can include a
temperature sensor and/or a pressure sensor. Further, a touch sensor 321
can be coupled to the peripherals interface 306, to detect if a user is
touching a user input interface, e.g., a touch screen or a keypad. A time
sensor 323 can also be coupled to the peripherals interface 306, to
detect a duration of a certain state of the mobile device 100. Other
sensors 316 can also be connected to the peripherals interface 306, such
as a positioning system (e.g., GPS receiver), a temperature sensor, a
biometric sensor, a gyroscope, or other sensing device, to facilitate
related functionalities.
A camera subsystem 320 and an optical sensor 322, e.g., a charge-coupled
device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical
sensor, can be utilized to facilitate camera functions, such as recording
photographs and video clips.
Communication functions can be facilitated through one or more wireless
communication subsystems 324, which can include radio frequency receivers
and transmitters and/or optical (e.g., infrared) receivers and
transmitters. The specific design and implementation of the communication
subsystem 324 can depend on the communication network(s) over which the
mobile device 100 is intended to operate. For example, a mobile device
100 may include communication subsystems 324 designed to operate over a
GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network,
and a Bluetooth™ network. In particular, the wireless communication
subsystems 324 may include hosting protocols such that the device 100 may
be configured as a base station for other wireless devices.
`An audio subsystem 326 can be coupled to a loudspeaker
`124, and microphone 122 to facilitate voice-enabled func-
`tions, for example, hands-free functionalities, voice recogni-
`tion, voice replication, digital recording, and telephony func-
`tions.
`
`The I/O subsystem 340 can include a touch screen control-
`ler 342 and/or other input controller(s) 344. The touch-screen
`controller 342 can be coupled to a touch screen 346. The
`touch screen 346 and touch screen controller 342 can, for
`
example, detect contact and movement or break thereof using any of a
plurality of touch sensitivity technologies, including but not limited to
capacitive, resistive, infrared, and surface acoustic wave technologies,
as well as other proximity sensor arrays or other elements for
determining one or more points of contact with the touch screen 346.

The other input controller(s) 344 can be coupled to other input/control
devices 348, such as one or more buttons, rocker switches, thumb-wheel,
infrared port, USB port, and/or a pointer device such as a stylus. The
one or more buttons (not shown) can include an up/down button for volume
control of the speaker 126 and loudspeaker 124 and/or the microphone 122.

In some implementations, the mobile device 100 can present recorded audio
and/or video files, such as MP3, AAC, and MPEG files. In some
implementations, the mobile device 100 can include the functionality of
an MP3 player, such as an iPod™. The mobile device 100 may, therefore,
include a 36-pin connector that is compatible with the iPod. Other
input/output and control devices can also be used.

The memory interface 302 can be coupled to memory 350. The memory 350 can
include high-speed random access memory and/or non-volatile memory, such
as one or more magnetic disk storage devices, one or more optical storage
devices, and/or flash memory (e.g., NAND, NOR). The memory 350 can store
an operating system 352, such as Darwin, RTXC, LINUX, UNIX, OS X,
WINDOWS, or an embedded operating system such as VxWorks. The operating
system 352 may include instructions for handling basic system services
and for performing hardware dependent tasks. In some implementations, the
operating system 352 can be a kernel (e.g., UNIX kernel).

The memory 350 may also store communication instructions 354 to
facilitate communicating with one or more additional devices, one or more
computers and/or one or more servers. The memory 350 may include
graphical user interface instructions 356 to facilitate graphic user
interface processing; sensor processing instructions 358 to facilitate
sensor-related processing and functions; phone instructions 360 to
facilitate phone-related processes and functions; electronic messaging
instructions 362 to facilitate electronic-messaging related processes and
functions; web browsing instructions 364 to facilitate web
browsing-related processes and functions; media processing instructions
366 to facilitate media processing-related processes and functions;
GPS/navigation instructions 368 to facilitate GPS and navigation-related
processes and instructions; camera instructions 370 to facilitate
camera-related processes and functions; GUI adjustment instructions 373
to facilitate adjustment of graphical user interfaces and user interface
elements in response to sensor data; and/or other software instructions
372 to facilitate other processes and functions.

In addition, the memory 350 can store audio management instructions 376
to facilitate functions managing the audio subsystem, including the
loudspeaker 124 and the microphone 122. In some implementations, the
audio management instructions 376 are operable to toggle the speakerphone
system and adjust speaker volume and/or microphone sensitivity, in
response to the sensor processing instructions 358.

The memory 350 may also store other software instructions (not shown),
such as web video instructions to facilitate web video-related processes
and functions; and/or web shopping instructions to facilitate web
shopping-related processes and functions. In some implementations, the
media processing instructions 366 are divided into audio processing
instructions and video processing instructions to facilitate audio
processing-related processes and functions and video processing-related
processes and functions, respectively. An activation record and
International Mobile Equipment Identity (IMEI) 374 or similar hardware
identifier can also be stored in memory 350.

Each of the above identified instructions and applications can correspond
to a set of instructions for performing one or more functions described
above. These instructions need not be implemented as separate software
programs, procedures, or modules. The memory 350 can include additional
instructions or fewer instructions. Furthermore, various functions of the
mobile device 100 may be implemented in hardware and/or in software,
including in one or more signal processing and/or application specific
integrated circuits.

Example Process of Controlling Speakerphone System

FIG. 4 illustrates an example process 400 for managing a mobile device's
speakerphone system based on a current state of the mobile device 100.
For convenience, the process 400 is described below in reference to FIGS.
1-3 (e.g., a mobile device 100, a speakerphone system, and other
components that perform the process 400).

In some implementations, the process 400 can begin when input from one or
more sensors on the mobile device is used to determine a current state of
the mobile device (410). An example state can be a change of the mobile
device's position or orientation relative to a user of the mobile device
or a reference frame.