US 20100298032A1

(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2010/0298032 A1
     Lee et al.                        (43) Pub. Date: Nov. 25, 2010

(54) MOBILE TERMINAL AND METHOD OF PROVIDING GRAPHIC USER INTERFACE USING THE SAME

(75) Inventors: Khwanhee Lee, Seoul (KR); Woochul Song, Seoul (KR)

Correspondence Address: LEE, HONG, DEGERMAN, KANG & WAIMEY, 660 S. FIGUEROA STREET, Suite 2300, LOS ANGELES, CA 90017 (US)

(73) Assignee: LG Electronics Inc.

(21) Appl. No.: 12/569,355

(22) Filed: Sep. 29, 2009

(30) Foreign Application Priority Data

     May 22, 2009 (KR) ........ 10-2009-0044915
     Jul. 6, 2009 (KR) ........ 10-2009-0061270

Publication Classification

(51) Int. Cl.
     H04M 1/00  (2006.01)
     G06F 3/00  (2006.01)
     G06F 3/041 (2006.01)

(52) U.S. Cl. ........ 455/566; 715/700; 345/173

(57) ABSTRACT

Provided are a mobile terminal and a method of providing a graphic user interface using the same. The mobile terminal includes a plurality of bodies. Various graphic user interfaces are provided according to a posture of the mobile terminal, which is formed by the plurality of bodies of the mobile terminal.

APPLE 1013
`

`

Patent Application Publication    Nov. 25, 2010  Sheet 1 of 14    US 2010/0298032 A1

[FIG. 1 (Sheet 1 of 14): block diagram of the mobile terminal 100 — power supply 190; radio communication unit 110 with broadcast receiving module 111, mobile communication module 112, wireless Internet module 113, local area communication module 114 and location information module 115; A/V input unit 120 with camera 121 and microphone 122; user input unit 130 with first touch device 131 and second touch device 133; sensing unit 140; output unit with display/first touch screen 151, audio output module 152, alarm 153 and haptic module 154; memory 160; interface 170; controller 180 with multimedia module 181.]
`
`

`

[FIG. 2 (Sheet 2 of 14): external appearance of the mobile terminal — first body 101, second body 102, touch screens 151a and 151b, user inputs 131 and 132, combining part 195; panels (a) and (b).]
`
`

`

[FIG. 3 (Sheet 3 of 14): external appearance of the mobile terminal — second body 102, combining part 195; panel (a).]
`
`

`

[FIGS. 4A and 4B (Sheet 4 of 14): external appearance of the mobile terminal — first body 101, second body 102, touch screens 151a and 151b, user input 131, combining part 195.]
`
`

`

[FIG. 5 (Sheet 5 of 14): flowchart of a method of providing a graphic user interface.
  S100: sense a change of a first angle between the first and second bodies to a second angle.
  Is the second angle maintained for a predetermined time?
  Is a GUI corresponding to the second angle set?
  Check which of the first and second touch screens is given priority.
  Is the touch screen having priority activated? If not, activate it.
  Provide the GUI corresponding to the second angle to the touch screen having priority.
  S170: activate or deactivate the touch screen having no priority, or maintain its original state, according to settings or a user's command.]
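The decision flow of FIG. 5 can be sketched as a small function. All names, the in-memory GUI table, and the hold-time threshold below are illustrative assumptions for clarity, not part of the patent's disclosure.

```python
# Illustrative sketch of the FIG. 5 decision flow (S100-S170).
# The hold time, table layout and names are assumptions, not the
# patent's implementation.

HOLD_TIME = 0.5  # assumed "predetermined time" in seconds

def on_angle_change(second_angle, held_for, gui_table, screens, priority):
    """Return (gui, priority_screen) per the FIG. 5 flow, or None if no GUI applies."""
    # Is the second angle maintained for the predetermined time?
    if held_for < HOLD_TIME:
        return None
    # Is a GUI corresponding to the second angle set?
    gui = gui_table.get(second_angle)
    if gui is None:
        return None
    # Check which touch screen is given priority for this angle.
    screen = screens[priority[second_angle]]
    # Activate the priority screen if it is not already active.
    if not screen["active"]:
        screen["active"] = True
    # Provide the GUI corresponding to the second angle to that screen.
    screen["gui"] = gui
    return gui, screen
```

A caller would feed this from the sensing unit whenever the fold angle settles at a new value; handling of the non-priority screen (S170) is left out for brevity.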
`
`

`

[FIG. 6 (Sheet 6 of 14): open status of the mobile terminal.]
`
`

`

[FIG. 7 (Sheet 7 of 14): exemplary image displayed on the mobile terminal — combining part 195.]
`
`

`

[FIG. 8 (Sheet 8 of 14): exemplary images displayed on the mobile terminal — combining part 195; panels (a) and (b).]
`
`

`

[FIGS. 9A and 9B (Sheet 9 of 14): exemplary images, including a file list (movie, photo and PDF files), displayed on the mobile terminal 100 — touch screens 151a and 151b, user input 131, combining part 195.]
`
`

`

[FIG. 10 (Sheet 10 of 14): change of the touch screen having priority according to the posture of the mobile terminal.]
`
`

`

[FIG. 11 (Sheet 11 of 14): change of the touch screen having priority according to the posture of the mobile terminal.]

[FIG. 12 (Sheet 11 of 14): cross-sectional view of sensor elements 11, 12, 21 and sensed objects according to the first embodiment — bodies 101-105.]
`
`

`

[FIGS. 13 and 14 (Sheet 12 of 14): cross-sectional views of sensor elements and sensed objects according to the first embodiment — bodies 101-105, combining part 195.]

[FIG. 15 (Sheet 12 of 14): cross-sectional view of a sensor element mounted on an FPCB — elements 103, 105.]
`
`

`

[FIGS. 16, 17 and 18 (Sheet 13 of 14): cross-sectional views of sensor elements and sensed objects according to the second embodiment — bodies 101-105, combining part 195, elements 11, 12, 22, 92, 94.]
`
`

`

[FIGS. 19A, 19B and 19C (Sheet 14 of 14): cross-sectional views of sensor elements and sensed objects according to the third embodiment — first body 101 with elements 31, 32, 33; second body 102 with elements 41, 42, 43.]
`
`

`

MOBILE TERMINAL AND METHOD OF PROVIDING GRAPHIC USER INTERFACE USING THE SAME

[0001] The present application claims priority to Korean Application Nos. 10-2009-0044915 filed on May 22, 2009 and 10-2009-0061270 filed on Jul. 6, 2009 in Korea, the entire contents of which are hereby incorporated by reference in their entirety.

BACKGROUND OF THE INVENTION

[0002] 1. Field of the Invention

[0003] The present invention relates to a mobile terminal, and more particularly, to a mobile terminal having a plurality of bodies and providing various graphic user interfaces according to the angle between the plurality of bodies, and a method of providing graphic user interfaces using the same.

[0004] 2. Discussion of the Related Art

[0005] Mobile terminals having various functions and shapes come to the market as mobile terminal technology makes rapid progress. However, the size of a mobile terminal is restricted for portability, and it is inconvenient for a user to operate the mobile terminal due to the restricted size of its display.

[0006] Accordingly, a variety of research for overcoming the restriction on the mobile terminal size has been performed recently, and mobile terminals having various body structures have come to the market.

SUMMARY OF THE INVENTION

[0007] Accordingly, one object of the present invention is to address the above-noted and other drawbacks of the related art.

[0008] Another object of the present invention is to provide a mobile terminal having a plurality of bodies and providing graphic user interfaces corresponding to postures of the mobile terminal, formed by the plurality of bodies, and a method of providing a graphic user interface using the same.

[0009] To accomplish the objects of the present invention, according to a first aspect of the present invention, there is provided a mobile terminal including a first body including a first touch screen; a second body including a second touch screen; a combining part combining the first body and the second body with each other such that the mobile terminal can be folded into the first and second bodies; a sensing unit sensing the posture of the mobile terminal, formed by the first and second bodies; a memory storing a plurality of graphic user interfaces including at least one object; and a controller configured to display a graphic user interface corresponding to the posture sensed by the sensing unit among the plurality of graphic user interfaces on the first or second touch screen.

[0010] To accomplish the objects of the present invention, according to a second aspect of the present invention, there is provided a mobile terminal including a first body including a first touch screen; a second body including a second touch screen; a combining part combining the first body and the second body with each other such that the mobile terminal can be folded into the first and second bodies; a sensing unit sensing the posture of the mobile terminal, formed by the first and second bodies; a memory storing a plurality of graphic user interfaces including at least one object; and a controller configured to display a graphic user interface corresponding to a posture change among the plurality of graphic user interfaces on the first or second touch screen when the posture sensed by the sensing unit is changed from a first posture to a second posture.

[0011] To accomplish the objects of the present invention, according to a third aspect of the present invention, there is provided a method of providing a graphic user interface in a mobile terminal having a first body and a second body combined with each other such that the mobile terminal can be folded into the first and second bodies, the method including sensing a change in a posture formed by the first and second bodies from a first posture to a second posture; and providing a graphic user interface including at least one object and corresponding to the second posture to at least one of a first touch screen included in the first body and a second touch screen included in the second body.

[0012] According to the mobile terminal and the method of providing a graphic user interface using the same, various graphic user interfaces can be provided according to shapes formed by the plurality of bodies and/or the posture of the mobile terminal, and thus a user can be provided with a required graphic user interface without performing an additional menu search operation.

BRIEF DESCRIPTION OF THE DRAWINGS

[0013] The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principle of the invention. In the drawings:

[0014] FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention;

[0015] FIGS. 2, 3, 4A and 4B illustrate the external appearance of the mobile terminal according to embodiments of the present invention;

[0016] FIG. 5 is a flowchart of a method of providing a graphic user interface according to an embodiment of the present invention;

[0017] FIG. 6 illustrates an open status of the mobile terminal in which the angle between first and second bodies 101 and 102 corresponds to a;

[0018] FIGS. 7 and 8 illustrate exemplary images displayed on the mobile terminal when an application corresponding to a specific menu selected in the state shown in FIG. 6 is executed;

[0019] FIGS. 9A and 9B illustrate exemplary images displayed on the mobile terminal when graphic user interfaces according to the angle between the first and second bodies 101 and 102 are provided;

[0020] FIGS. 10 and 11 illustrate a change of a touch screen having priority according to the posture of the mobile terminal;

[0021] FIGS. 12, 13 and 14 are cross-sectional views of sensor elements and sensed objects according to a first embodiment of the present invention;

[0022] FIG. 15 is a cross-sectional view of a sensor element mounted on an FPCB;

[0023] FIGS. 16, 17 and 18 are cross-sectional views of sensor elements and sensed objects according to a second embodiment of the present invention; and
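The first-aspect structure described above (two bodies with touch screens, a sensing unit reporting the posture they form, a memory of GUIs, and a controller choosing among them) can be summarized in a minimal sketch. The class and method names are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of the first-aspect structure: the sensing unit reports the
# posture formed by the two bodies, and the controller displays the stored
# GUI for that posture on a touch screen. All names are assumptions.

class SensingUnit:
    """Senses the posture formed by the first and second bodies."""
    def __init__(self):
        self.posture = "closed"

    def sense(self):
        return self.posture

class Controller:
    def __init__(self, sensing_unit, gui_memory):
        self.sensing_unit = sensing_unit
        self.gui_memory = gui_memory  # posture -> GUI, the "plurality of GUIs"

    def display(self, touch_screen):
        """Show the GUI matching the sensed posture on the given touch screen."""
        posture = self.sensing_unit.sense()
        gui = self.gui_memory.get(posture)
        if gui is not None:
            touch_screen["displayed"] = gui
        return gui
```

The second aspect differs only in triggering the lookup on a posture *change* rather than on every read of the current posture.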
`
`

`

[0024] FIGS. 19A, 19B and 19C are cross-sectional views of sensor elements and sensed objects according to a third embodiment of the present invention.
`
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0025] The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. The invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the invention to those skilled in the art.
`
[0026] Hereinafter, a mobile terminal relating to the present invention will be described below in more detail with reference to the accompanying drawings. In the following description, the suffixes "module" and "unit" are given to components of the mobile terminal only to facilitate description and do not have meanings or functions that distinguish them from each other.
`
[0027] The mobile terminal described in this specification can include a cellular phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation system and so on.
[0028] FIG. 1 is a block diagram of a mobile terminal 100 according to an embodiment of the present invention. As shown, the mobile terminal 100 may include a radio communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface 170, a controller 180, and a power supply 190.

[0029] The radio communication unit 110 may include at least one module that enables radio communication between the mobile terminal 100 and a radio communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the radio communication unit 110 includes a broadcasting receiving module 111, a mobile communication module 112, a wireless Internet module 113, a local area communication module 114 and a position information module 115.
[0030] The broadcasting receiving module 111 receives broadcasting signals and/or broadcasting related information from an external broadcasting management server through a broadcasting channel. Also, the broadcasting channel can include a satellite channel and a terrestrial channel, and the broadcasting management server can be a server that generates and transmits broadcasting signals and/or broadcasting related information, or a server that receives previously created broadcasting signals and/or broadcasting related information and transmits them to a terminal.
[0031] Further, the broadcasting signals can include not only TV broadcasting signals, radio broadcasting signals and data broadcasting signals, but also signals in the form of a combination of a TV broadcasting signal and a radio broadcasting signal. In addition, the broadcasting related information can be information on a broadcasting channel, a broadcasting program or a broadcasting service provider, and can be provided even through a mobile communication network. In the latter case, the broadcasting related information can be received by the mobile communication module 112.
`
[0032] Also, the broadcasting related information can exist in various forms. For example, the broadcasting related information can exist in the form of an electronic program guide (EPG) of the digital multimedia broadcasting (DMB) system or in the form of an electronic service guide (ESG) of the digital video broadcast-handheld (DVB-H) system.
[0033] In addition, the broadcasting receiving module 111 receives broadcasting signals using various broadcasting systems. Particularly, the broadcasting receiving module 111 can receive digital broadcasting signals using digital broadcasting systems such as the digital multimedia broadcasting-terrestrial (DMB-T) system, the digital multimedia broadcasting-satellite (DMB-S) system, the media forward link only (MediaFLO) system, the DVB-H and integrated services digital broadcast-terrestrial (ISDB-T) systems, etc. The broadcasting receiving module 111 can also be constructed to be suited to broadcasting systems providing broadcasting signals other than the above-described digital broadcasting systems.
[0034] Further, the broadcasting signals and/or broadcasting related information received through the broadcasting receiving module 111 can be stored in the memory 160. The mobile communication module 112 transmits/receives a radio signal to/from at least one of a base station, an external terminal and a server on a mobile communication network. The radio signal can include a voice call signal, a video telephony call signal or data in various forms according to transmission and reception of text/multimedia messages.
[0035] In addition, the wireless Internet module 113 corresponds to a module for wireless Internet access and can be included in the mobile terminal 100 or externally attached to the mobile terminal 100. A wireless LAN (WLAN) (Wi-Fi), wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA) and so on can be used as a wireless Internet technique.
[0036] Also, the local area communication module 114 corresponds to a module for local area communication. Further, Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB) and ZigBee can be used as local area communication techniques.
[0037] The position information module 115 confirms or obtains the position of the mobile terminal. The position information module 115 can obtain position information by using a global navigation satellite system (GNSS). The GNSS is a terminology describing radio navigation satellite systems that revolve round the earth and transmit reference signals to predetermined types of radio navigation receivers such that the radio navigation receivers can determine their positions on or near the earth's surface. The GNSS includes the global positioning system (GPS) of the United States, Galileo of Europe, the global orbiting navigational satellite system (GLONASS) of Russia, COMPASS of China, the quasi-zenith satellite system (QZSS) of Japan and so on.
[0038] In more detail, a global positioning system (GPS) module is a representative example of the position information module 115. In addition, the GPS module 115 can calculate information on distances between one point or object and at least three satellites and information on the time when the distance information is measured, and apply trigonometry to the obtained distance information to obtain three-dimensional position information on the point or object according to the latitude, longitude and altitude at a predetermined time.
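The trigonometry step described in [0038] can be illustrated with a simplified two-dimensional trilateration: given known anchor positions and measured distances, subtracting one circle equation from the others yields a linear system for the receiver position. This is a didactic reduction, assumed for illustration; a real GPS fix works in three dimensions and also estimates the receiver clock bias.

```python
# Illustrative 2-D trilateration: from (x - xi)^2 + (y - yi)^2 = di^2,
# pairwise subtraction gives a linear system in (x, y):
#   2(x2-x1)x + 2(y2-y1)y = d1^2 - d2^2 + x2^2 - x1^2 + y2^2 - y1^2
#   2(x3-x1)x + 2(y3-y1)y = d1^2 - d3^2 + x3^2 - x1^2 + y3^2 - y1^2

def trilaterate_2d(anchors, distances):
    """Solve for (x, y) given three anchor points and distances to them."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With four or more anchors in three dimensions, the same linearization is solved by least squares with the clock bias as a fourth unknown.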
`
`

`

[0039] Furthermore, a method of calculating position and time information using three satellites and correcting the calculated position and time information using another satellite can also be used. In addition, the GPS module 115 continuously calculates the current position in real time and calculates velocity information using the position information.

[0040] Referring to FIG. 1, the A/V input unit 120 is used to input an audio signal or a video signal and includes a camera 121 and a microphone 122. The camera 121 processes image frames of still images or moving images obtained by an image sensor in a video telephony mode or a photographing mode. Further, the processed image frames can be displayed on a display unit 151.

[0041] Also, the image frames processed by the camera 121 can be stored in the memory 160 or transmitted to an external device through the radio communication unit 110. The mobile terminal 100 can also include at least two cameras.

[0042] The microphone 122 receives an external audio signal in a call mode, a recording mode or a speech recognition mode and processes the received audio signal into electric audio data. The audio data can then be converted into a form that can be transmitted to a mobile communication base station through the mobile communication module 112 and output in the call mode. Further, the microphone 122 can employ various noise removal algorithms for removing noise generated when the external audio signal is received.

[0043] In addition, the user input unit 130 receives input data for controlling the operation of the terminal from a user. The user input unit 130 can include a keypad, a dome switch, a touch pad (constant voltage/capacitance), a jog wheel, a jog switch and so on.

[0044] Also, the sensing unit 140 senses the current state of the mobile terminal 100, such as an open/close state of the mobile terminal 100, the position of the mobile terminal 100, whether a user touches the mobile terminal 100, the direction of the mobile terminal 100 and the acceleration/deceleration of the mobile terminal 100, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, the sensing unit 140 can sense whether a slide phone is opened or closed when the mobile terminal 100 is the slide phone. Furthermore, the sensing unit 140 can sense whether the power supply 190 supplies power and whether the interface 170 is connected to an external device. The sensing unit 140 can also include a proximity sensor.

[0045] In addition, the output unit 150 generates visual, auditory or tactile output and can include the display unit 151, an audio output module 152, an alarm 153 and a haptic module 154. Further, the display unit 151 displays information processed by the mobile terminal 100. For example, the display unit 151 displays a user interface (UI) or graphic user interface (GUI) related to a telephone call when the mobile terminal is in the call mode. The display unit 151 also displays a captured and/or received image, UI or GUI when the mobile terminal 100 is in the video telephony mode or the photographing mode.

[0046] In addition, the display unit 151 can include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light-emitting diode display, a flexible display and a three-dimensional display. Further, some of these displays can be of a transparent type or a light transmission type. That is, the display unit 151 can include a transparent display.

[0047] In more detail, the transparent display includes a transparent liquid crystal display. Further, the rear structure of the display unit 151 can also be of the light transmission type. Accordingly, a user can see an object located behind the body of the mobile terminal 100 through the transparent area of the body of the mobile terminal 100, which is occupied by the display unit 151.

[0048] The mobile terminal 100 can also include at least two display units 151. For example, the mobile terminal 100 can include a plurality of displays that are arranged on a single face at a predetermined distance, or integrated displays. The plurality of displays can also be arranged on different sides.

[0049] In addition, when the display unit 151 and a sensor sensing touch (referred to as a touch sensor hereinafter) form a layered structure, which is referred to as a touch screen hereinafter, the display unit 151 can be used as an input device in addition to an output device. The touch sensor can be in the form of a touch film, a touch sheet or a touch pad, for example.

[0050] Further, the touch sensor can be constructed to convert a variation in pressure applied to a specific portion of the display unit 151, or a variation in capacitance generated at a specific portion of the display unit 151, into an electric input signal. The touch sensor can also be constructed to sense the pressure of touch as well as the position and area of the touch.

[0051] Also, when the user applies touch input to the touch sensor, a signal corresponding to the touch input is transmitted to a touch controller. The touch controller then processes the signal and transmits data corresponding to the processed signal to the controller 180. Accordingly, the controller 180 can detect a touched portion of the display 151.

[0052] Referring to FIG. 1, the proximity sensor of the sensing unit 140 can be located in an internal region of the mobile terminal, surrounded by the touch screen, or near the touch screen. The proximity sensor senses an object approaching a predetermined sensing face or an object located near the proximity sensor using an electromagnetic force or infrared rays without mechanical contact. Further, the proximity sensor has a lifetime longer than that of a contact sensor and thus has wide application in the mobile terminal 100.

[0053] In addition, the proximity sensor includes a transmission type photo-electric sensor, a direct reflection type photo-electric sensor, a mirror reflection type photo-electric sensor, a high-frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, etc. Further, a capacitive touch screen is constructed such that proximity of a pointer is detected through a variation in an electric field according to the proximity of the pointer. In this instance, the touch screen (touch sensor) can be classified as a proximity sensor.

[0054] For convenience of explanation, the action of the pointer approaching the touch screen without actually touching the touch screen is referred to as a "proximity touch" and the action of bringing the pointer into contact with the touch screen is referred to as a "contact touch" in the following description. In addition, the proximity touch point of the pointer on the touch screen corresponds to a point of the touch screen to which the pointer touches the touch screen.

[0055] Further, the proximity sensor senses the proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch velocity, a proximity touch time, a proximity touch position, a proximity touch moving state, etc.). Information corresponding to the sensed proximity touch action and proximity touch pattern can then be displayed on the touch screen.
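The touch signal path in [0050]-[0051] (a pressure or capacitance variation becomes an input signal, the touch controller turns it into data, and the controller 180 detects the touched portion) can be sketched as follows. The threshold value, data layout and class names are assumptions for illustration, not the patent's implementation.

```python
# Sketch of the [0050]-[0051] signal path: raw variation -> touch controller
# -> main controller (180 in FIG. 1). All names/values are assumptions.

TOUCH_THRESHOLD = 0.3  # assumed minimum capacitance variation counted as a touch

class TouchController:
    """Processes the raw sensor signal into position data."""
    def process(self, signal):
        x, y, delta_c = signal  # position and capacitance variation
        if delta_c < TOUCH_THRESHOLD:
            return None  # variation too small: no touch event
        return {"x": x, "y": y, "pressure": delta_c}

class MainController:
    """Receives processed data and records the touched portion of the display."""
    def __init__(self):
        self.touched = None

    def on_touch_data(self, data):
        if data is not None:
            self.touched = (data["x"], data["y"])
        return self.touched
```

In a layered touch screen, the same path would run per scan cycle, with the main controller mapping the touched portion onto the displayed GUI object.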
`
`

`

`
[0056] Also, the audio output module 152 can output audio data received from the radio communication unit 110 or stored in the memory 160 in a call signal receiving mode, a telephone call mode, a recording mode, a speech recognition mode and a broadcasting receiving mode. Further, the audio output module 152 outputs audio signals related to functions (for example, a call signal incoming tone, a message incoming tone, etc.) performed in the mobile terminal 100. The audio output module 152 can include a receiver, a speaker, a buzzer, etc. The audio output module 152 can also output sounds through an earphone jack; the user can hear the sounds by connecting an earphone to the earphone jack.
[0057] In addition, the alarm 153 outputs a signal for indicating the generation of an event of the mobile terminal 100. For example, alarms can be generated when receiving a call signal, receiving a message, inputting a key signal, inputting a touch, etc. The alarm 153 can also output signals in forms different from video signals or audio signals, for example, a signal indicating the generation of an event through vibration. The video signals or the audio signals can also be output through the display unit 151 or the audio output module 152.
[0058] Also, the haptic module 154 generates various haptic effects that the user can feel. One representative example of the haptic effects is vibration. The intensity and pattern of vibration generated by the haptic module 154 can also be controlled. For example, different vibrations can be combined and output, or can be sequentially output.

[0059] Further, the haptic module 154 can generate a variety of haptic effects, including an effect of stimulus according to an arrangement of pins vertically moving against a contact skin surface, an effect of stimulus according to a jet force or sucking force of air through a jet hole or a sucking hole, an effect of stimulus of rubbing the skin, an effect of stimulus according to contact of an electrode, an effect of stimulus using an electrostatic force, and an effect according to a reproduction of cold and warmth using an element capable of absorbing or radiating heat, in addition to vibrations.
[0060] The haptic module 154 can not only transmit haptic effects through direct contact but also allow the user to feel haptic effects through a kinesthetic sense of the user's fingers or arms. The mobile terminal 100 can also include multiple haptic modules 154.

[0061] In addition, the memory 160 can store a program for the operation of the controller 180 and temporarily store input/output data (for example, a phone book, messages, still images, moving images, etc.). The memory 160 can also store data about vibrations and sounds in various patterns, which are output when a touch input is applied to the touch screen.
`
[0062] Further, the memory 160 can include at least one of a flash memory, a hard disk type memory, a multimedia card micro type memory, a card type memory (for example, SD or XD memory), a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), a programmable ROM (PROM), a magnetic memory, a magnetic disk and an optical disk. The mobile terminal 100 can also operate in relation to a web storage that performs the storing function of the memory 160 on the Internet.
[0063] The interface 170 serves as a path to external devices connected to the mobile terminal 100. Further, the interface 170 receives data or power from the external devices and transmits the data or power to the internal components of the mobile terminal 100, or transmits data of the mobile terminal 100 to the external devices. Also, the interface 170 can include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having a user identification module, an audio I/O port, a video I/O port, an earphone port, etc., for example.
[0064] In addition, the interface 170 can also interface with a user identification module, that is, a chip that stores information for authenticating the authority to use the mobile terminal 100. For example, the user identification module can be a user identity module (UIM), a subscriber identity module (SIM) or a universal subscriber identity module (USIM). An identification device including the user identification module can also be manufactured in the form of a smart card. Accordingly, the identification device can be connected to the mobile terminal 100 through a port of the interface 170.
[0065] The interface 170 can also be a path through which power from an external cradle is provided to the mobile terminal 100 when the mobile terminal 100 is connected to the external cradle, or a path through which various command signals input by the user through the cradle are transmitted to the mobile terminal 100. The various command signals or power input from the cradle can be used as signals for confirming whether the mobile terminal is correctly set in the cradle.
`
`In addition, the controller 180 controls the overall
`[0066]
`operations ofthe mobile terminal. For example, the controller
`180 performs control and processing for voice communica-
`tion, data communication and video telephony. As shown in
`FIG.1, the controller 180 also includes a multimedia module
`181 for playing multimedia. Also, the multimedia module
`181 can be includedin the controller 180 as shown in FIG. 1
`or can be separated from the controller 180.
[0067] Further, the controller 180 can perform a pattern recognition process capable of recognizing handwriting input or picture-drawing input applied to the touch screen as characters or images. In addition, the power supply 190 receives external power and internal power and provides the power required for the operations of the components of the mobile terminal under the control of the controller 180.
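The pattern recognition process of paragraph [0067] can be illustrated with a minimal sketch: a touch trace is reduced to a sequence of stroke directions and matched against stored templates. The direction encoding and template table below are assumptions for illustration; a real recognizer would be far more elaborate.

```python
# Toy handwriting recognizer: collapse touch coordinates into dominant
# direction changes (screen y grows downward), then look the code
# sequence up in a template table. All templates are hypothetical.

DIRECTIONS = {(0, -1): "U", (0, 1): "D", (-1, 0): "L", (1, 0): "R"}

def encode_stroke(points):
    """Reduce a touch trace to its sequence of direction changes."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        key = (0 if dx == 0 else dx // abs(dx),
               0 if dy == 0 else dy // abs(dy))
        code = DIRECTIONS.get(key)  # diagonal moves are ignored here
        if code and (not codes or codes[-1] != code):
            codes.append(code)
    return "".join(codes)

# Direction-code sequence -> recognized character (illustrative only).
TEMPLATES = {"DR": "L", "R": "-", "D": "|"}

def recognize(points):
    """Return the matched character, or '?' for an unknown stroke."""
    return TEMPLATES.get(encode_stroke(points), "?")
```

For example, a trace that moves straight down and then right encodes to "DR" and is recognized as the character "L".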
[0068] According to a hardware implementation, the embodiments of the present invention can be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors and electrical units for executing functions. In some cases, the embodiments can be implemented by the controller 180.
[0069] According to a software implementation, embodiments such as procedures or functions can be implemented with a separate software module executing at least one function or operation. Software codes can be implemented according to a software application written in an appropriate software language. Furthermore, the software codes can be stored in the memory 160 and executed by the controller 180.
[0070] Next, embodiments of the present invention will be explained.
[0071] FIGS. 2, 3, 4A and 4B illustrate the external appearance of the mobile terminal 100 according to the embodiments of the present invention, which will be explained later. The mobile terminal 100 includes a first body 101 and a second body 102, which are combined with each other through a combining part 195.
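The abstract's core idea, that the posture formed by the plurality of bodies selects the graphic user interface, can be sketched as follows. The angle thresholds, posture names, and GUI labels are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch: sensor readings yield a relative angle between the
# first body 101 and the second body 102; the angle maps to a named
# posture, and each posture selects a different GUI. All names and
# thresholds below are hypothetical.

def classify_posture(angle_deg: float) -> str:
    """Map the angle between the two bodies to a posture name."""
    if angle_deg < 10:
        return "closed"
    if angle_deg < 170:
        return "half-open"
    return "fully-open"

# Posture -> GUI to display (illustrative labels only).
GUI_FOR_POSTURE = {
    "closed": "idle-screen GUI",
    "half-open": "split-view GUI",
    "fully-open": "full-screen GUI",
}

def gui_for(angle_deg: float) -> str:
    """Select the GUI for the posture implied by the body angle."""
    return GUI_FOR_POSTURE[classify_posture(angle_deg)]
```

Under this reading, the sensors named in paragraph [0079] below would supply the angle, and the controller 180 would switch GUIs when the posture changes.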
[0079] The sensing unit 140 can include at least one of a hall sensor, 3-axis or 6-axis motion sensor, terrestrial magnetic sensor and acceleration sensor in order to sense the position or direction o