`(12) Patent Application Publication (10) Pub. No.: US 2010/0298032 A1
`Lee et al.
(43) Pub. Date: Nov. 25, 2010
`
`
(54) MOBILE TERMINAL AND METHOD OF PROVIDING GRAPHIC USER INTERFACE USING THE SAME
`
(75) Inventors: Khwanhee Lee, Seoul (KR); Woochul Song, Seoul (KR)
`
`Correspondence Address:
`LEE, HONG, DEGERMAN, KANG & WAIMEY
660 S. FIGUEROA STREET, Suite 2300
`LOS ANGELES, CA 90017 (US)
`
(73) Assignee: LG Electronics Inc.

(21) Appl. No.: 12/569,355

(22) Filed: Sep. 29, 2009

(30) Foreign Application Priority Data
`
`May 22, 2009 (KR) ........................ 10-2009-0044915
`Jul. 6, 2009 (KR) ........................ 10-2009-0061270
`Publication Classification
`
(51) Int. Cl.
H04M 1/00 (2006.01)
G06F 3/00 (2006.01)
G06F 3/041 (2006.01)
(52) U.S. Cl. .......................... 455/566; 715/700; 345/173

(57) ABSTRACT
`Provided are a mobile terminal and a method of providing a
`graphic user interface using the same. The mobile terminal
`includes a plurality of bodies. Various graphic user interfaces
`are provided according to a posture of the mobile terminal,
`which is formed by the plurality of bodies of the mobile
`terminal.
`
`
`
`195
`
`(a)
`
`(b)
`
`Exhibit 1013
`Page 01 of 25
`
`
`
[FIG. 1: Block diagram of the mobile terminal 100: radio communication unit 110 (broadcast receiving module 111, mobile communication module 112, wireless Internet module 113, local area communication module 114, location information module 115); A/V input unit 120 (camera 121, microphone 122); user input unit 130 (second touch device 133); sensing unit 140; output unit 150 (first touch device 151, audio output module 152, alarm 153, haptic module 154); controller 180 (multimedia module 181); memory 160; interface 170; power supply 190.]
`
`
`
[FIG. 2: External appearance of the mobile terminal 100 in views (a) and (b), showing first and second bodies 101 and 102, first and second touch screens 151a and 151b, and reference numerals 131, 132 and 195.]
`
`
`
[FIG. 3: Views (a) and (b) of the mobile terminal, showing first and second bodies 101 and 102 and reference numeral 195.]
`
`
`
[FIGS. 4A and 4B: Views of the mobile terminal 100, showing first and second bodies 101 and 102, first and second touch screens 151a and 151b, reference numeral 195, and section line A-A.]
`
`
`
[FIG. 5: Flowchart of the method of providing a graphic user interface:
S100 Sense change of a first angle between the first and second bodies to a second angle
S110 Is the second angle maintained for a predetermined time?
S120 Is a GUI corresponding to the second angle set?
S130 Check the one of the first and second touch screens to which priority is given
S140 Is the touch screen having priority activated?
S150 Activate the touch screen having priority
S160 Provide the GUI corresponding to the second angle to the touch screen having priority
S170 Activate or inactivate the touch screen having no priority, or maintain its original state, according to settings or a user's command]
`
`
`
[FIG. 6: Open status of the mobile terminal in which the angle between the first and second bodies corresponds to a; reference numeral 195.]
`
`
`
[FIG. 7: Exemplary image, view (b), displayed on the mobile terminal; reference numeral 195.]
`
`
`
[FIG. 8: Exemplary images, views (a) and (b), displayed on the mobile terminal; reference numeral 195.]
`
`
`
[FIG. 9B: Exemplary GUI displaying a file list (A movie.avi, B movie.avi, C movie.mpg, D movie.mpg, E photo.jpg, F photo.jpg, G photo.gif); reference numerals 40, 40a, 40b, 40c and 40d.]
`
`
`
[FIG. 10: Change of a touch screen having priority according to the posture of the mobile terminal.]
`
`
`
[FIG. 11: Change of a touch screen having priority according to the posture of the mobile terminal.]
[FIG. 12: Cross-sectional view of sensor elements and sensed objects (first embodiment); reference numerals 11, 21, 101 and 103.]
`
`
`
[FIG. 13: Cross-sectional view of sensor elements and sensed objects (first embodiment); reference numerals 12, 22, 102, 104 and 105.]
[FIG. 14: Cross-sectional view of sensor elements and sensed objects (first embodiment); reference numerals 11, 12, 21, 22, 101, 102, 103, 104 and 195.]
[FIG. 15: Cross-sectional view of a sensor element mounted on an FPCB; reference numerals 21, 103 and 105.]
`
`
`
[FIG. 16: Cross-sectional view of sensor elements and sensed objects (second embodiment); reference numerals 12, 22, 102, 104 and 105.]
[FIG. 18: Cross-sectional view of sensor elements and sensed objects (second embodiment); reference numerals 11, 12, 21, 22, 101, 102, 103, 104, 105 and 195.]
`
`
`
[FIGS. 19A, 19B and 19C: Cross-sectional views of sensor elements and sensed objects (third embodiment); reference numerals 31, 32, 33 and 41, 42, 43 on bodies 101 and 102.]
`
`
`
`
MOBILE TERMINAL AND METHOD OF PROVIDING GRAPHIC USER INTERFACE USING THE SAME
`
0001 The present application claims priority to Korean Application Nos. 10-2009-0044915 filed on May 22, 2009 and 10-2009-0061270 filed on Jul. 6, 2009 in Korea, the entire contents of which are hereby incorporated by reference.
`
`BACKGROUND OF THE INVENTION
`0002 1. Field of the Invention
0003 The present invention relates to a mobile terminal,
`and more particularly, to a mobile terminal having a plurality
`of bodies and providing various graphic user interfaces
`according to the angle between the plurality of bodies and a
`method of providing graphic user interfaces using the same.
`0004 2. Discussion of the Related Art
0005 Mobile terminals having various functions and shapes have come to the market as mobile terminal technology makes rapid progress. However, the size of a mobile terminal is restricted for portability, and the restricted size of the display makes it inconvenient for a user to operate the terminal.
0006 Accordingly, a variety of research aimed at easing the restriction on mobile terminal size has recently been performed, and mobile terminals having various body structures have come to the market.
`
`SUMMARY OF THE INVENTION
`0007 Accordingly, one object of the present invention is
`to address the above-noted and other drawbacks of the related
`art.
`0008 Another object of the present invention is to provide
`a mobile terminal having a plurality of bodies and providing
`graphic user interfaces corresponding to postures of the
`mobile terminal, formed by the plurality of bodies, and a
`method of providing a graphic user interface using the same.
0009 To accomplish the objects of the present invention,
`according to a first aspect of the present invention, there is
`provided a mobile terminal including a first body including a
`first touch screen; a second body including a second touch
`screen; a combining part combining the first body and the
`second body with each other such that the mobile terminal can
`be folded into the first and second bodies; a sensing unit
`sensing the posture of the mobile terminal, formed by the first
`and second bodies; a memory storing a plurality of graphic
`user interfaces including at least one object; and a controller
`configured to display a graphic user interface corresponding
`to the posture sensed by the sensing unit among the plurality
`of graphic user interfaces on the first or second touch screen.
0010 To accomplish the objects of the present invention,
`according to a second aspect of the present invention, there is
`provided a mobile terminal including a first body including a
`first touch screen; a second body including a second touch
`screen; a combining part combining the first body and the
`second body with each other such that the mobile terminal can
`be folded into the first and second bodies; a sensing unit
`sensing the posture of the mobile terminal, formed by the first
`and second bodies; a memory storing a plurality of graphic
`user interfaces including at least one object; and a controller
configured to display a graphic user interface corresponding to a posture change among the plurality of graphic user interfaces on the first or second touch screen when the posture
`sensed by the sensing unit is changed from a first posture to a
`second posture.
`0011 To accomplish the objects of the present invention,
`according to a third aspect of the present invention, there is
`provided a method of providing a graphic user interface in a
mobile terminal having a first body and a second body combined with each other such that the mobile terminal can be
`folded into the first and second bodies, the method including
`sensing a change in a posture formed by the first and second
`bodies from a first posture to a second posture; and providing
`a graphic user interface including at least one object and
`corresponding to the second posture to at least one of a first
`touch screen included in the first body and a second touch
`screen included in the second body.
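
As an illustration only (the application does not prescribe any implementation), the following Kotlin sketch shows one way the method of this third aspect, and the flow of FIG. 5, could be realized: a settled posture is mapped to a stored GUI, which is then provided to whichever touch screen has priority. All names here (Posture, Gui, TouchScreen, GuiController) are hypothetical.

    // Hypothetical sketch of the FIG. 5 flow; not taken from the application.
    enum class Posture { CLOSED, HALF_OPEN, FULLY_OPEN }

    data class Gui(val name: String, val objects: List<String>)

    class TouchScreen(val id: String) {
        var active = false
        var gui: Gui? = null
    }

    class GuiController(
        private val first: TouchScreen,
        private val second: TouchScreen,
        private val guisByPosture: Map<Posture, Gui>,        // memory 160 analogue
        private val priorityByPosture: Map<Posture, String>  // assumed priority setting
    ) {
        // Called once a posture change (S100) has been held for the
        // predetermined time (S110).
        fun onPostureSettled(posture: Posture) {
            val gui = guisByPosture[posture] ?: return       // S120: no GUI set for posture
            val target =                                     // S130: screen with priority
                if (priorityByPosture[posture] == first.id) first else second
            if (!target.active) target.active = true         // S140/S150: activate if needed
            target.gui = gui                                 // S160: provide the GUI
            // S170: the non-priority screen is left in its original state here;
            // settings or a user command could activate or deactivate it instead.
        }
    }

    fun main() {
        val first = TouchScreen("first")
        val second = TouchScreen("second")
        val controller = GuiController(
            first, second,
            guisByPosture = mapOf(Posture.FULLY_OPEN to Gui("file browser", listOf("list", "icons"))),
            priorityByPosture = mapOf(Posture.FULLY_OPEN to "first")
        )
        controller.onPostureSettled(Posture.FULLY_OPEN)
        println("${first.id}: active=${first.active}, gui=${first.gui?.name}")
    }

The map-driven lookup mirrors the claimed structure of a memory storing a plurality of graphic user interfaces from which the controller selects by sensed posture.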
0012 According to the mobile terminal and the method of
`providing a graphic user interface using the same, various
`graphic user interfaces can be provided according to shapes
`formed by the plurality of bodies and/or the posture of the
`mobile terminal, and thus a user can be provided with a
required graphic user interface without performing an additional menu search operation.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
0013 The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principle of the invention. In the drawings:
0014 FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention;
0015 FIGS. 2, 3, 4A and 4B illustrate the external appearance of the mobile terminal according to embodiments of the present invention;
0016 FIG. 5 is a flowchart of a method of providing a graphic user interface according to an embodiment of the present invention;
0017 FIG. 6 illustrates an open status of the mobile terminal in which the angle between first and second bodies 101 and 102 corresponds to a;
0018 FIGS. 7 and 8 illustrate exemplary images displayed on the mobile terminal when an application corresponding to a specific menu selected in the state shown in FIG. 6 is executed;
0019 FIGS. 9A and 9B illustrate exemplary images displayed on the mobile terminal when graphic user interfaces according to the angle between the first and second bodies 101 and 102 are provided;
0020 FIGS. 10 and 11 illustrate a change of a touch screen having priority according to the posture of the mobile terminal;
0021 FIGS. 12, 13 and 14 are cross-sectional views of sensor elements and sensed objects according to a first embodiment of the present invention;
0022 FIG. 15 is a cross-sectional view of a sensor element mounted on an FPCB;
0023 FIGS. 16, 17 and 18 are cross-sectional views of sensor elements and sensed objects according to a second embodiment of the present invention; and
`
`
`0024 FIGS. 19A, 19B and 19C are cross-sectional views
`of sensor elements and sensed objects according to a third
`embodiment of the present invention.
`
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
0025 The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. The invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the invention to those skilled in the art.
0026 Hereinafter, a mobile terminal relating to the present invention will be described below in more detail with reference to the accompanying drawings. In the following description, the suffixes "module" and "unit" are given to components of the mobile terminal only to facilitate the description and do not have meanings or functions discriminated from each other.
0027 The mobile terminal described in the specification can include a cellular phone, a smart phone, a laptop computer, a digital broadcasting terminal, personal digital assistants (PDA), a portable multimedia player (PMP), a navigation system and so on.
0028 FIG. 1 is a block diagram of a mobile terminal 100 according to an embodiment of the present invention. As shown, the mobile terminal 100 may include a radio communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface 170, a controller 180, and a power supply 190.
0029 The radio communication unit 110 may include at least one module that enables radio communication between the mobile terminal 100 and a radio communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the radio communication unit 110 includes a broadcasting receiving module 111, a mobile communication module 112, a wireless Internet module 113, a local area communication module 114 and a position information module 115.
0030 The broadcasting receiving module 111 receives broadcasting signals and/or broadcasting related information from an external broadcasting management server through a broadcasting channel. Also, the broadcasting channel can include a satellite channel and a terrestrial channel, and the broadcasting management server can be a server that generates and transmits broadcasting signals and/or broadcasting related information, or a server that receives previously created broadcasting signals and/or broadcasting related information and transmits them to a terminal.
0031 Further, the broadcasting signals can include not only TV broadcasting signals, radio broadcasting signals and data broadcasting signals, but also signals in the form of a combination of a TV broadcasting signal and a radio broadcasting signal. In addition, the broadcasting related information can be information on a broadcasting channel, a broadcasting program or a broadcasting service provider, and can be provided even through a mobile communication network. In the latter case, the broadcasting related information can be received by the mobile communication module 112.
`
0032 Also, the broadcasting related information can exist in various forms. For example, the broadcasting related information can exist in the form of an electronic program guide (EPG) of the digital multimedia broadcasting (DMB) system or in the form of an electronic service guide (ESG) of the digital video broadcast-handheld (DVB-H) system.
0033 In addition, the broadcasting receiving module 111 receives broadcasting signals using various broadcasting systems. Particularly, the broadcasting receiving module 111 can receive digital broadcasting signals using digital broadcasting systems such as the digital multimedia broadcasting-terrestrial (DMB-T) system, the digital multimedia broadcasting-satellite (DMB-S) system, the media forward link only (MediaFLO) system, the DVB-H and integrated services digital broadcast-terrestrial (ISDB-T) systems, etc. The broadcasting receiving module 111 can also be constructed to be suited to broadcasting systems providing broadcasting signals other than the above-described digital broadcasting systems.
0034 Further, the broadcasting signals and/or broadcasting related information received through the broadcasting receiving module 111 can be stored in the memory 160. The mobile communication module 112 transmits/receives a radio signal to/from at least one of a base station, an external terminal and a server on a mobile communication network. The radio signal can include a voice call signal, a video telephony call signal or data in various forms according to transmission and reception of text/multimedia messages.
0035 In addition, the wireless Internet module 113 corresponds to a module for wireless Internet access and can be included in the mobile terminal 100 or externally attached to the mobile terminal 100. Wireless LAN (WLAN) (Wi-Fi), wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA) and so on can be used as a wireless Internet technique.
0036 Also, the local area communication module 114 corresponds to a module for local area communication. Further, Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB) and ZigBee can be used as a local area communication technique.
0037 The position information module 115 confirms or obtains the position of the mobile terminal. The position information module 115 can obtain position information by using a global navigation satellite system (GNSS). The GNSS is a terminology describing radio navigation satellite systems that revolve around the earth and transmit reference signals to predetermined types of radio navigation receivers such that the radio navigation receivers can determine their positions on the earth's surface or near the earth's surface. The GNSS includes the global positioning system (GPS) of the United States, Galileo of Europe, the global orbiting navigational satellite system (GLONASS) of Russia, COMPASS of China, the quasi-zenith satellite system (QZSS) of Japan and so on.
0038 In more detail, a global positioning system (GPS) module is a representative example of the position information module 115. In addition, the GPS module 115 can calculate information on the distances between one point or object and at least three satellites and information on the time when the distance information is measured, and apply trigonometry to the obtained distance information to obtain three-dimensional position information on the point or object according to latitude, longitude and altitude at a predetermined time.
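
To make the trigonometric step concrete, the following Kotlin sketch works the simplified two-dimensional version of this computation with exactly known ranges; a real receiver solves the three-dimensional problem and also estimates its clock bias, which is why a fourth satellite is used in practice (see paragraph 0039). Subtracting the range equation of the first satellite from those of the other two leaves two linear equations in the unknown position, solved here by Cramer's rule. All names are illustrative.

    import kotlin.math.hypot

    data class Point(val x: Double, val y: Double)

    // Position fix from three known reference positions and measured ranges.
    fun trilaterate(p1: Point, d1: Double, p2: Point, d2: Double, p3: Point, d3: Double): Point {
        val a1 = 2 * (p2.x - p1.x); val b1 = 2 * (p2.y - p1.y)
        val c1 = d1 * d1 - d2 * d2 + p2.x * p2.x - p1.x * p1.x + p2.y * p2.y - p1.y * p1.y
        val a2 = 2 * (p3.x - p1.x); val b2 = 2 * (p3.y - p1.y)
        val c2 = d1 * d1 - d3 * d3 + p3.x * p3.x - p1.x * p1.x + p3.y * p3.y - p1.y * p1.y
        val det = a1 * b2 - a2 * b1
        require(det != 0.0) { "reference points must not be collinear" }
        return Point((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
    }

    fun main() {
        val s1 = Point(0.0, 0.0); val s2 = Point(10.0, 0.0); val s3 = Point(0.0, 10.0)
        val truth = Point(3.0, 4.0)  // receiver actually here; ranges derived from it
        fun range(s: Point) = hypot(s.x - truth.x, s.y - truth.y)
        val fix = trilaterate(s1, range(s1), s2, range(s2), s3, range(s3))
        println("estimated position: (${fix.x}, ${fix.y})")  // prints ~ (3.0, 4.0)
    }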
`
`
0039 Furthermore, a method of calculating position and time information using three satellites and correcting the calculated position and time information using another satellite can also be used. In addition, the GPS module 115 continuously calculates the current position in real time and calculates velocity information using the position information.
0040 Referring to FIG. 1, the A/V input unit 120 is used to input an audio signal or a video signal and includes a camera 121 and a microphone 122. The camera 121 processes image frames of still images or moving images obtained by an image sensor in a video telephony mode or a photographing mode. Further, the processed image frames can be displayed on a display unit 151.
0041 Also, the image frames processed by the camera 121 can be stored in the memory 160 or transmitted to an external device through the radio communication unit 110. The mobile terminal 100 can also include at least two cameras.
0042 The microphone 122 receives an external audio signal in a call mode, a recording mode or a speech recognition mode and processes the received audio signal into electric audio data. The audio data can then be converted into a form that can be transmitted to a mobile communication base station through the mobile communication module 112 and output in the call mode. Further, the microphone 122 can employ various noise removal algorithms for removing noise generated when the external audio signal is received.
0043 In addition, the user input unit 130 receives input data for controlling the operation of the terminal from a user. The user input unit 130 can include a keypad, a dome switch, a touch pad (constant voltage/capacitance), a jog wheel, a jog switch and so on.
0044 Also, the sensing unit 140 senses the current state of the mobile terminal 100, such as an open/close state of the mobile terminal 100, the position of the mobile terminal 100, whether a user touches the mobile terminal 100, the direction of the mobile terminal 100 and the acceleration/deceleration of the mobile terminal 100, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, the sensing unit 140 can sense whether a slide phone is opened or closed when the mobile terminal 100 is the slide phone. Furthermore, the sensing unit 140 can sense whether the power supply 190 supplies power and whether the interface 170 is connected to an external device. The sensing unit 140 can also include a proximity sensor.
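
One detail the later flowchart relies on (FIG. 5, S110) is that a newly sensed angle must be maintained for a predetermined time before the GUI changes. A minimal Kotlin sketch of such a debounce, assuming angle samples already quantized to the preset angles, might look as follows; the names and units are assumptions, not part of the application.

    // Hypothetical debounce for "is the second angle maintained for a
    // predetermined time?" (FIG. 5, S110).
    class AngleDebouncer(
        private val holdMillis: Long,
        private val onSettled: (Int) -> Unit
    ) {
        private var candidate: Int? = null
        private var since = 0L
        private var reported = false

        // Feed periodic angle samples (degrees) with a timestamp.
        fun sample(angleDeg: Int, nowMillis: Long) {
            if (angleDeg != candidate) {           // angle changed: restart the hold timer
                candidate = angleDeg
                since = nowMillis
                reported = false
            } else if (!reported && nowMillis - since >= holdMillis) {
                reported = true                    // report each settled angle only once
                onSettled(angleDeg)
            }
        }
    }

    fun main() {
        val debouncer = AngleDebouncer(holdMillis = 500) { println("settled at $it degrees") }
        debouncer.sample(45, 0); debouncer.sample(45, 300); debouncer.sample(45, 600)  // prints once
        debouncer.sample(180, 700); debouncer.sample(180, 900)  // not yet held long enough
    }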
0045 In addition, the output unit 150 generates visual, auditory or tactile output and can include the display unit 151, an audio output module 152, an alarm 153 and a haptic module 154. Further, the display unit 151 displays information processed by the mobile terminal 100. For example, the display unit 151 displays a user interface (UI) or graphic user interface (GUI) related to a telephone call when the mobile terminal is in the call mode. The display unit 151 also displays a captured and/or received image, UI or GUI when the mobile terminal 100 is in the video telephony mode or the photographing mode.
0046 In addition, the display unit 151 can include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light-emitting diode display, a flexible display and a three-dimensional display. Further, some of these displays can be of a transparent type or a light transmission type. That is, the display unit 151 can include a transparent display.
0047 In more detail, the transparent display includes a transparent liquid crystal display. Further, the rear structure of the display unit 151 can also be of the light transmission type. Accordingly, a user can see an object located behind the body of the mobile terminal 100 through the transparent area of the body of the mobile terminal 100, which is occupied by the display unit 151.
`0048. The mobile terminal 100 can also include at least
`two display units 151. For example, the mobile terminal 100
`can include a plurality of displays that are arranged on a single
`face at a predetermined distance or integrated displays. The
`plurality of displays can also be arranged on different sides.
`0049. In addition, when the display unit 151 and a sensor
`sensing touch (referred to as a touch sensor hereinafter) form
`a layered structure, which is referred to as a touch screen
`hereinafter, the display unit 151 can be used as an input device
`in addition to an output device. The touch sensor can be in the
`form of a touch film, a touch sheet and a touch pad, for
`example.
0050 Further, the touch sensor can be constructed to convert a variation in pressure applied to a specific portion of the display unit 151, or a variation in capacitance generated at a specific portion of the display unit 151, into an electric input signal. The touch sensor can also be constructed to sense the pressure of a touch as well as the position and area of the touch.
0051 Also, when the user applies a touch input to the touch sensor, a signal corresponding to the touch input is transmitted to a touch controller. The touch controller then processes the signal and transmits data corresponding to the processed signal to the controller 180. Accordingly, the controller 180 can detect a touched portion of the display 151.
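
The signal path described here (touch sensor to touch controller to controller 180) can be pictured with a short Kotlin sketch. The data formats below are invented for illustration; the application does not define them.

    // Hypothetical data formats for the touch path of paragraph 0051.
    data class RawTouch(val sensorCell: Int, val pressure: Double)
    data class TouchEvent(val x: Int, val y: Int, val pressure: Double)

    // Touch controller analogue: converts a raw sensor reading into coordinates.
    class TouchIc(private val columns: Int, private val deliver: (TouchEvent) -> Unit) {
        fun onRawSignal(raw: RawTouch) {
            val x = raw.sensorCell % columns   // cell index -> column
            val y = raw.sensorCell / columns   // cell index -> row
            deliver(TouchEvent(x, y, raw.pressure))
        }
    }

    // Controller 180 analogue: detects the touched portion.
    class MainController {
        fun onTouch(e: TouchEvent) = println("touched (${e.x}, ${e.y}), pressure=${e.pressure}")
    }

    fun main() {
        val controller = MainController()
        val touchIc = TouchIc(columns = 320, deliver = controller::onTouch)
        touchIc.onRawSignal(RawTouch(sensorCell = 5 * 320 + 12, pressure = 0.7))  // -> (12, 5)
    }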
`0.052
`Referring to FIG. 1, the proximity sensor of the
`sensing unit 140 can be located in an internal region of the
`mobile terminal, surrounded by the touch screen, or near the
`touch screen. The proximity sensor senses an object
`approaching a predetermined sensing face or an object
`located near the proximity sensor using an electromagnetic
`force or infrared rays without having mechanical contact.
`Further, the proximity sensor has lifetime longer than that of
`a contact sensor and thus has a wide application in the mobile
`terminal 100.
0053 In addition, the proximity sensor includes a transmission type photo-electric sensor, a direct reflection type photo-electric sensor, a mirror reflection type photo-electric sensor, a high-frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, etc. Further, a capacitive touch screen is constructed such that the proximity of a pointer is detected through a variation in an electric field according to the proximity of the pointer. In this instance, the touch screen (touch sensor) can be classified as a proximity sensor.
0054 For convenience of explanation, the action of the pointer approaching the touch screen without actually touching the touch screen is referred to as "proximity touch" and the action of bringing the pointer into contact with the touch screen is referred to as "contact touch" in the following description. In addition, the proximity touch point of the pointer on the touch screen corresponds to the point of the touch screen to which the pointer corresponds perpendicularly when the pointer proximity-touches the touch screen.
0055 Further, the proximity sensor senses the proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch velocity, a proximity touch time, a proximity touch position, a proximity touch moving state, etc.). Information corresponding to the sensed proximity touch action and proximity touch pattern can then be displayed on the touch screen.
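
Part of this proximity touch pattern (direction, velocity, and whether the pointer is approaching) can be derived from two successive proximity samples, as the following Kotlin sketch shows. The sample fields and units are assumptions for illustration.

    import kotlin.math.atan2
    import kotlin.math.hypot

    data class ProximitySample(val x: Double, val y: Double, val distance: Double, val tMillis: Long)
    data class ProximityPattern(val directionDeg: Double, val speed: Double, val approaching: Boolean)

    fun pattern(a: ProximitySample, b: ProximitySample): ProximityPattern {
        val dt = (b.tMillis - a.tMillis) / 1000.0
        require(dt > 0) { "samples must be time-ordered" }
        val dx = b.x - a.x
        val dy = b.y - a.y
        return ProximityPattern(
            directionDeg = Math.toDegrees(atan2(dy, dx)),  // movement direction over the screen
            speed = hypot(dx, dy) / dt,                    // proximity touch velocity (units/s)
            approaching = b.distance < a.distance          // pointer getting closer to the screen
        )
    }

    fun main() {
        val p = pattern(
            ProximitySample(0.0, 0.0, distance = 8.0, tMillis = 0),
            ProximitySample(3.0, 4.0, distance = 5.0, tMillis = 100)
        )
        println(p)  // direction ~53.1 degrees, speed 50.0 units/s, approaching=true
    }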
`
`
0056 Also, the audio output module 152 can output audio data received from the radio communication unit 110 or stored in the memory 160 in a call signal receiving mode, a telephone call mode, a recording mode, a speech recognition mode and a broadcasting receiving mode. Further, the audio output module 152 outputs audio signals related to functions (for example, a call signal incoming tone, a message incoming tone, etc.) performed in the mobile terminal 100. The audio output module 152 can include a receiver, a speaker, a buzzer, etc. The audio output module 152 can also output sounds through an earphone jack. The user can hear the sounds by connecting an earphone to the earphone jack.
0057 In addition, the alarm 153 outputs a signal for indicating the generation of an event of the mobile terminal 100. For example, alarms can be generated when receiving a call signal, receiving a message, inputting a key signal, inputting a touch, etc. The alarm 153 can also output signals in forms different from video signals or audio signals, for example, a signal indicating the generation of an event through vibration. The video signals or the audio signals can also be output through the display unit 151 or the audio output module 152.
0058 Also, the haptic module 154 generates various haptic effects that the user can feel. One representative example of the haptic effects is vibration. The intensity and pattern of vibration generated by the haptic module 154 can also be controlled. For example, different vibrations can be combined and output or can be sequentially output.
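
The two composition modes just mentioned, combining vibrations and outputting them sequentially, can be sketched in Kotlin over a simple amplitude-array encoding (an assumption made here for illustration; the application does not specify how vibration patterns are represented).

    // Sample-wise combination, clamped to an assumed 8-bit amplitude range.
    fun combined(a: IntArray, b: IntArray): IntArray =
        IntArray(maxOf(a.size, b.size)) { i ->
            (a.getOrElse(i) { 0 } + b.getOrElse(i) { 0 }).coerceAtMost(255)
        }

    // Sequential output: play one pattern, then the other.
    fun sequential(a: IntArray, b: IntArray): IntArray = a + b

    fun main() {
        val shortBuzz = intArrayOf(200, 200, 0, 0)
        val ramp = intArrayOf(50, 100, 150, 200, 250)
        println(combined(shortBuzz, ramp).joinToString())   // 250, 255, 150, 200, 250
        println(sequential(shortBuzz, ramp).joinToString()) // 200, 200, 0, 0, 50, ...
    }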
0059 Further, the haptic module 154 can generate a variety of haptic effects, including an effect of stimulus according to an arrangement of pins vertically moving against a contact skin surface, an effect of stimulus according to a jet force or sucking force of air through a jet hole or a sucking hole, an effect of stimulus of rubbing the skin, an effect of stimulus according to contact of an electrode, an effect of stimulus using an electrostatic force, and an effect according to a reproduction of cold and warmth using an element capable of absorbing or radiating heat, in addition to vibrations.
0060 The haptic module 154 can not only transmit haptic effects through direct contact but also allow the user to feel haptic effects through a kinesthetic sense of the user's fingers or arms. The mobile terminal 100 can also include multiple haptic modules 154.
0061 In addition, the memory 160 can store a program for the operation of the controller 180 and temporarily store input/output data (for example, a phone book, messages, still images, moving images, etc.). The memory 160 can also store data about vibrations and sounds in various patterns, which are output when a touch input is applied to the touch screen.
0062 Further, the memory 160 can include at least one of a flash memory, a hard disk type memory, a multimedia card micro type memory, a card type memory (for example, SD or XD memory), a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), a programmable ROM (PROM), a magnetic memory, a magnetic disk and an optical disk. The mobile terminal 100 can also operate in relation to a web storage performing the storing function of the memory 160 on the Internet.
0063 The interface 170 serves as a path to external devices connected to the mobile terminal 100. Further, the interface 170 receives data or power from the external devices and transmits the data or power to the internal components of the mobile terminal 100, or transmits data of the mobile terminal 100 to the external devices. Also, the interface 170 can include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having a user identification module, an audio I/O port, a video I/O port, an earphone port, etc.
0064 In addition, the interface 170 can also interface with a user identification module that is a chip that stores information for authenticating the authority to use the mobile terminal 100. For example, the user identification module can be a user identify module (UIM), a subscriber identify module (SIM) and a universal subscriber identify module (USIM). An identification device including the user identification module can also be manufactured in the form of a smart card. Accordingly, the identification device can be connected to the mobile terminal 100 through a port of the interface 170.
0065 The interface 170 can also be a path through which power from an external cradle is provided to the mobile terminal 100 when the mobile terminal 100 is connected to the external cradle, or a path through which various command signals input by the user through the cradle are transmitted to the mobile terminal 100. The various command signals or power input from the cradle can be used as signals for confirming whether the mobile terminal is correctly set in the cradle.
0066 In addition, the controller 180 controls the overall operations of the mobile terminal. For example, the controller 180 performs control and processing for voice communication, data communication and video telephony. As shown in FIG. 1, the controller 180 also includes a multimedia module 181 for playing multimedia. The multimedia module 181 can be included in the controller 180 as shown in FIG. 1 or can be separated from the controller 180.
0067 Further, the controller 180 can perform a pattern recognition process capable of recognizing handwriting input or picture-drawing input applied to the touch screen as characters or images. In addition, the power supply 190 receives external power and internal power and provides power required for the operations of the components of the mobile terminal under the control of the controller 180.
0068 According to hardware implementation, the embodiments of the present invention can be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs