US007952566B2

(12) United States Patent
     Poupyrev et al.

(10) Patent No.:      US 7,952,566 B2
(45) Date of Patent:  May 31, 2011

(54) APPARATUS AND METHOD FOR TOUCH SCREEN INTERACTION BASED ON TACTILE
     FEEDBACK AND PRESSURE MEASUREMENT

(75) Inventors: Ivan Poupyrev, Tokyo (JP); Shigeaki Maruyama, Kanagawa (JP)

(73) Assignee: Sony Corporation, Tokyo (JP)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted
      under 35 U.S.C. 154(b) by 763 days.

(21) Appl. No.: 11/331,703

(22) Filed: Jul. 31, 2007

(65) Prior Publication Data
     US 2008/0024459 A1        Jan. 31, 2008

(30) Foreign Application Priority Data
     Jul. 31, 2006   (JP) ................................ 2006-208047

(51) Int. Cl.
     G06F 3/041 (2006.01)
(52) U.S. Cl. ........................................ 345/173; 715/701
(58) Field of Classification Search ............... 345/173-178;
     178/18.01-18.11, 19.01-19.07, 20.01-20.04; 715/700-702
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS
       7,683,889 B2 *    3/2010  Rimas Ribikauskas et al. ........ 345/173
     2001/0035854 A1 *  11/2001  Rosenberg et al. ................ 345/156
     2004/0021643 A1 *   2/2004  Hoshino et al. .................. 345/173
     2006/0132457 A1 *   6/2006  Rimas-Ribikauskas et al. ........ 345/173
     2006/0146039 A1 *   7/2006  Prados et al. ................... 345/173

     FOREIGN PATENT DOCUMENTS
     JP     2003-016502          1/2003
     WO     WO 0154109 A1 *      7/2001

     * Cited by examiner

Primary Examiner — Amare Mengistu
Assistant Examiner — Dmitriy Bolotin
(74) Attorney, Agent, or Firm — Oblon, Spivak, McClelland, Maier & Neustadt, L.L.P.
(57)  ABSTRACT

An apparatus includes a display section with a touch screen, the touch screen being adapted to display at least one graphical user interface object and to detect a touch position on the touch screen. The apparatus has a haptic feedback generating unit attached to the touch screen and adapted to generate haptic feedback. A pressure sensing unit is attached to the touch screen and adapted to detect pressure applied to the touch screen. A controller section is adapted to control and drive the display section. The graphical user interface object displayed on the touch screen has a plurality of logical states. The controller section determines a current logical state of the graphical user interface object and a form of the haptic feedback to be generated depending on the detected touch position.

18 Claims, 11 Drawing Sheets
FIG. 1 (Sheet 1 of 11; also the representative front-page drawing): block diagram of the apparatus, with the display/input hardware 10 (visual display, haptic feedback generating unit, 2D position sensing unit, pressure sensing unit), the controller 20 (signal generation unit, data storage holding tactile waveshapes, display controller, tactile interface controller, Graphical User Interface controller, 2D position sensing unit controller, pressure sensing unit controller) and the application section 30.

FIG. 2 (Sheet 2 of 11): state diagram of conventional touch-screen interaction: start, touch down, drag, hold, lift off (inside or outside the GUI object), stop.

FIG. 3 (Sheet 3 of 11): illustration of a finger sliding over GUI objects 203a and 203b on a device 200.

FIG. 4 (Sheet 4 of 11): flow chart: track the user finger/pen position on the screen; while the finger/pen is inside the GUI element, provide appropriate tactile feedback depending on the position, the applied pressure, the state of the GUI elements and the application; when a pressing event is recognized, provide tactile feedback for the pressing event, then activate the GUI element and send the appropriate events.

FIG. 5 (Sheet 5 of 11): explanatory illustrations (a)-(c) of the user interface method for cases where a pressing event is recognized.

FIG. 6 (Sheet 6 of 11): flow chart: the user slides a finger into the GUI element with pressure p > p1; track the finger/pen position; while inside the GUI element, provide appropriate tactile feedback depending on the position, the applied pressure, the state of the GUI elements and the application; when the applied pressure exceeds a preset threshold, activate the GUI element, send the appropriate events and perform other appropriate actions.

FIG. 7 (Sheet 7 of 11): schematic examples (a)-(d) of a hotspot in a GUI object.

FIG. 8 (Sheet 8 of 11): flow chart: the user slides a finger into the GUI element with pressure p0, which is remembered; track the finger/pen position and pressure; when the pressure changes by more than a threshold value, recognize an actuate event, provide tactile feedback for the actuation event, activate the GUI element and send the appropriate events; otherwise remember the current pressure and provide appropriate tactile feedback depending on the position, the applied pressure, the state of the GUI elements and the application.

FIG. 9 (Sheet 9 of 11): flow chart: the user slides a finger into the GUI element and the entry position is remembered; track the finger/pen position and pressure; when the position changes by more than a threshold value, recognize an actuate event and activate the GUI element, sending the appropriate events; otherwise remember the current position and provide appropriate tactile feedback depending on the position, the applied pressure, the state of the GUI elements and the application.

FIG. 10 (Sheet 10 of 11): illustrations (a)-(d) of the user interface method for a slider-type GUI object.

FIG. 11 (Sheet 11 of 11): graph of the pressure applied by the user versus time during a pressing event, with a confirmation threshold marked.
APPARATUS AND METHOD FOR TOUCH SCREEN INTERACTION BASED ON TACTILE FEEDBACK AND PRESSURE MEASUREMENT

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a method of a user interface utilizing a touch screen and tactile feedback, and an apparatus that employs such a user interface method.

2. Discussion of the Related Art
Japanese Patent Application Publication No. 2003-016502 discloses an example of such a user interface system for detecting a position of a user's finger or a pointing device on the touch screen of a display device. In this user interface system, tactile feedback is provided by vibrating the touch screen when the user touches one of the graphical user interface objects displayed on the touch panel. A functionality assigned to the selected graphical user interface object is actuated when the user releases or lifts the finger or pointing device off the touch screen.

Japanese Patent Application Publication No. 2005-190290 discloses another example of a user interface system capable of providing tactile feedback when a user touches a touch screen. In this user interface system, the initial tactile feedback is provided when the user first touches the touch panel, and a different tactile feedback is provided when the touch position is moved to a region of the touch screen where a graphical user interface object is displayed. A function assigned to the selected graphical user interface object is actuated when the user lifts the finger or pointing device off the screen or presses for a longer period of time. The actuation of the selected graphical user interface object is notified to the user in the form of tactile feedback, a color change of the graphical user interface object, a sound, or a combination thereof.

Minsky, M., "Manipulating simulated objects with real-world gestures using force and position sensitive screen," Proceedings of SIGGRAPH '84, 1984, ACM, pp. 195-203, discloses still another example of a user interface system in which a pressure sensor is added to a touch screen for detecting pressure applied to the touch screen, allowing more flexibility in the user interface operation.

SUMMARY OF THE INVENTION
It is desirable to provide tactile notification when a user touches a user interface element on a touch screen without executing the functionality of the user interface element. Furthermore, it is desirable to provide tactile notification to the user when the functionality of the user interface element is executed.

Furthermore, it is desirable to provide a method of user interface utilizing a touch screen display device capable of providing tactile feedback and measuring pressure applied to the touch screen, thereby allowing a user to have interactive operations similar to ones with physical operation means, such as pressing buttons or keys. Further, it is also desirable to provide an apparatus that employs such a user interface method.

The present invention is made in view of the foregoing issues.
In an embodiment of the present invention, there is provided an apparatus including a display section with a touch screen. The touch screen is configured to display at least one graphical user interface object and detect a touch position on the touch screen. The touch position is input with a user's finger or a pointing device. The apparatus includes: a haptic feedback generating unit attached to the touch screen and generating haptic feedback; a pressure sensing unit attached to the touch screen and detecting pressure applied to the touch screen; and a controller section configured to control and drive the display section. The graphical user interface object displayed on the touch screen has a plurality of logical states. The controller section determines a current logical state of the graphical user interface object using a history of detected touch positions and a history of detected pressure values. The controller section determines a form of the haptic feedback to be generated depending on (i) the detected touch position, (ii) the detected pressure value and (iii) the determined current logical state of the graphical user interface object.
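As an illustration only, the following sketch shows one way a controller section of this kind might map those three inputs to a feedback form. The class names, the state names and the feedback parameters are assumptions made for the example and are not taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical feedback forms; the patent only requires that the form differ
# with position, pressure and logical state, not these particular parameters.
@dataclass
class Feedback:
    waveform: str       # e.g. "sine" or "square"
    amplitude: float    # 0.0 .. 1.0 drive amplitude
    frequency_hz: float

@dataclass
class GuiObject:
    x: float
    y: float
    width: float
    height: float
    state: str = "neutral"   # "neutral" | "selected" | "actuated"

    def contains(self, x: float, y: float) -> bool:
        return (self.x <= x < self.x + self.width
                and self.y <= y < self.y + self.height)

def choose_feedback(obj: GuiObject, touch_xy: tuple[float, float],
                    pressure: float) -> Feedback | None:
    """Pick a feedback form from (i) touch position, (ii) pressure, (iii) logical state."""
    if not obj.contains(*touch_xy):
        return None                                   # no feedback outside the object
    if obj.state == "actuated":
        return Feedback("square", 1.0, 250.0)         # strong confirmation burst
    if obj.state == "selected":
        # continuous feedback that tracks the applied pressure while selected
        return Feedback("sine", min(1.0, 0.2 + pressure), 150.0)
    return Feedback("sine", 0.2, 100.0)               # light cue when merely touched

if __name__ == "__main__":
    button = GuiObject(x=10, y=10, width=80, height=30, state="selected")
    print(choose_feedback(button, (20, 20), pressure=0.4))
```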
In another embodiment of the present invention, the haptic feedback generating unit may generate different tactile feedback for different logical states of the GUI object.

In another embodiment of the present invention, the logical states of the GUI object may include at least a selected state and an actuated state. The controller section may determine that the graphical user interface object is in the actuated state if a pressing event is recognized. The controller section may recognize the pressing event using a history of the detected pressure value. Alternatively, the controller section may determine that the GUI object is in the actuated state if: (i) the touch position is inside of the GUI object; and (ii) the detected pressure is more than a preset actuation threshold value. In another example, the controller section may determine that the GUI object is in the actuated state if: (i) the touch position is inside of the GUI object; and (ii) a history of the detected pressure satisfies a preset actuation condition. In the present embodiment, the logical state of the GUI object is allowed to change to the actuated state only after the selected state.
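A minimal sketch of this selected-to-actuated transition rule follows. The threshold value, the pressure-history window and the sharp-rise test are illustrative assumptions rather than values specified by the patent.

```python
ACTUATION_THRESHOLD = 0.6   # assumed units: normalized pressure 0..1

def next_state(state: str, inside: bool, pressure: float, history: list[float]) -> str:
    """Advance a GUI object's logical state ("neutral" -> "selected" -> "actuated").

    The object may only become actuated from the selected state, either because
    the instantaneous pressure exceeds a preset threshold or because the recent
    pressure history looks like a deliberate press (a sharp rise).
    """
    if not inside:
        return "neutral"                      # finger/pen left the object
    if state == "neutral":
        return "selected"                     # touching inside selects, nothing executes yet
    if state == "selected":
        pressed = pressure > ACTUATION_THRESHOLD
        sharp_rise = len(history) >= 2 and (history[-1] - history[0]) > 0.3
        if pressed or sharp_rise:
            return "actuated"                 # associated command may now be executed
    return state

if __name__ == "__main__":
    state, history = "neutral", []
    for p in (0.1, 0.2, 0.7):                 # finger slides in, then presses harder
        history.append(p)
        state = next_state(state, inside=True, pressure=p, history=history[-5:])
        print(p, state)
```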
In another embodiment of the present invention, the haptic feedback generating unit may include a single piezoelectric element or a plurality of piezoelectric elements. At least one of the piezoelectric elements may be used both for generating the haptic feedback and for detecting the pressure applied by the user. Alternatively, the at least one of the piezoelectric elements may generate the haptic feedback and detect the pressure in a time-sharing manner.

In another embodiment of the present invention, the haptic feedback is controlled in frequency, in amplitude, or in both amplitude and frequency simultaneously.

In another embodiment of the present invention, the haptic feedback generating unit may generate a continuous haptic feedback as long as the touch position is inside of the GUI object. Further, the continuous tactile feedback is changed in response to a change of the pressure applied to the touch screen. The change of the continuous tactile feedback depends on the current logical state of the graphical user interface object.
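One plausible reading of such pressure-dependent continuous feedback, sketched below with assumed base frequencies and an assumed amplitude mapping, is to recompute the drive waveform on every update so that the applied pressure sets the amplitude and the logical state sets the frequency.

```python
import math

def drive_sample(t: float, pressure: float, state: str) -> float:
    """Return one sample of a continuous haptic drive signal at time t (seconds).

    Pressure modulates amplitude; the logical state shifts the base frequency,
    so the same touch feels different in the selected and actuated states.
    """
    base_hz = 120.0 if state == "selected" else 240.0     # assumed per-state frequencies
    amplitude = min(1.0, 0.2 + 0.8 * pressure)            # heavier press -> stronger feedback
    return amplitude * math.sin(2.0 * math.pi * base_hz * t)

if __name__ == "__main__":
    # 1 kHz update loop while the touch position stays inside the GUI object.
    for i in range(5):
        t = i / 1000.0
        print(round(drive_sample(t, pressure=0.5, state="selected"), 3))
```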
In another embodiment of the present invention, the haptic feedback generating unit may generate a single burst of the haptic feedback when the touch position crosses over a hotspot predefined within the GUI object. Alternatively, the haptic feedback generating unit may generate a single burst of the tactile feedback when the touch position or the detected pressure changes by more than a preset threshold value.
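The burst condition can be pictured as an edge detector over successive samples. In the sketch below the hotspot is modeled, purely for illustration, as a vertical line inside the object, and the threshold values are arbitrary assumptions.

```python
import math

def crossed_hotspot(prev_x: float, curr_x: float, hotspot_x: float) -> bool:
    """True when the touch moved across a (here: vertical-line) hotspot inside the object."""
    return (prev_x - hotspot_x) * (curr_x - hotspot_x) < 0

def burst_needed(prev_xy, curr_xy, prev_p, curr_p,
                 hotspot_x, pos_delta=5.0, p_delta=0.2) -> bool:
    """Emit one haptic burst on a hotspot crossing or a large enough position/pressure change."""
    moved = math.dist(prev_xy, curr_xy) > pos_delta
    return (crossed_hotspot(prev_xy[0], curr_xy[0], hotspot_x)
            or moved
            or abs(curr_p - prev_p) > p_delta)

if __name__ == "__main__":
    print(burst_needed((10.0, 5.0), (14.0, 5.0), 0.30, 0.35, hotspot_x=12.0))  # True: crossing
```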
In another embodiment of the present invention, the GUI object may be formed with a plurality of sub-elements, and the haptic feedback generating unit may generate different tactile feedback for different sub-elements thereof.

In another embodiment of the present invention, the controller section may determine that the GUI object is in the activated state by using a plurality of pressure thresholds.

In another embodiment of the present invention, the controller section may differentiate a stronger push and a lighter push based on a noise level of a signal output from the touch screen or a circuitry thereof, the stronger push corresponding to the pressing event, the lighter push corresponding to sliding of the user's finger or pointing device.

In another embodiment of the present invention, the display section may generate visual feedback in correlation with the haptic feedback.
In another embodiment of the present invention, a graphical user interface method for a touch screen is provided. The method includes: displaying a graphical user interface object on the touch screen, the graphical user interface object having a plurality of logical states; detecting a touch position on the touch screen, at which a user's finger or a pointing device is touching; detecting pressure applied on the touch screen when the touch position is detected; and generating haptic feedback in response to the touching, a form of the haptic feedback being determined depending on (i) the detected touch position, (ii) the detected pressure value and (iii) a current logical state of the GUI object. The current logical state of the GUI object is determined by using a history of detected touch positions and a history of detected pressure values.
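Read as pseudocode, these steps amount to a per-sample loop of roughly the following shape. The hooks passed into the function stand in for the controller section, since the patent does not define a concrete programming interface.

```python
def run_method(samples, update_state, feedback_form, play_haptic):
    """One reading of the claimed steps, driven by a stream of (x, y, pressure) samples.

    update_state, feedback_form and play_haptic are caller-supplied hooks standing
    in for the controller section; they are assumptions, not an API from the patent.
    """
    positions, pressures = [], []
    for x, y, p in samples:
        positions.append((x, y))          # history of detected touch positions
        pressures.append(p)               # history of detected pressure values
        state = update_state(positions, pressures)        # current logical state
        play_haptic(feedback_form((x, y), p, state))      # form depends on (i)-(iii)

if __name__ == "__main__":
    run_method(
        samples=[(12, 14, 0.1), (13, 14, 0.4), (13, 15, 0.8)],
        update_state=lambda pos, prs: "actuated" if prs[-1] > 0.6 else "selected",
        feedback_form=lambda xy, p, s: f"{s}: amplitude={min(1.0, round(p, 2))}",
        play_haptic=print,
    )
```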
`
In the embodiments of the present invention, the form of the haptic feedback is determined depending on the touch position, the pressure applied by the user and the current logical state of the graphical user interface object. Accordingly, various forms of the haptic feedback may be provided for different logical states of the graphical user interface object, making it easy for the user to know the current state of the graphical user interface object.
ADVANTAGES OF THE INVENTION

The present invention makes it possible to provide tactile notification when a user touches a user interface element on a touch screen without executing the functionality of the user interface element, and tactile notification to the user when the functionality of the user interface element is executed.

Furthermore, according to the present invention, a method of user interface utilizing a touch screen display device capable of providing tactile feedback and measuring pressure applied to the touch screen is provided. The method allows a user to have interactive operations similar to ones with physical operation means. Further, according to the present invention, an apparatus that employs such a user interface method is provided.
BRIEF DESCRIPTION OF THE DRAWINGS

A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
`FIG. 1 is a block diagram showing an example of an appa-
`ratus configuration according to an embodiment of the
`present invention;
`FIG. 2 is a schematic diagram showing an example of
`interaction with touch screens of prior art;
`FIG. 3 is an explanatory illustration of a user interface
`method according to an embodiment of the present invention
`for a case where a user finger slides over a GUI object;
`FIG. 4 is a flow chart showing steps of a user interface
method according to an embodiment of the present invention;
`FIG. 5(a) is an explanatory illustration of a user interface
`method according to another embodiment of the present
`invention for a case where a pressing event is recognized;
`FIG. 5(b) is an explanatory illustration of a user interface
`method according to another embodiment of the present
`invention for a case where a pressing event is recognized;
`FIG. 5(c) is an explanatory illustration of a user interface
`method according to another embodiment of the present
`invention for a case where a pressing event is recognized;
`FIG. 6 is a flow chart showing steps of a user interface
`method according to another embodiment of the present
`invention;
FIG. 7(a) is a schematic diagram showing an example of a hotspot in a GUI object;
FIG. 7(b) is a schematic diagram showing an example of a hotspot in a GUI object;
FIG. 7(c) is a schematic diagram showing an example of a hotspot in a GUI object;
FIG. 7(d) is a schematic diagram showing an example of a hotspot in a GUI object;
`FIG. 8 is a flow chart showing steps of a user interface
`method according to still another embodiment of the present
`invention;
`FIG. 9 is a flow chart showing steps of a user interface
`method according to another embodiment of the present
`invention;
`FIG. 10(a) is an explanatory illustration of a user interface
`method according to an embodiment of the present invention
`for a slider-type GUI object;
`FIG. 10(b) is an explanatory illustration of a user interface
`method according to an embodiment of the present invention
`for a slider-type GUI object;
`FIG. 10(c) is an explanatory illustration of a user interface
`method according to an embodiment of the present invention
`for a slider-type GUI object;
`FIG. 10(d) is an explanatory illustration of a user interface
`method according to an embodiment of the present invention
`for a slider-type GUI object; and
`FIG. 11 is a graph showing changes of pressure applied by
`a user with time during a pressing event of a user interface
`method according to another embodiment of the present
`invention.
`
DESCRIPTION OF PREFERRED EMBODIMENTS

Embodiments of the present invention will be described below with reference to the accompanying figures. In the following description, some terminology is used to describe certain characteristics of the present invention.

The term "touch screen" refers to a transparent screen-type position sensing device capable of detecting a touch position on the screen surface, at which a user's finger or any pointing device is touching.

The term "logical states of a graphical user interface object" means distinct states of a graphical user interface object, by which different corresponding operations or processing are triggered. The logical states include at least a selected state, which indicates that the graphical user interface object is selected by a user but none of the corresponding operations or processing is triggered, and an actuated state, in which the corresponding operations or processing is performed.
`FIG. 1 shows an example of an apparatus to which a user
`interface method according to an embodiment of the present
`invention is applied. The apparatus 1 includes a display/input
`section 10, a controller section 20 and an application section
`30.
`
`The display/input section 10 displays on a touch screen
`thereof images of buttons, keys, switches or any other
Graphic User Interface (GUI) objects to prompt a user 2 to interact with the apparatus 1. The display/input section 10 further detects a touch position of a user's finger or a pointing device on the screen and pressure applied when the finger or pointing device touches the screen. The display/input section 10 further provides different types of tactile feedback in response to the user's input operation.

It should be noted, in this specification, that the words "tactile" and "haptic" indicate the same sensory effect and are used interchangeably.

The control section 20 dynamically correlates: (i) the touch position on the screen or a GUI object selected by the user's input operation; (ii) the pressure applied on the screen by the user's input operation; and (iii) a current logical state of the selected GUI object; with the type of tactile feedback to be presented to the user 2.

The application section 30 performs various operations or functions in response to the user's input operation detected by the display/input section 10. The application section 30 may include various applications and software units or hardware.

(1) Display/Input Section

The display/input section 10 includes a haptic feedback generating unit 102, a visual display unit 103, a two-dimensional (2D) position sensing unit 104 and a pressure sensing unit 105.

The visual display unit 103 presents visual information to the user 2. Such visual information may include various predefined GUI objects that the user can interact with, such as images of buttons, sliders, drawings, scroll bars, hyperlinks and so on. The visual display unit 103 may be formed with any type of display as long as it can be used with the tactile feedback generating unit 102, the 2D position sensing unit 104 and the pressure sensing unit 105. For example, a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display or the like may be employed as the visual display unit 103.

The haptic feedback generating unit 102 may be formed with piezoelectric bimorph actuators with a single or multiple layer structure. Examples of such actuators for generating the tactile feedback are disclosed in Japanese Patent Application Publication No. 2006-48302. Alternatively, various types of mechanical, electrical or electromagnetic actuators/motors may be employed to generate the tactile feedback depending on the size/mass of the display and/or available power.

The pressure sensing unit 105 allows measurement of the pressure applied to the touch screen by the user's input operation. In the present embodiment, various types of pressure sensing units may be employed as long as such devices can measure the pressure of the user's touch with a predetermined resolution and be incorporated in the display/input section 10 with the other units 102-104. For example, force sensitive circuit elements such as strain gauges or pressure sensitive resistors may be used to sense the force which the touch screen member exerts on each support of the touch screen when finger pressure is applied to the member.

Alternatively, the piezoelectric actuators may be used to measure the pressure applied to the touch screen. For example, the piezoelectric actuators may be connected with a driver circuit and a detector circuit so as to use some of the actuators for generating the tactile feedback and the others for measuring the pressure applied thereon, respectively. An example of such a pressure sensing unit formed with the piezoelectric actuators is disclosed in Japanese Patent Application Publication No. 2006-48302. Alternatively, the driving of the actuators and the measuring of the pressure may be performed in a time-sharing manner. More specifically, a single actuator may be used with a switching circuit for measuring pressure and generating the feedback.

The 2D position sensing unit 104 detects where the user is touching on the touch screen. Any type of touch screen or touch panel technology may be used as the 2D position sensing unit 104 as long as the touch screen/panel can measure the two-dimensional position of the user's finger or pointing device. For example, a resistive touch screen, a capacitive touch screen, a surface acoustic wave touch screen, or the like may be used.

(2) Controller Section

The controller section 20 drives and controls the sub-sections of the display/input section 10 in response to the user's input operation detected by the display/input section 10. The controller section 20 controls the display/input section 10 to change the tactile feedback depending on the position and pressure of the user's touch on the screen and the current logical state of the GUI object, attempting to simulate the interactive operations with physical interface objects. Accordingly, the apparatus of the present embodiment allows the user to easily and intuitively perform input operations even without the physical user interface objects.

The control section 20 and the application section 30 may be embodied with a computer (not shown in the figure), which may include a CPU, a memory, an external data storage, and an input/output interface. Various functions performed by sub-sections of the control section 20 and the application section 30 may be realized by executing corresponding software installed in the computer, or by adding dedicated circuitry or hardware to the computer. The application section 30 may include any application software or hardware that may be controlled in response to the user's input operations detected by the display section 20.

The software may be installed into the computer via a recording medium or a carrier signal. The software may also be installed by downloading from a server on a network or the Internet through a wired or wireless connection.

The controller section 20 includes a signal generating unit 106, a display controller 107, a two-dimensional (2D) position sensing unit controller 108, a pressure sensing unit controller 109, a data storage 110, a tactile interface controller 111, and a graphical user interface (GUI) controller 112.

The signal generating unit 106 generates and provides a signal to the tactile feedback generating unit 102 for driving the tactile feedback generating elements or the piezoelectric actuators. The signal may be a voltage function of time, with amplitude, shape and period changed in response to the position and/or pressure of the user's input operation on the screen of the display/input section 10. Examples of the output signal are a square wave, a sinusoidal wave and so on. In the present embodiment, the type of signal is not limited to the above-described examples. Other signals may also be employed, provided that the signal can be used to generate and change the tactile feedback in response to the user's input operation.

The tactile feedback generating unit 102 receives the input signal and converts it into force patterns that are transmitted to the user 2 via a mechanical assembly that combines the screen with the tactile feedback generating elements or the piezoelectric actuators. The user 2 can feel the force patterns when the user 2 is touching the screen.

For the user's input operation, a pen-type pointing device may be used for selecting an image on the screen instead of the user's own finger. The user input to the apparatus 1 may be detected using a) touch screen technology, where the user 2 can directly touch the screen with their fingers, or b) pen input technology, where pen-type devices are used to report the position where the user 2 is touching the screen.
`The pressure sensing unit controller 109 determines the
`value of pressure applied when the user is touching on the
`screen. The position sensing unit controller 108 determines
`the position where the user 2 is touching on the screen. The
`determined data is communicated to the GUI controller 112.
`
When the user presses the screen and at the same time tactile feedback is provided to the user, the pressure signal will have a component from the tactile feedback signal. This component may be filtered out because the exact shape of the feedback signal is known. Alternatively, the pressure may be measured only at points in time when no tactile feedback is provided, i.e., when the tactile feedback waveshape is at zero value.
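The second option can be illustrated as follows. The zero tolerance and the sample values are assumptions; the point is only that pressure readings taken while the known drive waveshape is non-zero are discarded.

```python
def pressure_samples_without_feedback(pressure, drive, zero_tol=0.02):
    """Keep pressure readings taken only while the known tactile drive signal is ~zero.

    pressure and drive are equal-length sequences sampled at the same instants;
    readings taken while the actuators are being driven are discarded.
    """
    return [p for p, d in zip(pressure, drive) if abs(d) < zero_tol]

if __name__ == "__main__":
    drive    = [0.0, 0.8, -0.8, 0.01, 0.9, 0.0]
    pressure = [0.30, 0.55, 0.52, 0.31, 0.60, 0.32]
    print(pressure_samples_without_feedback(pressure, drive))   # [0.3, 0.31, 0.32]
```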
`
The GUI controller 112 determines which GUI object the user 2 is intending to interact with. Further, depending on a) the current state of the GUI object and b) the pressure value applied to the GUI object, the GUI controller 112 determines an appropriate change in the state of the GUI object. For example, if the GUI object is a graphical button, the GUI controller 112 can calculate whether there was enough pressure applied on the graphical button on the screen to change the state of the button from "free" to "pressed" or from "non-actuated" to "actuated". After determining the state of the GUI object, the GUI controller 112 changes the visual state of the GUI object by sending commands to the display controller 107.

Alternatively, a sound or audio alarm may be generated when the visual state of the GUI object is changed, so as to inform the user 2 about the change in the state of the GUI object.
`The GUI controller 112 further sends the commands to the
`
`tactile interface controller 111 that generates appropriate
`commands for driving the tactile signal generation unit 102.
`The data for tactile feedback may be generated algorithmi-
`cally by the signal generation unit 106 as well as stored as data
`110a on the data storage 110. Any widely available data
`storage devices may be used as the data storage 110 including
`flash memory, ROM, hard drive as well as network storage.
`Any file systems can be used to organize data on the data
`storage 110.
In another embodiment of the present invention, a history of the previous input and a history of the previous states of the GUI objects are used to determine the tactile feedback to the user 2.
`
Before describing further details of embodiments of the present invention, a short description of the touch-screen interaction operation of the related art is provided below.
`
`FIG. 2 presents an example of a typical touch-screen inter-
`action of related art. The interaction starts when the user 2
`
`touches the screen (touch down event T1). The user 2 can then
`either drag a finger across the input space (drag or slide state
`T2) or hold it steady in one place (hold state T3). Further, the
`user 2 can lift the finger off the screen, which can happen
`either from inside of the GUI object (lift off in event T4) or
`from outside of the GUI object (lift off out event T5).
Therefore, each GUI object can be in the following states: 1) neutral; 2) selected, that is, when the user 2 selects the GUI object by touching it, such as by placing the finger or pen-type device inside of the GUI object; and 3) activated, that is, when the user 2 indicates that the GUI object should execute an associated command, corresponding to pressing of a physical button. In addition to these states, the GUI object can also be inactive, meaning that it cannot be actuated, but it may or may not respond to the user input.
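For contrast with the pressure-based scheme of the present invention, this related-art behavior can be summarized roughly as the following state update. This is an illustrative reading of FIG. 2, not code from the patent.

```python
from enum import Enum, auto

class State(Enum):
    NEUTRAL = auto()
    SELECTED = auto()
    ACTIVATED = auto()
    INACTIVE = auto()     # cannot be actuated at all

def related_art_step(state: State, event: str, inside: bool) -> State:
    """Related-art touch-screen rule: selection follows the finger, lift-off inside actuates."""
    if state is State.INACTIVE:
        return state
    if event in ("touch_down", "drag"):
        return State.SELECTED if inside else State.NEUTRAL
    if event == "lift_off":
        return State.ACTIVATED if (state is State.SELECTED and inside) else State.NEUTRAL
    return state          # e.g. "hold" keeps the current state

if __name__ == "__main__":
    s = related_art_step(State.NEUTRAL, "touch_down", inside=True)
    print(related_art_step(s, "lift_off", inside=True))   # State.ACTIVATED
```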
`It should be noted that, in the related art technology, the
`user 2 can select a GUI object and then actuate it or return the
`GUI object into the neutral state, by moving the finger/pen
`outside of the GUI object and lifting the finger/pen. Such
`interaction method of the related art technology is different
`from what the user typically would do with the physical
`button while the use
