(19) United States
(12) Patent Application Publication     (10) Pub. No.: US 2006/0097991 A1
     Hotelling et al.                   (43) Pub. Date: May 11, 2006

(54) MULTIPOINT TOUCHSCREEN

(75) Inventors: Steve Hotelling, San Jose, CA (US); Joshua A. Strickon, San Jose, CA (US); Brian Q. Huppi, San Francisco, CA (US)

Correspondence Address:
WONG, CABELLO, LUTSCH, RUTHERFORD & BRUCCULERI, P.C.
20333 SH 249, SUITE 600
HOUSTON, TX 77070 (US)

(73) Assignee: Apple Computer, Inc.

(21) Appl. No.: 10/840,862

(22) Filed: May 6, 2004

Publication Classification

(51) Int. Cl.  G09G 5/00  (2006.01)
(52) U.S. Cl. ........................................ 345/173

(57) ABSTRACT

A touch panel having a transparent capacitive sensing medium configured to detect multiple touches or near touches that occur at the same time and at distinct locations in the plane of the touch panel and to produce distinct signals representative of the location of the touches on the plane of the touch panel for each of the multiple touches is disclosed.

Petitioner Samsung 1010
[Drawing Sheets 1-14 of the published application appear here as images (FIGS. 1A and 1B through FIG. 19, described in the Brief Description of the Drawings below); only figure labels, reference numerals and flow-chart legends survive the text extraction. Recoverable flow-chart legends: FIG. 4 - receive multiple touches on surface of touch screen; recognize each of the multiple touches; report touch data based on multiple touches. FIG. 15 - drive sensing points; read output from sensing points; produce and analyze touchscreen data; compare current data to past data; perform action based on comparison. FIG. 16 - receive raw data; filter raw data; generate gradient data; calculate boundaries for touch regions; calculate coordinates for each touch region; perform multipoint tracking. FIGS. 17A-17E - raw data including noise; gradient data; touch regions; coordinates of touch regions.]

MULTIPOINT TOUCHSCREEN

BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] The present invention relates generally to an electronic device having a touch screen. More particularly, the present invention relates to a touch screen capable of sensing multiple points at the same time.

[0003] 2. Description of the Related Art

[0004] There exist today many styles of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and/or making selections on a display screen. By way of example, the input devices may include buttons or keys, mice, trackballs, touch pads, joy sticks, touch screens and the like. Touch screens, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch screens allow a user to make selections and move a cursor by simply touching the display screen via a finger or stylus. In general, the touch screen recognizes the touch and position of the touch on the display screen and the computer system interprets the touch and thereafter performs an action based on the touch event.

[0005] Touch screens typically include a touch panel, a controller and a software driver. The touch panel is a clear panel with a touch sensitive surface. The touch panel is positioned in front of a display screen so that the touch sensitive surface covers the viewable area of the display screen. The touch panel registers touch events and sends these signals to the controller. The controller processes these signals and sends the data to the computer system. The software driver translates the touch events into computer events.

[0006] There are several types of touch screen technologies including resistive, capacitive, infrared, surface acoustic wave, electromagnetic, near field imaging, etc. Each of these devices has advantages and disadvantages that are taken into account when designing or configuring a touch screen. In resistive technologies, the touch panel is coated with a thin metallic electrically conductive and resistive layer. When the panel is touched, the layers come into contact thereby closing a switch that registers the position of the touch event. This information is sent to the controller for further processing. In capacitive technologies, the touch panel is coated with a material that stores electrical charge. When the panel is touched, a small amount of charge is drawn to the point of contact. Circuits located at each corner of the panel measure the charge and send the information to the controller for processing.

[0007] In surface acoustic wave technologies, ultrasonic waves are sent horizontally and vertically over the touch screen panel, as for example by transducers. When the panel is touched, the acoustic energy of the waves is absorbed. Sensors located across from the transducers detect this change and send the information to the controller for processing. In infrared technologies, light beams are sent horizontally and vertically over the touch panel, as for example by light emitting diodes. When the panel is touched, some of the light beams emanating from the light emitting diodes are interrupted. Light detectors located across from the light emitting diodes detect this change and send this information to the controller for processing.

[0008] One problem found in all of these technologies is that they are only capable of reporting a single point even when multiple objects are placed on the sensing surface. That is, they lack the ability to track multiple points of contact simultaneously. In resistive and capacitive technologies, an average of all simultaneously occurring touch points is determined and a single point which falls somewhere between the touch points is reported. In surface wave and infrared technologies, it is impossible to discern the exact position of multiple touch points that fall on the same horizontal or vertical lines due to masking. In either case, faulty results are generated.
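
To make the averaging failure concrete, the following is a minimal, hypothetical model of a single-point controller (not any particular product's firmware): two simultaneous touches are collapsed into one reported point that lies between them.

```
# Minimal, hypothetical model of a single-point (averaging) controller.
# Two simultaneous touches are collapsed into one bogus midpoint.

def single_point_report(touches):
    """Return the averaged (x, y) a conventional panel would report."""
    if not touches:
        return None
    xs = [x for x, _ in touches]
    ys = [y for _, y in touches]
    return (sum(xs) / len(touches), sum(ys) / len(touches))

# Thumb resting on the edge plus an index finger selecting an item:
touches = [(20, 240), (300, 120)]
print(single_point_report(touches))   # (160.0, 180.0) - neither actual touch
```

Neither finger is at the reported location, which is the "faulty result" referred to above.
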
[0009] These problems are particularly acute in tablet PCs, where one hand is used to hold the tablet and the other is used to generate touch events. For example, as shown in FIGS. 1A and 1B, holding a tablet 2 causes the thumb 3 to overlap the edge of the touch sensitive surface 4 of the touch screen 5. As shown in FIG. 1A, if the touch technology uses averaging, the technique used by resistive and capacitive panels, then a single point that falls somewhere between the thumb 3 of the left hand and the index finger 6 of the right hand would be reported. As shown in FIG. 1B, if the technology uses projection scanning, the technique used by infrared and SAW panels, it is hard to discern the exact vertical position of the index finger 6 due to the large vertical component of the thumb 3. The tablet 2 can only resolve the patches shown in gray. In essence, the thumb 3 masks out the vertical position of the index finger 6.

SUMMARY OF THE INVENTION

[0010] The invention relates, in one embodiment, to a touch panel having a transparent capacitive sensing medium configured to detect multiple touches or near touches that occur at the same time and at distinct locations in the plane of the touch panel and to produce distinct signals representative of the location of the touches on the plane of the touch panel for each of the multiple touches.

[0011] The invention relates, in another embodiment, to a display arrangement. The display arrangement includes a display having a screen for displaying a graphical user interface. The display arrangement further includes a transparent touch panel allowing the screen to be viewed therethrough and capable of recognizing multiple touch events that occur at different locations on the touch sensitive surface of the touch screen at the same time and of outputting this information to a host device.

[0012] The invention relates, in another embodiment, to a computer implemented method. The method includes receiving multiple touches on the surface of a transparent touch screen at the same time. The method also includes separately recognizing each of the multiple touches. The method further includes reporting touch data based on the recognized multiple touches.

[0013] The invention relates, in another embodiment, to a computer system. The computer system includes a processor configured to execute instructions and to carry out operations associated with the computer system. The computer system also includes a display device that is operatively coupled to
the processor. The computer system further includes a touch screen that is operatively coupled to the processor. The touch screen is a substantially transparent panel that is positioned in front of the display. The touch screen is configured to track multiple objects, which rest on, tap on or move across the touch screen at the same time. The touch screen includes a capacitive sensing device that is divided into several independent and spatially distinct sensing points that are positioned throughout the plane of the touch screen. Each sensing point is capable of generating a signal at the same time. The touch screen also includes a sensing circuit that acquires data from the sensing device and that supplies the acquired data to the processor.

[0014] The invention relates, in another embodiment, to a touch screen method. The method includes driving a plurality of sensing points. The method also includes reading the outputs from all the sensing lines connected to the sensing points. The method further includes producing and analyzing an image of the touch screen plane at one moment in time in order to determine where objects are touching the touch screen. The method additionally includes comparing the current image to a past image in order to determine a change in the objects touching the touch screen.

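As an informal illustration of the method just summarized (not language from the application itself), the sketch below assumes a hypothetical read_node(row, col) driver call and shows one pass that builds an image of the touch screen plane and compares it against the previous pass.

```
# Illustrative sketch only. read_node(row, col) is a hypothetical driver call
# returning the measured value at one sensing point.

ROWS, COLS = 16, 12          # assumed grid of sensing points
THRESHOLD = 8                # assumed change needed to call a node changed

def scan_frame(read_node):
    """Drive every sensing point and return a 2-D image of readings."""
    return [[read_node(r, c) for c in range(COLS)] for r in range(ROWS)]

def diff_frames(current, previous):
    """Compare the current image to the past image; report changed nodes."""
    changes = []
    for r in range(ROWS):
        for c in range(COLS):
            delta = current[r][c] - previous[r][c]
            if abs(delta) >= THRESHOLD:
                changes.append((r, c, delta))   # node newly touched or released
    return changes
```

A host-side loop would call scan_frame once per cycle, keep the prior image, and act on the reported changes.
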
[0015] The invention relates, in another embodiment, to a digital signal processing method. The method includes receiving raw data. The raw data includes values for each transparent capacitive sensing node of a touch screen. The method also includes filtering the raw data. The method further includes generating gradient data. The method additionally includes calculating the boundaries for touch regions based on the gradient data. Moreover, the method includes calculating the coordinates for each touch region.

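Read alongside FIG. 16, the following is a minimal sketch of that signal chain written as ordinary Python, not text from the application; the 3x3 smoothing filter, the threshold-based grouping substituted here for whatever gradient-based boundary calculation is intended, and the signal-weighted centroid are all assumptions chosen for illustration.

```
# Illustrative DSP pipeline: filter -> gradient -> touch regions -> coordinates.
# The specific filter, segmentation rule and centroid weighting are assumptions;
# the boundary step below uses a simple threshold grouping rather than the
# gradient-based calculation the application refers to.

def filter_raw(image):
    """Light 3x3 box smoothing to suppress single-node noise."""
    rows, cols = len(image), len(image[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [image[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))]
            out[r][c] = sum(vals) / len(vals)
    return out

def gradient(image):
    """Per-node magnitude of change relative to left/upper neighbors."""
    rows, cols = len(image), len(image[0])
    g = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            dx = image[r][c] - image[r][c - 1] if c else 0.0
            dy = image[r][c] - image[r - 1][c] if r else 0.0
            g[r][c] = (dx * dx + dy * dy) ** 0.5
    return g

def touch_regions(image, threshold):
    """Group above-threshold nodes into connected regions (4-connectivity)."""
    rows, cols = len(image), len(image[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and (r, c) not in seen:
                stack, region = [(r, c)], []
                seen.add((r, c))
                while stack:
                    rr, cc = stack.pop()
                    region.append((rr, cc))
                    for nr, nc in ((rr-1, cc), (rr+1, cc), (rr, cc-1), (rr, cc+1)):
                        if (0 <= nr < rows and 0 <= nc < cols and
                                image[nr][nc] > threshold and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            stack.append((nr, nc))
                regions.append(region)
    return regions

def region_coordinates(image, region):
    """Signal-weighted centroid of one touch region."""
    total = sum(image[r][c] for r, c in region)
    x = sum(c * image[r][c] for r, c in region) / total
    y = sum(r * image[r][c] for r, c in region) / total
    return x, y
```
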
BRIEF DESCRIPTION OF THE DRAWINGS

[0016] The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:

[0017] FIGS. 1A and 1B show a user holding conventional touch screens.

[0018] FIG. 2 is a perspective view of a display arrangement, in accordance with one embodiment of the present invention.

[0019] FIG. 3 shows an image of the touch screen plane at a particular point in time, in accordance with one embodiment of the present invention.

[0020] FIG. 4 is a multipoint touch method, in accordance with one embodiment of the present invention.

[0021] FIG. 5 is a block diagram of a computer system, in accordance with one embodiment of the present invention.

[0022] FIG. 6 is a partial top view of a transparent multiple point touch screen, in accordance with one embodiment of the present invention.

[0023] FIG. 7 is a partial top view of a transparent multipoint touch screen, in accordance with one embodiment of the present invention.

[0024] FIG. 8 is a front elevation view, in cross section, of a display arrangement, in accordance with one embodiment of the present invention.

[0025] FIG. 9 is a top view of a transparent multipoint touch screen, in accordance with another embodiment of the present invention.

[0026] FIG. 10 is a partial front elevation view, in cross section, of a display arrangement, in accordance with one embodiment of the present invention.

[0027] FIGS. 11A and 11B are partial top view diagrams of a driving layer and a sensing layer, in accordance with one embodiment.

[0028] FIG. 12 is a simplified diagram of a mutual capacitance circuit, in accordance with one embodiment of the present invention.

[0029] FIG. 13 is a diagram of a charge amplifier, in accordance with one embodiment of the present invention.

[0030] FIG. 14 is a block diagram of a capacitive sensing circuit, in accordance with one embodiment of the present invention.

[0031] FIG. 15 is a flow diagram, in accordance with one embodiment of the present invention.

[0032] FIG. 16 is a flow diagram of a digital signal processing method, in accordance with one embodiment of the present invention.

[0033] FIGS. 17A-E show touch data at several steps, in accordance with one embodiment of the present invention.

[0034] FIG. 18 is a side elevation view of an electronic device, in accordance with one embodiment of the present invention.

[0035] FIG. 19 is a side elevation view of an electronic device, in accordance with one embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION

[0036] Embodiments of the invention are discussed below with reference to FIGS. 2-19. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.

[0037] FIG. 2 is a perspective view of a display arrangement 30, in accordance with one embodiment of the present invention. The display arrangement 30 includes a display 34 and a transparent touch screen 36 positioned in front of the display 34. The display 34 is configured to display a graphical user interface (GUI) including perhaps a pointer or cursor as well as other information to the user. The transparent touch screen 36, on the other hand, is an input device that is sensitive to a user's touch, allowing a user to interact with the graphical user interface on the display 34. By way of example, the touch screen 36 may allow a user to move an input pointer or make selections on the graphical user interface by simply pointing at the GUI on the display 34.

[0038] In general, touch screens 36 recognize a touch event on the surface 38 of the touch screen 36 and thereafter output this information to a host device. The host device may for example correspond to a computer such as a desktop, laptop, handheld or tablet computer. The host device interprets the touch event and thereafter performs an action based
on the touch event. Conventionally, touch screens have only been capable of recognizing a single touch event even when the touch screen is touched at multiple points at the same time (e.g., averaging, masking, etc.). Unlike conventional touch screens, however, the touch screen 36 shown herein is configured to recognize multiple touch events that occur at different locations on the touch sensitive surface 38 of the touch screen 36 at the same time. That is, the touch screen 36 allows for multiple contact points T1-T4 to be tracked simultaneously, i.e., if four objects are touching the touch screen, then the touch screen tracks all four objects. As shown, the touch screen 36 generates separate tracking signals S1-S4 for each touch point T1-T4 that occurs on the surface of the touch screen 36 at the same time. The number of recognizable touches may be about 15; 15 touch points allows for all 10 fingers, two palms and 3 others.
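
One simple way such per-touch tracking signals could be maintained across successive scans is nearest-neighbor matching between frames. This is purely an assumption for illustration, not a description of the tracking step used in the application.

```
# Hypothetical sketch of keeping a separate tracking signal per contact point.
# Nearest-neighbor matching between frames is an assumption for illustration.

import math

def track(previous, current, max_jump=50.0):
    """previous: {touch_id: (x, y)} from the last frame; current: list of (x, y).
    Returns an updated {touch_id: (x, y)} with ids carried across frames."""
    assigned = {}
    next_id = max(previous) + 1 if previous else 1
    unused = dict(previous)
    for point in current:
        best_id, best_dist = None, max_jump
        for tid, old in unused.items():
            d = math.dist(point, old)
            if d < best_dist:
                best_id, best_dist = tid, d
        if best_id is None:            # new finger touched down
            best_id, next_id = next_id, next_id + 1
        else:
            del unused[best_id]        # same finger, possibly moved
        assigned[best_id] = point
    return assigned                    # ids absent from the result have lifted off
```
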
[0039] The multiple touch events can be used separately or together to perform singular or multiple actions in the host device. When used separately, a first touch event may be used to perform a first action while a second touch event may be used to perform a second action that is different than the first action. The actions may for example include moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, etc. When used together, first and second touch events may be used for performing one particular action. The particular action may for example include logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like.
[0040] Recognizing multiple touch events is generally accomplished with a multipoint sensing arrangement. The multipoint sensing arrangement is capable of simultaneously detecting and monitoring touches and the magnitude of those touches at distinct points across the touch sensitive surface 38 of the touch screen 36. The multipoint sensing arrangement generally provides a plurality of transparent sensor coordinates or nodes 42 that work independent of one another and that represent different points on the touch screen 36. When plural objects are pressed against the touch screen 36, one or more sensor coordinates are activated for each touch point, as for example touch points T1-T4. The sensor coordinates 42 associated with each touch point T1-T4 produce the tracking signals S1-S4.

[0041] In one embodiment, the touch screen 36 includes a plurality of capacitance sensing nodes 42. The capacitive sensing nodes may be widely varied. For example, the capacitive sensing nodes may be based on self capacitance or mutual capacitance. In self capacitance, the "self" capacitance of a single electrode is measured, as for example relative to ground. In mutual capacitance, the mutual capacitance between at least first and second electrodes is measured. In either case, each of the nodes 42 works independent of the other nodes 42 so as to produce simultaneously occurring signals representative of different points on the touch screen 36.

[0042] In order to produce a transparent touch screen 36, the capacitance sensing nodes 42 are formed with a transparent conductive medium such as indium tin oxide (ITO). In self capacitance sensing arrangements, the transparent conductive medium is patterned into spatially separated electrodes and traces. Each of the electrodes represents a different coordinate and the traces connect the electrodes to a capacitive sensing circuit. The coordinates may be associated with a Cartesian coordinate system (x and y), a polar coordinate system (r, θ) or some other coordinate system. In a Cartesian coordinate system, the electrodes may be positioned in columns and rows so as to form a grid array with each electrode representing a different x, y coordinate. During operation, the capacitive sensing circuit monitors changes in capacitance that occur at each of the electrodes. The positions where changes occur and the magnitude of those changes are used to help recognize the multiple touch events. A change in capacitance typically occurs at an electrode when a user places an object such as a finger in close proximity to the electrode, i.e., the object steals charge thereby affecting the capacitance.
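
As a rough illustration of the self capacitance arrangement just described (an assumption-laden sketch rather than the circuit of the application), each electrode in the grid is measured against a stored baseline, and the per-electrode change is what signals a touch. Here measure_self_cap is a hypothetical stand-in for the capacitive sensing circuit.

```
# Illustrative self-capacitance scan: one measurement per electrode in the grid.
# measure_self_cap(row, col) is a hypothetical function standing in for the
# capacitive sensing circuit; baselines holds the untouched readings.

def scan_self_capacitance(measure_self_cap, baselines, threshold=5.0):
    """Return {(row, col): delta} for every electrode whose capacitance changed."""
    active = {}
    for r, row in enumerate(baselines):
        for c, base in enumerate(row):
            delta = measure_self_cap(r, c) - base
            if abs(delta) > threshold:      # finger near this electrode
                active[(r, c)] = delta
    return active
```

Because every electrode is read independently, several electrodes can report changes in the same pass, which is what lets simultaneous touches at different coordinates be distinguished.
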
[0043] In mutual capacitance, the transparent conductive medium is patterned into a group of spatially separated lines formed on two different layers. Driving lines are formed on a first layer and sensing lines are formed on a second layer. Although separated by being on different layers, the sensing lines traverse, intersect or cut across the driving lines thereby forming a capacitive coupling node. The manner in which the sensing lines cut across the driving lines generally depends on the coordinate system used. For example, in a Cartesian coordinate system, the sensing lines are perpendicular to the driving lines thereby forming nodes with distinct x and y coordinates. Alternatively, in a polar coordinate system, the sensing lines may be concentric circles and the driving lines may be radially extending lines (or vice versa). The driving lines are connected to a voltage source and the sensing lines are connected to a capacitive sensing circuit. During operation, a current is driven through one driving line at a time, and because of capacitive coupling, the current is carried through to the sensing lines at each of the nodes (e.g., intersection points). Furthermore, the sensing circuit monitors changes in capacitance that occur at each of the nodes. The positions where changes occur and the magnitude of those changes are used to help recognize the multiple touch events. A change in capacitance typically occurs at a capacitive coupling node when a user places an object such as a finger in close proximity to the capacitive coupling node, i.e., the object steals charge thereby affecting the capacitance.
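
A minimal sketch of that drive-one-line, sense-all-lines sequence follows. drive_line and read_sense_line are hypothetical stand-ins for the voltage source and the capacitive sensing circuit, and the result is a full image of readings, one per coupling node.

```
# Illustrative mutual-capacitance scan. drive_line(i) energizes one driving
# line; read_sense_line(j) returns the coupled signal on sensing line j.
# Each (i, j) pair corresponds to one capacitive coupling node.

def scan_mutual_capacitance(drive_line, read_sense_line,
                            num_drive_lines, num_sense_lines):
    """Return a 2-D list: image[i][j] is the reading at node (drive i, sense j)."""
    image = []
    for i in range(num_drive_lines):
        drive_line(i)                          # stimulate one driving line at a time
        row = [read_sense_line(j) for j in range(num_sense_lines)]
        image.append(row)
    return image
```

Comparing this image against a baseline taken with nothing touching the panel gives the per-node changes referred to above; a finger over a node reduces the coupled signal at just that intersection.
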
[0044] By way of example, the signals generated at the nodes 42 of the touch screen 36 may be used to produce an image of the touch screen plane at a particular point in time. Referring to FIG. 3, each object in contact with a touch sensitive surface 38 of the touch screen 36 produces a contact patch area 44. Each of the contact patch areas 44 covers several nodes 42. The covered nodes 42 detect surface contact while the remaining nodes 42 do not detect surface contact. As a result, a pixilated image of the touch screen plane can be formed. The signals for each contact patch area 44 may be grouped together to form individual images representative of the contact patch area 44. The image of each contact patch area 44 may include high and low points based on the pressure at each point. The shape of the image as well as the high and low points within the image may be used to differentiate contact patch areas 44 that are in close proximity to one another. Furthermore, the
current image, and more particularly the image of each contact patch area 44 can be compared to previous images to determine what action to perform in a host device.
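
To make the pixilated-image idea concrete, here is a small hedged sketch that groups the nodes already attributed to each contact patch area into its own sub-image and notes its high point; the patch lists are assumed to come from a grouping step such as the touch-region sketch shown earlier, and the field names are illustrative only.

```
# Illustrative only: summarize each contact patch area as its own small image.
# 'patches' is a list of patches, each a list of (row, col) covered nodes;
# image[r][c] holds the measured signal at node (r, c).

def patch_images(image, patches):
    """Return, per patch, its bounding sub-image and the location/value of its high point."""
    summaries = []
    for patch in patches:
        nodes = set(patch)
        rows = [r for r, _ in patch]
        cols = [c for _, c in patch]
        r0, r1, c0, c1 = min(rows), max(rows), min(cols), max(cols)
        sub = [[image[r][c] if (r, c) in nodes else 0
                for c in range(c0, c1 + 1)]
               for r in range(r0, r1 + 1)]
        peak = max(patch, key=lambda rc: image[rc[0]][rc[1]])
        summaries.append({"bounds": (r0, c0, r1, c1),
                          "sub_image": sub,
                          "peak": peak,
                          "peak_value": image[peak[0]][peak[1]]})
    return summaries
```

Two fingers pressed close together then show up as two sub-images with separate high points rather than one merged blob, which is the differentiation described above.
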
[0045] Referring back to FIG. 2, the display arrangement 30 may be a stand alone unit or it may be integrated with other devices. When stand alone, the display arrangement 32 (or each of its components) acts like a peripheral device (monitor) that includes its own housing and that can be coupled to a host device through wired or wireless connections. When integrated, the display arrangement 30 shares a housing and is hard wired into the host device thereby forming a single unit. By way of example, the display arrangement 30 may be disposed inside a variety of host devices including but not limited to general purpose computers such as desktop, laptop or tablet computers, handhelds such as PDAs, and media players such as music players, or peripheral devices such as cameras, printers and/or the like.

[0046] FIG. 4 is a multipoint touch method 45, in accordance with one embodiment of the present invention. The method generally begins at block 46 where multiple touches are received on the surface of the touch screen at the same time. This may for example be accomplished by placing multiple fingers on the surface of the touch screen. Following block 46, the process flow proceeds to block 47 where each of the multiple touches is separately recognized by the touch screen. This may for example be accomplished by multipoint capacitance sensors located within the touch screen. Following block 47, the process flow proceeds to block 48 where the touch data based on multiple touches is reported. The touch data may for example be reported to a host device such as a general purpose computer.

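The application does not spell out a report format in this passage, so the following is only a guessed-at example of what "touch data based on multiple touches" reported to a host could look like, one record per recognized touch; every field name here is an assumption.

```
# Hypothetical report format for block 48; field names are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class TouchRecord:
    touch_id: int      # stable id while the finger stays down
    x: float           # coordinates in the touch screen plane
    y: float
    magnitude: float   # signal strength / pressure proxy for the patch

def report_touches(records: List[TouchRecord]) -> list:
    """Serialize all simultaneous touches for the host in one message."""
    return [vars(r) for r in records]

# Example: two fingers down at once are reported together, not averaged.
frame = [TouchRecord(1, 120.5, 340.0, 42.0), TouchRecord(2, 410.2, 88.7, 37.5)]
print(report_touches(frame))
```
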
[0047] FIG. 5 is a block diagram of a computer system 50, in accordance with one embodiment of the present invention. The computer system 50 may correspond to personal computer systems such as desktops, laptops, tablets or handhelds. By way of example, the computer system may correspond to any Apple or PC based computer system. The computer system may also correspond to public computer systems such as information kiosks, automated teller machines (ATM), point of sale machines (POS), industrial machines, gaming machines, arcade machines, vending machines, airline e-ticket terminals, restaurant reservation terminals, customer service stations, library terminals, learning devices, and the like.

[0048] As shown, the computer system 50 includes a processor 56 configured to execute instructions and to carry out operations associated with the computer system 50. For example, using instructions retrieved for example from memory, the processor 56 may control the reception and manipulation of input and output data between components of the computing system 50. The processor 56 can be a single-chip processor or can be implemented with multiple components.

[0049] In most cases, the processor 56 together with an operating system operates to execute computer code and produce and use data. The computer code and data may reside within a program storage block 58 that is operatively coupled to the processor 56. Program storage block 58 generally provides a place to hold data that is being used by the computer system 50. By way of example, the program storage block may include Read-Only Memory (ROM) 60, Random-Access Memory (RAM) 62, hard disk drive 64 and/or the like. The computer code and data could also reside on a removable storage medium and be loaded or installed onto the computer system when needed. Removable storage mediums include, for example, CD-ROM, PC-CARD, floppy disk, magnetic tape, and a network component.

[0050] The computer system 50 also includes an input/output (I/O) controller 66 that is operatively coupled to the processor 56. The I/O controller 66 may be integrated with the processor 56 or it may be a separate component as shown. The I/O controller 66 is generally configured to control interactions with one or more I/O devices. The I/O controller 66 generally operates by exchanging data between the processor and the I/O devices that desire to communicate with the processor. The I/O devices and the I/O controller typically communicate through a data link 67. The data link 67 may be a one way link or two way link. In some cases, the I/O devices may be connected to the I/O controller 66 through wired connections. In other cases, the I/O devices may be connected to the I/O controller 66 through wireless connections. By way of example, the data link 67 may correspond to PS/2, USB, FireWire, IR, RF, Bluetooth or the like.

[0051] The computer system 50 also includes a display device 68 that is operatively coupled to the processor 56. The display device 68 may be a separate component (peripheral device) or it may be integrated with the processor and program storage to form a desktop computer (all in one machine), a laptop, handheld or tablet or the like. The display device 68 is configured to display a graphical user interface (GUI) including perhaps a pointer or cursor as well as other information to the user. By way of example, the display device 68 may be a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable-graphics-array (VGA) display, super VGA display, liquid crystal display (e.g., active matrix, passive matrix and the like), cathode ray tube (CRT), plasma displays and the like.

[0052] The computer system 50 also includes a touch screen 70 that is operatively coupled to the processor 56. The touch screen 70 is a transparent panel that is positioned in front of the display device 68. The touch screen 70 may be integrated with the display device 68 or it may be a separate component. The touch screen 70 is configured to receive input from a user's touch and to send this information to the processor 56. In most cases, the touch screen 70 recognizes touches and the position and magnitude of touches on its surface. The touch screen 70 reports the touches to the processor 56 and the processor 56 interprets the touches in accordance with its programming. For example, the processor 56 may initiate a task in accordance with a particular touch.

[0053] In accordance with one embodiment, the touch screen 70 is capable of tracking multiple objects, which rest on, tap on, or move across the touch sensitive surface of the touch screen at the same time. The multiple objects may for example correspond to fingers and palms. Because the touch screen is capable of tracking multiple objects, a user may perform several touch initiated tasks at the same time. For example, the user may select an onscreen button with one finger, while moving a cursor with another finger. In addition, a user may move a scroll bar with one finger while
selecting an item from a menu with another finger. Furthermore, a first object may be dragged with one finger while a second object may be dragged with another finger. Moreover, gesturing may be performed with more than one finger.

[0054] To elaborate, the touch screen 70 generally includes a sensing device 72 configured to detect an object in close proximity thereto and/or the pressure exerted thereon. The sensing device 72 may be widely varied. In one particular embodiment, the sensing device 72 is divided into several independent and spatially distinct sensing points, nodes or regions 74 that are positioned throughout the touch screen 70. The sensing points 74, which are typically hidden from view, are dispersed about the touch screen 70 with each sensing point 74 representing a different position on the surface of the touch screen 70 (or touch screen plane). The sensing points 74 may be positioned in a grid or a pixel array where each pixilated sensing point 74 is capable of generating a signal at the same time. In the simplest case, a signal is produced each time an object is positioned over a sensing point 74. When an object is placed over multiple sensing points 74 or when the object is moved between or over multiple sensing points 74