PTO/SB/14 (07-07)
Approved for use through 06/30/2010. OMB 0651-0032
U.S. Patent and Trademark Office; U.S. DEPARTMENT OF COMMERCE
Under the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it contains a valid OMB control number.
Application Data Sheet 37 CFR 1.76

Attorney Docket Number: QGS
Application Number:
Title of Invention: Gesture Recognition

The application data sheet is part of the provisional or nonprovisional application for which it is being submitted. The following form contains the bibliographic data arranged in a format specified by the United States Patent and Trademark Office as outlined in 37 CFR 1.76. This document may be completed electronically and submitted to the Office in electronic format using the Electronic Filing System (EFS) or the document may be printed and included in a paper filed application.

Secrecy Order 37 CFR 5.2
[ ] Portions or all of the application associated with this Application Data Sheet may fall under a Secrecy Order pursuant to 37 CFR 5.2 (Paper filers only. Applications that fall under Secrecy Order may not be filed electronically.)

Applicant Information:

Applicant 1
Applicant Authority: (X) Inventor  ( ) Legal Representative under 35 U.S.C. 117  ( ) Party of Interest under 35 U.S.C. 118
Given Name: Alan
Family Name: Bowens
Residence Information: ( ) US Residency  (X) Non US Residency  ( ) Active US Military Service
City: Southampton
Country of Residence: GB
Citizenship under 37 CFR 1.41(b): GB

Mailing Address of Applicant:
Address 1: QRG Ltd.
Address 2: 1 Mitchell Point, Ensign Way
City: Hamble, Southampton
Postal Code: SO31 4RF
Country: GB

All Inventors Must Be Listed - Additional Inventor Information blocks may be generated within this form by selecting the Add button.

Correspondence Information:
Enter either Customer Number or complete the Correspondence Information section below. For further information see 37 CFR 1.33(a).
[ ] An Address is being provided for the correspondence Information of this application.
Customer Number: 20191
Email Address: dak@patent-faq.com

Application Information:
Title of the Invention: Gesture Recognition
Attorney Docket Number: QGS
Application Type: Nonprovisional
Subject Matter: Utility
Suggested Class (if any):
Sub Class (if any):
Suggested Technology Center (if any):
Total Number of Drawing Sheets (if any): 11
Suggested Figure for Publication (if any): 1
[ ] Small Entity Status Claimed

EFS Web 2.2.2

Petitioner Samsung Ex-1004, 0001
Publication Information:
[ ] Request Early Publication (Fee required at time of Request 37 CFR 1.219)
[ ] Request Not to Publish. I hereby request that the attached application not be published under 35 U.S.C. 122(b) and certify that the invention disclosed in the attached application has not and will not be the subject of an application filed in another country, or under a multilateral international agreement, that requires publication at eighteen months after filing.

Representative Information:

Representative information should be provided for all practitioners having a power of attorney in the application. Providing this information in the Application Data Sheet does not constitute a power of attorney in the application (see 37 CFR 1.32). Enter either Customer Number or complete the Representative Name section below. If both sections are completed the Customer Number will be used for the Representative Information during processing.

Please Select One: (X) Customer Number  ( ) US Patent Practitioner  ( ) Limited Recognition (37 CFR 11.9)
Customer Number: 20191
`
Domestic Benefit/National Stage Information:
This section allows for the applicant to either claim benefit under 35 U.S.C. 119(e), 120, 121, or 365(c) or indicate National Stage entry from a PCT application. Providing this information in the application data sheet constitutes the specific reference required by 35 U.S.C. 119(e) or 120, and 37 CFR 1.78(a)(2) or CFR 1.78(a)(4), and need not otherwise be made part of the specification.

Prior Application Status: Pending
Continuity Type: nonprovisional of
Prior Application Number: 61049453
Filing Date (YYYY-MM-DD): 2008-05-01

Additional Domestic Benefit/National Stage Data may be generated within this form by selecting the Add button.
`
Foreign Priority Information:
This section allows for the applicant to claim benefit of foreign priority and to identify any prior foreign application for which priority is not claimed. Providing this information in the application data sheet constitutes the claim for priority as required by 35 U.S.C. 119(b) and 37 CFR 1.55(a).

Application Number:   Country:   Parent Filing Date (YYYY-MM-DD):   Priority Claimed: ( ) Yes  ( ) No

Additional Foreign Priority Data may be generated within this form by selecting the Add button.

Assignee Information:
Providing this information in the application data sheet does not substitute for compliance with any requirement of part 3 of Title 37 of the CFR to have an assignment recorded in the Office.

Assignee 1
If the Assignee is an Organization check here. [ ]

Prefix:   Given Name:   Middle Name:   Family Name:   Suffix:

Mailing Address Information:
Address 1:
Address 2:
City:   State/Province:
Country:   Postal Code:
Phone Number:   Fax Number:
Email Address:

Additional Assignee Data may be generated within this form by selecting the Add button.
`
Signature:
A signature of the applicant or representative is required in accordance with 37 CFR 1.33 and 10.18. Please see 37 CFR 1.4(d) for the form of the signature.

Signature: /David A Kiewit, Reg. 34640/
Date (YYYY-MM-DD): 2008-10-20
First Name: David   Last Name: Kiewit
Registration Number: 34640
`
This collection of information is required by 37 CFR 1.76. The information is required to obtain or retain a benefit by the public which is to file (and by the USPTO to process) an application. Confidentiality is governed by 35 U.S.C. 122 and 37 CFR 1.14. This collection is estimated to take 23 minutes to complete, including gathering, preparing, and submitting the completed application data sheet form to the USPTO. Time will vary depending upon the individual case. Any comments on the amount of time you require to complete this form and/or suggestions for reducing this burden should be sent to the Chief Information Officer, U.S. Patent and Trademark Office, U.S. Department of Commerce, P.O. Box 1450, Alexandria, VA 22313-1450. DO NOT SEND FEES OR COMPLETED FORMS TO THIS ADDRESS. SEND TO: Commissioner for Patents, P.O. Box 1450, Alexandria, VA 22313-1450.
Privacy Act Statement

The Privacy Act of 1974 (P.L. 93-579) requires that you be given certain information in connection with your submission of the attached form related to a patent application or patent. Accordingly, pursuant to the requirements of the Act, please be advised that: (1) the general authority for the collection of this information is 35 U.S.C. 2(b)(2); (2) furnishing of the information solicited is voluntary; and (3) the principal purpose for which the information is used by the U.S. Patent and Trademark Office is to process and/or examine your submission related to a patent application or patent. If you do not furnish the requested information, the U.S. Patent and Trademark Office may not be able to process and/or examine your submission, which may result in termination of proceedings or abandonment of the application or expiration of the patent.
`
The information provided by you in this form will be subject to the following routine uses:

1. The information on this form will be treated confidentially to the extent allowed under the Freedom of Information Act (5 U.S.C. 552) and the Privacy Act (5 U.S.C. 552a). Records from this system of records may be disclosed to the Department of Justice to determine whether the Freedom of Information Act requires disclosure of these records.

2. A record from this system of records may be disclosed, as a routine use, in the course of presenting evidence to a court, magistrate, or administrative tribunal, including disclosures to opposing counsel in the course of settlement negotiations.

3. A record in this system of records may be disclosed, as a routine use, to a Member of Congress submitting a request involving an individual, to whom the record pertains, when the individual has requested assistance from the Member with respect to the subject matter of the record.

4. A record in this system of records may be disclosed, as a routine use, to a contractor of the Agency having need for the information in order to perform a contract. Recipients of information shall be required to comply with the requirements of the Privacy Act of 1974, as amended, pursuant to 5 U.S.C. 552a(m).

5. A record related to an International Application filed under the Patent Cooperation Treaty in this system of records may be disclosed, as a routine use, to the International Bureau of the World Intellectual Property Organization, pursuant to the Patent Cooperation Treaty.

6. A record in this system of records may be disclosed, as a routine use, to another federal agency for purposes of National Security review (35 U.S.C. 181) and for review pursuant to the Atomic Energy Act (42 U.S.C. 218(c)).

7. A record from this system of records may be disclosed, as a routine use, to the Administrator, General Services, or his/her designee, during an inspection of records conducted by GSA as part of that agency's responsibility to recommend improvements in records management practices and programs, under authority of 44 U.S.C. 2904 and 2906. Such disclosure shall be made in accordance with the GSA regulations governing inspection of records for this purpose, and any other relevant (i.e., GSA or Commerce) directive. Such disclosure shall not be used to make determinations about individuals.

8. A record from this system of records may be disclosed, as a routine use, to the public after either publication of the application pursuant to 35 U.S.C. 122(b) or issuance of a patent pursuant to 35 U.S.C. 151. Further, a record may be disclosed, subject to the limitations of 37 CFR 1.14, as a routine use, to the public if the record was filed in an application which became abandoned or in which the proceedings were terminated and which application is referenced by either a published application, an application open to public inspection or an issued patent.

9. A record from this system of records may be disclosed, as a routine use, to a Federal, State, or local law enforcement agency, if the USPTO becomes aware of a violation or potential violation of law or regulation.
ABSTRACT OF THE DISCLOSURE

A state machine gesture recognition algorithm for interpreting streams of coordinates received from a touch sensor. The gesture recognition code can be written in a high level language such as C and then compiled and embedded in a microcontroller chip, or CPU chip as desired. The gesture recognition code can be loaded into the same chip that interprets the touch signals from the touch sensor and generates the time series data, e.g. a microcontroller, or other programmable logic device such as a field programmable gate array.
BACKGROUND OF THE INVENTION

[0001] The invention relates to gesture recognition, in particular gesture recognition by processing of time series of positional inputs received by a two-dimensional (2D) touch sensor, such as a capacitive or resistive touch sensor. The invention may also be applied to one-dimensional (1D) touch sensors, and the principles could also be applied to three-dimensional sensors. It may also be applied to proximity sensors, where no physical contact, i.e. touch, with a sensing surface is involved. The invention can be applied to sensing surfaces operable by a human finger, or a stylus.
`
[0002] 1D and 2D capacitive and resistive touch sensors have been in widespread use for many years. Examples include the screens of personal digital assistants (PDAs), MP3 audio player controls, mobile phone keypads and/or displays, and multimedia devices. The touchpad provided in notebook computers in place of a mouse is another form of 2D capacitive touch sensor. 2D sensors are also provided in many domestic appliances, so-called "white goods", such as ovens and blenders.
`
[0003] Detailed descriptions of 2D capacitive sensors have been given many times, for example in patents and patent applications with the inventor Harald Philipp such as US 2005/0041018 A1, US 2007/0247443 A1, US 2007/0257894 A1, and US 2007/0279395 A1, the contents of which are incorporated herein in their entirety.
`
[0004] Other prior art examples of touch screens are as follows.

[0005] US 3,593,115 shows a touch element having triangulated shapes for determining object position. However, this scheme requires numerous secondary electrode connections as well as two or more layers of construction, increasing construction costs and reducing transparency.

[0006] US 5,650,597 shows a 2D sensing method which in its active area requires only one layer but requires large numbers of electrode connections. Resistive strips resolve one axis of position, and the accuracy is dependent on the tolerance of large numbers of resistive strips. This method however does suppress hand shadow effects.
[0007] US 6,297,811 describes a touch screen using triangulated wire outline electrode shapes to create field gradients. However, this patent suffers from the problem that it is difficult to scale up the screen size, as the number of electrode connections to a sensing circuit is one per triangle. It is desirable to dramatically reduce the number of connections in order to reduce cost and simplify construction. Also, it is desirable to use solid shapes rather than wire outlines, which are more expensive to construct. This method however does suppress hand shadow effects.

[0008] Gesture recognition has also been used for many years in such devices. An early example is character recognition in PDAs, such as the original machines from Palm Inc. Tracking finger motion, and single and double taps, on a notebook touchpad is another long used example. More recently, gesture recognition has been incorporated into handheld devices such as the Apple iPhone (RTM). Prior art patent publications on touch screens that involve gesture recognition are also large in number, with significant numbers of publications from Synaptics, Inc. and also more recently Apple Computer, Inc., for example.
`
[0009] US 2007/152984 A1 assigned to Apple Computer, Inc. discloses a portable communication device with multi-touch input which detects one or more multi-touch contacts and motions and performs one or more operations on an object based on the one or more multi-touch contacts and/or motions.

[0010] US 2002/015024 A1 assigned to University of Delaware discloses simultaneously tracking multiple finger and palm contacts as hands approach, touch, and slide across a proximity sensor. Segmentation processing extracts shape, position and surface proximity features for each contact, and a persistent path tracker is used to detect individual contact touchdown and liftoff. Combinatorial optimization modules associate each contact's path with a particular fingertip, thumb, or palm of either hand on the basis of biomechanical constraints and contact features. Classification of intuitive hand configurations and motions enables unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation, and handwriting into a versatile, ergonomic computer input device.
`
[0011] US 5,825,352 discloses a touch panel which is capable of detecting multiple touches simultaneously. In an xy electrode array, maxima and minima are identified in each of the x and y signals, wherein maxima are designated as finger touches. Peak and valley data in the x and y directions are then interpolated to identify the location of one or more fingers on the sensor array.
`
[0012] US 6028271, US 6414671 and US 6750852 are related patents assigned to Synaptics, Inc. which disclose gesture recognition of an object on a touch-sensor pad and for cursor motion. Tapping, drag, push, extended drag and variable drag gestures are recognized by analyzing the position, pressure, and movement of the conductive object on the sensor pad during the time of a suspected gesture, and signals are sent to a host indicating the occurrence of these gestures.

[0013] US 2007/176906 A1 assigned to Synaptics, Inc. discloses a touch sensor having a signal processor adapted to distinguish between three gestures based on different finger motions on the sensing device by providing a workflow with an idle state and three gesture-specific states referred to as first, second and third result states, as illustrated in Figure 5 of US 2007/176906 A1.
`
[0014] Generally, the raw output from the 2D touch sensor will be a time series of x, y coordinates, which are then processed by software, or firmware generated from higher level software, to distinguish the nature of the gesture that has been input. Generally, the raw data is split into contiguous touch segments and then processed to determine what if any gestures can be deduced. The processing of the raw data to identify the gestures may be carried out in the same chip that generates the raw data, or the raw data may be exported to an external chip, for example by transmission over a communication bus to the device's central processing unit (CPU). The former approach is preferred by Synaptics, the latter by Apple as exemplified by US 2006/0066582 A1.
`
[0015] Most of the patent literature is unspecific about how the raw time series data are converted into gestures. The straightforward approach is to write appropriate high level code, for example in C or another suitable programming language, in which the interpretation of the time series data is analyzed using conditional statements, such as if...then...else.
`
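By way of illustration only, the "straightforward approach" of paragraph [0015] might classify a single completed touch segment with nested conditionals as sketched below. The function name, gesture set, and thresholds are invented for this sketch and are not taken from the application.

```c
#include <assert.h>

/* Illustrative sketch: classifying one completed touch segment with
 * plain if/else conditionals. Names and thresholds are assumptions. */

typedef enum { TAP, PRESS, FLICK } gesture_t;

/* duration in milliseconds, distance traveled in sensor coordinate units */
static gesture_t classify(int duration_ms, int distance)
{
    if (distance > 50) {
        return FLICK;        /* fast movement across the sensor */
    } else if (duration_ms < 300) {
        return TAP;          /* short, stationary contact */
    } else {
        return PRESS;        /* long, stationary contact */
    }
}
```

As [0016] goes on to explain, such conditional chains become hard to extend and test once many gestures share the same intermediate input histories.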
[0016] However, it is difficult to reliably and efficiently add code to identify a new gesture into an existing block of code for distinguishing between a significant number of gestures, for example at least 3 or 4, perhaps 10 to 20. Testing of the code is a particular difficulty. This is because in general at any intermediate point in a time series of x,y,t data the input may relate to a plurality of possible gestures, thereby making the coding for recognizing one gesture generally dependent on or linked to the coding for recognizing another gesture.

SUMMARY OF THE INVENTION
`
[0017] The invention solves this problem by adopting a state machine approach to designing and writing the gesture recognition algorithm. In particular, the invention relates to a touch sensor device comprising an at least one-dimensional sensor arranged to output a sense signal responsive to proximity of an object, a position processing unit for calculating a position of an interaction with the sensitive area from an analysis of the sense signals and output a time series of data indicative of interaction positions on the sensor, and a gesture processing unit operable to analyze the time series data to distinguish one or more gesture inputs therefrom, wherein the gesture processing unit is coded with gesture recognition code comprising a plurality of linked state modules.  The invention also relates to a corresponding signal processing method.
`
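A minimal sketch of what "a plurality of linked state modules" can look like in C is given below. The idle and touch state names follow the text; the event names, the function-pointer dispatch and the tap-detection state are assumptions made for illustration, not the patented implementation.

```c
#include <assert.h>

/* Sketch: each state module is a function that consumes one event and
 * returns the next state; modules are linked via a dispatch table.
 * Event names and the tap state are illustrative assumptions. */

typedef enum { ST_IDLE, ST_TOUCH, ST_TAP_DETECTED } state_t;
typedef enum { EV_TOUCH_DOWN, EV_TOUCH_UP, EV_TIMEOUT } event_t;

typedef state_t (*state_module)(event_t ev);

static state_t idle_state(event_t ev)
{
    /* idle is the start state; a touch passes control onward */
    return (ev == EV_TOUCH_DOWN) ? ST_TOUCH : ST_IDLE;
}

static state_t touch_state(event_t ev)
{
    if (ev == EV_TOUCH_UP) return ST_TAP_DETECTED; /* short touch ended */
    return ST_TOUCH;
}

static state_t tap_state(event_t ev)
{
    (void)ev;
    return ST_IDLE;  /* gesture reported; control returns to idle */
}

/* The table indices mirror the state_t values. */
static const state_module modules[] = { idle_state, touch_state, tap_state };

static state_t step(state_t s, event_t ev) { return modules[s](ev); }
```

Adding a new gesture in this style means adding a new state module and its transitions, rather than threading new conditions through an existing conditional block.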
[0018] The gesture recognition code can be written in a high level language such as C and then compiled and embedded in a microcontroller chip, or CPU chip as desired. Preferably, the gesture recognition code is loaded into the same chip that interprets the touch signals from the screen and generates the time series data, e.g. a microcontroller, or other programmable logic device such as a field programmable gate array (FPGA). This approach has been used to create reliable testable code both for single-touch data input screens and also multi-touch data input screens. A single-touch screen is one which assumes only one simultaneous touch of the screen, and is designed to output only one x,y coordinate at any one time. A multi-touch screen is one that can sense multiple simultaneous touches, for example up to 2 or 3 simultaneous touches.
`
[0019] The state machine includes an idle state module which is the start state, and also the state which is returned to after a gesture interpretation state module has been exited.

[0020] Responsive to a touch, the idle state passes control to a touch state.

[0021] In a multi-touch environment, the state machine is implemented in the second embodiment described below such that there are multiple touch states, one for a single touch, one for a double touch, one for a triple touch, etc., with control passing to the appropriate touch state based on the number of simultaneous touches defined by the time series data at the time.
[0022] Although the above approach for handling multitouch gestures by having two-touch and three-touch states linked to one-touch states operates well, redesigning the state machine to, for example, add a new multitouch gesture is difficult in view of the increasingly complex web of states and transitions. This problem is addressed by a fourth embodiment of the invention described below, according to which there is provided a plurality of state machines limited to single-touch gesture recognition. If the gesture recognition code is configured to recognize gestures having up to, say, 3 simultaneous touches, then 3 such single-touch state machines are provided. Further state machines are provided for multi-touch gesture recognition, each catering for a certain number of simultaneous touches, so there is a two-touch state machine and optionally a three-touch state machine, and further optionally additional state machines for still higher numbers of simultaneous touches. A key advantage of this approach is that the same code base is used for handling single touches, and each of 2-, 3- or higher numbers of simultaneous touches is processed using separate additional code embodied in separate state machines.
`
[0023] A touch is usually only output as a valid touch if certain criteria are satisfied, typically that there is a succession of touches at a stable x,y location or x,y region over multiple time sample increments. If a touch of a duration longer than a threshold duration is sensed in the touch state, then control flow passes to a press state module, wherein the press state is for handling longer touches. The press state is preferably a superstate comprising multiple sub-states to distinguish between different durations of press and/or to allow a very long press to be interpreted as being repeat presses, which may be useful for alphanumeric key entry applications for example.
`
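The press superstate idea above can be sketched as follows: a long touch is mapped to a sub-state by its duration, and a very long press is folded into repeat presses. All thresholds, counts, and names here are illustrative assumptions, not values from the application.

```c
#include <assert.h>

/* Sketch of press sub-states selected by held duration. The thresholds
 * are invented for illustration. */

typedef enum { SHORT_PRESS, LONG_PRESS, REPEAT_PRESS } press_substate_t;

#define PRESS_THRESHOLD_MS   300   /* touch -> press transition point */
#define LONG_PRESS_MS       1000   /* entry into the repeat regime */
#define REPEAT_PERIOD_MS     500   /* one extra repeat per period held */

static press_substate_t press_substate(int held_ms)
{
    if (held_ms >= LONG_PRESS_MS) return REPEAT_PRESS;
    return (held_ms >= 2 * PRESS_THRESHOLD_MS) ? LONG_PRESS : SHORT_PRESS;
}

/* Number of key repeats to emit for a very long press, e.g. for
 * alphanumeric key entry applications. */
static int repeat_count(int held_ms)
{
    if (held_ms < LONG_PRESS_MS) return 0;
    return 1 + (held_ms - LONG_PRESS_MS) / REPEAT_PERIOD_MS;
}
```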
[0024] The state machine preferably also has a plurality of state modules for interpreting higher level gestures, such as one or more states for interpreting double taps, flicks, drags and any other gestures. The gestures include those specifically described in this document as well as other gestures known in the art, specifically all those disclosed in the above-referenced prior art documents.
`
[0025] The invention provides in one aspect a touch sensor device comprising: a sensor having a sensitive area extending in at least one dimension and arranged to output sense signals responsive to proximity of an object to the sensitive area; a position processing unit operable to calculate positions of interactions with the sensitive area from an analysis of the sense signals, and output a time series of data indicative of the interaction positions on the sensor, and thus touches; and a gesture processing unit operable to analyze the time series data to distinguish one or more gesture inputs therefrom, wherein the gesture processing unit is coded with gesture recognition code comprising a plurality of linked state modules.
`
[0026] Further aspects of the invention relate to the gesture processing unit on its own and the gesture processing unit in combination with the position processing unit, but without the sensor.

[0027] The plurality of state modules preferably includes an idle state module and a plurality of gesture interpretation state modules, wherein the idle state module is entered at the start of operation, and is returnable to from at least some of the gesture interpretation state modules. The plurality of gesture interpretation state modules may include a touch state module for single touches, and wherein, responsive to a touch, the idle state passes control to the touch state.
`
[0028] In some embodiments, the plurality of gesture interpretation state modules includes at least one multitouch state module operable to process multiple simultaneous touches, and wherein the gesture processing unit is operable to pass control to the appropriate touch state module based on the number of simultaneous touches defined by the time series data at the time. A multitouch state module for each of two simultaneous touches and three simultaneous touches may be provided, and optionally also higher numbers of touches.
`
[0029] The plurality of gesture interpretation state modules may advantageously include a press state module to which control can pass from a touch state module if a touch of a duration longer than a threshold duration is sensed in the touch state module. The press state is preferably a superstate comprising multiple sub-states to distinguish between different durations of press.
`
[0030] In some embodiments, the plurality of gesture interpretation state modules includes a plurality of state modules operable to recognize motion related gestures derived from one or more moving touches. In other embodiments, only static gestures, such as press, tap and double tap are catered for.
`
[0031] The best mode of implementing multitouch gesture interpretation according to the invention provides gesture recognition code configured to recognize gestures having up to N simultaneous touches, wherein N is at least 2, and comprises N single-touch state machines operable to recognize only single touch gestures, and N-1 multi-touch state machines each operable to recognize only n-touch gestures, wherein n = 2 to N.
`
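For N = 3, the arrangement of paragraph [0031] amounts to three single-touch machines (one per concurrent touch) plus a two-touch and a three-touch machine. The dispatch skeleton below sketches only the routing of one frame of the time series to those machines; the machine internals are stubbed, and all names are assumptions made for this sketch.

```c
#include <assert.h>

/* Sketch of the N single-touch / N-1 multi-touch machine layout for
 * N = 3. Only the per-frame dispatch is shown; the state machines
 * themselves are stubbed out. */

#define N_MAX_TOUCHES 3

typedef struct {
    int active;   /* the real per-machine gesture state would live here */
} machine_t;

static machine_t single_touch[N_MAX_TOUCHES];     /* N machines */
static machine_t multi_touch[N_MAX_TOUCHES - 1];  /* 2-touch and 3-touch */

/* Feed one frame of the time series containing n simultaneous touches:
 * each touch drives its own single-touch machine, and the frame as a
 * whole additionally drives the n-touch machine when n >= 2. */
static void dispatch_frame(int n)
{
    for (int i = 0; i < N_MAX_TOUCHES; i++)
        single_touch[i].active = (i < n);
    for (int i = 0; i < N_MAX_TOUCHES - 1; i++)
        multi_touch[i].active = (n == i + 2);
}
```

This separation is what lets the same single-touch code base be reused unchanged while each higher touch count is handled by its own additional machine.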
[0032] The position processing unit and the gesture processing unit may be accommodated in, and run on, a single integrated circuit, for example a microcontroller. Alternatively, the position processing unit may be accommodated in, and run on, a first integrated circuit, such as a microcontroller, and the gesture processing unit accommodated in, and run on, one or more separate integrated circuits, such as a personal computer or other complex system having its own central processing unit, graphics processing unit and/or digital signal processor with associated memory and bus communications.
`
[0033] The invention provides in another aspect a method of recognizing gestures from a time series of touch data comprising coordinates of interaction positions on a touch sensor, the method comprising: receiving touch coordinates labeled with, or ordered by, time; analyzing the touch coordinates in a state machine comprising a plurality of linked state modules to recognize any one of a plurality of defined gestures therefrom; and outputting the recognized gestures.
`
[0034] The invention provides in a still further aspect a single integrated circuit having a memory on which is loaded the above-referenced gesture state machine and which is operable to carry out the method of gesture recognition defined thereby.

[0035] The invention provides in yet another aspect a computer having a memory on which is loaded the above-referenced gesture state machine and which is operable to carry out the method of gesture recognition defined thereby.
`
[0036] It will be appreciated that the gesture state machine approach for gesture recognition can be applied to any hardware platform. Capacitive touch sensors, in particular one-dimensional and two-dimensional capacitive touch sensors, are one important sensor type which can provide a hardware platform for a gesture recognition state machine according to the invention. In particular, the invention is equally applicable to so-called passive or active capacitive sensing techniques.
`
[0037] Passive capacitive sensing devices rely on measuring the capacitance of a sensing electrode to a system reference potential (earth). The principles underlying this technique are described in US 5,730,165 and US 6,466,036, for example. In broad summary, passive capacitive sensors employ sensing electrodes coupled to capacitance measurement circuits. Each capacitance measurement circuit measures the capacitance (capacitive coupling) of its associated sensing electrode to a system ground. When there is no pointing object near to the sensing electrode, the measured capacitance has a background or quiescent value. This value depends on the geometry and layout of the sensing electrode and the connection leads to it, and so on, as well as the nature and location of neighboring objects, e.g. the sensing electrode's proximity to nearby ground planes. When a pointing object, e.g. a user's finger, approaches the sensing electrode, the pointing object appears as a virtual ground. This serves to increase the measured capacitance of the sensing electrode to ground. Thus an increase in measured capacitance is taken to indicate the presence of a pointing object. US 5,730,165 and US 6,466,036 are primarily directed to discrete (single button) measurements, and not to 2D position sensor applications. However, the principles described in US 5,730,165 and US 6,466,036 are readily applicable to 2D capacitive touch sensors (2DCTs), e.g. by providing electrodes to define either a 2D array of discrete sensing areas, or rows and columns of electrodes in a matrix configuration.
`
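The passive-sensing decision described above reduces to comparing each measurement against the electrode's quiescent value and treating a sufficient rise as a touch. The sketch below illustrates only that comparison; the variable names, units ("counts"), and threshold are assumptions made for the sketch.

```c
#include <assert.h>

/* Illustrative sketch of the passive capacitive touch decision:
 * a finger acts as a virtual ground and raises the measured
 * capacitance above its background (quiescent) value. */

#define TOUCH_DELTA 10     /* minimum rise over background, in counts */

static int background = 100;   /* quiescent capacitance measurement */

static int is_touched(int measured)
{
    return (measured - background) >= TOUCH_DELTA;
}
```

A real implementation would also track drift in the background value over time; that filtering is omitted here.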
[0038] Active 2DCT sensors are based on measuring the capacitive coupling between two electrodes (rather than between a single sensing electrode and a system ground). The principles underlying active capacitive sensing techniques are described in US 6,452,514. In an active-type sensor, one electrode, the so called drive electrode, is supplied with an oscillating drive signal. The degree of capacitive coupling of the drive signal to the sense electrode is determined by measuring the amount of charge transferred to the sense electrode by the oscillating drive signal. The amount of charge transferred, i.e. the strength of the signal seen at the sense electrode, is a measure of the capacitive coupling between the electrodes. When there is no pointing object near to the electrodes, the measured signal on the sens
