Exhibit 13

U.S. Patent No. 8,526,767 (“’767 Patent”)

Invalidity Chart Based On Primary Reference U.S. Patent Application Publication No. 2008/0036743 (“WESTERMAN”)

WESTERMAN qualifies as prior art to U.S. Patent No. 8,526,767 (“’767 Patent”) at least under 35 U.S.C. § 102(a) and anticipates and, alone or with other references, renders obvious one or more of claims 1-3, 6, and 11-14. To the extent WESTERMAN does not disclose one or more limitations of the claims, it would have been obvious to combine the teachings of WESTERMAN with the knowledge of one of ordinary skill in the art and with one or more of the references below to render the claims at issue in the ’767 Patent invalid.

• U.S. Patent Application Publication No. 2009/0284478 (“BALTIERRA”)
• U.S. Patent Application Publication No. 2007/0247435 (“BENKO”)
• U.S. Patent No. 8,519,965 (“CADY”)
• U.S. Patent Application Publication No. 2009/0325643 (“HAMADENE”)
• Japanese Laid-Open Patent Application Gazette H09-231004 (“KATOU”)
• U.S. Patent Application Publication No. 2009/0213084 (“KRAMER”)
• U.S. Patent Application Publication No. 2010/0020025 (“LEMORT”)
• U.S. Patent Application Publication No. 2008/0046425 (“PERSKI”)
• International Patent Publication No. WO 00/63874 (“STRINGER”)
• U.S. Patent Application Publication No. 2007/0176906 (“WARREN”)
• U.S. Patent Application Publication No. 2009/0225039 (“WILLIAMSON”)
• U.S. Patent Application Publication No. 2007/0046643 (“HILLIS”) (prior art under at least 35 U.S.C. § 102(b))
• U.S. Patent Application Publication No. 2006/0066582 (“LYON”) (prior art under at least 35 U.S.C. § 102(b))
• U.S. Patent Application Publication No. 2007/0152984 (“ORDING”) (prior art under at least 35 U.S.C. § 102(a))
• U.S. Patent Application Publication No. 2007/0291009 (“WRIGHT”) (prior art under at least 35 U.S.C. § 102(a))
• Admitted Prior Art

The excerpts cited herein are exemplary. For any claim limitation, Samsung may rely on excerpts cited for any other limitation and/or additional excerpts not set forth fully herein to the extent necessary to provide a more comprehensive explanation of a reference’s disclosure of a limitation. Where an excerpt refers to or discusses a figure or figure items, that figure and any additional descriptions of that figure should be understood to be incorporated by reference as if set forth fully herein. Similarly, where an excerpt cites to particular text referring to a figure, the citation should be understood to include the figure and related figures as well.

1

SAMSUNG V. SOLAS
IPR2021-01254
Exhibit 2014
Page 1

These invalidity contentions are not an admission by Samsung that the accused products or components, including any current or past version of these products or components, are covered by, or infringe the asserted claims, particularly when these claims are properly construed and applied. These invalidity assertions are also not an admission that Samsung concedes or acquiesces to any claim construction(s) implied or suggested by Plaintiff in its Complaint or the associated infringement claim charts. Nor is Samsung asserting any claim construction positions through these charts, including whether the preamble is a limitation. Samsung also does not concede or acquiesce that any asserted claim satisfies the requirements of 35 U.S.C. §§ 112 or 101 and submits these invalidity contentions only to the extent Plaintiff’s assertions may be understood.
Asserted Claims

Claim 1

Exemplary Disclosures

[1.pre] A touch sensor device comprising:

WESTERMAN, alone or in combination with the knowledge of a person of ordinary skill in the art, discloses and/or renders obvious the touch sensor device recited in claim 1.

WESTERMAN at Abstract:
“Methods and systems for implementing gestures with sensing devices are disclosed. More particularly, methods and systems related to gesturing with multipoint sensing devices are disclosed.”

WESTERMAN at [0025]-[0026]:
“With touch pads, the movement of the input pointer corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad. Touch screens, on the other hand, are a type of display screen that has a touch-sensitive transparent panel covering the screen. When using a touch screen, a user makes a selection on the display screen by pointing directly to GUI objects on the screen (usually with a stylus or finger). In general, the touch device recognizes the touch and position of the touch and the computer system interprets the touch and thereafter performs an action based on the touch event.

In order to provide additionally functionality, gestures have been implemented with some of these input devices. By way of example, in touch pads, selections may be made when one or more taps are detected on the surface of the touch pad.”

WESTERMAN at [0029]:
“The invention relates, in one embodiment, to an electronic system. The electronic system includes a multipoint sensing device that provides a multipoint sensing area for receiving inputs from one or more objects. The electronic system also includes a gesture module configured to determine a gesture set for a given input arrangement received by the multipoint sensing area of the multipoint sensing device, to monitor the given input arrangement for one or more gesture events included in the gesture set, and to initiate input actions associated with a gesture event
when the gesture event is performed with the input arrangement. The input arrangement may for example be an arrangement of fingers and/or other parts of the hand.”

WESTERMAN at [0030]:
“The invention relates, in another embodiment, to a gestural control method. The method includes detecting multiple points within a sensing area at the same time. The method also includes determining a chord when one or more points are detected within the sensing area. The chord is a specific arrangement of points within the sensing area. The method further includes determining a gesture set associating commands to one or more gesture events. The method additionally includes monitoring points for gesture events. Moreover, the method includes performing command associated with gesture event if a gesture event is recognized.”

WESTERMAN at [0031]:
“The invention relates, in another embodiment, to a control operation. The control operations includes detecting a touch or near touch. The operations also includes determining a gesture set for the touch. The gesture set includes one or more gesture events for provoking or initiating a command. The operation further includes monitoring the touch for a gesture event. The operation additionally includes initiating a command when a gesture event associated with the gesture set is performed.”

WESTERMAN at [0032]:
“The invention relates, in another embodiment, to a gesture operation. The operation includes monitoring a touch motion. The operation also includes differentiating the touch motion between first and second states. The operation further includes performing a first action if the touch motion is associated with first state. The operation additionally includes performing a second action if motion is associated with second state.”

WESTERMAN at [0033]:
“The invention relates, in another embodiment, to a control operation. The control operation includes providing a first input device and a second input device that is different than the first input device. The first input device includes an object sensing device such as a touch sensing
device for providing input events. The operation also includes monitoring the first input device for input events. The operation further includes simultaneously monitoring the second input device for input events. The operation additionally includes performing input operations in accordance with input events associated with first input device. Moreover, the method includes simultaneously performing input operations in accordance with input events associated with second input device.”

WESTERMAN at [0034]:
“The invention relates, in another embodiment, to a control operation. The control operation provides a list of input functions. The input function have commands and gesture events that are linked to the commands. The commands are related to the input function. The operation also includes assigning input functions to chords. The operation additionally includes linking an input function to a chord when the chord is recognized.”

WESTERMAN at [0037]:
“The invention relates, in another embodiment, to a gesture operation. The gesture operations includes detecting a first finger. The gesture operation also includes determining the state of the finger. The state of the finger may for example be moving or stationary. The gesture operation further includes detecting one or more additional fingers. For example, a second finger may be detected. The gesture operation additionally includes determining the state of the additional fingers. The state of the additional fingers may for example be that they are present or not. Moreover, the method includes implementing different input modes based on timing of states of first and additional fingers relative to one another. The different modes may for example be pointing modes, dragging modes and the like.”

WESTERMAN at [0093]-[0095]:
“Gestures and methods of implementing gestures with sensing devices are disclosed. More particularly, gestures and methods of implementing gestures with multipoint sensing devices are disclosed. Multipoint sensing devices have a number of advantages over conventional single point devices in that they can distinguish more than one object (finger) simultaneously or near simultaneously. In most cases, multipoint sensing devices and systems that utilize such devices
monitor a surface for a touch or near touch event. When such an event occurs, it can determine the distinct area(s) of contact and identify the nature of the events via their geometric features and geometric arrangement. Once identified, the touch or near touch events are monitored to determine if they correspond to various gestures events.

A gesture event may be defined as a stylized interaction with the sensing surface mapped to one or more specific computing operations. Gesture events may be made through various hand, and more particularly digit, motions, taps, pressures, dwells, and/or the like. Because the surface is based on multipoint technology, complex gesturing may be performed with any number of digits or other contact portions of the hand. In fact, because of this, a large gesture language analogous to sign language may be developed. Gesture language (or map) may include for example a set of instructions that recognize an arrangement of contacts (e.g., chords), recognizes the occurrence of gesture events (e.g., motions), and informs one or more software agents of the gesture events and/or what action(s) to take in response to the gesture events.
…
A wide range of different gestures can be utilized with multipoint sensing devices. For example, a gesture may be a single point or a multipoint gesture; a static or dynamic gesture; a continuous or segmented gesture; and/or the like. Single point gestures are those gestures that are performed with a single contact point, e.g., the gesture is performed with a single touch as for example from a single finger, a palm or a stylus. Multipoint gestures are those gestures that can be performed with multiple points, e.g., the gesture is performed with multiple touches as for example from multiple fingers, fingers and palms, a finger and a stylus, multiple styli and/or any combination thereof. Static gestures may be those gestures that do not substantially include gesture events (e.g., chords), and dynamic gestures may be those gestures that do include significant gesture events (e.g., motions, taps, etc.). Continuous gestures may be those gestures that are performed in a single stroke, and segmented gestures may be those gestures that are performed in a sequence of distinct steps or strokes.”

WESTERMAN at [0096]:
“Multipoint sensing devices can be embodied in various forms including but not limit to standard sized touch pads, large extended palm pads, touch screens, touch sensitive housings, etc. Furthermore, multipoint sensing devices can be positioned on many form factors including but
not limited to tablet computers, laptop computers, desktop computers as well as handheld computing devices such as media players, PDAs, cell phones, and the like. The multipoint sensing devices may also be found on dedicated input devices such as touch screen monitors, keyboards, navigation pads, tablets, mice, and the like.”

WESTERMAN at [0098]-[0111]:
“FIG. 1 shows illustrative gesture control operation 10, in accordance with one embodiment of the present invention. The operation 10 may begin at block 12 where a multi-point sensing device is provided. The multi-point sensing device is capable of detecting multiple points of contact or near contact at the same time. The multi-point sensing device may for example include a multi-touch sensing surface capable of simultaneously sensing multi objects on the its touch surface.
…
Following block 12, the operation can proceed to block 14 where a determination is made as to whether or not a touch or near touch is detected by the multi-point sensing device. If a touch is not detected, the operation can wait. If a touch is detected, the operation can proceed to block 16 where a chord associated with the touch is determined. A chord may be a specific arrangement of contacts or near contacts that can be assigned to some input functionality.
…
The chord can be widely varied and may depend on many factors including the size of the touch surface, whether the touch surface is a touch screen or touch pad, etc. Furthermore, the chords may be based on the number of unknown contacts or a specific arrangement of known contacts. The chords may be further based on whether the contacts are close together, in a neutral position or spread apart. The chords may be further based on whether the contacts are adjacent or offset one another. The chords may be further based on the whether they are from left and/or right hand.

Determining the chord may include analyzing the touch (image created by objects touching or near touching the touch surface) and recognizing a specific arrangement of contacts. More particularly, the determination may include classifying or identifying the contacts via the geometric features of each contact as well as the geometric arrangement of contacts, and then referring to a database of expected chords (e.g., gesture map).
…
Examples of one hand chords are shown below in Table 1. It should be appreciated that Table 1 is not an exhaustive list and that it is shown by way of example and not by way of limitation. For example, the palm may be counted as a contact and added to any of the combinations shown in Table 1 to create more chords. It should be further noted that many more chord combinations can be created by combining the list of Table 1 with the same list from the opposite hand. It should also be noted that although there are many combinations some chords may not be feasible for various reasons including ease of use, ergonomics, intuitiveness, etc.

Once the chord has been determined, the operation can proceed to block 18 where a gesture set associating actions to one or more gesture events is determined. A gesture set may be a map that links actions to gesture events. The gesture set may depend on the determined chord, but may also depend on other factors including location of the chord, open application, state or mode of the application, other touch characteristics, etc. In essence, the chord selects an input channel with each channel having a different gesture set associated therewith. In some cases, for organizational purposes and ease of use, the channels are associated with a particular input functionality such as navigating operations, file operations, edit operations, viewing operations, formatting operations, tool operations, web browsing operations, etc. Each of these input functionalities can have an associated set of commands that are tied to gesture events.

The actions may be state commands or manipulative commands.
…
A manipulative command is a command that continuously manipulates the selected object. Examples of manipulative commands include pointing, tracking, dragging, scrolling, panning, zooming, sizing, stretching, paging, volume, etc.
…
As noted above, a gesture event can be any stylized physical action that can be performed on or above the touch surface. Examples of gesture events may include for example motions, taps, pressure changes, dwells, etc. In some cases, the gesture events may be performed by the chord. In other case, the gesture events may be performed by a subset of the chord. In other cases, the gesture events may be performed by new contacts in addition to or separate from the initial chord.
…
In block 20, the touch can be monitored for gesture events, and in block 22 a determination can be made as to whether or not gesture events are recognized. Recognizing the gesture events may include analyzing the touch characteristics of contacts (contacts associated with the chord and/or new contacts), identifying a specific pattern and referring to the gesture set and its list of expected gesture events. If the pattern matches an expected gesture event then the pattern may be presumed to be the expected gesture event. The touch characteristics may for example include first order consideration such as motion, tapping, change in pressure, dwell, and second order considerations such as speed (absolute or relative), direction (absolute or relative), orientation (absolute or relative), size (absolute or relative), duration (absolute or relative), shape (absolute or relative), length (absolute or relative), and/or the like.
…
If a gesture event is recognized, the operation can proceed to block 24 where the action(s) associated with the gesture events are performed. Block 24 may include referring to the gesture set and locating the action(s) associated with the recognized gesture events. Once located, the action(s) can be initiated.
…
Following block 24, the operation can proceed to block 26 where a determination is made as to whether or not a switching event has been performed. A switching event can refer to an event that resets the operation or initiates a chord change. The switching event may be implemented in a variety of ways. For example, it may be implemented by removing all contacts for a predetermined amount of time (e.g., lifting hand off of touch surface). It may also be implemented by changing the base chord during the touch (e.g., adding/removing contacts). It may also be implemented by adding/removing contacts from the opposite hand (e.g., placing one or more fingers down with the opposite hand while the other hand is still touching). It may also be implemented by pausing (e.g., if the contacts stay stationary for a preset amount of time). It may also be implemented by a key entry or button click from a standard keyboard or mouse. It may also be implemented via a gesture event. If a switching event has occurred, the operation proceeds back to block 12. If a switching event has not occurred, then the operation proceeds back to block 20.”
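
For illustration only, and not as part of WESTERMAN or this chart, the core of the FIG. 1 flow quoted above (a determined chord selects a gesture set mapping gesture events to actions, recognized events then initiate the linked actions) can be sketched as follows; every name and every chord/event string here is hypothetical:

```python
# Illustrative sketch of FIG. 1 blocks 16-24 (hypothetical names throughout).
GESTURE_MAP = {
    # chord -> gesture set (gesture event -> action)
    "two_adjacent_fingers": {"slide": "scroll", "tap": "select"},
    "three_adjacent_fingers": {"slide": "drag"},
}

def run_gesture_session(chord, events, gesture_map=GESTURE_MAP):
    """Given a determined chord (block 16) and a sequence of recognized
    gesture events (blocks 20-22), return the actions initiated (block 24)."""
    gesture_set = gesture_map.get(chord, {})   # block 18: load the gesture set
    actions = []
    for event in events:
        if event in gesture_set:               # block 22: expected event?
            actions.append(gesture_set[event])  # block 24: initiate action
    return actions
```

Under these assumptions, sliding a two-adjacent-finger chord would initiate scrolling, while an event outside the loaded gesture set initiates nothing.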
WESTERMAN at [0112]-[0116]:
“FIG. 2 shows illustrative control operation 50, in accordance with one embodiment of the present invention. The operation may begin at block 52 where a touch or near touch is detected. Following block 52, the operation can proceed to block 54, where a gesture set is determined for the touch. The gesture set may depend on many factors including touch characteristics, touch location, open application, mode of application, and the like. In most cases, the gesture set is based at least in part on the arrangement of contacts at touchdown.

Following block 54, the operation can proceed to block 56 where the touch is monitored for gesture events associated with the gesture set. The gesture set may include one or more gesture events for provoking or initiating a command (e.g., commands can be tied or linked to specific gesture events).

Following block 56, the operation can proceed to block 58 where one or more commands are initiated when gesture events are performed. For example, a user may slide a specific arrangement of fingers to provoke or initiate a scrolling event.

FIG. 3 shows illustrative control operation 60, in accordance with one embodiment of the present invention. The control operation 60 may for example correspond to block 54 in FIG. 2. The control operation 60 may begin at block 62 where the initial arrangement of contacts are recognized. Thereafter, in block 64, the initial arrangement of contacts can be compared to a stored set of arrangement of contacts. For example, the system may refer to a gesture map that includes a list of initial arrangement of contacts and gesture sets assigned thereto. If there is a match, the operation can proceed to block 66 where the gesture set assigned to the recognized initial arrangement of contacts is loaded.”

WESTERMAN at [0117]:
“FIG. 4 shows illustrative control operation 70, in accordance with one embodiment of the present invention. The control operation 70 may begin at block 72 where a touch is detected. Thereafter, in block 74, the location of the touch can be determined. Thereafter, in block 76, the arrangement of contacts associated with the touch can be determined (e.g., touch pattern). Thereafter, in block 78, the active application can be determined. Thereafter, in block 80, the current state of the application can be determined (e.g., current mode). Thereafter, in block 82, an appropriate gesture set can be set or selected based on one or more of the determined
attributes mention above (blocks 74-80). For example, using the determined attributes, a system may refer to a stored gesture map that links each of the above mentioned attributes to a particular gesture set.”

WESTERMAN at [0118]:
“FIG. 5 shows illustrative control operation 100, in accordance with one embodiment of the present invention. The control operation may begin at block 102 where a determination is made as to whether or not a touch is detected. If a touch is detected, the operation can proceed to block 104 where the arrangement of contacts are recognized. Block 104 may include sub blocks 106 and 108. In block 106 a determination is made as to whether the contact patches can be precisely identified. For example, whether a contact patch may be an index finger or thumb or palm. If they cannot be precisely identified, then the operation can proceed to block 108 where the number of contact patches are determined. For example, whether there are two contact patches, three contact patches, etc. Following block 104, the operation can proceed to block 110 where the recognized arrangement of contacts are compared to stored arrangement of contacts in a gesture map. If there is no match, then the operation can proceed back to block 102. If there is a match, then the operation can proceed to block 112 where after referring to the gesture map, the gesture set associated with the initial arrangement of contacts are loaded. Thereafter, in block 116, the touch can be monitored for gesture events associated with the gesture set. If a gesture event is performed, the operation can proceed to block 118 where the command associated with the gesture event is performed.”
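
Purely as an illustrative sketch (not taken from WESTERMAN), the two-stage recognition of FIG. 5 quoted above, where contact patches are identified precisely when possible (block 106) and otherwise only counted (block 108) before consulting the gesture map (blocks 110-112), might look like this; the patch names and map entries are hypothetical:

```python
# Illustrative sketch of FIG. 5 blocks 104-112 (hypothetical names throughout).
KNOWN_PATCHES = {"thumb", "index", "middle", "ring", "pinky", "palm"}

def recognize_arrangement(patches):
    """Blocks 106-108: identify each contact patch precisely if possible;
    otherwise fall back to the number of contact patches."""
    if all(p in KNOWN_PATCHES for p in patches):
        return tuple(sorted(patches))   # precise arrangement of contacts
    return len(patches)                 # imprecise: count only

def load_gesture_set(patches, gesture_map):
    """Blocks 110-112: compare the recognized arrangement against stored
    arrangements in the gesture map; return its gesture set, or None."""
    return gesture_map.get(recognize_arrangement(patches))
```

A miss (no stored arrangement matches) returns None, corresponding to the flow proceeding back to block 102.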

WESTERMAN at [0119]:
“FIG. 6 shows illustrative control operation 120, in accordance with one embodiment of the present invention. The control operation 120 may begin at block 122 where a touch or near touch is detected. Thereafter in block 124 a chord can be determined for the touch. Thereafter, in block 126, the input functionality associated with chord can be determined.
…
Thereafter, in block 128 the gesture set associated with the input functionality can be activated or loaded.
…
The gesture event may also include second order parameters that define the first order parameters such as speed, direction, shape, timing/duration, length, and/or the like. Thereafter, in block 130, actions associated with gesture events can be implemented when gesture events are performed.”

WESTERMAN at [0120]:
“FIG. 7 shows illustrative gesture operation 140, in accordance with one embodiment of the present invention. The operation 140 may begin at block 142 where a touch is detected. Following block 144, the operation can proceed to block 144 where an arrangement of contacts are recognized. Thereafter, in block 146, the chord associated with the recognized arrangement of contacts can be determined. Following block 146, the operation can proceed to block 148 where the gesture set associated with the chord is loaded. The gesture set contains gesture events that can be performed with any arrangement of contacts during the touch. Thereafter, in block 150, the touch can be monitored for a gesture event performed by any arrangement of contacts. If a gesture event has been performed by any arrangement of contacts, the operation can proceed to block 152 where the control/command associated with the gesture event is initiated. If a gesture event has not been performed, the operation can proceed to block 154 where a determination is made as to whether or not a touch is still detected. If a touch is still detected, the operation can proceed back to block 150. If a touch is not detected, the operation can proceed back to block 142. That is, a lift followed by a touch resets the chord and thus the gesture set.”

WESTERMAN at [0122]:
“FIG. 8 shows illustrative gesture operation 160, in accordance with one embodiment of the present invention. The operation 160 may begin at block 162 where a determination is made as to whether or not 2 adjacent fingers are detected. If so, the operation can proceed to block 164 where pointing operations are performed in accordance with motion. If not, the operation can proceed to block 166 where a determination is made as to whether or not 3 adjacent fingers are detected. If so, the operation can proceed to block 168 where dragging operations are performed in accordance with motion. If not, the operation can proceed to block 170 where a determination is made as to whether or not a thumb and two adjacent fingers are detected. If so, the operation can proceed to block 172 where secondary dragging operations are performed. If not, the
operation can proceed to back to block 162. This process can be reset each and every time all the fingers are lifted off of the touch surface (e.g., touch is no longer detected).”
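
The decision chain of FIG. 8 quoted above (two adjacent fingers point, three adjacent fingers drag, a thumb plus two adjacent fingers performs secondary dragging) reduces to a simple dispatch. The following is an illustrative sketch only; the function name, parameters, and return strings are hypothetical, not from WESTERMAN:

```python
def dispatch_fig8_chord(fingers, adjacent, thumb):
    """Illustrative sketch of the FIG. 8 decision chain (blocks 162-172):
    map a detected finger arrangement to the operation performed."""
    if fingers == 2 and adjacent and not thumb:
        return "pointing"             # block 164: 2 adjacent fingers
    if fingers == 3 and adjacent and not thumb:
        return "dragging"             # block 168: 3 adjacent fingers
    if fingers == 2 and adjacent and thumb:
        return "secondary dragging"   # block 172: thumb + 2 adjacent fingers
    return None                       # no chord matched: back to block 162
```

Lifting all fingers (touch no longer detected) would reset the process, which here simply corresponds to calling the dispatcher afresh on the next touch.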

WESTERMAN at [0123]:
“FIG. 9 shows illustrative gesture operation 180, in accordance with one embodiment of the present invention. The operation 180 may begin at block 182 where a base chord is determined. Thereafter, the operation can perform three different processes either separately or simultaneously (parallel blocks 184-188). In block 184, motion can be detected with the base chord. Thereafter, in block 190 pointing operations can be performed in accordance with the motion. In block 186, a new first digit can be detected. That is, a new first digit not associated with the base chord can be detected (in addition to the base chord). Thereafter, in block 192, a first command can be initiated each time the new first digit is detected. In some cases, a user can perform repetitive commands by continuously tapping the new first digit. In block 188, a new second digit can be detected (in addition to the base chord). That is, a new second digit not associated with the base chord can be detected. Thereafter, in block 194, a second command can be initiated each time the new second digit is detected.”

WESTERMAN at [0126]:
“FIG. 10 shows illustrative gesture operation 200, in accordance with one embodiment of the present invention. The operation 200 may begin at block 202 where a touch is detected. Following block 204, the operation can proceed to block 204 where an arrangement of contacts are recognized. Thereafter, in block 206, the chord associated with the recognized arrangement of contacts can be determined. Following block 206, the operation can proceed to block 208 where the gesture set associated with the chord is loaded. The gesture set may contain gesture events that can be performed with any arrangement of contacts during the touch. Thereafter, in block 210, the touch can be monitored for a gesture event performed by any arrangement of contacts. If a gesture event has been performed by any arrangement of contacts, the operation can proceed to block 212 where the control/command associated with the gesture event is initiated. If a gesture event has not been performed, the operation can proceed to block 214 where a determination is made as to whether the base chord or current arrangement of contacts has paused during the touch. If so, the pause can be presumed to be a chord switching event and
the operation can proceed back to block 204. If not, the operation can proceed to block 216 where a determination is made as to whether or not a touch is still detected. If a touch is still detected, the operation can proceed back to block 210. If a touch is not detected, the operation can proceed back to block 202.”

WESTERMAN at [0128]:
“FIG. 11 shows illustrative gesture operation 220, in accordance with one embodiment of the present invention. The operation 220 may begin at block 222 where a determination is made as to whether or not 1 finger is detected. If so, the operation can proceed to block 224 where pointing operations are performed in accordance with motion. If not, the operation can proceed to block 226 where a determination is made as to whether or not 2 adjacent fingers are detected. If so, the operation can proceed to block 228 where dragging operations are performed in accordance with motion. In some cases, drag lock/extend may be initiated by clicking and in other cases it is initiated by dropping the thumb (two adjacent fingers+thumb). If not, the operation can proceed to block 230 where a determination is made as to whether or two non adjacent fingers are detected. If so, the operation can proceed to block 232 where secondary dragging operations are performed. In some cases, drag lock/extend may be initiated by clicking and in other cases it is initiated by dropping the thumb (two non adjacent fingers+thumb). If not, the operation can proceed to block 234 where a determination is made as to whether of not 3 or four fingers are detected. If so, scrolling can be initiated in accordance with motion (block 236). If not the operation can proceed back to block 222. This process can be reset each and every time all the fingers are lifted off of the touch surface (e.g., touch is no longer detected) or if the chord is paused for a moment.”

WESTERMAN at [0129]:
“FIG. 12 shows illustrative gesture operation 240, in accordance with one embodiment of the present invention
