Docket No.: 122202-6225 (P29273USC1)

AMENDMENTS TO THE CLAIMS

This listing of claims will replace all prior versions, and listings, of claims in the application:

LISTING OF CLAIMS:
1-20. (Cancelled)

21. (New) A method comprising:

detecting a touch input from both outside and inside of a constrained writing region of a touch-sensitive space of a device; and

generating an input character based on a character recognition of the touch input from both outside and inside of the constrained writing region.
22. (New) The method of claim 21, further comprising:

providing multiple touch-sensitive affordances on the touch-sensitive space of the device, the touch-sensitive affordances comprising the constrained writing region.
23. (New) The method of claim 22, wherein generating the input character comprises:

determining that the touch input follows within a threshold amount of time a previous touch input that was used to generate another input character, and

based on the determination, providing the detected touch input from both outside and inside of the constrained writing region to a character recognizing engine to translate the touch input from both outside and inside of the constrained writing region to the input character.
24. (New) The method of claim 23, further comprising:

based on the determination, forgoing providing the touch input to a handler for one of the multiple touch-sensitive affordances that is outside of the constrained writing region.
Application No. 16/181,296
25. (New) The method of claim 22, wherein generating the input character comprises:

providing the detected touch input to a handler for one of the multiple touch-sensitive affordances that is outside of the constrained writing region;

from the handler, receiving a rejection of the touch input; and

responsive to receiving the rejection, providing the touch input from both outside and inside of the constrained writing region to a character recognizing engine to translate the touch input from both outside and inside of the constrained writing region to the input character.
26. (New) The method of claim 21, wherein generating the input character comprises:

generating a graph having a plurality of nodes and a plurality of edges between the nodes, each node corresponding to at least one touch-input stroke's touch-down event or one touch-input stroke's touch-up event, each edge between a pair of nodes representing one or more characters,

for each respective edge in the graph, defining a set of candidate characters represented by the respective edge, and

identifying a plurality of characters written in the touch-sensitive space by selecting a path through the edges of the graph.
27. (New) The method of claim 26, further comprising:

for each respective edge in the graph, associating a score for each candidate character, and

wherein identifying the plurality of characters comprises selecting the path through the edges of the graph that produces a best score based on the score of a selected character for each respective edge on the path.
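For illustration only, the best-score path selection recited in claims 26 and 27 can be sketched as a dynamic program over a small directed graph. The node numbering, edge encoding, characters, and scores below are invented for the example and are not taken from the claims or any actual implementation:

```python
# Minimal sketch: nodes are stroke boundaries; each edge carries candidate
# characters with scores; the best-scoring path spells the recognized string.

def best_path(num_nodes, edges):
    """edges: {(src, dst): {char: score}}. Returns (best_score, string)."""
    # best[n] = (score, text) for the best-scoring path from node 0 to node n
    best = {0: (0.0, "")}
    for n in range(1, num_nodes):
        options = []
        for (src, dst), chars in edges.items():
            if dst == n and src in best:
                # pick the highest-scoring candidate character on this edge
                ch, sc = max(chars.items(), key=lambda kv: kv[1])
                options.append((best[src][0] + sc, best[src][1] + ch))
        if options:
            best[n] = max(options)
    return best[num_nodes - 1]

# Three strokes give nodes 0..3. Edge (0, 2) treats strokes 1-2 together as
# one multi-stroke character; single-stroke edges offer alternatives.
edges = {
    (0, 1): {"l": 0.3, "1": 0.2},
    (1, 2): {"-": 0.4},
    (0, 2): {"t": 0.9},        # multi-stroke edge between non-adjacent nodes
    (2, 3): {"o": 0.8, "0": 0.4},
}
score, text = best_path(4, edges)
print(text)  # "to": the multi-stroke edge outscores "l" + "-"
```

Here the path 0 → 2 → 3 accumulates 0.9 + 0.8, beating the single-stroke path 0 → 1 → 2 → 3, which mirrors how a multi-stroke edge between non-adjacent nodes (claim 28) can win the path selection.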
28. (New) The method of claim 26, wherein the plurality of edges comprises a single-stroke edge between a pair of adjacent nodes in the graph for representing one or more single-stroke characters, and a multi-stroke edge between a pair of non-adjacent nodes in the graph representing one or more multi-stroke characters.
29. (New) The method of claim 26, wherein defining the set of candidate characters for each edge comprises using spatial, temporal, and language constraints to limit the candidate characters to include in the set.
30. (New) The method of claim 26, wherein each of the plurality of nodes corresponds to a touch-up of a prior stroke and a touch-down of a subsequent stroke.
31. (New) The method of claim 21, further comprising:

identifying a series of characters input on the touch-sensitive space of the device,

determining that the identified series of characters includes an improper combination of at least one upper case character and one lower case character, and

responsive to the determining, presenting the series of characters with at least one of the inputted characters having a modified character case.
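As an illustration only, the case correction recited in claim 31 might be sketched with a simple heuristic. The specific rule below (lowercase the tail of a word that improperly mixes cases) is an assumption made for the example, not the claimed logic:

```python
# Illustrative sketch: flag a word as an improper case combination when its
# tail (everything after the first character) mixes uppercase and lowercase
# letters, then present it with the tail lowercased.

def fix_case(word):
    if len(word) < 2:
        return word
    tail = word[1:]
    improper = any(c.isupper() for c in tail) and any(c.islower() for c in tail)
    return word[0] + tail.lower() if improper else word

print(fix_case("HEllo"))  # "Hello": mixed-case tail is normalized
print(fix_case("NASA"))   # "NASA": an all-caps tail is not flagged
```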
32. (New) A device comprising:

a memory, and

at least one processor configured to:

receive a plurality of touch-input strokes on a touch-sensitive space of the device,

generate a graph having a plurality of nodes and a plurality of edges between the nodes, each node corresponding to a touch-down event or a touch-up event of at least one of the plurality of touch-input strokes, each edge between a pair of nodes representing one or more characters,

for each respective edge in the graph, assign a set of candidate characters represented by the respective edge, and

identify a plurality of characters written in the touch-sensitive space by selecting a path through the edges of the graph.
33. (New) The device of claim 32, wherein the at least one processor is further configured to:

for each respective edge in the graph, associate a score for each candidate character, and

identify the plurality of characters by selecting the path through the edges of the graph that produces a best score based on the score of a selected character for each respective edge on the path.
34. (New) The device of claim 32, wherein the plurality of edges comprises a single-stroke edge between a pair of adjacent nodes in the graph for representing one or more single-stroke characters, and a multi-stroke edge between a pair of non-adjacent nodes in the graph representing one or more multi-stroke characters.
35. (New) The device of claim 32, wherein the at least one processor is configured to generate the graph using spatial, temporal, and stroke count constraints to define edges between nodes in the graph.
36. (New) The device of claim 32, wherein the at least one processor is configured to define the set of candidate characters for each edge using spatial, temporal, and language constraints to limit the candidate characters to include in the set.
37. (New) The device of claim 32, wherein each of the plurality of nodes corresponds to a touch-up of a prior stroke and a touch-down of a subsequent stroke.
38. (New) The device of claim 32, wherein the touch-sensitive space comprises a constrained writing region and at least a portion of at least one of the plurality of touch-input strokes is received outside of the constrained writing region.
39. (New) A non-transitory machine readable medium comprising code that, when executed by one or more processors, causes the one or more processors to perform operations, the code comprising:

code to generate stroke input data corresponding to received touch input data;

code to add a node to a stroke graph, the node corresponding to the stroke input data;

code to generate one or more edges in the stroke graph that connect the added node to one or more other nodes based at least in part on one or more stroke constraints;

code to generate one or more edges in a character graph associated with one or more candidate characters that correspond to the one or more edges in the stroke graph;

code to determine a score for each of the one or more candidate characters and assign the score to the associated one or more edges in the character graph; and

code to select one or more of the candidate characters to form a string based at least in part on a sum of the scores corresponding to the one or more edges in the character graph that represent the one or more of the candidate characters.
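For illustration only, the end-to-end flow recited in claim 39 (add a node per stroke, connect edges under a stroke constraint, score candidates in a character graph, and select the score-maximizing string) might be sketched as follows. The stroke-count constraint, the `candidates_for` recognizer interface, and the toy data are all assumptions made for the example, not the claimed code:

```python
# Illustrative sketch: each incoming stroke adds a node to a stroke graph;
# edges connect the new node back to recent nodes (a maximum stroke count
# stands in for the claimed stroke constraints); each stroke-graph edge yields
# scored candidate characters in a character graph, and the output string is
# the chain of candidates maximizing the sum of scores.

MAX_STROKES_PER_CHAR = 2  # assumed stroke constraint for the example

def recognize(strokes, candidates_for):
    """strokes: list of stroke input data. candidates_for(i, j) returns
    {char: score} for the strokes spanning nodes i..j. Returns the string."""
    n = len(strokes)
    char_edges = {}                        # character graph, keyed by span
    for j in range(1, n + 1):              # node j is added after stroke j
        lo = max(0, j - MAX_STROKES_PER_CHAR)
        for i in range(lo, j):             # stroke-graph edge i -> j
            char_edges[(i, j)] = candidates_for(i, j)
    # select the chain of edges from node 0 to node n with the best score sum
    best = {0: (0.0, "")}
    for j in range(1, n + 1):
        options = [
            (best[i][0] + max(ch.values()), best[i][1] + max(ch, key=ch.get))
            for (i, jj), ch in char_edges.items()
            if jj == j and i in best and ch
        ]
        if options:
            best[j] = max(options)
    return best[n][1]

def toy_recognizer(i, j):
    # hypothetical per-span candidate scores for three strokes
    table = {(0, 1): {"i": 0.4}, (1, 2): {"-": 0.2},
             (0, 2): {"t": 0.9}, (2, 3): {"o": 0.7}}
    return table.get((i, j), {})

print(recognize(["s1", "s2", "s3"], toy_recognizer))  # "to"
```

The stroke constraint limits how far back a new node may connect, which keeps the character graph small as strokes stream in.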
40. (New) The non-transitory machine readable medium of claim 39, wherein the touch input data is received on a touch-sensitive space and at least a portion of the touch input data is received outside of a constrained writing region of the touch-sensitive space.