United States Patent [19]
Pisutha-Arnond

US005745116A

[11] Patent Number: 5,745,116
[45] Date of Patent: Apr. 28, 1998

[54] INTUITIVE GESTURE-BASED GRAPHICAL USER INTERFACE

[75] Inventor: Suthirug Num Pisutha-Arnond, Wheeling, Ill.

[73] Assignee: Motorola, Inc., Schaumburg, Ill.

[21] Appl. No.: 711,236

[22] Filed: Sep. 9, 1996

[51] Int. Cl.6 ............................. G06F 15/00
[52] U.S. Cl. ............................. 345/358; 345/352; 345/353; 345/354
[58] Field of Search ..................... 345/326, 327, 328, 335, 337, 338, 339, 340, 347, 348, 349, 352, 353, 354, 358

[56] References Cited

U.S. PATENT DOCUMENTS

4,803,474   2/1989  Kulp ............... 340/709
5,384,910   1/1995  Torres ............. 345/352
5,386,568   1/1995  Wold et al. ........ 395/700

OTHER PUBLICATIONS

Evan Graham, DSI Datotech Systems, Inc., "White Paper Dato Patented Pointing Gesture System", Aug. 1996, pp. 1-20.

Primary Examiner-Matthew M. Kim
Assistant Examiner-Huynh Ba
Attorney, Agent, or Firm-Sylvia Chen; Rolland R. Hackbart

[57] ABSTRACT

A user performs a manual selection or a gesture selection of a screen object (210, 220, 230) on a screen (150) of an electronic device (100) using a pointing device (190). After a manual selection, such as a single tap, the electronic device (100) automatically presents a temporary directional palette (450) having palette buttons (451, 452, 453, 454, 455) that explicitly state functions of the electronic device (100). Each palette button has a unique compass direction relative to the original tap area. By making a second tap on a desired palette button, a novice user learns available functions of the electronic device (100) and their corresponding directional gestures. Alternately, the user may perform a gesture selection of both a screen object and a function, such as making a double tap or drawing a line in the appropriate direction, before the directional palette appears on the screen (150).

16 Claims, 5 Drawing Sheets
[Representative drawing: screen 150 showing an electronic mail list ("5 emails, 3 new") with a message screen object 210 and a directional palette.]
[Sheet 1 of 5: FIG. 1, electronic device with touch-sensitive screen 150 and push buttons 120.]
[Sheet 2 of 5: FIGS. 2, 3, and 4, screen 150 displaying electronic mail screen objects 210, 220, 230, a manual selection of a screen object, and a directional palette.]
[Sheet 3 of 5: FIGS. 5 and 7, manual selection of a palette button and a sub-palette with palette buttons 751 and 753.]
[Sheet 4 of 5: FIGS. 6 and 8, gesture selection tracked by a line (620) on screen 150, and sub-palette 750.]
[Sheet 5 of 5: FIG. 9, flow chart of the interface operation: START (901); DISPLAY ONE OR MORE OBJECTS (910); RECEIVE USER INPUT (HIGHLIGHT) (920); a tap branch (933) leads to DISPLAY DIRECTIONAL PALETTE (940), RECEIVE USER INPUT (HIGHLIGHT) (950), and a tap/gesture decision (955); gesture branches (935) lead to DISPLAY PALETTE BUTTON (970); loop 963 repeats the palette until a function call (965) reaches PERFORM FUNCTION (BLINK) (980); END (990).]
INTUITIVE GESTURE-BASED GRAPHICAL USER INTERFACE

FIELD OF THE INVENTION

This invention relates generally to user interfaces, and more particularly to graphical user interfaces that allow gestures.

BACKGROUND OF THE INVENTION

Generally, a graphical user interface uses linear menus, screen buttons, and a pointing device, such as a mouse, joystick, or touch-pad, to control an electronic machine such as a computer or computer-controlled appliance. The pointing device selects an area of the screen to indicate a desired function, and then a selection button, such as a mouse button, commands implementation of that function. In order for a user to know what functions are available, the linear menus and screen buttons explicitly state or iconically show their functionality on a continuous basis, which reduces the amount of display space available to present other information to the user. To minimize the screen space taken by the linear menus and screen buttons, pull-down or other sub-menus may list collateral choices and present additional functions upon selection of a main menu item.

Several pen- or stylus-based electronic products have adopted "gesture" user interfaces as shortcuts for performing frequently-used functions. Gestures, however, are available only as an addition to conventional linear menu and screen button graphical user interfaces; thus the problem of reduced screen space remains in both types of user interfaces. Through simple, learned motions, gesture user interfaces allow quick access to functions. For example, instead of using a pointing device to select characters to be deleted and depressing a "delete" button, a user could simply strike through words (from left to right) using the stylus and they would be deleted. Unfortunately, most of the gestures available are not nearly as intuitive as this deletion example. Also, the gesture shortcuts are radically different from the linear menu and screen button methods of achieving the same function.

Thus, these gesture user interfaces are meant to be used only by expert users. Novice users have no way of easily ascertaining what gestures are available. The novice user must first go through a process of learning available gestures and their functionality from user documentation or on-screen "help" documentation and then memorizing these gestures. If a gesture is forgotten, the user must make a concerted effort to return to the documentation to relearn the gesture and its function. Consequently, there is a need for an easy-to-learn gesture-based user interface that allows users to quickly access frequently-used functions on an electronic machine and yet avoid the inconvenience of documentation. Also, there is a demand for a gesture-based user interface that reduces the need for linear menus and screen buttons which consume display space.

SUMMARY

An intuitive gesture-based graphical user interface allows users to quickly learn gestures as a method of accessing frequently-used functions of a computer or computer-based appliance. The intuitive gesture-based graphical user interface also reduces the need for screen buttons that decrease available display space. Preferably, the intuitive gesture-based graphical user interface is implemented using a pen- or stylus-based device, and the user performs a manual selection of a screen item by tapping, or drawing a point, on a touch-sensitive screen using the stylus. After a single tap, the device would present a directional palette with palette buttons having different compass directions relative to the center of the directional palette. Each palette button explicitly displays a unique identifier representing a function or other item of information. By making a manual selection of a desired palette button, such as a second tap, the novice user learns available functions of the device and their corresponding gestures. As the user grows more familiar with the electronic device, the user may tap twice in succession on a screen object or draw a line in an appropriate compass direction originating from a selected screen object, and the device would process this gesture selection appropriately without the directional palette appearing.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a diagram of an intuitive gesture-based graphical user interface according to a preferred embodiment as implemented on an electronic device.

FIG. 2 shows details of the screen of the electronic device shown in FIG. 1 according to a preferred embodiment.

FIG. 3 shows details of the screen shown in FIG. 2 when a user performs a manual selection of a screen object according to a preferred embodiment.

FIG. 4 shows details of the screen shown in FIG. 2 when a directional palette appears according to a preferred embodiment.

FIG. 5 shows details of the screen shown in FIG. 2 when a user performs a manual selection of a palette button on the directional palette shown in FIG. 4 according to a preferred embodiment.

FIG. 6 shows details of the screen shown in FIG. 2 when a user performs a gesture selection of a screen object and a function according to a preferred embodiment.

FIG. 7 shows details of the screen shown in FIG. 2 when a user performs a manual selection of a palette button on the directional palette shown in FIG. 4 according to another preferred embodiment.

FIG. 8 shows details of the screen shown in FIG. 2 when a user performs a gesture selection of a screen object according to another preferred embodiment.

FIG. 9 shows a flow chart diagram of the operation of an intuitive gesture-based graphical user interface according to a preferred embodiment.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
FIG. 1 shows a diagram of an intuitive gesture-based graphical user interface according to a preferred embodiment as implemented on an electronic device. Electronic device 100 is preferably a computer or other microprocessor-based appliance. In this diagram, electronic device 100 is shown as an integrated wireless communication device with radio-telephone, electronic mail, and facsimile features; however, electronic device 100 could be a desktop computer or a portable computer with a MODEM (modulator/demodulator), a television/VCR combination, a facsimile machine, a photocopy machine, a personal digital assistant, or the like. Electronic device 100 has a touch-sensitive screen 150 and push buttons 120. A pointing device 190 such as a pen or stylus interacts with the screen 150 to select areas of the screen. Of course, another pointing device, such as a mouse, joystick, touch-pad, or even a human finger, could be substituted for a pen or stylus.
FIG. 2 shows details of the screen of the electronic device shown in FIG. 1 according to a preferred embodiment. This screen displays one or more screen objects 210, 220, 230, such as a list of received messages in an electronic mailbox, as directed by the microprocessor. A screen object is a graphical representation of data. Although an electronic mail software program is used in this example, many other programs may be substituted with other types of screen objects, such as an address book program with an alphabet index as screen objects, a scheduling program with calendar dates as screen objects, a memo or to-do program with list items as screen objects, an electronic game with directional buttons as screen objects, or electronic programs suitable for functions such as photocopying or facsimile transmission.
There are two methods of selecting a screen object and an associated function. A manual selection of a screen object, such as a tap or a drawing of a point, automatically brings up a directional palette with palette buttons having explicit identifiers to indicate functions corresponding to the manually-selected screen object. A manual selection of a palette button, such as a second tap on the desired palette button, directs execution of the identified function. On the other hand, a gesture selection allows the designation of both a screen object and a function simultaneously. A gesture selection, such as drawing a line starting on a screen object and going in a certain direction, indicates the desired screen object from the starting point of the line and the desired function from the direction the line is drawn.
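To make the tap-versus-line distinction concrete, the following Python sketch shows one plausible way an implementation could classify a stylus stroke as a manual selection or a gesture selection. It is an illustration only, not the patented implementation; the name `classify_input` and the idea of thresholding on drag distance (`TAP_RADIUS`) are assumptions.

```python
import math

TAP_RADIUS = 8  # pixels; assumed threshold separating a tap from a drawn line

def classify_input(stroke):
    """Classify a stylus stroke as a 'manual' selection (a tap or drawn point)
    or a 'gesture' selection (a drawn line), per the two methods described."""
    (x0, y0), (x1, y1) = stroke[0], stroke[-1]
    displacement = math.hypot(x1 - x0, y1 - y0)
    return "manual" if displacement <= TAP_RADIUS else "gesture"

# A tap stays within the radius; a right-to-left line is a gesture.
assert classify_input([(100, 40), (102, 41)]) == "manual"
assert classify_input([(100, 40), (60, 40)]) == "gesture"
```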
In order to manually select a screen object, a novice user would intuitively tap, or draw a point, on a screen object to select an electronic mail message. FIG. 3 shows details of the screen shown in FIG. 2 when a user performs a manual selection of a screen object according to a preferred embodiment. The electronic device preferably provides visual feedback to the user by highlighting the tapped area 310. Audio feedback may optionally be provided to the user. After a manual selection such as a single tap or a drawing of a point, the electronic device would automatically present the user with a directional palette, preferably centered on the same area of the screen where the tap or point occurred.
FIG. 4 shows details of the screen shown in FIG. 2 when a directional palette appears according to a preferred embodiment. Directional palette 450 presents the user with a number of palette buttons. Although the directional palette 450 shown presents five palette buttons to the user, directional palettes may present more or fewer buttons as needed. The shape, size, and configuration of a directional palette depend on which screen object is selected. Each palette button has a unique direction relative to the center of the directional palette, which is preferably also the approximate location of the first tap; these directions may be referred to like compass directions. Each palette button also displays a function identifier stating or showing what function can be accessed by activating the palette button.

In this electronic mail example, a screen object represents a received mail message. Thus the directional palette 450 that appears when a received mail message screen object is manually selected has a palette button 451 to the north, which accesses a cancel function. An alternate method of accessing this cancel function is to tap on a portion of the screen outside of the palette. The cancel function removes the present palette from the screen or otherwise undoes the most recent action. To the east, the user may select palette button 452 to forward the message to another recipient. To the south, palette button 453 directs deletion of the message, and to the west, palette button 454 prompts the user to reply to the sender of the message. Finally, center palette button 455 allows the user to read the message. Tapping on any of the palette buttons calls the function that is identified on the palette button.
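As an illustration of how a directional palette like palette 450 could be represented in software, here is a minimal sketch: a mapping from compass directions to labeled buttons with callbacks. The `PaletteButton` and `DirectionalPalette` types and their field names are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class PaletteButton:
    label: str                  # explicit function identifier shown on the button
    action: Callable[[], None]  # function called when the button is activated

@dataclass
class DirectionalPalette:
    # Each button occupies a unique compass direction relative to the
    # palette center (the approximate location of the first tap).
    buttons: Dict[str, PaletteButton] = field(default_factory=dict)

# The mail-message palette of FIG. 4: north cancels, east forwards,
# south deletes, west replies, and the center button reads the message.
mail_palette = DirectionalPalette(buttons={
    "N":      PaletteButton("cancel",  lambda: print("cancelled")),
    "E":      PaletteButton("forward", lambda: print("forwarding")),
    "S":      PaletteButton("delete",  lambda: print("deleting")),
    "W":      PaletteButton("reply",   lambda: print("reply screen")),
    "center": PaletteButton("read",    lambda: print("read screen")),
})

mail_palette.buttons["W"].action()  # activating the west button starts a reply
```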
In order to aid the user in learning and memorizing the available function options and their related gestures, the directional palette preferably presents function options as logically as possible. In this example, "reply" is to the west while "forward" is to the east. Thus, the user can intuit that a "reply" somehow travels backwards and a "forward" somehow travels forward.
FIG. 5 shows details of the screen shown in FIG. 2 when a user performs a manual selection of a palette button on the directional palette shown in FIG. 4 according to a preferred embodiment. A manual selection of a palette button is implemented using a second tap. After the center palette button 455 is tapped, the microprocessor temporarily highlights the palette button, as if it blinks, and the previous screen as shown in FIG. 2 is replaced by the selected electronic message screen. Thus, the second tap allows the user to read the message selected by the first tap. In this manner, the directional palette 450 teaches the user to "double-tap" to read a selected message.
Continuing with the directional palette shown in FIG. 5, a second tap to the left of the first tap, on the "reply" palette button, will bring up a reply screen. The corresponding gesture would be a line drawn from right to left as shown in FIG. 6. FIG. 6 shows details of the screen shown in FIG. 2 when a user performs a gesture selection of a screen object and a function according to a preferred embodiment. The gesture, a line drawn from right to left on screen object 220, is tracked using line 620 on the screen. Once the reply gesture is recognized by the microprocessor, the reply palette button 454 appears, blinks to provide visual feedback, and a reply screen appears exactly as the reply screen would appear if the user had manually selected the reply palette button from a directional palette. Note that the mental model for accessing a function is the same both for novice users and expert users. Thus, there is no longer a need for separate, redundant screen buttons for novice users.
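The correspondence between a drawn line and a palette button suggests quantizing the line's angle to the nearest compass direction. The sketch below is one plausible way to do that; the eight-way quantization and the function name are assumptions, not the patent's method.

```python
import math

COMPASS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]  # counterclockwise from east

def gesture_direction(start, end):
    """Map a drawn line to the nearest of eight compass directions.
    Screen y grows downward, so the y difference is negated."""
    dx, dy = end[0] - start[0], start[1] - end[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return COMPASS[round(angle / 45) % 8]

# A right-to-left line points west, matching the "reply" gesture of FIG. 6.
assert gesture_direction((120, 50), (40, 50)) == "W"
```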
Once a user has learned a gesture, the user no longer has to wait for a directional palette to appear in order to access available functions. Thus, the intuitive gesture-based graphical user interface teaches a novice user to gesture as a primary method for interacting with the device. By combining the directional palette with a gesture-based graphical user interface, the intuitive gesture-based graphical user interface explicitly depicts program functionality. Because, however, the user may interact with the device before a directional palette is presented, the intuitive gesture-based graphical user interface promotes more efficient use of the screen by providing the user with explicit function options only if they are needed. Furthermore, by providing the same mental model to both novice and expert users, the novice can more easily graduate to expert user status.
FIG. 7 shows details of the screen shown in FIG. 2 when a user performs a manual selection of a palette button on the directional palette shown in FIG. 4 according to another preferred embodiment. Directional palettes may be "stacked" to create any number of gestures and accessible functions. A tap on the west palette button 454, which is labeled "reply," brings up a sub-palette 750. Sub-palette 750 provides only two palette buttons, a north-west palette button 751 to reply to the sender of the message and a south-west palette button 753 to select from a list of possible recipients. Sub-palettes 750 may be stacked indefinitely and contain any number of palette buttons.
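Stacked palettes can be pictured as a tree in which each button either performs a function or opens a sub-palette, so a multi-segment gesture is a path through the tree. A minimal, self-contained sketch of that idea follows; the nested-dict representation and the helper name are hypothetical.

```python
# Each entry is either a callable (a function call) or a nested dict (a sub-palette).
palette_tree = {
    "N": lambda: print("cancel"),
    "E": lambda: print("forward"),
    "S": lambda: print("delete"),
    "W": {                                        # "reply" opens sub-palette 750
        "NW": lambda: print("reply to sender"),
        "SW": lambda: print("pick recipients from a list"),
    },
    "center": lambda: print("read"),
}

def follow(tree, directions):
    """Walk a sequence of compass directions through stacked palettes."""
    node = tree
    for d in directions:
        node = node[d]
    if callable(node):
        node()  # a function call ends the walk
    return node

follow(palette_tree, ["W", "NW"])  # west, then north-west: reply to the sender
```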
FIG. 8 shows details of the screen shown in FIG. 2 when a user performs a gesture selection of a screen object according to another preferred embodiment. In a stacked palette embodiment, a gesture selection may bring a user to a sub-palette or straight to accessing a function. In this embodiment, gesture line 620 is made by a user and sub-palette 750 appears. If, however, the gesture had continued with a north-west bend, the reply palette button 454 would first appear and blink, then palette button 751 would appear and blink, and a reply screen addressed to the sender of the message, John, would replace the present screen. Thus, both a palette 450 and a sub-palette 750 may be bypassed through the use of gestures.
FIG. 9 shows a flow chart diagram of the operation of an intuitive gesture-based graphical user interface according to a preferred embodiment. After the start step 901, the electronic device displays one or more objects as shown in step 910. The device then waits for a user input as shown in step 920. When a user input is received, a portion of the screen is highlighted to provide visual feedback to the user. The machine could also provide audio feedback to the user. The device evaluates whether the received user input was a manual selection or a gesture selection of a screen object as shown in step 930. If the user input is determined to be a manual selection (e.g., a tap), shown by branch 933, the device displays a directional palette as shown in step 940. Again, the device waits for a user input and highlights user input as shown in step 950 and evaluates whether the received user input was a tap or a gesture as shown in step 955.

If the next user input is a tap, the microprocessor evaluates whether the user input was a function call as shown in step 960. If the next user input was a gesture, the selected palette button is displayed as visual feedback as shown in step 970, and then the microprocessor evaluates whether the user input was a function call as shown in step 960. Again, audio feedback may also be used to acknowledge the gesture selection.

If the user input was not a function call as evaluated in step 960, the device returns to step 940 and displays a next directional palette (i.e., sub-palette). Loop 963 allows a stacked palette effect and may be performed as many times as necessary until a user input is a function call as shown by branch 965. Once a function call is selected, the device blinks the selected palette button and the microprocessor performs the function as shown in step 980.

Returning to step 930, if the initial user input is determined to be a gesture selection, as shown in branch 935, the device displays the palette button selected by the gesture selection in step 970 to provide visual feedback to the user. Then, the device goes directly to step 960 to determine whether the gesture was a function call. Once a selected function is performed, the end step 990 occurs and the device may return to the start step 901.
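Read as pseudocode, the flow chart of FIG. 9 is a loop that keeps presenting palettes until an input resolves to a function call. The runnable Python sketch below mirrors that control flow under assumed representations (scripted input events and the nested-dict palette tree from the earlier sketch); it illustrates the loop, not the patented implementation.

```python
def run_interface(palette, inputs):
    """Sketch of FIG. 9's control flow. `palette` maps directions to either
    callables (function calls) or nested dicts (sub-palettes); `inputs` is a
    scripted sequence of ('tap', direction) or ('gesture', direction) events."""
    print("display one or more objects")             # step 910
    kind, direction = inputs.pop(0)                  # step 920: receive user input
    if kind == "tap":                                # branch 933: manual selection
        print("display directional palette")         # step 940
        kind, direction = inputs.pop(0)              # step 950: next input
    else:                                            # branch 935: gesture selection
        print("display palette button", direction)   # step 970: visual feedback
    while True:
        target = palette[direction]                  # step 960: function call?
        if callable(target):
            target()                                 # step 980: perform (blink)
            return                                   # step 990: end
        palette = target                             # loop 963: stacked sub-palette
        print("display directional palette")         # back to step 940
        kind, direction = inputs.pop(0)
        if kind == "gesture":
            print("display palette button", direction)  # step 970

# A west gesture opens the reply sub-palette; a north-west tap replies to the sender.
palette = {"W": {"NW": lambda: print("perform: reply to sender")}}
run_interface(palette, [("gesture", "W"), ("tap", "NW")])
```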
As an option, the timing of the appearance of the directional palette may vary to motivate a novice user to use gesture selections rather than simply wait for the directional palette to appear before making a second tap. For instance, when a device is first used, the directional palette appears quickly in reaction to a single tap. The quick appearance of the directional palette allows the user to learn the available functions and their corresponding gestures at the outset. As the user learns to make a second tap in certain directions to activate certain functions, the directional palette takes longer and longer to appear after the first tap has occurred. Thus, an impatient user will be inclined to make a second tap before the directional palette appears. Once the user either taps twice or draws a line before a directional palette appears, the user has employed a gesture selection but without making any concerted effort to memorize the gesture. If a user forgets a gesture, the user merely needs to wait until the directional palette appears on the screen after the first tap to relearn the gesture and related function.
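One way to realize this escalating delay is to scale the palette timeout with a count of successful gesture selections. The sketch below is a guess at a reasonable schedule; the constants and the linear ramp are assumptions, as the patent leaves the timing policy open.

```python
BASE_DELAY_MS = 200   # palette appears quickly for a brand-new user
STEP_MS = 150         # assumed per-use increment as gestures are learned
MAX_DELAY_MS = 2000   # cap so the palette still appears eventually

def palette_delay(successful_gestures):
    """Delay before the directional palette appears after a first tap.
    The more gestures the user has completed, the longer the wait,
    nudging an impatient user to gesture before the palette shows."""
    return min(BASE_DELAY_MS + STEP_MS * successful_gestures, MAX_DELAY_MS)

assert palette_delay(0) == 200     # novice: near-immediate palette
assert palette_delay(12) == 2000   # experienced user: capped long wait
```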
Thus, an intuitive gesture-based graphical user interface quickly and easily teaches users gestures for interacting with a microprocessor-controlled electronic device. While specific components and functions of the intuitive gesture-based graphical user interface are described above, fewer or additional functions could be employed by one skilled in the art within the true spirit and scope of the present invention. The invention should be limited only by the appended claims.
I claim:
1. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen comprising the steps of:
A) displaying a screen object on the screen;
B) receiving a user input corresponding to a selection of the screen object and evaluating the user input as either corresponding to a manual selection or corresponding to a gesture selection;
C) if the user input corresponds to a manual selection, performing the steps of:
1) automatically presenting on the screen a directional palette, the directional palette having at least one palette button, each at least one palette button having a unique compass direction and a unique function identifier; and
2) receiving a next user input corresponding to a selection of the at least one palette button; and
D) if the user input corresponds to a gesture selection, performing the steps of:
1) providing a user feedback acknowledging the gesture selection;
2) determining if the user input is a function call;
3) if the user input is a function call, performing a function; and
4) if the user input is not a function call, returning to the step of automatically presenting on the screen a directional palette (step C) 1)).
2. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 wherein the user feedback is a visual feedback.
3. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 2 wherein the user feedback is a temporary palette button.
4. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 wherein the user feedback is an audio feedback.
5. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 further comprising the step of:
waiting for a variable time period before the step of automatically presenting on the screen a directional palette (step C) 1)).
6. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 wherein the user input corresponding to a manual selection is a drawing of a point.
7. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 wherein the user input corresponding to a gesture selection is a drawing of a line.
8. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 further comprising the step of:
providing visual feedback after the step of receiving a user input (step B)).
9. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 further comprising the step of:
providing audio feedback after the step of receiving a user input (step B)).
10. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 1 further comprising the steps of:
C)
3) evaluating the next user input as either corresponding to a manual selection or corresponding to a gesture selection;
4) if the next user input corresponds to a manual selection, performing the steps of:
5) determining if the next user input is a function call;
6) if the next user input is a function call, performing a function; and
7) if the next user input is not a function call, returning to the step of automatically presenting on the screen a directional palette (step C) 1)).
11. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 10 further comprising the steps of:
D) if the user input corresponds to a gesture selection, performing the steps of:
1) providing a user feedback acknowledging the gesture selection;
2) determining if the user input is a function call;
3) if the user input is a function call, performing a function; and
4) if the user input is not a function call, returning to the step of automatically presenting on the screen a directional palette (step C) 1)).
12. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 11 wherein the user feedback is a visual feedback.
13. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 12 wherein the user feedback is a first temporary palette button.
14. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for an electronic device having a screen according to claim 11 wherein the user feedback is an audio feedback.
15. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for a radiotelephone having a screen comprising the steps of:
A) displaying a screen object on the screen;
B) receiving a user input corresponding to a selection of the screen object and evaluating the user input as either corresponding to a manual selection or corresponding to a gesture selection;
C) if the user input corresponds to a manual selection, performing the steps of:
1) automatically presenting on the screen a directional palette, the directional palette having at least one palette button, each at least one palette button having a unique compass direction and a unique function identifier;
2) receiving a next user input corresponding to a selection of the at least one palette button and evaluating the next user input as either corresponding to a manual selection or corresponding to a gesture selection;
3) if the next user input corresponds to a manual selection, performing the steps of:
4) determining if the next user input is a function call;
5) if the next user input is a function call, performing a function;
6) if the next user input is not a function call, returning to the step of automatically presenting on the screen a directional palette (step C) 1));
D) if the user input corresponds to a gesture selection, performing the steps of:
1) providing a user feedback acknowledging the gesture selection;
2) determining if the user input is a function call;
3) if the user input is a function call, performing a function; and
4) if the user input is not a function call, returning to the step of automatically presenting on the screen a directional palette (step C) 1)).
16. A method for providing a microprocessor-controlled intuitive gesture-based graphical user interface for a radiotelephone having a screen according to claim 15 further comprising the steps of:
C)
7) if the next user input corresponds to a gesture selection, performing the steps of:
1) providing a user feedback acknowledging the gesture selection;
2) determining if the next user input is a function call;
3) if the next user input is a function call, performing a function; and
4) if the next user input is not a function call, returning to the step of automatically presenting on the screen a directional palette (step C) 1)).

* * * * *