US008228292B1

(12) United States Patent
Ruiz et al.

(10) Patent No.: US 8,228,292 B1
(45) Date of Patent: Jul. 24, 2012
(54) FLIPPING FOR MOTION-BASED INPUT

(75) Inventors: Jaime Guillermo Ruiz, Kitchener (CA);
     Yang Li, Sunnyvale, CA (US)

(73) Assignee: Google Inc., Mountain View, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 13/249,575

(22) Filed: Sep. 30, 2011

Related U.S. Application Data

(63) Continuation of application No. 12/770,325, filed on Apr. 29, 2010.

(60) Provisional application No. 61/320,663, filed on Apr. 2, 2010.

(51) Int. Cl.
     G09G 5/00 (2006.01)
(52) U.S. Cl. ........................................ 345/156; 345/172
(58) Field of Classification Search ............... 345/157-163
     See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

     7,881,902 B1 *   2/2011  Kahn et al. .................. 702/160
  2008/0192005 A1     8/2008  Elgoyhen et al.
  2009/0262074 A1    10/2009  Nasiri et al.
  2009/0303184 A1 *  12/2009  Tao et al. ................... 345/163
OTHER PUBLICATIONS

Bartlett. "Rock 'n' Scroll Is Here to Stay." IEEE Comput. Graph. Appl., vol. 20, Issue 3, 2000.*
'Android' [online]. "Android Open Source Project," 2010, [retrieved on Feb. 9, 2012]. Retrieved from the Internet: URL:<http://source.android.com>. 2 pages.
Bartlett. "Rock 'n' Scroll Is Here to Stay." IEEE Comput. Graph. Appl., vol. 20, Issue 3, 2000, pp. 40-45.
Buxton. "Lexical and pragmatic considerations of input structures." SIGGRAPH Comput. Graph., vol. 17, Issue 1, 1983, pp. 31-37.
Gonsalves. 'Information Week' [online]. "Apple iPhone Market Share Slips in 4Q," 2010, [retrieved on Feb. 9, 2012]. Retrieved from the Internet: URL:<http://www.informationweek.com/news/telecom/business/showArticle.jhtml?articleID=222600940>. 2 pages.
'Google' [online]. "Google Maps with Street View," 2011, [retrieved on Feb. 9, 2012]. Retrieved from the Internet: URL:<http://maps.google.com/help/maps/streetview/>. 1 page.
'Google' [online]. "Nexus One," 2012, [retrieved on Feb. 10, 2012]. Retrieved from the Internet: URL:<http://www.google.com/phone/detail/nexus-one>. 2 pages.
Harrison et al. "Squeeze me, hold me, tilt me! An exploration of manipulative user interfaces." Conference paper in CHI '98: Proceedings of the SIGCHI conference on Human factors in computing. New York, ACM/Addison-Wesley, 1998, pp. 17-24. Abstract Only, [retrieved on Feb. 10, 2012]. Retrieved from the Internet: URL:<http://www.mendeley.com/research/squeeze-me-hold-me-tilt-me-an-exploration-of-manipulative-user-interfaces/#page-1>. 1 page.
Primary Examiner - Ke Xiao
(74) Attorney, Agent, or Firm - Fish & Richardson P.C.
(57) ABSTRACT

A computer-implemented method for identifying motion-based inputs to an electronic device involves determining that the electronic device has been rotated in a first direction of rotation past a first threshold orientation; determining that the electronic device has been rotated in a second direction of rotation that is substantially the opposite of the first direction of rotation, past a second threshold; and analyzing motion of the device to identify motion-based inputs to the device other than the rotation of the device in the first and second directions, based on the two determinations.

21 Claims, 7 Drawing Sheets
Hinckley et al. "Design and analysis of delimiters for selection-action pen gesture phrases in scriboli." Conference paper in CHI '05: Proceedings of the SIGCHI conference on human factors in computing systems. New York, ACM, 2005, pp. 451-460.
Hinckley et al. "Sensing techniques for mobile interaction." Conference paper in UIST '00: Proceedings of the 13th annual ACM symposium on User interface software and technology. New York, ACM, 2000, pp. 91-100.
Jones et al. "GesText: Accelerometer-based Gestural Text-Entry Systems." Conference paper in CHI '10: Proceedings of the SIGCHI conference on Human factors in computing systems. 2010, 10 pages.
Li et al. "Experimental analysis of mode switching techniques in pen-based user interfaces." Conference paper in CHI '05: Proceedings of the SIGCHI conference on Human factors in computing systems. New York, ACM, 2005, pp. 461-470.
Liu et al. "User evaluation of lightweight user authentication with a single tri-axis accelerometer." Conference paper in MobileHCI '09: Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services. New York, ACM, 2009, pp. 1-10.
Myers and Rabiner. "A Comparative Study of Several Dynamic Time-warping Algorithms for Connected Word Recognition." The Bell System Technical Journal, vol. 60, Issue 7, 1981, pp. 1389-1409.
'Nintendo' [online]. "Wii at Nintendo," 2012, [retrieved on Feb. 9, 2012]. Retrieved from the Internet: URL:<http://www.nintendo.com/wii>. 3 pages.
Partridge et al. "TiltType: accelerometer-supported text entry for very small devices." Conference paper in UIST '02: Proceedings of the 15th annual ACM symposium on User interface software and technology. New York, ACM, 2002, pp. 201-204.
Rekimoto. "Tilting operations for small screen interfaces." Conference paper in UIST '96: Proceedings of the 9th annual ACM symposium on User interface software and technology. New York, ACM, 1996, pp. 167-168.
Ruiz et al. "A model of non-preferred hand mode switching." Conference paper in GI '08: Proceedings of graphics interface 2008. Toronto, Canadian Information Processing Society, 2008, pp. 49-56.
Ruiz and Lank. "A study on the scalability of non-preferred hand mode manipulation." Conference paper in ICMI '07: Proceedings of the 9th international conference on multimodal interfaces. New York, ACM, 2007, pp. 170-177.
Small and Ishii. "Design of spatially aware graspable displays." Conference paper in CHI '97: CHI '97 extended abstracts on human factors in computing systems. New York, ACM, 1997, pp. 367-368.
Weberg et al. "A piece of butter on the PDA display." Conference paper in CHI '01: CHI '01 extended abstracts on human factors in computing systems. New York, ACM, 2001, pp. 435-436.
Wigdor and Balakrishnan. "TiltText: using tilt for text input to mobile phones." Conference paper in UIST '03: Proceedings of the 16th annual ACM symposium on User interface software and technology. New York, ACM, 2003, pp. 81-90.
'Wikipedia' [online]. "Dynamic time warping-Wikipedia, the free encyclopedia," 2011, [retrieved on Feb. 9, 2012]. Retrieved from the Internet: URL:<http://en.wikipedia.org/wiki/Dynamic_time_warping>. 3 pages.
'Wikipedia' [online]. "iPhone-Wikipedia, the free encyclopedia," 2012, [retrieved on Feb. 9, 2012]. Retrieved from the Internet: <http://en.wikipedia.org/wiki/IPhone>. 16 pages.
Wobbrock et al. "Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes." Conference paper in UIST '07: Proceedings of the 20th annual ACM symposium on User interface software and technology. New York, ACM, 2007, pp. 159-168.

* cited by examiner
[Drawing Sheet 1 of 7: FIG. 1A]
[Drawing Sheet 2 of 7: FIG. 1B]
[Drawing Sheet 3 of 7: FIG. 2 - schematic of smart phone system 200, showing device 202 with display 208, input controller, motion sensor(s), input method editor, application, user data, and motion models]
[Drawing Sheet 4 of 7: FIG. 3 - flow chart: Enroll Study Group; Load Tracking Software on User Devices; Capture Data for User Motion on Defined Gestures; Build Model of Motion Data for Gestures; Provide Model Data to Other User Devices; Execute Motion-Based Gesture Control on Other User Devices]
[Drawing Sheet 5 of 7: FIG. 4A - flow chart: Receive Azimuth, Pitch, and Roll Data from Orientation Sensor(s); Convert Data to Radians; Calculate Change in Each Reading Over Prior Reading(s); Round Floating Points to Integers; Compress Input Using Averaging Window]
[Drawing Sheet 6 of 7: FIG. 4B - flow chart: Quantize Motion-Based Inputs; Compare to Model Data Using Dynamic Time Warping (DTW); Switch Device Input Mode to Accept Motion-Based Gestures; Analyze Motion-Based Input Data to Identify Match to Gesture Library Until Mode Change Input Is Received]
[Drawing Sheet 7 of 7: FIG. 5 - example computer device and mobile computer device]
FLIPPING FOR MOTION-BASED INPUT

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. application Ser. No. 12/770,325, filed on Apr. 29, 2010, which claims priority to U.S. Provisional Application Ser. No. 61/320,663, entitled "Flipping for Motion-Based Input," filed on Apr. 2, 2010, the entire disclosure of which is incorporated herein by reference.
TECHNICAL FIELD

This document relates to systems and techniques for interpreting motion-based user inputs to a mobile computing device.
BACKGROUND

The use of motion-based inputs in computing devices has become more and more pervasive over the last several years. Motion-based inputs are inputs that involve movement of an entire device housing, such as by a user's hands, as distinguished from typed, spoken, or touch screen inputs. Primary interaction with the NINTENDO WII gaming console is motion-based (moving the entire WIIMOTE device), and many smartphones have accelerometers and other motion-sensing components such as magnetometers. Generally, sensor-based interactions in mobile phones have occurred with respect to specific application contexts, such as tilt detection in games and determining screen orientation.

Some mobile devices depend on users to press a button and/or hold down a button while making a motion input, in order to indicate an intent that their movement of a device be interpreted as a purposeful input rather than as meaningless general motion of the device. Such a mechanism gives the user control over how the device interprets the user's inputs, but it may be difficult for a user to maintain force on a button while performing various motion-based input actions.
SUMMARY

This document describes systems and techniques that may be used to permit interpretation and analysis of motion-based inputs to a computing device. For example, a motion of turning a device over and then turning it back to approximately its original orientation can be identified as a user input to switch into a motion-based input mode on the device. As such, after a flip/un-flip (or double flip) input motion has been provided to a device, subsequent motion of the device may be sensed and interpreted according to a set of predetermined motion-based inputs. For example, while the mode is changed, tilting of the device may be interpreted to move a cursor on a display of the device in the direction of the tilting.
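Reduced to its simplest form, the double-flip recognition just described is a pair of orientation-threshold tests. The following is a minimal Python sketch of that idea, not the patented implementation; the 160-degree flip threshold, the 15-degree return tolerance, and the single roll-angle input are assumptions chosen only for illustration.

# Minimal sketch of double-flip (flip/un-flip) detection from a stream of
# roll angles in degrees. All threshold values are illustrative assumptions.

FLIP_THRESHOLD = 160.0    # degrees of rotation in the first direction
RETURN_TOLERANCE = 15.0   # degrees from the starting orientation on the return

def detect_double_flip(roll_samples):
    """Return True if the samples show a rotation past FLIP_THRESHOLD
    followed by a rotation back to near the starting orientation."""
    if not roll_samples:
        return False
    start = roll_samples[0]
    flipped = False
    for roll in roll_samples:
        delta = abs(roll - start)
        if not flipped and delta >= FLIP_THRESHOLD:
            flipped = True          # first determination: rotated past threshold
        elif flipped and delta <= RETURN_TOLERANCE:
            return True             # second determination: rotated back
    return False

# Example: rotating from 0 to 170 degrees and back counts as a double flip.
print(detect_double_flip([0, 45, 90, 135, 170, 120, 60, 10]))  # True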
To change the device back to its original mode in which motion-based input is not active, a user may repeat the double flip or perform another action (including by inaction, such as by not moving the device for a predetermined time period). Also, rather than preceding one or more motion-based commands with a double flip, a motion-based command may be provided by a user, and may be followed by a delimiter input such as a double-flip, where the command is recognized and executed only after the delimiter input is received.
The mode switching and motion-based inputs may be handled by a universal component, such as a component of an operating system on a device, to which various applications on the device may subscribe and may then receive information. For example, an input method editor (IME) on a device may provide functionality for intercepting all inputs to a device (e.g., via physical keyboard, virtual keyboard, soft-keys, trackballs, device motion, and voice) and for converting those inputs into a common form that can be passed to applications that are running on the device. As one example, an application may be open to receiving directional inputs from a user of the device on which the application is executing. The application may be unconcerned with whether the user enters the information via swiping on a touch screen, rolling a trackball, pressing on a D pad, or using motion-based input. Rather, an IME may receive the inputs, may manage whether the device is in a motion-based input mode, and may supply the application with information regarding the user's directional input, where the supplied information is essentially independent of the mechanism by which the user provided the input.
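As a rough sketch of that IME role (assuming a hypothetical API; this document does not specify one), the following Python class normalizes key, trackball-style, and motion inputs into one directional event and forwards it to the active application. All class and method names here are invented for the example.

# Hedged sketch: an IME-like layer that converts several input mechanisms
# into one common directional event. All names are hypothetical.

class DirectionalIME:
    def __init__(self, active_app):
        self.active_app = active_app   # application currently receiving input
        self.motion_mode = False       # toggled by the double-flip delimiter

    def on_double_flip(self):
        self.motion_mode = not self.motion_mode

    def on_key(self, key):
        mapping = {"UP": (0, 1), "DOWN": (0, -1), "LEFT": (-1, 0), "RIGHT": (1, 0)}
        if key in mapping:
            self.active_app.handle_direction(*mapping[key])

    def on_tilt(self, dx, dy):
        # Motion input is forwarded only while the motion mode is active.
        if self.motion_mode:
            self.active_app.handle_direction(dx, dy)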
In certain implementations, such systems and techniques may provide one or more advantages. For example, motion-based input may be more intuitive than is keyboard or touch screen input in many instances. However, a user may not want to have their device be open to receiving motion-based input at all times (lest they accidentally provide input that they did not intend). Therefore, the techniques here can provide an input that is natural, but is not likely to occur by accident (i.e., by mere jostling of the device), so that a user can easily switch to an input mechanism that may be more convenient for them at the moment. In addition, in certain implementations, the gesture to turn on motion-based input gives users the control to activate motion gestures without any hardware modifications to existing devices. The gesture can be quick to perform and can be performed in a limited amount of physical space. In addition, a motion to activate a motion-based input mode, and subsequent motions to provide such additional input, may be made without a user having to pause at all.

In one implementation, a computer-implemented method for identifying motion-based inputs to an electronic device is disclosed. The method comprises determining that the electronic device has been rotated in a first direction of rotation past a first threshold orientation; determining that the electronic device has been rotated in a second direction of rotation that is substantially the opposite of the first direction of rotation, past a second threshold; and analyzing motion of the device to identify motion-based inputs to the device other than the rotation of the device in the first and second directions, based on the two determinations. Determining that the electronic device has been rotated past the first and second thresholds can comprise comparing motion data for the device to an electronic model that represents motion by one or more other users performing a mode-switching motion. Also, comparing the motion data for the device to the electronic model can comprise performing a dynamic time warping comparison. In addition, analyzing motion of the device based on the two determinations can be triggered by determining that a distance between motion data for the electronic device and motion data in the model falls below a threshold distance. The method can also comprise receiving motion data from electronic motion sensors in the electronic device and quantizing the received data before comparing the received data to the model. Moreover, the method can further comprise compressing the received motion data using a moving averaging window on the data over time.

In some aspects, the electronic model is formed by wirelessly gathering data from users of mobile computing devices during normal operation of the mobile computing devices. In addition, analyzing motion of the device based on the
two determinations can comprise changing from a mode in which particular motion-based inputs are not recognized by the device to a mode in which the particular motion-based inputs are recognized by the device. The method can further comprise receiving motion-based gestures from a user of the electronic device and converting the received gestures to commands for one or more applications on the electronic device. An input method editor (IME) on the electronic device can also convert the received gestures to commands, select an active application on the device from among multiple applications, and provide data for the commands to the selected active application.

In other aspects, the electronic device is in a position so that a graphical display of the electronic device faces a user of the electronic device both before the device is rotated in the first direction and after the device is rotated in the second direction. In addition, analyzing motion of the device based on the two determinations can comprise analyzing data for motion-based input received in a window of time that precedes the rotations in the first direction and the second direction. Also, analyzing motion of the device based on the two determinations can comprise interpreting received motion data without switching to a motion-based input mode.
In another implementation, a computer-implemented method for identifying motion-based inputs to an electronic device is disclosed. The method comprises obtaining data that characterizes a user's motion of the electronic device in free space; determining whether the obtained data matches a model for a particular gesture that corresponds to mode switching; and switching the electronic device, based on the determination, into a mode in which motion of the device in free space is interpreted as intentional user motion-based input and is used to control one or more applications executing on the electronic device. Determining whether the obtained data matches the model for the particular gesture can comprise performing a dynamic time warping comparison between the model and the obtained data. Switching the electronic device can be triggered by determining that a distance between motion data for the electronic device and motion data in the model falls below a threshold distance. In addition, obtaining data that characterizes the user's motion of the electronic device in free space can comprise receiving motion data from electronic motion sensors in the electronic device and quantizing the received data before determining whether the obtained data matches the model.
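Dynamic time warping aligns two sequences that may vary in speed and reports the cost of the best alignment. The Python sketch below is a standard textbook DTW distance over one-dimensional motion samples, with an assumed match threshold standing in for the claimed threshold distance; the actual values and feature representation are not specified in this document.

# Textbook dynamic time warping (DTW) distance between two 1-D sequences,
# used here, illustratively, to compare observed motion to a gesture model.

def dtw_distance(observed, model):
    n, m = len(observed), len(model)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(observed[i - 1] - model[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

MATCH_THRESHOLD = 5.0  # illustrative assumption, not a value from this document

def matches_mode_switch(observed, model):
    return dtw_distance(observed, model) < MATCH_THRESHOLD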
In certain aspects, the method further comprises compressing the received motion data using a moving averaging window on the data over time. The electronic model can also be formed by wirelessly gathering data from users of mobile computing devices during normal operation of the mobile computing devices. In addition, switching the electronic device can comprise changing from a mode in which particular motion-based inputs are not recognized by the electronic device to a mode in which the particular motion-based inputs are recognized by the electronic device.
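A minimal sketch of that pre-processing chain, following the steps outlined in FIG. 4A (receive orientation readings, convert to radians, difference against the prior reading, round to integers, compress with an averaging window), is given below; the window size and the scale factor applied before rounding are assumptions made for the example.

import math

def preprocess(readings_degrees, window=3):
    # Convert azimuth/pitch/roll readings from degrees to radians.
    radians = [math.radians(r) for r in readings_degrees]
    # Calculate the change in each reading over the prior reading.
    deltas = [b - a for a, b in zip(radians, radians[1:])]
    # Quantize: scale, then round floating points to integers (the x100
    # scale factor is an assumption for illustration).
    quantized = [round(d * 100) for d in deltas]
    # Compress with a non-overlapping averaging window.
    return [sum(quantized[i:i + window]) / window
            for i in range(0, len(quantized) - window + 1, window)]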
In yet another implementation, a computer-implemented system for identifying motion-based input to an electronic device comprises one or more motion sensors mounted inside a housing of the electronic device, and a mode switching module. The module is programmed to use information from the motion sensors to determine whether the electronic device has been: (a) rotated in a first direction of rotation past a first threshold orientation, and (b) rotated in a second direction of rotation that is substantially the opposite of the first direction of rotation, past a second threshold. The module is also programmed to cause an operational mode of the electronic device to be switched based on the two determinations.

The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.

DESCRIPTION OF DRAWINGS

FIGS. 1A and 1B show motion-based inputs to a wireless computing device.
FIG. 2 is a schematic diagram of a smart phone system that supports motion-based inputs.
FIG. 3 is a flow chart of a process for training a motion-based input model employing a group of mobile device users.
FIGS. 4A and 4B are flow charts of methods for processing motion-based input to a mobile electronic device.
FIG. 5 shows an example of a computer device and a mobile computer device that can be used to implement the techniques described here.
Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION

This document describes systems and techniques for handling motion-based input to a computing device such as a smart phone or app phone. As described here, a particular motion-based input may be assigned as a delimiter to indicate that a user would like to "open" the device to additional motion-based inputs, i.e., to switch input modes on the device so that the other motion-based inputs, which were previously disabled, will be enabled. The delimiting input discussed here is a double flip, whereby a user makes a first motion with the device, and then backtracks through the motion, where the particular motion here involves rotating the device approximately 180 degrees about an axis of the device and then rotating it back substantially to its original orientation. The delimiting motion could be other motions, though will generally be selected from motions that are not commonly made accidentally by the user of a mobile device, so that the user's device will change modes only when the user intends that it changes modes.
FIGS. 1A and 1B show motion-based inputs to a wireless computing device. Referring to FIG. 1A, there are shown three chronological instances of a user's interaction with a touch screen smart phone that the user is holding in her right hand. In each instance, the user is holding the device horizontally above a horizontal surface like the top of a table. In instance (a), the user is cradling the device in her palm, with a touch screen of the device facing upward. The user may have just picked the device up off the table, or may have just finished interacting with the device, such as by contacting a surface of a touch screen on the device with her left forefinger.

In instance (b), the user has rotated her hand while gripping the device in a normal manner, approximately 180 degrees counterclockwise (as viewed from the user's arm). As a result, the face of the smart phone is now pointing substantially downward toward the table or floor. The rotation in this example has been substantially about a longitudinal axis that passes from the top of the smart phone to the bottom (though the axis may translate some during the rotating motion). In short, the user has flipped the device over.

The user may hold the device in this orientation for a time period, which may be as little as a split instant. Then the user may turn their wrist back clockwise, placing the device in the orientation shown in instance (c), where the touch screen on the device is again facing upward, presumably so that the user can easily view the screen. The device may have been
returned substantially to its original orientation, though it may have translated some during the flip and unflip motion (e.g., if the user is walking or driving while performing the motions).
The effect of this motion-based input may be to change an operating mode of the device. For example, the change may involve changing from one operating application on the device to a second operating application (where the device supports multi-tasking). The direction that the user tilted the device may affect the direction that the application on the display is shifted. For example, each application may be represented by a card on a graphical user interface, and the various cards may be displayed in a horizontal row (e.g., with inactive applications on the left and right edge of a graphical display). Flipping to the left and back may cause the device to push the row of applications to the left in response to such a motion, and thus to make the application that was previously to the right of the prior active application, the new active application.
The motion-based input may also be used to affect the display of a particular application on the device. For example, the input may be used to change pages in a document being displayed on the device. For example, flipping a device over to the left and back may turn a page forward in a book or word-processing document that is being displayed on the device. Such motion-based inputs may also be used to shift through photographs, record albums, songs, or other items on the device. Again, the direction (right or left, or up or down) that the user tilts the device may affect the direction that such a shift is made.
The threshold level of motion that is required in order to effect a change in the operation of the device like those discussed here may be determined in a number of manners. For example, a certain level of inclination of the device against a base plane (which may be a horizontal plane or a slanted plane) may trigger the change. Alternatively, a certain amount of change in inclination or certain degrees or radians of rotation may be the trigger, even if a certain level of inclination is not reached. Also, a combination of absolute change and relative change may be required for a trigger to occur.
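As a concrete illustration of those alternatives, a trigger test might combine an absolute-inclination condition with a relative-change condition, as in the sketch below; the threshold values are assumptions, not values taken from this document.

# Illustrative trigger check combining the alternatives discussed above:
# absolute inclination past a base plane, relative change in inclination,
# or both at once. All threshold values are assumptions for the example.

ABSOLUTE_LIMIT_DEG = 75.0   # inclination against the base plane
RELATIVE_LIMIT_DEG = 150.0  # change in inclination since the gesture began

def trigger_fired(current_inclination, start_inclination, require_both=False):
    absolute_hit = abs(current_inclination) >= ABSOLUTE_LIMIT_DEG
    relative_hit = abs(current_inclination - start_inclination) >= RELATIVE_LIMIT_DEG
    if require_both:
        return absolute_hit and relative_hit
    return absolute_hit or relative_hit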
In addition, the degree of motion to trigger a mode change or similar input may be a function of motion-based inputs measured from a plurality of mobile device users. For example, as explained below, certain users may be asked to perform various motion-based gestures, the orientation or other motion-related information from their actions may be recorded, and a model may be constructed of their motion. That model may then be loaded on a later user's device and matched to motions by that later user in order to determine whether the later user intended a particular motion. Thus, if the model indicates that most users flip their devices only 150 degrees when trying to change modes, then a subsequent mode change for another user may be triggered based on a 150 degree rotation of her device.
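One plausible way to derive such a trigger from recorded training motions, sketched here as an assumption rather than as the method this document prescribes, is to set the threshold at a low percentile of the rotation extents the training users actually produced:

# Hypothetical sketch: derive a rotation threshold from training users'
# recorded flip extents, so the trigger tracks what people actually do.

def rotation_threshold(training_extents_deg, percentile=0.10):
    """Pick a threshold that roughly 90% of training flips would exceed."""
    ordered = sorted(training_extents_deg)
    index = int(percentile * (len(ordered) - 1))
    return ordered[index]

# If most trainers rotated about 150-170 degrees, the threshold lands near
# the low end of that range, as in the 150-degree example above.
print(rotation_threshold([150, 155, 160, 165, 170, 172, 168, 158]))  # 150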
Such training can be used for multiple motions, including for a delimiter like that just discussed, and subsequent motion inputs that may be received after the delimiter causes the device to be changed to a motion-based input mode. For example, users whose inputs are used as training data may be asked to execute a number of motions, and the motion-based data measured from their devices (e.g., from accelerometers in the devices) may be correlated with the relevant actions that they were asked to perform. Such actions may include shaking of a device, tapping on a device, jerking a device in a particular manner (e.g., as if the user is setting the hook in a fish, when the device represents the user's fishing reel and pole), waggling the device loosely, and the like.
In addition to a threshold for the level of the initial motion (e.g., the flip) needed to trigger recognition of a user input by a device, a system may include a threshold for a level of backtracking motion (e.g., the unflip) before a delimiter will be triggered. For example, a threshold number of degrees of rotation in the backtracking direction may be used (either absolute degrees or degrees relative to the original flipping motion, such as +/-15 degrees from the initial rotation motion), or a certain orientation relative to the starting orientation may be used as a trigger threshold. And again, comparison to models of training user behavior (and subsequently gathered motion/orientation data) may be made such as in the manners discussed in detail below.
Referring now to FIG. 1B, a similar chronological sequence is shown, though here in five separate images. In this example, there is no background surface (e.g., no table under the user's hand), so that no absolute frame of reference for the motion is shown here. That is because the comparisons described here may be made if the longitudinal axis of the device is horizontal, vertical, or at an angle in between horizontal and vertical. Also, although there are references here to axes, rotations about the axes, and rotating a device 180 degrees, it should be understood that substantial compliance is acceptable, within an understanding of acceptable repeatability by a user of a device so that a proper level of false positives or false negatives is maintained. In particular, absolute values may be assigned as predetermined values to make a device determine when an intended delimiter-creating input is provided (e.g., flipping a device more than 160 degrees total or within 15 degrees from flat horizontal), or acceptable exceptions from model data may be used (e.g., so that 95% of motion inputs that a training group intended to be inputs are considered to be inputs in the model).
In the figure, the user is first holding their device (e.g., a Nexus One telephone) in a right hand so that its screen is facing them. They then start rotating their hand counterclockwise (as viewed down the user's wrist toward her hand) until the telephone has been rotated through an angle of about 170 degrees and then rotated back to substantially its original orientation. Note that, with the input completed, the screen is again facing the user, so that the user is in a good position to continue using the phone in a manner that is particularly useful.
FIG. 2 is a schematic diagram of a smart phone system 200 that supports motion-based inputs. In general, the system is represented by a mobile device 202, such as a smart phone, that has a touch screen display 208. In addition, the device 202 may have alternative input mechanisms, such as a trackball 210, D pad, and other selectable buttons. A number of components within the device 202 may provide for interaction by the device 202 in various manners, including by motion-based inputs (i.e., where the device itself is moved to provide inputs). Only certain example components are shown here, for purposes of clarity. In this example, the display 208 is showing a first-person shooting spaceship game, where, when the device is in a motion-based input mode, the user can steer their spaceship by rotating the device right or left (to bank their view to the right or left, respectively), and tilting it forward and backward in a familiar manner.

The device 202 may communicate via a wireless interface 216, through a network 206 such as the internet and/or a cellular network, with servers 204. For example, the device 202 may carry telephone calls through a telephone network or through a data network using VOIP technologies in familiar manners. Also
