(12) Patent Application Publication (Li)
(10) Pub. No.: US 2015/0123949 A1
(43) Pub. Date: May 7, 2015

`(54) GESTURE DETECTION
`(71) Applicant: AT&T Intellectual Property I, L.P.,
`Atlanta, GA (US)
`
`(72) Inventor: Kevin Li, New York, NY (US)
`(73) Assignee: AT&T Intellectual Property I, L.P.,
`Atlanta, GA (US)
`
`(21) Appl. No.: 14/070,493
`
(22) Filed: Nov. 2, 2013
`
Publication Classification

(51) Int. Cl.
     G06F 3/043 (2006.01)
(52) U.S. Cl.
     CPC G06F 3/043 (2013.01)
(57) ABSTRACT

A supplemental surface area allows gesture recognition on outer surfaces of mobile devices. Inputs may be made without visual observance of display devices. Gesture control on outer surfaces permits socially acceptable, inconspicuous interactions without overt manipulation.
`
`
`
`
`
[Drawing sheets 1-22: the figures are images in the original publication; only their recoverable text labels are listed below.]

FIG. 1 (Sheet 1)
FIG. 2 (Sheet 2)
FIG. 3 (Sheet 3): Gesture Detector; Output Signal
FIG. 4 (Sheet 4)
FIG. 5 (Sheet 5)
FIG. 6 (Sheet 6)
FIG. 7 (Sheet 7)
FIG. 8 (Sheet 8)
FIG. 9 (Sheet 9)
FIG. 10 (Sheet 10)
FIG. 11 (Sheet 11)
FIG. 12 (Sheet 12): Learning Mode 100; Supplemental Gesture Surface; "Touch to Begin Gesture" 102; "Touch When Complete" 104
FIG. 13 (Sheet 13): Gesture
FIG. 14 (Sheet 14): Learning Mode 100; Menu of Commands (Call Mom, Flashlight, Appointment, Call you later); Database of Gestures; Output Signal
FIG. 15 (Sheet 15)
FIG. 16 (Sheet 16)
FIGS. 17-18 (Sheets 17-18)
FIG. 19 (Sheet 19): Gestures
FIGS. 20A and 20B (Sheet 20)
FIG. 21 (Sheet 21): Gesture Algorithm
FIG. 22 (Sheet 22)
`
`
`GESTURE DETECTION
`
`COPYRIGHT NOTIFICATION
0001 A portion of the disclosure of this patent document and its attachments contain material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyrights whatsoever.
`
`BACKGROUND
`0002 Touch sensors are common in electronic displays.
`Many mobile smartphones and tablet computers, for
`example, have a touch screen for making inputs. A user's
`finger touches a display, and a touch sensor detects the input.
`
`BRIEF DESCRIPTION OF THE SEVERAL
`VIEWS OF THE DRAWINGS
0003 The features, aspects, and advantages of the exemplary embodiments are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:
0004 FIGS. 1 and 2 are simplified schematics illustrating an environment in which exemplary embodiments may be implemented;
0005 FIG. 3 is a more detailed block diagram illustrating the operating environment, according to exemplary embodiments;
0006 FIGS. 4-5 are schematics illustrating a gesture detector, according to exemplary embodiments;
0007 FIGS. 6-7 are circuit schematics illustrating a piezoelectric transducer, according to exemplary embodiments;
0008 FIGS. 8-11 are more schematics illustrating the gesture detector, according to exemplary embodiments;
0009 FIGS. 12-14 are schematics illustrating a learning mode of operation, according to exemplary embodiments;
0010 FIG. 15 is an exploded component view of an electronic device, according to exemplary embodiments;
0011 FIG. 16 is a schematic illustrating contactless, three-dimensional gestures, according to exemplary embodiments;
0012 FIGS. 17-19 are schematics illustrating output sampling, according to exemplary embodiments;
0013 FIGS. 20A and 20B are schematics illustrating a protective case, according to exemplary embodiments; and
0014 FIGS. 21-22 are schematics illustrating other operating environments for additional aspects of the exemplary embodiments.
`
`DETAILED DESCRIPTION
0015 The exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings. The exemplary embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the exemplary embodiments to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).
0016 Thus, for example, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating the exemplary embodiments. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named manufacturer.
0017 As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms "includes," "comprises," "including," and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Furthermore, "connected" or "coupled" as used herein may include wirelessly connected or coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
`0018. It will also be understood that, although the terms
`first, second, etc. may be used herein to describe various
`elements, these elements should not be limited by these
`terms. These terms are only used to distinguish one element
`from another. For example, a first device could be termed a
`second device, and, similarly, a second device could be
`termed a first device without departing from the teachings of
`the disclosure.
0019 FIGS. 1 and 2 are simplified schematics illustrating an environment in which exemplary embodiments may be implemented. FIGS. 1 and 2 illustrate an electronic device 20 that accepts touches, swipes, and other physical gestures as inputs. The electronic device 20, for simplicity, is illustrated as a mobile smartphone 22, but the electronic device 20 may be any processor-controlled device (as later paragraphs will explain). Regardless, FIG. 1 illustrates a front side 24 of the electronic device 20, with a body 26 housing the components within the electronic device 20. A display device 28, for example, displays icons, messages, and other content to a user of the electronic device 20. The display device 28 interfaces with a processor 30. The processor 30 executes instructions that are stored in a memory 32. The electronic device 20 may also include a touch sensor 34. The touch sensor 34 is conventionally installed on or above a front face of the display device 28. The touch sensor 34 detects the user's physical inputs above the display device 28. The display device 28 generates visual output in response to instructions from the processor 30, and the touch sensor 34 generates an output in response to the user's physical inputs, as is known.
0020 FIG. 2 illustrates a backside 40 of the electronic device 20. Here the body 26 includes a gesture detector 42. The gesture detector 42 detects physical gestures that are made on an outer surface 44 of the body 26. The user may
make gestures on the outer surface 44 of the body 26, and the processor 30 interprets those gestures to control the electronic device 20. The user's fingers, for example, may contact the body 26 and make a swiping motion on the outer surface 44. The processor 30 interprets the swiping motion to execute some command, such as transitioning to a different display screen, answering a call, capturing a photo, or any other action. The user may also tap the outer surface 44 of the body 26 to select icons, web pages, or other options displayed on the display device (illustrated as reference numeral 28 in FIG. 1). Indeed, the user may associate any gesture to any action, as later paragraphs will explain.
0021 Exemplary embodiments thus greatly increase the input area. Conventional electronic devices limit gesture detection to the display device 28 (i.e., the touch sensor 34 above the display device 28, as FIG. 1 illustrated). Exemplary embodiments, instead, recognize inputs over any portion of the body 26. The user's fingers may draw shapes across the body 26 of the electronic device 20, and those shapes may be recognized and executed. Exemplary embodiments thus permit inputs without having to visually observe the display device 28. The user may make gesture inputs without observing the display device 28 and, indeed, without holding the electronic device 20 in the hand. For example, when the smartphone 22 is carried in a pocket, the user may still make gesture inputs, without removing the smartphone 22. The gesture detector 42 recognizes simple taps and swipes, more complex geometric shapes, and even alphanumeric characters. Because the electronic device 20 need not be held, exemplary embodiments permit socially acceptable interactions, without overtly holding and manipulating the display device 28. Exemplary embodiments thus permit inconspicuous interaction in a variety of environments, using the entire body 26 as an input surface.
0022 FIG. 3 is a more detailed block diagram illustrating the operating environment, according to exemplary embodiments. FIG. 3 illustrates the electronic device 20, the processor 30, and the memory 32. The processor 30 may be a microprocessor (μP), application-specific integrated circuit (ASIC), or other component that executes a gesture algorithm 50 stored in the memory 32. The gesture algorithm 50 includes instructions, code, and/or programs that cause the processor 30 to interpret any gesture input sensed by the gesture detector 42. When the user draws and/or taps a gesture on the outer surface of the body (illustrated, respectively, as reference numerals 44 and 26 in FIGS. 1-2), the gesture detector 42 generates an output signal 52. The processor 30 receives the output signal 52 and queries a database 54 of gestures. FIG. 3 illustrates the database 54 of gestures as a table 56 that is locally stored in the memory 32 of the electronic device 20. The database 54 of gestures, however, may be remotely stored and queried from any location in a communications network. Regardless, the database 54 of gestures maps, associates, or relates different output signals 52 to their corresponding commands 58. The processor 30 compares the output signal 52 to the entries stored in the database 54 of gestures. Should a match be found, the processor 30 retrieves the corresponding command 58. The processor 30 then executes the command 58 in response to the output signal 52, which is generated by the gesture detector 42 in response to the user's gesture input.
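As a concrete illustration of this lookup flow, the following minimal Python sketch models the database 54 of gestures as a table keyed by output-signal signatures. Every name and value in it (GestureDatabase, the sample signature, the command strings) is a hypothetical stand-in for illustration, not the patent's actual implementation.

```python
# Minimal sketch of the database-of-gestures lookup that FIG. 3 describes.
# Every name and value here is a hypothetical stand-in.

class GestureDatabase:
    """Models table 56: maps output-signal signatures 52 to commands 58."""

    def __init__(self):
        self.entries = {}

    def add(self, signature, command):
        # Store the signature as an immutable tuple so it can be a dict key.
        self.entries[tuple(signature)] = command

    def match(self, signature):
        # Exact lookup for simplicity; paragraph 0039 matches sampled
        # data points rather than a continuous signal.
        return self.entries.get(tuple(signature))


def handle_output_signal(db, signature, commands):
    """Query the database 54 and execute the corresponding command 58."""
    command = db.match(signature)
    if command is not None:
        commands[command]()


# Hypothetical usage: a stored swipe signature triggers answering a call.
db = GestureDatabase()
db.add([0.1, 0.4, 0.9, 0.4, 0.1], "answer_call")
commands = {"answer_call": lambda: print("answering call")}
handle_output_signal(db, [0.1, 0.4, 0.9, 0.4, 0.1], commands)
```
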
0023 FIG. 4 is another schematic illustrating the gesture detector 42, according to exemplary embodiments. While the gesture detector 42 may be any device, the gesture detector 42 is preferably a piezoelectric transducer 70. The gesture detector 42 may thus utilize the piezoelectric effect to respond to vibration 72 sensed in, on, or around the body 26. As the user draws and/or taps the gesture 74 on the outer surface 44 of the body 26, vibration waves travel through or along the outer surface 44 of the body 26. The piezoelectric transducer 70 senses the vibration 72. The piezoelectric effect causes the piezoelectric transducer 70 to generate the output signal (illustrated as reference numeral 52 in FIG. 3) in response to the vibration 72. Exemplary embodiments then execute the corresponding command (illustrated as reference numeral 58 in FIG. 3), as earlier paragraphs explained.
0024 The gesture detector 42 may even respond to sound waves. As the gesture detector 42 may utilize the piezoelectric effect, the gesture detector 42 may sense the vibration 72 due to both mechanical waves and acoustic waves. As those of ordinary skill in the art understand, the vibration 72 may be generated by sound waves propagating along the body 26 and/or incident on the piezoelectric transducer 70. Sound waves may thus also excite the piezoelectric transducer 70. So, whether the user taps, draws, or even speaks, the gesture detector 42 may respond by generating the output signal 52. Indeed, the piezoelectric transducer 70 may respond to the vibration 72 caused by the user's physical and audible inputs. The gesture detector 42 may thus generate the output signal 52 in response to any mechanical and/or acoustic wave.
0025 FIG. 5 is another schematic illustrating the gesture detector 42, according to exemplary embodiments. Here the gesture detector 42 may respond to electrical charges 80 on or in the body 26 of the electronic device 20. As the user draws the gesture 74 on the surface 44 of the body 26, electrical charges 80 may build on or within the body 26. FIG. 5 grossly enlarges the electrical charges 80 for clarity of illustration. Regardless, the electrical charges 80 may cause an electric field 82, which may also excite the piezoelectric transducer 70. So, the gesture detector 42 may also generate the output signal (illustrated as reference numeral 52 in FIG. 3) in response to the electric field 82. The gesture detector 42 may thus also respond to the electric charges 80 induced on the body 26.
0026 FIGS. 6-7 are modeling circuit schematics illustrating the piezoelectric transducer 70, according to exemplary embodiments. Because the gesture detector 42 may utilize the piezoelectric effect, the gesture detector 42 may sense mechanical waves, acoustic waves, and the electrical charge (illustrated as reference numeral 80 in FIG. 5). The piezoelectric transducer 70 responds by generating the output signal 52. The output signal 52 may be voltage or charge, depending on the construction of the piezoelectric transducer 70. FIG. 6, for example, is a circuit schematic illustrating the piezoelectric transducer 70 modeled as a charge source with a shunt capacitor and resistor. FIG. 7 illustrates the piezoelectric transducer 70 modeled as a voltage source with a series capacitor and resistor. The output voltage may vary from microvolts to hundreds of volts, so some signal conditioning (e.g., analog-to-digital conversion and amplification) may be needed.
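Both circuit models imply first-order high-pass behavior: the resistance and capacitance set a time constant τ = RC and a corner frequency f_c = 1/(2πRC) below which the transducer's output decays. The short sketch below works through that arithmetic with assumed component values (the patent specifies none), illustrating why the signal conditioning mentioned above matters for slow gestures.

```python
import math

# First-order RC arithmetic for the FIG. 6 model (charge source with a
# shunt capacitor and resistor). The component values are assumptions
# for illustration only.
R = 10e6     # shunt resistance in ohms (assumed)
C = 15e-9    # shunt capacitance in farads (assumed)

tau = R * C                          # time constant: V decays as exp(-t/tau)
f_c = 1.0 / (2.0 * math.pi * tau)    # high-pass corner frequency

print(f"time constant: {tau * 1e3:.0f} ms")    # 150 ms
print(f"corner frequency: {f_c:.2f} Hz")       # about 1.06 Hz

# A one-second swipe is roughly a 1 Hz event, right at this corner, so a
# larger R or C (or a charge amplifier ahead of the A/D converter) would
# preserve more of the slow gesture content.
```
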
0027 FIGS. 8-11 are more schematics illustrating the gesture detector 42, according to exemplary embodiments. Because the gesture detector 42 responds to physical gestures, the gesture detector 42 may be installed at any position or location on or in the body 26. FIG. 8, for example, illustrates the gesture detector 42 mounted to a central region 90 on the backside 40 of the electronic device 20. As the backside 40 may present a large, supplemental gesture surface area 92 for inputting gestures, the gesture detector 42 may be disposed
in or near the central region 90 to detect the vibration 72. FIG. 9, though, illustrates the gesture detector 42 disposed in or near an end region 94 on the backside 40 of the electronic device 20. The end region 94 may be preferred in some situations, such as when the body 26 includes an access door 96 to a battery compartment. A discontinuous gap 98 around the access door 96 may attenuate transmission of waves or conduction of charge, thus reducing or nullifying the output signal 52 produced by the gesture detector 42. A designer may thus prefer to locate the gesture detector 42 in some region of the body 26 that adequately propagates waves or conducts charge.
0028 FIGS. 10 and 11 illustrate frontal orientations. FIG. 10 illustrates the gesture detector 42 disposed on or proximate the front side 24 of the electronic device 20. Even though the electronic device 20 may have the conventional touch sensor 34 detecting inputs above the display device 28, any portion of the front side 24 of the body 26 may also be used for gesture inputs. FIG. 11, likewise, illustrates the gesture detector 42 located in a corner region of the body 26. The gesture detector 42 may thus be installed at any location of the body 26 to detect the vibration 72 caused by gesture inputs.
0029 FIGS. 12-14 are schematics illustrating a learning mode 100 of operation, according to exemplary embodiments. Wherever the gesture detector 42 is located, here the user trains the electronic device 20 to recognize particular gestures drawn on the body 26. When the user wishes to store a gesture for later recognition, the user may first put the electronic device 20 into the learning mode 100 of operation. FIG. 12, for example, illustrates a graphical user interface or screen that is displayed during the learning mode 100 of operation. The user may be prompted 102 to draw a gesture somewhere on the body 26, such as the supplemental gesture surface area (illustrated as reference numeral 92 in FIG. 8). After the user inputs the desired gesture, the user may confirm completion 104 of the gesture.
0030 FIG. 13 again illustrates the backside 40 of the electronic device 20. Here the outer surface 44 of the backside 40 of the electronic device 20 is the supplemental gesture surface area 92. The user performs any two-dimensional or even three-dimensional movement. As the gesture is drawn, the vibration 72 propagates through the body 26 as mechanical and/or acoustical waves. The gesture detector 42 senses the vibration 72 and generates the output signal 52. The gesture detector 42 may also sense and respond to the electrical charges (as explained with reference to FIGS. 5-7). The gesture algorithm 50 causes the electronic device 20 to read and store the output signal 52 in the memory 32. Once the gesture is complete, the user selects the completion icon 104, as FIG. 12 illustrates.
0031 FIG. 14 illustrates a menu 110 of the commands 58. The menu 110 is stored in and retrieved from the memory (illustrated as reference numeral 32 in FIG. 13). The menu 110 is processed for display by the display device 28. Once the user confirms completion of the gesture, the user may then associate one of the commands 58 to the gesture. The menu 110 thus contains a selection of different commands 58 from which the user may choose. FIG. 14 only illustrates a few popular commands 58, but the menu 110 may be a much fuller listing. The user touches or selects the command 58 that she wishes to associate to the gesture (e.g., the output signal 52). Once the user makes her selection, the processor (illustrated as reference numeral 30 in FIG. 13) adds a new entry to the database 54 of gestures. The database 54 of gestures is thus updated to associate the output signal 52 to the command 58 selected from the menu 110. The user may thus continue drawing different gestures, and associating different commands, to populate the database 54 of gestures.
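The learning-mode flow of paragraphs 0029-0031 reduces to: record the output signal 52, present the menu 110, and store the association in the database 54. Below is a minimal sketch of that loop, reusing the hypothetical GestureDatabase above; record_output_signal stands in for whatever sampling the hardware performs and is an assumption, not the patent's implementation.

```python
# Hypothetical sketch of the learning mode 100 (FIGS. 12-14), reusing the
# GestureDatabase sketch above. record_output_signal is a stand-in for the
# hardware's sampling; the menu contents mirror FIG. 14.

MENU_OF_COMMANDS = ["Call Mom", "Flashlight", "Appointment", "Call you later"]

def learn_gesture(db, record_output_signal):
    print("Touch to Begin Gesture")        # prompt 102
    signature = record_output_signal()     # output signal 52, as samples
    print("Touch When Complete")           # completion confirmation 104
    for i, command in enumerate(MENU_OF_COMMANDS):
        print(f"{i}: {command}")           # menu 110 of commands 58
    choice = int(input("Select a command: "))
    db.add(signature, MENU_OF_COMMANDS[choice])  # new entry in database 54
```
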
0032 The database 54 of gestures may also be prepopulated. When the user purchases the electronic device 20, a manufacturer or retailer may preload the database 54 of gestures. Gestures may be predefined to invoke or call commands, functions, or any other action. The user may then learn the predefined gestures, such as by viewing training tutorials. The user may also download entries or updates to the database 54 of gestures. A server, accessible from the Internet, may store predefined associations that are downloaded and stored to the memory 32.
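As a sketch of how such downloadable associations might be merged into the local database 54, the snippet below fetches a JSON list of entries; the URL and the payload layout are assumptions for illustration only, since the patent does not describe the server's interface.

```python
import json
from urllib.request import urlopen

# Hypothetical sketch of downloading predefined gesture-to-command
# associations (paragraph 0032) and merging them into the database 54.
def download_gesture_entries(db, url="https://example.com/gestures.json"):
    with urlopen(url) as response:
        # Assumed payload: [{"signature": [0.1, ...], "command": "..."}]
        for entry in json.load(response):
            db.add(entry["signature"], entry["command"])
```
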
0033 FIG. 15 is an exploded component view of the electronic device 20, according to exemplary embodiments. The electronic device 20 is illustrated as the popular IPHONE® manufactured by Apple, Inc. The body 26 may have multiple parts or components, such as a bottom portion 120 mating with a central portion 122. The display device 28 and the touch sensor 34 are illustrated as an assembled module that covers the central portion 122. The body 26 houses a circuit board 124 having the processor 30, the memory 32, and many other components. A battery 126 provides electrical power. FIG. 15 illustrates the gesture detector 42 integrated into the assembly, proximate the bottom portion 120 of the body 26. This location may be advantageous for sensing vibration caused by gestures drawn on the outer surface 44. The gesture detector 42 may have an interface to the circuit board 124, such as a metallic strip or contact pad that conducts signals to/from the circuit board 124. The interface may also be a physical cable that plugs into a socket in the circuit board 124. Whatever the interface, the gesture detector 42 senses the vibration and/or the electrical charge (referred to above, and illustrated, as reference numerals 72 and 80) caused by gesture inputs on the body 26. The gesture detector 42 produces the output signal (referred to above, and illustrated, as reference numeral 52) in response to the vibration 72. The processor 30 analyzes the output signal 52 and executes the corresponding command 58, as earlier paragraphs explained.
0034 The body 26 may have any design and construction. The body 26, for example, may have a two-piece clamshell design with mating upper and lower halves. The body 26, however, may have any number of mating components that protect the internal circuit board 124. The body 26 may have a rectangular access opening through which the display device 28 and the touch sensor 34 insert or protrude. The body 26, in other words, may have an inner rectangular edge or wall that frames the display device 28 and/or the touch sensor 34. The body 26 may be made of any material, such as metal, plastic, or wood.
0035 Exemplary embodiments thus transform the backside 40. Conventional smartphones fail to utilize the backside 40 for gesture inputs. Exemplary embodiments, in contradistinction, transform the outer surface 44 of the backside 40 into the supplemental surface area for gesture detection. Whatever the shape or size of the outer surface 44 of the body 26, gestures may be input to execute the corresponding command 58, as earlier paragraphs explained. While the gesture detector 42 may be disposed anywhere within the electronic device 20, the gesture detector 42 is preferably proximate the supplemental gesture surface area. While the gesture detector 42 may be adhered to the outer surface 44 of the body 26, the gesture detector 42 may preferably be adhered to an inner
surface of the bottom portion 120 of the body 26 for added protection from physical damage. A glue or adhesive may simply and quickly adhere the gesture detector 42 to the body 26. While any adhesive compound may be used, the adhesive may be chosen to minimize attenuation as the vibration 72 travels through the adhesive. However, the gesture detector 42 may alternatively be mechanically attached, such as by fastener or weld. The gesture detector 42 may be soldered or welded to the body 26, especially when the body 26 is constructed of aluminum, magnesium, stainless steel, or any other metal. The gesture detector 42 may be soldered, TIG-welded, or MIG-welded to the body 26. Indeed, the body 26, and the supplemental gesture surface area 92, may be constructed of plastic, metal, wood, and/or any other material.
0036 FIG. 16 is a schematic illustrating contactless, three-dimensional gestures, according to exemplary embodiments. FIG. 16 again illustrates the user's fingers performing some gesture 74. Here, though, the user's fingers need not contact the body 26. That is, the user may make the three-dimensional gesture 74 in the vicinity of the gesture detector 42. The three-dimensional gesture 74 may have motions or movements that do not come into contact with the body 26 of the electronic device 20. When the user's fingers perform the gesture 74, the gesture movements may cause air molecules to vibrate. The gesture detector 42 senses the vibrating air molecules and generates its output signal 52. Moreover, the user's contactless gesture movements may also induce the electrical charges 80 in the air to build on the body 26, thus also causing the gesture detector 42 to produce the output signal 52 (as explained with reference to FIGS. 5-7). Exemplary embodiments may thus respond both to two-dimensional gestures drawn on the body 26 and to three-dimensional gestures having contactless movements.
0037 FIGS. 17-19 are schematics illustrating output sampling, according to exemplary embodiments. Whatever gesture the user performs, the gesture detector (illustrated as reference numeral 42 in FIG. 16) generates the output signal 52. The output signal 52 may be voltage or charge (current), depending on the circuit design (as explained with reference to FIGS. 4-7). Regardless, the output signal 52 may have too much data for fast processing. For example, FIG. 17 illustrates a graph of the output signal 52 for an exemplary gesture having a one-second (1 sec.) duration. The output signal 52 is illustrated as being biased about a biasing voltage V (illustrated as reference numeral 130). Even though the gesture is only one second in duration, the output signal 52 may still contain too much data for quick processing. The processor 30, in other words, may require more time than desired to process the output signal 52.
0038 FIG. 18 illustrates sampling of the output signal 52. Exemplary embodiments may sample the output signal 52 to produce discrete data points 132 according to some sampling rate 134. For mathematical simplicity, the sampling rate 134 is assumed to be one sample every 0.2 seconds, which may be adequate for human gestures. So, when the user performs the gesture having the one-second duration, the output signal 52 may be sampled every 0.2 seconds to yield five (5) data points 132.
0039 FIG. 19 again illustrates the database 54 of gestures. Because the output signal 52 may be sampled, the database 54 of gestures need only store the discrete data points 132 sampled from the output signal 52. FIG. 19 thus illustrates each sampled output signal 52 as a collection or set of the discrete data points 132 for each output signal 52. When the database 54 of gestures is queried, exemplary embodiments need only match the sampled values and not an entire, continuous voltage, charge, or current signal. The burden on the processor 30 is thus reduced, yielding a quicker response to the user's gesture input.
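Putting paragraphs 0038 and 0039 together: a one-second gesture sampled every 0.2 seconds reduces to five discrete data points, and recognition becomes a comparison of short tuples rather than continuous waveforms. The sketch below illustrates that reduction; the tolerance value and helper names are assumptions, since the patent does not say how closely a query must match a stored entry.

```python
import math

# Sketch of the sampling and matching in FIGS. 17-19. The 0.2 s interval
# and one-second duration come from paragraph 0038; the tolerance and
# helper names are assumptions for illustration.

SAMPLE_INTERVAL = 0.2    # seconds between samples (sampling rate 134)
DURATION = 1.0           # one-second gesture

def sample(signal, interval=SAMPLE_INTERVAL, duration=DURATION):
    """Reduce a continuous signal f(t) to discrete data points 132."""
    n = round(duration / interval)            # five points for one second
    return tuple(signal(i * interval) for i in range(n))

def closest_command(entries, signature, tolerance=0.1):
    """Match sampled values instead of a continuous waveform (0039)."""
    for stored, command in entries.items():
        if len(stored) == len(signature) and all(
            abs(a - b) <= tolerance for a, b in zip(stored, signature)
        ):
            return command
    return None

# Hypothetical usage with a swipe-like waveform rising and falling over 1 s.
swipe = lambda t: math.sin(math.pi * t)
entries = {sample(swipe): "capture_photo"}
print(closest_command(entries, sample(swipe)))   # -> capture_photo
```
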
0040 FIGS. 20A and 20B are schematics illustrating a protective case 200, according to exemplary embodiments. As many readers understand, users of smartphones, tablet computers, and other mobile devices often purchase the protective case 200. The protective case 200 protects the electronic device 20 (such as the smartphone 22) from damage. However, the protective case 200 may also deaden or insulate the backside 40 from the user's gesture inputs.
0041 FIG. 20A thus illustrates the gesture detector 42. Because the protective case 200 may limit access to the backside 40 of the electronic device 20, the gesture detector 42 may be added to the protective case 200. FIG. 20A, for example, illustrates the gesture detector 42 adhered to an inner surface 202 of the protective case 200. The user may thus make gestures on or near the protective case 200, and the gesture detector 42 may still sense vibration and electrical charge (as explained above). The gesture detector 42 may still have the interface to the circuit board of the electronic device 20, again such as a metallic contact or socket.
0042 Exemplary embodiments may be applied to the automotive environment. An interior of a car or truck, for example, has many surfaces for mounting the gesture detector 42. A center console, for example, may have a dedicated gesture surface for sensing the driver's gesture inputs. One or more of the piezoelectric transducers 70 may be affixed, mounted, or integrated into the gesture surface for sensing touch and other gesture-based inputs. An armrest and/or a steering wheel may also have an integrated gesture surface for sensing gesture inputs. As the driver (or passenger) gestures on or near the gesture surface, the piezoelectric transducer 70 senses the vibration 72 or the electric charge 80, as earlier paragraphs explained. Because the piezoelectric transducer 70 senses vibration and electrical charge, the gesture detector 42 may be integrated into any surface of any material.
0043 Exemplary embodiments may also be applied to jewelry and other adornment. As wearable devices become common, jewelry will evolve as a computing platform. An article of jewelry, for example, may be instrumented with the piezoelectric transducer 70, thus enabling inputs across a surface of the jewelry. Moreover, as the piezoelectric transducer 70 may be small and adhesively attached, exemplary embodiments may be applied or retrofitted to heirloom pieces and other existing jewelry, thus transforming older adornment to modern, digital usage.
0044 FIG. 21 is a schematic illustrating still more exemplary embodiments. FIG. 21 is a generic block diagram illust