US009996738B2

(12) United States Patent
     Boshernitzan et al.

(10) Patent No.: US 9,996,738 B2
(45) Date of Patent: Jun. 12, 2018

(54) SYSTEM AND METHOD FOR CONTROLLING A TERMINAL DEVICE

(71) Applicant: Swan Solutions Inc., Houston, TX (US)

(72) Inventors: Yaniv Boshernitzan, Houston, TX (US); Ohad Nezer, Houston, TX (US)

(73) Assignee: Swan Solutions, Inc., Austin, TX (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b).

(21) Appl. No.: 15/043,283

(22) Filed: Feb. 12, 2016

(65) Prior Publication Data
     US 2016/0239707 A1    Aug. 18, 2016

Related U.S. Application Data

(60) Provisional application No. 62/115,769, filed on Feb. 13, 2015.

(51) Int. Cl.
     G06F 3/043  (2006.01)
     G06K 9/00   (2006.01)
     G06F 3/041  (2006.01)
     G06K 9/32   (2006.01)

(52) U.S. Cl.
     CPC ......... G06K 9/00335 (2013.01); G06F 3/043 (2013.01); G06F 3/0416 (2013.01); G06K 9/3233 (2013.01)

(58) Field of Classification Search
     CPC ...... G06F 3/017; G06F 3/0414; G06F 3/0433; G06F 3/0436; G06F 3/0488; G06K 9/00335; G06K 9/3233; G06K 9/6201
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     4,377,049 A *   3/1983  Simon .............. G09F 3/18
                                                   200/309
     5,103,085 A *   4/1992  Zimmerman
     5,971,761 A    10/1999  Tillman, Sr.
     6,335,722 B1 *  1/2002  Tani ............... G05B 23/0216
                                                   345/173
     (Continued)

     FOREIGN PATENT DOCUMENTS

     WO   2013165348   11/2013

Primary Examiner — Joe H Cheng
(74) Attorney, Agent, or Firm — Andrew W. Chu; Craft Chu PLLC

(57) ABSTRACT

A control system includes a housing engaged to a mounting surface, a sensor contained within the housing, a server in communication with the sensor, and a terminal device in communication with the server. A gesture by a user associated with the mounting surface controls activity of the terminal device, such as a knock on a wall lowering a thermostat. The control system enables a mounting surface independent from the terminal device to become a controller for the terminal device. The sensor forms an interactive zone, and a contact interaction with the mounting surface within the interactive zone is detected by the sensor as data signals. The server receives the data signals, determines a data pattern corresponding to the data signals, and matches the data pattern with a gesture profile. The gesture profile is associated with a command transmitted to the terminal device to control activity of the terminal device.

13 Claims, 6 Drawing Sheets
(56) References Cited (continued)

     U.S. PATENT DOCUMENTS

     8,228,315 B1        7/2012  Starner et al.
     8,665,237 B2 *      3/2014  Koshiyama ......... G06F 3/0412
                                                   345/156
     8,788,978 B2        7/2014  Stedman et al.
     9,812,004 B1 *     11/2017  Boshernitzan ...... G08C 17/02
     2004/0214617 A1 *  10/2004  Kanazawa .......... H04L 12/2823
                                                   455/574
     2005/0046584 A1 *   3/2005  Breed ............. B60C 11/24
                                                   340/13.31
     2006/0164253 A1 *   7/2006  Harvey ............ G08B 29/26
                                                   340/628
     2007/0080819 A1 *   4/2007  Marks ............. G08B 17/10
                                                   340/628
     2010/0001992 A1     1/2010  Van Loenen et al.
     2010/0017407 A1 *   1/2010  Beniyama .......... G06F 17/30259
                                                   707/E17.016
     2010/0053152 A1 *   3/2010  Lewis ............. G06T 19/00
                                                   345/419
     2012/0249416 A1    10/2012  Maciocci et al.
     2013/0321346 A1    12/2013  Tyler et al.
     2014/0050354 A1     2/2014  Heim
     2014/0088794 A1 *   3/2014  Yashiro ........... G08C 17/02
                                                   701/2
     2014/0111483 A1     4/2014  Harrison et al.
     2014/0191938 A1 *   7/2014  Ybanez Zepeda ..... G06F 3/017
                                                   345/156
     2014/0225824 A1     8/2014  Shpunt et al.
     2014/0249681 A1 *   9/2014  Yamaguchi ......... G08C 17/02
                                                   700/276
     2015/0100167 A1 *   4/2015  Sloo .............. H04L 67/025
                                                   700/278
     2016/0266636 A1 *   9/2016  Boshernitzan ...... G06F 1/3231
     2017/0131783 A1 *   5/2017  Boshernitzan ...... G06F 3/017

     * cited by examiner
[Drawing sheets 1-6 are not reproduced here. Sheet 1 shows FIG. 1, a schematic view of the control system. Sheet 2 shows FIG. 2, a flow diagram: a contact interaction (60) is recorded as a data signal (70); if another data signal follows, the time between data signals is measured; a data pattern is determined and matched against a gesture profile; on a match, a command is sent to the terminal device (50), and on no match, no command is sent. Sheets 3-5 show FIGS. 3-6, views of the housing and sensor on the mounting surface. Sheet 6 shows FIGS. 7-9, schematic views of the housing, interactive zone, and mounting surface.]
SYSTEM AND METHOD FOR CONTROLLING A TERMINAL DEVICE

CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority under 35 U.S.C. Section 119(e) from U.S. Provisional Patent Application Ser. No. 62/115,769, filed on 13 Feb. 2015, entitled "CONTROL INTERFACE SYSTEM AND METHOD".

See also Application Data Sheet.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

Not applicable.

THE NAMES OF PARTIES TO A JOINT RESEARCH AGREEMENT

Not applicable.

INCORPORATION-BY-REFERENCE OF MATERIAL SUBMITTED ON A COMPACT DISC OR AS A TEXT FILE VIA THE OFFICE ELECTRONIC FILING SYSTEM (EFS-WEB)

Not applicable.

STATEMENT REGARDING PRIOR DISCLOSURES BY THE INVENTOR OR A JOINT INVENTOR

Not applicable.
BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a manual control system for a terminal device, such as a television, lighting fixture, thermostat or laptop. More particularly, the present invention relates to a control system on an exterior mounting surface independent from the terminal device to be controlled. Even more particularly, the present invention relates to a system to detect gestures on a mounting surface and to generate commands for the terminal device based on detected gestures.

2. Description of Related Art Including Information Disclosed Under 37 CFR 1.97 and 37 CFR 1.98.

With the development of electronic technology, output devices or terminal devices are used daily and are increasingly integrated with interactive features in order to enhance convenience and functionality. Users now can use a control system or controller, such as a remote control device, to adjust lights, curtains, a thermostat, etc. Existing control systems include distinct remote control devices dedicated to and associated with the particular output or terminal device to be controlled. Remote control devices can also be associated with more than one terminal device, such as a master controller for electronics and a touchscreen computer tablet made integral with furniture or walls to control lighting and room temperature. Any computer with an interface (keyboard, mouse, touch pad or touchscreen) can be a remote control device for multiple terminal devices with smart technology. Mobile phones are also known to be enabled for controlling terminal devices, such as home security cameras and door locks. Another existing control system involves voice recognition technology.
Existing control systems have limitations. Each output or terminal device typically is associated with a respective remote control device, such as a controller for the cable box, a controller for the DVD player, and a controller for the sound mixer. An excessive number of controllers is needed in order to remotely control multiple devices. Furthermore, an individual controller is often misplaced or left in locations that are not readily accessible to the user. The user must search for a controller or change locations to access the controller. Additionally, voice recognition technology often requires cumbersome training sessions to calibrate for pronunciations and accents of each particular user. Furthermore, voice recognition technology is often impaired by background noise, resulting in difficulties for that control system to recognize verbal commands. Additionally, the sound produced by voice commands may be obtrusive in many environments, such as in a room where others are sleeping, or in a room while watching a movie.

For remote control devices associated with multiple terminal devices, for example, computer tablets with a touchscreen and computers with touchpads, remote control devices can be built into or integrated into furniture. Smart tables have been built with touchscreens that are able to receive touch-based gestures. In the case of integrating these touchscreens or touch pads into surfaces of structures such as furniture, the cost of the structure is significantly increased due to design modifications required to accommodate the remote control device, and the cost of the components and hardware. Furthermore, aesthetics are often affected. Appearances are altered when furniture, walls and surroundings are filled with touchscreens, touchpads, and other conspicuous devices. Integration of such hardware into furniture also requires the manufacturer to modify existing designs such that the hardware can be accommodated into the structure.

Prior art manual control systems range from buttons on a television remote controller to a touchscreen of a mobile phone. Simple gestures of pressing dedicated buttons and complex gestures of finger motions on a touchscreen are both used to control terminal devices. Various patents and publications are available in the field of these manual control systems.
U.S. Pat. No. 8,788,978, issued to Stedman et al on Jul. 22, 2014, teaches a gesture sensitive interface for a computer. The "pinch zoom" functionality is the subject matter, so that the detection of first and second interaction points, and the relative motion between the points, are detected by sensors. A variety of sensors are disclosed to define the field, including a touch screen, camera, motion sensor, and proximity sensors.

World Intellectual Property Organization Publication No. WO2013165348, published for Bess on Nov. 7, 2013, describes a system with at least three accelerometers disposed in different locations of an area with a surface to capture respective vibration data corresponding to a command tapped onto the surface by a user. A processing system receives the vibration data from each accelerometer, identifying the command and a location of the user from the vibration data. A control signal based on the command and the location is generated.

U.S. Patent Publication No. 20140225824, published for Shpunt et al on Aug. 14, 2014, discloses flexible room controls. A control apparatus includes a projector for directing first light toward a scene that includes a hand of a user in proximity to a wall of a room and to receive the first light that is reflected from the scene, and to direct second light toward the wall so as to project an image of a control device onto the wall. A processor detects hand motions within the projected field.
U.S. Patent Publication No. 20120249416, published for Maciocci et al on Oct. 4, 2012, describes another projection system with gesture identification. The projector is a unit worn on the body of the user to project onto surfaces, such as walls and tables. Spatial data is detected by a sensor array. Additional rendering operations may include tracking movements of the recognized body parts, applying a detection algorithm to the tracked movements to detect a predetermined gesture, applying a command corresponding to the detected predetermined gesture, and updating the projected images in response to the applied command.

U.S. Patent Publication No. 20100019922, published for Van Loenen on Jan. 28, 2010, is the known prior art for an interactive surface by tapping. Sound detection is filtered and interpreted either in the system to be controlled or else in the sensors themselves. The direction of movement of a hand stroking the surface can be interpreted as a command to increase or decrease a parameter, such as the sound volume level of a television, for example. Determination of the position of the user's hand is unnecessary.

For a prior art control system, including a set of simple buttons or a complex touchpad, there is a discrete boundary or differentiation between the touch-sensitive region and non-touch regions on a surface of a remote control device. A touch-sensitive region can be bound by the button, the keypad, or the outer edge of a touchpad that is integrated in the surface. Therefore, a command is often processed the moment a contact interaction occurs between a person's hand and the button or touchpad of an activated terminal device. For the prior art light detection devices, there is a discrete boundary of visible light as the touch-sensitive region. Only gestures within the field of projected light, and only gestures made when the projected light is activated, are processed within the control system for commands of the terminal output.

There is a need to remove the boundary between the touch-sensitive region and non-touch regions so that an entire surface can be an interactive zone. For individuals with disabilities, the touch-sensitive region may not be accessible, such as a switch or a touchscreen mounted too high. Sufficient motor control to interact properly with a touchscreen may not be possible for individuals with neuromuscular problems or other physical constraints. Elderly individuals may also need assistance to adequately view buttons and touchscreens when controlling their terminal devices. There is a need to improve the manual control systems for all types of users with wide ranges of physical abilities.

It is an object of the present invention to provide a system and method for controlling a terminal device.

It is an object of the present invention to provide a manual system to control a terminal device.

It is an object of the present invention to provide an interactive control system based on gestures.

It is another object of the present invention to provide an interactive control system based on physical impact on a surface independent from the terminal device.

It is another object of the present invention to provide an embodiment of the system for controlling a terminal device by contact interactions through an associated mounting surface.

It is another object of the present invention to provide an embodiment of the system for controlling a terminal device with an interactive zone coordinated or aligned with an exterior surface.

It is another object of the present invention to provide an embodiment of the interactive control system to detect a gesture as a contact interaction within an interactive zone set by a sensor.

It is another object of the present invention to provide an embodiment of the system for controlling a terminal device to detect contact interactions associated with a mounting surface as data signals.

It is still another object of the present invention to provide an embodiment of the system for controlling a terminal device to determine a data pattern based on the data signals.

It is still another object of the present invention to provide an embodiment of the system for controlling a terminal device to match a detected data pattern with a gesture profile associated with a command of a terminal device.

It is still another object of the present invention to provide an embodiment of the system for controlling a terminal device by converting a contact interaction detected through a sensor into a command associated with a gesture profile matched to a detected data pattern.

These and other objectives and advantages of the present invention will become apparent from a reading of the attached specification.
BRIEF SUMMARY OF THE INVENTION

Embodiments of the control system of the present invention convert any independent mounting surface into a controller for a terminal device. A physically separate mounting surface, such as a wall or table surface, can be used to activate and deactivate a television or light fixtures, without the user touching either appliance. The control system includes a housing engaged to a mounting surface, a sensor within the housing, a server in communication with the sensor, and a terminal device in communication with the server. The terminal device is to be controlled by gestures associated with the mounting surface.

The housing has an engagement means for a mounting surface, such that the housing can be placed exterior, underneath or interior to the mounting surface. Planar surfaces, such as tables and walls, as well as non-planar surfaces, such as beds, can be mounting surfaces. The engagement means can include an attachment means between the housing and the mounting surface and a transmission portion connecting the sensor to the housing. In some embodiments, the transmission portion has a spring loaded portion to reduce damping of the sensor within the housing.

The sensor forms an interactive zone defined by a range of the sensor, and the interactive zone is aligned with the mounting surface. The interactive zone can be coplanar, overlaying or made integral with the mounting surface. The sensor is in a fixed position relative to the engagement means, so that a contact interaction with the mounting surface within the interactive zone is detected by the sensor as data signals. In an alternate embodiment, there can be more than one sensor. With an additional sensor, there is an additional interactive zone for detecting the same contact interaction on the mounting surface. The additional data signals from the additional sensor can be detected along with the data signals of the sensor.

The contact interaction generates the data signals of the sensor through the transmission portion of the housing. In some embodiments, the contact interaction is comprised of an impact or plurality of impacts on the mounting surface, the data signals having a respective defined peak corresponding to each impact, a measured time period between each defined peak, and a defined time period after a last defined peak. A data pattern for each contact interaction is determined by each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak, if there is a plurality of impacts.
When the sensor is an acoustic sensor, the data signals are sound data, such as volume, intensity, pitch, frequency, and duration. When the sensor is an accelerometer, the data signals are vibration data, such as amplitude, intensity, and duration. Other sensors, such as sensors with mechanical, light, and piezoelectric capabilities, can also be incorporated into the control system. Contact interactions, such as tapping, knocking, sweeping, and dragging, can be detected by the sensor as data signals, such that different gestures of a user can be used by the present invention for controlling activity of the terminal device. In an alternate embodiment with more than one sensor, the data signals and the additional data signals can be used to determine the data pattern for the contact interaction.
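The peak-and-interval structure described above lends itself to a straightforward reduction of a sampled trace into a data pattern. The following Python code is only a minimal sketch of that idea, not the patented implementation; the threshold, the sampling rate, and every name in it are assumptions introduced for illustration.

    # Illustrative sketch only: reduce a sampled sensor trace to the data
    # pattern described above (one defined peak per impact plus the measured
    # time between peaks). The threshold and 100 Hz rate are assumptions;
    # segmentation by the "defined time period after a last defined peak"
    # is omitted for brevity.

    PEAK_THRESHOLD = 0.5   # assumed amplitude that counts as an impact
    SAMPLE_RATE_HZ = 100   # assumed sensor sampling rate

    def extract_data_pattern(samples):
        """Return the peak count and inter-peak gaps from amplitude samples."""
        peak_times = []
        for i, amplitude in enumerate(samples):
            # Treat a local maximum above the threshold as one defined peak.
            is_local_max = (
                amplitude >= PEAK_THRESHOLD
                and (i == 0 or samples[i - 1] < amplitude)
                and (i == len(samples) - 1 or samples[i + 1] <= amplitude)
            )
            if is_local_max:
                peak_times.append(i / SAMPLE_RATE_HZ)
        gaps = [b - a for a, b in zip(peak_times, peak_times[1:])]
        return {"peak_count": len(peak_times), "gaps_s": gaps}

    # Example: two knocks roughly 0.4 s apart.
    trace = [0.0] * 100
    trace[10] = 0.9   # first impact
    trace[50] = 0.8   # second impact
    print(extract_data_pattern(trace))   # {'peak_count': 2, 'gaps_s': [0.4]}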
The server of the present invention is in communication with the sensor, including but not limited to wifi, Bluetooth, local area network, wired or other wireless connection. The server can be comprised of a routing module, a processing module being connected to the routing module, and an output module connected to the processing module. The routing module receives the data signals from the sensor, and the processing module determines the data pattern corresponding to the data signals of the contact interaction. For more than one sensor, the data pattern corresponds to the data signals of the sensor and any additional data signals corresponding to other sensors. The processing module matches the data pattern with a gesture profile. The gesture profile is associated with a command. Once matched, the output module transmits the command to the terminal device.
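To make the division of labor among the three modules concrete, here is a hedged Python sketch of a server with routing, processing, and output stages. The GestureProfile structure, the peak-count matching rule, and the command strings are all invented for illustration; the patent does not prescribe any particular code.

    # Illustrative sketch only: one possible shape for the routing,
    # processing, and output modules named above. All names and the
    # matching rule are assumptions, not the patent's design.

    from dataclasses import dataclass

    @dataclass
    class GestureProfile:
        name: str
        peak_count: int   # pattern feature this profile matches
        command: str      # command sent to the terminal device on a match

    class Server:
        def __init__(self, profiles, send_to_device):
            self.profiles = profiles
            self.send_to_device = send_to_device   # output module's transport

        def route(self, data_signals):
            # Routing module: receive data signals from the sensor.
            pattern = self.process(data_signals)
            self.output(pattern)

        def process(self, data_signals):
            # Processing module: determine the data pattern (peak count here).
            return {"peak_count": len(data_signals["peaks"])}

        def output(self, pattern):
            # Output module: match the pattern to a gesture profile and
            # transmit the associated command; on no match, send nothing.
            for profile in self.profiles:
                if profile.peak_count == pattern["peak_count"]:
                    self.send_to_device(profile.command)
                    return

    server = Server([GestureProfile("double knock", 2, "THERMOSTAT_DOWN")],
                    print)
    server.route({"peaks": [0.10, 0.52]})   # prints THERMOSTAT_DOWN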
The terminal device can be an appliance, lighting fixture or climate regulator. Examples include a television, a thermostat, a computer, a software system, a game console, a smartphone, a device running software, a fan, a mattress adjustor, an alarm clock, and a lighting fixture. The terminal device can be comprised of a receiving module and means for initiating activity of the terminal device corresponding to the command. The activity of the terminal device can be dedicated to the particular terminal device, such as powering on and off for a lamp, raising and lowering temperature of a thermostat, answering a phone call on a smartphone, checking calendar software on a tablet, and changing channels on a television. The receiving module in communication with the server receives the command, and the means for initiating performs the activity. The means for initiating can be a switch or other actuating mechanism to change the status of the terminal device.
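On the device side, the receiving module and the means for initiating can be as simple as a command handler wired to a switch. The sketch below is hypothetical; the Lamp class and the command strings are assumptions for illustration.

    # Hypothetical sketch of a terminal device: a receiving module that
    # accepts commands from the server, and a "means for initiating"
    # modeled as a simple power switch.

    class Lamp:
        def __init__(self):
            self.powered = False

        def receive(self, command):
            # Receiving module: accept the command, then initiate the
            # device-specific activity (here, powering on or off).
            if command == "POWER_ON":
                self.powered = True
            elif command == "POWER_OFF":
                self.powered = False

    lamp = Lamp()
    lamp.receive("POWER_ON")
    print(lamp.powered)   # True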
Embodiments of the present invention further include a method of controlling a terminal device with the system of the present invention. A housing is installed on a mounting surface by an engagement device. The server is connected so as to be in communication with the sensor, and the terminal device is connected so as to be in communication with the server. A physical impact is made on the mounting surface so as to generate a contact interaction, and the sensor detects the contact interaction as data signals. The server receives the data signals from the sensor and determines a data pattern corresponding to the data signals of the contact interaction. The data pattern is matched to a gesture profile associated with a command. The command is transmitted to the terminal device, so that the terminal device performs an activity according to the command. The gesture related to the mounting surface controls the terminal device, even as the mounting surface is independent from the terminal device.
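Wiring the earlier hypothetical sketches together gives an end-to-end picture of this method; again, every name here is an assumption for illustration, not the patented implementation.

    # End-to-end sketch of the method above, reusing the hypothetical
    # Server, GestureProfile, and Lamp classes from the earlier sketches.

    lamp = Lamp()
    server = Server([GestureProfile("double knock", 2, "POWER_ON")],
                    lamp.receive)         # output module -> receiving module

    # Two impacts detected on the mounting surface, 0.4 s apart, arrive
    # at the server as data signals:
    server.route({"peaks": [0.10, 0.50]})
    print(lamp.powered)   # True: knocking on the surface powered the lamp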
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

FIG. 1 is a schematic view of an embodiment of the control system of the present invention.

FIG. 2 is a flow diagram of the embodiment of the method for controlling a terminal device with the system of the present invention.

FIG. 3 is a schematic view of an embodiment of the housing and sensor of the control system of the present invention.

FIG. 4 is a side elevation view of the embodiment of the housing and sensor of FIG. 3.

FIG. 5 is a top plan view of an embodiment of the housing on a mounting surface of the present invention.

FIG. 6 is a top plan view of another embodiment of the housing with a plurality of sensors on the mounting surface of the present invention.

FIGS. 7-9 are schematic views of the housing, interactive zone, and mounting surface of the control system of the present invention.
DETAILED DESCRIPTION OF THE INVENTION

The control system of the present invention is a manual control system for all types of users with wide ranges of physical abilities. Any independent mounting surface can be converted into a controller for a terminal device. Simple gestures associated with the mounting surface can be used to control the terminal device in a different location. Gaining access to a button or switch on a dedicated controller or being able to view and manipulate a complicated menu on a touchscreen are no longer required. An appliance, such as a television or a thermostat, can be activated or deactivated without physically touching either appliance. The user does not have to physically reach the appliance as the terminal device. A separate mounting surface, such as a wall or table surface, is converted into a controller without a touch-sensitive region boundary. Simple physical interactions on an independent surface can now control the terminal device.

Referring to FIGS. 1-2, the control system 10 includes a housing 20 engaged to a mounting surface 22, a sensor 30 within the housing 20, a server 40 in communication with the sensor 30, and a terminal device 50 in communication with the server 40. Interfaces 99 are connected to the server 40 in order to interact with the control system 10. The interfaces 99 can include computers, laptops, tablets and smartphones. FIG. 1 shows a variety of different interfaces 99. The interfaces 99 allow the user to adjust the settings of the control system 10. Gestures by a user associated with the mounting surface 22 control the terminal device 50 in FIGS. 5 and 7-9. In some embodiments, the devices that are interfaces 99 could also be terminal devices 50.

In FIGS. 3-4, the housing 20 is comprised of an engagement means 24 for a mounting surface 22. FIGS. 5 and 7-9 show that the housing 20 can be placed exterior, underneath or interior to the mounting surface 22. Planar surfaces, such as tables and walls, as well as non-planar surfaces, such as beds, can be mounting surfaces 22. FIG. 4 shows an embodiment of the engagement means 24 being comprised of an attachment means 26 between the housing 20 and the mounting surface 22 and a transmission portion 28 connecting the sensor 30 to the housing 20. The attachment means 26 can be an adhesive, mechanical fasteners, threaded screws or other components to hold the housing 20 to the mounting surface 22. In some embodiments, the transmission portion 28 can be comprised of frames and brackets 38 or a spring loaded portion (not shown) so as to reduce damping. There is a rigid positioning of the sensor 30 relative to the mounting surface 22 through the housing 20. Any sound or vibration of the mounting surface 22 is transmitted to the sensor 30. The engagement means 24 attaches the sensor 30 and reduces damping so that the sensor 30 more accurately detects the mounting surface 22. The transmission portion 28 conducts sound, vibration or other stimuli from the mounting surface 22 to the sensor 30.
The control system 10 of the present invention includes a sensor 30 as shown in FIGS. 3-4. The housing 20 contains the sensor 30 comprised of a printed circuit board 34' with a flash memory 31, microcontroller unit (MCU) 33, the sensor unit 35, antenna 37, and light emitting diode 39. The sensor unit 35 can be an accelerometer or acoustic sensor. The microcontroller unit 33 and antenna 37 can have wifi capability for communication with the server 40. The relationship between the sensor unit 35 of the sensor 30 and the transmission portion 28 of the housing 20 is shown. The rigid position of the sensor 30 establishes the transmission of the contact interaction to the sensor 30. Other parts in the housing 20 include batteries 36 as a known power supply for the control system 10. The stable construction of the housing 20 and the sensor 30 enables the accurate and efficient conversion of the gestures into commands for the terminal device 50.

FIGS. 5 and 6 show embodiments of the sensor 30 forming an interactive zone 32 defined by a range 34 of the sensor 30. A contact interaction with the mounting surface 22 within the interactive zone 32 is detected by the sensor 30 as data signals. FIGS. 5 and 6 show the interactive zone 32 aligned with the mounting surface 22; in particular, the interactive zone 32 is coplanar with the mounting surface 22. The contact interaction on the mounting surface 22 is detected by the sensor 30 on the mounting surface 22. In one example, the housing 20 rests on a table, and knocking on the tabletop can control the terminal device 50. The tabletop for knocking and the mounting surface 22 are the same structure.

FIGS. 7-9 show other embodiments with the interactive zone 32 aligned with the mounting surface 22 in different positions. FIGS. 7 and 8 show the interactive zone 32 coplanar with and overlaying the mounting surface 22. FIG. 7 shows a wall as the mounting surface 22 with the housing 20 behind the wall. The contact interaction is on the wall surface opposite the mounting surface 22, but the knocking on the wall surface is transmitted to the mounting surface 22 and then to the sensor 30. Similarly, FIG. 8 shows a table as the mounting surface 22 with the housing 20 underneath the table. The contact interaction is on the tabletop, opposite to the mounting surface 22. The knocking on the tabletop is transmitted to the mounting surface 22 on the bottom of the table and to the sensor 30. FIG. 9 shows the interactive zone 32 made integral with the mounting surface 22, such as embedded in a table. The mounting surface 22 is within the table, and the tabletop is not the mounting surface 22. The contact interaction is associated with the mounting surface 22 and the contact interaction is detected through the mounting surface 22, even if the contact interaction is not always directly on the mounting surface 22. In the present invention, the sensor 30 is in a fixed position relative to the engagement means 24, so that the contact interaction is transmitted through the mounting surface 22 to the sensor 30. Movement, sound waves, and vibration through the mounting surface 22 can be transmitted as efficiently as possible to the sensor 30 through the mounting surface 22 and the engagement means 24.
The engagement means 24 of the housing 20 is cooperative with the sensor 30 so that any contact interaction generates data signals of the sensor through the transmission portion 28 of the engagement means 24. There is less damping of the contact interaction as sound or vibration. The transmission portion 28 can have less damping than the mounting surface 22 or the actual surface of the knocking in the interactive zone 32. In some embodiments, the transmission portion 28 affects transmission of the data signal to the sensor 30. The rigid position of the sensor 30 relative to the mounting surface 22 reduces damping of the contact interaction through the transmission portion 28. The transmission portion 28 can be comprised of a rigid material, such as an injection molded frame with flexibility different than the materials of the mounting surface 22 or surface of the contact interaction, if different from the mounting surface 22. In the embodiment with the spring loaded portion (not shown), the spring loaded portion of the transmission portion 28 has less damping than the mounting surface 22 or surface of the contact interaction, if not the same. Sound or vibration has less damping through a spring loaded portion for the transmission of the contact interaction through the transmission portion 28 to the sensor 30. For example, the spring loaded portion as the transmission portion 28 may hold the housing 20 closer and stronger to the mounting surface 22 so as to reduce damping of sound or vibration of the contact interaction. The data signals of the sensor 30 may have improved clarity and accuracy compared to systems without the relationship of the sensor 30 within the housing 20 relative to the mounting surface 22 for transmission through to the sensor 30. The sensor 30 can be stabilized relative to the housing 20 on both sides of the printed circuit board 34'. Alternatively, the printed circuit board 34' may be held along multiple points along the perimeter of the printed circuit board 34', including brackets spaced every 120 degrees or every 90 degrees. The sensor 30 is held in position to prevent flopping and vibrating separate from the mounting surface so that the sensor 30 maintains the proper relationship to the mounting surface 22.
FIG. 2 is a flow diagram of an embodiment of the present invention, showing the data signals of the sensor 30 in relation to the server 40. The contact interaction 60 generates the data signals 70 of the sensor 30 through the transmission portion 28 of the housing 20. In the present invention, the contact interaction 60 is comprised of an impact or plurality of impacts associated with the mounting surface 22. In FIGS. 5 and 6, knocking on the tabletop is also knocking on the mounting surface 22, so the contact interaction 60 is directly on the mounting surface 22. In FIGS. 7-9, knocking on an associated surface of the wall or table is the contact interaction 60. In those embodiments, the impact or plurality of impacts on the associated surface is the contact interaction 60, not an impact on the mounting surface 22. The impacts are coordinated or correspond or translate to the mounting surface 22 for detection by the sensor 30 through the mounting surface 22 as data signals 70.
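The FIG. 2 flow (record a data signal, check whether another follows, measure the time between data signals) can be sketched in a few lines of Python. As before, this is only an illustration under stated assumptions; the quiet-window constant and all names are hypothetical.

    # Illustrative sketch of the FIG. 2 flow: record each data signal 70,
    # check whether another follows, and measure the time between signals
    # until an assumed quiet window closes the gesture.

    GESTURE_END_S = 1.0   # assumed quiet window that ends a contact interaction

    def collect_pattern(signal_times):
        """signal_times: timestamps (s) of data signals from the sensor."""
        peaks = []
        gaps = []
        for t in signal_times:
            if peaks and t - peaks[-1] >= GESTURE_END_S:
                break                       # quiet period reached: gesture over
            if peaks:
                gaps.append(t - peaks[-1])  # measure time between data signals
            peaks.append(t)                 # record the data signal
        return {"peak_count": len(peaks), "gaps_s": gaps}

    print(collect_pattern([0.0, 0.35, 2.5]))
    # {'peak_count': 2, 'gaps_s': [0.35]} - the 2.5 s event starts a new gesture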
In the embodiments of the control system 10, the data signals 70 have a respective defined peak corresponding to each impact, a measured time period between each defined peak, and a defined time period after a last defined peak. Each peak i
