US009712895B2

(10) Patent No.: US 9,712,895 B2
(45) Date of Patent: Jul. 18, 2017

Yeo et al.

(54) DISTRIBUTED WIRELESS SENSING SYSTEM

(71) Applicant: Singapore University of Technology and Design, Singapore (SG)

(72) Inventors: Kian Peen Yeo, Singapore (SG); Suranga Chandima Nanayakkara, Singapore (SG)

(73) Assignee: Singapore University of Technology and Design, Singapore (SG)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 14/760,116

(22) PCT Filed: Dec. 20, 2013

(86) PCT No.: PCT/SG2013/000545
    § 371 (c)(1),
    (2) Date: Jul. 9, 2015

(87) PCT Pub. No.: WO2014/109710
    PCT Pub. Date: Jul. 17, 2014

(65) Prior Publication Data
    US 2015/0358696 A1    Dec. 10, 2015

    Related U.S. Application Data
(60) Provisional application No. 61/750,578, filed on Jan. 9, 2013.

(51) Int. Cl.
    G08C 19/22    (2006.01)
    H04Q 9/00     (2006.01)
    (Continued)

(52) U.S. Cl.
    CPC ............ H04Q 9/00 (2013.01); G08B 25/10 (2013.01); H04W 84/18 (2013.01); G08B 25/14 (2013.01); H04Q 2209/40 (2013.01)

(58) Field of Classification Search
    None
    See application file for complete search history.

(56) References Cited

    U.S. PATENT DOCUMENTS

    6,735,630 B1*   5/2004   Gelvin ............ B60R 25/1004 (706/33)
    7,123,180 B1*   10/2006  Daniell ........... G08C 17/02 (341/176)
    (Continued)

    FOREIGN PATENT DOCUMENTS

    WO    0126068 A1    4/2001

    OTHER PUBLICATIONS

Ballagas, R. et al., "iStuff Mobile: Rapidly Prototyping New Mobile Phone Interfaces for Ubiquitous Computing," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '07), ACM, 2007, pp. 1107-1116.
(Continued)

Primary Examiner — George Bugg
Assistant Examiner — Renee Dorsey
(74) Attorney, Agent, or Firm — Lerner, David, Littenberg, Krumholz & Mentlik, LLP

(57) ABSTRACT

A device comprising: at least one input sensor, at least one output transducer, a wireless communication module, and a processor configured to receive a local control parameter from the input sensor or a remote control parameter from a remote module communicating with the processor via the wireless communication module, and selecting one of a plurality of operational configurations depending on the local and remote control parameters, each of the plurality of operational configurations including a predetermined threshold for a sensing parameter received from the input sensor, and an output response if the sensing parameter breaches the threshold.

16 Claims, 18 Drawing Sheets
`100
`
`3. Autonomous Response to Sound
`() Sensor node
`(il) Sensornode
`triggered bya
`sound source
`plays2 soundin
`response
`d —
`102
`Sensor Node
`
`
`@p
`easy ee
`Wireless Networking
`Base Station
`
`Gi} Sound alarm
`and blinkLEO
`ed
`
`¥,
`
`&
`
`Sensor Node
`
`402
`
`(Find
`
`
`
`
`é
`Mee,
`:
`106
`\ t
`
`displaynotification
`(il) Vibrateand
`©)
`message
`Receiver
`
`Dongle
`2. Remote Sound Triggering
`
`
`sensornode 7
`
`APPLE 1048
`
`2, Remote Monitoring of Sound
`(i) Sensor node
`102
`‘triggered bya
`sound source
`
`2g
`
`a
`
` a
`
`-
`
`Smart Phone
`
`108,
`
`
`
` x
`
`APPLE 1048
`
`1
`
`
`
(51) Int. Cl.
    G08B 25/10    (2006.01)
    H04W 84/18    (2009.01)
    G08B 25/14    (2006.01)

(56) References Cited

    U.S. PATENT DOCUMENTS

    8,159,476 B2*      4/2012  Kim .............. G06F 3/0362 (345/156)
    8,725,462 B2*      5/2014  Jain ............. G06F 19/3406 (702/187)
    2003/0074489 A1*   4/2003  Steger ........... G01D 9/005 (710/1)
    2006/0116175 A1*   6/2006  Chu .............. H04M 1/72569 (455/567)
    2011/0231535 A1*   9/2011  Starnes .......... H04W 4/001 (709/223)
    2012/0127317 A1*   5/2012  Yantek ........... G01V 8/14 (348/156)
    2012/0215446 A1*   8/2012  Schunder ......... G07C 5/008 (702/3)

    OTHER PUBLICATIONS

International Preliminary Report on Patentability for Application No. PCT/SG2013/000545 dated Apr. 29, 2013.
International Search Report and Written Opinion for Application No. PCT/SG2013/000545 dated Mar. 13, 2014.

* cited by examiner
`
[Drawing sheets 1-18: the original figures are images; only their captions and legible labels are reproduced below.]

Sheet 1 — FIG. 1: block diagram of the overall system 100 with panels "1. Remote Monitoring of Sound", "2. Remote Sound Triggering" and "3. Autonomous Response to Sound"; labeled elements include sensor nodes 102, the wireless networking base station 104 and the receiver dongle 106. FIG. 3: wireless base station hardware — 32-bit microprocessor 302, RF transceiver module (to communicate with sensor node and receiver dongle) and Bluetooth module 306 (to communicate with mobile computing device).

Sheet 2 — FIG. 2: sensor node 102 hardware — electret or MEMS microphone 202, 3-axis accelerometer 204, mechanical or capacitive switch 206, rotary encoder 208, speaker, display (tri-colour LED, OLED or E-Ink), RF transceiver module, NFC reader and 32-bit microprocessor 218. FIG. 4: receiver dongle 106 hardware — push button switch 406, 32-bit microprocessor, display (LED, OLED or E-ink), vibrating motor and RF transceiver module.

Sheet 3 — FIG. 5: sensor node firmware flowchart (rotary encoder, microphone threshold, RF radio and accelerometer interrupts; configure, input and output modes; steps including 502, 504, 508, 512 and 518).

Sheet 4 — FIG. 6: base station firmware flowchart (pairing, trigger and low battery notifications from sensors; trigger and battery status queries and sound trigger commands from receivers; database updates; steps including 616, 618, 622 and 624).

Sheet 5 — FIG. 7: receiver dongle firmware flowchart (timer interrupt 702 querying the wireless base station; low battery and sensor trigger handling; hardware interrupt from push switch; display on/off steps).

Sheet 6 — FIG. 8: software application flowchart (Bluetooth start-up and user permission, base station selection, new device notification, database queries, configuring a sensor node as output, generating a notification on the phone).

Sheet 7 — FIG. 9: software application flowchart (user receives new sensor node notification; device configuration dialog; user inputs device name and mode; validity check with toast and vibration; save in database; process exit).

Sheet 8 — FIG. 10: software application flowchart (user receives notification to configure sensor node as output; user voice input of device name; validity and existence checks; save in database; process exit).

Sheet 9 — FIGS. 11(a) and 11(b): screenshots ("Select base station to connect", paired devices; new sensor node notification).

Sheet 10 — FIGS. 11(c) and 11(d): screenshots.

Sheet 11 — FIGS. 11(e) and 11(f): screenshots.

Sheet 12 — FIGS. 12(a) to (c): sensor node physical design — push down to activate the push-button switch; semi-transparent diffuser for LED light; rotate to activate the rotary encoder; window for sound to enter the microphone; the bottom side faces the object when monitoring for a localized sound event on the object, or faces up when monitoring for an area/location specific sound event.

Sheet 13 — FIGS. 12(d) to (f): a user physically interacting with the sensor node.

Sheet 14 — FIG. 13: receiver dongle physical design (parts 1302, 1304, 1310 and 1312).

Sheet 15 — FIG. 14: remote monitoring of sound — (i) sensor node triggered by a sound source; (ii) base station receives a notification from the sensor node and updates its database; (iii) receiver devices query the base station for notifications and inform users if any sensor node has been triggered (vibrate and display notification message). FIG. 15: remote sound triggering — (i) user issues a voice command to trigger sound on a sensor node; (ii) base station receives the request from the mobile device and sends a command to trigger sound on the corresponding sensor node; (iii) sensor node receives the sound trigger command from the base station, plays a sound alarm and blinks its LED.

Sheet 16 — FIG. 16: autonomous response to sound — (i) sensor node triggered by a sound source (ii) plays a sound in response. FIG. 17(a): collective input (sound localization) — (i) sensor nodes 1-4 receive time-of-arrival information of a sound source and send it to the base station; (ii) the base station computes the time-of-arrival information to localize the direction of the sound source and sends the result ("Sound Detected at XX, YY") to a mobile device.

Sheet 17 — FIG. 17(b): collective output — two sensor nodes positioned around a top view of a person's head. FIG. 17(c): input from a sensor node triggering a response on others — (i) user issues a voice command to trigger sound on a sensor node at a known location XX; (ii) base station receives the request from the mobile device and sends the trigger command; (iii) a nearby sensor node hears the sound trigger and sends a notification to the base station; (iv) the mobile device receives a notification about sound detected at location XX.

Sheet 18 — FIG. 18: application as an input device — (i) a user working on a table generates different "knocking" sound gestures which are captured by a sensor node; (ii) the base station receives the notification and decodes the sound gestures into commands sent to the mobile device; (iii) the mobile phone or any other device can be remotely "controlled" to perform certain functions.
`
`
DISTRIBUTED WIRELESS SENSING SYSTEM
`
CROSS REFERENCE TO RELATED APPLICATIONS
`
The present application is a national phase entry under 35 U.S.C. § 371 of International Application No. PCT/SG2013/000545, filed Dec. 20, 2013, published in English, which claims priority from U.S. Patent Application No. 61/750,578, filed Jan. 9, 2013, all of which are incorporated herein by reference.
`
`FIELD
`
`This invention relates to a distributed wireless sensing
`system.
`
`BACKGROUND
`
`10
`
`15
`
`20
`
Various networked sensors are known in the art, which allow remote collection of data for industrial and consumer applications. Also, prior art products including Ninja Blocks™ and Twine™ include various sensors and a web app to monitor the outputs. However, neither product offers local reconfiguration of the sensing mode of the device.

Most smart phones or tablets include multiple sensors and may seem to be an 'all-round' solution for many applications. However, they might not be a cost effective solution when having to deploy multiple devices at various locations. Dedicated low cost sensor devices deployed in large sensor networks may be cheaper, but may be complicated to set up and deploy. These sensor devices are also often a permanent installation that is inaccessible and non-spatially reconfigurable.
`
`SUMMARY
`
Embodiments may seek to bring together the advantages of portability, accessibility and/or re-configurability into a single system.

In general terms, in a first aspect the invention proposes a portable wireless sensor that is remotely and locally reconfigurable. This may have the advantage that controlling the device is very simple and user friendly.

In a second aspect the invention proposes an app or user interface that allows configuration of portable wireless sensors. This may have the advantage that the sensors, receivers and/or base station can interactively implement scenarios.

In a third aspect the invention proposes a wearable receiver dongle. This may have the advantage that the dongle can provide a simple and easy alert to specific sensors that have determined an alert condition.

A system may include distributed wireless sensor nodes that have tangible input user interfaces, a central data collection/management unit (base station) and wireless receiver devices that are able to receive notification on the status of sensor nodes. Sound (acquired through a microphone) may be used as an input to the sensor nodes.
Embodiments may have the advantage(s) that:
a personal device can remotely monitor and interact with sensor nodes,
simple and intuitive device for the non-expert users,
network-enabled for multi-devices to support multi-input/multi-output configurations,
spatially reconfigurable,
pairing between sensor nodes and receiver devices by bringing them within close proximity,
easily reconfigurable remote monitoring of sound events on specific objects or locations,
easily reconfigurable remote output triggering on objects or at locations (e.g. sound or light),
intuitive ways to set up sensor nodes to autonomously respond to a sound input event (e.g. sensor input triggers a predefined output),
sound based system to make everyday objects collaborate with each other,
collective input monitoring/capturing may be selected,
collective output may be selected,
input from one object(s) triggering response on other object(s) or vice versa may be selected, and/or
everyday objects may be transformed into sound based input devices that are able to interact with personal digital devices (e.g. cancelling an incoming call by tapping the sensor node placed inside a user's pocket).
In a first and second specific expression of the invention there is provided a distributed wireless sensing device according to claim 1 or 14. In a third specific expression of the invention there is provided a distributed wireless sensing system according to claim 7. Embodiments may be implemented according to any of claims 2 to 6 and 8 to 13.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
One or more example embodiments of the invention will now be described, with reference to the following figures, in which:
`
FIG. 1 is a block diagram of the overall technology;
FIG. 2 is a hardware block diagram of a sensor node;
FIG. 3 is a hardware block diagram of a wireless base station;
FIG. 4 is a hardware block diagram of a receiver dongle;
FIG. 5 is a generalized flow chart of sensor node firmware;
FIG. 6 is a generalized flow chart of base station firmware;
FIG. 7 is a generalized flow chart of receiver dongle firmware;
FIG. 8 is a software application flowchart;
FIG. 9 is a software application flowchart;
FIG. 10 is a software application flowchart;
FIG. 11(a) is a screenshot of setting up a connection to the base station;
FIG. 11(b) is a screenshot of notification from a new sensor node;
FIG. 11(c) is a screenshot of configuring a new sensor node by assigning it a name and a mode. In Alert mode, a user will be alerted with a sound, vibration and a message. In Ambient mode, a user only receives a notification message;
FIG. 11(d) is a screenshot of displaying a history of activities and events and allows reconfiguring of devices;
FIG. 11(e) is a screenshot of setting up a sensor node as an object finder. User records a voice tag to associate it with the sensor node;
FIG. 11(f) is a screenshot of a user giving a voice command to activate sound on the sensor node;
FIGS. 12(a) to (c) are perspective views of a physical design of a sensor node;
FIGS. 12(d) to (f) are schematic drawings of a user physically interacting with the sensor node (d) Turning, (e) Pressing, (f) Shaking;
FIG. 13 is a perspective view of a physical design of a receiver dongle;
FIG. 14 is a schematic drawing of remote monitoring of object specific or location specific events captured by a sensor node;
FIG. 15 is a schematic drawing of remote event triggering on a sensor node;
FIG. 16 is a schematic drawing of autonomous response to sound events by a sensor node;
FIG. 17 are schematic drawings of collaboration between sensor nodes (a) Collective Input (Sound localization example), (b) Collective Output (Stereo/Multi channel sound output), (c) Input from sensor node(s) triggering response on other sensor nodes or vice versa; and
FIG. 18 is a schematic drawing of application as an input device for interacting with digital devices.
`
`DETAILED DESCRIPTION
`
Humans have evolved to use sound (apart from vision) as one of the primary mediums for communication. In addition, humans perceive the world around them through their five senses (sight, sound, smell, touch and taste). Among these five senses, sound is perhaps the most natural 'active' form of two-way communication since humans hear and produce sound naturally. Likewise, for natural 'passive' one-way communication to humans, the senses of sight and sound are perhaps the most efficient in terms of range. Embodiments may seek to enable users to extend their natural sense of sight and sound. Users may be able to 'hear further', 'speak further' and 'communicate' naturally with objects and the environment through a sound and light enabled input/output device distributed within their home or work environment, or even in remote locations.
`
According to a first embodiment shown in FIG. 1, distributed but interconnected (wireless) sensor nodes 102 may be sound based and have tangible user interfaces with input and output capabilities. The sensor nodes may wirelessly connect to other parts of the system 100, including a wireless base station 104 and receiver devices 106 that can either be based on a computing (mobile) device with the associated software applications or a small receiver dongle.
`Hardware
`i. Sensor Node
`As shown in FIG. 2, each sensor node device 102 has an
`inbuilt microphone 202 for capturing sound inputs. This can
`be based on an electret or MEMS microphone. User inputs
`to the sensor node can be based on shaking the device,
`pressing or touching on one or more buttons on the device
`and rotating a knob on the device. The shaking action is
`captured by a 3-axis accelerometer 204. The press or touch
`action can be captured by a mechanical or capacitive switch
`206. A mechanical or magnetic based rotary encoder 208 can
`capture the rotation action.
Sensor nodes provide both auditory and visual output. A diaphragm or MEMS speaker 210 is used to provide auditory output. Visual output can be based on one or more tri-colour LED lights 212 or a more complex display such as an OLED or E-Ink display.

Each sensor node has a wireless radio frequency (RF) module 214 which can be based on 2.4 GHz or sub-GHz frequency. The wireless module is used for exchanging messages between other sensor nodes and with the networking base station. A near field communication (NFC) reader 216 is available on the sensor node for contactless communication to establish pairing with receiver devices.
`
`10
`
`15
`
`20
`
`25
`
`35
`
`40
`
`45
`
`55
`
`60
`
`4
A 32-bit ARM microprocessor 218 is used for interfacing the input, output and wireless modules. The microprocessor should meet the following minimum requirements: processor speed of 48 MHz, 8 KB of RAM and 32 KB of FLASH, and support for SPI, I2C, UART, ADC and GPIO. A rechargeable battery is used to power each sensor node.
For example an ARM Cortex-M4 microprocessor may be used. A MEMS microphone (ADMP401) with analog output is used for measuring audio signals, which is fed into an amplifier circuit. Variable gain control on the amplifier is achieved through a digital potentiometer (MCP4131) acting as a feedback resistor. A mechanical push button switch is used for detecting user press input. A low profile shaft-less rotary encoder is used for detecting the turning gesture from users. A 3-axis accelerometer with analog outputs (ADXL335) is used to detect a shake input from a user. An RGB LED and an 8 Ohm speaker are connected to the output ports of the microcontroller. Wireless connectivity is achieved using a proprietary 2.4 GHz transceiver (nRF24L01+). A contactless communication controller (PN532) is used for data exchange with receiver devices.
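Purely as an illustrative sketch (not part of the original specification): with the MCP4131 wired as the amplifier's feedback resistor, the firmware would adjust microphone gain by writing the potentiometer's volatile wiper register over SPI. The SPI helper functions below are assumed, not taken from the patent.

    #include <stdint.h>

    /* Assumed HAL helpers for the SPI bus the MCP4131 sits on. */
    void spi_select(void);               /* assert the potentiometer's chip select */
    uint8_t spi_transfer(uint8_t byte);  /* exchange one byte over SPI             */
    void spi_deselect(void);             /* release the chip select                */

    /* Set the microphone amplifier gain by moving the MCP4131 wiper (0..128
     * on this 7-bit part); the wiper position sets the feedback resistance. */
    void mic_amp_set_gain(uint8_t wiper)
    {
        if (wiper > 128)
            wiper = 128;
        spi_select();
        spi_transfer(0x00);   /* command byte: volatile wiper 0, write operation */
        spi_transfer(wiper);  /* data byte: new wiper position                   */
        spi_deselect();
    }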
`ii. Wireless Base Station
FIG. 3 shows the wireless base station 104 used to relay messages between sensor nodes, and between sensor nodes and receiver devices. The hardware of the wireless base station consists of a 32-bit ARM microprocessor 302 and wireless RF modules 304. The wireless base station 104 is powered from the mains supply through a regulated power supply (not shown).
For example an ARM Cortex-M4 microprocessor may be used, connected to an nRF24L01+ 2.4 GHz transceiver (with power amplifier and external antenna for improved range) and a Bluetooth module 306. For compatibility with iOS devices and Android devices that support Bluetooth Low Energy (BLE), a BLE112 Bluetooth 4.0 module is used. For compatibility with devices supporting Bluetooth 2.1 and below, an RN-42 Bluetooth module is used.
iii. Receiver Device

The receiver device can be based on a computing (mobile) device with the associated software applications or a receiver dongle. The receiver device can receive notification messages from the wireless base station. In certain hardware configurations, the receiver device can receive notification messages directly from sensor nodes. The function of the receiver device is to inform a user of any sensor trigger events through visual, haptic and/or audio feedback.

The computing (mobile) device is used for communication with the wireless base station. This could include any form of (portable/wearable) computing device that supports the software and hardware requirements. The basic hardware requirements for the device include Bluetooth, Wi-Fi, a display screen, user input capability (capacitive touch or physical buttons) and audio output capabilities (speaker). Software requirements vary with the operating system on the device. For example a mobile phone running on Android 4.0 and above, or iOS version 6.0 (with Bluetooth 4.0 support) and above, may be used.
FIG. 4 shows the receiver dongle 106 which can be used for communication with the wireless base station 104 or with sensor nodes 102. The dongle is based on a 32-bit ARM microprocessor 402 and a wireless RF module 404. In addition, it comes with a mechanical push button switch 406 for user input. Visual output is provided by either a tri-color LED, OLED or E-ink display 408. A vibrating motor 410 is used to provide haptic feedback. An NFC tag 412 is used for contactless information exchange. For example an ARM Cortex-M4 microprocessor may be used, connected to an nRF24L01+ 2.4 GHz transceiver. An OLED display with SSD1306 controller and an LED are used to provide visual output. A 3.7V Li-Polymer battery is used to power the device.
`Firmware
`i. Sensor Node
The firmware running on each sensor node is interrupt based. In order to reduce power consumption and increase the operating time of the sensor node, the device is put to sleep most of the time unless an interrupt occurs. There are three interrupt events that can occur: a user input (from the push button 206, rotary encoder 208 and/or accelerometer 204), a microphone 202 input (exceeding a predefined threshold), and an interrupt from the wireless module 214 (when there is data available to be read).
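The patent states later that this firmware is developed in C, so a minimal sketch of the interrupt-driven sleep structure might look as follows. This is illustrative only: the event flags, ISR names and handler functions are assumptions, and real vector names depend on the ARM part used.

    #include <stdint.h>

    #define EVT_USER   0x01u  /* push button 206, rotary encoder 208 or accelerometer 204 */
    #define EVT_MIC    0x02u  /* microphone 202 exceeded the predefined threshold         */
    #define EVT_RADIO  0x04u  /* wireless module 214 has data available to be read        */

    static volatile uint8_t wake_events;  /* set in ISRs, consumed in the main loop */

    void handle_user_input(void);   /* hypothetical per-event handlers */
    void handle_mic_trigger(void);
    void handle_radio_data(void);

    void user_input_isr(void)    { wake_events |= EVT_USER;  }
    void mic_threshold_isr(void) { wake_events |= EVT_MIC;   }
    void radio_isr(void)         { wake_events |= EVT_RADIO; }

    int main(void)
    {
        for (;;) {
            while (wake_events == 0u)
                __asm volatile ("wfi");  /* ARM wait-for-interrupt: sleep until an event */
            if (wake_events & EVT_USER)  { handle_user_input();  wake_events &= (uint8_t)~EVT_USER;  }
            if (wake_events & EVT_MIC)   { handle_mic_trigger(); wake_events &= (uint8_t)~EVT_MIC;   }
            if (wake_events & EVT_RADIO) { handle_radio_data();  wake_events &= (uint8_t)~EVT_RADIO; }
        }
    }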
When the device is first switched on or reset, it defaults to configure mode, which is activated by pressing on push button switch 206. As shown in FIG. 5, the first interrupt 502 is based on a user selecting between two further modes of operation by turning the rotary encoder 208. The colour of the LED 212 cycles between red and green to indicate the mode of operation. To activate the selected operation, the user has to press the push button switch 206 on the sensor node. The colour red indicates that the device is in Input mode 504, while green indicates that it is in Output mode 505.
In the Input Mode, the device serves a sound monitoring function using the on-board microphone 202. A user can further adjust the sensitivity of the microphone by turning 506 the rotary encoder 208, in which case the LED 212 changes its brightness accordingly. Whenever the microphone 202 receives a sound exceeding the defined threshold 508, the LED 212 on the sensor node will start blinking 510 and a message containing the ID of the sensor node will be sent to the base station.
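A sketch of what this threshold-interrupt path could look like in C (illustrative only; the message layout, node ID and the radio and LED helpers are assumptions):

    #include <stdint.h>

    #define NODE_ID        0x17u  /* this node's unique ID (example value)       */
    #define MSG_TRIGGERED  0x02u  /* hypothetical "sensor triggered" packet type */

    void radio_send(const uint8_t *buf, uint8_t len);  /* assumed RF driver call */
    void led_blink_start(void);                        /* assumed LED helper     */

    /* Called when the microphone comparator fires, i.e. the amplified signal
     * exceeded the threshold 508 set via the rotary encoder sensitivity. */
    void mic_threshold_event(void)
    {
        uint8_t msg[2] = { MSG_TRIGGERED, NODE_ID };
        led_blink_start();    /* start blinking the LED 212 (step 510)   */
        radio_send(msg, 2u);  /* send this node's ID to the base station */
    }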
In Output Mode, the device becomes a receiver to support an output triggering function. The sensor node waits for an incoming command received through the RF module 214. Upon receiving this command 512, it activates 514 the speaker 210 to produce a beeping tone and also blinks the LED light 212. A user can turn off the alarm and light by pressing 516 the push button switch 206 on the sensor node device. The user can also shake 518 the device to reset to configure mode.
`ii. Wireless Base Station
FIG. 6 shows the firmware 600 on the wireless base station configured to listen to incoming data packets sent either from the sensor node or from the receiver device. A database containing the device ID of sensor nodes and their destination receiver devices is maintained in the server. Battery status of each sensor node is updated regularly in the database. An incoming data packet can be one of 5 possible commands:
a. Pairing notification 602 sent from a sensor node. This data consists of the sensor node device ID and the destination ID of the receiver device
b. Trigger notification 604 from the sensor node. The data consists of the device ID of the sensor node, indicating that the sensor has been triggered
c. A query command 606 from the receiver device to check for sensor trigger and battery status
d. Low battery notification 608 sent from a sensor node
e. Trigger sound 610 on sensor node command from the receiver device
`
For a pairing notification 602, the program updates the database 612 with the sensor node device ID and the destination ID of the receiver. For a trigger notification 604, the program sets a trigger status flag 614 in the database indicating that a particular sensor node has been triggered. For a query command 606, the program retrieves 616 the status flags from the database based on the ID of the receiver device and sends a reply 618 to the receiver to indicate if the sensor node has been triggered or if it is running low on battery power. For a low battery notification 608, the program updates a battery status flag 620 in the database indicating that the sensor node is running low on battery power. For trigger sound 610, the program receives a command 622 to query the ID of the sensor node tagged to the receiver and then issues a command 624 to the sensor node to trigger sound on it.
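This five-way dispatch could be sketched as follows in C (illustrative only; the packet layout and the database/radio helpers are assumptions, and the FIG. 6 reference numerals are reused as arbitrary command codes purely for readability):

    #include <stdint.h>

    enum cmd { PAIR = 602, TRIGGER = 604, QUERY = 606, LOW_BATT = 608, SOUND = 610 };

    typedef struct { uint16_t cmd; uint8_t node_id; uint8_t recv_id; } packet_t;

    /* Assumed database and radio helpers. */
    void    db_set_pair(uint8_t node, uint8_t recv);
    void    db_set_trigger_flag(uint8_t node);
    void    db_set_battery_flag(uint8_t node);
    void    db_get_flags(uint8_t recv, uint8_t *trig, uint8_t *batt);
    uint8_t db_node_for_receiver(uint8_t recv);
    void    radio_reply(uint8_t recv, uint8_t trig, uint8_t batt);
    void    radio_trigger_sound(uint8_t node);

    void base_station_dispatch(const packet_t *p)
    {
        uint8_t trig, batt;
        switch (p->cmd) {
        case PAIR:      db_set_pair(p->node_id, p->recv_id);  break;  /* update 612 */
        case TRIGGER:   db_set_trigger_flag(p->node_id);      break;  /* flag 614   */
        case QUERY:     db_get_flags(p->recv_id, &trig, &batt);       /* query 616  */
                        radio_reply(p->recv_id, trig, batt);  break;  /* reply 618  */
        case LOW_BATT:  db_set_battery_flag(p->node_id);      break;  /* flag 620   */
        case SOUND:     radio_trigger_sound(db_node_for_receiver(p->recv_id));
                        break;                                        /* 622, 624   */
        }
    }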
iii. Receiver Dongle

FIG. 7 shows that the receiver dongle 106 runs on a timer interrupt 702, waking up from sleep mode at regular intervals to query 704 the wireless networking base station for the status of the sensor nodes that it is paired with. It will receive a reply message on the trigger status 706 and battery status 708 of the sensor nodes it is paired with. If the trigger status is set active, the vibrating motor will be pulsed 710 at a regular interval. The vibration will stop only when the user presses the push button switch 712, at which point the OLED display will be turned on for 10 seconds to indicate the ID 714 of the sensor node that has been triggered. The OLED display will be turned off 716 after 10 seconds. If the battery status is set active, the LED light is set to blink 718 at regular intervals. When the user presses the push button switch 712, the LED will be turned off and the OLED display turned on for 10 seconds 720 to show the ID of the sensor node that is low on battery power. After 10 seconds, the display will be turned off 716.
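The dongle's wake-query-indicate cycle might be sketched as below (illustrative only; every helper, and the idea that the base station reply carries the IDs of the affected nodes, are assumptions):

    #include <stdint.h>

    /* Assumed helpers; a returned ID of 0 means "no event pending". */
    void query_base_station(uint8_t *trig_id, uint8_t *batt_id);  /* steps 704-708 */
    void motor_pulse(void);         /* haptic pulse (step 710)       */
    void led_blink(void);           /* low-battery blink (step 718)  */
    void oled_show_id(uint8_t id);  /* show node ID (steps 714, 720) */

    static uint8_t trig_id, batt_id;  /* only accessed from interrupt context */

    /* Timer interrupt 702: wake at a regular interval and poll the base station. */
    void timer_isr(void)
    {
        query_base_station(&trig_id, &batt_id);
        if (trig_id) motor_pulse();  /* keep vibrating until acknowledged */
        if (batt_id) led_blink();
    }

    /* Push button interrupt 712: acknowledge and show the node ID. A 10 second
     * one-shot timer (not shown) then turns the OLED display off (step 716). */
    void button_isr(void)
    {
        if (trig_id)      { oled_show_id(trig_id); trig_id = 0u; }
        else if (batt_id) { oled_show_id(batt_id); batt_id = 0u; }
    }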
For the sensor node and receiver device, the software is standalone, written and compiled specifically for the microprocessor type. For example, the firmware on the sensor node and receiver device is developed in C on a 32 bit ARM microprocessor. But it can be generalized to work on any microcontroller/microprocessor that meets the specified hardware requirements.
`Software Application for Mobile Device
FIG. 8 shows a software application 106 on a mobile device running the Android 4.0 operating system. The software application works with sensor nodes that are configured as input sound monitoring devices or as remote sound (alarm) output devices. In the input sound monitoring mode, the software application waits for incoming messages from the Bluetooth enabled wireless base station 802, which receives its incoming data from a sensor node. This message is checked 804 against a database of stored IDs (unique for each sensor node). If the ID is not present in the database 806, it is recognized as a new device setup 808. As shown in FIG. 9, the user is prompted for a device name assignment 902 and notification type (ambient or alert), shown in FIG. 11(c). For example, a user 'sticks' a sensor node on the door and gives a knock on the door to trigger it for the first time, in which case the software application recognizes this as a new device setup. The user enters a name assignment, e.g. "door". This name assignment is tagged to the ID of the sensor node and is stored in the database 906. Subsequent messages received by the application will generate an appropriate notification 810 (a passive message on the notification bar for the 'ambient' type and a message with vibration and sound for the 'alert' type) to the user (since a match is found in the database) to inform the user of a sound event happening at that particular sensor node.
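Although the application itself targets Android, the ID lookup and notification dispatch it describes can be summarized in a short platform-neutral C sketch (illustrative only; the database layout and UI calls are assumptions):

    #include <stdint.h>

    typedef enum { MODE_AMBIENT, MODE_ALERT } notify_mode_t;

    typedef struct { uint8_t id; char name[16]; notify_mode_t mode; } entry_t;

    static entry_t db[32];  /* known sensor nodes, e.g. { 0x17, "door", MODE_ALERT } */
    static int     db_count;

    void prompt_new_device_setup(uint8_t id);                   /* assumed UI call (808) */
    void post_notification(const char *name, notify_mode_t m);  /* assumed UI call (810) */

    /* Handle one incoming message from the base station (check 804). */
    void on_message(uint8_t node_id)
    {
        for (int i = 0; i < db_count; i++) {
            if (db[i].id == node_id) {
                post_notification(db[i].name, db[i].mode);  /* ambient or alert */
                return;
            }
        }
        prompt_new_device_setup(node_id);  /* ID not in database: new device (806, 808) */
    }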
The software application is shown in FIG. 10 in an output triggering mode. The user initiates a setup to link a sensor node to the software. The mobile phone application receives a setup prompt 1002, in which it opens up a voice input request 1004. The user assigns a voice tag (converted to text using a speech to text engine) to the ID of the sensor node, which is then stored in the database 1006. Whenever the user needs to remotely trigger an alarm output on a specific sensor node, he or she opens the mobile phone application and issues a voice command, which will be searched through the database for a matching voice tag and its corresponding device ID. When a valid voice tag is found, the software application sends a message to the wireless base station which, in turn, routes the message to the corresponding sensor node (based on the unique node ID) to activate the alarm on it.
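The voice-tag search could similarly be sketched as a table lookup (illustrative only; the tag table and transport call are assumptions):

    #include <stdint.h>
    #include <string.h>

    typedef struct { char tag[32]; uint8_t node_id; } voice_entry_t;

    static voice_entry_t tags[32];  /* voice tag -> node ID, stored at setup (1006) */
    static int           tag_count;

    void send_trigger_to_base_station(uint8_t node_id);  /* assumed transport call */

    /* 'text' is the speech-to-text result of the user's voice command.
     * Returns 0 on success, -1 if no matching voice tag is in the database. */
    int trigger_by_voice(const char *text)
    {
        for (int i = 0; i < tag_count; i++) {
            if (strcmp(tags[i].tag, text) == 0) {
                send_trigger_to_base_station(tags[i].node_id);
                return 0;
            }
        }
        return -1;
    }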
The software application on the mobile computational device can be written for various platforms including (but not limited to): Android OS, iOS, MeeGo, Symbian and Windows Mobile.
`
`8
it is attached to. Also, there is only one push button switch, which can be triggered from either side of the sensor node. The LED will light up and blink when it detects sound, regardless of whether it is monitoring an object or a space. It will also start blinking when a user remotely triggers sound on the sensor node.

The visual feedback may be in the form of a single LED or any form of visual display such as an OLED display screen or an E-Ink display.

The (directional) microphone may be oriented such that its main receiving lobe is toward the front face of the sen