(12) Patent Application Publication        (10) Pub. No.: US 2012/0203491 A1
     Sun et al.                            (43) Pub. Date: Aug. 9, 2012

(54) METHOD AND APPARATUS FOR PROVIDING CONTEXT-AWARE CONTROL OF SENSORS AND SENSOR DATA

(75) Inventors: Feng-Tso Sun, Palo Alto, CA (US); Cynthia Kuo, Mountain View, CA (US); Raja Bose, Mountain View, CA (US)

(73) Assignee: Nokia Corporation, Espoo (FI)

(21) Appl. No.: 13/020,631

(22) Filed: Feb. 3, 2011

Publication Classification

(51) Int. Cl.: G06F 19/00 (2011.01)
(52) U.S. Cl.: 702/108

(57) ABSTRACT

An approach is provided for context-aware control of sensors and sensor data. A sensor manager determines context information based, at least in part, on one or more sensors. The sensor manager also determines resource consumption information associated with one or more other sensors, one or more functions of the one or more other sensors, or a combination thereof. The sensor manager then processes and/or facilitates a processing of the context information and the resource consumption information to determine at least one operational state associated with the one or more other sensors, the one or more functions of the one or more other sensors, or a combination thereof.

[Representative drawing: SENSOR MANAGER 109 comprising CONTROL LOGIC 201, SENSOR INTERFACE 203, CONTEXT MODULE 205, RESOURCE MODULE 207, STATE DETERMINATION MODULE 211, and SENSOR DATABASE 209, coupled to SENSOR GROUP 103 containing SENSORS 105a, 105b, and 105c.]
`
`
`
`
[FIG. 1 (Sheet 1 of 15) — System 100 including sensor group 103 with sensors 105a and 105b and a sensor manager 109c; remaining drawing detail not reproduced.]
`
`
`
[FIG. 2 (Sheet 2 of 15) — Sensor manager 109 comprising control logic 201, sensor interface 203, context module 205, resource module 207, and state determination module 211, coupled to sensor group 103 with sensors 105a, 105b, and 105c.]
`
`
`
[FIG. 3 (Sheet 3 of 15) — Flowchart of process 300 (steps 301-313): determine context information from a first sensor; determine resource consumption information associated with a second sensor; process the context and resource consumption information to determine a state of the second sensor or its functions; process the context and resource consumption information to determine a schedule of functions for the second sensor; determine whether to perform functions at the sensor, device, and/or service; and optionally monitor context information and resource consumption before ending.]
`
`
`
[FIG. 4A (Sheet 4 of 15) — Framework for context-aware control of health and wellness sensors; drawing not reproduced.]
`
`
`
[FIG. 4B (Sheet 5 of 15) — Flowchart of a process for context-aware control of health and wellness sensors; drawing not reproduced.]
`
`
`
[FIG. 5A (Sheet 6 of 15) — Process for context-aware control of sensors and sensor data with the phone as master node; drawing not reproduced.]
`
`
`
[FIG. 5B (Sheet 7 of 15) — Continuation of the device-as-master process of FIGS. 5A-5C; drawing not reproduced.]
`
`
`
[FIG. 5C (Sheet 8 of 15) — Continuation of the device-as-master process of FIGS. 5A-5C; drawing not reproduced.]
`
`
`
[FIG. 6A (Sheet 9 of 15) — Process for context-aware control of sensors and sensor data with the sensor as master node; drawing not reproduced.]
`
`
`
[FIG. 6B (Sheet 10 of 15) — Continuation of the sensor-as-master process of FIGS. 6A-6C; drawing not reproduced.]
`
`
`
[FIG. 6C (Sheet 11 of 15) — ECG sensor state diagram; drawing not reproduced.]
`
`
`
[FIG. 7 (Sheet 12 of 15) — Sensor system configuration user interface (reference numerals 701-707) with settings for energy profile (low/medium/high), sensor activity threshold (low/medium/high), and data destination (e.g., "Dr. Smith's office").]
`
`
`
[FIG. 8 (Sheet 13 of 15) — Hardware that can be used to implement an embodiment of the invention; drawing not reproduced.]
`
`
`
[FIG. 9 (Sheet 14 of 15) — Chip set that can be used to implement an embodiment of the invention; drawing not reproduced.]
`
`
`
[FIG. 10 (Sheet 15 of 15) — Mobile terminal (e.g., handset) that can be used to implement an embodiment of the invention; drawing not reproduced.]
`
`
`
`
`METHOD AND APPARATUS FOR
`PROVIDING CONTEXT-AWARE CONTROL
`OF SENSORS AND SENSOR DATA
`
BACKGROUND

[0001] Service providers (e.g., wireless, cellular, etc.) and device manufacturers are continually challenged to deliver value and convenience to consumers by, for example, providing compelling network services. One area of development has been the integration of sensors for determining contextual information for use in network services to enable such services to be, for instance, context-aware. For example, context-aware systems use knowledge about a user's current situation to tailor system services, functions, content, etc. in a situationally-appropriate manner based on data collected from one or more sensors. These sensors may include health and wellness sensors such as electrocardiograph (ECG) sensors, photoplethysmograph (PPG) sensors, galvanic skin response (GSR) sensors, and the like. As the use of such sensors becomes more common, service providers and device manufacturers face significant challenges in enabling the sensors to operate continuously for prolonged periods, particularly when the sensors operate on limited battery power.
`
SOME EXAMPLE EMBODIMENTS

[0002] Therefore, there is a need for an approach for providing context-aware control of sensors and sensor data while maximizing, for instance, energy efficiency and data quality.
[0003] According to one embodiment, a method comprises determining context information based, at least in part, on one or more sensors. The method also comprises determining resource consumption information associated with one or more other sensors, one or more functions of the one or more other sensors, or a combination thereof. The method further comprises processing and/or facilitating a processing of the context information and the resource consumption information to determine at least one operational state associated with the one or more other sensors, the one or more functions of the one or more other sensors, or a combination thereof.
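By way of a non-limiting illustration of this embodiment, the determining and processing steps can be sketched in Python as follows; the class names, the context and resource representations, and the decision thresholds are illustrative assumptions rather than a definitive implementation of the claimed method.

from dataclasses import dataclass
from enum import Enum


class OperationalState(Enum):
    # Illustrative operational states for a controlled sensor.
    ENABLED = "enabled"
    REDUCED_SAMPLING = "reduced_sampling"
    DISABLED = "disabled"


@dataclass
class ContextInfo:
    # Context information determined from one or more sensors,
    # e.g., an activity level derived from an accelerometer.
    activity_level: str  # "low", "medium", or "high"


@dataclass
class ResourceInfo:
    # Resource consumption information associated with one or more other sensors.
    battery_fraction_remaining: float  # 0.0 .. 1.0
    sampling_power_mw: float           # average power draw while sampling


def determine_operational_state(context: ContextInfo,
                                resources: ResourceInfo) -> OperationalState:
    # Process the context and resource consumption information to determine
    # at least one operational state (hypothetical rule): suspend the sensor
    # during high activity or when its battery is nearly depleted, reduce the
    # sampling rate when resources are constrained, and otherwise keep it on.
    if context.activity_level == "high" or resources.battery_fraction_remaining < 0.05:
        return OperationalState.DISABLED
    if resources.battery_fraction_remaining < 0.25 or resources.sampling_power_mw > 50.0:
        return OperationalState.REDUCED_SAMPLING
    return OperationalState.ENABLED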
[0004] According to another embodiment, an apparatus comprises at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause, at least in part, the apparatus to determine context information based, at least in part, on one or more sensors. The apparatus is also caused to determine resource consumption information associated with one or more other sensors, one or more functions of the one or more other sensors, or a combination thereof. The apparatus is further caused to process and/or facilitate a processing of the context information and the resource consumption information to determine at least one operational state associated with the one or more other sensors, the one or more functions of the one or more other sensors, or a combination thereof.
[0005] According to another embodiment, a computer-readable storage medium carries one or more sequences of one or more instructions which, when executed by one or more processors, cause, at least in part, an apparatus to determine context information based, at least in part, on one or more sensors. The apparatus is also caused to determine resource consumption information associated with one or more other sensors, one or more functions of the one or more other sensors, or a combination thereof. The apparatus is further caused to process and/or facilitate a processing of the context information and the resource consumption information to determine at least one operational state associated with the one or more other sensors, the one or more functions of the one or more other sensors, or a combination thereof.
[0006] According to another embodiment, an apparatus comprises means for determining context information based, at least in part, on one or more sensors. The apparatus also comprises means for determining resource consumption information associated with one or more other sensors, one or more functions of the one or more other sensors, or a combination thereof. The apparatus further comprises means for processing and/or facilitating a processing of the context information and the resource consumption information to determine at least one operational state associated with the one or more other sensors, the one or more functions of the one or more other sensors, or a combination thereof.
[0007] In addition, for various example embodiments of the invention, the following is applicable: a method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on (including derived at least in part from) any one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0008] For various example embodiments of the invention, the following is also applicable: a method comprising facilitating access to at least one interface configured to allow access to at least one service, the at least one service configured to perform any one or any combination of network or service provider methods (or processes) disclosed in this application.
[0009] For various example embodiments of the invention, the following is also applicable: a method comprising facilitating creating and/or facilitating modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based, at least in part, on data and/or information resulting from one or any combination of methods or processes disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0010] For various example embodiments of the invention, the following is also applicable: a method comprising creating and/or modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based at least in part on data and/or information resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0011] In various example embodiments, the methods (or processes) can be accomplished on the service provider side or on the mobile device side or in any shared way between service provider and mobile device with actions being performed on both sides.
`
`
[0012] For various example embodiments, the following is applicable: An apparatus comprising means for performing the method of any of originally filed claims 1-10, 21-30, and 46-48.
[0013] Still other aspects, features, and advantages of the invention are readily apparent from the following detailed description, simply by illustrating a number of particular embodiments and implementations, including the best mode contemplated for carrying out the invention. The invention is also capable of other and different embodiments, and its several details can be modified in various obvious respects, all without departing from the spirit and scope of the invention. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
`
BRIEF DESCRIPTION OF THE DRAWINGS

[0014] The embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings:
[0015] FIG. 1 is a diagram of a system capable of providing context-aware control of sensors and sensor data, according to one embodiment;
[0016] FIG. 2 is a diagram of the components of a sensor manager, according to one embodiment;
[0017] FIG. 3 is a flowchart of a process for providing context-aware control of sensor data, according to one embodiment;
[0018] FIG. 4A is a diagram of a framework for context-aware control of health and wellness sensors, according to one embodiment;
[0019] FIG. 4B is a flowchart of a process for context-aware control of health and wellness sensors, according to one embodiment;
[0020] FIGS. 5A-5C are diagrams of a process for context-aware control of sensors and sensor data wherein a device acts as a master of the process, according to various embodiments;
[0021] FIGS. 6A-6C are diagrams of a process for context-aware control of sensors and sensor data wherein a sensor acts as a master of the process, according to various embodiments;
[0022] FIG. 7 is a diagram of a user interface utilized in the processes of FIGS. 1-6C, according to one embodiment;
[0023] FIG. 8 is a diagram of hardware that can be used to implement an embodiment of the invention;
[0024] FIG. 9 is a diagram of a chip set that can be used to implement an embodiment of the invention; and
[0025] FIG. 10 is a diagram of a mobile terminal (e.g., handset) that can be used to implement an embodiment of the invention.
`
DESCRIPTION OF SOME EMBODIMENTS

[0026] Examples of a method, apparatus, and computer program for providing context-aware control of sensors and sensor data are disclosed. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It is apparent, however, to one skilled in the art that the embodiments of the invention may be practiced without these specific details or with an equivalent arrangement. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.
[0027] Although various embodiments are discussed with respect to health and wellness sensors, it is contemplated that embodiments of the approach described herein are applicable to any type of sensor, including environmental sensors, sensors for physical properties, material sensors, location sensors, etc.
[0028] FIG. 1 is a diagram of a system capable of providing context-aware control of sensors and sensor data, according to one embodiment. As discussed above, the contextual awareness of a system or service is often based on sensor data. For example, possible sensors that may be associated with devices (e.g., mobile devices such as cell phones, smart phones, etc.) include location sensors (e.g., Global Positioning System (GPS) sensors), light sensors, proximity sensors, accelerometers, gyroscopes, etc.
[0029] Within the context of systems for supporting health and wellness services and/or applications, possible sensors include electrocardiograph (ECG) sensors, photoplethysmograph (PPG) sensors, galvanic skin response (GSR) sensors, electroencephalograph (EEG) sensors, electromyography (EMG) sensors, and the like. In one embodiment, the health and wellness sensors support body sensor network (BSN) technologies that offer opportunities for monitoring physiological signals with wearable sensors in a mobile environment. For example, ECG-based wearable sensors enable continuous or substantially continuous monitoring of emotion and/or cardiovascular disease.
[0030] In one embodiment, such monitoring is used to support pervasive healthcare, which has drawn attention in research communities such as ubiquitous computing, bioengineering, and medical informatics because of the potential for the monitoring to provide longitudinal and quantitative personal data collection. The reliability and continuous nature of such monitoring is one key element in a program to maintain user wellness. As noted, a main component to support pervasive healthcare is a BSN system. In one embodiment, a BSN system includes use of wireless sensor nodes with smaller size, longer battery life, and powerful computing capabilities.
[0031] However, the operating lifetime of the physiological sensor is a key challenge in continuous monitoring design. More specifically, sensors may potentially require a significant amount of battery power (relative to the capacity of a battery on a small device) to operate continuously. Accordingly, extending and optimizing battery life (e.g., reducing energy consumption) is a significant challenge for service providers and device manufacturers. In other words, in order to offer continuous monitoring and real-time or substantially real-time collection and analysis of sensor data, the BSN and its sensors need sufficient efficiency with respect to energy consumption to sense, transmit, and/or process the sensor data stream. For example, a wearable ECG sensor for stress detection cannot function effectively if battery life is limited to only a few hours. In particular, limited battery life and/or inefficient use of available energy reserves (e.g., battery life) can be further exacerbated by high data rate physiological sensors or heavy use of wireless transceivers to transmit the data from the sensors. In other cases, reducing energy consumption by the sensors also enables smaller, lighter, and more wearable sensor designs.
[0032] To address these problems, a system 100 of FIG. 1 introduces the capability of using context information (e.g., sensor data) detected or otherwise collected at one or more sensors to determine an operational state of one or more other sensors (e.g., health and wellness sensors) or one or more functions of the one or more other sensors. As used herein, an operational state refers to an operating condition (e.g., enabled or disabled) and/or one or more operating parameters (e.g., sampling rate, sampling start or end, sampling parameters, etc.). In one embodiment, the operational state is determined to reduce resource consumption (e.g., energy consumption, bandwidth consumption, processing consumption, etc.) by the one or more other sensors. In this way, resources can be conserved to prolong the operational life or time of the sensors before one or more of the resources has to be replenished (e.g., recharging or replacing a sensor's battery).
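As a purely illustrative sketch of such an operational state (an operating condition together with operating parameters), one possible Python representation is shown below; the field names and units are assumptions, not part of the disclosure.

from dataclasses import dataclass
from typing import Optional


@dataclass
class SensorOperationalState:
    # Operating condition plus operating parameters for one controlled sensor.
    enabled: bool                      # operating condition: enabled or disabled
    sampling_rate_hz: Optional[float]  # operating parameter: sampling rate
    sampling_start_s: Optional[float]  # operating parameter: when sampling starts
    sampling_end_s: Optional[float]    # operating parameter: when sampling ends


# Example: a reduced-consumption state that samples at 64 Hz for the first
# 30 seconds of each minute (values chosen only for illustration).
LOW_POWER_STATE = SensorOperationalState(
    enabled=True, sampling_rate_hz=64.0, sampling_start_s=0.0, sampling_end_s=30.0)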
[0033] In one embodiment, the one or more functions can be related to, for instance, on-node data collection, data processing, data transmission, and related operations. For example, depending on the context information and information on energy consumption or availability, one or more of the functions can be performed at the sensor itself, transmitted to an associated device (e.g., a mobile device) for processing, transmitted to a related service (e.g., a backend service) for processing, or some combination. In one embodiment, the determination of whether to perform on-node (e.g., on-sensor) functions can be based, at least in part, on a comparison of the energy costs associated with performing the function at the node versus the energy costs associated with transmitting the data to another device or service for processing. In most cases, the energy or resource costs of transmitting outweigh the resource burden of on-node processing. Accordingly, the system 100 can exploit the on-node processing capabilities of a sensor to reduce overall resource or energy consumption and prolong the operational life of the sensor.
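The comparison described above can be sketched as follows; the per-byte energy figures, the compression ratio, and the function name are hypothetical and would in practice depend on the radio, the processing task, and the sensor hardware.

def choose_processing_location(raw_bytes: int,
                               on_node_energy_per_byte_uj: float,
                               tx_energy_per_byte_uj: float,
                               processed_size_fraction: float) -> str:
    # Compare the energy cost of processing on the sensor node (and then
    # transmitting the smaller processed result) against transmitting the raw
    # data to a device or backend service for processing.
    on_node_cost = (raw_bytes * on_node_energy_per_byte_uj
                    + raw_bytes * processed_size_fraction * tx_energy_per_byte_uj)
    offload_cost = raw_bytes * tx_energy_per_byte_uj
    return "on-node" if on_node_cost <= offload_cost else "device/service"


# Example: transmission costs ten times as much per byte as local processing,
# and local processing shrinks the payload to 10% of its raw size, so the
# on-node option wins.
print(choose_processing_location(4096, 0.5, 5.0, 0.10))  # -> "on-node"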
[0034] In one embodiment, in the context of health and wellness sensors (e.g., a wearable ECG sensor), the system 100 can determine context information at another sensor or sensors (e.g., an accelerometer, gyroscope, compass, etc.) to determine when to enable or disable one or more of the health and wellness sensors (e.g., an ECG sensor) and/or their functions to conserve resources. For example, many health and wellness sensors measure physiological characteristics of a user. Historically, these measurements have not been accurate if the measurement is taken while the user is moving or engaged in some level of physical activity. Accordingly, in one embodiment, the system 100 uses an individual's physical activity level to boost the accuracy of sensor data interpretation as well as to reduce energy consumption by turning the physiological sensor off or otherwise restricting its functions under conditions (e.g., high levels of movement) when the collected data would not be accurate.
[0035] For example, assuming the user is wearing a first sensor or group of sensors that capture acceleration and a second sensor or group of sensors that capture physiological data such as heart rate signals, the system 100 determines the user's physical activity using the accelerometer data. In one embodiment, the physical activity level is categorized in descriptive terms such as "low," "medium," and "high." In addition or alternatively, the physical activity level can be described using a numerical metric or other ordinal scale. In either case, during vigorous physical activity, physiological sensor data can be unreliable, as the activity introduces motion artifacts. Thus, under this context (e.g., high physical activity), the system 100 stops collecting and/or processing data at the physiological sensor or sensors while the user is active. Using the context information collected at the first sensor or group of sensors (e.g., the accelerometer data) to stop data collection and/or processing at the second sensor or group of sensors enables the system 100 to: (1) save resources (e.g., battery life of the sensor), and (2) increase the accuracy of the sensor data analysis by avoiding collecting data when artifacts can reduce the quality of the data.
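A minimal sketch of this categorization, assuming the activity level is derived from the variability of the accelerometer magnitude over a short window (the thresholds and function names are illustrative, not the disclosed algorithm):

import math
from typing import List, Tuple

# Hypothetical thresholds (in g) on the standard deviation of the acceleration
# magnitude over a short window; a real system would calibrate these.
MEDIUM_THRESHOLD_G = 0.05
HIGH_THRESHOLD_G = 0.20


def activity_level(samples: List[Tuple[float, float, float]]) -> str:
    # Categorize physical activity as "low", "medium", or "high" using the
    # variability of the accelerometer magnitude as a simple movement proxy.
    if not samples:
        return "low"
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(magnitudes) / len(magnitudes)
    std = math.sqrt(sum((m - mean) ** 2 for m in magnitudes) / len(magnitudes))
    if std >= HIGH_THRESHOLD_G:
        return "high"
    if std >= MEDIUM_THRESHOLD_G:
        return "medium"
    return "low"


def should_collect_physiological_data(samples: List[Tuple[float, float, float]]) -> bool:
    # Stop ECG/PPG collection during vigorous activity to avoid motion
    # artifacts and to save energy; resume when the activity level drops.
    return activity_level(samples) != "high"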
[0036] As shown in FIG. 1, the system 100 includes a user equipment (UE) 101 with connectivity to at least one sensor group 103 including sensors 105a (e.g., a first sensor) and 105b (e.g., a second sensor). In one embodiment, the sensor group 103 constitutes a wearable sensor in which multiple sensors (e.g., sensors 105a and 105b) are included to provide additional functionality. For example, as described above, the sensor group 103 may include a combination of an accelerometer (e.g., sensor 105a) and a physiological sensor (e.g., sensor 105b) such as an ECG sensor. As shown, the UE 101 also has connectivity to a standalone sensor 105c that can operate independently or in coordination with the sensor group 103 or other sensor groups or sensors. In one embodiment, the sensor group 103 and/or the sensors 105a-105c (also collectively referred to as sensors 105) may comprise a BSN. By way of example, connectivity between the UE 101 and the sensor group 103 and the sensors 105a-105c can be facilitated by short range wireless communications (e.g., Bluetooth, Wi-Fi, ANT/ANT+, ZigBee, etc.).
[0037] In addition, the UE 101 can execute an application 107 that is a software client for storing, processing, and/or forwarding the sensor data to other components of the system 100. In one embodiment, the application 107 may include a sensor manager 109a for performing functions related to providing context-aware control of the sensor group 103 and/or the sensors 105a-105c as discussed with respect to the various embodiments of the approach described herein. In addition or alternatively, it is contemplated that the UE 101 may include a standalone sensor manager 109b that operates independently of the application 107, and that the sensors themselves may include a sensor manager 109c (e.g., as shown with respect to sensor 105b).
[0038] As shown in FIG. 1, the UE 101 has connectivity via a communication network 111 to a service platform 113 which includes one or more services 115a-115n (also collectively referred to as services 115) (e.g., a health and wellness service or any other service that can use contextually aware sensor information), and one or more content providers 117a-117m (also collectively referred to as content providers 117) (e.g., online content retailers, public databases, etc.). In one embodiment, the sensors 105a-105c, the sensor managers 109a-109c (also collectively referred to as sensor managers 109), and/or the application 107 can transmit sensor data to the service platform 113, the services 115a-115n, and/or the content providers 117a-117m for storage, processing, and/or further transmission.
[0039] In one sample use case, a user wears the sensor group 103 and/or the sensors 105a-105c for continuous monitoring and collection of sensor data (e.g., for continuous ECG monitoring). For such ECG monitoring, in an ideal case, the user wearing a sensor is stationary when a measurement is taken to reduce potential movement artifacts in the data. For example, the sensor group 103 transmits accelerometer and ECG information to the UE 101 at periodic intervals. The UE 101 (e.g., via the application 107 and/or the sensor manager 109b) stores the data temporarily, performs any needed processing and aggregation, and sends the data to one or more of the services 115 at periodic intervals. In one embodiment, the data sent includes, at least in part, timestamps, sensor data (e.g., physiological data), and/or context information (e.g., activity level determined from the accelerometer data).
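For illustration, the batch sent from the UE 101 to a service at each interval might be structured as follows; the field names and the JSON encoding are assumptions, since the disclosure states only that timestamps, sensor data, and context information are included.

import json
import time
from dataclasses import asdict, dataclass, field
from typing import List


@dataclass
class SensorReport:
    # One batch forwarded from the UE to a service at a periodic interval.
    device_id: str
    timestamp: float = field(default_factory=time.time)
    activity_level: str = "low"                                # context information
    ecg_samples_mv: List[float] = field(default_factory=list)  # sensor data


def serialize_for_upload(report: SensorReport) -> bytes:
    # Aggregate and encode the report before sending it to the service platform.
    return json.dumps(asdict(report)).encode("utf-8")


payload = serialize_for_upload(
    SensorReport(device_id="ue-101", activity_level="low",
                 ecg_samples_mv=[0.12, 0.10, 0.15]))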
`
`
[0040] When the context information (e.g., accelerometer data) indicates movement of the sensor group 103 and/or movement of the user wearing the sensor group 103 above a predetermined threshold, the sensor manager 109 will, for instance: (1) turn off the sensor 105 collecting the data; (2) transmit an indicator that activity levels are high and that no data will be collected; and/or (3) log or store the activity levels in the sensor manager 109's memory, such as a flash memory of the sensor 105. This decreases the amount of data transferred to the UE 101 and to the corresponding service 115, thereby extending both the sensor 105's and the UE 101's operational capacities (e.g., battery lives) while also removing potentially noisy data (e.g., motion artifacts) from the data set. In one embodiment, the sensor manager 109 processes the context information to recognize simple and/or coarse-grained daily activities (e.g., sitting, standing, walking, etc.) to optimize the energy consumption of the sensors 105.
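The three reactions listed above can be sketched as follows; the callback-based structure and the names are assumptions made only for the sketch.

from typing import Callable, List


def handle_activity_reading(activity: float,
                            threshold: float,
                            turn_off_sensor: Callable[[], None],
                            send_indicator: Callable[[str], None],
                            activity_log: List[float]) -> None:
    # React when the context information exceeds a predetermined threshold:
    # (1) turn off the collecting sensor, (2) transmit an indicator that
    # activity is high and no data will be collected, and (3) log the
    # activity level locally (e.g., to the sensor's flash memory).
    if activity <= threshold:
        return
    turn_off_sensor()
    send_indicator("activity high; data collection suspended")
    activity_log.append(activity)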
[0041] It is noted that although various embodiments discuss context information as motion or movement information, it is contemplated that the context information may relate to any operational parameter corresponding to the sensor 105 that is performing the data collection. For example, if the data collecting sensor 105 is an ECG sensor, the context information may also include parameters related to oxygenation levels in the blood, heart rate, galvanic skin response, or a combination of the parameters.
[0042] By way of example, the communication network 111 of system 100 includes one or more networks such as a data network (not shown), a wireless network (not shown), a telephony network (not shown), or any combination thereof. It is contemplated that the data network may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), a public data network (e.g., the Internet), short range wireless network, or any other suitable packet-switched network, such as a commercially owned, proprietary packet-switched network, e.g., a proprietary cable or fiber-optic network, and the like, or any combination thereof. In addition, the wireless network may be, for example, a cellular network and may employ various technologies including enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (Wi-Fi), wireless LAN (WLAN), Bluetooth®, Internet Protocol (IP) data casting, satellite, mobile ad-hoc network (MANET), and the like, or any combination thereof.
[0043] The UE 101 is any type of mobile terminal, fixed terminal, or portable terminal including a mobile handset, station, unit, device, multimedia computer, multimedia tablet, Internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system (PCS) device, personal navigation device, personal digital assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, game device, or any combination thereof, including the accessories and peripherals of these devices, or any combination thereof. It is also contemplated that the UE 101 can support any type of interface to the user (such as "wearable" circuitry, etc.).
[0044] By way of example, the UE 101, the sensor group 103, the sensors 105, the application 107, and the service platform 113 communicate with each other and other components of the communication network 111 using well known, new or still developing protocols. In this context, a protocol includes a set of rules defining how the network nodes within the communication network 111 interact with each other based on information sent over the communication links. The protocols are effective at different layers of operation within each node, from generating and receiving physical signals of various types, to selecting a link for transferring those signals, to the format of information indicated by those signals, to identifying which software application executing on a computer system sends or receives the information. The conceptually different layers of protocols for exchanging information over a network are described in the Open Systems Interconnection (OSI) Reference Model.
[0045] Communications between the network nodes are typically effected by exchanging discrete packets of data. Each packet typically comprises (1) header information associated with a particular protocol, and (2) payload information that follows the header information and contains information that may be processed independently of that particular protocol. In some protocols, the packet includes (3) trailer information following the payload and indicating the end of the payload information. The header includes