AD HOC AND SENSOR NETWORKS

A Survey of Mobile Phone Sensing

Nicholas D. Lane, Emiliano Miluzzo, Hong Lu, Daniel Peebles, Tanzeem Choudhury, and Andrew T. Campbell, Dartmouth College
ABSTRACT

Mobile phones or smartphones are rapidly becoming the central computer and communication device in people's lives. Application delivery channels such as the Apple App Store are transforming mobile phones into "app phones," capable of downloading a myriad of applications in an instant. Importantly, today's smartphones are programmable and come with a growing set of cheap, powerful embedded sensors, such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera, which are enabling the emergence of personal, group, and community-scale sensing applications. We believe that sensor-equipped mobile phones will revolutionize many sectors of our economy, including business, healthcare, social networks, environmental monitoring, and transportation. In this article we survey existing mobile phone sensing algorithms, applications, and systems. We discuss the emerging sensing paradigms and formulate an architectural framework for discussing a number of the open issues and challenges emerging in the new area of mobile phone sensing research.
INTRODUCTION

Today's smartphone not only serves as the key mobile computing and communication device of choice, but also comes with a rich set of embedded sensors, such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera. Collectively, these sensors are enabling new applications across a wide variety of domains, such as healthcare [1], social networks [2], safety, environmental monitoring [3], and transportation [4, 5], and give rise to a new area of research called mobile phone sensing.

Until recently, mobile sensing research such as activity recognition, where people's activity (e.g., walking, driving, sitting, talking) is classified and monitored, required specialized mobile devices (e.g., the Mobile Sensing Platform [MSP]) [6] to be fabricated [7]. Mobile sensing applications had to be manually downloaded, installed, and hand-tuned for each device. User studies conducted to evaluate new mobile sensing applications and algorithms were small-scale because of the expense and complexity of doing experiments at scale. As a result the research, while innovative, gained little momentum outside a small group of dedicated researchers. Although the potential of using mobile phones as a platform for sensing research has been discussed for a number of years now, in both industrial [8] and research communities [9, 10], there has been little or no advancement in the field until recently.
All that is changing because of a number of important technological advances. First, the availability of cheap embedded sensors, initially included in phones to drive the user experience (e.g., the accelerometer used to change the display orientation), is changing the landscape of possible applications. Phones can now be programmed to support new, disruptive sensing applications such as sharing the user's real-time activity with friends on social networks such as Facebook, keeping track of a person's carbon footprint, or monitoring a user's well-being. Second, smartphones are open and programmable. In addition to sensing, phones come with computing and communication resources that offer a low barrier of entry for third-party programmers (e.g., undergraduates with little phone programming experience are developing and shipping applications). Third, and importantly, each phone vendor now offers an app store that allows developers to deliver new applications to large populations of users across the globe, transforming the deployment of new applications and allowing the collection and analysis of data far beyond the scale of what was previously possible. Fourth, the mobile computing cloud enables developers to offload mobile services to back-end servers, providing unprecedented scale and additional resources for computing on collections of large-scale sensor data and supporting advanced features such as persuasive user feedback based on the analysis of big sensor data.

The combination of these advances opens the door for innovative new research and will lead to the development of sensing applications that are likely to revolutionize a large number of existing business sectors and ultimately significantly impact our everyday lives. Many questions remain before this vision becomes a reality. For example, how much intelligence can we push to the phone without jeopardizing the phone experience? What breakthroughs are needed to perform robust and accurate classification of activities and context out in the wild? How do we scale a sensing application from an individual to a target community, or even the general population? How do we use these new forms of large-scale application delivery (e.g., the Apple App Store, Google Market) to best drive data collection, analysis, and validation? How can we exploit the availability of big data shared by applications while building watertight systems that protect personal privacy? While this new research field can leverage results and insights from wireless sensor networks, pervasive computing, machine learning, and data mining, it presents new challenges not addressed by these communities.

0163-6804/10/$25.00 © 2010 IEEE • IEEE Communications Magazine • September 2010
In this article we give an overview of the sensors on the phone and their potential uses. We discuss a number of leading application areas and sensing paradigms that have emerged in the literature recently. We propose a simple architectural framework in order to facilitate the discussion of the important open challenges on the phone and in the cloud. The goal of this article is to bring the novice or the practitioner not working in this field quickly up to date with where things stand.
SENSORS

As mobile phones have matured as a computing platform and acquired richer functionality, these advancements have often been paired with the introduction of new sensors. For example, accelerometers became common after being initially introduced to enhance the user interface and the use of the camera. They are used to automatically determine the orientation in which the user is holding the phone and use that information to automatically re-orient the display between landscape and portrait views, or to correctly orient captured photos during viewing on the phone.
Figure 1 shows the suite of sensors found in the Apple iPhone 4. The phone's sensors include a gyroscope, compass, accelerometer, proximity sensor, and ambient light sensor, as well as other, more conventional devices that can be used to sense: front- and back-facing cameras, a microphone, GPS, and WiFi and Bluetooth radios. Many of the newer sensors were added to support the user interface (e.g., the accelerometer) or augment location-based services (e.g., the digital compass).

Figure 1. An off-the-shelf iPhone 4, representative of the growing class of sensor-enabled phones. This phone includes eight different sensors: accelerometer, GPS, ambient light sensor, dual microphones, proximity sensor, dual cameras, compass, and gyroscope.

The proximity and light sensors allow the phone to perform simple forms of context recognition associated with the user interface. The proximity sensor detects, for example, when the user holds the phone to her face to speak; in this case the touchscreen and keys are disabled, preventing them from accidentally being pressed and saving power because the screen is turned off. Light sensors are used to adjust the brightness of the screen. The GPS, which allows the phone to localize itself, enables new location-based applications such as local search, mobile social networks, and navigation. The compass and gyroscope represent an extension of location, providing the phone with increased awareness of its position in relation to the physical world (e.g., its direction and orientation), enhancing location-based applications.

Not only are these sensors useful in driving the user interface and providing location-based services; they also represent a significant opportunity to gather data about people and their environments. For example, accelerometer data is capable of characterizing the physical movements of the user carrying the phone [2]. Distinct patterns within the accelerometer data can be exploited to automatically recognize different activities (e.g., running, walking, standing). The camera and microphone are powerful sensors; they are probably the most ubiquitous sensors on the planet. By continuously collecting audio from the phone's microphone, for example, it is possible to classify a diverse set of distinctive sounds associated with a particular context or activity in a person's life, such as using an automatic teller machine (ATM), being in a particular coffee shop, having a conversation, listening to music, making coffee, and driving [11]. The camera on the phone can be used for many things, from traditional tasks such as photo blogging to more specialized sensing activities such as tracking the user's eye movement across the phone's display, using the camera mounted on the front of the phone, as a means to activate applications [12]. The combination of accelerometer data and a stream of location estimates from the GPS can recognize the mode of transportation of a user, such as using a bike or car or taking a bus or the subway [3].
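As a rough sketch of how such distinct accelerometer patterns can be mapped to activities, the fragment below extracts two simple time-domain features from a window of three-axis samples and applies a toy decision rule. The feature set and thresholds are illustrative assumptions, not the method of any system cited here; deployed classifiers learn their parameters from labeled data.

```python
import math

def features(window):
    """Simple time-domain features from a window of (x, y, z)
    accelerometer samples in g: mean and variance of the
    acceleration magnitude."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, var

def classify_activity(window):
    """Toy rule: motion intensity (variance of the magnitude)
    roughly separates stationary, walking, and running.
    Thresholds are made up for illustration only."""
    _, var = features(window)
    if var < 0.01:
        return "stationary"
    if var < 0.25:
        return "walking"
    return "running"

# A phone at rest reads about 1 g on one axis.
still = [(0.0, 0.0, 1.0)] * 50
print(classify_activity(still))  # stationary
```

Real systems replace the hand-set thresholds with a classifier (e.g., a decision tree) trained on labeled windows, but the pipeline shape, window, features, then label, is the same.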
More and more sensors are being incorporated into phones. An interesting question is what new sensors we are likely to see over the next few years. Non-phone-based mobile sensing devices such as the Intel/University of Washington Mobile Sensing Platform (MSP) [6] have shown the value of using sensors not found in phones today (e.g., barometer, temperature, and humidity sensors) for activity recognition; for example, the accelerometer and barometer make it easy to identify not only when someone is walking, but when they are climbing stairs and in which direction. Other researchers have studied air quality and pollution [13] using specialized sensors embedded in prototype mobile phones. Still others have embedded sensors in standard mobile phone earphones to read a person's blood pressure [14], or used neural signals from cheap off-the-shelf wireless electroencephalography (EEG) headsets to control mobile phones for hands-free human-phone interaction [36]. At this stage it is too early to say what new sensors will be added to the next generation of smartphones, but as cost and form factor come down and leading applications emerge, we are likely to see more sensors added.

Figure 2. Mobile phone sensing is effective across multiple scales: a single individual (e.g., UbiFit Garden [1]), groups such as social networks or special interest groups (e.g., Garbage Watch [23]), and entire communities or the population of a city (e.g., Participatory Urbanism [20]).
APPLICATIONS AND APP STORES

New classes of applications, which can take advantage of both low-level sensor data and high-level events, context, and activities inferred from mobile phone sensor data, are being explored not only in academic and industrial research laboratories [11, 15–22] but also within startup companies and large corporations. One such example is SenseNetworks, a recent U.S.-based startup, which uses millions of GPS estimates sourced from mobile phones within a city to predict, for instance, which subpopulation or "tribe" might be interested in a specific type of nightclub or bar (e.g., a jazz club). Remarkably, it has taken only a few years for this type of analysis of large-scale location information and mobility patterns to migrate from the research laboratory into commercial use. In what follows we discuss a number of the emerging leading application domains and argue that the new application delivery channels (i.e., app stores) offered by all the major vendors are critical to the success of these applications.
TRANSPORTATION

Traffic remains a serious global problem; congestion alone severely impacts both the environment and human productivity (e.g., hours wasted in traffic). Mobile phone sensing systems such as the MIT VTrack project [4] and the Mobile Millennium project [5] (a joint initiative between Nokia, NAVTEQ, and the University of California at Berkeley) use mobile phones to provide fine-grained traffic information on a large scale, enabling services such as accurate travel time estimation for improved commute planning.
SOCIAL NETWORKING

Millions of people participate regularly in online social networks. The Dartmouth CenceMe project [2] is investigating the use of sensors in the phone to automatically classify events in people's lives, called sensing presence, and to selectively share this presence through online social networks such as Twitter, Facebook, and MySpace, replacing manual actions people now perform daily.
ENVIRONMENTAL MONITORING

Conventional ways of measuring and reporting environmental pollution rely on aggregate statistics that apply to a community or an entire city. The University of California at Los Angeles (UCLA) PEIR project [3] uses sensors in phones to build a system that enables personalized environmental impact reports, which track how the actions of individuals affect both their exposure and their contribution to problems such as carbon emissions.
HEALTH AND WELL-BEING

The information used for personal health care today largely comes from self-report surveys and infrequent doctor consultations. Sensor-enabled mobile phones have the potential to collect continuous in situ sensor data that can dramatically change the way health and wellness are assessed, as well as how care and treatment are delivered. UbiFit Garden [1], a joint project between Intel and the University of Washington, captures levels of physical activity and relates this information to personal health goals when presenting feedback to the user. These types of systems have proven effective in empowering people to curb poor behavior patterns and improve health, for example, by encouraging more exercise.
APP STORES

Attracting a critical mass of users is a common problem faced by those who build systems, developers and researchers alike. Fortunately, modern phones have an effective application distribution channel, first made available by Apple's App Store for the iPhone, that is revolutionizing this new field. Each major smartphone vendor has an app store (e.g., Apple App Store, Android Market, Microsoft Mobile Marketplace, Nokia Ovi). The success of app stores with the public has made it possible for not only startups but also small research laboratories and even individual developers to quickly attract a very large number of users. For example, an early use of app store distribution by researchers in academia is the CenceMe application for the iPhone [2], which was made available on the App Store when it opened in 2008. It is now feasible to distribute and run experiments with a large number of participants from all around the world, rather than with a small user study under laboratory-controlled conditions. Researchers interested in statistical models that interpret human behavior from sensor data have long dreamed of ways to collect such large-scale, real-world data; app stores represent a game changer for these types of research. However, many challenges remain with this new approach to experimentation via app stores. For example, what is the best way to collect ground-truth data to assess the accuracy of algorithms that interpret sensor data? How do we validate experiments? How do we select a good study group? How do we deal with the potentially massive amount of data made available? How do we protect the privacy of users? What is the impact on getting approval for human subject studies from university institutional review boards (IRBs)? How do researchers scale to run such large studies? Researchers used to supporting small numbers of users (e.g., 50 users with mobile phones) now have to construct cloud services that can potentially deal with 10,000 needy users. This is fine if you are a startup, but are academic research laboratories geared to deal with this?
SENSING SCALE AND PARADIGMS

Future mobile phone sensing systems will operate at multiple scales, enabling everything from personal sensing to global sensing. Figure 2 illustrates personal, group, and community sensing: three distinct scales at which mobile phone sensing is currently being studied by the research community. At the same time, researchers are discussing how much the user (i.e., the person carrying the phone) should be actively involved during the sensing activity (e.g., taking the phone out of the pocket to collect a sound sample or take a picture). That is, should the user actively participate, known as participatory sensing [15], or passively participate, known as opportunistic sensing [17]? Each of these sensing paradigms presents important trade-offs. In what follows we discuss the different sensing scales and paradigms.
SENSING SCALE

Personal sensing applications are designed for a single individual and are often focused on data collection and analysis. Typical scenarios include tracking the user's exercise routines and automating diary collection. Typically, personal sensing applications generate data for the sole consumption of the user; the data is not shared with others. An exception is healthcare applications, where limited sharing with medical professionals (e.g., a primary caregiver or specialist) is common. Figure 2 shows UbiFit Garden [1] as an example of a personal wellness application. This personal sensing application adopts persuasive technology ideas to encourage users to reach their personal fitness goals, using the metaphor of a garden that blooms as the user progresses toward those goals.
Individuals who participate in sensing applications that share a common goal, concern, or interest collectively represent a group. Group sensing applications are likely to be popular, reflecting the growing interest in social networks and connected groups (e.g., at work, in the neighborhood, among friends) who may want to share sensing information freely or with privacy protection. There is an element of trust in group sensing applications that simplifies otherwise difficult problems, such as attesting that the collected sensor data is correct, or reducing the degree to which aggregated data must protect the individual. Common use cases include assessing neighborhood safety, sensor-driven mobile social networks, and forms of citizen science. Figure 2 shows GarbageWatch [23] as an example of a group sensing application in which people participate in a collective effort to capture the information needed to improve a recycling program; for example, students use the phone's camera to log the contents of recycling bins across a campus.

Figure 3. Mobile phone sensing architecture, comprising sense, learn, and inform/share/persuasion stages, supported by the mobile computing cloud, big sensor data, and application distribution.
Most examples of community sensing only become useful once a large number of people participate; for example, tracking the spread of disease across a city, the migration patterns of birds, congestion patterns across city roads [5], or a noise map of a city [24]. These applications represent large-scale data collection, analysis, and sharing for the good of the community. Achieving this scale implicitly requires the cooperation of strangers who will not trust each other, which increases the need for community sensing systems with strong privacy protection and low commitment levels from users. Figure 2 shows carbon monoxide readings captured in Ghana, using mobile sensors attached to taxicabs as part of the Participatory Urbanism project [20], as an example of a community sensing application. This project, in conjunction with the N-SMARTs project [13] at the University of California at Berkeley, is developing prototypes that allow similar sensor data to be collected with phone-embedded sensors.

The impact of scaling sensing applications from personal to population scale is unknown. Many issues related to information sharing, privacy, data mining, and closing the loop by providing useful feedback to individuals, groups, communities, and populations remain open. Today we have only limited experience in building scalable sensing systems.

Figure 4. Raw audio data captured from mobile phones is transformed into features, allowing learning algorithms to identify classes of behavior (e.g., driving, in conversation, making coffee) occurring in a stream of sensor data, for example by SoundSense [11].
SENSING PARADIGMS

One issue common to the different sensing scales is to what extent the user is actively involved in the sensing system [12]. We discuss two points in the design space: participatory sensing, where the user actively engages in the data collection activity (i.e., the user manually determines how, when, what, and where to sample), and opportunistic sensing, where the data collection stage is fully automated with no user involvement.

The benefit of opportunistic sensing is that it lowers the burden placed on the user, allowing overall participation by a population of users to remain high even if the application is not personally appealing. This is particularly useful for community sensing, where per-user benefit may be hard to quantify and may only accrue over a long time. However, these systems are often technically difficult to build [25], and a major resource, people, is underutilized. One of the main challenges of opportunistic sensing is the phone context problem; for example, an application may want to take a sound sample for a city-wide noise map only when the phone is out of the pocket or bag. These types of context issues can be solved using the phone's sensors; for example, the accelerometer or light sensors can determine whether the phone is out of the pocket.
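A minimal sketch of such a sensor-based gate, assuming hypothetical `light_lux` and `proximity_near` readings exposed by the platform and an invented threshold: an audio sample for the noise map is taken only when the light level suggests the phone is uncovered and nothing is pressed against the proximity sensor.

```python
def maybe_sample_noise(light_lux, proximity_near, record_audio):
    """Gate a noise-map sample on phone context (illustrative
    heuristic): a pocketed or bagged phone typically reports
    near-zero ambient light and a 'near' proximity reading.
    `record_audio` stands in for the platform's capture call."""
    OUT_OF_POCKET_LUX = 10.0  # made-up threshold
    if light_lux > OUT_OF_POCKET_LUX and not proximity_near:
        return record_audio()
    return None  # wrong context: skip this sample

# Phone face-up on a table in a lit room: the sample is taken.
print(maybe_sample_noise(200.0, False, lambda: "pcm-frame"))  # pcm-frame
```

A real gate would also consult the accelerometer (a phone in a moving pocket shows a distinctive motion signature), but the principle, inferring context from cheap sensors before spending energy on an expensive one, is the same.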
Participatory sensing, which is gaining interest in the mobile phone sensing community, places a higher burden or cost on the user; for example, manually selecting data to collect (e.g., the lowest petrol prices) and then sampling it (e.g., taking a picture). An advantage is that complex operations can be supported by leveraging the intelligence of the person in the loop, who can solve the context problem efficiently; that is, a person who wants to participate in collecting a noise or air quality map of their neighborhood simply takes the phone out of their bag, solving the context problem. One drawback of participatory sensing is that the quality of data depends on participants' enthusiasm to reliably collect sensing data, and on the compatibility of a person's mobility patterns with the intended goals of the application (e.g., collecting pollution samples around schools). Many of these challenges are actively being studied; for example, the PICK project [23] is studying models for systematically recruiting participants.
Clearly, opportunistic and participatory sensing represent extreme points in the design space, and each approach has pros and cons. To date there is too little experience in building large-scale participatory or opportunistic sensing applications to fully understand the trade-offs. Models are needed to better understand the usability and performance issues of these schemes. In addition, it is likely that many applications will emerge that are hybrids of the two sensing paradigms.
MOBILE PHONE SENSING ARCHITECTURE

Mobile phone sensing is still in its infancy, and there is little or no consensus on a sensing architecture for the phone and the cloud. For example, new tools and phone software will be needed to facilitate quick development and deployment of robust context classifiers for the leading phones on the market. Common methods for collecting and sharing data need to be developed. Mobile phones cannot be overloaded with continuous sensing commitments that undermine the performance of the phone (e.g., by depleting battery power). It is not clear which architectural components should run on the phone and which should run in the cloud. For example, some researchers propose that raw sensor data should not be pushed to the cloud because of privacy issues. In the following sections we propose a simple architectural viewpoint for the mobile phone and the computing cloud as a means to discuss the major architectural issues that need to be addressed. We do not argue that this is the best system architecture; rather, it presents a starting point for discussions that we hope will eventually lead to a converging view and move the field forward.

Figure 3 shows a mobile phone sensing architecture that comprises the following building blocks.
SENSE

Individual mobile phones collect raw sensor data from the sensors embedded in the phone.

LEARN

Information is extracted from the sensor data by applying machine learning and data mining techniques. These operations occur either directly on the phone, in the mobile cloud, or with some partitioning between the phone and the cloud. Where these components run could be governed by various architectural considerations, such as privacy, real-time user feedback, communication cost between the phone and the cloud, available computing resources, and sensor fusion requirements. We therefore consider where these components run to be an open issue that requires research.
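One possible partitioning is sketched below purely for illustration: feature extraction always runs on the phone (so raw data never leaves the device, easing privacy and bandwidth concerns), while classification uses a richer cloud model when the cloud is reachable and falls back to a smaller on-phone model otherwise. The function names and the split itself are invented, not a prescribed design.

```python
def extract_features(window):
    """On-phone stage: reduce a raw sample window to a compact
    feature vector (here just mean and peak-to-peak range)."""
    return (sum(window) / len(window), max(window) - min(window))

def infer(window, cloud_reachable, phone_model, cloud_model):
    """Run the learn stage with a simple phone/cloud split."""
    feats = extract_features(window)  # raw data stays on the phone
    if cloud_reachable:
        return cloud_model(feats)     # richer model, trained on big data
    return phone_model(feats)         # smaller fallback, works offline
```

Privacy, feedback latency, communication cost, and sensor fusion would each pull this split in a different direction, which is exactly why component placement remains an open research issue.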
INFORM, SHARE, AND PERSUASION

We bundle a number of important architectural components together because of their commonality or coupling. For example, a personal sensing application will only inform the user, whereas a group or community sensing application may share an aggregate version of the information with the broader population and obfuscate the identity of the users. Other considerations are how to best visualize sensor data for consumption by individuals, groups, and communities. Privacy is a very important consideration as well.
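As a toy illustration of one such sharing policy, the function below releases only an aggregate, and only when enough users contribute to blur any individual's reading. This minimum-group gate is an invented minimal example; real deployments need stronger guarantees (e.g., formal anonymization or differential privacy).

```python
def shared_view(user_readings, min_contributors=5):
    """Release a community aggregate only when at least
    `min_contributors` users contribute, so no individual's
    reading is exposed on its own (illustrative policy only)."""
    if len(user_readings) < min_contributors:
        return None  # too few users: sharing could expose individuals
    return sum(user_readings) / len(user_readings)
```

A personal sensing application would skip this gate entirely and show the user their own data; the gate only matters once information leaves the individual's device.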
While phones will naturally leverage the distributed resources of the mobile cloud (e.g., computation and services offered in the cloud), the computing, communication, and sensing resources on the phones are ever increasing. We believe that as the resources of the phone rapidly expand, one of the main benefits of using the mobile computing cloud will be the ability to compute and mine big data from very large numbers of users. The availability of large-scale data benefits mobile phone sensing in a variety of ways, for example, more accurate interpretation algorithms that are updated based on sensor data sourced from an entire user community. Such data enables the personalization of sensing systems based on the behavior of both the individual user and cliques of people with similar behavior.

In the remainder of the article we present a detailed discussion of the three main architectural components introduced in this section:
• Sense
• Learn
• Inform, share, and persuasion
SENSE: THE MOBILE PHONE AS A SENSOR

As we discussed, the integration of an ever expanding suite of embedded sensors is one of the key drivers of mobile phone applications. However, the limited programmability of the phones and the operating systems that run on them, the dynamic environment presented by user mobility, and the need to support continuous sensing on mobile phones present a diverse set of challenges that the research community needs to address.

PROGRAMMABILITY

Until very recently only a handful of mobile phones could be programmed. Popular platforms such as Symbian-based phones presented researchers with sizable obstacles to building mobile sensing applications [2]. These platforms lacked well-defined, reliable interfaces to access low-level sensors and were not well suited to writing common data processing components, such as signal processing routines, or to performing computationally costly inference, due to the resource constraints of the phone. Early sensor-enabled phones (i.e., prior to the iPhone in 2007), such as the Symbian-based Nokia N80, included an accelerometer, but there were no open application programming interfaces (APIs) to access the sensor signals; phone vendors initially included accelerometers only to help improve the user interface experience. This has changed significantly over the last few years. Most of the smartphones on the market are open and programmable by third-party developers, and offer software development kits (SDKs), APIs, and software tools. It is easy to cross-compile code and leverage existing software such as established machine learning libraries (e.g., Weka).
However, a number of challenges remain in the development of sensor-based applications. Most vendors did not anticipate that third parties would use continuous sensing to develop new applications. As a result, there is mixed API and operating system (OS) support for access to the low-level sensors, fine-grained sensor control, and the watchdog timers required to develop real-time applications. For example, on Nokia Symbian and Maemo phones the accelerometer returns samples to an application at an unpredictable rate between 25 and 38 Hz, depending on the CPU load. While this may not be an issue when using the accelerometer to drive the display, using statistical models to interpret activity or context typically requires high and, at a minimum, consistent sampling rates.
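When the platform cannot guarantee a fixed rate, a common workaround is to timestamp each sample and resample onto a fixed-rate grid before feature extraction. The sketch below does this with linear interpolation; it is a generic signal-processing fix applied by developers, not something the platforms above provide.

```python
def resample(samples, rate_hz):
    """Linearly interpolate irregular (timestamp_s, value) samples
    onto a fixed-rate grid so statistical models downstream see a
    consistent sampling rate. `samples` must hold at least two
    time-ordered entries."""
    t0, t_end = samples[0][0], samples[-1][0]
    n = int((t_end - t0) * rate_hz) + 1  # grid points inside the span
    out, i = [], 0
    for k in range(n):
        t = t0 + k / rate_hz
        # advance to the segment containing time t
        while i + 2 < len(samples) and samples[i + 1][0] < t:
            i += 1
        (ta, va), (tb, vb) = samples[i], samples[i + 1]
        frac = (t - ta) / (tb - ta) if tb > ta else 0.0
        out.append(va + frac * (vb - va))
    return out
```

Resampling recovers a consistent rate but not lost bandwidth: if the hardware delivered 25 Hz, interpolating to 50 Hz adds no new information, so classifiers should be trained at a rate the worst-case device can actually sustain.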
Lack of sensor control limits the management of energy consumption on the phone. For instance, the GPS uses a varying amount of power depending on factors such as the number of satellites available and atmospheric conditions. Currently, phones offer only a black-box interface to the GPS for requesting location estimates. Finer-grained control is likely to help preserve battery power while maintaining accuracy; for example, location estimation could be aborted when accuracy is likely to be low, or when the estimate takes too long and is no longer useful.
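A sketch of what such finer-grained control could look like, phrased as a polling wrapper around a hypothetical platform call `request_fix` (current black-box GPS APIs expose nothing like this; the names and thresholds are invented for illustration):

```python
import time

def get_location(request_fix, timeout_s=20.0, max_error_m=50.0):
    """Accept a fix as soon as its estimated error is good enough,
    and abort once the deadline passes, rather than letting the GPS
    burn battery on a hopeless request. `request_fix` stands in for
    a platform call returning (lat, lon, error_m), or None while
    the receiver is still acquiring."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        fix = request_fix()
        if fix is not None and fix[2] <= max_error_m:
            return fix
        time.sleep(0.5)  # wait before polling again
    return None  # timed out: the estimate is no longer useful
```

An application could tune `timeout_s` and `max_error_m` per use: a navigation app needs tight accuracy quickly, while a daily mobility diary could accept coarse, slow fixes at a fraction of the energy.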
As third parties demand better support for sensing applications, API and OS support will improve. However, programmability of the phone remains a challenge moving forward. As more individual, group, and community-scale applications are developed, there will be increasing demands placed on phones, both individually and collectively. It is likely that abstractions that can cope with persistent spatial queries and secure the use of resources from neighboring phones will be needed; phones may want to interact with other collocated phones to build new sensing paradigms based on collaborative sensing [12].

Different vendors offer different APIs, making it challenging to port the same sensing application across multivendor platforms. It would be useful for the research community to think about and propose sensing abstractions and APIs that could be standardized and adopted by the different mobile phone vendors.