(12) Patent Application Publication    (10) Pub. No.: US 2011/0066383 A1
     Jangle et al.                     (43) Pub. Date: Mar. 17, 2011

(54) IDENTIFYING ONE OR MORE ACTIVITIES OF AN ANIMATE OR INANIMATE OBJECT

(75) Inventors: Jeetendra Jangle, Fremont, CA (US); Vijay Nadkarni, San Jose, CA (US)

(73) Assignee: Wellcore Corporation, San Jose, CA (US)

(21) Appl. No.: 12/883,304

(22) Filed: Sep. 16, 2010

Related U.S. Application Data

(63) Continuation-in-part of application No. 12/560,069, filed on Sep. 15, 2009.

Publication Classification

(51) Int. Cl.
(52) U.S. Cl. ........................................... 702/19; 702/141

(57) ABSTRACT

Methods, systems and apparatus for identifying an activity of an animate or inanimate object are disclosed. One method includes identifying each elemental motion of a sequence of elemental motions of a device attached to the animate or inanimate object. The activity of the animate or inanimate object can be identified by matching the sequence of identified elemental motions of the device with a library of stored sequences of elemental motions, wherein each stored sequence of elemental motions corresponds with an activity.
[Front-page figure: block diagram of the motion detection device, showing accelerometers 112, 114, 116; an A/D converter; a compare block 130; an acceleration signature library 140; a sequences of elemental motions library 150; a sequences of activities library 160; a controller 170; audio 180; GPS 190; and a wireless connection to the network. A note states that the controller can be internal or include external controller(s) of the network.]
[Sheet 1 of 8, FIGURE 1: block diagram of the motion detection device, with accelerometers 112, 114, 116; acceleration signature library 140; compare block 130; controller 170; sequences of elemental motions library 150; sequences of activities library 160; GPS 190; and a wireless connection to the network. The controller can be internal or include external controller(s) of the network.]
`
[Sheet 2 of 8, FIGURE 2, flow chart:
210: Identifying each elemental motion of a sequence of elemental motions of a device attached to the animate or inanimate object.
220: Identifying the activity of the animate or inanimate object, comprising matching the sequence of identified elemental motions of the device with stored sequences of elemental motions, wherein each stored sequence of elemental motions corresponds with an activity.
230: Identifying each activity of a sequence of activities of the animate or inanimate object.
240: Identifying the behavior of the animate or inanimate object, comprising matching the sequence of identified activities of the animate or inanimate object with stored sequences of activities, wherein each stored sequence of activities corresponds with an identified behavior.]
`
[Sheet 3 of 8, FIGURE 3: hierarchy diagram. Atomic motions (signatures) at the base; identified elemental motions 310; identified activities 320; identified behaviors 330; behavior patterns at the top; with location and time as additional inputs at the higher levels.]
`
`
[Sheet 4 of 8, FIGURE 4A: established daily pattern. FIGURE 4B: current day pattern.]
`
[Sheet 5 of 8, FIGURE 5A: acceleration-versus-time curve for slow falling and lying down / somersault. FIGURE 5B: acceleration curve for slipping and falling on the back on a bouncy surface (air mattress).]
`
[Sheet 6 of 8, FIGURE 6, flow chart:
610: Generating an acceleration signature based on sensed acceleration of the object.
620: Matching the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion.
630: Identifying the type of motion of the object based on the statistical matching or exact matching of the acceleration signature.]
`
`
`
[Sheet 7 of 8, FIGURE 7, flow chart: The motion detection device determining what network connections are available to the motion detection device. The motion detection device distributing at least some of the acceleration signature matching processing if processing capability is available to the motion detection device through available network connections.]
`
`
[Sheet 8 of 8, FIGURE 8: the motion detection and tracking device connecting to one of multiple networks, such as Bluetooth 810, cellular 820, ZigBee 845, or a home base station 840, or operating with no network available 830; each network option is associated with a processor.]
`
IDENTIFYING ONE OR MORE ACTIVITIES
`OF AN ANIMATE OR INANIMATE OBJECT
`
`RELATED APPLICATIONS
`
[0001] This patent application is a continuation-in-part (CIP) of U.S. patent application Ser. No. 12/560,069 filed on
`Sep. 15, 2009, which is incorporated by reference.
`
`FIELD OF THE DESCRIBED EMBODIMENTS
`
`[0002] The described embodiments relate generally to
`monitoring motion. More particularly, the described embodi-
`ments relate to a method, system and apparatus for identify-
`ing one or more activities of an animate or inanimate object.
`
`BACKGROUND
`
`[0003] There is an increasing need for remote monitoring
`of individuals, animals and inanimate objects in their daily or
`natural habitats. Many seniors live independently and need to
have their safety and wellness tracked. A large percentage of society is fitness conscious, and desires to have, for example, workouts and exercise regimens assessed. Public safety offic-
`ers, such as police and firemen, encounter hazardous situa-
`tions on a frequent basis, and need their movements, activities
`and location to be mapped out precisely.
`[0004] The value in such knowledge is enormous. Physi-
cians, for example, like to know their patients' sleeping patterns so they can treat sleep disorders. A senior living inde-
`pendently wants peace of mind that if he has a fall it will be
`detected automatically and help summoned immediately. A
`fitness enthusiast wants to track her daily workout routine,
`capturing the various types of exercises, intensity, duration
`and caloric burn. A caregiver wants to know that her father is
`living an active, healthy lifestyle and taking his daily walks.
`The police would like to know instantly when someone has
`been involved in a car collision, and whether the victims are
`moving or not.
`[0005] Existing products for the detection of animate and
`inanimate motions are simplistic in nature, and incapable of
`interpreting anything more than simple atomic movements,
`such as jolts, changes in orientation and the like. It is not
`possible to draw reliable conclusions about human behavior
`from these simplistic assessments.
[0006] It is desirable to have an apparatus and method that can accurately identify and monitor activities of an animate or inanimate object.
`
`SUMMARY
`
[0007] An embodiment includes a method of identifying an
`activity of an animate or inanimate object. The method
`includes identifying each elemental motion of a sequence of
`elemental motions of a device attached to the animate or
`
`inanimate object. The activity of the animate or inanimate
object can be identified by matching the sequence of identified elemental motions of the device with stored sequences
`of elemental motions, wherein each stored sequence of
`elemental motions corresponds with an activity.
`[0008] Another embodiment
`includes an apparatus for
`identifying an activity of an animate or inanimate object. The
`apparatus includes a controller operative to identify each
elemental motion of a sequence of elemental motions of a
`device attached to the animate or inanimate object. The con-
`troller is further operative to identify the activity of the ani-
`mate or inanimate object, comprising matching the sequence
`
`of identified elemental motions of the object with stored
`sequences of elemental motions, wherein each stored
`sequence of elemental motions corresponds with an activity.
`[0009] Another embodiment includes a system for identi-
fying an activity of an animate or inanimate object. The system
`includes means for identifying each elemental motion of a
sequence of elemental motions of a device attached to the
`animate or inanimate object, and means for identifying the
`activity of the animate or inanimate object, comprising
`matching the sequence of identified elemental motions of the
`device with a library of stored sequences of elemental
`motions, wherein each stored sequence of elemental motions
`corresponds with an activity. The means for identifying each
elemental motion includes means for generating an acceleration signature based on sensed acceleration of the device,
`means for matching the acceleration signature with at least
one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of
`motion, and means for identifying the type of motion of the
`device based on the matching of the acceleration signature
`with the stored acceleration signature.
`[0010] Other aspects and advantages of the described
`embodiments will become apparent from the following
`detailed description, taken in conjunction with the accompa-
`nying drawings, illustrating by way of example the principles
`of the described embodiments.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
[0011] FIG. 1 shows an example of a block diagram of a motion-detection and tracking device.
`[0012]
`FIG. 2 is a flow chart that includes steps of an
example of a method of identifying an activity of an animate or
`inanimate object.
`[0013]
`FIG. 3 shows an example of hierarchical relation-
`ships between elemental motions, activities, behaviors and
`behavioral patterns.
`[0014]
`FIGS. 4A, 4B are plots that show examples of an
established activity pattern and a daily activity pattern for an
`animate or inanimate object, allowing for detection of
`changes in behavior.
`[0015]
FIGS. 5A, 5B show examples of time-lines of several different acceleration curves (signatures), wherein each
`signature is associated with a different type of sensed or
detected motion.
`
[0016] FIG. 6 is a flow chart that includes the steps of one
`example of a method of identifying a type of motion of an
`animate or inanimate object.
`[0017]
`FIG. 7 is a flow chart that includes steps of one
`example of a method of a motion detection device checking
network availability for improvements in speed and/or processing power of acceleration signature matching.
`[0018]
`FIG. 8 shows an example of a motion detection and
`tracking device that can be connected to one of multiple
`networks.
`
`DETAILED DESCRIPTION
`
`[0019] The described embodiments include methods, sys-
`tems and apparatuses that provide human activity and motion
`pattern recognition, allowing a determination of granular
`level activities of daily living being performed by a user.
Embodiments of these granular feature determinations provide the capability to identify user safety. For example, a compromised safety situation, such as the user falling down, can
`
`be identified. By combining the granular motion actions and
`features with data from other sensors such as GPS (global
`positioning system), vital stats sensors and other inferred data
`such as time, it is possible to establish the high level activity
`being performed by the user. Knowledge of high level activi-
`ties being performed during time periods such as a day allows
`for the building of various interesting applications that are
`useful for improving the quality of life of the users and their
`caregivers and to customize and optimize care plans. Armed
`with the knowledge of variation of peoples’ behavior, repeti-
`tive and variant patterns across people, age, gender, location
`and time, systems can provide customized services for indi-
`viduals and categories of people.
`[0020] The monitoring of human activities generally falls
`into three categories: safety, daily lifestyle, and fitness. By
`carefully interpreting human movements it is possible to draw
`accurate and reasonably complete inferences about the state
`of well being of individuals. A high degree of sophistication
`is required in these interpretations. Simplistic assessments of
`human activity lead to inaccurate determinations, and ulti-
`mately are of questionable value. By contrast, a comprehen-
`sive assessment leads to an accurate interpretation and can
`prove to be indispensable in tracking the well being and safety
`of the individual.
`
[0021] To draw accurate inferences about the behavior of humans, it turns out that the atomic movements become simply alphabets that include elemental motions. Furthermore,
`specific sequences of elemental motions become the vocabu-
`lary that comprises human behavior. As an example, take the
`case of a person who leaves the home and drives to the
`shopping center. In such a scenario, the behavioral pattern of
the person is walking to the door of the house, opening and
`closing the door, walking further to the car, settling down in
`the car, starting the engine, accelerating the car, going
`through a series of stops, starts and turns, parking the car,
`getting out and closing the car door, and finally walking to the
`shopping center. This sequence of human behavior is com-
`prised of individual motions such as standing, walking, sit-
`ting, accelerating (in the car), decelerating, and turning left or
`right. Each individual motion, for example walking, is com-
`prised of multiple atomic movements such as acceleration in
`an upward direction, acceleration in a downward direction, a
`modest forward acceleration with each step, a modest decel-
`eration with each step, and so on.
`[0022] With written prose, letters by themselves convey
`almost no meaning at all. Words taken independently convey
`individual meaning, but do not provide the context to com-
`prehend the situation. It takes a complete sentence to obtain
`that context. Along the same line of reasoning, it requires a
`comprehension of a complete sequence of movements to be
`able to interpret human behavior.
`[0023] Although there is an undeniable use for products
`that are able to detect complex human movements accurately,
`the key to the success of such technologies lies in whether
`users adopt them or not. The technology needs to capture a
`wide range of human activities. The range of movements
`should ideally extend to all types of daily living activities that
a human being expects to encounter: sleeping, standing,
`walking, running, aerobics, fitness workouts, climbing stairs,
`vehicular movements, falling, jumping and colliding, to name
`some of the more common ones.
`
[0024] It is important to detect human activities with a great deal of precision. In particular, activities that relate to safety,
fitness, vehicular movements, and day-to-day lifestyle patterns such as walking, sleeping, climbing stairs, are important
`to identify precisely. For example, it is not enough to know
`that a person is walking. One needs to know the pace and
`duration of the walk, and additional knowledge of gait,
`unsteadiness, limping, cadence and the like are important.
`[0025]
`It is critical that false positives as well as false nega-
`tives be eliminated. This is especially important for cases of
`safety, such as falls, collisions, and the like. Human beings
come in all types: short, tall, skinny, obese, male, female, athletic, couch potato, people walking with a stick/rollator,
`people with disabilities, old and young. The product needs to
`be able to adapt to their individuality and lifestyle.
`[0026] The embodiments described provide identification
`of types of motion of an animate or inanimate object. Motion
`is identified by generating acceleration signatures based on
`the sensed acceleration of the object. The acceleration signa-
tures are compared with a library of motion signatures, allowing the motion of the object to be identified. Further,
`sequences of the motions can be determined, allowing iden-
`tification of activities of, for example, a person the object is
`attached to.
`
[0027] Just as the handwritten signatures of a given human
`being are substantively similar from one signature instance to
`the next, yet have minor deviations with each new instance, so
`too will the motion signatures of a given human be substan-
`tively similar from one motion instance to the next, yet have
`minor deviations.
`
`[0028] Algorithms used for pattern recognition (signature
`matching) should have the sophistication to accurately handle
`a wide range of motions. Such algorithms should have the
ability to recognize the identical characteristics of a particular
`motion by a given human being, yet allow for minor varia-
tions arising from human randomness. Additionally, the devices used to monitor peoples' movement need to be miniature and easy to wear. These two objectives are fundamentally opposed. However, the described embodiments provide a single cohesive device and system that is both small enough to wear comfortably and sophisticated enough to detect a wide range of motions.
`[0029]
`FIG. 1 shows an example of a block diagram of a
`motion-detection and tracking device. The motion detection
device can be attached to an animate or inanimate object, and therefore, motion of the object can be detected and identified. Based on the identified motion, estimates of the behavior and conditions of the object can be determined.
`[0030] The motion detection device includes sensors (such
`as, accelerometers) that detect motion of the object. One
embodiment of the sensors includes accelerometers 112, 114,
`116 that can sense, for example, acceleration of the object in
`X, Y and Z directional orientations. It is to be understood that
`other types of motion detection sensors can alternatively be
`used.
`
`[0031] An analog to digital converter (ADC) digitizes ana-
`log accelerometer signals. The digitized signals are received
`by compare processing circuitry 130 that compares the digi-
`tized accelerometer signals with signatures that have been
`stored within a library of signatures 140. Each signature
corresponds with a type of motion. Therefore, when a match is found between the digitized accelerometer signals and a signature stored in the library 140, the type of motion experienced by the motion detection device can be determined.
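The compare step described above can be sketched as a nearest-template search over the signature library. The library contents, the distance measure, the threshold, and all names below are illustrative assumptions for the sketch, not details taken from the patent.

```python
import math

# Hypothetical signature library: each entry maps a motion type to a
# short, fixed-length acceleration template (values are invented).
SIGNATURE_LIBRARY = {
    "step": [0.0, 0.8, 1.2, 0.6, 0.0, -0.4, 0.0],
    "jolt": [0.0, 3.0, -2.5, 1.0, -0.5, 0.2, 0.0],
    "still": [0.0, 0.0, 0.1, 0.0, -0.1, 0.0, 0.0],
}

def distance(a, b):
    """Euclidean distance between two equal-length sample windows."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_signature(window, library, threshold=1.5):
    """Return the motion type whose stored signature is closest to the
    digitized window, or None if nothing is within the threshold."""
    best_type, best_dist = None, float("inf")
    for motion_type, template in library.items():
        d = distance(window, template)
        if d < best_dist:
            best_type, best_dist = motion_type, d
    return best_type if best_dist <= threshold else None

sampled = [0.0, 0.7, 1.1, 0.7, 0.1, -0.3, 0.0]   # digitized ADC output
print(match_signature(sampled, SIGNATURE_LIBRARY))   # prints "step"
```

A real implementation would work on longer windows and likely use statistical matching, as paragraph [0050] of the source suggests, rather than a single fixed threshold.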
`
`[0032] An embodiment includes filtering the accelerometer
`signals before attempting to match the signatures. Addition-
`ally, the matching process can be made simpler by reducing
`the possible signature matches.
`
[0033] An embodiment includes identifying a previous human activity as context. That is, for example, by knowing that
`the previous human activity was walking, certain signatures
`can intelligently be eliminated from the possible matches of
`the present activity that occurs subsequent to the previous
`human activity (walking).
`[0034] An embodiment includes additionally reducing the
`number of possible signature matches by performing a time-
`domain analysis on the accelerometer signal. The time-do-
`main analysis can be used to identify a transient or steady-
state signature of the accelerometer signal. That is, for example, a walk may have a prominent steady-state signature, whereas a fall may have a prominent transient signature. Identification of the transient or steady-state signature of the accelerometer signal can further reduce or eliminate the number of possible signature matches, and therefore make the task of matching the accelerometer signature with a signature within the library of signatures simpler and easier to accomplish. More specifically, the required signal processing is simpler, easier, and requires less computing power.
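The two pruning steps of paragraphs [0033] and [0034] can be sketched as follows. The context table, the peak-to-mean transient test, the signature tags, and all thresholds are illustrative assumptions, not the patent's actual method.

```python
# Which motion classes can plausibly follow a given prior activity
# (invented example of context-based pruning).
CONTEXT_TABLE = {
    "walking": {"walking", "standing", "falling", "climbing_stairs"},
    "sleeping": {"sleeping", "sitting", "standing"},
}

# Hypothetical tagging of each library entry as transient or steady-state.
SIGNATURE_KIND = {
    "walking": "steady", "standing": "steady", "sleeping": "steady",
    "sitting": "steady", "climbing_stairs": "steady", "falling": "transient",
}

def is_transient(window, ratio_threshold=3.0):
    """Crude time-domain test: a transient event (e.g. a fall) has a
    peak magnitude much larger than the window's mean magnitude."""
    peak = max(abs(x) for x in window)
    mean = sum(abs(x) for x in window) / len(window)
    return mean > 0 and peak / mean > ratio_threshold

def candidate_signatures(previous_activity, window):
    """Reduce the library to entries consistent with the previous
    activity and with the transient/steady-state character of the
    current window."""
    allowed = CONTEXT_TABLE.get(previous_activity, set(SIGNATURE_KIND))
    kind = "transient" if is_transient(window) else "steady"
    return {s for s in allowed if SIGNATURE_KIND[s] == kind}

# A sharp spike following walking leaves only the transient candidate.
spike = [0.1, 0.1, 0.2, 9.0, 0.3, 0.1, 0.1]
print(candidate_signatures("walking", spike))   # {'falling'}
```

Only the surviving candidates would then go through the full signature comparison, which is what makes the matching cheaper.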
[0035] A controller 170 manages the signature matching and identification. As will be described, the controller 170 can be connected to an external network. The processing of the controller 170 can be performed locally or distributed amongst other controllers through the network. Determination of where processing takes place (that is, on which controller or processor) can be based on a balance of the speed of the processing and the power of the local controller (that is, the power required of a controller within a mobile device). The controller 170 also manages the activity identification based on sequences of motion, and manages the identification of behaviors based on the identified activities, as will be described. A sequences of elemental motions library 150 can be used for matching the sequences of motion to a particular activity. A sequences of activities library 160 can be used for matching sequences of activities to a particular behavior. Again, the processing of the controller 170, as well as the libraries 150, 160, can be distributed across the network through a wired or wireless connection.
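The local-versus-distributed trade-off described above (and elaborated in FIG. 7) might be decided by a rule like the following. The inputs, thresholds, and return labels are invented for illustration; the patent does not specify a concrete policy.

```python
def choose_processor(network_available, battery_pct, library_size,
                     local_limit=500):
    """Decide where signature-matching processing should run.

    network_available: whether any network connection was found.
    battery_pct: remaining battery of the mobile device (0-100).
    library_size: number of candidate signatures to compare.
    local_limit: largest library the mobile controller handles comfortably.
    All numbers are illustrative assumptions.
    """
    if not network_available:
        return "local"       # no choice: process on the device itself
    if battery_pct < 20:
        return "network"     # preserve the mobile device's battery
    if library_size > local_limit:
        return "network"     # matching would be too slow locally
    return "local"

print(choose_processor(network_available=True, battery_pct=80,
                       library_size=100))   # prints "local"
```

The same decision could be refined per network type (Bluetooth, cellular, ZigBee, home base station), echoing the options shown in FIG. 8.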
`
`[0036] Upon detection of certain types of motion, an audio
`device 180 and/or a global positioning system (GPS) 190 can
`engaged to provide additional information that can be used to
`determine the situation of, for example, a human being the
`motion detection device is attached to.
`
`[0037] The condition, or information relating to the motion
`detection device can be communicated through a wired or
`wireless connection. A receiver of the information can pro-
`cess it, and make a determination regarding the status of the
`human being the motion detection device is attached to. Infor-
mation and history of a user of the motion detection device can be utilized to characterize the user.

FIG. 2 is a flow chart
`
`that includes steps of an example of a method of identifying
`an activity of an animate or inanimate object. A first step 210
`includes identifying each elemental motion of a sequence of
`elemental motions of a device attached to the animate or
`
`inanimate object. A second step 220 includes identifying the
`activity of the animate or inanimate object, comprising
`matching the sequence of identified elemental motions of the
`device with stored sequences of elemental motions, wherein
`each stored sequence of elemental motions corresponds with
`an activity.
`[0038] A plurality or sequence of identified activities of, for
`example, a human being, can be used to identify a behavior of
the human being. As such, a third step 230 includes identifying each activity of a sequence of activities of the animate or
`inanimate object. A fourth step 240 includes identifying the
`behavior of the animate or inanimate object, comprising
`matching the sequence of identified activities of the animate
or inanimate object with stored sequences of activities,
`wherein each stored sequence of activities corresponds with
`an identified behavior.
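The two-level matching of steps 210 through 240 can be sketched with small lookup libraries: identified elemental motions map to an activity, and identified activities map to a behavior. The library contents and names are invented examples; only the layered structure comes from the text.

```python
# Library 150 (illustrative): motion sequences that name an activity.
MOTION_SEQUENCES = {
    ("stand", "walk", "stop", "walk"): "leaving_the_house",
    ("sit", "door_close", "belt_pull"): "entering_car",
}

# Library 160 (illustrative): activity sequences that name a behavior.
ACTIVITY_SEQUENCES = {
    ("leaving_the_house", "entering_car", "driving"): "going_shopping",
}

def identify(sequence, library):
    """Exact-match a sequence of identified units against a stored
    library; return the matching label or None."""
    return library.get(tuple(sequence))

motions = ["stand", "walk", "stop", "walk"]
activity = identify(motions, MOTION_SEQUENCES)
print(activity)   # leaving_the_house

activities = ["leaving_the_house", "entering_car", "driving"]
print(identify(activities, ACTIVITY_SEQUENCES))   # going_shopping
```

Exact tuple lookup is the simplest possible matcher; a production system would tolerate insertions, omissions, and reordering in the observed sequence.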
`
`[0039] The animate or inanimate object can be many
`things, such as, a human being or an animal. Alternatively or
`additionally, the animate or inanimate object can be an object
`associated with a human being, such as, a vehicle. The device
`can be attached to the animate or inanimate object in many
`different ways. For example, the device can be attached to a
`human being, or clothing (pants, shirt, jacket, and/or hat)
`being worn by the human being. The device can be within a
`pendant or necklace being worn by the human being. The
`device can be attached, for example, to a vehicle being oper-
`ated by the human being.
[0040] For an embodiment, identifying each elemental motion includes generating an acceleration signature based
`on sensed acceleration of the device, matching the accelera-
`tion signature with at least one of a plurality of stored accel-
eration signatures, wherein each stored acceleration signature corresponds with a type of motion, and identifying the
`type of motion of the device based on the matching of the
`acceleration signature with the stored acceleration signature.
`[0041] Other factors can be used to refine (improve) the
`identification of the activity. These factors can include, for
`example, analyzing timing of the identified activity. For an
`embodiment, the timing includes at least one of an hour of a
`day, a day of a week, a week of a month, a month of a year.
`Other factors include analyzing at least one identified loca-
`tion of the identified activity, analyzing a rate of change of a
`location of the animate or inanimate object, analyzing pat-
`terns of a plurality of identified activities, and/or analyzing an
`age of the animate or inanimate object.
`[0042] As previously mentioned, behaviors can be identi-
`fied based on sequences of identified activities. Embodiments
further include tracking at least one behavior of the animate or
`inanimate object over time. One embodiment includes iden-
`tifying patterns of the at least one behavior. An embodiment
`includes grouping the patterns of the animate or inanimate
`objects based on a common parameter between the animate or
`inanimate objects. Embodiments include identifying changes
in at least one behavior of the animate or inanimate object. An
embodiment further includes sending an alert upon identification of a predetermined set of behavior changes.
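The behavior-change alerting just described, which compares a current day against an established pattern as in FIGS. 4A and 4B, can be sketched as a simple deviation test. The tracked activities, counts, and tolerance are illustrative assumptions.

```python
# Hypothetical established daily pattern: activity -> typical daily count.
ESTABLISHED = {"bathroom_visits": 2, "walks": 3, "sleep_hours": 8}

def behavior_changes(current, established, tolerance=0.5):
    """Return the activities whose observed count departs from the
    established baseline by more than tolerance * baseline."""
    changed = []
    for activity, baseline in established.items():
        observed = current.get(activity, 0)
        if abs(observed - baseline) > tolerance * baseline:
            changed.append(activity)
    return changed

# Frequent night-time bathroom visits (the example from paragraph
# [0051]) depart from the pattern and would trigger an alert.
today = {"bathroom_visits": 6, "walks": 3, "sleep_hours": 7.5}
alerts = behavior_changes(today, ESTABLISHED)
if alerts:
    print("alert:", alerts)   # alert: ['bathroom_visits']
```

A real system would compare distributions over many days rather than single counts, but the structure of the check is the same.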
`[0043]
`FIG. 3 shows an example of hierarchical relation-
`ships between elemental motions, activities, behaviors and
`behavioral patterns. At the lowest level of the hierarchy are
`the identified elemental motions 310. As described,
`the
`elemental motions can be identified by sensing signatures of
`motion (by, for example, accelerometers within a device
attached to a user) and matching the signatures with known signatures. At the next higher level of the hierarchy are the
`identified activities 320. As described, the activities can be
`identified by matching determined sequences of elemental
`motions with previously known sequences of elemental
`motions. At the next higher level of the hierarchy are identi-
`fied behaviors 330. As described, the behaviors can be iden-
`tified by matching determined sequences of activities with
`previously known sequences of activities. Each of the levels
`of hierarchy can be aided with additional information. For
`example, the identified behaviors can be more intelligently
`
identified with time, location and/or age of the user. Additionally, this information can be used for grouping and identifying behavior patterns. Once a behavior pattern has been associ-
`ated with a user, much more useful information can be asso-
`ciated with the user.
`
`[0044] The described embodiments can correlate the
`sequences of activity data being generated along with the
ambient information like location, time, etc., to generate daily
`patterns of the user. These daily patterns then emerge as
`behavioral patterns of the person. Behavioral patterns allow
the system to determine how people spend their time, recreational and buying habits, interests of people, and pattern variations across demographics. Based on the behavioral
patterns, how habits of people vary in relationship to time and
`their physical wellbeing can be deduced or inferred.
[0045] The described embodiments include systems that
`can detect critical conditions based on the previous knowl-
`edge obtained by the systems for an individual and help
`prevent and aid safety situations. Additionally, the systems
can detect early signs of conditions that enable early attention
`from experts. Additionally, the systems can learn from obser-
`vation and capture behavior patterns that cannot be deter-
`mined with generic real-time monitoring. The systems adapt
`to the observed person using the data being collected through
`monitoring.
`[0046] Descriptively, an analogy can be drawn between a
person's motions and languages. For example, a person has minute motions, activities, daily lifestyle and behavioral patterns, analogous to words, sentences, paragraphs, chapters and books. Just as there are words in a vocabulary, a vocabulary can be created of elemental motions. The way sentences are created by putting words into a certain order, the elemental motions can be put into a certain order to form activities. Activities can be put in succession along with ambient parameters to form contextual activities. A series of contextual activities, or data mining of activities per day/week, can form a lifestyle of a person, which can lead to behavioral patterns of a person.
[0047] Analogies include, for example: Sound → Atomic Motion; Alphabets → Elemental Motion, Orientation; Words → Basic Movements; Sentences → Compound Movements; Paragraphs → Contextual Movements; Chapters → Activity Patterns; Books → Behavioral Patterns.
`[0048] A person’s lifestyle or behavior can be determined
based on his/her movement patterns. Small movement (elemental motion) patterns can be determined by the 3-dimensional acceleration signals and orientation. Examples of
elemental motions include, for example, arm movement, sitting sedentary in a chair, getting up from the chair, standing, walking, running, and falling. Putting the basic components of the movement patterns in series (varied combinations) provides a high degree of inference of the activity. Inference can be made using the metrics of elemental motions and the metrics of ambient variables. Examples of ambient variables include time of
`the day, GPS location, environment, and/or physical nature.
`[0049] The following is an example of a series (sequence)
of higher-level contextual activities, each of which includes a sequence of elemental motions. A first example of
`a higher level activity includes a user going to a grocery store.
First, the user leaves his/her house, which can include the following elemental motions/activities. Leaving the house can be detected as including the following sequence of elemental motions: getting up from a chair, walking a few steps, stopping briefly at the door, and walking to the car. Next, the method can include identifying the user driving a car, including the following sequence of elemental motions: sitting in the car, driving the car, car movements, parking of the car, and getting out of the car, with the additional inputs of, for example, location and
`time of the day. The next step can include identifying that the
`user walked to the grocery store at the location.
`[0050] Other identified activities can include identifying
`the user getting up in the morning, by identifying the follow-
`ing sequence of elemental motions and inputs: identifying the
`time of the day (night), identifying sleeping, or long seden-
tary activity, identifying going to the bathroom, and rolling over in sleep. The activity of sitting in a car can include
`identifying the following sequence of elemental motions:
`opening the door, sitting down in the car, closing the door,
`pulling on the belt, putting on the seat belt, and sitting back in
the seat. The activity of driving can be identified by identifying the following sequence of elemental motions (influenced, for example, by location and time of the day): sitting in the
`car, starting the car, the car moving, stopping the car, opening
`the car door, getting out of the car, closing the car door.
`Identification of car movement can include identifying the
`following sequence of elemental motions: going forward,
`going backward, driving, braking, turning left, and turning
`right.
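One way the sequences in paragraphs [0049] and [0050] might be stored is as named activities mapping to their ordered elemental motions. The motion lists below are taken from the examples in the text; the data structure and the subsequence test are assumptions for the sketch.

```python
# Illustrative encoding of the activity examples described above.
ACTIVITY_LIBRARY = {
    "leaving_the_house": [
        "getting_up_from_chair", "walking_few_steps",
        "stopping_at_door", "walking_to_car",
    ],
    "sitting_in_car": [
        "opening_door", "sitting_down", "closing_door",
        "pulling_belt", "putting_on_seat_belt", "sitting_back",
    ],
    "car_movement": [
        "going_forward", "going_backward", "driving",
        "braking", "turning_left", "turning_right",
    ],
}

def contains_subsequence(observed, stored):
    """True if the stored motion sequence appears, in order, within the
    observed motion stream (a simple in-order subsequence test; real
    matching would likely be statistical)."""
    it = iter(observed)
    return all(motion in it for motion in stored)

stream = ["getting_up_from_chair", "walking_few_steps", "pausing",
          "stopping_at_door", "walking_to_car"]
print(contains_subsequence(stream, ACTIVITY_LIBRARY["leaving_the_house"]))
```

The subsequence test tolerates extra observed motions ("pausing" above) between the expected ones, which matches the idea that daily activity streams contain motions not relevant to any one stored sequence.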
`[0051] Higher-level activity identification can include, for
`example, monitoring a duration of sleep including identifying
a number of times waking up and leaving the bed, duration away from bed, duration of deep sleep and number of deep sleeps at night, and start and end times of sleep. This information can be
`used to establish long term trends of sleep including deter-
`mining when the person (user) goes to bed, how long it takes
`to fall asleep, quality of sleep, and/or length of sleep. If there
`is a sudden departure from the established sleep pattern, for
`example, frequent visits to the bathroom at night, this could
`indicate signs of distress or unease requiring a visit from
`caregivers or visit to physician. Additional sensors like tem-
`perature and/or blood pressure could be triggered automati-
`cally to take vital stats at certain intervals and transmitted to
`a physician for study prior to visit in