(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2010/0217533 A1
     Nadkarni et al.                   (43) Pub. Date: Aug. 26, 2010

(54) IDENTIFYING A TYPE OF MOTION OF AN OBJECT

(75) Inventors: Vijay Nadkarni, San Jose, CA (US); Jeetendra Jangle, Fremont, CA (US); John Bentley, Santa Clara, CA (US); Umang Salgia, Nigadi (IN)

Correspondence Address:
Law Office of Brian Short
P.O. Box 641867
San Jose, CA 95164-1867 (US)

(73) Assignee: LABURNUM NETWORKS, INC., San Jose, CA (US)

(21) Appl. No.: 12/560,069

(22) Filed: Sep. 15, 2009

Related U.S. Application Data

(60) Provisional application No. 61/208,344, filed on Feb. 23, 2009.

Publication Classification

(51) Int. Cl.
     G01P 15/00    (2006.01)
     G06F 19/00    (2006.01)
     G06F 17/18    (2006.01)

(52) U.S. Cl. ............................ 702/19; 702/141; 702/179

(57) ABSTRACT

A method of identifying a type of motion of an animate or inanimate object is disclosed. The method includes generating an acceleration signature based on the sensed acceleration of the object. The acceleration signature is matched with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. The type of motion of the object is identified based on the statistical matching or exact matching of the acceleration signature.
[Representative drawing (FIG. 6): flowchart showing generating an acceleration signature based on the sensed acceleration of the object (610), matching the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion (620), and identifying the type of motion of the object based on the statistical matching or exact matching of the acceleration signature (630).]
[FIG. 1 (Sheet 1 of 9): drawing showing examples of different types of motions of a human being, such as a first motion 110 (walking), a second motion 120 (falling), and a third motion 130 (running).]
[FIG. 2A (Sheet 2 of 9): acceleration-versus-time curve labeled "Slow falling and lying down somersault"; vertical axis: Acceleration.]

[FIG. 2B: acceleration-versus-time curve labeled "Slipping and falling back by a person on a bouncy surface (air mattress)"; vertical axis: Acceleration; horizontal axis: X Axis.]
[FIG. 2C (Sheet 3 of 9): acceleration-versus-time curve labeled "Fall on face with knees flexed"; vertical axis: Acceleration.]
[FIG. 3 (Sheet 4 of 9): block diagram of a motion detection device, showing accelerometers 312, 314, 316, an A/D converter 320, compare processing 330, a signature library 340, an audio device 360, a GPS 370, a controller, and a communication block.]
[FIG. 4 (Sheet 5 of 9): flowchart with steps 410 (monitoring an activity of a person the motion detection device is attached to), 420 (performing instantaneous computations over raw signals to compute atomic motions along with gravity vector and tilt vector), 430 (applying a series of digital filters to remove noise in the atomic motions data), 440 (performing state analysis on a series of atomic data samples), 445 (periodic state analysis), 450 (transient state analysis), 460 (formation of macro motion signatures), 470 (learning system providing the right model for the user from a set of models), and 480 (pre-building a motion database of motion libraries).]
[FIG. 5 (Sheet 6 of 9): state-flow diagram for fall detection, including monitoring activity (510); deviation in motion pattern detected, recorded and reported (515); large acceleration threshold exceeded, audio recording started (520); another large acceleration of lesser magnitude detected (525); monitoring (530); short period of inactivity (535); monitoring (540); normal movement detected, audio recording stopped (545); period of inactivity (550); analysis (555); analysis of motion and positions indicating normal activity (560); analysis of motion and positions indicating a fall occurred (565); alert sent with recording and analysis (570); probable fall reported (575); and fall report acknowledged (580).]
[FIG. 6 (Sheet 7 of 9): flowchart with steps 610 (generating an acceleration signature based on the sensed acceleration of the object), 620 (matching the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion), and 630 (identifying the type of motion of the object based on the statistical matching or exact matching of the acceleration signature).]
[FIG. 7 (Sheet 8 of 9): flowchart with steps of the motion detection device determining what network connections are available to the motion detection device, and the motion detection device distributing at least some of the acceleration signature matching processing if processing capability is available to the motion detection device through available network connections.]
[FIG. 8 (Sheet 9 of 9): diagram of a motion detection device that can be connected to one of multiple networks, showing a case with no network available (810), a Bluetooth connection, a cellular connection (820), a home base station (840), and processors at the far end of each connection.]
IDENTIFYING A TYPE OF MOTION OF AN OBJECT

RELATED APPLICATIONS

[0001] This patent application claims priority to U.S. provisional patent application Ser. No. 61/208,344, filed on Feb. 23, 2009, which is incorporated by reference.

FIELD OF THE DESCRIBED EMBODIMENTS

[0002] The described embodiments relate generally to motion detecting. More particularly, the described embodiments relate to a method and apparatus for identifying a type of motion of an animate or inanimate object.

BACKGROUND

[0003] There is an increasing need for remote monitoring of individuals, animals and inanimate objects in their daily or natural habitats. Many seniors live independently and need to have their safety and wellness tracked. A large percentage of society is fitness conscious, and desires to have, for example, workouts and exercise regimens assessed. Public safety officers, such as police and firemen, encounter hazardous situations on a frequent basis, and need their movements, activities and location to be mapped out precisely.

[0004] The value in such knowledge is enormous. Physicians, for example, like to know their patients' sleeping patterns so they can treat sleep disorders. A senior living independently wants peace of mind that if he has a fall it will be detected automatically and help summoned immediately. A fitness enthusiast wants to track her daily workout routine, capturing the various types of exercises, intensity, duration and caloric burn. A caregiver wants to know that her father is living an active, healthy lifestyle and taking his daily walks. The police would like to know instantly when someone has been involved in a car collision, and whether the victims are moving or not.

[0005] Existing products for the detection of animate and inanimate motions are simplistic in nature, and incapable of interpreting anything more than simple atomic movements, such as jolts, changes in orientation and the like. It is not possible to draw reliable conclusions about human behavior from these simplistic assessments.

[0006] It is desirable to have an apparatus and method that can accurately monitor motion of either animate or inanimate objects.

SUMMARY

[0007] An embodiment includes a method of identifying a type of motion of an animate or inanimate object. The method includes generating an acceleration signature based on the sensed acceleration of the object. The acceleration signature is matched with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. The type of motion of the object is identified based on the statistical matching or exact matching of the acceleration signature.

[0008] Another embodiment includes a method of identifying a type of motion of a person. The method includes generating an acceleration signature based on the sensed acceleration of an object attached to the person. The acceleration signature is matched with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. The type of motion of the object is identified based on the statistical matching or exact matching of the acceleration signature.

[0009] Other aspects and advantages of the described embodiments will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the described embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS

[0010] FIG. 1 shows examples of different types of motions of a human being that an object attached to the human being can be used to detect or sense.

[0011] FIGS. 2A, 2B, 2C show examples of time-lines of several different acceleration curves (signatures), wherein each signature is associated with a different type of sensed or detected motion.

[0012] FIG. 3 is an example of a block diagram of a motion detection device.

[0013] FIG. 4 is a flowchart that includes the steps of an example of a method for detecting various motions of daily living activities and emergency situations, such as a fall.

[0014] FIG. 5 is a flowchart that includes the steps of a method for detection of a fall.

[0015] FIG. 6 is a flow chart that includes the steps of one example of a method of identifying a type of motion of an animate or inanimate object.

[0016] FIG. 7 is a flow chart that includes steps of one example of a method of a motion detection device checking network availability for improvements in speed and/or processing power of acceleration signature matching.

[0017] FIG. 8 shows a motion detection device that can be connected to one of multiple networks.

DETAILED DESCRIPTION

[0018] The monitoring of human activities generally falls into three categories: safety, daily lifestyle, and fitness. By carefully interpreting human movements it is possible to draw accurate and reasonably complete inferences about the state of well-being of individuals. A high degree of sophistication is required in these interpretations. Simplistic assessments of human activity lead to inaccurate determinations, and ultimately are of questionable value. By contrast, a comprehensive assessment leads to an accurate interpretation and can prove to be indispensable in tracking the well-being and safety of the individual.

[0019] To draw accurate inferences about the behavior of humans, it turns out that the atomic movements become simply alphabets that include elemental motions. Furthermore, specific sequences of elemental motions become the vocabulary that comprises human behavior. As an example, take the case of a person who leaves the home and drives to the shopping center. In such a scenario, the behavioral pattern of the person is walking to the door of the house, opening and closing the door, walking further to the car, settling down in the car, starting the engine, accelerating the car, going through a series of stops, starts and turns, parking the car, getting out and closing the car door, and finally walking to the shopping center. This sequence of human behavior is comprised of individual motions such as standing, walking, sitting, accelerating (in the car), decelerating, and turning left or right. Each individual motion, for example walking, is comprised of multiple atomic movements such as acceleration in an upward direction, acceleration in a downward direction, a
modest forward acceleration with each step, a modest deceleration with each step, and so on.
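
As an illustrative aside, the atomic/elemental/macro hierarchy described above can be pictured as nested sequences. The following minimal Python sketch is not part of the described embodiments; the class names and the particular atomic and elemental labels are hypothetical examples chosen only to mirror the walking-and-driving scenario.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the motion hierarchy described above.
# All labels are hypothetical examples, not entries from an actual motion library.

@dataclass
class ElementalMotion:
    name: str                      # e.g. "walking"
    atomic_sequence: List[str]     # ordered atomic movements that compose it

@dataclass
class MacroMotion:
    name: str                      # e.g. "leaving home and driving to the shopping center"
    elemental_sequence: List[ElementalMotion] = field(default_factory=list)

walking = ElementalMotion(
    name="walking",
    atomic_sequence=["upward acceleration", "downward acceleration",
                     "modest forward acceleration", "modest deceleration"],
)
driving = ElementalMotion(
    name="vehicular traversal",
    atomic_sequence=["vehicular acceleration", "turn left", "turn right",
                     "vehicular deceleration"],
)

trip_to_store = MacroMotion(
    name="leaving home and driving to the shopping center",
    elemental_sequence=[walking, driving, walking],
)

if __name__ == "__main__":
    for elemental in trip_to_store.elemental_sequence:
        print(elemental.name, "->", ", ".join(elemental.atomic_sequence))
```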
[0020] With written prose, letters by themselves convey almost no meaning at all. Words taken independently convey individual meaning, but do not provide the context to comprehend the situation. It takes a complete sentence to obtain that context. Along the same line of reasoning, it requires a comprehension of a complete sequence of movements to be able to interpret human behavior.

[0021] Although there is an undeniable use for products that are able to detect complex human movements accurately, the key to the success of such technologies lies in whether users adopt them or not. The technology needs to capture a wide range of human activities. The range of movements should ideally extend to all types of daily living activities that a human being expects to encounter: sleeping, standing, walking, running, aerobics, fitness workouts, climbing stairs, vehicular movements, falling, jumping and colliding, to name some of the more common ones.

[0022] It is important to detect human activities with a great deal of precision. In particular, activities that relate to safety, fitness, vehicular movements, and day-to-day lifestyle patterns such as walking, sleeping, and climbing stairs are important to identify precisely. For example, it is not enough to know that a person is walking. One needs to know the pace and duration of the walk, and additional knowledge of gait, unsteadiness, limping, cadence and the like is important.

[0023] It is critical that false positives as well as false negatives be eliminated. This is especially important for cases of safety, such as falls, collisions, and the like. Human beings come in all types: short, tall, skinny, obese, male, female, athletic, couch potato, people walking with a stick or rollator, people with disabilities, old and young. The product needs to be able to adapt to their individuality and lifestyle.
[0024] The embodiments described provide identification of types of motion of an animate or inanimate object. Motion is identified by generating acceleration signatures based on the sensed acceleration of the object. The acceleration signatures are compared with a library of motion signatures, allowing the motion of the object to be identified. Further, sequences of the motions can be determined, allowing identification of activities of, for example, a person the object is attached to.

[0025] Just as the handwritten signatures of a given human being are substantively similar from one signature instance to the next, yet have minor deviations with each new instance, so too will the motion signatures of a given human be substantively similar from one motion instance to the next, yet have minor deviations.

[0026] Algorithms used for pattern recognition (signature matching) should have the sophistication to accurately handle a wide range of motions. Such algorithms should have the ability to recognize the identical characteristics of a particular motion by a given human being, yet allow for minor variations arising from human randomness. Additionally, the devices used to monitor people's movement need to be miniature and easy to wear. These two objectives are fundamentally opposed. However, the described embodiments provide a single cohesive system that is both sophisticated enough to detect a wide range of motions and compact enough to be easily worn.
[0027] FIG. 1 shows examples of different types of motions of a human being that an object attached to the human being can be used to detect or sense. The human motions can include, for example, standing, sleeping, walking, and running. A first motion 110 can include walking. A second motion 120 can include falling. A third motion 130 can include running. Each of the motions generates a unique motion signature. As will be described, the signatures can be universal to, for example, many individuals. Additionally, the signatures can have additional characteristics that are unique to an individual.

[0028] FIGS. 2A, 2B, 2C show examples of different types of acceleration and orientation signatures for various sample motions by human beings. It should be noted that these signatures are expected to have certain components that are common from one human being to the next, but also have certain components that vary from one human to the next.

[0029] The signatures of FIGS. 2A, 2B, 2C are depicted in only one orientation. That is, three accelerometers can be used to generate acceleration signatures in the X, Y and Z (three) orientations. The signatures of FIGS. 2A, 2B, 2C only show the signature of one of the three orientations. It is to be understood that matching can use the other orientations as well.

[0030] FIG. 2A shows an example of an acceleration signature of a person doing a slow fall and lying-down somersault. FIG. 2B shows an example of an acceleration signature of a person slipping and falling back on a bouncy surface (for example, an air mattress). FIG. 2C shows an acceleration signature of a person falling on their face with their knees flexed. By matching an acceleration signature that has been generated by sensing the motion of a person with one of many stored signatures, the motion of the person can be determined.

[0031] FIG. 3 is an example of a block diagram of a motion detection device. The motion detection device can be attached to an object, and therefore, detect motion of the object that can be identified. Based on the identified motion, estimates of the behavior and conditions of the object can be determined.

[0032] The motion detection device includes sensors (such as accelerometers) that detect motion of the object. One embodiment of the sensors includes accelerometers 312, 314, 316 that can sense, for example, acceleration of the object in X, Y and Z directional orientations. It is to be understood that other types of motion detection sensors can alternatively be used.

[0033] An analog-to-digital converter (ADC) digitizes the analog accelerometer signals. The digitized signals are received by compare processing circuitry 330 that compares the digitized accelerometer signals with signatures that have been stored within a library of signatures 340. Each signature corresponds with a type of motion. Therefore, when a match is found between the digitized accelerometer signals and a signature stored in the library 340, the type of motion experienced by the motion detection device can be determined.
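
As a rough illustration of what the compare processing circuitry 330 does, the following Python sketch matches a digitized acceleration window against a small in-memory stand-in for the library of signatures 340. The z-normalization, Euclidean distance metric, window length, and threshold are assumptions made for the example; the described embodiments do not mandate a particular comparison algorithm.

```python
import numpy as np

# Hypothetical library: each entry maps a motion label to a stored
# single-axis acceleration signature of fixed length (stand-in for library 340).
LIBRARY = {
    "walking": np.sin(np.linspace(0, 6 * np.pi, 128)),                # periodic pattern
    "fall":    np.concatenate([np.zeros(96), 3.0 * np.ones(32)]),     # sharp transient
}

def match_signature(window: np.ndarray, threshold: float = 0.5):
    """Return the best-matching motion label, or None if nothing is close enough.

    The window and each stored signature are z-normalized so that the
    comparison tolerates differences in offset and overall intensity.
    """
    def znorm(x):
        return (x - x.mean()) / (x.std() + 1e-9)

    window = znorm(window)
    best_label, best_dist = None, np.inf
    for label, signature in LIBRARY.items():
        # Length-normalized Euclidean distance between normalized traces.
        dist = np.linalg.norm(window - znorm(signature)) / len(window) ** 0.5
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < threshold else None

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noisy_walk = np.sin(np.linspace(0, 6 * np.pi, 128)) + 0.1 * rng.standard_normal(128)
    print(match_signature(noisy_walk))   # expected: "walking"
```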

[0034] An embodiment includes filtering the accelerometer signals before attempting to match the signatures. Additionally, the matching process can be made simpler by reducing the possible signature matches.

[0035] An embodiment includes identifying a previous human activity context. That is, for example, by knowing that the previous human activity was walking, certain signatures can intelligently be eliminated from the possible matches of the present activity that occurs subsequent to the previous human activity (walking).
[0036] An embodiment includes additionally reducing the number of possible signature matches by performing a time-domain analysis on the accelerometer signal. The time-domain analysis can be used to identify a transient or steady-state signature of the accelerometer signal. That is, for example, a walk may have a prominent steady-state signature, whereas a fall may have a prominent transient signature. Identification of the transient or steady-state signature of the accelerometer signal can further reduce or eliminate the number of possible signature matches, and therefore, make the task of matching the accelerometer signature with a signature within the library of signatures simpler, and easier to accomplish. More specifically, the required signal processing is simpler, easier, and requires less computing power.
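
One simple way to picture the transient versus steady-state distinction is to compare the most energetic short window of the signal against the average window energy. The sketch below is illustrative only; the window length and energy ratio are assumed values, not parameters taken from the described embodiments.

```python
import numpy as np

def classify_time_domain(accel: np.ndarray, window: int = 16, ratio: float = 4.0) -> str:
    """Label an acceleration trace as 'transient' or 'steady-state'.

    A trace is called transient when its most energetic short window carries
    several times the average per-window energy (e.g. a fall), and steady-state
    when the energy is spread evenly (e.g. walking). The window length and
    ratio threshold are illustrative assumptions.
    """
    accel = accel - accel.mean()                      # remove gravity/DC offset
    n_windows = len(accel) // window
    energies = np.array([
        np.sum(accel[i * window:(i + 1) * window] ** 2) for i in range(n_windows)
    ])
    peak, average = energies.max(), energies.mean() + 1e-9
    return "transient" if peak / average > ratio else "steady-state"

if __name__ == "__main__":
    t = np.linspace(0, 4, 256)
    walk = np.sin(2 * np.pi * 2 * t)                  # periodic, steady-state pattern
    fall = np.zeros(256)
    fall[180:196] = 5.0                               # single sharp burst
    print(classify_time_domain(walk))                 # steady-state
    print(classify_time_domain(fall))                 # transient
```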
[0037] Upon detection of certain types of motion, an audio device 360 and/or a global positioning system (GPS) 370 can be engaged to provide additional information that can be used to determine the situation of, for example, a human being the motion detection device is attached to.

[0038] The condition, or information relating to the motion detection device, can be communicated through a wired or wireless connection. A receiver of the information can process it, and make a determination regarding the status of the human being the motion detection device is attached to.
[0039] FIG. 4 is a flowchart that includes the steps of an example of a method for detecting various motions of daily living activities and emergency situations, such as a fall. A first step 410 includes monitoring an activity of a person the motion detection device is attached to. Raw signal data is collected from, for example, an accelerometer sensor. A second step 420 includes performing instantaneous computations over the raw signals to compute atomic motions along with a gravity vector and tilt vector. A third step 430 includes applying a series of digital filters to remove noise in the atomic motions data. A fourth step 440 includes performing state analysis on a series of atomic data samples and forming context. Depending on the state analysis, the series of atomic data is passed through either a periodic or steady-state data analysis (step 445) or a transient-state data analysis (step 450). A sixth step 460 includes formation of macro motion signatures. The macro motion signatures are built from an output of state analysis vectors using known wavelet transformation techniques (for example, a Haar transform). The transform performs pattern matching of the current motion pattern against an existing motion pattern library using, for example, DWT (Discrete Wavelet Transform) techniques. Complex motion wavelets are later matched using statistical pattern matching techniques, such as HHMM (Hidden Heuristic Markov Model). The statistical pattern matching includes detecting and classifying events of interest. The events of interest are built by observing various motion and orientation state data of an animate or inanimate object. This data is used to train the statistical model which performs the motion/activity detection. Each activity will have its own model trained based on the observed data. A seventh step 470 includes a learning system providing the right model for the user from a set of models. It also aids in building newer (personal) patterns which are not in the library for the person who is wearing the motion detection device. An eighth step 480 includes pre-building a motion database of motion libraries against which motion signatures are compared. The database adds new motion/state signatures dynamically as they are identified.
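
The Haar transform mentioned in step 460 can be sketched compactly: repeatedly averaging adjacent samples yields a coarse approximation that serves as a macro motion feature vector, which can then be compared against library entries. The sketch below stands in for the DWT-based matching only; the HHMM classification stage is not shown, and the window length, number of decomposition levels, and nearest-neighbor distance are illustrative assumptions.

```python
import numpy as np

def haar_dwt(signal: np.ndarray, levels: int = 3) -> np.ndarray:
    """Return the coarse approximation coefficients of a Haar DWT.

    Each level sums adjacent pairs (scaled by 1/sqrt(2)) and discards the
    detail coefficients, leaving a compact 'macro motion' feature vector of
    length len(signal) / 2**levels.
    """
    approx = signal.astype(float)
    for _ in range(levels):
        approx = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)
    return approx

def haar_match(window: np.ndarray, library: dict, levels: int = 3) -> str:
    """Pick the library label whose Haar approximation is closest to the window's."""
    features = haar_dwt(window, levels)
    return min(library,
               key=lambda label: np.linalg.norm(features - haar_dwt(library[label], levels)))

if __name__ == "__main__":
    # Hypothetical library of single-axis motion patterns (length must be a power-of-two multiple).
    library = {
        "walking": np.sin(np.linspace(0, 8 * np.pi, 128)),
        "fall":    np.concatenate([np.zeros(96), 4.0 * np.ones(32)]),
    }
    rng = np.random.default_rng(1)
    observed = np.sin(np.linspace(0, 8 * np.pi, 128)) + 0.2 * rng.standard_normal(128)
    print(haar_match(observed, library))   # expected: "walking"
```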
[0040] FIG. 5 is a flowchart that includes the steps of an example of a method for detecting a fall. A first step 510 includes monitoring an activity of, for example, a person the motion detection device is attached to. A step 515 includes recording and reporting deviations in the normal motion patterns of the person. A step 520 includes detecting the acceleration magnitude deviation exceeding a threshold. The acceleration magnitude deviation exceeding the threshold can be sensed as a probable fall, and audio recording is initiated. Upon detection of this condition, sound recording of the person the motion detection device is connected to can be activated. The activation of sound can provide additional information that can be useful in assessing the situation of the person. A step 530 includes monitoring the person after the probable fall. A step 525 includes detection of another acceleration having a magnitude lesser than the threshold, and continuing monitoring of audio. A step 535 includes detecting a short period of inactivity. A step 540 includes monitoring the person after determining a fall probably occurred. A step 545 includes subsequently detecting normal types of motion and turning off the audio because the person seems to be performing normal activity. A step 550 includes monitoring a period of inactivity. A step 555 includes additional analysis of detected information and signals. A step 560 includes further analysis, including motion data and orientation detection, all indicating the person is functioning normally. A step 560 includes determining that a fall has occurred based on the analysis of the motion data, and analysis of a concluded end position and orientation of the person. The sound recording can be de-activated. A step 565 includes concluding that a fall has occurred. A step 570 includes sending an alert and reporting sound recordings. A step 575 includes the fall having been reported. A step 580 includes an acknowledgement of the fall.
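
Viewed at a coarse level, the flow of FIG. 5 is a small state machine driven by acceleration-magnitude deviations and inactivity timers. The sketch below mirrors only the broad outline of that flow; the state names, thresholds, and sample counts are hypothetical and do not correspond to specific values in the described embodiments.

```python
from enum import Enum, auto

class FallState(Enum):
    MONITORING = auto()        # step 510: normal activity monitoring
    PROBABLE_FALL = auto()     # steps 520/530: large acceleration seen, audio recording on
    ANALYZING = auto()         # steps 550/555: prolonged inactivity, analyze motion/orientation
    FALL_REPORTED = auto()     # steps 565/570: fall concluded, alert sent with recording

# Hypothetical thresholds for illustration only.
ACCEL_THRESHOLD_G = 2.5        # deviation of acceleration magnitude that suggests a fall
INACTIVITY_SAMPLES = 50        # consecutive quiet samples before deeper analysis

class FallDetector:
    def __init__(self):
        self.state = FallState.MONITORING
        self.quiet_samples = 0

    def update(self, accel_deviation_g: float) -> FallState:
        """Advance the state machine by one acceleration-magnitude-deviation sample."""
        if self.state is FallState.MONITORING:
            if accel_deviation_g > ACCEL_THRESHOLD_G:
                self.state = FallState.PROBABLE_FALL      # step 520: start audio recording
                self.quiet_samples = 0
        elif self.state is FallState.PROBABLE_FALL:
            if accel_deviation_g > ACCEL_THRESHOLD_G * 0.4:
                self.state = FallState.MONITORING         # step 545: normal movement, audio off
            else:
                self.quiet_samples += 1
                if self.quiet_samples >= INACTIVITY_SAMPLES:
                    self.state = FallState.ANALYZING      # step 550: prolonged inactivity
        elif self.state is FallState.ANALYZING:
            # A real device would analyze end position and orientation here (steps 555-565).
            self.state = FallState.FALL_REPORTED          # step 570: send alert and recordings
        return self.state

if __name__ == "__main__":
    detector = FallDetector()
    samples = [0.2] * 10 + [4.0] + [0.05] * 60    # quiet, then a spike, then stillness
    for deviation in samples:
        state = detector.update(deviation)
    print(state)                                  # FallState.FALL_REPORTED
```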
[0041] FIG. 6 is a flow chart that includes the steps of one example of a method of identifying a type of motion of an animate or inanimate object. A first step 610 includes generating an acceleration signature (for example, a tri-axial signature) based on the sensed acceleration of the object. A second step 620 includes matching the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. A third step 630 includes identifying the type of motion of the object based on the statistical (pattern) matching or exact matching of the acceleration signature. As will be described, the acceleration signal can be created using a wavelet transformation.
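
Putting the three steps of FIG. 6 together, a minimal sketch might combine tri-axial samples into a normalized magnitude signature (step 610), compare it against stored signatures (step 620), and return the closest label (step 630). The magnitude combination, normalization, and nearest-neighbor rule are illustrative choices, not requirements of the described embodiments.

```python
import numpy as np

def znorm(x: np.ndarray) -> np.ndarray:
    """Zero-mean, unit-variance normalization so matching ignores offset and scale."""
    return (x - x.mean()) / (x.std() + 1e-9)

def make_signature(samples_xyz: np.ndarray) -> np.ndarray:
    """Step 610: turn an (N, 3) block of tri-axial samples into a 1-D magnitude signature."""
    return znorm(np.linalg.norm(samples_xyz, axis=1))

def identify_motion(samples_xyz: np.ndarray, stored: dict) -> str:
    """Steps 620-630: the nearest stored signature gives the identified type of motion."""
    sig = make_signature(samples_xyz)
    return min(stored, key=lambda label: np.linalg.norm(sig - stored[label]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical stored signatures (already normalized magnitude traces).
    walk_template = np.sin(np.linspace(0, 6 * np.pi, 100))
    fall_template = np.concatenate([np.zeros(80), 4.0 * np.ones(20)])
    stored = {"walking": znorm(walk_template), "fall": znorm(fall_template)}
    # Simulated tri-axial walk: oscillation plus gravity on Z and a little noise on X/Y.
    samples = np.stack([0.05 * rng.standard_normal(100),
                        0.05 * rng.standard_normal(100),
                        1.0 + 0.4 * np.sin(np.linspace(0, 6 * np.pi, 100))], axis=1)
    print(identify_motion(samples, stored))   # expected: "walking"
```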

[0042] For embodiments, the type of motion includes at least one of atomic motion, elemental motion and macro-motion.

[0043] Though embodiments of generating and matching acceleration signatures are described, it is to be understood that additional or alternate embodiments can include generating and matching of orientation and/or audio signatures. Correspondingly, the first step 610 can include generating an acceleration, orientation and/or audio signature based on the sensed acceleration, the orientation of the object, and audio generated by the object, for example, a thud of a fall, or a cry for help.

[0044] Atomic motion includes but is not limited to a sharp jolt, a gentle acceleration, complete stillness, a light acceleration that becomes stronger, a strong acceleration that fades, a sinusoidal or quasi-sinusoidal acceleration pattern, vehicular acceleration, vehicular deceleration, vehicular left and right turns, and more.

[0045] Elemental motion includes but is not limited to motion patterns for walking, running, fitness motions (e.g., elliptical machine exercises, rowing, stair climbing, aerobics, skipping rope, bicycling ...), vehicular traversal, sleeping, sitting, crawling, turning over in bed, getting out of bed, getting up from a chair, and more.
[0046] Macro-motion includes but is not limited to going for a walk in the park, leaving home and driving to the shopping center, getting out of bed and visiting the bathroom, performing household chores, playing a game of tennis, and more.

[0047] Each of the plurality of stored acceleration signatures corresponds with a particular type of motion. By matching the detected acceleration signature of the object with at least one of a plurality of stored acceleration signatures, an estimate or educated guess can be made about the detected acceleration signature.

[0048] An embodiment includes a common library and a specific library, and matching the acceleration signature includes matching the acceleration signature with stored acceleration signatures of the common library, and then matching the acceleration signature with stored acceleration signatures of the specific library. For a particular embodiment, the general library includes universal acceleration signatures, and the specific library includes personal acceleration signatures. That is, for example, the stored acceleration signatures of the common library are useable for matching acceleration signatures of motions of multiple humans, and the stored acceleration signatures of the specific library are useable for matching acceleration signatures of motions of a particular human. Additionally, each library can be further categorized to reduce the number of possible matches. For example, at an initialization, a user may enter characteristics of the user, such as age, sex, and/or physical characteristics (such as the user has a limp). Thereby, the possible signature matches within the general library can be reduced. The signature entries within the specific library can be learned (built) over time as the human wearing the motion detection device goes through the normal activities of the specific human. The specific library can be added to, and improved, over time.
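
A minimal sketch of the two-tier lookup described above, assuming a simple profile-based filter for the common (general) library and an append-only user-specific library; the field names, the filtering rule, and the distance-based matcher are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional
import numpy as np

@dataclass
class LibraryEntry:
    label: str
    signature: np.ndarray
    # Predicate over a user profile dict (e.g. age, sex, gait); defaults to "applies to everyone".
    applies_to: Callable[[dict], bool] = lambda profile: True

@dataclass
class SignatureLibraries:
    common: List[LibraryEntry] = field(default_factory=list)    # universal signatures
    specific: List[LibraryEntry] = field(default_factory=list)  # learned per-user signatures

    def candidates(self, profile: dict) -> List[LibraryEntry]:
        """Common entries consistent with the user's profile, followed by personal entries."""
        return [e for e in self.common if e.applies_to(profile)] + self.specific

    def learn(self, label: str, signature: np.ndarray) -> None:
        """Add a newly observed motion pattern to the user-specific library."""
        self.specific.append(LibraryEntry(label, signature))

def best_match(window: np.ndarray, entries: List[LibraryEntry]) -> Optional[str]:
    """Nearest-neighbor match over the reduced candidate set."""
    if not entries:
        return None
    return min(entries, key=lambda e: np.linalg.norm(window - e.signature)).label

if __name__ == "__main__":
    libs = SignatureLibraries(common=[
        LibraryEntry("walking", np.sin(np.linspace(0, 6 * np.pi, 64))),
        LibraryEntry("walking_with_limp", np.abs(np.sin(np.linspace(0, 6 * np.pi, 64))),
                     applies_to=lambda p: p.get("limp", False)),
    ])
    profile = {"age": 72, "limp": True}
    observed = np.abs(np.sin(np.linspace(0, 6 * np.pi, 64)))
    print(best_match(observed, libs.candidates(profile)))   # expected: "walking_with_limp"
```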

[0049] An embodiment includes filtering the acceleration signals. Additional embodiments include reducing the number of stored acceleration signature matches by identifying a previous activity of the object, and performing a time-domain analysis on the filtered acceleration signal to identify transient signatures or steady-state signatures of the filtered acceleration signal. That is, by identifying a previous activity (for example, a human walking or sleeping) the possible number of present activities can be reduced, and therefore, the number of possible stored acceleration signature matches reduced. Additionally, the transient and/or steady-state signatures can be used to reduce the number of possible stored acceleration signature matches, which can improve the processing speed.
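
The candidate-reduction idea based on a previous activity can be sketched as a small transition table listing which motions plausibly follow the previously identified activity. The table entries below are illustrative assumptions, not relationships specified by the described embodiments.

```python
# Hypothetical table of which motions plausibly follow a given prior activity.
# In a real device this would be derived from observed behavior, not hard-coded.
PLAUSIBLE_NEXT = {
    "sleeping": {"sleeping", "turning over in bed", "getting out of bed", "falling"},
    "walking":  {"walking", "running", "sitting", "climbing stairs", "falling"},
    "driving":  {"driving", "vehicle collision", "sitting"},
}

def prune_candidates(previous_activity, all_labels):
    """Keep only signature labels that plausibly follow the previous activity.

    Unknown previous activities fall back to the full candidate set, so pruning
    never prevents a match outright; it only reduces the comparisons needed.
    """
    allowed = PLAUSIBLE_NEXT.get(previous_activity)
    return [label for label in all_labels if label in allowed] if allowed else list(all_labels)

if __name__ == "__main__":
    labels = ["walking", "running", "sitting", "driving", "falling", "turning over in bed"]
    print(prune_candidates("sleeping", labels))   # ['falling', 'turning over in bed']
```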
[0050] Another embodiment includes activating audio sensing of the object if matches are made with at least portions of particular stored acceleration signatures. For example, if the acceleration signature exceeds a threshold value, then audio sensing of the object is activated. This is useful because the audio information can provide additional clues as to, for example, the condition of a person. That is, a fall may be detected, and audio information can be used to confirm that a fall has in fact occurred.
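
A minimal sketch of the audio-activation rule, assuming a threshold on acceleration magnitude and a caller-supplied recording callback; both the threshold value and the callback interface are hypothetical.

```python
import numpy as np

ACTIVATION_THRESHOLD_G = 2.5   # hypothetical threshold; not a value from the described embodiments

def maybe_activate_audio(accel_xyz: np.ndarray, start_recording) -> bool:
    """Start audio capture when the acceleration magnitude exceeds the threshold.

    accel_xyz is a (3,) vector in g; start_recording is any zero-argument
    callable supplied by the device (an assumed interface).
    """
    magnitude = float(np.linalg.norm(accel_xyz))
    if magnitude > ACTIVATION_THRESHOLD_G:
        start_recording()
        return True
    return False

if __name__ == "__main__":
    triggered = maybe_activate_audio(np.array([0.1, -3.2, 0.4]),
                                     start_recording=lambda: print("audio recording started"))
    print(triggered)   # True
```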

[0051] An embodiment includes using the sensed audio information to determine, for example, if the user is in pain, unconscious or in a dangerous situation (for example, in a shower or in a fire).
[0052] An embodiment includes the object being associated with a person, and the stored acceleration signatures corresponding with different types of motion related to the person. A particular embodiment includes identifying an activity of the person based on a sequence of identified motions of the person. The activity of the person can include, for example, falling (the most important in some applications), walking, running, driving and more. Furthermore, the activities can be classified as daily living activities, such as walking, running, sitting, sleeping, driving, climbing stairs, and more, or sporadic activities, such as falling, having a car collision, having a seizure and so on.
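
The sequence-to-activity idea can be sketched as checking whether short, ordered motion patterns appear in the stream of recently identified motions. The activity names and patterns below are hypothetical examples, not patterns defined by the described embodiments.

```python
# Hypothetical activity patterns: an activity is recognized when its motion
# subsequence appears, in order, within the recent stream of identified motions.
ACTIVITY_PATTERNS = {
    "going for a walk": ["standing", "walking", "walking"],
    "car trip":         ["walking", "sitting", "vehicular acceleration"],
    "fall at home":     ["walking", "falling", "lying still"],
}

def contains_in_order(stream, pattern):
    """True if all pattern items occur in the stream in the given order (not necessarily adjacent)."""
    iterator = iter(stream)
    return all(item in iterator for item in pattern)   # membership test consumes the iterator

def identify_activity(motion_stream):
    """Return the first activity whose pattern is embedded in the motion stream."""
    for activity, pattern in ACTIVITY_PATTERNS.items():
        if contains_in_order(motion_stream, pattern):
            return activity
    return None

if __name__ == "__main__":
    stream = ["standing", "walking", "falling", "lying still"]
    print(identify_activity(stream))   # "fall at home"
```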

[0053] An embodiment includes transmitting information related to the identified type of motion if matches are made with particular stored acceleration signatures. The information related to the identified type of motion can include motions associated with a person the object is associated with. The motions can include, for example, a heartbeat of the person, muscular spasms, facial twitches, and involuntary reflex movements, which can be sensed by, for example, an accelerometer. Additionally, the information related to the identified type of motion can include at least one of a location of the object, audio sensed by the object, and a temperature of the object.
[0054] Another embodiment includes storing at least one of the plurality of stored acceleration signatures during an initialization cycle. The initialization cycle can be influenced based on what the object is attached to. That is, initializing the stored acceleration signatures (motion patterns) can be based on what the object is attached to, which can reduce the number of signatures required to be stored within, for example, the general library, and therefore, reduce the number of possible matches and reduce the processing required to identify a match. Alternatively or additionally, initializing the stored acceleration signatures can be based on who the object is attached to, which can influence the specific library. The initialization can be used to determine motions unique, for example, to an individual. For example, a unique motion can be identified for a person who walks with a limp, and the device can be initialized with motion patterns of the person walking with a limp.
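
A minimal sketch of such an initialization cycle, assuming that a few labeled repetitions of a user-specific motion are recorded, resampled to a fixed length, and averaged into a personal signature; the averaging approach and fixed window length are assumptions made for the example.

```python
import numpy as np

def initialize_personal_signature(labeled_windows, window_length: int = 128) -> np.ndarray:
    """Build a personal acceleration signature from a few labeled repetitions.

    Each repetition is resampled to a fixed length and the repetitions are
    averaged, so the stored pattern captures what is consistent across them
    (for example, the asymmetric gait of a person who walks with a limp).
    """
    resampled = []
    for window in labeled_windows:
        window = np.asarray(window, dtype=float)
        positions = np.linspace(0, len(window) - 1, window_length)
        resampled.append(np.interp(positions, np.arange(len(window)), window))
    return np.mean(resampled, axis=0)

if __name__ == "__main__":
    # Three hypothetical recordings of the same motion with slightly different durations.
    reps = [np.sin(np.linspace(0, 4 * np.pi, n)) ** 2 for n in (100, 120, 140)]
    personal_signature = initialize_personal_signature(reps)
    print(personal_signature.shape)   # (128,)
```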
[0055] An embodiment includes initiating a low-power sleep mode of the object if sensed acceleration is below a threshold for a predetermined amount of time. That is, if, for example, a person is sensed
