(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2010/0217533 A1
     Nadkarni et al.                          (43) Pub. Date: Aug. 26, 2010

US 20100217533A1

(54) IDENTIFYING A TYPE OF MOTION OF AN OBJECT

(75) Inventors: Vijay Nadkarni, San Jose, CA (US); Jeetendra Jangle, Fremont, CA (US);
                John Bentley, Santa Clara, CA (US); Umang Salgia, Nigadi (IN)

     Correspondence Address:
     Law Office of Brian Short
     P.O. Box 641867
     San Jose, CA 95164-1867 (US)

(73) Assignee: LABURNUM NETWORKS, INC., San Jose, CA (US)

(21) Appl. No.: 12/560,069

(22) Filed: Sep. 15, 2009

Related U.S. Application Data

(60) Provisional application No. 61/208,344, filed on Feb. 23, 2009.

Publication Classification

(51) Int. Cl.
     G01P 15/00    (2006.01)
     G06F 19/00    (2006.01)
     G06F 17/18    (2006.01)

(52) U.S. Cl. ................... 702/19; 702/141; 702/179
(57) ABSTRACT

A method of identifying a type of motion of an animate or inanimate object is disclosed. The method includes generating an acceleration signature based on the sensed acceleration of the object. The acceleration signature is matched with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. The type of motion of the object is identified based on the statistical matching or exact matching of the acceleration signature.
[Representative drawing (FIG. 6): generating an acceleration signature based on the sensed acceleration of the object (610); matching the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion (620); identifying the type of motion of the object based on the statistical matching or exact matching of the acceleration signature (630)]

[FIG. 1 (Sheet 1 of 9): examples of different types of motions of a human being that an attached object can be used to detect or sense]

[FIGS. 2A and 2B (Sheet 2 of 9): acceleration signatures for a slow fall and lying-down summersault (2A) and for slipping and falling on the back on a bouncy surface such as an air mattress (2B)]

[FIG. 2C (Sheet 3 of 9): acceleration signature for a fall on the face with knees flexed]

[FIG. 3 (Sheet 4 of 9): block diagram of a motion detection device, including accelerometers 312, 314, 316, A/D converter 320, compare circuitry 330, signature library 340, audio device 360, GPS 370, a controller, and a communication interface]

[FIG. 4 (Sheet 5 of 9): flowchart of a method for detecting motions of daily living activities and emergency situations, with steps 410-480 (monitoring, instantaneous computations, digital filtering, state analysis, periodic and transient state analysis, macro motion signature formation, learning system, motion database pre-building)]

[FIG. 5 (Sheet 6 of 9): state diagram of a method for detecting a fall, with states 510-580 (monitoring, deviation detection, probable fall with audio recording, inactivity analysis, fall detected, alert sent, fall reported and acknowledged)]

[FIG. 6 (Sheet 7 of 9): flowchart of a method of identifying a type of motion of an animate or inanimate object, with steps 610, 620 and 630]

[FIG. 7 (Sheet 8 of 9): flowchart of a motion detection device determining available network connections (710) and distributing acceleration signature matching processing over them (720)]

[FIG. 8 (Sheet 9 of 9): a motion detection device connectable to one of multiple networks (Bluetooth 810, cellular 820, home base station 840), each with an associated processor, or operating with no network available]

IDENTIFYING A TYPE OF MOTION OF AN OBJECT

RELATED APPLICATIONS

[0001] This patent application claims priority to U.S. provisional patent application Ser. No. 61/208,344 filed on Feb. 23, 2009, which is incorporated by reference.

FIELD OF THE DESCRIBED EMBODIMENTS

[0002] The described embodiments relate generally to motion detecting. More particularly, the described embodiments relate to a method and apparatus for identifying a type of motion of an animate or inanimate object.

BACKGROUND

[0003] There is an increasing need for remote monitoring of individuals, animals and inanimate objects in their daily or natural habitats. Many seniors live independently and need to have their safety and wellness tracked. A large percentage of society is fitness conscious, and desires to have, for example, workouts and exercise regimens assessed. Public safety officers, such as police and firemen, encounter hazardous situations on a frequent basis, and need their movements, activities and location to be mapped out precisely.

[0004] The value in such knowledge is enormous. Physicians, for example, like to know their patients' sleeping patterns so they can treat sleep disorders. A senior living independently wants peace of mind that if he has a fall it will be detected automatically and help summoned immediately. A fitness enthusiast wants to track her daily workout routine, capturing the various types of exercises, intensity, duration and caloric burn. A caregiver wants to know that her father is living an active, healthy lifestyle and taking his daily walks. The police would like to know instantly when someone has been involved in a car collision, and whether the victims are moving or not.

[0005] Existing products for the detection of animate and inanimate motions are simplistic in nature, and incapable of interpreting anything more than simple atomic movements, such as jolts, changes in orientation and the like. It is not possible to draw reliable conclusions about human behavior from these simplistic assessments.

[0006] It is desirable to have an apparatus and method that can accurately monitor motion of either animate or inanimate objects.

SUMMARY

[0007] An embodiment includes a method of identifying a type of motion of an animate or inanimate object. The method includes generating an acceleration signature based on the sensed acceleration of the object. The acceleration signature is matched with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. The type of motion of the object is identified based on the statistical matching or exact matching of the acceleration signature.

[0008] Another embodiment includes a method of identifying a type of motion of a person. The method includes generating an acceleration signature based on the sensed acceleration of an object attached to the person. The acceleration signature is matched with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. The type of motion of the object is identified based on the statistical matching or exact matching of the acceleration signature.

[0009] Other aspects and advantages of the described embodiments will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the described embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS

[0010] FIG. 1 shows examples of different types of motions of a human being that an object attached to the human being can be used to detect or sense.

[0011] FIGS. 2A, 2B, 2C show examples of time-lines of several different acceleration curves (signatures), wherein each signature is associated with a different type of sensed or detected motion.

[0012] FIG. 3 is an example of a block diagram of a motion detection device.

[0013] FIG. 4 is a flowchart that includes the steps of an example of a method for detecting various motions of daily living activities and emergency situations, such as a fall.

[0014] FIG. 5 is a flowchart that includes the steps of a method for detection of a fall.

[0015] FIG. 6 is a flow chart that includes the steps of one example of a method of identifying a type of motion of an animate or inanimate object.

[0016] FIG. 7 is a flow chart that includes steps of one example of a method of a motion detection device checking network availability for improvements in speed and/or processing power of acceleration signature matching.

[0017] FIG. 8 shows a motion detection device that can be connected to one of multiple networks.

DETAILED DESCRIPTION

[0018] The monitoring of human activities generally falls into three categories: safety, daily lifestyle, and fitness. By carefully interpreting human movements it is possible to draw accurate and reasonably complete inferences about the state of well being of individuals. A high degree of sophistication is required in these interpretations. Simplistic assessments of human activity lead to inaccurate determinations, and ultimately are of questionable value. By contrast, a comprehensive assessment leads to an accurate interpretation and can prove to be indispensable in tracking the well being and safety of the individual.

[0019] To draw accurate inferences about the behavior of humans, it turns out that the atomic movements become simply alphabets that include elemental motions. Furthermore, specific sequences of elemental motions become the vocabulary that comprises human behavior. As an example, take the case of a person who leaves the home and drives to the shopping center. In such a scenario, the behavioral pattern of the person is walking to the door of the house, opening and closing the door, walking further to the car, settling down in the car, starting the engine, accelerating the car, going through a series of stops, starts and turns, parking the car, getting out and closing the car door, and finally walking to the shopping center. This sequence of human behavior is comprised of individual motions such as standing, walking, sitting, accelerating (in the car), decelerating, and turning left or right. Each individual motion, for example walking, is comprised of multiple atomic movements such as acceleration in an upward direction, acceleration in a downward direction, a modest forward acceleration with each step, a modest deceleration with each step, and so on.
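
The alphabet/vocabulary hierarchy described above can be pictured as a simple nesting of data structures. The Python sketch below is purely illustrative; the class names and the example sequence are assumptions, not structures from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the motion hierarchy: atomic movements are the
# "alphabet", elemental motions are "words", and a sequence of elemental
# motions forms a macro-motion ("sentence") describing behavior.

@dataclass
class AtomicMovement:
    name: str                              # e.g. "upward acceleration"

@dataclass
class ElementalMotion:
    name: str                              # e.g. "walking", "sitting"
    atoms: List[AtomicMovement] = field(default_factory=list)

@dataclass
class MacroMotion:
    name: str                              # e.g. "driving to the shopping center"
    sequence: List[ElementalMotion] = field(default_factory=list)

# Hypothetical example mirroring the trip described in paragraph [0019].
walking = ElementalMotion("walking", [AtomicMovement("upward acceleration"),
                                      AtomicMovement("downward acceleration"),
                                      AtomicMovement("forward step acceleration")])
driving = ElementalMotion("vehicular traversal")
trip = MacroMotion("leave home and drive to shopping center", [walking, driving, walking])
```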
[0020] With written prose, letters by themselves convey almost no meaning at all. Words taken independently convey individual meaning, but do not provide the context to comprehend the situation. It takes a complete sentence to obtain that context. Along the same line of reasoning, it requires a comprehension of a complete sequence of movements to be able to interpret human behavior.

[0021] Although there is an undeniable use for products that are able to detect complex human movements accurately, the key to the success of such technologies lies in whether users adopt them or not. The technology needs to capture a wide range of human activities. The range of movements should ideally extend to all types of daily living activities that a human being expects to encounter—sleeping, standing, walking, running, aerobics, fitness workouts, climbing stairs, vehicular movements, falling, jumping and colliding, to name some of the more common ones.

[0022] It is important to detect human activities with a great deal of precision. In particular, activities that relate to safety, fitness, vehicular movements, and day to day lifestyle patterns such as walking, sleeping, climbing stairs, are important to identify precisely. For example, it is not enough to know that a person is walking. One needs to know the pace and duration of the walk, and additional knowledge of gait, unsteadiness, limping, cadence and the like are important.

[0023] It is critical that false positives as well as false negatives be eliminated. This is especially important for cases of safety, such as falls, collisions, and the like. Human beings come in all types—short, tall, skinny, obese, male, female, athletic, couch potato, people walking with a stick/rolator, people with disabilities, old and young. The product needs to be able to adapt to their individuality and lifestyle.

[0024] The embodiments described provide identification of types of motion of an animate or inanimate object. Motion is identified by generating acceleration signatures based on the sensed acceleration of the object. The acceleration signatures are compared with a library of motion signatures, allowing the motion of the object to be identified. Further, sequences of the motions can be determined, allowing identification of activities of, for example, a person the object is attached to.

[0025] Just as the handwritten signatures of a given human being are substantively similar from one signature instance to the next, yet have minor deviations with each new instance, so too will the motion signatures of a given human be substantively similar from one motion instance to the next, yet have minor deviations.

[0026] Algorithms used for pattern recognition (signature matching) should have the sophistication to accurately handle a wide range of motions. Such algorithms should have the ability to recognize the identical characteristics of a particular motion by a given human being, yet allow for minor variations arising from human randomness. Additionally, the devices used to monitor peoples' movement need to be miniature and easy to wear. These two objectives are fundamentally opposed. However, the described embodiments provide a single cohesive system that is both sophisticated enough to detect a wide range of motions and small enough to be easily worn.

[0027] FIG. 1 shows examples of different types of motions of a human being that an object attached to the human being can be used to detect or sense. The human motions can include, for example, standing, sleeping, walking, and running. A first motion 110 can include walking. A second motion 120 can include falling. A third motion 130 can include running. Each of the motions generates a unique motion signature. As will be described, the signatures can be universal to, for example, many individuals. Additionally, the signatures can have additional characteristics that are unique to an individual.

[0028] FIGS. 2A, 2B, 2C show examples of different types of acceleration and orientation signatures for various sample motions by human beings. It should be noted that these signatures are expected to have certain components that are common from one human being to the next, but also have certain components that vary from one human to the next.

[0029] The signatures of FIGS. 2A, 2B, 2C are depicted in only one orientation. That is, three accelerometers can be used to generate acceleration signatures in the X, Y and Z (three) orientations. The signatures of FIGS. 2A, 2B, 2C only show the signature of one of the three orientations. It is to be understood that matching can use the other orientations as well.

[0030] FIG. 2A shows an example of an acceleration signature of a person doing a slow fall and lying down summersault. FIG. 2B shows an example of an acceleration signature of a person slipping and falling back on a bouncy surface (for example, an air mattress). FIG. 2C shows an acceleration signature of a person falling on their face with their knees flexed. By matching an acceleration signature that has been generated by sensing the motion of a person with one of many stored signatures, the motion of the person can be determined.
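
For illustration only, the matching step described above could be approximated by comparing a sensed tri-axial acceleration trace against each stored signature with a simple normalized distance. The library format (a dict mapping motion type to an N x 3 NumPy array) and the distance metric are assumptions for this sketch, not the matching technique the disclosure later describes (wavelet features and statistical models).

```python
import numpy as np

def match_signature(signature, library, threshold=0.2):
    """Compare a tri-axial acceleration signature (shape: N x 3) against a
    dict of stored signatures and return the best-matching motion type,
    or None if no stored signature is close enough.

    The normalized mean-squared-error metric and threshold are illustrative.
    """
    best_type, best_score = None, np.inf
    for motion_type, stored in library.items():
        # Resample the stored signature to the same length as the input
        # so the two traces can be compared sample by sample.
        idx = np.linspace(0, len(stored) - 1, num=len(signature))
        resampled = np.stack([np.interp(idx, np.arange(len(stored)), stored[:, ax])
                              for ax in range(3)], axis=1)
        score = np.mean((signature - resampled) ** 2) / (np.mean(stored ** 2) + 1e-9)
        if score < best_score:
            best_type, best_score = motion_type, score
    return best_type if best_score < threshold else None
```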
[0031] FIG. 3 is an example of a block diagram of a motion detection device. The motion detection device can be attached to an object, and therefore, detect motion of the object that can be identified. Based on the identified motion, estimates of the behavior and conditions of the object can be determined.

[0032] The motion detection device includes sensors (such as, accelerometers) that detect motion of the object. One embodiment of the sensors includes accelerometers 312, 314, 316 that can sense, for example, acceleration of the object in X, Y and Z directional orientations. It is to be understood that other types of motion detection sensors can alternatively be used.

[0033] An analog to digital converter (ADC) digitizes analog accelerometer signals. The digitized signals are received by compare processing circuitry 330 that compares the digitized accelerometer signals with signatures that have been stored within a library of signatures 340. Each signature corresponds with a type of motion. Therefore, when a match is found between the digitized accelerometer signals and a signature stored in the library 340, the type of motion experienced by the motion detection device can be determined.

[0034] An embodiment includes filtering the accelerometer signals before attempting to match the signatures. Additionally, the matching process can be made simpler by reducing the possible signature matches.
[0035] An embodiment includes identifying a previous human activity context. That is, for example, by knowing that the previous human activity was walking, certain signatures can intelligently be eliminated from the possible matches of the present activity that occurs subsequent to the previous human activity (walking).
[0036] An embodiment includes additionally reducing the number of possible signature matches by performing a time-domain analysis on the accelerometer signal. The time-domain analysis can be used to identify a transient or steady-state signature of the accelerometer signal. That is, for example, a walk may have a prominent steady-state signature, whereas a fall may have a prominent transient signature. Identification of the transient or steady-state signature of the accelerometer signal can further reduce or eliminate the number of possible signature matches, and therefore, make the task of matching the accelerometer signature with a signature within the library of signatures simpler, and easier to accomplish. More specifically, the required signal processing is simpler, easier, and requires less computing power.
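
One simple way such a time-domain analysis could be realized is to compare short-window energies of the acceleration magnitude: a trace dominated by one brief burst looks transient (fall-like), while a trace with roughly uniform window energies looks steady-state (walk-like). The window length and ratio threshold in the sketch below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def classify_time_domain(accel_mag, window=50, transient_ratio=4.0):
    """Label an acceleration-magnitude trace as 'transient' or 'steady-state'.

    A trace whose peak short-window energy greatly exceeds its median
    short-window energy is treated as transient (fall-like); otherwise it
    is treated as steady-state (walk-like). The ratio test is illustrative.
    """
    n = len(accel_mag) // window
    energies = np.array([np.sum(np.asarray(accel_mag[i * window:(i + 1) * window]) ** 2)
                         for i in range(n)])
    if len(energies) == 0:
        return "steady-state"
    ratio = energies.max() / (np.median(energies) + 1e-9)
    return "transient" if ratio > transient_ratio else "steady-state"

# The label can then be used to prune the signature library before matching,
# e.g. only fall-like signatures are considered when 'transient' is returned.
```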
[0037] Upon detection of certain types of motion, an audio device 360 and/or a global positioning system (GPS) 370 can be engaged to provide additional information that can be used to determine the situation of, for example, a human being the motion detection device is attached to.

[0038] The condition, or information relating to the motion detection device, can be communicated through a wired or wireless connection. A receiver of the information can process it, and make a determination regarding the status of the human being the motion detection device is attached to.
[0039] FIG. 4 is a flowchart that includes the steps of an example of a method for detecting various motions of daily living activities and emergency situations, such as, a fall. A first step 410 includes monitoring an activity of a person the motion detection device is attached to. Raw signal data is collected from, for example, an accelerometer sensor. A second step 420 includes performing instantaneous computations over raw signals to compute atomic motions along with a gravity vector and tilt vector. A third step 430 includes applying a series of digital filters to remove noise in the atomic motions data. A fourth step 440 includes performing state analysis on a series of atomic data samples and forming context. Depending on the state analysis, the series of atomic data is passed through either a step 445 periodic or steady state data analysis or a step 450 transient state data analysis. A sixth step 460 includes formation of macro motion signatures. The macro motion signatures are built from an output of state analysis vectors using known wavelet transformation techniques (for example, a Haar Transform). The transform performs pattern matching on the current motion pattern with an existing motion pattern library using, for example, DWT (Discrete Wavelet Transform) techniques. Complex motion wavelets are later matched using statistical pattern matching techniques, such as, HHMM (Hidden Heuristic Markov Model). The statistical pattern matching includes detecting and classifying events of interest. The events of interest are built by observing various motions and orientation states data of an animate or inanimate object. This data is used to train the statistical model which performs the motion/activity detection. Each activity will have its own model trained based on the observed data. A seventh step 470 includes a learning system providing the right model for the user from a set of models. It also aids in building newer (personal) patterns which are not in the library for the person who is wearing the motion detection device. An eighth step 480 includes pre-building a motion database of motion libraries against which motion signatures are compared. The database adds new motion/state signatures dynamically as they are identified.
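
Paragraph [0039] names a Haar wavelet transform for forming macro motion signatures. As a rough illustration (not the disclosed implementation), a single-level Haar discrete wavelet transform can be written directly in NumPy; the number of decomposition levels and the choice of the approximation coefficients as the signature vector are assumptions.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; the input is truncated
    to an even number of samples before pairing.
    """
    x = np.asarray(x, dtype=float)
    x = x[: (len(x) // 2) * 2].reshape(-1, 2)
    approx = (x[:, 0] + x[:, 1]) / np.sqrt(2.0)
    detail = (x[:, 0] - x[:, 1]) / np.sqrt(2.0)
    return approx, detail

def macro_signature(accel_mag, levels=3):
    """Form a compact macro-motion signature by repeatedly keeping the Haar
    approximation coefficients (an illustrative choice of feature vector)."""
    signature = np.asarray(accel_mag, dtype=float)
    for _ in range(levels):
        signature, _ = haar_dwt(signature)
    return signature
```

The resulting feature vector could then be fed to whatever per-activity statistical model the system trains; that training step is not shown here.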
[0040] FIG. 5 is a flowchart that includes the steps of an example of a method for detecting a fall. A first step 510 includes monitoring an activity of, for example, a person the motion detection device is attached to. A step 515 includes recording and reporting deviations in normal motion patterns of the person. A step 520 includes detecting the acceleration magnitude deviation exceeding a threshold. The acceleration magnitude deviation exceeding the threshold can be sensed as a probable fall, and audio recording is initiated. Upon detection of this condition, sound recording of the person the motion detection device is connected to can be activated. The activation of sound can provide additional information that can be useful in assessing the situation of the person. A step 530 includes monitoring the person after the probable fall. A step 525 includes detection of another acceleration having magnitude lesser than the threshold, and continuing monitoring of audio. A step 535 includes detecting a short period of inactivity. A step 540 includes monitoring the person after determining a fall probably occurred. A step 545 includes subsequently detecting normal types of motion and turning off the audio because the person seems to be performing normal activity. A step 550 includes monitoring a period of inactivity. A step 555 includes additional analysis of detected information and signals. A step 560 includes further analysis including motion data and orientation detection, all indicating the person is functioning normally. A step 560 includes determining that a fall has occurred based on the analysis of the motion data, and analysis of a concluded end position and orientation of the person. The sound recording can be de-activated. A step 565 includes concluding that a fall has occurred. A step 570 includes sending an alert and reporting sound recordings. A step 575 includes the fall having been reported. A step 580 includes an acknowledgement of the fall.
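
The fall-detection flow described above can be read as a small state machine. The following Python sketch is an illustration only: it collapses several of the numbered steps into a handful of states, and the boolean inputs (threshold exceedance, inactivity, normal-activity analysis) are assumed to be produced by the analyses described in this paragraph.

```python
from enum import Enum, auto

class FallState(Enum):
    MONITOR = auto()            # step 510: normal monitoring
    PROBABLE_FALL = auto()      # steps 520/530: threshold exceeded, audio recording on
    ANALYSIS = auto()           # step 555: analyze motion and end position
    FALL_DETECTED = auto()      # step 565: conclude a fall occurred
    ALERT_SENT = auto()         # step 570: alert and recordings sent

def next_state(state, accel_exceeds_threshold, inactive, looks_normal):
    """Paraphrased transitions of FIG. 5; simplified for illustration."""
    if state is FallState.MONITOR and accel_exceeds_threshold:
        return FallState.PROBABLE_FALL          # start audio recording (step 520)
    if state is FallState.PROBABLE_FALL:
        if looks_normal:
            return FallState.MONITOR            # normal movement, stop audio (step 545)
        if inactive:
            return FallState.ANALYSIS           # period of inactivity (steps 535/550)
    if state is FallState.ANALYSIS:
        return FallState.FALL_DETECTED if not looks_normal else FallState.MONITOR
    if state is FallState.FALL_DETECTED:
        return FallState.ALERT_SENT             # send alert with recordings (step 570)
    return state
```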
[0041] FIG. 6 is a flow chart that includes the steps of one example of a method of identifying a type of motion of an animate or inanimate object. A first step 610 includes generating an acceleration signature (for example, a tri-axial signature) based on the sensed acceleration of the object. A second step 620 includes matching the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. A third step 630 includes identifying the type of motion of the object based on the statistical (pattern) matching or exact matching of the acceleration signature. As will be described, the acceleration signal can be created using a wavelet transformation.

[0042] For embodiments, the type of motion includes at least one of atomic motion, elemental motion and macro-motion.

[0043] Though embodiments of generating and matching acceleration signatures are described, it is to be understood that additional or alternate embodiments can include generating and matching of orientation and/or audio signatures. Correspondingly, the first step 610 can include generating an acceleration signature, (and/or) orientation and audio signature based on the sensed acceleration, orientation of the object and audio generated by the object, for example, a thud of a fall, or a cry for help.
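
For illustration, the combined signatures mentioned here could be carried together in a small container type; the field names and their optionality below are assumptions, not structures from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class MotionSignature:
    """Illustrative container for the signatures mentioned in paragraph [0043].

    Only the acceleration signature is required; orientation and audio
    signatures are optional extras.
    """
    acceleration: Sequence[float]                  # tri-axial magnitudes or raw samples
    orientation: Optional[Sequence[float]] = None  # e.g. tilt/gravity vector over time
    audio: Optional[Sequence[float]] = None        # e.g. samples recorded around an event
```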
[0044] Atomic motion includes but is not limited to a sharp jolt, a gentle acceleration, complete stillness, a light acceleration that becomes stronger, a strong acceleration that fades, a sinusoidal or quasi-sinusoidal acceleration pattern, vehicular acceleration, vehicular deceleration, vehicular left and right turns, and more.

[0045] Elemental motion includes but is not limited to motion patterns for walking, running, fitness motions (e.g. elliptical machine exercises, rowing, stair climbing, aerobics, skipping rope, bicycling ...), vehicular traversal, sleeping, sitting, crawling, turning over in bed, getting out of bed, getting up from a chair, and more.
[0046] Macro-motion includes but is not limited to going for a walk in the park, leaving home and driving to the shopping center, getting out of bed and visiting the bathroom, performing household chores, playing a game of tennis, and more.

[0047] Each of the plurality of stored acceleration signatures corresponds with a particular type of motion. By matching the detected acceleration signature of the object with at least one of a plurality of stored acceleration signatures, an estimate or educated guess can be made about the detected acceleration signature.
[0048] An embodiment includes a common library and a specific library, and matching the acceleration signature includes matching the acceleration signature with stored acceleration signatures of the common library, and then matching the acceleration signature with stored acceleration signatures of the specific library. For a particular embodiment, the general library includes universal acceleration signatures, and the specific library includes personal acceleration signatures. That is, for example, the stored acceleration signatures of the common library are useable for matching acceleration signatures of motions of multiple humans, and the stored acceleration signatures of the specific library are useable for matching acceleration signatures of motions of a particular human. Additionally, each library can be further categorized to reduce the number of possible matches. For example, at an initialization, a user may enter characteristics of the user, such as, age, sex, and/or physical characteristics (such as, the user has a limp). Thereby, the possible signature matches within the general library can be reduced. The signature entries within the specific library can be learned (built) over time as the human wearing the motion detection device goes through the normal activities of the specific human. The specific library can be added to, and improved over time.
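
A minimal sketch of the two-stage lookup described in this paragraph is shown below. The helper `match_fn` and the decision to accept the first sufficiently good match are assumptions; an implementation might instead let the personal library refine or override a common-library match.

```python
def match_with_libraries(signature, common_library, specific_library, match_fn):
    """Illustrative two-stage lookup: first the common (universal) library,
    then the specific (personal) library.

    `match_fn(signature, library)` is an assumed helper that returns a
    matched motion type or None; it could be the distance-based matcher
    sketched earlier in this description.
    """
    for library in (common_library, specific_library):
        result = match_fn(signature, library)
        if result is not None:
            return result
    return None
```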
[0049] An embodiment includes filtering the acceleration signals. Additional embodiments include reducing the number of stored acceleration signature matches by identifying a previous activity of the object, and performing a time domain analysis on the filtered acceleration signal to identify transient signatures or steady-state signatures of the filtered acceleration signal. That is, by identifying a previous activity (for example, a human walking or sleeping) the possible number of present activities can be reduced, and therefore, the number of possible stored acceleration signature matches reduced. Additionally, the transient and/or steady-state signatures can be used to reduce the number of possible stored acceleration signature matches, which can improve the processing speed.
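
The candidate-pruning idea described here can be illustrated with a small transition table keyed by the previous activity; the table contents and the dictionary-based library format are assumptions for this sketch only.

```python
# Illustrative pruning of candidate signatures using the previous activity.
# The table lists which activities plausibly follow a given previous activity.
PLAUSIBLE_NEXT = {
    "sleeping": {"sleeping", "turning over in bed", "getting out of bed"},
    "walking": {"walking", "running", "sitting", "falling", "climbing stairs"},
    "driving": {"driving", "sitting", "collision"},
}

def prune_candidates(library, previous_activity):
    """Keep only stored signatures whose motion type can plausibly follow
    the previous activity; fall back to the full library if unknown."""
    allowed = PLAUSIBLE_NEXT.get(previous_activity)
    if not allowed:
        return library
    return {motion: sig for motion, sig in library.items() if motion in allowed}
```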
[0050] Another embodiment includes activating audio sensing of the object if matches are made with at least portions of particular stored acceleration signatures. For example, if the acceleration signature exceeds a threshold value, then audio sensing of the object is activated. This is useful because the audio information can provide additional clues as to, for example, the condition of a person. That is, a fall may be detected, and audio information can be used to confirm that a fall has in fact occurred.

[0051] Another embodiment includes transmitting the sensed audio. For example, if a user wearing the object has fallen, and the fall has been detected, audio information can be very useful for determining the condition of the user. The audio information can allow a receiver of the audio information to determine, for example, if the user is in pain, unconscious or in a dangerous situation (for example, in a shower or in a fire).
[0052] An embodiment includes the object being associated with a person, and the stored acceleration signatures corresponding with different types of motion related to the person. A particular embodiment includes identifying an activity of the person based on a sequence of identified motions of the person. The activity of the person can include, for example, falling (the most important in some applications), walking, running, driving and more. Furthermore, the activities can be classified as daily living activities such as walking, running, sitting, sleeping, driving, climbing stairs, and more, or sporadic activities, such as falling, having a car collision, having a seizure and so on.

[0053] An embodiment includes transmitting information related to the identified type of motion if matches are made with particular stored acceleration signatures. The information related to the identified type of motion can include at least one of motions associated with a person the object is associated with. The motions can include, for example, a heartbeat of the person, muscular spasms, facial twitches, and involuntary reflex movements which can be sensed by, for example, an accelerometer. Additionally, the information related to the identified type of motion can include at least one of location of the object, audio sensed by the object, and temperature of the object.
[0054] Another embodiment includes storing at least one of the plurality of stored acceleration signatures during an initialization cycle. The initialization cycle can be influenced based on what the object is attached to. That is, initializing the stored acceleration signatures (motion patterns) can be based on what the object is attached to, which can both reduce the number of signatures required to be stored within, for example, the general library, and therefore, reduce the number of possible matches and reduce the processing required to identify a match. Alternatively or additionally, initializing the stored acceleration signatures can be based on who the object is attached to, which can influence the specific library. The initialization can be used to determine motions unique, for example, to an individual. For example, a unique motion can be identified for a person who walks with a limp, and the device can be initialized with motion patterns of the person walking with a limp.
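
As an illustration of such an initialization cycle, the sketch below populates the specific (personal) library from labeled recordings captured while the wearer performs known motions; the function names and the shape of the inputs are assumptions, not part of the disclosure.

```python
def initialize_personal_library(specific_library, labeled_recordings, signature_fn):
    """Illustrative initialization cycle for the specific (personal) library.

    `labeled_recordings` is an assumed iterable of (motion_label, raw_samples)
    pairs captured while the wearer performs known motions (e.g. walking with
    a limp); `signature_fn` turns raw samples into a stored signature, for
    example the Haar-based macro_signature sketched earlier.
    """
    for motion_label, raw_samples in labeled_recordings:
        specific_library[motion_label] = signature_fn(raw_samples)
    return specific_library
```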
[0055] An embodiment includes initiating a low-power sleep mode of the object if sensed acceleration is below a threshold for a predetermined amount of time. That is,
