US008712723B1

Kahn et al.

(10) Patent No.: US 8,712,723 B1
(45) Date of Patent: *Apr. 29, 2014

(54) HUMAN ACTIVITY MONITORING DEVICE
`(75)
`
`Inventors:
`
`Philippe Kahn, Aptos, CA (US);
`Arthur Kinsolving, Santa Cruz, CA
`(US); Mark Andrew Christensen, Santa
`Cruz, CA (US); Brian Y. Lee, Aptos, CA
`(US); David Vogel, Santa Cruz, CA (US)
`DP Technologies, Inc., Scotts Valley, CA
`(US)
`Subject to any disclaimer, the term of this
`patent is extended or adjusted under 35
`U.S.C. 154(b) by 115 days.
`This patent is Subject to a terminal dis
`claimer.
`
`(73)
`
`Assignee:
`
`(*)
`
`Notice:
`
`(21)
`(22)
`
`(63)
`
`(51)
`
`(52)
`
`(58)
`
`Appl. No.:
`
`13/018,321
`
`Filed:
`
`Jan. 31, 2011
`
`Related U.S. Application Data
`Continuation of application No. 12/694,135, filed on
`Jan. 26, 2010, now Pat. No. 7,881,902, which is a
`continuation of application No. 1 1/644,455, filed on
`Dec. 22, 2006, now Pat. No. 7,653,508.
`
`(2006.01)
`(2006.01)
`(2006.01)
`(2006.01)
`
`Int. Cl.
`GOIC 22/00
`GOIPI3/00
`G06F 9/00
`G06F 7/40
`U.S. C.
`USPC ............. 702/160; 73/1.79; 377/242; 702/97;
`702/187; 702/189: 708/105: 708/200
`Field of Classification Search
`USPC ..... - - - - - 33/700, 701; 73/1.01, 1.37, 1.38, 1.75,
`73/1.76, 1.77, 1.78, 1.79, 181,432.1,
`73/865.4, 865.8: 377/1, 13, 15, 17, 19, 20,
`
`377/24, 24.1, 24.2: 702/1, 85,97, 104, 127,
`702/141, 150, 155, 158, 160, 187, 189:
`708/100, 101, 105, 131, 160, 200, 212
`IPC ..... G01B5/005/02; G01C 22/00, 25/00; G01D
`7/00; G01P 13/00; G06F 11/00, 11/30, 11/32,
`G06F 17/00, 17/40, 19/00
`See application file for complete search history.
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
`4,285,041 A
`4,578,769 A
`5,446,725 A
`5,446,775 A
`
`8, 1981 Smith
`3, 1986 Frederick
`8, 1995 Ishiwatari
`8/1995 Wright et al.
`(Continued)
`
`FOREIGN PATENT DOCUMENTS
`
`JP
`
`2005-309691. A * 11/2005
`
OTHER PUBLICATIONS

Cheng, et al., "Periodic Human Motion Description for Sports Video Databases," Proceedings of the Pattern Recognition, 2004, 5 pages.
(Continued)

Primary Examiner — Edward Cosimano
(74) Attorney, Agent, or Firm — Blakely, Sokoloff, Taylor & Zafman LLP; Judith A. Szepesi
`(57)
`ABSTRACT
`A method for monitoring human activity using an inertial
`sensor includes continuously determining an orientation of
`the inertial sensor, assigning a dominant axis, updating the
`dominant axis as the orientation of the inertial sensor
`changes, and counting periodic human motions by monitor
`ing accelerations relative to the dominant axis.
`19 Claims, 9 Drawing Sheets
`
[Representative drawing, FIG. 8: flow diagram 800 — take measurement(s) of acceleration data (805); filter measurement(s) (810); orient device by assigning dominant axis (812); if the measurement(s) fall within the cadence window (815), acceleration along the dominant axis exceeds the lower threshold (820), acceleration exceeds the previous measurement(s) (825), and acceleration is below the upper threshold (830), a step is counted (835); otherwise no step is counted (840).]
`
(56) References Cited

U.S. PATENT DOCUMENTS
`
5,485,402 A * 1/1996 Smith et al. .............. 702/160
5,583,776 A 12/1996 Levi et al.
5,593,431 A 1/1997 Sheldon
5,654,619 A 8/1997 Iwashita
5,778,882 A 7/1998 Raymond et al.
5,955,667 A 9/1999 Fyfe
5,976,083 A 11/1999 Richardson et al.
6,013,007 A 1/2000 Root et al.
6,122,595 A 9/2000 Varley et al.
6,135,951 A 10/2000 Richardson et al.
6,145,389 A 11/2000 Ebeling et al.
6,282,496 B1 8/2001 Chowdhary
6,353,449 B1 3/2002 Gregg et al.
6,369,794 B1 4/2002 Sakurai et al.
6,428,490 B1 8/2002 Kramer et al.
6,493,652 B1 12/2002 Ohlenbusch et al.
6,496,695 B1 12/2002 Kouji et al.
6,513,381 B2 2/2003 Fyfe et al.
6,522,266 B1 2/2003 Soehren et al.
6,532,419 B1 3/2003 Begin et al.
6,539,336 B1 3/2003 Vock et al.
6,611,789 B1 8/2003 Darley
6,700,499 B2 3/2004 Kubo et al.
6,771,250 B1 8/2004 Oh
6,786,877 B2 9/2004 Foxlin
6,790,178 B1 9/2004 Mault et al.
6,813,582 B2 11/2004 Levi et al.
6,823,036 B1 11/2004 Chen
6,826,477 B2 11/2004 Ladetto et al.
6,836,744 B1 12/2004 Asphahani et al.
6,881,191 B2 4/2005 Oakley et al.
6,885,971 B2 4/2005 Vock et al.
6,898,550 B1 5/2005 Blackadar et al.
6,928,382 B2 8/2005 Hong et al.
6,941,239 B2 9/2005 Unuma et al.
6,959,259 B2 10/2005 Vock et al.
6,975,959 B2 12/2005 Dietrich et al.
7,010,332 B1 3/2006 Irvin et al.
7,054,784 B2 5/2006 Flentov et al.
7,057,551 B1 6/2006 Vogt
7,072,789 B2 7/2006 Vock et al.
7,092,846 B2 8/2006 Vock et al.
7,148,797 B2 12/2006 Albert
7,158,912 B2 1/2007 Vock et al.
7,169,084 B2 1/2007 Tsuji
7,171,331 B2 1/2007 Vock et al.
7,177,684 B1 2/2007 Kroll et al.
7,200,517 B2 4/2007 Darley et al.
7,212,943 B2 5/2007 Aoshima et al.
7,220,220 B2 5/2007 Stubbs et al.
7,297,088 B2 11/2007 Tsuji
7,305,323 B2 * 12/2007 Skvortsov et al. .......... 702/160
7,328,611 B2 2/2008 Klees et al.
7,334,472 B2 2/2008 Seo et al.
7,353,112 B2 4/2008 Choi et al.
7,387,611 B2 6/2008 Inoue et al.
7,428,471 B2 * 9/2008 Darley et al. .............. 702/182
7,451,056 B2 11/2008 Flentov et al.
7,457,719 B1 11/2008 Kahn et al.
7,463,997 B2 * 12/2008 Pasolini et al. ............ 702/160
7,467,060 B2 12/2008 Kulach et al.
7,512,515 B2 3/2009 Vock et al.
7,526,402 B2 4/2009 Tanenhaus et al.
7,608,050 B2 10/2009 Shugg
7,617,071 B2 * 11/2009 Darley et al. .............. 702/165
7,640,134 B2 * 12/2009 Park et al. ................ 702/141
7,640,804 B2 1/2010 Daumer et al.
7,647,196 B2 1/2010 Kahn et al.
7,653,508 B1 1/2010 Kahn et al.
7,690,556 B1 4/2010 Kahn et al.
7,753,861 B1 7/2010 Kahn et al.
7,788,059 B1 * 8/2010 Kahn et al. ............... 702/141
7,857,772 B2 12/2010 Bouvier et al.
7,881,902 B1 * 2/2011 Kahn et al. ............... 702/160
7,892,080 B1 2/2011 Dahl
7,962,312 B2 * 6/2011 Darley et al. ............. 702/165
7,987,070 B2 * 7/2011 Kahn et al. ............... 702/160
8,187,182 B2 * 5/2012 Kahn et al. ............... 600/300
2002/0023654 A1 2/2002 Webb
2002/0089425 A1 7/2002 Kubo et al.
2002/0109600 A1 8/2002 Mault et al.
2002/0118121 A1 8/2002 Lehrman et al.
2002/0151810 A1 10/2002 Wong et al.
2003/0018430 A1 1/2003 Ladetto et al.
2003/0048218 A1 3/2003 Milnes et al.
2003/0083596 A1 5/2003 Kramer et al.
2003/0109258 A1 6/2003 Mantyjarvi et al.
2003/0139692 A1 7/2003 Barrey et al.
2004/0225467 A1 11/2004 Vock et al.
2004/0236500 A1 11/2004 Choi et al.
2005/0033200 A1 2/2005 Soehren et al.
2005/0202934 A1 9/2005 Olrik et al.
2005/0210300 A1 9/2005 Song et al.
2005/0222801 A1 10/2005 Wulff et al.
2005/0232388 A1 10/2005 Tsuji
2005/0232404 A1 10/2005 Gaskill
2005/0238132 A1 10/2005 Tsuji
2005/0240375 A1 10/2005 Sugai
2005/0245988 A1 11/2005 Miesel
2005/0248718 A1 11/2005 Howell et al.
2006/0020177 A1 1/2006 Seo et al.
2006/0063980 A1 3/2006 Hwang et al.
2006/0064276 A1 3/2006 Ren et al.
2006/0100546 A1 5/2006 Silk
2006/0136173 A1 6/2006 Case, Jr. et al.
2006/0149516 A1 7/2006 Bond et al.
2006/0161377 A1 7/2006 Rakkola et al.
2006/0167387 A1 7/2006 Buchholz et al.
2006/0174685 A1 * 8/2006 Skvortsov et al. .......... 73/1.37
2006/0206258 A1 9/2006 Brooks
2006/0223547 A1 10/2006 Chin et al.
2006/0259268 A1 11/2006 Vock et al.
2006/0284979 A1 12/2006 Clarkson
2006/0288781 A1 12/2006 Daumer et al.
2007/0038364 A1 2/2007 Lee et al.
2007/0061105 A1 3/2007 Darley et al.
2007/0063850 A1 3/2007 Devaul et al.
2007/0067094 A1 3/2007 Park et al.
2007/0073482 A1 3/2007 Churchill et al.
2007/0082789 A1 4/2007 Nissila et al.
2007/0125852 A1 6/2007 Rosenberg
2007/0130582 A1 6/2007 Chang et al.
2007/0142715 A1 6/2007 Banet et al.
2007/0143068 A1 * 6/2007 Pasolini et al. ............ 702/160
2007/0145680 A1 6/2007 Rosenberg
2007/0150136 A1 6/2007 Doll et al.
2007/0208531 A1 9/2007 Darley et al.
2007/0213126 A1 9/2007 Deutsch et al.
2007/0250261 A1 10/2007 Soehren
2007/0259716 A1 11/2007 Mattice et al.
2007/0259717 A1 11/2007 Mattice et al.
2007/0260418 A1 11/2007 Ladetto et al.
2007/0260482 A1 11/2007 Nurmela et al.
2008/0171918 A1 7/2008 Teller et al.
2008/0243432 A1 * 10/2008 Kato et al. ................ 702/160
2009/0043531 A1 2/2009 Kahn et al.
2009/0047645 A1 2/2009 Dibenedetto et al.
2009/0124348 A1 5/2009 Yoseloff et al.
2009/0213002 A1 8/2009 Rani et al.
2009/0234614 A1 9/2009 Kahn et al.
2009/0319221 A1 12/2009 Kahn et al.
2010/0056872 A1 3/2010 Kahn et al.
2010/0057398 A1 3/2010 Darley et al.
`
OTHER PUBLICATIONS

"Heart Rate Monitors," suunto.com, Apr. 4, 2007, 1 page.
`
(56) References Cited

OTHER PUBLICATIONS
`
Jones, L., et al., "Wireless Physiological Sensor System for Ambulatory Use," ieeexplore.ieee.org/xpl/freeabs_all.jsp?tp=&arnumber=1612917&isnumber=33861, Apr. 3-5, 2006.
"Sensor Fusion," u-dynamics.com, accessed Aug. 29, 2008, 2 pages.
Sinha, Alex, "Heart Monitoring Training," marathonguide.com/training/articles/HeartMonitorTraining.cfm, Apr. 4, 2007, 5 pages.
Wang, Shu, et al., "Location Based Services for Mobiles: Technologies and Standards, LG Electronics MobileComm," IEEE ICC 2008, Beijing, pp. 1-66 (part 1 of 3).
Wang, Shu, et al., "Location Based Services for Mobiles: Technologies and Standards, LG Electronics MobileComm," IEEE ICC 2008, Beijing, pp. 67-92 (part 2 of 3).
Wang, Shu, et al., "Location Based Services for Mobiles: Technologies and Standards, LG Electronics MobileComm," IEEE ICC 2008, Beijing, pp. 93-123 (part 3 of 3).
Weckesser, P., et al., "Multiple Sensorprocessing for High-Precision Navigation and Environmental Modeling with a Mobile Robot," IEEE, 1995, pp. 453-458.
Yoo, Chang-Sun, et al., "Low Cost GPS/INS Sensor Fusion System for UAV Navigation," IEEE, 2003, 9 pages.
Anderson, Ian, et al., "Shakra: Tracking and Sharing Daily Activity Levels with Unaugmented Mobile Phones," Mobile Netw Appl, Aug. 3, 2007, pp. 185-199.
Aylward, Ryan, et al., "Sensemble: A Wireless, Compact, Multi-User Sensor System for Interactive Dance," International Conference on New Interfaces for Musical Expression (NIME06), Jun. 4-8, 2006, pp. 134-139.
Baca, Arnold, et al., "Rapid Feedback Systems for Elite Sports Training," IEEE Pervasive Computing, Oct.-Dec. 2006, pp. 70-76.
Bakhru, Kesh, "A Seamless Tracking Solution for Indoor and Outdoor Position Location," IEEE 16th International Symposium on Personal, Indoor, and Mobile Radio Communications, 2005, pp. 2029-2033.
Bliley, Kara E., et al., "A Miniaturized Low Power Personal Motion Analysis Logger Utilizing MEMS Accelerometers and Low Power Microcontroller," IEEE EMBS Special Topic Conference on Microtechnologies in Medicine and Biology, May 12-15, 2005, pp. 92-93.
Fang, Lei, et al., "Design of a Wireless Assisted Pedestrian Dead Reckoning System: The NavMote Experience," IEEE Transactions on Instrumentation and Measurement, vol. 54, No. 6, Dec. 2005, pp. 2342-2358.
Healey, Jennifer, et al., "Wearable Wellness Monitoring Using ECG and Accelerometer Data," IEEE Int. Symposium on Wearable Computers (ISWC'05), 2005, 2 pages.
Hemmes, Jeffrey, et al., "Lessons Learned Building TeamTrak: An Urban/Outdoor Mobile Testbed," 2007 IEEE Int. Conf. on Wireless Algorithms, Aug. 1-3, 2007, pp. 219-224.
Jovanov, Emil, et al., "A Wireless Body Area Network of Intelligent Motion Sensors for Computer Assisted Physical Rehabilitation," Journal of NeuroEngineering and Rehabilitation, Mar. 2005, 10 pages.
Kalpaxis, Alex, "Wireless Temporal-Spatial Human Mobility Analysis Using Real-Time Three Dimensional Acceleration Data," IEEE Intl. Multi-Conf. on Computing in Global IT (ICCGI'07), 2007, 7 pages.
Milenkovic, Milena, et al., "An Accelerometer-Based Physical Rehabilitation System," IEEE SouthEastern Symposium on System Theory, 2002, pp. 57-60.
Otto, Chris, et al., "System Architecture of a Wireless Body Area Sensor Network for Ubiquitous Health Monitoring," Journal of Mobile Multimedia, vol. 1, No. 4, 2006, pp. 307-326.
Park, Chulsung, et al., "Eco: An Ultra-Compact Low-Power Wireless Sensor Node for Real-Time Motion Monitoring," IEEE Int. Symp. on Information Processing in Sensor Networks, 2005, pp. 398-403.
Shen, Chien-Lung, et al., "Wearable Band Using a Fabric-Based Sensor for Exercise ECG Monitoring," IEEE Int. Symp. on Wearable Computers, 2006, 2 pages.
Tapia, Emmanuel Munguia, et al., "Real-Time Recognition of Physical Activities and Their Intensities Using Wireless Accelerometers and a Heart Rate Monitor," IEEE Conf. on Wearable Computers, Oct. 2007, 4 pages.
Wixted, Andrew J., et al., "Measurement of Energy Expenditure in Elite Athletes Using MEMS-Based Triaxial Accelerometers," IEEE Sensors Journal, vol. 7, No. 4, Apr. 2007, pp. 481-488.
Wu, Winston H., et al., "Context-Aware Sensing of Physiological Signals," IEEE Int. Conf. on Engineering for Medicine and Biology, Aug. 23-26, 2007, pp. 5271-5275.
Bourzac, Katherine, "Wearable Health Reports," Technology Review, Feb. 28, 2006, techreview.com/printer_friendly_article.aspx?id=16431, Mar. 22, 2007, 3 pages.
Dao, Ricardo, "Inclination Sensing with Thermal Accelerometers," MEMSIC, May 2002, 3 pages.
Lee, Seon-Woo, et al., "Recognition of Walking Behaviors for Pedestrian Navigation," ATR Media Integration & Communications Research Laboratories, Kyoto, Japan, 4 pages.
Margaria, Rodolfo, "Biomechanics and Energetics of Muscular Exercise," Chapter 3, pp. 105-125, Oxford: Clarendon Press, 1976.
Mizell, David, "Using Gravity to Estimate Accelerometer Orientation," Seventh IEEE International Symposium on Wearable Computers, 2003, 2 pages.
Ormoneit, D., et al., "Learning and Tracking Cyclic Human Motion," Encyclopedia of Library and Information Science, vol. 53, Supplement 16, 2001, 7 pages.
PCT International Search Report and Written Opinion for International Application No. PCT/US2008/072537, mailed Oct. 22, 2008, 10 pages.
PCT International Search Report and Written Opinion for International Application No. PCT/US2009/48523, mailed Aug. 27, 2009, 8 pages.
Weinberg, Harvey, "MEMS Motion Sensors Boost Handset Reliability," Jun. 2006, mwrf.com/Articles/Print.cfm?ArticleID=12740, Feb. 21, 2007, 4 pages.

* cited by examiner
`
`
`
`
[FIG. 1 (Sheet 1 of 9): Block diagram of electronic device 100 — acceleration measuring logic 105, filter 120, dominant axis logic 127 (cadence logic 132, rolling average logic 135, dominant axis setting logic 140), step counting logic 130 (measurement selection logic 145, measurement comparator 155, threshold comparator 160, step count buffer 165), mode logic 190, final step count 175.]
`
`
`
[FIG. 2 (Sheet 2 of 9): Motion cycle graph — acceleration (g) versus time.]
`
`
`
[FIG. 3 (Sheet 3 of 9): State diagram — sleep mode 305 (initial state; re-entered when sleep conditions are met or no activity occurs); detected accelerations move the system to entry mode 315; N steps in cadence move it to stepping mode 325; no steps detected within the cadence window moves it to exit mode 335, from which X steps in cadence return it to stepping mode.]
`
`
`
[FIG. 4 (Sheet 4 of 9): Flow diagram 400, sleep mode — set sleep mode sampling rate (405); take measurement(s) of acceleration data (410); if acceleration is detected (415), initiate entry mode (420).]
`
`
`
[FIG. 5 (Sheet 5 of 9): Flow diagram 500, entry mode — set stepping sampling rate (504); recognize first step (510); set default cadence window (514); set buffered step count to one (520); each additional step within the cadence window adds one to the buffered step count (560), while a step outside the window resets the buffered step count to zero (534); after M steps in the buffered step count (540), a new cadence window is set (574); when enough steps accumulate and the cadence window is no longer the default (584), buffered steps are added to the actual step count and stepping mode is initiated; if no acceleration is detected, sleep mode is initiated (544).]
`
`
`
[FIG. 6 (Sheet 6 of 9): Flow diagram 600, stepping mode — set new cadence window (610); if a step is recognized within the cadence window, add one to step count (620); otherwise initiate exit mode (630).]
`
`
`
[FIG. 7 (Sheet 7 of 9): Flow diagram 700, exit mode — initiate (reset) step timer (705); add one to buffered step count (710); if buffered step count = X, add buffered steps to actual step count and return to stepping mode (720); if the step timer elapses without an additional recognized step, clear buffered step count and initiate entry mode (730).]
`
`
`
[FIG. 8 (Sheet 8 of 9): Flow diagram 800, step recognition — take measurement(s) of acceleration data (805); filter measurement(s) (810); orient device by assigning dominant axis (812); count step (835) only if measurement(s) fall within the cadence window (815), acceleration along the dominant axis exceeds the lower threshold (820), acceleration exceeds the previous measurement(s) (825), and acceleration is below the upper threshold (830); otherwise no step is counted (840).]
`
`
`
[FIG. 9 (Sheet 9 of 9): Flow diagram 900, orienting an inertial sensor — detect period of stepping cadence (910); create rolling averages of accelerations (915); assign dominant axis (920).]
`
`
`
`1.
`HUMAN ACTIVITY MONITORING DEVICE
`
`US 8,712,723 B1
`
`The present patent application is a continuation of U.S.
`application Ser. No. 12/694,135, filed on Jan. 26, 2010, now
`U.S. Pat. No. 7,881,902, issued on Feb. 1, 2011; which is a
`continuation of U.S. application Ser. No. 1 1/644,455, filed on
`Dec. 22, 2006, now U.S. Pat. No. 7,653,508, issued on Jan.
`26, 2010.
`
FIELD OF THE INVENTION

This invention relates to a method of monitoring human activity, and more particularly to counting periodic human motions such as steps.
`
BACKGROUND

The development of Micro-Electro-Mechanical Systems (MEMS) technology has enabled manufacturers to produce inertial sensors (e.g., accelerometers) of sufficient size, cost, and power consumption to fit into portable electronic devices. Such inertial sensors can be found in a limited number of commercial electronic devices such as cellular phones, portable music players, pedometers, game controllers, and portable computers.

Step counting devices are used to monitor an individual's daily activity by keeping track of the number of steps that he or she takes. Generally, step counting devices that utilize an inertial sensor to measure motion to detect steps require the user to first position the device in a limited set of orientations. In some devices, the required orientations are dictated to the user by the device. In other devices, the beginning orientation is not critical, so long as this orientation can be maintained.

Step counting devices are often confused by motion noise experienced by the device throughout a user's daily routine. This noise causes false steps to be measured and actual steps to be missed in conventional step counting devices. Conventional step counting devices also fail to accurately measure steps for individuals who walk at a slow pace. Such step counting devices can fail to operate for seniors and others walking at a slow pace.
`
BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is illustrated by way of example, and not by way of limitation, and can be more fully understood with reference to the following detailed description when considered in connection with the following figures:

FIG. 1 is a block diagram illustrating one embodiment of an electronic device;

FIG. 2 illustrates an exemplary cadence of motion graph that measures time versus acceleration, in accordance with one embodiment of the present invention;

FIG. 3 shows a state diagram for the behavior of a system of monitoring human activity using an inertial sensor, in accordance with one embodiment of the present invention;

FIG. 4 illustrates a flow diagram for a method of operating an electronic device in sleep mode, in accordance with one embodiment of the present invention;

FIG. 5 illustrates a flow diagram for a method of operating an electronic device in entry mode, in accordance with one embodiment of the present invention;

FIG. 6 illustrates a flow diagram for a method of operating an electronic device in stepping mode, in accordance with one embodiment of the present invention;

FIG. 7 illustrates a flow diagram for a method of operating an electronic device in exit mode, in accordance with one embodiment of the present invention;

FIG. 8 illustrates a flow diagram for a method of recognizing a step, in accordance with one embodiment of the present invention; and

FIG. 9 illustrates a flow diagram for a method of orienting an inertial sensor, in accordance with one embodiment of the present invention.
`
DETAILED DESCRIPTION

Embodiments of the present invention are designed to monitor human activity using an inertial sensor. In one embodiment, a dominant axis is assigned after determining an orientation of an inertial sensor. The orientation of the inertial sensor is continuously determined, and the dominant axis is updated as the orientation of the inertial sensor changes. In one embodiment, periodic human motions are counted by monitoring accelerations relative to the dominant axis.
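Read as an algorithm, the embodiment described above amounts to a loop over acceleration samples. The following Python sketch is purely illustrative and not the patented implementation; the function name, the rolling-window length, and the threshold values are our assumptions.

```python
# Illustrative sketch of the monitoring approach described above.
# The window length and thresholds are assumptions, not patent values.

def count_periodic_motions(samples, lower=1.2, upper=2.5, window=50):
    """Count periodic motions in an iterable of (x, y, z) accelerations (in g)."""
    steps = 0
    history = []          # recent samples used to estimate orientation
    prev = 0.0
    for sample in samples:
        history.append(sample)
        if len(history) > window:
            history.pop(0)
        # The axis with the largest average magnitude (dominated by
        # gravity) is assigned as the dominant axis; recomputing it on
        # every sample updates the assignment as orientation changes.
        averages = [sum(abs(s[i]) for s in history) / len(history)
                    for i in range(3)]
        dominant = averages.index(max(averages))
        a = abs(sample[dominant])
        # Count a motion when acceleration along the dominant axis rises
        # through the lower threshold while staying below the upper one.
        if lower < a < upper and prev <= lower:
            steps += 1
        prev = a
    return steps
```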
FIG. 1 is a block diagram illustrating an electronic device 100, in accordance with one embodiment of the present invention. The electronic device 100 in one embodiment comprises an acceleration measuring logic 105, a filter 120, a dominant axis logic 127, a step counting logic 130, a timer 170, and a final step count 175. In one embodiment, the electronic device 100 is a portable electronic device that includes one or more inertial sensors. The inertial sensors may measure accelerations along a single axis or multiple axes. The inertial sensors may measure linear as well as rotational (angular) accelerations. The electronic device 100 may be used to count steps or other periodic human motions. Steps may be accurately counted regardless of the placement and/or orientation of the device on a user. Steps may be accurately counted whether the electronic device 100 maintains a fixed orientation or changes orientation during operation. The electronic device 100 may be carried in a backpack, pocket, purse, hand, or elsewhere, and accurate steps may still be counted.
The acceleration measuring logic 105 measures acceleration data at a sampling rate. The sampling rate may be fixed or variable. In one embodiment, the acceleration measuring logic 105 receives a timing signal from the timer 170 in order to take measurements at the sampling rate. The acceleration measuring logic 105 may be an inertial sensor.
In one embodiment, measurement data is processed by the filter 120 to remove noise. The filter 120 may be implemented in hardware, software, or both hardware and software. The filter 120 may include a high pass filter, a low pass filter, a bandpass filter, a bandstop filter and/or additional filters. The filter 120 may include a digital filter and/or an analog filter. In one embodiment, a hardware digital filter includes at least one of a finite impulse response (FIR) filter and an infinite impulse response (IIR) filter. In one embodiment, an N-tap hardware digital FIR filter is used. The use of a hardware FIR filter may reduce power consumption by reducing and/or eliminating software digital filtering.
In one embodiment, the filter 120 includes multiple filters, and a determination of which filters to apply to the measurement data is made based upon an operating mode of the electronic device 100. In one embodiment, the selection of which filters to use is determined by the type of user activity detected. For example, a low pass filter may be used to remove high frequency noise that would interfere with step counting when a user is walking. In contrast, a high pass filter may be used when quick motions are to be monitored.
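As a concrete illustration of the low pass filtering described here, the sketch below implements a simple N-tap moving-average FIR filter in software; in the hardware embodiment above, the same structure would be realized in silicon. The tap count and the sample values are assumptions chosen for the example, not values from the patent.

```python
# Simple N-tap moving-average FIR filter: an illustrative stand-in for
# the low pass filtering described above, not the device's actual filter.

def fir_lowpass(samples, num_taps=8):
    """Smooth high-frequency motion noise out of an acceleration trace."""
    taps = [1.0 / num_taps] * num_taps      # equal-weight taps
    history = [0.0] * num_taps              # the num_taps most recent samples
    smoothed = []
    for s in samples:
        history = history[1:] + [s]         # shift in the newest sample
        smoothed.append(sum(h * t for h, t in zip(history, taps)))
    return smoothed

# Example: smooth a noisy vertical-axis acceleration trace (values in g).
print(fir_lowpass([0.9, 1.4, 0.7, 1.6, 1.0, 1.2, 0.8, 1.5, 1.1], num_taps=4))
```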
`
Filtered measurement data may be passed on to the dominant axis logic 127 and the step counting logic 130. In one embodiment, the dominant axis logic 127 includes a cadence logic 132, a rolling average logic 135, and a dominant axis setting logic 140. In an alternative embodiment, more or fewer logics may be used to determine a dominant axis. One embodiment of implementing dominant axis assignment may be found in U.S. Ser. No. 11/603,472, now issued as U.S. Pat. No. 7,457,719, which is incorporated herein by reference. Alternative means of identifying a dominant axis may be used in other embodiments.
In one embodiment, the dominant axis logic 127 is used to determine an orientation of the electronic device 100 and/or an inertial sensor within the electronic device 100. In alternative embodiments, other logics may be used to determine an orientation of the electronic device 100.
Referring to FIG. 1, the cadence logic 132 may determine one or more sample periods to be used by the rolling average logic 135, and may determine a cadence window 150 to be used by the step counting logic 130. In one embodiment, the cadence logic 132 detects a period and/or cadence of a motion cycle. The period and/or cadence of the motion cycle may be based upon user activity (e.g., rollerblading, biking, running, walking, etc.).
Many types of motions that are useful to keep track of have a periodic set of movements. Specific periodic human motions may be characteristic of different types of user activity. For example, to walk, an individual must lift a first leg, move it forward, plant it, then repeat the same series of motions with a second leg. In contrast, a person rollerblading performs a repeated sequence of pushing, coasting and liftoff for each leg. For a particular individual, the series of walking motions will usually occur in about the same amount of time, and the series of rollerblading motions will usually occur in the same amount of time. The repeated set of motions can be considered a unit, and defines the motion cycle. The amount of time that it takes to complete one motion cycle defines the motion cycle's period, and the number of motion cycles that occur in a given unit of time defines the motion cycle's cadence. For simplicity, the term "step" is used in this application to describe the user activity being evaluated. However, in the context of this application, the term "step" should be taken to mean any user activity having a periodic set of repeated movements.
FIG. 2 illustrates an exemplary motion cycle graph 200 that measures time versus acceleration, in accordance with one embodiment of the present invention. The exemplary motion cycle graph 200 shows acceleration data taken with a single tri-axis inertial sensor. The acceleration at a given time is represented for a first axis 203, a second axis 205, and a third axis 207. In one embodiment, the cadence logic 132 of FIG. 1 analyzes the acceleration along the first axis 203, second axis 205 and third axis 207 to detect a motion cycle. Once a motion cycle is detected, a period of the motion cycle is determined, and a cadence of the motion cycle is determined. FIG. 2 shows an exemplary period of a motion cycle 210 for the third axis 207, the period 215 being approximately 0.6 seconds. The same period can also be seen to a lesser degree in the second axis 205 and the first axis 203. The corresponding cadence to the motion cycle is approximately one hundred motion cycles per minute.
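The period and cadence quoted in this example are reciprocals of one another, as a one-line computation confirms (the helper name below is ours, not the patent's):

```python
# Cadence (cycles per minute) from a motion-cycle period in seconds.
def cadence_from_period(period_s):
    return 60.0 / period_s

print(cadence_from_period(0.6))   # 100.0 motion cycles per minute
```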
In one embodiment, once a stepping period (or other motion cycle period) is determined, that period may be used to set the cadence window (the allowable time window for steps to occur). In one embodiment, the period is updated after each step. The current stepping period may be a rolling average of the stepping periods over previous steps, as discussed in more detail with reference to the rolling average logic 135 of FIG. 1.
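A rolling average of stepping periods can be kept with a small fixed-length buffer, as in the sketch below; the four-step window and the class name are illustrative assumptions, not values taken from the patent.

```python
# Rolling average of the stepping period over recent steps; the
# four-step window is an illustrative assumption.

from collections import deque

class SteppingPeriodTracker:
    def __init__(self, window_steps=4):
        self.periods = deque(maxlen=window_steps)   # recent step periods (s)

    def record_step(self, period_s):
        """Record the time between the two most recent steps."""
        self.periods.append(period_s)

    def current_period(self):
        """Rolling-average stepping period used to set the cadence window."""
        return sum(self.periods) / len(self.periods)

tracker = SteppingPeriodTracker()
for p in (0.62, 0.58, 0.61, 0.60):
    tracker.record_step(p)
print(round(tracker.current_period(), 2))   # 0.6
```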
A cadence window may be used to facilitate accurate measurement of a step, or other periodic human motion. A cadence window is a window of time since a last step was counted that is looked at to detect a new step. A cadence window may be set based on the period and/or cadence of the actual motion cycle (e.g., a stepping period), on set limits, and/or on other determiners.
Referring to FIG. 2, an exemplary first cadence window 240 and second cadence window 255 are shown. The first cadence window 240 may be defined by a first cadence window minimum 230 and a first cadence window maximum 235. The second cadence window 255 may be defined by a second cadence window minimum 245 and a second cadence window maximum 250. In one embodiment, the cadence window minimums 230 and 245 and cadence window maximums 235 and 250 are determined by measuring lengths of time since the most recent step was counted. In one embodiment, this length of time is measured via the timer 170 of FIG. 1. In other embodiments, other variables may be used to set the cadence window. For example, cadence windows may be determined by measuring cumulative amounts of acceleration that have been measured since the previous step was counted.
Returning to FIG. 2, cadence windows may be used to count steps until an expected step is not encountered. In one embodiment, new cadence windows are determined periodically. In one embodiment, the cadence window is a dynamic cadence window that continuously updates as a user's cadence changes. For example, using a dynamic cadence window, a new cadence window length may be set after each step. The cadence window minimums may be determined by subtracting a value from the stepping period, and the cadence window maximums may be determined by adding a value to the stepping period. In one embodiment, the cadence window maximums are preset, and the cadence window minimums are updated after each step is counted. In one embodiment, the cadence window minimums are preset, and the cadence window maximums are updated after each step is counted. In one embodiment, both the cadence window minimums and cadence window maximums are updated when a step is counted. In one embodiment, the current cadence window minimum is determined by subtracting 200 ms from the current stepping cadence period. In one embodiment, the cadence window minimum has a minimum value of 240 ms.
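Under those numbers, a dynamic cadence window update might look like the sketch below. The 200 ms subtraction and the 240 ms floor come from the text; using the same 200 ms as the additive offset for the maximum is our assumption, chosen to match the roughly 0.4 to 0.8 second window of the FIG. 2 example.

```python
# Dynamic cadence window from the current stepping period. The 200 ms
# offset and 240 ms floor are from the text; applying 200 ms to the
# maximum as well is an assumption made for this example.

def cadence_window(stepping_period_s, offset_s=0.200, floor_s=0.240):
    """Return (window_min, window_max), in seconds since the last step."""
    window_min = max(stepping_period_s - offset_s, floor_s)
    window_max = stepping_period_s + offset_s
    return window_min, window_max

wmin, wmax = cadence_window(0.6)
print(round(wmin, 2), round(wmax, 2))   # 0.4 0.8, matching FIG. 2
```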
In the illustrated embodiment of FIG. 2, a first step 217 is counted at 0.65 seconds, and a second step 232 is counted at approximately 1.15 seconds. The first cadence window 240 opens at approximately 0.4 seconds from the first step 217, and closes at approximately 0.8 seconds from the first step 217. As shown, the second step 232 falls within the first dynamic cadence window 240. A third step 233 falls within the second dynamic cadence window 255, which may have a second cadence window minimum 245 and second cadence window maximum 250 that are different from the first cadence window minimum 230 and first cadence window maximum 235. The illustrated second cadence window minimum is about 0.35 seconds from the second step 232, and the second cadence window maximum 250 is about 0.75 seconds from the second step 232. Other cadence window minimums and maximums are also possible. When motion criteria (e.g., threshold conditions) are met within a cadence window, a step is detected, whereas when motion criteria are met outside of the cadence windows no step is detected.
`
`If no previous steps have been detected, there is no cadence
`minimum, and a step may be detected at any time that motion
`criteria are met. If fewer than the r