US008768865B2

(12) United States Patent
Narayanan et al.

(10) Patent No.: US 8,768,865 B2
(45) Date of Patent: Jul. 1, 2014
(54) LEARNING SITUATIONS VIA PATTERN MATCHING

(75) Inventors: Vidya Narayanan, San Diego, CA (US);
     Sanjiv Nanda, Ramona, CA (US);
     Fuming Shih, Cambridge, MA (US)

(73) Assignee: Qualcomm Incorporated, San Diego, CA (US)
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 250 days.
(21) Appl. No.: 13/269,516

(22) Filed: Oct. 7, 2011

(65) Prior Publication Data
     US 2012/0265717 A1    Oct. 18, 2012

Related U.S. Application Data

(60) Provisional application No. 61/434,400, filed on Jan. 19, 2011.
(51) Int. Cl.
     G06F 17/00   (2006.01)
     G01D 15/00   (2006.01)
     G06N 99/00   (2010.01)
(52) U.S. Cl.
     CPC ......... G06N 99/005 (2013.01)
     USPC ........ 706/12; 702/127
(58) Field of Classification Search
     None
     See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
    7,520,943 B2     8/2009  Sowari et al.
 2007/0036347 A1     2/2007  Teicher
 2009/0303204 A1    12/2009  Nasiri et al.
 2009/0305661 A1    12/2009  Ito
 2010/0001949 A1     1/2010  Shkolnikov et al.
 2010/0075639 A1     3/2010  Horvitz et al.
 2010/0217533 A1     8/2010  Nadkarni et al.
 2010/0299757 A1    11/2010  Lee
 2010/0317371 A1    12/2010  Westerinen et al.
 2011/0039522 A1     2/2011  Partridge et al.
 2011/0066383 A1*    3/2011  Jangle et al.
 2011/0070863 A1     3/2011  Ma et al.

FOREIGN PATENT DOCUMENTS

GB          2434504 A       7/2007
WO    WO 2008054135 A1      5/2008
OTHER PUBLICATIONS

Calderon, et al., "Recognition and Generation of Motion Primitives with Humanoid Robots", 2009 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Suntec Convention and Exhibition Center, Singapore, Jul. 14-17, 2009, pp. 912-922.
Ghasemzadeh, et al., "Collaborative Signal Processing for Action Recognition in Body Sensor Networks: A Distributed Classification Algorithm Using Motion Transcripts," IPSN '10, Apr. 12-16, 2010, Stockholm, Sweden, pp. 244-255.
Huynh, et al., "Analyzing Features for Activity Recognition," Joint sOc-EUSAI conference, Grenoble, Oct. 2005, 6 pages.
Valtonen M., et al., "Proactive and Adaptive Fuzzy Profile Control for Mobile Phones", PerCom, pp. 1-3, 2009 IEEE International Conference on Pervasive Computing and Communications, 2009.

(Continued)

Primary Examiner - Alan Chen
(74) Attorney, Agent, or Firm - Kilpatrick Townsend & Stockton LLP
(57) ABSTRACT

Example methods, apparatuses, or articles of manufacture are disclosed herein that may be utilized, in whole or in part, to facilitate or support one or more operations or techniques for machine learning of situations via pattern matching, or recognition.

53 Claims, 5 Drawing Sheets
[Front-page figure: the flow diagram of FIG. 4 - monitoring input signals at a mobile device, detecting at least one condition, identifying a first pattern, fixing a subset of varying parameters, and initiating recognition of a second pattern.]
US 8,768,865 B2
Page 2

(56) References Cited

OTHER PUBLICATIONS
Yang, et al., "Distributed Recognition of Human Actions Using Wearable Motion Sensor Networks," Journal of Ambient Intelligence and Smart Environments (2009), pp. 1-13.
Yang, et al., "Distributed Segmentation and Classification of Human Actions Using a Wearable Motion Sensor Network," Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2008, CVPRW '08, pp. 1-8.
International Search Report and Written Opinion, PCT/US2012/021743, ISA/EPO, May 14, 2012.

* cited by examiner
U.S. Patent    Jul. 1, 2014    Sheet 1 of 5    US 8,768,865 B2

[FIG. 1: example coordinate system 100 - X and Y axes relative to an origin 104, with mobile device 102 and rotation indicated at 106.]
U.S. Patent    Jul. 1, 2014    Sheet 2 of 5    US 8,768,865 B2

[FIG. 2: example context plot 200 of a multi-dimensional sensor information stream.]
U.S. Patent    Jul. 1, 2014    Sheet 3 of 5    US 8,768,865 B2

[FIG. 3: example temporal pattern and an example generated rule.]
U.S. Patent    Jul. 1, 2014    Sheet 4 of 5    US 8,768,865 B2

400
Monitoring, at a mobile device, input signals from a plurality of information sources associated with said mobile device (402)
Detecting at least one condition based, at least in part, on at least one of said monitored input signals (404)
Identifying a first pattern based, at least in part, on said at least one detected condition (406)
Fixing a subset of varying parameters associated with said first pattern, said varying parameters being derived, at least in part, from said monitored input signals (408)
Initiating a process to attempt a recognition of a second pattern in connection with said monitoring said input signals based, at least in part, on said first identified pattern (410)

FIG. 4
U.S. Patent    Jul. 1, 2014    Sheet 5 of 5    US 8,768,865 B2

[FIG. 5: schematic of an example computing environment 500 associated with a mobile device - processing unit, memory, communication interface, connection, and sensors.]
LEARNING SITUATIONS VIA PATTERN MATCHING

CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application Ser. No. 61/434,400, entitled "Learning Situations via Pattern Matching," filed on Jan. 19, 2011, which is assigned to the assignee hereof and which is expressly incorporated herein by reference. Additionally, U.S. patent application Ser. No. 13/269,513, filed Oct. 7, 2011, entitled "MACHINE LEARNING OF KNOWN OR UNKNOWN MOTION STATES WITH SENSOR FUSION" is being filed concurrently, the entire disclosure of which is hereby incorporated by reference.
BACKGROUND

1. Field

The present disclosure relates generally to machine learning and, more particularly, to machine learning of situations via pattern matching or recognition for use in or with mobile communication devices.

2. Information

Mobile communication devices, such as, for example, cellular telephones, smart telephones, portable navigation units, laptop computers, personal digital assistants, or the like are becoming more common every day. These devices may include, for example, a variety of sensors to support a number of host applications. Typically, although not necessarily, sensors are capable of converting physical phenomena into analog or digital signals and may be integrated into (e.g., built-in, etc.) or otherwise supported by (e.g., stand-alone, etc.) a mobile communication device. For example, a mobile communication device may feature one or more accelerometers, gyroscopes, magnetometers, gravitometers, ambient light detectors, proximity sensors, thermometers, location sensors, microphones, cameras, etc. capable of measuring various motion states, locations, positions, orientations, ambient environments, etc. of the mobile device. Sensors may be utilized individually or may be used in combination with other sensors, depending on an application.

A popular and rapidly growing market trend in sensor-enabled technology includes, for example, intelligent or smart mobile communication devices that may be capable of understanding what associated users are doing (e.g., user activities, intentions, goals, etc.) so as to assist, participate, or, at times, intervene in a more meaningful way. Integration of an ever-expanding variety or suite of embedded or associated sensors that continually capture, obtain, or process large volumes of incoming information streams may, however, present a number of challenges. These challenges may include, for example, multi-sensor parameter tracking, multi-modal information stream integration, increased signal pattern classification or recognition complexity, background processing bandwidth requirements, or the like, which may be at least partially attributed to a more dynamic environment created by user mobility. Accordingly, how to capture, integrate, or otherwise process multi-dimensional sensor information in an effective or efficient manner for a more satisfying user experience continues to be an area of development.
BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting and non-exhaustive aspects are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various figures unless otherwise specified.
FIG. 1 is an example coordinate system that may be used for machine learning of situations via pattern matching or recognition according to an implementation.

FIG. 2 is an example context plot of a multi-dimensional sensor information stream according to an implementation.

FIG. 3 is an example temporal pattern and an example generated rule according to an implementation.

FIG. 4 is a flow diagram illustrating an implementation of an example process for machine learning of situations via pattern matching or recognition according to an implementation.

FIG. 5 is a schematic diagram illustrating an example computing environment associated with a mobile device according to an implementation.
SUMMARY

Example implementations relate to machine learning of known or unknown motion states with sensor fusion. In one implementation, a method may comprise monitoring, at a mobile device, input signals from a plurality of information sources associated with the mobile device; detecting at least one condition based, at least in part, on at least one of the monitored input signals; identifying a first pattern based, at least in part, on the at least one detected condition; and fixing a subset of varying parameters associated with the first pattern, the varying parameters derived, at least in part, from the monitored input signals.

In another implementation, an apparatus may comprise a mobile device comprising at least one processor to monitor input signals from a plurality of information sources associated with the mobile device; detect at least one condition based, at least in part, on at least one of the monitored input signals; identify a first pattern based, at least in part, on the at least one detected condition; and fix a subset of varying parameters associated with the first pattern, the varying parameters derived, at least in part, from the monitored input signals.

In yet another implementation, an apparatus may comprise means for monitoring, at a mobile device, input signals from a plurality of information sources associated with the mobile device; means for detecting at least one condition based, at least in part, on at least one of the monitored input signals; means for identifying a first pattern based, at least in part, on the at least one detected condition; and means for fixing a subset of varying parameters associated with the first pattern, the varying parameters derived, at least in part, from the monitored input signals.

In yet another implementation, an article may comprise a non-transitory storage medium having instructions stored thereon executable by a special purpose computing platform at a mobile device to monitor input signals from a plurality of information sources associated with the mobile device; detect at least one condition based, at least in part, on at least one of the monitored input signals; identify a first pattern based, at least in part, on the at least one detected condition; and fix a subset of varying parameters associated with the first pattern, the varying parameters derived, at least in part, from the monitored input signals. It should be understood, however, that these are merely example implementations, and that claimed subject matter is not limited to these particular implementations.
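For orientation only, the monitor-detect-identify-fix-recognize flow recited above can be sketched in Python. This is a hypothetical illustration, not the disclosed implementation; the class name, the muted-ringer trigger, and the exact-match recognition rule are all assumptions made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class SituationLearner:
    """Illustrative sketch of the summarized flow: monitor input samples,
    detect a condition, identify a first pattern, fix its varying
    parameters, then attempt to recognize a second (co-occurring) pattern.
    Names and matching rules are assumptions, not the patent's method."""
    fixed_params: dict = field(default_factory=dict)

    def detect_condition(self, sample: dict) -> bool:
        # Illustrative condition: a user action such as muting the ringer.
        return sample.get("ringer_muted", False)

    def identify_first_pattern(self, sample: dict) -> dict:
        # Treat the remaining signal attributes as the pattern of interest.
        return {k: v for k, v in sample.items() if k != "ringer_muted"}

    def recognize_second_pattern(self, sample: dict) -> bool:
        # Naive recognition: every fixed parameter is re-observed exactly.
        return all(sample.get(k) == v for k, v in self.fixed_params.items())

    def step(self, sample: dict):
        """Process one monitored sample; returns None until a condition fires."""
        if self.detect_condition(sample):
            # "Fix" a subset of the varying parameters for later matching.
            self.fixed_params = dict(self.identify_first_pattern(sample))
            return self.recognize_second_pattern(sample)
        return None
```

Feeding the learner a stream of per-sample dictionaries drives the whole cycle; a sample without the trigger is merely monitored, while a triggering sample fixes its co-occurring parameters and starts recognition.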
DETAILED DESCRIPTION

In the following detailed description, numerous specific details are set forth to provide a thorough understanding of claimed subject matter. However, it will be understood by those skilled in the art that claimed subject matter may be practiced without these specific details. In other instances, methods, apparatuses, or systems that would be known by one of ordinary skill have not been described in detail so as not to obscure claimed subject matter.
Some example methods, apparatuses, or articles of manufacture are disclosed herein that may be implemented, in whole or in part, to facilitate or support one or more operations or techniques for learning one or more situations via pattern matching or recognition for use in or with a mobile communication device. As used herein, "mobile device," "mobile communication device," "wireless device," "handheld device," or the plural form of such terms may be used interchangeably and may refer to any kind of special purpose computing platform or apparatus that may from time to time have a position or location that changes. In some instances, a mobile communication device may, for example, be capable of communicating with other devices, mobile or otherwise, through wireless transmission or receipt of information over suitable communications networks according to one or more communication protocols. As a way of illustration, special purpose mobile communication devices, which may herein be called simply mobile devices, may include, for example, cellular telephones, satellite telephones, smart telephones, personal digital assistants (PDAs), laptop computers, personal entertainment systems, tablet personal computers (PC), personal audio or video devices, personal navigation devices, or the like. It should be appreciated, however, that these are merely illustrative examples of mobile devices that may be utilized in connection with machine learning of situations via pattern matching or recognition, and that claimed subject matter is not limited in this regard.
As previously mentioned, a mobile device may comprise a suite or a variety of sensors providing measurement signals that may be processed in some manner, such as via a suitable application processor, for example, so as to draw a number of inferences with respect to an associated user activity, intention, goal, or the like. As will be described in greater detail below, in some instances, an inference may include a certain context, which may characterize or specify a particular situation or circumstances relevant to a user experience. Particular examples of a context may include, for example, traveling between home and a place of work, being on a plane or vehicle, participating in a meeting, having lunch, exercising in a gym, sending or receiving a text message or e-mail, or the like, though claimed subject matter is not so limited. As described below, a mobile device may utilize one or more measurement signals obtained or received from certain sensors specifying a particular situation, for example, while considering signals from other sensors so as to make a more complete, accurate, or otherwise sufficient inference of what an associated user is doing, about to do, or the like. A mobile device may, for example, make an inference while being co-located with a portion of the user's body, such as via a suitable sensor-enabled body area network (e.g., in a pocket, belt clip, armband, etc.), just to illustrate one possible implementation. At times, an inference may be made in connection with an input of a user operating a mobile device in some manner, such as, for example, sending an e-mail, silencing a ringer, muting a call, or the like, which may facilitate or support learning or recognition of situations via pattern matching, as will also be seen.
In some instances, a mobile device may, for example, utilize or employ, in whole or in part, one or more suitable pattern matching or recognition techniques to classify sensor-related observations in order to make a number of relevant or otherwise sufficient inferences with respect to user activities, intentions, goals, situations, or the like. For example, a suitable application processor (e.g., of a mobile device, etc.) may associate one or more varying parameters of interest or so-called variables received or derived from one or more information streams with one or more user-related mobility patterns or other sensor-captured patterns that may be indicative of whether an associated user is in a particular context. By way of example but not limitation, varying parameters or variables of interest may comprise, for example, an acceleration, vibration, gyroscopic rotation, wireless connectivity, luminous intensity of the ambient light, temperature, variance, velocity, background noise level, or the like. Particular examples of certain pattern matching or recognition techniques that may be used, in whole or in part, in connection with machine learning of various situations will be described in greater detail below.
As was indicated, a mobile device may include, for example, a number of sensors, such as one or more accelerometers, gyroscopes, magnetometers, ambient light detectors, proximity sensors, cameras, microphones, thermometers, or the like. In addition, a mobile device may feature a number of devices that may be used, at least in part, for sensing, such as Global Positioning System (GPS), Wireless Fidelity (WiFi), Bluetooth™-enabled devices, or the like. Thus, it should be appreciated that "sensor," "sensing device," or the plural form of such terms may be used interchangeably herein. These sensors or sensing devices, as well as other possible sensors or devices not listed, may be capable of providing signals for use by a variety of host applications (e.g., navigation, location, communication, etc.) while measuring various motion states, locations, positions, orientations, ambient environments, etc. of a mobile device using appropriate techniques.
An accelerometer, for example, may sense a direction of gravity toward the center of the Earth and may detect or measure a motion with reference to one, two, or three directions often referenced in a Cartesian coordinate space as dimensions or axes X, Y, and Z. Optionally or alternatively, an accelerometer may also provide measurements of magnitude of various accelerations, for example. A direction of gravity may be measured in relation to any suitable frame of reference, such as, for example, in a coordinate system in which the origin or initial point of gravity vectors is fixed to or moves with a mobile device. An example coordinate system that may be used, in whole or in part, to facilitate or support one or more processes in connection with machine learning of situations via pattern matching or recognition will be described in greater detail below in connection with FIG. 1. A gyroscope may utilize the Coriolis effect and may provide angular rate measurements in roll, pitch, or yaw dimensions and may be used, for example, in applications determining heading or azimuth changes. A magnetometer may measure the direction of a magnetic field in X, Y, Z dimensions and may be used, for example, in sensing true North or absolute heading in various navigation applications.
Following the above discussion, measurement signals received or obtained from a variety of sources of information, such as, for example, one or more sensors, applications, user actions, etc. may be integrated in some manner so as to make a more complete, accurate, or otherwise sufficient inference or classification of a motion state, activity, intention, goal, situation, etc. of an associated user. FIG. 1 illustrates an implementation of an example coordinate system 100 that may be used, in whole or in part, to facilitate or support one or more operations or techniques for machine learning of situations via pattern matching or recognition for use in or with a mobile device, such as a mobile device 102, for example. As illustrated, example coordinate system 100 may comprise, for example, a three-dimensional Cartesian coordinate system, though claimed subject matter is not so limited. In this illustrated example, one or more translational aspects or characteristics of motion of mobile device 102 representing, for example, acceleration vibration may be detected or measured, at least in part, by a suitable accelerometer, such as a 3D accelerometer, with reference to three dimensions or axes X, Y, and Z relative to an origin 104 of example coordinate system 100. It should be appreciated that example coordinate system 100 may or may not be aligned with a body of mobile device 102. It should also be noted that in certain implementations a non-Cartesian coordinate system may be used or that a coordinate system may define dimensions that are mutually orthogonal.
One or more rotational aspects or characteristics of motion of mobile device 102, such as orientation changes about gravity, for example, may also be detected or measured, at least in part, by a suitable accelerometer with reference to one or two dimensions. For example, rotational motion of mobile device 102 may be detected or measured in terms of coordinates (φ, τ), where phi (φ) represents roll or rotation about an X axis, as illustrated generally by arrow at 106, and tau (τ) represents pitch or rotation about a Y axis, as illustrated generally at 108. Accordingly, here, a 3D accelerometer may detect or measure, at least in part, a level of acceleration vibration as well as a change about gravity with respect to roll or pitch dimensions, for example, thus providing five dimensions of observability (X, Y, Z, φ, τ). It should be understood, however, that these are merely examples of various motions that may be detected or measured, at least in part, by an accelerometer with reference to example coordinate system 100, and that claimed subject matter is not limited to these particular motions or coordinate system.
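The roll (φ) and pitch (τ) observability described above can be illustrated with the standard tilt-from-gravity computation. This is a sketch under common sign and axis conventions, not the patent's method; conventions vary by device.

```python
import math

def roll_pitch_from_accel(ax: float, ay: float, az: float) -> tuple:
    """Estimate roll (rotation about X) and pitch (rotation about Y), in
    degrees, from a static 3-axis accelerometer reading dominated by
    gravity. One common convention; sign/axis choices are illustrative."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return math.degrees(roll), math.degrees(pitch)

# Device lying flat with gravity along +Z: no tilt in either dimension.
print(roll_pitch_from_accel(0.0, 0.0, 1.0))  # (0.0, 0.0)
```

Note that gravity alone gives only two rotational degrees of freedom, which is why the passage above speaks of five dimensions of observability (X, Y, Z, φ, τ) rather than six.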
At times, one or more rotational aspects or characteristics of motion of mobile device 102 may, for example, be detected or measured, at least in part, by a suitable gyroscope capable of providing adequate degrees of observability, just to illustrate another possible implementation. For example, a gyroscope may detect or measure rotational motion of mobile device 102 with reference to one, two, or three dimensions. Thus, gyroscopic rotation may, for example, be detected or measured, at least in part, in terms of coordinates (φ, τ, ψ), where phi (φ) represents roll or rotation 106 about an X axis, tau (τ) represents pitch or rotation 108 about a Y axis, and psi (ψ) represents yaw or rotation about a Z axis, as referenced generally at 110. A gyroscope may typically, although not necessarily, provide measurements in terms of angular acceleration or vibration (e.g., a change in an angle per unit of time squared), angular velocity (e.g., a change in an angle per unit of time), or the like. Of course, details relating to various motions that may be detected or measured, at least in part, by a gyroscope with reference to example coordinate system 100 are merely examples, and claimed subject matter is not so limited.
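Because a gyroscope reports angular rate (a change in angle per unit of time), an angle such as yaw is commonly tracked by integrating those rates over time. A minimal sketch follows, purely for illustration; it uses rectangular integration and ignores the bias and drift corrections any real implementation needs.

```python
def integrate_gyro(rates_dps, dt: float, initial_deg: float = 0.0) -> float:
    """Accumulate angular-rate samples (degrees/second), taken every dt
    seconds, into an angle in degrees (e.g., yaw about the Z axis).
    Simple rectangular integration; real systems correct bias and drift."""
    angle = initial_deg
    for rate in rates_dps:
        angle += rate * dt
    return angle

# Ten samples at 10 Hz of a steady 90 deg/s rotation: roughly a quarter turn.
print(integrate_gyro([90.0] * 10, dt=0.1))  # ~90
```

The same accumulation applies per axis to the (φ, τ, ψ) triple mentioned above when three-dimensional rates are available.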
In certain implementations, mobile device 102 may include one or more ambient environment or like sensors, such as, for example, an ambient light detector, a proximity sensor, a temperature sensor, a barometric pressure sensor, or the like. For example, a proximity sensor may typically comprise an infrared (IR) emitter-receiver pair placed sufficiently closely on mobile device 102 so as to detect a presence of nearby objects, measure a distance to such objects, etc. without physical contact. A proximity sensor may be often featured in mobile devices to turn off a display while not in use, for example, or deactivate a touch screen to avoid unwanted input during a call, or the like. Certain implementations of mobile device 102 may feature an ambient light detector to help in adjusting a touch screen backlighting or visibility of a display in a dimly lit environment, for example, via measuring an increase in luminous intensity of the ambient light. Ambient environment sensors are generally known and need not be described here in greater detail.
It should be appreciated that in some example implementations mobile device 102 may include other types of sensors or sensing devices beyond sensors or devices listed herein so as to facilitate or support machine learning of situations via pattern matching or recognition. For example, mobile device 102 may include one or more digital cameras that may track optical motion of an object or associated environment so as to make a context-relevant inference, facilitate or support context recognition, or the like. In addition, mobile device 102 may be equipped with a microphone, for example, and may be capable of sensing an audio that may be associated with a particular context or activity of a user, such as being in a gym, having a conversation, listening to the music, cooking or making coffee, watching a movie, or the like, as another possible example. In some instances, mobile device 102 may comprise one or more devices that may be used, at least in part, for sensing, such as, for example, GPS, WiFi, Bluetooth™-enabled devices, as previously mentioned. For example, a GPS-enabled device in conjunction with measurements from an accelerometer may enable mobile device 102 to make an inference with respect to a mode of transportation of a user, such as being in a car or riding a bike, taking a bus or train, or the like. Of course, these are merely examples relating to sensors that may be used, at least in part, in connection with machine learning of situations via pattern matching or recognition, and claimed subject matter is not so limited.
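The GPS-plus-accelerometer inference just mentioned can be illustrated with a toy rule-based classifier. Every threshold and label below is invented for illustration and is not taken from the disclosure.

```python
def infer_transport_mode(speed_mps: float, accel_variance: float) -> str:
    """Toy sensor-fusion rule: GPS-derived speed combined with
    accelerometer variance. All thresholds are illustrative only."""
    if speed_mps < 0.5:
        return "stationary"
    if speed_mps < 2.5:
        return "walking"
    if accel_variance > 1.0:  # strong vibration at moderate-to-high speed
        return "biking"
    return "vehicle"

print(infer_transport_mode(15.0, 0.2))  # vehicle
```

A learned classifier would replace these hand-set thresholds, but the fusion idea is the same: neither speed nor vibration alone separates biking from riding in a vehicle as well as the pair does.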
As alluded to previously, how to design or implement a machine learning approach for mobile devices to be able to understand what associated users are doing (e.g., user activities, intentions, goals, situations, etc.) so as to assist, participate, or, at times, intervene in a more meaningful way, for example, continues to be an area of development. In some instances, a learning approach, such as in supervised or unsupervised machine learning, for example, may include one or more signal-related pattern recognition techniques (e.g., statistical, structural, etc.) that may help to classify one or more sensor-related observations, as was indicated. Typically, although not necessarily, signal-related patterns may be specified or observed in a multi-dimensional space with respect to multiple sources of information. Thus, one or more patterns to be identified may, for example, be represented via one or more vectors of observations in multiple dimensions. As will be seen, in some instances, dimensions may correspond, for example, to a signal attribute (e.g., represented via a variable, etc.) in a set of information sources that may be monitored in some manner. At times, pattern recognition techniques may, for example, employ or utilize, at least in part, one or more pattern-matching templates, but some prior knowledge of an applicable domain may be needed or otherwise useful to find variations that may fit a somewhat generalized template, if any. Typical approaches to pattern matching or recognition may include, for example, utilizing or otherwise considering a relatively rigid specification of a particular pattern to be found. For example, at times, a match may imply that an identical pattern is found or located in one or more testing or training datasets, suitable information repositories, or the like.
`in addition. one or more suitable
`distance metrics may, for example. be applied in some man-
`
`10
`
`10
`
`

`

`7
`
`8
`
`US 8,?68,865 B2
`
`events of interest. By way of example but not limitation= a
`condition or event may include. for example. a time of day,
`day of week. state or action ofa host application, action of a
`user operating a mobile device (cg. silencing a ringer. mut-
`ing a call. sending a text message. etc .] or the likeniust to name
`a few examples. As will be described in greater detail below.
`in an implementation, upon or after detecting these one or
`more conditions or events. a mobile device may. for example,
`selectively initiate a process to attempt to recognize a particu-
`lar signal-related pattern that occurs in connection with the
`detected condition or event.
`
`More specifically. a subset clone or more varying param-
`eters or variables as socialed with a condition or event may, for
`example. be fixed in sortie manner and stored in a suitable
`database. As described below. such a subset may comprise.
`for example, a distinct signal-related pattern corresponding to
`a certain detected condition or event. just to illustrate one
`possible implementation. Such a condition or event-related
`pattern may be fixed. for example. by associating correspond-
`ing parameters or variables having a particular. distinct. or
`otherwise suitable pattern to represent the condition or event.
`In the next or otherwise suitable occurrence of such a condi—
`tion or event-related pattern, an electronic “snapshot“ of one
`or more other co-occurring signal-related patterns represen-
`tative of associated sensors‘ behavior may be captured. A
`suitable processor may then look or search for a pattern
`match, exact or approximate. in one or more other signal-
`related patterns every time a condition or event—related pat—
`tern occurs. for example. by utilizing a “snapshot.“ in whole
`or in part, using any suitable pattern matching processes or
`algorithms.
`To illustrate, a user may silence a ringer or mute a call.
`which may comprise a condition or event of interest. for
`example. and at that moment a “snapshot" of one or more
`sensors associated with a monitored information stream and
`
`nor, in whole or in part. to facilitate or support approximate
`pattern matching or recognition.
Since typical pattern recognition approaches generally employ processes or algorithms that work with a fixed, known number of information sources, pattern recognition with respect to a multi-dimensional information stream acquired or obtained via a suite of sensors may present a number of challenges. These challenges may include, for example, detecting or "picking up" patterns from a large number of information sources with an unknown or different subset of sources being relevant to different situations or contexts. In other words, in some instances, it may be somewhat difficult to detect or recognize an existing pattern if such a pattern is not pre-defined or pre-specified in some manner for a certain information source. Another challenge with typical approaches may be, for example, identifying one or more relevant situations and learning patterns that are correlated with or correspond to these relevant situations. Consider, for example, a multi-dimensional information stream captured or obtained via a variety of sensors with respect to a typical "return-home-after-work" experience of a user.
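One way the relevant-subset challenge above might be approached, as a hedged sketch only: across repeated occurrences of the same situation, keep the sensors whose readings stay consistent and discard the rest. The sensor names, the relative-spread threshold, and the consistency criterion below are all illustrative assumptions, not the patent's method.

```python
from statistics import mean, pstdev


def relevant_subset(snapshots, max_spread=0.2):
    """From several snapshots taken at repeated occurrences of the same
    condition or event, keep only sensors whose readings stay consistent.

    A sensor is kept when the population standard deviation of its values
    is within `max_spread` of the mean magnitude (a stand-in criterion).
    """
    # Only sensors observed in every occurrence can characterize it.
    common = set.intersection(*(set(s) for s in snapshots))
    pattern = {}
    for name in common:
        values = [s[name] for s in snapshots]
        m = mean(values)
        if pstdev(values) <= max_spread * max(abs(m), 1e-9):
            pattern[name] = m
    return pattern


# Three hypothetical snapshots captured at the same recurring situation
# (e.g., arriving at the office): WiFi signal and stillness are stable,
# while ambient audio varies and is therefore not part of the pattern.
occurrences = [
    {"wifi_rssi": -45.0, "accel_rms": 0.1, "audio_level": 40.0},
    {"wifi_rssi": -47.0, "accel_rms": 0.1, "audio_level": 70.0},
    {"wifi_rssi": -44.0, "accel_rms": 0.1, "audio_level": 55.0},
]
pattern = relevant_subset(occurrences)
```

The surviving subset would then serve as the distinct signal-related pattern for that situation, addressing the case where the relevant sources are not pre-specified.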
By way of example but not limitation, an example context plot 200 of a multi-dimensional sensor information stream captured or obtained in connection with certain simulations or experiments is illustrated in FIG. 2. For this example, a multi-dimensional sensor information stream is captured via a suite of sensors, such as, for example, an accelerometer, WiFi, ambient light detector, and microphone for an "Office-Parking Lot-Driving-Home" routine (e.g., between 5 and 6 pm, etc.) of a user. Here, an acceleration vibration may, for example, indicate that a user is driving or walking, a lost WiFi connectivity may indicate that a user is no longer at work (e.g., disconnected with a work-re
