US 20060004680A1

(12) Patent Application Publication: Robarts et al.
(10) Pub. No.: US 2006/0004680 A1
(43) Pub. Date: Jan. 5, 2006
`
`(54) CONTEXTUAL RESPONSES BASED ON
`AUTOMATED LEARNING TECHNIQUES
`
(76) Inventors: James O. Robarts, Redmond, WA (US);
Eric L. Matteson, Bellevue, WA (US)

Correspondence Address:
SEED INTELLECTUAL PROPERTY LAW GROUP PLLC
701 FIFTH AVE, SUITE 6300
SEATTLE, WA 98104-7092 (US)
`
`(21)
`(22)
`
`Appl. No.:
`
`11/033,974
`
`Filed:
`
`Jan. 11, 2005
`
Related U.S. Application Data

(63) Continuation of application No. 09/825,152, filed on
Apr. 2, 2001, now Pat. No. 6,842,877, which is a
continuation-in-part of application No. 09/216,193,
filed on Dec. 18, 1998, now Pat. No. 6,466,232, and
which is a continuation-in-part of application No.
09/464,659, filed on Dec. 15, 1999, now Pat. No.
6,513,046, and which is a continuation-in-part of
application No. 09/724,902, filed on Nov. 28, 2000.

(60) Provisional application No. 60/194,006, filed on Apr.
2, 2000. Provisional application No. 60/193,999, filed
on Apr. 2, 2000. Provisional application No.
60/194,123, filed on Apr. 2, 2000.
`
Publication Classification

(51) Int. Cl.
G06F 15/18 (2006.01)
G06F 17/00 (2006.01)
G06N 5/02 (2006.01)
G06F 3/00 (2006.01)
G06F 9/00 (2006.01)
(52) U.S. Cl. ........... 706/12; 706/46; 715/700
`
`(57)
`
`ABSTRACT
`
Techniques are disclosed for using a combination of explicit
and implicit user context modeling techniques to identify
and provide appropriate computer actions based on a current
context, and to continuously improve the providing of such
computer actions. The appropriate computer actions include
presentation of appropriate content and functionality. Feedback
paths can be used to assist automated machine learning
in detecting patterns and generating inferred rules, and
improvements from the generated rules can be implemented
with or without direct user control. The techniques can be
used to enhance software and device functionality, including
self-customizing of a model of the user's current context or
situation, customizing received themes, predicting appropriate
content for presentation or retrieval, self-customizing of
software user interfaces, simplifying repetitive tasks or
situations, and mentoring of the user to promote desired
change.
`
[Representative figure (400): Initialize CS and/or CC for theme set for
user (402). At CS, obtain data signals from input devices (404). At CS,
process data signals to produce attributes representing user's context,
including adding required data fields, such as attribute name, timestamp,
units, etc. (406). At CS, provide attribute to CM for storage (408). At
CC, request from CM and receive attributes (410). At CC, process
attributes to determine appropriate output (412). Provide output signal
to user, other entity or other process (414).]
`
Google Inc., Nest Labs, Inc., and Dropcam, Inc., Exhibit GOOG 1008, IPR of U.S. Pat. No. 8,315,618
`
`
`
`Patent Application Publication
`
`Jan. 5, 2006 Sheet 1 of 48
`
`US 2006/0004680 A1
`
[Fig. 1: block diagram of the overall environment, including various user
input devices (122), a microphone (124), various user sensor devices
(126), various environment sensor devices (128), a flat panel display
(130), an earpiece speaker (132), a head-mounted display (134), tactile
output (136), and various user output devices (138); the remaining figure
text is not recoverable from the OCR.]
`
[Fig. 2 (Sheet 2 of 48): block diagram of the computing device (120).
Input devices (122) include a video camera (121), various user sensor
devices (126), various environment sensor devices (128), various user
input devices, an EKG/EEG monitor, and a blood pressure monitor. A CPU
hosts context servers for audio input (214), video input (216), body
temperature (218), heart rate (220), blood pressure (222), blood O2
(224), and other CSs (226), feeding a CM (232) and context clients such
as an application program, a browser program (238), a formatter and
visual display selector (248), and other CCs. Output devices (138)
include a head-mounted display (134), a flat panel display (130), an
earpiece speaker (132), tactile output (136), a defibrillator (250), and
a telecom transceiver.]
`
`
`
[Fig. 3 (Sheet 3 of 48): diagram (300) of code modules interacting with
the CM (232): code module A (CS), code module B (CC), code module C
(CS/CC), and code module D (CS/CC).]
`
[Fig. 4 (400): flow diagram. Initialize CS and/or CC for theme set for
user (402). At CS, obtain data signals from input devices (404). At CS,
process data signals to produce attributes representing user's context,
including adding required data fields, such as attribute name, timestamp,
units, etc. (406). At CS, provide attribute to CM for storage (408). At
CC, request from CM and receive attributes (410). At CC, process
attributes to determine appropriate output (412). Provide output signal
to user, other entity or other process (414).]
`
`
`
[Fig. 5A (500, Sheet 4 of 48): GPS context server flow. Launch at
start-up (502). Periodically receive GPS data stream (504). Parse and
store GPS data (506). If a request for a latitude, longitude or altitude
attribute is received (508), provide the requested attribute value(s),
with timestamp, uncertainty, units, etc.]

[Fig. 5B (520): attribute data structure with fields Name (521), Latitude
(value) (522), Uncertainty (524), Timestamp (526), Units (528), and
Format version/flags (530).]
`
`
`
[Fig. 6 (600, Sheet 5 of 48): context client flow. Register for latitude,
longitude and altitude attributes (602). During normal operation, use the
default attribute mediator (606); otherwise reevaluate all latitude
attributes (608), select the lowest-uncertainty attribute mediator (610),
and determine the CS providing the resulting attributes (612). Request
latitude, longitude and altitude attributes from the CS/CM (614). If the
notify feature is active (616), receive input defining a boundary,
provide event parameters to the CM regarding boundary traversal (620),
and upon receiving a response from the CM to the established event,
notify the user that the boundary has been crossed (624).]
`
`
`
[Fig. 7 (700, Sheet 6 of 48): figure text is largely not recoverable from
the OCR; legible fragments indicate multiple input data sources feeding
corresponding code modules, with output to a CS or CC module.]
`
`
`
[Fig. 8 (Sheet 7 of 48): example attribute taxonomy.]

User.
Desired_privacy_level
Interruptibility
Speed
Direction
Acceleration
Availability.
Cognitive_availability
Tactile_availability
Manual_availability
Visual_availability
Oral_availability
Auditory_availability
Proximity.<item or place name>
Mood.
Happiness
Sadness
Anger
Frustration
Confusion
Activity.
Driving
Eating
Running
Sleeping
Talking
Typing
Walking
Location.
Place_name
Latitude
Longitude
Altitude
Room
Floor
Building
Address
Street
City
County
State
Country
Postal_Code
Destination. (same as User.Location.)
Physiology.
Pulse
Body_temperature
Blood_pressure
Respiration
Person.<name or ID>. (same as User.)
Platform.
UI.
Oral_input_device_availability
Manual_input_device_availability
Tactile_output_device_availability
Visual_output_device_availability
Auditory_output_device_availability
CPU.
Load
Speed
Memory.
Total_capacity
Used
Storage.
Total_capacity
Used
Connectivity.
Connection_status
Connection_speed
Connection_type/device
Connection_activity
Power.
Power_source
Power_level
Environment.
People.
Nearest
Number_present
Number_close
Local.
Time
Date
Temperature
Pressure
Wind_speed
Wind_direction
Absolute_humidity
High_forecast_temperature
Low_forecast_temperature
People_present
Ambient_noise_level
Ambient_light_level
Days.<previous or future>.
High_temperature
Low_temperature
Precipitation_type
Precipitation_amount
Place.<place name>. (same as Environment.Local)
Application.
Mail.
Available
New_messages_waiting
Messages_waiting_to_be_sent
Phone.
Available
In_use
On/off
Notification_mechanism
Call_incoming
Caller_ID
Sound_recorder.
Available
Recording
`
`
`
[Fig. 9 (Sheet 8 of 48): example context categorization.]

User Setting
Mental Context
Meaning
Cognition
Divided User Attention
Task Switching
Background Awareness
Solitude
Privacy
Desired Privacy
Perceived Privacy
Social Context
Affect
Physical Situation
Body
Biometrics
Posture
Motion
Physical Encumberment
Senses
Eyes
Ears
Tactile
Hands
Nose
Tongue
Workload demands/effects
Interaction with computer devices
Interaction with people
Physical Health
Environment
Time/Space
Objects
Persons
Audience/Privacy Availability
Scope of Disclosure
Hardware affinity for privacy
Privacy Indicator for User
Privacy Indicator for Public
Watching Indicator
Being Observed Indicator
Ambient Interference
Visual
Audio
Tactile

Computer
Power
Configuration
User Input Systems
Hand/Haptic
Keyboard/Keystrokes
Voice/Audio
Eye Tracking
Cursors
Axis
Resolution
Selection
Invocation
Accelerators
Output Systems
Visual
Resolution
Audio
Public/Private
Haptic
External Resources
I/O devices
Connectivity

Data
Quantity/State
Urgency/Importance
Use of Prominence
Modality
Sensitivity/Purpose
Privacy Issues
Use of Association
Use of Safety
Source/Ownership
Types
User generated
Other computers or people
Sensor
PC State
Use of Association
`
`
`
[Sheets 9 through 21 of 48: figure text is not recoverable from the OCR.]
`
`
`
`
[Figs. 11M, 11N and 11O (Sheet 22 of 48): example theme user interfaces.]

Fig. 11M, Example Current Theme Selection: a "Select A New Current Theme"
dialog (1150) listing Starting The Day, Driving Home, and At Home; a
Work-Related Group (1155) with Driving To Work, Arriving At Work, and At
Work; an Entertainment-Related Group; and a control (1157).

Fig. 11N, Example Current Theme Set Modification: a "Current Theme Set"
dialog (1150) listing Starting The Day, At Home, and Eating Breakfast,
with an Add Theme control (1157).

Fig. 11O, Example Current Theme Layout Selection: a "Select A New Current
Theme Layout" dialog for the Starting The Day theme (1160), listing
"Starting The Day - default layout", "Starting The Day - vacation", and
"Starting The Day - responsible for dropping off kids", with a Cancel
control (1157).
`
`
`
[Sheets 23 through 30 of 48: figure text is not recoverable from the OCR.]
`
`
`
`
[Fig. 13 (Sheet 31 of 48): Example Theme Data Structure (1300), with
header fields 1302 and 1304 and sections 1305-1390.]

Intent: "Is To Organize Both Driving Info & Info About Work Arrival".
Permissions include: Default; Self; Access: Self + Family + Friends +
Source; Modification: Self.

The Theme-Matching section includes entries such as:
user.activity.driving = true (required);
environment.local.time = 08:00-09:00 (optional);
DTW-BCD.driving-direction : NOT "South" (required);
with DTW-BCD.driving-direction supplied by source = "BCD-DTW CS".

The Theme-Logic section includes rules such as:
IF <user.activity.driving> AND user.location.latitude > recent
user.location.latitude THEN DTW-BCD.driving-direction = "North";
IF DTW-BCD.driving-direction = "East" THEN Alert("You're Lost Again!");
SET user.location Access = Family & Co-workers;

The Theme-Content section lists attributes in a table with columns Name,
Value, Uncertainty, Timestamp (e.g., 07:32 03/24/xx), Units and Access.
`
`
`
[Fig. 14 (Sheet 32 of 48): block diagram relating theme servers, theme
user computing devices, and theme administration; figure text is largely
not recoverable from the OCR.]
`
`
`
`
`
`
[Fig. 15 (Sheet 33 of 48): theme usage routine (1500).]

Receive indications of currently available themes and their associated
theme information (1505). Determine themes that match the current context
(1510, 1515). Either select the highest priority matching theme (1525),
or present matching or all themes (1520) and receive an indication of the
selected theme (1530). Generate an appropriate response to the selected
theme (1545). Detect changes to context or user indication (1555). On a
user indication (1560), present current context information to the user;
if the user chooses to change context information (1570), receive an
indication of the context changes (1580); otherwise provide the indicated
functionality to the user (1575). Continue or end (1595).
`
`
`
[Fig. 16 (Sheet 34 of 48): response generator subroutine (1540).]

Receive an indication of the selected theme and of its associated theme
information (1605). If one or more theme layouts are associated (1610),
select the default theme layout if multiple are associated (1620);
retrieve the content specified by the theme layout; perform the data
logic specified by the theme layout (1630); retrieve user preference
information; and present the content in the manner specified by the
presentation logic of the theme layout and in a manner consistent with
the preference information. If another response is specified by the theme
layout, provide that other response as appropriate.
`
`
`
[Fig. 17 (Sheet 35 of 48): theme creator/modifier routine (1700).]

Receive an indication of a theme to be created or modified (1705) and of
information associated with the theme (1710-1725). Receive an indication
from the user to modify specified properties of the theme (1730); modify
the theme property as indicated by the user (1755); present the theme
layout to the user in a visual editor (1740) and modify the theme layout
as indicated by the user; or modify the theme attribute, theme CS, or
theme CC as indicated by the user. When no more changes are to be made to
the theme (1765), store the theme and theme-related information (1770),
and repeat if there are more themes (1775).
`
`
`
[Fig. 18 (Sheet 36 of 48): theme distributor routine (1800).]

Receive indications of available themes (1805). Receive a request for
theme(s) or an indication of an accessible user (1810); retrieve
indications of appropriate users for the available themes; retrieve
information about the user from a database or from the user's computing
device (1820); and determine themes that are appropriate for the user.
Determine whether the user has provided appropriate access information
(1830) and whether any needed payment information has been provided
(1840, 1850). If access and payment are appropriate, gather the
theme-related information associated with the requested or determined
themes, including the themes themselves, and send the gathered
information to the requesting or indicated user.
`
`
`
[Fig. 19 (Sheet 37 of 48): theme receiver routine (1900).]

Receive a user request for a theme or an indication of a sent theme
(1905, 1910). For a request, attempt to determine a theme server that can
provide the requested theme (1915); if a theme server is found (1920),
receive appropriate access information for the theme and theme server,
either from the user or from storage (1925), receive an indication of a
payment mechanism if needed, either from the user or from storage, and
send a request for the theme to the theme server, including any access
information and payment information; otherwise indicate failure to the
user. For a sent theme (1940), if payment information is needed (1945),
send the payment information (1955); if payment is accepted (1960),
receive the theme and any associated theme-related information, then
store the received information and load it for use if immediately useful.
`
`
`
[Fig. 20 (Sheet 38 of 48): automated theme customizer routine (2000).]

Receive indications of available themes (2005). Repeatedly monitor user
actions, including interactions with themes, modifications to themes, and
explicit changes to the current theme or the current context (2015), and
detect patterns of actions. If any patterns are above a relevance
threshold (2020), retrieve user preference information (2025) and
determine modifications to some or all themes that are consistent with
the patterns above the relevance threshold and with the user preferences.
If appropriate (2035), present the suggested modifications to the user
(2040) and apply the accepted modifications (2045, 2060, 2065); continue
or end (2095).
`
`
`
[Sheets 39 through 41 of 48 (Figs. 21-23): diagrams relating to context
models and inference; figure text is not recoverable from the OCR.]
`
`
`
`
[Fig. 24 (Sheet 42 of 48): data-flow diagram. Inputs include Phone,
Internet, Time, Computer Network, GPS, Radio & TV, paging, Verbal, Mouse,
and Keyboard, feeding a Message Loop. Processing components include Aging
& Compression, Filters & Builds Datastore, Pattern Recognition, a
Preference Pattern store, and an Inference Engine. Resulting Actions
include Modify Windows; Manage PIM; Initiate calls, pages, faxes, email,
web searches; and Display Maps.]
`
`
`
[Fig. 25 (Sheet 43 of 48): context-based automated learning routine (2500).]

Receive context information (2505). Add the context information to the
explicit context model, and provide responses as indicated by the
explicit context rules (2510). Provide context information from the
explicit context model to the implicit context model with inference
engines (2520). Detect patterns in the explicit context information, and
suggest an inferred rule with an appropriate response (2525). Verify the
appropriateness of the suggested rule (2530). If appropriate, store the
suggested rule, and provide the suggested response if appropriate.
`
`
`
[Fig. 26 (Sheet 44 of 48): self-customizing context routine (2600).]

Receive current context information (2605). Receive an indication of
modeled context attribute information from the explicit context model
that reflects the received context (2610). Receive an indication of a
user selection of new modeled context attribute information, or of a user
indication of appropriate modeled context attribute information. Add all
received information (and changes in current context from the previous
context) to the appropriate context-awareness-customizing implicit model
and inference engine. Detect a pattern in the indicated user selections
or indications, and generate a corresponding inferred rule to change the
modeled context attribute information in an appropriately customized
manner (2625). Verify the appropriateness of the generated rule (2630).
If appropriate (2635), store the generated rule for future use (2640);
continue (2645) or end (2695).
`
`
`
[Fig. 27 (Sheet 45 of 48): appropriate-content routine (2700).]

Receive current context information (2705). Receive an indication of a
user selection of information, or of a user indication of appropriate
information. Add the current context information (including changes from
the previous context) and the indicated user selection or indication to
the appropriate content implicit model and inference engine (2715).
Detect a pattern in the indicated user selections of information, and
generate a corresponding inferred rule to provide appropriate information
(2720, 2725). Verify the appropriateness of the generated rule (2730). If
appropriate, store the generated rule for future use; continue or end
(2795).
`
`
`
[Fig. 28 (Sheet 46 of 48): UI self-customizing routine (2800).]

Receive current context information (2805). Receive an indication of user
changes to UI functionality, or of a user indication of appropriate UI
functionality (2810). Add the received information (including changes
from the previous context) to the appropriate UI-optimizing implicit
model and inference engine. Detect a pattern in the user changes or
indications, and generate a corresponding inferred rule to provide
appropriate UI functionality. Verify the appropriateness of the generated
rule (2830). If appropriate (2835), store the generated rule for future
use; continue or end (2895).
`
`
`
[Fig. 29 (Sheet 47 of 48): task simplification routine (2900).]

Receive current context information (2905). Receive an indication of a
user task performed having multiple steps, or of a user indication that a
task is appropriate. Add the received information (including changes from
the previous context) to the appropriate task-simplification implicit
model and inference engine. Detect a pattern in the task, and generate a
corresponding inferred rule to provide some or all of the appropriate
task. Verify the appropriateness of the generated rule (2930). If
appropriate (2935), store the generated rule for future use.
`
`
`
[Fig. 30 (Sheet 48 of 48): mentoring routine (3000).]

Receive current context information (3005). If explicit mentoring rules
exist (3010), retrieve them (3035), determine whether they match the
current context, and if so prompt the user as indicated (3040); if a
"Why?" question is received after the prompt, generate and provide a
response based on the explicit rules. Receive an indication from the user
of a user action that should have been performed based on the current
context, other than any prompted actions. Add the received information
(including changes from the previous context) to the appropriate
mentoring implicit model and inference engine. Detect a pattern in the
indicated user actions and current context information, and generate an
inferred rule to provide the indicated action in the context for the
pattern (3055). Verify the appropriateness of the generated rule (3060).
If appropriate, modify the explicit rules (or store the inferred rule if
there are no explicit rules) to reflect the inferred rule.
`
`
`
`
`CONTEXTUAL RESPONSES BASED ON
`AUTOMATED LEARNING TECHNIQUES
`
`CROSS-REFERENCE TO RELATED
`APPLICATIONS
`
`[0001] This application is a continuation-in-part of U.S.
`patent application Ser. No. 09/216,193 (Attorney Docket
`No. 29443-8001), filed Dec. 18, 1998 and entitled “Method
`and System for Controlling Presentation of Information to a
`User Based on the User’s Condition”; of U.S. patent appli-
`cation Ser. No. 09/464,659 (Attorney Docket No. 29443-
`8003), filed Dec. 15, 1999 and entitled “Storing and Recall-
`ing Information to Augment Human Memories”; and of U.S.
`patent application Ser. No. 09/724,902 (Attorney Docket
`No. 29443-8002), filed Nov. 28, 2000 and entitled “Dynami-
`cally Exchanging Computer User’s Context,” which claims
`the benefit of provisional U.S. Patent Application No.
60/194,006 filed Apr. 2, 2000. Each of these applications is
hereby incorporated by reference in its entirety.
`
`[0002] This application also claims the benefit of provi-
`sional U.S. Patent Application No. 60/193,999 (Attorney
`Docket # 29443-8008), filed Apr. 2, 2000 and entitled
`“Obtaining And Using Contextual Data For Selected Tasks
Or Scenarios, Such As For A Wearable Personal Computer,"
`and of provisional U.S. Patent Application No. 60/194,123
`(Attorney Docket # 29443-8024), filed Apr. 2, 2000 and
`entitled “Supplying And Consuming User Context Data,”
`both of which are hereby incorporated by reference in their
`entirety.
`
`TECHNICAL FIELD
`
`[0003] The invention described below relates generally to
`using various information to allow a system to automatically
`enhance its responses to changing contextual information,
such as information about a user and the user's surroundings.
`
`BACKGROUND
`
`[0004] Existing computer systems provide little apprecia-
`tion of a user’s overall condition or context, and as a result
`they can effectively respond to only a limited number of
`changes in parameters that they monitor. For example, with
`respect to the low-level physical status of the user, numerous
`devices exist for monitoring the physical parameters of the
`user, such as heart rate monitors that provide user pulse or
`heart rate data. While many of these devices simply provide
`information to the user regarding current values of a user’s
`health condition, others (e.g., a defibrillator or a system with
`an alarm) are capable of providing a corresponding response
`if a monitored parameter exceeds (or falls below) a threshold
`value. However, since such devices lack important informa-
`tion about the specific context of the user (e.g., whether the
user is currently exercising or is currently sick), any response will