(12) Patent Application Publication    (10) Pub. No.: US 2006/0004680 A1
Robarts et al.                         (43) Pub. Date: Jan. 5, 2006

US 20060004680A1
`
`(54) CONTEXTUAL RESPONSES BASED ON
`AUTOMATED LEARNING TECHNIQUES
`
(76) Inventors: James O. Robarts, Redmond, WA
(US); Eric L. Matteson, Bellevue, WA
(US)
`Correspondence Address:
`SEED INTELLECTUAL PROPERTY LAW
`GROUP PLLC
`701 FIFTH AVE
`SUITE 6300
`SEATTLE, WA 98104-7092 (US)
`
`(21)
`(22)
`
`Appl. No.:
`
`11/033,974
`
`Filed:
`
`Jan. 11, 2005
`
Related U.S. Application Data
`
(63) Continuation of application No. 09/825,152, filed on
Apr. 2, 2001, now Pat. No. 6,842,877, which is a
continuation-in-part of application No. 09/216,193,
filed on Dec. 18, 1998, now Pat. No. 6,466,232, and
which is a continuation-in-part of application No.
09/464,659, filed on Dec. 15, 1999, now Pat. No.
6,513,046, and which is a continuation-in-part of
application No. 09/724,902, filed on Nov. 28, 2000.
`
(60) Provisional application No. 60/194,006, filed on Apr.
2, 2000. Provisional application No. 60/193,999, filed
on Apr. 2, 2000. Provisional application No.
60/194,123, filed on Apr. 2, 2000.
`
Publication Classification
`
(51) Int. Cl.
     G06F 15/18  (2006.01)
     G06F 17/00  (2006.01)
     G06N 5/02   (2006.01)
     G06F 3/00   (2006.01)
     G06F 9/00   (2006.01)
(52) U.S. Cl. ......................... 706/12; 706/46; 715/700
`
`(57)
`
`ABSTRACT
`
Techniques are disclosed for using a combination of explicit and implicit user context modeling techniques to identify and provide appropriate computer actions based on a current context, and to continuously improve the providing of such computer actions. The appropriate computer actions include presentation of appropriate content and functionality. Feedback paths can be used to assist automated machine learning in detecting patterns and generating inferred rules, and improvements from the generated rules can be implemented with or without direct user control. The techniques can be used to enhance software and device functionality, including self-customizing of a model of the user's current context or situation, customizing received themes, predicting appropriate content for presentation or retrieval, self-customizing of software user interfaces, simplifying repetitive tasks or situations, and mentoring of the user to promote desired change.
`
Representative drawing (Fig. 4, routine 400):
  402  Initialize CS and/or CC for theme set for user
  404  At CS, obtain data signals from input devices
  406  At CS, process data signals to produce attributes representing user's context, including adding required data fields, such as attribute name, timestamp, units, etc.
  408  At CS, provide attribute to CM for storage
  410  At CC, request from CM and receive attributes
  412  At CC, process attributes to determine appropriate output
  414  Provide output signal to user, other entity or other process
`
`Google Inc., Nest Labs, Inc., and Dropcam, Inc.
`GOOG 1008
`IPR of US Pat. No. 8,311,524
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 1 of 48    US 2006/0004680 A1

[Sheet 1: block diagram of the user's body-mounted computer, user and environment sensor devices, user input and output devices, and a non-portable computer; drawing-label text unrecoverable from the OCR.]
`
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 2 of 48    US 2006/0004680 A1

[Fig. 2: block diagram of body-mounted computer 120 with CPU, input devices (video camera 121, audio input CS 214, video input CS 216, body temp. CS 218, heart rate CS 220, blood pressure CS 222, blood O2 CS 224, EKG/EEG monitor, other CSs 226), CM 232, CCs (formatter and visual display selector 248, other CCs), application program 236, browser program 238, and output devices (flat panel display 130, earpiece speaker 132, head-mounted display 134, tactile output 136, telecom. transceiver, defibrillator 250); various user input devices 122, various user sensor devices 126, various environment sensor devices 128, various user output devices 138.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 3 of 48    US 2006/0004680 A1

[Fig. 3 (300): code module A (CS), code module B (CC), code module C (CS/CC), and code module D (CS/CC), each connected to the CM 232.]

Fig. 4 (routine 400):
  402  Initialize CS and/or CC for theme set for user
  404  At CS, obtain data signals from input devices
  406  At CS, process data signals to produce attributes representing user's context, including adding required data fields, such as attribute name, timestamp, units, etc.
  408  At CS, provide attribute to CM for storage
  410  At CC, request from CM and receive attributes
  412  At CC, process attributes to determine appropriate output
  414  Provide output signal to user, other entity or other process
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 4 of 48    US 2006/0004680 A1

Fig. 5A (GPS context server routine 500):
  502  Launch at start-up
  504  Periodically receive GPS data stream
  506  Parse and store GPS data
  508  Receive request for latitude, longitude or altitude attribute?
       Provide requested attribute value(s), with timestamp, uncertainty, units, etc.

Fig. 5B (attribute data structure 520):
  521  Name
  522  Latitude (value)
  524  Uncertainty
  526  Timestamp
  528  Units
  530  Format version/flags
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 5 of 48    US 2006/0004680 A1

Fig. 6 (routine 600):
  602  Register for latitude, longitude and altitude attributes
  604  Normal operation?
  606  Use default attribute mediator
  608  Reevaluate all latitude attributes
  610  Select lowest-uncertainty attribute mediator
  612  Determine CS providing resulting attributes
  614  Request latitude, longitude and altitude attributes from CS/CM
  616  Notify feature active?
       Receive input defining boundary
  620  Provide event parameters to CM regarding boundary traversal
  622  Receive response from CM to established event?
  624  Notify user that boundary has been crossed
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 6 of 48    US 2006/0004680 A1

[Fig. 7 (700): data signals from sources 1 through N flow through input data logic to code modules 1 through N (each a CS, CC, or CS/CC, including an input/output module); most label text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 7 of 48    US 2006/0004680 A1

Fig. 8

User.
  Desired_privacy_level
  Interruptibility
  Speed
  Direction
  Acceleration
  Availability.
    Cognitive_availability
    Tactile_availability
    Manual_availability
    Visual_availability
    Oral_availability
    Auditory_availability
  Proximity.<item or place name>
  Mood.
    Happiness
    Sadness
    Anger
    Frustration
    Confusion
  Activity.
    Driving
    Eating
    Running
    Sleeping
    Talking
    Typing
    Walking
  Location.
    Place_name
    Latitude
    Longitude
    Altitude
    Room
    Floor
    Building
    Address
    Street
    City
    County
    State
    Country
    Postal_Code
  Destination. (same as User.Location.)
  Physiology.
    Pulse
    Body_temperature
    Blood_pressure
    Respiration
Person.<name or ID>. (same as User.)
Platform.
  UI.
    Oral_input_device_availability
    Manual_input_device_availability
    Tactile_output_device_availability
    Visual_output_device_availability
    Auditory_output_device_availability
  CPU.
    Load
    Speed
  Memory.
    Total_capacity
    Used
  Storage.
    Total_capacity
    Used
  Connectivity.
    Connection_status
    Connection_speed
    Connection_type/device
    Connection_activity
  Power.
    Power_source
    Power_level
Environment.
  People.
    Nearest
    Number_present
    Number_close
  Local.
    Time
    Date
    Temperature
    Pressure
    Wind_speed
    Wind_direction
    Absolute_humidity
    High_forecast_temperature
    Low_forecast_temperature
    People_present
    Ambient_noise_level
    Ambient_light_level
  Days.<previous or future>.
    High_temperature
    Low_temperature
    Precipitation_type
    Precipitation_amount
  Place.<place name>. (same as Environment.Local)
Application.
  Mail.
    Available
    New_messages_waiting
    Messages_waiting_to_be_sent
  Phone.
    Available
    In_use
    On/off
    Notification_mechanism
    Call_incoming
    Caller_ID
  Sound_recorder.
    Available
    Recording
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 8 of 48    US 2006/0004680 A1

Fig. 9

User Setting
Mental Context
Meaning
Cognition
Divided User Attention
Task Switching
Background Awareness
Solitude
Privacy
Desired Privacy
Perceived Privacy
Social Context
Affect
Physical Situation
Body
Biometrics
Posture
Motion
Physical Encumberment
Senses
Eyes
Ears
Tactile
Hands
Nose
Tongue
Workload demands/effects
Interaction with computer devices
Interaction with people
Physical Health
Environment
Time/Space
Objects
Persons
Audience/Privacy Availability
Scope of Disclosure
Hardware affinity for privacy
Privacy Indicator for User
Privacy Indicator for Public
Watching Indicator
Being Observed Indicator
Ambient Interference
Visual
Audio
Tactile

Computer
Power
Configuration
User Input Systems
Hand/Haptic
Keyboard/Keystrokes
Voice/Audio
Eye Tracking
Cursors
Axis
Resolution
Selection
Invocation
Accelerators
Output Systems
Visual
Resolution
Audio
Public/Private
Haptic
External Resources
I/O devices
Connectivity

Data
Quantity/State
Urgency/Importance
Use of Prominence
Modality
Sensitivity/Purpose
Privacy Issues
Use of Association
Use of Safety
Source/Ownership
Types
User generated
Other computers or people
Sensor
PC State
Use of Association
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 9 of 48    US 2006/0004680 A1

[Fig. 10: block diagram; label text unrecoverable from the OCR.]
`
`
`
`
`
`
`
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 10 of 48    US 2006/0004680 A1

[Sheet 10: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 11 of 48    US 2006/0004680 A1

[Sheet 11: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 12 of 48    US 2006/0004680 A1

[Sheet 12: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 13 of 48    US 2006/0004680 A1

[Sheet 13: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 14 of 48    US 2006/0004680 A1

[Sheet 14: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 15 of 48    US 2006/0004680 A1

[Sheet 15: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 16 of 48    US 2006/0004680 A1

[Sheet 16: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 17 of 48    US 2006/0004680 A1

[Sheet 17: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 18 of 48    US 2006/0004680 A1

[Sheet 18: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 19 of 48    US 2006/0004680 A1

[Sheet 19: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 20 of 48    US 2006/0004680 A1

[Sheet 20: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 21 of 48    US 2006/0004680 A1

[Sheet 21: example display, apparently from the Fig. 11 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 22 of 48    US 2006/0004680 A1

Figure 11M (Example Current Theme Selection):
  Select A New Current Theme (1150):
    Starting The Day
    Driving Home
    At Home
    Work-Related Group (1155):
      Driving To Work
      Arriving At Work
      At Work
    Entertainment-Related Group
  Cancel (1157)

Figure 11N (Example Current Theme Set Modification):
  Current Theme Set (1150):
    Starting The Day
    Eating Breakfast
  Cancel (1157)

Figure 11O (Example Current Theme Layout Selection):
  Select A New Current Theme Layout - Starting The Day theme (1160):
    Starting The Day - default layout
    Starting The Day - vacation
    Starting The Day - responsible for dropping off kids
  Cancel (1157)
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 23 of 48    US 2006/0004680 A1

[Sheet 23: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 24 of 48    US 2006/0004680 A1

[Sheet 24: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 25 of 48    US 2006/0004680 A1

[Sheet 25: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 26 of 48    US 2006/0004680 A1

[Sheet 26: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 27 of 48    US 2006/0004680 A1

[Sheet 27: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 28 of 48    US 2006/0004680 A1

[Sheet 28: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 29 of 48    US 2006/0004680 A1

[Sheet 29: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 30 of 48    US 2006/0004680 A1

[Sheet 30: example display, apparently from the Fig. 12 series; text unrecoverable from the OCR.]
`
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 31 of 48    US 2006/0004680 A1

Fig. 13 (Example Theme Data Structure 1300; elements 1302, 1304, 1305, 1310, 1315, 1320, 1325, 1330, 1335, 1340, 1345, 1350, 1355, 1360, 1365, 1370, 1390):
  Intent: Is To Organize Both Driving Info & Info About Work Arrival
  Default: "*"; Self: "******"; ...
  Access: Self + Family + Friends + Source; Modification: Self; ...
  Theme-Content:
    DTW-BCD.driving-direction: source = "BCD-DTW CS"; ...
  Theme-Matching:
    user.activity.driving = true (required); environment.local.time = 08:00-09:00 (optional);
    DTW-BCD.driving-direction: NOT "South" (required); ...
  Theme-Logic:
    IF <user.activity.driving> AND user.location.latitude > recent.user.location.latitude
      THEN DTW-BCD.driving-direction = "North";
    IF DTW-BCD.driving-direction = "East" THEN Alert("You're Lost Again!!");
    SET user.location Access = Family & Co-workers;
  <link>; ... <link>
  Attribute 1 ... Attribute N: Name, Value, Uncertainty, Timestamp (e.g., 07:32 03/24/xx), Units, Access (e.g., None)
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 32 of 48    US 2006/0004680 A1

[Fig. 14: diagram of theme-related components and data structures (elements in the 14xx range); label text largely unrecoverable from the OCR.]
`
`
`
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 33 of 48    US 2006/0004680 A1

Fig. 15 (Theme usage routine 1500):
  1505  Receive indications of currently available themes and their associated theme information
  1510  Determine themes that match current context
  1520  Present matching or all themes
  1525  Receive indication of selected theme
  1530  Select highest priority matching theme
  1540  Generate appropriate response to selected theme
  1545  Detect changes to context or user indication
  1570  Present current context information to user
  1575  Change context information?
  1580  Receive indication of context changes
        Provide indicated functionality to user
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 34 of 48    US 2006/0004680 A1

Fig. 16 (Response generator subroutine 1540):
  1605  Receive indication of selected theme and of associated theme information
  1610  Associated theme layout(s)?
  1620  Select default theme layout if multiple are associated
  1630  Retrieve content specified by theme layout
        Perform data logic specified by theme layout
        Retrieve user preference information
        Present content in manner specified by presentation logic of theme layout and in a manner consistent with preference information
        Other response specified by theme layout?
        Provide other response as appropriate
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 35 of 48    US 2006/0004680 A1

Fig. 17 (Theme creator/modifier routine 1700):
  1705  Receive indication of theme to be created or modified
  1730  Receive indication from user to modify specified properties of theme or information associated with theme
  1740  Present theme layout to user in visual editor
  1755  Modify theme property as indicated by user
        Modify theme layout as indicated by user
        Modify theme attribute, theme CS, or theme CC as indicated by user
        More changes to theme?
  1765  More themes?
  1770  Store theme and theme-related information
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 36 of 48    US 2006/0004680 A1

Fig. 18 (Theme distributor routine 1800):
  1805  Receive indications of available themes
  1810  Retrieve indications of appropriate users for the available themes
        Receive request for theme(s) or indication of accessible user
  1820  Retrieve information about user from database or user's computing device
  1840  Determine themes that are appropriate for user
        Determine if user has provided appropriate access information
  1830  Appropriate access?
        Determine if any needed payment information has been provided
  1850  Payment provided?
        Gather theme-related information associated with requested or determined themes, including the themes
        Send gathered information to requesting or indicated user

Patent Application Publication    Jan. 5, 2006  Sheet 37 of 48    US 2006/0004680 A1

Fig. 19 (Theme receiver routine 1900):
  1905  Receive user request for theme or indication of sent theme
  1915  Attempt to determine theme server that can provide requested theme
  1920  Theme server found?
  1925  Receive appropriate access information for theme and theme server, either from user or from storage
        Send request for theme to theme server, including any access information and payment information
  1940  Receive indication of sent theme
  1945  Payment information needed?
        Receive indication of payment mechanism if needed, either from user or from storage
        Send payment information
  1955  Payment accepted?
  1960  Indicate failure to user
        Receive theme and any associated theme-related information
        Store received information, and load for use if immediately useful
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 38 of 48    US 2006/0004680 A1

Fig. 20 (Automated theme customizer routine 2000):
  2005  Receive indications of available themes
        Repeatedly monitor user actions, including interactions with themes, modifications to themes, and explicit changes to the current theme or the current context
  2015  Detect patterns of actions
  2020  Any patterns above relevance threshold?
  2025  Retrieve user preference information
        Determine modifications to some or all themes that are consistent with patterns above the relevance threshold and with user preferences
  2040  Present suggested modifications to the user
  2045  [Perform] modifications
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 39 of 48    US 2006/0004680 A1

[Fig. 21: diagram; label text unrecoverable from the OCR.]
`
`
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 40 of 48    US 2006/0004680 A1

[Fig. 22: diagram; label text unrecoverable from the OCR.]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 41 of 48    US 2006/0004680 A1

[Fig. 23: diagram; label text unrecoverable from the OCR.]
`
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 42 of 48    US 2006/0004680 A1

[Fig. 24: system overview. External sources (Phone, Internet, Time, GPS, Radio & TV, paging) connect through a Computer Network; local inputs (Verbal, GPS, Mouse, Keyboard, Time) feed a Message Loop that Filters & Builds Datastore; Preference and Pattern data, with Aging & Compression, feed Pattern Recognition and an Inference Engine, which drive Actions (Modify Windows; Manage PIM; Initiate calls, pages, faxes, email, web searches; Display Maps).]
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 43 of 48    US 2006/0004680 A1

Fig. 25 (Context-Based Automated Learning routine 2500):
  2505  Receive context information
  2510  Add context information to explicit context model, and provide responses as indicated by explicit context rules
        Provide context information from explicit context model to implicit context model with inference engines
  2520  Detect patterns in explicit context information, and suggest inferred rule with appropriate response
  2525  Verify appropriateness of suggested rule
  2530  Appropriate?
        Store suggested rule, and provide suggested response if appropriate
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 44 of 48    US 2006/0004680 A1

Fig. 26 (Self-Customizing Context routine 2600):
  2605  Receive current context information
        [Present] information from explicit context model that reflects received context
  2610  Receive indication of modeled context attribute
        Receive indication of user selection of new modeled context attribute information or of indication of appropriate modeled context attribute information
        Add all received information (and changes in current context from previous context) to appropriate context awareness customizing implicit model and inference engine
  2625  Detect pattern in indicated user selection or indication, and generate corresponding inferred rule to change modeled context attribute information in appropriate customized manner
  2630  Verify appropriateness of generated rule
  2635  Appropriate?
  2640  Store generated rule for future use
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 45 of 48    US 2006/0004680 A1

Fig. 27 (routine 2700; title partially unrecoverable from the OCR):
  2705  Receive current context information
        Receive indication of user selection of information or user indication of appropriate information
  2715  Add current context information (including changes from previous context) and indicated user selection or indication to appropriate content implicit model and inference engine
  2720  Detect pattern in indicated user selections of information and generate corresponding inferred rule to provide appropriate information
  2725  Verify appropriateness of generated rule
  2730  Appropriate?
        Store generated rule for future use
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 46 of 48    US 2006/0004680 A1

Fig. 28 (routine 2800):
  2805  Receive current context information
  2810  Receive indication of user changes to UI functionality or user indication of appropriate UI functionality
        Add received information (including changes from previous context) to appropriate UI optimizing implicit model and inference engine
        Detect pattern in user changes or indications, and generate corresponding inferred rule to provide appropriate UI functionality
        Verify appropriateness of generated rule
  2830  Appropriate?
  2835  Store generated rule for future use
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 47 of 48    US 2006/0004680 A1

Fig. 29 (Task Simplification routine 2900):
  2905  Receive current context information
        Receive indication of user task performed having multiple steps, or of user indication that task is appropriate
        Add received information (including changes from previous context) to appropriate task simplification implicit model and inference engine
        Detect pattern in task, and generate corresponding inferred rule to provide some or all of appropriate task
        Verify appropriateness of generated rule
  2930  Appropriate?
  2935  Store generated rule for future use
`
`
`
Patent Application Publication    Jan. 5, 2006  Sheet 48 of 48    US 2006/0004680 A1

Fig. 30 (Mentoring Routine 3000):
  3005  Receive current context information
  3010  Explicit mentoring rules?  If yes, retrieve explicit mentoring rules
  3035  Determine if match to current context, and if so prompt user as indicated
  3040  Receive indication from user of user action that should have been performed based on current context, other than any prompted actions
        After prompt, receive "Why?" question; generate and provide response based on explicit rules
        Add received information (including changes from previous context) to appropriate mentoring implicit model and inference engine
        Detect pattern in indicated user action and current context information, and generate inferred rule to provide indicated action in context for pattern
  3055  Verify appropriateness of generated rule
  3060  Appropriate?
        Modify explicit rules (or store inferred rule if no explicit rules) to reflect inferred rule
`
`
`
`US 2006/0004680 A1
`
`Jan. 5, 2006
`
`CONTEXTUAL RESPONSES BASED ON
`AUTOMATED LEARNING TECHNIQUES
`
`CROSS-REFERENCE TO RELATED
`APPLICATIONS
`
[0001] This application is a continuation-in-part of U.S. patent application Ser. No. 09/216,193 (Attorney Docket No. 29443-8001), filed Dec. 18, 1998 and entitled "Method and System for Controlling Presentation of Information to a User Based on the User's Condition"; of U.S. patent application Ser. No. 09/464,659 (Attorney Docket No. 29443-8003), filed Dec. 15, 1999 and entitled "Storing and Recalling Information to Augment Human Memories"; and of U.S. patent application Ser. No. 09/724,902 (Attorney Docket No. 29443-8002), filed Nov. 28, 2000 and entitled "Dynamically Exchanging Computer User's Context," which claims the benefit of provisional U.S. Patent Application No. 60/194,006, filed Apr. 2, 2000. Each of these applications is hereby incorporated by reference in its entirety.
`
[0002] This application also claims the benefit of provisional U.S. Patent Application No. 60/193,999 (Attorney Docket # 29443-8008), filed Apr. 2, 2000 and entitled "Obtaining And Using Contextual Data For Selected Tasks Or Scenarios, Such As For A Wearable Personal Computer," and of provisional U.S. Patent Application No. 60/194,123 (Attorney Docket # 29443-8024), filed Apr. 2, 2000 and entitled "Supplying And Consuming User Context Data," both of which are hereby incorporated by reference in their entirety.
`
`TECHNICAL FIELD
`
[0003] The invention described below relates generally to using various information to allow a system to automatically enhance its responses to changing contextual information, such as information about a user and the user's surroundings.
`
`BACKGROUND
`
[0004] Existing computer systems provide little appreciation of a user's overall condition or context, and as a result they can effectively respond to only a limited number of changes in parameters that they monitor. For example, with respect to the low-level physical status of the user, numerous devices exist for monitoring the physical parameters of the user, such as heart rate monitors that provide user pulse or heart rate data. While many of these devices simply provide information to the user regarding current values of a user's health condition, others (e.g., a defibrillator or a system with an alarm) are capable of providing a corresponding response if a monitored parameter exceeds (or falls below) a threshold value. However, since such devices lack important information about the specific context of the user (e.g., whether the user is currently exercising or is currently sick), any response will attempt to accommodate a wide range of user contexts and is thus unlikely to be optimal for the specific context of the user. For example, a defibrillator may provide too great or too small of a resuscitating charge simply because only one or a small number of parameters of a person are being monitored.
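The point of paragraph [0004] — a fixed threshold misbehaves once context changes — can be made concrete with a tiny sketch. The threshold values and the single `exercising` flag below are invented for illustration only; they are not figures from the patent.

```python
def heart_rate_alarm(pulse_bpm, exercising):
    """Illustrative sketch of [0004]: a context-blind monitor would use one
    fixed threshold for all situations; a context-aware one shifts the
    threshold when the user is known to be exercising.

    Threshold values (110 resting, 160 exercising) are invented examples."""
    threshold = 160 if exercising else 110
    return pulse_bpm > threshold

print(heart_rate_alarm(130, exercising=False))  # -> True  (alarming at rest)
print(heart_rate_alarm(130, exercising=True))   # -> False (normal while jogging)
```

The same 130 bpm reading triggers the alarm in one context and not the other, which is exactly the distinction a single-parameter monitor cannot make.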
`
[0005] In a similar manner, existing computer systems have little appreciation for a user's current mental and emotional state, or for higher-level abstractions of a user's physical activity (e.g., going jogging or driving an automobile), and as a result are generally ineffective at anticipating tasks that a us