`Newell et al.
`
US006466232B1

(10) Patent No.:     US 6,466,232 B1
(45) Date of Patent: Oct. 15, 2002
`
`(54) METHOD AND SYSTEM FOR
`CONTROLLING PRESENTATION OF
`INFORMATION TO A USER BASED ON THE
`USER'S CONDITION
`
`(75)
`
`Inventors: Dan Newell, Seattle; Kenneth H.
`Abbott, III, Kirkland, both of WA (US)
`
`(73) Assignee: Tangis Corporation, Seattle, WA (US)
`
( * ) Notice: Subject to any disclaimer, the term of this
      patent is extended or adjusted under 35
      U.S.C. 154(b) by 0 days.
`
`(21) Appl. No.: 09/216,193
`
`(22)
`
`Filed:
`
`Dec. 18, 1998
`
`(51)
`
`(52)
`
`(58)
`
`(56)
`
`Int. Cl.7 .............................. G09G 5/00; G06F 3/00
`
`U.S. Cl. ........................ 345/700; 345/3.1; 345/751;
`345/8; 345/866; 709/139
`
`Field of Search ............................ 345/8, 169, 158,
`345/2.1, 3.1, 700, 751, 762; 708/139; 704/270;
`710/8; 705/27
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
4,916,441 A     4/1990  Gombrich
5,032,083 A     7/1991  Friedman
5,201,034 A     4/1993  Matsuura et al.
5,208,449 A     5/1993  Eastman et al.
5,214,757 A     5/1993  Mauney et al.
5,227,614 A     7/1993  Danielson et al.
5,335,276 A     8/1994  Thompson et al.
5,416,730 A     5/1995  Lookofsky
5,470,233 A    11/1995  Fruchterman et al.
`
`(List continued on next page.)
`
`FOREIGN PATENT DOCUMENTS
`
EP     0 823 813 A2     2/1998
JP        05260188     10/1993
JP        09091112      4/1997
WO     WO 90/08361      7/1990
WO     WO 97/03434      1/1997
`
`OTHER PUBLICATIONS
`
Smailagic et al., "Matching interface design with user task: Modalities
of Interaction with CMU Wearable Computers," IEEE Personal
Communications, Feb. 1996.*
Steve Mann, "'Smart Clothing': Wearable Multimedia Computing and
'Personal Imaging' to Restore the Technological Balance Between People
and Their Environment," ACM Multimedia, Nov. 1996.*
Finger et al., "Rapid Design and Manufacture of Wearable Computers,"
Communications of the ACM, vol. 39, No. 2, Feb. 1996.*
Rekimoto et al., "The World through the Computer: Computer Augmented
Interaction with Real World Environments," ACM, Nov. 1995.*
`
`(List continued on next page.)
Primary Examiner-Raymond J. Bayerl
Assistant Examiner-Tadesse Hailu
(74) Attorney, Agent, or Firm-Perkins Coie LLP

(57) ABSTRACT
`
A system for controlling presentation of information to a
user based on the user's current condition. In particular, the
system monitors the user and the user's environment, and
creates and maintains an updated model of the current
condition of the user. The user condition can include a
variety of condition variables, including abstract concepts
such as the user's current cognitive load, desired level of
privacy for output information, and desired scope of audience
for output information. Upon receiving output information
to be presented to the user (e.g., from an application
program), the system determines an appropriate output
device and an appropriate format with which to present the
information to the user, and then presents the output
information. The system can also receive description information
about the output information that describes relevant factors
for determining when and how to present the output
information (e.g., the importance and urgency of the output
information, the consequences of the user not receiving or
ignoring the output information, etc.). Some versions of the
system execute on a wearable computer having a variety of
available output display devices.
`
`88 Claims, 9 Drawing Sheets
`
[Front-page drawing (FIG. 1): a user with a body-mounted computer,
various user input devices 152, various user sensor devices 156, a
handheld flat panel display 130, a non-portable computer 150, and a
telephone 168.]
`
`IPR2018-00294
`Apple Inc. EX1005 Page 1
`
`
`
US 6,466,232 B1
Page 2
`
`U.S. PATENT DOCUMENTS
`
5,493,692 A     2/1996  Theimer et al.
5,555,376 A     9/1996  Theimer et al.
5,559,520 A     9/1996  Barzegar et al.
5,568,645 A    10/1996  Morris et al.
5,601,435 A *   2/1997  Quy ....................... 434/307 R
5,611,050 A     3/1997  Theimer et al.
5,642,303 A     6/1997  Small et al.
5,646,629 A     7/1997  Loomis et al.
5,719,744 A     2/1998  Jenkins et al.
5,726,660 A     3/1998  Purdy et al.
5,751,260 A *   5/1998  Nappi et al. .................. 345/8
5,781,913 A     7/1998  Felsenstein et al.
5,790,974 A     8/1998  Tognazzini
5,798,733 A     8/1998  Ethridge
5,812,865 A     9/1998  Theimer
5,873,070 A     2/1999  Bunte et al.
5,878,274 A     3/1999  Kono et al.
5,902,347 A     5/1999  Backman et al.
5,910,799 A     6/1999  Carpenter et al.
5,938,721 A     8/1999  Dussell et al.
5,948,041 A     9/1999  Abo et al.
5,959,611 A     9/1999  Smailagic et al.
5,991,687 A    11/1999  Hale et al.
6,014,638 A *   1/2000  Burge et al. ................. 705/27
6,047,301 A     4/2000  Bjorklund et al.
6,064,943 A     5/2000  Clark, Jr. et al.
6,108,197 A     8/2000  Janik
6,127,990 A *  10/2000  Zwern ......................... 345/8
`
OTHER PUBLICATIONS
`
Kortuem et al., "Context-Aware, Adaptive Wearable Computers as Remote
Interfaces to 'Intelligent' Environments," University of Oregon, Oct.
1998.*
Starner et al., "Visual Contextual Awareness in Wearable Computing,"
Media Lab, MIT, Oct. 1998.*
Bauer et al., "A Collaborative Wearable System with Remote Sensing,"
University of Oregon, Feb. 1996.*
Rhodes, Bradley J., "The Wearable Remembrance Agent: A System for
Augmented Memory," Proceedings of the First International Symposium on
Wearable Computers (ISWC '97), Cambridge, MA, Oct. 13-14, 1997.
Yezdi Lashkari et al., "Collaborative Interface Agents," Proceedings of
AAAI '94 Conference, Seattle, Washington, Aug. 1994.
Maes, Pattie, "Agents That Reduce Work and Information Overload,"
Communications of the ACM, vol. 37, No. 7, Jul. 1994.
Lunt, Teresa F. et al., "Knowledge-Based Intrusion Detection,"
Proceedings of the Annual Artificial Intelligence Systems in Government
Conference, IEEE Comp. Soc. Press, vol. Conf. 4, 1989, pp. 102-107.
Sato, J. et al., "Autonomous Behavior Control of Virtual Actors Based
on the AIR Model," Proceedings Computer Animation, Jun. 5, 1997.
Billinghurst, Mark and Thad Starner, "New Ways to Manage Information,"
IEEE, pp. 57-64, Jan. 1999.
Hull et al., "Towards Situated Computing," Hewlett-Packard
Laboratories, HPL-97-66 (1997).
Kirsch, Dana, "The Sentic Mouse: A tool for measuring emotional
valence," http://www.media.mit.edu/affect/AC_research/projects/
sentic_mouse.html, pp. 1-2 [Accessed Oct. 2, 1998].
Metz, Cade, "MIT: Wearable PCs, Electronic Ink, and Smart Rooms," PC
Magazine, pp. 192-193, Jun. 1998.
Oakes, Chris, "The Truman Show Realized?," http://www.wired.com/news/
technology/story/15745.html, pp. 1-4 [Accessed Oct. 21, 1998].
Picard, R.W. and Healey, J., "Affective Wearables," Personal
Technologies vol. 1:231-240, MIT Media Laboratory (1997).
Rhodes, Bradley, "WIMP Interface Considered Fatal," http://
rhodes.www.media.mit.edu/people/rhodes/Papers/no-wimp.html, pp. 1-3
[Accessed Oct. 2, 1998].
Tan, Hong Z. and Alex Pentland, "Tactual Displays for Wearable
Computing," IEEE, Massachusetts Institute of Technology Media
Laboratory, pp. 84-88, 1997.
"'Affective Understanding:' Modeling and Responding to User Affect,"
http://www.media.mit.edu/affect/AC_research/understanding.html, pp. 1-3
[Accessed Oct. 2, 1998].
"Alps GlidePoint," http://www.alps.com/p17.html, p. 1 [Accessed Oct. 2,
1998].
"GyroPoint Technology," http://www.gyration.com/html/gyropoint.html,
pp. 1-3 [Accessed Oct. 2, 1998].
"Haptics," http://www.ai.mit.edu/projects/handarm-haptics/haptics.html,
pp. 1-2 [Accessed Oct. 2, 1998].
"Research Areas in Affective Computing," http://www.media.mit.edu/
affect/, p. 1 [Accessed Oct. 2, 1998].
"Research on Affective Pattern Recognition and Modeling," http://
www.media.mit.edu/affect/AC_research/recognizing.html, pp. 1-4
[Accessed Oct. 2, 1998].
"Research on Sensing Human Affect," http://www.media.mit.edu/affect/
AC_research/sensing.html, pp. 1-5 [Accessed Oct. 2, 1998].
"Smart Rooms," http://vismod.www.media.mit.edu/vismod/demos/smartroom/,
pp. 1-3 [Accessed Oct. 2, 1998].
"SmartDesk Home Page," http://vismod.www.media.mit.edu/vismod/demos/
smartdesk/, pp. 1-4 [Accessed Oct. 2, 1998].
"The MIT Wearable Computing Web Page," http://
wearables.www.media.mit.edu/projects/wearables/, pp. 1-3 [Accessed Oct.
2, 1998].
"Wearable Computer Systems for Affective Computing," http://
www.media.mit.edu/affect/AC_research/wearables.html, pp. 1-5 [Accessed
Oct. 2, 1998].
`
`* cited by examiner
`
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 1 of 9          US 6,466,232 B1

Fig. 1

[FIG. 1: A user wearing the body-mounted computer running the CDOS
system 100. Components shown include a speaker 132 with earpiece, an
eyeglass-mounted display 134, a handheld flat panel display 130, a
tactile display 136, various user input devices 152, various user
sensor devices 156, various environment sensor devices 128 and 158,
an input/output boundary around the body-mounted computer, a
non-portable computer 150, and a telephone 168.]
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 2 of 9          US 6,466,232 B1

Fig. 2

[FIG. 2: Block diagram of the body-mounted computer 120, containing a
CPU 280, a storage device 290, and memory 270 holding the
Condition-Dependent Output Supplier (CDOS) 100. Inputs include the flat
panel display 130, user input devices 122 and 152, a microphone 124,
user sensor devices 156, environment sensor devices 158, date/time,
user input, and stored information. Within the CDOS, a User
Characterization Module 205 builds the Model of User Condition 210 from
sensed user information and sensed environment information; an Output
Device Selector Module receives application-supplied output information
and its characterization from an application program 260, and routes it
through Format Modules 220-228 to the output devices: handheld flat
panel display 130, earpiece speaker 132, eyeglass-mounted display 134,
tactile display 136, display 160, speaker 162, olfactory device 164,
printer 166, and telephone 168.]
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 3 of 9          US 6,466,232 B1

Fig. 3

Model of User Condition 210

User: X                      Time: 14:22    Date: 10/15/XX

Latitude                     37°55.3' N
Longitude                    95°24.7' W
Altitude                     102' (+/- 10%)
Heart Rate                   57 beats/minute
Blood Pressure               125/80
Last User Input              Voice Command "Stop Recording"
Ambient Temperature          67°F
Ambient Noise                20 dB
Location                     Office
Speed                        2 MPH
Nearby Objects               Desk
Nearby People                Physical: None. Audio: "Dal!& Smith"
User Activity                Talking on Cell Phone, Walking (Highly Likely)
Cognitive Load               77
Level of Privacy             Company; Executive
Scope of Audience            Self
Application X-Factor 1       Normal: Mean=23, Std Dev=3
User Format Preference       Visual > Auditory
User Device Preference       Eyeglass Mounted Display
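The model above is, in essence, a timestamped set of condition variables. A minimal Python sketch of such a model (the variable names and helper functions are illustrative assumptions, not part of the patent's disclosure):

```python
# Minimal sketch of a "model of user condition" in the spirit of Fig. 3.
# Variable names and the update helper are assumptions for illustration.
from datetime import datetime

def make_user_condition():
    """Return a fresh condition model holding a few of Fig. 3's variables."""
    return {
        "timestamp": datetime(2002, 10, 15, 14, 22),
        "latitude": "37°55.3' N",
        "longitude": "95°24.7' W",
        "heart_rate_bpm": 57,
        "location": "Office",
        "user_activity": ["Talking on Cell Phone", "Walking"],
        "cognitive_load": 77,                 # 0-100 scale, per Fig. 3
        "level_of_privacy": ["Company", "Executive"],
        "scope_of_audience": "Self",
    }

def update_condition(model, **changes):
    """Record newly sensed values, stamping the time of the update."""
    model.update(changes)
    model["timestamp"] = datetime.now()
    return model

model = make_user_condition()
update_condition(model, cognitive_load=55, user_activity=["Talking on Cell Phone"])
```

A real implementation would also track per-variable uncertainty and timestamps, as the "+/- 10%" and "Highly Likely" annotations in the figure suggest.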
`
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 4 of 9          US 6,466,232 B1

Fig. 4

User Characterization Module 205

User: X
IF <Latitude> ≈ "37°55.2'N" AND <Longitude> ≈ "95°24.7'W" THEN <Location> = "Office"
IF <Infrared.Link.To.Desktop> = True THEN <Nearby Objects> Includes "Desk"
IF <Voice.Recognition.ID> <> "X" AND <Speakerphone.Status> = "Disabled"
    THEN <Nearby People> Includes ValueOf <Voice.Recognition.ID>
IF <Desktop.Motion.Sensor.Human.Movement> = True AND <User Activity>
    Includes "Seated" THEN <Nearby People.Physical> Includes "Unidentified Person"
IF <User Activity> = "Walking" THEN <Cognitive Load> = 20
IF <User Activity> = "Talking*" THEN <Cognitive Load> = 55
IF <User Activity> Includes "Walking" AND <User Activity> Includes
    "Talking On Cell Phone" THEN <Cognitive Load> = 77
WHILE <Output.To.User> = True THEN <Cognitive Load> = +10
WHILE <User.Mood> Includes "Angry" THEN <Cognitive Load> = +20%
IF <Nearby People.*> Includes Only [Company Executives] THEN
    <Level Of Privacy> Includes "Executive"
IF <Nearby People.*> Includes Only [Company Employees] THEN
    <Level of Privacy> Includes "Company"
IF <Nearby People.Physical> = "None" THEN <Scope of Audience> = "Self"
IF <Output.Intrusive.To.Others> = "Likely" THEN <Scope of Audience> = "Self"
AppX: IF <Application X-Factor 1.Mean> > 25 THEN
    <Application X Output> = "Undesired" WITH Likelihood "Likely"
IF (<Current.Time> - <Time.Of.Last.User.Input>) > 5 minutes THEN
    <Interacting.With.Computer> = False WITH Likelihood "Somewhat Likely"
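The rules in Fig. 4 amount to forward chaining over condition variables: when a rule fires, the variable it sets may in turn trigger other rules. A hedged sketch of that evaluation loop (the rule encoding is invented for illustration; the patent does not specify a representation):

```python
# Illustrative forward-chaining evaluation of a few Fig. 4-style rules.
# The predicate/consequent encoding is an assumption, not the patent's design.

def derive(condition):
    """Apply the rules repeatedly until no condition variable changes."""
    rules = [
        # (predicate over the condition, variable to set, value)
        (lambda c: c.get("user_activity") == {"Walking"},
         "cognitive_load", 20),
        (lambda c: c.get("user_activity") == {"Talking On Cell Phone"},
         "cognitive_load", 55),
        (lambda c: {"Walking", "Talking On Cell Phone"} <= c.get("user_activity", set()),
         "cognitive_load", 77),
        (lambda c: c.get("nearby_people_physical") == "None",
         "scope_of_audience", "Self"),
    ]
    changed = True
    while changed:                 # propagate until a fixed point is reached
        changed = False
        for predicate, variable, value in rules:
            if predicate(condition) and condition.get(variable) != value:
                condition[variable] = value
                changed = True
    return condition

state = derive({"user_activity": {"Walking", "Talking On Cell Phone"},
                "nearby_people_physical": "None"})
```

The WHILE rules of the figure (incremental adjustments such as "+10" or "+20%") would need additional bookkeeping so an increment is not re-applied on every pass; that detail is omitted here.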
`
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 5 of 9          US 6,466,232 B1

Fig. 5

Output Device Selector Module 215

[FIG. 5 is a table, for user X, of the output devices and the
user-condition values each supports. Rows: handheld flat panel display
130, earpiece speaker 132, eyeglass-mounted display 134, tactile
display 136, display 160, speaker 162, olfactory device 164, printer
166, telephone 168, pager 502, cellular telephone 504, and car radio
506. Columns: whether the device is currently available, supported
senses (visual, audio, tactile, olfactory), supported cognitive load
range (e.g., "Very Low-Medium"), level of privacy (e.g., All, Business,
Sensitive, Highly Sensitive), scope of audience (e.g., Self, Self +
Few, Family, Many, Unlimited), degree of interruptibility, and degree
of intrusiveness on others (Very Low through High-Very High).]
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 6 of 9          US 6,466,232 B1

Fig. 6

User Characterization Routine:
605: Retrieve stored information for user, including characterization rules, and create default model of user condition.
Set timer.
615: Receive input or timer expires.
- If the input sets a preference: set specified user preference or user condition variable.
- If the input is user input intended for a program: forward user input to appropriate program.
- Otherwise: 645: Characterize User subroutine; store updated user condition; 655: update characterization rules if necessary.
If continuing, loop back to receive the next input; otherwise END.
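The routine above is an event loop: it blocks until input arrives or a timer expires, dispatches on the kind of input, and re-characterizes the user. A minimal sketch over a finite event stream (the event encoding and the `characterize` placeholder are assumptions for illustration):

```python
# Illustrative event loop in the spirit of Fig. 6. Event kinds and the
# characterize() stand-in are assumptions, not the patent's design.

def characterize(model, sensed):
    """Stand-in for the Characterize User subroutine of Fig. 7."""
    model.update(sensed or {})
    return model

def run_characterization(events, model=None):
    """Process a finite event stream; return the final user-condition model."""
    model = dict(model or {})                     # step 605: default model
    for kind, payload in events:                  # step 615: input or timer
        if kind == "set_preference":
            model.update(payload)                 # set preference or variable
        elif kind in ("sensor", "timer"):
            model = characterize(model, payload)  # step 645
        # other user input would be forwarded to the appropriate program
    return model

final = run_characterization([
    ("set_preference", {"format_preference": "visual"}),
    ("sensor", {"heart_rate_bpm": 57}),
    ("timer", None),
])
```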
`
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 7 of 9          US 6,466,232 B1

Fig. 7

Characterize User Subroutine 645:
705: Retrieve current model of user condition.
710: Retrieve current date and time.
- If the input is a new characterization rule: 740: add new characterization rule, determine if current condition variable values trigger the rule, and if so propagate changes through rules.
- If a timer expired: 720: examine condition variables that represent time-sensitive or historical data to determine if they should be updated; 725: determine if current date and time trigger any rules, and if so propagate changes through rules.
- Otherwise: 745: determine if current input or current date and time trigger any rules, and if so propagate changes through rules.
750: Store any changes in condition variables and their values, including date and time, in updated model of user condition.
RETURN.
`
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 8 of 9          US 6,466,232 B1

Fig. 8

Output Device Selector Routine 800:
805: Receive output information to be presented to the user, or an indication that a timer has expired.
- If a timer expired: 815: retrieve the deferred output information and any description information.
- Otherwise: 820: optionally receive a description of the output information.
825: Retrieve current model of user condition.
830: Determine whether to currently present the output information to the user based on cognitive load, desired and available levels of privacy and scope of audience, the importance and deferability of the output information, and the consequences of ignoring the output information.
- If presentation is deferred: 840: set a timer for a later date and store the output information and description information.
- Otherwise: 845: select the available output device whose characteristics best match the above factors and user preferences; 850: format and present the output information; 855: notify the User Characterization Module of the presentation of the output information.
If more output information remains, loop back to receive it; otherwise END.
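The core of the routine is the present-or-defer decision (step 830) followed by device matching (step 845). A simplified sketch, with invented device attributes and thresholds (the patent does not prescribe a particular scoring scheme):

```python
# Sketch of the present-or-defer decision of Fig. 8 (steps 830-845).
# Device attributes and the importance threshold are illustrative assumptions.

OUTPUT_DEVICES = [
    {"name": "earpiece speaker 132", "privacy_supported": {"Self"},
     "max_cognitive_load": 90},
    {"name": "flat panel display 130", "privacy_supported": {"Self", "Company"},
     "max_cognitive_load": 60},
]

def select_or_defer(devices, condition, importance):
    """Return the chosen device, or None to defer (store and set a timer)."""
    # Step 830: defer low-importance output when the user is heavily loaded.
    if condition["cognitive_load"] > 70 and importance < 5:
        return None
    # Step 845: keep devices whose capabilities match the modeled condition.
    candidates = [d for d in devices
                  if condition["level_of_privacy"] in d["privacy_supported"]
                  and d["max_cognitive_load"] >= condition["cognitive_load"]]
    if not candidates:
        return None
    # Honor an explicit user device preference when a candidate matches it.
    preferred = [d for d in candidates
                 if d["name"] == condition.get("device_preference")]
    return (preferred or candidates)[0]

choice = select_or_defer(
    OUTPUT_DEVICES,
    {"cognitive_load": 77, "level_of_privacy": "Self",
     "device_preference": "earpiece speaker 132"},
    importance=8)
```

Returning `None` here stands in for step 840: the caller would store the output information and set a timer to retry later.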
`
`
`
`
U.S. Patent          Oct. 15, 2002          Sheet 9 of 9          US 6,466,232 B1

Fig. 9

Format And Present Output Information Subroutine:
905: Receive output information, information description and relevant user preference information.
910: Select a user sense supported by the output device.
915: Select formatting for the output information to correspond to the user condition and the output information description.
920: Present the output information to the user with the selected formatting.
RETURN.
`
`
`
`
METHOD AND SYSTEM FOR
CONTROLLING PRESENTATION OF
INFORMATION TO A USER BASED ON THE
USER'S CONDITION

TECHNICAL FIELD

The present invention relates generally to computer program user
interfaces, and more particularly to presenting information to a user
based on the user's current condition.

BACKGROUND OF THE INVENTION

As computers become increasingly powerful and ubiquitous, users
increasingly use their computers for a broad variety of tasks. For
example, in addition to traditional activities such as running word
processing and database applications, users increasingly use computers
as an integral part of their daily lives. Programs to schedule
activities, generate reminders, and provide rapid communication
capabilities are becoming increasingly popular. Moreover, computers are
increasingly present during virtually all of a person's daily
activities. For example, hand-held computer organizers (e.g., PDAs) are
increasingly common, and communication devices such as portable phones
are increasingly incorporating computer capabilities. Thus, users may
be presented with output information from one or more computers at any
time.

While advances in hardware make computers increasingly ubiquitous,
traditional computer programs are not typically designed to efficiently
present information to users in a wide variety of environments. For
example, most computer programs are designed with a prototypical user
being seated at a stationary computer with a large display device, and
with the user devoting full attention to the display. In that
environment, the computer can safely present information to the user at
any time, with minimal risk that the user will fail to perceive the
information or that the information will disturb the user in a
dangerous manner (e.g., by startling the user while they are using
power machinery or by blocking their vision while they are moving with
information sent to a head-mounted display). However, in many other
environments these assumptions about the prototypical user are not
true, and users thus may not perceive output information (e.g., failing
to notice an icon or message on a hand-held display device when it is
holstered, or failing to hear audio information when in a noisy
environment or when intensely concentrating). Similarly, some user
activities may have a low degree of interruptibility (i.e., ability to
safely interrupt the user) such that the user would prefer that the
presentation of low-importance or of all information be deferred, or
that information be presented in a non-intrusive manner.

In addition to assuming that a user is devoting full attention to the
display, current computer programs typically assume that only the user
is devoting attention to the computer system. Thus, current computer
programs are not concerned with factors related to the user's
environment, such as whether other people around the user are disturbed
by information being presented or whether sensitive information is
inadvertently made available to others. Instead, current computer
programs typically assume that the user will request output information
only at appropriate times and that the user will control whether others
are privy to output information (e.g., by orienting the display
accordingly or adjusting speaker volumes).

However, as computers are increasingly present with users and are
designed to present output information other than at a user's immediate
request (e.g., reminding the user of an upcoming appointment), computer
programs are increasingly likely to present information in a manner
that interrupts the user (and may be bothersome or dangerous), that may
not be perceived by the user even if highly important and urgent, that
may disturb others around the user, and that may inadvertently reveal
sensitive information to others.

A growing trend of using wearable computers will only exacerbate this
problem. Such wearable computers are designed to act as constant
companions and intelligent assistants to a user, thus being available
to receive input from the user at any time and to present output
information to the user at any time. Wearable computers are typically
strapped to the user's body or mounted in a holster, and may include a
variety of both input and output devices. The close association of
wearable computers to their users results in the wearable computer
interacting with the user in virtually any social or business
situation, and thus the likelihood of inappropriate output behavior
increases.
SUMMARY OF THE INVENTION

Some embodiments of the present invention provide a method and system
for controlling presentation of information to a user based on the
user's current condition. In particular, the system monitors the user
and the user's environment, and creates and maintains an updated model
of the current condition of the user. The user condition can include a
variety of condition variables, including abstract concepts such as the
user's current cognitive load, desired level of privacy for output
information, and desired scope of audience for output information. The
user condition can also include condition variables indicating physical
characteristics (e.g., deafness) and physically observable
characteristics (e.g., movement or proximity to another object) of the
user. Upon receiving output information to be presented to the user
(e.g., from an application program), the system determines an
appropriate output device and an appropriate format with which to
present the information to the user, and then presents the output
information. In some embodiments, the system also receives description
information about the output information that describes relevant
factors for determining when and how to present the output information
(e.g., the importance and urgency of the output information, the
consequences of the user not receiving or ignoring the output
information, etc.). The system executes in some embodiments on a
wearable computer having a variety of available output display devices.

In one embodiment, the system presents output information to a user by
first receiving information about a modeled characteristic of the user,
which may include a modeled preference of the user for receiving
sensitive information, a modeled indication of a current degree of
interruptibility of the user, or a modeled preference of the user for
an amount of people to perceive information presented by the computer.
The system then selects an output device capable of presenting the
output information in accordance with the modeled characteristic, and
presents the output information on the selected output device in
accordance with the modeled characteristic.

In an alternate embodiment, the system presents information to a user
on one of multiple available output devices. The system monitors the
user to collect information about a current state of the user, and then
models a current user condition based on the collected information by
determining a current level of privacy desired by the user that indicates
a group of people allowed to perceive information presented by the
computer, by determining a current scope of audience desired by the
user that indicates how many people are intended to perceive
information presented by the computer, and by determining a current
cognitive load of the user that indicates ability of the user to devote
attention to the computer. The system then receives output information
to be presented to the user, and presents the output information in a
manner consistent with the modeled current user condition by selecting
one of the output devices such that information presentation
capabilities of the selected output device support the determined
current desired level of privacy, the determined current desired scope
of audience, and the determined current cognitive load, and by
presenting the output information to the user on the selected output
device, so that the presentation of information by the system satisfies
the modeled current user condition.
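The selection criterion in this embodiment can be read as a capability filter: a device is eligible only if it supports the determined level of privacy, scope of audience, and cognitive load. A hedged sketch (the device table and its field names are assumptions made for this illustration):

```python
# Illustrative capability filter for the alternate embodiment described
# above. The device records and their fields are invented for this sketch.

DEVICES = [
    {"name": "eyeglass-mounted display 134",
     "privacy": {"Self"}, "audience": {"Self"}, "load_range": (0, 60)},
    {"name": "earpiece speaker 132",
     "privacy": {"Self"}, "audience": {"Self"}, "load_range": (0, 90)},
    {"name": "speaker 162",
     "privacy": {"Self", "Company"}, "audience": {"Many"}, "load_range": (0, 50)},
]

def eligible_devices(privacy, audience, cognitive_load, devices=DEVICES):
    """Return names of devices whose capabilities support the modeled condition."""
    return [d["name"] for d in devices
            if privacy in d["privacy"]
            and audience in d["audience"]
            and d["load_range"][0] <= cognitive_load <= d["load_range"][1]]

# With a high cognitive load, only the earpiece remains eligible.
choices = eligible_devices(privacy="Self", audience="Self", cognitive_load=77)
```

When several devices survive the filter, the system would fall back to user preferences (as in Fig. 3's "User Device Preference") to break the tie.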
`
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a user wearing a body-mounted computer executing an
embodiment of the Condition-Dependent Output Supplier (CDOS) system of
the present invention.

FIG. 2 is a block diagram illustrating the contents and information
flow of an embodiment of the CDOS system.

FIG. 3 is an illustrative example of a model of a current user
condition.

FIG. 4 is an illustrative example of a User Characterization Module.

FIG. 5 is an illustrative example of an Output Device Selector Module.

FIG. 6 is an exemplary flow diagram of an embodiment of the User
Characterization routine.

FIG. 7 is an exemplary flow diagram of an embodiment of the
Characterize User subroutine.

FIG. 8 is an exemplary flow diagram of an embodiment of the Output
Device Selector routine.

FIG. 9 is an exemplary flow diagram of an embodiment of the Format And
Present Output Information subroutine.

DETAILED DESCRIPTION OF THE INVENTION

An embodiment of the present invention provides a method and system for
controlling presentation of information to a user based on the user's
current condition. In particular, the Condition-Dependent Output
Supplier (CDOS) system monitors the user and the user's environment,
and creates and maintains an updated model of the current condition of
the user. Upon receiving output information to be presented to the user
(e.g., from an application program), the CDOS system determines an
appropriate output device and an appropriate format with which to
present the information to the user, and then presents the output
information. In some embodiments, the CDOS system also receives
description information about the output information that describes
relevant factors for determining when and how to present the output
information (e.g., the importance and urgency of the output
information, the consequences of the user not receiving or ignoring the
output information, etc.).

In one embodiment, the model of the user's current condition includes a
variety of condition variables that represent information about the
user and the user's environment ... and geographic information (e.g.,
location and speed), while higher levels of abstraction may attempt to
characterize or predict the user's physical activity (e.g., jogging or
talking on a phone), emotional state (e.g., angry or puzzled), desired
output behavior for different types of information (e.g., to present
private family information so that it is perceivable only to myself and
my family members), and cognitive load (i.e., the amount of attention
required for the user's current activities). Background information
which changes rarely or not at all can also be included, such as the
user's age, gender and visual acuity. The model can similarly hold
environment information at a low level of abstraction, such as air
temperature or raw data from a motion sensor, or at higher levels of
abstraction, such as the number and identities of nearby people,
objects, and locations. The model of the user's condition can
additionally include information added explicitly from other sources
(e.g., application programs), as well as user-specified or
system-learned defaults and preference information. An illustrative
example of a model of a user condition is described in greater detail
with respect to FIG. 3.

The CDOS system includes a User Characterization Module, an Output
Device Selector Module, and a Format Module associated with each
available output device. The User Characterization Module monitors the
user and the user's environment in order to create a current model of
the user's condition. After the User Characterization Module has
created a model of the user's current condition, the Output Device
Selector Module and the one or more Format Modules can then use the
model to determine when and how to present output information to the
user.

The User Characterization Module can receive a variety of types of
information, and can use this information to determine the user's
current condition in a variety of ways. For example, the User
Characterization Module can receive user input supplied by the user to
the computer system, information about the user sensed from a variety
of sensor devices, information about the environment surrounding the
user received from a variety of sensor devices, indications from the
CDOS system about output information currently being presented to the
user, stored background information about the user or about the world,
and various types of information from external entities such as
application programs.

User input information alone can provide significant information to the
CDOS system about the user's current condition. For example, if the
user is currently supplying input to the computer via a full-sized
keyboard, it is likely that the user is engaged in little other
physical activity (e.g., walking), that the user is devoting a
significant amount of attention to the computer system, and that the
user would see information flashed on the display. If the user is
instead generating user input audibly (e.g., through a head-mounted
microphone), that fact may provide less user condition information to
the CDOS system since the user can supply such audio information while
engaged in a variety of types of physical activity. Those skilled in
the art will appreciate that there are a wide variety of input devices
with which a user can supply information to the computer system,
including voice recognition devices, traditional qwerty keyboards,
chording keyboards, half qwerty keyboards, dual forearm keyboards,
chest mounted keyboards, handwriting recognition and digital ink
devices, a mouse, a track pad, a digital stylus, a finger or glove
device to capture user movement, pupil tracking devices, a gyropoint, a
trackball, a voice grid device, digital cameras (still and motion),
etc.

In addition to the information received via user input, the User
Characterization Module also uses sensed information