(19) United States

(12) Patent Application Publication    (10) Pub. No.: US 2002/0044152 A1
     Abbott, III et al.                (43) Pub. Date: Apr. 18, 2002

(54) DYNAMIC INTEGRATION OF COMPUTER GENERATED AND REAL WORLD IMAGES

(76) Inventors: Kenneth H. Abbott III, Kirkland, WA (US); Dan Newell, Medina, WA (US); James O. Robarts, Redmond, WA (US)

    Correspondence Address:
    LEE & HAYES, PLLC
    421 W. RIVERSIDE AVE, STE 500
    SPOKANE, WA 99201 (US)

(21) Appl. No.: 09/879,827

(22) Filed: Jun. 11, 2001

        Related U.S. Application Data

(63) Non-provisional of provisional application No. 60/240,672, filed on Oct. 16, 2000. Non-provisional of provisional application No. 60/240,684, filed on Oct. 16, 2000.

        Publication Classification

(51) Int. Cl.7 ................................................. G09G 5/00
(52) U.S. Cl. ..................................................... 345/629

(57)        ABSTRACT

A system integrates virtual information with real world images presented on a display, such as a head-mounted display of a wearable computer. The system modifies how the virtual information is presented to alter whether the virtual information is more or less visible relative to the real world images. The modification may be made dynamically, such as in response to a change in the user's context, the user's eye focus on the display, or a user command. The virtual information may be modified in a number of ways, such as adjusting the transparency of the information, modifying the color of the virtual information, enclosing the information in borders, and changing the location of the virtual information on the display. Through these techniques, the system provides the information to the user in a way that minimizes distraction of the user's view of the real world images.

[Front-page representative figure: a transparent menu overlaid on the real world view 202.]

[FIG. 1, Sheet 1 of 5: wearable computer 100 worn by a user 102, with an eyeglass display, earpiece speaker, microphone 110, flat panel display 112, input device(s) 114, output device(s) 120, user sensor(s) 122, and environment sensor(s) 124, connected through data communications interface(s) to a central computing unit 130 containing a CPU, memory 142 holding application(s) 146 and the CDOS system 150, and a storage device 144.]

[Sheet 2 of 5: two views showing transparent menu 204 overlaid on the real world view 202.]

[FIG. 3, Sheet 3 of 5: watermark notification 300 shown with the transparent menu over the real world view.]

[FIG. 7, Sheet 4 of 5: virtual information enclosed within a marquee.]

[FIG. 8, Sheet 5 of 5: process flowchart. Generate virtual information (802); determine how to present the virtual information based on user context, importance, and relevancy (804); assign a degree of transparency and location, with options including notification, borders, color, and background (806); monitor user behavior, context, and conditions (808); change transparency and/or modify prominence (812).]

DYNAMIC INTEGRATION OF COMPUTER GENERATED AND REAL WORLD IMAGES

RELATED APPLICATIONS

[0001] A claim of priority is made to U.S. Provisional Application No. 60/240,672, filed Oct. 16, 2000, entitled "Method For Dynamic Integration Of Computer Generated And Real World Images", and to U.S. Provisional Application No. 60/240,684, filed Oct. 16, 2000, entitled "Methods for Visually Revealing Computer Controls".

TECHNICAL FIELD

[0002] The present invention is directed to controlling the appearance of information presented on displays, such as those used in conjunction with wearable personal computers. More particularly, the invention relates to transparent graphical user interfaces that present information transparently on real world images to minimize obstructing the user's view of the real world images.

BACKGROUND

[0003] As computers become increasingly powerful and ubiquitous, users increasingly employ their computers for a broad variety of tasks. For example, in addition to traditional activities such as running word processing and database applications, users increasingly rely on their computers as an integral part of their daily lives. Programs to schedule activities, generate reminders, and provide rapid communication capabilities are becoming increasingly popular. Moreover, computers are increasingly present during virtually all of a person's daily activities. For example, hand-held computer organizers (e.g., PDAs) are more common, and communication devices such as portable phones are increasingly incorporating computer capabilities. Thus, users may be presented with output information from one or more computers at any time.

[0004] While advances in hardware make computers increasingly ubiquitous, traditional computer programs are not typically designed to efficiently present information to users in a wide variety of environments. For example, most computer programs are designed with a prototypical user being seated at a stationary computer with a large display device, and with the user devoting full attention to the display. In that environment, the computer can safely present information to the user at any time, with minimal risk that the user will fail to perceive the information or that the information will disturb the user in a dangerous manner (e.g., by startling the user while they are using power machinery or by blocking their vision while they are moving with information sent to a head-mounted display). However, in many other environments these assumptions about the prototypical user are not true, and users thus may not perceive output information (e.g., failing to notice an icon or message on a hand-held display device when it is holstered, or failing to hear audio information when in a noisy environment or when intensely concentrating). Similarly, some user activities may have a low degree of interruptibility (i.e., ability to safely interrupt the user) such that the user would prefer that the presentation of low-importance or of all information be deferred, or that information be presented in a non-intrusive manner.

[0005] Consider an environment in which the user must be cognizant of the real world surroundings simultaneously with receiving information. Conventional computer systems have attempted to display information to users while also allowing the user to view the real world. However, such systems are unable to display this virtual information without obscuring the real-world view of the user. Virtual information can be displayed to the user, but doing so visually impedes much of the user's view of the real world.

[0006] Often the user cannot view the computer-generated information at the same time as the real-world information. Rather, the user is typically forced to switch between the real world and the virtual world by either mentally changing focus or by physically actuating some switching mechanism that alternates between displaying the real world and displaying the virtual world. To view the real world, the user must stop looking at the display of virtual information and concentrate on the real world. Conversely, to view the virtual information, the user must stop looking at the real world.

[0007] Switching display modes in this way can lead to awkward, or even dangerous, situations that leave the user in transition, and sometimes in the wrong mode when they need to deal with an important event. An example of this awkward behavior is found in the inadequate current technology of computer displays that are worn by users. Some computer hardware is equipped with an extra piece of hardware that flips down behind the visor display. This creates complete background opaqueness when the user needs to view more information, or needs to view it without the distraction of the real-world image.

[0008] Accordingly, there is a need for new techniques to display virtual information to a user in a manner that does not disrupt, or disrupts very little, the user's view of the real world.

SUMMARY

[0009] A system is provided to integrate computer-generated virtual information with real world images on a display, such as a head-mounted display of a wearable computer. The system presents the virtual information in a way that creates little interference with the user's view of the real world images. The system further modifies how the virtual information is presented to alter whether the virtual information is more or less visible relative to the real world images. The modification may be made dynamically, such as in response to a change in the user's context, the user's eye focus on the display, or a user command.

[0010] The virtual information may be modified in a number of ways. In one implementation, the virtual information is presented transparently on the display and overlays the real world images. The user can easily view the real world images through the transparent information. The system can then dynamically adjust the degree of transparency across a range from fully transparent to fully opaque, depending upon how noticeable the information should be.

[0011] In another implementation, the system modifies the color of the virtual information to selectively blend or contrast the virtual information with the real world images. Borders may also be drawn around the virtual information to set it apart. Another way to modify presentation is to dynamically move the virtual information on the display to make it more or less prominent for viewing by the user.

BRIEF DESCRIPTION OF THE DRAWINGS

[0012] FIG. 1 illustrates a wearable computer having a head mounted display and mechanisms for displaying virtual information on the display together with real world images.

[0013] FIG. 2 is a diagrammatic illustration of a view of real world images through the head mounted display. The illustration shows a transparent user interface (UI) that presents computer-generated information on the display over the real world images in a manner that minimally distracts the user's vision of the real world images.

[0014] FIG. 3 is similar to FIG. 2, but further illustrates a transparent watermark overlaid on the real world images.

[0015] FIG. 4 is similar to FIG. 2, but further illustrates context specific information depicted relative to the real world images.

[0016] FIG. 5 is similar to FIG. 2, but further illustrates a border about the information.

[0017] FIG. 6 is similar to FIG. 2, but further illustrates a way to modify prominence of the virtual information by changing its location on the display.

[0018] FIG. 7 is similar to FIG. 2, but further illustrates enclosing the information within a marquee.

[0019] FIG. 8 shows a process for integrating computer-generated information with real world images on a display.

DETAILED DESCRIPTION

[0020] Described below is a system and user interface that enables simultaneous display of virtual information and real world information with minimal distraction to the user. The user interface is described in the context of a head mounted visual display (e.g., an eyeglasses display) of a wearable computing system that allows a user to view the real world while overlaying additional virtual information. However, the user interface may be used for other displays and in contexts other than the wearable computing environment.

[0021] Exemplary System

[0022] FIG. 1 illustrates a body-mounted wearable computer 100 worn by a user 102. The computer 100 includes a variety of body-worn input devices, such as a microphone 110, a hand-held flat panel display 112 with character recognition capabilities, and various other user input devices 114. Examples of other types of input devices with which a user can supply information to the computer 100 include voice recognition devices, traditional qwerty keyboards, chording keyboards, half qwerty keyboards, dual forearm keyboards, chest mounted keyboards, handwriting recognition and digital ink devices, a mouse, a track pad, a digital stylus, a finger or glove device to capture user movement, pupil tracking devices, a gyropoint, a trackball, a voice grid device, digital cameras (still and motion), and so forth.

[0023] The computer 100 also has a variety of body-worn output devices, including the hand-held flat panel display 112, an earpiece speaker 116, and a head-mounted display in the form of an eyeglass-mounted display 118. The eyeglass-mounted display 118 is implemented as a display type that allows the user to view real world images from their surroundings while simultaneously overlaying or otherwise presenting computer-generated information to the user in an unobtrusive manner. The display may be constructed to permit direct viewing of real images (i.e., permitting the user to gaze directly through the display at the real world objects) or to show real world images captured from the surroundings by video devices, such as digital cameras. The display and techniques for integrating computer-generated information with the real world surroundings are described below in greater detail. Other output devices 120 may also be incorporated into the computer 100, such as a tactile display, an olfactory output device, tactile output devices, and the like.

[0024] The computer 100 may also be equipped with one or more various body-worn user sensor devices 122. For example, a variety of sensors can provide information about the current physiological state of the user and current user activities. Examples of such sensors include thermometers, sphygmometers, heart rate sensors, shiver response sensors, skin galvanometry sensors, eyelid blink sensors, pupil dilation detection sensors, EEG and EKG sensors, sensors to detect brow furrowing, blood sugar monitors, etc. In addition, sensors elsewhere in the near environment can provide information about the user, such as motion detector sensors (e.g., whether the user is present and is moving), badge readers, still and video cameras (including low light, infrared, and x-ray), remote microphones, etc. These sensors can be either passive (i.e., detecting information generated external to the sensor, such as a heart beat) or active (i.e., generating a signal to obtain information, such as sonar or x-rays).

[0025] The computer 100 may also be equipped with various environment sensor devices 124 that sense conditions of the environment surrounding the user. For example, devices such as microphones or motion sensors may be able to detect whether there are other people near the user and whether the user is interacting with those people. Sensors can also detect environmental conditions that may affect the user, such as air thermometers or Geiger counters. Sensors, either body-mounted or remote, can also provide information related to a wide variety of user and environment factors including location, orientation, speed, direction, distance, and proximity to other locations (e.g., GPS and differential GPS devices, orientation tracking devices, gyroscopes, altimeters, accelerometers, anemometers, pedometers, compasses, laser or optical range finders, depth gauges, sonar, etc.). Identity and informational sensors (e.g., bar code readers, biometric scanners, laser scanners, OCR, badge readers, etc.) and remote sensors (e.g., home or car alarm systems, remote camera, national weather service web page, a baby monitor, traffic sensors, etc.) can also provide relevant environment information.

[0026] The computer 100 further includes a central computing unit 130 that may or may not be worn by the user. The various inputs, outputs, and sensors are connected to the central computing unit 130 via one or more data communications interfaces 132 that may be implemented using wire-based technologies (e.g., wires, coax, fiber optic, etc.) or wireless technologies (e.g., RF, etc.).

[0027] The central computing unit 130 includes a central processing unit (CPU) 140, a memory 142, and a storage device 144. The memory 142 may be implemented using both volatile and non-volatile memory, such as RAM, ROM, Flash, EEPROM, disk, and so forth. The storage device 144 is typically implemented using non-volatile permanent memory, such as ROM, EEPROM, diskette, memory cards, and the like.

[0028] One or more application programs 146 are stored in memory 142 and executed by the CPU 140. The application programs 146 generate data that may be output to the user via one or more of the output devices 112, 116, 118, and 120. For discussion purposes, one particular application program is illustrated with a transparent user interface (UI) component 148 that is designed to present computer-generated information to the user via the eyeglass mounted display 118 in a manner that does not distract the user from viewing real world parameters. The transparent UI 148 organizes orientation and presentation of the data and provides the control parameters that direct the display 118 to place the data before the user in many different ways that account for such factors as the importance of the information, relevancy to what is being viewed in the real world, and so on.

[0029] In the illustrated implementation, a Condition-Dependent Output Supplier (CDOS) system 150 is also shown stored in memory 142. The CDOS system 150 monitors the user and the user's environment, and creates and maintains an updated model of the current condition of the user. As the user moves about in various environments, the CDOS system receives various input information including explicit user input, sensed user information, and sensed environment information. The CDOS system updates the current model of the user condition, and presents output information to the user via appropriate output devices.

[0030] Of particular relevance, the CDOS system 150 provides information that might affect how the transparent UI 148 presents the information to the user. For instance, suppose the application program 146 is generating geographically or spatially relevant information that should only be displayed when the user is looking in a specific direction. The CDOS system 150 may be used to generate data indicating where the user is looking. If the user is looking in the correct direction, the transparent UI 148 presents the data in conjunction with the real world view of that direction. If the user turns his/her head, the CDOS system 150 detects the movement and informs the application program 146, enabling the transparent UI 148 to remove the information from the display.

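As an illustration of this interaction, the sketch below shows one way a context model could notify a direction-sensitive overlay so that it appears only when the user faces the relevant direction. All names, the subscription mechanism, and the 30-degree tolerance are hypothetical, invented for the example; the patent does not specify an API.

```python
# Hypothetical sketch: a context model feeding head heading to a
# direction-sensitive overlay. Illustrative only, not from the patent.

class ContextModel:
    """Holds the latest sensed user condition (here, just head heading)."""
    def __init__(self):
        self.heading_degrees = 0.0
        self._listeners = []

    def subscribe(self, callback):
        self._listeners.append(callback)

    def update_heading(self, degrees):
        self.heading_degrees = degrees % 360
        for callback in self._listeners:
            callback(self.heading_degrees)


class DirectionalOverlay:
    """Virtual information shown only when facing a target direction."""
    def __init__(self, target_heading, tolerance=30.0):
        self.target_heading = target_heading
        self.tolerance = tolerance
        self.visible = False

    def on_heading_changed(self, heading):
        # Smallest angular difference between heading and target.
        delta = abs((heading - self.target_heading + 180) % 360 - 180)
        self.visible = delta <= self.tolerance


context = ContextModel()
overlay = DirectionalOverlay(target_heading=90.0)   # e.g., "east"
context.subscribe(overlay.on_heading_changed)

context.update_heading(85.0)    # user faces roughly east
assert overlay.visible
context.update_heading(200.0)   # user turns away
assert not overlay.visible
```
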
[0031] A more detailed explanation of the CDOS system 150 may be found in co-pending U.S. patent application Ser. No. 09/216,193, entitled "Method and System For Controlling Presentation of Information To a User Based On The User's Condition", which was filed Dec. 18, 1998, and is commonly assigned to Tangis Corporation. The reader might also be interested in U.S. patent application Ser. No. 09/724,902, entitled "Dynamically Exchanging Computer User's Context", which was filed Nov. 28, 2000, and is commonly assigned to Tangis Corporation. These applications are hereby incorporated by reference.

[0032] Although not illustrated, the body-mounted computer 100 may be connected to one or more networks of other devices through wired or wireless communication means (e.g., wireless RF, a cellular phone or modem, infrared, physical cable, a docking station, etc.). For example, the body-mounted computer of a user could make use of output devices in a smart room, such as a television and stereo when the user is at home, if the body-mounted computer can transmit information to those devices via a wireless medium or if a cabled or docking mechanism is available to transmit the information. Alternately, kiosks or other information devices can be installed at various locations (e.g., in airports or at tourist spots) to transmit relevant information to body-mounted computers within the range of the information device.

[0033] Transparent UI

[0034] FIG. 2 shows an exemplary view that the user of the wearable computer 100 might see when looking at the eyeglass mounted display 118. The display 118 depicts a graphical screen presentation 200 generated by the transparent UI 148 of the application program 146 executing on the wearable computer 100. The screen presentation 200 permits viewing of the real world surroundings 202, which are illustrated here as a mountain range.

[0035] The transparent screen presentation 200 presents information to the user in a manner that does not significantly impede the user's view of the real world 202. In this example, the virtual information consists of a menu 204 that lists various items of interest to the user. For the mountain scaling environment, the menu 204 includes context relevant information such as the present temperature, current elevation, and time. The menu 204 may further include navigation items that allow the user to navigate to various levels of information being monitored or stored by the computer 100. Here, the menu items include mapping, email, communication, body parameters, and geographical location. The menu 204 is placed along the side of the display to minimize any distraction from the user's vision of the real world.

[0036] The menu 204 is presented transparently, enabling the user to see the real world images 202 behind the menu. By making the menu transparent and locating it along the side of the display, the information is available for the user to see, but does not impair the user's view of the mountain range.

[0037] The transparent UI possesses many features that are directed toward the goal of displaying virtual information to the user without impeding too much of the user's view of the real world. Some of these features are explored below to provide a better understanding of the transparent UI.

[0038] Dynamically Changing Degree of Transparency

[0039] The transparent UI 148 is capable of dynamically changing the transparency of the virtual information. The application program 146 can change the degree of transparency of the menu 204 (or other virtual objects) by implementing a display range from completely opaque to completely transparent. This display range allows the user to view both real world and virtual-world information at the same time, with dynamic changes being performed for a variety of reasons.

[0040] One reason to change the transparency might be the level of importance ascribed to the information. As the information is deemed more important by the application program 146 or the user, the transparency is decreased to draw more attention to the information.

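As a concrete illustration of this importance-driven adjustment, the sketch below maps an importance score onto the opaque-to-transparent display range described above. The function name, the [0, 1] score range, and the linear mapping are assumptions made for the example, not details from the patent.

```python
# Illustrative sketch only: mapping an importance score to an alpha
# (opacity) value across a display range from fully transparent
# (alpha 0.0) to fully opaque (alpha 1.0).

def alpha_for_importance(importance, min_alpha=0.1, max_alpha=1.0):
    """Linearly map importance in [0, 1] to opacity in [min_alpha, max_alpha].

    min_alpha keeps low-importance items faintly visible rather than
    invisible; set it to 0.0 to allow full transparency.
    """
    importance = max(0.0, min(1.0, importance))
    return min_alpha + (max_alpha - min_alpha) * importance

print(alpha_for_importance(0.0))   # 0.1  -> barely visible
print(alpha_for_importance(0.5))   # 0.55 -> noticeable
print(alpha_for_importance(1.0))   # 1.0  -> fully opaque
```
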
[0041] Another reason to vary transparency might be context specific. Integrating the transparent UI into a system that models the user's context allows the transparent UI to vary the degree of transparency in response to a rich set of states from the user, their environment, or the computer and its peripheral devices. Using this model, the system can automatically determine what parts of the virtual information to display as more or less transparent and vary their respective transparencies accordingly.

[0042] For example, if the information becomes more important in a given context, the application program may decrease the transparency toward the opaque end of the display range to increase the noticeability of the information for the user. Conversely, if the information is less relevant for a given context, the application program may increase the transparency toward the fully transparent end of the display range to diminish the noticeability of the virtual information.

[0043] Another reason to change transparency levels may be due to a change in the user's attention on the real world. For instance, a mapping program may display directional graphics when the user is looking in one direction and fade those graphics out (i.e., make them more transparent) when the user moves his/her head to look in another direction.

[0044] Another reason might be the user's focus as detected, for example, by the user's eye movement or focal point. When the user is focused on the real world, the virtual object's transparency increases because the user is no longer focusing on the object. On the other hand, when the user returns their focus to the virtual information, the objects become visibly opaque.

[0045] The transparency may further be configured to change over time, allowing the virtual image to fade in and out depending on the circumstances. For example, an unused window can fade from view, becoming very transparent or perhaps eventually fully transparent, when the user maintains their focus elsewhere. The window may then fade back into view when the user's attention returns to it.

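The time-based fading described here can be sketched as a small per-frame update. The frame-loop model, class name, and fade durations below are illustrative assumptions only, not specified by the patent.

```python
# Sketch (assumed frame-loop model): fading an unused window out over
# time and fading it back in when the user's attention returns.

class FadingWindow:
    FADE_OUT_SECONDS = 3.0
    FADE_IN_SECONDS = 0.5

    def __init__(self):
        self.alpha = 1.0
        self.has_focus = True

    def tick(self, dt):
        """Advance the fade by dt seconds; call once per frame."""
        if self.has_focus:
            self.alpha = min(1.0, self.alpha + dt / self.FADE_IN_SECONDS)
        else:
            self.alpha = max(0.0, self.alpha - dt / self.FADE_OUT_SECONDS)


window = FadingWindow()
window.has_focus = False       # user's attention moves elsewhere
for _ in range(60):            # one second at 60 frames per second
    window.tick(1.0 / 60.0)
print(round(window.alpha, 2))  # ~0.67: partway through a 3 s fade-out
```
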
[0046] Increased transparency generally results in the user being able to see more of the real-world view. In such a configuration, comparatively important virtual objects (like those used for control, status, power, safety, etc.) are the last virtual objects to fade from view. In some configurations, the user may configure the system to never fade specified virtual objects. This type of configuration can be performed dynamically on specific objects or by making changes to a general system configuration.

[0047] The transparent UI can also be controlled by the user instead of the application program. For example, a visual target in the user interface can be used to adjust the transparency of the virtual objects being presented to the user. This target can be a control button or slider that is operated through any variety of input methods available to the user (e.g., voice, eye-tracking controls to control the target/control object, keyboard, etc.).

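A minimal sketch of such a user-driven control appears below, assuming a hypothetical slider-like target whose value, however the user sets it (voice, eye tracking, keyboard), is applied directly to the displayed objects' opacity. The class names and value convention are assumptions for illustration.

```python
# Hypothetical sketch: a slider-like visual target driving the
# transparency of presented virtual objects.

class VirtualObject:
    def __init__(self):
        self.alpha = 0.5          # 0.0 = fully transparent, 1.0 = opaque


class TransparencySlider:
    def __init__(self, objects):
        self.objects = objects    # anything with an .alpha field

    def set_value(self, value):
        """Apply a user-chosen opacity in [0, 1] to all bound objects."""
        value = max(0.0, min(1.0, value))
        for obj in self.objects:
            obj.alpha = value


menu = VirtualObject()
slider = TransparencySlider([menu])
slider.set_value(0.8)    # e.g., a spoken "more opaque" command
print(menu.alpha)        # 0.8
```
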
[0048] Watermark Notification

[0049] The transparent UI 148 may also be configured to present faintly visible notifications with high transparency to hint to the user that additional information is available for presentation. The notification is usually depicted in response to some event about which an application desires to notify the user. The faintly visible notification notifies the user without disrupting the user's concentration on the real world surroundings. The virtual image can be formed by manipulating the real world image, akin to watermarking the digital image in some manner.

[0050] FIG. 3 shows an example of a watermark notification 300 overlaid on the real world image 202. In this example, the watermark notification 300 is a graphical envelope icon that suggests to the user that new, unread electronic mail has been received. The envelope icon is illustrated in dashed lines around the edge of the full display to demonstrate that the icon is faintly visible (or highly transparent) to avoid obscuring the view of the mountain range. The user is thus able to see through the watermark due to its partial transparency, which helps the user stay focused on the current task.

[0051] The notification may come in many different shapes, positions, and sizes, including a new window, other icon shapes, or some other graphical presentation of information to the user. Like the envelope, the watermark notification can be suggestive of a particular task to orient the user to the task at hand (i.e., read mail).

[0052] Depending on a given situation, the application program 146 can adjust the transparency of the information to make it more or less visible. Such notifications can be used in a variety of situations, such as incoming information, or when more information related to the user's context or user's view (both virtual and real world) is available, or when a reminder is triggered, or anytime more information is available than can be viewed at one time, or for providing "help". Such watermarks can also be used for hinting to the user about advertisements that could be presented to the user.

[0053] The watermark notification also functions as an active control that may be selected by the user to control an underlying application. When the user looks at the watermark image, or in some other way selects the image, it becomes visibly opaque. The user's method for selecting the image includes any of the various ways a user of a wearable personal computer can perform selections of graphical objects (e.g., blinking, voice selection, etc.). The user can configure this behavior in the system before the commands are given to the system, or generate the system behaviors by commands, controls, or corrections to the system.

[0054] Once the user selects the image, the application program provides a suitable response. In the FIG. 3 example, user selection of the envelope icon 300 might cause the email program to display the newly received email message.

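One way to sketch this watermark-as-control behavior is shown below: the notification sits at high transparency until the user selects it, at which point it becomes opaque and invokes the underlying application. The class, alpha values, and callback are hypothetical stand-ins, since the patent does not prescribe an implementation.

```python
# Illustrative sketch: a watermark notification acting as an active
# control. Selection (by gaze, blink, voice, etc.) makes it opaque
# and triggers the underlying application.

class WatermarkNotification:
    FAINT_ALPHA = 0.15      # barely visible over the real world
    SELECTED_ALPHA = 1.0    # visibly opaque once selected

    def __init__(self, icon, on_activate):
        self.icon = icon
        self.alpha = self.FAINT_ALPHA
        self.on_activate = on_activate

    def select(self):
        """Called by whatever selection method the user has configured."""
        self.alpha = self.SELECTED_ALPHA
        self.on_activate()


def open_new_mail():
    print("displaying newly received email message")

envelope = WatermarkNotification(icon="envelope", on_activate=open_new_mail)
envelope.select()   # e.g., the user blinks at the watermark
```
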
[0055] Context Aware Presentation

[0056] The transparent UI may also be configured to present information in different degrees of transparency depending upon the user's context. When the wearable computer 100 is equipped with context aware components (e.g., eye movement sensors, blink detection sensors, head movement sensors, GPS systems, and the like), the application program 146 may be provided with context data that influences how the virtual information is presented to the user via the transparent UI.

[0057] FIG. 4 shows one example of presenting virtual information according to the user's context. In particular, this example illustrates a situation where the virtual information is presented to the user only when the user is facing a particular direction. Here, the user is looking toward the mountain range. Virtual information 400 in the form of a climbing aid is overlaid on the display. The climbing aid 400 highlights a desired trail to be taken by the user when scaling the mountain.

[0058] The trail 400 is visible (i.e., has a low degree of transparency) when the user faces in a direction such that the particular mountain is within the viewing area. As the user rotates their head slightly, while keeping the mountain within the viewing area, the trail remains indexed to the appropriate mountain, effectively moving across the screen at the rate of the head rotation.

[0059] If the user turns their head away from the mountain, the computer 100 will sense that the user is looking in another direction. This data will be input to the application program controlling the trail display, and the trail 400 will be removed from the display (or made completely transparent). In this manner, the climbing aid is more intuitive to the user, appearing only when the user is facing the relevant task.

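The indexing behavior described in the last two paragraphs can be sketched as a mapping from the mountain's compass bearing and the user's head heading to a screen position, hiding the trail once the bearing leaves the field of view. The pinhole-style linear mapping, field of view, and screen width below are assumptions for illustration, not details from the patent.

```python
# Sketch of indexing a virtual object to a real-world feature: the
# trail stays anchored to the mountain's bearing as the head rotates
# and is hidden once the bearing leaves the field of view.

def trail_screen_x(mountain_bearing, head_heading,
                   fov_degrees=40.0, screen_width=800):
    """Return the trail's horizontal pixel position, or None if the
    mountain is outside the current field of view."""
    # Signed angular offset of the mountain from the view center.
    offset = (mountain_bearing - head_heading + 180) % 360 - 180
    if abs(offset) > fov_degrees / 2:
        return None                      # facing away: hide the trail
    # Linear mapping of the angular offset onto the screen.
    return int((offset / fov_degrees + 0.5) * screen_width)

print(trail_screen_x(90.0, 90.0))    # 400: centered
print(trail_screen_x(90.0, 80.0))    # 600: head turned left, trail shifts right
print(trail_screen_x(90.0, 180.0))   # None: user turned away
```
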
[0060] This is just one example of modifying the display of virtual information in conjunction with real world surroundings based on the user's context. There are many other situations that may dictate when virtual information is presented or withdrawn depending upon the user's context.

[0061] Bordering

[0062] Another technique for displaying virtual information to the user without impeding too much of the user's view of the real world is to border the computer-generated information. Borders, or other forms of outlines, are drawn around objects to provide greater control of transparency and opaqueness.

[0063] FIG. 5 illustrates the transparent UI 200 where a border 500 is drawn around the menu 204. The border 500 draws a bit more attention to the menu 204 without noticeably distracting from the user's view of the real world 202. Graphical images can be created with special borders embedded in the artwork, such that the borders can be used to highlight the virtual object.

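A sketch of bordering with independently controlled opacity follows: the outline is drawn somewhat more opaque than the menu body, so it sets the menu apart without blocking the view behind it. The renderer interface is a toy stand-in invented for the example, not a real API.

```python
# Minimal sketch, assuming a toy renderer interface (fill_rect /
# stroke_rect are illustrative stand-ins): the border gets its own
# alpha, separate from the more transparent menu content.

class StubRenderer:
    """Stand-in that just records draw calls for demonstration."""
    def fill_rect(self, rect, color, alpha):
        print(f"fill {rect} color={color} alpha={alpha}")

    def stroke_rect(self, rect, color, alpha):
        print(f"stroke {rect} color={color} alpha={alpha}")


def draw_bordered_menu(renderer, rect, content_alpha=0.2, border_alpha=0.6):
    # Menu body stays highly transparent so the real world shows through.
    renderer.fill_rect(rect, color=(255, 255, 255), alpha=content_alpha)
    # The border is drawn more opaque, highlighting the virtual object
    # without noticeably blocking the view behind it.
    renderer.stroke_rect(rect, color=(255, 255, 255), alpha=border_alpha)


draw_bordered_menu(StubRenderer(), rect=(600, 100, 180, 300))
```
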
[0064] Certain elements of the