(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2003/0161097 A1
     Le et al.                         (43) Pub. Date: Aug. 28, 2003

(54) WEARABLE COMPUTER SYSTEM AND MODES OF OPERATING THE SYSTEM

(76) Inventors: Dana Le, Los Gatos, CA (US); Lucian P. Hughes, El Granada, CA (US); Owen E. Richter, Sunnyvale, CA (US)

     Correspondence Address:
     FISH & RICHARDSON P.C.
     45 ROCKEFELLER PLAZA, SUITE 2800
     NEW YORK, NY 10111 (US)

(21) Appl. No.: 10/087,134

(22) Filed: Feb. 28, 2002

Publication Classification

(51) Int. Cl. ........................ G06F 1/16
(52) U.S. Cl. ........................ 361/683

(57) ABSTRACT

A wearable computer system has a user interface with at least an audio-only mode of operating, and that is natural in appearance and facilitates natural interactions with the system and the user's surroundings. The wearable computer system may retrieve information from the user's voice or surroundings using a passive user interface. The audio-only user interface for the wearable computer system may include two audio receivers and a single output device, such as a speaker, that provides audio data directly to the user. The two audio receivers may be miniature microphones that collaborate to input audio signals from the user's surroundings while also accurately inputting voice commands from the user. Additionally, the user may enter natural voice commands to the wearable computer system in a manner that blends in with the natural phrases and terminology spoken by the user.

[Drawing Sheet 1 of 2 — FIG. 1]

[Drawing Sheet 2 of 2 — FIG. 2: block diagram of the wearable computer system, showing the audio data output device, personal audio receiver, environmental audio receiver, power source, processor, computer memory, GPS sensor, IR receiver circuit, and data port]

WEARABLE COMPUTER SYSTEM AND MODES OF OPERATING THE SYSTEM

TECHNICAL FIELD

[0001] The invention relates to a wearable computer system, and more particularly to contextual information storage and retrieval using a wearable computer system.
BACKGROUND

[0002] Advances in computer electronics technology have reduced the size of portable computer systems while increasing the processing speed and memory capacity. More recently, these advances have resulted in the use of a new type of portable computer system known as a wearable computer system. Wearable computer systems can be worn by a user and allow the user to operate the computer system while performing other actions, such as walking or standing. Wearable computers are also convenient to use in workspaces that do not offer enough space to use conventional computers or in workspaces that require hands-free operation of computers. More recently, GPS (global positioning system) sensors have been added to wearable computer systems, which enable the user to store location data to the wearable computer system or request current location data from the wearable computer system. For example, wearable computer systems with GPS sensors may detect the user's location, from which information the system may be able to determine whether the user is home, for example, or near a grocery store or other resource location. As such, the wearable computer system may, for example, display a notice to the user to purchase groceries when in the vicinity of the grocery store.

[0003] The "hands-free" nature of wearable computer systems offers advantages not available with other portable computer systems, such as notebook computers and PDAs (personal digital assistants). However, wearable computer systems are currently not nearly as widely used as other portable computer systems. While actual wearable computer housings are much smaller than notebook computers, user interface devices for the wearable computer systems are often considered to be obtrusive, appear unusual, and do not enable natural interaction with the wearable computer system. For example, the user interface typically includes a small video display screen worn on the user's head. These video display screens are generally visor displays or eyeglass displays, both of which are worn on the head of the user so that the screen is positioned where the user can view it.

[0004] In addition, wearable computer systems often include a microphone so that the user may enter voice commands to the computer system. While hands-free operation of wearable computer systems using voice commands is convenient, the language and syntax used to enter the voice commands may be disruptive to a conversation that the user is having with someone at the same time.
SUMMARY

[0005] The invention provides a wearable computer system that is more natural in appearance and facilitates natural interactions with the system and the user's surroundings. The invention also provides increased functionality in wearable computer systems.

[0006] In one aspect, the invention provides a wearable computer system that includes a computer unit wearable by a user and which has at least a mode of operation with an audio-only user interface. The computer unit may be worn on the belt, or in a pocket, of the user to enable hands-free operation of the wearable computer system. The audio-only user interface includes devices that allow the user to store information to the computer unit using audio signals, such as the voice of the user, and a device that allows the computer unit to output information to the user in the form of audio signals.

[0007] In one embodiment, the audio-only user interface includes an audio receiver, such as a personal microphone, that is wearable by the user and connectable to the computer unit to enable the audio receiver to receive voice signals from the user and provide the voice signals to the computer unit for processing. In addition, the audio-only user interface includes a speaker that is likewise wearable by the user and connectable to the computer unit, the speaker enabling the computer unit to send audio signals to the speaker to provide output to the user.
[0008] In another embodiment of the invention, an audio-only user interface includes a first and a second audio receiver. The first audio receiver is wearable by the user and is connectable to the computer unit to enable the first audio receiver to receive voice signals from the user and provide the voice signals to the computer unit for processing. The second audio receiver is also wearable by the user and connectable to the computer unit so as to enable the second audio receiver to input audio signals from the user's surroundings to the computer unit. In one implementation, the first audio receiver is a personal microphone that receives audio input from the user, and the second audio receiver is an environmental microphone that receives audio input from the user's surroundings. The audio signals received by the first audio receiver that do not originate with the user may be filtered with an audio filter (for example, by using a noise-canceling microphone). As such, voice signals from the user may be received without interference from environmental noise.
[0009] The wearable computer system may also include a video display but still provide the mode of operation where the interface with the user is audio-only. For example, the wearable computer unit, in one implementation, may comprise a personal digital assistant (PDA), or hand-held computer, which in the audio-only mode is kept in the user's pocket or on a belt clip. In such an implementation, the user may access information on the screen of the PDA, if, for example, hands-free operation is not needed, by taking the PDA in hand and viewing its display. In another implementation, a cellular telephone is used and worn on a belt clip or kept in the user's pocket. This cellular telephone may be integrated with the wearable computer unit (which may be a PDA, for example) to provide communications between the user and a remote user, or a remote computer.
[0010] In another aspect, the invention provides a wearable computer system that continuously stores audio information in a scrolling buffer, for example, audio information from the user's surroundings. Upon receiving a predetermined voice command from the user, the wearable computer system stores in memory an audio clip received in the scrolling buffer for a predetermined period of time (such as 30 seconds or one minute), so that the user may later retrieve and listen to the audio clip. In various implementations, the audio information stored for later retrieval may have been received just prior to receipt of the voice command, just after receipt of the voice command, or during a period of time occurring both before and after receipt of the voice command. In another aspect of the invention, a wearable computer system may be operated using natural voice commands to execute functions, such as storing and retrieving information. A natural voice command is a word or phrase used to execute a function of the wearable computer system that is also a standard word or phrase spoken during particular events or occurrences in daily life, for example, "nice to meet you." Thus, the user may issue a natural voice command to the wearable computer system that is less likely to disrupt a conversation the user is having with another person.
[0011] In accordance with this aspect of the invention, the wearable computer system includes an audio receiver that is adapted to be worn by a user. The audio receiver receives audio signals from the user and produces a corresponding electrical signal. The computer unit includes a processor, computer memory, and circuitry that receives and digitizes the electrical signal from the audio receiver. The computer memory has instructions that, when executed by the processor, perform a series of functions that include processing the digitized signals and recognizing spoken words therein, determining whether the recognized spoken words constitute a predetermined natural voice command, and responding to the predetermined natural voice commands from the user by prompting the processor to execute a predetermined function.
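By way of a non-limiting illustration only, the dispatch step described in this paragraph, in which recognized spoken words are checked against the set of predetermined natural voice commands and a match prompts a predetermined function, might be sketched as follows. Python is used purely for illustration; the class, function, and phrase names are assumptions made for this example and are not part of the disclosed system.

```python
# Illustrative sketch only: mapping recognized speech to predetermined
# natural voice commands and executing an associated function.
from typing import Callable, Dict, Optional

class CommandDispatcher:
    """Holds predetermined natural voice commands and their handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], None]] = {}

    def register(self, phrase: str, handler: Callable[[], None]) -> None:
        # Phrases are stored lower-cased so matching is case-insensitive.
        self._handlers[phrase.lower()] = handler

    def dispatch(self, recognized_words: str) -> Optional[str]:
        """Check recognized speech against the command set.

        Returns the matched phrase if a predetermined command was found
        and its handler executed, otherwise None.
        """
        text = recognized_words.lower()
        for phrase, handler in self._handlers.items():
            if phrase in text:  # a natural command may be embedded in a longer sentence
                handler()
                return phrase
        return None

# Hypothetical usage: "nice to meet you" triggers a context-capture routine.
def capture_introduction_context() -> None:
    print("recording audio clip, location, time, and date")

dispatcher = CommandDispatcher()
dispatcher.register("nice to meet you", capture_introduction_context)
dispatcher.dispatch("I am Jane Doe, it is nice to meet you")
```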
[0012] The wearable computer system may interact with the user under various modes of operation. One mode of operation is to passively record data of certain events that the user may recall at a later time. For example, the user may meet a new person and say, "nice to meet you," which is a natural voice command used to prompt the computer to record sound, location, time, and date information of this event. Another example of a natural voice command is the phrase, "What was that number again?" This phrase may initiate the storage of the voice of someone with whom the user is speaking saying the person's phone number, for example. In addition to these natural voice commands, the wearable computer system may also utilize various explicit voice commands, such as the phrases "store that," "start recording," and "end recording," as a few examples.
[0013] Another mode of operation for the computer system is location-based augmentation of the user's memory. For example, the user may be reminded to purchase items on a shopping list, which was recorded by the computer system, when the computer system senses that the user is near a location where the items may be purchased. Yet another mode of operation for the computer system is to provide real-time idea sharing. For example, the user may be wearing the computer system while answering questions at a sales presentation, and a coworker may send important sales information via a wireless network to the user's wearable computer system, which enables the user to appropriately respond to the questions.
[0014] The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF DRAWINGS

[0015] FIG. 1A is a profile view of a user wearing a computer system in accordance with an embodiment of the invention.

[0016] FIG. 1B is a perspective view of the wearable computer system shown in FIG. 1A.

[0017] FIG. 1C is a closer view of a portion of FIG. 1A, showing the user's ear and an earpiece of the computer system.

[0018] FIG. 2 is a block diagram of an embodiment of the wearable computer system shown in FIG. 1A.

[0019] Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION

[0020] In FIG. 1A, a user 2 is wearing a hands-free, voice-operated computer system 10 in accordance with the invention. In this embodiment, which is also shown in FIG. 1B, the wearable computer system 10 includes a computer unit 15 that may be attached to a belt 3 worn by the user 2. The wearable computer system 10 also includes an audio-only user interface, which outputs data directly to the user in a form consisting of audio signals. The audio-only user interface includes an earpiece 30, which houses both a personal microphone 36 and a speaker 32, and an environmental microphone 38. The speaker 32 is housed in the earpiece 30, as shown in more detail in FIG. 1C. The personal microphone 36 may also be housed in the earpiece 30 and is used to receive audio signals from the user 2. The environmental microphone 38 may be attached to the belt 3 of the user 2, and may be used to input audio from the user's surroundings.
[0021] Before discussing the computer system 10 in more detail, we will provide an example of how the system 10 may be used. The wearable computer system 10 may be used to store information from an introduction of the user 2 to a new person. For example, the user 2 may enter a predetermined voice command to inform the wearable computer system 10 that the introduction to a new person is occurring. The predetermined voice command is received by the personal microphone 36. Receipt of the voice command may, for example, prompt the computer system 10 to record and store an audio clip surrounding the event, such as the new person speaking his or her name, using the environmental microphone 38. Other automated sensors and devices (described later) of the wearable computer system 10 may be used to store other contextual information about the user's introduction to a new person, such as location, time, and date. Later, the user 2 may recall the location, time, date, and audio data of the introduction. The information may be recalled from the wearable computer system, for example, if the user 2 sees the person previously met. In this case, other voice commands may prompt the replay of audible information to the speaker 32 in the earpiece 30. The information may also be uploaded to another computer system, such as a desktop computer, and recalled from there.
[0022] Turning now to the details of the wearable computer system 10, the speaker 32 and personal microphone 36 in the earpiece 30 may be connected to the computer unit 15 using a thin, flexible wire 34, as shown in FIGS. 1A-1C. One example of such an integrated earpiece 30 with both an internal speaker 32 and a microphone 36 is sold by JABRA Corporation of San Diego, Calif. The wire 34 may be wrapped behind the user's ear 6, worn under the upper body clothing of the user 2, and connected to the computer unit 15 on the user's belt 3. The wire 34 being worn under the clothing helps both to prevent the wire 34 from becoming snagged on something and to conceal the wire 34 from the view of other people. In other embodiments, a boom speaker/microphone assembly may be used, or an earpiece with a bone-conduction microphone may also be used.
[0023] The personal microphone 36 may be used to input predetermined voice commands from the user 2 to the computer unit 15 using a conventional voice recognition engine (discussed later). Also, an audio filter may be associated with the personal microphone 36 to filter noise from the surroundings of the user while properly receiving predetermined voice commands from the user. In one embodiment, the audio filter may be a combination of the position and the sensitivity setting of the personal microphone 36. In addition, or alternatively, the operation of the microphone 36 may be controlled by an algorithm that performs the noise cancellation. The microphone 36 in the FIG. 1 embodiment, for example, may operate on the principle that the microphone 36 will remain a constant distance from the user's mouth, and the algorithm calibrates the microphone operation to that distance. As such, the user 2 may input a voice command to the personal microphone 36 while in a noisy environment and the voice command from the user 2 may be properly received and input to the computer unit 15 to execute the command.
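A deliberately simplified sketch may help illustrate the calibration idea in this paragraph. The fragment below is not the noise-cancellation algorithm itself; it only assumes, as stated above, that the user's mouth stays a roughly constant distance from the personal microphone 36, so that user speech arrives at a fairly stable level that can be separated from quieter ambient sound. All names and the margin value are assumptions for illustration.

```python
# Illustrative sketch: a simple level-based gate for the personal microphone,
# standing in for the noise cancellation described above. It relies on the
# assumption that user speech arrives at a roughly constant level because the
# mouth-to-microphone distance stays roughly constant.
import math
from typing import List

def rms(frame: List[float]) -> float:
    """Root-mean-square level of one audio frame."""
    return math.sqrt(sum(s * s for s in frame) / len(frame)) if frame else 0.0

def calibrate(user_frames: List[List[float]], margin: float = 0.5) -> float:
    """Derive a gate threshold from frames captured while the user speaks."""
    levels = [rms(f) for f in user_frames]
    return margin * (sum(levels) / len(levels))

def gate(frame: List[float], threshold: float) -> List[float]:
    """Pass frames at or above the calibrated user-voice level; mute the rest."""
    return frame if rms(frame) >= threshold else [0.0] * len(frame)
```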
[0024] The environmental microphone 38 may also be connected to the computer unit 15 using another thin, flexible wire 37. The personal microphone 36 and the environmental microphone 38 may be used, in combination, to input audio signals to the computer unit 15. For example, as discussed above, the user 2 may input a voice command to save a person's name as the person speaks. Even if the user 2 is in a noisy environment, the audio filter associated with the personal microphone 36 filters the environmental noise and properly inputs the user's voice command. Because the audio filter may also filter the voice of the person to whom the user 2 is speaking, the environmental microphone 38, which receives audio signals that are not filtered, may be used to receive audio signals from the person. Thus, the personal microphone 36 and the environmental microphone 38 are used in collaboration to assure that voice commands are input to the computer unit 15 only from the user 2 while the audio signals from the user's surroundings may also be properly input to the computer unit 15.
[0025] The wearable computer system 10 may also include a miniature camera 20 (see FIG. 1B) that is connected to the computer unit 15 using a thin, flexible wire 21. The miniature camera 20 may be used to automatically store images of people or objects when the user 2 enters a predetermined voice command to the wearable computer system 10. The miniature camera 20 may be worn on the shirt of the user 2 with the wire 21 worn underneath the upper body clothing of the user 2. For example, a button on the user's shirt may be replaced with the miniature camera 20 so that the camera has a natural appearance.
[0026] As shown in FIG. 1A, the computer unit 15 may be worn on the belt 3 of the user 2 so that the computer unit 15 is minimally obtrusive. By way of example, the computer unit 15 may have a width that is no greater than 5.0 inches, a length that is no greater than 4.0 inches, and a depth that is no greater than 1.5 inches. In addition, computer units 15 with smaller sizes and varying shapes, to provide a computer unit 15 that is less obtrusive, are within the scope of the invention. It is contemplated that the size of the computer unit 15 may become smaller as computer technology advances, or the size may remain constant but may provide more capability.
[0027] A block diagram of the wearable computer system 10 from FIGS. 1A-1C is shown in FIG. 2. A processor 16 is connected to computer memory 18 inside the computer unit 15. A power source 19, such as a battery, may be housed within the computer unit 15 for supplying power to all the circuitry in the system 10. An audio output device 32 and a personal audio receiver 36, such as the speaker 32 and personal microphone 36, respectively, are housed in the earpiece 30. The personal microphone 36 receives audio signals from the user 2 and sends electrical signals, such as analog signals, to the computer unit 15. The computer unit 15 includes conventional analog-digital circuitry 26 that digitizes the analog signal from the personal microphone 36. The computer memory 18 includes a voice recognition engine that receives the digitized signals from the analog-digital circuitry 26 and interprets the proper commands to be executed by the processor 16. In addition, an environmental audio receiver 38 and an image recorder 20 are connected to the computer unit 15, such as the environmental microphone 38 and miniature camera 20 shown in FIGS. 1A-1B. Similar analog-digital circuitry 26 may be connected to the speaker 32, the environmental microphone 38, and the miniature camera 20.
[0028] The computer unit 15 may include a continuously scrolling audio buffer to store audio information received by the environmental microphone, for example. This buffer (which is a part of memory 18 in one implementation) continuously records ambient audio, and saves it for some predetermined period of time, such as 30 seconds or one minute. In other words, this continuously scrolling buffer may discard recorded audio information after 30 seconds if the user has not issued a predetermined command to store the information for later retrieval. This allows the user to store audio clips just before, or after, the user issues a predetermined voice command, as will be described in more detail later.
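As a non-limiting illustration of the continuously scrolling buffer described above, the following sketch keeps only the most recent window of ambient audio (for example, 30 seconds) and can return a copy of that window when a predetermined command is received. The class and parameter names are assumptions chosen for this example, not part of the described implementation.

```python
# Illustrative sketch: a continuously scrolling audio buffer that retains only
# the most recent window of ambient audio and can hand back a copy of that
# window so it may be saved permanently when a voice command is received.
from collections import deque
from typing import Deque, List, Tuple

class ScrollingAudioBuffer:
    def __init__(self, window_seconds: float = 30.0) -> None:
        self.window_seconds = window_seconds
        self._frames: Deque[Tuple[float, List[float]]] = deque()  # (timestamp, samples)

    def push(self, timestamp: float, samples: List[float]) -> None:
        """Append the newest frame and discard anything older than the window."""
        self._frames.append((timestamp, samples))
        cutoff = timestamp - self.window_seconds
        while self._frames and self._frames[0][0] < cutoff:
            self._frames.popleft()

    def snapshot(self) -> List[Tuple[float, List[float]]]:
        """Copy of the retained window, e.g. to store in memory on command."""
        return list(self._frames)
```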
[0029] While the data input directly from the user 2 to the wearable computer system 10 consists of audio data, the wearable computer system 10 may automatically input data from other sources that do not employ a user interface. A conventional GPS sensor 22 to input the location of the user 2 may be enclosed inside the computer unit 15 of the wearable computer system 10 and connected to the processor 16. Another source of data for the wearable computer system 10 may be a conventional IR (infrared) receiver circuit 24 for inputting data, such as positional information within a building, from an IR beacon.
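Location data input through the GPS sensor 22 may later be referenced against a database of user-labeled locations such as "work," "home," or "store" (see paragraph [0039] below). The following fragment is a non-limiting sketch of such a lookup; the helper names, the distance threshold, and the example coordinates are assumptions for illustration only.

```python
# Illustrative sketch: referencing a GPS reading against a small database of
# user-labeled locations. Threshold and helper names are assumed for the example.
import math
from typing import Dict, Optional, Tuple

LatLon = Tuple[float, float]

def _approx_distance_m(a: LatLon, b: LatLon) -> float:
    """Rough equirectangular distance in meters, adequate for nearby points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6_371_000 * math.hypot(x, y)

def label_for_location(reading: LatLon, labeled: Dict[str, LatLon],
                       threshold_m: float = 150.0) -> Optional[str]:
    """Return the nearest stored label within the threshold, if any."""
    best = min(labeled.items(), key=lambda kv: _approx_distance_m(reading, kv[1]),
               default=None)
    if best and _approx_distance_m(reading, best[1]) <= threshold_m:
        return best[0]
    return None

# Hypothetical usage with locations the user previously labeled.
places = {"home": (37.2266, -121.9747), "work": (37.3318, -122.0312)}
print(label_for_location((37.2268, -121.9750), places))  # -> "home"
```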
[0030] A data port 28 is used to upload saved data from the computer unit 15 directly to a remote computer (not shown) or to download information, such as software updates, from the remote computer to the computer unit 15. The data port 28 may use a conventional connection to the remote computer, such as a USB or IR port, or a wireless network connection. In one embodiment, the data port 28 of the computer unit 15 may be connected to a wireless radio frequency (RF) transmitter (for example, a cellular telephone), for transmissions to or from another person or remote computer. The data port 28, the miniature camera 20, the GPS sensor 22, and the IR receiver circuit 24 are all examples of sources that may be used by the wearable computer system 10 to input information without employing a user interface, thus enabling the wearable computer system 10 to be less noticeable on the user 2.
[0031] The user interface devices of the wearable computer system 10, such as the earpiece 30 and the environmental microphone 38, blend in with the natural appearance of the user 2. The wearable computer system 10 is also minimally obtrusive to the movements and actions of the user 2. The audio-only user interface of the wearable computer system 10 does not require the use of noticeable visual displays, such as a visor display or an eyeglass display. Visual displays for wearable computers have often been worn on the user's head with a small display screen projecting in front of the user's eye. Even the smallest of these displays are difficult to conceal and do not blend in with the natural appearance of the user 2. Also, such displays are distracting and disruptive to conversation and interaction with other people.
[0032] Nevertheless, in one embodiment a personal digital assistant (PDA), or hand-held computer, may be integrated with the computer unit 15, or serve as the computer unit 15. As such, the PDA provides a display for the user when hands-free operation is not needed. Even in this embodiment, although a video user display is available, the wearable computer system avoids the use of the head-mounted video displays used in the prior art.
[0033] Additional measures may be taken to make the wearable computer system 10 even more unintrusive for the user and people who interact with the user. For example, FIGS. 1A-1B show the computer unit 15 attached to the belt 3 on the user 2, but the computer unit 15 may alternatively be carried in a pocket of the user's clothing, depending on the size of the computer unit 15. Also, the earpiece 30 may be made of a transparent or translucent material, or the color of the earpiece 30 may be similar to the skin color of the user 2 to further blend in with the natural appearance of the user 2. In addition, having an earpiece in one's ear is becoming a normal appearance. Indeed, with cellular telephones, for example, earpieces are commonly used to converse on the telephone in a hands-free manner.
[0034] In another aspect of the invention, the wearable computer system 10 uses natural voice commands from the user 2. Natural voice commands enable the user 2 to input voice commands to the wearable computer system 10 in a manner that blends with the natural phrases and terminology spoken by the user 2. A natural voice command is a word or phrase used to execute a function of the wearable computer system 10 that is also a standard word or phrase spoken during particular events or occurrences in daily life. As such, the issuance of a voice command by the user 2 may be done in a way that does not disrupt the conversation. For example, the phrase, "Nice to meet you," is a standard statement that is commonly spoken during an introduction between two people. This standard phrase may be used as a natural voice command to execute a function, or series of functions, by the wearable computer system 10 based on the event of meeting a new person.
[0035] Other examples of standard phrases used to derive the context of the user's actions that may be used as natural voice commands include: "How are you doing?", "What is your name?", "Remember to buy," and "www." For example, the user 2 may say "How are you doing?" to another person, which prompts the wearable computer system 10 to store a brief audio recording of the conversation, the time and date of the conversation, the user's location, and an image of the person speaking with the user 2. A similar set of functions may be performed by the wearable computer system if the user 2 asks "What is your name?" to another person. In another example, the user may speak the phrase, "I need to remember to buy," during a conversation with another person about a particular product, or when the user is alone. The "remember to buy" portion of that phrase may prompt the wearable computer system to record an audio sound byte of the conversation and the time and date of the conversation. Similarly, the user 2 may read aloud an internet website address that is printed on a sign, so the phrase, "www," may be used to prompt the computer system 10 to record an audio sound byte of the user speaking the complete website address. Many other natural voice commands may be used by the wearable computer system 10 depending on the location and preferences of the user 2.
[0036] The previously discussed mode of operation for an introduction of the user 2 to a new person may now be explained in more detail. The user 2 may be introduced to a new person, and greet the new person by speaking the phrase, "Nice to meet you." This phrase may be set up by the user 2, or during manufacture, to be one of the natural voice commands programmed in the wearable computer system 10 that is recognized by the voice recognition engine. The "Nice to meet you" phrase can be easily picked up by the personal microphone 36 and passively input to the computer unit 15 as a command to execute a series of functions based on the context of meeting the new person.
[0037] The wearable computer system 10 may then be prompted to use the miniature camera 20 to save an image of the new person that the user is presumably meeting, and a thirty-second sound byte surrounding the event is input from the environmental microphone 38 and saved into the computer memory 18. The previously described audio buffer may be employed to store a clip of audio data before the natural voice command is spoken, so the sound clip may include some audio data of the words spoken before the natural voice command was actually spoken. For example, if someone with whom the user is speaking says "my name is John Doe," and the user responds, "I am Jane Doe, it is nice to meet you," then the audio buffer allows the capture of audio information just before the voice command was issued. In other contexts, it may be desirable to record audio information that occurs after the voice command is issued, or a combination of audio information received before and after the voice command is issued. In addition, the system may allow the user to issue an explicit voice command such as "start recording," which would start the storage of received audio information for later retrieval, and issue a later explicit voice command such as "stop recording," to stop the storage of audio information for later retrieval. In this case, the audio information received between the two commands would be stored and available for later retrieval.
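The choice of which buffered audio to store permanently, relative to the moment the voice command is recognized, may be illustrated with a short sketch. The window lengths and function names below are assumptions for this example; an explicit "start recording"/"stop recording" pair would simply store the frames received between the two command times.

```python
# Illustrative sketch: selecting the portion of buffered audio to store
# permanently relative to the moment a voice command was recognized.
# Timing mode and window lengths are assumptions for this example.
from typing import List, Tuple

Frame = Tuple[float, List[float]]  # (timestamp, samples)

def clip_around_command(frames: List[Frame], command_time: float,
                        before_s: float = 15.0, after_s: float = 15.0) -> List[Frame]:
    """Keep frames falling in the chosen window around the command time.

    before_s=30, after_s=0 keeps only audio heard before the command
    (e.g., the other person saying "my name is John Doe");
    before_s=0, after_s=30 keeps only audio heard afterwards.
    """
    start, end = command_time - before_s, command_time + after_s
    return [(t, s) for (t, s) in frames if start <= t <= end]
```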
[0038] The predetermined voice commands, whether natural or explicit, may be customized by the user through a set-up procedure. For example, one user may select the phrase, "Nice to meet you," to initiate the storage of an audio clip, while another user may select the phrase, "How do you do?" In one implementation, the set-up procedure may be implemented by the user being prompted by the audio receiver 36 to speak a phrase that will serve as the predetermined voice command for a specific function type, such as meeting a new person and recording that person's name. In response, the user will provide the desired phrase, which will be stored so that later, when that phrase is spoken by the user, the storage may occur. In addition, during this set-up procedure, the user may be prompted for additional information, such as the timing, in relation to the issuance of the voice command, at which the voice clip will be taken (for example, before the voice command, after the voice command, or a combination of both before and after the voice command). Also, the set-up procedure may allow the user to select the period of time for the voice clip, for example, 10 seconds, 20 seconds, one minute, etc. As such, it is possible to adjust the system so that when the audio information is later retrieved, it does not take too long to obtain the information needed. For example, if only the name of a person previously met is needed, it may not be desirable for the user to have to review a one-minute clip of previously stored audio information. As an alternative to the set-up procedure being done on the wearable computer system 10 itself, the setup may be done using another computer and downloaded to the wearable computer 10.
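The result of such a set-up procedure may be illustrated as a small, user-editable record per predetermined command. The field names and example values below are assumptions for illustration only, not part of the described system.

```python
# Illustrative sketch: a user-customizable description of one predetermined
# voice command, as might result from the set-up procedure described above.
from dataclasses import dataclass

@dataclass
class VoiceCommandSetup:
    phrase: str            # e.g. "nice to meet you" or "how do you do"
    function: str          # label of the function the phrase triggers
    clip_timing: str       # "before", "after", or "both" relative to the command
    clip_seconds: float    # length of the audio clip to retain

# One user's choice for the "meeting a new person" function; another user
# might instead select "How do you do?" with a shorter clip.
introduction = VoiceCommandSetup(
    phrase="nice to meet you",
    function="record_introduction",
    clip_timing="both",
    clip_seconds=20.0,
)
```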
[0039] Location information from the GPS sensor 22 may be referenced against a database of locations stored in the computer memory 18, and labeled as "work," "home," or "store," that may be set up by the user 2. A conventional computer clock and calendar of the computer unit 15 may be used to record the time and date of the introduction of the new person. Thus, contextual information from the introduction, which may also be stored, may include location, time and date information, audio of the new person speaking his or her name, and an image of the person. This contextual information may also be uploaded to a di
