(19) United States
(12) Patent Application Publication    Bailey
(10) Pub. No.: US 2008/0113689 A1
(43) Pub. Date: May 15, 2008

(54) VOICE ACTIVATED DIALING FOR WIRELESS HEADSETS

(76) Inventor: William P. Bailey, Lithonia, GA (US)

Correspondence Address:
WITHERS & KEYS FOR BELL SOUTH
P.O. BOX 71355
MARIETTA, GA 30007-1355

(21) Appl. No.: 11/558,547

(22) Filed: Nov. 10, 2006

Publication Classification

(51) Int. Cl.
     H04M 1/00 (2006.01)
(52) U.S. Cl. .................................................... 455/569.1

(57) ABSTRACT

Provided are methods and systems for hands free communication with a wearable telecommunication device configured to recognize a command associated with the audio message and executing the command by communicating wirelessly with at least one communication device or a mobile telecommunication system. The wearable wireless telecommunications headset includes a casing containing a speaker for receiving an audio message, a microphone for sending an audio message, a transceiver and a processor programmed for voice recognition. Methods are provided for direct communication with a telecommunication system and also communicating with a telecommunications system via an intermediary local device.
[FIG. 1A (Sheet 1 of 7): block diagram of the wireless headset's functional components, including a Radio Frequency Transceiver, Processor, Voice Recognition Module, Speaker, and Microphone]
[FIG. 1B (Sheet 2 of 7): block diagram of the headset with additional functional components, including a Radio Frequency Transceiver, Processor, Voice Recognition Module, Microphone, Transducer, and Bluetooth Receiver]
[FIG. 2 (Sheet 3 of 7): wireless telecommunications headset in communication with multiple remote devices and a telecommunications system]
[FIG. 3A (Sheet 4 of 7): flow chart of routine 300a: receive prompt to start outgoing call (301); energize RF transceiver (305); user utters audio signal (307); processor recognizes audio command (308); if a lookup is required (309), determine phone number associated with command (310); headset processor dials phone number and makes call (311)]
[FIG. 3B (Sheet 5 of 7): flow chart of routine 300b: receive prompt to start outgoing call (301); determine device to make call (302); if not the headset (303), select local device for transmission (304) and energize local transceiver (306), otherwise energize RF transceiver (305); user utters audio signal (307); processor recognizes audio command (308); if a lookup is required (309), determine phone number associated with command (310); instruct calling device to dial phone number and make call; end]
[FIG. 4A (Sheet 6 of 7): flow chart of routine 400a: receive notification of incoming call (401); if caller info is present (402), determine caller information (403); notify user of incoming call with audio (404); if the user chooses to accept the call and answers (408), accept at headset (450); otherwise send to alternative destination]
[FIG. 4B (Sheet 7 of 7): flow chart of routine 400b: if caller info is present, determine caller information (403); notify user of incoming call with audio (404); if the user chooses to accept the call (405) and answers (407), accept at headset (450) or send to remote device (451); otherwise ignore the call or send to alternative destination (409)]
`
`VOICE ACTIVATED DIALING FOR
`WIRELESS HEADSETS
`
`BACKGROUND
`
`[0001] Wireless headsets are popular devices for cell
`phone users. The portability and convenience of wireless
`headsets have convinced professional and amateur users
`alike to switch from wired headsets, conventional cell phone
`speakers and microphones. Wireless headsets generally may
`include one or more components for transmitting sound
(e.g., a speaker), one or more components for receiving sound (e.g., a microphone), and one or more signaling components (e.g., a radio); the combination of these components enables a user to wirelessly listen to an audio message and/or participate in a conversation.
`[0002] Conventionally, wireless headsets are used in con-
junction with detached cell phones. A user may, for example,
`have a cell phone in his pocket,
`the cell phone being
`simultaneously in communication with both a cell tower and
`a wireless headset affixed to the user’s ear or head. Even
`
`though cellular telephones have been reduced in size and are
`sleeker in design, they still constitute a weight that must be
`carried in a pocket, purse or on a belt.
`[0003]
`If a user, however, wishes to enjoy the benefits of
`a cellular telephone without the inconvenience of carrying
`an extra weight in his pocket or on his belt, the existing
`solutions fall short. Furthermore, if a user wants to receive
`audio announcement information about an incoming call
`through his headset, again existing solutions fall short.
`Finally, if a user wants the ability to connect to a remote
`person or location using audio commands, headset solutions
`do not handle such commands in as simple and centralized
`a method as possible.
`
SUMMARY

[0004] It should be appreciated that this Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

[0005] Exemplary embodiments include a wireless telecommunications headset casing being detachably secured to a user’s head and containing a speaker for providing audio to a user, a microphone for receiving an audio message from a user, a transceiver for communicating with a mobile telecommunication system and a processor for recognizing a command associated with the audio message and executing the command.

[0006] Exemplary embodiments provide a method to establish a communication by a headset. The method includes receiving an audio signal from a user at the headset via a microphone attached to the headset. An audio command is then derived from the audio signal, allowing the headset to establish a communication with the recipient over a mobile communication system based in part on the audio command.

[0007] In accordance with other exemplary embodiments, a computer readable medium is provided with instructions to receive an audio signal from a user via a microphone attached to the headset. An audio command is then derived from the audio signal, allowing the headset to establish a communication with the recipient over a mobile communication system based in part on the audio command.

[0008] Other apparatuses, methods, and/or computer program products according to embodiments will be or will become apparent to one with skill in the art upon review of the following drawings and Detailed Description. It is intended that all such additional systems, methods, and/or computer program products be included within this description, be within the scope of the present invention, and be protected by the accompanying claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] FIG. 1A is a block diagram illustrating functional components that may be found in a wireless telecommunications headset;

[0010] FIG. 1B is a block diagram illustrating an optional embodiment including additional functional components that may be found in a wireless telecommunications headset;

[0011] FIG. 2 depicts an exemplary wireless telecommunications headset in communication with multiple remote devices and a telecommunications system;

[0012] FIG. 3A is a flow chart illustrating an exemplary method for initiating an outgoing phone call from a wireless telecommunications headset;

[0013] FIG. 3B is a flow chart illustrating an exemplary method for initiating an outgoing phone call from a wireless telecommunications headset utilizing remote devices;

[0014] FIG. 4A is a flow chart illustrating an exemplary method for receiving an incoming phone call using a wireless telecommunications headset; and

[0015] FIG. 4B is a flow chart illustrating an exemplary method for receiving an incoming phone call using a wireless telecommunications headset utilizing remote devices.

DETAILED DESCRIPTION

[0016] The following detailed description is directed to an apparatus and method for receiving and initiating telephone calls. In the following detailed description, references are made to the accompanying drawings that form a part hereof and which are shown, by way of illustration, using specific embodiments or examples. Referring now to the drawings, in which like numerals represent like elements through the several figures, aspects of the apparatus and methods provided herein will be described.
`
[0017] FIG. 1A is a block diagram illustrating functional components that may be found in a wireless telecommuni-
`cations headset 101. Headset 101 may be wearable by a user.
`For example, headset 101 may be formed so as to affix to the
`head of a user by wrapping around an ear or inserting at least
`partially within the ear canal, or the headset may include a
`speaker for one or both ears and extend over or around the
`head as with conventional headphones. Headset 101 may
`also be separated into multiple physical components
`attached to each other using wired and/or wireless schemes
`(e.g. separate microphone or battery attached by a wire).
`Although represented here by distinct blocks for clarity,
`functional components of headset 101 may be combined into
a single component (e.g. processor with on-board memory)
`and/or split
`into multiple components (e.g. multiple co-
`processors).
`[0018] As illustrated in FIG. 1A, the headset 101 may
`include a processor 102. Processor 102 may include a central
`processing unit, an embedded processor, a specialized pro-
`
`

`

`
`cessor (e.g. digital signal processor), or any other electronic
`element responsible for interpretation and execution of
`instructions, performance of calculations and/or execution
`of voice recognition protocols. Processor 102 may commu-
`nicate with, control and/or work in concert with other
`functional components,
`including a microphone 105, a
`speaker 106, an antenna 109, a battery 111 and a voice
`recognition module 114. Communication between compo-
`nents may be facilitated by a bus 108. Bus 108 may be
`inclusive of multiple buses and/or communication paths.
`Communication may be in the form of multiple parallel
`paths, a single serial path, or any other communication
`scheme.
`
[0019] Processor 102 may include a voice recognition
`module (“VRM”) 114. VRM 114 may be any available voice
`recognition hardware, software or firmware allowing audio
`commands uttered by the user 201 to be transformed into
`electronic commands understandable by processor 102 or
`the other components of headset 101. As an alternative,
`VRM 114 may reside on a remote device 202 or 205 capable
`of communicating with headset 101.
`[0020] RF transceiver 110 is capable of communicating
`wirelessly with a transponder of a telecommunications sys-
`tem 203 using antenna 109 or a second antenna, if desired.
`RF transceiver 110 may include multiple radios, each being
`specialized for different frequencies and/or reception and
`transmission. The telecommunications system 203 may be
`any telecommunications system including a mobile telecom-
`munications system where the user may travel from base
`station-to-base station or hot spot-to-hot spot. A telecom-
`munications system may be an analog or digital cellular
`telecommunications system. Moreover, the telecommunica-
`tions system 203 may be a Personal Communication Service
`(PCS) in either of its analog and digital versions. The
`telecommunication system 203 may utilize Time Division
`Multiple Access (TDMA), Code Division Multiple Access
`(CDMA), Frequency Division Multiple Access (FDMA) or
`Global System for Mobile (GSM) technologies. The tele-
`communication system 203 may be a satellite communica-
`tion system.
`[0021] Microphone 105 and speaker 106 may each include
`any form of transducer capable of converting audio waves
`into electrical signals (as with the microphone) and/or
`converting electrical signals into audio waves (as with the
`speaker). Ultimately,
`these components enable a user of
`headset 101 to participate in a telephonic conversation and
`may also enable the user to provide audio commands and
`receive audio. Microphone 105 and speaker 106 may be
`designed to also provide “speaker phone” capability for
`conference call use.
`
`[0022] According to exemplary embodiments, each elec-
`tronic component comprising headset 101 is powered by
`battery 111. Battery 111 can be any type of battery com-
`mensurate with the manufacturer’s ultimate design choices.
`As non-limiting examples, such a battery can be a recharge-
`able or disposable battery and can range from a ubiquitous
`AAA battery to a miniature lithium ion battery. If the headset
`101 is disposable, the battery does not have to be replace-
`able. The composition of the battery is not essential to the
`subject matter being described herein as long as the power
provided is sufficient to the manufacturer’s ultimate design.
`Battery 111 can be integrated into the headset 101 or reside
`externally and provide power via an external cable.
`
[0023] FIG. 1B is a block diagram illustrating an optional embodiment including additional functional components that may be found in a wireless telecommunications headset
`101. Headset 101, wearable by a user, may also include a
`local
`transceiver 103, a memory 104, a transducer 107,
`shielding 112 and a synchronizing connection 113.
`[0024] Local transceiver 103 is capable of communicating
`wirelessly with other local devices using electromagnetic
`frequencies broadcasted and received using antenna 109.
`Transceiver 103 may include multiple radios. Local trans-
`ceiver 103 may include transmitters and/or receivers capable
`of utilizing radio standards for communicating with remote
`devices. As an example, local transceiver 103 may be of
`limited range and be enabled to utilize a Bluetooth® radio
`standard. Radio standards may also include Ultra-Wideband
`(UWB), Wireless USB (WUSB), Wi-Fi
`(IEEE 802.11),
WiMAX, WiBro, infrared, near-field magnetics, HiperLAN,
`and so forth. These short range radio standards will be
`referred to as the local network or local system. The local
`transceiver may also be an optical transceiver operating in
`commonly used spectra such as the infrared or ultraviolet
`spectra. Antenna 109 of the headset 101 may include mul-
`tiple antennas, each being specialized for different frequen-
`cies and/or reception and transmission.
`[0025] The headset 101 may include shielding 112 as
`protection for the user in order to directionally attenuate any
`RF energy being emitted from transceivers 103 and/or 110.
`Shielding 112 may be constituted from any materials known
`to one of ordinary skill in the art as being suitable for such
`purposes now or in the future.
`[0026] Memory 104 may be utilized for the storage of
`electronic data and electronic instructions for use by pro-
`cessor 102. Memory 104 may include one or more types of
`computing memory, including volatile (powered) and non-
volatile forms of memory. Volatile memory is most commonly comprised of integrated circuits and may include
`various forms of static random access memory (SRAM) and
`dynamic random access memory (DRAM). Non-volatile
`memory may include integrated circuit forms of memory
`such as flash memory, as well as other categories of memory
`including magnetic and/or optical forms of data storage. As
`above, memory 104 may be comprised of a single integrated
`circuit, or multiple components. Memory 104 may record
`multiple types of data also including ring tones, caller ID
`information, operational instructions and all types of data-
`bases. In the alternative, the memory 104 may be resident on
`a remote computing device 205 such as a nearby personal
`computer, which is accessible by the local transceiver 103.
Memory 104 may also contain VRM 114 or instructions
`associated with VRM 114.
`
`[0027] Transducer 107 may provide an additional input
`method for providing or prompting commands by the user.
`Transducer 107 may be a button, toggle, touchpad or other
`suitable device to convert mechanical energy into an elec-
`trical signal. Transducer 107 may include a touch sensor, a
`motion sensor, a sound sensor, or any other component
`capable of providing or prompting commands by the user.
`For purposes of headset 101, the functionality of transducer
`107 may be integrated with microphone 105 so as to enable
`vocal commands or prompts by the user. Transducer 107 can
`be comprised of a single multifunction transducer, multiple
`single purpose transducers that operate in conjunction or
`independently with each other and/or multiple multifunction
`transducers that operate in conjunction or independently
`
`

`

`
`with each other. If desired, multifunction transducer 107 can
`also be a single function transducer. Transducer 107 may be
`used to initiate any number of functions associated with
`headset 101. Transducer 107 may be used to initiate a call,
`receive a call, send a call to voice mail, terminate a call,
`initiate/terminate speaker phone capability for microphone
`105 and speaker 106 or select a phone number to call. The
`preceding list of functions controlled by transducer 107 is
`exemplary and may be expanded to include any and all
`functions of headset 101.
`
`[0028] Headset 101 may include synchronizing connec-
`tion 113 (“sync connector”). Sync connector 113 may be
`used to receive and deliver updates and downloads via a
`computing device such as remote device 205, for example.
`Sync connector 113 may communicate by a radio protocol
`(i.e. Bluetooth®), optics (i.e. infrared) by a cable connection
`or any other available communication medium. Updates and
`downloads may be also accomplished from telecommuni-
`cations system 203.
`[0029]
`FIG. 2 depicts an example of wireless telecommu-
`nications headset 101 in use with telecommunication system
`203. User 201 may wear headset 101 over one ear. User 201
`may speak such that microphone 105 can pick up the user’s
`voice. Auser 201 may hear synthesized audio feedback from
`headset 101, via VRM 114, as well as the voices of others
`via speaker 106. User 201 may also prompt, or otherwise
`command headset 101, using transducer 107 or by speaking
`audio commands into microphone 105 which are then con-
`verted to digital signals by VRM 114 and processor 102. Via
`VRM 114, audio commands may be used to control any and
`all functions of headset 101. For example, such audio
`commands may direct processor 102 to connect RF trans-
`ceiver 110 with telecommunication system 203 and dial a
`particular number. Such communication can use any radio
`standard used by the telecom provider.
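For illustration only, the mapping from a recognized audio command to a headset action described above could be sketched as a simple dispatch table; the function and command names below (e.g. connect_and_dial) are hypothetical examples and not part of the disclosed apparatus:

    # Illustrative sketch: route recognized VRM output to a headset action.
    # All names here are hypothetical, not the disclosed implementation.
    def connect_and_dial(number):
        print("Connecting RF transceiver and dialing " + number)

    def answer_call():
        print("Answering incoming call at headset")

    HANDLERS = {
        "dial": connect_and_dial,   # e.g. "Dial 4 0 4 5 5 5 4 1 3 2"
        "answer": answer_call,      # e.g. "Answer"
    }

    def execute(recognized_text):
        verb, _, rest = recognized_text.strip().lower().partition(" ")
        handler = HANDLERS.get(verb)
        if handler is None:
            print("Unrecognized command: " + recognized_text)
        elif rest:
            handler(rest.replace(" ", ""))
        else:
            handler()

    execute("Dial 4 0 4 5 5 5 4 1 3 2")   # dials 4045554132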
`[0030] Headset 101 may optionally be in wireless com-
`munication with one or more local remote devices 202 and
`
`205, simultaneously via a local network. Remote devices
`may include conventional telephonic devices in addition to
`other less conventional devices, including personal comput-
`ers and video enabled phones. Wireless communication may
`be in the form of a Bluetooth® radio standard, or other
`wireless standards as discussed above. For example, when
`user 201 is at home, headset 101 may be connected to a
`home phone, such as the communication device 202 via a
`Bluetooth® wireless connection. User 201 may have already
`authorized headset 101 to communicate with the communi-
`cation device 202 using an authentication process such as
`pairing. The communications device 202 may have wireless
`networking built in to communicate with headset 101 or the
`home phone may be connected through a wireless converter
`which converts wireless signals back and forth into conven-
`tional wired telephone signals. The communications devices
`202 and 205 may include POTS phones, Voice over Internet
`Protocol
`(VoIP), WiFi phones, computers and Personal
`Digital Assistants (PDAs).
`[0031]
`In an exemplary embodiment, headset 101 acts as
`a conduit for audio signals, acting as a receiver and trans-
`mitter of a conversation between user 201 and remote user
`204. When user 201 speaks, audio waves are converted to
`analog electronic signals by microphone 105. The analog
`signals may or may not be processed by processor 102 to
`convert them into digital signals, for example, by digital
`sampling of the analog signal. Either way, according to
`
`exemplary embodiments, signals are transmitted wirelessly
`by RF transceiver 110 directly to the telecommunications
`system 203 for communication with remote user 204. Tele-
`communications system 203 may include, a packet switched
`data network such as the Internet, or any combination of
`networks used to distribute information such as voice and/or
`
`data signals. Likewise signals may return along the same or
`a different path through the same wireless connection for
`conversion to audio waves for user 201 to hear.
`
[0032] In other exemplary embodiments, headset 101 may
`interpret audio commands delivered verbally by user 201.
`The user 201 may prompt headset 101 to make a call by
`either speaking a particular word or phrase, by manipulating
`transducer 107 or both. Headset 101 may then use speech
`recognition techniques, via VRM 114, to interpret a verbal
`command provided by user 201. The command may include
`a telephone number to be dialed or a name from an address
`book to be dialed from memory device 104. Once the
`command is interpreted by processor 102, headset 101 may
`act on the command by, for example, transmitting signals to
`telecommunications system 203 via RF transceiver 110.
`[0033] Once wireless headset 101 determines a phone
`number associated with the audio command,
`the phone
`number may be converted to an audio message to be
`delivered as feedback to user 201 prior to dialing, or it may
`be automatically dialed without providing feedback. An
`address book created by user 201 and stored in memory 104
`may include a listing of names, other identification infor-
`mation and one or more telephone numbers associated with
`each of the names and/or other identification information.
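As a minimal sketch of the address book just described, one possible in-memory layout maps each name to one or more labeled numbers and spells a number out for audio feedback before dialing; the structure and entries are hypothetical examples:

    # Illustrative sketch of an address book such as the one described in [0033].
    address_book = {
        "Joe Smith": {"mobile": "2025551212", "work": "2025559876"},
        "Dad": {"home": "4045554132"},
    }

    def number_for(name, label="mobile"):
        # Return the stored number for a name and label, or None if absent.
        return address_book.get(name, {}).get(label)

    def feedback_message(number):
        # Spell the digits out for audio feedback prior to dialing.
        return "Dialing " + " ".join(number)

    print(feedback_message(number_for("Joe Smith")))   # Dialing 2 0 2 5 5 5 1 2 1 2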
`
[0034] If, in addition to headset 101, user 201 also owns
`communication device 202, then the user may wish to utilize
`wireless headset 101 in conjunction with the communication
`device 202. Communication device 202 may be in commu-
`nication with remote user 204 over telecommrmication net-
`
`work 203. User 201 may pair headset 101 with communi-
`cation device 202. In this fashion, headset 101 may be used
`either with the communication device 202 or with the
`
`telecommunication network 203. When making an outgoing
`call using headset 101, user 201 may have to indicate a
`choice to connect with either communication device 202 or
`
`with network 203 if both are in range of the headset. This
`indication may occur at the time of the call. Choice of
`connection may be signaled by depression of button 107 or
`by an audio command via VRM 114. Alternatively, an
`indication of priority may have been previously provided
`thus making one remote device preferred over another when
`both are in range. A list of remote devices in a preferred
`access order may be stored in memory 104 of headset 101.
`Another alternative may involve headset 101 weighing the
`relative strength of the wireless signal between both devices
`and using the device with the strongest signal to make the
`outgoing call.
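The device-selection alternatives in the preceding paragraph, a stored priority list with a fall-back to the strongest wireless signal, could be sketched as follows; the device names and signal values are hypothetical:

    # Illustrative sketch of outgoing-device selection per [0034].
    priority_order = ["communication device 202", "remote device 205", "headset RF transceiver"]

    def select_device(in_range, signal_strength, use_priority=True):
        # in_range: devices currently reachable; signal_strength: device -> dBm.
        if use_priority:
            for device in priority_order:
                if device in in_range:
                    return device
        # Otherwise pick whichever reachable device has the strongest signal.
        return max(in_range, key=lambda d: signal_strength[d])

    in_range = ["remote device 205", "headset RF transceiver"]
    strength = {"remote device 205": -60, "headset RF transceiver": -75}
    print(select_device(in_range, strength))           # remote device 205 (priority order)
    print(select_device(in_range, strength, False))    # remote device 205 (strongest signal)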
`[0035] When receiving an incoming call on either com-
`munication device 202 or telecommunication network 203,
`headset 101 may announce the incoming communication,
`either through a generated voice, a particular sound, or the
`use of other feedback mechanisms such as text on an LCD
`
`display or LED lights. Such an announcement may include
`incoming caller
`information. For example,
`an audio
`announcement of an incoming call may be generated stating,
`“Headset is receiving a call from (202) 555-1212”. If user
`201 is engaged with a call via a communication device (e.g.,
`home phone 202), and an incoming call
`is received on
`
`

`

`
`headset 101 (e.g. from cellular network 203), then one of a
`number of actions may be taken. For example, headset 101
`may notify the user about the call on the headset 101 using
`a VRM 114 generated voice such as, “receiving a call from
`(404) 555-4132”. Headset 101 may alternatively provide a
`simple audio prompt (such as a “beep”), to announce a new
`incoming call similar to a call waiting tone. Headset 101
may also ignore the incoming call, if so configured. If user
`201 is alerted to an incoming call, headset 101 may enable
`the user to place the current call on hold while handling the
`incoming call from the other device by enunciating a verbal
`command or manipulating transducer 107.
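One way the announcement alternatives above (spoken announcement, simple tone, or ignoring the call) could be modeled is sketched below; the policy names are hypothetical configuration values, not part of the disclosure:

    # Illustrative sketch of the incoming-call notification alternatives in [0035].
    def notify_incoming(caller_id, policy="announce"):
        # Return the audio prompt for an incoming call, or None to ignore it.
        if policy == "ignore":
            return None
        if policy == "beep":
            return "*beep*"   # call-waiting style tone
        return "Headset is receiving a call from " + caller_id

    print(notify_incoming("(404) 555-4132"))                  # spoken announcement
    print(notify_incoming("(404) 555-4132", policy="beep"))   # simple audio prompt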
`[0036]
`Incoming call announcements may be utilized
`regardless of whether a current call
`is ongoing. Such
`announcements may be determined by announcing a name
`or other identification information associated with a call
`
`rather than the phone number. For example, the announce-
`ment of an incoming call may be, “headset is receiving a call
from Joe Smith” or “headset is receiving a call from Dad”.
`The identification information may be pulled from standard
`Caller ID information associated with the incoming call by
`processor 102. Identification information may also be deter-
`mined by performing a reverse lookup of the incoming
`telephone number in an address book that could be stored in
`memory 104. For example, if user 201 has an address book
`stored in memory 104 of headset 101,
`the headset may
`analyze the incoming caller information and perform a
`lookup based on the incoming telephone number. Once an
associated name or other identification information is deter-
`
`mined, the identification information can be made part of the
`vocal announcement played out of speaker 106 on headset
`101. A user may then decide whether or not to answer the
`incoming call using an audio command spoken into micro-
`phone 105, or by depressing button 107, or by other known
`input methods.
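The reverse lookup described above could be sketched as a scan of the stored address book for the incoming number, falling back to announcing the number itself when no name is found; the data and names are hypothetical:

    # Illustrative sketch of the reverse lookup and announcement per [0036].
    def reverse_lookup(incoming_number, address_book):
        # Find the stored name whose numbers include the incoming number.
        for name, numbers in address_book.items():
            if incoming_number in numbers.values():
                return name
        return None

    def announce(incoming_number, address_book):
        name = reverse_lookup(incoming_number, address_book)
        return "Headset is receiving a call from " + (name or incoming_number)

    book = {"Dad": {"home": "4045554132"}, "Joe Smith": {"mobile": "2025551212"}}
    print(announce("4045554132", book))    # Headset is receiving a call from Dad
    print(announce("7705550000", book))    # falls back to announcing the number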
`[0037] Headset 101 may also be linkable with additional
`remote devices such as remote device 205. Remote device
`
`in (e.g.,
`205 may have a wireless radio standard built
`Bluetooth® functionality included with the device). Headset
`101 may utilize incoming call announcements for incoming
`calls associated with remote device 205. Headset 101 may
`also utilize one of the priority schemes identified above
`when making outgoing calls in the presence of, for example,
`office phone 202 and device 205. In a situation where both
`remote devices receive incoming calls at the same time,
`headset 101 may permit user 201 to choose which call to
`answer using transducer 107 or an audio command. Alter-
`natively, the headset may utilize a priority scheme similar to
`the schemes described above, including having previously
`identified one device (e.g., communication device 202) as
`having priority over another (e.g. device 205) for incoming
`calls. Alternatively, headset 101 may simply compare the
`relative wireless signal strengths of the connections to both
`devices and choose the strongest signal when deciding
`which incoming call to connect. Remote device 205 may
`also act as a local server to store information and execute
`commands for headset 101.
`
[0038] FIG. 3A is a flow chart illustrating an example
`routine 300a for initiating an outgoing communication (e.g.
`a phone call) from wireless telecommunications headset
`101. The functional blocks displayed in this and other
`flowcharts are intended to suggest an order for completing a
`method. The blocks and their order, however, are not
`intended to provide the exact method for performing the
`
method. Instead, functional blocks may be combined, split, reordered, added and removed.
`[0039]
`In the example shown in FIG. 3A, a prompt is
`received by wireless telecommunications headset 101 from
`user 201 to initiate an outgoing call at operation 301. The
`prompt may be in the form of a transducer 107 manipulation,
`an audio command (via VRM 114), or similar input. Upon
`receipt of the prompt, the RF transceiver 110 is energized by
`the battery 111 at process 305 or alternatively, the power
`level to the RF receiver 110 is increased.
`
`[0040] At operation 307, user 201 utters an audio message
`which is received via microphone 105 and may be stored
`digitally in memory 104. The contents of the audio message
`may initially be unknown to headset 101 but at operation
`308 processor 102 performs speech recognition analysis
`using VRM 114 on the stored audio message, thus achieving
`a level of recognition of what user 201 uttered. A command
`may be recognized, such as “work phone” and/or “Dial 2 0
`2 5 5 5 1 2 1 2,” or “Dial Joe Smith Mobile.” Such speech
`recognition techniques are widely known, and may require
`that user 201 have previously trained headset 101 as to the
`peculiarities of the user’s voice.
`[0041] At decision point 309, a determination is made as
`to whether the recognized command requires a phone num-
`ber lookup, as with “Dial Joe Smith Mobile”. If so, a phone
`number associated with the name in the command is deter-
`
`mined at process 310. This determination may be achieved
`by processor 102 looking up the name in an address book
`stored in memory 104 within headset 101. This lookup may
`look for an exact match, or it may use phonetic approxima-
`tion to find the name which sounds closest to the recognized
`command.
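A minimal sketch of the lookup at operations 309-310 follows: an exact match is tried first, then the closest stored name is chosen; generic string similarity (difflib) is used here only as a stand-in for the phonetic approximation mentioned above, and the entries are hypothetical:

    # Illustrative sketch of the name lookup at operations 309-310.
    import difflib

    address_book = {"Joe Smith": "2025551212", "Joan Smythe": "2025550000"}

    def lookup(recognized_name):
        # Exact match first, then the closest stored name as an approximation.
        if recognized_name in address_book:
            return address_book[recognized_name]
        close = difflib.get_close_matches(recognized_name, address_book, n=1)
        return address_book[close[0]] if close else None

    print(lookup("Joe Smith"))   # exact match -> 2025551212
    print(lookup("Jo Smith"))    # approximate match -> 2025551212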
`
`[0042] At operation 311, the resulting phone number is
`dialed by processor 102 in order that the call be initiated via
`telecommunications system 203, at which point user 201 can
`converse wirelessly with remote user 204. Routine 300a
`ends after user 201 begins his conversation via RF trans-
`ceiver 110.
`
[0043] FIG. 3B is a flow chart illustrating an example routine 300b for initiating an outgoing communication from
`wireless telecommunications headset 101 with optional
`remote devices 202 and/or 205 included. In the example
`shown in FIG. 3B, at operation 301, a prompt is received by
`wireless telecommunications headset 101 from user 201 to
`
`initiate an outgoing call. The prompt may be in the form of
`a transducer manipulation, an audio command via VRM
`114, or similar input. At operation 302, if multiple remote
`devices (e.g. 202 and 205) are within range and appropri-
`ately authorized, headset 101 determines which of the
`devices to use to initiate the call. At decision point 303, a
`determination is made as to whether the call will be made
`
`directly from the RF transceiver 110 of the headset 101 to
`telecommunication system 203 or whether a local commu-
`nication device, such as the devices 202 and 205, will be
`used and selected in step 304 in which case the local
`transceiver 103 will be activated in step 306. This determi-
`nation may be made by user 201 manipulating transducer
`107, by user 201 uttering an audio command to processor
`102 via microphone 105 or by a preconfigured set of rules.
`[0044] An example of a rule may be to access a home
`phone first via local transceiver 103, a work phone second
`via local transceiver 103, and then to the headset third via RF
`transceiver 110 during daytime hours, but always use the
`headset after 7 pm. Other rules are certainly configurable.
`
`

`

`
`Headset 101 may use the remote device 202 having the
strongest signal via local transceiver 103. Another alterna-
`tive is to have the headset 101 select a remote device based
`
`on the phone number being dialed. For instance, certain
`numbers may need to go out over the work phone (e.g.,
`international calls),
`in which case local
`transceiver 103
`would be used in conjunction with the work phone while
`other calls go out over the headset RF transceiver 110 (i.e.
`personal calls). Another alternative is to have user 201 select
`among the available remote devices. This selection may be
`made by audibly prompting user 201 to select a remote
`device 202 and awaiting a selection in the form of a vocal
`or other input or by manipulating transducer 107.
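For illustration, a preconfigured rule set like the example in [0044], preferring local devices during the day, always using the headset after 7 pm, and routing international numbers over the work phone, could be sketched as follows; the device names, the "011" prefix test, and the hour threshold are hypothetical:

    # Illustrative sketch of rule-based outgoing-call routing per [0044].
    def choose_route(number, hour, available):
        # available: set of reachable local devices for this call.
        if number.startswith("011") and "work phone" in available:
            return "work phone"                       # e.g. international calls
        if hour >= 19:
            return "headset RF transceiver"           # always use the headset after 7 pm
        for device in ("home phone", "work phone"):   # daytime preference order
            if device in available:
                return device
        return "headset RF transceiver"

    print(choose_route("4045554132", hour=10, available={"home phone"}))    # home phone
    print(choose_route("01144201234", hour=10, available={"work phone"}))   # work phone
    print(choose_route("4045554132", hour=21, available={"home phone"}))    # headset RF transceiver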
`[0045] At operation 307, user 201 utters an audio message
`which is received via microphone 105 which may be stored
`digitally in memory 104, via VRM 114. The contents of the
`audio message may initially be unknown to headset 101 but
`at operation 308, processor 102 performs speech recognition
`analysis on the audio message, achieving a level of recog-
nition of what user 201 uttered.
