`(12) Patent Application Publication (10) Pub. No.: US 2006/0167687 A1
Kates
(43) Pub. Date: Jul. 27, 2006
`
`
`(54) MANAGEMENT AND ASSISTANCE SYSTEM
`FOR THE DEAF
`
`(76) Inventor: Lawrence Kates, Corona Del Mar, CA
`(US)
`
`Correspondence Address:
`KNOBBE MARTENS OLSON & BEAR LLP
2040 MAIN STREET
`FOURTEENTH FLOOR
`IRVINE, CA 92614 (US)
(21) Appl. No.: 11/041,166

(22) Filed: Jan. 21, 2005
`
`Publication Classification
`
(51) Int. Cl.
     G10L 15/26 (2006.01)
(52) U.S. Cl. .............................................................. 704/235
`
(57) ABSTRACT

A computer-aided communication and assistance system that uses signal processing and other algorithms in a processor in wireless communication with a microphone system to aid a deaf person. An instrumented communication module receives information from one or more microphones and provides textual and, optionally, stimulatory information to the deaf person. In one embodiment, a microphone is provided in a piece of jewelry or clothing. In one embodiment, a wireless (or wired) earpiece is provided to provide microphones and vibration stimulators.
`
`
`
[Representative drawing: flowchart showing "Receive Sound From Own Microphone"; "Speech Analysis: Speech to text; Volume analysis (relative to ambient)"; "Display Speech as Text on Display; Indicate Volume relative to ambient."]
`
`
`
[Drawing sheets 1 through 10, containing FIGS. 1-7B as listed in the Brief Description of the Drawings; only fragments of the figure text survive extraction (e.g., "FIG. 1," the "Processor" block of FIG. 2, the FIG. 4 flowchart blocks, and a "Wireless interface" block on sheet 10), so the figures are not reproduced here.]
`
`
`
`MANAGEMENT AND ASSISTANCE SYSTEM FOR
`THE DEAF
`
`BACKGROUND OF THE INVENTION
`0001) 1. Field of the Invention
`0002 The present invention relates to a system for com
`puter-aided assistance and life management system for deaf
`people.
`0003 2. Description of the Related Art
`0004 People without the sense of hearing live a difficult
`and dangerous existence. They do not hear warning Sounds
`like sirens. They do not hear information Sounds like a
`doorbell or the beep of a microwave oven. Worst of all, they
`do not hear the speech of other people. This makes com
`munication with other people very difficult and frustrating.
`
`SUMMARY

[0005] These and other problems are solved by a computer-aided communication and assistance system that uses a computer or other processor in wireless communication with a microphone system to aid the deaf person. An instrumented communication module receives information from one or more microphones and provides textual and, optionally, stimulatory information to the deaf person. In one embodiment, a microphone is provided in a piece of jewelry or clothing. In one embodiment, a wireless (or wired) earpiece is provided to provide microphones and vibration stimulators.

[0006] In one embodiment, the communication and assistance system communicates with microphones located in and about a house. In one embodiment, the communication and assistance system communicates with microphones located at doorways. In one embodiment, the communication and assistance system relays information from the microphones to a computer monitoring system. In one embodiment, the assistance system provides voice-recognition (e.g., recognition of the person speaking) processing. In one embodiment, the assistance system provides language translation processing. In one embodiment, the assistance system provides speech-recognition processing.

[0007] In one embodiment, the communication and assistance system includes a computer system provided to a first wireless communication transceiver and a communication module provided to a second wireless communication transceiver. The communication module has an identification code and is configured to communicate with the computer system using two-way handshaking communication such that the computer system can send instructions to the communication module and receive acknowledgement of the instructions from the communication module. The communication module can send data to the computer system and receive acknowledgement from the computer system according to the identification code. The computer system is configured to send instructions to the communication module and to receive data from the communication module related to one or more actions of the user wearing or carrying the communication module. In one embodiment, the computer system is configured to keep records of at least a portion of the user's actions so that the system can learn to function in a more precise fashion (e.g., the system remembers voices and, when the user identifies a person to the system, the system can then correlate the person's voice with the person's name).
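
By way of illustration only, and not by way of limitation, the two-way handshaking described above may be sketched as follows. The frame format, the retry policy, and the transport object's send/receive methods are hypothetical placeholders; the disclosure specifies only that an identification code is used and that acknowledgements are exchanged.

    import json
    import time

    # Hypothetical framed message: the disclosure specifies an identification code and
    # acknowledgements, but no wire format. This sketch assumes simple JSON frames.
    def make_frame(module_id, seq, kind, payload):
        return json.dumps({"id": module_id, "seq": seq, "kind": kind, "payload": payload})

    def send_with_ack(transport, module_id, seq, kind, payload, retries=3, timeout_s=0.5):
        """Send a frame and wait for an acknowledgement carrying the same id and seq."""
        frame = make_frame(module_id, seq, kind, payload)
        for _ in range(retries):
            transport.send(frame)                    # e.g., a write to the RF transceiver (assumed API)
            deadline = time.time() + timeout_s
            while time.time() < deadline:
                reply = transport.receive(timeout_s) # assumed receive-with-timeout; None if nothing arrives
                if reply is None:
                    break
                msg = json.loads(reply)
                if msg.get("kind") == "ack" and msg.get("id") == module_id and msg.get("seq") == seq:
                    return True                      # delivery verified by the acknowledgement
        return False                                 # caller may retry later or raise an alert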

[0008] In one embodiment, the communication module includes at least one of an acoustic input device, a vibrator device, an infrared receiver, an infrared transmitter, a microphone, a display device, etc.

[0009] In one embodiment, the communication module includes an acoustic input device. In one embodiment, the communication module includes an acoustic output device. In one embodiment, the communication module includes a vibrator device. In one embodiment, the communication module includes a keypad input device. In one embodiment, the communication module includes an infrared receiver. In one embodiment, the communication module includes an infrared transmitter.

[0010] In one embodiment, the system includes one or more repeaters.

[0011] In one embodiment, the communication device includes a cellular telephone. In one embodiment, the communication device includes a GPS receiver. In one embodiment, the communication device is configured to obtain voice or other sound information from one or more location microphones when the microphone reader is within range to read information from the one or more location microphones, and the communication device is configured to obtain location from the GPS receiver when location information is available from the GPS receiver.

[0012] In one embodiment, the system can be augmented by acoustic sensors provided to the vehicle (e.g., external to the vehicle or attached to the windows of the vehicle) and/or a cockpit display in the vehicle. In one embodiment, the cockpit display includes a warning light. In one embodiment, the cockpit display includes a flashing light. In one embodiment, the cockpit display includes a text display that provides text or picture information to the driver. In one embodiment, the cockpit display indicates the type of sound (e.g., siren, screeching brakes, horn, impact or crash sounds, backup beeper sounds, warning shouts, etc.). In one embodiment, the cockpit display indicates the direction of the sound. In one embodiment, the cockpit display indicates the direction of the sound source. In one embodiment, the cockpit display indicates the estimated distance to the sound. In one embodiment, the cockpit display indicates the volume of the sound. In one embodiment, the cockpit display indicates the duration of the sound.
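
By way of illustration only, the kind of information such a cockpit display might receive can be represented as a simple event record carrying the sound type, direction, estimated distance, volume, and duration. The field names and the text format below are illustrative assumptions and are not specified by the disclosure.

    from dataclasses import dataclass

    @dataclass
    class CockpitSoundEvent:
        # Fields mirror the attributes the cockpit display may indicate; the names are illustrative.
        sound_type: str       # e.g., "siren", "horn", "screeching brakes"
        direction_deg: float  # bearing relative to the vehicle heading, 0 = straight ahead
        distance_m: float     # estimated distance to the sound source
        volume_db: float      # estimated sound level
        duration_s: float     # how long the sound has persisted

    def format_warning(ev: CockpitSoundEvent) -> str:
        """Render one event as a short line for the cockpit text display."""
        side = "ahead" if abs(ev.direction_deg) < 30 else ("right" if ev.direction_deg > 0 else "left")
        return (f"{ev.sound_type.upper()} {side}, about {ev.distance_m:.0f} m, "
                f"{ev.volume_db:.0f} dB, {ev.duration_s:.1f} s")

    # Example: a siren detected to the right of the vehicle.
    print(format_warning(CockpitSoundEvent("siren", 45.0, 120.0, 92.0, 3.5)))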
`
`BRIEF DESCRIPTION OF THE DRAWINGS

[0013] FIG. 1 shows a user wearing and carrying elements of a management and assistance system for the deaf.

[0014] FIG. 2 is a block diagram of various elements of the management and assistance system for the deaf.

[0015] FIG. 3 is a flowchart showing sound processing for external sounds.

[0016] FIG. 4 is a flowchart showing sound processing for speech generated by the user.

[0017] FIG. 5 shows the elements of a management and assistance system for the deaf in connection with a home automation system.
`

[0018] FIG. 6A is a diagram of a handheld device that can be used by the user or by a third party in connection with the assistance system for the deaf.

[0019] FIG. 6B is a block diagram of the handheld device shown in FIG. 6A.

[0020] FIG. 7A shows a vehicle sensor and warning system using forward and aft sensors and a cockpit display for helping deaf drivers.

[0021] FIG. 7B shows a vehicle sensor and warning system using four quadrant sensors and a cockpit display for helping deaf drivers.
`
`DETAILED DESCRIPTION

[0022] FIG. 1 shows a user 101 wearing elements of a management and assistance system for the deaf. In FIG. 1, the user 101 is shown wearing a communication module 102 and a headset 160. A handheld module 112 can be used by the user 101 or handed to a third party to aid in communication with the user 101. In one embodiment, the handheld module 112 is used in lieu of the communication module 102 and provides the functions of the communication module 102. In one embodiment, the handheld module is complementary to the communication module 102 and used in connection with the communication module 102. In order to simplify the explanation, the disclosure that follows refers to the communication module 102, with the understanding that the communication module 102 can be built as a wearable device as shown in FIG. 1 or as a device that can be carried (e.g., handheld, carried in a pocket, etc.).

[0023] In one embodiment, the handheld module 112 can be used by a deaf or hearing-impaired parent to monitor a child or children. The handheld module 112 receives sounds from the child or the vicinity of the child and provides information to the communication module 102. The handheld module 112 can be placed in an area near the child or children. Although referred to herein as a handheld device, in one embodiment, the handheld module 112 can be configured to be worn by a child as a wearable device. In one embodiment, the handheld module 112 is configured to identify sounds corresponding to a child in trouble (e.g., crying, yelling, breaking glass, etc.) and warn the parent. In one embodiment, the module 112 includes a location sensor and is configured to identify a location of the child and warn the parent when the child has moved. In one embodiment, the module 112 is configured to warn the parent when the child has moved into a dangerous area (e.g., a forbidden room, a pool area, near a hot stove, etc.). In one embodiment, the module 112 can be queried by the communication module 102 so that the parent can "listen in" on the child by reading speech to text provided by the communication module 102.

[0024] One of ordinary skill in the art will recognize that although the preceding paragraph referred to monitoring a child, the handheld module 112 can also be used by a deaf or hearing-impaired person to monitor a being needing care and attention such as, for example, a spouse, a pet, an elderly parent, a disabled person, etc.

[0025] One or more microphones in the headset 160 provide acoustic information to the communication module 102. The communication module 102 uses the information from the microphones to ascertain the character of acoustic sounds in the environment, sounds made by the user 101, and, optionally, the direction of various sounds. In one embodiment, the communication module 102 uses the headset 160 to provide vibrator and/or optical alerts to the user 101. The user 101 can use a microphone in the headset 160 to send voice commands to the communication module 102 or 112. The user 101 can also use buttons on a keypad on the communication module 102 or 112 to control the operation of the system and input commands into the system.

[0026] FIG. 2 shows block diagrams of the headset 160 and a communication module 161. The communication module 161 is representative of the modules 102 and 112 shown in FIG. 1. In the headset 160, a first microphone 202, a vibrator 203, a second microphone 204, and a communication system 205 are provided to a processor 201. The communication system 205 can use Radio Frequency (RF) communication, optical (e.g., infrared) communication, direct connection, etc. In one embodiment, the first microphone 202 is configured to pick up sounds in the environment (e.g., speech of others, sirens, horns, doorbells, etc.). In one embodiment, the second microphone 204 is configured to pick up the speech of the user 101. In one embodiment, the first and second microphones 202, 204 are configured to provide direction information so that the direction of a sound source can be ascertained.
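
By way of illustration only, one common way to derive direction information from two spaced microphones is to estimate the time difference of arrival between the channels and convert it to a bearing. The disclosure does not specify a method; the cross-correlation approach, sampling rate, and microphone spacing below are illustrative assumptions.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s, approximate at room temperature

    def bearing_from_two_mics(left, right, fs=16000, mic_spacing_m=0.15):
        """Estimate a bearing (degrees, 0 = broadside) from two time-aligned mono signals.

        Uses the lag of the cross-correlation peak as the time difference of arrival,
        then applies the far-field relation sin(theta) = c * tau / d.
        """
        corr = np.correlate(left, right, mode="full")
        lag_samples = np.argmax(corr) - (len(right) - 1)
        tau = lag_samples / fs                             # seconds; the sign indicates left vs. right
        ratio = np.clip(SPEED_OF_SOUND * tau / mic_spacing_m, -1.0, 1.0)
        return float(np.degrees(np.arcsin(ratio)))

    # Example with a synthetic noise burst delayed by 3 samples on one channel.
    rng = np.random.default_rng(0)
    sig = rng.standard_normal(4096)
    delayed = np.roll(sig, 3)
    print(round(bearing_from_two_mics(sig, delayed), 1))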

[0027] In the communication module 161, a microphone 251, a first communication system 256, a keypad 253, a display 254, a vibrator 255, and a second communication system 252 are provided to a processor 250.

[0028] In one embodiment, the processor 250 provides processing of the sounds received by the microphones 202, 204, and/or 251. In one embodiment, the acoustic signal processing algorithms are used to distinguish danger sounds (e.g., sirens) from other sounds (e.g., the wind). In one embodiment, the acoustic signal processing algorithms are used to distinguish danger sounds (e.g., sirens) from indicator sounds (e.g., a doorbell). In one embodiment, the acoustic signal processing algorithms are used in speech recognition to convert the received sounds into text on the display 254. In one embodiment, a loudspeaker 257 is provided to the module 161. In one embodiment, the user 101 can enter text using the keypad 253 and instruct the processor 250 to convert the text to speech.
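
By way of illustration only, one simple cue for separating a tonal, sweeping siren from broadband noise such as wind is spectral flatness, which is near 1.0 for noise-like frames and near 0 for tonal frames. This is merely one possible feature; the frame size and threshold below are arbitrary illustrative choices, not values from the disclosure.

    import numpy as np

    def spectral_flatness(frame):
        """Geometric mean / arithmetic mean of the power spectrum (about 1.0 for white noise, near 0 for a tone)."""
        spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2 + 1e-12
        return float(np.exp(np.mean(np.log(spectrum))) / np.mean(spectrum))

    def classify_frame(frame, flatness_threshold=0.2):
        """Very rough split: tonal frames (possible siren or alarm) vs. broadband frames (e.g., wind)."""
        if spectral_flatness(frame) < flatness_threshold:
            return "tonal (possible siren or alarm)"
        return "broadband (e.g., wind)"

    # Example: a swept tone vs. a noise frame, 1024 samples at 16 kHz.
    fs, n = 16000, 1024
    t = np.arange(n) / fs
    siren_like = np.sin(2 * np.pi * (700 + 300 * t / t[-1]) * t)
    wind_like = np.random.default_rng(1).standard_normal(n)
    print(classify_frame(siren_like))  # expected: tonal
    print(classify_frame(wind_like))   # expected: broadband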

[0029] FIG. 3 is a flowchart showing one embodiment of processing of sounds from the environment (e.g., sounds not produced by the user 101). In a block 301, the system receives an external sound. In a block 302, an initial analysis of the sound is performed. The initial analysis is passed to a decision block 303. The decision block 303 determines if the external sound corresponds to voice sounds (e.g., talking, yelling, etc.). If the sound is a voice sound, then control is passed to a speech analysis block 304; otherwise, control passes to a decision block 307. The speech analysis block 304 converts the sounds into text. Where the speaker's voice is recognized, the block 304 also identifies the speaker. If language translation has been requested, the block 304 also translates the text into a desired language.

[0030] The results from the block 304 are provided to a decision block 305. The decision block 305 determines if the speech corresponds to a warning (e.g., "watch out," "stop," etc.). If the sound is a warning sound, then control is passed to a classification block 308; otherwise, control passes to a
display block 306. The display block 306 displays the text of the speech on the display 254. In one embodiment, the display block 306 uses the vibrator to alert the user 101 to the presence of text.

[0031] The decision block 307 determines if the external sound corresponds to warning sounds (e.g., horns, sirens, etc.). If the sound is a warning sound, then control is passed to the classification block 308; otherwise, control passes to a decision block 310. The classification block 308 classifies the urgency or potential level of danger indicated by the warning. Data from the classification block 308 is provided to a warning block 309. The warning block 309 uses the vibrators 203, 255 and the display 254 to alert and warn the user 101. In one embodiment, the warning block 309 also uses the display to give the user an indication of the direction of the warning sound. In one embodiment, the strength of the vibrations produced by the vibrators 203, 255 corresponds to the relative level of perceived danger.

[0032] The decision block 310 determines if the external sound corresponds to desired sounds (e.g., a doorbell, a beeper on a microwave oven, a ringing telephone, etc.). If the sound is a desired sound, then control is passed to a message block 311; otherwise, the sound is ignored and control returns to the block 301. The message block 311 classifies the type of sound and issues an appropriate message (e.g., "doorbell ringing," etc.).
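
The branching of FIG. 3, as described in paragraphs [0029]-[0032], can be summarized in code. The helper functions named below (initial analysis, speech-to-text, warning classification, and the alert/display outputs) are placeholders for processing that the disclosure leaves unspecified; only the control flow mirrors the flowchart.

    def process_external_sound(sound, analyzers, outputs):
        """Dispatch one external sound according to the FIG. 3 flow (blocks 301-311).

        `analyzers` and `outputs` are caller-supplied objects standing in for the
        unspecified signal-processing functions and the display/vibration hardware.
        """
        features = analyzers.initial_analysis(sound)                      # block 302

        if analyzers.is_voice(features):                                  # decision block 303
            text, speaker = analyzers.speech_to_text(sound)               # block 304 (may also translate)
            if analyzers.text_is_warning(text):                           # decision block 305
                level = analyzers.classify_danger(features)               # block 308
                outputs.warn(level, direction=features.get("direction"))  # block 309: vibrate and display
            else:
                outputs.display_text(text, speaker=speaker)               # block 306
        elif analyzers.is_warning_sound(features):                        # decision block 307
            level = analyzers.classify_danger(features)                   # block 308
            outputs.warn(level, direction=features.get("direction"))      # block 309
        elif analyzers.is_desired_sound(features):                        # decision block 310
            outputs.message(analyzers.describe(features))                 # block 311, e.g., "doorbell ringing"
        # otherwise the sound is ignored and the system waits for the next sound (block 301)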

[0033] FIG. 4 is a block diagram showing the processing for speech or sounds made by the user 101. In a block 401, user speech sounds from the microphone on the headset 160 are received. The sounds are passed to a speech analysis block 402. The block 402 provides speech-to-text processing. In one embodiment, the block 402 also compares the volume of the speech to the ambient sounds. The results from the block 402 are provided to a display block 403. The display block 403 displays the speech as text so that the user 101 can verify that his/her speech was intelligible and correctly formed. In one embodiment, the display block 403 also indicates the user's speech level as compared to the ambient level so that the user will know if he/she is speaking too loudly or too softly.
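
By way of illustration only, one way to compare the user's speech level to the ambient level is to track the RMS level (in dB) of speech frames and of ambient frames and report the difference. The margin values below are illustrative assumptions, not values from the disclosure.

    import numpy as np

    def rms_db(frame):
        """Root-mean-square level of one audio frame, in dB relative to full scale."""
        return 20.0 * np.log10(np.sqrt(np.mean(np.square(frame))) + 1e-12)

    def speech_vs_ambient(speech_frames, ambient_frames, margin_db=10.0):
        """Compare the average speech level to the average ambient level and suggest an indication."""
        speech_level = np.mean([rms_db(f) for f in speech_frames])
        ambient_level = np.mean([rms_db(f) for f in ambient_frames])
        delta = speech_level - ambient_level
        if delta < margin_db:
            hint = "speaking too softly"
        elif delta > margin_db + 20.0:
            hint = "speaking too loudly"
        else:
            hint = "speech level OK"
        return delta, hint

    # Example with synthetic frames: quiet noise as ambient, a louder tone as speech.
    rng = np.random.default_rng(2)
    ambient = [0.01 * rng.standard_normal(1024) for _ in range(5)]
    speech = [0.2 * np.sin(2 * np.pi * 200 * np.arange(1024) / 16000) for _ in range(5)]
    print(speech_vs_ambient(speech, ambient))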

[0034] In one embodiment, the speech analysis block 402 and the display block 403 provide displays to help the user 101 formulate speech sounds properly. For example, most human languages are composed of a relatively small number of sounds (e.g., the letters of the alphabet and the various ways of saying those letters). In one embodiment, the user can place the system 160 in a mode where it will display such formants for the user so that the user can practice forming speech sounds in order to improve his/her speech.

[0035] In one embodiment, the user 101 can carry an extra communication module 102 and provide the extra module 160 to a third person for conversation. The third person can speak into the second communication module 102 and see his/her speech converted to text on the display. The text on the third person's display is relayed by the second communication module 102 to a first communication module 112 held or worn by the user 101. In this way, both participants in the conversation can verify that the speech-to-text and text-to-speech operations are translating speech and text as desired.
`0.036
`Various elements of a communication and assis
`tance system 100 for helping a deaf person 101 can be
`
`integrated into a home or building automation system 500 as
`shown in FIG. 5. The elements shown in FIG. 5 work
`together with the elements shown in FIG. 1 to provide
`additional functionality and capability. For purposes of
`explanation, and not by way of limitation, the system 500 is
`described herein as a system to be used by a person who is
`deaf. One of ordinary skill in the art will recognize that
`various aspects of the system 500 can also be used for
`persons that are partially deaf, or otherwise impaired. The
`system 500 includes a computer system 503 and/or com
`munication module 502 to control the system 500 and, to
`collect data, and to provide data for the caretaker and/or the
`user 101. The system typically includes a wireless commu
`nication module 112 and a wireless base unit 504. The
`communication module 112 communicates with the user
`101.

[0037] The microphones placed about a house or structure 550 provide an identification code to identify location, objects, environment, etc. The communication module 504 reads the microphones and relays the information from the microphones to the computer 503 and/or to the user 101.

[0038] The system 500 can also include one or more of the following optional devices: one or more video cameras/monitors 505, one or more loudspeakers 507, one or more motion sensors 506, etc. The system 500 can further include one or more of the following optional devices: a remote control/display 112 for allowing the user 101 to interact with the system 503, ambient condition sensors (e.g., smoke, gas, fire, etc.), etc. In one embodiment, the ambient condition sensors are wireless sensors that communicate wirelessly with the computer system 503 and/or communication module 112.

[0039] In one embodiment, the system 500 can be used as a computerized system for informing the user 101 of sounds or events around the house. Textual instructions or information can be provided through the device 160.

[0040] In one embodiment, a modem 530 is provided for making connections with the telephone system, to allow the system 500 to communicate with a caretaker and/or the user 101 through cellular telephone, text messaging, pager, etc. A network connection 508 (e.g., an Internet connection, local area network connection, wide area network connection, etc.) is provided to allow the caretaker and/or the user 101 to communicate with the system 500 and to allow the system 500 to receive updated software, updated status information, etc. Thus, for example, in one embodiment, the user 101 can contact the system 503 to obtain map information, call for assistance, etc.

[0041] In one embodiment, the system 500 provides indications (e.g., green light, text messages, etc.) when the user 101 is in a safe environment and/or warning indications (e.g., red lights, warning messages, vibration, etc.) when the user is in an unsafe environment (e.g., unknown person at the front door, motion sensor activated, smoke alarm activated, home security system activated, outside motion sensor activated, etc.). In one embodiment, the user 101 can select the conditions that trigger sounds versus vibrations. Thus, for example, an experienced user may choose to use vibration from the communication module 112 for certain types of sounds and text messages for other types of sounds.
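
By way of illustration only, the user-selectable mapping of conditions to alert types might be kept as a small configuration table. The condition names, alert channel names, and default choices below are illustrative assumptions.

    # Hypothetical per-user alert preferences: each detected condition maps to the way the
    # user wants to be notified (vibration, text message, flashing light, or several of these).
    DEFAULT_ALERT_PREFERENCES = {
        "smoke_alarm":    ["vibration", "text", "flashing_light"],
        "doorbell":       ["text"],
        "unknown_person": ["vibration", "text"],
        "motion_sensor":  ["text"],
        "child_crying":   ["vibration", "text"],
    }

    def alerts_for(condition, preferences=DEFAULT_ALERT_PREFERENCES):
        """Return the alert channels the user selected for a condition (default: text only)."""
        return preferences.get(condition, ["text"])

    # Example: an experienced user remaps doorbell events to vibration only.
    prefs = dict(DEFAULT_ALERT_PREFERENCES, doorbell=["vibration"])
    print(alerts_for("doorbell", prefs))     # ['vibration']
    print(alerts_for("smoke_alarm", prefs))  # ['vibration', 'text', 'flashing_light']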

[0042] In one embodiment, the system 500 uses the sensors 529 to detect fire or smoke. In one embodiment, the
system 500 receives alarm data from a home alarm system. In one embodiment, a wireless microphone 509 is used to detect a fire alarm. When the system 500 detects a fire or smoke alarm, the system 500 can instruct the user to leave and notify a family member or caretaker. The caretaker can be notified by using the loudspeakers 507, by telephone, pager, and/or text messaging using the modem 530 to connect with the telephone system, and/or by using the network connection 508 (e.g., email, instant messaging, etc.). The modem 530 is configured to place a telephone call and then communicate with the user using data (e.g., in the case of text messaging) and/or synthesized voice. The modem 530 can also be used by the caretaker and/or the user 101 to contact the computer system 503 and/or control the system 500 using voice-recognition instructions and/or data or keyboard inputs from the cellular telephone. In one embodiment, the communication device 160 is configured with a cellular telephone interface so that the user 101 can communicate with the system 503 via the display and keyboard on the communication device 160.

[0043] The user's response to instructions is monitored by the system 500 by using data from the communication module 102 and/or by video processing from one or more video cameras 506. Thus, for example, if the user 101 does not respond to a fire or smoke alarm (e.g., because the user is not wearing a vibrator and is asleep and does not see a flashing light), then the system 500 can notify a neighbor, family member, or other caretaker. In addition, the user's response to instructions can be determined by the caretaker and/or the user 101 in real time. In one embodiment, a caretaker or instructor works with the user 101 and the system 500 to get the user accustomed to the system.

[0044] The communication module 102 is configured to be carried and/or to be worn on the wrist, belt, chest, etc. The communication module 102 includes one or more sound sensing devices (e.g., microphones), a vibration device, and a communication device (e.g., a first RF transceiver). The sound sensing device is configured to sense sound waves (sonic and/or ultrasonic) such as, for example, a microphone, a transducer, etc. For convenience, and without limitation, the sound sensing device is referred to herein as a microphone with the understanding that other acoustic transducers can be used as well. For convenience, and without limitation, the sound producing device is referred to herein as a loudspeaker with the understanding that the sound producing device is configured to produce sound waves (sonic and/or ultrasonic) such as, for example, a loudspeaker, a transducer, a buzzer, etc. The communication module 102 can also include one or more lights (not shown) for providing visual indications to the user.

[0045] The microphones are used to pick up sound waves such as, for example, sounds produced by the user 101, sounds produced by other people, and/or acoustic waves produced by an acoustic location device (sonic or ultrasonic), etc. In one embodiment, the microphone 202 is configured to pick up external sounds (e.g., sounds not made by the user) and the microphone 204 is configured to pick up sounds made by the user. In one embodiment, the system 100 includes voice-recognition processing to help the user 101 know who is in the room, at the door, etc., and what the person is saying. The processor 201 processes the sounds picked up by the microphones and, if needed, sends processed data to the computer system 503 and/or communication module 102 for further processing.

[0046] The vibrator can be used in a manner similar to a vibrator on a cellular telephone to alert the user 101 without disturbing other people in the area. The vibrator can also be used to alert the user 101 to abnormal or potentially dangerous conditions or to the presence of text messages on the communication device 160. Deaf people tend to rely more on their sense of touch than people with good hearing. Thus, in one embodiment, the vibrator can be configured to provide different types of vibrations (e.g., different frequency, different intensity, different patterns, etc.) to send information to the user 101.
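
By way of illustration only, such vibration coding could be realized by mapping each alert category to a distinct pulse pattern (an intensity plus alternating on/off durations). The patterns and the vibrator interface below are illustrative assumptions rather than anything specified in the disclosure.

    import time

    # Hypothetical vibration vocabulary: each alert category gets a distinct
    # (intensity, [on_s, off_s, on_s, ...]) pattern so the wearer can tell alerts apart by feel.
    VIBRATION_PATTERNS = {
        "danger":   (1.0, [0.6, 0.2, 0.6, 0.2, 0.6]),  # long, strong, repeated pulses
        "warning":  (0.7, [0.3, 0.3, 0.3]),            # medium pulses
        "message":  (0.4, [0.15, 0.15, 0.15]),         # short, light pulses
        "doorbell": (0.4, [0.1, 0.1, 0.1, 0.1, 0.4]),  # distinctive rhythm
    }

    def play_pattern(vibrator, category):
        """Drive a vibrator object (assumed to expose set_intensity/on/off) with the chosen pattern."""
        intensity, steps = VIBRATION_PATTERNS.get(category, VIBRATION_PATTERNS["message"])
        vibrator.set_intensity(intensity)
        for i, duration in enumerate(steps):
            (vibrator.on if i % 2 == 0 else vibrator.off)()  # even steps are "on", odd steps are "off"
            time.sleep(duration)
        vibrator.off()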

[0047] The first RF transceiver 205 communicates with the communication unit 160. The communication unit 160 can communicate with the system 500 either directly or through the repeaters. In one embodiment, the RF transceiver 205 provides two-way communications such that the communication module 102 can send information to the computer system 503 and/or communication module 102 and receive instructions from the computer system 503 and/or communication module 102. In one embodiment, the computer system 503 and/or communication module 102 and the first RF transceiver 302 communicate using a handshake protocol to verify that data is received.
`0.048. The user 101 can use the system 100 to “listen” to
`various microphones 509 around the house and thereby
`obtain information about the users surroundings. For
`example, in one embodiment, microphones are provided
`near windows, doors, in children's play areas, etc. In one
`embodiment, the communication module 102 includes one
`or more location and tracking systems, such as, for example,
`an IR system, a GPS location system, an Inertial Motion
`Unit (IMU) and/or radio frequency systems. The tracking
`systems can be used alone or in combination to ascertain the
`location of the user 101 and to help the user 101 hear sounds
`in the areas about the structure 550. Thus, for example, a
`child's cry in a different room can be forwarded by the
`system 500 to the user 101. Whereas, a child’s cry in a room
`occupied by the user 101 does not need to be relayed
`because it will be picked up by the headset 160.

[0049] In one embodiment, the microphone 204 is used to allow the user to send voice commands to the system 500.

[0050] The communication module 102 sends low-battery warnings to the computer system 503 and/or communication module 102 to alert the caretaker and/or the user 101 that the communication module 102 needs fresh batteries.

[0051] The Global Positioning System (GPS) is accurate but often does not work well indoors, and sometimes does not have enough vertical accuracy to distinguish between floors of a building. GPS receivers also require a certain amount of signal processing, and such processing consumes power. In a limited-power device such as the communication module 102, the power consumed by a GPS system can reduce battery life. However, GPS has the advantage of being able to operate over a large area and is thus particularly useful when locating a user that has escaped a confined area or is out of the range of other locating systems.

[0052] GPS tends to work well outdoors, but poorly inside buildings. Thus, in one embodiment, the system 100 uses GPS in outdoor situations where microphones are unavailable, and microphones indoors where GPS is unavailable or unreliable. Thus, using the system 100, the position of the user 101 in a building can be ascertained.

[0053] In one embodiment, the GPS system 302 operates in a standby mode and activates at regular intervals or when instructed to activate. The GPS system can be instructed to activate by the computer 503 and/or by the user 101 or the communication module. When activated, the GPS system obtains a position fix on the user 101 (if GPS satellite signals are available) and updates the IMU. In one embodiment, a GPS system is also provided to the computer system 503 and/or communication module 102. The computer system uses data from its GPS system to send location and/or timing data to the GPS system in the communication module 102, allowing the GPS system 302 to warm start faster, obtain a fix more quickly, and therefore use less power.
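
By way of illustration only, the standby/interval activation described above might be structured as follows. The GPS and IMU interfaces (wake, try_fix, sleep, update_position) and the form of the aiding data are hypothetical placeholders, since the disclosure does not define them.

    import time

    def gps_duty_cycle(gps, imu, interval_s=60.0, aiding_data=None, cycles=3):
        """Wake the GPS at regular intervals, take a fix if satellites are visible, and update the IMU.

        `aiding_data` stands in for the location/timing data the computer system may send
        to allow a faster warm start; its format is an assumption.
        """
        for _ in range(cycles):
            gps.wake(aiding=aiding_data)       # a warm start is faster when aiding data is supplied
            fix = gps.try_fix(timeout_s=10.0)  # returns None when no satellite signals are available
            if fix is not None:
                imu.update_position(fix)       # correct IMU drift with the absolute position
            gps.sleep()                        # back to standby to conserve battery
            time.sleep(interval_s)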

[0054] In one embodiment, location system units are placed about the house or building 550 to locate movement and location of the user 101. In one embodiment, the location system units send infrared light, acoustic waves, and/or electromagnetic waves to one or more sensors on the communication module 102 in order to conserve power in the communication module 102. In one embodiment, the communication module 102 sends infrared light, acoustic waves, and/or electromagnetic waves to the location system units in order to conserve power in the units. In one embodiment, the communication module 102 sends inaudible sounds (e.g., ultrasonic sounds) to the wireless microphones 509 to locate the user 101.
`0.055
`For example, location system units placed near
`doorways or in hallways can be used to determine when the
`user 101 moves from one room to another. Even if the user
`cannot be exactly located within the room (e.g., due to blind
`spots), a location system unit placed to sense the movement
`of the user though the doorway allows the system 500 to
`know which room the user is in by watching the user 101
`move from room to room.
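
The doorway-based tracking just described amounts to maintaining the user's current room and updating it whenever a doorway sensor fires. By way of illustration only, the room and doorway names below are invented for the example.

    # Hypothetical floor plan: each doorway sensor connects exactly two rooms.
    DOORWAYS = {
        "kitchen-hall":    ("kitchen", "hallway"),
        "hall-bedroom":    ("hallway", "bedroom"),
        "hall-livingroom": ("hallway", "living room"),
    }

    def update_room(current_room, doorway_id, doorways=DOORWAYS):
        """When a doorway sensor fires, the user has moved to the room on the other side of it."""
        room_a, room_b = doorways[doorway_id]
        if current_room == room_a:
            return room_b
        if current_room == room_b:
            return room_a
        return current_room  # sensor not adjacent to the known room; ignore the event

    # Example: the user starts in the kitchen and walks to the bedroom.
    room = "kitchen"
    for event in ["kitchen-hall", "hall-bedroom"]:
        room = update_room(room, event)
    print(room)  # bedroom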
`0056.
`In one embodiment, each location transmitter
`(whether in the communication module 102 or the location
`system units) sends a coded pattern of pulses to allow the
`transmitter to be identified. In one embodiment, in order to
`conserve power, the location receiver (whether in the com
`munication module 102 or the location system units 118)
`notifies the computer system