Case 6:21-cv-00984-ADA Document 55-6 Filed 05/25/22 Page 1 of 45

EXHIBIT 6
Case 6:21-cv-00984-ADA Document 55-6 Filed 05/25/22 Page 2 of 45

US008467543B2

(12) United States Patent
     Burnett et al.

(10) Patent No.: US 8,467,543 B2
(45) Date of Patent: *Jun. 18, 2013

(54) MICROPHONE AND VOICE ACTIVITY DETECTION (VAD) CONFIGURATIONS FOR USE WITH COMMUNICATION SYSTEMS

(75) Inventors: Gregory C. Burnett, Livermore, CA (US); Nicolas J. Petit, San Francisco, CA (US); Alexander M. Asseily, San Francisco, CA (US); Andrew E. Einaudi, San Francisco, CA (US)

(73) Assignee: AliphCom, San Francisco, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 977 days.
    This patent is subject to a terminal disclaimer.

(21) Appl. No.: 10/400,282

(22) Filed: Mar. 27, 2003

(65) Prior Publication Data
    US 2003/0228023 A1    Dec. 11, 2003

    Related U.S. Application Data
(60) Provisional application No. 60/368,209, filed on Mar. 27, 2002.

(51) Int. Cl.
    H04B 15/00 (2006.01)
(52) U.S. Cl.
    USPC ........ 381/94.1; 381/92; 381/94.3; 381/94.7; 704/226; 704/233
(58) Field of Classification Search
    USPC ........ 381/94.7, 71.6, 110, 71.7, 111, 11, 381/91-92, 94.1-94.3; 704/226, 233
    See application file for complete search history.

(56) References Cited

    U.S. PATENT DOCUMENTS
    3,789,166 A    1/1974    Sebesta
    4,006,318 A    2/1977    Sebesta et al.
    (Continued)

    FOREIGN PATENT DOCUMENTS
    EP    0 637 187 A     2/1995
    EP    0 795 851 A2    9/1997
    (Continued)

    OTHER PUBLICATIONS
Zhao Li et al., "Robust Speech Coding Using Microphone Arrays," Signals, Systems and Computers, 1997, Conference Record of the 31st Asilomar Conference, Nov. 2-5, 1997, IEEE Comput. Soc., USA.
    (Continued)

Primary Examiner — Disler Paul
(74) Attorney, Agent, or Firm — Kokka & Backus, PC

(57) ABSTRACT

Communication systems are described, including both portable handset and headset devices, which use a number of microphone configurations to receive acoustic signals of an environment. The microphone configurations include, for example, a two-microphone array including two unidirectional microphones, and a two-microphone array including one unidirectional microphone and one omnidirectional microphone. The communication systems also include Voice Activity Detection (VAD) devices to provide information of human voicing activity. Components of the communications systems receive the acoustic signals and voice activity signals and, in response, automatically generate control signals from data of the voice activity signals. Components of the communication systems use the control signals to automatically select a denoising method appropriate to data of frequency subbands of the acoustic signals. The selected denoising method is applied to the acoustic signals to generate denoised acoustic signals when the acoustic signal includes speech and noise.

26 Claims, 29 Drawing Sheets

[Front-page figure: block diagram of signal processing system 100 — speech source 101 (s(n)) and noise source 102 (n(n)) reach MIC 1 (103, m1(n)) and MIC 2 (104, m2(n)) directly and through transfer paths H1(z) and H2(z); VAD 106 supplies voicing information to the noise removal block, which outputs cleaned speech.]

Case 6:21-cv-00984-ADA Document 55-6 Filed 05/25/22 Page 3 of 45

US 8,467,543 B2
Page 2

    U.S. PATENT DOCUMENTS
    4,591,668     5/1986    Iwata
    4,901,354     2/1990    Gollmar et al.
    5,097,515     3/1992    Baba
    5,212,764     5/1993    Ariyoshi
    5,353,376    10/1994    Oh et al.
    5,400,409     3/1995    Linhard
    5,406,622     4/1995    Silverberg et al.
    5,414,776     5/1995    Sims, Jr.
    5,473,702    12/1995    Yoshida et al.
    5,515,865     5/1996    Scanlon et al. ............ 381/94.7
    5,517,435     5/1996    Sugiyama
    5,539,859     7/1996    Robbe et al.
    5,590,241    12/1996    Park et al. ................ 704/227
    5,625,684     4/1997    Matouk et al. .............. 379/392.01
    5,633,935     5/1997    Kanamori et al.
    5,649,055     7/1997    Gupta et al.
    5,684,460    11/1997    Scanlon et al.
    5,729,694     3/1998    Holzrichter et al.
    5,754,665     5/1998    Hosoi et al.
    5,835,608    11/1998    Warnaka et al.
    5,853,005    12/1998    Scanlon
    5,917,921     6/1999    Sasaki et al.
    5,966,090    10/1999    McEwan
    5,986,600    11/1999    McEwan
    6,006,175    12/1999    Holzrichter
    6,009,396    12/1999    Nagata
    6,069,963     5/2000    Martin et al.
    6,191,724     2/2001    McEwan
    6,266,422     7/2001    Ikeda
    6,430,295     8/2002    Handel et al.
    6,795,713     9/2004    Housni ..................... 455/550.1
    6,963,649    11/2005    Vaudrey et al. ............. 381/94.7
    6,980,092    12/2005    Turnbull et al. ............ 340/425.5
    7,206,418     4/2007    Yang et al. ................ 381/92
    2002/0039425 A1     4/2002    Burnett et al.
    2003/0044025 A1*    3/2003    Ouyang et al. .............. 381/92
    2003/0130839 A1*    7/2003    Beaucoup et al. ............ 704/226

    FOREIGN PATENT DOCUMENTS
    EP    0 984 660 A2     3/2000
    JP    2000 312 395    11/2000
    JP    2001 189 987     7/2001
    WO    WO 02 07151      1/2002

    OTHER PUBLICATIONS
L.C. Ng et al., "Denoising of Human Speech Using Combined Acoustic and EM Sensor Signal Processing," 2000 IEEE Int'l Conf. on Acoustics, Speech and Signal Processing, Proceedings (Cat. No. 00CH37100), Istanbul, Turkey, Jun. 5-9, 2000, XP002186255, ISBN 0-7803-6293-4.
S. Affes et al., "A Signal Subspace Tracking Algorithm for Microphone Array Processing of Speech," IEEE Transactions on Speech and Audio Processing, N.Y., USA, vol. 5, no. 5, Sep. 1, 1997, XP000774303, ISSN 1063-6676.
Gregory C. Burnett, "The Physiological Basis of Glottal Electromagnetic Micropower Sensors (GEMS) and Their Use in Defining an Excitation Function for the Human Vocal Tract," Dissertation, University of California at Davis, Jan. 1999, USA.
Todd J. Gable et al., "Speaker Verification Using Combined Acoustic and EM Sensor Signal Processing," ICASSP-2001, Salt Lake City, USA.
A. Hussain, "Intelligibility Assessment of a Multi-Band Speech Enhancement Scheme," Proceedings IEEE Int'l Conf. on Acoustics, Speech & Signal Processing (ICASSP-2000), Istanbul, Turkey, Jun. 2000.

* cited by examiner
Case 6:21-cv-00984-ADA Document 55-6 Filed 05/25/22 Pages 4-32 of 45

U.S. Patent    Jun. 18, 2013    Sheets 1-29 of 29    US 8,467,543 B2

[Drawing sheets 1-29. Only the figure labels and a few panel annotations are recoverable from the scanned pages:
FIG. 1 — signal processing system 100: speech source 101 (s(n)) and noise source 102 (n(n)) feed MIC 1 (103, m1(n)) and MIC 2 (104, m2(n)) directly and through paths H1(z) and H2(z); VAD 106 provides voicing information to the noise removal block, which outputs cleaned speech.
FIG. 1A — blocks labeled Microphone Configuration, VAD Device, VAD Algorithm, Pathfinder Noise Suppression, Denoised Speech, and Communication Device.
FIG. 1B (PRIOR ART) — conventional adaptive noise cancellation: signal plus noise on the primary input, a reference noise input through H(z), an adaptive noise canceller, and cleaned speech output.
FIG. 2 (PRIOR ART) — table comparing omnidirectional and unidirectional microphone types (cardioid, supercardioid, hypercardioid, bidirectional): polar response pattern, coverage angle, angle of maximum rejection (null angle; legible values include 180°, 126°, 110°), rear rejection relative to front (25 dB, 12 dB, 6 dB), ambient sound sensitivity relative to omni (100%, 33%, 27%, 25%, 33%), and distance factor relative to omni (1, 1.7, 1.9, 2, 1.7).
FIGS. 3A-3C through 8A-8C — microphone configurations 300 through 800, each shown schematically (UNI and OMNI elements oriented "Towards Speech" / "Away from Speech") and as mounted in a handset and in a headset.
FIGS. 9A-9C — configuration 900 using two omnidirectional microphones (orientation irrelevant), shown schematically and in a handset and a headset.
FIGS. 10A-10B — GEMS sensor placement: area of sensitivity on the human head and GEMS antenna placement on a generic handset or headset device.
FIGS. 11A-11B — accelerometer/SSM placement: areas of sensitivity on the human head (labels only partially legible, e.g. behind the ear/mastoid, cheek, nose, vibrating tissue, inside the ear) and accelerometer/SSM placement on a generic handset or headset device.]
Case 6:21-cv-00984-ADA Document 55-6 Filed 05/25/22 Page 33 of 45

US 8,467,543 B2

MICROPHONE AND VOICE ACTIVITY DETECTION (VAD) CONFIGURATIONS FOR USE WITH COMMUNICATION SYSTEMS

RELATED APPLICATIONS

This application claims priority from U.S. Patent Application No. 60/368,209, entitled MICROPHONE AND VOICE ACTIVITY DETECTION (VAD) CONFIGURATIONS FOR USE WITH PORTABLE COMMUNICATION SYSTEMS, filed Mar. 27, 2002.

Further, this application relates to the following U.S. Patent Applications: Application Ser. No. 09/905,361, entitled METHOD AND APPARATUS FOR REMOVING NOISE FROM ELECTRONIC SIGNALS, filed Jul. 12, 2001; application Ser. No. 10/159,770, entitled DETECTING VOICED AND UNVOICED SPEECH USING BOTH ACOUSTIC AND NONACOUSTIC SENSORS, filed May 30, 2002; application Ser. No. 10/301,237, entitled METHOD AND APPARATUS FOR REMOVING NOISE FROM ELECTRONIC SIGNALS, filed Nov. 21, 2002; and application Ser. No. 10/383,162, entitled VOICE ACTIVITY DETECTION (VAD) DEVICES AND METHODS FOR USE WITH NOISE SUPPRESSION SYSTEMS, filed Mar. 5, 2003.
TECHNICAL FIELD

The disclosed embodiments relate to systems and methods for detecting and processing a desired acoustic signal in the presence of acoustic noise.

BACKGROUND
Many noise suppression algorithms and techniques have been developed over the years. Most of the noise suppression systems in use today for speech communication systems are based on a single-microphone spectral subtraction technique first developed in the 1970's and described, for example, by S. F. Boll in "Suppression of Acoustic Noise in Speech using Spectral Subtraction," IEEE Trans. on ASSP, pp. 113-120, 1979. These techniques have been refined over the years, but the basic principles of operation have remained the same. See, for example, U.S. Pat. No. 5,687,243 of McLaughlin, et al., and U.S. Pat. No. 4,811,404 of Vilmur, et al. Generally, these techniques make use of a single-microphone Voice Activity Detector (VAD) to determine the background noise characteristics, where "voice" is generally understood to include human voiced speech, unvoiced speech, or a combination of voiced and unvoiced speech.
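For illustration only, the classic single-microphone spectral subtraction that the preceding paragraph refers to can be sketched as follows; the frame length, the energy-threshold VAD, and the spectral floor are assumptions of the sketch, not parameters taken from the Boll paper or from this patent.

    import numpy as np

    def spectral_subtract(x, frame_len=256, hop=128, floor=0.05, vad_threshold=1e-3):
        """Denoise a 1-D signal with VAD-gated spectral subtraction (illustrative only)."""
        x = np.asarray(x, dtype=float)
        if len(x) < frame_len:                      # pad very short inputs
            x = np.pad(x, (0, frame_len - len(x)))
        n_frames = 1 + (len(x) - frame_len) // hop
        window = np.hanning(frame_len)
        frames = np.stack([x[i*hop:i*hop+frame_len] * window for i in range(n_frames)])

        # Crude single-microphone VAD: a frame whose energy exceeds a fixed
        # threshold is treated as "speech"; real systems adapt this decision.
        vad = frames.var(axis=1) > vad_threshold

        spectra = np.fft.rfft(frames, axis=1)
        mag, phase = np.abs(spectra), np.angle(spectra)

        # Estimate the noise magnitude spectrum from the frames the VAD calls noise-only.
        noise_mag = mag[~vad].mean(axis=0) if np.any(~vad) else mag.mean(axis=0)

        # Subtract the noise estimate, keep the noisy phase, apply a spectral floor.
        clean_mag = np.maximum(mag - noise_mag, floor * mag)
        clean_frames = np.fft.irfft(clean_mag * np.exp(1j * phase), n=frame_len, axis=1)

        # Overlap-add the denoised frames back into a waveform.
        out = np.zeros(len(x))
        for i, frame in enumerate(clean_frames):
            out[i*hop:i*hop+frame_len] += frame
        return out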
The VAD has also been used in digital cellular systems. As an example of such a use, see U.S. Pat. No. 6,453,291 of Ashley, where a VAD configuration appropriate to the front-end of a digital cellular system is described. Further, some Code Division Multiple Access (CDMA) systems utilize a VAD to minimize the effective radio spectrum used, thereby allowing for more system capacity. Also, Global System for Mobile Communication (GSM) systems can include a VAD to reduce co-channel interference and to reduce battery consumption on the client or subscriber device.

These typical single-microphone VAD systems are significantly limited in capability as a result of the analysis of acoustic information received by the single microphone, wherein the analysis is performed using typical signal processing techniques. In particular, limitations in performance of these single-microphone VAD systems are noted when processing signals having a low signal-to-noise ratio (SNR), and in settings where the background noise varies quickly. Thus, similar limitations are found in noise suppression systems using these single-microphone VADs.

Many limitations of these typical single-microphone VAD systems were overcome with the introduction of the Pathfinder noise suppression system by Aliph of San Francisco, Calif. (http://www.aliph.com), described in detail in the Related Applications. The Pathfinder noise suppression system differs from typical noise cancellation systems in several important ways. For example, it uses an accurate voiced activity detection (VAD) signal along with two or more microphones, where the microphones detect a mix of both noise and speech signals. While the Pathfinder noise suppression system can be used with and integrated in a number of communication systems and signal processing systems, so can a variety of devices and/or methods be used to supply the VAD signal. Further, a number of microphone types and configurations can be used to provide acoustic signal information to the Pathfinder system.

BRIEF DESCRIPTION OF THE FIGURES
FIG. 1 is a block diagram of a signal processing system including the Pathfinder noise removal or suppression system and a VAD system, under an embodiment.
FIG. 1A is a block diagram of a noise suppression/communication system including hardware for use in receiving and processing signals relating to VAD, and utilizing specific microphone configurations, under the embodiment of FIG. 1.
FIG. 1B is a block diagram of a conventional adaptive noise cancellation system of the prior art.
FIG. 2 is a table describing different types of microphones and the associated spatial responses in the prior art.
FIG. 3A shows a microphone configuration using a unidirectional speech microphone and an omnidirectional noise microphone, under an embodiment.
FIG. 3B shows a microphone configuration in a handset using a unidirectional speech microphone and an omnidirectional noise microphone, under the embodiment of FIG. 3A.
FIG. 3C shows a microphone configuration in a headset using a unidirectional speech microphone and an omnidirectional noise microphone, under the embodiment of FIG. 3A.
FIG. 4A shows a microphone configuration using an omnidirectional speech microphone and a unidirectional noise microphone, under an embodiment.
FIG. 4B shows a microphone configuration in a handset using an omnidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 4A.
FIG. 4C shows a microphone configuration in a headset using an omnidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 4A.
FIG. 5A shows a microphone configuration using an omnidirectional speech microphone and a unidirectional noise microphone, under an alternative embodiment.
FIG. 5B shows a microphone configuration in a handset using an omnidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 5A.
FIG. 5C shows a microphone configuration in a headset using an omnidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 5A.
FIG. 6A shows a microphone configuration using a unidirectional speech microphone and a unidirectional noise microphone, under an embodiment.
FIG. 6B shows a microphone configuration in a handset using a unidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 6A.
Case 6:21-cv-00984-ADA Document 55-6 Filed 05/25/22 Page 34 of 45

US 8,467,543 B2

FIG. 6C shows a microphone configuration in a headset using a unidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 6A.
FIG. 7A shows a microphone configuration using a unidirectional speech microphone and a unidirectional noise microphone, under an alternative embodiment.
FIG. 7B shows a microphone configuration in a handset using a unidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 7A.
FIG. 7C shows a microphone configuration in a headset using a unidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 7A.
FIG. 8A shows a microphone configuration using a unidirectional speech microphone and a unidirectional noise microphone, under an embodiment.
FIG. 8B shows a microphone configuration in a handset using a unidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 8A.
FIG. 8C shows a microphone configuration in a headset using a unidirectional speech microphone and a unidirectional noise microphone, under the embodiment of FIG. 8A.
FIG. 9A shows a microphone configuration using an omnidirectional speech microphone and an omnidirectional noise microphone, under an embodiment.
FIG. 9B shows a microphone configuration in a handset using an omnidirectional speech microphone and an omnidirectional noise microphone, under the embodiment of FIG. 9A.
FIG. 9C shows a microphone configuration in a headset using an omnidirectional speech microphone and an omnidirectional noise microphone, under the embodiment of FIG. 9A.
FIG. 10A shows an area of sensitivity on the human head appropriate for receiving a GEMS sensor, under an embodiment.
FIG. 10B shows GEMS antenna placement on a generic handset or headset device, under an embodiment.
FIG. 11A shows areas of sensitivity on the human head appropriate for placement of an accelerometer/SSM, under an embodiment.
FIG. 11B shows accelerometer/SSM placement on a generic handset or headset device, under an embodiment.

In the drawings, the same reference numbers identify identical or substantially similar elements or acts. To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the Figure number in which that element is first introduced (e.g., element 105 is first introduced and discussed with respect to FIG. 1).

The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the claimed invention. The following description provides specific details for a thorough understanding of, and enabling description for, embodiments of the invention. However, one skilled in the art will understand that the invention may be practiced without these details. In other instances, well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the invention.

DETAILED DESCRIPTION

Numerous communication systems are described below, including both handset and headset devices, which use a variety of microphone configurations to receive acoustic signals of an environment. The microphone configurations include, for example, a two-microphone array including two unidirectional microphones, and a two-microphone array including one unidirectional microphone and one omnidirectional microphone, but are not so limited. The communication systems can also include Voice Activity Detection (VAD) devices to provide voice activity signals that include information of human voicing activity. Components of the communications systems receive the acoustic signals and voice activity signals and, in response, automatically generate control signals from data of the voice activity signals. Components of the communication systems use the control signals to automatically select a denoising method appropriate to data of frequency subbands of the acoustic signals. The selected denoising method is applied to the acoustic signals to generate denoised acoustic signals when the acoustic signals include speech and noise.
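As an illustration only, the control flow just described—VAD-derived control signals selecting a denoising treatment for each frequency subband—can be sketched as follows; the subband edges, the SNR threshold, and the method names are assumptions invented for the sketch, not values from the patent.

    import numpy as np

    SUBBAND_EDGES_HZ = [0, 500, 1500, 4000]   # assumed three-subband split

    def select_denoising(vad_flag: bool, subband_snr_db: float) -> str:
        """Generate a control decision for one subband of one frame."""
        if not vad_flag:
            return "update_noise_estimate"     # noise-only frame: adapt, do not subtract speech
        if subband_snr_db < 6.0:
            return "aggressive_subtraction"    # speech present with strong noise
        return "mild_subtraction"              # speech dominant: preserve quality

    def control_signals(vad, subband_snr_db):
        """vad: (n_frames,) booleans; subband_snr_db: (n_frames, n_subbands) array."""
        return [[select_denoising(v, snr) for snr in frame_snr]
                for v, frame_snr in zip(vad, subband_snr_db)]

    # Example: 4 frames, 3 subbands
    vad = np.array([False, True, True, False])
    snr = np.array([[0, 0, 0], [3, 9, 12], [8, 2, 15], [0, 0, 0]], dtype=float)
    print(control_signals(vad, snr))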
Numerous microphone configurations are described below for use with the Pathfinder noise suppression system. As such, each configuration is described in detail along with a method of use to reduce noise transmission in communication devices, in the context of the Pathfinder system. When the Pathfinder noise suppression system is referred to, it should be kept in mind that noise suppression systems that estimate the noise waveform and subtract it from a signal, and that use or are capable of using the disclosed microphone configurations and VAD information for reliable operation, are included in that reference. Pathfinder is simply a convenient referenced implementation for a system that operates on signals comprising desired speech signals along with noise. Thus, the use of these physical microphone configurations includes but is not limited to applications such as communications, speech recognition, and voice-feature control of applications and/or devices.

The terms "speech" or "voice" as used herein generally refer to voiced, unvoiced, or mixed voiced and unvoiced human speech. Unvoiced speech or voiced speech is distinguished where necessary. However, the term "speech signal" or "speech", when used as a converse to noise, simply refers to any desired portion of a signal and does not necessarily have to be human speech. It could, as an example, be music or some other type of desired acoustic information. As used in the Figures, "speech" is meant to mean any signal of interest, whether human speech, music, or any other signal that it is desired to hear.

In the same manner, "noise" refers to unwanted acoustic information that distorts a desired speech signal or makes it more difficult to comprehend. "Noise suppression" generally describes any method by which noise is reduced or eliminated in an electronic signal.

Moreover, the term "VAD" is generally defined as a vector or array signal, data, or information that in some manner represents the occurrence of speech in the digital or analog domain. A common representation of VAD information is a one-bit digital signal sampled at the same rate as the corresponding acoustic signals, with a zero value representing that no speech has occurred during the corresponding time sample, and a unity value indicating that speech has occurred during the corresponding time sample. While the embodiments described herein are generally described in the digital domain, the descriptions are also valid for the analog domain.
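As a small illustration of this one-bit representation, the sketch below expands hypothetical frame-level VAD decisions into a 0/1 value per acoustic sample; the frame length and the decisions themselves are invented for the example, and the patent's VAD devices (GEMS, accelerometer/SSM, acoustic microphones) would supply the underlying decisions.

    import numpy as np

    def vad_to_sample_rate(frame_decisions, frame_len, n_samples):
        """Expand per-frame 0/1 decisions into a one-bit signal at the acoustic sample rate."""
        vad = np.repeat(np.asarray(frame_decisions, dtype=np.uint8), frame_len)
        return vad[:n_samples]

    # 8 kHz audio, 10 ms frames (80 samples), four frames of decisions
    print(vad_to_sample_rate([0, 1, 1, 0], frame_len=80, n_samples=320))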
The term "Pathfinder", unless otherwise specified, denotes any denoising system using two or more microphones, a VAD device and algorithm, and which estimates the noise in a signal and subtracts it from that signal. The Aliph Pathfinder system is simply a convenient reference for this type of denoising system, although it is more capable than the above definition. In some cases (such as the microphone arrays described in FIGS. 8 and 9), the "full capabilities" or "full version" of the Aliph Pathfinder system are used (as there is a significant amount of speech energy in the noise microphone), and these cases will be enumerated in the text. "Full capabilities" indicates the use of both H1(z) and H2(z) by the Pathfinder system in denoising the signal. Unless otherwise specified, it is assumed that only H1(z) is used to denoise the signal.
Case 6:21-cv-00984-ADA Document 55-6 Filed 05/25/22 Page 35 of 45

US 8,467,543 B2

The Pathfinder system is a digital signal processing (DSP) based acoustic noise suppression and echo-cancellation system. The Pathfinder system, which can couple to the front-end of speech processing systems, uses VAD information and received acoustic information to reduce or eliminate noise in desired acoustic signals by estimating the noise waveform and subtracting it from a signal including both speech and noise. The Pathfinder system is described further below and in the Related Applications.

FIG. 1 is a block diagram of a signal processing system 100 including the Pathfinder noise removal or suppression system 105 and a VAD system 106, under an embodiment. The signal processing system 100 includes two microphones MIC 1 103 and MIC 2 104 that receive signals or information from at least one speech signal source 101 and at least one noise source 102. The path s(n) from the speech signal source 101 to MIC 1 and the path n(n) from the noise source 102 to MIC 2 are considered to be unity. Further, H1(z) represents the path from the noise source 102 to MIC 1, and H2(z) represents the path from the speech signal source 101 to MIC 2.
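Restating the FIG. 1 paths in transfer-function form gives a compact model; the H1-only noise estimate written after it is a sketch consistent with this figure and with the "only H1(z)" convention noted above, not a reproduction of the Pathfinder derivation given in the Related Applications.

    % Mixing model implied by FIG. 1 (unity paths s(n) -> MIC 1 and n(n) -> MIC 2):
    \begin{align*}
      M_1(z) &= S(z) + H_1(z)\,N(z)\\
      M_2(z) &= N(z) + H_2(z)\,S(z)
    \end{align*}
    % With H_2(z) neglected (the "H_1(z) only" case), H_1(z) can be adapted during
    % intervals the VAD marks as noise-only, where M_1(z) \approx H_1(z)\,M_2(z),
    % and the denoised output subtracts the estimated noise from MIC 1:
    \[
      \hat{S}(z) = M_1(z) - \hat{H}_1(z)\,M_2(z)
    \]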
Components of the signal processing system 100, for example the noise removal system 105, couple to the microphones MIC 1 and MIC 2 via wireless couplings, wired couplings, and/or a combination of wireless and wired couplings. Likewise, the VAD system 106 couples to components of the signal processing system 100, like the noise removal system 105, via wireless couplings, wired couplings, and/or a combination of wireless and wired couplings. As an example, the VAD devices and microphones described below as components of the VAD system 106 can comply with the Bluetooth wireless specification for wireless communication with other components of the signal processing system, but are not so limited.

FIG. 1A is a block diagram of a noise suppression/communication system including hardware for use in receiving and processing signals relating to VAD, and utilizing specific microphone configurations, under an embodiment. Referring to FIG. 1A, each of the embodiments described below includes at least two microphones in a specific configuration 110 and one voiced activity detection (VAD) system 130, which includes both a VAD device 140 and a VAD algorithm 150, as described in the Related Applications. Note that in some embodiments the microphone configuration 110 and the VAD device 140 incorporate the same physical hardware, but they are not so limited. Both the microphones 110 and the VAD 130 input information into the Pathfinder noise suppression system 120, which uses the received information to denoise the information in the microphones and output denoised speech 160 into a communications device 170.
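The block wiring of FIG. 1A can also be summarized in a short sketch; the class and function names below, and the placeholder suppressor body, are hypothetical and are chosen only to mirror the reference numerals in the text.

    from dataclasses import dataclass
    from typing import Callable, List, Sequence

    @dataclass
    class VADSystem:                                            # VAD system 130
        device_capture: Callable[[], Sequence[float]]           # VAD device 140 (e.g., SSM, GEMS, accelerometer)
        algorithm: Callable[[Sequence[float]], Sequence[int]]   # VAD algorithm 150

        def voicing_information(self) -> Sequence[int]:
            return self.algorithm(self.device_capture())

    def pathfinder_suppress(mic_signals: List[Sequence[float]],
                            vad_bits: Sequence[int]) -> Sequence[float]:
        # Placeholder for Pathfinder noise suppression 120: a real implementation
        # would adapt its noise-path estimate while vad_bits indicate no speech
        # and subtract the estimated noise from the speech microphone.
        return mic_signals[0]

    def run_pipeline(microphone_config: List[Callable[[], Sequence[float]]],      # 110
                     vad_system: VADSystem,                                       # 130
                     communications_device: Callable[[Sequence[float]], None]):   # 170
        mic_signals = [capture() for capture in microphone_config]
        vad_bits = vad_system.voicing_information()
        denoised_speech = pathfinder_suppress(mic_signals, vad_bits)              # 160
        communications_device(denoised_speech)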
The communications device 170 includes both handset and headset communication devices, but is not so limited. Handsets or handset communication devices include, but are not limited to, portable communication devices that include microphones, speakers, communications electronics and electronic transceivers, such as cellular telephones, portable or mobile telephones, satellite telephones, wireline telephones, Internet telephones, wireless transceivers, wireless communication radios, personal digital assistants (PDAs), and personal computers (PCs).

Headset or headset communication devices include, but are not limited to, self-contained devices including microphones and speakers generally attached to and/or worn on the body. Headsets often function with handsets via couplings with the handsets, where the couplings can be wired, wireless, or a combination of wired and wireless connections. However, the headsets can communicate independently with components of a communications network.

The VAD device 140 includes, but is not limited to, accelerometers, skin surface microphones (SSMs), and electromagnetic devices, along with the associated software or algorithms. Further, the VAD device 140 includes acoustic microphones along with the associated software. The VAD devices and associated software are described in U.S. patent application Ser. No. 10/383,162, entitled VOICE ACTIVITY DETECTION (VAD) DEVICES AND METHODS FOR USE WITH NOISE SUPPRESSION SYSTEMS