EXHIBIT 2115
IPR2017-00321
CONDITIONAL MOTION TO AMEND
VALENCELL, INC.

US008923941B2

(12) United States Patent                         (10) Patent No.:      US 8,923,941 B2
     LeBoeuf et al.                               (45) Date of Patent:      *Dec. 30, 2014

(54) METHODS AND APPARATUS FOR GENERATING DATA OUTPUT CONTAINING
     PHYSIOLOGICAL AND MOTION-RELATED INFORMATION

(71) Applicant: Valencell, Inc., Raleigh, NC (US)

(72) Inventors: Steven Francis LeBoeuf, Raleigh, NC (US); Jesse Berkley Tucker,
     Knightdale, NC (US); Michael Edward Aumer, Raleigh, NC (US)

(73) Assignee: Valencell, Inc., Raleigh, NC (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or
     adjusted under 35 U.S.C. 154(b) by 0 days.

     This patent is subject to a terminal disclaimer.

(21) Appl. No.: 14/184,396

(22) Filed: Feb. 19, 2014

(65)                Prior Publication Data

     US 2014/0171755 A1          Jun. 19, 2014

              Related U.S. Application Data

(63) Continuation of application No. 12/691,388, filed on Jan. 21, 2010, now Pat.
     No. 8,700,111.

(60) Provisional application No. 61/208,567, filed on Feb. 25, 2009, provisional
     application No. 61/208,574, filed on Feb. 25, 2009, provisional application
     No. 61/212,444, filed on Apr. 13, 2009, provisional application No.
     61/274,191, filed on Aug. 14, 2009.

(51) Int. Cl.
     A61B 5/00            (2006.01)
     H04R 1/10            (2006.01)
          (Continued)

(52) U.S. Cl.
     CPC ............ A61B 5/4812 (2013.01); A61B 5/00 (2013.01); A61B 5/6815 (2013.01);
          (Continued)

(58) Field of Classification Search
     USPC ................................................. 600/310
     See application file for complete search history.

(56)                    References Cited

              U.S. PATENT DOCUMENTS

     5,086,229 A        2/1992   Rosenthal et al.
                    (Continued)

           FOREIGN PATENT DOCUMENTS

     EP     2 077 091 A2        7/2009
     JP        7-241279         9/1995
                    (Continued)

                 OTHER PUBLICATIONS

Notification of Transmittal of the International Search Report and the Written
Opinion of the International Searching Authority, or the Declaration corresponding
to International Application No. PCT/US2013/070271; Date of Mailing: Feb. 26, 2014;
International Search Report; Written Opinion of the International Searching
Authority; 13 pages.

                    (Continued)

Primary Examiner - Rodney Fuller
(74) Attorney, Agent, or Firm - Myers Bigel Sibley & Sajovec

(57)                      ABSTRACT

A method of generating a data string containing physiological and motion-related
information includes sensing physical activity of a subject via at least one motion
sensor attached to the subject, sensing physiological information from the subject
via at least one photoplethysmography (PPG) sensor attached to the subject, and
processing signals from the at least one motion sensor and signals from the at
least one PPG sensor into a serial data string of physiological information and
motion-related information. A plurality of subject physiological parameters can be
extracted from the physiological information, and a plurality of subject physical
activity parameters can be extracted from the motion-related information. The
serial data string is parsed out such that an application-specific interface can
utilize the physiological information and motion-related information for an
application that generates statistical relationships between subject physiological
parameters and subject physical activity parameters in the physiological
information and motion-related information.

                21 Claims, 21 Drawing Sheets
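
The abstract describes processing motion-sensor and PPG signals into a serial data string that an application-specific interface then parses to relate physiological parameters to physical-activity parameters. The patent does not fix a concrete record format, so the following Python sketch assumes a hypothetical semicolon-delimited key=value encoding (field names such as heart_rate_bpm and step_rate_spm are illustrative only) to show how an application-side interface might parse such strings and compute one simple statistical relationship between the two kinds of parameters.

```python
# Hypothetical sketch: parsing a serial data string of physiological and
# motion-related information and relating the two sets of parameters.
# The record layout (semicolon-delimited key=value pairs) is assumed, not
# taken from the patent, which leaves the string format unspecified.

from statistics import correlation  # Python 3.10+

def parse_record(record: str) -> dict[str, float]:
    """Parse one serial data string, e.g. 'heart_rate_bpm=142;step_rate_spm=160;spo2_pct=97'."""
    fields = {}
    for pair in record.strip().split(";"):
        key, value = pair.split("=")
        fields[key] = float(value)
    return fields

# Example stream of records as they might arrive from the monitoring device.
stream = [
    "heart_rate_bpm=61;step_rate_spm=0;spo2_pct=98",
    "heart_rate_bpm=118;step_rate_spm=140;spo2_pct=97",
    "heart_rate_bpm=162;step_rate_spm=175;spo2_pct=96",
]

records = [parse_record(r) for r in stream]
heart_rates = [r["heart_rate_bpm"] for r in records]
step_rates = [r["step_rate_spm"] for r in records]

# One possible "statistical relationship" between a physiological parameter
# and a physical-activity parameter: their correlation over the session.
print(f"HR/step-rate correlation: {correlation(heart_rates, step_rates):.3f}")
```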
US 8,923,941 B2
                                    Page 2

(51)  Int. Cl.
      A61B 5/0205          (2006.01)
      A61B 5/11            (2006.01)
      A61B 5/1455          (2006.01)
      A61B 5/0476          (2006.01)
      A61B 5/16            (2006.01)
(52)  U.S. Cl.
      CPC ............ A61B 5/6838 (2013.01); H04R 1/1091 (2013.01); A61B 5/02055
           (2013.01); A61B 5/6803 (2013.01); H04R 1/105 (2013.01); A61B 5/1118
           (2013.01); A61B 5/1455 (2013.01); A61B 5/721 (2013.01); A61B 5/0082
           (2013.01); A61B 5/0059 (2013.01); A61B 5/0476 (2013.01); A61B 5/14551
           (2013.01); A61B 5/165 (2013.01); A61B 5/4848 (2013.01); A61B 5/4866
           (2013.01); A61B 5/0013 (2013.01); A61B 5/0084 (2013.01); A61B 5/11
           (2013.01); A61B 5/7214 (2013.01); A61B 5/411 (2013.01); A61B 5/415
           (2013.01); A61B 5/4125 (2013.01)
(58)  Field of Classification Search
      USPC ................................................. 600/310
(56)                    References Cited

              U.S. PATENT DOCUMENTS

     5,596,987 A        1/1997   Chance
     6,078,829 A        6/2000   Uchida et al.
     6,080,110 A        6/2000   Thorgersen
     (additional entries not legible in this copy)
     6,783,501 B2       8/2004   Takahashi et al.
     (additional entries not legible in this copy)
     7,107,088 B2       9/2006   Aceti
     7,209,775 B2       4/2007   Bae et al.
     8,055,319 B2      11/2011   Oh et al.
     8,251,903 B2       8/2012   LeBoeuf et al.
     8,512,242 B2       8/2013   LeBoeuf et al.
  2003/0109030 A1       6/2003   Uchida et al.
  2004/0034293 A1       2/2004   Kimball
  2004/0054291 A1       3/2004   Schulz et al.
  2004/0225207 A1      11/2004   Bae et al.
  2005/0043600 A1       2/2005   Diab et al.
  2005/0177034 A1       8/2005   Beaumont
  2005/0209516 A1       9/2005   Fraden
  2005/0223299 A1      10/2005   Banet
  2006/0009685 A1       1/2006   Finarov et al.
  2008/0076372 A1       3/2008   Dorogusker et al.
     (additional entries not legible in this copy)
  2008/0177162 A1       7/2008   Bae et al.
  2009/0030350 A1       1/2009   Yang et al.
  2009/0054752 A1       2/2009   Jennalagadda et al.
     (additional entries not legible in this copy)
  2009/0287067 A1*     11/2009   Dorogusker et al. ......... 600/300
  2010/0168531 A1       7/2010   Shaltis et al.
  2010/0217103 A1         2010   Abdul-Hafiz et al.
     (additional entries not legible in this copy)
  2012/0179011 A1       7/2012   Moon et al.
  2012/0197093 A1       8/2012   LeBoeuf et al.
  2013/0131519 A1       5/2013   LeBoeuf et al.

           FOREIGN PATENT DOCUMENTS

     JP        9-253062             9/1997
     JP        9-299342            11/1997
     JP     2000-116611             4/2000
     JP     2001-025462             1/2001
     JP     2007-185348             7/2007
     WO     WO 2013/038296 A1       3/2013

                 OTHER PUBLICATIONS

Fitrainer "The Only Trainer You Need"; http://itamieom; Downloaded Feb. 26, 2010;
© 2008 FiTrainer™; 2 pages.

* cited by examiner
U.S. Patent          Dec. 30, 2014          Sheet 1 of 21          US 8,923,941 B2

[Drawing sheet; legible label: FIG. 2]
U.S. Patent          Dec. 30, 2014          Sheet 2 of 21          US 8,923,941 B2

[Drawing sheet]
[Drawing sheet; legible label: FIG. 4B]
U.S. Patent          Dec. 30, 2014          Sheet 5 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 6 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 7 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 8 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 9 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 10 of 21          US 8,923,941 B2

[Drawing sheet; legible labels: TRIANGULAR FOSSA, INTERTRAGIC NOTCH]
U.S. Patent          Dec. 30, 2014          Sheet 11 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 12 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 13 of 21          US 8,923,941 B2

[FIG. 12A ("FACING ANTITRAGUS") and FIG. 12B ("FACING EARBUD"): drawings]
U.S. Patent          Dec. 30, 2014          Sheet 14 of 21          US 8,923,941 B2

[FIG. 13: block diagram; legible labels: ESTIMATOR, PARAMETER, CHANNEL A, CHANNEL B, CHANNEL C, CHANNEL D]
U.S. Patent          Dec. 30, 2014          Sheet 15 of 21          US 8,923,941 B2

[Signal plots. FIG. 14A: RAW SIGNAL IN LOW MOTION; FIG. 14B: "BLOCKED CHANNEL" IN LOW MOTION; FIG. 14C: RAW SIGNAL IN HIGH MOTION; FIG. 14D: "BLOCKED CHANNEL" IN HIGH MOTION]
U.S. Patent          Dec. 30, 2014          Sheet 16 of 21          US 8,923,941 B2

[FIG. 15: plot of HEART RATE (bpm) vs. TIME (SECONDS); traces labeled "ADAPTIVE FILTER + BEAT FINDER" and "BEAT FINDER ONLY"; activity regions labeled RESTING, JOGGING, RUNNING]

[FIG. 16: flow blocks labeled PRE-ADAPTIVE SIGNAL CONDITIONING, ADAPTIVE FILTERING, PARAMETER EXTRACTION]
U.S. Patent          Dec. 30, 2014          Sheet 17 of 21          US 8,923,941 B2

[Drawing sheet; legible label: PROCESSOR/MULTIPLEXER]
U.S. Patent          Dec. 30, 2014          Sheet 18 of 21          US 8,923,941 B2

[FIG. 19: block diagram; legible labels: SENSORS, NOISE SOURCES, SIGNAL PROCESSING, INPUT/OUTPUT, POWER]

U.S. Patent          Dec. 30, 2014          Sheet 19 of 21          US 8,923,941 B2

[Drawing sheet]
U.S. Patent          Dec. 30, 2014          Sheet 20 of 21          US 8,923,941 B2

[FIG. 22A and FIG. 22B: drawings]
U.S. Patent          Dec. 30, 2014          Sheet 21 of 21          US 8,923,941 B2

[Drawing sheet]
US 8,923,941 B2
METHODS AND APPARATUS FOR GENERATING DATA OUTPUT CONTAINING PHYSIOLOGICAL AND
MOTION-RELATED INFORMATION

                 RELATED APPLICATIONS

This application is a continuation application of U.S. patent application Ser. No.
12/691,388, filed Jan. 21, 2010, now U.S. Pat. No. 8,700,111, which claims the
benefit of and priority to U.S. Provisional Patent Application No. 61/208,567 filed
Feb. 25, 2009, U.S. Provisional Patent Application No. 61/208,574 filed Feb. 25,
2009, U.S. Provisional Patent Application No. 61/212,444 filed Apr. 13, 2009, and
U.S. Provisional Patent Application No. 61/274,191 filed Aug. 14, 2009, the
disclosures of which are incorporated herein by reference as if set forth in their
entireties.

                FIELD OF THE INVENTION

The present invention relates generally to physiological monitoring and, more
particularly, to physiological monitoring apparatus.

             BACKGROUND OF THE INVENTION

There is growing market demand for personal health and environmental monitors, for
example, for gauging overall health and metabolism during exercise, athletic
training, dieting, daily life activities, sickness, and physical therapy. However,
traditional health monitors and environmental monitors may be bulky, rigid, and
uncomfortable, and are generally not suitable for use during daily physical
activity. There is also growing interest in generating and comparing health and
environmental exposure statistics of the general public and particular demographic
groups. For example, collective statistics may enable the healthcare industry and
medical community to direct healthcare resources to where they are most highly
valued. However, methods of collecting these statistics may be expensive and
laborious, often utilizing human-based recording/analysis steps at multiple sites.

As such, improved ways of collecting, storing and analyzing physiological
information are needed. In addition, improved ways of seamlessly extracting
physiological information from a person during everyday life activities, especially
during high activity levels, may be important for enhancing fitness training and
healthcare quality, promoting and facilitating prevention, and reducing healthcare
costs.

                      SUMMARY

It should be appreciated that this Summary is provided to introduce a selection of
concepts in a simplified form, the concepts being further described below in the
Detailed Description. This Summary is not intended to identify key features or
essential features of this disclosure, nor is it intended to limit the scope of the
invention.

According to some embodiments of the present invention, a headset configured to be
attached to the ear of a person includes a base, an earbud housing extending
outwardly from the base that is configured to be positioned within an ear of a
subject, and a cover surrounding the earbud housing. The base includes a speaker,
an optical emitter, and an optical detector. The cover includes light transmissive
material that is in optical communication with the optical emitter and the optical
detector and serves as a light guide to deliver light from the optical emitter into
the ear canal of the subject wearing the headset at one or more predetermined
locations and to collect light external to the earbud housing and deliver the
collected light to the optical detector. The optical emitter, via the light-guiding
cover, directs optical energy towards a particular region of the ear, and the
optical detector detects secondary optical energy emanating from the ear region. In
some embodiments, the optical detector may include an optical filter configured to
pass secondary optical energy at selective wavelengths. In some embodiments, the
light transmissive material of the cover may be configured, for example via the use
of cladding and/or light reflective material, such that the cover serves as a light
guide that is coupled in parallel to the optical emitter and detector. In some
embodiments, the light transmissive material of the cover may be configured, for
example via the use of cladding and/or light reflective material, such that the
cover serves as a light guide that is coupled perpendicular to the optical emitter
and detector.

In some embodiments, the headset may include various electronic components secured
to the base. For example, the headset may include one or more environmental sensors
configured to detect and/or measure environmental conditions in a vicinity of the
headset. The headset may include a signal processor configured to receive and
process signals produced by the optical detector. For example, in some embodiments,
a signal processor may be configured to extract secondary optical energy and remove
optical noise or environmental noise. The headset may include a signal processor
configured to receive and process signals produced by the one or more environmental
sensors. In addition, the headset may include a transmitter configured to transmit
signals processed by the signal processor to a remote device in real time. Headsets
according to embodiments of the present invention may utilize, for example,
Bluetooth®, Wi-Fi, ZigBee, or other wireless transmitters.

In some embodiments, a housing is secured to and overlies the base so as to enclose
and protect the speaker, optical emitter and optical detector, as well as other
electronic components secured to the base (e.g., sensors, processor, transmitter,
etc.).

The earbud housing is in acoustical communication with the speaker and has at least
one aperture through which sound from the speaker can pass. The light-guiding cover
surrounding the earbud housing also includes at least one aperture through which
sound from the speaker can pass. The cover may be formed from a soft, resilient
material, such as silicone, which deforms when inserted within an ear canal of a
subject. In some embodiments, the cover includes an alignment member that
facilitates alignment of the earbud housing within an ear canal of a subject.

Light directed into the ear of a subject from a light emitter and the subsequent
collection of light at a light detector, according to embodiments of the present
invention, may be utilized for detecting and/or measuring, among other things, body
temperature, skin temperature, blood gas levels, muscle tension, heart rate, blood
flow, cardiopulmonary functions, etc.

In some embodiments of the present invention, the light-guiding cover may include a
lens that is in optical communication with the optical emitter and/or optical
detector. The lens may be configured to focus light emitted by the optical emitter
and/or to focus collected light toward the optical detector. In some embodiments,
multiple lenses may be incorporated into a light-guiding cover.
In some embodiments, the light-guiding cover may include a light diffusion region
in optical communication with the light transmissive material that diffuses light
emitted by the optical emitter.

US 8,923,941 B2
In some embodiments, the light-guiding cover may include a luminescence-generating
region, such as a phosphor-containing region, that is in optical communication with
the light transmissive material. The luminescence-generating region may be embedded
within the light-guiding cover and/or on a surface of the light-guiding cover. The
luminescence-generating region is configured to receive light emitted by the
optical emitter and convert at least a portion of the received light to light
having a different wavelength from that of the received light.

In some embodiments, the light-guiding cover includes one or more grooves formed
therein. Each groove is configured to direct external light to the optical
detector.

In some embodiments, the light transmissive material of the light-guiding cover is
configured to direct light from the optical emitter to a plurality of locations at
an outer surface of the cover for delivery into an ear canal of a subject.

In some embodiments, the light transmissive material of the light-guiding cover is
a translucent material or includes translucent material in selected locations.

In some embodiments, a light reflective material is on at least a portion of one or
both of the inner and outer surfaces of the light-guiding cover.

According to some embodiments of the present invention, a light-guiding earbud for
a headset includes light transmissive material that is in optical communication
with an optical emitter and optical detector associated with the headset. The light
transmissive material is configured to deliver light from the optical emitter into
the ear canal of a subject at one or more predetermined locations and to collect
light external to the earbud housing and deliver the collected light to the optical
detector. In some embodiments, the light emitter and light detector may be integral
with the earbud. For example, in some embodiments, a flexible optical emitter is
incorporated within the earbud and is in optical communication with the light
transmissive material.

In some embodiments, an earbud includes at least one lens in optical communication
with the light transmissive material. Each lens may be configured to focus light
from the optical emitter onto one or more predetermined locations in the ear of a
subject and/or to focus collected external light onto the optical detector.

In some embodiments of the present invention, an earbud may include luminescent
material. Luminescent light is generated from optical excitation of the luminescent
material by an optical emitter.

In some embodiments of the present invention, an earbud may integrate a sensor
module containing a plurality of sensor elements for measuring physiological
information and at least one noise source for measuring noise information. A "noise
source", as used herein, refers to a sensor, such as an optical sensor, inertial
sensor, electrically conductive sensor, capacitive sensor, inductive sensor, etc.,
and derives its name from the fact that it is a source of input to a filter, such
as an adaptive filter described below.
The physiological sensors of the sensor module may generate a signal that includes
physiological information plus noise information. The noise may be removed by
combining the physiological information and noise information from the sensor
module with noise information from the noise source of the sensor module via an
electronic filtering method, such as a signal processing technique. Specific
examples of such signal processing techniques include FIR (Finite Impulse
Response), IIR (Infinite Impulse Response), informatics, machine learning, and
adaptive filter methods. The output of the adaptive filter may be a physiological
signal that is wholly or partially free of noise. In some embodiments,
motion-related noise from a subject activity such as running may be removed from
the physiological plus noise signal generated by a photoplethysmography (PPG)
sensor for measuring blood constituent levels or blood flow properties, such as
blood oxygen level, VO2, or heart rate.
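
The passage above names adaptive filtering as one way to remove motion-related noise from a PPG signal using a noise reference. As an illustration only (the patent does not prescribe a particular algorithm, and the variable names here are assumptions), the following Python sketch applies a basic least-mean-squares (LMS) adaptive canceller that uses an accelerometer channel as the noise reference for a synthetic PPG signal.

```python
# Illustrative LMS adaptive filter: an accelerometer channel serves as the
# noise reference used to cancel motion artifact from a PPG signal.
# This is a generic textbook LMS sketch, not code from the patent.
import numpy as np

def lms_cancel(ppg, noise_ref, n_taps=8, mu=0.01):
    """Return the PPG signal with the component correlated to noise_ref removed."""
    w = np.zeros(n_taps)                     # adaptive filter weights
    cleaned = np.zeros_like(ppg)
    for n in range(n_taps, len(ppg)):
        x = noise_ref[n - n_taps:n][::-1]    # most recent reference samples
        noise_est = w @ x                    # estimate of the motion artifact
        e = ppg[n] - noise_est               # error = cleaned physiological sample
        w += 2 * mu * e * x                  # LMS weight update
        cleaned[n] = e
    return cleaned

# Synthetic demo: a 1.5 Hz "heartbeat" corrupted by a 2.5 Hz "step" artifact that
# also appears (with a different gain) on the accelerometer reference channel.
fs = 100.0
t = np.arange(0, 30, 1 / fs)
heart = np.sin(2 * np.pi * 1.5 * t)
steps = np.sin(2 * np.pi * 2.5 * t)
ppg = heart + 0.8 * steps
accel = 1.3 * steps

cleaned = lms_cancel(ppg, accel)
print("residual artifact power:", np.mean((cleaned[500:] - heart[500:]) ** 2))
```

After a short convergence period, the error output of the canceller approximates the heartbeat component, which is the sense in which the filter output is "wholly or partially free of noise."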
In some embodiments of the present invention, the noise source input of an adaptive
filter may include a "blocked channel" of optical energy, an inertial sensor, or
environmental energy. In some embodiments, the environmental energy may be unwanted
ambient optical noise.

In some embodiments of the present invention, a processor/multiplexor processes
physiological signals and noise signals into a data string. This data string may
contain information relating to physiological information and motion-related
information. The processing method may include signal processing techniques such as
pre-adaptive signal conditioning, adaptive filtering, and parameter extraction.
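
To make the three processing stages named above concrete, here is a hedged Python sketch that chains pre-adaptive signal conditioning, adaptive filtering, and parameter extraction. The specific stage choices (moving-average detrending, zero-crossing beat counting) and function names are illustrative assumptions, not the patent's prescribed implementation; the adaptive stage can be any canceller with the same call signature, such as the lms_cancel sketch shown earlier.

```python
# Hypothetical three-stage pipeline mirroring the stages named in the text:
# pre-adaptive signal conditioning -> adaptive filtering -> parameter extraction.
import numpy as np

def precondition(signal, fs, drift_window_s=2.0):
    """Pre-adaptive signal conditioning: subtract a slow moving-average baseline."""
    window = max(1, int(drift_window_s * fs))
    baseline = np.convolve(signal, np.ones(window) / window, mode="same")
    return signal - baseline

def extract_heart_rate(cleaned, fs):
    """Parameter extraction: count zero-crossing pairs as a crude beat estimate."""
    signs = np.signbit(cleaned).astype(np.int8)
    crossings = np.count_nonzero(np.diff(signs))
    beats = crossings / 2.0
    return 60.0 * beats / (len(cleaned) / fs)

def process(ppg, noise_ref, fs, adaptive_filter):
    """Run the full pipeline and return one extracted parameter (heart rate, bpm)."""
    conditioned_ppg = precondition(ppg, fs)
    conditioned_ref = precondition(noise_ref, fs)
    cleaned = adaptive_filter(conditioned_ppg, conditioned_ref)
    return extract_heart_rate(cleaned, fs)

# Usage, reusing lms_cancel from the earlier sketch:
# bpm = process(ppg, accel, fs, lms_cancel)
```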
In some embodiments, an earbud includes one or more sensor modules that include one
or more sensors for sensing physiological information and environmental
information, such as noise, for example. As such, the earbud may function as a
physiological monitor as well as an environmental monitor. In some embodiments, the
earbud may include a microprocessor that is in electrical communication with the
sensor module(s). For example, a microprocessor incorporated into an earbud may be
configured to execute an adaptive filter algorithm to remove noise from at least
one signal generated by a sensor module in the earbud. A microprocessor may also be
configured to process information from the one or more sensors to generate a
digital output string, wherein the digital output string includes a plurality of
physiological and motion-related information.
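
As a device-side counterpart to the parsing sketch given after the Abstract, the following Python fragment shows one hypothetical way firmware logic on such a microprocessor might pack extracted physiological and motion-related parameters into a digital output string. The field names, ordering, and the semicolon-delimited key=value encoding are assumptions chosen to match that earlier sketch; the patent does not fix a particular encoding.

```python
# Hypothetical packing of extracted parameters into a digital output string.
# The key=value/";" encoding is an assumption, not a format defined by the patent.

def build_output_string(physio: dict[str, float], motion: dict[str, float]) -> str:
    """Serialize physiological and motion-related parameters into one record."""
    fields = {**physio, **motion}            # insertion order preserved (Python 3.7+)
    return ";".join(f"{name}={value:.1f}" for name, value in fields.items())

record = build_output_string(
    physio={"heart_rate_bpm": 142.0, "spo2_pct": 97.0},
    motion={"step_rate_spm": 160.0, "cadence_var": 3.2},
)
print(record)  # heart_rate_bpm=142.0;spo2_pct=97.0;step_rate_spm=160.0;cadence_var=3.2
```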
Physiological sensors that may be incorporated into headsets and/or earbuds,
according to some embodiments of the present invention, may be configured to detect
and/or measure one or more of the following types of physiological information:
heart rate, pulse rate, breathing rate, blood flow, VO2, VO2max, heartbeat
signatures, cardio-pulmonary health, organ health, metabolism, electrolyte type
and/or concentration, physical activity, caloric intake, caloric metabolism, blood
metabolite levels or ratios, blood pH level, physical and/or psychological stress
levels and/or stress level indicators, drug dosage and/or dosimetry, physiological
drug reactions, drug chemistry, biochemistry, position and/or balance, body strain,
neurological functioning, brain activity, brain waves, blood pressure, cranial
pressure, hydration level, auscultatory information, auscultatory signals
associated with pregnancy, physiological response to infection, skin and/or core
body temperature, eye muscle movement, blood volume, inhaled and/or exhaled breath
volume, physical exertion, exhaled breath physical and/or chemical composition, the
presence and/or identity and/or concentration of viruses and/or bacteria, foreign
matter in the body, internal toxins, heavy metals in the body, anxiety, fertility,
ovulation, sex hormones, psychological mood, sleep patterns, hunger and/or thirst,
hormone type and/or concentration, cholesterol, lipids, blood panel, bone density,
organ and/or body weight, reflex response, sexual arousal, mental and/or physical
alertness, sleepiness, auscultatory information, response to external stimuli,
swallowing volume, swallowing rate, sickness, voice characteristics, voice tone,
voice pitch, voice volume, vital signs, head tilt, allergic reactions, inflammation
response, auto-immune response, mutagenic response, DNA, proteins, protein levels
in the blood, water content of the blood, pheromones, internal body sounds,
digestive system functioning, cellular regeneration response, healing response,
stem cell regeneration response, etc.
US 8,923,941 B2
Environmental sensors that may be incorporated into headsets and/or earbuds,
according to some embodiments of the present invention, may be configured to detect
and/or measure one or more of the following types of environmental information:
climate, humidity, temperature, pressure, barometric pressure, soot density,
airborne particle density, airborne particle size, airborne particle shape,
airborne particle identity, volatile organic chemicals (VOCs), hydrocarbons,
polycyclic aromatic hydrocarbons (PAHs), carcinogens, toxins, electromagnetic
energy, optical radiation, X-rays, gamma rays, microwave radiation, terahertz
radiation, ultraviolet radiation, infrared radiation, radio waves, atomic energy
alpha particles, atomic energy beta-particles, gravity, light intensity, light
frequency, light flicker, light phase, ozone, carbon monoxide, carbon dioxide,
nitrous oxide, sulfides, airborne pollution, foreign material in the air, viruses,
bacteria, signatures from chemical weapons, wind, air turbulence, sound and/or
acoustical energy, ultrasonic energy, noise pollution, human voices, animal sounds,
diseases expelled from others, exhaled breath and/or breath constituents of others,
toxins from others, pheromones from others, industrial and/or transportation
sounds, allergens, animal hair, pollen, exhaust from engines, vapors and/or fumes,
fuel, signatures for mineral deposits and/or oil deposits, snow, rain, thermal
energy, hot surfaces, hot gases, solar energy, hail, ice, vibrations, traffic, the
number of people in a vicinity of the person, coughing and/or sneezing sounds from
people in the vicinity of the person, loudness and/or pitch from those speaking in
the vicinity of the person.

According to some embodiments of the present invention, earbuds for headsets may
include a chipset having at least one sensor element, noise source element, signal
processor, input/output line, digital control, and power regulator.

Light-guiding earbuds according to the various embodiments of the present invention
may be utilized with mono headsets (i.e., headsets having one earbud) as well as
stereo headsets (i.e., headsets having two earbuds). Additionally, the light-guiding
region of earbuds, according to embodiments of the present invention, may be
integrated not only into an earbud cover and earbud housing, but also into each or
all components of an earbud. Moreover, light-guiding earbuds according to the
various embodiments of the present invention may be utilized with hearing aids,
body jewelry, or any other attachment that can be placed near the head region, such
as eye glasses or shades, a headband, a cap, helmet, visor, or the like.
According to some embodiments of the present invention, a monitoring device
includes a circular band capable of encircling a finger of a subject, and a base
having an optical emitter and an optical detector attached to the circular band.
The circular band includes light transmissive material in optical communication
with the optical emitter and optical detector that is configured to deliver light
from the optical emitter to one or more portions of the finger of the subject and
to collect light from one or more portions of the finger of the subject and deliver
the collected light to the optical detector. In some embodiments, the circular band
includes first and second concentric body portions.

In some embodiments, the circular band includes a lens region in optical
communication with the optical emitter that focuses light emitted by the optical
emitter and/or that collects light reflected from a finger. In some embodiments,
the circular band includes a phosphor-containing region in optical communication
with the light transmissive material, wherein the phosphor-containing region
receives light emitted by the optical emitter and converts at least a portion of
the received light to light having a different wavelength from the received light.

In some embodiments, the light transmissive material of the circular band has an
outer surface and an inner surface, and a cladding material, such as light
reflective material, is on (or near) at least a portion of one or both of the inner
and outer surfaces.

In some embodiments, the base includes one or more of the follo
