Case 2:21-cv-00040-JRG Document 64-2 Filed 08/15/21 Page 1 of 39 PageID #: 1048

EXHIBIT B
(12) United States Patent                    (10) Patent No.:     US 8,194,924 B2
     Pryor                                   (45) Date of Patent: Jun. 5, 2012

(54) CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING OR OTHER DEVICES

(76) Inventor: Timothy R. Pryor, Sylvania, OH (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or
    adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 13/051,698

(22) Filed: Mar. 18, 2011

(65) Prior Publication Data

     US 2011/0170746 A1     Jul. 14, 2011

     Related U.S. Application Data

(63) Continuation of application No. 12/834,281, filed on Jul. 12, 2010, now Pat.
     No. 7,933,431, which is a continuation of application No. 11/980,710, filed
     on Oct. 31, 2007, now Pat. No. 7,756,297, which is a continuation of
     application No. 10/893,534, filed on Jul. 19, 2004, now Pat. No. 7,401,783,
     which is a continuation of application No. 09/612,225, filed on Jul. 7, 2000,
     now Pat. No. 6,766,036.

(60) Provisional application No. 60/142,777, filed on Jul. 8, 1999.

(51) Int. Cl.
     G06K 9/00 (2006.01)
(52) U.S. Cl. ........ 382/103; 382/154; 382/312
(58) Field of Classification Search ........ 382/103, 382/154, 312
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS
     3,909,002 A     9/1975  Levy
     4,219,847 A     8/1980  Pinkney et al.
     4,339,798 A     7/1982  Hedges et al.
     4,631,676 A    12/1986  Pugh
     5,008,946 A     4/1991  Ando
     5,088,928 A     2/1992  Chan
     5,227,986 A     7/1993  Yokota et al.
     5,297,061 A     3/1994  Dementhon et al.
     5,388,059 A     2/1995  DeMenthon
     5,491,507 A *   2/1996  Umezawa et al. ........ 348/14.02
     5,581,276 A    12/1996  Cipolla et al.
     5,594,469 A     1/1997  Freeman et al.
     5,616,078 A     4/1997  Oh
     5,624,117 A     4/1997  Ohkubo et al.
     5,781,647 A     7/1998  Fishbine et al.
     5,828,770 A *  10/1998  Leis et al. ........ 382/103
     5,845,006 A *  12/1998  Sumi et al. ........ 382/154
     5,853,327 A    12/1998  Gilboa
     5,878,174 A     3/1999  Stewart et al.
     5,926,168 A     7/1999  Fan
     5,940,126 A *   8/1999  Kimura ........ 348/294
     6,204,852 B1 *  3/2001  Kumar et al. ........ 345/419
     6,342,917 B1    1/2002  Amenta
     (two entries not legible in this copy)
     6,508,709 B1    1/2003  Karmarkar
     6,597,817 B1    7/2003  Silverbrook
     6,775,361 B1 *  8/2004  Arai et al. ........ 379/93.17
     6,788,336 B1 *  9/2004  Silverbrook ........ 348/207.2
     6,911,972 B2 *  6/2005  Brinjes ........ 345/175
     7,489,863 B2 *  2/2009  Lee ........ 396/429

* cited by examiner

Primary Examiner: Tom Y Lu
(74) Attorney, Agent, or Firm: Warner Norcross & Judd LLP

(57) ABSTRACT

Method and apparatus are disclosed to enable rapid TV camera and computer based
sensing in many practical applications, including, but not limited to, handheld
devices, cars, and video games. Several unique forms of social video games are
disclosed.

14 Claims, 23 Drawing Sheets
[Drawing sheets 1-23 of U.S. Patent 8,194,924 B2 (Jun. 5, 2012) appear here;
only the sheet numbers and figure labels are legible in this copy.]

Sheet 1 of 23:  FIGS. 1A, 1B, 1C
Sheet 2 of 23:  FIG. 2B
Sheet 3 of 23:  FIG. 2C
Sheet 4 of 23:  FIGS. 2D, 4A
Sheet 5 of 23:  FIG. 4B
Sheet 6 of 23:  FIG. 3B
Sheet 7 of 23:  FIGS. 5A, 5B
Sheet 8 of 23:  (no legible figure label)
Sheet 9 of 23:  FIG. 9
Sheet 10 of 23: FIGS. 8A, 8B
Sheet 11 of 23: FIG. 10A
Sheet 12 of 23: FIG. 10B
Sheet 13 of 23: FIG. 11A
Sheet 14 of 23: FIG. 11B
Sheet 15 of 23: FIG. 12
Sheet 16 of 23: FIG. 13
Sheet 17 of 23: FIG. 14A
Sheet 18 of 23: FIG. 14C
Sheet 19 of 23: FIG. 15
Sheet 20 of 23: FIG. 16
Sheet 21 of 23: FIGS. 17A, 17C
Sheet 22 of 23: FIG. 17B
Sheet 23 of 23: FIG. 18
CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING OR OTHER DEVICES

CROSS REFERENCE TO RELATED APPLICATIONS
This application is a continuation of application Ser. No. 12/834,281, filed Jul. 12, 2010 (now U.S. Pat. No. 7,933,431), which is a continuation of application Ser. No. 11/980,710, filed Oct. 31, 2007 (now U.S. Pat. No. 7,756,297), which is a continuation of application Ser. No. 10/893,534, filed Jul. 19, 2004 (now U.S. Pat. No. 7,401,783), which is a continuation of application Ser. No. 09/612,225, filed Jul. 7, 2000 (now U.S. Pat. No. 6,766,036), which claims the benefit of U.S. Provisional Application No. 60/142,777, filed Jul. 8, 1999.

Cross references to related co-pending U.S. applications by the inventor having similar subject matter:

1. Touch TV and Other Man Machine Interfaces: Ser. No. 09/435,854, filed Nov. 8, 1999, now U.S. Pat. No. 7,098,891; which was a continuation of application Ser. No. 07/946,908, now U.S. Pat. No. 5,982,352;

2. More Useful Man Machine Interfaces and Applications: Ser. No. 09/433,297, filed Nov. 3, 1999, now U.S. Pat. No. 6,750,848;

3. Useful Man Machine Interfaces and Applications: Ser. No. 09/138,339, Pub. Appln. 2002-0036617, now abandoned;

4. Vision Target Based Assembly: Ser. No. 08/469,907, filed Jun. 6, 1995, now U.S. Pat. No. 6,301,783;

5. Picture Taking Method and Apparatus: provisional application 60/133,671, and regular application Ser. No. 09/568,552, filed May 11, 2000, now U.S. Pat. No. 7,015,950;

6. Methods and Apparatus for Man Machine Interfaces and Related Activity: provisional application 60/133,673, filed May 11, 1999; and regular application Ser. No. 09/568,554, filed May 11, 2000, now U.S. Pat. No. 6,545,670;

7. Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications: provisional application Ser. No. 60/183,807; and regular application Ser. No. 09/789,538, now U.S. Pat. No. 7,084,859; and

8. Apparel Manufacture and Distance Fashion Shopping in Both Present and Future: provisional application 60/187,397, filed Mar. 7, 2000.

The disclosures of the following U.S. patents and co-pending patent applications by the inventor, or the inventor and his colleagues, are incorporated herein by reference:

1. "Man Machine Interfaces": U.S. application Ser. No. 09/435,854 and U.S. Pat. No. 5,982,352, and U.S. application Ser. No. 08/290,516, filed Aug. 15, 1994, now U.S. Pat. No. 6,008,000, the disclosure of both of which is contained in that of Ser. No. 09/435,854;

2. "Useful Man Machine Interfaces and Applications": U.S. application Ser. No. 09/138,339, now Pub. Appln. 2002-0036617;

3. "More Useful Man Machine Interfaces and Applications": U.S. application Ser. No. 09/433,297, now U.S. Pat. No. 6,750,848;

4. "Methods and Apparatus for Man Machine Interfaces and Related Activity": U.S. Appln. Ser. No. 60/133,673, filed as regular application Ser. No. 09/568,554, now U.S. Pat. No. 6,545,670;

5. "Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications": U.S. provisional Appln. Ser. No. 60/183,807, filed Feb. 22, 2000, now filed as regular application Ser. No. 09/789,538; and

6. "Apparel Manufacture and Distance Fashion Shopping in Both Present and Future": U.S. Appln. Ser. No. 60/187,397, filed Mar. 7, 2000.

FIELD OF THE INVENTION
The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing a human input to a display screen or other object and/or the sensing of human positions or orientations. The invention herein is a continuation in part of several inventions of mine, listed above.

This continuation application seeks to provide further useful embodiments for improving the sensing of objects. Also disclosed are new applications in a variety of fields such as computing, gaming, medicine, and education. Further disclosed are improved systems for display and control purposes.

The invention uses single or multiple TV cameras whose output is analyzed and used as input to a computer, such as a home PC, to typically provide data concerning the location of parts of, or objects held by, a person or persons.

DESCRIPTION OF RELATED ART
The above mentioned co-pending applications incorporated by reference discuss many prior art references in various pertinent fields, which form a background for this invention. Some more specific U.S. patent references are for example:

DeMenthon: U.S. Pat. Nos. 5,388,059; 5,297,061; 5,227,985
Cipolla: U.S. Pat. No. 5,581,276
Pugh: U.S. Pat. No. 4,631,676
Pinckney: U.S. Pat. No. 4,219,847

DESCRIPTION OF FIGURES
FIG. 1 illustrates a basic computer terminal embodiment of the invention, similar to that disclosed in copending applications.

FIG. 2 illustrates object tracking embodiments of the invention employing a pixel addressable camera.

FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums.

FIG. 4 illustrates tracking embodiments of the invention using variation in color to identify and/or track object target datums.

FIG. 5 illustrates special camera designs for determining target position in addition to providing normal color images.

FIG. 6 illustrates identification and tracking with stereo pairs.

FIG. 7 illustrates use of an indicator or co-target.

FIG. 8 illustrates control of functions with the invention, using a handheld device which itself has functions.

FIG. 9 illustrates pointing at an object represented on a screen using a finger or laser pointer, and then manipulating the represented object using the invention.

FIG. 10 illustrates control of automobile or other functions with the invention, using detected knob, switch or slider positions.

FIG. 11 illustrates a board game embodiment of the invention.

FIG. 12 illustrates a generic game embodiment of the invention.

FIG. 13 illustrates a game embodiment of the invention, such as might be played in a bar.
FIG. 14 illustrates a laser pointer or other spot designator embodiment of the invention.
FIG. 15 illustrates a gesture based flirting game embodiment of the invention.

FIG. 16 illustrates a version of the pixel addressing camera technique wherein two lines on either side of a 1000 element square array are designated as perimeter fence lines to initiate tracking or other action.

FIG. 17 illustrates a 3-D acoustic imaging embodiment of the invention.

FIG. 18 illustrates an improved handheld computer embodiment of the invention, in which the camera or cameras may be used to look at objects, screens and the like as well as look at the user.
THE INVENTION EMBODIMENTS

FIG. 1

The invention herein and disclosed in portions of other copending applications noted above, comprehends a combination of one or more TV cameras (or other suitable electro-optical sensors) and a computer to provide various position and orientation related functions of use. It also comprehends the combination of these functions with the basic task of generating, storing and/or transmitting a TV image of the scene acquired, either in two or three dimensions.

The embodiment depicted in FIG. 1A illustrates the basic embodiments of many of my co-pending applications above. A stereo pair of cameras 100 and 101 located on each side of the upper surface of monitor 102 (for example a rear projection TV of 60 inch diagonal screen size) with display screen 103 facing the user, are connected to PC computer 106 (integrated in this case into the monitor housing), for example a 400 MHz Pentium II. For appearances and protection a single extensive cover window may be used to cover both cameras and their associated light sources 110 and 111, typically LEDs.

The LEDs in this application are typically used to illuminate targets associated with any of the fingers, hand, feet and head of the user, or objects such as 131 held by a user 135 with hands 136 and 137, and head 138. These targets, such as circular target 140 and band target 141 on object 131, are desirably, but not necessarily, retro-reflective, and may be constituted by the object features themselves (e.g., a finger tip, such as 145), or by features provided on clothing worn by the user (e.g., a shirt button 147 or polka dot 148), or by artificial targets other than retroreflectors.

Alternatively, a three camera arrangement can be used, for example using additional camera 144, to provide added sensitivity in certain angular and positional relationships. Still more cameras can be used to further improve matters, as desired. Alternatively, and/or in addition, camera 144 can be used for other purposes, such as to acquire images of objects such as persons, for transmission, storage or retrieval independent of the cameras used for datum and feature location determination.

For many applications, a single camera can suffice for measurement purposes as well, such as 160 shown in FIG. 1B for example, used for simple 2 dimensional (2D) measurements in the xy plane perpendicular to the camera axis (z axis), or 3D (xyz, roll pitch yaw) where a target grouping, for example of three targets, is used such as the natural features formed by the two eyes 164, 165 and nose 166 of a human 167. These features are roughly at known distances from each other, the data from which can be used to calculate the approximate position and orientation of the human face.
Using for example the photogrammetric technique of Pinkney described below, the full 6 degree of freedom solution of the human face location and orientation can be achieved to an accuracy limited by the ability of the camera image processing software utilized to determine the centroids or other delineating geometric indicators of the position of the eyes and nose (or some other facial feature such as the mouth), and the accuracy of the initial inputting of the spacing of the eyes and their respective spacing to the nose. Clearly if a standard human value is used (say for adult, or for a child, or even by age) some lessening of precision results, since these spacings are used in the calculation of distance and orientation of the face of human 167 from the camera 160.
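
In outline, the range part of this calculation is a pinhole-camera proportion. The sketch below is illustrative only; the focal length and the 63 mm adult eye spacing are assumed values, and yaw/pitch would additionally use the nose point 166 as described above.

```python
# Illustrative sketch only: pinhole-model range and roll from the two eye
# centroids. f_pixels and the 63 mm adult eye spacing are assumed values,
# not taken from the patent.
import math

def face_range_and_roll(eye_l, eye_r, f_pixels=800.0, eye_spacing_mm=63.0):
    """eye_l, eye_r: (x, y) image centroids of the two eyes, in pixels.
    Returns (range_mm, roll_deg)."""
    dx = eye_r[0] - eye_l[0]
    dy = eye_r[1] - eye_l[1]
    eye_pixels = math.hypot(dx, dy)              # apparent eye spacing, pixels
    # Pinhole proportion: apparent size = f * true size / range.
    range_mm = f_pixels * eye_spacing_mm / eye_pixels
    roll_deg = math.degrees(math.atan2(dy, dx))  # tilt of the line joining the eyes
    return range_mm, roll_deg

# Eyes 164 and 165 imaged 84 pixels apart -> roughly 600 mm from the camera.
print(face_range_and_roll((420, 300), (504, 302)))
```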
In another generally more photogrammetrically accurate case, one might choose to use four special targets (e.g., glass bead retro-reflectors, or orange dots) 180-183 on the object 185 having known positional relationships relative to each other on the object surface, such as one inch centers. This is shown in FIG. 1C, and may be used in conjunction with a pixel addressable camera such as described in FIG. 2 below, which allows one to rapidly determine the object position and orientation and track its movements in up to 6 degrees of freedom, as disclosed by Pinkney U.S. Pat. No. 4,219,847 and technical papers referenced therein. For example, the system described above for FIGS. 1 and 2 involving the photogrammetric resolution of the relative position of three or more known target points as viewed by a camera is known and is described in a paper entitled "A Single Camera Method for the 6-Degree of Freedom Sprung Mass Response of Vehicles Redirected by Cable Barriers" presented by M. C. van Wijk and H. F. L. Pinkney to The Society of Photo-optical Instrumentation Engineers.
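
The Pinkney solution itself is not reproduced here, but the same 6 degree of freedom resolution can be sketched with OpenCV's solvePnP as a modern stand-in; the camera intrinsics and image coordinates below are assumed values.

```python
# Rough modern stand-in for the Pinkney photogrammetric solution (the
# patent's own method is that of U.S. Pat. No. 4,219,847, not reproduced
# here). Camera intrinsics and image coordinates are assumed values.
import numpy as np
import cv2

# Targets 180-183 on one-inch (25.4 mm) centers, in object coordinates (mm).
object_pts = np.array([[0.0, 0.0, 0.0], [25.4, 0.0, 0.0],
                       [25.4, 25.4, 0.0], [0.0, 25.4, 0.0]])
# Their measured image centroids, in pixels (illustrative).
image_pts = np.array([[512.0, 384.0], [589.0, 380.0],
                      [585.0, 310.0], [509.0, 314.0]])
f = 800.0  # assumed focal length, pixels
K = np.array([[f, 0.0, 500.0], [0.0, f, 500.0], [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
if ok:
    print("rotation (Rodrigues):", rvec.ravel())  # 3 of the 6 DOF
    print("translation (mm):   ", tvec.ravel())   # the other 3
```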
The stereo pair of cameras can also acquire a two view stereo image of the scene as well, which can be displayed in 3D using stereoscopic or auto-stereoscopic means, as well as transmitted or recorded as desired.

In many applications of the foregoing invention it is desirable not just to use a large screen but in fact one capable of displaying life size images. This particularly relates to human scaled images, giving a life-like presence to the data on the screen. In this way the natural response of the user with motions of hands, head, arms, etc., is scaled in "real" proportion to the data being presented.
FIG. 2

This embodiment and others disclose special types of cameras useful with the invention. In the first case, that of FIG. 2A, a pixel addressable camera such as the MAPP2200 made by IVP Corporation of Sweden is used, which allows one to do many things useful for rapidly determining location of objects, their orientation and their motion.

For example, as shown in FIG. 2A, an approximately circular image 201 of a target datum such as 180 on object 185 of FIG. 1C may be acquired by scanning the pixel elements on a matrix array 205 on which the image is formed. Such an array in the future will have for example 1000×1000 pixels or more (today the largest IVP makes is 512×512; the IVP also is not believed to be completely randomly addressable, which some future arrays will be).

As an illustration, computer 220 determines, after the array 205 has been interrogated, that the centroid "x, y" of the pixel elements on which the target image lies is at pixel x=500, y=300 (including a sub-fraction thereof in many cases). The centroid location can be determined for example by the moment method disclosed in the Pinkney patent, referenced above.
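
A minimal sketch of such a centroid determination, as the intensity-weighted moment of the interrogated pixels (the Pinkney patent's specific method is not reproduced here):

```python
# Intensity-weighted ("moment method") centroid of above-threshold pixels,
# giving the target image position to sub-pixel resolution.
import numpy as np

def centroid(gray, threshold=128):
    """Return the (x, y) centroid of above-threshold pixels, or None."""
    masked = np.where(gray > threshold, gray.astype(np.float64), 0.0)
    total = masked.sum()
    if total == 0:
        return None
    ys, xs = np.indices(masked.shape)   # row (y) and column (x) indices
    return (xs * masked).sum() / total, (ys * masked).sum() / total

# Example: a small target image centered near pixel x=500, y=300 on array 205.
img = np.zeros((1000, 1000))
img[298:303, 498:503] = 200
print(centroid(img))  # -> (500.0, 300.0)
```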
The target in this case is defined as a contrasting point on the object, and such contrast can be in color as well as, or
instead of, intensity. Or with some added preprocessing, it can be a distinctive pattern on the object, such as a checkerboard or herringbone.
Subsequent Tracking

To subsequently track the movement of this target image, it is now only necessary to look in a small pixel window composed of a small number of pixels around the target, for example the square 230 shown, as the new position x'y' of the target image cannot be further distant within a short period of time elapsed from the first scan, and in consideration of the small required time to scan the window.

For example, if the window is 100×100 pixels, this can be scanned in 1 millisecond or less with such a pixel addressing camera, by interrogating only those pixels in the window, while still communicating with the camera over a relatively slow USB serial link of 12 Mb transmission rate (representing 12,000 pixel gray level values in one millisecond).

One thus avoids the necessity to scan the whole field, once the starting target image position is identified. This can be known by an initial scan as mentioned, or can be known by having the user move an object with a target against a known location with respect to the camera such as a mechanical stop, and then indicate that tracking should start either by verbally saying so with voice recognition, or by actuating a control key such as 238 or whatever.

It is noted that if the tracking window is made large enough, then it can encompass a whole group of datums, such as 180-183 on an object.
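
In outline, such tracking reads only the window, recomputes the target centroid, and re-centers the window on it; the sketch below assumes a hypothetical read_window function standing in for the pixel-addressable camera interface.

```python
# Sketch of the windowed tracking loop. `read_window` is a hypothetical
# stand-in for the pixel-addressable camera interface (e.g., the MAPP2200
# would be driven through its own API); it returns a 2-D array of gray
# levels for the requested region.
import numpy as np

def track_target(read_window, start_xy, half=50, threshold=128, steps=1000):
    """Yield successive (x, y) target positions from windowed scans only."""
    x, y = start_xy
    for _ in range(steps):
        win = read_window(x - half, y - half, 2 * half, 2 * half)
        ys, xs = np.nonzero(win > threshold)   # pixels on the target image
        if xs.size == 0:
            return                             # target lost: rescan the field
        x = int(x - half + xs.mean())          # window-relative -> absolute
        y = int(y - half + ys.mean())
        yield x, y
```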
FIG. 2B Reduction in Acquisition Time

Another application of such a pixel addressing camera is shown in FIG. 2B. One can look at the whole field, x y of the camera, 240, but only address say every 10th pixel such as 250, 251 and 252, in each direction, i.e., for a total of 10,000 pixels in a field of 1 million (1000×1000, say).

In this case computer 220 simply queries this fraction of the pixels in the image, knowing a priori that the target image such as 260 will have an image size larger than 10×10 pixels, and must be detectable, if of sufficient contrast, by one of the queried pixels. (For smaller or larger target images, the number and spacing of queried pixels can be adjusted accordingly.) This for example, allows one to find the approximate location of targets with only 1/100 the pixel interrogation time otherwise needed, for example, plus any gain obtained as disclosed above, by knowing in what region of the image to look (for example during tracking, or given some a priori knowledge of approximate location due to a particular aspect of the physical arrangement or the program in question).

Once a target has been approximately found as just described, the addressing can be optimized for that region of the image only, as disclosed in the subsequent tracking section above.
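
A sketch of this coarse acquisition scan, under the same hypothetical read_pixel interface: every 10th pixel of a 1000×1000 field is queried, 10,000 reads instead of 1,000,000.

```python
# Coarse search of FIG. 2B: query every `step`th pixel of the field.
# `read_pixel(x, y)` is a hypothetical pixel-addressable camera call
# returning one gray level.
def find_target_coarse(read_pixel, width=1000, height=1000,
                       step=10, threshold=128):
    """Return the first grid point whose gray level exceeds threshold,
    or None. Valid when the target image is larger than step x step."""
    for y in range(0, height, step):
        for x in range(0, width, step):
            if read_pixel(x, y) > threshold:
                return x, y   # then refine with a local window scan
    return None
```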
Given the invention, the potential for target acquisition in a millisecond or two thus is achievable with simple pixel addressable CMOS cameras coming on stream now (today costing under $50), assuming the target points are easily identifiable from at least one of brightness (over a value), contrast (with respect to surroundings), color, color contrast, and, more difficult, shape or pattern (e.g., a plaid, or herringbone portion of a shirt). This has major ramifications for the robustness of control systems built on such camera based acquisition, be they for controlling displays, or machines or whatever.

It's noted that with new 2000×2000 cameras coming on stream, it may only be necessary to look at every 15th or 20th pixel in each direction to get an adequate feel for target location. This means every 200th to 400th pixel, not enough to cause image rendition difficulties even if totally dark grey (as it might be in a normal white light image if set up for IR wavelengths only).
FIG. 2C

Another method for finding the target in the first place with limited pixel interrogation is to look at pixels near a home point where a person for example indicates that the target is. This could be, for example, placing one's fingernail such as 270, whose natural or artificial (e.g., reflective nail polish) features are readily seen by the camera 275 and determined to be in the right corner of a pad 271 in FIG. 2C which approximately covers the field of view 274 of the camera 275. The computer 220 analyzes the pixels in the right corner 278 of the image field 279 representing the pad portion 271 with the camera 275, either continuously, or only when the finger for example hits a switch such as 280 at the edge of the pad, or on command (e.g., by the user pushing a button or key, or a voice message inputted via microphone 285 for example). After such acquisition, the target is then tracked to other locations in xy space of the pad, for example as described above. It's noted that it helps to provide a beep or other sound or indication when acquisition has been made.
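
This home-point acquisition can be sketched as polling only the corner window until a target appears, then handing off to the tracker above; read_window is again a hypothetical camera interface, and the corner coordinates are assumed values.

```python
# Sketch of the FIG. 2C acquisition: poll only the image corner 278 that
# views pad corner 271, and report the target (e.g., fingernail 270) once
# seen there. `read_window` is the same hypothetical camera interface.
import numpy as np

def acquire_at_home_point(read_window, corner=(900, 900), size=100,
                          threshold=128, max_polls=10000):
    """Return target (x, y) when it appears in the home window, else None."""
    x0, y0 = corner
    for _ in range(max_polls):
        win = read_window(x0, y0, size, size)
        ys, xs = np.nonzero(win > threshold)
        if xs.size:
            # Beep or otherwise acknowledge acquisition here, then hand
            # off to the windowed tracker sketched earlier.
            return x0 + int(xs.mean()), y0 + int(ys.mean())
    return None
```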
Pick Windows in Real Time

Another aspect of the invention is that one can also pick the area of the image to interrogate at any desired moment. This can be done by creating a window of pixels within the field to generate information, for example as discussed relative to a specific car dashboard application of FIG. 10.
FIG. 2D Scan Pattern

A pixel addressing camera also allows a computer such as 220 to cause scans to be generated which are not typical raster scans, for example circular or radial, or even odd shapes as desired. This can be done by providing from the computer the sequential addresses of the successive pixels on the camera chip whose detected voltages are to be queried.

A circular scan of pixels addressed at high speed can be used to identify when and where a target enters a field enclosed by the circular pixel scan. This is highly useful, and after that, the approximate location of the target can be determined by further scans of pixels in the target region.

For example consider addressing the pixels c1, c2, c3 ... cn representing a circle 282 at the outer perimeter of the array, 285, of 1000×1000 elements such as discussed above. The number of pixels in a full circle is approximately 1000 pi, which can be scanned even with USB (universal serial bus) limits at 300 times per second or better. For targets of 1/100 field in width, this means that a target image entering the field such as circular target image 289 (which is shown intersecting element cm and its neighbors) would have to travel 1/100 the field width in 0.0033 seconds to be totally missed in a worst case. If the image field corresponds to 20 inches in object field width this is 0.2 inches × 300/sec or 60 inches/second, very fast for human movement, and not likely to be exceeded even where smaller targets are used.

Alternative shapes to circular "trip wire" perimeters may be used, such as squares, zig-zags, or other layouts of pixels to determine target presence. Once determined, a group of pixels such as group 292 can be interrogated to get a better determination of target location.
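
A sketch of such a perimeter scan: the circle's pixel addresses are computed once, then swept repeatedly, and a hit triggers local interrogation; read_pixel is again a hypothetical camera interface.

```python
# Circular "trip wire" of FIG. 2D: interrogate only the ~1000*pi perimeter
# pixels of circle 282 and report where a target image crosses the circle.
import math

def perimeter_circle(cx=500, cy=500, r=499, n=3142):
    """Pixel addresses c1..cn approximating the circle (computed once)."""
    pts = [(int(cx + r * math.cos(2 * math.pi * i / n)),
            int(cy + r * math.sin(2 * math.pi * i / n))) for i in range(n)]
    return list(dict.fromkeys(pts))   # drop duplicate addresses, keep order

def watch_trip_wire(read_pixel, circle, threshold=128):
    """One sweep of the trip wire; return the crossing point or None."""
    for (x, y) in circle:
        if read_pixel(x, y) > threshold:
            return x, y   # then interrogate a pixel group around (x, y)
    return None
```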
FIG. 3

Since many applications of the invention concern, or at least have present, a human caused motion, or motion of a part of a human, or an object moved by a human, the identification and tracking problem can be simplified if the features of interest, either natural or artificial, of the object provide some kind of change in appearance during such motion.
FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums. In a simple case, a subtraction of successive images can aid in identifying zones in an image having movement of features, as is well known. It is also useful to add pixel intensities of successive images in computer 220 for example. This is particularly true with bright targets (with respect to their usual surroundings) such as LEDs or retro-reflectors. If the pixels in use by the camera are able to gather light preferentially at the same time a special illumination light is on, this will accentuate the target with respect to background. And if successive frames are taken in this way, not only will a stationary image of the special target build up, but if movement takes place the target image then will blur in a particular direction which itself can become identifiable. And the blur direction indicates direction of motion as well, at least in the 2-D plane of the pixel array used.
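
Both ideas, subtraction of successive frames to localize movement and addition of frames to build up (and orient) a blurred target image, can be sketched as follows; the threshold values are illustrative.

```python
# Frame subtraction to localize moving features, and frame addition whose
# summed bright blob's principal axis gives the blur (motion) direction.
import numpy as np

def motion_mask(frame_prev, frame_next, threshold=30):
    """Boolean mask of pixels whose gray level changed between frames."""
    diff = np.abs(frame_next.astype(np.int16) - frame_prev.astype(np.int16))
    return diff > threshold

def blur_direction_deg(frames, threshold=128):
    """Sum frames, then return the principal-axis angle (degrees) of the
    bright blob via second-order image moments, or None if no blob."""
    total = np.sum([f.astype(np.float64) for f in frames], axis=0)
    ys, xs = np.nonzero(total > threshold * len(frames))
    if xs.size == 0:
        return None
    xs = xs - xs.mean()
    ys = ys - ys.mean()
    # Orientation of the point cloud: theta = 0.5 * atan2(2*mu11, mu20 - mu02).
    angle = 0.5 * np.arctan2(2.0 * (xs * ys).mean(),
                             (xs ** 2).mean() - (ys ** 2).mean())
    return float(np.degrees(angle))
```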
Another form of movement can take place artificially, where the target is purposely moved to provide an indication of its presence. This movement can be done by a human easily by just dithering one's finger for example (if a portion of the finger such as the tip is the target in question), or by vibrating an object having target features of interest on it, for example by moving the object up and down with one's hand.

For example consider FIG. 3A, where a human 301 moves his finger 302 in a rapid up and down motion, creating diffe