Case 2:21-cv-00040-JRG Document 72-1 Filed 09/08/21 Page 1 of 38 PageID #: 1587
EXHIBIT F
US007401783B2

(12) United States Patent — Pryor
(10) Patent No.: US 7,401,783 B2
(45) Date of Patent: Jul. 22, 2008

(54) CAMERA BASED MAN MACHINE INTERFACES

(76) Inventor: Timothy R. Pryor, 416 Old Tecumseh Road, Tecumseh, Ontario (CA) N8N 3S8

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 10/893,534

(22) Filed: Jul. 19, 2004

(65) Prior Publication Data: US 2005/0129273 A1, Jun. 16, 2005

Related U.S. Application Data

(63) Continuation of application No. 09/612,225, filed on Jul. 7, 2000, now Pat. No. 6,766,036.

(60) Provisional application No. 60/142,777, filed on Jul. 8, 1999.

(51) Int. Cl.: A63F 3/00 (2006.01)

(52) U.S. Cl.: 273/237; 463/1; 273/274

(58) Field of Classification Search: 273/237, 454, 274, 309; 463/1, 40, 42, 43. See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

3,909,002 A * 9/1975 Levy .......... 463/26
4,339,798 A * 7/1982 Hedges et al. .......... 463/26
5,088,928 A 2/1992 Chan
5,781,647 A * 7/1998 Fishbine et al. .......... 382/100
5,853,327 A * 12/1998 Gilboa .......... 463/39
6,508,709 B1 * 1/2003 Karmarkar .......... 463/42

* cited by examiner

Primary Examiner—Vishu K. Mendiratta
(74) Attorney, Agent, or Firm—Stites & Harbison PLLC; Douglas E. Jackson

(57) ABSTRACT

Disclosed herein are new forms of computer inputs particularly using TV Cameras, and providing affordable methods and apparatus for data communication with respect to people and computers using optically inputted information from specialized datums on objects and/or natural features of objects. Particular embodiments capable of fast and reliable acquisition of features and tracking of motion are disclosed, together with numerous applications in various fields of endeavor.

17 Claims, 22 Drawing Sheets

[Front-page drawing: reference numerals 450, 445, 302, 301]
[Drawing Sheet 1 of 22: FIGS. 1A, 1B, and 1C]
[Drawing Sheet 2 of 22: FIGS. 2A and 2B]
[Drawing Sheet 3 of 22: FIG. 2C]
[Drawing Sheet 4 of 22: FIGS. 2D and 4A]
[Drawing Sheet 5 of 22: FIG. 4B]
[Drawing Sheet 6 of 22: FIG. 3B]
[Drawing Sheet 7 of 22: FIG. 5B]
[Drawing Sheet 8 of 22: FIG. 6]
[Drawing Sheet 9 of 22: FIGS. 7 and 9]
[Drawing Sheet 10 of 22: FIGS. 8A and 8B]
[Drawing Sheet 11 of 22: FIG. 10A]
[Drawing Sheet 12 of 22: FIG. 10B]
[Drawing Sheet 13 of 22: FIG. 11A]
[Drawing Sheet 14 of 22: FIG. 11B]
[Drawing Sheet 15 of 22: FIG. 12]
[Drawing Sheet 16 of 22: FIG. 13]
[Drawing Sheet 17 of 22: FIG. 14A]
[Drawing Sheet 18 of 22: FIG. 14C]
[Drawing Sheet 19 of 22: FIG. 15]
[Drawing Sheet 20 of 22: FIG. 16]
[Drawing Sheet 21 of 22: FIG. 17A]
[Drawing Sheet 22 of 22: FIG. 17B]
CAMERA BASED MAN MACHINE INTERFACES

CROSS REFERENCE TO RELATED APPLICATIONS
This application is a continuation of application Ser. No. 09/612,225, filed Jul. 7, 2000, now U.S. Pat. No. 6,766,036; which claims the benefit of U.S. Provisional Application No. 60/142,777, filed Jul. 8, 1999.

Cross references to related co-pending applications by the inventor having similar subject matter:

1. Touch TV and other Man Machine Interfaces (Ser. No. 09/435,854, which was a continuation of application Ser. No. 07/946,908, now U.S. Pat. No. 5,982,352);
2. More Useful Man Machine Interfaces and applications, Ser. No. 09/433,297;
3. Useful Man Machine interfaces and applications, Ser. No. 09/138,339, now Pub. Appln. 2002-0036617;
4. Vision Target based assembly, U.S. Ser. No. 08/469,907, now U.S. Pat. No. 6,301,783;
5. Picture Taking method and apparatus, U.S. provisional application No. 60/133,671, now filed as regular application Ser. No. 09/586,552;
6. Methods and Apparatus for Man Machine Interfaces and Related Activity, U.S. Provisional Application No. 60/133,673, filed as regular application Ser. No. 09/568,554, now U.S. Pat. No. 6,545,670;
7. Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications, provisional application Ser. No. 60/183,807, filed as reg. application Ser. No. 09/789,538; and
8. Apparel Manufacture and Distance Fashion Shopping in Both Present and Future, Provisional application No. 60/187,397.

The disclosures of the following U.S. patents and co-pending patent applications by the inventor, or the inventor and his colleagues, are incorporated herein by reference:

1. "Man machine Interfaces", U.S. application Ser. No. 09/435,854 and U.S. Pat. No. 5,982,352, and U.S. application Ser. No. 08/290,516, filed Aug. 15, 1994, now U.S. Pat. No. 6,008,000, the disclosure of both of which is contained in that of Ser. No. 09/435,854;
2. "Useful Man Machine Interfaces and Applications", U.S. application Ser. No. 09/138,339, now Pub. Appln. 2002-0036617;
3. "More Useful Man Machine Interfaces and Applications", U.S. application Ser. No. 09/433,297;
4. "Methods and Apparatus for Man Machine Interfaces and Related Activity", U.S. application Ser. No. 60/133,673, filed as regular application Ser. No. 09/568,554, now U.S. Pat. No. 6,545,670;
5. "Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications", U.S. provisional Appln. Ser. No. 60/183,807, filed Feb. 22, 2000, now filed as reg. application Ser. No. 09/789,538; and
6. "Apparel Manufacture and Distance Fashion Shopping in Both Present and Future", U.S. application Ser. No. 60/187,397, filed Mar. 7, 2000.
FIELD OF THE INVENTION

The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing a human input to a display screen or other object and/or the sensing of human positions or orientations. The invention herein is a continuation in part of several inventions of mine, listed above.

This continuation application seeks to provide further useful embodiments for improving the sensing of objects. Also disclosed are new applications in a variety of fields such as computing, gaming, medicine, and education. Further disclosed are improved systems for display and control purposes.

The invention uses single or multiple TV cameras whose output is analyzed and used as input to a computer, such as a home PC, to typically provide data concerning the location of parts of, or objects held by, a person or persons.
DESCRIPTION OF RELATED ART

The above mentioned co-pending applications incorporated by reference discuss many prior art references in various pertinent fields, which form a background for this invention. Some more specific U.S. Patent references are for example:

DeMenthon—U.S. Pat. Nos. 5,388,059; 5,297,061; 5,227,985
Cipolla—U.S. Pat. No. 5,581,276
Pugh—U.S. Pat. No. 4,631,676
Pinckney—U.S. Pat. No. 4,219,847
DESCRIPTION OF FIGURES

FIG. 1 illustrates a basic computer terminal embodiment of the invention, similar to that disclosed in copending applications.

FIG. 2 illustrates object tracking embodiments of the invention employing a pixel addressable camera.

FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums.

FIG. 4 illustrates tracking embodiments of the invention using variation in color to identify and/or track object target datums.

FIG. 5 illustrates special camera designs for determining target position in addition to providing normal color images.

FIG. 6 illustrates identification and tracking with stereo pairs.

FIG. 7 illustrates use of an indicator or co-target.

FIG. 8 illustrates control of functions with the invention, using a handheld device which itself has functions.

FIG. 9 illustrates pointing at an object represented on a screen using a finger or laser pointer, and then manipulating the represented object using the invention.

FIG. 10 illustrates control of automobile or other functions with the invention, using detected knob, switch or slider positions.

FIG. 11 illustrates a board game embodiment of the invention.

FIG. 12 illustrates a generic game embodiment of the invention.

FIG. 13 illustrates a game embodiment of the invention, such as might be played in a bar.

FIG. 14 illustrates a laser pointer or other spot designator embodiment of the invention.

FIG. 15 illustrates a gesture based flirting game embodiment of the invention.

FIG. 16 illustrates a version of the pixel addressing camera technique wherein two lines on either side of a 1000 element square array are designated as perimeter fence lines to initiate tracking or other action.

FIG. 17 illustrates a 3-D acoustic imaging embodiment of the invention.
THE INVENTION EMBODIMENTS

FIG. 1
The invention herein and disclosed in portions of other copending applications noted above, comprehends a combination of one or more TV cameras (or other suitable electro-optical sensors) and a computer to provide various position and orientation related functions of use. It also comprehends the combination of these functions with the basic task of generating, storing and/or transmitting a TV image of the scene acquired—either in two or three dimensions.

The embodiment depicted in FIG. 1A illustrates the basic embodiments of many of my co-pending applications above. A stereo pair of cameras 100 and 101 located on each side of the upper surface of monitor 102 (for example a rear projection TV of 60 inch diagonal screen size) with display screen 103 facing the user, are connected to PC computer 106 (integrated in this case into the monitor housing), for example a 400 MHz Pentium II. For appearances and protection a single extensive cover window may be used to cover both cameras and their associated light sources 110 and 111, typically LEDs.

The LEDs in this application are typically used to illuminate targets associated with any of the fingers, hand, feet and head of the user, or objects such as 131 held by a user 135 with hands 136 and 137, and head 138. These targets, such as circular target 140 and band target 141 on object 131, are desirably, but not necessarily, retro-reflective, and may be constituted by the object features themselves (e.g., a finger tip, such as 145), or by features provided on clothing worn by the user (e.g., a shirt button 147 or polka dot 148), or by artificial targets other than retroreflectors.

Alternatively, a three camera arrangement can be used, for example using additional camera 144, to provide added sensitivity in certain angular and positional relationships. Still more cameras can be used to further improve matters, as desired. Alternatively, and or in addition, camera 144 can be used for other purposes, such as to acquire images of objects such as persons, for transmission, storage or retrieval independent of the cameras used for datum and feature location determination.

For many applications, a single camera can suffice for measurement purposes as well, such as 160 shown in FIG. 1B for example, used for simple 2 dimensional (2D) measurements in the xy plane perpendicular to the camera axis (z axis), or 3D (xyz, roll pitch yaw) where a target grouping, for example of three targets, is used such as the natural features formed by the two eyes 164, 165 and nose 166 of a human 167. These features are at roughly known distances from each other, the data from which can be used to calculate the approximate position and orientation of the human face.

Using for example the photogrammetric technique of Pinkney described below, the full 6 degree of freedom solution of the human face location and orientation can be achieved to an accuracy limited by the ability of the camera image processing software utilized to determine the centroids or other delineating geometric indicators of the position of the eyes and nose (or some other facial feature such as the mouth), and the accuracy of the initial inputting of the spacing of the eyes and their respective spacing to the nose. Clearly if a standard human value is used (say for adult, or for a child, or even by age) some lessening of precision results, since these spacings are used in the calculation of distance and orientation of the face of human 167 from the camera 160.

In another generally more photogrammetrically accurate case, one might choose to use four special targets (e.g., glass bead retro-reflectors, or orange dots) 180-183 on the object 185 having known positional relationships relative to each other on the object surface, such as one inch centers. This is shown in FIG. 1C, and may be used in conjunction with a pixel addressable camera such as described in FIG. 2 below, which allows one to rapidly determine the object position and orientation and track its movements in up to 6 degrees of freedom as disclosed by Pinkney U.S. Pat. No. 4,219,847 and technical papers referenced therein. For example, the system described above for FIGS. 1 and 2 involving the photogrammetric resolution of the relative position of three or more known target points as viewed by a camera is known and is described in a paper entitled "A Single Camera Method for the 6-Degree of Freedom Sprung Mass Response of Vehicles Redirected by Cable Barriers" presented by M. C. van Wijk and H. F. L. Pinkney to The Society of Photo-optical Instrumentation Engineers.

The stereo pair of cameras can also acquire a two view image of the scene as well, which can be displayed in 3D using stereoscopic or auto-stereoscopic means, as well as transmitted or recorded as desired.

In many applications of the foregoing invention it is desirable not just to use a large screen but in fact one capable of displaying life size images. This particularly relates to human scaled images, giving a life-like presence to the data on the screen. In this way the natural response of the user with motions of hands, head, arms, etc., is scaled in "real" proportion to the data being presented.
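The single-camera face measurement above rests on the pinhole-camera similar-triangles relationship between a feature's known physical spacing and its imaged spacing. The sketch below shows only that relationship, not the full 6-degree-of-freedom Pinkney solution; the 700-pixel focal length and the 63 mm eye spacing are assumed illustrative values.

```python
import math

def face_distance(eye_px: float, focal_px: float, eye_mm: float = 63.0) -> float:
    """Estimate camera-to-face distance (mm) from the imaged eye spacing.

    Pinhole model: imaged_size / focal_length = real_size / distance,
    so distance = focal_px * eye_mm / eye_px. Using a standard adult
    eye spacing for a child introduces error, as the text notes.
    """
    return focal_px * eye_mm / eye_px

def yaw_from_eye_nose(left_eye, right_eye, nose) -> float:
    """Rough yaw estimate from how far the nose image sits off the
    eye midline, normalized by the eye spacing (0.0 = frontal face)."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    eye_dx = right_eye[0] - left_eye[0]
    return math.atan2(nose[0] - mid_x, abs(eye_dx))

# A face 600 mm away imaged with a 700 px focal length shows a 63 mm
# eye spacing as 700 * 63 / 600 = 73.5 px; inverting recovers 600 mm.
d = face_distance(eye_px=73.5, focal_px=700.0)
```

The same proportionality is what degrades gracefully when only an approximate standard spacing is known: the range error scales linearly with the error in the assumed spacing.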
FIG. 2

This embodiment and others disclose special types of cameras useful with the invention. In the first case, that of FIG. 2A, a pixel addressable camera such as the MAPP2200 made by IVP corporation of Sweden is used, which allows one to do many things useful for rapidly determining location of objects, their orientation and their motion.

For example, as shown in FIG. 2A, an approximately circular image 201 of a target datum such as 180 on object 185 of FIG. 1C may be acquired by scanning the pixel elements on a matrix array 205 on which the image is formed. Such an array in the future will have for example 1000x1000 pixels, or more (today the largest IVP makes is 512x512; the IVP also is not believed to be completely randomly addressable, which some future arrays will be).

As an illustration, computer 220 determines, after the array 205 has been interrogated, that the centroid "x, y" of the pixel elements on which the target image lies is at pixel x=500, y=300 (including a sub-fraction thereof in many cases). The centroid location can be determined for example by the moment method disclosed in the Pinkney patent, referenced above.

The target in this case is defined as a contrasting point on the object, and such contrast can be in color as well as, or instead of, intensity. Or with some added preprocessing, it can be a distinctive pattern on the object, such as a checkerboard or herringbone.
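A gray-level moment centroid of the kind referred to above can be sketched as follows. This is a generic first-moment/zeroth-moment computation, not the specific Pinkney algorithm; the (x, y, gray) sample format standing in for the interrogated pixels is an assumption.

```python
def centroid(pixels):
    """Intensity-weighted centroid of a target image from (x, y, gray)
    samples read out of the addressed window. Sub-pixel results fall out
    naturally, matching the "sub-fraction thereof" noted in the text."""
    m00 = m10 = m01 = 0.0          # zeroth and first moments
    for x, y, g in pixels:
        m00 += g
        m10 += g * x
        m01 += g * y
    if m00 == 0:
        raise ValueError("no target energy in the interrogated pixels")
    return m10 / m00, m01 / m00

# A uniform 2x2 bright blob straddling pixels 500/501 and 300/301
# centroids to the half-pixel position (500.5, 300.5).
blob = [(500, 300, 100), (501, 300, 100), (500, 301, 100), (501, 301, 100)]
cx, cy = centroid(blob)
```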
Subsequent Tracking

To subsequently track the movement of this target image, it is now only necessary to look in a small pixel window composed of a small number of pixels around the target, for example the square 230 shown, as the new position x'y' of the target image cannot be further distant within a short period of time elapsed from the first scan, and in consideration of the small required time to scan the window.

For example, if the window is 100x100 pixels, this can be scanned in 1 millisecond or less with such a pixel addressing camera, by interrogating only those pixels in the window, while still communicating with the camera over a relatively slow USB serial link of 12 mb transmission rate (representing 12,000 pixel gray level values in one millisecond).

One thus avoids the necessity to scan the whole field, once the starting target image position is identified. This can be known by an initial scan as mentioned, or can be known by having the user move an object with a target against a known location with respect to the camera such as a mechanical stop, and then indicate that tracking should start either by verbally saying so with voice recognition, or by actuating a control key such as 238 or whatever.

It is noted that if the tracking window is made large enough, then it can encompass a whole group of datums, such as 180-183 on an object.
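The window-following loop described above can be sketched as below. The `read_pixel(x, y)` callable is a hypothetical stand-in for the pixel-addressable camera's per-pixel query, and the window half-size, array size, and threshold are assumed values.

```python
def track_window(read_pixel, last_xy, half=50, size=1000, threshold=128):
    """Interrogate only a (2*half)x(2*half) window around the target's
    last known position, then re-center on the bright pixels found.
    Returns the new (x, y), or None if the target left the window
    (in which case a full-field re-acquisition would be needed)."""
    x0, y0 = last_xy
    hits = []
    for y in range(max(0, y0 - half), min(size, y0 + half)):
        for x in range(max(0, x0 - half), min(size, x0 + half)):
            g = read_pixel(x, y)
            if g >= threshold:
                hits.append((x, y, g))
    if not hits:
        return None
    m00 = sum(g for _, _, g in hits)
    cx = sum(x * g for x, _, g in hits) / m00
    cy = sum(y * g for _, y, g in hits) / m00
    return int(round(cx)), int(round(cy))
```

With a 100x100 window this queries 10,000 pixels per update, the figure the text equates to about 1 millisecond over the 12 mb USB link.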
FIG. 2B—Reduction in Acquisition Time

Another application of such a pixel addressing camera is shown in FIG. 2B. One can look at the whole field, x y of the camera, 240, but only address say every 10th pixel such as 250, 251 and 252, in each direction, i.e., for a total 10,000 pixels in a field of 1 million (1000x1000, say).

In this case computer 220 simply queries this fraction of the pixels in the image, knowing apriori that the target image such as 260 will have an image size larger than 10x10 pixels, and must be detectable, if of sufficient contrast, by one of the queried pixels. (For smaller or larger target images, the number and spacing of queried pixels can be adjusted accordingly.) This for example, allows one to find approximate location of targets with only 1/100 the pixel interrogation time otherwise needed, for example, plus any gain obtained as disclosed above, by knowing in what region of the image to look (for example during tracking, or given some apriori knowledge of approximate location due to a particular aspect of the physical arrangement or the program in question).

Once a target has been approximately found as just described, the addressing can be optimized for that region of the image only, as disclosed in the subsequent tracking section above.

Given the invention, the potential for target acquisition in a millisecond or two thus is achievable with simple pixel addressable CMOS cameras coming on stream now (today costing under $50), assuming the target points are easily identifiable from at least one of brightness (over a value), contrast (with respect to surroundings), color, color contrast, or, more difficult, a shape or pattern (e.g., a plaid, or herringbone portion of a shirt). This has major ramifications for the robustness of control systems built on such camera based acquisition, be they for controlling displays, or machines or whatever.

It's noted that with new 2000x2000 cameras coming on stream, it may only be necessary to look at every 15th or 20th pixel in each direction to get an adequate feel for target location. This means every 200th to 400th pixel, not enough to cause image rendition difficulties even if totally dark grey (as it might be in a normal white light image if set up for IR wavelengths only).
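The every-Nth-pixel acquisition scan can be sketched as follows; again `read_pixel` is a hypothetical per-pixel camera query, and the step and threshold are the illustrative values from the text.

```python
def coarse_acquire(read_pixel, size=1000, step=10, threshold=128):
    """Query only every `step`-th pixel in each direction — 10,000 of a
    million pixels for step=10. A target wider than `step` pixels must
    land on at least one queried sample if it has sufficient contrast.
    Returns an approximate (x, y) hit to refine with a local window,
    or None if nothing bright was sampled."""
    for y in range(0, size, step):
        for x in range(0, size, step):
            if read_pixel(x, y) >= threshold:
                return x, y
    return None
```

The 1/100 interrogation-time saving is exactly the sampling ratio: (size/step)^2 queries instead of size^2.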
FIG. 2C

Another method for finding the target in the first place with limited pixel interrogation is to look at pixels near a home point where a person for example indicates that the target is. This could be for example, placing ones fingernail such as 270, whose natural or artificial (e.g., reflective nail polish) features are readily seen by the camera 275 and determined to be in the right corner of a pad 271 in FIG. 2C which approximately covers the field of view 274 of the camera 275. The computer 220 analyzes the pixels in the right corner 278 of the image field 279 representing the pad portion 271 with the camera 275, either continuously, or only when the finger for example hits a switch such as 280 at the edge of the pad, or on command (e.g., by the user pushing a button or key, or a voice message inputted via microphone 285 for example). After such acquisition, the target is then tracked to other locations in xy space of the pad, for example as described above. It's noted that it helps to provide a beep or other sound or indication when acquisition has been made.

Pick Windows in Real Time

Another aspect of the invention is that one can also pick the area of the image to interrogate at any desired moment. This can be done by creating a window of pixels within the field to generate information, for example as discussed relative to a specific car dashboard application of FIG. 10.
FIG. 2D—Scan Pattern

A pixel addressing camera also allows a computer such as 220 to cause scans to be generated which are not typical raster scans. For example circular or radial, or even odd shapes as desired. This can be done by providing from the computer the sequential addresses of the successive pixels on the camera chip whose detected voltages are to be queried.

A circular scan of pixels addressed at high speed can be used to identify when and where a target enters a field enclosed by the circular pixel scan. This is highly useful, and after that, the approximate location of the target can be determined by further scans of pixels in the target region.

For example consider addressing the pixels c1 c2 c3 . . . cn representing a circle 282 at the outer perimeter of the array, 285, of 1000x1000 elements such as discussed above. The number of pixels in a full circle is approximately 1000 pi, which can be scanned even with USB (universal serial bus) limits at 300 times per second or better. For targets of 1/100 field in width, this means that a target image entering the field such as circular target image 289 (which is shown intersecting element cm and its neighbors) would have to travel 1/100 the field width in 0.0033 seconds to be totally missed in a worst case. If the image field corresponds to 20 inches in object field width this is 0.2 inches x 300/sec or 60 inches/second, very fast for human movement, and not likely to be exceeded even where smaller targets are used.

Alternative shapes to circular "trip wire" perimeters may be used, such as squares, zig-zag, or other layouts of pixels to determine target presence. Once determined, a group of pixels such as group 292 can be interrogated to get a better determination of target location.
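The worst-case "trip wire" figure above follows directly from the scan rate and the target size. As a check on the arithmetic, using the numbers from the text (1000x1000 array, perimeter circle of about 1000 pi pixels, 300 perimeter scans per second, target 1/100 of a 20-inch object field):

```python
import math

n_perimeter = math.pi * 1000.0        # ~3142 pixels in the perimeter circle
scans_per_s = 300.0                   # perimeter scans per second over USB
period_s = 1.0 / scans_per_s          # ~0.0033 s between successive scans

field_in = 20.0                       # object-field width, inches
target_in = field_in / 100.0          # 0.2 inch target width (1/100 of field)

# To cross the one-pixel-wide perimeter completely between two scans,
# the target must travel its own width within one scan period:
miss_speed = target_in / period_s     # inches/second needed to slip through
```

The result, 60 inches/second, is the speed the text calls "very fast for human movement"; smaller targets lower the threshold proportionally.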
FIG. 3

Since many applications of the invention concern, or at least have present, a human caused motion, or motion of a part of a human, or an object moved by a human, the identification and tracking problem can be simplified if the features of interest, either natural or artificial, of the object provide some kind of change in appearance during such motion.

FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums. In a simple case, a subtraction of successive images can aid in identifying zones in an image having movement of features as is well known. It is also useful to add pixel intensities of successive images in computer 220 for example. This is particularly true with bright targets (with respect to their usual surroundings) such as LEDs or retro-reflectors. If the pixels in use by the camera are able to gather light preferentially at the same time a special illumination light is on, this will accentuate the target with respect to background. And if successive frames are taken in this way, not only will a stationary image of the special target build up, but if movement takes place the target image then will blur in a particular direction which itself can become identifiable. And the blur direction indicates direction of motion as well, at least in the 2-D plane of the pixel array used.
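The successive-image subtraction and addition described here can be sketched with plain nested lists standing in for frames (no camera specifics assumed):

```python
def frame_diff(prev, curr):
    """Absolute per-pixel difference of successive frames: moving
    features light up while static background cancels out."""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def frame_sum(frames):
    """Per-pixel sum of frames. Summing frames captured while the
    illumination LED is on builds up a stationary bright target against
    the background; a moving target instead smears into a directional
    blur, whose direction indicates the motion."""
    acc = [[0] * len(frames[0][0]) for _ in frames[0]]
    for f in frames:
        for y, row in enumerate(f):
            for x, g in enumerate(row):
                acc[y][x] += g
    return acc
```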
`Another form of movement can take place artificially,
`movedto provide
`an indication
`wherethe target is purposely
`ofits presence. This movementcan be done by ahuman easily
`ones
`a
`finger for example (if
`portion of the
`by just dithering
`or
`suchas thetip is the target in question),
`finger
`by vibrating
`object having target features of interest on
`an
`it, for example
`by moving the object up and down with ones hand.
For example consider FIG. 3A, where a human 301 moves
his finger 302 in a rapid up and down motion, creating differ-
ent image positions sequentially in time of bright target ring
320, 320' on his finger, as seen by camera 325. If the camera
can read quickly enough, each of these positions such as 326
and 327 in image field 328 can be resolved; otherwise a blur
image such as 330 is registered on the camera and recorded in
the computer 335.
Instead of using one's finger, it is also possible to create
movement of a target for example with a tuning fork or other
mechanism mechanically energizing the target movement, on
what otherwise might be a static object say. And it is possible
for the human, or a computer controlling the movement in
question, to create it in such a manner that it aids identifica-
tion. For example, a certain number of moves of one's finger
(e.g., 4), or 2 moves/sec of one's finger, or horizontal moves of
one's finger etc., any or all of these could indicate to the
computer, upon analysis of the camera image, that a target was
present.
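A plausible sketch of recognizing such a deliberate movement pattern, assuming the fingertip image has already been reduced to one tracked coordinate per frame (the track values, motion threshold, and `count_moves` helper are hypothetical, not part of the disclosure):

```python
def count_moves(track, threshold=5):
    """Count direction reversals in a tracked image coordinate; each
    deliberate up-or-down stroke of the finger produces one reversal."""
    moves, direction = 0, 0
    for prev, cur in zip(track, track[1:]):
        delta = cur - prev
        if abs(delta) < threshold:
            continue  # ignore jitter below the motion threshold
        new_dir = 1 if delta > 0 else -1
        if new_dir != direction:
            moves += 1
            direction = new_dir
    return moves

# Hypothetical vertical positions (pixels) of the fingertip image
# over successive frames: four deliberate strokes.
track = [0, 20, 40, 20, 0, 20, 40, 20, 0]
is_target = count_moves(track) == 4  # the agreed-upon signal (e.g., 4 moves)
```

The same count, divided by the elapsed time, would serve for a rate-based signal such as the 2 moves/sec example above.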
The invention comprehends this as a method for acquiring
the datum to be tracked in the first place, and has provided a
camera mechanism for tracking fast enough not to lose the
data, assuming a sufficiently distinct feature. For example, it
is desirable to not require sophisticated image processing
routines and the like if possible, to avoid the time it takes to
execute same with affordable equipment. And yet in many
scenes, finding a target can't be done easily today without
some aid, either a high contrast target (contrasting brightness
or color or both, for example). Or the aid can be movement as
noted, which allows the search for the target to be at least
localized to a small region of the field of view, and thence take
much less time to run, even if a sophisticated algorithm is
employed.
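Localizing the search in this way might be sketched as follows, running any expensive detector only where frame-to-frame change occurred; the change threshold and the 8-pixel margin are assumed values, not part of the disclosure:

```python
import numpy as np

def motion_roi(prev, cur, change_threshold=30, margin=8):
    """Return the bounding box (row0, row1, col0, col1) of pixels that
    changed between frames, padded by a margin; a sophisticated target
    search need only run inside this small region, or nowhere if
    nothing moved."""
    changed = np.argwhere(
        np.abs(cur.astype(int) - prev.astype(int)) > change_threshold)
    if changed.size == 0:
        return None
    (r0, c0), (r1, c1) = changed.min(axis=0), changed.max(axis=0)
    h, w = cur.shape
    return (max(r0 - margin, 0), min(r1 + margin + 1, h),
            max(c0 - margin, 0), min(c1 + margin + 1, w))

prev = np.zeros((100, 100), dtype=np.uint8)
cur = prev.copy()
cur[50, 60] = 255  # target moved into view here
roi = motion_roi(prev, cur)  # small box around (50, 60)
```

Even a slow, sophisticated matching algorithm becomes affordable when confined to such a box rather than the full field of view.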
FIG. 3B illustrates an embodiment wherein a target which
blinks optically is used. The simplest case is a modulated
LED target such as 340 on object 341 shown. Successive frames
taken with camera 345 looking at pixel window 346, at 300
scans of the pixels within the window per second where the
image 347 of the LED target is located, can determine, using
computer 349 (which may be separate from, or incorporated
with, the image sensor), 5 complete blinks of target 340, if
blinked at a 60 hz rate. Blink frequency, blink spacing, and
blink pulse length can all be determined if the scan rate is
sufficiently faster than the blink rate.
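A sketch of the blink-counting idea, assuming the pixel window has been thresholded to an on/off sample per scan (the transition-counting estimator below is one plausible implementation, not the disclosed circuit):

```python
def blink_frequency(trace, scan_rate_hz):
    """Estimate blink rate (Hz) of a thresholded on/off pixel trace
    by counting off-to-on transitions over the trace duration."""
    rises = sum(1 for a, b in zip(trace, trace[1:]) if not a and b)
    return rises / (len(trace) / scan_rate_hz)

# One second of 300 scans/s over the pixel window: a 60 Hz LED is
# sampled 5 times per blink period (here 2 samples off, 3 on).
trace = [0, 0, 1, 1, 1] * 60
freq = blink_frequency(trace, scan_rate_hz=300)  # 60.0
```

The run lengths of the on and off samples in the same trace give blink pulse length and blink spacing respectively, which is why the scan rate must comfortably exceed the blink rate.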
