Case 2:21-cv-00040-JRG Document 70-8 Filed 09/02/21 Page 1 of 77 PageID #: 1343
Exhibit M
PTO/AIA/01 (06-12)
Approved for use through 01/31/2014. OMB 0651-0032
U.S. Patent and Trademark Office; U.S. DEPARTMENT OF COMMERCE
Under the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number.

DECLARATION (37 CFR 1.63) FOR UTILITY OR DESIGN APPLICATION USING AN APPLICATION DATA SHEET (37 CFR 1.76)

Title of Invention: CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING OR OTHER DEVICES

As the below named inventor, I hereby declare that:

This declaration is directed to:

[ ] The attached application, or

[ ] United States application or PCT international application number ______ filed on ______.

The above-identified application was made or authorized to be made by me.

I believe that I am the original inventor or an original joint inventor of a claimed invention in the application.

I hereby acknowledge that any willful false statement made in this declaration is punishable under 18 U.S.C. 1001 by fine or imprisonment of not more than five (5) years, or both.

WARNING:

Petitioner/applicant is cautioned to avoid submitting personal information in documents filed in a patent application that may contribute to identity theft. Personal information such as social security numbers, bank account numbers, or credit card numbers (other than a check or credit card authorization form PTO-2038 submitted for payment purposes) is never required by the USPTO to support a petition or an application. If this type of personal information is included in documents submitted to the USPTO, petitioners/applicants should consider redacting such personal information from the documents before submitting them to the USPTO. Petitioner/applicant is advised that the record of a patent application is available to the public after publication of the application (unless a non-publication request in compliance with 37 CFR 1.213(a) is made in the application) or issuance of a patent. Furthermore, the record from an abandoned application may also be available to the public if the application is referenced in a published application or an issued patent (see 37 CFR 1.14). Checks and credit card authorization forms PTO-2038 submitted for payment purposes are not retained in the application file and therefore are not publicly available.

LEGAL NAME OF INVENTOR

Inventor: Timothy R. Pryor

Signature: [signature]

Date (Optional):

Note: An application data sheet (PTO/SB/14 or equivalent), including naming the entire inventive entity, must accompany this form or must have been previously filed. Use an additional PTO/AIA/01 form for each additional inventor.

This collection of information is required by 35 U.S.C. 115 and 37 CFR 1.63. The information is required to obtain or retain a benefit by the public which is to file (and by the USPTO to process) an application. Confidentiality is governed by 35 U.S.C. 122 and 37 CFR 1.11 and 1.14. This collection is estimated to take 1 minute to complete, including gathering, preparing, and submitting the completed application form to the USPTO. Time will vary depending upon the individual case. Any comments on the amount of time you require to complete this form and/or suggestions for reducing this burden should be sent to the Chief Information Officer, U.S. Patent and Trademark Office, U.S. Department of Commerce, P.O. Box 1450, Alexandria, VA 22313-1450. DO NOT SEND FEES OR COMPLETED FORMS TO THIS ADDRESS. SEND TO: Commissioner for Patents, P.O. Box 1450, Alexandria, VA 22313-1450.

If you need assistance in completing the form, call 1-800-PTO-9199 and select option 2.
[Drawing sheet: FIG. 1A, FIG. 1B, FIG. 1C]
[Drawing sheet: FIG. 2A, FIG. 2B]
[Drawing sheet: FIG. 2C]
[Drawing sheet: FIG. 2D, FIG. 4A]
[Drawing sheet: FIG. 4B]
[Drawing sheet: FIG. 3B]
[Drawing sheet: FIG. 9]
[Drawing sheet: FIG. 8A, FIG. 8B]
[Drawing sheet: FIG. 10A]
[Drawing sheet: FIG. 10B]
[Drawing sheet: FIG. 11A]
[Drawing sheet: FIG. 11B]
[Drawing sheet: FIG. 12]
[Drawing sheet: FIG. 13]
[Drawing sheet: FIG. 14A]
[Drawing sheet: FIG. 14C]
[Drawing sheet: FIG. 15]
[Drawing sheet: FIG. 16]
[Drawing sheet: FIG. 17A]
[Drawing sheet: FIG. 17B]
CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING, OR OTHER DEVICES

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application is a continuation of Application No. 13/461,954, filed May 2, 2012 (now U.S. Patent ______), which is a continuation of Application No. 13/051,698, filed March 18, 2011 (now U.S. Patent 8,194,924), which is a continuation of U.S. Application No. 12/834,281, filed July 12, 2010 (now U.S. Patent 7,933,431), which is a continuation of Application No. 11/980,710, filed Oct. 31, 2007 (now U.S. Patent 7,756,297), which is a continuation of Application No. 10/893,534, filed Jul. 19, 2004 (now U.S. Patent 7,401,783), which is a continuation of Application No. 09/612,225, filed Jul. 7, 2000 (now U.S. Patent 6,766,036), which claims the benefit of U.S. Provisional Application No. 60/142,777, filed Jul. 8, 1999.

[0002] Cross references to related co-pending U.S. applications by the inventor having similar subject matter:

[0003] 1. Touch TV and Other Man Machine Interfaces: Ser. No. 09/435,854, filed Nov. 8, 1999, now U.S. Pat. No. 7,098,891; which was a continuation of application Ser. No. 07/946,908, now U.S. Pat. No. 5,982,352;

[0004] 2. More Useful Man Machine Interfaces and Applications: Ser. No. 09/433,297, filed Nov. 3, 1999, now U.S. Pat. No. 6,750,848;

[0005] 3. Useful Man Machine Interfaces and Applications: Ser. No. 09/138,339, Pub. Appln. 2002-0036617, now abandoned;

[0006] 4. Vision Target Based Assembly: Ser. No. 08/469,907, filed Jun. 6, 1995, now U.S. Pat. No. 6,301,783;
[0007] 5. Picture Taking Method and Apparatus: provisional application 60/133,671, and regular application Ser. No. 09/568,552, filed May 11, 2000, now U.S. Pat. No. 7,015,950;

[0008] 6. Methods and Apparatus for Man Machine Interfaces and Related Activity: provisional application 60/133,673, filed May 11, 1999, and regular application Ser. No. 09/568,554, filed May 11, 2000, now U.S. Pat. No. 6,545,670;

[0009] 7. Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications: provisional application Ser. No. 60/183,807, and regular application Ser. No. 09/789,538, now U.S. Pat. No. 7,084,859; and

[0010] 8. Apparel Manufacture and Distance Fashion Shopping in Both Present and Future: provisional application 60/187,397, filed Mar. 7, 2000.

[0011] The disclosures of the following U.S. patents and co-pending patent applications by the inventor, or the inventor and his colleagues, are incorporated herein by reference:

[0012] 1. "Man Machine Interfaces": U.S. application Ser. No. 09/435,854 and U.S. Pat. No. 5,982,352, and U.S. application Ser. No. 08/290,516, filed Aug. 15, 1994, now U.S. Pat. No. 6,008,000, the disclosure of both of which is contained in that of Ser. No. 09/435,854;

[0013] 2. "Useful Man Machine Interfaces and Applications": U.S. application Ser. No. 09/138,339, now Pub. Appln. 2002-0036617;

[0014] 3. "More Useful Man Machine Interfaces and Applications": U.S. application Ser. No. 09/433,297;

[0015] 4. "Methods and Apparatus for Man Machine Interfaces and Related Activity": U.S. Appln. Ser. No. 60/133,673, filed as regular application Ser. No. 09/568,554, now U.S. Pat. No. 6,545,670;
[0016] 5. "Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications": U.S. provisional Appln. Ser. No. 60/183,807, filed Feb. 22, 2000, now filed as regular application Ser. No. 09/789,538; and

[0017] 6. "Apparel Manufacture and Distance Fashion Shopping in Both Present and Future": U.S. Appln. Ser. No. 60/187,397, filed Mar. 7, 2000.

FIELD OF THE INVENTION

[0018] The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing a human input to a display screen or other object and/or the sensing of human positions or orientations. The invention herein is a continuation in part of several inventions of mine, listed above.

[0019] This continuation application seeks to provide further useful embodiments for improving the sensing of objects. Also disclosed are new applications in a variety of fields such as computing, gaming, medicine, and education. Further disclosed are improved systems for display and control purposes.

[0020] The invention uses single or multiple TV cameras whose output is analyzed and used as input to a computer, such as a home PC, to typically provide data concerning the location of parts of, or objects held by, a person or persons.

DESCRIPTION OF RELATED ART
[0021] The above mentioned co-pending applications incorporated by reference discuss many prior art references in various pertinent fields, which form a background for this invention. Some more specific U.S. Patent references are, for example:

[0022] DeMenthon--U.S. Pat. Nos. 5,388,059; 5,297,061; 5,227,985

[0023] Cipolla--U.S. Pat. No. 5,581,276

[0024] Pugh--U.S. Pat. No. 4,631,676

[0025] Pinckney--U.S. Pat. No. 4,219,847

DESCRIPTION OF FIGURES

[0026] FIG. 1 illustrates a basic computer terminal embodiment of the invention, similar to that disclosed in copending applications.

[0027] FIG. 2 illustrates object tracking embodiments of the invention employing a pixel addressable camera.

[0028] FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums.

[0029] FIG. 4 illustrates tracking embodiments of the invention using variation in color to identify and/or track object target datums.

[0030] FIG. 5 illustrates special camera designs for determining target position in addition to providing normal color images.

[0031] FIG. 6 illustrates identification and tracking with stereo pairs.

[0032] FIG. 7 illustrates use of an indicator or co-target.

[0033] FIG. 8 illustrates control of functions with the invention, using a handheld device which itself has functions.
[0034] FIG. 9 illustrates pointing at an object represented on a screen using a finger or laser pointer, and then manipulating the represented object using the invention.

[0035] FIG. 10 illustrates control of automobile or other functions with the invention, using detected knob, switch or slider positions.

[0036] FIG. 11 illustrates a board game embodiment of the invention.

[0037] FIG. 12 illustrates a generic game embodiment of the invention.

[0038] FIG. 13 illustrates a game embodiment of the invention, such as might be played in a bar.

[0039] FIG. 14 illustrates a laser pointer or other spot designator embodiment of the invention.

[0040] FIG. 15 illustrates a gesture based flirting game embodiment of the invention.

[0041] FIG. 16 illustrates a version of the pixel addressing camera technique wherein two lines on either side of a 1000 element square array are designated as perimeter fence lines to initiate tracking or other action.

[0042] FIG. 17 illustrates a 3-D acoustic imaging embodiment of the invention.

THE INVENTION EMBODIMENTS

FIG. 1

[0043] The invention herein and disclosed in portions of other copending applications noted above comprehends a combination of one or more TV cameras (or other suitable electro-optical sensors) and a computer to provide various position and orientation related functions of use. It also comprehends the combination of these functions with the basic task of generating, storing and/or transmitting a TV image of the scene acquired--either in two or three dimensions.
[0044] The embodiment depicted in FIG. 1A illustrates the basic embodiments of many of my co-pending applications above. A stereo pair of cameras 100 and 101, located on each side of the upper surface of monitor 102 (for example a rear projection TV of 60 inch diagonal screen size) with display screen 103 facing the user, are connected to PC computer 106 (integrated in this case into the monitor housing), for example a 400 MHz Pentium II. For appearances and protection, a single extensive cover window may be used to cover both cameras and their associated light sources 110 and 111, typically LEDs.

[0045] The LEDs in this application are typically used to illuminate targets associated with any of the fingers, hand, feet and head of the user, or objects such as 131 held by a user 135, with hands 136 and 137, and head 138. These targets, such as circular target 140 and band target 141 on object 131, are desirably, but not necessarily, retro-reflective, and may be constituted by the object features themselves (e.g., a finger tip, such as 145), by features provided on clothing worn by the user (e.g., a shirt button 147 or polka dot 148), or by artificial targets other than retroreflectors.

[0046] Alternatively, a three camera arrangement can be used, for example using additional camera 144, to provide added sensitivity in certain angular and positional relationships. Still more cameras can be used to further improve matters, as desired. Alternatively, and/or in addition, camera 144 can be used for other purposes, such as to acquire images of objects such as persons, for transmission, storage or retrieval independent of the cameras used for datum and feature location determination.
[0047] For many applications, a single camera can suffice for measurement purposes as well, such as 160 shown in FIG. 1B for example, used for simple 2 dimensional (2D) measurements in the xy plane perpendicular to the camera axis (z axis), or 3D (xyz, roll pitch yaw) where a target grouping, for example of three targets, is used such as the natural features formed by the two eyes 164, 165 and nose 166 of a human 167. These features are roughly at known distances from each other, the data from which can be used to calculate the approximate position and orientation of the human face. Using for example the photogrammetric technique of Pinkney described below, the full 6 degree of freedom solution of the human face location and orientation can be achieved to an accuracy limited by the ability of the camera image processing software utilized to determine the centroids or other delineating geometric indicators of the position of the eyes and nose (or some other facial feature such as the mouth), and the accuracy of the initial imputing of the spacing of the eyes and their respective spacing to the nose. Clearly if a standard human value is used (say for an adult, or for a child, or even by age) some lessening of precision results, since these spacings are used in the calculation of distance and orientation of the face of human 167 from the camera 160.
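The distance computation in paragraph [0047] follows the pinhole-camera relation Z = f * D / d, where D is the true feature spacing, d its imaged spacing, and f the focal length in pixels. The following is a minimal editorial sketch, not code from the application: the function name, the focal length, and the 63 mm adult eye-spacing default are all illustrative assumptions.

```python
import math

def face_distance_and_yaw(eye_l_px, eye_r_px, nose_px,
                          focal_len_px=800.0,     # assumed focal length, pixels
                          eye_spacing_mm=63.0):   # assumed adult eye spacing
    """Approximate face distance and yaw from eye and nose centroids.

    Distance uses the pinhole relation Z = f * D / d, where D is the true
    eye spacing and d the imaged spacing in pixels. As the text notes,
    accuracy is limited by centroid quality and by how well the assumed
    spacing matches the actual person.
    """
    d_px = math.hypot(eye_r_px[0] - eye_l_px[0], eye_r_px[1] - eye_l_px[1])
    z_mm = focal_len_px * eye_spacing_mm / d_px

    # Crude yaw cue: the nose centroid shifts off the eye midpoint as the
    # head turns; normalizing by eye spacing makes the cue unitless.
    offset = (nose_px[0] - 0.5 * (eye_l_px[0] + eye_r_px[0])) / d_px
    yaw_deg = math.degrees(math.asin(max(-1.0, min(1.0, 2.0 * offset))))
    return z_mm, yaw_deg

# Eyes imaged 80 px apart -> roughly 630 mm from the camera.
print(face_distance_and_yaw((300, 240), (380, 242), (342, 270)))
```

A full 6-degree-of-freedom solution of the Pinkney type would replace the crude yaw heuristic above with a photogrammetric solution over three or more points of known spacing.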
[0048] In another generally more photogrammetrically accurate case, one might choose to use four special targets (e.g., glass bead retro-reflectors, or orange dots) 180-183 on the object 185, having known positional relationships relative to each other on the object surface, such as one inch centers. This is shown in FIG. 1C, and may be used in conjunction with a pixel addressable camera such as described in FIG. 2 below, which allows one to rapidly determine the object position and orientation and track its movements in up to 6 degrees of freedom, as disclosed by Pinkney U.S. Pat. No. 4,219,847 and technical papers referenced therein. For example, the system described above for FIGS. 1 and 2 involving the photogrammetric resolution of the relative position of three or more known target points as viewed by a camera is known and is described in a paper entitled "A Single Camera Method for the 6-Degree of Freedom Sprung Mass Response of Vehicles Redirected by Cable Barriers" presented by M. C. van Wijk and H. F. L. Pinkney to The Society of Photo-optical Instrumentation Engineers.
[0049] The stereo pair of cameras can also acquire a two view stereo image of the scene as well, which can be displayed in 3D using stereoscopic or auto-stereoscopic means, as well as transmitted or recorded as desired.

[0050] In many applications of the foregoing invention it is desirable not just to use a large screen but in fact one capable of displaying life size images. This particularly relates to human scaled images, giving a life-like presence to the data on the screen. In this way the natural response of the user with motions of hands, head, arms, etc., is scaled in "real" proportion to the data being presented.

FIG. 2

[0051] This embodiment and others disclose special types of cameras useful with the invention. In the first case, that of FIG. 2A, a pixel addressable camera such as the MAPP2200 made by IVP corporation of Sweden is used, which allows one to do many things useful for rapidly determining location of objects, their orientation and their motion.

[0052] For example, as shown in FIG. 2A, an approximately circular image 201 of a target datum such as 180 on object 185 of FIG. 1C may be acquired by scanning the pixel elements on a matrix array 205 on which the image is formed. Such an array in the future will have for example 1000×1000 pixels, or more (today the largest IVP makes is 512×512; the IVP also is not believed to be completely randomly addressable, which some future arrays will be).

[0053] As an illustration, computer 220 determines, after the array 205 has been interrogated, that the centroid "x, y" of the pixel elements on which the target image lies is at pixel x=500, y=300 (including a sub-fraction thereof in many cases). The centroid location can be determined for example by the moment method disclosed in the Pinkney patent, referenced above.
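For a single bright target, the moment method mentioned in paragraph [0053] reduces to an intensity-weighted mean of pixel coordinates. A minimal sketch, assuming numpy and an already-read window of gray levels; the threshold and function name are illustrative, not from the Pinkney patent:

```python
import numpy as np

def target_centroid(window, threshold=128):
    """Sub-pixel centroid of a bright target by the first-moment method.

    `window` holds the gray levels of the interrogated pixels. Values
    below `threshold` are zeroed so background does not pull the result.
    Returns (x, y) in window coordinates, including the sub-pixel
    fraction mentioned in paragraph [0053], or None if nothing is bright.
    """
    w = window.astype(float)
    w[w < threshold] = 0.0
    total = w.sum()
    if total == 0.0:
        return None
    ys, xs = np.mgrid[0:w.shape[0], 0:w.shape[1]]
    return (xs * w).sum() / total, (ys * w).sum() / total

# Synthetic 9x9 window with a bright blob centered at (5, 3):
img = np.zeros((9, 9))
img[2:5, 4:7] = 200
print(target_centroid(img))   # -> (5.0, 3.0)
```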
[0054] The target in this case is defined as a contrasting point on the object, and such contrast can be in color as well as, or instead of, intensity. Or, with some added preprocessing, it can be a distinctive pattern on the object, such as a checkerboard or herringbone.

Subsequent Tracking

[0055] To subsequently track the movement of this target image, it is now only necessary to look in a small pixel window composed of a small number of pixels around the target, for example the square 230 shown, as the new position x'y' of the target image cannot be further distant within a short period of time elapsed from the first scan, and in consideration of the small required time to scan the window.

[0056] For example, if the window is 100×100 pixels, this can be scanned in 1 millisecond or less with such a pixel addressing camera, by interrogating only those pixels in the window, while still communicating with the camera over a relatively slow USB serial link of 12 Mb transmission rate (representing 12,000 pixel gray level values in one millisecond).
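The arithmetic in paragraph [0056] (a 100×100 window is 10,000 gray levels, within the roughly 12,000 values per millisecond a 12 Mb/s USB link can carry) implies a simple per-frame tracking loop. In the sketch below, `read_window` is a hypothetical stand-in for a pixel-addressable camera interface, simulated with array slicing since no MAPP2200-era API is specified here; the window and sensor sizes follow the text:

```python
import numpy as np

WIN = 100        # tracking window is WIN x WIN pixels, per paragraph [0056]
SENSOR = 1000    # assumed 1000 x 1000 pixel-addressable array

def read_window(cam, x0, y0, w, h):
    # Hypothetical camera call: interrogate only the pixels inside one
    # rectangular window. Simulated here by slicing a full frame; on a
    # pixel-addressable sensor only these pixels would cross the link.
    return cam[y0:y0 + h, x0:x0 + w]

def track(cam, cx, cy, centroid_fn):
    """One tracking step: scan a small window around the last known
    position (cx, cy) and return the updated target position."""
    x0 = int(max(0, min(SENSOR - WIN, cx - WIN // 2)))
    y0 = int(max(0, min(SENSOR - WIN, cy - WIN // 2)))
    hit = centroid_fn(read_window(cam, x0, y0, WIN, WIN))
    if hit is None:
        return None          # target left the window; re-acquire
    return x0 + hit[0], y0 + hit[1]

def peak(w):
    # Toy centroid stand-in: (x, y) of the brightest pixel in the window.
    if w.max() == 0:
        return None
    y, x = np.unravel_index(np.argmax(w), w.shape)
    return x, y

# Demo: bright 5x5 target near (500, 300) in a synthetic frame.
frame = np.zeros((SENSOR, SENSOR))
frame[298:303, 498:503] = 255
print(track(frame, 495, 295, peak))   # -> (498, 298), near the true spot
```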
[0057] One thus avoids the necessity to scan the whole field, once the starting target image position is identified. This can be known by an initial scan as mentioned, or can be known by having the user move an object with a target against a known location with respect to the camera, such as a mechanical stop, and then indicate that tracking should start, either by verbally saying so with voice recognition, or by actuating a control key such as 238, or whatever.

[0058] It is noted that if the tracking window is made large enough, then it can encompass a whole group of datums, such as 180-183 on an object.

FIG. 2B Reduction in Acquisition Time

[0059] Another application of such a pixel addressing camera is shown in FIG. 2B. One can look at the whole field, xy, of the camera, 240, but only address say every 10th pixel, such as 250, 251 and 252, in each direction, i.e., a total of 10,000 pixels in a field of 1 million (1000×1000, say).

[0060] In this case computer 220 simply queries this fraction of the pixels in the image, knowing a priori that the target image such as 260 will have an image size larger than 10×10 pixels, and must be detectable, if of sufficient contrast, by one of the queried pixels. (For smaller or larger target images, the number and spacing of queried pixels can be adjusted accordingly.) This, for example, allows one to find the approximate location of targets with only 1/100 the pixel interrogation time otherwise needed, plus any gain obtained as disclosed above by knowing in what region of the image to look (for example during tracking, or given some a priori knowledge of approximate location due to a particular aspect of the physical arrangement or the program in question).
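A sketch of the every-10th-pixel acquisition scan of paragraphs [0059]-[0060]; the sparse interrogation is again simulated with array indexing, and the threshold is an assumed value (on real hardware only the sampled addresses would be read out):

```python
import numpy as np

def coarse_find(cam, step=10, threshold=128):
    """Query every `step`-th pixel in each direction (10,000 samples of a
    1,000,000-pixel field when step=10) and return the coarse (x, y) of
    the first sufficiently bright sample, or None. This only works when
    the target image exceeds step x step pixels, as the text requires."""
    sample = cam[::step, ::step]     # stands in for sparse interrogation
    hits = np.argwhere(sample > threshold)
    if hits.size == 0:
        return None
    y, x = hits[0]
    return int(x) * step, int(y) * step

# A 12x12-pixel target at (500, 300) is found from 1/100 of the pixels:
frame = np.zeros((1000, 1000))
frame[300:312, 500:512] = 255
print(coarse_find(frame))            # -> (500, 300)
```

The coarse position returned here would then seed the small-window tracking sketched earlier, as paragraph [0061] describes.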
[0061] Once a target has been approximately found as just described, the addressing can be optimized for that region of the image only, as disclosed in the subsequent tracking section above.

[0062] Given the invention, the potential for target acquisition in a millisecond or two thus is achievable with simple pixel addressable CMOS cameras coming on stream now (today costing under $50), assuming the target points are easily identifiable from at least one of: brightness (over a value), contrast (with respect to surroundings), color, color contrast, and, more difficult, shape or pattern (e.g., a plaid, or herringbone portion of a shirt). This has major ramifications for the robustness of control systems built on such camera based acquisition, be they for controlling displays, or machines, or whatever.

[0063] It's noted that with new 2000×2000 cameras coming on stream, it may only be necessary to look at every 15th or 20th pixel in each direction to get an adequate feel for target location. This means every 200th to 400th pixel, not enough to cause image rendition difficulties even if totally dark grey (as it might be in a normal white light image if set up for IR wavelengths only).

FIG. 2C

[0064] Another method for finding the target in the first place with limited pixel interrogation is to look at pixels near a home point where a person, for example, indicates that the target is. This could be, for example, placing one's fingernail, such as 270, whose natural or artificial (e.g., reflective nail polish) features are readily seen by the camera 275 and determined to be in the right corner of a pad 271 in FIG. 2C, which approximately covers the field of view 274 of the camera 275. The computer 220 analyzes the pixels in the right corner 278 of the image field 279 representing the pad portion 271 with the camera 275, either continuously, or only when the finger for example hits a switch such as 280 at the edge of the pad, or on command (e.g., by the user pushing a button or key, or a voice message inputted via microphone 285, for example). After such acquisition, the target is then tracked to other locations in the xy space of the pad, for example as described above. It's noted that it helps to provide a beep or other sound or indication when acquisition has been made.

Pick Windows in Real Time

[0065] Another aspect of the invention is that one can also pick the area of the image to interrogate at any desired moment. This can be done by creating a window of pixels within the field to generate information, for example as discussed relative to a specific car dashboard application of FIG. 10.
FIG. 2D - Scan Pattern

[0066] A pixel addressing camera also allows a computer such as 220 to cause scans to be generated which are not typical raster scans, for example circular or radial, or even odd shapes as desired. This can be done by providing from the computer the sequential addresses of the successive pixels on the camera chip whose detected voltages are to be queried.

[0067] A circular scan of pixels addressed at high speed can be used to identify when and where a target enters a field enclosed by the circular pixel scan. This is highly useful, and after that, the approximate location of the target can be determined by further scans of pixels in the target region.

[0068] For example, consider addressing the pixels c1 c2 c3 . . . cn representing a circle 282 at the outer perimeter of the array 285 of 1000×1000 elements such as discussed above. The number of pixels in a full circle is approximately 1000 pi, which can be scanned even with USB (universal serial bus) limits at 300 times per second or better. For targets of 1/100 field in width, this means that a target image entering the field, such as circular target image 289 (which is shown intersecting element cm and its neighbors), would have to travel 1/100 the field width in 0.0033 seconds to be totally missed in a worst case. If the image field corresponds to 20 inches in object field width, this is 0.2 inches × 300/sec or 60 inches/second, very fast for human movement, and not likely to be exceeded even where smaller targets are used.
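The perimeter numbers in paragraph [0068] (about 1000 pi pixels per circle; at 300 scans per second, a 1/100-field target in a 20 inch field must move 60 inches per second to slip through untouched) can be reproduced with a short sketch. How a circular address list would actually be fed to a given sensor is an assumption; the helper below only generates the (x, y) sequence:

```python
import math

def circle_addresses(cx, cy, r, n=None):
    """Sequential (x, y) pixel addresses approximating a circle of radius
    r, such as might be fed to a pixel-addressable sensor as a non-raster
    'trip wire' scan list."""
    n = n or int(2 * math.pi * r)    # about one address per pixel of arc
    return [(int(round(cx + r * math.cos(2 * math.pi * i / n))),
             int(round(cy + r * math.sin(2 * math.pi * i / n))))
            for i in range(n)]

perimeter = circle_addresses(500, 500, 499)   # circle 282 on array 285
print(len(perimeter))                         # ~3135, i.e., about 1000*pi

# Worst-case escape speed for the numbers given in the text:
field_in, target_frac, scan_hz = 20.0, 1.0 / 100, 300.0
print(field_in * target_frac * scan_hz)       # 60.0 inches/second
```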
[0069] Alternative shapes to circular "trip wire" perimeters may be used, such as squares, zig-zags, or other layouts of pixels to determine target presence. Once determined, a group of pixels such as group 292 can be interrogated to get a better determination of target location.

FIG. 3
[0070] Since many applications of the invention concern, or at least have present, a human caused motion, or motion of a part of a human, or an object moved by a human, the identification and tracking problem can be simplified if the features of interest, either natural or artificial, of the object provide some kind of change in appearance during such motion.

[0071] FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums. In a simple case, a subtraction of successive images can aid in identifying zones in an image having movement of features, as is well known. It is also useful to add pixel intensities of successive images in computer 220, for example. This is particularly true with bright targets (with respect to thei
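A minimal numpy sketch of the successive-image subtraction and addition described in paragraph [0071]; the function names and the change threshold are illustrative assumptions, not the application's method:

```python
import numpy as np

def motion_zones(prev, curr, threshold=30):
    """Flag pixels whose gray level changed between successive images;
    clusters of flagged pixels mark zones with moving features."""
    return np.abs(curr.astype(int) - prev.astype(int)) > threshold

def sum_frames(frames):
    """Adding successive images reinforces persistently bright targets
    (e.g., retro-reflectors) relative to a changing background."""
    acc = np.zeros_like(frames[0], dtype=np.int64)
    for f in frames:
        acc += f
    return acc

a = np.zeros((8, 8), dtype=np.uint8)
b = a.copy()
b[3, 4] = 200
print(np.argwhere(motion_zones(a, b)))   # -> [[3 4]]
```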
