US008194924B2

(12) United States Patent                    (10) Patent No.: US 8,194,924 B2
     Pryor                                   (45) Date of Patent: Jun. 5, 2012

(54) CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING OR OTHER DEVICES

(76) Inventor: Timothy R. Pryor, Sylvania, OH (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 13/051,698

(22) Filed: Mar. 18, 2011

(65) Prior Publication Data
     US 2011/0170746 A1    Jul. 14, 2011

Related U.S. Application Data

(63) Continuation of application No. 12/834,281, filed on Jul. 12, 2010, now Pat. No. 7,933,431, which is a continuation of application No. 11/980,710, filed on Oct. 31, 2007, now Pat. No. 7,756,297, which is a continuation of application No. 10/893,534, filed on Jul. 19, 2004, now Pat. No. 7,401,783, which is a continuation of application No. 09/612,225, filed on Jul. 7, 2000, now Pat. No. 6,766,036.

(60) Provisional application No. 60/142,777, filed on Jul. 8, 1999.

(51) Int. Cl.
     G06K 9/00 (2006.01)
(52) U.S. Cl. ......................... 382/103; 382/154; 382/312
(58) Field of Classification Search .................. 382/103, 382/154, 312
     See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

3,909,002 A     9/1975   Levy
4,219,847 A     8/1980   Pinkney et al.
4,339,798 A     7/1982   Hedges et al.
4,631,676 A    12/1986   Pugh
5,008,946 A     4/1991   Ando
5,088,928 A     2/1992   Chan
5,227,986 A     7/1993   Yokota et al.
5,297,061 A     3/1994   Dementhon et al.
5,388,059 A     2/1995   DeMenthon
5,491,507 A *   2/1996   Umezawa et al. ......... 348/14.02
5,581,276 A    12/1996   Cipolla et al.
5,594,469 A     1/1997   Freeman et al.
5,616,078 A     4/1997   Oh
5,624,117 A     4/1997   Ohkubo et al.
5,781,647 A     7/1998   Fishbine et al.
5,828,770 A *  10/1998   Leis et al. ............ 382/103
5,845,006 A *  12/1998   Sumi et al. ............ 382/154
5,853,327 A    12/1998   Gilboa
5,878,174 A     3/1999   Stewart et al.
5,926,168 A     7/1999   Fan
5,940,126 A *   8/1999   Kimura ................. 348/294
6,204,852 B1 *  3/2001   Kumar et al. ........... 345/419
6,342,917 B1    1/2002   Amenta
6,373,472 B1    4/2002   Palalau et al.
6,442,465 B2    8/2002   Breed et al.
6,508,709 B1    1/2003   Karmarkar
6,597,817 B1    7/2003   Silverbrook
6,775,361 B1 *  8/2004   Arai et al. ............ 379/93.17
6,788,336 B1 *  9/2004   Silverbrook ............ 348/207.2
6,911,972 B2 *  6/2005   Brinjes ................ 345/175
7,489,863 B2 *  2/2009   Lee .................... 396/429

* cited by examiner

Primary Examiner - Tom Y Lu
(74) Attorney, Agent, or Firm - Warner Norcross & Judd LLP

(57) ABSTRACT

Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed.

14 Claims, 23 Drawing Sheets
`
[Representative drawing: handheld device embodiment showing CPU 1951, display 1956, 1957, 1958, and camera elements 1901, 1906, 1910, 1930]
`
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 1 of 23    US 8,194,924 B2

[Sheet 1 drawings: FIG. 1A (135, 137, 138), FIG. 1B (164, 165), FIG. 1C (180, 182, 185)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 2 of 23    US 8,194,924 B2

[Sheet 2 drawings: FIG. 2A (array 205 of 1000×1000 elements, computer 220, 201, 230, 238, 300, 500), FIG. 2B (240, 250, 251, 252)]
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 3 of 23    US 8,194,924 B2

[Sheet 3 drawing: FIG. 2C (computer 220, 271, 275, 280)]
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 4 of 23    US 8,194,924 B2

[Sheet 4 drawings: FIG. 2D (perimeter pixel circle C1, C2, C3 ... Cm ... Cn, 282, array 285, target image 289, pixel group 292), FIG. 4A (406)]
`
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 5 of 23    US 8,194,924 B2

[Sheet 5 drawings: FIG. 3A (301, 320 COMP., 327, 335), FIG. 4B (445)]
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 6 of 23    US 8,194,924 B2

[Sheet 6 drawings: FIG. 3B (345, 355), FIG. 3C (341, 360, 370, 375)]
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 7 of 23    US 8,194,924 B2

[Sheet 7 drawings: FIG. 5A (500, 505, 510, 511, 512), FIG. 5B (530, 540, 541, 542, 543, 550, 555)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 8 of 23    US 8,194,924 B2

[Sheet 8 drawing: FIG. 6 (computer, 601, 602, 610, 615, 630, 631)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 9 of 23    US 8,194,924 B2

[Sheet 9 drawings: FIG. 7 (705, 732, 733), FIG. 9 (901, 910)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 10 of 23    US 8,194,924 B2

[Sheet 10 drawings: FIG. 8A (800, 801, 806, 816), FIG. 8B (851, 852, 853)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 11 of 23    US 8,194,924 B2

[Sheet 11 drawing: FIG. 10A (reference numerals in the 1000-1070 range, largely illegible in scan)]
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 12 of 23    US 8,194,924 B2

[Sheet 12 drawing: FIG. 10B (1077; other reference numerals illegible in scan)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 13 of 23    US 8,194,924 B2

[Sheet 13 drawing: FIG. 11A (1101, 1152, 1155, 1158)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 14 of 23    US 8,194,924 B2

[Sheet 14 drawing: FIG. 11B (1167; other reference numerals illegible in scan)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 15 of 23    US 8,194,924 B2

[Sheet 15 drawing: FIG. 12 (reference numerals illegible in scan)]
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 16 of 23    US 8,194,924 B2

[Sheet 16 drawing: FIG. 13 (COMP. 1301, 1330, 1350, 1355, 1370)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 17 of 23    US 8,194,924 B2

[Sheet 17 drawing: FIG. 14A (reference numerals illegible in scan)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 18 of 23    US 8,194,924 B2

[Sheet 18 drawings: FIG. 14B (1460, 1462, 1465, 1476, 1480, 1486), FIG. 14C (1490, 1491, 1496, 1497, 1498)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 19 of 23    US 8,194,924 B2

[Sheet 19 drawing: FIG. 15 (COMP., 1505, 1510, 1550)]
`
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 20 of 23    US 8,194,924 B2

[Sheet 20 drawing: FIG. 16 (pixel element array with perimeter fence lines; reference numerals illegible in scan)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 21 of 23    US 8,194,924 B2

[Sheet 21 drawings: FIG. 17A (COMP. 1704, 1710, 1720, 1725, 1730, 1740, 1745), FIG. 17C (1750)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 22 of 23    US 8,194,924 B2

[Sheet 22 drawing: FIG. 17B (labels and reference numerals illegible in scan)]
`
`
`
U.S. Patent    Jun. 5, 2012    Sheet 23 of 23    US 8,194,924 B2

[Sheet 23 drawing: FIG. 18 (1906, 1956, 1958; other reference numerals illegible in scan)]
`
`
`
CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING OR OTHER DEVICES

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of application Ser. No. 12/834,281, filed Jul. 12, 2010 (now U.S. Pat. No. 7,933,431), which is a continuation of application Ser. No. 11/980,710, filed Oct. 31, 2007 (now U.S. Pat. No. 7,756,297), which is a continuation of application Ser. No. 10/893,534, filed Jul. 19, 2004 (now U.S. Pat. No. 7,401,783), which is a continuation of application Ser. No. 09/612,225, filed Jul. 7, 2000 (now U.S. Pat. No. 6,766,036), which claims the benefit of U.S. Provisional Application No. 60/142,777, filed Jul. 8, 1999.

Cross references to related co-pending US applications by the inventor having similar subject matter:

1. Touch TV and other Man Machine Interfaces: Ser. No. 09/435,854, filed Nov. 8, 1999, now U.S. Pat. No. 7,098,891; which was a continuation of application Ser. No. 07/946,908, now U.S. Pat. No. 5,982,352;
2. More Useful Man Machine Interfaces and Applications: Ser. No. 09/433,297, filed Nov. 3, 1999, now U.S. Pat. No. 6,750,848;
3. Useful Man Machine interfaces and applications: Ser. No. 09/138,339, Pub. Appln. 2002-0036617, now abandoned;
4. Vision Target based assembly: Ser. No. 08/469,907, filed Jun. 6, 1995, now U.S. Pat. No. 6,301,783;
5. Picture Taking method and apparatus: provisional application 60/133,671, and regular application Ser. No. 09/568,552, filed May 11, 2000, now U.S. Pat. No. 7,015,950;
6. Methods and Apparatus for Man Machine Interfaces and Related Activity: provisional application 60/133,673, filed May 11, 1999; and regular application Ser. No. 09/568,554, filed May 11, 2000, now U.S. Pat. No. 6,545,670;
7. Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications: provisional application Ser. No. 60/183,807; and regular application Ser. No. 09/789,538, now U.S. Pat. No. 7,084,859; and
8. Apparel Manufacture and Distance Fashion Shopping in Both Present and Future: provisional application 60/187,397, filed Mar. 7, 2000.

The disclosures of the following U.S. patents and co-pending patent applications by the inventor, or the inventor and his colleagues, are incorporated herein by reference:

1. "Man machine Interfaces": U.S. application Ser. No. 09/435,854 and U.S. Pat. No. 5,982,352, and U.S. application Ser. No. 08/290,516, filed Aug. 15, 1994, now U.S. Pat. No. 6,008,000, the disclosure of both of which is contained in that of Ser. No. 09/435,854;
2. "Useful Man Machine Interfaces and Applications": U.S. application Ser. No. 09/138,339, now Pub. Appln. 2002-0036617;
3. "More Useful Man Machine Interfaces and Applications": U.S. application Ser. No. 09/433,297, now U.S. Pat. No. 6,750,848;
4. "Methods and Apparatus for Man Machine Interfaces and Related Activity": U.S. Appln. Ser. No. 60/133,673, filed as regular application Ser. No. 09/568,554, now U.S. Pat. No. 6,545,670;
5. "Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications": U.S. provisional Appln. Ser. No. 60/183,807, filed Feb. 22, 2000, now filed as regular application Ser. No. 09/789,538; and
6. "Apparel Manufacture and Distance Fashion Shopping in Both Present and Future": U.S. Appln. Ser. No. 60/187,397, filed Mar. 7, 2000.

FIELD OF THE INVENTION

The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing a human input to a display screen or other object and/or the sensing of human positions or orientations. The invention herein is a continuation in part of several inventions of mine, listed above.

This continuation application seeks to provide further useful embodiments for improving the sensing of objects. Also disclosed are new applications in a variety of fields such as computing, gaming, medicine, and education. Further disclosed are improved systems for display and control purposes.

The invention uses single or multiple TV cameras whose output is analyzed and used as input to a computer, such as a home PC, to typically provide data concerning the location of parts of, or objects held by, a person or persons.

DESCRIPTION OF RELATED ART

The above mentioned co-pending applications incorporated by reference discuss many prior art references in various pertinent fields, which form a background for this invention. Some more specific U.S. Patent references are for example:

DeMenthon - U.S. Pat. Nos. 5,388,059; 5,297,061; 5,227,985
Cipolla - U.S. Pat. No. 5,581,276
Pugh - U.S. Pat. No. 4,631,676
Pinckney - U.S. Pat. No. 4,219,847

DESCRIPTION OF FIGURES

FIG. 1 illustrates a basic computer terminal embodiment of the invention, similar to that disclosed in copending applications.
FIG. 2 illustrates object tracking embodiments of the invention employing a pixel addressable camera.
FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums.
FIG. 4 illustrates tracking embodiments of the invention using variation in color to identify and/or track object target datums.
FIG. 5 illustrates special camera designs for determining target position in addition to providing normal color images.
FIG. 6 illustrates identification and tracking with stereo pairs.
FIG. 7 illustrates use of an indicator or co-target.
FIG. 8 illustrates control of functions with the invention, using a handheld device which itself has functions.
FIG. 9 illustrates pointing at an object represented on a screen using a finger or laser pointer, and then manipulating the represented object using the invention.
FIG. 10 illustrates control of automobile or other functions with the invention, using detected knob, switch or slider positions.
FIG. 11 illustrates a board game embodiment of the invention.
FIG. 12 illustrates a generic game embodiment of the invention.
FIG. 13 illustrates a game embodiment of the invention, such as might be played in a bar.
`
`
`
`
`
FIG. 14 illustrates a laser pointer or other spot designator embodiment of the invention.
FIG. 15 illustrates a gesture based flirting game embodiment of the invention.
FIG. 16 illustrates a version of the pixel addressing camera technique wherein two lines on either side of a 1000 element square array are designated as perimeter fence lines to initiate tracking or other action.
FIG. 17 illustrates a 3-D acoustic imaging embodiment of the invention.
FIG. 18 illustrates an improved handheld computer embodiment of the invention, in which the camera or cameras may be used to look at objects, screens and the like as well as look at the user.

THE INVENTION EMBODIMENTS

FIG. 1

The invention herein and disclosed in portions of other copending applications noted above, comprehends a combination of one or more TV cameras (or other suitable electro-optical sensors) and a computer to provide various position and orientation related functions of use. It also comprehends the combination of these functions with the basic task of generating, storing and/or transmitting a TV image of the scene acquired, either in two or three dimensions.

The embodiment depicted in FIG. 1A illustrates the basic embodiments of many of my co-pending applications above. A stereo pair of cameras 100 and 101 located on each side of the upper surface of monitor 102 (for example a rear projection TV of 60 inch diagonal screen size) with display screen 103 facing the user, are connected to PC computer 106 (integrated in this case into the monitor housing), for example a 400 MHz Pentium II. For appearances and protection a single extensive cover window may be used to cover both cameras and their associated light sources 110 and 111, typically LEDs.

The LEDs in this application are typically used to illuminate targets associated with any of the fingers, hand, feet and head of the user, or objects such as 131 held by a user 135 with hands 136 and 137, and head 138. These targets, such as circular target 140 and band target 141 on object 131, are desirably, but not necessarily, retro-reflective, and may be constituted by the object features themselves (e.g., a finger tip, such as 145), or by features provided on clothing worn by the user (e.g., a shirt button 147 or polka dot 148), or by artificial targets other than retroreflectors.

Alternatively, a three camera arrangement can be used, for example using additional camera 144, to provide added sensitivity in certain angular and positional relationships. Still more cameras can be used to further improve matters, as desired. Alternatively, and or in addition, camera 144 can be used for other purposes, such as to acquire images of objects such as persons, for transmission, storage or retrieval independent of the cameras used for datum and feature location determination.

For many applications, a single camera can suffice for measurement purposes as well, such as 160 shown in FIG. 1B for example, used for simple 2 dimensional (2D) measurements in the xy plane perpendicular to the camera axis (z axis), or 3D (xyz, roll pitch yaw) where a target grouping, for example of three targets, is used such as the natural features formed by the two eyes 164, 165 and nose 166 of a human 167. These features are roughly at known distances from each other, the data from which can be used to calculate the approximate position and orientation of the human face. Using for example the photogrammetric technique of Pinkney described below, the full 6 degree of freedom solution of the human face location and orientation can be achieved to an accuracy limited by the ability of the camera image processing software utilized to determine the centroids or other delineating geometric indicators of the position of the eyes and nose (or some other facial feature such as the mouth), and the accuracy of the initial inputting of the spacing of the eyes and their respective spacing to the nose. Clearly if a standard human value is used (say for adult, or for a child, or even by age) some lessening of precision results, since these spacings are used in the calculation of distance and orientation of the face of human 167 from the camera 160.

In another generally more photogrammetrically accurate case, one might choose to use four special targets (e.g., glass bead retro-reflectors, or orange dots) 180-183 on the object 185 having known positional relationships relative to each other on the object surface, such as one inch centers. This is shown in FIG. 1C, and may be used in conjunction with a pixel addressable camera such as described in FIG. 2 below, which allows one to rapidly determine the object position and orientation and track its movements in up to 6 degrees of freedom, as disclosed by Pinkney U.S. Pat. No. 4,219,847 and technical papers referenced therein. For example, the system described above for FIGS. 1 and 2 involving the photogrammetric resolution of the relative position of three or more known target points as viewed by a camera is known and is described in a paper entitled "A Single Camera Method for the 6-Degree of Freedom Sprung Mass Response of Vehicles Redirected by Cable Barriers" presented by M. C. van Wijk and H. F. L. Pinkney to The Society of Photo-optical Instrumentation Engineers.

The stereo pair of cameras can also acquire a two view stereo image of the scene as well, which can be displayed in 3D using stereoscopic or auto-stereoscopic means, as well as transmitted or recorded as desired.

In many applications of the foregoing invention it is desirable not just to use a large screen but in fact one capable of displaying life size images. This particularly relates to human scaled images, giving a life-like presence to the data on the screen. In this way the natural response of the user with motions of hands, head, arms, etc., is scaled in "real" proportion to the data being presented.

FIG. 2

This embodiment and others disclose special types of cameras useful with the invention. In the first case, that of FIG. 2A, a pixel addressable camera such as the MAPP2200 made by IVP corporation of Sweden is used, which allows one to do many things useful for rapidly determining location of objects, their orientation and their motion.

For example, as shown in FIG. 2A, an approximately circular image 201 of a target datum such as 180 on object 185 of FIG. 1C may be acquired by scanning the pixel elements on a matrix array 205 on which the image is formed. Such an array in the future will have for example 1000×1000 pixels, or more (today the largest IVP makes is 512×512; the IVP also is not believed to be completely randomly addressable, which some future arrays will be).

As an illustration, computer 220 determines, after the array 205 has been interrogated, that the centroid "x, y" of the pixel elements on which the target image lies is at pixel x=500, y=300 (including a sub-fraction thereof in many cases). The centroid location can be determined for example by the moment method disclosed in the Pinkney patent, referenced above.

The target in this case is defined as a contrasting point on the object, and such contrast can be in color as well as, or
`
`
`
`
`
instead of, intensity. Or with some added preprocessing, it can be a distinctive pattern on the object, such as a checkerboard or herringbone.

Subsequent Tracking

To subsequently track the movement of this target image, it is now only necessary to look in a small pixel window composed of a small number of pixels around the target, for example the square 230 shown, as the new position x'y' of the target image cannot be further distant within a short period of time elapsed from the first scan, and in consideration of the small required time to scan the window.

For example, if the window is 100×100 pixels, this can be scanned in 1 millisecond or less with such a pixel addressing camera, by interrogating only those pixels in the window, while still communicating with the camera over a relatively slow USB serial link of 12 mb transmission rate (representing 12,000 pixel gray level values in one millisecond).

One thus avoids the necessity to scan the whole field, once the starting target image position is identified. This can be known by an initial scan as mentioned, or can be known by having the user move an object with a target against a known location with respect to the camera such as a mechanical stop, and then indicate that tracking should start either by verbally saying so with voice recognition, or by actuating a control key such as 238 or whatever.

It is noted that if the tracking window is made large enough, then it can encompass a whole group of datums, such as 180-183 on an object.

FIG. 2B - Reduction in Acquisition Time

Another application of such a pixel addressing camera is shown in FIG. 2B. One can look at the whole field, xy of the camera, 240, but only address say every 10th pixel, such as 250, 251 and 252, in each direction, i.e., for a total of 10,000 pixels in a field of 1 million (1000×1000, say).

In this case computer 220 simply queries this fraction of the pixels in the image, knowing apriori that the target image such as 260 will have an image size larger than 10×10 pixels, and must be detectable, if of sufficient contrast, by one of the queried pixels. (For smaller or larger target images, the number and spacing of queried pixels can be adjusted accordingly.) This for example allows one to find the approximate location of targets with only 1/100 the pixel interrogation time otherwise needed, plus any gain obtained as disclosed above by knowing in what region of the image to look (for example during tracking, or given some apriori knowledge of approximate location due to a particular aspect of the physical arrangement or the program in question).

Once a target has been approximately found as just described, the addressing can be optimized for that region of the image only, as disclosed in the subsequent tracking section above.

Given the invention, the potential for target acquisition in a millisecond or two thus is achievable with simple pixel addressable CMOS cameras coming on stream now (today costing under $50), assuming the target points are easily identifiable from at least one of brightness (over a value), contrast (with respect to surroundings), color, color contrast, and, more difficult, shape or pattern (e.g., a plaid, or herringbone portion of a shirt). This has major ramifications for the robustness of control systems built on such camera based acquisition, be they for controlling displays, or machines or whatever.

It's noted that with new 2000×2000 cameras coming on stream, it may only be necessary to look at every 15th or 20th pixel in each direction to get an adequate feel for target location. This means every 200th to 400th pixel, not enough to cause image rendition difficulties even if totally dark grey (as it might be in a normal white light image if set up for IR wavelengths only).

FIG. 2C

Another method for finding the target in the first place with limited pixel interrogation is to look at pixels near a home point where a person for example indicates that the target is. This could be, for example, placing one's fingernail such as 270, whose natural or artificial (e.g., reflective nail polish) features are readily seen by the camera 275 and determined to be in the right corner of a pad 271 in FIG. 2C which approximately covers the field of view 274 of the camera 275. The computer 220 analyzes the pixels in the right corner 278 of the image field 279 representing the pad portion 271 with the camera 275, either continuously, or only when the finger for example hits a switch such as 280 at the edge of the pad, or on command (e.g., by the user pushing a button or key, or a voice message inputted via microphone 285 for example). After such acquisition, the target is then tracked to other locations in xy space of the pad, for example as described above. It's noted that it helps to provide a beep or other sound or indication when acquisition has been made.

Pick Windows in Real Time

Another aspect of the invention is that one can also pick the area of the image to interrogate at any desired moment. This can be done by creating a window of pixels within the field to generate information, for example as discussed relative to a specific car dashboard application of FIG. 10.

FIG. 2D - Scan Pattern

A pixel addressing camera also allows a computer such as 220 to cause scans to be generated which are not typical raster scans, for example circular or radial, or even odd shapes as desired. This can be done by providing from the computer the sequential addresses of the successive pixels on the camera chip whose detected voltages are to be queried.

A circular scan of pixels addressed at high speed can be used to identify when and where a target enters a field enclosed by the circular pixel scan. This is highly useful, and after that, the approximate location of the target can be determined by further scans of pixels in the target region.

For example, consider addressing the pixels c1 c2 c3 . . . cn representing a circle 282 at the outer perimeter of the array, 285, of 1000×1000 elements such as discussed above. The number of pixels in a full circle is approximately 1000 pi, which can be scanned even with USB (universal serial bus) limits at 300 times per second or better. For targets of 1/100 field in width, this means that a target image entering the field such as circular target image 289 (which is shown intersecting element cm and its neighbors) would have to travel 1/100 the field width in 0.0033 seconds to be totally missed in a worst case. If the image field corresponds to 20 inches in object field width, this is 0.2 inches × 300/sec or 60 inches/second, very fast for human movement, and not likely to be exceeded even where smaller targets are used.

Alternative shapes to circular "trip wire" perimeters may be used, such as squares, zig-zag, or other layouts of pixels to determine target presence. Once determined, a group of pixels such as group 292 can be interrogated to get a better determination of target location.

FIG. 3

Since many applications of the invention concern, or at least have present, a human caused motion, or motion of a part of a human, or an object moved by a human, the identification and tracking problem can be simplified if the features of interest, either natural or artificial, of the object provide some kind of change in appearance during such motion.
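The coarse-to-fine acquisition flow described for FIGS. 2A, 2B and 2D (sparse interrogation to find a target quickly, a perimeter "trip wire" to catch targets entering the field, then a moment-method centroid computed over a small window) can be sketched in software. The following is a minimal illustration, not the patent's implementation: it assumes the pixel addressable sensor can be modeled as an array whose elements are read one at a time, and all function names and parameter values are hypothetical.

```python
# Sketch (assumed model, not the MAPP2200 API): a pixel-addressable sensor is
# simulated as a numpy array; "addressing" a pixel is reading one element.
import numpy as np

def perimeter_addresses(n):
    """Addresses of a square 'trip wire' at the array's outer edge (the
    circle c1..cn of FIG. 2D, approximated here by a square ring)."""
    top = [(0, x) for x in range(n)]
    bottom = [(n - 1, x) for x in range(n)]
    left = [(y, 0) for y in range(1, n - 1)]
    right = [(y, n - 1) for y in range(1, n - 1)]
    return top + bottom + left + right

def sparse_scan(image, step=10, threshold=128):
    """FIG. 2B idea: query only every `step`-th pixel to coarsely locate a
    bright target larger than step x step; returns a hit's (y, x) or None."""
    h, w = image.shape
    for y in range(0, h, step):
        for x in range(0, w, step):
            if image[y, x] > threshold:
                return (y, x)
    return None

def window_centroid(image, center, half=25, threshold=128):
    """First-moment (centroid) estimate over a small window around an
    approximate hit, in the spirit of the moment method cited above."""
    h, w = image.shape
    y0, x0 = center
    ys = slice(max(0, y0 - half), min(h, y0 + half + 1))
    xs = slice(max(0, x0 - half), min(w, x0 + half + 1))
    mask = image[ys, xs] > threshold
    if not mask.any():
        return None
    yy, xx = np.nonzero(mask)
    return (ys.start + yy.mean(), xs.start + xx.mean())

if __name__ == "__main__":
    field = np.zeros((200, 200), dtype=np.uint8)
    field[95:116, 140:161] = 255            # a 21x21 bright target image
    ring = perimeter_addresses(200)          # trip wire: nothing entering
    assert all(field[y, x] <= 128 for (y, x) in ring)
    hit = sparse_scan(field, step=10)        # ~1/100 of the pixels queried
    cy, cx = window_centroid(field, hit)
    print(round(cy, 1), round(cx, 1))        # prints 105.0 150.0
```

The point of the structure is the interrogation budget: the sparse pass touches about 1/100 of the pixels, and the refinement pass touches only the window, mirroring the acquisition-time arguments made in the text.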
`
`
`
FIG. 3 illustrates tracking embodiments of the invention using intensity variation to identify and/or track object target datums. In a simple case, a subtraction of successive images can aid in identifying zones in an image having movement of features, as is well known. It is also useful to add pixel intensities of successive images in computer 220 for example. This is particularly true with bright targets (with respect to their usual surroundings) such as LEDs or retro-reflectors. If the pixels in use by the camera are able to gather light preferentially at the same time a special illumination light is on, this will accentuate the target with respect to background. And if successive frames are taken in this way, not only will a stationary image of the special target build up, but if movement takes place the target image then will blur in a particular direction which itself can become identifiable. And the blur direction indicates direction of motion as well, at least in the 2-D plane of the pixel array used.

Another form of movement can take place artificially, ... blink pulse length can all be determined if the scan rate is sufficiently faster than the blink rate, or pulse time.

It should be noted that if the target 340 is a retro-reflector as in FIG. 1, with an illumination source such as 355 near the axis of the camera, then the LEDs (or other sources) of the illuminator can be modulated, causing the same effect on the target.

Somewhat more sophisticated is the situation shown in FIG. 3C where a target 380 (on object 360) illuminated by a light source 365 provides a time variant intensity change in the camera image 368 obtained by camera 370 as the target moves its position and that of the image. This can be achieved naturally by certain patterns of material such as herringbone, or by multifaceted reflectors such as cut diamonds (genuine or glass), which "twinkle" as the object moves. A relatively high frequency "twinkle" in the image indicates then the presence of the target in that area of the image in which it is found.

When analog sensors such as PSD (position sensing diode) sensor 369 descri