Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 1 of 17 PageID #: 1087

EXHIBIT C
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 2 of 17 PageID #: 1088

US008553079B2

(12) United States Patent                 (10) Patent No.:     US 8,553,079 B2
     Pryor                                (45) Date of Patent:       Oct. 8, 2013

(54) MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS

(71) Applicant: Timothy R. Pryor, Sylvania, OH (US)

(72) Inventor:  Timothy R. Pryor, Sylvania, OH (US)

( * ) Notice:   Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 13/714,748

(22) Filed:     Dec. 14, 2012

(65)            Prior Publication Data

     US 2013/0169535 A1       Jul. 4, 2013

                Related U.S. Application Data

(63) Continuation of application No. 12/700,055, filed on Feb. 4, 2010, which is a continuation of application No. 10/866,191, filed on Jun. 14, 2004, now abandoned, which is a continuation of application No. 09/433,297, filed on Nov. 3, 1999, now Pat. No. 6,750,848.

(60) Provisional application No. 60/107,652, filed on Nov. 9, 1998.

(51) Int. Cl.
     H04N 9/47           (2006.01)
     H04N 7/18           (2006.01)

(52) U.S. Cl.
     USPC ........................... 348/77; 348/155

(58) Field of Classification Search
     See application file for complete search history.

(56)                References Cited

            U.S. PATENT DOCUMENTS

     3,909,002 A      9/1975  Levy
     4,219,847 A      8/1980  Pinkney et al.
     4,339,798 A      7/1982  Hedges et al.
     4,631,676 A     12/1986  Pugh
     4,791,589 A     12/1988  Blazo et al.
     4,843,568 A      6/1989  Krueger et al.
     4,908,704 A      3/1990  Fujioka et al.
     4,988,981 A      1/1991  Zimmerman et al.
     5,008,946 A      4/1991  Ando
     5,088,928 A      2/1992  Chan
     5,227,986 A      7/1993  Yokota et al.
     5,297,061 A      3/1994  DeMenthon et al.
     5,365,597 A     11/1994  Holeva
     5,376,796 A     12/1994  Chan et al.
     5,388,059 A      2/1995  DeMenthon
     5,454,043 A      9/1995  Freeman
     5,459,793 A *   10/1995  Naoi et al. .................... 382/165
     5,491,507 A      2/1996  Umezawa et al.
     5,528,263 A *    6/1996  Platzker et al. ................ 345/156
     5,534,921 A      7/1996  Sawanobori
     5,572,251 A *   11/1996  Ogawa ...................... 348/207.99
     5,581,276 A     12/1996  Cipolla et al.
     5,594,469 A      1/1997  Freeman et al.
     5,616,078 A *    4/1997  Oh ................................ 463/8
     5,624,117 A      4/1997  Ohkubo et al.
     5,781,647 A      7/1998  Fishbine et al.
     5,781,650 A      7/1998  Lobo et al.
     5,808,672 A *    9/1998  Wakabayashi et al. ........ 348/220.1
     5,828,770 A     10/1998  Leis et al.
     5,845,006 A     12/1998  Sumi et al.
     5,853,327 A     12/1998  Gilboa
     5,864,334 A *    1/1999  Sellers ......................... 345/168

                     (Continued)

Primary Examiner — Peling Shaw
(74) Attorney, Agent, or Firm — Warner Norcross & Judd LLP

(57)                  ABSTRACT

A method for determining a gesture illuminated by a light source utilizes the light source to provide illumination through a work volume above the light source. A camera is positioned to observe and determine the gesture performed in the work volume.

30 Claims, 7 Drawing Sheets

GTP_00000884
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 3 of 17 PageID #: 1089

US 8,553,079 B2
Page 2

(56)                References Cited

            U.S. PATENT DOCUMENTS

     5,878,174 A      3/1999  Stewart et al.
     5,904,484 A      5/1999  Burns
     5,926,168 A *    7/1999  Fan ............................ 345/157
     5,936,610 A      8/1999  Endo
     5,940,126 A      8/1999  Kimura
     5,982,352 A     11/1999  Pryor
     5,999,840 A     12/1999  Grimson et al.
     6,052,132 A      4/2000  Christian et al.
     6,098,458 A      8/2000  French et al.
     6,108,033 A      8/2000  Ito et al.
     6,148,100 A     11/2000  Anderson et al.
     6,160,899 A     12/2000  Lee et al.
     6,204,852 B1     3/2001  Kumar et al.
     6,252,598 B1*    6/2001  Segen ......................... 715/863
     6,342,917 B1     1/2002  Amenta
     6,346,929 B1*    2/2002  Fukushima et al. .............. 345/8
     6,359,647 B1     3/2002  Sengupta et al.
     6,363,160 B1     3/2002  Bradski et al.
     6,373,472 B1     4/2002  Palalau et al.
     6,442,465 B2     8/2002  Breed et al.
     6,508,709 B1     1/2003  Karmarkar
     6,529,617 B1     3/2003  Prokoski
     6,597,817 B1     7/2003  Silverbrook
     6,663,491 B2    12/2003  Watabe et al.
     6,750,848 B1     6/2004  Pryor
     6,775,361 B1     8/2004  Arai et al.
     6,788,336 B1     9/2004  Silverbrook
     6,911,972 B2     6/2005  Brinjes
     7,489,863 B2     2/2009  Lee

* cited by examiner

GTP_00000885
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 4 of 17 PageID #: 1090

U.S. Patent     Oct. 8, 2013     Sheet 1 of 7     US 8,553,079 B2

[Drawing: FIG. 1]

GTP_00000886
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 5 of 17 PageID #: 1091

U.S. Patent     Oct. 8, 2013     Sheet 2 of 7     US 8,553,079 B2

[Drawing: FIG. 2]

GTP_00000887
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 6 of 17 PageID #: 1092

U.S. Patent     Oct. 8, 2013     Sheet 3 of 7     US 8,553,079 B2

[Drawing: FIG. 3]

GTP_00000888
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 7 of 17 PageID #: 1093

U.S. Patent     Oct. 8, 2013     Sheet 4 of 7     US 8,553,079 B2

[Drawing: FIG. 4; reference numerals 400, 408, 411, 420]

GTP_00000889
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 8 of 17 PageID #: 1094

U.S. Patent     Oct. 8, 2013     Sheet 5 of 7     US 8,553,079 B2

[Drawing: FIG. 5; reference numerals 500, 535, 540]

GTP_00000890
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 9 of 17 PageID #: 1095

U.S. Patent     Oct. 8, 2013     Sheet 6 of 7     US 8,553,079 B2

[Drawing: FIG. 6]

GTP_00000891
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 10 of 17 PageID #: 1096

U.S. Patent     Oct. 8, 2013     Sheet 7 of 7     US 8,553,079 B2

[Drawings: FIGS. 7A-7B; labels include INPUT, WOMAN'S MEASURE-, INTERNET, REMOTE; reference numerals 1050, 1059, 1060, 1065, 1070, 1085]

GTP_00000892
Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 11 of 17 PageID #: 1097

US 8,553,079 B2

MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/700,055, filed Feb. 4, 2010, which is a continuation of U.S. patent application Ser. No. 10/866,191, filed Jun. 14, 2004, which is a continuation of U.S. patent application Ser. No. 09/433,297, filed Nov. 3, 1999 (now U.S. Pat. No. 6,750,848), which claims benefit of U.S. Provisional Application No. 60/107,652, filed Nov. 9, 1998. These applications are hereby incorporated by reference.
REFERENCES TO RELATED APPLICATIONS BY THE INVENTORS

U.S. patent application Ser. No. 09/138,339, filed Aug. 21, 1998.
U.S. Provisional Application No. 60/056,639, filed Aug. 22, 1997.
U.S. Provisional Application No. 60/059,561, filed Sep. 19, 1998.
Man Machine Interfaces: Ser. No. 08/290,516, filed Aug. 15, 1994, and now U.S. Pat. No. 6,008,800.
Touch TV and Other Man Machine Interfaces: Ser. No. 08/496,908, filed Jun. 29, 1995, and now U.S. Pat. No. 5,982,352.
Systems for Occupant Position Sensing: Ser. No. 08/968,114, filed Nov. 12, 1997, now abandoned, which claims benefit of Ser. No. 60/031,2456, filed Nov. 12, 1996.
Target holes and corners: U.S. Ser. No. 08/203,603, filed Feb. 28, 1994, and Ser. No. 08/468,358, filed Jun. 6, 1995, now U.S. Pat. No. 5,956,417 and U.S. Pat. No. 6,044,183.
Vision Target Based Assembly: U.S. Ser. No. 08/469,429, filed Jun. 6, 1995, now abandoned; Ser. No. 08/469,907, filed Jun. 6, 1995, now U.S. Pat. No. 6,301,763; Ser. No. 08/470,325, filed Jun. 6, 1995, now abandoned; and Ser. No. 08/466,294, filed Jun. 6, 1995, now abandoned.
Picture Taking Method and Apparatus: Provisional Application No. 60/133,671, filed May 11, 1998.
Methods and Apparatus for Man Machine Interfaces and Related Activity: Provisional Application No. 60/133,673, filed May 11, 1998.
Camera Based Man-Machine Interfaces: Provisional Patent Application No. 60/142,777, filed Jul. 8, 1999.
The copies of the disclosure of the above referenced applications are incorporated herein by reference.
BACKGROUND OF THE INVENTION

1. Field of the Invention

The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing object or human positions and/or orientations. The invention in many preferred embodiments uses real time stereo photogrammetry using single or multiple TV cameras whose output is analyzed and used as input to a personal computer, typically to gather data concerning the 3D location of parts of, or objects held by, a person or persons.

This continuation application seeks to provide further detail on useful embodiments for computing. One embodiment is a keyboard for a laptop computer (or stand alone keyboard for any computer) that incorporates digital TV cameras to look at points on, typically, the hand or the finger, or objects held in the hand of the user, which are used to input data to the computer. It may also or alternatively look at the head of the user as well.
Both hands or multiple fingers of each hand, or an object in one hand and fingers of the other, can be simultaneously observed, as can alternate arrangements as desired.

2. Description of Related Art

My referenced co-pending applications, incorporated herein by reference, discuss many prior art references in various pertinent fields, which form a background for this invention.
BRIEF DESCRIPTION OF FIGURES

FIG. 1 illustrates a laptop or other computer keyboard with cameras according to the invention located on the keyboard surface to observe objects such as fingers and hands overhead of the keyboard.
FIG. 2 illustrates another keyboard embodiment using special datums or light sources such as LEDs.
FIG. 3 illustrates a further finger detection system for laptop or other computer input.
FIG. 4 illustrates learning, amusement, monitoring, and diagnostic methods and devices for the crib, playpen and the like.
FIG. 5 illustrates a puzzle toy for young children having cut out wood characters according to the invention.
FIG. 6 illustrates an improved handheld computer embodiment of the invention, in which the camera or cameras may be used to look at objects, screens and the like as well as look at the user along the lines of FIG. 1.
FIGS. 7A-B illustrate new methods for internet commerce and other activities involving remote operation with 3D virtual objects display.
DESCRIPTION OF THE INVENTION

FIG. 1

A laptop (or other) computer keyboard based embodiment is shown in FIG. 1. In this case, a stereo pair of cameras 100 and 101 located on each side of the keyboard are used, desirably having cover windows 103 and 104 mounted flush with the keyboard surface 102. The cameras are preferably pointed obliquely inward at angles Φ toward the center of the desired work volume 170 above the keyboard. In the case of cameras mounted at the rear of the keyboard (toward the display screen), these cameras are also inclined to point toward the user at an angle as well.

Alternate camera locations may be used, such as the positions of cameras 105 and 106 on upper corners of screen housing 107, looking down at the top of the fingers (or hands, or objects in hand or in front of the cameras), or of cameras 108 and 109 shown.
One of the referenced embodiments of the invention is to determine the pointing direction vector 160 of the user's finger (for example pointing at an object displayed on screen 107), or the position and orientation of an object held by the user. Alternatively, finger position data can be used to determine gestures such as pinch or grip, and other examples of relative juxtaposition of objects with respect to each other, as has been described in co-pending referenced applications. Positioning of an object or portions (such as hands or fingers of a doll) is also of use, though more for use with larger keyboards and displays.
GTP_00000893

Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 12 of 17 PageID #: 1098

US 8,553,079 B2

In one embodiment, shown in FIG. 2, cameras such as 100/101 are used to simply look at the tip of a finger 201 (or thumb) of the user, or an object such as a ring 208 on the finger. Light from below, such as provided by single central light 122, can be used to illuminate the finger, which typically looks bright under such illumination.
It is also noted that the illumination is directed or concentrated in an area where the finger is typically located, such as in work volume 170. If the light is of sufficient spectral content, the natural flesh tone of the finger can be observed and recognized by use of the color TV cameras 100/101.

As is typically the case, the region of the overlapping cameras' viewing area is relatively isolated to the overlapping volumetric zone of their fields 170 shown, due to the focal lengths of their lenses and the angulation of the camera axes with respect to each other. This restricted overlap zone helps mitigate against unwanted matches in the two images due to information generated outside the zone of overlap. Thus there are no significant image matches found of other objects in the room, since the only flesh-toned object in the zone is typically the finger or fingers of the user, or alternatively, for example, the user's hand or hands. Similarly, objects or targets thereon can be distinguished by special colors or shapes.
If desired, or required, motion of the fingers can also be used to further distinguish their presence vis-a-vis any static background. If, for example, by subtraction of successive camera frames, the image of a particular object is determined to have moved, it is determined that this is likely the object of potential interest, which can be further analyzed directly to determine if it is the object of interest.
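By way of illustration, the frame-subtraction test just described can be sketched in a few lines of Python; the grayscale-array framing, the function name, and the 25-gray-level threshold are assumptions made for the sketch, not details taken from the patent.

    import numpy as np

    def moving_object_mask(prev_frame, curr_frame, threshold=25):
        # Flag pixels whose gray level changed between successive camera
        # frames; connected regions of this mask are the candidate
        # "objects of potential interest" described above, to be analyzed
        # further (by color or shape) before being accepted.
        diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
        return diff > threshold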
In case of obscuration of the fingers or objects in the hand, cameras in additional locations, such as those mentioned above, can be used to solve for position if the view of one or more cameras is obscured.

The use of cameras mounted on both the screen and the keyboard allows one to deal with obscurations that may occur, and certain objects may or may not be advantageously delineated in one view or the other.

In addition, it may be in many cases desirable to have a datum on the top of the finger as opposed to the bottom because, on the bottom, it can get in the way of certain activities. In this case the sensors are required on the screen looking downward, or in some other location such as off the computer entirely and located overhead, as has been noted in a previous application.

To determine finger location, a front end processor like that described in the target holes and corners co-pending application reference incorporated (U.S. Ser. Nos. 08/203,603 and 08/468,358) can be used to also allow the finger shape as well as color to be detected.
Finger gestures comprising a sequence of finger movements can also be detected by analyzing sequential image sets, such that the motion of the finger, or of one finger with respect to another (such as in pinching something), can be determined.
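A pinch of the kind mentioned can be tested, once fingertip positions are being tracked across the image sequence, by thresholding the distance between two tips. This sketch assumes 3-D tip positions in millimeters and an illustrative 15 mm grasp distance, neither of which comes from the patent.

    import numpy as np

    def is_pinch(thumb_tip, finger_tip, grasp_distance_mm=15.0):
        # True when the tracked thumb and finger tips have closed to
        # within the assumed grasp distance; evaluating this per frame
        # over the sequential image sets yields the pinch gesture event.
        thumb_tip = np.asarray(thumb_tip, dtype=float)
        finger_tip = np.asarray(finger_tip, dtype=float)
        return np.linalg.norm(thumb_tip - finger_tip) < grasp_distance_mm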
Cameras 100 and 101 have been shown at the rear of the keyboard near the screen, or at the front. They may mount in the middle of the keyboard or any other advantageous location.

The cameras can also see one's fingers directly, to allow typing as now, but without the physical keys. One can type in space above the plane of the keyboard (or in this case the plane of the cameras). This is useful for those applications where a keyboard of conventional style is too big (e.g., the hand held computer of FIG. 6).
FIG. 2

It is also desirable for fast reliable operation to use retroreflective materials and other materials to augment the contrast of objects used in the application. For example, a line target such as 200 can be worn on a finger 201, and advantageously can be located, if desired, between two joints of the finger as shown. This allows the tip of the finger to be used to type on the keyboard without feeling unusual, the case perhaps with target material on the tip of the finger.
The line image detected by the camera can be provided also by a cylinder, such as retroreflective cylinder 208 worn on the finger 201, which effectively becomes a line image in the field of view of each camera (assuming each camera is equipped with a sufficiently coaxial light source, typically one or more LEDs such as 210 and 211). This can be used to solve easily, using the line image pairs with the stereo cameras, for the pointing direction of the finger, which is often a desired result. The line, in the stereo pair of images, provides the pointing direction of the finger, for example pointing at an object displayed on the screen 140 of the laptop computer 138.
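Geometrically, each camera's image of the line target, together with that camera's center of projection, defines a plane in space, and the finger axis lies along the intersection of the two planes. A minimal sketch of that solve, assuming calibrated cameras so that each back-projected plane's unit normal is already expressed in a common world frame (the names and framing are illustrative, not from the patent):

    import numpy as np

    def pointing_direction(n_left, n_right):
        # The 3-D line lies in both back-projected planes, so its
        # direction is perpendicular to both plane normals.
        d = np.cross(np.asarray(n_left, dtype=float),
                     np.asarray(n_right, dtype=float))
        return d / np.linalg.norm(d)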
FIG. 3

It is also possible to have light sources on the finger that can be utilized, such as the two LED light sources shown in FIG. 3. This can be used with either TV camera type sensors or with PSD type analog image position sensors, as disclosed in references incorporated.

In particular, the ring mounted LED light sources 301 and 302 can be modulated at different frequencies that can be individually discerned by sensors imaging the sources onto a respective PSD detector. Alternatively, the sources can simply be turned on and off at different times, such that the position of each point can be independently found, allowing the pointing direction to be calculated from the LED point data gathered by the stereo pair of PSD based sensors.
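For the time-multiplexed alternative, here is a sketch of how the two sources could be separated, assuming the PSD readings are tagged with which LED was strobed on each sample; the tagging scheme, names, and framing are assumptions for illustration, not details from the patent.

    import numpy as np

    def demux_led_positions(readings, led_index):
        # readings:  N x 2 array of (x, y) PSD image positions over time.
        # led_index: length-N array of 0/1 flags, driven by the same
        #            clock that strobes LEDs 301 and 302 on and off.
        # Averaging each LED's own samples gives an independent position
        # estimate per source; the pointing direction then follows from
        # the two point positions as described in the text.
        readings = np.asarray(readings, dtype=float)
        led_index = np.asarray(led_index)
        return {led: readings[led_index == led].mean(axis=0) for led in (0, 1)}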
The "natural interface keyboard" here described can have cameras or other sensors located at the rear looking obliquely outward toward the front as well as inward, so as to have their working volumes overlap in the middle of the keyboard, such that nearly the full volume over the keyboard area is accommodated.

Clearly larger keyboards can have a larger working volume than one might have on a laptop. The pair of sensors used can be augmented with other sensors mounted on the screen housing. It is noted that the linked dimension afforded for calibration between the sensors located on the screen and those on the keyboard is provided by the laptop's unitary construction.
One can use angle sensing means such as a rotary encoder for the laptop screen tilt. Alternatively, cameras located on the screen can be used to image reference points on the keyboard as reference points to achieve this. This allows the calibration of the sensors mounted fixedly with respect to the screen with respect to the sensors and keyboard space below. It also allows one to use stereo pairs of sensors that are not in the horizontal direction (such as 101/102), but could for example be a camera sensor such as 100 on the keyboard coupled with one on the screen, such as 106.

Knowing the pointing angles of the two cameras with respect to one another allows one to solve for the 3D location of objects from the matching of the object image positions in the respective camera fields.
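As an illustration of that stereo solution, the classic midpoint method: back-project the matched image position through each camera to get a ray, then take the midpoint of the two rays' closest approach. The sketch assumes camera centers and unit ray directions already expressed in one coordinate frame (i.e., the calibration discussed above has been done); it is one standard way to realize the computation, not necessarily the patent's.

    import numpy as np

    def triangulate_midpoint(p1, d1, p2, d2):
        # p1, p2: camera centers; d1, d2: unit ray directions through the
        # matched image positions. Returns the midpoint of the shortest
        # segment between the two rays as the 3-D object location.
        p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
        b = p2 - p1
        cos = d1 @ d2
        denom = 1.0 - cos ** 2
        if denom < 1e-12:  # near-parallel rays give no stable fix
            raise ValueError("rays are nearly parallel")
        t1 = (b @ d1 - (b @ d2) * cos) / denom
        t2 = t1 * cos - b @ d2
        return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0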
As noted previously, it is also of interest to locate a line or cylinder type target on the finger between the first and second joints. This allows one to use the fingertip for keyboard activity, but, by raising the finger up, it can be used as a line target capable of solving for the pointed direction, for example.

Alternatively, one can use two point targets on the finger, such as either retroreflective datums, colored datums such as rings, or LED light sources that can also be used with PSD detectors, as has also been noted in FIG. 2.
GTP_00000894

Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 13 of 17 PageID #: 1099

US 8,553,079 B2

When using the cameras located for the purpose of stereo determination of the position of the fingers from their flesh tone images, it is useful to follow the preprocessing capable of processing data obtained from the cameras in order to look for the finger. This can be done on both a color basis and on the basis of shape as well as motion.
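One common form of such color-basis preprocessing is a flesh-tone preselection in HSV space before any shape or motion analysis. The ranges below are illustrative starting values that a real system would tune to its cameras and illumination; none of the constants come from the patent.

    import numpy as np

    def flesh_tone_mask(hsv_frame, hue_max=25, sat_range=(40, 180), val_min=60):
        # hsv_frame: H x W x 3 uint8 array in OpenCV-style HSV encoding.
        # Pixels passing all three range tests are candidate skin regions,
        # to be confirmed by the shape and motion tests described above.
        h, s, v = hsv_frame[..., 0], hsv_frame[..., 1], hsv_frame[..., 2]
        return ((h <= hue_max) &
                (s >= sat_range[0]) & (s <= sat_range[1]) &
                (v >= val_min))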
In this invention, I have shown the use of not only cameras located on a screen looking downward or outward from the screen, but also cameras that can be used instead of, or in combination with, those on the screen, placed essentially on the member on which the keyboard is incorporated. This allows the keyboard mounted cameras, which are preferably mounted flush with the keyboard surface to be unobtrusive, to nevertheless visually see the user's fingers, hands, or objects held by the user, and in some cases the face of the user.

This arrangement is also useful for 3D displays, for example where special synchronized glasses (e.g., the "Crystal Eyes" brand often used with Silicon Graphics workstations) are used to alternately present right and left images to each eye. In this case the object may appear to be actually in the workspace 170 above the keyboard, and it may be manipulated by virtually grasping (pushing, pulling, etc.) it, as has been described in co-pending applications.
FIG. 4: Baby Learning and Monitoring System

A baby's reaction to the mother (or father) and the mother's analysis of the baby's reaction is very important. There are many gestures of babies apparently indicated in child psychology as being quite indicative of various needs, wants, or feelings and emotions, etc. These gestures are typically made with the baby's hands.

Today this is done and learned entirely by the mother being with the baby. However, with an electro-optical sensor based computer system, such as that described in co-pending applications, located proximate to or even in the crib (for example), one can have the child's reactions recorded, not just in the sense of a video tape, which would be too long and involved for most to use, but also in terms of the actual motions, which could be computer recorded and analyzed, also with the help of the mother, as to what the baby's responses were. And such motions, combined with other audio and visual data, can be very important to the baby's health, safety, and learning.

Consider for example crib 400 with computer 408 having LCD monitor 410 and speaker 411 and camera system (single or stereo) 420 as shown, able to amuse or inform baby 430, while at the same time recording (both visually, aurally, and in movement detected position data concerning parts of his body or objects such as rattles in his hand) his responses for any or all of the purposes of diagnosis of his state of being, remote transmission of his state, cues to various programs or images to display to him or broadcast to others, or the like.

For one example, the baby's motions could be used to signal a response from the TV, either in the absence of the mother or with the mother watching on a remote channel. This can even be over the Internet if the mother is at work.
For example, a comforting message could come up on the TV from the mother that could be prerecorded (or alternatively could actually be live, with TV cameras in the mother's or father's workplace, for example on a computer used by the parent) to tell the baby something reassuring, or comfort the baby, or whatever. Indeed the parent can be monitored using the invention and indicate something back, or even control a teleoperated robotic device to give a small child something to eat or drink, for example. The same applies to a disabled person.

If the father or mother came up on the screen, the baby could wave at it, move its head or "talk" to it, but the hand gestures may be the most important.

If the mother knows what the baby is after, she can talk to the baby or say something, or show something that the baby recognizes, such as a doll. After a while, looking at this live, one can then move to talking to the baby from some prerecorded data.
What other things might we suppose? The baby for example knows to put its hand on the mother's cheek to cause the mother to turn to it. The baby also learns some other reflexes when it is very young that it forgets when it gets older. Many of these reflexes are hand movements, and are important in communicating with the remote TV based mother representation, whether real via telepresence or from CD ROM or DVD disk (or other media, including information transmitted to the computer from afar), and for the learning of the baby's actions.

Certainly just from the making-the-baby-feel-good point of view, it would seem like certain motherly (or fatherly, etc.) responses to certain baby actions, in the form of words and images, would be useful. This stops short of the physical holding of the baby which is often needed, but could act as a stop gap to allow the parents to get another hour's sleep, for example.
As far as the baby touching things, I've discussed in other applications methods for realistic touch combined with images. This leads to a new form of touching crib mobiles that could contain video images and/or be imaged themselves, plus, if desired, be touched in ways that would be far beyond any response that you could get from a normal mobile.

For example, let us say there is a targeted (or otherwise TV observable) mobile 450 in the crib above the baby. Baby reaches up and touches a piece of the mobile, which is sensed by the TV camera system (either from the baby's hand position, the mobile movement, or both), and a certain sound is called up by the computer, a musical note for example. Another piece of the mobile, and another musical note. The mobile becomes a musical instrument for the baby that could play either notes or chords or complete passages, or any other desired programmed function.
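The touch-to-note mapping could be as simple as comparing the camera system's solved hand position against the known positions of the mobile's pieces; every coordinate, piece name, note, and the touch radius below is an invented placeholder for illustration, not data from the patent.

    import numpy as np

    MOBILE_NOTES = {
        "bear": (np.array([0.10, 0.25, 0.50]), "C4"),  # piece position (m), note
        "star": (np.array([-0.05, 0.30, 0.50]), "E4"),
        "moon": (np.array([0.00, 0.35, 0.45]), "G4"),
    }

    def note_for_touch(hand_pos, reach=0.06):
        # Return the note of the first mobile piece within the assumed
        # 6 cm touch radius of the tracked hand position, else None.
        hand_pos = np.asarray(hand_pos, dtype=float)
        for piece_pos, note in MOBILE_NOTES.values():
            if np.linalg.norm(hand_pos - piece_pos) < reach:
                return note
        return None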
The baby can also signal things. The baby can signal using agitated movements, which would often mean that it's unhappy. This could be interpreted, using learned movement signatures and artificial intelligence as needed, by the computer to call for mother even if the baby wasn't crying. If the baby cries, that can be picked up by microphone 440 and recognized using a voice recognition system, along the lines of that used in the IBM ViaVoice commercial product for example. And even the degree of crying can be analyzed to determine appropriate action.

The computer could also be used to transmit information of this sort via internet email to the mother, who could even be at work. And until help arrives in the form of mother intervention or whatever, the computer could access a program that could display on a screen for the baby things that the baby likes, and could try to soothe the baby through either images of familiar things, music or whatever. This could be useful at night when parents need sleep, and anything that would make the baby feel more comfortable would help the parents.

It could also be used to allow the baby to input to the device. For example, if the baby was hungry, a picture of the bottle could be brought up on the screen. The baby then could yell for the bottle. Or if the baby needed his diaper changed, perhaps something reminiscent of that. If the baby reacts to such suggestions of his problem, this gives a lot more intelligence as to why he is crying, and while mothers can generally tell right away, not everyone else can. In other words, this is pretty neat for babysitters and other members of the household so they can act more intelligently on the signals the baby is providing.
GTP_00000895

Case 2:21-cv-00040-JRG Document 64-3 Filed 08/15/21 Page 14 of 17 PageID #: 1100

US 8,553,079 B2

Besides in the crib, the system as described can be used in conjunction with a playpen, high-chair or other place of baby activity.
As the child gets older, the invention can further be used also with more advanced activity with toys, and to take data from toy positions as well; for example, blocks, dolls, little cars, and moving toys even such as trikes, scooters, drivable toy cars and bikes with training wheels.

The following figure illustrates the ability of the invention to learn, and thus to assist in the creation of toys and other things.
FIG. 5: Learning Puzzle Toy

Disclosed in FIG. 5 is a puzzle toy 500 where woodcut animals such as bear 505 and lion 510 are pulled out with a handle such as 511. The child can show the animal to the camera, and a computer 530 with TV camera (or cameras) 535 can recognize the shape as the animal, and provide a suitable image and sounds on screen 540.

Alternatively, and more simply, a target or targets on the back of the animal can be used, such as triangle 550 on the back of lion 511. In either case the camera can solve for the 3D, and even 5 or 6D, position and orientation of the animal object, and cause it to move accordingly on the screen as the child maneuvers it. The child can hold two animals, one in each hand, and they can each be detected, even with a single camera, and be programmed in software to interact as the child wishes (or as he learns the program).

This is clearly for very young children of two or three years of age. The toys have to be large so they can't be swallowed.

With the invention in this manner, one can make a toy of virtually anything, for example a block. Just hold this block up, teach the computer/camera system the object and play using ... orientation information to the TV camera based analysis software, and in making the object easier to see in reflective illumination.
Aid to Speech Recognition

The previous co-pending application entitled "Useful man machine interfaces and applications," referenced above, discussed the use of a person's movements or positions to aid in recognizing the voice spoken by the person.

In one instance, this can be achieved by simply using one's hand to indicate to the camera system of the computer that the voice recognition should start (or stop, or any other function, such as a paragraph or sentence end, etc.).

Another example is to use the camera system of the invention to determine the location of the person's head (or other part), from which one can instruct a computer to preferentially evaluate the sound field in phase and amplitude of two or more spaced microphones to listen from that location, thus aiding the pickup of speech, which oftentimes is not able to be heard well enough for computer based automatic speech recognition to occur.
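A sketch of the phase-and-amplitude idea for two spaced microphones: delay-and-sum steering toward the head location reported by the camera system, so that speech from that location adds coherently while off-axis sound does not. The sample rate, names, and the whole-sample rounding of the delay are simplifying assumptions for the sketch.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s at room temperature

    def steer_to_head(sig_a, sig_b, mic_a, mic_b, head_pos, rate_hz=16000):
        # Align the two microphone signals on sound arriving from the
        # camera-derived head position, then average them.
        mic_a, mic_b, head_pos = (np.asarray(v, dtype=float)
                                  for v in (mic_a, mic_b, head_pos))
        extra = np.linalg.norm(head_pos - mic_b) - np.linalg.norm(head_pos - mic_a)
        lag = int(round(extra / SPEED_OF_SOUND * rate_hz))  # samples sig_b lags
        if lag >= 0:
            aligned_b = np.concatenate([sig_b[lag:], np.zeros(lag)])
            return (sig_a + aligned_b) / 2.0
        aligned_a = np.concatenate([sig_a[-lag:], np.zeros(-lag)])
        return (aligned_a + sig_b) / 2.0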
Digital Interactive TV

As you watch TV, data can be taken from the camera system of the invention and transmitted back to the source of programming. This could include voting on a given proposition by raising your hand, for example, with your hand indication transmitted. Or you could hold up 3 fingers, and the count of fingers transmitted. Or, in a more extreme case, your position, or the position of an object or portion thereof, could be transmitted; for example, you could buy a coded object, whose code would be transmitted to indicate that you personally (having been pre-registered) had transmitted a certain packet of data.

If the programming source can transmit individually to you (not possible today, but forecast for the future), then much more is possible. The actual image and voice can respond, using the invention, to positions and orientations of persons or objects in the room, just as in the case of prerecorded data or one to one internet connections. This allows group activity as well.

In the extreme case, full video is transmitted in both directions and total interaction of users and programming sources and each other becomes possible.

An interim possibility using the invention is to have a program broadcast to many, which shifts to a prerecorded DVD disc or the like driving a local image, say when your hand input causes a signal to be activated.
Handwriting Authentication

A referenced co-pending application illustrated the use of the invention to track the position of a pencil in three dimensional space, such that the point at which the user intends the writing point to be can be identified and therefore used to input information, such as the intended script.

As herein disclosed, this part of the invention can also be used for the purpose of determining whether or not a given person's handwriting or signature is correct.

For example, consider authentication of an Internet commercial transaction. In this case, the user simply writes his name or address and the invention is used to look at the
GTP_00000896
