`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`Exhibit A
`
`
`
`
US008553079B2

(12) United States Patent
Pryor

(10) Patent No.: US 8,553,079 B2
(45) Date of Patent: Oct. 8, 2013
`
`(54) MORE USEFUL MAN MACHINE
`INTERFACES AND APPLICATIONS
`
`(71) Applicant: Timothy R. Pryor, Sylvania, OH (US)
`
(72) Inventor: Timothy R. Pryor, Sylvania, OH (US)
`
( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.
`
(21) Appl. No.: 13/714,748

(22) Filed: Dec. 14, 2012
`
(65) Prior Publication Data

US 2013/0169535 A1    Jul. 4, 2013
`
Related U.S. Application Data
`
(63) Continuation of application No. 12/700,055, filed on Feb. 4, 2010, which is a continuation of application No. 10/866,191, filed on Jun. 14, 2004, now abandoned, which is a continuation of application No. 09/433,297, filed on Nov. 3, 1999, now Pat. No. 6,750,848.

(60) Provisional application No. 60/107,652, filed on Nov. 9, 1998.
`
(51) Int. Cl.
H04N 9/47    (2006.01)
H04N 7/18    (2006.01)
`
(52) U.S. Cl.
USPC ............................................. 348/77; 348/155
`(58) Field of Classification Search
`None
`
`See application file for complete search history.
`
(56) References Cited

U.S. PATENT DOCUMENTS
`
3,909,002 A     9/1975  Levy
4,219,847 A     8/1980  Pinkney et al.
4,339,798 A     7/1982  Hedges et al.
4,631,676 A    12/1986  Pugh
4,791,589 A    12/1988  Blazo et al.
4,843,568 A     6/1989  Krueger et al.
4,908,704 A     3/1990  Fujioka et al.
4,988,981 A     1/1991  Zimmerman et al.
5,008,946 A     4/1991  Ando
5,088,928 A     2/1992  Chan
5,227,986 A     7/1993  Yokota et al.
5,249,053 A     9/1993  Jain
5,297,061 A     3/1994  Dementhon et al.
5,365,597 A    11/1994  Holeva
5,376,796 A    12/1994  Chan et al.
5,388,059 A     2/1995  DeMenthon
5,454,043 A     9/1995  Freeman
5,459,793 A *  10/1995  Naoi et al. ..................... 382/165
5,491,507 A     2/1996  Umezawa et al.
5,528,263 A *   6/1996  Platzker et al. ................. 345/156
5,534,921 A     7/1996  Sawanobori
5,572,251 A *  11/1996  Ogawa ....................... 348/207.99
5,581,276 A    12/1996  Cipolla et al.
5,594,469 A     1/1997  Freeman et al.
5,616,078 A *   4/1997  Oh ................................. 463/8
5,624,117 A     4/1997  Ohkubo et al.
5,781,647 A     7/1998  Fishbine et al.
5,781,650 A     7/1998  Lobo et al.
5,808,672 A *   9/1998  Wakabayashi et al. .......... 348/220.1
5,828,770 A    10/1998  Leis et al.
5,845,006 A    12/1998  Sumi et al.
5,853,327 A    12/1998  Gilboa
5,864,334 A *   1/1999  Sellers ......................... 345/168

(Continued)
`
Primary Examiner - Peling Shaw
`
(74) Attorney, Agent, or Firm - Warner Norcross & Judd LLP
`
(57) ABSTRACT
`
`A method for determining a gesture illuminated by a light
`source utilizes the light source to provide illumination
`through a work volume above the light source. A camera is
`positioned to observe and determine the gesture performed in
`the work volume.
`
`30 Claims, 7 Drawing Sheets
`
`
`
`
`
`
(56) References Cited

U.S. PATENT DOCUMENTS

5,878,174 A     3/1999  Stewart et al.
[two entries illegible]
5,936,610 A *   8/1999  Endo ............................ 345/157
5,940,126 A     8/1999  Kimura
5,982,352 A    11/1999  Pryor
5,999,840 A    12/1999  Grimson et al.
6,052,132 A     4/2000  Christian et al.
6,098,458 A     8/2000  French et al.
6,108,033 A     8/2000  Ito et al.
6,148,100 A    11/2000  Anderson et al.
6,160,899 A    12/2000  Lee et al.
6,204,852 B1    3/2001  Kumar et al.
6,252,598 B1 *  6/2001  Segen .......................... 715/863
6,342,917 B1    1/2002  Amenta
6,346,929 B1 *  2/2002  Fukushima et al. ................. 345/8
6,359,647 B1    3/2002  Sengupta et al.
6,363,160 B1    3/2002  Bradski et al.
6,373,472 B1    4/2002  Palalau et al.
6,442,465 B2    8/2002  Breed et al.
[two entries illegible]
6,597,817 B1    7/2003  Silverbrook
6,663,491 B2   12/2003  Watabe et al.
6,750,848 B1    6/2004  Pryor
6,775,361 B1    8/2004  Arai et al.
6,788,336 B1    9/2004  Silverbrook
6,911,972 B2    6/2005  Brinjes
7,489,863 B2    2/2009  Lee

* cited by examiner
`
`
`
FIG. 1 (Sheet 1 of 7)
`
`
`
FIG. 2 (Sheet 2 of 7)
`
`
`
`
`
FIG. 3 (Sheet 3 of 7)
`
`
`
`
`
FIG. 4 (Sheet 4 of 7)
`
`
FIG. 5 (Sheet 5 of 7)
`
`
`
FIG. 6 (Sheet 6 of 7; block labels include DISPLAY and CPU)
`
`
FIGS. 7A-7B (Sheet 7 of 7; legible block labels include INPUT WOMAN'S MEASUREMENTS, INTERNET, and REMOTE)
`
`
`
`MORE USEFUL MAN MACHINE
`INTERFACES AND APPLICATIONS
`
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/700,055, filed Feb. 4, 2010, which is a continuation of U.S. patent application Ser. No. 10/866,191, filed Jun. 14, 2004, which is a continuation of U.S. patent application Ser. No. 09/433,297, filed Nov. 3, 1999 (now U.S. Pat. No. 6,750,848), which claims benefit of U.S. Provisional Application No. 60/107,652, filed Nov. 9, 1998. These applications are hereby incorporated by reference.
`
`
REFERENCES TO RELATED APPLICATIONS BY THE INVENTORS

U.S. patent application Ser. No. 09/138,339, filed Aug. 21, 1998.
U.S. Provisional Application No. 60/056,639, filed Aug. 22, 1997.
U.S. Provisional Application No. 60/059,561, filed Sep. 19, 1998.
Man Machine Interfaces: Ser. No. 08/290,516, filed Aug. 15, 1994, now U.S. Pat. No. 6,008,800.
Touch TV and Other Man Machine Interfaces: Ser. No. 08/496,908, filed Jun. 29, 1995, now U.S. Pat. No. 5,982,352.
Systems for Occupant Position Sensing: Ser. No. 08/968,114, filed Nov. 12, 1997, now abandoned, which claims benefit of Ser. No. 60/031,256, filed Nov. 12, 1996.
Target Holes and Corners: U.S. Ser. No. 08/203,603, filed Feb. 28, 1994, and Ser. No. 08/468,358, filed Jun. 6, 1995, now U.S. Pat. No. 5,956,417 and U.S. Pat. No. 6,044,183.
Vision Target Based Assembly: U.S. Ser. No. 08/469,429, filed Jun. 6, 1995, now abandoned; Ser. No. 08/469,907, filed Jun. 6, 1995, now U.S. Pat. No. 6,301,763; Ser. No. 08/470,325, filed Jun. 6, 1995, now abandoned; and Ser. No. 08/466,294, filed Jun. 6, 1995, now abandoned.
Picture Taking Method and Apparatus: Provisional Application No. 60/133,671, filed May 11, 1998.
Methods and Apparatus for Man Machine Interfaces and Related Activity: Provisional Application No. 60/133,673, filed May 11, 1998.
Camera Based Man-Machine Interfaces: Provisional Patent Application No. 60/142,777, filed Jul. 8, 1999.
The disclosures of the above referenced applications are incorporated herein by reference.
`
`BACKGROUND OF THE INVENTION
`
`1. Field of the Invention
`
The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing object or human positions and/or orientations. The invention, in many preferred embodiments, uses real-time stereo photogrammetry using single or multiple TV cameras whose output is analyzed and used as input to a personal computer, typically to gather data concerning the 3D location of parts of, or objects held by, a person or persons.
This continuation application seeks to provide further detail on useful embodiments for computing. One embodiment is a keyboard for a laptop computer (or a stand-alone keyboard for any computer) that incorporates digital TV cameras to look at points on, typically, the hand or the finger, or objects held in the hand of the user, which are used to input data to the computer. It may also, or alternatively, look at the head of the user.

Both hands, or multiple fingers of each hand, or an object in one hand and fingers of the other, can be simultaneously observed, as can alternate arrangements as desired.
2. Description of Related Art

My referenced co-pending applications, incorporated herein by reference, discuss many prior art references in various pertinent fields, which form a background for this invention.
`
`BRIEF DESCRIPTION OF FIGURES
`
FIG. 1 illustrates a laptop or other computer keyboard with cameras according to the invention located on the keyboard surface to observe objects such as fingers and hands overhead of the keyboard.

FIG. 2 illustrates another keyboard embodiment using special datums or light sources such as LEDs.

FIG. 3 illustrates a further finger detection system for laptop or other computer input.

FIG. 4 illustrates learning, amusement, monitoring, and diagnostic methods and devices for the crib, playpen and the like.

FIG. 5 illustrates a puzzle toy for young children having cut-out wood characters according to the invention.

FIG. 6 illustrates an improved handheld computer embodiment of the invention, in which the camera or cameras may be used to look at objects, screens and the like as well as look at the user along the lines of FIG. 1.

FIGS. 7A-B illustrate new methods for internet commerce and other activities involving remote operation with 3D virtual objects display.
`
`DESCRIPTION OF THE INVENTION
`
`FIG. 1
`
A laptop (or other) computer keyboard based embodiment is shown in FIG. 1. In this case, a stereo pair of cameras 100 and 101, located on each side of the keyboard, is used, desirably having cover windows 103 and 104 mounted flush with the keyboard surface 102. The cameras are preferably pointed obliquely inward at angles Φ toward the center of the desired work volume 170 above the keyboard. In the case of cameras mounted at the rear of the keyboard (toward the display screen), the cameras are also inclined to point toward the user at an angle.
Alternate camera locations may be used, such as the positions of cameras 105 and 106 on the upper corners of screen housing 107, looking down at the top of the fingers (or hands, or objects in hand or in front of the cameras), or the positions of cameras 108 and 109 shown.
One of the referenced embodiments of the invention is to determine the pointing direction vector 160 of the user's finger (for example pointing at an object displayed on screen 107), or the position and orientation of an object held by the user. Alternatively, finger position data can be used to determine gestures such as pinch or grip, and other examples of relative juxtaposition of objects with respect to each other, as has been described in co-pending referenced applications. Positioning of an object or its portions (such as the hands or fingers of a doll) is also of use, though more so with larger keyboards and displays.
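By way of illustration only, a pinch of the kind just mentioned can be classified from nothing more than the triangulated 3D positions of two fingertips. The sketch below is not from the patent; the function name, millimeter units, and 25 mm threshold are illustrative assumptions.

```python
import numpy as np

def is_pinch(thumb_tip: np.ndarray, index_tip: np.ndarray,
             threshold_mm: float = 25.0) -> bool:
    """Classify a pinch gesture: two fingertip points (x, y, z) in the
    work volume, e.g. triangulated from cameras 100/101, are treated as
    pinching when their separation falls below a small threshold."""
    return float(np.linalg.norm(thumb_tip - index_tip)) < threshold_mm
```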
In one embodiment, shown in FIG. 2, cameras such as 100/101 are used to simply look at the tip of a finger 201 (or thumb) of the user, or an object such as a ring 208 on the finger. Light from below, such as that provided by a single central light 122, can be used to illuminate the finger, which typically looks bright under such illumination.
It is also noted that the illumination is directed or concentrated in the area where the finger is typically located, such as in work volume 170. If the light is of sufficient spectral content, the natural flesh tone of the finger can be observed and recognized by use of the color TV cameras 100/101.
As is typically the case, the region viewed by both cameras is relatively isolated to the overlapping volumetric zone of their fields 170 shown, due to the focal lengths of their lenses and the angulation of the camera axes with respect to each other. This restricted overlap zone helps mitigate unwanted matches in the two images caused by information generated outside the zone of overlap. Thus there are no significant image matches found of other objects in the room, since the only flesh-toned object in the zone is typically the finger or fingers of the user, or, alternatively, the user's hand or hands. Similarly, objects or targets thereon can be distinguished by special colors or shapes.
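A flesh-tone candidate mask of this kind can be computed with ordinary color-space thresholding. The sketch below uses OpenCV; the HSV bounds are assumed values for illustration, not values from the patent, and any skin-color model could be substituted.

```python
import cv2
import numpy as np

def flesh_tone_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a binary mask of candidate flesh-toned pixels in one camera
    frame. Restricting the search to the overlap zone 170 would amount to
    cropping the frame to that region first."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)    # illustrative bounds
    upper = np.array([25, 180, 255], dtype=np.uint8)
    return cv2.inRange(hsv, lower, upper)
```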
If desired or required, motion of the fingers can also be used to further distinguish their presence against any static background. If, for example, subtraction of successive camera frames shows that the image of a particular object has moved, that object is likely the object of potential interest, and it can be further analyzed directly to determine if it is the object of interest.
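Frame subtraction of this kind reduces to a per-pixel difference and threshold. A minimal sketch with OpenCV follows; the 20-gray-level threshold is an assumed value.

```python
import cv2
import numpy as np

def moving_regions(prev_gray: np.ndarray, curr_gray: np.ndarray,
                   thresh: int = 20) -> np.ndarray:
    """Subtract successive camera frames; pixels whose gray level changed
    by more than `thresh` are flagged as belonging to a moving object,
    separating the finger from any static background."""
    diff = cv2.absdiff(curr_gray, prev_gray)
    _, motion = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return motion
```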
In case of obscuration of the fingers or objects in the hand, cameras in additional locations, such as those mentioned above, can be used to solve for position if the view of one or more cameras is obscured.
The use of cameras mounted on both the screen and the keyboard allows one to deal with obscurations that may occur, and certain objects may or may not be advantageously delineated in one view or the other.
`
In addition, it may in many cases be desirable to have a datum on the top of the finger as opposed to the bottom, because on the bottom it can get in the way of certain activities. In this case the sensors are required to be on the screen looking downward, or in some other location such as off the computer entirely and located overhead, as has been noted in a previous application.

To determine finger location, a front end processor like that described in the Target Holes and Corners co-pending applications incorporated by reference (U.S. Ser. Nos. 08/203,603 and 08/468,358) can be used to also allow the finger shape, as well as color, to be detected.
`
Finger gestures comprising a sequence of finger movements can also be detected by analyzing sequential image sets; for example, the motion of the finger, or of one finger with respect to another, as in pinching something, can be determined, as sketched below.
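Building on the illustrative pinch test above, a gesture-as-sequence detector can scan the per-frame fingertip data for the moment a pinch closes. This is a sketch under the same assumptions (triangulated fingertip pairs per frame; the threshold is illustrative).

```python
import numpy as np

def detect_pinch_events(tip_pairs, threshold_mm: float = 25.0):
    """Scan a time sequence of (thumb_tip, index_tip) 3D point pairs and
    report the frame indices at which a pinch closes, i.e. the fingertip
    separation first drops below the threshold."""
    events = []
    was_pinched = False
    for i, (thumb, index) in enumerate(tip_pairs):
        sep = float(np.linalg.norm(np.asarray(thumb) - np.asarray(index)))
        pinched = sep < threshold_mm
        if pinched and not was_pinched:
            events.append(i)            # pinch closed on this frame
        was_pinched = pinched
    return events
```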
Cameras 100 and 101 have been shown at the rear of the keyboard near the screen, or at the front. They may be mounted in the middle of the keyboard or in any other advantageous location.
`
The cameras can also see one's fingers directly, to allow typing as now, but without the physical keys. One can type in space above the plane of the keyboard (or, in this case, the plane of the cameras). This is useful for those applications where a keyboard of conventional style is too big (e.g., the handheld computer of FIG. 6).
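One conceivable realization of such keyless typing, offered purely as an illustrative sketch: map each triangulated fingertip position to a cell of a virtual key grid whenever the finger dips below a virtual typing plane. The grid origin, key pitch, and plane height are assumed placeholders, not values from the patent.

```python
import numpy as np

KEY_PITCH_MM = 19.0   # assumed key spacing of the virtual grid

def virtual_key(tip_xyz, grid_origin, plane_height_mm: float = 10.0):
    """Map a fingertip position (keyboard coordinates, mm) to a virtual
    key cell. Returns (column, row), or None while the finger is still
    above the virtual typing plane (no keypress)."""
    x, y, z = np.asarray(tip_xyz, dtype=float) - np.asarray(grid_origin,
                                                            dtype=float)
    if z > plane_height_mm:
        return None
    return int(x // KEY_PITCH_MM), int(y // KEY_PITCH_MM)
```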
`FIG. 2
`
It is also desirable, for fast and reliable operation, to use retroreflective materials and other materials to augment the contrast of objects used in the application. For example, a line target such as 200 can be worn on a finger 201, and can advantageously be located, if desired, between two joints of the finger as shown. This allows the tip of the finger to be used to type on the keyboard without feeling unusual, as might be the case with target material on the tip of the finger.
The line image detected by the camera can also be provided by a cylinder, such as retroreflective cylinder 208 worn on the finger 201, which effectively becomes a line image in the field of view of each camera (assuming each camera is equipped with a sufficiently coaxial light source, typically one or more LEDs such as 210 and 211), and the line image pairs from the stereo cameras can easily be solved for the pointing direction of the finger, which is often the desired result. The line, in the stereo pair of images, provides the pointing direction of the finger, for example pointing at an object displayed on the screen 140 of the laptop computer 138.
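One way to make the stereo line solution concrete: with calibrated cameras, each observed line image back-projects to a plane through that camera's optical center, and the finger axis is the intersection of the two planes. The sketch below assumes the plane normals have already been computed from the image lines and the calibration, and are expressed in a common world frame.

```python
import numpy as np

def pointing_direction(n_left: np.ndarray, n_right: np.ndarray) -> np.ndarray:
    """Direction of the cylinder target 208 from a stereo pair of line
    images. Each camera's line image, together with that camera's optical
    center, defines a plane containing the finger axis; n_left and n_right
    are unit normals of those planes. The axis lies in both planes, so its
    direction is the cross product of the two normals."""
    d = np.cross(n_left, n_right)
    return d / np.linalg.norm(d)
```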
`FIG. 3
`
It is also possible to have light sources on the finger that can be utilized, such as the two LED light sources shown in FIG. 3. These can be used either with TV camera type sensors or with PSD type analog image position sensors, as disclosed in the incorporated references.
In particular, the ring mounted LED light sources 301 and 302 can be modulated at different frequencies that can be individually discerned by sensors imaging the sources onto a respective PSD detector. Alternatively, the sources can simply be turned on and off at different times, such that the position of each point can be independently found, allowing the pointing direction to be calculated from the LED point data gathered by the stereo pair of PSD based sensors.
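The time-multiplexed variant is straightforward to sketch: light one LED at a time and read the spot position, so that a single PSD attributes each reading unambiguously. The driver functions `set_led` and `read_psd` below are hypothetical stand-ins for whatever hardware interface is actually used.

```python
def led_positions_time_multiplexed(set_led, read_psd):
    """Locate ring LEDs 301 and 302 with one PSD by lighting them in turn.

    set_led(index, on) and read_psd() are hypothetical hardware drivers;
    read_psd() returns the (x, y) centroid of the imaged spot. With only
    one source lit, each reading is unambiguously attributable."""
    positions = []
    for active in (0, 1):
        set_led(0, active == 0)
        set_led(1, active == 1)
        positions.append(read_psd())
    # Repeating this with the second PSD of the stereo pair yields matched
    # point pairs, from which the two 3D points, and hence the pointing
    # direction, can be solved.
    return positions
```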
The "natural interface keyboard" described here can have cameras or other sensors located at the rear, looking obliquely outward toward the front as well as inward, so as to have their working volumes overlap in the middle of the keyboard, such that nearly the full volume over the keyboard area is accommodated.
`
Clearly, larger keyboards can have a larger working volume than one might have on a laptop. The pair of sensors used can be augmented with other sensors mounted on the screen housing. It is noted that the linked dimension afforded for calibration between the sensors located on the screen and those on the keyboard is provided by the laptop's unitary construction.
One can use angle sensing means, such as a rotary encoder, for the laptop screen tilt. Alternatively, cameras located on the screen can be used to image reference points on the keyboard to achieve this. This allows calibration of the sensors mounted fixedly with respect to the screen against the sensors and keyboard space below. It also allows one to use stereo pairs of sensors that are not in the horizontal direction (such as 101/102), but could, for example, be a camera sensor such as 100 on the keyboard coupled with one on the screen, such as 106.
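As a sketch of the encoder-based calibration, the pose of a screen-mounted camera in keyboard coordinates follows from the hinge tilt angle by a rotation about the hinge line. The axis convention and the two offset vectors below are assumptions for illustration, not geometry specified in the patent.

```python
import numpy as np

def screen_camera_pose(tilt_rad: float, hinge_offset: np.ndarray,
                       cam_offset: np.ndarray):
    """Pose of a screen-mounted camera (e.g., 106) in keyboard coordinates,
    given the hinge tilt angle from a rotary encoder. The screen is assumed
    to rotate about the keyboard x-axis through the hinge; hinge_offset is
    the hinge position in the keyboard frame and cam_offset the camera
    position in the screen frame."""
    c, s = np.cos(tilt_rad), np.sin(tilt_rad)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0,   c,  -s],
                  [0.0,   s,   c]])      # rotation about the hinge (x) axis
    t = np.asarray(hinge_offset, float) + R @ np.asarray(cam_offset, float)
    return R, t                          # camera orientation and origin
```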
Knowing the pointing angles of the two cameras with respect to one another allows one to solve for the 3D location of objects from the matching of the object image positions in the respective camera fields.
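In practice this solve is a standard stereo triangulation. A minimal sketch using OpenCV, assuming the two cameras (e.g., 100 and 101, or a keyboard/screen pair such as 100 and 106) have been calibrated to give 3x4 projection matrices:

```python
import cv2
import numpy as np

def triangulate(P1: np.ndarray, P2: np.ndarray,
                uv1, uv2) -> np.ndarray:
    """Solve the 3D location of one matched feature (fingertip, target)
    from its pixel coordinates uv1/uv2 in the two calibrated views whose
    projection matrices are P1 and P2."""
    pts4 = cv2.triangulatePoints(
        P1, P2,
        np.asarray(uv1, dtype=np.float64).reshape(2, 1),
        np.asarray(uv2, dtype=np.float64).reshape(2, 1))
    return (pts4[:3] / pts4[3]).ravel()   # homogeneous -> (x, y, z)
```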
As noted previously, it is also of interest to locate a line or cylinder type target on the finger between the first and second joints. This allows one to use the fingertip for keyboard activity, while by raising the finger up, the target can be used as a line target capable of solving for the pointing direction, for example.
Alternatively, one can use two point targets on the finger, such as retroreflective datums, colored datums such as rings, or LED light sources that can also be used with PSD detectors, as noted with respect to FIG. 2.
`
When using cameras located for the purpose of stereo determination of the position of the fingers from their flesh tone images, it is useful to apply preprocessing capable of processing the data obtained from the cameras in order to look for the finger. This can be done both on a color basis and on the basis of shape, as well as motion.
In this invention, I have shown the use of not only cameras located on a screen, looking downward or outward from the screen, but also cameras that can be used instead of, or in combination with, those on the screen, placed essentially on the member on which the keyboard is incorporated. This allows the keyboard-mounted cameras, preferably mounted flush with the keyboard surface so as to be unobtrusive, to visually see the user's fingers, hands, or objects held by the user, and in some cases the face of the user.
`
This arrangement is also useful for 3D displays, for example where special synchronized glasses (e.g., the "Crystal Eyes" brand often used with Silicon Graphics workstations) are used to alternately present right and left images to each eye. In this case the object may appear to be actually in the workspace 170 above the keyboard, and it may be manipulated by virtually grasping (pushing, pulling, etc.) it, as has been described in co-pending applications.
`FIG. 4: Baby Learning and Monitoring System
A baby's reaction to the mother (or father), and the mother's analysis of the baby's reaction, is very important. There are many gestures of babies indicated in child psychology as being quite indicative of various needs, wants, feelings, emotions, etc. These gestures are typically made with the baby's hands.
Today this is done and learned entirely by the mother being with the baby. However, with an electro-optical sensor based computer system, such as that described in co-pending applications, located proximate to or even in the crib (for example), one can have the child's reactions recorded, not just in the sense of a video tape, which would be too long and involved for most to use, but also in terms of the actual motions, which could be computer recorded and analyzed, also with the help of the mother, as to what the baby's responses were. Such motions, combined with other audio and visual data, can be very important to the baby's health, safety, and learning.
Consider, for example, crib 400 with computer 408 having LCD monitor 410, speaker 411, and camera system (single or stereo) 420 as shown, able to amuse or inform baby 430, while at the same time recording (visually, aurally, and in movement-detected position data concerning parts of his body or objects such as rattles in his hand) his responses, for any or all of the purposes of diagnosis of his state of being, remote transmission of his state, cues to various programs or images to display to him or broadcast to others, or the like.
For one example, the baby's motions could be used to signal a response from the TV, either in the absence of the mother or with the mother watching on a remote channel. This can even be over the Internet if the mother is at work.
`
For example, a comforting message could come up on the TV from the mother; it could be prerecorded (or alternatively could actually be live, with TV cameras in the mother's or father's workplace, for example on a computer used by the parent) to tell the baby something reassuring, or to comfort the baby. Indeed, the parent can be monitored using the invention and indicate something back, or even control a teleoperated robotic device to give a small child something to eat or drink, for example. The same applies to a disabled person.
If the father or mother came up on the screen, the baby could wave at it, move its head, or "talk" to it, but the hand gestures may be the most important.
If the mother knows what the baby is after, she can talk to the baby, say something, or show something that the baby recognizes, such as a doll. After a while of looking at this live, one can then move to talking to the baby from some prerecorded data.
`
What other things might we suppose? The baby, for example, knows to put its hand on the mother's cheek to cause the mother to turn to it. The baby also learns some other reflexes when it is very young that it forgets when it gets older. Many of these reflexes are hand movements, and they are important in communicating with the remote TV-based representation of the mother, whether real via telepresence or from CD-ROM or DVD disc (or other media, including information transmitted to the computer from afar), and for the learning of the baby's actions.
Certainly, just from the point of view of making the baby feel good, it would seem that certain motherly (or fatherly, etc.) responses to certain baby actions, in the form of words and images, would be useful. This stops short of the physical holding of the baby which is often needed, but could act as a stopgap to allow the parents to get another hour's sleep, for example.
As far as the baby touching things, I have discussed in other applications methods for realistic touch combined with images. This leads to a new form of touching crib mobiles that could contain video images and/or be imaged themselves, plus, if desired, be touched in ways that would go far beyond any response that you could get from a normal mobile.
For example, let us say there is a targeted (or otherwise TV-observable) mobile 450 in the crib above the baby. The baby reaches up and touches a piece of the mobile, which is sensed by the TV camera system (either from the baby's hand position, the mobile movement, or both), and a certain sound is called up by the computer, a musical note for example. Another piece of the mobile calls up another musical note. The mobile becomes a musical instrument for the baby that could play either notes or chords or complete passages, or any other desired programmed function.
The baby can also signal things. Agitated movements would often mean that the baby is unhappy. This could be interpreted, using learned movement signatures and artificial intelligence as needed by the computer, to call for the mother even if the baby wasn't crying. If the baby cries, that can be picked up by microphone 440 and recognized using a voice recognition system along the lines of that used in the IBM ViaVoice commercial product, for example. Even the degree of crying can be analyzed to determine appropriate action.
`
The computer could also be used to transmit information of this sort via Internet email to the mother, who could even be at work. And until help arrives in the form of the mother's intervention or whatever, the computer could access a program that could display on a screen things that the baby likes, and could try to soothe the baby through images of familiar things, music, or whatever. This could be useful at night when parents need sleep, and anything that would make the baby feel more comfortable would help the parents.
The system could also be used to allow the baby to provide input to the device. For example, if the baby was hungry, a picture of a bottle could be brought up on the screen. The baby then could yell for the bottle. Or if the baby needed his diaper changed, perhaps something reminiscent of that. If the baby reacts to such suggestions of his problem, this gives a lot more intelligence as to why he is crying, and while mothers can generally tell right away, not everyone else can. In other words, this is pretty neat for babysitters and other members of the household, so they can act more intelligently on the signals the baby is providing.
`
`
`
Besides the crib, the system as described can be used in conjunction with a playpen, high chair, or other place of baby activity.
As the child gets older, the invention can further be used with more advanced activities with toys, and to take data from toy positions as well: for example, blocks, dolls, little cars, and even moving toys such as trikes, scooters, drivable toy cars, and bikes with training wheels.

The following figure illustrates the ability of the invention to learn, and thus to assist in the creation of toys and other things.
FIG. 5: Learning Puzzle Toy
Disclosed in FIG. 5 is a puzzle toy 500 in which woodcut animals such as bear 505 and lion 510 are pulled out with a handle such as 511. The child can show the animal to the camera, and a computer 530 with TV camera (or cameras) 535 can recognize the shape as the animal and provide a suitable image and sounds on screen 540.
Alternatively, and more simply, a target or targets on the back of the animal can be used, such as triangle 550 on the back of lion 510. In either case the camera can solve for the 3D, and even 5D or 6D, position and orientation of the animal object, and cause it to move accordingly on the screen as the child maneuvers it. The child can hold two animals, one in each hand, and they can each be detected, even with a single camera, and be programmed in software to interact as the child wishes (or as he learns the program).
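As one conceivable realization of the target-based variant, the three imaged corners of triangle 550 suffice for a perspective-n-point pose solve. The sketch below uses OpenCV's SQPnP solver, which accepts three or more points (a fourth target point would make the pose less ambiguous); the triangle dimensions and camera intrinsics are assumed placeholders, not values from the patent.

```python
import cv2
import numpy as np

# Corners of triangle target 550 in the toy's own frame (mm); sizes assumed.
TRIANGLE_MODEL = np.array([[0.0, 0.0, 0.0],
                           [60.0, 0.0, 0.0],
                           [30.0, 52.0, 0.0]], dtype=np.float64)

def toy_pose(corners_px, K: np.ndarray, dist: np.ndarray):
    """Estimate the 6D pose (rotation rvec, translation tvec) of the toy
    relative to camera 535, from the detected pixel corners ordered to
    match TRIANGLE_MODEL."""
    ok, rvec, tvec = cv2.solvePnP(
        TRIANGLE_MODEL, np.asarray(corners_px, dtype=np.float64),
        K, dist, flags=cv2.SOLVEPNP_SQPNP)
    return (rvec, tvec) if ok else None
```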
This is clearly for very young children of two or three years of age. The toys have to be large so they can't be swallowed.
With the invention used in this manner, one can make a toy of virtually anything, for example a block. Just hold the block up, teach the computer/camera system the object, and play using any program you might want to represent it and its actions. To make the block known to the system, the shape of the block, the color of the block, or some code on the block can be determined. Any of those items could tell the camera which block it was, and most could also give position and orientation if known.
`
At that point, an image is called up from the computer representing that particular animal, or whatever else the block is supposed to represent. Of course this can be changed in the computer to be a variety of things, if this is acceptable to the child. It could certainly be changed in size, such that a small lion could grow into a large lion. The child could probably absorb that more readily than a lion changing into a giraffe, for example, since the block wouldn't correspond to that. The child can program or teach the system any of his blocks to be the animal he wants, and that might be fun.

For example, he or the child's parent could program a square to be a giraffe, whereas a triangle would be a lion. Maybe this could be an interesting way to get the child to learn his geometric shapes!
Now, the basic block held up in front of the camera system could be looked at just for what it is. As the child moves the block toward or away from the camera system, one may get a rough sense of depth from the change in the imaged shape of the object. However, this is not so easy, as the object also changes in shape due to any sort of rotation.
Particularly interesting, then, is to also sense the rotations of the object, so that the animal can actually move realistically in three dimensions on the screen. Perhaps also desirable is a de-tuning of the movement, so that the child's relatively jerky movements would not appear jerky on the screen, or would not look so accentuated. Conversely, of course, one can go the other way and accentuate the motions.
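Such de-tuning can be as simple as an exponential smoothing filter applied to the tracked pose; the blend factor below is an assumed illustrative value, not one given in the patent.

```python
import numpy as np

def smooth_pose(prev_smoothed, new_pose, alpha: float = 0.2):
    """Blend each newly tracked pose toward the previous smoothed pose so
    jerky hand motion maps to fluid on-screen motion; smaller alpha damps
    more heavily, while alpha > 1 would exaggerate the motion instead."""
    prev = np.asarray(prev_smoothed, dtype=float)
    new = np.asarray(new_pose, dtype=float)
    return prev + alpha * (new - prev)
```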
A line target around the edge of the object can, for example, be used for this, and is often useful for providing position or orientation information to the TV camera based analysis software, and for making the object easier to see in reflective illumination.
`
`Aid to Speech Recognition
The previous co-pending application entitled "Useful man machine interfaces and applications," referenced above, discussed the use of a person's movements or positions to aid in recognizing the voice spoken by the person.
In one instance, this can be achieved by simply using one's hand to indicate to the camera system of the computer that the voice recognition should start (or stop, or perform any other function, such as marking a paragraph or sentence end, etc.).
Another example is to use the camera system of the invention to determine the location of the person's head (or other part), from which one can