Pryor

US006750848B1

(10) Patent No.: US 6,750,848 B1
(45) Date of Patent: Jun. 15, 2004

(54) MORE USEFUL MAN MACHINE
     INTERFACES AND APPLICATIONS

(76) Inventor: Timothy R. Pryor, 516 Old Tecumseh
     St., Tecumseh, Ontario (CA), N8N

(*) Notice: Subject to any disclaimer, the term of this
     patent is extended or adjusted under 35
     U.S.C. 154(b) by 0 days.

(21) Appl. No.: 09/433,297
(22) Filed: Nov. 3, 1999
`
Related U.S. Application Data

(60) Provisional application No. 60/107,652, filed on Nov. 9, 1998.

(51) Int. Cl. ................................................ G09G 5/00
(52) U.S. Cl. ...................... 345/168; 345/719; 345/863
(58) Field of Search ............................... 345/156, 157,
     345/158, 168, 700, 719, 863; 340/407.2;
     341/22; 400/719
`
(56)                References Cited

          U.S. PATENT DOCUMENTS

3,718,116 A    2/1973  Thettu
3,831,553 A    8/1974  Thettu
3,846,826 A   11/1974  Mueller
4,309,957 A    1/1982  Swift
4,484,179 A   11/1984  Kasday
4,542,375 A    9/1985  Alles et al.
4,629,319 A   12/1986  Clarke et al.
4,668,537 A    5/1987  Matsuyama et al.
4,686,374 A    8/1987  Liptay-Wagner et al.
4,891,772 A    1/1990  Case et al.
4,908,670 A    3/1990  Ndebi
4,988,981 A    1/1991  Zimmerman et al.
5,045,843 A    9/1991  Hansen
5,168,531 A * 12/1992  Sigel ........................ 345/157
5,227,985 A    7/1993  DeMenthon
5,232,499 A    8/1993  Kato et al.
`
`
`
`
5,267,004 A   11/1993  Mills
5,297,061 A    3/1994  DeMenthon et al.
5,388,059 A    2/1995  DeMenthon
E. A : 3.9. Era r 382/168
5,528,263 A *  6/1996  Platzker et al. ............. 345/156
5,534,062 A    7/1996  Dawson et al.
SR A lso E. s al
5,616,078 A    4/1997  Oh
5,617,312 A    4/1997  Iura et al. ................. 345/157
5,709,423 A    1/1998  Romero
5,767,842 A    6/1998  Korth ....................... 345/168
5,808,672 A *  9/1998  Wakabayashi et al. .......... 348/552
5,864,334 A    1/1999  Sellers ..................... 345/156
5,936,610 A    8/1999  Endo ........................ 345/157
5,936,615 A    8/1999  Waters ...................... 345/156
6,043,805 A    3/2000  Hsieh ....................... 345/158
6,160,899 A   12/2000  Lee et al. .................. 345/863
6,191,773 B1 * 2/2001  Maruno et al. ............... 345/158
6,252,598 B1 * 6/2001  Segen ....................... 345/156
6,265,993 B1   7/2001  Johnson ..................... 345/156
6,346,929 B1 * 2/2002  Fukushima et al. ............ 345/156

* cited by examiner
Primary Examiner-Chanh Nguyen
Assistant Examiner-Paul A. Bell
(74) Attorney, Agent, or Firm-Stites & Harbison PLLC;
Douglas E. Jackson

(57)                ABSTRACT
The invention is a continuation of earlier applications aimed
at providing affordable methods and apparatus for inputting
position, attitude (orientation) or other object characteristic
data to computers for the purpose of controlling the display
thereof. Preferred embodiments utilize electro-optical sensors
such as TV cameras to input data from objects and fingers,
and/or other body parts of the user, to laptop, handheld and
other computers used for computer aided learning, gaming,
3D graphics, internet commerce and other applications.
`
`22 Claims, 7 Drawing Sheets
`
`IPR2022-00090 - LGE
`Ex. 1003 - Page 1
`
`
`
U.S. Patent          Jun. 15, 2004          Sheet 1 of 7          US 6,750,848 B1

[FIG. 1]
`
`
`
U.S. Patent          Jun. 15, 2004          Sheet 2 of 7          US 6,750,848 B1

[FIG. 2]
`
`
`
U.S. Patent          Jun. 15, 2004          Sheet 3 of 7          US 6,750,848 B1

[FIG. 3]
`
`
`
U.S. Patent          Jun. 15, 2004          Sheet 4 of 7          US 6,750,848 B1

[FIG. 4: reference numerals 400, 408, 411, 420]
`
`
`
U.S. Patent          Jun. 15, 2004          Sheet 5 of 7          US 6,750,848 B1

[FIG. 5: reference numerals 500, 530, 535, 540]
`
`
`
U.S. Patent          Jun. 15, 2004          Sheet 6 of 7          US 6,750,848 B1

[FIG. 6]
`
`
`
U.S. Patent          Jun. 15, 2004          Sheet 7 of 7          US 6,750,848 B1

[FIG. 7B: reference numerals 1030, 1050, 1056, 1059, 1060, 1065, 1070, 1075;
labels "INPUT WOMAN'S MEASURE", "INTERNET", "INTERNET REMOTE"]
`
`
`
MORE USEFUL MAN MACHINE
INTERFACES AND APPLICATIONS

This application claims benefit of Provisional Application
No. 60/107,652, filed Nov. 9, 1998.

CROSS REFERENCES TO RELATED
APPLICATIONS BY THE INVENTOR
`1. Man Machine Interfaces (Ser. No. 08/290,516)
`2. Touch TV and other Man Machine Interfaces (Ser. No.
`08/496,908)
`3. Systems for Occupant Position Sensing, Ser. No. 08/968,
`114
`4. Target holes and corners U.S. Ser. Nos. 08/203,603, and
`08/468,358
`5. Useful Man Machine interfaces and applications, Prov.
`Appl. filed Aug. 21, 1998
`6. Vision Target based assembly, U.S. Ser. Nos. 08/469,429,
`08/469,907, 08/470,325, 08/466,294
`7. Stereo camera Based input to Computer Systems (new
`provisional application)
`8. Picture Taking method and apparatus(new provisional
`application)
9. Methods and Apparatus for Man Machine Interfaces and
Related Activity (new provisional application)
10. Camera Based Man-Machine Interfaces (new provisional
patent application, filed July 1999)
The disclosures of the above referenced applications are
incorporated herein by reference.
Federally Sponsored R and D Statement
not applicable
`
`
`Microfiche Appendix
`
`not applicable
`BACKGROUND OF THE INVENTION
`1. Field of the Invention
The invention relates to simple input devices for computers,
particularly, but not necessarily, intended for use with 3-D
graphically intensive activities, and operating by optically
sensing object or human positions and/or orientations. The
invention, in many preferred embodiments, uses real time
stereo photogrammetry using single or multiple TV cameras
whose output is analyzed and used as input to a personal
computer, typically to gather data concerning the 3D location
of parts of, or objects held by, a person or persons.
This continuation application seeks to provide further detail
on useful embodiments for computing. One embodiment is a
keyboard for a laptop computer (or stand alone keyboard for
any computer) that incorporates digital TV cameras to look
at points on, typically, the hand or the finger, or objects held
in the hand of the user, which are used to input data to the
computer. It may also, or alternatively, look at the head of the
user as well.
Both hands or multiple fingers of each hand, or an object in
one hand and fingers of the other, can be simultaneously
observed, as can alternate arrangements as desired.
`2. Description of Related Art
`My referenced copending applications incorporated
`herein by reference discuss many prior art references in
`various pertinent fields, which form a background for this
`invention.
`
`
DESCRIPTION OF FIGURES

FIG. 1 illustrates a laptop or other computer keyboard with
cameras according to the invention located on the keyboard
surface to observe objects such as fingers and hands overhead
of the keyboard.

FIG. 2 illustrates another keyboard embodiment using
special datums or light sources such as LEDs.

FIG. 3 illustrates a further finger detection system for laptop
or other computer input.

FIG. 4 illustrates learning, amusement, monitoring, and
diagnostic methods and devices for the crib, playpen and the
like.

FIG. 5 illustrates a puzzle toy for young children having cut
out wood characters according to the invention.

FIG. 6 illustrates an improved handheld computer embodiment
of the invention, in which the camera or cameras may be used
to look at objects, screens and the like as well as look at
the user along the lines of FIG. 1.

FIG. 7 illustrates new methods for internet commerce and
other activities involving remote operation with 3D virtual
objects display.
`
FIG. 1

A laptop (or other) computer keyboard based embodiment is
shown in FIG. 1. In this case, a stereo pair of cameras 100
and 101 located on each side of the keyboard are used,
desirably having cover windows 103 and 104 mounted flush
with the keyboard surface 102. The cameras are preferably
pointed obliquely inward at angles φ toward the center of the
desired work volume 170 above the keyboard. In the case of
cameras mounted at the rear of the keyboard (toward the
display screen), these cameras are also inclined to point
toward the user at an angle as well.
Alternate camera locations may be used, such as the
positions of cameras 105 and 106 on upper corners of screen
housing 107, looking down at the top of the fingers (or
hands, or objects in hand or in front of the cameras), or of
cameras 108 and 109 shown.
One of the referenced embodiments of the invention is to
determine the pointing direction vector 160 of the user's
finger (for example pointing at an object displayed on screen
107), or the position and orientation of an object held by the
user. Alternatively, finger position data can be used to
determine gestures such as pinch or grip, and other examples
of relative juxtaposition of objects with respect to each other,
as has been described in co-pending referenced applications.
Positioning of an object or portions (such as hands or fingers
of a doll) is also of use, though more for use with larger
keyboards and displays.
In one embodiment, shown in FIG. 2, cameras such as
100/101 are used to simply look at the tip of a finger 201 (or
thumb) of the user, or an object such as a ring 208 on the
finger. Light from below, such as provided by single central
light 122, can be used to illuminate the finger, which
typically looks bright under such illumination.
It is also noted that the illumination is directed or
concentrated in an area where the finger is typically located,
such as in work volume 170. If the light is of sufficient
spectral content, the natural flesh tone of the finger can be
observed, and recognized by use of the color TV cameras
100/101.
As is typically the case, the region of the overlapping
cameras' viewing area is relatively isolated to the overlapping
volumetric zone of their fields 170 shown, due to the focal
lengths of their lenses and the angulation of the camera axes
with respect to each other. This restricted overlap zone helps
mitigate against unwanted matches in the two images due to
information generated outside the zone of overlap. Thus there
are no significant image matches found of other objects in
the room, since the only flesh toned object in the zone is
typically the finger or fingers of the user (or alternatively,
for example, the user's hand or hands). Similarly, objects or
targets thereon can be distinguished by special colors or
shapes.
If desired, or required, motion of the fingers can also be
used to further distinguish their presence vis-a-vis any static
background. If, for example, by subtraction of successive
camera frames the image of a particular object is determined
to have moved, it is determined that this is likely the object
of potential interest, which can be further analyzed directly
to determine if it is the object of interest.
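The frame-subtraction step described above can be sketched as follows (a minimal illustration, not taken from the patent: it assumes 8-bit grayscale frames as NumPy arrays, and the threshold value is an arbitrary stand-in):

```python
import numpy as np

def moving_mask(prev_frame: np.ndarray, curr_frame: np.ndarray,
                threshold: int = 25) -> np.ndarray:
    """Return a boolean mask of pixels that changed between frames.

    Pixels whose absolute intensity difference exceeds `threshold`
    are treated as belonging to a moving object (e.g. a finger),
    distinguishing it from the static background.
    """
    diff = np.abs(prev_frame.astype(np.int16) - curr_frame.astype(np.int16))
    return diff > threshold

# Example: a synthetic 4x4 frame pair where one pixel brightens.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[2, 1] = 200           # the "finger" moved into this pixel
mask = moving_mask(prev, curr)
print(mask.sum())          # 1 changed pixel
```

The region flagged by the mask would then be analyzed directly, as the text describes, to confirm it is the object of interest.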
In case of obscuration of the fingers or objects in the hand,
cameras in additional locations, such as those mentioned
above, can be used to solve for position if the view of one or
more cameras is obscured.
The use of cameras mounted on both the screen and the
keyboard allows one to deal with obscurations that may
occur, and certain objects may or may not be advantageously
delineated in one view or the other.
In addition, it may be, in many cases, desirable to have a
datum on the top of the finger as opposed to the bottom,
because on the bottom it can get in the way of certain
activities. In this case the sensors are required on the screen
looking downward, or in some other location such as off the
computer entirely and located overhead, as has been noted in
a previous application.
To determine finger location, a front end processor like that
described in the target holes and corners copending
application references incorporated (U.S. Ser. Nos. 08/203,603
and 08/468,358) can be used, to also allow the finger shape
as well as color to be detected.
Finger gestures comprising a sequence of finger movements
can also be detected by analyzing sequential image sets, such
that the motion of the finger, or of one finger with respect
to another, such as in pinching something, can be determined.
Cameras 100 and 101 have been shown at the rear of the
keyboard near the screen, or at the front. They may be
mounted in the middle of the keyboard or any other
advantageous location.
The cameras can also see one's fingers directly, to allow
typing as now, but without the physical keys. One can type in
space above the plane of the keyboard (or in this case the
plane of the cameras); this is useful for those applications
where the keyboard of conventional style is too big (e.g. the
hand held computer of FIG. 6).
FIG. 2

It is also desirable, for fast reliable operation, to use
retro-reflective materials and other materials to augment the
contrast of objects used in the application. For example, a
line target such as 200 can be worn on a finger 201, and
advantageously can be located, if desired, between two joints
of the finger as shown. This allows the tip of the finger to
be used to type on the keyboard without feeling unusual, as
might be the case with target material on the tip of the
finger.
The line image detected by the camera can be provided also
by a cylinder such as retroreflective cylinder 208 worn on
the finger 201, which effectively becomes a line image in the
field of view of each camera (assuming each camera is
equipped with a sufficiently coaxial light source, typically
one or more LEDs such as 210 and 211). The line image pairs
from the stereo cameras can then be used to easily solve for
the pointing direction of the finger, which is often a desired
result. The line, in the stereo pair of images, provides the
3D pointing direction of the finger, for example pointing at
an object displayed on the screen 140 of the laptop computer
138.
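The pointing-direction computation can be sketched as follows (a minimal illustration, not from the patent: it assumes the two ends of the finger's line target have already been triangulated to 3D coordinates, and returns the unit vector from base to tip):

```python
import numpy as np

def pointing_direction(p_base: np.ndarray, p_tip: np.ndarray) -> np.ndarray:
    """Unit vector from the base of the line target toward its tip.

    p_base, p_tip: 3D points (x, y, z) recovered by stereo
    triangulation of the two ends of the line image.
    """
    v = p_tip - p_base
    return v / np.linalg.norm(v)

# Example: finger base at the origin, tip 10 cm away along +z
# (i.e. pointing straight toward the screen).
d = pointing_direction(np.array([0.0, 0.0, 0.0]),
                       np.array([0.0, 0.0, 0.1]))
print(d)  # [0. 0. 1.]
```

Intersecting this ray with the screen plane would give the on-screen point the finger designates.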
`
FIG. 3

It is also possible to have light sources on the finger that
can be utilized, such as the two LED light sources shown in
FIG. 3. This can be used with either TV camera type sensors
or with PSD type analog image position sensors as disclosed
in references incorporated.
In particular, the ring mounted LED light sources 301 and
302 can be modulated at different frequencies that can be
individually discerned by sensors imaging the sources on to
a respective PSD detector. Alternatively, the sources can
simply be turned on and off at different times such that the
position of each point can be independently found, allowing
the pointing direction to be calculated from the LED point
data gathered by the stereo pair of PSD based sensors.
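The time-multiplexing alternative can be sketched as follows (a hypothetical illustration, not the patent's circuitry: each LED is strobed in its own time slot so that a single position reading per slot can be attributed unambiguously to one LED; the driver and sensor functions are stand-ins):

```python
def read_led_positions(strobe, read_sensor, led_ids):
    """Strobe each LED in turn and attribute the sensor reading to it.

    strobe(led_id, on): hypothetical driver turning one LED on or off.
    read_sensor(): hypothetical PSD read returning an (x, y) image position.
    Returns {led_id: (x, y)} for use in the pointing-direction solution.
    """
    positions = {}
    for led in led_ids:
        strobe(led, True)          # only this LED is lit in this slot
        positions[led] = read_sensor()
        strobe(led, False)
    return positions

# Example with simulated hardware standing in for the PSD and drivers.
state = {"lit": None}
fake_readings = {301: (0.12, 0.40), 302: (0.15, 0.35)}
def strobe(led, on): state["lit"] = led if on else None
def read_sensor(): return fake_readings[state["lit"]]
print(read_led_positions(strobe, read_sensor, [301, 302]))
```

With the two LED positions identified per sensor, the stereo pair yields two 3D points, from which the pointing direction follows.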
The "natural interface keyboard" here described can have
camera or other sensors located at the rear looking obliquely
outward toward the front, as well as inward, so as to have
their working volumes overlap in the middle of the keyboard,
such that nearly the full volume over the keyboard area is
accommodated.
Clearly, larger keyboards can have a larger working volume
than one might have on a laptop. The pair of sensors used can
be augmented with other sensors mounted on the screen
housing. It is noted that the linked dimension afforded for
calibration between the sensors located on the screen and
those on the keyboard is provided by the laptop's unitary
construction.
One can use angle sensing means such as a rotary encoder for
the laptop screen tilt. Alternatively, cameras located on the
screen can be used to image reference points on the keyboard
to achieve this. This allows the calibration of the sensors
mounted fixedly with respect to the screen against the sensors
and keyboard space below. It also allows one to use stereo
pairs of sensors that are not in the horizontal direction
(such as 100/101) but could for example be a camera sensor
such as 100 on the keyboard coupled with one on the screen,
such as 106.
Knowing the pointing angles of the two cameras with respect
to one another allows one to solve for the 3D location of
objects from the matching of the object image positions in
the respective camera fields.
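The stereo solution described above can be sketched for the simplest case (a minimal illustration under assumptions the patent does not specify: a rectified, horizontally aligned camera pair with focal length f and baseline B, where depth follows from the disparity between matched image positions):

```python
def triangulate(xl, xr, y, f, B):
    """Recover the 3D location of a matched point from a rectified stereo pair.

    xl, xr: horizontal image coordinates of the match in left/right cameras
    y:      (shared) vertical image coordinate
    f:      focal length in pixels; B: camera baseline in meters
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    Z = f * B / disparity          # depth from the cameras
    X = xl * Z / f                 # lateral position
    Y = y * Z / f                  # vertical position
    return X, Y, Z

# Example: f = 800 px, baseline 0.30 m (cameras at the keyboard edges),
# a fingertip matched at xl = 900, xr = 100 -> 800 px disparity,
# putting the fingertip 0.3 m from the cameras, inside the work volume.
print(triangulate(900, 100, 0, 800, 0.30))  # (0.3375, 0.0, 0.3)
```

Non-horizontal pairs (keyboard camera plus screen camera) need the full relative pose rather than a single baseline, but the principle is the same: matched image positions plus known camera geometry yield a 3D point.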
As noted previously, it is also of interest to locate a line
or cylinder type target on the finger between the first and
second joints. This allows one to use the fingertip for the
keyboard activity, but by raising the finger up, it can be
used as a line target capable of solving for the pointed
direction, for example.
Alternatively, one can use two point targets on the finger,
such as either retro-reflective datums, colored datums such
as rings, or LED light sources that can also be used with PSD
detectors, as has also been noted in FIG. 2.
When using the cameras located for the purpose of stereo
determination of the position of the fingers from their flesh
tone images, it is useful to first apply preprocessing to the
data obtained from the cameras in order to look for the
finger. This can be done on both a color basis and on the
basis of shape as well as motion.
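The color-based part of this preprocessing can be sketched as follows (a minimal illustration, not the patent's algorithm: it flags pixels whose RGB values fall in a rough flesh-tone range, and the threshold values are arbitrary stand-ins):

```python
import numpy as np

def flesh_tone_mask(rgb: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels plausibly belonging to a flesh-toned finger.

    rgb: H x W x 3 uint8 image. Uses a crude rule of thumb:
    red dominant, red noticeably above blue, and not too dark.
    """
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    return (r > 95) & (r > g) & (r - b > 15)

# Example: one flesh-toned pixel in an otherwise dark frame.
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (200, 120, 90)         # finger-like pixel
print(flesh_tone_mask(img).sum())  # 1
```

Candidate regions surviving the color test would then be checked for shape and motion, as the text describes.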
`
`
`
`
In this invention, I have shown the use of not only cameras
located on a screen looking downward or outward from the
screen, but also cameras that can be used instead of, or in
combination with, those on the screen, placed essentially on
the member on which the keyboard is incorporated. This
allows the keyboard mounted cameras, which are preferably
mounted flush with the keyboard surface to be unobtrusive,
to see the user's fingers, hands or objects held by the user,
and in some cases the face of the user.
This arrangement is also useful for 3D displays, for example
where special synchronized glasses (e.g. the "Crystal Eyes"
brand often used with Silicon Graphics workstations) are used
to alternately present right and left images to each eye. In
this case the object may appear to be actually in the
workspace 170 above the keyboard, and it may be manipulated
by virtually grasping (pushing, pulling, etc.) it, as has been
described in co-pending applications.
FIG. 4

Baby Learning and Monitoring System

A baby's reaction to the mother (or father) and the mother's
analysis of the baby's reaction is very important. There are
many gestures of babies apparently indicated in child
psychology as being quite indicative of various needs, wants,
or feelings and emotions, etc. These gestures are typically
made with the baby's hands.
Today this is done and learned entirely by the mother being
with the baby. However, with an electro-optical sensor based
computer system, such as that described in copending
applications, located proximate to or even in the crib (for
example), one can have the child's reactions recorded, not
just in the sense of a video tape, which would be too long
and involved for most to use, but also in terms of the actual
motions, which could be computer recorded and analyzed, also
with the help of the mother, as to what the baby's responses
were. And such motions, combined with other audio and visual
data, can be very important to the baby's health, safety, and
learning.
Consider for example crib 400 with computer 408 having LCD
monitor 410 and speaker 411 and camera system (single or
stereo) 420 as shown, able to amuse or inform baby 430, while
at the same time recording (both visually, aurally, and in
movement detected position data concerning parts of his body
or objects such as rattles in his hand) his responses for any
or all of the purposes of diagnosis of his state of being,
remote transmission of his state, cues to various programs or
images to display to him or broadcast to others, or the like.
For one example, the baby's motions could be used to signal a
response from the TV, either in the absence of the mother or
with the mother watching on a remote channel. This can even
be over the Internet if the mother is at work.
For example, a comforting message could come up on the TV
from the mother that could be prerecorded (or alternatively
could actually be live, with TV cameras in the mother's or
father's work place, for example on a computer used by the
parent) to tell the baby something reassuring or comfort the
baby or whatever. Indeed, the parent can be monitored using
the invention and indicate something back, or even control a
teleoperated robotic device to give a small child something
to eat or drink, for example. The same applies to a disabled
person.
If the father or mother came up on the screen, the baby could
wave at it, move its head or "talk" to it, but the hand
gestures may be the most important.
`
If the mother knows what the baby is after, she can talk to
the baby or say something, or show something that the baby
recognizes, such as a doll. After a while, looking at this
live, one can then move to talking to the baby from some
prerecorded data.
What other things might we suppose? The baby for example
knows to put its hand on the mother's cheek to cause the
mother to turn to it. The baby also learns some other
reflexes when it is very young that it forgets when it gets
older. Many of these reflexes are hand movements, and are
important in communicating with the remote TV based mother
representation, whether real via telepresence or from CD ROM
or DVD disk (or other media, including information transmitted
to the computer from afar), and for the learning of the
baby's actions.
Certainly, just from the making-the-baby-feel-good point of
view, it would seem like certain motherly (or fatherly, etc.)
responses to certain baby actions, in the form of words and
images, would be useful. This stops short of the physical
holding of the baby which is often needed, but could act as a
stopgap to allow the parents to get another hour's sleep, for
example.
As far as the baby touching things, I've discussed in other
applications methods for realistic touch combined with
images. This leads to a new form of touching crib mobiles
that could contain video images and/or be imaged themselves,
plus, if desired, be touched in ways that would be far beyond
any response that you could get from a normal mobile.
For example, let us say there is a targeted (or otherwise TV
observable) mobile 450 in the crib above the baby. Baby
reaches up and touches a piece of the mobile, which is sensed
by the TV camera system (either from the baby's hand
position, the mobile movement, or both), and a certain sound
is called up by the computer, a musical note for example.
Another piece of the mobile gives another musical note. The
mobile becomes a musical instrument for the baby that could
play either notes or chords or complete passages, or any
other desired programmed function.
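The mobile-as-instrument behavior can be sketched as a simple lookup (a hypothetical illustration: the piece names and note frequencies are stand-ins, and the touched piece would come from the camera system's detection):

```python
# Map each detected mobile piece to a note frequency in Hz (stand-in values).
NOTE_FOR_PIECE = {
    "star": 261.63,    # C4
    "moon": 329.63,    # E4
    "cloud": 392.00,   # G4
}

def note_for_touch(touched_piece: str) -> float:
    """Return the frequency the computer should play for a touched piece."""
    return NOTE_FOR_PIECE[touched_piece]

# Example: the camera system reports the baby touched the "moon" piece.
print(note_for_touch("moon"))  # 329.63
```

Replacing single notes with chords or with indices into longer passages gives the richer programmed functions the text mentions.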
The baby can also signal things. The baby signaling using
agitated movements would often mean that it is unhappy. This
could be interpreted, using learned movement signatures and
artificial intelligence as needed by the computer, to call
for the mother even if the baby wasn't crying. If the baby
cries, that can be picked up by microphone 440 and recognized
using a voice recognition system, along the lines of that
used in the IBM ViaVoice commercial product, for example. And
even the degree of crying can be analyzed to determine
appropriate action.
The computer could also be used to transmit information of
this sort via internet email to the mother, who could even be
at work. And until help arrives in the form of mother's
intervention or whatever, the computer could access a program
that could display on a screen for the baby things that the
baby likes, and could try to soothe the baby through either
images of familiar things, music or whatever. This could be
useful at night when parents need sleep, and anything that
would make the baby feel more comfortable would help the
parents.
It could also be used to allow the baby to input to the
device. For example, if the baby was hungry, a picture of the
bottle could be brought up on the screen. The baby then could
yell for the bottle. Or if the baby needed his diaper
changed, perhaps something reminiscent of that. If the baby
reacts to such suggestions of his problem, this gives a lot
more intelligence as to why he is crying, and while mothers
can generally tell right away, not everyone else can. In
other words, this is pretty neat for babysitters and other
members of the household, so they can act more intelligently
on the signals the baby is providing.
Besides the crib, the system as described can be used in
conjunction with a playpen, high chair or other place of baby
activity.
As the child gets older, the invention can further be used
also with more advanced activity with toys, and to take data
from toy positions as well: for example, blocks, dolls,
little cars, and even moving toys such as trikes, scooters,
drivable toy cars and bikes with training wheels.
The following figure illustrates the ability of the invention
to learn, and thus to assist in the creation of toys and
other things.
`
FIG. 5

Learning Puzzle Toy

Disclosed in FIG. 5 is a puzzle toy 500 where woodcut animals
such as bear 505 and lion 510 are pulled out with a handle
such as 511. The child can show the animal to the camera, and
a computer 530 with TV camera (or cameras) 535 can recognize
the shape as the animal and provide a suitable image and
sounds on screen 540.
Alternatively, and more simply, a target or targets on the
back of the animal can be used, such as triangle 550 on the
back of lion 510. In either case the camera can solve for the
3D, and even 5 or 6D, position and orientation of the animal
object, and cause it to move accordingly on the screen as the
child maneuvers it. The child can hold two animals, one in
each hand, and they can each be detected, even with a single
camera, and be programmed in software to interact as the
child wishes (or as he learns the program).
This is clearly for very young children of two or three years
of age. The toys have to be large so they can't be swallowed.
With the invention in this manner, one can make a toy of
virtually anything, for example a block. Just hold this block
up, teach the computer/camera system the object, and play
using any program you might want to represent it and its
actions. To make this block known to the system, the shape of
the block, the color of the block, or some code on the block
can be determined. Any of those items could tell the camera
which block it was, and most could give position and
orientation if known.
At that point, an image is called up from the computer
representing that particular animal or whatever else the
block is supposed to represent. Of course this can be changed
in the computer to be a variety of things, if this is
something that is acceptable to the child. It could certainly
be changed in size, such that a small lion could grow into a
large lion. The child could probably absorb that more than a
lion changing into a giraffe, for example, since the block
wouldn't correspond to that. The child can program or teach
the system any of his blocks to be the animal he wants, and
that might be fun.
For example, he or the child's parent could program a square
to be a giraffe whereas a triangle would be a lion. Maybe
this could be an interesting way to get the child to learn
his geometric shapes!
Now the basic block held up in front of the camera system
could be looked at just for what it is. As the child moves
the thing toward or away from the camera system, one may get
a rough sense of depth from the change in apparent size of
the object. However, this is not so easy, as the object also
changes in apparent shape due to any sort of rotations.
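The rough depth-from-size cue can be sketched as follows (a minimal illustration under a pinhole-camera assumption, not from the patent: with a block of known width, depth is inversely proportional to its apparent width in pixels):

```python
def depth_from_size(real_width_m: float, apparent_width_px: float,
                    focal_length_px: float) -> float:
    """Rough depth of a block of known size under a pinhole camera model.

    A block of real width W imaged at w pixels by a camera of focal
    length f (in pixels) lies at depth Z = f * W / w. Rotations break
    this cue, since they also change the apparent width.
    """
    return focal_length_px * real_width_m / apparent_width_px

# Example: a 5 cm block imaged 100 px wide with f = 800 px.
print(depth_from_size(0.05, 100.0, 800.0))  # 0.4 (meters)
```

As the text notes, this single-camera cue is fragile under rotation, which is why sensing orientation as well is of interest.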
`
Particularly interesting then is to also sense the rotations
of the object, so that the animal can actually move
realistically in 3 dimensions on the screen. And perhaps
having a de-tuning of the shape of the movement, so that the
child's relatively jerky movements would not appear jerky on
the screen, or would not look so accentuated. Conversely, of
course, you can go the other way and accentuate the motions.
A line target around the edge of the object, for example, is
often useful for providing position or orientation
information to the TV camera based analysis software, and for
making the object easier to see in reflective illumination.
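The "de-tuning" of jerky movements can be sketched as a simple exponential smoothing filter (a stand-in illustration, not the patent's method; the smoothing factor alpha is an arbitrary choice):

```python
def smooth_positions(samples, alpha=0.3):
    """Exponentially smooth a sequence of 1D position samples.

    Lower alpha damps jitter more (de-tunes the movement);
    alpha = 1.0 passes the raw, jerky motion through unchanged.
    """
    smoothed = []
    estimate = samples[0]
    for x in samples:
        estimate = alpha * x + (1 - alpha) * estimate
        smoothed.append(estimate)
    return smoothed

# Example: a jerky step from 0 to 10 is eased over several frames.
eased = smooth_positions([0.0, 10.0, 10.0, 10.0])
```

Accentuating motion, conversely, could scale displacements up rather than damping them.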
`
`
Aid to Speech Recognition

The previous copending application entitled "Useful man
machine interfaces and applications" referenced above
discussed the use of a person's movements or positions to aid
in recognizing the voice spoken by the person.
In one instance, this can be achieved by simply using one's
hand to indicate to the camera system of the computer that
the voice recognition should start (or stop, or any other
function, such as a paragraph or sentence end, etc.).
Another example is to use the camera system of the invention
to determine the location of the person's head (or other
part), from which one can instruct a computer to
preferentially evaluate the sound field, in phase and
amplitude, of two or more spaced microphones, to listen from
that location, thus aiding the pickup of speech, which
oftentimes is not able to be heard well enough for computer
based automatic speech recognition to occur.
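The phase-and-amplitude evaluation toward the head location can be sketched as delay-and-sum beamforming (a minimal two-microphone illustration under assumptions the patent leaves open: known microphone positions, a fixed speed of sound, and a head position already found by the cameras):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at room temperature

def steering_delays(mic_positions, head_position):
    """Per-microphone delays (seconds) that align sound from the head.

    Each microphone's signal is delayed so that a wavefront leaving
    the talker's head arrives "simultaneously" in all channels; the
    delayed channels can then be summed to emphasize that location.
    """
    distances = [math.dist(m, head_position) for m in mic_positions]
    farthest = max(distances)
    return [(farthest - d) / SPEED_OF_SOUND for d in distances]

# Example: two mics 0.4 m apart; the cameras locate the head at (0.2, 0.5),
# equidistant from both, so no relative delay is needed.
mics = [(0.0, 0.0), (0.4, 0.0)]
delays = steering_delays(mics, (0.2, 0.5))
print(delays)  # [0.0, 0.0]
```

An off-center head position would yield unequal delays, steering the summed pickup toward the talker.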
Digital Interactive TV

As you watch TV, data can be taken from the camera system of
the invention and transmitted back to the source of
programming. This could include voting on a given proposition
by raising your hand, for example, with your hand indication
transmitted. Or you could hold up 3 fingers, and the count of
fingers transmitted. Or in a more extreme case, your
position, or the position of an object or portion thereof,
could be transmitted. For example, you could buy a coded
object, whose code would be transmitted to indicate that you
personally (having been preregistered) had transmitted a
certain packet of data.
If the programming source can transmit individually to you
(not possible today, but forecast for the future), then much
more is possible. The actual image and voice can respond,
using the invention, to positions and orientations of persons
or objects in the room, just as in the case of prerecorded
data or one to one internet connections. This allows group
activity as well.
In the extreme case, full video is transmitted in both
directions, and total interaction of users and programming
sources and each other becomes possible.
An interim possibility using the invention is to have a
program broadcast to many, which shifts to a prerecorded DVD
disc or the like driving a local image, say when your hand
input causes a signal to be activated.
Handwriting Authentication

A referenced copending application illustrated the use of the
invention to track the position of a pencil in three
dimensional space, such that the point at which the user
intends the writing point to be can be identified, and
therefore used to input information, such as the intended
script.
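The pencil-tracking idea can be sketched as follows (a hypothetical illustration, not the referenced application's method: 3D tip positions from the camera system are projected onto the writing plane to recover the intended script trace):

```python
def script_trace(tip_positions_3d):
    """Project tracked 3D pencil-tip samples onto the writing plane.

    Assumes the writing surface is the z = 0 plane, so the intended
    script is the (x, y) path of the tip; the z value could instead
    gate pen-up/pen-down decisions.
    """
    return [(x, y) for (x, y, z) in tip_positions_3d]

# Example: three tracked tip samples tracing a short stroke.
samples = [(0.00, 0.00, 0.01), (0.01, 0.00, 0.01), (0.02, 0.01, 0.00)]
print(script_trace(samples))  # [(0.0, 0.0), (0.01, 0.0), (0.02, 0.01)]
```

The recovered (x, y) path is the raw material both for script input and, as the heading suggests, for authenticating the writer.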
`
`
As herein disclosed, this part of the invention can also be
used for the purpose of determining