PTO/AIA/01 (06-12)
Approved for use through 01/31/2014. OMB 0651-0032
U.S. Patent and Trademark Office; U.S. DEPARTMENT OF COMMERCE
Under the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number.

DECLARATION (37 CFR 1.63) FOR UTILITY OR DESIGN APPLICATION USING AN APPLICATION DATA SHEET (37 CFR 1.76)

Title of Invention: MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS

As the below named inventor, I hereby declare that:

This declaration is directed to:
[ ] The attached application, or
[ ] United States application or PCT international application number __________ filed on __________.

The above-identified application was made or authorized to be made by me.

I believe that I am the original inventor or an original joint inventor of a claimed invention in the application.

I hereby acknowledge that any willful false statement made in this declaration is punishable under 18 U.S.C. 1001 by fine or imprisonment of not more than five (5) years, or both.

WARNING:

Petitioner/applicant is cautioned to avoid submitting personal information in documents filed in a patent application that may contribute to identity theft. Personal information such as social security numbers, bank account numbers, or credit card numbers (other than a check or credit card authorization form PTO-2038 submitted for payment purposes) is never required by the USPTO to support a petition or an application. If this type of personal information is included in documents submitted to the USPTO, petitioners/applicants should consider redacting such personal information from the documents before submitting them to the USPTO. Petitioner/applicant is advised that the record of a patent application is available to the public after publication of the application (unless a non-publication request in compliance with 37 CFR 1.213(a) is made in the application) or issuance of a patent. Furthermore, the record from an abandoned application may also be available to the public if the application is referenced in a published application or an issued patent (see 37 CFR 1.14). Checks and credit card authorization forms PTO-2038 submitted for payment purposes are not retained in the application file and therefore are not publicly available.

LEGAL NAME OF INVENTOR

Inventor: Timothy
Signature:
Date (Optional):

Note: An application data sheet (PTO/SB/14 or equivalent), including naming the entire inventive entity, must accompany this form or must have been previously filed. Use an additional PTO/AIA/01 form for each additional inventor.

This collection of information is required by 35 U.S.C. 115 and 37 CFR 1.63. The information is required to obtain or retain a benefit by the public which is to file (and by the USPTO to process) an application. Confidentiality is governed by 35 U.S.C. 122 and 37 CFR 1.11 and 1.14. This collection is estimated to take 1 minute to complete, including gathering, preparing, and submitting the completed application form to the USPTO. Time will vary depending upon the individual case. Any comments on the amount of time you require to complete this form and/or suggestions for reducing this burden should be sent to the Chief Information Officer, U.S. Patent and Trademark Office, U.S. Department of Commerce, P.O. Box 1450, Alexandria, VA 22313-1450. DO NOT SEND FEES OR COMPLETED FORMS TO THIS ADDRESS. SEND TO: Commissioner for Patents, P.O. Box 1450, Alexandria, VA 22313-1450.
If you need assistance in completing the form, call 1-800-PTO-9199 and select option 2.
[Drawing sheets 1/7 through 7/7: Figs. 1 through 7B. The drawings themselves did not survive text extraction; recoverable labels include FIG. 5; FIG. 6 (951 DISPLAY, CPU, 957); and FIGS. 7A and 7B (1050, 1056, 1059, 1070, 1075, INPUT WOMAN'S MEASUREMENTS, INTERNET, INTERNET REMOTE).]

MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application is a continuation of U.S. Patent Application No. 12/700,055, filed February 4, 2010 (now U.S. Patent ________), which is a continuation of U.S. Patent Application No. 10/866,191, filed June 14, 2004, which is a continuation of U.S. Patent Application No. 09/433,297, filed November 3, 1999 (now U.S. Patent 6,750,848), which claims benefit of U.S. Provisional Application No. 60/107,652, filed November 9, 1998. These applications are hereby incorporated by reference.

REFERENCES TO RELATED APPLICATIONS BY THE INVENTORS

[0002] U.S. Patent Application No. 09/138,339, filed August 21, 1998.

[0003] U.S. Provisional Application No. 60/056,639, filed August 22, 1997.

[0004] U.S. Provisional Application No. 60/059,561, filed September 19, 1998.

[0005] Man Machine Interfaces: SN 08/290,516, filed 8/15/1994, and now USP 6,008,800.

[0006] Touch TV and Other Man Machine Interfaces: SN 08/496,908, filed 6/29/1995, and now USP 5,982,352.

[0007] Systems for Occupant Position Sensing: SN 08/968,114, filed 11/12/1997, now abandoned, which claims benefit of 60/031,256, filed 11/12/1996.

[0008] Target holes and corners: USSN 08/203,603, filed 2/28/1994, and 08/468,358, filed 6/6/1995, now USP 5,956,417 and USP 6,044,183.

[0009] Vision Target Based Assembly: USSN 08/469,429, filed 6/6/1995, now abandoned; 08/469,907, filed 6/6/1995, now USP 6,301,763; 08/470,325, filed 6/6/1995, now abandoned; and 08/466,294, filed 6/6/1995, now abandoned.

[00010] Picture Taking Method and Apparatus: Provisional Application no. 60/133,671, filed May 11, 1998.

[00011] Methods and Apparatus for Man Machine Interfaces and Related Activity: Provisional Application no. 60/133,673, filed May 11, 1998.

[00012] Camera Based Man-Machine Interfaces: Provisional Patent Application no. 60/142,777, filed July 8, 1999.

[00013] The copies of the disclosure of the above referenced applications are incorporated herein by reference.

BACKGROUND OF THE INVENTION

[00014] Field of the Invention

[00015] The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing object or human positions and/or orientations. The invention, in many preferred embodiments, uses real time stereo photogrammetry using single or multiple TV cameras whose output is analyzed and used as input to a personal computer, typically to gather data concerning the 3D location of parts of, or objects held by, a person or persons.

[00016] This continuation application seeks to provide further detail on useful embodiments for computing. One embodiment is a keyboard for a laptop computer (or stand alone keyboard for any computer) that incorporates digital TV cameras to look at points on, typically, the hand or the finger, or objects held in the hand of the user, which are used to input data to the computer. It may also, or alternatively, look at the head of the user as well.

[00017] Both hands or multiple fingers of each hand, or an object in one hand and fingers of the other, can be simultaneously observed, as can alternate arrangements as desired.

DESCRIPTION OF RELATED ART

[00018] My referenced co-pending applications incorporated herein by reference discuss many prior art references in various pertinent fields, which form a background for this invention.

BRIEF DESCRIPTION OF FIGURES

[00019] Fig. 1 illustrates a laptop or other computer keyboard with cameras according to the invention located on the keyboard surface to observe objects such as fingers and hands overhead of the keyboard.

[00020] Fig. 2 illustrates another keyboard embodiment using special datums or light sources such as LEDs.

[00021] Fig. 3 illustrates a further finger detection system for laptop or other computer input.

[00022] Fig. 4 illustrates learning, amusement, monitoring, and diagnostic methods and devices for the crib, playpen and the like.

[00023] Fig. 5 illustrates a puzzle toy for young children having cut out wood characters according to the invention.

[00024] Fig. 6 illustrates an improved handheld computer embodiment of the invention, in which the camera or cameras may be used to look at objects, screens and the like as well as look at the user along the lines of Fig. 1.

[00025] Fig. 7 illustrates new methods for internet commerce and other activities involving remote operation with 3D virtual objects display.

DESCRIPTION OF THE INVENTION

Figure 1

[00026] A laptop (or other) computer keyboard based embodiment is shown in Fig. 1. In this case, a stereo pair of cameras 100 and 101 located on each side of the keyboard are used, desirably having cover windows 103 and 104 mounted flush with the keyboard surface 102. The cameras are preferably pointed obliquely inward at angles θ toward the center of the desired work volume 170 above the keyboard. In the case of cameras mounted at the rear of the keyboard (toward the display screen), these cameras are also inclined to point toward the user at an angle as well.

[00027] Alternate camera locations may be used, such as the positions of cameras 105 and 106 on upper corners of screen housing 107, looking down at the top of the fingers (or hands, or objects in hand or in front of the cameras), or of cameras 108 and 109 shown.

[00028] One of the referenced embodiments of the invention is to determine the pointing direction vector 160 of the user's finger (for example pointing at an object displayed on screen 107), or the position and orientation of an object held by the user. Alternatively, finger position data can be used to determine gestures such as pinch or grip, and other examples of relative juxtaposition of objects with respect to each other, as has been described in co-pending referenced applications. Positioning of an object or portions (such as hands or fingers of a doll) is also of use, though more for use with larger keyboards and displays.

[00029] In one embodiment, shown in Fig. 2, cameras such as 100/101 are used to simply look at the tip of a finger 201 (or thumb) of the user, or an object such as a ring 208 on the finger. Light from below, such as provided by single central light 122, can be used to illuminate the finger, which typically looks bright under such illumination.

[00030] It is also noted that the illumination is directed or concentrated in an area where the finger is typically located, such as in work volume 170. If the light is of sufficient spectral content, the natural flesh tone of the finger can be observed — and recognized by use of the color TV cameras 100/101.

[00031] As is typically the case, the region of the overlapping cameras' viewing area is relatively isolated to the overlapping volumetric zone of their fields 170 shown, due to the focal lengths of their lenses and the angulation of the camera axes with respect to each other. This restricted overlap zone helps mitigate against unwanted matches in the two images due to information generated outside the zone of overlap. Thus there are no significant image matches found of other objects in the room, since the only flesh-toned object in the zone is typically the finger or fingers of the user, or alternatively, for example, the user's hand or hands. Similarly, objects or targets thereon can be distinguished by special colors or shapes.

[00032] If desired, or required, motion of the fingers can also be used to further distinguish their presence vis-a-vis any static background. If, for example, by subtraction of successive camera frames, the image of a particular object is determined to have moved, it is determined that this is likely the object of potential interest, which can be further analyzed directly to determine if it is the object of interest.
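
The flesh-tone and frame-subtraction screening of paragraphs [00030]-[00032] can be illustrated with standard image-processing operations. The following is only a sketch, not the front end processor referred to in the application: it assumes OpenCV and NumPy are available, and the HSV skin bounds and motion threshold are placeholder values that would have to be tuned to the actual cameras 100/101 and illumination.

```python
import cv2
import numpy as np

# Placeholder HSV bounds for flesh tone; real values depend on camera and lighting.
SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HIGH = np.array([25, 180, 255], dtype=np.uint8)
MOTION_THRESHOLD = 20  # per-pixel gray-level difference treated as "moved"

def find_finger_candidate(prev_frame, frame):
    """Return the (x, y) centroid of a moving, flesh-toned region, or None."""
    # Flesh-tone screening (paragraphs [00030]-[00031]).
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)

    # Motion screening by subtraction of successive frames (paragraph [00032]).
    gray_prev = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    moved = cv2.absdiff(gray_curr, gray_prev) > MOTION_THRESHOLD

    # Keep only pixels that are both flesh-toned and moving.
    candidate = np.where(moved, skin, 0).astype(np.uint8)

    m = cv2.moments(candidate, binaryImage=True)
    if m["m00"] < 1:          # nothing plausible in the overlap zone
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

In a two-camera arrangement, the same screening would be run on each camera's frames before the two candidate positions are matched for stereo analysis.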

[00033] In case of obscuration of the fingers or objects in the hand, cameras in additional locations, such as those mentioned above, can be used to solve for position if the view of one or more cameras is obscured.

[00034] The use of cameras mounted on both the screen and the keyboard allows one to deal with obscurations that may occur, and certain objects may or may not be advantageously delineated in one view or the other.

[00035] In addition, it may in many cases be desirable to have a datum on the top of the finger as opposed to the bottom, because on the bottom it can get in the way of certain activities. In this case the sensors are required on the screen looking downward, or in some other location such as off the computer entirely and located overhead, as has been noted in a previous application.

[00036] To determine finger location, a front end processor like that described in the target holes and corners co-pending applications incorporated by reference (USSN 08/203,603 and 08/468,358) can be used to also allow the finger shape as well as color to be detected.

[00037] Finger gestures comprising a sequence of finger movements can also be detected by analyzing sequential image sets, such that the motion of the finger, or of one finger with respect to another (such as in pinching something), can be determined. Cameras 100 and 101 have been shown at the rear of the keyboard near the screen or at the front. They may mount in the middle of the keyboard or any other advantageous location.
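
As an illustration of the pinch detection mentioned in paragraph [00037], the simplest reduction is a distance test with hysteresis on two tracked fingertip positions. The sketch below assumes per-frame 3D thumb and index fingertip coordinates (here in millimetres) are already produced by the stereo analysis; the thresholds are illustrative only.

```python
from math import dist  # Euclidean distance, Python 3.8+

class PinchDetector:
    """Declare a pinch when thumb and index tips close within a threshold,
    and release only after they separate past a larger one (hysteresis)."""

    def __init__(self, close_mm=15.0, open_mm=30.0):
        self.close_mm = close_mm
        self.open_mm = open_mm
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        d = dist(thumb_tip, index_tip)
        if self.pinching and d > self.open_mm:
            self.pinching = False
        elif not self.pinching and d < self.close_mm:
            self.pinching = True
        return self.pinching

# Example: feed per-frame fingertip positions from the stereo tracker.
detector = PinchDetector()
print(detector.update((0.0, 0.0, 300.0), (10.0, 5.0, 300.0)))  # True: tips ~11 mm apart
```

The hysteresis keeps the reported gesture stable even when the measured fingertip positions jitter near the threshold from frame to frame.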

[00038] The cameras can also see one's fingers directly, to allow typing as now, but without the physical keys. One can type in space above the plane of the keyboard (or in this case the plane of the cameras). This is useful for those applications where the keyboard of conventional style is too big (e.g., the hand held computer of Fig. 6).

[00039] Figure 2

[00040] It is also desirable for fast reliable operation to use retro-reflective materials and other materials to augment the contrast of objects used in the application. For example, a line target such as 200 can be worn on a finger 201, and advantageously can be located if desired between two joints of the finger as shown. This allows the tip of the finger to be used to type on the keyboard without feeling unusual — the case perhaps with target material on the tip of the finger.

[00041] The line image detected by the camera can also be provided by a cylinder, such as retroreflective cylinder 208 worn on the finger 201, which effectively becomes a line image in the field of view of each camera (assuming each camera is equipped with a sufficiently coaxial light source, typically one or more LEDs such as 210 and 211). This can be used to solve easily, using the line image pairs with the stereo cameras, for the pointing direction of the finger, which is often a desired result. The line, in the stereo pair of images, provides the pointing direction of the finger, for example pointing at an object displayed on the screen 140 of the laptop computer 138.

[00042] Figure 3

[00043] It is also possible to have light sources on the finger that can be utilized, such as the 2 LED light sources shown in Fig. 3. This can be used with either TV camera type sensors or with PSD type analog image position sensors as disclosed in the references incorporated.

[00044] In particular, the ring mounted LED light sources 301 and 302 can be modulated at different frequencies that can be individually discerned by sensors imaging the sources on to a respective PSD detector. Alternatively, the sources can simply be turned on and off at different times such that the position of each point can be independently found, allowing the pointing direction to be calculated from the LED point data gathered by the stereo pair of PSD based sensors.
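
The frequency multiplexing idea in paragraph [00044], two LEDs modulated at rates that can be individually discerned, can be illustrated with a simple spectral check on a detector signal. The modulation frequencies and sample rate below are assumptions for the sketch; a real PSD channel would demodulate each axis signal, but the separation principle is the same.

```python
import numpy as np

FS = 10_000.0                     # assumed detector sample rate, Hz
F_LED1, F_LED2 = 700.0, 1_100.0   # assumed modulation frequencies of LEDs 301 and 302

def led_amplitudes(signal):
    """Return the spectral amplitude of the detector signal at each LED's
    modulation frequency, so the two sources can be told apart."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    a1 = spectrum[np.argmin(np.abs(freqs - F_LED1))]
    a2 = spectrum[np.argmin(np.abs(freqs - F_LED2))]
    return a1, a2

# Synthetic test: LED 301 twice as strong as LED 302 in this detector's view.
t = np.arange(0, 0.1, 1.0 / FS)
sig = 2.0 * np.sin(2 * np.pi * F_LED1 * t) + 1.0 * np.sin(2 * np.pi * F_LED2 * t)
print(led_amplitudes(sig))   # roughly (1.0, 0.5): a sine of amplitude A gives A/2
```

The time-multiplexed alternative mentioned in the same paragraph needs no spectral step at all: each detector reading is attributed to whichever LED was switched on at that instant.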

[00045] The "natural interface keyboard" here described can have cameras or other sensors located at the rear looking obliquely outward toward the front as well as inward, so as to have their working volumes overlap in the middle of the keyboard, such that nearly the full volume over the keyboard area is accommodated.

[00046] Clearly larger keyboards can have a larger working volume than one might have on a laptop. The pair of sensors used can be augmented with other sensors mounted on the screen housing. It is noted that the linked dimension afforded for calibration between the sensors located on the screen and those on the keyboard is provided by the laptop's unitary construction.

[00047] One can use angle sensing means such as a rotary encoder for the laptop screen tilt. Alternatively, cameras located on the screen can be used to image reference points on the keyboard to achieve this. This allows the calibration of the sensors mounted fixedly with respect to the screen with respect to the sensors and keyboard space below. It also allows one to use stereo pairs of sensors that are not in the horizontal direction (such as 101/102), but could for example be a camera sensor such as 100 on the keyboard coupled with one on the screen, such as 106.

[00048] Knowing the pointing angles of the two cameras with respect to one another allows one to solve for the 3D location of objects from the matching of the object image positions in the respective camera fields.
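
Paragraph [00048] describes classical two-view triangulation. A minimal sketch is given below; it assumes each matched image point has already been converted, using the known relative camera pointing angles, into a ray (origin plus unit direction) expressed in a common frame, and it returns the midpoint of the shortest segment between the two rays as the 3D estimate.

```python
import numpy as np

def unit(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def triangulate(o1, d1, o2, d2):
    """Closest-point triangulation of two camera rays.

    o1, o2: ray origins (camera centres) in a common frame.
    d1, d2: unit direction vectors of the matched image point's rays.
    Returns the 3D point midway between the rays where they pass closest."""
    o1, d1, o2, d2 = map(np.asarray, (o1, d1, o2, d2))
    # Solve for scalars t1, t2 minimising |(o1 + t1*d1) - (o2 + t2*d2)|^2.
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return (o1 + t1 * d1 + o2 + t2 * d2) / 2.0

# Two cameras 30 cm apart, both sighting a point 30 cm back and 40 cm up.
point = triangulate(o1=[0, 0, 0],   d1=unit([0.0, 0.3, 0.4]),
                    o2=[0.3, 0, 0], d2=unit([-0.3, 0.3, 0.4]))
print(point)   # ~[0.0, 0.3, 0.4] metres
```

With noisy measurements the two rays rarely intersect exactly, which is why the midpoint of their closest approach (rather than an exact intersection) is returned.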

[00049] As noted previously, it is also of interest to locate a line or cylinder type target on the finger between the first and second joints. This allows one to use the fingertip for the keyboard activity, but by raising the finger up, it can be used as a line target capable of solving for the pointed direction, for example.

[00050] Alternatively, one can use two point targets on the finger, such as either retroreflective datums, colored datums such as rings, or LED light sources that can also be used with PSD detectors, as has also been noted in Fig. 2.

[00051] When using the cameras located for the purpose of stereo determination of the position of the fingers from their flesh tone images, it is useful to follow the preprocessing capable of processing data obtained from the cameras in order to look for the finger. This can be done both on a color basis and on the basis of shape as well as motion.

[00052] In this invention, I have shown the use of not only cameras located on a screen looking downward or outward from the screen, but also cameras that can be used instead of or in combination with those on the screen, placed essentially on the member on which the keyboard is incorporated. This essentially allows the keyboard-mounted cameras, which are preferably mounted flush with the keyboard surface to be unobtrusive, to nonetheless visually see the user's fingers, hands, or objects held by the user, and in some cases the face of the user.

[00053] This arrangement is also useful for 3D displays, for example where special synchronized glasses (e.g., the "Crystal Eyes" brand often used with Silicon Graphics workstations) are used to alternately present right and left images to each eye. In this case the object may appear to be actually in the workspace 170 above the keyboard, and it may be manipulated by virtually grasping (pushing, pulling, etc.) it, as has been described in co-pending applications.

[00054] Figure 4: Baby Learning and Monitoring System

[00055] A baby's reaction to the mother (or father) and the mother's analysis of the baby's reaction is very important. There are many gestures of babies apparently indicated in child psychology as being quite indicative of various needs, wants, or feelings and emotions, etc. These gestures are typically made with the baby's hands.

[00056] Today this is done and learned entirely by the mother being with the baby. However, with an electro-optical sensor based computer system, such as that described in co-pending applications, located proximate to or even in the crib (for example), one can have the child's reactions recorded, not just in the sense of a video tape, which would be too long and involved for most to use, but also in terms of the actual motions, which could be computer recorded and analyzed, also with the help of the mother, as to what the baby's responses were. And such motions, combined with other audio and visual data, can be very important to the baby's health, safety, and learning.

[00057] Consider for example crib 400 with computer 408 having LCD monitor 410 and speaker 411 and camera system (single or stereo) 420 as shown, able to amuse or inform baby 430, while at the same time recording (both visually, aurally, and in movement detected position data concerning parts of his body or objects such as rattles in his hand) his responses for any or all of the purposes of diagnosis of his state of being, remote transmission of his state, cues to various programs or images to display to him or broadcast to others, or the like.

[00058] For one example, the baby's motions could be used to signal a response from the TV, either in the absence of the mother or with the mother watching on a remote channel. This can even be over the Internet if the mother is at work.

[00059] For example, a comforting message could come up on the TV from the mother that could be prerecorded (or alternatively could actually be live, with TV cameras in the mother's or father's workplace, for example on a computer used by the parent) to tell the baby something reassuring or comfort the baby or whatever. Indeed the parent can be monitored using the invention and indicate something back, or even control a teleoperated robotic device to give a small child something to eat or drink, for example. The same applies to a disabled person.

[00060] If the father or mother came up on the screen, the baby could wave at it, move its head or "talk" to it, but the hand gestures may be the most important.

[00061] If the mother knows what the baby is after, she can talk to the baby or say something, or show something that the baby recognizes such as a doll. After a while, looking at this live, one can then move to talking to the baby from some prerecorded data.

[00062] What other things might we suppose? The baby for example knows to put its hand on the mother's cheek to cause the mother to turn to it. The baby also learns some other reflexes when it is very young that it forgets when it gets older. Many of these reflexes are hand movements, and are important in communicating with the remote TV based mother representation, whether real via telepresence or from CD ROM or DVD disk (or other media, including information transmitted to the computer from afar), and for the learning of the baby's actions.

[00063] Certainly, just from the making-the-baby-feel-good point of view, it would seem like certain motherly (or fatherly, etc.) responses to certain baby actions in the form of words and images would be useful. This stops short of physical holding of the baby, which is often needed, but could act as a stop gap to allow the parents to get another hour's sleep, for example.

[00064] As far as the baby touching things, I've discussed in other applications methods for realistic touch combined with images. This leads to a new form of touching crib mobiles that could contain video images and/or be imaged themselves, plus, if desired, be touched in ways that would be far beyond any response that you could get from a normal mobile.

[00065] For example, let us say there is a targeted (or otherwise TV observable) mobile 450 in the crib above the baby. The baby reaches up and touches a piece of the mobile, which is sensed by the TV camera system (either from the baby's hand position, the mobile movement, or both), and a certain sound is called up by the computer, a musical note for example. Another piece of the mobile, and another musical note. The mobile becomes a musical instrument for the baby that could play either notes or chords or complete passages, or any other desired programmed function.
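
The mobile-as-instrument behaviour of paragraph [00065] amounts to mapping whichever mobile piece the sensed hand position comes close to onto a note. In the sketch below the piece positions, note names, touch radius, and the play_note() stub are all placeholder assumptions; the hand position is taken to come from camera system 420.

```python
import numpy as np

# Assumed 3D positions (metres) of the mobile pieces over the crib, and a note for each.
MOBILE_PIECES = {
    "bird": (np.array([0.00, 0.10, 0.50]), "C4"),
    "fish": (np.array([0.15, 0.10, 0.50]), "E4"),
    "moon": (np.array([0.30, 0.10, 0.50]), "G4"),
}
TOUCH_RADIUS = 0.06   # how close the hand must come to count as a touch, metres

def play_note(note):
    print("playing", note)   # placeholder for whatever sound output is used

def on_hand_position(hand_xyz):
    """Play the note of the mobile piece the hand is touching, if any."""
    hand = np.asarray(hand_xyz, dtype=float)
    name, (pos, note) = min(MOBILE_PIECES.items(),
                            key=lambda item: np.linalg.norm(hand - item[1][0]))
    if np.linalg.norm(hand - pos) <= TOUCH_RADIUS:
        play_note(note)   # e.g. touching the "fish" piece sounds E4

on_hand_position([0.14, 0.12, 0.48])   # near the "fish" piece, so E4 plays
```

The same table could just as easily map each piece to a chord or a programmed passage, as the paragraph suggests.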

[00066] The baby can also signal things. Agitated movements by the baby would often mean that it is unhappy. This could be interpreted, using learned movement signatures and artificial intelligence as needed by the computer, to call for the mother even if the baby wasn't crying. If the baby cries, that can be picked up by microphone 440 and recognized using a voice recognition system along the lines of that used in the IBM ViaVoice commercial product, for example. And even the degree of crying can be analyzed to determine appropriate action.
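
As a very rough illustration of analyzing "the degree of crying" in paragraph [00066], a short-term loudness measure from microphone 440 can be bucketed into coarse levels that drive different actions. This stands in for, and is much simpler than, the recognition approach the text compares to IBM ViaVoice; the thresholds are arbitrary.

```python
import numpy as np

def crying_degree(samples, frame_len=1024):
    """Classify a mono audio buffer (float samples in [-1, 1]) from microphone 440
    into a coarse crying level based on its peak short-term RMS energy."""
    samples = np.asarray(samples, dtype=float)
    frames = len(samples) // frame_len
    if frames == 0:
        return "quiet"
    rms = [np.sqrt(np.mean(samples[i * frame_len:(i + 1) * frame_len] ** 2))
           for i in range(frames)]
    peak = max(rms)
    if peak < 0.02:
        return "quiet"
    if peak < 0.10:
        return "fussing"        # perhaps show a familiar image or play music
    return "crying hard"        # notify the parent, e.g. by email or a remote channel

# Synthetic check: a loud 400 Hz burst reads as hard crying.
t = np.linspace(0, 1.0, 16000, endpoint=False)
print(crying_degree(0.5 * np.sin(2 * np.pi * 400 * t)))   # "crying hard"
```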

[00067] The computer could also be used to transmit information of this sort via internet email to the mother, who could even be at work. And until help arrives in the form of mother intervention or whatever, the computer could access a program that could display on a screen for the baby things that the baby likes, and could try to soothe the baby through either images of familiar things, music, or whatever. This could be useful at night when parents need sleep, and anything that would make the baby feel more comfortable would help the parents.

[00068] It could also be used to allow the baby to input to the device. For example, if the baby was hungry, a picture of the bottle could be brought up on the screen. The baby then could yell for the bottle. Or if the baby needed his diaper changed, perhaps something reminiscent of that. If the baby reacts to such suggestions of his problem, this gives a lot more intelligence as to why he is crying, and while mothers can generally tell right away, not everyone else can. In other words, this is pretty neat for babysitters and other members of the household so they can act more intelligently on the signals the baby is providing.

[00069] Besides in the crib, the system as described can be used in conjunction with a playpen, hi-chair or other place of baby activity.

[00070] As the child gets older, the invention can further be used also with more advanced activity with toys, and to take data from toy positions as well. For example, blocks, dolls, little cars, and moving toys even such as trikes, scooters, drivable toy cars and bikes with training wheels.

[00071] The following figure illustrates the ability of the invention to learn, and thus to assist in the creation of toys and other things.

[00072] Figure 5: Learning Puzzle Toy

[00073] Disclosed in Fig. 5 is a puzzle toy 500 where woodcut animals such as bear 505 and lion 510 are pulled out with a handle such as 511. The child can show the animal to the camera and a computer 530 with TV camera (or cameras) 535 can recognize the shape as the animal, and provide a suitable image and sounds on screen 540.

[00074] Alternatively, and more simply, a target, or targets, on the back of the animal can be used, such as triangle 550 on the back of lion 511. In either case the camera can solve for the 3D, and even 5 or 6D, position and orientation of the animal object, and cause it to move accordingly on the screen as the child maneuvers it. The child can hold two animals, one in each hand, and they can each be detected, even with a single camera, and be programmed in software to interact as the child wishes (or as he learns the program).
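
Solving for the 3D (or 5 or 6 degree-of-freedom) position and orientation of a targeted toy, as in paragraph [00074], is a standard pose-from-points computation once the target geometry is known. The sketch below uses OpenCV's solvePnP and, to obtain a unique single-view solution, assumes a four-dot target of known size rather than the three-corner triangle 550; the camera intrinsics are likewise assumed placeholders.

```python
import cv2
import numpy as np

# Known geometry of a four-dot target on the toy's back (metres, in the toy's own
# frame); assumed here to be a 6 cm square.
TARGET_3D = np.array([[0.00, 0.00, 0.0],
                      [0.06, 0.00, 0.0],
                      [0.06, 0.06, 0.0],
                      [0.00, 0.06, 0.0]], dtype=np.float64)

# Assumed pinhole intrinsics of camera 535: focal length and principal point in pixels.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
DIST = np.zeros(5)   # assume negligible lens distortion

def toy_pose(dot_pixels):
    """Given the four detected dot centres in the image (same order as TARGET_3D),
    return the toy's rotation and translation relative to the camera."""
    pts = np.asarray(dot_pixels, dtype=np.float64).reshape(-1, 1, 2)
    ok, rvec, tvec = cv2.solvePnP(TARGET_3D, pts, K, DIST)
    return (rvec, tvec) if ok else None   # drive the on-screen animal from these
```

The returned rotation and translation are exactly the 6D quantities the paragraph refers to, and can be fed each frame to the animation so the on-screen animal follows the child's hand.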

[00075] This is clearly for very young children of two or three years of age. The toys have to be large so they can't be swallowed.

[00076] With the invention in this manner, one can make a toy of virtually anything, for example a block. Just hold this block up, teach the computer/camera system the object, and play using any program you might want to represent it and its actions. To make this block known to the system, the shape of the block, the color of the block, or some code on the block can be determined. Any of those items could tell the camera which block it was, and most could give position and orientation if known.

[00077] At that point, an image is called up from the computer representing that particular animal or whatever else the block is supposed to represent. Of course this can be changed in the computer to be a variety of things if this is something that is acceptable to the child. It could certainly be changed in size, such that a small lion could grow into a large lion. The child could probably absorb that more than a lion changing into a giraffe, for example, since the block wouldn't correspond to that. The child can program or teach the system any of his blocks to be the animal he wants, and that might be fun.

[00078] For example, he or the child's parent could program a square to be a giraffe whereas a triangle would be a lion. Maybe this could be an interesting way to get the child to learn his geometric shapes!

[00079] Now the basic block held up in front of the camera system could be looked at just for what it is. As the child may move the thing toward or away from the camera system, one may get a rough sense of depth from the change in shape of the object. However this is not so easy as the object changes in shape due to any sort of rotations.

[00080] Particularly interesting then is to also sense the rotations of the object so that the animal can actually move realistically in 3 dimensions on the screen. And perhaps having a de-tuning of the shape of the movement so that the child's relatively jerky movements would not appear jerky on the screen or would not look so accentuated. Conversely of course, you can go the other way and accentuate the motions.
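
The "de-tuning" of jerky child movements in paragraph [00080] can be as simple as low-pass filtering the measured positions before they drive the on-screen animal, with a gain above one giving the opposite, accentuated effect. A minimal sketch, with assumed filter parameters:

```python
import numpy as np

class MotionConditioner:
    """Exponentially smooth measured positions so jerky child movements look
    fluid on screen ("de-tuning"); a gain above 1 accentuates the motion instead."""

    def __init__(self, alpha=0.25, gain=1.0):
        self.alpha = alpha          # smaller alpha = heavier smoothing
        self.gain = gain            # scale applied to each displayed movement step
        self.smoothed = None
        self.displayed = None

    def update(self, measured_xyz):
        x = np.asarray(measured_xyz, dtype=float)
        if self.smoothed is None:
            self.smoothed = x.copy()
            self.displayed = x.copy()
            return self.displayed
        previous = self.smoothed.copy()
        self.smoothed += self.alpha * (x - self.smoothed)
        # Apply the (possibly amplified) smoothed step to the displayed position.
        self.displayed += self.gain * (self.smoothed - previous)
        return self.displayed

# Jerky raw samples produce a fluid displayed trajectory.
cond = MotionConditioner(alpha=0.3)
for raw in ([0, 0, 0], [0.10, 0, 0], [0.02, 0, 0], [0.12, 0, 0]):
    print(cond.update(raw))
```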

[00081] This can, for example, be done with a line target around the edge of the object, which is often useful for providing position or orientation information to the TV camera based analysis software, and for making the object easier to see in reflective illumination.

[00082] Aid to speech recognition

[00083] The previous co-pending application entitled "Useful man machine interfaces and applications" referenced above discussed the use of a person's movements or positions to aid in recognizing the voice spoken by the person.

[00084] In one instance, this can be achieved by simply using one's hand to indicate to the camera system of the computer that the voice recognition should start (or stop, or any other function, such as a paragraph or sentence end, etc.).

[00085] Another example is to use the camera system of the invention to determine the location of the person's head (or other part), from which one can instruct a computer to preferentially evaluate the sound field, in phase and amplitude, of two or more spaced microphones to listen from that location, thus aiding the pickup of speech, which oftentimes is not able to be heard well enough for computer based automatic speech recognition to occur.
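
Paragraph [00085] describes steering a multi-microphone pickup toward the optically measured head location. One standard way to do that is delay-and-sum beamforming: advance each microphone signal by its travel time from the target location and add. The sketch below uses whole-sample delays and assumed microphone positions and sample rate; a practical system would use fractional delays and per-channel weighting.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
FS = 16_000              # assumed audio sample rate, Hz

# Assumed microphone positions (metres) relative to the display.
MIC_POSITIONS = [np.array([-0.2, 0.0, 0.0]), np.array([0.2, 0.0, 0.0])]

def steer_to(head_xyz, mic_signals):
    """Delay-and-sum the microphone signals so the array 'listens' toward the
    head position reported by the camera system."""
    head = np.asarray(head_xyz, dtype=float)
    # Travel time from the head to each microphone.
    delays = [np.linalg.norm(head - m) / SPEED_OF_SOUND for m in MIC_POSITIONS]
    out = np.zeros(len(mic_signals[0]))
    for sig, d in zip(mic_signals, delays):
        sig = np.asarray(sig, dtype=float)
        # Advance each channel by its own travel time so sound arriving from
        # the head lines up across channels before summing.
        shift = int(round(d * FS))   # whole-sample approximation
        aligned = sig[shift:]
        out[:len(aligned)] += aligned
    return out / len(MIC_POSITIONS)
```

Because the head location comes from the stereo cameras, the delays can be recomputed continuously as the user moves, keeping the pickup steered at the talker.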

[00086] Digital interactive TV

[00087] As you watch TV, data can be taken from the camera system of the invention and transmitted back to the source of programming. This could include voting on a given proposition by raising your hand, for example, with your hand indication transmitted. Or you could hold up 3 fingers, and the count of fingers transmitted. Or in
