
US008553079B2

(12) United States Patent
     Pryor

(10) Patent No.: US 8,553,079 B2
(45) Date of Patent: Oct. 8, 2013

(54) MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS

(71) Applicant: Timothy R. Pryor, Sylvania, OH (US)

(72) Inventor: Timothy R. Pryor, Sylvania, OH (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 13/714,748

(22) Filed: Dec. 14, 2012

(65) Prior Publication Data

    US 2013/0169535 A1    Jul. 4, 2013

Related U.S. Application Data

(63) Continuation of application No. 12/700,055, filed on Feb. 4, 2010, which is a continuation of application No. 10/866,191, filed on Jun. 14, 2004, now abandoned, which is a continuation of application No. 09/433,297, filed on Nov. 3, 1999, now Pat. No. 6,750,848.

(60) Provisional application No. 60/107,652, filed on Nov. 9, 1998.

(51) Int. Cl.
    H04N 9/47    (2006.01)
    H04N 7/18    (2006.01)

(52) U.S. Cl.
    USPC ............ 348/77; 348/155

(58) Field of Classification Search
    None
    See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

3,909,002 A    9/1975  Levy
4,219,847 A    8/1980  Pinkney et al.
4,339,798 A    7/1982  Hedges et al.
4,631,676 A   12/1986  Pugh
4,791,589 A   12/1988  Blazo et al.
4,843,568 A    6/1989  Krueger et al.
4,908,704 A    3/1990  Fujioka et al.
4,988,981 A    1/1991  Zimmerman et al.
5,008,946 A    4/1991  Ando
5,088,928 A    2/1992  Chan
5,227,986 A    7/1993  Yokota et al.
5,249,053 A    9/1993  Jain
5,297,061 A    3/1994  Dementhon et al.
5,365,597 A   11/1994  Holeva
5,376,796 A   12/1994  Chan et al.
5,388,059 A    2/1995  DeMenthon
5,454,043 A    9/1995  Freeman
5,459,793 A * 10/1995  Naoi et al. .............. 382/165
5,491,507 A    2/1996  Umezawa et al.
5,528,263 A *  6/1996  Platzker et al. .......... 345/156
5,534,921 A    7/1996  Sawanobori
5,572,251 A * 11/1996  Ogawa ................. 348/207.99
5,581,276 A   12/1996  Cipolla et al.
5,594,469 A    1/1997  Freeman et al.
5,616,078 A *  4/1997  Oh ......................... 463/8
5,624,117 A    4/1997  Ohkubo et al.
5,781,647 A    7/1998  Fishbine et al.
5,781,650 A    7/1998  Lobo et al.
5,808,672 A *  9/1998  Wakabayashi et al. .... 348/220.1
5,828,770 A   10/1998  Leis et al.
5,845,006 A   12/1998  Sumi et al.
5,853,327 A   12/1998  Gilboa
5,864,334 A *  1/1999  Sellers .................. 345/168

(Continued)

Primary Examiner - Peling Shaw
(74) Attorney, Agent, or Firm - Warner Norcross & Judd LLP

(57) ABSTRACT

A method for determining a gesture illuminated by a light source utilizes the light source to provide illumination through a work volume above the light source. A camera is positioned to observe and determine the gesture performed in the work volume.

30 Claims, 7 Drawing Sheets

IPR2021-00922
Apple EX1001 Page 1
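The embodiments described in the specification below rest on stereo determination: a fingertip (or a datum worn on the finger) is located in 3D from its matched image positions in two cameras of known geometry, and two such points give a pointing direction. The sketch below illustrates that geometry for an idealized, rectified camera pair; the focal length, baseline, and image coordinates are invented for illustration and are not values from the disclosure.

```python
import math

def triangulate(xl, yl, xr, f=800.0, b=0.12):
    """Idealized rectified stereo pair: xl/yl and xr are image coordinates
    (pixels) of the same point seen by the left and right cameras, f is the
    focal length in pixels, b the camera baseline in meters.
    Returns (X, Y, Z) in the left-camera frame."""
    d = xl - xr                  # disparity in pixels
    Z = f * b / d                # depth by similar triangles
    return (xl * Z / f, yl * Z / f, Z)

def pointing_vector(tip, joint):
    """Unit vector from a datum lower on the finger toward the fingertip,
    i.e. a pointing direction along the lines of vector 160 of FIG. 1."""
    v = [t - j for t, j in zip(tip, joint)]
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

tip = triangulate(52.0, -10.0, 12.0)     # fingertip matched in both images
joint = triangulate(55.0, -40.0, 25.0)   # second datum on the finger
print(pointing_vector(tip, joint))
```

With a line or cylinder target worn between two finger joints, as the specification suggests, the same two-point construction yields the pointing axis of the finger.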
US 8,553,079 B2
Page 2

(56) References Cited

U.S. PATENT DOCUMENTS

5,878,174 A    3/1999  Stewart et al.
5,904,484 A    5/1999  Burns
5,926,168 A    7/1999  Fan
5,936,610 A *  8/1999  Endo ..................... 345/157
5,940,126 A    8/1999  Kimura
5,982,352 A   11/1999  Pryor
5,999,840 A   12/1999  Grimson et al.
6,052,132 A    4/2000  Christian et al.
6,098,458 A    8/2000  French et al.
6,108,033 A    8/2000  Ito et al.
6,148,100 A   11/2000  Anderson et al.
6,160,899 A   12/2000  Lee et al.
6,204,852 B1   3/2001  Kumar et al.
6,252,598 B1 * 6/2001  Segen ................... 715/863
6,342,917 B1   1/2002  Amenta
6,346,929 B1 * 2/2002  Fukushima et al. ........... 345/8
6,359,647 B1   3/2002  Sengupta et al.
6,363,160 B1   3/2002  Bradski et al.
6,373,472 B1   4/2002  Palalau et al.
6,442,465 B2   8/2002  Breed et al.
6,508,709 B1   1/2003  Karmarkar
6,529,617 B1   3/2003  Prokoski
6,597,817 B1   7/2003  Silverbrook
6,663,491 B2  12/2003  Watabe et al.
6,750,848 B1   6/2004  Pryor
6,775,361 B1   8/2004  Arai et al.
6,788,336 B1   9/2004  Silverbrook
6,911,972 B2   6/2005  Brinjes
7,489,863 B2   2/2009  Lee

* cited by examiner
U.S. Patent    Oct. 8, 2013    Sheet 1 of 7    US 8,553,079 B2

[Drawing sheet; figure content and reference numerals not legible in this copy.]
U.S. Patent    Oct. 8, 2013    Sheet 2 of 7    US 8,553,079 B2

[Drawing sheet; figure content and reference numerals not legible in this copy.]
U.S. Patent    Oct. 8, 2013    Sheet 3 of 7    US 8,553,079 B2

[Drawing sheet; figure content and reference numerals not legible in this copy.]
U.S. Patent    Oct. 8, 2013    Sheet 4 of 7    US 8,553,079 B2

[Drawing sheet; figure content and reference numerals not legible in this copy.]
U.S. Patent    Oct. 8, 2013    Sheet 5 of 7    US 8,553,079 B2

[Drawing sheet; figure content and reference numerals not legible in this copy.]
U.S. Patent    Oct. 8, 2013    Sheet 6 of 7    US 8,553,079 B2

[Drawing sheet; figure content and reference numerals not legible in this copy.]
U.S. Patent    Oct. 8, 2013    Sheet 7 of 7    US 8,553,079 B2

[FIG. 7A drawing: reference numerals 1000, 1001, 1002, 1014, 1020, 1030.]

[FIG. 7B drawing: reference numerals 1050, 1055, 1056, 1060, 1070, 1075; labels "INPUT WOMAN'S MEASUREMENTS", "CPU 1055", "INTERNET", "INTERNET REMOTE".]
MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/700,055, filed Feb. 4, 2010, which is a continuation of U.S. patent application Ser. No. 10/866,191, filed Jun. 14, 2004, which is a continuation of U.S. patent application Ser. No. 09/433,297, filed Nov. 3, 1999 (now U.S. Pat. No. 6,750,848), which claims benefit of U.S. Provisional Application No. 60/107,652, filed Nov. 9, 1998. These applications are hereby incorporated by reference.

REFERENCES TO RELATED APPLICATIONS BY THE INVENTORS

U.S. patent application Ser. No. 09/138,339, filed Aug. 21, 1998.

U.S. Provisional Application No. 60/056,639, filed Aug. 22, 1997.

U.S. Provisional Application No. 60/059,561, filed Sep. 19, 1998.

Man Machine Interfaces: Ser. No. 08/290,516, filed Aug. 15, 1994, and now U.S. Pat. No. 6,008,800.

Touch TV and Other Man Machine Interfaces: Ser. No. 08/496,908, filed Jun. 29, 1995, and now U.S. Pat. No. 5,982,352.

Systems for Occupant Position Sensing: Ser. No. 08/968,114, filed Nov. 12, 1997, now abandoned, which claims benefit of Ser. No. 60/031,256, filed Nov. 12, 1996.

Target holes and corners: U.S. Ser. No. 08/203,603, filed Feb. 28, 1994, and Ser. No. 08/468,358, filed Jun. 6, 1995, now U.S. Pat. No. 5,956,417 and U.S. Pat. No. 6,044,183.

Vision Target Based Assembly: U.S. Ser. No. 08/469,429, filed Jun. 6, 1995, now abandoned; Ser. No. 08/469,907, filed Jun. 6, 1995, now U.S. Pat. No. 6,301,763; Ser. No. 08/470,325, filed Jun. 6, 1995, now abandoned; and Ser. No. 08/466,294, filed Jun. 6, 1995, now abandoned.

Picture Taking Method and Apparatus: Provisional Application No. 60/133,671, filed May 11, 1998.

Methods and Apparatus for Man Machine Interfaces and Related Activity: Provisional Application No. 60/133,673, filed May 11, 1998.

Camera Based Man-Machine Interfaces: Provisional Patent Application No. 60/142,777, filed Jul. 8, 1999.

The copies of the disclosure of the above referenced applications are incorporated herein by reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing object or human positions and/or orientations. The invention, in many preferred embodiments, uses real time stereo photogrammetry using single or multiple TV cameras whose output is analyzed and used as input to a personal computer, typically to gather data concerning the 3D location of parts of, or objects held by, a person or persons.

This continuation application seeks to provide further detail on useful embodiments for computing. One embodiment is a keyboard for a laptop computer (or stand alone keyboard for any computer) that incorporates digital TV cameras to look at points on, typically, the hand or the finger, or objects held in the hand of the user, which are used to input data to the computer. It may also or alternatively look at the head of the user as well.

Both hands or multiple fingers of each hand, or an object in one hand and fingers of the other, can be simultaneously observed, as can alternate arrangements as desired.

2. Description of Related Art

My referenced co-pending applications incorporated herein by reference discuss many prior art references in various pertinent fields, which form a background for this invention.

BRIEF DESCRIPTION OF FIGURES

FIG. 1 illustrates a laptop or other computer keyboard with cameras according to the invention located on the keyboard surface to observe objects such as fingers and hands overhead of the keyboard.

FIG. 2 illustrates another keyboard embodiment using special datums or light sources such as LEDs.

FIG. 3 illustrates a further finger detection system for laptop or other computer input.

FIG. 4 illustrates learning, amusement, monitoring, and diagnostic methods and devices for the crib, playpen and the like.

FIG. 5 illustrates a puzzle toy for young children having cut out wood characters according to the invention.

FIG. 6 illustrates an improved handheld computer embodiment of the invention, in which the camera or cameras may be used to look at objects, screens and the like as well as look at the user along the lines of FIG. 1.

FIGS. 7A-B illustrate new methods for internet commerce and other activities involving remote operation with 3D virtual objects display.

DESCRIPTION OF THE INVENTION

FIG. 1

A laptop (or other) computer keyboard based embodiment is shown in FIG. 1. In this case, a stereo pair of cameras 100 and 101 located on each side of the keyboard are used, desirably having cover windows 103 and 104 mounted flush with the keyboard surface 102. The cameras are preferably pointed obliquely inward at angles Φ toward the center of the desired work volume 170 above the keyboard. In the case of cameras mounted at the rear of the keyboard (toward the display screen), these cameras are also inclined to point toward the user at an angle as well.

Alternate camera locations may be used, such as the positions of cameras 105 and 106 on upper corners of screen housing 107, looking down at the top of the fingers (or hands, or objects in hand or in front of the cameras), or of cameras 108 and 109 shown.

One of the referenced embodiments of the invention is to determine the pointing direction vector 160 of the user's finger (for example pointing at an object displayed on screen 107), or the position and orientation of an object held by the user. Alternatively, finger position data can be used to determine gestures such as pinch or grip, and other examples of relative juxtaposition of objects with respect to each other, as has been described in co-pending referenced applications. Positioning of an object or portions (such as hands or fingers of a doll) is also of use, though more for use with larger keyboards and displays.

In one embodiment, shown in FIG. 2, cameras such as 100/101 are used to simply look at the tip of a finger 201 (or thumb) of the user, or an object such as a ring 208 on the
finger. Light from below, such as provided by single central light 122, can be used to illuminate the finger, which typically looks bright under such illumination.

It is also noted that the illumination is directed or concentrated in an area where the finger is typically located, such as in work volume 170. If the light is of sufficient spectral content, the natural flesh tone of the finger can be observed, and recognized by use of the color TV cameras 100/101.

As is typically the case, the region of the overlapping cameras viewing area is relatively isolated to the overlapping volumetric zone of their fields 170 shown, due to focal lengths of their lenses and the angulation of the camera axes with respect to each other. This restricted overlap zone helps mitigate against unwanted matches in the two images due to information generated outside the zone of overlap. Thus there are no significant image matches found of other objects in the room, since the only flesh-toned object in the zone is typically the finger or fingers of the user. Or alternatively, for example, the user's hand or hands. Similarly, objects or targets thereon can be distinguished by special colors or shapes.

If desired, or required, motion of the fingers can be also used to further distinguish their presence vis-a-vis any static background. If, for example, by subtraction of successive camera frames, the image of a particular object is determined to have moved, it is determined that this is likely the object of potential interest, which can be further analyzed directly to determine if it is the object of interest.

In case of obscuration of the fingers or objects in the hand, cameras in additional locations, such as those mentioned above, can be used to solve for position if the view of one or more cameras is obscured.

The use of cameras mounted on both the screen and the keyboard allows one to deal with obscurations that may occur, and certain objects may or may not be advantageously delineated in one view or the other.

In addition, it may be in many cases desirable to have a datum on the top of the finger as opposed to the bottom because, on the bottom, it can get in the way of certain activities. In this case the sensors are required on the screen looking downward, or in some other location such as off the computer entirely and located overhead, as has been noted in a previous application.

To determine finger location, a front end processor like that described in the target holes and corners co-pending application reference incorporated, U.S. Ser. Nos. 08/203,603 and 08/468,358, can be used to also allow the finger shape as well as color to be detected.

Finger gestures comprising a sequence of finger movements can also be detected by analyzing sequential image sets, such that the motion of the finger, or one finger with respect to another such as in pinching something, can be determined.

Cameras 100 and 101 have been shown at the rear of the keyboard near the screen or at the front. They may mount in the middle of the keyboard or any other advantageous location.

The cameras can also see one's fingers directly, to allow typing as now, but without the physical keys. One can type in space above the plane of the keyboard (or in this case the plane of the cameras). This is useful for those applications where the keyboard of conventional style is too big (e.g., the hand held computer of FIG. 6).

FIG. 2

It is also desirable for fast reliable operation to use retroreflective materials and other materials to augment the contrast of objects used in the application. For example, a line target such as 200 can be worn on a finger 201, and advantageously can be located, if desired, between two joints of the finger as shown. This allows the tip of the finger to be used to type on the keyboard without feeling unusual, the case perhaps with target material on the tip of the finger.

The line image detected by the camera can be provided also by a cylinder, such as retroreflective cylinder 208 worn on the finger 201, which effectively becomes a line image in the field of view of each camera (assuming each camera is equipped with a sufficiently coaxial light source, typically one or more LEDs such as 210 and 211), and can be used to solve easily, using the line image pairs with the stereo cameras, for the pointing direction of the finger that is often a desired result. The line, in the stereo pair of images, provides the pointing direction of the finger, for example pointing at an object displayed on the screen 140 of the laptop computer 138.

FIG. 3

It is also possible to have light sources on the finger that can be utilized, such as the 2 LED light sources shown in FIG. 3. This can be used with either TV camera type sensors or with PSD type analog image position sensors as disclosed in references incorporated.

In particular, the ring mounted LED light sources 301 and 302 can be modulated at different frequencies that can be individually discerned by sensors imaging the sources on to a respective PSD detector. Alternatively, the sources can simply be turned on and off at different times such that the position of each point can be independently found, allowing the pointing direction to be calculated from the LED point data gathered by the stereo pair of PSD based sensors.

The "natural interface keyboard" here described can have cameras or other sensors located at the rear looking obliquely outward toward the front as well as inward, so as to have their working volume overlap in the middle of the keyboard such that the nearly full volume over the keyboard area is accommodated.

Clearly larger keyboards can have a larger working volume than one might have on a laptop. The pair of sensors used can be augmented with other sensors mounted on the screen housing. It is noted that the linked dimension afforded for calibration between the sensors located on the screen and those on the keyboard is provided by the laptop unitary construction.

One can use angle sensing means such as a rotary encoder for the laptop screen tilt. Alternatively, cameras located on the screen can be used to image reference points on the keyboard as reference points to achieve this. This allows the calibration of the sensors mounted fixedly with respect to the screen with respect to the sensors and keyboard space below. It also allows one to use stereo pairs of sensors that are not in the horizontal direction (such as 101/102) but could for example be a camera sensor such as 100 on the keyboard coupled with one on the screen, such as 106.

Knowing the pointing angles of the two cameras with respect to one another allows one to solve for the 3D location of objects from the matching of the object image positions in the respective camera fields.

As noted previously, it is also of interest to locate a line or cylinder type target on the finger between the first and second joints. This allows one to use the fingertip for the keyboard activity, but by raising the finger up, it can be used as a line target capable of solving for the pointed direction, for example.

Alternatively, one can use two point targets on the finger, such as either retroreflective datums, colored datums such as rings, or LED light sources that can also be used with PSD detectors, which has also been noted in FIG. 2.

When using the cameras located for the purpose of stereo determination of the position of the fingers from their flesh tone images, it is useful to follow the preprocessing capable of
processing data obtained from the cameras in order to look for the finger. This can be done on both a color basis and on the basis of shape as well as motion.

In this invention, I have shown the use of not only cameras located on a screen looking downward or outward from the screen, but also cameras that can be used instead of or in combination with those on the screen, placed essentially on the member on which the keyboard is incorporated. This allows essentially the keyboard-mounted cameras, which are preferably mounted flush with the keyboard surface to be unobtrusive, to visually see the user's fingers, hands or objects held by the user and, in some cases, the face of the user.

This arrangement is also useful for 3D displays, for example where special synchronized glasses (e.g., the "Crystal Eyes" brand often used with Silicon Graphics work stations) are used to alternatively present right and left images to each eye. In this case the object may appear to be actually in the workspace 170 above the keyboard, and it may be manipulated by virtually grasping (pushing, pulling, etc.) it, as has been described in co-pending applications.

FIG. 4: Baby Learning and Monitoring System

A baby's reaction to the mother (or father) and the mother's analysis of the baby's reaction is very important. There are many gestures of babies apparently indicated in child psychology as being quite indicative of various needs, wants, or feelings and emotions, etc. These gestures are typically made with the baby's hands.

Today this is done and learned entirely by the mother being with the baby. However, with an electro-optical sensor based computer system, such as that described in co-pending applications, located proximate to or even in the crib (for example), one can have the child's reactions recorded, not just in the sense of a video tape, which would be too long and involved for most to use, but also in terms of the actual motions, which could be computer recorded and analyzed, also with the help of the mother as to what the baby's responses were. And such motions, combined with other audio and visual data, can be very important to the baby's health, safety, and learning.

Consider for example crib 400 with computer 408 having LCD monitor 410 and speaker 411 and camera system (single or stereo) 420 as shown, able to amuse or inform baby 430, while at the same time recording (both visually, aurally, and in movement detected position data concerning parts of his body or objects such as rattles in his hand) his responses for any or all of the purposes of diagnosis of his state of being, remote transmission of his state, cues to various programs or images to display to him or broadcast to others, or the like.

For one example, baby's motions could be used to signal a response from the TV, either in the absence of the mother or with the mother watching on a remote channel. This can even be over the Internet if the mother is at work.

For example, a comforting message could come up on the TV from the mother that could be prerecorded (or alternatively could actually be live, with TV cameras in the mother's or father's workplace, for example on a computer used by the parent) to tell the baby something reassuring or comfort the baby or whatever. Indeed, the parent can be monitored using the invention and indicate something back, or even control a teleoperated robotic device to give a small child something to eat or drink, for example. The same applies to a disabled person.

If the father or mother came up on the screen, the baby could wave at it, move its head or "talk" to it, but the hand gestures may be the most important.

If the mother knows what the baby is after, she can talk to baby or say something, or show something that the baby recognizes such as a doll. After a while, looking at this live, one can then move to talking to the baby from some prerecorded data.

What other things might we suppose? The baby for example knows to put its hand on the mother's cheek to cause the mother to turn to it. The baby also learns some other reflexes when it is very young that it forgets when it gets older. Many of these reflexes are hand movements, and are important in communicating with the remote TV based mother representation, whether real via telepresence or from CD ROM or DVD disk (or other media, including information transmitted to the computer from afar), and for the learning of the baby's actions.

Certainly, just from the making-the-baby-feel-good point of view, it would seem like certain motherly (or fatherly, etc.) responses to certain baby actions, in the form of words and images, would be useful. This stops short of physical holding of the baby, which is often needed, but could act as a stop gap, for example to allow the parents to get another hour's sleep.

As far as the baby touching things, I've discussed in other applications methods for realistic touch combined with images. This leads to a new form of touching crib mobiles that could contain video images and/or be imaged themselves, plus, if desired, touched in ways that would be far beyond any response that you could get from a normal mobile.

For example, let us say there is a targeted (or otherwise TV observable) mobile 450 in the crib above the baby. Baby reaches up and touches a piece of the mobile, which is sensed by the TV camera system (either from the baby's hand position, the mobile movement, or both), and a certain sound is called up by the computer, a musical note for example. Another piece of the mobile, and another musical note. The mobile becomes a musical instrument for the baby that could play either notes or chords or complete passages, or any other desired programmed function.

The baby can also signal things. The baby can signal using agitated movements, which would often mean that it's unhappy. This could be interpreted using learned movement signatures and artificial intelligence as needed by the computer to call for mother, even if the baby wasn't crying. If the baby cries, that can be picked up by microphone 440 and recognized using a voice recognition system, along the lines of that used in the IBM Via Voice commercial product for example. And even the degree of crying can be analyzed to determine appropriate action.

The computer could also be used to transmit information of this sort via internet email to the mother, who could even be at work. And until help arrives in the form of mother intervention or whatever, the computer could access a program that could display on a screen for the baby things that the baby likes, and could try to soothe the baby through either images of familiar things, music or whatever. This could be useful at night when parents need sleep, and anything that would make the baby feel more comfortable would help the parents.

It could also be used to allow the baby to input to the device. For example, if the baby was hungry, a picture of the bottle could be brought up on the screen. The baby then could yell for the bottle. Or if the baby needed his diaper changed, perhaps something reminiscent of that. If the baby reacts to such suggestions of his problem, this gives a lot more intelligence as to why he is crying, and while mothers can generally tell right away, not everyone else can. In other words, this is pretty neat for babysitters and other members of the household so they can act more intelligently on the signals the baby is providing.
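The agitated-movement signal described above ("interpreted using learned movement signatures") can be approximated, at its simplest, by the successive-frame subtraction the disclosure mentions earlier for finger finding: difference consecutive camera frames and compare the total motion energy against a threshold. A rough sketch only; frames are small grayscale arrays, and the threshold stands in for a learned criterion rather than anything specified in the patent:

```python
def motion_energy(prev, curr):
    """Sum of absolute pixel differences between successive grayscale
    frames; the static crib background cancels, so the total reflects
    how much the baby (or a held rattle) moved."""
    return sum(abs(c - p)
               for row_p, row_c in zip(prev, curr)
               for p, c in zip(row_p, row_c))

def is_agitated(frames, thresh=300):
    """Flag sustained large movement across a window of frames.
    `thresh` is an assumed stand-in for a learned movement signature."""
    energies = [motion_energy(a, b) for a, b in zip(frames, frames[1:])]
    return all(e > thresh for e in energies)

calm = [[[10, 10], [10, 10]]] * 4                 # nothing moves
waving = [[[0, 0], [0, 200]], [[200, 0], [0, 0]],
          [[0, 200], [0, 0]], [[0, 0], [200, 0]]]  # limb swings every frame
print(is_agitated(calm), is_agitated(waving))
```

A real system would of course localize the motion and classify its pattern, but the frame-difference energy is the first stage either way.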
`US 8,553,079 B2
`
`8
`
`7
Besides in the crib, the system as described can be used in conjunction with a playpen, hi-chair or other place of baby activity.

As the child gets older, the invention can further be used also with more advanced activity with toys, and to take data from toy positions as well. For example, blocks, dolls, little cars, and moving toys even such as trikes, scooters, drivable toy cars and bikes with training wheels.

The following figure illustrates the ability of the invention to learn, and thus to assist in the creation of toys and other things.

FIG. 5: Learning Puzzle Toy

Disclosed in FIG. 5 is a puzzle toy 500 where woodcut animals such as bear 505 and lion 510 are pulled out with a handle such as 511. The child can show the animal to the camera, and a computer 530 with TV camera (or cameras) 535 can recognize the shape as the animal and provide a suitable image and sounds on screen 540.

Alternatively, and more simply, a target, or targets, on the back of the animal can be used, such as triangle 550 on the back of lion 511. In either case the camera can solve for the 3D, and even 5 or 6D, position and orientation of the animal object, and cause it to move accordingly on the screen as the child maneuvers it. The child can hold two animals, one in each hand, and they can each be detected, even with a single camera, and be programmed in software to interact as the child wishes (or as he learns the program).

This is clearly for very young children of two or three years of age. The toys have to be large so they can't be swallowed.

With the invention in this manner, one can make a toy of virtually anything, for example a block. Just hold this block up, teach the computer/camera system the object, and play using any program you might want to represent it and its actions. To make this block known to the system, the shape of the block, the color ...

... orientation information to the TV camera based analysis software, and in making the object easier to see in reflective illumination.

Aid to Speech Recognition

The previous co-pending application entitled "Useful man machine interfaces and applications" referenced above discussed the use of a person's movements or positions to aid in recognizing the voice spoken by the person.

In one instance, this can be achieved by simply using one's hand to indicate to the camera system of the computer that voice recognition should start (or stop, or any other function, such as a paragraph or sentence end, etc.).

Another example is to use the camera system of the invention to determine the location of the person's head (or other part), from which one can instruct a computer to preferentially evaluate the sound field in phase and amplitude of two or more spaced microphones to listen from that location, thus aiding the pickup of speech, which oftentimes is not able to be heard well enough for computer-based automatic speech recognition to occur.

Digital Interactive TV

As you watch TV, data can be taken from the camera system of the invention and transmitted back to the source of programming. This could include voting on a given proposition by raising your hand, for example, with your hand indication transmitted. Or you could hold up 3 fingers, and the count of fingers transmitted. Or, in a more extreme case, your position, or the position of an object or portion thereof, could be transmitted; for example, you could buy a coded object whose code would be transmitted to indicate that you personally (having been pre-registered) had transmitted a certain packet of data.

If the programming source can transmit individually to you (not possible today, but forecast for the future), then much ...
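The speech-recognition passage proposes evaluating the sound field in phase and amplitude of two or more spaced microphones so as to listen preferentially from the location of the person's head. One minimal way to realize that idea is delay-and-sum beamforming: shift each microphone channel to cancel its extra propagation delay from the chosen focus point, then average. The sketch below is illustrative only; the microphone spacing, sample rate, speed of sound, and function name are assumptions, not part of the patent disclosure.

```python
import numpy as np

# Illustrative constants (assumptions, not from the patent):
C = 343.0    # speed of sound in air, m/s
FS = 16000   # sample rate, Hz

def delay_and_sum(signals, mic_positions, focus_point, fs=FS, c=C):
    """Steer an array of spaced microphones toward focus_point.

    Each channel is advanced by its extra propagation delay from the
    focus point (relative to the closest microphone), rounded to whole
    samples, and the aligned channels are averaged. Sound arriving
    from the focus point then adds coherently, while sound from other
    directions adds incoherently and is attenuated.
    """
    mic_positions = np.asarray(mic_positions, dtype=float)
    focus_point = np.asarray(focus_point, dtype=float)
    dists = np.linalg.norm(mic_positions - focus_point, axis=1)
    # Integer-sample delays relative to the closest microphone.
    shifts = np.round((dists - dists.min()) / c * fs).astype(int)
    n = min(len(s) - k for s, k in zip(signals, shifts))
    aligned = [np.asarray(s, dtype=float)[k:k + n]
               for s, k in zip(signals, shifts)]
    return np.mean(aligned, axis=0)
```

Steering the same two channels toward the talker's head keeps nearly full signal amplitude, while steering toward the wrong location misaligns the channels and weakens their sum; a system of the kind described would simply re-steer as the head-tracking camera updates the location estimate.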
