
UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________

APPLE INC.
Petitioner

v.

GESTURE TECHNOLOGY PARTNERS LLC
Patent Owner
____________

Case No. IPR2021-00922
U.S. Patent No. 8,553,079
____________

PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 8,553,079

Table of Contents

I. INTRODUCTION
II. SUMMARY OF THE ’079 PATENT
   A. The ’079 Patent’s Alleged Invention
   B. The ’079 Patent’s Prosecution
   C. A Person Having Ordinary Skill in the Art
III. REQUIREMENTS FOR IPR UNDER 37 C.F.R. § 42.104
   A. Standing Under 37 C.F.R. § 42.104(a)
   B. Challenge Under 37 C.F.R. § 42.104(b) and Relief Requested
   C. Claim Construction Under 37 C.F.R. § 42.104(b)(3)
IV. THE CHALLENGED CLAIMS ARE UNPATENTABLE
   A. Ground 1: Claims 1, 2, 4-14, 17, 19, 21-22, 24-28, and 30 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of the knowledge of a PHOSITA
   B. Ground 2: Claims 3, 15, and 23 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of Numazaki ’863
   C. Ground 3: Claims 16 and 29 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of DeLuca
   D. Ground 4: Claim 18 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of DeLeeuw
   E. Ground 5: Claim 20 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of Maruno
V. DISCRETIONARY CONSIDERATIONS
   A. The Fintiv Factors Favor Institution
VI. CONCLUSION
VII. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8(a)(1)
I. INTRODUCTION

Petitioner Apple Inc. (“Petitioner”) requests an Inter Partes Review (“IPR”) of claims 1–30 (the “Challenged Claims”) of U.S. Patent No. 8,553,079 (“the ’079 Patent”).
II. SUMMARY OF THE ’079 PATENT

A. The ’079 Patent’s Alleged Invention

The ’079 Patent generally describes computer input devices employing cameras and lights to observe points on the human body and optically sense human positions and/or orientations. ’079 Patent (Ex. 1001), 1:54-2:6. Examples of input devices contemplated by the patent include a computer keyboard, puzzle toy, and handheld computer. Id. at 2:15-31. Fig. 2 below illustrates one exemplary embodiment implemented in a laptop computer:
[Figure: ’079 Patent, Fig. 2]
Id. at Fig. 2. As illustrated, a laptop 138 may include camera locations 100, 101, 105, 106, 108, 109; keyboard surface 102; screen housing 107; light 122; light emitting diodes (LEDs) 210 and 211; and work volume area 170 within which a user’s movements are detected. Id. at 2:39-53. The system can detect a user’s finger alone, or the user may employ external objects such as ring 208 to help detect and recognize gestures performed in work volume area 170. Id. at 2:54-3:8. The ’079 Patent describes detecting point, pinch, and grip gestures using this configuration. Id. at 3:48-51.
B. The ’079 Patent’s Prosecution

The Application that resulted in the ’079 Patent was filed on December 14, 2012. The Application claims priority to provisional patent application No. 60/107,652, filed November 9, 1998, through a succession of continuation applications 09/433,297; 10/866,191; and 12/700,055. Id. For purposes of this petition and without waiving its right to challenge priority in this or any other proceeding, Petitioner adopts November 9, 1998 as the invention date for the Challenged Claims.
Applicant canceled twenty originally filed claims (1–20) by preliminary amendment and replaced them with thirty new claims (21–50). ’079 File History (Ex. 1002), 134-138. A Notice of Allowance issued on July 24, 2013, in which the Examiner allowed all thirty claims (renumbered 1–30) in a first office action. Id. at 150, 175. The Examiner amended the abstract and reasoned that none of Naoi et al. (US 5459793), Platzker et al. (US 5528263), Sellers (US 5864334 A), or Fukushima et al. (US 6346929 B1) taught or suggested the independently claimed elements: (1) providing a camera oriented to observe a gesture performed in the work volume; (2) the camera being fixed relative to the light source; and (3) determining, using the camera, the gesture performed in the work volume and illuminated by the light source. Id. at 153.
C. A Person Having Ordinary Skill in the Art

A person having ordinary skill in the art (“PHOSITA”) at the time of the ’079 Patent would have had at least a bachelor’s degree in electrical engineering or equivalent, with at least one year of experience in the field of human-computer interaction. Additional education or experience might substitute for the above requirements. Bederson Dec. (Ex. 1010), ¶¶ 29-31.
III. REQUIREMENTS FOR IPR UNDER 37 C.F.R. § 42.104

A. Standing Under 37 C.F.R. § 42.104(a)

Petitioner certifies that the ’079 Patent is available for IPR and that Petitioner is not barred or estopped from requesting an IPR challenging the claims of the ’079 Patent. Specifically, (1) Petitioner is not the owner of the ’079 Patent, (2) Petitioner has not filed a civil action challenging the validity of any claim of the ’079 Patent, and (3) this Petition is filed less than one year after Petitioner was served with a complaint alleging infringement of the ’079 Patent.
B. Challenge Under 37 C.F.R. § 42.104(b) and Relief Requested

In view of the prior art and evidence presented, claims 1-30 of the ’079 Patent are unpatentable and should be cancelled. 37 C.F.R. § 42.104(b)(1). Further, based on the prior art references identified below, IPR of the Challenged Claims should be granted. 37 C.F.R. § 42.104(b)(2).
Proposed Grounds of Unpatentability and Supporting Exhibits:

Ground 1: Claims 1, 2, 4-14, 17, 19, 21-22, 24-28, and 30 are obvious under pre-AIA 35 U.S.C. § 103 over U.S. Patent No. 6,144,366 (“Numazaki”) in view of the knowledge of a PHOSITA. (Ex. 1004)

Ground 2: Claims 3, 15, and 23 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 5,900,863 (“Numazaki ’863”). (Ex. 1004, Ex. 1005)

Ground 3: Claims 16 and 29 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 6,064,354 (“DeLuca”). (Ex. 1004, Ex. 1006)

Ground 4: Claim 18 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 6,008,018 (“DeLeeuw”). (Ex. 1004, Ex. 1007)

Ground 5: Claim 20 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 6,191,773 (“Maruno”). (Ex. 1004, Ex. 1008)

Section IV identifies where each element of the Challenged Claims is found in the prior art. 37 C.F.R. § 42.104(b)(4). The exhibit numbers of the evidence relied upon to support the challenges are provided above, and the relevance of the evidence to the challenges raised is provided in Section IV. 37 C.F.R. § 42.104(b)(5). Exhibits 1001-1016 are also attached.
C. Claim Construction Under 37 C.F.R. § 42.104(b)(3)

In this proceeding, claims are interpreted under the same standard applied by Article III courts (i.e., the Phillips standard). See 37 C.F.R. § 42.100(b); see also 83 Fed. Reg. 197 (Oct. 11, 2018); Phillips v. AWH Corp., 415 F.3d 1303, 1312 (Fed. Cir. 2005) (en banc). Under this standard, words in a claim are given their plain meaning, which is the meaning understood by a person of ordinary skill in the art in view of the patent and file history. Phillips, 415 F.3d at 1312-13. For purposes of the proposed grounds below, Petitioner proposes that no terms require express construction.
IV. THE CHALLENGED CLAIMS ARE UNPATENTABLE

A. Ground 1: Claims 1, 2, 4-14, 17, 19, 21-22, 24-28, and 30 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of the knowledge of a PHOSITA

i. Overview of Numazaki

U.S. Patent No. 6,144,366 to Numazaki et al. (“Numazaki”) (Ex. 1004) was filed on October 17, 1997 and is prior art to the ’079 Patent under at least 35 U.S.C. § 102(e) (pre-AIA). Numazaki was not cited or considered during prosecution of the ’079 Patent or its parent, U.S. Patent No. 6,750,848. ’079 Patent (Ex. 1001); ’848 Patent (Ex. 1003).
Numazaki is generally directed to a method for detecting a gesture or the movement of a user’s hand. Numazaki (Ex. 1004), Abstract, 4:9-40. Numazaki purports to have improved upon prior methods by using a controlled light source to illuminate the target object (e.g., the user’s hand), a first camera unit (referred to by Numazaki as a “photo-detection unit”),[1] and a second camera unit. Id. at 11:9-23. This arrangement is illustrated in Fig. 2 below:
[Figure: Numazaki, Fig. 2]

Id. at Fig. 2. A timing control unit is used to turn lighting unit 101 on (i.e., illuminating the target object) when the first camera unit is active and off when the second camera unit is active. Id. at 11:20-32. The result of this light control is that the first camera unit captures an image of the target object illuminated by both natural light and the lighting unit 101, while the second camera unit captures an image of the target object illuminated by only natural light. Id. at 11:33-39. The difference between the two images, obtained by difference calculation unit 111, represents the “reflected light from the object resulting from the light emitted by the lighting unit 101.” Id. at 11:43-51. This information is then used by feature data generation unit 103 to determine gestures, pointing, etc. of the target object that may be converted into commands executed by a computer. Id. at 10:57-66.

[1] A PHOSITA would have considered Numazaki’s photo-detection units to be camera units. Bederson Dec. (Ex. 1010), ¶ 36 (explaining that Numazaki describes using CMOS or CCD sensor units, which were two of the more common optical sensors used in camera units at the time).
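For illustration of the difference calculation just described, the following is a minimal Python sketch, assuming two synchronized 8-bit grayscale frames (one captured with the lighting unit on, one with it off); the function and array names are illustrative assumptions, not drawn from Numazaki or the ’079 Patent:

```python
import numpy as np

def reflected_light_image(frame_lit: np.ndarray, frame_unlit: np.ndarray) -> np.ndarray:
    """Approximate the output of a difference calculation like Numazaki's unit 111.

    frame_lit   -- frame captured while the lighting unit is on
                   (natural light plus controlled light)
    frame_unlit -- frame captured while the lighting unit is off
                   (natural light only)

    Because ambient light contributes roughly equally to both frames,
    subtracting them cancels the ambient component, leaving only the
    controlled light reflected from nearby objects such as the user's hand.
    """
    lit = frame_lit.astype(np.int16)      # widen to avoid uint8 underflow
    unlit = frame_unlit.astype(np.int16)
    return np.clip(lit - unlit, 0, 255).astype(np.uint8)
```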
In its eighth embodiment, Numazaki describes implementing this structure in a computer such that a user can point or gesture with an index finger while typing on the keyboard “with[] hardly any shift of the hand position.” Id. at 50:25-43. This arrangement is illustrated in Fig. 74 below:

[Figure: Numazaki, Fig. 74]

Id. at Figure 74 (annotated to indicate light source 701 and photo-detection sensor unit 702). Numazaki teaches the entirety of the operator’s hand is illuminated within the range depicted by the dashed circle such that the user’s pointing and gestures can be captured and converted to commands. Id. at 50:38-48.
Numazaki teaches that its eighth embodiment incorporates “the information input generation apparatus of the present invention as described in the above embodiments.” Id. at 50:21-24. A PHOSITA would have understood that the referenced information input generation apparatus is that illustrated in Fig. 2 and described in the corresponding disclosure. Bederson Dec. (Ex. 1010), ¶¶ 42-43 (explaining that Numazaki describes its controlled light and two-camera configuration as key to its invention and noting that Numazaki at 53:22-36 teaches that the eighth embodiment uses the precise image difference calculation taught by Fig. 2 and its corresponding disclosure).
Because Numazaki, like the ’079 Patent, discloses a method and apparatus for generating computer input information by capturing hand gestures, Numazaki is in the same field of endeavor as the ’079 Patent. Compare Numazaki (Ex. 1004), 50:29-37 (“In this computer of FIG. 74, a lighting unit 701 and a photo-detection sensor unit 702 of the information input generation apparatus are provided at positions beyond the keyboard” such that the “entire hand of the operator is illuminated . . . [in] a range of illumination”) with ’079 Patent (Ex. 1001), Abstract (describing “[a] method for determining a gesture illuminated by a light source” that “utilizes the light source to provide illumination through a work volume above the light source” where a “camera is positioned to observe and determine the gesture performed in the work volume”). Numazaki is therefore analogous art to the ’079 Patent. Bederson Dec. (Ex. 1010), ¶ 41.
ii. Claim 1

1[P] A computer implemented method comprising:

To the extent the preamble is limiting, Numazaki teaches a computer implemented method. In particular, Numazaki teaches a “method . . . for generating information input . . . capable of realizing a direct command type information input scheme by which the gesture or the motion can be inputted easily.” Numazaki (Ex. 1004), 4:9-13.
Numazaki’s eighth embodiment teaches a computer implemented method for controlling functions on a laptop device through gestures or pointing:

    In this configuration, the operator operating the keyboard can make the pointing or gesture input by slightly raising and moving the index finger. The user’s convenience is remarkably improved here because the keyboard input and the pointing or gesture input can be made without hardly any shift of the hand position.

Id. at 50:38-43. This arrangement is illustrated in Fig. 74 below:

[Figure: Numazaki, Fig. 74]

Id. at Fig. 74.
[1(a)] providing a light source adapted to direct illumination through a work volume above the light source;

Numazaki teaches a lighting unit 701 (i.e., light source) that is adapted to illuminate a user’s hand (i.e., human body part) within a work volume generally above the light source. For example, Numazaki teaches “the entire hand of the operator is illuminated, as can be seen from a dashed line circle indicating a range of illumination.” Id. at 50:35-37 (emphasis added). As depicted in Fig. 74 below, the lighting unit 701 is adapted to illuminate the user’s hand by positioning it “beyond the keyboard” to illuminate a work volume indicated by the dashed line circle above it:

[Figure: Numazaki, Fig. 74]

Id. at Fig. 74 (annotated to illustrate light source and range of hand illumination).
[1(b)] providing a camera oriented to observe a gesture performed in the work volume, the camera being fixed relative to the light source; and determining, using the camera, the gesture performed in the work volume and illuminated by the light source.

Numazaki discloses a photo-detection sensor unit 702 (i.e., camera)[2] that is positioned next to lighting unit 701 (i.e., in a fixed location relative to the light source) and arranged to have the optical axis of the photo-detection sections pointing obliquely upward towards the operator side to observe a gesture performed by the user (i.e., observe a gesture) within a “dashed line circle” (i.e., work volume) depicted below. Id. at 50:30-43. This arrangement is illustrated in Fig. 74 below:

[Figure: Numazaki, Fig. 74]

Id. at Fig. 74 (annotated to show camera).

[2] As described in the overview section above, a PHOSITA would have considered Numazaki’s photo-detection units to be camera units. Bederson Dec. (Ex. 1010), ¶ 36 (explaining that Numazaki describes using CMOS or CCD sensor units, which were two of the more common optical sensors used in camera units at the time).

As described with reference to Fig. 2, which is incorporated into the eighth embodiment as noted above, Numazaki uses this light and camera arrangement to illuminate the target object (e.g., the user’s hand) in a controlled manner such that a precise image of the user’s hand and hand movement can be ascertained. Id. at 11:9-23. Specifically, a timing control unit is used to turn lighting unit 101 on (i.e., illuminating the target object) when the first camera unit is active and off when the second camera unit is active. Id. at 11:20-32. The result of this light control is that the first camera unit captures an image of the target object illuminated by both natural light and the lighting unit 101, while the second camera unit captures an image of the target object illuminated by only natural light. Id. at 11:33-39. The difference between the two images, obtained by difference calculation unit 111, represents the “reflected light from the object resulting from the light emitted by the lighting unit 101.” Id. at 11:43-51. This information is then used by feature data generation unit 103 to determine gestures, pointing, etc. of the target object that may be converted into commands executed by a computer. Id. at 10:57-66. Through this arrangement, a PHOSITA would have understood that component 702 illustrated in Fig. 74 is “oriented to observe a gesture performed in the work volume” illuminated by lighting unit 701. Bederson Dec. (Ex. 1010), ¶¶ 42-43.
iii. Claim 2

2. The method according to claim 1 wherein the light source includes a light emitting diode.

As discussed above with reference to limitation 1[a], Numazaki teaches light source 701 in the embodiment depicted in Fig. 74 below:

[Figure: Numazaki, Fig. 74]

Numazaki (Ex. 1004), Fig. 74; see also limitation 1[a], supra. Although Numazaki does not discuss the specific lighting technology contemplated for light source 701, with reference to the first embodiment, Numazaki teaches that an “LED can be used as the light source” since “the LED has a property that it can emit more intense light instantaneously” while “reduc[ing] the power required for the light emission.” Id. at 14:49-56; see also id. at 68:13-20. Based on Numazaki’s teachings, a PHOSITA would have been motivated to implement light source 701 in Fig. 74 using LED technology for a number of reasons. Bederson Dec. (Ex. 1010), ¶ 44. First, the benefit of emitting more intense light instantaneously described with reference to the first embodiment would have improved Fig. 74’s apparatus such that it could quickly and accurately detect pointing and gestures. Id.; see also KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 417 (2007) (obvious to use known techniques to improve similar devices in the same way). Second, a PHOSITA would have understood that the “note PC” depicted in Fig. 74 is a self-contained and portable unit that would have benefitted from the power reduction of an LED light source discussed with reference to the first embodiment. Bederson Dec. (Ex. 1010), ¶ 44 (noting that Numazaki’s eighth embodiment contemplates other portable implementations and, at col. 52, ln. 33 – col. 53, ln. 7, expressly discusses the desire to conserve power and techniques for doing so by controlling the lighting unit).
iv. Claim 4

4. The method according to claim 1 wherein detecting a gesture includes analyzing sequential images of the camera.

Numazaki’s eighth embodiment analyzes the sequential movement of “click and drag” operations by detecting a combination of “pointing or gesture input” and “keyboard input” via a “button for use in conjunction with the pointing or gesture input.” Numazaki (Ex. 1004), 50:38-47. This permits “selecting and moving icons on the screen.” Id. at 50:45-47.

A PHOSITA would have understood that a gesture-based “click and drag” operation would have been implemented by capturing a sequence of images to determine the user’s hand movement, which dictates the “drag” operation to be performed. Bederson Dec. (Ex. 1010), ¶¶ 45-46. Indeed, in its second embodiment, Numazaki expressly describes a process through which the system tracks lateral finger movements by detecting the center of gravity of a finger, where “finger tip movement and the center of gravity movement can be smoothly correlated” using pixel values. Numazaki (Ex. 1004), 19:43-20:25. Fig. 10 below illustrates the process by which the reflected light image of a hand and finger are mapped to a pixelated target space, and Fig. 11 illustrates how coordinate-based finger movement can be tracked on the basis of pixel value changes:

[Figure: Numazaki, Fig. 10]

Id. at Fig. 10, 19:13-42 (describing the same).

[Figure: Numazaki, Fig. 11A]

Id. at Fig. 11A, 19:43-20:25 (describing a center of gravity tracking that enables a fingertip to be tracked). Using this technique, Numazaki teaches “the cursor on [a] screen can be controlled” so that “when the finger is moved, the cursor is also moved” and a cursor position depends on a fingertip position. Id. at 26:8-14, 26:23-25. A PHOSITA would have understood that processing such cursor control is similar to processing the “click and drag” functionality described with reference to the eighth embodiment and would have been motivated to implement the eighth embodiment using the same processing functionality described in the second embodiment. Bederson Dec. (Ex. 1010), ¶¶ 45-46.
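To make the center-of-gravity tracking concrete, here is a minimal sketch of intensity-weighted centroid tracking across sequential frames, assuming reflected-light images of the kind produced by the difference step sketched earlier; the NumPy implementation and names are illustrative assumptions, not taken from Numazaki:

```python
import numpy as np

def fingertip_center_of_gravity(reflected: np.ndarray) -> tuple[float, float]:
    """Estimate a fingertip position as the intensity-weighted centroid
    (center of gravity) of a reflected-light image.

    reflected -- 2D array from the lit/unlit difference step; pixel values
                 are large where the nearby finger reflects the controlled
                 light and near zero elsewhere.
    """
    weights = reflected.astype(np.float64)
    total = weights.sum()
    if total == 0:
        raise ValueError("no reflected light detected")
    rows, cols = np.indices(weights.shape)
    return (float((rows * weights).sum() / total),
            float((cols * weights).sum() / total))

def centroid_delta(prev_frame: np.ndarray, cur_frame: np.ndarray) -> tuple[float, float]:
    """Change in centroid between two sequential frames: an approximation
    of the finger's lateral motion, which could in turn drive a cursor or
    a "drag" operation."""
    r0, c0 = fingertip_center_of_gravity(prev_frame)
    r1, c1 = fingertip_center_of_gravity(cur_frame)
    return (r1 - r0, c1 - c0)
```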

v. Claim 5

5. The method according to claim 1 wherein the detected[3] gesture includes at least one of a pinch gesture, a pointing gesture, and a grip gesture.

In the eighth embodiment, Numazaki teaches “the operator operating the keyboard can make [a] pointing or gesture input by slightly raising and moving the index finger” and consequently conduct a “click and drag” operation for “selecting and moving icons on the screen.” Numazaki (Ex. 1004), 50:38-48. Figure 74 illustrates such an operation:

[Figure: Numazaki, Fig. 74]

Id. at Fig. 74 (annotated to show pointing gesture).

[3] Claim 5 recites “the detected gesture” of claim 1, but the term “detected” lacks antecedent basis. For purposes of this petition, Petitioner assumes “the detected gesture” of claim 5 is a reference to Claim 1’s “determining, using the camera, the gesture performed in the work volume and illuminated by the light source.”
vi. Claim 6

6. The method according to claim 1 further including determining the pointing direction of a finger in the work volume.

In the eighth embodiment, Numazaki teaches “the operator operating the keyboard can make [a] pointing or gesture input by slightly raising and moving the index finger” and consequently conduct a “click and drag” operation for “selecting and moving icons on the screen.” Id. at 50:38-48. Any operator hand movements used as computer inputs are detectable within the volume of space Numazaki refers to as a “range of illumination.” Figure 74 illustrates this operation:

[Figure: Numazaki, Fig. 74]

Id. at Fig. 74 (annotated to show pointing gesture, icon selection, and range of illumination). A PHOSITA would have understood that determining the pointing direction of a finger in a work volume is necessary to implement the described “click and drag” feature. Bederson Dec. (Ex. 1010), ¶ 47 (explaining that the pointing direction of a finger is necessary to determine which icon a user intends to select and how far and in which direction the icon is to be moved).

vii. Claim 7

7. The method according to claim 1 further including providing a target positioned on a user that is viewable in the work volume.

As depicted in Figures 74 and 77, Numazaki’s eighth embodiment also teaches “the entire hand of [an] operator is illuminated, as can be seen from a dashed line circle indicating a range of illumination.” Numazaki (Ex. 1004), 50:35-37 (emphasis added).

[Figures: Numazaki, Figs. 74 and 77]

Id. at Figs. 74, 77 (annotated to illustrate the expanding range of illumination for capturing hand gestures). In this configuration, the user’s hand operates within, and is detectable within, an illuminated work volume. Numazaki discloses the user’s hand itself as a target object in the range of illumination:

    When the target object is a hand, it becomes possible to obtain the information regarding a gesture or a pointing according to the feature data extracted from the reflected light image of the hand . . . and it becomes possible to operate a computer by using this obtained information.

Id. at 10:57-66 (emphasis added). Numazaki discloses additional hand attributes as subsidiary targets, including color, material, and shape information. Id. at 16:45-61. The orientation of the hand also creates different targets with different distances extracted from the hand as a target object to help extract its overall “3D shape.” Id. at 12:27-45.

Although the above disclosures in Numazaki utilize natural characteristics of a user’s hand to improve target detection, Numazaki also notes that it was known in the prior art to position a target on a user (i.e., something that is added to a user’s person) in order to improve target detection. For example, Numazaki notes that it was known to paint a fingertip or to wear a ring in a particular color to improve detection. Id. at 3:4-11. Numazaki, however, cautions that requiring users to wear or mount some external component may negatively impact the user’s convenience and may bring with it durability issues. Id. at 3:32-38. A PHOSITA would have understood, however, that the Fig. 74 arrangement described in the eighth embodiment is particularly well suited to a ring or other small target mounted on a user’s finger. Bederson Dec. (Ex. 1010), ¶¶ 48-49. Given the option of improved accuracy in exchange for the minor inconvenience of wearing a small ring or other hand-based target when using gesture recognition while typing, a PHOSITA would have understood that many users would accept this tradeoff. Id. Indeed, the durability concerns are hardly implicated by a ring target, and many adults routinely wear rings while typing with no ill effect, which suggests that such a tradeoff would be acceptable to many users. Id.
viii. Claim 8

8. The method according to claim 1 further including determining the three-dimensional position of a point on a user.

As noted above, Numazaki’s eighth embodiment uses the “information input generation apparatus” of Fig. 2 to detect and process a user’s gestures. Within this Fig. 2 apparatus, feature data generation unit 103 determines three-dimensional information representing the user’s hand:

    The feature data generation unit 103 extracts various feature data from the reflected light image. Many different types of the feature data and their extraction methods can be used in the present invention, as will be described in detail in the later embodiments. When the target object is a hand, it becomes possible to obtain the information regarding a gesture or a pointing according to the feature data extracted from the reflected light image of the hand, for example, and it becomes possible to operate a computer by using this obtained information. It is also possible to extract the 3D information on the target object for further utilization.

Numazaki (Ex. 1004), 10:57-67 (emphasis added). A PHOSITA would have understood that detecting a three-dimensional representation of a user’s hand involves determining the three-dimensional position of at least one “point on a user” as claimed. Bederson Dec. (Ex. 1010), ¶ 50 (explaining that representing a hand in three dimensions necessarily involves ascertaining the three-dimensional position of a plurality of points on the user’s hand).

Further, Numazaki uses distance- and region-based extraction methods to track the uniformity or homogeneity of a surface reflecting light to extract a 3D shape of a user’s hand (e.g., detect the uneven palm of a hand). Numazaki (Ex. 1004), 16:34-44.
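As a purely illustrative aside on how a reflected-light image can yield three-dimensional information, the following sketch assumes an idealized inverse-square falloff model, a common approximation offered here for exposition only, not a reconstruction of Numazaki’s specific extraction methods:

```python
import numpy as np

def approximate_depth_map(reflected: np.ndarray, k: float = 1.0) -> np.ndarray:
    """Convert a reflected-light image into a rough per-pixel depth map.

    Under an inverse-square falloff model, the intensity of controlled
    light reflected from a surface scales roughly as 1/d^2, so brighter
    pixels correspond to nearer surface points (e.g., raised fingers
    versus the receding palm). k is an assumed calibration constant.
    """
    intensity = reflected.astype(np.float64)
    depth = np.full(intensity.shape, np.inf)  # background: no reflection
    lit = intensity > 0
    depth[lit] = np.sqrt(k / intensity[lit])
    return depth
```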
ix. Claim 9

9. The method according to claim 1 wherein the camera and the light source are positioned in fixed relation relative to a keypad.

Numazaki’s eighth embodiment places the light source and camera above a keyboard as depicted in Figure 74 below:

[Figure: Numazaki, Fig. 74]

Id. at Fig. 74 (annotated to show keyboard in relation to light source and camera).

Numazaki further teaches that the lighting and camera units are mounted in a fixed location just above the keyboard:

    This computer of FIG. 74 is a portable computer generally called note PC in which a keyboard and a display are integrally provided with the computer body. In this computer of FIG. 74, a lighting unit 701 and a photo-detection sensor unit 702 of the information input generation apparatus are provided at positions beyond the keyboard when viewed from an operator side, and arranged to have the optical axis of the photo-detection sections pointing obliquely upward towards the operator side.

Id. at 50:25-35. This positioning ensures “the entire hand of the operator is illuminated” within the “dashed line circle . . . range of illumination.” Id. at 50:35-37.

Even when using a keyboard separate from the display as illustrated in Fig. 75, Numazaki teaches keeping a lighting unit 703 and photo-detection sensor unit 704 at “positions in such a positional relationship with the keyboard that the light is irradiated onto the hand when the hand is raised from the home position of the keyboard.” Id. at 51:13-15 (emphasis added).

[Figure: Numazaki, Fig. 75]

Id. at Fig. 75.
x. Claim 10

10. The method according to claim 9 wherein the camera, the light source and the keypad form part of a laptop computer.

Numazaki teaches a configuration that arranges all of a light source, camera, and keypad together on a laptop computer as depicted in Figure 74 below:

[Figure: Numazaki, Fig. 74]

Id. at Fig. 74; see also id. at 50:25-35 (describing a “portable computer generally called note PC” comprising a keyboard, display, lighting unit 701, and photo-detection sensor unit 702); Bederson Dec. (Ex. 1010), ¶ 51 (explaining that “note PC” is a term used to describe laptop computers and concluding the device depicted in Fig. 74 is a laptop computer).
xi. Claim 11

11[P] A computer apparatus, comprising:

To the extent the preamble is limiting, Numazaki teaches a computer apparatus in its eighth embodiment, which includes a “computer of FIG. 74 [that] is a portable computer generally called note PC in which a keyboard and a display are integrally provided with the computer body.” Numazaki (Ex. 1004), 50:27-29.

11[a] a light source adapted to illuminate a human body part within a work volume generally above the light source;

See limitation 1[a], supra.

11[b] a camera in fixed relation relative to the light source and oriented to observe a gesture performed by the human body part in the work volume; and a processor adapted to determine the gesture performed in the work volume and illuminated by the light source based on the camera output.

See limitation 1[b], supra. As explained above with reference to limitation 1[b], Numazaki’s eighth embodiment utilizes the information input generation apparatus illustrated in Figs. 1 and 2 described with reference to the first embodiment. This apparatus determines a user’s gesture with feature data generation unit 103. Id. at 10:57-11:4. In its second embodiment, Numazaki describes
