`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`LG ELECTRONICS, INC. and LG ELECTRONICS U.S.A., INC.
`Petitioner
`
v.
`
`GESTURE TECHNOLOGY PARTNERS LLC
`Patent Owner
`
`Case No. IPR2022-00090
`U.S. Patent No. 8,553,079
`
`PETITION FOR INTER PARTES REVIEW
`OF U.S. PATENT NO. 8,553,079
`
`EAST\185894198.3
`
`
`
TABLE OF CONTENTS

I. INTRODUCTION
II. SUMMARY OF THE ’079 PATENT
    A. The ’079 Patent’s Alleged Invention
    B. The ’079 Patent’s Prosecution
    C. A Person Having Ordinary Skill in the Art
III. REQUIREMENTS FOR IPR UNDER 37 C.F.R. § 42.104
    A. Standing Under 37 C.F.R. § 42.104(A)
    B. Challenge Under 37 C.F.R. § 42.104(B) and Relief Requested
    C. Claim Construction Under 37 C.F.R. § 42.104(B)(3)
IV. THE CHALLENGED CLAIMS ARE UNPATENTABLE
    A. Ground 1: Claims 1, 2, 4-14, 17, 19, 21-22, 24-28, and 30 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of the knowledge of a PHOSITA
        1. Overview of Numazaki
        a. Claim 1
        b. Claim 2
        c. Claim 4
        d. Claim 5
        e. Claim 6
        f. Claim 7
        g. Claim 8
        h. Claim 9
        i. Claim 10
        j. Claim 11
        k. Claim 12
        l. Claim 13
        m. Claim 14
        n. Claim 17
        o. Claim 19
        p. Claim 21
        q. Claim 22
        r. Claim 24
        s. Claim 25
        t. Claim 26
        u. Claim 27
        v. Claim 28
        w. Claim 30
    B. Ground 2: Claims 3, 15, and 23 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of Numazaki ’863
        1. Overview of Numazaki ’863
        2. Motivation to Modify Numazaki in view of Numazaki ’863
        a. Claim 3
        b. Claim 15
        c. Claim 23
    C. Ground 3: Claims 16 and 29 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of DeLuca
        1. Overview of DeLuca
        2. Motivation to Modify Numazaki in view of DeLuca
        a. Claim 16
        b. Claim 29
    D. Ground 4: Claim 18 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of DeLeeuw
        1. Overview of DeLeeuw
        2. Motivation to Modify Numazaki in view of DeLeeuw
        a. Claim 18
    E. Ground 5: Claim 20 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of Maruno
        1. Overview of Maruno
        2. Motivation to Combine Numazaki and Maruno
        a. Claim 20
V. DISCRETIONARY CONSIDERATIONS
    A. The Fintiv Factors Favor Institution
        1. The Fintiv factors strongly favor institution
            a. Whether the court granted a stay or evidence exists that one may be granted if a proceeding is instituted
            b. Proximity of the court’s trial date to the Board’s projected statutory deadline for a final written decision
            c. Investment in the parallel proceeding by the court and the parties
            d. Overlap between issues raised in the petition and in the parallel proceeding
            e. Whether the petitioner and the defendant in the parallel proceeding are the same party
            f. Other circumstances that impact the Board’s exercise of discretion, including the merits
        2. The Fintiv Framework Should Be Overturned
            a. The Fintiv framework exceeds the Director’s authority
            b. The Fintiv framework is arbitrary and capricious
            c. The Fintiv framework was impermissibly adopted without notice-and-comment rulemaking
VI. CONCLUSION
VII. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8(A)(1)
    A. Real Party-In-Interest
    B. Related Matters
    C. Lead and Back-Up Counsel
`
`
`
I. INTRODUCTION
Petitioner LG Electronics, Inc. and LG Electronics U.S.A., Inc. (“Petitioner”)
`
`requests an Inter Partes Review (“IPR”) of claims 1–30 (the “Challenged Claims”)
`
`of U.S. Patent No. 8,553,079 (“the ’079 Patent”). This petition is substantively the
`
`same as IPR2021-00922 (which is currently pending institution), and is being filed
`
`concurrently with a motion for joinder with respect to that proceeding.
`
II. SUMMARY OF THE ’079 PATENT

A. The ’079 Patent’s Alleged Invention
`The ’079 Patent generally describes computer input devices employing
`
`cameras and lights to observe points on the human body and optically sense human
`
positions and/or orientations. ’079 Patent (Ex. 1001), 1:54-2:6. Examples of input
`
`devices contemplated by the patent include a computer keyboard, puzzle toy, and
`
`handheld computer. Id. at 2:15-31. Fig. 2 below illustrates one exemplary
`
`embodiment implemented in a laptop computer:
`
`
`
`
`Id. at Fig. 2. As illustrated, a laptop 138 may include camera locations 100, 101, 105,
`
`106, 108, 109; keyboard surface 102; screen housing 107; light 122; light emitting
`
diodes (LEDs) 210 and 211; and work volume area 170 within which a user’s
`
`movements are detected. Id. at 2:39-53. The system can detect a user’s finger alone or
`
`the user may employ external objects such as ring 208 to help detect and recognize
`
`gestures performed in work volume area 170. Id. at 2:54-3:8. The ’079 Patent
`
`describes detecting point, pinch, and grip gestures using this configuration. Id. at
`
`3:48-51.
`
`
`
`
B. The ’079 Patent’s Prosecution
`The Application that resulted in the ’079 Patent was filed on December 14,
`
`2012. The Application claims priority to provisional patent application No.
`
`60/107,652, filed November 9, 1998, through a succession of continuation
`
`applications 09/433,297; 10/866,191; and 12/700,055. Id. For purposes of this
`
`petition and without waiving its right to challenge priority in this or any other
`
`proceeding, Petitioner adopts November 9, 1998 as the invention date for the
`
`Challenged Claims.
`
`Applicant canceled twenty originally filed claims (1–20) by preliminary
`
`amendment and replaced them with thirty new claims (21–50). ’079 File History
`
`(Ex. 1002), 134-138. A Notice of Allowance issued on July 24, 2013, in which the
`
`Examiner allowed all thirty claims (renumbered 1–30) in a first office action. Id. at
`
150, 175. The Examiner amended the abstract and reasoned that none of Naoi et al. (US 5,459,793), Platzker et al. (US 5,528,263), Sellers (US 5,864,334 A), or Fukushima et al. (US 6,346,929 B1) taught or suggested the independently claimed elements: (1)
`
`providing a camera oriented to observe a gesture performed in the work volume; (2)
`
`the camera being fixed relative to the light source; and (3) determining, using the
`
`camera, the gesture performed in the work volume and illuminated by the light
`
`source. Id. at 153.
`
`
`
`
C. A Person Having Ordinary Skill in the Art
`A person having ordinary skill in the art (“PHOSITA”) at the time of the
`
`’079 Patent would have had at least a bachelor’s degree in electrical engineering or
`
`equivalent with at least one year of experience in the field of human computer
`
`interaction. Additional education or experience might substitute for the above
`
`requirements. Bederson Dec. (Ex. 1010), ¶¶ 29-31.
`
`III. REQUIREMENTS FOR IPR UNDER 37 C.F.R. § 42.104
A. Standing Under 37 C.F.R. § 42.104(A)
`Petitioner certifies that the ’079 Patent is available for IPR and that
`
`Petitioner is not barred or estopped from requesting an IPR challenging the claims
`
`of the ’079 Patent. Specifically, (1) Petitioner is not the owner of the ’079 Patent,
`
`(2) Petitioner has not filed a civil action challenging the validity of any claim of the
`
`’079 Patent, and (3) this Petition is filed less than one year after the Petitioner was
`
`served with a complaint alleging infringement of the ’079 Patent.
`
B. Challenge Under 37 C.F.R. § 42.104(B) and Relief Requested
`In view of the prior art and evidence presented, claims 1-30 of the ’079 Patent
`
`are unpatentable and should be cancelled. 37 C.F.R. § 42.104(b)(1). Further, based
`
`on the prior art references identified below, IPR of the Challenged Claims should
`
`be granted. 37 C.F.R. § 42.104(b)(2).
`
`
`
`
Proposed Grounds of Unpatentability and Supporting Exhibits:

Ground 1: Claims 1, 2, 4-14, 17, 19, 21-22, 24-28, and 30 are obvious under pre-AIA 35 U.S.C. § 103 over U.S. Patent No. 6,144,366 (“Numazaki”) in view of the knowledge of a PHOSITA. Exhibits: Ex. 1004.

Ground 2: Claims 3, 15, and 23 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 5,900,863 (“Numazaki ’863”). Exhibits: Ex. 1004, Ex. 1005.

Ground 3: Claims 16 and 29 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 6,064,354 (“DeLuca”). Exhibits: Ex. 1004, Ex. 1006.

Ground 4: Claim 18 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 6,008,018 (“DeLeeuw”). Exhibits: Ex. 1004, Ex. 1007.

Ground 5: Claim 20 is obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of U.S. Patent No. 6,191,773 (“Maruno”). Exhibits: Ex. 1004, Ex. 1008.
`
`Section IV identifies where each element of the Challenged Claims is found
`
`in the prior art. 37 C.F.R. § 42.104(b)(4). The exhibit numbers of the evidence
`
`relied upon to support the challenges are provided above and the relevance of the
`
`evidence to the challenges raised is provided in Section IV. 37 C.F.R. §
`
`42.104(b)(5). Exhibits 1001-1016 are also attached.
`
C. Claim Construction Under 37 C.F.R. § 42.104(B)(3)
`In this proceeding, claims are interpreted under the same standard applied by
`
Article III courts (i.e., the Phillips standard). See 37 C.F.R. § 42.100(b); see also 83
`
`Fed. Reg. 197 (Oct. 11, 2018); Phillips v. AWH Corp., 415 F.3d 1303, 1312 (Fed.
`
`Cir. 2005) (en banc). Under this standard, words in a claim are given their plain
`
`meaning which is the meaning understood by a person of ordinary skill in the art in
`
view of the patent and file history. Phillips, 415 F.3d at 1312-13. For purposes
`
`
`
`
`of the proposed grounds below, Petitioner proposes no terms require express
`
`construction.
`
`IV. THE CHALLENGED CLAIMS ARE UNPATENTABLE
A. Ground 1: Claims 1, 2, 4-14, 17, 19, 21-22, 24-28, and 30 are obvious under pre-AIA 35 U.S.C. § 103 over Numazaki in view of the knowledge of a PHOSITA

1. Overview of Numazaki
`U.S. Patent No. 6,144,366 to Numazaki et al. (“Numazaki”) (Ex. 1004) was
`
`filed on October 17, 1997 and is prior art to the ’079 Patent under at least 35
`
`U.S.C. § 102(e) (pre-AIA). Numazaki was not cited or considered during
`
`prosecution of the ’079 Patent or its parent, U.S. Patent No. 6,750,848. ’079 Patent
`
`(Ex. 1001); ’848 Patent (Ex. 1003).
`
`Numazaki is generally directed to a method for detecting a gesture or the
`
movement of a user’s hand. Numazaki (Ex. 1004), Abstract, 4:9-40. Numazaki
`
`purports to have improved upon prior methods by using a controlled light source to
`
`illuminate the target object (e.g., the user’s hand), a first camera unit (referred to
`
`by Numazaki as a “photo-detection unit”),1 and a second camera unit. Id. at 11:9-
`
`23. This arrangement is illustrated in Fig. 2 below:
`
1 A PHOSITA would have considered Numazaki’s photo-detection units to be camera units. Bederson Dec. (Ex. 1010), ¶ 36 (explaining that Numazaki describes using CMOS or CCD sensor units, which were two of the more common optical sensors used in camera units at the time).

Id. at Fig. 2. A timing control unit is used to turn lighting unit 101 on (i.e., illuminating the target object) when the first camera unit is active and off when the second camera unit is active. Id. at 11:20-32. As a result of this light control, the first camera unit captures an image of the target object illuminated by both natural light and the lighting unit 101, while the second camera unit captures an image of the target object illuminated by only natural light. Id. at 11:33-39. The difference between the two images—obtained by difference calculation unit 111—represents the “reflected light from the object resulting from the light emitted by the lighting unit 101.” Id. at 11:43-51. This information is then used by feature data generation unit 103 to
`
`
`
`determine gestures, pointing, etc. of the target object that may be converted into
`
`commands executed by a computer. Id. at 10:57-66.
`
`In its eighth embodiment, Numazaki describes implementing this structure in
`
`a computer such that a user can point or gesture with an index finger while typing
`
`on the keyboard “with[] hardly any shift of the hand position.” Id. at 50:25-43.
`
`This arrangement is illustrated in Fig. 74 below:
`
`Id. at Figure 74 (annotated to indicate light source 701 and photo-detection sensor unit
`
`702). Numazaki teaches the entirety of the operator’s hand is illuminated within the
`
`range depicted by the dashed circle such that the user’s pointing and gestures can
`
`be captured and converted to commands. Id. at 50:38-48.
`
`
`
`
`Numazaki teaches that its eighth embodiment incorporates “the information
`
`input generation apparatus of the present invention as described in the above
`
`embodiments.” Id. at 50:21-24. A PHOSITA would have understood that the
`
`referenced information input generation apparatus is that illustrated in Fig. 2 and
`
`described in the corresponding disclosure. Bederson Dec. (Ex. 1010), ¶¶ 42-43
`
`(explaining that Numazaki describes its controlled light and two-camera
`
`configuration as key to its invention and noting that Numazaki at 53:22-36 teaches
`
`that the eighth embodiment uses the precise image difference calculation taught by
`
`Fig. 2 and its corresponding disclosure).
`
`Because Numazaki, like the ’079 Patent, discloses a method and apparatus
`
`for generating computer input information by capturing hand gestures, Numazaki is
`
`in the same field of endeavor as the ’079 Patent. Compare Numazaki (Ex. 1004),
`
`50:29-37 (“In this computer of FIG. 74, a lighting unit 701 and a photo-detection
`
`sensor unit 702 of the information input generation apparatus are provided at
`
`positions beyond the keyboard” such that the “entire hand of the operator is
`
`illuminated . . . [in] a range of illumination”) with ’079 Patent (Ex 1001), Abstract
`
`(describing “[a] method for determining a gesture illuminated by a light source”
`
`that “utilizes the light source to provide illumination through a work volume above
`
`the light source” where a “camera is positioned to observe and determine the
`
`
`
`
`gesture performed in the work volume”). Numazaki is therefore analogous art to
`
`the ’079 Patent. Bederson Dec. (Ex. 1010), ¶ 41.
`
a. Claim 1

i. 1[P] A computer implemented method comprising:
`To the extent the preamble is limiting, Numazaki teaches a computer
`
`implemented method. In particular, Numazaki teaches a “method . . . for generating
`
`information input . . . capable of realizing a direct command type information input
`
`scheme by which the gesture or the motion can be inputted easily.” Numazaki (Ex.
`
`1004), 4:9-13.
`
`Numazaki’s eighth embodiment teaches a computer implemented method for
`
`controlling functions on a laptop device through gestures or pointing:
`
`In this configuration, the operator operating the keyboard can make the
`pointing or gesture input by slightly raising and moving the index
`finger. The user’s convenience is remarkably improved here because
`the keyboard input and the pointing or gesture input can be made
`without hardly any shift of the hand position.
`Id. at 50:38-43. This arrangement is illustrated in Fig. 74 below:
`
`
`
`
`Id. at Fig. 74.
`
ii. [1(a)] providing a light source adapted to direct illumination through a work volume above the light source;
`Numazaki teaches a lighting unit 701 (i.e., light source) that is adapted to
`
`illuminate a user’s hand (i.e., human body part) within a work volume generally
`
`above the light source. For example, Numazaki teaches “the entire hand of the
`
`operator is illuminated, as can be seen from a dashed line circle indicating a
`
`range of illumination.” Id. at 50:35-37 (emphasis added). As depicted in Fig. 74
`
`below, the lighting unit 701 is adapted to illuminate the user’s hand by positioning
`
`it “beyond the keyboard” to illuminate a work volume indicated by the dashed line
`
`circle above it:
`
`
`
`
`Id. at Fig. 74 (annotated to illustrate light source and range of hand illumination).
`
iii. [1(b)] providing a camera oriented to observe a gesture performed in the work volume, the camera being fixed relative to the light source; and determining, using the camera, the gesture performed in the work volume and illuminated by the light source.
`Numazaki discloses a photo-detection sensor unit 702 (i.e., camera2) that is positioned
`
`next to lighting unit 701 (i.e., in fixed location relative to the light source) and
`
2 As described in the overview section above, a PHOSITA would have considered Numazaki’s photo-detection units to be camera units. Bederson Dec. (Ex. 1010), ¶ 36 (explaining that Numazaki describes using CMOS or CCD sensor units, which were two of the more common optical sensors used in camera units at the time).

arranged to have the optical axis of the photo-detection sections pointing obliquely upward towards the operator side to observe a gesture performed by the user (i.e., observe a gesture) within a “dashed line circle” (i.e., work volume) depicted below. Id. at 50:30-43. This arrangement is illustrated in Fig. 74 below:

Id. at Fig. 74 (annotated to show camera).

As described with reference to Fig. 2, which is incorporated into the eighth embodiment as noted above, Numazaki uses this light and camera arrangement to illuminate the target object (e.g., the user’s hand) in a controlled manner such that
`
`
`
`a precise image of the user’s hand and hand movement can be ascertained. Id. at
`
`11:9-23. Specifically, a timing control unit is used to turn lighting unit 101 on (i.e.,
`
`illuminating the target object) when the first camera unit is active and off when the
`
`second camera unit is active. Id. at 11:20-32. The result of this light control is the
`
`first camera unit captures an image of the target object illuminated by both natural
`
`light and the lighting unit 101 and the second camera unit captures an image of the
`
`target object illuminated by only natural light. Id. at 11:33-39. The difference
`
`between the two images – obtained by difference calculation unit 111 – represents
`
`the “reflected light from the object resulting from the light emitted by the lighting
`
`unit 101.” Id. at 11:43-51. This information is then used by feature data generation
`
`unit 103 to determine gestures, pointing, etc. of the target object that may be
`
`converted into commands executed by a computer. Id. at 10:57-66. Through this
`
`arrangement, a PHOSITA would have understood that component 702 illustrated
`
`in Fig. 2 is “oriented to observe a gesture performed in the work volume”
`
`illuminated by lighting unit 701. Bederson Dec. (Ex. 1010), ¶¶ 42-43.
`
b. Claim 2

i. 2. The method according to claim 1 wherein the light source includes a light emitting diode.
`
`As discussed above with reference to limitation 1[a], Numazaki teaches light
`
`source 701 in the embodiment depicted in Fig. 74 below:
`
`
`
`
`Numazaki (Ex. 1004), Fig. 74; see also limitation 1[a], supra. Although Numazaki
`
`does not discuss the specific lighting technology contemplated for light source 701,
`
`with reference to the first embodiments, Numazaki teaches that an “LED can be used
`
`as the light source” since “the LED has a property that it can emit more intense light
`
`instantaneously” while “reduc[ing] the power required for the light emission.” Id. at
`
`14:49-56; see also id. at 68:13-20. Based on Numazaki’s teachings, a PHOSITA
`
`would have been motivated to implement light source 701 in Fig. 74 using LED
`
`technology for a number of reasons. Bederson Dec. (Ex. 1010), ¶ 44. First, the benefit
`
`of emitting more intense light instantaneously described with reference to the first
`
`embodiment would have improved Fig. 74’s apparatus such that it could quickly and
`
accurately detect pointing and gestures. Id.; see also KSR Int’l Co. v. Teleflex Inc.,
`
`
`
`
`550 U.S. 398, 417 (2007) (obvious to use known techniques to improve similar
`
`devices in the same way). Second, a PHOSITA would have understood that the
`
`“note PC” depicted in Fig. 74 is a self-contained and portable unit that would have
`
`benefitted from the power reduction of an LED light source discussed with reference
`
`to the first embodiment. Bederson Dec. (Ex. 1010), ¶ 44 (noting that Numazaki’s
`
`eighth embodiment contemplates other portable implementations and, at col. 52, ln.
`
`33 – col. 53, ln. 7, expressly discusses the desire to conserve power and techniques
`
`for doing so by controlling the lighting unit).
`
c. Claim 4

i. 4. The method according to claim 1 wherein detecting a gesture includes analyzing sequential images of the camera.
`Numazaki’s eighth embodiment analyzes the sequential movement of “click
`
`and drag” operations by detecting a combination of “pointing or gesture input” and
`
`“keyboard input” via a “button for use in conjunction with the pointing or gesture
`
`input.” Numazaki (Ex. 1004), 50:38-47. This permits “selecting and moving icons
`
`on the screen.” Id. at 50:45-47.
`
`A PHOSITA would have understood that using gesture recognition to
`
`implement a “click and drag” operation would have been implemented by
`
`capturing a sequence of images to determine the user’s hand movement, which
`
`dictates the “drag” operation to be performed. Bederson Dec. (Ex. 1010), ¶¶ 45-46.
`
`
`
`
`Indeed, in its second embodiment, Numazaki expressly describes a process through
`
`which the system tracks lateral finger movements by detecting the center of gravity
`
`of a finger, where “finger tip movement and the center of gravity movement can be
`
`smoothly correlated” using pixel values. Numazaki (Ex. 1004), 19:43-20:25. Fig. 10
`
`below illustrates the process by which the reflected light image of a hand and finger
`
`are mapped to a pixelated target space, and Fig. 11 illustrates how coordinate-based
`
`finger movement can be tracked on the basis of pixel value changes:
`
`Id. at Fig. 10, 19:13-42 (describing the same).
`
`
`
`
`Id. at Fig. 11A, 19:43-20:25 (describing a center of gravity tracking that enables a
`
`fingertip to be tracked). Using this technique, Numazaki teaches “the cursor on [a]
`
`screen can be controlled” so that “when the finger is moved, the cursor is also moved”
`
and a cursor position depends on a fingertip position. Id. at 26:8-14, 26:23-25. A
`
`PHOSITA would have understood that processing such cursor control is similar to
`
`processing the “click and drag” functionality described with reference to the eighth
`
`embodiment and would have been motivated to implement the eighth embodiment to
`
`use the same processing functionality described in the second embodiment. Bederson
`
`Dec. (Ex. 1010), ¶¶ 45-46.
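The center-of-gravity tracking summarized above (Numazaki 19:43-20:25) can be illustrated with a short sketch. This is not Numazaki's code; the threshold, frame size, and all names are assumptions made for illustration:

```python
import numpy as np

def center_of_gravity(reflected: np.ndarray, threshold: int = 50) -> tuple[float, float]:
    """Intensity-weighted centroid (row, col) of the above-threshold pixels
    in a reflected-light image -- an illustrative analogue of the
    center-of-gravity computation Numazaki uses to track a fingertip."""
    weights = reflected * (reflected > threshold)
    total = weights.sum()
    if total == 0:
        raise ValueError("no pixels above threshold")
    ys, xs = np.indices(reflected.shape)
    return (float((ys * weights).sum() / total),
            float((xs * weights).sum() / total))

# Tracking the centroid across sequential frames yields the finger's
# lateral movement, which could drive a cursor or a "drag" operation.
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:4, 5:7] = 200            # bright region standing in for a fingertip
print(center_of_gravity(frame))  # → (2.5, 5.5)
```

Comparing the centroid in frame N with the centroid in frame N+1 gives the displacement on which a click-and-drag style operation could be based.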
`
`
`
`
d. Claim 5

i. 5. The method according to claim 1 wherein the detected3 gesture includes at least one of a pinch gesture, a pointing gesture, and a grip gesture.
`In the eighth embodiment, Numazaki teaches “the operator operating the
`
`keyboard can make [a] pointing or gesture input by slightly raising and moving the
`
`index finger” and consequently conduct a “click and drag” operation for “selecting
`
`and moving icons on the screen.” Numazaki (Ex. 1004), 50:38-48. Figure 74
`
`illustrates such an operation:
`
`Id. at Fig. 74 (annotated to show pointing gesture).
`
`3 Claim 5 recites “the detected gesture” of claim 1, but the term “detected” lacks
`
`antecedent basis. For purposes of this petition, Petitioner assumes “the detected
`
`gesture” of claim 5 is a reference to Claim 1’s “determining, using the camera, the
`
`gesture performed in the work volume and illuminated by the light source.”
`
`
`
`
e. Claim 6

i. 6. The method according to claim 1 further including determining the pointing direction of a finger in the work volume.
`In the eighth embodiment, Numazaki teaches “the operator operating the
`
`keyboard can make [a] pointing or gesture input by slightly raising and moving the
`
`index finger” and consequently conduct a “click and drag” operation for “selecting
`
`and moving icons on the screen.” Id. at 50:38-48. Any operator hand movements
`
`used as computer inputs are detectable within the volume of space Numazaki refers
`
`to as a “range of illumination.” Figure 74 illustrates this operation:
`
`
`
`
`Id. at Fig. 74 (annotated to show pointing gesture, icon selection, and range of
`
`illumination). A PHOSITA would have understood that determining the pointing
`
`direction of a finger in a work volume is necessary to implement the described “click
`
`and drag” feature. Bederson Dec. (Ex. 1010), ¶ 47 (explaining that the pointing
`
`direction of a finger is necessary to determine which icon a user intends to select and
`
`how far and in which direction the icon is to be moved).
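The petition does not set out an algorithm for this step. As a hedged illustration only, a pointing direction could be estimated from two tracked points (the hand's center of gravity and the fingertip); the function name, coordinate convention, and inputs below are all assumptions:

```python
import math

def pointing_direction(hand_cog: tuple[float, float],
                       fingertip: tuple[float, float]) -> float:
    """Angle, in degrees, of the vector from the hand's center of gravity
    (row, col) to the fingertip (row, col) in the image plane -- one simple
    way a system could estimate the direction a finger points."""
    dy = fingertip[0] - hand_cog[0]
    dx = fingertip[1] - hand_cog[1]
    return math.degrees(math.atan2(dy, dx))

# A fingertip directly to the right (same row, larger column) of the
# hand's center of gravity points along the 0-degree axis:
print(pointing_direction((0.0, 0.0), (0.0, 1.0)))  # → 0.0
```

A system could intersect this direction with on-screen icon positions to decide which icon the user intends to select.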
`
f. Claim 7

i. 7. The method according to claim 1 further including providing a target positioned on a user that is viewable in the work volume.
`As depicted in Figures 74 and 77, Numazaki’s eighth embodiment also
`
`teaches “the entire hand of [an] operator is illuminated, as can be seen from a
`
dashed line circle indicating a range of illumination.” Numazaki (Ex. 1004), 50:35-
`
`37 (emphasis added).
`
`
`
`
`Id. at Figs. 74, 77 (annotated to illustrate the expanding range of illumination for
`
`capturing hand gestures). In this configuration, the user’s hand operates within, and is
`
`detectable within, an illuminated work volume. Numazaki discloses the user’s hand
`
`itself as a target object in the range of illumination:
`
`When the target object is a hand, it becomes possible to obtain the
`information regarding a gesture or a pointing according to the feature
`data extracted from the reflected light image of the hand . . . and it
`becomes possible to operate a computer by using this obtained
`information.
`Id. at 10:57-66 (emphasis added). Numazaki discloses additional hand attributes as
`
`subsidiary targets, including color, material, and shape information. Id. at 16:45-61.
`
The orientation of the hand also presents different targets at different distances, which can be extracted from the hand as a target object to help capture its overall “3D shape.” Id. at 12:27-45.
`
`Although the above disclosures in Numazaki utilize natural characteristics of
`
`a user’s hand to improve target detection, Numazaki also notes that it was known in
`
`the prior art to position a target on a user (i.e., something that is added to a user’s
`
`person) in order to improve target detection. For example, Numazaki notes that it
`
`was known to paint a fingertip or to wear a ring in a particular color to improve
`
`detection. Id. at 3:4-11. Numazaki, however, cautions that requiring users to wear or
`
`mount some external component may negatively impact the user’s convenience and
`
`may bring with it durability issues. Id. at 3:32-38. A PHOSITA would have
`
`
`
`
`understood, however, that the Fig. 74 arrangement described in the eighth
`
`embodiment is particularly well suited to a ring or other small target mounted on a
`
`user’s finger. Bederson Dec. (Ex. 1010), ¶¶ 48-49. Given the option of improved
`
`accuracy in exchange for the minor inconvenience of wearing a small ring or other
`
`hand-based target when using gesture recognition while typing, a PHOSITA would
`
`have understood that many users would accept this tradeoff. Id. Indeed, the
`
durability concerns are hardly implicated by a ring target, as many adults routinely wear rings while typing with no ill effect, which suggests that such a tradeoff would be acceptable to many users. Id.
`
g. Claim 8

i. 8. The method according to claim 1 further including determining the three-dimensional position of a point on a user.
`As noted above, Numazaki’s eighth embodiment uses the “information input
`
`generation apparatus” of Fig. 2 to detect and process a user’s gestures. Within this
`
`Fig. 2 apparatus, feature data generation unit 103 determines three-dimensional
`
`information representing the user’s hand:
`
`The feature data generation unit 103 extracts various feature data from
`the reflec