Case 2:21-cv-00040-JRG Document 157-5 Filed 12/16/21 Page 1 of 29 PageID #: 6279

EXHIBIT 5

Trials@uspto.gov                                                     Paper 8
571-272-7822                                   Entered: December 13, 2021

UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

APPLE INC.,
Petitioner,

v.

GESTURE TECHNOLOGY PARTNERS, LLC,
Patent Owner.

IPR2021-00921
Patent 8,878,949 B2

Before PATRICK R. SCANLON, GREGG I. ANDERSON, and
BRENT M. DOUGAL, Administrative Patent Judges.

SCANLON, Administrative Patent Judge.

DECISION
Granting Institution of Inter Partes Review
35 U.S.C. § 314
I. INTRODUCTION
Apple Inc. (“Petitioner”) filed a Petition (Paper 1, “Pet.”) requesting
an inter partes review of claims 1–18 of U.S. Patent No. 8,878,949 B2
(Ex. 1001, “the ’949 patent”). Gesture Technology Partners, LLC (“Patent
Owner”) filed a Preliminary Response (Paper 6, “Prelim. Resp.”).
We have authority to determine whether to institute an inter partes
review. See 35 U.S.C. § 314 (2018); 37 C.F.R. § 42.4(a) (2020). To
institute an inter partes review, we must determine that the information
presented in the Petition shows “a reasonable likelihood that the petitioner
would prevail with respect to at least 1 of the claims challenged in the
petition.” 35 U.S.C. § 314(a). For the reasons set forth below, we determine
that the information presented in the Petition establishes a reasonable
likelihood that Petitioner will prevail with respect to at least one challenged
claim. Accordingly, an inter partes review is hereby instituted.
II. BACKGROUND
A. Real Parties in Interest
Petitioner identifies itself as the real party in interest. Pet. 65. Patent
Owner identifies itself as the real party in interest. Paper 4, 1.
B. Related Matters
The parties identify the following proceedings as related matters
involving the ’949 patent: Gesture Technology Partners, LLC v. Apple Inc.,
No. 6:21-cv-00121 (W.D. Tex.); Gesture Technology Partners, LLC v.
Lenovo Group Ltd., No. 6:21-cv-00122 (W.D. Tex.); Gesture Technology
Partners, LLC v. LG Electronics, Inc., No. 6:21-cv-00123 (W.D. Tex.);
Gesture Technology Partners, LLC v. Huawei Device Co., Ltd., No. 2:21-cv-
00040 (E.D. Tex.); and Gesture Technology Partners, LLC v. Samsung
Electronics Co., Ltd., No. 2:21-cv-00041 (E.D. Tex.). Pet. 65; Paper 4, 1.
In addition, Patent Owner identifies the following inter partes review
proceedings as related matters: IPR2021-00917; IPR2021-00920; IPR2021-
00922; and IPR2021-00923. Paper 4, 2.
C. The ’949 Patent
The ’949 patent, titled “Camera Based Interaction and Instruction,”
issued November 4, 2014, with claims 1–18. Ex. 1001, codes (45), (54),
15:21–16:50. The ’949 patent relates to “enhanc[ing] the quality and
usefulness of picture taking for pleasure, commercial, or other business
purposes.” Id. at 1:4–6. In one embodiment, “stereo photogrammetry is
combined with digital image acquisition to acquire or store scenes and poses
of interest, and/or to interact with the subject in order to provide data to or
from a computer.” Id. at 1:6–10.
Figure 2A of the ’949 patent is reproduced below.
Figure 2A illustrates still camera system 201, which includes central camera
202 having high resolution and color accuracy for picture taking. Id.
at 4:66–5:2. Camera system 201 also includes two cameras 210, 211 on
either side of central camera 202. Id. at 5:2–3. Cameras 210, 211 “may be
lower resolution (allowing lower cost, and higher frame rate, as they have
less pixels to scan in a given frame time), with little or no accurate color
capability, as they are used to simply see object positions or special datum
positions on objects.” Id. at 5:3–7.
Camera system 201 further includes computer 220 that processes data
from cameras 210, 211 “to get various position and/or orientation data
concerning a person.” Id. at 5:24–26. “In general, one can use the system to
automatically ‘shoot’ pictures” in response to a particular event, such as the
subject undertaking a particular position or gesture—i.e., a silent command
to take a picture. Id. at 5:30–49.
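By way of a rough sketch (Python; the object and function names below are illustrative assumptions, not taken from the ’949 patent), this gesture-triggered picture taking amounts to a loop in which the lower-resolution cameras feed a gesture detector that, on a match, fires the picture-taking camera:

    def run_camera_system(position_cameras, central_camera, detects_trigger_gesture):
        # position_cameras: stand-ins for cameras 210, 211 (lower resolution, higher frame rate)
        # central_camera: stand-in for central camera 202 (high-resolution picture taking)
        # detects_trigger_gesture: stand-in for computer 220's position/gesture analysis
        while True:
            frames = [camera.read() for camera in position_cameras]
            if detects_trigger_gesture(frames):   # the subject's silent command
                central_camera.capture()          # automatically "shoot" the picture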
D. Challenged Claims
As noted above, Petitioner challenges claims 1–18 of the ’949 patent.
Claims 1, 8, and 13 are independent. Claim 1 is illustrative of the claimed
subject matter and is reproduced below:
1. A portable device comprising:
a device housing including a forward facing portion, the
forward facing portion of the device housing encompassing
an electro-optical sensor having a field of view and
including a digital camera separate from the electro-optical
sensor; and
a processing unit within the device housing and operatively
coupled to an output of the electro-optical sensor, wherein
the processing unit is adapted to:
determine a gesture has been performed in the electro-
optical sensor field of view based on the electro-optical
sensor output, and
control the digital camera in response to the gesture
performed in the electro-optical sensor field of view,
wherein the gesture corresponds to an image capture
command, and wherein the image capture command
causes the digital camera to store an image to memory.
Ex. 1001, 15:21–38.
E. Asserted Grounds of Unpatentability
Petitioner contends that the challenged claims would have been
unpatentable on the following grounds:1

Claim(s) Challenged | 35 U.S.C. § | Reference(s)/Basis
1–18                | 103(a)      | Numazaki,2 Nonaka3
                    | 103(a)      | Numazaki, Nonaka, Aviv4

Pet. 6–7. Petitioner supports its challenge with the Declaration of
Dr. Benjamin B. Bederson (Ex. 1003).

1 The Leahy-Smith America Invents Act, Pub. L. No. 112-29, 125 Stat. 284
(2011) (“AIA”), amended 35 U.S.C. § 103. Because the ’949 patent has an
effective filing date before the March 16, 2013, effective date of the
applicable AIA amendments, we apply the pre-AIA version of 35 U.S.C.
§ 103.
2 US 6,144,366, issued Nov. 7, 2000 (Ex. 1004).
3 JP H4-73631, published Mar. 9, 1992 (Ex. 1005).
4 US 5,666,157, issued Sept. 9, 1997 (Ex. 1006).

III. ANALYSIS
A. Level of Ordinary Skill in the Art
In determining whether an invention would have been obvious at the
time it was made, 35 U.S.C. § 103 requires us to resolve the level of
ordinary skill in the pertinent art at the time of the effective filing date of the
claimed invention. Graham v. John Deere Co., 383 U.S. 1, 17 (1966). The
person of ordinary skill in the art is a hypothetical person who is presumed
to have known the relevant art. In re GPAC, Inc., 57 F.3d 1573, 1579
(Fed. Cir. 1995). Factors that may be considered in determining the level of
ordinary skill in the art include, but are not limited to, the types of problems
encountered in the art, the sophistication of the technology, and educational
level of active workers in the field. Id. In a given case, one or more factors
may predominate. Id.
Petitioner contends that a person having ordinary skill in the art
“would have had at least a bachelor’s degree in electrical engineering or
equivalent with at least one year of experience in the field of human
computer interaction,” and “[a]dditional education or experience might
substitute for the above requirements.” Pet. 5–6 (citing Ex. 1003 ¶¶ 29–31).
Patent Owner does not dispute Petitioner’s definition for the purposes of its
Preliminary Response. Prelim. Resp. 5.
Based on our review of the record before us, we determine that
Petitioner’s stated level of ordinary skill in the art is reasonable because it is
consistent with the evidence of record, including the asserted prior art.
Accordingly, for the purposes of this Decision, we adopt Petitioner’s
definition.

B. Claim Construction
In inter partes reviews, the Board interprets claim language using the
district-court-type standard, as described in Phillips v. AWH Corp., 415 F.3d
1303 (Fed. Cir. 2005) (en banc). See 37 C.F.R. § 42.100(b) (2020). Under
that standard, we generally give claim terms their ordinary and customary
meaning, as would be understood by a person of ordinary skill in the art at
the time of the invention, in light of the language of the claims, the
specification, and the prosecution history. See Phillips, 415 F.3d at
1313–14. Although extrinsic evidence, when available, may also be useful
when construing claim terms under this standard, extrinsic evidence should
be considered in the context of the intrinsic evidence. See id. at 1317–19.
Petitioner proposes claim constructions for the phrases “the image
capture command causes the digital camera to store an image to memory” in
claim 1, “capturing an image to the digital camera in response to . . . the
image capture command” in claim 8, and “correlate the gesture detected
. . . with an image capture function and subsequently capture an image using
the digital camera” in claim 13. Pet. 8. Specifically, Petitioner asserts that
these phrases “should be construed broadly enough to encompass
capturing/storing video or still images,” and provides reasons supporting its
assertion. Id. at 8–10. Patent Owner does not contest Petitioner’s proposed
claim constructions at this stage of the proceeding. Prelim. Resp. 5.
Accordingly, we adopt Petitioner’s proposed claim constructions for the
purposes of this Decision.
The parties are hereby given notice that claim construction, in general,
is an issue to be addressed at trial and claim constructions expressly or
implicitly addressed in this Decision are preliminary in nature. Claim
construction will be determined at the close of all the evidence and after any
hearing. The parties are expected to assert all of their claim construction
arguments and evidence in the Petition, Patent Owner’s Response,
Petitioner’s Reply, Patent Owner’s Sur-reply, or otherwise during trial, as
permitted by our rules.
C. Asserted Obviousness Based on Numazaki and Nonaka
Petitioner asserts that claims 1–18 of the ’949 patent are unpatentable
under 35 U.S.C. § 103(a) based on Numazaki and Nonaka. Pet. 10–49.
Patent Owner provides arguments addressing this asserted ground of
unpatentability. Prelim. Resp. 6–28. We first summarize the references and
then address the parties’ contentions.
1. Numazaki
Numazaki “relates to a method and an apparatus for generating
information input in which input information is extracted by obtaining a
reflected light image of a target object.” Ex. 1004, 1:8–11. An information
input generation apparatus according to a first embodiment includes lighting
unit 101, reflected light extraction unit 102, feature data generation unit 103,
and timing signal generation unit 104. Id. at 10:23–28, Fig. 1. Light
emitting unit 101 emits light that varies in intensity in time according to a
timing signal from timing signal generation unit 104. Id. at 10:29–31. The
light is directed onto a target object, and light reflected from the target object
is extracted by reflected light extraction unit 102. Id. at 10:31–35. Feature
data generation unit 103 extracts feature data from the reflected light image.
Id. at 10:57–58. “When the target object is a hand, it becomes possible to
obtain the information regarding a gesture or a pointing according to the
feature data extracted from the reflected light image of the hand, for
example, and it becomes possible to operate a computer by using this
obtained information.” Id. at 10:61–66.
Figure 2, reproduced below, depicts a detailed block diagram of the
information input generation apparatus of the first embodiment. Id. at 5:11–
12, 11:9–11.
Figure 2 shows that light emitted from lighting unit 101 is reflected by target
object 106, such that an image is formed on a photo-detection plane of
reflected light extraction unit 102. Id. at 11:11–14. Reflected light
extraction unit 102 includes first photo-detection unit 109, second photo-
detection unit 110, and difference calculation unit 111. Id. at 11:16–19.
Timing control unit 112 causes lighting unit 101 to emit light when first
photo-detection unit 109 is in a photo-detecting state and not to emit light
when second photo-detection unit 110 is in a photo-detecting state. Id.
at 11:26–32. Accordingly, first photo-detection unit 109 receives the light
emitted from lighting unit 101 that is reflected by target object 106 and
external light, such as illumination light or sunlight, but second photo-
detection unit 110 receives the external light only. Id. at 11:33–39.
Difference calculation unit 111 calculates and outputs the difference
between the image detected by first photo-detection unit 109 and the image
detected by second photo-detection unit 110, which difference corresponds
to the light emitted from lighting unit 101 that is reflected by target object
106. Id. at 11:43–55. The output from reflected light extraction unit 102 is
amplified by amplifier 113, converted from analog signals into digital
signals by analog-to-digital converter 114, and stored at memory 115. Id. at
11:61–64. At an appropriate time, the data stored in memory 115 is read out
and processed by feature data generation unit 103. Id. at 11:64–66.
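The difference operation described above can be sketched briefly (Python with NumPy; the function and variable names are illustrative assumptions, not Numazaki’s):

    import numpy as np

    def reflected_light_image(frame_lit, frame_unlit):
        # frame_lit: frame captured while the lighting unit emits (reflected plus
        #            external light), analogous to first photo-detection unit 109
        # frame_unlit: frame captured while the lighting unit is off (external
        #              light only), analogous to second photo-detection unit 110
        # The difference leaves approximately the light reflected by the target.
        diff = frame_lit.astype(np.int32) - frame_unlit.astype(np.int32)
        return np.clip(diff, 0, 255).astype(np.uint8)  # digitized image of the kind stored in memory 115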
Numazaki also discloses a third embodiment that “is directed to
another exemplary case of the feature data generation unit of the first
embodiment, which realizes a gesture camera for recognizing the hand
action easily and its application as a pointing device in the three-dimensional
space.” Id. at 29:4–8. Figure 23, reproduced below, shows the feature data
generation unit of the third embodiment. Id. at 6:4–6, 29:9–10.
Figure 23 shows that the feature data generation unit includes range image
memory unit 331 for storing a distance matrix, shape memory unit 332 for
storing shape interpretation rules, and shape interpretation unit 333 for
interpreting a shape of the distance matrix according to the shape
interpretation rules. Id. at 29:11–18. Shape interpretation unit 333 performs
the processing for determining if a matching shape interpretation rule exists.
Id. at 29:28–38, Fig. 25. When a matching shape is found, a command
corresponding to that shape is outputted. Id. at 30:2–3. Thus, this
embodiment uses hand gesture recognition as a trigger for inputting a
command into a computer and can also be used to power on and off a device
such as a TV or lighting equipment. Id. at 31:3–10.
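The shape-interpretation step described above can be approximated as a lookup against stored rules (Python; the rule set and names are illustrative assumptions, not Numazaki’s):

    def interpret_shape(distance_matrix, shape_rules):
        # distance_matrix: the range image held in range image memory unit 331
        # shape_rules: mapping of command name to predicate, standing in for the
        #              shape interpretation rules kept in shape memory unit 332
        for command, matches in shape_rules.items():
            if matches(distance_matrix):
                return command   # output the command corresponding to the matching shape
        return None              # no matching shape interpretation rule

    # Illustrative rule (assuming a NumPy distance matrix): treat a mostly
    # near-range frame as a hand gesture that toggles a device's power.
    example_rules = {"toggle_power": lambda m: float((m > 0.8 * m.max()).mean()) > 0.25}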
In addition, Numazaki discloses a fifth embodiment that “is directed
to another exemplary case of the feature data generation unit in the first
embodiment” that uses a video compression technique that extracts only
useful image information to lower communications costs. Id. at 39:6–20.
Figure 46, reproduced below, shows the feature data generation unit
according to the fifth embodiment. Id. at 7:4–6, 39:21–23.
Figure 46 shows feature data generation unit 103 in conjunction with
reflected light extraction unit 102 and visible light photo-detection array
351, which is generally a CCD camera for taking video images. Id. at
39:24–41. Images captured by visible light photo-detection array 351 are
stored in image memory unit 352, and a mask (i.e., the image detected by
reflected light extraction unit 102) is stored in range image memory unit
331. Id. at 39:51–57. Extraction unit 353 superposes the original image and
the mask, leaving only the overlapping portion. Id. at 39:57–59.
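A minimal sketch of that superposition step (Python with NumPy; the names are illustrative assumptions, not Numazaki’s):

    import numpy as np

    def extract_subject(video_frame, reflected_mask):
        # video_frame: H x W x 3 frame from visible light photo-detection array 351
        # reflected_mask: H x W mask derived from reflected light extraction unit 102
        # Keep only the pixels where the frame and the mask overlap; zero the rest.
        keep = reflected_mask > 0
        return np.where(keep[..., None], video_frame, 0)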
Numazaki also discloses an eighth embodiment that “is directed to a
system configuration incorporating the information input generation
apparatus” described in the previous embodiments. Id. at 50:21–24. Figure
74, reproduced below, shows a computer equipped with the information
input generation apparatus. Id. at 8:31–34, 50:25–26.
Figure 74 depicts a portable computer having a keyboard and a display
integrated with the computer body. Id. at 50:26–29. Lighting unit 701 and
photo-detection sensor unit 702 are positioned beyond the keyboard. Id.
at 50:30–33.
2. Nonaka
Nonaka relates to a camera equipped with a remote release device.
Ex. 1005, 2:1–3. In one embodiment, a “photographer gives a release
instruction by means of a predetermined motion towards the camera in
conjunction with the display timing of the aforementioned display patterns,
the distance measurement device . . . detects this motion by the subject . . . ,
and [an] exposure is carried out.” Id. at 3:35–38. Nonaka describes that an
objective of this invention is to provide “a remote release device-equipped
camera which enables remote release operations without using a transmitter
or receiver to give a release instruction, thereby achieving a higher degree of
freedom, good portability, and cost benefits.” Id. at 2:26–29.
3. Independent Claim 1
Petitioner contends that the proposed combination of Numazaki and
Nonaka discloses the limitations of challenged claim 1. Pet. 10–33. In
particular, Petitioner relies on: (1) Numazaki’s first embodiment as teaching
using the reflected light extraction unit to detect an object such as a user’s
hand; (2) Numazaki’s third embodiment as teaching detecting when the user
has performed a pre-registered gesture by comparing the output of the
reflected light extraction unit to stored data reflecting pre-registered gestures
or hand positions and instructing the device to implement a command
corresponding to the gesture; (3) Numazaki’s fifth embodiment as teaching
taking video images with visible light photo-detection array 351; and
(4) Numazaki’s eighth embodiment as teaching portable devices that
implement the information input generation apparatus described in the other
embodiments. Id. at 20 (citing Ex. 1004, 4:32–35, 29:19–30:5, 31:3–10,
39:21–60, 50:19–24). Regarding these embodiments, Petitioner argues that,
[a]lthough Numazaki does not expressly describe combining all
these features into a single portable device such that a user
could perform a gesture command (pursuant to its third
embodiment) that causes video capture to initiate (pursuant to
its fifth embodiment), a [person having ordinary skill in the art]
would have been motivated to implement Numazaki’s portable
device in this manner pursuant to Nonaka’s image capture
command gesture teachings.
Id. at 20–21. For example, Petitioner argues that combining Numazaki’s
embodiments as proposed would have improved Numazaki’s portable
devices in the same way that Nonaka’s gesture-based image capture
functionality benefits its camera device. Id. at 21 (citing Ex. 1003 ¶¶ 48–49;
KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 417 (2007)). That is, Petitioner
argues that Nonaka’s “gesture-based image capture solution ‘achiev[es] a
higher degree of freedom, good portability, and cost benefits,’” and one of
ordinary skill in the art “would have recognized that these same benefits
would be realized in Numazaki’s laptop.” Id. (citing Ex. 1005, 2:26–29)
(alteration in original). Petitioner also identifies certain passages in
Numazaki and explains the significance of each passage with respect to the
corresponding claim limitation. Id. at 25–33.
Patent Owner argues that Petitioner has failed to show that the cited
references disclose certain limitations of claim 1 and has failed to provide
sufficient reasoning to combine the references. Prelim. Resp. 6–17. We
address Patent Owner’s arguments in turn.
a) Limitation [1(a)]: “a device housing including a forward facing
portion, the forward facing portion of the device housing encompassing
an electro-optical sensor having a field of view and including a digital
camera separate from the electro-optical sensor”
Patent Owner argues that the combination of Numazaki and Nonaka
does not teach or suggest limitation [1(a)]. Prelim. Resp. 6. Specifically,
Patent Owner argues that Petitioner contends that this limitation is met by
incorporating Numazaki’s fifth embodiment into Numazaki’s eighth
embodiment, wherein the laptop of the eighth embodiment is the claimed
“device housing,” and reflected light extraction unit 102 and visible light
photo-detection array 351 of the fifth embodiment correspond to the claimed
“electro-optical sensor” and “digital camera,” respectively. Id. at 6–7 (citing
Pet. 26–28). According to Patent Owner, however, photo-detection sensor
unit 702 of the eighth embodiment (which Patent Owner contends
corresponds to reflected light extraction unit 102 or visible photo-detection
array 351 in the proposed combination) “is located on the same upward
facing portion of the laptop as the keyboard, and just ‘beyond the keyboard
when viewed from an operator side.’” Id. at 9 (quoting Ex. 1004, 50:31–33).
Patent Owner further argues that this upward facing portion of the laptop is
not forward facing and, even if photo-detection sensor unit 702 has a
forward-facing field of view, it is not located on a forward-facing portion of
the laptop. Id.
We do not agree with Patent Owner’s argument. Petitioner argues
that one of ordinary skill in the art would have been motivated to implement
the videoconference functionality of Numazaki’s fifth embodiment into the
laptop of the eighth embodiment. Pet. 26. To accomplish this
implementation, Petitioner argues that Numazaki’s two-camera reflected
light extraction unit 102 would have been used in conjunction with visible
photo-detection array 351. Id. at 26–27 (citing Ex. 1004, 39:21–49).
Petitioner also argues that, because the output of reflected light extraction
unit 102 is processed to define which portions of the video captured by
visible photo-detection array 351 are retained, one of ordinary skill in the art
would have understood that both reflected light extraction unit 102 and
visible photo-detection array 351 are forward facing. Id. at 27–28 (citing
Ex. 1004, 39:24–60, Fig. 48; Ex. 1003 ¶ 52).
As such, we do not read the Petition as asserting that either reflected
light extraction unit 102 or visible photo-detection array 351 physically
replaces photo-detection sensor unit 702. Instead, as discussed above,
Petitioner relies on the fifth embodiment’s disclosure of using forward-facing
reflected light extraction unit 102 and visible photo-detection array 351
together. The test for obviousness is not whether the features of one
embodiment may be bodily incorporated into another embodiment to
produce the claimed subject matter; rather, it is what the combination of
embodiments makes obvious to one of ordinary skill in the art. See In re
Mouttet, 686 F.3d 1322, 1332 (Fed. Cir. 2012) (“It is well-established that a
determination of obviousness based on teachings from multiple references
does not require an actual, physical substitution of elements.”).
Furthermore, neither party has offered an express construction for the
claim term “forward facing portion,” and we do not agree, based on the
current record, that the portion of Numazaki’s laptop encompassing photo-
detection sensor unit 702 cannot be considered as “forward facing.” Indeed,
Figure 74 depicts this portion of the laptop to be slanted and facing both
upward and forward. We, thus, are persuaded at this preliminary stage that
Numazaki’s laptop includes a forward-facing portion that includes photo-
detection sensor unit 702. We invite the parties to brief the proper
construction of “forward facing portion” during trial, if desired, and we will
address this limitation on the complete trial record, including any claim
construction analysis for the term “forward facing portion,” to the extent
included in the record.
b) Limitation [1(b)]: “a processing unit within the device housing and
operatively coupled to an output of the electro-optical sensor, wherein
the processing unit is adapted to: determine a gesture has been
performed in the electro-optical sensor field of view based on the
electro-optical sensor output”
First, Patent Owner argues that Numazaki requires two, not one,
images from different photo-detection units to perform an analysis of a
target object and identify a gesture. Prelim. Resp. 10–11. Accordingly,
Patent Owner argues that Numazaki “does not teach or suggest a processing
unit capable of ‘determin[ing] a gesture has been performed’ based on the
output of one or more electro-optical sensors, as set forth in claim element
[1(b)].”5 Id. at 12 (alteration in original). Patent Owner also argues that
“Numazaki does not teach or suggest ‘determin[ing] a gesture has been
performed’ absent the other hardware that Numazaki identifies as necessary,
such as the lighting unit, the image-subtraction circuitry, and the associated
timing circuitry.” Id. (alteration in original).

5 Although referring to “one or more electro-optical sensors,” it appears that
Patent Owner intended to argue that Numazaki does not disclose
determining that a gesture has been performed based on the output of one
sensor. One or more sensors would include the two sensors that Patent
Owner argues Numazaki requires.

These arguments are not persuasive because claim 1 employs the
open-ended language “comprising,” and thus does not preclude additional
elements. See Genentech, Inc. v. Chiron Corp., 112 F.3d 495, 501 (Fed. Cir.
1997) (discussing “open ended” claim terms, such as “comprising”).
Specifically, claim 1 does not preclude determining that a gesture has been
performed based on the output of more than one electro-optical sensor. Nor
does claim 1 preclude additional hardware such as a lighting unit, image-
subtraction circuitry, and timing circuitry.
Second, Patent Owner argues that the Petition requires that both of
Numazaki’s third and fifth embodiments are implemented into the eighth
embodiment, but reflected light extraction unit 102 operates differently in
the third and fifth embodiments. Prelim. Resp. 12–13 (citing Ex. 1004,
29:5–18, 39:26–31, 39:50–60, Figs. 23, 46, 48). According to Patent
Owner, “the Petition requires that reflected light extraction unit 102 be
modified to generate both a ‘reflection matrix’ and a ‘distance matrix,’” but
“there is no motivation for doing so” because “Numazaki’s third and fifth
embodiments are disclosed as disparate embodiments.” Id. at 13.
This argument is not persuasive because it mischaracterizes
Petitioner’s position. We do not read the Petition as proposing to modify
Numazaki’s reflected light extraction unit 102 so as to generate both a
reflection matrix in accordance with the fifth embodiment and a distance
matrix in accordance with the third embodiment. Rather, Petitioner relies on
the fifth embodiment as disclosing videoconference functionality (Pet. 26–
29) and the third embodiment as disclosing gesture recognition (id. at 29–
30). As for the reflected light extraction unit, Petitioner relies on reflected
light extraction unit 102 of the fifth embodiment, which Petitioner contends
is the same two-camera reflected light extraction unit 102 used in the first
embodiment. Id. at 26 & n.2.
c) Limitation [1(c)]: “control the digital camera in response to the gesture
performed in the electro-optical sensor field of view, wherein the gesture
corresponds to an image capture command, and wherein the image
capture command causes the digital camera to store an image to
memory”
Patent Owner argues that, although the Petition combines Numazaki’s
third, fifth, and eighth embodiments to meet limitation [1(c)], there is no
motivation to do so for several reasons. Prelim. Resp. 14. First, Patent
Owner argues that “Numazaki explicitly delineates multiple embodiments”
and teaches away from combining the third and fifth embodiments because
these “embodiments effectively disclose competing implementations for the
[feature data generation unit].” Id. at 14–15 (citing Ex. 1004, 10:21–27,
29:1, 29:4–8, 39:3, 39:17–20, 39:50–60, Figs. 1, 23, 46, 48).
We disagree that the implementations of the feature data generation
unit for the third and fifth embodiments are “competing” to the degree that
they are incompatible or teach away from combining aspects of the
embodiments. Numazaki discloses that both the third and fifth embodiments
are directed to other exemplary cases of the feature data generation unit of
the first embodiment (Ex. 1004, 29:4–8, 39:17–20), thereby suggesting that
the alternative implementations of the feature data generation unit can be
used in place of the feature data generation unit of the first embodiment. In
other words, Numazaki suggests that the various implementations are
interchangeable. The mere fact that Numazaki’s third and fifth
embodiments disclose different feature data generation units would not have
discouraged one of ordinary skill in the art from considering and combining
various aspects of the embodiments. For these reasons, we disagree with
Patent Owner’s arguments at this stage of the proceeding.
Second, Patent Owner challenges Petitioner’s argument that one of
ordinary skill in the art would have combined Numazaki’s embodiments in
the manner proposed to achieve a higher degree of freedom, good
portability, and cost benefits as taught by Nonaka. Prelim. Resp. 15.
Specifically, Patent Owner argues that “Nonaka teaches that ‘a higher degree
of freedom, good portability, and cost benefits’ result from not making a
camera operable via a remote-control unit,” but “Numazaki is completely
silent regarding the existence of remote-control units and the use of remote-
control units to operate a camera.” Id. (citing Ex. 1005, 2). Thus, in Patent
Owner’s view, Petitioner’s reason for combining Numazaki’s embodiments
is based on solving a problem that Numazaki never had. Id.
We agree that Nonaka discloses that its gesture-based image capture
functionality provides a higher degree of freedom, good portability, and cost