Trials@uspto.gov
571-272-7822

Paper No. 11
Entered: October 3, 2017

UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________

SAMSUNG ELECTRONICS CO., LTD. AND
SAMSUNG ELECTRONICS AMERICA, INC.,
Petitioner,

v.

IMAGE PROCESSING TECHNOLOGIES, LLC,
Patent Owner.
____________

IPR2017-01190
Patent 6,717,518 B1
____________

Before JONI Y. CHANG, MIRIAM L. QUINN, and
SHEILA F. McSHANE, Administrative Patent Judges.

McSHANE, Administrative Patent Judge.

DECISION
Instituting Inter Partes Review
35 U.S.C. § 314(a) and 37 C.F.R. § 42.108

I. INTRODUCTION

A. Background

Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (“Petitioner”) filed a Petition requesting inter partes review of claim 39 (“the challenged claim”) of U.S. Patent No. 6,717,518 B1 (Ex. 1001, “the ’518 patent”) pursuant to 35 U.S.C. §§ 311–319. Paper 2 (“Pet.”). Image Processing Technologies, LLC (“Patent Owner”) filed a Preliminary Response to the Petition. Paper 6 (“Prelim. Resp.”).

We have authority under 35 U.S.C. § 314(a), which provides that an inter partes review may not be instituted “unless . . . the information presented in the petition . . . shows that there is a reasonable likelihood that the Petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” See 37 C.F.R. § 42.4(a) (“The Board institutes the trial on behalf of the Director.”).

We determine that Petitioner has demonstrated that there is a reasonable likelihood that it would prevail with respect to the one challenged claim. For the reasons described below, we institute an inter partes review of claim 39 of the ’518 patent.

B. Related Proceedings

The parties indicate that a related matter is Image Processing Technologies LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG (E.D. Tex.). Pet. 1; Paper 4, 1. The parties also indicate that inter partes review petitions have been filed for other patents asserted in the district court action. Pet. 1–2; Paper 4, 1.

C. The ’518 Patent

The ’518 patent is titled “Method And Apparatus For Detection Of Drowsiness,” was filed as PCT application No. PCT/EP99/00300 on January 15, 1999, and issued on April 6, 2004. Ex. 1001, [22], [45], [54], [86]. The ’518 patent claims priority to application FR 98 00378, dated January 15, 1998, and application PCT/EP98/05383, dated August 25, 1998. Id. at [30]. The application entered the U.S. national stage as application No. 09/600,390, meeting the requirements under 35 U.S.C. § 371 on February 9, 2001. Id. at [21], [86].

The ’518 patent is directed to applying a generic image processing system to detect a person’s drowsiness. Ex. 1001, 2:1–5, 2:32–40. To accomplish that, the driver’s blink rate is detected using a video camera in a car. Id. at 6:28–57. The system first detects a driver entering the vehicle by use of pixels “moving in a lateral direction away from the driver’s door.” Id. at 25:24–39. The driver’s head is detected by identifying pixels with selected characteristics, with the pixels loaded in histograms as depicted in Figure 24, reproduced below. Id. at 5:64–65, 26:46–49.

[Figure 24 of the ’518 patent]

Figure 24, above, illustrates the detection of the edges of a head using histograms. Ex. 1001, 5:64–65. The head edges are detected by looking for peaks in the histogram. Id. at 26:49–65. The system then masks portions of an image and continues to analyze only the unmasked portions. Id. at 26:66–27:10; see also id. at Fig. 25. The system then uses an anthropomorphic model to set sub-areas for further analysis. Id. at 27:31–38. Figure 26, reproduced below, shows the derivation of a sub-area. See id. at 27:31–38.

[Figure 26 of the ’518 patent]

Figure 26, above, depicts masking outside the eyes. Ex. 1001, 6:1–2. The ’518 patent includes a variety of methods to identify blinking, including use of histograms to determine whether eyes are open or closed, as depicted in Figure 27, reproduced below. Id. at 27:52–28:14.

[Figure 27 of the ’518 patent]

The system checks for eye movement by methods including analyzing the pixels within area Z' depicted above in Figure 27. Ex. 1001, 27:52–55. The peaks of the histogram shown in Figure 27, above, are used to determine whether an eye is open or closed. Id. at 28:32–29:10. Characteristics of features in a search box, such as “a moving eyelid, a pupil, iris or cornea, a shape corresponding to an eye, a shadow corresponding to an eye, or any other indicia indicative of an eye,” may also be analyzed. Id. at 30:56–59.
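
For orientation, a minimal sketch of this kind of histogram check appears below (Python with NumPy): pixels with pupil-like luminance are selected within an eye sub-area, projected into a row histogram, and the peak height decides open versus closed, with blink rate measured over time. The function names, thresholds, and frame rate are illustrative assumptions, not values taken from the ’518 patent.

    import numpy as np

    def eye_open_from_histogram(eye_region, dark_threshold=60, open_peak=12):
        # Select pixels with characteristics of the pupil/iris (very low
        # luminance), then project the selected pixels onto the vertical axis
        # to form a histogram; the thresholds here are illustrative only.
        selected = eye_region < dark_threshold      # boolean mask of dark pixels
        histogram = selected.sum(axis=1)            # dark-pixel count per row
        peak = int(histogram.max()) if histogram.size else 0
        # A tall peak of dark pixels suggests an exposed pupil/iris (eye open);
        # a flat histogram suggests a closed eyelid.
        return peak >= open_peak

    def blink_rate_per_second(eye_regions, fps=25.0):
        # Analyze the histograms over time: count open-to-closed transitions.
        states = [eye_open_from_histogram(r) for r in eye_regions]
        blinks = sum(1 for a, b in zip(states, states[1:]) if a and not b)
        duration = len(eye_regions) / fps
        return blinks / duration if duration else 0.0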

Claim 39, with added formatting and paragraph annotations, is reproduced below.

    39. A process of detecting a feature of an eye, the process comprising the steps of:

    [a] acquiring an image of the face of the person, the image comprising pixels corresponding to the feature to be detected;

    [b] identifying a characteristic of the face other than the feature to be detected;

    [c] identifying a portion of the image of the face comprising the feature to be detected using an anthropomorphic model based on the location of the identified facial characteristic;

    [d] selecting pixels of the portion of the image having characteristics corresponding to the feature to be detected;

    [e] forming at least one histogram of the selected pixels;

    [f] and analyzing the at least one histogram over time to identify characteristics of the feature to be detected;

    [g] said feature being the iris, pupil or cornea.

Ex. 1001, 36:60–38:4.
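
Read as a pipeline, the claimed steps run from image acquisition through temporal histogram analysis. The sketch below restates elements [a] through [g] in Python; every threshold, ratio, and helper is a hypothetical stand-in offered for illustration only, not a construction of the claim or an implementation from the record.

    import numpy as np

    def detect_eye_feature(image):
        # image: the acquired grayscale face image ([a]) as a 2-D NumPy array.
        rows, cols = np.nonzero(image < 128)        # [b] coarse head boundaries from dark pixels
        if rows.size == 0:
            return None
        top, left, right = rows.min(), cols.min(), cols.max()
        height = rows.max() - top
        # [c] anthropomorphic sub-area: assume, for illustration, that the
        # eyes sit in the second quarter of the head's height.
        eye_band = image[top + height // 4 : top + height // 2, left:right]
        selected = eye_band < 60                    # [d] pupil/iris-like (very dark) pixels
        histogram = selected.sum(axis=0)            # [e] column histogram of the selected pixels
        if histogram.size == 0:
            return None
        # [f]/[g] repeated over successive frames, the histogram peak can be
        # tracked over time; here we return the single-frame pupil column.
        return int(histogram.argmax()) + int(left)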

D. Asserted Grounds of Unpatentability

Petitioner asserts the following grounds of unpatentability against claim 39 of the ’518 patent:

    Ground    Prior Art
    § 103     Eriksson1 and Stringa2
    § 103     Ando3 and Suenaga4
    § 103     Ando and Stringa

Pet. 3.

1 Martin Eriksson, Eye-Tracking for Detection of Driver Fatigue, Proceedings of the November 1997 IEEE Conference on Intelligent Transportation Systems, 314–319 (Ex. 1005).
2 Luigi Stringa, Eyes Recognition for Face Recognition, Applied Artificial Intelligence—An International Journal, Vol. 7, No. 1, 1993, 365–382 (Ex. 1006).
3 U.S. Patent No. 5,008,946 (issued April 16, 1991) (Ex. 1009).
4 U.S. Patent No. 5,805,720 (issued September 8, 1998) (Ex. 1007).

II. ANALYSIS

A. Claim Construction

In an inter partes review, the Board interprets claim terms in an unexpired patent according to the broadest reasonable construction in light of the specification of the patent in which they appear. 37 C.F.R. § 42.100(b); Cuozzo Speed Techs., LLC v. Lee, 136 S. Ct. 2131, 2144–46
(2016) (upholding the use of the broadest reasonable interpretation approach). Under that standard, and absent any special definitions, we give claim terms their ordinary and customary meaning, as they would be understood by one of ordinary skill in the art at the time of the invention. In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007).

Petitioner does not propose that any specific definitions apply to any of the terms of claim 39, and asserts that the terms should be given their ordinary and customary meaning. Pet. 3–4. Patent Owner agrees that the ordinary meaning of terms should apply, provides a proposed claim construction for the terms “characteristic of the face”/“facial characteristic,” and directs us to the claim construction opinion from Image Processing Technologies, LLC v. Samsung Elecs. Co., No. 16-cv-00505-JRG (E.D. Tex.) (Ex. 2001). See Prelim. Resp. 20–23.

At this time, we determine that it is not necessary to provide an express interpretation of any term of the claims. Vivid Techs., Inc. v. Am. Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999) (“[O]nly those terms need be construed that are in controversy, and only to the extent necessary to resolve the controversy.”).

B. Alleged Obviousness of Claim 39 over Eriksson and Stringa

Petitioner contends that claim 39 is obvious over Eriksson and Stringa. Pet. 26–41. To support its contentions, Petitioner provides evidence and explanations as to how the prior art teaches each claim limitation. Id. Petitioner also relies upon the Declaration of Dr. John C. Hart (“Hart Declaration” (Ex. 1002)) to support its positions. Patent Owner counters that the prior art does not render claim 39 obvious because the prior art fails to teach some limitations of the claim. Prelim. Resp. 27–32.

On this record, we are persuaded by Petitioner’s explanations and evidence in support of the obviousness grounds asserted under Eriksson and Stringa for claim 39. We begin our discussion with a brief summary of the prior art, and then address the evidence, analysis, and arguments presented by the parties.

1. Eriksson (Ex. 1005)

Eriksson is directed to “a system that locates and tracks the eyes of a driver” for the “purpose of . . . detect[ing] driver fatigue.” Ex. 1005, 314.5 Eriksson uses a small camera to “monitor the face of the driver and look for eye movements which indicate that the driver is no longer in condition to drive.” Id. Eriksson uses four steps for detection: (1) localization of the face; (2) computation of the vertical location of the eyes; (3) computation of the exact location of the eyes; and (4) estimation of the iris position. Id. at 315. In the first step, localization of the face, Eriksson uses a “symmetry histogram,” shown in Figure 1 below. Id.

5 The references used herein refer to the page numbers used in the original publication.

[Figure 1 of Eriksson]

Figure 1, above, depicts computed symmetry values that form the symmetry histogram used to determine the center of a face. Ex. 1005, 315–316. In the second step of Eriksson, the vertical location of the eyes is determined using an edge detection algorithm to form the histogram depicted in Figure 2. Id. at 316.

[Figure 2 of Eriksson]

Figure 2, above, depicts an original image, the edges detected, and a histogram of the detected edges. Ex. 1005, 316. The peaks formed are considered in the third step of the process that computes the exact location of the eyes. Id. The eyes are located by searching for “intensity-valleys” in the image and also using “general constraints, such [as] that both eyes must be located ‘fairly close’ to the center of the face.” Id. Finally, the position of the iris is found by the use of an “eye-template” shown in Figure 3. Id.

[Figure 3 of Eriksson]

Figure 3, above, depicts the eye-template that is laid over the image to find the position of the iris. Ex. 1005, 316. The template determines that there is
a good match if there are “many dark pixels in the area inside the inner circle, and many bright pixels in the area between the two circles.” Id. at 316–317. Upon a match, “the inner circle is centered on the iris and the outside circle covers the sclera.” Id. at 317. Upon detection, Eriksson generates a horizontal histogram across the pupil. Id. at 318. Figure 5, reproduced below, depicts horizontal histograms for open and closed eyes. Id. at 318.

[Figure 5 of Eriksson]

The histograms depicted above in Figure 5 are used to determine whether an eye is open or closed. Ex. 1005, 318. Measurement of blink rates over time can be used to detect drowsy drivers. Id.
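
For illustration, the sketch below shows one way a two-circle eye-template score and a horizontal dark-pixel histogram of the kind Eriksson describes might be computed. The radii and intensity thresholds are assumptions, not Eriksson’s actual parameters.

    import numpy as np

    def template_score(patch, r_inner=4, r_outer=9, dark=80, bright=150):
        # Score a candidate eye position against a two-circle template: dark
        # pixels should dominate inside the inner circle (iris/pupil) and
        # bright pixels the ring between the circles (sclera).
        h, w = patch.shape
        cy, cx = h // 2, w // 2
        yy, xx = np.ogrid[:h, :w]
        dist = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
        inner = dist <= r_inner
        ring = (dist > r_inner) & (dist <= r_outer)
        dark_hits = np.count_nonzero(patch[inner] < dark)
        bright_hits = np.count_nonzero(patch[ring] > bright)
        return dark_hits + bright_hits

    def eye_open(patch, dark=80, min_run=6):
        # Horizontal histogram across the pupil row: a wide band of dark
        # pixels suggests an open eye, a narrow one a closed eye.
        row = patch[patch.shape[0] // 2]
        return int(np.count_nonzero(row < dark)) >= min_run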

2. Stringa (Ex. 1006)

Stringa is directed to an image processing normalization algorithm for face recognition. Ex. 1006, 365. Stringa locates the position of eyes “based on the exploitation of (a priori) anthropometric information combined with the analysis of suitable grey-level distributions, allowing direct localization of both eyes.” Id. at 369. Stringa discusses the “grammar” of facial structure, where the “human face presents a reasonable symmetry,” with “knowledge of the relative position of the main facial features.” Id. at 369. Stringa’s “guidelines can be derived from anthropometric data
corresponding to an average face and refined through the analysis of real faces.” Id. An algorithm detects a line that connects the eyes, the side limits of the face, and the nose axis, in order to estimate “the expectation zones of the two eyes.” Id. at 376. Stringa searches for the pupil based upon an analysis of horizontal grey-level distribution. Id. at 377. Figure 9, reproduced below, depicts the expectation zone for the eyes. Id.

[Figure 9 of Stringa]

Figure 9, above, depicts the computed expectation zone for the two eyes. Ex. 1006, 377. Stringa then uses a second derivation to produce a graph that identifies a pupil, as depicted in Figure 10. Id. at 377–378.

[Figure 10 of Stringa]

Figure 10, above, depicts a plot of a second derivation of eye data used to locate the pupil. Ex. 1006, 378. Figure 10 shows a peak corresponding to the eye’s pupil, with two adjacent peaks of lesser intensity indicating the discontinuity represented by the cornea. Id.
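
A minimal sketch of this kind of localization appears below: the horizontal grey-level distribution of an expectation zone is smoothed and its second difference is searched for the dominant peak. The smoothing width and the NumPy formulation are assumptions; Stringa’s exact procedure is not reproduced here.

    import numpy as np

    def pupil_column(expectation_zone):
        # Horizontal grey-level distribution: mean intensity per column of the
        # precomputed eye expectation zone.
        profile = expectation_zone.mean(axis=0)
        smooth = np.convolve(profile, np.ones(3) / 3.0, mode="same")
        second_diff = np.diff(smooth, n=2)          # discrete second derivative
        if second_diff.size == 0:
            return None
        # The pupil shows up as the dominant intensity discontinuity.
        return int(np.argmax(np.abs(second_diff))) + 1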

3. Analysis

A patent claim is unpatentable under 35 U.S.C. § 103(a) if the differences between the claimed subject matter and the prior art are such that the subject matter, as a whole, would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007). The question of obviousness is resolved on the basis of underlying factual determinations, including: (1) the scope and content of the prior art; (2) any differences between the claimed subject matter and the prior art;
(3) the level of ordinary skill in the art;6 and (4) objective evidence of nonobviousness.7 Graham v. John Deere Co., 383 U.S. 1, 17–18 (1966).

6 Petitioner proposes an assessment of the level of ordinary skill in the art. Pet. 3; see Ex. 1002 ¶¶ 44–46. Patent Owner does not propose any required qualifications. Prelim. Resp. 1–26. At this juncture, we adopt Petitioner’s proposed qualifications.
7 There are no objective indicia of nonobviousness yet in the record.

Petitioner contends that both Eriksson and Stringa individually teach every element of claim 39. See Pet. 29–40. Petitioner nonetheless contends that Eriksson and Stringa are not cumulative, but rather “compliment” each other. Id. at 40. Petitioner alleges that “while Eriksson likely renders Claim 39 of the ’518 Patent obvious in combination with the knowledge of a person of ordinary skill in the art [POSA], the combination of Eriksson with Stringa provides a more complete disclosure.” Id. at 40–41. It is unclear why Petitioner sought to rely on two references to make a “more complete disclosure,” but our regulations require Petitioner to identify the basis of its challenges, and in this Petition the reliance on multiple references for the challenges to the same claim is excessive. See 37 C.F.R. § 42.104(b). The Petition additionally states that the respective prior art references “make up for the weaknesses of the other,” and alleges that “Stringa contains extensive discussion and use of anthropomorphic models to find a pupil in a video image.” Pet. 40. The Petition also contends that Stringa proposes more extensive use of anthropomorphic models based on multiple facial features than Eriksson, and that “incorporation of those models would have allowed Eriksson to achieve more precise localization of the pupil.” Id. at 27. Petitioner therefore indicates that Eriksson alone “likely” renders claim 39 obvious; however, Stringa more closely satisfies the element relating to the
use of the anthropomorphic models than Eriksson, that is, element [c]. Accordingly, we decline to rely on both prior art references for the teachings of all the elements of claim 39, and based on the Petition’s representations, we will proceed by analyzing Eriksson’s teachings for the majority of elements of claim 39, except for element [c], where we will also consider Stringa’s teachings.

Petitioner alleges that Eriksson teaches a process of detecting the features of an eye by acquiring an image of a person’s face comprised of pixels. Pet. 29–30. Petitioner contends that Eriksson teaches the step of identifying a “characteristic of the face other than the feature to be detected” recited as element [b] in claim 39. Id. at 31–32. For this, Petitioner relies upon Eriksson’s disclosure of the edges of the face and the vertical location of the eyes. Id. Petitioner also asserts that Stringa teaches element [c] by, at least, its use of the eye-connecting line, the face sides, and the nose axis for identification of the expectation zone for the eyes. Id. at 34–35. Stringa’s disclosure of the use of the nose and the left side of the face to identify a region of a face known to contain pupils based on anthropomorphic models is also relied upon in the alternative. Id. at 35. Petitioner contends that Eriksson teaches the step of pixel selection of the feature by selecting pixels from the portion of the image identified using the symmetry histogram and the gradient histogram (element [d]). Id. at 36. Petitioner also alleges that Eriksson teaches the limitation of “forming . . . a histogram of the selected pixels,” with analysis over time, to identify the characteristic of the feature, i.e., the pupil (elements [e] and [f]). Id. at 37–39. Finally, Eriksson’s disclosure of the identification of a “pupil” is relied upon for the teaching of element [g]. Id. at 40.

Petitioner asserts that a person of ordinary skill in the art would have been motivated to combine Eriksson and Stringa because both references are directed to similar systems that operate in a similar manner. Pet. 26. Petitioner alleges that Stringa was cited by, and partially incorporated into, Eriksson, and that one of ordinary skill in the art would therefore have known “that Stringa was a relevant and helpful reference in the field of facial recognition.” Id. at 28 (citing Ex. 1005, 315; Ex. 1002 ¶ 82). Petitioner points to Stringa’s more extensive use of anthropomorphic models, as discussed above, and argues that incorporation of those models would have allowed Eriksson to achieve more precise localization of the pupil. Id. at 27. Petitioner also asserts that a person of ordinary skill would have expected the combination of the references to yield predictable results, that is, they involve applying known anthropomorphic models in similar systems. Id. at 27–28.

We have reviewed Petitioner’s evidence and explanations for the alleged teaching of the elements of claim 39, and are persuaded that the evidence provided is sufficient at this juncture. Based on the current record, Petitioner also provides a sufficiently persuasive rationale for combining the teachings of Eriksson with Stringa for purposes of this Decision.

Patent Owner argues that Eriksson fails to teach the limitations of elements [d]–[g] of claim 39. Prelim. Resp. 27–32. Patent Owner’s argument is premised on the contention that the step of “selecting pixels” in element [d] is limited to selection of pixels “having characteristics corresponding to the feature to be detected,” but not to selection of all the pixels in a particular area. Id. at 28. For support of its argument, Patent Owner directs us to Figure 36 of the ’518 patent, where the selection is
limited to “pixels with ‘very low luminance levels and high gloss’ . . . as these are character[i]stics of a pupil.” Id. at 29 (citing Ex. 1001, 30:61–64, Fig. 36). Patent Owner argues that Eriksson does not disclose the limitations because the references teach selection of all the pixels in a selected area. Id. at 30.

On this record, we are not persuaded by Patent Owner’s arguments. We do not agree that the claim limitation of “selecting pixels of a portion of the image having characteristics corresponding to the feature to be detected” precludes selection of pixels that are not of the feature itself. That is, at this juncture, our view of claim limitation [d] is that it requires a selection that includes pixels having characteristics corresponding to the feature, but it does not limit the selection to only those pixels; others could be included in the selection. Moreover, this view is supported by the ’518 patent. The ’518 patent specification discloses examples of selections of pixels, with histograms formed, that include pixels of the feature to be identified as well as other non-feature pixels. For instance, the ’518 patent specification includes the selection and analysis of pixels within a particular area Z' that are feature pixels as well as non-feature pixels. See Ex. 1001, 27:52–59, Figs. 16, 27; see also Fig. 24.

Therefore, based on the record before us, Petitioner has demonstrated a reasonable likelihood of prevailing on its assertion that claim 39 would have been obvious over Eriksson and Stringa, as we have discussed above.

C. Alleged Obviousness of Claim 39 over Ando and Suenaga

Petitioner contends that claim 39 is obvious over Ando and Suenaga. Pet. 41–56. To support its contentions, Petitioner provides explanations as to how the prior art teaches each claim limitation. Id. Petitioner also relies
upon the Hart Declaration to support its positions. Patent Owner counters that the prior art does not render claim 39 obvious because the prior art fails to sufficiently teach some limitations of the claim, and that a person of ordinary skill in the art would not have been motivated to combine Ando and Suenaga. Prelim. Resp. 34–46.

On this record, we are persuaded by Petitioner’s explanations and evidence in support of the obviousness grounds asserted under Ando and Suenaga for claim 39. We begin our discussion with a brief summary of the prior art, and then address the evidence, analysis, and arguments presented by the parties.

1. Ando (Ex. 1009)

Ando is directed to a system for detecting certain portions of an image, including a “driver’s eyes and mouth.” Ex. 1009, 2:1–4. The steps used in tracking the eyes and mouth are described as:

    To precisely extract information about the eyes and mouth from image information in response to the changes in the positions of the face, eyes, and mouth, the apparatus further includes a storage means for storing the detected positions, a window setting means for setting a region narrower than the image produced by the camera means according to the stored positions, a means for setting the region covered by a position-detecting means to the narrower region after a certain period of time elapses since the detected positions are stored in the storage means, and an updating means for updating the positions of the aforementioned certain portions within the narrower region which are stored in the storage means. Once the positions of the certain portions, i.e., the eyes and mouth, are detected, the scan made to detect the eyes and mouth is limited to the narrower region and so they can be detected quickly. Further, the accuracy with which the detection is made is enhanced. Consequently, the apparatus follows the eyes and mouth quickly and precisely.

Ex. 1009, 2:21–41.
Ando first identifies likely locations of features such as the head, forehead, eyes, eyebrows, mouth, and nose. Ex. 1009, Fig. 2. Histograms are used to calculate thresholds for distinguishing specific face elements. See, e.g., id. at Figs. 5b, 7b, 8b (identifying thresholds of different features); see also id. at 16:44–57, Figs. 7a–7c, 17:63–21:53. Ando describes defining a portion of the image, Sd (id. at 18:11–14), and the expected position of the right eye using known ratios for a human face, as depicted in Figure 13d, reproduced below (see id. at 21:17–18).

[Figure 13d of Ando]

Figure 13d, above, depicts the defined portion of the image, Sd, which is then used to calculate a gray level histogram to identify features including pupils. Ex. 1009, 18:15–20:52. Ando also discloses the use of subsequent image frames to check whether eyes are open or closed. Id. at 12:31–35. Ando “detect[s] the opening and closing pattern of an eye and the position of a moved pupil” in order to detect blinking and the direction the driver is looking. Id. at 29:58–32:23.
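
The sketch below illustrates the general idea of deriving a binarization threshold from a grey-level histogram of a window such as Sd, and of narrowing later scans to a region around a stored position. The valley-style threshold rule and the margin are assumptions, not Ando’s actual “differential gradation histogram” computation.

    import numpy as np

    def black_pixel_mask(window):
        # Grey-level histogram of the window; pick a threshold between the
        # dominant dark population (pupil/eyebrow) and bright population (skin).
        hist, _ = np.histogram(window, bins=256, range=(0, 256))
        dark_peak = int(np.argmax(hist[:128]))
        bright_peak = int(np.argmax(hist[128:])) + 128
        threshold = (dark_peak + bright_peak) // 2
        return window < threshold                   # "black" pixels used for feature detection

    def narrowed_window(image, last_position, margin=20):
        # Once a feature is found, later frames scan only a region around the
        # stored position, in the spirit of the window-setting means.
        y, x = last_position
        return image[max(0, y - margin): y + margin, max(0, x - margin): x + margin]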

2. Suenaga (Ex. 1007)

Suenaga discloses a “facial image processing system for detecting . . . a dozing or drowsy condition of an automobile driver . . . from the opened and closed conditions of his eyes.” Ex. 1007, 1:6–10. Suenaga discloses using camera images and converting them into binary images. Id. at 6:41–51. An evaluation function calculation “first finds the barycenter or centroid 31 from the average of the coordinates of black pixels in a binary image 30” (id. at 23:21–24), and “rectangular areas existing in the predetermined ranges in the X direction on the left and right sides of this barycenter or centroid 31 are set as eye presence areas 32” (id. at 23:24–27). Figure 61, reproduced below, depicts an embodiment of the invention used to identify an eye presence area. See id. at 23:8–14.

[Figure 61 of Suenaga]

Figure 61, above, depicts the setting of eye presence areas 32, where histograms are formed of select portions and “hatched candidate areas 35 (namely 35a and 35b) for an eye presence area are extracted.” Ex. 1007, 23:24–35. Subsequent
evaluations are used to examine the eye shape. Id. at 6:61–65, 7:4–65, 23:52–54.
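
The sketch below illustrates the barycenter-and-presence-area idea: the centroid of the black pixels in a binary image is computed, rectangular areas are set at offsets to its left and right, and an X-direction histogram inside each area flags candidate eye columns. The offsets are placeholders, not Suenaga’s predetermined ranges.

    import numpy as np

    def eye_presence_areas(binary_image, dx=(15, 60), dy=25):
        # Barycenter/centroid of the black (True) pixels, then rectangular
        # eye presence areas at fixed offsets on either side of it.
        ys, xs = np.nonzero(binary_image)
        if ys.size == 0:
            return []
        cy, cx = int(ys.mean()), int(xs.mean())
        left = (cy - dy, cy + dy, cx - dx[1], cx - dx[0])
        right = (cy - dy, cy + dy, cx + dx[0], cx + dx[1])
        return [left, right]

    def candidate_columns(binary_image, area):
        # X-direction histogram inside one presence area; columns with many
        # black pixels are candidate eye/eyebrow locations.
        y0, y1, x0, x1 = (max(0, v) for v in area)
        return binary_image[y0:y1, x0:x1].sum(axis=0)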

3. Analysis

Petitioner contends that Ando teaches every element of claim 39, and that Suenaga also teaches all of the elements except elements [a] and [g]. See Pet. 44–55. Petitioner nonetheless contends that Ando and Suenaga are not cumulative, but rather “compliment” each other. Id. at 56. Petitioner argues that “while Ando likely renders Claim 39 of the ’518 Patent obvious in combination with the knowledge of a person of ordinary skill in the art [POSA], the combination of Ando with Suenaga provides a more complete disclosure.” Id. Similar to the discussion of the previous ground, supra Section II.B.3, it is unclear why Petitioner sought to rely upon two references to make a “more complete disclosure,” but our regulations require Petitioner to identify the basis of its challenges, and in this Petition the reliance on multiple references for the challenges to the same claim is excessive. See 37 C.F.R. § 42.104(b). The Petition asserts that Suenaga discloses use of X and Y histograms to increase detection accuracy. Pet. 56. Petitioner also refers to Suenaga’s algorithms using X and Y histograms as an additional technique that could be used to increase detection accuracy in Ando as rationale for the combination of the prior art. Id. at 43. Petitioner indicates that Ando alone “likely” renders claim 39 obvious; however, Suenaga more closely satisfies the claim elements relating to the use of histograms to identify features than Ando, that is, claim elements [e] and [f]. Accordingly, we decline to rely on both prior art references for the teachings of most of the elements of claim 39, and based on the Petition’s representations, we will proceed by analyzing Ando’s teachings for the majority
of elements of claim 39, except for elements [e] and [f], where we will also consider Suenaga’s teachings.

Petitioner argues that Ando teaches the detection of features of an eye by acquiring a pixelated image of a face. Pet. 44–45. Petitioner asserts that Ando’s identification of the boundaries of a head or a forehead is the “characteristic of the face other than the feature to be detected.” Id. at 45–46. Petitioner further contends that Ando identifies the feature to be detected using an anthropomorphic model based on the boundaries of the head or on the relative position of the forehead boundaries. Id. at 48–50. Ando’s disclosure of pixel selection in region Sd, or alternatively, of the pixels corresponding only to the pupil, is relied upon for the teaching of element [d] of claim 39. Id. at 51–52. Petitioner also relies upon Suenaga’s disclosures of X- and Y-histograms of selected pixels for the histogram formation step of the claim. Id. at 53. The Petition asserts that Suenaga teaches the histogram analysis step, with its disclosure of analysis to determine whether an eye is open or closed. Id. at 54–55. Finally, Ando’s disclosures are relied upon for the teachings of identification of the feature to be detected, where the feature is an iris, pupil, or cornea. Id. at 53–55.

Petitioner asserts that a person of ordinary skill in the art would have been motivated to combine Ando and Suenaga because both references are directed to similar systems that operate in a similar manner to solve the same problem. Pet. 41–42. Petitioner also argues that Ando recognizes that it is difficult to perform accurate detection with nonuniform illumination and driver position changes. Id. at 42. As such, Petitioner alleges that a person of ordinary skill in the art would have been motivated to look to improvements for Ando, such as the disclosures of Suenaga, for features like
the use of X and Y histograms to distinguish between the eyebrow and the eye and to identify whether the eye is open or closed. Id. at 42–43. Additionally, Petitioner asserts that a person of ordinary skill would have expected the combination of the references to yield predictable results. Id. at 43.

We have reviewed Petitioner’s evidence and explanations for the alleged teaching of the elements of claim 39, and are persuaded that the evidence provided is sufficient. Based on the current record, Petitioner also provides a sufficiently persuasive rationale for combining the teachings of Ando and Suenaga for purposes of this Decision.

Patent Owner argues that Ando does not disclose forming a histogram of pixels that are selected corresponding to the iris, pupil, or cornea. Prelim. Resp. 33. Patent Owner contends that the identification of the dimensions of a “black pixels” region is used to judge whether an eye is present, and because dimensions, and not a histogram, are used to identify eye features, Ando fails to teach elements [d], [e], [f], and [g] of claim 39. Id. at 33–34 (citing, e.g., Ex. 1009, 18:61–20:13). Patent Owner also asserts that Ando discloses a “differential gradation histogram” for the region Sd that is used to determine the grey level value used to separate “black” and “white” pixels; it is the result of this analysis that is used to generate the “black” pixels used for feature identification, and therefore there is no teaching of elements [d] and [e] of claim 39. Id. at 34 (citing Ex. 1009, 18:11–57).

Patent Owner also alleges that Suenaga does not teach element [b] of claim 39 because it does not disclose the use of “a distinguishing element of a face, such as the nose, nostril, ears, eyebrows, mouth, iris, pupil, cornea, etc. other than the feature to be detected.” Prelim. Resp. 35. Patent Owner
argues that Suenaga discloses eye detection by finding the centroid of thresholded pixels, but this is not a distinguishing element. Id. More specifically, Patent Owner contends that Suenaga’s Figure 61, relied upon by Petitioner, does not disclose identification of a nose or other distinguishing element, but rather is a “barycenter or centroid” based on a “black blob” created by a predetermined threshold using a binarized image. Id. at 37–38.

Patent Owner additionally alleges that even if the combination of Ando and Suenaga disclo
