
UNITED STATES PATENT AND TRADEMARK OFFICE

____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

____________________

SAMSUNG ELECTRONICS CO., LTD.; AND
SAMSUNG ELECTRONICS AMERICA, INC.
Petitioner

v.

IMAGE PROCESSING TECHNOLOGIES, LLC
Patent Owner

____________________

Patent No. 6,717,518
____________________

PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 6,717,518

TABLE OF CONTENTS

I.    INTRODUCTION
II.   MANDATORY NOTICES UNDER 37 C.F.R. § 42.8
III.  PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)
IV.   GROUNDS FOR STANDING
V.    PRECISE RELIEF REQUESTED
VI.   LEGAL STANDARDS
      A.  Claim Construction
      B.  Level of Ordinary Skill In The Art
VII.  OVERVIEW OF THE ’518 PATENT
VIII. DETAILED EXPLANATION OF GROUNDS
      A.  Overview Of The Prior Art References
          1.  Martin Eriksson et al., Eye Tracking for Detection of Driver Fatigue, IEEE Conference on Intelligent Transportation Systems (Nov. 1997) (“Eriksson”) (Ex. 1005)
          2.  Luigi Stringa, Eyes Detection For Face Recognition, Applied Artificial Intelligence (1993) (“Stringa”) (Ex. 1006)
          3.  U.S. Patent No. 5,805,720, Facial Image Processing System (Filed Mar. 11, 1996) (“Suenaga”) (Ex. 1007)
          4.  U.S. Patent No. 5,008,946, System For Recognizing Image (Filed Sept. 9, 1988) (“Ando”) (Ex. 1009)
IX.   SPECIFIC EXPLANATION OF GROUNDS FOR INVALIDITY
      A.  Ground 1: Eriksson In View Of Stringa Renders Obvious Claim 39
          1.  Reasons To Combine Eriksson And Stringa
          2.  Claim 39
          3.  Eriksson and Stringa Are Not Cumulative
      B.  Ground 2: Ando In View Of Suenaga Renders Obvious Claim 39
          1.  Reasons To Combine Ando and Suenaga
          2.  Claim 39
          3.  Ando and Suenaga Are Not Cumulative
      C.  Ground 3: Ando In View Of Stringa Renders Obvious Claim 39
          1.  Reasons To Combine Ando and Stringa
          2.  Claim 39
          3.  Ando and Stringa Are Not Cumulative
X.    CONCLUSION

LIST OF EXHIBITS¹

1001   U.S. Patent No. 6,717,518 (“the ’518 Patent”)
1002   Declaration of Dr. John C. Hart
1003   Curriculum Vitae for Dr. John C. Hart
1004   Prosecution File History of U.S. Patent No. 6,717,518
1005   Martin Eriksson et al., Eye Tracking For Detection Of Driver Fatigue, IEEE Conference on Intelligent Transportation Systems (Nov. 1997) (“Eriksson”)
1006   Luigi Stringa, Eyes Detection For Face Recognition, Applied Artificial Intelligence (1993) (“Stringa”)
1007   U.S. Patent No. 5,805,720, Facial Image Processing System (Filed Mar. 11, 1996) (“Suenaga”)
1008   U.S. Patent No. 5,293,427, Eye Position Detecting System and Method Therefor (Filed Dec. 11, 1991) (“Ueno”)
1009   U.S. Patent No. 5,008,946, System For Recognizing Image (Filed Sept. 9, 1988) (“Ando”)
1010   Declaration of William Garrity from U.C. Davis Regarding Stringa
1011   Declaration of Dr. Umit Ozguner Regarding Eriksson

¹ Citations to non-patent publications are to the original page numbers of the publication, and citations to U.S. patents are to column:line number of the patents.
I. INTRODUCTION

Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Petitioner”) request inter partes review (“IPR”) of Claim 39 of U.S. Patent No. 6,717,518 (“the ’518 Patent”) (Ex. 1001), which Petitioner understands to be currently assigned to Image Processing Technologies, LLC (“Patent Owner”). This Petition presents three non-cumulative grounds of invalidity that the U.S. Patent and Trademark Office (“PTO”) did not consider during prosecution. Each of these grounds is likely to prevail; accordingly, this Petition should be granted on all grounds and the challenged claim should be cancelled.
II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8

Real Parties-in-Interest: Petitioner identifies the following real parties-in-interest: Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc.

Related Matters: Patent Owner has asserted the ’518 Patent against Petitioner in Image Processing Technologies LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG (E.D. Tex.). Patent Owner has also asserted U.S. Patent Nos. 6,959,293; 8,805,001; 8,983,134; 7,650,015; and 8,989,445 in the related action. Petitioner is concurrently filing additional IPR petitions for several of these asserted patents, and has previously filed the following IPR petitions:

• IPR2017-00355 against U.S. Patent 7,650,015
• IPR2017-00357 against U.S. Patent 8,989,445
• IPR2017-00336 against U.S. Patent 6,959,293
• IPR2017-00347 against U.S. Patent 8,805,001
• IPR2017-00353 against U.S. Patent 8,983,134

Lead and Back-Up Counsel:

• Lead Counsel: John Kappos (Reg. No. 37,861), O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660. (Telephone: 949-823-6900; Fax: 949-823-6994; Email: jkappos@omm.com.)

• Backup Counsel: Nicholas J. Whilt (Reg. No. 72,081), Brian M. Cook (Reg. No. 59,356), O’Melveny & Myers LLP, 400 S. Hope Street, Los Angeles, CA 90071. (Telephone: 213-430-6000; Fax: 213-430-6407; Email: nwhilt@omm.com, bcook@omm.com.)

Service Information: Samsung consents to electronic service by email to IPTSAMSUNGOMM@OMM.COM. Please address all postal and hand-delivery correspondence to lead counsel at O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660, with courtesy copies to the email address identified above.
III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)

The Office is authorized to charge $23,000 to Deposit Account No. 50-2862 for the fee set forth in 37 C.F.R. § 42.15(a), and any additional fees that might be due in connection with this Petition.
IV. GROUNDS FOR STANDING

Petitioner certifies that the ’518 Patent is available for IPR and Petitioner is not barred or estopped from requesting IPR on the grounds identified herein.
V. PRECISE RELIEF REQUESTED

Petitioner respectfully requests review and cancellation of Claim 39 of the ’518 Patent based on three grounds:

• Ground 1: Claim 39 is obvious under 35 U.S.C. § 103(a) over Eriksson in view of Stringa.

• Ground 2: Claim 39 is obvious under 35 U.S.C. § 103(a) over Ando in view of Suenaga.

• Ground 3: Claim 39 is obvious under 35 U.S.C. § 103(a) over Ando in view of Stringa.
VI. LEGAL STANDARDS

A. Claim Construction

In an inter partes review, “[a] claim in an unexpired patent shall be given its broadest reasonable construction in light of the specification of the patent in which it appears.” 37 C.F.R. § 42.100(b). The ’518 patent will not expire before a final written decision issues, and its claims should be given their broadest reasonable interpretation.² Petitioner submits that for purposes of this Petition, no special definition applies to any term of Claim 39, and the terms should be interpreted according to their ordinary and customary meaning. Ex. 1002, Hart Decl. ¶ 47.

² Because the claim construction standard in this proceeding differs from the standard applicable to a district court litigation, see In re Am. Acad. of Sci. Tech Ctr., 367 F.3d 1359, 1364, 1369 (Fed. Cir. 2004), Petitioner expressly reserves the right to argue in litigation a different construction for any term recited by the claims of the ’518 Patent.
B. Level of Ordinary Skill In The Art

One of ordinary skill in the art at the time of the alleged invention of the ’518 Patent would have had either (1) a Master’s Degree in Electrical Engineering or Computer Science or the equivalent, plus at least a year of experience in the field of image processing, image recognition, machine vision, or a related field, or (2) a Bachelor’s Degree in Electrical Engineering or Computer Science or the equivalent, plus at least three years of experience in the field of image processing, image recognition, machine vision, or a related field. Additional education could substitute for work experience and vice versa. Ex. 1002, ¶¶ 44–46.
VII. OVERVIEW OF THE ’518 PATENT

The ’518 Patent purports to disclose an application for the inventor’s previously patented “generic image processing system . . . ” (“GIPS”). Ex. 1001 at 2:1–5. Specifically, the ’518 Patent proposes applying GIPS to “detect the drowsiness of a person.” Id. at 2:28–29. The patent explains that drowsiness detection addresses the problem that “a significant number of highway accidents result from drivers becoming drowsy or falling asleep . . . .” Id. at 1:12–17. Drowsiness can be detected by the duration of blinks (i.e., longer blinks occur when a driver becomes drowsy). Id. at 1:18–24. Thus, the Patent proposes mounting a video camera in a car and detecting blink rates using GIPS. Id. at 6:28–56.
For example, when the driver enters the vehicle, GIPS could detect the driver by looking for pixels that are “moving in a lateral direction away from the driver’s door” and that have the “hue characteristics of skin.” Id. at 25:24–39; Ex. 1002, ¶ 33. Knowing a driver is present, GIPS then “detects the face of the driver in the video signal and eliminates from further processing those superfluous portions of the video signal above, below, and to the right and left of the head of the driver.” Ex. 1001 at 26:16–22. Specifically, the head is detected by looking for pixels with “selected characteristics,” such as pixels that appear to be moving or to have a skin color. Id. at 26:21–45. These pixels could then be loaded into several histograms (324x and 324y), as shown below:

[Ex. 1001, Fig. 24]
Thus, for example, the head (in the region V) could be detected in the figure above by looking for peaks in the histogram, which can indicate the edge of the face. Id. at 26:49–65. Alternatively, GIPS could search for groups of pixels with “low luminance levels” to identify “nostrils.” Id. at 29:18–29.

GIPS can then ignore the area in the frame outside of the face, and only continue with analyzing the face (V), which would be in the region Z bounded by Ya, Yb, Xc, and Xd in Figure 25, below. Id. at 26:66–27:10.

[Ex. 1001, Fig. 25]
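By way of illustration only, the following minimal Python sketch shows the kind of projection-histogram localization described above. The function, the `frac` threshold, and the binary-mask encoding are assumptions for illustration, not details drawn from the ’518 Patent.

```python
import numpy as np

def bounding_region(mask: np.ndarray, frac: float = 0.2):
    """Minimal sketch of projection-histogram localization (cf. Fig. 24).

    `mask` is a binary image whose nonzero pixels have the selected
    characteristics (e.g., movement or skin hue).  Summing the mask along
    each axis yields two histograms (like 324x and 324y); the span where
    each projection exceeds a fraction of its peak approximates the head's
    bounding box (Xc, Xd, Ya, Yb of region Z in Fig. 25).  The `frac`
    threshold is an illustrative assumption.
    """
    hist_x = mask.sum(axis=0)              # histogram over pixel columns
    hist_y = mask.sum(axis=1)              # histogram over pixel rows
    xs = np.flatnonzero(hist_x > frac * hist_x.max())
    ys = np.flatnonzero(hist_y > frac * hist_y.max())
    return xs[0], xs[-1], ys[0], ys[-1]    # (Xc, Xd, Ya, Yb)
```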
The patent calls the exclusion of the hashmarked background area in the frame “masking.” Id. at 26:66–27:1.

Next, GIPS “uses the usual anthropomorphic ratio between the zone of the eyes and the entire face for a human being” to obtain a mask for the eyes of the driver. Id. at 27:33–38. Use of an anthropomorphic model is explained to refer to using a “facial characteristic, e.g., the nose, ears, eyebrows, mouth, etc., and combinations thereof” or “the outline of the head of the driver” as a “starting point for locating the eyes.” Id. at 29:43–56; Ex. 1002, ¶ 37. The patent explains that the sub-area can also be “set using an anthropomorphic model, wherein the spatial relationship between the eyes and nose of humans is known.” Ex. 1001 at 30:43–45. Thus, using the anthropomorphic model, the patent proposes deriving the sub-area Zʹ from the larger face area Z, as indicated below. Id. at 27:31–38; Ex. 1002, ¶ 37.

[Ex. 1001, figure showing sub-area Zʹ within face area Z]
For example, the patent explains that the “nostrils 272” can be used to identify a “search box 276” around the “eye 274 of the driver,” as shown in Figure 32, using “an anthropomorphic model.” Ex. 1001 at 30:40–45.

[Ex. 1001, Fig. 32]
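As a purely illustrative sketch of how an anthropomorphic offset from the nostrils could define such a search box, consider the following. All ratios and names here are made-up placeholders, not values from the ’518 Patent.

```python
def eye_search_box(nostril_x: float, nostril_y: float, face_width: float):
    """Hypothetical anthropomorphic model: place a search box for one eye
    at a fixed offset above and to the side of the detected nostrils
    (cf. search box 276 around eye 274 in Fig. 32).  The ratios below are
    illustrative placeholders for a known eye/nose spatial relationship.
    """
    dx = 0.20 * face_width    # lateral offset from the nose axis
    dy = 0.25 * face_width    # vertical offset from nostrils up to the eye
    half = 0.12 * face_width  # half-width of the square search box
    cx, cy = nostril_x + dx, nostril_y - dy
    return (cx - half, cy - half, cx + half, cy + half)  # (x1, y1, x2, y2)
```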
Having reduced the area for processing to a smaller region that contains the eye, GIPS can then check for blinks by “analyzing the pixels within the area Zʹ to identify” blinking. Id. at 27:54–55, 31:3–9. The Patent proposes a variety of methods to identify blinking, such as (1) “analyzing the shape of the eye shadowing to identify shapes corresponding to openings and closings of the eye” (id. at 4:25–33, 31:10–17), (2) analyzing pixels in the eye area with “high speed vertical movement” and “the hue of skin” (id. at 27:56–57), or (3) analyzing pixels in the eye area that lack “the hue of skin” (id. at 27:62–65). Ex. 1002, ¶ 39. Figure 27, below, shows the use of histograms to analyze the pixels in the eye area—peaks can indicate whether the eye is open or closed. Ex. 1001 at 28:47–51.

[Ex. 1001, Fig. 27]
The patent proposes that these histograms can be created for each frame, and changes in the histograms over time can be analyzed to determine blink rates. Id. at 28:32–29:10; Ex. 1002, ¶ 40. For example, Figure 33 shows the histograms for an open eye (featuring large peaks), and Figure 34 shows the histograms for a closed eye (featuring small peaks):

[Ex. 1001, Fig. 33 (open eye); Ex. 1001, Fig. 34 (closed eye)]
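To illustrate how per-frame open/closed determinations of this kind could be turned into blink durations, a short sketch follows. The frame rate, names, and the thresholding that would produce the per-frame flags are assumptions, not details from the patent.

```python
def blink_durations(open_eye_flags, fps: float = 30.0):
    """Sketch: convert a per-frame open/closed sequence into blink lengths.

    `open_eye_flags` holds one boolean per video frame (True = eye open),
    as might result from comparing each frame's histogram peaks against a
    threshold.  Longer closures suggest drowsiness.
    """
    durations, run = [], 0
    for is_open in open_eye_flags:
        if not is_open:
            run += 1                     # eye still closed: extend the blink
        elif run:
            durations.append(run / fps)  # blink just ended: record its length
            run = 0
    if run:
        durations.append(run / fps)      # closure still in progress at the end
    return durations
```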
The patent also proposes searching for “characteristics indicative of an eye present in the search box,” such as “a moving eyelid, a pupil, iris or cornea, a shape corresponding to an eye, a shadow corresponding to an eye, or any other indicia indicative of an eye.” Id. at 30:56–59; Ex. 1002, ¶ 41. Thus, for example, Figure 36 “shows a sample histogram of a pupil 432,” formed by “detect[ing] pixels with very low luminance levels and high gloss that are characteristic of a pupil.” Ex. 1001 at 30:61–64.

[Ex. 1001, Fig. 36]
VIII. DETAILED EXPLANATION OF GROUNDS

A. Overview Of The Prior Art References

1. Martin Eriksson et al., Eye Tracking for Detection of Driver Fatigue, IEEE Conference on Intelligent Transportation Systems (Nov. 1997) (“Eriksson”) (Ex. 1005)
In the 1990s, the detection of driver fatigue was the subject of government funding by institutions such as the Minnesota Department of Transportation, the National Science Foundation, and the Center for Transportation Studies. See, e.g., Ex. 1005 at 319. Pursuant to that funding, Martin Eriksson and Professor Nikolaos Papanikolopoulos at the University of Minnesota developed a system to detect driver fatigue that is very similar to the process of Claim 39 of the ’518 Patent. IEEE made Eriksson publicly available at an IEEE conference from November 9–12, 1997. See Ex. 1011 at ¶¶ 2–4. Thus, Eriksson is prior art at least under pre-AIA 35 U.S.C. § 102(a) and (b) and is a statutory bar under pre-AIA 35 U.S.C. § 102(b).
Eriksson “describe[s] a system that locates and tracks the eyes of a driver” for the “purpose of . . . detect[ing] driver fatigue.” Ex. 1005 at 314. Eriksson proposes mounting “a small camera inside the car” to “monitor the face of the driver and look for eye movements which indicate that the driver is no longer in condition to drive.” Id. at 314. Eriksson notes that “[a]s the driver becomes more fatigued, we expect the eye blinks to last longer.” Id. at 317. Thus, Eriksson proposes a system for detecting the driver’s pupil—when the pupil is detected, the eye is open, and when the pupil is not detected, the eye is closed. Id. at 318.

Eriksson determines the location of the eyes in four steps. Id. at 315. The first step is “localization of the face.” Id. Eriksson explains that the face is localized using a “symmetry histogram.” Id.

[Ex. 1005, symmetry histogram figure]
Eriksson calculates a “symmetry-value” for each pixel-column in order to find the center of the face. Id. at 316. The pixel column with the lowest symmetry value will be the center of the face. Id.; Ex. 1002, ¶ 50. Then, having identified the center of the face, Eriksson narrows the search area to a smaller area that includes the eyes: “the search-space is . . . limited to the area around this line, which reduces the probability of having distracting features in the background.” Ex. 1005 at 316.
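A minimal sketch of a column-wise symmetry search of this general kind appears below. The mismatch metric and the `half_width` strip size are assumptions for illustration; Eriksson's paper defines its own symmetry-value.

```python
import numpy as np

def face_center_column(gray: np.ndarray, half_width: int = 40):
    """Sketch: find the column about which the image is most symmetric.

    For each candidate column, compare the strip of pixels to its left
    against the mirrored strip to its right; the column with the smallest
    mismatch (the lowest "symmetry value") approximates the face's axis
    of symmetry.
    """
    h, w = gray.shape
    best_col, best_val = None, np.inf
    for c in range(half_width, w - half_width):
        left = gray[:, c - half_width:c]
        right = np.fliplr(gray[:, c + 1:c + 1 + half_width])
        val = np.abs(left.astype(float) - right.astype(float)).sum()
        if val < best_val:                 # lower mismatch = more symmetric
            best_col, best_val = c, val
    return best_col
```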
The second step in localizing the face is computing the vertical location of the eyes. Id. To do this, Eriksson creates a gradient histogram of the sub-area of the image identified in the first step, as illustrated in Figure 2:

[Ex. 1005, Fig. 2]
Eriksson “consider[s] the best three peaks in” the histogram (which in the above example appear to correspond to the eyes, the nose, and the mouth) as potential vertical locations for the eyes. Ex. 1005 at 316.

The third step in localizing the eyes is finding “the exact location of the eyes.” Id. at 316. Having limited the search for the eyes to the horizontal region determined in the first step, and the three possible vertical locations determined in the second step, Eriksson finds the eyes by searching for “intensity-valleys” in the image and also using “general constraints, such [as] that both eyes must be located ‘fairly close’ to the center of the face.” Id.
The fourth step in localizing the eyes is estimating the position of the iris. Eriksson uses an “eye-template,” shown below, that, when laid over the picture, indicates a good match if there are “many dark pixels in the area inside the inner circle, and many bright pixels in the area between the two circles.” Id. at 316–17. When a match occurs, Eriksson knows “the inner circle is centered on the iris and the outside circle covers the sclera.” Id. at 317.

[Ex. 1005, eye-template figure]
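For illustration, a two-circle template score of the kind described could be sketched as follows. The radii and scoring formula are assumptions standing in for Eriksson's template, not its actual parameters.

```python
import numpy as np

def template_score(gray: np.ndarray, cx: int, cy: int,
                   r_inner: int = 5, r_outer: int = 11) -> float:
    """Sketch of a two-circle eye-template score: a high score means many
    dark pixels inside the inner circle (iris) and many bright pixels in
    the ring between the circles (sclera).
    """
    ys, xs = np.ogrid[:gray.shape[0], :gray.shape[1]]
    d2 = (xs - cx) ** 2 + (ys - cy) ** 2        # squared distance to center
    inner = d2 <= r_inner ** 2
    ring = (d2 > r_inner ** 2) & (d2 <= r_outer ** 2)
    # bright ring minus dark interior: larger = better template match
    return float(gray[ring].mean() - gray[inner].mean())
```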
Having found the eye, Eriksson next generates a horizontal intensity histogram across the pupil. Id. at 318. Eriksson notes that the pupil and iris are dark and the sclera is light. Id. Thus, the histogram of an open eye is markedly different from the histogram of a closed eye:

[Ex. 1005, intensity histograms of an open eye and a closed eye]
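A minimal sketch of how such an intensity profile could be classified follows. The thresholds are illustrative assumptions, not values from Eriksson.

```python
def eye_open(row_intensities, dark_thresh: int = 60,
             min_dark_frac: float = 0.15) -> bool:
    """Sketch: classify open vs. closed from a horizontal intensity
    profile across the pupil.  An open eye shows a run of dark pixels
    (pupil/iris) flanked by bright sclera; a closed eye shows mostly
    skin-level intensities, so few pixels fall below the dark threshold.
    """
    dark = sum(1 for v in row_intensities if v < dark_thresh)
    return dark / len(row_intensities) >= min_dark_frac
```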
Finally, having found the iris, pupil, and sclera, and having determined whether the eye is open or closed in each frame, Eriksson is able to measure blink rates over time and detect drowsy drivers. Id. at 318.

2. Luigi Stringa, Eyes Detection For Face Recognition, Applied Artificial Intelligence (1993) (“Stringa”) (Ex. 1006)
Stringa is a printed publication published and made publicly available in 1993. See Ex. 1010. Thus, Stringa is prior art to the ’518 Patent under at least 35 U.S.C. § 102(a) and (b).

Stringa discloses an image processing normalization algorithm for improving previously developed algorithms for face detection. Ex. 1006 at 365. Stringa explains that for face recognition systems, sometimes captured faces are not looking “straight into the camera,” and thus “some adjustment and normalization is necessary before the system can proceed to the recognition step.” Id. at 366. As part of this normalization procedure, Stringa discloses detecting the pupils of the face in a manner similar to the ’518 Patent, especially with respect to Claim 39’s use of an anthropometric model.
Stringa explains that its approach to “locating the position of the eyes” is “based on the exploitation of (a priori) anthropometric information combined with the analysis of suitable grey-level distributions, allowing direct localization of both eyes.” Ex. 1006 at 369. Stringa explains that

    there exists a sort of ‘grammer’ of facial structures that provides some very basic a priori information used in the recognition of faces. Every human face presents a reasonable symmetry, and the knowledge of the relative position of the main facial features (nose between eyes and over mouth, etc.) proves very useful to discriminate among various hypotheses. These guidelines can be derived from anthropometric data corresponding to an average face and refined through the analysis of real faces. Some typical examples . . . are:

    • the eyes are located halfway between the top of the head and the bottom of the chin;

    • the eyes are about one eye width apart;

    • the bottom of the nose is halfway between the eyebrows and the chin; . . . .

Ex. 1006 at 369.
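Purely to illustrate how a priori rules of this kind can bound an eye search, the sketch below encodes the first two quoted guidelines. The eye-width estimate (one fifth of the face width) and the zone height are assumptions, not Stringa's parameters.

```python
def eye_expectation_zones(head_top: float, chin: float,
                          face_left: float, face_right: float):
    """Sketch encoding the quoted anthropometric guidelines: the eyes sit
    halfway between the top of the head and the chin, and are about one
    eye width apart.
    """
    eye_y = (head_top + chin) / 2.0           # halfway down the face
    face_w = face_right - face_left
    eye_w = face_w / 5.0                      # assumed eye width
    center_x = (face_left + face_right) / 2.0
    half_h = eye_w / 2.0
    # the boxes leave a one-eye-width gap between the two inner corners
    left = (center_x - 1.5 * eye_w, eye_y - half_h,
            center_x - 0.5 * eye_w, eye_y + half_h)
    right = (center_x + 0.5 * eye_w, eye_y - half_h,
             center_x + 1.5 * eye_w, eye_y + half_h)
    return left, right                        # (x1, y1, x2, y2) boxes
```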
Stringa’s eye localization algorithm first detects the line that connects the eyes, then the side limits of the face and the nose axis. Id. at 370. To obtain the pupil location, Stringa first uses “the approximate location of the eye-connecting line, of the face sides, and of the nose axis” to estimate “the expectation zones of the two eyes . . . with reasonable accuracy.” Id. at 376. Stringa illustrates “the expectation zones for the two eyes” in Figure 9:

[Ex. 1006, Fig. 9]
In the expectation zones for the two eyes, “the search of the pupil is based on the analysis of the horizontal grey-level distribution” (i.e., a histogram). Id. at 377; Ex. 1002, ¶ 60. Stringa uses the histogram and further mathematical calculations to produce a graph that identifies the pupil (Ex. 1006 at 377):

[Ex. 1006, pupil-localization graph]

3. U.S. Patent No. 5,805,720, Facial Image Processing System (Filed Mar. 11, 1996) (“Suenaga”) (Ex. 1007)
Suenaga is a U.S. Patent filed on March 11, 1996 and issued on Sept. 28, 1998. Ex. 1007. Thus, Suenaga is prior art to the ’518 Patent under at least 35 U.S.C. § 102(a), (b) & (e).

Suenaga discloses a “facial image processing system for detecting . . . a dozing or drowsy condition of an automobile driver . . . from the opened and closed conditions of his eyes.” Ex. 1007 at 1:6–10. Suenaga uses a video camera to obtain images of a face. Id. at 2:44–49; 6:25–35.

Suenaga discloses many embodiments. Embodiment 31 explains that boxes 11, 12, and 13 (in Fig. 60, below) in the flowchart for Embodiment 31 are the same as those steps in Embodiment 1. Id. at 23:19–21. Embodiment 1 explains that in boxes 11, 12, and 13, Suenaga converts the image from the camera into a binary image (i.e., each pixel is assigned to be a one or a zero). Id. at 6:41–51.

[Ex. 1007, Fig. 60]
For box 15A, the evaluation function calculation means “first finds the barycenter or centroid 31 from the average of the coordinates of black pixels in a binary image 30” (id. at 23:21–24):

[Ex. 1007, Fig. 61]
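The barycenter step translates directly into a short sketch; only the encoding of black pixels as nonzero entries is an assumption here.

```python
import numpy as np

def barycenter(binary_img: np.ndarray):
    """Sketch of box 15A's first step: average the coordinates of the
    black pixels in a binary image (cf. centroid 31 in Fig. 61).  Black
    pixels are assumed to be the nonzero entries; Suenaga's binary
    encoding may differ.
    """
    ys, xs = np.nonzero(binary_img)
    return float(xs.mean()), float(ys.mean())  # (x, y) of the centroid
```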
Next, “rectangular areas existing in the predetermined ranges in the X-direction on the left and right sides of this barycenter or centroid 31 are set as eye presence areas 32.” Id. at 23:24–27. Then, “in the eye presence area 32, X-histograms 33 (namely, 33a and 33b) are generated.” Id. at 23:27–29. Next, “zonal regions are set on the basis of the X-histograms. Furthermore, Y-histograms 34 (namely, 34a and 34b) . . . are produced.” Id. at 23:29–34. Additionally, “hatched candidate areas 35 (namely 35a and 35b) for an eye presence area are extracted.” Id. at 23:34–35.
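As an illustrative sketch of the first of these steps, the eye presence areas could be placed as rectangles at fixed offsets from the barycenter. The pixel offsets below are placeholders for Suenaga's "predetermined ranges," which the patent does not tie to specific numbers here.

```python
def eye_presence_areas(cx: float, cy: float,
                       dx_min: float = 30, dx_max: float = 110,
                       half_h: float = 40):
    """Sketch: set rectangular eye presence areas (cf. areas 32) at
    predetermined ranges in the X-direction on the left and right sides
    of the barycenter (cx, cy); the X- and Y-histograms would then be
    generated within these rectangles.
    """
    left = (cx - dx_max, cy - half_h, cx - dx_min, cy + half_h)
    right = (cx + dx_min, cy - half_h, cx + dx_max, cy + half_h)
    return left, right  # (x1, y1, x2, y2) boxes
```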
Next, Embodiment 31 calculates the evaluation function as described in the previous embodiments. Id. at 23:36–40. In Embodiment 1, “the evaluation function calculation means 15 calculates an evaluation function, which” determines the “shape of the eye.” Id. at 6:61–65.
Figure 2, below, illustrates how Suenaga determines whether the eye is open or closed in Embodiments 1 and 31. Ex. 1007 at 7:4–24, 23:52–54. Suenaga examines the shape of the eye by analyzing the histograms of the eye pixels. Id. at 7:4–65.

[Ex. 1007, Fig. 2]
Suenaga calculates a “value K” based on the lines 9 and 10 calculated from the shape of the histogram curve. Id. at 7:28–65. Suenaga determines whether the eye is open or closed based on the relationship of K to a threshold value, KB. Id. As shown in Figure 2, this evaluation is performed over time. Id. at 7:4–24 (“This diagram illustrates the relation among the lapse of time (from a moment TA, at which the eye is opened, to another moment TC, at which the eye is closed . . . ).”).
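The threshold comparison reduces to a one-line sketch; which side of KB signifies "open" is an assumption here, since the petition describes only the relationship of K to KB.

```python
def eye_state(K: float, KB: float) -> str:
    """Sketch of Suenaga's open/closed judgment: compare the evaluation
    value K (derived from lines 9 and 10 fit to the histogram curve,
    Ex. 1007 at 7:28-65) against the threshold KB, repeated per frame
    over time as in Fig. 2.
    """
    return "open" if K >= KB else "closed"
```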
4. U.S. Patent No. 5,008,946, System For Recognizing Image (Filed Sept. 9, 1988) (“Ando”) (Ex. 1009)

Ando was filed in September 1988 and issued in April 1991. Thus, Ando is prior art to the ’518 Patent under at least 35 U.S.C. § 102(a), (b) & (e).
Ando discloses a system for detecting certain portions of an image, such as “the driver’s eyes and mouth.” Ex. 1009 at 2:1–4. The system uses information about the driver’s eyes, such as the position of the eyes and whether they are open or closed, to allow the driver to “control electrical devices,” such as the windows and radio, “in a noncontact manner.” Id. at 2:18–20, Fig. 1a. Ando’s hands-free control system increases the safety and comfort of driving. Id. at 2:58–59. The system uses a video camera 3 mounted on the dashboard (id. at 6:60–7:3):

[Ex. 1009, Fig. 1b (showing video camera 3 on dashboard)]
Ando includes “a window setting means for setting a region narrower than the image produced by the camera means.” Id. at 2:25–41. “Once the position of the certain portions, i.e., the eyes and mouth, are detected, the scan made to detect the eyes and mouth is limited to the narrower region [] so they can be detected quickly.” Id. Ando “can detect the driver’s head, face, and pupils with high accuracy.” Id. at 3:30–31.
Ando’s algorithm has two main phases, described in more detail below. The first phase operates on the first frame and determines some parameters for tracking the pupils in later frames. Id. at 35:14–36:31. The second phase operates on subsequent frames and tracks the pupils using the information calculated from the first frame. Id. at 36:32–44. If detection fails in the subsequent frames, the first phase is repeated. Id. at 36:44–51.
a) Phase One

In phase one, Ando identifies likely locations for the head HDD, then the forehead BRD, the right eye RED, the left eye LED, the eyebrows 35, the mouth MOD, and the nose NOD. Id. at Fig. 2. Ando uses histograms to calculate thresholds for distinguishing those face elements from other elements in the frame. Id. at Figs. 5b (calculating threshold for head detection), 7b (calculating threshold for forehead detection), 8b (calculating pupil threshold 25). Ando also identifies windows, or portions of the frame, where those elements are located within the frame. Id. at Figs. 2 (identifying head and forehead windows in 34 & 39), 8b (identifying pupil region Sd 113).
For example, to find the head, Ando uses a histogram of the gray level of each pixel. Id. at 14:23–16:32, Figs. 5a–5c. As part of the head detection process, Ando calculates the width of the head AW. Id. at 16:7–13. Similarly, Ando finds the forehead using another gray level histogram. Id. at 16:44–57, Figs. 7a–7c. For the forehead, Ando calculates the upper boundary HTY, the right end boundary HRX, the left end boundary HLX, and the width HW. Id. at 17:52–54, Fig. 13d.
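For illustration, threshold selection from a gray-level histogram of this kind might be sketched as follows. The valley-between-peaks heuristic and the bin count are assumptions standing in for Ando's threshold calculations, not his circuit.

```python
import numpy as np

def histogram_threshold(gray: np.ndarray, bins: int = 64) -> float:
    """Sketch of picking a gray-level threshold (cf. Figs. 5b, 7b, 8b):
    take the deepest valley between the two strongest, well-separated
    peaks of the gray-level histogram.
    """
    hist, edges = np.histogram(gray, bins=bins)
    p1 = int(np.argmax(hist))                  # strongest peak
    masked = hist.copy()
    masked[max(0, p1 - bins // 8):p1 + bins // 8] = 0  # suppress near peak 1
    p2 = int(np.argmax(masked))                # second, separated peak
    lo, hi = sorted((p1, p2))
    valley = lo + int(np.argmin(hist[lo:hi + 1]))      # deepest bin between
    return float(edges[valley])                # gray level at the valley
```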
Ando also detects the right and left eyes. Id. at 17:63–21:53. As part of eye detection, Ando defines a portion of the image Sd which is calculated based on the forehead boundaries (id. at 18:11–14) and the expected position of the eye using known ratios for a human face (i.e., an anthropomorphic model), as shown in Figure 13d:

[Ex. 1009, Fig. 13d (showing portion Sd)]
Having defined the portion of the image Sd, Ando calculates a gray level histogram for that portion in order to identify the pupils specifically. Id. at 18:15–20:52, Figs. 8a–8d. Then, Ando detects the mouth and the nose using similar processes. Id. at Fig. 2 (MOD and NOD), 22:8–62.
Having identified likely locations for the face elements, Ando next conducts a face verification process FAD to verify whether the relative locations of those face elements imply they are indeed part of a face, and are thus the elements they appear to be. Specifically, Ando uses a “similarity degree-detecting circuit” to check whether the positions and locations of those elements indicate that they are similar to “reference values” for a face. Id. at 12:20–26, 4:66–44; 22:63–27:35, 39:49–41:14, Figs. 9a–9d, Table 1. As part of this process, Ando uses an anthropomorphic model created by measuring the faces of his acquaintances and himself. Id. at 22:67–23:32.
Ando’s “similarity degree-detecting circuit” measures “the degrees of similarity of the detected elements to the elements of [a] reference image.” Id. at 5:19–29. The reference image consists of “statistical values” describing the “shapes and relative positions” of “ordinary persons.” Id. at 39:58–63. If the image has face elements that are similar in shape and position to those elements in the face of an ordinary person, Ando determines that the image contains a face and that the identified face elements are located where they were calculated to be. Id. at 12:23–31. Upon positive verification, Ando then proceeds to phase 2. Id. at 12:27–32.
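A minimal sketch of such a similarity check against reference statistics follows. The feature names, the use of ratios, and the tolerance are illustrative assumptions, not Ando's circuit or his measured reference values.

```python
def face_verified(measured: dict, reference: dict,
                  tolerance: float = 0.25) -> bool:
    """Sketch of a similarity-degree check: compare measured face-element
    ratios against reference statistics for "ordinary persons."  The image
    is accepted as a face only if every element stays within a tolerance
    band around its reference value.
    """
    for key, ref in reference.items():
        if abs(measured[key] - ref) > tolerance * ref:
            return False  # an element is too far from the reference value
    return True

# Hypothetical usage with made-up ratios:
# face_verified({"eye_gap/head_w": 0.42}, {"eye_gap/head_w": 0.40})
```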
b) Phase Two

In phase two (summarized in boxes 40–54 in Figure 2), Ando retrieves a new frame from the camera and uses the thresholds and windows calculated in phase one to search for the pupils 44 and the mouth 51. Id. at 12:31–33. After finding the pupils, Ando checks whether the eyes are open or closed and which direction the pupil is looking 49. Id. at 12:33–35. Specifically, Ando is “equipped with a state change-detecting means for detecting the states of the eyes and mouth at successive instants in time to detect the changes in the states.” Id. at 2:42–46. Thus, “when the states of the monitored eyes and mo
