UNITED STATES PATENT AND TRADEMARK OFFICE

____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

____________________

SAMSUNG ELECTRONICS CO., LTD.; and
SAMSUNG ELECTRONICS AMERICA, INC.
Petitioners

v.

IMAGE PROCESSING TECHNOLOGIES, LLC
Patent Owner

____________________

Patent No. 6,717,518
____________________

DECLARATION OF DR. JOHN C. HART
IN SUPPORT OF PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 6,717,518

Page 1 of 78

SAMSUNG EXHIBIT 1002
Samsung v. Image Processing Techs.
Declaration of Dr. John C. Hart
Inter Partes Review of U.S. Patent No. 6,717,518

TABLE OF CONTENTS

I.    INTRODUCTION .............................................................................................................. 1
II.   BACKGROUND AND EXPERIENCE ............................................................................. 1
      A.  Qualifications .......................................................................................................... 1
      B.  Previous Testimony ................................................................................................ 4
III.  TECHNOLOGICAL BACKGROUND .............................................................................. 5
IV.   THE ’518 PATENT .......................................................................................................... 11
V.    SUMMARY OF OPINIONS ............................................................................................ 17
VI.   LEVEL OF ORDINARY SKILL IN THE ART .............................................................. 18
VII.  CLAIM CONSTRUCTION .............................................................................................. 19
VIII. THE PRIOR ART TEACHES OR SUGGESTS EACH LIMITATION OF
      CLAIM 39 OF THE ’518 PATENT ................................................................................. 20
      A.  Overview Of The Prior Art References ................................................................ 20
          1.  Martin Eriksson et al., Eye Tracking For Detection Of Driver
              Fatigue, IEEE Conference on Intelligent Transportation Systems
              (Nov. 1997) (“Eriksson”) (Ex. 1005) ........................................................ 20
          2.  Luigi Stringa, Eyes Detection For Face Recognition, Applied
              Artificial Intelligence (1993) (“Stringa”) (Ex. 1006) ............................... 23
          3.  U.S. Patent No. 5,805,720, Facial Image Processing System (Filed
              Mar. 11, 1996) (“Suenaga”) (Ex. 1007) .................................................... 26
          4.  U.S. Patent No. 5,008,946, System For Recognizing Image (Filed
              Sept. 9, 1988) (“Ando”) (Ex. 1009) .......................................................... 30
      B.  Ground 1: Eriksson In View Of Stringa Teaches or Suggests Every
          Limitation of Claim 39 .......................................................................................... 35
          1.  Reasons To Combine Eriksson And Stringa ............................................. 35
          2.  Claim 39 .................................................................................................... 37
      C.  Ground 2: Ando In View Of Suenaga Teaches or Suggests Every
          Limitation of Claim 39 .......................................................................................... 48
          1.  Reasons To Combine Ando And Suenaga ................................................ 48
          2.  Claim 39 .................................................................................................... 51
      D.  Ground 3: Ando In View Of Stringa Teaches or Suggests Every
          Limitation of Claim 39 .......................................................................................... 62
          1.  Reasons To Combine Ando And Stringa .................................................. 62
          2.  Claim 39 .................................................................................................... 64
IX.   CONCLUSION ................................................................................................................. 75
I, John C. Hart, declare as follows:

1.

I. INTRODUCTION

2. I have been retained by Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Petitioner”) as an independent expert consultant in this proceeding before the United States Patent and Trademark Office (“PTO”).

3. I have been asked to consider whether certain references disclose, teach, or suggest the limitations recited in Claim 39 (the “Challenged Claim”) of U.S. Patent No. 6,717,518 (“the ’518 Patent”) (Ex. 1001), which I understand is allegedly owned by Image Processing Technologies, LLC (“Patent Owner”). My opinions and the bases for my opinions are set forth below.

4. I am being compensated at my ordinary and customary consulting rate for my work.

5. My compensation is in no way contingent on the nature of my findings, the presentation of my findings in testimony, or the outcome of this or any other proceeding. I have no other interest in this proceeding.

II. BACKGROUND AND EXPERIENCE

A. Qualifications

6. I have more than 25 years of experience in computer graphics and image processing technologies. In particular, I have devoted much of my career to researching and designing graphics hardware and systems for a wide range of applications.
7. My research has resulted in the publication of more than 80 peer-reviewed scientific articles and more than 50 invited papers and talks in the area of computer graphics and image processing.

8. I have authored or co-authored several publications that are directly related to target identification and tracking in image processing systems. Some recent publications include:

• P.R. Khorrami, V.V. Le, J.C. Hart, T.S. Huang. A System for Monitoring the Engagement of Remote Online Students using Eye Gaze Estimation. Proc. IEEE ICME Workshop on Emerging Multimedia Systems and Applications, July 2014.

• V. Lu, I. Endres, M. Stroila and J.C. Hart. Accelerating Arrays of Linear Classifiers Using Approximate Range Queries. Proc. Winter Conference on Applications of Computer Vision, Mar. 2014.

• M. Kamali, E. Ofek, F. Iandola, I. Omer, J.C. Hart. Linear Clutter Removal from Urban Panoramas. Proc. International Symposium on Visual Computing, Sep. 2011.

9. From 2008–2012, as a Co-PI of the $18M Intel/Microsoft Universal Parallelism Computing Research Center at the University of Illinois, I led the AvaScholar project for visual processing of images that included face identification, tracking, and image histograms.
10. I am a co-inventor of U.S. Patent No. 7,365,744.

11. I have served as the Director for Graduate Studies for the Department of Computer Science, an Associate Dean for the Graduate College, and I am currently serving as the Executive Associate Dean of the Graduate College at the University of Illinois. I am also a professor in the Department of Computer Science at the University of Illinois, where I have served on the faculty since August 2000. As a professor I have taught classes on image processing and graphics technology and have conducted research into specific applications of these technologies.

12. From 1992 to 2000, I worked first as an Assistant Professor and then as an Associate Professor in the School of Electrical Engineering and Computer Science at Washington State University.

13. From 1991 to 1992, I was a Postdoctoral Research Associate at the Electronic Visualization Laboratory at the University of Illinois at Chicago, and at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

14. I earned a Doctor of Philosophy in Electrical Engineering and Computer Science from the University of Illinois at Chicago in 1991.

15. I earned a Master’s Degree in Electrical Engineering and Computer Science from the University of Illinois at Chicago in 1989.

16. I earned a Bachelor of Science in Computer Science from Aurora University in 1987.

17. I have been an expert in the field of graphics and image processing since prior to 1996. I am qualified to provide an opinion as to what a person of ordinary skill in the art (“POSA”) would have understood, known, or concluded as of 1996.

18. Additional qualifications are detailed in my curriculum vitae, which I understand has been submitted as Exhibit 1003 in this proceeding.
B. Previous Testimony

19. In the previous five years, I have testified as an expert at trial or by deposition or have submitted declarations in the following cases:

20. Certain Computing or Graphics Systems, Components Thereof, and Vehicles Containing Same, Inv. No. 337-TA-984 and Certain Consumer Electronics with Display and Processing Capabilities, Inv. No. 337-TA-884.

21. ZiiLabs Inc., Ltd v. Samsung Electronics Co. Ltd. et al., No. 2:14-cv-00203 (E.D. Tex. Feb. 4, 2016).

22. I have also submitted Declarations in support of Petitions for Inter Partes Review in the following proceedings:

• IPR2017-00355 against U.S. Patent 7,650,015

• IPR2017-00357 against U.S. Patent 8,989,445

• IPR2017-00336 against U.S. Patent 6,959,293

• IPR2017-00347 against U.S. Patent 8,805,001

• IPR2017-00353 against U.S. Patent 8,983,134
III. TECHNOLOGICAL BACKGROUND

23. Image processing systems have long used histograms as a mathematical tool to identify and track image features and to adjust image properties. The use of histograms to identify and track image features dates back to well before 1997. D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition - A Survey,” Pattern Recognition, vol. 29, no. 4, 1996, pp. 641–662 (Ex. 1009) (citing M. H. Glauberman, “Character recognition for business machines,” Electronics, vol. 29, pp. 132–136, Feb. 1956 (Ex. 1010)).

24. A digital image is represented by a number of picture elements, or pixels, where each pixel has certain properties, such as brightness, color, position, velocity, etc., which may be referred to as domains. For each pixel property or domain, a histogram may be formed. A histogram is a type of statistical tool. In image processing, histograms are often used to count the number of pixels in an image in a certain domain of the pixel. Histograms have multiple bins, where each bin in the histogram counts the pixels that fall within a range for that domain. For example, for the continuous variable of luminance (also called brightness), the luminance value for each pixel can be sampled by a camera and then digitized and represented by an 8-bit value. Then, those luminance values could be loaded into a luminance histogram. The histogram would have one bin for each range of luminance values, and each bin would count the number of pixels in the image that fall within that luminance value range. As shown below, a luminance histogram may reveal certain properties of an image, such as whether it is properly exposed, based on whether an excessive number of pixels fall on the dark end or light end of the luminance range.

[Figure]
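For illustration, the binning just described can be sketched in a few lines of Python. This is only a sketch of the general technique, not any party's implementation; the 16-bin default and the sample pixel values are my assumptions.

```python
# Minimal sketch: bin 8-bit luminance values (0-255) into a fixed number of bins.
def luminance_histogram(pixels, bins=16):
    bin_width = 256 // bins          # each bin covers a contiguous luminance range
    counts = [0] * bins
    for value in pixels:             # value assumed to be an 8-bit sample, 0..255
        counts[value // bin_width] += 1
    return counts

# A mostly dark image piles its counts into the low bins.
print(luminance_histogram([0, 10, 15, 200, 255], bins=4))  # [3, 0, 0, 2]
```

A skewed histogram of this kind (most counts at one end) is what would flag an under- or over-exposed image.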
25. Histograms of other pixel properties can also be formed. For example, the figure below illustrates two histograms formed by counting the number of black pixels having each X-coordinate value (i.e., the X-coordinate domain) and the number having each Y-coordinate value (i.e., the Y-coordinate domain).

[Figure]

26. Such histograms are sometimes called “projection histograms” because they represent the image projected onto each axis. In the example above, the image was pure black and white, but projection histograms of a greyscale image can also be formed in a similar manner by defining a luminance threshold and projecting, for example, only those pixels that have a luminance value lower than 100.
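The thresholded projection just described can be sketched as follows. The 100-luminance threshold comes from the text above; the tiny sample image is a hypothetical I use only for illustration.

```python
# Sketch: project pixels darker than a threshold onto the X and Y axes.
def projection_histograms(image, threshold=100):
    """image is a 2-D list of 8-bit luminance values (rows of pixels)."""
    height, width = len(image), len(image[0])
    x_hist = [0] * width
    y_hist = [0] * height
    for y in range(height):
        for x in range(width):
            if image[y][x] < threshold:   # keep only sufficiently dark pixels
                x_hist[x] += 1
                y_hist[y] += 1
    return x_hist, y_hist

img = [[255, 30, 255],
       [255, 30, 255],
       [255, 255, 255]]
print(projection_histograms(img))  # ([0, 2, 0], [1, 1, 0])
```

The peak at x = 1 in the X projection marks the column where the dark pixels lie, which is exactly how such projections localize a feature.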
27. A more complex greyscale image is shown below, along with its luminance histogram (black = 0; white = 255):

[Figure]

28. Here, the peak in the dark luminance region (luminance = 0–50) corresponds to the dark suit and tie and relatively dark background. The peak in the light luminance region (luminance > 230) corresponds to the white shirt, while the central peak (between luminance 130 and 170) corresponds largely to the medium brightness of the face. If one were to select only the subset of pixels with brightness between 130 and 170 and plot them according to their x and y position, one would get the following image:

[Figure]

29. Taking projection histograms of this subset of pixels with luminance between 130 and 170, then, provides an indication of the location of the face in the image. On the left, below, is a projection of this subset of pixels onto the x axis, and on the right is a similar projection onto the y axis.

[Figure]
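The two steps just described (select a luminance band, then project) can be combined in one short sketch. The 130–170 band is taken from the text; the sample image is hypothetical and purely illustrative.

```python
# Sketch: keep only pixels in a luminance band, then project them onto the x axis.
def band_projection_x(image, lo=130, hi=170):
    width = len(image[0])
    x_hist = [0] * width
    for row in image:
        for x, value in enumerate(row):
            if lo <= value <= hi:        # medium-brightness pixels, e.g. a face
                x_hist[x] += 1
    return x_hist

img = [[0, 150, 160, 0],
       [0, 140, 250, 0]]
print(band_projection_x(img))  # [0, 2, 1, 0]
```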
30. Histograms may also be formed of pixel color properties in much the same way. Color is typically represented by three values: hue, saturation, and luminance. Hue (aka “tone”) is an angle ranging from 0° to 360° around a color wheel that indicates which “color” is being represented, e.g., 0° = red, 60° = yellow, 120° = green, 180° = cyan, 240° = blue, and 300° = magenta. Saturation, which may also range from 0 to 255, represents how “brilliant” the color is. For example, if a color with a saturation of 255 represents red, then a saturation of 128 would represent pink and a saturation of 0 would represent gray. Luminance ranges from 0 to 255 and represents the “brightness” of the color. If luminance = 0, then the color is black, regardless of the other values. Given a color image, the luminance values of the pixels would yield the “black-and-white” or grayscale version of the image.

[Figure]
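A hue histogram can be formed exactly like the luminance histograms above. In this sketch the 60° bin width is my assumption, chosen only so that the bins line up with the named colors in the preceding paragraph.

```python
# Sketch: count pixels into 60-degree hue bins
# (red, yellow, green, cyan, blue, magenta).
def hue_histogram(hues):
    counts = [0] * 6
    for h in hues:                 # h assumed to be an angle in 0..359 degrees
        counts[(h % 360) // 60] += 1
    return counts

print(hue_histogram([0, 10, 70, 240, 359]))  # [2, 1, 0, 0, 1, 1]
```

A system looking for skin-colored pixels, for instance, would simply test whether a pixel's hue falls in whatever bin range it associates with skin.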
IV. THE ’518 PATENT

31. The ’518 Patent, entitled “Method and Apparatus For Detection of Drowsiness,” was filed on January 15, 1999, and issued on April 6, 2004. The ’518 Patent names Patrick Pirim as the sole inventor. I understand IPT claims that the ’518 Patent has a priority date of January 15, 1998.

32. The ’518 Patent purports to disclose an application for the inventor’s previously patented “generic image processing system . . .” (“GIPS”). Ex. 1001 at 2:1–5. Specifically, the ’518 Patent proposes applying GIPS to “detect the drowsiness of a person.” Id. at 2:28–29. The patent explains that drowsiness detection addresses the problem that “a significant number of highway accidents result from drivers becoming drowsy or falling asleep . . . .” Id. at 1:12–17. Drowsiness can be detected by the duration of blinks (i.e., longer blinks occur when a driver becomes drowsy). Id. at 1:18–24. Thus, the patent proposes mounting a video camera in a car and detecting blink rates using GIPS. Id. at 6:28–56.

33. For example, when the driver enters the vehicle, GIPS could detect the driver by looking for pixels that are “moving in a lateral direction away from the driver’s door” and that have the “hue characteristics of skin.” Id. at 25:24–39. Knowing a driver is present, GIPS then “detects the face of the driver in the video signal and eliminates from further processing those superfluous portions of the video signal above, below, and to the right and left of the head of the driver.” Id. at 26:16–22. Specifically, the head is detected by looking for pixels with “selected characteristics” such as pixels that appear to be moving or to have a skin color. Id. at 26:21–45. These pixels could then be loaded into several histograms (324x and 324y), as shown below:

[Figure]

Ex. 1001, Fig. 24

34. Thus, for example, the head (in the region V) could be detected in the figure above by looking for peaks in the histogram, which can indicate the edge of the face. Id. at 26:49–65. Alternatively, GIPS could search for groups of pixels with “low luminance levels” to identify “nostrils.” Id. at 29:18–29.
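Peak-finding in a projection histogram of this kind can be sketched generically as follows. This is only an illustration of the general idea of locating edges from histogram peaks; the local-maximum rule and threshold are my assumptions, not the patent's algorithm.

```python
# Sketch: find peaks (local maxima at or above a threshold) in a projection
# histogram; such peaks can indicate the edges of a tracked feature.
def histogram_peaks(hist, min_height=1):
    peaks = []
    for i in range(1, len(hist) - 1):
        if hist[i] >= min_height and hist[i] > hist[i - 1] and hist[i] >= hist[i + 1]:
            peaks.append(i)
    return peaks

print(histogram_peaks([0, 3, 1, 0, 5, 5, 0]))  # [1, 4]
```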
35. GIPS can then ignore the area in the frame outside of the face, and only continue with analyzing the face (V), which would be in the region Z bounded by Ya, Yb, Xc, and Xd in Figure 25, below. Id. at 26:66–27:10.

[Figure]

36. The patent calls the exclusion of the hashmarked background area in the frame “masking.” Id. at 26:66–27:1.
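The masking operation just described can be sketched as follows. The coordinate names follow Figure 25; the choice of zero as the fill value for excluded pixels is my assumption.

```python
# Sketch: "mask" a frame by zeroing every pixel outside the face region Z,
# bounded by columns Xc..Xd and rows Ya..Yb.
def mask_outside(image, xc, xd, ya, yb):
    return [[v if (ya <= y <= yb and xc <= x <= xd) else 0
             for x, v in enumerate(row)]
            for y, row in enumerate(image)]

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
print(mask_outside(img, 1, 2, 0, 1))  # [[0, 2, 3], [0, 5, 6], [0, 0, 0]]
```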
37. Next, GIPS “uses the usual anthropomorphic ratio between the zone of the eyes and the entire face for a human being” to obtain a mask for the eyes of the driver. Id. at 27:33–38. Use of an anthropomorphic model is explained to refer to using a “facial characteristic, e.g., the nose, ears, eyebrows, mouth, etc., and combinations thereof” or “the outline of the head of the driver” as a “starting point for locating the eyes.” Id. at 29:43–56. The patent explains that the sub-area can also be “set using an anthropomorphic model, wherein the spatial relationship between the eyes and nose of humans is known.” Id. at 30:43–45. Thus, using the anthropomorphic model, the patent proposes deriving the sub-area Zʹ from the larger face area Z, as indicated below. Id. at 27:31–38.

[Figure]

38. For example, the patent explains that the “nostrils 272” can be used to identify a “search box 276” around the “eye 274 of the driver,” as shown in Figure 32, using “an anthropomorphic model.” Id. at 30:40–45.

[Figure]

Ex. 1001, Fig. 32

39. Having reduced the area for processing to a smaller region that contains the eye, GIPS can then check for blinks by “analyzing the pixels within the area Zʹ to identify” blinking. Id. at 27:54–55, 31:3–9. The patent proposes a variety of methods to identify blinking, such as (1) “analyzing the shape of the eye shadowing to identify shapes corresponding to openings and closings of the eye” (id. at 4:25–33, 31:10–17), (2) analyzing pixels in the eye area with “high speed vertical movement” with “the hue of skin” (id. at 27:56–57), or (3) analyzing pixels in the eye area that lack “the hue of skin” (id. at 27:62–65). Figure 27, below, shows the use of histograms to analyze the pixels in the eye area—peaks can indicate whether the eye is open or closed. Id. at 28:47–51.

[Figure]

40. The patent proposes that these histograms can be created for each frame, and changes in the histograms over time can be analyzed to determine blink rates. Id. at 28:32–29:10. For example, Figure 33 shows the histograms for an open eye (featuring large peaks), and Figure 34 shows the histograms for a closed eye (featuring small peaks):

[Figures]

Ex. 1001, Fig. 33
Ex. 1001, Fig. 34

41. The patent also proposes searching for “characteristics indicative of an eye present in the search box,” such as “a moving eyelid, a pupil, iris or cornea, a shape corresponding to an eye, a shadow corresponding to an eye, or any other indicia indicative of an eye.” Id. at 30:56–59. Thus, for example, Figure 36 “shows a sample histogram of a pupil 432,” formed by “detect[ing] pixels with very low luminance levels and high gloss that are characteristic of a pupil.” Id. at 30:61–64.

[Figure]

Ex. 1001, Fig. 36
V. SUMMARY OF OPINIONS

42. In preparing this declaration, I have reviewed at least the documents labeled Exhibits 1001–1009 and other materials referred to herein in connection with providing this declaration. In addition to these materials, I have relied on my education, experience, and my knowledge of practices and principles in the relevant field, e.g., image processing. My opinions have also been guided by my appreciation of how one of ordinary skill in the art would have understood the claims and specification of the ’518 Patent around the time of the alleged invention, which I have been asked to assume is the earliest claimed priority date of January 15, 1998.

43. Based on my experience and expertise, it is my opinion that certain references teach or suggest the limitations in Claim 39 of the ’518 Patent, as explained in detail below. Specifically, it is my opinion that Claim 39 is disclosed by:

(a) Martin Eriksson et al., Eye Tracking For Detection Of Driver Fatigue, IEEE Conference on Intelligent Transportation Systems (Nov. 1997) (“Eriksson”) in combination with Luigi Stringa, Eyes Detection For Face Recognition, Applied Artificial Intelligence (1993) (“Stringa”),

(b) U.S. Patent No. 5,008,946, System For Recognizing Image (Filed Sept. 9, 1988) (“Ando”) in combination with U.S. Patent No. 5,805,720, Facial Image Processing System (Filed Mar. 11, 1996) (“Suenaga”), and

(c) Ando in combination with Stringa.
VI. LEVEL OF ORDINARY SKILL IN THE ART

44. Based on my review of the ’518 Patent specification, claims, file history, and prior art, I believe one of ordinary skill in the art around the time of the alleged invention of the ’518 Patent would have had either (1) a Master’s Degree in Electrical Engineering or Computer Science or the equivalent plus at least a year of experience in the field of image processing, image recognition, machine vision, or a related field or (2) a Bachelor’s Degree in Electrical Engineering or Computer Science or the equivalent plus at least three years of experience in the field of image processing, image recognition, machine vision, or a related field. Additional education could substitute for work experience and vice versa.

45. In determining the level of ordinary skill in the art, I was asked to consider, for example, the type of problems encountered in the art, prior art solutions to those problems, the rapidity with which innovations are made, the sophistication of the technology, and the educational level of active workers in the field.

46. My opinions concerning the ’518 Patent claims are from the perspective of a person of ordinary skill in the art (“POSA”), as set forth above.

VII. CLAIM CONSTRUCTION

47. For my analysis of the ’518 Patent, I have interpreted all claim terms according to their plain and ordinary meaning under the broadest reasonable construction of the terms.

VIII. THE PRIOR ART TEACHES OR SUGGESTS EACH LIMITATION OF CLAIM 39 OF THE ’518 PATENT

A. Overview Of The Prior Art References

1. Martin Eriksson et al., Eye Tracking For Detection Of Driver Fatigue, IEEE Conference on Intelligent Transportation Systems (Nov. 1997) (“Eriksson”) (Ex. 1005)

48. Eriksson “describe[s] a system that locates and tracks the eyes of a driver” for the “purpose of . . . detect[ing] driver fatigue.” Ex. 1005 at 314. Eriksson proposes mounting “a small camera inside the car” to “monitor the face of the driver and look for eye movements which indicate that the driver is no longer in condition to drive.” Id. at 314. Eriksson notes that “[a]s the driver becomes more fatigued, we expect the eye blinks to last longer.” Id. at 317. Thus, Eriksson proposes a system for detecting the driver’s pupil—when the pupil is detected, the eye is open, and when the pupil is not detected, the eye is closed. Id. at 318.

49. Eriksson determines the location of the eyes in four steps. Id. at 315. The first step is “localization of the face.” Id. Eriksson explains that the face is localized using a “symmetry histogram.”

[Figure]

50. Eriksson calculates a “symmetry-value” for each pixel-column in order to find the center of the face. Id. at 316. The pixel column with the lowest symmetry value will be the center of the face. Id. Then, having identified the center of the face, Eriksson narrows the search area to a smaller area that includes the eyes: “the search-space is . . . limited to the area around this line, which reduces the probability of having distracting features in the background.” Id.
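Eriksson's exact symmetry formula is not reproduced here; the following sketch illustrates one simple mirror-difference score of the kind described, where the window width, the border penalty, and the one-row sample image are all my assumptions.

```python
# Sketch: score each pixel column by how poorly the image mirrors around it;
# the column with the LOWEST score is the most symmetric (the face center).
def symmetry_scores(image, half_width=1):
    height, width = len(image), len(image[0])
    scores = []
    for c in range(width):
        total = 0
        for d in range(1, half_width + 1):
            if c - d < 0 or c + d >= width:
                total += 255 * height      # penalize columns too close to the border
            else:
                total += sum(abs(image[y][c - d] - image[y][c + d])
                             for y in range(height))
        scores.append(total)
    return scores

img = [[10, 50, 10]]                       # one row, mirror-symmetric about column 1
print(symmetry_scores(img))                # [255, 0, 255]
```

The center column is then simply the index of the minimum score.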
51. The second step in localizing the eyes is computing the vertical location of the eyes. Id. To do this, Eriksson creates a gradient histogram of the sub-area of the image identified in the first step, as illustrated in Figure 2:

[Figure]

52. Eriksson “consider[s] the best three peaks in” the histogram (which in the above example appear to correspond to the eyes, the nose, and the mouth) as potential vertical locations for the eyes. Id. at 316.

53. The third step in localizing the eyes is finding “the exact location of the eyes.” Id. at 316. Having limited the search for the eyes to the horizontal region determined in the first step, and the three possible vertical locations determined in the second step, Eriksson finds the eyes by searching for “intensity-valleys” in the image and also using “general constraints, such [as] that both eyes must be located ‘fairly close’ to the center of the face.” Id.
54. The fourth step in localizing the eyes is estimating the position of the iris. Eriksson uses an “eye-template,” shown below, that, when laid over the picture, indicates a good match if there are “many dark pixels in the area inside the inner circle, and many bright pixels in the area between the two circles.” Id. at 316–17. When a match occurs, Eriksson knows “the inner circle is centered on the iris and the outside circle covers the sclera.” Id. at 317.

[Figure]

55. Having found the eye, Eriksson next generates a horizontal intensity histogram across the pupil. Id. at 318. Eriksson notes that the pupil and iris are dark and the sclera is light. Id. Thus, the histogram of an open eye is markedly different from the histogram of a closed eye:

[Figure]

56. Finally, having found the iris, pupil, and sclera, and having determined whether the eye is open or closed in each frame, Eriksson is able to measure blink rates over time and detect drowsy drivers. Id. at 318.
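The open/closed decision and the blink-duration measurement built on such histograms can be sketched as follows. The thresholds and frame data are illustrative assumptions, not Eriksson's values.

```python
# Sketch: call the eye "open" when enough dark (pupil/iris) pixels are present,
# then measure how many consecutive frames each blink lasts.
def blink_durations(frames, dark_threshold=60, min_dark_pixels=5):
    durations, run = [], 0
    for region in frames:                    # region: list of pixel intensities 0-255
        dark = sum(1 for v in region if v < dark_threshold)
        if dark < min_dark_pixels:           # no dark pupil visible -> eye closed
            run += 1
        elif run:
            durations.append(run)            # a blink just ended
            run = 0
    if run:
        durations.append(run)
    return durations

open_eye = [30] * 8 + [200] * 8              # dark pupil pixels present
closed_eye = [200] * 16                      # eyelid: no dark pixels
print(blink_durations([open_eye, closed_eye, closed_eye, open_eye, closed_eye]))  # [2, 1]
```

Longer and longer entries in the returned list would correspond to the lengthening blinks expected of a fatigued driver.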
2. Luigi Stringa, Eyes Detection For Face Recognition, Applied Artificial Intelligence (1993) (“Stringa”) (Ex. 1006)

57. Stringa discloses an image processing normalization algorithm for improving previously developed algorithms for face detection. Ex. 1006 at 365. Stringa explains that for face recognition systems, sometimes captured faces are not looking “straight into the camera” and thus “some adjustment and normalization is necessary before the system can proceed to the recognition step.” Id. at 366. As part of this normalization procedure, Stringa discloses detecting the pupils of the face in a manner similar to the ’518 Patent, especially with respect to Claim 39’s use of an anthropometric model.

58. Stringa explains that its approach to “locating the position of the eyes” is “based on the exploitation of (a priori) anthropometric information combined with the analysis of suitable grey-level distributions, allowing direct localization of both eyes.” Ex. 1006 at 369. Stringa explains that

there exists a sort of ‘grammer’ of facial structures that provides some very basic a priori information used in the recognition of faces. Every human face presents a reasonable symmetry, and the knowledge of the relative position of the main facial features (nose between eyes and over mouth, etc.) proves very useful to discriminate among various hypotheses. These guidelines can be derived from anthropometric data corresponding to an average face and refined through the analysis of real faces. Some typical examples . . . are:

• the eyes are located halfway between the top of the head and the bottom of the chin;

• the eyes are about one eye width apart;

• the bottom of the nose is halfway between the eyebrows and the chin; . . . .

Ex. 1006 at 369.
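The anthropometric guidelines quoted above lend themselves to a simple sketch. Everything concrete here is an illustrative assumption of mine (the face bounding box as input, the one-fifth eye-width ratio, and returning only zone centers); it shows the general idea of deriving eye expectation zones from a priori facial ratios, not Stringa's algorithm.

```python
# Sketch: place eye "expectation zone" centers inside a face bounding box using
# the quoted rules: eyes halfway down the face, about one eye width apart.
def eye_zone_centers(left, top, right, bottom):
    width = right - left
    eye_y = top + (bottom - top) // 2        # halfway between head top and chin
    eye_w = width // 5                       # assumed eye width
    center_x = left + width // 2
    offset = eye_w                           # leaves ~one eye width between the eyes
    return (center_x - offset, eye_y), (center_x + offset, eye_y)

print(eye_zone_centers(0, 0, 100, 100))  # ((30, 50), (70, 50))
```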
59. Stringa’s eye localization algorithm first detects the line that connects the eyes, then the side limits of the face and the nose axis. Id. at 370. To obtain the pupil location, Stringa first uses “the approximate location of the eye-connecting line, of the face sides, and of the nose axis” to estimate “the expectation zones of the two eyes . . . with reasonable accuracy.” Id. at 376. Stringa illustrates “the expectation zones for the two eyes” in Figure 9:

[Figure]

60. In the expectation zones for the two eyes, “the search of the pupil is based on the analysis of the horizontal grey-level distribution” (i.e., a histogram). Id. at 377. Stringa uses the histogram and some further mathematical calculations to produce a graph whose peaks indicate the location of the pupil (id.):

[Figure]
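A minimal illustration of searching a horizontal grey-level profile for the dark pupil follows; Stringa's further calculations are not reproduced, and the sample profile is hypothetical.

```python
# Sketch: within an eye expectation zone, take the horizontal grey-level
# profile and pick the darkest position as the pupil candidate.
def pupil_candidate(profile):
    return min(range(len(profile)), key=lambda i: profile[i])

print(pupil_candidate([200, 180, 40, 190, 210]))  # 2
```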
3. U.S. Patent No. 5,805,720, Facial Image Processing System (Filed Mar. 11, 1996) (“Suenaga”) (Ex. 1007)

61. Suenaga discloses a “facial image processing system for detecting . . . a dozing or drowsy condition of an automobile driver . . . from the opened and closed conditions of his eyes.” Ex. 1007 at 1:6–10. Suenaga uses a video camera to obtain images of a face. Id. at 2:44–49; 6:25–35.

62. Suenaga discloses many embodiments. Embodiment 31 explains that boxes 11, 12, and 13 (in Fig. 60, below) in the flowchart for Embodiment 31 are the same as those steps in Embodiment 1. Id. at 23:19–21. Embodiment 1 explains that in boxes 11, 12, and 13, Suenaga converts the image
