UNITED STATES PATENT AND TRADEMARK OFFICE

____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

____________________

SAMSUNG ELECTRONICS CO., LTD.; AND
SAMSUNG ELECTRONICS AMERICA, INC.
Petitioner

v.

IMAGE PROCESSING TECHNOLOGIES, LLC
Patent Owner

____________________

Patent No. 8,983,134
____________________

DECLARATION OF DR. JOHN C. HART
IN SUPPORT OF PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 8,983,134
TABLE OF CONTENTS

I.     INTRODUCTION ............................................................................................ 1

II.    BACKGROUND AND EXPERIENCE ........................................................... 1

       A.   Qualifications ......................................................................................... 1

       B.   Previous Testimony ............................................................................... 4

III.   TECHNOLOGICAL BACKGROUND ........................................................... 5

IV.    THE ’134 PATENT ........................................................................................ 11

V.     SUMMARY OF OPINIONS .......................................................................... 20

VI.    LEVEL OF ORDINARY SKILL IN THE ART ............................................ 21

VII.   CLAIM CONSTRUCTION ........................................................................... 22

VIII.  THE PRIOR ART TEACHES OR SUGGESTS EVERY STEP AND
       FEATURE OF THE CHALLENGED CLAIMS OF THE ’134
       PATENT ......................................................................................................... 22

       A.   Overview Of The Prior Art References ................................................ 22

            1.   U.S. Patent No. 5,481,622 (“Gerhardt”) (Ex. 1013) ..................... 22

            2.   U.S. Patent No. 6,044,166 (“Bassman”) (Ex. 1014) .................... 29

            3.   Alton L. Gilbert et al., A Real-Time Video Tracking
                 System, PAMI-2 No. 1 IEEE Transactions on Pattern
                 Analysis and Machine Intelligence 47 (Jan. 1980)
                 (“Gilbert”) (Ex. 1005) .................................................................. 32

            4.   U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006) .................... 44

       B.   Gerhardt In View Of Bassman Renders Obvious Claims 3-6 .............. 50

            1.   Reasons To Combine Gerhardt And Bassman ............................. 50

            2.   Elements Incorporated Into Claims 3-6 As Claims
                 Dependent From An Independent Claim ..................................... 53

            3.   Claim 3: “The process according to claim 1, wherein said
                 image processing system comprises at least one
                 component selected from a memory, a temporal
                 processing unit, and a spatial processing unit” ............................ 60

            4.   Claim 4: “The process according to claim 1, wherein
                 forming the at least one histogram further comprises
                 successively increasing the size of a selected area until
                 the boundary of the target is found” ............................................ 64

            5.   Claim 5: “The process according to claim 4, wherein
                 forming the at least one histogram further comprises
                 adjusting a center of the selected area based upon a shape
                 of the target until substantially the entire target is within
                 the selected area” ......................................................................... 65

            6.   Claim 6: “The process according to claim 5, wherein
                 forming the at least one histogram further comprises
                 setting the X minima and maxima and Y minima and
                 maxima as boundaries in X and Y histogram formation
                 units such that only pixels within the selected area will be
                 processed by the image processing system” ............................... 68

            7.   Gerhardt and Bassman Are Not Cumulative ................................ 69

            8.   Detailed Application Of Gerhardt And Bassman To The
                 Challenged Claims ....................................................................... 70

       C.   Gerhardt In View Of Gilbert And Further In View Of Hashima
            Renders Obvious Claims 3-6 .............................................................. 104

            1.   Reasons To Combine Gilbert, Gerhardt, And Hashima ............. 104

            2.   Elements Incorporated Into Claims 3-6 As Claims
                 Dependent From An Independent Claim ................................... 109

            3.   Claim 3: “The process according to claim 1, wherein said
                 image processing system comprises at least one
                 component selected from a memory, a temporal
                 processing unit, and a spatial processing unit” .......................... 119

            4.   Claim 4: “The process according to claim 1, wherein
                 forming the at least one histogram further comprises
                 successively increasing the size of a selected area until
                 the boundary of the target is found” .......................................... 123

            5.   Claim 5: “The process according to claim 4, wherein
                 forming the at least one histogram further comprises
                 adjusting a center of the selected area based upon a shape
                 of the target until substantially the entire target is within
                 the selected area” ....................................................................... 128

            6.   Claim 6: “The process according to claim 5, wherein
                 forming the at least one histogram further comprises
                 setting the X minima and maxima and Y minima and
                 maxima as boundaries in X and Y histogram formation
                 units such that only pixels within the selected area will be
                 processed by the image processing system” ............................. 130

            7.   Gerhardt and Bassman Are Not Cumulative .............................. 131

            8.   Detailed Application Of Gilbert, Gerhardt, And Ueno To
                 The Challenged Claims .............................................................. 132

IX.    CONCLUSION ............................................................................................ 182
1. I, John C. Hart, declare as follows:

I. INTRODUCTION

2. I have been retained by Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Petitioner”) as an independent expert consultant in this proceeding before the United States Patent and Trademark Office (“PTO”).
3. I have been asked to consider whether certain references teach or suggest the features recited in Claims 3 through 6 (the “Challenged Claims”) of U.S. Patent No. 8,983,134 (“the ’134 Patent”) (Ex. 1001), which I understand is allegedly owned by Image Processing Technologies, LLC (“Patent Owner”). My opinions and the bases for my opinions are set forth below.

4. I am being compensated at my ordinary and customary consulting rate for my work.

5. My compensation is in no way contingent on the nature of my findings, the presentation of my findings in testimony, or the outcome of this or any other proceeding. I have no other interest in this proceeding.

II. BACKGROUND AND EXPERIENCE

A. Qualifications

6. I have more than 25 years of experience in computer graphics and image processing technologies. In particular, I have devoted much of my career to researching and designing graphics hardware and systems for a wide range of applications.

7. My research has resulted in the publication of more than 80 peer-reviewed scientific articles and more than 50 invited papers and talks in the area of computer graphics and image processing.

8. I have authored or co-authored several publications that are directly related to target identification and tracking in image processing systems. Some recent publications include:

• P.R. Khorrami, V.V. Le, J.C. Hart, T.S. Huang. A System for Monitoring the Engagement of Remote Online Students using Eye Gaze Estimation. Proc. IEEE ICME Workshop on Emerging Multimedia Systems and Applications, July 2014.

• V. Lu, I. Endres, M. Stroila and J.C. Hart. Accelerating Arrays of Linear Classifiers Using Approximate Range Queries. Proc. Winter Conference on Applications of Computer Vision, Mar. 2014.

• M. Kamali, E. Ofek, F. Iandola, I. Omer, J.C. Hart. Linear Clutter Removal from Urban Panoramas. Proc. International Symposium on Visual Computing, Sep. 2011.

9. From 2008-2012, as a Co-PI of the $18M Intel/Microsoft Universal Parallelism Computing Research Center at the University of Illinois, I led the AvaScholar project for visual processing of images that included face identification, tracking and image histograms.

10. I am a co-inventor of U.S. Patent No. 7,365,744.

11. I have served as the Director for Graduate Studies for the Department of Computer Science, an Associate Dean for the Graduate College, and I am currently serving as the Executive Associate Dean of the Graduate College at the University of Illinois. I am also a professor in the Department of Computer Science at the University of Illinois, where I have served on the faculty since August 2000. As a professor I have taught classes on image processing and graphics technology and have conducted research into specific applications of these technologies.

12. From 1992 to 2000, I worked first as an Assistant Professor and then as an Associate Professor in the School of Electrical Engineering and Computer Science at Washington State University.

13. From 1991-1992, I was a Postdoctoral Research Associate at the Electronic Visualization Laboratory at the University of Illinois at Chicago, and at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

14. I earned a Doctor of Philosophy in Electrical Engineering and Computer Science from the University of Illinois at Chicago in 1991.

15. I earned a Master’s Degree in Electrical Engineering and Computer Science from the University of Illinois at Chicago in 1989.

16. I earned a Bachelor of Science in Computer Science from Aurora University in 1987.

17. I have been an expert in the field of graphics and image processing since prior to 1996. I am qualified to provide an opinion as to what a person of ordinary skill in the art (“POSA”) would have understood, known, or concluded as of 1996.

18. Additional qualifications are detailed in my curriculum vitae, which I understand has been submitted as Exhibit 1003 in this proceeding.
B. Previous Testimony

19. In the previous five years, I have testified as an expert at trial or by deposition or have submitted declarations in the following cases:

20. Certain Computing or Graphics Systems, Components Thereof, and Vehicles Containing Same, Inv. No. 337-TA-984.

21. ZiiLabs Inc., Ltd v. Samsung Electronics Co. Ltd. et al., No. 2:14-cv-00203 (E.D. Tex. Feb. 4, 2016).

22. Certain Consumer Electronics with Display and Processing Capabilities, Inv. No. 337-TA-884.

23. I have also submitted declarations in support of the following Petitions for Inter Partes Review in Samsung v. Image Processing Technologies, LLC:

• IPR2017-00357 against the ’445 Patent, filed 11/30/2016.

• IPR2017-00336 against U.S. Patent No. 6,959,293, filed 11/29/2016.

• IPR2017-00355 against U.S. Patent No. 7,650,015, filed 11/30/2016.

• IPR2017-00347 against U.S. Patent No. 8,805,001, filed 11/29/2016.

• IPR2017-00353 against U.S. Patent No. 8,983,134, filed 11/30/2016.
III. TECHNOLOGICAL BACKGROUND

24. Image processing systems have long used histograms as a mathematical tool to identify and track image features and to adjust image properties. The use of histograms to identify and track image features dates back to well before 1997. D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition—A Survey,” Pattern Recognition, vol. 29, no. 4, 1996, pp. 641–662 (Ex. 1009) (citing M. H. Glauberman, “Character recognition for business machines,” Electronics, vol. 29, pp. 132–136, Feb. 1956 (Ex. 1010)).

25. A digital image is represented by a number of picture elements, or pixels, where each pixel has certain properties, such as brightness, color, position, velocity, etc., which may be referred to as domains. For each pixel property or domain, a histogram may be formed. A histogram is a type of statistical tool. In image processing, histograms are often used to count the number of pixels in an image in a certain domain of the pixel. Histograms have multiple bins, where each bin in the histogram counts the pixels that fall within a range for that domain. For example, for the continuous variable of luminance (also called brightness), the luminance value for each pixel can be sampled by a camera and then digitized and represented by an 8-bit value. Then, those luminance values could be loaded into a luminance histogram. The histogram would have one bin for each range of luminance values, and each bin would count the number of pixels in the image that fall within that luminance value range. As shown below, a luminance histogram may reveal certain properties of an image, such as whether it is properly exposed, based on whether an excessive number of pixels fall on the dark end or light end of the luminance range.
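For illustration only, the short sketch below (my own, not taken from the ’134 Patent or any cited reference) shows one way such a luminance histogram could be computed in software; the 16-value-wide bins and the NumPy array named luminance are illustrative assumptions.

    import numpy as np

    def luminance_histogram(luminance: np.ndarray, bin_width: int = 16) -> np.ndarray:
        """Count 8-bit luminance values (0-255) into bins `bin_width` values wide."""
        num_bins = 256 // bin_width                    # e.g., 16 bins of width 16
        counts = np.zeros(num_bins, dtype=np.int64)
        for value in luminance.ravel():                # one count per pixel
            counts[value // bin_width] += 1
        return counts

    # Example: a synthetic 4x4 "image" of 8-bit luminance samples.
    image = np.array([[0, 10, 200, 255],
                      [12, 13, 180, 240],
                      [5, 7, 190, 230],
                      [3, 9, 210, 250]], dtype=np.uint8)
    print(luminance_histogram(image))
    # Large counts concentrated in the darkest and lightest bins would suggest
    # an improperly exposed image, as discussed above.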
26. Histograms of other pixel properties can also be formed. For example, the figure below illustrates two histograms formed by counting the number of black pixels having each X-coordinate value (i.e., the X-coordinate domain) and the number having each Y-coordinate value (i.e., the Y-coordinate domain).

27. Such histograms are sometimes called “projection histograms” because they represent the image projected onto each axis. In the example above, the image was pure black and white, but projection histograms of a greyscale image can also be formed in a similar manner by defining a luminance threshold and projecting, for example, only those pixels that have a luminance value lower than 100.
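Again purely as an illustration of the concept (and not any party’s implementation), projection histograms of a thresholded greyscale image could be computed as sketched below; the threshold of 100 mirrors the example in the preceding paragraph, and a pure black-and-white image is simply the special case in which the mask marks the black pixels.

    import numpy as np

    def projection_histograms(grey: np.ndarray, threshold: int = 100):
        """Return (x_hist, y_hist) counting the pixels with luminance below
        `threshold` at each X-coordinate (column) and Y-coordinate (row)."""
        mask = grey < threshold        # pixels that "project", e.g. dark pixels
        x_hist = mask.sum(axis=0)      # one count per column -> X-projection
        y_hist = mask.sum(axis=1)      # one count per row    -> Y-projection
        return x_hist, y_hist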
28. A more complex greyscale image is shown below, along with its luminance histogram (black = 0; white = 255):

29. Here, the peak in the dark luminance region (luminance = 0-50) corresponds to the dark suit and tie and relatively dark background. The peak in the light luminance region (luminance > 230) corresponds to the white shirt, while the central peak (between luminance 130 and 170) corresponds largely to the medium brightness of the face. If one were to select only the subset of pixels with brightness between 130 and 170 and plot them according to their x and y position, one would get the following image:

30. Taking projection histograms of this subset of pixels with luminance between 130 and 170, then, provides an indication of the location of the face in the image. On the left, below, is a projection of this subset of pixels onto the x axis, and on the right is a similar projection onto the y axis.
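The face-localization example above can likewise be sketched in code. The 130-170 band follows the discussion above, while the use of the projection peaks as the indicated “location” is one illustrative choice of mine, not something prescribed by the ’134 Patent.

    import numpy as np

    def band_projection_peaks(grey: np.ndarray, lo: int = 130, hi: int = 170):
        """Project the pixels with lo <= luminance <= hi onto the x and y axes
        and return the column and row where those projections peak."""
        band = (grey >= lo) & (grey <= hi)       # face-toned pixels only
        x_proj = band.sum(axis=0)                # counts per column
        y_proj = band.sum(axis=1)                # counts per row
        return int(x_proj.argmax()), int(y_proj.argmax())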
31. Histograms may also be formed of pixel color properties in much the same way. Color is typically represented by three values: hue, saturation and luminance. Hue (aka “tone”) is an angle ranging from 0° to 360° around a color wheel that indicates which “color” is being represented, e.g. 0° = red, 60° = yellow, 120° = green, 180° = cyan, 240° = blue, and 300° = magenta. Saturation, which may range from 0 to 255, represents how “brilliant” the color is. For example, if a color with a saturation of 255 represents red, then a saturation of 128 would represent pink and a saturation of 0 would represent gray. Luminance ranges from 0 to 255 and represents the “brightness” of the color. If luminance = 0, then the color is black, regardless of the other values. Given a color image, the luminance values of the pixels would yield the “black-and-white” or grayscale version of the image.
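Purely as an illustration, a histogram in the hue domain could be formed as sketched below. The six 60°-wide bins and the use of Python’s colorsys conversion (whose saturation and lightness conventions differ from the 0-255 scales described above) are my own assumptions.

    import colorsys

    def hue_histogram(rgb_pixels, bin_width_degrees: int = 60):
        """Count pixels into hue bins, e.g. six 60-degree bins covering the
        red, yellow, green, cyan, blue, and magenta regions of the color wheel.
        `rgb_pixels` holds (r, g, b) tuples with components in 0-255."""
        num_bins = 360 // bin_width_degrees
        counts = [0] * num_bins
        for r, g, b in rgb_pixels:
            hue, _lightness, _saturation = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
            degrees = (hue * 360.0) % 360.0              # hue as an angle in [0, 360)
            counts[int(degrees // bin_width_degrees) % num_bins] += 1
        return counts

    # Example: two reddish pixels and one bluish pixel.
    print(hue_histogram([(255, 0, 0), (200, 30, 30), (10, 20, 250)]))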
IV. THE ’134 PATENT

32. The ’134 Patent, entitled “Image Processing Method,” was filed on March 17, 2014, and issued on March 17, 2015. The ’134 Patent names Patrick Pirim as the sole inventor. I understand that the ’134 Patent claims a priority date of July 22, 1996.

33. The ’134 Patent is generally directed to tracking a target using an image processing system. For example, in the Abstract, the ’134 Patent notes that the invention relates to “[a] method and apparatus for localizing an area in relative movement and for determining the speed and direction thereof.” Ex. 1001, ’134 Patent at Abstract.

34. The ’134 Patent uses the pixels in a frame of an image of a video signal to form one or more histograms in order to identify and track a target. See e.g., Ex. 1001, ’134 Patent at Claim 1. The input signal employed in the ’134 Patent is comprised of a “succession of frames, each frame having a succession of pixels.” Ex. 1001 at 3:32-34. Although the disclosed embodiments relate primarily to a video signal, the ’134 Patent also teaches that the input signal could correspond to other types of signals, for example “ultrasound, IR, Radar, tactile array, etc.” Ex. 1001, ’134 Patent at 9:27-32.
35. The ’134 Patent teaches constructing a histogram showing the frequency of the pixels meeting a certain characteristic. In the ’134 Patent, these characteristics—such as luminance or speed—are referred to as “domains.” Histograms may be constructed in a variety of domains. For example, the ’134 Patent teaches that examples of possible domains include pixel data such as “i) luminance, ii) speed (V), iii) oriented direction (D1), (iv) time constant (CO), v) hue, vi) saturation, and vii) first axis (x(m)), and viii) second axis (y(m)).” Id. at 4:5-9.

36. A domain can be further subdivided into classes, each class consisting of the subset of pixels with similar domain values. Figure 14a (and the accompanying text) illustrates an example of “classes” within a domain:

    FIG. 14a shows an example of the successive classes C1 C2 . . . Cn−1 Cn, each representing a particular velocity, for a hypothetical velocity histogram, with their being categorization for up to 16 velocities (15 are shown) in this example. Also shown is envelope 38, which is a smoothed representation of the histogram.

Ex. 1001, ’134 Patent at 20:49-54.
37. The hypothetical histogram in Figure 14a would be constructed using histogram formation block 25 in Figure 11. In this figure, various histogram processors (numbered 24-29) are shown that allow creation of histograms in various domains. Block 25 is disclosed as creating velocity histograms. Id. at 17:4-10.

38. A detailed depiction of histogram block 25 is shown in Figure 13. In Figure 13, velocity data for a pixel is input into a memory address and into classifier 25b. The classifier contains registers 106 that correspond to classes within the particular domain. Thus, for a classifier in a velocity histogram formation block, the classifier would have a register for each velocity class. Because the histogram will only be incremented for pixels satisfying the classification criteria that, when met, outputs a classification signal of “1,” the histogram in Figure 14a would be constructed using a classifier 25b that has each of the velocity-class registers set to “1.” In this example, each pixel that is input into classifier 25b would generate a classification signal of “1.” The histogram would then be updated to include the input pixel, which it would do by incrementing the histogram bin corresponding to the appropriate velocity class. In other examples, a classifier may output a classification signal of “1” for only specific classes of a domain, rather than for all of the classes in a domain as in Figure 14a. Thus, for example, the classifier could choose pixels with only specific velocities for consideration in subsequent histograms. This feature may be used in conjunction with the output from other classification units to create histograms identifying only pixels meeting multiple classification criteria in a variety of domains.
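To illustrate this mechanism in software (rather than the registers and signals of the patent’s hardware), the sketch below models a histogram formation block whose classifier holds one enable flag per class; a pixel’s bin is incremented only when the flag for its class is set, i.e., when the classification signal would be “1.” The class and function names are mine and are purely illustrative.

    class HistogramFormationBlock:
        """Software analogue of a histogram formation block with a classifier.

        `enabled` plays the role of the per-class registers: the bin for a pixel's
        class is incremented only if that class is enabled (classification signal "1").
        """

        def __init__(self, num_classes: int, enabled=None):
            self.num_classes = num_classes
            self.enabled = list(enabled) if enabled is not None else [True] * num_classes
            self.bins = [0] * num_classes

        def classify(self, class_index: int) -> bool:
            """Return the classification signal for a pixel falling in `class_index`."""
            return self.enabled[class_index]

        def process_pixel(self, class_index: int) -> None:
            if self.classify(class_index):
                self.bins[class_index] += 1

    # Example: a 16-class velocity histogram with every class enabled, as in Fig. 14a.
    velocity_block = HistogramFormationBlock(num_classes=16)
    for v in [3, 3, 7, 12, 3]:          # hypothetical per-pixel velocity classes
        velocity_block.process_pixel(v)
    print(velocity_block.bins)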
39. The ’134 Patent discloses that its teachings are applicable to a broad range of applications. For example, in one embodiment, the ’134 Patent performs “automatic framing of a person . . . during a video conference.” Ex. 1001, ’134 Patent at 22:5-6. In this application, histograms are constructed in the X- and Y-domains and count the number of pixels between successive frames where the differences in luminance are above certain threshold values. Ex. 1001, ’134 Patent at 22:44-54. By this method, the ’134 Patent teaches the system is able to determine the boundaries of the target based on peaks in the histograms generated. Ex. 1001, ’134 Patent at 10:33-61. This application and result are shown in Figures 16 and 17, reproduced below:

Ex. 1001, Figure 16

Ex. 1001, Figure 17
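The framing technique described above can be illustrated with the following sketch, which is my own simplification and not the patent’s implementation: pixels whose luminance changes between successive frames by more than a threshold are histogrammed in the X- and Y-domains, and the extent of those histograms gives the boundaries of the moving person. The threshold value of 20 is an arbitrary illustrative choice.

    import numpy as np

    def movement_boundaries(prev_frame: np.ndarray, curr_frame: np.ndarray, threshold: int = 20):
        """Return (x_min, x_max, y_min, y_max) of pixels whose 8-bit luminance
        changed by more than `threshold` between two successive frames."""
        diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
        moved = diff > threshold
        x_hist = moved.sum(axis=0)        # X-domain histogram of changed pixels
        y_hist = moved.sum(axis=1)        # Y-domain histogram of changed pixels
        xs = np.flatnonzero(x_hist)
        ys = np.flatnonzero(y_hist)
        if xs.size == 0 or ys.size == 0:
            return None                   # nothing changed enough between frames
        return xs.min(), xs.max(), ys.min(), ys.max()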
40. Other related applications disclosed by the ’134 Patent include using a mounted spotlight or camera to automatically track a target, for example using a spotlight mounted on a helicopter to track a target on the ground or using automated stage lights to track a performer during a performance. Ex. 1001, ’134 Patent at 23:35-40. The ’134 Patent also teaches that “[t]he invention would similarly be applicable to weapons targeting systems.” Ex. 1001, ’134 Patent at 23:39-40.
41. In each of these embodiments, the ’134 Patent finds the X- and Y-minima and maxima of the histograms and uses these points to determine a center point of the target. The patent states that the “XMIN and XMAX” for the X-projection histogram and “YMIN and YMAX” for the Y-projection histogram are “key characteristics” of the histogram, “which include the minimum (MIN) of the histogram” and “the maximum (MAX) of the histogram.” Id. at 19:41-50. It teaches that these key characteristics are computed by the condition:

    For each pixel with a validation signal V2 of “1”:

    (a) if the data value of the pixel<MIN (which is initially set to the maximum possible value of the histogram), then write data value in MIN,

    (b) if the data value of the pixel>MAX (which is initially set to the minimum possible value of the histogram), then write data value in MAX

Id. at 19:51-57. Hence the ’134 Patent defines the minimum of the X-projection histogram as the smallest X-coordinate of any pixel in the image region whose validation signal is “1.” Similarly, the maximum is the largest X-coordinate of any pixel in the image region whose validation signal is “1.” The same holds true for the Y-projection histogram, where the maximum and minimum are computed in the same way, but using the y coordinate axis.
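The quoted condition can be restated as the following short sketch (my paraphrase in software, not the patent’s circuitry): MIN starts at the largest possible value and MAX at the smallest, and each pixel whose validation signal V2 is “1” pulls MIN down and MAX up toward its coordinate.

    def projection_min_max(coords, validated, axis_size):
        """Compute MIN and MAX of a projection histogram per the quoted condition.

        `coords` are pixel coordinates along one axis (X or Y), `validated` are
        the corresponding V2 validation signals (True/False), and `axis_size` is
        the number of coordinate values along that axis.
        """
        minimum = axis_size - 1        # initially the maximum possible value
        maximum = 0                    # initially the minimum possible value
        for coord, v2 in zip(coords, validated):
            if not v2:
                continue               # only pixels with validation signal "1" count
            if coord < minimum:
                minimum = coord        # (a) write data value in MIN
            if coord > maximum:
                maximum = coord        # (b) write data value in MAX
        return minimum, maximum

Running this once over the X coordinates and once over the Y coordinates yields XMIN, XMAX, YMIN, and YMAX, from which the center point discussed in the next paragraph is simply ((XMIN+XMAX)/2, (YMIN+YMAX)/2).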
42. After determining the X- and Y-minima and maxima of the histograms, the ’134 Patent teaches that a center point of the target image may be found at the coordinates of (XMIN+XMAX)/2 and (YMIN+YMAX)/2. Ex. 1001, ’134 Patent at 24:46-51. This is only one possible way of computing a center point of an object. Indeed, for an irregularly shaped object like those shown in Figure 20, below, there is not one clear center point. Rather, various center points such as center of mass, center-of-area, and center of gravity might all give different coordinates, but would nevertheless accomplish the purpose of the ’134 Patent.

43. The X- and Y-minima and maxima are used to draw a “tracking box” around the target. Figure 23 shows an example of the tracking box in a frame:
44. In addition, the ’134 Patent teaches that the image processing system may include a memory, a temporal processing unit, and a spatial processing unit. The ’134 Patent states:

    Referring to FIG. 2, image processing system 11 includes a first assembly 11a, which consists of a temporal processing unit 15 having an associated memory 16, a spatial processing unit 17 having a delay unit 18 and sequencing unit 19, and a pixel clock 20, which generates a clock signal HP, and which serves as a clock for temporal processing unit 15 and sequencing unit 19. Clock pulses HP are generated by clock 20 at the pixel rate of the image, which is preferably 13.5 MHZ.

Ex. 1001, ’134 Patent at 10:8-15.
45. The temporal processing unit 15 of the ’134 Patent “smooth[s] the video signal and generate[s] a number of outputs that are utilized by spatial processing unit 17.” Ex. 1001, ’134 Patent at 10:16-19. Specifically, the temporal processing unit of the ’134 Patent “generates a binary output signal DP for each pixel, which identifies whether the pixel has undergone significant variation, and a digital signal CO, which represents the updated calculated value of time constant C.” Ex. 1001, ’134 Patent at 10:28-32.

46. Thus, the “temporal processing unit” of the ’134 Patent generates signals based on the information obtained by two (or more) frames in the video signal representing images at different times.

47. The spatial processing unit of the ’134 Patent receives input from the temporal processing unit, and determines the parameters relating to the movement of the target. Ex. 1001, ’134 Patent at 15:31-55.
48. The ’134 Patent also teaches a method by which the system will “process pixels only within a user-defined area.” Ex. 1001, ’134 Patent at 21:12-24. For example, the system can receive user input instructing it to “process pixels only in a defined rectangle by setting the XMIN and XMAX, and YMIN and YMAX values as desired.” Id. The size of the area may be incrementally increased until the box bounding the processed area overlaps the boundary of the target. Ex. 1001, ’134 Patent at 24:35-38 (“This process is continued until the histogram formed by either of histogram formation units 28 and 29 contains meaningful information, i.e., until the box overlaps the boundary of the target.”).
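This area-growing behavior can be sketched as follows. The step size, the boolean target mask, and the test for “meaningful information” (any non-zero count in the in-box histograms) are my own illustrative assumptions rather than the patent’s implementation.

    import numpy as np

    def grow_box_until_target(target_mask, x_min, x_max, y_min, y_max, step: int = 4):
        """Enlarge the processing box until it overlaps the target.

        `target_mask` is a boolean NumPy array marking target pixels; only pixels
        inside the current box are processed (counted). The box grows by `step`
        pixels per side until the in-box histograms contain non-zero counts.
        """
        height, width = target_mask.shape
        while True:
            box = target_mask[y_min:y_max + 1, x_min:x_max + 1]
            x_hist = box.sum(axis=0)            # X histogram over the box only
            y_hist = box.sum(axis=1)            # Y histogram over the box only
            if x_hist.any() or y_hist.any():    # "meaningful information"
                return x_min, x_max, y_min, y_max
            if x_min == 0 and y_min == 0 and x_max == width - 1 and y_max == height - 1:
                return None                     # whole frame searched, no target found
            x_min = max(0, x_min - step)
            x_max = min(width - 1, x_max + step)
            y_min = max(0, y_min - step)
            y_max = min(height - 1, y_max + step)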
V. SUMMARY OF OPINIONS

49. In preparing this declaration, I have reviewed at least the documents labeled Exhibits 1001-1022 and other materials referred to herein in connection with providing this declaration. In addition to these materials, I have relied on my education, experience, and my knowledge of practices and principles in the relevant field, e.g., image processing. My opinions have also been guided by my appreciation of how one of ordinary skill in the art would have understood the claims and specification of the ’134 Patent around the time of the alleged invention, which I have been asked to assume is the earliest claimed priority date of July 22, 1996.

50. Based on my experience and expertise, it is my opinion that certain references teach or suggest all the features recited in Claims 3-6 of the ’134 Patent, as explained in detail below. Specifically, it is my opinion that Claims 3-6 are disclosed by U.S. Patent No. 5,481,622 (“Gerhardt”) in combination with U.S. Patent No. 6,044,166 (“Bassman”). It is also my opinion that Claims 3-6 are disclosed by Alton L. Gilbert et al., A Real-Time Video Tracking System (“Gilbert”) in combination with Gerhardt, and further in combination with U.S. Patent No. 5,521,843 (“Hashima”).
VI. LEVEL OF ORDINARY SKILL IN THE ART

51. Based on my review of the ’134 Patent specification, claims, file history, and prior art, I believe one of ordinary skill in the art around the time of the alleged invention of the ’134 Patent would have had either (1) a Master’s Degree in Electrical Engineering or Computer Science or the equivalent plus at least a year of experience in the field of image processing, image recognition, machine vision, or a related field or (2) a Bachelor’s Degree in Electrical Engineering or Computer Science or the equivalent plus at least three years of experience in the field of image processing, image recognition, machine vision, or a related field. Additional education could substitute for work experience and vice versa.

52. In determining the level of ordinary skill in the art, I was asked to consider, for example, the type of problems encountered in the art, prior art solutions to those problems, the rapidity with which innovations are made, the sophistication of the technology, and the educational level of active workers in the field.

53. My opinions concerning the ’134 Patent claims are from the perspective of a person of ordinary skill in the art (“POSA”), as set forth above.

VII. CLAIM CONSTRUCTION

54. For my analysis, I have interpreted all claim terms according to their plain and ordinary meaning.
VIII. THE PRIOR ART TEACHES OR SUGGESTS EVERY STEP AND FEATURE OF THE CHALLENGED CLAIMS OF THE ’134 PATENT

A. Overview Of The Prior Art References

1. U.S. Patent No. 5,481,622 (“Gerhardt”) (Ex. 1013)

55. Lester A. Gerhardt and Ross M. Sabolcik, researchers at Rensselaer Polytechnic Institute, disclosed using a histogram of pixel characteristics to identify and track a target in digitized visual input in U.S. Patent No. 5,481,622 (“Gerhardt”). Gerhardt issued on January 2, 1996.

56. Gerhardt’s system tracks the position of a user’s pupil to generate input to the computer. This allows one to interact with a computer without using one’s hands. In one example, Gerhardt’s system uses a video camera mounted on a helmet, as shown in Figures 1 and 2.

57. Gerhardt’s system receives an input signal from a “camera means for acquiring a video image” and a “frame grabber means [that is] coupled to the camera means.” E
