UNITED STATES PATENT AND TRADEMARK OFFICE

____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

____________________

SAMSUNG ELECTRONICS CO., LTD.; AND
SAMSUNG ELECTRONICS AMERICA, INC.
Petitioner

v.

IMAGE PROCESSING TECHNOLOGIES, LLC
Patent Owner

____________________

Patent No. 7,650,015
____________________

PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 7,650,015
TABLE OF CONTENTS

I. INTRODUCTION ................................................................ 1
II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8 ................................... 1
III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a) ................................ 3
IV. GROUNDS FOR STANDING ....................................................... 3
V. PRECISE RELIEF REQUESTED .................................................... 3
VI. LEGAL STANDARDS ............................................................ 3
     A. Claim Construction ..................................................... 3
     B. Level of Ordinary Skill In The Art .................................... 4
     C. This Petition Is Not Redundant ........................................ 4
VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND THE ’015 PATENT ................... 6
VIII. DETAILED EXPLANATION OF GROUNDS ......................................... 10
     A. Overview Of The Prior Art References ................................. 10
          1. U.S. Patent No. 5,481,622 to Gerhardt (Ex. 1013) ................ 10
          2. U.S. Patent No. 6,044,166 to Bassman (Ex. 1014) ................. 17
          3. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005) ... 19
          4. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006) ................ 26
          5. W. B. Schaming, Adaptive Gate Multifeature Bayesian Statistical Tracker, 359 Applications of Digital Image Processing IV 68 (1982) (“Schaming”) (Ex. 1008) ... 31
IX. Specific Explanation Of Grounds For Invalidity ............................ 34
     A. Ground 1: Gerhardt In View Of Bassman Renders Obvious Claims 1-2 and 4-5 ... 34
          1. Reasons To Combine Gerhardt and Bassman ......................... 34
          2. Claim 1 .......................................................... 36
          3. Claim 2: “The process according to claim 1, comprising centering the tracking box relative to an optical axis of the frame” ... 44
          4. Claim 4: “The process according to claim 1, wherein said image processing system comprises at least one component selected from a memory, a temporal processing unit, and a spatial processing unit.” ... 45
          5. Claim 5: “The process according to claim 1, wherein said image processing system comprises at least two components selected from a memory, a temporal processing unit, and a spatial processing unit” ... 50
          6. Gerhardt and Bassman Are Not Cumulative ......................... 50
     B. Ground 2: Gerhardt In View Of Bassman And Further In View Of Hashima Renders Obvious Claims 3 and 7 ... 51
          1. Reasons To Combine Gerhardt and Bassman with Hashima ............ 51
          2. Claim 3: “The process according to claim 1, comprising calculating a histogram according to a projection axis…and calculating an anticipated next frame” ... 52
          3. Claim 7 .......................................................... 55
          4. Gerhardt, Bassman, and Hashima Are Not Cumulative ............... 61
     C. Ground 3: Gilbert In View Of Gerhardt And Further In View Of Schaming Renders Obvious Claims 1-5 and 7 ... 62
          1. Reasons To Combine Gilbert, Gerhardt and Schaming ............... 62
          2. Claim 1 .......................................................... 65
          3. Claim 2: “The process according to claim 1, comprising centering the tracking box relative to an optical axis of the frame” ... 75
          4. Claim 3: “The process according to claim 1, comprising calculating a histogram according to a projection axis…and calculating an anticipated next frame” ... 76
          5. Claim 4: “The process according to claim 1, wherein said image processing system comprises at least one component selected from a memory, a temporal processing unit, and a spatial processing unit.” ... 78
          6. Claim 5: “The process according to claim 1, wherein said image processing system comprises at least two components selected from a memory, a temporal processing unit, and a spatial processing unit” ... 80
          7. Claim 7 .......................................................... 81
          8. Gilbert, Gerhardt, and Schaming Are Not Cumulative .............. 83
X. CONCLUSION ................................................................. 84
LIST OF EXHIBITS1
1001  U.S. Patent No. 7,650,015 (“the ’015 Patent”)
1002  Declaration of Dr. John C. Hart
1003  Curriculum Vitae for Dr. John C. Hart
1004  Prosecution File History of U.S. Patent No. 7,650,015
1005  Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”)
1006  U.S. Patent No. 5,521,843 (“Hashima”)
1007  Not used
1008  W. B. Schaming, Adaptive Gate Multifeature Bayesian Statistical Tracker, 359 Applications of Digital Image Processing IV 68 (1982) (“Schaming”)
1009  D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition-A Survey,” Pattern Recognition, vol. 29, no. 4, 1996, pp. 641-662
1010  M. H. Glauberman, “Character recognition for business machines,” Electronics, vol. 29, pp. 132-136, Feb. 1956
1011  Declaration of Gerard P. Grenier (authenticating Ex. 1005)
1012  Declaration of Eric A. Pepper (authenticating Ex. 1008)
1013  U.S. Patent No. 5,481,622 to Gerhardt
1014  U.S. Patent No. 6,044,166 to Bassman

1 Citations to non-patent publications are to the original page numbers of the publication, and citations to U.S. patents are to column:line number of the patents.
I. INTRODUCTION
Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Petitioner”) request inter partes review (“IPR”) of Claims 1-5 and 7 of U.S. Patent No. 7,650,015 (“the ’015 Patent”) (Ex. 1001), which, on its face, is assigned to Image Processing Technologies, LLC (“Patent Owner”). This Petition presents three non-cumulative grounds of invalidity that the U.S. Patent and Trademark Office (“PTO”) did not consider during prosecution. These grounds are each likely to prevail, and this Petition accordingly should be granted on all grounds and the challenged claims should be cancelled.
II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8

Real Parties-in-Interest: Petitioner identifies the following real parties-in-interest: Samsung Electronics Co., Ltd.; Samsung Electronics America, Inc.

Related Matters: Patent Owner has asserted the ’015 Patent against Petitioner in Image Processing Technologies LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG (E.D. Tex.). Patent Owner has also asserted U.S. Patent Nos. 6,959,293; 8,805,001; 8,983,134; 8,989,445; and 6,717,518 in the related action. Petitioner is concurrently filing IPR petitions for all of these asserted patents. Petitioner has previously filed the following IPR petitions against the ’015 Patent and the first four patents listed above:

IPR2017-00355 against the ’015 Patent, filed 11/30/2016.
IPR2017-00357 against U.S. Patent 8,989,445, filed 11/30/2016.
IPR2017-00336 against U.S. Patent 6,959,293, filed 11/29/2016.
IPR2017-00347 against U.S. Patent 8,805,001, filed 11/29/2016.
IPR2017-00353 against U.S. Patent 8,983,134, filed 11/30/2016.
IPR2017-01190 against U.S. Patent No. 6,717,518, filed 3/29/2017.
IPR2017-01212 against U.S. Patent No. 8,989,445, filed 3/30/2017.
IPR2017-01189 against U.S. Patent No. 6,959,293, filed 3/30/2017.
Lead and Back-Up Counsel:

• Lead Counsel: John Kappos (Reg. No. 37,861), O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660. (Telephone: 949-823-6900; Fax: 949-823-6994; Email: jkappos@omm.com.)

• Backup Counsel: Nicholas J. Whilt (Reg. No. 72,081), Brian M. Cook (Reg. No. 59,356), O’Melveny & Myers LLP, 400 S. Hope Street, Los Angeles, CA 90071. (Telephone: 213-430-6000; Fax: 213-430-6407; Email: nwhilt@omm.com, bcook@omm.com.)

Service Information: Samsung consents to electronic service by email to IPTSAMSUNGOMM@OMM.COM. Please address all postal and hand-delivery correspondence to lead counsel at O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660, with courtesy copies to the email address identified above.
III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)

The Office is authorized to charge an amount in the sum of $23,000 to Deposit Account No. 50-2862 for the fee set forth in 37 C.F.R. § 42.15(a), and any additional fees that might be due in connection with this Petition.

IV. GROUNDS FOR STANDING

Petitioner certifies that the ’015 Patent is available for IPR and Petitioner is not barred or estopped from requesting IPR on the grounds identified herein.

V. PRECISE RELIEF REQUESTED

Petitioner respectfully requests review of Claims 1-5 and 7 of the ’015 Patent, and cancellation of these claims, based on the grounds listed below:

• Ground 1: Claims 1-2 and 4-5 are obvious under 35 U.S.C. § 103(a) over Gerhardt in view of Bassman.

• Ground 2: Claims 3 and 7 are obvious under 35 U.S.C. § 103(a) over Gerhardt in view of Bassman and further in view of Hashima.

• Ground 3: Claims 1-5 and 7 are obvious under 35 U.S.C. § 103(a) over Gilbert in view of Gerhardt and further in view of Schaming.

VI. LEGAL STANDARDS

A. Claim Construction

The ’015 Patent will expire on December 2, 2017—within 18 months of the Notice of Filing Date. Thus, for purposes of this proceeding, Petitioner has interpreted each claim term according to its plain and ordinary meaning. Ex. 1002, ¶49. For purposes of invalidity raised in this proceeding, Petitioner does not believe any term needs an explicit construction.
B. Level of Ordinary Skill In The Art

One of ordinary skill in the art at the time of the alleged invention of the ’015 Patent would have had either (1) a Master’s Degree in Electrical Engineering or Computer Science or the equivalent plus at least a year of experience in the field of image processing, image recognition, machine vision, or a related field or (2) a Bachelor’s Degree in Electrical Engineering or Computer Science or the equivalent plus at least three years of experience in the field of image processing, image recognition, machine vision, or a related field. Additional education could substitute for work experience and vice versa. Ex. 1002, ¶¶ 44-48.
C. This Petition Is Not Redundant

This Petition is not redundant to earlier filed IPR2017-00355 (the “’355 Petition”) pertaining to the ’015 Patent. First, this Petition is necessitated because after Samsung filed the ’355 Petition, Patent Owner moved to add new claims to its infringement contentions that were originally served August 16, 2016, over three months earlier in the EDTX litigation. The motion for leave to amend was granted February 28, 2017. Thus, Samsung promptly prepared and filed this second Petition to address the newly-added claims. See Microsoft Corp. v. Proxyconn, Inc., Case No. IPR2013-00109, slip op., 3 (P.T.A.B. Feb. 25, 2014) (Paper 15) (instituting IPR because additional claims asserted in concurrent district court litigation). Samsung has also included any remaining, unchallenged, claims in this Petition as a protective measure against IPT continuing to assert new claims in the district court litigation. See Silicon Labs. Inc. v. Cresta Tech. Corp., Case No. IPR2015-00615, slip op. 24 (P.T.A.B. Aug. 14, 2015) (Paper 9) (instituting where petitioner filed to “challenge the remaining claims that the Patent Owner may likely assert in the district court case”).

Second, this petition raises new arguments not raised in the ’355 Petition. See id. For example, this Petition seeks institution on all new claims that were not the subject of the ’355 Petition. See Cepheid v. Roche Molecular Sys., Inc., Case No. IPR2015-00881 (P.T.A.B. Sept. 17, 2015) (Paper 9). This Petition does not seek institution on any claim that was the subject of the earlier ’355 Petition. Because these new claims have different scope, this Petition raises new arguments to address new limitations. Moreover, all grounds are new—this Petition relies on different prior art not included in the ’355 Petition to address the limitations of the newly-added claims. Facebook, Inc. v. TLI Commc’ns, LLC, Case No. IPR2015-00778, Paper 17, 26-27 (P.T.A.B. Aug. 28, 2015) (instituting where prior art and arguments were not substantially similar to previous petitions).
VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND THE ’015 PATENT

The purported invention of the ’015 Patent relates to identifying and tracking a target in an input signal using one or more histograms derived from an image frame in the video signal. Ex. 1001, at Claims 1-5 and 7; Ex. 1002, ¶¶32-34. Video image processing and the use of histograms to identify and track targets, and to derive other information from a video signal, were well known at the time the asserted patents were filed. Ex. 1002, ¶¶24-31. An input signal used in the purported invention has “a succession of frames, each frame having a succession of pixels.” Ex. 1001, 3:13-23. The input signal may be a video signal or any other signal that “generates an output in the form of an array of information corresponding to information observed by the imaging device,” such as “ultrasound, IR, Radar, tactile array, etc.” Ex. 1001, 9:6-16. The ’015 Patent then constructs a histogram showing the frequency of pixels meeting a certain characteristic. The characteristics used to form histograms are referred to as “domains” in the ’015 Patent. Ex. 1001, 3:46-58; Ex. 1002, ¶35. The ’015 Patent teaches that “the domains are preferably selected from the group consisting of i) luminance, ii) speed (V), iii) oriented direction (DI), iv) time constant (CO), v) hue, vi) saturation, and vii) first axis (x(m)), and viii) second axis (y(m)).” Ex. 1001, 3:54-58; Ex. 1002, ¶15. Figure 11 shows histogram processors that can create histograms in various domains:
The histograms include a plurality of “classes” within a given domain. Ex. 1002, ¶37. Figure 14a (and its accompanying description) illustrates an example of “classes” within a domain:
FIG. 14a shows an example of the successive classes C1 C2…Cn−1 Cn, each representing a particular velocity, for a hypothetical velocity histogram, with their being categorization for up to 16 velocities (15 are shown) in this example. Also shown is envelope 38, which is a smoothed representation of the histogram.

Ex. 1001, 20:47-52; Ex. 1002, ¶¶36-37.

The ’015 Patent then uses the histograms to identify a target in the input signal. For example, one embodiment of the ’015 Patent performs “automatic framing of a person… during a video conference.” Ex. 1001, 22:4-6; Figure 15:
The system constructs histograms in the X and Y domains counting the number of pixels that have a difference in luminance between successive frames above certain threshold values. Ex. 1001, 22:44-54 and 10:29-44 (explaining that DP is set to “1” when the pixel value of the pixel under consideration has “undergone significant variation as compared to…the same pixel in the prior frame”); Ex. 1002, ¶¶38-40. Figures 16 and 17 show camera setup and the histogram constructed using this method:

Ex. 1001, Fig.16

Ex. 1001, Fig.17
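For illustration only, the X- and Y-projection idea described above can be sketched in Python. The function name, frame representation, and threshold value are assumptions for this sketch, not taken from the ’015 Patent:

```python
# Illustrative sketch: count, per column (X) and per row (Y), the pixels whose
# luminance changed between successive frames by more than a threshold.
# Peaks in the two projection histograms localize the moving subject.

def motion_projections(prev_frame, curr_frame, threshold=20):
    """Return (hist_x, hist_y) projection histograms of changed pixels."""
    rows, cols = len(curr_frame), len(curr_frame[0])
    hist_x, hist_y = [0] * cols, [0] * rows
    for y in range(rows):
        for x in range(cols):
            # A pixel "counts" when its luminance varied significantly
            # between the prior frame and the current frame.
            if abs(curr_frame[y][x] - prev_frame[y][x]) > threshold:
                hist_x[x] += 1
                hist_y[y] += 1
    return hist_x, hist_y
```

A single moving pixel produces one non-zero bin in each projection, which is the information the framing embodiment uses to center on the subject.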
In addition, the system may also be used to automatically track a target by “a spotlight or a camera. Using a spotlight the invention might be used on a helicopter to track a moving target on the ground, or to track a performer on a stage during an exhibition. The invention would similarly be applicable to weapons targeting systems.” Ex. 1001, 23:35-40; Ex. 1002, ¶41. In such applications, the system determines the center of the target. Ex. 1001, 24:46-51. Once the center of the target is determined, the center is used to adjust the camera or spotlight to be directed to the moving target. Ex. 1001, 25:8-21; Ex. 1002, ¶42. Figure 23 shows an example of the targeting box in a frame:

Ex. 1001 at Fig.23

VIII. DETAILED EXPLANATION OF GROUNDS

A. Overview Of The Prior Art References

1. U.S. Patent No. 5,481,622 to Gerhardt (Ex. 1013)

The ’015 Patent’s purported invention relates to a process of identifying a target in digitized visual input by using histograms of pixel characteristics and tracking the target. This, however, was a technology already developed by Lester A. Gerhardt and Ross M. Sabolcik, researchers at Rensselaer Polytechnic Institute, and published as U.S. Patent No. 5,481,622 (“Gerhardt”). Gerhardt issued on January 2, 1996, and thus qualifies as prior art at least under pre-AIA 35 U.S.C. § 102(b). Although Gerhardt was of record during prosecution, it was not applied in any office action. Ex. 1004.
Gerhardt discloses an image processing system that allows a user to interface with a computer without hands. Instead, Gerhardt’s system tracks the position of a user’s pupil to generate input to the computer. Ex. 1002, ¶¶49-50. In one example, Gerhardt’s system uses a video camera mounted on a helmet, as shown in Figures 1 and 2.

Gerhardt’s system receives an input signal from a “camera means for acquiring a video image” and a “frame grabber means [that is] coupled to the camera means.” Ex. 1013, 2:25-44. The “frame grabber” converts video data (which inherently contains a plurality of frames) to digital pixel data (plurality of pixels). Ex. 1002, ¶51. For each frame input, Gerhardt generates a histogram based on the pixels’ intensity values to identify and track the user’s pupil. Ex. 1013, 9:39-61. Gerhardt forms a histogram of the eye image with bins along the horizontal axis, where the “vertical axis indicates the pixel count of each bin, and the horizontal axis indicates the magnitude of the pixel intensity of each bin.” Ex. 1013, 9:39-61. In one embodiment, Gerhardt teaches classification according to the continuous variable of intensity and that intensity may be “represented by a 7-bit greyscale, or in other words, divided up into 128 bins.” Id. An example histogram formed based on the eye image is shown in Figure 5:

From the intensity histogram, Gerhardt identifies the pupil (i.e., the target).
Gerhardt uses an intensity threshold level that will divide pixel data into two sets—a darker set (pixels with intensity below the threshold) that has total pixel area substantially equal to the expected size of the user’s pupil in the eye image, and a lighter set (the remaining pixels). Ex. 1002, ¶52. In the example shown in Figure 5, the threshold intensity (about 61) is chosen such that the pixels below the threshold (shown in black in Figure 5 above) take up about 5% of the image area.

After finding the intensity threshold corresponding to the pupil (i.e., the target), Gerhardt creates a binary image that shows only the pixels belonging to the pupil. Ex. 1013, 10:6-34; Ex. 1002, ¶53. A binary image created from the eye image is shown in Figure 6.
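The area-criterion thresholding described above can be sketched in Python. This is a minimal illustration under assumed names and a toy image, not code from Gerhardt:

```python
# Illustrative sketch of area-criterion thresholding: pick the intensity
# threshold at which the darker pixel set occupies roughly the expected
# pupil fraction (e.g., 5%) of the image, then form a binary image.

def area_threshold(image, target_fraction=0.05, bins=128):
    """Return the smallest intensity T such that pixels with intensity < T
    cover at least target_fraction of the image area."""
    flat = [p for row in image for p in row]
    histogram = [0] * bins          # e.g., 7-bit greyscale -> 128 bins
    for p in flat:
        histogram[p] += 1
    needed = target_fraction * len(flat)
    cumulative = 0
    for intensity, count in enumerate(histogram):
        cumulative += count
        if cumulative >= needed:
            return intensity + 1    # pixels strictly below T are "dark"
    return bins

def binarize(image, threshold):
    """1 marks candidate pupil pixels (darker than threshold), 0 the rest."""
    return [[1 if p < threshold else 0 for p in row] for row in image]
```

Because the threshold is recomputed from each frame's histogram, the darker set tracks the expected pupil area even as lighting changes, which is the point of the area criterion.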
Once pixels belonging to the target (pupil) are identified in the histogram, Gerhardt then “locat[es] the pupil, map[s] the pupil coordinates to display screen coordinate, and inform[s] peripheral devices of the pupil location.” Ex. 1013, 8:34-37; Ex. 1002, ¶54. This is done by first identifying the “blobs” or “set[s] of contiguous pixels” in the image using a region-growing method. Ex. 1013, 12:32-61. The system then “selects one of these blobs as corresponding to the user’s pupil” based on the blob’s properties (such as its size, centroid, X- and Y-minima and maxima of the pixels in the blob, the length-to-width ratio of the blob’s bounding rectangle, the perimeter of…the blob, or the moment of inertia). Ex. 1013, 9:7-17; 12:32-61. Examples of the “bounding rectangle[s]…that correspond[] to the x and y-coordinate maxima…and minima” of the identified blobs are shown in Figure 10:

Once the system selects a blob as the target (the pupil), Gerhardt’s system maps the pupil’s centroid in (x,y) image coordinates “into a corresponding location in screen coordinates (corresponding, for example, to the user’s point of regard on a display screen).” Ex. 1013, 15:22-27. The screen coordinates are used by the interface to provide feedback to the operator. Ex. 1013, 15:32-39; Ex. 1002, ¶55.

The above-described process of generating a histogram and locating the pupil blob in the image is repeated for each frame of the video signal. Ex. 1013, 8:45-52, 9:62-10:1. Figure 15 shows a flow chart of the image processing steps
described above in a continuous loop. Id., 8:45-52 (the process of identifying and locating the pupil is performed in a “continuous loop, which involves continually acquiring an eye image with camera 12 and attempting to locate the pupil position.”); Ex. 1002, ¶56.

For each image frame, the threshold intensity level found in the intensity histogram may change, because Gerhardt uses the area criterion (e.g., the 5% area threshold), which “permits the threshold level to be changed for each image frame to adjust for changes in lighting conditions.” Ex. 1013, 9:65-10:1; Ex. 1002, ¶57.
Gerhardt’s system also displays an outline associated with the pupil (target) at a display location, which is based on the target location. Ex. 1002, ¶58. Figure 12 illustrates the bounding rectangle around the blob identified as the pupil and as seen on the display of the image processing system. Ex. 1013, 14:33-52 (“[t]he pupil selection method according to the present invention is able to successfully select pupil blob 150 from the image of Fig.12.”). In addition to a bounding rectangle, the “perimeter of the blob” may also be used to select the target. Ex. 1013, 12:58-61.

To improve processing efficiency, Gerhardt’s system may identify a rectangular area within the image frame and generate a histogram based only on the plurality of pixels within the identified rectangular area. Ex. 1013, 21:1-18; Ex. 1002, ¶59. One method of identifying such rectangular area is by “keeping a running average of the centroid location for previously-selected pupil blobs.” Ex. 1013, 21:8-11. Histograms are generated in the “active area” that is “centered about the running average centroid location.” Id. If the pupil is not found in the rectangular area considered, “the size of the active window can be incrementally increased until the pupil blob is again successfully selected.” Ex. 1013, 21:1-18.
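The active-area strategy just described can be sketched as a small helper class. The class name, history length, and growth step are assumptions for illustration, not drawn from the patent:

```python
# Illustrative sketch of the "active area" strategy: keep a running average of
# recent pupil centroids, search a window centered on that average, and grow
# the window incrementally when the pupil is not found.

class ActiveWindow:
    def __init__(self, width, height, history=5):
        self.width, self.height = width, height
        self.history = history          # how many recent centroids to average
        self.centroids = []

    def update(self, centroid):
        """Record the centroid of the most recently selected pupil blob."""
        self.centroids.append(centroid)
        self.centroids = self.centroids[-self.history:]

    def center(self):
        """Running-average centroid about which the active area is centered."""
        xs = [c[0] for c in self.centroids]
        ys = [c[1] for c in self.centroids]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def grow(self, step=10):
        """Incrementally enlarge the window when the pupil blob is lost."""
        self.width += step
        self.height += step
```

Restricting histogram computation to this window is what yields the claimed efficiency gain, since only the pixels inside the active area are scanned.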
In some cases, Gerhardt’s system receives a user input designating the position of the pupil (target). Ex. 1002, ¶60. For example, during calibration of the system “a cursor is placed at a known location on the user interface…and the user then looks at the cursor for a set period of time.” Ex. 1013, 18:40-58. This provides the input of the pupil position to the system, and enables the system to calibrate by determining the user’s pupil location. Ex. 1013, 18:40-58.

2. U.S. Patent No. 6,044,166 to Bassman (Ex. 1014)

A similar process and apparatus is also described in a patent issued to researchers at Sarnoff Corporation of Princeton, New Jersey. U.S. Patent No. 6,044,166 to Bassman was filed on February 23, 1996, and thus qualifies as prior art at least under pre-AIA 35 U.S.C. § 102(e). Bassman was not of record and was not considered during the ’015 Patent’s prosecution. Bassman discloses an image processing system for tracking vehicles (targets) on a roadway. Ex. 1014, 2:39-3:13; Ex. 1002, ¶61.
Bassman’s image processor receives input from a video camera, and digitally processes “the pixels of the successive image frames.” Ex. 1014, 2:39-3:7; Ex. 1002, ¶62. For example, the video camera may derive a “640x480 pixel image of the portion of the roadway within the field of view” (i.e., a plurality of pixels) at a “frame rate of 7.5 frames per second” (i.e., a plurality of frames). Id. Figure 5 shows an example of an image frame derived from the video camera.

Bassman’s system uses the pixels within the image zone (for example, zone 508, which shows the second lane 506 in Figure 5 above), and integrates the pixels into a one-dimensional (1D) strip (510). Ex. 1014, 6:10-26; Ex. 1002, ¶63. For example, the system may integrate “all image pixels on row y that are within the delineated lane bounds” (Ex. 1014, 6:27-35) by creating a histogram for each row and determining whether an object (i.e., a target, such as a car) is present at row y within the lane. Ex. 1014, 6:60-7:4. Bassman’s system further “permit[s] objects to be tracked over time” by “computing and storing the average value [of the intensity of pixels in the histogram] contained within the integration window” of each strip pixel at each frame. Id., 7:5-17; Ex. 1002, ¶64. By comparing the average values at times t-1 and t, a “one-dimensional image ‘flow’” that maps the pixels in t-1 to pixels in t can be computed. Ex. 1014, 7:5-15. “This flow information can be used to track objects between each pair of successive image frames.” Ex. 1014, 7:15-17.
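The lane-strip integration and frame-to-frame comparison described above can be sketched in Python. The function names, lane bounds, and threshold are illustrative assumptions, not taken from Bassman:

```python
# Illustrative sketch: reduce each image row inside the lane bounds to one
# average value, yielding a 1-D strip per frame; comparing strips from frames
# t-1 and t indicates where objects are moving along the lane.

def integrate_strip(frame, lane_left, lane_right):
    """Average the pixels of each row y within the delineated lane bounds."""
    return [sum(row[lane_left:lane_right]) / (lane_right - lane_left)
            for row in frame]

def row_flow(strip_prev, strip_curr, threshold=1.0):
    """Rows whose average intensity changed more than threshold between
    successive frames, hinting at object motion along the lane."""
    return [y for y, (a, b) in enumerate(zip(strip_prev, strip_curr))
            if abs(b - a) > threshold]
```

A bright object moving from one row to the next shows up as a change at both rows, which is the per-row signal Bassman's flow computation builds on.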
3. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005)

The purported invention of the ’015 Patent relates to a process of identifying a target in digitized visual input by using histograms of pixel characteristics and tracking the target. But researchers at U.S. Army White Sands Missile Range, New Mexico, in collaboration with New Mexico State University, Las Cruces, had already developed a system that utilizes histograms to identify and track targets, and they published their findings in January 1980, more than 17 years before the earliest effective filing date of the ’015 Patent. Ex. 1002, ¶65; Ex. 1011, Grenier Decl.

The article, entitled “A Real-Time Video Tracking System,” published in IEEE Transactions on Pattern Analysis and Machine Intelligence in January 1980 (“Gilbert”), qualifies as prior art under pre-AIA § 102(b). Gilbert describes “a
system for missile and aircraft identification and tracking…applied in real time to identify and track objects.” Ex. 1002, ¶66; Ex. 1005, 47. Gilbert was not of record and was not considered during prosecution of the ’015 Patent. The Gilbert system includes an image processing system comprising a video processor, a projection processor, a tracker processor, and a control processor as shown in Figure 1. Ex. 1002, ¶66; Ex. 1005, 48.

The Video Processor receives an input of digitized video signal comprising 60 fields/s. Ex. 1002, ¶67; Ex. 1005, 48. Each field (half of an interlaced frame) consists of a succession of n x m pixels. Ex. 1005, 48.
The Video Processor calculates histograms of pixel intensity in each region of a tracking window (background region, plume region, and target region) in the 256 gray-level classes of the intensity domain. Id. at 49 (“As each pixel in the region is processed, one (and only one) element of H is incremented as h[x(j)] ← h[x(j)] + 1. When the entire region has been scanned, h contains the distributions of pixels over intensity and is referred to as the feature histogram of the region R.”); Ex. 1002, ¶¶68-70; Ex. 1005, Fig.2 (below).

Although Gilbert uses histograms in the intensity domain as examples, it also notes that other “features that can be functionally derived from relationship between pixels, e.g., texture, edge, and linearity measure” may be used. Ex. 1005, 48; Ex. 1002, ¶70.
Each feature histogram is normalized to a probability density function and a “linear recursive estimator and predictor [10] is utilized to establish learned estimates of the density functions.” Ex. 1005, 49; Ex. 1002, ¶¶71-72. These learned density functions derived from histogram statistics are used as classification thresholds to classify pixels in the target region as target, background, or plume. Ex. 1005, 50; Ex. 1002, ¶72. This identification process may be done for one target/plume/background set, or two different target/plume/background sets simultaneously. Ex. 1005, 48 (“Although one tracking window is satisfactory for tracking missile targets with plumes, two windows are used to provide additional reliability and flexibility for independently tracking a target and plume, or two targets.”); Ex. 1002, ¶73.
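The histogram-to-density pipeline described above can be sketched in Python. The smoothing constant and the use of simple exponential smoothing as the "linear recursive estimator" are assumptions of this sketch, not details taken from Gilbert:

```python
# Illustrative sketch: scan a region incrementing one histogram element per
# pixel, normalize the feature histogram to a probability density, and update
# a learned estimate of the density with a linear recursive blend.

def feature_histogram(region_pixels, classes=256):
    """One (and only one) element of h is incremented per pixel scanned."""
    h = [0] * classes
    for x in region_pixels:
        h[x] += 1
    return h

def to_density(h):
    """Normalize a feature histogram to a probability density function."""
    n = sum(h)
    return [c / n for c in h]

def recursive_estimate(prev_density, new_density, alpha=0.25):
    """Blend the learned density toward the newest observation; a simple
    stand-in for Gilbert's linear recursive estimator and predictor."""
    return [(1 - alpha) * p + alpha * q
            for p, q in zip(prev_density, new_density)]
```

The learned densities for the background, plume, and target regions are what the system compares to classify each pixel in the target region.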
The Tracker Processor then uses the target classification results to track the target. Ex. 1005, 51. In particular:

The tracker processor establishes a confidence weight for its inputs, computes boresight and zoom correction signals, and controls the position and shape of the target tracking window to implement an intelligent tracking strategy.

Id., 52. The tracking window data is then fed back to the Video Processor. Id. The size, shape, and position of the tracking window, in turn, control which pixels are included in each of the BR, PR, and TR histograms of pixel intensity that are acquired by the Video Processor. Id., 48; Ex. 1002, ¶80. Thus, the statistical