UNITED STATES PATENT AND TRADEMARK OFFICE
______________________________________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
______________________________________________

Samsung Electronics Co., Ltd., and
Samsung Electronics America, Inc.,
Petitioners

v.

Image Processing Technologies, LLC,
Patent Owner.


PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 6,959,293

TABLE OF CONTENTS

I.    Introduction
II.   Grounds for Standing, Mandatory Notices, and Fee Authorization
III.  Summary of Challenges
IV.   Overview of The Relevant Technology and ’293 Patent
V.    The Invalidating Prior Art
      A. International Patent Publication WO 99/36893 (“Pirim”)
      B. U.S. Patent No. 5,546,125 to Tomitaka et al. (“Tomitaka”)
      C. Robert B. Rogers, “Real-Time Video Filtering With Bit-Slice Microprogrammable Processors,” Ph.D. Dissertation, New Mexico State University (1978) (“Rogers”)
      D. Alton L. Gilbert et al., “A Real-Time Video Tracking System,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-2, No. 2, January 1980 (“Gilbert”)
VI.   Level of Ordinary Skill In The Art
VII.  Claim Construction
VIII. Specific Explanation Of Grounds For Invalidity
      A. Ground 1: Claims 1, 18, 19, 22, and 29 are rendered obvious under 35 U.S.C. § 103 by the combination of Pirim and Tomitaka
         1. Reasons to Combine Pirim and Tomitaka
         2. Claim 1
         3. Claim 18
         4. Claim 19
         5. Claim 22
         6. Claim 29
      B. Ground 2: Rogers in combination with Gilbert renders obvious claims 1, 18, 19, 22, and 29 under 35 U.S.C. § 103
         1. Reasons to Combine Rogers and Gilbert
         2. Claim 1
         3. Claim 18
         4. Claim 19
         5. Claim 22
         6. Claim 29
      C. Ground 3: Claims 1, 18, 19, 22, and 29 are rendered obvious under 35 U.S.C. § 103 by the combination of Tomitaka and Rogers
         1. Reasons to Combine Tomitaka and Rogers
         2. Claim 1
         3. Claim 18
         4. Claim 19
         5. Claim 22
         6. Claim 29
IX.   Grounds 1, 2, and 3 are not Cumulative
X.    Conclusion
Certification of Word Count

LIST OF EXHIBITS

Exhibit No.  Description
1001         U.S. Patent No. 6,959,293
1002         Declaration of Dr. John C. Hart
1003         Curriculum Vitae of Dr. John C. Hart
1004         Prosecution File History of U.S. Patent No. 6,959,293
1005         WO 99/36893, Patrick Pirim and Thomas Binford, “Method and Apparatus for Detection of Drowsiness,” published July 22, 1999
1006         Robert Rogers, “Real-Time Video Filtering with Bit-Slice Microprogrammable Processors,” Ph.D. Dissertation, New Mexico State University (December 1978)
1007         U.S. Patent No. 5,546,125 to Tomitaka, et al., issued August 1996
1008         Alton L. Gilbert et al., “A Real-Time Video Tracking System,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-2, No. 2, January 1980
1009         Declaration of Susan E. Beck (authenticating Ex. 1006)
1010         Ø. D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition - A Survey,” Pattern Recognition, vol. 29, no. 4, 1996, pp. 641-662
1011         M. H. Glauberman, “Character recognition for business machines,” Electronics, vol. 29, pp. 132-136, Feb. 1956
1012         Declaration of Gerard P. Grenier (authenticating Ex. 1008)

I. INTRODUCTION

Pursuant to 37 C.F.R. § 42.100, et seq., Samsung Electronics Co., Ltd. (“Petitioner” or “Samsung”) hereby petitions the United States Patent and Trademark Office (the “Office”) to institute an inter partes review of claims 1, 18, 19, 22, and 29 of U.S. Patent No. 6,959,293 (“the ’293 Patent”). The ’293 Patent, attached as Ex. 1001, is assigned to Image Processing Technologies, LLC (“Patent Owner”). The ’293 Patent generally relates to a system and method of analyzing an aural or visual image or event by using histograms. See, e.g., Ex. 1001 at claims 1, 18, 22, 29. As set forth below, claims 1, 18, 19, 22, and 29 of the ’293 Patent are invalid as obvious over the prior art. This petition presents non-cumulative grounds of invalidity based on combinations of prior art that were not before the Office during prosecution. These grounds are each reasonably likely to prevail, and this petition, accordingly, should be granted on all grounds.
II. GROUNDS FOR STANDING, MANDATORY NOTICES, AND FEE AUTHORIZATION

Grounds for Standing: Petitioner certifies that the ’293 patent is available for inter partes review and that Petitioner is not barred or estopped from requesting an inter partes review challenging the patent claims on the grounds identified in this petition.

Real Party-In-Interest: Samsung Electronics Co., Ltd.; and Samsung Electronics America, Inc.

Notice of Related Matters: Patent Owner has asserted the ’293 patent against Petitioner in Image Processing Techs., LLC v. Samsung Electronics Co., Ltd., et al., Case No. 2:16-cv-00505-JRG (E.D. Tex., filed May 13, 2016). Patent Owner has also asserted infringement of U.S. Patent Nos. 8,989,445; 8,893,134; 8,805,001; and 7,650,015 in this same litigation, and Petitioner is concurrently filing IPR petitions requesting review of each of these patents.

Lead and Back-Up Counsel:

• Lead Counsel: John Kappos (Reg. No. 37,861), O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660. (Telephone: 949-823-6900; Fax: 949-823-6994; Email: jkappos@omm.com.)

• Backup Counsel: Nick Whilt (Reg. No. 72,081), Brian M. Cook (Reg. No. 59,356), O’Melveny & Myers LLP, 400 S. Hope Street, Los Angeles, CA 90071. (Telephone: 213-430-6000; Fax: 213-430-6407; Email: nwhilt@omm.com, bcook@omm.com.)

Service Information: Samsung consents to electronic service by email to IPTSAMSUNGOMM@omm.com. Please address all postal and hand-delivery correspondence to lead counsel at O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660, with courtesy copies to the email address identified above.

Fee Authorization: The Office is authorized to charge $23,000 to Deposit Account No. 50-0639 for the fee set forth in 37 C.F.R. § 42.15(a), and any additional fees that might be due in connection with this Petition.
III. SUMMARY OF CHALLENGES

Petitioner respectfully requests cancellation of claims 1, 18, 19, 22, and 29 on the following grounds:

• Ground 1: International Patent Publication WO 99/36893 (“Pirim”) in combination with U.S. Patent No. 5,546,125 (“Tomitaka”) renders obvious claims 1, 18, 19, 22, and 29 under 35 U.S.C. § 103;

• Ground 2: Robert B. Rogers, “Real-Time Video Filtering With Bit-Slice Microprogrammable Processors,” Ph.D. Dissertation, New Mexico State University (1978) (“Rogers”) in combination with Alton L. Gilbert et al., “A Real-Time Video Tracking System,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-2, No. 2, January 1980 (“Gilbert”) renders obvious claims 1, 18, 19, 22, and 29 under 35 U.S.C. § 103; and

• Ground 3: Tomitaka in combination with Rogers renders obvious claims 1, 18, 19, 22, and 29 under 35 U.S.C. § 103.

See Ex. 1002, Hart Decl. ¶¶ 36-37.

IV. OVERVIEW OF THE RELEVANT TECHNOLOGY AND ’293 PATENT

The ’293 patent was filed on February 23, 2001, names Patrick Pirim as the inventor, and claims priority to a foreign application filed February 24, 2000. Ex. 1001 at 1, 50. It is directed to using histograms for image processing, which was well known for decades before its priority date. Ex. 1002, Hart Decl. ¶¶19-26. It claims a device and method for processing a scene by acquiring one or more histograms of parameters associated with a digitized picture element or “pixel.” See, e.g., Ex. 1001, ’293 Patent, at claims 1, 18, 22, 29. For example, an input video signal S(t) comprises a succession of frames, each made up of pixels. Id. at 7:55-63. “This signal S(t) carries a value aij of the parameter A for each pixel (i, j).” Id. at 7:59-60. Parameter A refers to a property of a pixel, such as its speed, shape, color, etc. See id. at 1:18-20, 29-31. The values of A for a given frame are analyzed using a histogram processor, such as depicted in Figure 3, annotated below:

[Figure 3 of the ’293 patent (annotated)]

Digital DATA(A), corresponding to parameter A, flows through input multiplexer 105 (shaded green) to the address input of histogram memory 100 (shaded red). If each DATA(A) were an 8-bit value representing pixel brightness (ranging from 0 to 255) for a pixel in the frame, the histogram memory would increment the value stored at the address representing the brightness value for that pixel. In other words, once the frame is processed, the histogram memory would contain a value at each of 256 memory addresses representing the number of pixels having the brightness value corresponding to that address. See id. at 8:45-64. Ex. 1002, Hart Decl. ¶¶27-29.
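For illustration, the accumulation just described can be modeled in a few lines of Python (a sketch with hypothetical names; the patent describes a hardware histogram memory, not software):

```python
import numpy as np

def accumulate_histogram(frame_brightness):
    """Model of the histogram memory: one counter per possible 8-bit value."""
    histogram = np.zeros(256, dtype=np.uint32)   # 256 "memory addresses"
    for value in frame_brightness.ravel():       # DATA(A) for each pixel in the frame
        histogram[value] += 1                    # increment the count stored at that address
    return histogram

# Example: a random 8-bit "frame" of 480 x 640 pixels.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
hist = accumulate_histogram(frame)
assert hist.sum() == frame.size                  # every pixel contributes exactly one count
```

After the frame is processed, hist[b] holds the number of pixels whose brightness equals b, mirroring the 256-address memory described above.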
Figure 3 also depicts a “classifier unit” 101 (shaded blue) that takes a DATA(A) value as an input and evaluates whether it meets a particular condition, for example, brightness equal to 203. Ex. 1001 at 9:31-34. Other embodiments include more general classifiers that evaluate whether a data value falls within a certain range or exceeds a certain threshold. See id. at Figs. 12, 13a (classifier 119 evaluates whether data P is greater than condition Q). The output of the classifier indicating whether or not the condition is met is sent to coincidence bus 111 (shaded yellow). Id. at 9:36-42. Output signals from multiple classifiers associated with other histogram units (B, C, D, E . . .) may also be present on coincidence bus 111 and are sent to coincidence unit 102 (shaded purple). Id. at 10:34-40. Ex. 1002, Hart Decl. ¶¶30-31.

The coincidence unit 102 (shaded purple) includes logic that determines whether a pixel will be added to the histogram memory 100 (shaded red) based on selected classification conditions. Ex. 1001 at Fig. 3 (validation signal). Validation signal logic might enable the histogram memory when the brightness parameter for that pixel is greater than 50, or it might enable the histogram memory only for those pixels with both brightness greater than 100 and color equal to red. See id. at 9:36-50. Ex. 1002, Hart Decl. ¶¶30-31.
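The classifier, coincidence bus, and validation logic described above can likewise be sketched in software (hypothetical names and thresholds; units 101, 102, and 111 are hardware in the patent). Only pixels whose selected classifier outputs all agree are counted:

```python
import numpy as np

def build_gated_histogram(brightness, redness, bins=256):
    """Only pixels passing every selected classification condition are counted."""
    histogram = np.zeros(bins, dtype=np.uint32)
    for b, r in zip(brightness.ravel(), redness.ravel()):
        # Classifier outputs placed on the "coincidence bus".
        bright_enough = b > 100            # classifier on parameter A (brightness)
        is_red = r > 200                   # classifier on another parameter (color)
        # Coincidence unit: the validation signal is the AND of the selected conditions.
        validation = bright_enough and is_red
        if validation:
            histogram[b] += 1              # histogram memory enabled for this pixel only
    return histogram

frame_brightness = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)
frame_redness = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)
hist = build_gated_histogram(frame_brightness, frame_redness)
```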
The classification condition described above is a fixed value. However, the classification condition may also be set automatically based on statistics about the scene. See id. at 11:9-29. For example, in Figure 13a, annotated below, the classifier 119 (shaded blue) evaluates whether data P is greater than condition Q.

[Figure 13a of the ’293 patent (annotated)]

Q might be derived from statistics such as RMAX (shaded red), the number of counts in the highest bin, or NBPTS (shaded orange), the number of pixels in the histogram, for example. Id. at 10:7-31.

    Generally, the classifier may be achieved according to numerous embodiments, the essential being that it allows to place the parameter DATA(A) with respect to values or limits statistically determined over a set of former data DATA(A).

Id. at 13:32-36. For example, the processor might determine the maximum brightness of the pixels in a frame and set a classifier condition for subsequent frames based on that statistic, such as implementing a classifier that selects pixels having a brightness less than 80% of the maximum brightness. This classifier, in conjunction with the validation logic, would then ensure that only those pixels satisfying this condition are included in the histogram. Ex. 1002, Hart Decl. ¶¶31-35.
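To make this example concrete, a brief sketch (hypothetical, not taken from the patent) of a classifier limit derived from the previous frame's statistics might look like this:

```python
import numpy as np

def classify_next_frame(prev_frame, next_frame):
    """Set the classifier limit from a statistic of the previous frame's data."""
    limit = 0.8 * prev_frame.max()           # e.g., 80% of the maximum brightness seen
    histogram = np.zeros(256, dtype=np.uint32)
    for value in next_frame.ravel():
        if value < limit:                    # classifier: DATA(A) compared against the limit
            histogram[value] += 1            # validation logic admits only qualifying pixels
    return limit, histogram

prev = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)
nxt = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)
limit, hist = classify_next_frame(prev, nxt)
```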
V. THE INVALIDATING PRIOR ART

A. International Patent Publication WO 99/36893 (“Pirim”)

Pirim, Ex. 1005, was published on July 22, 1999, based on an international application filed January 15, 1999, designating the U.S., and qualifies as prior art under at least pre-AIA 35 U.S.C. §§ 102(a), 102(b), and § 119 (“but no patent shall be granted . . . for an invention which had been . . . described in a printed publication in any country more than one year before the date of the actual filing of the application in this country.”). Pirim names one inventor in common with the challenged ’293 patent but has a different inventive entity. Ex. 1005 at 1. Pirim is of record in the prosecution history but was never discussed or used in a rejection by the examiner. Further, the obviousness combination presented here was never considered during prosecution. Ex. 1004, Prosecution History at 110, 201-231, 237-242.

Pirim discloses a system for detecting whether a driver is falling asleep by acquiring pictures of the driver and forming histograms to analyze opening and closing of the driver’s eyes. Ex. 1005, Pirim, at 5. Pirim’s image processing system “receives a digital video signal S originating from a video camera or other imaging device 13 which monitors a scene 13a.” Id. at 12. “Signal S(PI) represents signal S composed of pixels PI.” Id. at 13. Each video frame comprises horizontally scanned lines, each including “a succession of pixels or image points PI, e.g., a1.1, a1.2, and a1.3 for line l1.1.” Id.

With reference to Figure 14, annotated below, Pirim discloses a histogram unit having a memory 100 (shaded red). Data(V), representing pixel parameter V, proceeds through input multiplexer 104 (shaded green) to the address input of memory 100. Id. at 29. Just as in the ’293 patent, a value stored at the address corresponding to the value of the input data parameter is incremented to accumulate a histogram of the parameter. Id.

Pirim further discloses a “classifier 25b” (shaded blue) that receives the data(V) value and compares it to a “register 106 that enables the classification criteria to be set by the user, or by a separate computer program.” Id. at 29-30.

[Figure 14 of Pirim (annotated)]

The output of classifier 25b proceeds to a bus 23 (shaded yellow), which also carries the output of other classifiers in the system. Id. at 31. These signals proceed to validation unit 31 (shaded purple). “Each validation unit generates a validation signal which is communicated to its associated histogram formation block 24-29. The validation signal determines, for each incoming pixel, whether the histogram formation block will utilize that pixel in forming its histogram.” Id. at 30. Thus, the operation of the system is summarized as follows:

    Thus, using the classifiers in combination with validation units 30-35, the system may select for processing only data points in any selected classes within any selected domains. For example, the system may be used to detect only data points having speed 2, direction 4, and luminance 125 by setting each of the following registers to “1”: the registers in the validation units for speed, direction, and luminance, register 2 in the speed classifier, register 4 in the direction classifier, and register 125 in the luminance classifier. In order to form those pixels into a block, the registers in the validation units for the x and y directions would be set to “1” as well.

Id. at 31.
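The register-based selection in the passage above can be paraphrased in a short sketch (hypothetical data structures; Pirim describes hardware registers, classifiers, and validation units):

```python
# One classifier per parameter domain; a "register" marks which classes are selected.
classifier_registers = {
    "speed":     {2},     # register 2 set to "1" in the speed classifier
    "direction": {4},     # register 4 set to "1" in the direction classifier
    "luminance": {125},   # register 125 set to "1" in the luminance classifier
}
enabled_domains = {"speed", "direction", "luminance"}  # validation-unit registers set to "1"

def pixel_selected(pixel):
    """pixel is a dict such as {"speed": 2, "direction": 4, "luminance": 125}."""
    return all(pixel[domain] in classifier_registers[domain] for domain in enabled_domains)

print(pixel_selected({"speed": 2, "direction": 4, "luminance": 125}))  # True
print(pixel_selected({"speed": 3, "direction": 4, "luminance": 125}))  # False
```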
Pirim also discloses that statistical characteristics of the histogram are calculated, including “the minimum (MIN) of the histogram, the maximum (MAX) of the histogram, the number of points (NBPTS) in the histogram, the position (POSRMAX) of the maximum of the histogram.” Id. at 32. Such statistics may be used to automatically set limits of the classifiers:

    Fig. 13 diagrammatically represents the envelopes of histograms 38 and 39, respectively in x and y coordinates, for velocity data. In this example, XM and YM represent the x and y coordinates of the maxima of the two histograms 38 and 39, whereas la and lb for the x axis and lc and ld for the y axis represent the limits of the range of significant or interesting speeds, la and lc being the longer [sic] limits and lb and ld being the upper limited [sic] of the significant portions of the histograms. Limits la, lb, lc, and ld may be set by the user or by an application program using the system, may be set as a ratio of the maximum of the histogram, e.g., XM/2, or may be set as otherwise desired for the particular application.

Id. at 36-37 (emphasis added). In other words, among the ways the classification criterion can be set, it can be set to a statistic derived from the histogram, such as half of the maximum value (of the velocity data in this example).
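For illustration, the named statistics and a limit set as a ratio of the histogram maximum could be computed as follows (a hypothetical sketch; reading MIN and MAX as the lowest and highest occupied bins is an assumption):

```python
import numpy as np

def histogram_statistics(hist):
    """Compute histogram statistics analogous to those named by Pirim."""
    occupied = np.nonzero(hist)[0]
    return {
        "MIN": int(occupied[0]),        # lowest occupied bin (one plausible reading of MIN)
        "MAX": int(occupied[-1]),       # highest occupied bin (one plausible reading of MAX)
        "NBPTS": int(hist.sum()),       # number of points in the histogram
        "RMAX": int(hist.max()),        # count in the most populated bin
        "POSRMAX": int(hist.argmax()),  # position of the maximum of the histogram
    }

hist = np.array([0, 3, 9, 14, 6, 2, 0, 0])   # toy velocity histogram
stats = histogram_statistics(hist)
limit = stats["POSRMAX"] / 2                 # a limit set as a ratio of the maximum, analogous to XM/2
```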
Pirim also discloses classifiers based on X and Y position of pixels that restrict histogram processing to only a particular rectangular region:

    In order to process pixels only within a user-defined area, the x-direction histogram formation unit 28 may be programmed to process pixels only in a class of pixels defined by boundaries, i.e., XMIN and XMAX. This is accomplished by setting the XMIN and XMAX values in a user-programmable memory in x-direction histogram formation unit 28 or in linear combination units 30-35. Any pixels outside of this class will not be processed. Similarly, y-direction histogram formation unit 29 may be used to process pixels only in a class of pixels defined by boundaries YMIN and YMAX.

Id. at 35. These X and Y MIN and MAX classification criteria may also be changed automatically by the system using statistics derived from the histograms:

    Because the moving object may leave the bounded area the system preferably includes an anticipation function which enables XMIN, XMAX, YMIN, and YMAX to be automatically modified by the system to compensate for the direction of the target. This is accomplished by determining values for O-MVT, corresponding to the orientation (direction) of movement of the target within the bounded area using the direction histogram, and I-MVT, corresponding to the intensity (velocity) of movement. Using these parameters, controller 42 may modify the values of XMIN, XMAX, YMIN, and YMAX on a frame-by-frame basis to ensure that the target remains in the bounded box being searched.

Id. at 39-40. Thus, Pirim discloses automatic updating of classification criteria based on statistical data derived from the histograms. See Ex. 1002, Hart Decl. ¶¶42-47.
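A simplified software analogue of this anticipation function (hypothetical; Pirim assigns the update to controller 42 operating on O-MVT and I-MVT) might shift the bounding box each frame along the measured movement:

```python
import math

def anticipate_box(box, o_mvt_degrees, i_mvt_pixels):
    """Shift XMIN/XMAX/YMIN/YMAX along the measured movement direction.

    box: dict with keys XMIN, XMAX, YMIN, YMAX
    o_mvt_degrees: orientation (direction) of target movement
    i_mvt_pixels: intensity (velocity) of movement, in pixels per frame
    """
    dx = i_mvt_pixels * math.cos(math.radians(o_mvt_degrees))
    dy = i_mvt_pixels * math.sin(math.radians(o_mvt_degrees))
    return {
        "XMIN": box["XMIN"] + dx, "XMAX": box["XMAX"] + dx,
        "YMIN": box["YMIN"] + dy, "YMAX": box["YMAX"] + dy,
    }

box = {"XMIN": 100, "XMAX": 180, "YMIN": 60, "YMAX": 140}
box = anticipate_box(box, o_mvt_degrees=45, i_mvt_pixels=6)   # updated frame by frame
```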
B. U.S. Patent No. 5,546,125 to Tomitaka et al. (“Tomitaka”)

Tomitaka, Ex. 1007, was filed July 6, 1994 and issued August 13, 1996. Ex. 1007 at 1. Tomitaka qualifies as prior art under at least 35 U.S.C. § 102(b). Tomitaka was not before the PTO during prosecution of the challenged patent. Ex. 1001, ’293 patent at face; Ex. 1004, Prosecution History at 110, 201-231, 237-242.

Tomitaka discloses a “video signal follow-up processing system for adaptively tracking to the moving of a subject.” Ex. 1007, Tomitaka, at Abstract. As illustrated in Figure 1, annotated below, a color video signal from optical system 1 is digitized in A/D 6, and the color of each pixel is separated into brightness (Y or L) and chroma (C) signals. Id. at 4:7-16. The chroma color signal is further demodulated into individual color difference signals R-Y and B-Y to form three color values: Y, R-Y, and B-Y. Id. at 4:17-32. The color data Y, R-Y, and B-Y is converted into the HLS color coordinate system and written to image memory 15 (shaded orange) for processing. Id. at 4:39-50. Brightness and hue are then processed by two histogram units 19 and 20 (shaded blue and red). Id. at 6:1-6.

[Figure 1 of Tomitaka (annotated)]

Hue histogram signal S13 and brightness histogram signal S14 are sent to follow-up signal processor 16 (shaded purple). Id. at 5:42-65. The follow-up signal processor 16 forms “feature patterns” from the hue and brightness histograms that are compared with reference measurements to track an object: “they are compared with an image portion in a reference measurement frame so that the panning and tilting of the lens block 1 is adaptively controlled to always move the position of a detected measurement frame having an image with the highest similarity to the signal of the reference measurement frame.” Id. at 6:64-7:2.

However, “when the values of components of the hue signal HUE and the components of the brightness signal Y are close to the threshold values corresponding to each sort value, the sort values to be included become uncertain depending on presence or absence of noise.” Id. at 6:14-19. To address this issue, certain pixels are prevented from being included in the hue histogram by logic involving gate 18 (shaded green), comparator 25 (shaded yellow), and noise threshold signal S15 from the follow-up signal processor 16 (shaded purple). As seen in Figure 1, a “noise determination signal S15” is determined by the follow-up signal processor 16 and sent to comparator circuit 25 (shaded yellow), where it is compared to the saturation value. Id. at 6:40-49.

    When the hue signal HUE detected at the saturation/hue detection circuit 14 is close to the L axis (shown in FIG. 2), there is a possibility that the hue signal HUE may not have meaning as information since it is buried in noise because of having low saturation. Such a meaningless hue signal HUE is removed in the gate circuit 18.

Id. at 6:50-55. Thus, pixels that fail the classification condition set up in comparator 25 are prevented by logic in gate 18 from being included in the HUE histogram formed by histogram unit 19. Furthermore, the classification criterion for rejecting such a pixel, represented by signal S15 from the follow-up signal processor 16, is based on histogram data inputs S13 and S14. See Ex. 1002, Hart Decl. ¶¶48-50.
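The gating behavior can be illustrated with a short sketch (hypothetical names and threshold handling; Tomitaka implements this with comparator 25 and gate 18 in hardware). Pixels whose saturation falls below the noise determination threshold are simply excluded from the hue histogram:

```python
import numpy as np

def hue_histogram_with_gate(hue, saturation, s15_threshold, bins=256):
    """Accumulate a hue histogram, dropping low-saturation (noise-buried) pixels."""
    histogram = np.zeros(bins, dtype=np.uint32)
    for h, s in zip(hue.ravel(), saturation.ravel()):
        if s >= s15_threshold:     # comparator: keep only pixels far enough from the L axis
            histogram[h] += 1      # gate passes the pixel to the hue histogram unit
    return histogram

hue = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)
saturation = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)
hist = hue_histogram_with_gate(hue, saturation, s15_threshold=32)
```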
C. Robert B. Rogers, “Real-Time Video Filtering With Bit-Slice Microprogrammable Processors,” Ph.D. Dissertation, New Mexico State University (1978) (“Rogers”)

The Rogers dissertation, Ex. 1006, was catalogued with the New Mexico State University library in 1978. Ex. 1009, Beck Decl. ¶¶ 3-10. Rogers qualifies as prior art under at least 35 U.S.C. § 102(b). Rogers was not before the PTO during prosecution of the challenged patent. Ex. 1001, ’293 patent at face; Ex. 1004, Prosecution History at 110, 201-231, 237-242.

Rogers describes a system for tracking a missile or similar object by digitizing video images and analyzing the pixel intensity (brightness) using histograms:

    Real-time video filtering is concerned with the separation of a target image from the background scene at standard video data rates. The scene in the field-of-view (FOV) of the television camera is digitized to form an n-by-m matrix representation of the picture P as

        P = Pij, i = 1, 2, . . . n, j = 1, 2, . . . m,

    where Pij represents the pixel intensity at point (i,j).

Ex. 1006, Rogers at 25. Within the camera’s field-of-view, the system defines a tracking window comprising three regions—the target region (TR), the plume region (PR), and the background region (BR), as illustrated in Figure 1 of Rogers (annotated below):

[Figure 1 of Rogers (annotated)]

The target region (TR) of the tracking window contains the entire target plus part of the target’s plume (bright exhaust feature), plus a region of background that is neither target nor plume. Id. at 26. The plume region (PR) samples the pixel intensities in the target’s plume, and the background region (BR) samples pixel intensities that are in the background of the scene. Id.

Rogers discloses two independent windows that can be used either to track two objects, to redundantly track an object and its plume, or to provide a redundant tracking window (possibly with a different size) for a single target. Id. at 28-29. This is illustrated in Figure 2, reproduced below:

[Figure 2 of Rogers]

For every video field (half of an interlaced frame) acquired, histograms of the pixel intensity for the pixels falling within each region are acquired: “During each field, the feature histograms are accumulated for the three regions.” Id. at 29. Thus, for two tracking windows, six histograms of pixel intensity are formed—TR, BR, and PR histograms for each of the two tracking windows. The regions are configured such that “1. the BR contains only background points (state of nature B), 2. the PR contains background and plume points (states of nature B and P), and 3. the TR contains background, plume, and target points (states of nature B, P, and T).” Id. at 41.

The pixel intensity data used to build the histograms is subjected to a “threshold classifier” that classifies each pixel in the TR region as target, plume, or background by comparing it to classification thresholds. Id. at 34-35. For example:

    If XT is the set of pixel values that are to be classified as target pixels, then an arbitrary pixel value x is classified as

        x ∈ XT iff hBR(x) < tBR and hPR(x) < tPR.

    Similarly, if XP is the set of pixels which are to be classified as plume pixels, then

        x ∈ XP iff hBR(x) < tBR and hPR(x) > tPR.

    The third possibility of classification is the background set XB, which is chosen if neither one of the above conditions apply:

        x ∈ XB iff hBR(x) >= tBR.

Id. at 35. Here, hR(x) is the normalized feature histogram of region R, where R is the background, plume, or target region, and tBR and tPR are classification thresholds. Id. at 30, 32.
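The three quoted classification rules translate directly into code; the sketch below (hypothetical function and variable names) returns the class of a single pixel value x given the normalized BR and PR histograms and the two thresholds:

```python
def classify_pixel(x, h_br, h_pr, t_br, t_pr):
    """Rogers-style threshold classification of a pixel value x.

    h_br, h_pr: normalized feature histograms for the background and plume regions
    t_br, t_pr: classification thresholds
    """
    if h_br[x] < t_br and h_pr[x] < t_pr:
        return "target"        # x in XT
    if h_br[x] < t_br and h_pr[x] > t_pr:
        return "plume"         # x in XP
    return "background"        # x in XB, chosen if neither condition applies

h_br = [0.30, 0.10, 0.02, 0.01]   # toy normalized histograms over four intensity levels
h_pr = [0.05, 0.40, 0.03, 0.02]
print(classify_pixel(1, h_br, h_pr, t_br=0.2, t_pr=0.2))  # "plume"
```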
The classification thresholds tBR and tPR are not static but instead are set dynamically depending on the data collected in the feature histograms:

    A convenient feature of the thresholding algorithm is that the threshold value may be adjusted dynamically on a frame-to-frame basis using a hueristic rule. One intuitively reasonable rule is that the number of points in XT should be no more than η percent of the number of points in the TR region, with typical values of η being ten to twenty percent. A similar statement can be made concerning the number of plume points in XP. By adjusting the threshold values according to the observed values of η, the sensitivity of the algorithm to noise in the observed scenes (and to overlap between the target, plume, and background distributions) may be tailored to the application.

Id. at 37.
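One way to realize the quoted rule in software (a hypothetical sketch; Rogers does not prescribe this particular update) is to nudge the threshold whenever the fraction of TR pixels classified as target drifts away from the chosen η:

```python
def adjust_threshold(t_br, target_count, tr_count, eta=0.15, step=0.01):
    """Frame-to-frame threshold adjustment keeping the target fraction near eta."""
    observed = target_count / tr_count   # fraction of TR pixels classified as target
    if observed > eta:
        t_br -= step   # too many target points: tighten the background threshold
    else:
        t_br += step   # too few: relax it
    return t_br

t_br = 0.2
t_br = adjust_threshold(t_br, target_count=900, tr_count=4000)   # applied once per field
```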
Furthermore, the classification thresholds are based on probability densities that are updated based on histogram data from prior video frames. Id. at 33. A weighting parameter, ω, can be set between 0 and 1 to specify how heavily prior histogram data should be weighted. For example:

    Letting h(i|j) represent the learned estimate of any probability density function for the ith field using the sampled density functions for all previous fields up to the jth field, a linear estimator and predictor can be expressed as

        h(i|i) = ω h(i|i-1) + (1-ω) h_i

    and

        h(i+1|i) = 2 h(i|i) - h(i-1|i-1),

    where h_i denotes the observed probability density function at the ith frame. The above equations provide a linear recursive method for compiling learned density functions. The weighting factor can be used to vary the learning rate. When ω = 0, the learning effect is disabled, and the measured histograms are used by the predictor. As ω increases toward one, the learning effect increases and the measured density functions have a reduced effect.

Id. at 33. Thus, the threshold classifiers can be automatically updated based on statistical analysis of the measured histograms.
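The recursive estimator and predictor in this passage can be written out directly (a sketch; the array names are hypothetical, and ω is the weighting factor described by Rogers):

```python
import numpy as np

def update_learned_density(h_learned_prev, h_observed, omega):
    """h(i|i) = omega * h(i|i-1) + (1 - omega) * h_i  (recursive learned estimate)."""
    return omega * h_learned_prev + (1.0 - omega) * h_observed

def predict_next_density(h_learned_now, h_learned_before):
    """h(i+1|i) = 2 * h(i|i) - h(i-1|i-1)  (linear predictor for the next field)."""
    return 2.0 * h_learned_now - h_learned_before

# With omega = 0 the learning effect is disabled and the measured histogram is used as-is.
h_prev = np.full(256, 1 / 256)                # learned density from earlier fields
h_obs = np.random.dirichlet(np.ones(256))     # observed (normalized) histogram this field
h_now = update_learned_density(h_prev, h_obs, omega=0.7)
h_pred = predict_next_density(h_now, h_prev)
```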
The pixels identified as target pixels using the threshold classifier are then fed to the “region definition logic (RDL) which defines the locations of the tracking windows in the camera’s field-of-view.” Id. at 49.

    The region definition logic (RDL) combines the pixel clock and horizontal sync signals with information specifying the required sizes and locations of the two tracking windows to produce level signals indicating the location of the current pixel relative to each tracking window.

    The first block of hardware in the RDL consists of a group of latches which are loaded by the video processor (just prior to the end of VSX) with the location of the top left corner, and the height, and the width, of the two windows.

Id. at 59. The tracking windows, in turn, select the subsets of pixels within the field of view that are accumulated in each of the six intensity histograms, as can be seen
