`571—272—7822
`
`Paper No. 13
`Entered: May 25, 2017
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`SAMSUNG ELECTRONICS CO., LTD. and
`
`SAMSUNG ELECTRONICS AMERICA, INC.,
`
`Petitioner,
`
v.
`
`IMAGE PROCESSING TECHNOLOGIES LLC,
`
`Patent Owner.
`
`'
`
`Case IPR2017-00355
`
`Patent 7,650,015 B2
`
`Before JONI Y. CHANG, MICHAEL R. ZECHER, and
`JESSICA C. KAISER, Administrative Patent Judges.
`
`ZECHER, Administrative Patent Judge.
`
`DECISION
`
`Granting Institution of Inter Partes Review
35 U.S.C. § 314(a) and 37 C.F.R. § 42.108
`
`
`
`
`I. INTRODUCTION
`
`Petitioner, Samsung Electronics Co., Ltd. and Samsung Electronics
`
`America, Inc. (collectively “Samsung”), filed a Petition requesting an inter
`
partes review of claim 6 of U.S. Patent No. 7,650,015 B2 (Ex. 1001, “the
`
’015 patent”). Paper 2 (“Pet.”). Patent Owner, Image Processing
`
`Technologies LLC (“Image Processing”), filed a Preliminary Response.
`
Paper 7 (“Prelim. Resp.”).
`
`Under 35 U.S.C. § 314(a), an inter partes review may not be instituted
`
`unless the information presented in the Petition shows “there is a reasonable
`
`likelihood that the petitioner would prevail with respect to at least 1 of the
`
`claims challenged in the petition.” Taking into account the arguments
`
`presented in Image Processing’s Preliminary Response, we conclude that the
`
`information presented in the Petition establishes that there is a reasonable
`
`likelihood that Samsung would prevail in challenging claim 6 of the ’015
`
`patent as unpatentable under 35 U.S.C. § 103(a). Pursuant to § 314, we
`
`hereby institute an inter partes review as to this claim of the ’015 patent.
`
`A. Related Matters
`
The ’015 patent is involved in a district court case titled Image
`
Processing Techs. LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG
`
(E.D. Tex.). Pet. 1; Paper 5, 2. In addition to this Petition, Samsung filed
`
`other petitions challenging the patentability of certain subsets of claims in
`
the following patents owned by Image Processing: (1) U.S. Patent No.
`
`6,959,293 B2 (Case IPR2017-00336); (2) US. Patent No. 8,805,001 B2
`
`(Case IPR2017-00347); (3) US. Patent No. 8,983,134 B2 (Case IPR2017-
`
`
`
`
`00353); and (4) US. Patent No. 8,989,445 B2 (Case IPR2017-00357).
`
`Pet. 1; Paper 5, 2.
`
`B. The ’015 Patent
`
The ’015 patent, titled “Image Processing Method,” issued January
`
`19, 2010, from US. Patent Application No. 11/676,926, filed on February
`
`20, 2007. Ex. 1001, at [54], [45], [21], [22]. Based on our review of the
`
`prosecution file history, the ’015 patent has an extensive chain of priority
`
`that ultimately results in it claiming the benefit of Patent Cooperation Treaty
`
`(“PCT”) French Patent Application No. 97/01354, filed on July 22, 1997.
`
`Ex. 1004, 8.1
`
`The ’015 patent generally relates to an image processing apparatus
`
`and, in particular, to a method and apparatus for identifying and localizing
`
`an area in relative movement in a scene, and determining the speed and
`
`direction of that area in real-time. Ex. 1001, 1:17—21. The ’015 patent
`
`discloses a number of known systems and methods for identifying and
`
`localizing an object in relative movement, but explains that each of those
`
`systems/methods are inadequate for various reasons (e.g., memory intensive,
`
`limited in terms of the information obtained about an object, did not provide
`
`information in real-time, used complex algorithms for computing object
`
`information, designed to detect only one type of object, etc.). See id. at
`
`1:23—2:63. The ’015 patent purportedly solves these problems by providing
`
`a method and apparatus for detecting the relative movement and
`
`1 All references to the page numbers in the prosecution file history refer to
the page numbers inserted by Samsung in the bottom, right-hand corner of
`each page in Exhibit 1004.
`
`
`
`
`non-movement of an area within an image. Id. at 8:65—67. According to the
`
’015 patent, relative movement is any movement of an area, which may be
`
an object (e.g., a person, a portion of a person, or any animate or inanimate
`
`object), in a motionless environment or, alternatively, in an environment that
`
`is at least partially in movement. Id. at 8:67—9:5.
`
`Figure 11 of the ’015 patent, reproduced below, illustrates a block
`
`diagram showing the interrelationship between various histogram formation
`
units that make up a histogram processor. Ex. 1001, 8:35—36.
`
`,
`
`As shown in Figure 11 reproduced above, histogram processor 22(a) (not
`
`labeled) includes bus 23 that transmits signals between various components,
`
`including histogram formation and processing blocks 24—29. Id. at 16:53—
`
`59. The function of each histogram formation and processing block 24-29 is
`
`to form a histogram for the domain associated with that particular block. Id.
`
`at 16:59—61.
`
`
`
`
According to the ’015 patent, each histogram formation and
`
`processing block 24—29 operates in the same manner. Ex. 1001, 17:41—43.
`
`As one example, Figure 13 of the ’015 patent, reproduced below, illustrates
`
`a block diagram of histogram formation and processing block 25. Id. at
`
`8:39—40.
`
[FIG. 13: block diagram of histogram formation and processing block 25, showing histogram formation portion 25a and a validation signal V2]
`
`As shown in Figure 13 reproduced above, histogram formation and
`
processing block 25 includes histogram forming portion 25a, which forms
`the histogram for the block, and classifier 25b, which selects the criteria of
`
`pixels for which the histogram is to be formed. Id. at 17:46—49. Histogram
`
`forming portion 25a and classifier 25b operate under the control of computer
`
software in integrated circuit 25c (not shown in Figure 13), which extracts
`
`certain limits of the histogram generated by the histogram formation block.
`
`Id. at 17:50—53. Classifier 25b includes register 106 that enables the
`
`classification criteria to be set by a user or, alternatively, by a separate
`
`computer program. Id. at 18:16—19.
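For illustration only, the following minimal sketch (not drawn from the ’015 patent or the record) shows the kind of operation described above: a classifier register gates which pixel values are counted, and a histogram is formed for the block’s domain on each frame. The class ranges and helper names are assumptions.

```python
# Illustrative sketch of a histogram formation and processing block (not drawn from
# the '015 patent or the record): a classifier register gates which pixel values are
# counted, and a histogram is formed for the block's domain on each frame.

from collections import Counter

class HistogramFormationBlock:
    def __init__(self, domain_fn, selected_classes):
        self.domain_fn = domain_fn              # maps a pixel to a value in this block's domain
        self.register = set(selected_classes)   # classification criteria (cf. register 106)
        self.histogram = Counter()

    def process_frame(self, pixels):
        self.histogram.clear()                  # formed anew for each frame
        for px in pixels:
            value = self.domain_fn(px)
            if value in self.register:          # classifier: count only pixels in selected classes
                self.histogram[value] += 1
        return self.histogram

    def limits(self):
        # extract simple limits of the formed histogram (lowest/highest populated class)
        keys = sorted(self.histogram)
        return (keys[0], keys[-1]) if keys else None

# Hypothetical luminance-domain block that counts only pixels in classes 100-150.
block = HistogramFormationBlock(domain_fn=lambda px: px, selected_classes=range(100, 151))
block.process_frame([90, 120, 130, 130, 200])
print(block.histogram, block.limits())
```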
`
`
`
`
`Figure 15 of the ’015 patent, reproduced below, illustrates how the
`
`image processing system may be used for video-conferencing. Ex. 1001,
`
`8:45—46.
`
`
`
`140.15
`
As shown in Figure 15 reproduced above, video camera 13 observes subject P, who may be in movement. Id. at 22:6—7. Video signal S is transmitted by
`
wire, optical fiber, radio relay, or other communication means from camera
`
13 to both monitor 10b and image processing system 11. Id. at 22:7—11.
`
`Image processing system 11 determines the position and movement of
`
`subject P, as well as controls servo motors 43 of camera 13 to direct the
`
`optical axis of the camera towards the subject, particularly the subject’s face.
`
`Id. at 22:11—14. Image processing system 11 also may vary the zoom, focal
`
`distance, and focus of camera 13 to provide the best framing and image of
`
`the subject. Id. at 22:16—17.
`
`C. Challenged Claim
`
`Independent claim 6 is directed to a process of tracking a target in an
`
`input signal implemented using a system, and is reproduced below:
`
6. A process of tracking a target in an input signal implemented using a system comprising an image processing system, the input signal comprising a succession of frames, each frame comprising a succession of pixels, the target comprising pixels in one or more of a plurality of classes in one or more of a plurality of domains, the process performed by said system comprising, on a frame-by-frame basis:

forming at least one histogram of the pixels in the one or more of a plurality of classes in the one or more of a plurality of domains, said at least one histogram referring to classes defining said target;

identifying the target from said at least one histogram;

drawing a tracking box around the target; and

centering the tracking box relative to an optical axis of the frame.

Ex. 1001, 27:9—22 (paragraph indentations added).
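For orientation, the four recited steps can be lined up with a short illustrative sketch (not part of the record); the intensity domain, the class range, and all names below are assumptions.

```python
# Illustrative per-frame walk-through of the four recited steps; the intensity
# domain, the class range, and all names below are assumptions, not the patent's.

import numpy as np

TARGET_CLASSES = list(range(200, 256))   # assumed intensity classes defining the target

def track_frame(frame: np.ndarray):
    # (1) form at least one histogram of the pixels in classes of a domain (here, intensity)
    hist, _ = np.histogram(frame, bins=256, range=(0, 256))

    # (2) identify the target from the histogram: pixels falling in the classes
    #     that define the target
    mask = np.isin(frame, TARGET_CLASSES)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return hist, None, None

    # (3) draw a tracking box around the target (bounding box of the identified pixels)
    box = (xs.min(), ys.min(), xs.max(), ys.max())

    # (4) center the tracking box relative to the optical axis of the frame:
    #     compute the offset between the box center and the frame center
    box_center = ((box[0] + box[2]) / 2, (box[1] + box[3]) / 2)
    frame_center = (frame.shape[1] / 2, frame.shape[0] / 2)
    offset = (box_center[0] - frame_center[0], box_center[1] - frame_center[1])
    return hist, box, offset

hist, box, offset = track_frame(np.random.randint(0, 256, (480, 640)))
```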
`
`D. Prior Art References Relied Upon
`
`Samsung relies upon the prior art references set forth in the table
below:
`
Reference2            Patent No.     Relevant Dates                                  Exhibit No.
Hashima               5,521,843      issued May 28, 1996; PCT filed Jan. 29, 1993    1006
Ueno                  5,150,432      issued Sept. 22, 1992; filed Mar. 22, 1991      1007
`2 For clarity and ease of reference, we only list the first named inventor.
`
`7
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
Non-Patent Literature                                                                Exhibit No.

Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2, No. 1
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
47—56 (1980) (“Gilbert”)                                                             1005

W.B. Schaming, Adaptive gate multifeature Bayesian statistical tracker,
359 APPLICATIONS OF DIGITAL IMAGE PROCESSING IV 68—76 (1982)
(“Schaming”)                                                                         1008
`
`E. Asserted Grounds of Unpatentability
`
`Samsung challenges claim 6 of the ’015 patent based on the asserted
`
`grounds of unpatentability (“grounds”) set forth in the table below. Pet. 3,
`
`39—79.
`
References                    Basis         Claim Challenged
Gilbert and Schaming          § 103(a)      6
Gilbert and Ueno              § 103(a)      6
Hashima and Schaming          § 103(a)      6
`
`II. ANALYSIS
`
`A. Claim Construction
`
As an initial matter, we determine the proper standard of construction
`
`to apply. The term of a patent grant begins on the date on which the patent
`
`issues and ends twenty (20) years from the date on which the application for
`
`the patent was filed in the United States, “or, if the application contains a
`
`specific reference to an earlier filed application or applications under
`
section 120, 121, 365(c), or 386(c), from the date on which the earliest such
`
`application was filed.” 35 U.S.C. § 154(a)(2) (2012 & Supp. III 2015). In
`
`its Petition, Samsung asserts that the ’015 patent will expire on December 2,
`
`
`
`
`2017. Pet. 3. Image Processing does not dispute Samsung’s assertion in this
`
`regard. See generally Prelim. Resp. 7—8.
`
`We note that the file history of the ’015 patent includes a
`
`“Supplemental Application Data Sheet” that clarifies the earliest patent
`
`application referenced for the benefit of priority under 35 U.S.C. § 365(c) of
`
`this patent was filed on July 22, 1997. Ex. 1004, 8. We also note the title
`
`page of the ’015 patent indicates that the term of this patent has been
`
`extended or adjusted under 35 U.S.C. § 154(b) by one hundred thirty four
`
`(134) days. Ex. 1001, at [*]. After adding 20 years to the date of July 22,
`
`1997, plus the 134 day term extension identified above, we agree with the
`
parties that the term of the ’015 patent expires at or around December 2,
`
`2017 .
`
`On this record, because we conclude that the term of the ’015 patent
`
`will expire within eighteen (18) months from the entry of the Notice of
`
`Filing Date Accorded to the Petition, which, in this case is December
`
`14, 2016 (Paper 4), we construe the claims of the ’015 patent under the
`
`standard applicable to expired patents. For claims of an expired patent, our
`
`claim interpretation is similar to that of a district court. See In re Rambus
`
`Inc., 694 F.3d 42, 46 (Fed. Cir. 2012). “In determining the meaning of the
`
`disputed claim limitation, we look principally to the intrinsic evidence of
`record, examining the claim language itself, the written description, and the
`
`prosecution history, if in evidence.” DePuy Spine, Inc. v. Medtronic
`
`Sofamor Danek, Inc., 469 F.3d 1005, 1014 (Fed. Cir. 2006) (citing Phillips
`
v. AWH Corp., 415 F.3d 1303, 1312—17 (Fed. Cir. 2005) (en banc)). There
`
`is, however, a “heavy presumption” that a claim term carries its ordinary and
`
`
`
`
customary meaning. CCS Fitness, Inc. v. Brunswick Corp., 288 F.3d 1359,
`
`1366 (Fed. Cir. 2002).
`
`In its Petition, Samsung does not propose constructions for any claim
`
`terms recited in the challenged claim of the ’015 patent, but rather contends
`
`that each claim term should be accorded its ordinary and customary
`
`meaning. Pet. 3—4. In response, Image Processing proposes constructions
`
for the following four claim terms: (1) “domain”; (2) “class”; (3) “forming
`
`at least one histogram of the pixels in the one or more of a plurality of
`
`classes in the one or more of a plurality of domains”; and (4) “said at least
`
one histogram referring to classes defining said target.” Prelim. Resp. 8-19.
`
We determine that, for purposes of this Decision, the only claim term
`
`requiring construction is “forming at least one histogram of the pixels in the
`
`one or more of a plurality of classes in the one or more of a plurality of
`
`domains,” and we construe that term only to the extent necessary to resolve
`
the issues discussed below. See, e.g., Vivid Techs., Inc. v. Am. Sci. & Eng’g,
`
`Inc., 200 F.3d 795, 803 (Fed. Cir. 1999) (explaining that only those claim
`
`terms that are in controversy need to be construed, and only to the extent
`
`necessary to resolve the controversy).
`
`Image Processing contends that “forming at least one histogram of the
`
`pixels in the one or more of a plurality of classes in the one or more of a
`
`plurality of domains” should be construed as “forming at least one histogram
`
`of the pixels in two or more classes that are in two or more domains.”
`
`Prelim. Resp. 12. In essence, Image Processing argues that “one or more of
`
`a plurality” requires at least one plurality (i.e., two or more). Id. at 12—13
`
`(citing Ex. 2002). Image Processing further argues that a construction that
`
`requires “at least one class selected from multiple classes, and at least one
`
`10
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`domain selected from multiple domains, would render the term ‘plurality’
`
`superfluous, so that the claim limitation would be reduced to ‘one or more
`
`classes in one or more domains.’” Id. at 13.
`
`On the current record, we are not persuaded by Image Processing’s
`
`argument. “[C]laims are interpreted with an eye toward giving effect to all
`
terms in the claim.” Bicon, Inc. v. Straumann Co., 441 F.3d 945, 950 (Fed.
`
`Cir. 2006). Here, Image Processing’s proposed construction would render
`
`“one or more” in the claim superfluous (i.e., under Image Processing’s
`
`construction, the claim could simply read “a plurality of classes that are in a
`
`plurality of domains”). On the other hand, interpreting the phrase “forming
`
`at least one histogram of the pixels in the one or more of a plurality of
`
`classes in the one or more of a plurality of domains” to encompass at least
`
`one class from among a plurality of possible classes and at least one domain
`
`from among a plurality of possible domains gives effect to all the terms of
`
`the claim.
`
`Accordingly, for purposes of this Decision, we determine that
`
`“forming at least one histogram of the pixels in the one or more of a plurality
`
`of classes in the one or more of a plurality of domains” is not limited to
`
`“forming at least one histogram of the pixels in two or more classes that are
`
`in two or more domains.”
`
B. Obviousness Over the Combined Teachings of Gilbert and Schaming
`
`Samsung contends that claim 6 of the ’015 patent is unpatentable
`
`under § 103(a) over the combined teachings of Gilbert and Schaming.
`
`Pet. 39—55. Samsung explains how this proffered combination teaches or
`
`suggests the subject matter of this challenged claim, and provides reasoning
`
`11
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`as to why one of ordinary skill in the art would have been prompted to
`
`modify or combine the references’ respective teachings. Id. Samsung also
`
`relies upon the Declaration of Dr. John C. Hart to support its positions. Ex.
`
`1002 111] 88—118. At this stage of the proceeding, we are persuaded by
`
`Samsung’s explanations and supporting evidence.
`
`We begin our analysis with the principles of law that generally apply
`
`to a ground based on obviousness, followed by brief overviews of Gilbert
`
`and Schaming, and then we address the parties’ contentions with respect to
`
`independent claim 6.
`
`1. Principles ofLaw
`
A claim is unpatentable under § 103(a) if the differences between the
`
`claimed subject matter and the prior art are such that the subject matter, as a
`
`whole, would have been obvious at the time the invention was made to a
`
`person having ordinary skill in the art to which said subject matter pertains.
`
KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007). The question of
`
`obviousness is resolved on the basis of underlying factual determinations,
`
`including (1) the scope and content of the prior art; (2) any differences
`
`between the claimed subject matter and the prior art; (3) the level of skill in
`
`the art;3 and (4) when in evidence, objective indicia of non-obviousness
`
`3 Relying upon the testimony of Dr. Hart, Samsung offers an assessment as
to the level of skill in the art. Pet. 4 (citing Ex. 1002 ¶¶ 44—48). Image
`Processing offers a similar assessment of the level of skill in the art. Prelim.
`Resp. 7. To the extent necessary, we accept the assessment offered by
`Samsung as it is consistent with the ’015 patent and the asserted prior art,
`but note that our conclusions would be the same under Image Processing’s
`assessment.
`
`12
`
`
`
`1PR2017-00355
`
`Patent 7,650,015 B2
`
(i.e., secondary considerations). Graham v. John Deere Co., 383 U.S. 1, 17—
`
`18 (1966). We analyze this asserted ground based on obviousness with the
`
`principles identified above in mind.
`
`2. Gilbert Overview
`
`Gilbert, titled “A Real-Time Video Tracking System,” is dated
`
January 1980. Ex. 1005, 47.4 Gilbert relates to an object identification and
`
`tracking system, which includes an image processing system that includes a
`
video processor, a projection processor, a tracker processor, and a control
`
`processor. Id. at 47—48. Gilbert’s video processor receives a digitized video
`
`signal in which each field consists of pixels. Id. at 48. Gilbert discloses that
`
`“[e]very 96 ns, a pixel intensity is digitized and quantized into eight bits
(256 gray levels), counted into one of six 256-level histogram memories, and
`
`then converted by a decision memory to a 2-bit code indicating its
`
`classification (target, plume, or background).” Id. Gilbert’s projection
`
`processor then uses pixels identified as being part of the target to create
`
x- and y-projections. Id. at 50. Figure 4 of Gilbert, reproduced below,
`
`illustrates a projection location technique.
`
`4 All references to the page numbers in Gilbert are to the original page
`numbers located at the top of each page in Exhibit 1005, rather than the page
`numbers inserted by Samsung at the bottom of each page.
`
`13
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`
`
Fig. 4. Projection location technique.
`
`Figure 4 of Gilbert, reproduced above, illustrates Y-projections and
`
`X-projections of the target. Gilbert’s system uses these projections to
`
`determine the center of the upper and lower portions of the target, and those
`
`points are then used to determine the center of the target (Xc, Yc). Id. at 50—
`
`51.
`
`3. Schaming Overview
`
`Schaming, titled “Adaptive gate multifeature Bayesian statistical
`
tracker,” is dated 1982. Ex. 1008, 68.5 Schaming describes a statistically-
`
`based tracking algorithm that uses a powerful segmentation algorithm. Id.
`
`The tracking algorithm is based on the use of multi-feature joint probability
`
`density functions for the statistical separation of targets from their
`
`backgrounds. Id. These features, which include intensity, edge magnitude,
`
`5 All references to the page numbers in Schaming refer to the original page
numbers located at the bottom, left-hand corner or bottom, right-hand corner
`of each page in Exhibit 1008, rather than the page numbers inserted by
`Samsung in the bottom, right-hand corner of each page.
`
`14
`
`
`
`[PR2017-00355
`
`Patent 7,650,015 B2
`
`and spatial frequency, are combined to form a joint probability distribution,
`
which characterizes a target region and its immediate surroundings. Id.; see
`
`also id. at 70—71 (describing the computation of features, such as intensity,
`
`edge magnitude, and spatial frequency).
`
`Figure 2 of Schaming, reproduced below, illustrates an example as to
`
`how histograms are used to separate a target from its background. Ex. 1008,
`
`70.
`
[Figure 2 of Schaming: target window and background window histograms; vertical axis: number of pixels in each intensity group]
`
`As shown in Figure 2 reproduced above, each bin in the histogram is
`
`examined to determine if the intensity value falling within that bin is more
`
likely to be either the target or background. Id. Although this particular
`
`example focuses on a single feature (i.e., intensity), Schaming indicates that
`
the same process is used for multiple features in an N-dimensional histogram
`
`representing a joint probability density. Id.
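The bin-by-bin comparison shown in Figure 2 can be paraphrased in a few illustrative lines (not Schaming’s implementation); the normalization and tie-breaking below are assumptions.

```python
# Illustrative paraphrase of the bin-by-bin decision (not Schaming's code): a bin
# is labeled "target" when the target-window histogram makes that intensity more
# likely than the background-window histogram does; normalization is assumed.

def label_bins(target_hist, background_hist):
    total_t = sum(target_hist) or 1
    total_b = sum(background_hist) or 1
    labels = []
    for t, b in zip(target_hist, background_hist):
        p_target = t / total_t          # estimated P(bin | target window)
        p_background = b / total_b      # estimated P(bin | background window)
        labels.append("target" if p_target > p_background else "background")
    return labels

# The same comparison extends to an N-dimensional histogram over several features
# (e.g., intensity, edge magnitude, spatial frequency) treated as a joint distribution.
print(label_bins([0, 2, 8, 1], [5, 3, 1, 0]))
```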
`
`4. Claim 6
`
`In its Petition, Samsung contends that the combined teachings of
`
`Gilbert and Schaming account for all the limitations recited in independent
`
`claim 6. Pet. 45—54. Beginning with the preamble of independent claim 6,6
`
`Samsung contends that Gilbert teaches each of the elements in the preamble
`
`6 At this stage of the proceeding, we need not decide whether the claim
`preamble is limiting. Vivid Techs., 200 F.3d at 803.
`
`15
`
`
`
`IPR2017-003 5 5
`
`Patent 7,650,015 B2
`
`because its tracking system uses a video signal (i.e., input signal) that
`includes digitized fields (i.e., frames) with a frame rate of 60 fields per
`
`second (i.e., a succession of frames), where each field further includes an n
`
x m matrix of digitized points (i.e., a succession of pixels). Id. at 45 (citing
`
Ex. 1005, 48; Ex. 1002 ¶ 99). Samsung argues that Gilbert tracks the image
`
`of the target (i.e., a missile) by categorizing the pixels into one of three
`
`classes—namely, background, plume, and target—based on gray-scale
`
intensity levels (i.e., a domain) of each pixel. Id. (citing Ex. 1005, 48; Ex.
`
`1002 1] 100). Samsung further argues that Gilbert performs its target
`
tracking process “during each field” (i.e., on a frame-by-frame basis). Id.
`
(citing Ex. 1005, 48; Ex. 1002 ¶ 101). Samsung also contends that
`
`Schaming teaches each of the elements in the preamble because it discloses
`
`a process of tracking a target from a video signal (i.e., an input signal
`
`comprising a succession of frames, each frame further comprising a
`
succession of pixels). Id. at 45—46 (citing Ex. 1008, 68, 69; Ex. 1002 ¶ 99).
`
`Samsung argues that Schaming discloses identifying the image of the target
`
`using intensity histograms created from the target window and background
`
window. Id. at 46 (citing Ex. 1008, 70—71, Fig. 2; Ex. 1002 ¶ 100).
`
`Samsung further argues that its statistical searching of pixels occurs in every
`
`frame (i.e., on a frame-by-frame basis). Id. (citing Ex. 1008, 69; Ex. 1002
`
`1] 101).
`
`With respect to the “forming” step recited in independent claim 6,
`
`Samsung contends that Gilbert teaches this step because its tracking system
`
`forms histograms using the intensity domain, where each “class” within the
`
`intensity domain comprises the pixels meeting certain intensity criteria (i.e.,
`
one of 256 gray-scale levels). Pet. 46—47 (citing Ex. 1005, 48; Ex. 1002
`
`16
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`111] 102, 103). Alternatively, Samsung contends that Gilbert teaches this step
`
because its projection processor forms X- and Y-projection histograms using
`
binary pictures generated by the video processor. Id. at 48 (citing Ex. 1005,
`
Fig. 4; Ex. 1002 ¶ 108). Samsung also contends that Schaming teaches the
`
`“forming” step because it discloses forming separate intensity histograms
`
from the target window and background window. Id. (citing Ex. 1008, 69—
`
70, Fig. 2; Ex. 1002 ¶ 105). In particular, Samsung argues that, although
`
`Figure 2 of Schaming illustrates an example using a single feature (i.e.,
`
intensity) as the domain of the histogram, multiple features in an N-
`
dimensional histogram, such as intensity, edge magnitude, and spatial
`
`frequency, can be used to identify the target. Id. at 47 (citing Ex. 1008, 70—
`
72, Fig. 2; Ex. 1002 ¶ 105).
With respect to the “identifying” step recited in independent claim 6,
`
`Samsung contends that Gilbert teaches this step because it discloses using
`
`probability estimates based on the 256 level gray-scale histograms to
`
`determine whether a particular pixel belongs to the target, plume, or
`
background region. Pet. 49 (citing Ex. 1005, 48—50; Ex. 1002 ¶ 110).
`According to Samsung, this result from Gilbert is further used by the
`
projection and tracking processors to locate and track the target. Id.
`
`Samsung also contends that Schaming teaches the “identifying” step because
`
`it discloses examining each bin in the histogram (i.e., each class within the
`
`intensity domain) to determine whether the intensity value falling within the
`
`bin is more likely to be the target or background. Id. (citing Ex. 1008, 70,
`
Fig. 2; Ex. 1002 ¶ 111).
`
`With respect to the “drawing” step recited in independent claim 6,
`
`Samsung contends that Gilbert teaches this step because both Figures 2 and
`
`17
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`7 illustrate drawing a tracking box around the target. Pet. 50—51 (citing
`
Ex. 1005, 48, 55, Figs. 2, 7; Ex. 1002 ¶¶ 112, 113). Samsung also contends
`
`that Schaming teaches the “drawing” step because both Figures 5 and 7
`
`illustrate drawing a tracking box around the target. Id. at 51—52 (citing
`
Ex. 1008, 74—76, Figs. 5, 7; Ex. 1002 ¶ 115).
`
`With respect to the “centering” step recited in independent claim 6,
`
`Samsung contends that Gilbert teaches this step because its tracking system
`
`calculates the center of the target by finding the midpoint of the two center-
`
of-area points of the upper and lower halves of the target, uses the
`
`differences between the center of the target and the center of the displayed
`
`image to compute the “boresight correction signal,” and then uses this signal
`
`to control the azimuth and elevation pointing angles of the telescope to
`
`effectively orientate the tracking optics toward the target. Pet. 52—53 (citing
`
Ex. 1005, 52, 54; Ex. 1002 ¶ 116). Samsung also contends that Schaming
`
`teaches the “centering” step because its tracking system can control
`
`movement of the camera by using an error signal to point the camera to the
`
`target’s location, such that the target remains centered in the image frame.
`
Id. at 53 (citing Ex. 1008, 75; Ex. 1002 ¶ 117). According to Samsung, with
`
`a few exceptional cases (e.g., the beginning of the target search), Schaming
`
`discloses that the tracking window position generally will remain in the
`
`center of the image frame. Id.
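The centering contentions summarized above reduce to an error-signal computation of the following general form (an illustrative sketch only, not the petition’s or the references’ implementation); the proportional gain and the names are hypothetical.

```python
# Illustrative sketch of the centering step as characterized above (not the
# references' implementation): the offset between the target center and the frame
# center is turned into a pointing correction; the proportional gain is hypothetical.

def boresight_correction(target_center, frame_size, gain=0.1):
    xc, yc = target_center
    frame_cx, frame_cy = frame_size[0] / 2, frame_size[1] / 2
    error_x, error_y = xc - frame_cx, yc - frame_cy   # pixel error from the optical axis
    # hypothetical proportional commands for azimuth and elevation pointing angles
    return gain * error_x, gain * error_y

azimuth_cmd, elevation_cmd = boresight_correction((350.0, 220.0), (640, 480))
```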
`
`Turning to Samsung’s rationale to combine the teachings of Gilbert
`
`and Schaming, Samsung relies upon the testimony of Dr. Hart to explain
`
`why one of ordinary skill in the art would have had sufficient reason to
`
combine the references’ respective teachings. Pet. 39—44; Ex. 1002 ¶¶ 89—
`
`96. For instance, apart from the exemplary rationales articulated in KSR,
`
`18
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`Samsung contends that one of ordinary skill in the art would have
`
`recognized that, because Gilbert teaches that many features or domains may
`
`be used to form histograms used to identify a target, a person of ordinary
`
`skill in the art reading Gilbert would have looked to employ histograms
`
`using other domains, such as those taught by Schaming. Pet. 41—42 (Ex.
`
`1005, 48, 50—51; Ex. 1002 1] 92). That is, Samsung argues it would have
`
`been obvious to use Schaming’s multiple domains, such as edge magnitude
`
and spatial frequency, in Gilbert’s tracking system to form histograms used
`
to identify a target. See id. at 42 (citing Ex. 1008, 71; Ex. 1002 ¶ 93). In
`
`addition, Samsung argues that one of ordinary skill in the art, upon reading
`
`Gilbert, would have recognized that plotting histograms in other domains
would result in certain advantages over the disclosure in Gilbert. Id. at 44
`
(citing Ex. 1002 ¶ 96). According to Samsung, by having the capability of
`
`plotting histograms in additional domains, Gilbert’s tracking system would
`
`have a higher likelihood of successfully recognizing a target in an image
`
`frame. Id. Consequently, Samsung asserts that a person of ordinary skill in
`
`the art would have had a sufficient reason to look to a reference, such as
`
`Schaming, that uses histograms across multiple domains simultaneously. Id.
`
(citing Ex. 1008, 69—71; Ex. 1002 ¶ 95).
`
`In its Preliminary Response, Image Processing presents a number of
`
`arguments that can be grouped as follows: (1) whether Samsung has
`
`demonstrated that Gilbert and Schaming, alone or in combination, account
`
`for all the limitations recited in independent claim 6; and (2) whether
`
`Samsung has demonstrated that a person of ordinary skill in the art would
`
have combined the teachings of Gilbert and Schaming. Prelim. Resp. 24—31,
`
`38—43. We address these groupings of arguments in turn.
`
`19
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
a. Limitations
`
`Image Processing argues that Gilbert does not teach “forming at least
`
`one histogram of the pixels in the one or more of a plurality of classes in the
`
`one or more of a plurality of domains, said at least one histogram referring to
`
`classes defining said target,” and “identifying the target from said at least
`
`one histogram,” as recited in independent claim 6. Prelim. Resp. 24—31.
`
`Image Processing argues that neither Gilbert’s intensity histograms nor its
`
`projection histograms teach these claim limitations. Id. at 25—29. With
`
`respect to Gilbert’s intensity histograms, Image Processing argues they fail
`
`. to teach these limitations because (1) they are in a single domain (i.e.,
`
`intensity); (2) they are not formed of pixels in two or more classes; and (3)
`
`they do not define the target. Id. at 25—28. Image Processing also argues
`that, because Gilbert mentions other parameters (i.e., domains), but does not
`
`disclose using more than a single parameter (i.e., intensity) in combination
`
`with those other parameters, a person of ordinary skill in the art would have
`
been led away “from forming histogram(s) of the pixels in two or more
`
`selected subsets of parameter values that are in two or more domains, where
`
`the histogram(s) refer to selected subsets of parameter values that define the
`
`target.” Id. at 27. With respect to Gilbert’s projection histograms, Image
`
`Processing argues they fail to teach the disputed limitations identified above
`
because (1) they are formed after the target already has been identified; and
`
`(2) the classes in those histograms do not define the target. Id. at 28—29.
`
`Image Processing further argues that Schaming does not teach
`
`“forming at least one histogram of the pixels in the one or more of a plurality
`
`of classes in the one or more of a plurality of domains, said at least one
`
`histogram referring to classes defining said target,” as recited in independent
`
`20
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`claim 6. Prelim. Resp. 29—30. In particular, Image Processing argues that
`
Schaming’s N-dimensional histograms fail to teach this particular limitation
`
`because (1) they are formed using all the pixel data within certain areas of
`
`the image; and (2) they do not refer to classes that define the target. Id.
`
`On the current record, we are not persuaded by Image Processing’s
`
`arguments with respect to the teachings of Gilbert and Schaming. As
`
`discussed above in our claim construction section, we declined to adopt
`
`Image Processing’s proposed construction that would require a plurality of
`
`domains. See supra Section II.A. We also find Samsung has shown
`
`sufficiently that the asserted references teach a plurality of classes. For
`
example, Samsung relies on Gilbert’s 256 gray-scale levels in its intensity
`
domain as teaching a plurality of classes. See Pet. 47; Ex. 1002 ¶¶ 102, 103.
`
We also see no requirement in independent claim 6 that the classes,
`
`collectively, must comprise less than all of the values in the domain, as
`Image Processing appears to suggest. See Prelim. Resp. 27 (arguing
`
`“Gilbert’s intensity histograms are not formed of pixels in two or more
`
selected subsets (classes) of a parameter, in this case, intensity”); id. at 29
`
`(arguing Schaming’s N-dimensional histograms “are formed using all pixel
`
`data within certain areas of the image”). In particular, independent claim 6,
`which uses the open-ended transition “comprising,” recites that the target
`
`comprises pixels in one or more of a plurality of classes and that the
`
`histogram is formed from those pixels, but it does not preclude the histogram
`
`from also including other pixels outside the target (e.g., pixels of the
`
`background).
`
`21
`
`
`
`IPR2017-00355
`
`Patent 7,650,015 B2
`
`We are not persuaded by Image Processing’s argument that a target is
`
`not identified from Gilbert’s projection histograms because the target was
`
`already previously identified in Gilbert’s intensity histogram. See Prelim.
`
`Resp. 28—29. Identifying