Trials@uspto.gov
571-272-7822

Paper No. 16
Entered: May 25, 2017
UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________

SAMSUNG ELECTRONICS CO., LTD. and
SAMSUNG ELECTRONICS AMERICA, INC.,
Petitioner,

v.

IMAGE PROCESSING TECHNOLOGIES LLC,
Patent Owner.
____________

Case IPR2017-00357
Patent 8,989,445 B2
____________

Before JONI Y. CHANG, MICHAEL R. ZECHER, and
JESSICA C. KAISER, Administrative Patent Judges.

ZECHER, Administrative Patent Judge.

DECISION
Granting Institution of Inter Partes Review
35 U.S.C. § 314(a) and 37 C.F.R. § 42.108
IPR2017-00357
Patent 8,989,445 B2
I. INTRODUCTION

Petitioner, Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Samsung”), filed a Petition requesting an inter partes review of claims 1, 4, 6, 9, 18, 24, 25, and 27 of U.S. Patent No. 8,989,445 B2 (Ex. 1001, “the ’445 patent”). Paper 2 (“Pet.”). Patent Owner, Image Processing Technologies LLC (“Image Processing”), filed a Preliminary Response. Paper 9 (“Prelim. Resp.”).

Under 35 U.S.C. § 314(a), an inter partes review may not be instituted unless the information presented in the Petition shows “there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” Taking into account the arguments presented in Image Processing’s Preliminary Response, we conclude that the information presented in the Petition establishes that there is a reasonable likelihood that Samsung would prevail in challenging claims 1, 4, 6, 9, 18, 24, 25, and 27 of the ’445 patent as unpatentable under 35 U.S.C. § 103(a). Pursuant to § 314, we hereby institute an inter partes review as to these claims of the ’445 patent.
A. Related Matters

The ’445 patent is involved in a district court case titled Imaging Processing Techs. LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG (E.D. Tex.). Pet. 1; Paper 7, 2. In addition to this Petition, Samsung filed other petitions challenging the patentability of certain subsets of claims in the following patents owned by Image Processing: (1) U.S. Patent No. 6,959,293 B2 (Case IPR2017-00336); (2) U.S. Patent No. 8,805,001 B2 (Case IPR2017-00347); (3) U.S. Patent No. 8,983,134 B2 (Case IPR2017-00353); and (4) U.S. Patent No. 7,650,015 B2 (Case IPR2017-00355). Pet. 1; Paper 7, 2.
B. The ’445 Patent

The ’445 patent, titled “Image Processing Apparatus and Method,” issued March 24, 2015, from U.S. Patent Application No. 14/449,809, filed on August 13, 2014. Ex. 1001, at [54], [45], [21], [22]. The ’445 patent has an extensive chain of priority that ultimately results in it claiming the benefit of Patent Cooperation Treaty (“PCT”) French Patent Application No. 97/01354, filed on July 22, 1997. Id. at [60].

The ’445 patent generally relates to an image processing apparatus and, in particular, to a method and apparatus for identifying and localizing an area in relative movement in a scene, and determining the speed and direction of that area in real-time. Ex. 1001, 1:38–40. The ’445 patent discloses a number of known systems and methods for identifying and localizing an object in relative movement, but explains that each of those systems/methods is inadequate for various reasons (e.g., memory intensive, limited in terms of the information obtained about an object, did not provide information in real-time, used complex algorithms for computing object information, designed to detect only one type of object, etc.). See id. at 1:44–3:17. The ’445 patent purportedly solves these problems by providing a method and apparatus for detecting the relative movement and non-movement of an area within an image. Id. at 9:17–19. According to the ’445 patent, relative movement is any movement of an area, which may be an object (e.g., a person, a portion of a person, or any animal or inanimate object), in a motionless environment or, alternatively, in an environment that is at least partially in movement. Id. at 9:19–24.
Figure 11 of the ’445 patent, reproduced below, illustrates a block diagram showing the interrelationship between various histogram formation units that make up a histogram processor. Ex. 1001, 8:54–55.

[Figure 11 of the ’445 patent]

As shown in Figure 11 reproduced above, histogram processor 22(a) (not labeled) includes bus 23 that transmits signals between various components, including histogram formation and processing blocks 24–29. Id. at 16:57–63. The function of each histogram formation and processing block 24–29 is to form a histogram for the domain associated with that particular block. Id. at 16:63–65.
According to the ’445 patent, each histogram formation and processing block 24–29 operates in the same manner. Ex. 1001, 17:47–50. As one example, Figure 13 of the ’445 patent, reproduced below, illustrates a block diagram of histogram formation and processing block 25. Id. at 8:58–59.

[Figure 13 of the ’445 patent]
As shown in Figure 13 reproduced above, histogram formation and processing block 25 includes histogram forming portion 25a, which forms the histogram for the block, and classifier 25b, which selects the criteria of pixels for which the histogram is to be formed. Id. at 17:50–53. Histogram forming portion 25a and classifier 25b operate under the control of computer software in integrated circuit 25c (not shown in Figure 13), which extracts certain limits of the histogram generated by the histogram formation block. Id. at 17:54–57. Classifier 25b includes register 106 that enables the classification criteria to be set by a user or, alternatively, by a separate computer program. Id. at 18:20–23.
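For illustration only, the interplay between forming portion 25a and classifier 25b described above can be sketched in Python (hypothetical code, not code from the ’445 patent or the record; `form_histogram` and the criterion test are invented names): only pixels satisfying the user-set classification criteria are counted into the histogram.

```python
# Hypothetical sketch of a histogram formation block gated by a classifier
# (illustrative only; not code from the '445 patent or the record).

def form_histogram(pixel_values, criterion):
    """Count only pixels that satisfy the classifier's criterion,
    mirroring classifier 25b gating forming portion 25a."""
    histogram = [0] * 256  # one bin per possible 8-bit pixel value
    for value in pixel_values:
        if criterion(value):       # classifier: user-settable test
            histogram[value] += 1  # forming portion: tally the pixel
    return histogram

# Example: form a histogram over mid-range luminance values only.
pixels = [10, 120, 130, 130, 250]
hist = form_histogram(pixels, lambda v: 100 <= v <= 200)
```

The criterion plays the role of register 106: changing it re-targets the same forming logic to a different pixel class.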
C. Illustrative Claim

Of the challenged claims, claims 1 and 24 are independent. Independent claim 1 is directed to a process of tracking a target in an image processing system, whereas independent claim 24 is directed to just an image processing system. Claims 4, 6, 9, and 18 directly or indirectly depend from independent claim 1, and claims 25 and 27 directly depend from independent claim 24. Independent claim 1 is illustrative of the challenged claims and is reproduced below:

1. A process of tracking a target in an image processing system comprising:
   receiving an input signal including a plurality of frames, each frame including a plurality of pixels;
   generating a histogram based on classification values of a plurality of pixels in a first frame of the input signal;
   identifying a target from the histogram generated based on the first frame;
   determining a target location based on the histogram generated based on the first frame;
   generating a histogram based on classification values of a plurality of pixels in a second frame of the input signal subsequent to the first frame; and
   adjusting the target location based on the histogram generated based on the second frame.

Ex. 1001, 26:38–52.
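For illustration only, the sequence of steps recited in claim 1 can be paraphrased as a short sketch (hypothetical code, not an implementation from the record; the histogram and location logic are toy stand-ins for whatever a real system would use):

```python
# Hypothetical sketch of the flow recited in claim 1: histogram per frame,
# target identification/location, then adjustment from the next frame.

def histogram(frame):
    """Generate a histogram of classification values for one frame."""
    h = {}
    for v in frame:
        h[v] = h.get(v, 0) + 1
    return h

def locate(hist):
    """Toy stand-in: take the most frequent classification value as
    a proxy for deriving a target location from the histogram."""
    return max(hist, key=hist.get)

frames = [[1, 2, 2, 3], [1, 3, 3, 2]]  # two toy "frames" of pixel values
h1 = histogram(frames[0])              # generate histogram (first frame)
target_location = locate(h1)           # identify target / determine location
h2 = histogram(frames[1])              # generate histogram (second frame)
target_location = locate(h2)           # adjust the target location
```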
D. Prior Art References Relied Upon

Samsung relies upon the prior art references set forth in the table below:

Inventor¹   U.S. Patent No.   Relevant Dates                                 Exhibit No.
Hashima     5,521,843         issued May 28, 1996; PCT filed Jan. 29, 1993   1006
Brady       5,761,326         issued June 2, 1998; filed Apr. 27, 1995       1007

¹ For clarity and ease of reference, we only list the first named inventor.

Non-Patent Literature                                                        Exhibit No.
Alton L. Gilbert et al., A Real-Time Video Tracking System, IEEE             1005
  TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE,
  vol. PAMI-2, No. 1, 47–56 (1980) (“Gilbert”)
O.D. Altan et al., Computer Architecture and Implementation of               1008
  Vision-Based Real-Time Lane Sensing, PROCEEDINGS OF THE
  INTELLIGENT VEHICLES ’92 SYMPOSIUM 202–06 (1992) (“Altan”)
E. Asserted Grounds of Unpatentability

Samsung challenges claims 1, 4, 6, 9, 18, 24, 25, and 27 of the ’445 patent based on the asserted grounds of unpatentability (“grounds”) set forth in the table below. Pet. 3, 37–83.

References                    Basis      Challenged Claim(s)
Gilbert and Brady             § 103(a)   1, 4, 6, 9, 18, 24, 25, and 27
Gilbert, Brady, and Altan     § 103(a)   18
Hashima and Gilbert           § 103(a)   1, 4, 6, 9, 24, 25, and 27
Hashima, Gilbert, and Altan   § 103(a)   18
Hashima and Brady             § 103(a)   1, 4, 6, 9, 18, 24, 25, and 27
II. ANALYSIS

A. Claim Construction

As an initial matter, we determine the proper standard of construction to apply. The term of a patent grant begins on the date on which the patent issues and ends twenty (20) years from the date on which the application for the patent was filed in the United States, “or, if the application contains a specific reference to an earlier filed application or applications under section 120, 121, 365(c), or 386(c), from the date on which the earliest such application was filed.” 35 U.S.C. § 154(a)(2) (2012 & Supp. III 2015). Samsung suggests that the earliest patent application referenced for the benefit of priority under 35 U.S.C. § 365(c) for the ’445 patent was filed on July 22, 1997, and that the patent has no term extensions. See Pet. 3 (stating that “[t]he ’445 Patent will expire on July 22, 2017”). Image Processing does not dispute Samsung’s assertion in this regard. See generally Prelim. Resp. 7. The term of the ’445 patent, therefore, expires no later than July 22, 2017.

On this record, because we agree with the parties that the term of the ’445 patent will expire within eighteen (18) months from the entry of the Notice of Filing Date Accorded to the Petition, which, in this case, is December 2, 2016 (Paper 3), we construe the claims of the ’445 patent under the standard applicable to expired patents. For claims of an expired patent, our claim interpretation is similar to that of a district court. See In re Rambus Inc., 694 F.3d 42, 46 (Fed. Cir. 2012). “In determining the meaning of the disputed claim limitation, we look principally to the intrinsic evidence of record, examining the claim language itself, the written description, and the prosecution history, if in evidence.” DePuy Spine, Inc. v. Medtronic Sofamor Danek, Inc., 469 F.3d 1005, 1014 (Fed. Cir. 2006) (citing Phillips v. AWH Corp., 415 F.3d 1303, 1312–17 (Fed. Cir. 2005) (en banc)). There is, however, a “heavy presumption” that a claim term carries its ordinary and customary meaning. CCS Fitness, Inc. v. Brunswick Corp., 288 F.3d 1359, 1366 (Fed. Cir. 2002).

The parties do not propose constructions for any claim terms recited in the challenged claims of the ’445 patent. See generally Pet. 3–4; Prelim. Resp. 7. Because there is no dispute between the parties regarding claim construction, we need not construe explicitly any claim term of the ’445 patent at this time. See, e.g., Vivid Techs., Inc. v. Am. Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999) (explaining that only those claim terms or phrases that are in controversy need to be construed, and only to the extent necessary to resolve the controversy).
B. Obviousness Over the Combined Teachings of Gilbert and Brady

Samsung contends that claims 1, 4, 6, 9, 18, 24, 25, and 27 of the ’445 patent are unpatentable under § 103(a) over the combined teachings of Gilbert and Brady. Pet. 37–59. Samsung explains how this proffered combination teaches or suggests the subject matter of each challenged claim, and provides reasoning as to why one of ordinary skill in the art would have been prompted to modify or combine their respective teachings. Id. Samsung also relies upon the Declaration of Dr. John C. Hart to support its positions. Ex. 1002 ¶¶ 89–119, 122–133. At this stage of the proceeding, we are persuaded by Samsung’s explanations and supporting evidence.

We begin our analysis with the principles of law that generally apply to a ground based on obviousness, followed by brief overviews of Gilbert and Brady, and then we address the parties’ contentions with respect to the challenged claims.

1. Principles of Law

A claim is unpatentable under § 103(a) if the differences between the claimed subject matter and the prior art are such that the subject matter, as a whole, would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007). The question of obviousness is resolved on the basis of underlying factual determinations, including (1) the scope and content of the prior art; (2) any differences between the claimed subject matter and the prior art; (3) the level of skill in the art;² and (4) when in evidence, objective indicia of non-obviousness (i.e., secondary considerations). Graham v. John Deere Co., 383 U.S. 1, 17–18 (1966). We analyze this asserted ground based on obviousness with the principles identified above in mind.
2. Gilbert Overview

Gilbert, titled “A Real-Time Video Tracking System,” is dated January 1980. Ex. 1005, 47.³ Gilbert relates to an object identification and tracking system, which includes an image processing system that includes a video processor, a projection processor, a tracker processor, and a control processor. Id. at 47–48. Gilbert’s video processor receives a digitized video signal in which each field consists of pixels. Id. at 48. Gilbert discloses that “[e]very 96 ns, a pixel intensity is digitized and quantized into eight bits (256 gray levels), counted into one of six 256-level histogram memories, and then converted by a decision memory to a 2-bit code indicating its classification (target, plume, or background).” Id. Gilbert’s projection processor then uses pixels identified as being part of the target to create x- and y-projections. Id. at 50. Figure 4 of Gilbert, reproduced below, illustrates a projection location technique.

[Figure 4 of Gilbert]

Figure 4 of Gilbert, reproduced above, illustrates Y-projections and X-projections of the target. Gilbert’s system uses these projections to determine the center of the upper and lower portions of the target, and those points are then used to determine the center of the target (XC, YC). Id. at 50–51.

_______________
² Relying upon the testimony of Dr. Hart, Samsung offers an assessment as to the level of skill in the art. Pet. 4 (citing Ex. 1002 ¶¶ 47–50). Image Processing offers a similar assessment of the level of skill in the art. Prelim. Resp. 6. To the extent necessary, we accept the assessment offered by Samsung, as it is consistent with the ’445 patent and the asserted prior art, but note that our conclusions would be the same under Image Processing’s assessment.
³ All references to the page numbers in Gilbert are to the original page numbers located at the top of each page in Exhibit 1005, rather than the page numbers inserted by Samsung at the bottom of each page.
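For illustration only, the projection technique shown in Gilbert’s Figure 4 can be sketched as follows (a hypothetical simplification, not Gilbert’s actual processor; the cross-shaped mask is invented): pixels classified as target are summed along columns and rows to form X- and Y-projections, whose weighted centers yield a target center (XC, YC).

```python
# Hypothetical sketch of projection-based target location in the spirit of
# Gilbert's Figure 4 (illustrative only; not Gilbert's implementation).

def center_from_projections(target_mask):
    """target_mask: 2-D list of 0/1 flags marking target-classified pixels.
    Returns (xc, yc), the centroid of the X- and Y-projections."""
    rows, cols = len(target_mask), len(target_mask[0])
    # X-projection: target-pixel count per column; Y-projection: per row.
    x_proj = [sum(target_mask[r][c] for r in range(rows)) for c in range(cols)]
    y_proj = [sum(target_mask[r][c] for c in range(cols)) for r in range(rows)]
    total = sum(x_proj)  # total number of target pixels
    xc = sum(c * n for c, n in enumerate(x_proj)) / total
    yc = sum(r * n for r, n in enumerate(y_proj)) / total
    return xc, yc

mask = [[0, 1, 0],
        [1, 1, 1],
        [0, 1, 0]]  # a small cross-shaped "target"
xc, yc = center_from_projections(mask)  # (1.0, 1.0) by symmetry
```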
3. Brady Overview

Brady, titled “Method and Apparatus for Machine Vision Classification and Tracking,” issued on June 2, 1998. Ex. 1007, at [54], [45]. As suggested by its title, Brady generally relates to systems used for traffic detection, monitoring, management, and vehicle classification and tracking. Id. at 1:12–14. In particular, the invention disclosed therein is directed to a method and apparatus for classifying and tracking objects in images (i.e., vehicles) provided by a real-time video feed. Id. at 1:14–16. According to Brady, the apparatus of the disclosed invention includes a plurality of video cameras situated over a plurality of roadways, thereby allowing the video cameras to film each site they cover in real-time. Id. at 3:61–63. The video cameras are interconnected electrically to a switcher, which allows for manual or automatic switching between each of the video cameras. Id. at 3:63–66. The video filmed by the video cameras is transmitted to a plurality of image processors that analyze the image from the video and create classification and tracking data. Id. at 3:66–4:1.

Brady discloses that a video image of a scene may be a 512x512 pixel three-color image having an integer number defining intensity, with a definition range for each color of 0–255. Ex. 1007, 5:40–43. Brady further discloses that image processing entails using a region selection module that defines potential regions of interest, or candidate regions, for classification. Id. at 7:35–36, 7:39–40. Each vehicle class is assigned a set of appropriately sized and shaped regions, which, in one embodiment, are trapezoidal in shape. Id. at 7:38–54. Image processing also uses an edgel definition module that evaluates each pixel of the image array output from the scene being evaluated for the magnitude of its edge element (“edgel”) intensity. Id. at 6:23–25. According to Brady, edgel intensity indicates the likelihood that a given pixel is located on some edge having a particular orientation and contrast. Id. at 6:25–27. The greater the contrast between a particular pixel and the pixels surrounding it in a particular orientation, the greater the edgel intensity. Id. at 6:27–30.
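For illustration only, the edgel-intensity idea described above, where greater contrast with surrounding pixels in a given orientation yields greater intensity, can be sketched with a simple directional-contrast measure (hypothetical; Brady’s actual edgel definition module is not reproduced in this record):

```python
# Hypothetical sketch of an edgel-intensity measure: contrast between the
# neighbors of a pixel along one orientation (illustrative only).

def edgel_intensity(image, r, c, orientation):
    """Absolute contrast across the pixel at (r, c).
    'horizontal' compares the left/right neighbors;
    'vertical' compares the neighbors above/below."""
    if orientation == "horizontal":
        return abs(image[r][c + 1] - image[r][c - 1])
    return abs(image[r + 1][c] - image[r - 1][c])

img = [[10, 10, 10],
       [10, 10, 10],
       [200, 200, 200]]  # strong horizontal edge at the bottom
strong = edgel_intensity(img, 1, 1, "vertical")    # large: crosses the edge
weak = edgel_intensity(img, 1, 1, "horizontal")    # zero: uniform row
```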
For each pixel’s edge intensity in a region of interest, Brady discloses applying tent functions. Ex. 1007, 9:5–6. For each tent function, a histogram is produced that records the frequency with which a range of angles occurs, as weighted by the intensity of the edgels, within that tent function. Id. at 9:6–9. Once all the histograms have been produced, they are strung together to form a vector, which, in turn, is output from a vectorization module. Id. at 9:25–28. The vector then is evaluated by a vehicle learning module, which classifies vehicles based on the vector and may generate a signal indicative of the classification of a vehicle in question. Id. at 9:50–57. An icon assignment module may assign a unique icon identifying the vehicle class of the vehicle in question, which, in turn, may be output to a tracking module to facilitate visual tracking of that vehicle. Id. at 9:65–66. This icon will move with the progression of the vehicle as the track progresses over time through a series of frames, until the vehicle is no longer identifiable in the scene. Id. at 10:7–9. In one embodiment, once the tracking module initiates a tracking sequence for the vehicle, the centroid of the tracked vehicle may be identified to allow centering of the tracking point over the vehicle. Id. at 10:63–66.
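For illustration only, the vectorization described above, in which edgel-weighted angle histograms are strung together, can be sketched as follows (hypothetical code; the bin edges, angles, and intensities are invented, and real tent functions would weight angles smoothly rather than by hard bins):

```python
# Hypothetical sketch of intensity-weighted angle histograms strung into a
# feature vector, in the spirit of Brady's vectorization (illustrative only).

def angle_histogram(edgels, bin_edges):
    """edgels: list of (angle_degrees, intensity) pairs.
    Returns intensity-weighted counts per angle bin."""
    bins = [0.0] * (len(bin_edges) - 1)
    for angle, intensity in edgels:
        for i in range(len(bins)):
            if bin_edges[i] <= angle < bin_edges[i + 1]:
                bins[i] += intensity  # weight the bin by edgel intensity
                break
    return bins

regions = [
    [(10, 1.0), (30, 2.0)],  # edgels from one region of interest
    [(80, 0.5)],             # edgels from another region
]
edges = [0, 45, 90]          # two angle bins per histogram
vector = []
for region in regions:       # string the histograms together
    vector.extend(angle_histogram(region, edges))
```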
In one embodiment, Brady discloses that the identification and tracking of vehicles may occur at the time when the immediately previous image is acquired (i.e., every frame). Ex. 1007, 12:15–17. In another embodiment, Brady discloses using intermittent tracking. Id. at 12:17–20. When employing intermittent tracking, the target location is adjusted only if the target has moved significantly. Id. at 14:50–53.
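For illustration only, intermittent tracking of the kind described above can be sketched as a thresholded update (hypothetical; the distance test and threshold value are invented, not taken from Brady):

```python
# Hypothetical sketch of intermittent tracking: keep the stored location
# unless the target has moved significantly (illustrative only).

def update_location(current, observed, threshold=5.0):
    """Return the observed location only if the target moved farther
    than the threshold; otherwise keep the current location."""
    dx = observed[0] - current[0]
    dy = observed[1] - current[1]
    moved = (dx * dx + dy * dy) ** 0.5
    return observed if moved > threshold else current

loc = (100.0, 100.0)
loc = update_location(loc, (101.0, 100.0))  # small drift: location kept
loc = update_location(loc, (120.0, 100.0))  # significant move: updated
```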
4. Claims 1 and 24

In its Petition, Samsung contends that the combined teachings of Gilbert and Brady account for all the steps recited in independent claim 1. Pet. 40–46. Beginning with the “receiving” step recited in independent claim 1, Samsung contends that Gilbert teaches this step because its tracking system uses a video signal (i.e., an input signal) that includes digitized fields (i.e., frames) with a frame rate of 60 fields per second (i.e., a succession of frames), where each field further includes an n x m matrix of digitized points (i.e., a succession of pixels). Id. at 41 (citing Ex. 1005, 48; Ex. 1002 ¶ 97). Samsung also argues that Brady teaches the “receiving” step because it discloses using a digitized real-time video signal that includes a plurality of image frames, each of which includes a plurality of pixels. Id. (citing Ex. 1007, 5:38–45; Ex. 1002 ¶ 97).

With respect to the first “generating” step recited in independent claim 1, Samsung contends that Gilbert teaches this step because its video processor generates intensity histograms for each of the target, plume, and background regions (i.e., a plurality of pixels) in a first frame of the video signal based on the 256-level grayscale value (i.e., classification value) for each pixel. Pet. 41–42 (citing Ex. 1005, 48, 49; Ex. 1002 ¶ 98). Samsung also argues that Brady teaches the first “generating” step because it discloses generating histograms of pixels in the region of interest in the frame being evaluated (i.e., a first frame) based on the pixels’ edgel intensity, as transformed by a tent function. Id. at 42 (citing Ex. 1007, 8:60–9:9; Ex. 1002 ¶ 98).

With respect to the “identifying” step recited in independent claim 1, Samsung contends that Gilbert teaches this step because it discloses identifying a target (i.e., a missile in flight) based on the previously generated intensity histogram for the pixels in a first frame of the video signal. Pet. 42 (citing Ex. 1005, 48; Ex. 1002 ¶¶ 97, 99). Samsung also argues that Brady teaches the “identifying” step because it discloses identifying the target vehicles based on the analysis of the previously generated histogram of pixels in the region of interest in the frame being evaluated. Id. at 42–43 (citing Ex. 1007, 9:50–10:23; Ex. 1002 ¶¶ 97, 99). In particular, Samsung argues that Brady creates a vector output by stringing together nine histograms covering a region of interest, and then evaluates the vector to determine whether it represents one of a predetermined class of vehicles. Id. at 42 (citing Ex. 1007, 9:25–27, 9:50–62; Ex. 1002 ¶ 99).
With respect to the “determining” step recited in independent claim 1, Samsung contends that Gilbert teaches this step because it discloses determining a target location based on the X- and Y-projection histograms created after the pixels corresponding to the target are identified. Pet. 43–44 (citing Ex. 1005, 50, Fig. 4; Ex. 1002 ¶ 100). Samsung asserts that Gilbert’s X- and Y-projection histograms are generated based on the first frame. Id. at 44 (citing Ex. 1002 ¶ 100). Samsung also contends that Brady teaches the “determining” step because it discloses determining the target location based on the nine histograms representing the edgel intensities as transformed by the tent functions. Id. at 44 (citing Ex. 1007, 10:63–11:13; Ex. 1002 ¶ 101).

With respect to the second “generating” step recited in independent claim 1, Samsung contends that Gilbert teaches this step because it discloses generating histograms on a frame-by-frame basis. Pet. 44 (citing Ex. 1005, 49). In other words, Samsung argues that after Gilbert generates an intensity histogram based on one frame (i.e., the first frame), Gilbert repeats the same process for the next frame (i.e., the second frame). Id. (citing Ex. 1005, 48–49; Ex. 1002 ¶ 102). Samsung also argues that Brady teaches the second “generating” step because it discloses generating histograms on a frame-by-frame basis. Id. at 45. Samsung argues that Brady does not identify and track vehicles in every frame, but rather discloses “intermittent tracking,” in which the tracking system skips certain frames, particularly those frames where the target has not moved significantly. Id. (citing Ex. 1007, 12:15–20; Ex. 1002 ¶ 103). According to Samsung, in Brady’s “intermittent tracking” mode, the second frame would not be one of the frames skipped by the process, but rather would be the frame immediately following the last processed frame. Id.
With respect to the “adjusting” step recited in independent claim 1, Samsung contends that Gilbert teaches this step because it discloses calculating the target location on a frame-by-frame basis. Pet. 45 (citing Ex. 1005, 50; Ex. 1002 ¶ 104). According to Samsung, Gilbert’s method of calculating the target location allows one to re-calculate and adjust the center point of the target based on the updated projection histograms based on the second frame. Id. at 45–46 (citing Ex. 1005, 48, 49, 52; Ex. 1002 ¶ 104). Samsung also argues that Brady teaches the “adjusting” step because it discloses calculating the target location on a frame-by-frame basis. Id. at 46 (citing Ex. 1007, 10:7–10; Ex. 1002 ¶ 104). Similar to Gilbert, Samsung argues that Brady’s method of calculating the target location allows one to re-calculate and adjust the center point of the target based on the updated projection histograms based on the second frame. Id. (citing Ex. 1007, 10:63–67; Ex. 1002 ¶ 104).

Samsung also contends that the combined teachings of Gilbert and Brady account for all the limitations recited in independent claim 24. Pet. 54–57. In particular, with the exception of the “camera” and “processing system” features recited in independent claim 24, Samsung relies upon the same explanation and supporting evidence discussed above with respect to independent claim 1 to account for all the limitations recited in independent claim 24. Compare id. at 54–57, with id. at 40–46. Samsung turns to Gilbert’s television camera and image processor to teach the claimed “camera” and “processing system,” respectively. Id. at 54–55 (citing Ex. 1005, 47, 48, Fig. 1; Ex. 1002 ¶ 123). Samsung also argues that Brady’s video camera, and its tracking system that receives real-time video images from the video camera, teach the claimed “camera” and “processing system,” respectively. Id. at 55–56 (citing Ex. 1007, 5:35–44; Ex. 1002 ¶¶ 124, 125).
Turning to Samsung’s rationale for combining the teachings of Gilbert and Brady, Samsung relies upon the testimony of Dr. Hart to explain why one of ordinary skill in the art would have had a sufficient reason to combine their respective teachings. Pet. 37–40; Ex. 1002 ¶¶ 90–95. For instance, apart from the exemplary rationales articulated in KSR, Samsung contends that one of ordinary skill in the art would have recognized that applying Brady’s “intermittent processing” to Gilbert’s tracking system would be beneficial because it would reduce the computational burden imposed on the system and improve efficiency, especially in situations where the object being tracked is slow moving or not moving (e.g., a missile before launch). Id. at 40 (citing Ex. 1007, 12:15–20, 14:44–67; Ex. 1002 ¶ 95). In addition, Samsung argues that one of ordinary skill in the art would have had a sufficient reason to apply Brady’s simultaneous, independent tracking of multiple targets, as well as its ability to classify multiple targets by type in real-time, to Gilbert’s tracking system. Id. According to Samsung, combining the teachings of Gilbert and Brady in this particular manner would benefit Gilbert’s tracking system by allowing it to simultaneously identify, distinguish, and track (1) multiple missiles; or (2) two dissimilar objects, such as a missile and an aircraft. Id. (citing Ex. 1007, 3:49–60, 9:50–52; Ex. 1002 ¶ 95).

In its Preliminary Response, Image Processing presents two arguments, both of which focus on whether Samsung has demonstrated that a person of ordinary skill in the art would have combined the teachings of Gilbert and Brady. Prelim. Resp. 12–14. We address each argument in turn.
First, Image Processing contends that Samsung has not demonstrated that a person of ordinary skill in the art would have combined the teachings of Gilbert and Brady because these references are directed to different objectives. Id. at 12. According to Image Processing, Gilbert discloses tracking a single target, or possibly two targets in separate tracking windows, as well as using a moveable optical mount to follow a target. Id. (citing Ex. 1005, 47, 48). In contrast, Image Processing argues that Brady discloses monitoring vehicle traffic by switching between a plurality of stationary cameras situated over a plurality of roadways, and automatically tracking all vehicles in an image, which Brady criticizes the prior art for being unable to do. Id. at 12–13 (citing Ex. 1007, 2:23–27, 11:14–19).

We are not persuaded by Image Processing’s argument that a person of ordinary skill in the art would not have had a sufficient reason to combine the teachings of Gilbert and Brady because, purportedly, these references are directed to different objectives. See Prelim. Resp. 12–13. It is well settled that the mere fact that two references have different objectives does not preclude a person of ordinary skill in the art from combining their respective teachings. In re Heck, 699 F.2d 1331, 1333 (Fed. Cir. 1983) (“The use of patents as references is not limited to what the patentees describe as their own inventions or to the problems with which they are concerned.” (quoting In re Lemelson, 397 F.2d 1006, 1009 (CCPA 1968))); see also EWP Corp. v. Reliance Universal Inc., 755 F.2d 898, 907 (Fed. Cir. 1985) (“A reference must be considered for everything that it teaches by way of technology and is not limited to the particular invention it is describing and attempting to protect.”).
Second, Image Processing contends that Samsung has not demonstrated that a person of ordinary skill in the art would have combined the teachings of Gilbert and Brady because these references operate in different ways that are incompatible. Prelim. Resp. 13. For instance, Image Processing argues that Gilbert’s objective is to identify and track objects in complex and changing backgrounds; however, Gilbert does not seek to identify different types (i.e., classes) of targets. Id. (citing Ex. 1005, Abstract). In contrast, Image Processing argues that Brady’s tracking is performed against a static background (i.e., a roadway), and the tracking algorithms are tailored specifically to roadway backgrounds by, for example, assuming that regions of interest are sized according to vehicle class (e.g., car-sized or truck-sized trapezoids). Id. at 13–14 (citing Ex. 1007, 7:46–54). According to Image Processing, defining candidate regions of appropriate size would not be possible in Gilbert’s tracking system because there are no static points of reference, such as lane boundaries, and no known or predetermined vehicle shapes. Id. at 14. Image Processing further asserts that, without an appropriate candidate region, edgel values and fuzzy set theory would have limited utility, if any, in Gilbert’s tracking system. Id. Image Processing also asserts that Brady’s disclosure of dynamically generating candidate regions by “prior calibration of the scene” would not be possible in Gilbert’s complex and changing background. Id. (citing Ex. 1007, 7:42–45).
At this stage of the proceeding, we are not persuaded by Image Processing’s argument that a person of ordinary skill in the art would not have had a sufficient reason to combine the teachings of Gilbert and Brady because, purportedly, these references operate in different ways that are incompatible. See Prelim. Resp. 13–14. Apart from mere attorney argument that includes directing us to disparate portions of Gilbert and Brady, the record before us does not include sufficient or credible evidence that Gilbert’s tracking system would become inoperable if modified to include the teachings of Brady, particularly its (1) “intermittent processing”; (2) simultaneous, independent tracking of multiple targets; and (3) ability to classify multiple targets by type in real-time. Cf. In re Geisler, 116 F.3d 1465, 1470 (Fed. Cir. 1997) (explaining that attorney arguments and conclusory statements that are unsupported by factual evidence are entitled to little probative value). Instead, on the current record, we are persuaded that Samsung has presented sufficient evidence that
