`
`____________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`____________________
`
`SAMSUNG ELECTRONICS CO., LTD.; AND
`SAMSUNG ELECTRONICS AMERICA, INC.
`Petitioner
`
`v.
`
`IMAGE PROCESSING TECHNOLOGIES, LLC
`Patent Owner
`
`____________________
`
`Patent No. 8,989,445
`____________________
`
PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 8,989,445
`
`
`
`
`
`Petition for Inter Partes Review
`Patent No. 8,989,445
`
TABLE OF CONTENTS

I. INTRODUCTION ........................................................................................... 1

II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8 ................................... 1

III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a) .................................... 2

IV. GROUNDS FOR STANDING ........................................................................ 2

V. PRECISE RELIEF REQUESTED .................................................................. 3

VI. LEGAL STANDARDS ................................................................................... 3

A. Claim Construction ............................................................................... 3

B. Level Of Ordinary Skill In The Art ....................................................... 4

VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND ’445 PATENT .......................................................................................................... 4

VIII. DETAILED EXPLANATION OF GROUNDS ............................................ 12

A. Overview Of The Prior Art References .............................................. 12

1. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005) ............................................................... 12

2. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006) ................. 19

3. U.S. Patent No. 5,761,326 (“Brady”) (Ex. 1007) ..................... 29

4. O. D. Altan et al., “Computer Architecture And Implementation Of Vision-Based Real-Time Lane Sensing,” Proceedings Of The Intelligent Vehicles ’92 Symposium 202 (1992) (“Altan”) (Ex. 1008) .......................... 36

IX. SPECIFIC EXPLANATION OF GROUNDS FOR INVALIDITY ............. 37

A. Ground 1: Gilbert In View Of Brady Renders Obvious The Challenged Claims .............................................................................. 37

1. Reasons To Combine Gilbert And Brady ................................. 37

2. Claim 1 ...................................................................................... 40

3. Claim 4: “The process of claim 1, further comprising displaying an outline associated with the target at a display location based on the target location” .......................... 46

4. Claim 6: “The process of claim 4, wherein displaying the outline includes moving a center point of the outline” ............. 48

5. Claim 9 ...................................................................................... 49

6. Claim 18: “The process of claim 1, wherein adjusting the target location includes not adjusting the target location during input of a third frame located in the input signal between the first and second frames” ....................................... 52

7. Claim 24 .................................................................................... 54

8. Claim 25: “The image processing system of claim 24, further comprising a display, and wherein the processing system is further configured to display an outline associated with the target at a display location based on the target location” .................................................................... 57

9. Claim 27: “The image processing system of claim 24, wherein the processing system is further configured to move a center point of the outline based on the histogram based on the first frame and the histogram based on the second based on the second frame” .......................................... 57

10. Gilbert And Brady Are Not Cumulative ................................... 58

B. Ground 2: Hashima In View Of Gilbert Renders Obvious The Challenged Claims .............................................................................. 59

1. Reasons To Combine Hashima And Gilbert ............................. 59

2. Claim 1 ...................................................................................... 62

3. Claim 4: “The process of claim 1, further comprising displaying an outline associated with the target at a display location based on the target location” .......................... 67

4. Claim 6: “The process of claim 4, wherein displaying the outline includes moving a center point of the outline” ............. 69

5. Claim 9 ...................................................................................... 70

6. Claim 18: “The process of claim 1, wherein adjusting the target location includes not adjusting the target location during input of a third frame located in the input signal between the first and second frames” ....................................... 72

7. Claim 24 .................................................................................... 73

8. Claim 25: “The image processing system of claim 24, further comprising a display, and wherein the processing system is further configured to display an outline associated with the target at a display location based on the target location” .................................................................... 76

9. Claim 27: “The image processing system of claim 24, wherein the processing system is further configured to move a center point of the outline based on the histogram based on the first frame and the histogram based on the second based on the second frame” .......................................... 76

10. Hashima and Gilbert Are Not Cumulative ............................... 77

C. Ground 3: Hashima In View Of Brady Renders Obvious The Challenged Claims .............................................................................. 78

1. Reasons To Combine Hashima And Brady .............................. 78

2. Hashima In View Of Brady Renders Challenged Claims Obvious ..................................................................................... 81

3. Hashima and Brady Are Not Cumulative ................................. 81

X. CONCLUSION .............................................................................................. 83

Certification of Word Count .................................................................................... 84

The undersigned certifies pursuant to 37 C.F.R. § 42.6(e) and § 42.105 that on November 30, 2016, a true and correct copy of Petitioner’s Petition for Inter Partes Review of U.S. Patent No. 8,989,445 was served via express mail on the Patent Owner at the following correspondence address of record: ..................................................................................... 1
`
`
`
`
`
LIST OF EXHIBITS1

1001 U.S. Patent No. 8,989,445 (“the ’445 Patent”)

1002 Declaration of Dr. John C. Hart

1003 Curriculum Vitae for Dr. John C. Hart

1004 Prosecution File History of U.S. Patent No. 8,989,445

1005 Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”)

1006 U.S. Patent No. 5,521,843 (“Hashima”)

1007 U.S. Patent No. 5,761,326 (“Brady”)

1008 O. D. Altan et al., “Computer Architecture And Implementation Of Vision-Based Real-Time Lane Sensing,” Proceedings Of The Intelligent Vehicles ’92 Symposium 202 (1992) (“Altan”)

1009 Ø. D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition-A Survey,” Pattern Recognition, vol. 29, no. 4, 1996, pp. 641-662

1010 M. H. Glauberman, “Character recognition for business machines,” Electronics, vol. 29, pp. 132-136, Feb. 1956

1011 Declaration of Gerard P. Grenier (authenticating Exs. 1005 and 1008)

1 Citations to non-patent publications are to the original page numbers of the publication, and citations to U.S. patents are to column:line number of the patents.
`
`
`
`Petition for Inter Partes Review
`Patent No. 8,989,445
`
`I.
`
`INTRODUCTION
`
`Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc.
`
`(collectively, “Petitioner”) request inter partes review (“IPR”) of Claims 1, 4, 6, 9,
`
`18, 24, 25, and 27 of U.S. Patent No. 8,989,445 (“the ’445 Patent”) (Ex. 1001),
`
`which, on its face, is assigned to Image Processing Technologies, LLC (“Patent
`
`Owner”). This Petition presents several non-cumulative grounds of invalidity that
`
`the U.S. Patent and Trademark Office (“PTO”) did not consider during
`
`prosecution. These grounds are each likely to prevail, and this Petition,
`
`accordingly, should be granted on all grounds and the challenged claims should be
`
`cancelled.
`
`II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8
`Real Parties-in-Interest: Petitioner identifies the following real parties-in-
`
`interest: Samsung Electronics Co., Ltd.; Samsung Electronics America, Inc.
`
`Related Matters: Patent Owner has asserted the ’445 Patent against
`
`Petitioner in Image Processing Technologies LLC v. Samsung Elecs. Co., No.
`
`2:16-cv-00505-JRG (E.D. Tex.). Patent Owner has also asserted U.S. Patent Nos.
`
`6,959,293; 7,650,015; 8,805,001; and 8,983,134 in the related action. Petitioner is
`
`concurrently filing IPR petitions for all of these asserted patents.
`
`Lead and Back-Up Counsel:
`
`• Lead Counsel: John Kappos (Reg. No. 37,861), O’Melveny & Myers
`
`LLP, 610 Newport Center Drive, 17th Floor, Newport Beach,
`
`California 92660. (Telephone: 949-823-6900; Fax: 949-823-6994;
`
`Email: jkappos@omm.com)
`
`• Backup Counsel: Nicholas J. Whilt (Reg. No. 72,081), Brian M. Cook
`
`(Reg. No. 59,356), O’Melveny & Myers LLP, 400 S. Hope Street, Los
`
`Angeles, CA 90071. (Telephone: 213-430-6000; Fax: 213-430-6407;
`
`Email: nwhilt@omm.com, bcook@omm.com)
`
`Service Information: Samsung consents to electronic service by email to
`
`IPTSAMSUNGOMM@OMM.COM. Please address all postal and hand-delivery
`
`correspondence to lead counsel at O’Melveny & Myers LLP, 610 Newport Center
`
`Drive, 17th Floor, Newport Beach, California 92660, with courtesy copies to the
`
`email address identified above.
`
`III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)
The Office is authorized to charge $23,000 to
`
`Deposit Account No. 50-0639 for the fee set forth in 37 C.F.R § 42.15(a), and any
`
`additional fees that might be due in connection with this Petition.
`
`IV. GROUNDS FOR STANDING
`Petitioner certifies that the ’445 Patent is available for IPR and Petitioner is
`
`not barred or estopped from requesting IPR on the grounds identified herein.
`
`
`V.
`
`PRECISE RELIEF REQUESTED
`
`Petitioner respectfully requests review of Claims 1, 4, 6, 9, 18, 24, 25, and
`
`27 (the “Challenged Claims”) of the ’445 Patent, and cancellation of these claims,
`
`based on the grounds listed below:
`
`• Ground 1: Claims 1, 4, 6, 9, 18, 24, 25, and 27 are obvious under 35
`
`U.S.C. § 103(a) over Gilbert in view of Brady, and Claim 18 is further
`
`obvious under 35 U.S.C. § 103(a) over Gilbert in view of Brady and
`
`further in view of Altan;
`
`• Ground 2: Claims 1, 4, 6, 9, 24, 25, and 27 are obvious under 35
`
`U.S.C. § 103(a) over Hashima in view of Gilbert, and Claim 18 is
`
`further obvious under 35 U.S.C. § 103(a) over Hashima in view of
`
`Gilbert and further in view of Altan; and
`
`• Ground 3: Claims 1, 4, 6, 9, 18, 24, 25, and 27 are obvious under 35
`
`U.S.C. § 103(a) over Hashima in view of Brady.
`
`VI. LEGAL STANDARDS
`A. Claim Construction
`The ’445 Patent will expire on July 22, 2017—within 18 months of the
`
`Notice of Filing Date. Thus, for purposes of this proceeding, Petitioner has
`
`interpreted each claim term according to its plain and ordinary meaning under
`
Phillips v. AWH Corp., 415 F.3d 1303, 1316 (Fed. Cir. 2005). For purposes of
`
invalidity raised in this proceeding, Petitioner does not believe any term needs an
`
`explicit construction.
`
B. Level Of Ordinary Skill In The Art

One of ordinary skill in the art at the time of the alleged invention of the ’445
`
`Patent would have had either (1) a Master’s Degree in Electrical Engineering or
`
`Computer Science or the equivalent plus at least a year of experience in the field of
`
`image processing, image recognition, machine vision, or a related field or (2) a
`
`Bachelor’s Degree in Electrical Engineering or Computer Science or the equivalent
`
`plus at least three years of experience in the field of image processing, image
`
`recognition, machine vision, or a related field. Additional education could
`
`substitute for work experience and vice versa. Ex. 1002, ¶¶47-50.
`
`VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND ’445
`PATENT
`
`The purported invention of the ’445 Patent relates to identifying and tracking
`
`a target in an input signal using one or more histograms derived from an image
`
`frame in the video signal. See Ex. 1001, Claim 1; Ex. 1002, ¶32. Video image
`
`processing and the use of histograms to identify and track targets, and to derive
`
`other information from a video signal were well known at the time the asserted
`
`patents were filed. Ex. 1002, ¶¶23-31, 62, 76, 86. An input signal used in the
`
`purported invention has “a succession of frames, each frame having a succession of
`
pixels.” Ex. 1001, 3:34-37. The input signal may be a video signal or any other signal
`
`that “generates an output in the form of an array of information corresponding to
`
`information observed by the imaging device,” such as “ultrasound, IR, Radar,
`
`tactile array, etc.” Ex. 1001, 9:29-34; Ex. 1002, ¶33. The ’445 Patent then
`
`constructs a histogram “showing the frequency of pixels meeting a certain
`
characteristic.” The characteristics used to form histograms are referred to as
`
`“domains” in the ’445 Patent. Ex. 1002, ¶34. The ’445 Patent teaches that “the
`
`domains are preferably selected from the group consisting of i) luminance, ii)
`
`speed (V), iii) oriented direction (DI), iv) time constant (CO), v) hue, vi)
`
`saturation, and vii) first axis (x(m)), and viii) second axis (y(m)).” Ex. 1001, 4:9-
`
`13; Ex. 1002, ¶34. Figure 11 shows histogram processors that can create
`
`histograms in various domains:
`
`
`The histograms include a plurality of “classes” within a given domain. Ex.
`
`1002, ¶35. Figure 14a (and its accompanying description) illustrates an example of
`
`“classes” within a domain:
`
`
`
`
`
`
FIG. 14a shows an example of the successive classes C1, C2…Cn−1, Cn, each representing a particular velocity, for a hypothetical velocity histogram, with their being categorization for up to 16 velocities (15 are shown) in this example. Also shown is envelope 38, which is a smoothed representation of the histogram.
`
`Ex. 1001, 20:51-56; Ex. 1002, ¶¶35-37. The ’445 Patent then uses the
`
`histograms to identify a target in the input signal. For example, one embodiment
`
`of the ’445 Patent performs “automatic framing of a person…during a video
`
`conference.” Ex. 1001, 22:6-8 and Figure 15:
`
`
`The system constructs histograms in the X and Y domains counting the
`
`number of pixels where the differences in luminance between successive frames
`
`are above certain threshold values:
`
`
`
`The pixels with greatest movement within the image will
`
`normally occur at the peripheral edges of the head of the
`
`subject, where even due to slight movements, the pixels
`
`will vary between the luminance of the head of the
`
`subject and the luminance of the background. Thus, if
`
`the system of the invention is set to identify only pixels
`
`with DP=1, and to form a histogram of these pixels, the
`
`histogram will detect movement peaks along the edges of
`
`the face where variations in brightness, and therefore in
`
`pixel value, are the greatest, both in the horizontal
`
`projection along Ox and in the vertical projection along
`
`Oy.
`
Ex. 1001, 22:47-57 and 10:35-63 (DP is set to “1” when the pixel value of the pixel
`
`under consideration has “undergone significant variation as compared to…the
`
`same pixel in the prior frame”); Ex. 1002, ¶38. Figures 16 and 17 show camera
`
`setup and the histogram constructed using this method:
`
`Ex. 1001, Fig. 16
`
`
`
`Ex. 1001, Fig. 17
`
`In addition, the system may also be used to automatically track a target by “a
`
`
`
`spotlight or a camera. Using a spotlight the invention might be used on a
`
`helicopter to track a moving target on the ground, or to track a performer on a stage
`
`during an exhibition. The invention would similarly be applicable to weapons
`
`targeting systems.” Ex. 1001, 23:38-43; Ex. 1002, ¶39. In such applications, the
`
`system uses X- and Y-minima and maxima of the histograms in X- and Y-domains
`
`to determine the center of the target:
`
`
`In a preferred embodiment, the new center of the area is
`
`determined to be (XMIN+XMAX)/2, (YMIN+YMAX)/2, where
`
`XMIN and XMAX are the positions of the minima and
`
`maxima of the x projection histogram, and YMIN and
`
`YMAX are the positions of the minima and maxima of the
`
`y projection histogram.
`
`Id. at 24:48-53. The patent defines “the positions of the minima” of a projection
`
`histogram to be the smallest x (and y) coordinate of any pixel in the image region
`
whose validation signal is “1.” Similarly, the maximum is the largest x (and y)
`
`coordinate of any pixel in the image region whose validation signal is “1.” Ex.
`
`1002, ¶40.
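The center-of-area computation described above can be sketched in short form. The following illustrative Python is not part of the record; the function names and sample validation map are invented for illustration. It forms x- and y-projection histograms from a binary validation map, locates the histogram minima and maxima as defined above, and takes the midpoints per the formula (XMIN+XMAX)/2, (YMIN+YMAX)/2:

```python
# Illustrative sketch (not from the record): computing a target center from
# x- and y-projection histograms of a binary validation map, per the
# '445 Patent's (XMIN+XMAX)/2, (YMIN+YMAX)/2 formula.

def projection_histograms(mask):
    """mask: list of rows, each a list of 0/1 validation signals."""
    height, width = len(mask), len(mask[0])
    # x-projection: count of validated pixels in each column
    hist_x = [sum(mask[y][x] for y in range(height)) for x in range(width)]
    # y-projection: count of validated pixels in each row
    hist_y = [sum(mask[y][x] for x in range(width)) for y in range(height)]
    return hist_x, hist_y

def target_center(mask):
    hist_x, hist_y = projection_histograms(mask)
    # positions of the minima/maxima: smallest and largest coordinate
    # whose projection bin is nonzero
    xs = [x for x, count in enumerate(hist_x) if count > 0]
    ys = [y for y, count in enumerate(hist_y) if count > 0]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    return (x_min + x_max) / 2, (y_min + y_max) / 2

mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(target_center(mask))  # (2.0, 1.5)
```

As the target moves between frames, recomputing this midpoint on each new frame's histograms yields the adjusted center used to steer the camera or spotlight.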
`
`Once the center of the target is determined, the center is used to adjust the
`
`camera or spotlight to be directed to the moving target:
`
`Having acquired the target, controller 206 controls
`
`servomotors 208 to maintain the center of the target in
`
`the center of the image. . . .
`
`It will be appreciated that as the target moves, the
`
`targeting box will move with the target, constantly
`
`adjusting the center of the targeting box based upon the
`
`movement of the target, and enlarging and reducing the
`
`size of the targeting box. The targeting box may be
`
`displayed on monitor 212, or on another monitor as
`
`desired to visually track the target.
`
`Ex. 1001, 25:10-24; Ex. 1002, ¶¶42-43. Figure 23 shows an example of the
`
`targeting box in a frame:
`
`Ex. 1001 at Fig. 23
`
`
`
`Although the ’445 Patent only teaches tracking a single target, or selecting a
`
`single target from among multiple targets, it contemplates that its methods may be
`
`adapted for tracking multiple targets simultaneously: “[W]hile the invention has
`
`been described with respect to tracking a single target, it is foreseen that multiple
`
`targets may be tracked, each with user-defined classification criteria, by replicating
`
`the various elements of the invention.” Ex. 1001, 25:56-61; Ex. 1002, ¶44.
`
`
`VIII. DETAILED EXPLANATION OF GROUNDS
`A. Overview Of The Prior Art References
`1.
`Alton L. Gilbert et al., A Real-Time Video Tracking System,
`PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and
`Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005)
`
`The purported invention of the ’445 Patent relates to a process of identifying
`
`a target in digitized visual input by using histograms of pixel characteristics and
`
`tracking the target. However, researchers at U.S. Army White Sands Missile
`
`Range, New Mexico, in collaboration with New Mexico State University, Las
`
`Cruces, had already developed a system that utilizes histograms to identify and
`
`track targets, and they published their findings in January 1980, more than 17 years
`
`before the earliest effective filing date of the ’445 Patent. Ex. 1002, ¶51; Ex. 1011.
`
`The article, entitled “A Real-Time Video Tracking System,” published in
`
`IEEE Transactions on Pattern Analysis and Machine Intelligence in January 1980
`
`(“Gilbert”), qualifies as prior art under pre-AIA § 102(b). Gilbert describes “a
`
`system for missile and aircraft identification and tracking…applied in real time to
`
identify and track objects.” Ex. 1002, ¶51; Ex. 1005, 47. Gilbert was not
`
`of record and was not considered during prosecution of the ’445 Patent. The
`
`Gilbert system includes an image processing system comprising a video processor,
`
`a projection processor, a tracker processor, and a control processor as shown in
`
Figure 1, reproduced below. Ex. 1005, 48.
`
`
`
`
`The video processor receives a digitized video signal comprising 60 fields/s
`
(30 frames/s) as input. Ex. 1002, ¶52; Ex. 1005, 48. Each field consists
`
`of a succession of n X m pixels:
`
`As the TV camera scans the scene, the video signal is
`
`digitized at m equally spaced points across each
`
`horizontal scan. During each video field, there are n
`
`horizontal scans which generate an n X m discrete matrix
`
`representation at 60 fields/s [i.e., 30 frames/s]
`
`Ex. 1005, 48. Although Gilbert uses the word “field” instead of “frame,” a POSA
`
`would have understood that a “frame” consists of two “fields” in the context in
`
`which they are used in Gilbert. At the time, video signals were interlaced such that
`
`a frame of a non-interlaced video consisted of two fields. The first field would be
`
`the odd numbered scanlines and the second field would be the even numbered
`
`scanlines recorded 1/60th of a second later than the first field. Ex. 1002, ¶53.
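The field/frame relationship a POSA would have understood can be illustrated with a minimal sketch. This hypothetical Python is not from Gilbert; representing scanlines as list elements is an assumption for illustration:

```python
# Illustrative sketch: two interlaced fields (odd and even scanlines,
# captured 1/60 s apart) weave together into one full frame, so 60 fields/s
# corresponds to 30 frames/s.

def weave_fields(odd_field, even_field):
    """odd_field holds scanlines 1, 3, 5, ...; even_field holds 2, 4, 6, ..."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # scanline from the first (odd) field
        frame.append(even_line)  # scanline from the second (even) field
    return frame

odd = ["scan1", "scan3"]
even = ["scan2", "scan4"]
print(weave_fields(odd, even))  # ['scan1', 'scan2', 'scan3', 'scan4']
```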
`
`Gilbert then constructs histograms of pixels in the 256 gray-level classes in the
`
`intensity domain:
`
Every 96 ns, a pixel intensity is digitized and quantized into eight bits (256 gray levels), counted into one of six 256-level histogram memories, and then converted by a decision memory to a 2-bit code indicating its classification (target, plume, or background).
`
Id. at 48-49 (“When the entire region has been scanned, h contains the distribution of pixels over intensity and is referred to as the feature histogram of the region R.”); Ex. 1002, ¶54. In other words, the Video Processor of Gilbert creates histograms
`
`using the intensity domain over classes of all intensity values. Although Gilbert
`
`uses histograms in the intensity domain as examples, it also notes that other
`
`“features that can be functionally derived from relationship between pixels, e.g.,
`
`texture, edge, and linearity measure” may be used. Ex. 1005, 48; Ex. 1002, ¶54.
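The per-region feature histogram Gilbert describes can be sketched as follows. This illustrative Python is a simplification, not Gilbert's hardware implementation; it merely tallies each pixel's 8-bit intensity into a 256-bin histogram:

```python
# Illustrative sketch (a simplification, not Gilbert's implementation):
# accumulating a 256-bin feature histogram of 8-bit gray levels for a region.

def feature_histogram(intensities):
    """intensities: iterable of 8-bit gray levels (0-255) from one region."""
    hist = [0] * 256
    for level in intensities:
        hist[level] += 1  # count the pixel into its gray-level class
    return hist

region = [12, 12, 200, 255, 12]
hist = feature_histogram(region)
print(hist[12], hist[200], hist[255])  # 3 1 1
```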
`
`Using the histograms, the video processor “separates the target from the
`
`background,” i.e., identifies the target. Ex. 1005, 48; Ex. 1002, ¶54. Gilbert uses
`
`probability estimates based on a 256 level grayscale histogram to determine
`
`whether a particular pixel belongs to the target, plume, or background region. Ex.
`
`1002, ¶54. This identification is repeated for each frame. Ex. 1005, 47 (“The
`
`camera output is statistically decomposed into background, foreground, target, and
`
`plume region by the video processor, with this operation carried on at video rate
`
`for up to the full frame.”). Ex. 1002, ¶60. This identification process may be done
`
`for one target/plume/background set, or two different target/plume/background sets
`
simultaneously. Ex. 1005, 48 (“Although one tracking window is
`
`satisfactory for tracking missile targets with plumes, two windows are used to
`
`provide additional reliability and flexibility for independently tracking a target and
`
`plume, or two targets.”); Ex. 1002, ¶54.
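The statistical separation Gilbert describes can be approximated in outline. The sketch below is an assumption-laden simplification, not Gilbert's actual decision memory: it assigns a gray level to target, plume, or background by comparing the relative frequency of that level in each class's 256-bin histogram:

```python
# Illustrative sketch (an approximation, not Gilbert's decision rule):
# classify a pixel's gray level by the class whose histogram gives it the
# highest relative frequency.

def classify_pixel(level, class_histograms):
    """class_histograms: dict mapping class name -> 256-bin histogram."""
    def likelihood(hist):
        total = sum(hist)
        return hist[level] / total if total else 0.0
    # pick the class with the highest estimated probability for this level
    return max(class_histograms, key=lambda name: likelihood(class_histograms[name]))

histograms = {
    "target": [0] * 256,
    "plume": [0] * 256,
    "background": [0] * 256,
}
histograms["target"][200] = 50       # bright pixels seen mostly in the target
histograms["background"][30] = 900   # dark pixels seen mostly in the background
print(classify_pixel(200, histograms))  # target
```

Running this decision for every pixel of a field yields the per-pixel 2-bit classification the passage describes.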
`
`Once the video processor identifies the pixels belonging to the target, the
`
`video processor creates “a binary picture, where target presence is represented by a
`
`‘1’ and target absence by a ‘0.’” Ex. 1005, 50; Ex. 1002, ¶¶55-56. A projection
`
`processor creates projections using only the pixels identified for inclusion. Ex.
`
`1002, ¶57. Although these projections are not explicitly referred to by Gilbert as
`
`projection histograms, reference to Figure 4 of Gilbert (annotated below) clearly
`
`shows four different projection histograms formed using the target pixels:
`
`
`
`
`Ex. 1002, ¶57. These Figure 4 projections will be referred to as projection
`
`histograms throughout this petition.
`
`The projection processor then identifies the target location, orientation, and
`
`structure using the projection histograms:
`
The target location, orientation, and structure are characterized by the pattern of 1 entries in the binary picture matrix, and the target activity is characterized by a sequence of picture matrices. In the projection processor, these matrices are analyzed field-by-field at 60 fields/s [i.e., 30 frames/s]….
`
`Ex. 1005, 50; Ex. 1002, ¶58. The projection processor computes a center of area
`
`point for the target in order to “precisely determine the target position and
`
`orientation.” Ex. 1005, 50; Ex. 1002, ¶58. This calculation is done by first finding
`
`a center of area for each of top and bottom portions using the projection histograms
`
`in X- and Y-domains for the top and bottom portions of the target. Ex. 1005, 50-
`
`51; Ex. 1002, ¶58. The projection processor then uses these center-of-area points
`
`to determine a target center-of-area point. Ex. 1005, 50-51; Ex. 1002, ¶58. This is
`
`shown in Figure 4, reproduced below (with annotations):
`
`
`
`
`The tracker processor analyzes information, such as the target’s size,
`
`location, and orientation, from the projection processor and outputs information
`
`such as “1) tracking window size, 2) tracking window shape, and 3) tracking
`
`window position.” Ex. 1005, 52; Ex. 1002, ¶59. The video processor then uses
`
`this information to draw a tracking window, i.e., a tracking box, around the target.
`
`Ex. 1005, 52; Ex. 1002, ¶60. The display of the tracking box in the frame is shown
`
`in Gilbert’s Fig. 2. Ex. 1002, ¶60.
`
`
`
`The tracker processor also outputs the target’s movements to the control
`
`processor, which controls the direction and zoom scale of the lens to follow the
`
`target. Ex. 1002, ¶61.
`
The outputs to the control processor are used to control the target location and size for the next frame. The bore-sight correction signals are used to control the azimuth and elevation pointing angles of the telescope. The desired zoom is used to control the zoom lens, keeping the target visible within the FOV. The desired image rotation controls the image rotation element to keep the target image vertical.
`
`Ex. 1005, 52.
`
2. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006)

Hashima also discloses a “system for and method of recognizing and
`
`tracking a target mark…by processing an image of the target mark produced by a
`
`video camera.” Ex. 1006, 1:6-12; Ex. 1002, ¶63. Hashima qualifies as prior art
`
`under pre-AIA 35 U.S.C. §§ 102(a), 102(b), 102(e), and § 119. Although Hashima
`
`was of record during prosecution, it was not applied in any office action. Ex. 1004.
`
`Hashima uses an image processing system that takes input from a video camera
`
`and identifies the target shape based on histograms constructed from pixel
`
`characteristics. Ex. 1006, 2:45-63; Ex. 1002, ¶¶64-69. Once the target shape is
`
`detected, Hashima uses the histogram information to determine the location of the
`
`target, and move a robot arm based on the information to grip the target object.
`
`Ex. 1006, Figure 27; Ex. 1002, ¶¶64-69. One embodiment of this system is shown
`
`in Figure 1, reproduced below.
`
`
`
`Hashima uses a pre-determined mark, a black circle with white triangle
`
`inside, as shown in Fig. 3 below, but Hashima notes that “target marks of various
`
configurations…can be detected by the [disclosed] process.” Ex. 1006, 10:20-23;

Ex. 1002, ¶64. The histograms of the exemplary mark in X and Y
`
`domains counting the number of black pixels are shown in Figure 6. Ex. 1002,
`
`¶64.
`
`Hashima Fig. 3
`
`
`
`
`The image processor reads input from the video camera, converts the image
`
`Hashima Fig. 6
`
`
`
`into a binary image, and constructs a histogram of the binary images. Ex. 1006,
`
`8:22-30 (“X and Y-projected histograms of the target mark image are
`
`determined.… The X-projected histogram represents the sum of pixels having the
`
`same X coordinates, and the Y-projected histogram represents the sum of pixels
`
`having the same Y coordinates.”) and 8:18-9:7; Ex. 1002, ¶65. Then the image
`
`processor determines whether the image represents the target by counting the
`
`number of peaks and valleys in the projected histogram. Ex. 1006, 9:8-9:13; Ex.
`
`1002, ¶66. If the number matches the number of peaks and valleys of the X and Y
`
`projected histograms as shown in Fig. 6, the system identifies the image as that of
`
the target mark. Ex. 1006, 9:13-9:23; Ex. 1002, ¶¶66-67. Fig. 5 shows the flow

chart describing the detection process.
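Hashima's peak-and-valley test can be sketched as follows. This illustrative Python is not Hashima's disclosed procedure; the strict-local-maximum peak test, the expected counts, and the sample histograms are assumptions for illustration:

```python
# Illustrative sketch (assumptions, not Hashima's disclosed algorithm):
# detect the target mark by counting peaks of the X- and Y-projected
# histograms and comparing against the counts expected for the mark.

def count_peaks(hist):
    """Count strict local maxima of a projected histogram."""
    return sum(
        1
        for i in range(1, len(hist) - 1)
        if hist[i - 1] < hist[i] > hist[i + 1]
    )

def looks_like_mark(hist_x, hist_y, expected_x_peaks, expected_y_peaks):
    return (count_peaks(hist_x) == expected_x_peaks and
            count_peaks(hist_y) == expected_y_peaks)

# Hypothetical projections: two peaks along X, one along Y
hist_x = [0, 4, 9, 4, 9, 4, 0]
hist_y = [0, 2, 8, 2, 0]
print(looks_like_mark(hist_x, hist_y, 2, 1))  # True
```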
`
`
`
`Once the target mark is detected, Hashima determines the center of the
`
`detected mark from the X and Y maxima and minima of the X and Y histograms:
`
The image processor 40…generat[es] a histogram 15 projected onto the X-axis and a histogram 16 projected onto the Y-axis, and then determines the X and Y coordinates of the central position Pm (mx, my) from the projected histograms 15, 16. Specifically, the X coordinate mx of the central position Pm (mx, my) can be determined using opposite end positions Xb1, Xb2 obtained from the X-projected histogram 15 according to the following equation (3):

mx=(Xb1+Xb2)/2 (3).

The Y coordinate my of the central position Pm (mx, my) can be determined using opposite end positions Yb1, Yb2 obtained from the Y-projected histogram 16 according to the following equation (4):

my=(Yb1+Yb2)/2 (4).
`
`Ex. 1006, 11:6-25; Ex. 1002, ¶¶68-69. Figure 15 illustrates the process for finding
`
`the center position of the detected target:
`
`
`
`The center of the target is then compared to the center of the image memory
`
`to find the shift of the target in the X and Y directions. Ex. 1002, ¶70. The shift
`
`amount is used to move the robot arm (and the camera mounted thereon) toward
`
`the target, and is recalculated based on new images as the robot arm and the
`
`camera move toward the target, until the object is gripped. Ex. 1002, ¶69. Fig. 27
`
`shows a flow chart describing the process:
`
`
`
`As the target is tracked, Hashima displays a rectangular window around the
`
`target—i.e., a tracking box. Ex. 1002, ¶71. As the system receives new frames
`
`from the video camera, the target and window locations are recalculated using the
`
`new histograms created from the new frame:
`
`
`When the target mark 10 starts to be tracked, the window
`
`is established using t