UNITED STATES PATENT AND TRADEMARK OFFICE

____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

____________________

SAMSUNG ELECTRONICS CO., LTD.; AND
SAMSUNG ELECTRONICS AMERICA, INC.
Petitioner

v.

IMAGE PROCESSING TECHNOLOGIES, LLC
Patent Owner

____________________

Patent No. 8,805,001
____________________

PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 8,805,001

TABLE OF CONTENTS

I. INTRODUCTION
II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8
III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)
IV. GROUNDS FOR STANDING
V. PRECISE RELIEF REQUESTED
VI. LEGAL STANDARDS
    A. Claim Construction
    B. Level Of Ordinary Skill In The Art
VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND ’001 PATENT
VIII. DETAILED EXPLANATION OF GROUNDS
    A. Overview Of The Prior Art References
        1. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005)
        2. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006)
        3. U.S. Patent No. 5,150,432 (“Ueno”) (Ex. 1007)
IX. Specific Explanation Of Grounds For Invalidity
    A. Ground 1: Gilbert In View Of Hashima Renders Obvious Claims 1-4
        1. Reasons To Combine Gilbert And Hashima
        2. Claim 1
        3. Claim 2: “The process according to claim 1, wherein identifying the target in said at least one histogram further comprises determining a center of the target to be between X and Y minima and maxima of the target”
        4. Claim 3: “The process according to claim 1, wherein identifying the target in said at least one histogram further comprises determining the center of the target at regular intervals”
        5. Claim 4: “The process according to claim 1 further comprising drawing a tracking box around the target”
        6. Gilbert And Hashima Are Not Cumulative
    B. Ground 2: Hashima In View Of Ueno Renders Obvious Claims 1-4
        1. Reasons To Combine Hashima And Ueno
        2. Claim 1
        3. Claim 2: “The process according to claim 1, wherein identifying the target in said at least one histogram further comprises determining a center of the target to be between X and Y minima and maxima of the target”
        4. Claim 3: “The process according to claim 1, wherein identifying the target in said at least one histogram further comprises determining the center of the target at regular intervals”
        5. Claim 4: “The process according to claim 1 further comprising drawing a tracking box around the target”
        6. Hashima And Ueno Are Not Cumulative
    C. Ground 3: Ueno In View Of Gilbert Renders Obvious Claims 1-4
        1. Reasons To Combine Ueno And Gilbert
        2. Claim 1
        3. Claim 2: “The process according to claim 1, wherein identifying the target in said at least one histogram further comprises determining a center of the target to be between X and Y minima and maxima of the target”
        4. Claim 3: “The process according to claim 1, wherein identifying the target in said at least one histogram further comprises determining the center of the target at regular intervals”
        5. Claim 4: “The process according to claim 1 further comprising drawing a tracking box around the target”
        6. Ueno and Gilbert Are Not Cumulative
X. CONCLUSION

LIST OF EXHIBITS¹

1001  U.S. Patent No. 8,805,001 (“the ’001 Patent”)
1002  Declaration of Dr. John C. Hart
1003  Curriculum Vitae for Dr. John C. Hart
1004  Prosecution File History of U.S. Patent No. 8,805,001
1005  Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”)
1006  U.S. Patent No. 5,521,843 (“Hashima”)
1007  U.S. Patent No. 5,150,432 (“Ueno”)
1008  D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition-A Survey,” Pattern Recognition, vol. 29, no. 4, 1996, pp. 641-662
1009  M. H. Glauberman, “Character recognition for business machines,” Electronics, vol. 29, pp. 132-136, Feb. 1956
1010  Declaration of Gerard P. Grenier (authenticating Ex. 1005)

¹ Citations to non-patent publications are to the original page numbers of the publication, and citations to U.S. patents are to column:line number of the patents.

I. INTRODUCTION

Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Petitioner”) request inter partes review (“IPR”) of Claims 1-4 of U.S. Patent No. 8,805,001 (“the ’001 Patent”) (Ex. 1001), which, on its face, is assigned to Image Processing Technologies, LLC (“Patent Owner”). This Petition presents several non-cumulative grounds of invalidity that the U.S. Patent and Trademark Office (“PTO”) did not consider during prosecution. These grounds are each likely to prevail; accordingly, this Petition should be granted on all grounds and the challenged claims should be cancelled.
II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8

Real Parties-in-Interest: Petitioner identifies the following real parties-in-interest: Samsung Electronics Co., Ltd.; Samsung Electronics America, Inc.

Related Matters: Patent Owner has asserted the ’001 Patent against Petitioner in Image Processing Technologies LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG (E.D. Tex.). Patent Owner has also asserted U.S. Patent Nos. 6,959,293; 7,650,015; 8,983,134; and 8,989,445 in the related action. Petitioner is concurrently filing IPR petitions for all of these asserted patents.

Lead and Back-Up Counsel:

• Lead Counsel: John Kappos (Reg. No. 37,861), O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660. (Telephone: 949-823-6900; Fax: 949-823-6994; Email: jkappos@omm.com.)

• Back-Up Counsel: Nick Whilt (Reg. No. 72,081), Brian M. Cook (Reg. No. 59,356), O’Melveny & Myers LLP, 400 S. Hope Street, Los Angeles, CA 90071. (Telephone: 213-430-6000; Fax: 213-430-6407; Email: nwhilt@omm.com, bcook@omm.com.)

Service Information: Samsung consents to electronic service by email to IPTSAMSUNGOMM@OMM.COM. Please address all postal and hand-delivery correspondence to lead counsel at O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660, with courtesy copies to the email address identified above.
III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)

The Office is authorized to charge $23,000 to Deposit Account No. 50-0639 for the fee set forth in 37 C.F.R. § 42.15(a), and any additional fees that might be due in connection with this Petition.
IV. GROUNDS FOR STANDING

Petitioner certifies that the ’001 Patent is available for IPR and Petitioner is not barred or estopped from requesting IPR on the grounds identified herein.

V. PRECISE RELIEF REQUESTED

Petitioner respectfully requests review of Claims 1-4 of the ’001 Patent, and cancellation of these claims, based on the grounds listed below:

• Ground 1: Claims 1-4 are obvious under 35 U.S.C. § 103(a) over Gilbert in view of Hashima;

• Ground 2: Claims 1-4 are obvious under 35 U.S.C. § 103(a) over Hashima in view of Ueno; and

• Ground 3: Claims 1-4 are obvious under 35 U.S.C. § 103(a) over Ueno in view of Gilbert.
VI. LEGAL STANDARDS

A. Claim Construction

For expired claims, the Federal Circuit has held that the claims should be construed according to the Phillips v. AWH Corp. standard applicable in district court. See In re Rambus Inc., 753 F.3d 1253, 1256 (Fed. Cir. 2014). Under Phillips, terms are given “the meaning that [a] term would have to a person of ordinary skill in the art in question at the time of the invention.” Phillips v. AWH Corp., 415 F.3d 1303, 1316 (Fed. Cir. 2005). Under 37 C.F.R. § 42.100(b), the PTAB may also apply a district court-type claim construction if the patent is to expire within 18 months of the entry of the Notice of Filing Date.

The ’001 Patent will expire on December 2, 2017—within 18 months of the Notice of Filing Date. Thus, for purposes of this proceeding, Petitioner has applied the Phillips standard to all claim construction issues. See also Ex. 1002, ¶49.

B. Level Of Ordinary Skill In The Art

One of ordinary skill in the art at the time of the alleged invention of the ’001 Patent would have had either (1) a Master’s Degree in Electrical Engineering or Computer Science or the equivalent, plus at least a year of experience in the field of image processing, image recognition, machine vision, or a related field; or (2) a Bachelor’s Degree in Electrical Engineering or Computer Science or the equivalent, plus at least three years of experience in the field of image processing, image recognition, machine vision, or a related field. Additional education could substitute for work experience and vice versa. Ex. 1002, ¶¶44-48.
VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND ’001 PATENT

The purported invention of the ’001 Patent relates to identifying and tracking a target in an input signal using one or more histograms derived from an image frame in the video signal. See, e.g., Ex. 1001, Claims 1-4; Ex. 1002, ¶¶31-33. Video image processing and the use of histograms to identify and track targets, and to derive other information from a video signal, were well known at the time the asserted patents were filed. Ex. 1002, ¶¶23-30, 60, 70, 77. An input signal used in the purported invention has “a succession of frames, each frame having a succession of pixels.” Ex. 1001, 3:28-31. The input signal may be a video signal or any other signal that “generates an output in the form of an array of information corresponding to information observed by the imaging device,” such as “ultrasound, IR, Radar, tactile array, etc.” Ex. 1001, 9:26-29; Ex. 1002, ¶32. The ’001 Patent then constructs a histogram showing the frequency of pixels meeting a certain characteristic. The characteristics used to form histograms are referred to as “domains” in the ’001 Patent. Ex. 1002, ¶34. The ’001 Patent teaches that “the domains are preferably selected from the group consisting of i) luminance, ii) speed (V), iii) oriented direction (DI), iv) time constant (CO), v) hue, vi) saturation, and vii) first axis (x(m)), and viii) second axis (y(m)).” Ex. 1001, 4:2-6; Ex. 1002, ¶¶34, 36. Figure 11 shows histogram processors that can create histograms in various domains:

[Ex. 1001, Fig. 11]

The histograms include a plurality of “classes” within a given domain. Ex. 1002, ¶35. Figure 14a (and its accompanying description) illustrates an example of “classes” within a domain:

FIG. 14a shows an example of the successive classes C1, C2 … Cn−1, Cn, each representing a particular velocity, for a hypothetical velocity histogram, with there being categorization for up to 16 velocities (15 are shown) in this example. Also shown is envelope 38, which is a smoothed representation of the histogram.

Ex. 1001, 20:54-59.

The ’001 Patent then uses the histograms to identify a target in the input signal. For example, one embodiment of the ’001 Patent performs “automatic framing of a person…during a video conference.” Id. at 22:10-12, Figure 15.

The system constructs histograms in the X- and Y-domains counting the number of pixels where the differences in luminance between successive frames are above certain threshold values:

The pixels with greatest movement within the image will normally occur at the peripheral edges of the head of the subject, where even due to slight movements, the pixels will vary between the luminance of the head of the subject and the luminance of the background. Thus, if the system of the invention is set to identify only pixels with DP=1, and to form a histogram of these pixels, the histogram will detect movement peaks along the edges of the face where variations in brightness, and therefore in pixel value, are the greatest, both in the horizontal projection along Ox and in the vertical projection along Oy.

Id. at 22:49-59, 10:30-58 (explaining that DP is set to “1” when the pixel value of the pixel under consideration has “undergone significant variation as compared to…the same pixel in the prior frame”); Ex. 1002, ¶¶36-37. Figures 16 and 17 show the camera setup and the histogram constructed using this method:

[Ex. 1001, Fig. 16]  [Ex. 1001, Fig. 17]
In addition, the system may also be used to automatically track a target by “a spotlight or a camera. Using a spotlight the invention might be used on a helicopter to track a moving target on the ground, or to track a performer on a stage during an exhibition. The invention would similarly be applicable to weapons targeting systems.” Ex. 1001, 23:40-46; Ex. 1002, ¶¶38-39. In such applications, the system uses X- and Y-minima and maxima of the histograms in X- and Y-domains to determine the center of the target:

In a preferred embodiment, the new center of the area is determined to be (XMIN+XMAX)/2, (YMIN+YMAX)/2, where XMIN and XMAX are the positions of the minima and maxima of the x projection histogram, and YMIN and YMAX are the positions of the minima and maxima of the y projection histogram.
Ex. 1001, 24:50-55. The patent defines “the positions of the minima” of a projection histogram to be the smallest X- (and Y-) coordinate of any pixel in the image region whose validation signal is “1.” Ex. 1002, ¶¶40-41. Similarly, the maximum is the largest X- (and Y-) coordinate of any pixel in the image region whose validation signal is “1.” Id. The system may recalculate the histograms at regular intervals and use those points to again find the center coordinates as the new frames are received. Ex. 1002, ¶42.
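For illustration only (this sketch is not part of the record and uses hypothetical data), the projection-histogram and center computation described above can be expressed as follows, taking the center at ((XMIN+XMAX)/2, (YMIN+YMAX)/2) over pixels whose validation signal is “1”:

```python
# Illustrative sketch only -- not part of the record. Computes X- and
# Y-projection histograms from a binary "validation" mask (1 = pixel passed
# the luminance-difference test) and locates the target center at
# ((XMIN + XMAX) / 2, (YMIN + YMAX) / 2), per the formula quoted above.

def projection_histograms(mask):
    """Return (hist_x, hist_y): counts of 1-pixels per column and per row."""
    height, width = len(mask), len(mask[0])
    hist_x = [sum(mask[y][x] for y in range(height)) for x in range(width)]
    hist_y = [sum(row) for row in mask]
    return hist_x, hist_y

def target_center(mask):
    """Midpoint of the nonzero extents of the projection histograms."""
    hist_x, hist_y = projection_histograms(mask)
    xs = [x for x, count in enumerate(hist_x) if count > 0]
    ys = [y for y, count in enumerate(hist_y) if count > 0]
    return (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2

# Hypothetical 4x5 validation mask with movement pixels near a head's edges.
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]
print(target_center(mask))  # (2.0, 1.5)
```

Under this sketch, the same mask-to-histogram computation would simply be repeated on each new frame to update the center at regular intervals.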
Once the center of the target is determined, the center is used to adjust the camera or spotlight to be directed to the moving target:

Having acquired the target, controller 206 controls servomotors 208 to maintain the center of the target in the center of the image….

It will be appreciated that as the target moves, the targeting box will move with the target, constantly adjusting the center of the targeting box based upon the movement of the target, and enlarging and reducing the size of the targeting box. The targeting box may be displayed on monitor 212, or on another monitor as desired to visually track the target.

Ex. 1001, 25:12-25; Ex. 1002, ¶43. Figure 23 shows an example of the targeting box in a frame:

[Ex. 1001, Fig. 23]
VIII. DETAILED EXPLANATION OF GROUNDS

A. Overview Of The Prior Art References

1. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005)

The purported invention of the ’001 Patent relates to a process of identifying a target in digitized visual input by using histograms of pixel characteristics and tracking the target. However, researchers at the U.S. Army White Sands Missile Range, New Mexico, in collaboration with New Mexico State University, Las Cruces, had already developed a system that utilizes histograms to identify and track targets, and they published their findings in January 1980, more than 17 years before the earliest effective filing date of the ’001 Patent. Ex. 1002, ¶50; Ex. 1010, Grenier Decl.

The article, entitled “A Real-Time Video Tracking System,” published in IEEE Transactions on Pattern Analysis and Machine Intelligence in January 1980 (“Gilbert”), qualifies as prior art under pre-AIA § 102(b). Gilbert describes “a system for missile and aircraft identification and tracking…applied in real time to identify and track objects.” Ex. 1002, ¶51; Ex. 1005, 47. Gilbert was not of record and was not considered during prosecution of the ’001 Patent. The Gilbert system includes an image processing system comprising a video processor, a projection processor, a tracker processor, and a control processor, as shown in Figure 1, reproduced below. Ex. 1002, ¶51; Ex. 1005, 48.

[Ex. 1005, Fig. 1]
The video processor receives a digitized video signal comprising 60 fields/s, i.e., 30 frames/s, as input. Ex. 1002, ¶52; Ex. 1005, 48. Each field—i.e., frame—consists of a succession of n X m pixels:

As the TV camera scans the scene, the video signal is digitized at m equally spaced points across each horizontal scan. During each video field, there are n horizontal scans which generate an n X m discrete matrix representation at 60 fields/s.

Ex. 1005, 48. Although Gilbert uses the word “field” instead of “frame,” a POSA would have understood that a “frame” consists of two “fields” in the context in which they are used in Gilbert. (At the time, video signals were interlaced such that a frame of a non-interlaced video consisted of two fields. The first field would be the odd-numbered scanlines, and the second field would be the even-numbered scanlines, recorded 1/60th of a second later than the first field.) Ex. 1002, ¶52.
Gilbert then constructs histograms of pixels in the 256 gray-level classes in the intensity domain:

Every 96 ns, a pixel intensity is digitized and quantized into eight bits (256 gray levels), counted into one of six 256-level histogram memories, and then converted by a decision memory to a 2-bit code indicating its classification (target, plume, or background).

Ex. 1005, 48, 49 (“When the entire region has been scanned, h contains the distribution of pixels over intensity and is referred to as the feature histogram of the region R.”). In other words, the Video Processor of Gilbert creates histograms using the intensity domain over classes of all intensity values. Ex. 1002, ¶52. Although Gilbert uses histograms in the intensity domain as examples, it also notes that other “features that can be functionally derived from relationship between pixels, e.g., texture, edge, and linearity measure” may be used. Ex. 1005, 48; Ex. 1002, ¶53.
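For illustration only (this sketch is not Gilbert’s actual implementation, and the threshold values are hypothetical), the per-pixel histogram-and-classification step described above can be modeled as a 256-bin histogram plus a lookup table standing in for Gilbert’s “decision memory”:

```python
# Illustrative sketch only -- not Gilbert's actual implementation. Each 8-bit
# pixel intensity is counted into a 256-bin feature histogram and mapped by a
# "decision memory" (modeled here as a 256-entry lookup table) to a class
# code. The two thresholds below are hypothetical; Gilbert derives its
# decision memory from probability estimates over the region histograms.

BACKGROUND, PLUME, TARGET = 0, 1, 2

def feature_histogram(pixels):
    """256-bin distribution of 8-bit pixel intensities over a region."""
    hist = [0] * 256
    for intensity in pixels:
        hist[intensity] += 1
    return hist

def make_decision_memory(plume_level=100, target_level=180):
    """Lookup table mapping intensity -> 2-bit classification code."""
    return [TARGET if level >= target_level
            else PLUME if level >= plume_level
            else BACKGROUND
            for level in range(256)]

region = [20, 150, 200, 250, 90]        # hypothetical scan-line intensities
hist = feature_histogram(region)
decision_memory = make_decision_memory()
codes = [decision_memory[p] for p in region]
print(codes)  # [0, 1, 2, 2, 0]
```

The table lookup mirrors how a hardware decision memory can classify each pixel at video rate once the histogram statistics have been gathered.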
Using the histograms, the video processor “separates the target from the background,” i.e., identifies the target. Ex. 1005, 48. Gilbert uses probability estimates based on a 256-level grayscale histogram to determine whether a particular pixel belongs to the target, plume, or background region. Ex. 1002, ¶52. This identification is repeated for each frame. See Ex. 1005, 47 (“The camera output is statistically decomposed into background, foreground, target, and plume region by the video processor, with this operation carried on at video rate for up to the full frame.”).

Once the video processor identifies the pixels belonging to the target, the video processor creates “a binary picture, where target presence is represented by a ‘1’ and target absence by a ‘0.’” Ex. 1005, 50 (“In the projection processor, these matrices are analyzed field-by-field at 60 field/s using projection based classification algorithm to extract the structural and activity parameters needed to identify and track the target.”); Ex. 1002, ¶54. This binary picture indicates to the projection processor whether or not a pixel should be considered further in constructing X- and Y-projections. Ex. 1002, ¶54. The projection processor creates projections using only the pixels identified for inclusion. Ex. 1002, ¶55. Although these projections are not explicitly referred to by Gilbert as projection histograms, reference to Figure 4 of Gilbert (annotated below) clearly shows four different projection histograms formed using the target pixels:

[Ex. 1005, Fig. 4 (annotated)]

In addition, Gilbert explains that a projection “gives the number of object points along parallel lines; hence it is a distribution of the target points for a given view angle.” Ex. 1005, 50. Thus, these Figure 4 projections will be referred to as projection histograms throughout this petition.
The projection processor then identifies the target location, orientation, and structure using the projection histograms:

The target location, orientation, and structure are characterized by the pattern of 1 entries in the binary picture matrix, and the target activity is characterized by a sequence of picture matrices. In the projection processor, these matrices are analyzed field-by-field at 60 fields/s [i.e., 30 frames/s]….

Ex. 1005, 50. The projection processor computes a center-of-area point for the target in order to “precisely determine the target position and orientation.” Id.; Ex. 1002, ¶56. This calculation is done by first finding a center of area for each of the top and bottom portions of the target, using the projection histograms in the X- and Y-domains for those portions. Ex. 1002, ¶¶56-57. The projection processor then uses these center-of-area points to determine a target center-of-area point. This is shown in Figure 4, reproduced below (with annotations):
[Ex. 1005, Fig. 4 (annotated)]
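For illustration only (not part of the record), a center-of-area point can be derived from projection histograms as sketched below; this simplified version computes a single centroid, whereas Gilbert first finds separate top- and bottom-portion centers and combines them:

```python
# Illustrative sketch only. Derives a center-of-area (centroid) point for a
# binary target directly from its X- and Y-projection histograms; Gilbert's
# projection processor computes top- and bottom-portion centers first.

def center_of_area(hist_x, hist_y):
    """Centroid (cx, cy) of a binary target from its X/Y projections."""
    total = sum(hist_x)  # number of target pixels; equals sum(hist_y)
    cx = sum(x * count for x, count in enumerate(hist_x)) / total
    cy = sum(y * count for y, count in enumerate(hist_y)) / total
    return cx, cy

# Projections of a hypothetical 3-pixel target at (1,0), (2,0), (2,1).
hist_x = [0, 1, 2]
hist_y = [2, 1]
cx, cy = center_of_area(hist_x, hist_y)
print(round(cx, 3), round(cy, 3))  # 1.667 0.333
```

Because the projections already aggregate the target pixels by row and column, the centroid falls out of two one-dimensional sums rather than a full scan of the binary picture.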
The tracker processor analyzes information, such as the target’s size, location, and orientation, from the projection processor and outputs information such as “1) tracking window size, 2) tracking window shape, and 3) tracking window position.” Ex. 1005, 52; Ex. 1002, ¶59. The video processor then uses this information to draw a tracking window, i.e., a tracking box, around the target. See id. The display of the tracking box in the frame is shown in Figure 2.

[Ex. 1005, Fig. 2]
The tracker processor also outputs the target’s movements to the control processor, which controls the direction and zoom scale of the lens to follow the target:

The outputs to the control processor are used to control the target location and size for the next frame. The bore-sight correction signals are used to control the azimuth and elevation pointing angles of the telescope. The desired zoom is used to control the zoom lens, keeping the target visible within the FOV. The desired image rotation controls the image rotation element to keep the target image vertical.

Ex. 1005, 52.

The histogram formation and target identification process of Gilbert is performed on a frame-by-frame basis. Ex. 1005, 52; Ex. 1002, ¶58.

2. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006)

Hashima also discloses a “system for and method of recognizing and tracking a target mark…by processing an image of the target mark produced by a video camera.” Ex. 1006, 1:6-12. Hashima qualifies as prior art under pre-AIA 35 U.S.C. §§ 102(a), 102(b), 102(e), and § 119 (“but no patent shall be granted…for an invention which had been…described in a printed publication in any country more than one year before the date of the actual filing of the application in this country.”). Although Hashima was of record, it was not applied during prosecution of the ’001 Patent. Hashima uses an image processing system that takes input from a video camera and identifies the target shape based on histograms constructed from pixel characteristics. Ex. 1002, ¶¶61-62. Once the target shape is detected, Hashima uses the histogram information to determine the location of the target, and moves a robot arm based on the information to grip the target object. One embodiment of this system is shown in Figure 1, reproduced below.

[Ex. 1006, Fig. 1]
Hashima uses a pre-determined mark, a black circle with a white triangle inside, as shown in Figure 3 below, and Hashima notes that “target marks of various configurations…can be detected by the [disclosed] process.” Ex. 1006, 10:20-23; Ex. 1002, ¶62. The histograms of the exemplary mark in the X- and Y-domains, counting the number of black pixels, are shown in Figure 6.

[Hashima Fig. 3]  [Hashima Fig. 6]

The image processor reads input from the video camera, converts the image into a binary image, and constructs a histogram of the binary image. Ex. 1006, 8:22-30 (“X- and Y-projected histograms of the target mark image are determined…. The X-projected histogram represents the sum of pixels having the same X-coordinates, and the Y-projected histogram represents the sum of pixels having the same Y-coordinates.”); Ex. 1002, ¶62. See generally Ex. 1006, 8:18-9:7. Then the image processor determines whether the image represents the target by counting the number of peaks and valleys in the projected histograms. Ex. 1006, 9:8-13; Ex. 1002, ¶63. If the number matches the number of peaks and valleys of the X- and Y-projected histograms as shown in Figure 6 (reproduced above), the system identifies the image as that of the target mark. Id. at 9:13-23; Ex. 1002, ¶63. Figure 5 shows the flow chart describing the detection process:
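For illustration only (this sketch is not Hashima’s actual algorithm, and the expected peak/valley counts are hypothetical), screening a candidate image by counting peaks and valleys in its projected histograms can be sketched as:

```python
# Illustrative sketch only -- not Hashima's actual algorithm. Screens a
# candidate by counting strict local maxima (peaks) and minima (valleys) in
# each projected histogram and comparing the counts with those expected for
# the known target mark. The expected counts below are hypothetical.

def peaks_and_valleys(hist):
    """Count strict local maxima and minima of a 1-D histogram."""
    peaks = valleys = 0
    for i in range(1, len(hist) - 1):
        if hist[i] > hist[i - 1] and hist[i] > hist[i + 1]:
            peaks += 1
        elif hist[i] < hist[i - 1] and hist[i] < hist[i + 1]:
            valleys += 1
    return peaks, valleys

def matches_mark(hist_x, hist_y, expected=((2, 1), (2, 1))):
    """True if both projections show the expected peak/valley counts."""
    return (peaks_and_valleys(hist_x), peaks_and_valleys(hist_y)) == expected

hist_x = [0, 3, 1, 3, 0]  # two peaks around one valley
hist_y = [0, 4, 2, 4, 0]
print(matches_mark(hist_x, hist_y))  # True
```

Counting peaks and valleys this way gives a cheap signature test: most non-mark shapes produce projection profiles with different counts and are rejected before any center computation is attempted.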
[Hashima Fig. 5]
Once the target mark is detected, Hashima determines the center of the detected mark from the X- and Y-maxima and minima of the X- and Y-histograms:

The image processor 40…generat[es] a histogram 15 projected onto the X-axis and a histogram 16 projected onto the Y-axis, and then determines the X and Y coordinates of the central position Pm (mx, my) from the projected histograms 15, 16. Specifically, the X coordinate mx of the central position Pm (mx, my) can be determined using opposite end positions Xb1, Xb2 obtained from the X-projected histogram 15 according to the following equation (3):

mx = (Xb1 + Xb2)/2   (3).

The Y coordinate my of the central position Pm (mx, my) can be determined using opposite end positions Yb1, Yb2 obtained from the Y-projected histogram 16 according to the following equation (4):

my = (Yb1 + Yb2)/2   (4).

Id. at 11:6-25; Ex. 1002, ¶¶65-66. Figure 15 illustrates the process for finding the center position of the detected target:

[Hashima Fig. 15]
The center of the target is then compared to the center of the image memory from the previous frame to find the shift of the target in the X- and Y-directions. The shift amount is used to move the robot arm (and the camera mounted thereon) toward the target, and is recalculated based on new images as the robot arm and the camera move toward the target, until the object is gripped. Ex. 1002, ¶¶64, 67; Ex. 1006, 14:61-15:37. Figure 27 shows a flow chart describing the process:

[Hashima Fig. 27]
As the target is tracked, Hashima displays a rectangular window around the target—i.e., a tracking box. Ex. 1006, 14:29-34, Figure 23. As the system receives new frames from the video camera, the target and window locations are recalculated using the new histograms created from each new frame:

When the target mark 10 starts to be tracked, the window is established using the projected histogram information obtained when the target mark image is recognized. When the target mark 10 is subsequently tracked, the window [44] is established using new projected histogram information obtained upon each measurement made by the camera 20.

Id. at 14:29-34; Ex. 1002, ¶68. Figure 23 shows the target window 44 around the target mark 10A.

[Hashima Fig. 23]

A block diagram of the system in Hashima shows that after determining an appropriate window, the image is displayed on a monitor 315. See also Ex. 1006, Hashima at 25:34-38, Figure 52; Ex. 1002, ¶69.

[Hashima Fig. 52]
3. U.S. Patent No. 5,150,432 (“Ueno”) (Ex. 1007)

A similar process and apparatus is described in Ueno. Ueno uses histograms in X- and Y-domains to identify the facial region in the frames of video signals, and tracks the facial region with a rectangle constructed from X- and Y-minima and maxima derived from the histograms. Ex. 1007, 1:39-68; Ex. 1002, ¶71. Ueno qualifies as prior art under pre-AIA § 102(b). Ueno was not of record and was not considered during prosecution of the ’001 Patent.

Ueno describes a video encoding system that includes “frame memory 101 for storing image signals for one frame” connected to “a facial region detecting circuit 102 for detecting human facial region of the frame data….” Ex. 1007, 3:5-7; Ex. 1002, ¶72. In Ueno, “video signals are sequentially input to the image input terminal 100 in frames.” Ex. 1007, 4:25-26. “The image signal is written in the frame memory 101 as frame data one frame by one frame. The frame data written in the frame memory 101…is supplied to the facial region detecting circuit 102….” Ex. 1007, 4:33-37. The facial region detecting circuit of Ueno uses differences between two consecutive
