UNITED STATES PATENT AND TRADEMARK OFFICE

____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

____________________

SAMSUNG ELECTRONICS CO., LTD.; AND
SAMSUNG ELECTRONICS AMERICA, INC.
Petitioner

v.

IMAGE PROCESSING TECHNOLOGIES, LLC
Patent Owner

____________________

Patent No. 8,983,134
____________________

PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 8,983,134

TABLE OF CONTENTS

I. INTRODUCTION ........................................................................................... 1

II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8 ................................... 1

III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a) .................................... 2

IV. GROUNDS FOR STANDING ........................................................................ 2

V. PRECISE RELIEF REQUESTED .................................................................. 2

VI. LEGAL STANDARDS ................................................................................... 3

    A. Claim Construction ............................................................................... 3

    B. Level Of Ordinary Skill In The Art ....................................................... 4

VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND ’134 PATENT .......................................................................................................... 4

VIII. DETAILED EXPLANATION OF GROUNDS ............................................ 11

    A. Overview Of The Prior Art References .............................................. 11

        1. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005) ............................................................... 11

        2. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006) ................. 19

        3. U.S. Patent No. 5,150,432 (“Ueno”) (Ex. 1007) ...................... 28

IX. Specific Explanation Of Grounds For Invalidity ........................................... 34

    A. Ground 1: Gilbert In View Of Hashima Renders Obvious Claims 1-2 ........................................................................................... 34

        1. Reasons To Combine Gilbert And Hashima ............................ 34

        2. Claim 1 ...................................................................................... 39

        3. Claim 2: “The process according to Claim 1 further comprising drawing a tracking box around the target” ............ 45

        4. Gilbert And Hashima Are Not Cumulative .............................. 48

    B. Ground 2: Hashima In View Of Ueno Renders Obvious Claims 1-2 ........................................................................................... 50

        1. Reasons To Combine Hashima And Ueno ............................... 50

        2. Claim 1 ...................................................................................... 53

        3. Claim 2: “The process according to Claim 1 further comprising drawing a tracking box around the target” ............ 56

        4. Hashima And Ueno Are Not Cumulative ................................. 59

    C. Ground 3: Gilbert In View Of Ueno Renders Obvious Claims 1-2 ........................................................................................................ 60

        1. Reasons To Combine Ueno And Gilbert .................................. 60

        2. Claim 1 ...................................................................................... 64

        3. Claim 2: “The process according to Claim 1 further comprising drawing a tracking box around the target” ............ 68

        4. Gilbert and Ueno Are Not Cumulative ..................................... 71

X. CONCLUSION .............................................................................................. 73

Certification of Word Count .................................................................................... 73

The undersigned certifies pursuant to 37 C.F.R. § 42.6(e) and § 42.105 that on November 30, 2016, a true and correct copy of Petitioner’s Petition for Inter Partes Review of U.S. Patent No. 8,983,134 was served via express mail on the Patent Owner at the following correspondence address of record: ................................... 74
LIST OF EXHIBITS1

1001  U.S. Patent No. 8,983,134 (“the ’134 patent”)
1002  Declaration of Dr. John C. Hart
1003  Curriculum Vitae for Dr. John C. Hart
1004  Prosecution File History of U.S. Patent No. 8,983,134
1005  Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”)
1006  U.S. Patent 5,521,843 (“Hashima”)
1007  U.S. Patent 5,150,432 (“Ueno”)
1008  D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition-A Survey,” Pattern Recognition, vol. 29, no. 4, 1996, pp. 641-662
1009  M. H. Glauberman, “Character recognition for business machines,” Electronics, vol. 29, pp. 132-136, Feb. 1956
1010  Declaration of Gerard P. Grenier (authenticating Ex. 1005)

1 Citations to non-patent publications are to the original page numbers of the publication, and citations to U.S. patents are to column:line number of the patents.
I. INTRODUCTION

Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Petitioner”) request inter partes review (“IPR”) of Claims 1-2 of U.S. Patent No. 8,983,134 (“the ’134 Patent”) (Ex. 1001), which, on its face, is assigned to Image Processing Technologies, LLC (“Patent Owner”). This Petition presents several non-cumulative grounds of invalidity that the U.S. Patent and Trademark Office (“PTO”) did not consider during prosecution. These grounds are each likely to prevail, and this Petition, accordingly, should be granted on all grounds and the challenged claims should be cancelled.
II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8

Real Parties-in-Interest: Petitioner identifies the following real parties-in-interest: Samsung Electronics Co., Ltd.; Samsung Electronics America, Inc.

Related Matters: Patent Owner has asserted the ’134 Patent against Petitioner in Image Processing Technologies LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG (E.D. Tex.). Patent Owner has also asserted U.S. Patent Nos. 6,959,293; 7,650,015; 8,805,001; and 8,989,445 in the related action. Petitioner is concurrently filing IPR petitions for all of these asserted patents.

Lead and Back-Up Counsel:

• Lead Counsel: John Kappos (Reg. No. 37,861), O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660. (Telephone: 949-823-6900; Fax: 949-823-6994; Email: jkappos@omm.com.)

• Backup Counsel: Nicholas J. Whilt (Reg. No. 72,081), Brian M. Cook (Reg. No. 59,356), O’Melveny & Myers LLP, 400 S. Hope Street, Los Angeles, CA 90071. (Telephone: 213-430-6000; Fax: 213-430-6407; Email: nwhilt@omm.com, bcook@omm.com.)

Service Information: Samsung consents to electronic service by email to IPTSAMSUNGOMM@OMM.COM. Please address all postal and hand-delivery correspondence to lead counsel at O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660, with courtesy copies to the email address identified above.

III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)

The Office is authorized to charge $23,000 to Deposit Account No. 50-0639 for the fee set forth in 37 C.F.R. § 42.15(a), and any additional fees that might be due in connection with this Petition.
IV. GROUNDS FOR STANDING

Petitioner certifies that the ’134 Patent is available for IPR and Petitioner is not barred or estopped from requesting IPR on the grounds identified herein.

V. PRECISE RELIEF REQUESTED

Petitioner respectfully requests review of Claims 1-2 of the ’134 Patent, and cancellation of these claims, based on the grounds listed below:

• Ground 1: Claims 1-2 are obvious under 35 U.S.C. § 103(a) over Gilbert in view of Hashima;

• Ground 2: Claims 1-2 are obvious under 35 U.S.C. § 103(a) over Hashima in view of Ueno; and

• Ground 3: Claims 1-2 are obvious under 35 U.S.C. § 103(a) over Gilbert in view of Ueno.
VI. LEGAL STANDARDS

A. Claim Construction

For expired claims, the Federal Circuit has held that the claims should be construed according to the Phillips v. AWH Corp. standard applicable in district court. See In re Rambus Inc., 753 F.3d 1253, 1256 (Fed. Cir. 2014). Under Phillips, terms are given “the meaning that [a] term would have to a person of ordinary skill in the art in question at the time of the invention.” Phillips v. AWH Corp., 415 F.3d 1303, 1316 (Fed. Cir. 2005) (en banc). Under 37 C.F.R. § 42.100(b), the PTAB may also apply a district court-type claim construction if the patent is to expire within 18 months of the entry of the Notice of Filing Date.

The ’134 Patent will expire on July 22, 2017—within 18 months of the Notice of Filing Date. Thus, for purposes of this proceeding, Petitioner has interpreted each claim term according to its plain and ordinary meaning. See also Ex. 1002, Hart Decl. ¶48. For purposes of invalidity raised in this proceeding, Petitioner does not believe any term needs an explicit construction.
B. Level Of Ordinary Skill In The Art

One of ordinary skill in the art at the time of the alleged invention of the ’134 Patent would have had either (1) a Master’s Degree in Electrical Engineering or Computer Science or the equivalent, plus at least a year of experience in the field of image processing, image recognition, machine vision, or a related field; or (2) a Bachelor’s Degree in Electrical Engineering or Computer Science or the equivalent, plus at least three years of experience in the field of image processing, image recognition, machine vision, or a related field. Additional education could substitute for work experience and vice versa. Ex. 1002, Hart Decl. ¶¶45-48.
VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND ’134 PATENT

The purported invention of the ’134 Patent relates to identifying and tracking a target in an input signal using one or more histograms derived from an image frame in the video signal. See, e.g., Ex. 1001, ’134 Patent, at Claims 1-2; Ex. 1002, Hart Decl. ¶¶31-33. Video image processing and the use of histograms to identify and track targets, and to derive other information from a video signal, were well known at the time the asserted patents were filed. Ex. 1002, Hart Decl. ¶¶23-30, 44, 59, 69, 76. An input signal used in the purported invention has “a succession of frames, each frame having a succession of pixels.” Ex. 1001, ’134 Patent, at 3:31-34. The input signal may be a video signal or any other signal that “generates an output in the form of an array of information corresponding to information observed by the imaging device,” such as “ultrasound, IR, Radar, tactile array, etc.” Ex. 1001, ’134 Patent, at 9:27-32; Ex. 1002, Hart Decl. ¶33. The ’134 Patent then constructs a histogram showing the frequency of pixels meeting a certain characteristic. The characteristics used to form histograms are referred to as “domains” in the ’134 Patent. Ex. 1002, Hart Decl. ¶34. The ’134 Patent teaches that “the domains are preferably selected from the group consisting of i) luminance, ii) speed (V), iii) oriented direction (DI), iv) time constant (CO), v) hue, vi) saturation, and vii) first axis (x(m)), and viii) second axis (y(m)).” Ex. 1001, ’134 Patent, at 4:5-9; Ex. 1002, Hart Decl. ¶¶34, 36. Figure 11 shows histogram processors that can create histograms in various domains:

The histograms include a plurality of “classes” within a given domain. Ex. 1002, Hart Decl. ¶35. Figure 14a (and its accompanying description) illustrates an example of “classes” within a domain:

FIG. 14a shows an example of the successive classes C1, C2 . . . Cn−1, Cn, each representing a particular velocity, for a hypothetical velocity histogram, with there being categorization for up to 16 velocities (15 are shown) in this example. Also shown is envelope 38, which is a smoothed representation of the histogram.

Ex. 1001, ’134 Patent at 20:49-54.

The ’134 Patent then uses the histograms to identify a target in the input signal. For example, one embodiment of the ’134 Patent performs “automatic framing of a person . . . during a video conference.” Ex. 1001, ’134 Patent, at 22:4-6; see also Figure 15.

The system constructs histograms in the X and Y domains counting the number of pixels where the differences in luminance between successive frames are above certain threshold values:
The pixels with greatest movement within the image will normally occur at the peripheral edges of the head of the subject, where even due to slight movements, the pixels will vary between the luminance of the head of the subject and the luminance of the background. Thus, if the system of the invention is set to identify only pixels with DP=1, and to form a histogram of these pixels, the histogram will detect movement peaks along the edges of the face where variations in brightness, and therefore in pixel value, are the greatest, both in the horizontal projection along Ox and in the vertical projection along Oy.

Ex. 1001, ’134 Patent, at 22:44-54 and 10:33-61 (explaining that DP is set to “1” when the pixel value of the pixel under consideration has “undergone significant variation as compared to . . . the same pixel in the prior frame”); Ex. 1002, Hart Decl. ¶¶36-37. Figures 16 and 17 show the camera setup and the histogram constructed using this method:

Ex. 1001, Fig. 16

Ex. 1001, Fig. 17
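The framing procedure just described (flag the pixels whose luminance changed significantly between successive frames, then project the flagged pixels onto Ox and Oy) can be sketched in a few lines. This sketch illustrates only the general technique, not the patent's implementation; the threshold value, array shapes, and function name are assumptions for illustration:

```python
import numpy as np

def motion_projection_histograms(prev_frame, curr_frame, threshold=16):
    """Mark pixels whose luminance changed significantly (DP=1) and
    project the marked pixels onto the X and Y axes."""
    # DP=1 where the pixel underwent significant variation between frames
    dp = np.abs(curr_frame.astype(int) - prev_frame.astype(int)) > threshold
    # Horizontal projection along Ox: count of DP=1 pixels in each column
    hist_x = dp.sum(axis=0)
    # Vertical projection along Oy: count of DP=1 pixels in each row
    hist_y = dp.sum(axis=1)
    return hist_x, hist_y

# A still background with one small moving patch: peaks appear only at
# the columns and rows where luminance changed between the two frames.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = prev.copy()
curr[2:4, 5:7] = 200
hx, hy = motion_projection_histograms(prev, curr)
```

In a real video-conference frame, the peaks of hx and hy would cluster at the edges of the subject's head, as the quoted passage explains.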
In addition, the system may also be used to automatically track a target by “a spotlight or a camera. Using a spotlight the invention might be used on a helicopter to track a moving target on the ground, or to track a performer on a stage during an exhibition. The invention would similarly be applicable to weapons targeting systems.” Ex. 1001, ’134 Patent, at 23:39-40; Ex. 1002, Hart Decl. ¶¶38-39. In such applications, the system uses X and Y minima and maxima of the histograms in X and Y domains to determine the center of the target:

In a preferred embodiment, the new center of the area is determined to be (XMIN+XMAX)/2, (YMIN+YMAX)/2, where XMIN and XMAX are the positions of the minima and maxima of the x projection histogram, and YMIN and YMAX are the positions of the minima and maxima of the y projection histogram.

Ex. 1001, ’134 Patent, at 24:46-51. The patent defines “the positions of the minima” of a projection histogram to be the smallest X (and Y) coordinate of any pixel in the image region whose validation signal is “1.” Ex. 1002, Hart Decl. ¶¶40-41. Similarly, the maximum is the largest X (and Y) coordinate of any pixel in the image region whose validation signal is “1.” Id. The system may recalculate the histograms at regular intervals and use those points to again find the center coordinates as the new frames are received. Ex. 1002, Hart Decl. ¶42.
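The recentering rule can be worked through concretely. The sketch below is illustrative only; the binary validation-signal array and the function name are assumptions, not the patent's implementation:

```python
import numpy as np

def target_center(valid):
    """Center of the target from a binary validation-signal image,
    computed as ((XMIN+XMAX)/2, (YMIN+YMAX)/2)."""
    ys, xs = np.nonzero(valid)           # pixels whose validation signal is "1"
    x_min, x_max = xs.min(), xs.max()    # positions of the minima and maxima
    y_min, y_max = ys.min(), ys.max()    #   of the x and y projection histograms
    return (x_min + x_max) / 2, (y_min + y_max) / 2

valid = np.zeros((10, 10), dtype=bool)
valid[3:7, 2:9] = True                   # target spans rows 3-6, columns 2-8
cx, cy = target_center(valid)            # cx = (2+8)/2 = 5.0, cy = (3+6)/2 = 4.5
```

Recomputing this center on each new frame, as the passage above describes, keeps the tracked point following the target.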
Once the center of the target is determined, the center is used to adjust the camera or spotlight to be directed to the moving target:

Having acquired the target, controller 206 controls servomotors 208 to maintain the center of the target in the center of the image. . . .

It will be appreciated that as the target moves, the targeting box will move with the target, constantly adjusting the center of the targeting box based upon the movement of the target, and enlarging and reducing the size of the targeting box. The targeting box may be displayed on monitor 212, or on another monitor as desired to visually track the target.

Ex. 1001, ’134 Patent, at 25:8-21; Ex. 1002, Hart Decl. ¶43. Figure 23 shows an example of the targeting box in a frame:

Ex. 1001 at Fig. 23
VIII. DETAILED EXPLANATION OF GROUNDS

A. Overview Of The Prior Art References

1. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005)

The purported invention of the ’134 Patent relates to a process of identifying a target in digitized visual input by using histograms of pixel characteristics and tracking the target. However, researchers at the U.S. Army White Sands Missile Range, New Mexico, in collaboration with New Mexico State University, Las Cruces, had already developed a system that utilizes histograms to identify and track targets, and they published their findings in January 1980, more than 17 years before the earliest effective filing date of the ’134 Patent. Ex. 1002, Hart Decl. ¶49; Ex. 1010, Grenier Decl.

The article, entitled “A Real-Time Video Tracking System,” published in IEEE Transactions on Pattern Analysis and Machine Intelligence in January 1980 (“Gilbert”), qualifies as prior art under pre-AIA § 102(b). Gilbert describes “a system for missile and aircraft identification and tracking . . . applied in real time to identify and track objects.” Ex. 1002, Hart Decl. ¶50; Ex. 1005, Gilbert at 47. Gilbert was not of record and was not considered during prosecution of the ’134 Patent. The Gilbert system includes an image processing system comprising a video processor, a projection processor, a tracker processor, and a control processor, as shown in Figure 1, reproduced below. Ex. 1002, Hart Decl. ¶50; Ex. 1005, Gilbert at 48.
The video processor receives a digitized video signal comprising 60 fields/s, i.e., 30 frames/s, as input. Ex. 1002, Hart Decl. ¶51; Ex. 1005, Gilbert at 48. Each field—i.e., frame—consists of a succession of n X m pixels:

As the TV camera scans the scene, the video signal is digitized at m equally spaced points across each horizontal scan. During each video field, there are n horizontal scans which generate an n X m discrete matrix representation at 60 fields/s.

Ex. 1005, Gilbert at 48. Although Gilbert uses the word “field” instead of “frame,” a POSA would have understood that a “frame” consists of two “fields” in the context in which they are used in Gilbert. (At the time, video signals were interlaced, such that a full frame consisted of two fields. The first field would be the odd-numbered scanlines and the second field would be the even-numbered scanlines, recorded 1/60th of a second later than the first field.)
Ex. 1002, Hart Decl. ¶51. Gilbert then constructs histograms of pixels in the 256 gray-level classes in the intensity domain:

Every 96 ns, a pixel intensity is digitized and quantized into eight bits (256 gray levels), counted into one of six 256-level histogram memories, and then converted by a decision memory to a 2-bit code indicating its classification (target, plume, or background).

Ex. 1005, Gilbert at 48. See also id. at 49 (“When the entire region has been scanned, h contains the distribution of pixels over intensity and is referred to as the feature histogram of the region R.”). In other words, the Video Processor of Gilbert creates histograms using the intensity domain over classes of all intensity values. Ex. 1002, Hart Decl. ¶51. Although Gilbert uses histograms in the intensity domain as examples, it also notes that other “features that can be functionally derived from relationship between pixels, e.g., texture, edge, and linearity measure” may be used. Ex. 1005, Gilbert at 48; Ex. 1002, Hart Decl. ¶52.

Using the histograms, the video processor “separates the target from the background,” i.e., identifies the target. Ex. 1005, Gilbert at 48. Gilbert uses probability estimates based on a 256-level grayscale histogram to determine whether a particular pixel belongs to the target, plume, or background region. Ex. 1002, Hart Decl. ¶51. This identification is repeated for each frame. See Ex. 1005, Gilbert at 47 (“The camera output is statistically decomposed into background, foreground, target, and plume region by the video processor, with this operation carried on at video rate for up to the full frame.”); Ex. 1002, Hart Decl. ¶57.
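The histogram-then-decision-memory pipeline described above can be sketched as follows. This is only an illustration of the general idea (per-class intensity histograms feeding a 256-entry lookup table); the simple larger-count decision rule and the toy histograms are assumptions, not Gilbert's hardware:

```python
import numpy as np

def build_decision_memory(target_hist, background_hist):
    """For each of the 256 gray levels, choose the class (1 = target,
    0 = background) whose histogram count is larger at that level,
    mimicking a decision memory that maps intensity to a classification."""
    return (target_hist > background_hist).astype(np.uint8)

def classify(frame, decision_memory):
    """Label every pixel by looking its intensity up in the decision
    memory, producing a per-pixel classification picture."""
    return decision_memory[frame]

# Toy histograms: the target region is bright, the background is dark.
target_hist = np.zeros(256)
target_hist[200:256] = 10
background_hist = np.zeros(256)
background_hist[0:100] = 50

dm = build_decision_memory(target_hist, background_hist)
frame = np.array([[30, 220], [90, 250]], dtype=np.uint8)
binary = classify(frame, dm)   # 1 where the intensity is a "target" level
```

Gilbert's actual system uses probability estimates over six histogram memories and a three-way (target, plume, background) code, but the lookup-table structure is the same.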
Once the video processor identifies the pixels belonging to the target, the video processor creates “a binary picture, where target presence is represented by a ‘1’ and target absence by a ‘0.’” Ex. 1005, Gilbert at 50 (“In the projection processor, these matrices are analyzed field-by-field at 60 field/s using projection based classification algorithm to extract the structural and activity parameters needed to identify and track the target.”); Ex. 1002, Hart Decl. ¶53. This binary picture indicates to the projection processor whether or not a pixel should be considered further in constructing X- and Y-projections. Ex. 1002, Hart Decl. ¶53. A projection processor creates projections using only the pixels identified for inclusion. Ex. 1002, Hart Decl. ¶54. Although these projections are not explicitly referred to by Gilbert as projection histograms, reference to Figure 4 of Gilbert (annotated below) clearly shows four different projection histograms formed using the target pixels:
In addition, Gilbert explains that a projection “gives the number of object points along parallel lines; hence it is a distribution of the target points for a given view angle.” Ex. 1005, Gilbert at 50. Thus, these Figure 4 projections will be referred to as projection histograms throughout this petition.
The projection processor then identifies the target location, orientation, and structure using the projection histograms:

The target location, orientation, and structure are characterized by the pattern of 1 entries in the binary picture matrix, and the target activity is characterized by a sequence of picture matrices. In the projection processor, these matrices are analyzed field-by-field at 60 fields/s [i.e., 30 frames/s] . . . .

Ex. 1005, Gilbert at 50. The projection processor computes a center-of-area point for the target in order to “precisely determine the target position and orientation.” Id.; Ex. 1002, Hart Decl. ¶55. This calculation is done by first finding a center of area for each of the top and bottom portions of the target, using the projection histograms in the X- and Y-domains for those portions. Ex. 1002, Hart Decl. ¶¶55-56. The projection processor then uses these center-of-area points to determine a target center-of-area point. This is shown in Figure 4, reproduced below (with annotations):
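The two-part center-of-area computation can be approximated in a short sketch. This is illustrative only: the split into top and bottom halves follows the description above, but the equal-weight combination of the two half centers is an assumption for illustration, not Gilbert's exact formula:

```python
import numpy as np

def centroid_from_projections(binary):
    """Center of area of a binary picture computed from its X- and
    Y-projection histograms (first moment divided by total count)."""
    hist_x = binary.sum(axis=0)    # target points per column
    hist_y = binary.sum(axis=1)    # target points per row
    n = hist_x.sum()
    cx = (np.arange(len(hist_x)) * hist_x).sum() / n
    cy = (np.arange(len(hist_y)) * hist_y).sum() / n
    return cx, cy

def target_center_of_area(binary):
    """Approximate the two-part scheme: a center of area for the top and
    bottom portions of the target, combined into one target center."""
    mid = binary.shape[0] // 2
    top_cx, top_cy = centroid_from_projections(binary[:mid])
    bot_cx, bot_cy = centroid_from_projections(binary[mid:])
    # Bottom-half row coordinates are offset by `mid` in the full picture
    return (top_cx + bot_cx) / 2, (top_cy + (bot_cy + mid)) / 2

binary = np.zeros((6, 6), dtype=np.uint8)
binary[1:5, 2:4] = 1                    # a 4x2 target silhouette
cx, cy = target_center_of_area(binary)  # (2.5, 2.5) for this symmetric target
```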
The tracker processor analyzes information, such as the target’s size, location, and orientation, from the projection processor and outputs information such as “1) tracking window size, 2) tracking window shape, and 3) tracking window position” to the video processor. Ex. 1005, Gilbert at 52; Ex. 1002, Hart Decl. ¶58. The video processor then uses this information to draw a tracking window, i.e., a tracking box, around the target. See id. The display of the tracking box in the frame is shown in Figure 2.

The tracker processor also outputs the target’s movements to the control processor, which controls the direction and zoom scale of the lens to follow the target:

The outputs to the control processor are used to control the target location and size for the next frame. The bore-sight correction signals are used to control the azimuth and elevation pointing angles of the telescope. The desired zoom is used to control the zoom lens, keeping the target visible within the FOV. The desired image rotation controls the image rotation element to keep the target image vertical.

Ex. 1005, Gilbert at 52.
`
2. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006)

Hashima also discloses a “system for and method of recognizing and tracking a target mark . . . by processing an image of the target mark produced by a video camera.” Ex. 1006, Hashima at 1:6-12. Hashima qualifies as prior art under pre-AIA 35 U.S.C. §§ 102(a), 102(b), 102(e), and § 119 (“but no patent shall be granted . . . for an invention which had been . . . described in a printed publication in any country more than one year before the date of the actual filing of the application in this country.”). Although Hashima was of record, it was not applied during prosecution of the ’134 Patent. Hashima uses an image processing system that takes input from a video camera and identifies the target shape based on histograms constructed from pixel characteristics. Ex. 1002, Hart Decl. ¶¶60-61. Once the target shape is detected, Hashima uses the histogram information to determine the location of the target, and move a robot arm based on the information to grip the target object. One embodiment of this system is shown in Figure 1, reproduced below.

Hashima uses a pre-determined mark, a black circle with a white triangle inside, as shown in Figure 3 below, and Hashima notes that “target marks of various configurations . . . can be detected by the [disclosed] process.” Ex. 1006, Hashima at 10:20-23; Ex. 1002, Hart Decl. ¶62. The histograms of the exemplary mark in the X- and Y-domains, counting the number of black pixels, are shown in Figure 6.

Hashima Fig. 3

Hashima Fig. 6

The image processor reads input from the video camera, converts the image into a binary image, and constructs histograms of the binary image. Ex. 1006, Hashima at 8:22-30 (“X- and Y-projected histograms of the target mark image are determined. . . . The X-projected histogram represents the sum of pixels having the same X-coordinates, and the Y-projected histogram represents the sum of pixels having the same Y-coordinates.”); Ex. 1002, Hart Decl. ¶61. See generally Ex. 1006, Hashima at 8:18-9:7. Then the image processor determines whether the image represents the target by counting the number of peaks and valleys in the projected histograms. Ex. 1006, Hashima at 9:8-9:13; Ex. 1002, Hart Decl. ¶62. If the numbers match the numbers of peaks and valleys of the X- and Y-projected histograms shown in Figure 6 (reproduced above), the system identifies the image as that of the target mark. Id. at 9:13-9:23; Ex. 1002, Hart Decl. ¶62.
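This recognition test amounts to counting peaks and valleys in each projected histogram and comparing the counts against those expected for the mark. A minimal sketch follows; the strict local-extremum rule, the expected counts, and the toy histogram are assumptions for illustration, not Hashima's exact procedure:

```python
def count_peaks_and_valleys(hist):
    """Count strict local maxima (peaks) and strict local minima
    (valleys) of a projected histogram."""
    peaks = valleys = 0
    for i in range(1, len(hist) - 1):
        if hist[i] > hist[i - 1] and hist[i] > hist[i + 1]:
            peaks += 1
        elif hist[i] < hist[i - 1] and hist[i] < hist[i + 1]:
            valleys += 1
    return peaks, valleys

def looks_like_mark(hist_x, hist_y, expected=(2, 1)):
    """Accept the candidate region only if both projections show the
    peak/valley pattern expected for the target mark."""
    return (count_peaks_and_valleys(hist_x) == expected and
            count_peaks_and_valleys(hist_y) == expected)

# Toy projection: a ring-like mark projects to two peaks flanking one
# valley, so this histogram passes a (2 peaks, 1 valley) test.
hist = [0, 3, 6, 4, 2, 4, 6, 3, 0]
```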
Figure 5 shows the flow chart describing the detection process:

Once the target mark is detected, Hashima determines the center of the detected mark from the X- and Y-maxima and minima of the X- and Y-histograms:

The image processor 40 . . . generat[es] a histogram 15 projected onto the X-axis and a histogram 16 projected onto the Y-axis, and then determines the X and Y coordinates of the central position Pm (mx, my) from the projected histograms 15, 16. Specifically, the X coordinate mx of the central position Pm (mx, my) can be determined using opposite end positions Xb1, Xb2 obtained from the X-projected histogram 15 according to the following equation (3):

mx = (Xb1 + Xb2)/2   (3).

The Y coordinate my of the central position Pm (mx, my) can be determined using opposite end positions Yb1, Yb2 obtained from the Y-projected histogram 16 according to the following equation (4):

my = (Yb1 + Yb2)/2   (4).

Id. at 11:6-25; Ex. 1002, Hart Decl. ¶¶64-65. Figure 15 illustrates the process for finding the center position of the detected target:
The center of the target is then compared to the center of the image memory from the previous frame to find the shift of the target in the X- and Y-directions. The shift amount is used to move the robot arm (and the camera mounted thereon) toward the target, and is recalculated based on new images as the robot arm and the camera move toward the target, until the object is gripped. Ex. 1002, Hart Decl. ¶¶63, 66; Ex. 1006, Hashima at 14:61-15:37. Figure 27 shows a flow chart describing the process:
As the target is tracked, Hashima displays a rectangular window around the target—i.e., a tracking box. Ex. 1006, Hashima at 14:29-34, Figure 23. As the system receives new frames from the video camera, the target and window locations are recalculated using the new histograms created from each new frame:

When the target mark 10 starts to be tracked, the window is established using the projected histogram information obtained when the target mark image is recognized. When the target mark 10 is subsequently tracked, the window [44] is established using new projected histogram information obtained upon each measurement made by the camera 20.

Id. at 14:29-34; Ex. 1002, Hart Decl. ¶67. Figure 23 shows the target window 44 around the target mark 10A.

Hashima Fig. 23

A block diagram of the system in Hashima shows that after determining an appropriate window, the image is displayed on monitor 315. See also Ex. 1006, Hashima at 25:34-38, Figure 52; Ex. 1002, Hart Decl. ¶68.

3. U.S. Patent No. 5,150,432 (“Ueno”) (Ex. 1007)

A similar process and apparatus is described in Ueno. Ueno uses histograms in X- and Y-domains to identify the facial region in the frames of video signals, and tracks the facial region with a rectangle constructed from X- and Y-minima and maxima derived from the histograms. Ex. 1007, Ueno at 1:39-68, 7:7-45; Ex. 1002, Hart Decl. ¶70. Ueno qualifies as prior art under pre-AIA § 102(b). Ueno was not of record and was not considered during prosecution of the ’134 Patent.

Ueno describes a video encoding system that includes “frame memory 101 for storing image signals for one frame” connected to “a facial region detecting circuit 102 for detecting human facial region of the frame data . . . .” Ex. 1007, Ueno at 3:5-7; Ex. 1002, Hart Decl. ¶71. In Ueno, “video signals are sequentially input to the image input terminal 100 in frames.” Id. at 4:25-26. “The image signal is written in the frame memory 101 as frame data one frame by one frame. The frame data written in the frame memory 101 . . . is supplied to the facial region detecting circuit 102. . . .” Ex. 1007, U
