
UNITED STATES PATENT AND TRADEMARK OFFICE

____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD

____________________

SAMSUNG ELECTRONICS CO., LTD.; AND
SAMSUNG ELECTRONICS AMERICA, INC.
Petitioner

v.

IMAGE PROCESSING TECHNOLOGIES, LLC
Patent Owner

____________________

Patent No. 7,650,015
____________________

PETITION FOR INTER PARTES REVIEW
OF U.S. PATENT NO. 7,650,015

Exhibit 2003
IPR2017-00355
Petitioner - Samsung Electronics Co., Ltd., et al.
Patent Owner - Image Processing Technologies LLC

Petition for Inter Partes Review
Patent No. 7,650,015

TABLE OF CONTENTS

I.    INTRODUCTION ............................................................ 1
II.   MANDATORY NOTICES UNDER 37 C.F.R. § 42.8 .............................. 1
III.  PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a) ............................ 2
IV.   GROUNDS FOR STANDING .................................................. 2
V.    PRECISE RELIEF REQUESTED .............................................. 2
VI.   LEGAL STANDARDS ....................................................... 3
      A.  Claim Construction ................................................ 3
      B.  Level of Ordinary Skill In The Art ................................ 4
VII.  OVERVIEW OF THE RELEVANT TECHNOLOGY AND THE ’015 PATENT ............... 4
VIII. DETAILED EXPLANATION OF GROUNDS ...................................... 12
      A.  Overview Of The Prior Art References ............................. 12
          1.  Alton L. Gilbert et al., A Real-Time Video Tracking
              System, PAMI-2 No. 1 IEEE Transactions on Pattern
              Analysis and Machine Intelligence 47 (Jan. 1980)
              (“Gilbert”) (Ex. 1005) ....................................... 12
          2.  U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006) ............. 21
          3.  U.S. Patent No. 5,150,432 (“Ueno”) (Ex. 1007) ................ 29
          4.  W. B. Schaming, Adaptive Gate Multifeature Bayesian
              Statistical Tracker, 359 Applications of Digital Image
              Processing IV 68 (1982) (“Schaming”) (Ex. 1008) .............. 34
IX.   Specific Explanation Of Grounds For Invalidity ....................... 39
      A.  Ground 1: Gilbert In View Of Schaming Renders Obvious
          Claim 6 ............................................................ 39
          1.  Reasons To Combine Gilbert And Schaming ...................... 39
          2.  Claim 6 ........................................................ 45
          3.  Gilbert And Schaming Are Not Cumulative ...................... 54
      B.  Ground 2: Gilbert In View Of Ueno Renders Obvious Claim 6 ........ 56
          1.  Reasons To Combine Gilbert And Ueno .......................... 56
          2.  Claim 6 ........................................................ 60
          3.  Gilbert And Ueno Are Not Cumulative .......................... 67
      C.  Ground 3: Hashima In View Of Schaming Renders Obvious
          Claim 6 ............................................................ 68
          1.  Reasons To Combine Hashima And Schaming ...................... 68
          2.  Claim 6 ........................................................ 71
          3.  Hashima And Schaming Are Not Cumulative ...................... 78
X.    CONCLUSION ........................................................... 79
Certification of Word Count ................................................ 79
The undersigned certifies pursuant to 37 C.F.R. § 42.6(e) and § 42.105
that on November 30, 2016, a true and correct copy of Petitioner’s
Petition for Inter Partes Review of U.S. Patent No. 7,650,015 was served
via express mail on the Petitioner at the following correspondence
address of record: ......................................................... 80
LIST OF EXHIBITS1

1001  U.S. Patent No. 7,650,015 (“the ’015 Patent”)
1002  Declaration of Dr. John C. Hart
1003  Curriculum Vitae for Dr. John C. Hart
1004  Prosecution File History of U.S. Patent No. 7,650,015
1005  Alton L. Gilbert et al., A Real-Time Video Tracking System,
      PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and
      Machine Intelligence 47 (Jan. 1980) (“Gilbert”)
1006  U.S. Patent No. 5,521,843 (“Hashima”)
1007  U.S. Patent No. 5,150,432 (“Ueno”)
1008  W. B. Schaming, Adaptive Gate Multifeature Bayesian
      Statistical Tracker, 359 Applications of Digital Image
      Processing IV 68 (1982) (“Schaming”)
1009  D. Trier, A. K. Jain and T. Taxt, “Feature Extraction Methods
      for Character Recognition-A Survey,” Pattern Recognition,
      vol. 29, no. 4, 1996, pp. 641-662
1010  M. H. Glauberman, “Character recognition for business
      machines,” Electronics, vol. 29, pp. 132-136, Feb. 1956
1011  Declaration of Gerard P. Grenier (authenticating Ex. 1005)
1012  Declaration of Eric A. Pepper (authenticating Ex. 1008)

1 Citations to non-patent publications are to the original page numbers of the publication, and citations to U.S. patents are to column:line number of the patents.

I. INTRODUCTION

Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively, “Petitioner”) request inter partes review (“IPR”) of Claim 6 of U.S. Patent No. 7,650,015 (“the ’015 Patent”) (Ex. 1001), which, on its face, is assigned to Image Processing Technologies, LLC (“Patent Owner”). This Petition presents three non-cumulative grounds of invalidity that the U.S. Patent and Trademark Office (“PTO”) did not consider during prosecution. These grounds are each likely to prevail, and this Petition, accordingly, should be granted on all grounds and the challenged claim should be cancelled.
II. MANDATORY NOTICES UNDER 37 C.F.R. § 42.8

Real Parties-in-Interest: Petitioner identifies the following real parties-in-interest: Samsung Electronics Co., Ltd.; Samsung Electronics America, Inc.

Related Matters: Patent Owner has asserted the ’015 Patent against Petitioner in Image Processing Technologies LLC v. Samsung Elecs. Co., No. 2:16-cv-00505-JRG (E.D. Tex.). Patent Owner has also asserted U.S. Patent Nos. 6,959,293; 8,805,001; 8,983,134; and 8,989,445 in the related action. Petitioner is concurrently filing IPR petitions for all of these asserted patents.

Lead and Back-Up Counsel:

• Lead Counsel: John Kappos (Reg. No. 37,861), O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660. (Telephone: 949-823-6900; Fax: 949-823-6994; Email: jkappos@omm.com.)

• Back-Up Counsel: Nicholas J. Whilt (Reg. No. 72,081), Brian M. Cook (Reg. No. 59,356), O’Melveny & Myers LLP, 400 S. Hope Street, Los Angeles, CA 90071. (Telephone: 213-430-6000; Fax: 213-430-6407; Email: nwhilt@omm.com, bcook@omm.com.)

Service Information: Samsung consents to electronic service by email to IPTSAMSUNGOMM@OMM.COM. Please address all postal and hand-delivery correspondence to lead counsel at O’Melveny & Myers LLP, 610 Newport Center Drive, 17th Floor, Newport Beach, California 92660, with courtesy copies to the email address identified above.
III. PAYMENT OF FEES UNDER 37 C.F.R. § 42.15(a)

The Office is authorized to charge $23,000 to Deposit Account No. 50-0639 for the fee set forth in 37 C.F.R. § 42.15(a), and any additional fees that might be due in connection with this Petition.
IV. GROUNDS FOR STANDING

Petitioner certifies that the ’015 Patent is available for IPR and Petitioner is not barred or estopped from requesting IPR on the grounds identified herein.
V. PRECISE RELIEF REQUESTED

Petitioner respectfully requests review of Claim 6 of the ’015 Patent, and cancellation of this claim, based on the grounds listed below:

• Ground 1: Claim 6 is obvious under 35 U.S.C. § 103(a) over Gilbert in view of Schaming.

• Ground 2: Claim 6 is obvious under 35 U.S.C. § 103(a) over Gilbert in view of Ueno.

• Ground 3: Claim 6 is obvious under 35 U.S.C. § 103(a) over Hashima in view of Schaming.
VI. LEGAL STANDARDS

A. Claim Construction

For expired claims, the Federal Circuit has held that the claims should be construed according to the Phillips v. AWH Corp. standard applicable in district court. See In re Rambus Inc., 753 F.3d 1253, 1256 (Fed. Cir. 2014). Under Phillips, terms are given “the meaning that [a] term would have to a person of ordinary skill in the art in question at the time of the invention.” Phillips v. AWH Corp., 415 F.3d 1303, 1316 (Fed. Cir. 2005) (en banc). Under 37 C.F.R. § 42.100(b), the PTAB may also apply a district court-type claim construction if the patent is to expire within 18 months of the entry of the Notice of Filing Date.

The ’015 Patent will expire on December 2, 2017, within 18 months of the Notice of Filing Date. Thus, for purposes of this proceeding, Petitioner has interpreted each claim term according to its plain and ordinary meaning. See also Ex. 1002, ¶49. For purposes of the invalidity grounds raised in this proceeding, Petitioner does not believe any term needs an explicit construction.
B. Level of Ordinary Skill In The Art

One of ordinary skill in the art at the time of the alleged invention of the ’015 Patent would have had either (1) a Master’s Degree in Electrical Engineering or Computer Science, or the equivalent, plus at least one year of experience in the field of image processing, image recognition, machine vision, or a related field, or (2) a Bachelor’s Degree in Electrical Engineering or Computer Science, or the equivalent, plus at least three years of experience in the field of image processing, image recognition, machine vision, or a related field. Additional education could substitute for work experience and vice versa. Ex. 1002, ¶¶44-48.
VII. OVERVIEW OF THE RELEVANT TECHNOLOGY AND THE ’015 PATENT

The purported invention of the ’015 Patent relates to identifying and tracking a target in an input signal using one or more histograms derived from an image frame in the video signal. See, e.g., Ex. 1001, at Claim 6; Ex. 1002, ¶¶31-33. Video image processing, and the use of histograms to identify and track targets and to derive other information from a video signal, were well known at the time the asserted patents were filed. Ex. 1002, ¶¶23-30, 61, 73, 81, 87. An input signal used in the purported invention has “a succession of frames, each frame having a succession of pixels.” Ex. 1001, 3:31-34. The input signal may be a video signal or any other signal that “generates an output in the form of an array of information corresponding to information observed by the imaging device,” such as “ultrasound, IR, Radar, tactile array, etc.” Ex. 1001, 9:27-32. The ’015 Patent then constructs a histogram showing the frequency of pixels meeting a certain characteristic. The characteristics used to form histograms are referred to as “domains” in the ’015 Patent. Ex. 1001, 9:10-15; Ex. 1002, ¶33. The ’015 Patent teaches that “the domains are preferably selected from the group consisting of i) luminance, ii) speed (V), iii) oriented direction (DI), iv) time constant (CO), v) hue, vi) saturation, vii) first axis (x(m)), and viii) second axis (y(m)).” Ex. 1001, 3:54-58; Ex. 1002, ¶34. Figure 11 shows histogram processors that can create histograms in various domains:
The histograms include a plurality of “classes” within a given domain. Ex. 1002, ¶35. Figure 14a (and its accompanying description) illustrates an example of “classes” within a domain:

FIG. 14a shows an example of the successive classes C1, C2 … Cn−1, Cn, each representing a particular velocity, for a hypothetical velocity histogram, with there being categorization for up to 16 velocities (15 are shown) in this example. Also shown is envelope 38, which is a smoothed representation of the histogram.

Ex. 1001, 20:47-52.
`
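For illustration only, the class binning described above can be sketched in Python. This sketch is not from the ’015 Patent or any exhibit; the function name, the class count of 16, and the velocity range are assumptions.

```python
# Illustrative sketch: bin per-pixel velocity values into 16 "classes"
# (C1..C16), as in the hypothetical velocity histogram of Fig. 14a.
# The class count and velocity range are assumptions for illustration.

def velocity_histogram(velocities, num_classes=16, v_max=16.0):
    """Count how many pixels fall into each velocity class."""
    counts = [0] * num_classes
    bin_width = v_max / num_classes
    for v in velocities:
        # Clamp to the last class so v == v_max is still counted.
        idx = min(int(v // bin_width), num_classes - 1)
        counts[idx] += 1
    return counts

hist = velocity_histogram([0.5, 1.5, 1.7, 15.9, 16.0])
```

Each entry of `hist` corresponds to one class Ci; a smoothed envelope over these counts would correspond to envelope 38 in the figure.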
The ’015 Patent then uses the histograms to identify a target in the input signal. For example, one embodiment of the ’015 Patent performs “automatic framing of a person…during a video conference.” Ex. 1001, 22:4-6; see also id., Figure 15:
The system constructs histograms in the X and Y domains counting the number of pixels where the differences in luminance between successive frames are above certain threshold values:

The pixels with greatest movement within the image will normally occur at the peripheral edges of the head of the subject, where even due to slight movements, the pixels will vary between the luminance of the head of the subject and the luminance of the background. Thus, if the system of the invention is set to identify only pixels with DP=1, and to form a histogram of these pixels, the histogram will detect movement peaks along the edges of the face where variations in brightness, and therefore in pixel value, are the greatest, both in the horizontal projection along Ox and in the vertical projection along Oy.

Ex. 1001, 22:44-54 and 10:33-61 (explaining that DP is set to “1” when the pixel value of the pixel under consideration has “undergone significant variation as compared to…the same pixel in the prior frame”); Ex. 1002, ¶¶36-37. Figures 16 and 17 show the camera setup and the histogram constructed using this method:

Ex. 1001, Fig. 16                              Ex. 1001, Fig. 17
`
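For illustration only, the frame-differencing and projection just described can be sketched as follows. This is not code from the record; the threshold value and all names are assumptions.

```python
# Illustrative sketch: mark DP=1 where a pixel's luminance changed
# significantly between successive frames, then project the DP=1
# pixels onto the X (Ox) and Y (Oy) axes as histograms.

def movement_projections(prev, curr, threshold=10):
    """Return (x_hist, y_hist) counting DP=1 pixels per column and row."""
    h, w = len(curr), len(curr[0])
    x_hist = [0] * w
    y_hist = [0] * h
    for y in range(h):
        for x in range(w):
            if abs(curr[y][x] - prev[y][x]) > threshold:  # DP = 1
                x_hist[x] += 1
                y_hist[y] += 1
    return x_hist, y_hist

prev = [[0, 0, 0],
        [0, 0, 0]]
curr = [[0, 50, 0],
        [0, 50, 0]]
xh, yh = movement_projections(prev, curr)
```

Peaks in `xh` and `yh` would correspond to the movement peaks the patent describes along the edges of the subject's face.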
In addition, the system may also be used to automatically track a target by “a spotlight or a camera. Using a spotlight the invention might be used on a helicopter to track a moving target on the ground, or to track a performer on a stage during an exhibition. The invention would similarly be applicable to weapons targeting systems.” Ex. 1001, 23:35-40; Ex. 1002, ¶¶38-39. In such applications, the system uses the X and Y minima and maxima of the histograms in the X and Y domains to determine the center of the target:

In a preferred embodiment, the new center of the area is determined to be (XMIN+XMAX)/2, (YMIN+YMAX)/2, where XMIN and XMAX are the positions of the minima and maxima of the x projection histogram, and YMIN and YMAX are the positions of the minima and maxima of the y projection histogram.

Ex. 1001, 24:46-51. The patent defines “the positions of the minima” of a projection histogram to be the smallest X (and Y) coordinate of any pixel in the image region whose validation signal is “1.” Ex. 1002, ¶¶40-41. Similarly, the maximum is the largest X (and Y) coordinate of any pixel in the image region whose validation signal is “1.” Id.
`
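For illustration only, the quoted center calculation can be sketched in Python; the mask layout and names are assumptions, not material from the record.

```python
# Illustrative sketch: compute the target center as
# ((XMIN+XMAX)/2, (YMIN+YMAX)/2), where the min/max are the smallest
# and largest coordinates of pixels whose validation signal is 1.

def target_center(valid):
    """valid[y][x] is 1 for validated pixels, 0 otherwise."""
    xs = [x for y, row in enumerate(valid) for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(valid) for v in row if v]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    return (x_min + x_max) / 2, (y_min + y_max) / 2

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
cx, cy = target_center(mask)
```

The returned point would then drive the servo control described below, re-centering the camera or spotlight on the target each frame.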
Once the center of the target is determined, the center is used to adjust the camera or spotlight to be directed to the moving target:

Having acquired the target, controller 206 controls servomotors 208 to maintain the center of the target in the center of the image….

It will be appreciated that as the target moves, the targeting box will move with the target, constantly adjusting the center of the targeting box based upon the movement of the target, and enlarging and reducing the size of the targeting box. The targeting box may be displayed on monitor 212, or on another monitor as desired to visually track the target.

Ex. 1001, 25:8-21; Ex. 1002, ¶42. The system recalculates the histograms for each frame, uses the recalculated histogram to again find the center coordinate, and then moves the direction of the camera toward the center coordinate of the target, so that the target will be displayed at the center of the screen. Ex. 1001, 25:8-21; Ex. 1002, ¶43.
FIG. 15 shows an example of use of the system of the invention to perform automatic framing of a person moving, for example, during a video conference. A video camera 13 observes the subject P, who may or may not be moving. A video signal S from the video camera is transmitted by wire, optical fiber, radio relay, or other communication means to a monitor 10b and to the image processing system of the invention 11. The image processing system determines the position and movement of the subject P, and controls servo motors 43 of camera 13 to direct the optical axis of the camera towards the subject and particularly towards the face of the subject, as a function of the location, speed and direction of the subject, and may vary the zoom, focal distance and/or the focus of the camera to provide the best framing and image of the subject.

Ex. 1001, 22:4-17, Fig. 15 (reproduced above). Figure 23 shows an example of the targeting box in a frame:

Ex. 1001 at Fig. 23
VIII. DETAILED EXPLANATION OF GROUNDS

A. Overview Of The Prior Art References

1. Alton L. Gilbert et al., A Real-Time Video Tracking System, PAMI-2 No. 1 IEEE Transactions on Pattern Analysis and Machine Intelligence 47 (Jan. 1980) (“Gilbert”) (Ex. 1005)

The purported invention of the ’015 Patent relates to a process of identifying a target in digitized visual input by using histograms of pixel characteristics and tracking the target. However, researchers at the U.S. Army White Sands Missile Range, New Mexico, in collaboration with New Mexico State University, Las Cruces, had already developed a system that utilizes histograms to identify and track targets, and they published their findings in January 1980, more than 17 years before the earliest effective filing date of the ’015 Patent. Ex. 1002, ¶50; Ex. 1011, Grenier Decl.

The article, entitled “A Real-Time Video Tracking System” and published in IEEE Transactions on Pattern Analysis and Machine Intelligence in January 1980 (“Gilbert”), qualifies as prior art under pre-AIA § 102(b). Gilbert describes “a system for missile and aircraft identification and tracking…applied in real time to identify and track objects.” Ex. 1002, ¶51; Ex. 1005, 47. Gilbert was not of record and was not considered during prosecution of the ’015 Patent. The Gilbert system includes an image processing system comprising a video processor, a projection processor, a tracker processor, and a control processor, as shown in Figure 1, reproduced below. Ex. 1002, ¶51; Ex. 1005, 48.
The video processor receives a digitized video signal comprising 60 fields/s, i.e., 30 frames/s, as input. Ex. 1005, 48. Each field (i.e., frame) consists of a succession of n X m pixels:

As the TV camera scans the scene, the video signal is digitized at m equally spaced points across each horizontal scan. During each video field, there are n horizontal scans which generate an n X m discrete matrix representation at 60 fields/s.

Ex. 1005, 48. Although Gilbert uses the word “field” instead of “frame,” a POSA would have understood that a “frame” consists of two “fields” in the context in which they are used in Gilbert. (At the time, video signals were interlaced such that a frame of video consisted of two fields. The first field would be the odd-numbered scanlines, and the second field would be the even-numbered scanlines, recorded 1/60th of a second later than the first field.) Ex. 1002, ¶52.
Gilbert then constructs histograms in the intensity domain, specifically in the classes of target, plume, and background, each of which comprises a range of intensities from the 256 gray levels:

Every 96 ns, a pixel intensity is digitized and quantized into eight bits (256 gray levels), counted into one of six 256-level histogram memories, and then converted by a decision memory to a 2-bit code indicating its classification (target, plume, or background).

Ex. 1005, 48; id. at 49 (“When the entire region has been scanned, h contains the distribution of pixels over intensity and is referred to as the feature histogram of the region R.”). In other words, the Video Processor of Gilbert creates histograms using the intensity domain over classes of all intensity values. Ex. 1002, ¶52.
`
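For illustration only, the histogram-and-decision-memory scheme just quoted can be sketched as follows. This is not Gilbert's actual hardware design; the class codes, the toy decision table, and all names are assumptions.

```python
# Illustrative sketch: accumulate a 256-level intensity histogram and
# map each 8-bit pixel through a "decision memory" (a lookup table)
# to a 2-bit class code, loosely mirroring Gilbert's description.

TARGET, PLUME, BACKGROUND = 2, 1, 0

def classify_region(pixels, decision_memory):
    """Return (histogram, class codes) for a list of 8-bit intensities."""
    hist = [0] * 256
    codes = []
    for p in pixels:
        hist[p] += 1                     # count into the histogram memory
        codes.append(decision_memory[p])  # 2-bit classification
    return hist, codes

# Toy decision memory: dark = background, mid = plume, bright = target.
decision = [BACKGROUND] * 100 + [PLUME] * 100 + [TARGET] * 56
hist, codes = classify_region([10, 150, 250, 250], decision)
```

In Gilbert's system the decision memory would be derived from probability estimates over the region histograms rather than fixed thresholds as in this toy table.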
Although Gilbert uses histograms in the intensity domain as examples, it also notes that other “features that can be functionally derived from relationship between pixels, e.g., texture, edge, and linearity measure” may be used. Ex. 1005, 48; Ex. 1002, ¶53.

Using the histograms, the video processor “separates the target from the background,” i.e., identifies the target. Ex. 1005, 48. Gilbert uses probability estimates based on a 256-level grayscale histogram to determine whether a particular pixel belongs to the target, plume, or background region. Ex. 1002, ¶52. This identification is repeated for each frame. See Ex. 1005, 47 (“The camera output is statistically decomposed into background, foreground, target, and plume region by the video processor, with this operation carried on at video rate for up to the full frame.”).
Once the video processor identifies the pixels belonging to the target, the video processor creates “a binary picture, where target presence is represented by a ‘1’ and target absence by a ‘0.’” Ex. 1005, 50 (“In the projection processor, these matrices are analyzed field-by-field at 60 field/s using projection based classification algorithm to extract the structural and activity parameters needed to identify and track the target.”); Ex. 1002, ¶54. This binary picture is simply a representation of whether or not a pixel should be considered further in constructing X- and Y-projections. Ex. 1002, ¶54. A projection processor creates projections using only the pixels identified for inclusion. Ex. 1002, ¶55. Although these projections are not explicitly referred to by Gilbert as projection histograms, reference to Figure 4 of Gilbert (annotated below) clearly shows four different projection histograms formed using the target pixels:

In addition, Gilbert explains that a projection “gives the number of object points along parallel lines; hence it is a distribution of the target points for a given view angle.” Ex. 1005, 50. Thus, these Figure 4 projections will be referred to as projection histograms throughout this petition.
The projection processor then identifies the target location, orientation, and structure using the projection histograms:

The target location, orientation, and structure are characterized by the pattern of 1 entries in the binary picture matrix, and the target activity is characterized by a sequence of picture matrices. In the projection processor, these matrices are analyzed field-by-field at 60 fields/s [i.e., 30 frames/s]….

Ex. 1005, 50. The projection processor computes a center of area point for the target in order to “precisely determine the target position and orientation.” Id.; Ex. 1002, ¶56. This calculation is done by first finding a center of area for each of the top and bottom portions of the target, using the projection histograms in the X- and Y-domains for those portions. Ex. 1002, ¶¶56-57. The projection processor then uses these center-of-area points to determine a target center-of-area point. This is shown in Figure 4, reproduced below (with annotations):
The tracker processor analyzes information, such as the target’s size, location, and orientation, from the projection processor and outputs information such as “1) tracking window size, 2) tracking window shape, and 3) tracking window position.” Ex. 1005, 52; Ex. 1002, ¶59. The video processor then uses this information to draw a tracking window, i.e., a tracking box, around the target. See id. The display of the tracking box in the frame is shown in Figure 2.
`
`The tracker processor also outputs the target’s movements to the control
`
`processor, which controls the direction and zoom scale of the lens to follow the
`
`target:
`
`The outputs to the control processor are used to control
`
`the target location and size for the next frame. The bore-
`
`sight correction signals are used to control the azimuth
`
`and elevation pointing angles of the telescope. The
`
`desired zoom is used to control the zoom lens, keeping
`
`the target visible within the FOV. The desired image
`
`rotation controls the image rotation element to keep the
`
`target image vertical.
`
`Id. at 52; see also id. (“The tracker processor…computes boresight and zoom
`
`correction signals, and controls the position and shape of the target tracking
`
`window to implement an intelligent tracking strategy.”); Ex. 1002, ¶60.
`
`
`
`20
`
`24
`
`

`

`Petition for Inter Partes Review
`Patent No. 7,650,015
`The histogram formation and target identification process of Gilbert is
`
`performed on a frame-by-frame basis. Ex. 1005, 52; Ex. 1002, ¶58. Because the
`
`boresight of the tracking optics follows the location of the target as the center
`
`location of the target is updated based on subsequent frames, the target object
`
`remain centered relative to the optical axis of the image frame. Ex. 1005, 52; Ex.
`
`1002, ¶60.
`
2. U.S. Patent No. 5,521,843 (“Hashima”) (Ex. 1006)

Hashima also discloses a “system for and method of recognizing and tracking a target mark…by processing an image of the target mark produced by a video camera.” Ex. 1006, 1:6-12. Hashima qualifies as prior art under pre-AIA 35 U.S.C. §§ 102(a), 102(b), and 102(e), and § 119 (“but no patent shall be granted…for an invention which had been…described in a printed publication in any country more than one year before the date of the actual filing of the application in this country.”). Hashima was not of record and was not considered during prosecution of the ’015 Patent. Hashima uses an image processing system that takes input from a video camera and identifies the target shape based on histograms constructed from pixel characteristics. Ex. 1002, ¶¶61-62. Once the target shape is detected, Hashima uses the histogram information to determine the location of the target, and moves a robot arm based on the information to grip the target object. One embodiment of this system is shown in Figure 1, reproduced below.
Hashima uses a pre-determined mark, a black circle with a white triangle inside, as shown in Figure 3 below, and Hashima notes that “target marks of various configurations…can be detected by the [disclosed] process.” Ex. 1006, 10:20-23; Ex. 1002, ¶63. The histograms of the exemplary mark in the X- and Y-domains, counting the number of black pixels, are shown in Figure 6.

Hashima Fig. 3                                Hashima Fig. 6

The image processor reads input from the video camera, converts the image into a binary image, and constructs a histogram of the binary images. Ex. 1006, 8:22-30 (“X- and Y-projected histograms of the target mark image are determined…. The X-projected histogram represents the sum of pixels having the same X-coordinates, and the Y-projected histogram represents the sum of pixels having the same Y-coordinates.”); Ex. 1002, ¶63. See generally Ex. 1006, 8:18-9:7. Then the image processor determines whether the image represents the target by counting the number of peaks and valleys in the projected histogram. Ex. 1006, 9:8-9:13; Ex. 1002, ¶64. If the number matches the number of peaks and valleys of the X- and Y-projected histograms as shown in Figure 6 (reproduced above), the system identifies the image as that of the target mark. Ex. 1006, 9:13-9:23; Ex. 1002, ¶64. Figure 5 shows the flow chart describing the detection process:
`
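For illustration only, the peak-counting comparison just described can be sketched as follows. The strict-local-maximum peak definition and the expected counts are assumptions, not Hashima's disclosed implementation.

```python
# Illustrative sketch: recognize the mark by counting local peaks in
# the X- and Y-projected histograms and comparing against the counts
# expected for the known target mark.

def count_peaks(hist):
    """Count strict local maxima in a 1-D histogram."""
    peaks = 0
    for i in range(1, len(hist) - 1):
        if hist[i] > hist[i - 1] and hist[i] > hist[i + 1]:
            peaks += 1
    return peaks

def is_target_mark(x_hist, y_hist, expected_x_peaks, expected_y_peaks):
    return (count_peaks(x_hist) == expected_x_peaks and
            count_peaks(y_hist) == expected_y_peaks)

# A two-peak profile, such as a dark ring with a lighter interior.
ok = is_target_mark([0, 4, 1, 4, 0], [0, 3, 1, 3, 0], 2, 2)
```

A valley count could be implemented symmetrically (strict local minima between peaks) and compared the same way.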
Once the target mark is detected, Hashima determines the center of the detected mark from the X- and Y-maxima and minima of the X- and Y-histograms:

The image processor 40…generat[es] a histogram 15 projected onto the X-axis and a histogram 16 projected onto the Y-axis, and then determines the X and Y coordinates of the central position Pm (mx, my) from the projected histograms 15, 16. Specifically, the X coordinate mx of the central position Pm (mx, my) can be determined using opposite end positions Xb1, Xb2 obtained from the X-projected histogram 15 according to the following equation (3):

mx=(Xb1+Xb2)/2     (3).

The Y coordinate my of the central position Pm (mx, my) can be determined using opposite end positions Yb1, Yb2 obtained from the Y-projected histogram 16 according to the following equation (4):

my=(Yb1+Yb2)/2     (4).

Ex. 1006, 11:6-25; see also Ex. 1002, ¶¶66-67. Figure 15 illustrates the process for finding the center position of the detected target:
`
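For illustration only, equations (3) and (4) can be sketched in Python. Deriving the opposite end positions as the first and last nonzero histogram bins is an assumption made for this sketch; Hashima's own end-position determination may differ.

```python
# Illustrative sketch of Hashima's equations (3) and (4): average the
# opposite end positions of the X- and Y-projected histograms to get
# the central position Pm (mx, my).

def end_positions(hist):
    """First and last bin indices with a nonzero count (assumption)."""
    nonzero = [i for i, c in enumerate(hist) if c > 0]
    return nonzero[0], nonzero[-1]

def mark_center(x_hist, y_hist):
    xb1, xb2 = end_positions(x_hist)
    yb1, yb2 = end_positions(y_hist)
    # Equations (3) and (4) from Ex. 1006, 11:6-25.
    return (xb1 + xb2) / 2, (yb1 + yb2) / 2

mx, my = mark_center([0, 0, 3, 5, 3, 0], [0, 2, 4, 2, 0, 0])
```

The resulting (mx, my) is the quantity that Hashima compares against the previous frame's center to compute the shift described next.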
`
`The center of the target is then compared to the center of the image memory
`
`from the previous frame to find the shift of the target in the X- and Y-directions.
`
`
`
`25
`
`29
`
`

`

`Petition for Inter Partes Review
`Patent No. 7,650,015
`The shift amount is used to move the robot arm (and the camera mounted thereon)
`
`toward the target, and is recalculated based on new images as the robot arm and the
`
`camera move toward the target, until the object is gripped. Ex. 1002, ¶¶65, 68; Ex.
`
`1006, 14:61-15:37. Figure 27 shows a flow chart describing the process:
`
`
`
`
`
`26
`
`30
`
`

`

`Petition for Inter Partes Review
`Patent No. 7,650,015
As the target is tracked, Hashima displays a rectangular window around the target, i.e., a tracking box. Ex. 1006, 14:29-34, Fig. 23. As the system receives new frames from the video camera, the target and window locations are recalculated using the new histograms created from each new frame:

When the target mark 10 starts to be tracked, the window is established using the projected histogram information obtained when the target mark image is recognized. When the target mark 10 is subsequently tracked, the window [44] is established using new projected histogram information obtained upon each measurement made by the camera 20.

Id. at 14:29-34; Ex. 1002, ¶¶69-70. Figure 23 shows the target window 44 around the target mark 10A.

Hashima Fig. 23

A block diagram of the system in Hashima shows that after determining an appropriate window, the image is displayed on a monitor, 315. See also Ex. 1006, 25:34-38, Fig. 52; Ex. 1002, ¶70.

Because the direction of the robot arm, on which the camera is mounted, is continuously updated to track the target location (Ex. 1006, 14:61-15
