`
`IPR2014‐00222
`
`1
`
`MAGNA 2013
`Valeo v. Magna
`IPR2014-00221
`
`
`
`Q. Okay. And you don't actually know whether copies
`of the article referred to in Grenier Exhibit 1 were
`available at the symposium, do you?
`A. No, I do not.
`Q. So it's possible, in fact, that this article was not
`available at the conference; isn't that right?
`A. I don't -- I don't know.
`Q. And in your declaration of Exhibit 1, in paragraph
`6, you say that “The article is currently available for
`public downloads.” Do you see that?
`A. Yes, I do.
`Q. When was this article first available for public
`download?
`A. I can only say some time after the symposium
`date. I do not have an exact date.
`Q. In fact, you don't know when it was available for
`download, do you?
`A. No.
`Q. There's nothing on that abstract page that you
`referred to earlier that says when it was available for
`download; isn't that right?
`A. Yes.
`
`2
`
`Grenier Depo. Transcript (Ex. 2011), pp. 12-13
`
`
`
`3
`
`Broggi I Certification (Ex. 1021), Excerpt
`
`
`
3. The imaging system of claim 1, wherein said control
determines shadows present in the field of view of said imag-
ing device and discerns shadows from objects present in the
field of view of said imaging device.
`
`4
`
`’114 Patent (Ex. 1001), claim 3
`
`
`
`
`
`23.
`
`The "114 patent determines whether a detected obj eet appearing in the
`
`blind spot of a vehicle is an object of interest. or whether the detected object is a
`
`shadow. (See ‘1 14 patent. 3:12-15. 10:43-50.) The "114 patent identifies a “target“
`
`and then engages an algoritlnn to determine if the target is a shadow for the
`
`pm‘pose of removing or ignoring the shadow. (Id. at 10:50-61) Tlms. the ’114
`
`patent determines the metes and bomids of the shadow in order to know which
`
`portion of the image to remove or ignore. (Id)
`
`5
`
`Turk Declaration (Ex. 2001), ¶ 23
`
`
`
`
`
`Host Shadow Removal:
`
In side object detection, the host shadow of the vehicle may
be detected as a target vehicle if the host shadow is extended
in the hot zone (or zone or area of interest alongside the
subject or host vehicle). Often, the host shadow may fall on
the adjacent lane in the morning or evening time. The host
shadow consists of a straight horizontal edge and an edge line
with some slope, such as shown in FIG. 14.
`
`’114 Patent (Ex. 1001), 10:42-49
`
`
[FIG. 14: host shadow depicted with labels “Horizontal edge line” and “Sloped edge line”]

6

’114 Patent (Ex. 1001), FIG. 14
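For illustration only and not part of the record: the passage above describes the host shadow as the combination of a roughly horizontal edge and an edge line with some slope in the zone of interest. The short Python sketch below shows one way such a test could look, assuming edge segments have already been extracted from that zone; the EdgeSegment class, function name, and angle thresholds are invented for this sketch and are not taken from the ’114 patent.

# Hypothetical sketch of an edge-based host-shadow test (not the patent's algorithm).
from dataclasses import dataclass
import math

@dataclass
class EdgeSegment:
    x0: float
    y0: float
    x1: float
    y1: float

    @property
    def slope_deg(self) -> float:
        # Angle of the segment relative to the horizontal axis, in degrees.
        return math.degrees(math.atan2(self.y1 - self.y0, self.x1 - self.x0))

def looks_like_host_shadow(edges, horiz_tol_deg=5.0, min_slope_deg=15.0):
    """Return True if the edge set resembles the host shadow described in the
    quoted passage: one roughly horizontal edge plus one edge line with some
    slope. Thresholds are arbitrary illustration values."""
    has_horizontal = any(abs(e.slope_deg) <= horiz_tol_deg for e in edges)
    has_sloped = any(min_slope_deg <= abs(e.slope_deg) <= 90.0 - horiz_tol_deg for e in edges)
    return has_horizontal and has_sloped

# Example: a horizontal edge plus a sloped edge is flagged as a host-shadow candidate.
candidate = [EdgeSegment(0, 100, 200, 100), EdgeSegment(0, 100, 120, 40)]
print(looks_like_host_shadow(candidate))  # True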
`
`
`
`
`
D. Bounding Boxes generation

Up to now the algorithm gives information about the vehicle’s center position only but since vehicle width is needed as well, a precise bounding box detection is mandatory. For each peak in the vertical edges symmetry image that survived the previous filterings the width of the box is given by the distance between the peak and the top of the symmetry image: the box is centered in the column (fig. 7). Otherwise -in case this is not possible- a columnwise histogram of the number of edges is considered to detect the box width (fig. 8).

The shadow under the car is searched for in order to find the box base. It is defined as a horizontal edge, but since other shadows, like bridges’ ones, could be present on the road as well, the algorithm looks for a high concentration of edges above the horizontal edge: if no base can be detected the column is discarded. The search for vehicle roof is not performed and a rectangle with aspect ratio equal to g is displayed.

7

Broggi I (Ex. 1005), p. 312
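For illustration only and not part of the record: the Broggi excerpt above locates the bounding box base by searching for the shadow under the car, i.e., a horizontal edge with a high concentration of edges above it, and discards the column if no base is found. The Python sketch below is a simplified, hypothetical rendering of that base-finding step on a binary edge map; the function name, thresholds, and synthetic example are invented here and are not taken from Broggi.

# Hypothetical sketch of shadow-based box-base detection (not Broggi's code).
import numpy as np

def find_box_base(edge_map, center_col, box_width,
                  row_edge_frac=0.6, above_edge_frac=0.04, look_above=20):
    """Scan the box strip from the bottom upward for a horizontal edge row
    (a row mostly filled with edge pixels) that also has a concentration of
    edges in the rows above it; return its row index, or None if no base is
    found (in which case the column would be discarded)."""
    half = box_width // 2
    strip = edge_map[:, max(0, center_col - half):center_col + half + 1]
    rows, _ = strip.shape
    for row in range(rows - 1, look_above, -1):
        if strip[row].mean() >= row_edge_frac:              # candidate horizontal edge
            above = strip[row - look_above:row]
            if above.mean() >= above_edge_frac:             # edges concentrated above it
                return row                                  # box base (shadow under the car)
    return None

# Tiny synthetic example: a horizontal shadow line with vehicle edges above it.
edges = np.zeros((100, 60), dtype=np.uint8)
edges[80, 10:50] = 1            # horizontal edge (shadow under the car)
edges[60:80, 12] = 1            # left vehicle edge above the shadow
edges[60:80, 47] = 1            # right vehicle edge above the shadow
print(find_box_base(edges, center_col=30, box_width=40))  # -> 80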
`
`
`
`14. The imaging system of claim 1, wherein said control
`reduces captured image data to a reduced data set of said
`image data, said control processing said reduced data set to
`extract information from said reduced data set.
`
`15. The imaging system of claim 14, wherein said reduced
`data set is representative of a target zone that is encompassed
by the field of view of said imaging device and that is not
`inclusive of a portion of the equipped vehicle.
`
`8
`
’114 Patent (Ex. 1001), claims 14-15
`
`
`
47. The Institution Decision suggests that claim 15 is obvious over Gutta, Nissan, Broggi, and Kastrinaki. (Institution Decision, p. 26.) The Institution Decision states:

Petitioner cites Kastrinaki for teaching the use of a “region of interest (ROI) within each frame and process[ing] only relevant features within this ROI instead of the entire image.” Pet. 49 (quoting Ex. 1009, 365). Section 3.2 of Kastrinaki then describes several techniques for predicting the ROI from previously processed frames. Ex. 1009, 365. We are persuaded that Kastrinaki, as cited by Petitioner, supports Petitioner’s contention.

(Id. at 26-27.)

48. But Kastrinaki does not teach the features recited by claim 15. Specifically, Kastrinaki does not teach that the target zone “is not inclusive of a portion of the equipped vehicle.” The referenced passage of Kastrinaki teaches that only the relevant features in the region of interest (“ROI”) are processed. (Kastrinaki, p. 365.) This is different than having a ROI or target zone that is not inclusive of certain features. According to Kastrinaki, the equipped vehicle would be included in the ROI, it just would not be processed (if it was not a relevant feature). Kastrinaki does not teach excluding the equipped vehicle from the ROI itself, as required by claim 15 of the ’114 patent.
`
`9
`
`Turk Declaration (Ex. 2001), ¶¶ 47-48
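For illustration only and not part of the record: the distinction drawn in paragraph 48 can be stated concretely. The hypothetical Python fragment below contrasts a target zone that itself excludes the equipped vehicle (claim 15, as argued) with a full-frame ROI in which vehicle pixels are kept but merely skipped as not relevant (the reading attributed to Kastrinaki). The array names and the vehicle_body_cols marker are invented for this sketch.

# Hypothetical contrast between the two readings discussed above.
import numpy as np

frame = np.random.rand(120, 160)          # captured image (rows x cols)
vehicle_body_cols = slice(0, 30)          # assumed: left 30 columns show the host vehicle

# Reading argued for claim 15: the target zone itself excludes the equipped vehicle,
# so the reduced data set never contains those pixels.
target_zone = frame[:, 30:]               # ROI cropped past the vehicle body

# Reading attributed to Kastrinaki in the excerpt: the ROI still includes the
# vehicle pixels; they are merely skipped as "not relevant" during processing.
roi = frame                                # full-frame ROI
relevance_mask = np.ones_like(roi, dtype=bool)
relevance_mask[:, vehicle_body_cols] = False
processed_pixels = roi[relevance_mask]     # processed subset, but the ROI was never reduced

print(target_zone.shape, roi.shape, processed_pixels.shape)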
`
`
`
`Patent Owner Magna Electronics Inc.’s Demonstratives
`
`IPR2014‐00221
`
`10
`
`
`
`
`
1. An imaging system for a vehicle, said imaging system
comprising:
an imaging array sensor comprising a plurality of photo-
sensing pixels, wherein said imaging array sensor is
disposed at an exterior rearview mirror assembly at a
side of a vehicle equipped with said imaging system;
wherein, when said imaging array sensor is disposed at the
exterior rearview mirror assembly, said imaging array
sensor has a field of view exterior of the equipped
vehicle, and wherein said imaging array sensor is oper-
able to capture an image exterior of the equipped
vehicle;
a control for processing said captured image;
wherein said control is operable to determine that said
imaging array sensor is misaligned when said imaging
array sensor is disposed at the exterior rearview mirror
assembly at the side of the equipped vehicle; and
wherein said control, responsive to a determination of mis-
alignment of said imaging array sensor, is operable to at
least partially compensate for the determined misalign-
ment of said imaging array sensor.

11

’522 Patent (Ex. 1001), claim 1
`
`
`
6. The imaging system of claim 1, wherein said control,
responsive to a determination of misalignment of said imag-
ing array sensor, is operable to adjust processing of said
captured image to at least partially compensate for the deter-
mined misalignment of said imaging array sensor.
`
`12
`
`’522 Patent (Ex. 1001), claim 6
`
`
`
`13
`
`Institution Decision (Paper 13), pp. 13-14
`
`
`
`
`
[FIGS. 3A-3C: camera misalignment illustrations labeled “Shift in horizontal direction (Yaw)”, “Shift in vertical direction (Pitch)”, and “Rotation (Roll)”]

’522 Patent (Ex. 1001), FIGS. 3A-3C
`
The algorithm may further perform a fine structure fitting
(such as via a correlation algorithm or contour fitting algo-
rithm or the like) for calculating shift in yaw, pitch and roll. As
shown in FIGS. 3A-C, the actual or detected vehicle edges
may be misaligned or separated from the expected vehicle
edges, such that the image processing may be adjusted to shift
the captured image data accordingly to accommodate such
misalignment of the camera. Based on the results of the image
processing techniques, data or information of the yaw, pitch
and roll may be used to set the polygon co-ordinates and H
depression pixel calibration parameters, so that the expected
vehicle edges are substantially aligned with the actual or

14

’522 Patent (Ex. 1001), 6:40-52
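For illustration only and not part of the record: the passage above describes adjusting the image processing to shift the captured image data to accommodate yaw, pitch, and roll misalignment. The Python sketch below shows a generic version of such a compensation step, assuming pixel offsets and a roll angle have already been estimated by some earlier fitting step; the function name, sign conventions, and use of numpy/scipy are assumptions of this sketch, not the ’522 patent’s implementation.

# Hypothetical sketch of shifting/rotating captured image data by estimated
# yaw/pitch/roll misalignment (not the patent's implementation).
import numpy as np
from scipy.ndimage import rotate

def compensate_misalignment(image, yaw_px, pitch_px, roll_deg):
    """Shift the image horizontally (yaw) and vertically (pitch), then rotate
    about its center (roll). np.roll wraps pixels around the border; a real
    implementation would crop or pad instead."""
    shifted = np.roll(image, shift=(pitch_px, yaw_px), axis=(0, 1))
    return rotate(shifted, angle=roll_deg, reshape=False, order=1, mode='nearest')

# Example: compensate a 3-pixel yaw shift, 2-pixel pitch shift, and 1.5-degree roll.
img = np.zeros((240, 320), dtype=float)
img[100:140, 150:170] = 1.0               # stand-in for a detected vehicle edge region
corrected = compensate_misalignment(img, yaw_px=-3, pitch_px=-2, roll_deg=-1.5)
print(corrected.shape)                     # (240, 320)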
`
`
`
`15
`
`Corrected Petition (Paper 6), p. 37
`
`
`
`
`
`
`16
`
`Hitachi (Ex. 1013), FIG. 7
`
`
`
`
`
27. Given the fact that the Petitioners and Dr. Frahm relied on the same flawed construction of these limitations of independent claims 1, 27, 36, 41, and 47 that embraces the adjustment of something other than a previously captured image (e.g., a subsequently captured image), it is likely that this flawed construction impacted the Petitioners’ articulation of the proposed rejection of independent claims 1, 27, 36, 41, and 47. More specifically, it is not at all clear whether the Petitioners argue that the “image data” of Hitachi is image data of a previously captured image, or whether the Petitioners argue that the “image data” of Hitachi is image data of a subsequently captured image. Dr. Frahm states that “[w]hen this adjustment occurs the image processing of Hitachi operates on the pixels of the adjusted area during image processing.” (Frahm Declaration ¶ 145.) It is not clear if Dr. Frahm regards Hitachi as adjusting the processing or adjusting the captured image area. In my opinion, the ambiguity on this point prevents me from concluding that all of the elements of independent claims 1, 27, 36, 41, and 47 are disclosed or suggested by Hitachi (alone or in combination with Nissan).

17

Turk Declaration (Ex. 2003), ¶ 27
`
`