`
`
`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`____________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`____________
`
`PANASONIC SYSTEM NETWORKS CO., LTD.
`Petitioner
`
`v.
`
`6115187 CANADA, INC.
`Patent Owner
`____________
`
`Case IPR _____________
`U.S. Patent No. 6,844,990
`Issue Date: January 18, 2005
`
`Title: METHOD FOR CAPTURING AND DISPLAYING A
`VARIABLE RESOLUTION DIGITAL PANORAMIC IMAGE
`____________
`
`DECLARATION OF SHISHIR K. SHAH, PH.D.
`
`1
`
`
`
`
`
`I, Shishir K. Shah, declare as follows:
`
`1.
`
`I have been retained by Panasonic System Networks Co., Ltd.
`
`(“Petitioner”) as an expert in this case.
`
`2.
`
`I have been asked to provide my opinions concerning the subject
`
`matter disclosed in U.S. Patent No. 6,844,990 (“the ‘990 patent”) and in the prior
`
`art, in particular relating to image correction methods.
`
`3.
`
`I have also been asked to provide my opinions concerning the state of
`
`the relevant art prior to May 11, 2001, and the level and knowledge of one of
`
`ordinary skill in the art in the May 2001 time frame.
`
`4.
`
`I have reviewed certain prior art references and analyzed whether
`
`certain limitations of the claims of the '990 patent are disclosed and/or would have
`
`been obvious in view of those prior art references. I have also reviewed portions
`
`of the Declaration of Jack Feinberg, Ph.D. (Exhibit 1013).
`
`5. My opinions set forth in this declaration are based on my education,
`
`training and experience in the relevant field, as well as the materials I reviewed in
`
`this case, and the scientific knowledge regarding the same subject matter.
`
I. Qualifications
`
`6.
`
`I earned a bachelor of science (B.S.) in Mechanical Engineering from
`
`the University of Texas at Austin in 1994, a master of science (M.S.) in Electrical
`
`and Computer Engineering from the University of Texas at Austin in 1995, and a
`
`
`
`2
`
`
`
`
`
`Ph.D. in Electrical and Computer Engineering from the University of Texas at
`
`Austin in 1998.
`
`7.
`
`I have over 20 years of experience of in the areas of imaging and
`
`image analysis.
`
`8.
`
`I am currently an Associate Professor in the Department of Computer
`
`Science at the University of Houston.
`
`9.
`
`I am currently serving as the Director of the Quantitative Imaging Lab
`
`in the Department of Computer Science at the University of Houston.
`
10. A listing of my publications and research is included in my curriculum vitae, a copy of which is attached as Appendix A.
`
`11.
`
`I have been retained in this matter by Panasonic System Networks
`
`Co., Ltd. (“Petitioner”) to provide an analysis of the scope and content of prior art
`
`references that existed prior to the earliest date of filing of the patent application
`
`underlying U.S. Patent No. 6,844,990 (“the ‘990 patent”). In particular, I analyzed
`
`whether certain limitations of the claims of the ‘990 patent are described in the
`
`prior art references.
`
`12.
`
`I am being compensated for my work. My fee is not contingent on the
`
`outcome of any matter or on any of the technical positions I explain in this
`
`declaration. I have no financial interest in Petitioner.
`
`
`
`3
`
`
`
`
`
`13.
`
`I have been informed that the assignee of the patent, 6115187
`
`CANADA, INC., is also known as ImmerVision Inc. (hereinafter referred to as
`
`“Patentee”). I have no financial interest in the Patentee or the ‘990 patent.
`
`II. Documents Reviewed
`
`14.
`
`15.
`
`I have reviewed the ‘990 patent. Exhibit 1001.
`
`I have reviewed European Patent Publication EP 1 028 389 A2 Shiota
`
`(“Shiota”). Exhibit 1008.
`
`16.
`
`I have reviewed an English language translation of Japanese Patent
`
`Application Publication No. 2000-242773 to Matsui (“Matsui”). Exhibit 1010. I
`
`have also looked at the original Japanese document. Exhibit 1009.
`
`17.
`
`I have reviewed an English language translation of Japanese Patent
`
`Application Publication No. 11-261868 to Enami (“Enami”). Exhibit 1012. I have
`
`also looked at the original Japanese document. Exhibit 1011.
`
`18.
`
`I have also reviewed U.S. Patent No. 5,686,957 (“Baker”) (Exhibit
`
`1002,), U.S. Patent No. 6,128,145 (“Nagaoka”)( Exhibit 1003,), and U.S. Patent
`
`No. 3,953,111 (“Fisher”)(Exhibit 1004).
`
19. As noted earlier, I have also reviewed portions of the Declaration of Jack Feinberg, Ph.D. Exhibit 1013.
`
III. The Person of Ordinary Skill in the Relevant Field in the Relevant Timeframe
`
`20.
`
`I have been informed that “a person of ordinary skill in the relevant
`
`art” is a hypothetical person to whom an expert in the relevant field could assign a
`
`routine task with reasonable confidence that the task would be successfully carried
`
`out. I have been informed that the level of skill in the art can be deduced by
`
`examining the prior art references. I have been informed that "a person of ordinary
`
`skill in the relevant art" is a hypothetical person who is presumed to have known
`
`the relevant art at the time of the invention. Factors that may be considered in
`
`determining the level of ordinary skill in the art may include: (1) type of problems
`
`encountered in the art; (2) prior art solutions to those problems; (3) rapidity with
`
`which innovations are made; (4) sophistication of the technology; and (5)
`
`educational level of active workers in the field. In a given case, every factor may
`
`not be present, and one or more factors may predominate. In many cases a person
`
`of ordinary skill will be able to fit the teachings of multiple patents together like
`
`pieces of a puzzle.
`
21. I have been informed that the date for determining whether a document or information is considered “prior art” is May 11, 2001.
`
22. A person of ordinary skill in the subject matter claimed and disclosed in the ‘990 patent would have at least a bachelor’s degree in Physics and/or Electrical Engineering and at least five years’ experience working with lenses or related optical systems.
`
23. The prior art discussed herein demonstrates that a person of ordinary skill in the relevant art, before May 11, 2001, would have been aware of panoramic objective lenses, fish-eye lenses, and other wide-angle lenses.
`
24. A person of ordinary skill in the relevant art would have been aware of panoramic objective lenses with non-linear distribution functions, i.e., a distribution function of image points that is not linear relative to the field angle of the object points of the panorama.
`
25. A person of ordinary skill in the relevant art would also have understood the desirability of correcting, and how to correct, an image obtained from a panoramic objective lens having a non-linear distribution function, including by means of a reciprocal function of the non-linear distribution function and by means of the non-linear distribution function itself.
`
26. Based on my education and experience, I have a very good understanding of the capabilities of a person of ordinary skill in the relevant art at the time of the invention.
`
`IV. Background of the Technology
`
`27.
`
`It was well known prior to May 2001 that certain fish-eye lenses
`
`produced highly distorted images. Such an image could be captured using a CCD
`
`6
`
`
`
`
`
`
`
`or other electronic camera device, and the captured image could be corrected using
`
`a computer.
`
`28.
`
`In 1994, I was lead author on a paper entitled: “A Simple Calibration
`
`Procedure for Fish-Eye (High Distortion) Lens Camera*.” This paper was
`
`published in 1994 in the IEEE Proceedings of the International Conference on
`
`Robotics and Automation, Vol. 4, pp. 3422-3427.
`
`29.
`
`In this paper, it was shown that a fish-eye lens had a non-linear
`
`distribution function and that such function could be estimated using a calibration
`
`pattern. I discussed a new algorithm for the geometric camera calibration of a fish-
`
`eye lens mounted on a CCD TV camera. The algorithm determined a mapping
`
`between points in the world coordinate system and their corresponding
`
`point locations in the image plane. Specifically, a non-linear distortion model was
`
`developed that would relate the points on an imaged object to its corresponding
`
`points on the image plane and the algorithm discussed allowed the estimation of
`
`the coefficient of the non-linear distortion model. Further, a method for correcting
`
`the obtained fish-eye image was presented that used an inverse mapping by
`
`traversing points in the corrected image and finding corresponding points in the
`
`fish-eye image so that the intensity values or image color could be interpolated and
`
`mapped to the corrected image. Finally, this paper shows that the corrected image
`
`
`
`7
`
`
`
`
`
`resulted in a number of pixels that are greater than the fish-eye image due to
`
`expanded number of pixels from the periphery of the fish-eye image.
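
To make the inverse-mapping technique concrete, the following Python sketch illustrates the general approach just described: the corrected (destination) image is traversed, each destination pixel is mapped back into the fish-eye image through a radial distortion model, and the color found there is copied. The single-coefficient model r_d = r_u / (1 + k·r_u²) and all names are placeholders of my own for illustration; they are not the specific model estimated in the 1994 paper.

    import numpy as np

    def correct_fisheye(src, k, out_shape):
        # Inverse-mapping correction sketch: traverse the corrected
        # image and, for each pixel, find the corresponding fish-eye
        # pixel under an assumed radial model r_d = r_u / (1 + k*r_u^2).
        h, w = out_shape
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        sy, sx = (src.shape[0] - 1) / 2.0, (src.shape[1] - 1) / 2.0
        out = np.zeros((h, w) + src.shape[2:], dtype=src.dtype)
        for y in range(h):
            for x in range(w):
                dx, dy = x - cx, y - cy
                r_u = np.hypot(dx, dy)              # undistorted radius
                r_d = r_u / (1.0 + k * r_u * r_u)   # distorted radius
                s = r_d / r_u if r_u > 0 else 0.0
                u, v = int(round(sx + dx * s)), int(round(sy + dy * s))
                if 0 <= v < src.shape[0] and 0 <= u < src.shape[1]:
                    out[y, x] = src[v, u]           # nearest-neighbor color
        return out

Because the destination image, rather than the fish-eye image, is traversed, every pixel of the (larger) corrected image receives a value, which is consistent with the corrected image containing more pixels than the source.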
`
`30.
`
`In 1996, I was lead author on another paper, entitled: “Intrinsic
`
`Parameter Calibration Procedure For A (High-Distortion) Fish-Eye Lens Camera
`
`With Distortion Model And Accuracy Estimation.” This paper was published in
`
`the journal Pattern Recognition, Vol. 29, No. 11, pp. 1775-1788.
`
31. This paper expanded on the work presented in the 1994 paper and discussed a non-linear distortion correction model addressing both radial and tangential distortions. Further, inverse mapping was also discussed to ensure that every pixel intensity was mapped in the expanded corrected image while using the estimated distortion coefficients to correct the distorted image.
`
32. Further, as explained in the Declaration of Jack Feinberg, Ph.D. (Exhibit 1013), U.S. Patent No. 5,686,957 (“Baker”) (Exhibit 1002), U.S. Patent No. 6,128,145 (“Nagaoka”) (Exhibit 1003), and U.S. Patent No. 3,953,111 (“Fisher”) (Exhibit 1004) demonstrate that panoramic objective lenses having an image point distribution function that is not linear relative to the field angle of object points of the panorama, the distribution function having a maximum divergence of at least ±10% compared to a linear distribution function, such that the panoramic image obtained has at least one substantially expanded zone and at least one substantially compressed zone, were well known to persons of ordinary skill in the art prior to May 2001.
`
`V. The ‘990 Patent
`
33. The ‘990 patent discloses panoramic objective lenses having non-linear distribution functions of image points, and methods for correcting the non-linearity of the initial image.
`
34. The ‘990 patent discloses two embodiments of correction methods. Exhibit 1001, ‘990 patent, col. 10, line 6 – col. 14, line 41.
`
35. The first disclosed embodiment “involves correcting the initial image by means of a function Fd⁻¹ that is the reciprocal function of the distribution function Fd according to the present invention. As the distribution function Fd is known and determined at the time the non-linear objective lens is designed, it is easy to deduce the reciprocal function Fd⁻¹ therefrom. This correction step allows a corrected image to be obtained in which the non-linearity due to the objective lens according to the present invention is removed. The corrected image is equivalent to an image taken by means of a classical panoramic objective lens and can then be processed by any classical display software program available in stores, provided for transferring the image points of an image disk into a three-dimensional space and for interactively displaying a sector of the image obtained.” Exhibit 1001, ‘990 patent, col. 10, lines 39-53.
`
`9
`
`
`
`
`
`
`
36. The ‘990 patent states that the “second alternative of the method involves using the distribution function Fd in an image display algorithm working backwards, that is defining in real time the color of the pixels of a display window using the image points of the image disk.” Exhibit 1001, ‘990 patent, col. 10, lines 54-58.
`
37. Claim 10 of the ‘990 patent states:

    A method for displaying an initial panoramic image obtained in accordance with the method according to claim 1, the method for displaying comprising: correcting the non-linearity of the initial image, performed by means of a reciprocal function of the non-linear distribution function of the objective lens or by means of the non-linear distribution function.

38. Claim 11 of the ‘990 patent states:

    The method according to claim 10, wherein the step of correcting comprises a step of transforming the initial image into a corrected digital image comprising a number of image points higher than the number of pixels that the image sensor comprises.

39. Claim 15 of the ‘990 patent states:

    The method according to claim 10, further comprising: determining the color of image points of a display window, by projecting the image points of the display window onto the initial image by means of the non-linear distribution function, and allocating to each image point of the display window the color of an image point that is the closest on the initial image.

40. Claim 16 of the ‘990 patent states:

    The method according to claim 15, wherein the projection of the image points of the display window onto the initial image comprises: projecting the image points of the display window onto a sphere or a sphere portion, determining the angle in relation to the center of the sphere or the sphere portion of each projected image point, and projecting onto the initial image each image point projected onto the sphere or the sphere portion, the projection being performed by means of the non-linear distribution function considering the field angle that each point to be projected has in relation to the center of the sphere or the sphere portion.
`
`
`
`VI. Claim Interpretation
`
`41.
`
`In reviewing the claims of the '990 patent, I understand that the claims
`
`are generally accorded their broadest reasonable interpretation in light of the patent
`
`specification, and should be free from any limitations disclosed in the
`
`specifications that are not expressly listed in the claims.
`
`42.
`
`I also understand that claimed terms should be accorded their ordinary
`
`and accustomed meaning unless the specification otherwise defines the terms.
`
`
`
`11
`
`
`
`
`
`43.
`
`In my analysis, I have construed the terms stated in the claims relating
`
`to corrections using their ordinary and accustomed meaning, which would be the
`
`broadest reasonable interpretation in light of the patent specification.
`
`VII. Shiota
`
`44.
`
`I have been informed that Shiota (Exhibit 1008) published August 16,
`
`2000, and that it is prior art to the ‘990 patent.
`
45. Shiota discloses an image transformation system for correcting the output from a non-linear objective fish-eye lens.
`
46. Shiota discloses a fisheye lens 2 attached to a CCD camera 1. Exhibit 1008, [0044] and Fig. 4.
`
47. Shiota discloses that when the focal distance of the fisheye lens is f, the nonlinear stereographic projection h = 2f·tan(θ/2) is established. Exhibit 1008, [0037].
`
48. Shiota discloses a method of transforming the fish-eye image into a planar image, where the planar image is obtained by projecting points from the hemispherical objective lens surface onto a plane intersecting the lens surface at a point identifying the viewing direction from the origin of the camera. Exhibit 1008, [0028-0032].
`
49. Shiota discloses that the pixels on the fish-eye image are related to points projected onto the lens surface according to the nonlinear lens projection h = 2f·tan(θ/2), and that this relationship is used in relating pixels of the fish-eye image to the planar image. Knowing the lens projection, the pixels on the fish-eye image can be related to points on the lens surface (Fig. 1) and can be computed according to the nonlinear distribution function of the lens. Exhibit 1008, [0033-0041].
`
50. Shiota further discloses that generating the planar image requires that the points on the lens surface be mapped to a plane intersecting the lens surface, such mapping being defined by arithmetic calculations based on the underlying trigonometric geometry. Exhibit 1008, [0028-0032].
`
51. Shiota discloses a transformation where the mapping of points between the planar image and the lens surface is computed separately from the mapping of points between the lens surface and the fish-eye image. Exhibit 1008, [0028 and 0033].
`
52. Shiota thus discloses a two-step process of transforming the image obtained by a non-linear objective fish-eye lens into an undistorted image. Exhibit 1008, [0028-0042].
`
53. Shiota discloses that the fish-eye image obtained is related to the spherical objective fish-eye lens according to the nonlinear stereographic projection h = 2f·tan(θ/2). Exhibit 1008, [0037].
`
54. Shiota discloses a first coordinate calculating unit for obtaining first projection coordinates derived by projecting coordinates on the plane image onto a fisheye image face as an imaginary object face, and a second coordinate calculating unit for obtaining second projection coordinates derived by projecting the first projection coordinates obtained by the first coordinate calculating unit onto the fisheye image face. Exhibit 1008, [0001, 0009-0010].
`
55. More specifically, Shiota discloses an operation part 40 comprising a first coordinate calculating unit 35, a second coordinate calculating unit 36, a first lookup table 37 connected to the first coordinate calculating unit 35, and a second lookup table 38 connected to the second coordinate calculating unit 36. The first coordinate calculating unit 35 is a part executing the calculation of the first step shown in FIG. 3 and can obtain the first projection coordinates (X2, Y2, Z2) on the hemispherical face from the (u, v) coordinates in the plane image. This relates the point on the planar image to the lens surface. The first lookup table 37 is a table for obtaining the correction coefficient k1 from the distance L. Exhibit 1008, [0045] and Fig. 4.
`
56. Additionally, Shiota discloses that the second coordinate calculating unit 36 is a part executing the calculation of the second step in FIG. 3 and can obtain the second projection coordinates (p1, q1) on the fisheye image face from the first projection coordinates (X2, Y2, Z2) derived by the first coordinate calculating unit 35. This relates the point on the lens surface to the pixel on the fish-eye image. The second lookup table 38 is a table for obtaining the correction coefficient k2. Exhibit 1008, [0046].
`
57. Shiota discloses a correction coefficient k2, which equals the function h divided by the radius r, i.e., k2 = h/r, where r is the distance from the origin to the projected point. Shiota further discloses that h can equal 2f·tan(θ/2). Accordingly, when h = 2f·tan(θ/2), the correction coefficient k2 is based on a non-linear distribution function. Exhibit 1008, [0037-0041].
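
The two-step calculation described in paragraphs 54 through 57 can be sketched in Python as follows. The sketch follows the structure described above (plane-image point onto the unit hemisphere, then hemisphere point onto the fish-eye image through the coefficient k2 = h/r with h = 2f·tan(θ/2)); the variable names and the simplified handling of the view-point parameters are mine, not Shiota’s exact arithmetic or lookup-table implementation.

    import math

    def plane_to_fisheye(u, v, center, du, dv, f):
        # Step 1: locate the display-plane pixel (u, v) in (X, Y, Z)
        # space using the plane center and per-pixel change vectors,
        # then normalize onto the unit hemisphere to obtain the first
        # projection coordinates (X2, Y2, Z2).  (Shiota's first lookup
        # table supplies this normalizing coefficient k1.)
        X = center[0] + u * du[0] + v * dv[0]
        Y = center[1] + u * du[1] + v * dv[1]
        Z = center[2] + u * du[2] + v * dv[2]
        n = math.sqrt(X * X + Y * Y + Z * Z)
        X2, Y2, Z2 = X / n, Y / n, Z / n
        # Step 2: from the zenithal angle of the hemisphere point,
        # apply the stereographic projection h = 2f*tan(theta/2) and
        # scale by k2 = h / r to land on the fish-eye image face.
        theta = math.acos(Z2)
        r = math.hypot(X2, Y2)
        h = 2.0 * f * math.tan(theta / 2.0)
        k2 = h / r if r > 0 else 0.0
        return (k2 * X2, k2 * Y2)   # second projection coordinates (p1, q1)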
`
58. Shiota discloses “transforming a fisheye image obtained using a fisheye lens (2) into a plane image for display comprising: a first coordinate calculating unit (35) for obtaining first projection coordinates derived by projecting coordinates on the plane image onto a fisheye face as an imaginary object face.” Exhibit 1008, Abstract.
`
59. Shiota discloses “Necessary parameters are, as shown in Figs. 1 and 2, (X0, Y0, Z0) indicative of the center (origin) of a plane image and change amounts ∂ux, ∂vx, ∂uy, ∂vy, ∂uz, ∂vz in the respective axes of the (X, Y, Z) coordinates when a point is moved in the respective directions on the (u, v) coordinate system by an amount of one pixel (corresponding to one pixel on the monitor screen).” Exhibit 1008, [0025]. “The parameters can be easily obtained from the information of the angle information (ψ, θ, α) of the view point and the magnification of the image.” Exhibit 1008, [0026]. Since point P′ “is on the surface of the hemisphere of radius of 1, the zenithal angle (θ1) is unconditionally determined …” Exhibit 1008, [0034].
`
60. Shiota discloses “First, the coordinates of a point P′ (first projection coordinates) on the hemispherical face as an imaginary object face, which is a projection of a point P on a plane image (the u, v coordinates) are obtained.” Exhibit 1008, [0028].
`
61. Shiota discloses, as a second step of the calculation, “a procedure of obtaining second projection coordinates w(p1, q1) on a fisheye image face from the first projection coordinates (X2, Y2, Z2) determined (refer to FIG. 1 with respect to w) will be explained.” Exhibit 1008, [0033].
`
62. An example of a specific function F(θ1) according to the projecting method is the stereographic projection, which has the non-linear distribution function 2f·tan(θ/2). Exhibit 1008, [0037].
`
`VIII. Matsui
`
`63.
`
`I have been informed that Matsui (Exhibit 1009) was published on
`
`September 8, 2000 and that it is prior art to the ‘990 patent.
`
64. Matsui discloses an image data conversion device, in which image data obtained with a fish-eye lens is corrected to remove distortion so that it may be displayed. Exhibit 1010, [0001].
`
65. Matsui discloses that a camera 1 uses a fish-eye lens 11 that is capable of image capture at a field angle of 90° or more with respect to the optical axis. Exhibit 1010, [0017].
`
66. Matsui discloses that the fish-eye lens has the property h = 2f·tan(θ/2). As a result, the amount of information in the area of a large field angle, i.e., the periphery of the circular image, is increased. Exhibit 1010, [0017].
`
67. More specifically, Matsui discloses the calculation of pixel positions on the circular surface from the pixel positions actually captured by the CCD utilizing the non-linear distribution function h = 2f·tan(θ/2). Exhibit 1010, [0025].
`
68. Matsui discloses a data converter 2 that is an image data conversion device that converts circular image data stored in a first image memory 3 into cylindrical image data using a conversion process. The data converter 2 then outputs the image data to a second image memory. Exhibit 1010, [0018].
`
69. Matsui discloses that the circular image obtained by the fish-eye lens can include an image of all orientations, but the image becomes more distorted toward the outer periphery. Exhibit 1010, [0003].
`
70. Matsui further discloses that this distortion can be largely removed from the circular image data by mapping the points in the circular image onto a hemisphere surface. Further, the image data mapped onto a hemisphere surface can be projected onto a planar surface. Exhibit 1010, [0003].
`
71. Matsui discloses that circular image data obtained by images captured using a fish-eye lens can be mapped onto a cylindrical surface by considering points in the circular image indexed by (g(Θ)·cosψ, g(Θ)·sinψ), such that the origin of the indexing is at the center of the circular image and where Θ is a parameter fulfilling 0 < Θ < π/2; g(Θ) being a function fulfilling g(0) = 0 and monotonically increasing in the range of Θ; and ψ being an angle formed by a line segment joining the center and a point on the circular image. The points so indexed are mapped onto the cylindrical surface indexed by (R, ψ, R/tanΘ), where R is a constant. Exhibit 1010, [0006].
`
72. Matsui further discloses that g(Θ) in the indexing of the circular image points can be the distance of each point on the circular image computed from the center of the circular image. As such, according to the mapping, a line segment oriented in a radial direction from the center of the circular image data, where the radial direction is given by ψ, is converted into the cylindrical coordinate system where R is a constant (i.e., as a line segment oriented from up to down on a cylindrical surface of radius R). This results in mapping each line from the circular image to a segment on the cylindrical surface, thereby expanding the line and correcting the distortion. The image converted onto a cylindrical surface can be readily made into a planar image by projecting the cylindrical surface. Exhibit 1010, [0008].
`
73. Matsui discloses that when using a lens having the property h = g(θ) (h being an image height and θ being a field angle), the amount that g(Θ) increases accompanying an increase of Θ is the same as the amount that the image height h increases. In such a case, the circular image is converted into an ideal, undistorted hemisphere surface. Hence the mapping of the circular image to the cylindrical surface to remove the distortion uses the lens distribution function. Exhibit 1010, [0009].
`
74. More specifically, Matsui discloses that a point on the circular image can be indexed by (h·cosψ, h·sinψ), where h is the distance from the center of the circular image and ψ is the angle of the line segment that links the point to the center of the circular image (Fig. 3). Considering the lens distribution h = 2f·tan(θ/2), the point is indexed by (2f·tan(θ/2)·cosψ, 2f·tan(θ/2)·sinψ). This point, mapped to the cylindrical surface of radius R with z = R/tanθ, is given as (R, ψ, R/tanθ). Exhibit 1010, [0023].
`
75. Matsui discloses that points in the circular image can be directly referenced by indexing ψ and z of the cylindrical coordinates according to the relationship (2f·R·cosψ / (√(R² + z²) + z), 2f·R·sinψ / (√(R² + z²) + z)), derived by using the trigonometric expansion h = 2f·tan(θ/2) = 2f·sinθ / (1 + cosθ). Exhibit 1010, [0025]. This provides an unambiguous determination as to which pixel position of the cylindrical surface each pixel of the circular image is transformed into.
`
76. Matsui also discloses that rather than using the lens distribution function h = g(θ), an inverse function θ = Tan⁻¹(R/z) can be used to relate points on the cylindrical surface to the points in the circular image. In this case, the points in the circular image can be directly referenced by indexing ψ and z of the cylindrical coordinates according to the relationship (f·G(Tan⁻¹(R/z))·cosψ, f·G(Tan⁻¹(R/z))·sinψ). Exhibit 1010, [0030-0031].
`
77. Matsui discloses that h = 2f·tan(θ/2) = 2f·sinθ/(1 + cosθ). Exhibit 1010, [0025].
`
78. Matsui discloses a data converter 2 that calculates the circular surface S pixel position corresponding to each pixel on the post-conversion cylindrical surface C, then converts the pixel data of those pixel positions into the respective pixel data on the cylindrical surface C. As a result, as shown in Fig. 4, the pixel data on the circular surface S is converted into the respective pixel data on the cylindrical surface C. Exhibit 1010, [0026].
`
79. The data converter 2 converts P1 shown in Fig. 3 into P2 shown in Fig. 2. Exhibit 1010, [0023]. Matsui discloses correcting the non-linearity of the initial image by means of the non-linear distribution function. Specifically, Matsui discloses that rather than using the lens distribution function h = g(θ), an inverse function θ = Tan⁻¹(R/z) can be used to relate points on the cylindrical surface to the points in the circular image. In this case, the points in the circular image can be directly referenced by indexing ψ and z of the cylindrical coordinates according to the relationship (f·G(Tan⁻¹(R/z))·cosψ, f·G(Tan⁻¹(R/z))·sinψ). Exhibit 1010, [0030-0031].
`
80. Matsui expands the edges of the image, and thereby transforms the initial image into a corrected digital image in which the number of image points is higher than the number of pixels that the image sensor comprises.
`
`IX. Enami
`
`81.
`
`I have been informed that Enami (Exhibit 1011) published on
`
`September 24, 1999 and that it is prior art to the ‘990 patent.
`
82. Enami discloses a fisheye lens camera, including a non-linear image distortion correction method. Exhibit 1012, Abstract.
`
83. Enami discloses that an image captured by a fisheye lens 1-1 and a CCD imaging device 1-2 is stored in a picture memory 1-3, and that an image correction processing unit 1-4 performs, through high-speed mapping, a coordinate transform for correcting the installation angle of the fisheye lens camera and a coordinate transform for correcting distortion of the equal-area-projection fisheye lens image. Exhibit 1012, Abstract.
`
84. Enami discloses that an image correction processing circuit 13-4 corrects a distorted image of the fisheye lens so as to recover the original image with respect to a monitor display region, and that a corrected picture (NTSC) output circuit 13-5 outputs the corrected image signal as a normal television picture signal of the NTSC format. Exhibit 1012, [0007].
`
85. Enami discloses that a frame corresponding to a displayed frame is assumed as the virtual image frame 14-1. Exhibit 1012, [0017].
`
86. Enami discloses that an image obtained using a fish-eye lens is a mapping of light incident from a specific direction (r, θ, ϕ) to a point (x″, y″) expressed by the transform x″ = L·cosθ and y″ = L·sinθ, where L is the image height and is dependent on the lens projection, or lens distribution function. Exhibit 1012, [0053-0056].
`
87. Enami discloses that, using the mapping, a coordinate of a point in the fisheye lens image corresponds to a coordinate of a point arranged on a raster in the corrected image, and that the color information signal of the point having the corresponding coordinates is made to correspond. Exhibit 1012, [0058].
`
88. Enami discloses that to obtain a smooth corrected image, linear interpolation of the color information can be used. Exhibit 1012, [0072-0074]. In doing so, Enami describes calculating the color value of a coordinate point on the fisheye lens image by converting mapped coordinate points to integers by omitting the decimal points, or rounding down, and utilizing the color values at the corresponding integer coordinate points. Similarly, rounding up to obtain another integer coordinate is also described. Finally, the use of both integer coordinates is described as part of linear interpolation. It is well understood that interpolation can be used to obtain a smoother color value and that multiple approaches can be utilized in calculating the color value. Enami discloses the use of both the rounding-up and rounding-down processes in order to find the closest coordinate point, and further discloses averaging the rounded-up and rounded-down results in order to find the closest coordinate point and to obtain a smoother color value. Exhibit 1012, [0072-0074].
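
The rounding and interpolation described above can be sketched in Python. Enami describes rounding down, rounding up, and combining both integer coordinates by linear interpolation; the bilinear form below, which weights the four surrounding integer pixels, is one conventional way to realize that combination and is my illustration rather than Enami’s exact computation. Here img is assumed to be a NumPy image array indexed [row, column].

    import math

    def sample_color(img, x, y):
        # Floor and ceil give the neighboring integer coordinates
        # (Enami's rounding-down and rounding-up); weighting the four
        # neighbors linearly yields the smoother, interpolated color.
        x0, y0 = int(math.floor(x)), int(math.floor(y))
        x1 = min(x0 + 1, img.shape[1] - 1)
        y1 = min(y0 + 1, img.shape[0] - 1)
        fx, fy = x - x0, y - y0
        top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
        bottom = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
        return (1 - fy) * top + fy * bottom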
`
89. Enami discloses that a relationship between a point (x, y) in the fisheye lens imaging plane 14-3 and a point (p, q) in the image frame 14-1 is expressed by a relation of Equation (1):

[Equation (1) of Enami, not reproduced here]

Exhibit 1012, [0019-0020].
`
90. Enami discloses that a distortion of the fisheye lens image is corrected using the relationship of Equation (1) so as to recover the original image for display on a monitor. Exhibit 1012, [0021].
`
91. Enami discloses that the image correction processing circuit 13-4 obtains the point (x, y) in the fisheye lens imaging plane 14-3 corresponding to the point (p, q) in the image frame 14-1 using the relationship of Equation (1), reads a color information signal of that point from the picture memory (frame memory) 13-3, and writes the color information signal to an output picture memory (not shown) having an address corresponding to the point (p, q) in the image frame 14-1. Exhibit 1012, [0022-0023].
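
Paragraphs 89 through 91 describe a backward-mapping display loop, which can be sketched as follows. The callable equation1 stands in for the relation of Equation (1), which is not reproduced above; the loop structure (traverse the output frame, locate the fisheye point, read its color, write it to the output memory) is the part being illustrated, under my own naming.

    def render_frame(fisheye_img, out_w, out_h, equation1):
        # Backward mapping: for each point (p, q) of the image frame,
        # Equation (1) (placeholder callable) gives the corresponding
        # point (x, y) on the fisheye imaging plane, whose color is
        # written to the output at (p, q).
        out = [[None] * out_w for _ in range(out_h)]
        for q in range(out_h):
            for p in range(out_w):
                x, y = equation1(p, q)
                out[q][p] = sample_color(fisheye_img, x, y)  # prior sketch
        return out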
`
92. Enami discloses that the parameters for correcting distortion of a fisheye lens image, such as the central position of the displayed image, the aspect ratio of the fisheye lens image, and the radius of the fisheye lens image area, are extracted from the captured fisheye lens image itself, and that distortion of the fisheye lens image is corrected using the parameters. Exhibit 1012, [0033].
`
X. Claim 10 Of The ‘990 Patent Would Have Been Obvious To A Person Of Ordinary Skill In The Art Over Nagaoka In View Of Shiota
`
93. The Feinberg Declaration (Exhibit 1013) explains how Nagaoka (Exhibit 1003) discloses all the elements of claim 1, i.e., a panoramic objective lens having an image point distribution function that is not linear relative to the field angle of object points