US006844990B2

Artonne et al.

(10) Patent No.: US 6,844,990 B2
(45) Date of Patent: Jan. 18, 2005

(54) METHOD FOR CAPTURING AND DISPLAYING A VARIABLE RESOLUTION DIGITAL PANORAMIC IMAGE

(75) Inventors: Jean-Claude Artonne, Montreal (CA); Christophe Moustier, Marseilles (FR); Benjamin Blanc, Montreal (CA)

(73) Assignee: 6115187 Canada Inc., Saint Laurent (CA)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 10/706,513

(22) Filed: Nov. 12, 2003

(65) Prior Publication Data
US 2004/0136092 A1, Jul. 15, 2004

Related U.S. Application Data
(63) Continuation of application No. PCT/FR02/01588, filed on May 10, 2002.

(30) Foreign Application Priority Data
May 11, 2001 (FR) .......... 01 06261

(51) Int. Cl.7: G02B 13/06; G02B 13/18
(52) U.S. Cl.: 359/725; 359/718
(58) Field of Search: 359/718, 719, 725, 728

(56) References Cited

U.S. PATENT DOCUMENTS
3,953,111 A      4/1976  Fisher et al.
5,880,896 A      3/1999  Ishii et al.
6,031,670 A      2/2000  Inoue
6,333,826 B1 *  12/2001  Charles .......... 359/725
6,449,103 B1 *   9/2002  Charles .......... 359/725

FOREIGN PATENT DOCUMENTS
EP  0 695 085 A1    1/1996
EP  1 004 915 A1    5/2000
WO  WO 00/42470 A1  7/2000

* cited by examiner

Primary Examiner—Scott J. Sugarman
(74) Attorney, Agent, or Firm—Akin Gump Strauss Hauer & Feld, LLP

(57) ABSTRACT

A method for capturing a digital panoramic image includes projecting a panorama onto an image sensor by means of a panoramic objective lens. The panoramic objective lens has a distribution function of the image points that is not linear relative to the field angle of the object points of the panorama, such that at least one zone of the image obtained is expanded while at least another zone of the image is compressed. When a panoramic image obtained is then displayed, correcting the non-linearity of the initial image is required and is performed by means of a reciprocal function of the non-linear distribution function of the objective lens or by means of the non-linear distribution function.

26 Claims, 11 Drawing Sheets
`
`
`
[Drawing sheet 1 of 11]
`
`
`
[Drawing sheet 2 of 11: FIG. 4A and the distribution function Fdc]
`
`
`
[Drawing sheet 3 of 11: FIGS. 7A and 7B]
`
`
`
[Drawing sheet 4 of 11: distribution function graphs with points Pd1 (dr = 0.338) and Pd2 (dr = 0.9)]
`
`
`
[Drawing sheet 5 of 11]
`
`
`
[Drawing sheet 6 of 11: image Img2]
`
`
`
[Drawing sheet 7 of 11: FIG. 12 flow chart]

S1 - Acquisition

- Taking a panoramic image by means of a still digital camera or a digital video camera equipped with a panoramic lens having a non-linear distribution function Fd

S2 - Transfer of the image file into a computer

- Transfer of the image file (image disk) into a microcomputer
- Storage in the auxiliary storage (optional)

S3 - Linearisation of the image disk

- Transfer of the image points of the initial image disk into a second virtual image disk comprising more image points than the initial image disk, by means of the reciprocal function Fd⁻¹
- Obtaining a linear image disk

S4 - Digitisation

- Transfer of the image points of the second image disk into a system of axes OXYZ in spherical coordinates
- Obtaining a panoramic image in a hemisphere

S5 - Interactive display

- Determination of the image points of an image sector to be displayed
- Display of the image sector on a display window
- Detection of the user's actions on a screen pointer or any other control means
- Detection of the user's actions on keys for image enlargement
- Modification of the sector displayed (sliding the image sector displayed on the surface of the hemisphere and/or shrinking/expanding the image sector displayed)
`
`
`
[Drawing sheet 8 of 11: points P(px, py, pz) and E(i, j); display window DW; hemisphere HS]
`
`
`
[Drawing sheet 9 of 11]

Fig. 14

S1 - Acquisition

- Taking a panoramic image by means of a still digital camera or a digital video camera equipped with a panoramic lens having a non-linear distribution function Fd

S2 - Transfer of the image file into a computer

- Transfer of the image file (image disk) into a microcomputer
- Storage in the auxiliary storage (optional)

S3' - Interactive display with implicit correction of the non-linearity of the initial image

A - Determination of the colour of the points E(i, j) of an image sector to be displayed using the points p(pu, pv) of the image disk:

1 - determination of the coordinates Ex, Ey, Ez in the coordinate system OXYZ of each point E(i, j) of the sector to be displayed,
2 - determination of the coordinates Px, Py, Pz of the points P of the hemisphere corresponding to the points E(i, j),
3 - calculation of the coordinates, in the coordinate system O'UV of the image disk, of the points p(pu, pv) corresponding to the points P of the hemisphere, by means of the function Fd,

B - Presentation of the image sector in a display window,
C - Detection of the user's actions on a screen pointer or any other control means,
D - Detection of the user's actions on enlargement keys,
E - Modification of the image sector displayed (moving and/or shrinking/expanding the image sector)
`
`
`
[Drawing sheet 10 of 11: FIG. 17]
`
`
`
[Drawing sheet 11 of 11: FIG. 18]
`
`
`
METHOD FOR CAPTURING AND
DISPLAYING A VARIABLE RESOLUTION
DIGITAL PANORAMIC IMAGE
`
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of International Application No. PCT/FR02/01588, filed May 10, 2002, the disclosure of which is incorporated herein by reference.
`
BACKGROUND OF THE INVENTION

The present invention relates to obtaining digital panoramic images and displaying panoramic images on computer screens.

FIG. 1 represents a classical device allowing a digital panoramic image to be produced and presented on a computer screen. The device comprises a digital camera 1 equipped with a panoramic objective lens 2 of the "fish-eye" type, having an angular aperture on the order of 180°. The camera 1 is connected to a computer 5, such as a microcomputer for example, equipped with a screen 6. The connection to the microcomputer 5 may be permanent, when, for example, the camera 1 is a digital video camera, or temporary, when, for example, the camera 1 is a still digital camera equipped with an image memory, the connection then being carried out at the time the image files are to be transferred into the microcomputer.

FIG. 2 schematically represents the appearance of a panoramic image 3 obtained by means of the panoramic objective lens 2. The round appearance of the image is characteristic of the axial symmetry of panoramic objective lenses, and the image has dark edges 4 that will subsequently be removed. This digital panoramic image is delivered by the camera 1 in the form of a computer file containing image points coded RGBA arranged in a two-dimensional table, "R" being the red pixel of an image point, "G" the green pixel, "B" the blue pixel, and "A" the Alpha parameter or transparency. The parameters R, G, B, A are generally coded on 8 bits.

The image file is transferred into the microcomputer 5, which transforms the initial image into a three-dimensional digital image, then presents the user with a sector of the three-dimensional image in a display window 7 occupying all or part of the screen 6.

FIG. 3 schematically shows classical steps of transforming the two-dimensional panoramic image into a panoramic image offering a realistic perspective effect. After removing the black edges of the image, the microcomputer has a set of image points forming an image disk 10 of center O and axes OX and OY. The image points of the image disk are transferred into a three-dimensional space defined by an orthogonal coordinate system of axes O'X'Y'Z, the axis O'Z being perpendicular to the plane of the image disk. The transfer is performed by a mathematical function implemented by an algorithm executed by the microcomputer, and leads to obtaining a set of image points referenced in the coordinate system O'X'Y'Z. These image points are for example coded in spherical coordinates RGBA(φ,θ), φ being the latitude and θ the longitude of an image point. The angles φ and θ are coded in 4 to 8 bytes (IEEE standard). These image points form a hemisphere 11 when the panoramic objective lens used has an aperture of 180°, otherwise a portion of a hemisphere. The microcomputer thus has a virtual image in the shape of a hemisphere, one sector 12 of which, corresponding to the display window 7, is presented on the screen (FIG. 1) considering that the observer is on the
`
central point O' of the system of axes O'X'Y'Z, which defines with the center O" of the image sector 12 a direction O'O" called "viewing direction".
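The disk-to-hemisphere transfer described above can be sketched numerically. This is an illustrative reconstruction rather than the patent's actual algorithm: it assumes the linear 180° fisheye of this background section, and the function name and conventions are ours.

```python
# Illustrative reconstruction (not the patent's algorithm): for a
# linear 180-degree fisheye, an image-disk point at relative distance
# dr (0..1) and azimuth theta maps to spherical coordinates
# (phi = latitude, theta = longitude) on the hemisphere.

def disk_to_hemisphere(dr, theta_deg):
    """dr in 0..1, azimuth in degrees -> (latitude, longitude) in degrees."""
    alpha = dr * 90.0        # linear distribution: field angle grows with dr
    latitude = 90.0 - alpha  # disk centre = zenith (90 deg), rim = horizon
    return latitude, theta_deg

print(disk_to_hemisphere(0.0, 0.0))   # centre of the disk
print(disk_to_hemisphere(1.0, 45.0))  # rim of the disk
```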
In order to avoid the displayed image sector 12 having geometrical distortions unpleasant for the observer, the classical panoramic objective lenses must have a distribution function of the image points according to the field angle of the object points of a panorama that is as linear as possible. Therefore, if two points A', B', situated on the same meridian of the hemisphere 11, and the corresponding points A, B on the image disk 10 are considered, the ratio between the angles (A'O'Z) and (B'O'Z) must be equal to the ratio between the distances OA and OB on the image disk.

Due to this property of linearity of a classical panoramic objective lens, image points corresponding to object points having an identical field angle form concentric circles C10, C20 . . . C90 on the image disk 10, as represented in FIG. 4A. Classically, "field angle of an object point" means the angle of an incident light ray passing through the object point considered and through the center of the panorama photographed, relative to the optical axis of the objective lens. The field angle of an object point can be between 0 and 90° for an objective lens having an aperture of 180°. Therefore, the circle C10 is formed by the image points corresponding to object points having a field angle of 10°, the circle C20 is formed by image points corresponding to object points having a field angle of 20°, etc., the circle C90 being formed by the image points having a field angle of 90°.
FIG. 4B represents the shape of the distribution function Fdc of a classical panoramic objective lens, which determines the relative distance dr of an image point in relation to the center of the image disk according to the field angle α of the corresponding object point. The relative distance dr is between 0 and 1 and is equal to the distance of the image point in relation to the center of the image divided by the radius of the image disk. The ideal form of the function Fdc is a straight line of gradient K:

dr = Fdc(α) = K·α

in which the constant K is equal to 0.111 degree⁻¹ (1/90°).
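As a numerical illustration (ours, not the patent's), the ideal linear distribution function Fdc can be written directly from the gradient K given above:

```python
# The ideal linear distribution function Fdc of a classical panoramic
# objective lens: relative distance dr = K * alpha, with K = 1/90 per
# degree, for field angles alpha between 0 and 90 degrees.

K = 1.0 / 90.0  # gradient K, approximately 0.111 degree^-1

def fdc(alpha_deg):
    """Relative distance (0..1) of the image point for field angle alpha."""
    return K * alpha_deg

# Object points sharing a field angle land on one concentric circle:
for alpha in (10, 45, 90):
    print(alpha, round(fdc(alpha), 3))
```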
This technique of displaying a digital panoramic image sector on a computer screen has various advantages, particularly the possibility of "exploring" the panoramic image by sliding the image sector presented on the screen to the left, the right, upwards or downwards, until the limits of the panoramic image are reached. This technique also allows complete rotations of the image to be carried out when two complementary digital images have been taken and supplied to the microcomputer, the latter thus reconstituting a complete panoramic sphere by assembling two hemispheres. Another advantage provided by presenting a panoramic image on screen is to enable the observer to make enlargements or zooms on parts of the image. The zooms are performed digitally, by shrinking the image sector displayed and expanding the distribution of the image points on the pixels of the screen.

Various examples of interactive panoramic images can be found on the Web. Reference could be made in particular to the central site "http://www.panoguide.com" ("The Guide to Panoramas and Panoramic Photography"), which gives a full overview of all the products available to the public to produce these images. Software programs allowing digital panoramic photographs to be transformed into interactive panoramic images are offered to the public in the form of downloadable programs or CD-ROMs available in stores.
`
`
`
Despite the various advantages that this technique for displaying digital images offers, the digital enlargements have the disadvantage of being limited by the resolution of the image sensor used when taking the initial image, and the resolution of an image sensor is generally much lower than that of a classical photograph. Therefore, when the enlargement increases, the graininess of the image appears as the limits of the resolution of the image sensor are reached.

To overcome this disadvantage, it is well known to proceed with pixel interpolations so as to delay the appearance of the blocks of color which betray the limits of the resolution of the sensor. However, this method only improves the appearance of the enlarged image sector and does not in any way increase the definition. Another obvious solution is to provide an image sensor with a high resolution, higher than the resolution required to present an image sector without enlargement, so that there is a remaining margin of definition for zooms. However, this solution is expensive, as the cost price of an image sensor rises rapidly with the number of pixels per unit of area.

Some attempts have been made to improve the quality of the enlargements by changing the optical properties of the panoramic objective lenses themselves. Thus, U.S. Pat. No. 5,710,661 teaches capturing a panoramic image with two overlooking objective lenses using a set of mirrors. A first set of mirrors provides an overall view, and a mobile central mirror provides a detailed view of a determined zone of the panorama. However, this solution does not offer the same flexibility as digital zooms, particularly when the image is not displayed in real time, as the observer no longer has the possibility of choosing the image portion that he wants to enlarge once the photograph has been taken.
`
BRIEF SUMMARY OF THE INVENTION

Therefore, the present invention comprises a method allowing the physical limits of image sensors to be circumvented and the definition offered by digital enlargements concerning certain parts of a digital panoramic image to be improved, without the need to increase the number of pixels per unit of area of an image sensor or to provide an overlooking optical enlargement system in a panoramic objective lens.

The present invention is based on the observation that, in several applications, only certain zones of a panoramic image are of practical interest and are likely to be expanded by the observer by means of a digital zoom. Thus, in applications such as video surveillance, videoconferencing, and visio-conferencing, a panoramic camera can be installed against a wall or on the ceiling, and there is generally no reason to make enlargements on the zones of the panoramic image corresponding to the wall or the ceiling. Similarly, as part of a videoconference performed by means of a panoramic camera, the most interesting zone is generally situated at a specific place situated towards the center of the image (in the case of individual use) or on the edges of the image (in the case of collective use or visio-conferencing). Furthermore, when used for recreation and leisure, most panoramic images comprise parts that are less interesting than others, such as the parts representing the sky or a ceiling for example, the most useful part generally being in the vicinity of the center of the image.

Therefore, the present invention is based on the premise that a panoramic image has some zones that are not very useful and that can tolerate a reasonable definition to the benefit of other zones of the image.

On the basis of this premise, the idea of the present invention is to produce panoramic photographs by means of
`On the basis of this premise, the idea of the present
`invention is to produce panoramic photographs by means of
`
a panoramic objective lens that is not linear, which expands certain zones of the image and compresses other zones of the image. The technical effect obtained is that the expanded zones of the image cover a number of pixels of the image sensor that is higher than if they were not expanded, and thus benefit from a better definition. By choosing an objective lens that expands the most useful zones of an image (which depend on the intended application), the definition is excellent in these zones and mediocre in the zones of lesser importance.

Thus, the present invention proposes a method for capturing a digital panoramic image, by projecting a panorama onto an image sensor by means of a panoramic objective lens, in which the panoramic objective lens has an image point distribution function that is not linear relative to the field angle of object points of the panorama, the distribution function having a maximum divergence of at least ±10% compared to a linear distribution function, such that the panoramic image obtained has at least one substantially expanded zone and at least one substantially compressed zone.
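A hypothetical non-linear distribution function can illustrate the divergence criterion stated above. The sine profile below is our own example of a centre-expanding function, not a lens profile actually disclosed in the patent:

```python
import math

# Hypothetical centre-expanding distribution function (illustrative;
# the patent does not disclose this particular profile).  Its maximum
# divergence from the linear function is measured over 0..90 degrees.

def fd_linear(alpha):
    return alpha / 90.0

def fd_center_expanding(alpha):
    return math.sin(math.radians(alpha))  # 0 on the axis, 1 at the rim

max_div = max(abs(fd_center_expanding(a) - fd_linear(a)) for a in range(91))
print(round(max_div, 3))  # comfortably above a 10% divergence
```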
According to one embodiment, the objective lens has a non-linear distribution function that is symmetrical relative to the optical axis of the objective lens, the position of an image point relative to the center of the image varying according to the field angle of the corresponding object point.

According to one embodiment, the objective lens expands the center of the image and compresses the edges of the image.

According to one embodiment, the objective lens expands the edges of the image and compresses the center of the image.

According to one embodiment, the objective lens compresses the center of the image and the edges of the image, and expands an intermediate zone of the image located between the center and the edges of the image.

According to one embodiment, the objective lens comprises a set of lenses forming an apodizer.

According to one embodiment, the set of lenses forming an apodizer comprises at least one aspherical lens.

According to one embodiment, the set of lenses forming an apodizer comprises at least one diffractive lens.

According to one embodiment, the objective lens comprises a set of mirrors comprising at least one distorting mirror.
The present invention also relates to a method for displaying an initial panoramic image obtained in accordance with the method described above, comprising a step of correcting the non-linearity of the initial image, performed by means of a reciprocal function of the non-linear distribution function of the objective lens or by means of the non-linear distribution function.

According to one embodiment, the step of correcting comprises a step of transforming the initial image into a corrected digital image comprising a number of image points higher than the number of pixels that the image sensor comprises.

According to one embodiment, the method comprises a step of calculating the size of the corrected image, by means of the reciprocal function of the distribution function, so that the resolution of the corrected image is equivalent to the most expanded zone of the initial image, and a step of scanning each image point of the corrected image, searching for the position of a twin point of the image point on the
`
`
`
initial image and allocating the color of the twin point to the image point of the corrected image.
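The scanning step described above can be sketched in one dimension. Everything here is illustrative: a hypothetical sine distribution function stands in for the real lens, and a row of pixel values stands in for the image:

```python
import math

# One-dimensional sketch of the scanning step: each point of the
# corrected (linear) image is allocated the colour of its "twin" point
# on the initial non-linear image.  fd is a hypothetical distribution
# function; the pixel row stands in for a real image.

def fd(alpha):
    return math.sin(math.radians(alpha))

def correct(initial, aperture_deg=90.0):
    n = len(initial)
    corrected = [0] * n
    for i in range(n):                       # scan the corrected image
        alpha = aperture_deg * i / (n - 1)   # field angle of this point
        dr = fd(alpha)                       # twin point's relative distance
        j = min(n - 1, round(dr * (n - 1)))  # nearest pixel on the initial image
        corrected[i] = initial[j]            # allocate the twin's colour
    return corrected

print(correct(list(range(10))))
```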
According to one embodiment, the initial image and the corrected image comprise an image disk.

According to one embodiment, the method comprises a step of transferring the image points of the corrected image into a three-dimensional space and a step of presenting one sector of the three-dimensional image obtained on a display means.

According to one embodiment, the method comprises a step of determining the color of image points of a display window, by projecting the image points of the display window onto the initial image by means of the non-linear distribution function, and allocating to each image point of the display window the color of the closest image point on the initial image.

According to one embodiment, the projection of the image points of the display window onto the initial image comprises a step of projecting the image points of the display window onto a sphere or a sphere portion, a step of determining the angle of each projected image point in relation to the center of the sphere or the sphere portion, and a step of projecting onto the initial image each image point projected onto the sphere or the sphere portion, the projection being performed by means of the non-linear distribution function considering the field angle that each point to be projected has in relation to the center of the sphere or the sphere portion.
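The three projection steps of this embodiment (window point onto the sphere, field angle at the sphere's centre, then sphere onto the image disk through the distribution function) can be sketched as follows, with an assumed sine distribution function and illustrative names:

```python
import math

# Sketch of the projection of a display-window point onto the initial
# image: (1) project onto the unit sphere, (2) take the field angle at
# the sphere's centre, (3) map onto the image disk through the assumed
# sine-profile non-linear distribution function fd.

def fd(alpha_deg):
    return math.sin(math.radians(alpha_deg))

def window_point_to_disk(ex, ey, ez):
    """Window point E(ex, ey, ez) -> image-disk coordinates (pu, pv)."""
    norm = math.sqrt(ex * ex + ey * ey + ez * ez)
    px, py, pz = ex / norm, ey / norm, ez / norm  # step 1: onto the sphere
    alpha = math.degrees(math.acos(pz))           # step 2: field angle vs axis OZ
    dr = fd(alpha)                                # step 3: distance on the disk
    az = math.atan2(py, px)                       # azimuth is preserved
    return dr * math.cos(az), dr * math.sin(az)

pu, pv = window_point_to_disk(0.0, 0.0, 1.0)  # point on the optical axis
print(pu, pv)
```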
The present invention also relates to a panoramic objective lens comprising optical means for projecting a panorama into an image plane of the objective lens, the panoramic objective lens having an image point distribution function that is not linear relative to the field angle of object points of the panorama, the distribution function having a maximum divergence of at least ±10% compared to a linear distribution function, such that a panoramic image obtained by means of the objective lens comprises at least one substantially expanded zone and at least one substantially compressed zone.

According to one embodiment, the panoramic objective lens has a non-linear distribution function that is symmetrical relative to the optical axis of the objective lens, the position of an image point relative to the center of an image obtained varying according to the field angle of the corresponding object point.

According to one embodiment, the panoramic objective lens expands the center of an image and compresses the edges of the image.

According to one embodiment, the panoramic objective lens expands the edges of an image and compresses the center of the image.

According to one embodiment, the panoramic objective lens compresses the center of an image and the edges of the image, and expands an intermediate zone of the image located between the center and the edges of the image.

According to one embodiment, the panoramic objective lens comprises a set of lenses forming an apodizer.

According to one embodiment, the set of lenses forming an apodizer comprises at least one aspherical lens.

According to one embodiment, the set of lenses forming an apodizer comprises at least one diffractive lens.

According to one embodiment, the panoramic objective lens comprises polymethacrylate lenses.

According to one embodiment, the panoramic objective lens comprises a set of mirrors comprising at least one distorting mirror.
`
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

The foregoing summary, as well as the following detailed description of preferred embodiments of the invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there are shown in the drawings embodiments which are presently preferred. It should be understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown.
In the drawings:

FIG. 1 described above represents a system for displaying a digital panoramic image on a screen;

FIG. 2 described above represents a panoramic image before it is processed by a computer;

FIG. 3 described above shows a classical method for transforming a two-dimensional panoramic image into a three-dimensional digital panoramic image;

FIGS. 4A and 4B described above show the linearity of a classical panoramic objective lens;

FIGS. 5 and 6 show one aspect of the method according to the present invention and respectively represent a distribution of image points obtained with a classical panoramic objective lens and a distribution of image points obtained with a non-linear panoramic objective lens according to the present invention;

FIGS. 7A and 7B show a first example of non-linearity of a panoramic objective lens according to the present invention;

FIG. 8 shows a second example of non-linearity of a panoramic objective lens according to the present invention;

FIG. 9 shows a third example of non-linearity of a panoramic objective lens according to the present invention;

FIG. 10 represents a system for displaying a digital panoramic image by means of which a method for correcting the panoramic image according to the present invention is implemented;

FIG. 11 schematically shows a first embodiment of the correction method according to the present invention;

FIG. 12 is a flow chart describing a method for displaying a panoramic image incorporating the first correction method according to the present invention;

FIG. 13 schematically shows a second embodiment of the correction method according to the present invention;

FIG. 14 is a flow chart describing a method for displaying a panoramic image incorporating the second correction method according to the present invention;

FIG. 15 is a cross-section of a first embodiment of a non-linear panoramic objective lens according to the present invention;

FIG. 16 is an exploded cross-section of a system of lenses present in the panoramic objective lens in FIG. 15;

FIG. 17 is a side view of a lens present in the panoramic objective lens in FIG. 15; and

FIG. 18 is the diagram of a second embodiment of a non-linear panoramic objective lens according to the present invention.
`
`
DETAILED DESCRIPTION OF THE INVENTION

A - Compression/Expansion of an Initial Image

FIG. 5 schematically represents a classical system for taking panoramic shots, comprising a panoramic objective lens 15
`
`
of optical axis OZ and a digital image sensor 17 arranged in the image plane of the objective lens 15. Here, four object points a, b, c, d will be considered that belong to a panorama PM located opposite the objective lens and respectively having angles of incidence α1, α2, −α2, −α1. As explained in the preamble, the field angle of an object point is the angle that an incident light ray passing through the object point considered and through the center of the panorama PM, marked by a point "p" in FIG. 5, has relative to the optical axis OZ of the objective lens. In this example, the angle α1 is equal to two times the angle α2. On the image sensor 17, image points a', b', c', d' corresponding to the object points a, b, c, d are located at distances from the center of the image respectively equal to d1, d2, −d2, −d1. As the distribution of the image points according to the field angle of the object points is linear with a classical panoramic objective lens, the distances d1 and d2 are linked by the following relation:

d1/α1 = d2/α2

As the angle α1 is here equal to 2α2, it follows that:

d1 = 2d2

As is well known by those skilled in the art, the term "linearity" here refers to a ratio of proportionality between the distance of an image point measured relative to the center of the image and the field angle of the corresponding object point. The notion of "linearity" in the field of panoramic objective lenses is therefore different from that prevailing in the field of paraxial optics (in the vicinity of the optical axis) when the conditions of Gauss are met.
FIG. 6 represents a system for taking shots of the same type as above, but in which the classical panoramic objective lens 15 is replaced by an objective lens 18 according to the present invention, the image sensor 17 being arranged in the image plane of the objective lens 18. The projections onto the image sensor 17 of the object points a, b, c, d having angles of incidence α1, α2, −α2 and −α1 relative to the axis OZ of the objective lens and to the center "p" of the panorama are considered again. On the image sensor 17, the corresponding image points a", b", c", d" are located at distances from the center of the image respectively equal to d1', d2', −d2', −d1'.

According to the present invention, the objective lens 18 has a distribution function of the image points that is not linear. The ratios of the distances d1', d2', −d2', −d1' are not equal to the ratios of the angles of incidence α1, α2, −α2, −α1. In the example represented, the distance d2' is clearly greater than d1'/2, such that the central part of the panoramic image projected onto the image sensor 17, which corresponds to a solid angle 2α2 centered on the optical axis OZ, occupies a greater area on the image sensor 17 than the area it occupies in FIG. 5 with the classical panoramic objective lens (hatched zone). This central part of the panoramic image is therefore projected onto the image sensor with expansion of its area, in relation to the area the central part would occupy if the objective lens were linear. The result is that the number of pixels of the image sensor covered by this part of the image is greater than in previous practices and that the definition obtained is improved. On the other hand, the part of the image delimited by two circles respectively passing through the points a", d" and through the points b", c" is compressed relative to the corresponding part in FIG. 5, and the definition on the edges of the image is less than that obtained with a classical linear objective lens, to the benefit of the central part of the image.
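The gain in definition for the expanded central zone can be illustrated numerically. The sine distribution below is a hypothetical centre-expanding function, not the patent's lens; pixel coverage is taken to grow with the square of the relative distance dr:

```python
import math

# Numerical illustration of the expanded central zone: relative disk
# area covered by field angles 0..alpha2, for a linear lens versus a
# hypothetical centre-expanding (sine-profile) lens.  Covered area,
# and hence pixel count, grows as dr squared.

def fd_linear(alpha):
    return alpha / 90.0

def fd_center_expanding(alpha):
    return math.sin(math.radians(alpha))

alpha2 = 30.0                     # half-angle of the central zone
area_linear = fd_linear(alpha2) ** 2
area_nonlinear = fd_center_expanding(alpha2) ** 2
print(round(area_nonlinear / area_linear, 2))  # the centre covers ~2.25x the pixels
```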
By applying the principle according to the present invention, which involves expanding one part of the image
`
and compressing another part of the image, the part to be expanded and the part to be compressed can be chosen according to the intended application, by producing several types of non-linear objective lenses and by choosing an objective lens suited to the intended application. Depending on the intended application, the most useful part of a panoramic image may be located in the center of the image, on the edge of the image, in an intermediate zone situated between the center and the edge of the image, etc.
FIGS. 7A-7B, 8 and 9 show three examples of non-linear distribution functions according to the present invention. The distribution function shown in FIGS. 7A and 7B corresponds to the example in FIG. 6, that is, a panoramic objective lens that expands the image in the center. FIG. 7A represents equidistant concentric circles C10, C20, . . . , C90 present on an image disk, each circle being formed by image points corresponding to object points having the same field angle. The circle C10 is formed by the image points corresponding to object points having a field angle of 10°, the circle C20 is formed by image points corresponding to object points having a field angle of 20°, etc. By comparing FIG. 7A with FIG. 4A described in the