Petition for Inter Partes Review of
U.S. Pat. No. 7,477,284
IPR2013-00327
EXHIBIT
Sony-
(19) Japanese Patent Office (JP)
(12) Kokai Unexamined Patent Application Bulletin (A)
(11) Laid Open Patent Application No. 8-159762
(43) Publication Date: June 21, 1996
Number of Claims: 25 OL
Number of Pages: 15
Examination Request: not yet made
(51) Int. Cl.6: G01C 11/06; G01B 11/00; G01C 3/06; G06F 15/62 350A 415
Internal File No. FI: 9365-5H HV (Tech. Indic.: continued on the last page)
(21) Application No.: 6-298224
(22) Application Date: December 1, 1994
(71) Applicant: 000213909 Aero Asahi Corporation, 3-1-1 Higashiikebukuro, Toshima-ku, Tokyo-to
(72) Inventor: INOUE, Toru; Aero Asahi Corporation, 3-1-1 Higashiikebukuro, Toshima-ku, Tokyo-to
(74) Agent: Patent Attorney, TANAKA, Tsuneo
(54) [Title of Invention] Three-Dimensional Data Extraction Method and Device, and Stereo Image Forming Device

(57) [Abstract]
[Object] To create DEM data from video images.
[Constitution] Video images are taken of a target region from the air (S1). At this time, the camera position is measured by way of differential GPS. The camera is mounted on an anti-vibration device, and the orientation of the camera is measured precisely by way of its gyroscope output and magnetic bearing sensor output. Exterior orientation elements are determined accurately by matching fields in video images that overlap 60% (S2). The leading line, middle line, and final line of each field are extracted, and are separately combined to create continuous mosaic images consisting of a forward view image, a nadir view image, and a rearward view image (S3). The vertical parallax is removed from the continuous mosaic images (S4). The parallax difference is calculated from the forward view image and the rearward view image (or the nadir view image) (S5), and the height is calculated from the parallax difference (S6).
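As a rough orientation to the S1-S6 flow above, the final step (S6) can be sketched as follows. This is an illustrative assumption, not code from the patent: the function name is hypothetical, and the formula is the standard photogrammetric parallax relation, which the patent text does not state explicitly.

```python
def height_from_parallax_difference(dp, flying_height, stereo_base):
    """S6 (illustrative assumption): the textbook parallax relation
    h = H * dp / (b + dp), with H the flying height above the datum,
    b the stereo base, and dp the measured parallax difference, all
    in consistent units. The patent itself does not give a formula."""
    return flying_height * dp / (stereo_base + dp)

# With H = 300 m, b = 0.10 m, and a measured parallax difference of
# 0.002 m, an object height of roughly 5.9 m results.
```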
Translation by Patent Translations Inc. 1-800-844-0494 mail@PatentTranslations.com
[Claims]
[Claim 1] A three-dimensional data extraction method comprising: a basic information collection step of imaging a three-dimensional data extraction target while moving, recording that image signal, and recording imaging information including the position and orientation of the imaging camera; and a three-dimensional data generation step of generating three-dimensional data for said extracted object from the images and imaging information collected in said basic information collection step, the three-dimensional data extraction method being characterized in that said three-dimensional data generation step comprises: a continuous mosaic image generation step of extracting image data of prescribed lines in prescribed screens in consecutive screens of captured images, and generating at least two continuous mosaic images from among a forward view image, a nadir view image, and a rearward view image; a vertical parallax removal step of removing vertical parallax from the continuous mosaic images generated in said continuous mosaic image generation step; a parallax difference calculation step of calculating the parallax difference for a prescribed position in the continuous mosaic images from which vertical parallax was removed in said vertical parallax removal step; and a height calculation step of calculating the height of said prescribed position from the parallax difference calculated in said parallax difference calculation step.
[Claim 2] The three-dimensional data extraction method recited in claim 1, further comprising an orientation calculation step of establishing, by way of relative orientation and successive orientation, exterior orientation elements from consecutive screens of captured images.
[Claim 3] The three-dimensional data extraction method recited in claim 2, wherein said orientation calculation step comprises: a relative orientation step of extracting, from consecutive screens of captured images, two screens that overlap in a prescribed proportion and performing relative orientation; and a successive orientation step of associating models that have been relatively oriented by said relative orientation step.
[Claim 4] The three-dimensional data extraction method recited in claim 3, wherein said prescribed proportion is 60%.
[Claim 5] The three-dimensional data extraction method recited in any one of claims 2 to 4, wherein said vertical parallax removal step comprises: an exterior orientation element interpolation step of interpolating exterior orientation elements for each line of the continuous mosaic images generated in said continuous mosaic image generation step, in accordance with the exterior orientation elements determined in said orientation calculation step; and a projection step of transforming the lines of the continuous mosaic images generated in said continuous mosaic image generation step into images projected to a prescribed altitude in accordance with the exterior orientation elements of said lines.
[Claim 6] The three-dimensional data extraction method recited in any one of claims 1 to 5, wherein said parallax difference calculation step comprises: an intermediate image formation step of forming one or more intermediate images between said continuous mosaic images; a corresponding point detection step of going through said one or more intermediate images and detecting corresponding points in said continuous mosaic images; and a computation step of calculating the parallax difference of said corresponding points in accordance with the detection results of said corresponding point detection step.
[Claim 7] The three-dimensional data extraction method recited in claim 6, wherein said intermediate image formation step

JP-08-159762-A Page 2
extracts image data of intermediate lines in prescribed screens in consecutive screens of captured images, and forms said intermediate image.
[Claim 8] The three-dimensional data extraction method recited in any one of claims 1 to 7, wherein a GPS reception means is provided as a means for detecting the position of said camera.
[Claim 9] The three-dimensional data extraction method recited in claim 8, wherein a correction means that differentially corrects the output of the GPS reception means is further provided.
[Claim 10] The three-dimensional data extraction method recited in any one of claims 1 to 9, wherein said camera is prevented from vibrating by an anti-vibration means.
[Claim 11] A three-dimensional data extraction device characterized by comprising: a video camera that images a three-dimensional data extraction target; a position measurement means that measures the position of said video camera; a recording means that records images captured by said video camera and measurement values of said position measurement means; a conveyance means that conveys said video camera and said position measurement system [sic]; a reproduction means that reproduces the image information and position information recorded by said recording means; a continuous mosaic image generation means that extracts image data of prescribed lines in prescribed screens in consecutive screens of captured images, and generates at least two continuous mosaic images from among a forward view image, a nadir view image, and a rearward view image; a vertical parallax removal means that removes vertical parallax from the continuous mosaic images generated by said continuous mosaic image generation means; a parallax difference calculation means that calculates parallax difference in continuous mosaic images from which vertical parallax has been removed by said vertical parallax removal means; and a height calculation means that calculates the height of said [sic] prescribed position from the parallax difference calculated by said parallax difference calculation means.
[Claim 12] The three-dimensional data extraction device recited in claim 11, further comprising an orientation calculation means that establishes, by way of relative orientation and successive orientation, exterior orientation elements from consecutive screens of captured images.
[Claim 13] The three-dimensional data extraction device recited in claim 12, wherein said orientation calculation means provides: a relative orientation step of extracting, from consecutive screens of captured images, two screens that overlap in a prescribed proportion and performing relative orientation; and a successive orientation step of associating models that have been relatively oriented by said relative orientation step.
[Claim 14] The three-dimensional data extraction device recited in claim 13, wherein said prescribed proportion is 60%.
[Claim 15] The three-dimensional data extraction device recited in any one of claims 12 to 14, wherein said vertical parallax removal means comprises: an exterior orientation element interpolation means that interpolates exterior orientation elements in each line of the continuous mosaic images generated by said continuous mosaic image generation means, in accordance with the exterior orientation elements determined by said orientation calculation means; and a projection means that transforms the lines of the continuous mosaic images generated by said continuous mosaic image generation means into images projected to a prescribed altitude in accordance with the exterior

orientation elements of said lines.
[Claim 16] The three-dimensional data extraction device recited in any one of claims 11 to 15, wherein said parallax difference calculation means comprises: an intermediate image formation means that forms one or more intermediate images between said continuous mosaic images; a corresponding point detection means that goes through said one or more intermediate images and detects corresponding points in said continuous mosaic images; and a computation means that calculates the parallax difference of said corresponding points in accordance with the detection results of said corresponding point detection means.
[Claim 17] The three-dimensional data extraction device recited in claim 16, wherein said intermediate image formation means extracts image data of intermediate lines in prescribed screens in consecutive screens of captured images, and forms said intermediate image.
[Claim 18] The three-dimensional data extraction device recited in any one of claims 11 to 17, wherein said position measurement means is a GPS reception means.
[Claim 19] The three-dimensional data extraction device recited in claim 18, further comprising a correction means that differentially corrects the output of the GPS reception means.
[Claim 20] The three-dimensional data extraction device recited in any one of claims 11 to 19, wherein said camera is mounted on said conveyance means via an anti-vibration means.
[Claim 21] The three-dimensional data extraction device recited in any one of claims 11 to 20, wherein said conveyance means is an aircraft.
[Claim 22] The three-dimensional data extraction device recited in any one of claims 11 to 21, further comprising a bearing detection means that detects the bearing of said camera, wherein the output of said bearing detection means is also recorded in said recording means.
[Claim 23] The three-dimensional data extraction device recited in any one of claims 11 to 22, wherein said camera is disposed so that the direction of travel of said conveyance means is a direction that is perpendicular to the direction of the scan lines of the camera.
[Claim 24] A stereo image formation device characterized by comprising: an extraction means that extracts line image data, at two or more different prescribed line positions in screens, in a video captured image; and a combining means that combines line image data from the same line positions.
[Claim 25] The stereo image formation device recited in claim 24, further comprising a vertical parallax removal means that removes vertical parallax from the images combined by said combining means, based on exterior orientation elements of each of the line image data on which [the images] are based.
[Detailed Description of the Invention]
[0001] [Field of Industrial Application] The present invention relates to a method and a device for three-dimensional data extraction, and to a stereo image formation device; more specifically, it relates to a method and device for extracting three-dimensional data from video images, and to a stereo image formation device that forms stereo images from video images.
[0002] [Prior Art] Conventionally, aerial survey technology based on aerial photographs has been used in creating three-dimensional topographical maps. But aerial survey technology involves having a helicopter or light airplane fly above a locality while taking stereoscopic photographs of the ground, and analyzing the resulting stereo photographs; obtaining stereographic photographs alone requires a great deal of time and expense, and analysis of the same likewise entails enormous effort and expense. If three-dimensional measurement is done by stereo matching using aerial photographs taken from a low altitude, matching errors will occur due to the influence of occlusion. This is because the two images that form a stereo image are seen from different viewing directions, and differences in the image due to differences in the observation direction make perfect matching impossible. In conventional cases, attempts have been made to eliminate such influence by using multiple orientation points on the ground, but with this, automation is impossible.
[0003] [Problems to Be Solved by the Invention] In contrast, the technology of using video images to create topographical maps is easy to automate. But in the prior art, in extracting photographic three-dimensional data, in the same way as in an aerial survey, it is considered necessary to have several air-photo signal points in the image (signals for which clear three-dimensional coordinates are known), and even if the necessary number of air-photo signals is assured, the precision is poor, in meter units, making this impractical.
[0004] Three-dimensional topographical maps are useful for the management of roads, rivers, railroads, and the like, for planning new routes for the same, and for surveying the state of development of cities and the like; there is a demand for a system that makes it possible to acquire three-dimensional topographical data quickly, cheaply, and simply. If three-dimensional topographical data is available, bird's-eye views can also be created easily (shown on a display or output on a printer), and various simulations can be run. And if three-dimensional data can be obtained by processing video images, it will also be easy to extract just the parts that have changed, thus also making it easy to survey the state of development of cities and the like.
[0005] An object of the present invention is to propose a three-dimensional data extraction method and device which extract three-dimensional data automatically.
[0006] Another object of the present invention is to propose a three-dimensional data extraction method and device which extract three-dimensional data from video images.
[0007] Another object of the present invention is to propose a stereo image formation device that forms stereo images (two images suitable for stereo matching) from video images.
[0008] [Means for Solving the Problems] In the present invention, image data of prescribed lines of video images is extracted, and continuous mosaic images having different parallax are formed. After vertical parallax is removed from these continuous mosaic images, the parallax difference is calculated by stereo matching. Heights are calculated from the resulting parallax differences.
[0009] Preferably, at least three images that overlap in a

prescribed proportion are extracted from consecutive screens of captured images, the screens are matched by way of the overlapping parts thereof, and exterior orientation elements are established. Then, in accordance with the exterior orientation elements determined by this orientation calculation, the exterior orientation elements are interpolated for each line of the continuous mosaic images, and the lines of the continuous mosaic images are transformed into images projected to a prescribed altitude, in accordance with the exterior orientation elements of said lines.
[0010] [Operation] The above processing can be automated on a computer, thus making it possible to automatically execute on a computer all the processes by which the stereo images needed for stereo matching are obtained from images resulting from video imaging, and by which heights are calculated, making it possible to quickly obtain three-dimensional data for a desired region or the like.
[0011] Being video images, a great deal of information is available that is needed for establishing exterior orientation elements, and the precision of the exterior orientation elements is increased. Consequently, the height data that is ultimately obtained is also of good precision. Furthermore, in the stereo image matching computation as well, by temporarily creating intermediate images for said stereo images and searching for corresponding points chainwise, the corresponding points between stereo images can be established with greater precision than in the case of stereo photographs, and the occlusion problem can be completely solved.
[0012] [Working Example] Hereafter, referring to the drawings, a working example of the present invention is described in detail.
[0013] FIG. 1 shows a schematic block diagram of an airborne measurement system in a working example of the present invention; FIG. 2 shows a schematic block diagram of an on-the-ground measurement system; and FIG. 3 shows a schematic block diagram of an on-the-ground analysis system.
[0014] The airborne measurement system shown in FIG. 1 will be described. In this working example, the airborne measurement system shown in FIG. 1 is aboard a helicopter. In this working example, a high-quality camera 10 is mounted on a high-precision anti-vibration stabilizing device (anti-vibration device) 12, and the high-quality image signal output thereof is recorded on videotape by a high-quality video tape recorder 14. Note that the camera 10 is generally facing downward, and is set so that the image directly below moves in a direction that is perpendicular to the scan lines. The output image signal of the camera 10 is also applied to a high-quality monitor 16. This allows visual confirmation of what the camera 10 is capturing and the state of imaging.
[0015] The high-precision anti-vibration stabilizing device 12 is made so that vibration from the aircraft does not affect the camera 10. This makes it possible to record images without blurring. That is to say, by combining a gyroscope and gimbal servo, the high-precision anti-vibration stabilizing device 12 has a spatial stabilization function that keeps the optical axis of the camera 10 pointed in a fixed direction in inertial space against any fluctuations in the angles about the roll, pitch, and yaw axes that arise in the airframe.
[0016] [Reference numeral] 18 is a personal computer that, along with collecting and recording measurement data, controls the high-precision anti-vibration stabilizing device 12 via a three-axis control device 20, controls the camera 10 via a camera control device 22, and controls the VTR 14 via a VTR control device 24. With the three-axis control device 20, the target bearing of the anti-vibration stabilizing device 12 can be set arbitrarily; the camera control device 22 controls the focus, zoom, diaphragm value, color balance, and the like of the camera 10; and the VTR control device 24 controls the recording start, stop, and pause on the VTR 14, and also acquires and transfers to the computer 18 a time code that is recorded together with the output image signal of the camera 10. This time code is used for synchronization when analyzing the image information and other measurement data that is recorded on the VTR 14, in the on-the-ground analysis system.
[0017] A height above ground sensor 26 detects the height above ground, and the magnetic bearing sensor 28 detects the magnetic bearing. Because, even with the high-precision anti-vibration stabilizing device 12, there is slow directional movement caused by gyroscopic drift, it is necessary to correct the orientation of the camera by way of the magnetic bearing sensor 28. The outputs of the sensors 26 and 28 are applied to the computer 18 as digital data. Also input to the computer 18 are tri-axial gyroscopic data from the anti-vibration stabilizing device 12 indicating the tri-axial orientation of the camera 10 (roll angle, pitch angle, and yaw angle), and zoom data indicating the zoom value from the camera 10.
[0018] [Reference numeral] 30 is a GPS (global positioning system) receiving antenna, and 32 is a GPS reception device that collects the current ground coordinates (longitude, latitude, and altitude) from the GPS antenna 30. The GPS position measurement data output from the GPS reception device 32 is applied to the computer 18 for recording, and is also applied to a navigation system 34 for navigation. The navigation system 34 performs three-dimensional graphic display of the current position on a screen of a monitor 38, with respect to set survey lines, in accordance with navigation data (survey line data) that has been previously recorded on a floppy disk 36. This makes it possible to capture images following along desired survey lines in regions where there are no target objects on the ground, or in regions where they cannot be ascertained (for example, mountainous regions, sea regions, or the like).
[0019] Note that the differential GPS (D-GPS) method, in which measurements are taken by GPS even at reference points where the coordinates are known, and the GPS position measurement data is corrected by the measurement error, is known as a way to improve GPS measurement precision. In this working example, this differential GPS method is adopted; the coordinates of a reference station of known coordinates are measured at the same time by GPS, and the measurement-error data is radioed to the helicopter as GPS correction data. The communication device 40 receives the GPS correction data from the reference station and transfers it to the computer 18.
[0020] The computer 18 records the flight data that is input (height above ground data, magnetic bearing data, zoom data, tri-axial gyroscopic data), together with GPS correction data and

time codes from the VTR control device 24, to the floppy disk 42. The computer 18 can also display various input data on the monitor 44 as necessary, and an operator can input various instructions to the computer 18 from a keyboard 46.
[0021] In readiness for cases in which the communication with the reference station by the communication device 40 goes bad, in this working example, as shown in FIG. 2, the measured GPS correction data is saved on a floppy disk independently at the reference station as well. That is to say, the GPS reception device 50 calculates the current location of the GPS antenna 52 from the output of the GPS antenna 52, and outputs the GPS position measurement data to a computer 54. Accurate coordinates (reference position data) of the GPS antenna 52 are measured beforehand, and this data is input to or set in the computer 54. The computer 54 computes the error of the GPS position measurement data from the GPS reception device 50 and the reference position data, and records it on a floppy disk 56 as GPS correction data. Of course, measurement-time information is also recorded at the same time. The GPS position measurement data and the error (which is to say, the GPS correction data) are displayed on the screen of the monitor 58 as necessary. The operator can input various instructions to the computer 54 by way of a keyboard 60. The computer 54 also sends the GPS correction data via a communication device 62 to (the computer 18 of) the airborne measurement system shown in FIG. 1.
[0022] The data measured by the airborne measurement system shown in FIG. 1 (and as necessary by the on-the-ground measurement system shown in FIG. 2) is analyzed, and three-dimensional data is calculated, by the on-the-ground analysis system shown in FIG. 3. That is to say, the high-quality VTR 70 plays back the videotape that was recorded by the airborne measurement system shown in FIG. 1, applying the image signal to the frame buffer 74 and the reproduced time codes to an engineering workstation 76. The image data temporarily stored in the frame buffer 74 is applied to a monitor 78 and is image-displayed. Needless to say, sometimes the reproduced time codes are also displayed simultaneously on the monitor 78.
[0023] A personal computer 80 reads the flight data and GPS correction data that is simultaneously collected by the airborne measurement system shown in FIG. 1 (in the event of a communication breakdown, the GPS correction data measured by the on-the-ground measurement system shown in FIG. 2), corrects the GPS position measurement data with the GPS correction data and corrects the tri-axial gyroscopic data with the magnetic bearing data, and transfers this, along with other measurement data and the time codes that are recorded together, to the workstation 76. The workstation 76 controls the VTR 70 in accordance with the time codes supplied from the computer 80 and causes the VTR 70 to play back images with the same time code. In this way, the workstation 76 can correlate the conditions and imaging position when imaging with the images captured at that time, and reproduces three-dimensional data by way of computation that is described in detail below.
[0024] FIG. 4 shows the flow in this working example, from measurement to three-dimensional data extraction. First, the equipment shown in FIG. 1 is put on board the aircraft, the target region is imaged and the flight information is recorded while flying over the target region at as constant an altitude and speed as possible (S1). At this time, the imaging target basically moves perpendicular to the scan lines of the camera 10. The images captured by the camera 10 are recorded on videotape by the VTR 14. At the same time, information on the exact position (latitude, longitude, height) and orientation of the camera 10 is recorded on a floppy disk 42 along with the time codes from the VTR 14. The time codes are used during analysis on the ground for synchronizing the position and orientation of the camera with the reproduced images.
[0025] The position of the camera is known basically from the GPS position measurement data that is output from the GPS receiver [sic] 32, and for the sake of better precision undergoes differential processing with the GPS correction data from the reference station. The differential processing may be done aboard the aircraft, but in consideration of such factors as bad communication of the GPS correction data, it is preferable to record the output of the GPS receiver 32 (the GPS position measurement data) and the GPS correction data separately on a floppy disk 42, and to do the differential processing during analysis on the ground. When there has been bad communication of the GPS correction data, the GPS position measurement data undergoes differential processing with the GPS correction data that has been recorded and saved by way of the on-the-ground measurement system shown in FIG. 2.
[0026] With regard to the orientation of the camera 10, values wherein the output of the gyro sensors of the anti-vibration stabilizing device 12 has been corrected by the output of the magnetic bearing sensor 28 are recorded to the floppy disk 42. Specifically, the orientation of the three axes (pitch, roll, and yaw) is recorded on the floppy disk. Of course, the orientation of the camera 10 may be fixed, for the sake of simplicity, if the performance of the anti-vibration stabilizing device 12 is good or if simplification is acceptable.
[0027] In this connection, if the imaging altitude is set to 1000 feet, then with a Hi-Vision camera that uses a 2-million-pixel CCD image sensor, if the focal length is 8.5 mm, the imaging range is 339 m and the resolution is 17.7 cm, and if the focal length is 102.0 mm, the imaging range is 28 m and the resolution is 1.5 cm.
[0028] The recorded information (image and flight information) is reproduced and analyzed by the on-the-ground analysis system shown in FIG. 3. As described above, in accordance with the information during imaging from the computer 80 (the camera position and bearing, and the time codes), the workstation 76 controls the VTR 70 and causes it to play back images with the same time code. The reproduced image signals are digitized and stored in the frame buffer 74. In this way, the workstation 76 can obtain image data as well as data on the position and orientation of the camera during imaging, and outputs DEM data after going through the various processing of orientation calculation (S2), continuous mosaic image formation (S3), vertical parallax

other so that the scan lines do not overlap, and are displayed in alternation every 1/60 second.
[0035] That is to say, the video images captured by the camera 10 are made up of scenes (fields) in which the imaging position is changed every 1/60 second; in this working example, as shown in FIG. 9, line data for the leading line, the middle line, and the final line of each field are extracted therefrom. The image formed from the data of the leading line of each field is called the forward view image, the image formed from the data of the middle line of each field is called the nadir view image, and the image formed from the data of the final line of each field is called the rearward view image. If the travel speed when imaging is constant, and the orientation of the camera is also constant, stereoscopic viewing is possible using this forward view image, this nadir view image, and this rearward view image. Note that lenses of short focal length may be used to emphasize height, which is to say, to increase the resolution.
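The line extraction described in [0035] can be sketched as below. This is a hypothetical illustration, not code from the patent: each field is modeled as a list of scan lines, and the leading, middle, and final line of every field are stacked into the three continuous mosaic images.

```python
def build_continuous_mosaics(fields):
    """Sketch of the extraction in [0035] (hypothetical code): each
    field is a list of scan lines. The leading line of every field
    forms the forward view image, the middle line the nadir view
    image, and the final line the rearward view image."""
    forward = [field[0] for field in fields]
    nadir = [field[len(field) // 2] for field in fields]
    rearward = [field[-1] for field in fields]
    return forward, nadir, rearward

# Example: 3 fields of 5 lines each, where each "line" is just a
# (field index, row index) tag instead of real pixel data.
fields = [[(f, r) for r in range(5)] for f in range(3)]
fwd, nad, rear = build_continuous_mosaics(fields)
# fwd  -> [(0, 0), (1, 0), (2, 0)]   leading lines
# nad  -> [(0, 2), (1, 2), (2, 2)]   middle lines
# rear -> [(0, 4), (1, 4), (2, 4)]   final lines
```

Because each mosaic row comes from a different along-track camera position, the three images view the ground from forward, nadir, and rearward angles, which is what makes the later stereo matching possible.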
[0036] However, in video imaging with an aircraft, there are the variable factors of changes in the flying speed, flight path deviations, changes in the flying altitude, and changes in the three axes (pitch, roll, and yaw), and it is necessary to remove the vertical parallax that arises from these influences (S4).
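The interpolation step that [0037] and claim 5 describe, assigning exterior orientation elements (X, Y, Z, ω, ϕ, κ) to every mosaic line from the values established at the imaging points, can be sketched with a simple linear interpolation. The function and data layout below are hypothetical illustrations; the patent does not specify the interpolation scheme.

```python
def interpolate_exterior_orientation(known, line_index):
    """Sketch (hypothetical): `known` maps a mosaic line index to the
    exterior orientation elements (X, Y, Z, omega, phi, kappa)
    established at an imaging point; lines between imaging points
    receive linearly interpolated values, and lines outside the
    known range are clamped to the nearest imaging point."""
    idxs = sorted(known)
    if line_index <= idxs[0]:
        return known[idxs[0]]
    if line_index >= idxs[-1]:
        return known[idxs[-1]]
    for lo, hi in zip(idxs, idxs[1:]):
        if lo <= line_index <= hi:
            t = (line_index - lo) / (hi - lo)
            return tuple(a + t * (b - a)
                         for a, b in zip(known[lo], known[hi]))

# Two imaging points P (line 0) and Q (line 10):
known = {0: (0.0, 0.0, 1000.0, 0.0, 0.0, 0.0),
         10: (50.0, 0.0, 1010.0, 0.0, 0.0, 0.02)}
# Line 5 receives the midpoint values (25.0, 0.0, 1005.0, 0.0, 0.0, 0.01).
```

With per-line orientations in hand, each line can then be reprojected to a prescribed altitude, which is the projection step the claims recite.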
[0037] In the vertical parallax removal processing (S4), first, based on the exterior orientation elements of the images with a 60% overlap percentage obtained in the orientation calculation (S2), the values of the exterior orientation elements corresponding to each of the line data in the continuous mosaic images (forward view image, nadir view image, and rearward view image) are interpolated. For example, as shown in FIG. 10, where the exterior orientation elements, which is to say, the position and orientation of the camera, at three imaging points P, Q, R are, respectively, (Xp, Yp, Zp, ωp, ϕp, κp), (Xq, Yq, Zq, ωq, ϕq, κq), and (Xr, Yr, Zr, ωr, ϕr, κr), in the continuous mosaic image the values of these exterior orientation elements are assigned to the lines that correspond to the nadir view image at the imaging points P, Q, R, and interpolated values are assigned to other lines. In this way, as shown in FIG. 11, exterior orientation
