United States Patent [19]

Chotiros

[11] Patent Number: 4,891,762

[45] Date of Patent: Jan. 2, 1990

[54] METHOD AND APPARATUS FOR TRACKING, MAPPING AND RECOGNITION OF SPATIAL PATTERNS

[76] Inventor: Nicholas P. Chotiros, 1508 Charolais Dr., Austin, Tex. 78758

[21] Appl. No.: 154,048

[22] Filed: Feb. 9, 1988

[51] Int. Cl.4 .................. G06F 15/50; G06F 15/70

[52] U.S. Cl. .................. 364/456; 342/64; 382/16

[58] Field of Search .......... 364/449, 456, 423, 458, 454, 443; 342/64, 90, 180; 382/16, 22, 26, 30, 48
[56] References Cited

U.S. PATENT DOCUMENTS

3,879,728  4/1975  Wolff .................... 342/64
3,992,707 11/1976  Schmidtlein et al. ....... 342/64
4,164,728  8/1979  Marsh .................... 342/64
4,179,693 12/1979  Evans et al. ............. 342/64
4,192,004  3/1980  Buerger ................. 364/518
4,396,903  8/1983  Habicht et al. ........... 342/64
4,490,719 12/1984  Botwin et al. ............ 342/64
4,494,200  1/1985  Lam ..................... 364/449
4,514,733  4/1985  Schmidtlein et al. ....... 342/64
4,590,608  5/1986  Chen et al. .............. 382/43
4,602,336  7/1986  Brown ................... 364/456
4,635,203  1/1987  Merchant ................ 364/458
4,700,307 10/1987  Mons et al. ............. 364/453
4,715,005 12/1987  Heartz .................. 364/521
4,736,436  4/1988  Yasukawa et al. .......... 382/16
4,754,493  6/1988  Coates ................... 382/48
OTHER PUBLICATIONS

Besl, "Geometric Modeling and Computer Vision," pp. 936-958, Proceedings of the IEEE, vol. 76, No. 8, Aug. 1988.

Eppig, "Autonomous Vehicles for Underwater Search and Survey," pp. 46-60, presented at the 4th International Symposium on Unmanned Untethered Submersible Technology, Jun. 24-27, 1985.

Primary Examiner-Parshotam S. Lall
Assistant Examiner-Thomas G. Black
[57] ABSTRACT

A method and apparatus for the identification of spatial patterns that occur in two or more scenes or maps. Each pattern comprises a set of points in a spatial coordinate system, collectively represented by the geometrical figure formed by connecting all point pairs by straight lines. The pattern recognition process is one of recognizing congruent geometrical figures. Two geometrical figures are congruent if all the lines in one geometrical figure are of the same length as the corresponding lines in the other. This concept is valid in a spatial coordinate system of any number of dimensions. In two- or three-dimensional space, a geometrical figure may be considered as a polygon or polyhedron, respectively. Using the coordinates of the points in a pair of congruent geometrical figures, one in a scene and the other in a map, a least squares error transformation matrix may be found to map points in the scene into the map. Using the transformation matrix, the map may be updated and extended with points from the scene. If the scene is produced by the sensor system of a vehicle moving through an environment containing features at rest, the position and orientation of the vehicle may be charted, and, over a series of scenes, the course of the vehicle may be tracked. If the scenes are produced by a sensor system at rest, then moving objects and patterns in the field of view may be tracked.

5 Claims, 7 Drawing Sheets
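By way of illustration of the least squares step described in the abstract: given the coordinates of corresponding points in a pair of congruent geometrical figures, one in a scene and one in a map, a rigid least squares error transformation can be computed with the orthogonal Procrustes (Kabsch) construction. The patent does not name a specific algorithm, so the following Python sketch is only one plausible reading.

import numpy as np

def least_squares_transform(scene_pts, map_pts):
    # scene_pts, map_pts: (N, d) arrays of corresponding point coordinates
    # from the two congruent figures. Returns a rotation R and translation t
    # minimizing the squared error of map_pts ~ scene_pts @ R.T + t.
    scene_pts = np.asarray(scene_pts, dtype=float)
    map_pts = np.asarray(map_pts, dtype=float)
    s_mean, m_mean = scene_pts.mean(axis=0), map_pts.mean(axis=0)
    H = (scene_pts - s_mean).T @ (map_pts - m_mean)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    D = np.diag([1.0] * (H.shape[0] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = m_mean - R @ s_mean
    return R, t

With R and t in hand, scene_pts @ R.T + t charts the scene points in the map frame, which is the operation used to update and extend the map.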
[Front-page figure: flowchart of the position and orientation tracking processor (corresponding to FIG. 6), showing the map, current scene, clustering processor, spatial pattern recognition processor, list of matched lines compilation and reduction processors, congruent geometrical figures recognition processor, coordinate transformation matrix computation processor, vehicle velocity, position and orientation computation processor, and map updating processor.]
[Sheet 1 of 7: FIG. 1]
[Sheet 2 of 7: FIG. 2]
[Sheet 3 of 7: FIG. 3]
[Sheet 4 of 7: FIG. 4]
[Sheet 5 of 7: FIG. 5, flowchart of a navigation system]
[Sheet 6 of 7: FIG. 6, flowchart of the position and orientation tracking processor]
[Sheet 7 of 7: FIG. 7, flowchart of the list of matched lines reduction processor]
METHOD AND APPARATUS FOR TRACKING, MAPPING AND RECOGNITION OF SPATIAL PATTERNS

I. BACKGROUND-FIELD OF INVENTION

The invention concerns methods and apparatus for the recognition, tracking and mapping of spatial patterns for a variety of uses.

II. BACKGROUND-DESCRIPTION OF PRIOR ART

The background may be divided into three connected parts: method, apparatus and applications.

A. Method

The fundamental process is one of recognition of unique spatial patterns that recur in two or more scenes. A comprehensive reference of the existing methods is given by Paul J. Besl in his article, "Geometric Modeling and Computer Vision," Proceedings of the IEEE, pages 936 to 958, Volume 76, Number 8, August 1988. The methods may be divided into two main categories: linear and nonlinear. The method of the invention falls in the latter category.

A.1 Linear methods

The linear methods are based on the crosscorrelation process, which is inherently linear. It has a number of drawbacks, however. One drawback is that it requires a large amount of computation power. Attempts to improve the computational efficiency include hierarchical correlation processing and hierarchical organization of the scene. Another drawback is its inefficiency in dealing with patterns of unknown rotation. To remedy this problem, there have been several attempts, in both space and wave number domains, to develop rotation invariant methods. In all cases, the computational efficiency could only be increased at the expense of reduced scene resolution and a degradation in the recognition performance. In applications where there is a large amount of redundancy in the pattern, such as in the recognition of printed text, this is not a serious drawback. A third drawback is the tendency of the crosscorrelation process to give ambiguous or false results when the scene is noisy, imperfect or incomplete.

A.2 Nonlinear methods

The nonlinear methods are a loose collection of heuristic methods based on feature extraction and pattern matching concepts, in which the position and orientation of a set of features in a spatial coordinate system is termed a pattern. In a two-dimensional scene, features may include discrete points, texture, gray scale, lines, curves and corners, and planar surfaces. In a two-dimensional space, the pattern formed by a set of points may be represented by a polygon, and in a three-dimensional space, a polyhedron. Feature extraction and pattern matching have been successfully applied to certain types of optical and radar images, and in the recognition of printed and handwritten text.

The method of the invention is one of recognizing patterns that may be considered as geometrical figures, including polygons and polyhedrons. In practice, the computation resources and computation time required by the existing methods of recognizing polygons and polyhedrons increase sharply with scene complexity; therefore they are most useful when applied to simple scenes or selected parts of complicated scenes. This is quite acceptable for scenes which contain simple patterns or complicated patterns with much redundancy, but not for cluttered scenes that may contain incomplete patterns with little redundancy. In this respect, the method of the invention is superior to existing methods.

B. Apparatus

The invention is expected to be particularly useful in practical applications of pattern recognition, where the available space and electrical power are limited. These limitations impose definite constraints on the design of the apparatus. For example, in an autonomous underwater vehicle, it is estimated that a few tens of watts of electrical power may be available for navigation and guidance computations. Using CMOS technology, it is possible to achieve processing rates of more than 10 million integer multiply-and-accumulate operations per second (Mmacs) for a power consumption of only one watt. Therefore, an acceptable navigation processor should not require more than a few hundred Mmacs of processing power. These constraints effectively exclude a large proportion of the above mentioned methods from applications in autonomous underwater vehicles.

C. Applications

The invention is expected to be particularly useful to applications in autonomous navigation. There are a number of autonomous navigation systems in existence. They include ballistic missile and cruise missile guidance systems. Equivalent systems for autonomous underwater or land vehicles, that can guide an unmanned craft to its destination over long distances, are still in their infancy. The navigation methods and equipment of existing unmanned autonomous underwater vehicles, described by Stephen H. Eppig in a paper entitled "Autonomous Vehicles for Underwater Search and Survey," presented at the 4th International Symposium on Unmanned Untethered Submersible Technology, June 24-27, 1985, are based on a combination of an inertial navigation system aided by Doppler or correlation sonars and periodic course corrections provided by acoustic ranging. Acoustic ranging systems rely on a network of acoustic transponders that must be deployed at strategic positions within the operating area; therefore they cannot be considered as self-contained systems. The Doppler or correlation sonars provide a measurement of cruising velocity that may be integrated to give an estimate of the distance traveled. In conjunction with an inertial navigation system, the velocity measurements may be used to estimate course and position relative to a set of known starting coordinates.

Systems based on the Doppler or correlation sonars are the only self-contained underwater navigation systems currently available, i.e. systems that do not require external equipment such as beacons or transponders. Both types of sonars are inclined to give velocity measurement errors, particularly over sloping ocean bottoms or moving scattering layers. The resulting error in the position estimation is cumulative; therefore, corrective position fixes by other means are necessary at periodic intervals. Systems based on velocity measurement and integration are also incapable of recognizing previously traversed areas. Apart from this invention, there are no self-contained systems that can successfully navigate by the tracking and recognition of naturally occurring features on the ocean bottom; contributing factors include the relative scarcity of information in sonar images and the substantial computation resources required by existing methods. The method of the invention is successful because it is particularly efficient in its use of information and computation resources.
III. OBJECTS AND ADVANTAGES

Accordingly I claim the following as objects and advantages of this invention: to provide a method and apparatus for the recognition, tracking and mapping of spatial patterns, using a pattern recognition process whose distinguishing features are (a) the concept of congruent geometrical figures and (b) a maximum likelihood method of efficiency enhancement.

In addition, I claim the following objects and advantages: to provide a method and apparatus that facilitates the navigation of manned or unmanned vehicles through the recognition, tracking and mapping of spatial patterns formed by environmental features; to provide a method and apparatus to produce and store feature maps; and to provide a method and apparatus to recognize previously encountered areas and to track vehicle position and orientation with the aid of feature maps.

Further objects and advantages of the invention may be found from the ensuing description and accompanying drawings.

IV. DRAWINGS AND FIGURES

FIG. 1 illustrates the application of pattern recognition to navigation.
FIG. 2 illustrates the pattern recognition concept.
FIG. 3 illustrates the process of navigating a course using an existing feature map.
FIG. 4 illustrates the process of navigating a course and the accumulation of a feature map.
FIG. 5 shows the flowchart of a navigation system.
FIG. 6 shows the flowchart of the position and orientation tracking processor.
FIG. 7 shows the flowchart of the list of matched lines reduction processor.

V. DRAWING REFERENCE NUMERALS

1 discrete features
2 previous position
3 current position
4 field of view at the previous position
5 field of view at the current position
6 discrete features that are common to both fields of view
7 feature map
8 current scene
9 vehicle position on feature map
10 starting point
11 destination
12 navigation processor
13 mission objectives
14 sensor system for detecting and locating environmental features
15 position and orientation tracking processor
16 steering and speed correction processor
17 steering and speed controller
18 clustering processor
19 compacted scene
20 spatial pattern recognition processor
21 list of matched lines compilation processor
22 list of matched lines
23 list of matched lines reduction processor
24 reduced list of matched lines
25 congruent geometrical figures recognition processor
26 congruent geometrical figures
27 coordinate transformation matrix computation processor
28 map updating processor
29 vehicle velocity, position and orientation computation processor
30 common matched lines tallying processor
31 tally matrix
32 list of likely matched points compilation processor
33 list of likely matched points
34 list of likely matched lines compilation processor
35 list of likely matched lines
36 unlikely matched lines elimination processor

VI. DESCRIPTION

In the following, the invention and its application in the navigation of a vehicle are described. The navigation application is described because it well illustrates the operation of the invention. The description is given in two levels: concept and process. At the concept level, a qualitative description is given of the invention and its uses as a navigation tool. At the process level, the operation of the invention within a navigation system is described in detail.

A. Concept

Consider a vehicle, traveling through an environment in which there are a number of features, and equipped with a sensor system that is able to detect the features and to estimate the position of each feature relative to the vehicle. Practical examples include: a spacecraft equipped with an optical imaging and ranging system traveling through a planetary system, an aircraft equipped with radar traveling over a terrestrial area, and an underwater vehicle equipped with sonar traveling over the ocean bottom. In the first example, the relevant features may be celestial objects; in the second example, telephone poles, trees and other landmarks that are detectable by a radar system; and in the third example, rocks, clumps of coral and other features of the ocean bottom that are detectable by a sonar system.

In FIG. 1, a vehicle is shown traveling over an area containing discrete features 1 that are detectable to its sensor system. The illustration shows the vehicle at a previous position 2 and its current position 3. At the previous position 2, the vehicle has a certain field of view 4, in which it detects a number of the features. Let the field of view 5 at the current position overlap the field of view 4 at the previous position. For each field of view, the sensor system provides a set of information, called a scene, comprising the signal intensities produced by the detected features and their estimated positions relative to the vehicle. The position of each feature is represented by a single point in a spatial coordinate system. A number of features 6 lie within the intersection of the two fields of view; consequently they must be represented in the two corresponding scenes. An apparatus that can recognize and match the points representing the common features 6 in the two scenes will enable the vehicle to track its movement from the previous position 2 to the current position 3.

Using the positional information provided by the sensor system, straight lines may be used to connect any set of points within a scene to form a geometrical figure. The geometrical figure is uniquely defined by the lengths of the lines joining all point pairs within the set. This concept is valid in a spatial coordinate system of any number of dimensions. In a two- or three-dimensional
space, the geometrical figure may be considered as a polygon or polyhedron, respectively. By this concept, the common features 6 may be described as a geometrical figure. If all the lines in one geometrical figure are of the same length as the corresponding lines in another geometrical figure, then the two are said to be congruent. It follows from the above definition of the geometrical figure that identical geometrical figures must be congruent. Therefore, the process of recognizing common features in two fields of view may be formulated as one of recognizing congruent geometrical figures between the two corresponding scenes.

The geometrical figure formed by the common features 6, and the positions of the vehicle relative to it, are illustrated in FIG. 2. The vehicle positions 2 and 3, relative to the geometrical figure, are constructed from the positional information contained in the two scenes. The change in position of the vehicle from 2 to 3 is equal to the difference between the two position vectors.

In general, a map may be defined as a collection of points in space whose positions relative to accepted geographical references are known, or considered to be known. If the previous position and orientation of the vehicle 2 is known or considered to be known, then the corresponding scene may be converted into a map. Through the recognition of congruent geometrical figures, the current vehicle position may be charted on the map.

With reference to FIG. 3, if a map 7 of an operating area were available, then congruent geometrical figures between a current scene 8 and the map may be used to chart the position of the vehicle 9 in the map. In this way, the course and position of the vehicle may be continuously tracked. This concept may be used to guide a vehicle from a starting position 10 to its destination 11, as illustrated in FIG. 3.

In the absence of a map, a vehicle may guide itself towards a designated destination 11, defined in terms of a distance and bearing from a known starting orientation and position 10, through the following steps: Using the known starting position and orientation, the contents of a scene acquired at the starting position and orientation may be converted into a map. Then, through a series of overlapping scenes linked by congruent geometrical figures, the map may be extended in the direction of the destination by the accumulation of interlocking geometrical figures, as illustrated in FIG. 4. The process may be continued until the requisite distance is covered.

B. Process

A description of the invention and its application to vehicle navigation will be given, with particular emphasis on underwater vehicles. A simplified flowchart of a navigation system is shown in FIG. 5. The navigation processor 12 is driven by the mission objectives 13. A sensor system 14 is used to provide the navigation processor with scenes that represent features in the environment. The navigation processor may be subdivided into two component processors, a position and orientation tracking processor 15, and a steering and speed correction processor 16; both may request the sensor system to provide a new current scene 8 as necessary. The steering and speed correction processor drives a steering and speed controller 17 which operates the steering and propulsion units, thus closing the control loop.

In this application, the invention is the position and orientation tracking processor 15. The components of the navigation system are described in the following sections. In particular, the operation of the invention, that is the position and orientation tracking processor, is described in section B.2 and its subsections B.2.a through B.2.d.

B.1 The sensor system

A suitable sensor system is used to produce a scene, by detecting the presence of discrete features within the field of view and estimating their positions relative to the vehicle. Many types of sensor systems are capable of producing scenes of this type, such as radar, lidar, and stereoscopic passive sensor systems. For an underwater vehicle, the sensor system is usually a sonar system. A brief description of the operation of a suitable sonar system will be given for completeness.

The sonar system detects features on the ocean bottom through the ability of the features to scatter sound. Features are detected by collecting the backscattered sound signals produced by sound waves impinging on them. The sonar system includes beamformers that are used to separate the backscattered signals into beams according to their directions of arrival. A peak in the intensity of the signals in any beam is indicative of a feature in the corresponding direction; the travel time of the signal peak is measured and used to estimate the range of the indicated feature. Suitably prominent signal peaks are collected. For each peak, the position of the corresponding feature is calculated from the estimated range and direction of arrival. The result is a set of data points that constitute the current scene 8, each data point containing a signal intensity and an estimate of position relative to the sensor position.

By implication, the sensor position must be at the origin of the coordinate system of the point positions. For simplicity, a principal axis of the coordinate system is aligned with the sensor orientation. Since the sensor system is an integral part of the vehicle, the sensor position and orientation may be considered identical to the position and orientation of the vehicle.

B.2 Position and orientation tracking processor

The position and orientation tracking processor is illustrated in FIG. 6. It is subdivided into a number of component processors described as follows.

B.2.a Clustering processor: In practice, the position of every point in a scene is subject to a degree of uncertainty. At any given level of confidence, the uncertainty may be expressed in terms of a confidence interval. Using simple decision theory methods, the confidence interval of the position of a point is calculated from the feature location accuracy of the sensor system and the selected confidence level. The feature location accuracy of the sensor system is determined by physical factors and the characteristics of the sensor system. The selected confidence level is a parameter with possible values between 0 and 100%; while there are no set rules regarding its proper value, intermediate values have been found to give satisfactory results.

In practice, more than one data point may be found within the span of a confidence interval. The presence of more than one data point within a confidence interval represents an unnecessary redundancy. Therefore, a clustering processor 18 is used to identify groups of two or more points that occupy a space too small to be reliably resolved by the sensor system at the selected confidence level, and replace each group by a single representative data point at the centroid of the group; the centroid is defined as the average position weighted by signal intensity. Then, a unique identifier is assigned to each data point. An identifier may be a character string, bit pattern or any other suitable form of symbolic information. The result is a compacted scene 19.
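A minimal Python sketch of the clustering step follows. The grouping rule (greedy single linkage at the resolution distance), the representative intensity (the group total) and the identifier format are assumptions made for illustration; the text prescribes only the intensity-weighted centroid and the unique identifiers.

import numpy as np

def compact_scene(points, intensities, resolution):
    # points: (N, d) estimated feature positions; intensities: (N,) signal
    # intensities; resolution: the smallest separation the sensor can
    # reliably resolve at the selected confidence level.
    points = np.asarray(points, dtype=float)
    intensities = np.asarray(intensities, dtype=float)
    n = len(points)
    parent = list(range(n))
    def find(i):                      # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):                # merge points closer than resolution
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) < resolution:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    out_pts, out_int, identifiers = [], [], []
    for k, members in enumerate(groups.values()):
        w = intensities[members]
        out_pts.append(np.average(points[members], axis=0, weights=w))
        out_int.append(w.sum())          # representative intensity (assumed)
        identifiers.append("S%04d" % k)  # unique symbolic identifier
    return np.array(out_pts), np.array(out_int), identifiers

The three returned collections together constitute the compacted scene 19.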
B.2.b The spatial pattern recognition processor: The operation of the spatial pattern recognition processor 20 is based on the criterion: two geometrical figures are congruent if the straight lines connecting all corresponding point pairs are of equal length. A straight line is defined as the shortest path between two points in a space of one or more dimensions, not necessarily limited to three dimensions.
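The criterion can be stated compactly in code. A minimal Python sketch follows, assuming the two point sets are given in corresponding order and using a single tolerance in place of the per-line combined confidence interval discussed below.

import itertools
import numpy as np

def congruent(fig_a, fig_b, tol):
    # fig_a, fig_b: (N, d) coordinate arrays with corresponding rows.
    # The figures are congruent if every straight line connecting a point
    # pair in fig_a equals the corresponding line in fig_b within tol.
    fig_a = np.asarray(fig_a, dtype=float)
    fig_b = np.asarray(fig_b, dtype=float)
    if fig_a.shape != fig_b.shape:
        return False
    for i, j in itertools.combinations(range(len(fig_a)), 2):
        len_a = np.linalg.norm(fig_a[i] - fig_a[j])
        len_b = np.linalg.norm(fig_b[i] - fig_b[j])
        if abs(len_a - len_b) > tol:
            return False
    return True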
Before going into the description of the processor itself, there are two important practical aspects that need to be considered: line length accuracy and recognition performance.

Just as there are uncertainties associated with the point positions, there must be uncertainties associated with the length of lines joining pairs of points. This uncertainty may also be allowed for in the form of a confidence interval. Thus, two line lengths are considered equal if the difference is within their combined confidence interval. The combined confidence interval may be approximated by the incoherent sum of the resolved confidence intervals of the positions of the four end points.
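Reading the "incoherent sum" as a root-sum-square, which is an assumption since the text gives no explicit formula, the equality test for two line lengths might look as follows in Python.

import math

def combined_confidence_interval(c1, c2, d1, d2):
    # c1, c2: position confidence intervals (scalar radii) of the two end
    # points of the scene line; d1, d2: those of the map line end points.
    return math.sqrt(c1**2 + c2**2 + d1**2 + d2**2)  # incoherent (RSS) sum

def lengths_equal(scene_len, map_len, c1, c2, d1, d2):
    # Two line lengths are considered equal if their difference lies
    # within the combined confidence interval.
    return abs(scene_len - map_len) <= combined_confidence_interval(c1, c2, d1, d2)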
As a consequence of line length uncertainties and other imperfections, it must be concluded that, in practice, there is a finite probability that the performance of the recognition processor may be less than perfect. Following standard decision theory methodology, the performance may be expressed in terms of the probability of detection and the probability of false alarm; in this context, "detection" refers to the proper recognition of congruent geometrical figures, and "false alarm" refers to a false recognition. In order to achieve or exceed a prescribed level of performance, it can be shown that the number of points in the congruent geometrical figures must equal or exceed a minimum threshold number. The said threshold number may be calculated from the required probabilities of detection and false alarm, the confidence intervals of the point positions, the dimensions of the compacted scene and the relevant region of the map, and the average densities of the points in the compacted scene and in the map.
Using information available to the navigation system, such as estimated vehicle velocity and elapsed time, it is often possible to limit the search to a relevant region in the map containing all the points that may be expected to form a geometrical figure congruent with another in the compacted scene. Similar search limits may also be applicable within the compacted scene. These limits can help improve performance and reduce costs. By calculating all the combinations and permutations that have to be tested, and given the above practical considerations, it can be shown that, to achieve a useful level of performance, a direct search of scenes and maps produced by sonars would be prohibitively costly in terms of search time and computation resources. A significantly more efficient method, embodied in the spatial pattern recognition processor 20, is hereby disclosed.
The spatial pattern recognition processor 20 may be divided into three parts:

(1) A processor 21 is used for comparing the lengths of straight lines between point pairs in the compacted scene to those of a relevant set of point pairs in the map, and compiling a list of all point pairs of equal line lengths at the required confidence level, known as the list of matched lines 22. The list is a symbolic list comprising a series of entries, each entry containing the identifiers of two points in the compacted scene and the identifiers of two points in the map that are joined by lines of equal length. The list is expected to be quite lengthy; therefore it should be well organized for efficient searching: the contents of each entry should be arranged in a definite order, with the identifiers from the compacted scene and those from the map paired and arranged in a definite order, and the identifiers within each pair ordered in a definite order, such as by alphabetical or numerical order. The entries should also be ordered in a definite order according to their contents, such as by alphabetical or numerical order.
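A minimal Python sketch of part (1) follows. It uses the direct all-pairs comparison for clarity (in practice the map side would be restricted to the relevant region discussed above), a single tolerance stands in for the per-line combined confidence interval, and entries and the list itself are kept in the definite sorted order described in the text.

import itertools
import numpy as np

def compile_matched_lines(scene_pts, scene_ids, map_pts, map_ids, tol):
    # Returns the list of matched lines 22: sorted entries of the form
    # ((scene_id1, scene_id2), (map_id1, map_id2)) for every scene line
    # and map line of equal length at the required confidence level.
    def lines(pts, ids):
        pts = np.asarray(pts, dtype=float)
        out = []
        for (i, a), (j, b) in itertools.combinations(enumerate(ids), 2):
            pair = tuple(sorted((a, b)))          # definite order in a pair
            out.append((pair, np.linalg.norm(pts[i] - pts[j])))
        return out
    map_lines = lines(map_pts, map_ids)
    matched = []
    for s_pair, s_len in lines(scene_pts, scene_ids):
        for m_pair, m_len in map_lines:
            if abs(s_len - m_len) <= tol:
                matched.append((s_pair, m_pair))  # scene pair, then map pair
    matched.sort()                                # definite entry order
    return matched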
(2) A list reduction processor 23 is used to reduce the list 22 by a maximum likelihood method. Its flowchart is shown separately in FIG. 7. The process involves the generation of a series of lists. For efficient searching, each list should be organized in a similar way to the list of matched lines. The processor 23 may be divided into four parts:
(a) A processor 30 for producing a tally matrix 31 of the number of matched lines that are common between each point in the compacted scene and another point in the map, by tallying the entries in the list 22 that contain each point pair. The resulting tally matrix 31 comprises a two-dimensional array, with the columns assigned to the points in the compacted scene, one point per column, and the rows assigned to the relevant set of points in the map, one point per row, and containing the number of matched lines common to all point pairs corresponding to the intersecting rows and columns.
(b) A proce
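For illustration of the tallying in part (a): because a matched-line entry does not state which scene end point corresponds to which map end point, the sketch below tallies both pairings, which is one reading of processor 30 rather than a detail fixed by the text.

from collections import defaultdict

def tally_matrix(matched_lines):
    # matched_lines: entries ((s1, s2), (m1, m2)) from the list of matched
    # lines 22. tally[(s, m)] counts the matched lines common to scene
    # point s and map point m, i.e. the tally matrix 31 in sparse form.
    tally = defaultdict(int)
    for (s1, s2), (m1, m2) in matched_lines:
        for s, m in ((s1, m1), (s1, m2), (s2, m1), (s2, m2)):
            tally[(s, m)] += 1
    return tally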
