Chotiros

[11] Patent Number: 4,891,762
[45] Date of Patent: Jan. 2, 1990

[54] METHOD AND APPARATUS FOR TRACKING, MAPPING AND RECOGNITION OF SPATIAL PATTERNS

[76] Inventor: Nicholas P. Chotiros, 1508 Charolais Dr., Austin, Tex. 78758

[21] Appl. No.: 154,048

[22] Filed: Feb. 9, 1988

[51] Int. Cl.4 .................... G06F 15/50; G06F 15/70
[52] U.S. Cl. ..................... 364/456; 342/64; 382/16
[58] Field of Search ............ 364/449, 456, 423, 458, 364/454, 443; 342/64, 90, 180; 382/16, 22, 26, 30, 48
`
`[56]
`
`References Cited
`U.S. PATENT DOCUMENTS
`3,879,728 4/1975 Wolff .................................... 342/64
`3,992,707 11/1976 Schmidtlein et al . ................. 342/64
`4,164,728 8/1979 Marsh ................................... 342/64
`4,179,693 12/1979 Evans et al ........................... 342/64
`4,192,004 3/1980 Buerger ............................... 364/518
`4,396,903 8/1983 Habicht et al. ....................... 342/64
`4,490,719 12/1984 Botwin et al. ........................ 342/64
`4,494,200 1/1985 Lam .................................... 364/449
`4,514,733 4/1985 Schmiddein et al .................. 342/64
`4,590,608 5/1986 Chen et al. ............................ 382/43
`4,602,336 7/1986 Brown ................................. 364/456
`4,635,203 1/1987 Merchant ............................ 364/458
`4,700,307 10/1987 Mons et al .......................... 364/453
`4,715,005 12/1987 Heartz ................................. 364/521
`4,736,436 4/1988 Yasukawa et al ..................... 382/16
`4, 754,493 6/1988 Coates ................................... 382/48
`
OTHER PUBLICATIONS

Besl, "Geometric Modeling and Computer Vision," pp. 936-958, Proceedings of the IEEE, vol. 76, No. 8, Aug. 1988.
Eppig, "Autonomous Vehicles for Underwater Search and Survey," pp. 46-60, presented at the 4th International Symposium on Unmanned Untethered Submersible Technology, Jun. 24-27, 1985.
`
`Primary Examiner-Parshotam S. Lall
`Assistant Examiner-Thomas G. Black
`
[57] ABSTRACT

A method and apparatus for the identification of spatial patterns that occur in two or more scenes or maps. Each pattern comprises a set of points in a spatial coordinate system, collectively represented by the geometrical figure formed by connecting all point pairs by straight lines. The pattern recognition process is one of recognizing congruent geometrical figures. Two geometrical figures are congruent if all the lines in one geometrical figure are of the same length as the corresponding lines in the other. This concept is valid in a spatial coordinate system of any number of dimensions. In two- or three-dimensional space, a geometrical figure may be considered as a polygon or polyhedron, respectively. Using the coordinates of the points in a pair of congruent geometrical figures, one in a scene and the other in a map, a least squares error transformation matrix may be found to map points in the scene into the map. Using the transformation matrix, the map may be updated and extended with points from the scene. If the scene is produced by the sensor system of a vehicle moving through an environment containing features at rest, the position and orientation of the vehicle may be charted, and, over a series of scenes, the course of the vehicle may be tracked. If the scenes are produced by a sensor system at rest, then moving objects and patterns in the field of view may be tracked.
`
`5 Claims, 7 Drawing Sheets
`
[Front-page drawing: reproduction of FIG. 6, the flowchart of the position and orientation tracking processor.]
`
`
`
[FIG. 1 (Sheet 1 of 7): illustration of the application of pattern recognition to navigation.]
`
`
`
[FIG. 2 (Sheet 2 of 7): illustration of the pattern recognition concept.]
`
`
`
[FIG. 3 (Sheet 3 of 7): illustration of navigating a course using an existing feature map.]
`
`
`
[FIG. 4 (Sheet 4 of 7): illustration of navigating a course and the accumulation of a feature map.]
`
`
`
[FIG. 5 (Sheet 5 of 7): flowchart of a navigation system. Mission objectives 13 drive the navigation processor 12, which contains the map 7, the current scene 8 and the position and orientation tracking processor 15; a sensor system 14 detects and locates environmental features; the steering and speed correction processor 16 drives the steering and speed controller 17.]
`
`
`
[FIG. 6 (Sheet 6 of 7): flowchart of the position and orientation tracking processor. The map 7 and the current scene 8 feed a clustering processor 18, which produces a compacted scene 19. Within the spatial pattern recognition processor 20, a list of matched lines compilation processor 21 produces the list of matched lines 22, a list of matched lines reduction processor 23 produces the reduced list of matched lines 24, and a congruent geometrical figures recognition processor 25 tests whether congruent geometrical figures 26 are found. Found figures pass to the coordinate transformation matrix computation processor 27, the map updating processor 28 and the vehicle velocity, position and orientation computation processor 29, whose output goes to the steering and speed correction processor 16.]
`
`
`
[FIG. 7 (Sheet 7 of 7): flowchart of the list of matched lines reduction processor 23. The list of matched lines 22 feeds a common matched lines tallying processor 30, which produces the tally matrix 31; a list of likely matched points compilation processor 32 produces the list of likely matched points 33; a list of likely matched lines compilation processor 34 produces the list of likely matched lines 35; and an unlikely matched lines elimination processor 36 produces the reduced list of matched lines 24.]
`
`
`
`1
`
`4,891,762
`
`2
`quite acceptable for scenes which contain simple pat(cid:173)
`terns or complicated patterns with much redundancy,
`but not for cluttered scenes that may contain incom(cid:173)
`plete patterns with little redundancy. In this respect, the
`5 method of the invention is superior to existing methods.
`
`METHOD AND APPARATUS FOR TRACKING,
`MAPPING Ai'ID RECOGNITION OF SPATIAL
`PATTERNS
`
`I. BACKGROUND-FIELD OF INVENTION
`The invention concerns methods and apparatus for
`the recognition, tracking and mapping of spatial pat(cid:173)
`terns for a variety of uses.
`
`II. BACKGROUND- DESCRIPTION OF PRIOR
`ART
`The background may be divided into three connected
`parts: method, apparatus and applications:
`
`B. Apparatus
`The invention is expected to be particularly useful in
`practical applications of pattern recognition, where the
`10 available space and electrical power are limited. These
`limitations impose definite constraints on the design of
`the apparatus. For example, in an autonomous underwa(cid:173)
`ter vehicle, it is estimated that a few tens of watts of
`electrical power may be available for navigation and
`15 guidance computations. Using CMOS technology, it is
`possible to achieve processing rates of more than IO
`million integer multiply-and-accumula.te operations per
`second (Mmacs) for a power consumption of only one
`watt. Therefore, an acceptable navigation processor
`should not require more than a few hundred Mmacs of
`processing power. These constraints effectively exclude
`a large proportion of the above mentioned methods
`from applications in autonomous underwater vehicles.
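By way of illustration only (not part of the patent's disclosure), the power budget arithmetic above can be restated in a few lines of code; the 30 W figure is an assumed example of "a few tens of watts":

available_power_watts = 30.0   # assumed example of "a few tens of watts"
mmacs_per_watt = 10.0          # more than 10 Mmacs per watt with CMOS, per the text

# The processing budget scales linearly with the available power.
budget_mmacs = available_power_watts * mmacs_per_watt
print(f"navigation processing budget: about {budget_mmacs:.0f} Mmacs")  # about 300 Mmacs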
`
`25
`
`A. Method
`The fundamental process is one of recognition of
`unique spatial patterns that recur in two or more scenes.
`A comprehensive reference of the existing methods is
`given by Paul J. Bes! in his article, "Geometric Model- 20
`ing and Computer Vision," Proceedings of the IEEE,
`pages 936 tO 958, Volume 76, Number 8, August 1988.
`The methods may be divided into two main categories:
`linear and nonlinear. The method of the invention falls
`C. Applications
`in the latter category.
`The invention is expectP.d to be particularly useful to
`A. I Linear methods:
`applications in autonomous navigation. There are a
`The linear methods are based on the crosscorrelation
`number of autonomous navigation system in existence.
`process, which is inherently linear. It has a number of
`drawbacks, however. One drawback is that it requires a
`They include ballistic missile and cruise missle guidance
`large amount of computation power. AttemptS to im- 30 systems. Equivalent systems for autonomous underwa(cid:173)
`prove the computational efficiency include hierarchical
`ter or land vehicles, that can guide an unmanned craft to
`correlation processing and hierarchical organization of
`its destination over long distances, are still in their in(cid:173)
`the scene. Another drawback is its inefficiency in deal-
`ing with patterns of unknown rotation. To remedy this
`fancy. The navigation methods and equipment of exist-
`problem, there have been several attempts, in both 35 ing unmanned autonomous underwater vehicles, de-
`scribed by Stephen H Eppig in a paper entitled, "Au-
`space and wave number domains, to develop rotation
`invariant methods. In all cases, the computational effi-
`tonomous vehicles for underwater search and survey,"
`ciency could only be increased at the expense of re-
`presented at the 4th International Symposium on Un-
`duced scene resolution and a degradation in the recog-
`manned Untethered Submersible Technology June
`nition performance. In applications where there is a 40 24-27 1985, are based on a combination of inertial navi-
`large amount of redundancy in the pattern, such as in
`gation system aided by Doppler or correlation sonars
`the recognition of printed text, this is not a serious
`with periodic course corrections provided by acoustic
`drawback. A third drawback is the tendency of the
`ranging. Acoustic ranging systems rely on a network of
`crosscorrelation process to give ambiguous or false
`acoustic transponders that must be deployed at strategic
`results when the scene is noisy, imperfect or incom- 45 positions within the operating area, therefore they can-
`plete.
`not be considered as selfcontained systems. The Dop-
`A.2 Nonlinear methods
`pier or correlation sonars provide a measurement of
`The nonlinear methods are a loose collection of heu-
`cruising velocity that may be integrated to give an esti-
`ristic methods based on feature extraction and pattern
`mate of the distance traveled. In conjunction with an
`matching concepts, in which the position and orienta- 50 inertial navigation system, the velocity measurements
`tion of a set of features in a spatial coordinate system is
`may be used to estimate course and position relative to
`termed a pattern. In a two-dimensional scene, features
`a set of known starting coordinates.
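By way of illustration only (not part of the patent's disclosure), dead reckoning by velocity integration, and its cumulative error, can be sketched in a few lines; the names and numbers below are assumed for the example:

def dead_reckon(start_xy, velocity_samples, dt):
    # Integrate measured velocity (vx, vy) over time steps of dt seconds.
    # Any bias in the measurements accumulates in the position estimate,
    # which is why periodic position fixes by other means are necessary.
    x, y = start_xy
    for vx, vy in velocity_samples:
        x += vx * dt
        y += vy * dt
    return (x, y)

# One hour of 1 m/s travel measured with a +0.01 m/s bias:
samples = [(1.01, 0.0)] * 3600          # true velocity is (1.0, 0.0)
print(dead_reckon((0.0, 0.0), samples, dt=1.0))
# approximately (3636.0, 0.0); the true position is (3600.0, 0.0)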
Systems based on the Doppler or correlation sonars are the only self-contained underwater navigation systems currently available, i.e. systems that do not require external equipment such as beacons or transponders. Both types of sonars are inclined to give velocity measurement errors, particularly over sloping ocean bottoms or moving scattering layers. The resulting error in the position estimation is cumulative, therefore corrective position fixes by other means are necessary at periodic intervals. Systems based on velocity measurement and integration are also incapable of recognizing previously traversed areas. Apart from this invention, there are no self-contained systems that can successfully navigate by the tracking and recognition of naturally occurring features on the ocean bottom; contributing factors include the relative scarcity of information in sonar
`3
`images and the substantial computation resources re(cid:173)
`quired by existing methods. The method of the inven(cid:173)
`tion is successful because it is particularly efficient in its
`use of information and computation resources.
`
`4,891,762
`
`4
`26 congruent geometrical figures
`27 coordinate transformation matrix computation pro-
`cessor
`28 map updating processor
`5 29 vehicle velocity, position and orientation computa(cid:173)
`tion processor
`30 common matched lines tallying processor
`31 tally matrix
`32 list of likely matched points compilation processor
`33 list of likely matched points
`34 list of likely matched lines compilation processor
`35 list of likely matched lines
`36 matched lines elimination processor
`
`VI. DESCRIPTION
`In the following, the invention in and its application
`in the navigation of a vehicle is described.. The naviga(cid:173)
`tion application is described because it well illustrates
`the operation of the invention. The description is given
`in two levels: concept and process. At the concept
`level, a qualitative description is given of the invention
`and its uses as a navigation tool. At the process level.
`the operation of the invention within a navigation sys(cid:173)
`tem is described in detail.
`
`III. OBJECTS AND ADV ANT AGES
`Accordingly I claim the following as objects and
`advantages of this invention: to provide a method and
`apparatus for the recognition, tracking and mapping of
`spatial patterns, using a pattern recognition process 10
`whose distinguishing features are (a) the concept of
`congruent geometrical figures and (b) a maximum likeli(cid:173)
`hood method of efficiency enhancement.
`In addition, I claim the following objects and advan(cid:173)
`tages: to provide a method and apparatus that facilitates 15
`the navigation of manned or unmanned vehicles
`through the recognition, tracking and mapping of spa(cid:173)
`tial patterns formed by environmental features, to pro(cid:173)
`vide a method and apparatus to produce and store fea(cid:173)
`ture maps, to provide a method and apparatus to recog- 20
`nize previously encountered areas and to track vehicle
`position and orientation with the aid of feature maps.
`Further objects and advantages of the invention may
`be found from the ensuing description and accompany(cid:173)
`ing drawings.
`
`25
`
`IV. DRAWINGS AND FIGURES
`FIG. 1 illustrates the application of pattern recogni(cid:173)
`tion to navigation.
`FIG. 2 illustrates the pattern recognition concept.
`FIG. 3 illustrates the process of navigating a course
`using an existing feature map.
`FIG. 4 illustrates the process of navigating a course
`and the accumulation of a feature map.
`FIG. 5 shows the flowchart of a navigation system. 35
`FIG. 6 shows the flowchart of the position and orien(cid:173)
`tation tracking processor.
`FIG. 7 shows the flowchart of the list of matched
`lines reduction processor.
`
`V. DRAWING REFERENCE NUMERALS
`1 discrete features
`2 previous position
`3 current position
`4 field of view at the previous position
`5 field of view at the current position
`6 discrete features that are common to both fields of
`view
`7 feature map
`8 current scene
`9 vehicle position on feature map
`10 starting point
`11 destination
`12 navigation processor
`13 mission objectives
`14 sensor system for detecting and locating environ-
`mental features
`IS position and orientation tracking processo r
`16 steering and speed correction processor
`17 steering and speed controller
`18 clustering processor
`19 compacted scene
`20 spatial pattern recognition processor
`21 list of matched lines compilation processor
`22 list of matched lines
`23 list of matched lines reduction processor
`24 reduced list of matched lines
`25 congruent geometrical figures recognition processor
`
A. Concept

Consider a vehicle, traveling through an environment in which there are a number of features, and equipped with a sensor system that is able to detect the features and to estimate the position of each feature relative to the vehicle. Practical examples include: a spacecraft equipped with an optical imaging and ranging system traveling through a planetary system, an aircraft equipped with radar traveling over a terrestrial area, and an underwater vehicle equipped with sonar traveling over the ocean bottom. In the first example, the relevant features may be celestial objects; in the second example, telephone poles, trees and other landmarks that are detectable by a radar system; and in the third example, rocks, clumps of coral and other features of the ocean bottom that are detectable by a sonar system.

In FIG. 1, a vehicle is shown traveling over an area containing discrete features 1 that are detectable to its sensor system. The illustration shows the vehicle at a previous position 2 and its current position 3. At the previous position 2, the vehicle has a certain field of view 4, in which it detects a number of the features. Let the field of view 5 at the current position overlap the field of view 4 at the previous position. For each field of view, the sensor system provides a set of information, called a scene, comprising the signal intensities produced by the detected features and their estimated positions relative to the vehicle. The position of each feature is represented by a single point in a spatial coordinate system. A number of features 6 lie within the intersection of the two fields of view; consequently they must be represented in the two corresponding scenes. An apparatus that can recognize and match the points representing the common features 6 in the two scenes will enable the vehicle to track its movement from the previous position 2 to the current position 3.

Using the positional information provided by the sensor system, straight lines may be used to connect any set of points within a scene to form a geometrical figure. The geometrical figure is uniquely defined by the lengths of the lines joining all point pairs within the set. This concept is valid in a spatial coordinate system of any number of dimensions. In a two- or three-dimensional
space, the geometrical figure may be considered as a polygon or polyhedron, respectively. By this concept, the common features 6 may be described as a geometrical figure. If all the lines in one geometrical figure are of the same length as the corresponding lines in another geometrical figure, then the two are said to be congruent. It follows from the above definition of the geometrical figure that identical geometrical figures must be congruent. Therefore, the process of recognizing common features in two fields of view may be formulated as one of recognizing congruent geometrical figures between the two corresponding scenes.
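The congruence criterion lends itself to a compact test. The following sketch is illustrative only and assumes the point correspondence between the two figures is already known; the tolerance parameter stands in for the confidence intervals discussed later in the description:

import itertools
import math

def congruent(figure_a, figure_b, tolerance):
    # Two figures, given as sequences of corresponding points (tuples of
    # coordinates in any number of dimensions), are congruent if every
    # pair of corresponding line lengths agrees to within the tolerance.
    if len(figure_a) != len(figure_b):
        return False
    for i, j in itertools.combinations(range(len(figure_a)), 2):
        length_a = math.dist(figure_a[i], figure_a[j])
        length_b = math.dist(figure_b[i], figure_b[j])
        if abs(length_a - length_b) > tolerance:
            return False
    return True

# The same triangle of features seen from two vehicle positions:
scene_1 = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
scene_2 = [(10.0, 2.0), (10.0, 6.0), (7.0, 2.0)]
print(congruent(scene_1, scene_2, tolerance=0.01))   # True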
The geometrical figure formed by the common features 6, and the positions of the vehicle relative to it, are illustrated in FIG. 2. The vehicle positions 2 and 3, relative to the geometrical figure, are constructed from the positional information contained in the two scenes. The change in position of the vehicle from 2 to 3 is equal to the difference between the two position vectors.

In general, a map may be defined as a collection of points in space whose positions relative to accepted geographical references are known, or considered to be known. If the previous position and orientation of the vehicle 2 is known or considered to be known, then the corresponding scene may be converted into a map. Through the recognition of congruent geometrical figures, the current vehicle position may be charted on the map.
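As an illustration of converting a scene into map points (a sketch under assumed conventions, not the patent's embodiment): if the vehicle's map position and heading are known, each vehicle-relative scene point is rotated by the heading and translated by the vehicle position. The two-dimensional case:

import math

def scene_to_map(scene_points, vehicle_xy, heading_rad):
    # Convert vehicle-relative scene coordinates into map coordinates:
    # rotate by the vehicle heading, then translate by its map position.
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    vx, vy = vehicle_xy
    return [(vx + c * x - s * y, vy + s * x + c * y) for x, y in scene_points]

# A feature 5 units dead ahead, vehicle at (100, 200) heading 90 degrees:
print(scene_to_map([(5.0, 0.0)], (100.0, 200.0), math.pi / 2))   # [(100.0, 205.0)]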
With reference to FIG. 3, if a map 7 of an operating area were available, then congruent geometrical figures between a current scene 8 and the map may be used to chart the position of the vehicle 9 in the map. In this way, the course and position of the vehicle may be continuously tracked. This concept may be used to guide a vehicle from a starting position 10 to its destination 11, as illustrated in FIG. 3.

In the absence of a map, a vehicle may guide itself towards a designated destination 11, defined in terms of a distance and bearing from a known starting orientation and position 10, through the following steps: Using the known starting position and orientation, the contents of a scene acquired at the starting position and orientation may be converted into a map. Then, through a series of overlapping scenes linked by congruent geometrical figures, the map may be extended in the direction of the destination by the accumulation of interlocking geometrical figures, as illustrated in FIG. 4. The process may be continued until the requisite distance is covered.
`
B. Process

A description of the invention and its application to vehicle navigation will be given, with particular emphasis on underwater vehicles. A simplified flowchart of a navigation system is shown in FIG. 5. The navigation processor 12 is driven by the mission objectives 13. A sensor system 14 is used to provide the navigation processor with scenes that represent features in the environment. The navigation processor may be subdivided into two component processors, a position and orientation tracking processor 15, and a steering and speed correction processor 16; both may request the sensor system to provide a new current scene 8 as necessary. The steering and speed correction processor drives a steering and speed controller 17 which operates the steering and propulsion units, thus closing the control loop.
`
`6
`In this application, the invention is the position and
`orientation tracking processor 15. The components of
`the navigation system are described in the following
`sections. In particular, the operation of the invention,
`5 that is the position and orientation tracking processor, is
`described in section B.2 and its subsections B.2 a
`through B.2.d.
B.1 The sensor system

A suitable sensor system is used to produce a scene, by detecting the presence of discrete features within the field of view and estimating their positions relative to the vehicle. Many types of sensor systems are capable of producing scenes of this type, such as radar, lidar, and stereoscopic passive sensor systems. For an underwater vehicle, the sensor system is usually a sonar system. A brief description of the operation of a suitable sonar system will be given for completeness.
The sonar system detects features on the ocean bottom through the ability of the features to scatter sound. Features are detected by collecting the backscattered sound signals produced by sound waves impinging on them. The sonar system includes beamformers that are used to separate the backscattered signals into beams according to their directions of arrival. A peak in the intensity of the signals in any beam is indicative of a feature in the corresponding direction; the travel time of the signal peak is measured and used to estimate the range of the indicated feature. Suitably prominent signal peaks are collected. For each peak, the position of the corresponding feature is calculated from the estimated range and direction of arrival. The result is a set of data points that constitute the current scene 8, each data point containing a signal intensity and an estimate of position relative to the sensor position.
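The range-and-bearing geometry just described can be sketched as follows; the sound speed, the two-way travel path and the function names are assumptions of the example, not details given in the text:

import math

SOUND_SPEED = 1500.0   # assumed nominal speed of sound in seawater, m/s

def feature_position(travel_time_s, bearing_rad, intensity):
    # Range follows from the two-way travel time of the echo; the point
    # is placed along the beam's direction of arrival, relative to the
    # sensor at the origin. Returns one scene data point.
    range_m = SOUND_SPEED * travel_time_s / 2.0
    return (range_m * math.cos(bearing_rad),
            range_m * math.sin(bearing_rad),
            intensity)

# An echo arriving after 0.2 s on a beam at 30 degrees lies at 150 m range:
print(feature_position(0.2, math.radians(30.0), intensity=8.5))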
By implication, the sensor position must be at the origin of the coordinate system of the point positions. For simplicity, a principal axis of the coordinate system is aligned with the sensor orientation. Since the sensor system is an integral part of the vehicle, the sensor position and orientation may be considered identical to the position and orientation of the vehicle.
`B.2 Position and orientation tracking processor
`The position and orientation tracking processor is
`illustrated in FIG. 6. It is subdivided into a number of
`component processors described as follows.
B.2.a Clustering processor: In practice, the position of every point in a scene is subject to a degree of uncertainty. At any given level of confidence, the uncertainty may be expressed in terms of a confidence interval. Using simple decision theory methods, the confidence interval of the position of a point is calculated from the feature location accuracy of the sensor system and the selected confidence level. The feature location accuracy of the sensor system is determined by physical factors and the characteristics of the sensor system. The selected confidence level is a parameter with possible values between 0 and 100%; while there are no set rules regarding its proper value, intermediate values have been found to give satisfactory results.
In practice, more than one data point may be found within the span of a confidence interval. The presence of more than one data point within a confidence interval represents an unnecessary redundancy. Therefore, a clustering processor 18 is used to identify groups of two or more points that occupy a space too small to be reliably resolved by the sensor system at the selected confidence level, and replace each group by a single representative data point at the centroid of the group; the centroid is defined as the average position weighted by signal intensity. Then, a unique identifier is assigned to each data point. An identifier may be a character string, bit pattern or any other suitable form of symbolic information. The result is a compacted scene 19.
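A minimal sketch of the clustering step follows. The greedy single-pass grouping and the choice of summed intensity for the representative point are assumptions of the sketch; the text only requires that unresolvable groups be replaced by an intensity-weighted centroid with a unique identifier:

import math

def compact_scene(points, resolution):
    # points: list of (x, y, intensity). Each remaining point within the
    # resolution distance of a seed point is merged into that seed's
    # group; the group is replaced by one representative point at the
    # intensity-weighted centroid, tagged with a unique symbolic
    # identifier. Returns (identifier, x, y, intensity) tuples.
    remaining = list(points)
    compacted = []
    while remaining:
        seed = remaining.pop(0)
        group = [p for p in remaining
                 if math.dist((p[0], p[1]), (seed[0], seed[1])) < resolution]
        for p in group:
            remaining.remove(p)
        group.append(seed)
        weight = sum(p[2] for p in group)
        cx = sum(p[0] * p[2] for p in group) / weight
        cy = sum(p[1] * p[2] for p in group) / weight
        compacted.append((f"S{len(compacted)}", cx, cy, weight))
    return compacted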
B.2.b The spatial pattern recognition processor: The operation of the spatial pattern recognition processor 20 is based on the criterion: Two geometrical figures are congruent if the straight lines connecting all corresponding point pairs are of equal length. A straight line is defined as the shortest path between two points in a space of one or more dimensions, not necessarily limited to three dimensions.
Before going into the description of the processor itself, there are two important practical aspects that need to be considered: line length accuracy and recognition performance.
Just as there are uncertainties associated with the point positions, there must be uncertainties associated with the lengths of lines joining pairs of points. This uncertainty may also be allowed for in the form of a confidence interval. Thus, two line lengths are considered equal if the difference is within their combined confidence interval. The combined confidence interval may be approximated by the incoherent sum of the resolved confidence intervals of the positions of the four end points.
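Reading "incoherent sum" as a root-sum-square combination (an interpretation, not stated in the text), the line-length matching test might be sketched as:

import math

def lengths_match(length_a, length_b, endpoint_intervals):
    # endpoint_intervals: resolved position confidence intervals of the
    # four end points, combined here as a root-sum-square (an assumed
    # reading of "incoherent sum"). Two line lengths are considered
    # equal if their difference lies within the combined interval.
    combined = math.sqrt(sum(ci ** 2 for ci in endpoint_intervals))
    return abs(length_a - length_b) <= combined

# Lines of 50.0 and 50.4 units, each end point known to within 0.25 units:
print(lengths_match(50.0, 50.4, [0.25, 0.25, 0.25, 0.25]))   # True: 0.4 <= 0.5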
As a consequence of line length uncertainties and other imperfections, it must be concluded that, in practice, there is a finite probability that the performance of the recognition processor may be less than perfect. Following standard decision theory methodology, the performance may be expressed in terms of the probability of detection and the probability of false alarm; in this context, "detection" refers to the proper recognition of congruent geometrical figures, and "false alarm" refers to a false recognition. In order to achieve or exceed a prescribed level of performance, it can be shown that the number of points in the congruent geometrical figures must be equal to or exceed a minimum threshold number. The said threshold number may be calculated from the required probabilities of detection and false alarm, the confidence intervals of the point positions, the dimensions of the compacted scene and the relevant region of the map, and the average densities of the points in the compacted scene and in the map.
Using information available to the navigation system, such as estimated vehicle velocity and elapsed time, it is often possible to limit the search to a relevant region in the map containing all the points that may be expected to form a geometrical figure congruent with another in the compacted scene. Similar search limits may also be applicable within the compacted scene. These limits can help improve performance and reduce costs. By calculating all the combinations and permutations that have to be tested, and given the above practical considerations, it can be shown that, to achieve a useful level of performance, a direct search of scenes and maps produced by sonars would be prohibitively costly in terms of search time and computation resources. A significantly more efficient method, embodied in the spatial pattern recognition processor 20, is hereby disclosed.
`The spatial pattern recognition processor 20 may be
`divided into three parts:
(1) A processor 21 is used for comparing the lengths of straight lines between point pairs in the compacted scene to those of a relevant set of point pairs in the map, and compiling a list of all point pairs of equal line lengths at the required confidence level, known as the list of matched lines 22. The list is a symbolic list comprising a series of entries, each entry containing the identifiers of two points in the compacted scene and the identifiers of two points in the map that are joined by lines of equal length. The list is expected to be quite lengthy, therefore it should be well organized for efficient searching: the contents of each entry should be arranged in a definite order, with the identifiers from the compacted scene and those from the map paired and arranged in a definite order, and the identifiers within each pair ordered in a definite order, such as by alphabetical or numerical order. The entries should also be ordered in a definite order according to their contents, such as by alphabetical or numerical order.
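A minimal sketch of this compilation step (illustrative data structures; scene and map_points are assumed to be dictionaries mapping identifiers to coordinates, and interval is the combined confidence interval from above):

import itertools
import math

def compile_matched_lines(scene, map_points, interval):
    # Sketch of processor 21: list every (scene pair, map pair) whose
    # line lengths are equal within the combined confidence interval.
    # Identifier pairs and the final list are kept sorted so that the
    # list of matched lines can be searched efficiently.
    matched = []
    for a, b in itertools.combinations(sorted(scene), 2):
        d_scene = math.dist(scene[a], scene[b])
        for u, v in itertools.combinations(sorted(map_points), 2):
            if abs(d_scene - math.dist(map_points[u], map_points[v])) <= interval:
                matched.append((a, b, u, v))
    return sorted(matched)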
(2) A list reduction processor 23 is used to reduce the list 22 by a maximum likelihood method. Its flowchart is shown separately in FIG. 7. The process involves the generation of a series of lists. For efficient searching, each list should be organized in a similar way to the list of matched lines. The processor 23 may be divided into four parts:
(a) A processor 30 for producing a tally matrix 31 of the number of matched lines that are common between each point in the compacted scene and another point in the map, by tallying the entries in the list 22 that contain each point pair. The resulting tally matrix 31 comprises a two-dimensional array, with the columns assigned to the points in the compacted scene, one point per column, and the rows assigned to the relevant set of points in the map, one point per row, and containing the number of matched lines common to all point pairs corresponding to the intersecting rows and columns.
(b) A processor 32 for compiling a list of pairs of likely matched points 33, by searching each row and each column of the tally matrix for the maximum tally, and taking the point pairs corresponding to the columns and rows that intersect at each maximum.
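Parts (a) and (b) might be sketched as follows. The nested-dictionary tally and the assumption that each entry pairs its end points in order, as (a, u) and (b, v), are simplifications of the sketch; a fuller implementation would have to consider both possible end-point assignments of each matched line:

from collections import defaultdict

def tally_matrix(matched_lines):
    # Part (a) sketch: count, for each (scene point, map point) pairing,
    # how many matched lines contain it. Each entry (a, b, u, v) is taken
    # to support the pairings (a, u) and (b, v) - a simplification.
    tally = defaultdict(int)
    for a, b, u, v in matched_lines:
        tally[(a, u)] += 1
        tally[(b, v)] += 1
    return tally

def likely_matched_points(tally):
    # Part (b) sketch: keep the pairings whose tally is the maximum of
    # both the scene point's column and the map point's row.
    best_scene = defaultdict(int)
    best_map = defaultdict(int)
    for (p, q), n in tally.items():
        best_scene[p] = max(best_scene[p], n)
        best_map[q] = max(best_map[q], n)
    return [(p, q) for (p, q), n in tally.items()
            if n == best_scene[p] and n == best_map[q]]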
(c) A processor 34 for searching the list 33 to find the likely corresponding point pairs in the map for every pair of points in the compacted scene that is contained in the list of matched lines 22, and collecting the results into a list of likely matched lines 35.
`(d) An