United States Patent [19]
Harris

[11] Patent Number: 5,307,289
[45] Date of Patent: Apr. 26, 1994
[54] METHOD AND SYSTEM FOR RELATIVE GEOMETRY TRACKING UTILIZING MULTIPLE DISTRIBUTED EMITTER/DETECTOR LOCAL NODES AND MUTUAL LOCAL NODE TRACKING

[75] Inventor: James C. Harris, Vienna, Va.
[73] Assignee: Sesco Corporation, Vienna, Va.
[21] Appl. No.: 758,782
[22] Filed: Sep. 12, 1991
[51] Int. Cl.: G01S 13/06
[52] U.S. Cl.: 364/516; 364/460
[58] Field of Search: 364/460, 559, 516; 342/352, 457, 191, 356
[56] References Cited

U.S. PATENT DOCUMENTS

3,630,079  12/1971  Hughes et al.
3,742,498   6/1973  Dunn
3,836,970   9/1974  Reitzig .................... 342/352
3,866,229   2/1975  Hammack
3,953,856   4/1976  Hammack
3,996,590  12/1976  Hammack
4,347,996   9/1982  Grosso
4,560,120  12/1985  Crawford et al.
4,596,988   6/1986  Wanka ..................... 342/457
4,651,156   3/1987  Martinez ................... 342/457
4,713,768  12/1987  Kosaka et al.
4,853,863   8/1989  Cohen et al.
4,884,208  11/1989  Marinelli et al. ........... 364/460
4,916,455   4/1990  Bent et al.
4,976,619  12/1990  Carlson
5,012,424   4/1991  Dodson
5,014,006   5/1991  Counselman, III ............ 342/352
5,017,925   5/1991  Bertiger et al. ............ 342/352
5,019,827   5/1991  Wilson ..................... 364/460
5,148,179   9/1992  Allison
5,150,310   9/1992  Greenspun et al.
OTHER PUBLICATIONS

"Multiple Site Radar Tracking System", B. H. Cantrell and A. Grindlay, IEEE International Radar Conference, Apr. 1980, pp. 348-354.
"Decentralized Processing in Sensor Arrays", Mati Wax and Thomas Kailath, IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. ASSP-33, No. 4, Oct. 1985, pp. 1123-1128.
Primary Examiner-Jack B. Harvey
Assistant Examiner-Thomas Pesso
Attorney, Agent, or Firm-Hoffman, Wasson & Gitler
[57] ABSTRACT

A method and system for tracking various objects utilizing a plurality of sensors. Separate locations or platforms are provided with a number of sensors collocated with an energy generation/reflection device, and also a communication device. Each of the platforms is termed a local node of a multi-sensor fusion system, and possibly can experience relative translational and/or rotational motion in as many as three dimensions with respect to itself and with respect to similar local nodes. Each local node is capable of measuring some combination of bearing angles and/or range and/or respective derivatives from the local node to cooperative local nodes by generating or reflecting energy such that cooperative local nodes may obtain mutual sensor measurements. Information obtained or processed by each local node, including track data or track estimates, is possibly transmitted to one or more central nodes denoted as fusion centers provided with processing capabilities. In addition, when an object or multiple objects which are not local nodes are being tracked, at least one cooperative local node can measure bearing angles and/or range and/or respective derivatives from the local node to the other object. After undergoing a series of processes, sensor data from multiple local nodes are combined at the fusion centers to provide estimates of both the relative geometry and relative orientation of each cooperative local node with respect to other cooperative local nodes and the relative geometry of other sensed objects with respect to each cooperative local node. Estimated relative geometries are either range normalized or scaled with actual ranges depending upon sensor capabilities.

27 Claims, 11 Drawing Sheets
[Representative drawing legend: Communication Path; Sensor Capability; Communication Capability; Data Fusion Capability; Energy Emission Capability]
FIG. 1 (Prior Art) [drawing, Sheet 1 of 11]
FIG. 2 (Prior Art) [Sheet 2 of 11: information flow diagram with blocks Sensor Alignment and Geometry Calibration; Measurement; Communication; Object Association/Tracking; Earth Coordinate Mapping and Data Fusion; Application Interface]
FIG. 3 [drawing, Sheet 3 of 11]
FIG. 4 [Sheet 4 of 11; legend: Communication Path; Sensor Capability; Communication Capability; Data Fusion Capability; Energy Emission Capability]
FIG. 5 [Sheet 5 of 11; legend: Communication Path; Sensor Capability; Communication Capability; Data Fusion Capability; Energy Emission Capability]
FIG. 6 [Sheet 6 of 11; legend: Communication Path; Sensor Capability; Communication Capability; Data Fusion Capability; Energy Emission Capability]
FIG. 7 [Sheet 7 of 11: information flow diagram; visible blocks include Measurement; Own Body Motion Elimination; Object Association/Tracking; Mutual Orientation; Perspective Mapping; Application Interface]
FIG. 8 [Sheet 8 of 11: information flow diagram; local node processes include Measurement, Own Body Motion Elimination, Local Object Association/Tracking, and Communication; fusion center processes include Communication, Object Association/Tracking, Relative Geometry, Mutual Orientation, and Application Interface]
FIG. 9 [Sheet 9 of 11: information flow diagram; visible blocks include Measurement, Communication, Object Association/Tracking, Own Body Motion Elimination, fusion center processes, and Application Interface]
FIG. 10 [Sheet 10 of 11: Euler angle diagram]
FIG. 11 [Sheet 11 of 11: triangle with vertices at two SW&RM local nodes and a third object]
METHOD AND SYSTEM FOR RELATIVE GEOMETRY TRACKING UTILIZING MULTIPLE DISTRIBUTED EMITTER/DETECTOR LOCAL NODES AND MUTUAL LOCAL NODE TRACKING

CROSS-REFERENCED DISCLOSURE DOCUMENT

This invention relates in part to subject matter described in Disclosure Document No. 235417.
FIELD OF THE INVENTION

The present invention pertains to the general field of object state tracking, and more specifically to the field of spatially distributed multi-sensor object state tracking. Related fields are those which require some subset of estimates of the relative geometries and relative orientations between multiple sensor platforms and the relative geometries between multiple sensor platforms and other objects. Related fields include navigation, guidance, surveillance, landing aids, fire control, and robotic motion control.
BACKGROUND OF THE INVENTION

The process of object state tracking has been accomplished for many years in a myriad of different ways. State tracking implies that some qualities of an object's geometry relative to a sensing device are being followed and estimated. These qualities are estimated by sensing the generated/reflected energy emissions of the object. Qualities of relative geometry which are tracked include range and/or bearing and/or the respective derivatives from a sensor to the energy source. The expressions state tracking and tracking are often used synonymously in the literature and will be used as such within this document. Early tracking methods often utilized single sensors. To achieve improved tracking accuracy these single sensors were upgraded or replaced with sensors having improved accuracy. Current methods continue to primarily utilize single sensors, although a trend is developing toward mixed mode and multi-sensor systems to overcome the limitations of single sensor systems.

Mixed mode systems utilize different types of sensors such as combined Radio Frequency (RF) and optic sensors collocated upon the same platform. Mixed mode systems are generally utilized when one sensor type complements the capabilities of another sensor type. One sensor type, for example, might have long range detection capability for initial tracking, and another collocated sensor type which has better but range limited accuracy is utilized to provide improved short range tracking. An example of a mixed mode multiple sensor system is U.S. Pat. No. 3,630,079, issued to Hughes.

Multi-sensor systems are utilized to overcome several limitations of single sensor systems. Multiple sensors provide an increasing quantity of available measurements as additional sensors are utilized. A greater number of measurements from multiple collocated sensors, for example, is combined to improve the statistics of tracking system estimates. Additionally, single sensor systems encounter significantly decreased accuracy when tracked objects are in poor relative geometry with the sensor. Multiple geometrically distributed sensors can significantly relieve this problem by viewing the object from different geometric perspectives. Another limitation of single sensor systems is that they are unable to provide information about the relative orientation of multiple bodies, whereas multiple sensor systems have this capability.

The field of spatially distributed multi-sensor tracking is an emerging one, having its major roots beginning around 1980 with developments sponsored by MIT Lincoln Labs. The problems addressed in this field are typically so highly constrained that results are usually not reusable in different multi-sensor tracking situations. Prior art systems, for example, are typically constrained with sensor array formations whereby sensors are permanently fixed at well known (a priori) relative locations and/or orientations. FIG. 1 depicts a typical prior art sensor platform arrangement. Sensors are arranged in an array grid having well known and often equal spacings between sensor elements, i.e., rx and ry known. Position vectors between sensor platforms are typically either directly measured with distance measuring equipment, or inferred through the use of an external absolute coordinate determination system such as any navigation system or Global Positioning System (GPS). The relative orientation between the coordinate frames in which pairs of sensor elements function is also typically well known, often identical, and not allowed to change dynamically. Sensors utilized in prior art multi-sensor systems, for example, are very often located upon the same rigid body. Additionally, these sensor arrays are not allowed to experience Own-Body motion or relative motion, and three dimensional problems are often approximated with substantially inaccurate two dimensional models. Prior art multi-sensor tracking methods also typically do not have the flexibility to utilize any combination of range, bearing, and respective derivative information as such information is available. Most major prior art developments are related to either distributed acoustic sensors or distributed ground based radars. Examples include Mati Wax and Thomas Kailath, "Decentralized Processing in Sensor Arrays", published in IEEE Trans. Acous. Speech Sig. Proc., ASSP-33, October 1985, pp. 1123-1128, and Cantrell, B. H., and A. Grindlay, "Multiple Site Radar Tracking System", published in Proc. IEEE Int. Radar Conf., April 1980, pp. 348-354.

A typical prior art distributed multi-sensor data fusion information flow diagram is shown in FIG. 2. The first process represented by Block I is to estimate relative sensor positions and alignments. A common prior art example is to align cooperative ground based radars with magnetic or true north. Alignment information is passed to Block V where it is stored for future use. The Measurement process, Block II, provides sensor data measurements of various sensed objects from the radar sites (local nodes) to a central processing agent via Block III, the Communication process. At the central node, the Object Association and Tracking process, Block IV, associates sensor data with common targets and updates object track filters as required. Results are passed to Block V, the Earth Coordinate Mapping and Fusing process, whereby fusion estimates are generated in a common coordinate frame, such a coordinate frame typically being earth coordinates. The fusion estimates are then passed to the Application Interface process, Block VI, which makes the estimates available to the application.
There are many different fusion system architectures which can be implemented to optimize performance under the given multi-sensor tracking system constraints. Examples of fusion system architectures include hierarchical, centralized tracking, and sensor level tracking. Sensor level tracking systems form object tracks at the sensor level. Centralized tracking systems gather sensor data at a single node and all tracking and fusion processing takes place at the central level, or central node. Hierarchical architectures combine the sensor data from groups of local nodes at an intermediate level. Intermediate level nodes feed higher level nodes until possibly reaching a central level. Any node where data processing takes place is generally referred to in the literature as an agent node. Any node where data from multiple sensors is combined (fused) is termed a fusion agent node or fusion node. The combination of a sensing device and a capability for communications with agent nodes is generally referred to as a local node. A local node which is also an agent node is sometimes additionally referred to as a local agent node.

A closely related area is that of multiple object track association. Developments in this area are concerned with associating a set of multiple objects tracked by a sensor with the set of objects tracked by another sensor. Objects appearing to have identical trajectories and falling within a confidence contour (gate) are determined to be common to each set of tracked objects. Early work in this area was concerned with the problem of handing off a tracked object from one ground based radar to another. Multiple object track association has more recently received amplified attention due to programs sponsored by the U.S. Army and the Strategic Defense Initiative Organization (SDIO) for analysis of extended threat clouds. Much of the scholastic research in this area is occurring at the University of Connecticut Department of Electrical and Systems Engineering. Examples of work in this area include Blackman, S. S., "Multiple Target Tracking with Radar Applications", published by Artech House, Dedham, MA, 1986, and Bar-Shalom, Y., and T. E. Fortmann, "Tracking and Data Association", published by Academic Press, New York, 1988.
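By way of illustration only (a minimal sketch, not taken from the patent; the two-dimensional state and the 99% chi-square threshold are editorial assumptions), two track estimates from different sensors can be declared a common object when their difference falls inside a confidence gate derived from the combined covariance:

import numpy as np

def gate_tracks(x_a, P_a, x_b, P_b, gate=9.21):
    """Associate two track estimates if their difference lies inside a
    chi-square confidence contour (9.21 is roughly 99% for 2 degrees of freedom)."""
    d = np.asarray(x_a) - np.asarray(x_b)          # difference between the tracks
    S = np.asarray(P_a) + np.asarray(P_b)          # combined covariance
    m2 = float(d @ np.linalg.solve(S, d))          # squared Mahalanobis distance
    return m2 <= gate, m2

# Two sensors reporting nearly the same aircraft position (km)
same, m2 = gate_tracks([10.0, 5.0], np.diag([0.04, 0.04]),
                       [10.1, 5.1], np.diag([0.09, 0.09]))
print(same, round(m2, 2))   # True 0.15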
A research area just now receiving attention is concerned with a process termed registration. Registration is the process of determining the relative orientation of one sensor to that of cooperating sensors. The prior art typically does not consider the case of dynamic relative sensor orientations. Cooperative sensors in prior art multi-sensor systems, for example, are typically not located upon different platforms having relative Degrees of Freedom. A representative example of the prior art is one that determines the relative orientation of earth fixed cooperative sensors, a specific example being multiple cooperative ground based radar sites. Prior art techniques for accomplishing the registration process are typically restricted to determining bias offsets about only a single coordinate axis, such as determining the azimuth offset of cooperative ground based radar sites. This is accomplished through various forms of stochastic filtering, including a model of the geometry of multiple radar sites and the tracks of mutually tracked aircraft. An example of efforts in the area of sensor registration is Fischer, W. L., C. E. Muehe, and A. G. Cameron, "Registration Errors in a Netted Air Surveillance System", MIT Lincoln Laboratory Technical Note 1980-40, Sep. 2, 1980, AD-A093691.
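A minimal sketch of that single-axis registration idea, under the simplifying assumptions that the only error is a constant azimuth bias at the second radar site and that a batch average stands in for the stochastic filter described above (site coordinates and units are illustrative):

import numpy as np

def azimuth_bias(site_b, tracks_from_a, azimuths_at_b):
    """Estimate a constant azimuth offset of radar B from aircraft positions
    tracked by radar A (batch average; a filter would do this recursively)."""
    predicted = [np.arctan2(y - site_b[1], x - site_b[0]) for x, y in tracks_from_a]
    residuals = np.unwrap(np.asarray(azimuths_at_b) - np.asarray(predicted))
    return residuals.mean()

# Radar B at (50, 0) km with a 2-degree mounting bias
site_b = (50.0, 0.0)
aircraft = [(20.0, 30.0), (60.0, 40.0), (10.0, -25.0)]
true_bias = np.radians(2.0)
measured = [np.arctan2(y, x - 50.0) + true_bias for x, y in aircraft]
print(np.degrees(azimuth_bias(site_b, aircraft, measured)))   # ~2.0 degrees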
Examples of other patented multi-sensor tracking systems are U.S. Pat. Nos. 4,916,455 issued to Bent et al, 4,976,619 issued to Carlson, and 4,853,863 issued to Cohen et al. These systems utilize cooperative sensors having no relative motion at precalibrated relative positions. The patents to Bent et al and Carlson accomplish position tracking utilizing range-only triangulation whereby the sensor platform orientation is not required or estimated. The patent to Cohen et al uses arrays composed of three non-colinear sensors having known relative positions and orientations which are located upon a 6 Degree of Freedom (6DOF) platform. A different 6DOF platform has three non-colinear emitters at known positions. Geometric relationships are utilized to determine the relative orientations of the two platforms.

OBJECTS OF THE INVENTION

The primary object of this invention is to provide a novel method for relative geometry and relative orientation state tracking which can obtain much greater accuracies than the prior art. An additional object is to provide a modular method such that any fusion system architecture can be implemented by simply rearranging the basic processing blocks. Additional objects of the invention include elimination of the following prior art restrictions: that tracking sensor platforms are in well known (a priori) permanently fixed locations; that tracking sensor platforms require utilization of an external absolute coordinate determination system such as any navigation system or GPS to estimate relative platform geometries; that the relative orientation between the coordinate frames in which pairs of sensor elements function is well known and not allowed to change dynamically; that tracking sensor platforms do not experience their own dynamic motion; that tracking sensor platforms do not experience relative dynamic motion; that there is no flexibility for utilization of any combination of range, bearing, and respective derivative information as available.
SUMMARY OF THE INVENTION

The method of the present invention and the system utilizing this method will be referred to as The Smart Weapon Adjustable Aspect and Ranging Munition (SW&RM) Tracking Method which has numerous advantages over the prior art, whereby a collection of several distinct processes are utilized to accomplish mutual local node relative geometry and relative orientation state tracking, and relative geometry tracking of other energy emission sources. The invention addresses significant deficiencies in several areas of the prior art of multi-sensor tracking, including most notably, the areas of sensor platform motion, relative geometry determination, and multiple platform sensor orientation registration.

The acronym SW&RM may be misleading since the SW&RM Tracking Method is capable of accomplishing angle-only multi-sensor tracking using range normalized coordinates when only bearing information is available. There are applications where inferring or measuring range is not necessary or required, an example being Line of Sight (LOS) guidance. Additionally, there are numerous applications where a local node is not a munition, examples being command guidance, fire control systems, landing aids, and pure tracking applications such as surveillance and airport air traffic control.
The SW&RM Tracking Method requires platforms containing one or more sensing devices collocated with an energy generation/reflection device, and also a communication system. These platforms are termed local nodes of a multi-sensor fusion system, and possibly experience relative translational and/or rotational motion in as many as three dimensions with respect to themselves and with respect to similar local nodes. Each local node is capable of measuring some combination of bearing angles and/or range and/or respective derivatives from the local node to cooperative local nodes and can generate or reflect energy by which cooperative local nodes may obtain mutual sensor measurements. Information obtained or processed by each local node, including track data or track estimates, is transmitted to one or more central nodes denoted as fusion centers provided with processing capabilities. In addition, when an object or multiple objects which are not local nodes are being tracked, at least one cooperative local node has a means for measuring bearing angles and/or range and/or respective derivatives from the local node to the other object.

The SW&RM Tracking Method executes a combination of various processes prior to obtaining a multiple local node fusion track estimate, an example of which is shown in FIG. 4. These processes include: a tracking sensor measurement process termed Measurement; an Own-Body motion identification and sensor data mapping process termed Own-Body Motion Elimination; an object association and tracking process, termed Object Association and Tracking, by which sensor data is received as input by a tracking filter and by which a track data estimate is formed; a communication process, termed Communication, by which sensor data or track data estimates are communicated from local nodes to fusion agents; a process termed Relative Geometry, by which the relative geometry and dynamics thereof between cooperating local nodes and other tracked objects is estimated; a process termed Mutual Orientation, by which the relative orientation and dynamics thereof between cooperating local nodes is estimated; and a process termed Perspective Mapping, whereby the tracks of cooperative local nodes to each other and to other objects are cast into a common coordinate frame by fusion agents and whereby probabilistic weightings are applied to track data estimates and the estimates are combined (fused). The probabilistic weightings are based upon actual track data statistics or upon estimates of track data statistics, such as estimates which are derived from Geometric Dilution of Precision (GDOP) factors.
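As an illustrative sketch of such probabilistic weighting (not the patent's specific filter), track estimates can be combined with weights equal to their inverse covariances; where actual covariances are unavailable, each node's covariance could instead be approximated by scaling a nominal sensor variance with a GDOP factor:

import numpy as np

def fuse_estimates(estimates, covariances):
    """Combine track estimates with weights given by their inverse covariances
    (information-weighted average); a smaller covariance means a greater weight."""
    info = [np.linalg.inv(P) for P in covariances]            # information matrices
    fused_cov = np.linalg.inv(sum(info))
    fused = fused_cov @ sum(W @ np.asarray(x) for W, x in zip(info, estimates))
    return fused, fused_cov

# Two local nodes reporting the same object position (meters)
x1, P1 = np.array([100.0, 200.0]), np.diag([25.0, 25.0])
x2, P2 = np.array([104.0, 196.0]), np.diag([100.0, 100.0])
fused, P = fuse_estimates([x1, x2], [P1, P2])
print(fused)        # [100.8 199.2] -- pulled toward the more accurate node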
Contrary to the prior art, the SW&RM Tracking Method requires that sensor platforms (local nodes) which provide measurement data or track data estimates of any mutually tracked objects to a common data fusion node are also required to provide some subset of measurement data or track data estimates of tracks formed from each cooperative local node to the other. Clearly, an additional requirement of the SW&RM Tracking Method is that cooperative nodes emit or reflect energy that can be mutually detected. The Measurement process, in addition to providing sensor measurement data associated with target objects, is therefore also required to provide some combination of bearing angles and/or range and/or respective derivative measurements between some application dependent subset of cooperative local nodes. Additionally, the Object Association and Tracking process is required to estimate the relative tracks between these cooperative local nodes.

Local node to local node tracking is a unique key element of the SW&RM Tracking Method which separates the instant method and system from the prior art. The availability of local node-to-local node track data at the fusion agents is what allows the SW&RM Tracking Method to accomplish the required processes at a high level of resolution even while local nodes experience Own-Body motion and relative motion. There are situations where some local node to local node tracks are not available, and in which the SW&RM Tracking Method is still applicable and makes sense. An example is the virtual object tracking problem where a local node to local node track is broken, or where a particular local node is not capable of tracking another local node, but where the track can be inferred based on the observations of other local nodes. The SW&RM Tracking Method begins to break down and lose its utility when fewer local node to local node tracks are available. The method approaches equivalency with the prior art in those situations where relative local node to local node track information is not available, and relative local node positions and orientations are well known.

The Own-Body Motion Elimination process identifies the pertinent parameters of the Own-Body dynamic motion of a local node. A motion filter is employed which maps data obtained from the Measurement process within the local node body coordinates onto a synthetic coordinate frame whereby the effects of Own-Body motion are removed from the data. Such a synthetic coordinate frame of a local node is termed its Egocentric Coordinate Frame. This process allows a multi-sensor tracking system local node to experience Own-Body motion with little statistical impact upon fusion estimates. If no Own-Body motion occurs, then the Egocentric Coordinate Frame is simply oriented at some arbitrary constant offset from the sensor orientation.
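A minimal sketch of the mapping into an Egocentric Coordinate Frame, under the simplifying assumptions of planar geometry, yaw-only Own-Body motion, and a perfectly identified attitude (the patent employs a motion filter rather than this direct rotation):

import numpy as np

def to_egocentric(los_body, yaw):
    """Rotate a body-frame line-of-sight unit vector into an egocentric frame
    that stays fixed while the platform yaws (2-D, yaw-only illustration)."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s],
                  [s,  c]])           # body -> egocentric rotation
    return R @ np.asarray(los_body)

# A stationary target seen while the platform yaws 30 degrees between looks
first  = to_egocentric([1.0, 0.0], yaw=0.0)
second = to_egocentric([np.cos(-np.pi / 6), np.sin(-np.pi / 6)], yaw=np.pi / 6)
print(np.allclose(first, second))     # True: the Own-Body rotation is removed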
The Object Association and Tracking process performs the tasks of data association and object tracking. If only bearing information is available during the entire tracking timeline, then range information is not and cannot be inferred and is not utilized. If, however, range rate between a tracked object and a local node is known together with bearing information, then "own ship maneuvers" allow the agent which processes the local node measurement data to estimate range from the local node to the tracked object. Data processing is accomplished at the local node level if the local node is also an agent, or at some other fusion agent level depending upon design considerations such as communication load and system architecture and complexity. Statistical theory indicates that an optimal fusion solution can only be accomplished if all local node sensor measurements are communicated to at least one common fusion agent individually. Suboptimal solutions may be obtained if the Object Association and Tracking process occurs at some level other than a central level, and if each local node communicates sensor data or track data estimates to fusion agents at intervals greater than the individual measurement interval. The optimal fusion solution requires a greater communication capability, results in greater fusion system complexity, and at a minimum only requires a single computational unit. The suboptimal fusion solution, however, places fewer demands upon communication capability, and results in a less complex fusion system, but provides degraded performance and requires computational units for multiple fusion agent nodes.
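The "own ship maneuver" idea can be sketched with plane geometry, assuming a stationary emitter, a known straight-line displacement, and noise-free bearings (the patent's processing is filter-based rather than this closed-form triangulation):

import numpy as np

def range_from_maneuver(bearing_1, bearing_2, baseline):
    """Range from the second observation point to a stationary emitter, given
    absolute bearings (radians, measured from the baseline direction) taken
    before and after an own-ship displacement of length 'baseline'."""
    parallax = bearing_2 - bearing_1          # angle subtended at the emitter
    return baseline * np.sin(bearing_1) / np.sin(parallax)

# Target at (4, 3) km; observer moves from (0, 0) to (1, 0) km along x
b1 = np.arctan2(3.0, 4.0)        # bearing from the first position
b2 = np.arctan2(3.0, 3.0)        # bearing from the second position
print(range_from_maneuver(b1, b2, baseline=1.0))   # ~4.243 km, distance (1,0)-(4,3)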
The Communication process is an information transmission pipeline between local nodes and fusion agents. The actual type of data transmitted and received depends upon system design considerations such as fusion system structure, available communication bandwidth, and desired accuracy.

The Relative Geometry process estimates the relative geometry and dynamics thereof between the various triplets of local node pairs and mutually tracked objects including other local nodes. By relative geometry is meant the shape of the triangle connecting the various combinations of local node pairs and mutually tracked third objects. This process is equivalent to determination of the ratio of the distance from each local node to the third object to the distance between the local node pair. If range information is available, relative geometry also means determination of the triangle leg sizes. Combinations of range and bearing are utilized as the information is available to estimate each triangle shape. The Relative Geometry process may make use of, but does not require, position information from any external navigation system or GPS. Although structures having more legs than three may be estimated through a complex stochastic filter, these structures decompose into statistically equivalent combinations of local node and third object triplets.
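An illustrative range-normalized computation of one such triangle (a sketch under planar, noise-free assumptions; not the stochastic filter contemplated by the specification): given the interior angles measured at the two local nodes between their mutual line of sight and their lines of sight to the third object, the law of sines yields each node-to-object leg as a ratio of the node-to-node leg:

import numpy as np

def normalized_triangle(angle_at_a, angle_at_b):
    """Given the interior angles at local nodes A and B (between the A-B line of
    sight and each node's line of sight to the third object), return the
    distances A-object and B-object as ratios of the A-B distance."""
    angle_at_object = np.pi - angle_at_a - angle_at_b
    ratio_a = np.sin(angle_at_b) / np.sin(angle_at_object)   # |A-object| / |A-B|
    ratio_b = np.sin(angle_at_a) / np.sin(angle_at_object)   # |B-object| / |A-B|
    return ratio_a, ratio_b

# Nodes at A=(0,0) and B=(1,0) (spacing unknown in practice), object at (0.5, 1.0)
ra, rb = normalized_triangle(np.arctan2(1.0, 0.5), np.arctan2(1.0, 0.5))
print(ra, rb)   # ~1.118 each: the object is 1.118 baseline-lengths from both nodes
# If a range measurement later fixes |A-B|, the same ratios scale to actual legs.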
The Mutual Orientation process solves simultaneous equations to estimate the relative orientations and dynamics thereof between pairs of local node Egocentric Coordinate Frames. A stochastic filter tracks the bias angles such that pairs of Egocentric Coordinate Frames may undergo relative rotational motion. The relative orientation between the coordinate frames in which pairs of sensor elements function, therefore, is not necessarily known a priori, and may be dynamic.
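One way to picture the simultaneous-equation solution (an illustrative least-squares sketch, not the patent's bias-tracking stochastic filter): if both nodes express, in their own Egocentric frames, unit line-of-sight vectors toward the same mutually tracked directions, the rotation that best aligns the paired vectors estimates the relative orientation between the two frames:

import numpy as np

def relative_orientation(vecs_in_a, vecs_in_b):
    """Rotation R with R @ b_i approximately equal to a_i for paired unit
    line-of-sight vectors given in node A's and node B's Egocentric frames
    (Kabsch/SVD least-squares fit)."""
    A = np.asarray(vecs_in_a)          # rows: directions seen in A's frame
    B = np.asarray(vecs_in_b)          # rows: the same directions in B's frame
    H = B.T @ A
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

# B's frame is A's frame yawed by 30 degrees; two mutually tracked directions
yaw = np.radians(30.0)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
a_vecs = np.array([[1.0, 0.0, 0.0], [0.0, 0.6, 0.8]])     # unit LOS in A's frame
b_vecs = (R_true.T @ a_vecs.T).T                          # same LOS seen by B
print(np.allclose(relative_orientation(a_vecs, b_vecs), R_true))   # True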
The Perspective Mapping process utilizes the results of previous processes to map track data provided by each local node onto a common coordinate frame. Choice of the common coordinate frame depends upon the required use of the fusion estimate. The data is fused utilizing weightings based on actual or estimated data statistics and forwarded to the Application Interface process, which makes fusion estimates available to the application.

The sequence in which the SW&RM Tracking Method processes are executed depends upon the fusion system architecture. Examples of fusion system architectures include hierarchical, centralized tracking, and sensor level tracking.

FIG. 7 is an information flow diagram of a possible sensor level fusion architecture implementation of the present invention;

FIG. 8 is an information flow diagram of a possible hierarchical fusion architecture implementation of the present invention;

FIG. 9 is an information flow diagram of a possible hierarchical fusion architecture implementation of the present invention;

FIG. 10 is a diagram depicting the Euler angles utilized in the formulation of Euler's Equations of Motion; and

FIG. 11 depicts a triangle having vertices defined by the position of two SW&RM local nodes and a third object according to the present invention.

DETAILED DESCRIPTION OF THE PRESENT INVENTION

The SW&RM Tracking Method is a multi-sensor tracking method which requires local nodes and fusion centers having special capabilities. Each local node at a minimum includes: a device for measuring some combination of bearing angles and/or range and/or respective derivatives from the local node to cooperative local nodes; a device for generating energy or a capability to reflect energy by which cooperative local nodes may obtain mutual sensor measurements; and a device for communicating sensor data or track data estimates from the local node to fusion agents. In addition, when an object or multiple objects which are not local nodes are being tracked, at least one cooperative local node has a means for measuring bearing angles and/or range and/or respective derivatives from the local node to the other object. Each local node which forms object motion tracks locally additionally has a processor. Each fusion center is located within the system according to a selected fusion system architecture, and minimally includes a processor and a communication system.

In contrast to the typical prior art distributed multi-sensor tracking system, SW&RM local nodes may be at any arbitrary position and orientation as depicted by FIG. 3. Additionally, SW&RM local nodes may experience their own dynamic motion and relative translational motion with respect to each other and with respect to any other sensed objects.
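As a structural summary of the minimal local node and fusion center capabilities recited above (a sketch only; the field names are editorial assumptions, not terminology from the specification):

from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class LocalNode:
    """Minimal SW&RM local node: sense, emit or reflect energy, communicate."""
    measure: Callable[[], dict]           # bearings and/or range and/or derivatives
    emit_or_reflect: Callable[[], None]   # makes the node detectable by peer nodes
    communicate: Callable[[dict], None]   # sends data or track estimates to agents
    processor: Optional[Callable[[dict], dict]] = None  # only if it tracks locally

@dataclass
class FusionCenter:
    """Minimal fusion center: a processor plus a communication system."""
    process: Callable[[List[dict]], dict]
    communicate: Callable[[dict], None]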
