KEYWORDS: Stochastic processes, Electronic filtering, Motion models, Sensors, Nickel, Monte Carlo methods, Signal processing, Data processing, Current controlled current source
Multi-target filtering for closely-spaced targets leads to degraded performance with respect to single-target filtering
solutions, due to measurement provenance uncertainty. Soft data association approaches like the probabilistic data
association filter (PDAF) suffer from track coalescence. Conversely, hard data association approaches like multiple-hypothesis
tracking (MHT) suffer from track repulsion. We introduce the stochastic data association filter (SDAF) that
utilizes the PDAF weights in a stochastic, hard data association update step. We find that the SDAF outperforms the
PDAF, though it does not match the performance of the MHT solution. We also compare against the recently introduced
equivalence-class MHT (ECMHT), which successfully counters the track repulsion effect. Simulation
results are based on the steady-state form of the Ornstein-Uhlenbeck process, allowing for lengthy stochastic
realizations with closely-spaced targets.
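To make the SDAF update step concrete, the following is a minimal Python sketch of the idea as described above: PDAF-style association weights are computed for the validated measurements, and a single hard association is then sampled from those weights and used in a standard Kalman update. The function name, arguments, and clutter handling are illustrative assumptions, not the authors' implementation, and the Ornstein-Uhlenbeck target simulation used in the paper is not shown.

```python
import numpy as np

def sdaf_update(x, P, zs, H, R, pd, clutter_density, rng):
    """One illustrative SDAF-style update (sketch, not the authors' code).

    x, P            : prior state mean and covariance
    zs              : (m, dim_z) array of validated measurements
    H, R            : measurement matrix and noise covariance
    pd              : detection probability
    clutter_density : spatial density of clutter measurements
    rng             : numpy random Generator
    """
    S = H @ P @ H.T + R                              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                   # Kalman gain
    z_pred = H @ x
    d = zs - z_pred                                  # innovations, one per measurement
    Sinv = np.linalg.inv(S)
    norm = 1.0 / np.sqrt(np.linalg.det(2.0 * np.pi * S))
    lik = norm * np.exp(-0.5 * np.einsum('ij,jk,ik->i', d, Sinv, d))
    # PDAF-style weights; index 0 is the "no target-originated measurement" event
    w = np.concatenate(([(1.0 - pd) * clutter_density], pd * lik))
    w = w / w.sum()
    # Stochastic hard association: sample one hypothesis instead of averaging
    j = rng.choice(len(w), p=w)
    if j == 0:
        return x, P                                  # treated as a missed detection
    x_new = x + K @ d[j - 1]
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```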
KEYWORDS: Electronic filtering, Filtering (signal processing), Electronic support measures, Optimal filtering, Digital filtering, Surveillance, Data fusion, Radar, Time metrology, Monte Carlo methods
Fusion of passive electronic support measures (ESM) with active radar data enables tracking and identification of
platforms in air, ground, and maritime domains. An effective multi-sensor fusion architecture adopts hierarchical
real-time multi-stage processing. This paper focuses on the recursive filtering challenges. The first challenge is to
achieve effective platform identification based on noisy emitter type measurements; we show that while optimal
processing is computationally infeasible, a good suboptimal solution is available via a sequential measurement
processing approach. The second challenge is to process waveform feature measurements that enable
disambiguation in multi-target scenarios where targets may be using the same emitters. We show that an approach
that explicitly considers the Markov jump process outperforms the traditional Kalman filtering solution.
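As an illustration of the sequential measurement processing idea for platform identification, the sketch below recursively updates a discrete platform-class posterior from one noisy emitter-type declaration at a time, using an emitter-type confusion matrix and per-class emitter fits. All matrices, class counts, and numbers here are toy assumptions, not values from the paper.

```python
import numpy as np

def update_platform_id(posterior, emitter_meas, confusion, platform_emitters):
    """Sequential Bayesian update of a platform-class posterior from one noisy
    emitter-type measurement (illustrative sketch; inputs are assumptions).

    posterior         : (n_platforms,) current class probabilities
    emitter_meas      : index of the declared emitter type
    confusion         : (n_types, n_types) P(declared type | true type)
    platform_emitters : (n_platforms, n_types) P(true type | platform class)
    """
    # Likelihood of the declaration under each class, marginalizing the true type
    lik = platform_emitters @ confusion[:, emitter_meas]
    posterior = posterior * lik
    return posterior / posterior.sum()

# Toy usage (numbers are assumptions, not from the paper)
post = np.full(3, 1.0 / 3)                       # 3 platform classes, flat prior
confusion = np.array([[0.8, 0.1, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.1, 0.1, 0.8]])          # emitter-type confusion matrix
platform_emitters = np.array([[0.9, 0.1, 0.0],
                              [0.0, 0.9, 0.1],
                              [0.1, 0.0, 0.9]])  # emitter fit for each class
for z in [0, 0, 1]:                              # three sequential declarations
    post = update_platform_id(post, z, confusion, platform_emitters)
print(post)
```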
This paper addresses multi-sensor surveillance where some sensors provide intermittent, feature-rich information.
Effective exploitation of this information in a multi-hypothesis tracking context requires computationally intractable
processing with deep hypothesis trees. This report introduces two approaches to address this problem and compares
them to single-stage, track-while-fuse processing. The first is a track-before-fuse approach that provides computational
efficiency at the cost of reduced track continuity; the second is a track-break-fuse approach that is computationally
efficient without sacrificing track continuity. Simulation and sea trial results are provided.
The track repulsion effect induces track swapping in difficult target-crossing scenarios. This paper provides a simple
analytical model for the probability of successful tracking in this setting. The model provides a means to quantify the
degree-of-difficulty in target-crossing scenarios. We analyze model-based performance predictions for a range of
scenario parameters. Additionally, we provide simulation results with a multi-hypothesis tracker that confirm the
increased performance challenge in crossing-target settings as the ambiguity persists longer, i.e., as the targets cross
more slowly.
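A back-of-envelope calculation illustrates why slower crossings are harder (this is not the paper's analytical model): the time the two targets spend inside a common association gate scales inversely with their relative crossing speed.

```python
def ambiguity_duration(gate_radius_m, relative_speed_mps):
    """Rough time two crossing targets spend inside a common association gate
    (illustrative only, not the paper's analytical model): slower crossings
    give a longer ambiguous interval."""
    return 2.0 * gate_radius_m / relative_speed_mps

for v in [1.0, 2.0, 5.0, 10.0]:   # assumed relative closing speeds in m/s
    print(v, ambiguity_duration(gate_radius_m=50.0, relative_speed_mps=v))
```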
KEYWORDS: Sensors, Target detection, Detection and tracking algorithms, Surveillance, Sensor networks, System on a chip, Monte Carlo methods, Sensor fusion, Statistical analysis, Data fusion
Multi-sensor tracking holds the potential for improving the surveillance performance achieved through single-sensor
tracking. This potential has been demonstrated in many domains, including, at NURC, multi-static
undersea surveillance. Nonetheless, the issue remains of how best to process data in large sensor networks. This
issue is taken up in this paper. We are interested in comparing multi-sensor scan-based tracking with a two-stage
approach: static fusion followed by scan-based tracking. This paper focuses on some candidate methodologies
for static fusion. The methods developed in this paper fall into two categories. The scan-based approach
leverages the Gaussian mixture probability hypothesis density (GM-PHD) filter; the batch approaches are
based on scan statistics and on the multi-hypothesis PDA (MHPDA). Preliminary simulation-based
performance analysis suggests that the MHPDA approach to static fusion is the most robust in dealing with
closely spaced targets and small sensor networks. Leveraging the results presented here, follow-on work will
address the determination of an optimal fusion and tracking architecture. In particular, we will compare scan-based
tracking based on the NURC distributed multi-hypothesis tracker (DMHT) against MHPDA processing followed
by scan-based tracking (with the DMHT). We anticipate that, for large sensor networks, the latter approach will
outperform the former.
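The MHPDA and GM-PHD static-fusion methods are not reproduced here; as a minimal illustration of the static-fusion stage, the sketch below fuses synchronous position measurements of a single target from several sensors by information-weighted least squares. Function names and covariances are assumptions for illustration.

```python
import numpy as np

def fuse_position_measurements(zs, Rs):
    """Information-weighted (least-squares) fusion of synchronous position
    measurements of one target from several sensors. This is only the simplest
    static-fusion building block, not the MHPDA or GM-PHD methods in the paper.

    zs : list of (2,) position measurements
    Rs : list of (2, 2) measurement covariances
    """
    info = np.zeros((2, 2))
    info_state = np.zeros(2)
    for z, R in zip(zs, Rs):
        Rinv = np.linalg.inv(R)
        info += Rinv                      # accumulate information
        info_state += Rinv @ z
    P_fused = np.linalg.inv(info)
    return P_fused @ info_state, P_fused

# Toy example: two sensors with complementary error ellipses (assumed numbers)
z_fused, P_fused = fuse_position_measurements(
    [np.array([100.0, 205.0]), np.array([102.0, 198.0])],
    [np.diag([25.0, 400.0]), np.diag([400.0, 25.0])])
```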
Multi-sensor fusion of data from maritime surveillance assets provides a consolidated surveillance picture that serves as
a basis for downstream semi-automated anomaly-detection algorithms. The fusion approach that we pursue in this
paper leverages technology previously developed at NURC for undersea surveillance. We provide illustrations of the
potential of these techniques with data from recent at-sea experimentation.
In many active sonar tracking applications, targets frequently undergo fading detection performance in which the target's detection probability can shift suddenly from a high to a low value. This characteristic is a function of the undersea environment. Using a multistatic active sonar problem, we examine the performance of track management (initiation and termination) routines where the target detection probability is based on an underlying Hidden Markov Model (HMM) with high and low detection states. Using a likelihood ratio test, we develop the optimum track initiation performance as measured by a System Operating Characteristic, similar to a Receiver Operating Characteristic, which plots the probability of initiating a true track versus the probability of initiating a false track. We show that near-optimal performance can be attained using track initiation logic that differentiates the measurements as to receiver source in an "M detections of N scans from C sensors" type of rule. Performance can further be improved by using a composite track initiation test that combines two or three such rules in a logical OR operation. We next show that the use of a Shiryaev-Roberts test for track termination yields the quickest detection of false tracks for a given duration of true target tracks when compared to a Page test and rule-based tests of the form "M or fewer detections from K scans".
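Two of the ingredients above are simple enough to sketch: an "M detections of N scans from C sensors" confirmation rule and the Shiryaev-Roberts recursion used for track termination. The code below is an illustrative rendering under assumed interfaces, not the paper's implementation; in particular, the per-scan likelihood ratios and thresholds are placeholders.

```python
import numpy as np

def m_of_n_from_c(detection_matrix, m, c):
    """"M detections of N scans from C sensors" confirmation rule (sketch).
    detection_matrix : (N_scans, N_receivers) boolean detection history.
    Confirms if at least m scans contain a detection and at least c distinct
    receivers contributed a detection over the window."""
    scans_with_det = detection_matrix.any(axis=1).sum()
    receivers_used = detection_matrix.any(axis=0).sum()
    return scans_with_det >= m and receivers_used >= c

def shiryaev_roberts_terminate(likelihood_ratios, threshold):
    """Shiryaev-Roberts test for track termination (sketch): the statistic
    accumulates per-scan likelihood ratios favouring 'track is false' over
    'track is true'; termination is declared when it crosses the threshold."""
    r = 0.0
    for k, lam in enumerate(likelihood_ratios):
        r = (1.0 + r) * lam
        if r >= threshold:
            return k          # scan index at which the track would be dropped
    return None
```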
KEYWORDS: Target detection, Signal to noise ratio, Performance modeling, Sensors, Modeling, Surveillance, Data modeling, Statistical modeling, Sensor performance, Systems modeling
This paper develops models for centralized and distributed trackers that account for target fading effects. The models build on earlier work by the author, and introduce: (1) sensor performance statistics based on a standard Rayleigh amplitude assumption; (2) the impact of false contacts on true track formation and maintenance. We exercise the models in a number of ways, to illustrate the strengths and weaknesses of centralized and distributed fusion architectures. Our findings are qualitatively consistent with sea trial based performance analysis.
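One standard ingredient of such models is the detection probability of a Rayleigh-fluctuating (Swerling I) target under a square-law detector, Pd = Pfa^(1/(1+SNR)) with SNR in linear units; the sketch below simply evaluates that relation. The paper's sensor performance statistics are richer than this single formula.

```python
def rayleigh_pd(pfa, snr_db):
    """Detection probability of a Rayleigh-fluctuating (Swerling I) target with
    a square-law detector: Pd = Pfa ** (1 / (1 + SNR)), SNR in linear units."""
    snr = 10.0 ** (snr_db / 10.0)
    return pfa ** (1.0 / (1.0 + snr))

print(rayleigh_pd(pfa=1e-4, snr_db=10.0))   # illustrative numbers
```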
Sonar tracking using measurements from multistatic sensors has shown promise: there are benefits in terms of robustness, complementarity (covariance-ellipse intersection), and, of course, the increased probability of detection that naturally accrues from a well-designed data fusion system. It is not always clear what placement of the sources and receivers gives the best fused measurement covariance for any target, or at least for any target that is of interest. In this paper, we investigate the problem as one of global optimization, in which the objective is to maximize the information provided to the tracker. We assume that the number of sensors is known, so that the optimization is done in a continuous space. We consider different scenarios and numbers of sensors. The strong variability of target strength as a function of aspect is integral to the cost function we optimize. Numerical results are given, suggesting that certain sensor geometries should be used. We also offer a number of intuitive suggestions for sensor layout that do not involve optimization.
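A rough sketch of the optimization formulation follows, under assumptions that differ from the paper (in particular, aspect-dependent target strength is ignored and a generic bistatic range-information objective is used): receiver positions are searched by a global optimizer to maximize the summed log-determinant of the fused position information over a set of assumed target locations.

```python
import numpy as np
from scipy.optimize import differential_evolution

def fused_information(rx_flat, src, targets, sigma_r=50.0):
    """Toy fusion-information objective (a sketch, not the paper's cost):
    sum over targets of log det of the accumulated bistatic-range information."""
    total = 0.0
    for t in targets:
        info = np.zeros((2, 2))
        for rx in rx_flat.reshape(-1, 2):
            # Gradient of the bistatic range (src -> target -> rx) w.r.t. target position
            g = (t - src) / np.linalg.norm(t - src) + (t - rx) / np.linalg.norm(t - rx)
            info += np.outer(g, g) / sigma_r**2
        total += np.log(np.linalg.det(info + 1e-9 * np.eye(2)))
    return total

# Assumed scenario: one source, two targets, two receivers to be placed
src = np.array([0.0, 0.0])
targets = [np.array([3000.0, 1000.0]), np.array([-2000.0, 2500.0])]
bounds = [(-5000.0, 5000.0)] * 4            # (x, y) for each of two receivers
res = differential_evolution(lambda p: -fused_information(p, src, targets),
                             bounds, maxiter=50, seed=0)
print(res.x.reshape(-1, 2))                 # optimized receiver positions
```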
Multistatic active sonar has significant potential for littoral surveillance. This paper describes a multistatic tracking algorithm, and provides performance analysis with both simulated and real multistatic sonar data. We find that sensor fusion and target tracking provide significant added value in the sonar processing chain. In particular, we are able to drastically reduce the number of objects that an operator must contend with, both by removing large numbers of false contacts as well as by associating true contacts and establishing tracks on moving targets and fixed clutter points. The association of contacts allows recursive filtering algorithms to process kinematic measurements and provides localization and velocity information with much smaller errors than are present in multistatic contact data.
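The recursive filtering stage referred to above can be illustrated with a nearly-constant-velocity Kalman filter applied to the kinematic measurements of an associated contact; the sketch below is generic, with placeholder noise levels, and is not the paper's tracker.

```python
import numpy as np

def cv_kalman_step(x, P, z, dt, q=0.05, r=100.0):
    """One predict/update cycle of a nearly-constant-velocity Kalman filter on an
    associated contact (sketch; q, r are placeholder noise assumptions).
    State x = [px, py, vx, vy]; measurement z = [px, py]."""
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                  [0, 0, 1, 0], [0, 0, 0, 1]], float)
    Q = q * np.diag([dt**3 / 3, dt**3 / 3, dt, dt])   # simplified process noise
    H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
    R = r * np.eye(2)
    x, P = F @ x, F @ P @ F.T + Q                     # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)                           # update with the contact position
    P = (np.eye(4) - K @ H) @ P
    return x, P
```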