Airborne surveillance and targeting sensors are capable of generating large quantities of imagery, making it difficult for the user to find the targets of interest. Automatic target identification (ATI) can assist this process by searching for target-like objects and classifying them, thus reducing operator workload. ATI algorithms developed in the laboratory by QinetiQ have been implemented in real time on ruggedised, flight-capable processing hardware. A series of airborne tests has been carried out to assess the performance of the ATI under real-world conditions, using a Wescam EO/IR turret as the source of imagery. The tests included examples of military vehicles in urban and rural scenarios, with varying degrees of hide and concealment, and were conducted in different weather conditions to assess the robustness of the sensor and ATI combination. This paper discusses the tests carried out and the ATI performance achieved as a function of the test parameters, and draws conclusions on the current state of ATI and its applicability to military requirements.
Future targeting systems aim to extend air-to-ground target search, acquisition, temporal tracking and identification to ranges beyond those currently afforded by forward-looking infrared sensors. One technology option with the potential to fulfil this requirement is hyperspectral imaging, and a promising route to detection and identification at longer ranges is therefore the fusion of data from broadband and hyperspectral sensors. QinetiQ, under the Data and Information Fusion Defence Technology Centre, aims to develop a fully integrated spatial, spectral and temporal target detection, identification and air-to-ground tracking environment. This will integrate current capabilities in target tracking, synthetic scene generation, sensor modelling, and hyperspectral and broadband target detection and identification algorithms into a tool that can be used to evaluate data fusion architectures.
Future targeting systems, for manned or unmanned combat aircraft, aim to provide increased mission success and platform survivability by detecting and identifying even difficult targets at very long ranges. One of the key enabling technologies for such systems is robust automatic target identification (ATI) operating on high-resolution electro-optic sensor imagery. QinetiQ have developed a real-time ATI processor which will be demonstrated with infrared imagery from the Wescam MX15 in airborne trials in summer 2005. This paper describes some of the novel ATI algorithms and the challenges overcome in porting the ATI from the laboratory onto a real-time system, and offers an assessment of likely airborne performance based on analysis of synthetic image sequences.
Several of the sensor technologies employed for producing hyperspectral images use dispersion across an array to generate the spectral content along a 'line' in the scene and then use scanning to build up the other spatial dimension of the image. Infrared staring arrays rarely achieve 100% fully functioning pixels. In single-band imaging applications, 'dead' elements do not cause a problem because simple spatial averaging of neighboring pixels is possible (the assumption that a pixel is similar in intensity to its neighbors is usually reasonably good). However, when the array is used as described above to produce a spectral image, dead elements result in missing spatial and spectral information. This paper investigates several novel techniques to replace this missing information and assesses them against image data of different spatial and spectral resolutions, with the aim of recommending the best technique to use based on the sensor specification. These techniques are also benchmarked against naive spatial averaging.
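The abstract names naive spatial averaging as the benchmark and notes that, in the scanned geometry, a dead detector element removes the same (spatial sample, band) value on every scan line. The sketch below is a minimal illustration of that baseline plus a simple spectral-interpolation counterpart; the array layout, function names and the spectral variant are assumptions for illustration, not the paper's novel techniques.

```python
import numpy as np

def fill_dead_spatial(cube, dead_mask):
    """Naive spatial-averaging baseline: replace each dead value with the
    mean of its valid across-track neighbours in the same band.

    cube      : (lines, samples, bands) array built up by scanning one
                spatial line per frame.
    dead_mask : (samples, bands) bool array; a dead detector element blanks
                the same (sample, band) position on every scan line.
    """
    filled = cube.astype(float).copy()
    _, samples, _ = cube.shape
    for s, b in zip(*np.nonzero(dead_mask)):
        good = [ss for ss in (s - 1, s + 1)
                if 0 <= ss < samples and not dead_mask[ss, b]]
        if good:
            filled[:, s, b] = cube[:, good, b].mean(axis=1)
    return filled

def fill_dead_spectral(cube, dead_mask):
    """Illustrative alternative: interpolate from the adjacent bands at the
    same spatial position, assuming the spectrum varies smoothly with band."""
    filled = cube.astype(float).copy()
    _, _, bands = cube.shape
    for s, b in zip(*np.nonzero(dead_mask)):
        good = [bb for bb in (b - 1, b + 1)
                if 0 <= bb < bands and not dead_mask[s, bb]]
        if good:
            filled[:, s, b] = cube[:, s, good].mean(axis=1)
    return filled
```

The spatial baseline discards all spectral structure at the dead element, which is the weakness the paper's techniques are intended to address; the spectral variant shows the opposite extreme, using only the wavelength dimension.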
The problem of the automatic detection and identification of military vehicles in hyperspectral imagery has many possible solutions. The availability and utility of library spectra, and the ability to atmospherically correct image data, have a great influence on the choice of approach. This paper concentrates on providing a robust solution in the event that library spectra are unavailable or unreliable owing to differing atmospheric conditions between the data and the reference. The development of a number of techniques for the detection and identification of unknown objects in a scene has continued apace over the past few years. A number of these techniques have been integrated into a "Full System Model" (FSM) to provide an automatic and robust system drawing upon the advantages of each. The FSM makes use of novel anomaly detectors and spatial processing to extract objects of interest in the scene, which are then identified by a pre-trained classifier, typically a multi-class support vector machine. From this point onwards, adaptive feedback is used to control the processing of the system. Stages of the processing chain may be augmented by spectral matching and linear unmixing algorithms to achieve optimum results depending upon the type of data. The Full System Model is described, and the boost in performance over each individual stage is demonstrated and discussed.
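The FSM's anomaly detectors are described above only as 'novel', so the sketch below uses the standard global RX detector as a stand-in, followed by connected-component grouping and a pre-trained scikit-learn multi-class SVM, mirroring the detect, extract, identify chain. The threshold, function names and the choice of RX are assumptions; the adaptive feedback, spectral matching and linear unmixing stages are not shown.

```python
import numpy as np
from scipy.ndimage import label
from sklearn.svm import SVC

def rx_anomaly_scores(cube):
    """Global RX detector: squared Mahalanobis distance of each pixel
    spectrum from the scene mean (a stand-in for the FSM's own detectors)."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    diff = pixels - pixels.mean(axis=0)
    inv_cov = np.linalg.pinv(np.cov(pixels, rowvar=False))
    scores = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return scores.reshape(rows, cols)

def detect_and_identify(cube, classifier, threshold):
    """Detect anomalous pixels, group them spatially into candidate objects,
    then identify each object with a pre-trained multi-class classifier."""
    regions, n_objects = label(rx_anomaly_scores(cube) > threshold)
    identities = []
    for region_id in range(1, n_objects + 1):
        mean_spectrum = cube[regions == region_id].mean(axis=0)
        identities.append(classifier.predict(mean_spectrum[None, :])[0])
    return regions, identities

# A pre-trained multi-class SVM on labelled target spectra might be built as:
# classifier = SVC(kernel='rbf').fit(training_spectra, training_labels)
```

In the FSM the output of later stages feeds back to adjust the earlier ones; the chain is shown open-loop here for brevity.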
Emerging hyperspectral imaging technology allows the acquisition of data 'cubes' which simultaneously have high-resolution spatial and spectral components. There is a wealth of information in this data and effective techniques for extracting and processing this information are vital. Previous work by ERIM on man-made object detection has demonstrated that there is a huge amount of discriminatory information in hyperspectral images. This work used the hypothesis that the spectral characteristics of natural backgrounds can be described by a multivariate Gaussian model. The Mahalanobis distance (derived from the covariance matrix) between the background and other objects in the spectral data is the key discriminant. Other work (by DERA and Pilkington Optronics Ltd) has confirmed these findings, but indicates that in order to obtain the lowest possible false alarm probability, a way of including higher order statistics is necessary. There are many ways in which this could be done, ranging from neural networks to classical density estimation approaches. In this paper we report on a new method for extending the Gaussian approach to more complex spectral signatures. By using ideas from the theory of Support Vector Machines we are able to map the spectral data into a higher dimensional space. The coordinates of this space are derived from all possible multiplicative combinations of the original spectral line intensities, up to a given order d, which is the main parameter of the method. The data in this higher dimensional space are then analyzed using a multivariate Gaussian approach. Thus when d equals 1 we recover the ERIM model; in this case the mapping is the identity. In order for such an approach to be at all tractable we must solve the 'combinatorial explosion' problem implicit in this mapping for large numbers of spectral lines in the signature data. To do this we note that in the final analysis only the inner (dot) products between vectors in the higher dimensional space need to be computed, and these can be evaluated efficiently in the original data space. Thus the computational complexity of the problem is determined by the amount of data, rather than the dimensionality of the mapping. The novel combination of non-linear mapping and high dimensional multivariate Gaussian analysis, only possible by using techniques from SVM theory, allows the practical application to hyperspectral imagery. We note that this approach also generates the non-linear Principal Components of the data, which have applications in their own right. In this paper we give a mathematical derivation of the method from first principles. The method is illustrated on a synthetic data set where complete control over the true statistics is possible. Results on this data show that the method is very powerful. It naturally extends the Gaussian approach to a variety of more complex probability distributions, including multi-modal and other manifestly non-Gaussian examples. Having shown the potential of this approach it is then applied to real hyperspectral trials data. The relative improvement in performance over the Gaussian approach is demonstrated for the real data.
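The construction described above (coordinates formed from all multiplicative combinations of the spectral line intensities up to order d, with only inner products ever evaluated explicitly) matches a polynomial-kernel formulation; a plausible reading, with the inhomogeneous polynomial kernel assumed here purely for illustration, is:

```latex
% Baseline (d = 1): Mahalanobis distance of a spectrum x from the background model
\[
  D^{2}(\mathbf{x}) = (\mathbf{x}-\boldsymbol{\mu})^{\top}\,\Sigma^{-1}\,(\mathbf{x}-\boldsymbol{\mu})
\]
% Implicit mapping \Phi_d onto all monomials of the spectral intensities up to
% order d, with inner products evaluated in the original data space:
\[
  \langle \Phi_d(\mathbf{x}), \Phi_d(\mathbf{y}) \rangle
    = \left(\mathbf{x}^{\top}\mathbf{y} + 1\right)^{d}
\]
```

With d = 1 the mapping is (up to a constant coordinate) the identity, recovering the linear Gaussian model, and because the mapped-space covariance is handled entirely through these inner products, the cost scales with the number of samples rather than with the dimensionality of the mapping, which is the tractability argument made in the abstract.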