Evaluating the signature of operational platforms has long been a focus of military research. Human observation of targets in the field is regarded as the most accurate way to assess a target's visible signature, although the results are limited to the observers present in the field. Field observations do not introduce image capture or display artefacts, nor are they completely static like the photographs used in screen-based human observation experiments. A number of papers describe advances in the use of photographs and imagery to estimate the detectability of military platforms; however, few describe advances in conducting human observer field trials.
This paper describes a set of human field observation trials for detecting small maritime craft in a littoral setting. The trials were conducted from the East Arm Port in Darwin in February 2018 with up to six observers at a time, and were used to investigate incremental improvements to the observation process relative to small craft trials conducted in 2013. The location features a high number of potential distractors, which make it more difficult to find the small target craft. The experimental changes aimed to test ways of measuring time to detect, a result not captured in the previous small craft detection experiment, by comparing video monitoring of the observation line with observer-operated stopwatches. The experiment also included the occasional presence of multiple targets of interest in the field of regard. Initial analysis of the time-to-detect data indicates that the video process can accurately assess the time taken by observers to detect targets, but only if the observers are effectively trained. Ideas for further automating the human observer task are also described; however, this system has yet to be implemented. This improved human observer trial process will assist the development of signature assessment models by obtaining more accurate data from field trials, including for targets moving through a dynamic scene.
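As an illustration of the kind of agreement check this comparison enables, the following minimal sketch compares video-derived detection times against stopwatch times for the same detection events. The timing values, and the use of paired differences and a Pearson correlation, are illustrative assumptions rather than the trial's actual analysis.

```python
# Hypothetical sketch: comparing video-derived detection times with
# observer-operated stopwatch times for the same detection events.
# All values below are placeholders, not trial results.
import numpy as np
from scipy import stats

# Detection times in seconds, one value per observer/target detection event.
stopwatch_times = np.array([12.4, 33.1, 8.7, 54.0, 21.3, 40.2])
video_times     = np.array([11.9, 34.0, 9.5, 52.6, 22.1, 41.0])

# Paired differences indicate any systematic bias between the two methods
# (e.g. observer reaction delay when starting or stopping a stopwatch).
diff = stopwatch_times - video_times
print(f"mean bias: {diff.mean():.2f} s, std: {diff.std(ddof=1):.2f} s")

# Overall agreement between the two timing methods.
pcc, _ = stats.pearsonr(stopwatch_times, video_times)
print(f"Pearson correlation: {pcc:.2f}")
```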
KEYWORDS: Target detection, 3D acquisition, 3D image processing, 3D modeling, Visualization, Digital photography, Vegetation, Visual analytics, Photography, Airborne remote sensing
Synthetic imagery could potentially enhance visible signature analysis by providing a wider range of target images, in more varied environmental conditions, than would be feasible to collect in field trials. Achieving this requires a method for generating synthetic imagery that is verified to be realistic and that produces the same visible signature analysis results as real images. Is target detectability, as measured by image metrics, the same for real images and synthetic images of the same scene? Is target detectability, as measured by human observer trials, the same for real images and synthetic images of the same scene, and how realistic do the synthetic images need to be?
In this paper we present the results of a small-scale exploratory study on the second question: a photosimulation experiment conducted using digital photographs and synthetic images generated of the same scene. Two sets of synthetic images were created: a high-fidelity set created using an image generation tool, E-on Vue, and a low-fidelity set created using a gaming engine, Unity 3D. The target detection results obtained using digital photographs were compared with those obtained using the two sets of synthetic images. There was a moderate correlation between the high-fidelity synthetic image set and the real images in both the probability of correct detection (Pd: PCC = 0.58, SCC = 0.57) and mean search time (MST: PCC = 0.63, SCC = 0.61). There was no correlation between the low-fidelity synthetic image set and the real images for Pd, but a moderate correlation for MST (PCC = 0.67, SCC = 0.55).
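For concreteness, a minimal sketch of how per-scene Pd values from the real and synthetic image sets could be correlated is given below. The numerical arrays are placeholders, not the study's data; PCC and SCC denote the Pearson and Spearman rank correlation coefficients respectively.

```python
# Illustrative sketch of correlating per-scene detection results between the
# real and synthetic image sets. The arrays are placeholder values only.
import numpy as np
from scipy import stats

# Probability of correct detection (Pd) per scene: real vs high-fidelity synthetic.
pd_real      = np.array([0.85, 0.40, 0.65, 0.90, 0.30, 0.55])
pd_synthetic = np.array([0.80, 0.50, 0.60, 0.95, 0.35, 0.45])

pcc, _ = stats.pearsonr(pd_real, pd_synthetic)   # linear correlation
scc, _ = stats.spearmanr(pd_real, pd_synthetic)  # rank correlation
print(f"Pd: PCC = {pcc:.2f}, SCC = {scc:.2f}")

# The same comparison can be repeated for mean search time (MST) per scene.
```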
KEYWORDS: Target acquisition, Target detection, 3D image processing, Received signal strength, 3D acquisition, Video, Photography, Light sources and illumination, 3D modeling, Clouds
This paper investigates the ability to develop synthetic scenes in an image generation tool, E-on Vue, and a gaming engine, Unity 3D, which can be used to generate synthetic imagery of target objects across a variety of conditions in land environments. Developments within these tools and gaming engines have allowed the computer gaming industry to dramatically enhance the realism of the games they develop; however, they utilise shortcuts to ensure that the games run smoothly in real time to create an immersive effect. Whilst these shortcuts may have an impact upon the realism of the synthetic imagery, they promise a much more time-efficient method of developing imagery of different environmental conditions and of investigating the dynamic aspects of military operations that are currently not evaluated in signature analysis. The results presented investigate how some of the common image metrics used in target acquisition modelling, namely the Δμ1, Δμ2, Δμ3, RSS, and Doyle metrics, perform on the synthetic scenes generated by E-on Vue and Unity 3D compared with real imagery of similar scenes. An exploration of the time required to develop the various aspects of the scene to enhance its realism is included, along with an overview of the difficulties associated with trying to recreate specific locations as a virtual scene. This work is an important step towards utilising virtual worlds for visible signature evaluation and evaluating how equivalent synthetic imagery is to real photographs.
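To indicate how metrics of this kind are applied, the sketch below computes two simple target-to-background contrast measures on target and background pixel regions. The formulations used (a difference of mean grey levels, and a root sum of squares of mean and standard-deviation differences) are common in the visible signature literature but are assumptions here; the paper's exact definitions of the Δμ and Doyle metrics may differ.

```python
# Hedged sketch of two simple target-to-background contrast metrics, applied
# identically to a real photograph and a synthetic render of the same scene.
# The formulations are assumed common ones, not necessarily the paper's.
import numpy as np

def delta_mu(target_pixels: np.ndarray, background_pixels: np.ndarray) -> float:
    """Absolute difference in mean grey level between target and background."""
    return float(abs(target_pixels.mean() - background_pixels.mean()))

def rss(target_pixels: np.ndarray, background_pixels: np.ndarray) -> float:
    """Root sum of squares of the mean and standard-deviation differences."""
    d_mean = target_pixels.mean() - background_pixels.mean()
    d_std = target_pixels.std() - background_pixels.std()
    return float(np.sqrt(d_mean**2 + d_std**2))

# Example with placeholder pixel values (grey levels in [0, 1]).
rng = np.random.default_rng(0)
target = rng.normal(0.55, 0.05, size=500).clip(0, 1)       # target region pixels
background = rng.normal(0.45, 0.10, size=5000).clip(0, 1)  # local background pixels

print(f"delta-mu = {delta_mu(target, background):.3f}")
print(f"RSS      = {rss(target, background):.3f}")
# Comparing the metric values obtained from the real image with those from the
# corresponding synthetic image gives one measure of how equivalent the two are.
```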