State-of-the-art hardware-in-the-loop (HWIL) test facilities have been in operation at the
U.S. Army's Aviation and Missile Research, Development, and Engineering Center (AMRDEC) in
McMorrow Laboratories, on Redstone Arsenal, Alabama, for over 37 years. These facilities have been
successfully developed and employed in support of numerous tactical and interceptor missile systems. The
AMRDEC HWIL facilities are constantly in a state of modification and revision to support
evolving test requirements related to increasingly complex sensor suites, guidance implementations, and
employment strategies prevalent within both existing and emerging aviation and missile programs. This
paper surveys the role of the U.S. Army Aviation and Missile Research, Development, and Engineering
Center (AMRDEC) in the development and operation of HWIL test facilities and the implementation of
new, innovative technologies that have been integrated within facility test assets. This technology spans
the near IR (NIR, 1.064 um), IR (3-12 um), and RF (2-95 GHz) operating ranges. The
AMRDEC HWIL facilities represent the highest degree of simulation fidelity, integrating all the major
parts of a HWIL simulation, including tactical missile and seeker hardware, executive control software,
scene generation, and NIR, IR, or RF scene projection systems. Successful incorporation of scene
generation and projection technologies has become a key thrust of the AMRDEC HWIL development
focus, with the intention to adapt to and anticipate emerging test element requirements necessitated by future
system sensing technologies.
AMRDEC has developed the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC) prototype
for distributed real-time hardware-in-the-loop (HWIL) scene generation. MAVRIC is a dynamic, object-based, energy-conserved
scene compositor that can seamlessly combine distributed scene elements into temporally aligned, physics-based
scenes, enhancing existing AMRDEC scene generation codes. The volumetric compositing process accepts
input independent of depth order. This real-time compositor framework is built around AMRDEC's ContinuumCore
API, which provides the common messaging interface leveraging the Neutral Messaging Language (NML) for local,
shared memory, reflective memory, network, and remote direct memory access (RDMA) communications, and the Joint
Signature Image Generator (JSIG), which provides an energy-conserved scene component interface at each render node. This
structure allows for a highly scalable real-time environment capable of rendering individual objects at high fidelity while
remaining considerate of real-time hardware-in-the-loop concerns, such as latency. As such, this system can be scaled to
handle highly complex, detailed scenes such as urban environments. This architecture provides the basis for common
scene generation, as it allows disparate scene elements to be calculated by various phenomenology codes and
integrated seamlessly into a unified composited environment. This advanced capability is the gateway to higher fidelity
scene generation such as ray-tracing. The high speed interconnects using PCI Express and InfiniBand were examined to
support distributed scene generation whereby the scene graph, associated phenomenology, and the scene elements can be
dynamically distributed across multiple high performance computing assets to maximize system performance.
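The depth-order-independent, energy-conserved compositing step described above can be pictured with a short sketch. The Python below is illustrative only and not the MAVRIC implementation; the fragment fields (`depth`, `radiance`, `transmittance`) are assumed stand-ins for whatever each render node actually emits:

```python
import numpy as np

def composite(fragments):
    """Order-independent, energy-conserving composite of scene fragments.

    Each fragment carries per-pixel radiance and transmittance plus a depth.
    Fragments may arrive from distributed render nodes in any order, so the
    compositor sorts them by depth per composite rather than requiring the
    nodes to deliver them front-to-back.
    """
    # Sort near-to-far; arrival order from the render nodes is irrelevant.
    ordered = sorted(fragments, key=lambda f: f["depth"])
    out = np.zeros_like(ordered[0]["radiance"])
    throughput = np.ones_like(out)  # transmittance remaining toward the sensor
    for frag in ordered:
        out += throughput * frag["radiance"]      # add energy reaching the sensor
        throughput *= frag["transmittance"]       # attenuate what lies behind
    return out
```

Because the accumulation is linear in radiance and multiplicative in transmittance, total energy is conserved regardless of which node computed which fragment.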
The US Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Redstone Test
Center (RTC) have formed the Scene Generation Development Center (SGDC) to support the Department of Defense
(DoD) open source EO/IR Scene Generation initiative for real-time hardware-in-the-loop and all-digital simulation.
Various branches of the DoD have invested significant resources in the development of advanced scene and target
signature generation codes. The SGDC goal is to maintain unlimited government rights and controlled access to
government open source scene generation and signature codes. In addition, the SGDC provides development support to a
multi-service community of test and evaluation (T&E) users, developers, and integrators in a collaborative environment.
The SGDC has leveraged the DoD Defense Information Systems Agency (DISA) ProjectForge
(https://Project.Forge.mil) which provides a collaborative development and distribution environment for the DoD
community. The SGDC will develop and maintain several codes for tactical and strategic simulation, such as the Joint
Signature Image Generator (JSIG), the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC),
and Office of the Secretary of Defense (OSD) Test and Evaluation Science and Technology (T&E/S&T) thermal
modeling and atmospherics packages, such as EOView, CHARM, and STAR. Other utility packages included are the
ContinuumCore for real-time messaging and data management and IGStudio for run-time visualization and scenario
generation.
AMRDEC sought out an improved framework for real-time hardware-in-the-loop (HWIL) scene generation to provide
the flexibility needed to adapt to rapidly changing hardware advancements and provide the ability to more seamlessly
integrate external third party codes for Best-of-Breed real-time scene generation. As such, AMRDEC has developed
Continuum, a new software architecture foundation to allow for the integration of these codes into a HWIL lab facility
while enhancing existing AMRDEC HWIL scene generation codes such as the Joint Signature Image Generator (JSIG).
This new real-time framework is a minimalistic, modular approach based on the National Institute of Standards and
Technology (NIST) Neutral Messaging Language (NML) that provides the basis for common HWIL scene generation. High speed
interconnects and protocols were examined to support distributed scene generation whereby the scene graph, associated
phenomenology, and resulting scene can be designed around the data rather than a framework, and the scene elements
can be dynamically distributed across multiple high performance computing assets. Because of this open architecture
approach, the framework facilitates scaling from a single GPU "traditional" PC scene generation system to a multi-node
distributed system requiring load distribution and scene compositing across multiple high performance computing
platforms. This takes advantage of the latest advancements in GPU hardware, such as NVIDIA's Tesla and Fermi
architectures, providing an increased benefit in both fidelity and performance of the associated scene's phenomenology.
Other features of the Continuum easily extend the use of this framework to include visualization, diagnostic, analysis,
configuration, and other HWIL and all digital simulation tools.
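The transport-neutral messaging idea behind the framework can be sketched schematically. The Python below is a hypothetical analogue, not the actual NIST NML API (which is C++ and configuration-file driven); it only illustrates how a type-tagged buffer lets the same message definitions ride over shared memory, reflective memory, or a network socket:

```python
import struct

class TypedChannel:
    """Hypothetical, minimal analogue of an NML-style typed message channel.

    The buffer attribute stands in for whatever transport backs the channel
    (shared memory, reflective memory, RDMA, or a socket); readers dispatch
    on the message-type id prefixed to every payload.
    """

    def __init__(self):
        self._buf = None  # stand-in for the underlying transport buffer

    def write(self, msg_type: int, payload: bytes) -> None:
        # Prefix the payload with a 32-bit message-type id for dispatch.
        self._buf = struct.pack("!I", msg_type) + payload

    def read(self):
        if self._buf is None:
            return None
        (msg_type,) = struct.unpack_from("!I", self._buf)
        return msg_type, self._buf[4:]
```

Because every transport carries the same type-tagged buffers, a component could in principle move from a single-PC shared-memory configuration to a distributed one without changing its message definitions.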
The Aviation and Missile Research, Development, and Engineering Center (AMRDEC), System Simulation and
Development Directorate (SS&DD), and Redstone Technical Test Center (RTTC) have teamed together to
develop a Hardware-in-the-Loop (HWIL) simulation known as the Advanced Multi-spectral Simulation Test
Acceptance Resource (AMSTAR) Production Bay Test Facility. The simulation facility has the capability to
simultaneously produce scenes in two spectral bands. This paper describes the Near Infrared (NIR) and Imaging
Infrared capabilities of the AMSTAR Production Bay Test Facility simulation.
KEYWORDS: Zoom lenses, Calibration, High dynamic range imaging, Projection systems, Convolution, Error analysis, Commercial off the shelf technology, Image processing, OpenGL, Linear filtering
AMRDEC has developed and implemented new techniques for rendering real-time 32-bit floating point energy-conserved
dynamic scenes using commercial-off-the-shelf (COTS) Personal Computer (PC) based hardware and high-performance
NVIDIA Graphics Processing Units (GPUs). The AMRDEC IGStudio rendering framework with the real-time
Joint Signature Image Generator (JSIG) core has been integrated into numerous AMRDEC Hardware-in-the-loop
(HWIL) facilities, successfully replacing the lower fidelity legacy SGI hardware and software. JSIG uses high dynamic
range unnormalized radiometric 32-bit floating point rendering through the use of GPU frame buffer objects (FBOs). A
high performance nested zoom anti-aliasing (NZAA) technique was developed to address performance and geometric
errors of past zoom anti-aliasing (ZAA) implementations. The NZAA capability for multi-object and occluded object
representations includes: cluster ZAA, object ZAA, sub-object ZAA, and point source generation for unresolved objects.
This technique has an optimal 128x128 pixel asymmetrical field-of-view zoom. The current NZAA capability supports
up to 8 objects in real-time with a near future capability of increasing to a theoretical 128 objects in real-time. JSIG
performs other dynamic entity effects which are applied in vertex and fragment shaders. These effects include floating
point dynamic signature application, dynamic model ablation heating models, and per-material thermal emissivity
roll-off interpolated on a per-pixel zoomed-window basis. JSIG additionally performs full-scene per-pixel effects in a
post-render process. These effects include real-time convolutions, optical scene corrections, per-frame calibrations, and
energy distribution blur used to compensate for projector element energy limitations.
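The post-render per-pixel correction chain can be illustrated with a small sketch. The Python/NumPy below is not JSIG code; the per-pixel gain and offset maps are hypothetical stand-ins for projector non-uniformity calibration, and the blur kernel is normalized so it redistributes energy without changing the scene total:

```python
import numpy as np

def calibrate_and_blur(scene, gain, offset, kernel):
    """Apply a per-pixel calibration, then an energy-preserving blur.

    scene  : unnormalized float radiance image (as from a float FBO readback)
    gain   : per-pixel multiplicative calibration map (illustrative)
    offset : per-pixel additive calibration map (illustrative)
    kernel : 2-D blur kernel; normalized here so total energy is conserved
    """
    corrected = scene.astype(np.float32) * gain + offset
    k = kernel / kernel.sum()                 # unit-sum kernel conserves energy
    kh, kw = k.shape
    padded = np.pad(corrected, ((kh // 2,), (kw // 2,)), mode="edge")
    out = np.zeros_like(corrected)
    h, w = corrected.shape
    # Shift-and-accumulate convolution: each tap spreads a fraction of the
    # pixel's energy onto its neighbors.
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * padded[i:i + h, j:j + w]
    return out
```

In a real facility this step would run in fragment shaders at frame rate; the loop form here is only for clarity.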
The Aviation and Missile Research, Development, and Engineering Center (AMRDEC), System Simulation and
Development Directorate (SS&DD) is developing a Hardware-in-the-Loop (HWIL) facility known as the Multi-spectral
System Simulation (MSS). The simulation facility has the capability to simultaneously produce scenes in three spectral
bands. This paper describes the Near Infrared (NIR) and Imaging Infrared capabilities of the MSS simulation.
KEYWORDS: Extremely high frequency, Projection systems, Missiles, Computer simulations, Fermium, Frequency modulation, Iterated function systems, Data modeling, Infrared imaging, Sensors
The Advanced Multispectral Simulation Test Acceptance Resource (AMSTAR) is a suite of state-of-the-art hardware-in-the-loop (HWIL) simulation/test capabilities designed to meet the life-cycle testing needs of multi-spectral systems. This paper presents the major AMSTAR facility design concepts and each of the Millimeter Wave (MMW), Infrared (IR), and Semi-Active Laser (SAL) in-band scene generation and projection system designs. The emergence of multispectral sensors in missile systems necessitates capabilities such as AMSTAR to simultaneously project MMW, IR, and SAL wave bands into a common sensor aperture.
The Aviation and Missile Research, Development, and Engineering Center (AMRDEC), System Simulation and Development Directorate (SS&DD), and Redstone Technical Test Center (RTTC) have teamed together to develop a Hardware-in-the-Loop (HWIL) simulation known as the Advanced Multi-spectral Simulation Test Acceptance Resource (AMSTAR). The simulation facility has the capability to simultaneously produce scenes in three spectral bands. This paper describes the Near Infrared (NIR) and Imaging Infrared capabilities of the AMSTAR simulation. Additionally, this paper will briefly describe the ability to conduct tests in an environmentally conditioned chamber while the unit-under-test is mounted on the Flight Motion Simulator (FMS).
KEYWORDS: Reflectivity, Computer simulations, Missiles, Human-machine interfaces, Extremely high frequency, Data archive systems, Device simulation, Fermium, Frequency modulation, Control systems
Using Object Oriented Design (OOD) concepts in AMRDEC's Hardware-in-the-Loop (HWIL) real-time simulations allows the user to interchange parts of the simulation to meet test requirements. A large-scale three-spectral-band simulator, connected via a high-speed reflective memory ring for time-critical data transfers to PC controllers connected by non-real-time Ethernet protocols, is used to separate software objects from logical entities close to their respective controlled hardware. Each standalone object does its own dynamic initialization, real-time processing, and end-of-run processing; therefore it can be easily maintained and updated. A Resource Allocation Program (RAP) is also utilized, along with a device table, to allocate, organize, and document the communication protocol between the software and hardware components. A GUI display program lists all allocations and deallocations of HWIL memory and hardware resources. This interactive program is also used to clean up defunct allocations of dead processes. Three examples are presented using the OOD and RAP concepts. The first is the control of an ACUTRONIC-built three-axis flight table using the same control for calibration and real-time functions. The second is the transportability of a six-degree-of-freedom (6-DOF) simulation from an SGI Onyx host to a Linux PC. The third is the replacement of the 6-DOF simulation with a replay program that drives the facility with archived run data for demonstration or analysis purposes.
High-resolution millimeter wave RADAR has become a reality in today's sensor world. System development and simulation-based acquisition are increasing the demands for high-fidelity environmental models; therefore, high-resolution clutter models are imperative. The RADAR ground clutter signal is most often treated as a random variable with a given probability distribution function and a mean value that depends upon the particular RADAR, the desired clutter properties, and the relative geometry. This paper will show the results of implementing a technique to correlate adjacent clutter scatterers in the clutter model while maintaining the overall clutter statistics. This correlated clutter will be matched to a clutter class map derived from visual data of a specific terrain location. Processed images from a high-resolution RADAR simulation will be shown and compared to the visual images and clutter maps.
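One common way to correlate adjacent scatterers while keeping the prescribed amplitude distribution is to smooth a Gaussian field and then rank-map it onto samples from the desired distribution. The sketch below illustrates that two-step idea; it is an assumed method for illustration, not necessarily the algorithm implemented in the paper:

```python
import numpy as np

def correlated_clutter(shape, corr_len, target_sampler, seed=None):
    """Spatially correlated clutter with a prescribed amplitude distribution.

    1) Smooth white Gaussian noise with a moving-average filter of width
       corr_len to correlate adjacent scatterers.
    2) Rank-map the smoothed field onto samples from target_sampler so the
       overall (marginal) clutter statistics are exactly those of the target.
    """
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(shape)
    kernel = np.ones(corr_len) / corr_len
    # Separable smoothing along both axes imposes the spatial correlation.
    for axis in (0, 1):
        g = np.apply_along_axis(
            lambda v: np.convolve(v, kernel, mode="same"), axis, g)
    # Rank map: smallest Gaussian value gets smallest target sample, etc.,
    # preserving the spatial ordering while fixing the marginal distribution.
    target = np.sort(target_sampler(g.size))
    out = np.empty(g.size)
    out[np.argsort(g, axis=None)] = target
    return out.reshape(shape)
```

With, say, an exponential `target_sampler` the output histogram matches exponential clutter exactly, while neighboring cells remain strongly correlated.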
Hardware-in-the-Loop (HWIL) simulations are often spread over a large area with many hardware and software components. Open- and closed-loop tests necessitate a flexible test environment for allocating and ordering these components. The components may be connected in a star configuration, a ring configuration, or any combination of the two, with shared memory wrappers connecting some of the software components. A Resource Allocation Program (RAP) is proposed, along with a device table, to allocate, organize, and document the communication protocol between the software and hardware components. Communication between software components is done through a single set of input/output routines, so that software objects may be interchanged with hardware objects or placed on different computers with minimal code changes.
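The device-table role of such a RAP can be sketched minimally. All field and method names below are hypothetical illustrations of the allocate/deallocate/cleanup behavior described above, not the actual program:

```python
import os

class ResourceTable:
    """Minimal sketch of a RAP-style device table (hypothetical names).

    Components look up their endpoints here instead of hard-coding them,
    so a software object can be swapped for a hardware one, or moved to
    another host, without code changes.
    """

    def __init__(self):
        # name -> {"owner_pid", "transport", "address"}
        self._entries = {}

    def allocate(self, name, transport, address, pid=None):
        if name in self._entries:
            raise RuntimeError(f"{name} already allocated")
        self._entries[name] = {"owner_pid": pid if pid is not None else os.getpid(),
                               "transport": transport,
                               "address": address}

    def deallocate(self, name):
        self._entries.pop(name, None)

    def lookup(self, name):
        return self._entries[name]

    def reap_defunct(self, alive=lambda pid: True):
        # Drop allocations whose owning process is no longer running,
        # mirroring the cleanup of defunct allocations of dead processes.
        for name in [n for n, e in self._entries.items()
                     if not alive(e["owner_pid"])]:
            self.deallocate(name)
```

A GUI front end would simply render `_entries`, giving the interactive allocation/deallocation listing the abstracts describe.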