Advancement in the field of computer graphics has been occurring at an unprecedented pace. It is now possible to generate near-cinematic-quality effects in real time, largely thanks to the programmability of modern graphics processing units (GPUs). However, the spatial fidelity of synthetic imagery relies on anti-aliasing techniques that have not advanced to the same extent. Custom anti-aliasing techniques are therefore required to generate synthetic imagery containing high spatial frequencies. In this paper, the modification of an existing custom anti-aliasing technique to exploit the power of modern GPUs is described and applied to infrared scene generation.
The standard anti-aliasing techniques within commercial graphics hardware are unsatisfactory for simulations involving targets at long ranges, e.g., those for imaging infrared (IR) weapons. In this case, due to the presence of high-spatial-frequency components beyond the Nyquist frequency, the resulting scenes will contain aliasing and scintillation artifacts. Custom anti-aliasing techniques (that operate by supersampling) have been devised to deal with this; for example, zoom anti-aliasing and its corrected supersampling and scaling derivative. An alternative technique in which the target is prefiltered, shown to be equivalent to 3-D blurring of target objects at the vertex level, is described. An analysis of anti-aliasing performance is provided together with example imagery.
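As context for the supersampling approach the abstract refers to, the following is a minimal sketch (not the authors' implementation) of the filtering stage of a zoom-style anti-aliasing scheme: the target region is rendered at k times the display resolution, and each display pixel is the mean of the corresponding k x k block of samples. The high-resolution render is mocked here with a synthetic irradiance array.

```python
import numpy as np

def downsample_box(hi_res: np.ndarray, k: int) -> np.ndarray:
    """Average k x k blocks of a supersampled image down to display resolution.

    This is the filtering step of a zoom anti-aliasing (ZAA) scheme:
    rendering at k times the display resolution and box-filtering back
    down suppresses spatial frequencies above the display Nyquist limit,
    reducing scintillation for sub-pixel-scale targets.
    """
    h, w = hi_res.shape
    assert h % k == 0 and w % k == 0, "supersampled image must tile evenly"
    return hi_res.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

# Example: a 1-sample-wide hot bar rendered at 4x resolution.
hi = np.zeros((16, 16))
hi[:, 6] = 100.0                 # sub-pixel-scale hot feature
lo = downsample_box(hi, 4)       # 4x4 display-resolution image
# Each output pixel containing the bar averages 100 over 4 of its 16 samples.
print(lo)
```

Without the supersampled render, the bar would be sampled at one point per display pixel and would flicker (scintillate) as it moved; the box filter spreads its energy radiometrically over the covered pixels instead.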
Aliasing is unavoidable in real-time computer image generation due to the sampling processes occurring within the graphics hardware. In particular, aliasing produces scintillation effects and significant radiometric inaccuracy when targets are rendered at long range. This problem was alleviated some years ago by the development of a zoom anti-aliasing (ZAA) technique within infrared missile seeker hardware-in-the-loop simulations. An alternative ZAA technique based on extensive use of available graphics hardware functions is described here and compared to the original technique.
A challenging aspect of real-time infrared scene generation for the hardware-in-the-loop testing of infrared-guided weapon systems is the rendering of particle systems to represent gaseous and particulate volumes. In this work, a simulation tool is described that enables the generation of real-time particle effects with high spatial and temporal fidelity using far fewer primitives than traditional means. The tool can represent plumes and countermeasures, and enables the simulation of several key capabilities, including internal flow, turbulence, persistence, and structure, all of which can be varied dynamically as a function of power setting at the source. The principles of operation, software implementation, and general performance are discussed.
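As an illustration of the kinds of particle dynamics the abstract lists (internal flow, turbulence, persistence, power-dependent variation), here is a minimal sketch of a plume particle update; all parameter names and values are assumed for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def step_particles(pos, vel, dt, power, drag=0.1, turbulence=0.5):
    """Advance a simple plume particle system by one frame.

    pos, vel: (N, 3) arrays of particle positions and velocities.
    'power' scales the mean internal flow (e.g. an engine power
    setting); 'turbulence' adds random velocity perturbations; 'drag'
    provides persistence by damping velocities between frames rather
    than resetting them.
    """
    flow = np.array([0.0, 0.0, power])              # mean flow along +z
    jitter = turbulence * rng.standard_normal(vel.shape)
    vel = (1.0 - drag) * vel + dt * (flow + jitter)
    pos = pos + dt * vel
    return pos, vel

# 1000 particles emitted at the nozzle, advanced for 100 frames.
pos = np.zeros((1000, 3))
vel = np.zeros((1000, 3))
for _ in range(100):
    pos, vel = step_particles(pos, vel, dt=0.02, power=2.0)
print(pos.mean(axis=0))   # plume drifts along +z, spreads in x and y
```

Varying `power` per frame changes the mean flow dynamically, which is the kind of source-driven variation the abstract describes; a real-time implementation would evaluate this on the GPU and render each particle as a textured billboard.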
Aliasing is unavoidable in real-time computer image generation due to the sampling processes occurring within the graphics hardware. In particular, aliasing produces scintillation effects and significant radiometric inaccuracy when targets are rendered at long range. The zoom anti-aliasing techniques designed to alleviate these inherent aliasing problems are reviewed here. It is shown that, since these rely on computational power rather than on optimal use of the extensive set of functions available within the graphics hardware, they tend to be slower and more complex than necessary. A new technique based on use of the graphics hardware functions is described and compared to the earlier techniques. It is shown that the technique is faster and less complex while being similarly capable of reducing the level of aliasing.