Paper
Wavefront sensor fusion via shallow decoder neural networks for aero-optical predictive control
3 October 2022
Abstract
Sensor limitations often force a trade-off: devices offer either particularly high spatial-imaging resolution or high sampling rates, but rarely both concurrently. Adaptive optics control systems, for example, rely on high-fidelity sensing to predictively correct wavefront phase aberrations. We propose fusing these two categories of sensors: those with high spatial resolution and those with high temporal resolution. As a prototype, we first sub-sample simulations of the Kuramoto-Sivashinsky equation, whose chaotic dynamics arise from diffusive instability, and build a map between the simulated sensors using a shallow decoder neural network. We then fuse the high sampling rate of a common aero-optical sensor, the Shack-Hartmann wavefront sensor, with the greater spatial resolution of a digital holography wavefront sensor, training on supersonic wind-tunnel wavefront data provided by the Aero-Effects Laboratory at the Air Force Research Laboratory Directed Energy Directorate. The learned maps merge the high temporal and high spatial resolutions of the respective sensors, demonstrating a proof of concept for wavefront sensor fusion in adaptive optics applications.
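The core idea of the abstract, mapping sparse high-rate sensor measurements to a high-spatial-resolution field with a shallow decoder network, can be sketched as below. This is a minimal illustrative toy, not the authors' implementation: the "field" is a synthetic superposition of traveling sine modes standing in for wavefront data, the sensor count, layer widths, and training hyperparameters are arbitrary assumptions, and the network is a single-hidden-layer ReLU decoder trained by full-batch gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "high-spatial-resolution" fields: traveling sine modes at random
# phases (a stand-in for the paper's KS simulations / wind-tunnel wavefronts).
n_full, n_sensors, n_samples = 64, 8, 500
x = np.linspace(0, 2 * np.pi, n_full, endpoint=False)
phases = rng.uniform(0, 2 * np.pi, n_samples)
fields = np.stack([np.sin(x - p) + 0.5 * np.sin(2 * (x - p)) for p in phases])

# "High-temporal, low-spatial" sensor: subsample each field at a few points.
idx = np.linspace(0, n_full - 1, n_sensors).astype(int)
meas = fields[:, idx]

# Shallow decoder: sparse measurements -> hidden ReLU layer -> full field.
h = 32
W1 = rng.normal(0.0, 0.3, (n_sensors, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.3, (h, n_full));    b2 = np.zeros(n_full)

lr = 0.02
for _ in range(5000):
    z = meas @ W1 + b1
    a = np.maximum(z, 0.0)                 # ReLU activation
    pred = a @ W2 + b2
    err = pred - fields                    # gradient of 0.5*MSE
    gW2 = a.T @ err / n_samples; gb2 = err.mean(0)
    da = (err @ W2.T) * (z > 0)            # backprop through ReLU
    gW1 = meas.T @ da / n_samples; gb1 = da.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Reconstruct full-resolution fields from the sparse measurements alone.
recon = np.maximum(meas @ W1 + b1, 0.0) @ W2 + b2
rel_err = np.linalg.norm(recon - fields) / np.linalg.norm(fields)
```

Because the toy field depends linearly on a few latent Fourier coefficients, even this small decoder recovers the full field from eight samples; the paper's setting replaces the synthetic fields with measured wavefronts and pairs two physical sensors rather than a field and its subsampling.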
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Shervin Sahba, Christopher C. Wilcox, Austin McDaniel, Benjamin D. Shaffer, Steven L. Brunton, and J. Nathan Kutz "Wavefront sensor fusion via shallow decoder neural networks for aero-optical predictive control", Proc. SPIE 12223, Interferometry XXI, 1222303 (3 October 2022); https://doi.org/10.1117/12.2631951
KEYWORDS
Sensors
Wavefront sensors
Sensor fusion
Adaptive optics
Neural networks
Wavefronts
Control systems