We present subwavelength imaging of amplitude- and phase-encoded objects based on a solid-immersion diffractive processor designed through deep learning. Subwavelength features of the objects are resolved by a jointly optimized diffractive encoder-decoder pair. We experimentally demonstrated the subwavelength-imaging performance of solid-immersion diffractive processors using terahertz radiation and achieved all-optical reconstruction of subwavelength phase features of objects (with linewidths of ~λ/3.4, where λ is the wavelength) by transforming them into magnified intensity images at the output field of view. Solid-immersion diffractive processors could provide cost-effective and compact solutions for applications in bioimaging, sensing, and material inspection, among others.
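As a rough illustration of the joint encoder-decoder optimization described above, the following PyTorch sketch models each diffractive layer as a trainable phase-only mask, propagates the field between layers with the angular-spectrum method, and optimizes all masks so that a phase-encoded input yields a target intensity pattern at the output field of view. The grid size, layer count, wavelength, pixel pitch, propagation distance, loss, and random stand-in data are illustrative assumptions, not the parameters or training data used in the work.

    import torch
    import torch.nn as nn

    def angular_spectrum(field, dx, wavelength, z):
        """Propagate a complex field over distance z with the angular-spectrum method."""
        n = field.shape[-1]
        fx = torch.fft.fftfreq(n, d=dx)
        fxx, fyy = torch.meshgrid(fx, fx, indexing="ij")
        arg = 1.0 / wavelength**2 - fxx**2 - fyy**2
        kz = 2 * torch.pi * torch.sqrt(torch.clamp(arg, min=0.0))
        transfer = torch.exp(1j * kz * z) * (arg > 0).to(torch.complex64)  # drop evanescent waves
        return torch.fft.ifft2(torch.fft.fft2(field) * transfer)

    class DiffractiveProcessor(nn.Module):
        """Stack of jointly optimized, trainable phase-only diffractive layers."""
        def __init__(self, n=128, layers=4, dx=0.3e-3, wavelength=0.75e-3, z=3e-3):
            super().__init__()
            self.phases = nn.ParameterList([nn.Parameter(torch.zeros(n, n)) for _ in range(layers)])
            self.dx, self.wavelength, self.z = dx, wavelength, z

        def forward(self, phase_object):
            field = torch.exp(1j * phase_object)                  # phase-encoded input object
            for phi in self.phases:
                field = angular_spectrum(field, self.dx, self.wavelength, self.z)
                field = field * torch.exp(1j * phi)               # trainable diffractive layer
            field = angular_spectrum(field, self.dx, self.wavelength, self.z)
            return field.abs() ** 2                               # intensity at the output FOV

    model = DiffractiveProcessor()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    phase_object = torch.rand(128, 128) * torch.pi                # stand-in for a phase object
    target = torch.rand(128, 128)                                 # stand-in for the magnified target image
    for _ in range(10):
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(phase_object), target)
        loss.backward()
        optimizer.step()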
Optical machine learning, a successful combination of deep learning and photonics, has developed rapidly in recent years. Among the various optical classification frameworks, diffractive networks have been shown to offer unique advantages for all-optical inference. The orbital angular momentum (OAM) of light provides a set of mutually orthogonal modes that is, in principle, unbounded, properties that can enhance parallel classification in information processing. However, few all-optical diffractive networks have used OAM-mode encoding. Here, we report an OAM-encoded diffractive deep neural network (OAM-encoded D2NN) that encodes the spatial information of objects into the OAM spectrum of the diffracted light to perform all-optical object classification. We demonstrated three different OAM-encoded D2NNs realizing (1) a single-detector OAM-encoded D2NN for single-task classification, (2) a single-detector OAM-encoded D2NN for multitask classification, and (3) a multidetector OAM-encoded D2NN for repeatable multitask classification. By proposing the OAM-encoded D2NN, we provide a feasible way to improve the performance of all-optical object classification and open up promising research directions for D2NNs.
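To make the OAM read-out concrete, the numpy sketch below estimates the OAM power spectrum of a diffracted field by projecting it onto azimuthal harmonics exp(i*l*phi) on radial rings and takes the dominant order as the predicted class. The mode range, grid, ring binning, and the vortex-beam test field are illustrative assumptions; in the reported D2NN the diffracted field itself is shaped by the trained diffractive layers.

    import numpy as np

    def oam_spectrum(field, l_max=5, n_rings=64):
        """Coarse OAM power spectrum: azimuthal projection onto exp(i*l*phi) on each
        radial ring, |.|^2 summed over rings, normalized to unit total power."""
        n = field.shape[0]
        y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
        r, phi = np.hypot(x, y), np.arctan2(y, x)
        ring = np.minimum((r / r.max() * n_rings).astype(int), n_rings - 1).ravel()
        orders = np.arange(-l_max, l_max + 1)
        power = np.empty(len(orders))
        for k, l in enumerate(orders):
            proj = (field * np.exp(-1j * l * phi)).ravel()
            ring_sum = (np.bincount(ring, weights=proj.real, minlength=n_rings)
                        + 1j * np.bincount(ring, weights=proj.imag, minlength=n_rings))
            power[k] = np.sum(np.abs(ring_sum) ** 2)
        return orders, power / power.sum()

    # Toy check with a vortex beam of topological charge l = 3: a detector behind the
    # (hypothetical) diffractive layers would report the class mapped to the dominant order.
    n = 256
    y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
    r, phi = np.hypot(x, y), np.arctan2(y, x)
    test_field = r * np.exp(-r**2 / 0.1) * np.exp(1j * 3 * phi)
    orders, power = oam_spectrum(test_field)
    print("dominant OAM order:", orders[np.argmax(power)])   # expected: 3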
Optical neural networks (ONNs), which enable low-latency, highly parallel data processing free of electromagnetic interference, have become a viable option for fast, energy-efficient computation as the demand for computing power continues to grow. Photonic memories employing nonvolatile phase-change materials could enable large-scale, highly energy-efficient photonic neural networks with zero static power consumption and low thermal crosstalk. Nevertheless, the limited switching speed and high dynamic energy consumption of phase-change-material-based photonic memories make them unsuitable for in situ training. Here, by integrating a patch of phase-change thin film with a PIN-diode-embedded microring resonator, we demonstrate a bifunctional photonic memory that provides both 5-bit storage and nanosecond-scale volatile modulation. For the first time, we present a concept for an electrically programmable phase-change-material-driven photonic memory with integrated nanosecond modulation, enabling fast in situ training and zero-static-power data processing in ONNs. In simulations, ONNs with an optical convolution kernel constructed from our photonic memory achieved a prediction accuracy above 95% on the MNIST handwritten-digit database. This provides a feasible route toward large-scale nonvolatile ONNs with high-speed in situ training capability.
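To illustrate how 5-bit photonic memory cells could serve as convolution weights, the PyTorch sketch below rounds the weights of a small convolution layer to 32 discrete levels while keeping them trainable through a straight-through estimator. The toy network, the level mapping, and the random inputs are assumptions made here for illustration; the >95% figure above refers to the authors' own simulations, not to this sketch.

    import torch
    import torch.nn as nn

    class FiveBitConv(nn.Module):
        """Convolution whose weights are rounded to the 32 levels a 5-bit memory cell can hold."""
        def __init__(self, in_ch, out_ch, k=3, levels=32):
            super().__init__()
            self.conv = nn.Conv2d(in_ch, out_ch, k, bias=False)
            self.levels = levels

        def quantized_weight(self):
            w = self.conv.weight
            w_min, w_max = w.min(), w.max()
            step = (w_max - w_min) / (self.levels - 1)
            wq = torch.round((w - w_min) / step) * step + w_min   # 32 discrete weight levels
            return w + (wq - w).detach()                          # straight-through estimator

        def forward(self, x):
            return nn.functional.conv2d(x, self.quantized_weight())

    # Tiny classifier sketch; actual training on MNIST (e.g. torchvision.datasets.MNIST) is omitted.
    model = nn.Sequential(FiveBitConv(1, 8), nn.ReLU(), nn.Flatten(),
                          nn.Linear(8 * 26 * 26, 10))
    logits = model(torch.randn(4, 1, 28, 28))   # 4 fake 28x28 digits -> 10 class scores
    print(logits.shape)                          # torch.Size([4, 10])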