Near-eye displays have become a technology of high interest for Augmented, Virtual and Mixed Reality due to the unique immersive experience they provide to the user. The majority of these devices use macroscopic optical elements that make them bulky and heavy. Our team has proposed a disruptive near-eye display concept that uses the self-focusing effect to project an image onto the user's retina. To form an image, emissive points are generated from a dense photonic integrated circuit embedded within the lens of a pair of smart glasses. In this work, we present the design of a dense routing architecture that addresses thousands of randomly distributed emissive points from a few hundred inputs. The circuit combines unbalanced waveguide splitter trees with non-periodic addressing of a dense waveguide network. We present the design optimization through numerical simulations and estimate the overall device performance from the simulation results. A waveguide interlayer crossing simulation indicates losses below 0.003 dB/crossing, which guarantees low optical losses over thousands of crossings. By correctly unbalancing the splitter trees, we can obtain homogeneous power profiles over an emissive point distribution. The experimental validation of our design will be a major step towards the elaboration of a first prototype.
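Since crossing losses expressed in dB add linearly along a waveguide path, the total transmission budget can be estimated directly. The sketch below (the 0.003 dB/crossing figure comes from the simulation quoted above; the crossing count of 1000 is an illustrative assumption) shows that even a thousand crossings keep roughly half of the optical power:

```python
# Hedged loss-budget sketch. The 0.003 dB/crossing value is taken from
# the simulation result quoted above; the crossing count is an
# illustrative assumption.

def total_loss_db(loss_per_crossing_db, n_crossings):
    """Total insertion loss in dB for a chain of identical crossings
    (dB losses add linearly)."""
    return loss_per_crossing_db * n_crossings

def transmission(loss_db):
    """Convert a loss in dB into a linear power transmission factor."""
    return 10 ** (-loss_db / 10)

loss = total_loss_db(0.003, 1000)  # 1000 crossings -> 3 dB total
t = transmission(loss)             # about half of the power survives
```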
The development of an ideal optical system to support Mixed Reality and Augmented Reality (AR) applications has raised a lot of interest in the scientific community over the last decades. The perfect device remains an inaccessible target, and researchers have to focus on the optimization of some specific behaviors. Several years ago, we introduced a disruptive display concept to push device integration to the limit, with the suppression of the optical system. This allows the imaging process to be considered in a different way, with specific monitoring of the field of view. With this ‘smart glass’ concept, the glass is the display, and the image is formed directly onto the retina with a combination of refractive and diffractive effects. This conceptual target allowed us to define a technological roadmap to support our development. The technologies involved in this concept principally concern the fields of Photonic Integrated Circuits in the visible range, digital/analog holography and Liquid Crystal devices. We will present the current state of our research with a particular focus on the holographic display element. Recent results related to analog pixelated hologram recording both validate and question our technological and conceptual approach. We will show images formed by sparse holographic pixel distributions with controlled angular characteristics that demonstrate the mix of refractive and diffractive effects. The transmission behavior of this holographic device will also be analyzed.
KEYWORDS: Retroreflectors, Retroreflector prisms, Augmented reality, Automotive front vision, Heads up displays, Diffusers, Deep reactive ion etching, Projection systems
The superposition of digital information in the Field of View (FOV) of a user is the basis of the current developments in mixed and augmented reality. Before being studied for near-eye devices and head-mounted displays, this application was implemented in Head Up Displays (HUD) to help pilots and drivers manage both the driving stress and the information flow related to the vehicle. Classical HUD optical designs based on the use of a combiner are strongly limited in FOV due to issues related to pupil management. To overcome this issue, head-up projection displays have been developed based on the projection of digital images directly onto the windshield. To support this approach, an efficient projection surface that combines bright reflection with clear transparency has to be developed. A few years ago, we introduced an optical approach based on a retro-reflective transparent projection surface and a manufacturing process to provide microscopic corner cubes that incorporate an optical diffuser function. We present in this contribution an optimized design that increases the efficiency of the retroreflective structure towards 100%. We also discuss a possible technological process that allows the manufacturing of the master used to replicate the microstructure. This process, based on grayscale lithography and on Deep Reactive Ion Etching (DRIE), may guarantee a high retro-reflection efficiency, a high transparency and a realistic draft angle to allow a molding manufacturing process for the microstructure fabrication.
In our Augmented Reality (AR) project, we are investigating the use of a retinal projection display based on the association of pixelated holograms and a dense distribution of waveguides. We study the use of gratings impregnated with liquid crystal to actively extract light from waveguides. We explore two extraction strategies: tuning the refractive index contrast between the grating teeth and grooves to erase the grating diffraction effect, and changing the index of the waveguide cladding to tune the evanescence of the guided mode. Firstly, we present and discuss measurements of the diffraction efficiency of nano-imprint gratings impregnated with liquid crystal and refractive index liquids. Secondly, we discuss the results of an integrated switchable extraction grating implementing the second strategy.
Our team works on a disruptive concept of Near-Eye Display for Augmented Reality (AR) applications. This device requires distributions of holographic elements, described as Emissive Point Distributions (EPDs), to create a composite planar wavefront emitted towards the eye. The crystalline lens focuses this signal onto the retina in a mix of diffraction and refraction processes to form the pixels of an image. We experimentally recorded an image of the letter “R” with pixelated holograms. When reading this image, we observe speckle that partially degrades it. Using image processing on the experimental results, we can suppress this speckle and recover the initial “R”, which validates our concept. We developed a simulation tool based on Fourier optics to better understand the emergence of this speckle noise. With the knowledge of the recording process and the shape of the hologram given by microscopy, we simulate the electric field 𝐸𝑛 reflected by the different holographic elements from a single collimated laser beam. Each field 𝐸𝑛 encodes an angular pixel of the recorded image. The sum of these optical beams, in field and/or in intensity, allows us to analyze the role of the different optical elements in the generation of speckle. In particular, the role of the cross-interferences between different EPDs is questioned. The experimental analysis is carried out for periodic EPDs but can be extended to the case of random EPDs. It gives insights into possible evolutions of our concept in terms of optical implementation.
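The difference between summing the beams in field and in intensity can be illustrated with a minimal numerical sketch (assumed geometry and random phases, not the authors' Fourier-optics tool): the coherent sum keeps the cross-interference terms between emitters, which produce a high-contrast speckle-like pattern, while the incoherent sum of intensities is smooth:

```python
import numpy as np

# Hedged sketch (assumed geometry, not the authors' simulation tool):
# far field of point emitters summed coherently (fields) versus
# incoherently (intensities). The cross-interference terms kept by the
# coherent sum are what produce the speckle-like modulation.
rng = np.random.default_rng(0)
n_emitters = 200
x = rng.uniform(-1e-3, 1e-3, n_emitters)          # emitter positions (m)
phases = rng.uniform(0.0, 2 * np.pi, n_emitters)  # uncontrolled phases
wavelength = 532e-9
k = 2 * np.pi / wavelength
theta = np.linspace(-5e-3, 5e-3, 500)             # observation angles (rad)

# Field of each emitter at each angle (small-angle plane-wave model).
fields = np.exp(1j * (k * np.outer(theta, x) + phases))

coherent = np.abs(fields.sum(axis=1)) ** 2        # interference included
incoherent = (np.abs(fields) ** 2).sum(axis=1)    # intensities only

# Speckle contrast (std/mean): high for the coherent sum, ~0 otherwise.
contrast_coh = coherent.std() / coherent.mean()
contrast_inc = incoherent.std() / incoherent.mean()
```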
Holography-based optical elements are key components of many products in Augmented Reality and Virtual Reality. We describe in this work the use of pixelated micrometric holograms acting as directive, phase-controlled reflectors for self-focusing purposes. We present the optical set-up used to record these pixelated holograms as well as a set-up to realize dynamic addressing of these holograms. First results of dynamic hologram addressing are shown and discussed.
One of the big challenges of Augmented Reality (AR) is to create ergonomic smart glasses. Our laboratory proposes an unconventional concept of AR glasses based on the self-focusing of multiple beams into the eye. The device comprises a dense electrode network that allows light to be extracted from a dense waveguide network at the intersection points. Each beam emitted at these points is reflected by a holographic element that directs light towards the eye with a proper angular direction. The image pixels are created on the retina by the interference of various beam distributions. This paper presents a method to optimize the design of waveguides and electrodes to increase the number of pixels in the final image. Our method considers a first waveguide (resp. electrode) as a curve described by a succession of segments with a unique absolute angle crossing a horizontal (resp. vertical) axis. The other waveguides (resp. electrodes) are created by translating this curve so that the minimal distance between two curves equals a fixed value. We use the B-spline mathematical model to approximate the succession of segments. An iterative method adapted to B-splines allows us to calculate the intersections between waveguides and electrodes. An Emissive Point Distribution (EPD) is obtained by selecting random groups of waveguides and electrodes. Each EPD forms one pixel on the retina. The Point Spread Function (PSF) of this EPD characterizes the self-focusing efficiency. We calculate the Signal-to-Noise Ratio (SNR) of each EPD to evaluate the quality of the whole self-focusing process, comparing the PSF to the Airy function. Our new mathematical model improves the number of pixels by a factor of 3.5 for an equivalent SNR in comparison to the model we previously used.
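The emissive points are found as intersections between the waveguide and electrode curves. As a hedged illustration of this intersection step (using polylines as stand-ins for the evaluated B-spline curves; the actual iterative method adapted to B-splines is not reproduced here), each pair of segments can be tested as follows:

```python
import numpy as np

# Hedged sketch of the intersection step, with polylines standing in
# for the evaluated B-spline curves (the iterative B-spline method is
# not reproduced here).

def segment_intersection(p1, p2, q1, q2):
    """Return the intersection point of segments p1-p2 and q1-q2 as a
    numpy array, or None if they do not cross."""
    d1, d2 = p2 - p1, q2 - q1
    denom = d1[0] * d2[1] - d1[1] * d2[0]   # 2D cross product
    if abs(denom) < 1e-12:
        return None                         # parallel segments
    delta = q1 - p1
    t = (delta[0] * d2[1] - delta[1] * d2[0]) / denom
    u = (delta[0] * d1[1] - delta[1] * d1[0]) / denom
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return p1 + t * d1                  # emissive point candidate
    return None

# One "waveguide" segment crossed by one "electrode" segment: the
# returned point (0.5, 0.0) is an emissive point candidate.
ep = segment_intersection(np.array([0.0, -1.0]), np.array([1.0, 1.0]),
                          np.array([0.0, 0.5]), np.array([1.0, -0.5]))
```

Running this test over every waveguide/electrode segment pair (pruned with bounding boxes for efficiency) yields the full set of emissive points.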
We are developing a non-conventional retinal projector for augmented reality (AR) applications. In our concept, light at λ = 532 nm is guided in silicon-nitride (SiN) photonic integrated circuits (PICs) embedded in the lens of a pair of glasses. We use holographic elements to redirect light from the emissive points towards the user’s retina without using lenses. Pixels are formed in the eye using the self-focusing effect and the eye lens. The transparency of the device is an absolute requirement for our application. In this work, we present the fabrication and characterization of our latest SiN PICs on a transparent substrate. The device was fabricated by transferring the SiN PICs from a silicon to a glass substrate. We characterized the PICs and the free-space optical transmission properties of our device using in-house goniometers and a Modulation Transfer Function (MTF) setup. We found a 76% transparency at our wavelength and no image alteration. However, we measured significant waveguide propagation losses; solutions are discussed to tackle this problem. Our glass-substrate device is a major step towards a future prototype for our AR retinal projector.
Holographic materials are found in a widespread field of applications, particularly in the Augmented Reality (AR) area, which has been attracting attention for several years. Scientists have developed various complex holographic optical designs for displaying clear and bright images in transparent devices. Holographic materials developed for this technology must meet stringent optical requirements such as photosensitivity, transparency, low cost and robustness. Photopolymer materials offer a reliable solution for these requirements. Our research team has recently presented a unique concept for AR applications that requires evaluating different photopolymer systems in order to support our development. Among the photomaterials under test, we have studied in particular a photopolymer formulated in our own laboratory, based on du Pont patents, that contains N-Phenylmethacrylamide as the monomer. This solution is not commercially available and has the advantages of good transparency and wet chemistry. Another photopolymer under test, based on a different photochemistry mechanism, is the commercial product Bayfol® HX from Covestro, available as laminated layers on triacetate cellulose film substrates.
Our AR optical concept requires the use of pixelated holograms with a complex recording process that strongly depends on the inhomogeneous response of the photomaterial. In this paper, we describe the behavior of both photopolymer materials during this holographic recording step. Then, we discuss the writing strategies implemented to improve the hologram homogeneity. In a second part, we evaluate the robustness of the holograms written in our photomaterial and measure their spectral stability under thermal stress in order to extrapolate their natural aging. A comparison with the commercial product underlines that the robustness strongly depends on the nature of the polymer chemical formulation.
Liquid Crystals are birefringent materials that address many applications, such as visualization with Liquid Crystal Displays (LCD) or beam shaping with Liquid Crystal on Silicon devices (LCoS). Recently, several research teams proposed using liquid crystals in photonic devices applied to new kinds of projection displays. Augmented Reality (AR) is one of the domains that could benefit from these developments, given the need to create active and transparent optical functions. In this contribution, we present recent work at CEA Leti to develop a switchable photonic extraction grating adapted to a specific near-eye device. Two different techniques are detailed and studied with FDTD simulations. We also show a first experimental characterization of an impregnated diffraction grating used in a free-space optical set-up.
We present our first results on the recording of pixelated holograms. This specific recording process is dedicated to an unconventional approach to smart glass design. Due to the use of integrated photonics, this concept requires locally adjusting the properties of out-coupling holographic elements with specific angular distributions. We analyze here a simple Lippmann recording configuration that focuses on the material behavior with regard to the pixelated process. We demonstrate our ability to record distributions of holographic elements, a few micrometers in size, and compare our experimental results to first elements of simulation.
Our team is currently developing an innovative and compact retinal projection display concept for augmented reality applications. This concept intends to break with conventional optics using a device based on an integrated photonics architecture. Designs and simulations of test structures have been presented in previous works. Here, we present experimental results obtained on stoichiometric Si3N4 integrated photonic circuits designed for λ = 532 nm. The samples combine several building blocks such as grating couplers, Multi-Mode Interferometers (MMI) and dense waveguide arrays with several emissive areas. We used a goniometric characterization bench to perform far-field measurements. For near-field characterization, an imaging setup using a 4.5 mm focal-length lens system and a high-resolution CMOS camera was mounted on the goniometer. The power profiles of the emissive points were measured in the near and far field. Regarding the dense waveguide arrays, their angular characterization confirms our simulations and allows us to emphasize the impact of trapezoidal-shaped waveguides on the emission angle. We measured a homogeneously distributed power profile across the waveguide distribution on the emissive areas. The power ratio of each emissive area remains relatively uniform. This is a promising result for our future work, where light must be equally and uniformly distributed over the surface of the display.
Our team works on a new concept of smart glasses for augmented reality applications. Our retinal projector consists of a dense waveguide network to transport light across the surface of a glass and an electrode network to extract light. The laser beams are extracted through holographic elements that direct light into the eye. The images are created directly on the retina without optics. The emissive points (EPs) are defined as the intersections between the dense waveguide and electrode networks. The design of the Emissive Point Distribution (EPD) is of primary importance in the imaging process. Previous results showed that EPs must be randomly distributed inside the pupil in order to obtain a good self-focusing effect. To evaluate self-focusing, we calculate the Signal-to-Noise Ratio (SNR) of the point spread function and compare it to the Airy function. Our first EPD model considers waveguides and electrodes designed as sums of cosine functions. At the intersection point between a waveguide’s function and an electrode’s function, the composition of these two functions cancels. We use Newton’s method to find this point. With this model, we can obtain a good SNR but the number of available pixels is low. Our second model is based on B-spline curves to represent the waveguide and electrode networks. With this model, we can easily modify the curves locally. We develop tools specially adapted to B-spline curves to find intersections. We also establish properties on the minimum distance between two curves.
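The root-finding step of the first model can be sketched as follows, with illustrative cosine coefficients (the actual layout functions are not reproduced): writing the waveguide as y = f(x) and the electrode as x = g(y), Newton's method is applied to the residual g(f(x)) − x:

```python
import math

# Hedged sketch of the first EPD model's root finding. The cosine
# coefficients below are illustrative, not the actual layout.

def f(x):
    """Waveguide centreline y = f(x), a sum of cosines."""
    return 0.3 * math.cos(1.7 * x) + 0.1 * math.cos(4.1 * x)

def g(y):
    """Electrode centreline x = g(y), a sum of cosines."""
    return 0.5 + 0.2 * math.cos(2.3 * y)

def newton_root(h, x0, tol=1e-10, max_iter=50):
    """Newton's method with a centred numerical derivative."""
    x, eps = x0, 1e-7
    for _ in range(max_iter):
        dh = (h(x + eps) - h(x - eps)) / (2 * eps)
        step = h(x) / dh
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# An emissive point sits where the composition cancels: g(f(x)) - x = 0.
x_star = newton_root(lambda x: g(f(x)) - x, x0=0.5)
y_star = f(x_star)
```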
We introduce an original approach for an extended Head Up Display solution. This configuration is based on the projection of images directly onto the windshield of a vehicle, allowing the display of various information around the user’s viewing axis, as a peripheral dashboard. We highlight that this solution is only effective if we can manage both the directivity and the diffusivity of the reflected light. We introduce two technological options for that purpose. A pragmatic one allows us to evaluate the HUD behavior in the short term. The other proposes an original approach with a manufacturing process based on the etching of a deep corner-cube cavity in a composite silicon wafer that incorporates a diffuser surface. We demonstrate both technological options and give some perspectives for future work.
Augmented Reality (AR) is now subject to a great technological acceleration. Different actors in the field are developing many different projection techniques. However, with each projection technique come display strategies to improve the comfort and immersion of the user and the quality of the displayed image. Some recent works propose new projection principles based on unconventional optics to create products aiming to reduce many of the current problems related to the conventional projection principle of AR devices. CEA Leti recently proposed a disruptive optical concept for AR display [1], based on the interaction of a grating of monomode linear waveguides with pixelated holograms. This work uses simulations to analyze optical issues associated with this new optical concept. The main topic covered is the study of the SNR depending on the resolution criterion chosen.
The rising production and consumption of data worldwide is posing challenges on how it can be efficiently analyzed and absorbed on a human scale. Emerging technologies in data visualization are rapidly expanding to try to address this problem. One technology in particular, Augmented Reality (AR) glasses, has the capacity to allow individuals to process live virtual information while keeping an eye on their surrounding environment. Our team has recently presented a unique retinal projection concept for augmented reality applications [1]. The concept combines a photonic integrated circuit (PIC) and holography. The photonic integrated circuit is made of silicon nitride (Si3N4) for its ability to guide visible wavelengths [2] (λ = 532 nm in our case) and its compatibility with the CMOS fabrication process technology. In our concept, the role of this circuit is to distribute and extract light at specific locations on the surface of a glass. The light emissions pass through a holographic layer deposited on the surface of the photonic circuit. The holographic layer is made of a 2D array of small individual holograms. The role of the hologram is to modify the properties (emission angle, phase…) of the light emissions from the PIC in order to generate a composite plane wavefront. The eye can focus this wavefront on the retina even at a small eye relief distance (a few centimeters). The focused wavefront represents one pixel of the projected retinal image. Several emissive point distributions on the surface of the PIC will create a full 2D retinal image. The retinal projector concept is schematically described in Fig. 1. Until now, both parts of the AR glass – the PIC and the holographic layer – have been developed independently [3-5]. In previous work [6], the basic building blocks of the circuit were designed both analytically and with numerical simulations: single-mode waveguide, MMI (MultiMode Interference) coupler, diffraction gratings and more.
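The phase-adjustment principle behind the composite plane wavefront can be sketched numerically (illustrative emitter positions and target angle, not the actual device parameters): each emitter's phase is chosen to cancel its geometric phase along a target direction, so the coherent far field peaks at that angle, forming one pixel:

```python
import numpy as np

# Hedged sketch of the phase-adjustment principle (illustrative emitter
# positions and target angle, not the actual device parameters).
rng = np.random.default_rng(1)
wavelength = 532e-9
k = 2 * np.pi / wavelength
x = rng.uniform(-0.5e-3, 0.5e-3, 300)   # emissive point positions (m)
theta_target = 2e-3                     # target direction (rad)

# Choose each emitter's phase so its contribution is in phase along
# the target direction: this synthesizes a plane wavefront.
phases = -k * x * np.sin(theta_target)

theta = np.linspace(-5e-3, 5e-3, 2001)  # observation angles (rad)
field = np.exp(1j * (k * np.outer(np.sin(theta), x) + phases)).sum(axis=1)
intensity = np.abs(field) ** 2

# The far field peaks at the target angle: one pixel of the image.
peak_angle = theta[np.argmax(intensity)]
```

A second pixel would use another emissive point distribution with phases tuned to a different target angle; interleaving many such distributions builds up the 2D retinal image.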
This paper presents in Section 2 the design and numerical simulations of a double-tip edge coupler at λ = 532 nm. This additional component will be useful to couple light into our future prototypes with a compact design. In Section 3, it is combined with previously designed PIC building blocks to form a unique device dedicated to evaluating the interaction between a silicon nitride PIC and a hologram.
We recently presented a novel retinal projection concept based on the combination of integrated optics and holography. Our lens-free optical system uses disruptive technologies to overcome the limitations of current devices such as a limited field-of-view and bulky optical assemblies. An integrated optical network of Si3N4 waveguides has been designed in the visible range in order to control the intensity of an optical field originating from an emissive point distribution (EPD) at a glass surface. In addition, the phase and orientation of the optical field are controlled by incorporating a pixelated holographic layer. The Si3N4 waveguides are transparent, allowing ambient light to pass through the device for augmented reality applications. This study focuses on the design of the components used for the optical circuit at λ = 532 nm (hologram laser recording wavelength): single-mode waveguides, bent waveguides, cross-talk, diffraction grating couplers, MMI splitters (MultiMode Interference) and directional couplers. The parameters of the components are optimized with various numerical methods. Furthermore, an optical circuit used as the first test structure is presented. An optical set-up based on a goniometric configuration has been built to characterize the efficiency of our components with a particular focus on the angular properties. Future work will focus on the hologram recording process that will involve interferences between the EPD output beams and free-space planar light waves.
Ultra-realistic virtual object representation is an old dream of humanity. From 3D Paleolithic rock paintings to the late Michael Jackson holographic shows, humans have investigated display solutions to give life to abstraction. The incredible opportunities offered by the digital revolution have paved the way for the recent development of innovative volumetric displays. These complex solutions are, however, still limited in the visual experience they can offer to the viewer. More concerningly, such devices often appear as empty shells, as their effective usefulness is not yet clearly defined. Recently, we have proposed an original volumetric display concept based on a 360° projection configuration. Inspired by the Pepper’s ghost concept and by the praxinoscope design of the end of the 19th century, our display mixes real projection onto a transparent retroreflective surface with virtual image superimposition. This development has been made in collaboration with a group of live performing artists in France. The 360° display has been used to present an original creation by the artists, and the confrontation with the public has highlighted some unexpected properties of this family of displays. We describe here the technological concept of our display and the evolutions we target to improve the visual rendering. The collaborative project with the artists is also presented, and we give our analysis of the feedback from the public.
From the first ultra-realistic 3D images in the sixties to the most recent Augmented Reality devices, the field of holography has been involved in display technology for a long time. The spectral selectivity of the hologram reflection together with the very good transparency of the holographic material make it a suitable option for some of the key optical components in smart glasses. However, these devices are still very limited by the overall optical system based on the conventional scheme Display - Optical System - Combiner. Recently, we have proposed an unconventional scheme that puts the hologram at the core of the display device. Due to its 3D nanoscale complexity, the dynamic updatable hologram display is still an unreachable goal. As an alternative, our configuration is based on a concept of switchable static holographic elements. These elements are interleaved on the surface of the display and form various groups of emissive point distributions that are phase-adjusted for given angular directions. The activation of these holographic elements produces angular planar wavefronts in the far field and the display is expected to achieve retinal projection without the help of an optical system. We present our concept and describe the development of the optical set-up used to investigate our holographic configuration. We record phase-adjusted distributions of holographic elements that are multiplexed on the surface of our sample, each distribution targeting a specific angular direction. First recording results on a holographic photopolymer are given.
We developed a novel concept of retinal projection for augmented reality (AR) glasses combining integrated optics and holography. Our thin and lens-free concept overcomes limitations of current AR devices such as bulky optics and a limited field of view. The integrated circuit is transparent and guides visible wavelengths by using Si3N4 as the core material of the waveguides. This work presents a detailed description of the optical principles behind the concept, including the self-focusing effect. Furthermore, we present the design of the first building blocks used for the optical integrated circuit at a visible wavelength (λ = 532 nm): single-mode waveguides, bent waveguides, cross-talk, grating couplers and MMI splitters (MultiMode Interference). Numerical simulation results for each component are presented. A prototype combining these optical building blocks in a 1024-waveguide array is designed to provide a future experimental proof of concept of our retinal projection approach. In addition to this prototype, test structures are inserted on a photolithography mask to experimentally validate the simulations of each optical building block in future work. The next steps of development will include densifying the integrated optical architecture using serial coupling effects and multiple waveguide layers.
We present an experimental evaluation of the self-focusing effect used for image formation in an unconventional near-eye display. The impact of the spectral bandwidth of the light source used to project the image is investigated in an experimental set-up and through multiple-interference simulations. It shows that the self-focusing effect is robust and does not require a highly coherent laser source. Simulations conducted with first experimental holographic recording data show that our concept can be implemented with an LED array as the primary source. Considerations on intraocular image formation quality will be further discussed during the conference.
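A simple way to see why self-focusing tolerates a broadband source is to compute the fringe visibility of a two-beam interference averaged over the source spectrum (a hedged sketch with illustrative numbers, not the multiple-interference simulation itself): fringes survive any bandwidth at zero optical path difference, the condition met at the focus, and wash out only at large path differences:

```python
import numpy as np

# Hedged sketch with illustrative numbers (not the multiple-interference
# simulation itself): two-beam fringe visibility averaged over a
# flat-top source spectrum.

def visibility(opd_m, center_nm=532.0, fwhm_nm=0.0, n_samples=201):
    """Fringe visibility at optical path difference `opd_m` (metres)
    for a source of the given centre wavelength and bandwidth (nm)."""
    if fwhm_nm == 0.0:
        lambdas = np.array([center_nm])
    else:
        lambdas = np.linspace(center_nm - fwhm_nm / 2,
                              center_nm + fwhm_nm / 2, n_samples)
    phases = 2 * np.pi * opd_m / (lambdas * 1e-9)
    # Visibility is the modulus of the spectrum-averaged phasor.
    return float(abs(np.exp(1j * phases).mean()))

v_zero = visibility(0.0, fwhm_nm=30.0)    # zero OPD: fringes survive
v_laser = visibility(50e-6, fwhm_nm=0.1)  # narrow line at 50 um OPD
v_led = visibility(50e-6, fwhm_nm=30.0)   # LED-like band: washed out
```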
A new manufacturing process for advanced Fiber Bragg Gratings that uses phase plates is presented. We underline its versatility through the realisation of several filters (phase-shifted, apodised, Fabry-Perot interferometer).