KEYWORDS: Printing, Holography, RGB color model, Digital holography, 3D printing, 3D modeling, Holographic materials, 3D image processing, Computer generated holography, Holograms
In this paper, simplified digital content generation using single-shot depth estimation for a full-color holographic printing system is proposed. First, digital content generation is analyzed fully before the holographic printing hardware is run, so that a high-quality three-dimensional (3D) scene is provided without degrading the information of the original 3D object. Here, a single-shot depth estimation method is applied, and 3D information is acquired from the estimated high-quality depth data and a given single 2D image. Then the array of sub-holograms (hogels) is generated directly through a fully analyzed computation that accounts for chromatic aberration in full-color printing. Finally, the generated hogels are recorded into the holographic material sequentially via time-controlled exposure, synchronized with three electrical shutters for the RGB laser-beam illuminations, to obtain a full-color 3D reconstruction. Numerical simulations and optical reconstructions are implemented successfully.
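The hogel computation described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: each pixel of the 2D image, together with its estimated depth, is treated as a point source, and one hogel field is summed per color channel with its own wavelength, which is where the wavelength dependence (chromatic aberration) enters. The laser wavelengths, hogel pitch, and resolution below are assumed values.

```python
import numpy as np

# Assumed parameters (not from the paper): RGB laser lines, pixel pitch,
# and hogel resolution.
WAVELENGTHS = {"R": 638e-9, "G": 520e-9, "B": 450e-9}  # [m]
PITCH = 8e-6        # hogel pixel pitch [m]
HOGEL_RES = 64      # hogel resolution (64 x 64 pixels)

def hogel_from_points(points, amplitudes, wavelength):
    """Sum spherical waves from 3D point sources onto one hogel plane (z = 0)."""
    k = 2.0 * np.pi / wavelength
    coords = (np.arange(HOGEL_RES) - HOGEL_RES / 2) * PITCH
    x, y = np.meshgrid(coords, coords)
    field = np.zeros((HOGEL_RES, HOGEL_RES), dtype=complex)
    for (px, py, pz), a in zip(points, amplitudes):
        r = np.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2)
        field += a * np.exp(1j * k * r) / r  # spherical wave from the point
    return field

# Toy scene: two points whose z-coordinates stand in for the depths
# produced by single-shot depth estimation.
pts = [(0.0, 0.0, 0.05), (1e-4, -1e-4, 0.06)]
amps = [1.0, 0.8]
hogels = {c: hogel_from_points(pts, amps, wl) for c, wl in WAVELENGTHS.items()}
print(hogels["R"].shape)  # (64, 64)
```

Because each channel uses a different wave number `k`, the R, G, and B hogel fields differ, so the per-channel computation compensates the chromatic dependence during full-color recording.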
The fill factor of a holographic micromirror array (HMA) is improved for a holographic waveguide-type three-dimensional (3D) augmented-reality (AR) display system. In the proposed 3D AR system, two holographic optical element (HOE) films serve as the in- and out-couplers of the waveguide. The in-coupler HOE is the fabricated HMA, which plays the same role as an optical microlens array: it integrates the elemental image set (EIS) from the microdisplay, where the EIS is generated by integral imaging technology. The microdisplay is 6 mm by 8 mm in size and shows 48 elemental images; it is located at a distance g from the holographic waveguide, whose thickness is 5 mm. The EIS is displayed by the microdisplay toward the holographic waveguide, and the HMA is attached to the waveguide on the side opposite the microdisplay. The displayed EIS is reflected and integrated at the in-coupler HMA, and the integrated 3D image propagates through the holographic waveguide at the recorded angle of the HMA, undergoing one internal reflection within the waveguide. The 3D image is then reflected by the out-coupler HOE, which acts as an optical mirror, toward the observer's eye. Finally, the observer sees the reconstructed images together with the real-world scene reflected by the out-coupler HOE.
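The propagation geometry above admits a back-of-the-envelope check. This is our own illustration, not a computation from the paper: given the stated 5 mm waveguide thickness and exactly one internal reflection between the couplers, the in-coupler recording angle is fixed by the lateral coupler separation; the separation value below is an assumption.

```python
import math

T = 5e-3        # waveguide thickness [m] (stated in the text)
L = 20e-3       # assumed in-to-out coupler separation [m]
N_BOUNCES = 1   # one internal reflection (stated in the text)

# Each traversal of the thickness advances the beam laterally by
# T * tan(theta); with one bounce the beam crosses the slab
# (N_BOUNCES + 1) = 2 times, so L = 2 * T * tan(theta).
theta_deg = math.degrees(math.atan(L / ((N_BOUNCES + 1) * T)))
print(round(theta_deg, 1))  # 63.4
```

For glass with refractive index near 1.5 the critical angle is about 41.8 degrees, so an in-coupling angle of this magnitude would comfortably satisfy total internal reflection under the assumed geometry.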
KEYWORDS: Cameras, Image acquisition, 3D modeling, 3D acquisition, 3D displays, Integral imaging, Image processing, Deep learning, 3D image processing, Digital cameras
In this report, we propose an advanced integral imaging 3D display system using a simplified high-resolution light-field image acquisition method. The simplified acquisition method uses a minimized number of cameras (three cameras placed along the vertical axis) to acquire high-resolution perspectives of a full-parallax light-field image. Since the number of cameras is minimized, the number of perspectives (3×N) does not match the specifications of the 3D integral imaging display unit (N×N elemental lenses). An intermediate-view elemental image generation method along the vertical axis could be applied; however, generating as many vertical viewpoints as there are elemental lenses is a complex process that requires heavy computation and long processing time. Therefore, we instead use a pre-trained deep learning model to generate the intermediate information between the vertical viewpoints. The corrected perspectives are input into a custom-trained deep learning model, which analyzes and renders the remaining intermediate viewpoints along the vertical axis (3×N → N×N). The elemental image array is generated from the newly generated N×N perspectives via the pixel rearrangement method; finally, a full-parallax, natural-view 3D visualization of the real-world object is displayed on the integral imaging 3D display unit.
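The pixel rearrangement step can be sketched as below. This is a minimal illustration under a standard perspective-to-elemental-image mapping, not the authors' exact code: pixel (i, j) of perspective (u, v) becomes pixel (u, v) of the elemental image behind lens (i, j), so an N×N stack of H×W perspectives turns into an H×W grid of N×N elemental images.

```python
import numpy as np

def perspectives_to_eia(persp):
    """persp: (N, N, H, W) perspective stack -> (H*N, W*N) elemental image array."""
    n, n2, h, w = persp.shape
    assert n == n2, "expected an N x N grid of perspectives"
    # Elemental image (i, j) collects pixel (i, j) from every perspective (u, v):
    # after the transpose, axis order is (i, j, u, v).
    eia = persp.transpose(2, 3, 0, 1)          # (H, W, N, N)
    # Interleave so rows become i*N + u and columns become j*N + v.
    return eia.swapaxes(1, 2).reshape(h * n, w * n)

persp = np.random.rand(4, 4, 8, 8)  # toy 4x4 grid of perspectives, 8x8 pixels each
eia = perspectives_to_eia(persp)
print(eia.shape)  # (32, 32)
```

In the resulting array, `eia[i*N + u, j*N + v]` equals `persp[u, v, i, j]`, i.e., each elemental image gathers one pixel from every perspective, which is the rearrangement an integral imaging display expects.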
A waveguide-type full-color 3D AR display system based on the integral imaging technique using a holographic mirror array is proposed. In the experiment, the AR feature was successfully verified: the real-world scene and the reconstructed virtual full-color 3D image were observed simultaneously.
An improved holographic waveguide-type two-dimensional/three-dimensional (2D/3D) convertible augmented-reality (AR) display system using a liquid-crystalline polymer microlens array (LCP-MA) with an electro-switching polarizer is proposed. The LCP-MA has properties such as a small focal ratio, high fill factor, low driving voltage, and fast switching speed; it utilizes a well-aligned reactive mesogen on the imprinted reverse shape of the lens and a polarization-switching layer. In the holographic waveguide, two holographic optical element (HOE) films are located at the input and output parts of the waveguide. These two HOEs function like mirrors and magnifiers: they reflect the light beams transmitted through the waveguide to the observer's eye as the reconstructed images. The proposed system retains the common advantages of holographic AR displays, such as light weight and thin size, and the observer can see 2D/3D convertible images, switched by the direction of the electro-switching polarizer, together with the real-world scene at the same time. In the experiment, the AR system was successfully verified: the real-world scene and the reconstructed 2D/3D images were observed simultaneously.
We propose a full-color three-dimensional holographic waveguide-type augmented-reality display system based on integral imaging using a holographic optical element mirror array (HOE-MA). As in the conventional holographic waveguide, two holographic optical elements are utilized as in- and out-couplers, located at the input and output parts of the waveguide. The main roles of these films are to reflect the light beams coming from the microdisplay into the waveguide and to transmit the three-dimensional image reconstructed by the HOE-MA while reflecting it toward the observer's eye. In the experiment, the augmented-reality feature was successfully verified: the real-world scene and the reconstructed virtual three-dimensional image were observed simultaneously.
We propose a three-dimensional (3D) holographic waveguide-type augmented-reality (AR) system based on integral imaging using a mirror array. As in the conventional holographic waveguide, two holographic optical element (HOE) films are utilized as in- and out-couplers, located at the input and output parts of the waveguide. The main role of the in-coupler HOE is to reflect the light beams coming from the microdisplay into the waveguide, while the out-coupler reflects the light beams transmitted through the waveguide to the observer's eye. In addition to the main advantages of the conventional holographic waveguide structure, such as light weight and thin size, the proposed system has a further critical advantage: the observer can see realistic 3D visualizations reconstructed by the out-coupler HOE-mirror array (HOE-MA), instead of simple two-dimensional images, together with the real-world scene at the same time. In the experiment, the AR feature was successfully verified: the real-world scene and the reconstructed virtual 3D image were observed simultaneously.
KEYWORDS: 3D image reconstruction, 3D image processing, Integral imaging, 3D displays, 3D acquisition, 3D modeling, 3D scanning, Cameras, Image quality, Mobile devices
In this paper, we focus on improving the reconstructed image quality of a mobile three-dimensional display using computer-generated integral imaging. A three-dimensional scanning method is applied instead of capturing a depth image in the acquisition step, so much more accurate three-dimensional view information (parallax and depth) can be acquired than with the previous mobile three-dimensional integral imaging display, and the proposed system can reconstruct clearer three-dimensional visualizations of real-world objects. Here, the three-dimensional scanner, operated by the user, acquires the three-dimensional parallax and depth information of the real-world object. Then, the entire acquired data set is organized, a virtual three-dimensional model is generated from it, and the elemental image array (EIA) is generated from the virtual three-dimensional model. Additionally, in order to enhance the resolution of the elemental image array, an intermediate-view elemental image generation method is applied. Here, five intermediate-view elemental images are generated between each set of four neighboring original elemental images according to the pixel information; as a result, the resolution of the generated elemental image array is enhanced by almost four times compared with the original. When the three-dimensional visualizations of real objects are reconstructed from the elemental image array with enhanced resolution, the quality is improved considerably compared with the previous mobile three-dimensional imaging system. The proposed method is verified by a real experiment.
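The intermediate-view generation step can be sketched as follows. This is a hedged simplification using plain linear interpolation (the paper interpolates "according to the pixel information," which may be more elaborate): between every 2×2 block of neighboring elemental images, five new views are synthesized (two vertical midpoints, two horizontal midpoints, one center), which roughly quadruples the number of elemental images.

```python
import numpy as np

def generate_intermediate_views(eis):
    """eis: (M, M, H, W) elemental image grid -> ((2M-1), (2M-1), H, W) grid."""
    m = eis.shape[0]
    out = np.zeros((2 * m - 1, 2 * m - 1) + eis.shape[2:])
    out[::2, ::2] = eis                                        # original views
    out[1::2, ::2] = 0.5 * (eis[:-1] + eis[1:])                # vertical midpoints
    out[::2, 1::2] = 0.5 * (eis[:, :-1] + eis[:, 1:])          # horizontal midpoints
    out[1::2, 1::2] = 0.25 * (eis[:-1, :-1] + eis[:-1, 1:]
                              + eis[1:, :-1] + eis[1:, 1:])    # block centers
    return out

eis = np.random.rand(3, 3, 8, 8)   # toy 3x3 grid of 8x8 elemental images
dense = generate_intermediate_views(eis)
print(dense.shape)  # (5, 5, 8, 8)
```

Each original 2×2 block contributes exactly five interpolated images to the output grid, matching the counting in the text; the original elemental images are preserved unchanged at the even grid positions.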