

Holographic stereogram (HS) is a research hotspot in the field of three-dimensional (3D) display and provides a flexible and efficient means of displaying 3D scenes. It is widely used in the military, publicity, commerce, and other fields. Using discrete 2D images with parallax information as input, a 3D reconstruction of the scene can be obtained after image processing, stereoscopic exposure, and development and fixing. An HS cannot show all of the information in a scene; it is limited to a certain viewing angle (less than 180°). Moreover, an HS does not record the depth information of the scene space, yet viewers can still perceive 3D cues owing to the binocular parallax effect. HS discretizes and approximates the continuous 3D light field, which greatly reduces the amount of data. In addition, the scene is not limited to real-world objects; it can also be a 3D model rendered by a computer. The diversified scene selection of HS not only enriches its expressive ability but also makes the augmented reality-holographic stereogram (ARHS) possible. ARHS reconstructs the light field information of real and virtual scenes at the same time: the real-scene data are sampled by a camera, while the virtual scene is rendered by computer software or a program.
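To make the notion of discretizing the light field concrete, the sketch below (in Python, with all parameter values chosen purely for illustration) computes camera positions for a horizontal-parallax-only capture: each position yields one 2D parallax image, and the finite set of images approximates the continuous light field recorded by the HS.

```python
import numpy as np

def parallax_camera_positions(num_views, track_width, distance):
    """Sample discrete camera positions along a horizontal track.

    Each position corresponds to one 2D parallax image; together the
    images approximate the continuous 3D light field.
    """
    x = np.linspace(-track_width / 2.0, track_width / 2.0, num_views)
    # Cameras lie on a line parallel to the hologram plane and all look
    # toward the scene centre placed `distance` in front of the track.
    return np.stack([x, np.zeros(num_views), np.full(num_views, distance)], axis=1)

# Assumed example values: 100 views over a 0.5 m track, 1.0 m from the scene.
views = parallax_camera_positions(num_views=100, track_width=0.5, distance=1.0)
print(views.shape)  # (100, 3)
```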
The key to realizing ARHS is the effective fusion of the real and virtual scenes; their organic combination can further improve the comprehensibility of the scene and achieve the “augmentation” of the scene. There are three methods in AR for realizing scene fusion. One is the model-based method, which reconstructs the 3D model of the real scene with a computer, exports the model data to the virtual-scene rendering software, and renders the virtual scene at the same time to achieve the fusion effect. This method was first proposed by Breen in 1996, but it was difficult to realize at the time owing to technical limitations.
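As a minimal sketch of the model-based idea, the following Python code (using the Open3D library, which is not part of the method described here, and placeholder file names) loads a mesh reconstructed from the real scene together with a virtual 3D model and renders the two in the same view to obtain a fused scene.

```python
import open3d as o3d

# Mesh reconstructed from the real scene and a computer-generated model.
# Both file names are placeholders used only for illustration.
real_mesh = o3d.io.read_triangle_mesh("real_scene.ply")
virtual_mesh = o3d.io.read_triangle_mesh("virtual_object.ply")
real_mesh.compute_vertex_normals()
virtual_mesh.compute_vertex_normals()

# Place the virtual model at an assumed position inside the real scene,
# then render both geometries together to obtain the fused scene.
virtual_mesh.translate((0.0, 0.0, 0.5))
o3d.visualization.draw_geometries([real_mesh, virtual_mesh])
```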

In this paper, an augmented reality-holographic stereogram based on 3D reconstruction is proposed. It reconstructs the light field information of the real and virtual scenes at the same time, further improving the comprehensibility of the scene and achieving the “augmentation” of the scene. First, point cloud data of the real scene are generated with the VisualSFM software, and the 3D mesh model is then reconstructed with the MeshLab software. The obtained scene model and the virtual scene are rendered simultaneously to obtain the fused real-and-virtual scene. Analysis of the experimental results shows that the proposed method can effectively realize the augmented reality-holographic stereogram.
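VisualSFM and MeshLab are interactive tools, so their exact settings are not reproduced here; the sketch below shows an analogous point-cloud-to-mesh step in Python with the Open3D library (the file names and the Poisson reconstruction depth are assumptions, not values from the paper), assuming the structure-from-motion point cloud has already been exported as a PLY file.

```python
import open3d as o3d

# Point cloud exported from the structure-from-motion step (placeholder name).
pcd = o3d.io.read_point_cloud("sfm_points.ply")

# Poisson surface reconstruction requires point normals.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30)
)

# Reconstruct a triangle mesh from the point cloud; `depth` controls detail.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9
)
o3d.io.write_triangle_mesh("real_scene_mesh.ply", mesh)
```

The resulting mesh can then be placed in the same scene as the virtual model and rendered simultaneously, as in the fusion sketch above.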
