Conference Paper

Streaming Video Textures for Mixed Reality Applications in Interactive Ray Tracing Environments

Conference: Proceedings of the Vision, Modeling, and Visualization Conference 2003 (VMV 2003), München, Germany, November 19-21, 2003
Source: DBLP

ABSTRACT: The realm of mixed reality applications lies in blending rendered images with images of the real world. This requires highly realistic rendered images in order to seamlessly blend between those two worlds. However, current rasterization technology severely limits the achievable realism and imposes strict limits on the scene complexity and the optical effects that can be simulated efficiently. Real-time ray tracing can overcome many of these constraints and enables completely new approaches for mixed reality applications. This paper explores this design space based on a framework for live streaming of video textures in a real-time ray tracing engine. We also suggest a novel approach to video-based AR by integrating image compositing with shading computations. We demonstrate the approach with a number of VR/AR applications including video inserts, video billboards, and dynamic lighting from video and HDR video streams. Being seamlessly integrated into the ray tracing framework, all our applications feature ray traced effects, like shadows, reflections and refraction.
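The central idea, compositing the live video with the rendered scene inside the shading computation rather than as a 2D post-process, can be illustrated with a short sketch. The types, the shader entry point, and the traceRay() stub below are hypothetical stand-ins, not the OpenRT interface used in the paper; this is only a minimal sketch of a video-billboard shader that samples the most recently decoded frame and composites it over whatever the continued ray sees:

```cpp
// Hypothetical sketch of a video-billboard shader in a ray tracer.
// All types and the traceRay() stub are illustrative, not the OpenRT API.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Color { float r = 0, g = 0, b = 0, a = 1; };
struct Ray   { /* origin, direction, ... */ };
struct Hit   { float u = 0, v = 0; };   // texture coordinates at the hit point

// Most recently decoded frame of the video stream (filled by a streaming thread).
struct VideoStream {
    int width = 1, height = 1;
    std::vector<Color> frame{Color{}};

    Color sample(float u, float v) const {
        int x = std::clamp(static_cast<int>(u * width),  0, width  - 1);
        int y = std::clamp(static_cast<int>(v * height), 0, height - 1);
        return frame[static_cast<std::size_t>(y) * width + x];
    }
};

// Stub for the renderer's recursive trace call; a real engine would
// continue the ray into the scene behind the billboard.
Color traceRay(const Ray&) { return Color{0.1f, 0.1f, 0.1f, 1.0f}; }

// Composite the video pixel over the scene during shading, so video inserts
// still participate in ray traced effects such as shadows and reflections.
Color shadeVideoBillboard(const VideoStream& video, const Hit& hit,
                          const Ray& continued) {
    Color src = video.sample(hit.u, hit.v);
    Color dst = traceRay(continued);             // scene behind the billboard
    Color out;
    out.r = src.a * src.r + (1.0f - src.a) * dst.r;   // standard "over" operator
    out.g = src.a * src.g + (1.0f - src.a) * dst.g;
    out.b = src.a * src.b + (1.0f - src.a) * dst.b;
    out.a = src.a + (1.0f - src.a) * dst.a;
    return out;
}
```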

  • ABSTRACT: This paper describes an implementation of a high dynamic range camera that is capable of capturing both still images and high dynamic range video using conventional equipment. The dynamic range of real-world scenes often exceeds the dynamic range covered by cameras, resulting in loss of detail and scene information; extended dynamic range can be reached by combining multiple images of the same scene taken with different, known exposure times. The result is a floating point radiance map with radiance values proportional to those observed in the real scene. To combine the input images correctly, the camera response function needs to be recovered first. After a brief overview of different algorithms in the field of high dynamic range imaging, the one implemented is discussed in further detail, as well as the actual implementation in hard- and software. Finally, some results of the project are presented, showing how it can be used with augmented reality applications. (A sketch of the exposure-combination step described here appears after this list.)
  • ABSTRACT: Virtual objects in augmented reality applications often appear flat and thus violate the immersion. To further extend the range of illumination effects and to improve the visual quality of the renderings, we combine augmented reality, ray tracing, and image based lighting techniques. We explore the resulting possibilities and problems which occur in such a scenario. Beyond that, we present a method to augment video images with shadows cast from virtual onto real objects.
  • ABSTRACT: High-quality mixed reality rendering is the art of combining synthetic and photographic images in a photorealistically consistent way. Most rendering methods used in this area are based on ray tracing algorithms and take hours of computing time. With interactive ray tracing having become available in recent years, it is a natural step to explore its potential for mixed reality applications, like virtual TV studios featuring photorealistic live insertion of human actors into rendered background scenes. In this paper we present a method to seamlessly integrate live actor performance into the distributed interactive ray tracing framework OpenRT. An image based visual hull shader is used to create a 3D representation of the actor. This allows for the full spectrum of ray tracing based effects like reflection, refraction, and shadows in the composite image.
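The high dynamic range camera work listed above recovers a floating point radiance map by combining differently exposed images once the camera response has been estimated. As a minimal sketch of that combination step, assuming a Debevec-Malik style log response g and a simple hat weighting (all names below are illustrative assumptions, not the cited implementation):

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

// One input photograph: pixel values in [0,255] plus its known exposure time.
struct Exposure {
    std::vector<unsigned char> pixels;  // single channel for brevity
    double exposureTime;                // seconds
};

// Combine differently exposed images into a relative radiance map, given the
// recovered inverse camera response g (log exposure as a function of pixel value).
// Assumes at least one shot; mid-range pixels are trusted more than
// under- or over-exposed ones.
std::vector<double> buildRadianceMap(const std::vector<Exposure>& shots,
                                     const std::array<double, 256>& g) {
    const std::size_t n = shots.front().pixels.size();
    std::vector<double> radiance(n, 0.0);

    auto weight = [](unsigned char z) {          // simple hat weighting
        return z <= 127 ? double(z) + 1.0 : 256.0 - double(z);
    };

    for (std::size_t i = 0; i < n; ++i) {
        double num = 0.0, den = 0.0;
        for (const Exposure& e : shots) {
            unsigned char z = e.pixels[i];
            double w = weight(z);
            // g[z] - ln(dt) is the log radiance implied by this observation.
            num += w * (g[z] - std::log(e.exposureTime));
            den += w;
        }
        radiance[i] = std::exp(num / den);       // proportional to scene radiance
    }
    return radiance;
}
```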
