Conference Paper

A Dual Light Stage

DOI: 10.2312/EGWR/EGSR05/091-098 Conference: Proceedings of the Eurographics Symposium on Rendering Techniques, Konstanz, Germany, June 29 - July 1, 2005
Source: DBLP

ABSTRACT We present a technique for capturing high-resolution 4D reflectance fields using the reciprocity property of light transport. In our technique we place the object inside a diffuse spherical shell and scan a laser across its surface. For each incident ray, the object scatters a pattern of light onto the inner surface of the sphere, and we photograph the resulting radiance from the sphere's interior using a camera with a fisheye lens. Because of reciprocity, the image of the inside of the sphere corresponds to the reflectance function of the surface point illuminated by the laser, that is, the color that point would appear to a camera along the laser ray when the object is lit from each direction on the surface of the sphere. The measured reflectance functions allow the object to be photorealistically rendered from the laser's viewpoint under arbitrary directional illumination conditions. Since each captured reflectance function is a high-resolution image, our data reproduces sharp specular reflections and self-shadowing more accurately than previous approaches. We demonstrate our technique by scanning objects with a wide range of reflectance properties and show accurate renderings of the objects under novel illumination conditions.
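The relighting step described in the abstract is linear in the incident illumination: each captured reflectance function is an image over incident directions, so rendering under a novel lighting environment is a weighted sum of that image against an environment map. The sketch below illustrates this; all names (`relight`, `reflectance_field`, `env_map`, `solid_angles`) and the array layout are assumptions for illustration, not the paper's actual data format.

```python
import numpy as np

def relight(reflectance_field, env_map, solid_angles):
    """Render the object from the laser viewpoint under novel lighting.

    reflectance_field : (P, D, 3) array; for each of P scanned surface
                        points, an RGB reflectance function sampled at
                        D incident directions on the sphere.
    env_map           : (D, 3) array of incident RGB radiance per direction.
    solid_angles      : (D,) array of solid-angle weights per direction.

    Returns a (P, 3) array of relit pixel colors.
    """
    # Weight each incident direction by its radiance and solid angle,
    # then integrate (sum) over directions for every surface point.
    weights = env_map * solid_angles[:, None]               # (D, 3)
    return np.einsum('pdc,dc->pc', reflectance_field, weights)
```

Because the sum runs over the full sphere-image resolution, sharp specular lobes in the measured reflectance functions are preserved in the relit result.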

  • ABSTRACT: Virtual natural phenomena obtained through mathematical-physical modeling and simulation as well as graphics emulation can meet the user's requirements for sensory experiences to a certain extent, but they can hardly have the same accurate physical consistency as real natural phenomena. The technology for data acquisition and natural phenomena simulation can enable us to obtain multi-dimensional and multi-modal data directly from real natural phenomena and, based on these real data, to establish digital models highly consistent with real natural phenomena in appearance, physics, behavior, or many other aspects, thus making a virtual natural phenomenon a direct mapping of the real natural phenomenon. This approach is conducive to resolving problems concerning the reliability and availability of virtual reality. At present the technology for acquiring and simulating data of natural phenomena is still in its initial stage. This paper gives a review of the related investigations. Firstly, we briefly introduce the basic methods and techniques concerned; then, based on the difference between the basic elements of various natural phenomena, we discuss the current studies on such natural phenomena as light, water, fire, smoke, dynamic terrain, etc.; and finally, in connection with issues in the present research and possible future directions of development, we put forth a number of theoretical and technical problems, hoping they can be resolved in the near future.
    Science China Information Sciences 01/2011; 54:683-716. · 0.71 Impact Factor
  • ABSTRACT: This paper introduces the concept of time-of-flight reflectance estimation, and demonstrates a new technique that allows a camera to rapidly acquire reflectance properties of objects from a single viewpoint, over relatively long distances and without encircling equipment. We measure material properties by indirectly illuminating an object by a laser source, and observing its reflected light indirectly using a time-of-flight camera. The configuration collectively acquires dense angular, but low spatial, sampling within a limited solid angle range, all from a single viewpoint. Our ultra-fast imaging approach captures space-time "streak images" that can separate out different bounces of light based on path length. Entanglements arise in the streak images, mixing signals from multiple paths if they have the same total path length. We show how reflectances can be recovered by solving a linear system of equations and assuming parametric material models; fitting to lower-dimensional reflectance models enables us to disentangle measurements. We demonstrate proof-of-concept results of parametric reflectance models for homogeneous and discretized heterogeneous patches, both using simulation and experimental hardware. As compared to lengthy or highly calibrated BRDF acquisition techniques, we demonstrate a device that can rapidly, on the order of seconds, capture meaningful reflectance information. We expect hardware advances to improve the portability and speed of this device.
    ACM Transactions on Graphics (TOG) 01/2011; 30:171.
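The disentangling step this abstract describes can be sketched as a linear inverse problem: light paths with the same total length mix into one time bin of the streak image, so each bin is a linear combination of the unknown patch reflectance values, giving a system m = A x solved here by least squares. The matrix `A`, vector `m`, and the per-patch parameterization below are illustrative assumptions, not the authors' actual mixing model.

```python
import numpy as np

def recover_reflectance(A, m):
    """Recover per-patch reflectance values x from streak measurements m.

    A : (T, N) array; A[i, j] is the contribution of patch j to the
        light arriving in time bin i (paths of equal total length mix).
    m : (T,) array of measured streak-image intensities per time bin.

    Returns the least-squares estimate of the N patch reflectances.
    """
    x, residuals, rank, _ = np.linalg.lstsq(A, m, rcond=None)
    return x
```

With a parametric material model, x would instead hold a small number of model parameters per patch; that reduction in dimensionality is what makes the entangled system well-posed, as the abstract notes.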
  • ABSTRACT: We present a novel technique for acquiring the geometry and spatially-varying reflectance properties of 3D objects by observing them under continuous spherical harmonic illumination conditions. The technique is general enough to characterize either entirely specular or entirely diffuse materials, or any varying combination across the surface of the object. We employ a novel computational illumination setup consisting of a rotating arc of controllable LEDs which sweep out programmable spheres of incident illumination during 1-second exposures. We illuminate the object with a succession of spherical harmonic illumination conditions, as well as photographed environmental lighting for validation. From the response of the object to the harmonics, we can separate diffuse and specular reflections, estimate world-space diffuse and specular normals, and compute anisotropic roughness parameters for each view of the object. We then use the maps of both diffuse and specular reflectance to form correspondences in a multiview stereo algorithm, which allows even highly specular surfaces to be corresponded across views. The algorithm yields a complete 3D model and a set of merged reflectance maps. We use this technique to digitize the shape and reflectance of a variety of objects difficult to acquire with other techniques and present validation renderings which match well to photographs in similar lighting.
    ACM Transactions on Graphics (TOG) 07/2013; 32(4).
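The normal estimation this abstract mentions can be illustrated with the gradient-illumination idea: under a linear ramp along axis i (intensity proportional to (1 + omega_i)/2 over the sphere, i.e. a constant plus first-order spherical harmonic), a pixel's response relative to the full-on condition encodes the corresponding component of the surface normal. The sketch below shows that general principle; it is a simplified estimator in the spirit of the abstract, not necessarily the authors' exact formulation, and all names are assumptions.

```python
import numpy as np

def estimate_normals(I_full, I_x, I_y, I_z):
    """Per-pixel normals from responses to four illumination conditions.

    I_full       : array of pixel intensities under uniform full-on lighting.
    I_x, I_y, I_z: same-shape arrays of intensities under linear-ramp
                   lighting along the x, y, and z axes respectively.

    Returns unit normals with one extra trailing axis of length 3.
    """
    # Under ramp (1 + n_i) / 2, the intensity ratio I_i / I_full
    # recovers (n_i + 1) / 2, so each normal component is 2 * ratio - 1.
    n = np.stack([2.0 * I_x / I_full - 1.0,
                  2.0 * I_y / I_full - 1.0,
                  2.0 * I_z / I_full - 1.0], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)
```

In the paper's setting the diffuse and specular responses are first separated, yielding distinct diffuse and specular normal maps from the same harmonic responses.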

