Presentation (PDF available)

Interactive Rendering To View-Dependent Texture-Atlases

Abstract

Presentation of the research paper "Interactive Rendering To View-Dependent Texture-Atlases".
[Figure: processing pipeline. Preprocessing (parameterization & packaging of the polygonal scene) produces a texture atlas; per-frame processing renders to the texture atlas using a global record and composites the final image.]
[Figure: screen-space and texture-space sorted bounding boxes; transformation from texture-atlas space to screen space.]
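The preprocessing stage packs per-object parameterizations into a shared texture atlas. As a minimal illustration of such packaging (a naive shelf packer, not the paper's actual packing algorithm; all names here are assumptions), a Python sketch:

```python
def pack_shelf(sizes, atlas_w):
    """Naive shelf packing: place (w, h) rectangles left to right, row by row.

    Returns the (x, y) placement of each rectangle and the total atlas height.
    """
    x = y = shelf_h = 0
    placements = []
    for w, h in sizes:
        if x + w > atlas_w:              # rectangle does not fit: open a new shelf
            x, y = 0, y + shelf_h
            shelf_h = 0
        placements.append((x, y))
        x += w
        shelf_h = max(shelf_h, h)        # shelf grows to its tallest rectangle
    return placements, y + shelf_h
```

Real atlas packers sort rectangles by height first and handle padding between tiles; this sketch only shows the placement bookkeeping.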
uniform samplerBuffer recordBuffer; // global data
in int ID[3]; // per-vertex attribute: object ID

// fetch transformation and layer for object ID
void fetchRecord(inout mat4 T, inout int layer);
...
void main(void)
{
    mat4 T;
    int layer;
    fetchRecord(T, layer);
    gl_Layer = layer; // set texture-target layer
    for (int i = 0; i < 3; i++) // process every vertex of the input triangle
    {
        vec4 v = gl_ProjectionMatrix * gl_PositionIn[i];
        // screen-space vertex displacement
        gl_Position = (T * (v / v.w)) * v.w;
        gl_ClipVertex = gl_PositionIn[i];
        // set additional attributes...
        EmitVertex();
    }
    EndPrimitive();
}
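The per-object matrix T fetched from the record buffer maps the full NDC viewport into the object's atlas tile. A minimal Python sketch of this idea, with a hypothetical tile_matrix helper (the tile bounds and helper names are assumptions) and the shader's (T * (v / v.w)) * v.w displacement reproduced in plain arithmetic:

```python
def tile_matrix(x0, y0, x1, y1):
    """Scale-translate matrix mapping NDC [-1,1]^2 onto the tile [x0,x1]x[y0,y1]."""
    sx, sy = (x1 - x0) / 2.0, (y1 - y0) / 2.0
    tx, ty = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    return [
        [sx, 0.0, 0.0, tx],
        [0.0, sy, 0.0, ty],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row lists) by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def displace(T, clip):
    """The shader line gl_Position = (T * (v / v.w)) * v.w, in plain Python."""
    w = clip[3]
    ndc = [c / w for c in clip]   # perspective divide
    moved = mat_vec(T, ndc)       # place the point inside the tile
    return [c * w for c in moved] # restore the homogeneous coordinate
```

For example, with the tile covering the lower-left NDC quadrant, a point at the screen center lands at the tile's center after displacement and perspective divide.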
#Vertices    #Faces     #Objects   FPS (RTT)   FPS (RTTA)
2,191        4,236      5          1,015       940
32,081       21,246     21         328         239
56,654       34,596     580        14          19
1,040,503    346,835    269        21          35
... Fortunately, this can be solved by performing a screen-space transformation of the geometry, together with user-defined clipping planes. The screen-space transformation procedure used in this paper is similar to that in [Trapp and Döllner 2010], but it is performed in the vertex shader instead of in the geometry shader. As illustrated in Figure 3, the extent of a single viewport in OpenGL ranges over [−1, 1] in normalized device coordinates (NDC) for both the x- and y-coordinates. ...
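As a concrete reading of the NDC extent mentioned above, the standard OpenGL viewport transform maps the [−1, 1] NDC range onto window pixels. A small Python sketch (viewport origin and size passed as parameters, mirroring glViewport):

```python
def ndc_to_window(x_ndc, y_ndc, vx, vy, vw, vh):
    """OpenGL viewport transform: NDC [-1, 1]^2 -> window coordinates.

    (vx, vy) is the viewport origin, (vw, vh) its size in pixels.
    """
    return (vx + (x_ndc + 1.0) * 0.5 * vw,
            vy + (y_ndc + 1.0) * 0.5 * vh)
```

So NDC (−1, −1) maps to the viewport's lower-left corner and (1, 1) to its upper-right corner.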
Thesis (full-text available)
The architecture, engineering and construction (AEC) industries are currently undergoing a change from a drawing-based form of information exchange to a model-based one. Using the concept of Building Information Models (BIM), the content produced by architects and designers has evolved from traditional 2D drawings to object-oriented 3D models embedded with information describing a building in detail. This, in turn, has opened up new possibilities of using real-time visualization and Virtual Reality (VR) as a tool for communication and understanding during the design process. However, as they are primarily created to describe a complete building in detail, many 3D datasets extracted from BIMs are too large and complex to be used directly as real-time visualization models. Because of this, it is still difficult to integrate VR and real-time visualization as a commonly used tool during the design process. The recent introduction of a new generation of Head-Mounted Displays (HMD) has made the situation even more challenging. Although these new types of VR devices offer huge potential in terms of realism, sense of scale and overall suitability for design and decision-making tasks, they are also far more demanding when it comes to real-time rendering performance. To address the current situation, this thesis contributes the design and evaluation of a new software application that provides a simple interface from BIM to VR. Following a design science research approach, this application has been developed to fulfil a set of requirements identified as important for VR and real-time visualization to become an everyday tool for design and communication during the building design process. Along that path, three new technical solutions have been developed:
- An efficient cells-and-portals culling system automatically realized from BIM data.
- An efficient approach for integrating occlusion culling and hardware-accelerated geometry instancing.
- An efficient single-pass stereo rendering technique.
The final system, BIMXplorer, has been evaluated using several BIMs received from real-world projects. Regarding rendering performance, navigation interface and the ability to support fast design iterations, it has been shown to have all the properties needed to function well in practice. To some extent this can also be considered formally validated, as the system is already in active use within both industry and education.
Article (full-text available)
This paper describes and investigates stereo instancing — a single-pass stereo rendering technique based on hardware-accelerated geometry instancing — for the purpose of rendering building information models (BIM) on modern head-mounted displays (HMD), such as the Oculus Rift. It is shown that the stereo instancing technique is very well suited for integration with query-based occlusion culling as well as conventional geometry instancing, and it outperforms the traditional two-pass stereo rendering approach, geometry shader-based stereo duplication, as well as brute-force stereo rendering of typical BIMs on recent graphics hardware.
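A hypothetical sketch of the stereo-instancing idea described above (the function names and the even/odd eye convention are illustrative assumptions, not the paper's API): each draw call doubles its instance count, and the vertex stage uses instance parity to select the per-eye view and output region in a single pass.

```python
LEFT, RIGHT = 0, 1

def expand_instances(draws):
    """Double the instance count of every (mesh, count) draw for single-pass stereo."""
    return [(mesh, count * 2) for mesh, count in draws]

def eye_of_instance(instance_id):
    """Even instances render the left eye, odd instances the right eye."""
    return LEFT if instance_id % 2 == 0 else RIGHT

def original_instance(instance_id):
    """Recover the caller's instance index from the doubled instance stream."""
    return instance_id // 2
```

On the GPU the same parity test would run on gl_InstanceID in the vertex shader, picking the eye's view-projection matrix and clipping the result to the corresponding half of the render target.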