Realistic Real-Time Outdoor Rendering in Augmented
Reality
Hoshang Kolivand, Mohd Shahrizal Sunar*
MaGIC-X (Media and Games Innovation Centre of Excellence), UTM-IRDA Digital Media Centre, Universiti Teknologi Malaysia, Skudai, Johor, Malaysia
Abstract
Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems.
Citation: Kolivand H, Sunar MS (2014) Realistic Real-Time Outdoor Rendering in Augmented Reality. PLoS ONE 9(9): e108334. doi:10.1371/journal.pone.0108334
Editor: Rongrong Ji, Xiamen University, China
Received February 25, 2014; Accepted August 26, 2014; Published September 30, 2014
Copyright: © 2014 Kolivand, Sunar. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: The research paper was supported by Universiti Teknologi Malaysia (UTM) using Exploratory Research Grant Scheme (ERGS) vot number
R.J130000.7828.4L092. Special thanks to Ministry of Higher Education (MOHE) and Research Management Centre (RMC) for providing financial support of this
research. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing Interests: The authors have declared that no competing interests exist.
* Email: shahrizal@utm.com
Introduction
In contrast to indoor rendering, outdoor rendering consists of more components, for example: the position of the sun, sky colours, shadows, rainbows, haze, trees and grass. This paper begins by attempting a working definition of some of the more important parameters for outdoor rendering. The position of the sun, sky colours, shadows and the interaction between the sky colours and other objects are the most significant components when it comes to outdoor environments. These factors are taken into account because they are the prominent components of outdoor environments [1] [2].
Over the past two decades, Augmented Reality (AR) has become one of the most enthralling topics, not only in computer graphics but also in other fields [3] [4] [5], encouraging researchers to obtain better results. In AR, realism can be achieved by introducing shadows as well as interaction between objects [6] [7] [8] [9].
In general, realistic augmented reality has been a critical topic in computer graphics since before the turn of the 21st century [10]. Here, to produce a realistic virtual object in real outdoor environments, the position of the sun, sky colours, shadows and the interaction between sky colours and objects are taken into account. Figure 1 represents the research area: the final focus area is shown, as well as the open issues in AR.
Studies concerning sky colours and shadows are the main resources for outdoor components, using grammars with sets of rules. Rendering outdoor components has been studied for the visualization of natural scenes in different contexts: animation, ecosystem simulations, video games, architectural design and flight simulators [11] [12].
Sky illumination on virtual objects is the most significant factor in outdoor rendering, not only in virtual environments but also in augmented reality systems [8] [13] [14] [15] [16] [17] [18] [19] [20]. Generating sky colours as a background for each outdoor scene is essential to make it more realistic. Illustration of the sky has become very important, as many buildings are designed so that the sky or other surrounding scenes are visible through the building windows [21].
Shadows are one of the prominent factors taken into consideration when it comes to enhancing realistic outdoor environments, as they convey the depth of the scene through the distances between the objects present. Without shadows and shadow casters, it is difficult to perceive the real size of objects compared to others that are placed further away.
Semi-soft shadows are the appropriate choice for outdoor environments, considering that the distance from the light source (the sun) is vast. Wide areas in outdoor environments require a somewhat specialised shadow generating technique to reveal the difference between the shadows of objects located closer to the camera's point of view and those located further away.
Casting virtual shadows on other virtual objects and real environments should be supported in realistic outdoor environments; hence, an advanced technique is introduced to achieve this.
PLOS ONE | www.plosone.org 1 September 2014 | Volume 9 | Issue 9 | e108334
The presented shadow generating technique is easily implemented not only in any virtual environment but also in all AR systems.
In outdoor AR games, the designer must choose the colours of virtual objects to create a quality image that reflects sky colour variations. Choosing colours suitable for outdoor AR games requires extensive investigation, and accurate results have not been attained yet [22] [23], especially in the real-time case [24] [7] [17] [8] [9]. Revealing the effects of sky colour on the virtual objects is the final objective taken into account to enhance the realism of the outdoor AR system.
An appropriate technique is required to integrate all the mentioned factors in augmented reality. The technique removes the problems associated with colour selection. Furthermore, it has the additional advantage of exhibiting the interaction between sky colours and virtual objects, like what can be seen on real objects during the day [25] [26] [27] [28].
This article includes two new ideas to generate a realistic real-time outdoor environment. A semi-soft shadow generating technique with high quality and a lower rendering cost is presented, as required for wide-scale outdoor environments. Implementing the proposed shadow technique in AR systems is a further contribution of this study, enabling virtual shadows on both virtual and real objects. The integration technique in an AR system can be regarded as an additional achievement towards the main goal of this work.
Previous Works
Blinn [29] was the first researcher to use indirect illumination to demonstrate the actual distance between objects, in what is known as reflection mapping. The method was improved by [30] and then [31], who used diffuse and specular reflection for the corresponding components of reflection. Nishita [32] and Ward [33] illuminated real-time environments in computer graphics. A model specifically designed for realistic rendering of large-scale scenes was proposed by [34]. Stumpfel [35] is another researcher who worked on illumination of the sun and sky to produce realistic environments.
Daylight is a combination of all direct and indirect light originating from the sun and the diffuse reflection of other objects. In other words, daylight includes direct sunlight, diffuse sky radiation and both of them reflected from the earth and terrestrial objects. The intensity of skylight, or sky luminance, is not uniform and depends on the clarity of the sky [32].
Figure 1. Research focus area.
doi:10.1371/journal.pone.0108334.g001
The sun and sky are the main sources of natural illumination. The sun is a luminary that simulates the effect of sunlight and can be used to show how the shadows cast by a structure affect the surrounding area. The angle of the light from the sun is controlled by one's location, date and time. Skylight is the most important outdoor illumination for making the scene realistic [2].
Hosek et al. [36] did significant work on sky colour generation based on the Perez model, which suffers from turbidity issues. Realistic sky colour generation is still based on the techniques of [37] and [34], which we use as well.
To achieve a realistic mixed reality, shadows play an important role and are indispensable for the 3D impression of the scene [38] [39] [40]. AR simulation of shadows for a virtual object in real environments is difficult because it requires reconstruction of the real-world scene, especially when details approximating the real scene geometry and the light source must be known [41].
Jacobs et al. [42] prepared a classification of the illumination
methods into two different groups, common illumination [41] [43]
[44] [45] [15] [46] and relighting [47] [48] in mixed reality. The
credibility of shadow construction with the correct estimation of
light source position can be found in [48] [6] [49] [50].
Casting virtual shadows on other virtual and real objects is one of the existing issues in augmented reality. Haller et al. [45] modified shadow volumes to generate shadows in AR. In this algorithm, a rough virtual counterpart of each real object, called a phantom, is simulated. The silhouettes of both the virtual and the phantom objects are detected. Phantom shadows can be cast on virtual objects, and virtual shadows can be cast on phantom objects. This method requires many phantoms to cover the real scene. Silhouette detection, the expensive part of shadow volumes, is the main disadvantage of this technique, especially when it comes to complicated scenes. Recognizing a real object, as well as generating the phantoms, is another problem with this algorithm.
Jacobs et al. [41] introduced a technique to create the virtual shadows of real objects with respect to a virtual light source, where the real objects and the virtual light source are equipped with 3D sensors. Projection shadows are used for simpler objects, while Shadow Maps (SMs) [51] are applied for more complicated ones. They proposed a real-time rendering method to simulate colour-consistent shadows of virtual objects in mixed reality.
Yeoh et al. [18] proposed a technique for realistic shadows in mixed reality using a shadow segmentation approach which recovers geometrical information on multiple faded shadows. The paper focused on dynamic shadow detection in a dynamic scene for future requirements in mixed reality environments. The technique is similar to the Shadow Catcher in [52], but in dynamic scenes.
Aittala [19] applied Convolution Shadow Maps [53] to produce soft shadows in AR, employing both mip-map filtering and fast summed-area tables [54] to enable blurring with a variable radius. The method is applicable to both external and internal scenes.
Castro et al. [23] proposed a method to produce soft shadows with less aliasing, which uses a fixed distance relative to the marker, but with only one camera. The method also performs sphere mapping as in [49], but selects the light source or sources most representative of the scene. This is important because of the hardware limitations of mobile devices. The method supports self-shadowing as well as soft shadowing. They used filtering methods such as Percentage Closer Filtering (PCF) [55] and Variance Shadow Maps (VSMs) [56] to generate soft shadows.
For the consideration of sunlight and skylight, [24] modelled an outdoor image as a linear combination of basis images accounting for the sun and sky light. The intensities of both sunlight and skylight are obtained by solving the system of equations. This research could capture the effect of the environment on virtual objects from a fixed viewpoint. The main issue with existing intrinsic image decomposition approaches is the unreliability of naturally captured images, over which there is little control. Manually picking regions of the image to find desirable sun and sky light, in order to make the algorithm reliable, is another problem.
Knecht et al. [57] applied a radiosity technique for blending virtual objects into real environments. Some shortcomings, such as light bleeding and double shadowing, resulted from combining instant radiosity and differential rendering [57]. The final work avoids inconsistent colour bleeding artifacts.
Kán et al. [58] used a ray tracing method and applied photon mapping to enhance the realism of virtual objects as well as the visual coherence between real and virtual objects in a composited image.
Madsen [9] estimated the outdoor illumination conditions in AR systems based on detecting dynamic shadows. They used shadow volumes to generate virtual shadows. The direct sun and sky radiances from pixel values of dynamic shadows in live video are taken into account.
Non-real-time rendering results from gathering many samples of the background image at different times, which is the main difference from our approach in this study [59,60]. Liu et al. [7] and Xing et al. [8] presented a static approach which considers the outdoor illumination by taking advantage of the essential association between the illumination factors and the statistical attributes of each video frame. As with the previous work of these authors, this research is viewpoint dependent. A desired future direction of these studies was to obtain these results in real-time, which is our approach.
The biggest issue with augmented reality is exact illumination with respect to the environment, to make the system maximally realistic [16] [18] [19] [61] [62] [20] [63] [8] [9]. In the case of indoor rendering, the light colour and the effect of other objects on virtual objects, and vice versa, are important and can be taken into
Figure 2. The zenithal and azimuthal angles on the hemisphere.
doi:10.1371/journal.pone.0108334.g002
account to make objects more realistic. In the case of outdoor rendering, involving sun and sky light, the effect of skylight or sky colour plays a more significant role [64].
Kolivand et al. [65] proposed a technique to apply the effect of the sky colour on virtual objects in augmented reality at any specific location, date and time. The main issue with that method is that shadows are cast on flat surfaces only, due to the use of projection shadows for shadow generation. In this study we have tried to overcome this issue by casting virtual shadows on other virtual and real objects, with respect to the interaction between sky colour and augmented objects, like what can be seen on real objects during daytime.
Methods and Materials
Sky Modelling
Before determining the position of the sun, the sky must be modelled [66]. A virtual dome is a convenient tool for creating the sky. There are two ways to model the dome: using 3D modelling software such as 3D Max Studio, Rhino or Maya, or using a mathematical function. Mathematical modelling is adopted for this real-time environment since it is easier to handle in the real-time case. The dome is like a hemisphere in which the viewpoint is located. Suppose that the earth is a sphere. The Julian date provides a precise way to calculate the sun's position [67]. The position can be calculated for a specific longitude, latitude, date and time using the Julian date. The time of day is calculated using Equation 1.
t = t_s + 0.170 \sin\left(\frac{4\pi(J - 80)}{373}\right) - 0.129 \sin\left(\frac{2\pi(J - 8)}{355}\right) + \frac{12(SM - L)}{\pi}    (1)

where,
t: Solar time
t_s: Standard time
J: Julian date
SM: Standard meridian
L: Longitude
The solar declination is calculated as in Equation 2. The time is calculated in decimal hours and the angles in radians. Finally, the zenith and azimuth can be calculated as follows (Equations 3 and 4):
\delta = 0.4093 \sin\left(\frac{2\pi(J - 81)}{368}\right)    (2)

\theta_s = \frac{\pi}{2} - \sin^{-1}\left(\sin l \sin\delta - \cos l \cos\delta \cos\frac{\pi t}{12}\right)    (3)

\phi_s = \tan^{-1}\left(\frac{-\cos\delta \sin\frac{\pi t}{12}}{\cos l \sin\delta - \sin l \cos\delta \cos\frac{\pi t}{12}}\right)    (4)
where \theta_s is the solar zenith, \phi_s is the solar azimuth and l is the latitude. With the zenith and azimuth calculated (Figure 2), the sun's position becomes known.
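As a concrete sketch, Equations 1 to 4 can be coded directly. The Python fragment below is our own illustrative version (function and variable names are ours, not the paper's), using the constants as reconstructed above:

```python
import math

def solar_time(ts, J, SM, L):
    # Eq. (1): solar time from standard time ts (decimal hours),
    # Julian date J, standard meridian SM and longitude L (radians)
    return (ts
            + 0.170 * math.sin(4 * math.pi * (J - 80) / 373)
            - 0.129 * math.sin(2 * math.pi * (J - 8) / 355)
            + 12 * (SM - L) / math.pi)

def solar_declination(J):
    # Eq. (2): solar declination in radians
    return 0.4093 * math.sin(2 * math.pi * (J - 81) / 368)

def sun_zenith_azimuth(lat, delta, t):
    # Eqs. (3) and (4): zenith and azimuth from latitude lat,
    # declination delta and solar time t (decimal hours)
    h = math.pi * t / 12
    theta = math.pi / 2 - math.asin(
        math.sin(lat) * math.sin(delta)
        - math.cos(lat) * math.cos(delta) * math.cos(h))
    phi = math.atan2(
        -math.cos(delta) * math.sin(h),
        math.cos(lat) * math.sin(delta)
        - math.sin(lat) * math.cos(delta) * math.cos(h))
    return theta, phi

# Sanity check: at the equator on the equinox (delta = 0), the sun
# is at the zenith at solar noon and on the horizon at 6:00.
noon, _ = sun_zenith_azimuth(0.0, 0.0, 12.0)
dawn, _ = sun_zenith_azimuth(0.0, 0.0, 6.0)
```

Using `atan2` rather than a bare `tan^{-1}` keeps the azimuth in the correct quadrant over a full day.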
Perez Sky Model
The model conveniently gives the luminance of an arbitrary point (\theta_p, \gamma_p) of the sky dome with respect to the position of the sun. It uses the CIE standard [68] and can be used for a wide range of atmospheric conditions. The luminance of a point (\theta_p, \gamma_p) is calculated using Equations 5 and 6:
L(\theta_p, \gamma_p) = \left(1 + A e^{B/\cos\theta_p}\right)\left(1 + C e^{D\gamma_p} + E \cos^2\gamma_p\right)    (5)

\gamma_p = \cos^{-1}\left(\sin\theta_s \sin\theta_p \cos(\phi_p - \phi_s) + \cos\theta_s \cos\theta_p\right)    (6)
Where:
A: Darkening or brightening of the horizon
Figure 3. Left: theory of Z-GaF Shadow Maps when light and view direction are perpendicular, Right: Z-partitioning with 3 partitions in 1024*1024
resolution.
doi:10.1371/journal.pone.0108334.g003
B: Luminance gradient near the horizon
C: Relative intensity of circumsolar region
D: Width of the circumsolar region
E: Relative backscattered light received at the earth surface
Essentially, to use the Yxy colour space, the following three components are needed. For each point of view, the luminance Y is calculated by:
Y = Y_z \frac{L(\theta_p, \gamma_p)}{L(0, \theta_s)}    (7)

The chromaticities x and y are calculated by:

x = x_z \frac{L(\theta_p, \gamma_p)}{L(0, \theta_s)}    (8)

y = y_z \frac{L(\theta_p, \gamma_p)}{L(0, \theta_s)}    (9)
To colour the sky, the introduced formulae must be evaluated iteratively for every pixel. Involving the date and time at a specific location enables the exact colour reproduction of each pixel.
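A per-pixel evaluation of Equations 5 to 9 can be sketched as follows. This is an illustrative CPU version (names are ours); the A..E coefficients shown are sample CIE clear-sky-like values, while in practice they come from the turbidity-dependent tables of the Perez/Preetham models:

```python
import math

def perez(theta, gamma, A, B, C, D, E):
    # Eq. (5): Perez luminance distribution for a sky point at
    # zenith angle theta and angular distance gamma from the sun
    return ((1 + A * math.exp(B / max(math.cos(theta), 1e-6)))
            * (1 + C * math.exp(D * gamma) + E * math.cos(gamma) ** 2))

def gamma_angle(theta_p, phi_p, theta_s, phi_s):
    # Eq. (6): angle between the viewed sky point and the sun
    c = (math.sin(theta_s) * math.sin(theta_p) * math.cos(phi_p - phi_s)
         + math.cos(theta_s) * math.cos(theta_p))
    return math.acos(max(-1.0, min(1.0, c)))

def sky_component(theta_p, phi_p, theta_s, phi_s, zenith_value, coeffs):
    # Eqs. (7)-(9): scale the zenith value (Y_z, x_z or y_z) by
    # the ratio L(theta_p, gamma_p) / L(0, theta_s)
    g = gamma_angle(theta_p, phi_p, theta_s, phi_s)
    return zenith_value * perez(theta_p, g, *coeffs) / perez(0.0, theta_s, *coeffs)

# Illustrative clear-sky-like coefficients (A, B, C, D, E).
coeffs = (-1.0, -0.32, 10.0, -3.0, 0.45)
# At the zenith (theta_p = 0) the ratio is 1, so Y equals Y_z.
Y = sky_component(0.0, 0.0, 0.3, 1.0, 5.0, coeffs)
```

The same `sky_component` call is made for Y, x and y, after which Yxy is converted to RGB for display.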
Z-Partitioning, Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps)
Shadow maps are convenient for casting shadows on other objects but suffer from aliasing. Applying Z-partitioning to conventional shadow maps and setting the resolution of the partitions can resolve the aliasing, as many other works in the literature have shown. Semi-soft shadows are the most suitable type of shadow for outdoor rendering. To generate semi-soft shadows, a Gaussian approach is employed on the shadow maps improved by Z-partitioning.
Although shadows demonstrate the actual distance between objects in virtual reality, AR systems still seem to lack a sense of the distance between real and virtual objects; virtual objects usually appear nearer to the camera than intended in the augmented scene. In outdoor AR systems, this issue arises more than in indoor rendering due to the long distances and wide areas of outdoor environments. Applying a specific Fog parameter [69] in the spatial partition of the view frustum, which is split in advance, makes virtual objects appear far from the camera and is consequently suitable for the far distances in outdoor environments. The algorithm is summarised in Algorithm S1.
Applying Z-partitioning and Gaussian approximation to shadow maps reduces aliasing by increasing the resolution for areas of the scene that are closer to the point of view and decreasing the resolution for areas that are far away (Figure 3 (Left)). Z-partitioning is done by splitting the camera view frustum into segments and filling the z-buffer for each segment separately (Figure 3 (Right)). Assigning a suitable resolution to each fragment depends on the fragment's z-value. This idea is useful for wide scenes such as large terrain.
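The per-partition resolution assignment amounts to a simple lookup on the fragment's z-value. A minimal sketch (the boundary values and map sizes below are hypothetical, chosen only for illustration):

```python
def partition_resolution(z, boundaries, resolutions):
    # Pick a shadow-map resolution from the view-frustum partition
    # the fragment's z-value falls into: high resolution near the
    # camera, low resolution far away.
    for end, res in zip(boundaries[1:], resolutions):
        if z <= end:
            return res
    return resolutions[-1]

# Hypothetical partition boundaries (near .. far) and per-partition sizes.
boundaries = [1.0, 10.0, 40.0, 100.0]
resolutions = [2048, 1024, 512]
near_res = partition_resolution(5.0, boundaries, resolutions)
far_res = partition_resolution(80.0, boundaries, resolutions)
```

This reflects the trade-off described above: quality where the viewer can see it, speed where they cannot.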
View frustum splitting is based on the earliest such technique [70] and starts from the first object in the scene. This allows the GPU to ignore the parts of the scene which make no rendering contribution. In addition to making the algorithm much faster, this reduces the number of layers considerably.
View frustum splitting allows a shadow map to be generated for each split part and its resolution to be changed. The different types of splitting affect the final quality and rendering speed. Uniform, logarithmic and practical splitting schemes are the common types, as can be seen in Figure 4.
Although parallel-split schemes are proposed for reducing aliasing, a uniform split scheme does not rectify the aliasing problem: the uniform distribution of perspective aliasing behaves no differently from standard shadow mapping. In this case, the perspective aliasing increases hyperbolically as objects move towards the camera. The logarithmic scheme is convenient for near objects, but when objects are not located in front of the camera it is not suitable in general cases.
As the logarithmic and uniform schemes cannot cover anti-aliasing for both near and far objects, taking their average can be
Figure 4. Split schemes, Left: Uniform splitting, Center: Logarithmic splitting, Right: Practical splitting.
doi:10.1371/journal.pone.0108334.g004
Figure 5. View frustum and light frustum mix to create Parallel
Split Shadow Maps.
doi:10.1371/journal.pone.0108334.g005
beneficial (Figure 5). Simply put, if C_i is the i-th split of practical splitting, then

C_i^{practical} = \left(C_i^{logarithmic} + C_i^{uniform}\right) / 2    (10)
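The split positions are cheap to compute. The sketch below (our own naming; the near/far values are arbitrary examples) implements Equation 10, with a weight parameter anticipating the λ-blended generalisation given later in Equation 11:

```python
def uniform_split(i, m, n, f):
    # C_i^uni = n + (f - n) * i / m
    return n + (f - n) * i / m

def logarithmic_split(i, m, n, f):
    # C_i^log = n * (f / n) ** (i / m)
    return n * (f / n) ** (i / m)

def practical_split(i, m, n, f, lam=0.5):
    # Eq. (10) when lam = 0.5; the general blend for 0 <= lam <= 1
    return (lam * logarithmic_split(i, m, n, f)
            + (1 - lam) * uniform_split(i, m, n, f))

# Three partitions between near plane n = 1 and far plane f = 100.
splits = [practical_split(i, 3, 1.0, 100.0) for i in range(4)]
```

The first and last split positions always coincide with the near and far planes, and the interior positions sit between the tight logarithmic spacing and the coarse uniform spacing.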
The presented splitting technique is applicable to both near and far objects. It requires a non-negative bias to adjust the clipping. There are some simple ways to reduce the bias; increasing the depth precision is a method better suited to the near and far planes of the camera frustum.
Splitting whole scenes into multiple partitions helps control the
resolution in different parts of a scene. A major difference between
cascade shadow maps and the new approach is the non-uniform
partitions.
In the proposed technique, there is no extra bias, and it addresses bias concerns in most cases. A drawback of the proposed technique was evident when the light frustum was parallel to the view frustum.
Approximating the depth distribution using a Gaussian approach not only generates smoother shadow boundaries but also reduces the computational and storage cost.
Figure 6. (A): The first two steps of Z-GaF which is conventional Shadow Maps, (B): Applying the presented Z-partitioning, (C):
Applying Gaussian approximation on shadow maps, (D): Soft Shadows.
doi:10.1371/journal.pone.0108334.g006
Figure 7. (A): Logarithm splitting, (B): Practical splitting.
doi:10.1371/journal.pone.0108334.g007
The best way to create the illusion of depth is to take the colour value into account with respect to the distance from the viewpoint, which is what fog provides. Fog is one of the most widely used effects in outdoor games, whereby the size and the reality of the environment are conveyed. By enabling depth testing and fog, and choosing the fog mode, fog colour and fog density for the closest partition (which is set to a high resolution), a realistic fog effect is constructed. The fog reduces the sharpness of the virtual objects; therefore, far-away virtual objects appear to fade into the background, similar to what can be seen in real environments. By setting the starting and ending distances for the fog, not only in the first partition but also in any other partition, fog can be applied to any specific virtual object.
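The start/end distance behaviour described above corresponds to the classic linear fog equation. A minimal sketch (the colours and distances are invented for illustration; the blend follows the OpenGL GL_LINEAR fog convention):

```python
def linear_fog_factor(z, start, end):
    # OpenGL-style linear fog: f = (end - z) / (end - start), clamped.
    # f = 1 means no fog (near the camera); f = 0 means full fog (far).
    f = (end - z) / (end - start)
    return max(0.0, min(1.0, f))

def apply_fog(color, fog_color, z, start, end):
    # Blend the object colour towards the fog colour with distance.
    f = linear_fog_factor(z, start, end)
    return tuple(f * c + (1 - f) * g for c, g in zip(color, fog_color))

# A red object: unchanged when nearer than the fog start, fully
# replaced by a grey, sky-like fog colour beyond the fog end.
near_col = apply_fog((1.0, 0.0, 0.0), (0.6, 0.6, 0.7), 5.0, 10.0, 100.0)
far_col = apply_fog((1.0, 0.0, 0.0), (0.6, 0.6, 0.7), 150.0, 10.0, 100.0)
```

Setting `start`/`end` per partition, as the text describes, lets fog be enabled for exactly the partitions (and hence objects) that should fade.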
In situations where the light direction is not perpendicular to the view direction (Figure 4 (Left)), splitting the depth map into non-intersecting layers and creating one shadow map for each layer in light space can remove the redundancy. Each shadow map is generated through irregular frustum clipping and scene organisation. This makes it possible to have different shadow maps without any intersecting sample points.
C_i = \lambda C_i^{log} + (1 - \lambda) C_i^{uni}    (11)

C_i^{log} = n\left(\frac{f}{n}\right)^{i/m}, \quad 0 \le \lambda \le 1, \quad 0 \le i \le m    (12)

C_i^{uni} = n + (f - n)\frac{i}{m}    (13)
where m is the number of splits, and n and f are the near and far plane clippings, respectively. C_i^{log} and C_i^{uni} are the two classic splitting schemes, with further details in [71]. \lambda is the split
Table 1. Speed of rendering measured in FPS at different resolutions.

Method    1024*1024    2048*2048
SMs       122          116
PCF       75           63
CSMs      84           79
Z-GaF     96           90

doi:10.1371/journal.pone.0108334.t001
Figure 8. (A): Shadow Maps, (B): PCF, (C): CSMs with Gaussian blurring, (D): Z-GaF Shadow Maps.
doi:10.1371/journal.pone.0108334.g008
position weight which depends on the practical requirements of
the application.
L_i = W_i - (W_i \cap W_{i-1})    (14)

where W_i is the light frustum split with respect to the view frustum split (V_i), and W_0 = \emptyset.
Implementation and Results
Z-GaF Shadow Maps
The first two steps of the Z-GaF algorithm implement conventional shadow maps, whose results are illustrated in Figure 6 (A). All pictures are captured at 1024*1024 resolution. The shadows of the tree are cast on the elephant. Self-shadowing can be observed on some parts of the elephant's body, especially the shadows of the ears and tusks. In Figure 6 (A), aliasing is the main issue, which makes the shadows less realistic.
Splitting the Z-depth into 2 to 4 partitions, depending on the distance of the objects from the camera's viewpoint, allows the resolution of each partition to be changed (Figure 6 (B)). High resolution generates high quality while producing a low FPS. The close partitions are set with a high enough resolution to enhance the realism of the objects. Low resolution reduces the rendering time, consequently increasing the rendering speed (Figure 6 (C) and (D)). Obviously, when a wide scene like an outdoor environment is rendered at a uniform resolution, some parts of it which are located far away from the camera may not be seen very well, wasting the GPU's and CPU's time. Therefore, they are rendered at low resolution. Practical splitting is tested to generate an appropriate distribution of the partitions, which can be seen in Figure 7.
Figure 9. Integration of Z-GaF Shadow Maps and sky colour, January 1st at Universiti Teknologi Malaysia at different times of day.
doi:10.1371/journal.pone.0108334.g009
Figure 10. Left: Eiffel Tower, captured on 6 October at 16:03 (Source: http://www.earthcam.com/), Right: The software-generated result for the Eiffel Tower position on 6 October at 16:03 (http://www.flickr.com/photos/118766222@N04/12784543685/).
doi:10.1371/journal.pone.0108334.g010
Higher resolution results in higher shadow quality, but suffers from an increase in rendering time. To overcome this problem and keep the trade-off balanced between quality and rendering speed, the view frustum is split into different partitions. The number of partitions can be set manually. Figure 7 shows the difference between our proposed technique and previous ones for determining the best-suited splitting. To enhance the quality of the shadows, the close partitions are set with a higher resolution, while to reduce the rendering time, the far partitions are set with a lower resolution. The results of assigning low resolution to some of the partitions can be observed in Table 1, where the FPS remains high enough.
In Figure 7 (A), the partition distribution is based on a logarithmic function; the partition locations are not appropriately selected. In Figure 7 (B), the partitioning is constructed based on practical splitting. The beginning of each partition is marked by a red arrow.
The integration of the presented Z-partitioning approach and Gaussian approximation not only generates convenient semi-soft shadows compared to PCF and Cascaded Shadow Maps (CSMs) [72], but also produces no light leaking as compared with VSMs [56] and Layered Variance Shadow Maps (LVSMs) [73]. The main concern with VSMs and Convolution Shadow Maps (CoSMs) [71] is light bleeding, due to the Chebyshev inequality used as an upper bound for the light visibility test and the exponential approximation, respectively. Our upper bound approximation, which is based on a Gaussian distribution for all layers, generates semi-soft or even soft shadows, as can be seen in Figure 8 (D).
Figure 11. Left: Eiffel Tower, captured on 24 October at 16:23 (Source: http://www.earthcam.com/), Right: The software-generated result for the Eiffel Tower position on 24 October at 16:23 (http://www.flickr.com/photos/118766222@N04/12784631475/).
doi:10.1371/journal.pone.0108334.g011
Figure 12. Left: Eiffel Tower, captured on 5 September at 17:19 (Source: http://www.earthcam.com/), Right: The software-generated result for the Eiffel Tower position on 6 September at 17:19 (http://www.flickr.com/photos/118766222@N04/12785060384/).
doi:10.1371/journal.pone.0108334.g012
Figure 8 draws a comparison between previous algorithms and Z-GaF Shadow Maps at 1024*1024 resolution. Figure 8 (A) is the result of conventional Shadow Maps, while (B) is the result of PCF. Figure 8 (C) illustrates CSMs using Gaussian blurring. Figure 8 (D) is the result of Z-GaF Shadow Maps, which produces accurate shadows with semi-soft edges for outdoor environments.
Integration of Sky colour and Z-GaF Shadow Maps
The integration of sky colours and Z-GaF Shadow Maps in real-time environments is performed successfully. Z-GaF Shadow Maps produce high-quality semi-soft shadows compared to previous algorithms. By combining Z-GaF Shadow Maps and sky colour, using a friendly GUI to set up the specific location, date and time, an outdoor rendering application is provided. In the next section, an evaluation of the proposed integration is presented in detail.
Figure 9 illustrates the implementation of Z-GaF Shadow Maps
and sky colour. The effect of sky colour can be observed from the
pictures.
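One plausible way to combine the two components, sketched here with illustrative names rather than the authors' actual shader, is to let the generated sky colour drive the ambient term while the Z-GaF shadow factor (1.0 fully lit, 0.0 fully shadowed) attenuates only the direct sun term:

```python
def shade(albedo, sky_colour, sun_colour, shadow_factor):
    # Ambient contribution tinted by the generated sky colour.
    ambient = tuple(a * s for a, s in zip(albedo, sky_colour))
    # Direct sunlight attenuated by the Z-GaF shadow factor.
    direct = tuple(a * s * shadow_factor for a, s in zip(albedo, sun_colour))
    # Clamp to the displayable range.
    return tuple(min(1.0, amb + dr) for amb, dr in zip(ambient, direct))
```

Under this scheme a fully shadowed fragment takes on the sky tint alone, which is what makes virtual objects darken and recolour consistently with the real ones as the sky changes.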
There are many web camera service providers on the World Wide Web; however, most web cameras do not offer a view of the sky. Many of them show traffic or crowds, and only a few capture the sky panorama. Figures 10, 11 and 12 show images of the Eiffel Tower in Paris, captured from France-Telecom (2012) at different times on different days.
Discussion
No extra stages are needed to generate virtual shadows on virtual objects when implementing Z-GaF Shadow Maps. Since they are based on shadow maps, casting virtual shadows onto other objects is an inherent capability of this category of shadow-generation techniques.
Figure 13 (A) illustrates a scene containing two virtual objects, a tree and an elephant. The virtual shadows of the tree are cast on the virtual elephant and on the real wall simultaneously. The shadow technique used in the left picture is simple shadow maps at 512×512 resolution, which does not produce adequate results. Applying PCF at 1024×1024 resolution in the right-hand picture yields better results (Figure 13 (B)).
Figure 14 shows the same scene as Figure 13, but with Z-GaF Shadow Maps applied instead. In picture (A), Z-GaF Shadow Maps without blurring cast virtual shadows on real and virtual objects, while in picture (B), Z-GaF Shadow Maps with the Gaussian approximation are employed to generate soft shadows.
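The blurring step can be sketched as a standard separable Gaussian filter applied to the per-texel shadow-test results (an illustrative CPU reimplementation; the paper performs this on the GPU):

```python
import math

def gaussian_kernel(radius, sigma):
    # Normalised 1D Gaussian weights for a separable blur.
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    total = sum(k)
    return [w / total for w in k]

def blur_1d(row, kernel):
    # One separable pass over a row of shadow-test results
    # (0 = shadowed, 1 = lit), with clamp-to-edge addressing.
    r = len(kernel) // 2
    n = len(row)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), n - 1)
            acc += w * row[idx]
        out.append(acc)
    return out
```

Applied horizontally and then vertically, this turns the hard 0/1 shadow boundary into a gradual penumbra of the kind visible in Figure 14 (B).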
Figure 13. (A) Conventional Shadow Maps on virtual and real objects, (B) PCF shadows on virtual and real objects.
doi:10.1371/journal.pone.0108334.g013
Figure 14. Casting Z-GaF Shadow Maps on virtual and real environments simultaneously, (A) Z-GaF Shadow Maps, (B) Soft shadows
using Z-GaF Shadow Maps.
doi:10.1371/journal.pone.0108334.g014
Castro et al. [23] proposed a method to produce semi-soft shadows with less aliasing using a fixed distance relative to the marker, but with only one camera (Figure 15 (A)). The method also performs sphere mapping, as in [49], but selects the source or sources of light most prominent in the scene; this matters because of the hardware limitations of mobile devices. The method supports neither self-shadowing nor soft shadowing, so filtering methods such as PCF [55] and VSMs [56] were used to generate soft shadows. To compare our research with [23], Z-GaF Shadow Maps were implemented to cast soft shadows on other virtual and real objects comprising 10 million triangles, as can be seen in Figure 15 (B).
Collectively, Z-GaF Shadow Maps can be applied in augmented reality environments to generate shadows on other objects without aliasing or light bleeding. They are also suitable for soft shadow generation, not only in virtual environments but also in augmented reality systems.
Interaction Between Sky colour and Object in Outdoor
AR Environments
The technique integrates the position of the sun, sky colours, shadows, and the interaction between sky colours and augmented objects. The position of the sun is managed using the Julian date and a GUI for setting the location, date and time. The sky colours are generated based on the Perez model [37] but are not visible to the camera, as the sky dome lies beyond the view frustum. Augmented objects are loaded as OBJ models. Both marker and markerless techniques are applied to control the location, direction and orientation of the augmented objects. Shadows appear using Z-GaF Shadow Maps. The generated sky colour exerts the energy of each patch onto all visible patches of the augmented objects using RCC [65]. Convergence rates can be set through the GUI to find the best interaction compared to the real objects. The desired interaction is achieved by comparison with the real objects, which are the best benchmark for the current work, in the manner advocated by most researchers [74] [75] [24] [7] [17] [8].
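As an illustration of how the Julian date fixes the sun, and therefore the shadow direction, for a given place and moment, the following sketch uses standard low-accuracy astronomical formulas; it is not the paper's code, and a production system would use a validated ephemeris:

```python
import math

def julian_day(year, month, day, hour_utc):
    # Julian date from a Gregorian calendar date (standard formula).
    if month <= 2:
        year -= 1
        month += 12
    a = year // 100
    b = 2 - a + a // 4
    jd = (math.floor(365.25 * (year + 4716))
          + math.floor(30.6001 * (month + 1))
          + day + b - 1524.5)
    return jd + hour_utc / 24.0

def sun_elevation(jd, latitude_deg, longitude_deg):
    # Low-accuracy solar elevation in degrees, via the sun's ecliptic
    # longitude, declination, right ascension and local hour angle.
    n = jd - 2451545.0                                   # days since J2000.0
    L = (280.460 + 0.9856474 * n) % 360.0                # mean longitude
    g = math.radians((357.528 + 0.9856003 * n) % 360.0)  # mean anomaly
    lam = math.radians(L + 1.915 * math.sin(g) + 0.020 * math.sin(2 * g))
    eps = math.radians(23.439)                           # obliquity of the ecliptic
    decl = math.asin(math.sin(eps) * math.sin(lam))
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    gmst_hours = (18.697374558 + 24.06570982441908 * n) % 24.0
    lst = math.radians((gmst_hours * 15.0 + longitude_deg) % 360.0)
    ha = lst - ra                                        # local hour angle
    lat = math.radians(latitude_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(ha))
    return math.degrees(math.asin(sin_el))
```

For Paris (48.85°N, 2.35°E) this gives a midsummer noon elevation of roughly 64° and a negative elevation at midnight, which is the kind of input the shadow and sky stages consume.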
In its first stage, the implementation starts from the ARToolkit multiple-marker loop function and then calls a function to render an OpenGL GLUT scene, passing the geometry of the scene as function parameters. The GLUT scene function calls another GLUT display method, which performs the initializations of the scene, calls the display loop, and determines the geometry of the virtual scene. Since the shadows of each object, based on Z-GaF Shadow Maps, are rendered within the scene itself, it is much easier for a programmer to render the shadows in AR environments. Moreover, to show a more realistic interaction between augmented and real objects, similar-looking primitive alpha objects for the background of the virtual
Figure 15. (A) Castro results [23], (B) Our results.
doi:10.1371/journal.pone.0108334.g015
Figure 16. A scene with and without augmented objects, at 9:55 on January 11th 2013 at Universiti Teknologi Malaysia.
doi:10.1371/journal.pone.0108334.g016
the real objects are lighter due to the real sky colour, virtual objects
are also lighter.
Figure 18 (left) shows an augmented scene with three virtual objects, captured at 15:28 on January 11th, while (right) shows the same scene in a different orientation. When the location or orientation of the augmented objects changes, the shadows remain in the same direction as the real ones.
The results presented in this section show the processes by which the objectives, and consequently the aim, of the research are achieved. The sky colours, the shadows and the effects of the sky on virtual objects in the AR system are applied progressively.
Conclusion and Future Works
This study provides a technique to demonstrate the interaction between sky colours and virtual objects in augmented reality, taking shadows into account. The main research contribution, in addition to the shadow improvement, is the realistic appearance of virtual objects in outdoor-rendering augmented reality environments. It involves 3D objects, sky colour effects and shadows, which together enhance the realism of AR systems.
In the first part, the sky colours with respect to the position of the sun at any specific location, date and time are successfully constructed. Longitude, latitude, date and time are the parameters required to calculate the exact position of the sun. The position is calculated based on the Julian date, and the sky colour is created based on the Perez model. The sky colour is implemented following Preetham's method [34], an analytic model of the actual atmosphere used for outdoor rendering.
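The Perez distribution at the heart of this stage is compact enough to sketch directly. Below, theta is the zenith angle of the sky element and gamma its angle to the sun; the luminance-channel coefficient fits follow Preetham et al. [34], and the function names are illustrative:

```python
import math

def preetham_luminance_coeffs(turbidity):
    # Distribution coefficients for the luminance channel as fitted by
    # Preetham et al.; turbidity controls atmospheric haze.
    T = turbidity
    return (0.1787 * T - 1.4630,
            -0.3554 * T + 0.4275,
            -0.0227 * T + 5.3251,
            0.1206 * T - 2.5771,
            -0.0670 * T + 0.3703)

def perez(theta, gamma, a, b, c, d, e):
    # Perez all-weather luminance distribution term F(theta, gamma);
    # cos(theta) is floored to avoid the singularity at the horizon.
    return ((1.0 + a * math.exp(b / max(math.cos(theta), 1e-4)))
            * (1.0 + c * math.exp(d * gamma) + e * math.cos(gamma) ** 2))

def relative_luminance(theta, gamma, theta_sun, coeffs):
    # Sky-element luminance relative to zenith: F(theta, gamma) / F(0, theta_sun).
    a, b, c, d, e = coeffs
    return perez(theta, gamma, a, b, c, d, e) / perez(0.0, theta_sun, a, b, c, d, e)
```

Evaluating this per vertex of the sky dome reproduces the horizon brightening and circumsolar glow that tint both the dome and, through the patch energies, the augmented objects.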
Another contribution of this research is a new algorithm that creates shadows of higher quality at higher frame rates than other algorithms such as Layered Variance Shadow Maps and Cascaded Shadow Maps. Z-GaF Shadow Maps have been tested to demonstrate this increase in quality in a typical application.
The integrated prototype has been tested for performance and does what users expect. A strategy for testing the results of the technique was carried out, including a precise choice of test data. Software was produced for testing purposes during the research; it helped to show that the calculations and software results are free from error. The results have also been compared with the real-world environment.
Interaction between virtual and real objects, beyond the interaction between sky colours and objects, could further enhance realism. Much work remains to be done to capture the influence of real objects on virtual ones and vice versa. Radiosity and ray tracing are the suggested techniques for such tasks. Radiosity is the more complicated process, requiring improvements to become fast enough to be applied in augmented reality environments as well as virtual environments.
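As a pointer for that future work, a single shooting step of progressive-refinement radiosity, in the spirit of [76], can be sketched as follows; the patch arrays and precomputed form factors are hypothetical placeholders:

```python
def shoot_once(radiosity, unshot, reflectance, form_factors):
    # Pick the patch with the most unshot energy and distribute it.
    i = max(range(len(unshot)), key=lambda k: unshot[k])
    delta = unshot[i]
    unshot[i] = 0.0
    for j in range(len(radiosity)):
        if j == i:
            continue
        # form_factors[i][j]: fraction of energy leaving patch i that reaches j.
        gathered = reflectance[j] * delta * form_factors[i][j]
        radiosity[j] += gathered
        unshot[j] += gathered
    return radiosity, unshot
```

Iterating until the largest unshot energy falls below a threshold would yield the mutual colour bleeding between real and virtual surfaces discussed above; the open problem is making such iterations fast enough for real-time AR.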
This software, in addition to helping game developers create outdoor games without worrying about shadow positions and sky colours at different times of day and days of the year, also makes it possible for physics teachers to teach about Earth's orbit and the effect the Sun has on shadows.
Supporting Information
Algorithm S1 Z-GaF Shadow Maps.
(PDF)
Algorithm S2 Radiosity Caster Culling (RCC).
(PDF)
Acknowledgments
This research was supported by Universiti Teknologi Malaysia (UTM). Special thanks to the Ministry of Higher Education (MOHE) and the Research Management Centre (RMC) for providing financial support for this research.
Author Contributions
Conceived and designed the experiments: HK. Performed the experiments:
HK MSS. Analyzed the data: HK. Contributed reagents/materials/
analysis tools: HK MSS. Wrote the paper: HK. Revised the manuscript:
MSS.
References
1. Nishita T, Nakamae E, Dobashi Y (1996) Display of clouds and snow taking into
account multiple anisotropic scattering and sky light. In Rushmeier, H, ed,
SIGGRAPH 96 Conference Proceedings, Annual Conference Series : 379–386.
2. Dobashi Y, Kaneda K, Yamashita H, Nishita T (1996) Method for calculation of
sky light luminance aiming at an interactive architectural design. Computer
Graphics Forum (Proc EUROGRAPHICS’96) 15: 112–118.
3. Ji R, Duan LY, Chen J, Yao H, Yuan J, et al. (2012) Location discriminative
vocabulary coding for mobile landmark search. International Journal of
Computer Vision 96: 290–314.
4. Wilson KL, Doswell JT, Fashola OS, Debeatham W, Darko N, et al. (2013)
Using augmented reality as a clinical support tool to assist combat medics in the
treatment of tension pneumothoraces. Military medicine 178: 981–985.
5. Zhu J, Ong S, Nee A (2014) A context-aware augmented reality assisted
maintenance system. International Journal of Computer Integrated Manufac-
turing : 1–13.
6. Jensen B, Laursen J, Madsen J, Pedersen T (2009) Simplifying real time light
source tracking and credible shadow generation for augmented reality. Institute
for Media Technology, Aalborg University.
7. Liu Y, Qin X, Xing G, Peng Q (2010) A new approach to outdoor illumination
estimation based on statistical analysis for augmented reality. Computer
Animation and Virtual Worlds 21: 321–330.
8. Xing G, Liu Y, Qin X, Peng Q (2012) A practical approach for real-time
illumination estimation of outdoor videos. Computers and Graphics 36: 857–
865.
9. Madsen CB, Lal BB (2013) Estimating outdoor illumination conditions based on
detection of dynamic shadows. Computer Vision, Imaging and Computer
Graphics Theory and Applications : 33–52.
10. Azuma R (1997) A survey of augmented reality. Presence: Teleoperators and
Virtual Environments 6: 355–385.
11. Klassen R (1987) Modeling the effect of the atmosphere on light. ACM
Transactions on Graphics 6: 215–237.
12. Sunar MS, Kari S, Bade A (2003) Real-time of daylight sky colour rendering and
simulation for virtual environment. IASTED International Conference on
Applied Simulation and Modeling (ASM 2003) : 3–5.
13. Kaneda K, Okamoto T, Nakamae E, Nishita T (1991) Photorealistic image
synthesis for outdoor scenery under various atmospheric conditions. The Visual
Computer 7: 247–258.
14. Tadamura K, Nakamae E, Kaneda K, Baba M, Yamashita H, et al. (1993)
Modeling of skylight and rendering of outdoor scenes. Computer Graphics
Forum 12: 189–200.
15. Gibson S, Cook J, Howard T, Hubbold R (2003) Rapid shadow generation in
real-world lighting environments. In Proceedings of Eurographics Symposium
on Rendering 2003 : 219–229.
16. Feng Y (2008) Estimation of light source environment for illumination
consistency of augmented reality. Image and Signal Processing, 2008 CISP'08
Congress on 3: 771–775.
17. Xing G, Liu Y, Qin X, Peng Q (2011) On-line illumination estimation of
outdoor scenes based on area selection for augmented reality. Computer-Aided
Design and Computer Graphics (CAD/Graphics), 2011 12th International
Conference : 43–442.
18. Yeoh RC, Zhou SZ (2009) Consistent real-time lighting for virtual objects in
augmented reality. in 8th IEEE International Symposium on Mixed and
Augmented Reality (ISMAR 2009) : 223–224.
19. Aittala M (2010) Inverse lighting and photorealistic rendering for augmented
reality. The Visual Computer 26: 669–678.
20. Kim Y (2010) Augmented reality of flexible surface with realistic lighting.
Ubiquitous Information Technologies and Applications (CUTE), 2010 Proceed-
ings of the 5th International Conference : 1–5.
21. Dobashi Y, Nishita T, Kaneda K, Yamashita H (1997) A fast display method of
sky colour using basis functions. The Journal of Visualization and Computer
Animation : 115–127.
22. Noh Z, Sunar M (2010) Soft shadow rendering based on real light source
estimation in augmented reality. Advances in Multimedia - An International
Journal (AMIJ) 1: 26–36.
23. Figueiredo LHd, Velho L, et al. (2012) Realistic shadows for mobile augmented
reality. Virtual and Augmented Reality (SVR), 2012 14th Symposium : 36–45.
24. Liu Y, Qin X, Xu S, Nakamae E, Peng Q (2009) Light source estimation of
outdoor scenes for mixed reality. The Visual Computer 25: 637–646.
25. Kittler R, Perez R, Darula S (1999) Universal models of reference daylight
conditions based on new sky standards. PUBLICATIONS-COMMISSION
INTERNATIONALE DE L ECLAIRAGE CIE 133: 243–247.
26. Kittler R, Darula S, Perez R (1998) A set of standard skies characterising
daylight conditions for computer and energy conscious design. Res Report of the
American-Slovak grant project US-SK 92: 240.
27. Kittler R, Darula S (1998) Parametrization problems of the very bright cloudy
sky conditions. Solar energy 62: 93–100.
28. Sik-Lányi C (2014) Styles or cultural background does influence the colors of
virtual reality games? Acta Polytechnica Hungarica 11.
29. Blinn J, Newell M (1976) Texture and reflection in computer generated images.
Communications of the ACM 19: 542–546.
30. Miller G, Hofman C (1984) Illumination and reflection maps: Simulated objects
in simulated and real environments. SIGGRAPH 84 Advanced Computer
Graphics Animation seminar notes : 1–12.
31. Greene N (1986) Environment mapping and other applications of world
projections. Computer Graphics and Applications, IEEE 6: 21–29.
32. Nishita T, Nakamae E (1986) Continuous tone representation of three-
dimensional objects illuminated by sky light. Computer Graphics 20: 112–118.
33. Ward GJ (1994) The radiance lighting simulation and rendering system. In
SIGGRAPH 94 : 459–472.
34. Preetham A, Shirley P, Smith B (1999) A practical analytic model for daylight.
Computer Graphics,(SIGGRAPH ’99 Proceedings) : 91–10.
35. Stumpfel J (2004) HDR lighting capture of the sky and sun.
36. Hosek L, Wilkie A (2012) An analytic model for full spectral sky-dome radiance.
ACM Transactions on Graphics (TOG) 31: 95.
37. Perez R, Seals R, Michalsky J (1993) All-weather model for sky luminance
distribution - preliminary configuration and validation. Solar Energy 50: 235–
245.
38. Naemura T, Nitta T, Mimura A, Harashima H (2002) Virtual shadows in mixed
reality environment using flashlight-like devices. Trans Virtual Reality Society of
Japan 7: 227–237.
39. Debevec P (2004) Image-based lighting. IEEE Computer Graphics and
Applications 22: 26–34.
40. Slater M, Spanlang B, Sanchez-Vives MV, Blanke O (2010) First person
experience of body transfer in virtual reality. PloS one 5: e10564.
41. Jacobs K, Nahmias JD, Angus C, Reche A, Loscos C, et al. (2005) Automatic
generation of consistent shadows for augmented reality. Proceedings of Graphics
Interface 2005 : 113–120.
42. Jacobs K, Loscos C (2004) Classification of illumination methods for mixed
reality. In Eurographics.
43. Madsen C, Nielsen M (2008) Towards probe-less augmented reality. A Position
Paper, Computer Vision and Media Technology Lab.
44. Madsen C, Sorensen M, Vittrup M (2003) The importance of shadows in
augmented reality. In Proceedings 6th Annual International Workshop on
Presence.
45. Haller M, Drab S, Hartmann W (2003) A real-time shadow approach for an
augmented reality application using shadow volumes. In Proceedings of VRST
03 : 56–65.
46. Agusanto K, Li L, Chuangui Z, Sing NW (2003) Photorealistic rendering for
augmented reality using environment illumination. Mixed and Augmented
Reality, 2003 Proceedings The Second IEEE and ACM International
Symposium : 208–216.
47. Loscos C, Drettakis G, Robert L (2000) Interactive virtual relighting of real
scenes. IEEE Transactions on Visualization and Computer Graphics 6: 289–
305.
48. Yan F (2008) Estimation of light source environment for illumination consistency
of augmented reality. In First International Congress on Image and Signal
Processing 3: 771–775.
49. Kanbara M, Yokoya N (2004) Real-time estimation of light source environment
for photorealistic augmented reality. In Proceedings of the 17th International
Conference on Pattern Recognition : 911–914.
50. Ji R, Duan LY, Chen J, Xie L, Yao H, et al. (2013) Learning to distribute
vocabulary indexing for scalable visual search. Multimedia, IEEE Transactions
on 15: 153–166.
51. Williams L (1978) Casting curved shadows on curved surfaces. SIGGRAPH ’78
12.
52. Hartmann W, Zauner J, Haller M, Luckeneder T, Woess W (2003) Shadow
catcher: a vision based illumination condition sensor using ARToolKit. In 2003
IEEE International Augmented Reality Toolkit Workshop (IEEE Cat
No03EX780) : 44–55.
53. Annen T, Dong Z, Mertens T, Bekaert HP, Kautz J (2008) Real-time, all-
frequency shadows in dynamic scenes. ACM Transactions on Graphics
(Proceedings of ACM SIGGRAPH 2008) 27: 1–34.
54. Hensley J, Scheuermann T, Coombe G, Singh M, Lastra A (2005) Fast summed-
area table generation and its applications. Comput Graph Forum 24: 547–555.
55. Reeves W, Salesin D, Cook RL (1987) Rendering antialiased shadows with depth
maps. Computer Graphics (Proceedings of SIGGRAPH 87) 21: 557–562.
56. Donnelly W, Lauritzen A (2006) Variance shadow maps. In Proceedings of the
2006 ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games :
161–165.
57. Knecht M, Traxler C, Mattausch O, Wimmer M (2012) Reciprocal shading for
mixed reality. Computers & Graphics.
58. Kán P, Kaufmann H (2012) High-quality reflections, refractions, and caustics in
augmented reality and their contribution to visual coherence. Mixed and
Augmented Reality (ISMAR), 2012 IEEE International Symposium : 99–108.
59. Arief I, McCallum S, Hardeberg J (2012) Realtime estimation of illumination
direction for augmented reality on mobile devices. Final Program and
Proceedings - IS and T/SID Color Imaging Conference : 111–116.
60. Lee S, Jung S (2011) Estimation of illuminants for plausible lighting in
augmented reality. Proceedings - 2011 International Symposium on Ubiquitous
Virtual Reality, ISUVR 2011 : 17–20.
61. Guan T, Duan L, Yu J, Chen Y, Zhang X (2011) Real-time camera pose
estimation for wide-area augmented reality applications. IEEE Computer
Graphics and Applications 31: 56–68.
62. Kilteni K, Normand JM, Sanchez-Vives MV, Slater M (2012) Extending body
space in immersive virtual reality: a very long arm illusion. PloS one 7: e40867.
63. Lu BV, Kakuta T, Kawakami R, Oishi T, Ikeuchi K (2010) Foreground and
shadow occlusion handling for outdoor augmented reality. Mixed and
Augmented Reality (ISMAR), 2010 9th IEEE International Symposium :
109–118.
64. Herdtweck C, Wallraven C (2013) Estimation of the horizon in photographed
outdoor scenes by human and machine. PloS one 8: e81462.
65. Kolivand H, Sunar M (2013) Covering photometric properties of outdoor
components with the effects of sky color in mixed reality. Multimedia Tools and
Applications : 1–20.
66. Kolivand H, Sunar M (2011) Real-time sky color with effect of sun's position.
International Journal of Scientific and Engineering Research 2: 2229–5518.
67. Kolivand H, Amirshakarami A, Sunar M (2011) Real-time projection shadow
with respect to sun’s position in virtual environments. International Journal of
Computer Science Issues 8: 80–84.
68. AEA (2003) Cie standard. URL http://cie.mogi.bme.hu.
69. Zhao F, Zeng M, Jiang B, Liu X (2013) Render synthetic fog into interior and
exterior photographs. In: Proceedings of the 12th ACM SIGGRAPH
International Conference on Virtual-Reality Continuum and Its Applications
in Industry. ACM, pp. 157–166.
70. Tadamura K, Qin X, Jiao G, Nakamae E (1999) Rendering optimal solar
shadows using plural sunlight depth buffers. In CGI 99: Proceedings of the
International Conference on Computer Graphics (Washington, DC, USA,
1999), IEEE Computer Society : 66.
71. Annen T, Mertens T, Bekaert P, Seidel HP, Kautz J (2007) Convolution shadow
maps. Proceedings of the 18th Eurographics conference on Rendering
Techniques : 51–60.
72. Dimitrov R (2007) Cascaded shadow maps. NVIDIA, Technical Report.
73. Lauritzen A, McCool M (2008) Layered variance shadow maps. In GI ’08:
Proceedings of Graphics Interface 2008 Toronto, Ontario, Canada), Canadian
Information Processing Society : 139–146.
74. Sunar M (2001) Sky colour modelling. Master Thesis.
75. Dobashi Y, Yamamoto T, Nishita T (2002) Interactive rendering of atmospheric
scattering effects using graphics hardware. In Proceedings of the ACM
SIGGRAPH/EUROGRAPHICS Conference on Graphics Hardware : 99–107.
76. Shao MZ, Badler NI (1993) A gathering and shooting progressive refinement
radiosity method.
... This research attempted to overcome the prior problem related to casting unreal and actual items augmented items, such as what is visible on actual items throughout the daytime. In addition, Kolivand et al. [35] utilized hybrid shadow maps (HSMs) to cast delicate shadows on other virtual and actual items. ...
Article
Full-text available
Realism rendering methods of outdoor augmented reality (AR) is an interesting topic. Realism items in outdoor AR need advanced impacts like shadows, sunshine, and relations between unreal items. A few realistic rendering approaches were built to overcome this issue. Several of these approaches are not dealt with real-time rendering. However, the issue remains an active research topic, especially in outdoor rendering. This paper introduces a new approach to accomplish reality real-time outdoor rendering by considering the relation between items in AR regarding shadows in any place during daylight. The proposed method includes three principal stages that cover various outdoor AR rendering challenges. First, real shadow recognition was generated considering the sun's location and the intensity of the shadow. The second step involves real shadow protection. Finally, we introduced a shadow production algorithm technique and shades through its impacts on unreal items in the AR. The selected approach's target is providing a fast shadow recognition technique without affecting the system's accuracy. It achieved an average accuracy of 95.1% and an area under the curve (AUC) of 92.5%. The outputs demonstrated that the proposed approach had enhanced the reality of outside AR rendering. The results of the proposed method outperformed other state-of-the-art rendering shadow techniques' outcomes.
... The properties of the 3D object such as dimensions, rotation, and position are set over the marker. With respect to these dimensions the marker will be transformed into 3D object to be projected [9][10] [11]. In the present work a game object of a unity represents like scenery, characters, etc. ...
Article
Full-text available
In the recent emerging technologies a specific medium is required to interpret the things in a different way. When any marker or surface is detected then the respective 3D game object will be displayed with augmented reality based applications. The Unity supports to create, store, and configure in development of such applications. The asset of an object is stored with the help of an object prefab in Unity. A user defined prefab is required to integrate an image for a user-defined marker. But, the marker cannot be used in representation of user-defined markers. In the present paper script language c# is used to detect user defined marker image. The script language also helps in projecting a 3D game object of the corresponding marker image with user defined specifications.
... Kolivand et al. [56,58] proposed a new and unique technique for realistically rendering the outdoor scenes in real time by taking into account the sky color with respect to the sun position that involves a shadow generating algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps. ...
Article
Full-text available
A realistically inserted virtual object in the real-time physical environment is a desirable feature in augmented reality (AR) applications and mixed reality (MR) in general. This problem is considered a vital research area in computer graphics, a field that is experiencing ongoing discovery. The algorithms and methods used to obtain dynamic and real-time illumination measurement, estimating, and rendering of augmented reality scenes are utilized in many applications to achieve a realistic perception by humans. We cannot deny the powerful impact of the continuous development of computer vision and machine learning techniques accompanied by the original computer graphics and image processing methods to provide a significant range of novel AR/MR techniques. These techniques include methods for light source acquisition through image-based lighting or sampling, registering and estimating the lighting conditions, and composition of global illumination. In this review, we discussed the pipeline stages with the details elaborated about the methods and techniques that contributed to the development of providing a photo-realistic rendering, visual coherence, and interactive real-time illumination results in AR/MR.
... However, rendering real-time and realistic environments can be challenging, as several parameters must be taken into account: light, shadow, differences of colours between the physical world and its representationcf. Kolivand and Sunar (2014) for an extended review of the computer graphics literature on rendering for outdoor AR. One approach is to voluntarily downscale both the representation of the physical world and the augmentations, e.g. by applying a set of stylisation techniques. ...
Chapter
Dental students must achieve an acceptable level of competence, since most procedures on teeth are irreversible and therefore learning these skills solely on patients is not acceptable. Simulation allows students to repeat procedures till they demonstrate required levels of skills, without putting actual patients at risk and yet acquiring procedural competence. In line with advances in technology, dental simulations are being developed to support the acquisition of necessary psychomotor skills before actual clinical applications. This chapter considers the use of Augmented Reality as one of the most sophisticated methods of simulation.
... shadows that are cast by virtual objects [5], [6] and overlooked the shadow interactions of real scenes that are cast onto the virtual objects, especially in handling live videos that were captured with freely moving cameras. Shadows are caused by object occlusions. ...
Article
In augmented reality, interactive shadow casting between virtual and real objects is crucial for the consistency of the appearance of integrated scenes. Since real shadow-casting objects may not be fully visible in live videos, simulating the shadows that they cast onto virtual objects based on 3D reconstruction will fail. To address this issue, we propose a framework for simulating shadow interactions in live outdoor videos based on shadow edges. The simulation framework consists of two components: real-time casting shadow detection from live videos with moving viewpoints and generation of shadow volumes. First, we propose a novel split-based shadow detection framework that accounts for moving viewpoints. Then, a visual-perception-guided adaptive sampling strategy is developed to generate high-quality shadow volumes. The experimental results demonstrate that our algorithm achieves visually consistent shadow interactions for live outdoor videos, which significantly enhances the photorealism of the generated scenes for AR applications.
... There has also been a significant amount of other research on reducing the mismatch between real and virtual objects [5,8,13], but none of the previously-proposed methods are appropriate for mobile outdoor AR environments requiring real-time interaction with virtual objects, due either to the need for additional devices or low performance. ...
Article
Full-text available
Rapid developments in augmented reality (AR) and related technologies have led to increasing interest in immersive content. AR environments are created by combining virtual 3D models with a real-world video background. It is important to merge these two worlds seamlessly if users are to enjoy AR applications, but, all too often, the illumination and shading of virtual objects is not consider the real world lighting condition or does not match that of nearby real objects. In addition, visual artifacts produced when blending real and virtual objects further limit realism. In this paper, we propose a harmonic rendering technique that minimizes the visual discrepancy between the real and virtual environments to maintain visual coherence in outdoor AR. To do this, we introduce a method of estimating and approximating the Sun’s position and the sunlight direction to estimate the real sunlight intensity, as this is the most significant illumination source in outdoor AR and it provides a more realistic lighting environment for such content, reducing the mismatch between real and virtual objects.
... Tracking from natural features is a complex problem and usually demands high computation power [10]. It is therefore difficult to use AR natural feature tracking in mobile device compared to personal computer (PC) platform because mobile devices have limited processing power, hardware and memory [11,12]. Hence, the selection of tracking algorithms needs to be given high attention in order to achieve optimum performance of AR in mobile platform. ...
Article
Full-text available
Mobile Augmented Reality (MAR) requires a descriptor that is robust to changes in viewing conditions in real time application. Many different descriptors had been proposed in the literature for example floating-point descriptors (SIFT and SURF) and binary descriptors (BRIEF, ORB, BRISK and FREAK). According to literature, floating-point descriptors are not suitable for real-time application because its operating speed does not satisfy real-time constraints. Binary descriptors have been developed with compact sizes and lower computation requirements. However, it is unclear which binary descriptors are more appropriate for MAR. Hence, a distinctive and efficient accuracy measurement of four state-of-the-art binary descriptors, namely, BRIEF, ORB, BRISK and FREAK were performed using the Mikolajczyk dataset and ALOI dataset to identify the most appropriate descriptor for MAR in terms of computation time and robustness to brightness, scale and rotation changes. The obtained results showed that FREAK is the most appropriate descriptor for MAR application as it able to produce an application that are efficient (shortest computation time) and robust towards scale, rotation and brightness changes.
Chapter
Over the last decade, mobile learning has become one of the most important trends in education. Leveraging on the technological capabilities of mobile devices, researchers and teachers have investigated the use of mobile devices in a variety of disciplines, including environmental education, in order to design innovative educational material such as augmented reality (AR) apps. Designing technologically enhanced learning activities for environmental education is particularly challenging because they often take place in informal settings and outdoors, for example through field trips or visits to parks. In this chapter, we discuss the potential of augmented reality for outdoor environmental education. More specially, we: (i) give a brief overview of learning theories that promote learning in context; (ii) describe a number of illustrative examples of augmented reality mobile apps for environmental education; (iii) and discuss the purpose and forms of digital augmentations in the context of environmental education.
Article
3D object manipulation is one of the most critical tasks for handheld mobile Augmented Reality (AR), and can contribute towards its practical potential, especially for real-world assembly support. In this context, the study of techniques used to manipulate 3D objects is an important research area. This study developed an improved device-based technique within handheld mobile AR interfaces to solve the large-range 3D object rotation problem, as well as issues related to position and orientation deviations in manipulating 3D objects. We first enhanced the existing device-based 3D object rotation technique (named HoldAR) with an innovative control structure (named TiltAR) that utilizes the handheld mobile device's tilting and skewing amplitudes to determine the rotation axes and directions of the 3D object. Whenever the device is tilted or skewed beyond the threshold amplitudes, the 3D object rotation starts and continues at a pre-defined angular speed per second, to prevent over-rotation of the handheld mobile device, a common occurrence when using the existing technique to perform large-range 3D object rotations. Over-rotation of the handheld mobile device needs to be prevented because it causes a 3D object registration error and a display issue in which the 3D object does not appear consistent within the user's range of view. Secondly, the existing device-based 3D object manipulation technique (named DI) was restructured by separating the degrees of freedom (DOF) of 3D object translation and rotation, to prevent deviations of the 3D object's position and orientation caused by integrating both DOF into the same control structure (HoldAR).
Next, an improved device-based manipulation technique (named DS), with better task completion times for 3D object manipulation within handheld mobile AR interfaces, was developed. A pilot test was carried out before the main tests to determine several pre-defined values used in the control structure of TiltAR. A series of 3D manipulation tasks was designed and developed to benchmark DS (the proposed manipulation technique) against DI (the existing technique) on task completion time (s). Sixteen participants aged 19-24 years were selected. Each participant completed twelve trials, for a total of 192 trials per experiment across all participants. Repeated-measures analysis was used to analyze the data. The results statistically confirmed that DS markedly outpaced DI, with significantly shorter task completion times in all tasks across different difficulty levels and rotation amounts. Based on these findings, an improved device-based 3D object manipulation technique has been successfully developed to address the insufficient functionality of the existing technique.
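The tilt-threshold idea described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the threshold and angular speed values below are assumptions, and real sensor input would come from the device's IMU.

```python
# Tilt-threshold rotation sketch: the 3D object only rotates once the
# device tilt amplitude exceeds a threshold, and then rotates at a fixed
# pre-defined angular speed, so the user never has to over-rotate the
# device itself.

TILT_THRESHOLD_DEG = 15.0   # assumed threshold amplitude
ANGULAR_SPEED_DEG = 90.0    # assumed pre-defined rotation speed (deg/s)

def rotation_step(tilt_deg, dt):
    """Rotation (degrees) to apply to the 3D object this frame.

    tilt_deg: signed device tilt amplitude about one axis.
    dt: frame time in seconds.
    """
    if abs(tilt_deg) < TILT_THRESHOLD_DEG:
        return 0.0                      # below threshold: object stays put
    direction = 1.0 if tilt_deg > 0 else -1.0
    return direction * ANGULAR_SPEED_DEG * dt

# A 30-degree tilt held for one 60 Hz frame rotates the object 1.5 degrees;
# a 5-degree tilt is below threshold and does nothing.
print(rotation_step(30.0, 1 / 60))      # -> 1.5
print(rotation_step(5.0, 1 / 60))       # -> 0.0
```

Holding the tilt keeps the rotation going at a constant rate, which is what prevents the registration and display problems caused by physically over-rotating the handset.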
Article
This paper presents a gathering and shooting progressive refinement radiosity method. Our method integrates the iterative process of light energy gathering used in the standard full matrix method and the iterative process of light energy shooting used in the conventional progressive refinement method. As usual, in each iteration, the algorithm first selects the patch that holds the maximum unprocessed light energy in the environment as the shooting patch. But before the shooting process is activated, a light energy gathering process takes place. In this gathering process, the amount of unprocessed light energy which is supposed to be shot to the current shooting patch from the rest of the environment in later iterations is pre-accumulated. In general, this extra amount of gathered light energy is far from trivial, since it comes from every patch in the environment from which the current shooting patch can be seen. However, with the reciprocity relationship for form factors, still only one hemi-cube of form factors is needed in each iteration step. Based on a concise record of the history of the unprocessed light energy distribution in the environment, a new progressive refinement algorithm with revised gathering and shooting procedures is then proposed. With little additional computation and memory usage compared to the conventional progressive refinement radiosity method, a solid convergence speedup is achieved. This gathering and shooting approach extends the capability of the radiosity method in the accurate and efficient simulation of the global illumination of complex environments.
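For context, the conventional shooting loop that the paper's gathering step augments looks like the following. This is a textbook sketch under made-up inputs, not the paper's algorithm: `E` is patch emission, `rho` reflectivity, `F[i][j]` the form factor from patch i to j, and `A` patch areas; the reciprocity relation F[j][i]·A[j] = F[i][j]·A[i] is used so only one row of form factors per iteration is needed.

```python
# Conventional progressive-refinement radiosity (shooting only): in each
# step, the patch with the most unshot energy distributes it to every
# other patch, scaled by form factor, reciprocity and reflectivity.

def progressive_radiosity(E, rho, F, A, steps=100):
    n = len(E)
    B = list(E)               # current radiosity estimate per patch
    dB = list(E)              # unshot radiosity per patch
    for _ in range(steps):
        i = max(range(n), key=lambda k: dB[k] * A[k])   # brightest shooter
        for j in range(n):
            if j == i:
                continue
            # reciprocity: F[j][i] = F[i][j] * A[i] / A[j]
            drad = rho[j] * dB[i] * F[i][j] * A[i] / A[j]
            B[j] += drad
            dB[j] += drad
        dB[i] = 0.0           # patch i's energy has been fully shot
    return B

# Toy two-patch scene: one emitter, 50% reflectivity, symmetric visibility.
B = progressive_radiosity([1.0, 0.0], [0.5, 0.5],
                          [[0.0, 0.5], [0.5, 0.0]], [1.0, 1.0])
print([round(b, 4) for b in B])    # -> [1.0667, 0.2667]
```

The paper's contribution sits on top of this loop: before patch i shoots, it pre-accumulates (gathers) the energy the rest of the scene would otherwise deliver to it over later iterations, which accelerates convergence.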
Conference Paper
The paper proposes a technique for estimating outdoor illumination conditions in terms of sun and sky radiances directly from pixel values of dynamic shadows detected in video sequences produced by a commercial stereo camera. The technique is applied to the rendering of virtual objects into the image stream to achieve realistic Augmented Reality where the shading and shadowing of virtual objects is consistent with the real scene. Other techniques require the presence of a known object, a light probe, in the scene for estimating illumination. The technique proposed here works in general scenes and does not require High Dynamic Range imagery. Experiments demonstrate that sun and sky radiances are estimated to within 7% of ground truth values.
Article
Augmented reality simulation aims to provide realistic blending between the real world and virtual objects. One of the important factors for realistic augmented reality is correct illumination simulation. Mobile augmented reality systems are among the best options for introducing augmented reality to the mass market due to their low production cost and ubiquity. In mobile augmented reality systems, the ability to correctly simulate, in real time, an illumination direction that matches the illumination direction of the real world is limited. Developing a mobile augmented reality system that can estimate illumination direction is challenging due to low computation power and a dynamically changing environment. In this paper, we describe a new method we have developed for real-time illumination direction estimation in mobile augmented reality systems, based on analysis of the shadow produced by a reference object that doubles as a 3D augmented reality marker. The implementation of the method could estimate the direction of a single strong light source in a controlled environment with a very good degree of accuracy, with angular error averaging lower than 0.038 radians. The current implementation achieved 2.1 FPS on a low-end Android mobile device, produced a proper estimate within 15 seconds using a uniform surface, and demonstrated scalability potential.
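The core geometry behind shadow-based light estimation is simple: once the tip of an upright reference object and the end of its cast shadow are known in 3D, the light direction is the vector from the shadow tip back to the object tip. The sketch below is a hedged illustration of that geometry only, not the paper's vision pipeline (which must first detect the shadow in the camera image).

```python
# Light direction from a reference object's shadow: with the marker tip
# and the shadow tip known on a ground plane (z = 0), the unit vector
# from shadow tip to marker tip points toward the light source.

import math

def light_direction(marker_tip, shadow_tip):
    """Unit vector pointing from the scene toward the light source."""
    dx = marker_tip[0] - shadow_tip[0]
    dy = marker_tip[1] - shadow_tip[1]
    dz = marker_tip[2] - shadow_tip[2]
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)

# Marker of height 1 at the origin, shadow ending 1 unit along +x:
# the light sits 45 degrees above the horizon, opposite the shadow.
d = light_direction((0.0, 0.0, 1.0), (1.0, 0.0, 0.0))
print(round(math.degrees(math.asin(d[2])), 1))   # -> 45.0
```

A longer shadow lowers the recovered elevation angle, which is why a uniform ground surface (where the shadow boundary is easy to segment) helps the estimate converge quickly.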
Article
The illustrated colors of well-known objects in Virtual Reality (VR) games exhibit characteristic differences across game categories. In the first part of our analysis we studied cartoon colors, with great emphasis on whether there are any cultural differences: the coloration of well-known objects depicted in cartoons originating from different parts of the world shows characteristic differences. We analyzed several soft-copy and hard-copy cartoons from all over the world and determined what colors the designers use for complexion, sky, water, soil, etc. We continued with the study of VR games' colors. We analyzed the colors of eight game categories and determined which colors the designers used for skin, sky, water, grass, etc. These colors were compared with the prototypical memory colors and the cartoon colors of these objects. The research quantifies these differences and provides advice on how VR games should be tinted if they are intended for a specific region of the world and a specific game category. In the second phase of the research, images from films that were the equivalents of VR games were analyzed. The coloration of these films was compared to the corresponding VR game colors and the memory colors of the displayed objects.
Article
In 1974 Catmull developed a new algorithm for rendering images of bivariate surface patches. This paper describes extensions of this algorithm in the areas of texture simulation and lighting models. The parametrization of a patch defines a coordinate system which is used as a key for mapping patterns onto the surface. The intensity of the pattern at each picture element is computed as a weighted average of regions of the pattern definition function. The shape and size of this weighting function are chosen using digital signal processing theory. The patch rendering algorithm allows accurate computation of the surface normal to the patch at each picture element, permitting the simulation of mirror reflections. The amount of light coming from a given direction is modeled in a similar manner to the texture mapping and then added to the intensity obtained from the texture mapping. Several examples of images synthesized using these new techniques are included.
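The weighted-average idea in this abstract (the ancestor of modern texture filtering) can be shown with a toy box filter. This is an illustrative sketch with a made-up texture and footprint, not the paper's weighting function, which is shaped using signal-processing theory rather than a flat box.

```python
# Weighted-average texture sampling: a pixel's intensity is an average
# over the region of the texture that maps into that pixel, here with
# uniform (box) weights for simplicity.

def box_sample(tex, u0, v0, u1, v1):
    """Average texture value over the texel rectangle [u0,u1) x [v0,v1)."""
    total, count = 0.0, 0
    for v in range(v0, v1):
        for u in range(u0, u1):
            total += tex[v][u]
            count += 1
    return total / count

# 4x4 texture: left half black (0.0), right half white (1.0).
tex = [
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
]
# A pixel whose footprint straddles the black/white boundary averages to 0.5,
# instead of snapping to 0 or 1 as nearest-texel sampling would.
print(box_sample(tex, 1, 0, 3, 4))   # -> 0.5
```

Averaging over the footprint rather than point-sampling a single texel is what suppresses the aliasing artifacts that motivated this line of work.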
Article
This paper presents an interactive vision-based real-time system for estimating light source positions and generating credible shadows for augmented reality. The implementation uses ARToolkit as a basis for geometric tracking, and a reflective sphere for tracking light sources in the environment. The paper seeks to generate perceptually credible shadows. User testing was conducted in order to determine the minimum criteria for the credibility of the shadows. User testing showed that 64 shadows are sufficient and indistinguishable from more complex compositions and a real image. It was clear that users need a reference object to distinguish between real and virtual shadows. The implementation and performance are tested using a consumer-grade web camera and a regular laptop computer. The implementation can perform in real time with 256 generated shadows.
Article
Augmented reality (AR) provides a seamless interface between the virtual and real worlds, and it has been applied to various domains, e.g., product design, manufacturing, maintenance and repair. In these AR systems, 3D graphics or other contents can be registered in the real environment to provide information to the users. Context-awareness can improve the usability of such AR systems by adapting the rendered information to the contexts, so that the provided information can be more useful to the users. However, many AR product maintenance systems lack an authoring system that requires little programming skill to create context-aware AR contents. In this paper, an authoring system, authoring for context-aware AR (ACAAR), which provides concepts and techniques to author AR contents for context-aware AR applications, is proposed. Using ACAAR, users can add and spatially arrange various contents, e.g., texts, images and computer-aided design (CAD) models, and specify the logical relationships between the AR contents and the maintenance contexts. In addition, a user study has been conducted to demonstrate the usability of the proposed system.
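The "logical relationships between AR contents and maintenance contexts" described above amount to a rule lookup. The following is a hypothetical sketch of that idea only; the rule format, context keys and content strings are all invented for illustration and are not ACAAR's actual data model.

```python
# Context-aware content selection: each AR content item carries a
# condition on the current maintenance context, and only items whose
# condition is fully satisfied are rendered.

rules = [
    ({"task": "inspect", "part": "pump"},  "overlay: pump checklist"),
    ({"task": "repair",  "part": "pump"},  "overlay: pump CAD model"),
    ({"task": "repair",  "part": "valve"}, "overlay: valve repair steps"),
]

def contents_for(context, rules):
    """Return every content item whose condition matches the context."""
    return [content for cond, content in rules
            if all(context.get(k) == v for k, v in cond.items())]

print(contents_for({"task": "repair", "part": "pump"}, rules))
# -> ['overlay: pump CAD model']
```

Authoring in such a system then reduces to editing the rule list (placing content spatially and attaching a condition), rather than writing application code.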