Vision, Modeling, and Visualization (2012)
M. Goesele, T. Grosch, B. Preim, H. Theisel, and K. Toennies (Eds.)
Single-Pass Rendering of Day and Night Sky Phenomena
Daniel Müller Juri Engel Jürgen Döllner
Hasso-Plattner-Institut, University of Potsdam, Germany
This paper presents astronomically based rendering of skies as seen from low altitudes on earth, with respect to location, date, and time. The technique allows composing an atmosphere with sun, multiple cloud layers, moon, bright stars, and Milky Way into a holistic sky with an unprecedented level of detail and diversity. GPU-generated, viewpoint-aligned billboards are used to render stars with approximated color, brightness, and scintillations. A similar approach is used to synthesize the moon, considering lunar phase, earthshine, shading, and lunar eclipses. Atmosphere and clouds are rendered using existing methods adapted to our needs. Rendering is done in a single pass supporting interactive day-night cycles with low performance impact, and allows for easy integration into existing rendering systems.
Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional
Graphics and Realism—
1. Introduction
Real-time image synthesis targeting photo-realistic representations of arbitrary virtual 3D environments is steadily becoming more important. Improving hardware capabilities allow rendering systems to simulate a growing number of visual subtleties of reality. Applications such as video games and simulators (e.g., flight and vehicle simulators), as well as architectural and historical visualizations demanding spatiotemporal correctness, often benefit from appropriate environment rendering and seamless transitions of day and night skies. Appropriateness begins with an adequate impression of a static blue sky; it ends with immersive, dynamic simulation of environments, aware of date, time, location, lighting, weather, ambient noise, and mood, that are coherent with our daily, visual experience.
There are various approaches to real-time atmosphere and cloud rendering. However, there is a gap concerning seamless day-night transitions and astronomically accurate night sky approximations. In this work, we focus on night sky features and their composition with existing techniques into dynamic, holistic day-night cycles. We propose two techniques for efficient synthesis of the following phenomena:
Dynamic moon rendering featuring an astrophysical 3D moon simulated on a viewpoint-oriented quad, lit and shaded based on celestial positions. For lunar eclipses, precomputed brightness and color are used.
Rendering of thousands of individually configurable (e.g., brightness and color), twinkling (scintillating) stars using point-light rendering. For correct background illumination, faint stars that indicate the Milky Way are added by a textured cube.
The proposed single-pass techniques can be easily integrated into multi-pass rendering techniques and are well suited for arbitrary post-processing. This provides an accurate, dependence-free basis with minimal performance impact. Astronomical accuracy in position is verified numerically; correctness in color, however, depends on a multitude of influences such as varying resolution, color calibration of physical output, and subjective agreement that strongly relies on individual expectations and experience. Accuracy tradeoffs in favor of performance are justified for use cases in, e.g., education, training environments, and entertainment.
The Eurographics Association 2012.
1.1. Related Work
A framework for full day-night cycles is described by Jensen et al. [JDD01], though no detailed information on their tone-mapping implementation is available. Most aspects of day and night sky rendering have been examined in isolation. A collection of astronomical algorithms is given by Meeus [Mee94].
Rendering of stars has gained little attention so far, and is mostly done with static noise textures [RP05] or high-resolution star textures with real positions and colors [JDD01]. Nadeau et al. [NGN00] suggested Gaussian spots attenuated with distance to overcome drawbacks of texture-based methods, such as rendering artifacts on camera movement due to sampling. Magnor et al. [MSK10] note that no solutions covering outscattering and scintillations of stars are available.
The moon is commonly modeled as a separate or aggregated static texture with predefined phase, position, color, and intensity. Considerably more detail appears in the work of Jensen et al. [JDD01], who feature photo-realistic renderings with correct size, positioning, orientation, and shading based on lunar surface scattering. Oberschelp et al. [OHS01] presented a technique for rendering solar eclipses in commercial 3D animation software. Yapo and Cutler [YC09] recently suggested photon tracing for physically approximated coloring during lunar eclipses. Both approaches, however, are not feasible for real-time systems.
Leaving aerial perspective to post-processing, most approaches for atmospheric scattering satisfy our single-pass constraint. Bruneton and Neyret [BN08] presented a sophisticated algorithm that accounts for various phenomena including the earth shadow. For cloud rendering, 2D noise approaches [RP05, HKA05] that suggest semi-3D cloud coverage through scattering approximations [Dub05] seem adequate and are utilized.
1.2. Overview
The remainder of this paper is structured as follows: The next two sections introduce a new model for moon and lunar eclipse rendering. Section 4 explains rendering of stars efficiently utilizing the GPU. In Section 5, sequential blending of the discussed phenomena is briefly shown, and modifications made to atmosphere and cloud rendering are indicated. Section 6 presents and discusses our results including a performance evaluation. Finally, Section 7 summarizes the presented techniques and proposes future work.
Figure 1: Various photos of the moon in the top row and our renderings in the bottom row, at corresponding day, time, and location. From left to right: nearly full moon, waxing gibbous at night, and nearly new moon with earthshine.
2. Rendering the Moon
The moon is modeled entirely on the GPU as a viewpoint-oriented billboard: a quad, projected onto a unit-sphere's tangent plane, oriented with respect to the world's up direction. On this quad we simulate the virtual moon-sphere. Figure 1 lists some close-ups of the moon, as seen under clear conditions. The unit-sphere is centered on the topocentric observer and projection is done towards the moon's position. The moon-sphere is extrapolated from intersections between the view ray and the quad. Texture coordinates u and v assigned to the quad yield this quad intersection in the fragment stage. With that, the sphere's selenocentric z-coordinate is given by z^2 = 1 − u^2 − v^2. Each virtual moon-sphere fragment is simultaneously defined in relative position and orientation by the normal n_m = (u, v, z). Fragments outside the sphere are discarded. Based on the actual moon distance, the billboard is appropriately scaled and oriented for the topocentric observer. The basic moon model is annotated in Figure 2.
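The per-fragment extrapolation above can be sketched as follows; moon_sphere_normal is a hypothetical helper name (in the paper this logic runs in the fragment stage on the GPU), and the quad texture coordinates are assumed to span [−1, 1]:

```python
import math

def moon_sphere_normal(u, v):
    """Extrapolate the virtual moon-sphere normal n_m = (u, v, z) from
    quad texture coordinates u, v in [-1, 1]. Returns None for
    fragments outside the moon disk, which the shader would discard."""
    zz = 1.0 - u * u - v * v         # z^2 = 1 - u^2 - v^2
    if zz < 0.0:
        return None                  # outside the virtual moon-sphere
    return (u, v, math.sqrt(zz))     # selenocentric surface normal
```

The center fragment (0, 0) yields the normal (0, 0, 1) pointing at the observer, while corners of the quad fall outside the unit disk and are rejected.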
2.1. Apparent Position, Size, and Orientation
For the apparent position and projection direction m of the moon, its position in ecliptic coordinates is retrieved first, then converted to refraction-corrected, apparent horizontal coordinates [Mee94]. The refraction of air affects the true altitude and accounts for a displacement towards the local zenith. At sea level, refraction accounts for a displacement of about 36 arc minutes near the horizon, which corresponds to the apparent moon size itself. Finally, horizontal azimuth and altitude are converted to normalized Euclidean coordinates and passed to the GPU.
Figure 2: The virtual moon-sphere. Note that the same s is used for observer and moon. u and v span the tangent plane used for retrieval of the moon's surface normal n_m.
Refraction would actually need to be applied per fragment to capture the moon's deviation from a circular shape near or below the horizon. In favor of a simpler model, we exclusively apply refraction to the moon position. However, shading and lunar eclipses are correct only without refraction applied.
2.1.1. Distance and Apparent Size
The earth-moon distance is obtained from center to center and varies between lunar perigee and apogee of roughly 363,300 km and 405,500 km. The topocentric observer position and true earth radius at that point are not considered, leading to a maximum error in angular diameter of about 1%. Knowing the distance, the apparent angular moon diameter δ, describing its visible size on earth, can be approximated with basic trigonometry and varies between 0.49 and 0.56 degrees. The virtual moon-sphere diameter σ_m (billboard side length) is expressed as:
σ_m(δ) = 2 tan(c_d δ/2), (1)
with c_d as an artificial scaling factor. Using the correct apparent size with standard fields of view (FoV) yields a subjectively small moon, making artificial resizing necessary. Values of c_d between 2 and 3 lead to less irritating moon sizes.
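Equation (1) can be sketched as a small helper; the default c_d = 2.5 is an assumed value within the 2-3 range recommended above:

```python
import math

def billboard_side_length(delta_deg, cd=2.5):
    """Equation (1): sigma_m(delta) = 2 tan(c_d * delta / 2), with
    delta as the apparent angular moon diameter (taken here in
    degrees) and c_d as the artificial scaling factor."""
    delta = math.radians(delta_deg)
    return 2.0 * math.tan(cd * delta / 2.0)
```

For small angles the side length is roughly c_d times the angular diameter in radians, so the unscaled moon (c_d = 1) at δ ≈ 0.52° occupies only about 0.009 units on the unit-sphere tangent plane.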
2.1.2. Orientation
The moon is in synchronous rotation with the earth, always revealing the same hemisphere to earthly observers. However, for correct orientation we still have to account for oscillating motions – known as librations – and the observer-correlated parallactic angle.
Our model neglects diurnal and physical librations as unobservably small. The remaining two optical librations, however, amount to an additional visual surface disclosure of 9% over time. They are approximated in selenographic coordinates, referring to the mean center of the moon's apparent disk [Mee94]. Libration in latitude b is the angle between the prime meridian of the apparent lunar disk and its rotation axis, alternately revealing the north and south pole. Libration in longitude l is the lateral rotation around the perpendicular axis of the lunar ecliptic. The selenocentric orientation matrix R_m for the moon is composed of rotations by these angles. R_x, R_y, and R_z denote counter-clockwise rotation matrices around the principal axes, a is the position angle of axis, and p the parallactic angle. Together, a and p describe the orientation around the observer-moon axis, required to account for the observer's topocentric position on a rotating earth.
2.2. Coloring and Shading of the Moon
The illumination of the moon-sphere mainly depends on perspective changes in the position of the moon terminator (day-night boundary) and is given by the illuminated fraction of its perceivable disk. Usually, to obtain this fraction, one needs to calculate the lunar phase angle ϕ, which is the selenocentric elongation of the earth from the sun. Given that, the position angle of the bright limb can be calculated.
Instead, the lunar phase angle ϕ is simply derived from the horizontal moon and sun positions. Using ϕ as the angle between incident and reflected light on the moon surface yields correct illumination and with that the correct illuminated fraction [JDD01]. Simplifying the sun position s as a directional light source, seen from earth instead of the moon, introduces an indiscernible maximum error in lunar phase angle of about 9 arc minutes. Shading, albedo, and earthshine are approximated as functions of ϕ. The final surface color I_m is defined by reflected sun and earth light:
I_m(ϕ) = k_λ a F(θ_i, θ_r, ϕ) + β_e E_em(ϕ), (3)
with a as surface albedo, β_e E_em introducing earthshine, and F the Hapke-Lommel-Seeliger BRDF that approximates real moon reflectance [Hap66]. θ_r is the angle between n_m and the reflected light, θ_i between n_m and the incident light. By multiplying moon-sphere normals with the billboard's orthonormal matrix, they are transformed into the required horizontal space. k_λ is used to adjust final color and intensity.
2.2.1. Earthshine
The faint, bluish light on the dark fraction of the moon, known as earthshine, is caused by reflections of light emitted by the earth. Its intensity E_em depends on the phase angle and is most intense during new moon. It is approximated by:
E_em(ϕ) = −0.0061ϕ^3 + 0.0289ϕ^2 − 0.0105 sin(ϕ), (4)
introducing a maximum difference of about 3% to the formula suggested by van de Hulst [vdH80]. For the earthshine color coefficient, β_e = (0.88, 0.96, 1.00) is used.
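Equation (4) can be sketched directly; note that the signs of the polynomial terms are a reconstruction (the extracted text lost them), chosen so that the intensity peaks near ϕ ≈ π, i.e., at new moon, as the text states:

```python
import math

def earthshine_intensity(phi):
    """Equation (4), signs reconstructed: earthshine intensity E_em
    over the lunar phase angle phi in radians (phi ~ 0 at full moon,
    phi ~ pi at new moon, where earthshine is strongest)."""
    return -0.0061 * phi ** 3 + 0.0289 * phi ** 2 - 0.0105 * math.sin(phi)

# Slightly bluish earthshine color coefficient beta_e from the text.
EARTHSHINE_COLOR = (0.88, 0.96, 1.00)
```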
Figure 3: Renderings of the June 15th 2011 total eclipse, as seen from Mangalore, India, using our technique. The images
correlate to universal time in 15 minute intervals, beginning at 18:00 UTC on the left.
2.2.2. Albedo and Surface Normals
Albedo values and surface normals are encoded in a four-channel cubic environment map. The albedo is based on Clementine data, which only partially represents the true albedo. The surface was photographed with the sun always near the camera's longitude, causing static shadows towards the poles. Aggregated imagery taken from earth at full moon would be better suited, but we are currently unaware of any such resource. If librations were not taken into account, a photograph of the full moon would be satisfactory.
The surface normals are based on stereo images taken by the LRO-WAC camera. Slight surface irregularities are important inside the day-night boundary, but even there, they are hard to perceive at the correct apparent size. Nevertheless, they are linearly blended with the moon-sphere normals, depending on the desired intensity. We use a texture resolution of 256^2 pixels per cube face to support generous fields of view and close-ups.
The dusty moon surface has a slight reddish hue that is simulated by fitting the average lunar albedo to a measured reflectance spectrum, as shown by Yapo and Cutler [YC09]. Using their second-degree polynomial, color channel coefficients k_λ = (0.92, 0.79, 0.64), based on representative wavelengths (680, 550, 440) nm, are obtained [REK04]. As for stars, atmospheric outscattering is applied too (Sec. 4.3.1).
3. Rendering Lunar Eclipses
We propose a new algorithm for realistic rendering of lunar eclipses in real-time, aiming for photometric resemblance. The magnitude, describing the moon's penetration depth into the earth's umbra or penumbra during an eclipse, is easily retrieved within our model. Color and brightness are expressed as two separate multipliers, each correlated to the magnitude, offering control similar to exposure time in photography. Figure 3 shows a half cycle using our technique.
3.1. Modeling Lunar Eclipses
The earth, illuminated by the sun, casts a penumbra and within it an umbra, as shown in Figure 4. Their diameters ε_u and ε_p are measured in moon radii and can either be simplified to be constant – by assuming average moon and sun distances, ε_u and ε_p can be approximated with 2.65 and 4.65, respectively – or calculated based on the actual moon and sun distances d_m and d_s, as well as constant radii:
ε_u = 3.6676 − 397.0001 d_m d_s^−1, (5)
ε_p = 3.6676 + 404.3354 d_m d_s^−1. (6)
The horizontal earth-moon system is assumed to be normalized, yielding a moon distance of 1 (Fig. 4). For each visible moon fragment we retrieve the normalized distance between the related moon-sun line g and the earth center, again simplifying the sun as a directional light. The position of a moon-sphere fragment a_f is given by a_f = m − ε_0 n_m, with ε_0 as the actual moon radius to moon distance ratio. Using the moon-sun line in the parametric form g: x ↦ a_f + x s, the shortest distance D_f in moon radii between a_f and g, henceforth referred to as eclipse phase, is given by ε_0^−1 ‖a_f × s‖.
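The geometry above can be sketched as follows; the d_m/d_s ratio in Equations (5)/(6) is a reconstruction of a garbled exponent, and the point-to-line distance uses the standard cross-product form, assuming a normalized direction s:

```python
import math

def umbra_penumbra_radii(dm, ds):
    """Equations (5)/(6): umbra and penumbra diameters in moon radii
    from the actual moon and sun distances d_m and d_s (same unit)."""
    eu = 3.6676 - 397.0001 * dm / ds
    ep = 3.6676 + 404.3354 * dm / ds
    return eu, ep

def eclipse_phase(af, s):
    """Shortest distance between the earth center (origin) and the
    moon-sun line g: x -> a_f + x*s, via the cross product |a_f x s|
    for a normalized direction s. Dividing by the moon-radius ratio
    epsilon_0 would express it in moon radii."""
    cx = af[1] * s[2] - af[2] * s[1]
    cy = af[2] * s[0] - af[0] * s[2]
    cz = af[0] * s[1] - af[1] * s[0]
    return math.sqrt(cx * cx + cy * cy + cz * cz)
```

With mean distances (d_m ≈ 384,400 km, d_s ≈ 1.496·10^8 km) the umbra diameter comes out near the constant approximation of 2.65 moon radii quoted in the text.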
3.2. Coloring of Lunar Eclipses
Figure 4: Lunar eclipse model showing the moon at m, the eclipse phases D_m and D_f between the moon-sun line (dashed, in s direction), the earth in the center, and the actual umbra and penumbra radii ε_u and ε_p.
Figure 5: Left: multiplier h for lunar eclipses per color channel (h_R, h_G, h_B), shown on a logarithmic scale. Right: brightness i on a linear scale between 1 and c_i (we used c_i = 30). Correlated to the t-axis, lunar eclipse phases are indicated.
Figure 6: Fragment colors over t_f applied at various eclipse phases t_m. r_m roughly indicates the moon radius for a better sense of scale. Bottom row, t_m = 0: colors increased tenfold.
Color is modeled as a function h of the transformed eclipse phase t_f. Since ε_u and ε_p vary, we assign them fixed ranges for easier handling: 0 ≤ t_f ≤ 1/2 for umbral distances and 1/2 ≤ t_f ≤ 1 for penumbral distances. The fragment eclipse phase D_f is linearly transformed into these two ranges. h yields a multiplier per color channel used to modify the resulting moon fragment color as shown in Figure 5. It accounts for the following four distinctive phenomena:
1. A barely noticeable penumbra, indicated by a soft darkening of the moon towards the umbra.
2. A strongly noticeable, dark umbra, with a soft edge.
3. Strong reddish-orange hue in the umbra, becoming darker towards its center. This is due to scattering in the earth's atmosphere, where shorter wavelengths are more likely to be outscattered than longer ones. Thus, red light is bent towards the umbra center much more strongly than blue light.
4. A soft bluish rim at the umbra edge, which reflects the remaining, less scattered light.
We specify h by a sum of various functions [Mül12], each fitting one phenomenon to various photographs. Since h can become very complex, it is precomputed into a one-dimensional map for look-ups per fragment. One could also specify this texture by other means, like physical simulations, to meet individual appearance notions and requirements.
3.2.1. Brightness during a Lunar Eclipse
Equivalent to h, i is a function of t_m describing the brightness during the eclipse cycle. Instead of a specific fragment's eclipse phase D_f, it correlates to the eclipse phase D_m of the moon itself. A reasonable i is shown in Figure 5. Finally, we obtain each fragment color C_f by C_f(t_f) = i(t_m) h(t_f) c_f, with c_f as the initial fragment color (Figure 6).
4. Rendering Stars
Rendering stars with random distribution in position and intensity is not plausible, because humans are strongly accustomed to the earthly night sky with its typical constellations. Using photographs or precomputed textures with correct star placement is also insufficient, since stars might appear either bulky and blurred, or small and wobbling during camera movement. Utilizing point sprites is not applicable either: they are unaffected by camera distortion, which leads to star clustering in the center when using larger FoVs.
Similar to rendering the moon, point light sources are rendered with viewpoint-aligned billboards that adapt to the actual output resolution and are subject to camera distortion. The technique also scales well with the increasing number of pixels per inch of modern displays. We obtain the actual positions of stars and planets and apply an individually adjusted point spread function for intensity (intensity PSF) [Mai09]. A simple glare (glare PSF) is added to overcome physical intensity constraints and provide a larger range of apparent brightness, as common in HDRR. Star color and intensity are approximated based on their temperature and distance. This approach is also applicable to solar planets and satellites and can be adapted for observers within our solar system.
4.1. Modeling Stars
At most, 9,500 bright stars and star clusters are perceivable by the naked eye under optimal conditions. These are modeled as points with precomputed position, color, and intensity based on data provided in the Yale Bright Star Catalogue [HW95]. They are passed to the GPU and rotated appropriately to date, time, and location in the vertex stage. Here, outscattering (extinction caused outside the atmosphere is ignored) and scintillation, which affect color and intensity, are applied. One billboard per star is then created and scaled in the geometry stage, and finally rendered overlaying the PSFs.
4.1.1. Positioning Stars
For each star or star cluster one vertex is passed to the GPU. Their equatorial right ascension α and declination δ for the J2000.0 equinox are transformed to Euclidean coordinates p = (x, y, z). To avoid updating all vertex positions on every change in date, time, or location, all vertices are passed only once to the GPU. A single rotation matrix R_s [JDD01] is required per change, to apply precession P and convert the equatorial coordinates to horizontal ones. R_s is given by:
R_s = R_y(φ − π/2) R_z(LMST) P, (7)
P = R_z(0.01118 T) R_y(−0.00972 T) R_z(0.01118 T), (8)
with T as time in Julian centuries, φ the observer latitude, and LMST the approximated local mean sidereal time. The final vertex position adjusted in the vertex stage is p R_s. Unfortunately, this approach does not allow for individual annual proper motions.
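Equations (7)/(8) can be sketched with plain rotation matrices. The latitude symbol and the sign of the middle precession angle are assumptions here, as both are garbled in the extracted text:

```python
import math

def Ry(a):
    """Counter-clockwise rotation matrix around the y axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def Rz(a):
    """Counter-clockwise rotation matrix around the z axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def star_rotation(lat, lmst, T):
    """Equations (7)/(8): R_s = Ry(lat - pi/2) Rz(LMST) P, with the
    precession matrix P and T in Julian centuries."""
    P = matmul(matmul(Rz(0.01118 * T), Ry(-0.00972 * T)), Rz(0.01118 * T))
    return matmul(matmul(Ry(lat - math.pi / 2.0), Rz(lmst)), P)
```

At the north pole (lat = π/2) with LMST = 0 and T = 0 the matrix degenerates to the identity, since equatorial and horizontal frames coincide there.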
4.2. Apparent Magnitude
The brightness of stars is measured on a logarithmic scale in apparent magnitude m: a star with 1 mag is about 100 times brighter than one with 6 mag. This leads to a relative brightness of 2.512^(m2 − m1) between two stars with apparent magnitudes m1 and m2. m(m) = 2.512^(m_a − m) is used for the individual star brightness, with m_a as a control magnitude, e.g., representing the observer's current brightness sensitivity. Usually, brighter stars also appear larger. However, even the faintest and smallest stars become too bulky on today's screens. We chose not to enlarge the intensity PSF, but use an additional glare PSF on an enlarged billboard to provide a visual cue of higher intensity. This might be incorrect in photometric terms, but requires neither extra tone mapping nor extra blur passes for glare. Textured glare [SSZ95] per star or dynamic temporal glare in a post-processing step [RIF09] could be used to account for star streaks. With a minimum billboard radius of about 2 pixels, flickering due to aliasing is eliminated. In screen space, this radius q is obtained from the vertical field of view γ_v and the vertical output resolution res_v in pixels:
q = 2 · 2 tan(γ_v/2) / res_v. (9)
As intensity PSF, T(l) = 2l^3 − 3l^2 + 1 is used, with l ∈ [0; 1] as the normalized distance to the billboard center. The brightness m needs to directly correlate to the PSF intensity, which is achieved by scaling with I_T, with c_q ≈ 4 × 10^−7, calibrated by comparisons with photos for m_a = 4. Through q, I_T inversely correlates to γ_v, so that with decreasing FoV, fainter stars become more and more visible. Thinking of a PSF as a solid of revolution, its volume V can be interpreted as intensity. Disc integration yields a volume of V_T = 1.167 for T. The calibrated intensity I_T T is 1 for m = m_a.
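The magnitude-to-brightness mapping above can be sketched as:

```python
def relative_brightness(m, ma=4.0):
    """m(m) = 2.512**(m_a - m): brightness relative to the control
    magnitude m_a; a difference of five magnitudes corresponds to a
    factor of roughly 100."""
    return 2.512 ** (ma - m)
```

A star at the control magnitude maps to 1, and one five magnitudes brighter maps to roughly 100, matching the logarithmic magnitude scale described above.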
For m < m_a, a star is lightened with glare of intensity I_G = m(m + (V_T − 1)) − 1. The glare PSF is arbitrarily defined as G(l) = 64^−l, and the billboard radius by k = max(q, c_k I_G), with a resolution-adjusting coefficient c_k. Note that when using glares, T needs to be scaled by k q^−1.
4.3. Star Color
To procure star colors, measured B–V values based on the Johnson-Morgan system [JM53] need to be converted into RGB space. Given a B–V value in mag, a star's temperature can be approximated [JDD01, Ols98]. Chromaticity coordinates can be obtained by mapping the temperature to the Planckian locus by means of a polynomial fit [KCKH03]. These values are mapped to the CIE tristimulus, and finally converted into sRGB space [Ols98, Kry85].
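The first step of that pipeline, B–V to temperature, can be sketched as follows. The paper follows [JDD01, Ols98], whose fit is not reproduced in the text; shown here as a stand-in is the well-known Ballesteros approximation:

```python
def bv_to_temperature(bv):
    """Approximate a star's effective temperature in Kelvin from its
    B-V color index (Ballesteros approximation, used here as a
    stand-in for the fit referenced in the paper)."""
    return 4600.0 * (1.0 / (0.92 * bv + 1.7) + 1.0 / (0.92 * bv + 0.62))
```

The result would then be mapped via the Planckian locus to CIE chromaticities and finally to sRGB, as described above.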
4.3.1. Scattering
Light reaching an earthly observer at night is affected by atmospheric scattering (though with unnoticeable inscattering). Because of that, we approximate the optical path length Φ of perceived light rays traveled through a simplified atmosphere [Buc95], relative to the length in zenith direction t_e. Given the mean earth radius r_e = 6371 km, the observer altitude h in km, and the light ray's angle to zenith θ, Φ can be expressed by the law of sines:
Φ(θ) = (r_e + t_e) sin(θ − arcsin(((r_e + h)/(r_e + t_e)) sin θ)) / sin θ, (11)
O(θ) = c_r (1 + β_r) (t_e − h)^−1 Φ(θ). (12)
Color and intensity are attenuated by O, with c_r ≈ 6 and β_r as the scattering coefficient for air molecules given by Rayleigh. In our model, it describes the wavelength dependency, and we use β_r = (0.16, 0.37, 0.91) for h = 0 [Buc95]. For t_e, values of about 8 km for Rayleigh scattering are common [NSTN93, PSS99, BN08]. If observer altitude were ignored and the true horizon fixed at θ = π/2, simpler approximations [PSS99, YC09] would be sufficient.
Figure 7: Night sky with Orion's Belt on the right and the moon on the left. Note the faint Milky Way and the moon (enlarged) with earthshine, masking the background.
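The path length through the spherical atmosphere shell can be sketched as follows; instead of the law-of-sines form, the equivalent ray/sphere-intersection closed form is used here, and the 8 km shell thickness is the common Rayleigh value quoted above:

```python
import math

RE = 6371.0  # mean earth radius in km
TE = 8.0     # effective atmosphere thickness in km for Rayleigh scattering

def optical_path(theta, h=0.0):
    """Length of a view ray from altitude h at zenith angle theta
    through a spherical atmosphere shell of thickness TE, via the
    ray/sphere intersection (equivalent to the law-of-sines form)."""
    r0 = RE + h
    R = RE + TE
    return (-r0 * math.cos(theta)
            + math.sqrt(r0 ** 2 * math.cos(theta) ** 2 + R ** 2 - r0 ** 2))

def relative_path(theta, h=0.0):
    """Phi relative to the zenith path length (t_e - h)."""
    return optical_path(theta, h) / (TE - h)
```

At the zenith the path equals the shell thickness; towards the true horizon the relative path grows to roughly 40 for an 8 km shell, which is what drives the strong reddening and dimming of stars near the horizon.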
4.3.2. Scintillations
Scintillations are fluctuations on a time scale of milliseconds, causing variations in the color and brightness of stars and other perceivable objects outside the atmosphere. They are caused by atmospheric turbulence and thus density and refraction differences. Although the appearance of planets and the moon and sun surfaces are affected as well, we consider point lights only.
Similar to scattering, scintillations strongly increase towards the horizon, so we rely on the optical path length ratio Φ again. For each star we generate a random scintillation basis n ∈ [0; 1] per frame, to simulate fluctuations over time on the smallest available time scale. To distribute the number of simultaneous twinkles, n gets nonlinearly remapped to [0; 1] using an arbitrarily chosen N = 0.02/n. Lastly, each star's brightness is attenuated by the scintillation intensity S = c_s β_r Φ N, where c_s is used for adjustment (e.g., c_s ≈ 20).
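A minimal sketch of this attenuation, assuming a single representative Rayleigh coefficient and clamping the remapped basis N into [0, 1] as described above:

```python
import random

def scintillation_intensity(phi, beta_r=0.37, cs=20.0, rng=random):
    """S = c_s * beta_r * Phi * N with N = 0.02 / n, where n is a
    uniform random scintillation basis drawn per frame; the clamp
    keeps N in [0, 1] so that strong twinkles stay rare. phi is the
    relative optical path length (larger towards the horizon)."""
    n = max(rng.random(), 1e-9)   # random scintillation basis per frame
    N = min(1.0, 0.02 / n)        # nonlinear remapping into [0, 1]
    return cs * beta_r * phi * N
```

Since S scales linearly with Φ, stars near the horizon twinkle far more strongly than stars near the zenith, matching the observed behavior.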
Figure 8: Exemplary illustration of the composition sequence. Starting at the left: star map, bright stars, the moon, atmosphere with sun, and various cloud layers.
4.4. Star Map
On clear nights with little artificial light, the faint background of our galaxy can be sensed (Fig. 7). A screen-aligned quad is used to render a textured cube, showing a generated Tycho Catalog Skymap (star map) in equatorial coordinates [BW09]. The star map resolution should be rather low, just indicating the Milky Way instead of further individual stars. Especially for high resolutions, bright stars should be removed, though, to avoid star doubling caused by positional differences to the individually rendered stars. Correct orientation by means of Equation 7 is applied to the texture lookup. Star map brightness is scaled by c_s m(m) q^−1, with c_s depending on output resolution and initial texture intensity. It should be adjusted so that the Milky Way is hinted at with the default γ_v at m_a = 4. As for bright stars, scattering is applied, and brightness is again FoV adaptive. Scintillations, however, are inapt.
5. Composition
In order to render within a single pass, all components have to be rendered in the correct sequence, based on their correlated phenomenon distance. Rendering starts with the opaque star map or, alternatively, a black background. Bright stars are blended into this, utilizing per-pixel intensities in the alpha channel. The moon is opaque again, overlapping stars even when unlit by the sun, as seen in Figure 7. Finally, the atmosphere is blended on top of these layers, so all stars and the moon are influenced by its color. Cloud layers can be overlaid afterwards, using appropriate blend modes. The individual layers are illustrated in Figure 8.
Intensity differences over several magnitudes within the day-night cycle lead to zero visibility of stars and a strongly attenuated moon at daytime. Scaling star intensities by (1 + (s_z + 1.14)^32)^−1/2, where s_z is the sun's normalized, Euclidean altitude, leads to a smooth transition between full visibility just before, and zero visibility just after, sunrise (and vice versa for sunset). Likewise, for the moon intensity, a factor of 0.5 + (2 + 2(s_z + 1.05)^32)^−1/2 is used.
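The two blending factors can be sketched as follows; the exponents and offsets are reconstructions of the garbled extracted formulas, chosen to reproduce the described behavior:

```python
def star_visibility(sz):
    """Reconstructed star blending factor (1 + (s_z + 1.14)**32)**-0.5
    over the sun's normalized altitude s_z: close to 1 below the
    horizon, dropping towards 0 shortly after sunrise."""
    return (1.0 + (sz + 1.14) ** 32) ** -0.5

def moon_visibility(sz):
    """Reconstructed moon factor 0.5 + (2 + 2*(s_z + 1.05)**32)**-0.5,
    attenuating but never fully hiding the moon at daytime."""
    return 0.5 + (2.0 + 2.0 * (sz + 1.05) ** 32) ** -0.5
```

The large exponent makes both factors essentially constant except in a narrow band around sunrise and sunset, which is what produces the smooth yet rapid day-night transition.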
The atmosphere model suggested by Bruneton and Neyret [BN08] features a good match to the CIE Standard General Sky, accounts for earth shadowing, and supports single as well as multiple scattering. Note that the precomputed textures for rendering the atmosphere can be used to replace the scattering approximations introduced for moon and star rendering.
An array of precomputed 2D Perlin noise maps, projected onto spherical caps, forms the basis for multiple, dynamic cloud layers. Naive scattering [Mül12], inspired by Dubé [Dub05], for lower cloud layers causes a 3D-ish appearance which increases the sky's overall credibility.
6. Results
A composition with clouds and moon at day is shown in Figure 9; a typical night shot in Figure 7. Even though our results are very compelling considering the low performance impact, a few remarks remain:
At night, a slight bluish tonemapping due to scotopic
viewing should be applied.
The Moon often feels lost at night. An additional glare,
with its intensity linked to the moon’s intensity (including
variations due to phase and lunar eclipses), fixes this.
Finally, a rendering of a specific lunar eclipse is compared
to a result synthesized by Yapo and Cutler [YC09], and two
reference photos, in Figure 10.
6.1. Performance
Component                    # Vertices    Time in µs
Star Map, m_a = 6.0          4             228
Bright Stars, m_a = 6.0      4 × 9129      56
Moon, c_d = 3.0              4             12
Moon, c_d = 100.0            4             131
Atmosphere                   4             581
Atmosphere with dithering    4             728
Table 1: Listed are the average time differences per frame to an empty scene, measured over a minute. Each component is drawn to a screen-aligned or viewpoint-oriented quad. The difference between star map and bright stars is due to discarded stars in the geometry stage. Furthermore, the applied glare function strongly influences the performance, since it specifies the amount of fragments to be processed per star. System: Intel Core2 Duo E8400 at 3.0 GHz, 8.0 GB memory, NVidia GeForce GTX 460 with 1.0 GB memory.
We use uncached astronomy calculations and unoptimized code. Targeting 60 fps, the average performance impact of a cloudless day-night sky is less than 4% on our test system. Precomputation of all atmosphere textures took about 2.0 s. Table 1 lists rendering times per phenomenon.
Figure 9: Photographed landscape combined with a sky at daytime, rendered using our method. It contains two cloud layers at 1 km (with scattering) and 7 km, the moon with ten times its apparent diameter, and the sun in the right direction.
Figure 10: A comparison of renderings of the December 21, 2010 lunar eclipse viewed from New York at about 7:40 UTC: a) Yapo and Cutler [YC09], b) and c) two photographs, and d) the method presented in this paper.
7. Conclusion and Future Work
We have shown techniques for efficient, astronomically accurate
real-time rendering of the moon and stars featuring an unprecedented
degree of detail. A third technique was presented that composes
these night phenomena with existing daytime techniques, attaining
holistic day-night cycles within a single pass. The methods provide
astrophysically plausible skies and are well suited for on-the-fly
computation of backgrounds and cube maps, often required for
real-time reflections, global illumination, and various
post-processing effects.
Among other improvements, blending based on an apparent control
magnitude or radiance, modeled for all individual phenomena, would
be most valuable. This, however, requires a uniform integration of
all phenomena within a single model. Finally, we would like to
address a proper, astrophysically plausible synthesis of solar
eclipses.
References
[BN08] Bruneton E., Neyret F.: Precomputed atmospheric scattering. Comput. Graph. Forum 27, 4 (2008). 2, 6, 7
[Buc95] Bucholtz A.: Rayleigh-scattering calculations for the terrestrial atmosphere. Applied Optics 34 (1995). 6
[BW09] Bridgman T., Wright E.: The Tycho Catalog Skymap - Version 2.0, 2009. 7
[Dub05] Dubé J.-F.: Realistic cloud rendering on modern GPUs. In Game Programming Gems 5. 2005. 2, 7
[Hap66] Hapke B.: An improved theoretical lunar photometric function. Astronomical Journal 71 (1966). 3
[HKA05] Hasan M. M., Karim M. S., Ahmed E.: Generating and rendering procedural clouds in real time on programmable 3D graphics hardware. In INMIC 2005 (2005), IEEE. 2
[HW95] Hoffleit D., Warren Jr. W. H.: Bright Star Catalogue, 5th revised ed. VizieR Online Data Catalog 5050 (1995). 5
[JDS∗01] Jensen H. W., Durand F., Dorsey J., Stark M. M., Shirley P., Premože S.: A physically-based night sky model. In SIGGRAPH 2001 (2001), ACM. 2, 3, 5, 6
[JM53] Johnson H. L., Morgan W. W.: Fundamental stellar photometry for standards of spectral type on the revised system of the Yerkes Spectral Atlas. Astrophysical Journal 117 (1953). 6
[KCKH03] Kim Y.-S., Cho B.-H., Kang B.-S., Hong D.-I.: Color temperature conversion system and method using the same, 2003. 6
[Kry85] Krystek M. P.: An algorithm to calculate correlated color temperature. Color Research and Application 10 (1985). 6
[Mai09] Maiwald C.: Hochwertiges Rendern von Sternen 2.0 [High-quality rendering of stars 2.0], 2009. 5
[Mee94] Meeus J.: Astronomische Algorithmen. Barth, 1994. 2,
[MSK∗10] Magnor M., Sen P., Kniss J., Angel E., Wenger S.: Progress in rendering and modeling for digital planetariums. In Proc. of Eurographics 2010 (2010). 2
[Mül12] Müller D.: osgHimmel - OSG library featuring dynamic, immersive, textured or date-time and location based, procedural skies, 2012. 5, 7
[NGN∗00] Nadeau D. R., Genetti J. D., Napear S., Pailthorpe B., Emmart C., Wesselak E., Davidson D.: Visualizing stars and emission nebulae, 2000. 2
[NSTN93] Nishita T., Sirai T., Tadamura K., Nakamae E.: Display of the earth taking into account atmospheric scattering. In SIGGRAPH 93 (1993), ACM. 6
Visualization of eclipses and planetary conjunction events. The interplay between model coherence, scaling and animation. The Visual Computer 17, 5 (2001). 2
[Ols98] Olson T.: The colors of the stars. In IS&T/SID 6th Color Imaging Conference (1998). 6
[PSS99] Preetham A. J., Shirley P., Smits B.: A practical analytic model for daylight. In SIGGRAPH 99 (1999). 6
[REK∗04] Riley K., Ebert D. S., Kraus M., Tessendorf J., Hansen C.: Efficient rendering of atmospheric phenomena. In EGSR 04 (2004). 4
[RIF∗09] Ritschel T., Ihrke M., Frisvad J. R., Grosch T., Myszkowski K., Seidel H.-P.: Temporal glare: Real-time dynamic simulation of the scattering in the human eye. In Proceedings Eurographics 2009 (2009). 6
[RP05] Roden T., Parberry I.: Clouds and stars: Efficient real-time procedural sky rendering using 3D hardware. In Proc. of the 2005 ACM SIGCHI (2005). 2
[SSZG95] Spencer G., Shirley P., Zimmerman K., Greenberg D. P.: Physically-based glare effects for digital images. In SIGGRAPH 95 (1995). 6
[vdH80] van de Hulst H. C.: Multiple Light Scattering: Tables, Formulas, and Applications. Academic Press, 1980. 3
[YC09] Yapo T. C., Cutler B.: Rendering lunar eclipses. In Proc. Graphics Interface (2009). 2, 4, 6, 7, 8