All-sky imaging: a simple, versatile system
for atmospheric research
Axel Kreuter,* Matthias Zangerl, Michael Schwarzmann, and Mario Blumthaler
Division for Biomedical Physics, Department of Physiology and Medical Physics,
Innsbruck Medical University, Müllerstrasse 44, 6020 Innsbruck, Austria
*Corresponding author:
Received 17 November 2008; revised 16 January 2009; accepted 17 January 2009;
posted 21 January 2009 (Doc. ID 104048); published 13 February 2009
A simple and inexpensive fully automated all-sky imaging system based on a commercial digital camera
with a fish-eye lens and a rotating polarizer is presented. The system is characterized and two examples
of applications in atmospheric physics are given: polarization maps and cloud detection. All-sky polar-
ization maps are obtained by acquiring images at different polarizer angles and computing Stokes
vectors. The polarization in the principal plane, a vertical cut through the sky containing the Sun, is
compared to measurements of a well-characterized spectroradiometer with polarized radiance optics
to validate the method. The images are further used for automated cloud detection using a simple color-
ratio algorithm. The resulting cloud cover is validated against synoptic cloud observations. A Sun cover-
age parameter is introduced that shows, in combination with the total cloud cover, useful correlation with
UV irradiance. © 2009 Optical Society of America
OCIS codes: 010.1615, 100.2960, 110.5405, 120.5410.
1. Introduction
Observations of the sky are one of the oldest methods
in planetary sciences such as meteorology and astronomy. In many cases the human observer is still indispensable; however, autonomous digital technology has grown in importance, e.g., in large observation networks. Digital all-sky imaging has been
utilized in the automated investigation of diverse
phenomena such as Auroras [1], urban light pollu-
tion [2], and photosynthetically active radiation un-
der forest canopies [3]. In astronomy, meteorology or
atmospheric physics, all-sky imaging in its most ele-
mentary form is a convenient way to record the gen-
eral atmospheric situation during observations [4].
More specifically, all-sky cameras with polarization
filters have been used for polarization mapping of
the sky hemisphere [5–10]. When calibrated against a radiometric standard, such a system is a multiwavelength, multiangle radiometer measuring the radiance of the whole sky in one exposure [5]. Compared
to sky-scanning grating spectroradiometers, acquisi-
tion speed is a trade-off against well-defined wave-
length bandwidth, dynamic range, and precision of
the detector. The sky's polarization is sensitive to the aerosol properties of the atmosphere and is thus an ideal complement to aerosol optical depth measurements.
Systems for cloud observation and detection purposes have been presented in [10–14]. Cloud type
identification has been attempted but remains a
challenging issue [15]. It seems that a single camera
picture on its own is not quite sufficient for a detailed
automated cloud type analysis since, e.g., cloud
height information is difficult to recover. However,
in combination with satellite images in different
wavelength regions and a ceilometer, an all-sky im-
age could add valuable information.
The combination of all-sky imaging with UV radiation measurements has been shown in [16–18],
where cloud cover analysis was correlated with
enhanced UV irradiation. Under certain cloud
configurations the global UV radiation field can be
enhanced compared to the clear-sky value. For
radiation measurements in general, all-sky imaging
20 February 2009 / Vol. 48, No. 6 / APPLIED OPTICS 1091
adds an extra dimension to systematic data analysis.
Cloud cover and Sun coverage are key parameters for estimating actual irradiances from clear-sky model calculations.
In this study we present a particularly simple and
inexpensive all-sky imaging system for atmospheric
research and demonstrate two applications: polarization maps and cloud detection. We show that rudimentary characterization and relative calibration are sufficient for these purposes.
2. General System Description
Images of the full-sky hemisphere are recorded with a
commercial compact digital camera (Canon A75) with
a fish-eye objective [field-of-view (FOV) 180°] and a
stepper motor controlled linear polarizer, situated be-
tween the objective and the camera. The system is
mounted in a weatherproof housing with a glass dome
and connected via ethernet cable to a personal compu-
ter for external automated control. A horizontal setup
is assured by a bubble level on the housing. Four
images at polarizer angles differing by 45° are
acquired at a fixed exposure time of 1/125 s and an aperture of f/6. At one polarizer angle, a second under-
exposed image is acquired (a rudimentary high dy-
namic range (HDR) method, described, e.g., in [19])
to gain extra information close to the Sun where pixels
are often saturated. The images are transmitted in
JPEG format and have an intensity resolution of 8 bit for three color channels (RGB) and a spatial resolution of 1536 × 2048 pixels. The whole process of acquiring the set of five images, rotating the polarizer, and transmitting the data takes less than 1 min.
The system was installed on a building rooftop in the city of Innsbruck, Austria (47.26°N, 11.39°E, 620 m amsl). In routine operation, acquisitions are taken hourly at solar elevations >5°. The system was also installed at two additional sites, featuring different environments and horizons: a flat urban site in Vienna (48.24°N, 16.33°E, 180 m amsl) and a mid-altitude alpine site with an obscured horizon in Kolm-Saigurn (47.07°N, 12.98°E, 1600 m amsl).
A shadow mechanism for obscuring the direct Sun was dismounted again since, for most of our purposes here, the advantages (simplified system, fewer moving parts, less obscured sky) outweigh the disadvantages (the area around the direct Sun is difficult for image processing, and reflections occur in the lens system).
The hemisphere is projected onto the flat CCD chip by an equiangular projection. Each point in the sky, characterized by two angles (azimuth angle ϕ and zenith angle Θ), is mapped onto a circular area in the x-y plane. For increased computation speed, the original JPEG images are downscaled by a factor of 4 by nearest-neighbor interpolation and cut so that the hemisphere of 168° FOV is a circle in a 325 × 325 pixel square plane. The x and y coordinates are centered on the zenith pixel (x0, y0) and converted to polar coordinates ϕ and r. The radius r is proportional to the zenith angle in the sky, Θ = r(Θmax/rmax), with Θmax = FOV/2 and rmax = 325/2, while the azimuth angle is invariant under the projection.
The zenith pixel is the geometric center of the square plane when the camera is perfectly level. This is checked by marked points (mountain peaks) near the horizon. The resulting angle that each pixel subtends is Θmax/rmax ≈ 0.5°. Considering the scale of the atmospheric structures in the sky, an adequate spatial resolution is easily met by a 1 Mpixel CCD chip.
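As a concrete sketch of this mapping, the pixel-to-sky conversion can be written as follows (a minimal sketch, assuming the 325 × 325 downscaled image with the zenith pixel at its geometric center; the function and parameter names are illustrative, and the orientation of the image y axis is ignored):

```python
import numpy as np

def pixel_to_sky(x, y, x0=162.5, y0=162.5, fov_deg=168.0, r_max=162.5):
    """Map pixel (x, y) of the equiangular fish-eye image to sky
    coordinates: zenith angle theta (deg) and azimuth phi (deg).
    theta is proportional to the radius r from the zenith pixel,
    theta = r * (fov/2) / r_max; phi is invariant under the projection."""
    dx, dy = x - x0, y - y0
    r = np.hypot(dx, dy)                          # radial pixel distance
    theta = r * (fov_deg / 2.0) / r_max           # zenith angle in degrees
    phi = np.degrees(np.arctan2(dy, dx)) % 360.0  # azimuth in [0, 360)
    return theta, phi
```

A pixel on the image circle (r = r_max) maps to the edge of the 168° field of view, i.e., to a zenith angle of 84°.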
Since each pixel is illuminated by a radiant power from a solid angle and integrated over the spectral responsivity, the measured radiometric quantity is radiance. In fact, an idealized camera attempts to imitate the human eye's spectral responsivity, so the quantity would then be luminance.
Each pixel can be considered a set of three inde-
pendent broadband detectors with 8 bit resolution.
The radiance R at each pixel is a nonlinear function of the stored pixel counts C,

R = k f(C),  (1)

where k is a calibration constant for radiance in absolute units (W m⁻² sr⁻¹), so f(C) is a linearized pixel intensity or relative radiance. This nonlinear conversion is generally implemented in imaging systems to increase contrast. Normally, this function is called gamma correction and is of the form f(C) = C^γ with γ ≈ 2.1 [20]. For our camera this relationship has been found to be too inaccurate over the full dynamic
range, and the function f has been established experimentally by analyzing a series of images of an illuminated reflection plate with increasing exposure time τ and fitting a third-order polynomial of the form f(C) = aC − bC² + cC³ as an empirical function (see Fig. 1). The result of the least-squares fit for the blue channel is [a b c] = [0.241, 0.001, 0.0000154].

Fig. 1. Linearization function f(C) to convert pixel counts into relative radiance for the blue channel. Data points are fitted with a third-order polynomial.

To determine the constant k, a radiance standard such as an integrating sphere must be used. It is possible
that k is a function of zenith angle Θ, an effect known as vignetting in photography, in particular with fish-eye objectives. It is a major advantage of the polarization measurements and cloud detection presented in this study that absolute radiance calibration is not required, which is a great experimental simplification. Note that the dark noise is automatically subtracted by the camera, so that our function f has no offset. As a test, a series of images was acquired at exposure times of 0.0005–15 s with the camera completely in the dark. The mean counts of the JPEG images ranged from 0.003 at 0.0005 s exposure time to 0.25 at 15 s. It is concluded that for the exposure times used here (around 0.01 s) the dark noise is sufficiently subtracted.
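For illustration, the empirical linearization with the blue-channel coefficients quoted above can be sketched as follows (a minimal sketch; the polynomial and its coefficients are specific to this camera and must be re-fitted for any other system):

```python
import numpy as np

def linearize(counts, coeffs=(0.241, 0.001, 0.0000154)):
    """Empirical third-order linearization f(C) = a*C - b*C**2 + c*C**3,
    converting 8-bit JPEG pixel counts to relative radiance.
    Default coefficients: blue-channel fit quoted in the text."""
    a, b, c = coeffs
    x = np.asarray(counts, dtype=float)
    return a * x - b * x**2 + c * x**3
```

Since the camera subtracts dark noise internally, the function has no constant offset, i.e., f(0) = 0, and it is monotonically increasing over the full 0–255 count range.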
Note also that, for the following image processing, each image is rotated so that the line through the zenith and the Sun is always vertical, with the Sun in the upper half. This cut through the hemisphere is called the principal plane (PP).
3. Polarization Maps
Polarization of the sky was first reported by Arago in
the early 19th century [21]. A first qualitative explanation was given by Lord Rayleigh using his molecular scattering theory [22]. Indeed, the very
simplified consideration of single molecular scatter-
ing and geometric scattering angles reproduces the
polarization pattern for longer wavelengths in the
visible spectral range quite well. In the real atmosphere, light is scattered multiple times and may be reflected back by the ground, resulting in lower-than-unity maximum polarization and in points of zero polarization, so-called neutral points. These neutral points were observed quite early by Arago, Brewster, and Babinet and have more recently been imaged in [23].
The generalized polarization state of light is conveniently described by the Stokes vector formalism [21]. Decomposing the electric field vector into its two orthogonal complex field amplitudes Er and El, the 4-component Stokes vector is defined as

S = [I, Q, U, V]ᵀ = [ErEr* + ElEl*, ErEr* − ElEl*, ErEl* + ElEr*, i(ErEl* − ElEr*)]ᵀ,  (2)

where E* denotes the complex conjugate amplitudes. All the components are real, physical quantities, namely, irradiances

EE* = |E|² = Iα,  (3)

measured at a different polarizer angle α:

S = [I₀ + I₉₀, I₀ − I₉₀, I₄₅ − I₁₃₅, I₊ − I₋]ᵀ,  (4)

where I₊ and I₋ denote circularly polarized irradiances. Equation (4) can be simplified for the specific case here by noting that the irradiance is proportional to the radiance for each pixel and by neglecting the circular polarization of the sky:

S = [R₀ + R₉₀, R₀ − R₉₀, R₄₅ − R₁₃₅, 0]ᵀ.  (5)

Rα denotes the measured radiances at relative polarizer angles α. The resulting degree of (linear) polarization Π and its angle χ are given by [21]

Π = √(Q² + U²) / I,  χ = (1/2) arctan(U/Q).  (6)
Geometrically, we can visualize Q/I and U/I as the two orthogonal components in a unit circle (a horizontal cut through the Poincaré sphere), whose vector sum is the degree of polarization. It is clear that Π is invariant under rotation of the coordinate system, so the zero offset of the analyzer is irrelevant. Note also that, in the expressions for Π and χ, any calibration constant cancels, so that it suffices to compute sums of relative radiances f(C). The polarization of the sky's hemisphere is computed by combining four
images at relative polarizer angles of 0°, 45°, 90°, and
135° [Fig. 2(a)]. The images are linearized by the conversion function f to obtain four relative radiance maps. Applying Eqs. (5) and (6) at each pixel yields the polarization map [Fig. 2(b)]. The maps are
smoothed by a 20 × 20 pixel 2D median filter for spatial noise reduction without obscuring real atmospheric structures. In this study we restrict ourselves to the blue channel of the camera, since it has the strongest signal and can be compared best to the UV spectroradiometer. The center wavelength is about 450 nm with a full width at half-maximum (FWHM) of 50 nm.
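The per-pixel evaluation of Eqs. (5) and (6) from the four linearized images can be sketched as follows (a minimal sketch; array and function names are illustrative, and the maps are assumed to be already linearized relative-radiance arrays):

```python
import numpy as np

def polarization_map(r0, r45, r90, r135):
    """Degree and angle of linear polarization per pixel from four
    relative-radiance maps at polarizer angles 0, 45, 90, 135 deg."""
    i = 0.5 * (r0 + r45 + r90 + r135)   # averaged first Stokes parameter I
    q = r0 - r90                        # second Stokes parameter Q
    u = r45 - r135                      # third Stokes parameter U
    with np.errstate(divide="ignore", invalid="ignore"):
        pol = np.sqrt(q**2 + u**2) / i  # degree of polarization
    chi = 0.5 * np.arctan2(u, q)        # polarization angle chi
    return pol, chi
```

Because only ratios of radiance sums enter, any radiometric calibration constant cancels, which is why relative radiances f(C) suffice.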
Finally, the polarizing property of the composite optical system (dome/objective/polarizer/CCD chip) is tested. Errors introduced by optical components can be described by the Mueller matrix, operating
Fig. 2. (a) Blue pixel counts C from JPEG images at four polarizer angles. Counts are converted to relative radiance and inserted into Eqs. (5) and (6) to yield the polarization map. (b) Corresponding polarization map of the cloud-free sky for a wavelength of 450 ± 50 nm (blue channel). The degree of polarization is coded in gray shades; undetermined areas are rendered white. (26 February 2008, 10:30 UTC.)
on the input Stokes vector [21]. The Mueller matrix describes a nonunitary polarization transformation, i.e., it can rotate and change the length of the Stokes vector. Here we confirm that the length of the Stokes vector is conserved. A large, completely polarized area (a polarizer in front of a white reflectance plate) is used as the test field and analyzed from different angles of incidence (0°, 45°, and 80°) and polarizer angle offsets (0°, 15°, and 30°) to confirm the invariance of Π under these angles. The measured polarization is averaged over the test area and shown along with the standard deviation as an error bar in Fig. 3. No dependence is found for these angles of incidence and polarizer offset angles, so we assume the Mueller matrix to be the identity for all angles of incidence.
However, small bright sources like the Sun may cause internal reflections at certain angles in the objective, which are superimposed on the sky's image. These reflections may locally introduce large errors in the polarization, but they are clearly visible in the raw images and can be identified as artifacts.
Since the Stokes vector is independent of a polarizer offset angle, it is clear that the first Stokes parameter I is overdetermined: I = R₀ + R₉₀ = R₄₅ + R₁₃₅, i.e., three polarizer angles would suffice for a full Stokes-vector recovery. This is expected because the left-hand side of Eq. (5) contains only three unknowns. Now we use the additional information for better statistics, averaging the first Stokes parameter I to (1/2) Σα Rα. Furthermore, the difference d = (R₀ + R₉₀) − (R₄₅ + R₁₃₅) should be zero and is used as a quality check of the images.
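The redundancy check on d can be sketched as a small helper (a minimal sketch; the function name and the relative tolerance are illustrative choices, not values from the text):

```python
import numpy as np

def stokes_quality_check(r0, r45, r90, r135, tol=0.05):
    """Consistency check for a four-image set: the difference
    d = (R0 + R90) - (R45 + R135) should vanish; accept the set if
    |d| stays within a relative tolerance of the averaged intensity."""
    i = 0.5 * (r0 + r45 + r90 + r135)          # averaged Stokes parameter I
    d = (r0 + r90) - (r45 + r135)              # redundancy residual
    return bool(np.all(np.abs(d) <= tol * np.maximum(i, 1e-12)))
```

An image set failing this check (e.g., because the scene changed between exposures) would be discarded before computing polarization maps.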
To estimate the statistical error in our polarization
analysis, we measured consecutive maps of the sky every 2 min under stable atmospheric conditions.
From the variation of these series, we estimate a 1σ
standard deviation of 3% for Π.
For validation of this method with respect to sys-
tematic errors, we compare the polarization in the
principal plane to the values measured with a well-
characterized spectroradiometer with polarized
radiance optics with a small FOV of 1.5° [24]. The
wavelength of the spectroradiometer has been set
to 495 nm. As displayed in Fig. 4, the general shape
and value of polarization agree well within 3%.
Around the Sun, at a polar angle of 60°, the camera's pixels are close to overexposure, resulting in a large deviation of the polarization from the spectroradiometer measurement. Also, around the maximum polarization at 30° zenith angle (90° behind the Sun), a polarized reflection within the fish-eye objective causes a distortion of the polarization curve. To confirm the effect of the reflections from the unobscured Sun, a direct comparison between the polarization in the principal plane with and without a mounted shadow band is shown in Fig. 5. When the shadow band is omitted, the measured polarization is perturbed only at the locations of the reflections (around 45°) and around the Sun (62°). Bearing these limitations in mind, all-sky maps of the polarization can be investigated.
Two interesting examples of polarization maps are given. Figure 6(a) shows clear-sky polarization after sunset, when two neutral points are distinctly visible. The minima around +70° and −70° zenith angle in the principal plane are called the Babinet and Arago neutral points, respectively. Their positions depend on aerosol parameters in the atmosphere and on ground reflectivity [21] and will be investigated more closely in the future.
In contrast, Fig. 6(b) shows the polarization map of a partly covered sky around noon (solar elevation is 41°). High, thin cirrus clouds reduce the maximum polarization below 50%, while lower, optically thick cumulus clouds are barely polarized. It has also been noted that scattered cumulus clouds reduce the polarization even in the clear sky between them, due to reflected light, corresponding to a higher effective albedo. So although clouds have a dramatic effect on the sky's polarization, cloud detection is based on a different method, described in Section 4.
Fig. 3. Analysis of a fully polarized test field (white reflectance plate) at different angles of incidence and polarizer angle offsets of 0°, 15°, and 30°. Within the experimental uncertainty, no influence of these parameters on the measured polarization can be found.
Fig. 4. Comparison of the all-sky camera blue polarization and
spectroradiometer radiance measurements in the principal plane
on a clear-sky day, 26 February 2008, 10:30 UTC. Solar zenith
angle is 58°.
4. Cloud Detection
Color is the primary property that allows visual dis-
tinction of clouds in the sky. Because of different
wavelength dependence of scattering, the color of
the clear-sky region is blue rather than the whitish
or gray color of clouds. So in a digital color image, clear-sky pixels have a higher ratio of blue to red radiance than cloud pixels. The cloud detection method is based on setting a threshold on this ratio [11–14].
Considering a cloudless, aerosol-free sky, the blue/red ratio is a function of both zenith and azimuth angles in the sky and of the solar elevation. Toward the horizon, the sky appears paler than at the zenith. In addition, scattering by aerosols, which in general is less wavelength dependent and much stronger in the forward direction than molecular scattering [25], also diminishes the blueness of the sky, most prominently in the region around the Sun. However, aerosol content and scattering properties may vary so much in time that this temporal variation masks the spatial variation of the pristine cloudless-sky color. Hence, without aerosol information, a constant blue/red ratio threshold is taken for the entire hemisphere to accommodate a wide range of atmospheric conditions.
Using a diverse set of images with typical cloud situations (low and high clouds, illuminated and dark clouds, different solar zenith angles), a suitable threshold of 1.3 on this ratio was found that best discriminates cloud and clear sky. The threshold was confirmed by investigating the number of cloud-marked pixels as a function of the threshold: the number first increases before reaching a plateau around the optimal value, after which it increases again. It should be noted that the threshold is unique to each camera system since it depends on the color response of the CCD chip as well as on any gamma correction. The location also has an influence on the threshold, as altitude and the typical aerosol background result in a different clear-sky color.
For cloud-marked pixels, the underexposed image is used as a second criterion, applying another threshold on the blue/red ratio. This step is necessary for the region near the unobscured Sun, where pixels close to saturation would otherwise always be cloud-marked. The total cloud cover (TCC) is then computed as the ratio of cloud-marked pixels to the total pixel number in the hemisphere, in which the horizon (up to 20° for the Innsbruck site) is masked out. It is noted that the fish-eye projection is not area-conserving and the projected solid angle per pixel is strictly a function of zenith angle (see, e.g., [11] for a detailed mathematical formulation). So the simple ratio is an approximation, with the relative error growing with increasing zenith angle. However, the absolute error in the resulting TCC is below 0.01 for most situations and can safely be ignored, considering the accuracy of the TCC value, which is normally rounded to one decimal place or given in octas.
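A minimal sketch of the color-ratio thresholding and the TCC computation is given below (the threshold 1.3 is the value found for this particular camera and is not transferable; the second criterion based on the underexposed image near the Sun is omitted here, and function names are illustrative):

```python
import numpy as np

def cloud_mask(rgb, threshold=1.3):
    """Mark cloud pixels where the blue/red radiance ratio falls below
    the threshold. 'rgb' is an (H, W, 3) array in RGB channel order."""
    red = rgb[..., 0].astype(float)
    blue = rgb[..., 2].astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(red > 0, blue / red, np.inf)
    return ratio < threshold  # True = cloud-marked

def total_cloud_cover(mask, valid):
    """TCC as the ratio of cloud-marked to valid (non-horizon) pixels."""
    return mask[valid].sum() / valid.sum()
```

The `valid` mask would exclude the horizon region (up to 20° for the Innsbruck site); as noted above, ignoring the non-area-conserving projection introduces a TCC error below 0.01 in most situations.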
Furthermore, the area around the Sun is investigated in more detail. When the Sun is unobscured, diffraction around the blades of the camera's aperture produces a star-flare pattern around the Sun with a sixfold symmetry. The number of flares allows a quantitative definition of a Sun coverage parameter (SCP) as a measure of how much the Sun is obscured by clouds. Three discrete cases are distinguished: when no pixels in the underexposed image are saturated, the Sun is completely obscured, and the SCP is assigned unity. When pixels are saturated but the number of detected flares is less than five, the Sun is assumed to be partially covered, with an associated SCP of 0.5. When five or more Sun flares are detected, the Sun is considered unobscured and has an SCP of 0.
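The three-way classification can be sketched as a small helper (the function and argument names are illustrative; detecting the saturated pixels and counting the diffraction flares in the underexposed image is not shown):

```python
def sun_coverage_parameter(n_saturated, n_flares):
    """Discrete Sun coverage parameter from the underexposed image:
    1.0 = Sun completely obscured, 0.5 = partially covered,
    0.0 = unobscured (five or more diffraction flares detected)."""
    if n_saturated == 0:
        return 1.0   # no saturated pixels: Sun completely obscured
    if n_flares >= 5:
        return 0.0   # full star-flare pattern: Sun unobscured
    return 0.5       # saturated pixels but few flares: partially covered
```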
Here we use the JPEG image with the polarizer set
parallel to the principal plane, which approximates
an unpolarized relative radiance image. In principle,
the degree of polarization could also be applied for
cloud discrimination, but it was found to be less
selective than the color-ratio threshold and requires
more image processing.
Two representative examples of the performance
of the cloud detection method are shown in Fig. 7.
Cumulus-type clouds [Figs. 7(a) and 7(b)] have a par-
ticularly sharp boundary and good contrast in the
Fig. 6. (a) Polarization map at dusk; solar elevation is 6° and the Sun has set behind the mountains. Note the neutral points at about 70° zenith angle. (9 September 2008, 17:00 UTC.) (b) Polarization map around noon with high and low clouds. (24 September 2008, 12:00 UTC.)
Fig. 5. Comparison of the polarization in the principal plane, with and without obscuring the Sun using a shadow arm. The measured polarization is only perturbed at the locations of the reflections (around 45°) and around the Sun (solar zenith angle = 62°). 24 October 2008, 09:46 and 09:48 UTC.
blue/red ratio against the clear sky. In this case, the
detection works very well, even close to the unobs-
cured Sun. More problematic are thin clouds in front
of the Sun [Figs. 7(c) and 7(d)], where blue/red ratios
are very similar. Nevertheless, total cloud cover re-
sults are not severely affected. Note that the contrail
is nicely detected, as well as cumulus-type clouds
near the horizon.
For a quantitative statistical validation, more than
1 yr of hourly data in the period from 2 August 2007
to 27 October 2008 has been compared to the synoptic (SYNOP) observation at Innsbruck airport, located 3 km to the west of the camera site; see Fig. 8. 73% of a total of 3903 analyzed camera images agree to within 1 octa with the SYNOP observations.
The distribution of the differences shows a slight asymmetry, i.e., our cloud cover results tend to underestimate those of the SYNOP observers. However, cloud observations always bear a certain interpretational variance; some differences will exist even between human observers. Specifically, the transition from haze to cloud is continuous. For example, on some hazy days (aerosol optical depth at 500 nm of >0.4) our analysis results in zero cloud cover, whereas observers often interpret such a situation as totally overcast. Furthermore, clouds in front of mountains are considered only by the SYNOP observers and add to the bias.
Finally, the significance of the SCP is validated by correlating it with the erythemally weighted global UV irradiance (UV index, or UVI) and the TCC [Figs. 9(a)–9(c)]. The method for measuring the UVI and determining the clear-sky model value is described in detail in [26]. For the cases when the Sun was labeled obscured (SCP of 1), the sky is mostly found totally covered and the measured UVI is much lower than the predicted clear-sky value, with the ratio peaking around 0.3. For cases of a partially covered Sun, the whole range of TCC values is found and, as expected, the ratio of measured-to-predicted clear-sky UVI decreases with increasing TCC. For an SCP of 0, a predominantly cloud-free sky occurs with small TCC, and the measured UVI is close to the predicted clear-sky value. These observations compare
well with those in [16]. The classification into three Sun coverage scenarios represents a refinement of
Fig. 7. (a) Image of a sky with convective cloud type (cumulus) of low and medium height. This cloud type has a sharp contrast and is relatively easy to discriminate from the clear sky. (19 September 2008, 12:00 UTC.) (b) Processed image after cloud detection; clouds are rendered white, while clear sky is gray and the mask is black. The underexposed image (not shown here) allows good discrimination close to the Sun. Total cloud cover here is 0.63. (c) Image of a sky with both low and high clouds, including altostratus and a contrail. (24 September 2008, 12:00 UTC.) (d) Total cloud cover is 0.29. Note the problematic area near the Sun when thin clouds are present.
Fig. 8. Comparison of total cloud cover (TCC) obtained from the camera images and SYNOP cloud observations (TCC_camera − TCC_SYNOP). 73% of a total of 3903 analyzed camera images agree within 1 octa with the SYNOP observations.
Fig. 9. (a) For an SCP of 1, the Sun is totally occluded, which coincides with a TCC near 1. The ratio of measured UVI and clear-sky prediction peaks around 0.3. In the range 0.98 < TCC < 1, more than 620 data points are accumulated. (b) An SCP of 0.5 implies a partially covered Sun. In this case increasing TCC is correlated with a decreasing ratio of measured UVI to clear-sky prediction. (c) For an SCP of 0, the Sun is assumed totally unobscured, which correlates with small TCC and the measured UVI converging toward the clear-sky prediction.
previous work where only two cases were distinguished (Sun obscured or not obscured). It is further noted that the extreme cases of SCP 0 and 1 represent well-defined classes of scenarios, while an SCP of 0.5 spans a larger scope, which suggests a possible future improvement through refined partitioning of this case.
5. Conclusion and Outlook
A simple all-sky imaging system with a polarizing
filter has been described and characterized with
respect to measuring the sky's degree of polarization
and detecting total cloud cover. Both types of analysis
have been validated against independent measuring
methods and found to agree well within their respec-
tive uncertainties. A Sun coverage parameter has
been obtained from image processing around the
Sun. It has been shown that the combination of total
cloud cover and Sun coverage parameter form a solid
basis for UV radiation estimation under cloudy
conditions. A cloud type analysis could be attempted
using pattern-recognition techniques in combination
with satellite images. Polarization maps of the sky contain valuable information about the aerosol content of the atmosphere. The correlation of aerosol optical depth and degree of polarization is one of the paramount topics in our ongoing work. Furthermore, in contemporary climate research, the interaction of aerosols and clouds is a key uncertainty in the Earth's energy budget. As a complementary device together with an aerosol optical depth measurement such as a sunphotometer, all-sky imaging could be a powerful tool for the investigation of aerosol–cloud interactions.
This work was supported by the Austrian Science
Fund (FWF) under Project P18780. We gratefully
acknowledge Lanzinger at Austrocontrol, Innsbruck
Airport, for supplying the SYNOP cloud observa-
tions. The camera system was developed in coopera-
tion with Schreder (CMS-Ing.Dr.Schreder GmbH).
References
1. S. B. Mende, S. E. Harris, H. U. Frey, V. Angelopoulos, C. T. Russell, E. Donovan, B. Jackel, M. Greffen, and L. M. Peticolas, The THEMIS array of ground-based observatories for the study of auroral substorms, Space Sci. Rev., doi:10.1007/s11214-008-9380-x (2007).
2. G. Zotti, Measuring light pollution with a calibrated high dynamic range all-sky image acquisition system, presented at DARKSKY 2007, the 7th European Symposium for the Protection of the Night Sky, Bled, Slovenia (2007).
3. R. L. Chazdon and C. B. Field, Photographic estimation of
photosynthetically active radiation: evaluation of a computerized technique, Oecologia 73, 525–532 (1987).
4. T. E. Pickering, The MMT all-sky camera, Proc. SPIE 6267, 62671A (2006).
5. Y. Liu and K. Voss, Polarized radiance distribution measurements of skylight. I. System description and characterization, Appl. Opt. 36, 6083–6094 (1997).
6. N. J. Pust and J. A. Shaw, Dual-field imaging polarimeter using liquid crystal variable retarders, Appl. Opt. 45, 5470–5478 (2006).
7. J. Gál, G. Horváth, V. B. Meyer-Rochow, and R. Wehner, Polarization patterns of the summer sky and its neutral points measured by full-sky imaging polarimetry in Finnish Lapland north of the Arctic Circle, Proc. R. Soc. A 457, 1385–1399 (2001).
8. J. A. North and M. J. Duggin, Stokes vector imaging of the polarized sky-dome, Appl. Opt. 36, 723–730 (1997).
9. M. V. Berry, M. R. Dennis, and R. L. Lee, Jr., Polarization singularities in the clear sky, New J. Phys. 6, 162 (2004).
10. G. Horváth, A. Barta, J. Gál, B. Suhai, and O. Haiman, Ground-based full-sky imaging polarimetry of rapidly changing skies and its use for polarimetric cloud detection, Appl. Opt. 41, 543–559 (2002).
11. C. N. Long, J. M. Sabburg, J. Calbó, and D. Pagès, Retrieving cloud characteristics from ground-based daytime color all-sky images, J. Atmos. Ocean. Technol. 23, 633–652 (2006).
12. N. H. Schade, A. Macke, H. Sandmann, and C. Stick, Total and partial cloud detection during summer months 2005 at Westerland (Sylt, Germany), Atmos. Chem. Phys. Discuss. 8, 13479–13505 (2008).
13. U. Feister, J. Shields, M. Karr, R. Johnson, K. Dehne, and M.
Woldt, Ground-based cloud images and sky radiances in the visible and near infrared region from whole sky imager measurements, in Proceedings of the Climate Monitoring Satellite Application Facility Training Workshop (Dresden, 2000).
14. U. Feister and J. Shields, Cloud and radiance measurements with the VIS/NIR Daylight Whole Sky Imager at Lindenberg (Germany), Meteor. Zeitschr. 14, 627–639 (2005).
15. K. A. Buch and C. H. Sun, Cloud classification using
whole-sky imager data,presented at Ninth Symposium on
Meteoriological Observations and Instrumentation, paper
7.5, Charlotte, North Carolina, 1995.
16. G. Pfister, R. L. McKenzie,J. B. Liley, A. Thomas, B. W. Forgan,
and C. N. Long, Cloud Coverage Based on All-Sky Imaging
and Its Impact on Surface Solar Irradiance,J. Appl. Meteorol.
42, 14211434 (2003).
17. N. H. Schade, A. Macke, H. Sandmann, and C. Stick,
Enhanced solar global irradiance during cloudy sky
conditions,Meteor. Zeitschr. 16, 295303 (2007).
18. J. M. Sabburg and C. N. Long, Improved sky imaging for
studies of enhanced UV irradiance,Atmos. Chem. Phys. 4,
25432552 (2004).
19. J. Stumpfel, C. Thou, A. Jones, T. Hawkins, A. Wenger, and
P. Debevec, Proceedings of the Third International Conference
on Computer Graphics, Virtual Reality, Visualisation and
Interaction in Africa (Association for Computing Machinery,
2004), pp. 145149.
20. C. A. Poynton, Digital Video and HDTV: Algorithms and
Interfaces (Morgan Kaufmann, 2003).
21. K. L. Coulson, Polarization and Intensity of Light in the Atmo-
sphere (Deepak, 1988).
22. J. W. Strutt, On the light from the sky, its polarisation and
color,Philos. Mag. 41, 107120 274279 (1871).
23. G. Horváth, J. Gál, I. Pomozi, and R. Wehner, Polarization
portrait of the Arago Point: video-polarimetric imaging of
the neutral points of skylight polarization,Naturwis-
senschaften 85, 333339 (1998).
24. M. Blumthaler, B. Schallhart, M. Schwarzmann, R. McKenzie,
P. Johnston, M. Kotkamp, and H. Shiona, Spectral UV mea-
surements of global irradiance, solar radiance, and actinic flux
in New Zealand: intercomparison between instruments and
model calculations,J. Atmos. Ocean. Technol. 25, 945958
25. S. Twomey, Atmospheric Aerosols (Elsevier, 1977).
26. B. Schallhart, M. Blumthaler, J. Schreder, and J. Verdebout, A
method to generate near real time UV-index maps of Austria,
Atmos. Chem. Phys. 8, 74837491 (2008).
20 February 2009 / Vol. 48, No. 6 / APPLIED OPTICS 1097
... For example, the dominant smartphone operating systems, Android and iOS, only introduced support for unprocessed (RAW) imagery as recently as 2014 (Android 5.0 'Lollipop') and 2016 (iOS 10). Before that, third-party developers could only use JPEG data, which introduce a number of systematic errors due to their lossy compression and bit-rate reduction [94,121,142,317,319,334,342,349,357]. Other common problems in consumer camera data include non-linearity and gamma correction [94,121,162,285,317,322,329,332,334,338,342,357–362], electronic and thermal noise [118,319,324,344,354,363,364], and highly variable (between camera models) spectral response functions that are not provided by manufacturers [121,317,323,331,333,334,349,351,354,358,365]. These factors limit the accuracy of radiometric measurements made with consumer cameras by introducing systematic errors. JPEG (ISO 10918) is based on lossy spatial compression and downsampling to 8-bit values, optimal for small file sizes while maintaining aesthetic quality. Due to camera-specific processing and compression artefacts, JPEG images lose information and are not recommended for quantitative analysis [94,121,142,317,319,329,334,342,349,357]. While standardisations exist, such as the standard Red Green Blue (sRGB) colour space and gamma curve [369], these are not strictly adhered to and cannot be assumed in data processing [372]. ...
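The gamma problem above can be made concrete with a short sketch that inverts the standard sRGB transfer curve to approximately linearize 8-bit JPEG values. This assumes the camera really applied the sRGB curve, which, as the text cautions, cannot be taken for granted for consumer cameras:

```python
import numpy as np

def srgb_to_linear(v8):
    """Invert the standard sRGB transfer curve for 8-bit values (0-255).

    Assumption: the camera actually encoded with the sRGB curve; real
    consumer cameras often deviate, so this is only an approximation.
    """
    v = np.asarray(v8, dtype=np.float64) / 255.0
    # Piecewise sRGB decoding: linear segment near black, power law above.
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

# Mid-grey JPEG value 128 maps to only ~21.6% linear intensity,
# illustrating how the non-linear 8-bit encoding distorts radiometry.
print(srgb_to_linear(128))
```

For RAW data this step is unnecessary, since the sensor response is already close to linear.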
Water is all around us and is vital for all aspects of life. Studying the various compounds and life forms that inhabit natural waters lets us better understand the world around us. Remote sensing enables global measurements with rapid response and high consistency. Citizen science provides new knowledge and greatly increases the scientific and social impact of research. In this thesis, we investigate several aspects of citizen science and remote sensing of water, with a focus on uncertainty and accessibility. We improve existing techniques and develop new methods to use smartphone cameras for accessible remote sensing of water.
... Their high price makes sky scanners a rare instrument in weather stations, where only global measurements, like diffuse irradiance, are available (I. García et al., 2016; Caldas and Alonso-Suárez, 2019; Schwartz et al., 2017), but there are also applications based on High Dynamic Range (HDR) imaging (Nou et al., 2018) or polarization-sensitive imaging systems (North and Duggin, 1997; Horváth et al., 2002; Kreuter et al., 2009). Besides cloud analysis, full-sky LDR and HDR images have been used for sky classification into standard CIE skies (ISO 15469:2004(E)/CIE S 011/E:2003) with good performance (García et al., 2020b; Shahriar, 2013). ...
... This linear relationship is spoiled by the use of other image formats, like JPEG, which involve nonlinear transformations of the original pixel values performed by the camera's inaccessible built-in software. Inversion algorithms can be applied to recover the sensor's irradiance from RGB pixel values (Kreuter et al., 2009), but access to RAW data makes them unnecessary. In the EOS 6D, RAW CR2 files contain the digital negative (5472 × 3648 pixels, approximately 19 MP) as well as Exif (Exchangeable Image file Format) information, like the time when the image was taken or the exposure time, that is needed during the analysis. ...
A full sky High Dynamic Range imaging system, based on a Single-Lens Reflex camera with a fisheye lens, has been constructed and calibrated with a sky scanner luminance meter. The method considers the geometrical, spectral, timing and orientation issues between instruments. The calibration data sets, having nearly simultaneous measurements under stable sky conditions, were obtained from approximately one month of data using selection variables based in the experimental design. For luminance estimation we use the standard CIEY RGB combination and a Spectrally Matched Luminance (SML) predictor, matching the spectral response of the instruments. With 738 calibration points having luminances up to 23.6kcd/m2, covering 98.5% of the sky luminance range, CIEY is linearly correlated with sky scanner measurements with a coefficient of determination R2=0.9927 and a Root Mean Squared Error (RMSE) of 7.7%. SML gives better results, with R2=0.9973 and RMSE=5.3%. With 253 calibration points with luminances up to 12.9kcd/m2, comprising 94.1% of the sky luminance range, both predictors clearly improve, with R2=0.9964 and RMSE=4.1% in case of CIEY and R2=0.9982 and RMSE=2.9% in case of SML.
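A minimal sketch of the calibration idea described in the abstract above: combine linear RGB into the standard CIE Y predictor and fit a single scale factor to luminance-meter readings by least squares. All numbers here are made up for illustration and are not the paper's data:

```python
import numpy as np

def cie_y(rgb_linear):
    """Standard CIE Y combination of linear sRGB primaries (Rec. 709 weights)."""
    r, g, b = rgb_linear[..., 0], rgb_linear[..., 1], rgb_linear[..., 2]
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Hypothetical matched samples: camera-derived CIE Y (arbitrary linear units)
# and nearly simultaneous luminance-meter readings (cd/m^2) for sky patches.
camera_Y = np.array([0.8, 2.1, 5.0, 9.7, 14.9])
meter_L  = np.array([1.6e3, 4.1e3, 1.0e4, 1.95e4, 3.0e4])

# One-parameter least-squares calibration through the origin: L = k * Y.
k = (camera_Y @ meter_L) / (camera_Y @ camera_Y)
residuals = meter_L - k * camera_Y
rmse = np.sqrt(np.mean(residuals**2)) / np.mean(meter_L)  # relative RMSE
```

The spectrally matched (SML) predictor mentioned in the abstract would presumably replace the fixed CIE weights with channel coefficients adapted to the instruments' spectral responses; the fitting step stays the same.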
... For instance, many traditional image-processing methods utilize color to pinpoint clouds from the clear sky. Some of these methods establish fixed or variable thresholds in the ratio between blue and red channels (Kreuter et al. 2009; Long et al. 2006b; Heinle et al. 2010; Souza-Echer et al. 2006). Combinations of statistical and machine learning (ML) models are also used for cloud cover and cloud type estimation (Tian et al. 1999; Calbó et al. 2001; Zhuo et al. 2014). ...
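A fixed-threshold colour-ratio cloud mask of the kind mentioned above can be sketched in a few lines; the 0.6 threshold and the toy pixel values are illustrative assumptions, not values from the cited works:

```python
import numpy as np

def cloud_mask_rb(red, blue, ratio_thresh=0.6):
    """Fixed-threshold red/blue ratio cloud mask.

    Clear sky scatters strongly in the blue, so its red/blue ratio is low;
    clouds scatter nearly neutrally, pushing the ratio toward 1.
    """
    red = np.asarray(red, dtype=np.float64)
    blue = np.asarray(blue, dtype=np.float64)
    ratio = red / np.maximum(blue, 1e-6)   # guard against division by zero
    return ratio > ratio_thresh            # True = cloudy pixel

# Toy 2x2 scene: two blue-sky pixels (top row), two grey cloud pixels.
red  = np.array([[ 40,  45], [200, 180]])
blue = np.array([[120, 130], [210, 190]])
mask = cloud_mask_rb(red, blue)
cloud_fraction = mask.mean()   # fraction of pixels flagged as cloud
```

Variable-threshold variants adapt `ratio_thresh` to, e.g., aerosol load or distance from the sun, which is why they outperform a single global threshold.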
Accurate cloud type identification and coverage analysis are crucial in understanding the Earth’s radiative budget. Traditional computer vision methods rely on low-level visual features of clouds for estimating cloud coverage or sky conditions. Several handcrafted approaches have been proposed; however, scope for improvement still exists. Newer deep neural networks (DNNs) have demonstrated superior performance for cloud segmentation and categorization. These methods, however, need expert engineering intervention in the preprocessing steps—in the traditional methods—or human assistance in assigning cloud or clear sky labels to a pixel for training DNNs. Such human mediation imposes considerable time and labor costs. We present the application of a new self-supervised learning approach to autonomously extract relevant features from sky images captured by ground-based cameras, for the classification and segmentation of clouds. We evaluate a joint embedding architecture that uses self-knowledge distillation plus regularization. We use two datasets to demonstrate the network’s ability to classify and segment sky images—one with ∼ 85,000 images collected from our ground-based camera and another with 400 labeled images from the WSISEG database. We find that this approach can discriminate full-sky images based on cloud coverage, diurnal variation, and cloud base height. Furthermore, it semantically segments the cloud areas without labels. The approach shows competitive performance in all tested tasks, suggesting a new alternative for cloud characterization.
... In fact, cloud cover assessment does not make the best of sky polarization, mainly because of the difficulty of extracting the cloud layer from a single piece of polarization information. Although Horváth et al. made some early efforts with sky polarization information [18] and Kreuter et al. presented a commercial digital camera with a fisheye lens and a rotating polarizer [19], their experimental results were not satisfactory due to the limitations of immature polarization imaging equipment. Eshelman et al. verified that a ground-based all-sky polarimeter system reliably determines the cloud thermodynamic phase [20], but the system is unavailable for cloud detection. ...
Sky cloud detection has significant application value in the meteorological field. Existing cloud detection methods mainly rely on the color difference between the sky background and the cloud layer in the sky image, and are not reliable due to the variable and irregular characteristics of the cloud layer and different weather conditions. This paper proposes a cloud detection method based on all-sky polarization imaging. The core of the algorithm is the “normalized polarization degree difference index” (NPDDI). Instead of relying on color difference information, this index identifies the difference between the degrees of polarization (DoPs) of cloudy-sky and clear-sky radiation to achieve cloud recognition. The method is not only fast and straightforward algorithmically, but can also detect the optical thickness of the cloud layer in a qualitative sense. The experimental results show good cloud detection performance.
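The NPDDI itself is defined in the cited paper, but the degree of polarization (DoP) it operates on can be computed from images taken at three polarizer orientations, the same Stokes-vector approach used by the all-sky system described in this article. A minimal sketch, assuming an ideal polarizer and neglecting circular polarization (as is usual for skylight):

```python
import numpy as np

def degree_of_polarization(i0, i45, i90):
    """Linear Stokes parameters and DoP from intensities measured through
    a polarizer oriented at 0, 45 and 90 degrees."""
    i0, i45, i90 = (np.asarray(a, dtype=np.float64) for a in (i0, i45, i90))
    stokes_i = i0 + i90                  # total intensity I
    stokes_q = i0 - i90                  # 0/90 degree linear component Q
    stokes_u = 2.0 * i45 - i0 - i90      # 45/135 degree linear component U
    return np.sqrt(stokes_q**2 + stokes_u**2) / np.maximum(stokes_i, 1e-9)

# Toy pixels: fully polarized light (passes at 0 deg, blocked at 90 deg)
# and unpolarized light (same intensity at every polarizer angle).
full = degree_of_polarization(1.0, 0.5, 0.0)
none = degree_of_polarization(1.0, 1.0, 1.0)
```

A DoP map computed this way is the input a polarization-based cloud index would compare against the expected clear-sky (Rayleigh) pattern.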
... Traditional ones consist of threshold-based (fixed or adaptive), time-differentiation, and statistical methods. The threshold-based approaches rely on the visible reflection and infrared temperature of the clouds; therefore, their performance weakens on low-contrast (cloud vs. surface) images [13]–[15]. Time-differentiation methods effectively identify changing pixel values as clouds in multi-temporal images; however, they do not consider changes in the top-of-atmosphere reflectance affected by floods [12], [16]. ...
CubeSats, the nanosatellites with a wet mass up to 10 kg, accompanied by the cost decrease of accessing the space, amplified the rapid development of the Earth Observation industry. Acquired image data serve as an essential source of information in various disciplines like environmental protection, geosciences, or the military. As the quantity of remote sensing data grows, the bandwidth resources for the data transmission (downlink) are exhausted. Therefore, new techniques that reduce the downlink utilization of the satellites must be investigated and developed. For that reason, we are presenting CloudSatNet-1: an FPGA based hardware-accelerated quantized convolutional neural network (CNN) for satellite on-board cloud coverage classification. We aim to explore the effects of the quantization process on the proposed CNN architecture. Additionally, the performance of cloud coverage classification by biomes diversity is investigated, and the hardware architecture design space is explored to identify the optimal FPGA resource utilization. Results of this study showed that the weights and activations quantization adds a minor effect on the model performance. Nevertheless, the memory footprint reduction allows the model deployment on low-cost FPGA Xilinx Zynq-7020. Using the RGB bands only, up to 90% of accuracy was achieved, and when omitting the tiles with snow and ice, the performance increased up to 94.4% of accuracy with a low false-positive rate of 2.23% for the 4-bit width model. With the maximum parallelization settings, the hardware accelerator achieved 15 FPS with 2.5 W of average power consumption (0.2 W increase over the idle state).
... The applications of ground-based cloud monitoring are focused on precipitation and fog detection (Kim et al., 2020), cloud forecast skill verification (Hogan et al., 2009), cloud seeding evaluation (Schaefer et al., 1957;Geresdi et al., 2020), satellite cloud-mask validation (Skakun et al., 2021), and solar irradiance nowcasting (Schmidt et al., 2016). In addition, cloud observations have proven to be useful in investigating their effect on photochemistry (Hall et al., 2018), CO2 fluxes (Still et al., 2009), ecological biomes (Wilson & Jetz, 2016), erythemal dose rate (Silva & de Souza Echer, 2013), light pollution amplification (Jechow et al., 2019), air pollution modelling (Arciszewska & McClatchey, 2001), sky polarization (Kreuter et al., 2009), and space communications (Nugent et al., 2009). ...
One of the largest challenges in Numerical Weather Prediction (NWP) is cloud forecasting. The reason is twofold: first, clouds constantly change size and shape over short periods of time, and second, cloud formation processes occur across wide scales, from sub-micrometer to a few kilometers. Typical NWP models operate at resolutions between 1 and 20 km and run 2 to 8 times per day, and are thus spatially and temporally limited for accurate cloud forecasting. Data extrapolation techniques have emerged to forecast cloud cover using deep learning and satellite imagery. However, satellite resolution remains impractical for hyperlocal cloudiness forecasting, which is relevant for many industries such as solar power, astronomy, and aviation. Over the last years, higher-resolution models (<1 km) have been developed. One example is the ClimaCell Bespoke Atmospheric Model (CBAM), which claims to be the world’s highest-resolution weather forecasting model. CBAM initial conditions are fed with data from power grids, cellular networks, road cameras, connected vehicles, and smartphones. It is well established that the accuracy of NWP models improves significantly with the assimilation of high-resolution meteorological observations. However, most data-assimilation techniques omit cloud observations, so the benefits of highly localized cloud assimilation are still largely unknown. Since clouds directly influence solar radiation, surface temperature, and precipitation, and are useful signs for predicting upcoming weather, I have hypothesized that the next generation of NWP models will require assimilation of high-resolution cloud data. The overall goal of this thesis is to retrieve cloud cover, cloud type, cloud motion vectors, cloud base height, and cloud transmittance from the combined use of a skycam, a pyranometer, and a ceilometer, aiming to ultimately improve hyperlocal short-term weather forecasting. This study comprised summer daytime data from July, August, and September 2020.
The datasets generated by this dissertation are publicly available on
Cloud detection plays a significant role in remote sensing image applications. Existing deep learning-based cloud detection methods rely on massive precise pixel-wise annotations, which are time-consuming and expensive. To alleviate this problem, we propose a weakly supervised cloud detection framework that leverages physical rules to generate weak supervision for cloud detection in remote sensing images. Specifically, a rule-based adaptive pseudo labeling (RAPL) algorithm is devised to adaptively annotate potential cloud pixels based on cloud spectral properties without manual intervention. Unlike existing physical annotations using fixed thresholds, RAPL employs the bidirectional threshold segmentation and adaptive gating mechanism to annotate cloud and boundary masks with more explicit semantic categories and spatial structures separately. Subsequently, these pseudo masks are treated as weak supervision to optimize the heuristic cloud detection network for pixel-wise segmentation. Considering that clouds appear as complex geometric structures and nonuniform spectral reflectance, a deformable boundary refining module is designed to enhance the modeling ability of spatial transformation and activate sharp boundaries from translucent cloud regions. Moreover, a harmonic loss is employed to recognize clouds with nonuniform spectral reflectance and suppress the interference of bright backgrounds. Extensive experiments on the GF-1, L8 Biome, and WDCD datasets demonstrate that the proposed method achieves state-of-the-art results. A public reference implementation of this work in PyTorch is available at
Cloud detection is indispensable in ground-based cloud observation, and it enables automatic cloud cover estimation. Cloud detection is quite challenging because of blurred cloud boundaries and variable shapes. In this paper, we propose a new network, the Channel Attention Cloud Detection Network (CACDN), for ground-based cloud detection. The proposed CACDN is an encoder-decoder architecture, and we design the cloud channel attention (CCA) module to filter information for accurate cloud detection. We conduct experiments on TLCDD, and the experimental results show that our method achieves better results than other methods, thus proving the effectiveness of the proposed CACDN. Keywords: ground-based cloud detection; CNN; encoder-decoder
CubeSats, the nanosatellites and microsatellites with a wet mass up to 60 kg, accompanied by the cost decrease of accessing the space, amplified the rapid development of the Earth Observation industry. Acquired image data serve as an essential source of information in various disciplines like environmental protection, geosciences, or the military. As the quantity of remote sensing data grows, the bandwidth resources for the data transmission (downlink) are exhausted. Therefore, new techniques that reduce the downlink utilization of the satellites must be investigated and developed. For that reason, we are presenting CloudSatNet-1: an FPGA-based hardware-accelerated quantized convolutional neural network (CNN) for satellite on-board cloud coverage classification. We aim to explore the effects of the quantization process on the proposed CNN architecture. Additionally, the performance of cloud coverage classification by biomes diversity is investigated, and the hardware architecture design space is explored to identify the optimal FPGA resource utilization. Results of this study showed that the weights and activations quantization adds a minor effect on the model performance. Nevertheless, the memory footprint reduction allows the model deployment on low-cost FPGA Xilinx Zynq-7020. Using the RGB bands only, up to 90% of accuracy was achieved, and when omitting the tiles with snow and ice, the performance increased up to 94.4% of accuracy with a low false-positive rate of 2.23% for the 4-bit width model. With the maximum parallelization settings, the hardware accelerator achieved 15 FPS with 2.5 W of average power consumption (0.2 W increase over the idle state).
Many methods for ground-based remote sensing cloud detection learn representation features using the encoder-decoder structure. However, they only consider the information from single scale, which leads to incomplete feature extraction. In this article, we propose a novel deep network named dual pyramid network (DPNet) for ground-based remote sensing cloud detection, which possesses an encoder-decoder structure with dual pyramid pooling module (DPPM). Specifically, we process the feature maps of different scales in the encoder through dual pyramid pooling. Then, we fuse the outputs of the dual pyramid pooling in the same pyramid level using the attention fusion. Furthermore, we propose the encoder-decoder constraint (EDC) to relieve information loss in the process of encoding and decoding. It constrains the values and the gradients of probability maps from the encoder and the decoder to be consistent. Since the number of cloud images in the publicly available databases for ground-based remote sensing cloud detection is limited, we release the TJNU Large-scale Cloud Detection Database (TLCDD) that is the largest database in this field. We conduct a series of experiments on TLCDD, and the experimental results verify the effectiveness of the proposed method.
In Lauder, Central Otago, New Zealand, two all-sky imaging systems have been in operation for more than 1 yr, measuring the total, opaque, and thin cloud fraction, as well as indicating whether the sun is obscured by clouds. The data provide a basis for investigating the impact of clouds on the surface radiation field. The all-sky cloud parameters were combined with measurements of global, direct, and diffuse surface solar irradiance over the spectral interval from 0.3 to 3 μm. Here, the results of ongoing analysis of this dataset are described. As a reference for the magnitude of the cloud influence, clear-sky irradiance values are estimated as a simple function of solar zenith angle and the earth-sun distance. The function is derived from a least squares fit to measurements taken when available cloud images show clear-sky situations. Averaged over a longer time period, such as 1 month, cloud fraction and surface irradiance are clearly negatively correlated. Monthly means in the ratio of the measured surface irradiance to the clear-sky value had a correlation coefficient of about -0.9 with means of cloud fraction for the months from July 2000 to June 2001. In the present work, reductions in the surface irradiance and situations in which clouds cause radiation values to exceed the expected clear-sky amount are analyzed. Over 1 yr of observations, 1-min-averaged radiation measurements exceeding the expected clear-sky value by more than 10% were observed with a frequency of 5%. In contrast, a reduction of more than 10% below estimated clear-sky values occurred in 66% of the cases, while clear-sky irradiances (measured irradiance within ±10% of estimated clear-sky value) were observed 29% of the time. Low cloud fractions frequently lead to moderate enhancement, because the sun is often unobscured and the clouds are brighter than the sky that they hide. As cloud fraction increases, the sun is likely to be obscured, causing irradiance values to fall well below clear-sky values. However, in the case of unobscured sun, there is a tendency for the strongest enhancements when cloud fractions are highest. Enhancements, especially at high solar zenith angle, are also often observed in association with thin clouds.
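The clear-sky reference described above, a simple function of solar zenith angle and earth-sun distance fitted to cloud-free measurements, can be sketched as follows. The model form E(sza, d) = a·cos(sza)^b / d² and all sample values are illustrative assumptions, not the paper's parameterization or data:

```python
import numpy as np

# Hypothetical clear-sky samples: solar zenith angle (deg), sun-earth
# distance d (AU), and measured broadband irradiance (W/m^2).
sza_deg = np.array([20.0, 35.0, 50.0, 65.0, 75.0])
d_au    = np.array([1.00, 1.00, 1.01, 0.99, 1.00])
irr     = np.array([1030., 890., 680., 430., 250.])

# Fit E = a * cos(sza)^b / d^2 by linear least squares in log space:
# ln(E * d^2) = ln(a) + b * ln(cos(sza)).
mu = np.cos(np.radians(sza_deg))
y = np.log(irr * d_au**2)
A = np.vstack([np.ones_like(mu), np.log(mu)]).T
(log_a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
a = np.exp(log_a)

def clear_sky(sza_deg, d_au=1.0):
    """Estimated clear-sky irradiance for the fitted coefficients a, b."""
    return a * np.cos(np.radians(sza_deg))**b / d_au**2

# A measurement with measured/clear_sky(...) > 1.1 would then be flagged
# as a >10% cloud enhancement, as counted in the statistics above.
```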
Presented here are the results of a short but intense measurement campaign at Lauder, New Zealand, in which spectral irradiance from instruments operated by the National Institute of Water and Atmospheric Research (NIWA) and Austria/Innsbruck (ATI) was traced to different irradiance standards and compared. The observed spectral differences for global irradiance were relatively small (<5%) and consistent with those expected from observed differences in the radiation standards used by each group. Actinic fluxes measured by both groups were also intercompared and found to agree at the 10% level. The ATI instrument had the additional capability of measuring solar direct beam irradiance and sky radiances. These provided the first series of sky radiance measurements at this pristine Network for the Detection of Atmospheric Composition Change (NDACC) site. The measured polarization of sky radiance was compared with estimates from a radiative transfer model without any aerosols and was found to be up to 25% smaller. Total ozone values derived from Total Ozone Mapping Spectrometer (TOMS) data, Dobson measurements by NIWA, spectral direct sun measurements by ATI, and spectral global irradiance measurements by NIWA generally agreed within 2%-3%.
We present here an overview of sky imaging, and techniques that may be applied to the analysis of full color sky images to infer cloud macrophysical properties. Details of two different types of sky imaging systems developed by the authors are presented, one of which has been developed into a commercially available instrument. Retrievals of fractional sky cover from automated processing methods are compared to human retrievals, both from direct observations and visual analyses of sky images. Although there exists some uncertainty in fractional sky cover retrievals from sky images, this uncertainty is no greater than that attached to human observations for the commercially available sky imager retrievals, with still acceptable uncertainty with the research system used in Girona, Spain. Thus, the application of automatic digital image processing techniques on sky images is a useful method to complement, or even replace traditional human observations of sky conditions for cloud cover and potentially cloud type. Additionally, the possibilities for inferring other cloud parameters such as cloud brokenness and solar obstruction of benefit for scientific research further enhance the usefulness of sky imagers.
A recent World Meteorological Organization report discussed the importance of continued study of the effect of clouds on the solar UV radiation reaching the earth's surface. The report mentions that the use of all-sky imagery offers the potential to understand and quantify cloud effects more accurately. There is an increasing number of studies investigating the enhancement of surface solar irradiance, UV irradiance, and UV actinic flux using automated CCD and sky imagers. This paper describes new algorithms, applicable to a commercially available all-sky imager (TSI-440), for research investigating cloud-enhanced spectral UV irradiance. Specifically, these include three new algorithms relating to (1) cloud amount at different spatial positions from the zenith, (2) cloud amount relative to the solar position, and (3) the visible brightness of clouds surrounding the sun. A possible relationship between UV enhancement and the occurrence of near-sun cloud brightness is reported based on these preliminary data. It is found that a range of wavelength-dependent intensities, from 306 to 400 nm, can occur in one day for UV enhancements. Evidence for a possible decrease in the variation of intensity with longer wavelengths is also presented.
Using 180° field-of-view (full-sky) imaging polarimetry, the patterns of the degree and angle of polarization of the entire summer sky were measured on 25 June 1999 at a location north of the Arctic Circle in Finnish Lapland as a function of the angular solar zenith distance. A detailed description of the full-sky imaging polarimeter used and its calibration is given. A series of the degree and angle of polarization patterns of the full sky is presented in the form of high-resolution circular maps measured in the blue (450 nm) spectral range as a function of the solar zenith distance. Graphs of the spectral dependence of the degree and angle of polarization of skylight at 90° from the Sun along the antisolar meridian are shown. The celestial regions of negative polarization and the consequence of the existence of this anomalous polarization, the neutral points, are visualized. The measured values of the angular zenith distance of the Arago and Babinet neutral points are presented as a function of the zenith distance of the Sun for the red (650 nm), green (550 nm) and blue (450 nm) ranges of the spectrum. The major aim of this work is to give a clear and comprehensive picture, with the help of full-sky imaging polarimetry, of what is going on in the entire polarized skydome. We demonstrate how variable the degree of polarization of skylight and the position of the neutral points can be within 24 h on a sunny, almost cloudless, visually clear day.
Clouds are one of the most important moderators of the earth radiation budget and one of the least understood. The effect that clouds have on the reflection and absorption of solar and terrestrial radiation is strongly influenced by their shape, size, and composition. Physically accurate parameterization of clouds is necessary for any general circulation model (GCM) to yield meaningful results. The work presented here is part of a larger project that is aimed at producing realistic three-dimensional (3D) volume renderings of cloud scenes, thereby providing the important shape information for parameterizing GCMs. The specific goal of the current study is to develop an algorithm that automatically classifies (by cloud type) the clouds observed in the scene. This information will assist the volume rendering program in determining the shape of the cloud. Much work has been done on cloud classification using multispectral satellite images. Most of these references use some kind of texture measure to distinguish the different cloud types and some also use topological features (such as cloud/sky connectivity or total number of clouds). A wide variety of classification methods has been used, including neural networks, various types of clustering, and thresholding. The work presented here utilizes binary decision trees to distinguish the different cloud types based on cloud feature vectors.
All aspects of aerosol physics important in the formation, evolution and removal of particulate material in the atmosphere are presented, and the influence of such particles on climate and weather is outlined. The book opens with a discussion of the physics of aerosols and derives some of the more important relationships in the physics of single aerosol particles. These are then used as a basis for the subsequent examination of interactions between particles and the dynamics of populations of particles, relative to the evolution and maintenance of particle size distributions in the atmosphere and to the production-modification and coagulation-removal cycle. The balance between production and removal is then reviewed, and the regions of the size spectrum where the various formative and removal processes are most effective are identified. The last five chapters are devoted to the influence of atmospheric particles on weather, atmospheric optics and radiative transfer, atmospheric electricity, and atmospheric energetics and climate.