All-sky imaging: a simple, versatile system
for atmospheric research
Axel Kreuter,* Matthias Zangerl, Michael Schwarzmann, and Mario Blumthaler
Division for Biomedical Physics, Department of Physiology and Medical Physics,
Innsbruck Medical University, Müllerstrasse 44, 6020 Innsbruck, Austria
*Corresponding author: axel.kreuter@imed.ac.at
Received 17 November 2008; revised 16 January 2009; accepted 17 January 2009;
posted 21 January 2009 (Doc. ID 104048); published 13 February 2009
A simple and inexpensive fully automated all-sky imaging system based on a commercial digital camera
with a fish-eye lens and a rotating polarizer is presented. The system is characterized and two examples
of applications in atmospheric physics are given: polarization maps and cloud detection. All-sky polar-
ization maps are obtained by acquiring images at different polarizer angles and computing Stokes
vectors. The polarization in the principal plane, a vertical cut through the sky containing the Sun, is
compared to measurements of a well-characterized spectroradiometer with polarized radiance optics
to validate the method. The images are further used for automated cloud detection using a simple color-
ratio algorithm. The resulting cloud cover is validated against synoptic cloud observations. A Sun cover-
age parameter is introduced that shows, in combination with the total cloud cover, useful correlation with
UV irradiance. © 2009 Optical Society of America
OCIS codes: 010.1615, 100.2960, 110.5405, 120.5410.
1. Introduction
Observations of the sky are one of the oldest methods
in planetary sciences such as meteorology and as-
tronomy. In many cases the human observer is still
indispensable, however autonomous digital technol-
ogy has grown in importance, e.g., in large observa-
tion networks. Digital all-sky imaging has been
utilized in the automated investigation of diverse
phenomena such as Auroras [1], urban light pollu-
tion [2], and photosynthetically active radiation un-
der forest canopies [3]. In astronomy, meteorology or
atmospheric physics, all-sky imaging in its most ele-
mentary form is a convenient way to record the gen-
eral atmospheric situation during observations [4].
More specifically, all-sky cameras with polarization
filters have been used for polarization mapping of
the sky hemisphere [5–10]. When calibrated against
a radiometric standard, such a system is a multiwavelength,
multiangle radiometer measuring the radiance
of the whole sky in one exposure [5]. Compared
to sky-scanning grating spectroradiometers, acquisi-
tion speed is a trade-off against well-defined wave-
length bandwidth, dynamic range, and precision of
the detector. The sky's polarization is sensitive to
the aerosol properties in the sky and thus provides
an ideal complementary measurement in combination
with aerosol optical depth measurements.
Systems for cloud observation and detection
purposes have been presented in [10–14]. Cloud type
identification has been attempted but remains a
challenging issue [15]. It seems that a single camera
picture on its own is not quite sufficient for a detailed
automated cloud type analysis since, e.g., cloud
height information is difficult to recover. However,
in combination with satellite images in different
wavelength regions and a ceilometer, an all-sky im-
age could add valuable information.
The combination of all-sky imaging with UV radiation
measurements has been shown in [16–18],
where cloud cover analysis was correlated with
enhanced UV irradiation. Under certain cloud
configurations the global UV radiation field can be
enhanced compared to the clear-sky value. For
radiation measurements in general, all-sky imaging
20 February 2009 / Vol. 48, No. 6 / APPLIED OPTICS 1091
adds an extra dimension to systematic data analysis.
Cloud cover and Sun coverage are key parameters for
estimating actual irradiances from clear-sky model
predictions.
In this study we present a particularly simple and
inexpensive all-sky imaging system for atmospheric
research and demonstrate two applications: polari-
zation maps and cloud detection. We show that rudi-
mentary characterization and relative calibration is
sufficient for these purposes.
2. General System Description
Images of the full-sky hemisphere are recorded with a
commercial compact digital camera (Canon A75) with
a fish-eye objective [field-of-view (FOV) 180°] and a
stepper motor controlled linear polarizer, situated be-
tween the objective and the camera. The system is
mounted in a weatherproof housing with a glass dome
and connected via ethernet cable to a personal compu-
ter for external automated control. A horizontal setup
is assured by a bubble level on the housing. Four
images at polarizer angles differing by 45° are
acquired at a fixed exposure time of 1/125 s and an
aperture of f/6. At one polarizer angle, a second underexposed
image is acquired (a rudimentary high dynamic
range (HDR) method, described, e.g., in [19])
to gain extra information close to the Sun, where pixels
are often saturated. The images are transmitted in
JPEG format and have an intensity resolution of
8 bit for three color channels (RGB) and a spatial
resolution of 1536 × 2048 pixels. The whole process of
acquiring the set of five images, rotating the polarizer,
and transmitting the data takes less than 1 min.
The system was installed on a building rooftop in
the city of Innsbruck, Austria (47.26°N, 11.39°E,
620 m amsl). In routine operation, acquisitions are
taken hourly at solar elevations >5°. The system was
also installed at two additional sites, featuring different
environments and horizons: a flat urban site in
Vienna (48.24°N, 16.33°E, 180 m amsl) and a mid-altitude
alpine site with an obscured horizon in
Kolm-Saigurn (47.07°N, 12.98°E, 1600 m amsl).
A shadow mechanism for obscuring the direct Sun
was removed again since, for most of our purposes
here, the advantages (a simpler system,
fewer moving parts, less obscured sky) outweigh the
disadvantages (the area around the direct Sun is difficult
for image processing; reflections occur in the lens system).
The hemisphere is projected onto the flat CCD
chip by an equiangular projection. Each point in
the sky, characterized by two angles (azimuth angle
ϕ and zenith angle Θ), is mapped onto a circular
area in the x–y plane. For increased computation
speed, the original JPEG images are downscaled
by a factor of 4 by nearest neighbor interpolation
and cut so that the hemisphere of 168° FOV is a circle
in a 325 × 325 pixel square plane. The x and y coordinates
are centered on the zenith pixel (x₀, y₀) and
converted to polar coordinates ϕ and r. The radius r is
proportional to the zenith angle in the sky, Θ = r(Θ_max/r_max),
with Θ_max = FOV/2 and r_max = 325/2, while the
azimuth angle is invariant in the transformation:

    (r, ϕ)_polar → (r · Θ_max/r_max, ϕ) = (Θ, ϕ)_sky.   (1)
The zenith pixel is the geometric center of the
square plane when the camera is perfectly level.
This is checked against marked points (mountain peaks)
near the horizon. The resulting angle that each pixel
subtends is Θ_max/r_max ≈ 0.5°. Considering the scale
of the atmospheric structures in the sky, an adequate
spatial resolution is easily met by a 1 Mpixel
CCD chip.
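The equiangular mapping described above can be sketched as follows (a minimal illustration, not the authors' code; the helper name is ours, and the image size and FOV are the values quoted in the text):

```python
import numpy as np

# Equiangular fish-eye mapping: the zenith angle is proportional to the
# radial pixel distance from the zenith pixel; the azimuth is unchanged.
FOV_DEG = 168.0            # cropped field of view quoted in the text
SIZE = 325                 # downscaled square image, 325 x 325 pixels
THETA_MAX = FOV_DEG / 2.0  # zenith angle at the image edge (84 deg)
R_MAX = SIZE / 2.0         # corresponding radius in pixels (162.5)

def pixel_to_sky(x, y, x0=SIZE / 2.0, y0=SIZE / 2.0):
    """Map pixel coordinates (x, y) to sky angles (Theta, phi) in degrees."""
    dx, dy = x - x0, y - y0
    r = np.hypot(dx, dy)                          # radius from zenith pixel
    theta = r * THETA_MAX / R_MAX                 # Theta = r * (Theta_max / r_max)
    phi = np.degrees(np.arctan2(dy, dx)) % 360.0  # azimuth is invariant
    return theta, phi
```

A pixel on the image edge maps to the maximum zenith angle of 84°, and the per-pixel angular step is THETA_MAX/R_MAX ≈ 0.52°, matching the roughly 0.5° quoted above.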
Since each pixel is illuminated by radiant power
from a solid angle, integrated over the spectral
responsivity, the measured radiometric quantity is
radiance. In fact, an idealized camera attempts to
imitate the human eye's spectral responsivity, so the
quantity would then be luminance.
Each pixel can be considered a set of three inde-
pendent broadband detectors with 8 bit resolution.
The radiance R at each pixel is a nonlinear function
of the stored pixel counts C:

    R = k · f(C),   (2)

where k is a calibration constant for radiance in absolute
units (W m⁻² sr⁻¹), so f(C) is a linearized pixel
intensity or relative radiance. This nonlinear conversion
is generally implemented in imaging systems to
increase contrast. Normally, this function is called
gamma correction and is of the form f(C) = C^γ with
γ ≈ 2.1 [20]. For our camera this relationship was
found to be too inaccurate over the full dynamic
range, and the function f has been established experimentally
by analyzing a series of images of an
illuminated reflection plate with increasing exposure
time τ and fitting a 3rd order polynomial of the form
f(C) = aC − bC² + cC³ as an empirical function (see
Fig. 1). The result of the least-squares fit for the
blue channel is [a b c] = [0.241, 0.001, 0.0000154]. To
determine the constant k, a radiance standard such
as an integrating sphere must be used. It is possible
Fig. 1. Linearization function f(C) to convert pixel counts into
relative radiance for the blue channel. Data points are fitted with
a 3rd order polynomial.
that k is a function of zenith angle Θ, an effect known
as vignetting in photography, particularly with fish-eye
objectives. It is a great advantage of the polarization
measurements and cloud detection presented in this
study that proper absolute radiance calibration is
not required, which is a substantial experimental simplification.
Note that the dark noise is automatically subtracted
by the camera, so our function f has no
offset. As a test, a series of images was acquired at
exposure times of 0.0005–15 s with the camera
completely in the dark. The mean counts of the JPEG
images ranged from 0.003 at 0.0005 s exposure time
to 0.25 at 15 s. It is concluded that for the exposure
times used here (around 0.01 s) the dark noise is
sufficiently subtracted.
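As an illustration of the linearization step of Eq. (2), the cubic fit for the blue channel can be applied as below (a sketch assuming the polynomial form f(C) = aC − bC² + cC³ with the coefficients quoted in the text; the function name is ours):

```python
import numpy as np

# Empirical linearization f(C): convert 8 bit JPEG counts into
# relative radiance using the 3rd order polynomial fit for the
# blue channel (coefficients from the least-squares fit in the text).
A, B, C3 = 0.241, 0.001, 0.0000154

def linearize(counts):
    """Relative radiance f(C) from pixel counts C in 0..255."""
    c = np.asarray(counts, dtype=float)
    return A * c - B * c**2 + C3 * c**3
```

With these coefficients f is strictly increasing over 0..255, so the conversion from counts to relative radiance is unambiguous, and there is no constant offset, consistent with the camera's automatic dark-noise subtraction.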
Note also that, for the following image processing,
each image is rotated so that the line through the zenith
and the Sun is always vertical, with the Sun on
the upper half. This cut through the hemisphere is
called the principal plane (PP).
3. Polarization Maps
Polarization of the sky was first reported by Arago in
the early 19th century [21]. The first qualitative explanation
was given by Lord Rayleigh using his
molecular scattering theory [22]. Indeed, the very
simplified consideration of single molecular scattering
and geometric scattering angles reproduces the
polarization pattern for longer wavelengths in the
visible spectral range quite well. In the real atmosphere,
light is scattered multiple times and may
be backreflected by the ground, resulting in a lower-than-unity
maximum polarization and points of zero
polarization, so-called neutral points. These neutral
points were observed quite early by Arago, Brewster,
and Babinet and have more recently been imaged
in [23].
The generalized polarization state of light is conveniently
described by the Stokes vector formalism
[21]. Decomposing the electric field vector into its
two orthogonal complex field amplitudes E_r and E_l,
the 4-component Stokes vector is defined as

    I = E_l E_l* + E_r E_r*
    Q = E_l E_l* − E_r E_r*
    U = E_l E_r* + E_r E_l*
    V = i(E_l E_r* − E_r E_l*),   (3)
where E* denotes the complex conjugate amplitude.
All the components are real, physical quantities,
namely, irradiances E E* = |E|² = I_α measured at
different polarizer angles α:
    I = I_0 + I_90
    Q = I_0 − I_90
    U = I_45 − I_135
    V = I_+ − I_−,   (4)
where I_+ and I_− denote the circularly polarized irradiances.
Equation (4) can be simplified for the specific
case here by noting that the irradiance is proportional
to the radiance for each pixel and neglecting
the circular polarization of the sky:
    I = R_0 + R_90
    Q = R_0 − R_90
    U = R_45 − R_135.   (5)
R_α denotes the measured radiance at relative polarizer
angle α. The resulting degree of (linear) polarization
Π and its angle χ are given by [21]

    Π = √(Q² + U²) / I,   χ = 0.5 · arctan(U/Q).   (6)
Geometrically, we can visualize Q/I and U/I as the
two orthogonal components in a unit circle (horizontal
cut through the Poincaré sphere), whose vector
sum is the degree of polarization. It is clear that Π
is invariant under rotation of the coordinate system,
so the zero offset of the analyzer is irrelevant. Note
also that, in the expressions for Π and χ, any calibration
constant cancels, so it suffices to compute
sums of relative radiances f(C). The polarization of
the sky's hemisphere is computed by combining four
images at relative polarizer angles of 0°, 45°, 90°, and
135° [Fig. 2(a)]. The images are linearized by the conversion
function f to obtain four relative radiance
maps. Applying Eqs. (5) and (6) at each pixel yields
the polarization map [Fig. 2(b)]. The maps are
smoothed by a 20 × 20 pixel 2D median filter for
spatial noise reduction without obscuring real atmospheric
structures. In this study we restrict ourselves
to the blue channel of the camera, since it has the
strongest signal and can best be compared to the
UV spectroradiometer. The center wavelength is
about 450 nm with a full width at half-maximum
(FWHM) of 50 nm.
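The per-pixel evaluation of Eqs. (5) and (6) can be sketched as follows (array names and the function signature are ours; the inputs are the four linearized relative-radiance maps):

```python
import numpy as np

# Combine four relative-radiance maps at polarizer angles 0, 45, 90,
# and 135 degrees into the degree of linear polarization Pi and the
# polarization angle chi, per pixel [Eqs. (5) and (6)].
def polarization_map(R0, R45, R90, R135):
    I = 0.5 * (R0 + R45 + R90 + R135)  # averaged first Stokes parameter
    Q = R0 - R90
    U = R45 - R135
    Pi = np.sqrt(Q**2 + U**2) / I      # degree of linear polarization
    chi = 0.5 * np.arctan2(U, Q)       # polarization angle in radians
    return Pi, chi
```

Since only ratios of radiances enter Π and χ, any calibration constant k cancels, which is why relative radiances f(C) suffice here.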
Finally, the polarizing property of the composite
optical system (dome/objective/polarizer/CCD-chip)
is tested. Errors introduced by optical components
can be described by the Mueller matrix, operating
Fig. 2. (a) Blue pixel counts C from JPEG images at four polarizer
angles. Counts are converted to relative radiance and inserted into
Eqs. (5) and (6) to yield the polarization map. (b) Corresponding
polarization map of the cloud-free sky for a wavelength of 450 ±
50 nm (blue channel). The degree of polarization is coded in gray
shades; undetermined areas are rendered white. (26 February
2008, 10:30 UTC).
on the input Stokes vector [21]. The Mueller matrix
describes a nonunitary polarization transformation,
i.e., it can rotate and change the length of the Stokes
vector. Here we confirm that the length of the Stokes
vector is conserved. A large, completely polarized
area (a polarizer in front of a white reflectance plate)
is used as the test field and analyzed from different
angles of incidence (0°, 45°, and 80°) and polarizer
angle offsets (0°, 15°, and 30°) to confirm the invariance
of Π under these angles. The measured polarization
is averaged over the test area and shown
along with the standard deviation as an error bar
in Fig. 3. No dependence is found for these angles of
incidence and polarizer offset angles. We therefore
assume the Mueller matrix to be the identity for all
angles of incidence.
However, small bright sources like the Sun may
cause internal reflections at certain angles in the objective,
which are superimposed on the sky's image.
These reflections may locally introduce large errors
in the polarization, but they are clearly visible in the raw
images and can be identified as artifacts.
Since the Stokes vector is independent of a polarizer
offset angle, it is clear that the first Stokes parameter
I is overdetermined: I = R_0 + R_90 = R_45 + R_135,
i.e., three polarizer angles would suffice for a
full Stokes-vector recovery. This is expected because
the left-hand side of Eq. (5) contains only three unknowns.
We use the additional information for
better statistics, averaging the first Stokes parameter
to I = (1/2) Σ_α R_α over the four polarizer angles.
Furthermore, the difference
d = (R_0 + R_90) − (R_45 + R_135) should be zero and is
used as a quality check of the images.
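The redundancy check on the four-angle image set can be sketched as below (the tolerance value is our assumption, not from the text):

```python
import numpy as np

# Quality check for a four-angle image set: the residual
# d = (R0 + R90) - (R45 + R135) should be close to zero.
def stokes_quality(R0, R45, R90, R135, tol=0.05):
    I = 0.5 * (R0 + R45 + R90 + R135)  # averaged first Stokes parameter
    d = (R0 + R90) - (R45 + R135)      # should vanish for consistent data
    ok = np.all(np.abs(d) <= tol * np.maximum(I, 1e-12))
    return I, d, bool(ok)
```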
To estimate the statistical error in our polarization
analysis, we measured consecutive maps of the sky
every 2 min under stable atmospheric conditions.
From the variation of these series, we estimate a 1σ
standard deviation of 3% for Π.
For validation of this method with respect to systematic
errors, we compare the polarization in the
principal plane to the values measured with a well-characterized
spectroradiometer with polarized
radiance optics with a small FOV of 1.5° [24]. The
wavelength of the spectroradiometer has been set
to 495 nm. As displayed in Fig. 4, the general shape
and value of the polarization agree well within 3%.
Around the Sun, at a polar angle of 60°, the camera's
pixels are close to overexposure, resulting in a large
deviation of the polarization from the spectroradiometer
measurement. Also, around the maximum polarization,
at 30° zenith angle (90° behind the Sun), a
polarized reflection within the fish-eye objective
causes a distortion of the polarization curve. To confirm
the effect of the reflections from the unobscured
Sun, a direct comparison between the polarization in
the principal plane with and without a mounted shadow
band is shown in Fig. 5. Omitting the shadow
band, the measured polarization is perturbed only at
the locations of the reflections (around 45°) and
around the Sun (62°). Bearing these limitations in mind,
all-sky maps of the polarization can be investigated.
Two interesting examples of polarization maps are
given. Figure 6(a) shows clear-sky polarization after
sunset, when two neutral points are distinctly visible.
The minima around +70° and −70° zenith angle in
the principal plane are called the Babinet and Arago
neutral points, respectively. Their positions depend
on aerosol parameters in the atmosphere
and ground reflectivity [21] and will be investigated
more closely in the future.
In contrast, Fig. 6(b) shows the polarization map of
a partly covered sky around noon (solar elevation is
41°). High, thin cirrus clouds reduce the maximum
polarization below 50%, while lower, optically thick
cumulus clouds are barely polarized. It has also been
noted that scattered cumulus clouds reduce the
polarization even in the clear sky in between, due
to reflected light, corresponding to a higher ground
albedo. So although clouds have a dramatic effect
on the sky's polarization, cloud detection is based
on a different method, described in Section 4.
Fig. 3. Analysis of a fully polarized test field (white reflectance
plate) at different angles of incidence and polarizer angle offsets of
0°, 15°, and 30°. Within the experimental uncertainty, no influence
of these parameters on the measured polarization can be found.
Fig. 4. Comparison of the all-sky camera blue polarization and
spectroradiometer radiance measurements in the principal plane
on a clear-sky day, 26 February 2008, 10:30 UTC. Solar zenith
angle is 58°.
4. Cloud Detection
Color is the primary property that allows visual distinction
of clouds in the sky. Because of the different
wavelength dependence of scattering, the color of
the clear-sky region is blue rather than the whitish
or gray color of clouds. So in a digital color image,
clear-sky pixels have a higher ratio of blue/red radiance
than cloud pixels. The cloud detection method
is based on setting a threshold on this ratio [11–14].
Considering a cloudless, aerosol-free sky, the blue/
red ratio is a function of both zenith and azimuth angles
in the sky and the solar elevation. Toward the horizon,
the sky appears paler than at the zenith.
In addition, scattering by aerosols, which in general
is less wavelength dependent and much stronger in
the forward direction than molecular scattering
[25], also diminishes the blueness of the sky, most prominently
in the region around the Sun. However, aerosol
content and scattering properties may vary so
much in time that this temporal variation masks
the spatial variation of the pristine cloudless sky color.
Hence, without aerosol information, a constant blue/
red ratio threshold is taken for the entire hemisphere
and all atmospheric conditions.
Using a diverse set of images with typical cloud
situations (low and high clouds, illuminated and
dark clouds, different solar zenith angles), a suitable
threshold of 1.3 on this ratio for marking cloud areas
was found that best discriminates cloud and clear
sky. The threshold was confirmed by investigating
the number of cloud-marked pixels as a function of
the threshold. The number first increases before reaching
a plateau at the optimal value of the threshold,
after which it increases again. It should be noted that
the threshold is unique to each camera system since
it depends on the color response of the CCD chip as
well as any gamma correction. Also, the location has
an influence on the threshold as altitude and typical
aerosol background will result in a different clear-
sky color.
For cloud-marked pixels, the underexposed image
is used as a second criterion, applying another
threshold for the blue/red ratio. This step is neces-
sary for the region near the unobscured Sun, where
pixels close to saturation would always be cloud-
marked. The total cloud cover (TCC) is then com-
puted as the ratio of cloud-marked pixels to total
pixel number in the hemisphere, in which the hori-
zon (up to 20° for the Innsbruck site) is masked
out. It is noted that the fish-eye projection is not
area-conserving: the projected solid angle per
pixel is strictly a function of zenith angle (see, e.g.,
[11] for a detailed mathematical formulation). So
the simple ratio is an approximation, with the relative
error growing with increasing zenith angle.
However, the absolute error in the resulting TCC
is below 0.01 for most situations and can safely be
ignored, considering the accuracy of the TCC value,
which is normally rounded to one decimal place or
given in octas.
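The color-ratio test and the resulting total cloud cover can be sketched as follows (a simplified illustration: the horizon mask is passed in as a boolean array, the underexposed-image refinement near the Sun is omitted, and the names are ours):

```python
import numpy as np

# Mark pixels as cloud where the blue/red radiance ratio falls below
# the clear-sky threshold, then compute the total cloud cover (TCC)
# as the cloud-marked fraction of the unmasked hemisphere.
RATIO_THRESHOLD = 1.3  # camera-specific value quoted in the text

def total_cloud_cover(blue, red, sky_mask, threshold=RATIO_THRESHOLD):
    """TCC from blue and red relative-radiance maps and a horizon mask."""
    ratio = blue / np.maximum(red, 1e-6)     # guard against division by zero
    cloudy = (ratio < threshold) & sky_mask  # cloud-marked hemisphere pixels
    return cloudy.sum() / sky_mask.sum()
```

As noted above, ignoring the zenith-angle dependence of the projected solid angle per pixel makes this simple pixel ratio an approximation, but the resulting absolute error in TCC is below 0.01 in most situations.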
Furthermore, the area around the Sun is investigated
in more detail. When the Sun is unobscured,
diffraction around the blades of the camera's aperture
produces a star flare pattern around the Sun
with a sixfold symmetry. The number of flares allows
a quantitative definition of a Sun coverage parameter
(SCP) as a measure of how much the Sun is
obscured by clouds. Three discrete cases are
distinguished: when no pixels in the underexposed
image are saturated, the Sun is completely obscured,
and the SCP is assigned unity. When pixels
are saturated but the number of detected flares is
less than five, the Sun is assumed to be partially covered,
with an associated SCP of 0.5. When five or
more Sun flares are detected, the Sun is considered
unobscured and has an SCP of 0.
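The three-level decision logic for the SCP can be sketched as below (the two inputs would come from the underexposed image; the function and argument names are ours):

```python
# Sun coverage parameter (SCP): 1 = obscured, 0.5 = partially
# covered, 0 = unobscured, from saturation and detected Sun flares.
def sun_coverage(any_saturated_pixels, n_flares):
    if not any_saturated_pixels:
        return 1.0   # no saturated pixels: Sun completely obscured
    if n_flares >= 5:
        return 0.0   # five or more flares: Sun unobscured
    return 0.5       # saturated but fewer than five flares: partial
```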
Here we use the JPEG image with the polarizer set
parallel to the principal plane, which approximates
an unpolarized relative radiance image. In principle,
the degree of polarization could also be applied for
cloud discrimination, but it was found to be less
selective than the color-ratio threshold and requires
more image processing.
Two representative examples of the performance
of the cloud detection method are shown in Fig. 7.
Cumulus-type clouds [Figs. 7(a) and 7(b)] have a par-
ticularly sharp boundary and good contrast in the
Fig. 6. (a) Polarization map at dusk; solar elevation is −6° and the
Sun has set behind the mountains. Note the neutral points at
about ±70° zenith angle (9 September 2008, 17:00 UTC). (b) Polarization
map around noon with high and low clouds (24 September
2008, 12:00 UTC).
Fig. 5. Comparison of the polarization in the principal plane,
with and without obscuring the Sun using a shadow arm. The measured
polarization is only perturbed at the locations of the reflections
(around 45°) and around the Sun (solar zenith angle = 62°).
24 October 2008, 09:46 and 09:48 UTC.
blue/red ratio against the clear sky. In this case, the
detection works very well, even close to the unobs-
cured Sun. More problematic are thin clouds in front
of the Sun [Figs. 7(c) and 7(d)], where blue/red ratios
are very similar. Nevertheless, total cloud cover re-
sults are not severely affected. Note that the contrail
is nicely detected, as well as cumulus-type clouds
near the horizon.
For a quantitative statistical validation, more than
1 yr of hourly data in the period from 2 August 2007
to 27 October 2008 has been compared to the synoptic
(SYNOP) observations at Innsbruck airport, located
3 km to the west of the camera site; see
Fig. 8. 73% of a total of 3903 analyzed camera images
agree to within 1 octa with the SYNOP observations.
The distribution of the differences shows a slight
asymmetry, i.e., our cloud cover results tend to underestimate
those of the SYNOP observers. However,
cloud observations always bear a certain interpretational
variance, and some differences will even exist
between human observers. Specifically, the transition
from haze to cloud is continuous. For example, on
some hazy days (aerosol optical depth at 500 nm of
>0.4) our analysis results in zero cloud cover,
whereas observers often interpret such a situation
as totally overcast. Furthermore, clouds in front of
mountains are counted only by the SYNOP observers,
and add to the bias.
Finally, the significance of the SCP is validated by
correlating it with the erythemally weighted global
UV irradiance (UV index or UVI) and the TCC
[Figs. 9(a)–9(c)]. The method for measuring the UVI
and determining the clear-sky model value is described
in detail in [26]. For the cases when the Sun
was labeled obscured (SCP of 1), the sky is mostly
found totally covered, and the measured UVI is much
lower than the predicted clear-sky value, with the ratio
peaking around 0.3. For cases of a partially covered
Sun, the whole range of TCC values is found
and, as expected, the ratio of measured-to-predicted
clear-sky UVI decreases with increasing TCC. For an
SCP of 0, a predominantly cloud-free sky occurs with
small TCC, and the measured UVI is close to the predicted
clear-sky value. These observations compare
well with those in [16]. The classification into three
Sun coverage scenarios represents a refinement of
Fig. 7. (a) Image of a sky with convective cloud type (cumulus) of
low and medium height. This cloud type has a sharp contrast and
is relatively easy to discriminate from the clear sky (19 September
2008, 12:00 UTC). (b) Processed image after cloud detection; clouds
are rendered white, while clear sky is gray and the mask is black. The
underexposed image (not shown here) allows good discrimination
close to the Sun. Total cloud cover here is 0.63. (c) Image of a sky
with both low and high clouds, including altostratus and a contrail
(24 September 2008, 12:00 UTC). (d) Total cloud cover is 0.29. Note
the problematic area near the Sun when thin clouds are present.
Fig. 8. Comparison of total cloud cover (TCC) obtained from the
camera images and SYNOP cloud observations (TCC_camera −
TCC_SYNOP). 73% of a total of 3903 analyzed camera images agree
within 1 octa with the SYNOP observations.
Fig. 9. (a) For an SCP of 1, the Sun is totally occluded, which coincides with a TCC near 1. The ratio of measured UVI and clear-sky
prediction peaks around 0.3. In the range 0.98 < TCC < 1, more than 620 data points are accumulated. (b) An SCP of 0.5 implies a partially
covered Sun. In this case increasing TCC is correlated with a decreasing ratio of measured UVI to clear-sky prediction. (c) For an SCP of 0, the Sun
is assumed totally unobscured, which correlates with small TCC and the measured UVI converging toward the clear-sky prediction.
previous work, where only two cases were distinguished
(Sun obscured or not obscured). It is further
noted that the extreme cases of SCP 0 and 1 represent
well-defined classes of scenarios, while an SCP of
0.5 spans a larger scope, which suggests a possible
future improvement by refined partitioning of
this case.
5. Conclusion and Outlook
A simple all-sky imaging system with a polarizing
filter has been described and characterized with
respect to measuring the sky's degree of polarization
and detecting total cloud cover. Both types of analysis
have been validated against independent measurement
methods and found to agree well within their respective
uncertainties. A Sun coverage parameter has
been obtained from image processing around the
Sun. It has been shown that the combination of total
cloud cover and Sun coverage parameter forms a solid
basis for UV radiation estimation under cloudy
conditions. A cloud type analysis could be attempted
using pattern-recognition techniques in combination
with satellite images. Polarization maps of the sky
contain valuable information on the aerosol content of
the atmosphere. The correlation of aerosol optical
depth and degree of polarization is one of the paramount
topics in our ongoing work. Furthermore,
in contemporary climate research, the interaction
of aerosols and clouds is a key uncertainty in
the Earth's energy budget. As a complementary device
together with an aerosol optical depth measurement
such as a sunphotometer, all-sky imaging could be
a powerful tool for the investigation of aerosol–cloud
interaction.
This work was supported by the Austrian Science
Fund (FWF) under Project P18780. We gratefully
acknowledge Lanzinger at Austrocontrol, Innsbruck
Airport, for supplying the SYNOP cloud observa-
tions. The camera system was developed in coopera-
tion with Schreder (CMS-Ing.Dr.Schreder GmbH).
References
1. S. B. Mende, S. E. Harris, H. U. Frey, V. Angelopoulos, C. T. Russell, E. Donovan, B. Jackel, M. Greffen, and L. M. Peticolas, "The THEMIS array of ground-based observatories for the study of auroral substorms," Space Sci. Rev., doi: 10.1007/s11214-008-9380-x (2007).
2. G. Zotti, "Measuring light pollution with a calibrated high dynamic range all-sky image acquisition system," presented at DARKSKY 2007, the 7th European Symposium for the Protection of the Night Sky, Bled, Slovenia (2007).
3. R. L. Chazdon and C. B. Field, "Photographic estimation of photosynthetically active radiation: evaluation of a computerized technique," Oecologia 73, 525–532 (1987).
4. T. E. Pickering, "The MMT all-sky camera," Proc. SPIE 6267, 62671A (2006).
5. Y. Liu and K. Voss, "Polarized radiance distribution measurements of skylight. I. System description and characterization," Appl. Opt. 36, 6083–6094 (1997).
6. N. J. Pust and J. A. Shaw, "Dual-field imaging polarimeter using liquid crystal variable retarders," Appl. Opt. 45, 5470–5478 (2006).
7. J. Gál, G. Horváth, V. B. Meyer-Rochow, and R. Wehner, "Polarization patterns of the summer sky and its neutral points measured by full-sky imaging polarimetry in Finnish Lapland north of the Arctic Circle," Proc. R. Soc. A 457, 1385–1399 (2001).
8. J. A. North and M. J. Duggin, "Stokes vector imaging of the polarized sky-dome," Appl. Opt. 36, 723–730 (1997).
9. M. V. Berry, M. R. Dennis, and R. L. Lee, Jr., "Polarization singularities in the clear sky," New J. Phys. 6, 162 (2004).
10. G. Horváth, A. Barta, J. Gál, B. Suhai, and O. Haiman, "Ground-based full-sky imaging polarimetry of rapidly changing skies and its use for polarimetric cloud detection," Appl. Opt. 41, 543–559 (2002).
11. C. N. Long, J. M. Sabburg, J. Calbó, and D. Pagès, "Retrieving cloud characteristics from ground-based daytime color all-sky images," J. Atmos. Ocean. Technol. 23, 633–652 (2006).
12. N. H. Schade, A. Macke, H. Sandmann, and C. Stick, "Total and partial cloud detection during summer months 2005 at Westerland (Sylt, Germany)," Atmos. Chem. Phys. Discuss. 8, 13479–13505 (2008).
13. U. Feister, J. Shields, M. Karr, R. Johnson, K. Dehne, and M. Woldt, "Ground-based cloud images and sky radiances in the visible and near infrared region from whole sky imager measurements," in Proceedings of Climate Monitoring Satellite Application Facility Training Workshop (Dresden, 2000).
14. U. Feister and J. Shields, "Cloud and radiance measurements with the VIS/NIR Daylight Whole Sky Imager at Lindenberg (Germany)," Meteor. Zeitschr. 14, 627–639 (2005).
15. K. A. Buch and C. H. Sun, "Cloud classification using whole-sky imager data," presented at the Ninth Symposium on Meteorological Observations and Instrumentation, paper 7.5, Charlotte, North Carolina, 1995.
16. G. Pfister, R. L. McKenzie, J. B. Liley, A. Thomas, B. W. Forgan, and C. N. Long, "Cloud coverage based on all-sky imaging and its impact on surface solar irradiance," J. Appl. Meteorol. 42, 1421–1434 (2003).
17. N. H. Schade, A. Macke, H. Sandmann, and C. Stick, "Enhanced solar global irradiance during cloudy sky conditions," Meteor. Zeitschr. 16, 295–303 (2007).
18. J. M. Sabburg and C. N. Long, "Improved sky imaging for studies of enhanced UV irradiance," Atmos. Chem. Phys. 4, 2543–2552 (2004).
19. J. Stumpfel, C. Tchou, A. Jones, T. Hawkins, A. Wenger, and P. Debevec, in Proceedings of the Third International Conference on Computer Graphics, Virtual Reality, Visualisation and Interaction in Africa (Association for Computing Machinery, 2004), pp. 145–149.
20. C. A. Poynton, Digital Video and HDTV: Algorithms and Interfaces (Morgan Kaufmann, 2003).
21. K. L. Coulson, Polarization and Intensity of Light in the Atmosphere (Deepak, 1988).
22. J. W. Strutt, "On the light from the sky, its polarisation and color," Philos. Mag. 41, 107–120, 274–279 (1871).
23. G. Horváth, J. Gál, I. Pomozi, and R. Wehner, "Polarization portrait of the Arago point: video-polarimetric imaging of the neutral points of skylight polarization," Naturwissenschaften 85, 333–339 (1998).
24. M. Blumthaler, B. Schallhart, M. Schwarzmann, R. McKenzie,
P. Johnston, M. Kotkamp, and H. Shiona, Spectral UV mea-
surements of global irradiance, solar radiance, and actinic flux
in New Zealand: intercomparison between instruments and
model calculations,J. Atmos. Ocean. Technol. 25, 945958
(2008).
25. S. Twomey, Atmospheric Aerosols (Elsevier, 1977).
26. B. Schallhart, M. Blumthaler, J. Schreder, and J. Verdebout, A
method to generate near real time UV-index maps of Austria,
Atmos. Chem. Phys. 8, 74837491 (2008).
20 February 2009 / Vol. 48, No. 6 / APPLIED OPTICS 1097