Review
Hyperspectral Imaging in Environmental Monitoring:
A Review of Recent Developments and Technological
Advances in Compact Field Deployable Systems
Mary B. Stuart 1, Andrew J. S. McGonigle 2,3,4 and Jon R. Willmott 1,*
1 Department of Electronic and Electrical Engineering, University of Sheffield, Sheffield S1 4DE, UK
2 Department of Geography, University of Sheffield, Sheffield S10 2TN, UK
3 School of Geosciences, The University of Sydney, Sydney, NSW 2006, Australia
4 Faculty of Health, Engineering and Sciences, University of Southern Queensland, Toowoomba, QLD 4350, Australia
* Correspondence: j.r.willmott@sheffield.ac.uk
Received: 17 May 2019; Accepted: 9 July 2019; Published: 11 July 2019


Abstract:
The development and uptake of field deployable hyperspectral imaging systems within
environmental monitoring represents an exciting and innovative development that could revolutionize
a number of sensing applications in the coming decades. In this article we focus on the successful
miniaturization and improved portability of hyperspectral sensors, covering their application both
from aerial and ground-based platforms in a number of environmental application areas, highlighting
in particular the recent implementation of low-cost consumer technology in this context. At present,
these devices largely complement existing monitoring approaches, however, as technology continues
to improve, these units are moving towards reaching a standard suitable for stand-alone monitoring in
the not too distant future. As these low-cost and light-weight devices are already producing scientific
grade results, they now have the potential to significantly improve accessibility to hyperspectral
monitoring technology, as well as vastly proliferating acquisition of such datasets.
Keywords: hyperspectral; environmental monitoring; miniaturization; low-cost; field deployable
1. Introduction
Over the past three decades, hyperspectral imaging has emerged as an effective tool for a variety of applications ranging from remote sensing of the Earth’s surface [1–3], to art conservation and archaeology [4–6]. Whilst spectral imaging with multispectral sensors has been achieved since the late 1960s [7], recent advances in the spectral and spatial resolution of sensors have opened the door to more detailed scene analysis with hyperspectral imaging [2,8].
Hyperspectral images are characterized by both their spatial and spectral resolution [9,10], e.g., with two spatial dimensions (Sx and Sy) and one spectral dimension (Sλ). The spatial resolution measures the geometric relationship between the image pixels, while the spectral resolution determines the variations in illumination within the image pixels as a function of wavelength [3]. These data are represented in the form of a 3-Dimensional hyperspectral data cube [2,3], where each “slice” of this data cube along Sλ represents a specific band from the electromagnetic spectrum [3].
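To make the data cube structure concrete, the following minimal sketch (not from the paper; the array sizes and wavelengths are assumed for illustration) shows how a cube indexed by (Sx, Sy, Sλ) yields either a single-band image of the scene or a near-continuous spectrum for one pixel:

```python
# Minimal sketch (assumed dimensions): a hyperspectral data cube as a NumPy array,
# with two spatial dimensions (Sx, Sy) and one spectral dimension (S_lambda).
import numpy as np

# Hypothetical cube: 128 x 128 pixels, 100 spectral bands spanning 400-1000 nm.
n_x, n_y, n_bands = 128, 128, 100
wavelengths = np.linspace(400.0, 1000.0, n_bands)      # band centres in nm
cube = np.random.rand(n_x, n_y, n_bands)                # placeholder radiance values

# A "slice" of the cube along S_lambda is a single-band image of the whole scene.
band_index = np.argmin(np.abs(wavelengths - 550.0))     # band nearest 550 nm
band_image = cube[:, :, band_index]                     # shape (128, 128)

# Indexing one (x, y) location instead yields the near-continuous spectrum of that pixel.
pixel_spectrum = cube[64, 64, :]                        # shape (100,)
print(band_image.shape, pixel_spectrum.shape)
```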
Initially developed for remote sensing applications [4,11], hyperspectral imaging sensors typically acquire images across hundreds of narrow spectral bands within the visible, Near Infrared (NIR), and Mid Infrared (MIR) segments of the electromagnetic spectrum [3,11]. This enables the construction of an almost continuous reflectance spectrum for each pixel in a scene which, in turn, allows for the in-depth spectral examination of scene features that would be rather less perceptible with the coarser
bandwidths of multispectral scanners [1,7,8,12]. This recent development in sensor technologies has led to the uptake of hyperspectral imaging methods across a wide variety of disciplines, opening new possibilities for measuring and monitoring multiple aspects of our environment [8].
In recent years, there has been a considerable uptake of field deployable hyperspectral imaging within the discipline of environmental monitoring [13,14]. This is an exciting, and potentially revolutionary, development that could result in substantial future alterations to existing monitoring methods and sensing modalities, involving capture of higher quality data [15]. It is, therefore,
highly timely and important to capture the current state-of-play in this field at this juncture, which is the
motivation behind the development of this article. Here, we provide a review of current hyperspectral
technologies and their integration into the environmental monitoring field, with a particular focus on the
successful miniaturization and improved portability of these sensors, as well as highlighting the recent
move towards the implementation of low-cost consumer market technology. Recent developments are
discussed, focusing on key examples across a variety of environmental disciplines, emphasizing the
significant enhancements these developments have made to data acquisition for both ground-based
and aerial deployments.
The focus of this article is to review the current progress in low-cost, field deployable hyperspectral devices for use within environmental monitoring applications. Our research methodology, therefore, focused on the following search terms: “low-cost”, “miniaturization”/“miniaturized”, “hyperspectral”, and “environmental monitoring”. These terms were used to search three online scientific citation indexing services (Web of Science, Scopus, and Google Scholar) in order to obtain the articles that make up this review. These databases were used to elucidate the key researchers that have published in the aforementioned categories of hyperspectral imaging. This allowed us to identify the current leading edge in the field by observing where the research leaders were most active. The second strand to our method was to build an understanding of the hyperspectral modalities that we believe to be comprehensive. These modalities are described below, and we have conjoined our understanding of them with our review of the state-of-the-art. As our focus is field deployable devices, articles pertaining to satellite-based applications, such as CubeSat, have largely been excluded from this work. Whilst these satellite-based applications often represent low-cost, miniaturized devices, they are not inherently field deployable and, therefore, do not fit the narrative of this article. In this review, we have provided a comprehensive repository for information on the different design approaches to hyperspectral imaging for field-deployable systems; by whom the leading research is being conducted; the nature of the research; and our interpretation of how the research fits within the overall research field.
2. Sensor Types
There are a number of different approaches to hyperspectral imaging and, as such, a variety of sensor types are available (Figure 1) [10]. Typically, sensors are characterized by the arrangement and/or the number of spectral bands involved in the instrumental architecture [10,16], as well as the applied image capture method. Push broom sensors have been traditionally used for large airborne imaging applications and have recently been successfully miniaturized for use within UAV (unmanned aerial vehicle) systems [10,17,18]. This push broom measurement approach is favored due to its high spatial and spectral resolution [19], however, this image acquisition method, whereby a line of spectral information per exposure is recorded [10,20], can cause difficulties in post-processing [10]. Similarly, whiskbroom sensors, which image a single pixel or spatial location at a time [21,22], using a rotating mirror to sweep a scan line perpendicular to the direction of the sensor platform’s movement [21–23], are affected by the same issues [21]. Furthermore, whiskbroom sensors provide inherently slower frame rates than Push Broom units, resulting in lengthier data acquisition periods where all other things are equal [21,24]. Another disadvantage is that the rotation of the optics can result in spatial distortions in the image outputs [25]. However, recent work reported by Uto et al. [22] has demonstrated the pioneering of low-cost whiskbroom image collection suitable for UAV deployment.
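The line-scanning principle behind these devices can be illustrated with a short sketch. The code below is illustrative only (the detector dimensions and readout function are assumptions, not taken from any of the cited systems); it shows how a push broom sensor assembles its cube one cross-track line of spectra per exposure, with platform motion supplying the second spatial dimension:

```python
# Illustrative sketch (assumed detector size and readout) of push broom acquisition:
# one cross-track line of spectra per exposure, stacked along the flight direction.
import numpy as np

n_cross_track, n_bands = 640, 200          # hypothetical detector: 640 pixels x 200 bands
n_exposures = 500                          # along-track lines recorded during the flight

def read_line_exposure(i: int) -> np.ndarray:
    """Placeholder for one detector readout: a (cross-track x band) array."""
    return np.random.rand(n_cross_track, n_bands)

# Stack successive line exposures into the (along-track, cross-track, band) cube.
cube = np.empty((n_exposures, n_cross_track, n_bands))
for i in range(n_exposures):
    cube[i] = read_line_exposure(i)

# Jitter in platform position/attitude between exposures shifts whole lines at once,
# which is why misalignments are confined to between lines rather than between pixels,
# and why line-by-line georectification is needed in post-processing.
print(cube.shape)   # (500, 640, 200)
```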
Figure 1. Image capturing techniques for each sensor type. Note the different methods of image formation; from the pixel-based capture of Push Broom and Whiskbroom scanners, to the 2-Dimensional comprehensive image capture of Framing and Windowing instruments. This highlights the potential issues related to image distortion resulting from the rotation of the optics in the pixel-based instruments, as mentioned above.
Alternatively, framing instruments (Figure 1) can capture scenes through 2-Dimensional images with additional optics that focus on either an individual wavelength or wavelength bands using tunable filters, such as framing band pass filters translated across the spectrum [25]. The design of such sensors is significantly simpler than those of push broom and whiskbroom sensors [21,26], however, the use of spectral filtering substantially reduces the intensity of light captured at the sensor, limiting signal to noise [25]. Windowing instruments also employ a 2-Dimensional Field of View (FOV) that moves across a scene in a continuous fashion [27]. However, instruments that utilize this image capture approach acquire a distinct exposure each time the FOV moves forward, with no integration between exposures [27].
The literature highlights that although there can be significant variation caused by slit width, lens focal length, and integration time [10], Push Broom sensors, at present, offer a better combination of spatial and spectral resolution. Push Broom sensors are typically more stable than Whiskbroom sensors due to the line-by-line image acquisition process, therefore, confining potential data misalignments to between lines rather than between individual pixels [19]. Furthermore, they often have a significantly greater spectral resolution, for example Jaud et al. [26] report a spectral resolution of 1.85 nm for their Push Broom device. Framing and Windowing devices are often limited due to the filtering of spectral bands, resulting in spectral resolutions of >5 nm being more common for these devices [10,27]. High spatial resolution is also easier to achieve with current Push Broom devices as miniaturization allows for them to be deployed on more maneuverable, light-weight devices, for example, a number of studies highlight successful image acquisitions with spatial resolutions of less than 10 cm [26], with Lucieer et al. [17], and Malenovský et al. [18], achieving a spatial resolution of 4 cm with UAV based deployments. Framing and Windowing devices are currently limited due to their typically larger size, making Push Broom sensors significantly more compatible with light-weight, miniaturized sensing applications at present.
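As a rough illustration of why such fine spatial resolutions are achievable from UAV altitudes, the worked example below applies the standard ground sample distance relation, GSD = altitude × pixel pitch / focal length, using assumed, hypothetical sensor parameters rather than figures from any cited instrument:

```python
# Worked example (assumed values, not from the paper): ground sample distance (GSD)
# for a nadir-looking imager, showing why low-altitude UAV flights reach cm-scale pixels.
def ground_sample_distance(altitude_m: float, pixel_pitch_um: float, focal_length_mm: float) -> float:
    """GSD (m/pixel) = flight altitude x detector pixel pitch / lens focal length."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Hypothetical push broom unit: 5.3 um pixels behind a 16 mm lens.
for altitude in (50, 120, 1000):   # UAV low, UAV typical ceiling, light manned aircraft (m)
    gsd_cm = ground_sample_distance(altitude, 5.3, 16.0) * 100
    print(f"{altitude:>5} m altitude -> {gsd_cm:.1f} cm GSD")
# Roughly 1.7 cm at 50 m and 4.0 cm at 120 m, versus about 33 cm from 1000 m,
# consistent with the ~4 cm UAV acquisitions cited above.
```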
Although several of these sensor designs (Figure 2) have been successfully miniaturized, making them suitable for light-weight aerial remote sensing, they do not currently contain any internal georeferencing data and, therefore, require the addition of external (e.g., GPS receiver) devices to record this information if it is required [3,10]. Whilst this does not particularly affect traditional remote sensing and ground-based imaging methods, it can become problematic when designing effective UAV integrated payloads [10,21]. Each of these sensor designs has its advantages, depending on the parameters of the proposed application, however, the push broom design has been the most popular, particularly within the field of light-weight UAV image acquisition [19]. Whilst these sensor implementations can involve distortions within the acquired data, they currently outperform full-frame image capturing approaches as the latter systems currently require a compromise between spatial coverage, spatial resolution and spectral resolution [10,26]. However, as interest and demand within this area continues to grow [8,19], significant advances in compact sensor designs, including the incorporation of linear variable filters, can be anticipated in the future.
Figure 2. Typical schematic designs for each sensor type. (a) Push Broom sensor; (b) Whiskbroom sensor; (c) Framing sensor; (d) Windowing sensor. Note the lack of integration between image tiles for Windowing sensor designs. Image not to scale.
3. Technological Developments and Associated Complexities
Currently, hyperspectral imaging is generally performed by satellite or aircraft platforms [20,26,28], with recent advances in airborne and spaceborne technologies providing end users with rich spectral, spatial, and temporal information [1,2]. As such, hyperspectral imaging has been well established in the remote sensing community, with large-scale uptake across many different domains [4,10]. Furthermore, the recent development of CubeSat miniature satellites, such as HyperCube [29], shows significant potential for future development of light-weight, low-cost spaceborne image acquisition [30–32]. However, whilst these sensors enable the analysis of extensive areas of the Earth’s surface, providing large-scale datasets with long time series [26], they are often constrained by factors outside the users’ control, such as cloud coverage and spatial resolution [1,19,26]. Furthermore, manned aerial surveys operated on an on-demand basis can be rather expensive and somewhat reliant on favorable meteorological conditions [11]. As a result, these drawbacks significantly limit the suitability of these measurement types for many smaller scale, local applications.
Jaud et al. [26] highlight this sizable gap between the small-scale, fine resolution outputs of field surveys and the comparatively coarse resolution provided by satellite and aerial sensors. However, the development of UAV platforms over the last decade has enabled the development of an intermediary protocol, in the form of UAV integrated hyperspectral sensing [8,11,20,33]. These UAV based platforms provide greater flexibility than traditional sensing methods, permitting the user to vary parameters such as survey size and flight altitude [12,26], in a manner tailored to the proposed application. Additionally, due to their typically small size and low weight they can be easily, and readily, stored and deployed [33,34]. A number of UAV integrated hyperspectral sensors have been tested in recent years within a variety of different fields; Habib et al. [20] present a low-cost UAV integrated hyperspectral scanner applied to the field of precision agriculture. Their multi-rotor system proved successful, providing detailed imagery of the survey area, however, difficulties arose during the georectification process, with the accurate generation of georeferenced products proving difficult to establish [20]. Similarly, Jaud et al. [26] experienced complications during the line-by-line georectification and referencing required of their push broom, multi-rotor UAV sensor acquisitions, with the push broom image formation process leading to a major source of complexity during the geometrical correction step [26].
Georectification Difficulties
Due to the light-weight nature of multi-rotor UAV systems they generate substantial high frequency vibrations and can perform faster trajectory changes than larger platforms, therefore, these systems require fast, accurate proprioceptive sensors to enable accurate logging of altitude and position [26,35]. Mozgeris et al. [36] directly compared the results obtained from a UAV based hyperspectral imaging camera and a similar sensor based within a manned, fixed wing, ultra-light aircraft in the context of precision agriculture monitoring at a site in Lithuania. They determined that the manned aircraft sensor outperformed the UAV based device in terms of the quality of output data as a function of cost. A key factor in this was the higher relative accuracy of georeferencing in the case of the manned deployment, which the higher spatial resolution coverage of the UAV sensor was not sufficient to counteract [36]. Conversely, Freitas et al. [19] present a direct georectification method applied on their fixed wing UAV based sensor, which substantially improved the accuracy of target georeferencing. Whilst they still experienced difficulties due to the nature of push broom image acquisition, the results obtained suggest that reliable acquisition of accurately georeferenced data using a UAV based sensor is now possible.
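For context, a heavily simplified sketch of the direct georeferencing idea is given below; it assumes flat terrain and ignores boresight misalignment and lens distortion, and is illustrative only rather than the method used by Freitas et al. [19] or Jaud et al. [26]:

```python
# Minimal direct georeferencing sketch under simplifying assumptions (flat terrain,
# no boresight or lens distortion terms); illustrative only.
import numpy as np

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Body-to-ground rotation built from GNSS/INS attitude angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def ground_coordinate(platform_xyz, attitude_rpy, view_angle_rad):
    """Intersect one cross-track viewing ray of a push broom line with flat ground (z = 0)."""
    # Viewing ray in the sensor/body frame: nadir-looking slit, tilted across track.
    ray_body = np.array([0.0, np.sin(view_angle_rad), -np.cos(view_angle_rad)])
    ray_ground = rotation_matrix(*attitude_rpy) @ ray_body
    scale = -platform_xyz[2] / ray_ground[2]          # distance factor to the z = 0 plane
    return platform_xyz + scale * ray_ground

# Hypothetical exposure: UAV at 120 m altitude, 2 degrees of roll, pixel viewing 5 degrees off nadir.
position = np.array([0.0, 0.0, 120.0])
print(ground_coordinate(position, np.radians([2.0, 0.0, 0.0]), np.radians(5.0)))
```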
A number of studies have circumvented these georectification issues simply by implementing ground-based data acquisition protocols [37], however, the obtained images can still be affected by other factors, such as variable meteorological conditions [8]. Indeed, this issue can affect both ground-based and aerial hyperspectral imaging [24,37]. Variations in illumination, in particular, during the study period can have a significant effect on the captured data, introducing apparent changes in captured spectra unrelated to changes in the scene surface covering [8,20,24,37]. However, the effect of these variations can be minimized by recording trends in illumination in parallel with the image capture [8], which can be used to calibrate the hyperspectral image data acquired during these periods [3,19,24].
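A minimal sketch of this kind of correction is shown below; it uses a generic dark-frame and white-reference-panel approach with assumed variable names, not the specific workflow of any cited study:

```python
# Illustrative sketch (generic empirical correction, assumed values): converting raw digital
# numbers to reflectance using dark-current and white-reference-panel measurements, so that
# illumination changes logged alongside the survey do not masquerade as surface changes.
import numpy as np

def to_reflectance(raw_cube, dark_spectrum, white_spectrum, panel_reflectance=0.99):
    """Per-band correction: (raw - dark) / (white - dark) * panel reflectance."""
    denom = np.clip(white_spectrum - dark_spectrum, 1e-6, None)   # avoid divide-by-zero
    return panel_reflectance * (raw_cube - dark_spectrum) / denom

# Hypothetical acquisition: 100 x 100 pixel scene, 50 bands.
raw_cube = np.random.uniform(500, 4000, size=(100, 100, 50))
dark_spectrum = np.full(50, 100.0)                 # shutter-closed dark frame, per band
white_spectrum = np.full(50, 4095.0)               # reference panel viewed at capture time
reflectance = to_reflectance(raw_cube, dark_spectrum, white_spectrum)

# Re-measuring (or continuously logging) the white reference during long surveys lets the
# same correction track drifting illumination between image captures.
print(reflectance.min(), reflectance.max())
```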
The demand for smaller and lighter hyperspectral imaging sensors continues to grow, with the application of UAV integrated sensors being one of the most rapidly developing areas of remote sensing technology [8,12]. The desire to reduce the physical size of these sensor systems whilst maintaining the data quality available from larger units is an aspiration in both aerial and ground-based sensing configurations [12,37]. With the advent of widely available 3D printing services [38,39], and the continued development of sensors for both scientific and commercial purposes [11], the opportunities to pioneer units specifically tailored to desired application areas have never been greater. Whilst at present, push broom and whiskbroom sensors are subject to limitations in temporal resolution, associated with the georectification process, there are considerable ongoing improvements in accurate direct and indirect georectification methods [19,26]. In general, the continued development of more compact, light-weight devices creates the opportunity for imaging surveys with high spatial and spectral resolutions, delivering added flexibility in the acquisition parameters [11,26,37].
4. Applications within Environmental Monitoring
As highlighted in the sections above there is considerable potential for, and progress towards,
compact, field portable hyperspectral imaging sensors for a variety of environmental monitoring
applications. With the additional benefits of integrating low-cost, high quality consumer market
components, there is now a significant opportunity to make hyperspectral imaging more common
within environmental monitoring. There has, therefore, been a wide variety of devices developed for sensing applications across these settings. Due to the significant variations between settings, the devices required can differ substantially in terms of size, weight, and robustness, to name a few factors. This section will discuss developments across these contrasting environments, concentrating
on some key examples, to illustrate the current state-of-the-art in the field. Within this section the term “low-cost” is used to refer to hyperspectral devices assembled, often “in house”, from mass produced components, allowing for the overall build costs to be significantly lower than those of commercial, scientific grade instruments.
4.1. UAV Based Applications
4.1.1. Agricultural and Natural Vegetation Monitoring
As discussed above, the development of light-weight, and low-cost, UAV compatible sensors is a rapidly expanding area of research resulting in significant developments across a wide range of environmental monitoring applications. Whilst there are potential issues relating to the georectification process [20,26,40], the benefits related to improvements in spatial resolution and reduced fieldwork costs are substantial. The monitoring of vegetation across both natural and agricultural environments is a particular area of environmental monitoring that has benefitted from the advances in miniaturization and cost reduction of hyperspectral technologies [14,41], allowing for precise, in-depth monitoring and data collection to be accomplished even in the most inaccessible locations. The light-weight sensors that have been developed to date show significant potential in their application for close-range environmental monitoring [14], with the introduction of devices for monitoring vegetation health receiving particular attention [41–44]. The continued monitoring of these environments with hyperspectral technologies is of considerable importance. Due to the spectral resolution of these devices it is possible to observe areas of vegetation stress, such as water stress or potential pest outbreaks, before they become visible to the naked eye. This is done through the examination of pigments, such as Chlorophyll, that will vary in quantity depending on the health of the vegetation, subsequently affecting its spectral response. In the initial stages of vegetation stress these changes can be subtle and, therefore, best recognized with hyperspectral imaging. This, in turn, allows for any potential issues to be resolved or minimized before significant damage can be done.
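As an illustration of how individual narrow bands feed into such pigment-sensitive analyses, the sketch below computes standard narrow-band vegetation indices per pixel; the band positions and thresholds are assumed for illustration and this is not a method from the reviewed studies:

```python
# Minimal sketch (standard narrow-band indices, assumed band positions and thresholds):
# using individual hyperspectral bands to compute chlorophyll-sensitive vegetation indices
# per pixel, the kind of subtle-stress signal discussed above.
import numpy as np

wavelengths = np.linspace(400, 1000, 200)                 # hypothetical band centres (nm)
cube = np.random.rand(128, 128, 200)                      # placeholder reflectance cube

def band(target_nm: float) -> np.ndarray:
    """Return the reflectance image of the band closest to the requested wavelength."""
    return cube[:, :, np.argmin(np.abs(wavelengths - target_nm))]

# Narrow-band NDVI (NIR versus red absorption by chlorophyll).
ndvi = (band(800) - band(670)) / (band(800) + band(670) + 1e-9)

# Red-edge index, sensitive to early chlorophyll decline before visible symptoms appear.
red_edge = (band(750) - band(705)) / (band(750) + band(705) + 1e-9)

stressed = (ndvi < 0.5) & (red_edge < 0.3)                # illustrative thresholds only
print(f"{stressed.mean():.1%} of pixels flagged for inspection")
```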
Traditional monitoring methods for both agricultural and natural vegetation typically require time consuming direct measurements or the use of spaceborne sensors [45,46], with limitations in spatial resolution in respect of the latter [47,48]. The introduction of UAV based hyperspectral sensors creates the opportunity to acquire accurate, close-range data that do not require the complex processing typical of satellite and high altitude airborne systems. Indeed, these UAV deployments aim to deliver data in an intermediary format, which provides both the satellite-based benefits of spatial coverage as well as the spatial resolution afforded from ground-based deployments [15,49,50]. In particular, Garzonio et al. [15] present a multi-rotor UAV equipped with a cost-effective hyperspectral sensor capable of detecting wavelengths within the visible and NIR (350–1000 nm) for a variety of vegetation monitoring applications. Due to the multi-rotor design, the device presented was capable of both transect and fixed target measurements, allowing it to be utilised for a variety of scenarios. Furthermore, it provided a systematic and rapid method of high quality data collection, suitable for relatively inaccessible locations, such as dense vegetation forests and forest canopies, allowing large, high resolution datasets to be collected with relative ease [15]. However, despite overcoming issues related to in-flight mechanical vibration of the sensor, the spectral resolution and signal to noise ratio of the device were not optimal to capture all of the desired measurements, with particular problems related to the capture of sun-induced fluorescence data [15].
Similarly, Näsi et al. [14] deployed such technology for monitoring insect damage across urban forests. Their low-cost sensor enabled analysis at an individual tree level, providing a new level of specificity in forest health management practices [14,43]. Whilst such detailed spatial resolution has been achieved by a few studies in the past, such as Minařík and Langhammer [51], and Dash et al. [52], they pertain, solely, to multispectral approaches. This hyperspectral unit [14] performed well, however, difficulties were encountered related to temporal illumination changes during the data acquisition [14]. As highlighted above, this is a potential issue that is generic to hyperspectral imaging from most airborne, and ground-based, devices [8,19], and is, therefore, not a result of the low-cost of this device, but simply a factor that requires attention during extended data acquisitions. A method that provides the simultaneous monitoring of illumination change and data acquisition, and/or reference panel measurements would help to minimize these issues in future work [8,53]. Despite these minor setbacks, the development of these new, easy to use technologies could have significant benefits for monitoring of both urban and rural forest health, with these low-cost units enabling far wider sensor proliferation than possible hitherto, with the more expensive previously applied instrumentation. This in turn could lead to significant benefits in terms of avoidance of future pest outbreaks and the potential resulting forest losses [14,54].
A number of other studies have utilised similar UAV based techniques for the monitoring of agricultural vegetation [55,56], and soil quality [13,57], producing accurate, high spatial resolution results, further emphasizing the wide-ranging usability of these designs. However, there remain limitations related to the weight and power supply of these devices, with heavier payloads having a negative effect on the potential duration of aerial surveys [15]. Whilst this is limiting the practical utilization of these devices at present, as technologies continue to be miniaturized and UAVs themselves advance, survey flight times will become proportionately longer in the future [12].
4.1.2. Extreme Environment Monitoring
A particular benefit of the continued development of these devices is that they allow non-destructive data acquisition, which is of considerable importance for highly sensitive and/or protected environments, which are often a key focus of environmental monitoring research and operations. Moreover, they also enable the acquisition of high spatial resolution data from locations where ground-based field surveys would prove impractical or hazardous. Key examples here include glacial and ice sheet regions, which have been host to considerable UAV based monitoring, for example Crocker et al. [58], Hugenholtz et al. [59], Rippin et al. [60], and Ryan et al. [61]. However, work in this domain to date has been largely restricted to multispectral and/or photogrammetry-based data acquisitions, with hyperspectral monitoring being mostly confined to spaceborne observations [62]. The addition of field portable hyperspectral sensing to glacial settings will provide a significant improvement to current datasets, such as the identification of supraglacial debris composition in otherwise difficult to access locations [63]. Application of UAV based hyperspectral image capture in the cryosphere is likely to be a highly promising future area of research.
4.1.3. Pollution and Particulate Monitoring
Inland water quality and pollution monitoring with hyperspectral sensors has only recently involved a move away from purely spaceborne imaging methods [64,65]. This change has been largely driven by the limitations of satellite-based remote sensing as the spatial resolution provided by most such sensors is somewhat limited, without substantial pixel mixing [64–66]. Hyperspectral sensors used to monitor these environments provide high resolution optical data that allows for the simultaneous detection and monitoring of air and water quality. This provides an extensive and accurate means of pin-pointing potential pollution outbreaks and/or monitoring the quality of freshwater sources across relatively large areas. Although the majority of recently developed sensors within these fields remain aircraft based [64,67–71], with the advantage of coverage of larger survey areas than typically possible with UAVs, a number of pioneering optical sensors for pollution and particulate monitoring are beginning to emerge. These new devices are providing significant improvements to current monitoring techniques with the introduction of UAV based [72,73], and lower cost portable [74], approaches. The promising success rates of these new devices are providing significant improvements to our understanding of particulate pollutants [75], whilst also highlighting the substantial scope for further development and integration of UAV based hyperspectral sensor systems to this field.
4.2. Hand-Held and Ground-Based Device Applications
Whilst the majority of hyperspectral sensing measurements have been achieved from airborne platforms, there have also been significant developments in hand-held and ground-based hyperspectral sensing in recent years [47,48]. These devices are typically relatively light-weight and field portable (Figure 3), making them of significant benefit to a variety of small-scale fieldwork-based studies. However, as this hardware is not subjected to the stringent payload requirements of UAV compatible devices, there are relaxed tolerances with regards to weight, bulk, and power supply. A variety of miniaturized hand-held sensors have been developed for several applications, with a degree of device commercialization implicit in this activity [76,77]. In particular, Shan et al. [78] have developed a field portable hyperspectral imager capable of detecting micro-plastic contamination in soils for particle sizes between 0.5 and 5 mm. Whilst previous research has already successfully detected micro-plastic contamination using hyperspectral imaging [79], that study focused on micro-plastic detection within sea water filtrates, which required the manual separation of micro-plastics from the substrate prior to image acquisition due to difficulties related to plastic identification through water [78]. In contrast, the device developed by Shan et al. [78] enables in-situ measurements with minimal disruption to the study area. Given the increasing importance of this area, this technology is likely to be of ever increasing utility here in the future.
Figure 3. Compact UV hyperspectral imager measuring Sulphur Dioxide release from Cotopaxi volcano, Ecuador.
Furthermore, Chennu et al. [80] discuss the development of a diver-operated underwater device for the monitoring of shallow marine ecosystems, such as coral reefs. This device is the first of its kind and represents a significant, cost-effective improvement in hyperspectral data acquisition for these environments, avoiding the effects of complex optical paths through the atmosphere and the water column [80], associated with observations taken above the water surface. Whilst the spatial resolution of this sensor was lower than that of comparable digital camera imagers, it could sufficiently identify the spectral reflectance features of corals at the organism level. The user-friendly nature of this device allowed it to be operated with no prerequisite skills, however, its present design is too large for integration with unmanned platforms, highlighting a significant avenue for future research.
The examples above highlight the versatility of these devices, with miniaturized hyperspectral sensors replacing conventional non-imaging spectroscopy in a number of application areas [14,47]. Furthermore, this proliferation appears set to continue as the speed of image capture, and the processing power of these units, increase year on year, just as the unit costs are reduced on an annual basis [47]. However, the development of more robust low-cost, field portable sensors for deployment in more extreme settings, e.g., glacial and volcanic regions, remains somewhat limited. The development of future low-cost hyperspectral sensors for these environments would build on the implementation of low-cost spectral technologies in hostile environments [39,81–83], which have been based in configurations suitable for short-term deployments. Indeed, Wilkes et al. [81] intimate that more sustained deployments would require significant improvements to the outer casings of the device for ruggedization and weatherproofing and robust product testing. This is a difficult hurdle to overcome due to the highly dynamic and often volatile nature of these environments, making year-round field-based monitoring challenging, even with state-of-the-art designs [62,84]. Future work could, therefore, involve improvement of robust low-cost hyperspectral imagers to allow them to successfully compete with their scientific grade equivalents for prolonged data collection in these more extreme environments. In this respect, UAV based units have the advantage that deployments are by nature discrete and time limited, rather than continuous, as discussed above.
5. Discussion
The development of these devices, and their application to a panoply of environmental monitoring areas, represent a series of significant technical and scientific advances. These units provide accurate, high resolution datasets, which help to bridge the gap between sparse and discontinuous field observations and continuous but coarse resolution spaceborne technologies [14,15,62], as well as enabling real time analysis and decision making in environmental monitoring contexts [48], making them a beneficial addition to existing field monitoring techniques. Furthermore, miniaturized, low-cost systems can be operated on a local scale by small organizations and/or companies, considerably reducing the time required to organize specific remote sensing campaigns [14], relative to manned airborne surveys, reducing the need for expensive and time-consuming direct measurement methods and enabling affordable and rapid environmental monitoring [14]. This is particularly advantageous in less well-resourced countries, where there are acute needs in terms of crop monitoring, for example.
However, there remain a number of limitations on these devices at present [40]. For UAV based applications, these limitations are largely related to the currently rather large weight, bulk, and power supply requirements of the deployed sensors, highlighting the need for future miniaturization in such devices [15]. Although this hurdle is beginning to be overcome [42], often with the application of off-the-shelf consumer electronics components [47,50], there still typically remains a trade-off between sensor size and data quality in these next generation units [40,77,78]. Similar limitations also affect ground-based and hand-held devices, although in these contexts the restrictions are not as profound. The foremost challenge faced by the majority of these devices is their successful deployment for long-term data collection. However, with potential future developments in ruggedization of the hardware, which will allow such units to become competitive with commercial scientific grade devices for long-term field deployments, the application of ground-based hyperspectral imaging appears set to proliferate rapidly in the coming years (Figure 4).
With the technological move towards more compact, miniaturized devices for optical sensing [85,86], the implementation of low-cost consumer electronics in environmental monitoring is on the rise [85,86]. The application of smartphone-based spectroscopy has been of particular interest for a variety of disciplines [12,39,87], and is a technological step towards the realization of smartphone based hyperspectral imaging. Compared to basic mobile devices, smartphones are equipped with a number of features that expedite sensing applications [85], enabling performance of advanced scientific measurements [88,89]. This is particularly driven by the low-cost of these units, relative to commercial scientific grade cameras [86,90–92], resulting in these units being developed into a variety of lab-in-a-phone technologies [39,81,92,93]. Initial developments in this field have seen the creation of devices that work within the set-up of an existing smartphone, with considerable potential for future device development. However, current work has faced issues in connection with the unit operating systems, wherein raw data files (required for quantitative sensing applications) can be difficult to access and/or are affected by auto-scaling, e.g., Smith et al. [94], and the presence of Bayer filters within the majority of smartphone camera sensor designs, limiting most smartphone sensing to the visible portion of the electromagnetic spectrum within the three defined spectral bands corresponding to the camera’s RGB pixels [82]. However, as smartphone-based spectrometers improve in performance, producing results similar to those of commercial scientific devices [39], the “compromise” in using these cheaper units is becoming less of a relative downside. An in-depth review of these initial developments in smartphone spectroscopy can be found in McGonigle et al. [82].
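The Bayer-filter limitation mentioned above can be illustrated with a short sketch (assuming an RGGB mosaic layout, which is a hypothetical choice here): the raw frame interleaves red, green, and blue photosites, so only three broad bands are recoverable without additional dispersive optics:

```python
# Minimal sketch (assumed RGGB layout) of why a Bayer-filtered smartphone sensor yields
# only three broad spectral bands: the raw mosaic interleaves R, G and B photosites,
# which must be separated (or demosaiced) before any quantitative use.
import numpy as np

raw = np.random.randint(0, 1024, size=(8, 8))     # placeholder 10-bit raw mosaic

# For an RGGB pattern, each 2 x 2 block holds one red, two green and one blue sample.
red   = raw[0::2, 0::2]
green = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
blue  = raw[1::2, 1::2]

# Each plane is a broad-band image; finer spectral sampling needs added dispersive optics.
for name, plane in (("R", red), ("G", green), ("B", blue)):
    print(name, plane.shape, plane.mean())
```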
Figure 4. Example dataset captured using a low-cost hyperspectral device; 128 × 128 hyperspectral image displaying spectral reflectance from 340–850 nm of a green apple and tungsten filament lamp. Image tiles display reflectance peaks across the Red (a), Green (b), and Blue (c) portions of the electromagnetic spectrum. Note the corresponding peaks in reflectance captured in the spectral response graph.
As smartphone spectroscopy continues to develop, initial steps are now being taken towards applying these units for hyperspectral imaging. In particular, Wilcox et al. [12] present an ultra-compact hyperspectral imaging system, for use within a UAV set-up, that has been developed to incorporate smartphone technologies. Similarly, Rissaren et al. [95], and Näsilä et al. [96], report initial developments in smartphone compatible hyperspectral imaging. Critically, this demonstrates that the ever-increasing processor performance from state-of-the-art smartphone handsets is sufficient to manage the significantly larger data volumes associated with hyperspectral imaging in contrast to mere spectral data capture [10,12]. Just as smartphone spectroscopy has now been proven in a number of application areas [87,92,94], allowing for increased data collection at costs up to an order of magnitude lower than from conventional devices [81], it is likely that hyperspectral imaging with smartphones will be increasingly applied in the coming years.
In considering field portable hyperspectral imaging instrumentation for the majority of environmental monitoring settings, three design considerations are particularly pertinent:
Compact light-weight design—Allowing for easy portability to a variety of field sites of varying accessibility. This criterion has particular benefits in relation to set-up times, enabling rapid deployment of technical devices as well as significantly reducing the personnel requirements of field visits. As discussed above, this design feature is also of significant importance for sensors designed for UAV integration.
Low-cost—Whilst this is not an essential requirement for successful environmental monitoring using field-based hyperspectral imaging, the production of low-cost sensors will increase the accessibility of this measurement modality, beyond the relatively limited field deployments achieved hitherto with the rather expensive previously available instrumentation. This is particularly the case for smartphone based platforms, given the ubiquity of these units, and their suitability for implementation as nodes within internet of things type architectures.
Flexibility—In order to achieve the best results, deployed devices need to be easily configurable by researchers, allowing for adaptations to be made relating to the proposed device application. This criterion is most easily met by devices assembled “in house” as it allows researchers to develop and assemble components in the best arrangement for the proposed application, resulting in a device specifically designed for its task. This is typically more favourable than generic, commercial devices, which can be rather difficult to align to specific applications. Furthermore, a device developed “in house” can also provide significant reductions in set-up times as the researchers will generally be familiar with the device design.
Indeed, given the above it is evident that more and more research groups are opting to develop their own devices instead of relying on commercially available, more expensive equipment, pointing to the proliferation and democratisation of hyperspectral imaging across the environmental sciences. Although, at present, many of these technologies are restricted by the current limitations of miniaturisation, and the associated trade-offs that miniaturisation can bring in terms of sensor performance, initial results from smartphone based hyperspectral imaging suggest that significant improvements in cost-effective, high spatial resolution data acquisition can be expected in the near future. This increase in performance, coupled with further reductions in instrumental cost, is therefore likely to lead to increased utility and proliferation of these units in the coming decades.
However, an important additional consideration is the potential costs of required components external to the sensor design. This is of particular importance for sensors designed with the low-cost criterion in mind as the savings made during sensor assembly can quickly be lost through other device requirements. For example, when considering UAV integrated hyperspectral sensors, it is imperative that low-cost designs also adhere to the compact light-weight criterion in order to prevent the incursion of extensive costs related to the acquisition of UAVs with higher payload weight limits. As failure to consider this factor can lead to significant additional build costs, it is, therefore, of considerable importance to understand the payload specifications and limitations of the proposed UAV system in tandem with implementing the sensor design and development process. A number of articles discuss the variations, and subsequent categorisation, of different UAV systems, highlighting the, often substantial, differences in terms of payload weight, fuel requirements, and survey length [49,97–99]. In general, multi-rotor UAVs are more suited to operation within more confined/inaccessible field sites due to their ability to take off/land vertically, whereas fixed wing UAVs are typically suited to longer endurance applications and provide more stable data collection [96], however, the final decision as to which design of UAV is selected is established by the specific parameters of the proposed application and, therefore, varies substantially between projects. Nevertheless, these characteristics are of considerable importance to the successful deployment of a UAV integrated sensor and can significantly impact the overall cost to deliver the measurement. Furthermore, costs and payload weights can be minimized further with the thorough selection of required ancillary sensors, such as RGB cameras and GPS units, e.g., both Näsi et al. [43], and Honkavaara et al. [55], reduced the overall costs of their set-ups with the inclusion of additional small consumer cameras instead of more expensive top-of-the-range models.
It is clear that in order to design a successful low-cost compact hyperspectral imaging instrument a complex list of design variables must be considered and potentially juggled to enable best delivery against the monitoring objectives. Within this there are two key exciting new frontiers, which these low-cost units now expedite: firstly, their potential for deployment and monitoring in less well-resourced countries, allowing for valuable research data to be acquired without the associated costs. Secondly, there is the potential for future, long-term deployments in more extreme environments, for example with applicability in pioneering cost-effective early warning/monitoring systems for more volatile settings. Although the effectiveness of these units is limited by currently available technologies, the increasing interest and development in this sector looks set to produce vast improvements to low-cost and miniaturised hyperspectral data collection, and thus provides the opportunity to improve data sets across a wealth of environmental monitoring domains.
6. Conclusions
This article has provided an in-depth review of current miniaturized and low-cost field deployable
hyperspectral technologies and their integration into the environmental monitoring field. Whilst the
miniaturization and deployment of these devices is ongoing, it is evident that this is a burgeoning area
of research with the potential to revolutionise environmental monitoring in a wide variety of fields,
hence the timeliness of capturing the state-of-the-art in this article. At present,
these devices largely complement existing monitoring techniques, however, as technologies continue
to improve, it is likely that they will be increasingly applied in stand-alone monitoring capacities.
Future work should look to expanding the applications for these devices, in particular allowing them
to be successfully utilized even in more extreme environments, as well as further capitalizing on the
reduced cost of consumer available technology in this domain. With the latest low-cost devices now
producing scientific grade results, it appears as though hyperspectral imaging with smartphones
in particular is now set to become a promising new frontier in empirical environmental science,
significantly broadening the reach of hyperspectral image capture. This article captures the beginning
of what we anticipate will be a steep rising curve of community uptake, broadening applicability far
beyond those application domains covered to date.
Author Contributions:
Writing—original draft preparation, M.B.S.; Writing—review and editing, M.B.S., J.R.W.,
A.J.S.M.
Funding:
This work has been supported by Engineering and Physical Sciences Research Council (EPSRC)
fellowship EP/M009106/1.
Conflicts of Interest: The authors declare no conflict of interest.
References
1.
Govender, M.; Chetty, K.; Bulcock, H. A review of hyperspectral remote sensing and its application in
vegetation and water resource studies. Water SA 2017,33, 145–152. [CrossRef]
2.
Ghamisi, P.; Yokoya, N.; Li, J.; Liao, W.; Liu, S.; Plaza, J.; Rasti, B.; Plaza, A. Advances in hyperspectral image
and signal processing: A comprehensive overview of the state of the art. IEEE Geosci. Remote Sens. Mag.
2017,5, 37–78. [CrossRef]
3.
Khan, M.J.; Khan, H.S.; Yousaf, A.; Khurshid, K.; Abbas, A. Modern trends in hyperspectral image analysis:
A review. IEEE Access 2018,6, 14118–14129. [CrossRef]
4.
Fischer, C.; Kakoulli, I. Multispectral and hyperspectral imaging technologies in conservation: Current
research and potential applications. Stud. Conserv. 2006,51, 3–16. [CrossRef]
5.
Padoan, R.; Steemers, T.A.; Klein, M.; Aalderink, B.; De Bruin, G. Quantitative hyperspectral imaging of
historical documents: Technique and applications. Art Proc. 2008, 25–30.
6.
Liang, H. Advances in multispectral and hyperspectral imaging for archaeology and art conservation.
Appl. Phys. A 2012,106, 309–323. [CrossRef]
7.
Landgrebe, D. Information extraction principles and methods for multispectral and hyperspectral image
data. In Information Processing for Remote Sensing; Chen, C., Ed.; World Scientific Publishing Co., Inc.: River
Edge, NJ, USA, 1999; Volume 1, pp. 3–37.
8.
Honkavaara, E.; Heikki, S.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing
and assessment of spectrometric stereoscopic imagery collected using a lightweight UAV spectral camera for
precision agriculture. Remote Sens. 2013,5, 5006–5039. [CrossRef]
9.
Liu, L.; Xu, L.; Peng, J. 3D Reconstruction from UAV-based hyperspectral images. Int. Arch. Photogramm.
Remote Sens. 2018,42, 1073–1077. [CrossRef]
10.
Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P.J. Quantitative remote sensing at ultra-high resolution
with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction
workflows. Remote Sens. 2018,10, 1091. [CrossRef]
11.
Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017,9, 1110. [CrossRef]
12.
Wilcox, C.C.; Montes, M.; Yetzbacher, M.; Edelberg, J.; Schlupf, J. An ultra-compact hyperspectral imaging
system for use with an unmanned aerial vehicle with smartphone-sensor technology. In Micro- and
Nanotechnology Sensors, Systems, and Applications X, Proceedings of SPIE Defence and Security, Orlando Florida,
United States, May 2018; International Society for Optics and Photonics: Washington, DC, USA, 2018.
[CrossRef]
13.
Baccani, C.; Rossi, G.; Landini, F.; Salvatici, T.; Romoli, M.; Pancrazzi, M.; Facardi, M.; Noce, V.; Moretti, S.;
Casagli, N. Optical design of a hyperspectral drone advanced camera for soil monitoring using an
electro-optical liquid crystal technology. In Optical Design and Engineering VII, Proceedings of SPIE Optical
Systems Design, Frankfurt, Germany, 5 June 2018; International Society for Optics and Photonics: Washington,
DC, USA, 2018. [CrossRef]
14.
Näsi, R.; Honkavaara, E.; Blomquist, M.; Lyytikäinen-Saarenmaa, P.; Hakala, T.; Viljanen, N.; Kantola, T.;
Holopainen, M. Remote sensing of bark beetle damage in urban forests at individual tree level using a novel
hyperspectral camera from UAV and aircraft. Urban For. Urban Green. 2018,30, 72–83. [CrossRef]
15.
Garzonio, R.; Di Mauro, B.; Colombo, R.; Cogliati, S. Surface reflectance and sun-induced fluorescence
spectroscopy measurements using a small hyperspectral UAS. Remote Sens. 2017,9, 472. [CrossRef]
16.
Goetz, A.F. Three decades of hyperspectral remote sensing of the Earth: A personal view. Remote Sens.
Environ. 2009,113, S5–S16. [CrossRef]
17.
Lucieer, A.; Malenovský, Z.; Veness, T.; Wallace, L. HyperUAS—Imaging spectroscopy from a multirotor unmanned aircraft system. J. Field Robot. 2014,31, 571–590. [CrossRef]
18.
Malenovský, Z.; Lucieer, A.; King, D.H.; Turnbull, J.D.; Robinson, S.A. Unmanned aircraft system advances health mapping of fragile polar vegetation. Methods Ecol. Evol. 2017,8, 1842–1857. [CrossRef]
19.
Freitas, S.; Silva, H.; Almeida, J.; Silva, E. Hyperspectral imaging for real-time unmanned aerial vehicle
maritime target detection. J. Intell. Robot. Syst. 2018,90, 551–570. [CrossRef]
20.
Habib, A.; Zhou, T.; Masjedi, A.; Zhang, Z.; Flatt, J.E.; Crawford, M. Boresight calibration of GNSS/INS-assisted
push-broom hyperspectral scanners on UAV platforms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.
2018,11, 1734–1749. [CrossRef]
21.
Fowler, J.E. Compressive pushbroom and whiskbroom sensing for hyperspectral remote-sensing imaging. In
Proceedings of the IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October
2014.
22.
Uto, K.; Seki, H.; Saito, G.; Kosugi, Y.; Komatsu, T. Development of a Low-Cost Hyperspectral Whiskbroom
Imager Using an Optical Fiber Bundle, a Swing Mirror, and Compact Spectrometers. IEEE J. Sel. Top. Appl.
Earth Obs. Remote Sens. 2016,9, 3909–3925. [CrossRef]
23.
Kerekes, J.P.; Schott, J.R. Hyperspectral imaging systems. In Hyperspectral Data Exploitation: Theory and
Applications, 1st ed.; Chang, C.I., Ed.; John Wiley & Sons: Hoboken, NJ, USA, 2007; pp. 19–46.
24.
Uto, K.; Seki, H.; Saito, G.; Kosugi, Y. Development of lightweight hyperspectral imaging system for UAV
observation. In Proceedings of the 2014 6th Workshop on Hyperspectral Image and Signals Processing:
Evolution in Remote Sensing (WHISPERS), Lausanne, Switzerland, 24–27 June 2014. [CrossRef]
25.
Willett, R.M.; Duarte, M.F.; Davenport, M.A.; Baraniuk, R.G. Sparsity and structure in hyperspectral imaging:
Sensing, reconstruction and target detection. IEEE Signal Proc. Mag. 2014,31, 116–126. [CrossRef]
26.
Jaud, M.; Dantec, N.L.; Ammann, J.; Grandjean, P.; Constantin, D.; Akhtman, Y.; Barbieux, K.; Allemand, P.;
Delacourt, C.; Merminod, B. Direct georeferencing of a pushbroom, lightweight hyperspectral system for
mini-UAV applications. Remote Sens. 2018,10, 204. [CrossRef]
27.
Sellar, R.G.; Boreman, G.D. Classification of imaging spectrometers for remote sensing applications. Opt.
Eng. 2005,44, 013602. [CrossRef]
28.
Clark, M.L. Mapping land cover with hyperspectral and multispectral satellites using machine learning and
Spectral Mixture Analysis. In Proceedings of the 2016 IEEE International Geoscience and Remote Sensing
Symposium (IGARSS), Beijing, China, 10–15 July 2016.
29.
Glumb, R.; Lapsley, M.; Lee, D.; Mantica, P.; Dery, J.P. TRL6 testing of a hyperspectral infrared CubeSat
instrument. In Proceedings of the AIAA Space and Astronautics Forum and Exposition, Orlando, FL, USA,
12–14 September 2017. [CrossRef]
30.
Selva, D.; Krejci, D. A survey and assessment of the capabilities of Cubesats for Earth observation.
Acta Astronaut. 2012,74, 50–68. [CrossRef]
31.
Wright, R.; Lucey, P.; Crites, S.; Horton, K.; Wood, M.; Garbeil, H. BBM/EM design of the thermal hyperspectral
imager: An instrument for remote sensing of the Earth’s surface, atmosphere and ocean from a microsatellite
platform. Acta Astronaut. 2013,87, 182–192. [CrossRef]
32.
Poghosyan, A.; Golkar, A. CubeSat evolution: Analyzing CubeSat capabilities for conducting science
missions. Prog. Aerosp. Sci. 2017,88, 59–83. [CrossRef]
33.
Habib, A.; Xiong, W.; He, F.; Yang, H.L.; Crawford, M. Improving orthorectification of UAV-based push-broom
scanner imagery using derived orthophotos from frame cameras. IEEE J. Sel. Top. Appl. Earth Obs. Remote
Sens. 2017,10, 262–276. [CrossRef]
34.
Herwitz, S.R.; Johnson, L.F.; Dunagan, S.E.; Higgins, R.G.; Sullivan, D.V.; Zheng, J.; Lobitz, B.M.; Leung, J.G.;
Gallmeyer, B.A.; Aoyagi, M.; et al. Imaging from an unmanned aerial vehicle: Agricultural surveillance and
decision support. Comput. Electron. Agric. 2004,44, 49–61. [CrossRef]
35.
Hruska, R.; Mitchell, J.; Anderson, M.; Glenn, N.F. Radiometric and geometric analysis of hyperspectral
imagery acquired from an unmanned aerial vehicle. Remote Sens. 2012,4, 2736–2752. [CrossRef]
36.
Mozgeris, G.; Jonikavičius, D.; Jovarauskas, D.; Zinkevičius, R.; Petkevičius, S.; Steponavičius, D. Imaging
from manned ultra-light and unmanned aerial vehicles for estimating properties of spring wheat. Precis. Agric.
2018,19, 1–19. [CrossRef]
37.
Kirsch, M.; Lorenz, S.; Zimmermann, R.; Tusa, L.; Möckel, R.; Hödl, P.; Booysen, R.; Khodadadzadeh, M.;
Gloaguen, R. Integration of terrestrial and drone-borne hyperspectral and photogrammetric sensing methods
for exploration mapping and mining monitoring. Remote Sens. 2018,10, 1366. [CrossRef]
38.
Zhang, C.; Anzalone, N.C.; Faria, R.P.; Pearce, J.M. Open-source 3D-printable optics equipment. PLoS ONE
2013,8, e59840. [CrossRef]
39.
Wilkes, T.C.; McGonigle, A.J.S.; Willmott, J.R.; Pering, T.D.; Cook, J.M. Low-cost 3D printed 1nm resolution
smartphone sensor-based spectrometer: Instrument design and application in ultraviolet spectroscopy.
Opt. Lett. 2017,42, 4323–4326. [CrossRef] [PubMed]
40.
Eckardt, A.; Reulke, R. Low cost hyperspectral systems for atmospheric and surface studies. In Imaging
Spectroscopy XXII: Applications, Scenarios, and Processing, Proceedings of SPIE Optical Engineering and Applications,
San Diego, USA, 2018; International Society for Optics and Photonics: Washington, DC, USA, 2018. [CrossRef]
41.
Senf, C.; Seidl, R.; Hostert, P. Remote sensing of forest insect disturbances: Current state and future directions.
Int. J. Appl. Earth Obs. Geoinform. 2017,60, 49–60. [CrossRef] [PubMed]
42.
Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight
UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J.
Photogramm. Remote Sens. 2015,108, 245–259. [CrossRef]
43.
Näsi, R.; Honkavaara, E.; Lyytikäinen-Saarenmaa, P.; Blomqvist, M.; Litkey, P.; Hakala, T.; Tanhuanpää, T.;
Holopainen, M. Using UAV-based photogrammetry and hyperspectral imaging for mapping bark beetle
damage at tree-level. Remote Sens. 2015,7, 15467–15493. [CrossRef]
44.
Moriya, E.A.S.; Imai, N.N.; Tommaselli, A.M.G.; Miyoshi, G.T. Mapping mosaic virus in sugarcane based on
hyperspectral images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017,10, 740–748. [CrossRef]
45.
Havašová, M.; Bucha, T.; Ferenčík, J.; Jakuš, R. Applicability of a vegetation indices-based method to map
bark beetle outbreaks in the High Tatra Mountains. Ann. For. Res. 2015, 58, 295–310. [CrossRef]
46.
Long, J.A.; Lawrence, R.L. Mapping percent tree mortality due to mountain pine beetle damage. For. Sci.
2016,62, 392–402. [CrossRef]
47.
Gooding, E.A.; Deutsch, E.R.; Huehnerhoff, J.; Hajian, A.R. Fast, cheap and in control: Spectral imaging with
handheld devices. In Next-Generation Spectroscopic Technologies X, Proceedings of SPIE Commercial and Scientific
Sensing and Imaging, Anaheim, USA, 2017; International Society for Optics and Photonics: Washington, DC,
USA, 2017. [CrossRef]
48.
Huehnerhoff, J.; Lozo, J.A.; Deutsch, E.R.; Hajian, A.R. High resolution handheld Raman and reflectance
hyperspectral imaging for remote sensing and threat detection. In Next-Generation Spectroscopic Technologies
XI, Proceedings of SPIE Commercial and Scientific Sensing and Imaging, Orlando, USA, 14 May 2018; International
Society for Optics and Photonics: Washington, DC, USA, 2018. [CrossRef]
49.
Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review.
ISPRS J. Photogramm. Remote Sens. 2014,92, 79–97. [CrossRef]
50.
Uto, K.; Seki, H.; Saito, G.; Kosugi, Y.; Komatsu, T. Development of hyperspectral imaging system using
optical fiber bundle and swing mirror. In Proceedings of the 2015 7th Workshop on Hyperspectral Image and
Signal Processing: Evolution in Remote Sensing (WHISPERS), Tokyo, Japan, 2–5 June 2015; IEEE: Piscataway,
NJ, USA, 2015. [CrossRef]
51.
Minařík, R.; Langhammer, J. Use of a Multispectral UAV Photogrammetry for Detection and Tracking of
Forest Disturbance Dynamics. Int. Arch. Photogramm. 2016, 41, 711–718. [CrossRef]
52.
Dash, J.P.; Watt, M.S.; Pearse, G.D.; Heaphy, M.; Dungey, H.S. Assessing very high resolution UAV imagery
for monitoring forest health during a simulated disease outbreak. ISPRS J. Photogramm. Remote Sens.
2017,131, 1–14. [CrossRef]
53.
Wendel, A.; Underwood, J. Illumination compensation in ground based hyperspectral imaging. ISPRS J.
Photogramm. Remote Sens. 2017,129, 162–178. [CrossRef]
54.
Lehmann, J.R.K.; Nieberding, F.; Prinz, T.; Knoth, C. Analysis of unmanned aerial system-based CIR images
in forestry—A new perspective to monitor pest infestation levels. Forests 2015,6, 594–612. [CrossRef]
55.
Honkavaara, E.; Kaivosoja, J.; Mäkynen, J.; Pellikka, I.; Pesonen, L.; Saari, H.; Salo, H.; Hakala, T.; Markelin, L.;
Rosnell, T. Hyperspectral reflectance signatures and point clouds for precision agriculture by light weight
UAV imaging system. ISPRS Ann. Photogramm. 2012,7, 353–358. [CrossRef]
56.
Bareth, G.; Aasen, H.; Bendig, J.; Gnyp, M.L.; Bolten, A.; Jung, A.; Michels, R.; Soukkamäki, J. Low-weight and
UAV-based Hyperspectral Full-frame Cameras for Monitoring Crops: Spectral Comparison with Portable
Spectroradiometer Measurements. Photogrammetrie-Fernerkundung-Geoinformation 2015, 1, 69–79. [CrossRef]
57.
Jakob, S.; Zimmermann, R.; Gloaguen, R. The need for accurate geometric and radiometric corrections
of drone-borne hyperspectral data for mineral exploration: Mephysto—A toolbox for pre-processing
drone-borne hyperspectral data. Remote Sens. 2017,9, 88. [CrossRef]
58.
Crocker, R.I.; Maslanik, J.A.; Adler, J.J.; Palo, S.E.; Herzfeld, U.C.; Emery, W.J. A sensor package for ice surface
observations using small unmanned aircraft systems. IEEE Trans. Geosci. Remote Sens. 2012, 50, 1033–1047. [CrossRef]
59.
Hugenholtz, C.H.; Whitehead, K.; Brown, O.W.; Barchyn, T.E.; Moorman, B.J.; LeClair, A.; Riddell, K.;
Hamilton, T. Geomorphological mapping with a small unmanned aircraft system (sUAS): Feature detection
and accuracy of a photogrammetrically-derived digital terrain model. Geomorphology 2013, 194, 16–24. [CrossRef]
60.
Rippin, D.M.; Pomfret, A.; King, N. High resolution mapping of supra-glacial drainage pathways reveals
link between micro-channel drainage density, surface roughness and surface reflectance. Earth Surf. Process.
Landf. 2015,40, 1279–1290. [CrossRef]
61.
Ryan, J.C.; Hubbard, A.L.; Box, J.E.; Todd, J.; Christoffersen, P.; Carr, J.R.; Holt, T.O.; Snooke, N.A. UAV
photogrammetry and structure from motion to assess calving dynamics at Store Glacier, a large outlet
draining the Greenland ice sheet. Cryosphere 2015,9, 1–11. [CrossRef]
62.
Bhardwaj, A.; Sam, L.; Akanksha; Martin-Torres, F.J.; Kumar, R. UAVs as remote sensing platform in
glaciology: Present applications and future prospects. Remote Sens. Environ. 2016, 175, 196–204. [CrossRef]
63.
Di Mauro, B.; Baccolo, G.; Garzonio, R.; Giardino, C.; Massabó, D.; Piazzalunga, A.; Rossini, M.; Colombo, R.
Impact of impurities and cryoconite on the optical properties of the Morteratsch Glacier (Swiss Alps).
Cryosphere 2017,11, 2393–2409. [CrossRef]
64.
Pu, H.; Liu, J.-H.; Qu, D.; Sun, D.-W. Applications of imaging spectrometry in inland water quality
monitoring—a review of recent development. Water Air Soil Pollut. 2017,228, 131. [CrossRef]
65.
Pabón, R.E.C.; de Souza Filho, C.R.; de Oliveira, W.J. Reflectance and imaging spectroscopy applied to
detection of petroleum hydrocarbons pollution in bare soils. Sci. Total Environ. 2019, 649, 1224–1236.
[CrossRef] [PubMed]
66.
Gholizadeh, A.; Saberioon, M.; Ben-Dor, E.; Borůvka, L. Monitoring of selected soil contaminants using
proximal and remote sensing techniques: Background, state-of-the-art and future perspectives. Crit. Rev.
Environ. Sci. Technol. 2018,48, 243–278. [CrossRef]
67.
Olmanson, L.G.; Brezonik, P.L.; Bauer, M.E. Airborne hyperspectral remote sensing to assess spatial
distribution of water quality characteristics in large rivers: The Mississippi River and its tributaries in
Minnesota. Remote Sens. Environ. 2013,130, 254–265. [CrossRef]
68.
Dierssen, H.M.; Chlus, A.; Russell, B. Hyperspectral discrimination of floating mats of seagrass wrack and the
macroalgae Sargassum in coastal waters of Greater Florida Bay using airborne remote sensing. Remote Sens.
Environ. 2015,167, 247–258. [CrossRef]
69.
Pabón, R.E.C.; de Souza Filho, C.R. Spectroscopic characterization of red latosols contaminated by
petroleum-hydrocarbon and empirical model to estimate pollutant content and type. Remote Sens. Environ.
2016,175, 323–336. [CrossRef]
70.
Scafutto, R.D.P.M.; de Souza Filho, C.R.; Rivard, B. Characterization of mineral substrates impregnated
with crude oils using proximal infrared hyperspectral imaging. Remote Sens. Environ. 2016, 179, 116–130. [CrossRef]
71.
Scafutto, R.D.P.M.; de Souza Filho, C.R.; de Oliveira, W.J. Hyperspectral remote sensing detection of
petroleum hydrocarbons in mixtures with mineral substrates: Implications for onshore exploration and
monitoring. ISPRS J. Photogramm. Remote Sens. 2017,128, 146–157. [CrossRef]
72.
Memisoglu, G.; Gulbahar, B.; Zubia, J.; Villatoro, J. Theoretical modelling of viscosity monitoring with vibrating
resonance energy transfer for point-of-care and environmental monitoring. Micromachines 2018, 10, 3. [CrossRef]
73.
Alvarado, M.; Gonzalez, F.; Fletcher, A.; Doshi, A. Towards the development of a low cost airborne sensing
system to monitor dust particles after blasting at open-pit mine sites. Sensors 2015, 15, 19667–19687. [CrossRef] [PubMed]
74.
Ng, C.L.; Kai, F.M.; Tee, M.H.; Tan, N.; Hemond, H.F. A prototype sensor for in situ sensing of fine particulate
matter and volatile organic compounds. Sensors 2018,18, 265. [CrossRef] [PubMed]
75.
Reid, J.P.; Bertram, A.K.; Topping, D.O.; Laskin, A.; Martin, S.T.; Petters, M.D.; Pope, F.D.; Rovelli, G. The
viscosity of atmospherically relevant organic particles. Nat. Commun. 2018,9, 956. [CrossRef] [PubMed]
76.
Ziph-Schatzberg, L.; Woodman, P.; Nakanishi, K.; Cornell, J.; Wiggins, R.; Swartz, B.; Holasek, R. Compact,
high performance hyperspectral systems design and applications. In Next-Generation Spectroscopic Technologies
VIII, Proceedings of SPIE Sensing Technology and Application, Baltimore, USA, 2015; International Society for
Optics and Photonics: Washington, DC, USA, 2015. [CrossRef]
77.
Holasek, R.; Nakanishi, K.; Ziph-Schatzberg, L.; Santman, J.; Woodman, P.; Zacaroli, R.; Wiggins, R. The
selectable hyperspectral airborne remote sensing kit (SHARK) as an enabler for precision agriculture. In
Hyperspectral Imaging Sensors: Innovative Applications and Sensor Standards 2017, Proceedings of SPIE Commercial
and Scientific Sensing and Imaging, Anaheim, USA, 2017; International Society for Optics and Photonics:
Washington, DC, USA, 2017. [CrossRef]
78.
Shan, J.; Zhao, J.; Liu, L.; Zhang, Y.; Wang, X.; Wu, F. A novel way to rapidly monitor micro plastics in soil
by hyperspectral imaging technology and chemometrics. Environ. Pollut. 2018, 238, 121–129. [CrossRef] [PubMed]
79.
Karlsson, T.M.; Grahn, H.; van Bavel, B.; Geladi, P. Hyperspectral imaging and data analysis for detecting
and determining plastic contamination in seawater filtrates. J. Near Infrared Spectrosc. 2016, 24, 141–149. [CrossRef]
80.
Chennu, A.; Färber, P.; De’ath, G.; de Beer, D.; Fabricius, K.E. A diver-operated hyperspectral imaging and
topographic surveying system for automated mapping of benthic habitats. Sci. Rep. 2017, 7, 7122. [CrossRef] [PubMed]
81.
Wilkes, T.C.; Pering, T.D.; McGonigle, A.J.S.; Tamburello, G.; Willmott, J.R. A low-cost smartphone
sensor-based UV camera for volcanic SO2 emission measurements. Remote Sens. 2017, 9, 27. [CrossRef]
82.
McGonigle, A.J.S.; Wilkes, T.C.; Pering, T.D.; Willmott, J.R.; Cook, J.M.; Mims, F.M.; Parisi, A.V. Smartphone
spectrometers. Sensors 2018,18, 223. [CrossRef]
83.
McGonigle, A.J.S.; Pering, T.D.; Wilkes, T.C.; Tamburello, G.; D'Aleo, R.; Bitetto, M.; Aiuppa, A.; Willmott, J.R.
Ultraviolet imaging of volcanic plumes: A new paradigm in volcanology. Geosciences 2017, 7, 68. [CrossRef]
84.
Bhardwaj, A.; Joshi, P.K.; Sam, L.; Snehmani. Remote sensing of alpine glaciers in visible and infrared
wavelengths: A survey of advances and prospects. Geocarto Int. 2015,31, 557–574. [CrossRef]
85.
Zhang, C.; Cheng, G.; Edwards, P.; Zhou, M.-D.; Zheng, S.; Liu, Z. G-Fresnel smartphone spectrometer.
Lab Chip 2016,16, 246–250. [CrossRef] [PubMed]
86.
Sigernes, F.; Syrjäsuo, M.; Storvold, R.; Fortuna, J.; Grøtte, M.E.; Johansen, T.A. Do it yourself hyperspectral
imager for handheld to airborne operations. Opt. Express 2018,26, 6021–6035. [CrossRef] [PubMed]
87.
Hossain, A.; Canning, J.; Cook, K.; Jamalipour, A. Optical fiber smartphone spectrometer. Opt. Lett. 2016, 41, 2237–2240.
[CrossRef] [PubMed]
88.
Lane, N.D.; Miluzzo, E.; Lu, H.; Peebles, D.; Choudhury, T.; Campbell, A.T. A survey of mobile phone sensing.
IEEE Commun. Mag. 2010,48, 140–150. [CrossRef]
89.
Contreras-Naranjo, J.C.; Wei, Q.; Ozcan, A. Mobile phone-based microscopy, sensing, and diagnostics. IEEE J.
Sel. Top. Quantum Electron. 2016,22, 1–14. [CrossRef]
90.
Zhu, H.; Sikora, U.; Ozcan, A. Quantum dot enabled detection of Escherichia coli using a cell-phone. Analyst
2012,137, 2541–2544. [CrossRef] [PubMed]
91.
Dutta, S.; Choudhury, A.; Nath, P. Evanescent wave coupled spectroscopic sensing using smartphone.
IEEE Photonics Technol. Lett. 2014,26, 568–570. [CrossRef]
92.
Hossain, A.; Canning, J.; Ast, S.; Cook, K.; Rutledge, P.J.; Jamalipour, A. Combined “dual” absorption and
fluorescence smartphone spectrometers. Opt. Lett. 2015,40, 1737–1740. [CrossRef]
93.
Ozcan, A. Mobile phones democratize and cultivate next-generation imaging, diagnostics, and measurement
tools. Lab Chip 2014,14, 3187–3194. [CrossRef]
94.
Smith, Z.J.; Chu, K.; Espenson, A.R.; Rahimzadeh, M.; Gryshuk, A.; Molinaro, M.; Dwyre, D.E.; Lane, S.;
Matthews, D.; Wachsmann-Hogiu, S. Cell-phone-based platform for biomedical device development and
educational applications. PLoS ONE 2011,6, e17150. [CrossRef]
95.
Rissanen, A.; Saari, H.; Rainio, K.; Stuns, I.; Viherkanto, K.; Holmlund, C.; Näkki, I.; Ojanen, H. MEMS
FPI-based smartphone hyperspectral imager. In Next-Generation Spectroscopic Technologies IX, Proceedings of
SPIE Commercial and Scientific Sensing and Imaging, Baltimore, USA, 2016; International Society for Optics and
Photonics: Washington, DC, USA, 2016. [CrossRef]
96.
Näsilä, A.; Trops, R.; Stuns, I.; Havia, T.; Saari, H.; Guo, B.; Ojanen, H.J.; Akujärvi, A.; Rissanen, A. Hand-held
MEMS hyperspectral imager for VNIR mobile applications. In MOEMS and Miniaturized Systems XVII,
Proceedings of SPIE, San Francisco, USA, 2018; International Society for Optics and Photonics: Washington,
DC, USA, 2018. [CrossRef]
97.
Vergouw, B.; Nagel, H.; Bondt, G.; Custers, B. Drone technology: Types, payloads, applications, frequency
spectrum issues and future developments. In The Future of Drone Use, 1st ed.; Custers, B., Ed.; TMC Asser
Press: The Hague, The Netherlands, 2016; Volume 27, pp. 21–45. [CrossRef]
98.
Cunliffe, A.M.; Anderson, K.; DeBell, L.; Duffy, J.P. A UK Civil Aviation Authority (CAA)-approved operations
manual for safe deployment of light-weight drones in research. Int. J. Remote Sens. 2017, 38, 2737–2744.
[CrossRef]
99. Clarke, R. Understanding the drone epidemic. Comput. Law Secur. Rev. 2014, 30, 230–246. [CrossRef]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).