Resource Monitoring with Wireless Sensor
Networks and Satellite
Sajid Nazir
dot.rural
Digital Economy Hub
University of Aberdeen
Aberdeen, UK
+44(0)1224272530
sajid.nazir@abdn.ac.uk
Gorry Fairhurst
dot.rural
Digital Economy Hub
University of Aberdeen
Aberdeen, UK
+44(0)1224272201
g.fairhurst@abdn.ac.uk
Fabio Verdicchio
dot.rural
Digital Economy Hub
University of Aberdeen
Aberdeen, UK
+44(0)1224273665
fverdicc@abdn.ac.uk
René van der Wal
dot.rural
Digital Economy Hub
University of Aberdeen
Aberdeen, UK
+44(0)1224272256
r.vanderwal@abdn.ac.uk
Scott Newey
The James Hutton
Institute
Craigiebuckler
Aberdeen, UK
+44(0)1224395215
scott.newey@hutton.ac.uk
ABSTRACT
Effective management of the natural environment relies on acquiring
accurate and detailed information in a timely manner. Advances in
sensing, communication and processing technologies now make it
possible to capture events of interest in the physical world from
remote locations.
The WiSE (Wireless Internet Sensing Environment) Project is a
multi-disciplinary activity funded by the RCUK dot.rural Digital
Economy Hub. It has been designed to facilitate the deployment of
cameras and sensors to remotely monitor an area of interest in all
weather conditions. The platform is solar powered and fully
autonomous. A satellite unit enables deployment in even the remotest
parts of the world while still providing access to the data via the Internet.
Categories and Subject Descriptors
C.2.4 [Computer-Communication Networks]: Distributed
Systems – Distributed applications.
General Terms
Algorithms, Design, Management.
Keywords
Video processing, Network protocols, Database, Satellite
communication.
1. INTRODUCTION
The WiSE Project has been developed to support remote resource
management, with the first application performing remote
environmental monitoring. It combines many innovative features
to enable access to remote imagery and sensor data while
optimising the use of deployed resources in extreme weather
conditions. It can easily be used for video surveillance, asset
management and infrastructure protection [1].
The platform uses a local area network to connect the processing
unit, sensors, cameras and satellite router, as shown in Figure 1.
Sensors are connected via low-power wireless to monitor the
environment, perform motion detection, etc. All sensor data is
stored in a database together with information about the context.
This allows data to be retrieved via the Internet, for instance to
plot temperature variation, explore trends in activity at the site, or
for other purposes not currently envisaged. Section 2 describes the
platform architecture and technology components.

Digital Futures 2013, November 2013, Manchester, UK. The research
described here is supported by the award made by the RCUK Digital
Economy programme to the dot.rural Digital Economy Hub, reference:
EP/G066051/1. URL: http://www.dotrural.ac.uk/wise/
Section 3 explores use of the platform to provide non-invasive
monitoring of animals in their natural habitat. This application is
similar to conventional digital camera traps, which use a sensor to
activate a camera when a moving target (e.g. an animal) is detected
[1]. One drawback of existing equipment is that the majority of
captured images are often “non-images” with no subject, due to
false triggering of the sensors.
This is seen as the single biggest disadvantage and expense in
using photography and video as an environmental monitoring
tool. There is thus a clear need to avoid excessive data
transmission, storage and operator analysis [3]. Collection of
“non-images” is particularly unhelpful when data has to be
retrieved retrospectively by visiting remote sites. The WiSE
platform represents a significant advance over existing methods and
is expected to have wide applicability, enabling a range of video-
based sensing applications beyond the environmental domain.
2. THE WiSE PLATFORM
The WiSE platform is designed to be flexible and self-powered, and to
support year-round operation in a harsh outdoor
environment. A network connects processors, sensors and cameras
to a Gateway that coordinates monitoring and provides backhaul
connectivity to the Internet. The platform generates its own power
using battery-backed solar panels. This provides a multi-tiered
platform (Figure 2) with easy access to the information of interest.
The WiSE platform is an enabler that may be used for numerous
applications such as remote sensing, emergency operations in
disaster areas, and monitoring construction site safety. Video-
based monitoring applications can also offer the availability of
real-time access to imagery [2].
The Gateway is housed in a weather-proof enclosure. It comprises
a processor (Raspberry Pi B) [4] with the solar panel, power
management and communications equipment.
IP-based cameras record the imagery, powered over wired Ethernet
cables connected to the Gateway. Images and video may be captured
at a maximum resolution of 5 megapixels, and each camera can
provide two simultaneous video streams using MJPEG and H.264
coding. These cameras allow the researchers to evaluate new
protocols, both to realise a continuous presence at the remote site
and to explore retrieval of imagery. Advanced video methods are
also expected to reduce the need for subsequent human
interpretation, with significant reductions in the cost of processing
the captured imagery.
A network of sensor nodes monitors the area around the Gateway.
Each sensor node uses an AVR RFA1 micro-controller [6]
equipped with a temperature sensor, Passive Infrared (PIR) and
X-band radar motion detectors, and low-power wireless
communications using IPv6 Low Power Wireless Personal Area
Networks (6LoWPAN) [7][8]. The current consumption during
transmission for the AVR microcontroller is only 18.6 mA,
allowing operation of the unit for extended periods using external
batteries.
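The impact of the low transmit current on deployment lifetime can be sketched with a rough calculation. The 18.6 mA figure is from the text; the battery capacity, sleep current and duty cycle below are illustrative assumptions, not measured values from the deployment.

```python
# Hedged sketch: rough battery-life estimate for a sensor node.
# Only the 18.6 mA transmit current comes from the paper; the
# capacity, sleep current and duty cycle are assumed for illustration.

def battery_life_hours(capacity_mah, tx_ma, sleep_ma, duty_cycle):
    """Average current is a duty-cycled mix of transmit and sleep draw."""
    avg_ma = duty_cycle * tx_ma + (1 - duty_cycle) * sleep_ma
    return capacity_mah / avg_ma

# Continuous transmission from a hypothetical 2000 mAh pack:
always_on = battery_life_hours(2000, 18.6, 18.6, 1.0)  # roughly 107 hours

# Transmitting 1% of the time with an assumed 0.02 mA sleep current:
duty_cycled = battery_life_hours(2000, 18.6, 0.02, 0.01)  # months, not days
```

The orders-of-magnitude gap between the two cases illustrates why duty-cycled, low-power wireless nodes can run for extended periods on modest external batteries.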
When a node detects movement within its local vicinity or
changes in other parameters (e.g. battery health or temperature), it
sends a wireless trigger message. Some sensor nodes may also
have on-board cameras (e.g. for capturing imagery or providing
video scene analysis).
Sensor trigger messages and imagery are stored in a data
repository at the Gateway. This is implemented as a Triplestore
database [5] which captures the context of each sensor reading as
well as its numerical value/image. The repository can be queried
in real-time to define rules that automate decisions on when to
capture data. In a typical application, another processor monitors
the various triggers and uses rules to decide when to start
recording of video or still images by a camera.
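The kind of rule described above can be sketched as follows. The two-sensor agreement condition, the 5-second window and the message format are illustrative assumptions, not the project's actual rule logic.

```python
# Hedged sketch of a Gateway rule deciding when to start recording.
# The thresholds, agreement condition and trigger format are assumed
# for illustration only.

RECENT_WINDOW_S = 5.0  # treat triggers within 5 s as simultaneous

def should_record(triggers):
    """triggers: list of (sensor_type, timestamp_s) trigger messages.
    Record only when a PIR and a radar trigger arrive within the
    window, suppressing single-sensor false triggers ("non-images")."""
    pir = [t for s, t in triggers if s == "pir"]
    radar = [t for s, t in triggers if s == "radar"]
    return any(abs(p - r) <= RECENT_WINDOW_S for p in pir for r in radar)

assert should_record([("pir", 100.0), ("radar", 102.5)])      # agree: record
assert not should_record([("pir", 100.0), ("radar", 200.0)])  # lone triggers
```

Requiring agreement between independent sensing modalities is one simple way a rule stored against the repository could cut the false-trigger rate before any imagery is captured.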
Imagery may be remotely accessed at reduced quality via the
satellite Internet connection. This could later be followed up by
download of high quality imagery and video. The repository can
also be retrospectively queried to support new analysis.
Power management is key to sustaining a system powered by solar
panels, especially during winter months when the panels receive
less daylight and may be covered by snow. The solar panels charge
two banks, each of two 12 V gel batteries with 70 Ah capacity.
Either bank can be selected to power the main camera, which is seen
as the most important resource.
The satellite unit and the main camera consume the most battery
power, while controllers consume relatively little power. The
Gateway monitors power consumption to ensure judicious use of
the available power. This allows the Gateway to dynamically
activate components as required. This also enables experiments that
dynamically trade capture quality and communications usage against
the power budget, e.g. adaptive video algorithms that react to
scenarios of interest and select the imagery collected based on
power consumption, stored battery energy, and communication needs
(real-time streaming, remote control, access to data, or background
download of high-quality imagery).
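A power-aware activation policy of this kind can be sketched as below. The component power draws, thresholds and greedy ordering are illustrative assumptions; the paper does not specify the Gateway's actual policy.

```python
# Hedged sketch: choosing which components to power up given battery
# state. Power draws and thresholds are assumed for illustration.

POWER_W = {"controller": 0.5, "camera": 6.0, "satellite": 20.0}

def components_to_enable(battery_fraction, event_pending):
    """Greedy policy: controllers always run; the camera runs when an
    event is pending and charge permits; the satellite uplink (the
    largest load) is reserved for a healthy battery."""
    enabled = ["controller"]
    if event_pending and battery_fraction > 0.2:
        enabled.append("camera")
    if battery_fraction > 0.5:
        enabled.append("satellite")
    return enabled

assert components_to_enable(0.8, True) == ["controller", "camera", "satellite"]
assert components_to_enable(0.3, True) == ["controller", "camera"]
assert components_to_enable(0.1, False) == ["controller"]
```

Ordering loads by power draw and gating the heaviest on battery health is one simple way to stretch a winter power budget; an adaptive video algorithm would refine this by also varying capture quality.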
Since the WiSE platform is accessible across the Internet, the
platform software can be reconfigured after deployment. This
provides a truly flexible platform, allowing experimentation with
new technologies and customisation of the algorithms used for data
collection. The use of a Triplestore database extends this
flexibility by offering the potential to build large online datasets
that can later be used to explore new applications based on the
captured sensor data.
3. A WiSE CAMERA TRAP
This section describes use of the WiSE platform as an advanced
camera trap to support the Natural Resource and Conservation
(NRC) theme of the dot.rural DE hub.
Monitoring of natural resources, biodiversity and levels of
exploitation is critical for economic, ecological and social
sustainability, yet represents one of the most challenging areas of
natural resource management, embracing issues of governance,
ecology, resource management and technology. Recent advances
have allowed researchers and managers to observe and monitor
wild animals remotely (Figure 3).
Such non-invasive techniques can reduce time and effort for data
collection if conducted remotely and avoid collecting large
volumes of “non-images”. Remote access also enables new uses,
and avoids animal welfare issues associated with capture and
handling of wild, and sometimes endangered animals [9].
An initial camera trap design was prototyped and tested at the
University of Aberdeen, and installed in May 2013 in an upland
area of the Cairngorms region. This remote location, away from
existing communications infrastructure, provides an ideal test
application for the technology. The system is expected to start
collecting its first video-based imagery in June 2013.
In this application, the WiSE sensor information is consumed by
the Gateway. Rules define how to combine the PIR and radar
sensors to reliably detect movement, allowing intelligent image
capture by the network camera. Video content analysis is used
with pre- and post-capture techniques so that irrelevant images
and videos are marked as less relevant and/or discarded.
We expect continued operation of the platform to show that it can
effectively combine many of the advantages of “wired” Internet
cameras and remote digital camera traps, while also providing a
platform for exploring new digital techniques. For example, we
plan to use the flexibility of the platform to also explore whether
video motion detection on the current frame can be used to
confirm the presence of an object of interest.
The WiSE platform design and the developed techniques are
expected to have wider applicability to a range of video-based
sensing applications beyond the environmental domain. The
project is therefore now seeking partners for a complementary
second scenario to focus on another application for the platform.
4. CONCLUSIONS
This paper summarises the current architecture and system
components of the WiSE platform. Use of the platform is
generating outcomes from two complementary, but contrasting,
perspectives:
From a natural resource management perspective, the evaluation
is exploring how digital technology can change the way
environmental monitoring in remote environments is conducted.
The results will be evaluated by the stakeholders for each scenario
to assess the impact on future practice and policy. The lessons
learned and opportunities provided will be disseminated to other
prospective users of non-invasive monitoring techniques.
From a digital technology perspective, the methods combine state
of the art equipment with new algorithms for video capture,
compression and sensor data transmission. Use of the methods at
a remote location is expected to provide practical data to
understand the design space and evaluate the potential benefits of
using smart sensors to generate a digital environment. Interactions
with stakeholders will help to understand the implications of the
work on future systems.
5. REFERENCES
[1] O’Connell, A.F., Nichols, J. D. & Karanth, K. U. (2011)
Camera Traps in Animal Ecology: Methods and Analyses,
Springer.
[2] Kays, R., Tilak, S., Kranstauber, B., Jansen, P. A., Carbone,
C., Rowcliffe, M., Fountain, T., Eggert, J. & He, Z. (2011)
Camera Traps as Sensor Networks for Monitoring Animal
Communities. Int. Journal of Research and Reviews in
Wireless Sensor Networks 1, 19-29.
[3] Fairhurst, G., Van der Wal, R., Verdicchio, F., Nazir, S. and
Newey, S. (2012) WiSE: Wireless Internet Sensing
Environment. In Digital Economy All Hands Conference
DE-2012, Aberdeen, UK.
[4] http://www.raspberrypi.org/
[5] http://en.wikipedia.org/wiki/Triplestore
[6] http://www.atmel.com/devices/atmega128rfa1.aspx
[7] Shelby, Z. and Bormann, C. (2011) The Wireless Embedded
Internet - Part 1: Why 6LoWPAN? EE Times Design.
[8] Montenegro, G., Kushalnagar, N., Hui, J. and Culler, D. (2007)
Transmission of IPv6 Packets over IEEE 802.15.4 Networks.
RFC 4944, September 2007.
[9] Long, R.A., MacKay, P., Ray, J. C. & Zielinski, W. J. (2008)
Non-invasive Survey Methods for Carnivores, Island Press,
Washington, USA.
Figure 1: System architecture.
Figure 2: WiSE multi-tiered structure (tiers: User Application;
Web Browser; Web Server; Triplestore Database; Sensor Nodes
and Cameras).
Figure 3: Captured image of an eagle.