Running head: SWARMSIGHT: ANIMAL GROUP ACTIVITY FROM VIDEOS
SwarmSight: Measuring the Temporal Progression of Animal Group Activity Levels from
Natural Scene and Laboratory Videos
Justas Birgiolas1, Christopher M. Jernigan1, Brian H. Smith1, Sharon M. Crook1,2
1 Arizona State University, School of Life Sciences
2 Arizona State University, School of Mathematical and Statistical Sciences
Correspondence:
Justas Birgiolas
justas@asu.edu
(217) 369-6210
School of Life Sciences
P.O. Box 874601
Arizona State University
Tempe, AZ 85287-4601
Abstract
We describe SwarmSight (available at: https://github.com/justasb/SwarmSight), a novel, open-
source, Microsoft Windows software tool for quantitative assessment of the temporal progression
of animal group activity levels from recorded videos. The tool utilizes a background subtraction
machine vision algorithm and provides an activity metric that can be used to quantitatively assess
and compare animal group behavior. Here we demonstrate the tool's utility by analyzing defensive
bee behavior as modulated by alarm pheromones, wild bird feeding onset and interruption, and
cockroach nest-finding activity. While more sophisticated, commercial software packages are
available, SwarmSight provides a low-cost, open-source, and easy-to-use alternative that is
suitable for a wide range of users, including minimally trained research technicians and
behavioral science undergraduate students in classroom laboratory settings.
Keywords: insects, birds, bees, cockroaches, group activity, motion detection, activity
detection, activity levels, video motion analysis, natural scenes, activity change, automated
behavior assessment, software, SwarmSight
SwarmSight: Measuring the Temporal Progression of Animal Group Activity Levels from
Natural Scene and Laboratory Videos
Measuring the complex behavior of animals that often act in groups has been a challenge.
In classic work, Altmann (1974) discusses the strategies of managing the complexity of assessing
social animal behavior by, among others, using randomly timed sampling sessions and limiting
the number of continuously tracked (focal) animals. Since then, tracking individual animals and
humans using video from digital cameras has become relatively straightforward. However, the
automated assessment of activity of large groups of animals in their natural environments has yet
to see the same advances as individual tracking. While manually assessing group movement is
usually possible, automating this process with computer vision software can greatly increase the
speed of data collection and expose the details of group movement structure to an extent that was
not available before. Here, we briefly cover the progress of group activity detection software and
how a new tool, SwarmSight, advances the state of the art.
Many animals are social and exhibit complex group behavior. For example, some fish
gather together into shoals while migrating, searching for food, and avoiding predators (Cushing
& Jones, 1968). Many birds accomplish similar goals and also optimize group aerodynamic
efficiency through flocking (Higdon & Corrsin, 1978). Insects like locusts will form swarms as a
response to overcrowding (Collett, Despland, Simpson, & Krakauer, 1998), and bees, wasps, and
termites will swarm when searching for a new colony (McGlynn, 2012). Some insect colonies,
like those of stingless bees, will release many individuals to form a swarm to defend against
raiding species (David, 2006). These examples demonstrate that group dynamics play an
important role when studying animal feeding, migratory, and defensive behaviors.
Group behavior is also present in humans. Humans interact with each other in a group
setting as pedestrians, as drivers in car traffic, and as shoppers in retail establishments. The latter
two categories play important economic roles in modern human society. For example, in 2010,
traffic congestion in the U.S. resulted in economic losses of approximately $101 billion
(Lindsey, 2012), while in 2015, non-online sales in the United States totaled approximately $4.4
trillion (U.S. Department of Commerce, 2015). Humans also form complex social
relationships, and behave differently when they are members or leaders of larger groups (Dyer,
Johansson, Helbing, Couzin, & Krause, 2009). Insights into some aspects of human leadership
can be gained from studying animal and agent-based computer models (Quera, Beltran, &
Dolado, 2010) that are easier to manipulate experimentally. Thus, tools that are used to assess
animal group behavior can be applied to study specific human group behaviors.
With minimal use of technology, entomologists, ornithologists, and psychologists can
visually observe movement or flight behaviors, use hand-held counters, and record their scores
and observations in computer worksheets (Altmann, 1974; Boch, Shearer, & Petrasovits, 1970;
Wyatt, 1997). However, when the number of individuals in the group is too large to be tracked
reliably by a human observer, only more extreme manifestations of the behavior of interest may
be feasible to track. For example, when assessing defensive bee behavior, sting-attacks may be
recorded, while changes in fairly stereotypical, erratic flight behavior might be scored only
qualitatively (Jones et al., 2012; Pickett, Williams, & Martin, 1982). However, this method does
not provide for a detailed analysis of the temporal progression of erratic flight behavior. To
obtain more precision, individual animal behavior can be tracked and assessed by watching
videos of the recorded behavior and manually scoring behavior on a frame-by-frame basis (Dyer,
Johansson, Helbing, Couzin, & Krause, 2009; Grüter, Kärcher, & Ratnieks, 2011). However, that
type of scoring is tedious, time consuming, and error prone.
Computer systems have been used to automate some of these tasks and do not suffer from
cognitive load, attention, subjectivity, and fatigue limitations (B. R. Martin, Prescott, & Zhu,
1992; P. Martin & Bateson, 1993; Noldus, Spink, & Tegelenbosch, 2001; Olivo & Thompson,
1988; Spruijt & Gispen, 1983). Despite the advantages, existing motion detection software
packages continue to have significant drawbacks that limit their usefulness in studying animal
group behavior.
Some software is not designed for scientific research applications or requires
programming knowledge. One way to assess group activity automatically is to extract the motion
component from a pre-recorded or live video. Software like iVMD (IntelliVision, 2015) can be
used in such a way, but it is designed for efficiently reviewing surveillance footage. Video players
like VLC media player (VideoLAN, 2015) also can show regions that change from frame to
frame. However, without programming against the tool’s application programming interface
(API), the frame-by-frame motion data is difficult to access. Similarly, general-purpose scientific
computing packages like MATLAB (MathWorks, 2015) or Python (Python Software
Foundation, 2015) have been used to extract motion data (Hashimoto, Izawa, Yokoyama, Kato,
& Moriizumi, 1999; Ramazani, Krishnan, Bergeson, & Atkinson, 2007; Togasaki et al., 2005),
but also require programming knowledge.
Other software would be impractical to use with natural scene videos. OpenControl
(Aguiar, Mendonça, & Galhardo, 2007) is open-source software that uses a background-
subtraction algorithm to detect movement and can be used to track the location of single animals
and control maze actuators. However, the software requires setting a motionless reference frame,
against which all other frames are compared. This makes the software difficult to use in natural
environments like forests or open fields where ambient lighting conditions can change due to
wind or clouds, and thus shift the reference frame.
Individual tracking software can become computationally expensive when extended to
large groups of individuals. MCMC (Khan, Balch, & Dellaert, 2006), another software package
implementing accurate algorithms for tracking multiple targets, has been tested with groups of up
to 20 individuals, but it may not be computationally practical for more numerous groups. If many
individuals enter and leave the video scene, tracking them with such software may not be more
useful than simple motion data extraction.
Finally, available commercial software can be expensive and is generally not open-
source. EthoVision (Noldus et al., 2001) is a sophisticated software package with an activity
detection feature (Noldus Information Technology, 2015a), which has been used to assess the
locomotor effects of insecticide on carabid beetles (Tooming, Merivee, Must, Sibul, & Williams,
2014), and the effect of anti-viral drugs on the locomotor activity of ferrets (Oh, Barr, & Hurt,
2015). Though successful, the EthoVision software is expensive, and its source code is not
available for inspection and modification.
To address the problems of cost, platform availability, and customization, and to create a
uniquely tailored user-friendly interface for assessing the temporal progression of animal group
activity in natural environments, we created SwarmSight, an open-source tool that runs on Microsoft
Windows and works with a wide range of video formats.
In the following sections, we describe the algorithm SwarmSight implements, and
demonstrate its validity in detecting motion in a battery of synthetic motion videos. We then
demonstrate a wide range of possible scientific applications by applying SwarmSight to the
assessment of group activity of stingless bees, wild birds, and hissing cockroaches.
Assessing Movement Activity with SwarmSight
The SwarmSight algorithm detects changes in pixel color between video frames. In
computer vision literature, this technique is referred to as background subtraction via thresholded
frame differencing (Courtney, 1997; Hashimoto et al., 1999; Jain, Martin, & Aggarwal, 1979;
Yalamanchili, Martin, & Aggarwal, 1982). The result of the algorithm corresponds well to
motion, because a moving object appears in new pixels and disappears from the pixels it
occupied previously. The changed pixels can be counted, and the number of changed pixels
correlates with the speed and the number of moving objects in view.
Figure 1: Motion vs. changing pixels. An object moving from left to right will result in changed
pixels when the object appears at the new location (column a). If the color difference between
overlapping pixels is below a threshold, those pixels will not change (column b). When the object
disappears from the screen, its previously occupied pixels change (column c).
SwarmSight Algorithm and Implementation
To compute the activity metric of a video, the software uses the ffmpeg (Bellard, 2015)
library to extract frames from video files. One major advantage of SwarmSight is that this
underlying, open-source library supports over 50 video codecs (Bellard, 2015) and enables our
software to read a wide range of video file formats including the common .mov, .avi, and .mp4.
Once the video frames are extracted, SwarmSight computes the activity metric of each
video frame. The activity metric of a frame is the sum of pixel-by-pixel average color changes
from the previous frame. Specifically, each frame, except for the first, is assigned the activity
metric, which is computed as follows. Consider a video with n > 0 frames. Let c(f, x, y)
represent the change in color of the pixel located at coordinates (x, y) of a frame
f ∈ {1, …, n}, and let R(f, x, y), G(f, x, y), and B(f, x, y) represent the 8-bit integer RGB
components of that pixel. Then compute the inter-frame color distance

c(f, x, y) = ( |R(f−1, x, y) − R(f, x, y)| + |G(f−1, x, y) − G(f, x, y)| + |B(f−1, x, y) − B(f, x, y)| ) / 3,

which reflects the pixel's color change from the previous frame. Now, let a_f represent the
activity metric for any frame f > 1, and let t ∈ {0, …, 255} represent a user-defined threshold.
Then compute

a_f = Σ_(x, y) [ c(f, x, y) > t ],

where [·] equals 1 when the condition holds and 0 otherwise.
In other words, for each frame, count the pixels where the color change from the previous frame
exceeds the user threshold. This results in an activity metric that is highly correlated with scene
motion. The threshold value 𝑡 is user-selected and its optimal value highly depends on the
environment depicted in the video. A low threshold will amplify any background movements or
even video compression artifacts, while a high threshold may diminish the desired signal. We
demonstrate how to find an optimal threshold in Experiment 2 below.
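For illustration, the thresholded frame-differencing computation described above can be sketched in a few lines of Python using NumPy. This is a minimal sketch of the technique, not the actual C# implementation used by SwarmSight; frames are assumed to be 8-bit RGB arrays.

```python
import numpy as np

def activity_metric(prev_frame, frame, threshold):
    """Count pixels whose mean RGB change from the previous frame exceeds the threshold.

    Both frames are HxWx3 uint8 arrays; returns the number of changed pixels.
    """
    # Per-channel absolute difference, averaged over the R, G, and B components.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    color_distance = diff.mean(axis=2)
    # A pixel counts as "changed" when its color distance exceeds the threshold.
    return int((color_distance > threshold).sum())

# Example: a white frame, then the same frame with a 5x5 black box added.
prev = np.full((100, 100, 3), 255, dtype=np.uint8)
curr = prev.copy()
curr[10:15, 10:15] = 0  # 5x5 box appears
print(activity_metric(prev, curr, threshold=1))  # 25 changed pixels
```

Note the cast to a signed integer type before subtracting; differencing unsigned 8-bit arrays directly would wrap around instead of producing negative values.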
Software Limitations
The limitations of the software stem from the fact that it measures the aggregate
movement of all the objects in the video scene. This may be undesirable if the movement of
some objects cannot be excluded with a different recording angle or the use of the “Region of
Interest” feature (see below). It should be noted that, unless the video scene contains only one
individual, the software is not designed to measure individual animal activity levels but
aggregate, group activity levels instead.
When analyzing the activity metric produced for each video, users of the software should treat
the activity metric as a relative measure. The absolute value of the metric is informative only
when compared to values obtained from other parts of the same video or from videos
captured under the same conditions. For example, a close-up video of a scene will result in more
pixels changed per frame, while a zoomed-out video of the same scene will register fewer pixels
changed per frame. However, if over the course of the video, the camera perspective and the
motion threshold parameter are not changed, the progression of pixels changed per frame will be
isolated to the progression of movement in the scene.
Low Wind Conditions. If the video depicts flying insects, the background wind should
be minimal. Background wind can cause the flying insects to move involuntarily, which may
affect the activity metric. Additionally, if the scene contains undesirable objects that are moving
due to low wind and are confined to a part of the scene (e.g. moving leaves in a corner), the
“Region of Interest” tool could be used to exclude undesirable motion. In experiments 2 and 3
below, we use the tool to exclude peripheral moving leaves, branches, and sponges.
We note that if the target and undesirable objects overlap, the tool is not able to exclude the
undesirable movement. Depending on the demand from the scientific community, we could add
other types of region inclusion/exclusion features in future versions of the software.
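The effect of a rectangular region of interest can be emulated by cropping both frames before counting changed pixels. The following is a hypothetical sketch in Python/NumPy; the function and parameter names are ours for illustration, not SwarmSight's API.

```python
import numpy as np

def activity_in_roi(prev_frame, frame, threshold, roi):
    """Count changed pixels inside a rectangular region of interest.

    roi is (x, y, width, height) in pixel coordinates; frames are HxWx3 uint8.
    """
    x, y, w, h = roi
    # Crop both frames to the ROI, then apply the usual frame difference.
    prev_crop = prev_frame[y:y + h, x:x + w].astype(np.int16)
    curr_crop = frame[y:y + h, x:x + w].astype(np.int16)
    color_distance = np.abs(curr_crop - prev_crop).mean(axis=2)
    return int((color_distance > threshold).sum())

# A moving "leaf" outside the ROI is ignored; motion inside is counted.
prev = np.full((100, 100, 3), 255, dtype=np.uint8)
curr = prev.copy()
curr[5:10, 5:10] = 0     # leaf motion in the top-left corner, outside the ROI
curr[60:65, 60:65] = 0   # target motion near the center, inside the ROI
print(activity_in_roi(prev, curr, threshold=1, roi=(40, 40, 50, 50)))  # 25
```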
Stationary, Stable Video. The metric will register any movement, including camera
movements, changes in perspective, or zoom. To ensure the metric reflects only the desired
movement, the videos should be shot on a tripod or other stable surface, where only the objects
whose motion is being assessed are moving.
Furthermore, if activity metrics of multiple videos will be compared, then each video
should be shot under the same lighting conditions, perspective, and zoom level. If this is not the
case, differences between the video conditions could confound the changes in the activity metric.
If new videos cannot be taken, it may be possible to adjust the sensitivity threshold and set
equivalent regions of interest for each video to mitigate this problem. The software includes a
region of interest feature, which allows the user to select a rectangular region of the video to
restrict the pixels that will be used for computing the activity metric (See Figure 2). When
comparing a video that was recorded from a closer distance to one that was recorded further
away, the region of interest feature can be used on the distant video to exclude movement
activity from pixels that are not visible in the closer video. Furthermore, the motion threshold
parameter for the distant video would need to be decreased to adjust for the reduction in motion
that is being captured by each camera pixel. This could be achieved by manually calibrating the
threshold on the distant video until baseline movement activity levels recorded in both videos are
the same. Thus, the movement activity can be appropriately compared from videos recorded
under different zoom conditions when only the pixels present in both videos are included and the
zoom-adjusted threshold is used. However, the above calibration procedure will not be effective
when the zoom differences between the compared videos are relatively large (e.g. more than
several degrees). We recommend using it only if the videos cannot be recorded from the same
perspective.
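The manual calibration described above can also be framed programmatically: sweep the threshold on the distant video until its baseline activity best matches the baseline measured in the closer video. The sketch below is hypothetical; the per-frame activity function and the toy baseline values are assumptions for illustration only.

```python
def calibrate_threshold(baseline_frames, target_activity, activity_fn, thresholds=range(256)):
    """Return the threshold whose mean baseline activity is closest to the target.

    baseline_frames: (prev_frame, frame) pairs from a quiet period of the distant video.
    target_activity: mean changed-pixel count from the closer video's baseline.
    activity_fn: function (prev, frame, threshold) -> changed-pixel count.
    """
    def mean_activity(t):
        return sum(activity_fn(p, f, t) for p, f in baseline_frames) / len(baseline_frames)
    # Pick the threshold minimizing the gap between the two baselines.
    return min(thresholds, key=lambda t: abs(mean_activity(t) - target_activity))

# Toy demonstration: activity falls linearly with threshold; target baseline is 60.
toy_activity = lambda prev, frame, t: max(0, 100 - t)
frames = [(None, None)]  # placeholder frame pairs for the toy activity function
print(calibrate_threshold(frames, 60, toy_activity))  # 40
```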
Minimally Compressed Preference. Ideally, the videos should be minimally
compressed. Higher compression tends to introduce compression artifacts, which under low
threshold conditions appear as changed pixels, confounding the activity metric. Native
compression by most digital cameras is acceptable.
Setting Up and Using the Software
Installation and System Requirements. The software can be freely downloaded and
installed from https://github.com/justasb/SwarmSight. It is a Microsoft .NET application written
in C#. The code is open-source and available at
https://github.com/justasb/SwarmSight/tree/master/Source. The software runs on Microsoft
Windows with the .NET 4.5 framework
installed.
Video Requirements. The software uses the ffmpeg library (Bellard, 2015), which
supports a very wide variety of video file formats. Video formats produced by most digital
cameras are supported.
Activity Assessment. Once the software starts, the user can select the video file to
analyze, set the sensitivity threshold, and draw a rectangular region of interest to exclude
unnecessary or spurious movements.
Figure 2: Screenshot of SwarmSight user interface showing an example video and a drawn
region of interest. The sponge moves due to wind, but its motion is excluded because the
“Region of Interest” feature is used to limit the area where motion (yellow squares) is measured.
After the video starts playing, the user can change the Activity Settings (Figure 2), and also
select the “Highlight Motion” checkbox to see which pixels are changing. This option makes it
easy to spot fast-moving objects like flying insects and can serve as an aid when counting them
manually.
As the video plays, a chart below the video screen displays the frame-by-frame logarithm
of the activity metric. Using the logarithm reduces the effect of extreme movement events (the
linear metric is preserved in statistics calculations). To the right of the chart, the raw frame
activity data can be exported into a CSV file for further processing. The CSV file is saved in the
same directory as the video file, with the same filename as the video file, but ending with the
“.csv” file extension. CSV files can be opened with Excel, R, MATLAB, and most other
statistical software packages.
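For example, the exported per-frame data can be loaded and summarized with a few lines of Python. This is a sketch; the CSV column names used here are assumptions and should be checked against the header of an actual export.

```python
import csv
import math

def load_activity(csv_path, column="Activity"):
    """Load the per-frame activity column from a SwarmSight CSV export.

    The column name "Activity" is assumed for illustration; check the real file header.
    """
    with open(csv_path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def summarize(activity):
    """Return the mean activity and a log-scaled series like the one charted in the UI."""
    mean = sum(activity) / len(activity)
    log_series = [math.log10(a + 1) for a in activity]  # +1 avoids log(0) on still frames
    return mean, log_series
```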
Validation Results
To verify that the tool’s activity metric is correlated with object motion, we assessed
motion using both synthetic and live field videos. In addition, we demonstrate that the tool can
be used to detect and quantitatively compare insect swarm activity differences, track the
progression of swarm activity over time, and reveal interesting temporal dynamics in feeding
behavior of birds, and in nest finding behavior of hissing cockroaches.
Experiment 1: Synthetic Motion Videos
Method. To assess whether the tool correlates with motion depicted in a video file, we
created two synthetic videos and used the tool to assess the video motion. The first video (Figure
3) consists of six events, occurring sequentially in one-second intervals. In the first event, a
5x5px box appears on a white background. Then, one second later, the box moves 1px to the
right. After another second, the box moves 2px, then 3px, and the pattern continues until it
moves 5px. The box is colored with graded shades of gray: it is divided into three bands, a
1x5px band of 100% black (Figure 3, Right), a 2x5px band of 50% black, and a 2x5px band of
25% black. Thresholds of 1, 64, 128, and 255 were
used to demonstrate the threshold effect. A threshold of 1 requires a pixel to change only 1 shade
(out of 254) to register as changed. A threshold of 255 is above the theoretical maximum change
of 254 (white to black), and no pixels should be registered as changed.
To show that the software activity metric is proportional to the number of moving
objects, we created a second synthetic video where three different boxes progressively move one
by one, until all three are moving at the same time (Figure 4, Left). In frames 1-2, the three
objects do not move. In frames 3-6, one box starts to move at 1px/frame. In frames 7-11, a
second box starts moving at the same speed and direction. Finally, in frames 12-15, all three
objects are moving. It is expected that with each additional moving object, the number of
changed pixels detected per frame will increase by 10px.
Results. The first test performs as expected. At a threshold of 1 (Figure 3, Center), when
a 5x5 box appears, a change of 25 pixels is registered. When the box moves 1px, a 10px change
is detected, which consists of a 5px change of the leading edge of the box, and a 5px change of
the trailing edge that the box previously occupied. As the threshold is increased to 64, the
software registers 15px when the box appears, reflecting the top two bands of the box: 100%
black and 50% black. At a threshold of 128, only the 5px top black band is registered. At
threshold 255, no pixels are registered. The changes produced by the moving box are registered
correctly.
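These expected counts can be verified directly with a small NumPy reconstruction of the banded box. The sketch below uses the band shades described above (100%, 50%, and 25% black against a white background); it is an illustration, not SwarmSight's own test code.

```python
import numpy as np

# White background; a 5x5 box made of three gray bands:
# 1x5 at 100% black, 2x5 at 50% black, and 2x5 at 25% black.
prev = np.full((20, 20, 3), 255, dtype=np.uint8)
curr = prev.copy()
curr[5, 5:10] = 0       # 100% black band: color distance 255
curr[6:8, 5:10] = 128   # 50% black band: color distance 127
curr[8:10, 5:10] = 191  # 25% black band: color distance 64

def changed_pixels(a, b, threshold):
    diff = np.abs(b.astype(np.int16) - a.astype(np.int16)).mean(axis=2)
    return int((diff > threshold).sum())

for t in (1, 64, 128, 255):
    print(t, changed_pixels(prev, curr, t))
# 1 -> 25, 64 -> 15, 128 -> 5, 255 -> 0
```

At threshold 64 the 25% band (distance exactly 64) is excluded because the comparison is strict, matching the 15px result reported above.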
Figure 3: Effects of threshold on appearance and movement of 5x5px multi-shade box. Left: The
pattern of appearance and movement of the box. Center: Resulting changed pixels as registered
by the SwarmSight algorithm. Right: Illustration of the multi-shade regions of the box and
predicted threshold detection regions.
The second test with multiple moving objects also demonstrates expected behavior. As
the first 5x5 box starts to move, the software registers a 10px/frame change in pixels (Figure 4).
As each additional object starts to move, the activity metric increases by 10px/frame, showing a
change of 30px/frame when all three objects are moving.
Figure 4: Activity metric vs. number of moving objects. Left: Three objects are moved to the
right at 1px/frame in succession. Right: The number of changed pixels detected by SwarmSight
is proportional to the number of moving boxes.
Experiment 2: Threshold Selection in Stingless Bee Video
Method. One useful property of the software is that it enhances the visibility of moving
objects in videos. To demonstrate this with a flying insect swarm, we tested how different
threshold levels affect the visibility of flying stingless bees Tetragonisca angustula when
compared to unprocessed video frames. We tested two videos of a hive of T. angustula, one
showing a zoomed-in version of the hive (Figure 5, Top row) and one showing a zoomed-out
version (Figure 5, Bottom row). The threshold was set using the Motion Threshold slider control
available in SwarmSight (Figure 2, Top Right).
Result. The results demonstrate that the software can be used to distinguish flying insects
from the background. In Figure 5, the control, unprocessed video frames can be seen on the right,
with the motion-detected frames on the left at various thresholds showing detected motion in
yellow.
Figure 5: Examples of detected bee motion in yellow. A threshold between 20 and 50 provides
the best signal to noise ratio in emphasizing flying stingless bees in these videos.
Specifically, note that when using low thresholds (Figure 5, Left, 5), the software will
register noise that is not indicative of insect motion, and when using high thresholds (Figure 5,
Right, 100), it will not emphasize motion enough. At medium-range thresholds, the bees are
easily visible. For example, in the top row of Figure 5, which shows the zoomed-in view at
threshold 20, the bees' wings and bodies are outlined in yellow. In contrast, at threshold 50 in the
bottom row, the small bees are visible as yellow dots. As mentioned in the previous section,
because the threshold can be chosen by the user, a researcher can play the video in question and adjust
the threshold via the slider to discover the optimal threshold for each video. Once discovered, the
same threshold can be used to compare activity present in several videos shot under the same
conditions.
Experiment 3: Detection of Change in Activity and Its Progression
Method. SwarmSight software was used to assess how flying insect activity changes
over time in response to treatment. Here we examine a video of an entrance to a T. angustula
hive before and after a treatment with a chemical mixture of the stingless bee’s alarm
pheromone: Citral (Sigma Aldrich, B1334), 6-methyl-5-hepten-2-one (Sigma Aldrich, M48805),
and benzaldehyde (Sigma Aldrich, B1334) (Wittmann, Radtke, Zeil, Lübke, & Francke, 1990).
We use the SwarmSight software to assess bee swarm activity during the 30-second video and
observe its progression over time. To confirm that the activity metric is proportional to actual
flying bees, we compare SwarmSight results to the number of visible flying bees in the video
frames sampled at 1-second intervals.
To test whether the software can detect significant changes in behavior from nest
entrance videos, we use the software to analyze 107 T. angustula nest entrance videos consisting
of control and treatment groups. The control groups were administered behaviorally inert mineral
oil (Breed, Guzmán-Novoa, & Hunt, 2004), while the treatment groups were administered the
alarm pheromone mixture. The average activity metrics of the two groups are compared.
To demonstrate that the software can be used for measuring the progression of group
activity of birds and other insects, we used videos of wild birds and hissing cockroaches. The
wild bird video shows rock doves (Columba livia), mourning doves (Zenaida macroura), and
Gambel's quail (Callipepla gambelii) discovering and consuming newly placed wild bird food
(Global Harvest Foods, Ltd., 2015), which was spread evenly across a 6’ x 6’ area on concrete
pavement. This video was chosen because it contains a period without any birds as well as three
phases showing waves of arriving birds. To demonstrate that SwarmSight can detect such arrival
waves, the activity metric is expected to contain distinct, corresponding increases in activity
during each arrival wave.
Finally, to demonstrate that the software can be used with non-flying animals, we
selected a video of a group of Madagascar hissing cockroaches (Gromphadorhina portentosa)
locating an artificial nest. The video scene displays a round dish containing 4 insects; the
center of the dish is covered, forming the artificial nest. The insects were placed outside of the nest and allowed
to crawl freely. The video contained two phases of insect activity separated by a long phase of
relative dormancy. We expect SwarmSight results to indicate the three phases.
Results. The results of the activity progression experiment show a clear increase in
detected movement activity of the bees after the treatment with alarm pheromone (Figure 6, Left,
small dots). Furthermore, the progression of the response can be observed shortly after the
treatment. The increase was confirmed visually from the motion-annotated frames captured
before (Figure 6, Right Top) and after (Figure 6, Right Bottom) the treatment as well as
manually counted bee numbers (Figure 6, Left, large dots).
Figure 6: Progression of the activity metric before and after treatment of a stingless bee colony
with alarm pheromone. Small dots: changed pixels/frame, Large dots: manually counted number
of flying bees. Right: motion-annotated frames showing flying bees.
The results of the experiment comparing the average activity of behaviorally inert
mineral oil (MO) condition videos to the average activity of alarm pheromone (AP) condition
videos show that significant differences in behavior can be detected using the SwarmSight
software. The AP treatment and control group (MO) activity averages of 107 videos showed
significant differences (n = 107, 2-tailed t-test p < 0.003). Figure 7 shows a screenshot of
SwarmSight computations for the average activity levels of the two videos (showing one MO
video in the top graph, and an AP video in the middle graph) as well as the details of the activity
comparison results (bottom panel).
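A group comparison of this kind can be reproduced outside the tool from two lists of per-video mean activities. The sketch below computes Welch's two-sample t statistic in plain Python with toy numbers invented for illustration; in practice a statistics package (e.g. SciPy's `ttest_ind`) would also report the p-value.

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    se = (va / na + vb / nb) ** 0.5  # standard error of the difference in means
    return (mean(sample_a) - mean(sample_b)) / se

# Toy example: alarm-pheromone (AP) videos show higher mean activity
# than mineral-oil (MO) controls. These numbers are illustrative only.
ap = [120.0, 135.0, 150.0, 128.0]
mo = [40.0, 55.0, 48.0, 60.0]
print(round(welch_t(ap, mo), 2))
```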
Figure 7: SwarmSight results comparing responses to mineral oil vs. alarm pheromone. Top
right graph shows the activity progression for mineral oil (control) treatment, middle right graph
shows activity progression for alarm pheromone treatment. The regions highlighted in blue show
the ranges of frames selected for comparison. The middle graph excludes the large spike in
activity due to the movement of a hand that administers the treatment. Bottom panel shows
graphical comparison of the two videos and the details of the comparison results.
The activity metric produced from the wild bird video (Figure 8) makes it easy to
establish the onset of feeding, its duration including any interruptions, and progression of the
feeding behavior.
Figure 8: Wild bird feeding activity over time. A) Signal showing no activity. B) The onset of
feeding as discovered by the birds shows up as a gradual increase of the activity metric. C) An
interruption of feeding and group take off shows up as a large spike in activity metric. D) The
main feeding phase where waves of birds land and take off (spikes) at the food source. Note the
gradual increase and decrease in overall activity as the food source is exhausted. E) A false
landing by the birds even though the food source is exhausted. F) Activity of lingering birds at
the end of feeding.
SwarmSight can also measure the progression of hissing cockroach activity during nest finding. As expected, Figure 9 shows a series of activity bursts, a period of dormancy, and a final activity burst.
Figure 9: Hissing cockroach activity over time. A & C) Phases of the hissing cockroach video showing bursts of activity. B) A phase of relative dormancy.
Experiment 4: Performance Assessment
Method. To assess the performance of SwarmSight, we measured the number of frames per second processed by the program. The threshold was set to 17 and the quality to 100%; the testing system was an Apple MacBook Air laptop with a 2 GHz Intel Core i7 processor and 8 GB RAM, running Windows 7 Professional in a Parallels Desktop virtual machine.
Results. We find that for high-definition videos filmed at the 1080p standard (1920 px by 1080 px), the frame rate is in the 9-11 fps range (Table 1), while for medium-sized videos (640 px by 480 px), the frame rate increases to 61 fps. Overall, the program processes about 20.8 million pixels per second.
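The resolution-to-frame-rate relationship follows from a roughly constant pixel throughput: 1920 × 1080 × 10 fps ≈ 20.7 million pixels per second, and 640 × 480 × 61 fps ≈ 18.7 million. The sketch below extrapolates the frame rate at other resolutions from the ~20.8 Mpx/s figure; this is a back-of-the-envelope estimate, not a documented SwarmSight guarantee:

```python
# Pixel throughput implied by the two measured frame rates
hd_px_per_s = 1920 * 1080 * 10   # ~10 fps on 1080p frames
sd_px_per_s = 640 * 480 * 61     # 61 fps on 640x480 frames

def predicted_fps(width, height, throughput=20_800_000):
    """Rough frame-rate estimate for a given resolution, assuming
    the program's throughput of ~20.8 million pixels per second."""
    return throughput / (width * height)
```

For example, this predicts roughly 22-23 fps for 720p (1280 × 720) video.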
Table 1: The number of frames per second processed by SwarmSight for videos of different resolutions.
Discussion
SwarmSight measures movement in a video scene and was demonstrated to automate the quantitative assessment of the temporal progression of activity levels of flying insect swarms, bird flocks, and other animal groups. It has also been shown to be useful for detecting hard-to-see flying insects and can assist traditional methods of insect tracking and counting. In general, the software is most useful in situations where changes in aggregate movement over time in a region of a video provide information valuable to investigators.
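The aggregate-movement measurement described above can be illustrated with a minimal frame-differencing sketch. This is a simplified stand-in for exposition; SwarmSight's actual background-subtraction implementation differs in detail:

```python
import numpy as np

def activity_series(frames, threshold=17):
    """Per-frame activity: the count of pixels whose grayscale
    intensity changed by more than `threshold` since the previous
    frame (a minimal background-subtraction sketch).

    frames: iterable of 2-D grayscale or 3-D RGB pixel arrays.
    """
    activity = []
    prev = None
    for frame in frames:
        gray = np.asarray(frame, dtype=float)
        if gray.ndim == 3:  # collapse RGB to grayscale
            gray = gray.mean(axis=2)
        if prev is not None:
            changed = np.abs(gray - prev) > threshold
            activity.append(int(changed.sum()))
        prev = gray
    return activity
```

Plotting the returned series over time yields an activity progression analogous to the graphs in Figures 7-9.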
In addition to assessing insect and bird group behavior, we expect the software to be useful for assessing human group behavior as well. For example, a video of pedestrians walking on a street or sidewalk, filmed from an upper floor of a building, could be analyzed using the software. If pedestrian traffic is low, individual pedestrians could be distinguished by sudden increases and subsequent decreases in pixel change activity. During high traffic flow, changes in relative pedestrian activity levels over time could be assessed and compared. The same method for measuring street pedestrian traffic could be used to measure walking shopper behavior in retail stores or malls. For example, a section of a hallway or an aisle could be video-recorded and baseline shopper movement activity assessed. Various interventions, such as the placement or alteration of signage, could then be assessed by comparing their effects on the video
activity levels. A similar technique could be used to assess and track the changes in activity of
automobile traffic.
Furthermore, SwarmSight could be used to test theoretical models of swarm or large-group behavior. For example, a group-behavior model described by Couzin, Krause, Franks, and Levin (2005) predicts that only a small fraction of individuals in a group need to possess information in order to influence the behavior of the larger group. In Dyer et al. (2009), just 5% of individuals in a large, 200-person group needed to be informed for the larger group to converge on the correct target. Videos recording the movement of such individuals could be analyzed using SwarmSight. Analysis of a video of the whole group would be expected to show the group's movement activity over time. This would reflect the aggregate speed of the group and would show when the group starts and stops moving. Additionally, any transient increases or decreases in group speed would be reflected in the pixel change signal. Such analysis could complement manually performed measures of group movement and its change over time.
The leadership behavior of simulated agent-based flocks, such as those in Quera et al. (2010), may also be assessed with SwarmSight. Although computer simulations benefit from the ability to observe any of the simulated state variables (Kelton & Law, 2000), some very large or complex simulations may not be feasible or cost-effective to re-simulate. In such cases, if a video of the completed simulation is available, and the progression of movement in a region of the video contains information useful to the investigators, then our software could be used to analyze it. In this way, an aggregate movement metric could be extracted from a video of a complex simulation without re-running it.
One advantage of SwarmSight is its simple, easy-to-use interface, which makes it possible to learn the tool very quickly and enables its use as a teaching tool. For example,
graduate and undergraduate students in an Animal Behavioral Methods class at Arizona State University received a 10-minute presentation about bees and the software. Following the presentation, the students were able to use the software effectively to analyze a video of a stingless bee nest raid. The presentation and the example video can be downloaded from https://github.com/justasb/SwarmSight.
Overall, SwarmSight provides a powerful yet easy-to-use tool for assessing motion in natural and laboratory video scenes, with applications in behavioral experiments and as a teaching aid for classroom-based behavioral exercises.
In the future, the authors plan to add additional filters, incorporate machine learning classifiers for more advanced insect tracking and behavior classification, and utilize on-board graphics processing units (GPUs) to increase performance. For example, supplying a support vector machine (Cortes & Vapnik, 1995), trained on previously labeled examples of target insects, birds, humans, or other animals, with pixel color and motion data could enable the software to count the number of target individuals per frame. In some cases, this metric would be even more useful than the aggregate movement metric of the current version. Furthermore, a GPU implementation of the above algorithm may execute one to two orders of magnitude faster than the CPU implementation (Asano, Maruyama, & Yamaguchi, 2009).
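The proposed classification step can be sketched with scikit-learn's SVC. The per-pixel features (color channels plus frame-to-frame change), the labels, and the training data below are all invented for illustration; they are not part of SwarmSight:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical per-pixel features: (R, G, B, frame-to-frame change).
# Labels: 1 = pixel belongs to a target animal, 0 = background.
X_train = np.array([
    [240, 200, 40, 90], [230, 190, 35, 80],  # bright, moving -> animal
    [30, 60, 20, 2],    [25, 55, 18, 1],     # dark, static   -> background
])
y_train = [1, 1, 0, 0]

clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

def count_target_pixels(pixel_features, clf=clf):
    """Count pixels classified as belonging to target individuals
    in one frame (an illustrative stand-in for per-frame counting)."""
    return int(sum(clf.predict(pixel_features)))
```

In a real system, pixel-level predictions would still need to be grouped into connected regions before counting individuals.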
Acknowledgments
Justas Birgiolas was supported by an Arizona State University Interdisciplinary Graduate Program in Neuroscience Fellowship. Christopher Jernigan was supported by Arizona State University (ASU) and a Smithsonian Tropical Research Institute (STRI) Short Term Fellowship Award. Brian Smith was supported by NIH-NIDCD grant DC007997. We thank David W. Roubik for providing the Tetragonisca angustula bee hives that Christopher Jernigan filmed
for part of this experiment. We also thank Steven Pratt for the Madagascar hissing cockroach videos and Karen Hastings for the wild bird videos. We thank the anonymous reviewers who helped improve an earlier version of this article. We declare no known conflicts of interest.
References
Aguiar, P., Mendonça, L., & Galhardo, V. (2007). OpenControl: A free open-source software for video tracking and automated control of behavioral mazes. Journal of Neuroscience Methods, 166(1), 66–72.
Altmann, J. (1974). Observational study of behavior: sampling methods. Behaviour, 49(3), 227–
266.
Asano, S., Maruyama, T., & Yamaguchi, Y. (2009). Performance comparison of FPGA, GPU and CPU in image processing. In 2009 International Conference on Field Programmable Logic and Applications (pp. 126–131).
Bellard, F. (2015). Ffmpeg documentation. Retrieved from
https://www.ffmpeg.org/documentation.html ([Online; accessed 28-April-2015])
Boch, R., Shearer, D., & Petrasovits, A. (1970). Efficacies of two alarm substances of the honey bee. Journal of Insect Physiology, 16(1), 17–24. doi: 10.1016/0022-1910(70)90108-3
Breed, M. D., Guzmán-Novoa, E., & Hunt, G. J. (2004). Defensive behavior of honey bees: Organization, genetics, and comparisons with other bees. Annual Review of Entomology, 49(1), 271–298.
Collett, M., Despland, E., Simpson, S. J., & Krakauer, D. C. (1998). Spatial scales of desert
locust gregarization. Proceedings of the National Academy of Sciences, 95(22), 13052–
13055.
Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine learning, 20(3), 273–297.
Courtney, J. D. (1997). Automatic video indexing via object motion analysis. Pattern
Recognition, 30(4), 607–625.
Couzin, I. D., Krause, J., Franks, N. R., & Levin, S. A. (2005). Effective leadership and decision-
making in animal groups on the move. Nature, 433(7025), 513–516.
Cushing, D., & Jones, F. (1968). Why do fish school? Nature, 218(5145), 918–920.
Roubik, D. W. (2006). Stingless bee nesting biology. Apidologie, 37, 124–143.
Dyer, J. R., Johansson, A., Helbing, D., Couzin, I. D., & Krause, J. (2009). Leadership,
consensus decision making and collective behaviour in humans. Philosophical
Transactions of the Royal Society B: Biological Sciences, 364(1518), 781–789.
Global Harvest Foods, Ltd. (2015). Audubon park wild bird food. Retrieved from
http://www.ghfoods.com/product.asp?pc=2120 ([Online; accessed 28-April-2015])
Grüter, C., Kärcher, M., & Ratnieks, F. (2011). The natural history of nest defence in a stingless bee, Tetragonisca angustula (Latreille) (Hymenoptera: Apidae), with two distinct types of entrance guards. Neotropical Entomology, 40(1), 55–61.
Hashimoto, T., Izawa, Y., Yokoyama, H., Kato, T., & Moriizumi, T. (1999). A new video/computer method to measure the amount of overall movement in experimental animals (two-dimensional object-difference method). Journal of Neuroscience Methods, 91(1), 115–122.
Higdon, J., & Corrsin, S. (1978). Induced drag of a bird flock. American Naturalist, 727–744.
IntelliVision. (2015). Intelligent video motion detector. Retrieved from http://www.intelli-vision.com/products/intelligent-video-analytics/intelligent-video-motion-detector ([Online; accessed 28-April-2015])
Jain, R., Martin, W., & Aggarwal, J. (1979). Segmentation through the detection of changes due
to motion. Computer Graphics and Image Processing, 11(1), 13–34.
Jones, S. M., van Zweden, J. S., Grüter, C., Menezes, C., Alves, D. A., Nunes-Silva, P., . . . Ratnieks, F. L. (2012). The role of wax and resin in the nestmate recognition system of a stingless bee, Tetragonisca angustula. Behavioral Ecology and Sociobiology, 66(1), 1–12.
Kelton, W. D., & Law, A. M. (2000). Simulation modeling and analysis. McGraw Hill Boston.
Khan, Z., Balch, T., & Dellaert, F. (2006). MCMC data association and sparse factorization updating for real time multitarget tracking with merged and multiple measurements. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(12), 1960–1972.
Lindsey, R. (2012). Road pricing and investment. Economics of transportation, 1(1), 49–63.
Martin, B. R., Prescott, W., & Zhu, M. (1992). Quantitation of rodent catalepsy by a computer-
imaging technique. Pharmacology Biochemistry and Behavior, 43(2), 381–386.
Martin, P., & Bateson, P. P. G. (1993). Measuring behaviour: an introductory guide. Cambridge
University Press.
MathWorks. (2015). Matlab - the language of technical computing. Retrieved from
http://www.mathworks.com/products/matlab/ ([Online; accessed 28-April-2015])
McGlynn, T. P. (2012). The ecology of nest movement in social insects. Annual review of
entomology, 57, 291–308.
Noldus, L. P., Spink, A. J., & Tegelenbosch, R. A. (2001). EthoVision: A versatile video tracking system for automation of behavioral experiments. Behavior Research Methods, Instruments, & Computers, 33(3), 398–414.
Noldus Information Technology. (2015). Gathering data. Retrieved from
http://www.noldus.com/EthoVision-XT/Gathering-data ([Online; accessed 28-April-
2015])
Oh, D. Y., Barr, I. G., & Hurt, A. C. (2015). A novel video tracking method to evaluate the effect of influenza infection and antiviral treatment on ferret activity. PLoS ONE, 10(3), e0118780.
Olivo, R. F., & Thompson, M. C. (1988). Monitoring animals’ movements using digitized video
images. Behavior Research Methods, Instruments, & Computers, 20(5), 485–490.
Pickett, J., Williams, I., & Martin, A. (1982). (Z)-11-eicosen-1-ol, an important new pheromonal component from the sting of the honey bee, Apis mellifera L. (Hymenoptera, Apidae). Journal of Chemical Ecology, 8(1), 163–175. doi: 10.1007/BF00984013
Python Software Foundation. (2015). Welcome to python.org. Retrieved from
https://www.python.org/ ([Online; accessed 28-April-2015])
Quera, V., Beltran, F. S., & Dolado, R. (2010). Flocking behaviour: agent-based simulation and
hierarchical leadership. Journal of Artificial Societies and Social Simulation, 13(2), 8.
Ramazani, R. B., Krishnan, H. R., Bergeson, S. E., & Atkinson, N. S. (2007). Computer automated movement detection for the analysis of behavior. Journal of Neuroscience Methods, 162(1), 171–179.
Spruijt, B. M., & Gispen, W. H. (1983). Prolonged animal observation by use of digitized
videodisplays. Pharmacology Biochemistry and Behavior, 19(5), 765–769.
Togasaki, D. M., Hsu, A., Samant, M., Farzan, B., DeLanney, L. E., Langston, J. W., . . . Quik, M. (2005). The webcam system: A simple, automated, computer-based video system for quantitative measurement of movement in nonhuman primates. Journal of Neuroscience Methods, 145(1), 159–166.
Tooming, E., Merivee, E., Must, A., Sibul, I., & Williams, I. (2014). Sub-lethal effects of the neurotoxic pyrethroid insecticide Fastac 50EC on the general motor and locomotor activities of the non-targeted beneficial carabid beetle Platynus assimilis (Coleoptera: Carabidae). Pest Management Science, 70(6), 959–966.
U.S. Department of Commerce. (2015). Quarterly retail e-commerce sales, 3rd quarter 2015. Retrieved from https://www.census.gov/retail/mrts/www/data/pdf/ec_current.pdf ([Online; accessed 28-November-2015])
VideoLAN. (2015). VideoLAN - official page for VLC media player, the open source video framework! Retrieved from http://www.videolan.org/vlc/ ([Online; accessed 28-April-2015])
Wittmann, D., Radtke, R., Zeil, J., Lübke, G., & Francke, W. (1990). Robber bees (Lestrimelitta limao) and their host: Chemical and visual cues in nest defense by Trigona (Tetragonisca) angustula (Apidae: Meliponinae). Journal of Chemical Ecology, 16(2), 631–641.
Wyatt, T. (1997). Methods in studying insect behaviour. Methods in Ecological and Agricultural
Entomology. CAB International, Wallingford, UK, 27–56.
Yalamanchili, S., Martin, W., & Aggarwal, J. (1982). Extraction of moving object descriptions via differencing. Computer Graphics and Image Processing, 18(2), 188–201.
... In the last decade, methods for observing animal behavior have been greatly accelerated by advances in high-resolution video cameras, computer processing speeds, and machine vision algorithms. Tasks like animal detection, counting, tracking, and place preference analyses have been aided with sophisticated software that can process videos of animal behavior and extract relevant measures [39][40][41][42][43][44][45][46][47] . ...
... With the method described here, we extended our previous work on motion analysis software 41 to track the locations of insect antennae and proboscis with the following goals: (1) no requirement for special hardware or complex animal preparation, (2) frame processing at real-time (30 frames per second or faster) on a conventional computer, (3) ease of use, and (4) open-source, easily extendable code. ...
... Briefly, the software works by using a set of motion filters 53 and a relaxed flood fill algorithm 54 . To find likely antenna points, two filters are used: a 3-consecutive-frame difference filter 41,55 and a median-background subtraction 56 filter. A color distance threshold filter is used for proboscis point detection. ...
Preprint
Full-text available
Many scientifically and agriculturally important insects use antennae to detect the presence of volatile chemical compounds and extend their proboscis during feeding. The ability to rapidly obtain high-resolution measurements of natural antenna and proboscis movements and assess how they change in response to chemical, developmental, and genetic manipulations can aid the understanding of insect behavior. By extending our previous work on assessing aggregate insect swarm or animal group movements from natural and laboratory videos using video analysis software SwarmSight, we developed a novel, free, and open-source software module, SwarmSight Appendage Tracking ( SwarmSight.org ) for frame-by-frame tracking of insect antenna and proboscis positions from conventional web camera videos using conventional computers. The software processes frames about 120 times faster than humans, performs at better than human accuracy, and, using 30 frames-per-second videos, can capture antennal dynamics up to 15 Hz. We used the software to track the antennal response of honey bees to two odors and found significant mean antennal retractions away from the odor source about 1 s after odor presentation. We observed antenna position density heat map cluster formation and cluster and mean angle dependence on odor concentration.
... Nowadays, high-speed videography is a pivotal tool in many ecological (e.g. Klaassen et al., 2014;Birgiolas et al., 2017;Simão et al., 2019), kinematic (e.g. Bostwick and Prum, 2003;Weihmann, 2013) or biomechanical (e.g. ...
... Van Wassenbergh et al., 2008;Larabee et al., 2017;Reichel et al., 2019;Feller et al., 2020) studies. A widespread method to extract information from this kind of video footage is video tracking, and a plethora of tools is available (Birgiolas et al., 2017;Harmer and Thomas, 2019). Once calibrated, this kind of non-contact methodology calculates the position of the animal during its movement via a tracking algorithm (e.g. ...
... For different research questions, different software might be advantageous. In many ecological or behavioral studies many individuals need to be tracked simultaneously, or general activity has to be measured (Birgiolas et al., 2017;Harmer and Thomas, 2019). In these cases, other workflows are more applicable, for example the MCMC package by Khan et al. (Khan et al., 2006) which is designed to track higher amounts of individuals or software based on analysing general activity, such as surveillance footage analysis software (for example iVMD (In-telliVision, 2015), see (Birgiolas et al., 2017) or pathtrackR (Harmer and Thomas, 2019)). ...
Article
Analysing the motion of animals, especially at high speeds, is often challenging. Motion tracking software needs to deal with a variety of visual contexts, variable lighting conditions, heterogeneous backgrounds and even background movements. Here we present motion tracking via the easy to use and constantly updated Adobe After Effects software - which is often included in software packages most researchers are already using. The provided custom-made Javascript allows for easy exporting of tracking coordinates. Furthermore, some examples for analysing the obtained data in the open source statistical software 'R' will provide reference points, even for an unexperienced user. We present a step-by-step guide of the methodology using high-speed video recordings of locust jumps and additionally validate this method by successful tracking of simulated data under defined subpar filming conditions. This simulated data allows experienced users to compare the tracking software in use with the here presented workflow to weigh the advantages and disadvantages of any motion tracking software on the market.
... We assessed overall colony activity using SwarmSight software (Birgiolas et al. 2016; Table 1). SwarmSight analyzes pixel changes between frames over a set time. ...
... We used SwarmSight to simultaneously quantify pixel changes during the mineral oil presentation and the alarm stimulus presentation. These bouts were filmed within 10 min of each other from the same angle and distance, and we only collected these data when wind and other factors did not affect observed movement (see Birgiolas et al. 2016 for further information). Thus, we could use the mineral oil as a baseline control to quantify potential activity changes in the responses to alarm pheromone treatments. ...
... Thus, a positive number indicates more bees are entering than exiting, and a negative number indicates more bees are exiting than entering. Activity A measure calculated using the SwarmVision software (Birgiolas et al. 2016) to quantify pixel changes over a 1-min window within a video. In all cases, we used a relative change between paired observations by subtracting the pixel changes during mineral oil stimulation from the pixel changes during synthetic alarm pheromone stimulation. ...
Article
Full-text available
In ants, bees, and other social Hymenoptera, alarm pheromones are widely employed to coordinate colony nest defense. In that context, alarm pheromones elicit innate species-specific defensive behaviors. Therefore, in terms of classical conditioning, an alarm pheromone could act as an unconditioned stimulus (US). Here, we test this hypothesis by establishing whether repeated exposure to alarm pheromone in different testing contexts modifies the alarm response. We evaluate colony-level alarm responses in the stingless bee, Tetragonisca angustula, which has a morphologically distinct guard caste. First, we describe the overall topology of defense behaviors in the presence of an alarm pheromone. Second, we show that repeated, regular exposure to synthetic alarm pheromone reduces different components of the alarm response, and memory of that exposure decays over time. This observed decrease followed by recovery occurs over different time frames and is consistent with behavioral habituation. We further tested whether the alarm pheromone can act as a US to classically condition guards to modify their defense behaviors in the presence of a novel (conditioned) stimulus (CS). We found no consistent changes in the response to the CS. Our study demonstrates the possibility that colony-level alarm responses can be adaptively modified by experience in response to changing environmental threats. Further studies are now needed to reveal the extent of these habituation-like responses in regard to other pheromones, the potential mechanisms that underlie this phenomenon, and the range of adaptive contexts in which they function at the colony level. Significance statement Pheromones are classically thought to elicit stereotyped action patterns. Here, we test the idea that responses to pheromones are plastic and show characteristics of an unconditioned stimulus. 
This study demonstrates clear non-associative plasticity in the colony-level response to alarm pheromone, in the stingless honey bee, Tetragonisca angustula. Colonies of T. angustula show habituation-like responses across multiple measures to repeated stimulation of their alarm pheromone. We therefore demonstrate that colony-level responses to pheromones are adaptively plastic. Finally, we failed to demonstrate colony-level conditioning using alarm pheromone as the unconditioned stimulus; however, these findings and others warrant further investigation.
... To broaden the use of video monitoring to assess ferret lethargy, a more affordable video analysis programme is needed. Swarm Sight is a free open-access software programme which was previously validated for assessing behavioural changes in bees, wild birds and insects [6]. In this study, we sought to investigate the use of Swarm Sight, compared to EthoVision ® XT, to assess changes in activity level in ferrets infected with influenza A or B viruses. ...
... Activity was measured using the free open-access motion analysis programme, Swarm Sight [6]. Video files (AVI. ...
Article
Full-text available
As an animal model of influenza, ferrets are uniquely capable of displaying clinical signs of illness similar to those of human influenza virus infection. To quantify lethargy, we previously established video monitoring as a more sensitive method than the commonly used manual scoring methodology for ferrets infected with influenza virus. Video monitoring is simple to set-up, but its adoption by other laboratories is restricted by the need to purchase costly commercial software, EthoVision® XT, to analyse activity. To broaden the use of video monitoring method in ferrets, we assessed Swarm Sight, a free open-access programme, for analysing activity changes in ferrets infected with seasonal influenza A(H1N1), A(H1N1) pdm09, A(H3N2) and B virus. Swarm Sight could differentiate between the various levels of lethargy associated with the infection of different influenza virus subtypes to a similar degree to EthoVision® XT. However, one major limitation of Swarm Sight is that it does not permit high throughput analysis, which considerably increases the time required to process video clips from experiments involving large numbers of ferrets. Despite this limitation, the open-accessibility and comparable results to EthoVision® XT make Swarm Sight a good alternative for researchers interested in using video monitoring to measure lethargy in ferrets.
... In the last decade, methods for observing animal behavior have been greatly accelerated by advances in high-resolution video cameras, computer processing speeds, and machine vision algorithms. Tasks like animal detection, counting, tracking, and place preference analyses have been aided with sophisticated software that can process videos of animal behavior and extract relevant measures 39,40,41,42,43,44,45,46,47 . These technologies have also aided tracking of insect antenna and proboscis movements. ...
... Briefly, the software works by using a set of motion filters 53 and a relaxed flood fill algorithm 54 . To find likely antenna points, two filters are used: a 3-consecutive-frame difference filter 41,55 and a median-background subtraction 56 filter. A color distance threshold filter is used for proboscis point detection. ...
Article
Full-text available
Many scientifically and agriculturally important insects use antennae to detect the presence of volatile chemical compounds and extend their proboscis during feeding. The ability to rapidly obtain high-resolution measurements of natural antenna and proboscis movements and assess how they change in response to chemical, developmental, and genetic manipulations can aid the understanding of insect behavior. By extending our previous work on assessing aggregate insect swarm or animal group movements from natural and laboratory videos using the video analysis software SwarmSight, we developed a novel, free, and open-source software module, SwarmSight Appendage Tracking (SwarmSight.org) for frame-by-frame tracking of insect antenna and proboscis positions from conventional web camera videos using conventional computers. The software processes frames about 120 times faster than humans, performs at better than human accuracy, and, using 30 frames per second (fps) videos, can capture antennal dynamics up to 15 Hz. The software was used to track the antennal response of honey bees to two odors and found significant mean antennal retractions away from the odor source about 1 s after odor presentation. We observed antenna position density heat map cluster formation and cluster and mean angle dependence on odor concentration. © 2017, Journal of Visualized Experiments. All rights reserved.
... Boris freeware (Friard and Gamba 2016) was used to indicate and label the laying hens' natural and aggressive behaviour, in addition to calculate the frequency and duration of the behaviours. The Swarmsight freeware (Birgiolas et al. 2017) was used to calculate the laying hens' activity index . ...
Article
Full-text available
Information on the subtle changes in behaviour of poultry birds to varying lighting conditions and other environmental factors offers ways to enhance the welfare and egg production. This study examined the effect of light colours on the cumulative activity index and behaviours of 52 Super Nick chickens with the age of days one to 9 weeks old, divided into three pens. Twelve chickens were randomly assigned to the control pen (2 x 2 meters width) with exposure to white light (WL). Twenty hens were randomly distributed to each of the two cages (4 x 4 meters width), where treatment was either exposed to blue (BL) or red light (RL) depending on the lighting schedule. The behaviours were recorded in the morning, midday, and afternoon. The perching sticks were installed in week 5. BORIS software was employed for analysing the videos to label the duration and frequency of the behaviours while Swarmsight program was used to calculate the cumulative activity index. The results showed that cumulative activity index, active behaviours (walking, running, eating, drinking, standing and food pecking), comfort behaviours (perching, wing flapping, body shaking, and head-scratching) were higher (P < 0.05) under WL than under BL and RL. The preening, sitting, and sleeping of the BL group were higher (P < 0.05) than WL and RL. Aggressive and gentle feather pecking frequencies were higher (P < 0.05) in the RL group than in the WL and BL group. The chickens preferred to dustbathe in the afternoon. After the perches were installed, foraging, leg stretching, and standing frequency and the sitting and sleeping duration declined; however, the perching, eating duration and running, food pecking frequency increased. With each passing week, aggressive pecking frequency, walking, perching, and drinking duration increased, but leg stretching and sitting duration declined. 
The results provide a comprehensive understanding of the influence of light colours on the behaviours of laying hens and thereby offers solutions for designing light-based platforms to enhance welfare.
... Bees do not use their antennae to trace scent trails in the way ants do. Previous studies on the characteristics of bee's antennae motion are scarce and have tracked antennal movements in only two dimensions, with a low temporal sampling rate and with relatively few odorants ( [39][40][41][42][43][44] ). Some of their results were contradictory, with Erber et al ( 40 ) and Suzuki ( 39 ) reporting that antennae moved toward the source of odor whereas Chole et al ( 43 ) and Birgiolas et al ( 44 ) reporting the antenna as moving away from the odor before conditioning. ...
Preprint
Full-text available
When sampling odors, many insects are moving their antennae in a complex but repeatable fashion. Previous works with bees have tracked antennal movements in only two dimensions, with a low sampling rate and with relatively few odorants. A detailed characterization of the multimodal antennal movement patterns as function of olfactory stimuli is thus wanting. The aim of this study is to test for a relationship between the scanning movements and the properties of the odor molecule. We tracked several key locations on the antennae of 21 bumblebees at high frequency (up to 1200 fps) and in three dimensions while submitting them to puffs of 11 common odorants released in a low-speed continuous flow. To cover the range of diffusivity and molecule size of most odors sampled by bees, compounds as different as butanol and farnesene were chosen, with variations of 200% in molar masses. Water and paraffin were used as negative controls. Movement analysis was done on the tip, the scape and the base of the antennae tracked with the neural network Deeplabcut. Bees use a stereotypical motion of their antennae when smelling odors, similar across all bees, independently of the identity of the odors and hence their diffusivity. The variability in the movement amplitude among odors is as large as between individuals. The first oscillation mode at low frequencies and large amplitude (ca. 1-3 Hz, ca. 100°) is triggered by the presence of an odor and is in line with previous work, as is the speed of movement. The second oscillation mode at higher frequencies and smaller amplitude (40 Hz, ca. 0.1°) is constantly present. Antennae are quickly deployed when a stimulus is perceived, decorrelate their movement trajectories rapidly and oscillate vertically with a large amplitude and laterally with a smaller one. The cone of air space thus sampled was identified through the 3D understanding of the motion patterns. 
The amplitude and speed of antennal scanning movements seem to be a function of the internal state of the animal, rather than determined by the odorant. Still, bees display an active olfaction strategy. First, they deploy their antennae when perceiving an odor rather than letting the antennae passively encounter it. Second, fast vertical scanning movements further increase the flow speed experienced by an antenna and hence the odorant capture rate. Finally, lateral movements might enhance the likelihood of locating the odor source, similarly to the lateral scanning movement of insects at odor plume boundaries. Definitive proof of this function will require simultaneous 3D recordings of antennal movements together with the air flow and odor fields.
... The responses of the F1 larvae to light on d7 and d28 were analyzed using the SwarmSight Motion Analysis software (Birgiolas et al., 2017). This software does not track individual fish, but provides a measure of the overall change in pixels from one frame to the next. ...
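The pixel-change measure described above can be sketched as a simple frame-differencing metric: count the pixels whose intensity changed by more than a threshold between consecutive frames. This is an illustrative approximation of the background-subtraction idea, not SwarmSight's actual implementation, and the threshold value is an assumption:

```python
import numpy as np

def activity_metric(prev_frame, frame, threshold=20):
    # Per-frame activity: the number of pixels whose intensity changed
    # by more than `threshold` between two consecutive grayscale frames.
    # Casting to int avoids uint8 wrap-around when subtracting.
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return int(np.count_nonzero(diff > threshold))
```

Summing or plotting this metric over all frame pairs of a video yields a temporal activity curve like the ones such tools report.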
Article
Despite the publication of numerous papers, the effects of fluoxetine on fish behaviour remain mired in controversy and contradiction. One reason for this controversy is that fluoxetine displays distinct and opposing acute and chronic effects. A second reason is that most studies have been limited to two or at most three concentrations. To address these deficiencies we exposed adult zebrafish, both single females and shoals consisting of one male and two females, to seven fluoxetine concentrations, ranging from 5 ng/L to 5 μg/L, and measured their swimming behaviour and response to a conspecific alarm substance (CAS) at seven, 14 and 28 days. We also measured the light startle response of unexposed F1 larvae at days seven and 28 post-hatch and the response to CAS at day 28. On day 7 fluoxetine decreased swimming speed at concentrations ≥500 ng/L. After addition of CAS, fish exposed to 5 and 500 ng/L decreased swimming, while fish exposed to 10, 500 and 1000 ng/L significantly increased time motionless. On day 14 no treatment was significantly different from controls before addition of CAS, but afterwards fish exposed to 5 ng/L and 1000 ng/L showed significant differences from controls. On day 28 fish exposed to 50 and 5000 ng/L had slower average swimming speeds than controls before addition of CAS. After addition, all fish except controls and those exposed to 500 ng/L showed decreased average speed. At seven days post-hatch, F1 larvae whose parents were exposed to 100 ng/L showed significantly higher activity than controls, and those exposed to 500 ng/L fluoxetine showed lower activity in the light startle response. This study shows that the effects of fluoxetine vary with time and also in a non-monotonic manner. We suggest that the complex nature of the serotonergic system, with multilateral effects at the genomic, biochemical and physiological levels interacting with environmental stimuli, results in non-linear dose-response behavioural patterns.
Article
Techniques for visualising and analysing animal movement patterns are widely used in behavioural studies. While commercial options exist for analysing animal movement via video, their cost is often prohibitive. To meet the need for an efficient and cost‐effective video tracking and analysis tool, we have developed the pathtrackr package for the open‐source programming environment R. The pathtrackr package allows for an automated and consolidated workflow, from video input to statistical output (with the addition of the open‐source FFmpeg application), of an animal's movement. The software tracks a single animal at a time across a variety of visual contexts, notably heterogeneous backgrounds and variable lighting, and can deal with localised background movement that does not overlap with the animal. Unlike many other solutions, the software does not require training. We also include diagnostic tools in the package for troubleshooting. Here, we provide an overview of the package and an example workflow for using its various functions. Validation with simulated data shows that pathtrackr provides very accurate results, while removing observer subjectivity and greatly reducing data collection times.
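Single-animal video tracking of the kind described above generally reduces to separating the animal from the background in each frame and taking the centroid of the foreground pixels as its position. The following is a hypothetical, minimal sketch of that step in Python, not pathtrackr's actual R implementation; the function name and threshold are assumptions for illustration:

```python
import numpy as np

def animal_centroid(frame, background, threshold=30):
    # Subtract a static background image, threshold the absolute
    # difference, and return the (x, y) centroid of the remaining
    # foreground pixels as the animal's estimated position.
    diff = np.abs(frame.astype(int) - background.astype(int))
    ys, xs = np.nonzero(diff > threshold)
    if len(xs) == 0:
        return None  # no animal detected in this frame
    return float(xs.mean()), float(ys.mean())
```

Linking these per-frame centroids across frames produces the movement path that tracking packages then summarise statistically.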
Article
Full-text available
Ferrets are the preferred animal model to assess influenza virus infection, virulence and transmission, as they display clinical symptoms and pathogenesis similar to those of humans. Measures of disease severity in the ferret include weight loss, temperature rise, sneezing, viral shedding and reduced activity. To date, the only available method for activity measurement has been the assignment of an arbitrary score by a 'blind' observer based on a pre-defined responsiveness scale. This manual scoring method is subjective and can be prone to bias. In this study, we describe a novel video-tracking methodology for determining activity changes in a ferret model of influenza infection. This method eliminates the various limitations of manual scoring, which include the need for a sole 'blind' observer and the requirement to recognise the 'normal' activity of ferrets in order to assign relative activity scores. In ferrets infected with an A(H1N1)pdm09 virus, video-tracking was more sensitive than manual scoring in detecting ferret activity changes. Using this video-tracking method, oseltamivir treatment was found to ameliorate the effect of influenza infection on activity in ferrets. Oseltamivir treatment of animals was associated with an improvement in clinical symptoms, including reduced inflammatory responses in the upper respiratory tract, lower body weight loss and a smaller rise in body temperature, despite there being no significant reduction in viral shedding. In summary, this novel video-tracking method is an easy-to-use, objective and sensitive methodology for measuring ferret activity.
Article
Full-text available
The nest of the stingless bee, Trigona (Tetragonisca) angustula, is guarded by bees positioned in the nest entrance and others hovering in front of it. Hovering guard bees track returning foragers sideways along the last 10 cm in front of the nest, but intercept and incapacitate nest intruders by clinging with mandibles to wings and legs. When attacked by the cleptobiotic stingless bee Lestrimelitta limao, the colony strengthens its aerial defense with hundreds of additional hoverers. To test our hypothesis that this reaction is due to interspecific chemical communication based on kairomone effects, we presented synthetic cephalic volatiles of both species at the nest entrance and counted the number of bees leaving the nest and taking up hovering positions. We conclude that guard bees recognize L. limao by the major terpenoids of their volatile cephalic secretions, geranial, neral (=citral) and 6-methyl-5-hepten-2-one; other components may fine-tune this recognition. The effect of chemical stimuli is not significantly enhanced by combination with a dummy of L. limao. Guard bees, we hypothesize, respond to this kairomone by secreting a species-specific alarm pheromone; a major component of this pheromone, benzaldehyde, recruits additional bees to defend the nest.
Article
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
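The extension to non-separable training data described above amounts to a soft margin: misclassified or margin-violating points incur a hinge-loss penalty weighted by a constant C. A minimal illustrative sketch, training a linear soft-margin SVM by sub-gradient descent on the primal objective (a toy, not the original implementation, and all names and hyperparameters here are assumptions):

```python
import numpy as np

def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=200):
    # Minimize 0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w.x_i + b))
    # by sequential sub-gradient steps over the training points.
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in range(n):
            if y[i] * (X[i] @ w + b) < 1:   # margin violated: hinge active
                w -= lr * (w - C * y[i] * X[i])
                b += lr * C * y[i]
            else:                            # only the regularizer acts
                w -= lr * w
    return w, b

# Two overlapping 1-D clusters: not perfectly separable, so slack is needed.
X = np.array([[-2.0], [-1.5], [-1.0], [0.2], [1.0], [1.5], [2.0], [-0.2]])
y = np.array([-1, -1, -1, -1, 1, 1, 1, 1])
w, b = train_linear_svm(X, y)
preds = np.sign(X @ w + b)
```

The two overlapping points near the origin keep their hinge terms active throughout training, which is exactly the non-separable case the soft margin handles; a polynomial or other non-linear feature mapping would be layered on top of this for the full support-vector network.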
Book
Mainly deals with queueing models, but also gives the properties of many useful statistical distributions and algorithms for generating them.
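The distribution-generating algorithms referred to above can be illustrated with inverse-transform sampling, a standard way to draw the exponential inter-arrival times used in queueing models. This sketch is illustrative and not taken from the book:

```python
import math
import random

def exponential_sample(rate, u=None):
    # Inverse-transform sampling: if U ~ Uniform(0, 1), then
    # -ln(1 - U) / rate follows an Exponential(rate) distribution.
    # Such draws model inter-arrival times in an M/M/1 queue.
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate
```

The same inversion recipe works for any distribution whose cumulative distribution function can be inverted in closed form.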
Article
Sub-lethal effects of pesticides on behavioural endpoints are poorly studied in carabids (Coleoptera: Carabidae), though changes in behaviour caused by chemical stress may affect populations of these non-targeted beneficial insects. General motor activity and locomotion are inherent in many behavioural patterns, and changes in these activities that result from xenobiotic influence mirror an integrated response of the insect to pesticides. The influence of pyrethroid insecticides on the motor activities of carabids over a wide range of sub-lethal doses still remains unclear. Video tracking of Platynus assimilis showed that brief exposure to alpha-cypermethrin at sub-lethal concentrations ranging from 0.01 to 100 mg L(-1) caused initial short-term (< 2 h) locomotor hyper-activity followed by long-term (> 24 h) locomotor hypo-activity. In addition, significant short- and long-term concentration- and time-dependent changes occurred in general motor activity patterns and rates. Conspicuous changes in the motor activity of P. assimilis beetles treated at alpha-cypermethrin concentrations up to 75 000-fold lower than MFRC suggest that many basic fitness-related behaviours might be severely impaired as well. These changes may negatively affect carabid populations in agro-ecosystems. Long-term hypo-activity could directly contribute to the decreased trap captures of carabids frequently observed after insecticide application in the field.