*Edited by Oliver Hobert. Last revised March 2, 2012. Published September 10, 2012. This chapter should be cited as: Husson, S. J. et al.
Keeping track of worm trackers (September 10, 2012), WormBook, ed. The C. elegans Research Community, WormBook,
doi/10.1895/wormbook.1.150.1, http://www.wormbook.org.
Copyright: © 2012 Steven J. Husson, Wagner Steuer Costa, Cornelia Schmitt and Alexander Gottschalk. This is an open-access article
distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any
medium, provided the original author and source are credited.
§To whom correspondence should be addressed. E-mail: a.gottschalk@em.uni-frankfurt.de; phone: +496979842518; fax: +496979876342518
*both authors contributed equally
Keeping track of worm trackers*
Steven J. Husson2*, Wagner Steuer Costa1*, Cornelia Schmitt1,
Alexander Gottschalk1§
1Buchmann Institute for Molecular Life Sciences (BMLS), and Institute of Biochemistry,
Goethe-University, Max von Laue Strasse 15, D-60438 Frankfurt, Germany
2Katholieke Universiteit Leuven, Research group of Functional Genomics and Proteomics,
Naamsestraat 59, B-3000 Leuven, Belgium
Table of Contents
1. Introduction
1.1. Quantitative description of behavioral phenotypes using machine vision
1.2. History of C. elegans tracking systems
2. Worm trackers
2.1. Nemo (Nematode movement)
2.2. Worm Tracker 2.0
2.3. The Parallel worm tracker and OptoTracker
2.4. The Multi Worm Tracker
2.5. Multimodal illumination and tracking system for optogenetic analyses of circuit function
2.6. CoLBeRT: control locomotion and behavior in real time
2.7. The opto-mechanical system for imaging or manipulation of neuronal activity in freely moving animals
2.8. Further systems allowing tracking and Ca2+ imaging in semi-restrained or freely behaving animals
2.9. Behavioral arenas
2.10. The WormLab, a commercially available worm tracker
3. Worm trackers optimized for liquid environments
4. Additional analysis tools for quantifying C. elegans behavior
4.1. Eigenworms: Low-dimensional superposition of principal components
4.2. An analysis tool for the description of bending angles during swimming or crawling
4.3. The Worm Analysis System
4.4. The Multi-Environment Model Estimation for Motility Analysis
5. Possible future developments
6. Conclusion
7. References
Abstract
C. elegans is used extensively as a model system in the neurosciences due to its well defined nervous
system. However, the seeming simplicity of this nervous system in anatomical structure and neuronal
connectivity, at least compared to higher animals, underlies a rich diversity of behaviors. The usefulness of
the worm in genome-wide mutagenesis or RNAi screens, where thousands of strains are assessed for
phenotype, emphasizes the need for computational methods for automated parameterization of generated
behaviors. In addition, behaviors can be modulated upon external cues like temperature, O2and CO2
concentrations, mechanosensory and chemosensory inputs. Different machine vision tools have been
developed to aid researchers in their efforts to inventory and characterize defined behavioral “outputs”. Here
we aim at providing an overview of different worm-tracking packages or video analysis tools designed to
quantify different aspects of locomotion such as the occurrence of directional changes (turns, omega bends),
curvature of the sinusoidal shape (amplitude, body bend angles) and velocity (speed, backward or forward
movement).
1. Introduction
C. elegans is an outstanding model organism for the study of neuronal circuits at the systems level. Exactly
302 neurons coordinate different behaviors such as feeding, mating, egg-laying, defecation, swimming and many
subtle forms of locomotion on a solid surface. Due to its experimental amenability, the nematode has been an ideal
animal for examining the genetic basis of behavior. Numerous phenotype-driven (forward and reverse) genetic
screens have been performed, in search of defined behavioral abnormalities that can be assigned to specific genes.
However, the effects of specific mutations on behavioral changes under study are often poorly described using
imprecise terminology. In addition, as the phenotypes are difficult to quantify, lack of numerical data hinders robust
statistical analysis. These screens mostly provide an informative description of the phenotype like “Unc”
(uncoordinated) or similar descriptions (Brenner, 1974). However, an uncoordinated worm can be “coiling”,
“kinky”, “sluggish”, “loopy”, “slow” or might not move at all (Hodgkin, 1983). These observations and
phenotypical assignments are generally made by the experimenter and therefore involve the risk of subjectivity and
non-uniformity, and also fail to address issues of phenotypic penetrance and degree of severity. Moreover, precise
specification of the different aspects of locomotion that are affected, such as velocity, amplitude of the sinusoidal
movement, angles of body bends and turning frequency cannot be easily provided through visual inspection by an
individual researcher. The emergence of possibilities for tracking cells (particularly neurons; Faumont et al., 2011),
as well as optogenetic technologies that use light to gain exogenous control of defined cells (e.g., activation by the
depolarizing Channelrhodopsin-2 [ChR2] and inhibition by the hyperpolarizing Halorhodopsin [NpHR] (Boyden et
al., 2005; Deisseroth, 2011; Liewald et al., 2008; Nagel et al., 2003; Nagel et al., 2005; Stirman et al., 2011; Zhang
et al., 2007; Leifer et al., 2011), has generated an even more pressing demand for neurobiologists to have robust
computational methods for the quantification of behavior.
To address this problem, different machine vision approaches for automated behavioral analysis have been
developed recently. Here we focus on software (and, to some extent, hardware) tools that quantitatively analyze
locomotion behavior. We aim to provide a descriptive and currently comprehensive overview of different tracking
systems and software developed by the worm community. We will discuss obvious advantages and disadvantages of
the respective systems, including some “how-to's” to the extent that we can judge this either from our own
experience or from the published work describing the systems. This review focuses mainly on the “input” and the
“output” of behavior tracking systems: how many worms can be analyzed with the respective tool, and which
behavioral parameters can be analyzed (Table 1). An in-depth description of the various programs/codes of the
diversity of video analysis tools is beyond the focus of this review; these will rather be treated as “black boxes” and
the reader is referred to the original publications. We will first provide a short history of worm tracking and mention
how different video analysis tools have been used to quantitatively analyze C. elegans behavior in the past, to
illustrate how the field has evolved. Next, we will give an overview of the major approaches available to-date and
how, or if, these systems can be combined with optogenetic strategies that require precisely timed and synchronized
illumination of the animal(s) with various colors of light.
Table 1. Comparison of tracking systems

Worm Tracker 2.0 (Schafer lab)
- Single/multi worm: single
- Adaptable: yes; supports x-y stages by three different vendors, as well as other camera systems (i.e. USB cameras)
- Optogenetic application: no
- Illumination type: NA
- X-Y stage control: yes
- Measured parameters: skeleton and outline
- Camera resolution / support for other resolutions (pixels): 1280 × 1024 / yes
- Camera frequency / other supported (frames per second): 30 / yes
- Video stored: yes
- GUI: yes
- Microscope required: no
- Required hardware*: X-Y stage, camera
- Required software: Java, ffdshow, MATLAB or MCR
- Cost estimate excluding software, computer and microscope (US$): 3,500

Nemo (Tavernarakis lab)
- Single/multi worm: single
- Adaptable: yes; code open for changes, supports other camera systems (i.e. USB cameras)
- Optogenetic application: no
- Illumination type: NA
- X-Y stage control: no
- Measured parameters: skeleton and outline
- Camera resolution / support for other resolutions (pixels): 800 × 600 / yes
- Camera frequency / other supported (frames per second): 40 / yes
- Video stored: yes
- GUI: yes
- Microscope required: no
- Required hardware*: camera
- Required software: MATLAB (R13) + Image Processing Toolbox
- Cost estimate excluding software, computer and microscope (US$): 350

The Parallel Worm Tracker (Goodman lab)
- Single/multi worm: <50
- Adaptable: yes; code open for changes, supports other camera systems (i.e. USB cameras)
- Optogenetic application: no
- Illumination type: NA
- X-Y stage control: no
- Measured parameters: centroid
- Camera resolution / support for other resolutions (pixels): 640 × 480 / no, downsized if greater
- Camera frequency / other supported (frames per second): 15 / yes
- Video stored: yes
- GUI: yes
- Microscope required: no
- Required hardware*: camera
- Required software: MATLAB (R13) + Image Acquisition and Image Processing Toolboxes
- Cost estimate excluding software, computer and microscope (US$): 350

OptoTracker (Gottschalk lab)
- Single/multi worm: <50
- Adaptable: yes; code open for changes, supports other camera systems (i.e. USB cameras)
- Optogenetic application: yes
- Illumination type: whole field
- X-Y stage control: no
- Measured parameters: centroid
- Camera resolution / support for other resolutions (pixels): 640 × 480 / no, downsized if greater
- Camera frequency / other supported (frames per second): 15 / yes
- Video stored: yes
- GUI: yes
- Microscope required: no
- Required hardware*: camera, light source with shutter, filters
- Required software: MATLAB (R13) + Image Acquisition and Image Processing Toolboxes
- Cost estimate excluding software, computer and microscope (US$): 1,600

Multimodal illumination and tracking system (Lu lab)
- Single/multi worm: single
- Adaptable: yes; code open for changes, supports any projector and LabVIEW Vision compatible camera systems (i.e. USB cameras)
- Optogenetic application: yes, 3 wavelengths
- Illumination type: patterned; intensity adjustable for each wavelength independently
- X-Y stage control: yes
- Measured parameters: skeleton and outline
- Camera resolution / support for other resolutions (pixels): 320 × 240 / yes, but reduced fps at higher resolutions
- Camera frequency / other supported (frames per second): 25 / yes
- Video stored: yes
- GUI: yes
- Microscope required: yes
- Required hardware*: X-Y stage, camera, projector, filters
- Required software: LabVIEW (+ Vision)
- Cost estimate excluding software, computer and microscope (US$): 10,000

CoLBeRT (Samuel lab)
- Single/multi worm: single
- Adaptable: yes; code open for changes
- Optogenetic application: yes
- Illumination type: patterned
- X-Y stage control: yes
- Measured parameters: skeleton and outline
- Camera resolution / support for other resolutions (pixels): 1280 × 1024 / NA
- Camera frequency / other supported (frames per second): 50 / yes
- Video stored: yes
- GUI: yes
- Microscope required: yes
- Required hardware*: X-Y stage, laser, DMD array, frame grabber, camera
- Required software: MindControl (custom, C), MATLAB R2010a
- Cost estimate excluding software, computer and microscope (US$): 16,000

The Multi Worm Tracker (Kerr lab)
- Single/multi worm: <120
- Adaptable: yes; code open for changes, supports LabVIEW Vision compatible camera systems
- Optogenetic application: yes
- Illumination type: whole field
- X-Y stage control: no
- Measured parameters: skeleton and outline
- Camera resolution / support for other resolutions (pixels): 2352 × 1728 / no
- Camera frequency / other supported (frames per second): 31 / no
- Video stored: no
- GUI: no
- Microscope required: no
- Required hardware*: camera, frame grabber, background light
- Required software: LabVIEW (+ Vision), C++ (custom), Java
- Cost estimate excluding software, computer and microscope (US$): 7,000

Opto-mechanical system for virtual environments (Lockery lab)
- Single/multi worm: single
- Adaptable: NA
- Optogenetic application: yes
- Illumination type: patterned, intensity adjustable
- X-Y stage control: yes
- Measured parameters: bright spot
- Camera resolution / support for other resolutions (pixels): four-quadrant photomultiplier tube
- Camera frequency / other supported (frames per second): NA (PMT)
- Video stored: yes
- GUI: yes
- Microscope required: yes
- Required hardware*: PMT and centering device
- Required software: NA
- Cost estimate excluding software, computer and microscope (US$): commercial version available (PhotoTrack, ASI)

* Some cameras require a frame grabber and PCI card to communicate with LabVIEW or MATLAB; USB or FireWire cameras should work without these.
The authors thank Jeffrey N. Stirman for advice on assembling this table.
1.1. Quantitative description of behavioral phenotypes using machine vision
Several machine vision programs follow a similar data processing strategy that first involves extraction of
individual pictures from each frame of the movie file (Figure 1). The shape of the worm is then extracted from the
background by a thresholding procedure. This operation allocates pixels to worm or background according to
whether the intensity exceeds a defined threshold value thereby generating a two-color binary image (black and
white). The next step is to extract the “skeleton” or “spine” of the animals from tail to head, often referred to as
skeletonization of the worm shape (however, some systems do not use skeletonization, but segmentation of the
worm shape). The one pixel thick line-image of the skeleton is further subdivided into different segments to allow
computation of various parameters such as the center of mass (of the entire worm or for each segment) often referred
to as the “centroid”, angles between two adjacent segments as a measure for body curvature, etc. In general, the
velocities of individual worms are calculated as the rate of change in the location of their centroid or points along
their skeleton over time, measured across the sequence of individual frames of the movie. Irrespective of the
tracking program used, the key to success is to optimize the video quality such that the worms can be easily
recognized as high contrast objects (dark) on a pale background (or vice versa). One should also take into account
that the camera resolution, magnification used, and the quality of the imaging conditions jointly determine the
accuracy of the measurements. When programmed for tracking several worms simultaneously, most trackers have
the option for particle size exclusion. Through this option, dust particles are excluded, colliding worms are ignored
and new tracks are automatically assigned once they separate again. This procedure is easier to implement and
requires much less computation than continuing to track the colliding animals individually.
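To make this generic pipeline concrete, the following MATLAB lines give a minimal sketch of the thresholding, skeletonization and centroid-speed steps (our own illustration, not taken from any specific tracker; it assumes the Image Processing Toolbox, a hypothetical input movie worm.avi, and placeholder calibration values):

v = VideoReader('worm.avi');            % hypothetical input movie
pxPerMm = 100; fps = v.FrameRate;       % assumed spatial calibration
centroids = [];
while hasFrame(v)
    frame = readFrame(v);
    if size(frame, 3) == 3, frame = rgb2gray(frame); end
    bw = ~imbinarize(frame);            % dark worm on a pale background
    bw = bwareafilt(bw, 1);             % keep only the largest object
    skel = bwmorph(bw, 'thin', Inf);    % one-pixel-thick "spine"
    wormLengthPx = nnz(skel);           % rough worm length in pixels
    props = regionprops(bw, 'Centroid');
    centroids(end+1, :) = props(1).Centroid;   % center of mass, in pixels
end
speed = sqrt(sum(diff(centroids).^2, 2)) * fps / pxPerMm;   % centroid speed (mm/s)

In a full tracker the skeleton would additionally be subdivided into segments so that bending angles and per-segment centroids can be derived, as described above.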
Figure 1: General overview of the worm tracking procedure. A movie of the behaving animal is taken either using a camera attached to a microscope
(A) or with a camera and its macro function (B). Depending on the tracker software, a motorized stage (X, Y) can be used to keep the worm in the field of
view. For simplicity, only one worm is depicted in the movie (C); multi worm trackers may track over 100 worms at the same time. Individual pictures or
frames (D) are extracted from the video file, which are subsequently converted to binary (black and white) images (E). This operation is handled differently
in each worm tracker implementation and mainly consists of thresholding and gap filling. When a motorized X-Y stage is used, the software calculates the
worm's position in the binary image and moves its center of mass to the middle of the frame. In the next step, the worm's skeleton (F) is calculated from the
binary image, which is further divided into individual segments (G). Different parameters (H) can be calculated based on the segmented skeletonized
pictures, which are stored for further processing.
Although this method can also be applied for tracking C. elegans movement in a liquid environment, it is not
optimal for quantification of swimming behavior. This fact led to the development of a covariance-based method,
where the animals’ morphology is not measured; rather, similarities between frames are identified and used to
calculate the frequency of motion (see section 3).
1.2. History of C. elegans tracking systems
To our knowledge, the first video system, capable of tracking the movement of about 25 animals in real time
at 1 Hz, was developed by Dusenbery and used to study chemotaxis (Dusenbery, 1985). This software was
programmed in BASIC09. Later, Dusenbery and colleagues devised a system that could track hundreds of animals,
based on NIH Image software (Dhawan et al., 1999). About the same time, another system, capable of tracking 50
animals, was used to characterize the neuropeptide Y receptor NPR-1 and its role in aggregation behavior in the Cori
Bargmann lab (de Bono and Bargmann, 1998). Videos were analyzed using the “DIAS” software program that was
initially developed to study basic crawling behaviors of amoeboid cells (Soll, 1995). The speed of the objects under
study was calculated between successive frames or as average speed over a longer period of time.
To study the role of pirouettes in chemotaxis behavior, another tracking system was developed and used to
record the position, speed and turning rate of individual worms in the Shawn Lockery lab (Pierce-Shimomura et al.,
1999). The tracking system consisted of a computer-controlled motorized stage and a video camera mounted on a
compound microscope. The system located the centroid of a worm under study and recorded x and y coordinates at a
sampling rate of about 1 Hz. The worm was re-centered when it reached the edge of the field of view and the
distance that the stage moved was recorded.
A similar tracking system was developed and used in the William Schafer lab to analyze egg-laying behavior
(Hardaker et al., 2001; Waggoner et al., 1998). This prototype worm tracking system was further refined, in a joint
venture between the Schafer and Paul Sternberg labs, for automated collection and analysis of C. elegans
locomotion data. These systems were able to parameterize and classify different behavioral phenotypes of unc
mutants by classification and regression tree (CART) algorithms. The tracker hardware and programming
environment was estimated to cost about 10,000 US$ (excluding the requisite microscope, lighting and optics),
software was coded in “C” programming language and it could operate at 2 Hz. In order to make worm-tracking
accessible for general use in the C. elegans community, the system, with improved software, was described as a
ready-to-use imaging system for standardized quantitative analysis of C. elegans behavior, complete with a
parts-list, software packages and code to download and install (Feng et al., 2004; Cronin et al., 2005). This
“Wormtracker 1.0” used a Cohu monochrome CCD camera (460 × 380 pixels) and a Daedal motorized stage
controlled by a National Instruments controller and could operate at 30 Hz. Alternatively, video acquisition was
done through a video cassette recorder and the movie then digitized afterwards. Software consists of four basic
modules: (1) the Tracker, (2) a Converter to process raw images into a morphological skeleton, (3) a Lineup module
to order backbone points from head to tail and (4) a “Miner” module for parameter extraction. The latter module
analyzes specific features that define important parameters related to locomotion and morphology of the worm such
as body posture, bending angles, movement and locomotion waveform. A total of 59 distinct features are measured,
and the software is written with C/C++, LabVIEW 7.0 and MATLAB 13. Cronin and co-workers further described
the metrics and application of their joint venture system with a toxicological assay as an example (Cronin et al.,
2005). The software was further refined in the Sternberg lab and can be downloaded as the Caltech Nematode
Movement Analysis System (http://wormlab.caltech.edu/publications/download.html). As the system originally
described by Feng et al. (2004) was difficult to transfer to other labs, it has unfortunately not been widely used. An
updated “Wormtracker 2.0” has been made available by the Schafer lab on the MRC-LMB website, including
instructions on how to build and use the hardware, as well as software packages both for operation of the hardware,
and for analysis of the obtained videos (http://www.mrc-lmb.cam.ac.uk/wormtracker/). The system makes use of a
digital microscope-type USB-camera (“Dino-lite”) that is able to acquire macro movies without the need for a
compound microscope (see next paragraph).
Furthermore, there are various computational approaches for tracking and feature extraction of C. elegans in
liquid environments. A system for quantifying the position, trajectory and body shape of worm populations in fluid
environments has been developed by the Monica Driscoll lab (Tsechpenakis et al., 2008), while the David Sattelle
lab presented a rapid method for automated counting of thrashing frequencies (Buckingham and Sattelle, 2009).
Another approach to quantify worm activity monitors the scattering of an infrared beam through a liquid culture of
worms, and was used in the Diego Golombek lab to measure circadian rhythms (Simonetta and Golombek, 2007).
Moreover, the Randy Blakely lab created a MATLAB script for automatic analysis of worm inactivity in liquid
environment using a fast Fourier transform (FFT) to measure movement frequency (Matthies et al., 2006).
If studying the involvement of a neuron or class of neurons in a particular behavior is of interest, using optogenetic
tools like ChR2 or NpHR to acutely activate or silence these cells is a promising approach, particularly if
combined with behavioral tracking. However, achieving single-cell expression of optogenetic tools is challenging,
even when using recombinase-based approaches (Davis et al., 2008; Macosko et al., 2009). If single neuron
expression cannot be obtained, restricting light to the region of the body where the cell of interest is localized may
overcome this problem. High spatial and temporal precision has to be achieved in order to selectively address the
cell of interest. This problem was tackled and solved by both the Hang Lu and Aravi Samuel labs (Leifer et al.,
2011; Stirman et al., 2011). Both systems were developed to illuminate distinct body regions, harboring the neurons
of interest, in freely behaving animals. The respective neurons are, at least currently, defined and targeted by their
anatomical position. Usually an area significantly larger than the size of the neuron's cell body is illuminated. This
ensures that the neuron of interest is always illuminated for the defined period, even if the animal is moving quickly.
In the future, it is conceivable that fluorescent markers expressed in a defined pattern (or within the cell of interest)
may be used to address a specific cell. To some extent, the opto-mechanical tracker by the Lockery lab provides an
approach to such devices (Faumont et al., 2011).
2. Worm trackers
2.1. Nemo (Nematode movement)
The Nektarios Tavernarakis group developed a simple yet powerful tool for analyzing nematode movement
(Nemo) without the need for a tracking device (Tsibidis and Tavernarakis, 2007). Nemo is modular software
(written in MATLAB, release 13 or higher) that allows the user to specify which operations should be performed on their
data (software can be downloaded as supplementary material to Tsibidis and Tavernarakis, 2007;
http://www.biomedcentral.com/content/supplementary/1471-2202-8-86-s2.zip). A GUI was designed to facilitate the
processing and interpretation of the data and can be used to generate graphs and histograms of the computed
parameters. We tested videos taken with different cameras at various settings and we could analyze the data without
having to change the software. The software works with indexed images as input. These have to be obtained by the
user through a 3rd party program like VirtualDub (http://virtualdub.sourceforge.net/). Although Nemo might be used
with all image resolutions and magnifications, it is advisable to enhance these in order to reduce the error rate later
in the quantification processes. First, all images are converted to gray scale and then a low-pass filter is applied in
order to reduce noise. This image processing sequence allows Nemo to quantify color (RGB) images. However, care
must be taken, since the resulting gray values must have a good worm to background contrast after processing.
Nemo then searches in the first video frame for a single distinct object (i.e. the imaged worm) and computes its
perimeter and skeleton using standard MATLAB Image Analysis Toolbox functions. In the following frames, only a
region adjacent to the last position of the worm is computed, avoiding time consuming operations. Nemo also
provides an algorithm to clear the skeleton from small branches. The skeleton is subdivided into a user-specified
number of lines, representing segments of the worm. The coordinates of the center of mass of all lines as well as of
the whole worm are recorded. In addition, the system is laid out such that reference points on the plate are taken into
account to determine when the plate had to be moved to keep the worm within the field of view. With this
information, the dataset is used to characterize the worm's speed, waveform (of the whole animal, or of only parts of
the body), angles between two segments, thickness, distance between head and tail and trajectory.
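For illustration, the search-window strategy can be sketched in a few MATLAB lines; this is not Nemo’s actual code, and the window size, filter settings and variable names (frame for the current image, lastCentroid for the previous worm position) are assumptions:

win = 150;                                   % half-width of search window (pixels), assumed
[r, c] = deal(round(lastCentroid(2)), round(lastCentroid(1)));
rows = max(1, r-win):min(size(frame,1), r+win);
cols = max(1, c-win):min(size(frame,2), c+win);
roi  = frame(rows, cols);                    % only the region around the last position is processed
bwRoi = ~imbinarize(imgaussfilt(roi, 2));    % low-pass filter, then threshold
bwRoi = bwareafilt(bwRoi, 1);                % keep the single distinct object (the worm)
props = regionprops(bwRoi, 'Centroid');
lastCentroid = props(1).Centroid + [cols(1)-1, rows(1)-1];   % map back to full-frame coordinates

Restricting the segmentation to such a window is what avoids time-consuming whole-frame operations in every frame.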
Installing Nemo is straightforward and well-documented in the associated Readme.pdf file and the
Algorithms.pdf document. Every function is clearly described, allowing non-MATLAB proficient users to
understand how the program works.
2.2. Worm Tracker 2.0
The Worm Tracker 2.0 was released unofficially to the worm community in the beginning of 2007 by the
Schafer lab and is frequently updated. The current release can be downloaded from the MRC-LMB website
(http://www.mrc-lmb.cam.ac.uk/wormtracker/). This single worm tracker operates with a Dino-lite digital
microscope and camera (http://www.dino-lite.eu/), practically rendering a compound microscope unnecessary. The
tracker supports motorized stages of a variety of vendors as well as other cameras mounted on conventional
microscopes, allowing one to adapt an existing microscope setup to the Worm Tracker 2.0 with ease. The software is
fully based on graphical user interfaces (GUI) and is self-explanatory. The dedicated web page presents information
ranging from the hardware needed to software installation all the way to protocols on how to optimize the NGM
plates for recording videos with Worm Tracker 2.0. Additionally, the group also released a free Worm Analysis
Toolbox for MATLAB, specifically developed to analyze videos taken with the Worm Tracker 2.0.
The tracker acquires the image stream from the camera and recognizes the worm and its centroid. It controls
the motorized stage to position the worm's centroid at the center of the image as soon as the worm reaches
previously selected boundaries in the field of view. The position of the stage is recorded in a separate file as well as
the timing of the stage movement. This information is used in the Worm Analysis Toolbox to identify the frames
where the stage moved. The blurred images caused by stage movement are dropped from the analysis, which is a
drawback compared to systems that continuously re-center the animal with small motion increments, and may lead
to loss of (some) information. However, the system also allows moving the microscope instead of the stage, thus
leaving the worm completely undisturbed in case vibrations caused by stage movement, which the animal might
sense, are a concern. The stage's immobility also permits tracking of single swimming worms. The small form
factor, due to the lack of a microscope, and low acquisition cost make this system a good choice for locomotion
studies without embedded optogenetic stimulus application.
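The boundary-triggered re-centering logic can be sketched as follows (our reading of the approach, not the released Worm Tracker 2.0 code; moveStageRelative and logStageMove are hypothetical helper functions, centroid is the worm's centroid in image pixels, and the margin and calibration values are placeholders):

imgSize  = [1024 1280];                         % image size: rows, columns
margin   = 200;                                 % boundary around the image border (pixels), assumed
umPerPx  = 5;                                   % stage calibration (µm per pixel), assumed
offset   = centroid - fliplr(imgSize)/2;        % [dx dy] of worm from image center, pixels
outOfBounds = any(centroid < margin) || any(centroid > fliplr(imgSize) - margin);
if outOfBounds
    moveStageRelative(offset * umPerPx);        % hypothetical stage command re-centers the worm
    logStageMove(now, offset * umPerPx);        % hypothetical log: stage moves are recorded so the
end                                             % blurred frames can be dropped during analysis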
The Worm Analysis Toolbox can be used (off-line) for automated segmentation of the worm's image and
subsequent feature extraction. The current release supports analysis of the worm's area, length, width, thickness,
transparency and the brightness of head and tail. In addition, the Toolbox allows visual confirmation of the extracted
data as well as debugging in case of errors with three helper tools. All functions are thoroughly documented in a user
manual.
Installation of the Worm Tracker 2.0 requires a Java environment, while the Toolbox requires either a
MATLAB installation or a MATLAB Compiler Runtime Environment. The latter can be downloaded for free with
the Worm Analysis Toolbox. The group released an example folder for the Analysis Toolbox containing a video and
all tracked data, allowing one to test the program prior to purchasing the required hardware.
Although the Worm Tracker 2.0 is, at the time of writing, a work in progress, it can already be reliably used
by the worm community. The software is free to use (provided acceptance of a software release agreement issued by
the MRC) and the cost of the hardware needed is the smallest among the single worm trackers with automated stage
control. The ease of installation and operation are further reasons to consider this system in laboratories that wish to
start automatic behavioral assays. Unfortunately, however, at least the current version of the tracker does not support
synchronized control of other devices (for example by using TTL pulses). Therefore, in the current form, this system
is not recommended when combining worm tracking with optogenetic tools that require accurate programming of
illumination protocols to be synchronized with the videos taken.
2.3. The Parallel worm tracker and OptoTracker
The Parallel worm tracker (PWT) was designed by the Miriam Goodman lab as a high-throughput platform to
analyze the locomotion (mainly centroid speed) of up to 50 worms in parallel, for example, enabling quantification
of drug-induced paralysis (Ramot et al., 2008). The overall setup is similar to Nemo and all software packages are
implemented in MATLAB (http://wormsense.stanford.edu/tracker/). Video capture is performed using the
VideoCapture module, which is compatible with any camera capable of communicating with the MATLAB Image
Acquisition Toolbox. Thereafter tracking is performed off-line by the WormTracker module. The tracker records the
centroid position of tens of worms in sequential movie frames extracted from uncompressed grayscale (8 bit) .avi
format video files with a resolution of 640 × 480 pixels. If two animals collide, the tracking of each animal is
terminated. New tracks are assigned to the animals once they separate. WormTracker only stores those tracks for
analysis that persist for more than a certain amount of frames. Next, the WormAnalyzer package provides tools for
analysis and display of generated tracks. It is capable of automatic detection of turning events or pirouettes, as
described earlier by Chalasani et al. (2007), measures the speed of individual worms and can use these data to
measure the fraction of worms that are paralyzed by drug application. The preferences for each module are stored in
an Excel file and analyzed data can be exported as figures or tables.
The Alexander Gottschalk lab, i.e. the authors of this review, is interested in combining worm tracking with
optogenetics-assisted modulation of neuronal activity. Thus, the parallel worm tracker software was extended
with a program that allows controlling an electronic shutter, blocking out light from, e.g., an HBO (Hg = mercury,
B = luminance, O = unforced cooling) light source, through the LPT (parallel) port of the computer by sending a TTL
(Transistor-Transistor Logic) pulse. In this way, a series of predefined light pulses can be applied to the worms for
optogenetics-based behavioral studies. A user-friendly interface for this “OptoTracker” has also been generated, in
which individual users can load the different modules (VideoCapture, WormTracker, WormAnalyzer and the
additional Shutter module) with their saved preferences, to export acquired raw data into an Excel file for further
characterization.
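The principle of such a predefined pulse protocol can be sketched in MATLAB as follows; openShutter and closeShutter are hypothetical stand-ins for the TTL signal sent through the parallel port (they are not functions of the released package), and the pulse timings are arbitrary examples:

pulseOnsets   = [30 90 150];      % seconds after start of tracking, assumed
pulseDuration = 10;               % seconds of illumination per pulse, assumed
t0 = tic;
for k = 1:numel(pulseOnsets)
    while toc(t0) < pulseOnsets(k), pause(0.01); end
    openShutter();                               % TTL high: shutter opens, light reaches the plate
    while toc(t0) < pulseOnsets(k) + pulseDuration, pause(0.01); end
    closeShutter();                              % TTL low: shutter closes again
end

Because the pulse times are defined relative to the start of tracking, the illumination protocol stays synchronized with the recorded video.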
The source code for the parallel worm tracker and a user manual can be downloaded from
http://wormsense.stanford.edu/tracker/, while the OptoTracker variant can be found on the Gottschalk lab website
(http://www.biochem.uni-frankfurt.de/index.php?id=236), together with an installation and user manual. This
system provides a simple, though efficient, solution for multi worm tracking and only requires MATLAB software,
including the Image Acquisition and the Image Processing Toolboxes, a digital video camera and a microscope (if at
all).
2.4. The Multi Worm Tracker
The Rex Kerr and Catharine Rankin labs recently described a multi-worm tracking system that was used to
analyze spontaneous movement on food, chemotaxis and habituation of response to tap stimulation. The software
package consists of real-time image-analysis software, called “Multi-Worm Tracker (MWT)”, and an additional
program, “Choreography”, for off-line analysis of different behavioral parameters (Swierczek et al., 2011). The
MWT is used to provide basic features of the worm including its position and outline, whereas Choreography has to
be employed to extract additional features after selecting the appropriate objects. The system can conveniently track
up to 120 animals per plate. More animals can be reliably tracked when dropping frames during the real time
processing of the MWT; alternatively, one might expect to track more worms by increasing the processing
power of the computer used.
The core of the hardware system is a high-end digital camera (Falcon 4M30, 4 Megapixel (2352 × 1728), 31 fps,
10 bit digital camera equipped with a 25 mm modular focus block) that renders the use of a microscope system
unnecessary. A special frame grabber is used to acquire the uncompressed video stream during experiments. As the
camera streams the video at a full rate of 7.2 gigabytes per minute, only the tracking files are stored during an
experiment. It is recommended to use a stand for the camera and to build a stage to hold the Petri dish in front of the
camera lens. The camera's high resolution allows the animals to be visualized at 24 µm/pixel, which reduces the
measurement errors due to pixel flickering. The tracking procedure searches for animals in the first frame of a movie
and draws a box around these. In the following frames, only the area of the boxes will be tracked; all other pixels are
skipped. The position of the worm is stored and the box around the animal is refreshed for the next frame. After a
defined number of frames, a subpart of the whole image is searched for new worms entering the field of view,
creating new boxes where needed. These subparts are cycled through without decreasing the tracking capability. It is
important to achieve homogenous background lighting as parts of the field of view will not be addressed due to
over- or underexposure when unevenly illuminated. It is also crucial to use synchronized worms: since animal
recognition is performed through particle size analysis, all pictured animals should have the same size. When two
animals collide, their size is added and counted as one particle. These animals are ignored by the MWT until they
separate and are again identified as single worms by the animal search algorithm. This might be an issue when
tracking higher numbers of animals simultaneously for longer periods of time.
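The size-based detection step can be illustrated with the following MATLAB sketch (our own illustration of the principle, not the MWT source code, which is written in LabVIEW/C++; frame is an assumed image variable and the area bounds are placeholder values for synchronized adults):

minArea = 200; maxArea = 800;                    % acceptable particle areas (pixels), assumed
bw    = imbinarize(frame);                       % bright worms on a dark background assumed
stats = regionprops(bw, 'Area', 'BoundingBox', 'Centroid');
keep  = [stats.Area] >= minArea & [stats.Area] <= maxArea;   % drop dust and merged (colliding) worms
boxes = reshape([stats(keep).BoundingBox], 4, []).';         % one bounding box per tracked animal
% In later frames only the pixels inside (slightly enlarged) boxes are processed,
% and the boxes are refreshed from the newly found worm positions.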
All required software and documentation has been published on SourceForge (http://sourceforge.net/projects/mwt/).
The package is coded in C++ and based on LabVIEW (MWT) and Java (Choreography).
Unfortunately, Choreography has no GUI and is operated via Java commands. In addition to multi-worm tracking and
analysis of the data, the software also allows the control of up to three stimulus-presenting systems. This permits
multi-worm tracking while giving a computer-controlled mechanical tap to the plate or presenting a “puff” of air
over the NGM Petri dish. The system can also be used to challenge animals with a light pulse (e.g. delivered by a
ring of LEDs), for whole field optogenetics experiments (Ardiel and Rankin, unpublished). Furthermore, this system
may be used to quantify high-throughput swimming assays.
2.5. Multimodal illumination and tracking system for optogenetic analyses of circuit function
The tracker developed recently in the Lu lab resolves a long-standing problem in optical stimulus delivery
for optogenetic manipulation of animal behavior: patterned illumination with various wavelengths at the same time
to target several distinct optogenetic tools in the same animal, addressing different nodes of a neuronal network
(Stirman et al., 2011; Stirman et al., 2012). The system combines an inverted microscope and a commercially
available video projector for multi-color illumination of physically separated cells. At the same time, the position of
the worm is tracked by a movable x-y stage and various behavioral parameters, such as velocity and body curvature,
can be analyzed. The spatial resolution of the presented system has been calculated to be 14 µm/pixel at 25 Hz,
depending on the objective used.
The software saves two video streams: the originally acquired video of the behaving animal, and a parallel
video stream with information regarding the pattern of light used to stimulate optogenetic tools expressed in
particular neurons. There is the option to merge both videos, where the area being optically activated is marked in
the color of the channel used for stimulation (red, green, blue, or combinations of these 3 colors). The calibration,
steering and analysis programs are coded in LabVIEW and all required software can be downloaded as supplements
accompanying the paper (Stirman et al., 2011; http://www.nature.com/nmeth/journal/v8/n2/extref/nmeth.1555-S10.zip).
A step-by-step protocol to make the essential optical changes to the off-the-shelf LCD projector is also
available, as well as instructions on how to use the different software packages required (Stirman et al., 2012).
Briefly, the multimodal illumination and tracking system consists of three main LabVIEW programs. The first one is
used for calibration prior to measurements. The second program performs the real-time tracking and patterned
multimodal illumination while recording the movies. The third software package (consisting of different
sub-programs) is used for post processing, i.e., head encoding, complete video analysis, and analysis of multiple
data files in batch mode and presentation.
The major advantage of the Lu system is that up to three different colors of light, each with 256 independent
levels of intensity, can be used simultaneously, which is essential for combining different optical tools with shifted
action spectra; for example, ChR2 (major activation peak at 460 nm), Mac (peak at 535 nm) and NpHR (peak at
590 nm) (Chow et al., 2010; Nagel et al., 2003; Zhang et al., 2007). The applied wavelength depends on the installed
band pass filter; therefore, changing the stimulation color is as convenient and cost-effective as possible. Similarly
important is the fact that the Lu system can be assembled from a relatively cheap, off-the-shelf commercial video
projector (less than 3000 US$ for a projector, band pass filters and the LabVIEW license). The software uses a
common USB camera capable of communication with the LabVIEW Vision add-on and 25 Hz acquisition.
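For illustration, a patterned, multi-color stimulus frame for the projector could be composed as sketched below (sketched in MATLAB with the Image Processing Toolbox even though the published system is written in LabVIEW; the projector resolution and the mask outlines headOutlineX/Y and tailOutlineX/Y are hypothetical). Each color channel carries an 8-bit intensity, so up to three optogenetic tools can be addressed independently:

stim = zeros(768, 1024, 3, 'uint8');     % one projector frame, resolution assumed
headMask = poly2mask(headOutlineX, headOutlineY, 768, 1024);   % region over the head neurons
tailMask = poly2mask(tailOutlineX, tailOutlineY, 768, 1024);   % region over the tail neurons
stim(:,:,3) = uint8(headMask) * 255;     % blue channel at full power (e.g., ChR2)
stim(:,:,2) = uint8(tailMask) * 128;     % green channel at half power (e.g., Mac)
% stim would then be sent to the projector output after the calibration step that
% maps camera coordinates onto projector coordinates.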
2.6. CoLBeRT: control locomotion and behavior in real time
The Samuel lab published a system, “CoLBeRT”, for controlling locomotion and behavior in real time.
CoLBeRT uses a digital micromirror device (DMD, from Texas Instruments) to reflect a diode-pumped solid-state
laser in order to achieve selective illumination (Leifer et al., 2011). The high light intensities come with the drawback
that only one color of light can be used at a time, hampering the simultaneous use of different optogenetic tools, at
least in the published version of the system. The tracking and illumination setup operates at 50 Hz and is excellent
for analyzing body curvature. The DMD has a spatial limit of 5 µm for the minimal area that may be accessed
through CoLBeRT. This minimal area is larger for fast moving animals, i.e. about 30 µm for swimming worms. The
Samuel lab used viscous solutions to slow down the locomotion of the animals under study, allowing higher spatial
accuracy in directing light at the respective cells.
The “MindControl” software used to track a worm and create illumination patterns in real time is written in
the “C” programming language and is also available together with documentation through “github”
(http://github.com/samuellab/mindcontrol and https://github.com/samuellab/mindcontrol-analysis). MindControl
stores two video sequences, an original stream and one with annotations regarding the optogenetic stimulation.
During an experiment, a GUI allows one to change the optogenetic stimulation in real time as well as delivering
manual stimulations. The raw data is stored in YAML format (a human-readable format to serialize/store data). This
dataset is then processed in MATLAB to retrieve a quantitative analysis of the experiment, for instance, with
kymographs (a graph of spatial position vs. time) of the worm's locomotion.
In conclusion, this system is the fastest real-time single worm tracker to date, capable of spatially restricted
optogenetic manipulations. The disadvantage of CoLBeRT, however, is that only one color of light can be used at
the same time. Also, the acquisition cost is considerably higher compared to the Lu system (see Table 1).
2.7. The opto-mechanical system for imaging or manipulation of neuronal activity in freely moving
animals
Non-invasive neuronal manipulation via optogenetics in freely moving animals, while simultaneously tracking
evoked behaviors (as discussed above), has been a significant step forward in the analysis of neural network
function. In addition to optogenetics-assisted manipulation, imaging of neuronal activity in untethered, freely
moving animals is a major technical challenge when combined with tracking and quantification of locomotory
behavior. The image-free opto-mechanical system developed in the Lockery lab promises to address both
approaches (Faumont et al., 2011). This system can be used to create virtual environments by optogenetic activation
of sensory neurons, or to image activity in identified neurons at high magnification. The system uses two light paths
with different magnifications. The first path, with lower magnification, is used for behavioral analysis and records
the image of the animal in a standard gray-scale movie. The second path has a higher magnification (typically
63x-100x) and is used for Ca2+-imaging and the actual tracking procedure. For this purpose, a beam splitter redirects
a small amount (20%) of the light destined for the Ca2+-imaging camera onto a four-quadrant photomultiplier tube (PMT). The
four analog signal intensities are directed to a motorized stage controller, which regulates the speed of the servo
motors in order to keep the brightest spot at the center of the PMT. This approach thus requires a trackable bright
spot, for instance, a cell expressing a fluorescent protein marker. As no software processing is required for stage
control, i.e., this part is an all-analog system, this is the fastest tracking system available to date, allowing one to
track neurons in animals thrashing in liquid. The combination of two recordings with different magnifications allows
worm tracking in parallel with Ca2+-imaging in single neurons, which is a feature not commonly seen in tracking
systems. The system can also create a so-called “virtual environment” by projecting light into a user-defined pattern.
This projection can be used to control activity of (sensory) neurons expressing optogenetic tools, e.g., mimicking an
aversive stimulus by specifically expressing channelrhodopsin in, and photoactivating the, polymodal aversive
neuron ASH. The instructions needed to build the centering device have been published (Faumont et al., 2011) and a
commercial version is available (PhotoTrack, Applied Scientific Instrumentation).
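Although the centering itself is implemented in analog electronics, the underlying logic can be sketched digitally in a few MATLAB lines (illustrative only; q holds the four quadrant intensities, and setStageVelocity and the gain are hypothetical):

% q = [upperLeft upperRight lowerLeft lowerRight], arbitrary intensity units
total = sum(q);
xErr  = ((q(2) + q(4)) - (q(1) + q(3))) / total;   % right minus left, normalized
yErr  = ((q(1) + q(2)) - (q(3) + q(4))) / total;   % top minus bottom, normalized
gain  = 0.5;                                       % proportional gain, assumed
setStageVelocity(-gain * xErr, -gain * yErr);      % hypothetical servo command: drive the
                                                   % stage so the bright spot re-centers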
2.8. Further systems allowing tracking and Ca2+ imaging in semi-restrained or freely behaving
animals
As briefly described above, when one is interested in the neuronal basis of behavior, one ideally wants to track
and quantify behavior, and at the same time record the activity of neurons involved in generating the behavior. Thus,
several systems have been described that allow tracking of semi-restrained or freely behaving animals, and
recording Ca2+ signals in transgenic neurons. These systems have been successfully used, but, to our knowledge, not
described in sufficient technical detail to be adopted by others. Thus, we can just mention them here and refer the
reader to the respective researchers if they are interested in setting up these systems themselves.
A first approach that partially achieved this goal was described by the Samuel lab (Clark et al., 2007). They
imaged fluorescent signals from the AFD thermosensory neuron expressing the ratiometric Ca2+ sensor cameleon.
This was done at intermediate magnification in animals whose tails were glued, but whose heads were free to move
within a thermal gradient. They also tracked animals freely moving in such a gradient, at low magnification, by
tracking the fluorescent neuron, in this case using a joystick-controlled x-y translational stage. Later the track of the
animal was extracted from the recorded stage (and relative neuron) positions.
A different system, tracking an animal moving on an open NGM plate automatically, and acquiring Ca2+
signals from a neuron of interest, was developed by the Didier Chatenay and Schafer labs, who imaged the AVA
backward command motor neuron in freely moving animals (Ben Arous et al., 2010). This system uses two cameras,
one for acquiring an image of the animal, which is used to track locomotion behavior and to re-center the stage
(using a low magnification objective), and another to record fluorescent signals of the AVA neuron expressing the
cameleon sensor (using higher magnification). The software operates at roughly 7 Hz, and is based on an ImageJ
script, thus no costly commercial software package is required.
The Mei Zhen lab developed a similar system that tracks fluorescent cells and animal behavior, based on
freely available software (MicroManager and ImageJ), and operating at up to 20 Hz (Kawano et al., 2011). This
system allows one to image multiple command (or “premotor”) interneurons as well as ventral cord motor neurons
expressing cameleon sensors in movement-restricted animals (under a cover slip, effectively slowing down but not
preventing locomotion).
The Shawn Xu and Zhaoyang Feng labs devised a tracker to study whether motor activity decline might be
used as a lifespan predictor (Hsu et al., 2009). Their system is based on a stereomicroscope with a digital camera and
a motorized stage. Custom software tracks the animal at 2 Hz for five minutes. The software was briefly described in
the original publication, but was not published for further use. This system was further developed to allow Ca2+
imaging, termed CARIBN (Ca2+ ratiometric imaging of behaving nematodes), and allows tracking as well as Ca2+
imaging using GCaMP3 (Piggott et al., 2011). They use DsRed (non-responsive to Ca2+) as a control for motion or
focusing artifacts. In addition, the system may be suited for optogenetics experiments, while imaging Ca2+ at the
same time. However, as the Ca2+ imaging light is also used for ChR2 activation, measurement of baseline
fluorescence for Ca2+ imaging is not possible and must be controlled in a separate experiment by imaging additional
animals not expressing ChR2. The CARIBN II system adds the option of controlling the z-axis, allowing automatic
focusing of the pictured neurons, as well as z-sectioning (Zheng et al., 2012). The latter function allows CARIBN II
to image multiple neurons concomitantly. Both versions of CARIBN are available upon request.
2.9. Behavioral arenas
Conventional trackers for freely moving animals on solid surfaces do not allow one to present odors in a
spatially and temporally controlled manner. To quantitatively understand chemosensory behaviors, the Bargmann
group recently described a microfluidics device allowing creation of precise spatiotemporal chemical environments
while monitoring the resulting behavioral output (Albrecht and Bargmann, 2011). This “behavioral arena” consists
of a 4 cm2 polydimethylsiloxane (PDMS) surface containing a structured micropost array (hexagonally arranged,
200 µm diameter pillars, separated by 100 µm) through which the nematodes can crawl. The arena height is set to
70 µm, which is roughly the diameter of a young adult animal. These parameters match the wavelength of normal
crawling behavior on an agar substrate. The microfluidic chip has different inlets for stimulus inflow, a worm
loading port with variable entry points and an outflow channel. Furthermore, the device boundaries are smooth in
order to minimize the animal's tendency to explore sharp corners. The different stimulus inlets are controlled by
valves, allowing different configurations of odor stimulation by generating gradients that mix two odor
concentrations or through temporal control by timed opening of a valve. The worm entry point to the arena is
variable depending on the device used. The system is equipped with a camera for recording the animal's behavior
during stimulus presentation. The image analysis is performed offline, using MATLAB code that is partially based
on the parallel worm tracker (Ramot et al., 2008). The system performs automated behavioral classification based on
the identification of five primary locomotory states: forward locomotion (straight or curved), pause, reversal,
pirouette reverse (the reversal before an omega turn) and pirouette forward (the subsequent resolution of the omega
turn). Data can be presented in stimulus-aligned ethograms in which the five states are color-coded and plotted over
time.
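A simplified, stimulus-aligned ethogram of this kind can be sketched in MATLAB as follows (only three of the five states are shown, and the speed threshold, state coding and color choice are our own assumptions, not the published classifier; speed is assumed to hold signed centroid speed per frame, one row per worm, and fps the frame rate):

pauseThresh = 0.02;                              % mm/s, assumed
state = ones(size(speed));                       % 1 = forward
state(abs(speed) < pauseThresh) = 2;             % 2 = pause
state(speed < -pauseThresh)     = 3;             % 3 = reversal
imagesc((1:size(speed,2)) / fps, 1:size(speed,1), state);
caxis([1 3]);
colormap([0 0.6 0; 0.8 0.8 0.8; 0.8 0 0]);       % forward, pause, reversal
xlabel('Time (s)'); ylabel('Animal'); title('Ethogram');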
The single-layer PDMS chips can be easily re-designed and the microfluidics system works with standard
Luer valves. Second-generation devices promise high-throughput behavioral analysis. Concomitant separate
population measurement is achieved by dividing the arena with worm barriers. A 2- and a 4-arena multiplexed
device have been designed with multiple fluid inlets. These allow up to four different populations to be challenged
with four unique stimuli simultaneously during one experiment.
2.10. The WormLab, a commercially available worm tracker
MicroBrightField Inc. developed a commercially available worm tracker called WormLab
(http://www.mbfbioscience.com/wormlab). At the time of writing, the software is available for pre-purchase, with
the option for a complete system including microscope, video camera and motorized stage. The software features
include tracking of selected worms through their centroid, head or tail markers. The analysis comprises the worm's
velocity, position, area, direction and wavelength, as well as the track's length.
3. Worm trackers optimized for liquid environments
C. elegans thrashing and swimming behavior have been effectively tracked by morphological analysis, but this
approach requires high computing power. The Sattelle lab published a new approach to quantify thrashing assays
without morphometry (Buckingham and Sattelle, 2009). Their software is based on covariance analysis. First, the
background is extracted from the images through a technique employing Principal Component Analysis (the
background is represented by the maximum covariance and is therefore coded by the first principal component).
Then a covariance matrix is computed for all frames. This matrix shows frames that are statistically significantly
similar to each other. Counting the number of frames between two similar ones allows one to identify the time
needed to complete a full swing during thrashing, which ultimately leads to the thrashing frequency. This system
was conceived for high-throughput analysis of worm swimming behavior, but requires one worm per video to be
analyzed. The Feng lab further improved the system with a program capable of controlling a motorized stage (Zheng
et al., 2011). This software automatically records a movie of each well in a multi-well plate with parameters set by
the user. The improved system combines efficient thrashing assay analysis with high-throughput screening. The
source code is written in C and compiled in LabWindows (NI, Austin, TX, USA). The thrashing analysis core
program does not require any specific hardware. The only requirement for the video is that only one animal is
depicted. The system is easily deployed and the instructions for the hardware needed for the high-throughput
measurements are available from the authors upon request.
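The frame-similarity principle can be sketched in MATLAB as follows (our own illustration of the idea, not the published code; frames is assumed to be a grayscale image stack of a single swimming worm and fps its frame rate):

% frames: H x W x N grayscale stack; the posture recurs once per thrashing cycle
N = size(frames, 3);
X = double(reshape(frames, [], N));          % one column per frame
X = X - mean(X, 2);                          % remove the static background (per-pixel mean)
C = corrcoef(X);                             % N x N frame-to-frame similarity matrix
s = C(1, 2:end);                             % similarity of every later frame to frame 1
pk = find(s(2:end-1) > s(1:end-2) & s(2:end-1) > s(3:end)) + 1;   % local similarity maxima
thrashFreq = fps / mean(diff(pk));           % average peak spacing gives the cycle period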
Furthermore, the Blakely lab created a system for automatic analysis of worm (in)activity in fluids. A
MATLAB script automatically analyzes the thrashing frequency of a single worm through a Fast-Fourier transform
of the movement frequency (Matthies et al., 2006). Although the software is not published online, the authors do
share it upon request.
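The FFT approach can be sketched as follows (illustrative only; here a per-frame activity signal is derived from pixel changes between consecutive frames, which is one possible choice rather than the published implementation; frames and fps are assumed as above):

activity = squeeze(sum(sum(abs(diff(double(frames), 1, 3)), 1), 2));  % one value per frame pair
activity = activity - mean(activity);        % remove the DC offset
n  = numel(activity);
P  = abs(fft(activity)).^2;                  % power spectrum of the activity signal
f  = (0:n-1) * fps / n;                      % frequency axis (Hz)
half = 2:floor(n/2);                         % ignore DC and the mirrored half
[~, idx] = max(P(half));
thrashFreq = f(half(idx));                   % dominant frequency = thrashing frequency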
4. Additional analysis tools for quantifying C. elegans behavior
The following tools do not contain instructions for controlling x-y stages, thus they should be considered as
stand-alone video analysis tools that require videos or images as input.
4.1. Eigenworms: Low-dimensional superposition of principal components
Greg J. Stephens from the William Ryu lab showed that the space of shapes adopted by the worm can be
described with just four “elementary shapes”, or “eigenworms” that provide a quantitative description of worm
behavior, accounting for 95% of the variance in N2 shapes (Stephens et al., 2008). As the worm's shape determines
its motion, characterization of the shape dynamics provides insights into locomotion behavior. Variations along the
eigenworms thus offer a quantitative characterization of classical locomotion parameters such as forward crawling,
reversals, omega bends etc. For this work, they built a homemade tracking system and used MATLAB to capture
and process the images in order to calculate the eigenworms. Images of worms were first skeletonized to a
single-pixel thick backbone that was segmented into 101 parts such that 100 angles between these segments could be
calculated in order to deduce the four eigenworms or “modes”. The first two modes are sinusoidal-like oscillations
that describe the orthogonal phases of a wave along the body. The third mode is related to the curvature and is thus
used to identify or describe turns or omega bends. The fourth mode contributes to the shape of the head and tail
region of the worm. One should interpret this approach as a projection or reduction of motor behaviors onto four
templates or parameters with variable strengths. Mapping the dynamics of the shape space to the trajectory of the
moving worm can reveal subtle differences in locomotion (Stephens et al., 2010; Stephens et al., 2011).
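Computing the eigenworms amounts to a standard principal component analysis of the body-angle matrix, which can be sketched in MATLAB as follows (variable names are assumptions; theta is an nFrames × 100 matrix of tangent angles along the skeleton, one row per frame):

thetaC = theta - mean(theta, 1);             % mean-centered angles
[~, S, V] = svd(thetaC, 'econ');             % principal component analysis via SVD
eigenworms   = V(:, 1:4);                    % the four elementary shapes ("modes")
varExplained = cumsum(diag(S).^2) / sum(diag(S).^2);
fprintf('First 4 modes capture %.1f%% of shape variance\n', 100 * varExplained(4));
amplitudes = thetaC * eigenworms;            % per-frame loading on each mode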
The approach to calculate the eigenworms can easily be executed as a stand-alone MATLAB-based program
and virtually any movie file can be analyzed (after thresholding and transformation into individual frames by other
video processing programs like VirtualDub). The use of a tracking system is not required, as one is only interested in the shape space of the worms. When using this software tool, one can simply move the plate by hand to keep the worm in the center of the field of view. The backbone length also provides an accurate measure of the worm's length. This approach was used when measuring body contractions or elongations evoked by depolarization of
muscle cells or cholinergic neurons via optogenetic tools (Liewald et al., 2008). Thereafter, a microfluidics device
was developed by the Lu lab for high-throughput automation of body length measurements to investigate synaptic
transmission (Stirman et al., 2010), utilizing the algorithm devised by Stephens to analyze worm length (Stephens et
al., 2008).
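Measuring body length from the skeleton amounts to summing the segment lengths along the backbone; a minimal sketch (the input array shape is an assumption):

import numpy as np

def backbone_length(skeleton):
    """skeleton: (n_points, 2) ordered backbone coordinates -> total length in pixels."""
    return np.linalg.norm(np.diff(skeleton, axis=0), axis=1).sum()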
4.2. An analysis tool for the description of bending angles during swimming or crawling
Body bends during crawling and swimming behaviors are best displayed through kymographs of the worm's
body angles (or curvature) with respect to time. Although many trackers have an option to analyze these features, it
is unnecessary to install an expensive system if one wishes only to address these aspects of locomotion. The Steven
McIntire lab created video analysis software capable of displaying worm bending as kymographs/curvature matrices
(Pierce-Shimomura et al., 2008). The software is programmed as a custom image analysis algorithm in ImagePro
(Media Cybernetics). Videos are recorded with a resolution of 2.9 µm/pixel at a frequency of 30 Hz. The software
recognizes the animal and describes it through its midline. This skeleton is subdivided into 13 segments and the
angles between them are color-coded to form an image of the angles over time. The columns created for each frame
of the video are concatenated to form the curvature matrix. This method is well suited for displaying changes in C. elegans body curvature during locomotion, since interpretation and comparison of curvature matrices are intuitive.
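If skeletonized worm outlines are already available, a comparable kymograph can be assembled with a few lines of code. This sketch is not the authors' ImagePro macro; it assumes per-frame skeletons of 14 backbone points defining 13 segments, and plots the angles between adjacent segments over time.

import numpy as np
import matplotlib.pyplot as plt

def curvature_matrix(skeletons):
    """skeletons: (n_frames, 14, 2) backbone points -> (12, n_frames) bend-angle matrix."""
    seg = np.diff(skeletons, axis=1)                       # 13 segment vectors per frame
    theta = np.unwrap(np.arctan2(seg[..., 1], seg[..., 0]), axis=1)
    bends = np.diff(theta, axis=1)                         # 12 angles between adjacent segments
    return bends.T                                         # rows: body position, columns: time

def plot_kymograph(kymo, fps):
    """Display the curvature matrix as a color-coded kymograph."""
    plt.imshow(kymo, aspect="auto", cmap="RdBu_r",
               extent=[0, kymo.shape[1] / fps, kymo.shape[0], 0])
    plt.xlabel("time (s)")
    plt.ylabel("body segment (head to tail)")
    plt.colorbar(label="bend angle (rad)")
    plt.show()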
4.3. The Worm Analysis System
The open source Worm Analysis System implements the FARSIGHT Toolkit with a fully integrated GUI
(http://farsight-toolkit.org/wiki/Worm_Analysis_System and http://farsight-toolkit.org/wiki/Worm_Features%26
Events). The software analyzes movie files of multiple worms from which different parameters can be calculated,
like the worm's length, width, curvature, area and speed (Roussel et al., 2007). It also describes the worm's state, such as forward motion, omega bend or pause. At the time of writing, the developers are working on a solution for
collision detection. The software is capable of tracking two or more contacting nematodes, even if they partially
overlap. The software has been optimized to use the graphics processing unit (GPU) during computation. Note that implementing this tracker requires more programming skills than the other systems described here.
4.4. The Multi-Environment Model Estimation for Motility Analysis
The Josue Sznitman lab recently described a new strategy for image recognition: the Multi-Environment
Model Estimation (MEME) for C. elegans motility analysis (Sznitman et al., 2010). The software is coded in
MATLAB, all functions are accessed through a GUI and it is available upon request from the authors. MEME is an
off-line image analysis software capable of recognizing worm body boundaries in image conditions that would not
be tolerated by threshold-based worm trackers. As output, MEME “skeletonizes” the worm and saves images of the
skeleton as well as a MATLAB file containing the x-y coordinates of the nematode over time. The software relies on
the idea of a Mixture of Gaussians (MOG). Briefly, MOG methods model each pixel's intensity as a random variable drawn from a mixture of Gaussian distributions. The background of an image can then be recognized by analyzing the fitted distributions of all pixels, which requires a “background only” image (readers are referred to Sznitman et al. (2010) for details of the method). The MEME strategy is more reliable than common thresholding methods when recognizing worms in microfluidic chips. The MEME software requires a sequence of images as
input, which has to be manually extracted by third-party software such as VirtualDub. The software does not control
the image acquisition. MEME runs under the MATLAB R2009b release with the Image Processing Toolbox.
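To illustrate the underlying idea (the published MEME algorithm is considerably more elaborate), the following sketch fits a single Gaussian per pixel from worm-free "background only" frames and labels as foreground any pixel that deviates by more than a chosen number of standard deviations. The threshold k and the single-Gaussian simplification are assumptions made here for brevity, not part of the published method.

import numpy as np

def fit_background(background_frames):
    """background_frames: (n, h, w) worm-free images -> per-pixel mean and standard deviation."""
    mu = background_frames.mean(axis=0)
    sigma = background_frames.std(axis=0) + 1e-6           # avoid division by zero
    return mu, sigma

def foreground_mask(frame, mu, sigma, k=3.0):
    """Label pixels deviating by more than k standard deviations as candidate worm pixels."""
    return np.abs(frame.astype(float) - mu) > k * sigma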
5. Possible future developments
Regarding future development of worm trackers, one might expect advances in three fields. The first field is
data acquisition. Currently, some tracking systems can depict freely behaving animals with a resolution reaching
10 µm per pixel for whole animals, or they can track single (fluorescent) cells in a region of interest. Multi-channel
acquisition allows trackers to not only depict the animals’ behavior, but also make use of fluorescent reporters to
correlate behavior and, for instance, second messenger signals (e.g., Ca²⁺). Parallel acquisition with different
magnifications (e.g., using low and high magnification objectives above and below the specimen plane) allows
focusing on distinct behavioral aspects. The second field is dedicated to stimulus application. Although many
systems allow some sort of stimulus application during imaging, the nature of the stimulus is limited. Combination
of optical, mechanical and thermal stimulation in real time is likely to boost C. elegans research. These aspects lead
to the third field of innovation: modularity. For the time being, researchers working with worm trackers will likely need more than one system to address all of their research questions. As can be grasped from this review, several labs have developed solutions to the same problem, which is beneficial on the one hand, as different ideas are developed and different aspects can be tackled. On the other hand, these systems are generally not compatible, and it would therefore be desirable to have a common basic system that can be extended individually, with new modules shared on an open-access basis (this is relatively straightforward for software, but less easy for hardware development). Modularity of worm trackers would allow one to build upon a common framework and enhance existing systems. Most trackers are not bound to a specific hardware configuration, although the initial configuration of a new system may still be painstakingly difficult; in the future, such hardware ties should play a lesser role. One might expect the qualities of many of the described worm trackers to be combined into such a framework,
allowing a much broader approach for C. elegans tracking and enriching research in the worm community.
6. Conclusion
Due especially to the well-defined nervous system of C. elegans, neurobiologists in the worm community aim
at a comprehensive functional description of neuronal networks, assessing information flow from sensory neurons
through different circuit layers and motor circuits that define a prevalent behavioral response. Due to the availability
of various assays and optogenetic tools, precise behavioral parameterization is required to allow straightforward
(statistical) analysis and comparisons of the data. To this end, many tracking systems have been developed, each
with their individual strengths and applicability. The current overview aims at providing a guideline to keep track of
all the tracking systems and we hope that this WormBook chapter facilitates the search for a specific setup to fulfill
individual needs.
7. References
Albrecht, D.R., and Bargmann, C.I. (2011). High-content behavioral analysis of Caenorhabditis elegans in precise
spatiotemporal chemical environments. Nat. Methods 8, 599-605. Abstract Article
Ben Arous, J., Tanizawa, Y., Rabinowitch, I., Chatenay, D., and Schafer, W.R. (2010). Automated imaging of
neuronal activity in freely behaving Caenorhabditis elegans. J. Neurosci. Methods 187, 229-34. Abstract Article
Boyden, E.S., Zhang, F., Bamberg, E., Nagel, G., and Deisseroth, K. (2005). Millisecond-timescale, genetically
targeted optical control of neural activity. Nat. Neurosci. 8, 1263-1268. Abstract Article
Brenner, S. (1974). The genetics of Caenorhabditis elegans. Genetics 77, 71-94. Abstract
Buckingham, S.D., and Sattelle, D.B. (2009). Fast, automated measurement of nematode swimming (thrashing)
without morphometry. BMC Neurosci. 10, 84. Abstract Article
Chalasani, S.H., Chronis, N., Tsunozaki, M., Gray, J.M., Ramot, D., Goodman, M.B., and Bargmann, C.I. (2007).
Dissecting a circuit for olfactory behaviour in Caenorhabditis elegans. Nature 450, 63-70. Abstract Article
Chow, B.Y., Han, X., Dobry, A.S., Qian, X., Chuong, A.S., Li, M., Henninger, M.A., Belfort, G.M., Lin, Y.,
Monahan, P.E., and Boyden, E.S. (2010). High-performance genetically targetable optical neural silencing by
light-driven proton pumps. Nature 463, 98-102. Abstract Article
Clark, D.A., Gabel, C.V., Gabel, H., and Samuel, A.D. (2007). Temporal activity patterns in thermosensory neurons
of freely moving Caenorhabditis elegans encode spatial thermal gradients. J. Neurosci. 27, 6083-90. Abstract
Article
Cronin, C.J., Mendel, J.E., Mukhtar, S., Kim, Y.M., Stirbl, R.C., Bruck, J., and Sternberg, P.W. (2005). An
automated system for measuring parameters of nematode sinusoidal movement. BMC Genet. 6, 5. Abstract Article
Davis, M.W., Morton, J.J., Carroll, D., and Jorgensen, E.M. (2008). Gene activation using FLP recombinase in C.
elegans. PLoS Genet 4, e1000028. Abstract Article
de Bono, M., and Bargmann, C.I. (1998). Natural variation in a neuropeptide Y receptor homolog modifies social
behavior and food response in C. elegans. Cell 94, 679-689. Abstract Article
Deisseroth, K. (2011). Optogenetics. Nat. Methods 8, 26-29. Abstract Article
Dhawan, R., Dusenbery, D.B., and Williams, P.L. (1999). Comparison of lethality, reproduction and behavior as
toxicological endpoints in the nematode Caenorhabditis elegans. J. Toxicol. Environ. Health A 58, 451-462.
Abstract Article
Dusenbery, D.B. (1985). Using a microcomputer and video camera to simultaneously track 25 animals. Comput.
Biol. Med. 15, 169-175. Abstract Article
Faumont, S., Rondeau, G., Thiele, T.R., Lawton, K.J., McCormick, K.E., Sottile, M., Griesbeck, O., Heckscher,
E.S., Roberts, W.M., Doe, C.Q., and Lockery, S.R. (2011). An image-free opto-mechanical system for creating
virtual environments and imaging neuronal activity in freely moving Caenorhabditis elegans. PLoS ONE 6, e24666.
Abstract Article
Feng, Z., Cronin, C.J., Wittig, J.H., Jr., Sternberg, P.W., and Schafer, W.R. (2004). An imaging system for
standardized quantitative analysis of C. elegans behavior. BMC Bioinformatics 5, 115. Abstract Article
Hardaker, L.A., Singer, E., Kerr, R., Zhou, G., and Schafer, W.R. (2001). Serotonin modulates locomotory behavior
and coordinates egg-laying and movement in Caenorhabditis elegans. J. Neurobiol. 49, 303-313. Abstract Article
Hodgkin, J. (1983). Male phenotypes and mating efficiency in Caenorhabditis elegans. Genetics 103, 43-64.
Abstract
Hsu, A.L., Feng, Z., Hsieh, M.Y., and Xu, X.Z. (2009). Identification by machine vision of the rate of motor activity
decline as a lifespan predictor in C. elegans. Neurobiol. Aging 30, 1498-1503. Abstract Article
Kawano, T., Po, M.D., Gao, S., Leung, G., Ryu, W.S., and Zhen, M. (2011). An imbalancing act: gap junctions
reduce the backward motor circuit activity to bias C. elegans for forward locomotion. Neuron 72, 572-86. Abstract
Article
Leifer, A.M., Fang-Yen, C., Gershow, M., Alkema, M.J., and Samuel, A.D.T. (2011). Optogenetic manipulation of
neural activity in freely moving Caenorhabditis elegans. Nat. Methods 8, 147-152. Abstract Article
Liewald, J.F., Brauner, M., Stephens, G.J., Bouhours, M., Schultheis, C., Zhen, M., and Gottschalk, A. (2008).
Optogenetic analysis of synaptic function. Nat. Methods 5, 895-902. Abstract Article
Macosko, E.Z., Pokala, N., Feinberg, E.H., Chalasani, S.H., Butcher, R.A., Clardy, J., and Bargmann, C.I. (2009). A
hub-and-spoke circuit drives pheromone attraction and social behaviour in C. elegans. Nature 458, 1171-1175.
Abstract Article
Matthies, D.S., Fleming, P.A., Wilkes, D.M., and Blakely, R.D. (2006). The Caenorhabditis elegans choline
transporter CHO-1 sustains acetylcholine synthesis and motor function in an activity-dependent manner. J. Neurosci.
26, 6200-6212. Abstract Article
Nagel, G., Brauner, M., Liewald, J.F., Adeishvili, N., Bamberg, E., and Gottschalk, A. (2005). Light activation of
channelrhodopsin-2 in excitable cells of Caenorhabditis elegans triggers rapid behavioral responses. Curr. Biol. 15,
2279-2284. Abstract Article
Nagel, G., Szellas, T., Huhn, W., Kateriya, S., Adeishvili, N., Berthold, P., Ollig, D., Hegemann, P., and Bamberg,
E. (2003). Channelrhodopsin-2, a directly light-gated cation-selective membrane channel. Proc. Natl. Acad. Sci.
U.S.A. 100, 13940-13945. Abstract Article
Pierce-Shimomura, J.T., Chen, B.L., Mun, J.J., Ho, R., Sarkis, R., and McIntire, S.L. (2008). Genetic analysis of
crawling and swimming locomotory patterns in C. elegans. Proc. Natl. Acad. Sci. U.S.A. 105, 20982-20987.
Abstract Article
Pierce-Shimomura, J.T., Morse, T.M., and Lockery, S.R. (1999). The fundamental role of pirouettes in
Caenorhabditis elegans chemotaxis. J. Neurosci. 19, 9557-9569. Abstract
Piggott, B.J., Liu, J., Feng, Z., Wescott, S.A., and Xu, X.Z. (2011). The neural circuits and synaptic mechanisms
underlying motor initiation in C. elegans. Cell 147, 922-933. Abstract Article
Ramot, D., Johnson, B.E., Berry, T.L., Jr., Carnell, L., and Goodman, M.B. (2008). The Parallel Worm Tracker: a
platform for measuring average speed and drug-induced paralysis in nematodes. PLoS ONE 3, e2208. Abstract
Article
Roussel, N., Morton, C.A., Finger, F.P., and Roysam, B. (2007). A computational model for C. elegans locomotory
behavior: application to multiworm tracking. IEEE Trans. Biomed. Eng. 54, 1786-97. Abstract Article
Simonetta, S.H., and Golombek, D.A. (2007). An automated tracking system for Caenorhabditis elegans locomotor
behavior and circadian studies application. J. Neurosci. Methods 161, 273-280. Abstract Article
Soll, D.R. (1995). The use of computers in understanding how animal cells crawl. Int. Rev. Cytol. 163, 43-104.
Abstract Article
Stephens, G.J., Bueno de Mesquita, M., Ryu, W.S., and Bialek, W. (2011). Emergence of long timescales and stereotyped
behaviors in Caenorhabditis elegans. Proc. Natl. Acad. Sci. U.S.A. 108, 7286-7289. Abstract Article
Stephens, G.J., Johnson-Kerner, B., Bialek, W., and Ryu, W.S. (2008). Dimensionality and dynamics in the
behavior of C. elegans. PLoS Comput. Biol. 4, e1000028. Abstract Article
Stephens, G.J., Johnson-Kerner, B., Bialek, W., and Ryu, W.S. (2010). From modes to movement in the behavior of
Caenorhabditis elegans. PLoS ONE 5, e13914. Abstract Article
Stirman, J.N., Brauner, M., Gottschalk, A., and Lu, H. (2010). High-throughput study of synaptic transmission at the
neuromuscular junction enabled by optogenetics and microfluidics. J. Neurosci. Methods 191, 90-93. Abstract
Article
Stirman, J.N., Crane, M.M., Husson, S.J., Gottschalk, A., and Lu, H. (2012). Assembly of a multispectral optical
illumination system with precise spatiotemporal control for the manipulation of optogenetic reagents. Nat. Protocols
7, 207-220. Abstract
Stirman, J.N., Crane, M.M., Husson, S.J., Wabnig, S., Schultheis, C., Gottschalk, A., and Lu, H. (2011). Real-time
multimodal optical control of individual neurons and muscles in freely behaving Caenorhabditis elegans. Nat.
Methods 8, 153-158. Abstract Article
Swierczek, N.A., Giles, A.C., Rankin, C.H., and Kerr, R.A. (2011). High-throughput behavioral analysis in C.
elegans. Nat. Methods 8, 592-598. Abstract Article
Sznitman, R., Gupta, M., Hager, G.D., Arratia, P.E., and Sznitman, J. (2010). Multi-environment model estimation
for motility analysis of Caenorhabditis elegans. PLoS ONE 5, e11631. Abstract Article
Tsechpenakis, G., Bianchi, L., Metaxas, D., and Driscoll, M. (2008). A novel computational approach for
simultaneous tracking and feature extraction of C. elegans populations in fluid environments. IEEE Trans. Biomed.
Eng. 55, 1539-1549. Abstract Article
Tsibidis, G.D., and Tavernarakis, N. (2007). Nemo: a computational tool for analyzing nematode locomotion. BMC
Neurosci. 8, 86. Abstract Article
Waggoner, L.E., Zhou, G.T., Schafer, R.W., and Schafer, W.R. (1998). Control of alternative behavioral states by
serotonin in Caenorhabditis elegans. Neuron 21, 203-214. Abstract Article
Zhang, F., Wang, L.P., Brauner, M., Liewald, J.F., Kay, K., Watzke, N., Wood, P.G., Bamberg, E., Nagel, G.,
Gottschalk, A., and Deisseroth, K. (2007). Multimodal fast optical interrogation of neural circuitry. Nature 446,
633-639. Abstract Article
Zheng, M., Gorelenkova, O., Yang, J., and Feng, Z. (2011). A liquid phase based C. elegans behavioral analysis
system identifies motor activity loss in a nematode Parkinson's disease model. J. Neurosci. Methods
doi:10.1016/j.jneumeth.2011.11.015. Abstract
All WormBook content, except where otherwise noted, is licensed under a Creative
Commons Attribution License.
The ability to silence the activity of genetically specified neurons in a temporally precise fashion would provide the opportunity to investigate the causal role of specific cell classes in neural computations, behaviours and pathologies. Here we show that members of the class of light-driven outward proton pumps can mediate powerful, safe, multiple-colour silencing of neural activity. The gene archaerhodopsin-3 (Arch) from Halorubrum sodomense enables near-100% silencing of neurons in the awake brain when virally expressed in the mouse cortex and illuminated with yellow light. Arch mediates currents of several hundred picoamps at low light powers, and supports neural silencing currents approaching 900 pA at light powers easily achievable in vivo. Furthermore, Arch spontaneously recovers from light-dependent inactivation, unlike light-driven chloride pumps that enter long-lasting inactive states in response to light. These properties of Arch are appropriate to mediate the optical silencing of significant brain volumes over behaviourally relevant timescales. Arch function in neurons is well tolerated because pH excursions created by Arch illumination are minimized by self-limiting mechanisms to levels comparable to those mediated by channelrhodopsins or natural spike firing. To highlight how proton pump ecological and genomic diversity may support new innovation, we show that the blue-green light-drivable proton pump from the fungus Leptosphaeria maculans (Mac) can, when expressed in neurons, enable neural silencing by blue light, thus enabling alongside other developed reagents the potential for independent silencing of two neural populations by blue versus red light. Light-driven proton pumps thus represent a high-performance and extremely versatile class of 'optogenetic' voltage and ion modulator, which will broadly enable new neuroscientific, biological, neurological and psychiatric investigations.