Image analysis is driving a renaissance in growth measurement

Edgar P Spalding and Nathan D Miller

The domain of machine vision, in which digital images are acquired automatically in a highly structured environment for the purpose of computationally measuring features in the scene, is applicable to the measurement of plant growth. This article reviews the quickly growing collection of reports in which digital image-processing has been used to measure plant growth, with emphasis on the methodology and adaptations required for high-throughput studies of populations.
Address
University of Wisconsin-Madison, Department of Botany, 430 Lincoln Drive, Madison, WI 53706, United States

Corresponding author: Spalding, Edgar P (spalding@wisc.edu)
Current Opinion in Plant Biology 2013, 16:100–104

This review comes from a themed issue on Growth and development
Edited by Michael Scanlon and Marja Timmermans
For a complete overview see the Issue and the Editorial
Available online 23rd January 2012
1369-5266/$ – see front matter, © 2012 Elsevier Ltd. All rights reserved.
http://dx.doi.org/10.1016/j.pbi.2013.01.001
Introduction

Automated methods for measuring plant growth were in use by the end of the 19th century (Figure 1) but even then pioneers like Wilhelm Pfeffer (1845–1920) recognized the potential of early imaging techniques, 'Photographic registration will probably be largely employed in the future, for series of pictures may be obtained which when placed in a kinematograph show the phases of several days' or weeks' growth in a minute or so' [1]. In subsequent decades, researchers devised various photographic methods for studying growth. Computers were eventually brought to the task by digitizing video footage [2] or projecting photographic transparencies onto digitizing tablets [3,4]. As Arabidopsis with its great genetic advantages replaced traditional (and much larger) subjects such as oat coleoptiles, cucumber hypocotyls, and pea epicotyls, a millimeter ruler frequently could provide the resolution needed to answer the important questions at hand, such as whether the hypocotyl or root was longer or shorter than the wild type. Lack of need for high resolution coupled with the difficulty of achieving it with tiny Arabidopsis seedlings pushed the topic of growth measurement into something equivalent to the Dark Ages. Fortunately, the renaissance is well underway due to the advent of digital image acquisition and computational processing. The combination of high resolution, accuracy, and throughput achievable with today's sensors and computational technologies is allowing growth measurements to be compatible with large-scale, systems-style biology research.
Basic image analysis

Nearly all machine vision solutions applicable to measuring plant growth from images depend fundamentally on segmentation and analysis of structure, two procedural stages that share a blurry border. Segmentation determines the boundaries of human recognizable components of the image that include the objects of interest. Structure analysis is concerned with characterizing curves, boundaries, pixel intensities and their differentials. Early computer vision practitioners recognized and addressed these general issues by devising algorithmic solutions to the challenges of finding lines, corners, and boundaries in digital images [5–11]. Such works continue to serve as the foundation for the image-analysis approaches to plant growth reviewed here. Figure 2a illustrates how segmentation and structure analysis can be combined to measure growth of an arbitrary structure shown at two time points and deliberately blended into the background. Segmenting the object of interest from the background can be achieved with algorithms ranging from those that detect the optimally discriminating threshold of pixel intensity based on the structure of frequency histograms [11], to those which assign each pixel a probability of belonging to an object based on Bayesian statistics [12], to those that utilize machine learning techniques such as support vector machines or neural networks [13–15]. Whatever method is used, the result is a set of object pixels from which the defining contour or boundary (black line in Figure 2b) can be determined. The boundary is used explicitly or implicitly to determine the midline of elongated objects (red lines in Figure 2b) such as seedling stems and roots. Each of the various midline-finding techniques which one can use depends on some determination of the point that lies equidistant between two opposite boundary positions.
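As a concrete illustration of how these two stages fit together, the sketch below applies Otsu's histogram-based threshold [11] to a grayscale image and then extracts the defining boundary and a skeleton-based midline. It is a minimal sketch under stated assumptions (a dark object on a lighter background, the scikit-image library, a hypothetical file name and size filter), not code from any of the reviewed tools.

```python
# Minimal segmentation-then-structure sketch; scikit-image and the input file
# name are illustrative assumptions, not part of the reviewed methods.
import numpy as np
from skimage import io, filters, measure, morphology

image = io.imread("seedling.png", as_gray=True)      # hypothetical grayscale image

# Segmentation: Otsu's threshold [11] separates object pixels from background.
threshold = filters.threshold_otsu(image)
mask = image < threshold                             # assumes the object is darker
mask = morphology.remove_small_objects(mask, min_size=64)

# Structure analysis: the defining contour (boundary) of the object...
contours = measure.find_contours(mask.astype(float), 0.5)
boundary = max(contours, key=len)                    # longest contour = object outline

# ...and a midline approximated by the morphological skeleton, whose pixels lie
# roughly equidistant between opposite boundary positions.
midline = morphology.skeletonize(mask)
print(f"boundary points: {len(boundary)}, midline pixels: {int(midline.sum())}")
```

Replacing the thresholding line with a Bayesian or machine-learning pixel classifier [12–15] changes only the segmentation step; the structure analysis that follows is unaffected.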
Morphometrics

Midline length and the distribution of local curvature along it can give a very useful description of a biological structure such as a plant root or stem [16]. From a time series of images, the rate of change of these morphometric parameters can quantify growth and shape changes with resolution on the order of minutes and microns [17•,18].
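As a worked illustration of those two traits, the sketch below computes midline length and a local curvature profile from an ordered array of midline coordinates using finite differences; the array name, ordering, and units are assumptions made for the example, not part of the cited methods.

```python
# Hedged sketch: morphometric traits from an ordered midline.
# `midline` is assumed to be an (N, 2) array of (x, y) points ordered from base
# to tip, in calibrated units such as millimeters.
import numpy as np

def midline_morphometrics(midline: np.ndarray):
    dx = np.gradient(midline[:, 0])
    dy = np.gradient(midline[:, 1])
    ddx = np.gradient(dx)
    ddy = np.gradient(dy)

    # Length: summed distances between consecutive midline points.
    length = np.hypot(np.diff(midline[:, 0]), np.diff(midline[:, 1])).sum()

    # Signed local curvature of a planar curve:
    # k = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
    curvature = (dx * ddy - dy * ddx) / np.power(dx**2 + dy**2, 1.5)
    return length, curvature
```

Comparing the length between successive frames gives an elongation rate, and changes in the curvature profile capture bending responses such as gravitropism.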
An important step in a midline-based growth measurement is detection of the correct termination point. One published solution for tracking growth of etiolated seedlings responding to light used a gap that is usually present at the base of the closed cotyledons as an identifiable point where the hypocotyl midline is terminated [19]. A technique that worked well for de-etiolated seedlings with opened cotyledons took advantage of a thickening of the hypocotyl at the cotyledonary node [20•]. A third technique that successfully quantified hypocotyl growth responses to ethylene used a local pattern-matching method to terminate the midline at a reproducible cotyledon location [21•]. These methods were either automatic or semiautomatic, which is necessary if the method is to replace standard manual methods and enable population genetic and systems-style studies. In the case of roots, which the object in Figure 2 reasonably well exemplifies, linear extrapolation of an apical subset of midline points intercepts the boundary at a point that has proven useful for termination [18]. The RootTrace tool terminates the midline at the tip by finding the last pixel in a progression having a sufficiently high posterior probability of belonging to the root object [22••].
Kinematics

Whereas morphometrics is the study of geometric features, kinematics is the study of the internal material processes that create the geometry, namely cell production and expansion [23,24]. Kinematic analyses have shown plant growth to be a form of material flow, which has been tracked from sites of cell production by photographing growing organs marked with exogenous [3,4,25,26] or endogenous surface marks [27]. Figure 2b supposes a grid of features to be tracked within the object boundary to illustrate a kinematic analysis of growth. To an observer at the tip of the structure, point 'a' appears stationary over time because cells in that region are not expanding much.
Figure 1. An auxanometer is a device for making automated measurements of growth. A figure of a late 19th century auxanometer taken from Wilhelm Pfeffer's classic textbook is shown [1].
Figure 2. Schematic illustration of how morphometric or kinematic descriptions of growth are obtained from images. (a) An arbitrary shape having grown in length during a time step is deliberately made similar to the background to emphasize the fact that its separation from the background may not be trivial. (b) Successful segmentation defines the object's boundary (black outline) which aids in the determination of the midline (red line). The gray grid represents fiduciary marks, applied or endogenous, that if matched between images can allow a kinematic analysis of the behavior of the material comprising the object. (c) Velocity profile is obtained by determining how fast marks at each of the indicated positions moved away from the tip. (d) Elemental growth rate as a function of position is obtained by differentiating the curve in c.
A point at location 'b' would move away from the observer at a slow rate, whereas a point at 'c' would appear to move away considerably faster. Point 'f' moves away from the observer at the maximum rate not because 'f' marks a region of fast material expansion but because the interval includes all of the expanding material. Figure 2c plots the velocity profile just described. The maximum velocity is equal to the growth rate a midline-based morphometric method would measure. Velocity profiles can be obtained by applying optical flow analysis methods to time series of high-resolution digital images. Instead of ink dots, many small patches of endogenous texture in an image caused by refraction of light from cell walls or other optical effects of the tissue can be matched from one frame to the next in a time series [28–31]. Differentiating the velocity profile with respect to position, the x-axis, produces the elemental growth rate profile shown in Figure 2d. It provides a kinematics-based definition of the elongation zone and some fundamental information about growth of the primary plant body. For example, Arabidopsis is not a small plant because it has a low capacity for growth. The peak elemental growth rate of its root, when measured as just described from images, is 40–50% hour⁻¹ [28–31], perfectly matching values obtained for the much larger maize [25,32] and bean [33] roots. Kinematics shows that maize and bean roots are bigger than Arabidopsis roots because they have more and bigger cells, and not because each element of material has a higher intrinsic capacity for expansion.
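To make the differentiation step concrete, the sketch below converts a velocity profile (mark speed versus distance from the tip, as in Figure 2c) into an elemental growth rate profile with a finite-difference derivative. The numbers are invented to fall near the values quoted above; they are not measured data.

```python
# Illustrative sketch (not the authors' code): elemental growth rate profile
# obtained by differentiating a velocity profile with respect to position.
import numpy as np

position_mm = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4])   # distance from tip
velocity_mm_per_h = np.array([0.00, 0.005, 0.03, 0.10, 0.20, 0.28, 0.30, 0.30])

# Elemental (strain) growth rate = d(velocity)/d(position); units are h^-1.
regr_per_h = np.gradient(velocity_mm_per_h, position_mm)

print(f"peak elemental growth rate = {100 * regr_per_h.max():.0f}% per hour")
# The velocity plateau (0.30 mm/h here) is the overall elongation rate that a
# midline-based morphometric measurement would report.
```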
2D versus 3D

The above treatment covered only the analysis of simple structures in 2D images. More complicated images may require more complicated algorithms but not new principles. For example, a branching root system can be approached by segmentation, contour, and midline analysis to produce a skeleton [34,35]. Likewise, adding the third spatial dimension complicates the task but the image analysis steps are some form of segmentation and structure analysis. Perhaps the larger differences between 2D and 3D studies lie in the image acquisition technologies. Root system architecture in 3D has been studied with diverse imaging modalities. A successful method using visible light depends on acquiring digital images of a root system grown in a transparent medium as the subject is rotated. From the resulting angle series, a back-projection method enables faithful reconstruction of the 3D architecture [36•,37••]. Repeating the acquisition at different time points enables growth studies, one sample per apparatus.
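The rotate-and-reconstruct idea can be caricatured with a silhouette (visual hull) carving sketch: every view taken at a known rotation angle removes voxels that project outside the segmented root. This is a simplified stand-in under an orthographic-projection assumption, not the back-projection algorithm of [36•,37••], and all names and sizes are illustrative.

```python
# Hedged sketch of silhouette carving for 3D root reconstruction.
import numpy as np

def carve_visual_hull(silhouettes, angles_deg, n=64):
    """silhouettes: boolean (n, n) masks, rows = height (z), cols = horizontal axis."""
    volume = np.ones((n, n, n), dtype=bool)            # voxel grid indexed (x, y, z)
    xs, ys = np.meshgrid(np.arange(n) - n / 2, np.arange(n) - n / 2, indexing="ij")
    for mask, angle in zip(silhouettes, np.radians(angles_deg)):
        # Horizontal image coordinate of each (x, y) voxel column after rotation.
        u = np.round(xs * np.cos(angle) + ys * np.sin(angle) + n / 2)
        u = np.clip(u, 0, n - 1).astype(int)
        for z in range(n):
            volume[:, :, z] &= mask[z, u]              # keep voxels inside every silhouette
    return volume
```

Repeating the carve on image series captured at successive time points would yield volumes whose differences approximate growth, in the spirit of the time-resolved study cited above.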
X-rays [38] and magnetic resonance methods [39,40] have also been used to obtain 3D reconstructions of root systems in soil, but not of their growth. At the cellular scale, 3D reconstructions of optical slices obtained by laser scanning confocal imaging are commonplace, though obtaining time series from which growth can be measured is far from simple [41••]. Methodologies that measure the path length of reflected laser light may prove to be an effective way to measure 3D growth of plant structures [42,43].
Throughput

Automation of image analysis can allow experiments to expand beyond what would be feasible in a manual-analysis scenario, shifting the rate-limiting step to image acquisition. Throughput of image acquisition can be increased by employing multiple image-acquisition devices, each focused on a separate sample [44]. Another approach is to control the movement of a single acquisition device to parallelize the measurement of multiple samples [21•]. Each approach has limitations or technical challenges to overcome. Setting up parallel experiments in front of multiple devices can be time consuming and difficult to synchronously initiate. Moving a camera to inspect multiple samples may require technically demanding servoing with precision motion-control hardware and software [29,45•,46•].
A third approach is to increase the size of the scene so that multiple samples can be included in a single capture event. Standard digital cameras can capture overhead images containing several Arabidopsis plants, for example. Because of the relatively flat profile of the green rosette against a dark soil background, the segmentation step is fairly straightforward. The resolution achieved with such cameras is sufficient to resolve small increments of growth. This scenario has been successful [47,48] and commercial platforms for systematizing the measurements are available (www.lemnatec.com).
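A hedged sketch of why that segmentation step is straightforward: on an RGB overhead image, a simple excess-green index separates rosette pixels from dark soil, and the segmented area tracked over time becomes a growth curve. The threshold and pixel calibration below are illustrative assumptions, not parameters of the cited platforms.

```python
# Rosette segmentation by excess-green index; file name, threshold, and
# calibration are hypothetical.
import numpy as np
from skimage import io

rgb = io.imread("tray_overhead.png").astype(float) / 255.0   # assumes 8-bit RGB
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

exg = 2.0 * g - r - b        # green rosette pixels score high, dark soil near zero
rosette = exg > 0.1          # illustrative threshold

pixel_area_mm2 = 0.05 ** 2   # assumed calibration: 0.05 mm per pixel
print(f"projected rosette area = {rosette.sum() * pixel_area_mm2:.1f} mm^2")
```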
A more complicated wide-scene image is also a popular data type in Arabidopsis research. A standard flatbed document scanner can capture images of multiple Petri plates in one scan, with each plate containing multiple seedlings growing along its vertically oriented agar surface so that potentially large numbers of roots, hypocotyls, cotyledons, and possibly leaves and lateral roots are represented in profile. Typically, the researcher measures the structures of interest using a manual point selection device. Needed to make the inexpensive and easily automated flatbed scanner into a high resolution, high throughput, growth-measuring device are algorithms capable of matching the human's ability to discern and measure the specific structures of interest. Incorporating supervised machine-learning algorithms into the image analysis tool holds much promise in this regard.
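One possible shape for such a tool, sketched here with scikit-learn as an assumed library: a classifier is trained on pixels that a human has labeled by structure and is then applied to whole scanner images. This is a generic illustration of supervised pixel classification, not a description of any published implementation.

```python
# Hedged sketch of a supervised pixel classifier for flatbed-scanner images.
# Label values, feature choices, and the classifier are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def pixel_features(image):
    """Per-pixel features for a grayscale scan: intensity plus local gradients."""
    gy, gx = np.gradient(image)
    return np.stack([image, gx, gy, np.hypot(gx, gy)], axis=-1).reshape(-1, 4)

def train_and_segment(train_image, train_labels, new_image):
    # `train_labels` is assumed to be a human-annotated array the same shape as
    # `train_image`, e.g. 0 = background, 1 = root, 2 = hypocotyl.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(pixel_features(train_image), train_labels.ravel())
    return clf.predict(pixel_features(new_image)).reshape(new_image.shape)
```

A researcher could label a handful of scans, train once, and let the classifier discern the structures of interest across thousands of images, which is the kind of substitute for manual point selection the text envisions.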
One hundred years ago, Pfeffer saw image analysis as a way to study plant growth in the future. From here, the perspective seems to be different only in the degree to which throughput, resolution, and precision will increase.
Acknowledgement
This work was supported by National Science Foundation grant IOS-1031416 to E.P.S.
References and recommended reading
Papers of particular interest, published within the period of review, have been highlighted as:

• of special interest
•• of outstanding interest

1. Pfeffer W: The Physiology of Plants, edn 2. London: Oxford; 1903.

2. Jaffe MJ, Wakefield AH, Telewski F, Gulley E, Biro R: Computer-assisted image analysis of plant growth, thigmomorphogenesis, and gravitropism. Plant Physiol 1985, 77:722-730.

3. Spalding EP, Cosgrove DJ: Influence of electrolytes on growth, phototropism, nutation and surface potential in etiolated cucumber seedlings. Plant Cell Environ 1993, 16:445-451.

4. Cosgrove DJ: Kinetic separation of phototropism from blue-light inhibition of stem elongation. Photochem Photobiol 1985, 42:745-751.

5. Horn BKP: The Binford–Horn Line-Finder. Cambridge, MA: M.I.T. Artificial Intelligence Lab; 1971. (AI Memo).

6. Rosenfeld A, Thurston M: Edge and curve detection for visual scene analysis. IEEE Trans Comput 1971, 20:562-569.

7. Canny J: A computational approach to edge detection. IEEE Trans Pattern Anal Mach Intell 1986, 8:679-689.

8. Harris C, Stephens M: A combined corner and edge detector. Proc. Fourth Alvey Vision Conference. 1988:147-151.

9. Weszka JS, Nagel RN, Rosenfeld A: A threshold selection technique. IEEE Trans Comput 1974, C-23:1322-1326.

10. Ridler TW, Calvard S: Picture thresholding using an iterative selection method. IEEE Trans Syst Man Cybern 1978, 8:630-632.

11. Otsu N: A threshold selection method from gray-level histograms. IEEE Trans Syst Man Cybern 1979, 9:62-66.

12. Bouman CA, Shapiro M: A multiscale random field model for Bayesian image segmentation. IEEE Trans Image Process 1994, 3:162-177.

13. Bhandarkar SM, Koh J, Suk M: A hierarchical neural network and its application to image segmentation. Math Comput Simul 1996, 41:337-355.

14. Othman AA, Tizhoosh HR: Image thresholding using neural network. 2010 10th International Conference on Intelligent Systems Design and Applications (ISDA); November 29–December 1: 2010:1159-1164.

15. Yu Z, Wong H-S, Wen G: A modified support vector machine and its application to image segmentation. Image Vis Comput 2011, 29:29-40.
16. Silk WK: Quantitative descriptions of development. Annu Rev Plant Physiol 1984, 35:479-518.

17.• Miller ND, Durham Brooks TL, Assadi AH, Spalding EP: Detection of a gravitropism phenotype in glutamate receptor-like 3.3 mutants of Arabidopsis thaliana using machine vision and computation. Genetics 2010, 186:585-593.
Automated, parallel image acquisition followed by automated image analysis created a large dataset of highly resolved root gravitropism responses. A subtle mutant phenotype was discovered by computational analysis of the large image-derived datasets.

18. Miller ND, Parks BM, Spalding EP: Computer-vision analysis of seedling responses to light and gravity. Plant J 2007, 52:374-381.

19. Wang L, Uilecan IV, Assadi AH, Kozmik CA, Spalding EP: HYPOTrace: image analysis software for measuring hypocotyl growth and shape demonstrated on Arabidopsis seedlings undergoing photomorphogenesis. Plant Physiol 2009, 149:1632-1637.

20.• Cole B, Kay SA, Chory J: Automated analysis of hypocotyl growth dynamics during shade avoidance in Arabidopsis. Plant J 2011, 65:991-1000.
The authors created an image-analysis tool that automates the very time-consuming manual task of measuring the hypocotyl (stem) length of light-grown Arabidopsis seedlings. The HyDE program, demonstrated to be capable of detecting growth acceleration due to changes in far-red irradiation, is available to the community.

21.• Men Y, Yu Q, Chen Z, Wang J, Huang Y, Guo H: A high-throughput imaging system to quantitatively analyze the growth dynamics of plant seedlings. Integr Biol 2012, 4:945-952.
In addition to presenting a novel method for terminating the hypocotyl midline and demonstrating its utility in measuring rapid growth responses to the hormone ethylene, the authors also describe a relatively simple method of doubling acquisition throughput by moving the camera back and forth between two Petri plates.

22.•• French A, Ubeda-Tomas S, Holman TJ, Bennett MJ, Pridmore T: High-throughput quantification of root growth using a novel image-analysis tool. Plant Physiol 2009, 150:1784-1795.
This paper, in a very accessible manner, explains the interesting approach to segmentation, midline finding, and termination coded in RootTrace, an open-source and extensible software tool for measuring root growth from time series of images.

23. Silk WK, Erickson RO: Kinematics of plant growth. J Theor Biol 1979, 76:481-501.

24. Richards OW, Kavanagh AJ: The analysis of the relative growth gradients and changing form of growing organisms: illustrated by the tobacco leaf. Am Nat 1943, 77:385-399.

25. Nelson AJ, Evans ML: Analysis of growth patterns during gravitropic curvature in roots of Zea mays by use of a computer-based video digitizer. J Plant Growth Regul 1986, 5:73-83.

26. Beemster GT, Baskin TI: Analysis of cell division and elongation underlying the developmental acceleration of root growth in Arabidopsis thaliana. Plant Physiol 1998, 116:1515-1526.

27. Silk WK, Erickson RO: Kinematics of hypocotyl curvature. Am J Bot 1978, 65:310-319.

28. van der Weele CM, Jiang HS, Palaniappan KK, Ivanov VB, Palaniappan K, Baskin TI: A new algorithm for computational image analysis of deformable motion at high spatial and temporal resolution applied to root growth. Roughly uniform elongation in the meristem and also, after an abrupt acceleration, in the elongation zone. Plant Physiol 2003, 132:1138-1148.
29. Walter A, Spies H, Terjung S, Küsters R, Kirchgeßner N, Schurr U: Spatio-temporal dynamics of expansion growth in roots: automatic quantification of diurnal course and temperature response by digital image sequence processing. J Exp Bot 2002, 53:689-698.

30. Chavarria-Krauser A, Nagel KA, Palme K, Schurr U, Walter A, Scharr H: Spatio-temporal quantification of differential growth processes in root growth zones based on a novel combination of image sequence processing and refined concepts describing curvature production. New Phytol 2008, 177:811-821.

31. Zheng X, Miller ND, Lewis DR, Christians MJ, Lee K-H, Muday GK, Spalding EP, Vierstra RD: AUXIN UP-REGULATED F-BOX PROTEIN1 regulates the cross talk between auxin transport and cytokinin signaling during plant root growth. Plant Physiol 2011, 156:1878-1893.

32. Erickson RO, Sax KB: Elemental growth rate of the primary root of Zea mays. Proc Am Philos Soc 1956, 100:487-498.

33. Basu P, Pal A, Lynch JP, Brown KM: A novel image-analysis technique for kinematic study of growth and curvature. Plant Physiol 2007, 145:305-316.

34. Galkovskyi T, Mileyko Y, Bucksch A, Moore B, Symonova O, Price C, Topp C, Iyer-Pascuzzi A, Zurek P, Fang S et al.: GiA Roots: software for the high throughput analysis of plant root system architecture. BMC Plant Biol 2012, 12:116.

35. Basu P, Pal A: A new tool for analysis of root growth in the spatio-temporal continuum. New Phytol 2012, 195:264-274.

36.• Iyer-Pascuzzi A, Symonova O, Mileyko Y, Hao Y, Belcher H, Harer J, Weitz J, Benfey P: Imaging and analysis platform for automatic phenotyping and trait ranking of plant root systems. Plant Physiol 2010, 152:1148-1157.
In addition to developing an image-based method of quantifying root system architecture traits of rice plants grown in a transparent gel, the authors demonstrate how to use support vector machines to determine which traits are effective in discriminating genotypes.

37.•• Clark R, MacCurdy R, Jung J, Shaff J, McCouch S, Aneshansley D, Kochian L: Three-dimensional root phenotyping with a novel imaging and software platform. Plant Physiol 2011, 156:455-465.
The technique of rotating a root system made visible by growth in a transparent medium while acquiring images, then reconstructing the 3D structure by the back-projection method was developed to the state-of-the-art in this work, which also included the time domain, so that not only growth could be measured but also the root system architecture.

38. Mairhofer S, Zappala S, Tracy SR, Sturrock C, Bennett M, Mooney SJ, Pridmore T: RooTrak: automated recovery of three-dimensional plant root architecture in soil from x-ray microcomputed tomography images using visual tracking. Plant Physiol 2012, 158:561-569.

39. Borisjuk L, Rolletschek H, Neuberger T: Surveying the plant's world by magnetic resonance imaging. Plant J 2012, 70:129-146.

40. Jahnke S, Menzel MI, Van Dusschoten D, Roeb GW, Bühler J, Minwuyelet S, Blümler P, Temperton VM, Hombach T, Streun M et al.: Combined MRI–PET dissects dynamic changes in plant structures and functions. Plant J 2009, 59:634-644.

41.•• Fernandez R, Das P, Mirabet V, Moscardi E, Traas J, Verdeil J-L, Malandain G, Godin C: Imaging plant growth in 4D: robust tissue reconstruction and lineaging at cell resolution. Nat Methods 2010, 7:547-553.
The very difficult task of tracking cells produced by a meristem through space and time after 3D reconstruction of optical slices obtained by confocal microscopy is accomplished here. When this type of computer vision methodology can be further automated, plant development will become a much more quantitative field.
42. Fang S, Yan X, Liao H: 3D reconstruction and dynamic modeling of root architecture in situ and its application to crop phosphorus research. Plant J 2009, 60:1096-1108.

43. Dornbusch T, Lorrain S, Kuznetsov D, Fortier A, Liechti R, Xenarios I, Fankhauser C: Measuring the diurnal pattern of leaf hyponasty and growth in Arabidopsis – a novel phenotyping approach using laser scanning. Funct Plant Biol 2012, 39:860.

44. Durham Brooks TL, Miller ND, Spalding EP: Plasticity of Arabidopsis root gravitropism throughout a multidimensional condition space quantified by automated image analysis. Plant Physiol 2010, 152:206-216.

45.• Busch W, Moore BT, Martsberger B, Mace DL, Twigg RW, Jung J, Pruteanu-Malinici I, Kennedy SJ, Fricke GK, Clark RL et al.: A microfluidic device and computational platform for high-throughput live imaging of gene expression. Nat Methods 2012, 9:1101-1106.
This paper marks a significant advance in automation and therefore throughput in confocal microscope-based studies of root growth and reporter gene analysis. It combines elements of servoing, object finding, and feature extraction to enable high-resolution studies of growth and fluorescence patterns.

46.• Subramanian R, Spalding E, Ferrier N: A high throughput robot system for machine vision based plant phenotype studies. Mach Vis Appl 2012:1-18 http://dx.doi.org/10.1007/s00138-012-0434-4.
This paper describes the construction and operation, particularly the machine-vision components, of an automated apparatus for tracking the growth of many roots in parallel with one imaging device. Particular emphasis is made on the object finding and servoing needed to achieve proper focus.

47. Leister D, Varotto C, Pesaresi P, Niwergall A, Salamini F: Large-scale evaluation of plant growth in Arabidopsis thaliana by non-invasive image analysis. Plant Physiol Biochem 1999, 37:671-678.

48. De Vylder J, Vandenbussche FJ, Hu Y, Philips W, Van Der Straeten D: Rosette Tracker: an open source image analysis tool for automatic quantification of genotype effects. Plant Physiol 2012.
A requirement for understanding morphogenesis is being able to quantify expansion at the cellular scale. Here, we present new software (RootflowRT) for measuring the expansion profile of a growing root at high spatial and temporal resolution. The software implements an image processing algorithm using a novel combination of optical flow methods for deformable motion. The algorithm operates on a stack of nine images with a given time interval between each (usually 10 s) and quantifies velocity confidently at most pixels of the image. The root does not need to be marked. The software calculates components of motion parallel and perpendicular to the local tangent of the root's midline. A variation of the software has been developed that reports the overall root growth rate versus time. Using this software, we find that the growth zone of the root can be divided into two distinct regions, an apical region where the rate of motion, i.e. velocity, rises gradually with position and a subapical region where velocity rises steeply with position. In both zones, velocity increases almost linearly with position, and the transition between zones is abrupt. We observed this pattern for roots of Arabidopsis, tomato (Lycopersicon lycopersicum), lettuce (Lactuca sativa), alyssum (Aurinia saxatilis), and timothy (Phleum pratense). These velocity profiles imply that relative elongation rate is regulated in a step-wise fashion, being low but roughly uniform within the meristem and then becoming high, but again roughly uniform, within the zone of elongation. The executable code for RootflowRT is available from the corresponding author on request.