A Modular Capture System for Automatic Weed
Imaging and Analysis
I. J. Hales∗, M. F. Hansen∗, L. Broadbent∗, M. L. Smith∗, S. Wane† and S. Blackmore†
∗University of the West of England, Bristol. {ian.hales, mark.hansen, laurence.broadbent, melvyn.smith}@uwe.ac.uk
†Harper Adams University College, Newport. {swane, simon.blackmore}@harper-adams.ac.uk
I. INTRODUCTION
It is estimated that today’s farming methods provide sufficient global crop output to feed 5.5 billion people [6]. Given
that the current world population is over 7 billion and is expected to grow to 9 billion by 2050, improving the efficiency of
arable land use is a pressing concern. Clearly, crop yield is affected by competition for resources from weeds growing
amongst crops. Current techniques for weed management often involve the wide-scale spraying of herbicides, which
is both economically and environmentally expensive. By making use of precision farming techniques, such as targeted
weed killing, the cost of crop management can be lowered and the crop yield increased.
To this end, we are developing a toolkit for the real-time identification and analysis of early growth-cycle weed
clusters in crop fields, particularly with respect to localisation of the meristems, thus giving guidance to a targeted weed
control device such as a laser applicator [1]. The proposed system uses a combination of tried-and-tested 2D image
processing methods, as well as 3D surface and depth information to locate and analyse weeds in real-time from a high
frame-rate video stream. The work presented here represents the current state of the system, in which the 2D feature
analysis element is complete and development has begun on the 3D shape and structure components.
Fig. 1: To find the location, size and orientation of the crop-rows, a Gaussian blur is applied to the vegetation segmentation
image before thresholding again. Remaining blobs of sufficient intensity are chosen as crop-rows and ellipses fitted to
them, giving the parameters of each row.
II. 2D IMAGE ANALYSIS
Before they can be analysed, the weeds within an image must first be segmented from the background. Several metrics
for vegetation segmentation have been examined in the literature with [2] and [4] offering good reviews. We empirically
determined that the Excess Green Index (ExG) [5], obtained by the following weighting of the R, G and B channels,
provided the best segmentation: ExG = 2G − R − B. This can then be thresholded using an automated method to
produce a binary segmentation.
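As a concrete illustration, the ExG computation and one automated thresholding scheme can be sketched as follows. This is a minimal NumPy sketch, not our production implementation; the Triangle-style threshold shown (and its histogram bin count) is one plausible choice of automated method.

```python
import numpy as np

def excess_green(rgb):
    """ExG = 2G - R - B, computed per pixel on a float copy of the image."""
    rgb = rgb.astype(np.float64)
    return 2.0 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]

def triangle_threshold(values, bins=256):
    """Triangle method: place the threshold where the histogram lies
    furthest from the straight line joining its peak to the end of its
    longer tail."""
    hist, edges = np.histogram(values, bins=bins)
    peak = int(np.argmax(hist))
    nonzero = np.nonzero(hist)[0]
    lo, hi = int(nonzero[0]), int(nonzero[-1])
    # Search along the longer tail of the histogram.
    idx = np.arange(lo, peak + 1) if peak - lo > hi - peak else np.arange(peak, hi + 1)
    x0, y0 = idx[0], hist[idx[0]]
    x1, y1 = idx[-1], hist[idx[-1]]
    # Unnormalised point-to-line distance for each bin on the tail.
    dist = np.abs((y1 - y0) * idx - (x1 - x0) * hist[idx] + x1 * y0 - y1 * x0)
    return edges[idx[int(np.argmax(dist))]]
```

A binary segmentation is then simply `excess_green(img) > triangle_threshold(excess_green(img).ravel())`.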
In many of our images, such as in figure 1a, we observe not only weeds, but also the crops themselves. To ensure these
are not included in the weed-control, we apply a strong Gaussian blur to the segmented binary image then threshold
again. Based on the assumption that crops will appear significantly larger than weed clusters, any blobs which remain
are considered crop rows as illustrated in figure 1b. If the automatic threshold value is low, we assume no crop-row
was present. For each remaining connected-component, we fit an ellipse, using its major-axis-orientation and minor-axis
width to define a crop-row region as per figure 1c. As we currently focus on out-of-row weeds, any vegetation within
these regions is labelled as “crop”.
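The ellipse-fitting step can be illustrated via image moments: the eigen-decomposition of a blob’s pixel-coordinate covariance yields the major-axis orientation and the minor-axis width. The NumPy sketch below is illustrative only (the Gaussian blur and connected-component labelling are assumed to be done upstream, e.g. by an image-processing library), and is not necessarily how our implementation fits the ellipse.

```python
import numpy as np

def row_ellipse(mask):
    """Fit an ellipse to a binary blob via second-order moments: the
    eigenvectors of the pixel-coordinate covariance give the axes, the
    eigenvalues their squared semi-lengths."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(np.float64)
    centre = pts.mean(axis=0)
    cov = np.cov((pts - centre).T)
    evals, evecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    major = evecs[:, 1]                  # major-axis direction
    angle = np.arctan2(major[1], major[0])
    minor_width = 2.0 * np.sqrt(evals[0])  # approx. minor semi-axis span
    return centre, angle, minor_width
```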
Previous work has shown that under controlled conditions it is possible to determine the meristem of weed clusters
of known species using variational models [3]. Our system is intended to be used in a realistic, outdoor environment,
in which shape-models are likely to struggle due to the very high levels of variation between the weeds. Therefore,
rather than relying on a prior model of weed appearance, we examine the imaged structure of the segmented connected-
components in isolation.
As the system is intended to run in real-time at high frame-rates, simple morphological operations with minimal
computational load are used to determine meristem location. For each vegetation component not labelled as “crop”, we
extract the ExG patch enclosing it and resize it to 64px wide (the height varies proportionally). The appearance of
grassy weeds is very different to that of rounded-leaf weeds, so each requires a different method to calculate the growth-centre.
Fig. 2: Two fast methods have been developed for meristem location. For grassy patches (a-d), the Hough lines in the
ExG image are detected and their intersection point estimated. For patches with rounded leaves (e-h), the thresholded
ExG patch is thinned, then branching-pixels are determined. Those closest to the centroid of each component are labelled
as the meristem.
Therefore, the system must first determine which type of weed is visible within the patch. By performing Canny edge
detection followed by the Hough line transform on the patch, then examining the length and relative orientation of
extracted edges, we can reliably determine if a patch contains grass or rounded weeds – grassy patches exhibit longer
edges with little directional variation, whilst rounded-leaf patches offer shorter edges with higher orientation variation.
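This decision rule can be sketched directly from the detected line segments. In the sketch below the segments are assumed to be already extracted (e.g. by a probabilistic Hough transform), and the two thresholds `len_thresh` and `var_thresh` are illustrative assumptions, not the values used in our system.

```python
import numpy as np

def classify_patch(segments, len_thresh=20.0, var_thresh=0.2):
    """Classify a patch as 'grass' or 'rounded' from line segments given
    as (x1, y1, x2, y2): long segments with little orientation spread
    indicate grass; short, varied segments indicate rounded leaves."""
    seg = np.asarray(segments, dtype=np.float64)
    d = seg[:, 2:] - seg[:, :2]
    lengths = np.hypot(d[:, 0], d[:, 1])
    # Orientation is axial (theta and theta + pi are the same line), so
    # measure spread as the circular variance of the doubled angle.
    theta = 2.0 * np.arctan2(d[:, 1], d[:, 0])
    resultant = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
    circ_var = 1.0 - resultant
    if lengths.mean() > len_thresh and circ_var < var_thresh:
        return "grass"
    return "rounded"
```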
To detect the growth centre of grassy patches, we simply calculate the least-squares point of intersection of all detected
Hough lines. The calculation for the rounded-leaf patches involves first thresholding the ExG patch using the Triangle
method as above, this time calculating the threshold based only on the patch’s intensity histogram. This allows us
to compensate for any soil-colour or small lighting variations that may occur across the image and offers improved
separation of closely growing weeds. The resulting components then undergo morphological thinning giving a set of
pixel-width lines representing the component shape. From these lines we find the branching-pixels; that is, pixels at
which 3 or more lines of different direction meet. It is not unreasonable to assume that the meristem of the rounded
leaf plants appears near the centre of the patch and as such we take the branching-pixel closest to the centroid
of each component as our estimate for the meristem or meristems. In our test sequences, we have produced estimates
within 2mm of the hand-labelled location for 78% of grassy patches and 86% of rounded-leaf patches.
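The least-squares intersection used for grassy patches has a closed form: writing each detected line as a point p_i plus a unit direction d_i, the point minimising the summed squared perpendicular distances solves the 2×2 system (Σ_i (I − d_i d_iᵀ)) x = Σ_i (I − d_i d_iᵀ) p_i. A minimal NumPy sketch, with line extraction assumed done upstream:

```python
import numpy as np

def ls_intersection(points, directions):
    """Least-squares intersection of 2D lines, each given by a point and
    a direction: minimises the sum of squared perpendicular distances by
    solving (sum of I - d d^T) x = sum of (I - d d^T) p."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, directions):
        d = np.asarray(d, dtype=np.float64)
        d = d / np.linalg.norm(d)
        proj = np.eye(2) - np.outer(d, d)  # projector onto the line normal
        A += proj
        b += proj @ np.asarray(p, dtype=np.float64)
    return np.linalg.solve(A, b)
```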
III. 3D SHAPE AND STRUCTURE
The 3D element of the system is currently under development. We hope to improve our estimates of meristem
location and crop-row identification by incorporating additional 3D shape and structure information into the estimation.
Two methods are being used to obtain this information: (1) Photometric Stereo (PS) for surface shape and (2) RGBD
data from a depth camera. The former provides an extremely detailed model of the surface shape of an object by
capturing images of it illuminated from different directions from the same viewpoint. Each additional direction provides
an additional constraint on the parameters of the surface normal for each pixel and its albedo. As few as two lights can
be used in simple cases where the texture and reflectivity are uniform. Such simplistic assumptions clearly do not apply
here and as such a four-light system is used, allowing not only variation in the surface albedo (which requires at least
three light sources), but also improved robustness to shadow artefacts. By examining the 3D shape of the vegetation, it
is hoped that issues such as the overlapping of leaves and identification of stems in cluttered environments will become
much easier to overcome. As monocular PS offers only surface information, the system also utilises a depth-camera to
allow the creation of depth-maps. As the crops are expected to grow significantly taller than the weed patches, this should
allow for a reliable method of crop-row detection as well as providing useful information about the weeds themselves.
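Under a Lambertian assumption, the per-pixel estimation in photometric stereo reduces to a small least-squares problem: with intensity I_k = ρ(l_k · n) for each light direction l_k, stacking the k measurements recovers ρn, whose norm is the albedo and whose direction is the surface normal. A minimal per-pixel NumPy sketch (a real implementation would vectorise over all pixels and discard shadowed measurements):

```python
import numpy as np

def photometric_stereo(intensities, light_dirs):
    """Lambertian photometric stereo for one pixel: I_k = albedo * (l_k . n).
    With >= 3 light directions the stacked system is solved in the
    least-squares sense; the fourth light adds robustness."""
    L = np.asarray(light_dirs, dtype=np.float64)   # shape (k, 3)
    I = np.asarray(intensities, dtype=np.float64)  # shape (k,)
    g, *_ = np.linalg.lstsq(L, I, rcond=None)      # g = albedo * n
    albedo = np.linalg.norm(g)
    return g / albedo, albedo
```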
By offering a fast, flexible and automated system for the recognition and localisation of weed clusters, our system will
offer a potential guidance solution for targeted weed-control systems as well as allowing visualisation of weed growth
at a field-wide level.
REFERENCES
[1] S. Christensen, H. T. Søgaard, P. Kudsk, M. Nørremark, I. Lund, E. S. Nadimi, and R. Jørgensen. Site-specific weed control technologies. Weed
Research, 49(3):233–241, June 2009.
[2] George E. Meyer and João Camargo Neto. Verification of color vegetation indices for automated crop imaging applications. Computers and
Electronics in Agriculture, 63(2):282–293, October 2008.
[3] Julio C. Pastrana and Thomas Rath. Novel image processing approach for solving the overlapping problem in agriculture. Biosystems Engineering,
115(1):106–115, May 2013.
[4] J. Romeo, G. Pajares, M. Montalvo, J. M. Guerrero, M. Guijarro, and J. M. de la Cruz. A new expert system for greenness identification in
agricultural images. Expert Systems with Applications, 40(6):2275–2286, May 2013.
[5] D. M. Woebbecke, G. E. Meyer, K. Von Bargen, and D. A. Mortensen. Color Indices for Weed Identification Under Various Soil, Residue, and
Lighting Conditions. Transactions of the ASAE, 38(1):259–269, 1995.
[6] Stephen L. Young, George E. Meyer, and Wayne E. Woldt. Future Directions for Automated Weed Management in Precision Agriculture. In Stephen L.
Young and Francis J. Pierce, editors, Automation: The Future of Weed Control in Cropping Systems, pages 249–259. Springer Netherlands, 2014.