Abstract

Total above-ground biomass at harvest and ear density are two important traits that characterize wheat genotypes. Two experiments were carried out in two different sites where several genotypes were grown under contrasted irrigation and nitrogen treatments. A high spatial resolution RGB camera was used to capture the residual stems standing straight after the cutting by the combine machine during harvest. It provided a ground spatial resolution better than 0.2 mm. A Faster Regional Convolutional Neural Network (Faster-RCNN) deep-learning model was first trained to identify the stems cross section. Results showed that the identification provided precision and recall close to 95%. Further, the balance between precision and recall allowed getting accurate estimates of the stem density with a relative RMSE close to 7% and robustness across the two experimental sites. The estimated stem density was also compared with the ear density measured in the field with traditional methods. A very high correlation was found with almost no bias, indicating that the stem density could be a good proxy of the ear density. The heritability/repeatability evaluated over 16 genotypes in one of the two experiments was slightly higher (80%) than that of the ear density (78%). The diameter of each stem was computed from the profile of gray values in the extracts of the stem cross section. Results show that the stem diameters follow a gamma distribution over each microplot with an average diameter close to 2.0 mm. Finally, the biovolume computed as the product of the average stem diameter, the stem density, and plant height is closely related to the above-ground biomass at harvest with a relative RMSE of 6%. Possible limitations of the findings and future applications are finally discussed.
Plant Phenomics
Article ID 
Research Article
High-Throughput Measurements of Stem Characteristics to
Estimate Ear Density and Above-Ground Biomass
Xiuliang Jin1,2, Simon Madec1, Dan Dutartre3, Benoit de Solan4, Alexis Comar3, and Frédéric Baret1
1INRA EMMAH, UMR 1114, 228 route de l'Aérodrome, 84914 Avignon, France
2Institute of Crop Sciences, Chinese Academy of Agricultural Sciences/Key Laboratory of Crop Physiology and Ecology, Ministry of Agriculture, Beijing 100081, China
3HIPHEN, Rue Charrue, 84000 Avignon, France
4ARVALIS - Institut du Végétal, Station Expérimentale, 91720 Boigneville, France
Correspondence should be addressed to Xiuliang Jin; jinxiuxiuliang@.com
Received  March ; Accepted  May 
Copyright ©  Xiuliang Jin et al. Exclusive Licensee Nanjing Agricultural University. Distributed under a Creative Commons Attribution License (CC BY).
1. Introduction
Ear density (the number of ears per m2) is generally well correlated with above-ground biomass and grain yield at maturity of wheat [,]. However, the correlation may depend on environmental conditions as well as on genotypes. Most stems observed at harvest bear an ear: stem density (the number of stems per m2) thus appears as a good proxy of the ear density []. Stem density depends both on plant density and on the number of stems per plant, which is quantified by the tillering coefficient. The environmental conditions experienced by the crop and the genotype control the tillering coefficient []. Therefore, several studies report the interest of the ear and stem density as traits to be used in the selection process of wheat genotypes [,]. Further, plant height and stem diameter are highly correlated with the above-ground biomass in wheat []. Stem density, ear density, plant height, and stem diameter are therefore highly desired to score the performance of a genotype in wheat crop breeding programs.
The number of stems per plant is difficult to evaluate when plants start to produce tillers since plants are often intricated and hardly identifiable. Further, the number of stems per plant may change with time due to possible tiller regression during the tillering and stem elongation stages. After the flowering stage, most stems bear an ear and the stem density therefore provides a good proxy of the ear density. Ear and stem densities are therefore usually measured at maturity by manual counting over a given sample area. The stem diameter is rarely measured since it is very tedious and time consuming. Similarly, the above-ground biomass is rarely measured extensively for the same reasons. Crop height at harvest is most frequently measured in the field using a ruler. In addition to the limits of these low-throughput invasive measurements that require large human resources to be completed, the small sampling area used and the errors associated with the manual measurements may result in significant uncertainties on these variables that would limit experimental observations. It appears therefore necessary to develop new methods for accurate measurements of the stem density, crop height, and stem diameter of wheat crops within large field phenotyping experiments.
The recent advances in high-resolution imaging systems and computing capacity, as well as in image processing algorithms, offer great opportunities to develop nondestructive high-throughput methods. Jin et al. [] and Liu et al. [] have demonstrated that the plant density could be estimated at early stages in wheat crops from high-resolution imagery. Direct estimates of the tillering coefficient at the end of the tillering stage were investigated by several authors with application to the management of nitrogen fertilization for stable crops. Vegetation indices computed from reflectance measurements have been empirically related to the tiller density []. However, reflectance measurements are mainly sensitive to the amount of green foliage, which is loosely related to the stem density. Alternatively, several authors have developed algorithms for estimating wheat stem density at early stages from high-resolution imagery []. Unfortunately, this method, applied to plants in pots grown under greenhouse conditions, is difficult to transfer to field conditions. Further, the number of stems at relatively early stages may overestimate the actual stem density at harvest. Previous scientists have used algorithms for estimating wheat ear density in field conditions using RGB or thermal imagery []. However, these techniques, operated from the top of the canopy, may miss part of the ears since a number of ears are lying in the lower layers of the canopy. Previous studies have also demonstrated that the above-ground biomass (AGB) can be estimated using different remote sensing platforms []. However, the correlative nature of these relationships questions their robustness when applied outside the domain where they have been calibrated.
The aim of this study is to develop and evaluate a method to estimate stem density after the harvest. Images of the remaining stems cut by the combine machine during harvest show a clear circular cross section at their tip that could be identified by machine vision techniques. Further, the diameter of the stem could also be measured to tentatively estimate the AGB by combining the average stem diameter with the stem density and plant height. High-throughput estimates of plant height have now become a standard trait easy to compute from 3D point clouds derived from LiDAR or standard cameras aboard drones []. The main objectives of this study are therefore (1) to develop a method for identifying stems from postharvest submillimetric RGB images and
F : Visual stem identication. Each stem identied corre-
sponds to a green bounding box. Note: the image is actually cropped
from original image by x pixel.
F : Application of the stem detection using Faster-RCNN
algorithm to an image extract in Gr´
eoux. Each yellow bounding box
corresponds to the identied stem and is associated with its score
corresponding to the probability of containing a stem.
compute the stem density; (2) to compare the estimated stem density with the ear density measured with traditional invasive methods; (3) to estimate the stem diameters and describe their distribution; and (4) to investigate the capacity of stem density, stem diameter, and plant height to provide a proxy of the AGB. The field experiments and data acquisition are first described. The developed methods are then presented, and their performances to estimate stem density, stem diameter, and AGB are finally evaluated and discussed.
2. Materials and Methods
2.1. Experimental Sites and Ground Measurements. The Gréoux and Clermont sites located in France (Table ) were hosting wheat phenotyping experiments with about one thousand microplots of  rows by  m length (Gréoux) or  rows by . m length (Clermont). For both sites,
Figure: Extraction of the stem diameter over each sub-window image using the gray-level profile along the diameter. Profiles are taken along the vertical, horizontal, 45°, and 135° directions (x-axis: pixel position on the diameter; y-axis: gray level).
Figure: Comparison between the stem density estimated using the Faster-RCNN method calibrated over the pooled Cgc dataset and the stem density evaluated visually over the images (x-axis: measured stem density, stems/m2; y-axis: estimated stem density, stems/m2). The black line corresponds to the 1:1 line; the red and blue circles correspond, respectively, to the Gréoux (Vg) and Clermont (Vc) validation datasets.
rows were spaced by . cm. A subsample of microplots (Table ) was selected in both sites for the development and validation of the method. They included genotypes with contrasted tillering capacity and plant architecture as well as variation in irrigation (Gréoux) and nitrogen (Clermont) crop management.
Figure: Relationship between the estimated stem density and the measured ear density for the Gréoux (blue dots) and Clermont (red dots) datasets (x-axis: measured ear density, ears/m2; y-axis: estimated stem density, stems/m2). The black line corresponds to the 1:1 line.
The ear density (ears/m2) was measured in the field at maturity for the  (Gréoux) or  (Clermont) microplots considered, by counting the ears over three samples of two rows by . m length, corresponding to a . m2 sampled area. The AGB (g/m2) was measured in Gréoux over  microplots by collecting all the plants within three samples of two rows by . m length. The samples were then oven-dried at °C for three days and finally weighed. The height (cm) of the plants was measured using two LMS LiDARs (SICK, Germany) fixed on a phenomobile, i.e., a robot rover that automatically moves over the microplots; details on the height measurements are given by Madec et al. [].
2.2. Image Acquisition and Visual Labeling of Stems. A Canon camera of  by  pixels equipped with a  mm focal length lens was fixed on a pole and maintained at a . m distance from the ground at the Gréoux experimental site. The camera was set to speed priority. The same operating mode was used in Clermont, except that the camera was a Sony ILCE- with  by  pixels, equipped with a  mm focal length lens and maintained at . m from the ground. The images were recorded in JPG format on the SD memory card. Measurements were completed under cloudy illumination conditions with light wind. Three (Gréoux) or four (Clermont) images were taken over each microplot. A subsample corresponding to four rows by . m length for Gréoux and four rows by . m length for Clermont (Table ) was extracted from the center of each image. This offered the advantage of minimizing the image deformation observed mostly on the borders of the whole image.
Figure: Correlation and distribution between stem density, average stem diameter, and the shape and scale parameters of the gamma distribution over the Gréoux (a) and Clermont (b) experimental sites. The correlation coefficient, r, is given in the upper triangular matrix, with ∗∗ and ∗ corresponding, respectively, to significant values at the . and . probability levels.
Figure: Correlation matrix between the AGB and the six variables investigated (ear density, stem density, basal area, plant height, biovolume, and the multiple linear regression using all variables). Note: ∗∗ and ∗ mean correlation significant at the . and . levels of probability, respectively.
T : Characteristics of the Gr´
eoux and Clermont experimental sites.
Sites Latitude Longitude Number of plots Sowing date Sowing density (seeds/m2)
eoux 󸀠N
󸀠E// 
Clermont 󸀠N
󸀠E  // 
T : Characteristics of the images taken over the two experimental sites.
Sites Date of images Distance to ground (m) Ground resolution (mm) Sampled area (m2)
eoux // . . .
Clermont // . . .
The Gréoux images were first resampled using a bicubic interpolation algorithm to provide the same resolution as that of the Clermont images. The good quality of the images provided strong confidence in the visual identification of the stems (Figure ). A bounding box was interactively drawn around each stem identified in the images. The bounding box used to identify each stem was designed to include enough elements surrounding the stem (Figure ). A total of  images were visually annotated to be used for the calibration and validation of the method.
2.3. Object Detection Using Faster-RCNN. Convolutional Neural Networks (CNNs) are powerful machine learning methods []. They are widely used to extract image features and then classify objects. CNNs are trained over large collections of diverse images to extract rich feature representations more effectively. These CNN features often outperform handcrafted ones such as histograms of oriented gradients (HOG), local binary patterns, or speeded-up robust features []. The TensorFlow (https://www.tensorflow.org) implementation of the Faster Regional Convolutional Neural Network (Faster-RCNN) through the object detection application programming interface (API) [] was used here. Faster-RCNN has been widely used to detect objects []. The region proposal network (RPN) branch was inserted between the conv and conv blocks. The Inception-ResNet-V model was used as it obtained the best accuracy among several modern object detectors []. An anchor was set at each location considered by the convolution maps of the RPN layer. Each anchor was associated with a size and an aspect ratio. A set of  anchors with different sizes and aspect ratios was assigned at each location, following the default setting. The number of proposed regions per patch was set to , which was consistent with the expected number of stems per patch.
Patches were used as input since the memory requirements were too demanding for larger images: the original images were thus split into  x  pixel patches, keeping a % overlap between neighboring patches to minimize possible problems associated with the borders. The batch size was fixed to  and the threshold value for the non-maximum suppression with an IOU (Intersection Over Union) criterion was set to .. The model was trained with a learning rate of . and a momentum of .. The model was pretrained over the Common Objects in Context (COCO) dataset to provide the starting point. The COCO dataset [] contains . million images with . million object instances belonging to  object categories. More details on the Faster-RCNN used can be found in Madec et al. []. The pretrained model was then fine-tuned over the calibration image extracts. It identified and localized stems using a bounding box associated with a confidence score varying between . and ..
The trained model was finally applied to all the available image extracts. When identified stem bounding boxes were overlapping, a minimum overlap fraction of . was used to eliminate one of the overlapping bounding boxes. Finally, bounding boxes with a confidence score smaller than . were not considered as stems. This score threshold value was optimized to get the best stem density estimation performances. An example of the Faster-RCNN stem detection result is presented in Figure . The estimated stem density (stems/m2) was eventually computed by dividing the number of stems identified over the image extracts of a microplot by the area of the extracts (Table ).
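The post-processing chain described above (patch splitting with overlap, score thresholding, overlap-based suppression, and conversion of counts to a density) can be sketched as follows. This is a minimal illustration only: the detector itself is not reproduced, and the patch size, overlap fraction, and thresholds are placeholders, since the values used in the study are not preserved in this copy of the text.

```python
import numpy as np

def split_into_patches(image, patch=512, overlap=0.25):
    """Split a large image into overlapping square patches: ((row, col), patch)."""
    step = int(patch * (1 - overlap))
    h, w = image.shape[:2]
    patches = []
    for y in range(0, max(h - patch, 0) + 1, step):
        for x in range(0, max(w - patch, 0) + 1, step):
            patches.append(((y, x), image[y:y + patch, x:x + patch]))
    return patches

def iou(a, b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
    return inter / union if union > 0 else 0.0

def filter_detections(boxes, scores, score_thr=0.5, overlap_thr=0.25):
    """Drop boxes below the score threshold; among overlapping boxes keep the best."""
    order = np.argsort(scores)[::-1]
    kept = []
    for i in order:
        if scores[i] < score_thr:
            continue
        if all(iou(boxes[i], boxes[j]) < overlap_thr for j in kept):
            kept.append(i)
    return [boxes[i] for i in kept]

def stem_density(n_stems, extract_area_m2):
    """Stem density (stems/m2) = detected stems / sampled extract area."""
    return n_stems / extract_area_m2
```

The suppression step is a plain greedy non-maximum suppression, used here as a stand-in for the exact overlap-elimination rule of the study.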
2.4. Estimating the Stem Diameter and Biovolume. The bounding box of each identified stem was first transformed into a gray image using the value (V) component of the HSV transform []: V = .R + .G + .B, where R, G, and B are the three components of the RGB images coded in  bits. The gray value profiles were then extracted along four compass directions: vertical, horizontal, 45°, and 135° (Figure ). The gray level profiles show a typical pattern with high values corresponding to the borders of the stem and lower values outside and inside the stem (Figure ). The two borders of the stem were thus identified using the two maximum gray values. The distance between these maximum values was computed and then averaged over the four compass directions to provide an estimate of the stem diameter. The stem diameter was used to compute the area of the stem cross section. The basal area of each microplot was then computed as the average area of the stem section multiplied by the stem density. Finally, the biovolume was computed as the product of the basal area and the plant height derived from the LiDAR measurements.
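A minimal sketch of this diameter extraction, assuming a square gray-level patch centered on the stem; the luma weights used for the gray conversion are the standard ITU-R ones, since the exact coefficients are not preserved in this copy of the text:

```python
import numpy as np

def gray_value(rgb):
    """Weighted gray conversion (standard luma weights, assumed here)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def profile_diameter(profile):
    """Distance between the two highest gray peaks, one on each side of the
    profile centre, taken as the two stem border positions."""
    profile = np.asarray(profile, dtype=float)
    mid = len(profile) // 2
    left = int(np.argmax(profile[:mid]))
    right = mid + int(np.argmax(profile[mid:]))
    return right - left

def stem_diameter(gray_patch, resolution_mm=0.2):
    """Average the border distance over four directions and convert to mm.
    Diagonal profiles are treated in pixel steps for simplicity (no sqrt(2)
    correction), which is a simplification of this sketch."""
    h, w = gray_patch.shape
    c = h // 2
    profiles = [
        gray_patch[c, :],                    # horizontal
        gray_patch[:, c],                    # vertical
        np.diagonal(gray_patch),             # 45 degrees
        np.diagonal(np.fliplr(gray_patch)),  # 135 degrees
    ]
    d_px = np.mean([profile_diameter(p) for p in profiles])
    return d_px * resolution_mm
```

For a synthetic patch with a bright square ring 10 pixels across and a 0.2 mm ground resolution, the function returns a diameter of 2.0 mm, matching the order of magnitude reported for wheat stems.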
2.5. Statistical Analysis. Both the Gréoux and Clermont datasets were randomly split into / for model calibration and / for validation. A first global training (called here Cgc) was investigated by pooling the calibration datasets of Gréoux and Clermont.
T : Characteristics of the data sets used for the calibration and validation of the algorithm. Statistics of the stem density are indicated
for each data set, including minimum (Min), mean (Mean), maximum (Max), range (Range), standard deviation (SD), and coecient of
variation (CV) of the stem density.
Number of image extracts Stem density (stem/m2)
Dataset Name Gr´
eoux Clermont Min Mean Max Range SD CV (%)
Calibration Cgc        .
Validation Vg c   .
Calibration Cg       .
Validation Vg      .
Calibration Cc       .
Validat io n Vc      .
T : Accuracy of stem identication using the Faster-RCNN method. Results are presented for three calibration datasets (Cg, Cc, and
Cgc). e evaluation is achieved on the validation datasets (Vg, Vc, Vgc).
Calibration dataset Validation dataset Precision Recall Bias
Vg . . -.
Vc . . -.
Vgc . . -.
Vg . . -.
Vc . . .
Vgc . . .
Vg . . -.
Vc . . .
Vgc . . .
The performances of this global calibration (Cgc) were also evaluated on both the Gréoux (Vg) and Clermont (Vc) validation datasets. Then, a cross-validation was also investigated to better evaluate the robustness of the classification: the calibration was completed on the Gréoux (Cg) or Clermont (Cc) calibration datasets and validated on the Gréoux (Vg) and Clermont (Vc) validation datasets. Table  presents the several cases considered.
A detected stem bounding box (i.e., with a score > .) was considered correct (true positive, TP) if its IOU with a labeled stem bounding box was larger than the IOU threshold value. Otherwise, the detected stem bounding box was considered a false positive (FP). The proposed bounding boxes with a score < . (i.e., not considered as stems) but with an IOU with a labeled stem larger than the threshold corresponded to missed stems, i.e., false negatives (FN). The IOU threshold value was set to the usual value of .. The precision (TP/(FP+TP)), recall (TP/(FN+TP)), and bias (1 - (precision/recall)) were also calculated.
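Under these definitions, the evaluation can be sketched as follows. The greedy one-to-one matching used here is an assumption (the text does not specify how multiple detections of the same labeled stem are resolved), and the default IOU threshold is the "usual value" mentioned above:

```python
def iou(a, b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
    return inter / union if union > 0 else 0.0

def evaluate(detected, labeled, iou_thr=0.5):
    """Greedy one-to-one matching of detections to labels: a detection is a TP
    if it matches an unused labeled box with IOU above the threshold."""
    used = set()
    tp = 0
    for d in detected:
        best, best_iou = None, iou_thr
        for j, l in enumerate(labeled):
            if j not in used and iou(d, l) > best_iou:
                best, best_iou = j, iou(d, l)
        if best is not None:
            used.add(best)
            tp += 1
    fp = len(detected) - tp          # detections with no matching label
    fn = len(labeled) - tp           # labeled stems that were missed
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    bias = 1.0 - precision / recall  # 0 when precision and recall balance
    return precision, recall, bias
```

A bias close to zero, as reported in the results, simply means the detector misses about as many stems as it falsely adds, so the total count (and hence the density) remains nearly unbiased.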
2.6. Heritability Computation. The broad-sense heritability (H2) evaluates the repeatability of the stem or ear density estimates. It was computed as the percentage of the genotypic variance, Vg, relative to the total variance, Vg + Ve, where Ve is the variance due to the environment []. The heritability of the stem density and ear density was computed over sixteen wheat genotypes ( plots) selected from the Clermont experimental site, where each genotype was replicated six to fifteen times.
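A sketch of this computation, assuming the variance components are estimated from a one-way ANOVA on genotype with unbalanced replication (the text does not specify the estimation procedure used):

```python
import numpy as np

def broad_sense_heritability(values_by_genotype):
    """H2 = Vg / (Vg + Ve), with the genotypic variance Vg and the residual
    (environmental) variance Ve estimated by the method of moments from a
    one-way ANOVA with possibly unbalanced replication."""
    groups = [np.asarray(g, dtype=float) for g in values_by_genotype]
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    # within-genotype (environmental) mean square
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_within = ss_within / (n - k)
    # between-genotype mean square
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ms_between = ss_between / (k - 1)
    # effective number of replicates for unbalanced designs
    n0 = (n - sum(len(g) ** 2 for g in groups) / n) / (k - 1)
    vg = max((ms_between - ms_within) / n0, 0.0)
    ve = ms_within
    return vg / (vg + ve)
```

With well-separated genotype means and small within-genotype scatter the estimate approaches 1; with no genotypic differences it is 0.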
3. Results and Discussion
3.1. Stems Are Accurately Identified Using the Faster-RCNN Model. To evaluate the robustness of the RCNN model, it was calibrated over the Gréoux (Cg), Clermont (Cc), or both pooled (Cgc) datasets. Performances computed over the validation datasets were very good, with . < precision < . and . < recall < . (Table ). Precision and recall were well balanced, with a small bias: -. < bias < ..
The results showed that the classification accuracy of stem identification was very high based on the precision and recall values over the same experiments (Table ). The robustness of the classification was further investigated by comparing the precision and recall values computed over the validation datasets coming from the other experiment. Results show that the classification evaluated over the same experiment used to calibrate the model was always performing the best (Table ). The classification performances degraded when the model calibrated over a single experiment was validated on the other experiment. This may be explained both by the limited size of the calibration dataset and by the specific features associated with each experiment, including the spatial resolution (Tables  and ). However, when the calibration was completed over the pooled experiments (Cgc), the precision and recall values decreased only slightly when evaluated over each individual experiment (Vg or Vc) (Table ). The model captured the key information common to the two experiments to provide a consistent stem identification. This confirms the efficiency and robustness of the Faster-RCNN method.
T : Performances of the stem density estimation when using Faster-RCNN method for the postclassication step. e evaluation is
achieved on the three validation data sets (Vg, Vc, and Vgc).
Calibration dataset Validation dataset Sample size slope intercept R2RMSE (stems/m2) RRMSE (%)
Vg  . . . . .
Vc  . . . . .
Vgc  . . . . .
Vg  . . . . .
Vc  . . . . .
Vgc  . . . . .
Vg  . . . . .
Vc  . . . . .
Vgc  . . . . .
T : Statistics of the relationships between the estimated stem density and the measured ear density. e Faster-RCNN was trained over
the Cg+Cc dataset.
Datasets Slope Intercept R2RMSE (stems/m2) RRMSE (%)
eoux . . . . .
Clermont . . . . .
eoux & Clermont . . . . .
T : Biomass regression models derived from stem density, ear
density, stem area, height, and biovolume at the Gr´
eoux experi-
mental site. Note: ∗∗ means model signicant at the . level of
probability. e R2, RMSE, and RRMSE are averaged R2, RMSE, and
RRMSE values of leave-one-out cross-validation methods.
Variabl e s R 2RMSE (g/m2) RRMSE (%)
Stem density .∗∗  .
Ear density .∗∗  .
Plant height .∗∗  .
Basal area .∗∗  .
Biovolume .∗∗  .
All .∗∗  .
3.2. Stem Density Is Accurately Estimated. The consequences of the identification performances of the Faster-RCNN model discussed previously were evaluated in terms of stem density at the image extract level. For the sake of consistency, several calibration and validation datasets were considered to further evaluate the robustness of the model. Results showed RRMSE values ranging from .% to .%. Calibrating over the pooled datasets (Cgc, Table ) provided the best performances, with an RRMSE lower than %. A slight degradation of the performances was observed when calibrating over a single dataset. Calibrating over the Gréoux dataset provided the worst performances when validated over the Clermont dataset. This may be explained by the differences between sites, including the spatial resolution and the variation in the cutting height and inclination of the stems during harvest between the Gréoux and Clermont sites. These sources of variability were better represented in the pooled Gréoux and Clermont calibration dataset (Cgc) that provided more robust performances.
When considering the calibration over the pooled dataset (Cgc) that provided the overall best performances, very small biases were observed, with points closely distributed around the 1:1 line (Figure ). The scatter around the 1:1 line appeared to be relatively independent of the stem density (Figure ).
3.3. The Stem Density Is a Good Proxy of the Ear Density. The stem density estimated with the Faster-RCNN model calibrated over the Cgc dataset was compared to the ear density counted visually at the ground level. Both quantities were evaluated on different samples, expected however to represent the average microplot value. Results showed that the estimated stem density based on the Faster-RCNN model was very consistent (Figure ) with the measured ear density at the Gréoux (Table ) and Clermont (Table ) experimental sites. The scatter between ear and stem densities appeared to increase with the density: this was obvious between the Gréoux (<density<) and Clermont (<density<) sites. Part of the larger scatter observed over the Clermont site may come from the small area sampled for the visual counting (. m2). The scatter between ear and stem densities also seems to increase with the density within the Clermont site between the low and high densities (Figure ). Nevertheless, the good agreement found between ear and stem densities confirms the results of Siddique et al. [].
Previous studies demonstrated that RGB imagery can be used to estimate the ear density using image processing algorithms []. However, the ear density estimation performances were generally limited to a comparison between the ears detected by the machine learning algorithm and those that can be visually identified by an operator on the image. Some discrepancies could appear compared with the actual ear density, particularly when some ears are lying in the lower canopy layers and cannot be easily seen from the top of the canopy. Counting ears from the stem sections appears therefore preferable under such conditions. Further, stem sections are relatively simpler objects to identify as compared to ears that may show a large variability in appearance. Additionally, ears can frequently overlap in the field, making their identification more complex as compared to stem sections that never overlap.
3.4. Stem and Ear Densities Are Highly Heritable. The heritability (H2) values of the stem density and ear density were compared at the Clermont experimental site where several replicates of 16 genotypes were available. Results show that the H2 values of the stem density (80%) and of the ear density (78%) were high and close together. This is consistent with the strong relationship found between both quantities (Figure ). These heritability values agree well with the values provided by Madec et al. []. The H2 value of the stem density was slightly higher than that of the ear density, probably because of the larger sample size used for estimating the stem density from the RGB images, which makes the values more repeatable. The high heritability values suggest that the proposed method is well suited to serve the breeders' needs.
3.5. Stem Diameter Follows a Gamma Distribution. The distribution of the stem diameter was investigated at the Gréoux and Clermont experimental sites, respectively, on  and  microplots. The distribution of the stem diameter may be a pertinent trait describing the structure of the tiller population that may be impacted by the growth conditions. The distribution of the stem diameter of each microplot was adjusted either to a normal or to a gamma distribution. The corresponding p values associated with the fit of each distribution were computed. Results show that the p value of the gamma distribution was larger than that of the normal distribution for % of the microplots for Gréoux and % of the microplots for Clermont. The gamma distribution, characterized by a scale and a shape parameter, was therefore selected to describe the stem diameter distribution over each microplot.
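Such a model comparison can be sketched as follows. The study compares p values of the two fits; as a self-contained stand-in (the exact fitting and testing procedure is not given here), this sketch fits both distributions by the method of moments and compares their log-likelihoods, which ranks the two candidates in the same spirit:

```python
import math
import numpy as np

def gamma_loglik(d, shape, scale):
    """Log-likelihood of a gamma distribution over the sample d."""
    return float(np.sum((shape - 1) * np.log(d) - d / scale
                        - shape * math.log(scale) - math.lgamma(shape)))

def normal_loglik(d, mu, sigma):
    """Log-likelihood of a normal distribution over the sample d."""
    return float(np.sum(-0.5 * ((d - mu) / sigma) ** 2
                        - math.log(sigma) - 0.5 * math.log(2 * math.pi)))

def compare_fits(diameters):
    """Fit gamma (method of moments: shape = mean^2/var, scale = var/mean)
    and normal distributions to the stem diameters of one microplot and
    return the parameters and log-likelihoods (higher = better fit)."""
    d = np.asarray(diameters, dtype=float)
    mean, var = d.mean(), d.var()
    shape, scale = mean ** 2 / var, var / mean
    return {
        "shape": shape,
        "scale": scale,
        "ll_gamma": gamma_loglik(d, shape, scale),
        "ll_normal": normal_loglik(d, mean, math.sqrt(var)),
    }
```

By construction the fitted gamma mean (shape x scale) equals the sample mean, which mirrors the constraint discussed below for the negative shape-scale correlation.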
The average stem diameter of each microplot ranged from . to . mm, with a median value close to 2.0 mm for both sites (Figure ). The average stem diameter was loosely but positively correlated with the stem density (Figure ): the stress experienced by the plants affected both the density and the diameter of the stems, with no apparent compensation between these two traits. The stem diameter distribution of each microplot as described by a gamma function was further investigated: the shape parameters were slightly smaller for the Gréoux site (<shape<) as compared to those of the Clermont site (<shape<). Conversely, the scale parameters were slightly larger for the Gréoux site (.<scale<.) as compared to the Clermont site (.<scale<.). The distribution of the diameters was more concentrated around the average for the Clermont site as compared to the Gréoux site, where a larger range of diameters was observed. This may be related to the stress conditions that were stronger in Gréoux, particularly during the stem elongation phase. This was also reflected by the stem density that was more impacted in Gréoux. The scale and shape parameters were negatively correlated for both sites, with a stronger correlation for Clermont (Figure ). Since the mean of a gamma distribution is defined by the product of the shape and scale parameters, the negative correlation between the two parameters is explained by the constraint of keeping the average close to 2.0 mm. Therefore, both parameters could be equally used to describe the "flatness" of the stem diameter distribution.
Biomass. A total of  microplots from the Gréoux dataset were used to relate the measured AGB with the ear density and the four structural traits derived from the high-throughput measurements: stem density; stem basal area, computed as the product of the average stem diameter and the stem density; plant height; and the biovolume, computed as the product of the basal area and plant height. Results show that all these traits are strongly correlated with AGB (Figure  and Table ). The best relationship is however obtained using the biovolume, which combines the three main original traits: stem density and average stem diameter, combined into the basal area, and plant height. Note that these traits are relatively independent: stem density and plant height are loosely correlated (Figure , r2=.); plant height and basal area are also loosely correlated (r2=.).
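The trait definitions above reduce to simple products; a minimal sketch with made-up per-microplot values (the definitions follow the text, the numbers do not):

```python
import numpy as np

# Hypothetical per-microplot measurements (not the field data).
stem_density = np.array([420.0, 380.0, 500.0, 450.0])  # stems m^-2
mean_diameter = np.array([2.1, 1.9, 2.0, 2.2])         # mm
plant_height = np.array([0.85, 0.80, 0.95, 0.90])      # m

# Trait definitions used in the text: the basal area is the product of the
# average stem diameter and the stem density, and the biovolume is the
# product of the basal area and plant height.
basal_area = mean_diameter * stem_density
biovolume = basal_area * plant_height

print(biovolume)
```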
Because the dataset used was limited, the predictive performances of the relationships observed between the AGB and these traits were evaluated with a cross-validation method (Jin et al., ). The best determination coefficients were observed consistently for the biovolume (Table ), with a relative error of 6%, i.e., within the order of magnitude of the accuracy with which AGB was measured. Our results are very consistent with those presented by Aziz et al. () and Pittman et al. ().
To further improve the predictive model, we used all five traits together within a multiple linear regression model. Only a marginal improvement of the model performances was observed (Figure  and Table ). This may be explained by the strong relationships between the five traits used, as well as by the decrease in the degrees of freedom induced by the increase in the number of coefficients to be adjusted (six coefficients instead of the two needed when using only the biovolume). The biovolume therefore appeared to be a very sound proxy of the AGB. Previous results suggested that AGB could be estimated using different optical techniques and technologies []. Our study further confirmed these results and demonstrated that the estimation accuracy of AGB could be improved by combining LiDAR data and RGB imagery. However, the stability of the relationship found over the limited sample used in this study should be further evaluated, with emphasis on the possible dependency on environmental and management conditions, as well as on differences between genotypes.
4. Conclusion
This study demonstrated that the identification of the stems after the harvest was possible using deep-learning approaches applied to RGB images. This requires the spatial resolution to be sufficient, i.e., around 0.2 mm, since the stem diameters are around 2.0 mm. It ensures that the objects to be identified within the image are represented with an optimal number of pixels, as advised by Madec et al. []. Such a high resolution could be achieved using a high-resolution RGB camera fixed on a pole, a cart, or a phenomobile, or even on a UAV flying at low altitude, as already demonstrated by Jin et al. []. Alternatively, a set of RGB cameras could be mounted on the combine machine and provide, in near real time, an estimate of the stem density.
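The resolution requirement above amounts to simple arithmetic: with the roughly 0.2 mm ground sampling distance reported for the camera and stems of about 2.0 mm, each cross section spans around ten pixels across:

```python
# Pixels across a stem section = stem diameter / ground sampling distance.
gsd_mm = 0.2            # ground sampling distance of the RGB camera
stem_diameter_mm = 2.0  # typical stem diameter reported in the text
pixels_across = stem_diameter_mm / gsd_mm
print(pixels_across)    # 10.0
```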
The method requires the stems not to be covered by the straw rejected by the combine machine. Further, stems that are too inclined, because of the harvest process or some postharvest practice, may degrade performances, since the cross sections at the tip of the stems will either not be viewed by the camera or be strongly deformed. The proposed method may also be unsuitable under stem lodging situations, where the stem sections will show unexpected patterns. Nevertheless, our results indicate that, for the Faster-RCNN model trained here, the illumination conditions may have little impact on the stem identification, since the objects are mostly identified by the relative brightness of the pixels, with the color itself bringing very little information. We demonstrated therefore that the stem density is accessible with high throughput, at relatively low cost, and with very good accuracy. Further, the capacity to sample large areas to estimate the stem density will minimize the impact of the spatial variability within a microplot.
Although Madec et al. [] among others demonstrated that similar deep-learning techniques could be applied efficiently to estimate the ear density, ear identification is more complex because of strong differences in the ear aspect across genotypes and development stages. We demonstrated in this study that the stem density was a very close proxy of the ear density, although some discrepancy is expected under specific environmental conditions. In such circumstances, the distribution of the stem diameters could potentially provide the necessary information to get a better estimate of the ear density from the stem density and diameter distribution.
Once the stem is identified, we demonstrated that its diameter could be easily measured. The distribution of the stem diameters followed a gamma function with an average diameter close to 2.0 mm. The distribution of the stem diameters may be indicative of the structure of the tiller population, which may be governed by the genetics in interaction with the conditions experienced by the plants. Finally, the biovolume, computed as the product of the average stem diameter, the stem density, and plant height, was demonstrated to be a close proxy of the above-ground biomass. This opens a very attractive potential for breeders to get high-throughput estimates of the total plant biomass at harvest and possibly to quantify the radiation use efficiency and the harvest index, assuming that the yield will be measured anyway. Nevertheless, these promising results should be verified over a much larger number of situations, to check that the correlations are not too dependent on the environmental conditions or on the genotype.
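The gray-value-profile diameter measurement mentioned above can be sketched with a simple half-amplitude width criterion. The profile below is synthetic (a bright stem cross section on a darker background), and the exact criterion used in the paper may differ:

```python
import numpy as np

# Hypothetical 1-D gray-value profile across a detected stem section,
# sampled every 0.1 mm: bright stem tissue on a darker background.
x_mm = np.arange(0.0, 4.0, 0.1)
profile = 40.0 + 160.0 * np.exp(-((x_mm - 2.0) / 1.1) ** 4)

# Diameter estimated as the width of the profile at half of its amplitude
# (a simple FWHM-style threshold on the gray values).
half = profile.min() + 0.5 * (profile.max() - profile.min())
above = x_mm[profile > half]
diameter_mm = above[-1] - above[0]
print(f"estimated diameter = {diameter_mm:.1f} mm")
```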
Conflicts of Interest
The authors declare no conflicts of interest.
Acknowledgments
This study was supported by the “Programme d’Investissement d’Avenir” PHENOME (ANR--INBS-) with the participation of FranceAgriMer and the “Fonds de Soutien à l’Obtention Végétale”. We thank the staff of the Gréoux and Clermont sites who participated in the experiments. The work was completed within the UMT-CAPTE funded by the French Ministry of Agriculture.
[] K. D. Joseph, M. M. Alley, D. E. Brann, and W. D. Gravelle, “Row spacing and seeding rate effects on yield and yield components of soft red winter wheat,” Agronomy Journal, vol. , no. , pp. –, .
[] J. M. Whaley, D. L. Sparkes, M. J. Foulkes, J. H. Spink, T. Semere, and R. K. Scott, “The physiological response of winter wheat to reductions in plant density,” Annals of Applied Biology, vol. , no. , pp. –, .
[] K. H. M. Siddique, E. J. M. Kirby, and M. W. Perry, “Ear: stem ratio in old and modern wheat varieties; relationship with improvement in number of grains per ear and yield,” Field Crops Research.
[] J. Hiltbrunner, B. Streit, and M. Liedgens, “Are seeding densities an opportunity to increase grain yield of winter wheat in a living mulch of white clover?” Field Crops Research, vol. , no. , pp. –, .
[] C. M. Donald, “The breeding of crop ideotypes,” Euphytica, vol. .
[] “The relationship between height and yield in wheat,” Heredity, vol. .
[] C. Law, J. Snape, and A. Worland, “Aneuploidy in wheat and its uses in genetic analysis,” in Wheat Breeding, pp. –, Springer, .
[] A. J. King, L. R. Montes, J. G. Clarke et al., “Identification of QTL markers contributing to plant growth, oil yield and fatty acid composition in the oilseed crop Jatropha curcas L.,” Biotechnology for Biofuels, vol. , no. , p. , .
[] S. Arnoult, M.-C. Mansard, and M. Brancourt-Hulmel, “Early prediction of miscanthus biomass production and composition based on the first six years of cultivation,” Crop Science, vol. , no. , pp. –, .
[] J. Subira, K. Ammar, F. Álvaro, L. F. García del Moral, S. Dreisigacker, and C. Royo, “Changes in durum wheat root and aerial biomass caused by the introduction of the Rht-B1b dwarfing allele and their effects on yield formation,” Plant and Soil.
[] X. Jin, S. Liu, F. Baret, M. Hemerlé, and A. Comar, “Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery,” Remote Sensing of Environment, vol. , pp. –, .
[] S. Liu, F. Baret, D. Allard et al., “A method to estimate plant density and plant spacing heterogeneity: application to wheat crops,” Plant Methods, vol. , no. , p. , .
[] M. Flowers, R. Weisz, and R. Heiniger, “Remote sensing of winter wheat tiller density for early nitrogen application decisions,” Agronomy Journal, vol. , no. , pp. –, .
[] S. B. Phillips, D. A. Keahey, J. G. Warren, and G. L. Mullins, “Estimating winter wheat tiller density using spectral reflectance sensors for early-spring, variable-rate nitrogen applications,” Agronomy Journal, vol. , no. , pp. –, .
[] I. M. Scotford and P. C. H. Miller, “Estimating tiller density and leaf area index of winter wheat using spectral reflectance and ultrasonic sensing techniques,” Biosystems Engineering, vol. , no. , pp. –, .
[] R. D. Boyle, F. M. K. Corke, and J. H. Doonan, “Automated estimation of tiller number in wheat by ribbon detection,” Machine Vision and Applications, vol. , no. , pp. –, .
[] J. A. Fernandez-Gallego, S. C. Kefauver, N. A. Gutiérrez, M. T. Nieto-Taladriz, and J. L. Araus, “Wheat ear counting in-field conditions: high throughput and low-cost approach using RGB images,” Plant Methods, vol. , no. , p. , .
[] J. Fernandez-Gallego, M. Buchaillot, N. Aparicio Gutiérrez, M. Nieto-Taladriz, J. Araus, and S. Kefauver, “Automatic wheat ear counting using thermal imagery,” Remote Sensing, vol. , no. , p. , .
[] T. Liu, C. Sun, L. Wang, X. Zhong, X. Zhu, and W. Guo, “In-field wheat ear counting based on image processing technology,” Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, vol. , no. , pp. –, .
[] S. Madec, X. Jin, H. Lu et al., “Ear density estimation from high resolution RGB imagery using deep learning technique,” Agricultural and Forest Meteorology, vol. , pp. –, .
[] R. Ballesteros, J. F. Ortega, D. Hernández, and M. A. Moreno, “Applications of georeferenced high-resolution images obtained with unmanned aerial vehicles. Part I: description of image acquisition and processing,” Precision Agriculture, vol. , no. , pp. –, .
[] S. Gao, Z. Niu, N. Huang, and X. Hou, “Estimating the leaf area index, height and biomass of maize using HJ- and RADARSAT-,” International Journal of Applied Earth Observation and Geoinformation, vol. , no. , pp. –, .
[] X. Jin, G. Yang, X. Xu et al., “Combined multi-temporal optical and radar parameters for estimating LAI and biomass in winter wheat using HJ and RADARSAT- data,” Remote Sensing, vol. , no. , pp. –, .
[] X. Jin, Z. Li, G. Yang et al., “Winter wheat yield estimation based on multi-source high-resolution optical and radar imagery and AquaCrop model using particle swarm optimization algorithm,” ISPRS Journal of Photogrammetry and Remote Sensing, vol. , pp. –, .
[] … Hu, “Estimating crop stresses, aboveground dry biomass and yield of corn using multi-temporal optical data combined with a radiation use efficiency model,” Remote Sensing of Environment, vol. , no. , pp. –, .
[] N. Tilly, D. Hoffmeister, Q. Cao et al., “Multitemporal crop surface models: accurate plant height measurement and biomass estimation with terrestrial laser scanning in paddy rice,” Journal of Applied Remote Sensing, vol. , no. , p. , .
[] J. Yue, G. Yang, C. Li et al., “Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models,” Remote Sensing, vol. , no. , p. , .
[] S. Madec, F. Baret, B. De Solan et al., “High-throughput phenotyping of plant height: comparing unmanned aerial vehicles and ground LiDAR estimates,” Frontiers in Plant Science, vol. , p. , .
[] A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet classification with deep convolutional neural networks,” in Advances in Neural Information Processing Systems, pp. –, .
[] J. Deng, W. Dong, R. Socher et al., “ImageNet: a large-scale hierarchical image database,” in Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. –, Miami, Fla, USA, June .
[] J. Huang, V. Rathod, C. Sun et al., “Speed/accuracy trade-offs for modern convolutional object detectors,” in Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR ’17), pp. –, Honolulu, HI, USA, July .
[] S. Ren, K. He, R. Girshick, and J. Sun, “Faster R-CNN: towards real-time object detection with region proposal networks,” in Advances in Neural Information Processing Systems, pp. –, .
[] G. W. Meyer and D. P. Greenberg, “Perceptual color spaces for computer graphics,” ACM SIGGRAPH Computer Graphics, vol. .
[] M. R. Dohm, “Repeatability estimates do not always set an upper limit to heritability,” Functional Ecology, vol. , no. , pp. –, .
Authors: Xiuliang Jin, Simon Madec, Dan Dutartre, Benoit de Solan, Alexis Comar, Frédéric Baret.
... The tiller number was estimated by capturing images of the plant base [17]. For the image analysis in the field, some studies estimated the tiller number from rice-stubble images after harvesting [18,19]. No study has provided an effective tiller-counting method under flooded conditions, particularly around the panicle-formation stage. ...
... The tiller number was estimated by capturing images of the plant base [17]. For the image analysis in the field, some studies estimated the tiller number from rice-stubble images after harvesting [18,19]. No study has provided an effective tiller-counting method under flooded conditions, particularly around the panicleformation stage. ...
Full-text available
The increase in the number of tillers of rice significantly affects grain yield. However, this is measured only by the manual counting of emerging tillers, where the most common method is to count by hand touching. This study develops an efficient, non-destructive method for estimating the number of tillers during the vegetative and reproductive stages under flooded conditions. Unlike popular deep-learning-based approaches requiring training data and computational resources, we propose a simple image-processing pipeline following the empirical principles of synchronously emerging leaves and tillers in rice morphogenesis. Field images were taken by an unmanned aerial vehicle at a very low flying height for UAV imaging—1.5 to 3 m above the rice canopy. Subsequently, the proposed image-processing pipeline was used, which includes binarization, skeletonization, and leaf-tip detection, to count the number of long-growing leaves. The tiller number was estimated from the number of long-growing leaves. The estimated tiller number in a 1.1 m × 1.1 m area is significantly correlated with the actual number of tillers, with 60% of hills having an error of less than ±3 tillers. This study demonstrates the potential of the proposed image-sensing-based tiller-counting method to help agronomists with efficient, non-destructive field phenotyping.
... Once the variety type, fertilization situation, or growth period changes, the continued use of the model will cause considerable errors. Jin et al. (2019) used Faster-RCNN to achieve a high-throughput measurement of the tiller number after harvest. This method can be used to analyze the biomass and the tiller characteristics of different varieties but cannot be used for real-time monitoring in the wheat fertility process. ...
... The first is to use GCFs to estimate and continuously improve the estimation accuracy by improving vegetation index and optimizing algorithms, such as biomass and other parameters (Dong et al., 2020). The second uses image analysis and deep learning to monitor targets, such as wheat ears or tillers after harvest (Jin et al., 2019;Sadeghi-Te Hran et al., 2019). Further improvements the estimation of wheat tiller number could be made through these two approaches mentioned above or their organic integration. ...
Full-text available
Wheat (Triticum aestivum L.) is an essential crop that is widely consumed globally. The tiller density is an important factor affecting wheat yield. Therefore, it is necessary to measure the number of tillers during wheat cultivation and breeding, which requires considerable labor and material resources. At present, there is no effective high-throughput measurement method for tiller number estimation, and the conventional tiller survey method cannot accurately reflect the spatial variation of wheat tiller density within the whole field. Therefore, in order to meet the demand for the thematic map of wheat tiller density at the field scale for the variable operation of nitrogen fertilizer, the multispectral images of wheat in Feekes growth stages 2–3 were obtained by unmanned aerial vehicle (UAV), and the characteristic parameters of the number of tillers were used to construct a model that could accurately estimate the number of tillers. Based on the vegetation index (VIs), this work proposed a gradual change features (GCFs) approach, which can greatly improve the disadvantages of using VIs to estimate tiller number, better reflect the tiller status of the wheat population, and have good results on the estimation of tiller in common models. A Lasso + VIs + GCFs method was constructed for accurate estimation of tiller number in multiple growth periods and fertilizer-treated wheat, with an average RMSE of fewer than 9 tillers per square meter, average MAE less than 8 tillers per square meter, and R² above 0.7. The results of the study not only proposed a high-throughput measurement method for the number of tillers but also provided a reference for the estimation of tiller number and other agronomic parameters.
... For example, accuracy of 93.4 % has been achieved in cucumber disease detection using deep learning (Ma et al., 2018). Regarding plant and organ counting, deep learning methods have been applied to different crops, such as maize seedlings (Quan et al., 2019), rice seedlings (Wu et al., 2019), wheat stems (Jin et al., 2019), and wheat ear (Pawara et al., 2017). In recent years, the segmentation of crop phenotypic information based on deep learning methods has gradually aroused the interest of researchers (Yang et al., 2021). ...
Full-text available
The dynamics of maize tassel area reflect the growth and development of maize plants, monitoring which facilitates crop breeding and management. At present, the monitoring of maize tassels mainly depends on manual work, which is very labor intensive and may be biased by human errors. The U-Net model has proved effective for crop segmentation using RGB imagery. However, there has not been a systematic study to test how the accuracy of U-Net model vary when applied to different maize varieties, at different tasseling stages, and on images of different spatial resolutions. Moreover, the capability of U-Net model for monitoring the dynamics of tassel area has not been explored. In this study, the potential of the U-Net model to provide an accurate segmentation of the tassels in complex situations from near-ground RGB images and UAV images were comprehensively studied. The results showed that the segmentation accuracy of U-Net model with Vgg16 as feature extraction network (IoU = 0.71) for tassels at the whole tasseling stages was better than that of U-Net model with MobileNet (IoU = 0.63). The U-Net model with Vgg16 as the feature extraction network maintained a good segmentation accuracy for maize tassels at different tasseling stages (IoU = 0.63-0.76), for different varieties (IoU = 0.65-0.79), and at different resolutions (IoU = 0.57-0.71), which proved the robustness of the model. Changes in the segmented area of tassels from images were basically consistent with the trends observed in the actual area of tassel measured manually. UAV RGB images with resolution of 3.06 mm showed a good segmentation accuracy (IoU = 0.54). In summary, the results showed that the U-Net model has a good segmentation accuracy of maize tassels under various complex situations. This study provides an effective method to monitor the maize tassel status in crop phenotyping experiments in the future.
... Leaf area index (LAI) is one of key traits of characterizing crop growth, which is highly relevant to crop photosynthesis and transpiration [1][2][3]. Aboveground biomass (AGB) is an important basis for crop yield formation [4,5]. Therefore, accurate and rapid estimation of maize LAI and AGB is helpful for high-throughput screening of breeding maize. ...
Full-text available
High-throughput estimation of phenotypic traits from UAV (unmanned aerial vehicle) images is helpful to improve the screening efficiency of breeding maize. Accurately estimating phenotyping traits of breeding maize at plot scale helps to promote gene mining for specific traits and provides a guarantee for accelerating the breeding of superior varieties. Constructing an efficient and accurate estimation model is the key to the application of UAV-based multiple sensors data. This study aims to apply the ensemble learning model to improve the feasibility and accuracy of estimating maize phenotypic traits using UAV-based red-green-blue (RGB) and multispectral sensors. The UAV images of four growth stages were obtained, respectively. The reflectance of visible light bands, canopy coverage, plant height (PH), and texture information were extracted from RGB images, and the vegetation indices were calculated from multispectral images. We compared and analyzed the estimation accuracy of single-type feature and multiple features for LAI (leaf area index), fresh weight (FW), and dry weight (DW) of maize. The basic models included ridge regression (RR), support vector machine (SVM), random forest (RF), Gaussian process (GP), and K-neighbor network (K-NN). The ensemble learning models included stacking and Bayesian model averaging (BMA). The results showed that the ensemble learning model improved the accuracy and stability of maize phenotypic traits estimation. Among the features extracted from UAV RGB images, the highest accuracy was obtained by the combination of spectrum, structure, and texture features. The model had the best accuracy constructed using all features of two sensors. The estimation accuracies of ensemble learning models, including stacking and BMA, were higher than those of the basic models. The coefficient of determination ( R 2 ) of the optimal validation results were 0.852, 0.888, and 0.929 for LAI, FW, and DW, respectively. 
Therefore, the combination of UAV-based multisource data and ensemble learning model could accurately estimate phenotyping traits of breeding maize at plot scale.
... In general, leaves were more likely to be used for plant phenotyping. Fewer studies have studied the high-throughput phenotyping of stems [66][67][68][69][70][71][72][73]. The overall results showed that stems can be an efficient alternative for plant phenotyping in addition to leaves. ...
Full-text available
Herbicides and heavy metals are hazardous substances of environmental pollution, resulting in plant stress and harming humans and animals. Identification of stress types can help trace stress sources, manage plant growth, and improve stress-resistant breeding. In this research, hyperspectral imaging (HSI) and chlorophyll fluorescence imaging (Chl-FI) were adopted to identify the rice plants under two types of herbicide stresses (butachlor (DCA) and quinclorac (ELK)) and two types of heavy metal stresses (cadmium (Cd) and copper (Cu)). Visible/near-infrared spectra of leaves (L-VIS/NIR) and stems (S-VIS/NIR) extracted from HSI and chlorophyll fluorescence kinetic curves of leaves (L-Chl-FKC) and stems (S-Chl-FKC) extracted from Chl-FI were fused to establish the models to detect the stress of the hazardous substances. Novel end-to-end deep fusion models were proposed for low-level, middle-level, and high-level information fusion to improve identification accuracy. Results showed that the high-level fusion-based convolutional neural network (CNN) models reached the highest detection accuracy (97.7%), outperforming the models using a single data source (<94.7%). Furthermore, the proposed end-to-end deep fusion models required a much simpler training procedure than the conventional two-stage deep learning fusion. This research provided an efficient alternative for plant stress phenotyping, including identifying plant stresses caused by hazardous substances of environmental pollution.
... Maimaitijiang and Sidike (2019) developed a canopy volume index based on the parameters extracted from the crop height model from the orthomosaic image obtained by UAV and combine them with different SRIs to improve the biomass prediction with success. Another approach was taken by Jin et al. (2019) to predict biomass by estimating the volume of the plot. The authors combined height at anthesis time when it is considered to be at its maximum, the post-harvest stem diameter, and the number extracted from RGB images capturing the residual stems standing straight after harvest. ...
Full-text available
The revolution in digital phenotyping combined with the new layers of omics and envirotyping tools offers great promise to improve selection and accelerate genetic gains for crop improvement. This chapter examines the latest methods involving digital phenotyping tools to predict complex traits in cereals crops. The chapter has two parts. In the first part, entitled “Digital phenotyping as a tool to support breeding programs”, the secondary phenotypes measured by high-throughput plant phenotyping that are potentially useful for breeding are reviewed. In the second part, “Implementing complex G2P models in breeding programs”, the integration of data from digital phenotyping into genotype to phenotype (G2P) models to improve the prediction of complex traits using genomic information is discussed. The current status of statistical models to incorporate secondary traits in univariate and multivariate models, as well as how to better handle longitudinal (for example light interception, biomass accumulation, canopy height) traits, is reviewed.
... Therefore, it is important to rapidly and objectively measure and analyse the flowering date of rapeseed, especially for germplasm resources and breeding. The unmanned aerial vehicle (UAV) platform can carry a variety of sensors and can be widely applied in the calculation of plant height (Hu et al., 2018;Che et al., 2020), leaf area index (Yao et al., 2017;Yue et al., 2018), canopy coverage (Ahmed et al., 2017;Duan et al., 2017), aboveground biomass and yield (Jin et al., 2019;Li et al., 2020;Wan et al., 2020) due to its low cost and reliable performance. In addition, the UAV platform has flexible temporal and spatial resolutions and can be utilized to obtain plant growth dynamics at different spatial scales in a large-scale experiment. ...
Full-text available
Rapeseed (Brassica napus L.) is an important oil-bearing cash crop. Effective identification of the rapeseed flowering date is important for yield estimation and disease control. Traditional field measurements of rapeseed flowering are time-consuming, labour-intensive and strongly subjective. In this study, red, green and blue (RGB) images of rapeseed flowering derived from unmanned aerial vehicles (UAVs) were acquired with a total of seventeen available orthomosaic images, covering the whole flowering period for 299 rapeseed varieties. Five different machine learning methods were employed to identify and to extract the flowering areas in each plot. The results suggested that the accuracy of flowering area extraction by the decision tree-based segmentation model (DTSM) was higher than that of naive Bayes, K-nearest neighbours (KNN), random forest (RF) and support vector machine (SVM) in all varieties and flowering dates, with R² = 0.97 and root mean square error (RMSE) = 0.051 pixels/pixels. Data on the proportion of flowering area and its dynamics showed differences in the time and duration of each flowering date among varieties. All varieties were classified into four clusters based on k-means clustering analysis. There were significant differences in eight phenotypic parameters among the four clusters, especially in the time of maximum flowering ratio and the time entering the early and medium flowering dates. The results from this study could provide a basis for rapeseed breeding based on flowering dynamics.
Full-text available
Maize population density is one of the most essential factors in agricultural production systems and has a significant impact on maize yield and quality. Therefore, it is essential to estimate maize population density timely and accurately. In order to address the problems of the low efficiency of the manual counting method and the stability problem of traditional image processing methods in the field complex background environment, a deep-learning-based method for counting maize plants was proposed. Image datasets of the maize field were collected by a low-altitude UAV with a camera onboard firstly. Then a real-time detection model of maize plants was trained based on the object detection model YOLOV5. Finally, the tracking and counting method of maize plants was realized through Hungarian matching and Kalman filtering algorithms. The detection model developed in this study had an average precision mAP@0.5 of 90.66% on the test dataset, demonstrating the effectiveness of the SE-YOLOV5m model for maize plant detection. Application of the model to maize plant count trials showed that maize plant count results from test videos collected at multiple locations were highly correlated with manual count results (R2 = 0.92), illustrating the accuracy and validity of the counting method. Therefore, the maize plant identification and counting method proposed in this study can better achieve the detection and counting of maize plants in complex backgrounds and provides a research basis and theoretical basis for the rapid acquisition of maize plant population density.
The book was inadvertently published with an incorrect name information for one of the Chapter author as Body Mori, instead it should be “Boyd A. Mori” in the front matter and Chapter 20.
The plant height (PH), stem thickness long axis (STLA), and stem thickness short axis (STSA) in maize morphological parameters can effectively reflect the growth, lodging resistance and yield information of maize plants. Terrestrial laser scanning (TLS) can achieve rapid measurement of crop phenotypic parameters. To address the problems of low automation and leaf interference in the existing measurement methods, TLS was used as the measurement sensor, and a morphological measurement method for PH, STLA and STSA of field maize based on point cloud image conversion was proposed. First, in the V3, V6, V9, and V12 stages, three-dimensional (3D) point cloud data of two maize varieties (Jingnongke 728 and Nongda 84) were obtained by TLS. Second, the point cloud processing software was used to match the collected maize point cloud data, obtain the registered multi-site cloud data, remove the background point cloud data, extract the maize row data, and carry out down-sampling. Programming was used to realize data format conversion and individual plant segmentation. Third, several methods, including plane segmentation, statistical filtering, pass filtering, maximum and minimum traversal, and Euclidean clustering were used to remove ground point clouds, judge whether there was maize plant, and extract area maize point clouds. A method of point cloud image conversion was proposed to realize the segmentation of maize stem and leaf. Finally, the height of the plant was measured by calculating the vertical distance from the highest point of the plant to its base. The point cloud of the stem at a specific location was identified, and the ellipse fitting method was used to measure the thickness of the long axis and the short axis of the stem. Compared with the measured value of artificial point cloud, the PH, STLA and STSA of maize in 4 growth stages were measured using automatic program. 
The root mean square errors (RMSE) of PH, STLA and STSA for Jingnongke 728 were 0.61 cm, 3.16 mm and 2.53 mm respectively, with mean absolute percentage errors (MAPE) of 0.52%, 7.90% and 9.70%. For Nongda 84, the RMSE were 0.66 cm, 2.63 mm and 2.42 mm, and the MAPE were 0.75%, 7.07% and 9.76%. The results show that the point-cloud-to-image conversion method proposed here for measuring PH, STLA and STSA is suitable for maize at different growth stages and of different varieties. It agrees closely with manual point-cloud measurements and can replace manual measurement, providing breeders with a fast, automatic and accurate program for measuring the PH, STLA and STSA of maize.
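The ellipse-fitting step used above to recover STLA and STSA can be approximated with a few lines of linear algebra. The sketch below is not the authors' routine: it substitutes a simpler PCA-based estimate under the assumption that the cross-section boundary points are sampled roughly uniformly around an ellipse, in which case each semi-axis equals √(2λ) for the corresponding covariance eigenvalue. The function name is hypothetical.

```python
import numpy as np

def stem_axes_from_cross_section(points):
    """Estimate long- and short-axis lengths of a stem cross-section.

    `points` is an (N, 2) array of boundary points of the cross-section.
    PCA gives the principal directions; for points spread uniformly
    around an ellipse, each semi-axis equals sqrt(2 * eigenvalue) of
    the covariance matrix. Returns (long_axis, short_axis).
    """
    pts = np.asarray(points, dtype=float)
    # Covariance of the centered boundary points (np.cov centers internally).
    cov = np.cov(pts, rowvar=False)
    # Eigenvalues sorted descending: largest -> long axis, smallest -> short.
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    long_axis = 2.0 * np.sqrt(2.0 * eigvals[0])
    short_axis = 2.0 * np.sqrt(2.0 * eigvals[1])
    return long_axis, short_axis
```

A full least-squares conic fit, as implied by the abstract, would be preferable when the cross-section is only partially visible; the PCA shortcut assumes a complete boundary.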
Ear density is one of the most important agronomical yield components in wheat. Ear counting is time-consuming and tedious as it is most often conducted manually in field conditions. Moreover, different sampling techniques are often used, resulting in a lack of standard protocol, which may eventually affect inter-comparability of results. Thermal sensors capture crop canopy features with more contrast than RGB sensors for image segmentation and classification tasks. An automatic thermal ear counting system is proposed to count the number of ears using zenithal/nadir thermal images acquired from a moderately high resolution handheld thermal camera. Three experimental sites under different growing conditions in Spain were used on a set of 24 varieties of durum wheat for this study. The automatic pipeline developed uses contrast enhancement and filter techniques to segment image regions detected as ears. The approach is based on the temperature differential between the ears and the rest of the canopy, given that ears usually have higher temperatures due to their lower transpiration rates. Thermal images were acquired, together with RGB images and in situ (i.e., directly in the plot) visual ear counting from the same plot segment for validation purposes. The relationship between the thermal counting values and the in situ visual counting was fairly weak (R² = 0.40), which highlights the difficulties in estimating ear density from a single image perspective. However, the results show that the automatic thermal ear counting system performed quite well in counting the ears that do appear in the thermal images, exhibiting high correlations with the manual image-based counts from both thermal and RGB images in the sub-plot validation ring (R² = 0.75-0.84). Automatic ear counting also exhibited high correlation with the manual counting from thermal images when considering the complete image (R² = 0.80).
The results also show a high correlation between the thermal and the RGB manual counting using the validation ring (R² = 0.83). Methodological requirements and potential limitations of the technique are discussed.
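The core of the thermal approach, thresholding on the temperature differential between ears and canopy and counting the resulting regions, can be sketched in a few lines. This is a toy stand-in, not the published pipeline: the `delta` and `min_pixels` values are illustrative defaults of my own, and the real system adds contrast enhancement and filtering steps.

```python
import numpy as np
from scipy import ndimage

def count_warm_blobs(thermal, delta=1.5, min_pixels=4):
    """Count contiguous warm regions in a thermal image (candidate ears).

    Pixels warmer than the canopy median by `delta` kelvin are segmented,
    and connected components of at least `min_pixels` pixels are counted.
    Thresholds are illustrative, not tuned values from the study.
    """
    # Ears transpire less, so they appear warmer than the canopy background.
    mask = thermal > (np.median(thermal) + delta)
    labels, n = ndimage.label(mask)
    # Component sizes; drop tiny regions that are likely noise.
    sizes = np.asarray(ndimage.sum(mask, labels, range(1, n + 1)))
    return int(np.sum(sizes >= min_pixels))
```

In practice the threshold would be derived per image, since the ear-canopy differential varies with wind, irradiance, and crop water status.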
Wheat ear density estimation is an appealing trait for plant breeders. Current manual counting is tedious and inefficient. In this study, we investigated the potential of convolutional neural networks (CNNs) to provide accurate ear density using nadir high spatial resolution RGB images. Two different approaches were investigated, either using the Faster-RCNN state-of-the-art object detector or the TasselNet local count regression network. Both approaches performed very well (rRMSE ≈ 6%) when applied under the same conditions as those prevailing for the calibration of the models. However, Faster-RCNN was more robust when applied to a dataset acquired at a later stage, with ears and background showing a different aspect because of the higher maturity of the plants. The optimal spatial resolution for Faster-RCNN was around 0.3 mm, allowing RGB images to be acquired from a UAV platform for high-throughput phenotyping of large experiments. Comparison of the estimated ear density with in-situ manual counting shows reasonable agreement considering the relatively small sampling area used for both methods. Faster-RCNN and in-situ counting had high and similar heritability (H² ≈ 85%), demonstrating that ear density derived from high resolution RGB imagery could replace the traditional counting method.
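The rRMSE score used above to compare the two detectors is simply the RMSE normalized by the mean of the reference counts, expressed as a percentage. A minimal helper (the function and argument names are mine):

```python
import math

def rrmse(estimates, references):
    """Relative RMSE (%): RMSE of the estimates divided by the mean
    of the reference values. Used here to score per-plot ear-density
    estimates against manual reference counts."""
    n = len(references)
    mse = sum((e - r) ** 2 for e, r in zip(estimates, references)) / n
    mean_ref = sum(references) / n
    return 100.0 * math.sqrt(mse) / mean_ref
```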
Background The number of ears per unit ground area (ear density) is one of the main agronomic yield components determining grain yield in wheat. A fast evaluation of this attribute may contribute to monitoring the efficiency of crop management practices, to an early prediction of grain yield, or as a phenotyping trait in breeding programs. Currently the number of ears is counted manually, which is time consuming. Moreover, there is no single standardized protocol for counting the ears. An automatic ear-counting algorithm is proposed to estimate ear density under field conditions based on zenithal color digital images taken from above the crop in natural light conditions. Field trials were carried out at two sites in Spain during the 2014/2015 crop season on a set of 24 varieties of durum wheat with two growing conditions per site. The counting algorithm uses three steps: (1) a Laplacian frequency filter chosen to remove low- and high-frequency elements appearing in an image, (2) a median filter to reduce high noise still present around the ears, and (3) segmentation using Find Maxima to segment local peaks and determine the ear count within the image. Results The results demonstrate a high success rate (higher than 90%) between the algorithm counts and the manual (image-based) ear counts, and high precision, with a low standard deviation (around 5%). The relationship between algorithm ear counts and grain yield was also significant and stronger than the correlation with manual (field-based) ear counts. Automatic ear counting performed on data captured around anthesis correlated better with grain yield than counts from images captured at later stages; the poorer performance of ear counting at late grain-filling stages was associated with the loss of contrast between canopy and ears.
Conclusions Developing robust, low-cost and efficient field methods to assess wheat ear density, as a major agronomic component of yield, is highly relevant for phenotyping efforts towards increases in grain yield. Although the phenological stage of measurement is important, the robust image analysis algorithm presented here appears amenable to deployment from aerial or other automated platforms. Electronic supplementary material The online version of this article (10.1186/s13007-018-0289-4) contains supplementary material, which is available to authorized users.
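The three-step pipeline described in the abstract above (Laplacian filter, median filter, local-maxima segmentation) can be sketched with standard `scipy.ndimage` operations. This is a minimal illustration, not the published implementation: the sign convention (negative Laplacian, which responds positively at bright blob centres), window sizes, and the relative threshold are assumptions of mine.

```python
import numpy as np
from scipy import ndimage

def count_ears(image, median_size=3, peak_window=5):
    """Sketch of the three-step counting pipeline:
    (1) Laplacian filter to isolate blob-scale structure,
    (2) median filter to suppress residual noise,
    (3) local-maxima segmentation: one surviving peak per ear.
    Parameter values are illustrative, not the tuned study values."""
    # Step 1: negative Laplacian is positive at bright blob centres.
    response = -ndimage.laplace(image.astype(float))
    # Step 2: median filter knocks down isolated noisy responses.
    smoothed = ndimage.median_filter(response, size=median_size)
    # Step 3: keep pixels equal to their local maximum and well above
    # the background response, then count connected peak regions.
    peaks = smoothed == ndimage.maximum_filter(smoothed, size=peak_window)
    peaks &= smoothed > 0.5 * smoothed.max()
    _, n = ndimage.label(peaks, structure=np.ones((3, 3)))
    return n
```

The real algorithm operates on color images under natural light, so a band-pass frequency design and tuned thresholds matter far more than in this synthetic sketch.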
Chitosan plays an important role in regulating growth and eliciting defense in many plant species. However, the exact metabolic response of plants to chitosan is still not clear. The present study performed an integrative analysis of metabolite profiles in chitosan-treated wheat seedlings and further investigated the response of enzyme activities and transcript expression related to primary carbon (C) and nitrogen (N) metabolism. Metabolite profiling revealed that chitosan induced significant differences in organic acids, sugars and amino acids in leaves of wheat seedlings. A higher accumulation of sucrose was observed after chitosan treatment, accompanied by an increase in sucrose phosphate synthase (SPS) and fructose-1,6-bisphosphatase (FBPase) activities as well as an up-regulation of their relative expression levels. Several metabolites associated with the tricarboxylic acid (TCA) cycle, including oxaloacetate and malate, also increased, along with elevated phosphoenolpyruvate carboxylase (PEPC) and pyruvate dehydrogenase (PDH) activities. On the other hand, chitosan also enhanced N reduction and N assimilation. Glutamate, aspartate and some other amino acids were higher in chitosan-treated plants, accompanied by the activation of key enzymes of N reduction and the glutamine synthetase/glutamate synthase (GS/GOGAT) cycle. Together, these results suggest a pleiotropic modulation of carbon and nitrogen metabolism in wheat seedlings induced by chitosan, provide the first significant insight into the metabolic mechanism of the plant response to chitosan, and offer basic guidance for the future application of chitosan in agriculture.
The capacity of LiDAR and Unmanned Aerial Vehicles (UAVs) to provide plant height estimates as a high-throughput plant phenotyping trait was explored. An experiment was conducted over wheat genotypes grown under well-watered and water-stress modalities. Frequent LiDAR measurements were performed along the growth cycle using a Phénomobile unmanned ground vehicle. A UAV equipped with a high-resolution RGB camera flew over the experiment several times to retrieve the digital surface model using structure-from-motion techniques. Both techniques provide a 3D dense point cloud from which the plant height can be estimated. Plant height was first defined as the z-value below which 99.5% of the points of the dense cloud lie. This provides good consistency with manual measurements of plant height (RMSE = 3.5 cm) while minimizing the variability along each microplot. Results show that LiDAR and structure-from-motion plant height values are always consistent. However, a slight underestimation is observed for structure-from-motion techniques, related to the coarser spatial resolution of UAV imagery and the limited penetration capacity of structure from motion as compared to LiDAR. Very high heritability values (H² > 0.90) were found for both techniques when lodging was not present. The dynamics of plant height shows that it carries pertinent information regarding the period and magnitude of plant stress. Further, the date when the maximum plant height is reached was found to be highly heritable (H² > 0.88) and a good proxy of the flowering stage. Finally, the capacity of plant height as a proxy for total above-ground biomass and yield is discussed.
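The plant-height definition above, the z-value below which 99.5% of dense-cloud points fall, is a one-liner with `numpy.percentile`. In this sketch the ground level is assumed known (e.g., from a bare-soil terrain model); the function name and `ground_z` parameter are mine.

```python
import numpy as np

def plant_height(z_values, ground_z=0.0, quantile=99.5):
    """Plant height as defined in the abstract: the z-value below which
    99.5% of the dense-cloud points lie, relative to the ground level.
    Using a high quantile rather than the maximum makes the estimate
    robust to a few spurious points above the canopy."""
    return float(np.percentile(np.asarray(z_values), quantile) - ground_z)
```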
Correct estimation of above-ground biomass (AGB) is necessary for accurate crop growth monitoring and yield prediction. We estimated AGB based on images obtained with a snapshot hyperspectral sensor (UHD 185 Firefly, Cubert GmbH, Ulm, Baden-Württemberg, Germany) mounted on an unmanned aerial vehicle (UAV). The UHD 185 images were used to calculate the crop height and hyperspectral reflectance of winter wheat canopies from hyperspectral and panchromatic images. We constructed several single-parameter models for AGB estimation based on spectral parameters, such as specific bands, spectral indices (e.g., Ratio Vegetation Index (RVI), NDVI, Greenness Index (GI) and Wide Dynamic Range VI (WDRVI)) and crop height, as well as several models combining spectral parameters with crop height. Comparison with experimental results indicated that incorporating crop height into the models improved the accuracy of AGB estimation (the average AGB is 6.45 t/ha). The estimation accuracy of single-parameter models was low (crop height only: R² = 0.50, RMSE = 1.62 t/ha, MAE = 1.24 t/ha; R670 only: R² = 0.54, RMSE = 1.55 t/ha, MAE = 1.23 t/ha; NDVI only: R² = 0.37, RMSE = 1.81 t/ha, MAE = 1.47 t/ha; partial least squares regression: R² = 0.53, RMSE = 1.69 t/ha, MAE = 1.20 t/ha), but accuracy increased when crop height and spectral parameters were combined (partial least squares regression modeling: R² = 0.78, RMSE = 1.08 t/ha, MAE = 0.83 t/ha; verification: R² = 0.74, RMSE = 1.20 t/ha, MAE = 0.96 t/ha). Our results suggest that crop height determined from the new UAV-based snapshot hyperspectral sensor can improve AGB estimation and is advantageous for mapping applications. This new method can be used to guide agricultural management.
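The idea of combining crop height with a spectral index can be illustrated with a small least-squares fit. Note the assumption: the study itself uses partial least squares over many bands; ordinary least squares on two predictors is a simplified stand-in, and all names here are mine.

```python
import numpy as np

def fit_agb_model(height, ndvi, agb):
    """Fit AGB ~ intercept + w1 * crop_height + w2 * NDVI by ordinary
    least squares. A toy version of the combined (height + spectral)
    models from the abstract; the paper uses PLS over full spectra."""
    X = np.column_stack([np.ones_like(height), height, ndvi])
    coef, *_ = np.linalg.lstsq(X, agb, rcond=None)
    return coef  # [intercept, height weight, NDVI weight]

def predict_agb(coef, height, ndvi):
    """Apply the fitted combined model to new plots."""
    return coef[0] + coef[1] * height + coef[2] * ndvi
```

The gain reported in the abstract comes from height and reflectance carrying complementary information: spectral indices saturate at high biomass, where height keeps discriminating.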
Background Plant density and its non-uniformity drive the competition among plants as well as with weeds. They thus need to be estimated with small uncertainty. An optimal sampling method is proposed to estimate the plant density in wheat crops from plant counting while reaching a given precision. Results Three experiments were conducted in 2014, resulting in 14 plots across varied sowing densities, cultivars and environmental conditions. The coordinates of the plants along the row were measured on high-resolution RGB images taken from ground level. Results show that the spacings between consecutive plants along the row direction are independent and follow a gamma distribution under the varied conditions experienced. A gamma count model was then derived to define the optimal sample size required to estimate plant density at a given precision. Results suggest that measuring the length of segments containing 90 plants will achieve a precision better than 10%, independently of the plant density. This approach appears more efficient than the usual method based on fixed-length segments in which the number of plants is counted: there, the optimal length for a given precision on the density estimate depends on the actual plant density. The gamma count model parameters may also be used to quantify the heterogeneity of plant spacing along the row by exploiting the variability between replicated samples. Results show that to achieve a 10% precision on the estimates of the two parameters of the gamma model, 200 elementary samples corresponding to the spacing between two consecutive plants should be measured. Conclusions This method provides an optimal sampling strategy to estimate the plant density and quantify the heterogeneity of plant spacing along the row.
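The recommended sampling scheme, measure the row length spanned by a fixed number of plants rather than counting plants over a fixed length, reduces to a short calculation. The sketch below treats each plant as owning one inter-plant spacing (a simplification); the function name and units are mine.

```python
import numpy as np

def density_from_segment(spacings, plants_per_segment=90):
    """Estimate linear plant density (plants per metre of row) the way
    the abstract recommends: measure the row length spanned by a fixed
    number of plants. `spacings` are inter-plant distances in metres,
    in row order; each plant is treated as owning one spacing."""
    length = float(np.sum(spacings[:plants_per_segment]))
    return plants_per_segment / length
```

Because the segment always contains the same number of plants, the relative precision of the estimate is the same at low and high densities, which is the advantage over fixed-length segments noted in the abstract.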
We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes. On the test data, we achieved top-1 and top-5 error rates of 37.5% and 17.0%, which is considerably better than the previous state-of-the-art. The neural network, which has 60 million parameters and 650,000 neurons, consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax. To make training faster, we used non-saturating neurons and a very efficient GPU implementation of the convolution operation. To reduce overfitting in the fully-connected layers we employed a recently-developed regularization method called dropout that proved to be very effective. We also entered a variant of this model in the ILSVRC-2012 competition and achieved a winning top-5 test error rate of 15.3%, compared to 26.2% achieved by the second-best entry.
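The top-1 and top-5 error rates quoted above have a simple definition: a sample counts as correct under top-k if its true class is among the k highest-scoring classes. A minimal scorer (names are mine):

```python
import numpy as np

def topk_error(scores, labels, k=5):
    """Top-k error as reported for ImageNet classifiers: the fraction
    of samples whose true class is not among the k highest-scoring
    classes. `scores` is (n_samples, n_classes); `labels` holds the
    true class index per sample."""
    # Indices of the k largest scores per row (order within top-k irrelevant).
    topk = np.argsort(scores, axis=1)[:, -k:]
    hits = np.array([label in row for row, label in zip(topk, labels)])
    return 1.0 - hits.mean()
```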
Plant density is a useful variable that determines the fate of the wheat crop. The most commonly used method for plant density quantification is based on visual counting from ground level. The objective of this study is to develop and evaluate a method for estimating wheat plant density at the emergence stage from high-resolution imagery taken by a UAV at very low altitude, with application to high-throughput phenotyping in field conditions. A 24-Mpixel Sony ILCE α5100L RGB camera equipped with a 60 mm focal length lens flew aboard a hexacopter at 3 to 7 m altitude at about 1 m/s. This yielded ground resolutions between 0.20 mm and 0.45 mm, while providing 59-77% overlap between images. The camera viewed at a 45° zenith angle in a compass direction perpendicular to the row direction to maximize the viewed cross-section of the plants and minimize the effect of the wind created by the rotors. Agisoft PhotoScan software was then used to derive the position of the camera for each image. Images were then projected onto the ground surface to extract subsamples used to estimate the plant density. The extracted images were first classified to separate the green pixels from the background, and the rows were then identified and extracted. Finally, image objects (groups of connected green pixels) were identified on each row and the number of plants they contain was estimated using a Support Vector Machine whose training was optimized using Particle Swarm Optimization.
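The first classification step above, separating green plant pixels from the soil background, is commonly done with the excess-green index on chromatic coordinates. The abstract does not name the exact classifier used for this step, so the index and the threshold below are assumptions, shown only to make the step concrete.

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.05):
    """Green/background segmentation via the excess-green index.

    `rgb` is an (..., 3) array. Each channel is first normalized by the
    pixel's channel sum (chromatic coordinates), then ExG = 2g - r - b
    is thresholded. A common first step for row and plant extraction;
    the threshold value is illustrative.
    """
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1) + 1e-9          # avoid division by zero on black pixels
    r = rgb[..., 0] / s
    g = rgb[..., 1] / s
    b = rgb[..., 2] / s
    return (2.0 * g - r - b) > threshold
```

Connected components of the resulting mask give the "image objects" whose per-object plant counts the study then predicts with its SVM.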