Remote Sensing 2019, 11(7), 751; doi:10.3390/rs11070751
Automatic Wheat Ear Counting Using Thermal Imagery
Jose A. Fernandez-Gallego 1,2,3, Ma. Luisa Buchaillot 1,2, Nieves Aparicio Gutiérrez 4, María Teresa Nieto-Taladriz 5, José Luis Araus 1,2,* and Shawn C. Kefauver 1,2,*
1 Plant Physiology Section, Department of Evolutionary Biology, Ecology and Environmental Sciences, Faculty of Biology, University of Barcelona, Diagonal 643, 08028 Barcelona, Spain; jfernaga46@alumnes.ub.edu (J.A.F.-G.); luisa.buchaillot@gmail.com (M.L.B.)
2 AGROTECNIO (Center for Research in Agrotechnology), Av. Rovira Roure 191, 25198 Lleida, Spain
3 Programa de Ingeniería Electrónica, Facultad de Ingeniería, Universidad de Ibagué, Carrera 22 Calle 67, Ibagué 730001, Colombia
4 Instituto Tecnológico Agrario de Castilla y León (ITACyL), Ctra. Burgos Km. 119, 47071 Valladolid, Spain; apagutni@itacyl.es
5 Instituto Nacional de Investigación y Tecnología Agraria y Alimentaria (INIA), Ctra. de la Coruña Km. 7.5, 28040 Madrid, Spain; mtnieto@inia.es
* Correspondence: jaraus@ub.edu (J.L.A.); sckefauver@ub.edu (S.C.K.); Tel.: +34-934021469 (J.L.A.); +34-934021465 (S.C.K.)
Received: 18 January 2019; Accepted: 26 March 2019; Published: 28 March 2019


Abstract:
Ear density is one of the most important agronomical yield components in wheat. Ear counting is time-consuming and tedious as it is most often conducted manually in field conditions. Moreover, different sampling techniques are often used, resulting in a lack of standard protocol, which may eventually affect inter-comparability of results. Thermal sensors capture crop canopy features with more contrast than RGB sensors for image segmentation and classification tasks. An automatic thermal ear counting system is proposed to count the number of ears using zenithal/nadir thermal images acquired from a moderately high resolution handheld thermal camera. Three experimental sites under different growing conditions in Spain were used on a set of 24 varieties of durum wheat for this study. The automatic pipeline system developed uses contrast enhancement and filter techniques to segment image regions detected as ears. The approach is based on the temperature differential between the ears and the rest of the canopy, given that ears usually have higher temperatures due to their lower transpiration rates. Thermal images were acquired, together with RGB images and in situ (i.e., directly in the plot) visual ear counting from the same plot segment, for validation purposes. The relationship between the thermal counting values and the in situ visual counting was fairly weak (R² = 0.40), which highlights the difficulties in estimating ear density from one single image-perspective. However, the results show that the automatic thermal ear counting system performed quite well in counting the ears that do appear in the thermal images, exhibiting high correlations with the manual image-based counts from both thermal and RGB images in the sub-plot validation ring (R² = 0.75–0.84). Automatic ear counting also exhibited high correlation with the manual counting from thermal images when considering the complete image (R² = 0.80). The results also show a high correlation between the thermal and the RGB manual counting using the validation ring (R² = 0.83). Methodological requirements and potential limitations of the technique are discussed.
Keywords: thermal images; ear counting; digital image processing; wheat
1. Introduction
High throughput plant phenotyping (HTPP) is a quantitative description of the functional and structural properties of the plant [1] for the purpose of crop breeding [2,3]. In the case of cereals, e.g., wheat, besides grain yield, agronomical yield components are also assessed as part of the plant phenotyping pipeline [4]. The accurate quantification of the number of ears per square meter, the number of grains per ear and the thousand kernel weight, as the main yield components in wheat, is therefore essential in breeding programs [5]. Ear counting in particular is time-consuming and tedious, as it is most often conducted manually in field conditions. Moreover, different subsampling techniques and derived protocols for calculation are often used, resulting in a lack of standard protocol. As an alternative, several automatic ear counting techniques have been developed in recent years, mainly using high resolution RGB (Red/Green/Blue) images as input. Different image processing techniques have been used, such as texture and hybrid color space [6,7] and multi-features from color, grayscale and texture data [8]. Decorrelation stretch for color contrast enhancement combined with Support Vector Machine (SVM) classification [9,10] and convolutional neural network recognition [11] have also been used. Other approaches use frequency and spatial filter techniques as well as local peak segmentation [12,13]. Even though the visual spectrum has been widely used for ear counting, there are general limitations to take into account in field conditions, such as solar light conditions (unwanted shadows and bright surfaces), wind conditions (blurred ears), overlapping ears, ear size/shape variation (mostly depending on their more or less horizontal position) and spatial image resolution (camera/canopy distance and sensor size).
Recently, a fusion of multispectral and RGB images has been developed for ear counting estimation [14]. Though not yet applied to wheat ear counting, fruit defect detection using hyperspectral images, through image processing systems, has also been recently developed [15]. Therefore, although visual and multispectral information has been used for ear counting, and hyperspectral for fruit defect detection, there is no information in the literature regarding thermal images for ear counting applications or segmentation on such a fine spatial scale, perhaps due to the comparatively low resolution and high cost of thermal cameras [2].
Thermal imagery is related to the transpirative status of the plant [16], which is separate from the visual characteristics underlying the limitations of RGB imagery. Thermal information has been used mainly for monitoring crop water status [17–22] and irrigation management [23,24]. However, previous studies have shown that, regardless of the water conditions during the growing season, there are often significant constitutive differences between leaf and ear temperature on sunny days [25], with ear temperature being higher than leaf temperature. This suggests that thermal imagery may provide a useful approach for ear counting [26]. Temperature distribution across a particular surface has been studied using image processing techniques; for instance, in segmentation applications using the thermal color map, thresholding and morphology operators in research related to orange, apple and almond tree orchards [27–29]. Other similar segmentation approaches have focused on the assessment of plant and leaf temperature separately in vegetable and soybean crops [30,31].
In this study, we propose an automatic wheat ear counting system using thermal images acquired by holding the camera by hand above the canopy. We include data captured at three different experimental stations located in northern, central and southern Spain with different environmental conditions. An image processing system was developed to segment the wheat ears taking advantage of the thermal color map. A pipeline structure was designed to filter background and unwanted regions in the image using an adaptive contrast technique and morphological operators. Visual counting directly in the field (i.e., in situ) as well as ear counts derived from RGB images of the same plot segments were also included for the purposes of validating the thermal image and algorithm counting measurements.
2. Materials and Methods
2.1. Plant Material and Experimental Setup
Two sets of twenty-four post Green Revolution (i.e., semi-dwarf) durum wheat (Triticum turgidum L. subsp. durum (Desf) Husn.) cultivars (cvs. Amilcar, Arcobaleno, Athoris, Avispa, Burgos, Claudio, Core, Don Norman, Don Ricardo, Dorondon, Euroduro, Gallareta, Iberus, Kiko Nick, Mexa, Olivadur, Paramo, Pedroso, Regallo, Saragolla, Sculptur, Simeto, Solea and Vitron) were grown during two consecutive seasons. In the second year, the Haristide variety was planted instead of Paramo. Field trials were carried out at the experimental stations of Colmenar de la Oreja (40°04′N, 3°31′W), near Aranjuez (Madrid province), and Coria del Rio (37°14′N, 6°03′W), near Sevilla, during the 2016/2017 crop season, and at Zamadueñas (41°42′N, 4°42′W), near Valladolid, during the 2017/2018 crop season. The first two stations belong to the Instituto Nacional de Investigación y Tecnología Agraria y Alimentaria (INIA), while the third belongs to the Instituto de Tecnología Agraria de Castilla y León (ITACyL) of Spain. The average annual precipitation and annual temperature are about 425 mm and 13.7 °C, 502 mm and 18.0 °C, and 269 mm and 13.2 °C for Aranjuez, Sevilla and Valladolid, respectively. The meteorological data were obtained from the meteorological stations nearest to each experimental station using the SIAR (Sistema de Información Agroclimática para el Regadío) information system [32].
Aranjuez trials were fertilized before sowing with 450 kg ha⁻¹ of 8:15:15 (8% N, 15% P2O5, 15% K2O) fertilizer, and in addition 185 kg ha⁻¹ of 46% urea was applied before stem elongation. Sevilla trials were fertilized before sowing with 500 kg ha⁻¹ of 15:15:15 (15% N, 15% P2O5 and 15% K2O) fertilizer, and 100 kg ha⁻¹ of 46% urea was applied before stem elongation. Finally, Valladolid trials were fertilized before sowing with 300 kg ha⁻¹ of 8:15:15 (8% N, 15% P2O5 and 15% K2O) fertilizer; 150 kg ha⁻¹ of calcium ammonium nitrate (27% richness in nitrogen) was applied before tillering; and 150 kg ha⁻¹ of ammonium sulfate nitrate (26% richness in nitrogen) was applied before heading.
Two experimental conditions (rainfed and supplemental irrigation) were assayed at Aranjuez and Valladolid, while, in the case of Sevilla, only rainfed conditions were assayed. The genotypes were evaluated in plots 9 m² in size, with 6 rows, 0.25 m apart, and a planting density of 250 seeds per m². Randomized blocks were used with three replicates and a total of 72 plots per trial (3 replicates × 24 genotypes). Supplemental irrigation and rainfed trials were planted on 22 December 2016 at Aranjuez and, in the case of Valladolid, on 13 November 2017 and 23 November 2017, respectively. The rainfed trial at Sevilla was planted on 15 December 2016. Accumulated rainfall and average temperatures during the crop season for each experimental station were 134 mm and 14.4 °C; 261 mm and 15.7 °C; and 169 mm and 10.2 °C for Aranjuez (2016/2017), Sevilla (2016/2017) and Valladolid (2017/2018), respectively. For field trials under supplemental irrigation, eight irrigations were provided at Aranjuez, with a total of 420 mm of water, and eight irrigations were provided at Valladolid, with a total of 110 mm of water.
2.2. Thermal Images
Thermal images were acquired at Aranjuez, Sevilla and Valladolid using the MIDAS 320L infrared camera (DIAS Infrared GmbH, Germany) with a −20 °C to 120 °C temperature range, an 8–14 µm spectral range in one channel, a 320 × 240 radiometric detector and 16-bit format, using the focal length in manual mode. All files were exported using the default settings of the PYROSOFT Professional software (DIAS Infrared GmbH, Germany) in 8-bit BMP (bitmap file) format, and then the images were converted to 8-bit JPG format.
Thermal images from the complete trials (72 plots each) of Aranjuez (supplemental irrigation condition only) and Sevilla (rainfed), together with the first block (24 plots) from Valladolid (rainfed only), were captured for this study. For each plot, one thermal image was taken by holding the camera by hand above the canopy, near the center of the plot. Images were acquired after midday in a zenithal/nadir plane at between 0.8 and 1 m distance at each particular growth stage (GS), using the Zadoks growth stage scale [33] (Table 1). Spatial resolution was approximately 0.14 cm/pixel. Images were acquired on 5 May 2017 (10:00–11:00 UTC, GS = 61–65, anthesis), 25 April 2017 (10:00–11:00 UTC, GS = 69, grain filling) and 14 June 2018 (14:30–15:00 UTC, GS = 77, late grain filling) for Aranjuez, Sevilla and Valladolid, respectively. Figure 1 shows an example of the thermal images acquired at each experimental station. The actual time of the data acquisition at each location was slightly different to allow for adequate contrast between the leaves and ears in the thermal images. A preliminary selection discarded images with acquisition or temperature problems such as blurred images or overcast conditions.
Figure 1. Images of plots acquired using the MIDAS 320L thermal camera: (a) Aranjuez (anthesis); (b) Sevilla (grain filling); and (c) Valladolid (late grain filling). The last image includes the ring used for validation purposes.
2.3. Automatic Thermal Ear Counting System
This work proposes an automatic image processing system based on the thermal color map using four steps: (1) low temperature detection; (2) contrast limited adaptive histogram equalization (CLAHE); (3) color threshold; and (4) the analyze particles command (Figure 2). The automatic system was developed in the ImageJ open source software [34]. As a first step, low temperature detection uses the CIE L*a*b* color space [35] to avoid the blue color values; the negative b* values were filtered using the color threshold macro [34]. The CLAHE method [36] was used to enhance the local contrast in small regions of the image. As a next step, the color threshold macro was used to select the high temperatures via the Hue/Saturation/Value (HSV) color space [37], represented in colors between red and green, which correspond to hue values from 2 to 120 and are therefore closely related to the presence of ears. Finally, the analyze particles function [34] was used to count and filter the regions detected as ears.
Figure 2. Automatic thermal ear counting system: (1) low temperature detection; (2) contrast limited adaptive histogram equalization (CLAHE); (3) color threshold; and (4) analyze particles command. The boundaries of regions detected as ears are outlined in white.
Color thermal maps were used for the ear detection system, and the CIE L*a*b* color space was selected with the aim of detecting the lower temperatures in the image. This color space uses a Cartesian system of coordinates, where the positive b* axis represents the amount of yellow and the negative b* axis represents the amount of blue [35]; in that way, we filtered the negative b* axis to avoid leaves, which are associated with lower temperatures. The a* axis was not filtered in this step. The CLAHE algorithm was used to enhance the local contrast of edges and regions in the image and contribute to isolating overlapping or neighboring ears. The HSV color space uses hue values from 0° to 360° to represent colors from red to magenta, while saturation and value (or brightness) range from 0 to 100 [37]. This color space was used to segment the high temperatures represented in colors between green and red. Finally, the analyze particles command was used to count and filter the regions detected as ears by the automatic algorithm.
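As an illustration of this pipeline, the following is a minimal sketch re-expressed in Python with OpenCV and scikit-image rather than the original ImageJ macros. The hue window (2–120 out of 360) follows the description above, while the CLAHE settings and the particle-area limits are illustrative assumptions; the exact ImageJ parameter values are not reproduced here.

```python
import cv2
import numpy as np
from skimage import measure

def count_ears_thermal(path, min_area=30, max_area=2000):
    """Count ear-like warm regions in an 8-bit color-mapped thermal image."""
    bgr = cv2.imread(path)

    # Step 1: low temperature detection -- discard bluish (cold) pixels, i.e.
    # pixels with negative b* in CIE L*a*b* (OpenCV stores b* offset by +128).
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    warm = ((lab[:, :, 2].astype(np.int16) - 128) >= 0).astype(np.uint8) * 255
    bgr = cv2.bitwise_and(bgr, bgr, mask=warm)

    # Step 2: CLAHE on the lightness channel to enhance local contrast
    # (clip limit and tile size are assumptions, not the published settings).
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    bgr = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

    # Step 3: color threshold in HSV, keeping hues from red to green. The
    # paper's 2-120 degree window maps to 1-60 on OpenCV's 0-179 hue scale;
    # a minimum V of 1 drops the pixels blanked out in step 1.
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([1, 0, 1]), np.array([60, 255, 255]))

    # Step 4: "analyze particles" -- label connected regions, filter by area.
    labels = measure.label(mask > 0)
    ears = [r for r in measure.regionprops(labels)
            if min_area <= r.area <= max_area]
    return len(ears)

print(count_ears_thermal("plot_thermal.jpg"))  # hypothetical file name
```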
2.4. Algorithm Validation
Manual In Situ Counting and RGB Images
For validation purposes, a physical ring was placed on top of the canopy for counting the number of ears within the exact ring area by visual inspection in the field. The ring has a radius of 0.1225 m and was attached by an extension arm to the monopod used to acquire the RGB images. Thermal and RGB images were acquired at the same time as the visual (in situ) ear counting (inside the ring) was assessed in the first block (i.e., 24 plots) of the rainfed trial at Valladolid. Visual counting was always performed by the same person at the same position where the images were acquired. Approximately 15 s were spent on each count, using a clicker to keep track of the exact number and making sure to inspect the area inside the ring to accurately include all ears present by moving plants and changing perspective angles at each location. Additionally, RGB images of the same plot segments were acquired (at the same time as the thermal images) in a zenithal/nadir plane with a Sony QX1-ILCE camera (Sony Corporation, Japan) with 20.1-megapixel resolution, a 23.2 × 15.4 mm sensor, a 16 mm focal length lens and a resolution of 5456 × 3632 pixels. The images were taken using a monopod at 1 m above the canopy. The resulting RGB image spatial resolution was approximately 0.03 cm/pixel.
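For reference, the ring area implied by the stated radius, and the RGB ground sample distance implied by the stated sensor geometry, are consistent with the figures quoted above; both are standard derived calculations, not values given in the paper:

$$A_{\mathrm{ring}} = \pi r^2 = \pi \times (0.1225\ \mathrm{m})^2 \approx 0.047\ \mathrm{m}^2, \qquad \text{so ears m}^{-2} \approx 21.2 \times \text{ears per ring}$$

$$\mathrm{GSD}_{\mathrm{RGB}} = \frac{\text{pixel pitch} \times \text{distance}}{\text{focal length}} = \frac{(23.2\ \mathrm{mm}/5456) \times 1000\ \mathrm{mm}}{16\ \mathrm{mm}} \approx 0.27\ \mathrm{mm/pixel} \approx 0.03\ \mathrm{cm/pixel}$$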
The presence of an ear inside the ring area assigned through the thermal images was checked against the RGB image (Figure 3) together with the in situ visual counting. In that way, it was ensured that the temperature changes were due to the presence of an ear rather than soil, leaves or unwanted objects.
Figure 3. Thermal and RGB images acquired at Valladolid at late grain filling. A ring was used as a reference area for validation purposes. The number of ears inside the ring area was counted using the thermal and the RGB images and, additionally, the number of ears was counted by visual inspection in the field. The black extension arm that supported the ring showed a higher temperature than the ears and canopy (in red), enabling it to be automatically extracted by morphology operators in the image processing system.
Two validation steps were developed using manual image-based counting. On the one hand, the ears inside the ring area (including the ring edge) in the thermal and RGB images were manually marked, and the visual ear counting data from the field and the algorithm results were also included. The results are referred to as Ring-Manual-In-situ-Counting (Ring-MIC), Ring-Manual-Thermal-Counting (Ring-MTC), Ring-Manual-RGB-Counting (Ring-MRC) and Ring-Algorithm-Thermal-Counting (Ring-ATC). A set of 24 images and full counting datasets were used for each variable of the ring-related measurements, thermal and RGB images. On the other hand, selected additional complete (full-sized) thermal images (without cropping to the size of the reference ring) were also manually marked (Figure 1). The result is referred to as the Complete-Manual-Thermal-Counting (Complete-MTC). Finally, the number of ears automatically detected by the algorithm is referred to as the Complete-Algorithm-Thermal-Counting (Complete-ATC). The ears manually marked in the images were counted using a simple algorithm developed for counting the number of colored marks present in the image. The markers were placed using the Pencil tool [34] with the same color value and circular shape and size. In this way, the simple algorithm for marker counting could be limited to searching for precisely the same color and shape of mark, to then segment and count them.
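A sketch of that marker-counting step follows, assuming (as the text implies) markers of a single fixed color, so that counting reduces to an exact color match plus connected-component labelling; the pure-red marker color used here is a hypothetical value, not one stated in the paper:

```python
import cv2
import numpy as np
from skimage import measure

MARK_BGR = np.array([0, 0, 255], dtype=np.uint8)  # assumed marker color (pure red)

def count_marks(path):
    """Count the uniform circular marks placed on a manually annotated image."""
    img = cv2.imread(path)
    exact = np.all(img == MARK_BGR, axis=-1)  # pixels matching the mark color exactly
    labels = measure.label(exact)             # one connected component per mark
    return int(labels.max())                  # background carries label 0

print(count_marks("ring_thermal_marked.png"))  # hypothetical file name
```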
2.5. Statistical Analysis
Data analysis was performed using the open source software RStudio 1.1.423 (R Foundation for Statistical Computing, Vienna, Austria). Linear regressions were used to analyze the relationship between manual image-based counting and automatic thermal ear counting. The data were plotted using SigmaPlot version 12 (Systat Software, Inc., San Jose, CA, USA).
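A minimal sketch of these validation regressions (performed in R in the study; SciPy is used here for illustration, and the counts below are placeholder values, not data from the paper):

```python
from scipy.stats import linregress

ring_mtc = [18, 25, 21, 30, 16, 27]  # hypothetical manual thermal counts
ring_atc = [17, 27, 20, 28, 15, 29]  # hypothetical algorithm counts

fit = linregress(ring_mtc, ring_atc)
print(f"R2 = {fit.rvalue ** 2:.2f}, slope = {fit.slope:.2f}, p = {fit.pvalue:.3g}")
```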
3. Results
3.1. Linear Regression between Thermal, RGB, In Situ and Algorithm Counting
Linear regressions of Ring-MIC, Ring-MRC and Ring-MTC against Ring-ATC were calculated for the 24 rainfed plots from Valladolid at the late grain filling growth stage (Figure 4). The relationships of Ring-ATC against Ring-MIC (R² = 0.40), Ring-MRC (R² = 0.84) and Ring-MTC (R² = 0.75) were positive and statistically significant (p-value < 0.001). Therefore, the weakest correlation was recorded against the visual counting in the field, which a priori represents the actual number of ears present. We also calculated the relationship between the thermal and the RGB manual counting using the ring, where a positive correlation with statistical significance was obtained (R² = 0.83, p-value < 0.001). In addition, the relationship of Ring-MRC against Ring-MIC was positive and statistically significant (R² = 0.37, p-value < 0.001), and similar in strength to the correlation between Ring-ATC and Ring-MIC.
Figure 4. Linear regressions of the relationships using the ring area for: (a) Ring-MIC (R² = 0.40); (b) Ring-MRC (R² = 0.84); and (c) Ring-MTC (R² = 0.75) vs. Ring-ATC, using images from the Valladolid rainfed trial at late grain filling. The dotted lines indicate the 1:1 slope.
On the other hand, the relationship between the preselected complete (full-sized) thermal images from Aranjuez, Sevilla and Valladolid and the algorithm counting (Complete-ATC vs. Complete-MTC) was also positive, statistically significant (R² = 0.80, p-value < 0.001) and close to a 1:1 slope relationship (Figure 5).
Figure 5. Linear regression for Complete-MTC vs. Complete-ATC (R² = 0.80) using the full-sized thermal images from Aranjuez, Sevilla and Valladolid at anthesis, grain filling and late grain filling, respectively. The dotted line indicates the 1:1 slope.
3.2. Understanding Acquisition and Algorithm Errors
Figure 6 shows three temperature image scenarios related to the acquisition protocol, wheat crop temperature and optimal algorithm considerations. The image in Figure 6a was acquired at around 10:30 UTC in Aranjuez (supplemental irrigation); at this time, the ears exhibited a higher temperature than the canopy leaves due to direct sunlight conditions over several hours. In the case of low thermal image contrast (Figure 6b), no temperature differences were observed between the ears and the rest of the canopy, due to image acquisition at 10:30 UTC in Sevilla, when overcast conditions inhibited any direct sunlight that would increase ear temperatures. Thus, the ears could not be detected separately by the temperature sensor, resulting in some leaves being detected as ears by the algorithm. On the other hand, Figure 6c shows an image acquired in Aranjuez at the same optimal daytime as Figure 6a, although the acquisition distance used was less than 0.8 m by mistake, so that the ears appear visually blurred in the image and the algorithm could not properly isolate the ear regions. The images in Figure 6b,c may be considered acquisition errors due to improper sky conditions and camera user error, respectively. Given adequate sky conditions and correct camera settings, the algorithm errors related to ear identification are relatively minor (Figure 6a) and basically due to the inability of the algorithm to detect or separate very close or overlapping ears in these circumstances, owing to the lack of contrast (see the red semi-circle in Figure 6a; similarly, two ears were not identified, as shown by the dotted yellow circles in Figure 6a).
Figure 6. Thermal images: (a) optimal temperature: higher ear temperature than canopy temperature; (b) low thermal image contrast: no temperature differences between canopy and ears; and (c) out-of-focus image: ears and canopy at high temperature, with the image acquired at less than 0.8 m distance between the camera and the canopy. The boundary regions outlined in white represent the ears automatically detected by the algorithm.
4. Discussion
Ear density can be used as a target breeding trait in cereal phenotyping. To date, the few studies dealing with automatic ear counting in the field have mostly been performed using RGB images [6,8–13]. Besides the intrinsic low cost of this approach, due to the easy operation and affordability of digital cameras, the high resolution of the natural color digital images is a major factor to consider as both a cost and a benefit. The use of RGB images may have limitations under certain field conditions, including the quality of the sky and light conditions, which can be overcome with sufficiently high spatial resolution, but which requires powerful computing capacities and makes implementation more complex or less high throughput than expected. Other remote sensing approaches include the use of multispectral images [14], but their segmentation accuracy decreases as the canopy area observed within a single image increases, potentially due to the lower spatial resolution of these images and the reflectance angle dependence of multispectral data. Even LIDAR may be used [38], but its price and processing requirements may still be considered prohibitive, and its size and weight make it too cumbersome to be handheld or pole mounted for quick ground evaluation in field conditions. As an alternative, thermal images may be used. While thermal imagery may provide slightly lower spatial resolution compared to multispectral images, the possibility of obtaining images with a much greater contrast between ears and leaves is much higher with thermal imaging. The increase in contrast provided by thermal imaging stems from large differences in ear and leaf transpiration rates, which directly affect cooling capacity and temperature. To ensure differences in temperature between the ears and the rest of the canopy, it is still recommended that the images be acquired within a few hours of solar noon to reduce shadowing and sun angle effects. In fact, this recommendation may be extended to any passive remote sensing imaging technique.
Moreover, thermal cameras use radiation far from the visible and near infrared spectral regions, and thus the factors that contribute to some of the limitations of RGB digital images, such as brightness or other factors affecting light conditions [2,16,26], are removed. For that reason, thermal images have proven to be easier to process than RGB images, in part due to their lower resolution, without the lack of contrast and technical limitations of multisensor-array multispectral imagers. In fact, the ears, regardless of the water conditions, are usually several degrees warmer than the leaves [25], due to their constitutively lower stomatal conductance and thus transpiration rates compared with the leaves [39]. For this study, we measured the leaf and the ear temperature, with differences within images ranging 1.9–5.0 °C, 2.1–3.4 °C and 2.0–5.0 °C for Aranjuez, Sevilla and Valladolid, respectively. The mean temperature differential between ears and leaves across all treatments needed for the algorithm to segment the ears properly was around 2.0 °C. Although, for this application, we used the thermal color map to focus on the contrast present due to relative temperature differences, it is also possible to work on the full radiometric kelvin information to obtain, for instance, the mean, range or specific ear temperatures from the thermal images using the same segmentation algorithm. This could represent additional useful information for phenotyping tasks, as ear temperatures have been reported in some cases to be better correlated with grain yield than spectral vegetation indices, and to provide correlations comparable to gene expression performance in predicting grain yield [25].
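As a sketch of that idea: given a full radiometric array (per-pixel kelvin, e.g. from a 16-bit export) and the ear label image produced by the segmentation step, per-ear temperature statistics follow directly. Here `kelvin` and `ear_labels` are assumed inputs, not artifacts produced by the paper's pipeline:

```python
import numpy as np
from skimage import measure

def ear_temperatures(kelvin, ear_labels):
    """Per-ear temperature statistics from a radiometric image and an ear label map."""
    stats = []
    for region in measure.regionprops(ear_labels):
        rows, cols = region.coords[:, 0], region.coords[:, 1]
        temps_c = kelvin[rows, cols] - 273.15  # this ear's pixels, in degrees Celsius
        stats.append((region.label, temps_c.mean(), temps_c.min(), temps_c.max()))
    return stats  # (ear id, mean, min, max)
```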
For additional thermal image algorithm validation purposes, visual in situ counting was developed using a ring to delimit a specific area over the crop while in the field and thus facilitate the manual counting. Although the ring has a small area compared with the complete plot size, we obtained a relatively low R² relationship against the thermal image-based counting (R² = 0.40, Figure 4). This is most likely associated with the limited single image-perspective of the one zenithal/nadir thermal image or RGB image captured in the field. Some portion of the error could additionally be associated with human visual inspection errors in the field and potentially the subjectivity of the observer, which are often assumed to be major sources of error in manual ear counting in actual breeding programs; however, for this study, the researchers attempted to minimize the human error associated with the Ring-MIC ear counts to provide quality validation data. In the manual in situ counting in the field, it was necessary both to view the canopy from different angles and to physically move plants to acquire accurate field validation data, representing a major difference between the in situ counting and the single image-perspective remote sensing approach of the automatic thermal image ear counting technique presented here. In previous studies on ear recognition, no information regarding the correlation between in situ visual ear counting and automatic ear counting was provided [6–14], but it is nonetheless an important point to consider, as the entire image acquisition and processing pipeline represents a sum of errors. Of course, the approach for visual counting assayed here was in fact much faster than traditional ear counting procedures, which imply, for example, counting the total number of ears along a one-meter linear row length; that approach is quite tedious (and of course takes much longer than the 15 s per plot in our study). Nonetheless, we obtained good results using thermal imagery for ear counting, with positive and strong relationships between the automatic thermal ear counting system and the manual image-based ear counting (R² = 0.84 for Ring-MRC, Figure 4; R² = 0.75 for Ring-MTC, Figure 4; and R² = 0.80 for Complete-MTC, Figure 5). Furthermore, in all comparisons, the slope of the correlation was quite close to a 1:1 ratio, indicating very little bias toward over- or under-counting within the range of ear density in this study. Thus, the additional validation results support the capacity of the automatic thermal image counting algorithm to count the ears that are present in the image with high precision and low bias. However, other potential sources of error in the thermal image counting pipeline should be considered in more detail. Although we also detected limitations specific to the use of thermal imagery for ear counting, such as the observed crop temperature issues (Figure 6), there are also errors related to the general use of remote sensing imaging for ear counting, potentially applicable to any other single image or “snapshot” approach regardless of the range of non-penetrating electromagnetic radiation employed. This is the case, for instance, for overlapping and hidden ears, and it might explain the rather low correlation between the Ring-MRC (single human eye perspective) and the Ring-MIC (R² = 0.37). The use of additional oblique/off-nadir thermal imaging may provide improved canopy penetration, as suggested for instance by 3D surface models that show performance improvements when off-nadir images are incorporated [40,41], but it may also come with other complications (consistency in the oblique off-nadir angle, determination of the optimal angle, and more complex 3D processing algorithms) or as yet unknown errors.
Table 1. Comparative thermal and RGB data information in field conditions for ear counting applications.

Temperature of the ears. Thermal: several degrees warmer than the leaves [25]. RGB: irrelevant.
Growth stage. Thermal: from heading to near crop maturity [42]. RGB: from heading to near crop maturity [12].
Hour of the day and sky conditions. Thermal: clear sky conditions; after midday until 18:00, depending on plant water stress conditions. RGB: depends on the hour of the day (8:00 to 18:30 [10], 8:00 to 17:00 [14], 9:00 to 16:00 [9], 12:00 to 16:00 [12]) and sky conditions (preferably diffuse light) [12].
Position of the camera. Thermal: zenithal/nadir. RGB: zenithal/nadir [6,10–13]; 45° above the horizontal [14].
Distance of the camera from the crop. Thermal: 0.8–1 m. RGB: 0.85 m [6], 2.5 m [10], 2.9 m [11], 3.5 m [14], 0.8–1 m [12].
Spatial resolution of ground-acquired images. Thermal: approximately 0.14 cm/pixel, depending on camera and distance from the crop. RGB: ranging 0.01–0.25 cm/pixel [6,10–13], depending on camera and distance from the crop.
Possible algorithm errors. Thermal: the algorithm presents errors when the air temperature is too low or too high, the sky is too cloudy, or the conditions are very windy, which may prevent differences between the canopy and ear temperatures from appearing; the camera could be out-of-focus, potentially due to a very short acquisition distance between the camera and the canopy; in sparse canopies, soil temperature may affect the background; a dry or senescent leaf canopy may affect the background. RGB: false positives, where pixels labeled as ears correspond to leaves, result in irregularities in the ear counting; false negatives result in ears that are not detected by the algorithm because the contrast between ear and soil is not great enough and the segmentation algorithm discarded that region; the algorithm may label as an ear an area where the pixels are soil or noise resulting from background brightness caused by a foreign object [12].
Thermal and RGB image data in field conditions are discussed throughout this work, and each of the technology, acquisition and image processing steps shows some limitations (summarized in Table 1). RGB sensors provide high frequency information (very high spatial resolution) that contributes to improvements in perceiving the existence of an ear separate from leaves, soil and other unwanted objects; even so, similar texture characteristics can be found in the awns or in parts of the leaves, precisely because of the high RGB resolution [12], and these similarities between awns and ears actually increase the challenges for automatic RGB ear counting systems. On the other hand, thermal images intrinsically filter out high frequency details, due to the different technology used to detect much longer wavelength radiation emissions and their low-resolution characteristics [16]; this helps in implementing an automatic ear counting system using thermal images. However, we could detect some similar RGB and thermal image errors, such as overlapping and non-identified ears; yet the two modalities may still provide some complementary benefits together, such as more flexible image acquisition conditions or improved image feature extraction and opportunities for validation. Therefore, thermal and RGB fusion may in combination provide the best features of each technology, in a way that could be acquired by new mobile phones that incorporate thermal sensors [2]. Even more advanced systems that include hyperspectral cameras [15] may also be considered in the future for ear counting purposes.
On the other hand, researcher visual interpretation of the RGB images was crucial in correctly locating the presence of ears in the thermal images during the development of the thermal image ear counting algorithm. Thereby, the acquisition of thermal and RGB images at the same time may contribute further to the understanding and interpretation of the information in the thermal images, contributing to the development of a more robust algorithm for ear counting with thermal image color maps. In fact, the ears are usually several degrees warmer than the leaves only for some parts of the day under the right conditions; thus, for ear counting purposes, it is necessary to select the optimal time of day for acquiring thermal images. The thermal images, when taken at the right moment, can provide, from an image processing system perspective, clearer information on the different components of the canopy; however, in some cases, high temperature information could be associated with soil or unwanted objects, which RGB images can help to avoid. Therefore, for future work, thermal and RGB fusion could be the next step for ear counting applications.
5. Conclusions
In our study, a thermal camera was used to develop an image processing system for automatic ear counting in field conditions. In favor of the thermal counting approach, ear density values estimated through thermal imaging can be processed much more rapidly, as the size of the images is much smaller compared to the high resolution RGB images used in previous studies, while the increase in contrast allows for equally accurate assessments when the thermal images are captured under specific conditions. There should be a difference of at least 2 °C between the ear and leaf temperatures for this thermal ear counting algorithm to work. Although the correlation with manual in situ ear counts (Ring-MIC) was not very high, the algorithm did demonstrate high correlations with the various manual image-based ear counts (Ring-MRC, Ring-MTC, Complete-MTC). In future applications, thermal imagery may be acquired from multiple perspectives (including off-nadir and oblique), or even as thermal video data, for improved ear detection in comparison with in situ counts. Moreover, further studies could use the same thermal image segmentation algorithm developed here for ear detection (Figure 2) to extract the temperature of ears and leaves separately for other phenotyping applications related to plant water stress effects or grain yield prediction. Thermal and RGB fusion, along with 3D imaging, could be the next steps for cereal ear counting in field conditions, to take maximal advantage of the strengths of each imaging technology.
Author Contributions:
Conceptualization, J.A.F.-G., M.L.B., J.L.A. and S.C.K.; methodology, J.A.F.-G., M.L.B.,
N.A.G., M.T.N.-T., J.L.A. and S.C.K.; software, J.A.F.-G.; validation, J.A.F.-G. and M.L.B.; formal analysis, J.A.F.-G.,
M.L.B., J.L.A. and S.C.K.; investigation, J.A.F.-G., M.L.B., N.A.G., M.T.N.-T., J.L.A. and S.C.K.; resources, N.A.G.,
M.T.N.-T., J.L.A. and S.C.K.; data curation, N.A.G. and M.T.N.-T.; writing—original draft preparation, J.A.F.-G.,
M.L.B., J.L.A. and S.C.K.; writing—review and editing, J.A.F.-G., J.L.A. and S.C.K.; visualization, J.A.F.-G. and M.L.B.; supervision, J.L.A. and S.C.K.; project administration, J.L.A.; funding acquisition, J.L.A.
Funding:
This work was supported by MINECO, Spain (project number AGL2016-76527-R) as the primary
funding support for the research project; the project “Formación de Talento Humano de Alto Nivel” (project
number BPIN 2013000100103) approved by the “Fondo de Ciencia, Tecnología e Innovación”, from the “Sistema
General de Regalías”; and “Gobernación del Tolima—Universidad del Tolima, Colombia” as the sole funding
source of the first author J.A.F.-G. Contribution of J.L.A. was supported in part by ICREA Academia, Generalitat
de Catalunya, Spain.
Acknowledgments:
The authors of this research would like to thank the field management staff at the
experimental stations of Colmenar de Oreja (Aranjuez) and Coria del Rio (Sevilla) of the Instituto Nacional
de Investigación y Tecnología Agraria y Alimentaria (INIA) and Zamadueñas (Valladolid) of the Instituto de
Tecnología Agraria de Castilla y León (ITACyL) for their continued support of our research.
Conflicts of Interest: The authors declare that they have no competing interests.
References
1.
Walter, A.; Liebisch, F.; Hund, A. Plant phenotyping: From bean weighing to image analysis. Plant Methods
2015,11, 14. [CrossRef] [PubMed]
2.
Araus, J.L.; Kefauver, S.C. Breeding to adapt agriculture to climate change: Affordable phenotyping solutions.
Curr. Opin. Plant Biol. 2018,45 Pt B, 237–247. [CrossRef]
3.
Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci.
2014,19, 52–61. [CrossRef] [PubMed]
4.
Pask, A.; Pietragalla, J.; Mullan, D.; Reynolds, M. (Eds.) Physiological Breeding II: A Field Guide to Wheat
Phenotyping; CIMMYT: Mexico D.F., Mexico, 2012; ISBN 9789706481825.
5.
Slafer, G.A.; Calderini, D.F.; Miralles, D.J. Yield Components and Compensation in Wheat: Opportunities
for Further Increasing Yield Potencial. In Increasing Yield Potential in Wheat: Breaking the Barriers; CIMMYT
International Symposium: Mexico D.F., Mexico, 1996; pp. 101–133.
Remote Sens. 2019,11, 751 12 of 13
6. Cointault, F.; Guerin, D.; Guillemin, J.; Chopinet, B. In-field Triticum aestivum ear counting using colour-texture image analysis. N. Z. J. Crop Hortic. Sci. 2008, 36, 117–130. [CrossRef]
7. Liu, T.; Sun, C.; Wang, L.; Zhong, X.; Zhu, X.; Guo, W. In-field wheatear counting based on image processing technology. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2014, 45, 282–290. [CrossRef]
8. Zhou, C.; Liang, D.; Yang, X.; Yang, H.; Yue, J.; Yang, G. Wheat Ears Counting in Field Conditions Based on Multi-Feature Optimization and TWSVM. Front. Plant Sci. 2018, 9, 1024. [CrossRef]
9. Zhu, Y.; Cao, Z.; Lu, H.; Li, Y.; Xiao, Y. In-field automatic observation of wheat heading stage using computer vision. Biosyst. Eng. 2016, 143, 28–41. [CrossRef]
10. Sadeghi-Tehran, P.; Sabermanesh, K.; Virlet, N.; Hawkesford, M.J. Automated Method to Determine Two Critical Growth Stages of Wheat: Heading and Flowering. Front. Plant Sci. 2017, 8, 252. [CrossRef]
11. Madec, S.; Jin, X.; Lu, H.; De Solan, B.; Liu, S.; Duyme, F.; Heritier, E.; Baret, F. Ear density estimation from high resolution RGB imagery using deep learning technique. Agric. For. Meteorol. 2019, 264, 225–234. [CrossRef]
12. Fernandez-Gallego, J.A.; Kefauver, S.C.; Aparicio Gutiérrez, N.; Nieto-Taladriz, M.T.; Araus, J.L. Wheat ear counting in-field conditions: High throughput and low-cost approach using RGB images. Plant Methods 2018, 14, 22. [CrossRef] [PubMed]
13. Fernandez-Gallego, J.A.; Kefauver, S.C.; Gutiérrez, N.A.; Nieto-Taladriz, M.T.; Araus, J.L. Automatic wheat ear counting in-field conditions: Simulation and implication of lower resolution images. In Proceedings of the Remote Sensing for Agriculture, Ecosystems, and Hydrology XX, Berlin, Germany, 10–13 September 2018; p. 23. [CrossRef]
14. Zhou, C.; Liang, D.; Yang, X.; Xu, B.; Yang, G. Recognition of wheat spike from field based phenotype platform using multi-sensor fusion and improved maximum entropy segmentation algorithms. Remote Sens. 2018, 10, 246. [CrossRef]
15. Jafri, M.Z.M.; Tan, S.C. Feature selection from hyperspectral imaging for guava fruit defects detection. In Proceedings of the SPIE Digital Optical Technologies, Munich, Germany, 25–29 June 2017.
16. Bhakta, I.; Phadikar, S.; Majumder, K. Importance of Thermal Features in the Evaluation of Bacterial Blight in Rice Plant. In Annual Convention of the Computer Society of India; Springer: Singapore, 2018; pp. 300–313. ISBN 9781489911827.
17. Bellvert, J.; Zarco-Tejada, P.J.; Girona, J.; Fereres, E. Mapping crop water stress index in a “Pinot-noir” vineyard: Comparing ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle. Precis. Agric. 2014, 15, 361–376. [CrossRef]
18. Mangus, D.L.; Sharda, A.; Zhang, N. Development and evaluation of thermal infrared imaging system for high spatial and temporal resolution crop water stress monitoring of corn within a greenhouse. Comput. Electron. Agric. 2016, 121, 149–159. [CrossRef]
19. Buitrago, M.F.; Groen, T.A.; Hecker, C.A.; Skidmore, A.K. Changes in thermal infrared spectra of plants caused by temperature and water stress. ISPRS J. Photogramm. Remote Sens. 2016, 111, 22–31. [CrossRef]
20. Cohen, Y.; Alchanatis, V.; Sela, E.; Saranga, Y.; Cohen, S.; Meron, M.; Bosak, A.; Tsipris, J.; Ostrovsky, V.; Orolov, V.; et al. Crop water status estimation using thermography: Multi-year model development using ground-based thermal images. Precis. Agric. 2015, 16, 311–329. [CrossRef]
21. Grant, O.M.; Ochagavía, H.; Baluja, J.; Diago, M.P.; Tardáguila, J. Thermal imaging to detect spatial and temporal variation in the water status of grapevine (Vitis vinifera L.). J. Hortic. Sci. Biotechnol. 2016, 91, 44–55. [CrossRef]
22. Moran, M.S.; Clarke, T.R.; Inoue, Y.; Vidal, A. Estimating crop water deficit using the relation between surface-air temperature and spectral vegetation index. Remote Sens. Environ. 1994, 49, 246–263. [CrossRef]
23. Gonzalez-Dugo, V.; Zarco-Tejada, P.; Nicolás, E.; Nortes, P.A.; Alarcón, J.J.; Intrigliolo, D.S.; Fereres, E. Using high resolution UAV thermal imagery to assess the variability in the water status of five fruit tree species within a commercial orchard. Precis. Agric. 2013, 14, 660–678. [CrossRef]
24. Wang, Y.; Zhang, Y.; Zhang, R.; Li, J.; Zhang, M.; Zhou, S.; Wang, Z. Reduced irrigation increases the water use efficiency and productivity of winter wheat-summer maize rotation on the North China Plain. Sci. Total Environ. 2018, 618, 112–120. [CrossRef]
25. Vicente, R.; Vergara-Díaz, O.; Medina, S.; Chairi, F.; Kefauver, S.C.; Bort, J.; Serret, M.D.; Aparicio, N.; Araus, J.L. Durum wheat ears perform better than the flag leaves under water stress: Gene expression and physiological evidence. Environ. Exp. Bot. 2018, 153, 271–285. [CrossRef]
26. Araus, J.L.; Kefauver, S.C.; Zaman-Allah, M.; Olsen, M.S.; Cairns, J.E. Translating High-Throughput Phenotyping into Genetic Gain. Trends Plant Sci. 2018, 23, 451–466. [CrossRef] [PubMed]
27. Chandel, A.K.; Khot, L.R.; Osroosh, Y.; Peters, T.R. Thermal-RGB imager derived in-field apple surface temperature estimates for sunburn management. Agric. For. Meteorol. 2018, 253–254, 132–140. [CrossRef]
28. Gan, H.; Lee, W.S.; Alchanatis, V.; Ehsani, R.; Schueller, J.K. Immature green citrus fruit detection using color and thermal images. Comput. Electron. Agric. 2018, 152, 117–125. [CrossRef]
29. Camino, C.; Zarco-Tejada, P.J.; Gonzalez-Dugo, V. Effects of heterogeneity within tree crowns on airborne-quantified SIF and the CWSI as indicators of water stress in the context of precision agriculture. Remote Sens. 2018, 10, 4. [CrossRef]
30. Jiang, Y.; Shuang, L.; Li, C.; Paterson, A.H.; Robertson, J. Deep learning for thermal image segmentation to measure canopy temperature of Brassica oleracea in the field. In Proceedings of the 2018 ASABE Annual International Meeting, Detroit, MI, USA, 29 July–1 August 2018; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; Volume 39, pp. 300–313.
31. Page, G.F.M.; Liénard, J.F.; Pruett, M.J.; Moffett, K.B. Spatiotemporal dynamics of leaf transpiration quantified with time-series thermal imaging. Agric. For. Meteorol. 2018, 256–257, 304–314. [CrossRef]
32. Agro-Climatic Information System for Irrigation (Sistema de Información Agroclimática para el Regadío, SIAR). Available online: http://eportal.mapama.gob.es/websiar/Inicio.aspx (accessed on 20 February 2019).
33. Zadoks, J.; Chang, T.; Konzak, C. A decimal code for the growth stages of cereals. Weed Res. 1974, 14, 415–421. [CrossRef]
34. Schneider, C.A.; Rasband, W.S.; Eliceiri, K.W. NIH Image to ImageJ: 25 years of image analysis. Nat. Methods 2012, 9, 671–675. [CrossRef] [PubMed]
35. Malacara, D. Uniform Color Systems. In Color Vision and Colorimetry: Theory and Applications, 2nd ed.; SPIE: Bellingham, WA, USA, 2011; pp. 103–129.
36. Zuiderveld, K. Contrast Limited Adaptive Histogram Equalization. In Graphics Gems; Elsevier: Amsterdam, The Netherlands, 1994; pp. 474–485. ISBN 0-12-336155-9.
37. Su, C.H.; Chiu, H.S.; Hsieh, T.M. An efficient image retrieval based on HSV color space. In Proceedings of the 2011 International Conference on Electrical and Control Engineering, Yichang, China, 16–18 September 2011; pp. 5746–5749. [CrossRef]
38. Jimenez-Berni, J.A.; Deery, D.M.; Rozas-Larraondo, P.; Condon, A.G.; Rebetzke, G.J.; James, R.A.; Bovill, W.D.; Furbank, R.T.; Sirault, X.R.R. High Throughput Determination of Plant Height, Ground Cover, and Above-Ground Biomass in Wheat with LiDAR. Front. Plant Sci. 2018, 9, 237. [CrossRef]
39. Tambussi, E.A.; Bort, J.; Guiamet, J.J.; Nogués, S.; Araus, J.L. The Photosynthetic Role of Ears in C3 Cereals: Metabolism, Water Use Efficiency and Contribution to Grain Yield. CRC Crit. Rev. Plant Sci. 2007, 26, 1–16. [CrossRef]
40. Singh, K.K.; Frazier, A.E. A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. Int. J. Remote Sens. 2018, 39, 5078–5098. [CrossRef]
41. Smith, M.W.; Vericat, D. From experimental plots to experimental landscapes: Topography, erosion and deposition in sub-humid badlands from Structure-from-Motion photogrammetry. Earth Surf. Process. Landf. 2015, 40, 1656–1671. [CrossRef]
42. Berdugo, C.A.; Steiner, U.; Dehne, H.W.; Oerke, E.C. Effect of bixafen on senescence and yield formation of wheat. Pestic. Biochem. Physiol. 2012, 104, 171–177. [CrossRef]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).