Article

Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions


Abstract

Color slide images of weeds among various soils and residues were digitized and analyzed for red, green, and blue (RGB) color content. The red, green, and blue chromatic coordinates (rgb) of plants were very different from those of background soils and residue. To distinguish living plant material from a nonplant background, several indices of chromatic coordinates were studied and tested, and these were successful in identifying weeds. The indices included r − g, g − b, (g − b)/|r − g|, and 2g − r − b. A modified hue was also used to distinguish weeds from nonplant surfaces. The modified hue, the 2g − r − b index, and the green chromatic coordinate distinguished weeds from a nonplant background (0.05 level of significance) better than the other indices; however, the modified hue was the most computationally intensive. These indices worked well for both nonshaded and shaded sunlit conditions and could be used in sensor design for detecting weeds for spot-spraying control.
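The chromatic coordinates and the 2g − r − b (excess green) index described in the abstract can be sketched in a few lines. This is an illustrative sketch: the function names and the sample pixel values are ours, not the paper's.

```python
# Sketch of chromatic coordinates and the 2g - r - b index, assuming
# 8-bit RGB values for a single pixel; names and sample values are ours.
def chromatic_coords(R, G, B):
    """Normalize RGB to chromatic coordinates r, g, b (each in [0, 1])."""
    total = R + G + B
    if total == 0:
        return 0.0, 0.0, 0.0
    return R / total, G / total, B / total

def excess_green(R, G, B):
    """The 2g - r - b index; positive values suggest living plant material."""
    r, g, b = chromatic_coords(R, G, B)
    return 2 * g - r - b

# A green leaf pixel scores high; a brownish soil pixel scores near zero.
leaf = excess_green(60, 180, 40)
soil = excess_green(120, 100, 80)
```

Since r + g + b = 1 by construction, 2g − r − b reduces to 3g − 1, which is why the green chromatic coordinate alone performs similarly in the paper's comparison.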


... Grayscale Index, GRAY: (R + G + B)/3 [24]. Chromatic coordinate for red / Normalized red of RGB, NRRGB: R/(R + G + B) [39] ...
... Chromatic coordinate for green / Normalized green of RGB, NGRGB: G/(R + G + B) [39] ...
... Chromatic coordinate for blue / Normalized blue of RGB, NBRGB: B/(R + G + B) [39]. Normalized Green Red Difference Index, NGRDI ...
Article
Full-text available
Understanding forest tree phenology is essential for assessing forest ecosystem responses to environmental changes. Observations of phenology using remote sensing devices, such as satellite imagery and Unmanned Aerial Vehicles (UAVs), along with machine learning, are promising techniques. They offer fast, accurate, and unbiased results linked to ground data to enable us to understand ecosystem processes. Here, we focused on European beech, one of Europe’s most common forest tree species, along an altitudinal transect in the Carpathian Mountains. We performed ground observations of leaf phenology and collected aerial images using UAVs and satellite-based biophysical vegetation parameters. We studied the time series correlations between ground data and remote sensing observations (GLI r = 0.86 and FCover r = 0.91) and identified the most suitable vegetation indices (VIs). We trained linear and non-linear (random forest) models to predict the leaf phenology as a percentage of leaf cover on test datasets; the models had reasonable accuracy, RMSE percentages of 8% for individual trees, using UAV, and 12% as an average site value, using the Copernicus biophysical parameters. Our results suggest that the UAVs and satellite images can provide reliable data regarding leaf phenology in the European beech.
... In addition, three bands were generated for each capture of the thermal images (band 1, band 2, band 3) (Figure 3). Everitt et al., 1987; Woebbecke Index (WI), Woebbecke et al., 1995: (g − b)/(r − g); Excess Blue Index (EXB), Mao et al., 2003: 1.4b − g; Excess Green Index (EXG), Woebbecke et al., 1995: 2g − r − b; Excess Red Index (EXR), Meyer et al., 2008: 1.4r − g; Excess Green-Red Index (EXGR), Neto et al., 2004: 3g − 2.4r − b; Color Index of Vegetation (CIVE), Kataoka et al., 2003: 0.441R − 0.811G + ...
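The excess-index family listed in the excerpt above can be written out as small functions on normalized chromatic coordinates. This is a hedged sketch with names of our choosing; the CIVE formula, truncated in the excerpt, is omitted.

```python
# Sketch of the excess-index family on normalized chromatic coordinates.
def _rgb_norm(R, G, B):
    s = R + G + B
    return (R / s, G / s, B / s) if s else (0.0, 0.0, 0.0)

def exg(R, G, B):   # Excess Green, 2g - r - b (Woebbecke et al., 1995)
    r, g, b = _rgb_norm(R, G, B)
    return 2 * g - r - b

def exr(R, G, B):   # Excess Red, 1.4r - g (Meyer et al.)
    r, g, b = _rgb_norm(R, G, B)
    return 1.4 * r - g

def exb(R, G, B):   # Excess Blue, 1.4b - g (Mao et al., 2003)
    r, g, b = _rgb_norm(R, G, B)
    return 1.4 * b - g

def exgr(R, G, B):  # Excess Green minus Red, 3g - 2.4r - b (Neto et al., 2004)
    r, g, b = _rgb_norm(R, G, B)
    return 3 * g - 2.4 * r - b
```

Each index emphasizes a different channel: green pixels drive ExG and ExGR up, while reddish soil or residue drives ExR up, which is why pairs of these indices are often thresholded jointly.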
Article
Full-text available
Nutrient management requires traditional soil and plant analysis, which is time-consuming, costly, and laborious. Therefore, many efforts have been directed toward developing novel approaches for estimating plant status. Our objective was to evaluate the potential of thermal and RGB imaging to estimate chlorophyll levels and some essential nutrients in the pepper plant (Capsicum annuum). Forty plants were randomly selected and marked for subsequent imaging. Thermal and RGB cameras were used for remotely sensed image acquisition of the studied plants. Chlorophyll was measured by SPAD, and the plants were then analyzed to determine their chlorophyll, nitrogen, phosphorus, potassium, and sulfur contents. The correlations between the measured values and the thermal and RGB image data were calculated, along with seven vegetation indices. The results showed that chlorophyll b and SPAD-measured chlorophyll correlated significantly with the blue band of the RGB images (r = 0.625 and 0.709, respectively). The excess blue, excess green, excess green-red, and excess red indices and the Color Index of Vegetation (CIVE) performed best for assessing chlorophyll. In thermal imaging, band 3 in capture 1 was best for assessing chlorophyll, while band 2 in capture 2 was the better fit for assessing nitrogen, potassium, and sulfur. The most discriminating indices for assessing pepper nutrient content were the excess blue, excess green, excess green-red, and CIVE indices. The regression analysis revealed that a simple linear equation can be used to estimate the percentage of nitrogen, sulfur, and potassium in plants using the excess green index (r = 0.9159, 0.8942, 0.8715, and 0.9117, respectively), and that the red-blue ratio index (RBRI) can be used to estimate the percentage of P (r = 0.9098).
Thermal and RGB imaging can thus be used as tools for assessing some nutrients and chlorophyll content in pepper plants. [J. Agric. & Env. Sci. (Damanhour University) 2022, 21 (3): 80-100]
... An evaluation was done to select the most appropriate index (among 12 common RGB indices) or spectral band to separate colors. The highest f-values were obtained for the R band and the Woebbecke index (WBI = (G − B)/(R − G) [33]). A clear separation between green (leaves) and the other colors (flowers) was observed (Figure S1). ...
... These dates are used for evaluation of black spot occurrence, as the disease typically develops most symptoms at the end of the season, resulting in an early leaf drop. Based on the RGB orthophoto, the Excess Green vegetation index was calculated (UAV_ExG = (2G − R − B)/(R + G + B) [33]). The median value of ExG was determined for every genotype (only considering pixels corresponding to vegetation based on the threshold applied to the canopy height model). ...
Article
Full-text available
Breeding and selection of nursery plants require evaluation of a wide variety of traits. Characteristics that are visually scored in the field include aesthetic beauty as well as tolerance to (a)biotic stresses, among others. This work proposes methodologies based on vegetation indices and canopy height data derived from visual imagery captured using an RGB (red, green, blue) camera embedded in a drone to rank and select genotypes. This approach relies on quantitative evaluation standards that exclude breeder bias and speed up data collection. A proof of concept for nursery plants was developed in two woody ornamentals: sweet box (Sarcococca Lindl.) and garden rose (Rosa L.). This work aimed to compare methodologies and to propose how drones (unmanned aerial vehicles, UAV) and high throughput field phenotyping (HTFP) can be used in a commercial plant selection program. Data was gathered in 2019 in three well-established breeding trials, two for sweet box and one for rose. Characteristics discussed include plant architecture traits such as plant height and shape in the sweet box and rose as well as floribundity, continuous flowering and disease resistance in roses. Correlations were calculated between on-ground measurements and UAV-derived methods, obtaining significant results. The advantages and disadvantages of the methodology and the approach for each of the traits are discussed.
... where u, v are pixel coordinates, µ_I ∈ ℝ³ and σ_I ∈ ℝ³ are the mean and standard deviation of I, and ε = 10⁻⁸. (ii) Compute the vegetation mask M following Woebbecke et al. [42]. Specifically, M is given by ... where I_R, I_G, and I_B are the color channels of I_norm. ...
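The per-channel normalization and Woebbecke-style mask from the excerpt can be sketched as follows, assuming an H×W×3 image array. The zero threshold on the excess-green score is our assumption, since the excerpt's mask formula is truncated.

```python
import numpy as np

# Sketch of channel standardization followed by an ExG-style vegetation
# mask; the threshold of 0 is our assumption, not the paper's.
def normalize(I, eps=1e-8):
    """Standardize each color channel: (I - mean) / (std + eps)."""
    mu = I.reshape(-1, 3).mean(axis=0)
    sigma = I.reshape(-1, 3).std(axis=0)
    return (I - mu) / (sigma + eps)

def vegetation_mask(I):
    """Boolean mask where the excess-green score 2G - R - B is positive."""
    I_norm = normalize(I.astype(np.float64))
    R, G, B = I_norm[..., 0], I_norm[..., 1], I_norm[..., 2]
    return (2 * G - R - B) > 0

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (60, 180, 40)    # green pixel
img[1, 1] = (120, 100, 80)   # soil-like pixel
mask = vegetation_mask(img)
```

Standardizing per channel before computing the mask makes the threshold less sensitive to global illumination differences between images, which is the motivation for step (i) in the excerpt.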
Preprint
Agricultural robots have the prospect to enable more efficient and sustainable agricultural production of food, feed, and fiber. Perception of crops and weeds is a central component of agricultural robots that aim to monitor fields and assess the plants as well as their growth stage in an automatic manner. Semantic perception mostly relies on deep learning using supervised approaches, which require time and qualified workers to label fairly large amounts of data. In this paper, we look into the problem of reducing the amount of labels without compromising the final segmentation performance. For robots operating in the field, pre-training networks in a supervised way is already a popular method to reduce the number of required labeled images. We investigate the possibility of pre-training in a self-supervised fashion using data from the target domain. To better exploit this data, we propose a set of domain-specific augmentation strategies. We evaluate our pre-training on semantic segmentation and leaf instance segmentation, two important tasks in our domain. The experimental results suggest that pre-training with domain-specific data paired with our data augmentation strategy leads to superior performance compared to commonly used pre-trainings. Furthermore, the pre-trained networks reach performance similar to fully supervised training while using less labeled data.
... where R, G, and B represent reflectance values of R, G, and B bands in the original image acquired by the UAV-RGB sensor [35]. Given that the methodology is based on the comparison of vegetation index histograms and the determination of their percentage of overlap, in the first step it is necessary to calculate the histogram of a healthy crop based on the segments labelled as a healthy crop in the image acquisition phase by the agriculture expert. ...
Article
Full-text available
Since corn is the second most widespread crop globally and its production affects industries from animal husbandry to sweeteners, modern agriculture faces the task of preserving yield quality and detecting corn stress. The application of remote sensing techniques has enabled more efficient crop monitoring thanks to the ability to cover large areas and perform non-destructive, non-invasive measurements. By using vegetation indices, it is possible to effectively measure the status of surface vegetation and detect stress in the field. This study describes a methodology for corn stress detection using red-green-blue (RGB) imagery and vegetation indices. Using the Excess Green vegetation index and a vegetation index histogram calculated for a healthy crop, corn stress was effectively detected. The obtained results showed accuracy higher than 89% on both experimental plots, confirming that the proposed methodology can be used for corn stress detection using images acquired with only an RGB sensor. The proposed method does not depend on the sensor used for image acquisition or the vegetation index used for stress detection, so it can be used in various setups.
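The histogram-comparison idea in the abstract (compare an Excess Green histogram of a field segment against that of a known healthy crop) might look like the minimal sketch below. The bin count, value range, and intersection-style overlap measure are our assumptions, not the paper's.

```python
import numpy as np

# Sketch of ExG computation plus a histogram-overlap score between a
# candidate area and a healthy-crop reference; parameters are assumptions.
def exg_image(rgb):
    """Per-pixel excess green 2g - r - b on chromatic coordinates."""
    s = rgb.sum(axis=-1, keepdims=True).clip(min=1)
    r, g, b = np.moveaxis(rgb / s, -1, 0)
    return 2 * g - r - b

def histogram_overlap(values_a, values_b, bins=50, rng=(-1.0, 2.0)):
    """Fractional overlap (0..1) of two normalized histograms."""
    ha, _ = np.histogram(values_a, bins=bins, range=rng)
    hb, _ = np.histogram(values_b, bins=bins, range=rng)
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return np.minimum(ha, hb).sum()
```

A low overlap between a segment's ExG distribution and the healthy reference would flag that segment as potentially stressed; the actual decision threshold would have to be calibrated per field.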
... Anthocyanin (Gitelson): (1/ρ550 − 1/ρ700) × ρ780 [203]. Vegetation indices [1,2] based on the visible spectra are usually calculated to enhance vegetation characteristics; indeed, they can effectively assess the variation in biomass cover of green crops based on RGB orthoimages [191]. Different combinations of the R, G and B bands can be developed; from these combinations, environmental and lighting effects are reduced by segmenting the vineyard rows from the main image [46]. ...
Article
Full-text available
The potential of precision viticulture has been highlighted since the first studies performed in the context of viticulture, but especially in the last decade excellent results have been achieved in terms of innovation and simple application. The deployment of new sensors for vineyard monitoring is set to increase in the coming years, enabling large amounts of information to be obtained. However, the large number of sensors developed and the great amount of data that can be collected are not always easy to manage, as this requires cross-sectoral expertise. The preliminary section of the review presents the scenario of precision viticulture, highlighting its potential and possible applications. This review illustrates the types of sensors and their operating principles. Remote platforms such as satellites, unmanned aerial vehicles (UAV) and proximal platforms are also presented. Some supervised and unsupervised algorithms used for object-based image segmentation and classification (OBIA) are then discussed, as well as a description of some vegetation indices (VI) used in viticulture. Photogrammetric algorithms for 3D canopy modelling using dense point clouds are illustrated. Finally, some machine learning and deep learning algorithms are illustrated for processing and interpreting big data to understand the vineyard agronomic and physiological status. This review shows that to perform accurate vineyard surveys and evaluations, it is important to select the appropriate sensor or platform, as the algorithms used in post-processing depend on the type of data collected. Several aspects discussed are fundamental to the understanding and implementation of vineyard variability monitoring techniques. However, it is evident that in the future, artificial intelligence and new equipment will become increasingly relevant for the detection and management of spatial variability through an autonomous approach.
... The vegetation index reflects spectral differences [27,28]. The vegetation indices based on the red, green and blue bands of visible light include excess green index (EXG) [29], excess green-red index (EXGR) [30], visible-band difference vegetation index (VDVI) [31], normalized green-red difference index (NGRDI) [32], and red-green ratio index (RGRI), etc. Among them, EXG and NGRDI are expressed as: ...
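The two indices the excerpt introduces before it truncates are conventionally defined as below. This is a minimal sketch; the band inputs may be reflectances or digital numbers, and the function names are ours.

```python
# Conventional definitions of EXG and NGRDI on visible-light bands.
def exg(r, g, b):
    """Excess Green Index: EXG = 2G - R - B."""
    return 2 * g - r - b

def ngrdi(r, g):
    """Normalized Green-Red Difference Index: (G - R) / (G + R)."""
    return (g - r) / (g + r)
```

NGRDI is bounded to [−1, 1] by its normalization, while EXG's range depends on whether raw bands or chromatic coordinates are used, which matters when choosing a segmentation threshold.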
Article
Full-text available
Understory vegetation, as an important part of a forest ecosystem, plays an important ecological role in maintaining the diversity of the ecosystem, the stability of ecosystem services, and the accumulation of nutrient elements. In this study, a new method of recognizing areas without understory vegetation is proposed. The method makes full use of the spectral characteristics, spatial structure information, and temporal resolution of UAV images, and can quickly and simply identify understory areas without vegetation cover. Combined with fractional vegetation coverage (FVC) and vegetation dispersion, understory areas with no vegetation can be successfully recognized, with Pr, Re, and F1 all above 85%. The proportions of bare soil under forest in our study areas are 20.40%, 19.98%, and even 41.69%. The study area is located in Changting County, Fujian Province, a typical red soil area in China where serious soil erosion is taking place in the forest. The method provides a promising, quick, and economical way of estimating understory vegetation coverage with high spatial accuracy.
... were all greater than 97%. The total accuracy of the excess red index combined with the digital surface model was the highest (98.77%), with a Kappa coefficient of 0.956. The six visible light vegetation indices were the excess red index [9], the excess green index [10], the excess blue index [11], the visible band difference vegetation index [5], the red-green ratio index [12], and the blue-green ratio index [13]; their formulas and theoretical ranges are shown in Table 1. ...
Article
Full-text available
【Objective】To obtain remote sensing images of a sand sugar tangerine orchard by UAV, rapidly extract the distribution positions of fruit trees, and provide a reference for growth monitoring and yield prediction of fruit trees. 【Method】Visible light remote sensing images taken by drones were used as the research object. Six visible light vegetation indices were calculated: the excess red index, excess green index, excess blue index, visible band difference vegetation index, red-green ratio index, and blue-green ratio index. We used the double-peak (bimodal) threshold method to select the threshold for fruit tree extraction. In addition to identification based on the spectral indices, a digital surface model was added as an input variable of the identification model, and a comparative test was conducted. 【Result】Compared with using a single spectral index, the addition of the digital surface model improved the extraction accuracies of fruit-tree and non-fruit-tree pixels. The total accuracies of the six band fusions
... To compare the VGGUnet model with traditional threshold-based segmentation methods, the NDVI was used to quantitatively evaluate the green tide extraction results on the Super Dove testing datasets; due to the lack of a near-infrared band, the excess green index (EXG) [55] was used to quantitatively evaluate the green tide extraction results on the UAV testing datasets. The calculation formulas of NDVI and EXG are as follows: ...
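The two formulas the excerpt refers to are conventionally written as below. This is a sketch; the small epsilon guard against division by zero is our addition, not the paper's.

```python
# Conventional NDVI and EXG definitions on band values.
def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red + eps)

def exg(red, green, blue):
    """Excess Green Index: 2G - R - B."""
    return 2 * green - red - blue
```

EXG substitutes for NDVI here only because the UAV imagery lacks a near-infrared band; on sensors that do carry NIR, NDVI generally separates vegetation from water and sand more robustly.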
Article
Full-text available
Green tide beaching events have occurred frequently in the Yellow Sea since 2007, causing a series of ecological and economic problems. Satellite imagery has been widely applied to monitor green tide outbreaks in open water. Traditional satellite sensors, however, are limited by coarse resolution or a low revisit rate, making it difficult to provide timely distribution of information about green tides in the nearshore. In this study, both PlanetScope Super Dove images and unmanned aerial vehicle (UAV) images are used to monitor green tide beaching events on the southern side of Shandong Peninsula, China. A deep learning model (VGGUnet) is used to extract the green tide features and quantify the green tide coverage area or biomass density. Compared with the U-net model, the VGGUnet model has a higher accuracy on the Super Dove and UAV images, with F1-scores of 0.93 and 0.92, respectively. The VGGUnet model is then applied to monitor the distribution of green tide on the beach and in the nearshore water; the results suggest that the VGGUnet model can accurately extract green tide features while discarding other confusing features. By using the Super Dove and UAV images, green tide beaching events can be accurately monitored and are consistent with field investigations. From the perspective of near real-time green tide monitoring, high-resolution imagery combined with deep learning is an effective approach. The findings pave the way for monitoring and tracking green tides in coastal zones, as well as assisting in the prevention and control of green tide disasters.
... The former improves the segmentation effect based on the color index. For example, Woebbecke et al. [7] proposed the Excess Green Index (ExG), which emphasizes plant greenness and can effectively separate the plant from the soil. In order to simulate human color perception, Meyer et al. [8] proposed the Excess Red Index (ExR), considering the relative ratios of the three types of cones in the human eye. ...
Article
Full-text available
To solve the problems of the low target-positioning accuracy and weak algorithm robustness of target-dosing robots in greenhouse environments, an image segmentation method for cucumber seedlings based on a genetic algorithm was proposed. Firstly, images of cucumber seedlings in the greenhouse were collected under different light conditions, and grayscale histograms were used to evaluate the quality of target and background sample images. Secondly, the genetic algorithm was used to determine the optimal coefficient of the graying operator to further expand the difference between the grayscale of the target and background in the grayscale images. Then, the Otsu algorithm was used to perform the fast threshold segmentation of grayscale images to obtain a binary image after coarse segmentation. Finally, morphological processing and noise reduction methods based on area threshold were used to remove the holes and noise from the image, and a binary image with good segmentation was obtained. The proposed method was used to segment 60 sample images, and the experimental results show that under different lighting conditions, the average F1 score of the obtained binary images was over 94.4%, while the average false positive rate remained at about 1.1%, and the image segmentation showed strong robustness. This method can provide new approaches for the accurate identification and positioning of targets as performed by target-dosing robots in a greenhouse environment.
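The graying-then-Otsu pipeline described in the abstract can be sketched as follows. The graying weights here are a fixed ExG-like choice standing in for the GA-optimized coefficients, which the abstract does not list; the Otsu implementation is the textbook exhaustive search.

```python
import numpy as np

# Sketch of weighted graying followed by Otsu thresholding; the weights
# stand in for the paper's GA-optimized coefficients.
def gray_operator(rgb, w=(-1.0, 2.0, -1.0)):
    """Weighted grayscale, rescaled to 0..255."""
    g = rgb[..., 0] * w[0] + rgb[..., 1] * w[1] + rgb[..., 2] * w[2]
    g = g - g.min()
    return (255 * g / max(g.max(), 1e-9)).astype(np.uint8)

def otsu_threshold(gray):
    """Exhaustive Otsu: maximize between-class variance over 0..255."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t
```

The point of optimizing the graying weights first is to make the target/background histogram strongly bimodal, which is exactly the situation in which Otsu's criterion works best.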
... As the visible light true color sensors carried by UAVs have only three bands of RGB [16], the strong reflection characteristics of vegetation in the near-infrared band cannot be expressed. Among more than 150 vegetation index models [17] published in the literature, this paper selects the EXG (excess green) index [18], the MGRVI (modified green-red vegetation index) [19], the NGRDI (normalized green-red difference index) [20], the RGBVI (red-green-blue vegetation index) [21], and the VDVI (visible-band difference vegetation index) [22], all of which are constructed based on the RGB color space. Moreover, the contrast between vegetation and non-vegetation areas of these visible light vegetation indexes is obvious, and the vegetation information recognition effect is good, in which the vegetation area is bright white and the non-vegetation area is dark gray. ...
Article
Full-text available
Owing to factors such as climate change and human activities, ecological and environmental problems of land desertification have emerged in many regions around the world, among which the problem of land desertification in northwestern China is particularly serious. To grasp the trend of land desertification and the degree of natural vegetation degradation in northwest China is a basic prerequisite for managing the fragile ecological environment there. Visible light remote sensing images taken by a UAV can monitor the vegetation cover in desert areas on a large scale and with high time efficiency. However, as there are many low shrubs in desert areas, the shadows cast by them are darker, and the traditional RGB color-space-based vegetation index is affected by the shadow texture when extracting vegetation, so it is difficult to achieve high accuracy. For this reason, this paper proposes the Lab color-space-based vegetation index L2AVI (L-a-a vegetation index) to solve this problem. The EXG (excess green index), NGRDI (normalized green-red difference index), VDVI (visible band difference vegetation index), MGRVI (modified green-red vegetation index), and RGBVI (red-green-blue vegetation index) constructed based on RGB color space were used as control experiments in the three selected study areas. The results show that, although the extraction accuracies of the vegetation indices constructed based on RGB color space all reach more than 70%, these vegetation indices are all affected by the shadow texture to different degrees, and there are many problems of misdetection and omission. However, the accuracy of the L2AVI index can reach 99.20%, 99.73%, and 99.69%, respectively, avoiding the problem of omission due to vegetation shading and having a high extraction accuracy. Therefore, the L2AVI index can provide technical support and a decision basis for the protection and control of land desertification in northwest China.
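The excerpt does not give the L2AVI formula, so the sketch below only illustrates the underlying idea of moving to Lab color space, where vegetation typically shows a strongly negative a* (green-red) component. The conversion is the standard sRGB-to-CIELAB transform (D65 white point); everything else is our illustration, not the paper's index.

```python
import numpy as np

# Hedged sketch: standard sRGB -> CIELAB conversion, illustrating why a
# Lab-based index can separate vegetation (a* < 0) from soil (a* > 0).
def srgb_to_lab(rgb):
    """Convert one 8-bit sRGB triple to CIELAB (D65 white point)."""
    c = np.asarray(rgb, dtype=np.float64) / 255.0
    # Undo the sRGB gamma.
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> XYZ (sRGB matrix, D65).
    M = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    X, Y, Z = M @ lin
    xn, yn, zn = X / 0.95047, Y / 1.0, Z / 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    L = 116 * f(yn) - 16
    a = 500 * (f(xn) - f(yn))
    b = 200 * (f(yn) - f(zn))
    return L, a, b
```

Because shadow mainly lowers L* while leaving the a* (green-red) axis comparatively stable, an index built on Lab channels is less disturbed by shadow texture than RGB-space indices, which is consistent with the advantage the abstract reports for L2AVI.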
... In this study, due to the lack of NIR band in the images, Color VI will be used to replace NDVI to detect vegetation. Color VI is usually used for plant detection, and it was originally proposed to use RGB bands to identify living plants in bare lands, wheat straw, and other residues [23]. However, owing to its ability to highlight a specific color, such as the green of plants, we use Color VI for vegetation detection in the remote sensing images. ...
Article
Full-text available
Remote sensing is the primary way to extract the impervious surface areas (ISAs). However, the obstruction of vegetation is a long-standing challenge that prevents the accurate extraction of urban ISAs. Currently, there are no general and systematic methods to solve the problem. In this paper, we present a morphological feature-oriented algorithm, which can make use of the OSM road network information to remove the obscuring effects when the ISAs are extracted. Very high resolution (VHR) images of Wuhan, China, were used in experiments to verify the effectiveness of the proposed algorithm. Experimental results show that the proposed algorithm can improve the accuracy and completeness of ISA extraction by our previous deep learning-based algorithm. In the proposed algorithm, the overall accuracy (OA) is 86.64%. The results show that the proposed algorithm is feasible and can extract the vegetation-obscured ISAs effectively and precisely.
... Many spectral indices based on RGB bands only have been proposed to allow observation of vegetation, including its health status and volume, e.g. ExG - Excess Green, NExG - Normalized Excess Green (both Woebbecke et al., 1995), ExR - Excess Red (Meyer, Hindman and Laksmi, 1999), GLI - Green Leaf Index (Louhaichi, Borman and Johnson, 2001), VARI - Visible Atmospherically Resistant Index (Gitelson et al., 2002), CIVE - Colour Index of Vegetation Extraction (Kataoka et al., 2003), ExG-ExR difference (Meyer et al., 2004), RGBVI - Red-Green-Blue Vegetation Index (Bendig et al., 2015), and many others. ...
Article
Full-text available
Monitoring changes of land cover near water bodies, and of the water bodies themselves, is part of environmental protection and management. Management can be done at the global or the local level. The local level requires more detailed data, which can be collected, for example, by aircraft or UAV. The paper describes a case study focused on the utilization of UAV-based RGB data to monitor land cover near the pond Baroch, which is located in the Czech Republic near the city of Pardubice. The area is specific: it is a small pond accompanied by several smaller pools and connecting canals and surrounded by meadows (often watered), reeds, bushes, and some trees. The data were collected by the authors during flights planned in advance in August, September, October, November, and December 2021. Support Vector Machine, Maximum Likelihood, Random Trees, and Deep Learning were used as methods to process the data and detect land cover changes. Manually interpreted data were used as reference data. Because of the nature of the data (only R, G, and B bands), classification into only the bare land, water, vegetation, dry vegetation, and wet vegetation classes was used. The very high heterogeneity of the observed area, the availability of RGB bands only, and the very high spatial resolution (1.9 cm per pixel) led to isolated cells.
... where R, G, and B are the normalised red, green, and blue bands of the image, respectively; EXG (excess green index) [37], obtained from the following formula: ...
Article
Full-text available
Cultivation of cover crops is a valuable practice in sustainable agriculture. In cover crop management, the method of desiccation is an important consideration, and one widely used method for this is the application of glyphosate. With use of glyphosate likely to be banned soon in Europe, the purpose of this study was to evaluate the herbicidal effect of pelargonic acid (PA) as a bio-based substitute for glyphosate. This study presents the results of a two-year field experiment (2019 and 2021) conducted in northeast Germany. The experimental setup included an untreated control, three different dosages (16, 8, and 5 L/ha) of PA, and the active ingredients glyphosate and pyraflufen. A completely randomised block design was established. The effect of the herbicide treatments was assessed by a visual estimate of the percentage of crop vitality and a comparison assessment provided by an Ebee+ drone. Four vegetation indices (VIs) calculated from the drone images were used to verify the credibility of colour (RGB)-based and near-infrared (NIR)-based vegetation indices. The results of both types of assessment indicated that pelargonic acid was reasonably effective in controlling cover crops within a week of application. In both experimental years, the PA (16 L/ha) and PA_2T (double application of 8 L/ha) treatments demonstrated their highest herbicidal effect for up to seven days after application. PA (16 L/ha) vitality loss decreased over time, while PA_2T (double application of 8 L/ha) continued to exhibit an almost constant effect for longer due to the second application one week later. The PA dosage of 5 L/ha, pyraflufen, and a mixture of the two exhibited a smaller vitality loss than the other treatments. However, except for glyphosate, the herbicidal effect of all the other treatments decreased over time. At the end of the experiment, the glyphosate treatment (3 L/ha) demonstrated the lowest estimated vitality. 
The results of the drone assessments indicated that vegetation indices (VIs) can provide detailed information regarding crop vitality following herbicide application and that RGB-based indices, such as EXG, have the potential to be applied efficiently and cost-effectively utilising drone imagery. The results of this study demonstrate that pelargonic acid has considerable potential for use as an additional tool in integrated crop management.
... The algorithm for image processing was built in Python language and used the following libraries: Open-CV for commands related to image manipulations; Numpy for the realization of the mathematical operations involved in the process; and Pandas for grouping and organizing the output data. The ExG (RGB) model was used for image segmentation [29], previously evaluated as the model that presented the best performance indices under the conditions of the experiment. ...
Article
Full-text available
Precision Irrigation (PI) is a promising technique for monitoring and controlling water use that allows for meeting crop water requirements based on site-specific data. However, implementing PI needs precise data on water evapotranspiration. The detection and monitoring of crop water stress can be achieved by several methods, one of the most interesting being the use of infra-red (IR) thermometry combined with the estimate of the Crop Water Stress Index (CWSI). However, conventional IR equipment is expensive, so the objective of this paper is to present the development of a new low-cost water stress detection system using leaf temperature (TL) indices obtained by crossing the responses of infrared sensors with image processing. The results demonstrated that it is possible to use low-cost IR sensors with a directional Field of View (FoV) to measure plant temperature, generate thermal maps, and identify water stress conditions. The Leaf Temperature Maps, generated from the IR sensor readings of the plant segmentation in the RGB image, were validated against thermal images. Furthermore, the estimated CWSI is consistent with results in the literature.
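The CWSI mentioned above, in one common empirical form (wet and dry reference temperatures bracketing the measured canopy temperature), can be sketched as below; the sample temperatures are illustrative values, not the paper's.

```python
# One common empirical form of the Crop Water Stress Index.
def cwsi(t_canopy, t_wet, t_dry):
    """CWSI = (Tc - Twet) / (Tdry - Twet), clipped to [0, 1]."""
    x = (t_canopy - t_wet) / (t_dry - t_wet)
    return min(max(x, 0.0), 1.0)
```

A value near 0 indicates a well-watered, freely transpiring canopy; a value near 1 indicates a non-transpiring (stressed) canopy whose temperature approaches the dry reference.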
... Multispectral and RGB-based vegetation indices are usually obtained to assess canopy efficiency and canopy health. Some of the commonly used vegetation indices are GRVI (Tucker, 1979), MGRVI (Bendig et al., 2015), RGBVI, ExG (Woebbecke et al., 1995), ExG minus excess red (Neto, 2010), NDVI (Rouse et al., 1973), normalized difference red edge index (Barnes et al., 2000), green normalized difference vegetation index (Gitelson et al., 2003), and modified soil adjusted vegetation index (Qi et al., 1994). Many indices (www.indexdatabase.de) ...
[Figure 6 of the citing paper: Data extraction using plot (A) and grid (B) boundaries]
Article
Full-text available
Unmanned aerial systems (UASs) have increased our capacity for collecting finer spatiotemporal resolution data that were previously unobtainable through conventional methods. The use of UAS for obtaining high‐throughput phenotyping (HTP) data in plant breeding programs has gained popularity in recent years. The integrity and quality of the raw data are essential for ensuring the accuracy of predictive tools and proper interpretation of the data. This paper summarizes the standard operating procedures for high‐quality UAS data collection, processing, and analysis for UAS‐based HTP (UAS‐HTP). Plant breeders can follow these procedures to implement a UAS‐HTP system in their germplasm enhancement and cultivar development programs. An essential protocol and procedure for unmanned aerial system (UAS) data collection for small plot research are developed. The standardized semiautomated UAS image processing procedure is explained. Potential applications of UAS data for high‐throughput phenotyping of a few traits are briefly described.
... Mean values of the digital numbers (0-255) for R, G, and B for each plot were used in the calculations. Only results from the VIs showing the strongest correlation between the index and the yield are presented (ExG [22] and NGRDI [23]). ...
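As a hedged illustration of the two indices computed from plot-mean digital numbers (the function name is an assumption; ExG is shown on chromatic coordinates, as in the normalized formulation, and NGRDI follows its standard published definition (G - R)/(G + R)):

```python
def plot_indices(mean_r, mean_g, mean_b):
    """Return (ExG, NGRDI) from mean plot digital numbers (0-255).

    ExG = 2g - r - b on chromatic coordinates; NGRDI = (G - R)/(G + R).
    """
    total = mean_r + mean_g + mean_b
    r, g, b = mean_r / total, mean_g / total, mean_b / total
    exg = 2 * g - r - b
    ngrdi = (mean_g - mean_r) / (mean_g + mean_r)
    return exg, ngrdi

# A moderately green plot mean:
exg, ngrdi = plot_indices(90.0, 110.0, 70.0)
print(exg, ngrdi)
```

Both indices increase with canopy greenness, which is why they can serve as yield proxies in dense grass swards.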
Article
Full-text available
Yield maps give farmers information about growth conditions and can be a tool for site-specific crop management. Combine harvesters may provide farmers with detailed yield maps if there is a constant flow of a certain amount of biomass through the yield sensor. This is unachievable for grass seeds because the weight of the intake is generally too small to record the variation. Therefore, there is a need to find another way to make grass seed yield maps. We studied seed yield variation in two red fescue (Festuca rubra) fields with variation in management and soil fertility, respectively. We estimated five vegetation indices (VI) based on RGB images taken from a drone to describe yield variation, and trained prediction models based on relatively few harvested plots. Only results from the VI showing the strongest correlation between the index and the yield are presented (Normalized Excess Green Index (ExG) and Normalized Green/Red Difference Index (NGRDI)). The study indicates that it is possible to predict the yield variation in a grass field based on relatively few harvested plots, provided the plots represent contrasting yield levels. The prediction errors in yield (RMSE) ranged from 171 kg ha−1 to 231 kg ha−1, with no clear influence of the size of the training data set. Using random selection of plots instead of selecting plots representing contrasting yield levels resulted in slightly better predictions when evaluated on an average of ten random selections. However, using random selection of plots came with a risk of poor predictions due to the occasional lack of correlation between yield and VI. The exact timing of unmanned aerial vehicle (UAV) image capture proved to be unimportant in the weeks before harvest.
... EVI is an adjusted version of NDVI and is considered to be less sensitive to soil background and atmospheric aerosol scattering compared to NDVI (Huete et al., 2002). ExG uses bands in the visible portion of the spectrum and has been used as an indicator of plant structural traits (Woebbecke et al., 1995; Li et al., 2018). ...
Article
Full-text available
Phenotyping approaches have been considered as a vital component in crop breeding programs to improve crops and develop new high-yielding cultivars. However, traditional field-based monitoring methods are expensive, invasive, and time-intensive. Moreover, data collected using satellite and airborne platforms are either costly or limited by their spatial and temporal resolution. Here, we investigated whether low-cost unmanned/unoccupied aerial systems (UASs) data can be used to estimate winter wheat ( Triticum aestivum L.) nitrogen (N) content, structural traits including plant height, fresh and dry biomass, and leaf area index (LAI) as well as yield during different winter wheat growing stages. To achieve this objective, UAS-based red–green–blue (RGB) and multispectral data were collected from winter wheat experimental plots during the winter wheat growing season. In addition, for each UAS flight mission, winter wheat traits and total yield (only at harvest) were measured through field sampling for model development and validation. We then used a set of vegetation indices (VIs), machine learning algorithms (MLAs), and structure-from-motion (SfM) to estimate winter wheat traits and yield. We found that using linear regression and MLAs, instead of using VIs, improved the capability of UAS-derived data in estimating winter wheat traits and yield. Further, considering the costly and time-intensive process of collecting in-situ data for developing MLAs, using SfM-derived elevation models and red-edge-based VIs, such as CIre and NDRE, are reliable alternatives for estimating key winter wheat traits. Our findings can potentially aid breeders through providing rapid and non-destructive proxies of winter wheat phenotypic traits.
... Table 3. Vegetation indices used in this study and their accompanying equations. [34] † VARI = Visible Atmospherically Resistant Index, GLI = Green Leaf Index, MGRVI = Modified Green Red Vegetation Index, RGBVI = Red Green Blue Vegetation Index, and ExG = Excess of Green. ‡ ρG is the green reflectance band, ρR is the red reflectance band, and ρB is the blue reflectance band. ...
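For reference, the five visible-spectrum VIs named in the table's footnote can be computed from band values as below, using their standard published forms (the cited implementation may scale or normalize differently; the function name is illustrative):

```python
def visible_vis(R, G, B):
    """Five visible-spectrum vegetation indices from band values
    (reflectances or digital numbers), in their standard published forms.
    """
    return {
        "VARI": (G - R) / (G + R - B),             # Visible Atmospherically Resistant Index
        "GLI": (2 * G - R - B) / (2 * G + R + B),  # Green Leaf Index
        "MGRVI": (G**2 - R**2) / (G**2 + R**2),    # Modified Green Red Vegetation Index
        "RGBVI": (G**2 - R * B) / (G**2 + R * B),  # Red Green Blue Vegetation Index
        "ExG": 2 * G - R - B,                      # Excess Green
    }

# Example reflectances for a green canopy pixel:
vis = visible_vis(0.10, 0.20, 0.05)
print({k: round(v, 3) for k, v in vis.items()})
```

Note that VARI's denominator can approach zero over non-vegetated surfaces, so robust code guards that case.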
Article
Full-text available
In general, remote sensing studies assessing cover crop growth are species nonspecific, use imagery from satellites or modified unmanned aerial vehicles (UAVs), and rely on multispectral vegetation indexes (VIs). However, using RGB imagery and visible-spectrum VIs from commercial off-the-shelf (COTS) UAVs to assess species specific cover crop growth is limited in the current scientific literature. Thus, this study evaluated RGB imagery and visible-spectrum VIs from COTS UAVs for suitability to estimate concentration (%) and content (kg ha−1) based cereal rye (CR) biomass, carbon (C), nitrogen (N), phosphorus (P), potassium (K), and sulfur (S). UAV surveys were conducted at two fields in Indiana and evaluated five visible-spectrum VIs—Visible Atmospherically Resistant Index (VARI), Green Leaf Index (GLI), Modified Green Red Vegetation Index (MGRVI), Red Green Blue Vegetation Index (RGBVI), and Excess of Greenness (ExG). This study utilized simple linear regression (VI only) and stepwise multiple regression (VI with weather and geographic data) to produce individual models for estimating CR biomass, C, N, P, K, and S concentration and content. The goodness-of-fit statistics were generated using repeated K-fold cross-validation to compare individual model performance. In general, the models developed using simple linear regression were inferior to those developed using the multiple stepwise regression method. Furthermore, for models developed using the multiple stepwise regression method all five VIs performed similarly when estimating concentration-based CR variables; however, when estimating content-based CR variables the models developed with GLI, MGRVI, and RGBVI performed similarly explaining 74–81% of the variation in CR data, and outperformed VARI and ExG. However, on an individual field basis, MGRVI consistently outperformed GLI and RGBVI for all CR characteristics. 
This study demonstrates the potential to utilize COTS UAVs for estimating in-field CR characteristics; however, the models generated in this study need further development to expand geographic scope and incorporate additional abiotic factors.
... Meanwhile, considerable research has been done on image processing/analysis techniques for weed detection (Wang et al., 2019; Meyer and Neto, 2008; Wu et al., 2021). A variety of color indices that accentuate plant greenness have been proposed for enhanced discrimination and segmentation of weeds from soil backgrounds (Meyer and Neto, 2008; Woebbecke et al., 1995). These color indices, despite the ease of computation, are not robust to various imaging conditions, especially in dealing with images acquired under natural field light conditions (Hamuda et al., 2016; Bawden et al., 2017). ...
Article
Full-text available
Weeds are among the major threats to cotton production. Overreliance on herbicides for weed control has accelerated the evolution of herbicide resistance in weeds and caused increasing concerns about the environment, food safety and human health. Machine vision systems for automated/robotic weeding have received growing interest towards the realization of integrated, sustainable weed management. However, in the presence of unstructured field environments and significant biological variability of weeds, it remains a serious challenge to develop robust in-crop weed identification and detection systems. Addressing this challenge requires the development of annotated, large-scale image datasets of weeds specific to cotton production and data-driven machine learning models for weed detection. Among various deep learning architectures, the diversity of YOLO (You Only Look Once) detectors is well-suited for real-time application and has enjoyed great popularity for generic object detection. This study presents a new dataset (CottoWeedDet12) of weeds that are important to cotton production in the southern United States; it consists of 5648 images of 12 weed classes with a total of 9370 bounding box annotations, collected under natural light conditions and at varied weed growth stages in cotton fields. A novel, comprehensive benchmark of 25 state-of-the-art YOLO object detectors of seven versions, including YOLOv3, YOLOv4, Scaled-YOLOv4, YOLOR, YOLOv5, YOLOv6 and YOLOv7, has been established for weed detection on the dataset. Based on Monte Carlo cross validation with 5 replications, the detection accuracy in terms of mAP@0.5 ranged from 88.14% by YOLOv3-tiny to 95.22% by YOLOv4, and the accuracy in terms of mAP@[0.5:0.95] ranged from 68.18% by YOLOv3-tiny to 89.72% by Scaled-YOLOv4. All the YOLO models, especially YOLOv5n and YOLOv5s, have shown great potential for real-time weed detection, and data augmentation could increase weed detection accuracy.
Both the weed detection dataset and software program codes for model benchmarking in this study are publicly available, which are expected to be valuable resources for promoting future research on AI-empowered weed detection and control for cotton and potentially other crops.
... A specially programmed image-processing macro in DepositScan (National Institutes of Health, Bethesda, MD, USA) was used to determine the spray coverage parameters. In order to provide an accurate reference value for LWA calculation based on monocular vision, the study was conducted based on the Excess Green Index (ExG) proposed by Woebbecke [26] for LWA calculation. ...
... This study empirically demonstrated that supervised classification methods were more effective in grassland desertification assessment of the non-desertification grade ( Figure 6). In the future, new methods, such as near-infrared and visible band combinations, color mixture analysis, image texturing, and neural network machine learning, will be helpful to improve the accuracy of grassland desertification assessment [55,78]. ...
Article
Full-text available
Grassland desertification has become one of the most serious environmental problems in the world. Grasslands are the focus of desertification research because of their ecological vulnerability. However, the application of vegetation indices to different grassland desertification grades remains limited. Therefore, in this study, 19 vegetation indices were calculated for 30 unmanned aerial vehicle (UAV) visible light images at five grades of grassland desertification in the Mu Us Sandy Land. Fractional Vegetation Coverage (FVC) with high accuracy was obtained through Support Vector Machine (SVM) classification, and the results were used as the reference values. Based on the FVC, grassland desertification was divided into five grades: severe (FVC < 5%), high (FVC: 5–20%), moderate (FVC: 21–50%), slight (FVC: 51–70%), and non-desertification (FVC: 71–100%). The accuracy of the vegetation indices was assessed by the overall accuracy (OA), the kappa coefficient (k), and the relative error (RE). Our results showed that the accuracy of SVM-supervised classification was high in assessing each grassland desertification grade. Excess Green Red Blue Difference Index (EGRBDI), Visible Band Modified Soil Adjusted Vegetation Index (V-MSAVI), Green Leaf Index (GLI), Color Index of Vegetation Extraction (CIVE), Red Green Blue Vegetation Index (RGBVI), and Excess Green (EXG) accurately assessed grassland desertification at severe, high, moderate, and slight grades. In addition, the Red Green Ratio Index (RGRI) and Combined 2 (COM2) were accurate in assessing severe desertification. The assessment of the 19 indices of the non-desertification grade had low accuracy.
This study emphasizes that the applicability of the vegetation indices varies with the degree of grassland desertification and hopes to provide scientific guidance for a more accurate grassland desertification assessment.
... Vegetation Index (VEG) = g/(r^a · b^(1-a)), with a = 0.667 (Hague, Tillett, & Wheeler, 2006); Woebbecke Index (WI) = (g-b)/|r-g| (Woebbecke et al., 1995). R, G, and B represent the DN variables from digital image red, green, and blue bands, respectively. ...
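The two rgb-based indices quoted in this context, Hague et al.'s VEG and the Woebbecke Index (WI), can be sketched on chromatic coordinates as follows (function names are illustrative assumptions; note WI is undefined where r = g):

```python
def chromatic(R, G, B):
    """Chromatic coordinates r, g, b from digital numbers."""
    total = R + G + B
    return R / total, G / total, B / total

def veg_index(R, G, B, a=0.667):
    """Hague et al. (2006): VEG = g / (r^a * b^(1 - a))."""
    r, g, b = chromatic(R, G, B)
    return g / (r ** a * b ** (1 - a))

def woebbecke_index(R, G, B):
    """Woebbecke et al. (1995): WI = (g - b) / |r - g|."""
    r, g, b = chromatic(R, G, B)
    if r == g:
        raise ZeroDivisionError("WI is undefined when r == g")
    return (g - b) / abs(r - g)

# Both indices respond strongly to a green-dominated pixel:
print(veg_index(40, 120, 30), woebbecke_index(40, 120, 30))
```

On a neutral gray pixel (r = g = b) VEG collapses to 1 and WI is undefined, which is why thresholding on these indices separates vegetation from achromatic soil backgrounds.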
Article
Full-text available
The estimation of yield parameters based on early data is helpful for agricultural policymakers and food security. Developments in unmanned aerial vehicle (UAV) platforms and sensor technology help to estimate yield efficiently. Previous studies have been based on few cultivars (<10) and ideal experimental environments, which are not representative of practical production. Therefore, the objective of this study was to estimate the yield parameters of soybean (Glycine max (L.) Merr.) under lodging conditions using RGB information. In this study, data at 17 time points throughout the soybean growing season in Nanchang, Jiangxi Province, China, were collected, and the vegetation index, texture information, canopy cover, and crop height were obtained by UAV-image processing. After that, partial least squares regression (PLSR), logistic regression (Logistic), random forest regression (RFR), support vector machine regression (SVM), and deep learning neural network (DNN) were used to estimate the yield parameters. The results can be summarized as follows: (1) The most suitable time point to estimate the yield was the flowering stage (48 days), when most of the soybean cultivars flowered. (2) Multiple data fusion improved the accuracy of estimating the yield parameters, and the texture information has high potential to contribute to yield estimation. (3) The DNN model showed the best accuracy on the training (R² = 0.66, rRMSE = 32.62%) and validation (R² = 0.50, rRMSE = 43.71%) datasets. In conclusion, these results provide insights into both the selection of the best estimation period and early yield estimation under lodging conditions when using remote sensing.
... This behavior was also observed at Petite Sirah and Cabernet Sauvignon vineyards in California, USA [28]. ExG is a VI highlighting a green band to separate vegetation from the background, and thus is sensitive to canopy greenness [36]. A good correlation between ExG and GWS was also found to be evident for grapevines faced with different irrigation treatments at a Vermentino vineyard in Italy [76]. ...
Article
Full-text available
Monitoring and management of grapevine water status (GWS) over the critical period between flowering and veraison plays a significant role in producing grapes of premium quality. Although unmanned aerial vehicles (UAVs) can provide efficient mapping across the entire vineyard, most commercial UAV-based multispectral sensors do not contain a shortwave infrared band, which makes the monitoring of GWS problematic. The goal of this study is to explore whether and which of the ancillary variables (vegetation characteristics, temporal trends, weather conditions, and soil/terrain data) may improve the accuracy of GWS estimation using multispectral UAV and provide insights into the contribution, in terms of direction and intensity, for each variable contributing to GWS variation. UAV-derived vegetation indices, slope, elevation, apparent electrical conductivity (ECa), weekly or daily weather parameters, and day of the year (DOY) were tested and regressed against stem water potential (Ψstem), measured by a pressure bomb, and used as a proxy for GWS using three machine learning algorithms (elastic net, random forest regression, and support vector regression). Shapley Additive exPlanations (SHAP) analysis was used to assess the relationship between selected variables and Ψstem. The results indicate that the root mean square error (RMSE) of the transformed chlorophyll absorption reflectance index-based model improved from 213 to 146 kPa when DOY and elevation were included as ancillary inputs. RMSE of the excess green index-based model improved from 221 to 138 kPa when DOY, elevation, slope, ECa, and daily average windspeed were included as ancillary inputs. The support vector regression best described the relationship between Ψstem and selected predictors. This study has provided proof of the concept for developing GWS estimation models that potentially enhance the monitoring capacities of UAVs for GWS, as well as providing individual GWS mapping at the vineyard scale. 
This may enable growers to improve irrigation management, leading to controlled vegetative growth and optimized berry quality.
... Visible atmospherically resistant index [87]; NDTI = (R − G)/(R + G), normalized difference turbidity index [88]; RGBVI = (G² − R·B)/(G² + R·B), red-green-blue vegetation index [89]; EXG = 2G − R − B, excess green [90]. In addition to the RGB indices, a series of texture indices (Table 3) including mean, variance, contrast, dissimilarity, entropy, homogeneity, and second-order moment were calculated using the glcm library in R. In addition, two Fourier-transform-based indicators were calculated (see Section 2.2.8). ...
Article
Full-text available
Monitoring tree decline in arid and semi-arid zones requires methods that can provide up-to-date and accurate information on the health status of the trees at single-tree and sample plot levels. Unmanned Aerial Vehicles (UAVs) are considered as cost-effective and efficient tools to study tree structure and health at small scale, on which detecting and delineating tree crowns is the first step to extracting varied subsequent information. However, one of the major challenges in broadleaved tree cover is still detecting and delineating tree crowns in images. The frequent dominance of coppice structure in degraded semi-arid vegetation exacerbates this problem. Here, we present a new method based on edge detection for delineating tree crowns based on the features of oak trees in semi-arid coppice structures. The decline severity in individual stands can be analyzed by extracting relevant information such as texture from the crown area. Although the method presented in this study is not fully automated, it returned high performances including an F-score = 0.91. Associating the texture indices calculated in the canopy area with the phenotypic decline index suggested higher correlations of the GLCM texture indices with tree decline at the tree level and hence a high potential to be used for subsequent remote-sensing-assisted tree decline studies.
Article
Full-text available
Vegetation indices provide information for various precision-agriculture practices, by providing quantitative data about crop growth and health. To provide a concise and up-to-date review of vegetation indices in precision agriculture, this study focused on the major vegetation indices with the criterion of their frequency in scientific papers indexed in the Web of Science Core Collection (WoSCC) since 2000. Based on the scientific papers with the topic of "precision agriculture" combined with "vegetation index", this study found that the United States and China are global leaders in total precision-agriculture research and the application of vegetation indices, while the analysis adjusted for the country area showed much more homogeneous global development of vegetation indices in precision agriculture. Among these studies, vegetation indices based on the multispectral sensor are much more frequently adopted in scientific studies than their low-cost alternatives based on the RGB sensor. The normalized difference vegetation index (NDVI) was determined as the dominant vegetation index, with a total of 2200 studies since the year 2000. With the existence of vegetation indices that improved the shortcomings of NDVI, such as enhanced vegetation index (EVI) and soil-adjusted vegetation index (SAVI), this study recognized their potential for enabling superior results to those of NDVI in future studies.
Thesis
Full-text available
In the current agricultural context, it is necessary to reduce the use of phytosanitary products against weeds. Site-specific weeding is a promising alternative for limiting costs and environmental impact. However, automatic weed localization is not an easy task, as it presents several scientific and technological challenges. The objective of this thesis is to propose image processing and artificial intelligence methods for weed localization in arable crops. In this framework, we addressed two problems: crop row detection and weed detection. Two methods were proposed for crop row detection. The first combines the Hough transform and the SLIC (Simple Linear Iterative Clustering) algorithm. The second uses an entirely new approach based on deep learning. These two methods were used to detect inter-row weeds and weeds in contact with the crop rows. To move toward better efficiency, two new, fully automatic machine learning weed detection methods were developed. The originality of these methods is that training is performed on automatically annotated data. The first method is based on deep learning, while the second generates models from deep features and a one-class classifier. The results obtained on real data show the value of the proposed approaches.
Article
Full-text available
Red palm weevil (RPW) is widely considered a key pest of palms, creating extensive damages to the date palm trunk that inevitably leads to palm death if no pest eradication is done. This study evaluates the potential of a remote sensing approach for the timely and reliable detection of RPW infestation on the palm canopy. For two consecutive years, an experimental field with infested and control palms was regularly monitored by an Unmanned Aerial Vehicle (UAV) carrying RGB, multispectral, and thermal sensors. Simultaneously, detailed visual observations of the RPW effects on the palms were made to assess the evolution of infestation from the initial stage until palm death. A UAV-based image processing chain for nondestructive RPW detection was built based on segmentation and vegetation index analysis techniques. These algorithms reveal the potential of thermal data to detect RPW infestation. Maximum temperature values and standard deviations within the palm crown revealed a significant (α = 0.05) difference between infested and non-infested palms at a severe infestation stage but before any visual canopy symptoms were noticed. Furthermore, this proof-of-concept study showed that the temporal monitoring of spectral vegetation index values could contribute to the detection of infested palms before canopy symptoms are visible. The seasonal significant (α = 0.05) increase of greenness index values, as observed in non-infested trees, could not be observed in infested palms. These findings are of added value for steering management practices and future related studies, but further validation of the results is needed. The workflow and resulting maps are accessible through the Mapeo® visualization platform.
Article
The Percentage of Vegetational Cover (PVC), an indicator used in industry to evaluate slope revegetation with herbaceous plants established by seeding works, is usually measured manually through visual inspection. Hence, the measurement results for the same subject might not be consistent. To overcome this situation, RGB image analysis has been introduced. In this study, the camera settings that minimize variation in the PVC measurement results and the impacts of natural sunlight were investigated. In conclusion, cameras set to minimum ISO with fixed white balance were preferable for measuring PVC when the equipment has program automatic exposure. The variation of the PVC measurement results under the conditions of this study was approximately ±2.5~7.3% (95% prediction intervals) under natural sunlight with different solar altitudes and/or weather. In addition, the standard deviation of the PVC measurement results calculated from photographs taken on a single day under constant weather was approximately ±1.8%.
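Once pixels have been classified as vegetation, PVC itself is a simple cover fraction; a hedged sketch (the boolean-mask representation and function name are assumptions, not the paper's implementation):

```python
def pvc(vegetation_mask):
    """Percentage of Vegetational Cover: the share of pixels classified
    as vegetation in a binary mask, expressed as a percentage.

    `vegetation_mask` is a flat list of booleans, one per image pixel.
    """
    if not vegetation_mask:
        raise ValueError("empty mask")
    return 100.0 * sum(vegetation_mask) / len(vegetation_mask)

# Three vegetation pixels out of four:
print(pvc([True, True, False, True]))  # 75.0
```

Because PVC depends entirely on the upstream classification, the camera settings studied in the paper (ISO, white balance) matter: they stabilize the RGB values that the segmentation step thresholds.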
Article
Full-text available
Accurately and economically estimated crop above-ground biomass (AGB) and bean yield (BY) are critical for cultivation management in precision agriculture. Unmanned aerial vehicle (UAV) platforms have shown great potential in crop AGB and BY estimation due to their ability to rapidly acquire remote sensing data with high temporal–spatial resolution. In this study, a low-cost and consumer-grade camera mounted on a UAV was adopted to acquire red–green–blue (RGB) images, which were then combined with ensemble learning to estimate faba bean AGB and BY. The following results were obtained: (1) The faba bean plant height derived from UAV RGB images presented a strong correlation with the ground measurement (R² = 0.84, RMSE = 63.6 mm). (2) The accuracy of BY estimation (R² = 0.784, RMSE = 0.460 t ha⁻¹, NRMSE = 14.973%) based on RGB images was higher than the accuracy of AGB estimation (R² = 0.618, RMSE = 0.606 t ha⁻¹, NRMSE = 16.746%). (3) The combination of three variables (vegetation index, structural information, textural information) improved the AGB and BY estimation accuracy. (4) The AGB and BY estimation performance were best for the mid bean-filling stage. (5) The ensemble learning model provided higher AGB and BY estimation accuracy than the five base learners (k-nearest neighbor, support vector machine, ridge regression, random forest and elastic net models). These results indicate that UAV RGB images combined with machine learning algorithms, particularly ensemble learning models, can provide relatively accurate faba bean AGB (R² = 0.683, RMSE = 0.568 t ha⁻¹, NRMSE = 15.684%) and BY (R² = 0.854, RMSE = 0.390 t ha⁻¹, NRMSE = 12.693%) estimation and considerably contribute to the high-throughput phenotyping study of food legumes.
Article
Remote sensing technology uses various vehicles, including satellites, helicopters, aircraft, and Unmanned Aerial Vehicles (UAVs) or drones. It is often used in agriculture, especially for monitoring rice fields, estimating the age of rice, and similar tasks. In the current technological era, drones are frequently used to monitor rice fields and are considered effective, given that the data obtained are current at the time of flight; for aerial image capture, drones are an alternative to approaches considered conventional. Rice is an important cultivated crop because it is a staple food for 90% of Indonesia's population, including the people of Papua in Merauke, a national food storage area. However, a frequent obstacle is damage from rice diseases. A fast and accurate analysis of rice plant health is therefore needed, using the Normalized Difference Vegetation Index (NDVI), a method for comparing the greenness level of vegetation in drone imagery; from the NDVI value, the health of rice plants can be classified. In this study, rice plant health was divided into 4 classes. Very good health lies in the NDVI range 0.721-0.92, good health in the range 0.421-0.72, normal health in the range 0.221-0.42, and poor health in the range 0.11-0.22. Using drone technology, rice plants can be analyzed per hectare: the normal health classification covered an area of 14,877,315 Ha, the good health classification 9,846,833 Ha, and the very good health classification 8,922,892 Ha.
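A hedged sketch of the NDVI computation and the study's four health classes (the function names and example band values are illustrative assumptions; the class thresholds are taken from the abstract):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectances."""
    return (nir - red) / (nir + red)

def rice_health_class(value):
    """Map an NDVI value to the study's four rice health classes."""
    if 0.721 <= value <= 0.92:
        return "very good"
    if 0.421 <= value <= 0.72:
        return "good"
    if 0.221 <= value <= 0.42:
        return "normal"
    if 0.11 <= value <= 0.22:
        return "poor"
    return "unclassified"  # outside the ranges reported in the abstract

# A healthy canopy reflects strongly in NIR and weakly in red:
print(rice_health_class(ndvi(0.60, 0.15)))  # good
```

Note the reported class boundaries leave small gaps (e.g. between 0.72 and 0.721), which the fallback branch makes explicit.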
Article
Full-text available
Farmers and ranchers depend on annual forage production for grassland livestock enterprises. Many regression and machine learning (ML) prediction models have been developed to understand the seasonal variability in grass and forage production, improve management practices, and adjust stocking rates. Moreover, decision support tools help farmers compare management practices and develop forecast scenarios. Although numerous individual studies on forage growth, modeling, prediction, economics, and related tools are available, these technologies have not been comprehensively reviewed. Therefore, a systematic literature review was performed to synthesize current knowledge, identify research gaps, and inform stakeholders. Input features (vegetation index [VI], climate, and soil parameters), models (regression and ML), relevant tools, and economic factors related to grass and forage production were analyzed. Among 85 peer-reviewed manuscripts selected, the Moderate Resolution Imaging Spectroradiometer (MODIS) for remote sensing satellite platforms and normalized difference vegetation index (NDVI), precipitation, and soil moisture for input features were most frequently used. Among ML models, the random forest model was the most widely used for estimating grass and forage yield. Four existing tools used inputs of precipitation, evapotranspiration, and NDVI for large spatial-scale prediction and monitoring of grass and forage dynamics. Most tools available for forage economic analysis were spreadsheet-based and focused on alfalfa. Available studies mostly used coarse spatial resolution satellites and VI or climate features for larger-scale yield prediction. Therefore, further studies should evaluate the use of high-resolution satellites; VI and climate features; advanced ML models; field-specific prediction tools; and interactive, user-friendly, web-based tools and smartphone applications in this field.
Article
Full-text available
The alpine grassland ecosystem accounts for 53% of the Qinghai–Tibet Plateau (QTP) area and is an important ecological protection barrier, but it is fragile and vulnerable to climate change. Therefore, continuous monitoring of grassland aboveground biomass (AGB) is necessary. Although many studies have mapped the spatial distribution of AGB for the QTP, the results vary widely due to the limited ground samples and mismatches with satellite pixel scales. This paper proposed a new algorithm using unmanned aerial vehicles (UAVs) as a bridge to estimate the grassland AGB on the QTP from 2000 to 2019. The innovations were as follows: (1) in terms of ground data acquisition, spatial-scale matching among the traditional ground samples, UAV photos, and MODIS pixels was considered. A total of 906 pairs between field-harvested AGB and UAV sub-photos and 2602 sets of MODIS pixel-scale UAV data (over 37,000 UAV photos) were collected during 2015–2019. Therefore, the ground validation samples were sufficient and scale-matched. (2) In terms of model construction, the traditional quadrat scale (0.25 m²) was successfully upscaled to the MODIS pixel scale (62,500 m²) based on the random forest and stepwise upscaling methods. Compared with previous studies, the scale matching of independent and dependent variables was achieved, effectively reducing the impact of spatial-scale mismatch. The results showed that the correlation between the AGB values estimated by UAV and MODIS vegetation indices was higher than that between field-measured AGB and MODIS vegetation indices at the MODIS pixel scale. The multi-year validation results showed that the constructed MODIS pixel-scale AGB estimation model had good robustness, with an average R² of 0.83 and RMSE of 34.13 g m⁻². Our dataset provides an important input parameter for a comprehensive understanding of the role of the QTP under global climate change.
The dataset is available from the National Tibetan Plateau/Third Pole Environment Data Center (https://doi.org/10.11888/Terre.tpdc.272587; H. Zhang et al., 2022).
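The upscaling step described above — a random forest mapping MODIS-pixel-scale vegetation indices to UAV-derived AGB — can be sketched roughly as below. All data here are synthetic stand-ins, and the two-index setup is an assumption for illustration only:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins: two MODIS-pixel-scale vegetation indices per pixel,
# and UAV-derived AGB (g/m^2) aggregated to the same 62,500 m^2 pixels.
X = rng.uniform(0.1, 0.9, size=(500, 2))
agb = 120 * X[:, 0] + 40 * X[:, 1] + rng.normal(0, 5, 500)

# Fit on 400 pixels, hold out 100 for a rough check of robustness.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:400], agb[:400])
pred = model.predict(X[400:])
```

In the paper's workflow the training targets are UAV-estimated AGB rather than sparse field quadrats, which is what makes the independent and dependent variables scale-matched.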
Article
Full-text available
Estimating consistent large-scale tropical forest height using remote sensing is essential for understanding forest-related carbon cycles. The Global Ecosystem Dynamics Investigation (GEDI) light detection and ranging (LiDAR) instrument employed on the International Space Station has collected unique vegetation structure data since April 2019. Our study shows the potential value of using remote-sensing (RS) data (i.e., optical Sentinel-2, radar Sentinel-1, and radar PALSAR-2) to extrapolate GEDI footprint-level forest canopy height model (CHM) measurements. We show that selected RS features can estimate vegetation heights with high precision by analyzing RS data, spaceborne GEDI LiDAR, and airborne LiDAR at four tropical forest sites in South America and Africa. We found that the GEDI relative height metric at the 98th percentile (RH98), filtered to full-power shots with a sensitivity greater than 98%, performed best. Among the 77 candidate features, the optical Sentinel-2 indices dominated over the radar features. We proposed nine essential optical Sentinel-2 features and the radar cross-polarization HV PALSAR-2 feature for CHM estimation. Using only these ten optimal indices for the regression problem avoids unimportant features and reduces the computational effort. The predicted CHM was compared to the available airborne LiDAR data, resulting in an error of around 5 m. Finally, we tested cross-validation error values between South America and Africa, and had to include around 40% of the validation data in training to obtain a similar performance. We recommend that GEDI data be extracted from all continents to maintain consistent performance on a global scale. Combining GEDI and RS data is a promising method to advance our capability in mapping CHM values.
Article
Full-text available
Corn (Zea mays L.) nitrogen (N) management requires monitoring plant N concentration (Nc) with remote sensing tools to improve N use, increasing both profitability and sustainability. This work aims to predict the corn Nc during the growing cycle from Sentinel-2 and Sentinel-1 (C-SAR) sensor data fusion. Eleven experiments using five fertilizer N rates (0, 60, 120, 180, and 240 kg N ha−1) were conducted in the Pampas region of Argentina. Plant samples were collected at four stages of the vegetative and reproductive periods. Vegetation indices were calculated with new combinations of spectral bands, C-SAR backscatters, and sensor data fusion derived from Sentinel-1 and Sentinel-2. Predictive models of Nc with the best fit (R2 = 0.91) were calibrated with spectral band combinations and sensor data fusion in six experiments. During validation of the models in five experiments, sensor data fusion predicted corn Nc with lower error (MAPE: 14%, RMSE: 0.31 %Nc) than spectral band combination (MAPE: 20%, RMSE: 0.44 %Nc). The red-edge (704, 740, and 783 nm) and short-wave infrared (1375 nm) bands and the VV backscatter were all necessary to monitor corn Nc. Thus, satellite remote sensing via sensor data fusion is a critical data source for predicting changes in plant N status.
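The validation metrics quoted above can be computed as follows; the function names are illustrative, and the MAPE definition assumed here is the conventional mean absolute percentage error:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error, in the units of y (here, %Nc)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)
```

Comparing the two error types is useful because RMSE is reported in %Nc units while MAPE is scale-free, which is why the abstract reports both.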
Article
Full-text available
Unoccupied aerial systems (UAS)-based high-throughput phenotyping studies require further investigation to combine different environments and planting times into one model. Here, 100 elite breeding hybrids of maize (Zea mays L.) were evaluated in two environment trials: one with optimal planting and irrigation (IHOT) and one dryland with delayed planting (DHOT). RGB (red-green-blue)-based canopy height measurements (CHM) and vegetation indices (VIs) were estimated from a UAS platform. Time series and cumulative VIs, by both summation (ΣVI-SUMs) and area under the curve (ΣVI-AUCs), were fit via machine learning regression modeling (random forest, linear, ridge, lasso, and elastic net regressions) to estimate grain yield. VIs were more valuable than CHM as predictors of yield when combining different environments. Time series VIs and CHM produced high accuracies (~68–72%) but inconsistent models. A small sacrifice in accuracy (~60–65%) produced consistent models using ΣVI-SUMs and CHM during pre-reproductive vegetative growth. Omitting VIs reduced accuracies by ~5–10%. Normalized-difference-type VIs produced maximum accuracies, and flowering times were the best times for UAS data acquisition. This study suggests that the best-yielding varieties can be accurately predicted in new environments at or before flowering when combining multiple temporal flights and predictors.
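The two cumulative-VI variants can be sketched as below. The exact definitions used in the paper are not given in the abstract, so this assumes the common ones: ΣVI-SUM as a plain sum over flights and ΣVI-AUC as the trapezoidal area under the VI-versus-time curve:

```python
import numpy as np

def cumulative_vi(days, vi):
    """days: days after planting for each UAS flight; vi: the VI value at each flight.
    Returns (ΣVI-SUM, ΣVI-AUC)."""
    days = np.asarray(days, float)
    vi = np.asarray(vi, float)
    vi_sum = float(vi.sum())  # plain summation over flights
    # trapezoidal area under the temporal curve, in VI * days
    vi_auc = float(np.sum((vi[1:] + vi[:-1]) / 2.0 * np.diff(days)))
    return vi_sum, vi_auc
```

Unlike the plain sum, the AUC variant is insensitive to uneven flight spacing, which matters when flight dates differ between environments.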
Article
Full-text available
Grasslands are one of the world’s largest ecosystems, accounting for 30% of total terrestrial biomass. Considering that aboveground biomass (AGB) is one of the most essential ecosystem services in grasslands, an accurate and faster method for estimating AGB is critical for managing, protecting, and promoting ecosystem sustainability. Unmanned aerial vehicles (UAVs) have emerged as a useful and practical tool for achieving this goal. Here, we review recent research studies that employ UAVs to estimate AGB in grassland ecosystems. We summarize different methods to establish a comprehensive workflow, from data collection in the field to data processing. For this purpose, 64 research articles were reviewed, focusing on several features including study site, grassland species composition, UAV platforms, flight parameters, sensors, field measurement, biomass indices, data processing, and analysis methods. The results demonstrate that there has been an increase in scientific research evaluating the use of UAVs in AGB estimation in grasslands during the period 2018–2022. Most of the studies were carried out in three countries (Germany, China, and USA), which indicates an urgent need for research in other locations where grassland ecosystems are abundant. We found RGB imaging was the most commonly used and is the most suitable for estimating AGB in grasslands at the moment, in terms of cost–benefit and data processing simplicity. In 50% of the studies, at least one vegetation index was used to estimate AGB; the Normalized Difference Vegetation Index (NDVI) was the most common. The most popular methods for data analysis were linear regression, partial least squares regression (PLSR), and random forest. Studies that used spectral and structural data showed that models incorporating both data types outperformed models utilizing only one. We also observed that research in this field has been limited both spatially and temporally. 
For example, only a small number of papers conducted studies over several years or in multiple places, so the transferability of the protocols to other locations and time points remains largely untested. Despite these limitations, and in the light of the rapid advances, we anticipate that UAV methods for AGB estimation in grasslands will continue improving and may become commercialized for farming applications in the near future.
Article
Full-text available
Reliable and quick estimation of wheat yellow rust (WYR) severity in the field is essential to manage the disease and minimize losses. Field experiments were conducted during 2017–18 and 2018–19 to obtain visible and thermal images of 24 wheat cultivars having different levels of WYR resistance at critical growth stages. Machine learning (ML) models were constructed using combinations of image indices (IN) and partial least squares regression (PLS) scores of image indices with disease severity (DS) and Yeo-Johnson (YJ)-transformed values of disease severity. The results revealed that the 26 visible and 2 thermal indices considered in this study have significant correlations with WYR. The models' performances were evaluated using four possible dataset combinations: (1) disease severity + indices, (2) disease severity + PLS scores of indices, (3) YJ-transformed disease severity + indices, and (4) YJ-transformed disease severity + PLS scores. Disease severity with image-derived indices was found to be the best dataset for predicting WYR severity using machine learning models, with an R² and d-index above 0.95 during calibration and up to 0.67 and 0.87, respectively, during validation. The Cubist model with the disease severity + indices dataset was the best predictor of WYR severity, while Gaussian process regression with the YJ-transformed disease severity + PLS scores dataset was the poorest. The results obtained in the present study showed the potential of ML models for non-destructive prediction of WYR in the field using visible and thermal imaging.
Article
Full-text available
Aboveground biomass (AGB) is an important basis for wheat yield formation. Timely AGB data are useful for monitoring wheat growth and developing high-yielding wheat populations. However, traditional AGB data acquisition relies on destructive sampling, which is difficult to reconcile with modern agriculture, and estimation from spectral data alone has low accuracy and cannot solve the problem of index saturation at later growth stages. In this study, an unmanned aerial vehicle (UAV) with an RGB camera and real-time kinematic (RTK) positioning was used to obtain imagery and elevation data simultaneously during the critical growth period of wheat. The cumulative-percentile and mean-value methods were then used to extract the wheat plant height (PH), and the color indices (CIs) and PH were combined to estimate the AGB of wheat using parametric and non-parametric models. The results showed that model accuracy improved with the addition of elevation data, and the most accurate multi-stage estimation model was PLSR (PH + CIs), with R², RMSE, and NRMSE of 0.81, 1248.48 kg/ha, and 21.77%, respectively. Compared to the parametric models, the non-parametric models incorporating PH and CIs greatly improved the prediction of AGB during critical growth periods in wheat. The inclusion of elevation data therefore greatly improves the accuracy of AGB prediction in wheat compared to traditional spectral prediction models. The fusion of UAV-based elevation data and image information provides a new technical tool for multi-season wheat AGB monitoring.
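The two plant-height extraction methods named above can be sketched as follows. The abstract does not state which percentile the authors used, so the 99th is an assumption chosen to suppress outlier returns:

```python
import numpy as np

def plant_height_percentile(plot_elev, ground_elev, q=99):
    """Cumulative-percentile method: the q-th percentile of canopy elevations
    within a plot minus the bare-ground elevation."""
    return float(np.percentile(np.asarray(plot_elev, float), q) - ground_elev)

def plant_height_mean(plot_elev, ground_elev):
    """Mean-value method: mean canopy elevation minus the ground elevation."""
    return float(np.mean(np.asarray(plot_elev, float)) - ground_elev)
```

A high percentile tracks the top of the canopy while rejecting a few spuriously high points; the mean-value method instead reflects average canopy level, so the two generally disagree and serve as alternative PH predictors.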
Article
Full-text available
Timely and efficient monitoring of crop aboveground biomass (AGB) and grain yield (GY) forecasting before harvesting are critical for improving crop yields and ensuring food security in precision agriculture. The purpose of this study is to explore the potential of fusing source–sink-level color, texture, and temperature values extracted from RGB images and thermal images based on proximal sensing technology to improve grain yield prediction. High-quality images of wheat from flowering to maturity under different treatments of nitrogen application were collected using proximal sensing technology over a 2-year trial. Numerous variables based on source and sink organs were extracted from the acquired subsample images, including 30 color features, 10 texture features, and two temperature values. The principal component analysis (PCA), least absolute shrinkage and selection operator (LASSO), and recursive feature elimination (RFE) were used to screen variables. Support vector regression (SVR) and random forest (RF) were applied to establish AGB estimation models, and the GY prediction models were built by RF. The source dataset and sink dataset performed differently on AGB and GY estimation, but the combined source–sink dataset performed best for estimating both AGB and GY. Based on the source–sink dataset, the LASSO-RF model was the best combination for predicting AGB and GY, with the coefficient of determination (R²) of 0.85 and 0.86, root mean square error (RMSE) of 1179.09 and 609.61 kg ha⁻¹, and the ratio of performance to deviation (RPD) of 2.10 and 2.45, respectively. This study demonstrates that the multivariate eigenvalues of both source and sink organs have the potential to predict wheat yield and that the combination of machine learning models and variable selection methods can significantly affect the accuracy of yield prediction models and achieve effective monitoring of crop growth at late reproductive stages.
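The best pipeline reported above, LASSO variable screening followed by a random forest (LASSO-RF), can be sketched as below. The data are synthetic, with the 42-feature layout (30 color + 10 texture + 2 temperature) mirroring the abstract; `LassoCV` stands in for whatever LASSO implementation the authors used:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.ensemble import RandomForestRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 42))  # 30 color + 10 texture + 2 temperature features
y = 3 * X[:, 0] - 2 * X[:, 5] + rng.normal(0, 0.3, 300)  # stand-in for AGB or GY

# Step 1: LASSO screens the variables; features with non-zero coefficients survive.
Xs = StandardScaler().fit_transform(X)
lasso = LassoCV(cv=5, random_state=0).fit(Xs, y)
keep = np.flatnonzero(lasso.coef_ != 0)

# Step 2: a random forest is fit on the retained subset only.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:, keep], y)
```

Screening before the forest removes uninformative variables so the trees split on relevant features, which is consistent with the abstract's finding that the variable selection method materially affects model accuracy.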
Article
Accurate estimation of disease severity in the field is key to minimizing field losses in agriculture. Existing disease severity assessment methods have poor accuracy under field conditions. To overcome this limitation, this study used thermal and visible imaging with machine learning (ML) and model combination (MC) techniques to estimate plant disease severity under field conditions. Field experiments were conducted during 2017–18, 2018–19, and 2021–22 to obtain RGB and thermal images of chickpea cultivars with different levels of wilt resistance grown in wilt-sick plots. ML models were constructed using four different datasets created from the wilt severity and image-derived indices. ML models were also combined using MC techniques to find the best predictor of disease severity. Results indicated that Cubist was the best ML algorithm, while the KNN model was the poorest predictor of chickpea wilt severity under field conditions. MC techniques improved the prediction accuracy of wilt severity over individual ML models; combining ML models using the least absolute deviation technique gave the best predictions. The results obtained in the present study showed that MC techniques coupled with ML models improved the prediction accuracies of plant disease severity under field conditions.
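The abstract does not spell out its least-absolute-deviation combination scheme, but a common reading is: find weights for the individual models' predictions that minimize the sum of absolute residuals. That is a linear program, sketched here with SciPy; the function name and LP reformulation are mine:

```python
import numpy as np
from scipy.optimize import linprog

def lad_combine(preds, y):
    """Least-absolute-deviation model combination: weights w minimizing
    sum_i |y_i - preds_i @ w|, via the standard LP reformulation with
    slack variables e_i >= |y_i - preds_i @ w|."""
    preds = np.asarray(preds, float)  # shape (n_samples, n_models)
    y = np.asarray(y, float)
    n, m = preds.shape
    # variables: [w (m, unbounded), e (n, >= 0)]; objective: minimize sum(e)
    c = np.concatenate([np.zeros(m), np.ones(n)])
    A = np.block([[preds, -np.eye(n)],
                  [-preds, -np.eye(n)]])
    b = np.concatenate([y, -y])
    bounds = [(None, None)] * m + [(0, None)] * n
    res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
    return res.x[:m]
```

Compared with least-squares stacking, the L1 objective is less sensitive to the occasional badly mispredicted plot, which may explain why this combiner performed best in the study.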