Since July 2017, I have been at Penn State University in the Department of Plant Pathology and Environmental Microbiology. Our program focuses on plant disease epidemiology and extension plant pathology for field crops.
Positions
- Associate Professor
- Epidemiology and Statistics Director, CIGRAS (2015 to 2017)
Research Items (124)
- Feb 2019
The tobacco whitefly Bemisia tabaci (Gennadius) cryptic species complex and the greenhouse whitefly Trialeurodes vaporariorum (Westwood) are extensively reported as destructive pests in vegetable crops worldwide. A survey was conducted in 2011 and 2012 to determine the occurrence and genetic diversity present in the populations of these whiteflies in the major vegetable production areas of Costa Rica. Insect samples were collected from sweet pepper (Capsicum annuum L.), tomato (Solanum lycopersicum L.), common bean (Phaseolus vulgaris L.) and weeds present in commercial crops under either open field or greenhouse conditions. PCR‐RFLP analysis of mitochondrial cytochrome c oxidase subunit 1 gene (mtCOI) sequences of 621 whitefly individuals confirmed the presence of the Mediterranean (MED) type of B. tabaci and of T. vaporariorum in most sampled regions. Individuals of the Middle East‐Asia Minor 1 (MEAM1) type of B. tabaci were also observed in low numbers. Contingency analyses based on type of crop, geographical region, whitefly species, year of collection and production system confirmed that T. vaporariorum was the most frequent species in vegetable production areas in Costa Rica, both in greenhouses and in open fields. B. tabaci MED is likely spreading to new areas of the country, whereas B. tabaci MEAM1 was mostly absent or rarely found. Comparisons of mtCOI sequences from B. tabaci individuals revealed the presence of four B. tabaci sequence haplotypes (named MED‐i, MED‐ii, MEAM1‐i, MEAM1‐xviii) in Costa Rica, three of them identical to B. tabaci haplotypes previously reported in the Western Hemisphere and other parts of the world. Analysis of sequences of T. vaporariorum individuals revealed a more complex population with the presence of 11 haplotypes, two of which were identical to T. vaporariorum sequences reported from other countries.
Crop pathogens and pests reduce the yield and quality of agricultural production. They cause substantial economic losses and reduce food security at household, national and global levels. Quantitative, standardized information on crop losses is difficult to compile and compare across crops, agroecosystems and regions. Here, we report on an expert-based assessment of crop health, and provide numerical estimates of yield losses on an individual pathogen and pest basis for five major crops globally and in food security hotspots. Our results document losses associated with 137 pathogens and pests of wheat, rice, maize, potato and soybean worldwide. Our yield loss (range) estimates at a global level and per hotspot for wheat (21.5% (10.1–28.1%)), rice (30.0% (24.6–40.9%)), maize (22.5% (19.5–41.1%)), potato (17.2% (8.1–21.0%)) and soybean (21.4% (11.0–32.4%)) suggest that the highest losses are associated with food-deficit regions with fast-growing populations, and frequently with emerging or re-emerging pests and diseases. Our assessment highlights differences in impacts among crop pathogens and pests and among food security hotspots. This analysis contributes critical information to prioritize crop health management to improve the sustainability of agroecosystems in delivering services to societies.
Anthesis is generally recommended as the optimum growth stage for applying a foliar fungicide to manage Fusarium head blight (FHB) and the Fusarium-associated toxin deoxynivalenol (DON) in wheat. However, because it is not always possible to treat fields at anthesis, studies were conducted to evaluate pre- and postanthesis treatment options for managing FHB and DON in spring and winter wheat. Network meta-analytical models were fitted to data from 19 years of fungicide trials, and the log response ratio (L) and approximate percent control (C) relative to a nontreated check were estimated as measures of the effects of six treatments on FHB index (IND: mean percentage of diseased spikelets per spike) and DON. The evaluated treatments consisted of either Caramba (metconazole) applied early (at heading [CE]), at anthesis (CA), or late (5 to 7 days after anthesis; CL), or Prosaro (prothioconazole + tebuconazole) applied at the same three times and referred to as PE, PA, and PL, respectively. All treatments reduced mean IND and DON relative to the nontreated check, but the magnitude of the effect varied with timing and wheat type. CA and PA resulted in the highest C values for IND, 52.2 and 51.5%, respectively, compared with 45.9% for CL, 41.3% for PL, and less than 33% for CE and PE. Anthesis and postanthesis treatments reduced mean IND by 14.9 to 29.7% relative to preanthesis treatments. The estimated effect size was also statistically significant for comparisons between CA and CL and between PA and PL; CA reduced IND by 11.7% relative to CL, whereas PA reduced the disease by 17.4% relative to PL. Differences in efficacy against IND between pairs of prothioconazole + tebuconazole and metconazole treatments applied at the same timing (CE versus PE, CA versus PA, and CL versus PL) were not statistically significant. However, CA and CL outperformed PA and PL by 7 and 12.8%, respectively, in terms of efficacy against DON.
All application programs had comparable efficacy against IND between spring and winter wheat types, but efficacy against DON was 10 to 16% greater for spring than winter wheat for applications made at or after anthesis. All programs led to an increase in mean grain yield and test weight relative to the nontreated check.
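The approximate percent control used in these analyses is a back-transformation of the log response ratio; as a minimal sketch (the symbol names and the example value are mine, not the authors'):

```python
import math

def percent_control(log_ratio):
    """Approximate percent control relative to the nontreated check,
    given the mean log response ratio of treated to nontreated disease
    (negative values indicate the treatment reduced disease)."""
    return (1.0 - math.exp(log_ratio)) * 100.0

# A mean log response ratio of about -0.74 corresponds to the ~52%
# control reported for the anthesis treatments:
print(round(percent_control(-0.74), 1))  # prints 52.3
```

A log response ratio of 0 corresponds to 0% control, i.e., no difference from the check.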
Field trials were conducted in 17 U.S. states to evaluate the effects of quinone outside inhibitor (QoI) and demethylation inhibitor (DMI) fungicide programs on Fusarium head blight index (IND) and deoxynivalenol (DON) toxin in wheat. Four DMI-only treatments applied at Feekes 10.5.1, five QoI-only treatments applied between Feekes 9 or Feekes 10.5, three QoI+DMI mixtures applied at Feekes 10.5, and three treatments consisting of a QoI at Feekes 9 followed by a DMI at Feekes 10.5.1 were evaluated. Network meta-analytical models were fitted to log-transformed mean IND and DON data and estimated contrasts of log means were used to obtain estimates of mean percent controls relative to the nontreated check as measures of efficacy. Results from the meta-analyses were also used to assess the risk of DON increase in future trials. DMI-only programs applied at Feekes 10.5.1 were the most effective against IND and DON and the least likely to increase DON in future trials. QoI-only programs increased mean DON over the nontreated checks and were the most likely to do so in future trials, particularly when applied at Feekes 10.5. The effects of QoI+DMI combinations depended on the active ingredients and whether the two were applied as a mixture at heading or sequentially. Following a Feekes 9 QoI application with a Feekes 10.5.1 application of a DMI reduced the negative effect of the QoI on DON but was not sufficient to achieve the efficacy of the Feekes 10.5.1 DMI-only treatments. Our results suggest that caution is warranted when using QoI treatments under moderate to high risk of FHB, particularly where the QoI is used without an effective DMI applied in combination or in sequence.
Foliar fungicide use in hybrid maize in the United States was rare before 2000. The decade from 2000 to 2010 saw foliar fungicides increasingly applied to maize in the absence of appreciable disease pressure, a practice seemingly at odds with integrated pest management philosophy. Yet, it is commonly believed that growers do not employ management strategies unless there are perceived benefits. Maize (corn) growers (CGs) and certified crop advisors (CCAs) across four Midwestern states (Iowa, Illinois, Ohio and Wisconsin) were surveyed to better understand their practices, values and perceptions concerning the use of foliar fungicides during 2005 to 2009. The survey results demonstrated the rapid rise in maize foliar fungicide applications from 2000 through 2008, with 84% of CGs who sprayed having used a foliar fungicide in maize production for the very first time during 2005 to 2009. During 2005 to 2009, 73% of CCAs had recommended using a foliar fungicide, but only 35% of CGs sprayed. Perceived yield gains, conditional on having sprayed, were above the break-even point on average. However, negative yield responses were also observed by almost half of CCAs and a quarter of CGs. Hybrid disease resistance was a more important factor to economically successful maize production than foliar fungicides. Diseases as a yield-limiting factor were more important to CGs than CCAs. As a group, CGs were not as embracing of foliar fungicide as were CCAs, and remained more conservative about the perceived benefits to yield.
This article addresses the modelling of crop health and its impact on crop losses, with a special emphasis on plant diseases. Plant disease epidemiological models take many different shapes. We propose a summary of modelling structures for plant disease epidemics, which stem from the concepts of infection rate, of site, of basic infection rate corrected for removals (Rc), and of basic reproductive number (R0). Crop losses, the quantitative and qualitative impacts of diseases and pests on crop performances, can be expressed along many different dimensions. We focus on yield loss, defined as the difference between the attainable yield and the actual yield, in a production situation. The modelling of yield loss stems from the concept of damage mechanism, which can be applied to the wide range of organisms (including pathogens, weeds, arthropods, or nematodes) that may negatively affect crop growth and performances. Damage mechanisms are incorporated in crop growth models to simulate yield losses. In both fields, epidemiology and crop loss, we discuss the process of model development, including model simplification. We emphasize model simplification as a main avenue towards model genericity. This is especially relevant to enable addressing the diversity of crop pathogens and pests. We also discuss the usefulness of considering differing evaluation criteria depending on the stage of model development, and thus, depending on modelling objectives. We illustrate progress made on two global key crops where model simplification has been critical: rice and wheat. Modelling pests and diseases, and the yield losses they cause, on these two crops leads us to propose the concept of crop health syndrome as a set of injury functions, each representing the dynamics of an injury (such as, for example, the time-course of an epidemic). Crop health in a given context can be represented by the set of such injury functions, which in turn can be used as drivers for crop loss models.
Project - Epidemiology of Black Sigatoka in Banana
The current focus of this project is on the scalability of using biological control as part of the overall management of Black Sigatoka. Recently, the PhD proposal of M.Sc. Patrick Becker was approved by the University of Costa Rica's Agricultural Sciences degree program. Our goal is to determine the biological and chemical factors that drive the potential for biological control and how these need to be considered as we scale up from greenhouse to small-scale field, and finally to large-scale application.
- Jan 2018
Genome Wide Association Studies (GWAS) allow the use of natural variation to understand the genetics controlling specific traits. Efficient methods to conduct GWAS in plants have been reported. This chapter provides the main steps to conduct and analyse GWAS in Arabidopsis thaliana using polyamine levels as trait. This approach is suitable for the discovery of genes that modulate the levels of polyamines, and can be used in combination with different types of stress.
- Oct 2017
Begomoviruses (genus Begomovirus, family Geminiviridae) have emerged as important plant pathogens in tropical and subtropical regions worldwide. Although these viruses were reported during the 1970s in Costa Rica, they are still poorly known. Therefore, the objective of this study was to analyse the diversity and distribution of begomoviruses in commercial tomato and sweet pepper fields from different agricultural production systems of the major growing regions of Costa Rica. A total of 651 plants were randomly sampled from greenhouses and open field crops during 2011 and 2012 in three different geographical locations. The bipartite begomoviruses Tomato yellow mottle virus, Tomato leaf curl Sinaloa virus and Pepper golden mosaic virus, and the monopartite begomovirus Tomato yellow leaf curl virus were detected in the collected samples. The complete genome of isolates from each species was cloned and sequenced. The frequency of detection of these four begomoviruses in the analysed samples ranged from 0 to 9%; the presence and the prevalent virus varied largely according to the geographical location, the host (tomato and pepper), and the production system (greenhouses or open fields). An association between geographical region and begomovirus species was observed, suggesting that in Costa Rica the heterogeneity in climate, topography and agricultural systems might influence the distribution of begomovirus species in the country. A broader survey is needed to confirm this, although these preliminary results may contribute to the management of begomoviruses in Costa Rica.
- Sep 2017
In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 − β, where β is the probability of a Type II error, and depends on the size of the treatment effect (termed the effect size), variance, sample size, and the significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses.
Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even indicate that some studies should not be undertaken. Moreover, adequately powered studies are less prone to erroneous conclusions and inflated estimates of treatment effectiveness, especially when effect sizes are small.
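As an illustration of these quantities (a sketch under a two-sample normal approximation, not a procedure prescribed by the text), power can be computed from the effect size, variance, sample size, and α with only the standard library:

```python
from statistics import NormalDist

def power_two_sample(delta, sigma, n, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test with n
    observations per group, true mean difference delta, and common
    standard deviation sigma. The lower rejection tail is ignored,
    which is negligible for delta > 0."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    noncentrality = delta / (sigma * (2 / n) ** 0.5)
    return z.cdf(noncentrality - z_crit)

def n_per_group(delta, sigma, target=0.8, alpha=0.05):
    """Smallest per-group sample size achieving the target power."""
    n = 2
    while power_two_sample(delta, sigma, n, alpha) < target:
        n += 1
    return n
```

Under these assumptions, detecting a one-standard-deviation difference at α = 0.05 with power 0.8 requires about 16 observations per group, and halving the effect size roughly quadruples the required sample size.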
- Jun 2017
Scenario analysis constitutes a useful approach to synthesize knowledge and derive hypotheses in the case of complex systems which are documented with mainly qualitative or very diverse information. In this article, a framework for scenario analysis is designed and then, applied to global wheat health within a timeframe from today to 2050. Scenario analysis entails the choice of settings, the definition of scenarios of change, and the analysis of outcomes of these scenarios in the chosen settings. Three idealized agrosystems, representing a large fraction of the global diversity of wheat-based agrosystems, are considered, which represent the settings of the analysis. Several components of global changes are considered in their consequences on global wheat health: climate change and climate variability, nitrogen fertilizer use, tillage, crop rotation, pesticide use, and the deployment of host plant resistances. Each idealized agrosystem is associated with a scenario of change that considers first, a production situation and its dynamics, and second, the impacts of the evolving production situation on the evolution of crop health. Crop health is represented by six functional groups of wheat pathogens: the pathogens associated with fusarium head blight; biotrophic fungi; septoria-like fungi; necrotrophic fungi; soil borne pathogens; and insect-transmitted viruses. The analysis of scenario outcomes is conducted along a risk-analytical pattern, which involves risk probabilities represented by categorized probability levels of disease epidemics, and risk magnitudes represented by categorized levels of crop losses resulting from these levels of epidemics within each production situation. The results from this scenario analysis suggest an overall increase of risk probabilities and magnitudes in the three idealized agrosystems. Changes in risk probability or magnitude however vary with the agrosystem and the functional groups of pathogens. 
We discuss the effects of global changes on the six functional groups, in terms of their epidemiology and of the crop losses they cause. Scenario analysis enables qualitative analysis of complex systems, such as plant pathosystems that are evolving in response to global changes, including climate change and technology shifts. It also provides a useful framework for quantitative simulation modeling analysis for plant disease epidemiology.
- Mar 2017
The literature on the importance of plant pathogens sometimes emphasizes their possible role in historical food shortages and even in famines. Aside from such major crises, plant pathogens should also be seen as important reducers of crop performances, with impacts on system sustainability, from the ecological, agronomical, social, and economic standpoints – all of which ultimately affect food security. These views need reconciliation in order to produce a clearer picture of the multidimensional effects of plant disease epidemics. Such a picture is needed for disease management today, but would also be useful for future policies. This article attempts to develop a framework that would enable assessment of the impacts of plant diseases, referred to collectively as crop health, on food security via its components. We have combined three different existing definitions of food security in order to develop a framework consisting of the following six components: (1) Availability. Primary production; (2) Availability. Import - Stockpiles; (3) Access. Physical and supply chain; (4) Access. Economic; (5) Stability of food availability; (6) Utility-Safety-Quality-Nutritive value. In this framework, components of food security are combined with three attributes of production situations: the nature of the considered crop (i.e. food- or non-food), the structure of farms (i.e. subsistence or commercial), and the structure of markets (i.e. weakly organized and local, to strongly organized and globalized). The resulting matrix: [Food security components] × [Attributes of production situations] provides a framework where the impacts of chronic, acute, and emerging plant disease epidemics on food security can be examined. We propose that, given the number of components and interactions at play, a systems modelling approach is required to address the functioning of food systems exposed to plant disease risks.
This approach would have application in both the management of the current attrition of crop performances by plant diseases, and also of possible disease-induced shocks. Such an approach would also enable quantifying shifts in disease vulnerability of production situations, and therefore, of food systems, as a result of climate change, globalization, and evolving crop health.
Annual decreases in soybean (Glycine max L. Merrill) yield caused by diseases were estimated by surveying university-affiliated plant pathologists in 28 soybean-producing states in the United States and in Ontario, Canada, from 2010 through 2014. Estimated yield losses from each disease varied greatly by state or province and year. Over the duration of this survey, soybean cyst nematode (SCN) (Heterodera glycines Ichinohe) was estimated to have caused more than twice as much yield loss as any other disease. Seedling diseases (caused by various pathogens), charcoal rot (caused by Macrophomina phaseolina (Tassi) Goid), and sudden death syndrome (SDS) (caused by Fusarium virguliforme O'Donnell & T. Aoki) caused the next greatest estimated yield losses, in descending order. The estimated mean economic loss due to all soybean diseases, averaged across U.S. states and Ontario from 2010 to 2014, was $60.66 USD per acre. Results from this survey will provide scientists, breeders, governments, and educators with soybean yield-loss estimates to help inform and prioritize research, policy, and educational efforts in soybean pathology and disease management.
- Dec 2016
We witness unprecedented changes in Earth’s natural and man-made ecosystems. These changes are generated by a number of drivers – human population growth, global trade, climate change, and technology shifts – which are often linked in their evolutions, and compounded in their impacts (Vitousek et al., 1986; Rosen, 2000). Importantly, these drivers of change are not linear in their dynamics, evolve at different speeds, and act upon the diversity of the biosphere through different mechanisms, resulting in heterogeneous impacts (Cassman et al., 2005; Garrett et al., 2011). Agriculture may be seen both as a recipient of these effects as well as a cause of changes in the biosphere. As with any of the interactions at play in an ecosystem, plant disease displays responses to these changes.
Soybean (Glycine max (L.) Merr.) is produced across a vast swath of North America, with the greatest concentration in the Midwest. Root rot diseases and damping-off are a major concern for production, and the primary causal agents include oomycetes and fungi. In this study, we focused on examination of oomycete species distribution in this soybean production system and how environmental and soil (edaphic) factors correlate to oomycete community composition at early plant growth stages. Using a culture-based approach, a total of 3,418 oomycete isolates were collected from 11 major soybean producing states and most were identified to genus and species using the ITS region of the rDNA. Pythium was the predominant genus isolated and investigated in this study. An ecology approach was taken to understand the diversity and distribution of oomycete species across geographical locations of soybean production. Metadata associated with field sample locations were collected using geographical information systems (GIS). Operational taxonomic units (OTUs) were used in this study to investigate diversity by location, with OTUs being defined as isolate sequences with 97% identity to one another. The mean number of OTUs ranged from 2.5 to 14 per field at the state level. Most OTUs in this study, classified as Pythium clades, were present in each field in every state, but major differences were observed in the relative abundance of each clade, which resulted in clustering of states in close proximity. Since community composition (presence/absence) was similar but OTU abundance differed by state, the ordination analysis did not show strong patterns of aggregation. Incorporation of 37 environmental and edaphic factors using vector fitting and Mantel tests identified 15 factors that correlated with community composition in this survey.
Further investigation using redundancy analysis (RDA) identified latitude, longitude, precipitation and temperature as factors that contribute to the variability observed in community composition. Soil parameters such as clay content and electrical conductivity also affected the distribution of oomycete species. The present study suggests that oomycete species composition across geographical locations of soybean production is affected by a combination of environmental and edaphic conditions. This knowledge provides the basis to understand the ecology and distribution of oomycete species, especially those able to cause diseases in soybean, providing cues to develop management strategies.
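The pattern described here — shared OTU membership but state-level differences in relative abundance — is exactly what abundance-based dissimilarities such as Bray–Curtis capture and presence/absence measures miss. A minimal sketch with hypothetical counts (the field vectors are illustrative, not survey data):

```python
def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance vectors
    (0 = identical composition, 1 = no shared taxa)."""
    numerator = sum(abs(a - b) for a, b in zip(x, y))
    denominator = sum(a + b for a, b in zip(x, y))
    return numerator / denominator if denominator else 0.0

# Hypothetical OTU counts for three fields that share the same OTUs
# but differ in relative abundance:
field_a = [40, 30, 20, 10]
field_b = [38, 32, 18, 12]
field_c = [5, 10, 25, 60]

print(bray_curtis(field_a, field_b))  # low: similar abundance profiles
print(bray_curtis(field_a, field_c))  # high: same OTUs, shifted abundances
```

A presence/absence measure would score all three fields as identical, whereas Bray–Curtis separates field_c from the other two, mirroring the state-level clustering reported above.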
Oomycete pathogens are commonly associated with soybean root rot, and have been estimated to reduce soybean yields in the United States by 1.5 million tons on an annual basis. Limited information exists regarding the frequency and diversity of oomycete species across the major soybean producing regions in North America. A survey was conducted across 11 major soybean producing states in the U.S. and the province of Ontario, Canada. In 2011, 2,378 oomycete cultures were isolated from soybean seedling roots on a semi-selective medium (CMA-PARPB) and identified by sequencing of the ITS region of rDNA. Sequence results distinguished a total of 51 Pythium, 3 Phytophthora, 3 Phytopythium and 1 Aphanomyces spp. in 2011, with Py. sylvaticum (16%) and Py. oopapillum (13%) being the most prevalent. In 2012, the survey was repeated, but due to drought conditions across the sampling area, fewer total isolates (n = 1,038) were collected. Additionally, in 2012, a second semi-selective medium (V8-RPBH) was included, which increased the Phytophthora spp. isolated from 0.7% to 7% of the total isolates. In 2012, 54 Pythium, 7 Phytophthora, 6 Phytopythium and 1 Pythiogeton sp. were recovered, with Py. sylvaticum (14%) and Py. heterothallicum (12%) being recovered most frequently. Pathogenicity and virulence were evaluated with representative isolates of each of the 84 species on soybean cv. 'Sloan'. A seed rot assay identified 13 and 11 pathogenic species at 13°C and 20°C, respectively. A seedling root assay conducted at 20°C identified 43 species as pathogenic, having a significantly detrimental effect on the seedling roots as compared to the non-inoculated control. Fifteen species were pathogenic in both the seed and seedling assays. This study provides a comprehensive characterization of oomycete species present in soybean seedling roots in the major production areas in the U.S. and Ontario, Canada, and provides a basis for disease management and breeding programs.
Question - Can anyone recommend the best free software alternative to sigma scan Pro for image analysis of disease symptoms on leaves?
I would echo Peter's thoughts on ImageJ. My colleagues here at the University have had good success using this for different needs related to processing images with different severities, and they linked it with R to be able to process the results in real-time. We are currently working to build directly into this system other calculations and analyses. Complex, yes, but very useful.
- Sep 2015
Foliar fungicide use in the U.S. Corn Belt increased in the last decade; however, questions persist pertaining to its value and sustainability. Multi-state field trials were established from 2010 to 2012 in Illinois, Iowa, Ohio and Wisconsin to examine how hybrid and foliar fungicide influenced disease intensity and yield. The experimental design was a split-split plot with main plots consisting of hybrids varying in resistance to gray leaf spot (caused by Cercospora zeae-maydis) and northern corn leaf blight (caused by Setosphaeria turcica), sub-plots corresponding to four application timings of the fungicide pyraclostrobin, and sub-sub plots represented by inoculations with C. zeae-maydis, S. turcica or both at two vegetative growth stages. Fungicide application (VT/R1) significantly reduced total disease severity (TDS) relative to the control in 5 of 8 site-years (P < 0.05). Disease was reduced by approximately 30% at Wisconsin in 2011, 20% at Illinois in 2010, 29% at Iowa in 2010, and 32% and 30% at Ohio in 2010 and 2012, respectively. These disease severities ranged from 0.2 to 0.3% in Wisconsin in 2011 to 16.7 to 22.1% in Illinois in 2010. The untreated control had significantly lower yield (P < 0.05) than the fungicide-treated plots in three site-years. Fungicide application increased yield by approximately 6% at Ohio in 2010, 5% at Wisconsin in 2010 and 6% in 2011. Yields ranged from 8,403 to 8,890 kg/ha in Wisconsin in 2011 to 11,362 to 11,919 kg/ha in Wisconsin in 2010. Results suggest susceptibility to disease and the prevailing environment are important drivers of the observed differences. Yield increases attributable to physiological plant health benefits under low disease pressure were not consistent.
The P value (significance level) is possibly the most widely used, and also misused, quantity in data analysis. P has been heavily criticized on philosophical and theoretical grounds, especially from a Bayesian perspective. In contrast, a properly interpreted P has been strongly defended as a measure of evidence against the null hypothesis, H0. We discuss the meaning of P and null-hypothesis statistical testing, and present some key arguments concerning their use. P is the probability of observing data as extreme as, or more extreme than, the data actually observed, conditional on H0 being true. However, P is often mistakenly equated with the posterior probability that H0 is true conditional on the data, which can lead to exaggerated claims about the effect of a treatment, experimental factor or interaction. Fortunately, a lower bound for the posterior probability of H0 can be approximated using P and the prior probability that H0 is true. When one is completely uncertain about the truth of H0 before an experiment (i.e., when the prior probability of H0 is 0.5), the posterior probability of H0 is much higher than P, which means that one needs P values lower than typically accepted for statistical significance (e.g., P = 0.05) for strong evidence against H0. When properly interpreted, we support the continued use of P as one component of a data analysis that emphasizes data visualization and estimation of effect sizes (treatment effects).
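A commonly used version of that lower bound (I am assuming here the −e·p·ln(p) calibration of Sellke, Bayarri and Berger, which the text does not name explicitly) can be computed directly for a prior probability of 0.5 on H0:

```python
import math

def posterior_h0_lower_bound(p):
    """Approximate lower bound on P(H0 | data) for a two-sided P value p,
    assuming a prior probability of 0.5 on H0 (the -e*p*ln(p) bound of
    Sellke, Bayarri and Berger; valid for 0 < p < 1/e)."""
    if not 0 < p < 1 / math.e:
        raise ValueError("bound applies for 0 < p < 1/e")
    bayes_factor_bound = -math.e * p * math.log(p)  # bound on BF for H0
    return 1 / (1 + 1 / bayes_factor_bound)

# A "significant" P of 0.05 still leaves H0 with a posterior
# probability of at least roughly 0.29:
print(round(posterior_h0_lower_bound(0.05), 2))  # prints 0.29
```

This is why the abstract argues that P values well below 0.05 are needed for strong evidence against H0 when the prior is an even 50:50.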
Standard foliar fungicide applications in wheat are usually made between flag leaf emergence (Feekes [FK] 8) and heading (FK10.5) to minimize damage to the flag leaf. However, over the last few years, new fungicide programs such as applications prior to FK8 and split half-rate applications have been implemented, although there are few data pertaining to the efficacy of these programs. Eight experiments were conducted in Illinois, Indiana, Ohio, and Wisconsin from 2010 to 2012 to compare new programs to standard FK8 and FK10 programs in terms of disease control and yield response. The programs evaluated consisted of single full-rate applications of 19% tebuconazole + 19% prothioconazole (Prosaro) or 23.6% pyraclostrobin (Headline) at FK5 (pseudostem strongly erected), FK8, or FK10, or split half rates at FK5 and FK8 (FK5+8), plus an untreated check (CK). Leaf blotch (LB) severity and yield data were collected, and random-effects meta-analytical models were fitted to estimate the overall log odds ratio of disease reaching the flag leaf (L) and the mean yield increase (D) for each fungicide program relative to CK. For all programs, L was significantly different from zero (P < 0.05). Based on estimated odds ratios (OR = exp[L]), the two FK8 programs reduced the risk of LB reaching the flag leaf by 55 and 75%, compared with 62 and 69% and 67 and 70% for the two FK10 and FK5+8 programs, respectively, and only 32 and 37% for the two FK5 programs. D was significantly different from zero (P ≤ 0.003) for all FK8, FK10, and FK5+8 programs, with values of 233 and 245, 175 and 220, and 175 and 187 kg ha−1 for the FK10, FK5+8, and FK8 programs, respectively. Differences in mean yield response between Headline and Prosaro were not statistically significant (P > 0.05). The probability of profitability was estimated for each program for a range of grain prices and fungicide application costs.
All FK8, FK10, and FK5+8 programs had more than an 80% chance of resulting in a positive yield response, compared with 63 and 67% for the two FK5 programs. The chance of obtaining a yield increase of 200 kg ha−1, required to offset an application cost of $36 ha−1 at a grain price of $0.18 kg−1, ranged from 44 to 60% for FK8, FK10 and FK5+8 programs compared with 22 and 25% for the two FK5 programs. These findings could be used to help inform fungicide application decisions for LB diseases in soft red winter wheat.
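The risk reductions and break-even yield reported above follow from simple conversions of the estimated log odds ratio and the economics of an application. A quick sketch; the function names are mine, and the numbers come from the abstract:

```python
import math

def risk_reduction_pct(log_odds_ratio):
    """Percent reduction in the odds of disease reaching the flag leaf,
    relative to the untreated check, from an estimated log odds ratio."""
    return (1 - math.exp(log_odds_ratio)) * 100

def breakeven_yield_gain(app_cost, grain_price):
    """Yield increase (kg/ha) needed to offset a fungicide application cost."""
    return app_cost / grain_price

# A log odds ratio of about -1.39 corresponds to a ~75% reduction in the
# odds of leaf blotch reaching the flag leaf.
print(round(risk_reduction_pct(-1.39)))  # → 75

# At a $36/ha application cost and a $0.18/kg grain price, the break-even
# yield response is the 200 kg/ha used in the study.
print(breakeven_yield_gain(36, 0.18))  # → 200.0
```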
Sudden death syndrome (SDS), caused by Fusarium virguliforme, is an important yield-limiting disease of soybean. Glyphosate is used to control weeds in soybean; however, its effect on SDS is not clearly understood. The objective of this study was to examine the impact of glyphosate on SDS, yield, and plant nutrition under field conditions. Fourteen field experiments were conducted in Iowa, Illinois, Indiana, Michigan, Wisconsin, and Ontario, Canada during 2011 to 2013. The experiment consisted of six treatment combinations of glyphosate and herbicides not containing glyphosate. Disease index was significantly different across the location–years, ranging from 0 to 65. The highest disease was noted in locations with irrigation, indicating that high soil moisture favors development of SDS. There were no effects of herbicide treatments or interactions on disease. The foliar disease index among the treatments over all years ranged from 9 to 13. Glyphosate treatments also tended to yield more than herbicide treatments not containing glyphosate. There were no interactions between glyphosate treatments and total manganese in plant tissue. The interaction of glyphosate with other nutrients in plant tissue was inconclusive. This 14 location–year study demonstrated that glyphosate application did not increase SDS severity or adversely affect soybean yield under field conditions.
Phakopsora meibomiae (Arthur) Arthur has been reported to occur in several legume species in the tropical regions of Central and South America. In Costa Rica, this pathogen was initially reported as P. pachyrhizi Sydow (1); however, to our knowledge, P. pachyrhizi has not been detected in Costa Rica. In routine evaluations of a 0.2-ha field planted with soybean (Glycine max (L.) Merr var. CIGRAS 06) in La Garita, Alajuela, Costa Rica, symptoms similar to Asian soybean rust were observed in December 2012 and January 2013. Soybean plants were at growth stages R4 to R5 when these symptoms were observed, which included yellow spots on leaves with brown spots on the abaxial surface. Further evaluations at growth stage R5 to R6 indicated that the spots had coalesced, turned grayish-brown, and caused substantial defoliation. Microscopic examination of symptomatic leaves showed the presence of uredinia and urediniospores on the lower surface of the leaf. While initial symptoms were on the southern side of the field, a substantial area of the field was infected at the second evaluation. Infected leaves were submitted to the USDA-ARS Foreign Disease-Weed Science Research Unit under the appropriate USDA-Animal Plant Health Inspection Service permit for molecular characterization and identification. Urediniospores were collected by washing infected leaves with sterile water and then pelleted by centrifugation. DNA was extracted from urediniospore pellets and excised leaf pieces using a DNeasy Plant Mini Kit (Qiagen, Germantown, MD), and eight samples were amplified in real-time polymerase chain reaction (PCR) with P. pachyrhizi-specific primers Ppm1 and Ppa2 but not with the P. meibomiae-specific primers Ppm1 and Pme2 (2). Nucleotide sequence alignment of the internal transcribed spacer (ITS) regions 1 and 2 that were amplified by PCR using the primers Ppa1 and Ppa2 further confirmed the identification as P. pachyrhizi. 
To the best of our knowledge, this is the first known confirmation of soybean rust, caused by P. pachyrhizi, in Costa Rica. CIGRAS-06 is the only soybean variety bred in the country as well as one of the very few varieties available to growers. Given that breeding for disease resistance is not a short-term option for P. pachyrhizi, alternative disease management strategies will have to be developed.
Corn (Zea mays L.)–soybean [Glycine max (L.) Merr.] cropping systems of the Midwest have led to increased selection pressure on diseases caused by Fusarium pathogens. A field experiment was conducted from 2010 to 2012 near Arlington, WI, to identify interactions among disease management practices (crop rotation, host resistance, and fungicide use) that increase corn, soybean, and wheat (Triticum aestivum L.) yields. For corn grain, significant interactions were primarily driven by crop rotation. Highest corn yields across all 3 yr were observed in the corn–soybean–wheat (CSW) rotation (13.55 Mg ha-1). Corn silage yield was influenced by cultivar rotation, with highest yields displayed by the Fusarium-susceptible rotations (susceptible followed by susceptible followed by susceptible [SSS] and susceptible followed by susceptible followed by resistant [SSR]). Soybean yields were influenced by interactions involving crop rotation and cultivar rotation. Highest soybean yields were found for crop rotations containing wheat and ranged from 5.1 to 8.4% higher than the corn alternated annually with soybean (CS) rotation. The Fusarium-resistant (resistant followed by resistant followed by resistant [RRR]) cultivar rotation (4.14 Mg ha-1) yielded 3.0% better than the next highest rotation (SSR). Crop rotation, cultivar selection, and fungicide use were all key drivers for wheat yield. Highest yields on average were observed in the CSW rotation (5.62 Mg ha-1). The Fusarium head blight (FHB)–susceptible cultivar (5.50 Mg ha-1) yielded significantly higher compared to the resistant cultivar (4.89 Mg ha-1), and fungicide use increased yield in the susceptible cultivar 7.2% (5.31 to 5.69 Mg ha-1) but not for the resistant cultivar. 
Although interactions were not consistent for all three crops, our results suggest growers should begin by combining a high-yield-potential cultivar, regardless of its susceptibility or resistance to Fusarium pathogens, with a CSW crop rotation to maximize yield potential when managing Fusarium-related diseases.
- Jan 2015
Historically, hybrid maize in the United States was rarely sprayed with foliar fungicides, but that began to change in 2004. Fungicides are now being used in the absence of significant disease pressure. This is at odds with standard integrated pest management practices. However, growers do not typically use a given management strategy unless there is a perceived benefit. A survey of maize growers and certified crop advisors was conducted in 2009 across four Midwestern states (Iowa, Illinois, Ohio and Wisconsin) to better understand the values, beliefs and perceptions of those involved in making decisions on maize disease management. This article documents the survey administration, data collection and preparation for statistical analysis. A part of the survey was analyzed to build grower and crop advisor profiles. The grower population tended to be older with less formal schooling than the crop advisor population. Growers were very involved in decision-making, and often used the services of a crop advisor. Crop advisors had 91 grower clients on average, and 90% of advisors had 150 or fewer clients. Advisors with 150 or fewer clients were consulted on 239 maize hectares per grower on average. Growers and crop advisors interacted primarily via one-on-one meetings. Advice was provided mainly in the areas of crop production, pest management and crop nutrients. Growers owned about 50% of the land they farmed. Crop advisors were more likely to suggest different management techniques for rented land as opposed to land their clients owned, whereas growers tended to manage owned and rented land the same. On average, 89% of crop advisors took part in integrated pest management training every year.
- Jan 2015
The response of plant disease to weather variables such as temperature and precipitation is well known, and has been the basis for disease forecasting models used in decision-making by farmers and policy makers. Thus, plant disease risk can readily shift under climate change. Predicting changes in risk under climate change is complicated by the many biological interactions that result in disease. For example, some plant diseases occur when the phenology of plant and pathogen are aligned; in the case of Fusarium head blight, spores are ready to infect during wheat flowering. There are numerous examples of experiments involving simulated climate change that have shown changes in disease risk. Finding indicators of climate change effects on disease in real agricultural or natural systems is more challenging because of the correlative nature of the observations and the potential for many other factors to influence disease, such as changes in host abundance. Assessments of global disease databases suggest some pathogen range shifts consistent with predicted outcomes for climate change. There are also several cases of emerging tree diseases in which climate change has likely played a role.
Question - Why do we take thermal images of plants between 12:00 pm and 3:00 pm?
The first two answers provided address the primary reason. In general, this is the period of peak sunlight, which correlates strongly with photosynthesis as well as with other factors such as plant stress.
Fusarium spp. are common fungal pathogens that infect a number of field and vegetable crops. Crop rotation, genetic resistance, and fungicides are the primary methods used for managing these pathogens; however, there is a lack of information regarding the interactions between these management strategies and how they impact Fusarium spp. population dynamics. Therefore, the objective of this research was to quantify the effect of crop rotation and management (i.e., variety selection and fungicide use) on F. graminearum, F. oxysporum, and F. virguliforme populations in the soil using real-time quantitative polymerase chain reaction (qPCR). Soil samples were collected in 2011 and 2012 from a long-term corn (Zea mays L.)-soybean [Glycine max (L.) Merr.]-wheat (Triticum aestivum L.) rotation study near Arlington, WI, and populations for each species (spores g−1 of soil) were quantified from extracted soil DNA. Fusarium oxysporum was the most prevalent Fusarium sp. found. Crop rotation and management did not impact F. oxysporum populations or F. virguliforme presence. A crop rotation by fungicide interaction was found for F. graminearum (P < 0.001), but this interaction was primarily driven by crop rotation. As expected, F. graminearum was found more often in plots with wheat as part of the rotation. This study found few interactions among crop rotation, variety selection, and fungicide use for controlling populations of three Fusarium spp. in the soil, and significant interactions or individual control methods were dependent on the species being examined.
- Dec 2014
The goal of this study was to determine the agronomic performance of 13 sweet potato genotypes for their cultivation in Costa Rica. Ten were introduced from North Carolina State University's Micropropagation and Repository Unit, and were grown for the first time in Costa Rica. Two (Exportación and Zanahoria) were more recent introductions into the country, having been cultivated there for at least 5 years, and the one remaining (Criollo) is the most widely used locally. The field experiment was conducted at the Fabio Baudrit Experiment Station of the University of Costa Rica, located in Alajuela province. The treatments were laid out in a randomized complete block design, with 13 genotypes and 4 repetitions. The analysis of variance showed significant differences (p=0.0001) among genotypes for all evaluated variables: fresh and dry foliar weight; storage root fresh and dry weight; number of storage roots, storage root weight/plant, dry matter content and yield (t.ha-1). The root yield of all genotypes evaluated was higher than that of the local variety Criollo (6 t.ha-1) as well as the national average (5 to 7.8 t.ha-1), ranging from 12 to 48 t.ha-1. These results indicate that some varieties are promising for release in Costa Rica, not only because of their good performance in terms of yield, but also due to quality traits such as orange or yellow flesh, associated with high carotene content.
Question - Does anyone know how to estimate the Disease Incidence in Plant disease research?
All of the suggestions so far are good, but following from Susann's reply: if you are going to identify incidence by type of pathogen (or disease), you need to make sure you use the most appropriate method for analyzing such data given your objective. Are you working in just one area? You could consider spatial methods to identify whether the pathogens are aggregated or random. These need to be examined using more complex statistical methods, but can prove valuable if you want to develop a management program. How large an area will you be sampling? How many samples can you process? Are you able to georeference your sampling sites?
On-farm U.S. soybean [Glycine max (L.) Merr.] yields have increased at an annual rate of 23.3 kg ha–1 yr–1 since the 1920s. These gains have come from a variety of sources, including genetic, agronomic, and environmental changes. Genetic gains arising from breeding efforts have likely contributed the most to the U.S. soybean yield increase; however, the relative contribution of each source of gain is difficult to estimate. The objectives of this study were to compare the yield of soybean varieties with different years of release, understand the effects of fungicide applications on soybean seed yield, and evaluate the seed composition of soybean cultivars chosen to represent historically significant releases in maturity groups (MGs) II and III during the last 85 yr. A set of 116 cultivars in these two MGs, released from 1923 to 2008, received a fungicide seed treatment followed by foliar applications at R1, R3, and R5 and were compared to non-treated controls. Seed composition changed over time, with protein concentration decreasing 2.1 g kg–1 for every g kg–1 increase in oil concentration. The significant interaction between fungicide treatment and MG III cultivar release year for plant stand revealed that such treatments were more beneficial for obsolete MG III cultivars, though this plant stand interaction did not translate into a significant yield interaction. The rate of genetic yield improvement made by breeders was not influenced by fungicide management and matched the observed rate of on-farm yield improvement that occurred during the same period.
Soybean [Glycine max (L.) Merr.] yield has increased during the past century; however, little is understood about the morphological parameters that have contributed most to yield gain. We conducted field studies to determine relationships between genetic gain of soybean yield and seeding rate. The hypothesis was that newer cultivars would express higher yield than older cultivars when grown at higher plant populations. A total of 116 soybean cultivars equally representing Maturity Groups (MGs) II and III released over the last 80 yr were evaluated at high and low seeding rates in Wisconsin, Minnesota, Illinois, and Indiana. Seeding rates were 445,000 and 148,000 seeds ha−1, resulting in 311,000 and 94,000 plants ha−1 (high and low, respectively). Seed yield was greater for the high seeding rate vs. the low seeding rate across all cultivars and years of release, but the difference was larger in newer cultivars. The differences observed primarily came from an increased number of pods and seeds plant−1. However, newer cultivars grown at low seeding rates increased per-plant yield linearly as 0.118 (±0.02)x − 208.0 g plant−1, where x = year of release, a rate three times greater than at the high seeding rate. The greater yield trend came from seeds produced on plant branches. Therefore, newer cultivars produce more compensatory yield on plant branches under lower plant populations than older cultivars, so over the last 80 yr there has been a diminishing response to the expected yield penalty from reduced plant density.
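The per-plant yield trend quoted above is a simple linear function of cultivar release year. A quick illustration; the slope and intercept are taken from the abstract, and the function name is mine:

```python
def per_plant_yield_trend(release_year, slope=0.118, intercept=-208.0):
    """Fitted per-plant yield (g) at the low seeding rate as a linear
    function of cultivar year of release, per the trend reported above."""
    return slope * release_year + intercept

# Across roughly 80 years of releases, the fitted gain at the low
# seeding rate is about 0.118 g/plant per year, ~9.4 g/plant in total.
gain = per_plant_yield_trend(2008) - per_plant_yield_trend(1928)
print(round(gain, 1))  # → 9.4
```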
Existing crop monitoring programs determine the incidence and distribution of plant diseases and pathogens and assess the damage caused within a crop production region. These programs have traditionally used observed or predicted disease and pathogen data and environmental information to prescribe management practices that minimize crop loss (3,69). Monitoring programs are especially important for crops with broad geographic distribution or for diseases that can cause rapid and great economic losses. Successful monitoring programs have been developed for several plant diseases, including downy mildew of cucurbits, Fusarium head blight of wheat, potato late blight, and rusts of cereal crops (13,36,51,80).
- Jul 2014
Estimation of soybean [Glycine max (L.) Merr.] yield early in the growing season is an appealing idea for both farmers and soybean-related industries. Prior attempts to predict soybean yield have had limited success, especially when using information from early in the growing season. The objective of this study was to evaluate cultivar release date and maturity group (MG), digital imaging, reflectance, and weather data during successive stages of crop development as explanatory variables in a soybean yield prediction model. The data were collected in the North Central (NC) United States at Arlington, WI (2010-2011), and Lafayette, IN (2011), using 59 MG II cultivars (released 1928-2008) at Wisconsin and 57 MG III cultivars (released 1923-2007) at Indiana, planted in performance trials on two planting dates (May and June). A second-order polynomial regression analysis followed by ridge regression was used to develop the soybean yield prediction equation. The model accounted for 80% of the yield variability in the NC U.S. data set. An additional dataset not used in the calibration was used to conduct a validation test of the predictive performance of the model. The average difference between the fitted and actual yields in the validation test was 67 kg ha−1. Results from this study suggest that the use of cultivar release year, planting date, MG, near-infrared (NIR), visible red (RED), and Red-edge wavelength bands recorded at 77 d after planting, and weather data 30 d before and after the planting date can closely estimate soybean yields in the Midwest.
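The modeling approach described, second-order polynomial terms followed by ridge regression, can be sketched on synthetic data. Everything below, including the made-up predictors, penalty value, and function names, is illustrative rather than the study's actual calibration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for predictors such as cultivar release year,
# NIR/RED reflectance, and a weather summary (not the study's data).
n = 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 - X[:, 0] * X[:, 2] + rng.normal(scale=0.1, size=n)

def poly2_features(X):
    """Expand predictors to second-order terms: x_i plus x_i*x_j (i <= j)."""
    cols = [X[:, i] for i in range(X.shape[1])]
    for i in range(X.shape[1]):
        for j in range(i, X.shape[1]):
            cols.append(X[:, i] * X[:, j])
    return np.column_stack(cols)

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: solve (X'X + lam*I) beta = X'y,
    with an unpenalized intercept column."""
    Xd = np.column_stack([np.ones(len(X)), X])
    I = np.eye(Xd.shape[1])
    I[0, 0] = 0.0  # do not penalize the intercept
    return np.linalg.solve(Xd.T @ Xd + lam * I, Xd.T @ y)

Z = poly2_features(X)
beta = ridge_fit(Z, y)
pred = np.column_stack([np.ones(len(Z)), Z]) @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 2))
```

The ridge penalty stabilizes the many correlated polynomial terms, which is the usual motivation for pairing it with a second-order expansion.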
Fusarium virguliforme, the causal agent of sudden death syndrome, and Heterodera glycines, soybean cyst nematode (SCN), are economically important pathogens of soybean. In 2011 and 2012, samples submitted to a SCN detection program were assayed for SCN using a sieving/centrifugation method and for F. virguliforme using a real-time quantitative polymerase chain reaction (RT-qPCR) protocol. The objectives of this study were to (i) determine the incidence of H. glycines and F. virguliforme in commercial soybean fields in Wisconsin; and (ii) compare the distribution and population densities of H. glycines and F. virguliforme to determine if establishment of these pathogens is interrelated.
The trend toward earlier soybean [Glycine max (L.) Merr.] planting in the midwestern United States has interacted synergistically with genetic yield gain to provide improvement in on-farm yields. However, the impacts of earlier planting dates and their interaction with genetic gain in physiological and phenological traits remain unclear. The objectives of this study were to determine if a 30-d difference in planting date affected measured rates of genetic improvement in (i) total dry matter (TDM) production, (ii) harvest index (HI), and (iii) growth-stage duration in the north-central United States. Research was conducted at Arlington, WI, Urbana, IL, and Lafayette, IN during 2010 and 2011 using 59 Maturity Group (MG) II cultivars (released 1928–2008) at Wisconsin, and 57 MG III cultivars (released 1923–2007) at Illinois and Indiana, with targeted planting dates of 1 May and 1 June. A mixed-effect regression analysis was used to model genetic change in TDM, HI, and growth stage duration as impacted by planting date. Breeding efforts have increased TDM(R7), HI, seed-fill duration (SFD), and reproductive growth duration over time, as vegetative growth duration has been reduced. Early planting provided increased TDM(R7) and longer reproductive growth duration, but had no effect on HI or SFD. A synergistic planting date × year of release interaction existed for TDM(R7) in both maturity groups, but not for HI or SFD, suggesting that the higher yields in newer, early-planted cultivars resulted from greater TDM production, not improved HI or SFD.
Soybean [Glycine max (L.) Merr.] grain yield has increased by nearly 23 kg ha−1 annually, but the interaction of genetic advancement and improved agronomic practices, including N utilization and fertilization, has not been well quantified. A field study with soybean cultivars released from 1923 to 2008 in maturity group (MG) II and MG III was conducted in multiple environments with a nonlimiting supply of fertilizer N to examine the main effects and interactions of N supply and release year on grain yield and seed quality. We hypothesized that grain yield and seed quality would be improved with the nonlimiting supply of N, especially for the modern cultivars. Supplemental N totaled 560 kg N ha−1 with 40% applied at planting and 60% applied at V5. Grain yield increased with release year in MG II (17.2 kg ha−1 yr−1). Application of N to MG II cultivars increased seed protein by 10 to 19.5 g kg−1 across all release years, but grain yield and seed oil were not affected. The grain yield gain of MG III cultivars fertilized with N was 27.4 kg ha−1 yr−1, 20% greater than that of unfertilized cultivars (22.8 kg ha−1 yr−1). Application of N to MG III cultivars increased seed mass (11%) across release years with no changes in seed protein and oil. The nonlimiting supply of N increased seed protein across all release years in MG II cultivars, and the N supply from the soil and biological N fixation was insufficient to maximize grain yield in modern, MG III cultivars in the tested environments.
- Nov 2013
- International Annual Meeting American Society of Agronomy/ Crop Science Society of America/ Soil Science Society of America 2013
Fusarium virguliforme, the causal agent of sudden death syndrome, and Heterodera glycines, soybean cyst nematode (SCN), are economically important pathogens of soybean in Wisconsin. In 2011 and 2012, soil samples submitted from growers throughout the state to a SCN detection program were screened for number of SCN eggs using a sieving/centrifugation method and for number of spores g soil-1 of F. virguliforme using a real-time quantitative polymerase chain reaction (RT-qPCR) protocol. In 2011, 135 soil samples were submitted. H. glycines was detected in 56 samples, while only 10 samples were positive for F. virguliforme. In 2012, there were 64 of 318 samples testing positive for H. glycines and 13 testing positive for F. virguliforme. Kendall’s tau rank correlation coefficient was used to describe the relationship between samples that were positive for H. glycines and/or F. virguliforme. Results indicated a negative association (τ = -0.59, P < 0.01). Additionally, a best-fitting logistic regression model that described the probability of detecting H. glycines in a soil sample based on detecting F. virguliforme confirmed the negative correlation. This result suggests that SCN and F. virguliforme do not rely on each other to colonize fields indicating that fields heavily infested with SCN are not necessarily at greater risk of F. virguliforme colonization.
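The association test described, Kendall's tau rank correlation on paired pathogen detections, can be sketched with constructed data. The presence/absence counts below are illustrative, not the survey's:

```python
import numpy as np
from scipy.stats import kendalltau

# Constructed presence/absence calls (1 = pathogen detected) for 100
# hypothetical soil samples; illustrative only, not the survey's data.
# Detections of the two pathogens rarely co-occur, mimicking the
# negative association reported above.
scn = np.array([1] * 48 + [1] * 2 + [0] * 20 + [0] * 30)
fv = np.array([0] * 48 + [1] * 2 + [1] * 20 + [0] * 30)

# kendalltau handles the heavy ties in binary data via the tau-b statistic.
tau, p_value = kendalltau(scn, fv)
print(tau < 0, p_value < 0.05)  # → True True
```

A negative tau here simply reflects that samples positive for one pathogen tend to be negative for the other, the same pattern the logistic regression in the study confirmed.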
- Nov 2013
- International Annual Meeting American Society of Agronomy/ Crop Science Society of America/ Soil Science Society of America 2013
Currently, many commercial products advertised to improve yield are available to soybean growers. Many of these products have been tested individually; however, their interaction with various management practices, including soybean plant population, has not been validated. A cooperative multi-state field study was initiated during 2012 at two locations in Wisconsin, Michigan, Illinois, Indiana, Kentucky, Iowa, and Arkansas; three locations in Kansas; and four locations in Minnesota. The objective of this study was to determine how grain yield and seed quality respond to a more intensive management regime at varying plant populations. Two management regimes were evaluated (untreated and a high-input system termed SOYA complete) at six targeted seeding rates ranging from 123,500 to 494,000 plants ha-1. The SOYA complete management regime consisted of multiple agricultural products currently being marketed to soybean growers (seed-applied fungicide, insecticide, biologicals, LCO promoters, and foliar-applied fertilizer, insecticide, and fungicide) applied at labeled rates and timings. Stand counts were taken at the V2 and R8 (full maturity) growth stages to confirm planting and harvest plant populations. Digital pictures were taken weekly to assess canopy closure and cumulative light interception. Overall, growing conditions were highly variable across locations, and drought conditions during the 2012 growing season affected grain yield. Grain yield increased with increasing plant populations at all locations; however, maximum grain yield was dependent on growing region. The use of the SOYA complete management regime increased grain yield, but we did not find any interaction between SOYA complete and plant density. Field research will continue through the 2013 and 2014 growing seasons. The large number of site-years will allow for the development of comprehensive regional and national soybean production recommendations related to soybean inputs and seeding rates.
In Wisconsin, vegetable crops are threatened annually by infection of the aster yellows phytoplasma (AYp), the causal agent of aster yellows (AY) disease, vectored by the aster leafhopper, Macrosteles quadrilineatus Forbes. Aster leafhopper abundance and infectivity are influenced by processes operating across different temporal and spatial scales. We applied a multilevel modeling approach to partition variance in multifield, multiyear, pest scouting data sets containing temporal and spatial covariates associated with aster leafhopper abundance and infectivity. Our intent was to evaluate the relative importance of temporal and spatial covariates to infer the relevant scale at which ecological processes are driving AY epidemics and identify periods of elevated risk for AYp spread. The relative amount of aster leafhopper variability among and within years (39%) exceeded estimates of variation among farm locations and fields (7%). Similarly, time covariates explained the largest amount of variation of aster leafhopper infectivity (50%). Leafhopper abundance has been decreasing since 2001 and reached its minimum in 2010. The average seasonal pattern indicated that periods of above average abundance occurred between 11 June and 1 August. Annual infectivity appears to oscillate around an average value of 2% and seasonal periods of above average infectivity occur between 19 May and 15 July. The coincidence of the expected periods of high leafhopper abundance and infectivity increases our knowledge of when the insect moves into susceptible crop fields and when it spreads the pathogen to susceptible crops, representing a seasonal interval during which management of the insect can be focused.
In Wisconsin, vegetable crops are threatened annually by the aster yellows phytoplasma (AYp), which is obligately transmitted by the aster leafhopper. Using a multiyear, multilocation data set, seasonal patterns of leafhopper abundance and infectivity were modeled. A seasonal aster yellows index (AYI) was deduced from the model abundance and infectivity predictions to represent the expected seasonal risk of pathogen transmission by infectious aster leafhoppers. The primary goal of this study was to identify periods of time during the growing season when crop protection practices could be targeted to reduce the risk of AYp spread. Based on abundance and infectivity, the annual exposure of the carrot crop to infectious leafhoppers varied by 16- and 70-fold, respectively. Together, this corresponded to an estimated 1,000-fold difference in exposure to infectious leafhoppers. Within a season, exposure of the crop to infectious aster leafhoppers (Macrosteles quadrilineatus Forbes), varied threefold because of abundance and ninefold because of infectivity. Periods of above average aster leafhopper abundance occurred between 11 June and 2 August and above average infectivity occurred between 27 May and 13 July. A more comprehensive description of the temporal trends of aster leafhopper abundance and infectivity provides new information defining when the aster leafhopper moves into susceptible crop fields and when they transmit the pathogen to susceptible crops.
- May 2013
Planting date is a commonly manipulated management practice in soybean [Glycine max (L.) Merr.] production; however, the impacts of past and ongoing agronomic improvements, such as earlier planting, on genetic yield improvement and associated changes in seed protein and oil have not been evaluated. The objectives of this study were to determine if a 30-d difference in planting date affected measured rates of genetic improvement in (i) yield, (ii) seed mass, and (iii) seed protein and oil in the midwestern United States. Research was conducted at Arlington, WI, Urbana, IL, and Lafayette, IN, during 2010 and 2011, using 59 Maturity Group (MG) II cultivars (released 1928-2008) at Wisconsin, and 57 MG III cultivars (released 1923-2007) at Illinois and Indiana, with targeted planting dates of 1 May and 1 June. Earlier planting provided higher yields (+/- 3.1 kg ha(-1) yr(-1)) than late planting in MG III soybean. Seed protein concentration decreased linearly over cultivar year of release at a rate of 0.191 (+/- 0.069) g kg(-1) yr(-1) for MG II, and 0.242 (+/- 0.063) g kg(-1) yr(-1) for MG III. Seed oil concentration increased over year of release at a rate of 0.142 (+/- 0.037) g kg(-1) yr(-1) for MG II, and 0.127 (+/- 0.039) g kg(-1) yr(-1) for MG III. The interaction between planting date and cultivar year of release for MG III yield suggested that the trend toward earlier planting is one agronomic improvement that, when coupled with genetic improvement, has provided a synergistic increase in on-farm soybean yields in the midwestern United States.
Historically, crop loss assessment research has passed through three phases: exploratory, development, and implementation. These phases took place at different periods of agricultural research, with a common thread of improving our knowledge of the impact of diseases on crop yield quantity and quality. In this review, we discuss these phases. In particular, we emphasize the seminal research that has laid the foundation for a new phase to develop. We do this through an examination of the measurement of injury and crop losses, the current statistical models used to define thresholds and damage functions, and what is currently known regarding qualitative losses. Crop loss research is now entering a fourth phase, multicriteria assessment. In the latter portion of the review, we briefly discuss the efforts, concepts, and multidisciplinary dialogue that the multicriteria assessment phase requires in order for crop loss assessment to truly change how diseases are managed and how management itself is seen among the disciplinary fields that contribute to sustainable agricultural development.
Integration of host resistance and prothioconazole + tebuconazole fungicide application at anthesis to manage Fusarium head blight (FHB) and deoxynivalenol (DON) in wheat was evaluated using data from over 40 trials in 12 U.S. states. Means of FHB index (index) and DON from up to six resistance class × fungicide management combinations per trial (susceptible treated [S_TR] and untreated [S_UT]; moderately susceptible treated [MS_TR] and untreated [MS_UT]; moderately resistant treated [MR_TR] and untreated [MR_UT]) were used in multivariate meta-analyses, and mean log response ratios across trials were estimated and transformed to estimate mean percent control (C̄) due to the management combinations relative to S_UT. All combinations led to a significant reduction in index and DON (P < 0.001). MR_TR was the most effective combination, with a C̄ of 76% for index and 71% for DON, followed by MS_TR (71 and 58%, respectively), MR_UT (54 and 51%, respectively), S_TR (53 and 39%, respectively), and MS_UT (43 and 30%, respectively). Calculations based on the principle of treatment independence showed that the combination of fungicide application and resistance was additive in terms of percent control for index and DON. Management combinations were ranked based on percent control relative to S_UT within each trial, and nonparametric analyses were performed to determine management combination stability across environments (trials) using the Kendall coefficient of concordance (W). There was a significant concordance of management combinations for both index and DON (P < 0.001), indicating a nonrandom ranking across environments and relatively low variability in the within-environment ranking of management combinations. MR_TR had the highest mean rank (best control relative to S_UT) and was one of the most stable management combinations across environments, with low rank stability variance (0.99 for index and 0.67 for DON).
MS_UT had the lowest mean rank (poorest control) but was also one of the most stable management combinations. Based on Piepho's nonparametric rank-based variance homogeneity U test, there was an interaction of management combination and environment for index (P = 0.011) but not for DON (P = 0.147), indicating that the rank ordering for index depended somewhat on environment. In conclusion, although the magnitude of percent control will likely vary among environments, integrating a single tebuconazole + prothioconazole application at anthesis with cultivar resistance will be a more effective and stable management practice for both index and DON than either approach used alone.
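The back-transformation from a mean log response ratio to mean percent control used in response-ratio meta-analyses of this kind can be sketched directly. The 76% figure below is taken from the abstract; the log-ratio input is back-calculated purely for illustration:

```python
import math

def percent_control(mean_log_ratio):
    """Mean percent control relative to the untreated susceptible
    check: C = (1 - exp(L)) * 100, where L is the mean log response
    ratio (treated mean over check mean)."""
    return (1.0 - math.exp(mean_log_ratio)) * 100.0

# A mean log ratio of ln(0.24) corresponds to 76% control, the
# MR_TR value for FHB index reported above:
control = percent_control(math.log(0.24))
```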
Sclerotinia stem rot (also known as white mold) of soybean is a significant yield-limiting problem in the North Central production region. This disease, caused by the fungus Sclerotinia sclerotiorum (Lib.) de Bary, varies in incidence and severity from year to year because of its sensitivity to weather conditions. Losses because of Sclerotinia stem rot can be substantial when environmental conditions and management practices favor high yield potential. Employing a disease management plan based on knowledge of field history and best disease management practices can help reduce losses from Sclerotinia stem rot. An effective disease management plan integrates several management tactics that include cultural practices, varietal resistance, as well as chemical and biological control. Understanding how different environmental variables and management practices influence infection by S. sclerotiorum and disease development are important to optimize disease management and reduce losses. This profile summarizes research-based knowledge of Sclerotinia stem rot, including the disease cycle, the scope of the losses that can occur because of this disease, how to identify both the pathogen S. sclerotiorum and the disease, and current management recommendations.
Improving understanding and prediction of the potato (Solanum tuberosum) tuber size over the growing season is important due to its effects on crop price and marketing. Several models have been proposed to describe potato growth and development, but are based on short-term data and have little use for predicting yields or in-season management decisions. This analysis uses long-term data collected from 1979 to 1993 in central Wisconsin to describe growth and development of the Russet Burbank potato variety. This paper describes average number of potato tubers per plant and tuber length as influenced by thermal time and stem number per plant over 14 years. For each plant variable, data analysis uses multivariate techniques to fit a hierarchical logistic model with parameters potentially depending on stem number per plant. Analysis finds that the average number of potato tubers and average tuber length were affected by thermal time and stem number per plant. Estimated models are biologically relevant, provide an understanding of seasonal thermal variability and stem number per plant effects on average tuber set and growth, and can be used to describe yearly variation in average potato growth and development. Increased understanding of potato growth in response to thermal time and stem number per plant can improve management recommendations and predictions of crop economic value.
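At its core, a hierarchical logistic model of this kind fits a logistic curve of a plant variable against thermal time, with parameters allowed to vary by stem number and year. A minimal single-curve sketch with synthetic data (parameter values are illustrative, not the study's estimates):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(thermal_time, asymptote, rate, midpoint):
    """Three-parameter logistic response (e.g., mean tuber length)
    as a function of accumulated thermal time."""
    return asymptote / (1.0 + np.exp(-rate * (thermal_time - midpoint)))

# Synthetic, noise-free observations generated from known parameters:
tt = np.linspace(0.0, 1500.0, 30)
obs = logistic(tt, asymptote=80.0, rate=0.006, midpoint=700.0)

# Recover the parameters from the data:
params, _ = curve_fit(logistic, tt, obs, p0=[70.0, 0.005, 600.0])
```

In the hierarchical setting, the asymptote, rate, and midpoint would themselves be modeled as functions of stem number per plant and year-level random effects.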
- Jan 2012
Earlier soybean [Glycine max (L.) Merr.] planting coupled with increasing seed cost and commodity prices has led to an increase in the number of hectares treated with seed treatments. Ultimately, growers would like to know if applying such treatment is cost effective. Therefore, the objectives of this experiment were to quantify the effects of seed treatment on early season plant population and seed yield and to investigate the probability that yield response covered the cost of the seed treatment. Trials were conducted in Wisconsin from 2008 to 2010 at nine locations each year to compare no seed treatment, mefenoxam + fludioxonil (ApronMaxx), and mefenoxam + fludioxonil + thiamethoxam (CruiserMaxx). Results indicated differences in early-season plant population due to cultivar and seed treatment and that seed yield was affected by a cultivar × seed treatment interaction. At a low seed treatment price, the percentage of environments where the probability of breaking even was >50% ranged from 56 to 67%, while it ranged from 22 to 56% at a higher price. Both ApronMaxx and CruiserMaxx had positive response ratios of 1.5 (p = 0.030) and 2.9% (p < 0.0001), respectively, but responses were cultivar dependent. Given annual environmental variability, a general lack of information regarding field history of pathogens or insects, and the high turnover rate of soybean cultivars, soybean seed treatments can be a cost-effective component to integrate into soybean production systems.
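The break-even calculation behind these percentages can be sketched directly: an environment "breaks even" when the value of its yield response exceeds the treatment cost. The yield responses, price, and cost below are hypothetical illustrations, not the study's data:

```python
def breakeven_fraction(yield_responses, price_per_kg, treatment_cost):
    """Fraction of environments in which the value of the seed-
    treatment yield response (kg/ha times price per kg) exceeds
    the per-hectare treatment cost."""
    covered = sum(1 for r in yield_responses if r * price_per_kg > treatment_cost)
    return covered / len(yield_responses)

# Nine hypothetical environment-level yield responses (kg/ha):
responses = [120, -40, 60, 200, 15, 90, -10, 35, 150]
p_breakeven = breakeven_fraction(responses, price_per_kg=0.35, treatment_cost=10.0)
```

Raising the treatment cost (or lowering the price) shrinks the fraction of environments that cover the cost, which is the pattern reported above.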
Sudden death syndrome (SDS), caused by the fungal pathogen Fusarium virguliforme, causes significant yield reductions in soybean [Glycine max (L.) Merr.] in the United States. Appropriate recommendations to manage SDS for growers in Iowa and the Upper Midwest are limited. The research objective was to determine the response of SDS foliar disease incidence, severity, and yield to row spacing and seeding rate. In 2008 and 2009, at two Iowa locations, in fields with histories of SDS, SDS-susceptible and SDS-resistant cultivars were planted at 38- and 76-cm row spacing at seeding rates of 185,000, 309,000, and 432,000 seeds ha(-1) in plots infested with and without the pathogen. Sudden death syndrome incidence and severity were very low; however, infested plots had greater SDS disease incidence and severity than uninfested plots. A row spacing x infestation interaction indicated 7% greater yield in narrow rows (38 cm) than wide rows (76 cm) in uninfested plots, with no yield advantage to narrow rows in infested plots. Soil infestation reduced soybean seed mass (7%) in narrow rows, explaining the yield reduction for narrow rows with greater SDS. The two highest seeding rates had increased SDS incidence but yielded 9% greater than the lowest seeding rate. The susceptible cultivar had greater SDS incidence and severity and yielded 7% less than the resistant cultivar. This study indicates that in infested plots with greater SDS symptom expression, the yield advantage of narrow rows may be negated; therefore, cultivar selection is crucial when planting in narrow rows to maximize yield.
Heterodera glycines continues to be the number-one yield-limiting factor in soybean [Glycine max (L.) Merr.] across the Midwest. Several genetic and agronomic practices are available to assist growers in maximizing yield in an H. glycines environment. The objectives of this research were to (i) measure yield response to rotation and tillage systems and evaluate whether the presence of H. glycines and the reaction of cultivars to H. glycines modified this response, and (ii) determine if H. glycines population dynamics were related to source of resistance to H. glycines, rotation, or tillage systems. Field research trials were conducted during 3 yr (2006-2008) near Arlington, WI and Ames, IA. Main plots were no-tillage and conventional tillage systems. Subplots consisted of 10 rotation sequences involving corn (Zea mays L.) and soybean. Sub-subplots were three sources of H. glycines resistance. Crop rotation and source of genetic resistance were the most important factors to consider in maximizing seed yield and managing H. glycines across locations, whereas tillage was the least valuable tool in H. glycines management. Extended rotations decreased H. glycines populations; however, this benefit was overcome by first- or second-year soybean. Results also show that continued reliance on one source of genetic resistance can lead to reproduction of H. glycines, regardless of source. Our results suggest that an integrated approach to H. glycines management that considers rotation, tillage, knowledge of H. glycines (HG)-type, and source of genetic resistance is needed to maximize seed yield and decrease H. glycines populations.
Lackermann, K. V., and Esker, P. D. 2011. Application of a rank-based method for improved cultivar selection in soft red winter wheat. Plant Dis. 95:1407-1413. Both grain yield and disease performance are important factors to consider for winter wheat (Triticum aestivum) cultivar selection. However, disease severity and yield data are often presented separately, making it difficult to compare values across multiple environments. The objective of this study was to use a rank-based method to compare cultivars based on yield and disease performance combined across multiple environments. Thirty-six wheat cultivars were planted at each of four Wisconsin locations (Arlington, Chilton, Janesville, and Lancaster) in 2009 and 2010. Plots were assessed four times during the growing season for powdery mildew, Septoria/Stagonospora leaf blotch, and leaf rust. Incidence and severity of Fusarium head blight were assessed at Zadoks growth stage 85 (soft dough). Within each location-year, cultivars were ranked for severity of each disease and for grain yield. One-way analysis of variance was performed to calculate an overall rank value that incorporated data for all four diseases and yield across the eight location-years. There was an effect of cultivar on overall rank (P < 0.0001). Powdery mildew rank was strongly correlated with overall rank (Spearman's rho = 0.485, P = 0.005), as was yield rank (Spearman's rho = 0.674, P < 0.0001). Cultivars described as "best" or "worst" were generally more consistent in their rankings across different measures. The use of a rank-based method provides an approach that will allow growers to base cultivar selection on multiple performance measures across multiple environments.
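The rank-based combination can be sketched as: rank cultivars within each environment, average the ranks, then check how strongly a single measure tracks the overall rank. The severity matrix below is hypothetical, purely to show the mechanics:

```python
import numpy as np
from scipy.stats import rankdata, spearmanr

# Hypothetical matrix: rows = cultivars, columns = environments;
# lower severity earns a better (lower) rank:
severity = np.array([[1.0, 2.0, 1.5],
                     [3.0, 2.5, 4.0],
                     [2.0, 1.0, 2.5],
                     [5.0, 6.0, 5.5]])

# Rank within each environment (column), then average across them:
within_env_ranks = np.apply_along_axis(rankdata, 0, severity)
overall_rank = within_env_ranks.mean(axis=1)

# Spearman correlation of one environment's ranks with the overall rank:
rho, p_value = spearmanr(within_env_ranks[:, 0], overall_rank)
```

In the study, ranks for each disease and for yield were combined the same way across the eight location-years before comparing cultivars.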
Lackermann, K. V., Conley, S. P., Gaska, J. M., Martinka, M. J., and Esker, P. D. 2011. Effect of location, cultivar, and diseases on grain yield of soft red winter wheat in Wisconsin. Plant Dis. 95:1401-1406. Knowledge is limited about the impact of foliar diseases on wheat yield in Wisconsin. The objective of this study was to compare yield and diseases of wheat cultivars in several locations in Wisconsin in 2009 and 2010. Thirty-six wheat cultivars were planted in a randomized complete block design at field sites near Arlington, Chilton, and Lancaster, WI. At a fourth location, Janesville, WI, the design was a split plot with foliar fungicide application at Zadoks growth stage (GS) 45 at the whole-plot level and cultivar at the subplot level. Disease assessments were made four times during the growing season for powdery mildew (PM), Septoria/Stagonospora leaf blotch (SLB), and leaf rust. Incidence and severity of Fusarium head blight were assessed on 100 heads per plot at GS 85. Linear mixed-model analyses were used to study the effects of location, cultivar, and disease on grain yield (alpha = 0.05). Overall, SLB and PM were the most prevalent diseases. SLB severity was uniform among locations and PM was most prevalent at Arlington and Chilton. In both years, yield was affected by location, cultivar, location cultivar interaction, and location SLB and location-PM interactions. Yield was also negatively affected by PM in 2010. No effect of fungicide on disease severity or yield was observed at Janesville in either year. These results suggest that cultivar selection and location strongly influence grain yield in Wisconsin and that powdery mildew is capable of reducing grain yield.
Plant disease epidemiology requires expansion of its current methodological and theoretical underpinnings in order to produce full contributions to global food security and global changes. Here, we outline a framework which we applied to farmers' field survey data set on rice diseases in the tropical and subtropical lowlands of Asia. Crop health risks arise from individual diseases, as well as their combinations in syndromes. Four key drivers of agricultural change were examined: labor, water, fertilizer, and land availability that translate into crop establishment method, water shortage, fertilizer input, and fallow period duration, respectively, as well as their combinations in production situations. Various statistical approaches, within a hierarchical structure, proceeding from higher levels of hierarchy (production situations and disease syndromes) to lower ones (individual components of production situations and individual diseases) were used. These analyses showed that (i) production situations, as wholes, represent very large risk factors (positive or negative) for occurrence of disease syndromes; (ii) production situations are strong risk factors for individual diseases; (iii) drivers of agricultural change represent strong risk factors of disease syndromes; and (iv) drivers of change, taken individually, represent small but significant risk factors for individual diseases. The latter analysis indicates that different diseases are positively or negatively associated with shifts in these drivers. We also report scenario analyses, in which drivers of agricultural change are varied in response to possible climate and global changes, generating predictions of shifts in rice health risks. The overall set of analyses emphasizes the need for large-scale ground data to define research priorities for plant protection in rapidly evolving contexts. 
These analyses also illustrate how a structured theoretical framework can be used to analyze emergent features of agronomic and socioecological systems. We suggest that the concept of "disease syndrome" can be borrowed in botanical epidemiology from public health to emphasize a holistic view of disease in shifting production situations in combination with the conventional, individual disease-centered perspective.
If you produce corn, soybeans, or other crops in Wisconsin or elsewhere in the Midwest, dust exposure while working is inevitable. Breathing in grain dust can affect the health and overall comfort of grain producers and others who work in the grain industry. Exposures can occur:
• In the combine
• In bins
• While unloading
• In any area near any of the above situations
• During drying or processing
• While grinding/mixing grain and other feed products
Grain dust is a complex soup made up of both organic and inorganic particles. Some of these can be inhaled easily and, depending on their size, can find their way deep into various parts of the respiratory system, causing a range of adverse health effects. Grain dust is biologically active and is made up of a combination of:
• Plant material
• Bacteria
• Mold and mold spores
• Endotoxins (toxins contained in the cell walls of some bacteria)
• Insect parts and excreta
• Soil
Exposure to Small Concentrations During Normal Work
The use of foliar fungicides on field corn has increased greatly over the past 5 years in the United States in an attempt to increase yields, despite limited evidence that use of the fungicides is consistently profitable. To assess the value of using fungicides in grain corn production, random-effects meta-analyses were performed on results from foliar fungicide experiments conducted during 2002 to 2009 in 14 states across the United States to determine the mean yield response to the fungicides azoxystrobin, pyraclostrobin, propiconazole + trifloxystrobin, and propiconazole + azoxystrobin. For all fungicides, the yield difference between treated and nontreated plots was highly variable among studies. All four fungicides resulted in a significant mean yield increase relative to the nontreated plots (P < 0.05). Mean yield difference was highest for propiconazole + trifloxystrobin (390 kg/ha), followed by propiconazole + azoxystrobin (331 kg/ha) and pyraclostrobin (256 kg/ha), and lowest for azoxystrobin (230 kg/ha). Baseline yield (mean yield in the nontreated plots) had a significant effect on yield for propiconazole + azoxystrobin (P < 0.05), whereas baseline foliar disease severity (mean severity in the nontreated plots) significantly affected the yield response to pyraclostrobin, propiconazole + trifloxystrobin, and propiconazole + azoxystrobin but not to azoxystrobin. Mean yield difference was generally higher in the lowest yield and higher disease severity categories than in the highest yield and lower disease categories. The probability of failing to recover the fungicide application cost (p(loss)) also was estimated for a range of grain corn prices and application costs. 
At the 10-year average corn grain price of $0.12/kg ($2.97/bushel) and application costs of $40 to 95/ha, p(loss) for disease severity <5% was 0.55 to 0.98 for pyraclostrobin, 0.62 to 0.93 for propiconazole + trifloxystrobin, 0.58 to 0.89 for propiconazole + azoxystrobin, and 0.91 to 0.99 for azoxystrobin. When disease severity was >5%, the corresponding probabilities were 0.36 to 0.95, 0.25 to 0.69, 0.25 to 0.64, and 0.37 to 0.98 for the four fungicides. In conclusion, the high p(loss) values found in most scenarios suggest that the use of these foliar fungicides is unlikely to be profitable when foliar disease severity is low and yield expectation is high.
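Under a rough normality assumption on the yield difference, p(loss) is a normal tail probability: the chance that the yield difference falls below the break-even difference implied by price and cost. The mean, spread, price, and cost below are illustrative, not the study's fitted values:

```python
from scipy.stats import norm

def p_loss(mean_diff, sd_diff, price_per_kg, cost_per_ha):
    """Probability of failing to recover the application cost,
    P(D * price < cost), with yield difference D assumed Normal."""
    breakeven_diff = cost_per_ha / price_per_kg  # kg/ha needed to break even
    return norm.cdf(breakeven_diff, loc=mean_diff, scale=sd_diff)

# Illustrative inputs: mean yield response 256 kg/ha, among-study
# sd 300 kg/ha, grain price $0.12/kg, application cost $70/ha:
p = p_loss(256.0, 300.0, 0.12, 70.0)
```

With these stand-in numbers, the break-even yield difference (about 583 kg/ha) sits well above the mean response, so p(loss) is high, as in most scenarios reported above.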
Jirak-Peterson, J. C., and Esker, P. D. 2011. Tillage, crop rotation, and hybrid effects on residue and corn anthracnose occurrence in Wisconsin. Plant Dis. 95:601-610. Corn anthracnose (Colletotrichum graminicola) is an important disease of field corn (Zea mays). Two phases, leaf blight and stalk rot, can reduce yield through either premature leaf senescence or reduced grain harvest due to stalk lodging. Corn residue is an important source of primary inoculum and is increased through cultural practices such as no-tillage and continuous corn cropping, which are common practices in Wisconsin. Field studies conducted at the Arlington Agricultural Research Station (ARS) and the West Madison ARS showed that the incidence and severity of anthracnose leaf blight were higher in continuous-corn crop rotations than in soybean–corn rotations (91% higher incidence, 24 to 78% higher severity). Anthracnose stalk rot was marginally affected by tillage in 2008 (P = 0.09), with higher incidence in chisel-plowed treatments. There was a positive association between spring residue cover and anthracnose leaf blight but no association was found between residue and stalk rot. No association was found between anthracnose leaf blight and stalk rot. There was a negative association between anthracnose leaf blight and yield but not between anthracnose stalk rot and yield. Managing residue levels through crop rotation would help to reduce anthracnose leaf blight but further work is needed to elucidate factors that lead to stalk lodging prior to harvest.
The continuing exponential increase in scientific knowledge, the growing availability of large databases containing raw or partially annotated information, and the increased need to document impacts of large-scale research and funding programs provide a great incentive for integrating and adding value to previously published (or unpublished) research through quantitative synthesis. Meta-analysis has become the standard for quantitative evidence synthesis in many disciplines, offering a broadly accepted and statistically powerful framework for estimating the magnitude, consistency, and homogeneity of the effect of interest across studies. Here, we review previous and current uses of meta-analysis in plant pathology with a focus on applications in epidemiology and disease management. About a dozen formal meta-analyses have been published in the plant pathological literature in the past decade, and several more are currently in progress. Three broad research questions have been addressed, the most common being the comparative efficacy of chemical treatments for managing disease and reducing yield loss across environments. The second most common application has been the quantification of relationships between disease intensity and yield, or between different measures of disease, across studies. Lastly, meta-analysis has been applied to assess factors affecting pathogen-biocontrol agent interactions or the effectiveness of biological control of plant disease or weeds. In recent years, fixed-effects meta-analysis has been largely replaced by random- (or mixed-) effects analysis owing to the statistical benefits associated with the latter and the wider availability of computer software to conduct these analyses. Another recent trend has been the more common use of multivariate meta-analysis or meta-regression to analyze the impacts of study-level independent variables (moderator variables) on the response of interest. 
The application of meta-analysis to practical problems in epidemiology and disease management is illustrated with case studies from our work on Phakopsora pachyrhizi on soybean and Erwinia amylovora on apple. We show that although meta-analyses are often used to corroborate and validate general conclusions drawn from more traditional, qualitative reviews, they can also reveal new patterns and interpretations not obvious from individual studies.
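The core computation in a random-effects synthesis of this kind, pooling study-level effect sizes while allowing for between-study heterogeneity, can be sketched with the moment (DerSimonian-Laird) estimator. The effects and sampling variances below are hypothetical:

```python
import numpy as np

def random_effects_pool(effects, variances):
    """Pooled effect, its standard error, and between-study variance
    tau^2 via the DerSimonian-Laird moment estimator."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)            # fixed-effect estimate
    q = np.sum(w * (y - fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical study-level log response ratios and sampling variances:
pooled, se, tau2 = random_effects_pool([0.35, 0.50, 0.20, 0.42],
                                       [0.020, 0.030, 0.015, 0.025])
```

When tau^2 is estimated as zero the analysis collapses to the fixed-effects case; when heterogeneity is present, the random-effects weights shrink toward equality, which is one statistical benefit noted above.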
The occurrence of aphid-transmitted viruses in agricultural crops of the Midwest and northeastern United States has become more frequent since the arrival and establishment of the soybean aphid, Aphis glycines Matsumura (Hemiptera: Aphididae). A. glycines is a competent vector of plant viruses and may be responsible for recent virus epidemics in Wisconsin snap bean, Phaseolus vulgaris L., fields. To determine whether vegetation surrounding crop fields could serve as sources of virus inocula, we examined the settling activity of A. glycines and other aphid species in agricultural crops and noncrop field margins adjacent to snap bean fields. Noncrop field margins were made up of numerous virus-susceptible plant species within 10 m of snap bean field edges. During summers 2006 and 2007, horizontal pan traps were placed in commercial soybean [Glycine max (L.) Merr.], snap bean, and surrounding field margins to characterize aphid flight activity patterns in the different habitat types. Alate abundance and peak occurrence across years varied between crop and noncrop field margins and differed among patches of plants in field margins. Overall aphid activity peaked late in the season (21 August in 2006 and 28 July in 2007), with the majority (52%) of total aphids trapped in all habitats being A. glycines. Susceptibility to viral infection and confirmed visitation of A. glycines to these forage plants suggest the importance of noncrop habitats as potential sources of primary virus inoculum. Viral disease onset followed peak aphid flights and further implicates A. glycines as a likely vector of viruses in commercial bean and other crops in Wisconsin.
The ecosystem services concept provides a means to define successful disease management more broadly, beyond short-term crop yield evaluations. Plant disease can affect ecosystem services directly, such as through removal of plants providing services, or indirectly through the effects of disease management activities, including pesticide applications, tillage, and other methods of plant removal. Increased plant biodiversity may reduce disease risk if susceptible host tissue becomes less common, or may increase risk if additional plant species are important in completing pathogen life cycles. Arthropod and microbial biodiversity may play similar roles. Distant ecosystems may provide a disservice as the setting for the evolution of pathogens that later invade a focal ecosystem, where plants have not evolved defenses. Conversely, distant ecosystems may provide a service as sources of genetic resources of great value to agriculture, including disease resistance genes. Good policies are needed to support conservation and optimal use of genetic resources, protect ecosystems from exotic pathogens, and limit the homogeneity of agricultural systems. Research is needed to provide policy makers, farmers, and consumers with the information required for evaluating trade-offs in the pursuit of the full range of ecosystem services desired from managed and native ecosystems.
Brown stem rot (BSR)-resistant and -susceptible soybean accessions were continuously cropped in an area never previously seeded to soybean to study the influence of monocultures on soil and stem populations of Phialophora gregata f. sp. sojae. P. gregata f. sp. sojae population size and genotype composition were determined by dilution plating, isolation of P. gregata f. sp. sojae and standard polymerase chain reaction (PCR), and by quantitative real-time PCR (q-PCR). In general, the sizes of P. gregata f. sp. sojae populations in soil were similar regardless of monoculture. The percentage of P. gregata f. sp. sojae genotype B was greater than A in soil following the monoculture of both BSR-susceptible and -resistant soybean accessions. Following the monoculture of a BSR-resistant accession, the percentage of P. gregata f. sp. sojae genotype B was greater than A. Overall, P. gregata f. sp. sojae populations in stems of a BSR-susceptible accession were greater than those in stems of a BSR-resistant accession. P. gregata f. sp. sojae genotype B was detected more often than A in stems of both resistant and susceptible accessions planted following a BSR-resistant monoculture. P. gregata f. sp. sojae genotype B was also detected more often than A in stems of a BSR-resistant accession planted following a BSR-susceptible monoculture. P. gregata f. sp. sojae genotypes A and B were isolated at similar frequencies from stems of a BSR-susceptible accession planted following a BSR-susceptible monoculture. However, q-PCR results indicate that the percentage of P. gregata f. sp. sojae genotype A was greater than B in stems of a BSR-susceptible accession planted following a BSR-susceptible monoculture. Among BSR-susceptible accessions, those with the soybean cyst nematode (SCN)-resistant cv. Peking in their parentage had the largest populations of P. gregata f. sp. sojae and a greater percentage of P. gregata f. sp. sojae genotype B.
Similar results were observed for BSR-resistant accessions derived from SCN-resistant PI 88788.
With the arrival of Asian soybean rust (caused by Phakopsora pachyrhizi) in the Western Hemisphere in 2001, field research to optimize chemical control of this important yield-limiting disease has proliferated. We present a meta-analytical synthesis of the results of 71 uniform fungicide trials containing 930 entries (specific fungicidal treatments) conducted in Brazil from 2003/2004 to 2006/2007. Our objectives were to: (1) quantify the overall efficacy of fungicidal treatments in reducing disease and yield loss; (2) determine to what extent fungicide efficacy depends on overall disease pressure, the number of spray applications, and the amount of disease present at the time of the first application; and (3) test for differences in efficacy among fungicide classes and specific active ingredients. Weighted median response ratios for disease severity (RS) and yield (RY) were 0.413 and 1.439, respectively, indicating that, on average, fungicide treatments reduced disease by 58.7% (range: −38.9–100%) and increased yield by 43.9% (range: −21.8–458%). Response ratios were dependent on disease pressure (expressed as disease severity of the untreated check), with the greatest reduction in rust severity (i.e., lowest RS values) observed for low disease pressure and the best yield response (i.e., highest RY values) observed for high disease pressure. In trials where both one and two application schedules were included, RS and RY were better for entries receiving two applications than for one application. In ∼65% of entries across all trials, disease was present at the time of the first application, albeit at low levels (median=0.20% severity). Only disease severities of up to 0.05% at the time of the first application could be tolerated without affecting RS negatively, whereas presence of any disease at the first application had a negative effect on RY, even when disease pressure was low. 
In general, triazole fungicides applied alone performed better than strobilurins alone, but there was a wide range in efficacy among individual triazoles, with prothioconazole and tebuconazole performing best and fluquinconazole and difenoconazole being least effective. Combinations of strobilurins with triazoles (especially those containing cyproconazole) improved disease and yield loss control compared with either class alone. In contrast, combinations of triazoles with a benzimidazole fungicide did not improve RS or RY compared with triazoles alone. Across fungicides, RY and RS were correlated negatively (r=−0.6296, P=0.0017), indicating that treatments with better disease control also had higher yields.
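The arithmetic behind the reported percentages can be sketched as follows; this is a minimal illustration, assuming RS and RY are ratios of treated to untreated values. Only the medians 0.413 and 1.439 come from the meta-analysis; the helper functions are illustrative.

```python
# Minimal sketch: converting the median response ratios reported above into
# percent disease reduction and percent yield gain. Only RS = 0.413 and
# RY = 1.439 come from the trials; everything else is illustrative.

def percent_disease_reduction(rs):
    """RS = treated / untreated severity, so reduction = (1 - RS) * 100."""
    return (1.0 - rs) * 100.0

def percent_yield_gain(ry):
    """RY = treated / untreated yield, so gain = (RY - 1) * 100."""
    return (ry - 1.0) * 100.0

print(round(percent_disease_reduction(0.413), 1))  # 58.7, as in the abstract
print(round(percent_yield_gain(1.439), 1))         # 43.9
```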
Colletotrichum gossypii var. cephalosporioides, the fungus that causes ramulosis disease of cotton, is widespread in Brazil and can cause severe yield loss. Because weather conditions greatly affect disease development, the objective of this work was to develop weather-based models to assess disease favorability. Latent period, incidence, and severity of ramulosis symptoms were evaluated in controlled environment experiments using factorial combinations of temperature (15, 20, 25, 30, and 35°C) and leaf wetness duration (0, 4, 8, 16, 32, and 64 h after inoculation). Severity was modeled as an exponential function of leaf wetness duration and temperature. At the optimum temperature of disease development, 27°C, average latent period was 10 days. Maximum ramulosis severity occurred from 20 to 30°C, with sharp decreases at lower and higher temperatures. Ramulosis severity increased as wetness periods were increased from 4 to 32 h. In field experiments at Piracicaba, São Paulo State, Brazil, cotton plots were inoculated (10⁵ conidia ml⁻¹) and ramulosis severity was evaluated weekly. The model obtained from the controlled environment study was used to generate a disease favorability index for comparison with disease progress rate in the field. Hourly measurements of solar radiation, temperature, relative humidity, leaf wetness duration, rainfall, and wind speed were also evaluated as possible explanatory variables. Both the disease favorability model and a model based on rainfall explained ramulosis growth rate well, with R² values of 0.89 and 0.91, respectively. They are proposed as models of ramulosis development rate on cotton in Brazil, and weather-disease relationships revealed by this work can form the basis of a warning system for ramulosis development.
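A favorability index with the qualitative behavior described above can be sketched as a temperature response that peaks near 27°C, multiplied by a term that saturates with leaf wetness duration. The functional forms and all coefficients below are illustrative placeholders, not the fitted model from the controlled-environment study.

```python
import math

# Hedged sketch of a weather-based favorability index: severity peaks near
# the optimal 27 C, falls off sharply at temperature extremes, and rises
# with leaf wetness duration. Forms and coefficients are illustrative.

def temperature_response(t, t_min=10.0, t_opt=27.0, t_max=37.0):
    """Beta-type response, equal to 1 at t_opt and 0 outside (t_min, t_max).
    Cardinal temperatures are illustrative assumptions."""
    if t <= t_min or t >= t_max:
        return 0.0
    exponent = (t_opt - t_min) / (t_max - t_opt)
    return ((t_max - t) / (t_max - t_opt)) * \
           ((t - t_min) / (t_opt - t_min)) ** exponent

def wetness_response(hours, k=0.05):
    """Monotone saturation with leaf wetness duration (hours); k is a guess."""
    return 1.0 - math.exp(-k * hours)

def favorability(t, wetness_hours):
    """Hourly favorability in [0, 1], to be accumulated over a season."""
    return temperature_response(t) * wetness_response(wetness_hours)
```

Accumulating this index over hourly weather records would give a favorability series comparable to the field disease progress rate, as done in the study.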
Ray blight of pyrethrum (Tanacetum cinerariifolium), caused by Phoma ligulicola var. inoxydabilis, can cause defoliation and reductions of crop growth and pyrethrin yield. Logistic regression was used to model relationships among edaphic factors and interpolated weather variables associated with severe disease outbreaks (i.e., defoliation severity ≥40%). A model for September defoliation severity included a variable for the product of number of days with rain of at least 0.1 mm and a moving average of maximum temperatures in the last 14 days, which correctly classified (accuracy) the disease severity class for 64.8% of data sets. The percentages of data sets where disease severity was correctly classified as at least 40% defoliation severity (sensitivity) or below 40% defoliation severity (specificity) were 55.8 and 71.0%, respectively. A model for October defoliation severity included the number of days with at least 1 mm of rain in the past 14 days, stem height in September, and the product of the number of days with at least 10 mm of rain in the last 30 days and September defoliation severity. Accuracy, sensitivity, and specificity were 72.6, 73.6, and 71.4%, respectively. Youden's index identified predictive thresholds of 0.25 and 0.57 for the September and October models, respectively. When economic considerations of the costs of false positive and false negative decisions and disease prevalence were integrated into receiver operating characteristic (ROC) curves for the October model, the optimal predictive threshold to minimize average management costs was 0 for values of disease prevalence greater than 0.2, due to the high cost of false negative predictions. ROC curve analysis indicated that management of the disease should be routine when disease prevalence is greater than 0.2.
The models developed in this research are the first steps toward identifying and weighting site and weather disease risk variables to develop a decision-support aid for the management of ray blight of pyrethrum.
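The classification metrics quoted above follow directly from a 2×2 table of forecasts against observed outbreaks (severe = ≥40% defoliation). A minimal sketch, with illustrative counts rather than the study's data:

```python
# Minimal sketch of accuracy, sensitivity, specificity, and Youden's index
# from a 2x2 confusion table. The example counts are illustrative only.

def classification_metrics(tp, fp, tn, fn):
    """tp/fp/tn/fn: true/false positives and negatives."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)      # severe outbreaks correctly flagged
    specificity = tn / (tn + fp)      # non-severe cases correctly cleared
    youden_j = sensitivity + specificity - 1.0
    return accuracy, sensitivity, specificity, youden_j

acc, sens, spec, j = classification_metrics(tp=9, fp=2, tn=8, fn=1)
print(acc, sens, spec, round(j, 2))  # 0.85 0.9 0.8 0.7
```

Youden's index (J = sensitivity + specificity − 1) is maximized over candidate probability cutoffs to pick the predictive thresholds (0.25 and 0.57) reported above.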
Favorable meteorological and environmental conditions are critical components that affect Asian soybean rust (ASR), caused by Phakopsora pachyrhizi, the most damaging fungal disease of soybean. In this review, we used available knowledge of the meteorological factors affecting the disease to construct a systems-based approach for understanding the risk of ASR epidemics. The systems approach is based on a hierarchical framework in which the relevant environmental factors affecting key stages of the ASR disease cycle are identified; the framework includes both aerobiological and epidemiological components. The formal framework examined the following epidemic characteristics: spore release, spore dispersal, spore deposition, infection efficiency, latent period, and spore production. It allowed identification of the most important meteorological factors, along with relevant knowledge gaps, from which implications for disease forecasting were described. This new information can be used as a guide for further epidemiological research, and especially to develop and improve both local and regional risk models.
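The hierarchical structure described above can be sketched as a chain of stages, each with its own environmental favorability. Composing overall risk as a product of per-stage favorabilities is an illustrative modeling choice here, not the review's formal model; only the six stage names come from the text.

```python
# Hedged sketch of the hierarchical framework: epidemic risk composed from
# the six disease-cycle stages named in the review. Treating risk as a
# product of per-stage favorabilities in [0, 1] is an illustrative choice.

STAGES = ("spore_release", "spore_dispersal", "spore_deposition",
          "infection_efficiency", "latent_period", "spore_production")

def epidemic_risk(favorability):
    """favorability: dict mapping each stage to a value in [0, 1]."""
    risk = 1.0
    for stage in STAGES:
        risk *= favorability[stage]
    return risk

fully_favorable = {stage: 1.0 for stage in STAGES}
no_dispersal = dict(fully_favorable, spore_dispersal=0.0)
print(epidemic_risk(fully_favorable))  # 1.0
print(epidemic_risk(no_dispersal))     # 0.0: one blocked stage halts the chain
```

The multiplicative form captures the key property of such frameworks: an epidemic requires every stage of the cycle to be at least somewhat favorable.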
Previously known only from the southern United States, hosta petiole rot recently appeared in the northern United States. Sclerotium rolfsii var. delphinii is believed to be the predominant petiole rot pathogen in the northern United States, whereas S. rolfsii is most prevalent in the southern United States. In order to test the hypothesis that different tolerance to climate extremes affects the geographic distribution of these fungi, the survival of S. rolfsii and S. rolfsii var. delphinii in the northern and southeastern United States was investigated. At each of four locations, nylon screen bags containing sclerotia were placed on the surface of bare soil and at 20-cm depth. Sclerotia were recovered six times from November 2005 to July 2006 in North Dakota and Iowa, and from December 2005 to August 2006 in North Carolina and Georgia. Survival was estimated by quantifying percentage of sclerotium survival on carrot agar. Sclerotia of S. rolfsii var. delphinii survived until at least late July in all four states. In contrast, no S. rolfsii sclerotia survived until June in North Dakota or Iowa, whereas 18.5% survived until August in North Carolina and 10.3% survived in Georgia. The results suggest that inability to tolerate low temperature extremes limits the northern range of S. rolfsii.
The potential of remote sensing to nondestructively measure relationships between ray blight disease (caused by Phoma ligulicola), plant measurements, and components of pyrethrum (Tanacetum cinerariifolium (Trevir.) Sch. Bip.) biomass and yield using a hand-held multispectral radiometer was examined. A range of disease intensities was generated using fungicides in three fields over two years. Nondestructive assessments were obtained by measuring the percentage of sunlight reflected from canopies with a radiometer equipped with five wavelength bands. Combinations of wavelength ratios and four vegetation indices were calculated. Relationships between reflectance and biomass were investigated by removing foliage from the canopy and periodically measuring reflectance. Measurements such as stem height and the number of flowers in October consistently had significant linear relationships with relative pyrethrin and flower yield. The best predictors of relative flower and pyrethrin yield were found using either percentage reflectance in the near infrared (830 nm) or the difference vegetative index (DVI). Several measures had significant linear relationships with fresh weight of foliage, including the near-infrared bandwidth and the DVI, which explained 95 to 97% of the biomass variation. This study demonstrated that plant measurements and disease intensity are strongly related to pyrethrin yield, and that remote sensing has great potential to nondestructively obtain preharvest yield and biomass estimates.
New concepts in phytopathometry continue to emerge, such as the evolution of the concept of pathogen intensity alongside the well-established concept of disease intensity. The concept of pathogen severity, defined as the quantitative measurement of the amount of pathogen per sampling unit, has also emerged in response to the now commonplace development of quantitative molecular detection tools. Although disease severity, i.e., the amount of disease per sampling unit, is a well-established concept, the accuracy and precision of visual estimates of disease severity are often questioned. This article reviews disease assessment concepts, as well as the methods and assessment aids currently available to improve the accuracy and precision of visually based disease severity data. The accuracy and precision of visual disease severity assessments can be improved by quantitatively measuring and comparing the accuracy and precision of raters and/or assessment methods using linear regression, by using computer-based disease assessment training programmes, and by developing and using diagrammatic keys (standard area diagrams).
Foliar disease due to ray blight (Phoma ligulicola) in pyrethrum was quantified at three locations over 2 years in Tasmania, Australia. To obtain a range of ray blight disease intensities, replicated plots were treated with fungicides that varied in efficacy to control ray blight. Visual disease assessments and measurement of canopy reflectance were made at least once during spring (September through December). Visual assessments involved removal of flowering stems at ground level, from which measurements of defoliation severity and the incidence of stems with ray blight were obtained. Reflectance of sunlight from pyrethrum canopies was measured at 485, 560, 660, 830, and 1,650 nm using a handheld multispectral radiometer. Measurements from these wavelengths also were used to calculate all possible reflectance ratios, as well as four vegetative indices. Relationships between wavelength bands, reflectance ratios, vegetative indices, and disease intensity measures were described by linear regression analyses. Several wavelength bands, ratios, and vegetative indices were significantly related in a linear fashion to visual measures of disease intensity. The most consistent relationships, with high R² values and low coefficients of variation, varied with crop growth stage over time. The ratio 830/560 was identified as the best predictor of stem height, defoliation severity, and number of flowers produced on each stem in October. However, reflectance within the near-infrared range (830 nm) and the difference vegetative index were superior in November. The use of radiometric assessment of disease was noninvasive and provided savings in disease assessment time, which is critical where visual assessment is difficult and requires destructive sampling, as with pyrethrum.
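The two best predictors named above are simple functions of band reflectance, and can be sketched as follows; the reflectance values in the example are illustrative, and the DVI is taken here in its standard near-infrared-minus-red form (830 nm − 660 nm).

```python
# Minimal sketch of the two radiometric predictors highlighted above: the
# reflectance ratio 830/560 nm and the difference vegetative index (DVI),
# taken as near-infrared minus red reflectance (830 - 660 nm). All
# reflectance values (percent of sunlight reflected) are illustrative.

def ratio_830_560(r830, r560):
    """Reflectance ratio: best October predictor in the study."""
    return r830 / r560

def dvi(r830, r660):
    """Difference vegetative index: NIR minus red reflectance."""
    return r830 - r660

# A healthy canopy reflects strongly in the near infrared, so both indices
# drop as defoliation removes green leaf area (illustrative values):
print(ratio_830_560(45.0, 9.0))  # 5.0
print(dvi(45.0, 5.0))            # 40.0
```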
The efficacy of newly implemented fungicide recommendations on reducing the intensity of ray blight disease caused by Phoma ligulicola to achieve site-specific attainable yield potentials in Tasmanian pyrethrum fields was quantified over two seasons in 46 and 51 fields during the 2003 and 2004 growing seasons, respectively. Disease intensity and yield in two plots (10 × 4 m), one following the commercial fungicide protocol recommendations and the second receiving no fungicide, were assessed in each pyrethrum field. The commercial fungicide protocol consisted of one application of azoxystrobin at 150 g a.i./ha, followed by two applications of a tank mixture of difenoconazole at 125 g a.i./ha and chlorothalonil at 1,008 g a.i./ha at 14- to 21-day intervals. This program resulted in significant decreases in defoliation severity and the incidence of stems and flowers with ray blight, and increases in the height of stems and number of flowers produced per stem in October and November. In plots receiving the commercial fungicide protocol, the dry weight of flowers was increased by 76 and 68% in 2003 and 2004, respectively. Moreover, pyrethrin yield increased by 81 and 78% when the commercial fungicide protocol was used compared with the nontreated plots. Tobit regression was used to examine the relationships and thresholds among disease intensity measures (defoliation severity, stem severity, and incidence of flowers with ray blight) assessed just prior to harvest. This left-censored regression model was used to define subminimal thresholds, as none of the disease intensity measures could be less than 0. Defoliation severity had a threshold of 35.3% before stem severity linearly increased and a threshold of 38.2% before the incidence of flowers with ray blight linearly increased. Finally, the threshold for stem severity was 13.7% before the incidence of flowers with ray blight linearly increased.
These thresholds can be used to assist growers in making disease management decisions with the objective of minimizing loss of flowers by maintaining defoliation severity below the critical point at which the incidence of flowers with ray blight begins to linearly increase.
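The threshold relationships estimated by the Tobit regression have a simple hinge shape: the response stays at zero until the predictor passes its threshold, then increases linearly. A minimal sketch, where the thresholds (35.3, 38.2, and 13.7%) come from the study but the unit slope is an illustrative placeholder:

```python
# Hedged sketch of a left-censored (hinge) response: zero below the
# threshold, linear above it. Thresholds are from the study; the slope
# is an illustrative placeholder, not a fitted coefficient.

def censored_linear(x, threshold, slope=1.0):
    """max(0, slope * (x - threshold))."""
    return max(0.0, slope * (x - threshold))

# Stem severity begins to rise once defoliation severity exceeds 35.3%:
print(censored_linear(30.0, 35.3))            # 0.0 (below threshold)
print(round(censored_linear(40.0, 35.3), 1))  # 4.7 (above threshold)
```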
Yellow crinkle disease of papaya is a serious threat to papaya production in Australia. Space-time point pattern analysis was used to study the spatial and temporal dependence of two phytoplasma strains that cause yellow crinkle: tomato big bud (TBB) and sweet potato little leaf V4 (SPLL-V4). Incidence data for both phytoplasma strains were obtained from a field study conducted in Katherine, NT, Australia, between January 1996 and May 1999. The primary ecological and epidemiological question of interest was to elucidate the scale of spatial or spatio-temporal aggregation of phytoplasma-infected papaya plants. The hypothesis was that there would be a contagion process, where TBB- and SPLL-V4-infected papaya would be aggregated and not random. To test this hypothesis, a point pattern spatial analysis using Monte Carlo simulation was initially applied to the incidence data. Results of this analysis suggested that SPLL-V4 infected papaya plants displayed aggregation with spatial dependence up to 30 m (10 to 15 plants along or across rows), whereas there was not strong evidence to suggest that TBB-infected papaya plants were aggregated. However, when a space-time point pattern analysis was subsequently used to simultaneously test for the interaction between space and time, there was strong evidence (P < 0.01 for SPLL-V4 and P < 0.10 for TBB) to suggest a space-time interaction for both SPLL-V4 and TBB. For SPLL-V4, a space-time risk window of approximately 10 months and 20 m was detected, whereas for TBB, this risk window was 5 months and 10 m. The results of these studies support the hypothesis that papaya infection by both phytoplasma strains appears to be the result of a contagion process, providing support for the contention that insect vectors are the most likely mechanism for acquisition, dispersal, and transmission.
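A Monte Carlo point-pattern test of the kind used above can be sketched with a simple statistic: compare the mean nearest-neighbour distance of infected-plant locations against simulations of complete spatial randomness (CSR) over the same field. This is not the authors' method (they used space-time point pattern analysis); the field size, point counts, and test statistic here are illustrative choices.

```python
import math
import random

# Hedged sketch of a Monte Carlo test for spatial aggregation: a pattern is
# "aggregated" if its mean nearest-neighbour distance is smaller than that
# of nearly all simulated CSR patterns. All parameters are illustrative.

def mean_nn_distance(points):
    """Average distance from each point to its nearest neighbour."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        nearest = min(math.hypot(xi - xj, yi - yj)
                      for j, (xj, yj) in enumerate(points) if j != i)
        total += nearest
    return total / len(points)

def csr_p_value(observed, width=100.0, height=100.0, n_sim=199, seed=42):
    """One-sided p-value: fraction of CSR patterns at least as clustered."""
    rng = random.Random(seed)
    obs = mean_nn_distance(observed)
    hits = sum(
        1 for _ in range(n_sim)
        if mean_nn_distance([(rng.uniform(0, width), rng.uniform(0, height))
                             for _ in observed]) <= obs
    )
    return (hits + 1) / (n_sim + 1)

# Twenty plants packed into a 5 x 5 m corner of a 100 x 100 m field should
# look strongly aggregated relative to CSR:
rng = random.Random(0)
clustered = [(rng.uniform(0, 5), rng.uniform(0, 5)) for _ in range(20)]
print(csr_p_value(clustered))  # well below 0.05
```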
Three forecasting models for Stewart's disease (Pantoea stewartii subsp. stewartii) of corn (Zea mays) were examined for their ability to accurately predict the prevalence of Stewart's disease in Iowa at the county level. The Stevens Model, which is used as a predictor of the early wilt phase of Stewart's disease, the Stevens-Boewe Model, which predicts the late leaf blight phase of Stewart's disease, and the Iowa State Model, which is used to predict the prevalence of Stewart's disease, all use mean air temperatures for December, January, and February for a preplant prediction of Stewart's disease risk in a subsequent season. Models were fitted using weighted binary logistic regression with Stewart's disease prevalence data and air temperature data for 1972 to 2003. For each model, the years 1972 to 1999 (n = 786 county-years) were used for model development to obtain parameter coefficients. All three models indicated an increased likelihood for Stewart's disease occurring in growing seasons preceded by warmer winters. Using internal bootstrap validation, the Stevens Model had a maximum error between predicted and calibrated probabilities of 10%, whereas the Stevens-Boewe and Iowa State models had maximum errors of 1% or less. External validation for each model, using air temperature and seed corn inspection data between 2000 and 2003 (n = 154 county-years), indicated that overall accuracy to predict Stewart's disease at the county level was between 62 and 66%. However, both the Stevens and Stevens-Boewe models were overly optimistic in predicting that Stewart's disease would not occur within specific counties, as the sensitivity for these two models was quite low (18 and 43%, respectively). The Iowa State Model was substantially more sensitive (67%). The results of this study suggest that the Iowa State Model has increased predictive ability beyond statewide predictions for estimating the risk of Stewart's disease at the county level in Iowa.
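The shared structure of the three models can be sketched as a binary logistic regression mapping mean winter (December to February) air temperature to a probability of disease the next season. The intercept and slope below are illustrative placeholders, not the coefficients fitted from the 1972 to 1999 county-year data.

```python
import math

# Hedged sketch of a winter-temperature logistic forecast for Stewart's
# disease. Coefficients are illustrative placeholders, not fitted values.

def stewarts_risk(mean_winter_temp_c, intercept=-4.0, slope=1.2):
    """Predicted probability via the logistic link; warmer winters -> higher risk."""
    z = intercept + slope * mean_winter_temp_c
    return 1.0 / (1.0 + math.exp(-z))

def predict_occurrence(mean_winter_temp_c, threshold=0.5):
    """Binary county-level forecast: flag when risk exceeds the threshold."""
    return stewarts_risk(mean_winter_temp_c) >= threshold

print(round(stewarts_risk(-5.0), 3))  # cold winter: low risk
print(round(stewarts_risk(5.0), 3))   # mild winter: high risk
```

Sensitivity, as used in the validation above, is then the fraction of county-years with observed disease that this binary forecast correctly flags.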