To monitor and meet water quality objectives, it is necessary to understand and quantify the contribution of nonpoint sources to total phosphorus (P) loading to surface waters. However, the contribution of streambank erosion to surface water P loads remains unclear and is typically unaccounted for in many nutrient loading assessments and policies. As a result, agricultural contributions of P are overestimated, and a potentially manageable nonpoint source of P is missed in strategies to reduce loads. In this perspective, we review and synthesize the results of a special symposium at the 2022 ASA‐CSSA‐SSSA annual meeting in Baltimore, MD, that focused on streambank erosion and its contributions to P loading of surface waters. Based on discussions among researchers and policy experts, we overview the knowns and unknowns, propose next steps to understand streambank erosion contribution to P export budgets, and discuss implications of the science of streambank erosion for policy and nutrient loss reduction strategies.
The “Global Aridity Index and Potential Evapotranspiration Database - Version 3” (Global-AI_PET_v3) provides high-resolution (30 arc-seconds) global hydro-climatic data averaged (1970–2000) monthly and yearly, based upon the FAO Penman-Monteith Reference Evapotranspiration (ET0) equation. An overview of the methods used to implement the Penman-Monteith equation geospatially and a technical evaluation of the results are provided. Results were compared for technical validation with weather station data from the FAO “CLIMWAT 2.0 for CROPWAT” (ET0: r² = 0.85; AI: r² = 0.90) and the U.K. “Climate Research Unit: Time Series v 4.04” (ET0: r² = 0.89; AI: r² = 0.83), while showing significant differences to an earlier version of the database. The current version, Global-AI_PET_v3, supersedes previous versions, showing a higher correlation to real-world weather station data. Developed using the generally agreed-upon standard methodology for estimating reference ET0, this database and, notably, the accompanying source code provide a robust tool for a variety of scientific applications in an era of rapidly changing climatic conditions.
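The FAO Penman-Monteith reference evapotranspiration calculation underlying the database, and the aridity index derived from it (AI = P/ET0), can be sketched as follows. This is a minimal illustration of the standard FAO-56 daily formulation; the function names and sample inputs are ours, not taken from the database's source code.

```python
import math

def fao56_et0(t_mean, rn, g, u2, rh_mean, pressure=101.3):
    """Daily FAO-56 Penman-Monteith reference evapotranspiration (mm/day).

    t_mean   mean air temperature (deg C)
    rn       net radiation (MJ m-2 day-1)
    g        soil heat flux (MJ m-2 day-1), ~0 at daily time steps
    u2       wind speed at 2 m height (m/s)
    rh_mean  mean relative humidity (%)
    pressure atmospheric pressure (kPa); 101.3 at sea level
    """
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))  # saturation vapour pressure (kPa)
    ea = es * rh_mean / 100.0                                  # actual vapour pressure (kPa)
    delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of the vapour pressure curve
    gamma = 0.000665 * pressure                                # psychrometric constant (kPa/degC)
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den

def aridity_index(annual_precip_mm, annual_et0_mm):
    """Aridity Index as in the database: mean annual P / mean annual ET0."""
    return annual_precip_mm / annual_et0_mm
```

For example, a warm, moderately humid day (25 °C, Rn = 15 MJ m⁻² day⁻¹, u2 = 2 m/s, RH = 60%) yields an ET0 of roughly 5.5 mm/day, and an annual precipitation of 600 mm against an annual ET0 of 2000 mm gives AI = 0.3, in the semi-arid range.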
Tile drains are important water flow paths in agricultural catchments and must be included in hydrological models. However, their locations are rarely known, and the explicit incorporation of tile drains in hydrological models requires refined meshes around the drains, which can significantly increase computational times. Although seepage nodes have been used to represent tile drains with satisfactory performance, they have never been applied to represent all tile drainage systems in a catchment. The goal of this study is to compare different conceptual models for tile drains and soil heterogeneity for the numerical simulation of tile drainage in an agricultural catchment in Denmark. The first conceptual model for tile drains uses seepage nodes to represent only the main collector drains in the catchment, and the second model uses seepage nodes distributed over all the agricultural areas, without considering the specific locations of tile drains. A third conceptual model, labelled the Benchmark Model, represents all tile drains at their known locations with seepage nodes, and a fourth conceptual model implicitly represents tile drains as a high-permeability layer. The four models performed satisfactorily in simulating the observed outlet stream discharge and could be recommended almost interchangeably. The simulation of the water table depth was very satisfactory compared to modeling studies with similar mesh resolution (∼50 m). Results indicated that the three models using seepage nodes i) simulated similar monthly discharges and cumulative discharge volumes for most of the studied tile-drained areas, and ii) simulated surface water flow in tile-drained fields without runoff or ponding water. The shorter simulation times (around 35% faster) of the model representing the main drains and of the distributed seepage node model suggest that they are suitable for model calibration, compared to the Benchmark Model.
When the locations of tile drains are unavailable, using seepage nodes to represent drains in agricultural areas may satisfactorily simulate catchment-scale stream and drainage discharges. Four alternative soil models were developed to evaluate the effect of soil heterogeneity on the simulations. Our results suggest that at smaller scales (drainage area), soil heterogeneity is more relevant than the drainage conceptualization for improving model results. However, at the subcatchment scale, the opposite was observed, and at the catchment scale, both criteria had a comparable effect.
Artificial subsurface (tile) drainage is used to increase trafficability and crop yield in much of the Midwest due to soils with naturally poor drainage. Tile drainage has been researched extensively at the field scale, but knowledge gaps remain on how tile drainage influences the streamflow response at the watershed scale. The purpose of this study is to analyze the effect of tile drainage on the streamflow response for 59 Ohio watersheds with varying percentages of tile drainage and to explore relationships between the Western Lake Erie Bloom Severity Index and the streamflow response in heavily tile‐drained watersheds. Daily streamflow was downloaded for 2010‐2019 and used to calculate mean annual peak daily runoff, mean annual runoff ratio, the percent of observations in which daily runoff exceeded mean annual runoff (TQmean), baseflow versus stormflow percentages, and the streamflow recession constant. Heavily‐drained watersheds (> 40% of watershed area) consistently showed flashier streamflow behavior than watersheds with low percentages of tile drainage (< 15% of watershed area), as indicated by significantly lower baseflow percentages, TQmean, and streamflow recession constants. The mean baseflow percent for watersheds with high percentages of tile drainage was 20.9%, compared to 40.3% for watersheds with low percentages of tile drainage. These results contrast with similar regional research indicating greater baseflow proportions and less flashy hydrographs (higher TQmean) for heavily‐drained watersheds. Stormflow runoff metrics in heavily‐drained watersheds were significantly positively correlated with western Lake Erie algal bloom severity. Given the recent trend toward more frequent large rain events and warmer temperatures in the Midwest, increased harmful algal bloom severity will continue to be an ecological and economic problem for the region if management efforts do not address the problem at its source.
Management practices that dampen the streamflow response to storm events, such as buffer strips, wetland restoration, or drainage water management, are likely to improve the aquatic health conditions of downstream communities by limiting the transport of nutrients following storm events.
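The baseflow-versus-stormflow percentages and TQmean used in studies like the one above are computed from daily discharge series. A minimal sketch, using a single forward pass of the one-parameter Lyne-Hollick digital filter as one common separation method (assumed here for illustration, not necessarily the study's choice):

```python
import numpy as np

def lyne_hollick_baseflow(q, alpha=0.925):
    """Single forward pass of the one-parameter Lyne-Hollick filter.
    Returns the baseflow series; stormflow is q minus baseflow."""
    q = np.asarray(q, dtype=float)
    qf = np.zeros_like(q)                    # quickflow (stormflow) component
    qf[0] = 0.5 * q[0]                       # arbitrary initialisation
    for i in range(1, len(q)):
        qf[i] = alpha * qf[i - 1] + 0.5 * (1 + alpha) * (q[i] - q[i - 1])
        qf[i] = min(max(qf[i], 0.0), q[i])   # constrain 0 <= quickflow <= q
    return q - qf

def baseflow_percent(q, alpha=0.925):
    """Baseflow volume as a percentage of total flow volume."""
    q = np.asarray(q, dtype=float)
    return 100.0 * lyne_hollick_baseflow(q, alpha).sum() / q.sum()

def tqmean(q):
    """Fraction of days on which daily flow exceeds the period mean flow."""
    q = np.asarray(q, dtype=float)
    return float(np.mean(q > q.mean()))
```

A steady flow series gives a baseflow percentage near 100, while a spiky, storm-dominated series drives it down toward the ~20% range reported for heavily drained watersheds.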
Agricultural subsurface drainage systems are commonly installed on farmland to remove excess water from poorly drained soils. Conventional methods for drainage mapping, such as tile probes and trenching equipment, are laborious, cause pipe damage, and are often inefficient to apply at large spatial scales. Knowledge of the locations of an existing drainage network is crucial for understanding increased leaching and offsite release of drainage discharge and for retrofitting new drain lines within the existing drainage system. Recent technological developments in non-destructive techniques might provide a potential alternative solution. The objective of this study was to determine the suitability of unmanned aerial vehicle (UAV) imagery collected using three different cameras (visible-color, multispectral, and thermal infrared) and ground penetrating radar (GPR) for subsurface drainage mapping. The two techniques are complementary in terms of their usage, applicability, and the properties they measure, and were applied at four different sites in the Midwest USA. At Site-1, both the UAV imagery and GPR were equally successful across the entire field, while at Site-2, the UAV imagery was successful in one section of the field, and GPR proved to be useful in the other section where the UAV imagery failed to capture the drainage pipes’ location. At Site-3, little to no success was observed in finding the drain lines using UAV imagery captured under bare ground conditions, whereas good success was achieved using GPR. Conversely, at Site-4, the UAV imagery was successful and GPR failed to capture the drainage pipes’ location. Although UAV imagery seems to be an attractive solution for mapping agricultural subsurface drainage systems, as it is cost-effective and can cover large field areas, the results suggest the usefulness of GPR to complement the former as both a mapping and validation technique.
Hence, this case study compares and contrasts the suitability of the two methods, provides guidance on optimal survey timing, and recommends their combined usage when both technologies are available for drainage mapping purposes.
Tile drainage is one of the dominant agricultural management practices in the United States and has greatly expanded since the late 1990s. It has proven effects on the land surface water balance and on the quantity and quality of streamflow at the local scale. The effect of tile drainage on crop production, hydrology, and the environment at a regional scale remains elusive due to a lack of high-resolution, spatially explicit tile drainage area information for the Contiguous United States (CONUS). We developed a 30-m resolution tile drainage map of the most-likely tile-drained area of the CONUS (AgTile-US) from county-level tile drainage census data using a geospatial model that takes soil drainage information and topographic slope as inputs. Validation of AgTile-US with 16,000 ground truth points indicated 86.03% accuracy at the CONUS scale. Over the heavily tile-drained midwestern regions of the U.S., accuracy ranged from 82.7% to 93.6%. These data can be used to study and model the hydrologic and water quality responses of tile drainage and to enhance streamflow forecasting in tile-drainage-dominant regions.
Agricultural developments require changes in land surface and subsurface hydraulic functions such as protection from floods, reclamation of flooded land, irrigation, and drainage. Drainage of agricultural land has a long history and apparently traces back to the earliest civilizations of Mesopotamia and Iran before 4000 BC. In the Eastern Mediterranean, the Minoan and Mycenaean civilizations developed techniques and strategies for draining agricultural lands from the middle of the 2nd millennium BC. After the collapse of the Aegean Bronze-age civilizations, society building and agricultural innovation in the Archaic and Classical periods (ca. 800-300 BC) included successful attempts at controlling drainage and irrigation techniques. In addition, China, India, and Mesoamerica have extensive histories of drainage. The aim of this review paper is to trace the evolution of the main developments in agricultural drainage technologies through the centuries until the present. This historical review reveals valuable insights into ancient hydraulic technologies as well as irrigation and drainage management that can help chart a path toward sustainable agriculture in the future.
In headwater catchments, streamflow recedes between periods of rainfall at a predictable rate generally defined by a power‐law relationship relating streamflow decay to streamflow. Research over the last four decades has applied this relationship to predictions of water resource availability as well as estimations of basin‐wide physiographic characteristics and ecohydrologic conditions. However, the interaction of biophysical processes giving rise to the form of these power‐law relationships remains poorly understood, and recent investigations into the variability of streamflow recession characteristics between discrete events have alternatively suggested evapotranspiration, water table elevation, and stream network contraction as dominant factors, without consensus. To assess potential temporal variability and interactions in the mechanism(s) driving streamflow recession, we combine long‐term observational data from a headwater stream in the southern Appalachian Mountains with state and flux conditions from a process‐based ecohydrologic model. Streamflow recession characteristics are non‐unique and vary systematically with seasonal fluctuations in both rates of transpiration and watershed wetness conditions, such that transpiration dominates recession signals in the early growing season and diminishes in effect as the water table progressively drops below, and decouples from, the root zone across topographic positions. As a result of this decoupling, there exists a seasonal hysteretic relationship between streamflow decay and both evapotranspiration and watershed wetness conditions. Results indicate that for portions of the year, forest transpiration may actively compete with subsurface drainage for the same water resource that supplies streamflow, though for extended time periods these processes exploit distinct water stores.
Our analysis raises concerns about the efficacy of assessing humid headwater systems using traditional recession analysis, with recession curve parameters treated as static features of the watershed, and we provide novel alternatives for evaluating interacting biological and geophysical drivers of streamflow recession.
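The power-law recession relationship referenced above, −dQ/dt = aQ^b, is typically fit in log–log space using only the receding portions of the hydrograph (Brutsaert–Nieber style). A minimal sketch, with function names of our own choosing:

```python
import numpy as np

def fit_recession(q):
    """Fit -dQ/dt = a * Q^b by linear regression in log-log space,
    using only strictly receding steps (dQ/dt < 0) of a daily series."""
    q = np.asarray(q, dtype=float)
    dq = np.diff(q)                      # per-day change in discharge
    qm = 0.5 * (q[1:] + q[:-1])          # midpoint discharge for each step
    rec = dq < 0                         # keep receding steps only
    b, log_a = np.polyfit(np.log(qm[rec]), np.log(-dq[rec]), 1)
    return np.exp(log_a), b
```

For a synthetic linear-reservoir recession Q(t) = Q0·e^(−kt), the fit recovers b ≈ 1 and a ≈ k, as expected; treating a and b as event-varying rather than static is the shift the abstract argues for.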
Conventional free subsurface drainage practices in the Moghan Plain, in northwest Iran, result in low irrigation efficiency and excessive volumes of drainage water, causing extensive environmental problems. Controlled drainage (CD) is promoted to boost crop yields and reduce subsurface drainage flows and leaching of nutrients. This study was conducted to test management options for CD in irrigated farmers' fields in the Moghan Plain. Three options were tested: subsurface drains at 2 m with free outflow (FD), controlled drainage at 70 cm below the soil surface (CD70), and controlled drainage with a varying depth depending on the crop stage (CDch). Irrigation applications were based on the daily measured soil water content and thus varied per drainage treatment. In winter, wheat and barley were grown, followed by maize in summer. For each crop and treatment, three replicates were made. The highest crop yields (for all crops) were found with CDch, followed by CD70. For wheat, yields were respectively 27% and 41% higher in CD70 and CDch compared to FD. For barley, these increases were respectively 23% (CD70) and 34% (CDch), and for maize (forage yields) 19% (CD70) and 25% (CDch). The same trends were observed in water use efficiency (WUE): compared to FD, the WUE for wheat was 26% higher in CD70 and 40% higher in CDch; for barley these increases were respectively 19% (CD70) and 32% (CDch), and for maize (forage yields) 30% (CD70) and 44% (CDch). Controlled drainage reduced not only subsurface drainage rates but also nitrate and phosphorus losses. The average drain discharges with CDch were respectively 33%, 45% and 44% lower than FD for wheat, barley and maize. Flow-weighted NO3 concentrations in the drainage discharge of CD70 and CDch were lower than in FD by, respectively, 15% and 9% for wheat, 9% and 13% for barley, and 8% and 7% for maize. Soil salinity decreased in FD but slightly increased in the CD treatments.
Thus, although controlled drainage clearly has advantages over free drainage practices, more research is needed on the long-term effects of controlled drainage on soil salinity in order to optimize CD management options.
Surface runoff and soil infiltration exert significant influence on soil erosion. The effects of slope gradient/length (SG/SL), individual rainfall amount/intensity (IRA/IRI), vegetation cover (VC), and antecedent soil moisture (ASM) on runoff depth (RD) and soil infiltration (INF) were evaluated in a series of natural rainfall experiments in southern China. RD was found to correlate positively with IRA, IRI, and ASM, and negatively with SG and VC. RD first decreased and then increased with SG and ASM, first increased and then decreased with SL, grew linearly with IRA and IRI, and dropped exponentially with VC. Meanwhile, INF correlated positively with SL, IRA, IRI, and VC, and negatively with SG and ASM. INF first rose and then fell with SG, rose linearly with SL, IRA, and IRI, increased with VC following a logit function, and fell linearly with ASM. A VC level above 60% can effectively lower surface runoff and significantly enhance soil infiltration. Two prediction models, for RD and INF, accounting for the six factors above, were constructed using multiple nonlinear regression. Verification of these models showed a high Nash-Sutcliffe coefficient and low root-mean-square error, demonstrating good predictive ability for both models.
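A multiple nonlinear regression of the kind described can be sketched with scipy's curve_fit. The functional form below combines just two of the reported responses (linear growth of RD with rainfall amount, exponential drop with vegetation cover) and uses synthetic data in place of the field observations; it illustrates the fitting procedure only, not the authors' actual six-factor models.

```python
import numpy as np
from scipy.optimize import curve_fit

def rd_model(X, c0, c1):
    """Hypothetical runoff-depth model: linear in rainfall amount (IRA),
    exponentially decreasing in vegetation cover (VC)."""
    ira, vc = X
    return c0 * ira * np.exp(-c1 * vc)

rng = np.random.default_rng(1)
ira = rng.uniform(10.0, 100.0, 80)    # individual rainfall amount (mm)
vc = rng.uniform(0.0, 1.0, 80)        # vegetation cover fraction
# synthetic "observed" runoff depth with measurement noise
rd = 0.4 * ira * np.exp(-2.0 * vc) + rng.normal(0.0, 0.5, 80)

popt, _ = curve_fit(rd_model, (ira, vc), rd, p0=(1.0, 1.0))
c0, c1 = popt   # should recover roughly 0.4 and 2.0
```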
We created a new dataset of spatially interpolated monthly climate data for global land areas at a very high spatial resolution (approximately 1 km²). We included monthly temperature (minimum, maximum and average), precipitation, solar radiation, vapour pressure and wind speed, aggregated across a target temporal range of 1970–2000, using data from between 9,000 and 60,000 weather stations. Weather station data were interpolated using thin-plate splines with covariates including elevation, distance to the coast and three satellite-derived covariates: maximum and minimum land surface temperature as well as cloud cover, obtained with the MODIS satellite platform. Interpolation was done for 23 regions of varying size depending on station density. Satellite data improved prediction accuracy for temperature variables by 5–15% (0.07–0.17 °C), particularly for areas with a low station density, although prediction error remained high in such regions for all climate variables. Contributions of satellite covariates were mostly negligible for the other variables, although their importance varied by region. In contrast to the common approach of using a single model formulation for the entire world, we constructed the final product by selecting the best-performing model for each region and variable. Global cross-validation correlations were ≥ 0.99 for temperature and humidity, 0.86 for precipitation and 0.76 for wind speed. The fact that most of our climate surface estimates were only marginally improved by the use of satellite covariates highlights the importance of having a dense, high-quality network of climate station data.
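Thin-plate-spline interpolation with covariates, as used for these surfaces, can be sketched with scipy's RBFInterpolator. The station coordinates, elevations, and temperatures below are synthetic placeholders, and rescaling elevation into the same numeric range as the coordinates is a choice of ours, not a detail from the dataset's methodology.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
n = 200
xy = rng.uniform(0.0, 10.0, size=(n, 2))   # station coordinates (degrees)
elev = rng.uniform(0.0, 2000.0, size=n)    # station elevation (m)
# synthetic temperature: latitudinal gradient + lapse rate + noise
t = 25.0 - 1.5 * xy[:, 1] - 0.0065 * elev + rng.normal(0.0, 0.2, n)

# thin-plate spline fit in (x, y, elevation) space; elevation rescaled to km
X = np.column_stack([xy, elev / 1000.0])
f = RBFInterpolator(X, t, kernel='thin_plate_spline', smoothing=1.0)

pred = f(X)                                # evaluate back at the stations
r = np.corrcoef(pred, t)[0, 1]             # in-sample agreement
```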
Characterizing the impacts of climatic change on hydrologic processes is critical for managing freshwater systems. Specifically, there is a need to evaluate how the two major components of streamflow, baseflow and stormflow, have responded to recent trends in climate. We derive baseflow and stormflow for 674 sites throughout the United States from 1980 to 2010 to examine their associations with precipitation, potential evapotranspiration (PET), and maximum/minimum temperature. The northeastern (NE) and southwestern (SW) United States display consistent trends in baseflow and stormflow: increasing during fall and winter in the NE and decreasing during all seasons in the SW. Trends elsewhere and at other times of the year are more variable but still associated with changes in climate. Counter to expectations, baseflow and stormflow trends throughout the United States tend to change concurrently. These trends are primarily associated with precipitation trends, but increases in PET are influential and likely to become more important in the future.
Hydrology in many agricultural landscapes around the world is changing in unprecedented ways due to the development of extensive surface and subsurface drainage systems that optimize productivity. This plumbing of the landscape alters water pathways, timings, and storage, creating new regimes of hydrologic response and driving a chain of environmental changes in sediment dynamics, nutrient cycling, and river ecology. In this work we non-parametrically quantify the nature of hydrologic change in the Minnesota River Basin, an intensively managed agricultural landscape, and study how this change might modulate ecological transitions. During the growing season when climate effects are shown to be minimal, daily streamflow hydrographs exhibit sharper rising limbs and stronger dependence on the previous-day precipitation. We also find a changed storage-discharge relationship and show that the artificial landscape connectivity has most drastically affected the rainfall-runoff relationship at intermediate quantiles. Considering the whole year, we show that the combined climate and land-use change effects reduce the inherent nonlinearity in the dynamics of daily streamflow, perhaps reflecting a more linearized engineered hydrologic system. Using a simplified dynamic interaction model that couples hydrology to river ecology, we demonstrate how the observed hydrologic change and/or the discharge-driven sediment generation dynamics may have modulated a regime shift in river ecology, namely extirpation of native mussel populations. We posit that such non-parametric analyses and reduced complexity modeling can provide more insight than highly parameterized models and can guide development of vulnerability assessments and integrated watershed management frameworks.
Wheat, triticale, and rapeseed growth and yield were studied under various tillage (conventional, deep ripping, direct drilling) and stubble-handling (burnt, retained) regimes with and without drainage at Hamilton in south-western Victoria from 1985 to 1987. Grain yield was increased from about 2 to >4 t/ha by drainage in both years; however, effects of other treatments, although significant, were much less. Soil structure (as measured by fractional air-filled porosity at -5 J/kg) deteriorated during winter and recovered during spring and summer. A laboratory experiment showed that this variation in soil structure resulted from saturation per se and redrying. In the field, the decline in porosity was most pronounced with cultivation and the absence of drainage, but overall, the effects of stubble retention and tillage treatments were small. There was a significant positive relationship between yield and porosity on undrained areas, but not where drains were present. Drainage reduced soil structural decline during winter, while stubble retention reduced the decline in porosity in the cultivated-undrained treatment in 1987.
The extensive development of surface and subsurface drainage systems to facilitate agricultural production throughout North America has significantly altered the hydrology of landscapes compared to historical conditions. Drainage has transformed nutrient and hydrologic dynamics, structure, function, quantity, and configuration of stream and wetland ecosystems. In many agricultural regions, more than 80% of some catchment basins may be drained by surface ditches and subsurface drain pipes (tiles). Natural channels have been straightened and deepened for surface drainage ditches with significant effects on channel morphology, instream habitats for aquatic organisms, floodplain and riparian connectivity, sediment dynamics, and nutrient cycling. The connection of formerly isolated wetland basins to extensive networks of surface drainage and the construction of main channel ditches through millions of acres of formerly low-lying marsh or wet prairie, where no defined channel may have previously existed, have resulted in large-scale conversion of aquatic habitat types, from wetland mosaics to linear systems. Reduced surface storage, increased conveyance, and increased effective drainage area have altered the dynamics of and generally increased flows in larger streams and rivers. Cumulatively, these changes in hydrology, geomorphology, nutrient cycling, and sediment dynamics have had profound implications for aquatic ecosystems and biodiversity.
This paper explores the role of drainage as an instrument for agricultural and rural development and the related drainage development forces and processes. Five specific roles of drainage are distinguished: food production, agricultural intensification and diversification, sustainable irrigated land use, rural development and environmental protection. Special attention is given to the drainage development needs of the developing countries. It is argued that while at early stages of agricultural development, drainage development is generally driven by ongoing agricultural development, at later stages of development these roles often reverse, with agricultural development risking stagnation when drainage is not improved. These relationships should be taken into account in the design of drainage development programs and projects. It is emphasized that improved drainage can contribute to establishing a more diversified, competitive and sustainable type of agriculture, enhance sanitary and public health conditions and generally contribute to rural development, rural well-being and poverty alleviation. Drainage development in the developing countries is however often severely constrained by the lack of supporting public policies, institutional frameworks and professional cadres. Overcoming these constraints should be given high priority on the national and international drainage development agendas.
Identifying the existence and orientation of buried drainage systems is necessary to incorporate the impact of these features in solute transport and hydrologic models. This study was conducted to determine if a cesium magnetometer survey could identify clay tile locations. A cesium magnetometer survey with a sampling interval of 10 cm along the survey transect and 50-cm spaced transects was used at the Oregon State University Research Dairy in an attempt to map clay tile orientation and location. A shaded-relief plot of the magnetic data from a 100-m by 100-m portion of the Dairy successfully identified clay tile in the western part of the study area, but was unable to identify clay tile in the eastern part of the study area. Probing and trenching confirmed the existence of clay tile in both portions of the site. This case study has shown that cesium magnetometer surveys can locate clay tile with a horizontal spatial accuracy of 25 cm, and offers a new technique for non-invasive subsurface drainage pipe location. This study has also elucidated potential limitations of this method for identifying subsurface drainage pipe locations that may depend on soil type, management strategies, and soil magnetic properties.
The term flashiness reflects the frequency and rapidity of short-term changes in streamflow, especially during runoff events. Flashiness is an important component of a stream's hydrologic regime. A variety of land use and land management changes may lead to increased or decreased flashiness, often to the detriment of aquatic life. This paper presents a newly developed flashiness index, which is based on mean daily flows. The index is calculated by dividing the pathlength of flow oscillations for a time interval (i.e., the sum of the absolute values of day-to-day changes in mean daily flow) by total discharge during that time interval. This index has low interannual variability relative to most flow regime indicators, and thus greater power to detect trends. Index values were calculated for 515 Midwestern streams for the 27-year period from 1975 through 2001. Statistically significant increases were present in 22 percent of the streams, primarily in the eastern portion of the study area, while decreases were present in 9 percent, primarily in the western portion. Index values tend to decrease with increasing watershed area and with increasing unit-area groundwater inputs. Area-compensated index values often shift at ecoregion boundaries. Potential index applications include evaluation of programs to restore more natural flow regimes.
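The index described above (the Richards-Baker Flashiness Index) is straightforward to compute from a mean daily flow series:

```python
import numpy as np

def richards_baker_index(q):
    """Flashiness: path length of day-to-day changes in mean daily flow
    divided by total discharge over the same interval."""
    q = np.asarray(q, dtype=float)
    return np.abs(np.diff(q)).sum() / q.sum()
```

A perfectly steady flow gives an index of 0, while a series alternating daily between 1 and 2 gives 0.5; flashier streams score higher.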
Hydrology plays a critical role in wetland development and in ecosystem structure and function. Hydrologic responses to forest management and climate change are diverse in the southern United States due to topographic and climatic differences. This paper presents a comparative study of the long-term hydrologic characteristics (long-term seasonal runoff patterns, water balances, storm flow patterns) of three watersheds in the southern US. These three watersheds represent three types of forest ecosystems commonly found in the lower Atlantic coastal plain and the Appalachian upland mountains. Compared to the warm, flat, and shallow-groundwater-dominated pine flatwoods on the coast, the inland upland watershed was found to have significantly higher water yield, precipitation/Hamon's potential evapotranspiration ratio (1.9 for the upland vs 1.4 and 0.9 for the wetlands), and runoff/precipitation ratio (0.53±0.092 for the upland vs 0.30±0.079 and 0.13±0.094 for the wetlands). Streamflow from the flatwoods watersheds was generally discontinuous in most years, while the upland watershed showed continuous flow in most years. Stormflow peaks in a cypress–pine flatwoods system were smaller than those in the upland watershed in most cases, but exceptions occurred under extremely wet conditions. Our study concludes that climate is the most important factor in determining watershed water balances in the southern US. Topography affects streamflow patterns and stormflow peaks and volumes, and is key to wetland development in the southern US.
The Midwest of the USA is a highly productive agricultural region, in part due to the installation of perforated subsurface pipes, known as tile drains that remove excess water from wet soils. Tile drains rapidly move water to nearby streams, influencing how quickly streamflow rises and falls (i.e., streamflow “flashiness”). Currently, there are no comprehensive studies that compare the extent to which tile drainage influences flashiness across large and diverse agricultural regions. We address this knowledge gap by examining growing‐season (April–October) flashiness using the Richards‐Baker Index (RBI) in 139 watersheds located throughout the Midwest. Using a spatial tile‐drainage dataset, watersheds were split into low, medium, and high tile‐drainage classes. We found no significant differences between the flashiness of these three classes using a one‐way Kruskal–Wallis test. When watersheds were separated into infiltration groups to help control for different soil types, the high tile‐drainage class RBI was significantly higher than the low tile‐drainage class RBI in the high infiltration group. To further understand the causes of flashiness, additional environmental variables and their relationship to flashiness were examined using multivariate regression. In the low infiltration group, tile drainage significantly reduced flashiness, with watershed area and average depth to water table being the largest influences on flashiness. Tile drainage produced a larger reduction in flashiness in the high infiltration watersheds, with the largest influences being percent clay in the watershed and watershed area. These results indicate that the influence of tile drainage on flashiness emerges only after other watershed variables are accounted for. Given that tile drainage may increase in the future as precipitation patterns and extremes change, flashiness will likely continue to be modified. 
These results lead to an improved understanding of flood‐generating and nutrient transport mechanisms that are relevant to stakeholders across a wide range of sectors.
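The class comparison described above can be sketched with scipy's Kruskal-Wallis test. The RBI samples below are synthetic placeholders chosen for illustration, not the study's data:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(0)
# synthetic growing-season RBI samples for three tile-drainage classes
rbi_low = rng.normal(0.25, 0.05, 45)
rbi_med = rng.normal(0.27, 0.05, 45)
rbi_high = rng.normal(0.33, 0.05, 45)

# H0: the three classes are drawn from the same RBI distribution
stat, p = kruskal(rbi_low, rbi_med, rbi_high)
different = p < 0.05    # reject H0 at the 5% level?
```

The non-parametric test is a common choice here because RBI distributions across watersheds are typically skewed and not normally distributed.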
Hydrograph flashiness is a commonly used and valuable measure of hydrologic behavior, calculated directly from streamflow measurements. There is limited understanding of how and why flashiness varies at continental scales and within different hydroclimatic regions, which can aid in predicting regional responses to an accelerating water cycle. We addressed this knowledge gap by calculating a widely used metric of flashiness, the Richards-Baker Flashiness index (RBI), for 1144 watersheds across the continental US for a ten-year period, using daily streamflow from USGS gages. While regional studies have suggested that flashiness is organized by watershed size, we show that flashiness is poorly organized by drainage area at continental scales. Moreover, relationship strengths and directions with drainage area varied by hydroclimatic region. Using a regionally based random forest analysis, we show that empirical relationships between watershed characteristics and flashiness are diverse, emphasizing that drivers of flashiness are likely related to regional or smaller scale features that encapsulate relationships between climate, human impacts, and the subsurface.
The use of drainage pipe is documented as far back as 200 B.C., and drainage pipes continue to be used in poorly drained agricultural regions throughout the world. While good for crop production, the eco-hydrologic impacts of this modification have been shown to adversely affect natural drainage networks. Identifying the exact location of drainage pipe networks is essential to developing groundwater and surface water models. The geometry of drainage pipe networks installed decades ago has often been lost with time or was never well documented in the first place. Previous work has recognized that drainage pipes can be observed for certain soil types in visible spectrum (RGB) remote sensing data due to changes in soil albedo. In this work, small Unmanned Aerial Systems (sUAS) were used to collect high resolution RGB and thermal data to map subsurface drainage pipe. Within less than 96 h of a small (< 1.3 cm) rain event, a total of approximately 60 ha of sUAS thermal and RGB data were acquired at two different locations in the IML-CZO in Illinois. The thermal imagery showed limited evidence of thermal contrast related to the drainage pipe. Had the data been acquired immediately after the rain event, a temperature contrast would more likely have been detected due to lower soil moisture proximal to the drainage pipe network. The RGB data, however, revealed the drainage pipe network entirely at one site and traces of it at the other site. These results illustrate the importance of the timing of sUAS data collection with respect to the precipitation event. Ongoing related work focusing on laboratory and numerical experiments to better quantify feedbacks between albedo, soil moisture, and heat transfer will help predict the optimal timing of data collection for applications such as drainage pipe mapping.
Artificial drainage is among the most widespread land improvements for agriculture. Drainage benefits crop production, but also promotes nutrient losses to water resources. Here, we outline how a systems perspective for sustainable intensification of drainage can mitigate nutrient losses, increase fertilizer nitrogen-use efficiency and reduce greenhouse-gas emissions. There is an immediate opportunity to realize these benefits because agricultural intensification and climate change are increasing the extent and intensity of drainage systems. If a systems-based approach to drainage can consistently increase nitrogen-use efficiency, while maintaining or increasing crop production, farmers and the environment will benefit.
In Midwestern agricultural fields, subsurface tile drainage has been widely used to remove excess water from the soil through perforated tubes installed beneath the ground surface. While it plays an important role in enabling agricultural activities in wet but productive areas, this system is a major driving factor affecting water and nutrient dynamics, and water quality in this region. However, despite its critical role, the specific locations of subsurface tile drainage structures are not generally available nor well captured by conventional optical image processing due to soil surface features, such as topographic depressions and tillage. To overcome these challenges, in this study, we have explored the potential of using thermal images to identify the location of a subsurface drainage pipe. The hypothesis is that the unique spatial distribution of soil moisture set up by tile drains can result in a difference in surface soil temperature between areas near and away from drainage pipes. Toward this objective, we designed and developed an experimental device based on a dimensionless analysis at a scale of 1:20, which was deployed in the open air for 4.5 months. The experimental results demonstrate that (1) there is an ideal time for thermal image acquisition that maximizes the contrast between the regions close to and distant from subsurface drainage systems, and (2) the thermal image processing approach proposed in this study is a promising tool, offering higher accuracy and stability than optical image-based approaches in localizing subsurface drainage pipes.
Surface runoff and tile drainage are the main pathways for water movement and entry of agricultural nitrate into water resources. The objective of this 5‐yr study was to characterize the partitioning of water flow and nitrate loss between these pathways for a humid‐temperate Brookston clay loam soil under 54 to 59 yr of consistent cropping and fertilization. Cropping treatments included monoculture corn (Zea mays L., MC), continuous bluegrass (Poa pratensis L.) sod (CS), and a corn–oat–alfalfa (Medicago sativa L.)–alfalfa rotation (RC–RO–RA1–RA2). Fertilization treatments included annual fertilizer addition (F) and no fertilizer addition (NF). Tile drainage and surface runoff occurred primarily during the nongrowing season (November–April), and they were highly correlated with the mean saturated hydraulic conductivity of the near‐surface soil profile. Tile drainage accounted for 69 to 90% of cumulative water flow and 79 to 96% of cumulative nitrate loss from fertilized rotation and CS, whereas surface runoff accounted for the majority of the nitrate losses in MC (i.e., 75–93% of water flow and 65–96% of nitrate loss). Cumulative nitrate losses were highest in the RC‐F (152 kg N ha⁻¹), RC‐NF (101 kg N ha⁻¹), RA2‐F (121 kg N ha⁻¹), and RA2‐NF (75 kg N ha⁻¹) plots, and these high losses are attributed to N mineralization from the plowed alfalfa and fertilization (if applicable). Fertilization increased cumulative nitrate loss in tile drainage from all treatments, whereas no fertilization increased cumulative nitrate loss in surface runoff from the rotation. Cropping system and fertilization on clay loam soil changed how water flow and nitrate loss were partitioned between tile drainage and surface runoff.
Core Ideas
Water partitioning is highly correlated with mean saturated hydraulic conductivity.
Tile drainage was the main pathway of water flow and nitrate loss in rotation plots.
Surface runoff was the main pathway of water flow and nitrate loss in monoculture corn.
Fertilization increased nitrate loss and water flow through tile drainage.
Average nitrate loss was 36% greater in the rotation plots versus monoculture corn.
Quantitative synthesis of data from single-case designs (SCDs) is becoming increasingly common in psychology and education journals. Because researchers do not ordinarily report numerical data in addition to graphical displays, reliance on plot digitizing tools is often a necessary component of this research. Intercoder reliability of data extraction is a commonly overlooked, but potentially important, step of this process. The purpose of this study was to examine the intercoder reliability and validity of WebPlotDigitizer (Rohatgi, 2015), a web-based plot digitizing tool for extracting data from a variety of plots, including XY coordinates of interrupted time-series data. Two coders extracted 3,596 data points from 168 data series in 36 graphs across 18 studies. Results indicated high levels of intercoder reliability and validity. Implications of and recommendations based on these results are discussed in relation to researchers involved in quantitative synthesis of data from SCDs.
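One common way to quantify the intercoder reliability examined above is the correlation between two coders' independently digitized values for the same data series. A minimal sketch (an assumed analysis choice, not necessarily the study's exact statistic) might look like:

```python
import numpy as np

def intercoder_r(coder_a, coder_b):
    """Pearson correlation between two coders' independently
    digitized values for the same data series."""
    a = np.asarray(coder_a, dtype=float)
    b = np.asarray(coder_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Two coders digitizing the same four points from a published graph
# (values are hypothetical):
r = intercoder_r([12.0, 15.5, 9.8, 20.1], [12.1, 15.4, 9.9, 20.0])
```

In practice this would be computed per data series and summarized across the 168 series, alongside agreement statistics that also penalize systematic offsets (e.g., intraclass correlation).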
So far, every example in this book has started with a nice dataset that’s easy to plot. That’s great for learning (because you don’t want to struggle with data handling while you’re learning visualisation), but in real life, datasets hardly ever come in exactly the right structure. To use ggplot2 in practice, you’ll need to learn some data wrangling skills. Indeed, in my experience, visualisation is often the easiest part of the data analysis process: once you have the right data, in the right format, aggregated in the right way, the right visualisation is often obvious.
Strategies to reduce nitrate-nitrogen (nitrate) pollution delivered to streams often seek to increase groundwater residence time to achieve measurable results, yet the effects of tile drainage on residence time have not been well documented. In this study, we used a geographic information system groundwater travel time model to quantify the effects of artificial subsurface drainage on groundwater travel times in the 7443-ha Bear Creek watershed in north-central Iowa. Our objectives were to evaluate how mean groundwater travel times changed with increasing drainage intensity and to assess how tile drainage density reduces groundwater contributions to riparian buffers. Results indicate that mean groundwater travel times are reduced with increasing degrees of tile drainage. Mean groundwater travel times decreased from 5.6 to 1.1 yr, with drainage densities ranging from 0.005 m (7.6 mi) to 0.04 m (62 mi), respectively. Model simulations indicate that mean travel times with tile drainage are more than 150 times faster than those that existed before settlement. With intensive drainage, less than 2% of the groundwater in the basin appears to flow through a perennial stream buffer, thereby reducing the effectiveness of this practice to reduce stream nitrate loads. Hence, strategies, such as reconnecting tile drainage to buffers, are promising because they increase groundwater residence times in tile-drained watersheds.
For selected Lake Erie Basin tributaries, detailed studies of nutrient and sediment runoff have been underway since 1974 and of pesticide runoff since 1983. The monitoring stations subtend watersheds ranging in size from 11 km² to 16,400 km² and having similar land use and soils. Examination of the agrochemical concentration and loading patterns at these stations reveals systematic changes related to watershed size (scale). As watershed size increases, peak storm event concentrations decrease while the durations of mid-range concentrations lengthen. The extent of these scale effects is parameter specific, being most evident for suspended solids. We hypothesize that these scale effects are attributable to the pathways and timing of pollutant movement from fields into streams, coupled with dilution associated with routing of runoff water into and through the stream system from differing positions in the watershed. These scale effects need to be considered when comparing concentration and loading data from different watersheds and when designing sampling programs.
Benefits of properly designed and managed agricultural water management systems on crop yields have not been fully realized. Two drainage management systems, conventional free drainage (FD) and controlled drainage (CD), were designed and managed on two research sites in North Carolina. Site 1 (TRS) consists of four subsurface drainage plots at the Tidewater Research Station in Plymouth. Two plots have been in FD and two plots have been in CD for 14 crops between 1990 and 2010. Site 2 (BATH), which is near Bath, NC, was installed in 2008 and operated through June 2011. It consists of 6 open ditch drainage plots. Three plots have been in FD, and three in CD, from 2008 through June 2011. Both sites are in a corn-soybean-wheat rotation. There were 6 corn crops, 6 wheat crops and 7 soybean crops during the study. Controlled drainage plots on both sites experienced significant (10.4%) corn yield increases compared to the FD plots. No significant change in wheat yields was observed under CD. Soybean yield increased in all years except 2010, which was complicated by replanting, a change in variety at replanting due to lack of seed availability, and a power outage during a large storm event just prior to maturity. A significant soybean yield increase of 10.2% was observed without the 2010 crop. When 2010 was included the average yield increase was 6.5%, but the yield increase was not significant. Corn had increased yields in all crop years at all locations under controlled drainage. Wheat yields remained relatively the same between treatments in all years. Water table depths, drainage rates, precipitation and water quality data were collected and analyzed at both sites.
While some of the world's most productive agriculture is on artificially drained soils, drainage is increasingly perceived as a major contributor to detrimental off‐site environmental impacts. However, the environmental impacts of artificial or improved agricultural drainage cannot be simply and clearly stated. The mechanisms governing the hydrology and loss of pollutants from artificially drained soils are complex and vary with conditions prior to drainage improvements and other factors: land use, management practices, soils, site conditions, and climate. The purpose of this paper is to present a review of research on the hydrologic and water quality effects of agricultural drainage and to discuss design and management strategies that reduce negative environmental impacts. Although research results are not totally consistent, a great majority of studies indicate that, compared to natural conditions, drainage improvements in combination with a change in land use to agriculture increase peak runoff rates, sediment losses, and nutrient losses. Nevertheless, sediment and nutrient losses from artificially drained croplands are usually small compared to cropland on naturally well‐drained uplands. Increasing drainage intensity on lands already in agricultural production may have positive, as well as negative, impacts on hydrology and water quality. For example, increasing the intensity of subsurface drainage generally reduces loss of phosphorus and organic nitrogen, whereas it increases loss of nitrate‐nitrogen and soluble salts. Conversely, increasing surface drainage intensity tends to increase phosphorus loss and reduce nitrate‐nitrogen outflows. Improved drainage is required on many irrigated, arid lands to prevent the rise of the water table, water logging, and salinity buildup in the soil.
Although salt accumulation in receiving waters is the most prevalent problem affecting downstream users, the effect of irrigation and improved drainage on loss of trace elements to the environment has had the greatest impact in the U.S. These detrimental effects often can be avoided by identifying a reliable drainage outlet prior to construction of irrigation projects. Research has shown that management strategies can be used to minimize pollutant loads from drained lands. These strategies range from the water table management practices of controlled drainage and subirrigation, to cultural and structural measures. For example, controlled drainage has been found to reduce nitrate‐nitrogen and phosphorus losses by 45 and 35%, respectively, in North Carolina. It is becoming increasingly clear that drainage and related water management systems must be designed and managed to consider both agricultural and environmental objectives. While significant advances in our knowledge of environmental impacts and methods for managing these systems have improved in the last 20 years, there is much yet to be learned about the complex mechanisms governing losses of pollutants from drained soils.
The long‐term effects of drainage on physical properties of a lakebed silty clay soil were evaluated 16 years after initiation of a field experiment. The treatments were undrained, surface drainage, subsurface drainage, and a combination of surface and subsurface drainage. Soil conditions were characterized by surface penetration resistance and by unconfined compressive strength, hydraulic conductivity, and pore size distribution in the 0–30 cm depth. Subsurface drainage resulted in greater soil hydraulic conductivity, less unconfined compressive strength, and less surface crust resistance than treatments without subsurface drainage. Subsurface drainage also decreased bulk density and increased the volume of air‐filled pores at 0.02 to 1.0‐bar suctions, but these effects were of smaller magnitude.
An alfalfa (Medicago sativa L.)‐timothy (Phleum pratense) mixture was grown during the period of these measurements. The survival of alfalfa and the total hay yield decreased in the order: combined surface and subsurface drained, subsurface drained, surface drained, and undrained treatments.
Utility of subsurface drainage as an intervention to reclaim waterlogged saline lands and to ensure sustainability of irrigated agriculture in India has been established through experiments and pilot research conducted for over a century. While tracing the history of subsurface drainage in the 20th century, this paper lists several pilots conducted in pre- and post-independence periods. An attempt has been made to critically review the findings emerging out of such a large number of pilots. Salient findings that could serve as design guidelines or to operationalize the systems in an effective and eco-friendly manner have been put together for their application in future. It is believed that large-scale drainage projects would increasingly be implemented in India and many other developing countries. In the opinion of the author, the knowledge generated in this century would help to design and plan subsurface drainage activities on which rests the food and nutritional security of India and many other developing nations.
Up to now, most watershed models have been focused on the representation of ‘natural’ flow and transport processes. In this paper, we discuss the role of man-made networks, such as ditches, roads, hedge rows and hedges, underground drainage by buried pipes, etc. The influence of such features on the hydrology of a watershed may be of particular importance if the aim of the modelling is to predict the effect of landscape management or the fate of contaminants, e.g. pesticides, when a rain event occurs very soon after their spreading on the soil surface. It is likely that such artificial networks may act as conduits or short-circuits for the transport of contaminants, either dissolved or sorbed on soil particles, by-passing some of the retardation mechanisms such as sorption in the soil, retention of surface runoff by grass verges, biodegradation in the unsaturated zone, etc. We first present a small watershed on which the study was conducted, the Kervidy, which is a 5 km² ‘bocage’ catchment in Brittany, France. The man-made networks were observed and their extent and functioning described. We then included the potential hydraulic role of these networks in a distributed watershed model (TOPOG, [J. Hydrol. 150 (1993) 665]). This modified model, ANTHROPOG, was run, for comparison, with and without the man-made network; sensitivity tests were also made to assess the hydrologic importance of these networks. It was shown that they can have a very significant effect on the functioning of a watershed. We conclude on the relevance of the improved distributed model for the management of rural landscapes, and on the type of additional data needed to calibrate the model with parameters representative of the true processes.
This study examines spatiotemporal variability (event-based, seasonal) in the contribution of drainage tiles within a basin to basin hydrologic discharge and soluble reactive phosphorus (SRP) and total phosphorus (TP) export over a period of 1 year. Tile discharge was highly variable at both moderate (wet versus dry periods) and smaller (within-event) temporal scales, accounting for 0–90% of basin discharge at any given time. An estimated 42% of basin annual discharge originated from drainage tiles, the majority of which occurred during the winter and spring months. Concentrations of SRP and TP in drainage tile effluent were also highly variable in space and time (1–2850 μg SRP L⁻¹, 5–8275 μg TP L⁻¹). Higher concentrations of SRP and TP were linked to fields receiving manure compared to fields receiving inorganic fertilizers. SRP export from tiles accounted for 118% of basin SRP export on average, although their contribution to basin SRP export ranged from 4 to 344% on 32 discrete dates during which all tiles in the basin were sampled for hydrochemistry. On the same 32 dates, tiles accounted for an average of 43% of basin TP export, although this ranged from 0 to 200%. Management options such as tile plugs and optimizing the timing and application rates of fertilizer should be explored to minimize nutrient export from tiles.
Detailed location maps of tile drains in the Midwestern United States are generally not available, as the tile lines in these areas were laid more than 75 years ago. The objective of this study is to map individual tile drains and estimate drain spacing using a combination of GIS-based analysis of land cover, soil and topography data, and analysis of high resolution aerial photographs within the Hoagland watershed in west-central Indiana. A decision tree classifier model was used to classify the watershed into potentially drained and undrained areas using land cover, soil drainage class, and surface slope data sets. After masking out the potential undrained areas from the aerial image, image processing techniques such as the first-difference horizontal and vertical edge-enhancement filters, and density slice classification were used to create a detailed tile location map of the watershed. Drain spacings in different parts of the watershed were estimated from the watershed tile line map. The decision tree identified 79% of the watershed as potential tile drained area while the image processing techniques predicted artificial subsurface drainage in approximately 50% of the Hoagland watershed. Drain spacings inferred from the classified aerial image vary between 17 and 80 m. Comparison of estimated tile drained areas from aerial image analysis shows a close agreement with estimated tile drained areas from previous studies (50% versus 46% drained area), which were based on GIS analysis and National Resource Inventory survey. Due to lack of sufficient field data, the results from this analysis could not be validated with observed tile line locations. In general, the techniques used for mapping tile lines gave reasonable results and are useful to detect drainage extent from aerial imagery in large areas.
These techniques, however, do not yield precise maps of the systems for individual fields and may not accurately estimate the extent of tile drainage in the presence of crop residue in agricultural fields or of other spatial features with a spectral response similar to that of tile drains.
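The first-difference edge enhancement mentioned in the abstract above is one of the simpler image filters: each pixel is replaced by its difference from a neighbor, so linear tonal changes (such as a soil-albedo trace over a tile line) stand out. A minimal sketch (an illustrative reconstruction, not the study's code) might look like:

```python
import numpy as np

def first_difference_edges(img):
    """First-difference horizontal and vertical edge filters combined
    into a gradient magnitude; highlights linear tonal changes such as
    tile-drain traces in a soil-brightness image."""
    img = np.asarray(img, dtype=float)
    dx = np.zeros_like(img)
    dy = np.zeros_like(img)
    dx[:, 1:] = img[:, 1:] - img[:, :-1]   # horizontal differences
    dy[1:, :] = img[1:, :] - img[:-1, :]   # vertical differences
    return np.hypot(dx, dy)                # gradient magnitude

def density_slice(edge_img, threshold):
    """Density-slice classification: keep pixels whose edge response
    falls at or above a threshold."""
    return np.asarray(edge_img) >= threshold
```

Applied to a masked aerial image, the thresholded edge map would then be traced and cleaned to produce the tile-line vectors from which drain spacings are measured.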
In many parts of the world the frequency of river floods (flash floods) seems to have increased during the past half century. Intensified agriculture is considered as a possible cause for the changed peak flow behavior. It is believed that a large-scale, narrowly designed subsurface drainage reduces the soil water retention in periods with excessive precipitation or snow melt. To increase the soil water retention, it may be necessary to reconsider conventional drain spacing design. The present study deals with the calculation of drain spacings for optimal rainstorm runoff control. A semi-analytical procedure is developed with which for a given extreme rainfall event the drain spacing is calculated that provides the highest possible soil water retention, but no surface runoff. The model considers two-dimensional unsteady water flow between parallel tile drains, with a rising water table. It combines an analytical rising water table model with an empirical spreading water table model. A comparison of the new and a conventional drain design system (Hooghoudt–Ernst) shows that with the newly designed system a considerable temporary soil water retention during heavy rainfall can be achieved. For example, for a soil with a hydraulic conductivity of 0.5 m d⁻¹ that is underlain by an impervious barrier at the 2.0 m depth, and that is drained by tiles with a radius of 0.1 m at the 1.0 m soil depth, an additional soil water retention of 38 mm is obtained when the drain spacing is 46.0 m instead of 13.5 m for a rainfall event of 80 mm in a 4-day period. The newly proposed design system may help to reduce the flood threat in areas with large-scale agricultural drainage in periods with excessive rainfall or snow melt.
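The conventional baseline the study compares against is the steady-state Hooghoudt formula, L = sqrt((8·K·d·h + 4·K·h²)/q). A minimal sketch (a simplification of the Hooghoudt–Ernst design procedure: it takes the equivalent depth d as given rather than iterating the correction, which itself depends on L, so it will not reproduce the abstract's 13.5 m exactly) might look like:

```python
import math

def hooghoudt_spacing(K, d, h, q):
    """Steady-state Hooghoudt drain spacing, L = sqrt((8*K*d*h + 4*K*h^2)/q).
    K: hydraulic conductivity (m/d); d: equivalent depth of flow below
    drain level (m); h: midpoint water-table height above drain level (m);
    q: design drainage coefficient (m/d). Simplified: d is used directly
    instead of iterating Hooghoudt's equivalent-depth correction."""
    return math.sqrt((8.0 * K * d * h + 4.0 * K * h * h) / q)

# Roughly the abstract's scenario: K = 0.5 m/d, drains 1.0 m deep with the
# barrier at 2.0 m (d taken as 1.0 m), water table allowed to rise to the
# surface (h = 1.0 m), 80 mm drained over 4 days (q = 0.02 m/d).
L = hooghoudt_spacing(K=0.5, d=1.0, h=1.0, q=0.02)  # ~17.3 m
```

The study's contribution is to replace this steady-state criterion with an unsteady, rising-water-table criterion, which permits the much wider spacings (46.0 m in the example) that buy temporary soil water retention during the storm.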
Long-term hydrologic simulations are presented predicting the effects of drainage water management on subsurface drainage, surface runoff and crop production in Iowa's subsurface drained landscapes. The deterministic hydrologic model, DRAINMOD, was used to simulate Webster (fine-loamy, mixed, superactive, mesic) soil in a continuous corn rotation (WEBS_CC) with different drain depths from 0.75 to 1.20 m and drain spacings from 10 to 50 m in a combination of free and controlled drainage over a 60-year (1945–2004) weather record. Shallow drainage is defined as drains installed at a drain depth of 0.75 m, and controlled drainage with a drain depth of 1.20 m restricts flow at the drain outlet to maintain a water table at 0.60 m below surface level during the winter (November–March) and summer (June–August) months. These drainage design and management modifications were evaluated against a conventional drainage system installed at a drain depth of 1.20 m with free drainage at the drain outlet. The simulation results indicate the potential of a tradeoff between subsurface drainage and surface runoff as a pathway to remove excess water from the system. While a reduction of subsurface drainage may occur through the use of shallow and controlled drainage, these practices may increase surface runoff in Iowa's subsurface drained landscapes. The simulations also indicate that shallow and controlled drainage might increase the excess water stress on crop production, and thereby result in slightly lower relative yields. Field experiments are needed to examine the pathways of water movement, total water balance, and crop production under shallow and controlled drainage in Iowa's subsurface drained landscapes.