Soil Use and Management

Published by Wiley
Online ISSN: 1475-2743
Print ISSN: 0266-0032
Despite a growing awareness that erosion on arable land in Britain is a potential hazard to long-term productivity, there is still only limited information on the rates involved, particularly long-term values. Use of the caesium-137 (137Cs) technique to study soil erosion within arable fields on various soil types at 13 locations in southern Britain has yielded retrospective measurements of the long-term (c. 30 years) rates of soil loss and the patterns of soil redistribution within the study fields. The range of long-term rates of net soil loss extends from 0.6 t per hectare per year on clay soils in Bedfordshire to 10.5 t per hectare per year on brown sands in Nottinghamshire. The measured rates are compared with other published data for similar soil types and land use, and the implications for long-term productivity and potential environmental impacts are considered.
Field peas (Pisum sativum L.) were grown in sequence with winter wheat (Triticum aestivum L.) or spring barley (Hordeum vulgare L.) in large outdoor lysimeters. The pea crop was harvested either in a green immature state or at physiological maturity and residues returned to the lysimeters after pea harvest. After harvest of the pea crop in 1993, pea crop residues (pods and straw) were replaced with corresponding amounts of 15N-labelled pea residues grown in an adjacent field plot. Reference lysimeters grew sequences of cereals (spring barley/spring barley and spring barley/winter wheat) with the straw removed. Leaching and crop offtake of 15N and total N were measured for the following two years. These treatments were tested on two soils: a coarse sand and a sandy loam. Nitrate concentrations were greatest in percolate from lysimeters with immature peas. Peas harvested at maturity also raised the nitrate concentrations above those recorded for continuous cereal growing. The cumulative nitrate loss was 9–12 g NO3-N m–2 after immature peas and 5–7 g NO3-N m–2 after mature peas. Autumn-sown winter wheat did not significantly reduce leaching losses after field peas compared with spring-sown barley. 15N derived from above-ground pea residues accounted for 18–25% of the total nitrate leaching losses after immature peas and 12–17% after mature peas. When compared with leaching losses from the cereals, the extra leaching loss of N from roots and rhizodeposits of mature peas was estimated to be similar to losses of 15N from the above-ground pea residues. Only winter wheat yield on the coarse sand was increased by a previous crop of peas compared to wheat following barley. Differences between barley grown after peas and after barley were not statistically significant. 15N lost by leaching in the first winter after incorporation accounted for 11–19% of 15N applied in immature pea residues and 10–15% of 15N in mature residues. A further 2–5% was lost in the second winter.
The 15N recovery in the two crops succeeding the peas was 3–6% in the first crop and 1–3% in the second crop. The winter wheat did not significantly improve the utilization of 15N from the pea residues compared with spring barley.
In 1983, an annual Survey of Fertiliser Practice in England and Wales was extended to Scotland, to provide comprehensive information on inorganic fertilizer, lime and also organic manure use in mainland Britain. It was based on an annual sample of about 1500 farms, selected from the Agricultural Census and stratified by farm type and size. Results from the first fifteen years (1983–97) show that fertilizer nitrogen (N) rates on both tillage crops and grassland peaked at 157 and 132 kg ha–1, respectively, in the mid 1980s and subsequently decreased by c. 10%. The majority of N was applied in straight form (without P or K) to tillage crops and in compound form (containing two or more nutrients, e.g. NPK or NK) to grassland. Total N use on cereals showed little change but autumn-applied N decreased on both winter cereals and winter oilseed rape. Total N rates decreased on oilseed rape and, to a smaller extent, on maincrop potatoes and sugar beet. Between 1983–87 and 1993–97, mean phosphate (P2O5) rates declined by almost 10% on both tillage crops (from 58 to 53 kg ha–1) and on grassland (from 25 to 23 kg ha–1). The corresponding mean potash (K2O) rates decreased slightly on both tillage crops (from 64 to 62 kg ha–1), and on grassland (from 32 to 31 kg ha–1), although annual usage was more variable on grassland. Sulphur use increased appreciably on cereal and oilseed rape crops between 1993, when S data were first recorded in the survey, and 1997, when 13% and 30%, respectively, of these crop areas received S-fertilizer. However, on grassland, S use remained very low. Average lime use increased on both tillage crops and grassland between the mid 1980s and mid 1990s, from 10 to 12% and 4 to 7% of the total area, respectively. The proportion of land receiving organic manures remained at c. 16% for tillage cropping but increased slightly for grassland, from a mean of 40% in 1983–87 to 44% in 1993–97.
Manures were applied throughout the year but about half the applications to tillage land, and a quarter of those to grassland, were made in autumn when the risk of subsequent nitrate leaching loss is greatest.
During the four consecutive winters between 1984 and 1989 a computer simulation model was used to estimate the amounts of nitrogen in a cereal crop and available from soil to the crop after winter. The model does this by taking account of daily weather and by making simple assumptions about the starting conditions each autumn after the harvest of the previous crop. Some of the information which was given to farmers on viewdata systems is displayed, together with maps showing the average amounts of nitrogen in soil and crop in spring over 10 years in eastern England. This 10-year average is used as a baseline against which to judge the simulations in each of the four winters of our viewdata service.
The behaviour of potassium (K) in a range of arable soils was examined by plotting the change in exchangeable K of the topsoil (Δ Kex) at the end of a 3–5 year period against the K balance over the same period (fertilizer K applied minus offtake in crops, estimated from farmers' records of yield and straw removal). Based on the assumption that values for offtake per tonne of crop yield used for UK arable crops (MAFF 2000) are valid averages, 10–50% of Δ Kex was explained by the balance, relationships being stronger on shallow/stony soils. Excess fertilizer tended to increase Kex and reduced fertilization decreased it, requiring between 1.2 and 5.4 kg K ha−1 for each mg L−1 of Δ Kex. However, merely to prevent Kex falling required an extra 20 kg K ha−1 yr−1 of fertilizer on Chalk soils and soils formed in the overlying Tertiary and Quaternary deposits, despite clay contents >18%. On older geological materials, by contrast, medium soils needed no extra K and clays gained 17 kg K ha−1 yr−1. It is unlikely that the apparent losses on some soil types are anomalies due to greater crop K contents. Theory and the literature suggest leaching from the topsoil as a major factor; accumulation in the subsoil was not measured. Recommendations for K fertilization of UK soils might be improved by including loss or gain corrections for certain soil types.
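The balance-versus-Δ Kex relationship described above can be sketched numerically. The following Python fragment is a minimal illustration with entirely hypothetical field values, not data from the survey; it computes a K balance per field and fits an ordinary least-squares line of Δ Kex on that balance, so that the reciprocal of the slope gives kg K per mg L−1 change, comparable to the 1.2–5.4 kg K ha−1 range quoted.

```python
# Hypothetical sketch of the K balance calculation and the regression of
# the change in exchangeable K against that balance. All numbers are
# illustrative, not data from the study.

def k_balance(fertilizer_k, offtake_k):
    """K balance over the period: fertilizer applied minus crop offtake (kg K/ha)."""
    return fertilizer_k - offtake_k

def ols_fit(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical fields: (fertilizer K, offtake K) in kg/ha over a 3-5 year
# period, and the observed change in exchangeable K (mg/L) over the same period.
balances = [k_balance(f, o) for f, o in [(250, 180), (100, 160), (300, 190), (0, 120)]]
delta_kex = [25, -15, 35, -40]

a, b = ols_fit(balances, delta_kex)
# 1/b is the kg K/ha required per mg/L change in exchangeable K.
print(round(1 / b, 1))  # prints 3.1 for these illustrative numbers
```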
Each year since 1986 information has been collected about the farming systems at intersections of a nationwide 7 km square grid in Denmark. These management data and corresponding soil analyses were used in the model DAISY to simulate water and nitrogen dynamics. The model was validated with respect to harvested dry matter yield and nitrogen content in the soil. Simulated nitrate leaching from farmland areas from 1 April 1989 to 31 March 1993 was related to precipitation zones, soil type, fertilizer strategies and cropping systems. The mean simulated nitrate leaching for the whole of Denmark was 74 kg N/ha/yr, with a large yearly variation in the period considered. The simulated nitrate leached from soils with a sandy subsoil corresponded to 51% of the applied fertilizer, twice that leached from soils with a loamy subsoil. The application of pig manure resulted in average leaching losses of 105 kg N/ha/yr. The simulated nitrate leaching losses at sites where only artificial fertilizer was applied were in the following order: cereal with undersown grass < crop followed by winter cereal or winter rape < cereal or rape without a catch crop < root crops without a catch crop. Where only artificial fertilizers were applied, the simulated mean annual leaching was 59 kg N/ha from spring barley and 40 kg N/ha from winter wheat. A map of simulated nitrate leaching in Denmark was produced using a Geographical Information System.
The incidence of soil water erosion was monitored in 12 erosion-susceptible arable catchments (c. 80 fields) in England and Wales between 1990 and 1994. Factors associated with the initiation of erosion were recorded, and the extent of rills and gullies measured. Approximately 80% of the erosion events were on land cropped to winter cereals. In 30% of cases, the initiation of erosion was linked to valley floor features, which concentrated runoff. Poor crop cover, wheelings and tramlines were also assessed as contributory factors in 22%, 19% and 14% of cases, respectively. In c. 95% of cases rainfall events causing erosion were ≥10 mm day−1 and c. 80% were >15 mm day−1. Erosion was also associated with maximum rainfall intensities of >4 mm h−1 for c. 90% of cases and >10 mm h−1 for c. 20%. Mean net soil erosion rates were approximately 4 t ha−1 per annum (median value 0.41 t ha−1 per annum) and associated mean P losses 3.4 kg ha−1.
This paper reports spatial and temporal changes at the regional level in soil organic carbon (SOC) using a soil-test database. A total of 23 329 SOC test values recorded between 1990 and 2004 by certified commercial laboratories and collected in a mountainous French region (Franche-Comté) were integrated in a database. Results show a strong trend in organic carbon content, mainly related to elevation. A large loss in SOC was observed over the survey period. This loss correlated with baseline SOC content with greater loss from soils with higher carbon content. This loss is likely to be due to both changes in land use from permanent grassland to cultivation and to an increase in temperature during the survey period. Our study demonstrates that past soil-test results which were not originally intended for monitoring can provide an alternative method for detecting changes in SOC.
The effect of drought between summer 1995 and 1997 on stream and river nitrate concentrations was investigated using sites close to the long-running meteorological station in Oxford, UK. Nitrate concentrations in the River Windrush were relatively low during the drought, but after it had ended reached the highest level since records began in 1973. The low concentrations during the drought probably reflect a reduced contribution from agricultural runoff. High nitrate concentrations were found in a field drain at Wytham Environmental Change Network site during and after the drought, but discharge was greatly reduced. A woodland stream at Wytham had much lower nitrate concentrations than the field drain but these similarly increased during and after the drought. There was evidence that both a concentrating effect of low water volumes and enhanced soil nitrogen mineralization and nitrification rates were causing concentrations to rise. The effects of mineralization and nitrification were more important in woodland than agricultural land. Nitrate load over the course of a year was determined largely by discharge, but steeper gradients for the relationship between cumulative load and cumulative discharge were seen during and after the drought than before, reflecting the higher concentrations.
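The gradient analysis described above amounts to computing a flow-weighted mean concentration: nitrate load is the product of discharge and concentration, and the slope of cumulative load against cumulative discharge over a record is the flow-weighted mean, which steepens when concentrations rise. A minimal sketch, with hypothetical daily values rather than Windrush or Wytham data:

```python
# Illustrative sketch: nitrate load from paired discharge and concentration
# records, and the gradient of cumulative load against cumulative discharge.
# Values are hypothetical, not data from the study.

discharge = [2.0, 5.0, 1.0, 8.0]          # daily discharge, arbitrary volume units
concentration = [30.0, 25.0, 40.0, 20.0]  # nitrate concentration, mg N/L

# Daily load is discharge times concentration.
loads = [q * c for q, c in zip(discharge, concentration)]

cum_q = sum(discharge)
cum_load = sum(loads)

# The gradient of cumulative load vs cumulative discharge over the whole
# record is the flow-weighted mean concentration; a steeper gradient
# during/after drought reflects higher concentrations at a given discharge.
gradient = cum_load / cum_q
print(round(gradient, 2))  # prints 24.06 for these illustrative numbers
```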
Field calibrations for a neutron probe and a capacitance sensor (Diviner 2000) for measuring the soil water content of a shrinking–swelling clay soil were substantially different from commonly used default values. Using our field calibrations, the two instruments estimated similar changes in the cumulative water content of a soil profile (0–1 m depth) over one growing season.
Calibration coefficients for a Diviner 2000 capacitance sensor were developed under laboratory conditions for soils of six textures. The calibration equations, derived by regression analysis, significantly (P <0.001) related Diviner 2000 measurements of scaled frequency (SF) with volumetric soil water content of the soil. In all cases the calibration accounted for >93% of the variation (R2 adjusted) with the volumetric water content of the soil.
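A calibration of the kind described can be derived by regression as follows. The power-law form SF = A·θ^B assumed here is common for this sensor family but is an assumption, not a detail given in the abstract, and the paired observations are synthetic; with a genuine data set the fit would be noisy and the R² worth reporting, as above.

```python
# Sketch of deriving a capacitance-sensor calibration by regression,
# assuming a power-law form SF = A * theta**B fitted on log-transformed
# data. Paired observations below are synthetic (noise-free), so the
# fit recovers the generating coefficients exactly.
import math

theta = [0.05, 0.10, 0.20, 0.30, 0.40]       # volumetric water content (m3/m3)
A_true, B_true = 1.1, 0.35                   # assumed coefficients
sf = [A_true * t ** B_true for t in theta]   # scaled frequency readings

# Linear least squares on ln(SF) = ln(A) + B * ln(theta)
x = [math.log(t) for t in theta]
y = [math.log(s) for s in sf]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
B = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
A = math.exp(my - B * mx)
print(round(A, 2), round(B, 2))  # prints 1.1 0.35
```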
The Netherlands has a high cumulative mean phosphorus (P) balance. In the 20th century, cumulative mean P surpluses were ca. 4500 kg P2O5/ha. The annual surpluses have levelled off because of manure application limits from 1984 onwards. We report the effect of soil type, land use, and manure policy on changes in soil P of fields in the Netherlands during the 20th century. We used data (>5 million soil P tests) from the soil analysis laboratory BLGG AgroXpertus. Our results show that soil P has increased on average to fairly high and high ratings. Differences between regions and between land use have remained high from the first records in the 1930s; on arable land the increase continued until the end of our study period while on grassland no changes are evident in the last decades. In general regions with high livestock density have high soil P status. Soil P increased in the order bulbfields < grassland < arable land < maize land < horticulture, and in the order loess < clay < peat < sand soils. Spatial variations in P values reflect more the market value of the crops and regional availability of animal manure than (fertilizer) recommendations. Manure policy since 1984 has resulted in increasingly tight restrictions on P application from manure and fertilizers, but the effects are not yet clearly reflected in changed trends in soil P.
The Marrakech Accords allow biospheric carbon sinks and sources to be included in attempts to meet emission reduction targets for the first commitment period of the Kyoto Protocol. Forest management, cropland management, grazing land management and re-vegetation are allowable activities under Article 3.4 of the Kyoto Protocol. Soil carbon sinks and sources can therefore be included under these activities. The Kyoto Protocol states that sinks and sources of carbon should be accounted for ‘taking into account uncertainties, transparency in reporting, verifiability’. At its most stringent, verifiability would entail the sampling of each geo-referenced piece of land subject to an Article 3.4 activity at the beginning and end of a commitment period, using a sampling regime that gives adequate statistical power. Soil and vegetation samples and records would be archived and the data from each piece of land aggregated to produce a national figure. Separate methods would be required to deliver a second set of independent verification data. Such an undertaking at the national level would be prohibitively expensive. At its least stringent, verifiability would entail the reporting of areas under a given practice (without geo-referencing) and the use of default values for a carbon stock change for each practice, to infer a change for all areas under that practice. A definition of verifiability between these extremes would allow simple methods, such as those derived from IPCC default values for CO2 fluxes from soil, to be used for estimating changes in soil carbon. These may enable low-level verifiability to be achieved by most parties by the beginning of the first commitment period (2008–2012).
Grazing animals are known to change the characteristics of agricultural grasslands as a source of and pathway for phosphorus (P) loss to water. Previous work, using physico-chemical analysis of the overland flow, revealed that the presence of grazing animals increased the overall quantity of P being lost, in particular the unreactive and particulate P fractions. The aim of this study was to characterise the organic P (Po) fraction in overland flow from grazed and non-grazed grassland small plots using phosphorus-31 nuclear magnetic resonance (31P NMR) spectroscopy to give greater insight into P loss to water under simulated rainfall. The effect of the grazing animal was most pronounced in the dissolved unreactive P (DUP) and particulate unreactive P (PUP) fractions, which were over four times higher in overland flow from the grazed plots than from the non-grazed plots. Five distinct classes of P compounds were detected in the 31P NMR spectra: inorganic orthophosphate (δ = 6.83 ppm), orthophosphate monoesters (δ = 4.95–5.69 ppm), orthophosphate diesters (δ = 1.89 ppm), phosphonates (δ = 19.38 ppm), and pyrophosphates (δ = −3.26 ppm). Distinct signals at 5.69, 5.37, 5.10, and 4.95 ppm in the overland flow extracts from the plots indicated significant concentrations of myo-inositol hexakisphosphate in the orthophosphate monoester region. Orthophosphate diesters (assigned to phospholipids) and phosphonates were also only detected in overland flow collected from the grazed plot. These results indicate that normal grazing management practices may not only affect the concentrations of Po but also the forms of Po being transferred from grassland systems to water.
Yield responses of irrigated, field-grown cotton to phosphorus fertilizer application in Australia have been variable. In an attempt to understand better this variability, the distribution of fertilizer P within soil P fractions was identified using 32P and 33P radioisotopes. The soil chosen, an alkaline, grey, cracking clay (Vertosol), was representative of those used for growing cotton in Australia. Chang and Jackson fractionation of soil P from samples collected within 1 h of application indicated that 49, 7 and 13% of the P fertilizer was present as 0.5 M NH4F, 0.1 M NaOH and 1 M H2SO4 extractable P, respectively. Over 89% of the P fertilizer was recovered as Colwell extractable P in these samples, suggesting that the majority of these reaction products was in a highly plant-available form. Fertilizer-P remained in an available form within the band 51 days after application, and 68% of the applied fertilizer-P was recovered as Colwell-P (1071 mg kg−1). The Colwell-P concentration in the band was 35 times that in the unfertilized soil. Thus, the variability in crop response to P fertilizer application in these soils is not a consequence of fertilizer-P becoming unavailable to plants. These results confirm the suitability of the Colwell (1963) sodium bicarbonate extraction method for measuring available P in these soils.
Solid waste poses a serious health risk when it is disposed of inadequately because water-based solutions derived from the decomposition of solid waste products (leachate) can enter groundwater systems via plumes. To assess the public health risk and potential ecological impacts, we require knowledge of the pedological and hydrogeological settings in which waste is disposed. This is particularly the case in coarse-textured, highly permeable soil. To collect data rapidly, geophysical methods such as direct current (dc) resistivity techniques have been used; non-contact electromagnetic (EM) induction instruments have also been employed. The aim of this research was to demonstrate how a 1-dimensional inversion algorithm with lateral constraints, applied to the apparent electrical conductivity (σa) measured in the horizontal coplanar (HCP) and perpendicular coplanar (PRP) arrays of a DUALEM-421 EM induction probe, can be used to develop a two-dimensional model of the true electrical conductivity (σ) within a Quaternary aeolian sand in the Tuggerah Soil Landscape southeast of Sydney in Australia. Our results from 2D models of σ accord with estimates of bulk electrical conductivity (σb) of a leachate plume and uncontaminated groundwater, the stratigraphy of the Tuggerah soil landscape unit and the depth of sand used to landscape the decommissioned landfill. Further research is needed to determine the origin of the plume and to establish whether a quasi-3D modelling approach is applicable.
We have examined the contributions sucrose and sawdust make to the net immobilization of inorganic soil N and assimilation of both C and N into microbial biomass when they are used as part of a restoration plan to promote the establishment of indigenous vegetation on abandoned agricultural fields on the Central Hungarian Plain. Both amendments led to net N immobilization. Sucrose addition also led to mobilization of N from the soil organic N pool and its immobilization into microbial biomass, whereas sawdust addition apparently immobilized soil N into a non-biomass compartment or a biomass component that was not detected by the conventional biomass N assay (CHCl3 fumigation and extraction). This suggests that the N was either cycled through the biomass, but not immobilized within it, or that it was immobilized in a protected biomass fraction different to the fraction into which N was immobilized in response to sucrose addition.
A careful study of the etiology and symptoms of the decline phenomena in stands of silver fir (Abies alba Mill.), Scots pine (Pinus sylvestris L.), European beech (Fagus sylvatica L.) and Norway spruce (Picea abies Karst.) in southern Germany leads to the conclusion that all these diseases, although exhibiting some common features (e.g. premature senescence and shedding of leaves, formation of transparent crowns), vary considerably between species and, within one particular species, between forest regions. It therefore seems plausible to assume, as a first approach, that we have to deal with different types of disease or decline, and consequently also with varying sets of causes or stress factors. This approach can be demonstrated best by reviewing the present knowledge of diseases in Norway spruce.
A 17-year chronosequence of Acacia auriculiformis fallows on Arenosols of the Batéké Plateau (D.R. Congo) was surveyed and compared with virgin savannah soils to assess chemical soil fertility changes induced by these N-fixing trees. Significant increases in organic carbon content, total nitrogen content, cation exchange capacity and sum of base cations were found after relatively short fallow periods of only 4 years; these changes not only affected the forest floor but extended to at least 50 cm depth. The Acacia acts as a major source of organic matter (OM), hence increasing organic carbon and nitrogen content and decreasing the C/N ratio. The increased OM content suggests that humification processes are the main cause of the significant decrease in pH. Total exchangeable cations initially increased slowly but doubled (topsoil 0–25 cm) and tripled (subsoil 25–50 cm) after 10 years. The point of zero net proton charge was systematically lower than soil pH and decreased with increasing OM content, thereby increasing the cation exchange capacity, although concurrent acidification retarded a significant beneficial impact at field pH on Acacia fallows of 10 years and older. Although the chemical soil fertility improves steadily with time, after 8 years of Acacia fallow the absolute amounts of available nutrients are still small, and slash and burn practices are required to liberate the nutrients stored in the remaining biomass and litter before each new cropping period.
The 296 soil associations of the National Soil Map of England and Wales are placed into five categories of erosion risk. These are based on land use, landform and soil properties and take into account the extent of erosion in the uplands, and its frequency, extent and rates in the lowlands. Erosion of arable land is by water or wind, but in the uplands frost action and disturbance by sheep are also important. A large proportion of arable England (36%) is at moderate to very high risk of erosion, including much of the better drained and more easily worked land, especially sandy soils. In the uplands thin soils or deep peats are most at risk. If land use changes, because of increasing intensification of agriculture or in response to climatic changes, many soil associations will become more at risk of erosion.
Flooding of abandoned coal mines often causes discharges of iron-rich drainage water into the environment. Treatment of these discharges results in the formation of ochre (hydrous iron oxides) for which no end-use has been identified. Ochre effectively adsorbs phosphate from solution and thus could be used for remediation of waste waters. The resulting P-enriched ochre could then potentially be recycled as a P fertilizer. Pot and field experiments were set up to assess performance and environmental acceptability of ochre in this role, using grass and barley as test crops, as well as birch and spruce tree seedlings. Soils and plant materials were analysed for total and available P, total metals and pH. Results showed that P-saturated ochre functioned as a slow-release P fertilizer, and in the short term was as effective as conventional P fertilizer in maintaining crop yields. It also raised soil pH, and did not pose any significant problem through introduction of potentially toxic trace metals into the soil.
Using geographic information system techniques, hydrology of soil types (HOST) classes were combined with slope, rockiness, flood hazard and soil moisture deficit classes within a risk matrix to produce a slurry acceptance map for Northern Ireland (NI) on a 50 m grid. In addition, because the whole territory of NI is designated as a nitrate vulnerable zone, a nitrates action programme is to be implemented across the region in the near future, and this is likely to restrict slurry applications to the growing season. To assess the risk classes associated with slurry applications during the growing season, an additional slurry acceptance map for NI was created in which the HOST factor was excluded from the analysis. The maps created showed that, for the period January–December, the majority (80%) of agricultural soils in NI were in the severe risk category following application of 50 m3 ha−1 of slurry. However, this proportion was reduced to only 29% when the same volume of slurry was applied during the growing season, when the soils were not saturated and significant rainfall was not received in the period immediately after slurry application.
The in-field calibration of a dielectric probe to measure soil water content is described. The probe uses an access tube analogous to that of the neutron probe. The dielectric constant was measured at soil depths of 10, 20, 30, 40, 60 and 100 cm. Cores of soil were then taken from the face of pits dug 30 cm from the access tube and their soil water contents determined by oven drying. The dielectric constant values measured by the probe were calibrated against water contents from these cores. We found that sensor depth needed to be included to achieve a good calibration model that explained 72% of the variance. It is argued that depth needs to be included because of artefacts introduced during the installation of the access tube.
Soils can be used as a biospheric sink for carbon under Article 3.4 of the Kyoto Protocol and parties are able to use agricultural soil carbon sinks to contribute towards carbon emission reduction targets. This should be done ‘taking into account uncertainties, transparency in reporting, and verifiability’. Models are often tested against data sets of long-term changes in soil organic carbon (SOC), but most data sets have only mean SOC values available at each sample date, with no estimates of error about the mean. We show that when using data sets that do not include estimates of error about the mean, it is not possible to reduce the error (root mean squared error) between modelled and measured values below 6.8–8.5%, even with site-specific model calibration. Equivalent errors for model runs using regional default input values are 12–34%. Using error as an indicator of the certainty that can be attached to model projections, we show that a significant reduction in uncertainty is needed for Kyoto accounting. Uncertainties for modelling during the first Kyoto Commitment Period could be reduced by better replication of soil measurements at benchmark sites. This would allow model error to be separated from measurement error, which would allow more comprehensive model testing and, ultimately, more certainty to be attached to model predictions.
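The error statistic discussed above can be made concrete. The sketch below computes root mean squared error between modelled and measured SOC and expresses it as a percentage of the mean measurement, the form in which the 6.8–8.5% and 12–34% figures are quoted; the SOC values themselves are illustrative, not data from the study.

```python
# Minimal sketch of the model-evaluation statistic discussed above:
# root mean squared error (RMSE) between modelled and measured SOC,
# expressed as a percentage of the mean measured value.
# SOC values are illustrative only.
import math

measured = [52.0, 48.0, 50.0, 55.0]   # measured SOC, e.g. t C/ha
modelled = [50.0, 51.0, 47.0, 57.0]   # corresponding model output

rmse = math.sqrt(
    sum((m - s) ** 2 for m, s in zip(modelled, measured)) / len(measured)
)
rmse_pct = 100 * rmse / (sum(measured) / len(measured))
print(round(rmse_pct, 1))  # prints 5.0 for these illustrative numbers
```

Without replicated measurements at each sample date there is no estimate of error about the mean, so measurement error of this kind cannot be separated from model error, which is the limitation the abstract identifies.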
The efficient use of biologically fixed N in agriculture is important in organic farming and when N fertilizers are either expensive or unavailable. The aim of the study was to determine the effects of cultivation and sowing dates on the efficiency of use of biologically fixed N built up during a period of grass/clover ley by subsequently sown ryegrass. Dates of cultivation in two field experiments conducted in consecutive years (1994/95 and 1995/96) ranged from August to October and sowing was carried out either immediately after cultivation or after a delay of one month. Nitrate-N losses through leaching, herbage yields and N offtake by ryegrass were measured from 1994 to 1996. A laboratory experiment was carried out to assess net N mineralization and nitrification in the soil of the field experiment under different conditions.
We studied the long-term accumulation processes and material balances of phosphorus (P) in the soil/sediment profiles of large-scale effluent recharge basins used for wastewater reclamation by the soil aquifer treatment (SAT) system. The objective was to quantify and clarify the long-term performance of soil/sediment in the SAT system as a sorbent to filter out P from the recharged effluent. Total P concentration in the soil/sediment profiles of the Shafdan wastewater treatment plant (WWTP) increased over 25 years of operation (1977–2001) by 20–220 mg kg−1, as a result of adding loads of 0.17–6.2 kg m−2 of P. Retained P in the 0–2.0 m soil layer increased from 0.06 to 0.31 kg m−2 with increasing cumulative load of P while the retained percentage gradually decreased from 19 to 5% of the cumulative P load. Accumulation rate of P in the 0–0.15 m horizon in the basins was inversely proportional to recharge time, decreasing from ∼28 mg P kg−1 year−1 during the first 3 years of operation, to <2.3 mg P kg−1 year−1 between the 20th and 25th years of operation. Thus, P content in this horizon approached a steady state after about 10–15 years of effluent recharge under the operational conditions of the Shafdan WWTP. Phosphorus concentration in deeper horizons increased at constant rates of approximately 7.8, 5.9 and 2.9 mg P kg−1 year−1 in the 0.15–0.30, 0.30–0.60 and 1.80–2.10 m horizons, respectively, over the 25 years of effluent recharge. However, the accumulation front of P appears gradually to have moved deeper in the soil profile. In general, this phenomenon may be explained by kinetic limitations to the achievement of full adsorption equilibrium for P between the flowing solution and the solid phase components of the soil.
In addition, both the increase of EPC0 (the equilibrium P concentration in solution at which there is no sorption or desorption to or from the soil under the given conditions) caused by long-term effluent recharge, and the gradual decrease of the annual average concentration of P in the effluent input after 1995, may account for the steady-state level of P in the topsoil of the basin.
Because of the observed variability in soil available P (Olsen) contents, phosphorus budgets were used to predict changes in the soil P status of an intensively managed 6 ha grassland catchment in Northern Ireland. The P accumulation rate of approximately 24 kg/ha/y suggested an increase of soil available P (Olsen) of 1.0 mg P/kg/y. Soluble reactive phosphorus concentrations in drainflow measured on a daily basis for a two year period (January 1981–December 1982) were compared with the two year period January 1990–December 1991. The median concentration had increased by 10.0 μg P/l in 1990/91 compared with 1981/82. This difference was only apparent in mean concentrations for the two time periods after data associated with high flow events, which were more frequent in 1981/82, had been excluded from the comparison. This rate of increase of 1.1 μg P/l/y, which was interpreted as reflecting an increase in soluble reactive phosphorus concentration in soil solution, is comparable to the increase in background soluble reactive phosphorus of 1.5 ± 0.54 μg P/l/y which was reported recently over a 17 year period from diffuse sources in the much larger (4400 km2) Northern Ireland catchment of Lough Neagh.
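The budget arithmetic above can be checked by converting the surplus into a topsoil concentration change. In the sketch below, the sampling depth and bulk density are assumptions introduced for illustration, not values from the study; the result shows that only a fraction of the total-P rise need appear in the Olsen-extractable pool for the quoted 1.0 mg P/kg/y to be plausible.

```python
# Hedged arithmetic check of the P budget: converting a surplus (kg P/ha/y)
# into a topsoil total-P concentration increase (mg P/kg soil/y).
# Depth and bulk density are assumptions, not values from the study.

surplus = 24.0          # kg P/ha/y, from the catchment budget
depth = 0.15            # m, assumed sampling depth
bulk_density = 1300.0   # kg/m3, assumed

soil_mass = depth * bulk_density * 10_000   # kg soil per hectare
total_p_rise = surplus * 1e6 / soil_mass    # mg P per kg soil per year
print(round(total_p_rise, 1))  # prints 12.3 under these assumptions

# Only part of this total-P rise is recovered in the Olsen-extractable
# fraction, which is consistent with the reported 1.0 mg P/kg/y.
```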
Nitrogen balances and total N and C accumulation in soil were studied in reseeded grazed grassland swards receiving different fertilizer N inputs (100–500 kg N ha−1 year−1) from March 1989 to February 1999, at an experimental site in Northern Ireland. Soil N and C accumulated linearly at rates of 102–152 kg N ha−1 year−1 and 1125–1454 kg C ha−1 year−1, respectively, in the top 15 cm of soil during the 10-year period. Fertilizer N had a highly significant effect on the rate of N and C accumulation. In the sward receiving 500 kg fertilizer N ha−1 year−1, the input (wet deposition + fertilizer N applied) minus output (drainflow + animal product) averaged 417 kg N ha−1 year−1. Total N accumulation in the top 15 cm of soil was 152 kg N ha−1 year−1. The predicted range of NH3 emission from this sward was 36–95 kg N ha−1 year−1. Evidence suggested that the remaining large imbalance was caused by denitrification and/or other unknown loss processes. In the sward receiving 100 kg fertilizer N ha−1 year−1, N accumulation in the top 15 cm of soil was greater than the input minus output balance, even before allowing for gaseous emissions. This suggested an additional input source, possibly resulting from a redistribution of N from lower down the soil profile. This is an important factor to take into account when constructing N balances, as not all the N accumulating in the top 15 cm of soil may be directly caused by N input. N redistribution within the soil profile would exacerbate the N deficit in budget studies.
Organic manures are an important source of P which can make a significant economic contribution to farm fertilizer policies. In the region of 119 000 tonnes of P are returned annually to UK agricultural land in the form of manures collected and handled on farms, with an estimated 66 000 tonnes of P applied to tillage land and 53 000 tonnes to grassland. Previous research on the utilization of manure P has tended to indicate a lower efficiency compared with inorganic fertilizer P in the season following application, but in the longer term manure and fertilizer P can be regarded as equivalent. Failure to account adequately for manure P additions to the land may result in soil enrichment which could increase the agricultural contribution to eutrophication through surface runoff or leaching. Recent research has indicated that the current guidelines for minimizing runoff losses following the land spreading of manures are generally soundly based. However, there is a need for further research where manures are applied to cracking clay soils with underdrainage, and where rainfall soon after slurry application can increase surface runoff. The careful cycling of manures within a properly devised fertilizer plan should minimize the risk of unnecessary soil P enrichment and subsequent leaching losses by restricting topsoil extractable P levels to less than 70 mg l−1.
Production of vegetables in greenhouses, having three solid walls and heated mainly or entirely by sunlight (‘sunlight greenhouses’), has expanded greatly in the northern areas of China. Excessive applications of manure and fertilizers are common, leading to nitrate accumulations in soil. We surveyed nitrogen application rates in more than 130 commercial greenhouses in Shaanxi Province, northwest China. Average application of fertilizer N was 753 and 600 kg/ha in the Yangling and Xian areas, respectively. In addition, N added in organic form in 31 greenhouses surveyed in Yangling averaged 699 kg/ha. We also surveyed nitrate in the soil profile in 70 commercial greenhouses. In 33 greenhouses in Yangling after harvest, the average NO3−-N accumulation to 200 cm depth was 737 kg/ha; in 43 greenhouses in Xian it was 506 kg/ha to a depth of 100 cm. Vegetables are mainly grown during the winter, and in summer the plastic is removed and the soil left fallow, in part to allow accumulated salts in the soil to be leached out during a rainy period. But this procedure also leads to nitrate leaching. Nitrate loss depends on rainfall during the fallow period. In a wet year (2007), average N loss below 100 cm was estimated to be 158 kg/ha; but in a dry year (2006) nitrate accumulated in the profile, with little loss. In a wet year, summer fallow and removal of plastic is beneficial because it decreases salt accumulation in the upper soil layers. How to balance this with the loss of nitrate is a challenge for greenhouse management in the study region.
Table: Selected soil properties of different soil layers measured after 29 years.
Table: Phosphorus uptake by groundnut and rapeseed (kg P ha−1) as influenced by direct-applied P and different rates and frequencies of residual P applied in the preceding 25 years.
Table: Yields of groundnut obtained for 5 years with residual P accumulated in soil due to different rates and frequencies of P applied in the preceding 25 years.
The effects of 25 years of annual applications of P fertilizer on the accumulation and migration of soil Olsen-P, and the effects of soil residual P on crop yields when P application was withheld for the following 5 years, were evaluated in a subtropical region. Annual application of P fertilizer for 25 years to crops in summer (groundnut), winter (wheat, mustard or rapeseed) or in both seasons raised the Olsen-P status of the plough layer (0–15 cm) from initially very low (12 kg P ha−1) to medium (18 kg P ha−1) and very high levels (40–59 kg P ha−1), depending on the amount of P surplus (the amount of fertilizer applied in excess of removal by crops) (r = 0.86, P < 0.01). However, only 4–9% of the applied P fertilizer accumulated as Olsen-P to a depth of 15 cm (an increase of 2 mg kg−1 per 100 kg ha−1 surplus P) in the sandy loam soil. In the following 5 years, the raising of 10 crops without P fertilizer applications decreased the accumulated Olsen-P by only 20–30%, depending upon the amount of accumulated P and crop requirements. After 29 years, 45–256 kg ha−1 of residual P fertilizer had accumulated as Olsen-P in the uppermost 150 cm, with 43–58% below 60 cm depth; this indicates considerable movement of applied P to deeper layers in this coarse-textured soil with low P retention capacity. Groundnut was more efficient than rapeseed in utilizing residual P; however, for both crops the yield advantage of residual P could be matched by fresh P applications. These results demonstrate little agronomic advantage above approximately 20 mg kg−1 of Olsen-P build-up and suggest that further elevation of soil P status would only increase the risk of environmental problems associated with the loss of P from agricultural soils in this region.
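The abstract's figure of roughly 2 mg/kg Olsen-P per 100 kg/ha of surplus P can be cross-checked against the reported 4–9% recovery, under assumed values for the plough-layer mass (a bulk density of 1.5 g/cm3 for a sandy loam is an assumption, not a study value):

```python
# Cross-check of "2 mg/kg Olsen-P per 100 kg/ha surplus P" against the
# reported 4-9% recovery. Bulk density (1.5 g/cm3) and the 0-15 cm layer
# are illustrative assumptions.

soil_mass_kg = 10_000 * 0.15 * 1500          # one hectare, 0-15 cm layer
surplus_mg = 100.0 * 1e6                     # 100 kg P/ha surplus, in mg
total_p_rise = surplus_mg / soil_mass_kg     # mg/kg if all surplus stayed
olsen_rise = 2.0                             # mg/kg (abstract)
recovery = olsen_rise / total_p_rise         # fraction recovered as Olsen-P
print(f"{recovery:.1%}")                     # prints 4.5%, inside 4-9%
```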
Nitrogen deficiency is the major problem in the creation of new ecosystems on most derelict land. Initially there is insufficient nitrogen in the wastes to drive the new systems, and nitrogen accumulation is therefore required. The most cost-effective way of providing this nitrogen is to use leguminous species, which fix nitrogen from the atmosphere. Once nitrogen starts to accumulate in the soil, management should aim to promote efficient cycling. Maintaining a near-neutral soil pH and a sward with a small C:N ratio helps to increase the mineralization of nitrogen in dead plant residues, and grazing animals also reduce nitrogen accumulation in dead vegetation.
Estimating N mineralization is important both environmentally and economically. Chemical and biological tests have been used for many years in an attempt to predict the N-supplying capacity of soil. Simple methods are needed to predict N mineralization as a means to guide N management in rangeland ecosystems dependent on the natural N-supplying capacity of soils. A short-term C mineralization assay may provide a routine and rapid procedure to attain this goal. The objective of this study was to investigate the association between C and N mineralization in a calcareous soil. A calcareous soil was amended with eight range plant materials and incubated under aerobic conditions (50% water holding capacity) at 25 °C. The potentially mineralizable C (C0) values ranged from 2232 mg C/kg soil, with a decomposition rate constant (k) of 0.058/day, for Hordeum bulbosum L. to 2834.2 mg C/kg soil, with a k value of 0.115/day, for the Medicago sativa L. treatment; the k value for Medicago sativa L. was thus about twice that for Hordeum bulbosum L. The product of k and C0 (kC0) was highly correlated (r = 0.96, P < 0.001) with N mineralization/immobilization (Nm). Moreover, Nm was more strongly correlated (r = 0.95, P < 0.001) with cumulative CO2-C evolved during the first 6 days of incubation than with other incubation periods. Overall, the quantities of Nm were closely associated with kC0 and short-term CO2-C evolution. The degree of association between C and N mineralization is time-dependent.
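The C0 and k values above imply the standard first-order mineralization model Ct = C0(1 − e−kt). A minimal sketch, using only the two parameter sets reported in the abstract (the function names and 6-day evaluation point are illustrative):

```python
import numpy as np

# First-order C mineralization model implied by the abstract:
#   Ct = C0 * (1 - exp(-k * t))
# C0 and k below are the abstract's reported values; everything else
# is an illustrative assumption.

def cumulative_c(t_days, c0, k):
    """Cumulative CO2-C evolved (mg C/kg soil) after t_days."""
    return c0 * (1.0 - np.exp(-k * t_days))

treatments = {
    "Hordeum bulbosum": {"c0": 2232.0, "k": 0.058},
    "Medicago sativa":  {"c0": 2834.2, "k": 0.115},
}

for name, p in treatments.items():
    kc0 = p["k"] * p["c0"]                   # the index correlated with Nm
    c6 = cumulative_c(6.0, p["c0"], p["k"])  # 6-day cumulative CO2-C
    print(f"{name}: kC0 = {kc0:.1f}, 6-day CO2-C = {c6:.0f} mg/kg")
```

Despite the two treatments having similar C0 values, the larger k of the Medicago sativa material more than doubles both kC0 and the 6-day CO2-C, which is consistent with the abstract's finding that these short-term indices track Nm.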
The accuracy of assays based on β-galactosidase activity and on an enzyme-linked immunosorbent assay specific to Thanatephorus cucumeris was compared with techniques based on soil dilution plating and baiting in sterilized field soil. Although soil dilution plating is reasonably quantitative, it requires substantial time, material and labour. Plant baits gave inconsistent results in the estimation of T. cucumeris populations in the soil. Enzyme-linked immunosorbent assay (ELISA) using monoclonal antibodies is suitable for detecting the presence of a range of anastomosis groups (AGs) of T. cucumeris in soil samples, but more quantitative applications seem to be limited to a very narrow range of concentrations of the fungus (0–10 μg/g). Monoclonal antibody ELISA could be used if the soil samples are routinely further diluted, provided the range of concentrations is uniformly low. An assay of β-galactosidase permits estimation over a more adequate range of concentrations (0–500 μg/g) and may be used in defined experiments using uninoculated soil samples.
Uniform application rates of fertilizers and herbicides may result in over-treating some soils and under-treating others; costs may be unnecessarily large, and soil, ground water and surface waters may be contaminated. An alternative is site-specific treatment, tailored to the individual soil types present in agricultural fields of any size. To study the pollution hazards of the herbicide alachlor, leaching and adsorption experiments used disturbed samples and undisturbed soil columns. Adjoining Ves, Normania and Webster soil series (Udic Haplustoll; Aquic Haplustoll; Typic Haplaquoll) were sampled and analysed for various properties. Uniformly ring-labelled 14C-alachlor was used to study adsorption and leaching characteristics in these soils. Results show different alachlor behaviour in topsoil and subsoil layers.
As a result of the important role played by phosphorus (P) in surface water eutrophication, the susceptibility of soils to release P requires evaluation. The degree of phosphorus saturation, assessed by oxalate extraction (DPSox), has been used as an indicator. However, most laboratories do not include DPSox in routine soil tests because of cost and time. This study evaluates the suitability of the ammonium acetate extraction in the presence of EDTA (AAEDTA), the standard soil test P (STP) in Wallonia (Southern Belgium), to predict DPSox; we also compared it with the Mehlich 3 extraction. Ninety-three topsoil samples were collected in agricultural soils throughout Wallonia. Good correlations were found between the AAEDTA and the Mehlich 3 methods for P, Fe and Al (r = 0.85, 0.77 and 0.86, respectively). An exponential relationship was found between PAAEDTA and DPSox. Results of principal component analysis and regression demonstrated that STP can be used to predict DPSox (r = 0.93) after logarithmic transformation. Soil test Al was also a good indicator of the P sorption capacity (PSCox) of soils (r = 0.86). Including the clay fraction in regression equations only slightly improved the prediction of PSCox (r = 0.90), while other readily available data (such as pH or organic carbon) did not significantly improve either DPSox or PSCox predictions.
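The log-transform regression described above can be sketched with ordinary least squares on a log-transformed predictor. The coefficients and data below are synthetic and illustrative; only the sample size (93) and the functional form come from the abstract:

```python
import numpy as np

# Sketch of the log-transform regression the abstract describes:
# DPSox predicted from routine soil-test P after taking logs.
# a_true, b_true and the synthetic data are hypothetical, not the
# study's fitted values.

rng = np.random.default_rng(0)
a_true, b_true = 5.0, 12.0              # hypothetical intercept and slope

stp = rng.uniform(5.0, 80.0, size=93)   # 93 samples, as in the study
dps = a_true + b_true * np.log(stp)     # noiseless synthetic DPSox (%)

# Ordinary least squares on the log-transformed predictor
b_hat, a_hat = np.polyfit(np.log(stp), dps, 1)
print(a_hat, b_hat)                     # recovers a_true, b_true
```

On noiseless data the fit recovers the generating coefficients exactly; with real soil-test data the r = 0.93 reported in the abstract indicates how much scatter remains around such a fitted line.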
Figure: Changes in soil pH when the Ultisol was incubated with biochar from nine crop residues: (a) biochar samples from non-legume residues; (b) biochar samples from legume straw.
Table: pH, alkalinity and base cation content of biochar.
Figure: Correlation of soil pH with biochar pH (a) and biochar alkalinity (b).
Figure: Zeta potential of biochar at different pHs: (a) biochar from non-leguminous residues; (b) biochar from leguminous straw.
Table: Exchange properties after incubation of the soil with different types of biochar for 60 days.
Biochar was prepared using a low-temperature pyrolysis method from nine plant materials: non-leguminous straw from canola, wheat, corn, rice and rice hull, and leguminous straw from soybean, peanut, faba bean and mung bean. Soil pH increased during incubation of the soil with all nine biochar samples added at 10 g/kg. Biochar from legume materials produced greater increases in soil pH than that from non-legume materials. The addition of biochar also increased exchangeable base cations, effective cation exchange capacity and base saturation, whereas soil exchangeable Al and exchangeable acidity decreased as expected. The liming effect of the biochar samples on soil acidity correlated with their alkalinity, with a close linear relationship between soil pH and biochar alkalinity (R2 = 0.95). Therefore, biochar alkalinity is a key factor controlling the liming effect on acid soils. The incorporation of biochar from crop residues, especially from leguminous plants, can both correct soil acidity and improve soil fertility.
The approximate time-scales for serious lowering of the base status of acidic upland soils in northeast Scotland have been based on assessments of geochemical weathering rates in two upland catchments. Periods of 1100 and 12000 years are obtained for soils evolved primarily from granite and quartz-biotite-norite respectively. Factors regulating the rate of removal of base cations in drainage water are discussed, to elucidate those which significantly influence long-term rates of soil acidification. The relationship between base cation leaching and river water acidity is briefly considered.
A review of recent data shows that (i) dissolved CO2 has its greatest acidifying effect in soils with pH values above about 6.5, (ii) fertilizers containing NH4+ ions or urea will acidify soil whether the ions are taken up directly by plants or are first nitrified, (iii) oxidation of nitrogen and sulphur in soil organic matter causes acidification, especially after deforestation, and (iv) the acidifying effect of rainfall and dry deposition is due to sulphuric and nitric acids, SO2 and NH4+ ions. A table is given showing the order of magnitude of each source of acidification.
The impact of isolated trees and natural forest vegetation on soil acidity is discussed. There is a considerable variation in impact between species on similar soils and between sites for any given species. The effect of coniferous plantations on soil acidity is reviewed and the causes of any increased acidity discussed. Crop species, initial soil conditions, silvicultural practices and the proportion of the tree removed at felling are all important factors influencing the long-term impact of plantations on soil acidity.
Natural acidification processes result in increasing solubility of aluminium as soils become more acid. Exchangeable aluminium provides a large reserve that can be mobilized by percolating acids or salts, with solution pH determining the upper limit of its solubility. Aluminium can also be mobilized within soils and into drainage waters in soluble complexes with silica or fluoride, and in organically complexed forms.
A uniquely British feature of the acid rain debate is the association of afforestation with enhanced stream-water acidification. Harriman & Morrison (1982), working in the Trossachs region of central Scotland, found that streams flowing through forests were consistently more acid, and had higher concentrations of Cl− and SO42−, than analogous streams draining adjacent open moorland. The possible causes of this enhanced acidification are reviewed, and suggestions are put forward as to how the forest manager could minimize the risk of stream-water acidification.
Soil samples have been taken periodically from unlimed plots of the 130-year-old Park Grass Experiment and from the 100-year-old Geescroft Wilderness at Rothamsted. Changes in the pH of the samples show how acidification has progressed. The soils are now at, or are approaching, equilibrium pH values which depend on the acidifying inputs and on the buffering capacities of the soils. We have calculated the contributions to soil acidification of natural sources of acidity in the soil, atmospheric deposition, crop growth and nutrient removal, and, where applicable, additions of fertilizers. The relative importance of each source of acidification has changed as the soils have become more acid. Acid rain (wet deposited acidity) is a negligible source, but total atmospheric deposition may comprise up to 30% of acidifying inputs at near neutral soil pH values and more as soil pH decreases. Excepting fertilizers, the greatest causes of soil acidification at or near neutral pH values are the natural inputs of H+ from the dissolution of CO2 and subsequent dissociation of carbonic acid, and the mineralization of organic matter. Under grassland, single superphosphate and small amounts of sodium and magnesium sulphates have had no effect on soil pH, whilst potassium sulphate increased soil acidity slightly. All of these effects are greatly outweighed under grassland, however, by those of nitrogen fertilizers. Against a background of acidification from atmospheric, crop and natural inputs, nitrogen applied as ammonium sulphate decreased soil pH up to a maximum of 1.2 units at a rate in direct proportion to the amount added, and nitrogen applied as sodium nitrate increased soil pH by between 0.5 and 1 unit.
Correlation analysis was used to determine the main factors related to soil pH and to yield of white clover in a range of hill soils. Results for 109 Northern Ireland pasture soils showed that pH (H2O) was significantly correlated with exchangeable Ca, total exchangeable bases, base saturation, P, exchangeable Al and Al saturation, but not with exchangeable Mn. Clover yield (dry weight of shoots) in 12 acid soils from Northern Ireland, Scotland and the Falkland Islands was significantly correlated with exchangeable Ca, total exchangeable bases and Al saturation. The results support the use of Al saturation rather than exchangeable Al, soil solution Al or pH when calculating lime requirements to overcome these limiting factors in hill soils.
A 3-year field trial examined, in a long-term no-till system, the effects of surface-applied lime and a cover of black oat (Avena strigosa Schreb.) residues on soil chemical attributes, root growth and grain yield of corn (Zea mays L.) and soybean (Glycine max (L.) Merr.) on a loamy, kaolinitic, thermic Typic Hapludox in Paraná State, Brazil. The treatments consisted of dolomitic lime broadcast on the soil surface at 0 or 12 t/ha, with and without a cover of black oat residues. Corn and soybean were grown without rainfall limitation. Surface-applied lime reduced soil acidity and decreased aluminium (Al) toxicity to a 10-cm depth 1 year after application. Three years after application, surface liming had increased pH and the content of exchangeable Ca2+ to a 20-cm depth and decreased Al toxicity to a 40- to 60-cm depth, indicating that the surface-applied lime had moved deeper. The black oat residue cover did not favour the movement of surface-applied lime to alleviate subsoil acidity, and an increase in Al3+ saturation at the soil surface was found in unlimed plots with black oat residues. Root growth and grain yields of corn and soybean were not influenced by surface liming, with or without black oat residue cover. Despite the soil acidity, 55–60% of corn and soybean root length was found in the 0- to 10-cm depth. The results suggest that Al toxicity is low in no-till systems during cropping seasons with adequate and well-distributed rainfall, but this effect is not related to the presence of a cover of oat residues.
Soil profiles, first sampled between 1963 and 1973, were resampled in 1991 in an upland area with moderately high deposition of pollutants. One hundred horizons from 32 profiles, representing 10 different soil subgroups, were analysed for pH and seven variables related to pH, using the same laboratory methods on both sampling occasions. To allow comparisons to be made with results obtained with these old methods, analysis of the 1991 samples was repeated for some determinands using the methods currently used in the analytical laboratory. Organic and A horizons show a consistent increase in acidity between samplings. Although brown soils and lithomorphic soils have increased in acidity throughout their depth, gleys and podzols have decreased in acidity at depth, probably because of poor water transmission downwards into these horizons. Correlations with other determinands suggest that the dominant process in the soils is leaching of basic cations and their replacement on exchange sites by protons and probably aluminium ions. A likely cause of the increase in soil acidity is the deposition of atmospherically transported pollutants.
Crop growth on strongly weathered soils is often limited by soil compaction in addition to aluminium toxicity and/or calcium deficiency. This study examines the effects of subsoiling, lime and gypsum on penetrometer resistance, acidity, aluminium and calcium levels and cotton (Gossypium hirsutum L.) root growth on soils transitional between Cecil and Appling series (clayey, kaolinitic, thermic Typic Hapludults) in the Piedmont region of Georgia, USA. The main plots were subsoiled to depths of 0.35 or 0.80 m or untreated. Dolomitic limestone (0 or 4.03 t per hectare on subplots) and phosphogypsum (0 or 10 t per hectare on sub-subplots) were incorporated into the surface soil (0.15 m). Deep subsoiling (0.80 m depth) decreased penetrometer resistance at 0.3–0.5 m depth and increased yield in two of three years, but there was no response to shallow subsoiling (0.35 m depth). Lime increased yield when surface soil water pH prior to amendment was less than a Cate-Nelson critical value of 4.6. Gypsum moved downward much more rapidly than lime, increasing soil solution calcium ion activity to a depth of 0.8 m within 5 months of application. There were differences in clay content between replicate plots and calcium movement was faster where the clay content was less. Yield responses to gypsum in 1986 were attributed to increased root growth below 0.2 m resulting from the increased calcium ion activity. Yield response to gypsum in limed sub-subplots was significant only in 1986.
Fifteen soil profiles in the Alltcailleach Forest in NE Scotland have been resampled after almost 40 years. The pH, in 0.01 M CaCl2, of the soil has decreased by 0.07 to 1.28 units in 80% of the surface organic horizons and by 0.16 to 0.54 units in 73% of the mineral horizons below 40 cm. The key factors governing increases and decreases in soil pH are changes in ground vegetation and tree canopy, although some effects of acid deposition cannot be ruled out.