Long-term groundwater trends and their impact on the future extent of dryland salinity in Western Australia in a variable climate

Article · January 2008
RJ George (1,6), RJ Speed (2), JA Simons (3), RH Smith (4), R Ferdowsian (5), GP Raper (1,6) & DL Bennett (1,6)
Catchment Hydrology Group (1-5), Department of Agriculture and Food, Western Australia: (1) PO Box 1231, Bunbury 6321; (2) PO Box 110, Geraldton 6531; (3) PMB 50, Esperance 6450; (4) PO Box 423, Merredin 6415; (5) 444 Albany Hwy, Albany 6330; (6) University of WA Centre for Ecohydrology.
Introduction
Consecutive West Australian Governments have fostered agricultural development in the Wheatbelt.
Excluding the forested Darling Ranges, this part of south-western Australia covers about 24 M ha.
Native, perennial, deep-rooted forests and woodlands were cleared in phases. Land releases increased
after each World War. For example, in the 15 years after World War 2, the cultivated area increased
from 5.6 M ha to 10 M ha. Similarly, land releases in the 1960s alienated an additional 3.5 M ha. By
2001, when clearing had effectively ceased, over 19 M ha of land had been converted to annual crops
and pastures. Residual forests (>4 M ha) only remain in the Darling Ranges, parts of the Perth Basin
and as isolated remnants in the wheatbelt.
Changes to the environment followed the plough. While W.E. Wood (1924) first published the details
of a causal link between clearing and salinity, it was not until the 1950s that dryland salinity emerged
as a major farmland and water supply issue. By this time, a survey of farmers revealed that 40,000 ha
of previously arable land had become salt affected, and over 400,000 ha was at risk (George 1990).
The extent of salinity has been tracked using a combination of methods, at a range of scales.
Extensive surveys of salinity were undertaken by the Australian Bureau of Statistics and the Department
of Agriculture every 5 years over a 47-year period. While the questions asked have varied slightly,
surveyed farmers reported that the saline area increased from 73,476 ha (1955) to 932,695 ha (2003).
Between 1996 and 2000, the Land Monitor project used satellite imagery and a high-resolution Digital
Elevation Model (±1 m) to estimate salinity at paddock scales. Interpretation indicated that 992,000 ha
of the wheatbelt, including 821,000 ha of agricultural land, were severely salt-affected. Additional
land within palaeodrainages (336,580 ha) was also classified saline (85,700 ha). The Project
also estimated the equilibrium valley hazard (not risk) using a Digital Elevation Model and rule based
approach. This area was forecast to be between 2.8 and 4.4 M ha.
Prior to the publication of the Land Monitor results, the NLWRA (2000) reported a salinity assessment
conducted across all Australian States. In Western Australia, Short and McConnell (2001) reported
that 3.552 M ha were classed as prone to salinity in 2000 (16%; defined as a watertable 2-5 m below
ground surface with a rising trend), that 4.181 M ha (21%) would be at risk by 2020, and that
6.490 M ha (33%) would be at risk at equilibrium (>2100).
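The "prone to salinity" rule quoted above (watertable 2-5 m below ground with a rising trend) can be sketched as a simple classification. The 2-5 m threshold follows the text; the function, category names and the <2 m cut-off are illustrative assumptions of this sketch, not the published rule set.

```python
def salinity_class(depth_m, trend_m_per_yr):
    """Classify a bore by depth to watertable (m below ground) and
    trend (m/yr, positive = rising). The 2-5 m 'prone' band follows
    the text; the other categories are assumptions of this sketch."""
    if depth_m < 2.0:
        return "discharge / already saline"   # watertable at or near surface
    if depth_m <= 5.0 and trend_m_per_yr > 0:
        return "prone"                        # shallow and rising
    return "deeper or not rising"

print(salinity_class(3.5, 0.1))    # prone
print(salinity_class(12.0, 0.3))   # deeper or not rising
```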
Watertable Analysis
Rotary-air-blast drilling rigs, operated by regional hydrologists of the Department of Agriculture and
Food, were used to establish a network of 1318 long-term monitoring bores [termed SALTWATCH
bores]. These bores exist in clusters at about 100 catchments/sites across the agricultural regions,
representing most of the 19 M ha cleared area. Bores were typically drilled to basement, on transects
from upper to lower slopes, or in areas that were saline or were suspected of having a significant risk.
Manual time series analyses of trends in all bores were undertaken for four periods (<1990, 1990-2000,
post-2000, and all records) and are presented for two periods (Table 1). For each period, the dominant
trend was calculated. Linear trends were simple to assess; however, where seasonal variability was
significant, trends were derived from a line of best fit connecting summer minima. These results are
contrasted with rainfall data for five stations, one from each region, for which Accumulated
Monthly Residual Rainfall (AMRR, 1975-1999) is compared pre and post 2000.
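The two trend derivations just described (an ordinary linear fit, and a line of best fit through summer minima where seasonality is significant) can be sketched as follows on synthetic data. Taking summer as December-February and the data layout used here are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def linear_trend(years, levels):
    """Ordinary least-squares slope (m/yr) through the whole hydrograph."""
    slope, _ = np.polyfit(years, levels, 1)
    return slope

def summer_minima_trend(dates, levels):
    """Slope of a line of best fit through summer minima only, as used
    when seasonal variability is significant. `dates` are (year, month)
    tuples; summer is taken as Dec-Feb (an assumption of this sketch)."""
    minima = {}
    for (year, month), level in zip(dates, levels):
        if month in (12, 1, 2):
            # attribute December to the following summer so Dec-Jan-Feb group together
            season = year + 1 if month == 12 else year
            minima[season] = min(minima.get(season, level), level)
    seasons = sorted(minima)
    slope, _ = np.polyfit(seasons, [minima[s] for s in seasons], 1)
    return slope

# synthetic bore: watertable rising 0.3 m/yr with a seasonal wiggle
dates = [(y, m) for y in range(1990, 2000) for m in range(1, 13)]
t = np.array([y + (m - 1) / 12 for y, m in dates])
levels = 0.3 * (t - 1990) + 0.5 * np.sin(2 * np.pi * t)
print(f"linear: {linear_trend(t, levels):+.2f} m/yr")                 # ~ +0.3
print(f"summer minima: {summer_minima_trend(dates, levels):+.2f} m/yr")  # ~ +0.3
```

Both estimators recover roughly the imposed 0.3 m/yr rise; the summer-minima fit is the more robust of the two when the seasonal amplitude is large relative to the trend.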
Region              Bores (#)   Pre-2000                    Post-2000
                                Rising  Falling  Stable     Rising  Falling  Stable
Northern            109-170     66%     6%       27%        18%     69%      13%
Central             299-479     47%     5%       47%        23%     37%      40%
South-West          331-370     53%     3%       44%        37%     12%      52%
South Coast (West)  76-80       74%     17%      9%         50%     31%      17%
South Coast (East)  175-219     72%     5%       23%        71%     7%       22%
Table 1. Bores analysed for groundwater trends 1990-2000 (n=990) & post 2000 (n= 1318 bores).
Bores qualified for inclusion in the analyses if they were located in cleared agricultural land, remote
from the effects of any salinity management treatment (drains, trees, perennial pastures) and met
minimum standards (e.g. 5 years duration and/or 20 monitoring observations). The average catchment
had 14 bores and 50 groundwater observations. Trend analyses were conducted on between 266 and
1318 bores [<1990 (n=266), 1990-2000 (n=990), 2000-07 (n=1198), and ALL (n=1318)].
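The inclusion rules above amount to a simple filter over bore records. A minimal sketch, assuming a dict-based bore record whose keys are illustrative only, not a published schema:

```python
from datetime import date

def qualifies(bore):
    """Apply the inclusion rules described in the text: located in cleared
    agricultural land, remote from salinity-management treatments, and a
    minimum record of 5 years' duration and/or 20 observations."""
    obs = sorted(bore["observations"])  # list of observation dates
    years = (obs[-1] - obs[0]).days / 365.25 if len(obs) > 1 else 0.0
    long_enough = years >= 5 or len(obs) >= 20
    return bore["cleared_agricultural"] and not bore["near_treatment"] and long_enough

bore = {
    "cleared_agricultural": True,
    "near_treatment": False,
    # 8 observations spanning 7 years: qualifies on duration despite < 20 observations
    "observations": [date(1995 + i, 3, 1) for i in range(8)],
}
print(qualifies(bore))  # True
```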
Bores were not sorted by soil-landscape zone. For example, in some catchments bores in valleys with
shallow watertables dominated (e.g. in the Central Region, the Wallatin and Beacon catchments had
72-83 bores in areas where average watertables were <3 m), while other catchments were dominated
by bores with deep watertables (e.g. NE Yilgarn, 9 m, and East Belka, 12 m).
Results
The relative proportions of groundwater bores with rising trends changed after 2000, both in terms of
the amount and degree of rise/fall, and also spatially across the five regions. Prior to 2000, in four
regions, 53-74% of all bores had rising trends and <6% had falling trends; about 9-47% had no trend
(stable). Pre 2000, the western South Coast had the greatest proportion of falling trends (17%)
and also had fewest stable trends (9%). After 2000, the number of bores with rising trends decreased
in four of the five regions. The decrease was most pronounced in the Northern region (18% rising) and
diminished progressively towards the eastern South Coast, where the proportion pre and post 2000 was unchanged (71-72%).
Groundwater trends differ depending on the depth to watertable. Plots of trend by depth for each time
period (<1990, 1990-2000, All time) for the five regions (Figure 1) show that prior to 2000, nearly all
bores displayed a rising trend whether the watertable was shallow or deep. However, after 2000 (b
plots) trends appeared dependent on depth to watertable. In the Northern region, downward trends, to
-0.5 m/yr, were most common in bores with shallow watertables (<10 m); less pronounced falls (-0.2
m/yr) were apparent for deeper watertables, to >30 m. In the Central region, falls were smaller (-0.2
m/yr) and all but three were observed where watertables were <10 m. In the South West and western
South Coast, falls were smaller still (<-0.1 m/yr) and were observed at <5 m watertable depths. By
contrast, in the eastern South Coast, the trends pre and post 2000 remained the same: upward (>0.2 m/yr) or stable.
In addition to a reduction in the number of post 2000 bores with rising trends, the magnitude of rise
diminished from north to south (Figure 2c). In some Northern region bores, rises in the 1990s were
offset by falls post 2000. However, this was not the case across the entire Northern region (e.g. not
in the Perth Basin), nor in most other regions, where watertables rose (<1990-2007). In the Central, South West
and Western South Coast regions, it was usually only ‘discharge’ bores that demonstrated falling
trends (<-0.2m/yr >2000). Bores in areas of valley hazard and uplands, or those remote from
discharge zones, continued to rise.
After 1975, annual rainfall in the South West declined. Since 2000, rainfall in the north and west
(Morawa, Narrogin) has declined further (Figure 1). By contrast, it has increased in the south-east
(Esperance) and slightly in the Central region (Merredin), the latter due to three large events.
Discussion
We attribute the observed groundwater responses to the interaction of three factors: clearing,
reduced rainfall, and the onset of hydrologic equilibrium. Experimental data implicate clearing as the
dominant causal factor in groundwater rise and the expansion of land salinity (Peck and Williamson,
1987). The analyses presented here allow some insight into the impact of the other two factors.
[Figure 1 appears here: Accumulated Monthly Residual Rainfall (AMRR) plotted against date (Jan 1975 to Jan 2007) for five stations: Northern (Morawa), Central (Merredin), South West (Narrogin), Western South Coast (Mt Barker) and Eastern South Coast (Esperance).]
Figure 1. AMRR plots for five stations within the agricultural areas investigated.
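The AMRR series plotted in Figure 1 is the running sum of monthly rainfall residuals, i.e. each month's total minus the long-term mean for that calendar month. A minimal sketch, assuming monthly totals that start in January and span whole years; note the paper benchmarks residuals against 1975-1999 means, whereas this sketch uses the whole record's monthly means for simplicity.

```python
import numpy as np

def amrr(monthly_rain):
    """Accumulated Monthly Residual Rainfall: cumulative sum of each
    month's rainfall minus the long-term mean for that calendar month.
    `monthly_rain` must start in January and span whole years."""
    r = np.asarray(monthly_rain, dtype=float).reshape(-1, 12)
    residuals = r - r.mean(axis=0)   # subtract each calendar month's long-term mean
    return residuals.ravel().cumsum()

# two synthetic years: year 2 is uniformly 10 mm/month wetter than year 1
curve = amrr([50.0] * 12 + [60.0] * 12)
print(curve[11], curve[23])  # -60.0 0.0: a deficit accrues in the dry year, then recovers
```

A persistently falling AMRR curve (as at Morawa after 1975) indicates sustained below-average rainfall even when individual years are unremarkable, which is why it is the natural companion plot for long-term watertable trends.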
Rates of groundwater rise observed from 1975 to 2000 were significant and occurred over a period
where the area of saline land grew from 167,000 to 1 M ha. We attribute this rise primarily to land-
use change brought about by the scale of clearing which preceded it. Notably, this 25-year period was
one of reduced rainfall relative to the previous 50 years in all areas except the eastern South Coast.
Since 2000, the numbers of bores with rising trends and their rates of rise have decreased. However,
this response varies spatially, with most reductions in the Northern region, and none in the eastern
South Coast. Persistent drought and high evaporative demand in the Northern region (>20% rainfall
reduction) appears to have negated previous watertable rise. By contrast, in much of the Central, SW
and western South Coast, changed rainfall has not caused the same degree of reduction. Notably, in
the South West, the post-2000 reduction (Narrogin, Figure 1) has not caused obvious change, while on
the eastern South Coast (Esperance), where rainfall has increased, trends remain upward.
Rates of groundwater rise are affected by the degree to which the catchment has responded to clearing.
In catchments still actively filling [not near equilibrium], reduced rainfall-recharge appears to have
had no discernible impact on rising trends. Later, as these catchments approach equilibrium, and
discharge areas grow, climate impacts will become the dominant control on trends. Finally, as
noted, the dataset has not been analysed by landscape position; thus some landscapes are
over-represented and others have little or no data. A drilling program now underway will fill these gaps.
Despite lower than average rainfall over much of the wheatbelt since 2000, we continue to see
salinisation expand in all regions, especially following episodic floods, such as occurred in 1999-2000,
2001 and 2006. Hence our measured reductions of watertables in some wheatbelt valleys may be
attributed as much to recessions between these floods as to a shift in mean annual rainfall.
The recent change in groundwater trends has significant implications for assessing the likely
future extent of salinity and the effects of management, especially interventions established under the
National Action Plan for Salinity and Water Quality. Observed reductions in watertables must be
corrected for climate; failure to do so may exaggerate the expected benefits of management.
References
George, R.J. (1990). The 1989 saltland survey. J. Agric. W. Aust. (4th Series) 31: 159-166.
Peck, A.J. and Williamson, D.R. (Eds) (1987). Hydrology and Salinity in the Collie River Basin,
Western Australia. J. Hydrol., Special Edition, 94:1-198.
Short, R.J. and McConnell, C. (2001). Extent and Impacts of Dryland Salinity. Resource Management
Technical Report No. 202. Department of Agriculture and Food, Baron-Hay Court, Perth.
Wood W.E. (1924). Increase of salt in soil and streams following the destruction of native vegetation.
Journal of the Royal Society of Western Australia 10(7), 35-47.
Figure 2. Hydrograph-derived trend analysis (+ rise / - fall) by depth to groundwater (2005-2007), for
all bores in the agricultural regions (1=Northern, 2=Central, 3=South West, 4=South Coast-West and
5=South Coast-East) for periods (a) <2000, (b) 2000-2007 and (c) all record (>1975-2007).
[Figure 2 appears here: fifteen panels, 1(a) to 5(c), each plotting rate of change (m/yr, -2.0 to +2.0) against depth to groundwater (0 to -50 m).]