http://CaliforniaAgriculture.ucop.edu • OCTOBER–DECEMBER 2007 159

RESEARCH ARTICLE

Management reduces E. coli in irrigated pasture runoff

by A. Kate Knox, Kenneth W. Tate, Randy A. Dahlgren and Edward R. Atwill

Abstract

Microbial pollutants, some of which can cause illnesses in humans, chronically contaminate many California water bodies. Among numerous sources, runoff from irrigated pastures has been identified as an important regulatory target for improving water quality. This study examined the potential to reduce E. coli contamination from cattle in irrigated pastures. During the 14 irrigation events examined, we found that E. coli concentrations were lowest with a combination of three treatments: filtering runoff through a natural wetland, reducing runoff rates, and letting the pasture rest from grazing at least a week prior to irrigation. Integrated pasture and tailwater management are required to significantly reduce E. coli concentrations in runoff.

Contamination of surface waters by pathogens — and the associated human health risks — is a leading water-quality issue for California and the nation. Pathogens are the most common impairment to surface waters in California, according to the statewide list of polluted water bodies (Cal EPA 2004). Listed pathogen-impaired water bodies include 103 miles of coastal shorelines, 4,713 acres of estuaries, 688 acres of lakes and reservoirs, and 1,788 miles of rivers and streams.

Pathogens that can cause illness in humans include protozoa such as Cryptosporidium parvum and Giardia duodenalis, as well as bacteria such as Salmonella and Escherichia coli O157:H7, a virulent strain of the commonly found coliform. The sources of these pathogens are diverse; they are shed in the feces of wildlife, humans, livestock and pets found across most watersheds. Pathogen contamination can come from point sources such as discharge from municipal wastewater treatment plants, as well as nonpoint sources such as wildlife (Atwill et al. 2001) and intensive and extensive livestock production systems (Atwill et al. 2003; Lewis et al. 2005).

Although pathogens are the underlying concern, most state and federal ambient fresh-water quality standards are based on indicator coliform bacteria. The standards use total or fecal coliforms and/or a subset of this group called Escherichia coli. For fresh waters such as streams and lakes across California, fecal coliform standards range from 20 to 2,000 colony-forming units (cfu) per 100-milliliter (ml) sample, depending on the designated beneficial use of the water body. For full-body-contact beneficial uses such as swimming and bathing, the U.S. Environmental Protection Agency (U.S. EPA) currently recommends an E. coli standard of 126 cfu/100 ml for an average of five samples collected over 30 days, or 235 cfu/100 ml for a single grab sample.
While these standards are based on the assumption that there is a correlation between these indicator bacteria and microbial pathogens of concern, the validity of this assumption likely varies from watershed to watershed as well as seasonally within a given watershed. In addition, we have a generally poor understanding of how indicator bacteria and pathogen concentrations correlate in rural or agriculturally dominated watersheds. Regardless, indicator bacteria are used as regulatory surrogates for pathogens due to their relatively low analysis costs and analytical simplicity compared to most pathogens, which can be expensive and technically difficult to test for on a large scale.
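As a concrete illustration of how these indicator standards are applied, the short sketch below checks a set of grab samples against the two U.S. EPA figures quoted above: 126 cfu/100 ml for five samples over 30 days (applied here as a geometric mean, the averaging method the U.S. EPA criteria specify) and 235 cfu/100 ml for a single sample. The sample values are hypothetical.

```python
import math

E_COLI_30DAY = 126   # cfu/100 ml, five samples collected over 30 days
E_COLI_SINGLE = 235  # cfu/100 ml, single grab sample

def geometric_mean(samples):
    """Geometric mean, the averaging method used in U.S. EPA bacterial criteria."""
    return math.exp(sum(math.log(s) for s in samples) / len(samples))

def meets_criteria(samples):
    """True if the 30-day geometric mean and every single sample both pass."""
    return (geometric_mean(samples) <= E_COLI_30DAY
            and all(s <= E_COLI_SINGLE for s in samples))

# Hypothetical five-sample record (cfu/100 ml)
samples = [100, 200, 50, 150, 120]
print(round(geometric_mean(samples)), meets_criteria(samples))
```

Note that a single sample above 235 cfu/100 ml fails the recommendation even when the 30-day geometric mean is acceptable.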
Pathogens from irrigated pastures

Recent regulatory developments in California have focused significant attention on the quality of waters discharged from agricultural production systems, including extensive livestock production on irrigated pastures and nonirrigated rangelands. Irrigated pastures in California maintain an estimated 800,000 acres of green forage throughout the dry summer months, a critical food source for the state's livestock. While flood irrigation is a common and inexpensive way of delivering water to these pastures, this method can generate significant runoff (tailwater) (Bedard-Haughn et al. 2004; Tate et al. 2001).

A study at the UC Sierra Foothill Research and Extension Center examined the ability of small wetlands to filter E. coli in runoff from irrigated, grazed pastures. Such disease-causing pathogens pollute waterways across California.

160 CALIFORNIA AGRICULTURE • VOLUME 61, NUMBER 4
Information is needed on the effectiveness of integrating three approaches to reduce microbial pollutant concentrations in tailwater discharged from pastures: (1) vegetative filters such as wetlands and buffer strips, (2) pasture grazing management and (3) irrigation management. We conducted a management-scale case study on a flood-irrigated pasture and wetland system in the northern Sierra Nevada foothills. Our objective was to examine the reduction in tailwater E. coli concentrations due to: (1) wetland filtration of tailwater, (2) offsetting the timing of livestock grazing and irrigation and (3) the management of irrigation-water application rates.

Fig. 1. (A) Wetland sampling scheme, and (B) range of E. coli concentrations observed at each site (cfu = colony-forming units).
Study pasture and wetland

The rangeland landscape in the western Sierra Nevada foothills of Northern California is a patchwork of irrigated perennial grass and clover pastures interspersed with annual grasslands and oak woodlands. For this study, E. coli concentrations and instantaneous runoff rates were measured immediately above and below a flow-through wetland receiving irrigated-pasture tailwater at the UC Sierra Foothill Research and Extension Center (SFREC) in Yuba County. Cattle were excluded from the wetland for the duration of the study period, April 2004 through September 2005.

The wetland was on an ephemeral stream channel at the base of a small basin that collects runoff from a 12-acre, flood-irrigated, foothill pasture (fig. 1A). Tailwater runs directly into the ephemeral channel along the base of the irrigated pasture. Tailwater is then transported approximately 150 yards down the channel to the top of the wetland. The only source of flow in the channel during the summer irrigation season is tailwater from this irrigated pasture. Flow from this upstream channel is naturally dispersed throughout the wetland and eventually leaves via another channel at the bottom.

The wetland has a surface area of about 0.5 acre, with a flow path (the length of wetland between inflow and outflow points) of about 135 yards and an average width of 200 yards. The wetland is densely vegetated with dotted smartweed (Polygonum punctatum), water speedwell (Veronica catenata) and rice cutgrass (Leersia oryzoides). Due to regular irrigation events on the pasture, the wetland remains saturated throughout the summer months with standing surface water between irrigation events.
Grazing and irrigation protocols

Pasture irrigation was managed during the 2004 and 2005 summer irrigation seasons to create a range of water-application and tailwater-runoff rates (table 1). This allowed us to investigate the potential to reduce E. coli concentrations in tailwater by reducing the runoff rate, which in turn reduces the erosion of bacteria from cattle fecal pats (hydrologic mobilization) and the flushing of bacteria from the pasture in surface runoff (transport capacity). The timing of pasture grazing by beef cattle was managed to create a range of total days of rest between grazing and irrigation of the pasture. This allowed us to characterize the potential reduction of E. coli in tailwater attributable to such processes as the background mortality rate of E. coli, and the drying and heating of fecal pats during the summer season (Li et al. 2005).

The 12-acre pasture was irrigated at different rates, above and center, in order to measure the amount of fecal bacteria flushed from the field. Grazing was also limited prior to irrigation for varying numbers of days. Right, a ditch delivers irrigation water to the pasture.
Grazing. The 12-acre irrigated pasture was fenced as one complete unit and was grazed by beef cattle during the 2004 and 2005 summer irrigation seasons (May through October). Grazing duration ranged from 8 to 16 days per month. The number of cattle ranged from 56 to 102, resulting in mean stocking rates per grazing event of 1.3 to 1.8 animal unit months per acre.

Irrigation. The pasture was irrigated in five discrete sections called sets, and set size ranged from 1 to 4 acres. These sets were irrigated sequentially so that the entire pasture was irrigated over the course of 2 to 3 consecutive days. Irrigation scheduling was such that the entire pasture was irrigated every 9 to 14 days throughout the summer, with the shortest intervals in July when hot temperatures and long days created the greatest plant-soil water demand.
Days of rest. The timing of grazing (once per month for 8 to 16 days) combined with the timing of irrigation (every 9 to 14 days) created a range of days of rest between grazing and irrigation. This resting period ranged from 0 to 35 days (table 1), with 0 days of rest meaning that cattle were present during irrigation. Finally, irrigation application rates of 0.7 (n = 6), 1.7 (n = 4) and 2.5 (n = 4) cubic feet per second (cfs) per acre were applied over the 14 irrigation events to create a range of tailwater runoff rates from 0.36 to 1.67 cfs/acre (table 1).

Tailwater collection. Half of the 14 irrigation events were in summer 2004 and the other half in summer 2005 (table 1). For this study, we focused tailwater monitoring on specific irrigation events in a single irrigation set that was slightly over 1 acre. Thus, we could control for variation in the area of pasture generating runoff between irrigation events, and achieve a relatively broad range of tailwater runoff rates across events. The duration of tailwater runoff per irrigation event ranged from 3.5 to 11.5 hours, and during this time samples were collected at 30- to 60-minute intervals. This allowed for the characterization of E. coli concentrations throughout the entire runoff period for each irrigation event.

Measuring E. coli. Samples were collected within the stream channel immediately above and below the wetland using ISCO 6700 autosamplers (ISCO, Lincoln, Neb.). This allowed the quantification of input-output E. coli concentrations and loads to evaluate the effectiveness of wetland filtration. Flow rates were recorded every 15 minutes using a 1-foot, 90° V-notch weir with an automatic depth recorder (Metritape Type AGS, Metritape, Littleton, Mass.). This allowed us to examine the effect of tailwater runoff rate on E. coli concentration. E. coli concentration (cfu/100 ml) was determined within 24 hours of sample collection by direct membrane filtration and then culture of the membrane onto CHROMagar EC (Chromagar Microbiology, Paris, France) at 112.1 °F (44.5 °C) for 24 hours.

TABLE 1. Tailwater runoff rates and E. coli concentrations in irrigated pasture–wetland system for 14 irrigation trials at SFREC, 2004 and 2005

Irrigation  Application  Tailwater    Pasture rest      Max. runoff rate  Runoff into   Runoff out     E. coli load
event       rate         runoff       before            above wetland     wetland*      of wetland*    reduction*
            (cfs/acre)   (hr)         irrigation (day)  (cfs/acre)        (cu ft/acre)  (cu ft/acre)   (%)
7/1/04      2.5          5.50         9                 1.38              21,800        19,250         33
7/13/04     0.7          9.00         21                0.52              14,200        11,900         91
7/27/04     1.7          6.50         0                 1.02              16,300        14,700         79
9/2/04      1.7          7.50         0                 1.22              29,900        26,200         64
9/19/04     0.7          11.50        16                0.60              17,200        16,350         74
10/2/04     2.5          7.75         29                1.67              37,800        34,800         63
10/17/04    0.7          8.00         0                 0.47              18,450        17,200         81
6/16/05     2.5          3.50         0                 1.53              —             —              —
6/29/05     1.7          4.25         8                 1.19              19,200        14,700         65
7/11/05     0.7          9.75         20                0.36              —             —              —
7/26/05     0.7          4.25         35                0.68              10,450        8,700          91
8/8/05      2.5          6.00         0                 1.47              20,700        16,800         69
8/19/05     1.7          6.50         9                 1.00              —             —              —
8/31/05     0.7          6.75         0                 0.47              8,900         7,400          90

* Because of equipment failure, a complete record of water inflow was not available for 6/16/05, 7/11/05 and 8/19/05; water outflow is not shown, and percent E. coli load reduction for these dates could not be accurately calculated.
Hydraulic residence times. Hydraulic residence time, which is generally an estimate of how long water takes to pass through a wetland, can be a major factor influencing the efficiency of the wetland to retain pollutants (Blahnik and Day 2000). Longer residence times, often associated with lower runoff rates, generally result in greater retention of pollutants (Knight et al. 2000). Determining the hydraulic residence time for a study wetland allows extrapolation of the results to other wetland systems.

To quantify hydraulic residence times, continuous bromide injections were conducted at irrigation application rates of 0.7, 1.7 and 2.5 cfs/acre. Bromide is considered a conservative tracer of water movement through space and time because it is not utilized by plants or microorganisms, and is not readily bound to soil particles. A solution of known bromide concentration was injected at a known rate (20 to 25 milliliters per minute) into the center of the stream above the wetland, using a fluid-metering pump. During injections, water samples were collected at short intervals (3 to 20 minutes) to capture the entire runoff period both above and below the wetland. Bromide concentrations were quantified using ion chromatography. The hydraulic residence time was calculated using the time it took for half of the bromide to pass from above to below the wetland (Webster and Ehrman 1996). More than 95% of the bromide injected was recovered below the wetland for all three irrigation events.
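The half-mass convention described above can be sketched numerically: accumulate the tracer mass passing the downstream station (trapezoid rule over the breakthrough curve) and interpolate the time at which half of the recovered bromide has gone by. The breakthrough data below are hypothetical, and flow is assumed constant so that concentration can stand in for mass flux.

```python
# Hypothetical bromide breakthrough below the wetland: minutes, mg/L
times = [0, 10, 20, 30, 40, 50, 60, 70]
conc  = [0.0, 0.2, 1.5, 3.0, 2.0, 0.8, 0.3, 0.0]

def half_mass_time(t, c):
    """Time at which half of the recovered tracer mass has passed (trapezoid rule)."""
    # cumulative mass at each sample time
    cum = [0.0]
    for i in range(1, len(t)):
        cum.append(cum[-1] + 0.5 * (c[i] + c[i - 1]) * (t[i] - t[i - 1]))
    half = cum[-1] / 2.0
    # linear interpolation within the interval containing the half-mass point
    for i in range(1, len(cum)):
        if cum[i] >= half:
            frac = (half - cum[i - 1]) / (cum[i] - cum[i - 1])
            return t[i - 1] + frac * (t[i] - t[i - 1])

print(half_mass_time(times, conc))  # minutes
```

With field data, the same routine would be run on the concentration series collected below the wetland during each injection.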
Data analysis. We used linear mixed effects regression to simultaneously examine the reduction in E. coli concentration by the wetland, as well as the relationships between E. coli concentration and instantaneous tailwater runoff rate (cfs/acre) above and below the wetland, days of rest from grazing prior to irrigation, and time of sample collection relative to the arrival of tailwater at a sample location (for more detailed methodology, see Tate, Lancaster et al. 2005, and Tate, Lyle et al. 2005). The dependent variable was E. coli concentration (cfu/100 ml) in water samples (n = 364) collected throughout 14 irrigation events from sample locations immediately above and below the wetland. E. coli concentration was log10-transformed. Independent or fixed effect variables in the model were sample location (above or below the wetland), tailwater runoff rate (cfs/acre), duration of rest from grazing prior to irrigation (days), and time since the arrival of tailwater runoff at each sample location for each sample collected (hours) for each irrigation event.

To assess whether wetland efficiency was dependent upon instantaneous tailwater flow rate, we included an interaction between sampling location (above versus below) and instantaneous tailwater runoff rate at the sample location. The quadratic term for days of rest from grazing was included to account for the possibility that the relationship between rest period and E. coli concentration was not linear. A backward-stepwise approach was followed to identify significant (P < 0.05) factors associated with E. coli concentrations. Year (2004 or 2005) was treated as a random effect variable to adjust the results for possible differences between years.
Effects on E. coli concentrations

Wetland filtration. E. coli concentrations were reduced below the wetland compared to above the wetland (table 2; fig. 2). For example, at an instantaneous tailwater flow rate of 1.0 cfs and following 7 days of pasture rest from grazing, the final analysis found that the wetland decreased E. coli concentrations in tailwater by about 40% (fig. 3A).

E. coli concentrations in pasture runoff above the wetland were never below the 235 cfu/100 ml standard recommended by the U.S. EPA for any of the samples (n = 182) collected during the 14 irrigation events, ranging from 420 cfu/100 ml to 157,800 cfu/100 ml, with a median of 5,400 cfu/100 ml (see fig. 1B, page 160). In contrast, overall E. coli concentrations below the wetland (filtered pasture runoff and wetland runoff) were significantly lower than those above the wetland (pasture runoff). Specifically, E. coli concentrations below the wetland ranged from 10 to 74,600 cfu/100 ml, with a median of 1,283 cfu/100 ml. However, in spite of the more than four-fold decrease in median E. coli concentrations by the wetland, only 6% of the 182 samples collected below the wetland met the U.S. EPA standard.

TABLE 2. Linear mixed effects analysis characterizing the relationship between log10-transformed E. coli concentration (cfu/100 ml) in irrigated-pasture tailwater above and below a wetland receiving tailwater at SFREC, 2004 and 2005 irrigation seasons

Fixed variable                              Coefficient*  Standard error  P value†
Intercept                                   3.74          0.094           < 0.001
Sample location
  Above wetland‡                            0.00
  Below wetland                             –0.91         0.076           < 0.001
Tailwater runoff rate (cfs/acre)            0.18          0.071           0.014
Time since first tailwater runoff (hr)§     –0.05         0.008           < 0.001
Days rested from grazing                    –0.02         0.006           0.003
Days rested from grazing²                   0.0004        0.0001          0.050
Sample location × tailwater runoff rate¶
  Above wetland‡                            0.00
  Below wetland                             0.66          0.103           < 0.001

* Coefficient for each significant independent variable in the regression model. The coefficient value indicates the direction (+ or –) and magnitude of the relationship between each variable and log10 E. coli concentration. For continuous variables (tailwater runoff rate, time since first tailwater runoff per irrigation event, and days rested from grazing prior to irrigation event), the coefficient indicates the change in E. coli concentration associated with each additional increment in the variable (e.g., cfs/acre, hour, day).
† P value for each independent variable.
‡ Referent condition for the categorical variable sample location. The coefficient for the referent condition (above wetland) is set to 0.0, and the coefficient for below the wetland represents the estimated reduction in log10 E. coli concentration between the sample locations above and below the wetland.
§ Time since the first tailwater runoff arrived at each sample location for the irrigation event.
¶ Interaction term for sample location by tailwater runoff rate.
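The coefficients in table 2 translate directly into the predicted concentrations plotted in figure 3. The sketch below reproduces the fixed-effects prediction using the rounded published coefficients (so results differ slightly from the article's exact figures): log10 concentration above and below the wetland, and the implied percent reduction, which depends on runoff rate through the location × rate interaction.

```python
# Rounded fixed-effect coefficients from table 2
INTERCEPT  = 3.74
BELOW      = -0.91    # below-wetland offset
RATE       = 0.18     # per cfs/acre of tailwater runoff
HOURS      = -0.05    # per hour since first tailwater runoff
DAYS       = -0.02    # per day rested from grazing
DAYS_SQ    = 0.0004   # quadratic rest term
BELOW_RATE = 0.66     # location x runoff-rate interaction

def log10_ecoli(rate, hours, days, below):
    """Predicted log10 E. coli (cfu/100 ml) from the table 2 fixed effects."""
    y = INTERCEPT + RATE * rate + HOURS * hours + DAYS * days + DAYS_SQ * days ** 2
    if below:
        y += BELOW + BELOW_RATE * rate
    return y

def wetland_reduction(rate):
    """Fractional concentration reduction by the wetland at a given runoff rate."""
    return 1.0 - 10 ** (BELOW + BELOW_RATE * rate)

# The fig. 3A scenario: 1.0 cfs, 3 hours into runoff, 7 days of rest
above = 10 ** log10_ecoli(1.0, 3, 7, below=False)
below = 10 ** log10_ecoli(1.0, 3, 7, below=True)
print(round(above), round(below), round(wetland_reduction(1.0), 2))

# At the highest observed runoff rate, the interaction cancels the wetland effect
print(round(wetland_reduction(1.38), 2))
```

With these rounded coefficients the reduction at 1.0 cfs comes out near 44%, consistent with the "about 40%" reported, and shrinks to essentially zero at the study's maximum runoff rate of 1.38 cfs/acre.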
Although the primary regulatory concern with E. coli centers on concentration, it is also important to consider the reduction in E. coli load (the total number of E. coli entering and exiting the wetland) per irrigation event. We calculated the percentage of the total number (cfu) of E. coli retained within the wetland during each event from the difference between inflow and outflow load, and found that percent reduction ranged from 33% to 91%, with an average of 73% (table 1). These results are comparable to previous findings that relatively narrow (1 to 2 yards wide) vegetative buffer strips can reduce E. coli and C. parvum in runoff by as much as 90% to 99% on California's annual grasslands under rainfall-runoff conditions (Atwill et al. 2002, 2006; Tate et al. 2004, 2006).
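The per-event load comparison described here reduces to simple arithmetic: an event load is the difference between inflow load and outflow load (each the product of runoff volume and flow-weighted concentration), and the sketch below summarizes the published percent reductions from the last column of table 1 for the 11 events with complete flow records.

```python
def percent_load_reduction(load_in, load_out):
    """Percent of the E. coli load retained by the wetland in one irrigation event."""
    return 100.0 * (load_in - load_out) / load_in

# Published percent reductions from table 1 (11 events with complete records)
reductions = [33, 91, 79, 64, 74, 63, 81, 65, 91, 69, 90]
print(min(reductions), max(reductions), round(sum(reductions) / len(reductions)))
# → 33 91 73
```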
Reductions of 80% to 99% have been seen for E. coli and fecal coliforms with the use of constructed surface-flow wetlands to treat municipal and livestock wastewater (Gerba et al. 1999; Hill 2003; Quinonez-Diaz et al. 2001).

Top left, channelized runoff from the pasture was collected in a small basin. Above, V-weirs were fitted with, left, autosamplers to monitor E. coli concentrations.

Fig. 2. E. coli concentration and tailwater profiles above and below the study wetland for a typical (A) high- and (B) low-flow irrigation event.

Fig. 3. Predicted E. coli concentrations in pasture tailwater above and below the wetland as (A) tailwater runoff rate increases (time since first runoff 3 hours, days since grazing 7 days); (B) days rested from cattle grazing prior to irrigation increases (time since first runoff 3 hours, tailwater runoff rate 1 cfs); (C) time since tailwater runoff begins during an irrigation event (days since grazing 7 days, tailwater runoff rate 1 cfs).
Tailwater runoff rate. As irrigation tailwater runoff rates increased, E. coli concentrations increased both above and below the wetland (figs. 2 and 3A); for example, figure 2 shows the increased E. coli concentration profile for an irrigation event with a peak instantaneous pasture runoff rate of 1.53 cfs/acre compared to an event with a rate of 0.47 cfs/acre. This relationship can be attributed to the fact that higher runoff rates increase the tailwater's capacity for pollutant mobilization and transport. In other studies, we have found that runoff rate is positively correlated with the load of E. coli and C. parvum discharged from cattle fecal deposits on annual grasslands under rainfall-runoff conditions (Atwill et al. 2002; Tate et al. 2004, 2006).
As the tailwater runoff rate increased, the wetland was less effective at filtering E. coli and reducing concentrations in tailwater (fig. 3A). Essentially, at high runoff rates, the filtration capacity of the wetland becomes overwhelmed by the mobilization and transport capacity of the tailwater. The increase in instantaneous tailwater runoff rate corresponded with a decrease in hydraulic residence time, which also likely reduced the amount of time for wetland processes that reduce E. coli concentrations, such as exposure to solar ultraviolet radiation and predation by other microbes.

In this wetland, the hydraulic residence time varied from 38 minutes at an irrigation-water application rate of 2.5 cfs/acre to over 120 minutes at 0.7 cfs/acre; these application rates resulted in maximum instantaneous pasture runoff rates of 1.53 and 0.47 cfs/acre, respectively. These relatively short hydraulic residence times, in conjunction with the relatively low retention of total runoff volume (table 1), indicate that the majority of tailwater runoff contributed to the wetland during an irrigation event passed through that wetland during the same event. From total water inflow and outflow volume data (table 1), we can calculate that water retention in the wetland over these irrigation events ranged from 5% to 23%, with the wetland retaining an average of 13% of the water contributed per event.
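The 5% to 23% retention range quoted here follows directly from the inflow and outflow volumes in table 1; the sketch below repeats that arithmetic for the 11 events with complete flow records.

```python
# (inflow, outflow) tailwater volumes in cu ft/acre, from table 1
volumes = [
    (21800, 19250), (14200, 11900), (16300, 14700), (29900, 26200),
    (17200, 16350), (37800, 34800), (18450, 17200), (19200, 14700),
    (10450, 8700), (20700, 16800), (8900, 7400),
]

# fraction of each event's inflow retained in the wetland
retained = [(vin - vout) / vin for vin, vout in volumes]
mean_retained = sum(retained) / len(retained)
print(round(min(retained) * 100), round(max(retained) * 100), round(mean_retained * 100))
# → 5 23 13
```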
Soils at the study site were formed over greenstone, with a rocky clay B-horizon at a depth of about 1 foot and an impervious, dense clay C-horizon at a depth of about 3 feet. There is not much storage volume in the soil profile below this wetland, so any significant water loss to vertical seepage would have to come from losses through fractured bed material. Instead, we suspect that most water retained in the wetland was lost to subsurface flows through channel substrates and lateral subsurface flow from the wetland to the surrounding soil profile. In general, we have observed that the major hydrologic transport pathways in the study-site soils are significant lateral flow on top of the B-horizon and through macropores such as rodent tunnels, root tunnels and soil cracks.
Grazing. E. coli concentrations in tailwater directly from the pasture (above the wetland) were highest when cattle were actively grazing during an irrigation event with high tailwater runoff rates. E. coli concentrations in tailwater were significantly reduced with increasing rest time between grazing and irrigation (table 2, fig. 3B). However, the relationship was not linear, and E. coli reductions became smaller with each additional day of rest. For example, the E. coli concentration was 23% lower after 9 days of rest than after 1 day of rest, but only 2% lower after each additional day of rest. This reduction was likely due to two primary processes: (1) as cattle fecal pats age, the microbial pollutants in them naturally die off (Li et al. 2005; Meays et al. 2005), and (2) as the pats dry, they develop shells that trap the bacteria inside.
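The diminishing benefit of additional rest days follows from the linear and quadratic rest terms in table 2. The sketch below recomputes it from the rounded published coefficients (–0.02 and 0.0004), so the values land close to, rather than exactly on, the 23% and 2% figures reported (which presumably used unrounded coefficients).

```python
DAYS, DAYS_SQ = -0.02, 0.0004  # rounded rest-day coefficients from table 2

def rest_effect(d1, d2):
    """Fractional E. coli reduction from resting d2 days instead of d1 days."""
    delta = DAYS * (d2 - d1) + DAYS_SQ * (d2 ** 2 - d1 ** 2)
    return 1.0 - 10 ** delta

print(round(rest_effect(1, 9), 2))   # reduction from day 1 to day 9 of rest
print(round(rest_effect(9, 10), 3))  # marginal reduction for one more day
```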
Irrigation events. Over the course of an irrigation event, E. coli concentrations initially spiked but then declined (figs. 2 and 3C). This pattern is likely due to two primary processes: (1) as the irrigation event progresses, the tailwater volume increases and dilutes the E. coli, and (2) as the first irrigation water flows, it flushes the readily mobilized and transportable bacteria from the pasture.

This result shows the importance of collecting multiple samples during an irrigation event to accurately characterize E. coli concentrations. In addition, a single sample near the end of the event will be much more likely to achieve water-quality standards than a single sample collected early in the event.
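This sampling-time caution can be quantified with the time coefficient in table 2 (–0.05 per hour since runoff began): all else equal, the model predicts that a sample collected late in a runoff event reads substantially lower than one collected early. A short sketch, using the rounded published coefficient:

```python
HOURS = -0.05  # table 2: change in log10 E. coli per hour since runoff began

def late_vs_early(h_early, h_late):
    """Predicted concentration ratio of a late sample to an early one."""
    return 10 ** (HOURS * (h_late - h_early))

# A sample at hour 8 is predicted to read about 55% lower than one at hour 1
print(round(late_vs_early(1, 8), 2))
```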
Wetlands can reduce E. coli runoff from irrigated pastures, but their use should be
integrated with management strategies such as timing grazing prior to irrigation
and minimizing the volume of irrigation tailwater.
Attaining water-quality standards

Results from this study indicate that passing tailwater through relatively small wetlands can significantly reduce E. coli from irrigated pastures. As with any management measure, the feasibility and costs of creating a wetland will be site-specific. However, wetlands reduce E. coli concentrations less efficiently as the tailwater runoff rate increases. In addition, the concentration of E. coli in pasture runoff increases with the tailwater runoff rate. Collectively, these results indicate that the implementation of a wetland filter to reduce pathogens should be integrated with irrigation management designed to minimize tailwater runoff rates and volume. Simply implementing a wetland filter under conditions of high tailwater runoff rates may not lead to significant reductions in E. coli concentrations discharged from irrigated pastures (fig. 3A). This study also indicates that allowing several days of rest from grazing prior to irrigation can significantly reduce E. coli in pasture runoff.

We found that the combination of a wetland filter, low tailwater runoff rates, and at least 1 week of rest from grazing prior to irrigation generated the lowest E. coli concentrations. Nonetheless, 94% of the 182 samples collected below the wetland during 14 irrigation events were above the U.S. EPA recommended level of 235 cfu/100 ml. (California water quality is now regulated by nine regional boards with differing standards; these standards also differ from the federal recommendations [see p. 156].) Under the grazing and irrigation conditions of this study, we also found that up to 91% of the total E. coli load discharged from the pasture was filtered by the wetland, with 73% filtered on average per irrigation event. It is critical to fully explore opportunities to further reduce tailwater runoff rates and subsequent E. coli generation from irrigated pastures, allowing wetlands to serve as efficiently as possible.
Finally, it is important to note that
the standard E. coli test is used to iden
-
tify indicator bacteria rather than a
specific pathogen of concern. We have
found E. coli concentrations in beef
cattle feces on irrigated pastures to be
as high as 500,000 to 1,000,000 cfu per
gram of wet feces. It is therefore not
uncommon to find relatively high E. coli
concentrations in pasture tailwater, par
-
ticularly when the feces are fresh and
tailwater runoff rates are high.
The critical questions that must be
addressed focus on the load and con
-
centrations of actual pathogens in tail
-
water, and the efficiency of integrated
wetland, irrigation and grazing man
-
agement to reduce the pathogens that
may be discharged from pastures dur
-
ing irrigation events. For instance, in
California’s beef cattle herds, C. parvum
oocysts (eggs) are primarily shed in
high concentrations in the feces of beef
calves 1 to 4 months old, with very low
shedding rates for adult cattle (Atwill et
al. 1999, 2003). In contrast, E. coli indica
-
tor bacteria are consistently shed in all
ages of cattle feces at high rates year-
round. A grazed pasture might dis
-
charge high concentrations of indicator
bacteria, but low or zero concentrations
of the pathogen C. parvum
.
References
Atwill ER, Hoar B, Pereira MGC, et al. 2003. Im-
proved quantitative estimates of low environmental
loading and sporadic periparturient shedding of
Cryptosporidium parvum
in adult beef cattle. Appl Env
Microbiol 68:4604–10.
Atwill ER, Hou L, Karle BM, et al. 2002. Transport
of Cryptosporidium parvum oocysts through vegetated
buffer strips and estimated filtration efficiency. Appl
Env Microbiol 68:5517–27.
Atwill ER, Johnson EM, Klingborg DJ, et al. 1999.
Age, geographic and temporal distribution of fecal
shedding of Cryptosporidium parvum oocysts in cow-
calf herds. Amer J Vet Res 60:420–5.
Atwill ER, Maldonado Camargo S, Phillips R, et
al. 2001. Quantitative shedding of two genotypes of
Cryptosporidium parvum in California ground squirrels.
Appl Env Microbiol 67:2840–3.
Atwill ER, Tate KW, Pereira MGC, et al. 2006. Efficacy
of natural grass buffers for removal of Cryptosporidium
parvum in rangeland runoff. J Food Protect 69:177–84.
Bedard-Haughn A, Tate KW, van Kessel C. 2004.
Using
15
N to quantify vegetative buffer effectiveness
for sequestering N in runoff. J Env Qual 33:2252–62.
Blahnik T, Day J. 2000. The effects of varied hy
-
draulic and nutrient loading rates on water quality and
hydrologic distributions in a natural forested treatment
wetland. Wetlands 20:48–61.
[Cal EPA] California Environmental Protection
Agency. 2004. 2002 Clean Water Act Section 303(d)
List Summary Tables. State Water Resources Control
Board; Water Quality. www.swrcb.ca.gov/tmdl/303d_
sumtables.html.
Gerba CP, Thurston JA, Falabi JA, et al. 1999.
Optimization of artificial wetland design for removal
of indicator microorganisms and pathogenic protozoa.
Water Sci Technol 40:363–8.
Hill VR. 2003. Prospects for pathogen reductions in
livestock wastewaters: A review. Crit Rev Env Sci Tech
-
nol 33:187–235.
Knight RL, Payne WVE, Borer RE, et al. 2000.
Constructed wetlands for livestock wastewater man-
agement. Ecolog Eng 15:41–55.
Lewis DJ, Atwill ER, Lennox MS, et al. 2005. Link-
ing on-farm dairy management practices to storm-
flow fecal coliform loading for California coastal
watersheds. Env Monitor Assess 107:407–25.
Li X, Atwill ER, Dunbar LA, et al. 2005. Seasonal
temperature fluctuation induces rapid inactivation of
Cryptosporidium parvum. Env Sci Technol 39:4484–9.
Meays CL, Broersma K, Nordin R, Mazumder A.
2005. Survival of Escherichia coli in beef cattle fecal
pats under different levels of solar exposure. Range
Ecol Manage 58:279–83.
Quinonez-Diaz MD, Karpiscak MM, Ellman
ED, Gerba CP. 2001. Removal of pathogenic and
indicator microorganisms by a constructed wetland
receiving untreated domestic wastewater. J Env Sci
Health Part A–Toxic/Hazardous Substances & Env Eng
36:1311–20.
Tate KW, Atwill ER, Bartolome JW, Nader GA. 2006. Significant E. coli attenuation by vegetative buffers on annual grasslands. J Env Qual 35:795–805.
Tate KW, Lancaster DL, Morrison J, Lyle DF. 2005.
Monitoring helps reduce water quality impacts in
flood-irrigated pasture. Cal Ag 59:168–75.
Tate KW, Lyle DF, Lancaster DL, et al. 2005. Statistical analysis of monitoring data aids in prediction of stream temperature. Cal Ag 59:161–7.
Tate KW, Nader GA, Lewis DJ, et al. 2001. Evaluation of buffers to improve the quality of runoff from irrigated pastures. J Soil Water Cons 55:473–8.
Tate KW, Pereira MGC, Atwill ER. 2004. Efficacy of vegetated buffer strips for retaining Cryptosporidium parvum. J Env Qual 33:2243–51.
Webster JR, Ehrman TP. 1996. Solute dynamics.
In: Hauer FR, Lamberti GA (eds.). Methods in Stream
Ecology. San Diego, CA: Academic Pr. p 145–60.
Without such information on all pathogens of concern, it is possible that regulation based upon indicator bacteria alone will lead to unnecessary management restrictions. Alternatively, if indicator bacteria are poorly correlated with certain pathogens, it is also possible that regulation based solely upon indicator bacteria will lead to a false sense of human health protection. This suggests that water-quality monitoring and standards should target specific microbial pathogens of concern.
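The indicator-bacteria criteria that such regulation relies on can be applied mechanically. The minimal Python sketch below checks a monitoring record against the two U.S. EPA E. coli thresholds cited in this article: 126 cfu/100 ml for the average of five samples collected over 30 days, and 235 cfu/100 ml for a single grab sample. The function name and the sample concentrations are hypothetical illustrations, not data from this study.

```python
# Sketch: applying the U.S. EPA recreational-water E. coli criteria
# referenced in this article. Thresholds: 126 cfu/100 ml for the
# average of five samples over 30 days; 235 cfu/100 ml for any
# single grab sample. All sample values below are hypothetical.

def exceeds_epa_criteria(samples_cfu_per_100ml):
    """Return (average_exceeds, single_sample_exceeds) for a list of
    E. coli concentrations in cfu/100 ml."""
    avg = sum(samples_cfu_per_100ml) / len(samples_cfu_per_100ml)
    average_exceeds = avg > 126                    # 30-day average criterion
    single_sample_exceeds = any(c > 235 for c in samples_cfu_per_100ml)
    return average_exceeds, single_sample_exceeds

# Hypothetical 30-day monitoring record (five grab samples):
samples = [80, 150, 95, 210, 120]
print(exceeds_epa_criteria(samples))  # mean = 131 -> (True, False)
```

Note that the record above fails the average criterion even though no individual sample exceeds the single-grab limit, which is why monitoring programs evaluate both thresholds rather than either one alone.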
A.K. Knox was Graduate Student in Ecology, UC Davis, and now is Ecologist, WSP Environmental Strategies, Seattle, Wash.; K.W. Tate is Rangeland Watershed Specialist, Department of Plant Sciences; R.A. Dahlgren is Professor of Soil Science, Department of Land, Air and Water Resources; and E.R. Atwill is Cooperative Extension Specialist and Director, Western Institute of Food Safety and Security, School of Veterinary Medicine; all at UC Davis.