The enumeration of phages infecting host-specific strains of Bacteroides has been widely recognised as an effective and low-cost method of microbial source tracking (MST). A recently described human-specific Bacteroides host strain (GB-124) has been shown to detect bacteriophages exclusively in human-impacted waters and is emerging as a useful MST tool. However, a better understanding of the morphology and ecological behaviour of these phages, especially in wastewater disinfection processes, is now required to validate their role as MST markers. Bacteriophages infecting Bacteroides fragilis GB-124 (n = 21) were isolated from wastewater effluent and irradiated in laboratory-based UV-C (254 nm) collimated-beam experiments. The bacteriophages proved to be a morphologically and ecologically homogeneous group, with all specimens showing highly similar first-order log-linear inactivation profiles (mean fluence required for 4-log₁₀ inactivation: 36 mJ/cm²). These findings present the first evidence that phages infecting GB-124 are inactivated by the levels of UV-C radiation routinely delivered during tertiary wastewater treatment. More importantly, comparison with previously published inactivation data suggests that their response to UV-C radiation makes GB-124 phages more suitable surrogates for selected enteric viruses in UV disinfection processes than traditional faecal indicator bacteria or human-specific molecular markers.
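The first-order log-linear model mentioned above can be sketched numerically. In the sketch below, the rate constant is back-calculated from the single summary figure in the abstract (a mean fluence of 36 mJ/cm² for 4-log₁₀ inactivation); it is an illustration of the kinetic form, not the authors' fitted model.

```python
# First-order (log-linear) UV inactivation: log10(N/N0) = -k * fluence.
# Assumption: k is derived only from the reported mean 4-log10 fluence
# for GB-124 phages (36 mJ/cm^2); values are illustrative.
FLUENCE_4LOG = 36.0          # mJ/cm^2, mean value reported for GB-124 phages
K = 4.0 / FLUENCE_4LOG       # log10 reduction per mJ/cm^2

def log10_reduction(fluence: float) -> float:
    """Predicted log10 reduction at a given UV-C fluence (mJ/cm^2)."""
    return K * fluence

def surviving_fraction(fluence: float) -> float:
    """Predicted surviving fraction N/N0 at a given fluence."""
    return 10.0 ** (-log10_reduction(fluence))

print(log10_reduction(36.0))     # ~4.0 by construction
print(surviving_fraction(18.0))  # ~0.01 at half the 4-log fluence
```

Under this model, halving the fluence halves the log-reduction, which is why a homogeneous log-linear response across all 21 isolates supports using a single design fluence in tertiary treatment.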
Nontuberculous mycobacteria (NTM) are opportunistic pathogens found in natural and human-engineered waters. In 2009, the National Mycobacteria Reference Laboratory noted a relative increase in the isolation of Mycobacterium gordonae from pulmonary samples originating from General Hospital Zabok. An epidemiological survey revealed contamination of the cold tap water with M. gordonae, and guidelines on sputum sample collection were issued. In addition, all incident cases of respiratory infection due to NTM reported from 2007 to 2012 at General Hospital Zabok were included in a retrospective review. Of 150 individual NTM isolates, M. gordonae was the most frequently isolated species (n = 135; 90%), and none of the cases met the American Thoracic Society criteria for pulmonary NTM disease. Although concomitant Mycobacterium tuberculosis infection was confirmed in only 6 (4%) patients, anti-tuberculosis treatment was initiated for a substantial proportion of patients (n = 64; 42.6%) and unnecessary contact tracing was performed. This study highlights the need to enhance knowledge about NTM in our country and indicates the importance of faster NTM identification, as well as of good communication between laboratory personnel and physicians when evaluating the significance of isolated NTM.
Bacteria present in water samples, taken weekly from three streams from June 2004 through June 2005, were cultured on Coliscan Easygel agar plates. Colonies representing a variety of colors and morphologies were subjected to amplification and sequencing of a 1,000-1,100 nt portion of the 16S rRNA gene. A total of 528 colonies were sequenced; these were assigned to 26 genera and 78 species. Of 175 dark blue/purple colonies presumed to be Escherichia coli, sequence analysis indicated that 45 (25%) actually belonged to other genera. For the urban stream Gwynns Falls Gwynns Run, E. coli was the most commonly encountered genus/species, followed by Klebsiella and Aeromonas. For Pond Branch, a stream in a forested watershed, it was Serratia, followed by Yersinia and Aeromonas. For McDonogh (MCDN), a stream associated with Zea mays (corn) row-crop agriculture, E. coli was the most frequently isolated genus/species, followed by Aeromonas and Enterobacter. ERIC-PCR genotyping of isolates from the most prevalent genera/species indicated a high degree of within-stream diversity for E. coli and K. pneumoniae. Conversely, genotyping of Y. enterocolitica isolates indicated that some genotypes were shared between different streams.
A gas chromatography-mass spectrometry (GC-MS) method was developed and optimized for the determination of 17 endocrine-disrupting compounds (EDCs) in coastal water samples. The evaluated EDCs were of different origins and included estrogens, bisphenol A, alkylphenol ethoxylates, alkylphenols, phytoestrogens and sitosterol (SITO). The EDCs were extracted from samples using Oasis HLB (Hydrophilic-Lipophilic Balance) cartridges and derivatized with N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) containing 1% trimethylchlorosilane (TMCS). Validation showed that the method was highly specific for all assayed compounds (>99%), and the calibration curves showed correlation coefficients higher than 0.99. Detection limits were at the low ng/L level and recovery rates were higher than 70%. The performance of the method was checked using coastal water samples taken every 2 months during 2009-2010 from the Douro River estuary and the Porto coastline (Portugal). These data revealed that approximately 98.0% of the analyzed compounds were present at levels above their limits of detection (LODs). The measured estrogens (2-20 ng/L) and industrial pollutants (up to 1.1 μg/L) were at biologically hazardous concentrations. In addition, a clear seasonal pattern of fluctuation was established for phytoestrogens and SITO. The physicochemical data, namely the amounts of nitrates, nitrites and phosphorus, confirmed the low water quality of this area.
We investigated whether risk of sporadic enteric disease differs by drinking water source and type using surveillance data and a geographic information system. We performed a cross-sectional analysis, at the individual level, that compared reported cases of enteric disease with drinking water source (surface or ground water) and type (municipal or private). We mapped 814 cases of campylobacteriosis, cryptosporidiosis, giardiasis, salmonellosis and verotoxigenic Escherichia coli infection, in a region of British Columbia, Canada, from 1996 to 2005, and determined the water source and type for each case's residence. Over the 10-year period, the risk of disease was 5.2 times higher for individuals living on land parcels serviced by private wells and 2.3 times higher for individuals living on land parcels serviced by the municipal surface/ground water mixed system, than the municipal ground water system. Rates of sporadic enteric disease potentially differ by drinking water source and type. Geographic information system technology and surveillance data are accessible to local public health authorities and used together are an efficient and affordable way to assess the role of drinking water in sporadic enteric disease.
This study examines the quality of seawater bathing areas in Greece over a 10-year period and identifies risk factors for high concentrations of bacterial indicator organisms. Qualitative descriptive analysis was applied, and the microbiological test results of 231,205 water samples were associated with pollution markers and other parameters. Measurements of Escherichia coli (99.6%) and enterococci (100%) were in accordance with the mandatory guideline values set by the new European Directive. An increasing trend in the yearly mean value of faecal streptococci was noted. In logistic regression analysis, phenolic smell (OR = 2.10, CI = 2.04-2.16), rainfall the day before sampling (OR = 1.67, CI = 1.64-1.74), high seas (OR = 1.42, CI = 1.39-1.46) and rainfall on the day of sampling (OR = 1.27, CI = 1.20-1.33) were independently and positively associated with high levels of bacterial indicators (total coliforms, faecal coliforms, faecal streptococci and E. coli). The highest risk (absolute risk 42.8%; RR = 3.17, CI = 2.97-3.38) occurred when previous-day rainfall, phenolic smell and high seas were recorded simultaneously. Such parameters should be investigated further as predictive factors for the assessment of beach bathing water quality, providing timely input for water risk assessment.
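The associations above are reported as odds ratios with 95% confidence intervals. As a minimal sketch of what such a figure means, the snippet below computes an OR and Woolf (log-based) confidence limits from a 2×2 table; the counts are hypothetical, chosen only for illustration (the study's ORs came from logistic regression on 231,205 samples).

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio with 95% CI (Woolf method) from a 2x2 table:
    a: exposed & high-indicator, b: exposed & normal,
    c: unexposed & high-indicator, d: unexposed & normal.
    Counts passed in are hypothetical illustrations."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical table: 100 samples after rainfall, 100 without.
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Note that an OR with a lower confidence bound above 1 (as for all four factors reported) indicates a statistically significant positive association at the 5% level.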
Ten outbreaks of waterborne acute gastroenteritis (AGE) have been investigated in France since 1998. These outbreaks affected populations of over 1,000 people, with generally high attack rates. The causal agents were identified in six of these events; the aetiologies mainly involved noroviruses and Cryptosporidium sp. The point of entry of the contamination was located in the distribution network in five outbreaks (wastewater backflows in four cases and contamination induced by maintenance work in one) and at the water collection facilities in the five other cases. Once an outbreak was detected, epidemiological and environmental investigations and crisis management followed well-established procedures. Further progress in public health surveillance will depend on more complete and rapid detection and reporting. Automated analysis of health insurance data on the reimbursement of drugs for AGE should help make detection more complete. Improved reactivity depends primarily on the operator immediately reporting to health authorities any incidents that indicate possible massive contamination of the water network, in particular complaints from the population, which are the only early-warning signals in the case of wastewater backflows.
This paper reports a spatial-temporal examination of waterborne disease data from the State of Mexico, 2000 to 2005, using the county as the spatial unit. The incidence of waterborne disease did not decrease during the study period. Inequality between metropolitan areas and rural zones was observed: people living in population centres had a lower incidence of water-related diseases, possibly due to better access to services. In all cases, children under five years old suffered much higher relative morbidity than the population in general. Improvement of the water distribution network between 2000 and 2005 could explain the decrease in morbidity from 30% to 15% for the total population, and from 34% to 18.5% for children under five years old. Coverage of sewer services over the period was not substantially improved; as a result, the coefficient of determination remained nearly constant: 16.5% for the total population and 25% for children under five. Maintenance and operation deficiencies in the water distribution and wastewater sanitation systems play an important role in the incidence of this type of disease. It was also found that the institutional division of the territory does not correspond to the actual distribution of the risk areas.
For some time now, antibiotic-resistant bacterial strains have been found in the human population, in foods, in livestock and wild animals, as well as in surface waters. The entry of antibiotics and resistant bacterial strains into the environment plays an important role in the spread of antibiotic resistance. The goal of the present study was to monitor the entry of antibiotic resistances into the environment through the contamination of wastewater. To assess the extent of transmission of antibiotic resistances from human sources into the environment, the resistance patterns of Escherichia coli strains isolated from human patients were compared to those found in strains isolated from sewage sludge. Our results may indicate whether resistances to particular antibiotics are more prone than others to spread into the environment. To monitor the increase of specific resistances over time, samples taken in the years 2000 and 2009 were analysed. Our study shows that for some antibiotics a parallel development of resistance patterns has taken place in both patient and environmental samples over time; for other antibiotics, developments in the two sample sets were independent. A clear increase in multi-resistant E. coli strains over time was observed in samples from both sources.
A pooled analysis of seven cross-sectional studies from Newfoundland and Labrador, Waterloo and Hamilton Regions, Ontario and Vancouver, East Kootenay and Northern Interior Regions, British Columbia (2001 to 2007) was performed to investigate the drinking water consumption patterns of Canadians and to identify factors associated with the volume of tap water consumed. The mean volume of tap water consumed was 1.2 L/day, with a large range (0.03 to 9.0 L/day). In-home water treatment and interactions between age and gender and age and bottled water use were significantly associated with the volume of tap water consumed in multivariable analyses. Approximately 25% (2,221/8,916) of participants were classified as bottled water users, meaning that 75% or more of their total daily drinking water intake was bottled. Approximately 48.6% (4,307/8,799) of participants used an in-home treatment method to treat their tap water for drinking purposes. This study provides a broader geographic perspective and more current estimates of Canadian water consumption patterns than previous studies. The identified factors associated with daily water consumption could be beneficial for risk assessors to identify individuals who may be at greater risk of waterborne illness.
To determine factors associated with microbiological safety of public drinking water systems in regional New South Wales (NSW), Australia.
We analysed 107,000 end-user drinking water samples for an association between detection of Escherichia coli and drinking water system features, sample year and season using NSW Health Drinking Water Monitoring Program data, 2001-2007. We used negative binomial generalized estimating equations with adjustment for autocorrelation and clustering.
We detected E. coli in over 2% of samples from 40% (129/323) of systems. E. coli detection was significantly more common in earlier years and during summer (p<0.001). On multivariate analysis, E. coli detection was significantly associated with smaller systems; watercourse sources; no disinfection or disinfection with ultraviolet only; and higher post-treatment mean turbidity (all p≤0.01). Detection was most strongly associated with lack of disinfection (incidence rate ratio 12.6, p<0.001) and smaller supply systems (1% reduction in E. coli detection for each 1,000-person increase in supply population, p=0.004). Ultraviolet disinfection alone was the least effective disinfection method (p<0.001).
Even in developed countries, drinking water systems without disinfection or serving small populations appear vulnerable to the effects of faecal contamination, which presents a risk of waterborne disease outbreaks.
Increased numbers of domestic laboratory-confirmed Campylobacter notifications were reported in Söderhamn municipality in December 2002 and January 2003. Concurrently, preliminary investigations detected a large outbreak of acute gastroenteritis. Two studies were conducted simultaneously to identify risk factors for Campylobacter infection and acute gastrointestinal infection (AGI): (1) a case-cohort study using Campylobacter cases (N = 101) with a large random sample from the municipal population as referents (N = 1,000) and (2) a retrospective cohort study for the outcome AGI using the same sample. A postal questionnaire was used to collect demographic, clinical, water and food consumption data. Measures of association (risk ratio (RR), odds ratio (OR)) and 95% confidence intervals (CI) were calculated. Stool, environmental and water samples were tested by standard methods at Gävle Hospital and SMI laboratories, respectively. In the case-cohort study, Campylobacter cases were more likely than referents to consume communal water (OR = 12.6 (95% CI 1.7-92.3)). In the cohort study, the risk of gastroenteritis was 2.3 times higher in those who consumed communal water (AR = 27.3%) than in others (AR = 12%). Risk of illness was associated with the amount of water consumed in both studies. Campylobacter was detected in stools, and Escherichia coli (E. coli) was detected in routine communal water (CW) samples. The results suggest that both Söderhamn outbreaks, of Campylobacter and of AGI, were associated with consumption of CW. The methods used strengthened the epidemiological evidence and were efficient in the use of time and resources.
This geographical study aimed to identify natural and water-processing-related factors in faecal contamination incidents (FCIs) of drinking water in continental France. We defined an FCI as the occurrence of at least 20 colony-forming Escherichia coli or enterococci among all the 100 mL samples collected for regulatory purposes within one day from a given drinking water supply zone (SZ). We explored correlations between the standardized number of FCIs per département (N_Pols) and various indicators related to weather, land cover, topography, geology and water management, for three SZ size sub-classes. In 2003-2004, 2,739 FCIs occurred in SZs supplying fewer than 2,000 people, mainly with groundwater receiving simple disinfection. N_Pols correlated with four covariates: (1) precipitation; (2) the extent of karst outcrops; (3) the extent of disinfection; and (4) catchment protection. One hundred millimetres of yearly excess precipitation increases the pollution risk by 28-37%, depending on the sub-class. A 10% extension of the karst areas, or a 10% increase in unprotected resources or in SZs with no disinfection, could each raise the risk of an FCI by about 10%. The correlations are reproducible across the three sub-classes and corroborate expert appraisals. These results encourage the ongoing effort to generalize disinfection and catchment protection.
In Greece, standard tests are performed on the watering and cooling systems of hotels. A total of 1,494 water samples were collected during 2004-2011 from 124 hotels in four regions of Crete (Greece). Samples were tested for the presence of Legionella spp.; 103 isolates were identified and typed by polymerase chain reaction (PCR) sequencing and, in the case of L. pneumophila sg 1, by sequence-based typing (SBT). Of these, 48 belonged to various serogroups of L. pneumophila (sg 1, 2, 3, 5, 6, 8, 12, 13, and 15), 32 were characterized as L. anisa, 17 as L. taurinensis, and there were single occurrences of L. quinlivanii, L. maceachernii, and L. oakridgensis. Among L. pneumophila sg 1 isolates, one prevalent sequence type was revealed (ST37). The observed variability of Legionella spp. calls into question the assumption of a single dominant ST of L. pneumophila sg 1 and points to the need for genetic-level investigation of all Legionnaires' disease cases.
Tropical Storm Jeanne struck Haiti in September 2004, causing widespread flooding that contaminated water sources, displaced thousands of families and killed approximately 2,800 people. Local leaders distributed PūR®, a flocculant-disinfectant product for household water treatment, to affected populations. We evaluated knowledge, attitudes, practices, and drinking water quality among a sample of PūR® recipients.
We interviewed representatives of 100 households in three rural communities who received PūR® and PūR®-related education. Water sources were tested for fecal contamination and turbidity; stored household water was tested for residual chlorine.
All households relied on untreated water sources (springs [66%], wells [15%], community taps [13%], and rivers [6%]). After distribution, PūR® was the most common in-home treatment method (58%), followed by chlorination (30%), plant-based flocculation (6%), boiling (5%), and filtration (1%). Seventy-eight percent of respondents correctly answered five questions about how to use PūR®; 81% reported PūR® easy to use; and 97% reported that PūR®-treated water appears, tastes, and smells better than untreated water. Although the water sources tested appeared clear, fecal coliform bacteria were detected in all of them (range: 1 to >200 CFU/100 mL). Chlorine was present in 10 (45%) of 22 stored drinking water samples in households using PūR®.
PūR® was well-accepted and properly used in remote communities where local leaders helped with distribution and education. This highly effective water purification method can help protect disaster-affected communities from waterborne disease.
Contamination of drinking water by microbiological and chemical agents can lead to adverse health effects. In England and Wales, the Chemicals Hazards and Poisons Division (CHaPD) of the Health Protection Agency provides expert advice on the public health consequences of chemical contamination incidents affecting drinking water. In this study, we extracted data from the National Database on the type and nature of drinking water contamination events reported to the CHaPD between 2006 and 2008. Eighty-two incidents with confirmed chemical contamination were identified. Among the 70 incidents for which data were available, 40% (28/70) related to contamination of drinking water provided by private suppliers, 31% (22/70) were due to contamination occurring close to the point of consumption (i.e. near the consumer) and 29% (20/70) related to incidents where public water supplies were identified as the contaminated source. For the majority of incidents, little or no information was available on critical exposure variables, such as the duration of contamination and the actual or estimated population affected. Reassuringly, the levels of exposure in most incidents were considered unlikely to cause serious immediate or long-term ill-health effects. Recording of exposure data for reported contamination incidents needs to be improved.
Over a 5 day period in October 2007 a boil-water notice was served on the majority of Oslo, capital city of Norway, as a result of a combination of bacteriological findings (coliforms, intestinal enterococci, and E. coli), and very low numbers of Cryptosporidium oocysts and Giardia cysts in 10 L water samples taken from the water distribution network. The water source had been regularly monitored for these parasites and generally found to be negative. Over 460,000 residents were affected by the boil-water notice, as were many thousands of businesses.
Despite an extensive outbreak of waterborne giardiasis in Bergen, Norway during 2004/2005, the occurrence of parasites in Norwegian drinking water supplies apparently continues to be regarded as of minimal relevance by Norwegian health authorities. Here we describe the background and course of the Oslo episode, including the species of Cryptosporidium detected, and use this event, in conjunction with incidents from other countries, as a basis to discuss the following issues: 1) under which circumstances should the occurrence of Cryptosporidium oocysts and Giardia cysts in water supplies trigger a boil-water notice, and 2) the possibilities and probabilities of post-treatment contamination events in the water distribution network.
We evaluated the ability of UNICEF-designed pot-chlorinators to achieve recommended free residual chlorine (FRC) levels in well water in Bissau, Guinea-Bissau, during a cholera outbreak. Thirty wells were randomly selected from six neighbourhoods. Pot-chlorinators (perforated plastic bottles filled with gravel, sand and calcium hypochlorite granules) were placed in each well. FRC was measured before placement and 24, 48 and 72 h afterwards, and compared with World Health Organization (WHO)-recommended levels of ≥1 mg/L for well water during cholera outbreaks and 0.2-0.5 mg/L in non-outbreak settings. The presence of well covers, the distance from wells to latrines, and rainfall were noted. Complete post-chlorination data were collected from 26 wells. At baseline, no well had FRC >0.09 mg/L. At 24, 48 and 72 h post-chlorination, 4 (15%), 1 (4%) and 0 wells had FRC ≥1 mg/L, and 16 (62%), 4 (15%) and 1 (4%) wells had FRC between 0.2 and 0.5 mg/L, respectively. Several families reported discontinuing household water chlorination after wells were treated with pot-chlorinators. Pot-chlorinators failed to achieve WHO-recommended FRC levels in well water during a cholera outbreak, and conveyed a false sense of security to local residents. Pot-chlorination should be discouraged and alternative approaches to well-water disinfection promoted.
An operating error in a sewage treatment plant led to severe drinking water contamination in a well-defined district of a suburban municipality of Zurich, Switzerland. Despite the alert issued to the local population on the same day advising people not to consume the contaminated water, cases of acute gastroenteric diseases were subsequently observed. Considerable faecal contamination was detected the day after the incident in water samples taken up to 500 m from the sewage plant. In a retrospective epidemiological study involving 240 persons living in the affected area, 126 cases of acute gastrointestinal illness were documented. The epidemic curve revealed a peak incidence two days after the event. Stool samples from 11 of 20 patients were positive for noroviruses or Campylobacter jejuni. Although these microorganisms were not detected in the contaminated water, the subsequently conducted case-control study among the surveyed population showed that consumption of contaminated drinking water was associated with gastrointestinal illness (odds ratio 29.1; 95% confidence interval: 9.8-86.4; p = 0.001). The study also revealed the very probable time period of infection. We present the dimension and chronology of this outbreak and discuss the reasons for its localised and temporary spread.
The role of the water cycle in spreading human pathogenic influenza viruses is poorly studied and is not considered significant. However, gastrointestinal symptoms developed in a large proportion of people infected with influenza A(H1N1) 2009 virus during the 2009 pandemic, and fecal shedding was reported. This fecal route could potentially allow human pathogenic influenza viruses to enter the water cycle. Monitoring of influenza viruses in sewage and surface water during the 2009 pandemic showed that influenza A viruses were detectable in both; the pandemic influenza A(H1N1) 2009 virus itself, however, was not detected. These findings imply that the water cycle did not play a relevant role in spreading the pandemic influenza virus during the epidemic in the Netherlands in 2009. Analyses of deliberately contaminated water samples confirmed the ability of quantitative RT-PCR to detect influenza viruses in sewage samples, whereas the analysis of large volumes of surface water was strongly hampered by the presence of PCR-inhibiting substances.
The Gulf of Mexico Alliance (GOMA) was tasked by the five Gulf State Governors to identify major issues affecting the Gulf of Mexico (GoM) and to set priorities for ameliorating these problems. One priority identified by GOMA is the need to improve detection methods for water quality indicators, pathogens and microbial source tracking. The United States Environmental Protection Agency (USEPA) is tasked with revising water quality criteria by 2012; however, the locations traditionally studied by the USEPA are not representative of the GoM and this has raised concern about whether or not the new criteria will be appropriate. This paper outlines a number of concerns, including deadlines associated with the USEPA Consent Decree, which may prevent inclusion of research needed to produce a well-developed set of methods and criteria appropriate for all regulated waters. GOMA makes several recommendations including ensuring that criteria formulation use data that include GoM-specific conditions (e.g. lower bather density, nonpoint sources), that rapid-testing methods be feasible and adequately controlled, and that USEPA maintains investments in water quality research once the new criteria are promulgated in order to assure that outstanding scientific questions are addressed and that scientifically defensible criteria are achieved for the GoM and other regulated waterbodies.
We studied the shoreward and seasonal distribution of E. coli and enterococci in sand (at the water table) at two southern Lake Michigan beaches, Dunbar and West Beach (Indiana). Deep, backshore sand (approximately 20 m inland) was sampled regularly for 15 months during 2002-2003. E. coli counts were not significantly different in samples taken at 5-m intervals from 0-40 m inland (P = 0.25). Mean counts of E. coli and enterococci showed no correlation with, or differences between, the two beaches studied. In laboratory experiments, E. coli readily grew in sand supplemented with lake plankton, suggesting that in situ E. coli growth may occur when temperature and natural organic sources are adequate. Of the 114 sand enterococci isolates tested, positive species identification was obtained for only 52 (46%), with E. faecium the dominant species (92%). Genetic characterization by ribotyping revealed no distinct genotypic pattern(s) for E. coli, suggesting that the sand population was instead a mixture of numerous strains (genotypes). These findings indicate that E. coli and enterococci can occur and persist for extended periods in backshore sand at the groundwater table. Although this study was limited to two beaches of southern Lake Michigan, similar findings can be expected at other temperate freshwater beaches. The long-term persistence of these bacteria, perhaps independent of pollution events, complicates their use as indicator organisms. Further, backshore sand at the water table may act as a reservoir for these bacteria and potentially for human pathogens.
A method is described for the determination, in bottled water, of 42 hazardous residues regulated under Japan's 'Positive List System'. The compounds were extracted by solid-phase extraction using C18 disks. Determination was carried out by gas chromatography/mass spectrometry (GC/MS) and liquid chromatography-tandem mass spectrometry (LC/MS/MS). Disk extraction offers high throughput and is well suited to isolating and enriching these compounds from large volumes of water. For water samples spiked at three concentration levels (LOQ, 4 times LOQ and 8 times LOQ), the recoveries of all analytes ranged between 65% and 120% with a relative standard deviation <24% (n=8).
Corrosion and scaling are major problems in water distribution systems, so evaluation of water corrosivity is a routine test in water networks. To evaluate water stability in the Bandar Abbas water distribution system, the network was divided into 15 clusters and 45 samples were taken. The Langelier, Ryznar, Puckorius, Larson-Skold (LS) and Aggressive indices were determined and compared with the marble test. The mean parameters were pH (7.8 ± 0.1), electrical conductivity (1,083.9 ± 108.7 μS/cm), total dissolved solids (595.7 ± 54.7 mg/L), Cl (203.5 ± 18.7 mg/L), SO4 (174.7 ± 16.0 mg/L), alkalinity (134.5 ± 9.7 mg/L), total hardness (156.5 ± 9.3 mg/L), HCO3 (137.4 ± 13.0 mg/L) and calcium hardness (71.8 ± 4.3 mg/L). According to the Ryznar, Puckorius and Aggressive indices, all samples were stable; based on the Langelier Index, 73% of samples were slightly corrosive and the rest were scale-forming; according to the LS index, all samples were corrosive. Marble test results showed that the water of all 15 clusters tended toward scale formation. Water in Bandar Abbas is slightly scale-forming. The indices most appropriate for the network conditions are the Aggressive, Puckorius and Ryznar indices, which were consistent with the marble test.
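As a sketch of how one such stability index is computed, the snippet below evaluates the Langelier Saturation Index (LSI = pH − pHs) with a commonly used empirical approximation for pHs, fed with the mean values reported above. The water temperature is not reported in the abstract, so 25 °C is assumed here; the exact constants used by the study may differ.

```python
import math

def langelier_index(ph, tds_mg_l, ca_hardness_caco3, alkalinity_caco3,
                    temp_c=25.0):
    """Langelier Saturation Index via the common empirical form
    pHs = (9.3 + A + B) - (C + D). Negative LSI suggests corrosive
    (CaCO3-dissolving) water; positive suggests scale-forming."""
    A = (math.log10(tds_mg_l) - 1) / 10           # TDS term
    B = -13.12 * math.log10(temp_c + 273) + 34.55  # temperature term
    C = math.log10(ca_hardness_caco3) - 0.4        # Ca hardness (as CaCO3)
    D = math.log10(alkalinity_caco3)               # alkalinity (as CaCO3)
    phs = (9.3 + A + B) - (C + D)
    return ph - phs

# Mean Bandar Abbas values from the study; 25 C is an assumption.
lsi = langelier_index(7.8, 595.7, 71.8, 134.5)
print(round(lsi, 2))  # slightly negative -> mildly corrosive
```

With these mean inputs the LSI comes out slightly negative, consistent with the report that most samples were classified as slightly corrosive by the Langelier Index.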
Membrane filtration, multiple-tube fermentation (the standard methods) and Colilert are techniques available for assessing drinking water quality, but there are no published comparisons of Colilert with the standard methods in a developing-country laboratory. We reviewed the published literature on Colilert and the standard methods and conducted a study comparing Colilert with membrane filtration for the detection and enumeration of total coliforms and fecal coliforms (Escherichia coli) in 35 stored drinking water samples from households in Abidjan, Côte d'Ivoire. Our results are consistent with previous published studies conducted in developed countries. Results from Colilert and membrane filtration correlated for both total coliforms (r² = 0.81) and E. coli (r² = 0.93). Colilert is an acceptable method to measure the presence and quantity of coliforms in water samples in a developing-country setting.
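The method comparison above is summarized by r² values between paired counts from the two techniques. A minimal sketch of that statistic is below; the paired log-transformed counts are hypothetical, for illustration only (the study's r² values came from its 35 household samples).

```python
import math

def r_squared(xs, ys):
    """Squared Pearson correlation between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return (cov / (sx * sy)) ** 2

# Hypothetical paired counts, log10-transformed as is usual for
# microbial enumeration data.
colilert = [1.2, 2.1, 0.9, 3.0, 2.5]   # log10 MPN/100 mL
membrane = [1.0, 2.3, 1.1, 2.8, 2.6]   # log10 CFU/100 mL
print(round(r_squared(colilert, membrane), 2))
```

An r² near 1 means one method's counts are nearly a linear function of the other's, which is the basis for treating Colilert as an acceptable substitute for membrane filtration.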
Members of the genus Vibrio are common in aquatic environments. Among them are V. cholerae, V. vulnificus, V. parahaemolyticus and V. mimicus. Several studies have shown that environmental factors, such as temperature, salinity, and dissolved oxygen, are involved in their epidemiology. Therefore, the main objective of this study was to determine whether there is a correlation between the presence/abundance of V. cholerae, V. vulnificus, V. parahaemolyticus and V. mimicus and the environmental conditions of the seawater off the coast of Guaymas, México. Quantification of all four pathogenic bacteria was performed using the most probable number method, and suspected colonies were identified by polymerase chain reaction (PCR). Correlations were identified using principal component analysis. V. parahaemolyticus was the most abundant and most widely distributed species, followed by V. vulnificus, V. mimicus and V. cholerae. Positive correlations of V. parahaemolyticus, V. vulnificus and V. mimicus with temperature, salinity, electric conductivity, and total dissolved solids were found. The abundance of V. cholerae was mainly affected by the sampling site and not by physicochemical parameters.
Community diversity and abundance of biofilms from a full-scale drinking water distribution system in Shanghai were characterized by denaturing gradient gel electrophoresis (DGGE) analysis of 16S rRNA sequences and heterotrophic plate counts (HPC), respectively. Bacteria affiliated with the Beta- and Gamma-Proteobacteria dominated both the in-situ and the HPC-culturable bacterial communities. Other bacteria present included members of the Alphaproteobacteria, Bacteroides, Actinobacteria, Nitrospirae and Firmicutes. Acidovorax, Ralstonia and Acinetobacter were common genera in biofilms. Klebsiella pneumoniae and Enterobacter sp. were detected in the local distribution system. Dissolved organic carbon (DOC), residual disinfectant and temperature were the most important factors influencing both bacterial abundance and composition. The HPC of a biofilm sample was not correlated with its community diversity.
In order to study the prevalence of enteric pathogens capable of causing infection and disease in the rural communities of Nkonkobe, bacterial isolates were collected from several surface water and groundwater sources used by the community for their daily water needs. Using selective culture media and the API 20E kit, presumptive Escherichia coli, Salmonella spp. and Vibrio cholerae isolates were obtained and then analysed by polymerase chain reaction (PCR) assays. The PCR successfully amplified, from water samples, a fragment of the E. coli uidA gene, which codes for beta-D-glucuronidase and is a highly specific characteristic of enteropathogenic, enterotoxigenic and entero-invasive E. coli. The PCR also amplified the epsM gene from water samples containing toxigenic V. cholerae. Although E. coli was mostly detected in groundwater sources, toxigenic V. cholerae was detected in both surface and groundwater sources. Salmonella typhimurium was possibly present in the Ngqele and Dyamala borehole water samples. The presence of these pathogenic bacteria in the above drinking water sources may pose a serious health risk to consumers.
This study was conducted to address the distribution of Acanthamoeba genotypes in therapeutic hot springs in Iran. Sixty water and sediment samples were collected from bicarbonate, sulphur, and sodium chloride thermal springs in the northwest. All hot springs examined are used mainly for health purposes in Iran. Acanthamoeba were identified by both morphology and PCR (polymerase chain reaction). Genotype identification was based on the sequencing of a highly variable and informative region of Diagnostic Fragment 3 (stem 29-1 of 18S rRNA gene) within Acanthamoeba-specific amplimer (ASA.S1). Twenty percent of hot springs were contaminated with thermotolerant Acanthamoeba belonging to the potentially pathogenic T4 and T3 genotypes. A high number (91.7%) of strains showed growth at 37 °C, and eight isolates showed growth at 42 °C. A single isolate (HSNW2) was detected in waters at 70 °C. The presence of thermotolerant Acanthamoeba highlights a risk factor for susceptible individuals, as Acanthamoeba-related keratitis continues to rise in Iran. Periodic surveillance of thermal waters as well as improved filtration and disinfection is recommended to prevent disease related to pathogenic Acanthamoeba. This is the first comprehensive molecular study of Acanthamoeba genotypes in hot springs in Iran and the first to report the occurrence of the T3 genotype (corresponding to Acanthamoeba griffini) in thermal water sources in this country.
The ability of pathogenic free-living amoebae to produce infections is a growing concern. In this study, we investigated the presence of free-living amoebae (Acanthamoeba spp., Naegleria fowleri, Balamuthia mandrillaris) in drinking water supplies in Karachi, Pakistan. Fifty-two domestic tap water samples were examined. Amoebae were identified by morphological characteristics and polymerase chain reaction. Thirty percent of the examined samples were positive for Acanthamoeba spp. and 8% for N. fowleri, while B. mandrillaris was not recovered. Additionally, we examined secretory IgA antibodies to Acanthamoeba and B. mandrillaris. The Acanthamoeba antibody prevalence rate was 100% in both males and females, while the B. mandrillaris antibody prevalence rate was 5.5% in males only (females were negative). Our findings suggest that free-living amoebae are a potential health hazard in domestic water supplies in Karachi, Pakistan.
A comprehensive survey assessing the presence of Acanthamoeba was conducted on 50 samples from water sources in parks and public squares in 22 municipal districts of Tehran, Iran. The prevalence and genotypes of Acanthamoeba were determined by PCR, and the amplified ribosomal RNA gene fragments were sequenced. Sixteen (32%) samples were positive for Acanthamoeba spp. Sequence analysis revealed that the positive isolates belonged to the T4 and T5 genotypes: fourteen isolates (87.5%) were T4, and two (12.5%) were T5. Acanthamoeba may be a problematic organism for contact lens wearers and for immunocompromised individuals. In Iran, Acanthamoeba keratitis has increased in recent years, mainly due to poor hygiene among contact lens wearers. A thorough survey of the prevalence of this amoeba could play a significant role in disease prevention. This is the first report of the T5 genotype from water in recreational areas of Tehran.
The purpose was to identify the prevalence of naked amoebae in tap water in south Florida to ascertain the risk of amoebal infections of the cornea in contact lens wearers.
Over the course of a 2-year period, water samples were collected from sites throughout Broward, Palm Beach, and Dade counties, Florida. The presence of amoebae in samples was determined using an enrichment cultivation method appropriate for Acanthamoeba. Amoebae were identified using diagnostic features discernible by light microscopy.
A total of 283 water samples were processed and amoebae were noted in 80 of these. Acanthamoeba were found on 8 occasions (2.8%). The genera Hartmannella and Vahlkampfia, rarely involved in keratitis cases, were found in 3.5% and 2.8% of samples, respectively. A total of 19 different naked amoebae were recorded and amoebae (regardless of genus) were present in 19.4% of all samples.
Previous surveys in England and Korea have shown that acanthamoebae are found in 15 to 30% of household tap water samples and have been associated with corneal infection in contact lens wearers. The occurrence of acanthamoebae in tap water in the USA (2.8%) was found to be lower than that in the UK, and it has been postulated that this is related to the lack of a storage water tank in the roof loft space. However, municipal water treatment is clearly not effective at killing amoebal cysts (or trophozoites), as evidenced by the high occurrence of amoebae (19.4%) in this study.
Monitoring progress towards the targets for access to safe drinking-water and sanitation under the Millennium Development Goals (MDG) requires reliable estimates and indicators. We analyzed trends and reviewed current indicators used for those targets. We developed continuous time series for 1990 to 2015 for access to improved drinking-water sources and improved sanitation facilities by country using multilevel modeling (MLM). We show that MLM is a reliable and transparent tool with many advantages over alternative approaches to estimate access to facilities. Using current indicators, the MDG target for water would be met, but the target for sanitation missed considerably. The number of people without access to such services is still increasing in certain regions. Striking differences persist between urban and rural areas. Consideration of water quality and different classification of shared sanitation facilities would, however, alter estimates considerably. To achieve improved monitoring we propose: (1) considering the use of MLM as an alternative for estimating access to safe drinking-water and sanitation; (2) completing regular assessments of water quality and supporting the development of national regulatory frameworks as part of capacity development; (3) evaluating health impacts of shared sanitation; (4) using a more equitable presentation of countries' performances in providing improved services.
Support is growing for the incorporation of fetching time and/or distance considerations in the definition of access to improved water supply used for global monitoring. Current efforts typically rely on self-reported distance and/or travel time data that have been shown to be unreliable. To date, however, there has been no head-to-head comparison of such indicators with other possible distance/time metrics. This study provides such a comparison. We examine the association between both straight-line distance and self-reported one-way travel time with measured route distances to water sources for 1,103 households in Nampula province, Mozambique. We find straight-line, or Euclidean, distance to be a good proxy for route distance (R² = 0.98), while self-reported travel time is a poor proxy (R² = 0.12). We also apply a variety of time- and distance-based indicators proposed in the literature to our sample data, finding that the share of households classified as having versus lacking access would differ by more than 70 percentage points depending on the particular indicator employed. This work highlights the importance of the ongoing debate regarding valid, reliable, and feasible strategies for monitoring progress in the provision of improved water supply services.
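Straight-line (Euclidean, or great-circle at this scale) distance between a household and its water source can be computed directly from GPS coordinates, which is what makes it an attractive proxy for measured route distance. A minimal sketch using the haversine formula; the coordinate pair below is a hypothetical example, not data from the study:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle ('straight-line') distance in km between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical household and water-source coordinates (Nampula province area).
d = haversine_km(-15.1165, 39.2666, -15.1201, 39.2715)
```

At household-to-source scales (hundreds of metres), the curvature correction is negligible and this matches a planar Euclidean distance closely.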
Point-of-use water chlorination reduces diarrhoea risk by 25-85%. Social marketing has expanded access to inexpensive sodium hypochlorite for water treatment, at a cost of less than US$0.01 per day, in Kenya. To increase product access, women's groups in western Kenya were trained to educate neighbours and sell health products to generate income. We evaluated this programme's impact on equity of access to water treatment products in a cross-sectional survey. We surveyed 487 randomly selected households in eight communities served by the women's groups. Overall, 20% (range 5-39%) of households in eight communities purchased and used chlorine, as confirmed by residual chlorine observed in stored water. Multivariate models using illiteracy and the poorest socioeconomic status as a referent showed that persons with at least some primary education (OR 2.5, 95% CI 1.8, 3.5) or secondary education (OR 5.4, 95% CI 1.6, 17.5) and persons in the four wealthiest quintiles (OR 2.5, 95% CI 1.0, 6.0) were more likely to chlorinate stored water. While this implementation model was associated with good product penetration and use, barriers to access to inexpensive water treatment remained among the very poor and less educated.
Quantitative Microbial Risk Assessment (QMRA) models with 10,000 Monte Carlo simulations were applied to ascertain the risks of rotavirus and Ascaris infections for farmers using different irrigation water qualities, and for consumers of lettuce irrigated with those waters after post-harvest handling. Tolerable risks (TR) of infection of 7.7 × 10⁻⁴ and 1 × 10⁻² per person per year were used for rotavirus and Ascaris, respectively. The risk of Ascaris infection was of the order of 10⁻² for farmers accidentally ingesting drain or stream irrigation water, approximately 10⁰ for farmers accidentally ingesting farm soil, and 10⁰ for farmers ingesting any of the irrigation waters together with contaminated soil. There was a very low risk (10⁻⁵) of Ascaris infection for farmers using pipe water. For consumers, the annual risks of Ascaris and rotavirus infections were 10⁰ and 10⁻³ for drain- and stream-irrigated lettuce, respectively, with slight increases in rotavirus infection risk along the post-harvest handling chain. Pipe-irrigated lettuce gave a rotavirus infection risk of 10⁻⁴, with no changes due to post-harvest handling. The assessment identified on-farm soil contamination as the most significant health hazard.
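A QMRA of this kind combines a dose-response model with a Monte Carlo simulation over uncertain exposure doses, then converts per-event risk into annual risk. The sketch below is illustrative only: the beta-Poisson parameters are commonly cited rotavirus values, and the lognormal exposure distribution and number of exposure events are assumptions, not the study's inputs:

```python
import math
import random

def rotavirus_p_infection(dose, alpha=0.253, n50=6.17):
    """Approximate beta-Poisson dose-response. alpha and N50 are commonly
    cited rotavirus parameters (an assumption here, not the study's values)."""
    return 1.0 - (1.0 + (dose / n50) * (2.0 ** (1.0 / alpha) - 1.0)) ** (-alpha)

def annual_risk_mc(log10_dose_mean, log10_dose_sd, exposures_per_year,
                   n_sims=10_000, seed=1):
    """Monte Carlo mean annual infection risk for a lognormally distributed dose."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        dose = 10 ** rng.gauss(log10_dose_mean, log10_dose_sd)
        p_event = rotavirus_p_infection(dose)
        # Annualize: probability of at least one infection over all exposure events.
        total += 1.0 - (1.0 - p_event) ** exposures_per_year
    return total / n_sims

# Hypothetical exposure: ~0.01 virus ingested per event, 150 irrigation days/year.
annual = annual_risk_mc(log10_dose_mean=-2.0, log10_dose_sd=0.5,
                        exposures_per_year=150)
```

The annualized result can then be compared against a tolerable-risk benchmark such as the 7.7 × 10⁻⁴ per person per year used above.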
Quantitative microbial risk assessment (QMRA) is frequently used to estimate health risks associated with wastewater irrigation and requires pathogen concentration estimates as inputs. However, human pathogens, such as viruses, are rarely quantified in water samples, and simple relationships between fecal indicator bacteria and pathogen concentrations are used instead. To provide data that can be used to refine QMRA models of wastewater-fed agriculture in Accra, stream, drain, and waste stabilization pond waters used for irrigation were sampled and analyzed for concentrations of fecal indicator microorganisms (human-specific Bacteroidales, Escherichia coli, enterococci, thermotolerant coliform, and somatic and F+ coliphages) and two human viruses (adenovirus and norovirus genogroup II). E. coli concentrations in all samples exceeded limits suggested by the World Health Organization, and human-specific Bacteroidales was found in all but one sample, suggesting human fecal contamination. Human viruses were detected in 16 out of 20 samples, were quantified in 12, and contained 2-3 orders of magnitude more norovirus than predicted by norovirus to E. coli concentration ratios assumed in recent publications employing indicator-based QMRA. As wastewater irrigation can be beneficial for farmers and municipalities, these results should not discourage water reuse in agriculture, but provide motivation and targets for wastewater treatment before use on farms.
A quantitative microbial risk assessment was applied to evaluate the microbial risks of the Accra Urban Water System (AUWS). The exposure assessment was based on counts of indicator organisms in wastewater from open roadside drains and in water and sand samples from the beach. The predicted total disease burden generated in a representative catchment of the AUWS (Odaw Catchment) was 36,329 Disability Adjusted Life Years (DALYs) per year, of which 12% and 88% were caused by shortcomings in the water supply system and inappropriate sanitation, respectively. The DALYs per person per year were above the WHO reference value. The open roadside drains made the highest contribution to the disease burden. Of four possible interventions evaluated for health risk reduction, the highest efficiency in terms of DALYs averted per euro invested would be achieved by providing covers for the open roadside drains.
Our study aimed to assess the accumulation of bacteriophages in sandy and clayey fresh water sediments. All 24 natural fresh water sediments were positive for somatic and F-specific phages, even though phage concentrations in the overlying water were undetectable in 1 of 24 (4%) samples for somatic phages and 11 of 24 (46%) samples for F-specific phages. Based on the sediment-to-water ratios, F-specific phages accumulate over 100 times more than somatic coliphages in clayey sediments. Inactivation of bacteriophages in clayey and sandy sediments over a 1-month period at 15 degrees C was negligible. Our data suggest that persistence of deposited viruses in fresh water sediments leads to accumulation, and the findings call for additional investigations into the fate of entrapped pathogenic viruses.
Rainfall and river flows are environmental variables influencing the microbial status of bivalve mollusc harvesting areas. This study investigated spatial and temporal relationships between rainfall, river flows and concentrations of Escherichia coli in mussels (Mytilus spp.) and Pacific oysters (Crassostrea gigas) from three harvesting areas in the Dart Estuary over the period 1996-2009. Mussels growing on the riverbed were found to be more contaminated than oysters growing in the water column. A step change in the levels of the microbial indicator was identified in both species from all harvesting areas. The highest levels of E. coli were detected when total rainfall exceeded 2 mm and water levels in the main tributaries exceeded the mean flow. The magnitude of response in levels of E. coli to these hydrological events varied between species and monitoring points, but was consistently higher between the 3rd and 4th days after the rainfall event. This lag time is assumed to result from catchment topography and geology determining peak levels of runoff at the headwaters 12-24 h after rainfall events. It is considered that future risk management measures may include sampling targeted at hydrograph events.
The presence/absence hydrogen sulphide test (P/A H2S) is widely used as a low-cost alternative faecal indicator test in remote and resource-poor settings. The aim of the paper is to assess how bacterial density and sample volume affect its accuracy. Based on a systematic search, we identified studies that tested water samples (n = 2,034) using both the P/A H2S test and recognised tests for thermotolerant coliforms (TTC) or Escherichia coli. We calculated P/A H2S test specificity and sensitivity against a range of TTC and E. coli densities. For two studies, we compared this with sensitivity and specificity estimates for simulated 100 and 20 ml presence/absence tests. For most of the 19 included studies, as the threshold used to define contamination increased from 1 to 100 cfu/100 ml, P/A H2S test sensitivity increased but specificity decreased. Similarly, the simulation indicated that increasing test volumes from 20 to 100 ml increased sensitivity but reduced specificity. There was potential for bias, for example from lack of blinding during test interpretation, in most of the studies reviewed. In assessing the P/A H2S test as an alternative to standard methods, careful consideration of likely indicator bacteria levels and sample volume is required.
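The threshold effect described above can be made concrete by computing sensitivity and specificity of the presence/absence result against a reference count at different contamination thresholds. A minimal sketch on hypothetical paired data (not the review's dataset):

```python
def sens_spec(h2s_positive, reference_cfu, threshold_cfu):
    """Sensitivity/specificity of a P/A test against a reference count,
    with 'contaminated' defined as reference >= threshold (cfu/100 mL)."""
    tp = fp = tn = fn = 0
    for pos, cfu in zip(h2s_positive, reference_cfu):
        contaminated = cfu >= threshold_cfu
        if pos and contaminated:
            tp += 1
        elif pos and not contaminated:
            fp += 1
        elif not pos and contaminated:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Hypothetical paired results: H2S presence/absence vs TTC counts (cfu/100 mL).
h2s = [True, True, False, True, False, False, True, False]
ttc = [250, 40, 0, 5, 2, 0, 120, 60]
sens_1, spec_1 = sens_spec(h2s, ttc, threshold_cfu=1)
sens_100, spec_100 = sens_spec(h2s, ttc, threshold_cfu=100)
```

With these toy data, raising the threshold from 1 to 100 cfu/100 mL raises sensitivity and lowers specificity, mirroring the pattern reported for most of the included studies.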
Chlorine is the most widely used disinfectant worldwide, partially because residual protection is maintained after treatment. This residual is measured using colorimetric test kits varying in accuracy, precision, training required, and cost. Seven commercially available colorimeters, color wheel and test tube comparator kits, pool test kits, and test strips were evaluated for use in low-resource settings by: (1) measuring in quintuplicate 11 samples from 0.0-4.0 mg/L free chlorine residual in laboratory and natural light settings to determine accuracy and precision; (2) conducting volunteer testing where participants used and evaluated each test kit; and (3) comparing costs. Laboratory accuracy ranged from 5.1-40.5% measurement error, with colorimeters the most accurate and test strip methods the least. Variation between laboratory and natural light readings occurred with one test strip method. Volunteer participants found test strip methods easiest and color wheel methods most difficult, and were most confident in the colorimeter and least confident in test strip methods. Costs ranged from USD 3.50 to 444 per 100 tests. Application of a decision matrix found colorimeters and test tube comparator kits were most appropriate for use in low-resource settings; it is recommended that users apply the decision matrix themselves, as the appropriate kit might vary by context.
Haloacetic acids (HAAs) are produced by the reaction of chlorine with natural organic matter and are regulated disinfection by-products of health concern. Biofilms in drinking water distribution systems and in filter beds have been associated with the removal of some HAAs; however, the removal of all six routinely monitored species (HAA(6)) has not previously been reported. In this study, bench-scale glass bead columns were used to investigate the ability of a drinking water biofilm to degrade HAA(6). Monochloroacetic acid (MCAA) and monobromoacetic acid (MBAA) were the most readily degraded of the halogenated acetic acids. Trichloroacetic acid (TCAA) was not removed biologically when examined at a 90% confidence level. In general, di-halogenated species were removed to a lesser extent than the mono-halogenated compounds. The order of biodegradability by the biofilm was found to be monobromo > monochloro > bromochloro > dichloro > dibromo > trichloroacetic acid.
Most methods for the analysis of haloacetic acids published in recent years are based on ion chromatography with direct injection, employing a gradient elution with potassium hydroxide (KOH). This work reports the exploration of an alternative eluent, a buffer of sodium carbonate/sodium hydrogen carbonate, aimed at the simultaneous analysis of nine haloacetic acids along with bromate, chlorite and chlorate. The combination of a less alkaline eluent and a lower operating temperature may prevent the partial decomposition of some of the haloacetic acids during the analytical process, especially the more vulnerable brominated ones. Gradient elution at a temperature of 7 °C yielded the best results, with an acceptable separation of 17 analytes (including the major natural inorganic anions) and good linearity. Precision ranged from 0.3 to 23.4% (coefficient of variation), and detection limits are of the order of a few μg L(-1), except for tribromoacetic acid, which is somewhat high in comparison with those of the official methods. Nonetheless, with the basic instrumentation setup described herein, this method may be suitable for monitoring when drinking water treatments are to be optimized. This is especially interesting for small communities or for developing/developed countries in which regulations on disinfection by-products other than trihalomethanes are being addressed.
Acinetobacter spp. in surface waters are a major concern because of their rapid development of resistance to a wide range of antimicrobials and their ability to persist in these waters for a very long time. Four surface water isolates of Acinetobacter exhibiting both multidrug and multimetal resistance were isolated and identified through biochemical tests and 16S rDNA sequencing. Based on these analyses, two hemolytic isolates were affiliated with Acinetobacter haemolyticus (accession number X81662). The other two non-hemolytic isolates were identified as Acinetobacter johnsonii and Acinetobacter calcoaceticus (accession numbers Z93440 and AJ888983, respectively). The antibiotic and heavy metal resistance profiles of the isolates were determined using 26 antibiotics and 17 heavy metals. The Acinetobacter isolates displayed resistance to β-lactams, cephalosporins, aminoglycosides, and sulfonamides. The hemolytic isolates showed resistance to a greater number of heavy metals than the non-hemolytic ones. Given the potential health risk posed by these pathogenic bacteria, an accurate assessment of their acquired resistance to multiple drugs and metals is needed.
The virulence factor concept has been a powerful engine in driving research and the intellectual flow in the fields of microbial pathogenesis and infectious diseases. This review analyzes virulence factors from the viewpoint of the damage-response framework of microbial pathogenesis, which defines virulence factors as microbial components that can damage a susceptible host. At a practical level, the finding that effective immune responses often target virulence factors provides a roadmap for future vaccine design. However, there are significant limitations to this concept, which are rooted in the inability to define virulence and virulence factors in the absence of host factors and the host response. In fact, this concept appears to work best for certain types of bacterial pathogens, being less well suited for viruses and commensal organisms with pathogenic potential.
In 1997, a compulsory notification system for waterborne outbreaks was introduced in Finland. The main aim of this notification is to obtain immediate information on suspected waterborne outbreaks in order to restrict and manage the outbreak promptly. During the past ten years, there have been 67 waterborne outbreaks in Finland, mainly associated with small groundwater supplies or private wells. The number of reported waterborne outbreaks has increased since the launch of the notification system, indicating that the threshold of outbreak detection has most probably decreased. The number of cases of illness has fulfilled the national health target, which is below 0.01% of the population, but more action is still needed to ensure the production of safe drinking water under all circumstances. Ten years' accumulation of knowledge on outbreaks has revealed that a compulsory notification system is an effective tool for gathering information on waterborne outbreaks. The system has also increased awareness of possible problems related to the quality of drinking water. This article summarises management and legislative actions and policy measures taken so far in Finland to reduce the number of outbreaks and cases of illness related to them.
Giardia spp. and Cryptosporidium spp. are recognized worldwide as highly infectious protozoan parasites that can cause severe gastrointestinal disease in humans and animals. The detection of these pathogens in activated sludge samples is of interest given the increasing use of sewage sludge (biosolids) in agriculture. A total of 22 samples were collected and evaluated by centrifugal concentration, with or without a subsequent purification step (ether clarification and sucrose flotation). Student's t-tests comparing the two procedures indicated a higher recovery rate of Giardia cysts with centrifugal concentration alone; with regard to Cryptosporidium oocysts, no significant differences were found between the two methods, as only two samples were positive. The centrifugal concentration procedure was shown to be the simplest and cheapest to perform, as emphasized by the recovery efficiency results.
Granular activated carbon (GAC) was used to remove bromide (Br(-)) and bromate (BrO3(-)) from drinking water in both bench- and pilot-scale experiments. The present study aims to minimize BrO3(-) formation and eliminate BrO3(-) generated during the ozonation of drinking water, particularly in packaged drinking water. Results show that the Br(-) and BrO3(-) levels in GAC-treated water decreased in both bench- and pilot-scale experiments. In the bench-scale experiments, when the empty bed contact time (EBCT) was 5 min, the highest reduction rates of Br(-) in the mineral and ultrapure water were found to be 74.9% and 91.2%, respectively, and those of BrO3(-) were 94.4% and 98.8%, respectively. The GAC capacity for Br(-) and BrO3(-) removal increased with the increase in EBCT. Reduction efficiency was better in ultrapure water than in mineral water. In the pilot-scale experiments, the minimum reduction rates of Br(-) and BrO3(-) were 38.5% and 73.2%, respectively.
We compared, in extracts of activated sludge, the number of enteroviruses detectable with buffalo green monkey (BGM) cell cultures to the number of enteroviral genomes determined by reverse-transcription quantitative real-time PCR (RT-qPCR). In order to find conditions adequate for quantifying enteroviral RNA isolated from (waste)water, we investigated affinity capture of RNA with polystyrene beads (Dynabeads). The capture efficiency strongly depended on the genomic region chosen for the affinity binding. Capture of the RNA by its 3'-tail was most efficient (almost 100%); other regions within the genome yielded variable but lower results. Indirect capture (first hybridization of the RNA to the oligonucleotides, then attachment of the duplex molecules to the beads) was much more efficient than direct capture (attachment of the oligonucleotides to the beads first, then binding of the RNA), and resulted in RNA capture of at most 60-80%. At least in part, this was due to incomplete hybridization of the RNA to the complementary oligonucleotides. No correlation was found between the number of cytopathic effects (CPE) determined by cell culture and the number of genomes quantified by RT-qPCR; RT-qPCR values were consistently much higher than the number of CPE. This points to overestimation of infectious enteroviruses by RT-qPCR and/or underestimation by the cell culture approach.