Article · PDF Available

Abstract

Between July 1, 2007 and December 31, 2010, nearly 4,000 rural drinking water supplies were analyzed for coliform bacteria, nitrate, fluoride and 15 metals as part of a state-funded program that provides assistance to low-income families. Nearly half of these wells had an exceedance of at least one health-based water quality standard. Test results for iron and coliform bacteria exceeded safe limits in 21% and 18% of these wells, respectively. In addition, 10% of the water samples from these wells were high in nitrate, and approximately 15% had an elevated result for aluminum, arsenic, lead or manganese. These findings emphasize the importance of water quality monitoring to the health of nearly a million Wisconsin families that obtain their water from a privately-owned well.
... Those responsible for health surveillance actions were professionals linked to public health departments (40; 81.63%) at the municipal, state, or federal level [30-39,41,42,45-50,52-62,64,66-75]; agencies responsible for water protection [44,76]; departments of environment and conservation [40]; departments of public health engineering [77]; national water laboratories [29]; or in-field sanitary inspectors of a water test laboratory [51]. Sometimes, the surveillance involved other sectors besides health (e.g., environmental agencies, civil protection, and agencies responsible for the supply) [44-46,48,54,64] or was conducted by a national network for monitoring environmental public health [63,65]. ...
... In addition, turbidity was analyzed as a relevant indicator [34,35,37,39-41,49,51,53] and was frequently associated with acute diarrhea when the maximum limit was exceeded [37]. Bacterial contamination by coliforms [32-34,37,39,41,45,51,68,72] and Escherichia coli [33,35,39,45,51,64,75] was also correlated with the emergence of acute diarrhea [75]. The assessment of fluoride levels was indicated as a potential parameter for preventing cavities in the population [31,35,36,41,47]. ...
... The water quality surveillance actions highlighted in this class addressed the identification of chemical contaminants in households (i.e., lead [70,72], arsenic [66,71,72], iron [70,72,77], nitrate [68,72,77], fluoride [46,77], aluminum, manganese, strontium, and nitrogen [72]), revealed locations that did not fully comply with chemical-contamination regulations [48], and identified volatile organic compounds within allowed levels [68]. Identifying these contaminants has been part of testing programs at the residences of people from at-risk groups (e.g., women with children [70] and low-income families with pregnant women and young children [72]) and of assessments of supply sources and of sources needing treatment after a disaster [50]. ...
Article
Full-text available
This study identified and mapped worldwide surveillance actions and initiatives of drinking water quality implemented by government agencies and public health services. The scoping review was conducted between July 2021 and August 2022 based on the Joanna Briggs Institute method. The search was performed in relevant databases and gray literature; 49 studies were retrieved. Quantitative variables were presented as absolute and relative frequencies, while qualitative variables were analyzed using the IRaMuTeQ software. The actions developed worldwide and their impacts and results generated four thematic classes: (1) assessment of coverage, accessibility, quantity, and drinking water quality in routine and emergency situations; (2) analysis of physical–chemical and microbiological parameters in public supply networks or alternative water supply solutions; (3) identification of household water contamination, communication, and education with the community; (4) and investigation of water-borne disease outbreaks. Preliminary results were shared with stakeholders to favor knowledge dissemination.
... A glance at the results obtained in this dimension shows that participants have a low literacy level regarding water quality issues, whatever the competence under review (i.e., obtain, understand, assess, or apply). The outcomes obtained in the Water Quality dimension agree with those achieved by Gibson and Pieper [30], Maleckia et al. [48], and Knobeloch et al. [41]. According to these authors, most owners of private abstractions do not test their water even though they are concerned about environmental pollution problems. ...
Chapter
Nowadays, an increasing amount of water is used without the awareness that this resource is not inexhaustible. In fact, pollution, environmental degradation and/or climate change caused by human activities lead to the degradation of the quality of available water. In 2015, the United Nations warned about the risk of reaching a water deficit of 40% by 2030 if consumption patterns are not changed. Indeed, population growth is one of the main causes of this deficit. The protection and sustainable consumption of water is one of the United Nations' Sustainable Development Goals, to ensure that the world's population has access to clean water, free from pollution and managed responsibly. Thus, governments and non-governmental organizations must promote the conscious and informed use of water by the population. It is essential that the population becomes aware of the need for efficient management of water resources, ensuring their quality and preventing their degradation, so as not to compromise their future availability. Knowledge of the population's literacy on water issues and on water quality-health interconnections is essential to design plans leading to the implementation of eco-sustainable practices. The goal of this research was to evaluate the literacy of water consumers and to establish a forecast model for managing water literacy. Information was collected through a questionnaire applied to a cohort of 453 participants. The questionnaire includes three main dimensions (Water Quality, Disease Prevention and Sustainability/Public Health Promotion), and in each dimension four competencies were evaluated (obtain, understand, assess and apply information regarding water consumption). The results obtained allow us to assert that in the first two dimensions, the competence in which participants show the most difficulty is assess.
Regarding the Sustainability/Public Health Promotion dimension, the participants show the most difficulty in the competence apply. The model presented in this research, grounded in the connectionist paradigm, has shown great efficiency in forecasting the target variable. The key contribution of the present research is an integrated and systematic approach that can help increase water literacy, allowing the implementation of eco-sustainable practices. Keywords: Artificial intelligence; Artificial neural networks; Sustainable use of water; Water literacy assessment; Water management; Water quality
... Multiple TW-exposure hypotheses, relevant to BW specifically and to POU drinking water in general, were assessed. In line with an increasingly anthropized water cycle and with previous TW results by this research group (Bradley et al. 2018; Bradley et al. 2020; Bradley et al. 2021a; Bradley et al. 2021b; Bradley et al. 2022) and others (e.g., de Jesus Gaffney et al. 2015; Focazio et al. 2006; Knobeloch et al. 2013; Postma et al. 2011; Rogan and Brady 2009), simultaneous exposures to multiple inorganic and organic constituents of potential human-health interest were expected to occur in BW samples (Hypothesis I). Exceedances of FDA-enforceable BW standard of quality (SOQ, "shall not contain in excess of") levels (U.S. Food & Drug Administration 2021), adopted from and, with few exceptions (e.g., lead [Pb]), equivalent to the EPA public-supply enforceable National Primary Drinking Water Regulation maximum contaminant levels (MCL) (U.S. Environmental Protection Agency 2021a, e), were not expected (Hypothesis II). ...
Article
Background Bottled water (BW) consumption in the United States and globally has increased amidst heightened concern about environmental contaminant exposures and health risks in drinking water supplies, despite a paucity of directly comparable, environmentally-relevant contaminant exposure data for BW. This study provides insight into exposures and cumulative risks to human health from inorganic/organic/microbial contaminants in BW. Methods BW from 30 total domestic US (23) and imported (7) sources, including purified tapwater (7) and spring water (23), were analyzed for 3 field parameters, 53 inorganics, 465 organics, 14 microbial metrics, and in vitro estrogen receptor (ER) bioactivity. Health–benchmark–weighted cumulative hazard indices and ratios of organic–contaminant in vitro exposure-activity cutoffs were assessed for detected regulated and unregulated inorganic and organic contaminants. Results 48 inorganics and 45 organics were detected in sampled BW. No enforceable chemical quality standards were exceeded, but several inorganic and organic contaminants with maximum contaminant level goal(s) (MCLG) of zero (no known safe level of exposure to vulnerable sub-populations) were detected. Among these, arsenic, lead, and uranium were detected in 67%, 17%, and 57% of BW, respectively, almost exclusively in spring-sourced samples not treated by advanced filtration. Organic MCLG exceedances included frequent detections of disinfection byproducts (DBP) in tapwater-sourced BW and sporadic detections of DBP and volatile organic chemicals in BW sourced from tapwater and springs. Precautionary health–based screening levels were exceeded frequently and attributed primarily to DBP in tapwater-sourced BW and co-occurring inorganic and organic contaminants in spring-sourced BW. Conclusion The results indicate that simultaneous exposures to multiple drinking-water contaminants of potential human-health concern are common in BW. 
Improved understandings of human exposures based on more environmentally realistic and directly comparable point-of-use exposure characterizations, like this BW study, are essential to public health because drinking water is a biological necessity and, consequently, a high-vulnerability vector for human contaminant exposures.
... The results obtained in this dimension are in concordance with various authors who point out that the majority of well owners do not analyze their water despite being concerned with environmental pollution issues (Knobeloch et al. 2013; Gibson and Pieper 2017; Maleckia et al. 2017). Furthermore, they also agree with studies highlighting that the population has difficulties in interpreting water quality reports (Fox et al. 2016; Maleckia et al. 2017). ...
Article
A conceptual model to assess the literacy level of water consumers is presented. On the one hand, a literature search was performed using the ScienceDirect and B-On platforms, conjoining the terms literacy, awareness, water, water for human consumption, drinking water, environmental, disease prevention and public health, resulting in seven papers with the mingle of literacy and water and five on literacy and the environment being uncovered. On the other hand, the lack of papers and information on the subject caused us to consider developing a conceptual model to transform the processes of planning and operationalization of the studies of literacy of water consumers. The model can support the development and validation of measurement tools capable of apprehending different dimensions in the context of water literacy. A questionnaire was conceived and applied to a cohort of 147 respondents in order to assess water literacy. In addition, the articulation of the proposed model and Deming’s PDCA model was demonstrated in order to achieve excellence through the evaluation of the current reality to promote improvement solutions.
... However, aggregating data in this manner can overlook factors underlying contamination "hotspots," in this case, bedrock depth. For example, the statewide averages for coliform and nitrate MCLG exceedances in Wisconsin, irrespective of bedrock depth, are 18% and 10%, respectively (Knobeloch et al. 2013). Using the multivariable models for coliforms and nitrate for recharge and no-recharge periods, respectively (see below and Figures 4B and 4C), the statewide percentages are equivalent. ...
Article
Full-text available
Background: Groundwater quality in the Silurian dolomite aquifer in northeastern Wisconsin, USA, has become contentious as dairy farms and exurban development expand. Objectives: We investigated private household wells in the region, determining the extent, sources, and risk factors of nitrate and microbial contamination. Methods: Total coliforms, Escherichia coli, and nitrate were evaluated by synoptic sampling during groundwater recharge and no-recharge periods. Additional seasonal sampling measured genetic markers of human and bovine fecal-associated microbes and enteric zoonotic pathogens. We constructed multivariable regression models of detection probability (log-binomial) and concentration (gamma) for each contaminant to identify risk factors related to land use, precipitation, hydrogeology, and well construction. Results: Total coliforms and nitrate were strongly associated with depth-to-bedrock at well sites and nearby agricultural land use, but not septic systems. Both human wastewater and cattle manure contributed to well contamination. Rotavirus group A, Cryptosporidium, and Salmonella were the most frequently detected pathogens. Wells positive for human fecal markers were associated with depth-to-groundwater and the number of septic system drainfields within 229 m. Manure-contaminated wells were associated with groundwater recharge and the area of nearby agricultural land. Wells positive for any fecal-associated microbe, regardless of source, were associated with septic system density and manure storage proximity modified by bedrock depth. Well construction was generally not related to contamination, indicating land use, groundwater recharge, and bedrock depth were the most important risk factors. Discussion: These findings may inform policies to minimize contamination of the Silurian dolomite aquifer, a major water supply for the U.S. and Canadian Great Lakes region. https://doi.org/10.1289/EHP7813.
Article
Full-text available
Background Lead can be present in drinking water in soluble and particulate forms. The intermittent release of lead particulates in drinking water can produce highly variable water lead levels (WLLs) in individual homes, a health concern because both particulate and soluble lead are bioavailable. More frequent water sampling would increase the likelihood of identifying sporadic lead “spikes,” though little information is available to aid in estimating how many samples are needed to achieve a given degree of sensitivity to spike detection. Objective To estimate the number of rounds of tap water sampling needed to determine with a given level of confidence that an individual household is at low risk for the intermittent release of lead particulates. Methods We simulated WLLs for 100,000 homes on 15 rounds of sampling under a variety of assumptions about lead spike release. A Markovian structure was used to describe WLLs for individual homes on subsequent rounds of sampling given a set of transitional probabilities, in which homes with higher WLLs at baseline were more likely to exhibit a spike on repeated sampling. Results Assuming 2% of homes had a spike on the first round of sampling and a mid-range estimate of transitional probabilities, the initial round of sampling had a 6.4% sensitivity to detect a spike. Seven rounds of sampling would be needed to increase the sensitivity to 50%, which would leave unrecognized the more than 15,000 homes that intermittently exhibit spikes. Significance For assessing household risk for lead exposure through drinking water, multiple rounds of water sampling are needed to detect the infrequent but high spikes in WLLs due to particulate release. Water sampling procedures for assessment of lead exposure in individual homes should be modified to account for the infrequent but high spikes in WLL. Impact It has been known for decades that intermittent “spikes” in water lead occur due to the sporadic release of lead particulates. 
However, conventional water sampling strategies do not account for these infrequent but hazardous events. This research suggests that current approaches to sampling tap water for lead testing identify only a small fraction of homes in which particulate spikes occur, and that sampling procedures should be changed substantially to increase the probability of identifying the hazard of particulate lead release into drinking water.
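The sampling-sensitivity result above can be sketched with a short Monte Carlo simulation. This is a hypothetical illustration, not the paper's model: it assumes spike detections are independent across rounds (the paper uses a Markovian structure in which prior high levels raise the re-spike probability), and it borrows the 2% spike prevalence and 6.4% per-round detection probability from the abstract.

```python
import random

def spike_sensitivity(n_homes=100_000, n_rounds=7, p_spike_home=0.02,
                      p_detect_per_round=0.064, seed=1):
    """Fraction of spike-prone homes caught within n_rounds of tap sampling.

    Simplifying assumption: each round independently detects a spike in a
    spike-prone home with probability p_detect_per_round.
    """
    rng = random.Random(seed)
    spike_prone = round(n_homes * p_spike_home)  # homes that ever spike
    caught = sum(
        any(rng.random() < p_detect_per_round for _ in range(n_rounds))
        for _ in range(spike_prone)
    )
    return caught / spike_prone

# Under independence, seven rounds catch about 1 - (1 - 0.064)**7, i.e. ~37%
# of spike-prone homes; the paper's Markov model puts the figure nearer 50%.
```

The gap between ~37% here and the 50% reported above reflects the between-round dependence the paper models, which this sketch deliberately omits.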
Article
Full-text available
Dominant forms of agricultural production in the U.S. Upper Midwest are undermining human health and well being. Restoring critical ecosystem functions to agriculture is key to stabilizing climate, reducing flooding, cleaning water, and enhancing biodiversity. We used simulation models to compare ecosystem functions (food-energy production, nutrient retention, and water infiltration) provided by vegetation associated with continuous corn, corn-soybean rotation, and perennial grassland producing feed for dairy livestock. Compared to continuous corn, most ecosystem functions dramatically improved in the perennial grassland system (nitrate leaching reduced ~90%, phosphorus loss reduced ~88%, drainage increased ~25%, evapotranspiration reduced ~29%), which will translate to improved ecosystem services. Our results emphasize the need to incentivize multiple ecosystem services when managing agricultural landscapes.
Article
Provision of safe drinking water is critical for human health and survival. Indicator bacteria (i.e., total coliforms and fecal coliforms) are typical indicators of microbial water quality. In North Carolina (NC), new private drinking water wells are required to be tested for total coliforms and E. coli. Such wells are typically tested only once before use in NC, so consistent and accurate testing is critical. Herein, we analyzed a large data set (n = 32,839) of water samples collected from private drinking water wells, representing 80 of North Carolina's 100 counties, for significant differences in E. coli and total coliform positivity rates depending on county and region of collection, transport method, sampling collection point, day of week, month, year, season, treatment method, time elapsed between collection and incubation, method used for detection, and the personnel reporting sample results. Our results showed that 34.11% of samples tested positive for total coliforms, while 1.38% tested positive for E. coli. The total coliform positivity rate ranged from 16.4% to 60% among the 80 counties. While nine counties reported no positive E. coli samples, positive E. coli rates for the remaining counties ranged from 0.17% to 12.7%. Samples from the Inner Coastal Plain were more likely to be positive for total coliforms, while samples from the Outer Coastal Plain were more likely to be positive for both E. coli and total coliforms. Delivery method, sampling site, month, year and season of collection, treatment type, elapsed time between collection and analysis, and the person reporting sample results were correlated with positive total coliform results. Similarly, sampling point, month, treatment type, test type, and the person reporting the results were correlated with positive E. coli samples. These results can be used to prioritize limited public health resources for maximum impact on the safety of drinking water from new private wells. Further, the provision of clear and complete information by the sample collector could improve future research to better inform well water sampling practices.
Article
Full-text available
Groundwater from boreholes is the major source of bottled water in Algeria. The aim of this study was to determine the bacteriological quality of groundwater used for bottled water production. A total of 73 groundwater boreholes were sampled and analyzed for the required bacteriological parameters. The analysis was performed in accordance with ISO standard methods. To qualify as being of good bacteriological quality, groundwater had to show no growth for any bacteriological parameter. The bacteriological analysis highlighted that 37 of the 73 groundwater samples (51%) were of poor bacteriological quality, while 36 (49%) were of good bacteriological quality. Total coliforms and E. coli were the major sources of contamination, with 35 and 24 contaminated samples respectively, followed in order by Pseudomonas aeruginosa, enterococci, and sulfite-reducing anaerobic bacteria spores, with 8, 7, and 2 contaminated samples respectively. Bacteriological quality was strongly and negatively correlated with the urbanization and/or agricultural activity parameter (r = −0.454). The logistic regression model showed that the presence of urbanization or agricultural activity significantly (P < 0.001) multiplied by 7 the risk of a groundwater being of poor bacteriological quality. These findings are useful for avoiding drilling costs and for choosing the best strategy to protect groundwaters. Highlights: Half of the groundwaters do not satisfy the bacteriological quality criteria. Total coliforms and E. coli are the major sources of contamination in the studied groundwaters. The presence of urbanization or agricultural activity has a negative impact on groundwater bacteriological quality. Evidence was shown that drilling costs could be avoided by adopting groundwater quality prediction tools.
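The sevenfold risk figure above is an odds ratio from a 2×2 logistic setup. The calculation can be illustrated as follows; since the abstract does not give the raw cell counts, the counts below are purely hypothetical (chosen only to match the study's 73-borehole, 37-poor/36-good totals):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Woolf-method 95% CI.

    a = exposed & poor quality,   b = exposed & good quality,
    c = unexposed & poor quality, d = unexposed & good quality.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of the log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical split of the 73 boreholes: 28 of 40 near urbanization or
# agricultural activity were poor, vs 9 of 33 elsewhere.
or_, lo, hi = odds_ratio_ci(28, 12, 9, 24)
```

With these invented counts the odds ratio comes out near 6, with a confidence interval excluding 1, of the same order as the sevenfold risk reported.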
Article
Full-text available
Inorganic arsenic is naturally occurring in groundwaters throughout the United States. This study investigated arsenic exposure and self-reports of 9 chronic diseases. We received private well-water samples and questionnaires from 1185 people who reported drinking their water for 20 or more years. Respondents with arsenic levels of 2 µg/L or greater were statistically more likely to report a history of depression, high blood pressure, circulatory problems, and bypass surgery than were respondents with arsenic concentrations less than 2 µg/L.
Article
Full-text available
The authors examined associations between exposure to aluminum or silica from drinking water and risk of cognitive decline, dementia, and Alzheimer's disease among elderly subjects followed for 15 years (1988-2003). They actively searched for incident cases of dementia among persons aged 65 years or over living in 91 civil drinking-water areas in southern France. Two measures of exposure to aluminum were assessed: geographic exposure and individual exposure, taking into account daily consumption of tap water and bottled water. A total of 1,925 subjects who were free of dementia at baseline and had reliable water assessment data were analyzed. Using random-effects models, the authors found that cognitive decline with time was greater in subjects with a higher daily intake of aluminum from drinking water (≥0.1 mg/day, P = 0.005) or higher geographic exposure to aluminum. Using a Cox model, a high daily intake of aluminum was significantly associated with increased risk of dementia. Conversely, an increase of 10 mg/day in silica intake was associated with a reduced risk of dementia (adjusted relative risk = 0.89, P = 0.036). However, geographic exposure to aluminum or silica from tap water was not associated with dementia. High consumption of aluminum from drinking water may be a risk factor for Alzheimer's disease.
Article
Full-text available
The subjects of this study were children aged 6-60 months living in villages in the Ulas Health Region, Sivas. The villages were divided into two groups according to the amount of strontium in the soil: region 1, >350 ppm, 650 children; region 2, <350 ppm, 1596 children. Overall, the prevalence of one or more clinical signs of rickets was 22.9%. The prevalence in region 1 was 31.5% and that in region 2, 19.5%. These values were significantly different (p < 0.001). When other variables which may be relevant to the occurrence of rickets were taken into account, the difference in prevalence persisted. The results suggest that in villages where nutrition is mainly based on grain cereals, the presence of strontium in the soil significantly increases the prevalence of rickets. As a preventive measure, a greater proportion of the foods given to children in these villages should be of animal origin, and cereals and drinking water supplies should be obtained from villages with a low soil strontium content, or calcium supplements should be given. Abstract: http://adc.bmj.com/content/75/6/524.abstract Full text: http://adc.bmj.com/content/75/6/524.full.pdf+html
Article
Full-text available
During 1992 and 1993 the Wisconsin Division of Health investigated five cases in which copper-contaminated drinking water was suspected of causing gastrointestinal upsets. Each of these case studies was conducted after our office was notified of high copper levels in drinking water or of unexplained illnesses. Our findings suggest that drinking water that contains copper at levels above the federal action limit of 1.3 mg/l may be a relatively common cause of diarrhea, abdominal cramps, and nausea. These symptoms occurred most frequently in infants and young children and among residents of newly constructed or renovated homes.
Article
The growing availability of genetic tests for most inherited iron-overload conditions and our current ability to assess hepatic iron stores and, to a lesser extent, liver fibrosis by noninvasive methods have reduced the need for liver biopsy in patients with hepatic iron excess. Histologic evaluation of the liver remains useful (1) in well-defined genetic iron overload disorders, to evaluate associated hepatic damage; (2) in unclassified genetic or acquired iron excess, to guide etiologic diagnosis and establish prognosis; and (3) in research studies, for a complete and reliable assessment of the liver. The identification of iron overload, the description of its cellular and lobular distribution, semiquantitative assessment of its amount, and an inventory of associated lesions, especially fibrosis, are the pathologist's main objectives.
Article
Hemochromatosis is an inherited disease with iron overload and joint involvement resembling osteoarthritis. To determine the rate of joint replacement surgery in patients with hemochromatosis, we performed a cross-sectional cohort study. A total of 199 individuals with hereditary hemochromatosis were included. The prevalence of joint replacement surgery in hip, knee, and ankle joints because of secondary osteoarthritis was assessed. Data were compared with 917 healthy subjects from the population-based Bruneck study. A total of 32 of 199 individuals with hemochromatosis received joint replacement surgery with a total number of 52 joints replaced. Compared with expected rates in healthy individuals, patients with hemochromatosis had a significantly higher risk for joint replacement surgery (odds ratio 9.0; confidence interval, 4.6-17.4). Joint replacement occurred significantly earlier in life in patients with hemochromatosis; 21.9% of the patients with hemochromatosis and 1.7% of healthy individuals required joint replacement before the age of 50 years (P=.0027). Moreover, patients with hemochromatosis were more likely to require multiple joint replacements (8.5%) than the control group (expected rate 0.3%; P=.0001). Hemochromatosis is a risk factor for joint replacement surgery because of severe secondary osteoarthritis.
Article
Nitrate is a contaminant of drinking water in agricultural areas and is found at high levels in some vegetables. Nitrate competes with uptake of iodide by the thyroid, thus potentially affecting thyroid function. We investigated the association of nitrate intake from public water supplies and diet with the risk of thyroid cancer and self-reported hypothyroidism and hyperthyroidism in a cohort of 21,977 older women in Iowa who were enrolled in 1986 and who had used the same water supply for >10 years. We estimated nitrate ingestion from drinking water using a public database of nitrate measurements (1955-1988). Dietary nitrate intake was estimated using a food frequency questionnaire and levels from the published literature. Cancer incidence was determined through 2004. We found an increased risk of thyroid cancer with higher average nitrate levels in public water supplies and with longer consumption of water exceeding 5 mg/L nitrate-N (for ≥5 years at >5 mg/L, relative risk [RR] = 2.6 [95% confidence interval (CI) = 1.1-6.2]). We observed no association with prevalence of hypothyroidism or hyperthyroidism. Increasing intake of dietary nitrate was associated with an increased risk of thyroid cancer (highest vs. lowest quartile, RR = 2.9 [1.0-8.1]; P for trend = 0.046) and with the prevalence of hypothyroidism (odds ratio = 1.2 [95% CI = 1.1-1.4]), but not hyperthyroidism. Nitrate may play a role in the etiology of thyroid cancer and warrants further study.
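For reference, a cohort relative risk such as the 2.6 quoted above is simply the ratio of incidence between exposure groups. A minimal sketch follows; the case counts are hypothetical (the abstract does not report them), chosen only so the ratio lands on 2.6:

```python
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk ratio: incidence among the exposed over incidence among the unexposed."""
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical counts: 13 thyroid-cancer cases among 2,000 women with long-term
# high-nitrate water vs 25 cases among 10,000 other women gives RR = 2.6.
rr = relative_risk(13, 2000, 25, 10000)
```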
Iron is necessary for life, but excess iron can be toxic to tissues. Iron is thought to damage tissues primarily by generating oxygen free radicals through the Fenton reaction. We present an overview of the evidence supporting iron's potential contribution to a broad range of eye disease using an anatomical approach. Iron can be visualized in the cornea as iron lines in the normal aging cornea as well as in diseases like keratoconus and pterygium. In the lens, we present the evidence for the role of oxidative damage in cataractogenesis. Also, we review the evidence that iron may play a role in the pathogenesis of the retinal disease age-related macular degeneration. Although currently there is no direct link between excess iron and development of optic neuropathies, ferrous iron's ability to form highly reactive oxygen species may play a role in optic nerve pathology. Lastly, we discuss recent advances in prevention and therapeutics for eye disease with antioxidants and iron chelators. Iron homeostasis is important for ocular health.
A prospective, controlled, double-blind, double-dummy, multicenter clinical trial was conducted to assess the efficacy and tolerability of iron-protein-succinylate (ITF 282) in comparison with a well-known iron preparation in the treatment of iron deficiency or iron-deficiency anemia. One thousand and ninety-five patients with iron deficiency or overt iron-deficiency anemia were randomized to receive either two ITF 282 tablets/day (60 mg iron each) or a commercially available ferrous sulphate controlled-release tablet (one tablet containing 105 mg iron/day). Five hundred and forty-nine patients received ITF 282; 546 patients were treated with ferrous sulphate. Both treatments lasted 60 days. The treatment outcome was assessed by evaluating special hematology, symptomatology, safety hematology and hematochemistry. After two months of treatment, the main hematologic parameters had normalized in both groups. Although in the first month the reference treatment appeared to provide somewhat faster results, at the end of the observation period the values of hematocrit, hemoglobin and ferritin were greater in the ITF 282 group, indicating a more progressive and steady therapeutic effect. The overall clinical rating was significantly in favor of ITF 282, with 78.9% of favorable results vs 67.6%. Separate analyses of the treatment outcome were performed (and have been included) after dividing the patient population according to pathological condition (iron deficiency or overt anemia), or according to the etiopathogenesis of the iron deficiency (increased requirement, or increased loss in adults and in the elderly). The general tolerability, although favorable with both treatments, was significantly more favorable with ITF 282.
With this medication, 63 patients (11.5%) complained of 69 adverse reactions (25 heartburn, 19 constipation, 25 abdominal pain), vs 141 events reported by 127 patients (26.3%) with the reference medication (33 heartburn, 31 epigastric pain, 23 constipation, 32 abdominal pain, 8 skin rash, 14 nausea). These observations confirm that, although the most modern preparations of ferrous sulphate exhibit a relatively low frequency of adverse events of limited clinical concern, it is nevertheless possible to decrease both the prevalence and the duration of such events, without compromising clinical efficacy, by using more "physiological" preparations in which the iron is reversibly bound to a protein carrier. This effectively removes one of the main obstacles to correct compliance with treatments that must be administered for prolonged periods of time.
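The adverse-event comparison above (63 of 549 vs. 127 of 546 patients) can be sanity-checked with a simple two-proportion z-test. This is a rough sketch using the reported patient counts, not necessarily the statistical method used in the trial; note that 127/546 works out to about 23.3% rather than the abstract's 26.3%, suggesting the published percentage used a slightly different denominator, so treat this as illustrative.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled SE
    return (p2 - p1) / se

# Patients reporting adverse reactions, per the abstract:
# ITF 282: 63/549; ferrous sulphate: 127/546
z = two_proportion_z(63, 549, 127, 546)
print(f"z = {z:.2f}")  # |z| > 1.96 => significant at the 5% level
```

A z-statistic above 5 corresponds to a very small P-value, consistent with the abstract's claim that tolerability was significantly better with ITF 282.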
The objective of this study was to assess the relation between long-term exposure to different aluminum (Al) forms in drinking water and Alzheimer's disease (AD). The study participants were selected from a random sample of the elderly population (≥70 years of age) of the Saguenay-Lac-Saint-Jean region (Quebec). Sixty-eight cases of Alzheimer's disease diagnosed according to recognized criteria were paired for age (±2 years) and sex with nondemented controls. Aluminum speciation was assessed using established standard analytical protocols along with quality control procedures. Exposure to Al forms (total Al, total dissolved Al, monomeric organic Al, monomeric inorganic Al, polymeric Al, Al(3+), AlOH, AlF, AlH3SiO4(2+), AlSO4) in drinking water was estimated by juxtaposing each subject's residential history with the physicochemical data of the municipalities. The markers of long-term exposure (1945 to onset) to Al forms in drinking water were not significantly associated with AD. On the other hand, after adjustment for education level, presence of family cases, and ApoE epsilon4 allele, exposure to organic monomeric aluminum estimated at the onset of the disease was associated with AD (odds ratio 2.67; 95% CI 1.04-6.90). On average, the exposure estimated at onset had been stable for 44 years. Our results confirm the importance of estimating Al speciation and considering genetic characteristics in the assessment of the association between aluminum exposure and Alzheimer's disease.