The Chernobyl Disaster: Cancer following the Accident at the Chernobyl Nuclear Power Plant

Department of Epidemiology, Columbia University, New York, New York, United States
Epidemiologic Reviews (Impact Factor: 6.67). 02/2005; 27(1):56-66. DOI: 10.1093/epirev/mxi012
Source: PubMed

Available from: Lydia B Zablotska
  • Source
    • "Because of this scarcity of studies conducted in the immediate aftermath of past disasters, a method for identifying individuals at high risk of internal contamination after a nuclear accident has not yet been defined (Hatch et al. 2005; Saenko et al. 2011; Upton 1981). Since October 2011, the Voluntary Internal Radiation Exposure Screening (VIRES) program has been conducted in Minamisoma, Fukushima, with the ultimate goal of monitoring the long-term health risks of the residents (Minamisoma Municipal General Hospital 2014). "
    ABSTRACT: The Fukushima Dai-ichi nuclear disaster, the first level-7 major nuclear disaster since Chernobyl, raised concerns about future health consequences of exposure to and intake of radionuclides. Factors determining the risk and level of internal radiation contamination after a nuclear accident, which are key to understanding and improving current nuclear disaster management, are not well studied. This study aims to investigate both the prevalence and level of internal contamination in residents of Minamisoma and to identify factors determining the risk and level of contamination. A program assessing internal radiation contamination using whole-body counter (WBC) measurements and a questionnaire survey was implemented in Minamisoma between October 2011 and March 2012. Approximately 20% of the city population (8,829 individuals) participated in the WBC measurement of internal contamination, of whom 94% responded to the questionnaire. The proportion of participants with detectable internal contamination was 40% in adults and 9% in children. The level of internal contamination ranged from 2.3 to 196.5 Bq/kg (median, 11.3 Bq/kg). Tobit regression analysis identified two main risk factors: more time spent outdoors, and intake of potentially contaminated food and water. This study suggests that, with sensible and reasonable precautions, people may be able to live continuously in radiation-affected areas with limited contamination risk. To enable this, nuclear disaster response should strictly reinforce food and water controls and disseminate evidence-based, up-to-date information about avoidable contamination risks.
    Full-text · Article · Mar 2014 · Environmental Health Perspectives
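The Tobit regression mentioned in the abstract above handles the left-censoring imposed by the counter's detection limit: participants "below detection" contribute only the probability of falling under the limit, not an exact value. A minimal sketch of that idea, using simulated data (not the study's) and a single hypothetical covariate for hours spent outdoors:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

# Hypothetical simulated data: latent contamination (Bq/kg) rises with
# hours spent outdoors; readings below the detection limit are censored.
n, limit = 500, 2.3
outdoors = rng.uniform(0, 8, n)
latent = 1.0 + 2.0 * outdoors + rng.normal(0, 4.0, n)
censored = latent < limit            # detector only reports "below limit"
y = np.maximum(latent, limit)        # observed values, floored at the limit

def negloglik(params):
    b0, b1, log_s = params
    s = np.exp(log_s)                # keep sigma positive via log-parameterization
    mu = b0 + b1 * outdoors
    # Observed points contribute the normal log-density; censored points
    # contribute the log-probability of lying below the detection limit.
    ll_obs = stats.norm.logpdf(y[~censored], mu[~censored], s)
    ll_cen = stats.norm.logcdf(limit, mu[censored], s)
    return -(ll_obs.sum() + ll_cen.sum())

res = optimize.minimize(negloglik, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
b0, b1, log_s = res.x
print(f"intercept={b0:.2f}, outdoors slope={b1:.2f}, sigma={np.exp(log_s):.2f}")
```

Fitting the censored likelihood recovers the true slope far better than simply regressing on the floored values, which would bias estimates toward zero.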
  • Source
    • "The link between radiation and thyroid cancer was enforced when in 1986 the release of radioactive iodine in the environment after a meltdown in reactor 4 of the Chernobyl nuclear power plant led to an increase in cases of papillary thyroid carcinoma (PTC) in the areas surrounding the site in Ukraine, Belarus, and Russia. This effect was most pronounced in children [2] [3]. A large percentage of these cases were found to carry a specific translocation, called rearranged in transformation/papillary thyroid carcinoma (RET/PTC) translocation, which was found to be enough to initiate transformation to PTC. "
    ABSTRACT: The high doses of radiation received in the wake of the Chernobyl incident and the atomic bombing of Hiroshima and Nagasaki have been linked to the increased incidence of thyroid cancer in children living in the vicinity of those sites. However, the data gathered on the effect of low doses of radiation on the thyroid remain limited. We examined the genome-wide transcriptional response of a culture of the TPC-1 human cell line, of papillary thyroid carcinoma origin and carrying a RET/PTC1 translocation, to various doses (0.0625, 0.5, and 4 Gy) of X-rays and compared it, using Affymetrix microarrays, with the responses of mouse thyroids with a RET/PTC3 translocation and of wild-type mouse thyroids irradiated with the same doses. We found considerable overlap at the high dose of 4 Gy in both RET/PTC-positive systems but no common genes at 62.5 mGy. In addition, the response of the RET/PTC-positive systems at all doses was distinct from the response of wild-type thyroids, with the two systems signaling down different pathways. Analysis of the microRNA response in TPC-1 cells revealed a radiation-responsive signature of microRNAs in addition to dose-responsive microRNAs. Our results indicate that a low dose of X-rays appears to have a significant proliferative effect on normal thyroids. This observation warrants further study, in contrast to the effect on RET/PTC-positive thyroids, which was subtle, anti-proliferative, and system-dependent.
    Full-text · Article · Mar 2012 · Mutation Research/Fundamental and Molecular Mechanisms of Mutagenesis
  • Source
    • "-In the case of environmental exposures (e.g., from exposure to the atomic bomb in Japan, radionuclide releases from the Chornobyl accident, radionuclides that contaminated the Techa River), the proportion of study subjects with radiation measurements varied widely. For example, in the case of a-bomb survivors, no individual radiation measurements were available (Kodama et al., 2001), whereas, for studies of thyroid cancer in Belarus and in the Ukraine following the Chornobyl accident, all study subjects had thyroid gamma activity measurements (Bouville et al. 2007; Hatch et al. 2005; Likhtarev et al. 2006). In the latter case, thyroid activity measurements provided information on the dose rate at the time of measurement but had to be supplemented with environmental radiation measurements and with models to account for environmental and metabolic transfer, location, and individual dietary and lifestyle habits. "
    ABSTRACT: Biodosimetry measurements can potentially be an important and integral part of the dosimetric methods used in long-term studies of health risk following radiation exposure. Such studies rely on accurate estimation of doses to the whole body or to specific organs of individuals in order to derive reliable estimates of cancer risk. However, dose estimates based on analytical dose reconstruction (i.e., models) or personnel monitoring measurements (i.e., film badges) can have substantial uncertainty. Biodosimetry can potentially reduce uncertainty in health risk studies by corroborating model-based dose estimates or by using it to assess bias in dose models. While biodosimetry has begun to play a more significant role in long-term health risk studies, its use in that context is still generally limited by one or more factors, including inadequate limits of detection, large inter-individual variability of the signal measured, high per-sample cost, and invasiveness. Presently, the most suitable biodosimetry methods for epidemiologic studies are chromosome aberration frequencies from fluorescence in situ hybridization (FISH) of peripheral blood lymphocytes and electron paramagnetic resonance (EPR) measurements made on tooth enamel. Both types of measurements, however, are usually invasive and require biological samples that can be difficult to obtain. Moreover, doses derived from these methods are not always directly relevant to the tissues of interest. To increase the value of biodosimetry to epidemiologic studies, a number of issues need to be considered, including limits of detection, effects of inhomogeneous exposure of the body, how to extrapolate from the tissue sampled to the tissues of interest, and how to adjust dosimetry models applied to large populations based on sparse biodosimetry measurements. The requirements of health risk studies suggest a set of characteristics that, if satisfied by new biodosimetry methods, would increase the overall usefulness of biodosimetry in determining radiation health risks.
    Full-text · Article · Feb 2010 · Health physics