Article
To read the full-text of this research, you can request a copy directly from the authors.

Abstract

Safecast is a volunteered geographic information (VGI) project in which the lay public uses hand-held sensors to collect radiation measurements that are then made freely available under the Creative Commons CC0 license. However, Safecast data fidelity is uncertain, given that the sensor kits are hand-assembled with varying levels of technical proficiency and the sensors may not be properly deployed. Our objective was to validate Safecast data by comparing them with authoritative data collected by the U.S. Department of Energy (DOE) and the U.S. National Nuclear Security Administration (NNSA) in the Fukushima Prefecture shortly after the Daiichi nuclear power plant catastrophe. We found that the two data sets were highly correlated, though the DOE/NNSA observations were generally higher than the Safecast measurements. We concluded that this high correlation alone makes Safecast a viable data source for detecting and monitoring radiation.
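The validation described in the abstract reduces to comparing co-located observations from the two sources. As a minimal sketch (with invented dose rate values, not the study's data), the correlation and the mean bias between two co-located series can be computed as:

```python
import math

# Hypothetical co-located dose rates in uSv/h (illustrative values,
# not the actual Safecast or DOE/NNSA data).
safecast = [0.08, 0.12, 0.35, 0.90, 2.10, 4.50]
doe_nnsa = [0.10, 0.15, 0.42, 1.10, 2.60, 5.20]  # generally reads higher

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(safecast, doe_nnsa)
# Positive mean bias: the authoritative source reads higher on average
bias = sum(d - s for s, d in zip(safecast, doe_nnsa)) / len(safecast)
print(f"r = {r:.3f}, mean bias = {bias:.3f} uSv/h")
```

A high r with a systematic positive bias is exactly the pattern the abstract reports: the series move together even though one reads consistently higher.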



... The DOE dataset is used in this paper for comparison to Safecast because it has a broad spatial coverage of the Fukushima prefecture. The DOE data were previously aggregated and compared to Safecast data at the same locations, showing a high correlation of measurements over three months in 2011 by Coletti et al. [49] and over multiple years with the development of a decay correction method by Hultquist and Cervone [50]. Finally, Cervone and Hultquist [45] demonstrated how Safecast data could be used to reconstruct the high radiation concentrations of the plume from DOE data and predict future radiation levels. ...
... Refs. [49,50] and rasterized to the same extent. A levelplot using the R lattice package [53] is used for visualization with a formula of ...
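The decay correction developed in [50] compensates for physical decay of the radiocesium mix between the observation date and a common reference date, so that measurements taken years apart become comparable. A generic first-order sketch (the 50/50 isotope split below is an illustrative assumption, not the published method's actual parameters):

```python
import math

# Half-lives in years (physical constants)
T_CS134 = 2.065
T_CS137 = 30.17

def decay_factor(half_life, years):
    """Fraction of initial activity remaining after `years`."""
    return math.exp(-math.log(2.0) * years / half_life)

def corrected_dose_rate(observed, years_since_release, f134=0.5, f137=0.5):
    """Back-correct an observed dose rate to the release date, assuming
    the anthropogenic component is a Cs-134/Cs-137 mix with the given
    initial dose-rate fractions (illustrative assumption)."""
    remaining = (f134 * decay_factor(T_CS134, years_since_release)
                 + f137 * decay_factor(T_CS137, years_since_release))
    return observed / remaining

# e.g. a reading of 0.5 uSv/h taken 3 years after release
print(f"{corrected_dose_rate(0.5, 3.0):.3f} uSv/h back-corrected to release time")
```

With these constants, one half-life of ¹³⁴Cs halves its contribution while ¹³⁷Cs loses only a few percent over the same period, which is why multi-year Safecast/DOE comparisons require such a correction.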
Article
Full-text available
Citizen-led movements producing spatio-temporal big data are potential sources of useful information during hazards. Yet, the sampling of crowdsourced data is often opportunistic and the statistical variations in the datasets are not typically assessed. There is a scientific need to understand the characteristics and geostatistical variability of big spatial data from these diverse sources if they are to be used for decision making. Crowdsourced radiation measurements can be visualized as raw, often overlapping, points or processed for an aggregated comparison with traditional sources to confirm patterns of elevated radiation levels. However, crowdsourced data from citizen-led projects do not typically use a spatial sampling method so classical geostatistical techniques may not seamlessly be applied. Standard aggregation and interpolation methods were adapted to represent variance, sampling patterns, and the reliability of modeled trends. Finally, a Bayesian approach was used to model the spatial distribution of crowdsourced radiation measurements around Fukushima and quantify uncertainty introduced by the spatial data characteristics. Bayesian kriging of the crowdsourced data captures hotspots and the probabilistic approach could provide timely contextualized information that can improve situational awareness during hazards. This paper calls for the development of methods and metrics to clearly communicate spatial uncertainty by evaluating data characteristics, representing observational gaps and model error, and providing probabilistic outputs for decision making.
... Albeit subjected to the same standards of general scientific enquiry (Morris-Suzuki, 2014; Coletti et al., 2017; Brown et al., 2016; Kuchinskaya, 2019), the scientific facts and evidence produced by these citizen groups serve the needs of the community, allowing them to gain control over their lives: ...
... Nuclear experts in Japan and elsewhere have responded to the emergence of citizen labs and the rise of citizen-led measuring practices by drawing attention to questions of data accuracy, validity, reliability, etc. Various scientists in Tokyo and Fukushima are performing cross-checks of citizen-generated data (Morris-Suzuki, 2014; Coletti et al., 2017; Brown, 2018; Kuchinskaya, 2019). ...
Article
Full-text available
This study illustrates how citizen-driven radiation monitoring has emerged in post-Fukushima Japan, where citizens generate their own radiation data and measurement devices to provide the public with actionable data about their environments. Drawing on ethnographic fieldwork in and around Fukushima Prefecture, it highlights the multifaceted character of these bottom-up, citizen-led efforts, contrasting these initiatives with the emergence of "citizen participatory" science policy discourses in Japan. Recognizing the contested nature of citizenship in Japan and in the nuclear arena, the article considers how terms and definitions shape the participation of citizens and other stakeholders (local communities, public authorities, regulators, and professional scientists) in science and technology in culturally and historically specific ways. It builds on these observations to open up new spaces of expertise, which engage all stakeholders through social-scientific intervention.
... The scientific and engineering concepts and principles underlying radiation detection and the subsequent use of such technologies for mapping variations in both gamma and neutron field(s) are long-established [5], with a series of technical papers having been published detailing the processes and best practices [6][7][8]. Although technologies have seen continued advancement over recent decades, following the March 2011 accident at Japan's Fukushima Daiichi Nuclear Power Plant (FDNPP), notable advancements in unmanned aerial vehicle (UAV) [9][10][11], unmanned ground vehicle (UGV) [12,13], and static/mobile distributed detection systems [14] were realized. Even years after this driver, progress continues across radiation detection, localization, and mapping, whether in underpinning detector materials research (e.g., novel plastics, high-dose semiconductors, dual gamma-neutron scintillators) [15][16][17][18]; innovative, autonomous, and miniaturized deployment mechanisms [19][20][21]; or sensor-fusion/data visualization methodologies, progressing from 2D to 3D scenarios [22][23][24][25][26][27]. ...
Article
Full-text available
This work presents the application of a novel evolutionary algorithmic approach to determine and reconstruct the specific 3-dimensional source locations of gamma-ray emissions within the shelter object, the sarcophagus of reactor Unit 4 of the Chornobyl Nuclear Power Plant. Despite over 30 years having passed since the catastrophic accident, the high radiation levels combined with strict safety and operational restrictions continue to preclude many modern radiation detection and mapping systems from being extensively or successfully deployed within the shelter object. Hence, methods for reconstructing the intense and evolving gamma fields from the limited inventory of available data are crucially needed. Such data are particularly important in planning the demolition of the unstable structures that comprise the facility, as well as during the prior operations to remove fuel-containing materials from inside the sarcophagus and reactor Unit 4. For this approach, a simplified model of gamma emissions within the shelter object is represented by a series of point sources, each regularly spaced on the shelter object's exterior surface, whereby the calculated activity values of these discrete sources are treated as a population in terms of evolutionary algorithms. To assess the numerical reconstruction, a fitness function is defined, comprising the variation between the known activity values (obtained during the commissioning of the New Safe Confinement at the end of 2019 at the level of the main crane system, located just below the arch above the shelter object) and the calculated values at these known locations for each new population. The algorithm's performance was subsequently verified using newly obtained gamma dose-rate information from the roof of the shelter object, gathered during radiation survey works at the end of 2021.
With only 7000 iterations, the algorithm attained a mean absolute percentage error (MAPE) of less than 23%, which the authors consider satisfactory given that the relative error of the measurements is ±17%. While a simple initial application is presented in this work, it is demonstrated that evolutionary algorithms could be used for radiation mapping with an existing network of radiation sensors or, as in this instance, based on historic gamma-field data.
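The scheme described above, treating candidate source activities as a population and minimizing a MAPE fitness against known values, can be sketched with a toy 1-D inverse-square forward model and a simple (1+1) evolution strategy. Every name, geometry and parameter here is invented for illustration; the paper's shelter-object model is 3-D and far more detailed:

```python
import random

random.seed(1)

# Toy 1-D geometry: fixed point-source positions, detector positions,
# and a "true" activity vector used only to fabricate the known values.
SRC = [0.0, 5.0, 10.0]
DET = [1.0, 3.0, 6.0, 8.0, 11.0]
TRUE_ACT = [4.0, 1.0, 2.5]

def predict(act):
    """Inverse-square forward model: summed dose at each detector
    (the +1.0 softening avoids division by zero)."""
    return [sum(a / ((d - s) ** 2 + 1.0) for a, s in zip(act, SRC))
            for d in DET]

MEASURED = predict(TRUE_ACT)  # synthetic "known" values

def mape(act):
    """Mean absolute percentage error against the known values (the fitness)."""
    pred = predict(act)
    return 100.0 * sum(abs(p - m) / m for p, m in zip(pred, MEASURED)) / len(MEASURED)

# (1+1) evolution strategy: mutate the activity vector, keep the fitter one
best = [1.0, 1.0, 1.0]
best_fit = mape(best)
for _ in range(7000):
    child = [max(0.0, a + random.gauss(0.0, 0.05)) for a in best]
    fit = mape(child)
    if fit < best_fit:
        best, best_fit = child, fit

print(f"final MAPE = {best_fit:.2f}%")
```

On this convex toy problem the hill-climber recovers the activities easily; the paper's contribution lies in making the same population-and-fitness idea work on real, sparse, high-error shelter-object data.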
... The results in this experiment are supported by [12], who found that the Safecast monitoring data in Japan taken by the bGeigie Nano Monitors that they had used were in agreement with the measurements gathered by the U.S. Department of Energy (DOE) and the U.S. National Nuclear Security Administration (NNSA). Therefore, it is believed that the bGeigie Nano Monitor has proven to perform well and produce reliable results in normal background dose rate areas. ...
Article
The bGeigie Nano Monitor is a radiation monitor based on a Geiger-Müller (GM) tube detector, developed by the team at Safecast as an affordable and easy-to-use mobile radiation monitoring device for public use as part of its citizen science project. The bGeigie Nano Monitor is said to detect alpha and beta radiation and to measure gamma radiation accurately to within a 15% uncertainty, with the measured data uploadable to the Safecast API website. The objective of this study was to evaluate the bGeigie Nano Monitor's accuracy and reliability in both measuring and recording radiation from alpha, beta and gamma sources. It was found that the bGeigie Nano Monitor is very accurate in the dose rate range of 5-900 µSv/h. Above this range, the accuracy of the measurements was less reliable as the monitor approached its 1000 µSv/h limit of detection. The monitor was capable of detecting beta and gamma radiation from the tested sources of ²⁴¹Am, ⁹⁰Sr/⁹⁰Y and ¹³⁷Cs. During the assessment it was found that the measured dose rate could take up to a minute to stabilise when the monitor was exposed to a source, and that after exposure to a high dose rate it took up to a minute to return to background levels once the source was removed. In conclusion, the bGeigie Nano Monitor is capable of serving as an easily assembled radiation monitor for the public to accurately measure the dose rates of radioactivity in their area and to share this monitoring data through the Safecast API website.
... disaster-related information sharing site was launched using Ushahidi, which had been used to map the damage caused by the 2010 earthquake and subsequent tsunami off the coast of Haiti [37]. Likewise, after the meltdown at the Fukushima Daiichi Nuclear Power Plant, Safecast was established to collect and share accurate environmental data in an open and participatory way; it soon began monitoring, collecting and publishing information on environmental radiation, rapidly expanding in scale, scope, and geographic coverage [38]. In addition, citizens, mainly engineers, launched Hack for Japan to support the development of applications and services to help disaster victims and support reconstruction. ...
Article
Full-text available
This paper addresses open data, open governance, and disruptive/emerging technologies from the perspective of disaster risk reduction (DRR). Through an in-depth literature review of open governance, the paper identifies five principles for open data adopted in the disaster risk reduction field: (1) open by default, (2) accessible, licensed and documented, (3) co-created, (4) locally owned, and (5) communicated in ways that meet the needs of diverse users. The paper also analyzes the evolution of emerging technologies and their application in Japan. The four-phased evolution in disaster risk reduction is described as DRR 1.0 (Isewan typhoon, 1959), DRR 2.0 (the Great Hanshin Awaji Earthquake, 1995), DRR 3.0 (the Great East Japan Earthquake and Tsunami: GEJE, 2011) and DRR 4.0 (post-GEJE). After the GEJE of 2011, different initiatives have emerged in open data, as well as collaboration/partnership with tech firms on emerging technologies in DRR. The paper analyzes the July 2021 landslide in Atami and draws lessons based on the above-mentioned five principles. Key lessons for the open data movement include characterizing open and usable data, local governance systems, moving from co-creating to co-delivering solutions, data democratization, and interpreting disaggregated data with community engagement. These lessons are useful outside Japan in terms of data licensing, adaptive governance, stakeholder usage, and community engagement. However, as governance systems are rooted in local decision-making and cultural contexts, some of these lessons need to be customized to local conditions. Open governance is still an evolving culture in many countries, and open data is considered an important tool for it.
While there is a trend to develop open data for geo-spatial information, the discussion in the paper shows that it is also important to have customized open data for people, wellbeing, and health care, while keeping the balance of data privacy. The evolution of emerging technologies and their usage is proceeding faster than ever, while the governance systems employed to support and use emerging technologies need time to change and adapt. Therefore, it is very important to properly synchronize and customize open data, open governance and emerging/disruptive technologies for their effective use in disaster risk reduction.
... The network operators are best positioned to verify uploaded data, but even in this case it is the individual who decides what data will be made public on the map. Some efforts have been made to calibrate and validate the indications of the instruments used in the SAFECAST network (Coletti et al., 2017; Hultquist and Cervone, 2018), but most of the currently existing NRMNs have no descriptions or recommendations of the exact measurement procedure. Unfortunately, the operators have no possibility of specifying the procedure they use to carry out the measurements or of describing peculiarities (buildings, forest, water, etc.) of the location. ...
Article
Full-text available
In the aftermath of a nuclear or radiological accident, an extended mapping of reliable dose rate values is of key importance for any governmental decision and countermeasures. Presently, numerous dosimetry network stations, operated by the national governments of the member states in Europe, provide such dose rate data on an hourly basis. Nevertheless, there are large areas in Europe that are not covered at all by these early warning networks and other areas that show only a low density of governmental network stations. Hence, there may be a significant lack of information in case of a nuclear or radiological emergency. As a consequence of the Fukushima Daiichi nuclear power plant accident in 2011, a number of non-governmental radiation monitoring networks (NRMN) appeared on the internet, providing dose rate data based on stationary as well as mobile measurements of ionizing radiation by laypersons. Mobile detectors in particular are able to cover large areas in a short time. Therefore, it is of considerable importance to investigate the feasibility of using dose rate data from non-governmental networks as a complementary input to the European Radiological Data Exchange Platform (EURDEP). Within the European Metrology Programme for Innovation and Research (EMPIR), the project 16ENV04 "Preparedness" has studied the metrological relevance of such non-governmental dose rate data (also called crowd-sourced radiological monitoring) in the most comprehensive way so far. Sixteen different dose rate detector systems (in general 4 of each type, plus 2 types with 2 detectors, i.e. 68 detectors in total) used in NRMN have been investigated for the reliability of their data, and the corresponding networks and their data provision to the public were analyzed.
The most relevant performance parameters of dosimetry systems (the detector's inherent background, energy dependence and linearity of the response, the response to secondary cosmic radiation, the sensitivity to small increases in dose rate, and the stability of the detector's indication under various climatic conditions - temperature and humidity) have been investigated for fourteen representative types of non-governmental dose rate measuring instruments. Results of this comprehensive performance study of the simple, lightweight and cheap dose rate meters used in NRMN, and conclusions on the feasibility of using their data for governmental monitoring in case of a nuclear or radiological emergency, are presented.
... This paper describes the intercomparison exercise that focused on thyroid monitoring by non-spectrometric instruments, including gamma cameras and other instruments that were considered available for measurements made by members of the public, based on a market survey (details are given in SÚRO Report No (2017)). Data collected by members of the public have been used to a rather large extent in Japan following the Fukushima accident (Coletti et al., 2017; Brown et al., 2016). ...
Article
One of the issues of the Open Project for the European Radiation Research Area (OPERRA) was human thyroid monitoring in case of a large-scale nuclear accident. This issue was covered in task 5.4 as the project "CaThyMARA" (Child and Adult Thyroid Monitoring After Reactor Accident), which included several aspects of thyroid monitoring, e.g. screening of facilities able to perform thyroid monitoring in the European countries, dose estimation, modelling of detector response, and two intercomparison exercises. The intercomparison described in this paper focused on thyroid monitoring by non-spectrometric instruments, including gamma cameras and other instruments that were considered available for measurements made by members of the public. A total of 12 facilities from 7 European countries participated, and 43 various measuring devices were evaluated. The main conclusion of this intercomparison is that the ability of the participating European laboratories to assess ¹³¹I activity in the thyroids of the exposed population after an accidental release must, on average, be considered good. This intercomparison also gave the participants the possibility to calibrate their measuring devices for thyroid measurements of children where this procedure was not available before. A comprehensive report of the intercomparison is given.
... For example, following the Fukushima nuclear plant disaster, the internationally crowdfunded and crowdsourced project Safecast distributed handheld sensors to volunteers, resulting in over 27 million radiation measurements in Japan and worldwide. These data are publicly available and have been shown to be reliable and useful for public safety (Coletti et al., 2017). Some citizen science programs equip volunteers with the tools or expertise with which to lobby for local or national policy change, or simply with the goal of engaging the public in the environment (Shirk et al., 2012). ...
Technical Report
Full-text available
Citizen science for environmental policy: development of an EU-wide inventory and analysis of selected practices.
Citizen science is the non-professional involvement of volunteers in the scientific process, whether in the data collection phase or in other phases of the research. Citizen science is a powerful tool for environmental management that has the potential to inform an increasingly complex environmental policy landscape and to meet the growing demands from society for more participatory decision-making. While there is growing interest from international bodies and national governments in citizen science, the evidence that it can successfully contribute to environmental policy development, implementation, evaluation or compliance remains scant. Central to elucidating this question is a better understanding of the benefits delivered by citizen science, that is, determining to what extent these benefits can contribute to environmental policy, and establishing whether projects that provide policy support also co-benefit science and encourage meaningful citizen engagement. The aim of this study was to provide the European Commission with an evidence base of citizen science activities that can support environmental policies in the European Union (EU). The first objective was to develop an inventory of citizen science projects relevant for environmental policy and assess how these projects contribute to the Sustainable Development Goals (SDGs) set by the United Nations (UN) General Assembly. To this end, desk research and an EU-wide survey were used to identify 503 citizen science projects of relevance to environmental policy. The second objective was to assess the conditions under which citizen science can best support environmental policy, through the selection and analysis of a sample of citizen science projects. This was followed by an in-depth analysis of 45 projects along 94 project attributes.
Subsequently, this analysis provided the foundation for a series of recommendations to leverage the contribution of citizen science to environmental policy.
... The radiation measurements are uploaded to a collective map as individual point measurements with GPS coordinates and a time record. Safecast has proven to be a reliable source of radiation data when compared to DOE measurements (Coletti et al., 2017; Hultquist and Cervone, 2017). It is speculated that Safecast could be used to fill temporal and spatial gaps in government-produced radiation data. ...
... This 'distributed' radiological survey utilised a network of hundreds of mobile-phone-tethered Geiger-Müller (GM) tube-based systems, each carried by members of the public. These units simultaneously logged their position, activity level and resulting calibrated dose rate across Fukushima Prefecture [17]. ...
Article
Full-text available
With extensive remediation currently ongoing as a consequence of the Fukushima Daiichi Nuclear Power Plant accident, there exists an even greater need for a system with which the distribution of radiation (specifically radiocesium) can be rapidly determined across extensive areas, yet at high (meter or sub-meter) spatial resolutions. Although a range of potential survey methods have been utilised (e.g. fixed-wing aircraft, helicopter, vehicular and, more recently, unmanned aerial vehicle) to characterise such radiation distributions, ground-based (on-foot) methods that employ human operatives to traverse sites of interest remain one of the primary means of performing routine radiological surveys. Through the application of a newly developed platform carried as a backpack-contained unit, it was possible to survey sites at twice the rate previously possible - reducing the overall exposure time of the operator to ionising radiation, as well as dramatically reducing the level of radiation attenuation (introduced by the operator) onto the detector. Like the magnetometry survey platforms employed for ore prospecting applications, this boom-based system removed the requirement to perform lengthy and intricate corrections to the data to account for such gamma-ray interception, intensity reduction and spectral distortion.
... Elementary School D in Fukushima is close to the prefectural border with Tochigi prefecture, located in a mountainous region, and did not suffer much damage during the Great Tohoku Earthquake. The region in which this school is located is more than 80 km from the Fukushima Daiichi nuclear power plant [43]. However, there are many valleys around this school, and it is inferred that many areas are at risk of sediment-related disasters. ...
Article
Full-text available
In this research, a visiting class on disaster preparedness education for higher-grade elementary school students (10–11 years old) was conducted in Wakayama prefecture, which is exposed to Nankai Trough earthquakes, and in different parts of the three prefectures whose coasts were most affected by the 2011 Great East Japan Earthquake: Fukushima (Western inland), Miyagi (North side out of the tsunami inundation area and Northern inland), and Iwate (Medium inland). Group activities with game-like elements were conducted. To examine whether this initiative improves schoolchildren's awareness of disaster prevention, surveys were conducted before, immediately after, and one month after the classes. Results indicate differences in awareness depending on the regional characteristics of the schoolchildren's residential area. The data obtained at each school varied according to whether the school was in a region that had experienced disaster in the recent past, or in a region where there is a recognized risk of disaster in the future. Classes in regions with recent disaster experience showed increased awareness of threats and prevention after the disaster-prevention class; however, this effect was short-lived. Increased awareness lasted longer in schools located in regions that had not suffered from disasters in the recent past but that are predicted to experience a major disaster in the future. We therefore infer that a “previous history of disasters” defines the key difference between regions, even when the particular school concerned was located outside the afflicted area (the coastal zone in the 2011 Great East Japan earthquake and tsunami) and so was not directly affected. Regions that experienced no direct damage from the 2011 Great East Japan Earthquake, even those near damaged regions, saw increased awareness of the threat of disasters as a result of the disaster-prevention classes.
Students also saw a decrease in their own confidence regarding evacuation behavior, while their expressed dependence on their families for help in evacuation situations strengthened. However, such effects were temporary. In the future, it would be desirable to develop disaster-prevention programs that consider such regional characteristics.
... > 90% of the samples in this study come from the areas Safecast-mapped as 0.16 μSv/h and higher (Fig. 1). The Safecast observations have been cross-checked against authoritative data collected by the U.S. Department of Energy (DOE) and the U.S. National Nuclear Security Administration (NNSA) (Coletti et al., 2017). The contaminated areas delineated by the Safecast and DOE/NNSA data are highly similar. ...
Article
After the March 11, 2011, nuclear reactor meltdowns at Fukushima Dai-ichi, 180 samples of Japanese particulate matter (dusts and surface soils) and 235 similar U.S. and Canadian samples were collected and analyzed sequentially by gamma spectrometry, autoradiography, and scanning electron microscopy with energy dispersive X-ray analysis. Samples were collected and analyzed over a five-year period, from 2011 to 2016. Detectable levels of ¹³⁴Cs and ¹³⁷Cs were found in 142 of 180 (80%) Japanese particulate matter samples. The median radio-cesium specific activity of Japanese particulate samples was 3.2 kBq kg⁻¹ ± 1.8 kBq kg⁻¹, and the mean was 25.7 kBq kg⁻¹ (σ = 72 kBq kg⁻¹). The U.S. and Canadian mean and median radio-cesium activity levels were <0.03 kBq kg⁻¹. U.S. and Canadian samples had detectable ¹³⁴Cs and ¹³⁷Cs in one dust sample out of 32 collected, and four soils out of 74. The maximum US/Canada radio-cesium particulate matter activity was 0.30 ± 0.10 kBq kg⁻¹. The mean in Japan was skewed upward due to nine of the 180 (5%) samples with activities >250 kBq kg⁻¹. This skewness was present in both the 2011 and 2016 sample sets. More than 300 individual radioactively hot particles were identified in samples from Japan, composed of 1% or more of the elements cesium, americium, radium, polonium, thorium, tellurium, or strontium. Some particles reached specific activities at the MBq µg⁻¹ level and higher. No cesium-containing hot particles were found in the U.S. sample set. Only naturally occurring radionuclides were found in particles from the U.S. background samples. Some of the hot particles detected in this study could cause significant radiation exposures to individuals if inhaled. Exposure models ignoring these isolated hot particles would potentially understate human radiation dose.
Article
Safecast, a citizen science project devoted to monitoring ambient dose rate (ADR), was initiated in Japan in 2011 after the Fukushima NPP accident and soon spread worldwide. Its standard instrument is the bGeigie Nano, featuring a pancake-type thin-window GM sensor coupled to a GPS receiver and data storage, in a sturdy plastic case allowing field use. Recorded ADR tracks are shown on a publicly accessible map on Safecast.org, containing almost 200 M records by 2022. Interpretability of results depends on quality assurance of the measurement process, covering both the metrological characterization of the instrument and its practical use - relevant because users are citizens generally not familiar with metrological procedures, measurement statistics, the concept of representativeness, etc. Here we focus on the former aspect. In field use, the sources of GM response are internal background (BG), ambient gamma-rays and secondary cosmic radiation (SCR). Through dedicated experiments, mainly performed on lakes (where the terrestrial gamma component is largely absent), we quantified the BG and SCR response and investigated the variance of response between instruments. We investigated conformity of count rates to Poisson statistics and the occurrence of spurious extreme signals, which can lead to artefacts in ADR maps. The impact of the experimental results on practice and uncertainties is discussed.
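The Poisson-conformity check mentioned above can be illustrated by simulating GM counts and testing that the index of dispersion (variance/mean) is close to 1. The CPM-to-µSv/h factor below is a commonly quoted ¹³⁷Cs value for pancake GM tubes and should be treated as an assumption here, not the instrument's certified calibration:

```python
import math
import random

random.seed(42)

# Commonly quoted Cs-137 conversion for a pancake GM tube (assumption)
CPM_PER_USVH = 334.0

def poisson_sample(lam):
    """Knuth's inverse-product method for a Poisson draw (stdlib only;
    adequate for the small means typical of per-minute GM counts)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Simulate 2000 one-minute counts at a background-like 40 CPM
counts = [poisson_sample(40.0) for _ in range(2000)]

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
print(f"index of dispersion = {var / mean:.2f}")      # ~1 for Poisson counts
print(f"mean dose rate ~ {mean / CPM_PER_USVH:.3f} uSv/h")
```

A dispersion index well above 1 in field data would flag extra variance beyond counting statistics, e.g. the spurious extreme signals the abstract describes as a source of map artefacts.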
Article
This paper describes a system for monitoring harmful radiation in and around industrial environments. Heavy instruments used at industrial sites can emit even tiny amounts of harmful radiation, which can cause genetic mutations (changes down to the DNA level that can be passed on in living things), motivating continuous monitoring and control of these harmful emissions. A design of wireless sensor networks (WSN) incorporating a Geiger-Müller counter is proposed to measure low levels of harmful radiation. The WSN is connected to a high-performance, low-power 16 MHz in-system self-programmable processor that continuously monitors the radiation level.
Article
Full-text available
This paper investigates the issue of quality in the context of DIY science, a consolidating ensemble of social practices progressively gaining attention also in academia and policy. Although rarely operating from official institutions or with formal affiliations, DIY science practitioners often perform scientific activity aimed at the solution of incumbent problems. What becomes of scientific quality in the DIY milieu? Departing from a Post-Normal Science framework, we make the case of the DIY science community as an “extended peer community” performing practices of “extended peer review”. Three real cases of DIY science suggest a preliminary set of quality dimensions intrinsic to the DIY practice. Subsequently, through a purposefully designed interview study with members of different DIY communities, we look further into these quality dimensions (or “qualities”) and propose a possible “DIY ethos” embedding elements of agency, care, ‘tempo’, integrity and openness. DIY science practitioners are not necessarily concerned with quality assessments, but are rather engaged in the production of ‘extended facts’ that can be regarded in themselves as quality commitments, i.e. by acting, tinkering and hacking in the matters that are of their care and concern. We conclude by discussing that DIY science can also constitute a real case example to contrast currently existing tensions about the issue of quality in mainstream science.
Article
Safecast is a citizen science project aimed at environmental monitoring. Its main activity is measuring ambient dose rate all over the world. The freely accessible data, currently (January 2020) more than 120 million observations, were used to calculate mean values of dose equivalent rate in various cities where sufficient data are available. The results mainly reflect dose rate from terrestrial radiation, whose variability is controlled by that of geochemistry, namely the concentrations of uranium, thorium, and potassium. Further influence comes from cosmic radiation and, in a few cases, from anthropogenic radiation caused by nuclear fallout. Mean dose rate has been calculated for 330 cities and towns worldwide. Results are shown in tables, graphs, and maps.
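Aggregating per-city mean dose rates from a large set of point observations, as described above, can be sketched as follows (a minimal illustration with made-up city names and dose values, not Safecast data):

```python
# Sketch: per-city mean dose rate from (city, dose_rate) observation pairs.
from collections import defaultdict

def city_means(observations):
    """Return {city: mean dose rate} from an iterable of (city, rate) pairs."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for city, rate in observations:
        sums[city] += rate
        counts[city] += 1
    return {city: sums[city] / counts[city] for city in sums}

# Illustrative values only (uSv/h)
obs = [("Tokyo", 0.10), ("Tokyo", 0.12), ("Denver", 0.30)]
means = city_means(obs)
```

In practice one would also filter outliers and require a minimum observation count per city before reporting a mean, as the abstract's "where sufficient data are available" condition implies.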
Preprint
Full-text available
In the first part of this chapter, I show the effort spent by Japanese NGOs and labor activists to prevent occupational hazards and to make them more socially visible when they occur. Although their struggle started long before, it has gained momentum since the Fukushima nuclear disaster. In the second part, I highlight the conflicting interpretations of low-dose radiation and the importance of nuclear labor in that dispute. This chapter follows a study that I began in 2002 on Japanese nuclear contract workers. Further observations and interviews have been conducted since 2011 among cleanup workers, government experts, activists, and epidemiologists in Japan and Europe.
Article
Growing concern for radiological and nuclear safety and growing use of radiation protection instruments by both professionals and lay persons increase the need for low-cost and reliable instrumentation. The aim of this work was to develop a radiation protection instrument that is both affordable to the widest radiation protection community, including citizen networks, and provides metrologically sound data. The instrument was based on a Geiger-Muller tube, which was tested before and after energy compensation with lead foils. Instrument energy, angular, and dose rate dependence was determined for different tube compensations, and the optimum compensation allowing full compliance with the relevant standards was identified.
Article
A methodology is presented to calibrate contributed Safecast dose rate measurements acquired between 2011 and 2016 in the Fukushima prefecture of Japan. The Safecast data are calibrated using observations acquired by the U.S. Department of Energy at the time of the 2011 Fukushima Daiichi power plant nuclear accident. The methodology performs a series of interpolations between the U.S. government and contributed datasets at specific temporal windows and at corresponding spatial locations. The coefficients found for all the different temporal windows are aggregated and interpolated using quadratic regressions to generate a time dependent calibration function. Normal background radiation, decay rates, and missing values are taken into account during the analysis. Results show that the standard Safecast static transformation function overestimates the official measurements because it fails to capture the presence of two different Cesium isotopes and their changing magnitudes with time. A model is created to predict the ratio of the isotopes from the time of the accident through 2020. The proposed time dependent calibration takes into account this Cesium isotopes ratio, and it is shown to reduce the error between U.S. government and contributed data. The proposed calibration is needed through 2020, after which date the errors introduced by ignoring the presence of different isotopes will become negligible.
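The changing Cesium isotope ratio that motivates the time-dependent calibration can be illustrated with a short decay calculation. This is a sketch under the assumption that Cs-134 and Cs-137 were released at roughly equal activity, using the standard half-lives of about 2.065 and 30.17 years; these numbers are not taken from the abstract itself:

```python
# Sketch: decay of the Cs-134/Cs-137 activity ratio after a release.
import math

HALF_LIFE_CS134 = 2.065   # years (standard value, assumed)
HALF_LIFE_CS137 = 30.17   # years (standard value, assumed)

def remaining_fraction(t_years: float, half_life: float) -> float:
    """Fraction of initial activity remaining after t_years of decay."""
    return math.exp(-math.log(2.0) * t_years / half_life)

def cs134_cs137_ratio(t_years: float, initial_ratio: float = 1.0) -> float:
    """Activity ratio Cs-134/Cs-137 at t years after release, assuming the
    two isotopes started at the given activity ratio (default 1:1)."""
    return initial_ratio * (remaining_fraction(t_years, HALF_LIFE_CS134)
                            / remaining_fraction(t_years, HALF_LIFE_CS137))
```

Under these assumptions the ratio falls below roughly 6% within about nine years of the 2011 accident, so a static conversion fitted to the early isotope mix increasingly misrepresents later measurements, which is consistent with the abstract's finding that the time-dependent correction is needed through about 2020.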
Article
Full-text available
The Fukushima Daiichi Nuclear Power Plant disaster, which began on 11 March 2011, provided a crucial opportunity to evaluate the state of preparation on the part of the plant operator (TEPCO), relevant Japanese government agencies, and international oversight bodies to gather necessary information on radiation risks quickly and to share it with those tasked with emergency response as well as with the general public. The inadequacy of this preparation and the chaotic nature of inter-agency and inter-governmental communication have been well noted in several official reports on the disaster. In response, Safecast, an international, volunteer-based organization devoted to monitoring and openly sharing information on environmental radiation and other pollutants, was initiated on 12 March 2011, one day following the start of the accident. Since then the group has implemented participatory, open-source, citizen-science-centered radiation mapping solutions developed through a process of collaborative open innovation. The information Safecast provided has proven useful to experts, to policy makers, and to the public. This paper briefly describes the methodology and toolsets Safecast has developed and deployed, as well as organizational and social aspects, and summarizes key results obtained to date. In addition, it discusses appropriate criteria for evaluating the success of citizen-science efforts like Safecast, and places it in context with other non-governmental radiation monitoring efforts.
Article
Full-text available
This article briefly reviews the causes and impacts of the massive eastern Japan earthquake and tsunami of 11 March 2011, and comments on the response measures taken by Japan to cope with this devastating disaster. Mass losses occurred mostly because the intensity of the quake and the induced tsunami exceeded local coping capacity. In particular, the nuclear power plant crisis triggered by the tsunami significantly increased the short- and long-term impacts of the disaster. While the coping capacity that Japanese society built after the 1995 Hanshin-Awaji great earthquake tremendously mitigated the damage, there is room for improvement despite Japan's great efforts in this disaster. Investigating the tsunami preparedness of coastal nuclear power plants is an issue of paramount importance. In response to future large-scale disasters, there is an urgent need for a highly collaborative framework through which all available resources could be mobilized; a mutual assistance and rescue system against catastrophes among regions and countries on the basis of international humanitarian aid; and further in-depth research on the multi-hazard and disaster-chain phenomenon in large-scale disasters and corresponding governance approaches.
Article
Full-text available
The initial estimation of the release amounts of 131I and 137Cs accidentally discharged from the Fukushima Daiichi Nuclear Power Plant into the atmosphere is presented. For the source term estimation, environmental monitoring data on air concentrations of iodine and cesium were mainly used. The System for Prediction of Environmental Emergency Dose Information (SPEEDI) network system operated by MEXT was used for calculating air concentrations and dose rates. The simulation results were furnished by the NSC for the purpose of the source term estimation. To estimate the total amounts of 131I and 137Cs discharged into the atmosphere, the release duration was roughly estimated by assuming that release at a certain rate continued between the midpoints of the times at which the air samples were taken.
Article
Full-text available
The proliferation of information sources as a result of networked computers and other interconnected devices has prompted significant changes in the amount, availability, and nature of geographic information. Among the more significant changes is the increasing amount of readily available volunteered geographic information. Although volunteered information has fundamentally enhanced geographic data, it has also prompted concerns with regard to its quality, reliability, and overall value. This essay situates these concerns as issues of information and source credibility by (a) examining the information environment fostering collective information contribution, (b) exploring the environment of information abundance, examining credibility and related notions within this environment, and leveraging extant research findings to understand user-generated geographic information, (c) articulating strategies to discern the credibility of volunteered geographic information (VGI), including relevant tools useful in this endeavor, and (d) outlining specific research questions germane to VGI and credibility.
Article
Japan is the only country to suffer twice from the terrible consequences of atomic bombs. Hiroshima and Nagasaki are renowned internationally for experiencing the first twin devastating nuclear attacks in history. Unfortunately, Japan has witnessed several other serious nuclear-related disasters in recent years. The much publicized Fukushima disaster in 2011 is one of them. How could such a serious accident occur in a modern, highly sensitive, nuclear-conscious country? The answer to that central question is complex, involving not only political and administrative issues but also technical and human dimensions. In retrospect, both government officials and private industry were far too lax with the operation and development of nuclear policies and facilities. The Fukushima debacle was the result of a lack of rigorous management and control of nuclear issues by both public authorities and private industry.
Article
On March 11, 2011, an earthquake and tsunami crippled the Fukushima Daiichi Nuclear Power Station. The emerging crisis at the plant was complex, and, to make matters worse, it was exacerbated by communication gaps between the government and the nuclear industry. An independent investigation panel, established by the Rebuild Japan Initiative Foundation, reviewed how the government, the Tokyo Electric Power Company (Tepco), and other relevant actors responded. In this article, the panel’s program director writes about their findings and how these players were thoroughly unprepared on almost every level for the cascading nuclear disaster. This lack of preparation was caused, in part, by a public myth of “absolute safety” that nuclear power proponents had nurtured over decades and was aggravated by dysfunction within and between government agencies and Tepco, particularly in regard to political leadership and crisis management. The investigation also found that the tsunami that began the nuclear disaster could and should have been anticipated and that ambiguity about the roles of public and private institutions in such a crisis was a factor in the poor response at Fukushima.
Article
In recent months there has been an explosion of interest in using the Web to create, assemble, and disseminate geographic information provided voluntarily by individuals. Sites such as Wikimapia and OpenStreetMap are empowering citizens to create a global patchwork of geographic information, while Google Earth and other virtual globes are encouraging volunteers to develop interesting applications using their own data. I review this phenomenon, and examine associated issues: what drives people to do this, how accurate are the results, will they threaten individual privacy, and how can they augment more conventional sources? I compare this new phenomenon to more traditional citizen science and the role of the amateur in geographic observation.
Article
The U.S. Department of Energy/National Nuclear Security Administration's (DOE/NNSA) Aerial Measuring System (AMS) deployed personnel and equipment to partner with the U.S. Forces in Japan (USFJ) to conduct multiple aerial radiological surveys. These were the first and most comprehensive sources of actionable information for U.S. interests in Japan and provided early confirmation to the Government of Japan as to the extent of the release from the Fukushima Daiichi Nuclear Power Plant. Many challenges were overcome quickly during the first 48 h, including installation and operation of Aerial Measuring System equipment on multiple USFJ aircraft, flying over difficult terrain, and flying with USFJ pilots who were unfamiliar with the Aerial Measuring System flight patterns. These factors combined to make for a programmatically unanticipated situation. In addition to the challenges of multiple and ongoing releases, integration with the Japanese government to provide valid aerial radiological survey products that both military and civilian customers could use to make informed decisions was extremely complicated. The Aerial Measuring System Fukushima response provided insight into addressing these challenges and gave way to an opportunity for the expansion of the Aerial Measuring System's mission beyond the borders of the U.S.
Article
Within the framework of Web 2.0 mapping applications, the most striking example of a geographical application is the OpenStreetMap (OSM) project. OSM aims to create a free digital map of the world and is implemented through the engagement of participants in a mode similar to software development in Open Source projects. The information is collected by many participants, collated on a central database, and distributed in multiple digital formats through the World Wide Web. This type of information was termed ‘Volunteered Geographical Information’ (VGI) by Goodchild, 2007. However, to date there has been no systematic analysis of the quality of VGI. This study aims to fill this gap by analysing OSM information. The examination focuses on analysis of its quality through a comparison with Ordnance Survey (OS) datasets. The analysis focuses on London and England, since OSM started in London in August 2004 and therefore the study of these geographies provides the best understanding of the achievements and difficulties of VGI. The analysis shows that OSM information can be fairly accurate: on average within about 6 m of the position recorded by the OS, and with approximately 80% overlap of motorway objects between the two datasets. In the space of four years, OSM has captured about 29% of the area of England, of which approximately 24% are digitised lines without a complete set of attributes. The paper concludes with a discussion of the implications of the findings to the study of VGI as well as suggesting future research directions.
Article
This article reviews routine quality-control (QC) procedures for current nuclear medicine instrumentation, including the survey meter, dose calibrator, well counter, intraoperative probe, organ ("thyroid") uptake probe, gamma-camera, SPECT and SPECT/CT scanner, and PET and PET/CT scanner. It should be particularly useful for residents, fellows, and other trainees in nuclear medicine, nuclear cardiology, and radiology. The procedures described and their respective frequencies are presented only as general guidelines.