Chapter

Chapitre 2 : Nucléaire et radioactivité


Abstract

Science lies at the heart of many societal issues, among them the numerous challenges raised by the ecological crises. These issues involve mechanisms that physics can help explain. A university education in physics provides a standard toolkit for critiquing results related to the discipline: orders of magnitude, the fit between models and experiments, and experimental uncertainties. This toolkit is necessary for any scientist. But a citizen must also be able to interact rationally with others, that is, to understand societal debates and take part in them in an informed way. It is with this in mind that this book was written, based on a course for third-year physics undergraduates (Licence) at Université Paris Cité. Its aim is twofold: to convey a minimal body of knowledge in a few areas of physics on topics frequently encountered in the media, so as to grasp what is at stake, and to provide keys for forming an informed opinion. After explaining how scientific knowledge is built, the textbook provides elements for understanding radioactivity, global warming, energy in society and greenhouse gas emissions, as well as electromagnetic waves and health.


References
Preprint
Full-text available
Towards the end of June 2021, temperature records were broken by several degrees Celsius in several cities in the Pacific northwest areas of the U.S. and Canada, leading to spikes in sudden deaths and sharp increases in hospital visits for heat-related illnesses and emergency calls. Here we present a multi-model, multi-method attribution analysis to investigate to what extent human-induced climate change has influenced the probability and intensity of extreme heatwaves in this region. Based on observations and modeling, the occurrence of a heatwave with maximum daily temperatures (TXx) as observed in the area 45° N–52° N, 119° W–123° W was found to be virtually impossible without human-caused climate change. The observed temperatures were so extreme that they lie far outside the range of historically observed temperatures, which makes it hard to quantify with confidence how rare the event was. In the most realistic statistical analysis, which assumes that the heatwave was a very low probability event not caused by new nonlinearities, the event is estimated to be about a 1-in-1000-year event in today's climate. Under this assumption, and combining the results from climate models and weather observations, an event as rare as 1 in 1000 years (defined on daily maximum temperatures, TXx, in the heatwave region) would have been at least 150 times rarer without human-induced climate change. This heatwave was also about 2 °C hotter than a 1-in-1000-year heatwave would have been at the beginning of the industrial revolution (when global mean temperatures were 1.2 °C cooler than today). Looking into the future, in a world with 2 °C of global warming (0.8 °C warmer than today), a 1000-year event would be another degree hotter and would occur roughly every 5 to 10 years. Our results provide a strong warning: our rapidly warming climate is bringing us into uncharted territory with significant consequences for health, well-being, and livelihoods. Adaptation and mitigation are urgently needed to prepare societies for a very different future.
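Restated as annual exceedance probabilities (our notation, not the authors'), the attribution result amounts to a probability ratio:
\[ PR = \frac{p_{\text{today}}}{p_{\text{preindustrial}}} \ge 150, \qquad p_{\text{today}} \approx \frac{1}{1000}\ \text{yr}^{-1} \;\Rightarrow\; p_{\text{preindustrial}} \lesssim \frac{1}{150\,000}\ \text{yr}^{-1}. \]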
Article
Full-text available
Building upon recent work on other major fossil fuel companies, we report new archival research and primary source interviews describing how Total responded to evolving climate science and policy in the last 50 years. We show that Total personnel received warnings of the potential for catastrophic global warming from its products by 1971, became more fully informed of the issue in the 1980s, began promoting doubt regarding the scientific basis for global warming by the late 1980s, and ultimately settled on a position in the late 1990s of publicly accepting climate science while promoting policy delay or policies peripheral to fossil fuel control. Additionally, we find that Exxon, through the International Petroleum Industry Environmental Conservation Association (IPIECA), coordinated an international campaign to dispute climate science and weaken international climate policy, beginning in the 1980s. This represents one of the first longitudinal studies of a major fossil fuel company’s responses to global warming to the present, describing historical stages of awareness, preparation, denial, and delay.
Article
Full-text available
Plain Language Summary The net sunlight reaching the Earth's climate system depends on the solar irradiance and the Earth's reflectance (albedo). We have observed earthshine from Big Bear Solar Observatory to measure the terrestrial albedo. For earthshine we measure the sunlight reflected from Earth to the dark part of the lunar face and back to the nighttime observer, yielding an instantaneous large-scale reflectance of the Earth. In these relative measurements, we also observe the sunlit, bright part of the lunar face. We report here reflectance data (monthly, seasonal and annual) covering two decades, 1998–2017. The albedo shows a decline corresponding to a net climate forcing of about 0.5 W/m². We find no correlation between measures of solar cycle variations and the albedo variations. The first precise satellite measures of terrestrial albedo came with CERES. CERES global albedo data (2001 onward) show a decrease in forcing that is about twice that of earthshine measurements. The evolutionary changes in albedo motivate continuing earthshine observations as a complement to absolute satellite measurements, especially since earthshine and CERES measurements are sensitive to distinctly different parts of the angular reflectivity. The recent drop in albedo is attributed to a warming of the eastern Pacific, which is measured to reduce low-lying cloud cover and, thereby, the albedo.
Article
Full-text available
Mineral resources are a finite stock that humanity is depleting ever faster. Models show that, for a number of them, production peaks could be crossed within this century. Sufficiency could therefore be the defining societal challenge of our time...
Article
Full-text available
The current linear no-threshold paradigm assumes that any exposure to ionizing radiation carries some risk, thus every effort should be made to keep exposures as low as possible. We examined whether background radiation impacts human longevity and cancer mortality. Our data covered the entire US population of the 3139 US counties, encompassing over 320 million people. This is the first large-scale study that takes into account the two major sources of background radiation (terrestrial radiation and cosmic radiation) while covering the entire US population. Here, we show that life expectancy, the most integrative index of population health, was approximately 2.5 years longer in people living in areas with relatively high vs. low background radiation (≥ 180 mrem/year and ≤ 100 mrem/year, respectively; p < 0.005; 95% confidence interval [CI]). This radiation-induced lifespan extension could to a great extent be associated with the decrease in cancer mortality rate observed for several common cancers (lung, pancreas and colon cancers for both genders, and brain and bladder cancers for males only; p < 0.05; 95% CI). Exposure to high background radiation displays clear beneficial health effects in humans. These hormetic effects provide clear indications for reconsidering the linear no-threshold paradigm, at least within the natural range of low-dose radiation.
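For readers more used to SI dose units, the thresholds quoted above convert as follows (1 rem = 10 mSv, a standard conversion added here for context):
\[ 180\ \text{mrem/yr} = 1.8\ \text{mSv/yr}, \qquad 100\ \text{mrem/yr} = 1.0\ \text{mSv/yr}. \]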
Article
Full-text available
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere – the “global carbon budget” – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. Fossil CO2 emissions (EFF) are based on energy statistics and cement production data, while emissions from land use change (ELUC), mainly deforestation, are based on land use and land use change data and bookkeeping models. Atmospheric CO2 concentration is measured directly and its growth rate (GATM) is computed from the annual changes in concentration. The ocean CO2 sink (SOCEAN) and terrestrial CO2 sink (SLAND) are estimated with global process models constrained by observations. The resulting carbon budget imbalance (BIM), the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere, is a measure of imperfect data and understanding of the contemporary carbon cycle. All uncertainties are reported as ±1σ. For the last decade available (2009–2018), EFF was 9.5±0.5 GtC yr−1, ELUC 1.5±0.7 GtC yr−1, GATM 4.9±0.02 GtC yr−1 (2.3±0.01 ppm yr−1), SOCEAN 2.5±0.6 GtC yr−1, and SLAND 3.2±0.6 GtC yr−1, with a budget imbalance BIM of 0.4 GtC yr−1 indicating overestimated emissions and/or underestimated sinks. For the year 2018 alone, the growth in EFF was about 2.1 % and fossil emissions increased to 10.0±0.5 GtC yr−1, reaching 10 GtC yr−1 for the first time in history, ELUC was 1.5±0.7 GtC yr−1, for total anthropogenic CO2 emissions of 11.5±0.9 GtC yr−1 (42.5±3.3 GtCO2). Also for 2018, GATM was 5.1±0.2 GtC yr−1 (2.4±0.1 ppm yr−1), SOCEAN was 2.6±0.6 GtC yr−1, and SLAND was 3.5±0.7 GtC yr−1, with a BIM of 0.3 GtC. The global atmospheric CO2 concentration reached 407.38±0.1 ppm averaged over 2018. For 2019, preliminary data for the first 6–10 months indicate a reduced growth in EFF of +0.6 % (range of −0.2 % to 1.5 %) based on national emissions projections for China, the USA, the EU, and India and projections of gross domestic product corrected for recent changes in the carbon intensity of the economy for the rest of the world. Overall, the mean and trend in the five components of the global carbon budget are consistently estimated over the period 1959–2018, but discrepancies of up to 1 GtC yr−1 persist for the representation of semi-decadal variability in CO2 fluxes. A detailed comparison among individual estimates and the introduction of a broad range of observations shows (1) no consensus in the mean and trend in land use change emissions over the last decade, (2) a persistent low agreement between the different methods on the magnitude of the land CO2 flux in the northern extra-tropics, and (3) an apparent underestimation of the CO2 variability by ocean models outside the tropics. This living data update documents changes in the methods and data sets used in this new global carbon budget and the progress in understanding of the global carbon cycle compared with previous publications of this data set (Le Quéré et al., 2018a, b, 2016, 2015a, b, 2014, 2013). The data generated by this work are available at https://doi.org/10.18160/gcp-2019 (Friedlingstein et al., 2019).
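The budget identity behind these figures can be written out and checked against the 2009–2018 decadal means quoted above (subscripted notation is ours; the abstract writes EFF, ELUC, GATM, SOCEAN, SLAND, BIM inline):
\[ E_{\text{FF}} + E_{\text{LUC}} = G_{\text{ATM}} + S_{\text{OCEAN}} + S_{\text{LAND}} + B_{\text{IM}}, \]
\[ 9.5 + 1.5 = 4.9 + 2.5 + 3.2 + 0.4 \quad (\text{GtC yr}^{-1}). \]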
Article
Full-text available
Significance Public opinion toward some science and technology issues is polarized along religious and political lines. We investigate whether people with more education and greater science knowledge tend to express beliefs that are more (or less) polarized. Using data from the nationally representative General Social Survey, we find that more knowledgeable individuals are more likely to express beliefs consistent with their religious or political identities for issues that have become polarized along those lines (e.g., stem cell research, human evolution), but not for issues that are controversial on other grounds (e.g., genetically modified foods). These patterns suggest that scientific knowledge may facilitate defending positions motivated by nonscientific concerns.
Thesis
Full-text available
From the end of Antiquity onwards, large bronzes were gradually neglected and then abandoned. It was not until the second half of the 16th century, under François Ier, that large statuary ensembles in bronze reappeared in the French artistic landscape. Technique plays a major role in this phenomenon and raises several questions. First, for this period, there is the question of whether or not French bronzes had a technical identity of their own. The execution of a bronze statue involves many steps leading from the model to its transcription into metal. Can one recognize in these steps, and in the ways they were approached, a certain technical unity that would characterize the second half of the 16th century, or even the beginning of the 17th, in France? Conversely, can one already distinguish, in this period of reintroduction, different schools grouping certain sculptors, founders and workshops, or even tied to particular projects? Moreover, the reappearance of the know-how associated with bronze statuary raises the question of origins. Where did these supposedly forgotten techniques come from: other European centres, or foundry practices used for other types of production? In this thesis, we have sought to provide elements of an answer to these questions. To do so, technological studies were carried out on three major ensembles marking this period of renewal in the art of bronze: the bronze copies of antique marbles by Primatice, the Virtues of the funerary monument of Henri II and Catherine de Médicis, and the Allegories of the monument for the heart of Anne de Montmorency. To complete this corpus, isolated works were also studied: the Diane chasseresse of Barthélémy Prieur, and the Apollon du Belvédère, the Gladiateur Borghèse and the Vénus Médicis attributed to Hubert le Sueur. The objective was to reveal the processes, materials and know-how involved, complementing the information provided by the archival documents that accompany these prestigious commissions. The study strategy drew on the experience of work undertaken over the last thirty years. Methodological developments were nevertheless necessary to extend the possibilities offered by the technological study of bronze statuary; these concerned in particular the casting cores, the materials used to make hollow statues. The results show that the first decades of the reappropriation of large bronze statuary were marked by the use of a single process, casting "à l'épargné", rooted in the processes used in the Middle Ages for casting bells or cannons, for example. But from the 17th century onwards, the monopoly of this process seems to have been challenged, no doubt evidence of the founders' emancipation and of constant innovation. Alongside these phenomena, whose consequences marked statuary casting in general, sculptors and founders developed personal know-how in their workshops and innovated case by case, according to the nature of the commissions they received.
Article
Full-text available
We consider the problem of grid-forming control of power converters in low-inertia power systems. Starting from an average-switch three-phase inverter model, we draw parallels to a synchronous machine (SM) model and propose a novel grid-forming converter control strategy which dwells upon the main characteristic of an SM: the presence of an internal rotating magnetic field. In particular, we augment the converter system with a virtual oscillator whose frequency is driven by the DC-side voltage measurement and which sets the converter pulse-width-modulation signal, thereby achieving exact matching between the converter in closed loop and the SM dynamics. We then provide a sufficient condition assuring existence, uniqueness, and global asymptotic stability of equilibria in a coordinate frame attached to the virtual oscillator angle. By actuating the DC-side input of the converter we are able to enforce this sufficient condition. In the same setting, we highlight strict incremental passivity, droop, and power-sharing properties of the proposed framework, which are compatible with conventional requirements of power system operation. We subsequently adopt disturbance decoupling techniques to design additional control loops that regulate the DC-side voltage, as well as AC-side frequency and amplitude, and finally validate them with numerical experiments.
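As a rough illustration of the matching principle described above, here is a minimal Python sketch of a virtual oscillator whose frequency tracks the DC-link voltage; the parameter values, names, and the Euler integration are illustrative assumptions, not the paper's implementation:

import numpy as np

# Matching control idea: the virtual oscillator angle advances at a rate
# proportional to the measured DC-link voltage, so the converter mimics a
# synchronous machine whose speed reflects its power balance.
OMEGA_REF = 2 * np.pi * 50.0  # nominal frequency [rad/s] (assumed 50 Hz grid)
VDC_REF = 1000.0              # nominal DC-link voltage [V] (assumed)
ETA = OMEGA_REF / VDC_REF     # gain: v_dc = VDC_REF yields nominal frequency
MU = 0.8                      # modulation amplitude (assumed)

def step(theta, v_dc, dt=1e-4):
    """One Euler step of the oscillator; returns angle and PWM references."""
    theta += ETA * v_dc * dt                           # d(theta)/dt = eta * v_dc
    m_abc = MU * np.cos(theta - 2 * np.pi * np.arange(3) / 3)  # three-phase refs
    return theta, m_abc

theta = 0.0
for _ in range(1000):  # toy run: a constant nominal DC voltage gives 50 Hz output
    theta, m_abc = step(theta, v_dc=VDC_REF)

Under this law a sagging DC voltage automatically slows the virtual angle, which is consistent with the droop-like behaviour the abstract highlights.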
Article
Full-text available
In this study, we construct a new monthly zonal mean carbon dioxide (CO2) distribution from the upper troposphere to the stratosphere over the 2000–2010 time period. This reconstructed CO2 product is based on a Lagrangian backward trajectory model driven by ERA-Interim reanalysis meteorology and tropospheric CO2 measurements. Comparisons of our CO2 product to extratropical in situ measurements from aircraft transects and balloon profiles show remarkably good agreement. The main features of the CO2 distribution include (1) relatively large mixing ratios in the tropical stratosphere; (2) seasonal variability in the extratropics, with relatively high mixing ratios in the summer and autumn hemisphere in the 15–20 km altitude layer; and (3) decreasing mixing ratios with increasing altitude from the upper troposphere to the middle stratosphere (∼35 km). These features are consistent with expected variability due to the transport of long-lived trace gases by the stratospheric Brewer–Dobson circulation. The method used here to construct this CO2 product differs from other modelling efforts and should be useful for model and satellite validation in the upper troposphere and stratosphere, as a prior for inversion modelling, and for analysing features of stratosphere–troposphere exchange as well as the stratospheric circulation and its variability.
Article
Full-text available
The availability of wind power for renewable energy extraction is ultimately limited by how much kinetic energy is generated by natural processes within the Earth system and by fundamental limits of how much of the wind power can be extracted. Here we use these considerations to provide a maximum estimate of wind power availability over land. We use several different methods. First, we outline the processes associated with wind power generation and extraction with a simple power transfer hierarchy based on the assumption that available wind power will not geographically vary with increased extraction for an estimate of 68 TW. Second, we set up a simple momentum balance model to estimate maximum extractability which we then apply to reanalysis climate data, yielding an estimate of 21 TW. Third, we perform general circulation model simulations in which we extract different amounts of momentum from the atmospheric boundary layer to obtain a maximum estimate of how much power can be extracted, yielding 18–34 TW. These three methods consistently yield maximum estimates in the range of 18–68 TW and are notably less than recent estimates that claim abundant wind power availability. Furthermore, we show with the general circulation model simulations that some climatic effects at maximum wind power extraction are similar in magnitude to those associated with a doubling of atmospheric CO2. We conclude that in order to understand fundamental limits to renewable energy resources, as well as the impacts of their utilization, it is imperative to use a "top-down" thermodynamic Earth system perspective, rather than the more common "bottom-up" engineering approach.
Article
Full-text available
The consensus that humans are causing recent global warming is shared by 90%–100% of publishing climate scientists according to six independent studies by co-authors of this paper. Those results are consistent with the 97% consensus reported by Cook et al (Environ. Res. Lett. 8 024024) based on 11 944 abstracts of research papers, of which 4014 took a position on the cause of recent global warming. A survey of authors of those papers (N = 2412 papers) also supported a 97% consensus. Tol (2016 Environ. Res. Lett. 11 048001) comes to a different conclusion using results from surveys of non-experts such as economic geologists and a self-selected group of those who reject the consensus. We demonstrate that this outcome is not unexpected because the level of consensus correlates with expertise in climate science. At one point, Tol also reduces the apparent consensus by assuming that abstracts that do not explicitly state the cause of global warming ('no position') represent non-endorsement, an approach that if applied elsewhere would reject consensus on well-established theories such as plate tectonics. We examine the available studies and conclude that the finding of 97% consensus in published climate research is robust and consistent with other surveys of climate scientists and peer-reviewed studies.
Book
Full-text available
Technical Report
Full-text available
This document examines the impacts of integrating a large share of variable renewable generation into the generation mix of the European interconnected electricity system. The analysis, which is based on the results of long-term studies performed by EDF R&D, aims at improving the current understanding of the technical and economic feasibility of a massive deployment of wind and PV across the European system. The document addresses in particular several aspects of the system integration of variable generation, including the characterization of variable RES generation, the need for generation and interconnection infrastructure, the impacts on short-term system operation, and market profitability.
Article
Full-text available
State-of-the-art radiative models can be used to calculate in a rigorous and accurate manner the atmospheric greenhouse effect, as well as its variation with the concentration of water vapour or carbon dioxide. A simple explanation of this effect uses an analogy with the greenhouse effect produced by a glass window. While this analogy has pedagogical virtues and provides a first-order explanation of the mean temperature of the Earth, it has an important drawback: it is not able to explain why the greenhouse effect increases with increasing carbon dioxide concentration. Indeed, absorption of infrared radiation by carbon dioxide is, under this scheme, almost at its maximum and depends very weakly on CO2 concentration; it is said to be saturated. In this paper, we explore this question and propose an alternative model which, while remaining simple, correctly takes into account the various mechanisms and provides an understanding of the increasing greenhouse effect with CO2 concentration, together with the corresponding climate warming. The role of the atmospheric temperature gradient is particularly stressed.
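For reference, the single-layer "glass window" model the authors start from gives, with standard textbook values (solar constant S ≈ 1361 W/m² and albedo α ≈ 0.3, our inputs rather than the paper's):
\[ \sigma T_e^4 = \frac{(1-\alpha)S}{4} \Rightarrow T_e \approx 255\ \text{K}, \qquad T_s = 2^{1/4}\,T_e \approx 303\ \text{K}, \]
in the right range of the observed ∼288 K mean surface temperature, but with no dependence on CO2 concentration, which is precisely the drawback the paper addresses.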
Article
Full-text available
This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.
Article
Full-text available
The near- and long-term societal effects of declining EROI are uncertain, but probably adverse. A major obstacle to examining the social implications of declining EROI is that we do not have an adequate empirical understanding of how EROI is linked, directly or indirectly, to an average citizen's ability to achieve well-being. To evaluate the possible linkages between societal well-being and net energy availability, we compare these preliminary estimates of energy availability: (1) EROI at a societal level, (2) energy use per capita, (3) multiple regression analyses and (4) a new composite energy index (Lambert Energy Index), to selected indicators of quality of life (HDI, percentage of underweight children, health expenditures, Gender Inequality Index, literacy rate and access to improved water). Our results suggest that energy indices are highly correlated with a higher standard of living. We also find a saturation point at which increases in per capita energy availability (greater than 150 GJ) or EROI (above 20:1) are not associated with further improvement to society.
Article
Full-text available
Background: We extend our previous study of childhood leukaemia and proximity to high-voltage powerlines by including more recent data and cases and controls from Scotland, by considering 132-kV powerlines as well as 275 and 400 kV, and by looking at greater distances from the powerlines. Methods: Case–control study using 53 515 children from the National Registry of Childhood Tumours 1962–2008, matched controls, and calculated distances of mother's address at child's birth to powerlines at 132, 275, and 400 kV in England, Wales and Scotland. Results: Our previous finding of an excess risk for leukaemia at distances out to 600 m declines over time. Relative risk and 95% confidence interval for leukaemia, 0–199 m compared with >1000 m, all voltages: 1960s 4.50 (0.97–20.83), 2000s 0.71 (0.49–1.03), aggregate over the whole period 1.12 (0.90–1.38). An increased risk, albeit less strong, may also be present for 132-kV lines. Increased risk does not extend beyond 600 m for lines of any voltage. Conclusions: A risk declining over time is unlikely to arise from any physical effect of the powerlines and is more likely to be the result of changing population characteristics among those living near powerlines.
Article
Full-text available
All forms of economic production and exchange involve the use of energy directly and in the transformation of materials. Until recently, cheap and seemingly limitless fossil energy has allowed most of society to ignore the importance of contributions to the economic process from the biophysical world as well as the potential limits to growth. This paper centers on assessing the energy costs of modern-day society and their relation to GDP. Our most important focus is the characteristics of our major energy sources, including each fuel's energy return on investment (EROI). The EROI of our most important fuels is declining, and most renewable and non-conventional energy alternatives have substantially lower EROI values than traditional conventional fossil fuels. At the societal level, declining EROI means that an increasing proportion of energy output and economic activity must be diverted to attaining the energy needed to run an economy, leaving fewer discretionary funds available for "non-essential" purchases, which often drive growth. The declining EROI of traditional fossil fuel energy sources and the effect of that on the world economy are likely to result in a myriad of consequences, most of which will not be perceived as good.
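The "diverted proportion" argument follows directly from the definition EROI = E_out/E_in: if the energy invested must itself come out of gross output, the net fraction left for the rest of the economy is (our formulation of a standard identity)
\[ f_{\text{net}} = 1 - \frac{1}{\text{EROI}}, \]
so a fall from EROI = 20 to 5 only reduces the net share from 95% to 80%, while a further fall to 2 leaves 50%: the societal impact of declining EROI is strongly nonlinear at the low end.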
Article
The rise of “fake news” is a major concern in contemporary Western democracies. Yet, research on the psychological motivations behind the spread of political fake news on social media is surprisingly limited. Are citizens who share fake news ignorant and lazy? Are they fueled by sinister motives, seeking to disrupt the social status quo? Or do they seek to attack partisan opponents in an increasingly polarized political environment? This article is the first to test these competing hypotheses based on a careful mapping of psychological profiles of over 2,300 American Twitter users linked to behavioral sharing data and sentiment analyses of more than 500,000 news story headlines. The findings contradict the ignorance perspective but provide some support for the disruption perspective and strong support for the partisan polarization perspective. Thus, individuals who report hating their political opponents are the most likely to share political fake news and selectively share content that is useful for derogating these opponents. Overall, our findings show that fake news sharing is fueled by the same psychological motivations that drive other forms of partisan behavior, including sharing partisan news from traditional and credible news sources.
Book
It is counter-intuitive, but we often do not think or act rationally. For example, after the attacks on the World Trade Center, many of us were afraid to take a plane and favoured travelling by car whenever possible, even though the probability of dying in a plane is far lower than that of dying in a car. Why do we tend to give more weight to information that confirms our beliefs than to information that contradicts them? Why can the narratives constructed by our brain be perfectly coherent and yet completely wrong? In short, why are we biased? Understanding cognitive biases and knowing how to remedy them is fundamental, because their consequences, both at the individual and the collective level, are far from trivial. Handling probabilities, understanding randomness, making decisions: in each of these areas, the influence of cognitive biases is major. Drawing on many examples from everyday life and in a very lively style, Vincent Berthet highlights our bounded rationality, and shows how certain actors sometimes profit from it. A dive into the heart of our irrationality.
Book
Tracing superstitions and hoaxes back to their most pernicious sources through an explanation of what the scientific method, la Zététique, is.
Article
Two systems of knowledge production are identified and addressed by this study: peer review and public review. The peer-review system is defined, its goals are stated, its participants and their roles are identified, and its affordances are summarized. The double-blind journal submission system is examined as an example of peer review. Three enduring challenges for peer review are identified: lack of reproducibility, costs of publication, and undue influence of sponsorship. This study also identifies and defines a new concept: public review. Similarly, the goals of public review are stated, its participants and their roles are identified, and its affordances are summarized. Wikipedia is examined as a primary example of public review. The challenges of public review reaching its goals are enumerated, with uneven development, participation, and representation identified as enduring problems. Lastly, examples of the differing features and affordances of each system recombining to achieve new results are envisioned by identifying example projects. In conclusion, the study argues that peer review and public review should be understood and contextually applied for their relative strengths rather than criticized for failing to deliver the goals of the other.
Article
This guest editorial takes the angle of real work, the actual activity of researchers in psychology. We propose an analysis of developments in the discipline in terms of the dissemination of research "products" and scientific publication. The capitalist (commercial) logic at work, emanating from the major private publishing groups (Elsevier, Frontiers, etc.), undermines the work of researchers and tends to deprive them of the ability to think about their practices and professional rules. The effects of the extension of capitalism to scientific production and work are of several kinds. The most tangible manifestation remains journals with article processing charges. These are indicative of much deeper changes in an environment deregulated by market forces. This deregulation leads to a questioning of universalism in Robert K. Merton's sense. In this context, researchers and their institutions become a two-way source of profit for the major publishing groups. Prey to the capitalist predation of the latter, research workers gradually lose control over the organisation of their own production and over the choice of criteria for evaluating their work. The (commercial) bibliometric tools developed by publishers tend to become the basis for evaluating researchers' work, substituting for the "thinking work" of scientific communities. We conclude with a critique of the evaluation criteria promoted by publishers (in particular the SCImago Journal Rank, SJR), stressing that they tend to conflict with the production of quality work, not to mention that the desire to comply with imposed criteria can sometimes encourage practices bordering on imposture, contrary to scientific integrity. This text opens avenues for reflection on the conditions under which researchers can reclaim their profession and the ways their work is disseminated. Guest editorial in open access at: http://www.cairn.info/revue-zilsel-2019-2-page-9.htm
Article
This study presents the data on ¹²⁹I and ²³⁶U concentrations in seawater samples and sea ice cores obtained during two expeditions to the Arctic Ocean that took place onboard R/V Polarstern (PS94) and R/V Lance (N-ICE2015) in summer 2015. Carbon-14 was also measured in the deep water samples from the Nansen, Amundsen, and Makarov Basins. The main goal was to investigate the distribution of ¹²⁹I and ²³⁶U in a transect from the Norwegian Coast to the Makarov Basin to fully exploit the potential of combining ¹²⁹I and ²³⁶U as a dual tracer to track Atlantic waters throughout the Arctic Ocean. The use of the ¹²⁹I/²³⁶U and ²³⁶U/²³⁸U atom ratios allowed identifying a third Atlantic branch that enters the Arctic Ocean (the Arctic Shelf Break Branch) following the Norwegian Coastal Current that carries a larger proportion of the European reprocessing plants signal compared to Fram Strait Branch Water and Barents Sea Branch Water. The combination of ¹²⁹I and ²³⁶U also allowed quantifying the different proportions of the La Hague stream, the Scottish stream, and Atlantic waters forming the three Atlantic branches of the Arctic Ocean Boundary Current. The results show that the ¹²⁹I/²³⁶U atom ratio can now be used to identify the different Atlantic branches entering the Arctic Ocean. New input functions for ¹²⁹I, ²³⁶U, and ¹²⁹I/²³⁶U have also been described for each branch, which can be further used for calculation of transit time distributions of Atlantic waters.
Article
Is the Sun a solar-type star? The Sun's activity, including sunspot activity, varies on an 11-year cycle driven by changes in its magnetic field. Other nearby solar-type stars have their own cycles, but the Sun does not seem to match their behavior. Strugarek et al. used magnetohydrodynamic simulations to show that stellar activity periods should depend on the star's Rossby number, the ratio between the inertial and Coriolis forces. Turning to observations, they found that solar-type stars, including the Sun, follow this relation. The results advance our understanding of how stars generate their magnetic fields and confirm that the Sun is indeed a solar-type star. Science, this issue p. 185
Book
Because of its peculiar biology, its negative impacts on forestry, and its urticating larvae affecting human and animal health, the pine processionary moth has been widely studied in many European countries over the last century. However, knowledge remained scattered and no synthesis had ever been published. Since the IPCC retained the moth as one of the two insect indicators of climate change, owing to its range expansion with warming, filling this gap became increasingly important. Led by INRA, this book brings together 101 authors from 22 countries of Europe, Asia Minor and North Africa, combining all the research fields concerned (entomology, ecology, genetics, mathematical modelling, medical and veterinary science, pest management) in a multidisciplinary approach to understand and model the processes underlying past, present and future moth expansion and to propose adapted management methods. The major biological patterns of the related processionary species are also detailed.
Article
Idiopathic Environmental Intolerance attributed to Electromagnetic Fields (IEI-EMF) is a condition in which symptoms are attributed to electromagnetic field (EMF) exposure. As electro-hypersensitive (EHS) people have repeatedly been observed, during provocation trials, to report symptoms following perceived rather than actual exposure, the hypothesis has been put forward that IEI-EMF originates from psychological mechanisms, especially nocebo responses. This paper examines this hypothesis, using data from a qualitative study aimed at understanding how EHS people come to regard themselves as such. Forty self-diagnosed EHS people were interviewed. A typified model of their attribution process was then elaborated, inductively, from their narratives. This model is linear and composed of seven stages: (1) onset of symptoms; (2) failure to find a solution; (3) discovery of EHS; (4) gathering of information about EHS; (5) implicit appearance of conviction; (6) experimentation; (7) conscious acceptance of conviction. Overall, symptoms appear before subjects start questioning effects of EMF on their health, which is not consistent with the hypothesis that IEI-EMF originates from nocebo responses to perceived EMF exposure. However, such responses might occur at the sixth stage of the process, potentially reinforcing the attribution. It remains possible that some cases of IEI-EMF originate from other psychological mechanisms.
Article
Solar energy provides by far the greatest potential for energy generation among all forms of renewable energy. Yet, just as for any form of energy conversion, it is subject to physical limits. Here we review the physical limits that determine how much energy can potentially be generated out of sunlight using a combination of thermodynamics and observed climatic variables. We first explain how the first and second law of thermodynamics constrain energy conversions and thereby the generation of renewable energy, and how this applies to the conversions of solar radiation within the Earth system. These limits are applied to the conversion of direct and diffuse solar radiation - which relates to concentrated solar power (CSP) and photovoltaic (PV) technologies as well as biomass production or any other photochemical conversion - as well as solar radiative heating, which generates atmospheric motion and thus relates to wind power technologies. When these conversion limits are applied to observed data sets of solar radiation at the land surface, it is estimated that direct concentrated solar power has a potential on land of up to 11.6 PW (1 PW = 10¹⁵ W), whereas photovoltaic power has a potential of up to 16.3 PW. Both biomass and wind power operate at much lower efficiencies, so their potentials of about 0.3 and 0.1 PW are much lower. These estimates are considerably lower than the incoming flux of solar radiation of 175 PW. When compared to a 2012 primary energy demand of 17 TW, the most direct uses of solar radiation, e.g., by CSP or PV, have thus by far the greatest potential to yield renewable energy requiring the least space to satisfy the human energy demand. Further conversions into solar-based fuels would be reduced by further losses which would lower these potentials. The substantially greater potential of solar-based renewable energy compared to other forms of renewable energy simply reflects much fewer and lower unavoidable conversion losses when solar radiation is directly converted into renewable energy.
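Putting the abstract's own numbers side by side (the arithmetic is ours):
\[ \frac{16.3\ \text{PW (PV potential)}}{175\ \text{PW (incident solar)}} \approx 9\%, \qquad \frac{16.3\ \text{PW}}{17\ \text{TW (2012 demand)}} \approx 960, \]
i.e. the estimated PV potential is roughly a thousand times the 2012 primary energy demand, whereas the wind potential of about 0.1 PW is only a few times that demand.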
Article
The Internet is the source of many hopes and fears. Some claim that this tool will help knowledge societies emerge, provided digital and cognitive divides are first reduced. Taking that idea as a starting point, this article evaluates the real balance of power between beliefs and knowledge on the web. It shows that the amplification of information dissemination favours the expression of "confirmation bias", one of the mechanisms that allow beliefs to survive. It also stresses that the Internet is a cognitive market that is very sensitive to the structuring of supply, and therefore to the motivation of suppliers, which confers a decisive advantage on the "empire of beliefs". This last point is measured quantitatively in this paper on various topics of belief.
Article
For the past twenty years or so, there has been debate about the audience for the para-sciences in our societies. To what extent does the public claim to believe in the reality of phenomena such as the explanation of personality by astrological signs, telepathy, or spells? In France, a series of surveys has been conducted since the early 1980s. This article is based on a comparative analysis of the results of five opinion surveys carried out from 1982 to 2000, covering beliefs in 11 such phenomena. Examination of these data indicates that, contrary to common opinion, these beliefs do not appear to have progressed appreciably over the period. Analysis of sociodemographic and ideological determinants reveals certain regularities: women, young people, and people who believe in an afterlife are more inclined to declare beliefs in the para-sciences.