Tom E. Hardwicke's research while affiliated with the University of Amsterdam and other places

Publications (46)

Article
Flexibility in the design, analysis and interpretation of scientific studies creates a multiplicity of possible research outcomes. Scientists are granted considerable latitude to selectively use and report the hypotheses, variables and analyses that create the most positive, coherent and attractive story while suppressing those that are negative or...
Article
Full-text available
Scientific journals may counter the misuse, misreporting, and misinterpretation of statistics by providing guidance to authors. We described the nature and prevalence of statistical guidance at 15 journals (top-ranked by Impact Factor) in each of 22 scientific disciplines across five high-level domains (N = 330 journals). The frequency of statistic...
Article
Full-text available
Journals exert considerable control over letters, commentaries and online comments that criticize prior research (post-publication critique). We assessed policies (Study One) and practice (Study Two) related to post-publication critique at 15 top-ranked journals in each of 22 scientific disciplines (N = 330 journals). Two-hundred and seven (63%) j...
Preprint
Scientific journals may counter the misuse and misinterpretation of statistical methods by providing statistical guidance to authors. Here, we sought to assess the nature and prevalence of statistical guidance offered to authors by 15 journals (top-ranked by Impact Factor) in each of 22 scientific disciplines (N = 330 journals). For each journal, t...
Preprint
Full-text available
Background Undisclosed discrepancies often exist between study registrations and their associated publications. Discrepancies can increase risk of bias, and when undisclosed, they disguise this increased risk of bias from readers. To remedy this issue, we developed an intervention called discrepancy review . We provided journals with peer reviewers...
Preprint
Critical scrutiny of published research is an important feature of a self-correcting scientific ecosystem. Academic journals exert considerable control over criticism submitted in the form of letters, commentaries, or online comments (post-publication peer review, PPPR). Currently, there is limited empirical data documenting how journals handle PPP...
Article
Replication—an important, uncommon, and misunderstood practice—is gaining appreciation in psychology. Achieving replicability is important for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and validity can advance knowledg...
Article
Full-text available
Replication studies that contradict prior findings may facilitate scientific self-correction by triggering a reappraisal of the original studies; however, the research community’s response to replication results has not been studied systematically. One approach for gauging responses to replication results is to examine how they affect citations to...
Article
Full-text available
Psychologists are navigating an unprecedented period of introspection about the credibility and utility of their discipline. Reform initiatives emphasize the benefits of transparency and reproducibility-related research practices; however, adoption across the psychology literature is unknown. Estimating the prevalence of such practices will help to...
Preprint
Replication studies that contradict prior findings may facilitate scientific self-correction by triggering a reappraisal of the original studies; however, the research community's response to replication results has not been studied systematically. One approach for gauging responses to replication results is to examine how they impact citations to...
Preprint
Replication, an important, uncommon, and misunderstood practice, is making a comeback in psychology. Achieving replicability is a necessary but not sufficient condition for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and...
Article
Full-text available
For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrep...
Article
Full-text available
Scientific claims in biomedical research are typically derived from statistical analyses. However, misuse or misunderstanding of statistical procedures and results permeate the biomedical literature, affecting the validity of those claims. One approach journals have taken to address this issue is to enlist expert statistical reviewers. How many jou...
Preprint
For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014-2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one “major numerical discrepancy...
Preprint
Scientific claims in biomedical research are typically derived from statistical analyses. However, misuse and misunderstanding of statistical procedures and results permeates the biomedical literature, affecting the validity of those claims. One approach journals have taken to address this issue is to enlist expert statistical reviewers. How many j...
Article
Full-text available
Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility. Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends refl...
Preprint
Preregistration is the act of submitting a study plan, ideally also with an analysis plan, to a registry prior to conducting the work. Preregistration increases the discoverability of research even if it is not subsequently published. Adding specific analysis plans can clarify the distinction between planned, confirmatory tests and unplanned, explo...
Preprint
Psychological science is navigating an unprecedented period of introspection about the credibility and utility of its research. A number of reform initiatives aimed at increasing adoption of transparency and reproducibility-related research practices appear to have been effective in specific contexts; however, their broader, collective impact amids...
Article
While some scientists study insects, molecules, brains, or clouds, other scientists study science itself. Meta-research, or research-on-research, is a burgeoning discipline that investigates efficiency, quality, and bias in the scientific ecosystem, topics that have become especially relevant amid widespread concerns about the credibility of the sc...
Article
Full-text available
Teachers around the world hold a considerable number of misconceptions about education. Consequently, schools can become epicenters for dubious practices that might jeopardize the quality of teaching and negatively influence students’ wellbeing. The main objective of this study was to assess the efficacy of refutation texts in the correction of err...
Preprint
Teachers around the world hold a considerable number of misconceptions about education. Consequently, schools can become epicenters for dubious practices that might jeopardize the quality of teaching and negatively influence students’ wellbeing. The main objective of this study was to assess the efficacy of refutation texts in the correction of err...
Preprint
Preregistration clarifies the distinction between planned and unplanned research by reducing unnoticed flexibility. This improves credibility of findings and calibration of uncertainty. However, making decisions before conducting analyses requires practice. During report writing, respecting both what was planned and what actually happened requires...
Article
Petitions have a long history of being used for political, social, ethical, and injustice issues; however, it is unclear how/whether they should be implemented in scientific argumentation. Recently, an extremely influential commentary published in Nature (Amrhein et al., 2019) calling for the abandonment of “statistical significance” was signed by...
Article
Preregistration clarifies the distinction between planned and unplanned research by reducing unnoticed flexibility. This improves credibility of findings and calibration of uncertainty. However, making decisions before conducting analyses requires practice. During report writing, respecting both what was planned and what actually happened requires...
Article
Full-text available
Readers of peer-reviewed research may assume that the reported statistical analyses supporting scientific claims have been closely scrutinized and surpass a high-quality threshold. However, widespread misunderstanding and misuse of statistical concepts and methods suggests that suboptimal or erroneous statistical practice is routinely overlooked du...
Preprint
KEY MESSAGES • Petitions have a long history of being used for political, social, ethical, and injustice issues; however, it is unclear how/whether they should be implemented in scientific argumentation. • Recently, an extremely influential commentary published in Nature (Amrhein et al., 2019) calling for the abandonment of “statistical significance”...
Preprint
Whilst some scientists study insects, molecules, brains, or clouds, other scientists study science itself. Meta-research, or “research-on-research”, is a burgeoning discipline that investigates efficiency, quality, and bias in the scientific ecosystem, topics that have become especially relevant amid widespread concerns about the credibility of the...
Preprint
Serious concerns about research quality have catalyzed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency, and enhance research credibility. Meta-research has evaluated the merits of individual initiatives; however, this may not capture broader trends reflecti...
Article
Registered reports present a substantial departure from traditional publishing models with the goal of enhancing the transparency and credibility of the scientific literature. We map the evolving universe of registered reports to assess their growth, implementation and shortcomings at journals across scientific disciplines.
Article
Full-text available
Access to data is a critical feature of an efficient, progressive and ultimately self-correcting scientific ecosystem. But the extent to which in-principle benefits of data sharing are realized in practice is unclear. Crucially, it is largely unknown whether published findings can be reproduced by repeating reported analyses upon shared data (‘anal...
Article
Full-text available
The vast majority of scientific articles published to date have not been accompanied by concomitant publication of the underlying research data upon which they are based. This state of affairs precludes the routine re-use and re-analysis of research data, undermining the efficiency of the scientific enterprise, and compromising the credibility of c...
Article
Replication is the cornerstone of science – but when and why? Not all studies need replication, especially when resources are limited. We propose that a decision-making framework based on Bayesian philosophy of science provides a basis for choosing which studies to replicate.
Article
Full-text available
The credibility of scientific claims depends upon the transparency of the research products upon which they are based (e.g., study protocols, data, materials, and analysis scripts). As psychology navigates a period of unprecedented introspection, user-friendly tools and services that support open science have flourished. However, the plethora of de...
Preprint
Selection pressures for significant results may infuse bias into the research process. We evaluated the implementation of one innovation designed to mitigate this bias, ‘Registered Reports’, where study protocols are peer-reviewed and granted in-principle acceptance (IPA) for publication before the study has been conducted. As of February 2018, 91...
Preprint
Full-text available
The credibility of scientific claims depends upon the transparency of the research products upon which they are based (e.g., study protocols, data, materials, and analysis scripts). As psychology navigates a period of unprecedented introspection, user-friendly tools and services that support open science have flourished. There has never been a bett...
Preprint
Access to research data is a critical feature of an efficient, progressive, and ultimately self-correcting scientific ecosystem. But the extent to which in-principle benefits of data sharing are realized in practice is unclear. Crucially, it is largely unknown whether published findings can be reproduced by repeating reported analyses upon shared d...
Preprint
Replication is the cornerstone of science – but when and why? Not all studies need replication, especially when resources are limited. We propose that a decision-making framework based on Bayesian philosophy of science provides a basis for choosing which studies to replicate.
Article
Iyadurai et al.¹ report a randomized controlled trial intended to mitigate post-trauma symptoms in motor vehicle collision (MVC) survivors presenting to an emergency department. The intervention, motivated by memory ‘(re)consolidation’ theory, consisted of a brief reminder of the accident followed by playing the computer game Tetris.
Preprint
Full-text available
Beginning January 2014, Psychological Science gave authors the opportunity to signal open data and materials if they qualified for badges that accompanied published articles. Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in t...
Article
Full-text available
Openness is a core value of scientific practice. The sharing of research materials and data facilitates critique, extension, and application within the scientific community, yet current norms provide few incentives for researchers to share evidence underlying scientific claims. In January 2014, the journal Psychological Science adopted such...
Article
Full-text available
When a series of studies fails to replicate a well-documented effect, researchers might be tempted to use a ‘vote counting’ approach to decide whether or not the effect is reliable – that is, simply comparing the number of successful and unsuccessful replications. Vohs’ (2015) response to the absence of money priming effects reported by Rohrer, Pas...
Article
Significance: Reconsolidation-updating theory suggests that existing memory traces can be modified, or even erased, by postretrieval new learning. Compelling empirical support for this claim could have profound theoretical, clinical, and ethical implications. However, demonstrating reconsolidation-mediated memory updating in humans has proved partic...
Article
Full-text available
It is becoming increasingly clear that science has sailed into troubled waters. Recent revelations about cases of serious research fraud and widespread ‘questionable research practices’ have initiated a period of critical self-reflection in the scientific community and there is growing concern that several common research practices fall far short o...

Citations

... On a more general note, it should be borne in mind that preregistration has a variety of benefits that help increase scientific rigor, but it is not a panacea (as argued by [87]). Therefore, preregistration should ideally be used together with complementary open science practices that increase transparency and confidence (see [87] for an overview). ...
... We see that statisticians and philosophers of science have an ongoing debate about the role of statistical significance in statistical inference [2,14,16,18,19]. Beyond the controversies, it seems clear that all statisticians encourage a better use of hypothesis tests and p-values by researchers. ...
... Finally, this study also sheds light on psychometric and reproducibility issues in this area (Nosek et al., 2022). Specifically, a sample size derivation based on multilevel analyses and various analytical methods, including Bayesian analysis, was devised (Arend & Schäfer, 2019; Etz & Vandekerckhove, 2018). ...
... as significant), but it could also be done mildly (e.g., suggesting positive effects from numerical trends although the authors stated that the change was not statistically significant). In addition, studies with positive results are more likely to be cited than null or negative findings (i.e., citation bias), which might make it more difficult to discover negative results (Hardwicke et al., 2021). The same happens with contradictory replications and critical commentaries. ...
... Yet, looking at the fraction of published studies that were actually preregistered paints a different picture. In their recent study, Hardwicke et al. [25] found that only 3% of 188 examined articles from 2014 to 2017, randomly sampled from the literature, included a preregistration statement, which contradicts the more positive outlook of [22,24]. Stürmer et al. [26] also found that when early career researchers were asked about questionable research practices and open science, they deemed many open science practices necessary, yet toward preregistration they expressed more reluctance: only about half of the participants found that preregistration was fairly necessary or very necessary, and even fewer indicated that they planned to consider preregistering their studies in the near future. ...
... Beyond the costs to productivity, the inaccessibility and instability of many neuroimaging tools pose a wider threat to reproducibility [9][10][11][12][13][14][15][16]. The Transparency and Openness Promotion (TOP) guidelines, which to date have over 5,000 journals and ...
... Having access to the data underlying scientific studies increases the opportunity for researchers and other stakeholders to explore, validate and reproduce published findings 7,11. In some cases, unidentified errors in the data and/or analyses that slipped through the peer-review process can be detected 1,12. These errors may vary in their severity, ranging from small differences in reported effect sizes to major changes in the magnitude and/or direction of effects. ...
... The knowledge of these tests will provide learners with both the ability to perform biostatistics and to evaluate published research critically. As demonstrated by a survey conducted by Hardwicke and Goodman in 2020, <25% of authors used specialized statistical review in their published articles, highlighting the need for physicians and researchers to be able to critically evaluate published statistical research [10]. While there are deficits in statistical knowledge, students and residents have expressed the importance of understanding statistical concepts and incorporating research into their current or future practice [5,11]. ...
... Even with the assistance of the original authors, results from 13 of those articles were still not reproducible. Unpublished work by Artner and colleagues examining papers in four different journals showed similar results, as did a recent study by Hardwicke et al. [15] involving articles appearing in Psychological Science. Thus, we can safely assume that many more corrections would need to be published if all errors were detected. ...