Alexander A. Aarts’s research while affiliated with Open University of the Netherlands and other places

What is this page?


This page lists works of an author who doesn't have a ResearchGate profile or hasn't added the works to their profile yet. It is automatically generated from public (personal) data to further our legitimate goal of comprehensive and accurate scientific recordkeeping. If you are this author and want this page removed, please let us know.

Publications (8)


Making the Most of Tenure in Two Acts: An Additional Way to Help Change Incentives in Psychological Science?
  • Article

  • January 2022 · 2 Reads
  • SSRN Electronic Journal
  • Alexander A. Aarts


Estimating the reproducibility of psychological science
  • Article
  • Full-text available

  • August 2015 · 53,376 Reads · 6,375 Citations
  • Science
  • Alexander A. Aarts · Joanna E. Anderson · [...]


Empirically analyzing empirical evidence: One of the central goals in any scientific endeavor is to understand causality. Experiments that seek to demonstrate a cause-and-effect relation most often manipulate the postulated causal factor. Aarts et al. describe replications of 100 experiments reported in papers published in 2008 in three high-ranking psychology journals. Assessing whether the replication and the original experiment yielded the same result according to several criteria, they find that about one-third to one-half of the original findings were also observed in the replication study. Science, this issue: 10.1126/science.aac4716


Estimating the reproducibility of psychological science

  • August 2015 · 438 Reads · 787 Citations

Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

Citations (3)


... For example, open-access publications, along with open data, code, materials, and preprints (Sarabipour et al., 2019) and registered reports, are associated with higher citation rates (Hobson, 2019; Piwowar & Vision, 2013; Pontika, 2015; Sarabipour et al., 2019) and enable faster sharing of scholarly work. Many of these outputs count as scientific products with their own Digital Object Identifiers (DOIs), which can help early-career researchers (ECRs) build their scientific reputation and strengthen their academic CVs and employability (Aarts, 2017; Markowetz, 2015; O'Carroll et al., 2017). Although these practical benefits are not relevant only to feminist ECRs (see Toribio-Flórez et al., 2021), they are particularly useful for ECRs working from this perspective, given that feminist psychology involves reclaiming agency and power in traditional spaces. ...

Reference:

S’engager dans la science ouverte en tant que chercheur·euses féministes en début de carrière
Open Practices Badges for Curricula Vitae: An Additional Way to Help Change Incentives in Psychological Science?
  • Citing Article
  • January 2022

SSRN Electronic Journal

... These scandals spotlight individual labs as exemplars of bad research practice. However, they sit atop a decade of discussion of the so-called 'replication crisis', often centred on fields like psychology and medical research [4]. Staggering statistics in these fields continue to show that the vast majority of research results cannot be reproduced by other researchers. ...

Estimating the reproducibility of psychological science
  • Citing Article
  • August 2015

... Research in psychology and social science has come under fierce criticism for a number of years, and these fields have, in general, been slow to react (Aarts, 2015; Cassidy et al., 2019; Hullman et al., 2022; Meehl, 1990; Scheel, 2022; Sedlmeier & Gigerenzer, 1989; Spellman, 2015; Vowels, 2023a). These critiques span every stage of the research pipeline, from theory development (Borsboom et al., 2021; Oberauer & Lewandowsky, 2019; Scheel, 2022) and measurement (Barry et al., 2014; Flake & Fried, 2020), through data collection (Button et al., 2013; Henrich et al., 2010; Vankov et al., 2014) and model specification and analysis (Hullman et al., 2022; Scheel, 2022; Vowels, 2023a), to the reliability of interpretations (Grosz et al., 2020; Hernan, 2018; Rohrer, 2018; Szucs & Ioannidis, 2017; Yarkoni, 2019). ...

Estimating the reproducibility of psychological science
  • Science