Article · Publisher preview available

Learning From Mistakes: Teaching Students About Errata, Corrigenda, and Nonretraction Corrections to the Research Literature

Scholarship of Teaching and Learning in Psychology
Authors: Heather A. Haas (The University of Montana Western) and Steven V. Rouse (Pepperdine University)

Abstract

Publication guidelines indicate that when substantive errors in research are noted after the publication of a research report, a correction (also known as an erratum or corrigendum) should be issued. Although such corrections are published with some regularity in psychology and in other scientific fields, the issue of corrections to the scientific literature is seldom addressed in the kinds of research methods textbooks often used in undergraduate psychology courses. This article briefly reviews the existing literature on the frequency and effectiveness of scientific correction generally and extends that literature to the domain of psychology specifically, with an emphasis on what undergraduate students should know.
Keywords: errata, corrigenda, correction notices, research methods, Autism-Spectrum Quotient (AQ)
The so-called replication crisis has encouraged psychologists to turn a critical eye to their own research practices and spurred the development of relevant resources for psychology instructors. The Noba Project, for example, includes a reading on “The Replication Crisis in Psychology” (Diener & Biswas-Diener, 2015), and materials for a 1-hr lecture on the topic have also been made available (Chopik, Bremner, Defever, & Keller, 2018). Eye on Psi Chi published a synopsis of current reform efforts (Chartier, Lewis, & McCarthy, 2018), and the Psi Chi Journal of Psychological Research has published pieces reviewing the replication crisis and describing the way the journal has adopted and adapted the Open Science badge system for articles published in the journal (Rouse, 2017, 2018).
Related changes in research practices have opened new possibilities for psychology instructors. Open Data articles have been repurposed as fodder for reanalysis in undergraduate psychology courses (McIntyre, 2017; see also openstatslab.com), and a number of authors have proposed or described integrating replication attempts into undergraduate research methods classes (Asendorpf et al., 2013; Edlund, 2016; Frank & Saxe, 2012; Grahe, 2017; Grahe et al., 2012; Grahe, Brandt, IJzerman, & Cohoon, 2014; Koole & Lakens, 2012; Lenne & Mann, 2016; Standing, 2016; Standing, Astrologo, Benbow, Cyr-Gauthier, & Williams, 2016; Standing, Grenier, Lane, Roberts, & Sykes, 2014).1 Although these shifts in prac-

1 The extent to which students benefit from participation in replication attempts appears to require further empirical demonstration, however (Jern, 2018).
Heather A. Haas, Department of History, Philosophy, and Social Sciences, The University of Montana Western; Steven V. Rouse, Social Science Division, Pepperdine University.

We would like to acknowledge the assistance of Kaylynn Baker, who helped with the review of textbook coverage. We would also like to thank LaGrange College for providing guest researcher access to library databases, and LaGrange College librarians Charlene Baxter and Arthur Robinson for external confirmation of some search results.

Correspondence concerning this article should be addressed to Heather A. Haas, Department of History, Philosophy, and Social Sciences, The University of Montana Western, Dillon, MT 59725. E-mail: heather.haas@umwestern.edu
Scholarship of Teaching and Learning in Psychology, 2022, Vol. 8, No. 1, 58–69. © 2020 American Psychological Association. ISSN: 2332-2101. http://dx.doi.org/10.1037/stl0000216. This article was published Online First September 10, 2020.
... Our review demonstrated that there is a plethora of peer-reviewed articles in this area. Haas & Rouse [91], for example, recommend that students be introduced to the concept of correction in the scientific record and taught how to identify and interpret correction notices during a literature search. One of the key requirements of a reliable scientific record is that publishers issue corrections whenever substantive errors in published research are identified. ...
Article
Full-text available
In recent years, the scientific community has called for improvements in the credibility, robustness and reproducibility of research, characterized by increased interest and promotion of open and transparent research practices. While progress has been positive, there is a lack of consideration about how this approach can be embedded into undergraduate and postgraduate research training. Specifically, a critical overview of the literature which investigates how integrating open and reproducible science may influence student outcomes is needed. In this paper, we provide the first critical review of literature surrounding the integration of open and reproducible scholarship into teaching and learning and its associated outcomes in students. Our review highlighted how embedding open and reproducible scholarship appears to be associated with (i) students' scientific literacies (i.e. students’ understanding of open research, consumption of science and the development of transferable skills); (ii) student engagement (i.e. motivation and engagement with learning, collaboration and engagement in open research) and (iii) students' attitudes towards science (i.e. trust in science and confidence in research findings). However, our review also identified a need for more robust and rigorous methods within pedagogical research, including more interventional and experimental evaluations of teaching practice. We discuss implications for teaching and learning scholarship.
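One of the practices recommended in the citing excerpt above, teaching students to identify and interpret correction notices when they turn up in a literature search, can be made concrete with a short exercise. The sketch below is a minimal illustration only: the CSV export and its column names ("title", "doi", "year") are hypothetical, and the keyword filter is a deliberately crude first pass rather than an authoritative detection method.

```python
import csv
import re

# Crude filter: titles of nonretraction corrections typically contain one of these
# words, but real notices vary in wording, so this is only a first-pass screen.
CORRECTION_MARKERS = re.compile(r"\b(erratum|corrigendum|correction)\b", re.IGNORECASE)

def flag_correction_notices(csv_path):
    """Return rows from a hypothetical database export whose titles look like
    correction notices. Assumes columns named 'title', 'doi', and 'year'."""
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if CORRECTION_MARKERS.search(row.get("title", "")):
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    for record in flag_correction_notices("search_export.csv"):
        print(f"{record['year']}  {record['doi']}  {record['title']}")
```

A natural follow-up for students is to pair each flagged notice with the article it corrects (for example, by the DOI or title cited in the notice) and read the two together.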
... Our review demonstrated that there is a plethora of peer-reviewed articles in this area. Haas and Rouse (2022) recommend that students be introduced to the concept of correction in the scientific record, associated terminology, and taught how to identify and interpret correction notices, when encountered during a literature search. One of the key requirements of a reliable scientific record is that publishers issue corrections whenever substantive errors in published research are identified. ...
Preprint
Full-text available
In recent years, the scientific community has called for improvements in the credibility, robustness, and reproducibility of research, characterized by higher standards of scientific evidence, increased interest in open practices, and promotion of transparency. While progress has been positive, there is a lack of consideration about how this approach can be embedded into undergraduate and postgraduate research training. Currently, the impact of integrating an open and reproducible approach into the curriculum on student outcomes is not well articulated in the literature. Therefore, in this paper, we provide the first comprehensive review of how integrating open and reproducible scholarship into teaching and learning may impact students, using a large-scale, collaborative, team-science approach. Our review highlighted how embedding open and reproducible scholarship may impact: (1) students’ scientific literacies (i.e., students’ understanding of open research, consumption of science, and the development of transferable skills); (2) student engagement (i.e., motivation and engagement with learning, collaboration, and engagement in open research), and (3) students’ attitudes towards science (i.e., trust in science and confidence in research findings). Our review also identified a need for more robust and rigorous methods within evaluations of teaching practice. We discuss implications for teaching and learning scholarship in this area.
... There is also a plethora of recent evidence that supports the need for incorporating this approach into undergraduate and postgraduate training (e.g., Button, 2018; FORRT, 2019; Pownall, 2020). This has led to discussions related to teaching undergraduate students about the factors that have contributed to the "replication crisis," which is the growing concern about the lack of successful replications of published research (Chopik et al., 2018; Haas & Rouse, 2020). Similarly, there have been efforts to address QRPs in student research (Sacco & Brown, 2019; Strand & Brown, 2019; Wagge et al., 2019), and considerations of how to integrate this approach across teaching curricula (Frank & Saxe, 2012; Frankowski, 2021; Hanna et al., 2021; Sarafoglou et al., 2020). ...
Article
Full-text available
Recently, there has been a growing emphasis on embedding open and reproducible approaches into research. One essential step in accomplishing this larger goal is to embed such practices into undergraduate and postgraduate research training. However, this often requires substantial time and resources to implement. Also, while many pedagogical resources are regularly developed for this purpose, they are not often openly and actively shared with the wider community. The creation and public sharing of open educational resources is useful for educators who wish to embed open scholarship and reproducibility into their teaching and learning. In this article, we describe and openly share a bank of teaching resources and lesson plans on the broad topics of open scholarship, open science, replication, and reproducibility that can be integrated into taught courses to support educators and instructors. These resources were created as part of the Society for the Improvement of Psychological Science (SIPS) hackathon at the 2021 Annual Conference, and we detail this collaborative process in the article. By sharing these open pedagogical resources, we aim to reduce the labor required to develop and implement open scholarship content to further the open scholarship and open educational materials movement.
... There is also a plethora of recent evidence that supports the need for incorporating this approach into undergraduate and postgraduate training (e.g., Button, 2018; FORRT, 2019; Pownall, 2020). This has led to discussions related to teaching undergraduate students about the 'replication crisis' and researcher degrees of freedom (Chopik et al., 2018; Haas & Rouse, 2020), a concern for reducing questionable research practices in student research (Sacco & Brown, 2019; Strand & Brown, 2019; Wagge et al., 2019), and efforts to integrate this approach across teaching curricula (Frank & Saxe, 2012; Frankowski, 2021; Galati & Markant, 2018; Hanna et al., 2021; Sarafoglou et al., 2020). Likewise, there have been recent proposals to respond to these concerns through development of best practice guides (e.g., ...
Preprint
Full-text available
Recently, there has been a growing emphasis on embedding open and reproducible approaches into research. One essential step in accomplishing this larger goal is to embed such practices into undergraduate and postgraduate research training. However, this often requires substantial time and resources to implement. Also, while many pedagogical resources are regularly developed for this purpose, they are not often openly and actively shared with the wider community. The creation and public sharing of open educational resources is useful for educators who wish to embed open scholarship and reproducibility into their teaching and learning. In this article, we describe and openly share a bank of teaching resources and lesson plans on the broad topics of open scholarship, open science, replication, and reproducibility that can be integrated into taught courses, to support educators and instructors. These resources were created as part of the Society for the Improvement of Psychological Science (SIPS) hackathon at the 2021 Annual Conference, and we detail this collaborative process in the article. By sharing these open pedagogical resources, we aim to reduce the labour required to develop and implement open scholarship content to further the open scholarship and open educational materials movement.
Article
Full-text available
Purpose: This paper aims to investigate the characterization of corrections to the papers published in Library and Information Science (LIS) journals during 2006–2015. It studies the frequency and location of the published errors, the time interval between the publication of the original papers and their corrections, as well as associations between journals' impact factors (IF) and their correction rates. Design/methodology/approach: The population of the study comprised 369 errata published in 50 LIS journals. The data were obtained from Clarivate Analytics' Web of Science (WoS) and Journal Citation Reports. Findings: The results of the study revealed a correction rate of 0.37 per cent for LIS journals, which is substantially lower than that of the 124 subject categories with at least one erratum in the WoS. Among the countries with the highest number of errata in LIS journals, the USA ranked first, followed by China and England. However, the greatest share of errata relative to a country's overall LIS publications was seen in Kazakhstan, Russia and Botswana. Results showed that no statistically significant relationship existed between journals' IF and their correction rates. The highest proportion of errors in the LIS literature occurred in authors' information, references, tables and figures. Moreover, the average time from publication of the original articles to their corresponding errata was 8.7 months. Social implications: Correcting unintentional mistakes in scholarly articles is an ethical responsibility of researchers and journal editors. Originality/value: The current research investigates the characteristics of errata in the LIS field.
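The per-journal correction rate and the impact-factor association described in this abstract amount to a ratio and a correlation. The sketch below works through that arithmetic on invented numbers; the journal names, counts, and the choice of a Spearman correlation are illustrative assumptions, not details taken from the study.

```python
from scipy.stats import spearmanr

# Hypothetical per-journal counts: (errata published, total articles, impact factor).
journals = {
    "Journal A": (12, 3200, 2.1),
    "Journal B": (4, 1500, 1.3),
    "Journal C": (9, 2100, 3.0),
    "Journal D": (2, 1800, 0.9),
}

# Correction rate = errata / total published articles, expressed as a percentage.
rates = {name: 100 * errata / total for name, (errata, total, _) in journals.items()}
for name, rate in rates.items():
    print(f"{name}: correction rate = {rate:.2f}%")

# Is correction rate associated with impact factor? (Rank-based correlation.)
impact_factors = [jif for (_, _, jif) in journals.values()]
rho, p = spearmanr(list(rates.values()), impact_factors)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```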
Article
Full-text available
The objective of this study is to compare the quantity of citations that retracted and nonretracted articles received in engineering based on articles indexed in the Web of Science database and published between 1945 and 2015. For data analysis, the Statistical Package for the Social Sciences was used along with the Kolmogorov–Smirnov, Mann–Whitney, and Tukey–Kramer tests and descriptive statistics. The data set included 238 retracted and 236 nonretracted articles, with the retracted articles cited 2,348 times and nonretracted articles cited 2,957 times. The results highlight that retraction does not end citation, thus threatening scientific credibility.
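The core comparison in this abstract, citation counts for retracted versus nonretracted articles tested with (among other procedures) a Mann–Whitney test, can be sketched as follows. The citation counts below are simulated; only the shape of the analysis is meant to be illustrative.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Hypothetical citation counts for the two groups of articles.
retracted = rng.poisson(lam=9, size=238)      # 238 retracted articles in the study
nonretracted = rng.poisson(lam=12, size=236)  # 236 nonretracted articles

# Two-sided Mann–Whitney U test on the (skewed, count-valued) citation distributions.
stat, p = mannwhitneyu(retracted, nonretracted, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.4f}")
print(f"median citations: retracted = {np.median(retracted):.0f}, "
      f"nonretracted = {np.median(nonretracted):.0f}")
```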
Article
Full-text available
Some have argued that having students conduct rigorous replications of published studies would provide benefits to both psychological science and the students themselves. However, while it seems clear that replications are beneficial to psychological science, there is little empirical evidence that having students conduct replications provides benefits to students. In this study, I conducted a preliminary test (N = 37) of one purported benefit to students of conducting a classroom replication: the development of scientific critical thinking skills. Students completed a 1-term research methods course centered on a class replication project. I assessed students’ critical thinking development after completing the course. The results were largely inconclusive, showing no significant change in performance between a pretest (M = 11.00, SD = 3.44) and a posttest (M = 10.30, SD = 2.62), t(36) = −1.21, p = .23. This study highlights the need for additional research on the question of whether having students conduct replications provides educational benefits beyond those offered by other pedagogical methods.
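The pre/post comparison reported here is a paired-samples t test on 37 students. A minimal sketch of that analysis appears below; the scores are fabricated to roughly echo the reported summary statistics and are not the study's data.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(42)

# Hypothetical critical-thinking scores for 37 students, before and after the course.
pre = rng.normal(loc=11.0, scale=3.4, size=37)
post = pre + rng.normal(loc=-0.7, scale=2.5, size=37)  # small, noisy change

t, p = ttest_rel(post, pre)  # paired-samples t test, df = n - 1 = 36
print(f"pre  M = {pre.mean():.2f}, SD = {pre.std(ddof=1):.2f}")
print(f"post M = {post.mean():.2f}, SD = {post.std(ddof=1):.2f}")
print(f"t(36) = {t:.2f}, p = {p:.3f}")
```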
Article
This paper describes the novel use of parallel student teams from a research methods course to perform a replication study, and suggests that this approach offers pedagogical benefits for both students and teachers, as well as potentially contributing to a resolution of the replication crisis in psychology today. Four teams, of five undergraduates each, independently attempted exact replications of Study 8 by Gailliot et al. (2007), which reported that participants' self-control is enhanced by consuming a glucose drink. In a 2 × 2 independent groups design, participants (N = 306) first consumed a glucose drink or a placebo, and then wrote about death, intended to deplete their self-control, or dental pain as a control condition. Absolute levels of self-control were lower here than in the target article (shown by more items left unsolved in a word puzzle), but its main result was replicated, since self-control overall was raised by the glucose drink. Also, the teams reliably reported similar effects for the experimental treatments (ICC = .928). Two differences from the target study results were noted: the glucose effect occurred only with female participants, and no effect was found from the writing scenario used.
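The 2 × 2 independent-groups design described above (glucose vs. placebo crossed with a death vs. dental-pain writing task, with unsolved puzzle items as the outcome) would typically be analyzed with a factorial ANOVA. The sketch below shows one way to set up such an analysis; the data frame is simulated and does not reproduce the replication teams' data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_per_cell = 75  # roughly N = 306 split across the four cells

# Simulated data: unsolved puzzle items by drink (glucose/placebo) and essay topic.
rows = []
for drink in ("glucose", "placebo"):
    for topic in ("death", "dental_pain"):
        base = 6.0 if drink == "glucose" else 7.0  # glucose -> fewer unsolved items
        unsolved = rng.normal(loc=base, scale=2.0, size=n_per_cell)
        rows += [{"drink": drink, "topic": topic, "unsolved": u} for u in unsolved]
df = pd.DataFrame(rows)

# 2 x 2 between-subjects ANOVA: main effects of drink and topic, plus their interaction.
model = smf.ols("unsolved ~ C(drink) * C(topic)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```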
Article
It is suggested that replication projects may be valuable in teaching research methods, and also address the current need in psychology for more independent verification of published studies. Their use in an undergraduate methods course is described, involving student teams who performed direct replications of four well-known experiments, yielding results which were subsequently published online. Illustrative data are given for the one successful replication and three failures obtained, and practical suggestions are given for incorporating replication projects into a methods course as an alternative to the usual term project. It is also noted that the published success rates of replication attempts appear to be higher for those studies that were performed as class projects.
Article
Retraction of scholarly publications ensures that unqualified knowledge is purged from the scientific community. However, there appears to be little understanding about how this is practiced among library and information science (LIS) journals. Hence, this study investigated the correction and retraction practices of LIS journals. Journals included in the Web of Science’s information science and library science subject category were selected for the study and the characteristics of the articles corrected or retracted in those journals between 1996 and 2016 were examined. Findings show that there were 517 corrections and five retractions in LIS journals during the period. Most of the corrections made to articles in LIS journals were minor while the reasons for article retraction included plagiarism, duplication, irreproducible results and methodological errors. Our findings also reveal that on average it took about 587 days for an article to be retracted while some of the retracted articles continued to be cited after retraction. The study concluded that the average number of errors per correction was lower than what had been observed in medical journals while some of the retracted articles continued to receive positive post-retraction citations. It also recommended the inclusion of a check on the validity of literature cited by authors at the review stage as part of the quality control mechanism by publishers of LIS journals.
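The average publication-to-retraction lag reported above (about 587 days) is a simple date difference averaged over the retracted articles. A tiny sketch of that computation, using made-up dates, is:

```python
from datetime import date

# Hypothetical (publication date, retraction date) pairs for retracted articles.
cases = [
    (date(2010, 3, 1), date(2011, 9, 20)),
    (date(2012, 6, 15), date(2013, 12, 1)),
    (date(2014, 1, 10), date(2016, 2, 28)),
]

# Days from publication to retraction, then the mean across articles.
lags = [(retracted - published).days for published, retracted in cases]
print(f"mean time to retraction: {sum(lags) / len(lags):.0f} days")
```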
Article
https://www.psichi.org/page/224EyeSum18aChartier#.Xp0xcchKjIV
Article
Over the past 10 years, crises surrounding replication, fraud, and best practices in research methods have dominated discussions in the field of psychology. However, no research exists examining how to communicate these issues to undergraduates and what effect this has on their attitudes toward the field. We developed and validated a 1-hr lecture communicating issues surrounding the replication crisis and current recommendations to increase reproducibility. Pre- and post-lecture surveys suggest that the lecture serves as an excellent pedagogical tool. Following the lecture, students trusted psychological studies slightly less but saw greater similarities between psychology and natural science fields. We discuss challenges for instructors taking the initiative to communicate these issues to undergraduates in an evenhanded way.
Article
One of the key characteristics of a rigorous and robust science is that it has the ability to self-correct when mistakes are made. The focal article by Grand et al. (2018) has plenty to say about what faithful actors can do to ensure a more robust industrial and organizational (I-O) psychology literature, but we were surprised at how little attention the issue of correcting the research record was given. To be clear, the target article mentions the issue of research misconduct and even cites an article from Leadership Quarterly (Atwater, Mumford, Schriesheim, & Yammarino, 2014) that deals specifically with the issue, but it does not include guidance on this issue for editors or publishers. This is not surprising. The issue of misconduct is one that produces a great deal of discomfort in academic circles (Wager, 2015), and when it is discussed, it is usually only in the vaguest terms (e.g., Banks et al., 2016). Moreover, as Grand and colleagues point out, it could be argued that the “gathering storm” of questionable research that has enveloped other fields such as social psychology seems distant. Perhaps to reassure us, the authors point to several articles showing that results of our field seem to be replicable, robust, and relevant. But are they?