Article

Publication bias in the medical literature: a review by a Canadian Research Ethics Board.

Dalhousie University and the Capital District Health Authority, Halifax, Nova Scotia, Canada.
Canadian Journal of Anaesthesia (Impact Factor: 2.5). 06/2007; 54(5):380-8. DOI: 10.1007/BF03022661
Source: PubMed

ABSTRACT: We reviewed the publication record of all protocols submitted to the Capital District Health Authority Research Ethics Board (REB) in Halifax, Nova Scotia, for the period 1995-1996. Because of heightened awareness of the issue, we hypothesized that there would be less publication bias (a failure to report negative results) and a higher publication rate from completed studies than previously reported.
Closed studies were identified from the REB database. Publications were identified by the investigators, through requests from sponsors, and by a literature review. For each publication, we identified the authors, title, journal, number of subjects enrolled, and whether or not the publication was a report of a randomized clinical trial. Comparisons were made using Student's t test, the chi-square statistic, or Fisher's exact test, as appropriate.
From the database of closed studies, 106 remained unpublished, while completed investigations resulted in 84 publications (a 44% publication rate). The median time to publication was 32.5 months. Of the 84 published trials, 71 reported statistically significant results. Publication rates by department ranged from 91% (anesthesia; 10/11) to 25% (nursing; 2/8; P<0.05). Trials investigating new drugs in Phase 3 or 4 studies were more likely to be published than trials investigating agents in Phase 1 or 2 (P<0.05), and were less likely to be published if sponsored by a pharmaceutical company (P<0.05).
Publication bias continues to be a problem, particularly for early phase investigative studies. Our results suggest that a different approach is required to reduce publication bias. The role that REBs and peer-reviewed journals might play requires further exploration.
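As an aside on the statistical methods named above, the sketch below shows how the departmental contrast quoted in the results (anesthesia 10/11 published vs. nursing 2/8) could be tested with Fisher's exact test. It is a minimal illustration assuming only the counts given in the abstract, written with Python's scipy.stats routines; it is not the authors' original analysis.

    # Fisher's exact test on the 2x2 table of departmental publication counts
    # quoted in the abstract; an illustrative re-analysis, not the authors' code.
    from scipy.stats import fisher_exact

    # Rows: department; columns: [published, unpublished]
    table = [
        [10, 1],  # anesthesia: 10 of 11 protocols published
        [2, 6],   # nursing: 2 of 8 protocols published
    ]

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.4f}")
    # The abstract reports P < 0.05 for this comparison.

With cell counts this small and unbalanced, an exact test is the natural choice over the chi-square statistic, consistent with the "as appropriate" qualification in the methods.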

  • ABSTRACT: Publication bias has existed for about 50 years and has been a concern in the medical research community for almost 20 years. This review briefly summarizes the current status of publication bias, the potential sources from which it may arise, and the common methods used to evaluate it. In the field of translational stroke research, publication bias has long been suspected; however, it has not been addressed with sufficient effort, and its status has remained essentially unchanged over the last decade. The author emphasizes the important role that publishers might play in addressing publication bias.
    Journal of Experimental Stroke and Translational Medicine 01/2009; 2(1):16-21. DOI: 10.6030/1939-067X-2.1.16
  • ABSTRACT: Sociologists of science have argued that, because of the institutional reward system, negative research results, such as failed experiments, may harm scientific careers. We know little, however, about how scientists themselves make sense of negative research findings. Drawing on the sociology of work, the author discusses how researchers involved in a double-blind, placebo-controlled randomized clinical trial for methamphetamine dependency informally and formally interpret the emerging research results. Because the drug tested in the trial was not an effective treatment, the staff considered the trial a failure. In spite of the disappointing results, the staff involved in the daily work with research subjects still reframed the trial as meaningful because they were able to treat people for their drug dependency. The authors of the major publication also framed the results as worthwhile by linking their study to a previously published study in a post hoc analysis. The author concludes that negative research findings offer a collective opportunity to define what scientific work is about, and that the effects of failed experiments depend on individual biography and institutional context.
    Science, Technology & Human Values (Impact Factor: 2.41). 05/2010; 35(1). DOI: 10.1177/0162243910366155
  • ABSTRACT: When the nature and direction of research results affect their chances of publication, a distortion of the evidence base, termed publication bias, results. Despite considerable recent efforts to implement measures to reduce the non-publication of trials, publication bias is still a major problem in medical research. The objective of our study was to identify barriers to and facilitators of interventions to prevent or reduce publication bias. We systematically reviewed the scholarly literature and extracted data from the included articles; further, we performed semi-structured interviews with stakeholders. We then performed an inductive thematic analysis to identify barriers to and facilitators of interventions to counter publication bias. The systematic review identified 39 articles, and 34 of 89 invited interview partners agreed to be interviewed. We clustered interventions into four categories: prospective trial registration, incentives for reporting in peer-reviewed journals or research reports, public availability of individual patient-level data, and peer-review/editorial processes. Barriers identified included economic and personal interests, a lack of financial resources for a global comprehensive trial registry, and differences between legal systems. Facilitators identified included raising awareness of the effects of publication bias, providing incentives to make data publicly available, and implementing laws to enforce prospective registration and reporting of clinical trial results. Publication bias is a complex problem that reflects the complex system in which it occurs. Cooperation among stakeholders to increase public awareness of the problem, better tailoring of incentives to publish, and, ultimately, legislative regulation have the greatest potential for reducing publication bias.
    BMC Health Services Research (Impact Factor: 1.66). 12/2014; 14(1):551. DOI: 10.1186/s12913-014-0551-z