Patient-Oriented Cancer Information on the Internet: A Comparison of Wikipedia and a Professionally Maintained Database

Bruce and Ruth Rappaport Faculty of Medicine, Technion-Israel Institute of Technology, Haifa, Israel.
Journal of Oncology Practice 09/2011; 7(5):319-23. DOI: 10.1200/JOP.2010.000209
Source: PubMed


A wiki is a collaborative Web site, such as Wikipedia, that can be freely edited. Because of a wiki's lack of formal editorial control, we hypothesized that its content would be less complete and accurate than that of a professional peer-reviewed Web site. In this study, the coverage, accuracy, and readability of cancer information on Wikipedia were compared with those of the patient-oriented National Cancer Institute's Physician Data Query (PDQ) comprehensive cancer database.
For each of 10 cancer types, medically trained personnel scored PDQ and Wikipedia articles for accuracy and presentation of controversies by using an appraisal form. Reliability was assessed by using interobserver variability and test-retest reproducibility. Readability was calculated from word and sentence length.
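The readability measure reported below is a Flesch-Kincaid grade level. As a minimal illustration of how such a score is computed (this sketch is not the authors' tooling), the standard formula is 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59; the syllable counter below is a rough vowel-group heuristic, whereas production readability tools use dictionaries or more careful rules.

    import re

    def count_syllables(word):
        # Rough heuristic: count runs of consecutive vowels, with a
        # common correction for a silent trailing 'e'.
        word = word.lower()
        syllables = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and syllables > 1:
            syllables -= 1
        return max(syllables, 1)

    def flesch_kincaid_grade(text):
        # Standard Flesch-Kincaid grade-level formula:
        # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * (len(words) / max(len(sentences), 1))
                + 11.8 * (syllables / max(len(words), 1))
                - 15.59)

    print(flesch_kincaid_grade("The cat sat on the mat. It was a sunny day."))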
Evaluators were able to assess articles rapidly (18 minutes/article), with a test-retest reliability of 0.71 and interobserver variability of 0.53. For both Web sites, inaccuracies were rare, found in less than 2% of the information examined. PDQ was significantly more readable than Wikipedia: Flesch-Kincaid grade level 9.6 versus 14.1. There was no difference in depth of coverage between PDQ and Wikipedia (29.9 and 34.2, respectively; maximum possible score 72). Controversial aspects of cancer care were relatively poorly discussed in both resources (2.9 and 6.1 for PDQ and Wikipedia, respectively; difference not significant; maximum possible score 18). A planned subanalysis comparing common and uncommon cancers demonstrated no difference.
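The abstract does not state which statistic underlies the reliability figures. One simple, commonly used option is a Pearson correlation between paired scores: the same evaluator's two scoring passes for test-retest reproducibility, and two evaluators' scores of the same articles for interobserver agreement. The sketch below uses hypothetical score arrays purely for illustration; neither the numbers nor the choice of coefficient is taken from the study.

    from statistics import correlation  # Pearson's r; Python 3.10+

    # Hypothetical appraisal scores for ten articles (not study data).
    rater_a_pass1 = [30, 28, 35, 22, 31, 27, 33, 29, 26, 34]
    rater_a_pass2 = [29, 30, 34, 24, 30, 26, 35, 28, 27, 33]  # same rater, second pass
    rater_b = [27, 25, 36, 20, 33, 24, 30, 31, 23, 35]        # a second rater

    # Test-retest reproducibility: one rater against their own later scores.
    print("test-retest r =", round(correlation(rater_a_pass1, rater_a_pass2), 2))
    # Interobserver agreement: two raters scoring the same articles.
    print("interobserver r =", round(correlation(rater_a_pass1, rater_b), 2))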
Although the wiki resource had accuracy and depth of coverage similar to those of the professionally edited database, it was significantly less readable. Further research is required to assess how this difference influences patients' understanding and retention.

    • "Overall, the studies that have examined the writing style and readability of Wikipedia articles have generally found that they are at least as easy to read as their online and offline counterparts (Elia, 2009); however, results were varied when specific knowledge domains were investigated (Korosec et al., 2010; Rajagopalan et al., 2010, 2011). Wikipedia's writing style was found to be inconsistent (West & Williamson, 2009), especially concerning international topics (Dalby, 2007). "
    ABSTRACT: Wikipedia may be the best-developed attempt thus far to gather all human knowledge in one place. Its accomplishments in this regard have made it a point of inquiry for researchers from different fields of knowledge. A decade of research has thrown light on many aspects of the Wikipedia community, its processes, and its content. However, due to the variety of fields inquiring about Wikipedia and the limited synthesis of the extensive research, there is little consensus on many aspects of Wikipedia's content as an encyclopedic collection of human knowledge. This study addresses the issue by systematically reviewing 110 peer-reviewed publications on Wikipedia content, summarizing the current findings, and highlighting the major research trends. Two major streams of research are identified: the quality of Wikipedia content (including comprehensiveness, currency, readability, and reliability) and the size of Wikipedia. Moreover, we present the key research trends in terms of the domains of inquiry, research design, data source, and data gathering methods. This review synthesizes scholarly understanding of Wikipedia content and paves the way for future studies.
    Journal of the Association for Information Science and Technology 02/2015; 66(2). DOI:10.1002/asi.23172
    • "One the one hand, the crowd allows for a wider range of expertise and preferences to be leveraged in group decision‐making, suggesting that 'collective wisdom' might make evaluation more accurate, reducing information frictions and providing greater efficiency in financing decisions. For example, studies on forecasting have supported the idea that experts are no more accurate than informed amateurs (Tetlock, 2005), while examination of Wikipedia has shown that it compares favorably on many dimensions to expert‐created content (Clauson, Polen, Boulos, & Dzenowagis, 2008; Giles, 2005; Rajagopalan et al., 2011). Further, recent research on Wikipedia has shown that while there may be political biases in certain articles (Greenstein & Zhu, 2012), as these articles are more heavily edited by the crowd, they ultimately become less biased than similar work produced by the experts at Encyclopedia Britannica (Greenstein & Zhu, 2014). "
    ABSTRACT: In fields as diverse as technology entrepreneurship and the arts, crowds of interested stakeholders are increasingly responsible for deciding which innovations to fund, a privilege that was previously reserved for a few experts, such as venture capitalists and grant‐making bodies. Little is known about the degree to which the crowd differs from experts in judging which ideas to fund, and, indeed, whether the crowd is even rational in making funding decisions. Drawing on a panel of national experts and comprehensive data from the largest crowdfunding site, we examine funding decisions for proposed theater projects, a category where expert and crowd preferences might be expected to differ greatly. We instead find substantial agreement between the funding decisions of crowds and experts. Where crowds and experts disagree, it is far more likely to be a case where the crowd is willing to fund projects that experts may not. Examining the outcomes of these projects, we find no quantitative or qualitative differences between projects funded by the crowd alone, and those that were selected by both the crowd and experts. Our findings suggest that crowdfunding can play an important role in complementing expert decisions, particularly in sectors where the crowds are end users, by allowing projects the option to receive multiple evaluations and thereby lowering the incidence of "false negatives."
    Management Science 01/2015; DOI:10.2139/ssrn.2443114
    • "Wikipedia provides various medical information. Rajagopalan et al. [23] compared the quality of cancer information found on Wikipedia with a professionally maintained database. Articles of Wikipedia and the professional database have similar depth and accuracy, but the professionally edited articles appeared to be more readable. "
    ABSTRACT: Many patients have difficulty comprehending the written and spoken health information presented to them. Additionally, detailed explanations of medication use and its further implications are too complex and expensive for medical professionals to provide whenever they prescribe medication. An alternative approach that transforms medical information into a more easily understandable form could save costs for medical service providers and help increase patients' adherence. We present a requirements framework for medical information translation systems. Furthermore, the concept, architecture, and actual implementation of a web application leveraging crowdsourcing are illustrated. To demonstrate that the crowdsourcing approach is suitable for improving the comprehensibility of medical information, a proof-of-concept experiment is conducted.
    Multikonferenz Wirtschaftsinformatik 2014, Paderborn, Germany; 02/2014