Reviewing Hypothetical Migraine Studies Using Funding Criteria From The Patient-Centered Outcomes Research Institute

Health Affairs (Impact Factor: 4.97). 10/2012; 31(10):2193-9. DOI: 10.1377/hlthaff.2012.0774
Source: PubMed


What role can rigorous observational comparative effectiveness studies play in guiding clinical decision making? What criteria should be used in determining whether the results of such studies should be communicated to clinicians and to patients? We address these questions by considering two hypothetical observational studies in patients with migraine against the backdrop of the review criteria drawn up by the Patient-Centered Outcomes Research Institute (PCORI). These criteria emphasize that patient-centered comparative effectiveness research should exhibit relevance to patients, have great potential to affect practice and improve outcomes, and be conducted using rigorous analytic methods. We conclude that these hypothetical studies would be unlikely to have been funded or communicated by PCORI, and we offer suggestions for improving their relevance and analytic approaches. We also conclude that high-quality observational studies can effectively complement findings from randomized trials, and that communicating their results to patients and clinicians is warranted.

  • ABSTRACT: There may once have been a time when doctors unquestioningly accepted the government's declaration of a drug's effectiveness and when patients unquestioningly accepted the prescriptions of their doctors. That time has passed. Now, information, good and bad, showers from all directions on patients and physicians alike. A filter is needed, and peer review provides the best one. But who or what is this validated information for? Ethically, its primary purpose is to enable patients to make decisions consistent with their values. Providing vetted information in a form that is useful to patients requires an emphasis on comprehensible, comprehensive, trustworthy, verifiable, and transparent communication. The hypothetical comparative effectiveness case study in this month's Health Affairs does not appear to rise to the level that would be helpful to providers or patients.
    Health Affairs (Impact Factor: 4.97). 10/2012; 31(10):2236-40. DOI: 10.1377/hlthaff.2012.0793
  • ABSTRACT: This article explores issues of concern to payers evaluating the hypothetical comparative effectiveness case study of two fictitious migraine treatments in this month's Health Affairs. The case study presents the seemingly paradoxical situation in which randomized controlled trials produce one result, and real-world observational comparative effectiveness research produces another. For the payer making coverage decisions, this scenario raises three major themes related to interpretation and communication. First, there is a need for a well-considered set of criteria that weigh evidence across comparative effectiveness studies to determine whether enough evidence exists to communicate or enact new health care policies. Second, emphasis should be placed on studies that are published or presented in peer-reviewed settings. Third, access to raw comparative effectiveness research data would enable payers to more deeply explore research interests relevant to their particular constituencies. Payers' involvement in comparative effectiveness research should be encouraged, not discouraged, to advance our understanding of what works best and for whom.
    Health Affairs (Impact Factor: 4.97). 10/2012; 31(10):2225-30. DOI: 10.1377/hlthaff.2012.0730
  • ABSTRACT: Comparative effectiveness research evaluates the relative effectiveness, safety, and value of competing treatment options in clinically realistic settings. Such evaluations can be methodologically complex and difficult to interpret. There will be a growing need for critical evaluation of comparative effectiveness studies to assess the adequacy of their design and to put new information into a broader context. Equally important, this knowledge will have to be communicated to clinicians in a way that will actually change practice. We identify three challenges to effective dissemination of comparative effectiveness research findings: the difficulty of interpreting comparative effectiveness research data, the need for trusted sources of information, and the challenge of turning research results into clinical action. We suggest that academic detailing, direct outreach education that gives clinicians an accurate and unbiased synthesis of the best evidence for practice in a given clinical area, can translate comparative effectiveness research findings into actions that improve health care decision making and patient outcomes.
    Health Affairs (Impact Factor: 4.97). 10/2012; 31(10):2206-12. DOI: 10.1377/hlthaff.2012.0817