The Hypothetical Migraine Drug Comparative Effectiveness Study: A Payer's Recommendations For Obtaining More Useful Results
ABSTRACT: This article explores issues of concern to payers evaluating the hypothetical comparative effectiveness case study of two fictitious migraine treatments in this month's Health Affairs. The case study presents the seemingly paradoxical situation in which randomized controlled trials produce one result, and real-world observational comparative effectiveness research produces another. For the payer making coverage decisions, this scenario raises three major themes related to interpretation and communication. First, there is a need for a well-considered set of criteria that weigh evidence across comparative effectiveness studies to determine whether enough evidence exists to communicate or enact new health care policies. Second, emphasis should be placed on studies that are published or presented in peer-reviewed settings. Third, access to raw comparative effectiveness research data would enable payers to more deeply explore research interests relevant to their particular constituencies. Payers' involvement in comparative effectiveness research should be encouraged, not discouraged, to advance our understanding of what works best and for whom.
ABSTRACT: There may once have been a time when doctors unquestioningly accepted the government's declaration of a drug's effectiveness and when patients unquestioningly accepted the prescriptions of their doctors. That time has passed. Now, information, good and bad, showers from all directions on patients and physicians alike. A filter is needed, and peer review provides the best one. But who or what is this validated information for? Ethically, its primary purpose is to enable patients to make decisions consistent with their values. Providing vetted information in a form that is useful to patients requires an emphasis on comprehensible, comprehensive, trustworthy, verifiable, and transparent communication. The hypothetical comparative effectiveness case study in this month's Health Affairs does not appear to rise to the level that would be helpful to providers or patients.
Health Affairs 10/2012; 31(10):2236-40. DOI:10.1377/hlthaff.2012.0793
ABSTRACT: Comparative effectiveness research evaluates the relative effectiveness, safety, and value of competing treatment options in clinically realistic settings. Such evaluations can be methodologically complex and difficult to interpret. There will be a growing need for critical evaluation of comparative effectiveness studies to assess the adequacy of their design and to put new information into a broader context. Equally important, this knowledge will have to be communicated to clinicians in a way that will actually change practice. We identify three challenges to effective dissemination of comparative effectiveness research findings: the difficulty of interpreting comparative effectiveness research data, the need for trusted sources of information, and the challenge of turning research results into clinical action. We suggest that academic detailing (direct outreach education that gives clinicians an accurate and unbiased synthesis of the best evidence for practice in a given clinical area) can translate comparative effectiveness research findings into actions that improve health care decision making and patient outcomes.
Health Affairs 10/2012; 31(10):2206-12. DOI:10.1377/hlthaff.2012.0817
Journal of Comparative Effectiveness Research 09/2013; 2(5):413-8. DOI:10.2217/cer.13.59