Amanda R Patrick

Harvard Medical School, Boston, Massachusetts, United States

Publications (61) · 400.84 Total Impact

  •
    ABSTRACT: Purpose: Estimating drug effectiveness and safety among older adults in population-based studies using administrative health care claims can be hampered by unmeasured confounding as a result of frailty. A claims-based algorithm that identifies patients likely to be dependent, a proxy for frailty, may improve confounding control. Our objective was to develop an algorithm to predict dependency in activities of daily living (ADL) in a sample of Medicare beneficiaries. Methods: Community-dwelling respondents to the 2006 Medicare Current Beneficiary Survey, >65 years old, with Medicare Part A, B, home health, and hospice claims were included. ADL dependency was defined as needing help with bathing, eating, walking, dressing, toileting, or transferring. Potential predictors were demographics, International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis/procedure and durable medical equipment codes for frailty-associated conditions. Multivariable logistic regression was used to predict ADL dependency. Cox models estimated hazard ratios for death as a function of observed and predicted ADL dependency. Results: Of 6391 respondents, 57% were female, 88% white, and 38% were ≥80. The prevalence of ADL dependency was 9.5%. Strong predictors of ADL dependency were charges for a home hospital bed (OR = 5.44, 95%CI = 3.28–9.03) and wheelchair (OR = 3.91, 95%CI = 2.78–5.51). The c-statistic of the final model was 0.845. Model-predicted ADL dependency of 20% or greater was associated with a hazard ratio for death of 3.19 (95%CI: 2.78, 3.68). Conclusions: An algorithm for predicting ADL dependency using health care claims was developed to measure some aspects of frailty. Accounting for variation in frailty among older adults could lead to more valid conclusions about treatment use, safety, and effectiveness. Copyright © 2014 John Wiley & Sons, Ltd.
    Pharmacoepidemiology and Drug Safety 10/2014; · 2.90 Impact Factor
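
The entry above describes a multivariable logistic model that predicts ADL dependency from claims-based indicators and reports its c-statistic. As a rough illustration of that workflow (not the authors' code), the sketch below fits a logistic regression to simulated claims flags and computes the c-statistic; the predictor names, prevalences, and coefficients are invented.

```python
# Illustrative sketch, not the authors' code: fit a logistic model on simulated
# claims-based indicators and report odds ratios and the c-statistic (ROC AUC).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 6391  # cohort size reported in the abstract

# Hypothetical binary predictors: age >= 80, home hospital bed charge, wheelchair charge
X = rng.binomial(1, [0.38, 0.03, 0.06], size=(n, 3))
logit = -3.0 + 0.8 * X[:, 0] + 1.7 * X[:, 1] + 1.4 * X[:, 2]  # assumed coefficients
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))                 # simulated ADL dependency

model = LogisticRegression().fit(X, y)
print("odds ratios:", np.exp(model.coef_).round(2))
print("c-statistic:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
```
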
  •
    ABSTRACT: Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved.
    Diabetologia 09/2014; · 6.88 Impact Factor
  •
    ABSTRACT: Methods for near-real-time monitoring of new drugs in electronic healthcare data are needed. In a novel application, we prospectively monitored ischemic, bleeding, and mortality outcomes among patients initiating prasugrel versus clopidogrel in routine care during the first 2 years following the approval of prasugrel. Using the HealthCore Integrated Research Database, we conducted a prospective cohort study comparing prasugrel and clopidogrel initiators in the 6 months following the introduction of prasugrel and every 2 months thereafter. We identified patients who initiated antiplatelets within 14 days following discharge from hospitalizations for myocardial infarction (MI) or acute coronary syndrome. We matched patients using high-dimensional propensity scores (hd-PSs) and followed them for ischemic (i.e., MI and ischemic stroke) events, bleed (i.e., hemorrhagic stroke and gastrointestinal bleed) events, and all-cause mortality. For each outcome, we applied sequential alerting algorithms. We identified 1,282 eligible new users of prasugrel and 8,263 eligible new users of clopidogrel between September 2009 and August 2011. In hd-PS matched cohorts, the overall MI rate difference (RD) comparing prasugrel with clopidogrel was -23.1 (95% confidence interval [CI] -62.8 to 16.7) events per 1,000 person-years and RDs were -0.5 (-12.9 to 11.9) and -2.8 (-13.2 to 7.6) for a composite bleed event outcome and death from any cause, respectively. No algorithms generated alerts for any outcomes. Near-real-time monitoring was feasible and, in contrast to the key pre-marketing trial that demonstrated the efficacy of prasugrel, did not suggest that prasugrel compared with clopidogrel was associated with an increased risk of gastrointestinal and intracranial bleeding.
    Drug Safety 02/2014; · 2.62 Impact Factor
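
The monitoring study above builds propensity-score-matched cohorts of prasugrel and clopidogrel initiators. The sketch below shows plain 1:1 nearest-neighbor propensity-score matching on simulated data; it omits the covariate-selection step that distinguishes the high-dimensional propensity score (hd-PS), and all inputs are made up.

```python
# Minimal sketch of 1:1 nearest-neighbor propensity-score matching on simulated
# data. The study used the high-dimensional propensity score (hd-PS), which
# additionally selects empirical covariates from claims; that step is omitted here.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
covs = rng.normal(size=(n, 5))                      # stand-in baseline covariates
treat = rng.binomial(1, 1 / (1 + np.exp(-covs @ np.array([0.5, 0.3, 0.0, -0.2, 0.1]))))

ps = LogisticRegression().fit(covs, treat).predict_proba(covs)[:, 1]

# Greedy nearest-neighbor matching without replacement, within a caliper
caliper = 0.2 * ps.std()
controls = {i: ps[i] for i in np.flatnonzero(treat == 0)}
pairs = []
for i in np.flatnonzero(treat == 1):
    if not controls:
        break
    j = min(controls, key=lambda k: abs(controls[k] - ps[i]))
    if abs(controls[j] - ps[i]) <= caliper:
        pairs.append((i, j))
        del controls[j]
print(f"matched {len(pairs)} treated/control pairs")
```
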
  •
    ABSTRACT: Fractures related to osteoporosis are associated with $20B in cost in the US, with the majority of cost borne by federal health care programs, such as Medicare and Medicaid. Despite the proven fracture reduction benefits of several osteoporosis treatments, less than one-quarter of patients older than 65 years of age who fracture receive such care. A post-fracture liaison service (FLS) has been developed in many health systems but has not been widely implemented in the US. We developed a Markov state-transition computer simulation model to assess the cost-effectiveness of an FLS using a health care system perspective. Using the model, we projected the lifetime costs and benefits of FLS, with or without a bone mineral density test, in men and women who had experienced a hip fracture. We estimated the costs and benefits of an FLS, the probabilities of re-fracture while on osteoporosis treatment, as well as the utilities associated with various health states from published literature. We used multi-way sensitivity analyses to examine the impact of uncertainty in input parameters on the cost-effectiveness of FLS. The model estimates that an FLS would result in 153 fewer fractures (109 hip, 5 wrist, 21 spine, 17 other), 37.43 more quality-adjusted life years (QALYs), and savings of $66,879 per 10,000 post-fracture patients compared with typical post-fracture care. Doubling the cost of the FLS resulted in an incremental cost-effectiveness ratio (ICER) of $22,993 per QALY. The sensitivity analyses showed that results were robust to plausible ranges of input parameters; assuming the least favorable values of each of the major input parameters results in an ICER of $112,877 per QALY. An FLS targeting patients post-hip fracture should result in cost savings and reduced fractures under most scenarios. © 2014 American Society for Bone and Mineral Research.
    Journal of bone and mineral research: the official journal of the American Society for Bone and Mineral Research 01/2014; · 6.04 Impact Factor
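
For reference, the incremental cost-effectiveness ratio (ICER) quoted throughout these abstracts is incremental cost divided by incremental QALYs; a strategy that both saves money and adds QALYs, as the FLS does in the base case above, is "dominant". A minimal arithmetic sketch using the abstract's per-10,000-patient deltas and an arbitrary reference cost (the hypothetical second scenario is invented, not from the paper):

```python
# Back-of-the-envelope ICER arithmetic using the deltas quoted above (per 10,000
# post-fracture patients). Purely illustrative; this is not the Markov model itself.
def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost per incremental QALY; 'dominant' if cheaper and better."""
    d_cost, d_qaly = cost_new - cost_ref, qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly > 0:
        return "dominant (saves money and adds QALYs)"
    return f"${d_cost / d_qaly:,.0f} per QALY"

usual_cost, usual_qaly = 1_000_000.0, 0.0   # arbitrary reference totals
print(icer(usual_cost - 66_879, usual_cost, usual_qaly + 37.43, usual_qaly))   # base case: dominant
print(icer(usual_cost + 500_000, usual_cost, usual_qaly + 37.43, usual_qaly))  # hypothetical costlier programme
```
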
  •
    ABSTRACT: Background: New anticoagulants may improve health outcomes in patients with atrial fibrillation, but it is unclear whether their use is cost-effective. Methods and Results: A Markov state-transition model was created to compare 4 therapies: dabigatran 150 mg BID, apixaban 5 mg BID, rivaroxaban 20 mg QD, and warfarin therapy. The population included those with newly diagnosed atrial fibrillation who were eligible for treatment with warfarin. Compared with warfarin, apixaban, rivaroxaban, and dabigatran cost $93,063, $111,465, and $140,557 per additional quality-adjusted life year gained, respectively. At a threshold of $100,000 per quality-adjusted life year, apixaban provided the greatest absolute benefit while still being cost-effective, although warfarin would be superior if apixaban were 2% less effective than expected. Although apixaban was the optimal strategy in our base case, in probabilistic sensitivity analysis, warfarin was optimal in an equal number of iterations at a cost-effectiveness threshold of $100,000 per quality-adjusted life year. Conclusions: While apixaban appears to be the optimal anticoagulation strategy at a standard cost-effectiveness threshold of $100,000 per quality-adjusted life year, this finding is sensitive to assumptions about its efficacy and cost. In sensitivity analysis, warfarin was the optimal choice in an equal number of simulations. As a result, although all the novel oral anticoagulants produce greater quality-adjusted life expectancy than warfarin, they may not represent good value for money.
    Circulation Cardiovascular Quality and Outcomes 11/2013; · 5.04 Impact Factor
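
The abstract above reports a probabilistic sensitivity analysis in which warfarin and apixaban were each optimal in a comparable share of iterations. A toy version of that procedure is sketched below: strategy costs and QALYs are drawn from distributions and the strategy with the highest net monetary benefit at a $100,000/QALY threshold is tallied per iteration. All inputs are invented, not the paper's.

```python
# Toy probabilistic sensitivity analysis: sample strategy costs and QALYs,
# compute net monetary benefit at a willingness-to-pay threshold, and count how
# often each strategy is optimal. All numbers are placeholders for illustration.
import numpy as np

rng = np.random.default_rng(2)
wtp = 100_000  # willingness to pay per QALY
strategies = {
    # (mean cost, sd cost, mean QALYs, sd QALYs) -- hypothetical inputs
    "warfarin": (60_000, 8_000, 9.50, 0.30),
    "apixaban": (95_000, 9_000, 9.90, 0.30),
}
n_iter = 10_000
wins = dict.fromkeys(strategies, 0)
for _ in range(n_iter):
    nmb = {s: wtp * rng.normal(mq, sq) - rng.normal(mc, sc)
           for s, (mc, sc, mq, sq) in strategies.items()}
    wins[max(nmb, key=nmb.get)] += 1
for s, w in wins.items():
    print(f"{s}: optimal in {w / n_iter:.0%} of iterations")
```
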
  •
    ABSTRACT: Many patients with type 2 diabetes eventually require insulin, yet little is known about the patterns and quality of pharmacologic care received following insulin initiation. Guidelines from the American Diabetes Association and the European Association for the Study of Diabetes recommend that insulin secretagogues such as sulfonylureas be discontinued at the time of insulin initiation to reduce the risk of hypoglycemia, and that treatment be intensified if HbA1c levels remain above target 3 months after insulin initiation. Objective: To describe pharmacologic treatment patterns over time among adults initiating insulin and/or intensifying insulin treatment. Design: Observational study. Participants: A large commercially insured population of adult patients without recorded type 1 diabetes who initiated insulin. Main Measures: We evaluated changes in non-insulin antidiabetic medication use during the 120 days immediately following insulin initiation, rates of increase in insulin dose and/or dosing frequency during the 270 days following an insulin initiation treatment period of 90 days, and rates of insulin discontinuation. Key Results: 7,932 patients initiated insulin during 2003-2008, with the majority (61%) initiating basal insulin only. Metformin (55%), sulfonylureas (39%), and thiazolidinediones (30%) were commonly used prior to insulin initiation. Metformin was continued by 64% of patients following mixed or mealtime insulin initiation; the continuation rate was nearly as high for sulfonylureas (58%). Insulin dose and/or dosing frequency increased among 22.9% of patients. Insulin was discontinued by 27% of patients. Conclusions: We found evidence of substantial departures from guideline-recommended pharmacotherapy. Insulin secretagogues were frequently co-prescribed with insulin. The majority of patients had no evidence of treatment intensification following insulin initiation, although this finding is difficult to interpret without HbA1c levels. While each patient's care should be individualized, our data suggest that the quality of care following insulin initiation can be improved.
    Journal of General Internal Medicine 10/2013; · 3.42 Impact Factor
  •
    ABSTRACT: Key to conducting active drug safety surveillance using longitudinal health care data is determining whether and when there is sufficient evidence to raise a safety alert. We propose to quantify the expected value of the information (VOI) to be gained through continued monitoring in terms of its potential to reduce health losses among future patients and weigh this against the health cost of exposing current patients during continued monitoring. Objective: To apply this sequential VOI approach to monitoring the comparative safety of prasugrel v. clopidogrel on gastrointestinal (GI) bleeding. Methods: We calculated expected health losses assuming expected mortality, nonfatal myocardial infarction (MI), and nonfatal stroke on clopidogrel were 1.27, 5.93, and 1.14 per 100 person-years, using historical data; relative rates on prasugrel were 0.95, 0.76, and 1.02 based on trial data; and MI, stroke, and GI bleed were 9%, 25%, and 0.1% as bad as death, respectively. We assigned gamma prior distributions to the rates of bleeding on clopidogrel and prasugrel to capture baseline uncertainty; in Monte Carlo simulations, prasugrel's efficacy parameters were sampled from distributions. Results: Treating all patients with prasugrel minimized expected health losses, resulting in 475.3 death-equivalents over 25,000 person-years of treatment. Monitoring increased expected losses by 5, and treating all patients with clopidogrel increased losses by 46.4. In Monte Carlo simulation, monitoring on average increased expected losses by 4.6, but a reduction in losses from monitoring was supported within the bounds of uncertainty (95% confidence interval, -0.6 to 11.1). Limitations: Patient heterogeneity and the possibility of updating efficacy parameters during monitoring were not incorporated. Conclusions: The proposed approach integrates expected health harms and benefits of continued monitoring in the decision to raise a safety alert.
    Medical Decision Making 08/2013; · 2.27 Impact Factor
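
The value-of-information framing above weights each adverse event by how bad it is relative to death (MI 9%, stroke 25%, GI bleed 0.1%) and sums the weighted event rates into "death-equivalents". A minimal sketch of that bookkeeping follows; the GI bleed rate and its relative rate are not reported in the abstract, so placeholder values are used.

```python
# Sketch of the "death-equivalents" bookkeeping described in the abstract: event
# rates are weighted by how bad each event is relative to death and summed.
rates_clopidogrel = {"death": 1.27, "mi": 5.93, "stroke": 1.14, "gi_bleed": 2.0}   # per 100 p-y; bleed rate assumed
rel_rate_prasugrel = {"death": 0.95, "mi": 0.76, "stroke": 1.02, "gi_bleed": 1.5}  # bleed relative rate assumed
weight = {"death": 1.0, "mi": 0.09, "stroke": 0.25, "gi_bleed": 0.001}             # severity relative to death

def loss_per_100py(rates):
    return sum(rates[event] * weight[event] for event in rates)

loss_c = loss_per_100py(rates_clopidogrel)
loss_p = loss_per_100py({e: r * rel_rate_prasugrel[e] for e, r in rates_clopidogrel.items()})
person_years = 25_000
print(f"clopidogrel: {loss_c * person_years / 100:.1f} death-equivalents")
print(f"prasugrel:   {loss_p * person_years / 100:.1f} death-equivalents")
```
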
  •
    ABSTRACT: PURPOSE: When using claims data, dichotomous covariates (C) are often assumed to be absent unless a claim for the condition is observed. When available historical data differs among subjects, investigators must choose between using all available historical data versus data from a fixed window to assess C. Our purpose was to compare estimation under these two approaches. METHODS: We simulated cohorts of 20 000 subjects with dichotomous variables representing exposure (E), outcome (D), and a single time-invariant C, as well as varying availability of historical data. C was operationally defined under each paradigm and used to estimate the adjusted risk ratio of E on D via Mantel-Haenszel methods. RESULTS: In the base case scenario, less bias and lower mean square error were observed using all available information compared with a fixed window; differences were magnified at higher modeled confounder strength. Upon introduction of an unmeasured covariate (F), the all-available approach remained less biased in most circumstances and rendered estimates that better approximated those that were adjusted for the true (modeled) value of C in all instances. CONCLUSIONS: In most instances considered, operationally defining time-invariant dichotomous C based on all available historical data, rather than on data observed over a commonly shared fixed historical window, results in less biased estimates. Copyright © 2013 John Wiley & Sons, Ltd.
    Pharmacoepidemiology and Drug Safety 03/2013; · 2.90 Impact Factor
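
The simulation study above adjusts for the confounder C with Mantel-Haenszel methods. For readers unfamiliar with the estimator, a minimal Mantel-Haenszel risk ratio over two strata of C is sketched below with made-up counts.

```python
# Minimal Mantel-Haenszel risk ratio across confounder strata (C = 0/1), the
# estimator named in the abstract; the counts are invented for illustration.
def mh_risk_ratio(strata):
    """strata: list of (exposed_events, exposed_n, unexposed_events, unexposed_n)."""
    num = sum(a * n0 / (n1 + n0) for a, n1, b, n0 in strata)
    den = sum(b * n1 / (n1 + n0) for a, n1, b, n0 in strata)
    return num / den

strata = [
    (30, 1000, 20, 1000),   # stratum C = 0
    (80, 500, 50, 500),     # stratum C = 1
]
print(f"MH risk ratio: {mh_risk_ratio(strata):.2f}")
```
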
  •
    ABSTRACT: Objective: To evaluate the comparative cost-effectiveness of interventions to improve adherence to evidence-based medications among post-myocardial infarction (MI) patients. Data Sources/Study Setting: Cost-effectiveness analysis. Study Design: We developed a Markov model simulating a hypothetical cohort of 65-year-old post-MI patients who were prescribed secondary prevention medications. We evaluated mailed education, disease management, polypill use, and combinations of these interventions. The analysis was performed from a societal perspective over a lifetime horizon. The main outcome was an incremental cost-effectiveness ratio (ICER) as measured by cost per quality-adjusted life year (QALY) gained. Data Collection/Extraction Methods: Model inputs were extracted from published literature. Principal Findings: Compared with usual care, only mailed education had both improved health outcomes and reduced spending. Mailed education plus disease management, disease management, polypill use, polypill use plus mailed education, and polypill use plus disease management cost $74,600, $69,200, $133,000, $113,000, and $142,900 per QALY gained, respectively. In an incremental analysis, only mailed education had an ICER of less than $100,000 per QALY and was therefore the optimal strategy. Polypill use, particularly when combined with mailed education, could be cost-effective, and potentially cost-saving if its price decreased to less than $100 per month. Conclusions: Mailed education and a polypill, once available, may be cost-saving strategies for improving post-MI medication adherence.
    Health Services Research 12/2012; 47(6). · 2.49 Impact Factor
  •
    ABSTRACT: Dabigatran, an oral thrombin inhibitor, and rivaroxaban and apixaban, oral factor Xa inhibitors, have been found to be safe and effective in reducing stroke risk in patients with atrial fibrillation. We sought to compare the efficacy and safety of the 3 new agents based on data from their published warfarin-controlled randomized trials, using the method of adjusted indirect comparisons. We included findings from 44,535 patients enrolled in 3 trials of the efficacy of dabigatran (Randomized Evaluation of Long-Term Anticoagulation Therapy [RELY]), apixaban (Apixaban for Reduction in Stroke and Other Thromboembolic Events in Atrial Fibrillation [ARISTOTLE]), and rivaroxaban (Rivaroxaban Once Daily Oral Direct Factor Xa Inhibition Compared With Vitamin K Antagonism for Prevention of Stroke and Embolism Trial in Atrial Fibrillation [ROCKET-AF]), each compared with warfarin. The primary efficacy end point was stroke or systemic embolism; the safety end point we studied was major hemorrhage. To address a lack of comparability between trial populations caused by the restriction of ROCKET-AF to high-risk patients, we conducted a subgroup analysis in patients with a CHADS2 score ≥3. We found no statistically significant efficacy differences among the 3 drugs, although apixaban and dabigatran were numerically superior to rivaroxaban. Apixaban produced significantly fewer major hemorrhages than dabigatran and rivaroxaban. An indirect comparison of new anticoagulants based on existing trial data indicates that in patients with a CHADS2 score ≥3, dabigatran 150 mg, apixaban 5 mg, and rivaroxaban 20 mg resulted in statistically similar rates of stroke and systemic embolism, but apixaban had a lower risk of major hemorrhage compared with dabigatran and rivaroxaban. Until head-to-head trials or large-scale observational studies that reflect routine use of these agents are available, such adjusted indirect comparisons based on trial data are one tool to guide initial therapeutic choices.
    Circulation Cardiovascular Quality and Outcomes 07/2012; 5(4):480-6. · 5.04 Impact Factor
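
The adjusted indirect comparison used above (often attributed to Bucher) contrasts two drugs through their common warfarin comparator on the log scale, adding the variances of the two trial estimates. A small sketch follows; the hazard ratios and confidence intervals fed into it are placeholders, not the trials' published numbers.

```python
# Sketch of an adjusted indirect (Bucher-style) comparison of drug A vs. drug B,
# each trialled against warfarin. Inputs below are placeholders for illustration.
import math

def indirect_hr(hr_a_vs_w, ci_a, hr_b_vs_w, ci_b, z=1.96):
    se_a = (math.log(ci_a[1]) - math.log(ci_a[0])) / (2 * z)
    se_b = (math.log(ci_b[1]) - math.log(ci_b[0])) / (2 * z)
    log_hr = math.log(hr_a_vs_w) - math.log(hr_b_vs_w)   # A vs. B through the common comparator
    se = math.sqrt(se_a**2 + se_b**2)
    return tuple(math.exp(v) for v in (log_hr, log_hr - z * se, log_hr + z * se))

hr, lo, hi = indirect_hr(0.80, (0.66, 0.96), 0.88, (0.75, 1.03))  # hypothetical inputs
print(f"A vs. B: HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```
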
  •
    ABSTRACT: Several efforts are under way to develop and test methods for prospective drug safety monitoring using large, electronic claims databases. Prospective monitoring systems must incorporate signalling algorithms and techniques to mitigate confounding in order to minimize false positive and false negative signals due to chance and bias. The aim of the study was to describe a prototypical targeted active safety monitoring system and apply the framework to three empirical examples. We performed sequential, targeted safety monitoring in three known drug/adverse event (AE) pairs: (i) paroxetine/upper gastrointestinal (UGI) bleed; (ii) lisinopril/angioedema; (iii) ciprofloxacin/Achilles tendon rupture (ATR). Data on new users of the drugs of interest were extracted from the HealthCore Integrated Research Database. New users were matched by propensity score to new users of comparator drugs in each example. Analyses were conducted sequentially to emulate prospective monitoring. Two signalling rules, a maximum sequential probability ratio test and an effect estimate-based approach, were applied to sequential, matched cohorts to identify signals within the system. Signals were identified for all three examples: paroxetine/UGI bleed in the seventh monitoring cycle, within 2 calendar years of sequential data; lisinopril/angioedema in the second cycle, within the first monitoring year; ciprofloxacin/ATR in the tenth cycle, within the fifth year. In this proof of concept, our targeted, active monitoring system provides an alternative to systems currently in the literature. Our system employs a sequential, propensity score-matched framework and signalling rules for prospective drug safety monitoring and identified signals for all three adverse drug reactions evaluated.
    Drug Safety 04/2012; 35(5):407-16. · 2.62 Impact Factor
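
One of the signalling rules named above is a maximum sequential probability ratio test (MaxSPRT). A hedged sketch of the Poisson form of its log-likelihood ratio is given below; the observed/expected counts and the critical value are placeholders, and in practice the threshold is chosen to control the overall type I error across all monitoring cycles.

```python
# Hedged sketch of one signalling rule of the kind the abstract names: the
# maximised sequential probability ratio test (MaxSPRT) for Poisson counts.
# The critical value and the per-cycle counts below are placeholders.
import math

def poisson_maxsprt_llr(observed, expected):
    """Log-likelihood ratio comparing RR > 1 against RR = 1 for a Poisson count."""
    if observed <= expected or expected <= 0:
        return 0.0
    return (expected - observed) + observed * math.log(observed / expected)

critical_value = 3.0  # placeholder threshold
for cycle, (obs, exp) in enumerate([(2, 1.4), (5, 2.1), (9, 3.0)], start=1):
    llr = poisson_maxsprt_llr(obs, exp)
    flag = "SIGNAL" if llr >= critical_value else "no signal"
    print(f"cycle {cycle}: LLR = {llr:.2f} -> {flag}")
```
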
  •
    ABSTRACT: A previous study suggested an increased risk of preeclampsia among women treated with selective serotonin reuptake inhibitors (SSRIs). Using population-based health-care utilization databases from British Columbia (1997-2006), the authors conducted a study of 69,448 pregnancies in women with depression. They compared risk of preeclampsia in women using SSRIs, serotonin-norepinephrine reuptake inhibitors (SNRIs), or tricyclic antidepressants (TCAs) between gestational weeks 10 and 20 with risk in depressed women not using antidepressants. Among prepregnancy antidepressant users, the authors compared the risk in women who continued antidepressants between gestational weeks 10 and 24 with the risk in those who discontinued. Relative risks and 95% confidence intervals were estimated. The risk of preeclampsia in depressed women not treated with antidepressants (2.4%) was similar to that in women without depression (2.3%). Compared with women with untreated depression, women treated with SSRI, SNRI, and TCA monotherapy had adjusted relative risks of 1.22 (95% confidence interval (CI): 0.97, 1.54), 1.95 (95% CI: 1.25, 3.03), and 3.23 (95% CI: 1.87, 5.59), respectively. Within prepregnancy antidepressant users, the relative risk for preeclampsia among continuers compared with discontinuers was 1.32 (95% CI: 0.95, 1.84) for SSRI, 3.43 (95% CI: 1.77, 6.65) for SNRI, and 3.26 (95% CI: 1.04, 10.24) for TCA monotherapy. Study results suggest that women who use antidepressants during pregnancy, especially SNRIs and TCAs, have an elevated risk of preeclampsia. These associations may reflect drug effects or more severe depression.
    American journal of epidemiology 03/2012; 175(10):988-97. · 4.98 Impact Factor
  •
    ABSTRACT: Multiple studies demonstrate poor adherence to medication regimens prescribed for chronic illnesses, including osteoporosis, but few interventions have been proven to enhance adherence. We examined the effectiveness of a telephone-based counseling program rooted in motivational interviewing to improve adherence to a medication regimen for osteoporosis. We conducted a 1-year randomized controlled clinical trial. Participants were recruited from a large pharmacy benefits program for Medicare beneficiaries. All potentially eligible individuals had been newly prescribed a medication for osteoporosis. Consenting participants were randomized to a program of telephone-based counseling (n = 1046) using a motivational interviewing framework or a control group (n = 1041) that received mailed educational materials. Medication regimen adherence was the primary outcome compared across treatment arms and was measured as the median (interquartile range) medication possession ratio, calculated as the ratio of days with filled prescriptions to total days of follow-up. The groups were balanced at baseline, with a mean age of 78 years; 93.8% were female. In an intention-to-treat analysis, median adherence was 49% (interquartile range, 7%-88%) in the intervention arm and 41% (2%-86%) in the control arm (P = .07, Kruskal-Wallis test). There were no differences in self-reported fractures. In this randomized controlled trial, we did not find a statistically significant improvement in adherence to an osteoporosis medication regimen using a telephonic motivational interviewing intervention.
    Archives of internal medicine 02/2012; 172(6):477-83. · 11.46 Impact Factor
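
The trial above measures adherence as the medication possession ratio (MPR), the ratio of days with filled prescriptions to total days of follow-up. A minimal sketch of that calculation with invented fill data:

```python
# Minimal sketch of the adherence outcome described above: the medication
# possession ratio (MPR). Fill data below are invented for illustration.
def medication_possession_ratio(fills, followup_days):
    """fills: list of days' supply dispensed during follow-up."""
    return min(sum(fills) / followup_days, 1.0)   # often capped at 1.0

fills = [30, 30, 90, 30]          # days' supply per dispensing
print(f"MPR = {medication_possession_ratio(fills, 365):.0%}")
```
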
  •
    ABSTRACT: Little is known about the use of warfarin in hemodialysis (HD) patients with atrial fibrillation (AF). We studied temporal trends of AF among older HD patients, and of warfarin use among those with AF. We linked US Medicare and prescription claims from older patients undergoing HD in 2 Eastern US states. We established annual cohorts of prevalent HD patients; AF was ascertained from >2 claims (>7 days apart) in the same year, with a diagnosis code indicating AF. Among those with AF, we defined current and past warfarin use. Demographic and clinical characteristics were also ascertained for each cohort. We used repeated-measures logistic regression to define the odds of AF and of current or past versus absence of warfarin use. Of 6,563 unique patients, 2,185 were determined to have AF. The prevalence of AF increased from 26% in 1998 to 32% in 2005. In 2005, current warfarin use was present in 24% of AF patients and past use in 25%; 51% had no evidence of any warfarin use. No significant trends in utilization were observed from 1998 through 2005. Patients aged ≥85 years and nonwhites were less likely to have received warfarin; most comorbidities were not associated with warfarin use, although patients with past pulmonary embolism or deep venous thrombosis were more likely to receive warfarin than those without such history. While the prevalence of AF has been increasing among older HD patients, warfarin use was low and unchanged over time, perhaps reflecting the lack of evidence supporting its use.
    Journal of nephrology 12/2011; 25(3):341-53. · 2.00 Impact Factor
  •
    ABSTRACT: Adherence to osteoporosis treatment is low. Although new therapies and behavioral interventions may improve medication adherence, questions are likely to arise regarding their cost-effectiveness. Our objectives were to develop and validate a model to simulate the clinical outcomes and costs arising from various osteoporosis medication adherence patterns among women initiating bisphosphonate treatment and to estimate the cost-effectiveness of a hypothetical intervention to improve medication adherence. We constructed a computer simulation using estimates of fracture rates, bisphosphonate treatment effects, costs, and utilities for health states drawn from the published literature. Probabilities of transitioning on and off treatment were estimated from administrative claims data. Patients were women initiating bisphosphonate therapy from the general community. We evaluated a hypothetical behavioral intervention to improve medication adherence. Changes in 10-yr fracture rates and incremental cost-effectiveness ratios were evaluated. A hypothetical intervention with a one-time cost of $250 that reduced bisphosphonate discontinuation by 30% had an incremental cost-effectiveness ratio (ICER) of $29,571 per quality-adjusted life year in 65-yr-old women initiating bisphosphonates. Although the ICER depended on patient age, intervention effectiveness, and intervention cost, the ICERs were less than $50,000 per quality-adjusted life year for the majority of intervention cost and effectiveness scenarios evaluated. Results were sensitive to bisphosphonate cost and effectiveness and assumptions about the rate at which intervention and treatment effects decline over time. Our results suggest that behavioral interventions to improve osteoporosis medication adherence will likely have favorable ICERs if their efficacy can be sustained.
    The Journal of Clinical Endocrinology and Metabolism 07/2011; 96(9):2762-70. · 6.31 Impact Factor
  •
    ABSTRACT: Sudden cardiac death constitutes the leading cause of death in patients receiving dialysis. Little is known about the trends in implantable cardioverter-defibrillator (ICD) use and the outcomes of such device placement. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: US long-term dialysis patients who received an ICD in 1994-2006. PREDICTORS, OUTCOMES, & MEASUREMENTS: ICD utilization rates and incident rates of all-cause mortality, device infections, and other device-related procedures were measured. We compared mortality between recipients and otherwise similar patients who did not receive such a device using high-dimensional propensity score matching. We also examined the associations of demographics, dialysis type, baseline comorbid conditions, cardiovascular events at the time of admission, and recent infection with the study outcomes. RESULTS: 9,528 patients received an ICD in 1994-2006, with >88% placed after 2000. Almost all ICD use in the 1990s was for secondary prevention; however, half the patients received ICDs for apparent primary prevention in 2006. Mortality rates after implantation were high (448 deaths/1,000 patient-years) and most deaths were cardiovascular. Postimplantation infection rates were high, especially in the first year after implantation (988 events/1,000 patient-years), and were predicted by diabetes and recent infection. Patients receiving ICDs for secondary prevention had an overall 14% (95% CI, 9%-19%) lower mortality risk compared with propensity-matched controls, but these benefits seemed to be restricted to the early postimplantation time. LIMITATIONS: Lack of clinical data, especially for laboratory and heart function studies; residual confounding by indication. CONCLUSIONS: ICD use in dialysis patients is increasing, but rates of all-cause and cardiovascular mortality remain high in dialysis patients receiving these devices. Device infections are common, particularly in patients with recent infections. Randomized trials of ICDs are needed to determine the efficacy, safety, and risk-benefit ratio of these devices in dialysis patients.
    American Journal of Kidney Diseases 06/2011; 58(3):409-17. · 5.76 Impact Factor
  • J J Gagne, A R Patrick, H Mogun, D H Solomon
    ABSTRACT: We examined variations in fracture rates among patients initiated on antidepressant drug treatment as identified from Medicare data in two US states and assessed whether the observed variation could be explained by affinity for serotonin transport receptors. We used Cox proportional hazards models to compare fracture rates of the hip, humerus, pelvis, wrist, and a composite of these, among propensity score-matched cohorts of users of secondary amine tricyclics, tertiary amine tricyclics, selective serotonin reuptake inhibitors (SSRIs), and atypical antidepressants. As compared with secondary amine tricyclics, SSRIs showed the highest association with composite fracture rate (hazard ratio 1.30; 95% confidence interval (CI) 1.12-1.52), followed by atypical antidepressants (hazard ratio 1.12; 95% CI 0.96-1.31) and tertiary amine tricyclics (hazard ratio 1.01; 95% CI 0.87-1.18). The results were robust to sensitivity analyses. Although SSRI use was associated with the highest rate of fractures, variation in fracture risk across specific antidepressant medications did not depend on affinity for serotonin transport receptors.
    Clinical Pharmacology & Therapeutics 06/2011; 89(6):880-7. · 6.85 Impact Factor
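
The fracture analysis above relies on Cox proportional hazards models. Purely as an illustration of that model class (not the study's propensity-score-matched analysis), the sketch below fits a Cox model to simulated exposure, follow-up, and fracture data with the lifelines package; the simulated hazard ratio is an assumption made for the example.

```python
# Illustrative Cox proportional hazards fit (lifelines) on simulated data of the
# general kind used above to compare fracture rates across antidepressant classes.
# The real analysis used propensity-score-matched cohorts; that step is omitted.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000
ssri = rng.binomial(1, 0.5, n)                  # 1 = SSRI, 0 = comparator class
hazard = 0.02 * np.exp(0.26 * ssri)             # assumed true hazard ratio ~ 1.3
time_to_fracture = rng.exponential(1 / hazard)  # simulated event times (years)
censor_time = rng.uniform(0.5, 3.0, n)          # simulated end of observation
df = pd.DataFrame({
    "ssri": ssri,
    "time": np.minimum(time_to_fracture, censor_time),
    "fracture": (time_to_fracture <= censor_time).astype(int),
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="fracture")
cph.print_summary()   # reports the estimated hazard ratio for the ssri indicator
```
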
  •
    ABSTRACT: To explore the "healthy user" and "healthy adherer" effects: hypothetical sources of bias thought to arise when patients who initiate and adhere to preventive therapies are more likely to engage in healthy behaviors than are other subjects. The authors examined the association between statin initiation and adherence, and the subsequent use of preventive health services and incidence of clinical outcomes unlikely to be associated with the need for, or use of, a statin among older enrollees in two state-sponsored drug benefit programs. After adjustment for demographic and clinical covariates, patients who initiated statin use were more likely to receive recommended preventive services than noninitiators matched on age, sex, and state (hazard ratio [HR]: 1.10, 1.06-1.14 for males, HR: 1.09, 1.07-1.11 for females) and appeared to have a lower risk of a range of adverse outcomes (HR: 0.87, 0.85-0.89) thought to be unrelated to statin use. Adherence to a statin regimen was also associated with increased rates of preventive service use and a decreased rate of adverse clinical outcomes (HR: 0.93, 0.88-0.99). These results suggest that patients initiating and adhering to chronic preventive drug therapies are more likely to engage in other health-promoting behaviors. Failure to account for this relationship may introduce bias in any epidemiologic study evaluating the effect of a preventive therapy on clinical outcomes.
    Value in Health 06/2011; 14(4):513-20. · 2.89 Impact Factor
  • William H Shrank, Amanda R Patrick, M Alan Brookhart
    ABSTRACT: The current emphasis on comparative effectiveness research will provide practicing physicians with increasing volumes of observational evidence about preventive care. However, numerous highly publicized observational studies of the effect of prevention on health outcomes have reported exaggerated relationships that were later contradicted by randomized controlled trials. A growing body of research has identified sources of bias in observational studies that are related to patient behaviors or underlying patient characteristics, known as the healthy user effect, the healthy adherer effect, confounding by functional status or cognitive impairment, and confounding by selective prescribing. In this manuscript we briefly review observational studies of prevention that have appeared to reach incorrect conclusions. We then describe potential sources of bias in these studies and discuss study designs, analytical methods, and sensitivity analyses that may mitigate bias or increase confidence in the results reported. More careful consideration of these sources of bias and study designs by providers can enhance evidence-based decision-making.
    Journal of General Internal Medicine 05/2011; 26(5):546-50. · 3.42 Impact Factor
  •
    ABSTRACT: We examined new users of osteoporosis drugs among seniors in Pennsylvania and found no evidence of healthy adherer bias on observed associations between adherence to treatment and non-vertebral fracture risk; we document fracture reduction with better adherence to bisphosphonates, yet no fracture reduction with better adherence to calcitonin or raloxifene. We examined the potential for "healthy adherer bias" when studying the effects of adherence to osteoporosis pharmacotherapy on fracture risk. Based on clinical trial evidence, bisphosphonates, calcitonin, and raloxifene reduce vertebral fracture risk; yet only bisphosphonates are documented to reduce non-vertebral fracture risk. This is a cohort study of older women in Pennsylvania who initiated osteoporosis drugs between 1995 and 2005. We included new users of bisphosphonates, calcitonin, and raloxifene. Adherence was categorized based on a measure of compliance as high [proportion of days covered (PDC) ≥ 80%], intermediate (50% < PDC < 80%), or low (PDC ≤ 50%) according to a 180-day ascertainment period. Non-vertebral fracture rates within 365 days after the ascertainment period were compared between adherence categories (reference = low) using Cox proportional hazard models and adjusting for fracture risk factors. Primary and secondary prevention cohorts were examined separately. Adherence to calcitonin and raloxifene were control analyses. We found little difference in fracture rates between levels of adherence to calcitonin, bisphosphonates for primary prevention, or raloxifene for secondary prevention. We document lower fracture rates among high versus low adherent bisphosphonate users for secondary prevention (HR = 0.53, 95%CI = 0.38-0.74) and higher fracture rates among high versus low adherent raloxifene users for primary prevention (HR = 2.01, 95%CI = 1.04-3.87). We document little evidence of healthy adherer bias when studying the association between better adherence to osteoporosis drugs and fracture risk reduction, with only better adherence to bisphosphonates reducing fracture risk. The higher fracture risk among highly adherent raloxifene users for primary prevention is likely due to residual confounding.
    Osteoporosis International 03/2011; 22(3):943-54. · 4.04 Impact Factor
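
Adherence in the study above is categorised by the proportion of days covered (PDC) over a 180-day window: high (PDC ≥ 80%), intermediate (50% < PDC < 80%), and low (PDC ≤ 50%). A short sketch of that calculation with invented fill records:

```python
# Sketch of the compliance measure described above: proportion of days covered
# (PDC) over a 180-day ascertainment window, then categorised into the abstract's
# high/intermediate/low groups. Fill records below are invented for illustration.
def proportion_days_covered(fills, window_days=180):
    """fills: list of (start_day, days_supply); counts distinct covered days."""
    covered = set()
    for start, supply in fills:
        covered.update(d for d in range(start, start + supply) if 0 <= d < window_days)
    return len(covered) / window_days

def adherence_category(pdc):
    if pdc >= 0.80:
        return "high"
    return "intermediate" if pdc > 0.50 else "low"

fills = [(0, 30), (35, 30), (90, 30), (150, 30)]   # (fill day, days' supply)
pdc = proportion_days_covered(fills)
print(f"PDC = {pdc:.0%} -> {adherence_category(pdc)} adherence")
```
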

Publication Stats

1k Citations
400.84 Total Impact Points

Institutions

  • 2006–2014
    • Harvard Medical School
      • Department of Medicine
      Boston, Massachusetts, United States
  • 2005–2014
    • Brigham and Women's Hospital
      • Department of Medicine
      Boston, Massachusetts, United States
  • 2006–2013
    • Partners HealthCare
      Boston, Massachusetts, United States
  • 2008–2010
    • National Institute of Mental Health (NIMH)
      Maryland, United States
  • 2009
    • University of British Columbia - Vancouver
      • Department of Anesthesiology, Pharmacology and Therapeutics
      Vancouver, British Columbia, Canada
    • Massachusetts General Hospital
      • Department of Psychiatry
      Boston, Massachusetts, United States