Patrick Tchou

Cleveland Clinic, Cleveland, Ohio, United States

Publications (244) · 1,335.24 Total Impact Points

  •
    ABSTRACT: Background: Direct electrical cardioversion (DCC) is known to increase the risk of thromboembolism (TE) in patients with atrial fibrillation (AF). However, guidelines and current practice are unclear as to whether this risk exists in patients undergoing DCC <48 hours after AF onset. We aimed to assess TE risk in these patients with and without therapeutic anticoagulation. Methods: All DCCs performed at the Cleveland Clinic between 1996 and 2012 within 48 hours of AF onset were selected from the electrophysiology database and divided into two groups on the basis of anticoagulation status. Extensive chart review was performed to identify major TE complications within one month of DCC. Results: Among 567 DCCs in 510 patients without therapeutic anticoagulation, the mean (SD) CHA2DS2-VASc score was 2.34 (1.66). At the time of DCC, 54% of these patients were on aspirin and 12% were on warfarin with INR ≤1.5. There were 7 reported cerebrovascular accidents (CVAs) (1.22%) within a month after DCC; in these 7 patients, the mean (SD) CHA2DS2-VASc score was 3.57 (1.27), ranging from 2 to 5. In the second group, 901 DCCs were performed in 733 patients who were therapeutic on warfarin (63%) or heparin (37%) at the time of DCC. The mean (SD) CHA2DS2-VASc score was 2.61 (1.73). There were 2 reported CVAs (0.22%, p = 0.016) within a month after DCC, with CHA2DS2-VASc scores of 4 and 6. Of these two patients, one was subtherapeutic (INR = 1.14) when he presented with neurological deficits 18 days after DCC; the other held anticoagulation 10 days after DCC for a surgical procedure and suffered a TE stroke a week later. The non-anticoagulated group therefore had 5.6 times greater odds of thromboembolic stroke within a month of DCC (95% CI: 1.16 to 27.14). Conclusions: In patients with acute-onset atrial fibrillation, the odds of thromboembolic complications were more than 5 times higher in patients who did not receive therapeutic anticoagulation at the time of DCC, despite a lower baseline stroke risk as defined by their CHA2DS2-VASc scores. In addition, the two patients who had strokes in the therapeutically anticoagulated group had interrupted their anticoagulation prior to the stroke.
    Circulation Cardiovascular Quality and Outcomes; 07/2014
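
The 5.6-fold odds ratio and its confidence interval quoted in the abstract above follow directly from the reported event counts (7 of 567 cardioversions without therapeutic anticoagulation vs 2 of 901 with it). A minimal Python sketch of the arithmetic, assuming the standard Woolf (logit) approximation for the confidence interval, which the abstract does not state explicitly:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI (Woolf/logit method) for a 2x2 table:
    a, b = events / non-events in the exposed group;
    c, d = events / non-events in the reference group."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)      # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Event counts per cardioversion as reported in the abstract:
# 7/567 strokes without therapeutic anticoagulation, 2/901 with it.
print(odds_ratio_ci(7, 567 - 7, 2, 901 - 2))
# -> roughly (5.62, 1.16, 27.1), matching the reported 5.6 (95% CI 1.16 to 27.14)
```
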
  •
    ABSTRACT: Despite sparse clinical data, current atrial fibrillation (AF) guidelines favor amiodarone as a drug of choice for patients with left ventricular hypertrophy (LVH). This study tested the hypothesis that patients with persistent AF and LVH on nonamiodarone antiarrhythmics have higher mortality compared to patients on amiodarone. In an observational cohort analysis of patients who underwent cardioversion for AF, patients with LVH, defined as left ventricular wall thickness ≥1.4 cm by echocardiogram prior to their first cardioversion, were included; clinical data, including antiarrhythmic drugs and ejection fraction (LVEF), were collected. Mortality, determined via the Social Security Death Index, was analyzed using Kaplan-Meier and Cox proportional hazards models to determine whether antiarrhythmic drugs were associated with higher mortality. Of 3,926 patients, echocardiographic wall thickness was available in 1,399 (age 66.8 ± 11.8 years, 67% male, LVEF 46 ± 15%, septum 1.3 ± 0.4 cm, posterior wall 1.2 ± 0.2 cm), and 537 (38%) had LVH ≥1.4 cm. Among the 537 patients with LVH, mean age was 67.5 ± 11.7 years, 76.4% were male, and mean LVEF was 48.3 ± 13.3%. Amiodarone was associated with lower survival (log-rank P = 0.001), including after adjusting for age, LVEF, and coronary artery disease (P = 0.023). In propensity-score matched cohorts with LVH treated with no antiarrhythmic drugs, nonamiodarone antiarrhythmic drugs (non-AADs), or amiodarone (N = 65 in each group), there was lower early survival in patients on amiodarone (P = 0.05). Patients with persistent AF and LVH on non-AADs do not have higher mortality compared to patients on amiodarone. Importantly, these findings do not support amiodarone as a superior choice in patients with LVH.
    Pacing and Clinical Electrophysiology 05/2014; · 1.75 Impact Factor
  •
    ABSTRACT: Many patients eligible for cardiac resynchronization therapy (CRT) are over 80 years of age. Survival in this population and how it compares to the general octogenarian population has not been established. We extracted clinical data on a cohort of 800 consecutive patients undergoing the new implantation of a CRT device between April 15, 2004 and August 6, 2007. Patients over age 80, with class III-IV New York Heart Association heart failure symptoms on optimal medical therapy undergoing initial CRT implantation, were included in the final cohort. Using the United States Social Security Period Life Table for 2006, fractional survival for octogenarians in the general population was calculated and matched to our cohort based on age and gender. A comparison was then made between octogenarians undergoing CRT compared to the general population. A total of 95 octogenarians who met inclusion criteria were identified, of whom 86.3% received a biventricular defibrillator and the remainder a biventricular pacemaker. Over a mean follow-up of 3.6 ± 1.5 years, there were 47 deaths (47.4%). The mean survival time was 4.1 years (95% CI 3.7-4.5), and survival at 2 years was 78.9%. Compared to the general octogenarian population, octogenarians receiving CRT had only modestly worse survival over the duration of follow-up with the survival curves diverging at 2 years of follow-up (P = 0.03). Octogenarians with advanced heart failure have a reasonable mean survival time following CRT. All-cause mortality in this patient population is only modestly worse compared to the general octogenarian population. Therefore, in octogenarians deemed to be reasonable candidates, CRT should not be withheld based on age alone.
    Pacing and Clinical Electrophysiology 01/2014; · 1.75 Impact Factor
  •
    ABSTRACT: Background: QRS morphology and duration (QRSd) determine CRT candidate selection, but the criteria require refinement. Objective: To assess the CRT effect according to QRSd, treated by dichotomization vs a continuous function, and its modulation by gender. Methods: Patients selected were NYHA Class III/IV with LBBB and non-ischemic cardiomyopathy (to test a "pure" CRT effect) with pre- and post-implant echocardiographic evaluations. Positive response was defined as improved LVEF post-CRT. Results: In 212 patients (LVEF 19±7.1%; QRSd 160±23 ms; 105 females), CRT improved LVEF to 30±15% (p<0.001) during a median 2 years of follow-up. Positive response occurred in 150/212 (71%). Genders did not differ in QRSd, pharmacotherapy, or comorbidities, but the female CRT response was greater: incidence 84% (88/105) vs 58% (62/107) in males (p<0.001); LVEF improvement 15±14% vs 7.2±13%, respectively (p<0.001). Overall, the response rate was 58% when QRSd <150 ms vs 76% when QRSd ≥150 ms (p=0.009). This probability differed between genders: 86% in women vs 36% in men (p<0.001) for QRSd <150 ms, and 83% vs 69%, respectively, when QRSd ≥150 ms (p=0.05). Thus, female response rates remained high whether QRSd was < or ≥150 ms (86% vs 83%, p=0.77) but differed in males (36% vs 69%, p<0.001). With QRSd as a continuum, the CRT-response relationship was nonlinear and significantly different between genders; female superiority at shorter QRSd inverted with prolongation >180 ms. Conclusions: The QRSd-CRT response relationship in heart failure patients with LBBB and non-ischemic cardiomyopathy is better described by a sex-specific continuous function than by dichotomization at 150 ms, which excludes a large proportion of women with potentially favorable outcomes.
    Heart Rhythm 01/2014; · 4.56 Impact Factor
  •
    ABSTRACT: Background: Baseline QRS duration (QRSd) ≥150 ms is a recognized predictor of clinical improvement with cardiac resynchronization therapy (CRT), particularly in patients with left bundle branch block (LBBB). Patients with QRSd <150 ms are considered less likely to respond. Objective: We theorized that left ventricular dyssynchrony, while usually associated with wider QRSd, also exhibits lower-frequency QRS characteristics and that low-frequency content should predict CRT response in LBBB patients. Methods: We retrospectively examined the QRS frequency content of 170 heart failure patients with LBBB and QRSd ≥120 ms using the Fourier transform. Ninety-four responders to CRT (defined as a reduction in left ventricular end-systolic volume of ≥15% from baseline) were compared to 76 nonresponders (<15% reduction). Three standard ECG leads (I, aVF, and V3), representing the three dimensions of depolarization, were analyzed; V3 provided the best predictive value. Results: The QRSd of responders (160.3±17.8 ms) and nonresponders (161.8±21.1 ms; p=0.604) was similar. Among the cutoff values tested, a threshold of 52% of total QRS power below 10 Hz was the most predictive of CRT response. This criterion was especially predictive of response in patients with QRSd <150 ms, in whom it yielded a PPV of 85.7% and an NPV of 71.4%. Conclusions: In this group of CRT recipients with LBBB, retrospective analysis of QRS frequency content below 10 Hz had greater predictive value for CRT response than baseline QRSd, particularly in those with QRSd <150 ms.
    Heart Rhythm. 01/2014;
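
The spectral measure described above (the fraction of total QRS power below 10 Hz, with a 52% cutoff) can be illustrated with a short numpy sketch. The sampling rate, window length, and detrending here are assumptions, since the abstract does not specify the preprocessing; the function and variable names are illustrative only:

```python
import numpy as np

def low_freq_power_fraction(qrs, fs, cutoff_hz=10.0):
    """Fraction of total spectral power of a QRS segment below cutoff_hz.

    qrs : 1-D array holding one QRS complex from a single ECG lead (e.g., V3)
    fs  : sampling frequency in Hz
    """
    x = np.asarray(qrs, dtype=float)
    x = x - x.mean()                          # remove the DC offset
    power = np.abs(np.fft.rfft(x)) ** 2       # one-sided power spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return power[freqs < cutoff_hz].sum() / power.sum()

# Hypothetical usage: a 160-ms window sampled at 1 kHz with a smooth,
# QRS-like synthetic bump. A value above 0.52 would fall on the "responder"
# side of the 52% cutoff described in the abstract.
fs = 1000.0
t = np.arange(0.0, 0.16, 1.0 / fs)
demo_qrs = np.exp(-((t - 0.08) / 0.02) ** 2)
print(low_freq_power_fraction(demo_qrs, fs))
```
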
  •
    ABSTRACT: To determine the long-term outcomes of patients undergoing cardiac resynchronization therapy (CRT) based on the degree of echocardiographic response. While improvement in left ventricular (LV) function has been shown to portend superior short-term outcomes in patients with heart failure undergoing CRT, the durability of this effect at five years has not been established. We extracted clinical data on a cohort of 880 consecutive patients undergoing new implantation of a CRT device between 9/30/2003 and 8/6/2007. Patients with an EF ≤35% undergoing initial CRT implantation, with an available pre-CRT and follow-up echocardiogram, were included in the final cohort. Based on changes in LVEF, patients were categorized as "non-responders" (change in EF ≤4%), "responders" (EF change 5%-20%), and "super-responders" (change in EF >20%). A multivariate Cox model was fit to determine the effect of response on long-term survival free of LVAD or heart transplant. A total of 526 patients met inclusion criteria, of whom 196 (37.3%) were classified as "non-responders", 236 (44.9%) as "responders", and 94 (17.9%) as "super-responders". In multivariate analysis, "super-responders" had the best survival and "non-responders" the worst over a mean follow-up of 5.3 ± 2.4 years. At five years, survival free of LVAD or heart transplant was 82% among super-responders, 70% among responders, and 48% among non-responders. In patients with heart failure undergoing CRT, the survival benefit is durable at five years of follow-up, and its degree is intimately tied to the level of improvement in ventricular function. The prognosis of non-responders is exceptionally poor.
    Heart Rhythm 11/2013; · 4.56 Impact Factor
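
Survival free of LVAD or transplant, as analyzed in the study above, is the kind of endpoint estimated with Kaplan-Meier and Cox models. A minimal, self-contained sketch using the lifelines package; the data are simulated (not the study data), with only the group sizes borrowed from the abstract for illustration:

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)

def simulate(n, mean_years, censor_at=8.0):
    """Simulated event-free survival times with administrative censoring."""
    t = rng.exponential(mean_years, n)
    observed = t < censor_at            # death, LVAD, or transplant observed
    return np.minimum(t, censor_at), observed

kmf = KaplanMeierFitter()
for label, n, mean_years in [("super-responders", 94, 14.0),
                             ("responders", 236, 8.0),
                             ("non-responders", 196, 5.0)]:
    durations, events = simulate(n, mean_years)
    kmf.fit(durations, events, label=label)
    print(label, "5-year event-free survival ~",
          round(float(kmf.survival_function_at_times(5.0).iloc[0]), 2))
```
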
  • Circulation Arrhythmia and Electrophysiology 08/2013; 6(4):e66. · 5.95 Impact Factor
  •
    ABSTRACT: BACKGROUND: Earlier studies in patients with reduced left ventricular ejection fraction (LVEF) ≤35% and prolonged QRS showed better survival outcomes with cardiac resynchronization therapy (CRT). Some patients respond dramatically to CRT by improving their LVEF to the normal range and are considered "super-responders." Our aim was to determine whether super-responders' survival increases to levels comparable to the general population. We compared the survival of super-responders to the general population matched for age and sex. METHODS: Of 909 patients with CRT device implantation between September 1998 and July 2008, 814 patients had pre- and post-CRT echocardiograms. A total of 95 patients with LVEF ≥50% following CRT were classified as super-responders. For the 92 super-responders who had U.S. Social Security numbers, an age- and sex-matched comparison subject was selected from the Social Security Life Tables. An expected survival plot of the matched population was then compared to the actual survival of super-responders. RESULTS: Super-responders had survival comparable to the age- and sex-matched general population (P = 0.53), and Kaplan-Meier survival analysis in the 92 patients showed that super-responders with CRT pacemakers had survival similar to those with CRT implantable cardioverter-defibrillators (P = 0.77). Super-responders were more likely to be female (54% vs 25%, P < 0.001) and less likely to have significant coronary artery disease (62% vs 42%, P < 0.001). CONCLUSIONS: Normalization of LVEF by CRT improves survival to levels comparable to the general population. This observation favors the concept that some CRT candidates have a cardiomyopathy likely generated by the conduction abnormality that is reversible through biventricular pacing.
    Pacing and Clinical Electrophysiology 05/2013; · 1.75 Impact Factor
  •
    ABSTRACT: BACKGROUND: Following pulmonary vein isolation (PVI) for the treatment of paroxysmal atrial fibrillation (AF), spontaneous dissociated firing (DiFi) from the isolated veins may be observed. Little is known about the significance and prognostic implications of this phenomenon. We sought to determine the relationship between DiFi and ablation outcomes. METHODS: The study population consisted of 156 paroxysmal AF patients who underwent first time PVI and were found to have spontaneous DiFi from the pulmonary veins (PVs). Their outcomes were compared to a population of 156 propensity-matched controls from our prospectively maintained AF ablation data registry. RESULTS: DiFi was most frequently observed from the right superior PV and occurred in 89 patients (57.1%). After 24 months of follow-up, patients with DiFi had better success rates compared to those with silent veins after isolation (88.5% vs 75%, P = 0.002). The overall distribution of types of recurrent arrhythmia was similar between DiFi patients and their matched controls (P = NS). During repeat ablations, DiFi patients were less likely to have PV conduction recovery (60% vs 93.3%, P = 0.02). Importantly, none of the veins with DiFi during index procedures was found to have conduction recovery. CONCLUSION: In patients with paroxysmal AF undergoing ablation, DiFi from the PVs after their isolation was associated with improved ablation outcomes. It is possible that DiFi is an indicator of successful durable isolation of the PVs. The findings suggest that confirmation of exit block may be warranted to improve AF ablation outcomes.
    Pacing and Clinical Electrophysiology 04/2013; · 1.75 Impact Factor
  •
    ABSTRACT: BACKGROUND: Vascular complications are a known risk of catheter-based pulmonary vein antral isolation (PVAI). Procedure-related thromboembolic events necessitate full-dose anticoagulation, which worsens outcomes in the event of vascular access injury. OBJECTIVE: Real-time ultrasound allows direct visualization of vascular structures. We hypothesized that ultrasound use with venipuncture reduces vascular complications associated with PVAI. METHODS: Retrospective analysis of all adverse events occurring with PVAI was performed during two periods: 2005-2006 when ultrasound was not used and 2008-2010 when ultrasound was routinely employed. All patients received full-dose IV heparin during PVAI. In the no ultrasound cohort, only 14 % underwent PVAI without stopping warfarin, while 91 % of patients in the ultrasound cohort were on continued warfarin. Only patients deemed at high risk for thromboembolism with a periprocedural international normalized ratio (INR) less than 2 were bridged with subcutaneous low-molecular-weight heparin. RESULTS: Ultrasound reduced total vascular complications (1.7 vs. 0.5 %, p < 0.01) and decreased the incidence of major vascular complications by sevenfold. Warfarin with INR ≥ 1.2 on the day of PVAI was associated with more vascular complications (4.3 vs. 1.2 %, p < 0.01). Ultrasound guidance overcame the risk associated with warfarin therapy. Vascular complications in anticoagulated patients with INR ≥ 1.2 using ultrasound guidance were two- and ninefold lower than those in patients not using ultrasound with an INR < 1.2 (0.5 vs. 1.2 %, p < 0.05) and INR ≥ 1.2 (0.5 vs. 4.3 %, p < 0.01), respectively. CONCLUSION: Ultrasound-guided venipuncture improves the safety profile of PVAI, reducing vascular complications in patients on warfarin to levels below those with no ultrasound and off warfarin.
    Journal of Interventional Cardiac Electrophysiology 04/2013; · 1.39 Impact Factor
  •
    ABSTRACT: BACKGROUND: Pulmonary vein isolation (PVI) for atrial fibrillation (AF) is associated with a transient increased risk of thromboembolic and hemorrhagic events. We hypothesized that dabigatran can be safely used as an alternative to continuous warfarin for peri-procedural anticoagulation in PVI. METHODS AND RESULTS: A total of 999 consecutive patients undergoing PVI were included; 376 patients were on dabigatran (150 mg) and 623 were on warfarin with therapeutic INR. Dabigatran was held for 1 to 2 doses prior to PVI and restarted at the conclusion of the procedure or as soon as patients were transferred to the nursing floor. Propensity score matching was applied to generate a cohort of 344 patients in each group with balanced baseline data. Total hemorrhagic and thromboembolic complications were similar in both groups, before (3.2% vs 3.9%; p = 0.59) and after (3.2% vs 4.1%; p = 0.53) matching. Major hemorrhage occurred in 1.1% vs 1.6% (p = 0.48) before and 1.2% vs 1.5% (p = 0.74) after matching in the dabigatran and warfarin groups, respectively. A single thromboembolic event occurred in each of the dabigatran and warfarin groups. Despite higher doses of intra-procedural heparin, the mean activated clotting time (ACT) was significantly lower in patients who held dabigatran for 1 or 2 doses than in those on warfarin. CONCLUSIONS: Our study found no evidence to suggest a higher risk of thromboembolic or hemorrhagic complications with the use of dabigatran for peri-procedural anticoagulation in patients undergoing PVI compared to uninterrupted warfarin therapy.
    Circulation Arrhythmia and Electrophysiology 04/2013; · 5.95 Impact Factor
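
Several of the comparisons above rely on propensity-score matching to balance baseline characteristics between treatment groups. A minimal sketch of 1:1 nearest-neighbour matching on an estimated propensity score, done here with replacement and a simple caliper on the raw score; the studies' actual covariates, caliper, and matching algorithm are not described in the abstracts, and all column names below are illustrative:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def propensity_match(df, treat_col, covariates, caliper=0.05):
    """1:1 nearest-neighbour matching (with replacement) on the propensity
    score estimated by logistic regression. Returns the row indices of
    matched treated and control observations."""
    ps = (LogisticRegression(max_iter=1000)
          .fit(df[covariates], df[treat_col])
          .predict_proba(df[covariates])[:, 1])
    df = df.assign(ps=ps)
    treated = df[df[treat_col] == 1]
    control = df[df[treat_col] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    dist, idx = nn.kneighbors(treated[["ps"]])
    keep = dist.ravel() <= caliper          # drop pairs outside the caliper
    return treated.index[keep], control.index[idx.ravel()[keep]]

# Illustrative usage with made-up columns (not the study variables):
# matched_treated, matched_controls = propensity_match(
#     df, "dabigatran", ["age", "chads2", "lvef"])
```
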
  •
    ABSTRACT: BACKGROUND: Implantable cardioverter-defibrillator (ICD) implantation for prevention of sudden cardiac death is typically deferred for 90 days after coronary revascularization, but mortality may be highest early after cardiac procedures in patients with ventricular dysfunction. We determined mortality risk in post-revascularization patients with left ventricular ejection fraction (LVEF) ≤35% and compared survival to those discharged with a wearable cardioverter-defibrillator (WCD). METHODS AND RESULTS: Hospital survivors after surgical (CABG) or percutaneous (PCI) revascularization with LVEF ≤35% were included from Cleveland Clinic and national WCD registries. Kaplan-Meier, Cox proportional hazards, propensity score-matched survival, and hazard function analyses were performed. Early mortality hazard was higher among 4,149 patients discharged without a defibrillator compared to 809 with WCDs (90-day mortality post-CABG 7% vs. 3%, p=0.03; post-PCI 10% vs. 2%, p<0.0001). WCD use was associated with adjusted lower risks of long-term mortality in the total cohort (39%, p<0.0001) and in both the post-CABG (38%, p=0.048) and post-PCI (57%, p<0.0001) cohorts (mean follow-up 3.2 years). In propensity-matched analyses, WCD use remained associated with lower mortality (58% post-CABG, p=0.002; 67% post-PCI, p<0.0001). Mortality differences were not attributable solely to therapies for ventricular arrhythmia; only 1.3% of the WCD group had a documented appropriate therapy. CONCLUSIONS: Patients with LVEF ≤35% have higher early compared to late mortality after coronary revascularization, particularly after PCI. As the early hazard appeared less marked in WCD users, prospective studies in this high-risk population are indicated to confirm whether WCD use as a bridge to LVEF improvement or ICD implantation can improve outcomes after coronary revascularization.
    Circulation Arrhythmia and Electrophysiology 12/2012; · 5.95 Impact Factor
  •
    ABSTRACT: OBJECTIVES: We sought to identify the characteristics, treatment, and outcomes of periprocedural cerebrovascular accident (PCVA) during electrophysiologic (EP) procedures. BACKGROUND: Periprocedural cerebrovascular accident is one of the most feared complications of EP procedures, yet very few data exist regarding its characteristics, management, and outcomes. METHODS: Between January 1998 and December 2008, we reviewed 30,032 invasive EP procedures for PCVA occurrence and characteristics. Management and outcomes were also determined. RESULTS: Thirty-eight CVAs were identified. Twenty (53%) were intraprocedural and 18 (47%) postprocedural. Thirty-two (84%) were classified as strokes and six (16%) as transient ischemic attacks. All CVAs except one (37, 97%) were ischemic, and the vast majority occurred during ablation procedures (36, 95%). Among the 31 patients with ischemic stroke, 11 (35%) were treated with reperfusion (eight with catheter-based therapy and three with intravenous t-PA), of whom five (46%) had complete recovery, three (27%) had partial recovery, and three (27%) had no recovery. No hemorrhagic transformations occurred. CONCLUSION: Periprocedural cerebrovascular accident during EP procedures is rare and is almost always ischemic. It occurs more frequently during ablation procedures. Reperfusion therapy is feasible and safe.
    Journal of Interventional Cardiac Electrophysiology 12/2012; · 1.39 Impact Factor
  • Circulation Arrhythmia and Electrophysiology 12/2012; 5(6):e104-8. · 5.95 Impact Factor
  •
    ABSTRACT: A significant proportion of implantable cardioverter-defibrillators (ICDs) have been subject to Food and Drug Administration (FDA) advisories. The impact of device advisories on mortality or patient care is poorly understood. Although the estimated risks of ICD generators under advisory are low, dependency on ICD therapies to prevent sudden death justifies the assessment of long-term mortality. To test the association of FDA advisory status with long-term mortality. The study was a retrospective, single-center review of clinical outcomes, including device malfunctions, in patients from implantation to either explant or death. Patients with ICDs first implanted at Cleveland Clinic between August 1996 and May 2004 who became subject to FDA advisories on ICD generators were identified. Mortality was determined by using the Social Security Death Index. Among 1,644 consecutive patients receiving first ICD implants, 704 (43%) became subject to an FDA advisory, of which 172 (10.5%) were class I and 532 (32.3%) were class II. ICDs were explanted before advisory notifications in 14.0% of class I and 10.1% of class II advisories. Among ICDs under advisory, 28 (4.0%) advisory-related and 15 non-advisory-related malfunctions were documented. Over a median follow-up of 70 months, 814 patients died. The Kaplan-Meier 5-year survival rate was 65.6% overall, and 64.2%, 61.1%, and 69.3% in patients with no, class I, and class II advisories, respectively (P = .17). ICD advisories impacted 43% of the patients. Advisory-related malfunctions affected 4% within the combined advisory group. Based on a conservative management strategy, ICDs under advisory were not associated with increased mortality over a background of significant disease-related mortality.
    Heart Rhythm 07/2012; 9(10):1619-26. · 4.56 Impact Factor
  •
    ABSTRACT: Elevated red cell distribution width (RDW) has been associated with poor long-term outcomes in patients with systolic dysfunction. The relationship between baseline RDW and reverse ventricular remodeling in advanced heart failure patients undergoing cardiac resynchronization therapy (CRT) has not been established. The authors reviewed the pre-implant and follow-up echocardiograms of 233 patients undergoing the new implantation of a CRT device at the Cleveland Clinic between December 2001 and November 2006. Patients were included in the final cohort if they had an RDW level within 7 days of CRT implantation, a left ventricular ejection fraction (LVEF) ≤40%, and New York Heart Association class II to IV symptoms. Patients with a reduction in left ventricular end-systolic volume ≥15% following CRT were considered "responders." Multivariate models were created to assess the relationship between baseline RDW elevation with progressive remodeling and all-cause mortality. Of 233 patients, 217 patients met inclusion criteria. Patients in the highest RDW quartile (>16.1) derived significantly less improvement in LVEF (3.5%±9.3% vs 10.1%±10.9%, P=.001) than patients in the lowest quartile (<13.6). In multivariate analysis, elevated RDW was inversely associated with response (odds ratio, 0.83; 95% confidence interval, 0.69-0.99; P=.047). The presence of elevated RDW is associated with less reverse left ventricular remodeling in patients with advanced heart failure undergoing CRT.
    Congestive Heart Failure 03/2012; 18(2):79-84.
  •
    ABSTRACT: Patients with non-left bundle branch block (LBBB) morphologies are thought to derive less benefit from cardiac resynchronization therapy (CRT) than those with LBBB. However, some patients do exhibit improvement. The characteristics associated with a response to CRT in patients with non-LBBB morphologies are unknown. Clinical, electrocardiographic, and echocardiographic data were collected from 850 consecutive patients presenting for a new CRT device. For inclusion, all patients had a left ventricular ejection fraction of ≤35%, a QRS duration of ≥120 ms, and baseline and follow-up echocardiograms available. Patients with a paced rhythm or LBBB were excluded. The response was defined as an absolute decrease in left ventricular end-systolic volume of ≥10% from baseline. Multivariate models were constructed to identify variables significantly associated with the response and long-term outcomes. A total of 99 patients met the inclusion criteria. Of these 99 patients, 22 had right bundle branch block and 77 had nonspecific intraventricular conduction delay; 52.5% met the criteria for response. On multivariate analysis, the QRS duration was the only variable significantly associated with the response (odds ratio per 10-ms increase 1.23, 95% confidence interval 1.01 to 1.52, p = 0.048). During a mean follow-up of 5.4 ± 0.9 years, 65 patients died or underwent heart transplant or left ventricular assist device placement. On multivariate analysis, the QRS duration was inversely associated with poor long-term outcomes (hazard ratio per 10-ms increase 0.79, 95% confidence interval 0.66 to 0.94, p = 0.005). In patients with advanced heart failure and non-LBBB morphologies, a wider baseline QRS duration is an important determinant of enhanced reverse ventricular remodeling and improved long-term outcomes after CRT.
    The American journal of cardiology 09/2011; 108(11):1576-80. · 3.58 Impact Factor
  •
    ABSTRACT: The aim of the present study was to evaluate the relationship between left ventricular (LV) electrical delay, as measured by the QLV interval, and outcomes in a prospectively designed substudy of the SMART-AV Trial. This was a multicentre study of patients with advanced heart failure undergoing cardiac resynchronization therapy (CRT) defibrillator implantation. In 426 subjects, QLV was measured as the interval from the onset of the QRS from the surface ECG to the first large peak of the LV electrogram. Left ventricular volumes were measured by echocardiography at baseline and after 6 months of CRT by a blinded core laboratory. Quality of life (QOL) was assessed by a standardized questionnaire. When separated by quartiles based on QLV duration, reverse remodelling response rates (>15% reduction in LV end systolic volume) increased progressively from 38.7 to 68.4% and QOL response rate (>10 points reduction) increased from 50 to 72%. Patients in the highest quartile of QLV had a 3.21-fold increase (1.58-6.50, P = 0.001) in their odds of a reverse remodelling response after correcting for QRS duration, bundle branch block type, and clinical characteristics by multivariate logistic regression analysis. Electrical dyssynchrony, as measured by QLV, was strongly and independently associated with reverse remodelling and QOL with CRT. Acute measurements of QLV may be useful to guide LV lead placement.
    European Heart Journal 08/2011; 32(20):2516-24. · 14.72 Impact Factor
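
The SMART-AV substudy above reports odds of reverse remodelling for the highest QLV quartile from a multivariate logistic regression. A minimal sketch of that kind of model, fit on simulated data with illustrative variable names and made-up effect sizes (not the trial data or covariate set):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 426                                    # cohort size quoted in the abstract
qlv = rng.normal(100, 30, n)               # simulated QLV interval, ms
qrs = rng.normal(155, 20, n)               # simulated QRS duration, ms
top_quartile = (qlv >= np.quantile(qlv, 0.75)).astype(float)

# Simulated response probabilities: higher odds of response in the top quartile.
logit_p = -0.4 + 1.1 * top_quartile + 0.005 * (qrs - 155)
response = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([top_quartile, qrs]))
fit = sm.Logit(response, X).fit(disp=False)
print("odds ratio, highest QLV quartile:", float(np.exp(fit.params[1])))
```
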
  •
    ABSTRACT: Atrial fibrillation (AF) ablation is increasingly used in clinical practice. We aimed to study the natural history and long-term outcomes of ablated AF. We followed 831 patients after pulmonary vein isolation (PVI) performed in 2005. We documented clinical outcomes using our prospective AF registry with most recent update on this group of patients in October 2009. In the first year after ablation, 23.8% had early recurrence. Over long-term follow-up (55 months), only 8.9% had late arrhythmia recurrence defined as occurring beyond the first year after ablation. Repeat ablations in patients with late recurrence revealed conduction recovery in at least 1 of the previously isolated PVs in all of them and right-sided triggers with isoproterenol testing in 55.6%. At last follow-up, clinical improvement was 89.9% (79.4% arrhythmia-free off antiarrhythmic drugs and 10.5% with AF controlled with antiarrhythmic drugs). Only 4.6% continued to have drug-resistant AF. It was possible to safely discontinue anticoagulation in a substantial proportion of patients with no recurrence in the year after ablation (CHADS score ≤2, stroke incidence of 0.06% per year). The procedure-related complication rate was very low. Pulmonary vein isolation is safe and efficacious for long-term maintenance of sinus rhythm and control of symptoms in patients with drug-resistant AF. It obviates the need for antiarrhythmic drugs, negative dromotropic agents, and anticoagulants in a substantial proportion of patients.
    Circulation Arrhythmia and Electrophysiology 06/2011; 4(3):271-8. · 5.95 Impact Factor
  •
    ABSTRACT: Plasma B-type natriuretic peptide (BNP) is abnormally elevated in patients with lone atrial fibrillation (AF). The exact significance and prognostic implications of this elevation have yet to be determined, and little is known about BNP in lone AF patients undergoing arrhythmia ablation. We sought to determine the relationship between BNP levels and the risk of recurrent arrhythmia after ablation of lone AF. We followed up 726 patients with lone AF undergoing first-time arrhythmia ablation. All had BNP levels measured on the day of ablation with the point-of-care Triage Meter assay (Biosite Diagnostics, San Diego, CA). At baseline, factors associated with elevated BNP levels in multivariable linear regression analysis (with log BNP as the dependent variable) were older age (β regression coefficient for +1-year change, 0.025; P<0.0001), longer duration of AF (β for +1-year change, 0.031; P=0.01), nonparoxysmal AF (versus paroxysmal; β, 0.52; P<0.0001), and larger left atrial size (β for +1-cm(2) change, 0.040; P<0.0001). BNP levels were strongly associated with arrhythmia recurrence in both unadjusted (hazard ratio for +1-log-BNP change, 2.32; 95% confidence interval, 2.11 to 2.74; P<0.001) and covariate-adjusted (hazard ratio for +1-log-BNP change, 2.13; 95% confidence interval, 2.06 to 2.38; P<0.001) Cox proportional hazards analyses. The covariate-adjusted hazard ratios for recurrent arrhythmia were 1.6, 2.7, 4.3, and 5.7 for the second, third, fourth, and fifth quintiles, respectively, compared with patients in the lowest quintile (P for trend across quintiles <0.001). B-type natriuretic peptide levels correlate with AF burden (chronicity, altered hemodynamics, and anatomic remodeling) in patients with lone AF and are strong predictors of recurrent arrhythmia after ablation. Elevated BNP levels may reflect increased cardiac chamber wall stress and/or intrinsic atrial disease, thus increasing the risk of arrhythmia recurrence.
    Circulation 05/2011; 123(19):2077-82. · 15.20 Impact Factor

Publication Stats

6k Citations
1,335.24 Total Impact Points

Institutions

  • 1995–2014
    • Cleveland Clinic
      • Department of Internal Medicine
      • Center for Atrial Fibrillation
      • Department of Cardiology
      Cleveland, Ohio, United States
  • 2009–2012
    • Metropolitan Heart and Vascular Institute
      Minneapolis, Minnesota, United States
    • Cedars-Sinai Medical Center
      • Division of Cardiology
      Los Angeles, California, United States
  • 2011
    • Medical University of South Carolina
      Charleston, South Carolina, United States
  • 1986–2008
    • University of Wisconsin - Milwaukee
      Milwaukee, Wisconsin, United States
    • Mount Sinai Medical Center
      New York City, New York, United States
  • 2007
    • Kansas City University of Medicine and Biosciences
      Kansas City, Missouri, United States
    • University of Chicago
      Chicago, Illinois, United States
  • 2006
    • Philadelphia Zoo
      Philadelphia, Pennsylvania, United States
    • University of Missouri - St. Louis
      Saint Louis, Missouri, United States
  • 2005
    • University of Nebraska at Omaha
      • Division of Cardiology
      Omaha, Nebraska, United States
  • 2004
    • University of Nebraska Medical Center
      Omaha, Nebraska, United States
    • Fukuoka University
      • Department of Cardiology
      Fukuoka-shi, Fukuoka-ken, Japan
    • Loyola University Medical Center
      Maywood, Illinois, United States
  • 2000
    • Institute of Electrical and Electronics Engineers
      Washington, D.C., United States
  • 1998
    • The Ohio State University
      • Division of Hospital Medicine
      Columbus, Ohio, United States
  • 1991–1997
    • University of Pittsburgh
      • Department of Medicine
      • Division of Cardiology
      Pittsburgh, Pennsylvania, United States
    • Mount Sinai Hospital
      New York City, New York, United States
  • 1989
    • Wayne State University
      • Department of Psychiatry and Behavioral Neurosciences
      Detroit, Michigan, United States
  • 1988–1989
    • Samaritan Medical Center
      Watertown, New York, United States