ABSTRACT: Left atrial flutter following atrial fibrillation (AF) ablation is increasingly common and difficult to treat. We evaluated the safety and efficacy of ablation of the anteroseptal line connecting the right superior pulmonary vein (RSPV) to the anteroseptal mitral annulus (MA) for the treatment of perimitral flutter (PMF).
Journal of Arrhythmia 07/2015; 155. DOI:10.1016/j.joa.2015.06.001
ABSTRACT: Unnecessary ventricular pacing from cardiac implantable electronic devices has been associated with long-term risks (heart failure, atrial fibrillation, and possibly stroke). Several device programming strategies are available to minimize ventricular pacing, each potentially associated with unintended consequences. Occasionally, the only effective means is to program to the AAI(R) pacing mode. However, in one manufacturer's implantable cardioverter defibrillators (ICDs), the AAI(R) mode has the potential risk of prolonged pacing cessation following a nonsustained ventricular tachycardia (NSVT).
Patients with ICDs, managed through the Cleveland Clinic device clinic, follow the Heart Rhythm Society consensus document recommendations for device monitoring with remote interrogations (every three months) and yearly in-person evaluations. Clinically significant findings also trigger additional evaluations by the nurse and physician.
Two patients with Boston Scientific ICDs (E110 Teligen 100) had asystole and marked bradycardia following untreated NSVT. These pauses in pacing were due to use of the AAI(R) pacing mode. By design, to enhance ventricular tachycardia detection, atrial pacing is disabled during, and for a time after, an episode of ventricular tachycardia while the device operates in the "ventricular tachycardia response" (VTR) phase. Thus, following spontaneous termination of the NSVT, no pacing occurred in these patients until the VTR period ended. Non-conventional programming was utilized to permit AAI(R) pacing while avoiding these asystolic and bradycardic events during VTR.
Unintended consequences can occur when complex VT detection parameters interact with specific pacing modes. At times, non-conventional programming can avoid these interactions while still achieving effective AAI(R) pacing. This article is protected by copyright. All rights reserved.
ABSTRACT: Background
Baseline QRS duration (QRSd) ≥150 ms is a recognized predictor of clinical improvement with cardiac resynchronization therapy (CRT), particularly for those with left bundle branch block (LBBB). Patients with QRSd <150 ms are considered less likely to respond.
We theorized that left ventricular dyssynchrony, while usually associated with wider QRSd, also exhibits lower QRS frequency characteristics and that low frequency content should predict CRT response in LBBB patients.
We retrospectively examined the QRS frequency content of 170 heart failure patients with LBBB and QRSd ≥120 ms using the Fourier transformation. Ninety-four responders to CRT (defined as a reduction in left ventricular end-systolic volume of ≥15% from baseline) were compared to 76 nonresponders (<15% reduction). Three standard ECG leads (I, aVF, and V3), representing the three dimensions of depolarization, were analyzed; V3 provided the best predictive value.
The QRSd of responders (160.3±17.8 ms) and nonresponders (161.8±21.1 ms; p=0.604) were similar. Of the cutoff values examined, a threshold of 52% of total QRS frequency power below 10 Hz was the most predictive of CRT response. This power threshold was especially predictive of response in patients with QRSd <150 ms (PPV=85.7% and NPV=71.4%).
In this group of CRT recipients with LBBB, retrospective analysis of QRS frequency content below 10 Hz had greater predictive value for CRT response than baseline QRSd, particularly in those with QRSd<150ms.
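The frequency-domain measure above can be sketched in a few lines. This is not the authors' analysis code; the sampling rate, the synthetic signals, and the function name are illustrative assumptions, and only the idea (the fraction of QRS spectral power below 10 Hz, compared against a 52% cutoff) follows the abstract.

```python
import numpy as np

def low_freq_power_fraction(qrs, fs, cutoff_hz=10.0):
    """Fraction of total spectral power below cutoff_hz.

    qrs: 1-D array of windowed QRS-complex samples (hypothetical input).
    fs:  sampling rate in Hz.
    """
    spectrum = np.abs(np.fft.rfft(qrs - np.mean(qrs))) ** 2  # power spectrum
    freqs = np.fft.rfftfreq(len(qrs), d=1.0 / fs)            # bin frequencies in Hz
    return spectrum[freqs < cutoff_hz].sum() / spectrum.sum()

# Synthetic check: a slow (5 Hz) component keeps nearly all power below 10 Hz,
# a fast (40 Hz) component keeps nearly none.
fs = 1000
t = np.arange(fs) / fs  # 1 second of samples
slow = np.sin(2 * np.pi * 5 * t)
fast = np.sin(2 * np.pi * 40 * t)
```

A responder classification in the spirit of the abstract would then be `low_freq_power_fraction(qrs, fs) > 0.52` on lead V3.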
ABSTRACT: Background: Direct current cardioversion (DCC) is known to increase the risk of thromboembolism (TE) in patients with atrial fibrillation (AF). However, guidelines and current practice are unclear on whether this risk exists in patients undergoing DCC <48 hours after AF onset. We aimed to assess TE risk in these patients with and without therapeutic anticoagulation.
Methods: All DCCs performed at the Cleveland Clinic between 1996 and 2012, <48 hours after AF onset, were selected from the electrophysiology database and divided into two groups on the basis of anticoagulation status. Extensive chart review was performed to identify major TE complications within a month of DCC.
Results: Among 567 DCCs in 510 patients without therapeutic anticoagulation, the mean (SD) CHA2DS2-VASc score was 2.34 (1.66). At the time of DCC, 54% of these patients were on aspirin, while 12% were on warfarin with INR ≤1.5. There were 7 reported cerebrovascular accidents (CVAs) (1.22%) within a month after DCC; in these 7 patients, the mean (SD) CHA2DS2-VASc score was 3.57 (1.27), ranging from 2 to 5. In the second group, 901 DCCs were performed in 733 patients who were therapeutic on warfarin (63%) or heparin (37%) at the time of DCC; the mean (SD) CHA2DS2-VASc score was 2.61 (1.73). There were 2 reported CVAs (0.22%, p=0.016) within a month after DCC, in patients with CHA2DS2-VASc scores of 4 and 6. Of these two patients, one was subtherapeutic (INR=1.14) when he presented with neurological deficits 18 days after the DCC; the other had anticoagulation held 10 days after DCC for a surgical procedure, with the TE stroke following a week later. The non-treated group therefore had 5.6 times greater odds of a thromboembolic stroke within a month of DCC (95% CI: 1.16 to 27.14).
Conclusions: In patients with acute-onset atrial fibrillation, the odds of thromboembolic complications were over 5 times higher in patients who did not receive therapeutic anticoagulation at the time of DCC, despite their lower baseline stroke risk as defined by CHA2DS2-VASc scores. In addition, the two patients in the therapeutically anticoagulated group who did have a stroke had ceased anticoagulation prior to the stroke.
Circulation Cardiovascular Quality and Outcomes; 07/2014
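The odds ratio reported in the cardioversion study above can be reproduced from the event counts given in the abstract (7 CVAs among 567 non-anticoagulated DCCs vs. 2 among 901 anticoagulated DCCs) with a standard 2x2 odds ratio and Wald confidence interval. A minimal sketch; the function name is illustrative:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# 7 strokes in 567 non-anticoagulated DCCs vs 2 in 901 anticoagulated DCCs
or_, lo, hi = odds_ratio_ci(7, 567 - 7, 2, 901 - 2)
print(f"OR = {or_:.1f} (95% CI {lo:.2f} to {hi:.2f})")  # OR = 5.6 (95% CI 1.16 to 27.14)
```

The result matches the abstract's reported odds ratio of 5.6 (95% CI 1.16 to 27.14).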
ABSTRACT: Background
QRS morphology and duration (QRSd) determine CRT candidate selection but criteria require refinement.
To assess the effect of CRT according to QRSd, treated by dichotomization versus as a continuous function, and its modulation by gender.
Patients selected were NYHA Class III/IV with LBBB and non-ischemic cardiomyopathy (to test “pure” CRT effect) with pre- and post-implant echocardiographic evaluations. Positive response was defined as improved LVEF post-CRT.
In 212 patients (LVEF 19±7.1%; QRSd 160±23 ms; 105 females), CRT improved LVEF to 30±15% (p<0.001) during median 2 years follow-up. Positive response occurred in 150/212 (71%). Genders did not differ for QRSd, pharmacotherapy and comorbidities, but female CRT response was greater: incidence 84% (88/105) vs 58% (62/107) in males (p<0.001); LVEF improvement 15±14% vs 7.2±13%, respectively (p<0.001). Overall, response rate was 58% when QRSd<150ms vs 76% when QRSd≥150 ms (p=0.009). This probability differed between genders: 86% in women vs 36% in men (p<0.001) for QRSd <150 ms, and 83% vs 69% respectively when QRSd≥150ms (p=0.05). Thus, female response rates remained high whether QRSd was < or ≥150 ms (86 vs 83%, p=0.77) but differed in males (36 vs 69%, p<0.001). With QRSd as a continuum, the CRT-response relationship was nonlinear, and significantly different between genders. Female superiority at shorter QRSd inverted with prolongation >180 ms.
The QRSd-CRT response relationship in heart failure patients with LBBB and non-ischemic cardiomyopathy is better described by a sex-specific continuous function than by dichotomization at 150 ms, which excludes a large proportion of women with potentially favorable outcomes.
Heart rhythm: the official journal of the Heart Rhythm Society 07/2014; 11(7). DOI:10.1016/j.hrthm.2014.04.001 · 5.08 Impact Factor
ABSTRACT: Despite sparse clinical data, current atrial fibrillation (AF) guidelines favor amiodarone as a drug of choice for patients with left ventricular hypertrophy (LVH).
This study tested the hypothesis that patients with persistent AF and LVH on nonamiodarone antiarrhythmics have higher mortality compared to patients on amiodarone.
In an observational cohort analysis of patients who underwent cardioversion for AF, patients with LVH, defined as left ventricular wall thickness ≥1.4 cm, by echocardiogram prior to their first cardioversion, were included; clinical data, including antiarrhythmic drugs and ejection fraction (LVEF), were collected. Mortality, determined via the Social Security Death Index, was analyzed using Kaplan-Meier and Cox proportional hazards models to determine whether antiarrhythmic drugs were associated with higher mortality.
In 3,926 patients, echocardiographic wall thickness was available in 1,399 (age 66.8 ± 11.8 years, 67% male, LVEF 46 ± 15%, septum 1.3 ± 0.4, posterior wall 1.2 ± 0.2 cm), and 537 (38%) had LVH ≥1.4 cm. Among 537 patients with LVH, mean age was 67.5 ± 11.7 years, 76.4% were males, and mean LVEF was 48.3 ± 13.3%. Amiodarone was associated with lower survival (log rank P = 0.001), including after adjusting for age, LVEF, and coronary artery disease (P = 0.023). In propensity-score matched cohorts with LVH treated with no drugs, nonamiodarone antiarrhythmic drugs (non-AADs), or amiodarone (N = 65 each group), there was early lower survival in patients on amiodarone (P = 0.05).
Patients with persistent AF and LVH on non-AADs do not have higher mortality compared to patients on amiodarone. Importantly, these findings do not support amiodarone as a superior choice in patients with LVH.
ABSTRACT: Many patients eligible for cardiac resynchronization therapy (CRT) are over 80 years of age. Survival in this population and how it compares to the general octogenarian population has not been established.
We extracted clinical data on a cohort of 800 consecutive patients undergoing the new implantation of a CRT device between April 15, 2004 and August 6, 2007. Patients over age 80, with class III-IV New York Heart Association heart failure symptoms on optimal medical therapy undergoing initial CRT implantation, were included in the final cohort. Using the United States Social Security Period Life Table for 2006, fractional survival for octogenarians in the general population was calculated and matched to our cohort based on age and gender. A comparison was then made between octogenarians undergoing CRT compared to the general population.
A total of 95 octogenarians who met inclusion criteria were identified, of whom 86.3% received a biventricular defibrillator and the remainder a biventricular pacemaker. Over a mean follow-up of 3.6 ± 1.5 years, there were 47 deaths (47.4%). The mean survival time was 4.1 years (95% CI 3.7-4.5), and survival at 2 years was 78.9%. Compared to the general octogenarian population, octogenarians receiving CRT had only modestly worse survival over the duration of follow-up with the survival curves diverging at 2 years of follow-up (P = 0.03).
Octogenarians with advanced heart failure have a reasonable mean survival time following CRT. All-cause mortality in this patient population is only modestly worse compared to the general octogenarian population. Therefore, in octogenarians deemed to be reasonable candidates, CRT should not be withheld based on age alone.
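The expected-survival benchmark in the study above works by compounding one-year survival probabilities from the period life table for each matched age and sex. A minimal sketch with purely hypothetical mortality rates; the real values come from the U.S. Social Security Period Life Table for 2006, as the methods state:

```python
def expected_survival(annual_mortality):
    """Cumulative survival curve from a sequence of one-year mortality
    probabilities q_x, as read off a period life table."""
    curve, s = [], 1.0
    for q in annual_mortality:
        s *= 1.0 - q   # survive each successive year with probability 1 - q_x
        curve.append(s)
    return curve

# Hypothetical q_x values for three successive ages of an octogenarian;
# the cohort's expected survival is the mean of such curves over all patients.
curve = expected_survival([0.06, 0.065, 0.07])
```

Comparing this expected curve against the cohort's observed Kaplan-Meier survival gives the fractional-survival comparison the abstract describes.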
ABSTRACT: To determine the long-term outcomes of patients undergoing cardiac resynchronization therapy (CRT) based on degree of echocardiographic response.
While improvement in left ventricular (LV) function has been shown to portend superior short-term outcomes in patients with heart failure undergoing CRT, the durability of this effect at five years has not been established.
We extracted clinical data on a cohort of 880 consecutive patients undergoing the new implantation of a CRT device between 9/30/2003 and 8/6/2007. Patients with an EF ≤35% undergoing initial CRT implantation, with an available pre-CRT and follow-up echocardiogram, were included in the final cohort. Based on changes in LVEF, patients were categorized as "non-responders" (change in EF ≤4%), "responders" (EF change 5%-20%), and "super-responders" (change in EF >20%). A multivariate Cox model was used to determine the effect of response on long-term survival free of left ventricular assist device (LVAD) or heart transplant.
526 patients met inclusion criteria, of whom 196 (37.3%) were classified as "non-responders", 236 (44.9%) as "responders", and 94 (17.9%) as "super-responders". In multivariate analysis, "super-responders" had the best survival and "non-responders" the worst over a mean follow-up of 5.3 ± 2.4 years. At five years, survival free of LVAD or heart transplant was 82% among super-responders, 70% among responders, and 48% among non-responders.
In patients with heart failure undergoing CRT, the survival benefit is durable at five years of follow-up, and its degree is intimately tied to the level of improvement in ventricular function. The prognosis of non-responders is exceptionally poor.
Heart rhythm: the official journal of the Heart Rhythm Society 11/2013; 11(3). DOI:10.1016/j.hrthm.2013.11.025 · 5.08 Impact Factor
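Unadjusted survival curves like those behind the five-year figures above are conventionally produced with the Kaplan-Meier product-limit estimator (the abstract's group comparison then adjusts via a Cox model). A minimal, self-contained sketch, not the authors' code:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times:  follow-up durations (e.g., years to death/LVAD/transplant or censoring)
    events: 1 = event occurred, 0 = censored
    Returns (time, survival) pairs at each event time."""
    data = sorted(zip(times, events))
    at_risk, s, curve = len(data), 1.0, []
    for t, grp in groupby(data, key=lambda pair: pair[0]):
        grp = list(grp)
        deaths = sum(e for _, e in grp)
        if deaths:
            s *= 1.0 - deaths / at_risk  # product-limit step at each event time
            curve.append((t, s))
        at_risk -= len(grp)              # events and censorings both leave the risk set
    return curve
```

For example, `kaplan_meier([1, 2, 3], [1, 1, 0])` steps to 2/3 at time 1 and 1/3 at time 2, with the censored observation at time 3 leaving the curve flat.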
ABSTRACT: Background:
Earlier studies in patients with reduced left ventricular ejection fraction (LVEF) ≤35% and prolonged QRS showed better survival outcomes with cardiac resynchronization therapy (CRT). Some patients respond dramatically to CRT, improving their LVEF to the normal range, and are considered "super-responders." Our aim was to determine whether super-responders' survival increases to levels comparable to the general population; we therefore compared the survival of super-responders to that of the general population matched for age and sex.
Of 909 patients with CRT device implantation between September 1998 and July 2008, 814 had pre- and post-CRT echocardiograms. A total of 95 patients with LVEF ≥50% following CRT were classified as super-responders. For the 92 super-responders who had U.S. Social Security numbers, an age- and sex-matched counterpart was drawn from the Social Security Life Tables. An expected survival plot of this matched population was then compared to the actual survival of the super-responders.
Super-responders had comparable survival to the age-sex matched general population (P = 0.53), and Kaplan-Meier survival analysis in 92 patients showed that super-responders with CRT pacemakers had similar survival to those with CRT implantable cardioverter defibrillators (P = 0.77). Super-responders were more likely to be females (54% vs 25%, P < 0.001) and less likely to have significant coronary artery disease (62% vs 42%, P < 0.001).
Normalization of LVEF by CRT improves survival to levels comparable to the general population. This observation favors the concept that some CRT candidates have a cardiomyopathy likely generated by the conduction abnormality that is reversible through biventricular pacing.
ABSTRACT: Background:
Following pulmonary vein isolation (PVI) for the treatment of paroxysmal atrial fibrillation (AF), spontaneous dissociated firing (DiFi) from the isolated veins may be observed. Little is known about the significance and prognostic implications of this phenomenon. We sought to determine the relationship between DiFi and ablation outcomes.
The study population consisted of 156 paroxysmal AF patients who underwent first time PVI and were found to have spontaneous DiFi from the pulmonary veins (PVs). Their outcomes were compared to a population of 156 propensity-matched controls from our prospectively maintained AF ablation data registry.
DiFi was most frequently observed from the right superior PV and occurred in 89 patients (57.1%). After 24 months of follow-up, patients with DiFi had better success rates compared to those with silent veins after isolation (88.5% vs 75%, P = 0.002). The overall distribution of types of recurrent arrhythmia was similar between DiFi patients and their matched controls (P = NS). During repeat ablations, DiFi patients were less likely to have PV conduction recovery (60% vs 93.3%, P = 0.02). Importantly, none of the veins with DiFi during index procedures was found to have conduction recovery.
In patients with paroxysmal AF undergoing ablation, DiFi from the PVs after their isolation was associated with improved ablation outcomes. It is possible that DiFi is an indicator of successful durable isolation of the PVs. The findings suggest that confirmation of exit block may be warranted to improve AF ablation outcomes.
ABSTRACT: Background: Vascular complications are a known risk of catheter-based pulmonary vein antral isolation (PVAI). Procedure-related thromboembolic events necessitate full-dose anticoagulation, which worsens outcomes in the event of vascular access injury.
Objective: Real-time ultrasound allows direct visualization of vascular structures. We hypothesized that ultrasound use with venipuncture reduces vascular complications associated with PVAI.
Methods: Retrospective analysis of all adverse events occurring with PVAI was performed during two periods: 2005-2006, when ultrasound was not used, and 2008-2010, when ultrasound was routinely employed. All patients received full-dose IV heparin during PVAI. In the no-ultrasound cohort, only 14% underwent PVAI without stopping warfarin, while 91% of patients in the ultrasound cohort were on continued warfarin. Only patients deemed at high risk for thromboembolism with a periprocedural international normalized ratio (INR) less than 2 were bridged with subcutaneous low-molecular-weight heparin.
Results: Ultrasound reduced total vascular complications (1.7% vs. 0.5%, p < 0.01) and decreased the incidence of major vascular complications sevenfold. Warfarin with INR ≥1.2 on the day of PVAI was associated with more vascular complications (4.3% vs. 1.2%, p < 0.01). Ultrasound guidance overcame the risk associated with warfarin therapy: vascular complications in anticoagulated patients with INR ≥1.2 using ultrasound guidance were two- and ninefold lower than those in patients not using ultrasound with an INR <1.2 (0.5% vs. 1.2%, p < 0.05) and INR ≥1.2 (0.5% vs. 4.3%, p < 0.01), respectively.
Conclusion: Ultrasound-guided venipuncture improves the safety profile of PVAI, reducing vascular complications in patients on warfarin to levels below those seen with no ultrasound and off warfarin.
ABSTRACT: Background:
Pulmonary vein isolation (PVI) for atrial fibrillation is associated with a transient increased risk of thromboembolic and hemorrhagic events. We hypothesized that dabigatran can be safely used as an alternative to continuous warfarin for the periprocedural anticoagulation in PVI.
Methods and results:
A total of 999 consecutive patients undergoing PVI were included; 376 patients were on dabigatran (150 mg), and 623 patients were on warfarin with therapeutic international normalized ratio. Dabigatran was held 1 to 2 doses before PVI and restarted at the conclusion of the procedure or as soon as patients were transferred to the nursing floor. Propensity score matching was applied to generate a cohort of 344 patients in each group with balanced baseline data. Total hemorrhagic and thromboembolic complications were similar in both groups, before (3.2% versus 3.9%; P=0.59) and after (3.2% versus 4.1%; P=0.53) matching. Major hemorrhage occurred in 1.1% versus 1.6% (P=0.48) before and 1.2% versus 1.5% (P=0.74) after matching in the dabigatran versus warfarin group, respectively. A single thromboembolic event occurred in each of the dabigatran and warfarin groups. Despite higher doses of intraprocedural heparin, the mean activated clotting time was significantly lower in patients who held dabigatran for 1 or 2 doses than those on warfarin.
Our study found no evidence to suggest a higher risk of thromboembolic or hemorrhagic complications with use of dabigatran for periprocedural anticoagulation in patients undergoing PVI compared with uninterrupted warfarin therapy.
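Propensity score matching, as used in the study above, pairs each dabigatran patient with the warfarin patient whose estimated propensity score is closest, typically within a caliper. A minimal greedy nearest-neighbor sketch: the scores would come from a logistic model of treatment assignment on baseline covariates, and the caliper value, IDs, and scores below are illustrative assumptions, not study data.

```python
def greedy_match(treated, controls, caliper=0.05):
    """1:1 nearest-neighbor propensity matching without replacement.

    treated, controls: lists of (patient_id, propensity_score) tuples.
    Returns (treated_id, control_id) pairs within the caliper."""
    used, pairs = set(), []
    for tid, ts in sorted(treated, key=lambda p: p[1]):
        best_id, best_d = None, caliper
        for cid, cs in controls:
            d = abs(ts - cs)
            if cid not in used and d <= best_d:  # closest unused control in caliper
                best_id, best_d = cid, d
        if best_id is not None:
            used.add(best_id)
            pairs.append((tid, best_id))
    return pairs

# Toy example: two dabigatran patients matched against three warfarin patients.
pairs = greedy_match([("d1", 0.30), ("d2", 0.52)],
                     [("w1", 0.31), ("w2", 0.50), ("w3", 0.90)])
# d1 pairs with w1 and d2 with w2; w3 falls outside every caliper.
```

Outcome rates are then compared within the matched pairs, which is how the abstract arrives at its balanced 344-versus-344 comparison.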