ABSTRACT: Myocardial infarction leads to changes in the geometry (remodeling) of the left ventricle (LV) of the heart. The degree and type of remodeling provide important diagnostic information for the therapeutic management of ischemic heart disease. In this paper, we present a novel analysis framework for characterizing remodeling after myocardial infarction, using LV shape descriptors derived from atlas-based shape models. Cardiac magnetic resonance images from 300 patients with myocardial infarction and 1991 asymptomatic volunteers were obtained from the Cardiac Atlas Project. Finite element models were customized to the spatio-temporal shape and function of each case using guide-point modeling. Principal component analysis was applied to the shape models to derive modes of shape variation across all cases. A logistic regression analysis was performed to determine the modes of shape variation most associated with myocardial infarction. Goodness-of-fit results obtained from end-diastolic and end-systolic shapes were compared against the traditional clinical indices of remodeling: end-diastolic volume, end-systolic volume, and LV mass. The combination of end-diastolic and end-systolic shape parameters achieved the lowest deviance, Akaike information criterion, and Bayesian information criterion, and the highest area under the receiver operating characteristic curve. Our framework therefore quantified remodeling features associated with myocardial infarction better than the current clinical measures. These features enable quantification of the amount of remodeling, the progression of disease over time, and the effect of treatments designed to reverse remodeling effects.
PLoS ONE 10/2014; 9(10):e110243. · 3.53 Impact Factor
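The pipeline in the abstract above (principal component analysis of shape vectors, logistic regression on the resulting mode scores, and model comparison via deviance/AIC/BIC) can be sketched generically in a few lines of numpy. This is a hedged illustration of the statistical machinery, not the authors' implementation; the function names and the Newton-Raphson fitting choice are ours.

```python
import numpy as np

def pca_modes(shapes, n_modes):
    """shapes: (n_cases x n_points) shape vectors. Returns the mean shape,
    the first n_modes principal modes, and per-case mode scores."""
    mean = shapes.mean(axis=0)
    # SVD of the centered data yields the modes of shape variation
    _, _, vt = np.linalg.svd(shapes - mean, full_matrices=False)
    scores = (shapes - mean) @ vt[:n_modes].T
    return mean, vt[:n_modes], scores

def fit_logistic(X, y, n_iter=50):
    """Logistic regression by Newton-Raphson; returns coefficients
    (intercept first) and the maximized log-likelihood."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ beta))
        W = p * (1 - p)
        H = X1.T @ (X1 * W[:, None])            # Hessian X'WX
        beta += np.linalg.solve(H, X1.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X1 @ beta))
    ll = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return beta, ll

def aic_bic(ll, k, n):
    """k = number of parameters, n = number of cases.
    Lower AIC/BIC indicates the better-fitting model."""
    return 2 * k - 2 * ll, k * np.log(n) - 2 * ll
```

Deviance is simply -2 times the log-likelihood, so the same fit drives all three comparison criteria reported in the abstract.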
ABSTRACT: Background: The benefit of a primary prevention implantable cardioverter-defibrillator (ICD) among patients with chronic kidney disease is uncertain.
Study Design: Meta-analysis of patient-level data from randomized controlled trials.
Setting & Population: Patients with symptomatic heart failure and left ventricular ejection fraction < 35%.
Selection Criteria for Studies: From 7 available randomized controlled studies with patient-level data, we selected studies with available data for important covariates. Studies without patient-level data for baseline estimated glomerular filtration rate (eGFR) were excluded.
Intervention: Primary prevention ICD versus usual care; effect modification by eGFR.
Outcomes: Mortality, rehospitalizations, and effect modification by eGFR.
Results: We included data from the Multicenter Automatic Defibrillator Implantation Trial I (MADIT-I), MADIT-II, and the Sudden Cardiac Death in Heart Failure Trial (SCD-HeFT). In total, 2,867 patients were included; 36.3% had eGFR < 60 mL/min/1.73 m2. The Kaplan-Meier estimate of the probability of death during follow-up was 43.3% for 1,334 patients receiving usual care and 35.8% for 1,533 ICD recipients. After adjustment for baseline differences, there was evidence that the survival benefit of ICDs in comparison to usual care depends on eGFR (posterior probability for null interaction P < 0.001). The ICD was associated with survival benefit for patients with eGFR ≥ 60 mL/min/1.73 m2 (adjusted HR, 0.49; 95% posterior credible interval, 0.24-0.95), but not for patients with eGFR < 60 mL/min/1.73 m2 (adjusted HR, 0.80; 95% posterior credible interval, 0.40-1.53). eGFR did not modify the association between the ICD and rehospitalizations.
Limitations: Few patients with eGFR < 30 mL/min/1.73 m2 were available. Differences in trial-to-trial measurement techniques may lead to residual confounding.
Conclusions: Reductions in baseline eGFR decrease the survival benefit associated with the ICD. These findings should be confirmed by additional studies specifically targeting patients with varying eGFRs.
American Journal of Kidney Diseases 07/2014; · 5.76 Impact Factor
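The Kaplan-Meier estimates quoted in the results above (43.3% vs 35.8% probability of death) come from the standard product-limit estimator, which is compact enough to sketch directly. This is a generic illustration, not the trial analysis code.

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier product-limit survival curve.
    time:  follow-up time for each patient
    event: 1 if death observed, 0 if censored at that time
    Returns (distinct event times, survival probability just after each)."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=int)
    times, surv = [], []
    s = 1.0
    for t in np.unique(time):
        d = event[time == t].sum()     # deaths at time t
        n = (time >= t).sum()          # patients still at risk at t
        if d > 0:
            s *= 1.0 - d / n           # multiply in this interval's survival
            times.append(t)
            surv.append(s)
    return np.array(times), np.array(surv)
```

The probability of death reported in such analyses is one minus the final survival estimate at the end of follow-up.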
ABSTRACT: Multiple techniques have been assessed as predictors of death due to ventricular tachyarrhythmias/sudden death in patients with non-ischemic dilated cardiomyopathy (NIDCM). We performed a meta-analysis to estimate the performance of 12 commonly reported risk stratification tests as predictors of arrhythmic events in patients with NIDCM.
Forty-five studies enrolling 6088 patients evaluating the association between arrhythmic events and predictive tests (baroreflex sensitivity, heart rate turbulence, heart rate variability, left ventricular end diastolic dimension, left ventricular ejection fraction, electrophysiology study, non-sustained ventricular tachycardia, left bundle branch block, signal-averaged electrocardiogram, fragmented QRS, QRS-T angle, and T-wave alternans) were included. Raw event rates were extracted and meta-analysis was performed using mixed effects methodology. We also used trim-and-fill method to estimate the influence of missing studies on the results.
Patients were 52.8±14.5 years old and 77% were male. LVEF was 30.6±11.4%. Test sensitivities ranged from 28.8% to 91.0%; specificities from 36.2% to 87.1%; odds ratios from 1.5 to 6.7. OR was highest for fragmented QRS and TWA (OR=6.73 and 4.66, 95% confidence interval 3.85-11.76 and 2.55-8.53, respectively) and lowest for QRS duration (OR=1.51, 1.13-2.01). None of the autonomic tests (HRV, HRT, BRS) were significant predictors of arrhythmic outcomes. Accounting for publication bias reduced the odds ratios for the various predictors but did not eliminate the predictive association.
Techniques incorporating functional parameters, depolarization abnormalities, repolarization abnormalities, and arrhythmic markers provide only modest risk stratification for SCD in patients with NIDCM. It is likely that combinations of tests will be required to optimize risk stratification in this population.
Journal of the American College of Cardiology 01/2014; · 15.34 Impact Factor
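The per-test sensitivities, specificities, and odds ratios summarized above all derive from 2x2 tables of test result versus arrhythmic outcome. A minimal sketch of those single-table calculations follows (using the Woolf logit confidence interval, our choice; the study itself pooled across trials with mixed-effects methodology, which is not reproduced here).

```python
import math

def table_metrics(a, b, c, d):
    """Metrics from a 2x2 table:
         a = test positive, event      b = test positive, no event
         c = test negative, event      d = test negative, no event
    Returns sensitivity, specificity, odds ratio, and Woolf 95% CI."""
    sens = a / (a + c)
    spec = d / (b + d)
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    ci = (math.exp(math.log(or_) - 1.96 * se),
          math.exp(math.log(or_) + 1.96 * se))
    return sens, spec, or_, ci
```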
ABSTRACT: Purpose: Although the ICD has been shown to be effective in preventing mortality in several large randomized controlled trials (RCTs), its effectiveness in specific populations is uncertain. We combined data from 7 RCTs to develop, calibrate, and validate a prediction model that estimated survival over time in subgroups of interest.
Methods: Using patient-level data from 7 RCTs representing 4455 patients, we developed a Bayesian hierarchical Weibull regression model to combine data while allowing for trial-specific baseline hazard functions and treatment effects. The final model, derived from backwards elimination, included the main effects of treatment, covariates and the interaction between age and treatment. We performed frequentist evaluation of our prediction model using calibration and discrimination statistics. We performed internal validation using bootstrap samples of the combined data set and external validation using registry data. The model explored patients in 192 subgroups stratified by treatment, age, ejection fraction (EF), New York Heart Association (NYHA) class, QRS, and presence of ischemic disease.
Results: With the borrowing of strength between covariate categories and across trials, our Bayesian hierarchical model allows predictions even for subgroups with small sizes (subgroup sample size ranged from 0 to 200) though with increased uncertainty in such cases. The prediction model had a C-statistic of 0.72 (se=0.01) at year 1 indicating good discrimination and was well calibrated (p=0.99). The C-statistic was slightly smaller at years 2-5 (range: 0.67-0.70), but the model predictions remained calibrated. The same general conclusions were obtained using either internal or external validation data sets. At 5 years, the model predicts the ICD to be more effective in all subgroups. Predicted 5-yr survival with an ICD varied from 29.6% (75+y, NYHAIII, EF<30, QRS>=120, ischemia) to 90% (<65y, NYHAI, EF>=30, QRS>=120, no ischemia), while survival in the control group varied from 20% (75+y, NYHAIII, EF<30, QRS>=120, ischemia) to 84.2% (<65y, NYHAII, EF>=30, QRS<120, no ischemia). The absolute survival benefit ranged from 0.6% to 21.6% across subgroups.
Conclusion: Our findings suggest that over time ICD treatment is more effective in most subgroups relative to non-ICD. Incorporation of this prediction model into a decision-analytic framework will allow exploration of harms/benefits of ICD use in specific subgroups of interest, while also exploring the uncertainty of these findings and the value of additional data acquisition.
The 35th Annual Meeting of the Society for Medical Decision Making; 10/2013
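The Bayesian hierarchical Weibull model described above predicts subgroup survival curves; its deterministic core, a Weibull proportional-hazards survival function, is easy to sketch. Parameter names here are illustrative only; the actual model adds trial-specific baseline hazards and an age-by-treatment interaction, which are not reproduced.

```python
import math

def weibull_survival(t, shape, scale, log_hr=0.0):
    """S(t) = exp(-(t/scale)^shape * exp(log_hr)).
    log_hr scales the baseline cumulative hazard under proportional
    hazards, e.g. a negative value for a protective ICD effect."""
    return math.exp(-((t / scale) ** shape) * math.exp(log_hr))
```

Under this parameterization, predicted survival for a subgroup with and without the treatment effect gives the kind of absolute survival benefit quoted in the results.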
ABSTRACT: A collaborative framework was initiated to establish a community resource of ground truth segmentations from cardiac MRI. Multi-site, multi-vendor cardiac MRI datasets comprising 95 patients (73 men, 22 women; mean age 62.73±11.24 years) with coronary artery disease and prior myocardial infarction, were randomly selected from data made available by the Cardiac Atlas Project (Fonseca et al., 2011). Three semi-automated and two fully automated raters segmented the left ventricular myocardium from short-axis cardiac MR images as part of a challenge introduced at the STACOM 2011 MICCAI workshop (Suinesiaputra et al., 2012). Consensus myocardium images were generated based on the Expectation-Maximization principle implemented by the STAPLE algorithm (Warfield et al., 2004). The mean sensitivity, specificity, positive predictive and negative predictive values ranged between 0.63 and 0.85, 0.60 and 0.98, 0.56 and 0.94, and 0.83 and 0.92, respectively, against the STAPLE consensus. Spatial and temporal agreement varied by rater. STAPLE produced high quality consensus images if the region of interest was limited to the area of discrepancy between raters. To maintain the quality of the consensus, an objective measure based on the candidate automated rater performance distribution is proposed. The consensus segmentation based on a combination of manual and automated raters was more consistent than any particular rater, even those with manual input. The consensus is expected to improve with the addition of new automated contributions. This resource is open for future contributions, and is available as a test bed for the evaluation of new segmentation algorithms, through the Cardiac Atlas Project (www.cardiacatlas.org).
Medical image analysis 09/2013; 18(1):50-62. · 3.09 Impact Factor
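The consensus method referenced above can be illustrated with a simplified, binary-label version of the EM scheme in STAPLE (Warfield et al., 2004): alternate between a soft consensus (E-step) and per-rater sensitivity/specificity estimates (M-step). This is a sketch only; the published algorithm additionally handles spatial priors and multi-label segmentations.

```python
import numpy as np

def staple(D, prior=0.5, n_iter=30):
    """Simplified binary STAPLE.
    D: (raters x pixels) binary segmentations.
    Returns per-pixel consensus probability W, plus per-rater
    sensitivity p and specificity q estimated by EM."""
    R, N = D.shape
    p = np.full(R, 0.9)   # initial sensitivities
    q = np.full(R, 0.9)   # initial specificities
    for _ in range(n_iter):
        # E-step: probability each pixel is truly foreground,
        # given every rater's vote and current performance estimates
        a = prior * np.prod(np.where(D == 1, p[:, None], 1 - p[:, None]), axis=0)
        b = (1 - prior) * np.prod(np.where(D == 1, 1 - q[:, None], q[:, None]), axis=0)
        W = a / (a + b)
        # M-step: re-estimate rater performance against the soft consensus
        p = (D * W).sum(axis=1) / W.sum()
        q = ((1 - D) * (1 - W)).sum(axis=1) / (1 - W).sum()
    return W, p, q
```

Raters who agree with the emerging consensus are up-weighted, which is why the combined consensus can be more consistent than any individual rater.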
ABSTRACT: Cardiovascular imaging studies generate a wealth of data which is typically used only for individual study endpoints. By pooling data from multiple sources, quantitative comparisons can be made of regional wall motion abnormalities between different cohorts, enabling reuse of valuable data. Atlas-based analysis provides precise quantification of shape and motion differences between disease groups and normal subjects. However, subtle shape differences may arise due to differences in imaging protocol between studies.
A mathematical model describing regional wall motion and shape was used to establish a coordinate system registered to the cardiac anatomy. The atlas was applied to data contributed to the Cardiac Atlas Project from two independent studies which used different imaging protocols: steady state free precession (SSFP) and gradient recalled echo (GRE) cardiovascular magnetic resonance (CMR). Shape bias due to imaging protocol was corrected using an atlas-based transformation which was generated from a set of 46 volunteers who were imaged with both protocols.
Shape bias between GRE and SSFP was regionally variable, and was effectively removed using the atlas-based transformation. Global mass and volume bias was also corrected by this method. Regional shape differences between cohorts were more statistically significant after removing regional artifacts due to imaging protocol bias.
Bias arising from imaging protocol can be both global and regional in nature, and is effectively corrected using an atlas-based transformation, enabling direct comparison of regional wall motion abnormalities between cohorts acquired in separate studies.
Journal of Cardiovascular Magnetic Resonance 09/2013; 15(1):80. · 4.44 Impact Factor
ABSTRACT: BACKGROUND: As left ventricular ejection fraction (LVEF) may improve, worsen, or stay the same over time, patients' prognosis may also be expected to change related to change in LVEF, among other factors. OBJECTIVE: To evaluate the effect of LVEF change on outcome in DEFINITE. METHODS: DEFINITE enrolled patients with nonischemic cardiomyopathy with LVEF<36%, history of symptomatic heart failure, and the presence of significant ventricular ectopic activity. Follow-up LVEF measurements were obtained annually in only a minority (17%) of trial participants. This study therefore evaluated survival and arrhythmic endpoints in patients whose LVEF was re-assessed between 90-730 days after enrollment. RESULTS: During the 90-730 days post-randomization period, 187/449 enrolled patients (42%) who survived past 90 days had at least one follow-up LVEF measurement; these patients tended to be younger, white, and diabetic; had better 6-minute walk test results and higher BMI; were more likely to receive appropriate shocks; and had fewer deaths than those without follow-up LVEF measurements. Patients whose LVEF improved had reduced mortality compared to patients who had a decrease in LVEF (HR=0.09, 95% CI 0.02-0.39; p=0.001). Survival free of appropriate shocks was not significantly related to LVEF improvement during follow-up. CONCLUSIONS: LVEF improvement was associated with improved survival, but not with a significant decrease in appropriate shocks. These data highlight that appropriate caution should be taken not to extrapolate the positive effect of improved LVEF to elimination of arrhythmic events.
Heart rhythm: the official journal of the Heart Rhythm Society 02/2013; · 4.56 Impact Factor
ABSTRACT: BACKGROUND: Whether there is an optimal time to place an implantable cardioverter-defibrillator (ICD) more than 40 days after myocardial infarction (MI) in guideline-eligible patients is unknown. OBJECTIVE: To evaluate the impact of time from MI to randomization on mortality, re-hospitalizations, and complications. METHODS: Individual data on patients enrolled in 9 primary prevention ICD trials were provided. Clinical trials were eligible for the current analysis if they enrolled patients with an MI more than 40 days prior to randomization to primary prevention ICD therapy versus usual care: MADIT-I, MUSTT, MADIT-II, and SCD-HeFT. RESULTS: ICD recipients died less frequently than non-recipients at 5 years across all subgroups of time from MI to randomization. In unadjusted Cox proportional hazards regression, a survival benefit was evident in most subgroups. Adjusted Bayesian Weibull survival modeling yielded hazard ratio (HR) 0.50 (95% posterior credible interval [PCI] 0.20-1.25) for 41-180 days after MI; HR 0.98 (95% PCI 0.37-2.37) for 181-365 days after MI; HR 0.22 (95% PCI 0.07-0.59) for >1-2 years after MI; HR 0.42 (95% PCI 0.17-0.90) for >2-5 years after MI; HR 0.55 (95% PCI 0.25-1.15) for >5-10 years after MI; and HR 0.48 (95% PCI 0.20-1.02) for >10 years after MI. There was no evidence of an interaction between time from MI and all-cause mortality, re-hospitalizations, or complications. CONCLUSIONS: In this meta-analysis, there was scant evidence that the efficacy of primary prevention ICD therapy, and no evidence that the risks of re-hospitalizations or complications, are dependent on time to implantation more than 40 days after MI.
Heart rhythm: the official journal of the Heart Rhythm Society 02/2013; · 4.56 Impact Factor
ABSTRACT: Peri-infarct border zone (BZ) as quantified by late gadolinium enhancement (LGE) on cardiac magnetic resonance imaging (MRI) has been proposed as a risk stratification tool, and is associated with increased mortality. BZ has been measured by various methods in the literature. We assessed which BZ analysis best predicts inducible arrhythmia during electrophysiological study (EPS).
LGE was performed in 47 patients with coronary artery disease referred for EPS to assess for ventricular tachycardia (VT). LGE data were analyzed for BZ quantification by 3 previously published methods: Method I (BZ-I) used pixels 2-3 standard deviations above the mean of normal tissue, expressed as a percentage of left ventricular mass; Method II (BZ-II) followed the approach described by Yan; and Method III (BZ-III) followed the approach described by Schmidt. EPS results were classified as negative (non-inducible) or positive (inducible monomorphic VT, MVT).
The 47 subjects had a mean age of 61.7 years, and 72% were male. During EPS, 20 patients were non-inducible and 18 had induced MVT. Ejection fraction was not significantly different between non-inducible patients and those with MVT (34.1% vs. 28.5%, p = 0.13). BZ-I was significantly different (1.4% vs. 2.6%, p = 0.001), but not BZ-II (7.9% vs. 6.9%, p = 0.68) or BZ-III (2.7 g vs. 2.1 g, p = 0.88). Multivariate analysis demonstrated that only BZ-I was an independent predictor of EPS outcome after controlling for infarct size (OR 1.97 per % change, 95% CI 1.04-3.73, p = 0.04).
This study demonstrates significant variability between the published methods for measuring BZ. Also, BZ-I is a stronger predictor of inducible MVT during EPS than ejection fraction and infarct size. BZ may be another LGE marker of elevated risk of arrhythmia.
ABSTRACT: Background- Implantable cardioverter-defibrillators (ICDs) are increasingly used for primary prevention after randomized, controlled trials demonstrating that they reduce the risk of death in patients with left ventricular systolic dysfunction. The extent to which the clinical characteristics and long-term outcomes of unselected, community-based patients with left ventricular systolic dysfunction undergoing primary prevention ICD implantation in a real-world setting compare with those enrolled in the randomized, controlled trials is not well characterized. This study is being conducted to address these questions. Methods and Results- The study cohort includes consecutive patients undergoing primary prevention ICD placement between January 1, 2006 and December 31, 2009 in 7 health plans. Baseline clinical characteristics were acquired from the National Cardiovascular Data Registry ICD Registry. Longitudinal data collection is underway, and will include hospitalization, mortality, and resource use from standardized health plan data archives. Data regarding ICD therapies will be obtained through chart abstraction and adjudicated by a panel of experts in device therapy. Compared with the populations of primary prevention ICD therapy randomized, controlled trials, the cohort (n=2621) is on average significantly older (by 2.5-6.5 years), more often female, more often from racial and ethnic minority groups, and has a higher burden of coexisting conditions. The cohort is similar, however, to a national population undergoing primary prevention ICD placement. Conclusions- Patients undergoing primary prevention ICD implantation in this study differ from those enrolled in the randomized, controlled trials that established the efficacy of ICDs. Understanding a broad range of health outcomes, including ICD therapies, will provide patients, clinicians, and policy makers with contemporary data to inform decision-making.
ABSTRACT: BACKGROUND: Implantable cardioverter-defibrillators (ICD) are recommended for the primary prevention of sudden cardiac death in patients with left ventricular dysfunction but it is unclear whether treatment benefits are diminished in patients with very low baseline left ventricular ejection fraction (LVEF) (<25%) or increased in those with prolonged QRS duration (>120 msec). OBJECTIVE: To study the effects of very low LVEF and prolonged QRS duration on the mortality benefits of ICD therapy. METHODS: We performed a meta-analysis of primary prevention randomized controlled trials comparing ICD and standard medical therapy. All-cause mortality hazard ratios in subgroups according to thresholds of 25% for LVEF and 120 msec for QRS duration were extracted from published reports or contributed by trial investigators and synthesized. RESULTS: There was no significant difference of ICD effectiveness in LVEF subgroups of 25-35% (random effects HR 0.81, 95% CI 0.70-0.94) versus <25% (HR 0.71, 95% CI 0.55-0.93). Results were similar also in the narrow and wide QRS subgroups (HR 0.78, 95% CI 0.68-0.90 and 0.70, 0.51-0.95, respectively). Within the LVEF<25% and wide QRS subgroups, there was large heterogeneity driven by the DINAMIT trial that included early post-myocardial infarction patients and its results (HR 1.49, 95% CI 0.84-2.68, and 1.51, 0.83-2.83, respectively) differed significantly from other trials (p=0.008 and p=0.01, respectively). CONCLUSION: LVEF values and QRS duration do not appear to directly modify the survival benefit of ICD in patients with baseline LVEF<35%. However, patients with a recent myocardial infarction do not benefit from ICD, especially when they have LVEF <25% and/or wide QRS.
Heart rhythm: the official journal of the Heart Rhythm Society 10/2012; · 4.56 Impact Factor
ABSTRACT: AIMS: This study investigated autonomic nervous system function in subjects with diabetes during exercise and recovery. METHODS: Eighteen subjects with type 2 diabetes (age 55±2 years) and 20 healthy controls (age 51±1 years) underwent two 16-min bicycle submaximal ECG stress tests followed by 45 min of recovery. During session #2, atropine (0.04 mg/kg) was administered at peak exercise, and the final two minutes of exercise and entire recovery occurred under parasympathetic blockade. Plasma catecholamines were measured throughout. Parasympathetic effect was defined as the difference between a measured parameter at baseline and after parasympathetic blockade. RESULTS: The parasympathetic effect on the RR interval was blunted (P=.004) in diabetic subjects during recovery. Parasympathetic effect on QT-RR slope during early recovery was diminished in the diabetes group (diabetes 0.13±0.02, control 0.21±0.02, P=.03). Subjects with diabetes had a lower heart rate recovery at 1 min (diabetes 18.5±1.9 bpm, control 27.6±1.5 bpm, P<.001). CONCLUSIONS: In subjects with well-controlled type 2 diabetes, even with minimal evidence of cardiac autonomic neuropathy (CAN) using current methodology, altered cardiac autonomic balance is present and can be detected through an exercise-based assessment for CAN. The early post-exercise recovery period in diabetes was characterized by enhanced sympathoexcitation, diminished parasympathetic reactivation and delay in heart rate recovery.
Journal of diabetes and its complications 10/2012; · 2.11 Impact Factor
ABSTRACT: Background: There is a heightened risk of sudden cardiac death related to exercise and the postexercise recovery period, but the precise mechanism is unknown. We have demonstrated that sympathoexcitation persists for ≥45 minutes after exercise in normal subjects and in subjects with coronary artery disease (CAD). The purpose of this study is to determine whether this persistent sympathoexcitation is associated with persistent heart rate variability (HRV) and ventricular repolarization changes in the postexercise recovery period. Methods and Results: Twenty control subjects (age 50.7 ± 1.4 years), 68 subjects (age 58.2 ± 1.5 years) with CAD and preserved left ventricular ejection fraction (LVEF), and 18 subjects (age 57.6 ± 2.4 years) with CAD and depressed LVEF underwent a 16-minute submaximal bicycle exercise protocol with continuous ECG monitoring. QT and RR intervals were measured in recovery to calculate the time dependent corrected QT intervals (QTc), the QT-RR relationship, and HRV. QTc was dependent on the choice of rate correction formula. There were no differences in QT-RR slopes among the three groups in early recovery. HRV recovered quickly in controls, more slowly in those with CAD-preserved LVEF, and to a lesser extent in those with CAD-depressed LVEF. Conclusion: Despite persistent sympathoexcitation for the 45-minute recovery period, ventricular repolarization changes do not persist for that long and HRV changes differ by group. Additional understanding of the dynamic changes in cardiac parameters after exercise is needed to explore the mechanism of increased sudden cardiac death risk at this time.
Annals of Noninvasive Electrocardiology 10/2012; 17(4):349-60. · 1.08 Impact Factor
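The finding above that QTc depends on the choice of rate-correction formula is easy to demonstrate with the two standard corrections, Bazett and Fridericia, sketched here (the study does not state which formulas it compared, so these serve only as common examples):

```python
def qtc_bazett(qt_ms, rr_s):
    """Bazett correction: QTc = QT / sqrt(RR), QT in ms, RR in s."""
    return qt_ms / rr_s ** 0.5

def qtc_fridericia(qt_ms, rr_s):
    """Fridericia correction: QTc = QT / RR^(1/3)."""
    return qt_ms / rr_s ** (1 / 3)
```

At elevated heart rates (RR < 1 s), Bazett corrects more aggressively than Fridericia, so the same measured QT yields systematically different QTc values, which matters when heart rate is changing rapidly during recovery.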
ABSTRACT: Fibrotic and autonomic remodeling in heart failure (HF) increase vulnerability to atrial fibrillation (AF). Because AF electrograms (EGMs) are thought to reflect the underlying structural substrate, we sought to (1) determine the differences in AF EGMs in normal versus HF atria and (2) assess how fibrosis and nerve-rich fat contribute to AF EGM characteristics in HF.
AF was induced in 20 normal dogs by vagal stimulation and in 21 HF dogs (subjected to 3 weeks of rapid ventricular pacing at 240 beats per minute). AF EGMs were analyzed for dominant frequency (DF), organization index, fractionation intervals (FIs), and Shannon entropy. In 8 HF dogs, AF EGM correlation with underlying fibrosis/fat/nerves was assessed. In HF compared with normal dogs, DF was lower and organization index/FI/Shannon entropy were greater. DF/FI were more heterogeneous in HF. Percentage fat was greater, and fibrosis and fat were more heterogeneously distributed in the posterior left atrium than in the left atrial appendage. DF/organization index correlated closely with %fibrosis. Heterogeneity of DF/FI correlated with the heterogeneity of fibrosis. Autonomic blockade caused a greater change in DF/FI/Shannon entropy in the posterior left atrium than left atrial appendage, with the decrease in Shannon entropy correlating with %fat.
The amount and distribution of fibrosis in the HF atrium seems to contribute to slowing and increased organization of AF EGMs, whereas the nerve-rich fat in the HF posterior left atrium is positively correlated with AF EGM entropy. By allowing for improved detection of regions of dense fibrosis and high autonomic nerve density in the HF atrium, these findings may help enhance the precision and success of substrate-guided ablation for AF.
Circulation Arrhythmia and Electrophysiology 06/2012; 5(4):640-9. · 5.95 Impact Factor
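Two of the electrogram measures analyzed above, dominant frequency and Shannon entropy, can be sketched generically. The spectral details and bin count below are illustrative choices, not the study's exact definitions.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Dominant frequency: location of the power-spectrum peak (DC removed).
    fs is the sampling rate in Hz."""
    spec = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spec)]

def shannon_entropy(signal, n_bins=32):
    """Shannon entropy (in bits) of the amplitude distribution:
    higher values indicate a more disorganized signal."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

A slower, more organized AF electrogram shows a lower dominant frequency and lower entropy, matching the direction of the HF findings reported above.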
ABSTRACT: Early repolarization (ER) on a 12-lead electrocardiogram has recently been associated with ventricular tachyarrhythmias (VTAs) in patients without structural heart disease and in patients with healed myocardial infarction (MI). An association between ER and VTAs in the setting of acute ST-segment elevation MI (STEMI) has not been explored. In a single-center retrospective case-control design, 50 patients with STEMI complicated by VTAs (cases), defined as ventricular fibrillation, sustained ventricular tachycardia, or nonsustained ventricular tachycardia within 72 hours of the index hospitalization, were matched for age and gender with 50 subjects with STEMI without VTAs (controls). Electrocardiograms obtained an average of 1 year before STEMI were analyzed for ER pattern, defined as notching or slurring of the terminal QRS complex or J-point elevation >0.1 mV above baseline in ≥ 2 contiguous leads. ER was more prevalent overall in cases than in controls (26% vs 4%, p = 0.01) and localized to anterior (16% vs 0%) and inferior (14% vs 2%, p = 0.07) leads but not lateral limb leads. Notching (10% vs 2%, p = 0.1) and J-point elevation (16% vs 0%) were more common in cases. Slurring was uncommon. ER was associated with VTAs (odds ratio [OR] 6.5, 95% confidence interval [CI] 1.5 to 28.8, p = 0.01), even after adjustment for creatine kinase-MB (OR 9.2, 95% CI 1.6 to 53.4, p = 0.01) and ejection fraction (OR 5.7, 95% CI 1.2 to 27.1, p = 0.03). In conclusion, ER is associated with VTAs in patients with STEMI even after adjustment for left ventricular ejection fraction or creatine kinase-MB levels. Larger prospective studies exploring potential associations and mechanisms of ventricular arrhythmogenesis with ER pattern are needed.
The American journal of cardiology 05/2012; 110(5):615-20. · 3.58 Impact Factor