Kay Ryschon

Kansas City VA Medical Center, Kansas City, Missouri, United States

Publications (43) · 185.52 Total Impact

  •
    ABSTRACT: Pre- and postablation atrial fibrillation (AF) brain natriuretic peptide (BNP) levels have been shown to predict increased recurrence of AF following ablation. Our objective was to assess whether elevated BNP levels merely represent the presence of AF at the time of measurement or indeed true recurrence of AF. In a prospective study of 88 patients undergoing AF ablation, BNP levels were measured immediately before, immediately after, 24 h after, and 4-6 months postablation. BNP levels were stratified by presenting rhythm and ventricular rate at the time of measurement. Median BNP level preablation was higher in patients presenting in AF compared to sinus rhythm (SR) (54 (44-79) pg/ml vs. 30 (18-47) pg/ml, p < 0.001). Postablation restoration of SR in patients presenting in AF reduced median BNP levels from 54 (44-79) pg/ml to 40 (37-51) pg/ml (p < 0.001). However, no change was noted in patients who presented in and maintained SR throughout the procedure (30 (18-47) pg/ml to 27 (16-40) pg/ml, p = 0.270). At 4-6 months, BNP measured in patients in SR was not significantly different from postablation BNP (35 (22-53) pg/ml vs. 38 (20-52) pg/ml, p = 0.656), although 35% of them had AF recurrence during 1-year follow-up. Median BNP level measured in five patients while in atrial arrhythmia was elevated compared to their postablation BNP (464 (421-464) pg/ml vs. 37 (36-37) pg/ml, p = 0.043). BNP levels and ventricular rates were positively correlated at all time points pre- and postablation. BNP level rises acutely during AF and with rapid ventricular rates, and appears to be a function of atrial rhythm and ventricular rate rather than a short- or long-term predictor of AF ablation success.
    Journal of Interventional Cardiac Electrophysiology 05/2014; 40(2). · 1.39 Impact Factor
  •
    ABSTRACT: Background: The clinical significance of mildly elevated troponins in patients presenting to the emergency room (ER) with atrial fibrillation (AF) is not well understood. Hypothesis: We hypothesized that mildly elevated troponin in these patients is associated with adverse cardiovascular outcomes. Methods: In a multi-center, retrospective study, 662 patients with AF were divided into 3 groups based on troponin levels: group 1, mildly elevated; group 2, normal; and group 3, troponin not measured. Primary outcome was the combined endpoint of all-cause mortality and myocardial infarction (MI) at one year. Results: Levels of TnI were measured in 503 (76%) patients. They were elevated in 220 patients (33%, group 1; mean, 0.56 ng/mL), normal in 283 patients (43%, group 2), and not measured in 159 patients (24%, group 3). Significantly more cardiac testing was done at index hospitalization in group 1 (50%) compared with groups 2 and 3 (28% and 29%, P ≤ 0.001) and in the following year (29%, vs 20% and 17%, P = 0.02). Group 1 had more positive tests (62%) compared with groups 2 and 3 (25% and 43%, P ≤ 0.001). Group 1 had a significantly higher occurrence of the primary endpoint (22%, vs 10% and 15%, P = 0.002), driven primarily by a higher incidence of MI in group 1 (7%, vs 1% and 2%, P = 0.001). Conclusions: Troponin levels are routinely checked in a majority of patients presenting to the emergency department with AF. Even mildly elevated TnI is associated with a greater incidence of coronary artery disease on diagnostic testing and a higher 1-year incidence of MI.
    Clinical Cardiology 04/2014. · 2.23 Impact Factor
  •
    ABSTRACT: PURPOSE: An association between atrial fibrillation (AF) and gastroesophageal reflux disease (GERD) and/or irritable bowel syndrome (IBS) is increasingly being identified, yet the role of radiofrequency catheter ablation (RFA) of AF has not been systematically evaluated in these patient populations. METHODS: We performed a prospective matched case-control study of AF patients with GERD and/or IBS who underwent RFA for AF at two centers in North America. AF patients with GERD and/or IBS (gastrointestinal [GI] group) were matched by age, gender, and type of AF at each of the centers with an equal number of AF patients without GERD or IBS (non-GI group). RESULTS: Sixty patients were included in the study, with 30 in each group. Mean age of the population was 45 years, with 14 (47%) males and 21 (87%) patients with paroxysmal AF in each group. More patients in the GI group had identifiable GI triggers for AF episodes. During RFA, more patients in the GI group had a "vagal response" compared to the non-GI group (60% vs 13%; p < 0.001). Left atrial scar as identified by electroanatomical mapping was more common in the non-GI group than in the GI group (57% vs 27%; p = 0.018). At 1-year follow-up, 56 (93%) of the patients were free from AF, with no difference between the groups. CONCLUSIONS: The majority of AF patients with GERD and/or IBS have triggered AF and a positive vagal response during RFA. RFA is equally effective in this patient population when compared to those without GERD or IBS.
    Journal of Interventional Cardiac Electrophysiology 06/2013. · 1.39 Impact Factor
  •
    ABSTRACT: OBJECTIVES: The purpose of this study was to examine the impact of yoga on atrial fibrillation (AF) burden, quality of life (QoL), depression, and anxiety scores. BACKGROUND: Yoga is known to have significant benefit on cardiovascular health. The effect of yoga in reducing AF burden is unknown. METHODS: This single-center, pre-post study enrolled patients with symptomatic paroxysmal AF for an initial 3-month noninterventional observation period followed by twice-weekly 60-min yoga training for the next 3 months. AF episodes during the control and study periods, as well as SF-36, Zung self-rated anxiety, and Zung self-rated depression scores at baseline and before and after the study phase, were assessed. RESULTS: Yoga training reduced symptomatic AF episodes (3.8 ± 3 vs. 2.1 ± 2.6, p < 0.001), symptomatic non-AF episodes (2.9 ± 3.4 vs. 1.4 ± 2.0; p < 0.001), asymptomatic AF episodes (0.12 ± 0.44 vs. 0.04 ± 0.20; p < 0.001), and depression and anxiety (p < 0.001), and improved the QoL parameters of physical functioning, general health, vitality, social functioning, and mental health domains on the SF-36 (p = 0.017, p < 0.001, p < 0.001, p = 0.019, and p < 0.001, respectively). There was a significant decrease in heart rate and in systolic and diastolic blood pressure before and after yoga (p < 0.001). CONCLUSIONS: In patients with paroxysmal AF, yoga improves symptoms, arrhythmia burden, heart rate, blood pressure, anxiety and depression scores, and several domains of QoL. (Yoga on Arrhythmia Burden and Quality of Life in Paroxysmal Atrial Fibrillation; NCT00798356).
    Journal of the American College of Cardiology 01/2013; 61(11). · 15.34 Impact Factor
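The Zung instruments used in the yoga study above are 20-item self-report scales. A detail not restated in the abstract, but part of the scales' published convention: each item is scored 1-4, the raw sum therefore ranges 20-80, and the reported index is the raw sum multiplied by 1.25 (range 25-100). A minimal sketch of that conversion, assuming reverse-keyed items have already been recoded so that higher always means more symptomatic (the function name is illustrative, not from the paper):

```python
def zung_index(item_scores: list[int]) -> float:
    """Convert 20 Zung self-rating items to the conventional index.

    Assumes each item is already keyed so higher = more symptomatic and
    scored 1-4; raw sum ranges 20-80, index = raw * 1.25 (range 25-100).
    """
    if len(item_scores) != 20 or any(s not in (1, 2, 3, 4) for s in item_scores):
        raise ValueError("expected 20 item scores, each in 1-4")
    return sum(item_scores) * 1.25  # e.g. all 2s -> raw 40 -> index 50.0
```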
  •
    ABSTRACT: Objectives This study sought to examine whether suppressing premature ventricular contractions (PVC) using radiofrequency ablation improves effectiveness of the cardiac resynchronization therapy (CRT) in nonresponders. Background CRT is an effective strategy for drug refractory congestive heart failure. However, one-third of patients with CRT do not respond clinically, and the causes for nonresponse are poorly understood. Whether frequent PVC contribute to CRT nonresponse remains unknown. Methods In this multicenter study, CRT nonresponders with >10,000 PVC in 24 h who underwent PVC ablation were enrolled from a prospective database. Results Sixty-five subjects (age 66.6 ± 12.4 years, 78% men, QRS duration of 155 ± 18 ms) had radiofrequency ablation of PVC from 76 foci. Acute and long-term success rates of ablation were 91% and 88% in 12 ± 4 months of follow-up. There was significant improvement in left ventricular (LV) ejection fraction (26.2 ± 5.5% to 32.7 ± 6.7 %, p < 0.001), LV end-systolic diameter (5.93 ± 0.55 cm to 5.62 ± 0.32 cm, p < 0.001), LV end-diastolic diameter (6.83 ± 0.83 cm to 6.51 ± 0.91 cm, p < 0.001), LV end-systolic volume (178 ± 72 to 145 ± 23 ml, p < 0.001), LV end-diastolic volume (242 ± 85 ml to 212 ± 63 ml, p < 0.001), and median New York Heart Association functional class (3.0 to 2.0, p < 0.001). Modeling of pre-ablation PVC burden revealed an improvement in ejection fraction when the pre-ablation PVC burden was >22% in 24 h. Conclusions Frequent PVC is an uncommon yet significant cause of CRT nonresponse. Radiofrequency ablation of PVC foci improves LV function and New York Heart Association class and promotes reverse remodeling in CRT nonresponders. PVC ablation may be used to enhance CRT efficacy in nonresponders with significant PVC burden.
    Journal of the American College of Cardiology 10/2012; 60(16):1531–1539. · 15.34 Impact Factor
  • Journal of the American College of Cardiology 01/2011; 57(14).
  •
    ABSTRACT: Pulmonary vein antral isolation (PVAI) is an effective treatment for atrial fibrillation but involves prolonged procedure and fluoroscopy times. This study assesses the impact of a comprehensive radiation safety program on patient and operator radiation exposure during PVAI. We evaluated a comprehensive radiation safety program including (1) verbal reinforcement of previous fluoroscopy times, (2) effective collimation, (3) minimizing source-intensifier distance, and (4) effective lead shield use. Exposure doses in 41 consecutive patients without (group I, n = 21) and with (group II, n = 20) the use of the radiation safety program were assessed. PVAI was done using an intracardiac echo (ICE)-guided roving circular mapping catheter. A 3-dimensional mapping system was used in 27% of cases for additional guidance. Operator and patient exposure was measured during the PVAI. Age, gender, body mass index, and AF duration were similar in both groups. Total procedure (166 +/- 56 vs 178 +/- 38 min, p = 0.54) and fluoroscopy times (74 +/- 24 vs 70 +/- 20 min, p = 0.72) were comparable. Group II had a significantly lower dose-area product (234 +/- 120 vs 548 +/- 363 Gy cm(2), p = 0.03) compared to group I. Mean operator exposure was reduced by half and mean patient peak skin dose by three to ten times with the comprehensive radiation safety program. None of the patients were noted to have radiation-related skin injuries. Additional lifetime cancer risk was significantly lower in group II patients (0.08 vs 0.2%, p < 0.001) than in group I. Implementation of the comprehensive radiation safety program described above significantly decreases radiation exposure to the patient as well as the operator.
    Journal of Interventional Cardiac Electrophysiology 12/2008; 24(2):105-12. · 1.39 Impact Factor
  •
    ABSTRACT: The use of electrical stun guns has been rising among law enforcement authorities for subduing violent subjects. Multiple reports have raised concerns over their safety. The cardiovascular safety profile of these devices in relationship to the position of delivery on the torso has not been well studied. We tested 13 adult pigs using a custom device built to deliver neuromuscular incapacitating (NMI) discharges of increasing intensity that matched the waveform of a commercially available stun gun (TASER® X-26, TASER International, Scottsdale, AZ, USA). Discharges with increasing multiples of output capacitances were applied in a step-up and step-down fashion, using two tethered barbs at five locations: (1) sternal notch to cardiac apex (position 1), (2) sternal notch to supraumbilical area (position 2), (3) sternal notch to infraumbilical area (position 3), (4) side to side on the chest (position 4), and (5) upper to lower mid-posterior torso (position 5). Endpoints included determination of maximum safe multiple (MaxSM), ventricular fibrillation threshold (VFT), and minimum ventricular fibrillation induction multiple (MinVFIM). Standard TASER discharges repeated three times did not cause ventricular fibrillation (VF) at any of the five locations. When the barbs were applied in the axis of the heart (position 1), MaxSM and MinVFIM were significantly lower than when applied away from the heart, on the dorsum (position 5) (4.31 +/- 1.11 vs 40.77 +/- 9.54, P < 0.001 and 8.31 +/- 2.69 vs 50.77 +/- 9.54, P < 0.001, respectively). The values of these endpoints at positions 2, 3, and 4 were progressively higher and ranged between those of positions 1 and 5. Presence of ventricular capture at a 2:1 ratio to the delivered TASER impulses correlated with induction of VF. No significant metabolic changes were seen after standard NMI TASER discharge. There was no evidence of myocardial damage based on serum cardiac markers, electrocardiography, echocardiography, and histopathologic findings, confirming the absence of significant cardiac effects. Standard TASER discharges did not cause VF at any of the positions. Induction of VF at higher output multiples appears to be sensitive to electrode distance from the heart, with the highest ventricular fibrillation safety margin when the electrodes are placed on the dorsum. Rapid ventricular capture appears to be a likely mechanism of VF induction by higher-output TASER discharges.
    Pacing and Clinical Electrophysiology 05/2008; 31(4):398-408. · 1.75 Impact Factor
  •
    ABSTRACT: This study sought to assess cocaine's effects on Taser-induced ventricular fibrillation (VF) threshold in a pig model. Stun guns are increasingly used by law enforcement officials to restrain violent subjects, who are frequently intoxicated with cocaine and other drugs of abuse. The interaction of cocaine and the stun gun on VF induction is unknown. We tested five adult pigs using a custom device built to deliver multiples of standard neuromuscular incapacitating (NMI) discharge that matched the waveform of a commercially available electrical stun gun (Taser X-26, Taser International, Scottsdale, Arizona). The NMI discharges were applied in a step-up and step-down fashion at 5 body locations. End points included determination of maximum safe multiple, minimum VF-inducing multiple, and ventricular fibrillation threshold (VFT) before and after cocaine infusion. Standard NMI discharges (x1) did not cause VF at any of the 5 locations before or after cocaine infusion. The maximum safe multiple, minimum VF-inducing multiple, and VFT of NMI application increased with increasing electrode distance from the heart. There was a 1.5- to 2-fold increase in these values at each position after cocaine infusion, suggesting decreased cardiac vulnerability for VF. Cocaine increased the required strength of NMI discharge that caused 2:1 or 3:1 ventricular capture ratios at all of the positions. No significant changes in creatine kinase-MB and troponin-I were seen. Cocaine increased the VFT of NMI discharges at all dart locations tested and reduced cardiac vulnerability to VF. The application of cocaine increased the safety margin by 50% to 100% above the baseline safety margin.
    Journal of the American College of Cardiology 09/2006; 48(4):805-11. · 15.34 Impact Factor
  • Article: P4-93
    Heart Rhythm. 01/2006; 3(5).
  •
    ABSTRACT: Catheter ablation has significantly transformed the clinical management of atrial fibrillation (AF). The safety and efficacy of this procedure are not well understood in patients with pacemakers and defibrillators. The purpose of this study was to assess the impact of radiofrequency catheter ablation of AF in patients with pacemakers and implantable cardiac defibrillators. We studied 86 patients with pacemakers and defibrillators (group I) and a similar number of age- and gender-matched controls (group II) who underwent AF ablation between 1999 and 2004. Clinical and procedural variables were compared between the two groups. In group I, various generator and lead parameters were compared before and after the procedure. Resurgence of clinical AF after 2 months was considered recurrence. Both groups were similar with regard to age, gender, body mass index, and type of AF. Group I had a higher incidence of diabetes (17% vs 6%, P = .03) and coronary artery disease (25% vs 13%, P = .05), shorter duration of AF (31 +/- 21 vs 45 +/- 30 months, P < .001), lower left ventricular ejection fraction (49 +/- 13% vs 52 +/- 9%, P = .03), and larger left ventricular end-diastolic dimensions (4.97 +/- 0.81 vs 4.72 +/- 0.67, P = .03). No changes in the sensing and pacing thresholds, impedance of atrial and ventricular leads, or defibrillator coil impedance after AF ablation were observed in group I. Atrial lead dislodgment was seen in two patients. Transient abnormal but "expected" pulse generator behavior was seen in 25% of patients without permanent malfunction. Stroke (1% vs 1%, P = 1.000), pulmonary vein stenosis (2% vs 1%, P = .77), and AF recurrence rates at 12 months (19% vs 21%, P = .73) were similar between groups I and II. AF ablation is safe and efficacious in patients with pacemakers and defibrillators.
    Heart Rhythm 01/2006; 2(12):1309-16. · 4.92 Impact Factor
  •
    ABSTRACT: Infection is a devastating complication of permanent pacemakers (PMs) and implantable cardioverter defibrillators (ICDs). Many implanting physicians commonly use povidone-iodine solution to irrigate the device pocket before implanting the device. We sought to assess whether such a measure would alter the rate of infection. A total of 2,564 consecutive patients who received implantable PM or ICD devices between 1994 and 2002 were studied. Povidone-iodine was used for pocket irrigation in 53% and saline in 47%. A total of 18 (0.7%) patients developed pocket infections: 0.7% (10/1,359) with povidone-iodine (group I) and 0.6% (8/1,205) with saline (group II) pocket irrigation (p = ns). Groups I and II were compared on the clinical and demographic variables described in the results section. There was no statistical difference in the baseline demographic and clinical characteristics between groups I and II. ICDs were more frequently infected than PMs (56% vs 44%). Most (83%) of the devices were dual chamber. Reopening of the pocket for either lead or generator replacement had a higher incidence of infection than new implants (61% vs 39%). There was no difference in the use of preimplantation antibiotic prophylaxis. Late (61%) and deep pocket infections (78%) were more common than early (39%) and superficial infections (22%). Blood cultures were positive in 67%, and Staphylococcus aureus was the most common pathogen (50%). The mean duration of antibiotic use after the diagnosis of device infection was 35 +/- 23 days, with 72% requiring device explantation. The device was reimplanted on the contralateral side in 56% of cases. One patient in each group died due to device infection and related complications. No significant allergy to iodine was seen in either group. Povidone-iodine irrigation of the subcutaneous pocket did not alter the rate of pocket infection after pacemaker/defibrillator implantation.
    Pacing and Clinical Electrophysiology 09/2005; 28(8):789-94. · 1.25 Impact Factor
  •
    ABSTRACT: The Duke Treadmill Score (DTS) is an established clinical tool for risk stratification of coronary artery disease. In this study, we sought to assess the prognostic value of the DTS in diabetics compared with nondiabetics. We studied 100 diabetics and 202 age- and sex-matched nondiabetic controls without known coronary artery disease, risk stratified by DTS and followed for a median duration of 6.6 years. The association between DTS and primary outcomes, secondary outcomes, composite events, and rate of coronary angiography was tested. Survival free from cardiac death, nonfatal myocardial infarction, congestive heart failure, or early and late revascularization was 89%, 54%, and 13%, respectively, in the low-, intermediate-, and high-risk categories of the diabetic group (P < .0001), and 91%, 57%, and 17%, respectively, in the low- to high-risk groups of nondiabetics (P < .0001). During follow-up, diabetics had more secondary events (P = .011) and coronary angiography (P < .001) compared with nondiabetics. The DTS was a strong independent predictor of composite events in both diabetics (P < .001) and nondiabetics (P < .001). A significant number of diabetics were classified as intermediate risk and had a significantly higher incidence of coronary angiography (87.5% vs 70.8%, P = .032) and late revascularizations (35.4% vs 15.3%, P = .011) within this risk group compared with nondiabetics. Survival free from major adverse cardiac events (MACE) differed significantly across the 3 Duke risk groups for diabetics (P = .002) but not for controls (P = .07). Survival free from composite events differed significantly across the 3 Duke risk groups for both diabetics and nondiabetics (P < .0001). Overall, diabetics had higher rates of MACE, composite events (P = .011), and coronary angiography (P < .001) than nondiabetics. By multivariate analysis, the DTS was a strong predictor of survival free of composite events in both groups, and it predicted survival free from MACE and composite events equally well in patients with and without diabetes.
    American Heart Journal 09/2005; 150(3):516-21. · 4.56 Impact Factor
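The abstract above does not restate the score itself. By the standard published definition (Mark et al.), DTS = exercise time in minutes on the Bruce protocol − 5 × maximal ST deviation in mm − 4 × angina index (0 = none, 1 = non-limiting, 2 = exercise-limiting), with conventional cut points of ≥ +5 for low risk, −10 to +4 for intermediate, and ≤ −11 for high risk. A minimal sketch of that arithmetic:

```python
def duke_treadmill_score(exercise_min: float, st_dev_mm: float, angina_index: int) -> float:
    """DTS = exercise time (min, Bruce protocol) - 5 * max ST deviation (mm)
    - 4 * angina index (0 = none, 1 = non-limiting, 2 = exercise-limiting)."""
    if angina_index not in (0, 1, 2):
        raise ValueError("angina index must be 0, 1, or 2")
    return exercise_min - 5 * st_dev_mm - 4 * angina_index

def dts_risk_category(score: float) -> str:
    """Conventional cut points: >= +5 low, -10 to +4 intermediate, <= -11 high."""
    if score >= 5:
        return "low"
    if score >= -10:
        return "intermediate"
    return "high"
```

For example, 9 minutes of exercise with 1 mm of ST deviation and no angina gives a score of 4, an intermediate-risk result.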
  • Heart Rhythm 05/2005; 2(5):S3–S4. · 4.92 Impact Factor
  •
    ABSTRACT: The association between restriction fragment length polymorphisms of the vitamin D receptor (VDR) gene and bone mineral density (BMD) or the rate of bone loss is still under debate. In a longitudinal study of untreated postmenopausal elderly women, we evaluated the relationship between VDR gene polymorphisms (BsmI, TaqI, ApaI, and FokI) and the rate of bone loss over a 3-year period. We also examined the effect of adjustments for dietary and lifestyle factors on these associations. Before adjustment, the rate of femoral neck bone loss was -3.76 +/- 1.58% in women with the BB genotype and 0.45 +/- 0.65% in women with the bb genotype, which was not significantly different. Upon adjustment for dietary and lifestyle factors, statistically significant (P = 0.03) bone loss was observed at the femoral neck in women with the BB genotype (-3.66 +/- 2.44%) compared to the bb genotype (2.39 +/- 1.32%). Similar results were observed with TaqI genotypes. The rates of bone loss at other skeletal sites were not different between VDR genotypes defined by BsmI and TaqI. VDR gene polymorphisms defined by ApaI and FokI were not related to the rate of bone loss.
    The Journal of Steroid Biochemistry and Molecular Biology 06/2004; 89-90(1-5):503-6. · 4.05 Impact Factor
  •
    ABSTRACT: The role of caffeine as a risk factor for bone loss is controversial. Our goals were 1) to compare, in both a cross-sectional study and a 3-y longitudinal study, the bone mineral density (BMD) of postmenopausal women consuming high or low amounts of caffeine and 2) to study the interaction between caffeine intake, vitamin D receptor (VDR) polymorphism, and BMD in the longitudinal study. The results are derived from cross-sectional measurements of BMD in 489 elderly women (aged 65-77 y) and from longitudinal measurements made in 96 of these women who were treated with a placebo for 3 y. Changes in BMD were adjusted for confounding factors and were compared between groups with either low (≤300 mg/d) or high (>300 mg/d) caffeine intakes and between the VDR genotype subgroups of the low- and high-caffeine groups. Women with high caffeine intakes had significantly higher rates of bone loss at the spine than did those with low intakes (-1.90 +/- 0.97% compared with 1.19 +/- 1.08%; P = 0.038). When the data were analyzed according to VDR genotype and caffeine intake, women with the tt genotype had significantly (P = 0.054) higher rates of bone loss at the spine (-8.14 +/- 2.62%) than did women with the TT genotype (-0.34 +/- 1.42%) when their caffeine intake was >300 mg/d. Caffeine intakes >300 mg/d (approximately 514 g, or 18 oz, brewed coffee) accelerate bone loss at the spine in elderly postmenopausal women. Furthermore, women with the tt genetic variant of the VDR appear to be at a greater risk for this deleterious effect of caffeine on bone.
    American Journal of Clinical Nutrition 11/2001; 74(5):694-700. · 6.92 Impact Factor
  •
    ABSTRACT: To evaluate the diagnostic and prognostic significance of ST-segment deviation detected by ambulatory Holter monitoring in unselected chest pain patients referred for coronary angiography. Two hundred seventy-seven patients (71% men) who underwent coronary angiography for evaluation of chest pain were studied with 24-h ambulatory Holter monitoring within 72 h of angiography. A lumen diameter reduction of ≥50% was considered coronary artery disease. ST-segment deviation was defined as ≥1-mm deviation from the baseline lasting ≥1 min, separated by a minimum of 1 min. The patients were followed up for 65 +/- 21 months (mean +/- SD) for occurrences of death, myocardial infarction, hospitalization for unstable angina, and need for revascularization. Of the 277 patients, 223 (80%) had coronary artery disease. The prevalence of coronary artery disease was not significantly different in patients with (43 of 48 patients; 90%) and without (180 of 229 patients; 79%) Holter-detected ST-segment deviation. The diagnostic accuracy of Holter-detected ST-segment deviation in predicting the presence of coronary artery disease was poor (33%), with a sensitivity of 19% and a specificity of 91%. The presence of Holter-detected ST-segment deviation was not predictive of future cardiac events or death. ST-segment changes detected on ambulatory Holter monitoring are of limited value in identifying coronary artery disease and predicting future adverse cardiac events or death in unselected patients with chest pain.
    Chest 09/2001; 120(3):834-9. · 7.13 Impact Factor
  •
    ABSTRACT: Published reports on the effect of alcohol consumption on bone mineral density (BMD) are inconsistent. The objective of this study was to examine the relation between alcohol intake and BMD, calcitropic hormones, calcium absorption, and other biochemical indexes of bone and mineral metabolism in elderly women. The results presented are derived from baseline observations of 489 elderly women (aged 65-77 y) recruited for an osteoporosis study. The nondrinking group comprised 297 women and the drinking group comprised 148 women. Furthermore, the effect of different alcohol intakes (≤28.6, >28.6 to ≤57.2, >57.2 to ≤142.9, and >142.9 g/wk) was studied. Women who consumed alcohol had significantly higher spine (10%), total body (4.5%), and midradius (6%) BMD than did nondrinkers. An alcohol intake >28.6 g/wk was associated with higher BMD; the maximum effect was seen with an intake of >28.6 to ≤57.2 g/wk (16%, 12%, and 14% increases in spine, total body, and midradius BMD, respectively). There was a marked reduction in bone remodeling markers, serum osteocalcin, and the ratio of urinary cross-linked N-telopeptides of type 1 collagen to creatinine with alcohol consumption, suggesting that the increased BMD with alcohol consumption could be due to reduced bone remodeling. Further, serum parathyroid hormone concentrations were significantly lower in alcohol drinkers than in nondrinkers, which could be one of the causes of decreased bone resorption. Moderate alcohol intake was associated with higher BMD in postmenopausal elderly women. The protective effect of alcohol may have been a result of lower bone remodeling due to reduced parathyroid hormone concentrations or factors such as increased estrogen concentrations.
    American Journal of Clinical Nutrition 11/2000; 72(5):1206-13. · 6.92 Impact Factor
  •
    ABSTRACT: Cigarette smoking has been implicated as a risk factor for osteoporosis. In the present study, the relationship between smoking and bone mineral density, calcitropic hormones, calcium absorption, and biochemical indices related to bone and mineral metabolism was examined at baseline in subjects recruited for an osteoporosis study. The subjects included 489 elderly women, aged 65-77 years. After exclusions (thiazide users), 54 women constituted the smoking group and 390 women were classified as nonsmokers. The effect of frequency of smoking was also examined in this population (33 light smokers [<1 pack/day] and 21 heavy smokers [>1 pack/day]). Adjusted mean total body bone mineral density was 4% lower (0.968 +/- 0.019 vs. 1.009 +/- 0.004) and total hip density was 6% lower (0.778 +/- 0.024 vs. 0.826 +/- 0.006) in heavy smokers compared with nonsmokers. At the other sites measured (spine, midradius, femoral neck, trochanter, and Ward's triangle), a similar nonsignificant trend was observed. The adjusted mean calcium absorption corrected for weight was 13% lower in both light and heavy smokers compared with nonsmokers, and serum 25-hydroxyvitamin D was significantly lower (16%) in heavy smokers than nonsmokers. Serum parathyroid hormone (PTH) was higher in heavy smokers, but was not significantly different from that of nonsmokers. A significant increase in the bone remodeling markers serum osteocalcin (4.35 +/- 0.271 vs. 3.79 +/- 0.066) and urine N-telopeptide/creatinine (NTx/Cr) ratio (74.5 +/- 5.75 vs. 49.8 +/- 1.4) was seen in heavy smokers compared with nonsmokers. These results suggest that smoking lowers bone mineral density, likely as a result of decreased calcium absorption associated with secondary hyperparathyroidism and increased bone resorption.
    Bone 09/2000; 27(3):429-36. · 4.46 Impact Factor
  •
    ABSTRACT: To determine the effect of age, severity of lung disease, severity and frequency of exacerbation, steroid use, choice of antibiotic, and the presence of comorbidity on the outcome of treatment for an acute exacerbation of COPD. A retrospective chart analysis over 24 months at a university Veterans Affairs medical center. Outpatients with COPD who were treated with an antibiotic over a period of 24 months for an acute exacerbation of COPD. Severity of an acute exacerbation of COPD was defined using the criteria of Anthonisen et al: increased dyspnea, increased sputum volume, and increased sputum purulence. Severity of lung disease was stratified based on FEV(1) percent predicted using American Thoracic Society guidelines (stage I, FEV(1) ≥50%; stage II, FEV(1) 35 to 49%; stage III, FEV(1) <35%). Treatment outcome was judged successful when the patient had no return visit within 4 weeks for a respiratory problem. Failure was defined as a return visit for persistent respiratory symptoms that required a change of antibiotic in <4 weeks. One hundred seven patients with COPD (mean age +/- SD, 66.9 +/- 9.5 years) experienced 232 exacerbations over 24 months. First-line antibiotics (trimethoprim-sulfamethoxazole, ampicillin/amoxicillin, and erythromycin) were used to treat 78% of all exacerbations. Treatment failure was noted in 12.1% of first exacerbations and 14.7% of all exacerbations, with more than half the failures requiring hospitalization. Host factors independently associated with treatment failure included the following: FEV(1) <35% (46.4% vs 22.4%; p = 0.047), use of home oxygen (60.7% vs 15.6%; p < 0.0001), frequency of exacerbation (3.8 +/- 2.0 vs 1.6 +/- 0.91; p < 0.001), history of previous pneumonia (64.3% vs 35.1%; p < 0.007), history of sinusitis (28.6% vs 8.8%; p < 0.009), and use of maintenance steroids (32.1% vs 15.2%; p = 0.052). Using stepwise logistic regression analysis to identify the top independent variables, the use of home oxygen (p = 0.0002) and frequency of exacerbation (p < 0.0001) correctly classified failures in 83.3% of the patients. Surprisingly, age, the choice of antibiotic, and the presence of one or more comorbidities did not affect the treatment outcome. The results of our study suggest that patient host factors, and not antibiotic choice, may determine treatment outcome. Prospective studies in appropriately stratified patients are needed to validate these findings.
    Chest 03/2000; 117(3):662-71. · 7.13 Impact Factor
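The staging and exacerbation definitions quoted in the COPD abstract above are simple threshold rules. A minimal sketch, using the ATS FEV(1) cut points as given in the abstract, plus the conventional Anthonisen type numbering (type 1 = all three cardinal symptoms, type 2 = two, type 3 = one), which the abstract references but does not spell out:

```python
def ats_copd_stage(fev1_pct_predicted: float) -> str:
    """ATS staging as used in the study: stage I, FEV1 >= 50% predicted;
    stage II, 35-49%; stage III, < 35%."""
    if fev1_pct_predicted >= 50:
        return "I"
    if fev1_pct_predicted >= 35:
        return "II"
    return "III"

def anthonisen_type(more_dyspnea: bool, more_sputum_volume: bool, more_purulence: bool) -> int:
    """Conventional Anthonisen type: 1 = all three cardinal symptoms present,
    2 = two of three, 3 = one of three."""
    n = sum((more_dyspnea, more_sputum_volume, more_purulence))
    if n == 0:
        raise ValueError("an Anthonisen exacerbation requires at least one cardinal symptom")
    return 4 - n
```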

Publication Stats

727 Citations
185.52 Total Impact Points


  • 2014
    • Kansas City VA Medical Center
      Kansas City, Missouri, United States
  • 2013
    • University of Nebraska at Omaha
      Omaha, Nebraska, United States
  • 2008–2011
    • Kansas City University of Medicine and Biosciences
      • Department of Internal Medicine
      Kansas City, Missouri, United States
  • 2006
    • Cleveland Clinic
      Cleveland, Ohio, United States
  • 1983–2001
    • Creighton University
      • Department of Medicine
      Omaha, NE, United States