ABSTRACT: The development of acute kidney injury (AKI) is associated with poor outcome. The modified RIFLE (risk, injury, failure, loss of kidney function, and end-stage renal failure) classification for AKI, which classifies patients with renal replacement therapy needs according to RIFLE failure class, improves the predictive value of AKI in patients undergoing cardiac surgery. Our aim was to assess risk factors for post-operative AKI and the impact of renal function on short- and long-term survival among all AKI subgroups using the modified RIFLE classification.
We prospectively studied 2,940 consecutive cardiosurgical patients between January 2004 and July 2009. AKI was defined according to the modified RIFLE system. Pre-operative, operative and post-operative variables routinely measured on and during admission, including the main outcomes, were recorded together with cardiac surgery scores and ICU scores. These data were evaluated for association with AKI and with staging in the different RIFLE groups by means of multivariable analyses. Survival was analyzed via Kaplan-Meier curves and a risk-adjusted Cox proportional hazards regression model. Complete follow-up (mean 6.9 +/- 4.3 years) was available for 2,840 patients up to April 2013.
Of the patients studied, 14% (n = 409) were diagnosed with AKI. We identified one intra-operative (longer cardiopulmonary bypass time) and two post-operative (a longer need for vasoactive drugs and higher arterial lactate 24 hours after admission) predictors of AKI. The worst outcomes, including in-hospital mortality, were associated with the worst RIFLE class. Kaplan-Meier analysis showed survival of 74.9% in the RIFLE risk group, 42.9% in the RIFLE injury group and 22.3% in the RIFLE failure group (P < 0.001). Classification at RIFLE injury (hazard ratio (HR) = 2.347, 95% confidence interval (CI) 1.122 to 4.907, P = 0.023) and at RIFLE failure (HR = 3.093, 95% CI 1.460 to 6.550, P = 0.003) were independent predictors of long-term mortality.
AKI development after cardiac surgery is associated mainly with post-operative variables, which ultimately can lead to a worse RIFLE class. Staging at the RIFLE injury or RIFLE failure class is associated with higher short- and long-term mortality in our population.
Critical care (London, England) 12/2013; 17(6):R293. · 4.72 Impact Factor
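The survival percentages above come from Kaplan-Meier (product-limit) estimation, which is compact enough to sketch directly. The follow-up data below are invented for illustration and are not study data:

```python
def kaplan_meier(times, events):
    """Product-limit survival curve for right-censored data.

    times  -- follow-up time for each patient
    events -- 1 if death was observed, 0 if the patient was censored
    Returns a list of (time, survival) pairs at each observed death time.
    """
    # Distinct times at which at least one death occurred, in order
    death_times = sorted({t for t, e in zip(times, events) if e == 1})
    survival = 1.0
    curve = []
    for t in death_times:
        at_risk = sum(1 for ti in times if ti >= t)  # still under follow-up at t
        deaths = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
        survival *= 1 - deaths / at_risk             # product-limit step
        curve.append((t, survival))
    return curve

# Hypothetical follow-up (years) for six patients; event 0 = censored
times = [1, 2, 2, 3, 5, 7]
events = [1, 1, 0, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Each observed death time multiplies the running survival estimate by (1 - deaths/at-risk); censored patients simply leave the risk set, which is how group-wise survival (e.g. 74.9% vs. 22.3%) is estimated from incomplete follow-up.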
ABSTRACT: Tissue-based xenografts such as cartilage are rejected within weeks by humoral and cellular mechanisms, which precludes their clinical application in regenerative medicine. The problem could be overcome by identifying the key molecules triggering rejection and developing genetic-engineering strategies to counteract them. Accordingly, high expression of α1,2-fucosyltransferase (HT) in xenogeneic cartilage reduces the galactose α1,3-galactose (Gal) antigen and delays rejection. Yet the role of complement activation in this setting is unknown.
To determine its contribution, we assessed the effect of inhibiting C5 complement component in α1,3-galactosyltransferase-knockout (Gal KO) mice transplanted with porcine cartilage and studied the effect of human complement on porcine articular chondrocytes (PAC).
Treatment with an anti-mouse C5 blocking antibody for 5 weeks enhanced graft survival by reducing cellular rejection. Moreover, PAC were highly resistant to complement-mediated lysis and primarily responded to human complement by releasing IL-6 and IL-8. This occurred even in the absence of anti-Gal antibody and was mediated by both C5a and C5b-9. Indeed, C5a directly triggered IL-6 and IL-8 secretion and up-regulated expression of swine leukocyte antigen I (SLA-I) and adhesion molecules on chondrocytes, all processes that enhance cellular rejection. Finally, the use of anti-human C5/C5a antibodies and/or recombinant expression of human complement regulatory molecule CD59 (hCD59) conferred protection in correspondence with their specific functions.
Our study demonstrates that complement activation contributes to rejection of xenogeneic cartilage and provides valuable information for selecting approaches for complement inhibition.
Osteoarthritis and Cartilage 09/2013; · 4.26 Impact Factor
ABSTRACT: Severe lithium poisoning is a frequent condition in the intoxicated intensive care unit population. Dialysis is the treatment of choice, but no clinical markers predicting a higher requirement for dialysis have been identified to date. We analyzed the characteristics of lithium overdose patients needing dialysis to improve lithium clearance, and identified those associated with higher dialysis requirements. This is an observational, retrospective study of 14 patients with lithium poisoning admitted from 2004 to 2009. Mean age was 41.8 ± 16.1 years. Poisonings were acute in 7.1%, acute-on-chronic in 64.3%, and chronic in 28.6% of cases. Comparing clinical and biochemical data in patients requiring more than one dialysis session with those requiring only one session, the univariate analysis showed differences at admission in creatinine clearance (40.5 ± 23 vs. 73.3 ± 24.9 mL/min, P = 0.025), white blood cells (17,528 ± 3,530 vs. 11,580 ± 3,360 cells/µL, P = 0.007), and blood sodium concentration (134.8 ± 5.9 vs. 141.8 ± 8.4 mmol/L, P = 0.035). We measured the degree of association between the number of sessions and these variables with partial correlations. High lithium levels (P = 0.006, r = 0.69), low creatinine clearance (P = 0.04, r = -0.55), and low blood sodium concentration (P = 0.024, r = -0.59) were associated with a greater number of dialysis sessions. The correlation remained significant for blood sodium concentration (P = 0.016, r = -0.67) after adjustment for creatinine clearance and initial lithium levels. The presence on admission of low creatinine clearance, low blood sodium concentration, and/or high lithium levels correlated with a higher number of dialysis sessions in severe lithium poisoning. These factors, especially low blood sodium concentration, are associated with higher dialysis requirements in severe lithium intoxication.
Hemodialysis International 07/2012; 16(3):407-13. · 1.44 Impact Factor
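The partial correlations reported above can be reproduced from pairwise Pearson correlations. A minimal sketch, using invented patient data (not the study's) and adjusting for a single covariate:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Hypothetical data: dialysis sessions, admission sodium (mmol/L), and
# admission lithium level (mmol/L) for six patients.
sessions = [1, 1, 2, 2, 3, 4]
sodium = [144, 142, 138, 136, 133, 130]
lithium = [1.8, 2.0, 2.6, 2.4, 3.1, 3.5]

print(round(pearson(sessions, sodium), 2))                # raw correlation
print(round(partial_corr(sessions, sodium, lithium), 2))  # adjusted for lithium
```

Adjusting for two covariates at once, as the study did for creatinine clearance and initial lithium levels, is obtained by applying the same formula recursively to the first-order partials.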
ABSTRACT: We investigated age and sex differences in acute myocardial infarction (AMI) after cardiac surgery in a prospective study of 2038 consecutive patients undergoing cardiac surgery with cardiopulmonary bypass. An age of ≥ 70 years implied a shift in the type of AMI from ST-segment elevation myocardial infarction (STEMI) to non-ST-segment elevation myocardial infarction (non-STEMI). Men were more likely than women to suffer AMI after cardiac surgery (11.8% vs. 5.6%), as a result of the higher frequency of STEMI (6% of men vs. 1.8% of women; P < 0.001) in both age groups. The troponin-I (Tn-I) peak was significantly higher in patients ≥ 70 years old. In-hospital mortality was higher in patients ≥ 70 (7.3%) than in those < 70 years old (3.3%), because of the increased mortality observed in men with non-AMI (2.1% vs. 6.3%) and in women with STEMI (0% vs. 28.6%) and non-STEMI (0% vs. 36.8%, P < 0.05). Old age was associated with a higher frequency of non-STEMI, a higher Tn-I peak, and greater mortality and length of stay in the intensive care unit (ICU). Regardless of age, men more often suffered AMI (particularly STEMI). AMI in women had a notable impact on the excess mortality and ICU stay observed in patients ≥ 70 years of age. Clinical and Tn-I peak differences are to be expected in relation to age and gender after AMI following cardiac surgery.
Interactive cardiovascular and thoracic surgery 04/2012; 15(1):28-32.
ABSTRACT: INTRODUCTION: Non-neurological complications in patients with severe traumatic brain injury (TBI) are frequent and worsen the prognosis, but the pathophysiology of systemic complications after TBI is unclear. The purpose of this study was to analyze non-neurological complications in patients with severe TBI admitted to the ICU, the impact of these complications on mortality, and their possible correlation with TBI severity. METHODS: An observational retrospective cohort study was conducted in one multidisciplinary ICU of a university hospital (35 beds); 224 consecutive adult patients with severe TBI (initial Glasgow Coma Scale (GCS) < 9) admitted to the ICU were included. Neurological and non-neurological variables were recorded. RESULTS: Sepsis occurred in 75% of patients, respiratory infections in 68%, hypotension in 44%, severe respiratory failure (arterial oxygen pressure/inspired oxygen fraction ratio (PaO2/FiO2) < 200) in 41% and acute kidney injury (AKI) in 8%. The multivariate analysis showed that Glasgow Outcome Score (GOS) at one year was independently associated with age, initial GCS 3 to 5, worst Traumatic Coma Data Bank (TCDB) classification on the first computed tomography (CT) scan and the presence of intracranial hypertension, but not with AKI. Hospital mortality was independently associated with initial GCS 3 to 5, worst TCDB classification on the first CT scan, the presence of intracranial hypertension and AKI. The presence of AKI, regardless of GCS, multiplied the risk of death by 6.17 (95% confidence interval (CI): 1.37 to 27.78; P < 0.02), while ICU hypotension increased the risk of death 4.28 times (95% CI: 1.22 to 15.07; P < 0.05) in patients with initial GCS scores of 3 to 5. CONCLUSIONS: Low initial GCS, worst first CT scan, intracranial hypertension and AKI determined hospital mortality in severe TBI patients.
Besides the direct effect of low GCS on mortality, this neurological condition is also associated with ICU hypotension, which increases hospital mortality among patients with severe TBI. These findings add to previous studies showing that non-neurological complications increase length of stay and morbidity in the ICU but do not increase mortality, with the exception of AKI and hypotension in patients with low GCS (3 to 5).
Critical care (London, England) 03/2012; 16(2):R44. · 4.72 Impact Factor
ABSTRACT: Catheter-related bloodstream infections (CR-BSI) are an increasing problem in the management of critically ill patients. Our objective was to analyze the incidence and epidemiology of CR-BSI in arterial catheters (AC) in a population of critically ill patients.
We conducted a two-year, prospective, non-randomized study of patients admitted for > 24 h in a 24-bed medical-surgical major teaching ICU. We analyzed the arterial catheters and differentiated between femoral and radial locations. Difference testing between groups was performed using the two-tailed t-test and chi-square test as appropriate. Multivariate logistic regression analyses were conducted to identify independent predictors of CR-BSI occurrence and type of micro-organism responsible.
The study included 1456 patients requiring AC placement for ≥ 24 h. A total of 1543 AC were inserted, for 14,437 catheter days. The incidence of AC-related bloodstream infections (ACR-BSI) was 3.53 episodes per 1000 catheter days. In the same period, the incidence of central venous catheter (CVC)-related bloodstream infections was 4.98 episodes per 1000 catheter days. Logistic regression analysis showed that days of insertion (OR: 1.118, 95% confidence interval (CI): 1.026-1.219) and length of ICU stay (OR: 1.052, 95% CI: 1.025-1.079) were associated with a higher risk of ACR-BSI. Comparing 705 arterial catheters in the femoral location with 838 in the radial location, no significant differences in infection rates were found, although the rate was numerically higher among femoral catheters (4.13 vs. 3.36 episodes per 1000 catheter days; p = 0.72). Among patients with ACR-BSI, Gram-negative bacteria were isolated in 16 episodes (61.5%) in the femoral location and seven (28%) in the radial location (OR: 2.586; 95% CI: 1.051-6.363).
We conclude that, as has been reported for venous catheters, ACR-BSI plays an important role in critically ill patients. Days of insertion and length of ICU stay increase the risk of ACR-BSI, and the femoral site increases the risk of Gram-negative infection.
The Journal of infection 06/2011; 63(2):139-43. · 4.13 Impact Factor
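As a back-of-the-envelope consistency check (not part of the study), the incidence density reported above, episodes per 1000 catheter-days, can be inverted to recover the approximate episode count:

```python
# Incidence density: episodes per 1000 catheter-days of exposure.
def rate_per_1000(episodes, catheter_days):
    return episodes / catheter_days * 1000

# 3.53 episodes/1000 days over 14,437 catheter-days implies ~51 episodes,
# which matches the counts recoverable from the Gram-negative percentages:
# 16/0.615 ≈ 26 femoral episodes plus 7/0.28 = 25 radial episodes.
episodes = round(3.53 / 1000 * 14437)
print(episodes)                            # → 51
print(round(rate_per_1000(51, 14437), 2))  # → 3.53
```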
ABSTRACT: Xenotransplantation of genetically engineered porcine chondrocytes may benefit many patients who suffer cartilage defects. In this work, we sought to elucidate the molecular bases of the cellular response to xenogeneic cartilage. To this end, we isolated pig costal chondrocytes (PCC) and conducted a series of functional studies. First, we determined by flow cytometry the cell surface expression of multiple immunoregulatory proteins in resting conditions or after treatment with human TNF-alpha, IL-1alpha, or IL-1beta, which did not induce apoptosis. TNF-alpha, and to a lesser extent IL-1alpha, led to a marked upregulation of SLA I, VCAM-1, and ICAM-1 on PCC. SLA II and E-selectin remained undetectable in all the conditions assayed. Notably, CD86 was constitutively expressed at moderate levels, whereas CD80 and CD40 were barely detected. To assess their function, we next studied the interaction of PCC with human monoblastic U937 cells and Jurkat T cells. U937 cells adhered to resting PCC and, in a greater proportion, to cytokine-stimulated PCC. Consistent with its expression pattern, pig VCAM-1 was key in mediating the increased adhesion after cytokine stimulation. We also conducted coculture experiments with U937 cells and PCC and measured the release of pig and human cytokines. Stimulated PCC secreted IL-6 and IL-8, whereas U937 cells secreted IL-8 in response to PCC. Finally, coculture of PCC with Jurkat cells in the presence of PHA led to marked Jurkat activation, as determined by the increase in IL-2 secretion. This process was dramatically reduced by blocking pig CD86. In summary, CD86 and VCAM-1 on pig chondrocytes may be important triggers of the xenogeneic cellular immune response. These molecules, together with TNF, could be considered potential targets for intervention in order to develop xenogeneic therapies for cartilage repair.
ABSTRACT: Catheter-related bloodstream infection (CR-BSI) is a cause of morbidity and mortality in intensive care units, and the optimal approach for preventing these infections is not well defined. Comparison of CR-BSI rates with those provided by programs such as the National Nosocomial Infection Surveillance System (NNISS) from the USA and the Spanish National Nosocomial Infection Surveillance Study (ENVIN) enables determination of the need to implement control measures. In 2000, we found that the CR-BSI rates in the ICUs of our hospital were much higher than the data reported by ENVIN.
To assess the impact of implementing a protocol for proper use of intravascular catheters on CR-BSI rates in the intensive care unit (ICU) of a tertiary hospital.
Prospective study of patients admitted to the ICUs of a tertiary hospital in the months of May and June, from 2000 to 2004. In 2001, a CR-BSI prevention program including aspects related to catheter insertion and maintenance in ICU patients was implemented. We calculated infection rates per 1000 days of catheter use in all the 2-month periods studied, and compared the 2000 and 2004 results by analysis of the odds ratios and confidence intervals.
A total of 923 patients were included. Mean age was 58.7 years (SD: 15.4), mean ICU stay was 11.6 days (SD: 11.4), mean SAPSII was 28.2 (SD: 15.9), and mortality was 20.5%. There was a significant reduction in CR-BSI rates from 13.3 episodes per 1000 days of catheter use in the first period to 3.21 in the last period (OR=3.53, 95% CI: 2.36-5.31).
Application of a prevention program for CR-BSI and a system for monitoring BSI rates led to a significant, sustained reduction in these infections.
ABSTRACT: Temporary tracheostomy (TT) is a common critical care procedure in patients with acute respiratory failure who require prolonged mechanical ventilatory support. Usually TT is considered if weaning from mechanical ventilation has been unsuccessful, but both the timing and the method of performing it remain controversial. A clinical trial by Rumbak suggested that early TT may benefit patients who are not improving and who are expected to require prolonged respiratory support. In that study, early TT improved survival and shortened the duration of mechanical ventilation, although other studies found no benefit. Minimally invasive bedside percutaneous TT (PTT) was introduced as an alternative to the traditional surgical technique. In expert hands, both techniques are equivalent in complications and safety; however, the PTT approach may be more cost-effective. Early PTT should be considered in patients with a high likelihood of prolonged mechanical ventilation. The authors introduce a modification of a device used for PTT. The surgical technique should be used when PTT is contraindicated.
Clinical Pulmonary Medicine 08/2008; 15(5):267-273.
ABSTRACT: Bloodstream infections (BSIs) related to central venous catheters (CVCs) and arterial catheters (ACs) are an increasing problem in the management of critically ill patients. Our objective was to assess the efficacy of a needle-free valve connection system (SmartSite, Alaris Medical Systems, San Diego, CA, USA) in the prevention of catheter-related bloodstream infection (CR-BSI). Patients admitted to an intensive care unit were prospectively assigned to have a CVC and AC connected with either a needle-free valve connection system (NFVCS) or a three-way stopcock connection (3WSC). The characteristics of the patients were similar in the two groups. Before manipulation, the NFVCS was disinfected with chlorhexidine digluconate 0.5% alcoholic solution. The 3WSC was not disinfected between uses, but it was covered with a protection cap. A total of 799 patients requiring the insertion of a multilumen CVC or AC for > 48 h from 1 April 2002 to 31 December 2003 were included. CR-BSI rates were 4.61 per 1000 days of catheter use in the disinfected NFVCS group and 4.11 per 1000 days of catheter use in the 3WSC group (P = 0.59). When CVC-BSIs and AC-BSIs were analysed separately, the rate of CVC-BSI was 4.26 per 1000 days of catheter use in the NFVCS group, compared with 5.27 in the 3WSC group (P = 0.4). The incidence rate of AC-BSI was 5.00 per 1000 days of catheter use in the NFVCS group, compared with 2.83 in the 3WSC group (P = 0.08). The use of a NFVCS does not reduce the incidence of catheter-related bacteraemia. The arterial catheter (AC) is a significant source of infection in critically ill patients.
Journal of Hospital Infection 09/2007; 67(1):30-4. · 2.86 Impact Factor