ABSTRACT: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice.
A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with ≥3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR ≤ 60 ml/min/1.73 m2. Poisson regression was used to develop a risk score, externally validated on two independent cohorts.

In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group (risk score < 0, 33 events), rising to 1:47 and 1:6 in the medium (risk score 0-4, 103 events) and high (risk score ≥ 5, 505 events) risk groups, respectively.

The number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor.
The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted CKD incidence well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria.
Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop a risk score for CKD in HIV infection, externally validated, that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
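For illustration only, the published risk groups and their approximate 5-year CKD risks can be encoded as a small lookup. The function name and structure below are ours, not the authors'; the group cut-offs (low < 0, medium 0-4, high ≥ 5) and the 1:393, 1:47, and 1:6 risks are taken directly from the abstract above.

```python
def ckd_risk_group(score):
    """Map a D:A:D CKD risk score to its risk group and the
    approximate 5-year chance of CKD reported in the abstract."""
    if score < 0:
        return ("low", 1 / 393)
    elif score <= 4:
        return ("medium", 1 / 47)
    else:
        return ("high", 1 / 6)

# The median baseline score in D:A:D was -2, i.e. the low risk group:
group, risk = ckd_risk_group(-2)
print(group, round(risk * 100, 2))  # low 0.25  (~0.25% over 5 years)
```

In practice such a lookup would sit downstream of the score itself, which the paper builds by scaling and summing adjusted incidence rate ratios for the nine predictors; those point values are not given in the abstract, so they are deliberately not reproduced here.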
PLoS Medicine 03/2015; 12(3):e1001809. DOI:10.1371/journal.pmed.1001809
ABSTRACT: Background: Abdominal infections are frequent causes of sepsis and septic shock in the intensive care unit (ICU) and are associated with adverse outcomes. We analyzed the characteristics, treatments, and outcomes of ICU patients with abdominal infections using data extracted from a one-day point prevalence study, the Extended Prevalence of Infection in the ICU (EPIC) II. Methods: EPIC II included 13,796 adult patients from 1,265 ICUs in 75 countries. Infection was defined using the International Sepsis Forum criteria. Microbiological analyses were performed locally. Participating ICUs provided patient follow-up until hospital discharge or for 60 days. Results: Of the 7,087 infected patients, 1,392 (19.6%) had an abdominal infection on the study day (60% male, mean age 62 ± 16 years, SAPS II score 39 ± 16, SOFA score 7.6 ± 4.6). Microbiological cultures were positive in 931 (67%) patients, most commonly for Gram-negative bacteria (48.0%). Antibiotics were administered to 1,366 (98.1%) patients. Patients who had been in the ICU for ≤2 days prior to the study day had more Escherichia coli, methicillin-sensitive Staphylococcus aureus, and anaerobic isolates, and fewer enterococci, than patients who had been in the ICU longer. ICU and hospital mortality rates were 29.4% and 36.3%, respectively. ICU mortality was higher in patients with abdominal infections than in those with other infections (29.4% vs. 24.4%, p < 0.001). In multivariable analysis, hematological malignancy, mechanical ventilation, cirrhosis, need for renal replacement therapy, and SAPS II score were independently associated with increased mortality. Conclusions: The characteristics, microbiology, and antibiotic treatment of abdominal infections in critically ill patients are diverse. Mortality in patients with isolated abdominal infections was higher than in those who had other infections.
ABSTRACT: The identification of risk factors associated with perioperative seizures would be of great benefit to the anesthesiologist in managing brain tumor patients undergoing craniotomy with intraoperative brain mapping.
ABSTRACT: Background: Infections are a leading cause of death in patients with advanced cirrhosis, but there are relatively few data on the epidemiology of infection in intensive care unit (ICU) patients with cirrhosis. Aims: We used data from the Extended Prevalence of Infection in Intensive Care (EPIC) II one-day point-prevalence study to better define the characteristics of infection in these patients. Methods: We compared characteristics, including occurrence and types of infections, in non-cirrhotic and cirrhotic patients who had not undergone liver transplantation. Results: The EPIC II database includes 13,796 adult patients from 1,265 ICUs; 410 of the patients had cirrhosis. The prevalence of infection was higher in cirrhotic than in non-cirrhotic patients (59 vs. 51%, p < 0.01). The lungs were the most common site of infection in all patients, but abdominal infections were more common in cirrhotic than in non-cirrhotic patients (30 vs. 19%, p < 0.01). Infected cirrhotic patients more often had Gram-positive isolates (56 vs. 47%, p < 0.05) than did infected non-cirrhotic patients. Methicillin-resistant Staphylococcus aureus (MRSA) was more frequent in cirrhotic patients. The hospital mortality rate of cirrhotic patients was 42%, compared to 24% in the non-cirrhotic population (p < 0.001). Severe sepsis and septic shock were associated with higher in-hospital mortality rates in cirrhotic than in non-cirrhotic patients (41% and 71% vs. 30% and 49%, respectively, p < 0.05). Conclusions: Infection is more common in cirrhotic than in non-cirrhotic ICU patients and is more commonly due to Gram-positive organisms, including MRSA. Infection in patients with cirrhosis was associated with higher mortality rates than in non-cirrhotic patients.
Liver International: official journal of the International Association for the Study of the Liver 03/2014; DOI:10.1111/liv.12520
ABSTRACT: Background/Purpose: Early and delayed cognitive dysfunctions are an understudied issue after aneurysmal subarachnoid hemorrhage (aSAH). The aim of this study was to describe early and late changes after aSAH in terms of cognitive function, activities of everyday life, and quality of life.
Cognitive dysfunction has different time courses after aSAH depending on SAH severity and treatment complications. SAH influences cognitive and social factors.
Critical Care 03/2013; 17(2). DOI:10.1186/cc12282
ABSTRACT: To provide a global, up-to-date picture of the prevalence, treatment, and outcomes of Candida bloodstream infections in intensive care unit patients and to compare Candida with bacterial bloodstream infections.
A retrospective analysis of the Extended Prevalence of Infection in the ICU Study (EPIC II). Demographic, physiological, infection-related and therapeutic data were collected. Patients were grouped as having Candida, Gram-positive, Gram-negative, and combined Candida/bacterial bloodstream infection. Outcome data were assessed at intensive care unit and hospital discharge.
EPIC II included 1265 intensive care units in 76 countries.
Patients in participating intensive care units on study day.
Of the 14,414 patients in EPIC II, 99 patients had Candida bloodstream infections, for a prevalence of 6.9 per 1,000 patients. Sixty-one patients had candidemia alone and 38 patients had combined bloodstream infections. Candida albicans (n = 70) was the predominant species. Primary therapy included monotherapy with fluconazole (n = 39), caspofungin (n = 16), and a polyene-based product (n = 12). Combination therapy was infrequently used (n = 10). Compared with patients with Gram-positive (n = 420) and Gram-negative (n = 264) bloodstream infections, patients with candidemia were more likely to have solid tumors (p < .05) and appeared to have been in an intensive care unit longer (14 days [range, 5-25 days], 8 days [range, 3-20 days], and 10 days [range, 2-23 days], respectively), but this difference was not statistically significant. Severity of illness and organ dysfunction scores were similar between groups. Patients with Candida bloodstream infections, compared with patients with Gram-positive and Gram-negative bloodstream infections, had the greatest crude intensive care unit mortality rates (42.6%, 25.3%, and 29.1%, respectively) and longer intensive care unit lengths of stay (median [interquartile range]) (33 days [18-44], 20 days [9-43], and 21 days [8-46], respectively); however, these differences were not statistically significant.
Candidemia remains a significant problem in intensive care unit patients. In the EPIC II population, Candida albicans was the most common organism and fluconazole remained the predominant antifungal agent used. Candida bloodstream infections are associated with high intensive care unit and hospital mortality rates and resource use.
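As a quick arithmetic check (ours, not the paper's), the reported prevalence follows directly from the counts in the abstract:

```python
def prevalence_per_1000(cases, population):
    """Point prevalence expressed per 1,000 patients."""
    return cases / population * 1000

# 99 Candida bloodstream infections among the 14,414 EPIC II patients:
print(round(prevalence_per_1000(99, 14414), 1))  # 6.9, matching the abstract
```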
Critical Care Medicine 12/2010; 39(4):665-70. DOI:10.1097/CCM.0b013e318206c1ca
ABSTRACT: Traumatic brain injury (TBI) in children is frequent, sometimes lethal, and may have life-long consequences in survivors. Prevention at school and in sports, involving both children and families, is of paramount importance. Scarce data are available on epidemiology, pathophysiology, management, and prognosis. This non-systematic review suggests that rational organization of rescue and transport to designated hospitals, linked with early diagnosis and removal of surgical masses and with comprehensive monitoring and intensive care, offers the best chance of reducing mortality and morbidity in severe cases. After the acute phase, rehabilitation and families play a fundamental role.
ABSTRACT: Intraoperative brain mapping has the goal of aiding maximal surgical resection of brain tumors while minimizing functional sequelae. Retrospective and randomized studies on large populations have shown that this technique can optimize the surgical approach while reducing postoperative morbidity. During direct electrical stimulation of the language areas adjacent to the tumor, the patient should be collaborative and able to speak in order to participate in language testing. Different anesthesiological protocols have been proposed to allow intraoperative brain mapping, ranging from local anesthesia to conscious sedation or general anesthesia, with or without airway instrumentation. The most common intraoperative complications are seizure, respiratory depression, and patient stress and discomfort. Since awake craniotomy carries both benefits and potential risks, the following factors are crucial in the management of patients: 1) careful selection of the patients and 2) communication between the anesthesiological and surgical teams. To date, there remains no consensus about the optimal anesthesiological regimen to use. Only prospective, multicenter randomized studies focused on evaluating the role of different anesthesiological techniques on intraoperative monitoring, postoperative deficits, and intraoperative complications can answer the question of which anesthesiological approach should be chosen when intraoperative brain mapping is requested.
ABSTRACT: The aim of this study was to evaluate the arterio-venous difference in carbon dioxide tension (DPCO2) and the ratio between DPCO2 and the arterio-jugular oxygen difference (AJDO2) as indicators of compensated or uncompensated cerebral hypoperfusion.
Cerebral blood flow (CBF) was reduced stepwise in 6 pigs by inducing intracranial hypertension, with consequent cerebral perfusion pressure (CPP) reduction: CBF 100%, 50-60% of baseline, and 20-30% of baseline. Intracranial pressure (ICP), mean arterial pressure (MAP), CPP, and CBF (laser-Doppler method) were continuously recorded. The superior sagittal sinus was punctured for the determination of AJDO2 and DPCO2.
CBF impairment was accompanied by changes in AJDO2 from 6.03 ± 1.21 vol% to 7.32 ± 1.30 vol%, up to 8.07 ± 1.32 vol% (P < 0.01), in DPCO2 from 12.17 ± 3.25 mmHg to 16 ± 4.12 mmHg, up to 26.5 ± 6.41 mmHg (P < 0.01), and in the DPCO2/AJDO2 ratio from 2.05 ± 0.39 to 2.06 ± 0.72, up to 3.41 ± 1.09, across the three phases (P < 0.05).
When CBF declines, AJDO2 increases, indicating greater extraction of O2 to satisfy aerobic metabolism. However, this mechanism can no longer compensate once a critical CBF threshold is reached. DPCO2 rises slowly during moderate CBF reduction because of defective washout; the rise is steeper during marked CBF impairment, when anaerobic metabolism takes place. During cerebral hypoperfusion, the venous blood gas and acid-base variables mirror the degree of cerebral perfusion. In particular, DPCO2 and the DPCO2/AJDO2 ratio may be useful markers of critical brain hypoperfusion.
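A minimal sketch of the derived ratio, using input values close to the group means reported above. The idea that a DPCO2/AJDO2 ratio above roughly 3 marks critical hypoperfusion is our reading of those means (2.05 and 2.06 in the compensated phases vs. 3.41 at 20-30% of baseline CBF), not a validated cut-off; the function name is ours.

```python
def dpco2_ajdo2_ratio(dpco2_mmhg, ajdo2_volpct):
    """Ratio of the CO2 tension gap (mmHg) to the
    arterio-jugular O2 difference (vol%)."""
    return dpco2_mmhg / ajdo2_volpct

# Values close to the reported means of the first and third CBF steps:
compensated = dpco2_ajdo2_ratio(12.17, 6.03)  # ~2.0: O2 extraction compensates
critical = dpco2_ajdo2_ratio(26.5, 8.07)      # ~3.3: CO2 gap outpaces O2 extraction
print(round(compensated, 2), round(critical, 2))
```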
ABSTRACT: The aim of the present study was to assess the veno-arterial difference in pCO2 (ΔpCO2) as an indicator of ischemia compared to the arterio-venous O2 difference (AVDO2). Staircase cerebral blood flow (CBF) reductions were obtained in seven domestic pigs by inducing intracranial hypertension: CBF 100%, 50-60% of baseline, and 20-30% of baseline. ICP, MAP, CPP, and CBF (laser-Doppler method) were continuously recorded. The superior sagittal sinus was punctured to determine AVDO2 and ΔpCO2. AVDO2 was 5.9 (± 1.78, range 3.3-7.4), 7.01 (± 1.31, range 5-8.9), and 8.17 (± 1.51, range 6.0-11.3) ml/100 ml in the three CBF steps (p = 0.001). CBF impairment was accompanied by the following increases in ΔpCO2: from 10 (± 4, range 4-15) mmHg to 14.5 (± 4.11, range 10-27) mmHg, and to 31.2 (± 9.0, range 17-39) mmHg (p < 0.001). When CBF declines, AVDO2 increases, indicating greater extraction of O2 to satisfy aerobic metabolism. However, this mechanism can no longer compensate once a critical CBF threshold is reached. ΔpCO2 rises slowly during moderate CBF reduction because of defective washout; the rise is steep during marked CBF impairment, when anaerobic metabolism takes place and protons are buffered to CO2 and H2O. Therefore, when the brain's ability to compensate for low blood flow is exceeded, CO2 production outweighs O2 extraction.