Article

Chlorhexidine-impregnated sponges and less frequent dressing changes for prevention of catheter-related infections in critically ill adults: a randomized controlled trial.

INSERM U823, University Joseph Fourier, Albert Bonniot Institute, 38076, Grenoble Cedex, France.
JAMA The Journal of the American Medical Association (Impact Factor: 30.39). 04/2009; 301(12):1231-41. DOI: 10.1001/jama.2009.376
Source: PubMed

ABSTRACT Use of a chlorhexidine gluconate-impregnated sponge (CHGIS) in intravascular catheter dressings may reduce catheter-related infections (CRIs). Changing catheter dressings every 3 days may be more frequent than necessary.
To assess superiority of CHGIS dressings regarding the rate of major CRIs (clinical sepsis with or without bloodstream infection) and noninferiority (less than 3% colonization-rate increase) of 7-day vs 3-day dressing changes.
Assessor-blind, 2 x 2 factorial, randomized controlled trial conducted from December 2006 through June 2008 and recruiting patients from 7 intensive care units in 3 university and 2 general hospitals in France. Patients were adults (>18 years) expected to require an arterial catheter, central-vein catheter, or both inserted for 48 hours or longer.
Use of CHGIS vs standard dressings (controls). Scheduled change of unsoiled adherent dressings every 3 vs every 7 days, with immediate change of any soiled or leaking dressings.
Major CRIs for comparison of CHGIS vs control dressings; colonization rate for comparison of 3- vs 7-day dressing changes.
Of 2095 eligible patients, 1636 (3778 catheters, 28,931 catheter-days) could be evaluated. The median duration of catheter insertion was 6 (interquartile range [IQR], 4-10) days. There was no interaction between the interventions. Use of CHGIS dressings decreased the rates of major CRIs (10/1953 [0.5%], 0.6 per 1000 catheter-days vs 19/1825 [1.1%], 1.4 per 1000 catheter-days; hazard ratio [HR], 0.39 [95% confidence interval {CI}, 0.17-0.93]; P = .03) and catheter-related bloodstream infections (6/1953 catheters, 0.40 per 1000 catheter-days vs 17/1825 catheters, 1.3 per 1000 catheter-days; HR, 0.24 [95% CI, 0.09-0.65]). Use of CHGIS dressings was not associated with greater resistance of bacteria in skin samples at catheter removal. Severe CHGIS-associated contact dermatitis occurred in 8 patients (5.3 per 1000 catheters). Use of CHGIS dressings prevented 1 major CRI per 117 catheters. Catheter colonization rates were 142 of 1657 catheters (7.8%) in the 3-day group (10.4 per 1000 catheter-days) and 168 of 1828 catheters (8.6%) in the 7-day group (11.0 per 1000 catheter-days), a mean absolute difference of 0.8% (95% CI, -1.78% to 2.15%) (HR, 0.99; 95% CI, 0.77-1.28), indicating noninferiority of 7-day changes. The median number of dressing changes per catheter was 4 (IQR, 3-6) in the 3-day group and 3 (IQR, 2-5) in the 7-day group (P < .001).
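(Reader's note, not part of the original abstract: the noninferiority conclusion can be verified directly from the prespecified margin stated above, a colonization-rate increase of less than 3%, and the reported confidence interval.)

\[ \Delta_{7\text{-day} - 3\text{-day}} = 0.8\%, \qquad 95\%\ \mathrm{CI} = [-1.78\%,\ 2.15\%] \]
\[ 2.15\% < 3\% \ (\text{prespecified margin}) \;\Rightarrow\; \text{7-day changes are noninferior to 3-day changes.} \]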
Use of CHGIS dressings with intravascular catheters in the intensive care unit reduced risk of infection even when background infection rates were low. Reducing the frequency of changing unsoiled adherent dressings from every 3 days to every 7 days modestly reduces the total number of dressing changes and appears safe.
clinicaltrials.gov Identifier: NCT00417235.

  • ABSTRACT: To assess the effectiveness of a chlorhexidine antimicrobial dressing compared with a gauze-and-tape dressing for covering central venous catheters. A randomized clinical trial was conducted in the intensive care and adult semi-intensive care units of a university hospital in southern Brazil. The subjects were patients using short-term central venous catheters, randomly assigned to the intervention (chlorhexidine antimicrobial dressing) or control (gauze and microporous tape) group. A total of 85 patients were included: 43 in the intervention group and 42 in the control group. No statistically significant differences were found between dressings with regard to the occurrence of primary bloodstream infections (P = 0.5170), local reactions to the dressing (P = 0.3774), or dressing fixation (P = 0.2739). Both technologies are effective for covering central venous catheters with regard to the investigated variables and can be used for this purpose. Clinical trial registry: RBR-7b5ycz.
    Revista Latino-Americana de Enfermagem 10/2014; 22(5):764-771. DOI:10.1590/0104-1169.3443.2478 · 0.54 Impact Factor
  • ABSTRACT: Venous access devices are of pivotal importance for an increasing number of critically ill patients in a variety of disease states and clinical settings (emergency, intensive care, surgery) and for different purposes (fluid or drug infusion, parenteral nutrition, antibiotic therapy, hemodynamic monitoring, dialysis/apheresis procedures). However, healthcare professionals are commonly worried about the possible consequences of using a central venous access device (CVAD), mainly bloodstream infections and thrombosis, whether peripherally inserted central catheters (PICCs) or centrally inserted central catheters (CICCs). This review discusses indications, insertion techniques, and care of PICCs in critically ill patients. PICCs have many advantages over standard CICCs. First, their insertion is easy and safe, because they are placed via peripheral veins of the arm, while the central location of the catheter tip makes them suitable for solutions of any osmolarity and pH. With ultrasound-guided insertion, the risks of hemothorax and pneumothorax are avoided, and the likelihood of primary malposition is very low. PICC placement is also appropriate for avoiding post-procedural hemorrhage in patients with abnormal coagulation who need a CVAD. Some limits previously ascribed to PICCs (i.e., low flow rates, difficult central venous pressure monitoring, lack of safety for radio-diagnostic procedures, single lumen) delayed their adoption as common practice in intensive care units. However, the recent development of power-injectable PICCs overcomes these technical limitations, and PICCs have started to spread in critical care settings. Two important take-home messages may be drawn from this review. First, the incidence of complications varies across venous accesses, and healthcare professionals should be aware of the different clinical performance as well as the different risks associated with each type of CVAD (CICC or PICC). Second, an inappropriate CVAD choice and, particularly, an inadequate insertion technique are relevant, and often unrecognized, risk factors for complications in critically ill patients. We strongly believe that all healthcare professionals involved in the choice, insertion, or management of CVADs in critically ill patients should know all potential risk factors for complications. This knowledge may minimize complications and guarantee longevity of the CVAD, optimizing the risk/benefit ratio of CVAD insertion and use. Proper management of CVADs in critical care saves lines and lives. Much evidence from the medical literature and clinical practice supports our belief that, compared with CICCs, the so-called power-injectable peripherally inserted central catheters are a good alternative choice in critical care.
    World Journal of Critical Care Medicine 11/2014; 3(4):80-94. DOI:10.5492/wjccm.v3.i4.80
  • ABSTRACT: Best clinical practice for patients with suspected catheter-related infection (CRI) remains unclear according to the latest Infectious Diseases Society of America (IDSA) guidelines. The objective of this study was therefore to analyze clinical practice concerning the central venous catheter (CVC) and its impact on prognosis in patients with suspected CRI. We performed a prospective, multicenter, observational study in 18 Spanish intensive care units (ICUs). The inclusion criteria were patients with a CVC and suspected CRI. The exclusion criteria were: age less than 18 years; pregnancy; lactation; human immunodeficiency virus infection; neutropenia; solid or haematological tumor; immunosuppressive or radiation therapy; transplanted organ; intravascular foreign body; haemodynamic instability; suppuration or frank erythema/induration at the insertion site of the CVC; and bacteremia or fungemia. The end point of the study was mortality within 30 days of CRI suspicion. The study included 384 patients. In 214 (55.8%) patients the CVC was removed at the moment of CRI suspicion, in 114 (29.7%) it was removed later, and in 56 (14.6%) it was not removed. We did not find significant differences between survivors (n = 311) and non-survivors (n = 73) at 30 days according to the CVC decision (P = 0.26). The rate of confirmed catheter-related bloodstream infection (CRBSI) was higher in survivors than in non-survivors (14.5% versus 4.1%; P = 0.02). The mortality rate was lower in patients with CRBSI than in patients whose clinical symptoms were due to other causes (3/48 [6.25%] versus 70/336 [20.8%]; P = 0.02). We did not find significant differences in mortality among patients with confirmed CRBSI according to whether the CVC was removed at the moment of CRI suspicion (n = 38) or later (n = 10) (7.9% versus 0; P = 0.99). In patients with suspected CRI, immediate CVC removal may not be necessary in all cases. Other aspects should be taken into account in the decision, such as vascular accessibility, the risk of life-threatening mechanical complications during new cannulation, and the possibility that the CVC is not the origin of the suspected CRI.
    Critical Care (London, England) 10/2014; 18(5):564. DOI:10.1186/s13054-014-0564-3