ABSTRACT: To test a multiplex real-time polymerase chain reaction (PCR) method for simultaneous detection of multiple organisms in bloodstream infections.
Prospective observational study at the University of California Davis Medical Center (Sacramento, CA). Two hundred adult (>18 yrs) patients from the emergency room, intensive care units, and general medicine wards at risk of a bloodstream infection and who manifested signs of systemic inflammatory response syndrome (SIRS) were enrolled. Whole blood samples for PCR testing were collected at the same time as blood culture (BC). PCR results were compared to blood and other culture results.
PCR detected potentially significant bacteria and fungi in 45 cases compared to 37 by BC. PCR detected the methicillin resistance (mecA) gene in all three culture-confirmed methicillin-resistant Staphylococcus aureus cases. More than 68% of PCR results were confirmed by blood, urine, and catheter culture. Independent clinical arbitrators could not rule out the potential clinical significance of organism(s) detected by PCR, but not by BC. PCR did not detect Enterococcus faecalis in five BC-confirmed cases. On average, seven patient samples could be tested simultaneously with the PCR method in 6.54 ± 0.27 hrs.
Multiplex PCR detected potentially significant bacteria and fungi that were not found by BC. BC found organisms that were not detected by PCR. Despite limitations of both BC and PCR methods, PCR could serve as an adjunct to current culture methods to facilitate early detection of bloodstream infections. Early detection of microorganisms has the potential to facilitate evidence-based treatment decisions, antimicrobial selection, and adequacy of antimicrobial therapy.
Critical Care Medicine 05/2008; 36(5):1487-92. · 6.37 Impact Factor
ABSTRACT: This study is the first conducted to characterize the prevalence of blood contamination on hospital glucose meters, which potentially may spread infection. Twelve academic and nonacademic affiliated hospitals participated in hospitalwide surveys for blood contamination on glucose meters used in patient service areas. Each glucose meter was inspected visually and sampled for testing with a reduced phenolphthalein test for hemoglobin to detect the presence of blood. Glucose meters from 9 urban, 2 suburban, and 1 rural hospital were surveyed. The overall mean frequency of glucose meters with blood contamination was 30.2 ± 17.5% (range, 0.0-60.5%; median, 34.6%). The mean frequency of ICU meters with blood contamination was 48.2 ± 30.2%. ICU meters were 2.20 times more likely to be contaminated than meters on the general medicine floors. The number of operators and location of the hospital were significant predictors for blood contamination. An increase of 100 operators increases the odds of contamination by 6% (P = 0.0002). In this study group, the odds of contamination were higher in the urban hospitals compared with suburban or rural hospitals. Blood contamination is present on hospital glucose meters. The number of instrument operators represents a risk factor associated with instrument contamination. ICU meters were more likely to be contaminated. The authors recommend monitoring and surveillance, and a defined infection control protocol for point-of-care testing instruments to reduce the prevalence of contamination. These actions should help decrease the risk of spread of infectious agents and nosocomial infections.
Point of Care: The Journal of Near-Patient Testing & Technology 11/2005; 4(4):158-163.
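The logistic-regression result reported in the abstract above (each additional 100 meter operators raises the odds of contamination by 6%) implies a multiplicative odds scale. A minimal sketch; the function name is ours, and extrapolating the odds ratio beyond the fitted operator range is an assumption:

```python
def contamination_odds_multiplier(n_extra_operators, or_per_100=1.06):
    """Multiplicative change in contamination odds for n_extra_operators
    additional meter operators, using the reported odds ratio of 1.06
    per 100 operators (P = 0.0002)."""
    return or_per_100 ** (n_extra_operators / 100)

# e.g., 300 additional operators: 1.06**3, roughly a 19% increase in odds
multiplier = contamination_odds_multiplier(300)  # ≈ 1.191
```

Because odds ratios from logistic regression are multiplicative, repeated 100-operator increments compound rather than add.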
ABSTRACT: The objective was to determine frequencies of inadequate vs. adequate antimicrobial treatment and antibiotic alterations in patients with blood culture-proven septicemia and outcome effects (length of stay [LOS], 28-day mortality). Our long-term goal is to use multiplexed PCR nucleic acid testing (NAT) to rapidly detect infectious agents and focus treatment, reducing unnecessary alterations. Methods. Alteration was defined as antibiotic addition, discontinuation, clinically significant dose change, or temporal gap >1 hr. Alterations were analyzed in 3 time intervals: TI1, collection time of blood for culture to qualitative positive blood culture notification; TI2, notification to final minimum inhibitory concentration (MIC) result; and TI3, MIC result to 72 hours. We analyzed alteration patterns to distinguish survivor (S) from nonsurvivor (NS) patients. We assessed adequacy of antimicrobial treatment using Kollef and colleagues' criteria (Chest 2000;118:146). Results. Mean (SD), median, and range of age (yrs) for 40 S & 26 NS were, respectively: 56.3 (17.5) & 62.0 (19.0); 59 & 64; and 18-82 & 29-98. Gender (M & F): S, 28 (70%) & 12 (30%); and NS, 13 & 13. Mean (SD), median, and range of hospital LOS (days) for S & NS were: 10.2 (8.4) & 4.9 (4.0) [P<0.05]; 8 & 4; and 2-41 & 1-15. Mean (SD) times (hrs) following admission of blood collection, qualitative culture notification, and MIC result were: 4.28 (5.18), 26.9 (17.6), and 89.3 (34.7). Patterns are displayed in several temporal histograms. Mean numbers of alterations for S & NS were: TI1, 1.60 & 1.85; TI2, 1.35 & 1.16; and TI3, 0.59 & 0.44. Higher mortality was associated with antibiotic additions in TI1. Adequate antimicrobial treatment was given in 46/66 (70%) patients; inadequate, 12/66 (18%); and probable inadequate, 8/66 (12%). Conclusions. Inadequate therapy resulted from resistant organisms, uninformed antibiotic selection, and lack of early definitive identification. Alteration debt depletes valuable resources.
Shifting the evidence time frame earlier may reduce LOS and improve outcomes. Mortality here appeared to be associated with excessive antibiotic alterations in TI1, because physicians could not treat efficaciously based only on presumptive etiology and qualitative culture results.
Infectious Diseases Society of America 2003 Annual Meeting; 10/2003
ABSTRACT: Unacceptable mortality rates and excessively high costs of sepsis can be attributed in part to delayed pathogen identification, poorly focused treatment, and unjustified antimicrobial use. Rapid diagnosis of pathogens, immediately focused antibiotics, and quick interruption of dysfunctional sepsis cascades could optimize clinical value. Value analysis constructs evidence-based hypotheses, called value propositions, that form a value model for the use of rapid diagnosis and point-of-care testing (POCT) in sepsis. Rapid diagnostic methods, such as nucleic acid testing (NAT), can assure fast therapeutic turnaround time. The authors applied NAT hypothetically to different scenarios for which they performed value analysis of patient and hospital costs and of medical outcomes. Value propositions were constructed for both adult and pediatric clinical situations. The authors conclude that rapid diagnosis by NAT can reduce the probability of complications, allow physicians to immediately focus treatment, and decrease both costs and mortality. Expected benefits for both pediatric and adult patients validate the need for continued NAT research and development of POCT for critically ill patients with sepsis.
Each year 750,000 people develop severe sepsis. The mortality rate in severe sepsis ranges from 28% to 50%.[1] In the United States, more than 500 patients die every day of severe sepsis. Severe sepsis is the leading cause of death in noncoronary intensive care unit patients[1] and has an annual incidence exceeding three quarters of a million cases with costs averaging $22,100 per case.[2] The enormous economic impact of sepsis is estimated at $16.7 billion per year. Because of age shifts in the population, the incidence of severe sepsis will increase in the United States by approximately 1.5% per annum.[1] Point-of-care testing (POCT) and nucleic acid testing (NAT) in sepsis may be capable of reducing costs and mortality. After defining sepsis and briefly describing the normal response to infection, we develop a value model with the following objectives: (a) to identify important diagnostic pivots in the pathophysiology of sepsis, (b) to use value analysis to develop value propositions (hypotheses) that demonstrate how POCT and NAT can facilitate decision making, and (c) to quantitate the medical and economic marginal value of benefits to critically ill patients with severe sepsis.
Point of Care: The Journal of Near-Patient Testing & Technology 08/2003; 2(3):163-171.
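The burden figures quoted in the abstract above are internally consistent and can be cross-checked with simple arithmetic (all inputs are values from the abstract; 28% is the low end of the quoted mortality range):

```python
# Cross-checking the sepsis burden figures quoted in the abstract.
cases_per_year = 750_000        # severe sepsis cases per year (US)
cost_per_case = 22_100          # average cost per case, USD

annual_cost = cases_per_year * cost_per_case
# $16,575,000,000, consistent with the cited ~$16.7 billion per year

mortality_low = 0.28            # low end of the quoted 28-50% mortality range
deaths_per_day = cases_per_year * mortality_low / 365
# ~575 deaths per day even at the low end, consistent with
# "more than 500 patients die every day"
```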
ABSTRACT: We measured lead concentrations in three hemoglobin-based oxygen carriers (HBOCs; Oxyglobin, Hemopure, and Hemolink) and compared them with lead concentrations from blood-bank blood. Oxyhemoglobin dissociation was measured with large concentrations of lead in bovine HBOC, with or without bovine blood, and in bovine blood. Samples of each were prepared by combining one with normal saline (control), the second with small lead concentrations (22 µg/dL), and the third with toxic lead concentrations (70 µg/dL). They were blended in 2 tonometers at oxygen concentrations (2.5%, 5%, 8%, 10%, 21%, and 95%) with 5% CO2 and the remainder nitrogen for 5 min per sample after a 15-min wash-in with each level of oxygen and were measured with co-oximetry. Oxygen saturation was plotted against PO2, fitting a fourth-order polynomial nonlinear regression to the data. The lead concentrations of the three HBOCs were 0.51, 0.22, and 0.40 µg/dL. There were no clinically important differences in the oxyhemoglobin dissociation curves as a function of lead concentration. The lead concentrations of the three tested HBOCs were small and no larger than the average for blood-bank blood. The presence of increasing concentrations of lead in either a concentrated solution of bovine HBOC or a 1:1 mixture of bovine HBOC and native bovine blood does not appear to affect hemoglobin oxygenation in an acute in vitro model of increased lead concentrations. IMPLICATIONS: Gunshot wounds rapidly increase circulating lead concentrations. Lead concentrations are small in three hemoglobin-based oxygen carriers (HBOCs), and HBOCs and/or bovine blood do not appear to be affected by lead concentrations in terms of immediate oxygen on-loading and off-loading. HBOCs may be useful in patients with gunshot wounds.
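The curve-fitting step described above (oxygen saturation plotted against PO2, fit with a fourth-order polynomial) can be sketched as follows. The PO2/SO2 pairs are illustrative placeholders shaped like a dissociation curve, not the study's measurements:

```python
import numpy as np

# Illustrative PO2 (mm Hg) / SO2 (%) pairs; placeholders, NOT study data.
po2 = np.array([20.0, 35.0, 55.0, 70.0, 150.0, 500.0])
so2 = np.array([30.0, 66.0, 85.0, 92.0, 98.5, 100.0])

coeffs = np.polyfit(po2, so2, deg=4)  # fourth-order least-squares fit
so2_fit = np.polyval(coeffs, po2)     # fitted SO2 at the measured PO2 values
```

Fitted curves of this kind let the study compare dissociation behavior across lead concentrations without assuming a particular physiological model.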
ABSTRACT: The objective of this study was to conduct a national survey to understand the current state of biohazard controls for point-of-care testing (POCT). The survey was distributed to 50 professionals at U.S. hospitals, consulting firms, and accreditation and regulatory agencies by electronic mail with telephone follow-up. The authors surveyed physicians, POCT coordinators, infection control officers, safety officers, laboratory technicians, and consultants. They asked about disinfection protocols for POCT instruments, disposal of POCT wastes (e.g., test strips), concerns about biohazard controls for POCT, and types of pathogens potentially spread by POCT. The response rate was 80% (40 of 50). Responses were categorized, grouped, and analyzed. Thirty-four percent of the responding institutions did not have a defined disinfection protocol for POCT instruments other than universal precautions, whereas 66% had either a defined disinfection protocol in their nursing procedures or in an overlapping infection control policy. Ninety-three percent of the institutions stated that they did not have difficulty with proper disposal of POCT wastes, such as test strips, test cartridges, and lancets. The pathogens of most concern for spread by POCT were hepatitis B and C viruses (63%). The main concerns about biohazard control for POCT were instrument contamination, cross-infection, and nosocomial infection (27%), and improper transporting, handling, or disposal of waste (27%). Disinfection procedures for POCT instruments were not defined at many hospitals. In view of the wide range of concern and the potential risks for infection, guidelines for biohazard controls for POCT are warranted to protect the safety of patients and hospital staff.
Point of Care: The Journal of Near-Patient Testing & Technology 05/2003; 2(2):101-105.
ABSTRACT: Our goals: (a) to design value strategies for optimal diagnosis and treatment of sepsis; (b) to assess theoretical marginal penalties of inadequate antimicrobial treatment of intensive care patients with infections (case study); and (c) to analyse physician antibiotic alterations in patients with blood culture-proven bloodstream infections. Marginal penalties, which reflect extra procedures and excess costs arising from uninformed treatment decisions, were transformed to subject group‐distributed opportunity costs. Value analysis revealed substantial marginal penalties associated with adverse factors, such as increased ICU length of stay, procedures (catheterisation, mechanical ventilation, tracheostomy) and, ultimately, higher mortality, in critically ill patients. Investigation of 66 septicaemia patients (International Classification of Disease [ICD‐9] code 038 and related derivatives) hospitalised over 17 months revealed that physicians altered antibiotics extensively during three time intervals: TI1, collection of blood for culture to qualitative positive blood culture notification; TI2, notification to final MIC result; and TI3, final MIC result to 72 hours afterward. Empirical antibiotic alterations during TI1 may have adversely affected survival. Alterations peaked after blood collection and after notification of qualitative positive blood culture results. Based on patterns of alterations and 28‐day mortality, we hypothesize that nucleic acid testing, if used to identify organisms and rule in bloodstream infections early (4–6 h) following admission, will help facilitate diagnosis, focus antibiotic therapy, and avert dysfunctional sepsis cascades. Reduced "alteration debt" and marginal penalties should offset the costs of nucleic acid testing.
The potential for enhanced survival and improved outcomes warrants clinical trials of rapid nucleic acid testing to decrease indiscriminate antibiotic alterations, evaluate proposed value strategies, and test outcome hypotheses.
Scandinavian Journal of Clinical and Laboratory Investigation 01/2003; 63:16-26. · 1.29 Impact Factor
ABSTRACT: The objective of this study was to determine if heparin or ethylenediaminetetraacetic acid (EDTA) affected whole‐blood glucose measurements obtained with test strips and common handheld meter systems. Lithium heparin (15 USP units/mL), potassium (K2) EDTA (1.8 mg/dL), and no‐additive aliquots of venous blood from fifty‐five diabetic patients were tested using four handheld glucose meter systems: the SureStep Pro and the One Touch II (Lifescan, Milpitas, CA), the Precision PCx (Abbott Laboratories, MediSense Products, Bedford, MA), and the Accu‐Chek Comfort Curve (Roche Diagnostics, Indianapolis, IN), and the YSI 2700 reference analyzer (Yellow Springs Instruments, Yellow Springs, OH). Coefficients of variation for within‐day and between‐day precision ranged from 1.4% to 6.8% for the glucose meter systems. Heparin and EDTA produce system‐specific effects. Anticoagulant versus no‐additive test strip comparisons showed that the SureStep Pro, with heparin and EDTA, and the Precision PCx, with EDTA, produced statistically significant negative mean paired differences. Heparin minimized the frequency of errors in glucose meter results. Heparinized samples produced the lowest frequency of discrepant meter results outside clinical error tolerances (± 15 mg/dL, glucose ≤ 100 mg/dL; ± 15%, glucose > 100 mg/dL) when compared with YSI 2700 reference results for paired samples. Specimen restrictions arise from anticoagulant selection, hematocrit range, physiological factors, and processing time. To reduce the potential for medical errors, the relative effects of different specimen anticoagulants should be considered carefully when conducting comparison studies, planning clinical applications, and interpreting glucose results.
Point of Care: The Journal of Near-Patient Testing & Technology 02/2002; 1(1):2–8.
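The clinical error tolerance used in the study above to flag discrepant meter results can be expressed directly. The function name is ours, but the thresholds (± 15 mg/dL at reference glucose ≤ 100 mg/dL, ± 15% above 100 mg/dL) are those stated in the abstract:

```python
def within_clinical_tolerance(meter_mg_dl, reference_mg_dl):
    """Return True if a meter reading is within the study's clinical
    error tolerance relative to the YSI reference result:
    +/- 15 mg/dL when reference glucose <= 100 mg/dL, else +/- 15%."""
    if reference_mg_dl <= 100:
        return abs(meter_mg_dl - reference_mg_dl) <= 15
    return abs(meter_mg_dl - reference_mg_dl) <= 0.15 * reference_mg_dl

within_clinical_tolerance(110, 100)  # True: 10 mg/dL is within 15 mg/dL
within_clinical_tolerance(240, 200)  # False: 40 mg/dL exceeds 15% of 200 (30)
```

Note the absolute/percentage split: at low glucose a fixed band is clinically safer, while at higher glucose a proportional band is appropriate.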
ABSTRACT: This study was designed to validate in vitro oxygen saturation (SO2) measurements with the NOVA CO-Oximeter (Nova Biomedical Corp, Waltham, Mass, USA) in canine blood containing hemoglobin (Hb) glutamer-200 bovine (Hb-200; Oxyglobin, Biopure, Cambridge, Mass, USA) as a Hb-based oxygen carrier recently introduced into clinical practice. In the first set of experiments, stored blood from 6 mixed-breed canine blood donors was used. Target PO2 levels were reached in aliquots of blood samples by tonometry. Oxygen saturation was then measured with the test device and calculated based on known PO2 values. In the second set of experiments, total oxygen content was directly measured by means of an oxygen-specific electrode in aliquots of fresh whole arterial, venous, and mixed (arterial-venous) blood withdrawn from the same canine blood donors. Hb-200 was added to those blood samples to yield plasma Hb concentrations of 1.62, 3.25, 6.50, and 9.75 g/dL. Based on Hb content and SO2 measured by the NOVA CO-Oximeter in these samples, total oxygen content was also calculated for each sample and compared with measured values. A strong correlation was found between SO2 values measured with the co-oximeter in samples after tonometry, and calculated SO2 based on known PO2. Directly measured total blood O2 content varied by </=5% from values computed based on co-oximeter measurements of Hb content and SO2. These results did not change with different levels of oxygenation of the samples or different plasma Hb-200 concentrations. In conclusion, the NOVA CO-Oximeter is an accurate analyzer for measurement of SO2 after Hb-200 administration to canine blood.
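The abstract above does not spell out how total oxygen content was calculated from Hb content and SO2; a plausible sketch uses the standard blood oxygen-content formula (1.34 mL O2 bound per g Hb, plus dissolved O2 at 0.003 mL/dL per mm Hg), which is an assumption on our part rather than the study's documented method:

```python
def total_o2_content(hb_g_dl, so2_fraction, po2_mmhg):
    """Total blood O2 content (mL O2 per dL): hemoglobin-bound O2
    (1.34 mL O2 per g Hb, scaled by saturation) plus dissolved O2
    (0.003 mL O2 per dL per mm Hg of PO2). Standard constants; the
    abstract does not state the exact formula used."""
    return 1.34 * hb_g_dl * so2_fraction + 0.003 * po2_mmhg

# e.g., 15 g/dL Hb at 98% saturation with PO2 of 100 mm Hg:
# 1.34 * 15 * 0.98 + 0.003 * 100 = 19.998, about 20 mL O2/dL
```

Under this formula, the co-oximeter's Hb and SO2 readings fully determine the bound-O2 term, which is why the study could compare calculated against electrode-measured total O2 content.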
ABSTRACT: This study was designed to validate oxygen saturation measurements from the NOVA CO-Oximeter (NOVA Biomedical Corporation, Waltham, MA), the i-STAT System (Sensor Devices, Waukesha, WI), and the Corning 170 blood gas analyzer (Bayer Corporation, East Walpole, MA) under conditions similar to the clinical application of a hemoglobin-based oxygen carrier (HBOC, hemoglobin glutamer-200 [bovine]; Oxyglobin, Biopure Corporation, Cambridge, MA). A canine model was used for both in vitro and in vivo experiments. In vivo experiments were conducted in a canine laboratory, and in vitro experiments were conducted in a tonometry laboratory. Study subjects were six mixed-breed dogs, each weighing approximately 30 kg. In the first set of experiments, the target blood PO2 levels were reached by tonometry. In the second set of experiments, quantitative measurements of total oxygen content with the LEXO2CON-K (HOSPEX Fiberoptics, Chestnut Hill, MA) were performed, immediately followed by measurements with the NOVA CO-Oximeter and the i-STAT system. HBOC was added in concentrations of 16.2, 32.5, 65, and 97.5 g/L. To analyze the clinical significance of the differences in the results obtained with each investigated instrument, blood samples from dogs treated with HBOC after acute hemorrhagic shock were used. Oxygen saturation, oxygen content, and PO2 were measured. There was a strong correlation between the oxygen saturation values measured with the investigated instruments in samples after tonometry and known PO2. The total calculated oxygen content varied by 5% based on results generated by calculations using the investigated instruments. The results did not change with different oxygenation of the sample. The differences among methods were not significant when the HBOC concentration was 16.2 g/L.
Higher concentrations of HBOC increased the difference between calculated and measured oxygen content; the i-STAT system demonstrated a greater deviation compared with the results of the other two instruments. Systemic oxygen uptake using the investigated instruments showed a high correlation with values based on LEXO2CON-K measurements (R = 0.97 for CO-Oximeter, R = 0.96 for Corning 170 blood gas analyzer, and R = 0.79 for i-STAT system). Systemic oxygen uptake values based on CO-Oximeter and Corning 170 blood gas analyzer data showed 75% accuracy; i-STAT system accuracy was 63% for control samples and 50% for samples after HBOC infusion.
American Journal of Therapeutics 10(1):21-8. · 1.29 Impact Factor