Article

Clinical bacteriology in low-resource settings: today's solutions


Abstract

Low-resource settings are disproportionately burdened by infectious diseases and antimicrobial resistance. Good-quality clinical bacteriology through a well-functioning reference laboratory network is necessary for effective resistance control, but low-resource settings face infrastructural, technical, and behavioural challenges in the implementation of clinical bacteriology. In this Personal View, we explore what constitutes successful implementation of clinical bacteriology in low-resource settings and describe a framework for implementation that is suitable for general referral hospitals in low-income and middle-income countries with a moderate infrastructure. Most microbiological techniques and equipment are not developed for the specific needs of such settings. Pending the arrival of a new generation of diagnostics for these settings, we suggest focusing on improving, adapting, and implementing conventional, culture-based techniques. Priorities in low-resource settings include harmonised, quality-assured, and tropicalised equipment, consumables, and techniques, and rationalised bacterial identification and testing for antimicrobial resistance. Diagnostics should be integrated into clinical care and patient management; clinically relevant specimens must be appropriately selected and prioritised. Open-access training materials and information management tools should be developed. Also important is the need for onsite validation and field adoption of diagnostics in low-resource settings, with considerable shortening of the time between development and implementation of diagnostics. We argue that the implementation of clinical bacteriology in low-resource settings improves patient management, provides valuable surveillance for local antibiotic treatment guidelines and national policies, and supports containment of antimicrobial resistance and the prevention and control of hospital-acquired infections.


... Costs were converted to US Dollars ($, 2020) with the exchange rate $1 = 31.16 Baht. Data from the three laboratories were used to estimate the characteristics of specimens processed in a laboratory receiving 10,000 and 100,000 specimens in a year and the number of each item required to test those specimens. Construction costs were estimated from a Kenyan laboratory described in a previous report [5] and utility costs were estimated from Laboratories B and C. ...
... There has been substantial progress in developing new diagnostic technologies in recent years, such as automated identification and susceptibility testing, and selective media to facilitate identification of drug-resistant organisms. These can improve the quality and efficiency of microbiology diagnosis; however, they are usually out of reach of government laboratories in LMICs, either because of cost or because they are not adapted to use in tropical conditions [16]. ...
... Currency conversion: $1 = 31.16 Baht, $1 = 0.76 GBP (www.xe.com and https://www.bankofengland.co.uk/ as of 11th August 2020). (DOCX) S3 Table. ...
Article
Full-text available
Antimicrobial resistance (AMR) is a major threat to global health. Improving laboratory capacity for AMR detection is critically important for patient health outcomes and population-level surveillance. We aimed to estimate the financial cost of setting up and running a microbiology laboratory for organism identification and antimicrobial susceptibility testing as part of an AMR surveillance programme. Financial costs for setting up and running a microbiology laboratory were estimated using a top-down approach based on resource and cost data obtained from three clinical laboratories in the Mahidol Oxford Tropical Medicine Research Unit network. Costs were calculated for twelve scenarios, considering three levels of automation, with equipment sourced from either of the two leading manufacturers, and at low and high specimen throughput. To inform the costs of detection of AMR in existing labs, the unit cost per specimen and per isolate were also calculated using a micro-costing approach. Establishing a laboratory with the capacity to process 10,000 specimens per year ranged from $254,000 to $660,000, while the cost for a laboratory processing 100,000 specimens ranged from $394,000 to $887,000. Excluding capital costs to set up the laboratory, the cost per specimen ranged from $22–31 (10,000 specimens) and $11–12 (100,000 specimens). The cost per isolate ranged from $215–304 (10,000 specimens) and $105–122 (100,000 specimens). This study provides a conservative estimate of the costs for setting up and running a microbiology laboratory for AMR surveillance from a healthcare provider perspective. In the absence of donor support, these costs may be prohibitive in many low- and middle-income country (LMIC) settings.
With the increased focus on AMR detection and surveillance, the high laboratory costs highlight the need for more focus on developing cheaper and cost-effective equipment and reagents so that laboratories in LMICs have the potential to improve laboratory capacity and participate in AMR surveillance.
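The per-unit figures above follow from dividing annual costs by throughput. A minimal sketch of that arithmetic; the running cost and positivity rate below are illustrative placeholders, not values from the study — only the 10,000-specimen throughput mirrors the abstract:

```python
# Sketch of the per-unit costing arithmetic from a micro-costing perspective.
# Input figures are hypothetical, not taken from the study.

def unit_costs(annual_running_cost, n_specimens, positivity_rate):
    """Return (cost per specimen, cost per isolate), excluding capital costs."""
    cost_per_specimen = annual_running_cost / n_specimens
    n_isolates = n_specimens * positivity_rate  # specimens yielding an isolate
    cost_per_isolate = annual_running_cost / n_isolates
    return cost_per_specimen, cost_per_isolate

# Illustration: $250,000/year running cost, 10,000 specimens, 10% yielding an isolate.
per_specimen, per_isolate = unit_costs(250_000, 10_000, 0.10)
```

The spread between the per-specimen and per-isolate unit costs in the abstract reflects exactly this division by a much smaller isolate count.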
... Second, surveillance by clinical samples may be influenced by the type of clinical specimen (e.g. blood vs. respiratory tract secretions), previous antibiotic use as well as indications for sampling, factors that are often not standardized in LMIC [5]. Finally, clinical bacteriology in LMIC is typically implemented at the second level of care, i.e. the district referral hospital [6]. ...
... For smaller sample sizes (value in one of the cells ≤ 5), the Fisher exact test was used. All differences between isolates obtained from healthy pregnant women and febrile patients were statistically significant (p < 0.001). There was no statistical difference in resistance patterns between isolates growing in counts of 10⁴ CFU/ml and 10⁵ CFU/ml. AMR rates were significantly lower compared to E. coli isolates obtained from clinical samples (mostly blood cultures) in the same district. ...
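The snippet above applies the Fisher exact test when a cell count is 5 or less. A dependency-free sketch of the one-sided version for a 2×2 table, using hypothetical resistance counts (not data from the study):

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    probability of observing 'a' or more in the top-left cell under the
    hypergeometric null of no association."""
    r1, r2 = a + b, c + d            # row totals
    c1, n = a + c, a + b + c + d     # first column total, grand total
    k_max = min(r1, c1)              # largest feasible top-left count
    p = sum(comb(r1, k) * comb(r2, c1 - k)
            for k in range(a, k_max + 1)) / comb(n, c1)
    return p

# Hypothetical counts: 8/10 resistant clinical isolates vs 1/6 resistant urine isolates.
p = fisher_exact_greater(8, 2, 1, 5)
```

For these toy counts the exact tail probability is 280/11440 ≈ 0.024, below the conventional 0.05 threshold despite the tiny sample.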
Article
Full-text available
Background In low- and middle-income countries, surveillance of antimicrobial resistance (AMR) is mostly hospital-based and, in view of poor access to clinical microbiology, biased towards more resistant pathogens. We aimed to assess AMR among Escherichia coli isolates obtained from urine cultures of pregnant women as an indicator for community AMR and compared the AMR results with those from E. coli isolates obtained from febrile patients in previously published clinical surveillance studies conducted within the same population in Nanoro, rural Burkina Faso. We furthermore explored the feasibility of adding urine culture to standard antenatal care in a rural sub-Saharan African setting. Methods Between October 2016 and September 2018, midstream urine samples collected as part of routine antenatal care in Nanoro district were cultured by a dipslide method and screened for antibiotic residues. Significant growth was defined as a pure culture of Enterobacterales at counts of ≥ 10⁴ colony-forming units/ml. Results Significant growth was observed in 202/5934 (3.4%) cultures; E. coli represented 155 (76.7%) of isolates. Among E. coli isolates, resistance rates to ampicillin, cotrimoxazole and ciprofloxacin were 65.8%, 64.4% and 16.2%, respectively, compared to 89.5%, 89.5% and 62.5% among E. coli from clinical isolates (n = 48, of which 45 from blood cultures). Proportions of extended-spectrum beta-lactamase producers and multidrug resistance were 3.2% and 5.2% among E. coli isolates from urine in pregnant women versus 35.4% and 60.4%, respectively, among clinical isolates. Conclusions The E. coli isolates obtained from healthy pregnant women had significantly lower AMR rates compared to clinical E. coli isolates, probably reflecting the lower antibiotic pressure in the pregnant women population. Adding urine culture to the routine urine analysis (dipstick) of antenatal care was feasible.
The dipslide culture method was affordable and user-friendly and allowed on-site inoculation and easy transport; challenges were contamination (midstream urine sampling) and the semi-quantitative reading. Provided the present findings are confirmed in other settings, E. coli from urine samples of pregnant women may be a potential indicator for benchmarking, comparing, and monitoring community AMR rates across populations in different countries and regions.
... [7,9] Rapid diagnostics can be especially useful in secondary-level hospitals in India and below, as most of these hospitals do not have the necessary infrastructure and human resources to support pathogen identification and antimicrobial susceptibility testing (AMST) [10–12]. Over the last decade, several tests using the latest molecular techniques have been developed. Summary box: ► Rapid diagnostics for antimicrobial resistance (AMR) have enormous potential to support the adoption of diagnostic stewardship in settings with constrained healthcare resources. ► Huge efforts and investments have been made to develop rapid point-of-care diagnostics that can be effective in the containment of AMR in India. ...
... In low-income and middle-income countries (LMICs) like India, these imported tests, even when available on the private market, are not widely used owing to their steep prices and stringent infrastructure and human-resource requirements [12]. Typically, a diagnostic, once developed, undergoes systematic validation in the laboratory for accuracy of analytical parameters (figure 1). If found satisfactory, it is approved by the Indian regulator, the Drugs Controller General of India (DCGI), for market introduction. ...
Article
Full-text available
A good point-of-care diagnostic test holds promise to reduce inappropriate use of antibiotics by enabling early detection of the pathogen and facilitating rapid testing of antimicrobial susceptibility. India has taken many initiatives in the recent past to augment the development and deployment of diagnostics in the Indian healthcare system. Funding opportunities to promote innovation in diagnostics development were started in the early 2000s through various ministries and departments. India released a National Essential Diagnostics List of essential tests, and the Free Diagnostics Service Initiative of the Government of India under the National Health Mission now mandates that all essential tests be provided free of cost. We wanted to understand how these initiatives have impacted the diagnostics that could be of use in the containment of antimicrobial resistance (AMR) and whether there is a smooth process for bringing indigenously developed products relevant to AMR into the healthcare system. We conducted a longitudinal survey (January 2019 and January 2021) to understand the availability of market-ready indigenous rapid diagnostics for AMR in the country and their progress towards introduction in the private market or uptake in the healthcare system. We found that many innovators and developers are working towards the development of rapid tests that can be useful in the containment of AMR in India. While there are many promising diagnostics on the horizon, the pathway for uptake of indigenously developed diagnostics in the healthcare system remains disjointed and needs to be harmonised for the investments made towards development to translate into tangible gains. Since most of these efforts are government funded, it is incumbent upon the government to also provide a seamless pathway to make these diagnostics available in the healthcare system. In the absence of such guidance, most of these diagnostics will remain with the innovators/developers and will never be used for the purpose they were intended to serve.
... Current estimates of 48 h are based on automated blood culture systems, which show better performance in yield and speed, but pose financial and logistic challenges compared to manual blood culture systems, which are largely used in LMICs (31). The time-to-detection of growth is not clearly defined for manual blood cultures, and further research and innovation are needed in manual blood culture methods (32). For culture-positive cases, antibiotic treatment should be tailored to the pathogen's susceptibility using the narrowest spectrum antibiotic available. ...
... Although prospective HAI surveillance is the gold standard, it is neither feasible nor sustainable in many LMICs. Also, many low-resource settings lack diagnostic microbiology services, which are key to implementing prospective surveillance (32). An alternative method of surveillance, PPS using clinical definitions of HAI, although less robust, is less expensive and easier to conduct in resource-limited settings (117,118). ...
Article
Full-text available
Healthcare-associated infections (HAIs) and antimicrobial-resistant (AMR) infections are leading causes of neonatal morbidity and mortality, contributing to an extended hospital stay and increased healthcare costs. Although the burden and impact of HAI/AMR in resource-limited neonatal units are substantial, there are few HAI/AMR prevention studies in these settings. We reviewed the mechanism of action and evidence supporting HAI/AMR prevention interventions, including care bundles, for hospitalized neonates in low- and middle-income countries (LMIC).
... One of the five objectives of the WHO Global Action Plan on AMR is to strengthen the knowledge and evidence base through surveillance and research (3). WHO defines blood cultures (BC) as a priority specimen for AMR surveillance, and it is recommended to prioritize key clinical specimens in resource-limited settings (4,5). WHO also highlights diagnostic stewardship as integral to building up AMR surveillance systems, defining it as the "coordinated guidance and interventions to improve appropriate use of microbiological diagnostics to guide therapeutic decisions" (6). ...
... Implementation of EUCAST in resource-limited settings is especially hampered because defibrinated horse blood is not available for AST of fastidious bacteria. This has also been reported by others, and we recommend a low-resource-adapted EUCAST version to overcome these obstacles (4,28). ...
Article
Full-text available
Background: Blood cultures (BC) have high clinical relevance and are a priority specimen for surveillance of antimicrobial resistance. Manual BC are still most frequently used in resource-limited settings. Data on automated BC performance in Africa are scarce. We implemented automated BC at a surveillance site of the African Network for improved Diagnostics, Epidemiology and Management of Common Infectious Agents (ANDEMIA). Methods: Between June 2017 and January 2018, pairs of automated BC (BacT/ALERT® FA Plus) and manual BC (brain-heart infusion broth) were compared at a University hospital in Bouaké, Côte d'Ivoire. BC were each inoculated with a target blood volume of 10 ml from the same venipuncture. Automated BC were incubated for up to 5 days, manual BC for up to 10 days. Terminal subcultures were performed for manual BC only. The two systems were compared regarding yield, contamination, and turnaround time. For quality assurance, isolates were retested in a German routine microbiological laboratory. Results: BC sampling increased from on average 24 BC to 63 BC per month. A total of 337 matched pairs of BC were included. Automated BC were positive in 36.5%, manual BC in 24.0% (p-value < 0.01); the proportion of contamination was 47.9% and 43.8%, respectively (p-value = 1.0). Turnaround time of positive BC was shortened by 2.5 days with automated compared to manual BC (p < 0.01). The most commonly detected pathogens in both systems were Klebsiella spp. (26.0%) and Staphylococcus aureus (18.2%). Most contaminants were members of the skin flora. Retesting of 162 isolates was concordant in 79.6% at family level. Conclusions: Implementing automated BC in a resource-limited setting is possible and improves microbiological diagnostic performance. Automated BC increased yield and shortened turnaround times. Regular training and mentorship of clinicians have to be intensified to increase the number and quality of BC.
Pre-analytical training to improve diagnostic stewardship is essential when implementing a new microbiological method. Retesting highlighted that manual identification and antimicrobial susceptibility testing can be of good quality and sustainable. The implementation of automated tools should be decided individually according to economic considerations, number of samples, stable supply chain of consumables, and technical sustainability.
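Because the two blood culture systems above were compared on matched pairs drawn from the same venipuncture, a paired analysis such as McNemar's exact test is the natural choice. The sketch below uses hypothetical discordant-pair counts, since the abstract reports only the marginal yields (36.5% vs 24.0%):

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact McNemar test for paired yes/no outcomes.
    b = pairs positive only on system 1, c = pairs positive only on system 2.
    Two-sided p-value from a Binomial(b + c, 0.5) null on the discordant pairs."""
    n = b + c
    k = min(b, c)
    p_one_sided = sum(comb(n, i) for i in range(0, k + 1)) / 2 ** n
    return min(1.0, 2 * p_one_sided)  # cap at 1 after doubling the tail

# Hypothetical discordant counts: 50 pairs positive only on the automated
# system, 8 pairs positive only on the manual system.
p = mcnemar_exact(50, 8)
```

With such an imbalance in discordant pairs the p-value is far below 0.01, consistent with the direction of the comparison reported above; the actual discordant counts would be needed to reproduce the study's result.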
... Shortages in the staff and facilities to perform routine microbiological testing may result in ineffectual prophylactic antibiotic usage. 17 We found use of laparoscopy was associated with lower SSI rates, an effect which persisted when HDI was accounted for. Barriers to uptake of surgical technologies (including training, treatment costs and lack of supportive infrastructure for technologies including servicing, equipment support staff, distribution and repair capability), such as laparoscopy, are likely to greatly affect LMICs and increase the observed rate of SSI in children. ...
Article
Introduction Surgical site infection (SSI) is one of the most common healthcare-associated infections (HAIs). However, there is a lack of data available about SSI in children worldwide, especially from low-income and middle-income countries. This study aimed to estimate the incidence of SSI in children and associations between SSI and morbidity across human development settings. Methods A multicentre, international, prospective, validated cohort study of children aged under 16 years undergoing clean-contaminated, contaminated or dirty gastrointestinal surgery. Any hospital in the world providing paediatric surgery was eligible to contribute data between January and July 2016. The primary outcome was the incidence of SSI by 30 days. Relationships between explanatory variables and SSI were examined using multilevel logistic regression. Countries were stratified into high development, middle development and low development groups using the United Nations Human Development Index (HDI). Results Of 1159 children across 181 hospitals in 51 countries, 523 (45.1%) children were from high HDI, 397 (34.2%) from middle HDI and 239 (20.6%) from low HDI countries. The 30-day SSI rate was 6.3% (33/523) in high HDI, 12.8% (51/397) in middle HDI and 24.7% (59/239) in low HDI countries. SSI was associated with higher incidence of 30-day mortality, intervention, organ-space infection and other HAIs, with the highest rates seen in low HDI countries. Median length of stay in patients who had an SSI was longer (7.0 days), compared with 3.0 days in patients who did not have an SSI. Use of laparoscopy was associated with significantly lower SSI rates, even after accounting for HDI. Conclusion The odds of SSI in children are nearly four times greater in low HDI compared with high HDI countries. Policies to reduce SSI should be prioritised as part of the wider global agenda.
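The "nearly four times greater odds" headline can be illustrated with a crude (unadjusted) odds ratio computed from the raw counts in the abstract. Note the study's own estimate comes from multilevel logistic regression, so this crude figure differs from the adjusted one:

```python
def odds_ratio(events_a, n_a, events_b, n_b):
    """Crude (unadjusted) odds ratio of an event in group A vs group B."""
    odds_a = events_a / (n_a - events_a)  # odds in group A
    odds_b = events_b / (n_b - events_b)  # odds in group B
    return odds_a / odds_b

# Crude contrast from the abstract: 59/239 SSIs in low-HDI vs 33/523 in high-HDI children.
or_low_vs_high = odds_ratio(59, 239, 33, 523)
```

The crude ratio here is close to 5; the adjusted multilevel model shrinks it towards the "nearly four times" reported, which is why adjustment for clustering and case mix matters.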
... During the visual inspection of bacterial growth, different parameters are assessed, such as the appearance of turbidity ("cloudiness") caused by undissolved particles in the broth, the deposition of bacterial colonies as "puff balls" at the bottom of the BCB, or as pellicle formation at the liquid-air interface, and gas production [7]. Automated systems outperform manual systems in terms of time-to-detection (TTD) and growth detection [7,11–16], but they are expensive, require regular maintenance, and are not adapted to the environmental conditions commonly seen in low-resource settings (LRS) [17]. Therefore, many LRS laboratories still resort to manual blood culture systems. ...
Article
Full-text available
Bloodstream infections and antimicrobial resistance are an increasing problem in low-income countries. There is a clear need for adapted diagnostic tools. To address this need, we developed a simple, universal reader prototype that detects bacterial growth in blood culture bottles. Our “turbidimeter” evaluates bacterial growth, based on the turbidity of the broth and the color change of the colorimetric CO2 indicator in commercially available blood culture bottles. A total of 60 measurements were performed using 10 relevant microbial species, spiked in horse blood, to compare the turbidimeter’s performance with that of an automatic reference system. The turbidimeter was able to detect growth in all but one of the spiked blood culture bottles. In the majority (7/10) of the species tested, time-to-detection of the turbidimeter was shown to be non-inferior to the reference automated time-to-detection. This was, however, only the case when both the turbidity and color change in the colorimetric CO2-indicator were used to evaluate growth. We could not demonstrate the non-inferiority of the turbidity measurement alone. Overall, the turbidimeter performed well, but we also identified some improvements that will be implemented in the next version of the prototype.
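The combined read-out described above — flag growth when either the broth turbidity or the colorimetric CO2 indicator shifts — amounts to a simple OR-rule. A sketch with hypothetical signal names and thresholds, not the prototype's actual calibration:

```python
# Illustrative decision rule combining the two optical signals described in
# the abstract. Threshold values and variable names are hypothetical.

TURBIDITY_THRESHOLD = 0.30    # relative attenuation vs the bottle's baseline
COLOR_SHIFT_THRESHOLD = 0.15  # relative colour change of the CO2 indicator

def growth_detected(turbidity_change, co2_color_change):
    """Flag growth if either signal exceeds its threshold, mirroring the
    combined read-out that made the prototype non-inferior to the automated
    reference, whereas turbidity alone was not shown to be non-inferior."""
    return (turbidity_change >= TURBIDITY_THRESHOLD
            or co2_color_change >= COLOR_SHIFT_THRESHOLD)
```

The OR-combination matters because some species acidify the CO2 indicator before producing visible turbidity, which is exactly why turbidity alone under-performed in the evaluation.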
... However, the use of microbiological data in this study was quite low. Studies show that clinical microbiological services in low-resource settings are traditionally limited for a multitude of reasons, including lack of infrastructure, equipment, quality assurance, personnel, and training (30). However, all hospitals included in this study were tertiary care hospitals and had the capacity to perform antimicrobial susceptibility testing. ...
Article
Full-text available
Background To develop effective antimicrobial stewardship programs (ASPs) for low- and middle-income countries (LMICs), it is important to identify key targets for improving antimicrobial use. We sought to systematically describe the prevalence and patterns of antimicrobial use in three LMIC hospitals. Methods Consecutive patients admitted to the adult medical wards in three tertiary care hospitals in Tanzania, Kenya, and Sri Lanka were enrolled in 2018–2019. The medical record was reviewed for clinical information including type and duration of antimicrobials prescribed, indications for antimicrobial use, and microbiologic testing ordered. Results A total of 3,149 patients were enrolled during the study period: 1,103 from Tanzania, 750 from Kenya, and 1,296 from Sri Lanka. The majority of patients were male (1,783, 56.6% overall) with a median age of 55 years (IQR 38–68). Of enrolled patients, 1,573 (50.0%) received antimicrobials during their hospital stay: 35.4% in Tanzania, 56.5% in Kenya, and 58.6% in Sri Lanka. At each site, the most common indication for antimicrobial use was lower respiratory tract infection (LRTI; 40.2%). However, 61.0% received antimicrobials for LRTI in the absence of LRTI signs on chest radiography. Among patients receiving antimicrobials, tools to guide antimicrobial use were under-utilized: microbiologic cultures in 12.0% and microbiology consultation in 6.5%. Conclusion Antimicrobials were used in a substantial proportion of patients at tertiary care hospitals across three LMIC sites. Future ASP efforts should include improving LRTI diagnosis and treatment, developing antibiograms to direct empiric antimicrobial use, and increasing use of microbiologic tests.
... CBC and culture and sensitivity tests are important in assessing treatment outcomes; for instance, a CBC value within normal limits (3.7–9.4 × 10⁹/mm³) suggests a favourable outcome [35], while culture and sensitivity testing facilitates the selection of the most appropriate antibiotics [27,36,37]. In LMICs, the capacity of clinical microbiology laboratories is very low, and even where capacity is not limited, such laboratories are underutilized [22,38,39]. The inadequate capacity is mainly due to inadequate laboratory infrastructure, a lack of adequately trained/qualified staff, and limited resources to procure laboratory consumables [40–42]. ...
Article
Full-text available
Ceftriaxone has a high propensity for misuse because of its high rate of utilization. In this study, we aimed to assess the appropriateness of the clinical utilization of ceftriaxone in nine health facilities in Uganda. Using the World Health Organization (WHO) Drug Use Evaluation indicators, we reviewed a systematic sample of 885 patients’ treatment records selected over a three-month period. Our results showed that prescriptions were written mostly by medical officers at 53.3% (470/882). Ceftriaxone was prescribed mainly for surgical prophylaxis at 25.3% (154/609), respiratory tract infections at 17% (104/609), and sepsis at 11% (67/609), as well as for non-recommended indications such as malaria at 7% (43/609) and anemia at 8% (49/609). Ceftriaxone was mostly prescribed once daily (92.3%; 817/885), as a 2 g dose (50.1%; 443/885), and for 5 days (41%; 363/885). The average score of inappropriate use of ceftriaxone across the eight indicators was 32.1%. Only 58.3% (516/885) of the ceftriaxone doses prescribed were administered to completion. Complete blood count and culture and sensitivity testing rates were 38.8% (343/885) and 1.13% (10/885), respectively. Over 85.4% (756/885) of the patients improved and were discharged. Factors associated with appropriate ceftriaxone use were gender, pregnancy status, days of hospitalization, health facility level of care, health facility type, and type of prescriber.
... As a consequence, meta-analyses describing antimicrobial resistance rates in Africa report high rates of resistance but emphasize the scarcity of data, the bias towards tertiary centres in urban areas and the lack of microbiological quality control in most studies [7,8]. The financial, logistic and infrastructural challenges associated with clinical bacteriology in LMIC have been described elsewhere [2,9–12]. However, it has also been demonstrated that acceptable-quality bacteriology can be achieved in these settings when provided with adequate financial, logistic and supervisory support [3]. ...
Preprint
Full-text available
Use of equipment-free, “manual” blood cultures is still widespread in low-resource settings, as requirements for implementation of automated systems are often not met. The quality of manual blood culture bottles currently on the market, however, is usually unknown. An acceptable quality in terms of yield and speed of growth can be ensured by evaluating the bottles using simulated blood cultures. In these experiments, bottles from different systems are inoculated in parallel with blood and a known quantity of bacteria. Based on a literature review and personal experience, we propose a short and practical protocol for an efficient evaluation of manual blood culture bottles, aimed at research or reference laboratories in low-resource settings. This laboratory protocol was used in a study for Médecins Sans Frontières' Mini-Lab project, which aims to bring clinical bacteriology to low-resource settings. Three bottle types were evaluated in this study: two "manual" blood culture bottles and one automated system.
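Simulated blood cultures require delivering a known, small inoculum from a dense stock, which in practice means serial dilution. A sketch of the ten-fold dilution arithmetic such a protocol implies; the stock density, target inoculum, and volumes below are illustrative, not the protocol's specified values:

```python
def dilution_plan(stock_cfu_per_ml, target_cfu, inoculum_volume_ml=0.1,
                  step_factor=10):
    """Plan a serial dilution of a bacterial stock so that 'inoculum_volume_ml'
    of the final dilution delivers roughly 'target_cfu' into each simulated
    blood culture bottle. Returns (number of dilution steps, CFU delivered)."""
    needed_conc = target_cfu / inoculum_volume_ml  # CFU/ml required in final tube
    steps = 0
    conc = stock_cfu_per_ml
    while conc > needed_conc:   # dilute until at or below the needed concentration
        conc /= step_factor
        steps += 1
    return steps, conc * inoculum_volume_ml

# Illustration: a 1e8 CFU/ml stock, aiming for ~100 CFU per bottle in 0.1 ml.
steps, cfu = dilution_plan(1e8, 100)
```

In a real evaluation the delivered inoculum is verified by plate counts of the final dilution, since nominal CFU from optical density can be off by several-fold.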
Article
Full-text available
Background Antimicrobial resistance (AMR) is a public health crisis of global proportions. Data is required to understand the local drivers of antimicrobial resistance and support decision-making processes including implementation of appropriate antimicrobial stewardship strategies. Objectives To measure antimicrobial usage in hospitals in Ghana. Methods Using the Global Point Prevalence instruments and processes, we conducted point prevalence surveys across AMR surveillance sentinel hospitals in Ghana, between September and December 2019. Hospital records of all inpatients on admission at 0800 hours on a specific day were reviewed for antimicrobial use at the time of the survey. Data on antibiotic use, including indication for use and quality of prescribing were recorded. Results Overall prevalence of antibiotic use across the sentinel sites was 54.9% (n = 1591/2897), ranging between 48.4% (n = 266/550) and 67.2% (n = 82/122). The highest prevalence of antibiotic use 89.3% (n = 25/28) was observed in adult ICUs. The average number of antibiotics prescribed per patient was 1.7 (n = 1562/2620), with the majority (66%, n = 728/2620) administered via the parenteral route. The five most-commonly used antibiotics were metronidazole (20.6%, n = 541/2620), cefuroxime (12.9%, n = 338/2620), ceftriaxone (11.8%, n = 310/2620), amoxicillin/clavulanic acid (8.8%, n = 231/2620) and ciprofloxacin (7.8%, n = 204/2620). The majority (52.2%; n = 1367/2620) of antibiotics were prescribed to treat an infection, whilst surgical prophylaxis accounted for 26.1% (n = 684/2620). Conclusions We observed a high use of antibiotics including metronidazole and cephalosporins at the participating hospitals. Most antibiotics were empirically prescribed, with low use of microbiological cultures. High usage of third-generation cephalosporins especially for community-acquired infections offers an opportunity for antibiotic stewardship interventions.
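Point-prevalence estimates such as the 54.9% above are usually accompanied by a confidence interval. A sketch of the Wilson score interval, applied to the overall counts reported in the abstract (the study itself does not state which interval method, if any, was used):

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval (default 95%) for a proportion,
    suitable for point-prevalence estimates such as antibiotic use on a
    single survey day."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# Overall prevalence reported above: 1591 of 2897 inpatients on antibiotics.
lo, hi = wilson_ci(1591, 2897)
```

With nearly 3,000 patients the interval is tight (roughly ±2 percentage points), so the between-site spread of 48.4% to 67.2% reflects genuine heterogeneity rather than sampling noise.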
Article
Background Many studies have reported the interactive effects between relative humidity and temperature on infectious diseases. However, evidence regarding the combined effects of relative humidity and temperature on bacillary dysentery (BD) is limited, especially for large-scale studies. To address this research need, humidex was utilized as a comprehensive index of relative humidity and temperature. We aimed to estimate the effect of humidex on BD across mainland China, evaluate its heterogeneity, and identify potential effect modifiers. Methods Daily meteorological and BD surveillance data from 2014 to 2016 were obtained for 316 prefecture-level cities in mainland China. Humidex was calculated on the basis of relative humidity and temperature. A multicity, two-stage time series analysis was then performed. In the first stage, a common distributed lag non-linear model (DLNM) was established to obtain city-specific estimates. In the second stage, a multivariate meta-analysis was conducted to pool these estimates, assess the significance of heterogeneity, and explore potential effect modifiers. Results The pooled cumulative estimates showed that humidex could promote the transmission of BD. The exposure-response relationship was nearly linear, with a maximum cumulative relative risk (RR) of 1.45 [95% confidence interval (CI): 1.29–1.63] at a humidex value of 40.94. High humidex had an acute adverse effect on BD. The humidex-BD relationship could be modified by latitude, urbanization rate, the natural growth rate of population, and the number of primary school students per thousand persons. Conclusions High humidex could increase the risk of BD incidence. Thus, it is suitable to incorporate humidex as a predictor into the early warning system of BD and to inform the general public in advance to be cautious when humidex is high. 
This is especially true for regions with higher latitude, higher urbanization rates, lower natural growth rates of population, and lower numbers of primary school students per thousand persons.
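The abstract does not spell out the humidex formula used. A sketch based on the standard Environment Canada definition, with the Magnus approximation (my choice; the study may use a different parameterisation) to derive dewpoint from relative humidity:

```python
import math

def humidex(temp_c: float, rel_humidity_pct: float) -> float:
    """Humidex from air temperature (deg C) and relative humidity (%).

    Dewpoint is approximated with the Magnus formula; vapour pressure and the
    humidex adjustment follow the Environment Canada convention.
    """
    # Magnus approximation for dewpoint (deg C)
    a, b = 17.27, 237.7
    gamma = (a * temp_c / (b + temp_c)) + math.log(rel_humidity_pct / 100.0)
    dewpoint = b * gamma / (a - gamma)
    # Vapour pressure (hPa) at the dewpoint, then the humidex adjustment
    e = 6.11 * math.exp(5417.7530 * (1 / 273.16 - 1 / (273.15 + dewpoint)))
    return temp_c + 0.5555 * (e - 10.0)
```

At 30 °C and 70% relative humidity this yields a humidex of roughly 41, in the range where the study reports the maximum cumulative relative risk.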
Article
Containing antimicrobial resistance and reducing high levels of antibiotic consumption in low- and lower middle-income countries are major challenges. Clinical guidelines targeting antibiotic prescribing can reduce consumption; however, achieving adoption of and adherence to clinical guidelines is challenging for developers, policy makers, and users. The aim of this study was to review the strategies used for implementing and promoting antibiotic guideline adherence in low- and lower middle-income countries. A review of published literature was conducted using PubMed, Cochrane Library, SCOPUS and the information systems of the World Health Organization and the Australian National University, according to PRISMA guidelines and our PROSPERO protocol. The strategies were grouped into five broad categories based on the Cochrane Effective Practice and Organization of Care taxonomy. The 33 selected studies, representing 16 countries, varied widely in design, setting, disease focus, methods, intervention components, outcomes and effects. The majority of interventions were multifaceted and resulted in a positive direction of effect. The nature of the interventions and study variability made it impossible to tease out which strategies had the greatest impact on improving clinical guideline compliance. Audit and feedback coupled with either workshops and/or focus group discussions were the most frequently used intervention components. All the reported strategies are established practices used in antimicrobial stewardship programs in high-income countries. We recommend that interrupted time series studies be used as an alternative design to pre- and post-intervention studies, that information about the clinical guidelines be made more transparent, and that prescriber confidence be investigated.
Article
Background: Bloodstream infections (BSIs) are a major cause of morbidity and mortality in hospitalized neonates. Data on antibiotic resistance in neonatal BSIs and their impact on clinical outcomes in Africa are limited. Methods: We conducted a prospective cohort study at 2 tertiary-level neonatal intensive care units (NICUs) in Ghana. All neonates admitted to the NICUs were included from October 2017 to September 2019. We monitored BSI rates and analyzed the effect of BSI and antibiotic resistance on mortality and duration of hospitalization. Results: Of 5433 neonates included, 3514 had at least one blood culture performed and 355 had growth of a total of 368 pathogenic microorganisms. Overall incidence of BSI was 1.0 (0.9-1.1) per 100 person-days. The predominant organisms were Klebsiella pneumoniae 49.7% (183/368) and Streptococcus spp. 10.6% (39/368). In addition, 512 coagulase-negative staphylococci were isolated but considered probable contaminants. Among K. pneumoniae, resistance to gentamicin and amikacin was 91.8% and 16.4%, respectively, while carbapenem resistance was 4.4%. All-cause mortality among enrolled neonates was 19.7% (1066/5416). The mortality rate was significantly higher in neonates with BSI compared with culture-negative neonates in univariate analysis (27.9%, n = 99/355 vs. 16.5%, n = 520/3148; hazard ratio 1.4, 95% confidence interval 1.07-1.70) but not in multivariate analysis. Conclusion: The diversity of etiological agents and the high risk of antibiotic resistance suggest that standard empirical treatment is unlikely to improve the outcome of BSIs in low- and middle-income countries. Such improvements will depend on access to reliable clinical microbiologic services.
Article
In settings with limited resources and a wide range of possible etiologies, molecular technologies offer an effective solution for infectious disease diagnostics, because they are agile, fast and flexible. Health systems that routinely use molecular diagnostics will achieve economies of scale, maximize limited expertise and rapidly respond to new threats.
Article
Background Routine microbiology results are a valuable source of antimicrobial resistance (AMR) surveillance data in low- and middle-income countries (LMICs) as well as in high-income countries. Different approaches and strategies are used to generate AMR surveillance data. Objectives We aimed to review strategies for AMR surveillance using routine microbiology results in LMICs and to highlight areas that need support to generate high-quality AMR data. Sources We searched PubMed for papers that used routine microbiology to describe the epidemiology of AMR and drug-resistant infections in LMICs. We also included papers that, from our perspective, were critical in highlighting the biases and challenges or employed specific strategies to overcome these in reporting AMR surveillance in LMICs. Content Topics covered included strategies of identifying AMR cases (including case-finding based on isolates from routine diagnostic specimens and case-based surveillance of clinical syndromes), of collecting data (including cohort, point-prevalence survey, and case-control), of sampling AMR cases (including lot quality assurance surveys), and of processing and analysing data for AMR surveillance in LMICs. Implications The various AMR surveillance strategies warrant a thorough understanding of their limitations and potential biases to ensure maximum utilization and interpretation of local routine microbiology data across time and space. For instance, surveillance using case-finding based on results from clinical diagnostic specimens is relatively easy to implement and sustain in LMIC settings, but the estimates of incidence and proportion of AMR are at risk of bias due to underuse of microbiology. Case-based surveillance of clinical syndromes generates informative statistics that can be translated into clinical practice but needs financial and technical support, and locally tailored training, to sustain.
Innovative AMR surveillance strategies that can be easily implemented and sustained with minimal costs will be useful for improving AMR data availability and quality in LMICs.
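One bias highlighted above comes from repeat sampling of the same patient in routine diagnostic data. A common mitigation in routine AMR analysis (the "first isolate per patient" rule, as in CLSI M39 practice) can be sketched as follows; field names are illustrative, not from any system described here:

```python
def first_isolates(records):
    """Keep only the chronologically first isolate per (patient, species) pair."""
    seen, kept = set(), []
    for rec in sorted(records, key=lambda r: r["date"]):
        key = (rec["patient"], rec["species"])
        if key not in seen:
            seen.add(key)
            kept.append(rec)
    return kept

def resistance_proportion(records):
    """Proportion of resistant isolates after first-isolate deduplication."""
    dedup = first_isolates(records)
    return sum(r["resistant"] for r in dedup) / len(dedup)

# Without deduplication, patient A's repeat resistant isolate would inflate
# the resistance proportion from 1/2 to 2/3.
records = [
    {"patient": "A", "species": "E. coli", "date": "2021-01-01", "resistant": True},
    {"patient": "A", "species": "E. coli", "date": "2021-01-05", "resistant": True},
    {"patient": "B", "species": "E. coli", "date": "2021-01-02", "resistant": False},
]
```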
Article
Background Culture media are fundamental in clinical microbiology. In laboratories in low- and middle-income countries (LMIC), they are mostly prepared in-house, which is challenging. Objectives This narrative review describes challenges related to culture media in LMIC, compiles best practices for in-house media preparation, gives recommendations to improve access to quality-assured culture media products in LMIC and formulates outstanding questions for further research. Sources Scientific literature was searched using PubMed and predefined MeSH terms. In addition, grey literature was screened, including manufacturers' websites and manuals as well as microbiology textbooks. Content Bacteriology laboratories in LMIC often face challenges at multiple levels: lack of clean water and uninterrupted power supply, high environmental temperatures and humidity, dust, inexperienced and poorly trained staff, and a variable supply of consumables (often of poor quality). To deal with this at a base level, one should be very careful in selecting culture media. It is recommended to look for products supported by the national reference laboratory that are distributed by an in-country supplier. Correct storage is key, as are appropriate preparation and waste management. Centralized media acquisition has been advocated for LMICs, a role that can be taken up by national reference laboratories alongside their guidance and support of local laboratories. In addition, private in vitro diagnostic manufacturers, who are often still unfamiliar with the LMIC market and the plethora of bacteriology products, have an important role in the tropicalization and customization of culture media formulations. Implication The present narrative review will assist clinical microbiology laboratories in LMICs to establish best practices for handling culture media by defining quality, regulatory and research paths.
Article
Background In low- and middle-income countries (LMIC), data related to antimicrobial resistance (AMR) are often inconsistently collected. Humanitarian, private, and non-governmental medical organizations (NGO), working with or in parallel to public medical systems, are sometimes present in these contexts. Yet, what is the role of NGOs in the fight against AMR, and how can they contribute to AMR data collection in contexts where reporting is scarce? How can context-adapted, high-quality clinical bacteriology be implemented in remote, challenging, and underserved areas of the world? Aim To provide an overview of AMR data collection challenges in LMIC and describe one initiative, the Mini-Lab project developed by Médecins Sans Frontières (MSF), that attempts to partially address them. Sources We conducted a literature review using PubMed and Google Scholar databases to identify peer-reviewed research and grey literature from publicly available reports and websites. Content We address the necessity of and difficulties related to obtaining AMR data in LMIC, as well as the role that actors outside of public medical systems can play in the collection of this information. We then describe how the Mini-Lab can provide simplified bacteriological diagnosis and AMR surveillance in challenging settings. Implication NGOs are responsible for a large amount of healthcare provision in some very low-resourced contexts. As a result, they also have a role in AMR control, including bacteriological diagnosis and the collection of AMR-related data. Actors outside of the public medical system can actively contribute to implementing and adapting clinical bacteriology in LMIC and can help improve AMR surveillance and data collection.
Article
Data on comprehensive population-based surveillance of antimicrobial resistance are lacking. In low- and middle-income countries, the challenges are high due to weak laboratory capacity, poor health systems governance, lack of health information systems, and limited resources. Developing countries struggle with political and social dilemmas and bear a high health and economic burden of communicable diseases. Available data are fragmented and lack representativeness, which limits their use for advising health policy makers and orienting the efficient allocation of funding and financial resources to programs to mitigate resistance. Low-quality data mean soaring rates of antimicrobial resistance and the inability to track and map the spread of resistance, detect early outbreaks, and set national health policy to tackle resistance. Here, we review the barriers and limitations of conducting effective antimicrobial resistance surveillance, and we highlight multiple incremental approaches that may offer opportunities to strengthen population-based surveillance if tailored to the context of each country.
Article
Objective: The aim of this study was to determine the prevalence and antibiotic resistance patterns of bacterial isolates from inpatients and outpatients in Mbale and Soroti regional referral hospitals in Eastern Uganda. Methods: A retrospective analysis of culture and antibiotic sensitivity test results from the microbiology laboratories of the two tertiary hospitals was conducted for a 3-year period (January 2016-December 2018). Results: Microbiology records of 3092 patients were reviewed and analyzed, of which 1305 (42.1%) samples yielded clinical isolates. The most prevalent isolates were Escherichia coli (n = 442; 33.9%), Staphylococcus aureus (n = 376; 28.8%), Klebsiella pneumoniae (n = 237; 18.2%), and Streptococcus pneumoniae (n = 76; 5.8%). High rates of antimicrobial resistance were detected across both Gram-negative and Gram-positive bacteria. E. coli and K. pneumoniae were resistant to several agents such as amoxicillin/clavulanate (83.5%; 64.6%), cefotaxime (74.2%; 52.7%), ciprofloxacin (92.1%; 27.8%), gentamicin (51.8%; 76%), imipenem (3.2%; 10.5%), tetracycline (98%; 74.5%), and trimethoprim-sulfamethoxazole (74.1%; 74.3%), respectively. S. aureus and S. pneumoniae exhibited the following resistance profile: cefoxitin (44.4%; 40.9%), chloramphenicol (69.1%; 27.6%), clindamycin (21.5%; 24.4%), gentamicin (83.2%; 66.9%), penicillin (46.5%; -), tetracycline (85.6%; 97.6%), trimethoprim-sulfamethoxazole (88%; 91.3%), and vancomycin (41.2%; -). Conclusion: We observed high resistance rates to antibiotics among the majority of microorganisms isolated from samples collected from patients in Eastern Uganda. Measures should be undertaken locally to improve microbiology diagnostics and to prevent the spread of antibiotic-resistant strains, as resistance impedes the optimal treatment of bacterial infections and narrows the choice of effective therapeutic options.
Article
Bacterial identification is challenging in low-resource settings (LRS). We evaluated the MicroScan identification panels (Beckman Coulter, Brea, CA, USA) as part of Médecins Sans Frontières' Mini-Lab Project. The MicroScan Dried Overnight Positive ID Type 3 (PID3) panels for Gram-positive organisms and Dried Overnight Negative ID Type 2 (NID2) panels for Gram-negative organisms were assessed with 367 clinical isolates from LRS. Robustness was studied by inoculating Gram-negative species on the Gram-positive panel and vice versa. The ease of use of the panels and readability of the instructions for use (IFU) were evaluated. Of species represented in the MicroScan database, 94.6% (185/195) of Gram-negative and 85.9% (110/128) of Gram-positive isolates were correctly identified up to species level. Of species not represented in the database (e.g., Streptococcus suis and Bacillus spp.), 53.1% of 49 isolates were incorrectly identified as unrelated bacterial species. Testing of Gram-positive isolates on Gram-negative panels and vice versa (n = 144) resulted in incorrect identifications for 38.2% of tested isolates. The readability level of the IFU was considered too high for LRS. Inoculation of the panels was favorably evaluated, whereas visual reading of the panels was considered error-prone. In conclusion, the accuracy of the MicroScan identification panels was excellent for Gram-negative species and good for Gram-positive species. Improvements in stability, robustness, and ease of use have been identified to assure adaptation to LRS constraints.
Article
Significance While antimicrobial resistance is an urgent global problem, substantial clinical surveillance gaps exist in low- and middle-income countries (LMICs). We fill the gaps in the global prevalence map of nine pathogens, resistant to 19 (classes of) antibiotics (representing 75 unique combinations), based on the robust correlation between countries’ socioeconomic profiles and extensive surveillance data. Our estimates for carbapenem-resistant Acinetobacter baumannii and third-generation cephalosporin-resistant Escherichia coli benefit over 2.2 billion people in countries with currently insufficient diagnostic capacity. We show how structural surveillance investments can be prioritized based on the magnitude of prevalence estimated (Middle Eastern countries), the relative prevalence increase over 1998 to 2017 (sub-Saharan African countries), and the improvement of model performance achievable with new surveillance data (Pacific Islands).
Article
Introduction and Objectives Multidrug-resistant (MDR) Klebsiella pneumoniae is increasing worldwide, with poorly characterized epidemiology in many parts of the world, particularly in Africa. This study aimed to investigate the molecular epidemiology of K. pneumoniae, to identify the diversity of sequence types (STs), and to detect carbapenem resistance genes in major regional hospitals in Khartoum, Sudan. Methods K. pneumoniae isolates (n = 117) were cultured from four hospitals in Khartoum, from April 2015 to October 2016. The isolates were characterised by sequencing of the 16S-23S rDNA internal transcribed spacer (ITS) region. Molecular epidemiology was determined by multilocus sequence typing (MLST) and analysed by maximum likelihood phylogeny (PhyML). Antimicrobial susceptibility was determined by disk diffusion. Isolates phenotypically resistant to carbapenems were screened for the carbapenemase genes blaNDM, blaOXA48, blaIMP, blaVIM and blaGES by PCR. Results ITS sequencing confirmed the 117 isolates as K. pneumoniae. MLST revealed 52 different STs grouped in 4 distinct clusters by PhyML. All isolates were MDR, and carbapenemase-producing K. pneumoniae (CP-KP) isolates accounted for 44/117 (37.6%), mostly harbouring blaNDM (28/44) and blaOXA-48 (7/44), with several isolates harbouring multiple genes. Conclusion MDR and CP-KP K. pneumoniae are widespread in Khartoum hospitals, with a diverse population of 52 STs clustering in 4 major lineages. There is an urgent need for systematic epidemiological studies of drug-resistant infections across all healthcare institutions in Sudan to inform local infection prevention and control strategies.
Article
Background Antimicrobial resistance (AMR) is a growing problem worldwide, with an estimated high burden in low- and middle-income countries (LMICs). In these settings, tackling the problem of AMR is often constrained by a lack of reliable surveillance data, due to limited use of microbiological diagnostics in clinical practice. Objectives The aim of this article is to present an overview of essential elements for setting up an AMR surveillance system in LMICs, to summarize the steps taken to develop such a system in the country of Georgia and to describe its impact on microbiology laboratories. Sources Literature review of published papers using PubMed and experiences of experts involved in setting up AMR surveillance in Georgia. Content Basic requirements for implementing a laboratory-based surveillance system in LMICs can be captured under four pillars: 1) governmental support, 2) laboratory capacity and quality management, 3) materials and supplies, and 4) sample collection, data management, analysis and reporting. In Georgia, the World Health Organisation Proof-of-Principle project helped to start the collection of AMR surveillance data on a small scale by promoting the use of microbiological diagnostics in clinics, and providing training and materials for laboratories. Thanks to governmental support and strong leadership from the national reference laboratory, the AMR surveillance network was sustained and expanded after the project ended. Implications This review describes the Georgian approach in building and expanding a functional AMR surveillance system, considering the elements identified from the literature. The introduction of quality management systems, standardization of guidelines and training paired with targeted capacity-building led to improved laboratory standards and management of patients with bloodstream infections. Reliable AMR surveillance data may inform and facilitate policy making on AMR control.
The Georgian experience can guide other countries in the process of building up their national AMR surveillance system.
Article
PRIDA is an Australian-based network of medical and scientific specialists, combining expertise in microbiology laboratory development, infection control, management of infectious diseases, and antimicrobial stewardship. PRIDA focuses on grassroots support for Pacific and Southeast Asian sites through the establishment of long-term mentoring relationships with front-line health care workers. With an emphasis on bench-level training for scientists and bedside development for clinicians, PRIDA has advanced testing capacity, infection control, and antimicrobial stewardship in the Solomon Islands, Timor-Leste, and PNG. Recognising the need to upskill health care workers in the Pacific, PRIDA has expanded into formal education, developing online microbiology diplomas for pathologists, physicians, and scientists. Concurrent design of multidisciplinary, video-conferenced microbiology rounds provides teaching opportunities in real time and improvement in daily patient care. From its origin in volunteerism, PRIDA has attracted funding through partnership with larger organisations and is currently involved in sponsored AMR projects in the Pacific.
Article
Background The Global Point Prevalence Survey of Antimicrobial Consumption and Resistance (Global-PPS) provides a methodology to support hospitals worldwide in collecting antimicrobial use data. We aim to evaluate the impact of the Global-PPS on local antimicrobial stewardship (AMS) programmes and assess health care professionals’ educational needs and barriers for implementing AMS. Methods A cross-sectional survey was disseminated within the Global-PPS network. The target audience consisted of hospital healthcare workers, involved in local surveillance of antimicrobial consumption and resistance. This included contacts from hospitals that already participated in the Global-PPS or were planning to do so. The survey contained 24 questions that addressed the hospital’s AMS activities, experiences conducting the PPS, as well as the learning needs and barriers for implementing AMS. Results A total of 248 hospitals from 74 countries participated in the survey, of which 192 had already conducted the PPS at least once. The survey response rate was estimated at 25%. In 96.9% of these 192 hospitals, Global-PPS participation had led to the identification of problems related to antimicrobial prescribing. In 69.3% at least one of the hospital’s AMS components was initiated as a result of Global-PPS findings. The level of AMS implementation varied across regions. Up to 43.1% of all hospitals had a formal antimicrobial stewardship strategy, ranging from 10.8% in Africa to 60.9% in Northern America. Learning needs of hospitals in high-income countries and in low-and middle-income countries were largely similar and included general topics (e.g. ‘optimising antibiotic treatment’), but also PPS-related topics (e.g. ‘translating PPS results into meaningful interventions’). The main barriers to implementing AMS programmes were a lack of time (52.7%), knowledge on good prescribing practices (42.0%), and dedicated funding (39.9%). 
Hospitals in LMIC more often reported unavailability of prescribing guidelines, insufficient laboratory capacity and suboptimal use of the available laboratory services. Conclusions Although we observed substantial variation in the level of AMS implementation across regions, the Global-PPS has been very useful in informing stewardship activities in many participating hospitals. More is still to be gained in guiding hospitals to integrate the PPS throughout AMS activities, building on existing structures and processes.
Article
Background Persistent fever, defined as fever lasting for 7 days or more at first medical evaluation, has hardly been investigated as a separate clinical entity in the tropics. This study aimed to explore the frequencies and diagnostic predictors of the ubiquitous priority (i.e., severe and treatable) infections causing persistent fever in the tropics. Methods In six different health settings across four countries in Africa and Asia (Sudan, Democratic Republic of Congo [DRC], Nepal, and Cambodia), consecutive patients aged 5 years or older with persistent fever were prospectively recruited from January 2013 to October 2014. Participants underwent a reference diagnostic workup targeting a pre-established list of 12 epidemiologically relevant priority infections (i.e., malaria, tuberculosis, HIV, enteric fever, leptospirosis, rickettsiosis, brucellosis, melioidosis, relapsing fever, visceral leishmaniasis, human African trypanosomiasis, amebic liver abscess). The likelihood ratios (LRs) of clinical and basic laboratory features were determined by pooling all cases of each identified ubiquitous infection (i.e., found in all countries). In addition, we assessed the diagnostic accuracy of five antibody-based rapid diagnostic tests (RDTs): Typhidot Rapid IgM, Test-it™ Typhoid IgM Lateral Flow Assay, and SD Bioline Salmonella typhi IgG/IgM for Salmonella Typhi infection, and Test-it™ Leptospira IgM Lateral Flow Assay and SD Bioline Leptospira IgG/IgM for leptospirosis. Results A total of 1922 patients (median age: 35 years; female: 51%) were enrolled (Sudan, n = 667; DRC, n = 300; Nepal, n = 577; Cambodia, n = 378). Ubiquitous priority infections were diagnosed in 452 (23.5%) participants and included malaria 8.0% (n = 154), tuberculosis 6.7% (n = 129), leptospirosis 4.0% (n = 77), rickettsiosis 2.3% (n = 44), enteric fever 1.8% (n = 34), and new HIV diagnosis 0.7% (n = 14). The other priority infections were limited to one or two countries.
The only features with a positive LR ≥ 3 were diarrhea for enteric fever and elevated alanine aminotransferase level for enteric fever and rickettsiosis. Sensitivities ranged from 29% to 67% for the three RDTs targeting S. Typhi and were 9% and 16% for the two RDTs targeting leptospirosis. Specificities ranged from 86% to 99% for the S. Typhi-detecting RDTs and were 96% and 97% for the leptospirosis RDTs. Conclusions Leptospirosis, rickettsiosis, and enteric fever each accounted for a substantial proportion of the persistent fever caseload across all tropical areas, in addition to malaria, tuberculosis, and HIV. However, very few discriminative features were identified, and RDTs for leptospirosis and Salmonella Typhi infection performed poorly. Improved field diagnostics are urgently needed for these challenging infections. Trial registration NCT01766830 at ClinicalTrials.gov.
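The likelihood ratios and RDT accuracy figures above all derive from a 2×2 table of index test result versus reference diagnosis; a generic sketch (the counts in the example are invented for illustration, not the study's data):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity and positive likelihood ratio from a 2x2 table
    (tp/fp/fn/tn = true/false positives/negatives against the reference)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    # LR+ = sensitivity / (1 - specificity); undefined when specificity is 1
    lr_positive = sensitivity / (1.0 - specificity)
    return sensitivity, specificity, lr_positive

# Hypothetical RDT evaluated against a reference diagnostic workup
sens, spec, lr = diagnostic_metrics(tp=30, fp=5, fn=70, tn=95)
```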
Article
Easy and robust antimicrobial susceptibility testing (AST) methods are essential in clinical bacteriology laboratories (CBL) in low-resource settings (LRS). We evaluated the Beckman Coulter MicroScan lyophilized broth microdilution panels designed to support Médecins Sans Frontières (MSF) CBL activity in difficult settings, in particular with the Mini-Lab. We evaluated the custom-designed MSF MicroScan Gram-pos microplate (MICPOS1) for Staphylococcus and Enterococcus species, MSF MicroScan Gram-neg microplate (MICNEG1) for Gram-negative bacilli, and MSF MicroScan Fastidious microplate (MICFAST1) for streptococci and Haemophilus species, using 387 isolates from routine CBLs in LRS against reference methods. Results showed that, for all selected antibiotics on the three panels, the proportion of category agreement was above 90% and the proportion of major and very major errors was below 3%, as per ISO standards. The Prompt inoculation system was found to increase the MIC and the major error rate for some antibiotics when testing staphylococci. The readability of the manufacturer's user manual was considered challenging for low-skilled staff. Inoculation and reading of the panels were judged to be easy. In conclusion, the three MSF MicroScan MIC panels performed well against clinical isolates from LRS and provide a convenient, robust, and standardized AST method for use in CBL in LRS.
Article
Whole-genome sequencing (WGS) is finding important applications in the surveillance of antimicrobial resistance (AMR), providing the most granular data and broadening the scope of niches and locations that can be surveilled. A common but often overlooked application of WGS is to replace or augment reference laboratory services for AMR surveillance. WGS has supplanted traditional strain subtyping in many comprehensive reference laboratories and is now the gold standard for rapidly ruling isolates into or out of suspected outbreak clusters. These and other properties give WGS the potential to serve AMR reference functions where no reference laboratory hitherto existed. In this perspective, we describe how we have employed a WGS approach, and an academic-public health system collaboration, to provide AMR reference laboratory services in Nigeria, as a model for leapfrogging to national AMR surveillance.
Article
In 1999, at a conference of the European Working Group on Nosocomial Infections, the term bloodstream infection (BSI) was proposed for cases with clinical symptoms and microorganisms detected in the bloodstream. The first classification of BSI comprised three categories: hospital-acquired, iatrogenic and community-acquired; BSIs were later classified into five categories. In parallel, BSIs occurring within the first 48 hours after a patient's admission to a medical organization were divided into four groups (A-D); group C comprised bacteremia associated with invasive procedures and was subdivided into five subgroups. The number of BSI episodes worldwide is growing, varying with the geographical location of the country (an increase of 40% from 1995 to 2002, and a further 14.3% by 2007). Reported sources of infection include the respiratory, hepatobiliary, gastrointestinal, urogenital and urinary tracts, intravascular devices, and pneumonia. BSI is characterized by a predominance of male patients, staphylococcal etiology, catheter association, and comorbid disease. Repeat episodes of Gram-negative BSI occur most often within 3 months. Until 2004, S. aureus was the leading BSI pathogen; after 2005, E. coli dominated, and the two pathogens have alternated in different years. Currently, BSI pathogens in medical (non-surgical) patients are Gram-positive cocci, including coagulase-negative staphylococci, S. aureus, enterococci, fungi and anaerobes. BSI is frequently polymicrobial (35.7% of episodes), including bacterial-fungal combinations (22%).
Article
Bloodstream infections (BSIs) and subsequent organ dysfunction (sepsis and septic shock) rank among the top causes of human mortality and have a great impact on healthcare systems. Their treatment mainly relies on the administration of broad-spectrum antimicrobials, since standard blood culture-based diagnostic methods remain time-consuming for pathogen identification. Consequently, the routine use of these antibiotics may lead to downstream antimicrobial resistance and failure in treatment outcomes. Recently, significant advances have been made in improving several methodologies for the identification of pathogens directly in whole blood, especially regarding specificity and time to detection. Nevertheless, for the widespread implementation of these novel methods in healthcare facilities, further improvements are still needed concerning sensitivity and cost-effectiveness to allow faster and more appropriate antimicrobial therapy. This review focuses on the problem of BSIs and sepsis, addressing several aspects such as their origin, challenges, and causative agents. It also highlights current and emerging diagnostic technologies, discussing their strengths and weaknesses.
Article
Pathogenic bacteria are one of the leading causes of foodborne outbreaks and need to be kept under stringent and constant surveillance. Current surveillance programs for microbial food safety are facing new challenges, such as the need for rapid and high-throughput subtyping, antimicrobial profiling, accurate source tracing, and delineation of transmission routes. Whole-genome sequencing (WGS), as a cutting-edge analytical tool, can reveal the comprehensive genomic information of a microorganism, enabling the precise identification and characterization of foodborne pathogens. The applications of WGS in food safety are renewing current surveillance programs. This review summarizes the latest applications of WGS in microbial food safety and discusses the potential of this technology to respond to future food safety challenges.
Article
In low-resource settings, detection of healthcare-acquired outbreaks in neonatal units relies on astute clinical staff observing unusual morbidity or mortality from sepsis, as microbiological diagnostics are often absent. We aimed to generate reliable (and automated) early warnings for potential clusters of neonatal late onset sepsis that could signal the start of an outbreak in a neonatal care unit (NCU) in Port-au-Prince, Haiti, using retrospective, routinely collected data on neonatal admissions. We constructed smoothed time series for late onset sepsis cases, late onset sepsis rates, NCU mortality, maternal admissions, neonatal admissions, and neonatal antibiotic consumption. An outbreak was defined as a statistical increase in any of these time series indicators. We created three outbreak alarm classes: 1) thresholds: weeks in which late onset sepsis cases exceeded four, the late onset sepsis rate exceeded 10% of total NCU admissions, or NCU mortality exceeded 15%; 2) differential: the late onset sepsis rate or NCU mortality was double that of the previous week; and 3) aberration: the improved Farrington model applied to late onset sepsis rates and NCU mortality. We validated pairs of alarms by calculating the sensitivity and specificity of the weeks in which each alarm was launched, comparing each alarm to the weeks in which a single Gram-negative bacillus (GNB)-positive blood culture was reported from a neonate. The threshold and aberration alarms were the strongest predictors of current and future NCU mortality and current late onset sepsis rates (p
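The first two alarm classes described above are simple enough to sketch directly. A minimal Python illustration follows, using the cut-offs quoted in the abstract; the dictionary keys and weekly figures are invented, and the Farrington aberration model (class 3, implemented for instance in R's surveillance package) is omitted for brevity.

```python
def threshold_alarms(week):
    """Alarm class 1: fixed thresholds on weekly indicators."""
    return (week["los_cases"] > 4             # late onset sepsis cases per week
            or week["los_rate"] > 0.10        # LOS cases / total NCU admissions
            or week["ncu_mortality"] > 0.15)  # NCU deaths / NCU admissions

def differential_alarms(week, previous_week):
    """Alarm class 2: an indicator at least double that of the previous week."""
    return (week["los_rate"] >= 2 * previous_week["los_rate"]
            or week["ncu_mortality"] >= 2 * previous_week["ncu_mortality"])

quiet = {"los_cases": 2, "los_rate": 0.04, "ncu_mortality": 0.08}
busy = {"los_cases": 6, "los_rate": 0.12, "ncu_mortality": 0.09}
print(threshold_alarms(busy))            # True: cases > 4 and LOS rate > 10%
print(differential_alarms(busy, quiet))  # True: LOS rate tripled week-on-week
```

In practice each alarm would be evaluated on the smoothed time series rather than raw weekly counts, as the abstract describes.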
Article
Objectives Azithromycin is an alternative for treating invasive non-typhoidal Salmonella (iNTS) infections. We determined its epidemiological cut-off (ECOFF) and compared azithromycin susceptibility testing methods for iNTS. Methods We used EUCAST ECOFFinder to determine the minimum inhibitory concentration (MIC; obtained by broth microdilution) ECOFF and corresponding disk zone diameters of 515 iNTS isolates from blood cultures in DR Congo, Burkina Faso, Rwanda, and Cambodia. Transferable resistance mechanisms were determined by polymerase chain reaction. We compared azithromycin susceptibility testing by semi-automated broth microdilution (customized Sensititre panel; reference), agar dilution, gradient tests (bioMérieux, Liofilchem, HiMedia; read at 80% (MIC80%) and 100% inhibition (MIC100%)), and disk diffusion (Rosco, Oxoid, BD, Liofilchem) for 161 wild-type and 198 non-wild-type iNTS isolates. Results The azithromycin MIC ECOFF was 16 mg/L, corresponding to a 12 mm zone diameter; mphA was detected in 192/197 non-wild-type and 0/47 wild-type iNTS isolates. Categorical agreement was excellent (≥98%) for all methods. Essential agreement was very good for agar dilution (>90%) but moderate for gradient tests (MIC80%: 52–71%; MIC100%: 72–91%). Repeatability was good for all methods and brands. Interreader agreement was high for broth microdilution and agar dilution (all ≤1 twofold dilution difference) and disk diffusion (>96% ≤3 mm difference), but lower for gradient tests (MIC80% and MIC100%: 83–94% ≤1 twofold dilution difference). Conclusions The azithromycin ECOFF of iNTS was 16 mg/L, i.e. equal to that of Salmonella Typhi. Disk diffusion is an accurate, precise, and user-friendly alternative to agar dilution and broth microdilution. Reading gradient tests at 100% instead of 80% inhibition improved accuracy and precision.
Article
Background Current antimicrobial resistance (AMR) surveillance is mainly laboratory-based. This approach has inherent biases, given the potential for selective specimen submission for microbiological analysis and the inability to map antibiotic susceptibility test results to a clinical syndrome. Objectives To discuss the need for population-based surveillance of AMR and highlight the pros and cons of threshold surveys. Sources Studies on methodology for AMR surveillance published in the last 10 years, obtained through a PubMed search on antimicrobial resistance (all fields) and surveillance/method (MeSH term). Content We discuss the use of threshold surveys to overcome the challenge of sample size in population-based AMR surveys, an approach suitable in both low- and high-resource settings. Implications Scale-up of population-based threshold surveys of AMR prevalence will provide the information needed to triangulate data from routinely reported laboratory-based AMR surveillance at the local, national, and global levels.
Article
“Right to health” is a universal right inclusive of a culture of safety. This review aims to highlight how clinical microbiology laboratories can contribute to patient safety. They can reduce medical errors through clinical collaboration and quality control. Timely and accurate inputs from the microbiology laboratory aid clinical correlation and safe patient care. Through internet searches using keywords such as “medical errors” and “quality assurance,” the global burden of medical errors has been compiled. References have been taken from guidelines and documents of standard national and international agencies, systematic reviews, observational studies, retrospective analyses, meta-analyses, health bulletins and reports, and personal views. Safety in healthcare should lay emphasis on the prevention, reporting, analysis, and correction of medical errors. If not recorded, medical errors are regarded as occasional or chance events. Global data show that adverse events occur in as many as 10% of hospitalized patients, and approximately two-thirds of these are reported from low- to middle-income countries (LMICs). This includes errors in laboratories as well. Clinical microbiology can improve patient safety when practiced properly, with the aim of detecting, controlling, and preventing infections at the earliest opportunity. It is a science that integrates a tripartite relationship between the patient, the clinician, and a microbiology specialist. Through collaborative healthcare, all stakeholders benefit by understanding common errors and mitigating them through quality management. However, errors tend to happen despite standardization and streamlining of all processes. The aim should be to minimize them, document them fairly, and learn from mistakes to avoid repetition. Local targets should be set and then extended to meet national and global benchmarks.
Article
Pediatric community-acquired bloodstream infections (CA-BSIs) in sub-Saharan African humanitarian contexts are rarely documented. Effective treatment of these infections is further complicated by increasing rates of antimicrobial resistance. We describe the findings from epidemiological and microbiological surveillance implemented in pediatric patients with suspected CA-BSIs presenting for care at a secondary hospital in the conflict-affected area of Zamfara state, Nigeria. Any child (>2 months of age) presenting to Anka General Hospital from November 2018 to August 2020 with clinical severe sepsis had clinical and epidemiological information and a blood culture collected at admission. Bacterial isolates were tested for antibiotic susceptibility. We calculated frequencies of epidemiological, microbiological, and clinical parameters. We explored risk factors for death among severe sepsis cases using univariable and multivariable Poisson regression, adjusting for the time between admission and hospital exit. We included 234 severe sepsis patients with 195 blood culture results. There were 39 positive blood cultures. Of the bacterial isolates, 14 were Gram-positive and 18 were Gram-negative; 5 were resistant to empiric antibiotics: methicillin-resistant Staphylococcus aureus (MRSA; n = 2) and extended-spectrum beta-lactamase-positive Enterobacterales (n = 3). We identified no significant association between sex, age group, ward, CA-BSI, appropriate intravenous antibiotic, malaria positivity at admission, suspected focus of sepsis, or clinical severity and death in the multivariable regression. There is an urgent need for access to good clinical microbiological services, including point-of-care methods, and for awareness of and practice in rational antibiotic use among healthcare staff in humanitarian settings, to reduce morbidity and mortality from sepsis in children.
Article
Objectives Implementation of standard laboratory practices towards accurate antimicrobial susceptibility testing (AST) is challenging in resource-constrained settings. Efforts to improve AST are required to address knowledge and practice gaps in such settings. In this study, we aimed to address these gaps through external quality assurance surveys and mentoring of laboratories in Pakistan. Methods This prospective study (May 2017–September 2019) included 10 consenting laboratories. External quality assessment (EQA) was conducted quarterly and performance was scored. Each EQA cycle was followed by an on-site technical visit during which AST methodology, quality procedures, and laboratory safety were evaluated using a questionnaire developed for this study. Cumulative scores of performance in the EQA and in the technical evaluation were designated the Composite Laboratory Performance Score (CLPS). During on-site visits, feedback was provided to each participating laboratory to address the gaps identified. Results Over the course of the study, our data show significant improvement in CLPS among the laboratories included. While improvement in CLPS varied between laboratories, a linear regression model showed improvement within the cohort from 21.37 (May 2017) to 91.5 (September 2019), a significant overall increase of 70.13 points (p = 0.001). Conclusion Interventions to improve AMR surveillance include quality-assured reporting of antimicrobial resistance. Our data show that, in resource-limited settings, EQA surveys and on-site evaluations followed by guidance contribute to such improvement. We propose that this model would be a useful tool for laboratory strengthening in such settings.
Article
Background There is a need for simple microbiology diagnostics to enable antimicrobial resistance surveillance in low- and middle-income countries. Objectives To investigate the field utility of InTray COLOREX plates for urine culture and ESBL detection. Methods Clinical urine samples from Mahosot Hospital, Vientiane, Lao PDR were inoculated onto chromogenic media and InTray COLOREX Screen plates between June and August 2020. Urine samples and isolates from other clinical specimens were inoculated onto COLOREX ESBL plates. A simulated field study investigating the field utility of the InTray COLOREX plates was also completed. Results In total, 355 urine samples were inoculated onto standard chromogenic agar and InTray COLOREX Screen plates, and 154 urine samples and 54 isolates from other clinical specimens onto the COLOREX ESBL plates. Growth was similar for the two methods (COLOREX Screen 41%, standard method 38%), with 20% discordant results, mainly due to differences in colony counts or colonial appearance. Contamination occurred in 13% of samples, with the COLOREX Screen plates showing higher contamination rates, potentially due to condensation. ESBL producers were confirmed for 80% of isolates from the COLOREX ESBL plates, and direct plating provided rapid detection of presumptive ESBL producers. Burkholderia pseudomallei also grew well on the ESBL plates, a relevant finding in this melioidosis-endemic area. Conclusions The InTray COLOREX Screen and ESBL plates were simple to use and interpret, permitting rapid detection of uropathogens and ESBL producers, and have the potential for easy transport and storage from field sites and use in laboratories with low capacity.
Article
The transmission dynamics of Streptococcus pneumoniae in sub-Saharan Africa are poorly understood due to a lack of adequate epidemiological and genomic data. Here we leverage a longitudinal cohort from 21 neighbouring villages in rural Africa to study how closely related strains of S. pneumoniae are shared among infants. We analysed 1074 pneumococcal genomes isolated from 102 infants from 21 villages. Strains were designated by unique serotype and sequence-type combinations, and we arbitrarily defined strain sharing as occurring where the pairwise genetic distance between strains could be accounted for by the mean within-host intra-strain diversity. We used non-parametric statistical tests to assess the role of spatial distance and prolonged carriage on strain sharing, using a logistic regression model. We recorded 458 carriage episodes, including 318 (69.4%) in which the carried strain was shared with at least one other infant. The odds of strain sharing varied significantly across villages (χ²=47.5, df=21, P-value <0.001). Infants in close proximity to each other were more likely to be involved in strain sharing, but we also show a considerable amount of strain sharing across longer distances. Close geographic proximity (<5 km) between shared strains was associated with a significantly lower pairwise SNP distance compared with strains shared over longer distances (P-value <0.005). Sustained carriage of a shared strain among the infants was significantly more likely if they resided in villages within a 5 km radius of each other (P-value <0.005, OR 3.7). Conversely, where both infants were transiently colonized by the shared strain, they were more likely to reside in villages separated by over 15 km (P-value <0.05, OR 1.5). PCV7 serotypes were rare (13.5%) and were significantly less likely to be shared (P-value <0.001, OR −1.07). Strain sharing was more likely to occur over short geographical distances, especially where accompanied by sustained colonization.
Our results show that strain sharing is a useful proxy for studying transmission dynamics in an under-sampled population with limited genomic data. This article contains data hosted by Microreact.
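The strain-sharing definition above reduces to a simple rule: two isolates with the same serotype and sequence-type designation are treated as shared when their pairwise SNP distance is within the mean within-host intra-strain diversity. A minimal sketch follows; the diversity threshold, serotype, and sequence-type values are invented for illustration.

```python
MEAN_WITHIN_HOST_DIVERSITY = 7  # mean intra-strain SNP distance within one host (assumed)

def same_strain_designation(a, b):
    """Strains are designated by unique serotype + sequence-type (ST) combinations."""
    return (a["serotype"], a["st"]) == (b["serotype"], b["st"])

def shared(a, b, pairwise_snp_distance):
    """Shared if same designation and distance is within within-host diversity."""
    return (same_strain_designation(a, b)
            and pairwise_snp_distance <= MEAN_WITHIN_HOST_DIVERSITY)

infant_a = {"serotype": "19F", "st": 236}
infant_b = {"serotype": "19F", "st": 236}
print(shared(infant_a, infant_b, 5))   # True: identical designation, 5 SNPs apart
print(shared(infant_a, infant_b, 30))  # False: too divergent to be one strain
```

Applied to every pair of carriage episodes, a rule like this yields the binary sharing outcome that the study's logistic regression models against village distance.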
Article
Background: Although global surveillance of antimicrobial resistance (AMR) is considered key to the containment of AMR, data from low- and middle-income countries, especially from sub-Saharan Africa, are scarce. This study describes the epidemiology of bloodstream infections and antimicrobial resistance rates in a secondary care hospital in Benin. Methods: Blood cultures were sampled, according to predefined indications, in BacT/ALERT FA Plus and PF Plus (bioMérieux, Marcy-l'Étoile, France) blood culture bottles (BCB) in a district hospital (Boko hospital) and, to a lesser extent, in the University Hospital of Parakou. These BCB were incubated for 7 days in a standard incubator and inspected twice daily for visual signs of growth. Isolates retrieved from the BCB were processed locally and later shipped to Belgium for reference identification [matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS)] and antibiotic susceptibility testing (disk diffusion and E-tests). Results: From October 2017 to February 2020, 3353 BCB were sampled, corresponding to 3140 blood cultures (212 cultures consisting of >1 BCB) and 3082 suspected bloodstream infection (BSI) episodes. Most of these cultures (n = 2471; 78.7%) were sampled in children <15 years of age. Pathogens were recovered from 383 (12.4%) cultures, corresponding to 381 confirmed BSI. 340 of these pathogens were available and confirmed by reference identification. The most common pathogens were Klebsiella pneumoniae (n = 53; 15.6%), Salmonella Typhi (n = 52; 15.3%), and Staphylococcus aureus (n = 46; 13.5%). AMR rates were high among Enterobacterales, with resistance to third-generation cephalosporins in 77.6% of K. pneumoniae isolates (n = 58), 12.8% of Escherichia coli isolates (n = 49), and 70.5% of Enterobacter cloacae isolates (n = 44). Carbapenemase production was detected in 2 E. coli and 2 E. cloacae isolates, all of the New Delhi metallo-beta-lactamase type.
Methicillin resistance was present in 22.4% of S. aureus isolates (n = 49). Conclusion: Blood cultures were successfully implemented in a district hospital in Benin, especially among the pediatric patient population. Unexpectedly high rates of AMR among Gram-negative bacteria against commonly used antibiotics were found, demonstrating the clinical and scientific importance of clinical bacteriology laboratories at this level of care.
Article
Background Antimicrobial resistance (AMR) poses a major threat to human health around the world. Previous publications have estimated the effect of AMR on incidence, deaths, hospital length of stay, and health-care costs for specific pathogen–drug combinations in select locations. To our knowledge, this study presents the most comprehensive estimates of AMR burden to date. Methods We estimated deaths and disability-adjusted life-years (DALYs) attributable to and associated with bacterial AMR for 23 pathogens and 88 pathogen–drug combinations in 204 countries and territories in 2019. We obtained data from systematic literature reviews, hospital systems, surveillance systems, and other sources, covering 471 million individual records or isolates and 7585 study-location-years. We used predictive statistical modelling to produce estimates of AMR burden for all locations, including for locations with no data. Our approach can be divided into five broad components: number of deaths where infection played a role, proportion of infectious deaths attributable to a given infectious syndrome, proportion of infectious syndrome deaths attributable to a given pathogen, the percentage of a given pathogen resistant to an antibiotic of interest, and the excess risk of death or duration of an infection associated with this resistance. Using these components, we estimated disease burden based on two counterfactuals: deaths attributable to AMR (based on an alternative scenario in which all drug-resistant infections were replaced by drug-susceptible infections), and deaths associated with AMR (based on an alternative scenario in which all drug-resistant infections were replaced by no infection). We generated 95% uncertainty intervals (UIs) for final estimates as the 25th and 975th ordered values across 1000 posterior draws, and models were cross-validated for out-of-sample predictive validity. We present final estimates aggregated to the global and regional level. 
Findings On the basis of our predictive statistical models, there were an estimated 4·95 million (3·62–6·57) deaths associated with bacterial AMR in 2019, including 1·27 million (95% UI 0·911–1·71) deaths attributable to bacterial AMR. At the regional level, we estimated the all-age death rate attributable to resistance to be highest in western sub-Saharan Africa, at 27·3 deaths per 100 000 (20·9–35·3), and lowest in Australasia, at 6·5 deaths (4·3–9·4) per 100 000. Lower respiratory infections accounted for more than 1·5 million deaths associated with resistance in 2019, making it the most burdensome infectious syndrome. The six leading pathogens for deaths associated with resistance (Escherichia coli, followed by Staphylococcus aureus, Klebsiella pneumoniae, Streptococcus pneumoniae, Acinetobacter baumannii, and Pseudomonas aeruginosa) were responsible for 929 000 (660 000–1 270 000) deaths attributable to AMR and 3·57 million (2·62–4·78) deaths associated with AMR in 2019. One pathogen–drug combination, meticillin-resistant S aureus, caused more than 100 000 deaths attributable to AMR in 2019, while six more each caused 50 000–100 000 deaths: multidrug-resistant excluding extensively drug-resistant tuberculosis, third-generation cephalosporin-resistant E coli, carbapenem-resistant A baumannii, fluoroquinolone-resistant E coli, carbapenem-resistant K pneumoniae, and third-generation cephalosporin-resistant K pneumoniae. Interpretation To our knowledge, this study provides the first comprehensive assessment of the global burden of AMR, as well as an evaluation of the availability of data. AMR is a leading cause of death around the world, with the highest burdens in low-resource settings. 
Understanding the burden of AMR and the leading pathogen–drug combinations contributing to it is crucial to making informed and location-specific policy decisions, particularly about infection prevention and control programmes, access to essential antibiotics, and research and development of new vaccines and antibiotics. There are serious data gaps in many low-income settings, emphasising the need to expand microbiology laboratory capacity and data collection systems to improve our understanding of this important human health threat. Funding Bill & Melinda Gates Foundation, Wellcome Trust, and Department of Health and Social Care using UK aid funding managed by the Fleming Fund.
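The five-component estimation chain and the two counterfactuals described in the Methods above can be illustrated with toy arithmetic. All numbers below are invented for illustration and are not the study's estimates.

```python
# Invented inputs for one pathogen-drug combination in one location
deaths_with_infection = 10_000_000  # deaths in which infection played a role
p_syndrome = 0.30    # fraction attributable to the infectious syndrome
p_pathogen = 0.20    # fraction of syndrome deaths caused by the pathogen
p_resistant = 0.25   # prevalence of resistance to the drug of interest
excess_risk = 0.40   # excess death risk associated with that resistance

deaths_from_resistant_infections = (deaths_with_infection
                                    * p_syndrome * p_pathogen * p_resistant)

# Counterfactual 1 - "associated with AMR": resistant infections are replaced
# by no infection, so every death from a resistant infection is counted.
associated = deaths_from_resistant_infections

# Counterfactual 2 - "attributable to AMR": resistant infections are replaced
# by drug-susceptible ones, so only the excess mortality is counted.
attributable = deaths_from_resistant_infections * excess_risk

print(round(associated), round(attributable))  # 150000 60000
```

This is why the associated-with-AMR figure (4.95 million in 2019) is necessarily larger than the attributable-to-AMR figure (1.27 million): the latter counts only the mortality in excess of what susceptible infections would have caused.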
Article
Background Manual blood culture bottles (BCBs) are frequently used in low-resource settings. There are few BCB performance evaluations, especially evaluations comparing them with automated systems. We evaluated two manual BCBs (Bi-State BCB and BacT/ALERT BCB) and compared their yield and time to growth detection with those of the automated BacT/ALERT system. Methods BCBs were spiked in triplicate with 177 clinical isolates representing pathogens common in low-resource settings (19 bacterial and one yeast species), in adult and paediatric volumes, resulting in 1056 spiked BCBs per BCB system. Growth in manual BCBs was evaluated daily by visually inspecting the broth, the agar slant, and, for the BacT/ALERT BCB, colour change of the growth indicator. The primary outcomes were BCB yield (proportion of spiked BCBs showing growth) and time to detection (proportion of positive BCBs with growth detected on day 1 of incubation). 95% CIs for yield and day 1 growth were calculated using the bootstrap method for clustered data. Secondary outcomes were time to colony for all BCBs (defined as the number of days between incubation and colony growth sufficient to use for further testing) and, for the Bi-State BCBs, the difference between time to detection in broth and on the agar slant. Findings Overall yield was 95·9% (95% CI 93·9–98·0) for Bi-State BCB and 95·5% (93·3–97·8) for manual BacT/ALERT, versus 96·1% (94·0–98·1) for the automated BacT/ALERT system (p=0·61). Day 1 growth was present in 920 (90·8%) of 1013 positive Bi-State BCBs and 757 (75·0%) of 1009 positive manual BacT/ALERT BCBs, versus 1008 (99·3%) of 1015 automated bottles. On day 2, detection rates were 100% for Bi-State BCB, 97·7% for manual BacT/ALERT BCB, and 100% for automated bottles. For Bi-State BCB, growth mostly occurred simultaneously in broth and slant (81·7%). Sufficient colony growth on the slant to perform further tests was present in only 44·1% of biphasic bottles on day 2 and 59·0% on day 3.
Interpretation The yield of manual BCBs was comparable with that of the automated system, suggesting that manual blood culture systems are an acceptable alternative to automated systems in low-resource settings. Bi-State BCB outperformed manual BacT/ALERT bottles, but the agar slant allowed neither earlier detection nor earlier colony growth. Time to detection for manual blood culture systems still lags behind that of automated systems, and research into innovative and affordable methods of growth detection in manual BCBs is encouraged. Funding Médecins Sans Frontières and Department of Economy, Science and Innovation of the Flemish Government.
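The clustered bootstrap mentioned in the Methods resamples whole isolates (clusters of replicate bottles) rather than individual bottles, because the triplicate bottles spiked with the same isolate are not independent observations. A minimal simulated sketch follows; the data are invented, not the study's.

```python
import random

random.seed(1)
# 177 isolates, each spiked in triplicate: 1 = growth detected, 0 = no growth
clusters = [[1, 1, 1] if random.random() < 0.95 else [1, 1, 0]
            for _ in range(177)]

def yield_pct(cluster_list):
    """Yield = percentage of all bottles showing growth."""
    bottles = [b for cluster in cluster_list for b in cluster]
    return 100.0 * sum(bottles) / len(bottles)

# Resample whole isolates (clusters), not individual bottles
boot = sorted(
    yield_pct([random.choice(clusters) for _ in clusters])
    for _ in range(1000)
)
lo, hi = boot[24], boot[974]  # 25th and 975th ordered values -> 95% CI
print(f"yield {yield_pct(clusters):.1f}% (95% CI {lo:.1f}-{hi:.1f})")
```

Resampling at the cluster level keeps the within-isolate correlation intact, which is why the resulting interval is wider (and more honest) than a naive per-bottle bootstrap would be.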
Article
Correct processing of blood cultures may impact individual patient management, antibiotic stewardship, and scaling up of antimicrobial resistance surveillance. To assess the quality of blood culture processing, we conducted four assessments at 16 public hospitals across different regions of Peru. We assessed the following standardized quality indicators: 1) positivity and contamination rates; 2) compliance with the recommended number of bottles/sets and volume of blood sampled; 3) blood culture utilization; and 4) possible barriers to compliance with recommendations. Suboptimal performance was found, with a median contamination rate of 4.2% (range 0–15.1%), with only one-third of the participating hospitals meeting the target of <3%, and a median positivity rate of 4.9% (range 1–8.1%), with only 6 of the 15 surveyed hospitals meeting the target of 6–12%. None of the assessed hospitals met both targets. The median frequency of solitary blood cultures was 71.9%, and only 8.9% (N = 59) of the surveyed adult bottles met the target blood volume of 8–12 mL, whereas 90.5% (N = 602) were underfilled. A high frequency of missed opportunities for ordering blood cultures was found (30.1%, 95/316) among patients with clinical indications for blood culture sampling. This multicenter study demonstrates important shortcomings in the quality of blood culture processing in public hospitals of Peru. It provides a national benchmark of blood culture utilization and quality indicators that can be used to monitor future quality improvement studies and diagnostic stewardship policies.
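The indicator calculations behind such an assessment are straightforward. A small sketch follows, with invented sample counts for a single hypothetical hospital and the target ranges quoted in the abstract (<3% contamination, 6–12% positivity).

```python
def percent(part, whole):
    return 100.0 * part / whole

# Invented counts for one hospital's surveillance period
cultures = {"total": 500, "positive": 25, "contaminated": 21, "solitary": 360}

positivity = percent(cultures["positive"], cultures["total"])         # 5.0
contamination = percent(cultures["contaminated"], cultures["total"])  # 4.2
solitary_rate = percent(cultures["solitary"], cultures["total"])      # 72.0

print(f"positivity {positivity:.1f}% "
      f"({'met' if 6 <= positivity <= 12 else 'not met'}, target 6-12%)")
print(f"contamination {contamination:.1f}% "
      f"({'met' if contamination < 3 else 'not met'}, target <3%)")
print(f"solitary blood cultures {solitary_rate:.1f}%")
```

Note that a low positivity rate is itself a warning sign: it can indicate that blood cultures are being ordered for the wrong patients or with insufficient blood volume, not that infections are absent.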
Article
Antimicrobials are essential in reducing morbidity and mortality from infectious diseases globally. However, owing to the lack of effective surveillance measures and widespread overuse, there is an increasing threat to the effectiveness of antimicrobials. Although antimicrobial resistance is increasing globally, low- and middle-income countries bear a much higher burden. Antimicrobial stewardship efforts, such as effective surveillance and reduction of overuse, can help combat the rise in antimicrobial resistance.
Article
As some patients infected with the novel coronavirus progress to critical illness, a subset will eventually develop shock. High-quality data on the management of these patients are scarce, and further investigation will provide valuable information in the context of the pandemic. A group of experts identified a set of pragmatic recommendations for the care of patients with SARS-CoV-2 and shock in resource-limited environments. We define shock as life-threatening circulatory failure that results in inadequate tissue perfusion and cellular dysoxia/hypoxia, and suggest that it can be operationalized via clinical observations. We suggest a thorough evaluation for other potential causes of shock and suggest against indiscriminate testing for coinfections. We suggest the use of the quick Sequential Organ Failure Assessment (qSOFA) as a simple bedside prognostic score for COVID-19 patients and point-of-care ultrasound (POCUS) to evaluate the etiology of shock. Regarding fluid therapy for the treatment of COVID-19 patients with shock in low- and middle-income countries, we favor balanced crystalloids and recommend a conservative fluid strategy for resuscitation. Where available and not prohibited by cost, we recommend using norepinephrine, given its safety profile. We favor avoiding the routine use of central venous or arterial catheters, where availability and costs are strong considerations. We also recommend using low-dose corticosteroids in patients with refractory shock. In addressing targets of resuscitation, we recommend the use of simple bedside parameters such as capillary refill time and suggest that POCUS be used to assess the need for further fluid resuscitation, if available.
Article
Liver abscesses containing hypervirulent Klebsiella pneumoniae have emerged during the past 2 decades, originally in Southeast Asia and then worldwide. We hypothesized that hypervirulent K. pneumoniae might also be emerging in France. In a retrospective, monocentric, cohort study, we analyzed characteristics and outcomes for 199 consecutive patients in Paris, France, with liver abscesses during 2010-2015. We focused on 31 patients with abscesses containing K. pneumoniae. This bacterium was present in most (14/27, 52%) cryptogenic liver abscesses. Cryptogenic K. pneumoniae abscesses were more frequently community-acquired (p<0.00001) and monomicrobial (p = 0.008), less likely to involve cancer patients (p<0.01), and relapsed less often (p<0.01) than did noncryptogenic K. pneumoniae liver abscesses. K. pneumoniae isolates from cryptogenic abscesses belonged to either the K1 or K2 serotypes and had more virulence factors than noncryptogenic K. pneumoniae isolates. Hypervirulent K. pneumoniae are emerging as the main pathogen isolated from cryptogenic liver abscesses in the study area.
Article
Background: The spread of antibiotic-resistant bacteria poses a substantial threat to morbidity and mortality worldwide. Due to its large public health and societal implications, multidrug-resistant tuberculosis has long been regarded by WHO as a global priority for investment in new drugs. In 2016, WHO was requested by member states to create a priority list of other antibiotic-resistant bacteria to support research and development of effective drugs. Methods: We used a multicriteria decision analysis method to prioritise antibiotic-resistant bacteria; this method involved the identification of relevant criteria to assess priority, against which each antibiotic-resistant bacterium was rated. The final priority ranking of the antibiotic-resistant bacteria was established after a preference-based survey was used to obtain expert weighting of criteria. Findings: We selected 20 bacterial species with 25 patterns of acquired resistance and ten criteria to assess priority: mortality, health-care burden, community burden, prevalence of resistance, 10-year trend of resistance, transmissibility, preventability in the community setting, preventability in the health-care setting, treatability, and pipeline. We stratified the priority list into three tiers (critical, high, and medium priority), using the 33rd percentile of the bacterium's total scores as the cutoff. Critical-priority bacteria included carbapenem-resistant Acinetobacter baumannii and Pseudomonas aeruginosa, and carbapenem-resistant and third-generation cephalosporin-resistant Enterobacteriaceae. The highest ranked Gram-positive bacteria (high priority) were vancomycin-resistant Enterococcus faecium and meticillin-resistant Staphylococcus aureus. Of the bacteria typically responsible for community-acquired infections, clarithromycin-resistant Helicobacter pylori and fluoroquinolone-resistant Campylobacter spp, Neisseria gonorrhoeae, and Salmonella Typhi were included in the high-priority tier.
Interpretation: Future development strategies should focus on antibiotics that are active against multidrug-resistant tuberculosis and Gram-negative bacteria. The global strategy should include antibiotic-resistant bacteria responsible for community-acquired infections such as Salmonella spp, Campylobacter spp, N gonorrhoeae, and H pylori. Funding: World Health Organization.
Article
OBJECTIVE: To describe findings from an external quality assessment programme involving laboratories in Africa that routinely investigate epidemic-prone diseases. METHODS: Beginning in 2002, the Regional Office for Africa of the World Health Organization (WHO) invited national public health laboratories and related facilities in Africa to participate in the programme. Three surveys comprising specimens and questionnaires associated with bacterial enteric diseases, bacterial meningitis, plague, tuberculosis and malaria were sent annually to test participants' diagnostic proficiency. Identical surveys were sent to referee laboratories for quality control. Materials were prepared, packaged and shipped in accordance with standard protocols. Findings and reports were due within 30 days. Key methodological decisions and test results were categorized as acceptable or unacceptable on the basis of consensus feedback from referees, using established grading schemes. FINDINGS: Between 2002 and 2009, participation increased from 30 to 48 Member States of the WHO and from 39 to 78 laboratories. Each survey was returned by 64-93% of participants. Mean turnaround time was 25.9 days. For bacterial enteric diseases and meningitis components, bacterial identification was acceptable in 65% and 69% of challenges, respectively, but serotyping and antibiotic susceptibility testing and reporting were frequently unacceptable. Microscopy was acceptable for 73% of plague challenges. Tuberculosis microscopy was satisfactorily performed, with 87% of responses receiving acceptable scores. In the malaria component, 82% of responses received acceptable scores for species identification but only 51% of parasite quantitation scores were acceptable. CONCLUSION: The external quality assessment programme consistently identified certain functional deficiencies requiring strengthening that were present in African public health microbiology laboratories.
Article
Full-text available
Background: The declining trend of malaria and the recent prioritization of containment of antimicrobial resistance have created a momentum to implement clinical bacteriology in low-resource settings (LRS). Successful implementation relies on guidance by a quality management system (QMS). Over the past decade, international initiatives were launched towards implementation of QMS in HIV/AIDS, tuberculosis and malaria. Aims: To describe the progress towards accreditation of medical laboratories and to identify the challenges and "best practices" for implementation of QMS in clinical bacteriology in LRS. Sources: Published literature, online reports and websites related to the implementation of laboratory QMS, accreditation of medical laboratories and initiatives for containment of antimicrobial resistance. Content: Apart from the limitations of infrastructure, equipment, consumables and staff, QMS are challenged by the complexity of clinical bacteriology and the healthcare context in LRS (small-scale laboratories, attitudes and perceptions of staff, absence of laboratory information systems). Likewise, most international initiatives addressing laboratory health strengthening have focused on public health and outbreak management rather than on hospital-based patient care. "Best practices" to implement quality-assured clinical bacteriology in LRS include alignment with national regulations and public health reference laboratories, participating in external quality assurance programmes, support from the hospital's management, starting with attainable projects, conducting error review and daily bench-side supervision, looking for locally adapted solutions, stimulating ownership, and extending existing training programmes to clinical bacteriology. Implications: The implementation of QMS in clinical bacteriology in hospital settings will ultimately extend a culture of quality to all sectors of healthcare in LRS.
Article
Full-text available
For a clinical study in the European research network on better diagnosis for neglected infectious diseases (NIDIAG) project (Better Diagnosis of Neglected Infectious Diseases: www.nidiag.org), we developed Standard Operating Procedures (SOPs), which we implemented in a basically equipped laboratory in a 380-bed rural hospital (“Hôpital Général de Référence Mosango”) in the Kwilu province in the Democratic Republic of the Congo (DRC). The study aimed to improve the early diagnosis of severe and treatable infections among patients with neurological disorders and took place over a 20-month period (14/09/2012–24/05/2014) (ClinicalTrials.gov Identifier: NCT01589289). The set of 50 SOPs (S1 Appendix), all in French, includes procedures related to the inclusion and clinical management of patients with neurological disorders (n = 4), diagnostic testing (n = 33), data collection and management (n = 5), and quality assurance (n = 8).
Article
Full-text available
Accurate diagnosis of infectious diseases is essential for appropriate targeting of treatment and disease control. Rapid diagnostic tests (RDTs) are quick and easy to perform, they give results during one clinic visit, and they can be used in settings with little infrastructure or trained personnel. RDTs are promising tools to improve diagnosis in remote or low-resource settings. In the field of neglected infectious diseases, new manufacturers, RDTs, and users are coming onto the scene [1, 2]. As access to RDTs improves, the need for quality assurance and postmarket surveillance increases. The International Medical Device Regulators Forum has formulated guidelines about quality assurance of medical devices, including RDTs, which have been adopted as regulatory standards in Australia, Canada, the European Union, Japan, and the United States [3]. Specific quality standards for in vitro diagnostic tests (IVDs) (ISO 13485:2003) and medical laboratories (ISO 15189:2012) have been published by the International Organisation for Standardisation [4, 5]. In less-regulated settings, the World Health Organisation (WHO) has stepped in to promote IVD quality [3, 6, 7]. Participation from various stakeholders is required to assure RDT quality. Manufacturers must ensure that their products are ready for the market—i.e., that product design, development, testing, manufacturing, packaging, and labelling meet the required standards of safety and performance. The role of RDT users is, among other responsibilities, to know the indications, contraindications, and operating procedures of the devices. Most regulatory authorities recognise that efficient communication between manufacturers and users is key to postmarket surveillance [3–7]. In low-resource settings and in the field of neglected infectious diseases, both this communication between manufacturers and users and the pre- and postmarket oversight by national regulatory authorities may be suboptimal.
The Neglected Infectious Diseases dIAGnosis (NIDIAG) consortium aims to improve diagnostic approaches for different clinical syndromes in low-resource settings where neglected infectious diseases are prevalent. In this case study, we assessed several quality aspects of RDTs used in the NIDIAG study about persistent fever: we focused on RDT labelling and instructions for use (IFU) and on product-related incidents, including communication with manufacturers about these incidents.
Article
Full-text available
The role of national health laboratories in support of public health response has expanded beyond laboratory testing to include a number of other core functions such as emergency response, training and outreach, communications, laboratory-based surveillance and data management. These functions can only be accomplished by an efficient and resilient national laboratory network that includes public health, reference, clinical and other laboratories. It is a primary responsibility of the national health laboratory in the Ministry of Health to develop and maintain the national laboratory network in the country. In this article, we present practical recommendations, based on 17 years of network development experience, for the development of effective national laboratory networks. These recommendations, together with examples of current laboratory networks, are provided to facilitate laboratory network development in other states. The development of resilient, integrated laboratory networks will enhance each state’s public health system and is critical to the development of a robust national laboratory response network to meet global health security threats.
Article
Full-text available
External Quality Assessment (EQA) surveys performed by the World Health Organization Regional Office for Africa (WHO AFRO) revealed the need for the strengthening of public health microbiology laboratories, particularly for testing of epidemic-prone diseases in the African Region. These surveys revealed common issues such as supply chain management, skilled personnel, logistical support and an overall lack of quality standards. For sustainable improvements to health systems as well as global health security, the deficiencies identified need to be actively corrected through robust quality assurance programmes and implementation of laboratory quality management systems. Given all the pathogens of public health importance, an external quality assessment programme with a focus on vaccine-preventable diseases and emerging and re-emerging dangerous pathogens is important, and should not be stand-alone, but integrated within laboratory networks as seen in polio, measles, yellow fever and rubella. In 2015, WHO AFRO collaborated with the US Centers for Disease Control and Prevention, the London School of Hygiene & Tropical Medicine and partners in a series of consultations with countries and national and regional EQA providers for the development of quality assurance models to support HIV point-of-care testing and monitoring. These consultations revealed similar challenges as seen in the WHO AFRO surveys.
WHO AFRO brought forth its experience in implementing quality standards for health programmes, and also opened discussions on how lessons learned through such established programmes can be utilised to support and strengthen the introduction of early infant diagnosis of HIV and viral load point-of-care testing. An optimised external quality assessment programme will impact the ability of countries to meet core capacities, providing improved quality management systems, improving the confidence of diagnostic network services in Africa, and including capacities to detect events of international public health importance.
Article
Full-text available
In 2015, UNAIDS launched the 90-90-90 targets aimed at increasing the number of people infected with HIV to become aware of their status, access antiretroviral therapies and ultimately be virally suppressed. To achieve these goals, countries may need to scale up point-of-care (POC) testing in addition to strengthening central laboratory services. While decentralising testing increases patient access to diagnostics, it presents many challenges with regard to training and assuring the quality of tests and testing. To ensure synergies, the London School of Hygiene & Tropical Medicine held a series of consultations with countries with an interest in quality assurance and their implementing partners, and agreed on an external quality assessment (EQA) programme to ensure reliable results so that the results lead to the best possible care for HIV patients. As a result of the consultations, EQA International was established, bringing together EQA providers and implementers to develop a strategic plan for countries to establish national POC EQA programmes and to estimate the cost of setting up and maintaining the programme. With the dramatic increase in the number of proficiency testing panels required for thousands of POC testing sites across Africa, it is important to facilitate technology transfer from global EQA providers to a network of regional EQA centres in Africa for regional proficiency testing panel production. EQA International will continue to identify robust and cost-effective EQA technologies for quality POC testing, integrating novel technologies to support sustainable country-owned EQA programmes in Africa.
Article
Full-text available
Health system and HIV epidemiology in Mozambique: Medical care in Mozambique is mostly provided through the national health service of the Ministry of Health. All primary healthcare and HIV-related services are provided free of charge. There are over 1500 public sector health facilities in Mozambique and most of these are primary healthcare centres. Although all hospitals have a laboratory, only a quarter of the health centres have a formal laboratory. In this context, point-of-care (POC) testing and syndromic management of diseases play an important role in the health system. Both communicable and non-communicable diseases are prevalent in the Mozambican population. Mozambique has a population of 28 million and is among the nine countries with the highest HIV prevalence in the world. HIV prevalence in the country among people aged 15-49 years is 11.5%, ranging from 3.7% in the Niassa province in the north to 25.1% in the Gaza province in the south. HIV prevalence is higher among women (13.1%) than among men (9.2%), and higher in urban areas (15.9%) compared with rural areas (9.2%). Among children aged between 0 and 11 years, HIV prevalence is 1.4%, and 2.3% in those younger than one year. It is estimated that 102 new infections in children occur daily in Mozambique (Ministry of Health, unpublished data). Demographic impact studies show that an estimated 1.6 million Mozambicans were living with HIV in 2009.
Article
Full-text available
Objectives Using a clinical research laboratory as a case study, we sought to characterize barriers to maintaining Good Clinical Laboratory Practice (GCLP) services in a developing world setting. Methods Using a US Centers for Disease Control and Prevention framework for program evaluation in public health, we performed an evaluation of the Kilimanjaro Christian Medical Centre–Duke University Health Collaboration clinical research laboratory sections of the Kilimanjaro Clinical Research Institute in Moshi, Tanzania. Laboratory records from November 2012 through October 2014 were reviewed for this analysis. Results During the 2-year period of study, seven instrument malfunctions suspended testing required for open clinical trials. A median (range) of 9 (1-55) days elapsed between instrument malfunction and biomedical engineer service. Sixteen (76.1%) of 21 suppliers of reagents, controls, and consumables were based outside Tanzania. Test throughput among laboratory sections used a median (range) of 0.6% (0.2%-2.7%) of instrument capacity. Five (55.6%) of nine laboratory technologists left their posts over 2 years. Conclusions These findings demonstrate that GCLP laboratory service provision in this setting is hampered by delays in biomedical engineer support, delays and extra costs in commodity procurement, low testing throughput, and high personnel turnover.
Article
Full-text available
Background: Streptococcus pneumoniae is the leading cause of meningitis and sepsis. The aim of the study was to analyze the distribution, vaccine serotype coverage, and antibiotic resistance of S. pneumoniae serotypes isolated from patients with invasive diseases, after the introduction of the pneumococcal 7-valent conjugated vaccine (PCV-7). Methods: A total of 134 isolates were collected from blood and cerebrospinal fluid specimens at Hamad Hospital during the period from 2005 to 2009. Isolate serotyping was done using the Quellung reaction. The prevaccination period was considered before 2005. Results: The most common serotypes for all age groups were 3 (12.70%), 14 (11.90%), 1 (11.90%), 19A (9.00%), 9V (5.20%), 23F (5.20%), and 19F (4.50%). Coverage rates for infants <2 years for PCV-7, the 10-valent conjugated vaccine (PCV-10), and the 13-valent conjugated vaccine (PCV-13) were 34.78%, 52.17%, and 78.26%, respectively. Coverage rates of these vaccines were 50%, 67.86%, and 75% for the 2-5 years age group; 27.12%, 40.68%, and 64.41% for the 6-64 years age group; and 25%, 33.33%, and 66.67% for the ≥65 years age group, respectively. The percentages of isolates nonsusceptible to penicillin, cefotaxime, and erythromycin were 43.86%, 16.66%, and 22.81%, respectively. Thirty-seven isolates (32.46%) were multidrug resistant (MDR) and belonged to serotypes 14, 19A, 19F, 23F, 1, 9V, 12F, 4, 6B, 3, and 15A. Compared to previous results before the introduction of PCV-7, there was a significant reduction in penicillin-nonsusceptible S. pneumoniae from 66.67% to 43.86%, and a slight, insignificant reduction in erythromycin-nonsusceptible strains from 27.60% to 22.8%, while there was a significant increase in cefotaxime-nonsusceptible strains from 3.55% to 16.66%. 
Conclusion: Invasive pneumococcal strains and the emergence of MDR serotypes is a global burden that must be addressed through multiple strategies, including vaccination, antibiotic stewardship, and continuous surveillance.
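The vaccine coverage rates reported above are simple set computations: the fraction of isolates whose serotype belongs to a given vaccine formulation. A minimal sketch, using the published serotype compositions of PCV-7/10/13 but invented isolate counts (not the study's data):

```python
# Serotype compositions of the pneumococcal conjugate vaccines.
PCV7 = {"4", "6B", "9V", "14", "18C", "19F", "23F"}
PCV10 = PCV7 | {"1", "5", "7F"}
PCV13 = PCV10 | {"3", "6A", "19A"}

def coverage(isolate_serotypes, vaccine):
    """Percentage of isolates whose serotype is included in the vaccine."""
    covered = sum(1 for s in isolate_serotypes if s in vaccine)
    return 100.0 * covered / len(isolate_serotypes)

# Hypothetical isolates for illustration only.
isolates = ["14", "19A", "3", "6B", "1", "23F", "9V", "19F", "12F", "15A"]
for name, vac in [("PCV-7", PCV7), ("PCV-10", PCV10), ("PCV-13", PCV13)]:
    print(f"{name}: {coverage(isolates, vac):.1f}%")
```

With this illustrative sample the coverage rises from PCV-7 to PCV-13, mirroring the pattern in the study's age-stratified results.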
Article
Full-text available
Adequate laboratory infrastructure in sub-Saharan Africa is vital for tackling the burden of infectious diseases such as human immunodeficiency virus and acquired immune deficiency syndrome, malaria, and tuberculosis. Despite the need for laboratory testing in addressing the infectious disease burden, laboratories are ill-integrated into the diagnostic and care delivery process in low-resource settings. There is a diagnostic culture of circumventing laboratory testing and using other, less reliable and less valid signals to diagnose diseases such as malaria. Although much of the literature focuses on disease-specific challenges around laboratory testing, we sought to identify horizontal challenges to the laboratory testing process through interviews with clinicians involved in the diagnostic process. Based on 22 interviews with physicians, nurses, clinical officers, medical students, and laboratory technicians, technologists and supervisors, we identified 12 distinct challenges in the areas of staff, materials, workflow, and the blood bank. These challenges underscore the informational challenges that compound more visible resource shortages in the laboratory testing process, which lend themselves to horizontal strengthening efforts around the diagnostic process.
Article
Full-text available
Ebola emerged in West Africa around December 2013 and swept through Guinea, Sierra Leone and Liberia, giving rise to 27,748 confirmed, probable and suspected cases reported by 29 July 2015. Case diagnoses during the epidemic have relied on polymerase chain reaction-based tests. Owing to limited laboratory capacity and local transport infrastructure, the delays from sample collection to test results being available have often been 2 days or more. Point-of-care rapid diagnostic tests offer the potential to substantially reduce these delays. We review Ebola rapid diagnostic tests approved by the World Health Organization and those currently in development. Such rapid diagnostic tests could allow early triaging of patients, thereby reducing the potential for nosocomial transmission. In addition, despite the lower test accuracy, rapid diagnostic test-based diagnosis may be beneficial in some contexts because of the reduced time spent by uninfected individuals in health-care settings where they may be at increased risk of infection; this also frees up hospital beds. We use mathematical modelling to explore the potential benefits of diagnostic testing strategies involving rapid diagnostic tests alone and in combination with polymerase chain reaction testing. Our analysis indicates that the use of rapid diagnostic tests with sensitivity and specificity comparable with those currently under development always enhances control, whether evaluated at a health-care-unit or population level. If such tests had been available throughout the recent epidemic, we estimate, for Sierra Leone, that their use in combination with confirmatory polymerase chain reaction testing might have reduced the scale of the epidemic by over a third.
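The trade-offs such modelling explores hinge on how a test's sensitivity and specificity translate into predictive values at a given prevalence. A minimal Bayes-rule sketch; the performance figures and the 20% prevalence are illustrative assumptions, not parameters from the Ebola model itself:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values from test characteristics."""
    tp = sensitivity * prevalence              # true positive fraction
    fp = (1 - specificity) * (1 - prevalence)  # false positive fraction
    fn = (1 - sensitivity) * prevalence        # false negative fraction
    tn = specificity * (1 - prevalence)        # true negative fraction
    return tp / (tp + fp), tn / (tn + fn)

# Illustrative RDT performance among suspected cases at triage.
ppv, npv = predictive_values(sensitivity=0.92, specificity=0.95, prevalence=0.20)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```

A high NPV is what allows an RDT to move uninfected individuals out of high-risk holding areas quickly, even when positives still need PCR confirmation.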
Article
Full-text available
The lack of new antibiotic classes calls for a cautious use of existing agents. Yet, every 10 min, almost two tons of antibiotics are used around the world, all too often without any prescription or control. The use, overuse and misuse of antibiotics select for resistance in numerous species of bacteria which then renders antimicrobial treatment ineffective. Almost all countries face increased antimicrobial resistance (AMR), not only in humans but also in livestock and along the food chain. The spread of AMR is fueled by growing human and animal populations, uncontrolled contamination of fresh water supplies, and increases in international travel, migration and trade. In this context of global concern, 68 international experts attending the fifth edition of the World HAI Resistance Forum in June 2015 shared their successes and failures in the global fight against AMR. They underlined the need for a “One Health” approach requiring research, surveillance, and interventions across human, veterinary, agricultural and environmental sectors. This strategy involves concerted actions on several fronts. Improved education and increased public awareness are a well-understood priority. Surveillance systems monitoring infections need to be expanded to include antimicrobial use, as well as the emergence and spread of AMR within clinical and environmental samples. Adherence to practices to prevent and control the spread of infections is mandatory to reduce the requirement of antimicrobials in general care and agriculture. Antibiotics need to be banned as growth promoters for farm animals in countries where it has not yet been done. Antimicrobial stewardship programmes in animal husbandry have proved to be efficient for minimising AMR, without compromising productivity. Regarding the use of antibiotics in humans, new tools to provide highly specific diagnoses of pathogens can decrease diagnostic uncertainty and improve clinical management. 
Finally, infection prevention and control measures – some of them as simple as hand hygiene – are essential and should be extended beyond healthcare settings. Aside from regulatory actions, all people can assist in AMR reduction by limiting antibiotic use for minor illnesses. Together, we can all work to reduce the burden of AMR.
Article
Full-text available
Background. In Kenya, invasive nontyphoidal Salmonella (iNTS) disease causes severe bacteremic illness among adults with human immunodeficiency virus (HIV) and especially among children <5 years of age coinfected with HIV or malaria, or who are compromised by sickle cell disease or severe malnutrition. The incidence of iNTS disease in children ranges from 166 to 568 cases per 100 000 persons per year. Methods. We review the epidemiology of iNTS disease and genomics of strains causing invasive illness in Kenya. We analyzed a total of 192 NTS isolates (114 Typhimurium, 78 Enteritidis) from blood and stools from pediatric admissions in 2005–2013. Testing for antimicrobial susceptibility to commonly used drugs and whole-genome sequencing were performed to assess prevalence and genetic relatedness of multidrug-resistant iNTS strains, respectively. Results. A majority (88/114 [77%]) of Salmonella Typhimurium and 30% (24/79) of Salmonella Enteritidis isolates tested were found to be multidrug resistant, whereas a dominant Salmonella Typhimurium pathotype, ST313, was primarily associated with invasive disease and febrile illness. Analysis of the ST313 isolates has identified genome degradation, compared with the ST19 genotype that typically causes diarrhea in humans, especially in industrialized countries, adapting a more host-restricted lifestyle typical of Salmonella Typhi infections. Conclusions. From 2012, we have observed an emergence of ceftriaxone-resistant strains also showing reduced susceptibility to fluoroquinolones. As most cases present with nonspecific febrile illness with no laboratory-confirmed etiology, empiric treatment of iNTS disease is a major challenge in Kenya. Multidrug resistance, including to ceftriaxone, will pose further difficulty in management of iNTS disease in endemic areas.
Article
Full-text available
Background: The Integrated Infectious Disease Capacity-Building Evaluation (IDCAP) was designed to test the effects of two interventions, Integrated Management of Infectious Disease (IMID) training and on-site support (OSS), on clinical practice of mid-level practitioners. This article reports the effects of these interventions on clinical practice in management of common childhood illnesses. Methods: Two trainees from each of 36 health facilities participated in the IMID training. IMID was a three-week core course, two one-week boost courses, and distance learning over nine months. Eighteen of the 36 health facilities were then randomly assigned to arm A, and participated in OSS, while the other 18 health facilities assigned to arm B did not. Clinical faculty assessed trainee practice on clinical practice of six sets of tasks: patient history, physical examination, laboratory tests, diagnosis, treatment, and patient/caregiver education. The effects of IMID were measured by the post/pre adjusted relative risk (aRR) of appropriate practice in arm B. The incremental effects of OSS were measured by the adjusted ratio of relative risks (aRRR) in arm A compared to arm B. All hypotheses were tested at a 5 % level of significance. Results: Patient samples were comparable across arms at baseline and endline. The majority of children were aged under five years; 84 % at baseline and 97 % at endline. The effects of IMID on patient history (aRR = 1.12; 95 % CI = 1.04-1.21) and physical examination (aRR = 1.40; 95 % CI = 1.16-1.68) tasks were statistically significant. OSS was associated with incremental improvement in patient history (aRRR = 1.18; 95 % CI = 1.06-1.31), and physical examination (aRRR = 1.27; 95 % CI = 1.02-1.59) tasks. Improvements in laboratory testing, diagnosis, treatment, and patient/caregiver education were not statistically significant. 
Conclusion: IMID training was associated with improved patient history taking and physical examination, and OSS further improved these clinical practices. On-site training and continuous quality improvement activities support transfer of learning to practice among mid-level practitioners.
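The effect measures in this evaluation are risk ratios: the post/pre relative risk (aRR) of appropriate practice within an arm, and the ratio of those relative risks across arms (aRRR) for the incremental OSS effect. Setting aside the covariate adjustment, the unadjusted versions can be sketched as follows; all counts are invented for illustration:

```python
def risk(events, n):
    """Proportion of assessed encounters with appropriate practice."""
    return events / n

def relative_risk(post_events, post_n, pre_events, pre_n):
    """Post/pre relative risk within one study arm."""
    return risk(post_events, post_n) / risk(pre_events, pre_n)

# Hypothetical counts of appropriate patient-history taking.
rr_arm_a = relative_risk(84, 100, 60, 100)   # arm A: IMID training + OSS
rr_arm_b = relative_risk(70, 100, 60, 100)   # arm B: IMID training only
rrr = rr_arm_a / rr_arm_b                    # incremental effect of OSS
```

An aRRR above 1, as reported for patient history and physical examination, means OSS improved practice beyond what IMID training alone achieved.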
Article
Full-text available
Bacterial sepsis is a leading cause of mortality among febrile patients in low- and middle-income countries, but blood culture services are not widely available. Consequently, empiric antimicrobial management of suspected bloodstream infection is based on generic guidelines that are rarely informed by local data on etiology and patterns of antimicrobial resistance. To evaluate the cost-effectiveness of surveillance for bloodstream infections to inform empiric management of suspected sepsis in low-resource areas, we compared costs and outcomes of generic antimicrobial management with management informed by local data on etiology and patterns of antimicrobial resistance. We applied a decision tree model to a hypothetical population of febrile patients presenting at the district hospital level in Africa. We found that the evidence-based regimen saved 534 more lives per 100,000 patients at an additional cost of $25.35 per patient, resulting in an incremental cost-effectiveness ratio of $4,739. Although this number compares favorably to standard cost-effectiveness thresholds, it should ultimately be compared with other policy-relevant alternatives to determine whether routine surveillance for bloodstream infections is a cost-effective strategy in the African context.
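The incremental cost-effectiveness ratio (ICER) quoted here is simply the extra cost divided by the extra effect. Re-deriving it from the abstract's own figures gives roughly $4,747 per life saved; the small difference from the reported $4,739 is presumably rounding in the source:

```python
def icer(extra_cost_per_patient, extra_lives_per_100k_patients):
    """Incremental cost-effectiveness ratio: cost per additional life saved."""
    total_extra_cost = extra_cost_per_patient * 100_000
    return total_extra_cost / extra_lives_per_100k_patients

print(round(icer(25.35, 534)))
```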
Article
Full-text available
Salmonella enterica infections are common causes of bloodstream infection in low-resource areas, where they may be difficult to distinguish from other febrile illnesses and may be associated with a high case fatality ratio. Microbiologic culture of blood or bone marrow remains the mainstay of laboratory diagnosis. Antimicrobial resistance has emerged in Salmonella enterica, initially to the traditional first-line drugs chloramphenicol, ampicillin, and trimethoprim-sulfamethoxazole. Decreased fluoroquinolone susceptibility and then fluoroquinolone resistance have developed in association with chromosomal mutations in the quinolone resistance-determining region of genes encoding DNA gyrase and topoisomerase IV and also by plasmid-mediated resistance mechanisms. Resistance to extended-spectrum cephalosporins has occurred more often in nontyphoidal than in typhoidal Salmonella strains. Azithromycin is effective for the management of uncomplicated typhoid fever and may serve as an alternative oral drug in areas where fluoroquinolone resistance is common. In 2013, CLSI lowered the ciprofloxacin susceptibility breakpoints to account for accumulating clinical, microbiologic, and pharmacokinetic-pharmacodynamic data suggesting that revision was needed for contemporary invasive Salmonella infections. Newly established CLSI guidelines for azithromycin and Salmonella enterica serovar Typhi were published in CLSI document M100 in 2015.
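The revised breakpoints mentioned above are applied as a simple MIC lookup. A sketch using the lowered CLSI ciprofloxacin breakpoints for Salmonella as commonly cited (susceptible ≤0.06, intermediate 0.12–0.5, resistant ≥1 µg/mL); these values are stated here from general knowledge, so confirm against the current CLSI M100 edition before any clinical use:

```python
def interpret_cipro_salmonella(mic_ug_ml):
    """Categorise a ciprofloxacin MIC (ug/mL) for Salmonella using the
    revised CLSI breakpoints: S <= 0.06, I 0.12-0.5, R >= 1.
    Assumed values; verify against the current CLSI M100 tables."""
    if mic_ug_ml <= 0.06:
        return "S"
    if mic_ug_ml < 1:
        return "I"
    return "R"

for mic in (0.03, 0.25, 2.0):
    print(mic, interpret_cipro_salmonella(mic))
```

Under the older, higher breakpoints an isolate with an MIC of 0.25 µg/mL would have reported as susceptible; the revision reclassifies such decreased-susceptibility isolates, which is exactly the clinical concern the abstract describes.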
Article
Full-text available
Currently, microorganisms are best identified using 16S rRNA and 18S rRNA gene sequencing. However, in recent years matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has emerged as a potential tool for microbial identification and diagnosis. During the MALDI-TOF MS process, microbes are identified using either intact cells or cell extracts. The process is rapid, sensitive, and economical in terms of both labor and costs involved. The technology has been readily adopted by microbiologists, who have reported the use of MALDI-TOF MS for a number of purposes, such as microbial identification and strain typing, epidemiological studies, detection of biological warfare agents, detection of water- and food-borne pathogens, detection of antibiotic resistance, and detection of blood and urinary tract pathogens. The limitation of the technology is that identification of new isolates is possible only if the spectral database contains peptide mass fingerprints of the type strains of specific genera/species/subspecies/strains. This review provides an overview of the status and recent applications of mass spectrometry for microbial identification. It also explores the usefulness of this exciting new technology for diagnosis of diseases caused by bacteria, viruses, and fungi.
Article
Full-text available
Background: Proficiency testing (PT) is a means of verifying the reliability of laboratory results, but such programmes are not readily available to laboratories in developing countries. This project provided PT to laboratories in Nigeria. Objectives: To assess the proficiency of laboratories in the diagnosis of HIV, tuberculosis and malaria. Methods: This was a prospective study carried out between 2009 and 2011. A structured questionnaire was administered to 106 randomly-selected laboratories. Forty-four indicated their interest in participation and were enrolled. Four rounds of pre-characterised plasma panels for HIV, sputum films for tuberculosis and blood films for malaria were distributed quarterly by courier over the course of one year. The results were returned within two weeks and scores of ≥ 80% were reported as satisfactory. Mentoring was offered after the first and second PT rounds. Results: Average HIV PT scores increased from 74% to 95% from the first round to the third round, but decreased in the fourth round. For diagnosis of tuberculosis, average scores increased from 42% in the first round to 78% in the second round; but a decrease to 34% was observed in the fourth round. Malaria PT performance was 2% at first, but average scores increased between the second and fourth rounds, culminating in a fourth-round score of 39%. Many participants requested training and mentoring. Conclusions: There were gross deficiencies in the quality of laboratory services rendered across Nigeria. In-country PT programmes, implemented in conjunction with mentoring, will improve coverage and diagnosis of HIV, tuberculosis and malaria.
Article
Full-text available
A number of exciting new technologies have emerged to detect infectious diseases with greater accuracy and provide faster times to result in hopes of improving the provision of care and patient outcomes. However, the challenge in evaluating new methods lies not in the technical performance of tests but in (1) defining the specific advantages of new methods over the present gold standards in a practicable way and (2) understanding how advanced technologies will prompt changes in medical and public health decisions. With rising costs to deliver care, enthusiasm for innovative technologies should be balanced with a comprehensive understanding of clinical and laboratory ecosystems and how such factors influence the success or failure of test implementation. Selecting bloodstream infections as an exemplar, we provide a 6-step model for test adoption that will help clinicians and laboratorians better define the value of a new technology specific to their clinical practices.
Article
Full-text available
In 1891, Winter(1) described the first 4 cases of tuberculous meningitis (TBM), in which paracentesis of the theca vertebralis was performed to relieve cerebrospinal fluid (CSF) pressure. Since this original description of the lumbar puncture (LP) procedure, neurologists worldwide have relied on LPs for both diagnostic and therapeutic purposes. In resource-limited settings, LPs are often the only neurologic test available to aid the clinician in neurologic diagnosis. In sub-Saharan Africa, a large number of patients present to hospitals with acute neurologic symptoms, including those who are HIV-infected and have opportunistic infections such as cryptococcal meningitis and TBM. In these clinical scenarios, LPs are an essential point-of-care diagnostic and therapeutic procedure.(2) The benefits of the LP as a diagnostic tool are well-known, but it is important to emphasize that therapeutic LPs are a low-cost measure to monitor and treat intracranial pressure (ICP) due to nonobstructive hydrocephalus in regions of the world where more sophisticated testing and treatment are unavailable due to limitations of medical equipment, medication supplies, and clinical personnel, including specialized neurologists and neurosurgeons.
Article
The emergence of multidrug-resistant (MDR) typhoid is a major global health threat affecting many countries where the disease is endemic. Here, whole-genome sequence analysis of 1,832 Salmonella enterica serovar Typhi (S. Typhi) isolates identifies a single dominant MDR lineage, H58, that has emerged and spread throughout Asia and Africa over the last 30 years. Our analysis identifies numerous transmissions of H58, including multiple transfers from Asia to Africa and an ongoing, unrecognized MDR epidemic within Africa itself. Notably, our analysis indicates that H58 lineages are displacing antibiotic-sensitive isolates, transforming the global population structure of this pathogen. H58 isolates can harbor a complex MDR element residing either on transmissible IncHI1 plasmids or within multiple chromosomal integration sites. We also identify new mutations that define the H58 lineage. This phylogeographical analysis provides a framework to facilitate global management of MDR typhoid and is applicable to similar MDR lineages emerging in other bacterial species.
Article
Escherichia coli sequence type 131 (ST131) and Klebsiella pneumoniae ST258 emerged in the 2000s as important human pathogens, have spread extensively throughout the world, and are responsible for the rapid increase in antimicrobial resistance among E. coli and K. pneumoniae strains, respectively. E. coli ST131 causes extraintestinal infections and is often fluoroquinolone resistant and associated with extended-spectrum β-lactamase production, especially CTX-M-15. K. pneumoniae ST258 causes urinary and respiratory tract infections and is associated with carbapenemases, most often KPC-2 and KPC-3. The most prevalent lineage within ST131 is named fimH30 because it contains the H30 variant of the type 1 fimbrial adhesin gene, and recent molecular studies have demonstrated that this lineage emerged in the early 2000s and was then followed by the rapid expansion of its sublineages H30-R and H30-Rx. K. pneumoniae ST258 comprises 2 distinct lineages, namely clade I and clade II. Moreover, it seems that ST258 is a hybrid clone that was created by a large recombination event between ST11 and ST442. Epidemic plasmids with blaCTX-M and blaKPC belonging to incompatibility group F have contributed significantly to the success of these clones. E. coli ST131 and K. pneumoniae ST258 are the quintessential examples of international multidrug-resistant high-risk clones.
Article
Nontyphoidal Salmonella is a major cause of bloodstream infections worldwide, and HIV-infected persons and malaria-infected children are at increased risk for the disease. We conducted a systematic literature review to obtain age group-specific, population-based invasive nontyphoidal Salmonella (iNTS) incidence data. Data were categorized by HIV and malaria prevalence and then extrapolated by using 2010 population data. The case-fatality ratio (CFR) was determined by expert opinion consensus. We estimated that 3.4 (range 2.1-6.5) million cases of iNTS disease occur annually (overall incidence 49 cases [range 30-94] per 100,000 population). Africa, where infants, young children, and young adults are most affected, has the highest incidence (227 cases [range 152-341] per 100,000 population) and number of cases (1.9 [range 1.3-2.9] million cases). An iNTS CFR of 20% yielded 681,316 (range 415,164-1,301,520) deaths annually. iNTS disease is a major cause of illness and death globally, particularly in Africa. Improved understanding of the epidemiology of iNTS is needed.
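The burden arithmetic summarized in the abstract above (cases = incidence rate × population; deaths = cases × case-fatality ratio) can be sketched in a few lines. This is purely illustrative: the world-population denominator below is an assumption back-derived from the reported global incidence, and the paper's exact age- and region-specific denominators will differ.

```python
def annual_cases(incidence_per_100k: float, population: float) -> float:
    """Cases = incidence rate (per 100,000) x population at risk."""
    return incidence_per_100k / 100_000 * population

def annual_deaths(cases: float, cfr: float) -> float:
    """Deaths = cases x case-fatality ratio (CFR)."""
    return cases * cfr

# Reported global figures: 49 cases per 100,000 population and a 20% CFR.
world_pop_2010 = 6.9e9  # assumed ~2010 world population (not from the abstract)

cases = annual_cases(49, world_pop_2010)   # ~3.38 million, close to the reported 3.4 million
deaths = annual_deaths(cases, 0.20)        # ~0.68 million, close to the reported 681,316
```

The small gap between these round-number results and the paper's 681,316 deaths reflects the unrounded case counts and stratified denominators used in the original estimate.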
Article
Background: The present External Quality Assessment (EQA) assessed the reading and interpretation of malaria rapid diagnostic tests (RDTs) in the Democratic Republic of the Congo (DRC). Methods: The EQA consisted of (i) 10 high-resolution printed photographs displaying cassettes with real-life results and multiple-choice questions (MCQ) addressing individual health workers (HW), and (ii) a questionnaire on RDT use addressing the laboratories of health facilities (HF). Answers were transmitted through short message services (SMS). Results: The EQA comprised 2344 HW and 1028 HF covering 10/11 provinces in DRC. Overall, the median HW score (the sum of correct answers on the 10 MCQ photographs for each HW) was 9.0 (interquartile range 7.5-10); MCQ scores (the percentage of correct answers for a particular photograph) ranged from 54.8% to 91.6%. The most common errors were (i) reading or interpreting faint or weak line intensities as negative (3.3%, 7.2%, 24.3%, and 29.1% for 4 MCQ photographs), (ii) failure to distinguish the correct Plasmodium species (3.4% to 7.0%), (iii) missing invalid test results (8.4% and 23.6%), and (iv) missing negative test results (10.0% and 12.4%). HW who had been trained less than 12 months previously had the best MCQ scores for 7/10 photographs as well as a significantly higher proportion of 10/10 scores, but absolute differences in MCQ scores were small. HW who had participated in a previous EQA performed significantly better on 4/10 photographs than those who had not. Except for two photographs, MCQ scores were comparable across all levels of the HF hierarchy, and non-laboratory staff (HW from health posts) performed similarly to laboratory staff.
The main findings of the questionnaire were (i) use of RDT products other than those recommended by the national malaria control programme (nearly 20% of participating HF), (ii) lack of training in a third (33.6%) of HF, and (iii) a high proportion (two-thirds, 66.5%) of HF reporting stock-outs. Conclusions: The present EQA revealed common errors in RDT reading and interpretation by HW in DRC. The performance of non-laboratory and laboratory staff was similar, and dedicated training was shown to improve HW competence, although to a moderate extent. Problems in the supply and distribution of RDTs and in training were also detected.