ABSTRACT: As key stakeholders in immunization policy decisions, the Pediatricians of Ontario held an accredited conference on January 18, 2014, to discuss prevention of invasive meningococcal disease. Five key recommendations were put forth regarding immunization strategies to protect children from meningococcal serogroup B (MenB) disease. The recently approved four-component meningococcal B (4CMenB) vaccine should be recommended and funded as part of Ontario's routine immunization schedule and should also be mandated for school attendance. Public funding for 4CMenB immunization is justified by current MenB epidemiology, vaccine coverage, cost effectiveness and acceptability, as well as by legal, political and ethical considerations, particularly because routine recommendations and funding are currently in place for vaccination against meningococcal serogroups that cause significantly less disease in Canada than MenB does. Broadly, the goals are to assist individual practitioners in advocating the benefits of 4CMenB vaccination to parents, and to counterbalance recommendations from the National Advisory Committee on Immunization and the Canadian Paediatric Society.
Article · Sep 2015 · The Canadian Journal of Infectious Diseases & Medical Microbiology / AMMI Canada
ABSTRACT: Syphilis outbreaks in urban men who have sex with men (MSM) are an ongoing public health challenge in many high-income countries, despite intensification of efforts to screen and treat at-risk individuals. We sought to understand how population-level coverage of asymptomatic screening impacts the ability to control syphilis transmission.
Methods: We developed a risk-structured, deterministic, compartmental mathematical model of syphilis transmission in a population of sexually active MSM. A baseline level of treatment of syphilis cases due to care-seeking was assumed in all scenarios. We evaluated the impact of sustained annual population-wide screening coverage, ranging from 0% to 90%, on syphilis incidence over the short term (20 years) and at endemic equilibrium.
Results: The relationship between screening coverage and equilibrium syphilis incidence displayed an inverted U shape, with peak equilibrium incidence occurring at 20-30% annual screening coverage. Annual screening of 62% of the population was required for local elimination (incidence <1 case per 100 000 population). Results were qualitatively similar under differing programmatic, behavioural and natural history assumptions, although the screening thresholds for local elimination differed. With 6-monthly or 3-monthly screening, the population coverage required for local elimination fell to 39% or 23%, respectively.
Conclusions: Although screening has the potential to control syphilis outbreaks, suboptimal coverage may paradoxically lead to a higher equilibrium incidence of infection than that observed in the absence of intervention. Suboptimal screening programme design should be considered a possible contributor to unsuccessful syphilis control programmes in the context of the current epidemic.
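The inverted-U described above can be illustrated with a deliberately minimal compartmental sketch, not the authors' fitted model: screening returns infectious individuals to the susceptible pool before natural progression moves them to a non-infectious late stage, so intermediate screening rates replenish susceptibles faster than they avert transmission. All parameter values are illustrative assumptions.

```python
# Toy compartmental model of the screening paradox: screening (sigma) returns
# infectious individuals to the susceptible pool, natural progression (gamma)
# moves them to a non-infectious late stage, and mu is population turnover.
# Parameter values are illustrative assumptions, not the paper's fitted values.

def equilibrium_incidence(sigma, beta=2.0, gamma=0.5, mu=0.05, n=100_000):
    """Annual new infections at endemic equilibrium in a population of size n.

    Model: S' = mu*n - beta*S*I/n - mu*S + sigma*I
           I' = beta*S*I/n - (sigma + gamma + mu)*I
           R' = gamma*I - mu*R
    Setting S' = I' = 0 gives S*/n = x/beta with x = sigma + gamma + mu, and
    incidence = x * mu * n * (1 - x/beta) / (gamma + mu),
    which is an inverted-U in the screening rate sigma.
    """
    x = sigma + gamma + mu
    if x >= beta:  # effective reproduction number below 1: local elimination
        return 0.0
    return x * mu * n * (1.0 - x / beta) / (gamma + mu)

for sigma in (0.0, 0.2, 0.45, 1.0, 1.5):
    print(f"screening rate {sigma:.2f}/yr -> "
          f"{equilibrium_incidence(sigma):.0f} infections/yr")
```

In this toy model incidence peaks when sigma + gamma + mu = beta/2 and falls to zero once screening pushes the effective reproduction number below 1, mirroring the reported 20-30% peak and elimination threshold qualitatively, not quantitatively.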
Article · May 2015 · Sexually Transmitted Infections
ABSTRACT: Only a portion of hospital-acquired Clostridium difficile infections can be traced back to source patients identified as having symptomatic disease. Antibiotic exposure is the main risk factor for C difficile infection for individual patients and is also associated with increased asymptomatic shedding. Contact with patients taking antibiotics within the same hospital ward may be a transmission risk factor for C difficile infection, but this hypothesis has never been tested.
Objective: To obtain a complete portrait of inpatient risk that incorporates both innate patient-level risk factors and transmission risk factors measured at the hospital ward level, and to investigate the association between ward-level rates of antibiotic use and C difficile infection risk.
Design, Setting, and Participants: A 46-month (June 1, 2010, through March 31, 2014) retrospective cohort study of inpatients 18 years or older in a large, acute care teaching hospital comprising 16 wards, including 5 intensive care units and 11 non-intensive care unit wards.
Exposures: Patient-level risk factors (eg, age, comorbidities, hospitalization history, antibiotic exposure) and ward-level risk factors (eg, antibiotic days of therapy per 100 patient-days, hand hygiene adherence, mean patient age) were identified from hospital databases.
Main Outcomes and Measures: Incidence of hospital-acquired C difficile infection, identified prospectively by hospital infection prevention and control staff.
Results: A total of 255 of 34 298 patients developed C difficile infection (incidence rate, 5.95 per 10 000 patient-days; 95% CI, 5.26-6.73). Ward-level antibiotic exposure varied from 21.7 to 56.4 days of therapy per 100 patient-days. Each 10% increase in ward-level antibiotic exposure was associated with a 2.1 per 10 000 patient-days increase in C difficile incidence (P < .001). The association between C difficile incidence and ward antibiotic exposure was the same among patients with and without recent antibiotic exposure, and C difficile risk persisted after multilevel, multivariable adjustment for differences in patient risk factors among wards (relative risk, 1.34 per 10% increase in days of therapy; 95% CI, 1.16-1.57).
Conclusions and Relevance: Among hospital inpatients, ward-level antibiotic prescribing is associated with a statistically significant and clinically relevant increase in C difficile risk that persists after adjustment for differences in patient-level antibiotic use and for other patient- and ward-level risk factors. These data strongly support the use of antibiotic stewardship as a means of preventing C difficile infection.
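As a worked example of the reported effect sizes (reading the "10% increase" as 10 days of therapy per 100 patient-days; the extrapolation across the full observed ward range is our own illustration, not a result from the paper):

```python
# Worked arithmetic on the effect sizes reported above. The extrapolation from
# the lowest- to the highest-prescribing ward is our own illustration.

rr_per_10 = 1.34                  # adjusted RR per 10 days of therapy/100 patient-days
low_dot, high_dot = 21.7, 56.4    # observed ward-level range of antibiotic use

steps = (high_dot - low_dot) / 10     # number of 10-unit increments across the range
rr_range = rr_per_10 ** steps
print(f"Implied RR, highest- vs lowest-prescribing ward: {rr_range:.2f}")

# Crude absolute scale: +2.1 cases per 10,000 patient-days per 10-unit increment
baseline_ir = 5.95                    # overall incidence per 10,000 patient-days
print(f"Crude incidence shift across range: {2.1 * steps:.1f} per 10,000 "
      f"patient-days (overall IR {baseline_ir})")
```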
Article · Feb 2015 · JAMA Internal Medicine
ABSTRACT: Objective: To evaluate the safety and efficacy of incobotulinumtoxinA in elderly patients with dementia and paratonia.
Setting: University-affiliated hospital spasticity management clinic.
Participants: Ten subjects were enrolled.
Inclusion criteria: 1) severe cognitive impairment; 2) diagnosis of Alzheimer's disease, vascular dementia, or frontotemporal dementia; and 3) score >3 on the Paratonia Assessment Instrument, with arm posture interfering with the provision of care.
Exclusion criteria: 1) alternative etiologies for increased tone, and 2) botulinum toxin injection within the 6 months preceding the study.
Design: Single-center, randomized, double-blind, placebo-controlled crossover trial with two 16-week treatment cycles. Assessments occurred at 2, 6, 12 and 16 weeks following injections. Subjects received up to 300 U of incobotulinumtoxinA in the affected arm(s).
Outcome measures: The primary outcome measure was the modified Caregiver Burden Scale (mCBS); exploratory secondary outcome measures were also assessed. Analysis of variance and mixed-model techniques were used to evaluate treatment effects.
Results: IncobotulinumtoxinA treatment produced significant improvement in mCBS total score (treatment effect [95% CI]: -1.11 [-2.04 to -0.18]), dressing sub-score (-0.36 [-0.59 to -0.12]), and cleaning under the left and right armpit sub-scores (-0.5 [-0.96 to -0.04] and -0.41 [-0.79 to -0.04], respectively). Passive range of motion (PROM) in the left and right elbow increased by 27.67 degrees (13.32-42.02) and 22.07 degrees (9.76-34.39), respectively, and in the left and right shoulder by 11.92 degrees (5.46-18.38) and 8.58 degrees (3.73-13.43), respectively. No significant treatment effect was found for the GAS, VAS and PAINAD scales or for change in time to perform care. No adverse drug reactions occurred.
Conclusions: Administration of incobotulinumtoxinA in elderly people with advanced dementia and paratonia may be an efficacious and safe treatment to increase range of motion and reduce caregiver burden. Further studies are needed to confirm these results.
ABSTRACT: Infections due to Gram-negative bacteria exhibit seasonal trends, with peak infection rates during warmer months. We hypothesized that the likelihood of a bloodstream infection due to Gram-negative bacteria increases with proximity to the equator. We tested this hypothesis and identified geographical, climatic and social factors associated with this variability.
Design and Setting: We established a network of 23 international centers in 22 cities. De-identified results of positive blood cultures from 2007-2011 and data sources for geographic, climatic and socioeconomic factors were assembled for each center.
Patients: Patients at the 23 centers with positive blood cultures.
Measurements: Because total culture volumes were not uniformly available across sites, our primary outcome measure was the fraction of positive blood cultures that yielded Gram-negative bacteria; sources of variability in this outcome measure were explored using meta-regression techniques.
Results: The mean fraction of bacteremia associated with Gram-negative bacteria was 48.4% (range 26.4% to 61.8%). Although not all sites displayed significant seasonality, the overall P value for seasonal oscillation was significant (P<0.001). In univariate meta-regression models, temperature, latitude, latitude squared, longitude, per capita gross domestic product and the percentage of gross domestic product spent on healthcare were all associated with the fraction of bacteremia due to Gram-negative bacteria. In multivariable models, only the percentage of gross domestic product spent on healthcare and distance from the equator (ie, latitude squared) remained significantly associated with the fraction of bacteremia due to Gram-negative bacteria.
Conclusions: The likelihood of bacteremia due to Gram-negative bacteria varies markedly between cities, in a manner that appears to have both geographic (latitude) and socioeconomic (proportion of gross domestic product devoted to health spending) determinants. Thus, the optimal approach to initial management of suspected bacteremia may be geographically specific, and the rapid emergence of highly antibiotic-resistant Gram-negative pathogens may have geographically specific impacts.
ABSTRACT: The Middle East Respiratory Syndrome Coronavirus (MERS-CoV) was first recognized in 2012 as a cause of severe respiratory illness and renal failure. Prior to 2014, MERS-CoV was mostly associated with sporadic cases of human illness of presumed zoonotic origin, though chains of person-to-person transmission in the healthcare setting were reported. In spring 2014, large healthcare-associated outbreaks of MERS-CoV infection occurred in Jeddah and Riyadh, Kingdom of Saudi Arabia. To date, the epidemiological information published by public health investigators in affected jurisdictions has been relatively limited. However, it is important that the global public health community have access to information on the basic epidemiological features of the outbreak to date, including the basic reproduction number (R0) and best estimates of case-fatality rates (CFR). We sought to address these gaps using a publicly available line listing of MERS-CoV cases.
Methods: R0 was estimated using the incidence decay with exponential adjustment ("IDEA") method, while period-specific case-fatality rates that incorporated non-attributed death data were estimated using Monte Carlo simulation.
Results: A total of 707 cases were available for evaluation; 52% were identified as primary, with the remainder secondary. IDEA model fits suggested a higher R0 in Jeddah (3.5-6.7) than in Riyadh (2.0-2.8), and control parameters suggested a more rapid reduction in transmission in the former city than in the latter. The model accurately projected the final size and end date of the Riyadh outbreak using information available before the outbreak peak; for Jeddah, these projections were possible only once the outbreak had peaked. Overall case fatality was 40%; depending on the timing of 171 deaths unlinked to case data, outbreak CFR could be higher than, lower than, or equivalent to the pre-outbreak CFR.
Conclusions: Notwithstanding imperfect data, inferences about MERS-CoV epidemiology important for public health preparedness are possible using publicly available data sources. The R0 estimated in Riyadh appears similar to that seen for SARS-CoV, but the CFR appears higher, and indirect evidence suggests that control activities ended these outbreaks. These data suggest that MERS-CoV should be regarded with concern equal to or greater than that accorded the related SARS-CoV.
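The IDEA model used above fits incident case counts per serial-interval "generation" with a single equation, I(t) = (R0 / (1 + d)^t)^t, where d is a control parameter that damps growth. A minimal sketch of the projection step (the R0 and d values are illustrative, not the fitted estimates reported above):

```python
# Sketch of the IDEA ("incidence decay with exponential adjustment") projection:
# incident cases at generation t are I(t) = (R0 / (1 + d)**t)**t, with t counted
# in serial intervals. R0 and d values here are illustrative assumptions.

def idea_incidence(r0: float, d: float, generations: int) -> list[float]:
    """Projected incident cases per serial-interval generation."""
    return [(r0 / (1.0 + d) ** t) ** t for t in range(generations + 1)]

traj = idea_incidence(r0=2.4, d=0.06, generations=15)
peak = max(range(len(traj)), key=traj.__getitem__)
print(f"peak at generation {peak}; projected final size ~{sum(traj):.0f} cases")
```

With d = 0 the model reduces to pure exponential growth (R0^t); increasing d pulls the projected peak earlier and shrinks final size, which is how faster control in Jeddah versus Riyadh would show up in the fitted control parameter.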
ABSTRACT: The 2014 West African Ebola outbreak has evolved into an epidemic of historical proportions and catastrophic scope. Prior outbreaks have been contained through the use of personal protective equipment, but this approach has not been rapidly effective in the current epidemic. Several candidate vaccines have been developed against the Ebola virus and are undergoing initial clinical trials.
Methods: As removal of population-level susceptibility through vaccination could be a highly impactful control measure for this epidemic, we sought to estimate the number of vaccine doses, and the timing of vaccine administration, required to reduce the epidemic size. Our base model was fit using the IDEA approach, a single-equation model that has to date described Ebola growth well. We projected the future course of the epidemic using this model, assumed that vaccination reduces the effective reproduction number, and evaluated the potential impact of vaccination on the epidemic trajectory under different assumptions about the timing of vaccine availability.
Results: Using effective reproduction number (Re) estimates derived from this model, we estimate that 3-4 million doses of vaccine, if available and administered, could reduce Re to 0.9 in the interval from January to March 2015. Later vaccination would have a progressively diminishing impact on final epidemic size; in particular, vaccination to the same Re at or after the projected epidemic peak (April-May 2015) would have little impact on final epidemic size, though more intensive campaigns (e.g., Re reduced to 0.5) could still be effective if initiated by summer 2015. In summary, there is a closing window of opportunity for the use of vaccine as a tool for Ebola epidemic control.
Conclusions: Effective vaccination, deployed before the epidemic peaks, is projected to prevent tens of thousands of deaths; this does not diminish the ethical challenges that would be associated with wide-scale application of vaccines that have undergone only limited evaluation for safety and efficacy.
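Under simple homogeneous-mixing assumptions, vaccinating a fraction p of the population with a vaccine of efficacy VE rescales the effective reproduction number to Re(1 - p·VE). The dose arithmetic can then be sketched as follows; the Re, efficacy and population values are illustrative assumptions, not the paper's inputs:

```python
# Back-of-envelope link between vaccine doses and Re under homogeneous mixing
# with a single-dose vaccine: Re_vax = Re * (1 - p * VE), so the coverage needed
# to reach a target is p = (1 - Re_target/Re) / VE. All inputs are illustrative.

def doses_needed(re_now: float, re_target: float, population: int,
                 efficacy: float) -> int:
    """Doses required to move Re from re_now down to re_target."""
    coverage = (1.0 - re_target / re_now) / efficacy
    if not 0.0 <= coverage <= 1.0:
        raise ValueError("target Re not reachable at this efficacy")
    return round(coverage * population)

d = doses_needed(re_now=1.5, re_target=0.9, population=10_000_000, efficacy=0.9)
print(f"~{d / 1e6:.1f} million doses")
```

With these illustrative inputs the answer lands in the low millions of doses, the same order of magnitude as the 3-4 million figure reported above.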
ABSTRACT: The 2014 West African Ebola virus outbreak, now more correctly referred to as an epidemic, is the largest ever to occur. As of August 28, 2014, concerns have been raised that control efforts, particularly in Liberia, have been ineffective, as reported case counts continue to increase. Limited data are available on the epidemiology of the outbreak; however, reported cumulative incidence data as well as death counts are available for Guinea, Sierra Leone, Liberia and Nigeria. We utilized a simple, two-parameter mathematical model of epidemic growth and control to characterize epidemic growth patterns in West Africa, to evaluate the degree to which the epidemic is being controlled, and to assess the potential implications of growth patterns for epidemic size. Models demonstrated good fits to the data. The overall basic reproductive number (R0) for the epidemic was estimated to be between 1.6 and 2.0, consistent with prior outbreaks. However, we identified only weak evidence of epidemic control in West Africa as a whole, and essentially no evidence of control in Liberia (though slowing of growth was seen in Guinea and Sierra Leone). It is projected that small reductions in transmission would prevent tens of thousands of future infections. These findings suggest an extraordinary need for improved control measures for the 2014 Ebola epidemic, especially in Liberia, if catastrophe is to be averted.
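The sensitivity of epidemic size to small reductions in transmission can be illustrated with simple geometric-growth arithmetic: cumulative cases over n generations scale as the sum of R^t, which responds sharply to R. The R values, seed count and horizon below are illustrative, not the paper's projections.

```python
# Why "small reductions in transmission would prevent tens of thousands of
# future infections": cumulative cases under near-exponential growth are
# seeds * sum_{t=0..n} R**t, which is very sensitive to R. Inputs are
# illustrative assumptions, not the paper's fitted values.

def cumulative_cases(r: float, generations: int, seeds: int = 100) -> float:
    """Total cases after the given number of transmission generations."""
    return seeds * sum(r ** t for t in range(generations + 1))

for r in (1.8, 1.7):
    print(f"R = {r}: ~{cumulative_cases(r, 10):,.0f} cases after 10 generations")
averted = cumulative_cases(1.8, 10) - cumulative_cases(1.7, 10)
print(f"cases averted by reducing R from 1.8 to 1.7: ~{averted:,.0f}")
```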
ABSTRACT: Antibiotic therapy is the principal risk factor for Clostridium difficile infection (CDI), but little is known about how risk accumulates over the course of therapy and abates after its cessation. We prospectively identified CDI cases among adults hospitalized at a tertiary hospital between June 2010 and May 2012. Poisson regression models included covariates for time since admission, age, hospitalization history, disease pressure, and intensive care unit stay. The impact of antibiotic use through time was modeled using 4 measures: current antibiotic receipt, time since most recent receipt, time since first receipt during a hospitalization, and duration of receipt. Over the 24-month study period, we identified 127 patients with new-onset nosocomial CDI (incidence rate per 10,000 patient-days [IR] = 5.86). Of the 4 measures, time since most recent receipt was the strongest independent predictor of CDI incidence. Relative to patients with no receipt of antibiotics in the prior 30 days (IR = 2.95), the incidence rate of CDI was 2.41 times higher (95% confidence interval [CI] 1.41, 4.13) during antibiotic receipt and 2.16 times higher when the most recent receipt was in the prior 1-5 days (CI 1.17, 4.00). The incidence rates of CDI following 1-3, 4-6 and 7-11 days of antibiotic exposure were 1.60 (CI 0.85, 3.03), 2.27 (CI 1.24, 4.16) and 2.10 (CI 1.12, 3.94) times higher compared with no prior receipt. These findings are consistent with studies showing higher risk associated with longer antibiotic use in hospitalized patients, but suggest that the duration of increased risk is shorter than previously thought.
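A worked back-calculation from the rates reported above (the derivation of total follow-up time and the absolute rates is our own illustration from the published figures):

```python
# Worked arithmetic on the reported rates. The back-calculation of total
# patient-days from cases and incidence rate is our own illustration.

cases, ir_per_10k = 127, 5.86
patient_days = cases / ir_per_10k * 10_000
print(f"implied follow-up: ~{patient_days:,.0f} patient-days")

# Absolute rates implied by the reported incidence rate ratios, relative to the
# baseline of no antibiotic receipt in the prior 30 days.
baseline_ir = 2.95  # CDI cases per 10,000 patient-days
for label, irr in [("during antibiotic receipt", 2.41),
                   ("receipt in prior 1-5 days", 2.16)]:
    print(f"{label}: ~{baseline_ir * irr:.1f} per 10,000 patient-days")
```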