Recent publications
Background
Kenya launched a Rabies Elimination Strategy in 2014, aiming to end human rabies deaths by 2030. In March 2022, Lamu County reported an increase in dog bites among humans and in suspected human rabies deaths to the Ministry of Health (MoH). We aimed to establish the extent of the rabies outbreak in humans and animals and to identify challenges to achieving rabies elimination by 2030.
Methods
We extracted dog bite reports from the Kenya Health Information System (KHIS), the national surveillance database, and reviewed medical records at health facilities in Lamu County for suspected human rabies deaths from 2020 to 2022. We obtained information about animal bites and illnesses among deceased persons, checked the availability of anti-rabies vaccines in health facilities, and administered rabies knowledge and practice questionnaires to health workers. Frequencies and proportions were calculated for categorical data.
Results
There were 787 dog bite cases and six human rabies cases. Only a third (2/6) of the rabies cases were uploaded to the KHIS. The county used targeted dog vaccination, and no samples were collected from the biting dogs. Half (8/16) of the facilities stocked the human rabies vaccine, and 19% (3/16) stocked both the vaccine and rabies immunoglobulin (RIG). Rabies vaccine stock-outs were common, reported by 69% (11/16) of facilities. Only 25% (18/73) of health workers reported that their first action would be to clean the bite wound with running water and soap for 15 min. Additionally, 86% (54/63) did not know the recommended human rabies vaccine and RIG dosage and schedule, and only 25% (18/73) of healthcare workers were satisfied with the existing information-sharing mechanisms between the veterinary and human health departments for rabies prevention and control.
Conclusions
There was underreporting of rabies cases, a lack of awareness of bite wound management at health facilities, and persistent stockouts of human rabies vaccines. We suggest training healthcare workers on animal bite case management and improving One Health information exchange.
Background
Wilms tumor (WT) is the most common pediatric malignancy of the kidney. Past studies describing WT incidence and survival used surveillance data covering < 30% of the US population. We evaluated differences in WT incidence and survival across demographic groups and tumor characteristics.
Methods
We analyzed new cases of WT among patients aged < 20 years at diagnosis by using incidence data from US Cancer Statistics (USCS) for 2003–2020 and 5‐year relative survival (RS) data from the National Program of Cancer Registries (NPCR) for 2001–2019. To assess incidence trends, the average annual percent change (AAPC) was calculated by using joinpoint regression. RS and all‐cause survival were calculated overall and by demographic and clinical variables.
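The AAPC reported in the Methods comes from joinpoint regression; with zero joinpoints this reduces to a single log-linear segment whose slope gives the annual percent change. A minimal sketch of that single-segment case, using illustrative data rather than the USCS rates:

```python
import numpy as np

def annual_percent_change(years, rates):
    """APC from a log-linear model log(rate) = b0 + b1*year.

    With no joinpoints, joinpoint regression reduces to this single
    segment; APC = (exp(b1) - 1) * 100.
    """
    years = np.asarray(years, dtype=float)
    log_rates = np.log(np.asarray(rates, dtype=float))
    b1, b0 = np.polyfit(years, log_rates, 1)  # OLS on the log scale
    return (np.exp(b1) - 1.0) * 100.0

# Illustrative (not the study's) data: a flat trend near 5.7/million
years = np.arange(2003, 2021)
rates = np.full(years.shape, 5.7)
print(round(annual_percent_change(years, rates), 2))  # near zero for a flat series
```

Joinpoint software additionally searches for change points and averages segment-specific APCs into the AAPC; this sketch covers only the no-joinpoint case.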
Results
During 2003–2020, 8218 cases of WT were reported in USCS, which represented an age‐adjusted incidence rate of 5.7 cases per million. Rates were highest among females (6.3), children aged 0–4 years (17.2), and non‐Hispanic Black patients (7.1). Overall, trends remained stable (AAPC = −0.4, 95% CI: −1.4 to 0.4). Among 7567 cases of WT in NPCR, 5‐year RS was 92.6%. Patients with the lowest survival included the following: those aged 10–19 years (hazard ratio [HR] = 1.65, 95% CI: 1.02–2.65); non‐Hispanic Black patients (HR = 1.39, 95% CI: 1.11–1.76); those with regional stage (HR = 1.93, 95% CI: 1.47–2.54) or distant stage (HR = 5.12, 95% CI: 3.99–6.57); and patients from nonmetropolitan counties (HR = 1.46, 95% CI: 1.09–1.96). Individuals diagnosed during 2011–2019 (HR = 0.64, 95% CI: 0.53–0.77) had higher survival than those diagnosed during 2001–2010.
Conclusions
The highest WT incidence rates occurred among patients who were female, aged 0–4 years, or non‐Hispanic Black. Survival improved during the study period but differed by race, ethnicity, metropolitan status, and age. Further studies to delineate the causes of these disparities may improve outcomes.
Larval source management (LSM) has a long history of advocacy and successes but is rarely adopted where funds are limited. The World Health Organization (WHO) guidelines on malaria prevention recommend LSM only as a supplementary intervention to the core vector control methods (insecticide-treated nets and indoor residual spraying), arguing that its feasibility in many settings is limited by larval habitats being numerous, transient, and difficult to find or treat. Another key argument is that there is insufficient high-quality evidence of its effectiveness to support wide-scale implementation. However, the stagnation of progress towards malaria elimination demands that we consider options beyond the current emphasis on insecticidal commodities targeting adult mosquitoes inside homes. This letter is the result of a global, cross-disciplinary collaboration comprising: (a) detailed online expert discussions, (b) a narrative review of countries that have eliminated local malaria transmission, and (c) a mathematical modelling exercise using two different approaches. Together, these efforts culminated in seven key recommendations for elevating larval source management as a strategy for controlling malaria and other mosquito-borne diseases in Africa (Box 1). LSM encompasses the use of larvicides (a commodity) as well as various environmental sanitation measures. These measures lead to long-term reductions in mosquito populations, benefiting the entire community by controlling both disease-vector and nuisance mosquitoes. In this paper, we argue that the heavy reliance on large-scale cluster-randomized controlled trials (CRTs) to generate evidence on epidemiological endpoints restricts recommendations to interventions that can be delivered in standardized functional units and produce relatively uniform impact, and thus to those most likely to attract the financial support needed to conduct such trials.
The explicit impacts of LSM may be better captured by alternative evaluation approaches, especially high-quality operational data and a recognition of locally distinct outcomes and tailored strategies. LSM's contribution is also evidenced by the widespread use of LSM strategies in nearly all countries that have successfully achieved malaria elimination. Two modelling approaches demonstrate that a multifaceted strategy incorporating LSM as a central intervention alongside other vector control methods can effectively mitigate key biological threats such as insecticide resistance and outdoor biting, leading to substantial reductions in malaria cases in representative African settings. We extend this argument to show that the available evidence is sufficient to establish the link between LSM approaches and reduced transmission of mosquito-borne diseases. What is needed now is a significant boost in the financial resources and public health administration structures necessary to train, employ, and deploy local-level workforces tasked with suppressing mosquito populations in scientifically driven and ecologically sensitive ways. In conclusion, WHO guidelines that recognize LSM as a key intervention deliverable in multiple contextualized forms would increase flexibility in funding and aid countries in implementing the strategies they deem appropriate. Financially supporting the scale-up of LSM, with high-quality operational monitoring, in combination with other core vector control tools can deliver better health outcomes. The global health community should reconsider how evidence and funding are used to support LSM initiatives.
Graphical Abstract
Background
β-Synuclein (β-syn), expressed mainly in the central nervous system, is a cerebrospinal fluid (CSF) and blood biomarker of synaptic damage and has been reported to be elevated in the CSF and blood of patients with prion diseases (PrDs).
Methods
We analyzed 314 CSF samples from patients in the China National Surveillance for CJD. The 223 patients with PrDs included sporadic Creutzfeldt-Jakob disease (sCJD), genetic CJD (gCJD), fatal familial insomnia (FFI), and Gerstmann-Sträussler-Scheinker syndrome (GSS). The control groups comprised 91 patients with non-PrDs: Alzheimer's disease (AD), Parkinson's disease (PD), viral encephalitis (VE), or autoimmune encephalitis (AE). CSF β-syn levels were measured by a commercial microfluidic ELISA. The Mann–Whitney U test and Kruskal–Wallis H test were employed to compare two or more sets of continuous variables. Multiple linear regression was performed to evaluate factors associated with CSF β-syn levels. Receiver operating characteristic (ROC) curves and area under the curve (AUC) values were used to assess the diagnostic performance of β-syn.
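The Mann–Whitney and ROC/AUC analyses above are directly related: a biomarker's AUC equals the Mann–Whitney U statistic divided by the product of the two group sizes. A sketch of that identity, using hypothetical β-syn-like values rather than the study's data:

```python
from itertools import product

def auc_mann_whitney(cases, controls):
    """AUC = P(case value > control value) + 0.5 * P(tie),
    i.e. the Mann-Whitney U statistic divided by n_cases * n_controls."""
    u = sum(1.0 if c > k else 0.5 if c == k else 0.0
            for c, k in product(cases, controls))
    return u / (len(cases) * len(controls))

# Hypothetical CSF levels in pg/ml (illustrative, not the study's data)
prd_like  = [2074, 4332, 691, 3100, 2500]   # "PrD" group
ctrl_like = [504, 126, 3374, 800, 400]      # "AD + PD" group
print(auc_mann_whitney(prd_like, ctrl_like))  # 0.8
```

An AUC of 0.5 indicates no discrimination; values approaching 1.0 indicate that the biomarker ranks cases above controls almost always.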
Results
The median β-syn level of all PrDs (2074 pg/ml; IQR: 691 to 4332) was significantly higher than that of the non-PrD group (504 pg/ml; IQR: 126 to 3374). CSF β-syn values in the sCJD, T188K-gCJD, E200K-gCJD, and P102L-GSS cohorts were markedly higher than those of the AD + PD group, but similar to those of the VE + AE group. Elevated CSF β-syn in sCJD and gCJD cases was statistically associated with CSF 14-3-3 positivity and the appearance of mutism. ROC curve analysis showed satisfactory performance for distinguishing these groups from AD + PD, with high AUC values for sCJD (0.7640), T188K-gCJD (0.8489), E200K-gCJD (0.8548), P102L-GSS (0.7689), and D178N-FFI (0.7210).
Conclusion
Our data indicate that CSF β-syn is a potential biomarker for distinguishing PrDs (gCJD, sCJD, and GSS) from AD and PD, but it is much less efficient at distinguishing them from VE and AE. These findings have critical implications for early diagnosis and for monitoring synaptic integrity in prion diseases.
Clostridioides difficile is the predominant pathogen in hospital-acquired infections and antibiotic-associated diarrhea. Dedicated networks and annual reports for C. difficile surveillance have been established in Europe and North America; however, extensive investigation of the prevalence of C. difficile infection (CDI) in China is limited. In this study, 1528 patients with diarrhea were recruited from seven geographically representative regions of China between July 2021 and July 2022. The positivity rate of toxigenic C. difficile by real-time fluorescence quantitative PCR testing of feces was 10.2% (156/1528), and 125 strains (8.2%, 125/1528) were successfully isolated. Isolates from different geographical areas showed divergent characteristics on multilocus sequence typing, toxin gene profiling, and antimicrobial susceptibility testing. No isolates from clade 2 were found, and clade 1 remained the main clade among these clinical isolates. Interestingly, clade 4, especially ST37, previously known as the characteristic type in China, showed strong geographical divergence. Clade 3, although rare in China, was detected in Hainan and Sichuan provinces. Most C. difficile isolates (76.8%, 96/125) were toxigenic. Clindamycin, erythromycin, and moxifloxacin were the three antibiotics to which resistance was most often observed, with resistance rates of 81.3%, 63.6%, and 24.0%, respectively. Furthermore, 34 (27.2%, 34/125) multidrug-resistant (MDR) strains were identified. All strains were sensitive to metronidazole, vancomycin, and meropenem. The genotype of C. difficile varies greatly among geographical regions in China, and new types are constantly emerging. Therefore, comprehensive, longitudinal, and standardized surveillance of C. difficile infection is needed in China, covering typical geographical areas.
The introduction of the Sustainable Development Goals by the United Nations has set a global target for achieving Universal Health Coverage, requiring resilient health systems capable of addressing public health emergencies and ensuring health security. Public health surveillance, crucial for detecting and responding to infectious disease outbreaks, is key to building health system resilience. Due to the high levels of mobility and political instability in the Middle East and North Africa (MENA) region, unique challenges arise in cross-border health surveillance. This review aims to highlight the importance of cross-border public health surveillance in strengthening health systems across MENA to achieve equitable health outcomes.
A mixed-methods approach was utilized, combining a systematic literature review with semi-structured in-depth interviews (IDIs) involving 28 stakeholders from seven MENA countries. The literature review adhered to PRISMA guidelines, while the IDIs provided qualitative insights into current surveillance practices and challenges. Findings from the literature review and IDIs were triangulated and analyzed using the WHO Health Systems Strengthening (HSS) Building Blocks Framework to identify key challenges and recommendations for improving cross-border surveillance.
Results indicate that existing cross-border surveillance systems in MENA face challenges in data collection, analysis, and sharing, with disparities across countries based on income levels and political contexts. Key challenges include delayed and incomplete data sharing, insufficient funding across sectors, inadequate training, inconsistent data definitions, and limited integration of health data for mobile populations. Recommendations emphasize strengthened governance and leadership to facilitate regional cooperation and information sharing, sustainable financing for implementing a One Health approach, utilizing innovative information systems, workforce development to enhance data collection and analysis, and secure supply chains for medicines and vaccines and equitable service delivery for all mobile populations.
In conclusion, the WHO HSS Building Blocks Framework provides a comprehensive approach to assessing and improving cross-border public health surveillance and to enhancing health security and equity in MENA. Strengthening cross-border surveillance systems may help MENA countries meet International Health Regulations (IHR) requirements, achieve greater health security, and advance health equity among all types of mobile populations. Despite limitations, the study offers critical insights for improving cross-border surveillance strategies in the region.
White-tailed deer (Odocoileus virginianus) are a ubiquitous species in North America. Their high reproductive potential leads to rapid population growth, and they exhibit a wide range of biological adaptations that influence their interactions with vectors and pathogens. This review aims to characterize the intricate interplay between white-tailed deer and the transmission cycles of various tick- and mosquito-borne pathogens across their range in the eastern United States and southeastern Canada. The first part offers insights into the biological characteristics of white-tailed deer, their population dynamics, and the consequential impacts on both the environment and public health. This contextual backdrop sets the stage for the two subsequent sections, which delve into specific examples of pathogen transmission involving white-tailed deer categorized by tick and mosquito vectors into tick-borne and mosquito-borne diseases. This classification is essential, as ticks and mosquitoes serve as pivotal elements in the eco-epidemiology of vector-borne diseases, intricately linking hosts, the environment, and pathogens. Through elucidating these associations, this paper highlights the crucial role of white-tailed deer in the transmission dynamics of tick- and mosquito-borne diseases. Understanding the interactions between white-tailed deer, vectors, and pathogens is essential for effective disease management and public health interventions.
Graphical Abstract
Antimalarial therapeutic efficacy studies are vital for monitoring drug efficacy in malaria-endemic regions. The WHO recommends genotyping polymorphic markers, including msp-1, msp-2, and glurp, to distinguish recrudescences from reinfections. Recently, the WHO proposed replacing glurp with microsatellites (Poly-α, PfPK2, TA1). However, suitable combinations with msp-1 and msp-2, as well as the performance of different algorithms for classifying recrudescence, have not been systematically assessed. This study investigated various microsatellites alongside msp-1 and msp-2 for molecular correction and compared different genotyping algorithms across three sites in Uganda. Microsatellites 313, Poly-α, and 383 exhibited the highest diversity, while PfPK2 and Poly-α revealed elevated multiplicity of infection (MOI) across all sites. The 3/3 match-counting algorithm classified significantly fewer recrudescences than both the ≥ 2/3 and Bayesian algorithms at probability cutoffs of ≥ 0.7 and ≥ 0.8 (P < 0.05). The msp-1/msp-2/2490 combination identified more recrudescences using the ≥ 2/3 and 3/3 algorithms in the artemether-lumefantrine (AL) treatment arm, while the msp-1/msp-2/glurp combination classified more recrudescences using the ≥ 2/3 algorithm in the dihydroartemisinin-piperaquine (DP) arm. Microsatellites PfPK2 and Poly-α, potentially sensitive to detecting minority clones, are promising replacements for glurp. Discrepancies in recrudescence classification between match-counting and Bayesian algorithms highlight the need for standardized PCR correction practices.
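The ≥ 2/3 and 3/3 match-counting algorithms compared above can be sketched directly. Assumptions in this sketch: each of three markers yields a set of detected alleles per sample, and a marker "matches" when the day-0 and day-of-failure samples share at least one allele; the marker names and allele bins below are hypothetical, and the Bayesian alternative is not shown:

```python
def classify_recrudescence(day0, failure, rule="2/3"):
    """Match-counting over three genotyping markers.

    day0 / failure: dict mapping marker name -> set of alleles detected.
    rule "3/3": recrudescence only if all three markers match.
    rule "2/3": recrudescence if two or more match (more permissive,
    so it classifies more treatment failures as recrudescent).
    """
    matches = sum(1 for m in day0
                  if day0[m] & failure.get(m, set()))
    needed = 3 if rule == "3/3" else 2
    return "recrudescence" if matches >= needed else "reinfection"

# Hypothetical paired samples: two markers match, one does not
d0   = {"msp1": {"K1_200"}, "msp2": {"3D7_480"}, "polyA": {151}}
fail = {"msp1": {"K1_200"}, "msp2": {"3D7_480"}, "polyA": {163}}
print(classify_recrudescence(d0, fail, "2/3"))  # recrudescence
print(classify_recrudescence(d0, fail, "3/3"))  # reinfection
```

The example illustrates why the 3/3 rule classifies fewer recrudescences: a single non-matching marker, possibly a minority clone missed at day 0, flips the call to reinfection.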
Supplementary Information
The online version contains supplementary material available at 10.1038/s41598-025-88892-7.
Objectives
To determine the impact of increased use of long-acting injectable antiretroviral therapy (cabotegravir-rilpivirine [CAB/RPV]) among persons with diagnosed HIV (PWDH) who have viral suppression (VLS), per 2021 US Food and Drug Administration (FDA) guidelines, on HIV incidence and levels of VLS in the US.
Methods
We used the HOPE compartmental model to simulate CAB/RPV use during 2023-2035. We first simulated a baseline scenario (no CAB/RPV) in which 69% of PWDH had VLS. We then introduced CAB/RPV in 2023 under two scenarios: (1) CAB/RPV extended the duration of VLS after cessation of ART compared with oral ART; (2) CAB/RPV additionally improved adherence. We compared cumulative 2023-2035 incidence and the percentage of PWDH with VLS at year-end 2035 to baseline.
Results
When CAB/RPV increased the duration of VLS only, cumulative incidence was reduced by up to 9% and VLS increased by up to 4%. When CAB/RPV also improved ART adherence, incidence was reduced by up to 19.5% and VLS increased by up to 9%.
Conclusions
CAB/RPV, even if used only among PWDH with VLS, may reduce HIV incidence and increase VLS because suppression lasts longer after cessation of use. If CAB/RPV also improves ART adherence, incidence is further reduced. Improved clinical efficacy of CAB/RPV may translate into improved population-level outcomes, even in limited use cases.
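The mechanism in scenario (1), longer persistence of suppression after ART cessation lowering transmission, can be illustrated with a toy three-state model. This is emphatically not the HOPE model; the states, rates, and parameter values below are all illustrative assumptions:

```python
import math

def simulate(years=13, stop_rate=0.10, restart_rate=0.50,
             post_stop_vls_years=0.5, beta=0.05):
    """Toy model of PWDH split into: on-ART suppressed (S), off-ART but
    still suppressed (P), and non-suppressed (U). Only U contributes to
    transmission here, so longer post-cessation suppression duration
    lowers cumulative incidence. All parameters are illustrative."""
    S, P, U = 0.69, 0.0, 0.31          # ~69% VLS at baseline
    p_exit = 1.0 - math.exp(-1.0 / post_stop_vls_years)
    cum_incidence = 0.0
    for _ in range(years):
        cum_incidence += beta * U       # new infections this year
        to_P = stop_rate * S            # stop ART, suppression persists
        to_U = p_exit * P               # suppression finally wanes
        to_S = restart_rate * U         # re-engage in care
        S, P, U = S - to_P + to_S, P + to_P - to_U, U + to_U - to_S
    return cum_incidence, S + P         # cumulative incidence, VLS share

short = simulate(post_stop_vls_years=0.25)   # oral-ART-like persistence
long_ = simulate(post_stop_vls_years=1.00)   # CAB/RPV-like persistence
print(long_[0] < short[0])   # longer suppression -> fewer infections: True
```

The qualitative conclusion mirrors the abstract: holding everything else fixed, lengthening the post-cessation suppression window both lowers cumulative incidence and raises the VLS share.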
Although autism is a childhood‐onset neurodevelopmental disorder, its features change across the life course due to a combination of individual and contextual influences. However, the influence of contextual factors on development during childhood and beyond is less frequently studied than individual factors such as genetic variants that increase autism risk, IQ, language, and autistic features. Potentially important contexts include the family environment and socioeconomic status, social networks, school, work, services, neighborhood characteristics, environmental events, and sociocultural factors. Here, we articulate the benefit of studying contextual factors, and we offer selected examples of published longitudinal autism studies that have focused on how individuals develop within context. Expanding the autism research agenda to include the broader context in which autism emerges and changes across the life course can enhance understanding of how contexts influence the heterogeneity of autism, support strengths and resilience, or amplify disabilities. We describe challenges and opportunities for future research on contextual influences and provide a list of digital resources that can be integrated into autism data sets. It is important to conceptualize contextual influences on autism development as main exposures, not only as descriptive variables or factors needing statistical control.
In recent years, multiple reports have emerged describing real-time quantitative polymerase chain reaction (qPCR) detection of DNA derived from human parasite species in environmental soil samples. In one such report, sampling was focused in impoverished areas of the southeastern United States, and a link between poverty and the presence of parasite DNA in soil was proposed. Whether transmission of certain parasitic diseases persists in the United States in association with poverty remains an important question. However, we emphasize caution when reviewing interpretations drawn solely from qPCR detection of parasite-derived environmental DNA without further verification. We discuss here the limitations of using qPCR to test environmental DNA samples, the need for sampling strategies that are unbiased and repeatable, and the importance of selecting appropriate control areas and statistical tests to draw meaningful conclusions.
RMSF, a tickborne infection caused by Rickettsia rickettsii, produces severe and fatal disease in humans and dogs. Since the beginning of the 21st century, cases have risen dramatically, most notably in Mexico and Brazil, where outbreaks occur in urban centers, including cities with populations of > 1,000,000 persons. Reported case fatality rates can exceed 50%. Factors associated with high case fatality include lack of awareness of disease ecology, limited diagnostic capacity, and delays in appropriate antimicrobial treatment. The emergence of urban hyperendemic foci has been driven by 2 distinct but similar anthropogenic events that create disproportionately high numbers of vertebrate amplifiers of R rickettsii, as well as of the tick species that transmit this pathogen, in proximity to dense human populations. This often occurs in neighborhoods with highly marginalized at-risk populations, including persons in poverty and particularly children, and with under-resourced health management systems. We discuss strategies to reduce host dog populations, particularly in Mexico, and capybara populations in Brazil. We review challenges to the control of tick populations in these settings. Robust systems are required to enhance awareness of RMSF among medical practitioners and people at risk. Public health campaigns should incorporate innovative behavioral science (eg, diverse learning models, motivational interviews, and gamification) to increase prevention and understanding within communities. While anti-Rickettsia or anti-tick vaccines will be necessary to resolve this One Health crisis, impactful implementation will require data-driven, multiple-target innovations to address challenges with hosts, ticks, medical systems, and public welfare. The companion Currents in One Health article by Foley, Backus, and López-Pérez, JAVMA, March 2025, addresses helpful information for the practicing veterinarian.
Introduction
Millions of Euvichol-Plus doses have been deployed from the global oral cholera vaccine stockpile in over 20 cholera-affected countries. However, information on the effectiveness of Euvichol-Plus is limited. The use of this vaccine in a cholera epidemic in Dhaka, Bangladesh, provided an opportunity to evaluate vaccine effectiveness (VE) using a test-negative design.
Methods
A two-dose regimen of Euvichol-Plus was administered to individuals aged >1 year in a population of ca. 900 000 in two campaign rounds between June and August 2022, with prospective registration of all persons who received at least one dose. We conducted systematic surveillance in two key facilities, enrolling patients with acute watery diarrhoea who were eligible for vaccination from the campaign’s start and who presented for care between 21 August 2022 and 20 August 2023. Faecal culture-positive cholera cases were matched to up to four faecal culture-negative controls by age, presentation date and facility. Vaccination status was documented without knowledge of culture results. Conditional logistic regression models estimated the OR for the vaccination-cholera association, and the VE of the two-dose regimen was calculated as [(1−OR) × 100].
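The VE calculation described above, VE = (1 − OR) × 100, is a direct transformation of the matched odds ratio; note that the confidence bounds flip when converting from the OR scale. A sketch (the OR values shown are hypothetical inputs chosen only to illustrate the arithmetic):

```python
def vaccine_effectiveness(odds_ratio):
    """VE (%) from the matched odds ratio in a test-negative design."""
    return (1.0 - odds_ratio) * 100.0

def ve_interval(or_lo, or_hi):
    """CI bounds flip: the upper OR bound gives the lower VE bound."""
    return vaccine_effectiveness(or_hi), vaccine_effectiveness(or_lo)

# An OR of 0.34 corresponds to a VE of 66%
print(round(vaccine_effectiveness(0.34), 1))  # 66.0
# Hypothetical OR limits of 0.17 and 0.70 give VE bounds of 30% and 83%
print(tuple(round(v, 1) for v in ve_interval(0.17, 0.70)))  # (30.0, 83.0)
```

A negative VE (OR > 1) is possible and simply means the point estimate favors higher odds of disease among the vaccinated, as in some of the wide pediatric intervals reported below.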
Results
The analysis included 226 cases and 552 matched controls. The adjusted VE of two doses of the Euvichol-Plus vaccine against medically attended cholera was 66% (99.5% CI: 30 to 83) for all recipients. Limited protection (12%; 95% CI: −95 to 60) was observed for children aged 1–4 years; whereas, protection was 79% (95% CI: 60 to 89) for those aged ≥5 years. VE against cholera with moderate to severe dehydration was 69% (95% CI: 44 to 83) overall but 6% (95% CI: −206 to 71) for children aged 1–4 years.
Conclusion
Euvichol-Plus provided significant protection against medically attended cholera of any severity as well as cholera with moderate to severe dehydration. However, significant levels of protection were only observed for those aged ≥5 years.
This study aimed to establish an animal model of monkeypox virus (MPXV) infection in dormice through intranasal inoculation. Male dormice aged 4–5 months were administered different titers of MPXV (10^3.5, 10^4.5, and 10^5.5 PFU, respectively) via nasal instillation. For 14 days post-infection, clinical indicators such as survival, body weight changes, respiratory status, and mental state were continuously monitored. Tissue samples from the lungs, liver, spleen, and trachea of dormice in each group were collected on days 5 and 10 for virus titer detection, and histopathological analysis was performed on the lung samples. Dormice infected with MPXV exhibited typical signs including appetite loss, continuous body weight reduction, and worsening respiratory difficulty, accompanied by lethargy, chills, and other clinical manifestations similar to those of human monkeypox infection. Virological tests confirmed the distribution of MPXV in multiple vital organs, including the lungs, liver, spleen, and trachea, with particularly marked pathological damage in lung tissue. An MPXV infection model in dormice was successfully established through intranasal inoculation at a titer of 10^5.5 PFU, which can be used to study the infection mechanism and pharmacology of MPXV.