BMC Infectious Diseases

Published by Springer Nature
Online ISSN: 1471-2334
Recent publications
Workflow overview. The basic layout of the infrastructure: both systems, the Swiss HIV Cohort Study (SHCS) and the Swiss Transplant Cohort Study (STCS), interact with the newly set-up trial platform on Research Electronic Data Capture (REDCap). Various application programming interfaces (APIs) and triggers enable communication between these systems
Study overview on the REDCap trial platform: A A dashboard for each patient visualizes the progress of the trial. B A dashboard specific to the Swiss HIV Cohort Study (SHCS) gives an overview of the study progress, with missing items highlighted in red. C A search tool implemented in REDCap can be used to navigate to specific data entry forms. D A dashboard specific to the Swiss Transplant Cohort Study (STCS) gives an overview of the study progress
Background The rapid course of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic calls for fast implementation of clinical trials to assess the effects of new treatments and prophylactic interventions. Building trial platforms embedded in existing data infrastructures is an ideal way to address such questions within well-defined subpopulations. Methods We developed a trial platform building on the infrastructure of two established national cohort studies: the Swiss human immunodeficiency virus (HIV) Cohort Study (SHCS) and the Swiss Transplant Cohort Study (STCS). In a pilot trial, termed Corona VaccinE tRiAL pLatform (COVERALL), we assessed the vaccine efficacy of the first two licensed SARS-CoV-2 vaccines in Switzerland and the functionality of the trial platform. Results Using Research Electronic Data Capture (REDCap), we developed a trial platform integrating the infrastructure of the SHCS and STCS. An algorithm for identifying eligible patients, together with automated baseline data transfer, ensured a fast inclusion procedure. We implemented convenient redirections between the different data entry systems to ensure intuitive data entry for the participating study personnel. The trial platform, including a randomization algorithm ensuring balance among different subgroups, was continuously adapted to changing guidelines concerning vaccination policies. We were able to randomize and vaccinate the first trial participant on the same day we received ethics approval. Time to enroll and randomize our target sample size of 380 patients was 22 days. Conclusion Taking the best of each system, we were able to flag eligible patients, transfer patient information automatically, and randomize and enroll patients in an easy workflow, decreasing the administrative burden usually associated with a trial of this size.
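The abstract above mentions a randomization algorithm that keeps allocation balanced among subgroups. Stratified permuted-block randomization is a standard way to achieve this; the sketch below is illustrative only (the arm names, block size, and strata are hypothetical, not taken from the COVERALL protocol).

```python
import random

class StratifiedBlockRandomizer:
    """Permuted-block randomization within strata: each stratum draws
    assignments from its own shuffled block, so arm allocation stays
    balanced within every subgroup."""

    def __init__(self, arms, block_size=4, seed=None):
        assert block_size % len(arms) == 0, "block must divide evenly among arms"
        self.arms = list(arms)
        self.block_size = block_size
        self.rng = random.Random(seed)
        self.blocks = {}  # stratum -> remaining assignments in current block

    def assign(self, stratum):
        block = self.blocks.setdefault(stratum, [])
        if not block:  # refill with a freshly shuffled permuted block
            block.extend(self.arms * (self.block_size // len(self.arms)))
            self.rng.shuffle(block)
        return block.pop()

# Hypothetical usage: 40 participants alternating between two cohort strata.
randomizer = StratifiedBlockRandomizer(arms=["vaccine A", "vaccine B"], seed=1)
counts = {"vaccine A": 0, "vaccine B": 0}
for i in range(40):
    arm = randomizer.assign("SHCS" if i % 2 == 0 else "STCS")
    counts[arm] += 1
```

Because each stratum completes whole blocks, allocation is exactly balanced within and across strata here regardless of the shuffle order.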
Summary figure of estimated values for patients with hospital-acquired symptomatic infections and onward community transmission with a 7-day cut-off for symptom onset after admission and prior to discharge for defining a patient with hospital-acquired infection. Note here that the "misclassified" (C) includes those "missed" unidentified infections that return to hospital later as a hospitalised COVID-19 case (1500 "community-onset, hospital-acquired" cases)
Background SARS-CoV-2 is known to transmit in hospital settings, but the contribution of infections acquired in hospitals to the epidemic at a national scale is unknown. Methods We used comprehensive national English datasets to determine the number of COVID-19 patients with identified hospital-acquired infections (with symptom onset > 7 days after admission and before discharge) in acute English hospitals up to August 2020. As patients may leave the hospital prior to detection of infection or have rapid symptom onset, we combined measures of the length of stay and the incubation period distribution to estimate how many hospital-acquired infections may have been missed. We used simulations to estimate the total number (identified and unidentified) of symptomatic hospital-acquired infections, as well as infections due to onward community transmission from missed hospital-acquired infections, to 31st July 2020. Results In our dataset of hospitalised COVID-19 patients in acute English hospitals with a recorded symptom onset date (n = 65,028), 7% were classified as hospital-acquired. We estimated that only 30% (range across weeks and 200 simulations: 20–41%) of symptomatic hospital-acquired infections would be identified, with up to 15% (mean, 95% range over 200 simulations: 14.1–15.8%) of cases currently classified as community-acquired COVID-19 potentially linked to hospital transmission. We estimated that 26,600 (25,900 to 27,700) individuals acquired a symptomatic SARS-CoV-2 infection in an acute Trust in England before 31st July 2020, accounting for 15,900 (15,200–16,400) cases, or 20.1% (19.2–20.7%), of all identified hospitalised COVID-19 cases. Conclusions Transmission of SARS-CoV-2 to hospitalised patients likely caused approximately a fifth of identified cases of hospitalised COVID-19 in the "first wave" in England, but less than 1% of all infections in England.
Using time to symptom onset from admission for inpatients as a detection method likely misses a substantial proportion (> 60%) of hospital-acquired infections.
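The estimate that only ~30% of symptomatic hospital-acquired infections are identified comes from combining the length-of-stay and incubation-period distributions. A minimal Monte-Carlo sketch of that idea follows; the stay distribution and lognormal incubation parameters here are illustrative assumptions, not the study's fitted inputs.

```python
import random

def fraction_identified(stays, n=50_000, cutoff=7, seed=0):
    """Monte-Carlo sketch: fraction of symptomatic hospital-acquired
    infections whose symptom onset falls more than `cutoff` days after
    admission AND before discharge -- i.e. the fraction the 7-day rule
    would classify as hospital-acquired. Assumes infection occurs at a
    uniform time during the stay and a lognormal incubation period
    (median ~5 days); both are illustrative assumptions."""
    rng = random.Random(seed)
    identified = 0
    for _ in range(n):
        length_of_stay = rng.choice(stays)            # days in hospital
        t_infection = rng.uniform(0, length_of_stay)  # when infected during stay
        incubation = rng.lognormvariate(1.63, 0.5)    # days until symptoms
        onset = t_infection + incubation
        if cutoff < onset < length_of_stay:           # onset caught in hospital
            identified += 1
    return identified / n
```

With short stays most onsets occur after discharge, so the identified fraction stays well below one, which is the mechanism behind the study's ~30% estimate.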
Background Some tuberculosis (TB) treatment guidelines recommend daily TB treatment in both the intensive and continuation phases of treatment in HIV-positive persons to decrease the risk of relapse and acquired drug resistance. However, guidelines vary across countries, and treatment is given 7, 5, 3, or 2 days/week. The effect of TB treatment intermittency in the continuation phase on mortality in HIV-positive persons on antiretroviral therapy (ART) is not well described. Methods We conducted an observational cohort study among HIV-positive adults treated for TB between 2000 and 2018 and after enrollment into the Caribbean, Central, and South America network for HIV epidemiology (CCASAnet; Brazil, Chile, Haiti, Honduras, Mexico and Peru). All received standard TB therapy (2-month initiation phase of daily isoniazid, rifampin or rifabutin, pyrazinamide ± ethambutol) and a continuation phase of isoniazid and rifampin or rifabutin, administered concomitantly with ART. Known timing of ART and TB treatment was also an inclusion criterion. Kaplan–Meier and Cox proportional hazards methods compared time to death between groups. Missing model covariates were imputed via multiple imputation. Results 2303 patients met inclusion criteria: 2003 (87%) received TB treatment 5–7 days/week and 300 (13%) 2–3 days/week in the continuation phase. Intermittency varied by site: 100% of patients from Brazil and Haiti received continuation phase treatment 5–7 days/week, followed by Honduras (91%), Peru (42%), Mexico (7%), and Chile (0%). The crude risk of death was lower among those receiving treatment 5–7 vs. 2–3 days/week (HR = 0.68; 95% CI = 0.51–0.91; P = 0.008). After adjusting for age, sex, CD4, ART use at TB diagnosis, site of TB disease (pulmonary vs. extrapulmonary), and year of TB diagnosis, mortality risk was lower, but not significantly, among those treated 5–7 days/week vs. 2–3 days/week (HR 0.75, 95%CI 0.55–1.01; P = 0.06).
After also stratifying by study site, there was no longer a protective effect (HR 1.42, 95%CI 0.83–2.45; P = 0.20). Conclusions TB treatment 5–7 days/week was associated with a marginally decreased risk of death compared to TB treatment 2–3 days/week in the continuation phase in multivariable, unstratified analyses. However, little variation in TB treatment intermittency within countries meant the results could have been driven by other differences between study sites. Therefore, randomized trials are needed, especially in heterogeneous regions such as Latin America.
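The abstract above compares time to death between groups with Kaplan–Meier methods. A compact, dependency-free version of the Kaplan–Meier product-limit estimator (the toy follow-up data are hypothetical):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator. `times` are follow-up
    times and `events` flag death (1) vs. censoring (0). Returns the
    survival curve as (time, S(t)) steps at each death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        while i < len(data) and data[i][0] == t:  # all subjects tied at t
            leaving += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= leaving
    return curve

# Hypothetical follow-up data: deaths at t=1, 2, 3; censoring at t=2, 4.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0])
```

At tied times the censored subject is still counted as at risk when the death is processed, which is the usual convention.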
A map of Ethiopia showing the location of the study hospitals with corresponding SARS-CoV-2 seroprevalence. a Shows the location of five hospitals from which a total of 1997 healthcare workers enrolled between December 2020 and February 2021. b Shows the corresponding seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The y-axis of Fig. 2b represents the study hospitals. The x-axis of Fig. 2b shows crude seroprevalence rates (%) with 95% confidence intervals, estimated by dividing the number of participants who tested seropositive for immunoglobulin G (IgG) antibodies elicited against the receptor binding domain (RBD) of the spike protein of SARS-CoV-2 by the total number of participants who provided sera and were tested
Abbreviations COVID-19: Coronavirus disease 2019; SARS-CoV-2: Severe acute respiratory syndrome coronavirus 2; RBD: Receptor binding domain; RT-PCR: Reverse transcription polymerase chain reaction; BCG: Bacille Calmette-Guérin; IgG: Immunoglobulin G; ELISA: Enzyme-Linked Immunosorbent Assay; WHO: World Health Organization; OD: Optical density; LFA: Lateral flow assay; HWs: Healthcare workers.
Background The COVID-19 pandemic has had a devastating impact on the economies and health care systems of sub-Saharan Africa. Healthcare workers (HWs), the main actors of the health system, are at higher risk because of their occupation. Serology-based estimates of SARS-CoV-2 infection among HWs represent a measure of HWs' exposure to the virus, could serve as a guide to the prevalence of SARS-CoV-2 in the community, and are valuable in combating COVID-19. This information is currently lacking in Ethiopia and other African countries. This study aimed to develop an in-house antibody testing assay and to assess the prevalence of SARS-CoV-2 antibodies among Ethiopian high-risk frontline HWs. Methods We developed and validated an in-house Enzyme-Linked Immunosorbent Assay (ELISA) for specific detection of anti-SARS-CoV-2 receptor binding domain immunoglobulin G (IgG) antibodies. We then used this assay to assess the seroprevalence among HWs in five public hospitals located in different geographic regions of Ethiopia. From consenting HWs, blood samples were collected between December 2020 and February 2021, the period between the two peaks of COVID-19 in Ethiopia. Socio-demographic and clinical data were collected using questionnaire-based interviews. Descriptive statistics and bivariate and multivariate logistic regression were used to determine the overall and post-stratified seroprevalence and the association between seropositivity and potential risk factors. Results The sensitivity of our in-house assay was 100% in serum samples collected 2 weeks after the first onset of symptoms, whereas its specificity in pre-COVID-19 pandemic sera was 97.7%. Using this assay, we analyzed a total of 1997 sera collected from HWs. Of the 1997 HWs who provided a blood sample and demographic and clinical data, 51.7% were females, 74.0% had no symptoms compatible with COVID-19, and 29.0% had a history of contact with suspected or confirmed patients with SARS-CoV-2 infection.
The overall seroprevalence was 39.6%. The lowest (24.5%) and the highest (48.0%) seroprevalence rates were found in Hiwot Fana Specialized Hospital in Harar and ALERT Hospital in Addis Ababa, respectively. Of the 821 seropositive HWs, 224 (27.3%) had a history of symptoms consistent with COVID-19, while 436 (> 53%) had neither contact with COVID-19 cases nor a history of COVID-19-like symptoms. A history of close contact with suspected/confirmed COVID-19 cases was associated with seropositivity (Adjusted Odds Ratio (AOR) = 1.4, 95% CI 1.1–1.8; p = 0.015). Conclusion High SARS-CoV-2 seroprevalence levels were observed in the five Ethiopian hospitals. These findings highlight the significant burden of asymptomatic infection in Ethiopia and may reflect the scale of transmission in the general population.
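The figure legend above describes crude seroprevalence as seropositives divided by total sera tested, with a 95% confidence interval. A minimal sketch using a Wald interval (one common choice; the paper does not state which interval method it used):

```python
import math

def seroprevalence_ci(positives, total, z=1.96):
    """Crude seroprevalence (positives / total sera tested) with a
    Wald 95% confidence interval, truncated to [0, 1]."""
    p = positives / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# 821 seropositive of 1997 healthcare workers sampled:
p, lo, hi = seroprevalence_ci(821, 1997)
```

Note the crude ratio 821/1997 is about 41.1%, slightly above the quoted overall 39.6%, which is presumably the post-stratified estimate mentioned in the methods.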
Background Tuberculosis (TB) control is threatened by an increasing prevalence of diabetes mellitus (DM), particularly in endemic countries. Screening for DM is not routinely implemented in Tanzania; therefore, we aimed to screen for DM at TB diagnosis using clinical-demographic markers. Methods Our cross-sectional study recruited TB patients who received anti-TB treatment between October 2019 and September 2020 at health care facilities in three regions of Tanzania. Patients were screened for DM using DM symptoms (polydipsia, polyphagia and polyuria) and random blood glucose (RBG) testing. Patients with a history of DM and those with no history of DM but an RBG ≥ 7.8 mmol/L had point-of-care glycated haemoglobin (HbA1c) testing, and were considered to have DM if HbA1c was ≥ 48 mmol/mol. Results Among the 1344 TB patients, the mean age was 41.0 (± 17.0) years, and 64.7% were male. A total of 1011 (75.2%) had pulmonary TB, and 133 (10.4%) had at least one DM symptom. Overall, the prevalence of DM was 7.8%, of which 36 (2.8%) TB patients with no history of DM were newly diagnosed with DM by RBG testing. TB/DM patients were older than those with only TB (50.0 ± 14.0 years vs 40.0 ± 17.0 years, p < 0.001). Patients with RBG ≥ 7.8 mmol/L were more likely to have pulmonary TB (p = 0.003), to be aged ≥ 35 years (p = 0.018), and to have at least one DM symptom (p < 0.001). There was substantial agreement (Kappa = 0.74) between the on-site glucometer and point-of-care HbA1c tests in detecting DM-range hyperglycemia. Conclusion The implementation of clinical-demographic markers and blood glucose screening identified the overall prevalence of DM and those at risk of DM among TB patients. Clinical-demographic markers are independent predictors of DM-range hyperglycemia and highlight the importance of further diagnostic testing and early co-management of TB and DM.
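The abstract reports substantial agreement (Kappa = 0.74) between the glucometer and HbA1c tests. Cohen's kappa can be computed directly from a 2×2 agreement table; the counts below are hypothetical, chosen only to land near the reported value:

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table laid out as
    [[both positive, A-pos/B-neg], [A-neg/B-pos, both negative]]."""
    n = sum(sum(row) for row in table)
    p_observed = (table[0][0] + table[1][1]) / n   # raw agreement
    p_a = (table[0][0] + table[0][1]) / n          # test A positive rate
    p_b = (table[0][0] + table[1][0]) / n          # test B positive rate
    p_chance = p_a * p_b + (1 - p_a) * (1 - p_b)   # agreement expected by chance
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical glucometer-vs-HbA1c counts (not the study's data):
kappa = cohens_kappa([[40, 5], [8, 47]])
```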
Proportion of main MDR strains by study year. Figure 1 is a line graph of the proportion of the main MDR strains by study year. Linear trends, determined using the coefficient of determination (R²), demonstrate that there is no significant linear increase/decrease in the prevalence of MDR strains. The overall prevalence of other MDR strains is demonstrated in Table 2
Overview of bacterial strains and MDR strains
Background Patients, especially inpatients, with spinal cord lesions and disorders (SCI/D) have an elevated risk of recurrent urinary tract infections with multidrug-resistant (MDR) bacteria. This study evaluated antimicrobial resistance and the prevalence of multidrug resistance and determined the risk factors for multidrug resistance. Methods In this retrospective cohort study, urine culture results were used to calculate the antimicrobial resistance rate and the incidence of infection with MDR bacteria in the SCI/D population. MDR was defined as acquired nonsusceptibility to at least one agent from three or more antimicrobial categories. The cohort included 402 inpatients from 2013 to 2020, with 1385 urine isolates. We included only the first isolate; duplicate isolates, defined as positive cultures of the same strain within 14 days, were excluded from the evaluation. Results The most common MDR strains were Klebsiella spp. (29%) and Escherichia coli (24%). MDR isolates were detected in 50% of the samples and extended spectrum beta-lactamase (ESBL)-producing isolates were detected in 26%, while carbapenem resistance was found in 0.1%. Significantly higher rates of infection with MDR bacteria were identified in groups of patients with indwelling urethral/suprapubic catheters (p = 0.003) and severity scores of C1–C4/AIS A–C (p = 0.01). We identified age (OR: 0.99, 95% CI 0.98–0.99, p < 0.001), sex (OR: 1.55, 95% CI 1.16–2.06, p = 0.003), management with urethral/suprapubic catheters (OR: 2.76, 95% CI 2.04–3.74, p < 0.001), and spontaneous voiding (OR: 1.84, 95% CI 1.03–3.29, p = 0.038) as independent predictors of multidrug resistance in our study population. Conclusions We identified a high antibiotic resistance rate and an increasing prevalence of infection with MDR bacteria in the SCI/D inpatient population. Particular attention should be given to bladder management, with an emphasis on minimizing the use of indwelling catheters.
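The study's MDR definition (acquired nonsusceptibility to at least one agent in three or more antimicrobial categories) translates directly into code; the category and agent names below are illustrative examples, not the study's panel:

```python
def is_mdr(nonsusceptible):
    """MDR per the study's definition: nonsusceptibility to at least
    one agent in three or more antimicrobial categories.
    `nonsusceptible` maps category name -> agents the isolate resists."""
    categories_hit = sum(1 for agents in nonsusceptible.values() if agents)
    return categories_hit >= 3

# Illustrative isolates (category and agent names are examples only):
mdr_isolate = {
    "penicillins": ["ampicillin"],
    "fluoroquinolones": ["ciprofloxacin"],
    "aminoglycosides": ["gentamicin"],
    "carbapenems": [],
}
susceptible_isolate = {"penicillins": ["ampicillin"], "fluoroquinolones": []}
```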
Background There is a paucity of knowledge on the long-term outcome in patients diagnosed with COVID-19. We describe a cohort of patients with a constellation of symptoms occurring four weeks after diagnosis causing different degrees of reduced functional capacity. Although different hypotheses, such as persistent immune activation or immunological dysfunction, have been proposed to explain this condition, to date, no physiopathological mechanism has been identified. Consequently, there are no therapeutic options besides symptomatic treatment and rehabilitation. Methods We evaluated patients with symptoms that persisted for at least 4 weeks after COVID-19. Epidemiological and clinical data were collected. Blood tests, including inflammatory markers, were conducted, and imaging studies were performed if deemed necessary. Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) reverse transcription polymerase chain reaction (RT-PCR) was performed on plasma, stool, and urine. Patients were offered antiviral treatment (compassionate use). Results We evaluated 29 patients who reported fatigue, muscle pain, dyspnea, inappropriate tachycardia, and low-grade fever. The median number of days from COVID-19 to a positive RT-PCR in extra-respiratory samples was 55 (39–67). Previous COVID-19 was mild in 55% of the cases. Thirteen patients (45%) had positive plasma RT-PCR results and 51% were positive in at least one RT-PCR sample (plasma, urine, or stool). Functional status was severely reduced in 48% of the subjects. Eighteen patients (62%) received antiviral treatment. Improvement was seen in most patients (p < 0.001), and patients in the treatment group achieved better outcomes, with significant differences (p = 0.01). Conclusions In a cohort of COVID-19 patients with persistent symptoms, 45% had detectable plasma SARS-CoV-2 RNA. Our results indicate possible systemic viral persistence in these patients, who may benefit from antiviral treatment strategies.
Characteristics of the included children (N = 225)
Percentages of combined antiretroviral drugs prescribed off-label by children's age and weight
Background Early start of highly active antiretroviral therapy (HAART) in perinatally HIV-1 infected children is the optimal strategy to prevent immunological and clinical deterioration. To date, according to the EMA, only 35% of antiretroviral drugs are licensed in children < 2 years of age and 60% in those aged 2–12 years, due to the lack of adequate paediatric clinical studies on pharmacokinetics, pharmacodynamics and drug safety in children. Methods An observational retrospective study investigating the rate and the outcomes of off-label prescription of HAART was conducted on 225 perinatally HIV-1 infected children enrolled in the Italian Register for HIV Infection in Children and followed up from 2001 to 2018. Results 22.2% (50/225) of the included children were receiving an off-label HAART regimen at the last check. Only 26% (13/50) of the off-label children had an undetectable viral load (VL) before commencing the regimen, and 52.0% (26/50) had a CD4 + T lymphocyte percentage > 25%. At the last check, during the off-label regimen, 80% (40/50) of patients had an undetectable VL, and 90% (45/50) displayed a CD4 + T lymphocyte percentage > 25%. The most widely used off-label drugs were: dolutegravir/abacavir/lamivudine (16%; 8/50), emtricitabine/tenofovir disoproxil (22%; 11/50), lopinavir/ritonavir (20%; 10/50) and elvitegravir/cobicistat/emtricitabine/tenofovir alafenamide (10%; 10/50). At logistic regression analysis, a detectable VL before starting the current HAART regimen was a risk factor for receiving an off-label therapy (OR: 2.41; 95% CI 1.13–5.19; p = 0.024). Moreover, children < 2 years of age were at increased risk of receiving off-label HAART with respect to older children (OR: 3.24; 95% CI 1063–7.3; p = 0.001). Although our safety data regarding off-label regimens were poor, no adverse events were reported.
Conclusion The prescription of an off-label HAART regimen in perinatally HIV-1 infected children was common, in particular in children with detectable VL despite previous HAART and in younger children, especially those receiving their first regimen. Our data suggest similar proportions of virological and immunological success at last check among children receiving off-label or on-label HAART. Larger studies are needed to better clarify the efficacy and safety of off-label HAART regimens in children, in order to support broader on-label prescribing in children.
  • Cindy Simoens
  • Tarik Gheit
  • Ruediger Ridder
  • [...]
  • Thara Somanathan
Background The incidence of high-risk human papillomavirus (hrHPV)-driven head and neck squamous cell carcinoma, in particular oropharyngeal cancers (OPC), is increasing in high-resource countries. Patients with HPV-induced cancer respond better to treatment and consequently have lower case-fatality rates than patients with HPV-unrelated OPC. These considerations highlight the importance of reliable and accurate markers to diagnose truly HPV-induced OPC. Methods The accuracy of three possible test strategies, i.e. (a) hrHPV DNA PCR (DNA), (b) p16(INK4a) immunohistochemistry (IHC) (p16), and (c) the combination of both tests (considering joint DNA and p16 positivity as the positivity criterion), was analysed in tissue samples from 99 Belgian OPC patients enrolled in the HPV-AHEAD study. Presence of HPV E6*I mRNA (mRNA) was considered the reference, indicating HPV etiology. Results Ninety-nine OPC patients were included, for which the positivity rates were 36.4%, 34.0% and 28.9% for DNA, p16 and mRNA, respectively. Ninety-five OPC patients had valid test results for all three tests (DNA, p16 and mRNA). Using mRNA status as the reference, DNA testing showed 100% (28/28) sensitivity and 92.5% (62/67) specificity for the detection of HPV-driven cancer. p16 was 96.4% (27/28) sensitive and equally specific (92.5%; 62/67). The sensitivity and specificity of combined p16 + DNA testing were 96.4% (27/28) and 97.0% (65/67), respectively. In this series, p16 alone and combined p16 + DNA missed 1 in 28 HPV-driven cancers, but p16 alone misclassified 5 in 67 non-HPV-driven cases as positive, whereas combined testing would misclassify only 2 in 67. Conclusions Single hrHPV DNA PCR and p16(INK4a) IHC are highly sensitive but less specific than combined testing for diagnosing HPV-driven OPC. Disease prognostication based on this combined test result can therefore be encouraged.
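The sensitivities and specificities quoted above follow directly from the reported counts against the mRNA reference (28 mRNA-positive and 67 mRNA-negative patients with valid results):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity of a test against a reference standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts reported in the abstract (28 mRNA-positive, 67 mRNA-negative):
dna_sens, dna_spec = sens_spec(tp=28, fn=0, tn=62, fp=5)
p16_sens, p16_spec = sens_spec(tp=27, fn=1, tn=62, fp=5)
combo_sens, combo_spec = sens_spec(tp=27, fn=1, tn=65, fp=2)
```

These reproduce the abstract's 100%/92.5% (DNA), 96.4%/92.5% (p16), and 96.4%/97.0% (combined) figures.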
The temporal pattern of case numbers, severe rate, case fatality rate, and days of ODI for all COVID-19 patients in the mainland of China from January to March 2020. The left vertical axis corresponds to the daily severe rate and case fatality rate; the right vertical axis corresponds to the ODI. Cases after March 2020 are not shown due to their small proportion
The patterns of ODI-related COVID-19 disease severe rate and case fatality rate examined by Join-Point regression models. A‒D indicate the overall severe rate and that stratified by sex, age and regions, respectively. E–H indicate the overall case fatality rate and that stratified by sex, age and regions, respectively. For each panel, red and blue points indicate severe rates and case fatality rates at each day of ODI, which were fitted by the red or blue curve. The arrows indicate the turning points of fitted curves. The Annual Percent Change (APC) value of each fitted curve is provided for each panel. *APC is significantly different from zero at the alpha = 0.05 level
The odds ratio and attributable fraction of COVID-19 disease severe rate and case fatality rate (CFR) from ODIs in the mainland of China. A Severe rate for all cases. B Severe rate stratified by sex. C Severe rate stratified by age. D Severe rate stratified by region. E Case fatality rate for all cases. F Case fatality rate stratified by sex. G Case fatality rate stratified by age. H Case fatality rate stratified by region. The points and lines represent odds ratios and their 95% CIs. The bars represent the attributable fractions, with the significance of differences indicated by asterisks (*P < 0.05; **P < 0.01)
The predicted and observed numbers of severe and death cases according to different cutoff values of ODI
  • Qing-Bin Lu
  • Tian-Le Che
  • Li-Ping Wang
  • [...]
  • Zhong-Jie Li
Background To quantitatively assess the impact of the onset-to-diagnosis interval (ODI) on severity and death for coronavirus disease 2019 (COVID-19) patients. Methods This retrospective study was based on data on COVID-19 cases over 40 years of age in China reported through China’s National Notifiable Infectious Disease Surveillance System from February 5, 2020 to October 8, 2020. The impacts of ODI on severe rate (SR) and case fatality rate (CFR) were evaluated at the individual and population levels, and were further disaggregated by sex, age and geographic origin. Results As the ODI rapidly declined from around 40 days in early January to < 3 days in early March, both the CFR and SR of COVID-19 largely dropped below 5% in China. After adjusting for age, sex, and region, an effect of ODI on SR was observed, with the highest OR of 2.95 (95% CI 2.37‒3.66) at Day 10–11 and an attributable fraction (AF) of 29.1% (95% CI 22.2‒36.1%) at Day 8–9. However, little effect of ODI on CFR was observed. Moreover, a discrepancy in effect magnitude was found, showing a greater effect of ODI on SR among patients of male sex, younger age, and those cases in Wuhan. Conclusion The ODI was significantly associated with the severity of COVID-19, highlighting the importance of timely diagnosis, especially for the patient groups shown to benefit most from early diagnosis.
Flow chart of participants included in the trial. IPN infected pancreatic necrosis, mNGS metagenomic next-generation sequencing
Comparison of plasma mNGS and blood culture for detection of pathogens. A Pathogens detected by mNGS and BC; B Comparison of positive rates of mNGS and BC. C Comparison of the number of pathogens detected by mNGS and BC; mNGS metagenomic next-generation sequencing, BC blood culture
Test performance of plasma mNGS in detecting IPN-related pathogens. IPN infected pancreatic necrosis, SPN sterile pancreatic necrosis, PPA positive percent agreement, NPA negative percent agreement
  • Donghuang Hong
  • Peng Wang
  • Jingzhu Zhang
  • [...]
  • Weiqin Li
Background Infected pancreatic necrosis (IPN) is a life-threatening complication of acute pancreatitis (AP). Timely diagnosis of IPN could facilitate appropriate treatment, but there is a lack of reliable non-invasive screening tests. In this study, we aimed to evaluate the diagnostic value of plasma metagenomic next-generation sequencing (mNGS) based on circulating microbial cell-free DNA in patients with suspected IPN. Methods From October 2020 to October 2021, 44 suspected IPN patients who underwent plasma mNGS were reviewed. Confirmatory diagnosis of IPN within two weeks after the index blood sampling was considered the reference standard. The confirmation of IPN relied on the microbiological results of drains obtained from the necrotic collections. The distribution of the pathogens identified by plasma mNGS was analyzed. Positive percent agreement (PPA) and negative percent agreement (NPA) were evaluated based on the conformity between the overall mNGS results and culture results of IPN drains. In addition, clinical outcomes were compared between mNGS-positive and mNGS-negative patients. Results Across all the study samples, thirteen species of bacteria and five species of fungi were detected by mNGS. The positivity rate of plasma mNGS was 54.55% (24/44). Of the 24 mNGS-positive cases, twenty (83.33%, 95% CI, 68.42–98.24%) were consistent with the culture results of IPN drains. The PPA and NPA of plasma mNGS for IPN were 80.0% (20/25; 95% CI, 64.32–95.68%) and 89.47% (17/19; 95% CI, 75.67–100%), respectively. Compared with the mNGS-negative group, patients in the positive group had more new-onset septic shock [12 (50.0%) vs. 4 (20.0%), p = 0.039]. Conclusion IPN-relevant pathogens can be identified by plasma mNGS, potentially facilitating appropriate treatment. The clinical application of mNGS in this cohort appears feasible.
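The PPA and NPA above, with their confidence intervals, can be reproduced from the reported counts using a Wald interval truncated at 100% (the match with the quoted intervals suggests this is the interval the authors used, though the abstract does not say so):

```python
import math

def agreement_ci(agree, total, z=1.96):
    """Percent agreement with a Wald 95% CI, truncated at 100%."""
    p = agree / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return p, p - half_width, min(1.0, p + half_width)

ppa = agreement_ci(20, 25)  # positive percent agreement: 80.0% (64.32-95.68%)
npa = agreement_ci(17, 19)  # negative percent agreement: 89.47% (75.67-100%)
```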
Correlation between N-gene cycle threshold (Ct) values and ORF1ab gene Ct values among index cases (top). Correlation between S-gene cycle threshold (Ct) values and ORF1ab gene Ct values among index cases (middle). Percent of index cases by SARS-CoV-2 viral load in saliva, estimated using ORF1ab Ct values (bottom)
The number of index cases detected between October 1, 2020 and April 15, 2021 at UT Austin by the number of close contacts that tested (top) and number of test-positive contacts (bottom)
Proportion of test-positive contacts and 95% confidence intervals by presentation of cough among index cases on the day of exposure (top) and viral load at the time of test (bottom)
Background Factors that lead to successful SARS-CoV-2 transmission are still not well described. We investigated the association between a case’s viral load and the risk of transmission to contacts in the context of other exposure-related factors. Methods Data were generated through routine testing and contact tracing at a large university. Case viral loads were obtained from cycle threshold values associated with a positive polymerase chain reaction test result from October 1, 2020 to April 15, 2021. Cases were included if they had at least one contact who tested 3–14 days after the exposure. Case-contact pairs were formed by linking index cases with contacts. Chi-square tests were used to evaluate differences in proportions of contacts testing positive. Generalized estimating equation models with a log link were used to evaluate whether viral load and other exposure-related factors were associated with a contact testing positive. Results Median viral load among the 212 cases included in the study was 5.6 (1.8–10.4) log10 RNA copies per mL of saliva. Among 365 contacts, 70 (19%) tested positive following their exposure; 36 (51%) were exposed to a case that was asymptomatic or pre-symptomatic on the day of exposure. The proportion of contacts that tested positive increased monotonically with index case viral load (12%, 23% and 25% corresponding to < 5, 5–8 and > 8 log10 copies per mL, respectively; χ² = 7.18, df = 2, p = 0.03). Adjusting for cough, time between test and exposure, and physical contact, the risk of transmission to a close contact was significantly associated with viral load (RR = 1.27, 95% CI 1.22–1.32). Conclusions Further research is needed to understand whether these relationships persist for newer variants. For those variants whose transmission advantage is mediated through a high viral load, public health measures could be scaled accordingly.
Index cases with higher viral loads could be prioritized for contact tracing and recommendations to quarantine contacts could be made according to the likelihood of transmission based on risk factors such as viral load.
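The monotonic rise in contact positivity across viral-load bins (< 5, 5–8, > 8 log10 copies/mL) is a simple grouped proportion; a sketch with hypothetical case-contact pairs (the study's pair-level data are not reproduced here):

```python
def attack_rate_by_bin(pairs, edges=(5.0, 8.0)):
    """Group case-contact pairs by index-case viral load (log10 copies/mL)
    and return the share of contacts testing positive in each bin."""
    bins = {"<5": [0, 0], "5-8": [0, 0], ">8": [0, 0]}  # [positives, total]
    for viral_load, contact_positive in pairs:
        if viral_load < edges[0]:
            key = "<5"
        elif viral_load <= edges[1]:
            key = "5-8"
        else:
            key = ">8"
        bins[key][0] += contact_positive
        bins[key][1] += 1
    return {k: pos / n if n else None for k, (pos, n) in bins.items()}

# Hypothetical (viral load, contact tested positive) pairs:
rates = attack_rate_by_bin(
    [(4.0, 1), (4.0, 0), (6.0, 1), (6.0, 1), (6.0, 0), (9.0, 1)]
)
```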
Background Pre-exposure prophylaxis (PrEP) can significantly reduce HIV acquisition, especially among communities with high HIV prevalence, including men who have sex with men (MSM). Much research has found suboptimal PrEP persistence; however, few studies examine factors that enhance PrEP persistence in real-world settings. Methods We interviewed 33 patients who identified as MSM at three different PrEP clinics in three regions of the U.S. (Northeast, South, Midwest). Participants were eligible if they took PrEP and had been retained in care for a minimum of 6 months. Interviews explored social, structural, clinic-level and behavioral factors influencing PrEP persistence. Results Through thematic analysis we identified the following factors as promoting PrEP persistence: (1) navigation to reduce out-of-pocket costs of PrEP (structural), (2) social norms that support PrEP use (social), (3) access to LGBTQ + affirming medical providers (clinical), (4) medication as part of a daily routine (behavioral), and (5) facilitation of sexual health agency (belief). Discussion In this sample, persistence in PrEP care was associated with structural and social supports as well as a high level of perceived internal control over protecting their health by taking PrEP. Patients might benefit from increased access, LGBTQ + affirming medical providers, and communications emphasizing that PrEP can promote sexual health.
Phylogenetic trees of coxsackievirus B1, coxsackievirus A6, and coxsackievirus A4 based on partial VP1 sequences, constructed with the neighbor-joining algorithm implemented in MEGA (version 7.0) using the Kimura two-parameter substitution model and 1000 bootstrap pseudo-replicates. Only strong bootstrap values (> 75%) are shown. ● Indicates strains isolated in this investigation; ▲ indicates the prototype strain
Background Hand, foot, and mouth disease (HFMD) is a common childhood infectious disease caused by more than 20 enterovirus (EV) serotypes. In recent years, enterovirus A71 (EV-A71) has been supplanted by coxsackievirus A6 (CV-A6) as the predominant serotype. Multiple EV serotypes co-circulate in HFMD epidemics, and this study aimed to investigate the etiological epidemic characteristics of an HFMD outbreak in Kunming, China in 2019. Methods Clinical samples from 459 EV-associated HFMD patients in 2019 were used to amplify the VP1 gene region with three sets of primers and to identify serotypes using molecular methods. Phylogenetic analyses were performed based on the VP1 gene. Results Three hundred and forty-eight of the 459 HFMD patients were confirmed as EV infections. Of these, 191 (41.61%) were single EV infections and 34.20% were co-infections. The EVs were assigned to 18 serotypes, of which CV-A6 was predominant (11.33%), followed by CV-B1 (8.93%), CV-A4 (5.23%), CV-A9 (4.58%), CV-A16 (3.49%), and CV-A10 and CV-A5 (both 1.96%). Co-infection of CV-A6 with other EVs was present in 15.25% of these cases, followed by co-infection of CV-A16 with other EVs. The VP1 sequences used in the phylogenetic analyses showed that the CV-A6, CV-B1 and CV-A4 sequences belonged to sub-genogroup D3 and genogroups F and E, respectively. Conclusion Co-circulation and co-infection of multiple serotypes were the etiological characteristics of the HFMD epidemic in Kunming, China in 2019, with CV-A6, CV-B1 and CV-A4 as the predominant serotypes. This is the first report of CV-B1 as a predominant serotype in China and may provide valuable information for the diagnosis, prevention and control of HFMD.
MRI showing extensive swelling in the soft tissue of the lower legs. MRI magnetic resonance imaging
Twelve hours after admission: Bullae and erythematous plaque in both lower legs
Fifteen hours after admission: Progression of the skin lesions in both legs
Histopathological findings: Swollen muscle fibers, dissolved necrotic sarcoplasm, high bacterial concentration in the muscle and surrounding adipose tissue, extensive infiltration of acute and chronic inflammatory cells, and dilated blood vessels with congested vasculitis. A Hematoxylin and eosin (H&E) stain, ×100. B H&E stain, ×400
Day 4 after admission: progression of tissue necrosis to the thigh
Background Vibrio vulnificus infections develop rapidly and are associated with a high mortality rate. The rates of diagnosis and treatment are directly associated with mortality. Case presentation We describe an unusual case of a 61-year-old male patient with chronic liver disease and diabetes who presented with a chief complaint of pain in both lower legs due to V. vulnificus infection in winter. Within 12 h of arrival, typical skin lesions appeared, and the patient rapidly developed primary sepsis. Despite prompt appropriate antibiotic and surgical treatment, the patient died 16 days after admission. Conclusion Our case findings suggest that V. vulnificus infection should be suspected in patients with an unclear infection status experiencing pain of unknown origin in the lower legs, particularly in patients with liver disease or diabetes, immunocompromised status, and alcoholism.
Conceptual framework showing the possible impact of COVID-19 and antimalarial resistance on malaria care among pregnant women. Dashed line (green): antimalarial resistance; Dotted line (blue): COVID-19; Continuous line (yellow): both COVID-19 and antimalarial resistance
Background Uganda accounts for 5% of all malaria cases and deaths reported globally and, in endemic countries, pregnancy is a risk factor for both acquisition of P. falciparum infection and development of severe malaria. In recent years, malaria control has been threatened by the COVID-19 pandemic and by the emergence, in Northern Uganda, of resistance to both artemisinin derivatives and sulfadoxine-pyrimethamine. Methods In this facility-based, prospective, observational study, pregnant women will be recruited at antenatal-care visits and followed up until delivery. Collected data will explore the incidence of asymptomatic parasitemia and malaria-related outcomes, as well as attitudes towards malaria prevention, administration of intermittent preventive treatment, healthcare-seeking behavior and use of insecticide-treated nets. A subpopulation of women diagnosed with malaria will be recruited and their blood samples will be analyzed for detection of genetic markers of resistance to artemisinin derivatives and sulfadoxine-pyrimethamine. Also, to investigate the impact of COVID-19 on malaria care among pregnant women, a retrospective, interrupted time series analysis will be conducted at the study sites for the period January 2018 to December 2021. Discussion The present study will explore the impact of the COVID-19 pandemic on the incidence of malaria and malaria-related adverse outcomes, along with the prevalence of resistance to artemisinin derivatives and sulfadoxine-pyrimethamine. To our knowledge, this is the first study aiming to explore the combined effect of these factors on a cohort of pregnant women. Trial registration: This study has been registered on the public website on 26th April, 2022. Identifier: NCT05348746.
Study flow chart displaying patient counts in each treatment group. 2-DG 2-deoxy-d-glucose, SOC standard of care, SOC1 SOC in Part A of the study, SOC2 SOC in Part B of the study
Median time (days) to clinical endpoints compared between patients receiving 2-DG 90 mg/kg/day plus SOC and patients receiving standard of care only. 2-DG 2-deoxy-d-glucose, SOC standard of care, WHO World Health Organization
Background At present, no single efficacious therapeutic exists for acute COVID-19 management, and a multimodal approach may be necessary. 2-Deoxy-D-glucose (2-DG) is a metabolic inhibitor that has been shown to limit multiplication of SARS-CoV-2 in vitro. We evaluated the efficacy and safety of 2-DG as an adjunct to standard care in the treatment of moderate to severe COVID-19. Methods We conducted a randomized, open-label, phase II clinical study to evaluate the efficacy, safety, and tolerability of 2-DG administered as an adjunct to standard of care (SOC). A total of 110 patients between the ages of 18 and 65 years with moderate to severe COVID-19 were included. Patients were randomized to receive 63, 90, or 126 mg/kg/day 2-DG in addition to SOC, or SOC only. Times to maintaining SpO2 ≥ 94% on room air, discharge, clinical recovery, vital signs normalisation, improvement by 1 and 2 points on the WHO clinical progression scale, negative conversion on RT-PCR, requirement for intensive care, and mortality were analyzed to assess efficacy. Results Patients treated with 90 mg/kg/day 2-DG plus SOC showed better outcomes. Time to maintaining SpO2 ≥ 94% was significantly shorter in the 2-DG 90 mg group compared to SOC (median 2.5 days vs. 5 days, hazard ratio [95% confidence interval] = 2.3 [1.14, 4.64], p = 0.0201). Times to discharge from the isolation ward, to clinical recovery, and to vital signs normalization were significantly shorter for the 2-DG 90 mg group. All three doses of 2-DG were well tolerated. Thirty-three (30.3%) patients reported 65 adverse events, most of which (86%) were mild. Conclusions 2-DG 90 mg/kg/day as an adjunct to SOC showed clinical benefit over SOC alone in the treatment of moderate to severe COVID-19. The promising trends observed in the current phase II study are encouraging for confirmatory evaluation of the efficacy and safety of 2-DG in a larger phase III trial. Trial registration: CTRI, CTRI/2020/06/025664.
Registered 5th June 2020.
Flow chart of participant selection. Study population selection and criteria for exclusion, a total of 51 patients were included in the analysis. CKRT, continuous kidney replacement therapy; CVVH, continuous venovenous hemofiltration; ICUs, intensive care units; ECMO, extracorporeal membrane oxygenation; IKRT, intermittent kidney replacement treatment
Scatter plot of observed vancomycin concentration in the II group (A) and CI group (B). Scatter plot of AUC24/MIC in the II group (C) and CI group (D). Solid line represents the mean ± SD. The gray area represents the target range
Target attainment of initial observed concentration (A) and overall observed concentration (B) in the II group and CI group. Target attainment of initial AUC24/MIC (C) and overall AUC24/MIC (D) in the II group and CI group during CVVHa. aFor target concentration, therapeutic exposure is defined as a trough concentration of 15–25 mg/L for the continuous infusion group (CI group) and a steady-state concentration of 10–20 mg/L for the intermittent infusion group (II group), respectively. For the AUC24/MIC target, therapeutic exposure is defined as 400–650 for both groups. Supratherapeutic exposure is defined as the target PK/PD indices above the desired range, whereas subtherapeutic exposure is defined as PK/PD indices below the desired range. *Bonferroni-adjusted P < 0.05
Correlation analysis of target PK/PD indices with EFR. Correlation of observed concentration with EFR in the II group (A) and CI group (B). Correlation of AUC24/MIC with EFR in the II group (C) and CI group (D). The Spearman correlation coefficient r is shown. Statistical significance was assessed by Spearman correlation. EFR, effluent flow rate
Abstract Background A prospective interventional study comparing outcomes in critically ill patients receiving intermittent infusion (II) or continuous infusion (CI) of vancomycin during continuous venovenous hemofiltration (CVVH) is lacking. The objective of this study was to compare the pharmacokinetic/pharmacodynamic (PK/PD) target attainment, therapeutic efficacy and safety among critically ill patients who received CI or II of vancomycin in a prospective interventional trial and to explore the correlations of effluent flow rate (EFR) with PK/PD indices. Methods This prospective interventional study was conducted in two independent intensive care units (ICUs) from February 2021 to January 2022. Patients in one ICU were assigned to receive CI (intervention group) of vancomycin, whereas patients in the other ICU were assigned to receive the II regimen (control group). The primary outcome was to compare the PK/PD target attainment, including target concentration and target area under the curve over 24 h to minimum inhibitory concentration (AUC24/MIC). Results Overall target attainment of PK/PD indices was higher with CI compared with II, irrespective of target concentration (78.7% vs. 40.5%; P
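The PK/PD targets described above (trough 15–25 mg/L for CI, steady-state 10–20 mg/L for II, and AUC24/MIC of 400–650 for both groups) amount to a simple range check. A minimal sketch of that classification; the patient values below are invented for illustration:

```python
def classify_exposure(value: float, low: float, high: float) -> str:
    """Classify a PK/PD index against a therapeutic target range."""
    if value < low:
        return "subtherapeutic"
    if value > high:
        return "supratherapeutic"
    return "therapeutic"

def classify_auc24_mic(auc24: float, mic: float) -> str:
    """AUC24/MIC target of 400-650 applies to both infusion groups."""
    return classify_exposure(auc24 / mic, 400.0, 650.0)

# Hypothetical patient: AUC24 = 500 mg*h/L against an MIC of 1 mg/L.
status = classify_auc24_mic(500.0, 1.0)
# A trough of 8 mg/L against the CI target of 15-25 mg/L would be
# subtherapeutic: classify_exposure(8.0, 15.0, 25.0)
```

Target attainment in each group is then the proportion of patients whose index falls in the "therapeutic" band.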
Enrollment of inpatients undergoing CRS operation. CRS chronic rhinosinusitis, NPX nasopharynx, SS Sjogren’s syndrome
The compositions of isolated bacterial species in SS-CRS and non-SS-CRS. a Total species. b Facultative anaerobic or aerobic species. c Anaerobic species
A Top three bacterial genera of facultative anaerobes or aerobes in SS-CRS and non-SS-CRS. Staphy., Staphylococcus; Pseudo., Pseudomonas; Strep., Streptococcus; Kleb., Klebsiella; SS, Sjogren’s Syndrome; CRS, chronic rhinosinusitis. B Top three bacterial genera of anaerobes in SS-CRS and non-SS-CRS. Cuti., Cutibacterium; Pepto., Peptostreptococcus; Fuso., Fusobacter; Prevo., Prevotella
A Top three bacterial species of facultative anaerobes or aerobes in SS-CRS and non-SS-CRS. CoNS, coagulase-negative Staphylococcus; PA, Pseudomonas aeruginosa; MSSA, methicillin-sensitive Staphylococcus aureus; SE, Staphylococcus epidermidis. B Top three bacterial species of anaerobes in SS-CRS and non-SS-CRS. Cuti., Cutibacterium; Pepto., Peptostreptococcus
The major antibiotic-resistant bacterial species in SS-CRS and non-SS-CRS. MRSA, methicillin-resistant Staphylococcus aureus; ESBL-KP, extended-spectrum β-lactamase-producing Klebsiella pneumoniae; CRPA, carbapenem-resistant Pseudomonas aeruginosa; MDRAB, multidrug-resistant Acinetobacter baumannii
Background Chronic rhinosinusitis (CRS) affects the quality of life of many people worldwide and can cause comorbidities. Our previous research proved that Sjogren’s syndrome (SS) is a predisposing factor for CRS, with a 2.5-fold associated risk. Antibiotics are important in CRS treatment; however, there is a paucity of research on the pathogenic bacteria of SS-CRS. We conducted this study to investigate the pathogenic differences between SS-CRS and non-SS-CRS, aiming to give clinicians references when selecting antibiotics to treat SS-CRS. Materials and methods A total of 14,678 patients hospitalized for CRS operation from 2004 to 2018 were identified from the Chang Gung Research Database. These CRS cases were classified as either SS-CRS or non-SS-CRS. We analyzed their bacterial distribution by studying the results of the pus cultures performed alongside surgery. Results The top facultative anaerobic or aerobic bacteria isolated in the SS-CRS group were coagulase-negative Staphylococcus (CoNS: 34.3%), Pseudomonas aeruginosa (28.6%), methicillin-sensitive Staphylococcus aureus (MSSA: 20%), and Staphylococcus epidermidis (20%). In the non-SS-CRS group, S. epidermidis (29.3%), CoNS (25.7%), and MSSA (14.2%) were identified. The top three anaerobic bacterial genera were Cutibacterium (54.3%), Peptostreptococcus (11.4%), and Fusobacterium (11.4%) in the SS-CRS group and Cutibacterium (53.8%), Peptostreptococcus (25%), and Prevotella (12.9%) in the non-SS-CRS group. Conclusions P. aeruginosa is a major pathogen in SS-CRS patients. In addition, physicians should be aware of potential Fusobacterium and antibiotic-resistant bacterial infection in patients with SS-CRS.
Imaging and pathology findings. A Chest radiograph showed irregular opacities at the left lower lung and blunted left costophrenic angle. B, C Hematoxylin and eosin and acid-fast stained lung sections revealed granulomatous inflammation with the presence of acid-fast bacilli. D FDG-PET/CT showed intense FDG hypermetabolism at bilateral lungs, especially LUL (SUVmax, 12.3) and multiple hot areas at bilateral mediastinal nodes (SUVmax, 8.6), bilateral supraclavicular regions (SUVmax, 4.9), right mesenteric region (SUVmax, 8.9) and left para-sternal node (SUVmax, 3.1). FDG PET-CT, fluorodeoxyglucose positron emission tomography-computed tomography; SUVmax, maximum standardized uptake value
Background Patients with adult-onset immunodeficiency syndrome due to anti-interferon-γ autoantibodies (AIGAs) are susceptible to disseminated Mycobacterium avium complex (MAC) infections. M. chimaera, a newly identified MAC species, is distinguished from the others by its reduced virulence. Previous cases of disseminated M. chimaera infection have been linked to cardiothoracic surgery. Reports of disseminated M. chimaera in patients without a history of cardiothoracic surgery are rare. Case presentation A 57-year-old Asian man, previously healthy, presented with fever, dry cough, exertional dyspnea, and decreased appetite. The delayed resolution of pneumonia despite antibiotic treatment prompted further imaging studies and biopsies from the lung and lymph node. Fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT) demonstrated intense uptake in lung consolidations and diffuse lymphadenopathy. Cultures of specimens obtained from sputum, blood, stool, lung tissue, and lymph node grew M. chimaera. Further immunological evaluation disclosed the presence of neutralizing AIGAs, which possibly led to acquired immunodeficiency and disseminated M. chimaera infection. Conclusions We herein present the first case of adult-onset immunodeficiency due to AIGAs complicated by disseminated M. chimaera infection. Further immunological evaluation, including AIGAs, may be warranted in otherwise healthy patients who present with disseminated mycobacterial infection.
7-day incidences per 100,000 inhabitants at the travel destination of tested returnees at the time of testing. Red lines: thresholds for public health measures in Germany
Odds ratios (dark blue) plus 95% confidence interval (light blue) in univariable logistic regression of selected variables and COVID-19 test result with weekly growing dataset. For better readability, upper confidence interval values above 10 have been truncated (orange dots)
Frequency of patients with positive and negative SARS-CoV-2 test results by triage score (red/blue bars) and sensitivity and specificity at score cut-offs (green/orange bars). N.B.: For better readability, the frequency of patients is shown in log scale. The red lines indicate the chosen cut-off values for risk-group classification (see below)
Background Numerous scoring tools have been developed for assessing the probability of SARS-CoV-2 test positivity, though few are suitable or adapted for outpatient triage of health care workers. Methods We retrospectively analysed 3069 patient records of health care workers admitted to the COVID-19 Testing Unit of the Ludwig-Maximilians-Universität of Munich between January 27 and September 30, 2020, for real-time polymerase chain reaction analysis of naso- or oropharyngeal swabs. Variables for a multivariable logistic regression model were collected from self-completed case report forms and selected through stepwise backward selection. Internal validation was conducted by bootstrapping. We then created a weighted point-scoring system from the logistic regression coefficients. Results 4076 (97.12%) negative and 121 (2.88%) positive test results were analysed. The majority were young (mean age: 38.0), female (69.8%) and asymptomatic (67.8%). Characteristics that correlated with PCR positivity included close-contact professions (physicians, nurses, physiotherapists), flu-like symptoms (e.g., fever, rhinorrhoea, headache), abdominal symptoms (nausea/emesis, abdominal pain, diarrhoea), fewer days since symptom onset, and contact with a SARS-CoV-2-positive index case. Variables selected for the final model included symptoms (fever, cough, abdominal pain, anosmia/ageusia) and exposures (to SARS-CoV-2-positive individuals and, specifically, to positive patients). Internal validation by bootstrapping yielded a corrected area under the receiver operating characteristic curve of 76.43%. We present sensitivity and specificity at different prediction cut-off points. In a subgroup with further workup, asthma seems to have a protective effect with regard to test positivity, and measured temperature was found to be less predictive than anamnestic fever.
Conclusions We consider low-threshold testing for health care workers a valuable strategy for infection control and provide an easily applicable triage score for assessing the probability of infection in health care workers in case of resource scarcity.
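The conversion of logistic regression coefficients into a weighted point score, as described above, is commonly done by dividing each coefficient by a base value (often the smallest coefficient in the model) and rounding to integers. A minimal sketch; the coefficients and variable names below are invented for illustration and are not the published model:

```python
def points_from_coefficients(coefs: dict, base: float) -> dict:
    """Scale logistic regression coefficients to integer points.

    Each coefficient is divided by a base value (e.g. the smallest
    coefficient in the model) and rounded, preserving relative weights.
    """
    return {name: round(b / base) for name, b in coefs.items()}

def triage_score(points: dict, present: set) -> int:
    """Sum points for the risk factors reported present on one case form."""
    return sum(p for name, p in points.items() if name in present)

# Hypothetical coefficients (not the published values):
coefs = {"fever": 1.2, "cough": 0.6, "anosmia_ageusia": 1.8, "contact_positive": 2.4}
points = points_from_coefficients(coefs, base=0.6)
score = triage_score(points, {"fever", "contact_positive"})
```

A case is then classified into a risk group by comparing the summed score against the chosen cut-off values.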
Background China has experienced a continuous decreasing trend in the incidence of hepatitis A in recent years. Temporal trend analyses are helpful in exploring the reasons for the changing trend. Thus, this study aims to analyse the incidence trend of viral hepatitis A by region and age group in mainland China from 2004 to 2017 to evaluate the effectiveness of prevention and control measures. Methods Data on hepatitis A and population information were collected and analysed with a joinpoint regression model. Annual percentage changes (APCs) and average annual percentage changes (AAPCs) were estimated for the whole country and for each region and age group. Results From 2004 to 2017, the seasonality and periodicity of hepatitis A case numbers were obvious before 2008 but gradually diminished from 2008 to 2011 and disappeared from 2012 to 2017. The national incidence of hepatitis A (AAPC = − 12.1%) and the incidence rates for regions and age groups showed decreasing trends, with differences in the joinpoints and segments. Regarding regions, the hepatitis A incidence in the western region was always the highest among all regions, while a nonsignificant rebound was observed in the northeastern region from 2011 to 2017 (APC = 14.2%). Regarding age groups, the hepatitis A incidence showed the fastest decrease among children (AAPC = − 15.3%) and the slowest decrease among elderly individuals (AAPC = − 6.6%). Among all segments, the hepatitis A incidence among children had the largest APC value in 2007–2017, at − 20.4%. Conclusion The national annual incidence of hepatitis A continually declined from 2004 to 2017 and the gaps in hepatitis A incidence rates across different regions and age groups were greatly narrowed.
Comprehensive hepatitis A prevention and control strategies, including the use of routine vaccination during childhood in mainland China, especially the implementation of the national Expanded Program on Immunization (EPI) in 2008, resulted in substantial progress from 2004 to 2017. However, gaps remain. Regular monitoring and analysis of hepatitis A epidemic data and prompt adjustment of hepatitis A prevention and control strategies focusing on children, elderly individuals and those living in certain regions are recommended.
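Joinpoint models fit log-linear segments to the incidence series, so each segment's annual percent change (APC) follows directly from its slope, and the AAPC is a length-weighted geometric average of segment slopes. A minimal sketch of both formulas; the slopes and segment lengths below are hypothetical, not the study's fitted values:

```python
import math

def apc(slope: float) -> float:
    """Annual percent change from a log-linear slope: APC = (e^b - 1) * 100."""
    return (math.exp(slope) - 1.0) * 100.0

def aapc(segments: list) -> float:
    """Average annual percent change across joinpoint segments.

    segments: (slope, number_of_years) pairs; slopes are averaged
    weighted by segment length, then converted to a percentage.
    """
    total_years = sum(years for _, years in segments)
    weighted = sum(slope * years for slope, years in segments) / total_years
    return (math.exp(weighted) - 1.0) * 100.0

# Hypothetical: a 10-year decline split into a steeper and a slower segment.
decline = aapc([(-0.20, 4), (-0.10, 6)])
```

For example, the reported national AAPC of − 12.1% corresponds to a weighted mean slope of ln(0.879) on the log scale.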
Flowchart of the enrolled patients
ROC analysis of the studied biomarkers for predicting 28-day mortality in sepsis
Multi-marker approach predicting 28-day mortality for sepsis
Background We aimed to explore the prognostic utilities of C-reactive protein (CRP), procalcitonin (PCT), and the neutrophil CD64 (nCD64) index, in combination or alone, in septic patients. Methods We retrospectively included 349 septic patients (based on the Sepsis 3.0 definition). The primary outcome was 28-day all-cause mortality. A Cox regression model, receiver operating characteristic (ROC) curves, reclassification analysis, and Kaplan–Meier survival curves were used to evaluate the predictive efficacy of the above parameters. Results CRP and the nCD64 index were independent predictors of 28-day mortality for sepsis in the Cox regression model [CRP, HR 1.004 (95% CI 1.002–1.006), P < 0.001; nCD64 index, HR 1.263 (95% CI 1.187–1.345), P < 0.001]. Areas under the ROC curve (AUC) of CRP, PCT, nCD64 index, nCD64 index plus PCT, and nCD64 index plus CRP were 0.798 (95% CI 0.752–0.839), 0.833 (95% CI 0.790–0.871), 0.906 (95% CI 0.870–0.935), 0.910 (95% CI 0.875–0.938), and 0.916 (95% CI 0.881–0.943), respectively. nCD64 plus CRP performed best in prediction, discrimination, and reclassification of 28-day mortality risk in sepsis. The risk of 28-day mortality increased stepwise as the number of parameters exceeding their optimal cut-off values increased. Conclusions The nCD64 index combined with CRP was superior to CRP, PCT, the nCD64 index, and the nCD64 index plus PCT in predicting 28-day mortality in sepsis. A multi-marker approach could improve predictive accuracy and benefit septic patients.
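The stepwise risk gradient described above — mortality risk rising with the number of biomarkers above their optimal cut-offs — reduces to a simple count per patient. A minimal sketch; the cut-off values and patient measurements below are placeholders, not the study's fitted optima:

```python
def markers_above_cutoff(values: dict, cutoffs: dict) -> int:
    """Count how many biomarkers exceed their optimal cut-off values."""
    return sum(1 for name, v in values.items() if v > cutoffs[name])

# Placeholder cut-offs (the published optima are not given in the abstract):
cutoffs = {"crp_mg_l": 100.0, "pct_ng_ml": 2.0, "ncd64_index": 5.0}

# Hypothetical patient with two markers above cut-off:
patient = {"crp_mg_l": 150.0, "pct_ng_ml": 1.2, "ncd64_index": 7.5}
n_high = markers_above_cutoff(patient, cutoffs)
```

Patients can then be stratified into risk strata (0, 1, 2, or 3 markers elevated) for survival comparison.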
Regional distributions of weekly age standardised SARS-CoV-2 incidence rates by wave in Germany (for the calendar week with highest rate in Germany) and results of Moran’s test of residual correlation (spatial autocorrelation). Note. Shaded areas cover the four pandemic waves
Trajectories of weekly age-standardised SARS-CoV-2 incidence rates (ASIRs) for working-age population (aged 20–64 years) by levels of regional labour market indicators (based on tertiles) in Germany
Predicted age-standardised SARS-CoV-2 incidence rates (ASIRs) for working age population (aged 20–64 years) at given levels of labour market indicators (mean ± 1 SD) for different pandemic waves based on spatial error model for panel data (same adjustments as in Table 3)
Background Regional labour markets and their properties are named as potential reasons for regional variations in levels of SARS-CoV-2 infection rates, but empirical evidence is missing. Methods Using nationwide data on notified laboratory-confirmed SARS-CoV-2 infections, we calculated weekly age-standardised incidence rates (ASIRs) for working-age populations at the regional level of Germany’s 400 districts. Data covered nearly 2 years (March 2020 to December 2021), including the four main waves of the pandemic. For each pandemic wave, we investigated regional differences in weekly ASIRs according to three regional labour market indicators: (1) employment rate, (2) employment by sector, and (3) capacity to work from home. We used spatial panel regression analysis, which incorporates geospatial information and accounts for regional clustering of infections. Results For all four pandemic waves under study, we found that regions with higher proportions of people in employment had higher ASIRs and a steeper increase of infections during the waves. Further, the composition of the workforce mattered: rates were higher in regions with larger secondary sectors or where opportunities to work from home were comparatively low. Associations remained consistent after adjusting for potential confounders, including a proxy measure of regional vaccination progress. Conclusions If further validated by studies using individual-level data, our study calls for increased intervention efforts to improve protective measures at the workplace, particularly among workers of the secondary sector with no opportunity to work from home. It also points to the necessity of strengthening work and employment as essential components of pandemic preparedness plans.
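Age-standardised incidence rates like the ASIRs above weight age-specific rates by a standard population, so districts with different age structures become comparable. A minimal sketch of direct standardisation; the age groups and population figures are invented, not the German district data:

```python
def asir_per_100k(cases: dict, population: dict, standard: dict) -> float:
    """Directly age-standardised incidence rate per 100,000.

    cases/population: age-group counts for the region under study.
    standard: age distribution of the chosen standard population.
    """
    total_std = sum(standard.values())
    weighted_rate = sum((cases[g] / population[g]) * standard[g] for g in standard)
    return weighted_rate / total_std * 100_000

# Invented example with two working-age groups:
cases = {"20-44": 120, "45-64": 80}
population = {"20-44": 60_000, "45-64": 40_000}
standard = {"20-44": 50_000, "45-64": 50_000}
rate = asir_per_100k(cases, population, standard)
```

Computed weekly per district, these rates form the panel outcome for the spatial regression described in the Methods.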
Background: Increased intensity of pyrethroid resistance is threatening the effectiveness of insecticide-based interventions to control malaria in Africa. Assessing the extent of this aggravation and its impact on the efficacy of these tools is vital to ensure the continued control of major vectors. Here we took advantage of 2009 and 2014 data from Malawi to establish the extent of the resistance escalation in 2021 and assessed its impact on the performance of various bed nets. Methods: Indoor blood-fed and wild female Anopheles (An.) mosquitoes were collected with an electric aspirator in Chikwawa. Cocktail and SINE PCR were used to identify sibling species belonging to the An. funestus group and An. gambiae complex. The susceptibility profile to the four classes of insecticides was assessed using WHO tube bioassays. Data were recorded in Excel, analysed with VassarStats, and figures were produced with GraphPad. Results: In this study, a high level of resistance was observed to pyrethroids (permethrin, deltamethrin and alpha-cypermethrin, with mortality rates at the 5x discriminating concentration (DC) < 50% and at the 10x DC < 70%). A high level of resistance was also observed to the carbamate bendiocarb (mortality rate at 5x DC < 25%). Aggravation of resistance was also noticed between 2009 and 2021. For pyrethroids, the mortality rate for permethrin fell from 47.2% in 2009 to 13% in 2014 and 6.7% in 2021. For deltamethrin, the mortality rate fell from 42.3% in 2009 to 1.75% in 2014 and 5.2% in 2021. For bendiocarb, the mortality rate fell from 60% in 2009 to 30.1% in 2014 and 12.2% in 2021. The high resistance observed is consistent with a drastic loss of efficacy of pyrethroid-only bed nets, although piperonyl butoxide (PBO)-based nets remain effective. The resistance pattern observed was linked with strong up-regulation of the P450 genes CYP6P9a, CYP6P9b and CYP6M7 in An. funestus s.s. mosquitoes surviving exposure to deltamethrin at 1x, 5x and 10x DC.
A significant association was observed between the 6.5 kb structural variant and resistance escalation, with homozygote resistant mosquitoes (SV+/SV+) more likely to survive exposure to deltamethrin at 5x and 10x DC (OR = 4.1; P < 0.001) than heterozygotes. However, a significant proportion of mosquitoes survived the synergist assays with PBO, suggesting that mechanisms other than P450s are present. Conclusions: This resistance aggravation in the Malawian An. funestus s.s. population highlights an urgent need to deploy novel control tools not relying on pyrethroids to improve the effectiveness of vector control.
Background The COVID-19 pandemic has affected all people across the globe. Regional and community differences in timing and severity of surges throughout the pandemic can provide insight into risk factors for worse outcomes in those hospitalized with COVID-19. Methods The study cohort was derived from the Cerner Real World Data (CRWD) COVID-19 Database, made up of hospitalized patients with proven infection from December 1, 2019 through November 30, 2020. Baseline demographic information, comorbidities, and hospital characteristics were obtained. We performed multivariate analysis to determine whether age, race, comorbidity and regionality were predictors of mortality, ARDS, mechanical ventilation or sepsis in hospitalized patients with COVID-19. Results Of 100,902 hospitalized COVID-19 patients included in the analysis (median age 52 years, IQR 36–67; 50.7% female), the COVID-19 case fatality rate was 8.5%, with the majority of deaths in those ≥ 65 years (70.8%). In multivariate analysis, age ≥ 65 years, male gender and a higher Charlson Comorbidity Index (CCI) were independent risk factors for mortality and ARDS. Those identifying as neither Black nor White had a marginally higher risk of mortality (OR 1.101, CI 1.032–1.174) and a greater risk of ARDS (OR 1.44, CI 1.334–1.554) compared to those identifying as White. The risk of mortality or ARDS was similar for Black and White patients. Multivariate analysis found higher mortality risk in the Northeast (OR 1.299, CI 1.22–1.29) and West (OR 1.26, CI 1.18–1.34). Larger hospitals also had an increased risk of mortality, greatest in hospitals with 500–999 beds (OR 1.67, CI 1.43–1.95). Conclusion Advanced age, male sex and a higher CCI predicted worse outcomes in hospitalized COVID-19 patients. In multivariate analysis, worse outcomes were identified in small minority populations; however, there was no difference in study outcomes between those who identify as Black or White.
Diagrams of the immunopathological cascades arising from Sarcoptes scabiei infection depending on host hypersensitivity response (Type I or IV). Diagram A represents the immunopathological processes as currently proposed in narrative literature reviews of S. scabiei, and diagram B represents the immunopathological relationships supported by the meta-analysis undertaken in this study. Solid arrow (→) indicates a stimulation or influence from one parameter to the other, whereas a dashed arrow (–→) indicates a hypothesised link; small up or down triangle next to a parameter indicates an increase or decrease; red text indicates missing immunological links considered likely to connect parameters; parameters in non-bold indicate secreted cytokines or immunoglobulins; *in panel B IV indicates no direct measure of macrophages, instead measured by MCP-1; **in panel A IV indicates epidermal cells to include keratinocytes, Langerhans cells and fibroblasts; ***in panel B indicates no direct measure of T cells or B cells, although these could be included in the measurement of lymphocytes. IL interleukin; IFN-γ interferon gamma; TNF-α tumour necrosis factor alpha; TGF-ß transforming growth factor beta; CD4+ T helper cells; CD8+ cytotoxic T cells; Ig immunoglobulin; C3 complement 3; MCV mean corpuscular volume; TEC total erythrocyte concentration; PCV packed cell volume; AGP alpha(1)-acid glycoprotein; SAA serum amyloid A; A:G ratio albumin:globulin ratio; ALT alanine aminotransferase; BUN blood urea nitrogen; MCHC mean corpuscular haemoglobin concentration; MCH mean corpuscular haemoglobin; LPO lipid peroxidation; CAT catalase; GSH:GSSG free glutathione:oxidized glutathione ratio; GGT gamma-glutamyl transferase. Created in Inkscape
Heatmap illustrating the four host species for which effect sizes were most commonly calculated (dog = 138, human = 76, bare-nosed wombat = 67 and Iberian ibex = 63). The heat reflects the percentage of studies for each category (immunological process), with each category amounting to a total of 100%. Parameters not falling directly into a definitive category, such as ‘Erythrocytic changes’ or ‘Acute phase proteins’, were included in the category ‘Other’
Background Sarcoptes scabiei is one of the most impactful mammalian parasites. There has been much research on immunological and clinical pathological changes associated with S. scabiei parasitism across a range of host species. This rich body of literature is complex, and we seek to bring that complexity together in this study. We first (1) synthesise narrative reviews of immunopathological relationships to S. scabiei infection to construct overarching hypotheses; then (2) undertake a systematic meta-analysis of primary literature on immunological and clinical pathological changes; and lastly (3) contrast our findings from the meta-analysis with our synthesis from narrative reviews. Methods We synthesised 55 narrative reviews into two overarching hypotheses representing type I and type IV immune responses to S. scabiei infection. We then systematically extracted all literature reporting immunological variables, acute phase proteins, oxidant/antioxidant status, and erythrocytic, hepatological and nephrological changes, calculating 565 effect sizes between control and sarcoptic mange-affected groups, refining (simplifying) the hypotheses from narrative reviews. Results Immunological and clinical pathological parameters were most often studied in dogs (n = 12) and humans (n = 14). Combining immunological and clinical pathological information across mammalian species (n = 19) helped yield general insights into observed disease responses. This is evidenced by interspecific consensus in 27 immunological and clinical pathology variables (6/26 type I hypersensitivity, 3/20 type IV hypersensitivity, 6/10 oxidant/antioxidant status, 3/6 acute phase protein, 4/7 erythrocytic, and 5/10 hepatological/nephrological). Conclusions Elevated IgE, eosinophils and mast cells in the type I hypersensitivity response corresponded to what was described in narrative reviews.
Results from the type IV hypersensitivity response suggested a typical antibody response; however, the cell-mediated response was less evident. There was some consensus in acute phase protein responses and a shifted oxidant/antioxidant balance, with slight evidence of anaemia. We highlight the need for mange/scabies studies to more routinely compare immunological and clinical pathological changes against controls, and to collect a more standardised suite of variables among studies.
Epidemic curve of COVID-19 and clinical course of patients in Singapore. A Epidemic curves of COVID-19 in Singapore as of March 10, 2020 are shown. The green and red solid bars correspond to newly reported cases by date of symptom onset and by date of laboratory confirmation, respectively. B Each panel presents the timeline of infection for one case. Expected SARS-CoV-2 viral dynamics and observed viral loads for the first 13 cases are depicted by grey (or black) solid lines and grey open circles, respectively. The timing of arrival in Singapore (red dashed lines), the timing of symptom onset (black dashed lines), the estimated timing of infection establishment (blue shaded areas), and the detection limit of viral load (grey dashed lines) are also shown
Viral load dynamics of the three patients in China. The three panels present the timelines of infection for the three cases in Zhuhai, China, used to compute the viral load boundary for infection establishment. Expected SARS-CoV-2 viral dynamics and observed viral loads are depicted by grey (or black) solid lines and grey open circles, respectively. The timing of symptom onset (black dashed lines), the timing of infection establishment (known; blue shaded areas), and the estimated viral load boundaries for infection establishment (red dashed lines) are also shown
Background Multiple waves of the COVID-19 epidemic had hit most countries by the end of 2021. Most of those waves were caused by the emergence and importation of new variants. To prevent the importation of new variants, a combination of border control and contact tracing is essential. However, the timing of infection inferred by interview is influenced by recall bias, which hinders the contact tracing process. Methods We propose a novel approach to infer the timing of infection by employing a within-host model to capture viral load dynamics after the onset of symptoms. We applied this approach to ascertain secondary transmission which can trigger outbreaks. As a demonstration, the 12 initial reported cases in Singapore, which were considered imported because of their recent travel history to Wuhan, were analyzed to assess whether they were truly imported. Results Our approach suggested that 6 cases were infected prior to arrival in Singapore, whereas the other 6 cases might have been secondary local infections. Three among the 6 potential secondary transmission cases had a contact history with previously confirmed cases. Conclusions Contact tracing combined with our approach using viral load data could be the key to mitigating the risk of importation of new variants by identifying cases as early as possible and inferring the timing of infection with high accuracy.
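The within-host model in the paper is fitted formally, but its core logic — back-projecting an assumed exponential growth phase from the post-onset viral load trajectory to an initial boundary level — can be sketched in a few lines. The rates and thresholds below are illustrative assumptions, not the authors' fitted values:

```python
def infer_infection_time(t_peak, log10_v_peak, growth_rate, log10_v0=0.0):
    """Back-project exponential growth to estimate when viral load crossed
    the initial level log10_v0. Times are in days relative to symptom onset.

    growth_rate: assumed pre-peak growth of log10 viral load per day.
    """
    # Days needed to grow from the boundary level v0 up to the peak
    days_growing = (log10_v_peak - log10_v0) / growth_rate
    return t_peak - days_growing

# Illustrative numbers: peak load of 10^7 copies/mL two days after onset,
# with growth of 2 log10 per day during the pre-peak phase.
t_infect = infer_infection_time(t_peak=2.0, log10_v_peak=7.0, growth_rate=2.0)
print(t_infect)  # -1.5  -> infection ~1.5 days before symptom onset
```

If the estimated infection time falls after a traveler's arrival date, the case is flagged as possible local (secondary) transmission rather than an importation, which is the screening logic the abstract describes.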
Background: While national malaria incidence has been declining in Myanmar, some subregions within the nation continue to have high burdens of malaria morbidity and mortality. This study assessed the malaria situation in one of these regions, Banmauk Township, located near the Myanmar-India border. Our goal was to provide a detailed description of the malaria epidemiology in this township and to provide some evidence-based recommendations to formulate a strategy for reaching the national malaria elimination plan. Banmauk consistently has one of the highest malaria burdens in Myanmar. Methods: With the implementation of strengthened malaria control and surveillance activities after the endorsement of a national malaria elimination plan in 2015, detailed incidence data were obtained for 2016-2018 for Banmauk Township. The data include patient demographics, parasite species, disease severity, and disease outcome. Data were analyzed to identify characteristics, trends, distribution, and risk factors. Results: During 2016-2018, 2,402 malaria cases were reported, with Plasmodium falciparum accounting for 83.4% of infections. Both P. falciparum and P. vivax were transmitted more frequently during the rainy season (May-October). Despite intensified control, the annual parasite incidence (API) in 2017 (11.0) almost doubled that in 2016 (6.5). In total, 2.5% (59/2,402) of the cases, of which 54 were P. falciparum and 5 P. vivax, were complicated, resulting in 5 deaths. Malaria morbidity was high in children < 15 years, who accounted for 33.4% of all cases and about 47% of the complicated cases. Older age groups and males living with poor transportation conditions were more likely to test positive, especially in the rainy and cold seasons. Despite the clear seasonality of malaria, severe cases among young children were even more common in the dry season, when malaria incidence was low.
Conclusions: Despite the declining trend, the malaria burden remained high in Banmauk Township. Our study also documented severe cases and deaths from both falciparum and vivax malaria. P. falciparum remained the predominant parasite species, demanding increased efforts to achieve the goal of elimination of P. falciparum by 2025. As P. falciparum cases decreased, the proportion of cases attributable to P. vivax increased. In order to eliminate malaria, it will likely be important to increasingly target this species as well.
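The API values quoted above (6.5 in 2016, 11.0 in 2017) follow the standard surveillance definition of confirmed cases per 1,000 population per year, which is a one-line computation. The counts below are invented for illustration, not taken from the study:

```python
def annual_parasite_incidence(cases, population):
    """API: confirmed malaria cases per 1,000 population per year."""
    return 1000 * cases / population

# e.g. 650 confirmed cases in a hypothetical township of 100,000 people
print(annual_parasite_incidence(650, 100_000))  # 6.5
```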
Map of Sululta and Ziway (Batu), located to the north and south of Addis Ababa, Ethiopia, respectively
Average H. pylori prevalence prediction accuracy and F1-scores of machine learning classifiers using various feature selection methods. Maroon and blue colors represent high and low accuracy (A) and F1-score (B), respectively. The numbers within each cell indicate the accuracy/F1-score of each classifier-feature selection method pair. KNN indicates K-Nearest Neighbors; SVM, Support Vector Machines; XGB, XGBoost; LR, Logistic Regression; NB, Naive Bayes; and RF, Random Forests. FULL indicates that all risk factors are used. IG indicates Information Gain; ReF, ReliefF; MRMR, Minimum Redundancy Maximum Relevance; CFS, Correlation-based Feature Selection; FCBF, Fast Correlation Based Filter; and SFFS, Sequential Floating Forward Selection. The numbers -10 and -20 indicate the number of risk factors selected for the ranking-based feature selection methods. C The Receiver Operating Characteristic (ROC) curves of the six classifiers (using their best hyperparameter combinations) obtained when predicting H. pylori infection using a subset of risk factors selected through the IG-20 feature selection method. The area under the ROC curve (AUROC) was 0.76 for KNN, 0.79 for NB, and 0.78 for the other classifiers. The X-axis represents the False Positive Rate (1-Specificity) whereas the Y-axis represents the True Positive Rate (Sensitivity)
The relative importance of H. pylori risk factors based on all feature selection methods. The X-axis indicates the H. pylori risk factors, summarized in Table 1. The Y-axis indicates the average probability of being selected across all feature selection methods. The error bars indicate one standard error across all cross-validation folds
Two-dimensional hierarchical clustering heatmap of H. pylori risk factors and feature selection methods. Maroon and blue colors indicate more and less frequently selected features in five tenfold cross-validation runs, respectively. X-axis shows the H. pylori risk factors, summarized in Table 1. Y-axis indicates all feature selection methods. The risk factors found more frequently by feature selection methods appear on the heatmap's left columns. The feature selection methods that select the greatest number of risk factors appear on the heatmap's bottom rows. The risk factors grouped together suggest that they have been chosen similarly under varying feature selection methods. The feature selection methods grouped together indicate that these methods choose a similar set of risk factors
Background Although previous epidemiological studies have examined the potential risk factors that increase the likelihood of acquiring Helicobacter pylori infection, most of these analyses have utilized conventional statistical models, including logistic regression, and have not benefited from advanced machine learning techniques. Objective We examined H. pylori infection risk factors among school children using machine learning algorithms to identify important risk factors and to determine whether machine learning can be used to predict H. pylori infection status. Methods We applied feature selection and classification algorithms to data from a school-based cross-sectional survey in Ethiopia. The data set included 954 school children with 27 sociodemographic and lifestyle variables. We conducted five runs of tenfold cross-validation on the data and combined the results of these runs for each combination of feature selection (e.g., Information Gain) and classification (e.g., Support Vector Machines) algorithms. Results The XGBoost classifier had the highest accuracy in predicting H. pylori infection status, at 77%, a 13-percentage-point improvement over the baseline accuracy of guessing the most frequent class (64% of the samples were H. pylori negative). K-Nearest Neighbors showed the worst performance across all classifiers. Similar patterns were observed using the F1-score and area under the receiver operating characteristic curve (AUROC) evaluation metrics. Among all features, place of residence (with urban residence increasing risk) was the most common risk factor for H. pylori infection, regardless of the feature selection method chosen. Additionally, our machine learning algorithms identified other important risk factors for H. pylori infection, such as electricity usage in the home, toilet type, and waste disposal location.
Using a 75% cutoff for robustness, machine learning identified five of the eight significant features found by traditional multivariate logistic regression. However, when a lower robustness threshold was used, the machine learning approaches identified more H. pylori risk factors than multivariate logistic regression and suggested risk factors not detected by logistic regression. Conclusion This study provides evidence that machine learning approaches are positioned to uncover H. pylori infection risk factors and predict H. pylori infection status. These approaches identify similar risk factors and predict infection with comparable accuracy to logistic regression, and thus could be used as an alternative method.
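Information Gain — one of the feature selection methods compared above — is simply the reduction in label entropy achieved by conditioning on a feature. A minimal stdlib sketch for categorical features (the toy data are invented, not the study's):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG = H(y) - sum over values v of p(v) * H(y | feature = v)."""
    n = len(labels)
    gain = entropy(labels)
    for v in set(feature):
        sub = [y for x, y in zip(feature, labels) if x == v]
        gain -= (len(sub) / n) * entropy(sub)
    return gain

# Toy data: a feature perfectly predicting infection status has IG = H(y) = 1 bit
y       = [1, 1, 0, 0]
perfect = [1, 1, 0, 0]
useless = [1, 0, 1, 0]
print(information_gain(perfect, y))  # 1.0
print(information_gain(useless, y))  # 0.0
```

Ranking-based methods such as IG-10 and IG-20 in the figure above would score every risk factor this way and keep the top 10 or 20 before handing them to a classifier.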
Introduction: Immunosuppressive chemotherapy increases the risk of vaccine-preventable infectious diseases in children; moreover, chemotherapy may result in delayed or missed immunization schedules. The predictable antibody waning after an incomplete primary immunization series may be intensified at the end of chemotherapy. This study aimed to investigate waning vaccine immunity at the end of immunosuppressive therapy in children with malignancies and hematologic disorders. Materials and methods: Children younger than 18 years with malignancies and hematologic disorders, including chronic immune thrombocytopenic purpura (ITP), were enrolled from September 2015 to August 2019. Eligible patients who had completed their treatment protocol at least 6 months earlier were recruited. Patient information, including sex, age at the date of diagnosis, number of chemotherapy sessions, underlying disease, and vaccination history, was collected by chart review using predefined questionnaires. Blood samples were obtained, and serum IgG antibody titers against diphtheria, tetanus, hepatitis B virus (HBV), and measles, mumps, and rubella (MMR) were measured by enzyme-linked immunosorbent assay (ELISA). Results: A total of 110 children who had received immunosuppressive chemotherapy were recruited. Forty-four (40%) of the children tested were girls and 66 (60%) were boys. The mean age of patients was 5.5 years (range 2 to 13 years). Of the 110 studied children, 27.3% were seronegative for all antibodies. On average, patients underwent 19 chemotherapy sessions. The mean number of chemotherapy sessions was significantly greater in children who were seronegative for all tested antibodies (mean: 36.2, 95% CI 33.16 to 39.24, p-value < 0.001). No statistically significant differences were observed regarding patients' sex and age between the seropositive and seronegative groups (p-value 0.513 and 0.060, respectively).
Based on Poisson regression analysis, female sex was associated with 37% lower odds of seronegativity (incidence rate ratio (IRR): 0.63; 95% CI 0.39 to 1.01; p-value: 0.055), while 30 or more chemotherapy sessions were associated with significantly higher odds of seronegativity for all tested vaccines (IRR: 25.41; 95% CI 6.42 to 100.57; p-value < 0.001). Conclusion: Our results re-emphasize the need for planned catch-up immunization in children undergoing immunosuppressive chemotherapy for malignancy, especially against tetanus, diphtheria, and hepatitis B, at least 6 months after the end of chemotherapy.
ROC curve analysis showing the performance of qPCR vs. AFS in diagnosing skeletal TB
Background At present, skeletal tuberculosis (TB) is mostly diagnosed by histopathology, but the positivity rate is low, and new methods for molecular identification of this disorder are needed. We therefore aimed to investigate the clinical utility of quantitative PCR (qPCR)-based diagnosis of skeletal TB from formalin-fixed paraffin-embedded (FFPE) tissues and its comparative evaluation with acid-fast bacillus staining (AFS). Methods We detected Mycobacterium tuberculosis (M. tuberculosis/MTB) DNA using qPCR and AFS in FFPE tissue samples from 129 patients suspected of having skeletal TB. The sensitivity, specificity and area under the curve (AUC) of qPCR and AFS were calculated, and factors potentially affecting qPCR and AFS results were investigated. Results Overall, qPCR outperformed AFS in detecting M. tuberculosis. The AUC of qPCR was higher than that of AFS (0.744 vs. 0.561, p < 0.001). Furthermore, decalcification of bone tissues did not affect the sensitivity and specificity of qPCR tests, whereas it impacted the performance of AFS: decalcification increased the specificity of AFS and decreased its sensitivity (p < 0.05). Moreover, qPCR had a significantly larger AUC than AFS in both the decalcified and non-decalcified groups (0.735 vs. 0.582 and 0.756 vs. 0.534, respectively; p < 0.001). Similarly, the AUC of qPCR was larger than that of AFS regardless of whether skeletal TB patients had concomitant pulmonary TB (0.929 vs. 0.762; 0.688 vs. 0.524, p < 0.01). Conclusions Our data demonstrate that qPCR offers superior accuracy for the detection of mycobacteria in FFPE tissues compared to traditional AFS, indicating its clinical value in osteoarticular TB diagnosis.
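For a single dichotomous test such as AFS, the ROC curve has only one operating point, and the AUC reduces to (sensitivity + specificity) / 2 — a quick way to sanity-check figures like the AFS AUC of 0.561 reported above. A minimal helper, with invented counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and AUC from a 2x2 confusion matrix.

    For a binary (single-threshold) test, the ROC curve is two line
    segments and its AUC equals (sensitivity + specificity) / 2.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    auc = (sensitivity + specificity) / 2
    return sensitivity, specificity, auc

# Hypothetical counts: 40 true TB cases, 80 non-TB cases
sens, spec, auc = diagnostic_metrics(tp=30, fp=10, fn=10, tn=70)
print(sens, spec, auc)  # 0.75 0.875 0.8125
```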
Background Staphylococcus aureus causes many human infections, including wound infections, and its pathogenicity is mainly influenced by several virulence factors. Aim This study aimed to detect virulence genes (hla, sea, icaA, and fnbA) in S. aureus isolated from different wound infections among Egyptian patients admitted to Minia University Hospital. It also aimed to investigate the prevalence of these genes in methicillin-resistant S. aureus (MRSA), methicillin-susceptible S. aureus (MSSA), and vancomycin-resistant S. aureus isolates, as well as resistance and sensitivity to different antibiotic classes. Methods A cross-sectional study was carried out from November 2019 to September 2021. Standard biochemical and microbiological tests identified 59 S. aureus isolates. The Kirby-Bauer disc diffusion method was used to determine antibiotic susceptibility. DNA was extracted using a DNA extraction kit, and polymerase chain reaction was used to amplify all genes. Results A total of 59 S. aureus isolates were detected from 51 wound samples. MRSA isolates accounted for 91.5%, whereas MSSA isolates accounted for 8.5%. The multidrug resistance (MDR) percentage among S. aureus isolates was 54.2%. S. aureus showed high sensitivity to vancomycin, linezolid, and chloramphenicol, but high resistance to oxacillin and piperacillin. sea was the most predominant gene (72.9%), followed by icaA (49.2%), hla (37.3%), and fnbA (13.6%). sea was the commonest virulence gene among MRSA isolates (72.2%), and a significant difference in the distribution of icaA was found. However, sea and icaA were the commonest genes among MSSA isolates (79.9%). The highest distribution of sea was found among ciprofloxacin-resistant isolates (95.2%). Conclusion The incidence of infections caused by MDR S. aureus increased significantly with MRSA prevalence.
sea was the most predominant virulence gene among antibiotic-resistant strains, with a significant correlation to piperacillin, gentamicin, and levofloxacin resistance.
Background During the early stage of the COVID-19 pandemic, many countries implemented non-pharmaceutical interventions (NPIs) to control the transmission of SARS-CoV-2, the causative pathogen of COVID-19. Among those NPIs, stay-at-home and quarantine measures were widely adopted and enforced. Understanding the effectiveness of stay-at-home and quarantine measures can inform decision-making and control planning during the ongoing COVID-19 pandemic and for future disease outbreaks. Methods In this study, we use mathematical models to evaluate the impact of stay-at-home and quarantine measures on COVID-19 spread in four cities that experienced large-scale outbreaks in the spring of 2020: Wuhan, New York, Milan, and London. We develop a susceptible-exposed-infected-removed (SEIR)-type model with components of self-isolation and quarantine and couple this disease transmission model with a data assimilation method. By calibrating the model to case data, we estimate key epidemiological parameters before lockdown in each city. We further examine the impact of stay-at-home and quarantine rates on COVID-19 spread after lockdown using counterfactual model simulations. Results Results indicate that self-isolation of the susceptible population is necessary to contain the outbreak. At a given rate, self-isolation of the susceptible population induced by stay-at-home orders is more effective than quarantine of SARS-CoV-2 contacts in reducing the effective reproduction number Re. Variation in self-isolation and quarantine rates can also considerably affect the duration of outbreaks, attack rates and peak timing. We generate counterfactual simulations to estimate the effectiveness of stay-at-home and quarantine measures.
Without these two measures, the cumulative confirmed cases could be much higher than reported numbers within 40 days after lockdown in Wuhan, New York, Milan, and London. Conclusions Our findings underscore the essential role of stay-at-home orders and quarantine of SARS-CoV-2 contacts during the early phase of the pandemic.
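The authors' model couples an SEIR framework with data assimilation; a bare-bones sketch of the SEIR-with-self-isolation idea (simple Euler integration, illustrative parameters rather than the calibrated values) shows why removing susceptibles from the transmission pool shrinks the final epidemic size:

```python
def seir_with_isolation(beta, sigma, gamma, iso_rate, days, dt=0.1,
                        s0=0.999, e0=0.001):
    """Euler integration of a normalized SEIR model where susceptibles
    self-isolate at rate iso_rate (per day), leaving the transmission pool.

    Returns the recovered fraction at the end, a proxy for the attack rate.
    """
    s, e, i, r = s0, e0, 0.0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i          # force of infection on susceptibles
        ds = -new_inf - iso_rate * s    # infection plus self-isolation outflow
        de = new_inf - sigma * e        # incubation (E -> I at rate sigma)
        di = sigma * e - gamma * i      # recovery/removal (I -> R at rate gamma)
        dr = gamma * i
        s += ds * dt; e += de * dt; i += di * dt; r += dr * dt
    return r

# Illustrative parameters: R0 = beta/gamma = 3, 3-day latency, 5-day infectious period.
# A stay-at-home rate of 0.1/day sharply reduces the final epidemic size.
with_iso    = seir_with_isolation(0.6, 1/3, 1/5, iso_rate=0.10, days=120)
without_iso = seir_with_isolation(0.6, 1/3, 1/5, iso_rate=0.00, days=120)
print(with_iso < without_iso)  # True
```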
FDA Emergency Use Authorization (EUA) criteria of casirivimab/imdevimab
Background Monoclonal antibodies (mAb) prevent COVID-19 progression when administered early. We compared mAb treatment outcomes among vaccinated and unvaccinated patients during the Delta wave and assessed the feasibility of implementing stricter eligibility criteria in the event of mAb scarcity. Methods We conducted a retrospective observational study of casirivimab/imdevimab recipients with mild-to-moderate COVID-19 in an emergency department or outpatient infusion center (July 1–August 20, 2021). The primary outcome was all-cause hospital admission within 30 days post-treatment among vaccinated vs. unvaccinated patients during the Delta surge in the Bronx, NY. Results A total of 250 patients received casirivimab/imdevimab (162 unvaccinated vs. 88 vaccinated). The median age was 39 years for unvaccinated patients and 52 years for vaccinated patients (p < 0.0001). The median number of EUA criteria met was 1 for unvaccinated and 2 for vaccinated patients (p < 0.0001). Overall, 6% (15/250) of patients were admitted within 30 days post-treatment. Eleven unvaccinated patients (7%) were admitted within 30 days compared to 4 (5%) vaccinated patients (p = 0.48). Conclusions All-cause 30-day admission was not statistically different between vaccinated and unvaccinated patients. When federal allocation of therapies is limited, programs must prioritize patients at highest risk of hospitalization and death regardless of vaccination status.
A 27-year-old female patient with a two-year history of HAART presented with upper abdominal pain, vomiting and diarrhea. The initial ¹⁸F-FDG PET/CT maximum intensity projection (A PET) and axial slices (B, D PET; C, E PET/CT) showed hyper-metabolic lymph nodes in the neck, mediastinum and retro-peritoneum (blue arrows), in addition to the spleen (red arrows). The SUVmax of the lymph nodes and spleen was 18.6 and 8.8, respectively. Neck lymph node biopsy confirmed diffuse large B-cell lymphoma
A 47-year-old male patient with no relevant previous history presented with painful swollen cervical lymph nodes and intermittent fever for 3 months. HIV infection was confirmed at the local Centers for Disease Control (CDC). Blood testing showed Epstein-Barr virus infection by polymerase chain reaction, and the T-SPOT test was positive. ¹⁸F-FDG PET/CT was performed, with maximum intensity projection (A) and axial slices (B, D and F PET; C, E and G PET/CT) showing high uptake in cervical, mediastinal and retroperitoneal lymph nodes (blue arrows) and moderate uptake in an enlarged spleen (red arrows). The axial slices (B and C, blue arrows) show cervical node involvement, with an SUVmax of 9.2. Multiple necroses were found in the mediastinal lymph nodes (D and E; blue arrows). Right neck lymph node biopsy confirmed granulomatous lymphadenitis with positive bacteria on Ziehl–Neelsen staining, and specimen culture finally proved Mycobacterium tuberculosis infection
Association between lymphoma status and PET parameters established by using a gradient-based segmentation method. The correlations are shown for SURmax (A), SUVLN (B), SUVMarrow (C) and SUVLiver (D)
ROC curve for PET parameters as a screening test for malignant lymphoma and inflammatory lymphadenopathy. The discriminatory ability of SURmax and SUVLN was better than that of the maximum diameter and involved areas of lymph nodes, SUVMarrow and SUVLiver for distinguishing malignant lymphoma from inflammatory lymphadenopathy
Background It is vital to distinguish between inflammatory and malignant lymphadenopathy in human immunodeficiency virus (HIV) infected individuals. The purpose of our study was to characterize the variations in the clinical characteristics of HIV patients and to apply ¹⁸F-FDG PET/CT parameters for distinguishing malignant lymphoma from inflammatory lymphadenopathy in such patients. Methods This retrospective cross-sectional study included 59 consecutive HIV-infected patients who underwent whole-body ¹⁸F-FDG PET/CT. Of these patients, 37 had biopsy-proven HIV-associated lymphoma, and 22 with HIV-associated inflammatory lymphadenopathy were used as controls. The determined parameters were the maximum standardized uptake value (SUVmax), SUVmax of only lymph nodes (SUVLN), the most FDG-avid lesion-to-liver SUVmax ratio (SURmax), laboratory examinations and demographics. The optimal cut-offs for ¹⁸F-FDG PET/CT values were determined by receiver operating characteristic (ROC) curve analysis. Results Considering the clinical records, the Karnofsky Performance Status (KPS) scores in patients with inflammatory lymphadenopathy were markedly higher than those in patients with malignant lymphoma (P = 0.015), whereas lymphocyte counts and lactate dehydrogenase (LDH) were markedly lower (P = 0.014 and 0.010, respectively). On ¹⁸F-FDG PET/CT imaging, extra-lymphatic lesions, especially of the digestive tract and Waldeyer’s ring, occurred more frequently in malignant lymphoma than in inflammatory lymphadenopathy. Furthermore, SURmax and SUVLN in malignant lymphoma were markedly higher than those in inflammatory lymphadenopathy (P < 0.001 for both). A cut-off point of 3.1 for SURmax had high specificity (91.9%) and reasonable sensitivity (68.2%), and a cut-off point of 8.0 for SUVLN had high specificity (89.2%) and reasonable sensitivity (63.6%).
Conclusion Our study identified distinctive characteristics in the clinical manifestations, SURmax, SUVLN and detectability of extra-lymphatic lesions on ¹⁸F-FDG PET, and thus provides a new basis for distinguishing malignant lymphoma from inflammatory lymphadenopathy in HIV-infected patients.
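Cut-offs like the SURmax threshold of 3.1 above are typically chosen from the ROC curve; one common rule (the abstract does not state which rule the authors used) is maximizing Youden's J = sensitivity + specificity - 1. A minimal sketch with invented scores, not the study's data:

```python
def youden_cutoff(scores_pos, scores_neg, candidates):
    """Return (cutoff, J) maximizing Youden's J = sensitivity + specificity - 1.

    scores_pos: test values for truly positive (diseased) subjects.
    scores_neg: test values for truly negative (control) subjects.
    """
    best = None
    for c in candidates:
        sens = sum(s >= c for s in scores_pos) / len(scores_pos)
        spec = sum(s < c for s in scores_neg) / len(scores_neg)
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (c, j)
    return best

# Invented SUR-like scores for lymphoma vs. inflammatory groups
cutoff, j = youden_cutoff([4.1, 5.0, 6.2, 3.3], [1.2, 2.0, 2.9, 2.4],
                          candidates=[1, 2, 3, 4, 5, 6])
print(cutoff)  # 3
```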
Self-reported health at study entry (baseline) and 24 weeks
Functional impairment at study entry (baseline) and 24 weeks
Background Direct-acting antivirals (DAAs) are highly effective in achieving sustained virologic response among those with chronic hepatitis C virus (HCV) infection. Quality of life (QOL) benefits for an HCV-infected population with high numbers of people who inject drugs and people living with HIV (PLHIV) in Eastern Europe have not been explored. We estimated such benefits for Ukraine. Methods Using data from a demonstration study of 12-week DAA treatment conducted in Kyiv, we compared self-reported QOL as captured with the MOS-SF20 at study entry and 12 weeks after treatment completion (week 24). We calculated domain scores for health perception, physical, role and social functioning, mental health and pain at entry and week 24, stratified by HIV status. Results Among the 857 patients included in the final analysis, health perception was the domain that showed the largest change, with an improvement of 85.7% between entry and week 24. The improvement was larger among those who were HIV negative (104.4%) than among those living with HIV (69.9%). Other domains that showed significant and meaningful improvements were physical functioning, which improved from 80.5 (95% CI 78.9–82.1) at study entry to 89.4 (88.1–90.7) at 24 weeks, role functioning (64.5 [62.3–66.8] to 86.5 [84.9–88.2]), social functioning (74.2 [72.1–76.2] to 84.8 [83.2–86.5]) and bodily pain (70.1 [68.2–72.0] to 89.8 [88.5–91.1]). Across all domains, QOL improvements among PLHIV were more modest than among HIV-negative participants. Conclusion QOL improved substantially across all domains between study entry and week 24. Changes over the study period were smaller among PLHIV.
Imaging of the patient. A and B Pre- and post-operative CT images of the middle ear, showing slightly blurred bone in the apical region. C and D Pre- and post-operative transverse T1WI MRI of the middle ear, showing low signal in the apex of the petrous part. E and F Pre- and post-operative transverse T1WI + FS contrast-enhanced MRI of the middle ear, showing moderate enhancement in the apex of the petrous part, with no significant pre- or post-operative changes. G and H Pre- and post-operative transverse T1WI + FS MRI of the middle ear, showing slightly higher signal in the apex of the petrous part with blurred boundaries. I and J Cranial MRV, showing an irregular wall and narrow lumen of the posterior superior sagittal sinus with low blood flow signal, considered thrombosis; the left sigmoid sinus was normal, with decreased blood flow signal in the right sigmoid sinus
The patient's pathological sections, bacterial culture, postoperative endoscopy and eye motility. A HE staining of the mastoid granulation tissue of the left middle ear suggested chronic inflammation of the local mucosa with neutrophil infiltration. Scale bar = 20 μm. B PAS staining of the pathological section of the mastoid granulation tissue of the left middle ear indicated a small amount of yeast-like fungi in the middle ear exudate (blue arrow), accompanied by budding, with morphology suggesting yeast infection. Scale bar = 20 μm. Images A and B were obtained using the EasyScanner (Motic, Xiamen, China) and its associated software (Motic DSAssistant Plus, Motic). C Culture of secretions from the left ear suggested a smooth yeast: CHROMagar Candida chromogenic medium, 72 h, creamy colorless smooth colonies. D Microscopy of the cultured secretions from the left ear (× 1000): Gram staining was positive and swarms of ovoid spores were seen. E Overall view of the mastoid cavity and external auditory meatus under endoscopy after the operation. F White punctate secretions of the external auditory canal observed under endoscopy. G mNGS identified Candida tropicalis. H The patient had limited abduction and adduction of the left eye
Trends of inflammatory indicators in the patient from September to November 2021. A Leukocyte count: higher than normal at admission, decreased after vancomycin and fluconazole treatment, and increased later, possibly related to corticosteroid use. B C-reactive protein: slightly above normal at admission, then increased, and returned to normal after vancomycin and fluconazole treatment. C Serum amyloid A: slightly above normal from admission, with a marked rise later. D Procalcitonin: remained within the normal range
The flow chart of patient's diagnosis and treatment
Background Petrositis is a rare and potentially fatal complication of otitis media. It is most often caused by bacterial infections, but in some cases it is caused by fungal infections. Case study The case in this report is one of fungal petrositis. The clinical symptoms were ear pain from chronic otitis media, severe headache, peripheral facial palsy and diplopia. The diagnosis was finally confirmed through middle ear imaging, bacterial culture, pathology, and blood metagenomic next-generation sequencing (mNGS). The patient was treated with sensitive antifungal drugs. Conclusion Drug treatment was a conservative but effective approach in this case. mNGS can provide a pathogenic reference when antibiotics are not efficient for fungal infections or in cases of drug-resistant fungal infection, allowing drug use to be adjusted for treatment.
Box plots of normalized antibody concentration levels for five P. falciparum antigens (2018 vs. 2017, by exposure group). Antibody concentration is expressed as the median fluorescence intensity (MFI) after log-transformation and standardization of titre concentrations between years. The mean concentration per year per group is displayed by red circles
Changes in seroprevalence for five P. falciparum antigens (2018 vs 2017, by exposure group). Seroprevalence was estimated by fitting multilevel logistic regression models, with adjustment for age, use of a bednet the night before, size of the household, occupation of the head of the household and possession of livestock. Robust variance estimators were used, and models used random intercepts at the individual, household and commune levels. The intervention group refers to individuals who self-reported having received MDA in 2018, while the control group refers to individuals who self-reported not being exposed to MDA in 2018. No participant in either group was exposed to MDA in 2017
Treatment effects of MDA campaign on IgG seropositivity to five P. falciparum antigens. Treatment effects estimates are derived from multilevel logistic regression models, with adjustment for age, use of a bednet the night before, size of the household, occupation of the head of the household and possession of livestock. Robust variance estimators were used, and models used random intercepts at the individual, household and commune levels. Marginal probabilities were used for computing risk differences and relative risks. Treatment effect estimates are displayed with their 95% confidence intervals
Introduction Serological methods provide useful metrics to estimate age-specific period prevalence in settings of low malaria transmission; however, evidence on the use of seropositivity as an endpoint remains scarce in studies evaluating combinations of malaria control measures, especially in children. This study aims to evaluate the immediate effects of a targeted mass drug administration campaign (tMDA) in Haiti by using serological markers. Methods The tMDA was implemented in September–October 2018 using sulfadoxine-pyrimethamine and single low-dose primaquine. A natural quasi-experimental study was designed, using a pretest and posttest in a cohort of 754 randomly selected school children, among which 23% reported having received tMDA. Five antigens were selected as outcomes (MSP1-19, AMA-1, Etramp5 antigen 1, HSP40, and GLURP-R0). The posttest was conducted 2–6 weeks after the intervention. Results At baseline, there was no statistical difference in seroprevalence between the groups of children that were or were not exposed to tMDA. A lower seroprevalence was observed for markers informative of recent exposure (Etramp5 antigen 1, HSP40, and GLURP-R0). Exposure to tMDA was significantly associated with a 50% reduction in the odds of seropositivity for Etramp5 antigen 1 and a 21% reduction in the odds of seropositivity for MSP1-19. Conclusion Serological markers can be used to evaluate the effects of interventions against malaria on the risk of infection in settings of low transmission. Antibody responses against Etramp5 antigen 1 in Haitian children were reduced in the 2–6 weeks following a tMDA campaign, confirming its usefulness as a short-term marker in child populations.
Principal component analysis of age categories and COVID-19 symptoms. The first principal component (CP1) explains 79.7% of the variability of the data; the second (CP2) explains 20.3%. CP1 and CP2 relate the main clinical manifestations for young people, adults and elderly patients
Distribution of symptoms according to sex
Background The ability of SARS-CoV-2 to persist in asymptomatic individuals facilitates its dissemination and makes its control difficult. Objective: To establish a cohort of asymptomatic individuals, track changes to symptomatic status, and determine the most frequent clinical manifestations. Methods Between April 9 and August 9, 2020, molecular diagnosis of SARS-CoV-2 infection was confirmed in 154 asymptomatic people in contact with subjects diagnosed with COVID-19. Nasopharyngeal swabs were performed on these people in different hospitals in Córdoba, in the Caribbean area of Colombia. The E, RdRp, and N genes were amplified with RT-qPCR. Based on the molecular results and the Cq values, the patients were subsequently followed up through telephone calls to verify their health conditions. Results Overall, of 154 asymptomatic individuals, 103 (66.9%) remained asymptomatic and 51 (33.1%) became symptomatic. The most frequent clinical manifestations in young people were anosmia and arthralgia; adults showed cough, ageusia, and odynophagia; and in the elderly they were epigastralgia, dyspnea, and headache. Mortality was 8%. Conclusions A proportion of 33% presymptomatic individuals was found, of whom four died. This high rate could indicate silent transmission, contributing significantly to the epidemic associated with SARS-CoV-2.
Lost to follow up vs retention (Months) at given time points among FSWs initiated on ART between Jan 2018–Dec 2020
Background Patient retention in care and sustained viral load suppression are cornerstones of improved health and quality of life among people living with HIV. However, challenges to retention on ART remain among female sex workers (FSWs). We report loss to follow up (LTFU), viral load suppression, and the associated factors among FSWs that access HIV treatment at primary health care facilities in Kampala. Methods We retrospectively abstracted and analysed patient management data of HIV positive FSWs who enrolled in care between January 2018 and December 2020. LTFU was defined as failure of a FSW to return for treatment at least 90 days from the date of their last clinic appointment. We defined viral suppression as having a last viral load of ≤ 1000 copies/ml preceding data abstraction. Data were analysed using Stata 15.1 software. Results A total of 275 FSWs were included in our study sample. We found low retention of 85.1% (n = 234) at six months, corresponding to LTFU of 14.9% (n = 41) within the same period. Retention decreased with duration in care, down to 73.5% (n = 202) at 24 months, which translates to LTFU of 26.5% (n = 73). Viral load testing coverage was 62% (n = 132), and of these, 90.9% (n = 120) were virally suppressed. Factors associated with LTFU in univariable logistic regression, and with viral load suppression in multivariable logistic regression, were: having a telephone contact (OR: 0.3, 95% CI: 0.1–0.9, p = 0.031); having enrolled in HIV care aged ≥ 35 years (OR: 0.5, 95% CI: 0.2–1.0, p = 0.048) and (OR: 0.03, 95% CI: 0.00–0.5, p = 0.016); and having good ART adherence (OR: 0.2, 95% CI: 0.1–0.5, p = 0.001) and (OR: 24.0, 95% CI: 3.7–153.4, p = 0.001), respectively. Good ART adherence remained statistically significant (OR: 0.2, 95% CI: 0.08–0.53, p = 0.001) in multivariable logistic regression for LTFU. Conclusion This study found low retention among HIV diagnosed FSWs in care.
Viral load suppression was acceptable and comparable to that of the general population, however viral load coverage was low. Strategies that increase retention in care and access to viral load testing such as individual client centred care models are vital to improve retention and viral load coverage among FSWs.
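The univariable odds ratios reported above (e.g. OR 0.3 for having a telephone contact) can, for a single binary factor, be computed directly from a 2×2 table, with a Woolf (log-scale) 95% confidence interval. A minimal sketch in Python, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (not from the study): 10/90 LTFU among FSWs with a
# telephone contact, 20/80 among those without.
print(odds_ratio_ci(10, 90, 20, 80))
```

An OR below 1 with a CI excluding 1 would indicate a protective association, as the abstract reports for telephone contact.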
Overview of the study design. The monthly PTB incidence from January 2005 to December 2020 and the total population reported at the end of each year from 2004 to 2019 were collected and used to calculate the monthly incidence between January 2005 and December 2020. The incidence data before the COVID-19 outbreak (January 2005 to December 2019) were then used to construct a prediction model without intervention, and the data both before and during the COVID-19 outbreak (January 2020 to December 2020) were used to construct a prediction model under intervention
Incidence data estimated from the non-intervention model. A Time series of monthly PTB incidence from January 2005 to December 2020. The red line indicates the observed incidence before the COVID-19 outbreak. The sky blue line represents the observed incidence during the COVID-19 outbreak. The orange line denotes the incidence data between January 2020 and December 2020 as predicted from the non-intervention model. B The observed TB incidence (sky blue line) and the incidence predicted with the data before the COVID-19 outbreak (orange line) between January 2020 and December 2020
Evaluation of the intervention effect on the tendency of PTB incidence with both the strict and regular interventional models. Time series of PTB monthly incidence from January 2020 to December 2021. The brown line indicates the observed incidence under the strict state of COVID-19 epidemic prevention and control. The blue line represents the observed incidence under the regular state of COVID-19 epidemic prevention and control. The black line denotes the incidence data from May 2020 to December 2021 as predicted from the intervention model under strict intervention. The brown line denotes the incidence data from January 2021 to December 2021 as predicted from the intervention model under regular intervention
Predicted and actual monthly PTB incidence in China from January 2005 to December 2021. Time series of monthly PTB incidence from January 2005 to December 2021. The blue line indicates the observed incidence. The red line denotes the incidence between January 2021 and December 2021 as predicted from the intervention model. The green band represents the 95% prediction interval for the time series forecast
Background The COVID-19 pandemic has driven public health intervention strategies, including keeping social distance, wearing masks in crowded places, and maintaining good health habits, to prevent the transmission of the novel coronavirus (SARS-CoV-2). However, it is unknown whether these intervention strategies influence morbidity from other human infectious diseases, such as tuberculosis. Methods In this study, three prediction models were constructed to compare variations in PTB incidence after January 2020, when the COVID-19 outbreak began in China, without intervention or with intervention (strict or regular). The non-interventional model was developed with an autoregressive integrated moving average (ARIMA) model trained on the monthly incidence of PTB in China from January 2005 to December 2019. The interventional model was established using an ARIMA model with a continuing intervention function, trained on the monthly PTB incidence in China from January 2020 to December 2020. Results Starting from the assumption that no COVID-19 outbreak had occurred in China, PTB incidence was predicted, and the actual incidence was then compared with the predicted incidence. A remarkable overall decline in PTB incidence from January 2020 to December 2020 was observed, likely due to the potential influence of intervention policies for COVID-19. If the same intervention strategy were applied for the next 2 years, the monthly PTB incidence would fall on average by about 1.03 per 100,000 people each month compared with the incidence predicted by the non-interventional model. The annual incidence for 2021 was estimated at 59.15 per 100,000 under regular intervention, declining to 50.65 under strict intervention. Conclusions Our models quantified the potential knock-on effect on PTB incidence of the intervention strategy used to control the transmission of COVID-19 in China.
Combined with the feasibility of the strategies, these results suggested that continuous regular interventions would play important roles in the future prevention and control of PTB.
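The ARIMA modelling described above would normally be done with a statistics package (e.g. statsmodels' ARIMA class). Purely as an illustration of the autoregressive core of such a model, here is a minimal pure-Python AR(1) fit and forecast, i.e. an ARIMA(1,0,0) sketch without differencing or moving-average terms, run on synthetic (not the study's) monthly data:

```python
def fit_ar1(series):
    """Least-squares fit of x_t = c + phi * x_{t-1} + e_t."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
           / sum((xi - mx) ** 2 for xi in x))
    c = my - phi * mx
    return c, phi

def forecast(series, steps, c, phi):
    """Iterate the fitted recurrence forward to get point forecasts."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Synthetic series following x_t = 2 + 0.5 * x_{t-1} exactly.
series = [10.0]
for _ in range(8):
    series.append(2 + 0.5 * series[-1])
c, phi = fit_ar1(series)
print(c, phi, forecast(series, 3, c, phi))
```

A real analysis would also select the (p, d, q) orders, include seasonal terms for monthly incidence, and compute prediction intervals, as in the figure captions above.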
Flow of information through the different phases of the systematic review. The flowchart was adapted from the Preferred Reporting Items for Systematic Review and Meta-Analyses flow chart model
Forest plots showing the risk of mortality in patients who took ivermectin compared to controls, stratified by placebo or other drugs. RR relative risk. Asterisk indicates that this study had two control groups, one with placebo and the other with another drug. We included in the pooled analysis only the comparator arm which used placebo
Forest plots showing the risk of mechanical ventilation requirement in patients who took ivermectin compared to controls, stratified by placebo or other drugs. RR relative risk
Forest plots showing sensitivity analysis of mortality and mechanical ventilation according to the percentage of confirmed COVID-19 patients and risk of bias. RR relative risk
Background The role of ivermectin in the treatment of COVID-19 is still under debate, yet the drug has been widely used in some parts of the world, as shown by impressive market data. The available body of evidence may have changed over the last months, as studies have been retracted and "standards of care" (SOC) used in control groups have changed with rapidly evolving knowledge on COVID-19. This review aims to summarize and critically appraise the evidence from randomized controlled trials (RCTs) of ivermectin, assessing clinical outcomes in COVID-19 patients. Methods RCTs evaluating the effects of ivermectin in adult patients with COVID-19 were searched through June 22, 2022, in four databases, the L.OVE platform, clinical trial registries and pre-print platforms. Primary endpoints included all-cause mortality and invasive ventilation requirement. The secondary endpoint was the occurrence of adverse events. Risk of bias was evaluated using the Cochrane Risk of Bias 2.0 tool. The meta-analysis included only studies which compared ivermectin to placebo or SOC. Random-effects models were used to pool the risk ratios (RRs) of individual trials. The quality of evidence was evaluated using GRADE. The protocol was registered in PROSPERO (CRD42021257471). Results Twenty-five RCTs fulfilled the inclusion criteria (n = 6310). Of those, 14 compared ivermectin with placebo, in nine ivermectin associated with SOC was compared to SOC, and two studies compared ivermectin to an active comparator. Most RCTs had some concerns or high risk of bias, mostly due to lack of concealment of the randomization sequence and allocation, lack of blinding and a high number of missing cases. Ivermectin did not show an effect in reducing mortality (RR = 0.76; 95%CI: 0.52–1.11) or mechanical ventilation (RR = 0.74; 95%CI: 0.48–1.16). This effect was consistent when comparing ivermectin vs. placebo, and ivermectin associated with SOC vs. SOC, as well as in sensitivity analyses.
Additionally, there was very low quality of evidence regarding adverse effects (RR = 1.07; 95%CI: 0.84–1.35). Conclusions The evidence suggests that ivermectin does not reduce mortality risk and the risk of mechanical ventilation requirement. Although we did not observe an increase in the risk of adverse effects, the evidence is very uncertain regarding this endpoint.
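The review pools trial risk ratios with a random-effects model; the abstract does not name the estimator, so the following is a hedged sketch of one standard choice, the DerSimonian-Laird estimator, operating on log RRs and their standard errors (hypothetical inputs, not the review's data):

```python
import math

def dersimonian_laird(log_rrs, ses, z=1.96):
    """Random-effects pooled RR (DerSimonian-Laird) with 95% CI.
    log_rrs: per-study log risk ratios; ses: their standard errors."""
    w = [1 / s ** 2 for s in ses]                      # fixed-effect weights
    yw = sum(wi * yi for wi, yi in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (yi - yw) ** 2 for wi, yi in zip(w, log_rrs))  # Cochran's Q
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    ws = [1 / (s ** 2 + tau2) for s in ses]            # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(ws, log_rrs)) / sum(ws)
    se = math.sqrt(1 / sum(ws))
    return (math.exp(pooled),
            math.exp(pooled - z * se),
            math.exp(pooled + z * se))

# Three hypothetical trials, each observing RR = 0.8.
print(dersimonian_laird([math.log(0.8)] * 3, [0.2, 0.3, 0.25]))
```

When the studies agree exactly (Q = 0), the between-study variance collapses to zero and the estimate reduces to the fixed-effect pooled value.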
Computerized brain tomography showed bilateral symmetrical hypodense lesions involving the thalami (arrows) (C) and cerebellar hemispheres (arrowhead), causing effacement of the cerebellar folia (A, B); an isolated hypodense lesion at the left high frontal area (asterisk) (D). Chest X-ray after intubation showed the endotracheal tube tip 0.7 cm above the carina; the lung parenchyma was normal with no obvious infiltration (E)
Background We report the first case of COVID-19 associated acute necrotizing encephalopathy (ANE) without pulmonary disease in a patient with an extremely high interleukin-6 (IL-6) level and a Ran Binding Protein 2 (RANBP2) mutation. Case presentation A 29-year-old woman recently immunized with an inactivated viral vaccine, BBIBP-CorV (Sinopharm), presented with alteration of consciousness. Her body temperature was 37 °C, blood pressure 42/31 mmHg, heart rate 130 bpm, respiratory rate 20 per minute, and oxygen saturation 98%. Respiratory examination was unremarkable. Neurological examination revealed stupor but preserved brainstem reflexes. Non-contrast computerized tomography of the brain showed symmetrical hypodense lesions involving the bilateral thalami and cerebellar hemispheres characteristic of ANE. No pulmonary infiltration was found on chest radiograph. SARS-CoV-2 was detected by PCR; whole genome sequencing later confirmed the Delta variant. RANBP2 gene analysis revealed a heterozygous Thr585Met mutation. Serum IL-6 was 7390 pg/mL. Urine examination showed pyelonephritis. Her clinical course was complicated by seizure, septic shock, acute kidney injury, and acute hepatic failure. She later developed coma and died within 6 days. Conclusions ANE is caused by a cytokine storm leading to necrosis and hemorrhage of the brain. IL-6 was deemed a prognostic factor and a potential treatment target of ANE in previous studies. RANBP2 missense mutation strongly predisposes to this condition by affecting mitochondrial function, viral entry, cytokine signaling, immune response, and blood–brain barrier maintenance. Also, inactivated vaccines have been reported to precipitate massive production of cytokines by antibody dependent enhancement (ADE). The true incidence of COVID-19 associated ANE is not known, nor are the predictors of its development.
We propose that these two potential factors (RANBP2 mutation and ADE) could participate in the pathogenesis of ANE in COVID-19, apart from SARS-CoV-2 infection itself. Further study is needed to confirm this hypothesis, specifically in the post-vaccination period. The role of RANBP2 mutation and its application in COVID-19 and ANE should be further elaborated.
Flowchart showing screening and analysis of patients with recurrent PTB from the national TB surveillance database in Henan province, China from 2005 to 2018
The probability of PTB recurrence after primary PTB diagnosis in Henan province, China from 2005 to 2018. A Probability of recurrence in primary PTB patients. B Probability of recurrence in primary PTB patients with bacteriological positive and negative cases. C Probability of recurrence after first recurrence. D Probability of recurrence after first recurrence with bacteriological positive and negative cases
Background Recurrence continues to place a significant burden on patients and tuberculosis programmes worldwide, and previous studies have rarely analysed bacteriologically negative recurrence cases. We characterized the epidemiological features of recurrent pulmonary tuberculosis (PTB) patients and estimated the probability of recurrence associated with different bacteriology results and risk factors. Methods Using 2005–2018 provincial surveillance data from Henan, China, where the permanent population is approximately 100 million, we described the epidemiological and bacteriological features of recurrent PTB. The Kaplan–Meier method and Cox proportional hazard models were used to estimate the probability of recurrent PTB and its risk factors, respectively. Results A total of 7143 (1.5%) PTB patients had recurrence; 21.1% were bacteriologically positive on both laboratory tests (positive–positive), and 34.9% were negative–negative. Compared with bacteriologically negative recurrent PTB at the first episode, bacteriologically positive cases were more often male (81.70% vs 72.79%; P < 0.001), had a higher mortality risk (1.78% vs 0.92%; P = 0.003), a lower proportion of cured or completed treatment (82.81% vs 84.97%; P = 0.022), and a longer time from onset to end of treatment. The probability of recurrence was higher in bacteriologically positive cases than in bacteriologically negative cases (0.5% vs 0.4% at 20 months; P < 0.05). Conclusions Based on patients' epidemiological characteristics and bacteriological type, it is necessary to actively enact measures to control recurrence.
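The recurrence probabilities above come from the Kaplan-Meier method. A minimal product-limit sketch on toy data (not the study's) illustrates the computation; the probability of recurrence by time t is one minus the event-free probability:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of event-free probability.
    times: follow-up time per patient; events: 1 = recurrence, 0 = censored.
    Returns (time, survival) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        m = sum(1 for tt, _ in data if tt == t)   # patients leaving risk set at t
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= m
        i += m
    return curve

# Toy cohort: recurrences at months 1, 2 and 3; one patient censored at month 2.
print(kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1]))
```

Censored patients reduce the risk set without contributing an event, which is exactly how the study can estimate recurrence probability despite incomplete follow-up.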
ROC curves of CPR, NLPR and FPR for the differential diagnosis of sepsis in pyogenic liver abscess patients
ROC curves of CPR, NLPR and FPR for the differential diagnosis of prolonged hospital stays in pyogenic liver abscess patients
Objective The purpose of the current study was to evaluate the association between the C-reactive protein-to-platelet ratio (CPR), neutrophil-to-lymphocyte*platelet ratio (NLPR) and fibrinogen-to-platelet ratio (FPR) and the prognoses of pyogenic liver abscess (PLA) patients. Methods A cohort of 372 patients with confirmed PLA were enrolled in this retrospective study between 2015 and 2021. Laboratory data were collected within 24 h of admission. The demographic characteristics and clinical features were recorded. Risk factors for outcomes of PLA patients were determined via multivariate logistic regression analyses, and optimal cut-off values were estimated using receiver operating characteristic (ROC) curve analysis. Results Of the 372 patients, 57.8% were men, 80 (21.5%) developed sepsis, and 33 (8.9%) developed septic shock. The levels of CPR, NLPR and FPR were significantly increased with the development of sepsis and with prolonged hospital stays in PLA patients. The multivariate logistic regression analysis indicated that CPR (OR: 2.262, 95% CI: 1.586–3.226, p < 0.001), NLPR (OR: 1.118, 95% CI: 1.070–1.167, p < 0.001) and FPR (OR: 1.197, 95% CI: 1.079–1.329, p = 0.001) were independent risk factors for sepsis in PLA patients, and NLPR (OR: 1.019, 95% CI: 1.004–1.046, p = 0.019) was an independent predictor of prolonged hospital stays. The ROC curve results showed that the three biomarkers had different predictive values; CPR performed best, with an AUC of 0.851 (95% CI: 0.807–0.896, p < 0.001) for sepsis. Conclusion Higher levels of CPR, NLPR and FPR were associated with a higher risk of poor outcomes. Moreover, a high CPR level performed best in predicting the clinical outcome of PLA patients.
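The reported AUCs (e.g. 0.851 for CPR) can be read as the probability that a randomly chosen patient who developed sepsis has a higher biomarker value than a randomly chosen patient who did not. A minimal rank-based (Mann-Whitney) sketch of this computation, on hypothetical scores:

```python
def auc(scores_pos, scores_neg):
    """AUC = P(positive-case score > negative-case score), ties count 0.5.
    Equivalent to the normalized Mann-Whitney U statistic."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical biomarker values: sepsis cases vs non-sepsis controls.
print(auc([0.9, 0.8, 0.7], [0.6, 0.5]))  # perfect separation -> 1.0
```

An AUC of 0.5 corresponds to a non-informative marker; the optimal cut-off the authors derive would be the threshold on these scores maximizing some criterion such as Youden's index.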
Background Airspace disease as seen on chest X-rays is an important point in triage for patients initially presenting to the emergency department with suspected COVID-19 infection. The purpose of this study is to evaluate a previously trained interpretable deep learning algorithm for the diagnosis and prognosis of COVID-19 pneumonia from chest X-rays obtained in the ED. Methods This retrospective study included 2456 (50% RT-PCR positive for COVID-19) adult patients who received both a chest X-ray and a SARS-CoV-2 RT-PCR test from January 2020 to March 2021 in the emergency department at a single U.S. institution. A total of 2000 patients were included as an additional training cohort and 456 patients formed the randomized internal holdout testing cohort for a previously trained Siemens AI-Radiology Companion deep learning convolutional neural network algorithm. Three cardiothoracic fellowship-trained radiologists systematically evaluated each chest X-ray and generated an airspace disease area-based severity score, which was compared against the same score produced by the artificial intelligence. The interobserver agreement, diagnostic accuracy, and predictive capability for inpatient outcomes were assessed. The principal statistical tests used in this study were univariate and multivariate logistic regression. Results Overall ICC was 0.820 (95% CI 0.790–0.840). The diagnostic AUC for SARS-CoV-2 RT-PCR positivity was 0.890 (95% CI 0.861–0.920) for the neural network and 0.936 (95% CI 0.918–0.960) for radiologists. The airspace opacities score by AI alone predicted ICU admission (AUC = 0.870) and mortality (AUC = 0.829) in all patients. Addition of age and BMI to a multivariate logistic model improved mortality prediction (AUC = 0.906). Conclusion The deep learning algorithm provides an accurate and interpretable assessment of the disease burden in COVID-19 pneumonia on chest radiographs.
The reported severity scores correlate with expert assessment and accurately predict important clinical outcomes. The algorithm contributes additional prognostic information not currently incorporated into patient management.
Non-contrast computed tomography (CT) scan performed in the emergency department showing a right parietal hypodense area
A Non-contrast CT scan showing the progress of the hypodense area in Fig. 1. B Magnetic resonance image (MRI) showing a right parietal space-occupying lesion (SOL), enhanced, surrounded by edema, with restricted diffusion in the periphery and no suspicion of bleeding on susceptibility weighted imaging. C A CT scan showing new hemorrhage in the SOL area. D MRI axial view: Upper – post-contrast gadolinium T1WI (left) and T2WI (right) showing postoperative changes and significantly decreased edema and mass effect around the abscess. Lower – diffusion-weighted imaging/apparent diffusion coefficient showing thinner restricted diffusion on the periphery
Multiple skin lacerations and cutaneous hematomas at different stages of evolution on both upper limbs which the patient and his spouse attributed to his gardening hobby
Background Nocardia cyriacigeorgica was first described in 2001. It is an emerging pathogen that mainly affects immunocompromised patients. A brain abscess caused by N. cyriacigeorgica has previously been reported only in immunocompromised hosts. We present a rare case of brain abscess caused by N. cyriacigeorgica in an adult male receiving low dose steroids. Case presentation A 75-year-old male weekend gardener without an immunocompromising condition presented with neurological complaints that were initially attributed to an ischemic stroke. Due to the unusual presentation and rapid progression, his condition was thought to be caused by a cerebral space-occupying lesion. He underwent an emergent right-sided parietal craniotomy, and histopathological examination of the specimen revealed an abscess caused by N. cyriacigeorgica. The patient received appropriate antibiotic treatment and completely recovered without sequelae. Conclusions Nocardia species are a rare cause of brain abscess in immunocompetent patients. Their clinical presentation can mimic other more common cerebral diseases, such as brain tumors (primary and secondary) and stroke. The possibility of an abscess caused by N. cyriacigeorgica should also be considered in the differential diagnosis in an immunocompetent patient.
Enrollment and Analysis Inclusion by Antepartum Period, Hepatitis B Surface Antigen Status, and Randomization Arm. HBsAg Hepatitis B Surface Antigen, ZDV zidovudine, TDF tenofovir disoproxil fumarate, ART antiretroviral therapy
Baseline Characteristics for Women Eligible for TDF Randomization
Pairwise differences in calculated creatinine clearance, calcium, and phosphate at birth for infants in P1084s
Background: Tenofovir disoproxil fumarate (TDF) in combination with other antiretroviral (ARV) drugs has been in clinical use for HIV treatment since its approval in 2001. Although the effectiveness of TDF in preventing perinatal HIV infection is well established, information about renal safety during pregnancy is still limited. Trial design: The IMPAACT PROMISE study was an open-label, strategy trial that randomized pregnant women to one of three arms: TDF based antiretroviral therapy (ART), zidovudine (ZDV) based ART, and ZDV alone (standard of care at start of enrollment). The P1084s substudy was a nested, comparative study of renal outcomes in women and their infants. Methods: PROMISE participants (n = 3543) were assessed for renal dysfunction using calculated creatinine clearance (CrCl) at study entry (> 14 weeks gestation), delivery, and postpartum weeks 6, 26, and 74. Of these women, 479 were enrolled in the P1084s substudy, which also assessed maternal calcium and phosphate as well as infant calculated CrCl, calcium, and phosphate at birth. Results: Among the 1338 women who could be randomized to TDF, less than 1% had a baseline calculated CrCl below 80 mL/min. The mean (standard deviation) maternal calculated CrCl at delivery in the TDF-ART arm [147.0 mL/min (51.4)] was lower than in the ZDV-ART [155.0 mL/min (43.3); primary comparison] and ZDV Alone [158.5 mL/min (45.0)] arms; the mean differences (95% confidence interval) were -8.0 mL/min (-14.5, -1.5) and -11.5 mL/min (-18.0, -4.9), respectively. The TDF-ART arm had lower mean maternal phosphate at delivery compared with the ZDV-ART [-0.14 mg/dL (-0.28, -0.01)] and ZDV Alone [-0.17 mg/dL (-0.31, -0.02)] arms, and a greater percentage of maternal hypophosphatemia at delivery (4.23%) compared with the ZDV-ART (1.38%) and ZDV Alone (1.46%) arms. Maternal calcium was similar between arms.
In infants, mean calculated CrCl, calcium, and phosphate at birth were similar between arms (all CIs included 0). Conclusions: Although mean maternal calculated CrCl at delivery was lower in the TDF-ART arm, the difference between arms is unlikely to be clinically significant. During pregnancy, the TDF-ART regimen had no observed safety concerns for maternal or infant renal function. Trial registration: NCT01061151 on 10/02/2010 for PROMISE (1077BF). NCT01066858 on 10/02/2010 for P1084s.
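The substudy reports "calculated creatinine clearance" without specifying the formula in this abstract; the Cockcroft-Gault equation is one common choice and is shown here purely as a hypothetical example of such a calculation:

```python
def cockcroft_gault(age_years, weight_kg, serum_cr_mg_dl, female=True):
    """Cockcroft-Gault estimated creatinine clearance (mL/min).
    Note: the abstract does not state which formula P1084s used;
    Cockcroft-Gault is shown only as a common example."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction for women

# Hypothetical patient: 30-year-old woman, 70 kg, serum creatinine 0.8 mg/dL.
print(round(cockcroft_gault(30, 70, 0.8, female=True), 1))
```

Values in the 130-160 mL/min range, as reported at delivery, reflect the physiological rise in glomerular filtration during pregnancy.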
Top-cited authors
Günter Kampf
  • University of Greifswald
Axel Kramer
  • University of Greifswald
Benjamin J Cowling
  • The University of Hong Kong
Moses Joloba
  • Makerere University
Nina Langeland
  • University of Bergen