Academisch Medisch Centrum Universiteit van Amsterdam
Recent publications
Background The aim of this study was to explore the relationship between follow-up imaging characteristics and overall survival (OS) in advanced hepatocellular carcinoma (HCC) patients under sorafenib treatment. Methods Associations between OS and objective response (OR) by mRECIST or early tumor shrinkage (ETS; ≥20% reduction in enhancing tumor diameter at the first follow-up imaging) were analyzed in HCC patients treated with sorafenib within a multicenter phase II trial (SORAMIC). 115 patients were included in this substudy. The relationship between survival and OR or ETS was explored. Landmark analyses were performed according to OR at fixed time points. Cox proportional hazards models with OR and ETS as time-dependent covariates were used to compare survival with factors known to influence OS. Results The OR rate was 29.5%. Responders had significantly better OS than non-responders (median 30.3 vs. 11.4 months; HR, 0.38 [95% CI, 0.22–0.63], p < 0.001), and longer progression-free survival (PFS; median 10.1 vs. 4.3 months, p = 0.015). Patients with ETS ≥ 20% had longer OS (median 22.1 vs. 11.4 months, p = 0.002) and PFS (median 8.0 vs. 4.3 months, p = 0.034) than patients with ETS < 20%. Besides OR and ETS, male gender, lower bilirubin and ALBI grade were associated with improved OS in univariate analysis. Separate models of multivariable analysis confirmed OR and ETS as independent predictors of OS. Conclusion OR according to mRECIST and ETS in patients receiving sorafenib treatment are independent prognostic factors for OS. These parameters can be used for assessment of treatment benefit and optimal treatment sequencing in patients with advanced HCC.
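The landmark approach used in this abstract can be sketched in a few lines: patients are classified by response status at a fixed landmark time, and only those still alive (and under follow-up) at that time are compared, which avoids guarantee-time (immortal-time) bias. The values below are hypothetical toy data for illustration, not the SORAMIC measurements.

```python
# Minimal landmark-analysis sketch with hypothetical toy data (months).

LANDMARK = 3.0  # fixed landmark time point

# (os_months, responded_by_landmark) per patient -- illustrative values only
patients = [
    (30.3, True), (25.0, True), (2.1, False), (11.4, False),
    (18.7, True), (6.5, False), (14.2, False), (22.9, True),
]

# Landmark rule: only patients still at risk at the landmark contribute,
# so early deaths cannot be misclassified as "non-responders who died".
at_risk = [(t, resp) for t, resp in patients if t >= LANDMARK]

responders = sorted(t for t, resp in at_risk if resp)
non_responders = sorted(t for t, resp in at_risk if not resp)

def median(xs):
    n = len(xs)
    return xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2

print("median OS, responders:", median(responders))
print("median OS, non-responders:", median(non_responders))
```

A full analysis would additionally handle censoring (e.g. Kaplan–Meier estimates) rather than raw medians; this sketch only shows the landmark selection step.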
Background The General Data Protection Regulation (GDPR) is a regulation in EU law on data protection and privacy in the European Union. We aimed to provide an overview of GDPR enablers of and barriers to the secondary use of health data in Europe, based on the research we conducted in the Joint Action InfAct (Information for Action!) WP10 "Assessing and piloting interoperability for public health policy", and to provide a national-level case study of experiences with secondary use of health data under the GDPR, using the example of the Austrian COVID-19 data platform. Methods Through desk research, we identified a number of European initiatives, projects and organizations that have dealt with cross-border health data sharing, linkage and management; we then conducted 17 semi-structured in-depth interviews and analyzed the interview transcripts by framework analysis. Results GDPR was seen as an enabler of the secondary use of health data in Europe with respect to user rights over their data, pre-existing laws on data privacy and data sharing, sharing of anonymized statistics, development of new data analysis approaches, patients' trust in how their health data are handled, and transparency. GDPR was seen as a barrier to the secondary use of health data in Europe with respect to identifiable and individual-level data, data sharing, the time needed to complete the process, increased workload, differences in national legislation, differing (and stricter) interpretations, and access to data. Conclusion The results of our analysis show that the GDPR acts as both an enabler and a barrier for the secondary use of health data in Europe. More research is needed to better understand the effects of the GDPR on the secondary use of health data, which can serve as a basis for future changes in the regulation.
Background Immunomodulatory therapies that improve the outcome of sepsis are not available. We sought to determine whether treatment of critically ill patients with sepsis with low-dose erythromycin—a macrolide antibiotic with broad immunomodulatory effects—decreased mortality and ameliorated underlying disease pathophysiology. Methods We conducted a target trial emulation, comparing patients with sepsis admitted to two intensive care units (ICU) in the Netherlands for at least 72 h, who were either exposed or not exposed during this period to treatment with low-dose erythromycin (up to 600 mg per day, administered as a prokinetic agent) but no other macrolides. We used two common propensity score methods (matching and inverse probability of treatment weighting) to deal with confounding by indication and subsequently used Cox regression models to estimate the treatment effect on the primary outcome of mortality rate up to day 90. Secondary clinical outcomes included change in SOFA, duration of mechanical ventilation and the incidence of ICU-acquired infections. We used linear mixed models to assess differences in 15 host response biomarkers reflective of key pathophysiological processes from admission to day 4. Results In total, 235 patients started low-dose erythromycin treatment, and 470 patients served as controls. Treatment started at a median of 38 [IQR 25–52] hours after ICU admission for a median of 5 [IQR 3–8] total doses in the first course. Matching and weighting resulted in populations well balanced for proposed confounders. We found no differences between patients treated with low-dose erythromycin and control subjects in mortality rate up to day 90: matching HR 0.89 (95% CI 0.64–1.24), weighting HR 0.95 (95% CI 0.66–1.36). There were no differences in secondary clinical outcomes. The change in host response biomarker levels from admission to day 4 was similar between erythromycin-treated and control subjects.
Conclusion In this target trial emulation in critically ill patients with sepsis, we could not demonstrate an effect of treatment with low-dose erythromycin on mortality, secondary clinical outcomes or host response biomarkers.
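As a rough illustration of the weighting approach named in this abstract, the sketch below computes stabilized inverse-probability-of-treatment weights from propensity scores. The scores and counts are hypothetical placeholders, not values fitted from the trial data.

```python
# Stabilized IPTW sketch with hypothetical propensity scores.

treated_ps = [0.40, 0.55, 0.30]   # P(treatment | confounders) for treated patients
control_ps = [0.20, 0.35, 0.50]   # same quantity for untreated controls

# Marginal probability of treatment in the (toy) sample
p_treat = len(treated_ps) / (len(treated_ps) + len(control_ps))

# Stabilized weights keep the weighted pseudo-population close to sample size
w_treated = [p_treat / ps for ps in treated_ps]
w_control = [(1 - p_treat) / (1 - ps) for ps in control_ps]

print("treated weights:", w_treated)
print("control weights:", w_control)
```

In practice the propensity scores come from a model of treatment assignment (e.g. logistic regression on the proposed confounders), and covariate balance is re-checked after weighting before fitting the outcome Cox model.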
Purpose Gadoxetic acid uptake on hepatobiliary phase MRI has been shown to correlate with β-catenin mutation in patients with HCC, which is associated with resistance to certain therapies. This study aimed to evaluate the prognostic value of gadoxetic acid uptake on hepatobiliary phase MRI in patients with advanced HCC receiving sorafenib. Methods 312 patients with available baseline hepatobiliary phase MRI received sorafenib alone or following selective internal radiation therapy (SIRT) within the SORAMIC trial. The signal intensities of the index tumor and normal liver parenchyma were measured on the native and hepatobiliary phase images; relative tumor enhancement higher than relative liver enhancement was accepted as high gadoxetic acid uptake, and its prognostic value was assessed using univariate and multivariate Cox proportional hazards models. Results The median OS of the study population was 13.4 (11.8–14.5) months. High gadoxetic acid uptake was seen in 51 (16.3%) patients, and none of the baseline characteristics was associated with high uptake. In univariate analysis, high gadoxetic acid uptake was significantly associated with shorter overall survival (10.7 vs. 14.0 months, p = 0.005). Multivariate analysis confirmed the independent prognostic value of high gadoxetic acid uptake (HR, 1.7 [1.21–2.3], p = 0.002), as well as Child–Pugh class (p = 0.033), tumor diameter (p = 0.002), and ALBI grade (p = 0.015). Conclusion In advanced HCC patients receiving sorafenib (alone or combined with SIRT), high gadoxetic acid uptake of the tumor on pretreatment MRI, a surrogate of β-catenin mutation, correlates with shorter survival. Gadoxetic acid uptake status might inform the treatment decision-making process.
Background: More than 300 cities including the city of Amsterdam in the Netherlands have joined the UNAIDS Fast-Track Cities initiative, committing to accelerate their HIV response and end the AIDS epidemic in cities by 2030. To support this commitment, we aimed to estimate the number and proportion of Amsterdam HIV infections that originated within the city, from Amsterdam residents. We also aimed to estimate the proportion of recent HIV infections during the 5-year period 2014-2018 in Amsterdam that remained undiagnosed. Methods: We located diagnosed HIV infections in Amsterdam using postcode data (PC4) at time of registration in the ATHENA observational HIV cohort, and used HIV sequence data to reconstruct phylogeographically distinct, partially observed Amsterdam transmission chains. Individual-level infection times were estimated from biomarker data, and used to date the phylogenetically observed transmission chains as well as to estimate undiagnosed proportions among recent infections. A Bayesian Negative Binomial branching process model was used to estimate the number, size and growth of the unobserved Amsterdam transmission chains from the partially observed phylogenetic data. Results: Between January 1 2014 and May 1 2019, there were 846 HIV diagnoses in Amsterdam residents, of whom 516 (61%) were estimated to have been infected in 2014-2018. The rate of new Amsterdam diagnoses since 2014 (104 per 100,000) remained higher than the national rate excluding Amsterdam (24 per 100,000), and in this sense Amsterdam remained an HIV hotspot in the Netherlands. An estimated 14% [12-16%] of infections in Amsterdam MSM in 2014-2018 remained undiagnosed by May 1 2019, and 41% [35-48%] in Amsterdam heterosexuals, with variation by region of birth. An estimated 68% [61-74%] of Amsterdam MSM infections in 2014-2018 had an Amsterdam resident as source, and 57% [41-71%] in Amsterdam heterosexuals, with heterogeneity by region of birth.
Of the locally acquired infections, an estimated 43% [37-49%] were in foreign-born MSM, 41% [35-47%] in Dutch-born MSM, 10% [6-18%] in foreign-born heterosexuals, and 5% [2-9%] in Dutch-born heterosexuals. We estimate the majority of Amsterdam MSM infections in 2014-2018 originated in transmission chains that already existed in 2014. Conclusions: This combined phylogenetic, epidemiologic, and modelling analysis in the UNAIDS Fast-Track City Amsterdam indicates that there remains considerable potential to prevent HIV infections among Amsterdam residents through city-level interventions. The burden of locally acquired infection remains concentrated in MSM, and both Dutch-born and foreign-born MSM would likely benefit most from intensified city-level interventions. Funding: This study received funding as part of the H-TEAM initiative from Aidsfonds (project number P29701). The H-TEAM initiative is being supported by Aidsfonds (grant number: 2013169, P29701, P60803), Stichting Amsterdam Dinner Foundation, Bristol-Myers Squibb International Corp. (study number: AI424-541), Gilead Sciences Europe Ltd (grant number: PA-HIV-PREP-16-0024), Gilead Sciences (protocol numbers: CO-NL-276-4222, CO-US-276-1712, CO-NL-985-6195), and M.A.C AIDS Fund.
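The branching-process model family named in this abstract can be illustrated with a toy simulation: each case produces a Negative Binomial number of secondary cases (drawn via its Gamma–Poisson mixture) until the chain dies out. All parameter values below are illustrative, not the study's fitted posterior estimates.

```python
# Toy subcritical branching process with Negative Binomial offspring.
import math
import random

random.seed(7)

def nb_offspring(mean, k):
    """Draw a Negative Binomial count via its Gamma-Poisson mixture."""
    lam = random.gammavariate(k, mean / k)  # Gamma mean = k * (mean/k) = mean
    # Knuth's Poisson sampler (adequate for the small rates used here)
    threshold, p, n = math.exp(-lam), 1.0, 0
    while True:
        p *= random.random()
        if p <= threshold:
            return n
        n += 1

def chain_size(mean, k, cap=10_000):
    """Total size of one transmission chain started from a single case."""
    size, active = 1, 1
    while active and size < cap:  # cap guards against rare runaway chains
        active = sum(nb_offspring(mean, k) for _ in range(active))
        size += active
    return size

# Subcritical example (mean offspring < 1), so chains eventually die out
sizes = [chain_size(mean=0.8, k=0.3) for _ in range(500)]
print("mean chain size:", sum(sizes) / len(sizes))
```

The dispersion parameter k controls superspreading: small k concentrates transmission in few chains, which is why chain-size data are informative about unobserved chains.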
Purpose To compare the treatment response and progression-free survival (PFS) in advanced hepatocellular carcinoma (HCC) patients who received sorafenib treatment either alone or combined with radioembolization (RE). Methods Follow-up images of the patients treated within a multicenter phase II trial (SORAMIC) were assessed by mRECIST. A total of 177 patients (73 combination arm [RE + sorafenib] and 104 sorafenib arm) were included in this post-hoc analysis. Response and progression characteristics were compared between treatment arms. Survival analyses were performed to compare PFS and post-progression survival between treatment arms. Multivariate Cox regression analysis was used to compare survival with factors known to influence PFS in patients with HCC. Results The combination arm had a significantly higher objective response rate (61.6% vs. 29.8%, p < 0.001) and complete response rate (13.7% vs. 3.8%, p = 0.022), and a trend toward a higher disease control rate (79.2% vs. 72.1%, p = 0.075). Progression was encountered in 116 (65.5%) patients and was more common in the sorafenib arm (75% vs. 52.0%, p = 0.001). PFS (median 8.9 vs. 5.4 months, p = 0.022) and hepatic PFS (9.0 vs. 5.7 months, p = 0.014) were significantly better in the combination arm. Multivariate analysis confirmed the treatment arm as an independent predictor of PFS. Conclusion In advanced HCC patients receiving sorafenib, combination with RE has an additive anticancer effect, resulting in higher and more durable tumor responses. However, the enhanced response did not translate into prolonged survival. Better patient selection and superselective treatment could improve outcomes after combination therapy.
Objectives Root reimplantation has been the favored approach for patients with heritable aortic disorder (HAD) requiring valve-sparing root replacement (VSRR). In the past few years, root remodelling with annuloplasty has emerged as an alternative to root reimplantation in the general population. The aim of this study was to examine the late outcomes of patients with HAD undergoing VSRR and compare different techniques. Methods Using the AVIATOR registry, data were collected from 5 North American and European centers. Patients were divided into 4 groups according to the valve-sparing technique used (root reimplantation, root remodelling with ring annuloplasty, root remodelling with suture annuloplasty and root remodelling alone). The primary end-points were freedom from aortic regurgitation (AR) ≥2 and freedom from reintervention on the aortic valve. Secondary end-points were survival and changes in annular dimensions over time. Results A total of 237 patients were included in the study (reimplantation = 100, remodelling + ring annuloplasty = 76, remodelling + suture annuloplasty = 34, remodelling alone = 27). The majority of patients had Marfan syndrome (82%). Preoperative AR ≥ 2 was present in 41% of the patients. Operative mortality was 0.4% (n = 1). No differences were found between techniques in terms of postoperative AR ≥ 2 (p = 0.58), reintervention (p = 0.52) and survival (p = 0.59). Changes in aortic annulus dimension were significantly different at 10 years (p < 0.05), a difference that started to emerge 4 years after surgery. Conclusions Overall, VSRR is a safe and durable procedure in patients with HAD. Nevertheless, root remodelling alone is associated with late annular dilatation. Addition of an annuloplasty, however, results in similar freedom from AR, reintervention, survival, and changes in annulus size compared to reimplantation.
Background Irreversible electroporation (IRE) ablation is generally performed with multielectrode catheters. Electrode-tissue contact is an important predictor of the success of pulmonary vein (PV) isolation; however, contact force is difficult to measure with multielectrode ablation catheters. In a preclinical study, we assessed the feasibility of a multielectrode impedance system (MEIS) as a predictor of long-term success of PV isolation. In addition, we present the first-in-human clinical experience with MEIS. Methods In 10 pigs, one PV was ablated based on impedance (MEIS group), and the other PV solely based on local electrogram information (electrophysiological group). IRE ablations were performed at 200 J. After 3 months, recurrence of conduction was assessed. Subsequently, in 30 patients undergoing PV isolation with IRE, MEIS was evaluated and MEIS contact values were compared to local electrograms. Results In the porcine study, 43 IRE applications were delivered in 19 PVs. Acutely, no reconnections were observed in either group. After 3 months, 0 versus 3 (P = 0.21) PVs showed conduction recurrence in the MEIS and electrophysiological groups, respectively. In the clinical study, a significant linear relation was found between mean MEIS value and bipolar dV/dt (r² = 0.49, P < 0.001), with a slope of 20.6 mV/s per Ohm. Conclusions Data from the animal study suggest that MEIS values predict effective IRE applications. For the long-term success of electrical PV isolation with circular IRE applications, no significant difference in efficacy was found between ablation based on the measurement of electrode interface impedance and ablation using the classical electrophysiological approach for determining electrode-tissue contact. Experiences from the first clinical use of MEIS were promising and serve as an important basis for future research.
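The reported linear relation between mean impedance value and bipolar dV/dt is the kind of fit produced by ordinary least-squares regression, sketched below. The data points are hypothetical, not the trial measurements.

```python
# Ordinary least-squares fit of dV/dt against impedance, hypothetical data.

xs = [60, 75, 90, 110, 130]          # mean impedance value (Ohm), toy values
ys = [1300, 1550, 1980, 2250, 2700]  # bipolar dV/dt (mV/s), toy values

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Slope = covariance(x, y) / variance(x); intercept from the means
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
    / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# r^2: fraction of dV/dt variance explained by the linear model
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - my) ** 2 for y in ys)
r2 = 1 - ss_res / ss_tot

print(f"slope = {slope:.1f} mV/s per Ohm, r^2 = {r2:.2f}")
```

The slope's units (mV/s per Ohm) follow directly from dividing the y-units by the x-units, matching how the abstract reports its coefficient.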
The recombination-activating genes (RAG) 1 and 2 are indispensable for diversifying the primary B cell receptor repertoire and pruning self-reactive clones via receptor editing in the bone marrow; however, the impact of RAG1/RAG2 on peripheral tolerance is unknown. Partial RAG deficiency (pRD) manifesting with late-onset immune dysregulation represents an ‘experiment of nature’ to explore this conundrum. By studying B cell development and subset-specific repertoires in pRD, we demonstrate that reduced RAG activity impinges on peripheral tolerance through the generation of a restricted primary B cell repertoire, persistent antigenic stimulation and an inflammatory milieu with elevated B cell-activating factor. This unique environment gradually provokes profound B cell dysregulation with widespread activation, remarkable extrafollicular maturation and persistence, expansion and somatic diversification of self-reactive clones. Through the model of pRD, we reveal a RAG-dependent ‘domino effect’ that impacts stringency of tolerance and B cell fate in the periphery.
Pre-eclampsia (PE) affects 2–8% of pregnancies and is responsible for significant morbidity and mortality. The maternal clinical syndrome (defined by hypertension, proteinuria, and organ dysfunction) is the result of endothelial dysfunction. The endothelial response to increased levels of soluble FMS-like Tyrosine Kinase 1 (sFLT1) is thought to play a central role. sFLT1 is released from multiple tissues and binds VEGF with high affinity, antagonizing its activity. Expression of soluble FLT1 variants results from alternative splicing; however, the mechanism is incompletely understood. We hypothesize that neuro-oncological ventral antigen 2 (NOVA2) contributes to this. NOVA2 was inhibited in human umbilical vein endothelial cells (HUVECs) and multiple cellular functions were assessed. NOVA2 and FLT1 expression in the placenta of PE, pregnancy-induced hypertension, and normotensive controls was measured by RT-qPCR. Loss of NOVA2 in HUVECs resulted in significantly increased levels of sFLT1, but did not affect expression of membrane-bound FLT1. NOVA2 protein was shown to directly interact with FLT1 mRNA. Loss of NOVA2 was also accompanied by impaired endothelial functions such as sprouting. We were able to restore sprouting capacity by exogenous VEGF. We did not observe statistically significant regulation of NOVA2 or sFLT1 in the placenta. However, we observed a negative correlation between sFLT1 and NOVA2 expression levels. In conclusion, NOVA2 was found to regulate FLT1 splicing in the endothelium. Loss of NOVA2 resulted in impaired endothelial function, at least partially dependent on VEGF. In PE patients, we observed a negative correlation between NOVA2 and sFLT1.
Rationale: There is a major unmet need for improving the care of children and adolescents with severe asthma and wheeze. Identification of factors contributing to disease severity may lead to improved diagnostics, biomarkers, or therapies. The airway microbiota may be such a key factor. Objective: To compare the oropharyngeal airway microbiota of children and adolescents with severe and mild/moderate asthma/wheeze. Methods: Oropharyngeal swab samples from school-age and pre-school children in the European U-BIOPRED multicenter study of severe asthma, all receiving severity-appropriate treatment, were examined using 16S rRNA gene sequencing. Bacterial taxa were defined as Amplicon Sequence Variants (ASVs). Results: We analysed 241 samples from four cohorts: A) 86 school-age children with severe asthma, B) 39 school-age children with mild/moderate asthma, C) 65 pre-school children with severe wheeze and D) 51 pre-school children with mild/moderate wheeze. The most common bacteria were Streptococcus (mean relative abundance 33.5%), Veillonella (10.3%), Haemophilus (7.0%), Prevotella (5.9%) and Rothia (5.5%). Age group (school-age versus pre-school) was associated with the microbiota in beta-diversity analysis (F=3.32, p=0.011) and in a differential abundance analysis (28 significant ASVs). Among all children, we found no significant difference in the microbiota between children with severe and mild/moderate asthma/wheeze in a univariable beta-diversity analysis (F=1.99, p=0.08, n=241), but a significant difference in a multivariable model (F=2.66, p=0.035), including number of exacerbations in the previous year. Age was also significant when expressed as a Microbial Maturity Score (Spearman Rho 0.39, p=4.6e-10); however, this score was not associated with asthma/wheeze severity.
Conclusion: There was a modest difference in the oropharyngeal airway microbiota between children with severe and mild/moderate asthma/wheeze across all children but not in individual age groups, and a strong association between the microbiota and age. This suggests the oropharyngeal airway microbiota as an interesting entity in studying asthma severity, but probably without the strength to serve as a biomarker for targeted intervention.
Introduction: Non-obstructive coronary arteries (NOCA) are present in 39.7% to 62.4% of patients who undergo elective angiography. The coronary microcirculation (vessels <400 µm) is not visible on angiography; functional assessment, invasive or non-invasive, therefore plays a key role in providing a more personalized diagnosis of angina. Areas covered: In this review, we examine the pathophysiology, clinical importance and invasive assessment of the coronary microcirculation, and discuss angiography-derived indices of microvascular resistance. A comprehensive literature review spanning four decades is also undertaken. Expert opinion: The coronary microvasculature plays an important role in flow autoregulation and metabolic regulation. Invasive assessment of microvascular resistance is a validated modality with independent prognostic value; nevertheless, its routine application is hampered by the requirement for intravascular instrumentation and hyperaemic agents. The angiography-derived index of microvascular resistance has emerged as a promising surrogate in pilot studies; however, more data are needed to validate and compare the diagnostic and prognostic accuracy of different equations, and to clarify the relationship between angiography-derived parameters for epicardial coronary arteries and those for the microvasculature.
Background Valosin-containing protein (VCP) disease, caused by mutations in the VCP gene, results in myopathy, Paget's disease of bone (PDB) and frontotemporal dementia (FTD). Natural history and genotype–phenotype correlation data are limited. This study characterises patients with mutations in the VCP gene and investigates genotype–phenotype correlations. Methods Descriptive retrospective international study collecting clinical and genetic data of patients with mutations in the VCP gene. Results Two hundred and fifty-five patients (70.0% males) were included in the study. Mean age was 56.8±9.6 years and mean age of onset 45.6±9.3 years. Mean diagnostic delay was 7.7±6 years. Symmetric lower limb weakness was reported in 50% at onset, progressing to generalised muscle weakness. Other common symptoms were ventilatory insufficiency (40.3%), PDB (28.2%), dysautonomia (21.4%) and FTD (14.3%). Fifty-seven genetic variants were identified, 18 of them not previously reported. c.464G>A (p.Arg155His) was the most frequent variant, identified in 28% of patients. Full-time wheelchair users accounted for 19.1%, with a median time from disease onset to becoming a full-time wheelchair user of 8.5 years. Variant c.463C>T (p.Arg155Cys) was associated with an earlier onset (37.8±7.6 years) and a higher frequency of axial and upper limb weakness, scapular winging and cognitive impairment. Forced vital capacity (FVC) below 50% was a risk factor for being a full-time wheelchair user, while FVC <70% and being a full-time wheelchair user were associated with death. Conclusion This study expands the knowledge of the phenotypic presentation, natural history, genotype–phenotype correlations and risk factors for disease progression of VCP disease, and is useful for improving the care provided to patients with this complex disease.
Background Around sixty million people worldwide play golf. Golf injuries are most frequently located in the spine, elbow, wrist, hand and shoulder. These injuries are often seen in golfers with more playing hours and suboptimal swing biomechanics, resulting in overuse injuries. Golfers who do not warm up, or do not warm up appropriately, are more likely to report an injury than those who do. There are several ways to warm up, and it is unclear which warm-up is most useful for a golfer to perform. Moreover, there is currently no evidence for the effectiveness of a warm-up program for golf injury prevention. We previously developed the Golf Related Injury Prevention Program (GRIPP) intervention using the Knowledge Transfer Scheme (KTS). We aim to evaluate the effect of the GRIPP intervention on golf-related injuries. The hypothesis is that the GRIPP intervention program will reduce the number of golf-related injuries. Methods and design The GRIPP study is a two-armed randomized controlled trial. Twenty-eight golf clubs with 11 golfers per club will be randomly allocated to the intervention or control group. The intervention group will perform the GRIPP intervention program, and the control group will perform their warm-up as usual. The GRIPP intervention was developed with the Knowledge Transfer Scheme framework, a systematic process for developing an intervention. The intervention consists of 6 exercises taking a maximum of 10 min in total. The primary outcome is the overall prevalence (%) of golf injuries, measured every fortnight with the Oslo Sports Trauma Research Center (OSTRC-H) questions on health problems. The secondary outcome measures will be exposure to golf and compliance with the intervention program. Discussion In other sports, warm-up prevention programs are effective in reducing the risk of injuries. There are no randomized trials on golf injury prevention.
Therefore, an individual, unsupervised intervention program for golf athletes is conducted, reflecting the daily practice of amateur golfers, whose exposure is predominantly unsupervised. Trial registration The trial was retrospectively registered (28 October 2021) at the Dutch Trial Register: NL9847.
Background: Interleukin (IL)-36 signaling has been shown to be increased in ulcerative colitis (UC). Spesolimab, a novel humanized monoclonal antibody, targets the IL-36 pathway. Research design and methods: We report safety, immunogenicity, and efficacy data of intravenous [IV] spesolimab in UC. Study 1: phase II randomized, placebo-controlled trial (300 mg single dose; 450 mg every 4 weeks [q4w]; or 1,200 mg q4w, three doses). Study 2: phase IIa, randomized, placebo-controlled trial (1,200 mg q4w). Study 3: phase IIa, open-label, single-arm trial (1,200 mg q4w). Studies lasted 12 weeks, with a 12-, 24-, and 16-week safety follow-up, respectively. Results: Adverse event (AE) rates were similar for spesolimab and placebo in Studies 1 (N = 98; 64.9%; 65.2%) and 2 (N = 22; 86.7%; 71.4%); all patients in Study 3 (N = 8) experienced AEs. The most frequent investigator-assessed drug-related (spesolimab; placebo) AEs were skin rash (5.4%; 0%) and nasopharyngitis (4.1%; 0%) in Study 1; acne (13.3%; 0%) in Study 2; one patient reported skin rash, nasopharyngitis, headache, and acne in Study 3. Efficacy endpoints were not met. Conclusions: Spesolimab was generally well tolerated, with no unexpected safety concerns. The safety data are consistent with studies in other inflammatory diseases.
Objective: To evaluate the relationship between reported coronavirus disease 2019 (COVID-19)-like symptoms and the presence of severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) antibodies in patients with an immune-mediated inflammatory disorder or post-solid organ transplantation (IMIDT) with and without immunosuppressive medication (imed) and controls. Method: The IENIMINI cohort was a prospective cohort study set up in the Netherlands in March 2020, with 2 monthly (paper) or weekly (online) questionnaires about COVID-19-like symptoms. Participants from this cohort who reported these symptoms between March 2020 and November 2020 were approached for this substudy. SARS-CoV-2 antibodies were tested using a total antibody assay. Results: Of the 1203 participants approached, 629 agreed to participate and were sent a fingerprick test; 565 participants collected a capillary blood sample, of which 562 were usable. Analysis showed that 57/202 (28.2%) of the tested IMIDT group with imed, 48/163 (29.4%) of the IMIDT group without imed, and 69/197 (35.0%) of the control group tested positive for SARS-CoV-2 antibodies. Seroprevalences of SARS-CoV-2 antibodies between males and females, biological disease-modifying anti-rheumatic drug users and non-users, and those who had had a serious disease period (defined as an episode with dyspnoea and fever) and those who had not, were not statistically different between the three groups. Conclusions: Approximately 30% of patients who had reported COVID-19-like symptoms had SARS-CoV-2 antibodies. The seroprevalence of SARS-CoV-2 antibodies after reported COVID-19-like symptoms was similar in IMIDT patients with and without imed compared to controls.
Resistance plasmids are crucial for the transfer of antimicrobial resistance and are thus a matter of concern for veterinary and human healthcare. To study plasmid transfer, foodborne Escherichia coli isolates harboring one to five known plasmids were co-incubated with a general recipient strain. Plasmid transfer rates under standardized conditions varied by a factor of almost 10⁶, depending on the recipient/donor strain combination. After 1 hour, transconjugants never accounted for more than 3% of the total number of cells. Transconjugants were formed from 14 donors within 1 hour of co-incubation, but in the case of 3 donors 24 hours were needed. Transfer rates were also measured during longer co-incubation, between different species and during repeated back-and-forth transfer. Longer co-incubation resulted in the transfer of more types of resistance. Maximum growth rates of donor strains varied by a factor of 3. Donor strains often had higher growth rates than the corresponding transconjugants, which grew at the same rate as, or slightly faster than, the recipient. Hence, possessing one or more plasmids does not seem to burden the harboring strain metabolically. Transfer was species-specific, and repeated transfer of one plasmid did not result in different transfer rates over time. Transmission electron microscopy was used to analyze the morphology of the connections between co-incubated strains. Connections with more pili between the cells resulted in better aggregate formation and corresponded with higher transfer rates.
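Conjugative transfer rates like those compared in this abstract are often expressed with the end-point estimate of Simonsen et al. (1990). Whether this study used that exact estimator is an assumption here, and the CFU counts below are hypothetical placeholders, not the study's measurements.

```python
# End-point conjugation-rate sketch with hypothetical CFU counts.
import math

# Colony-forming units per mL at the end of co-incubation (hypothetical)
D = 5e8   # donors
R = 4e8   # recipients
T = 2e6   # transconjugants
N = D + R + T           # total end-point density
N0 = 1e6                # total density at the start of co-incubation

psi = math.log(N / N0)  # realized population growth over the incubation

# gamma is in mL per cell per incubation period; smaller values mean
# less efficient transfer, and strain pairs can differ by orders of magnitude
gamma = psi * math.log(1 + (T / R) * (N / D)) / (N - N0)
print(f"estimated transfer rate: {gamma:.3e}")
```

Because gamma normalizes for cell densities and growth, it allows transfer efficiency to be compared across donor/recipient combinations whose growth dynamics differ, which simple transconjugant-per-donor ratios do not.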
3,377 members
Judith de Vos
  • Department of Biomedical Engineering and Physics
Shazia Micheal
  • Department of Clinical Genetics
Ingeborg Klaassen
  • Department of Ophthalmology
Judy Luigjes
  • Department of Psychiatry
Meibergdreef 9, 1105 AZ, Amsterdam, Netherlands