Background The development of accurate urinary biomarkers for non-invasive and cost-effective detection of primary and recurrent bladder tumours is recognized as one of the major clinical needs in bladder cancer diagnostics. The purposes of this study were (1) to validate the results of a previous technical comparison by determining the diagnostic performance of nine methylation markers in urine pellet compared to full void urine, and (2) to validate the diagnostic performance of the optimal marker panel GHSR/MAL from a previous exploratory study in a preclinical setting. Methods Urine samples of 108 patients with bladder cancer and 100 age- and gender-matched controls were prospectively collected for methylation analysis. Urinary methylation levels of the markers FAM19A4, GHSR, MAL, miR-129, miR-935, PHACTR3, PRDM14, SST and ZIC1 were determined with quantitative methylation-specific PCR in urine pellet. Areas under the curve (AUCs) were determined for the individual markers and for the marker panel GHSR/MAL. The diagnostic performance of the marker panel GHSR/MAL was evaluated in the total study population and in subgroups of patients with bladder cancer using the Chi-square test. Diagnostic accuracy was assessed by leave-one-out cross-validation. Results All nine urinary methylation markers (FAM19A4, GHSR, MAL, miR-129, miR-935, PHACTR3, PRDM14, SST and ZIC1) showed significantly higher methylation levels in bladder cancer patients than in controls (p < 0.001). AUCs of the nine methylation markers tested in urine pellet were similar to those in full void urine from an independent previous cohort. GHSR/MAL reached an AUC of 0.89 (95% confidence interval [CI] 0.84–0.94), at 80% sensitivity and 93% specificity. Sensitivity of GHSR/MAL increased with higher tumour grades and stages, and was higher in primary vs. recurrent tumours and in males vs. females.
Conclusions This technical validation supports the robustness of DNA methylation analysis in urine pellet and full void urine for the non-invasive detection of bladder cancer. Subsequent preclinical validation confirmed the diagnostic potential of GHSR/MAL. These findings underline the diagnostic potential of the marker panel GHSR/MAL for future bladder cancer diagnostics.
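The panel evaluation described above rests on two generic procedures: the AUC as a ranking statistic and leave-one-out cross-validation. A minimal sketch of both in Python, using hypothetical methylation ratios rather than data from the study (the fold-wise midpoint-threshold classifier is an illustrative stand-in, not the study's actual panel model):

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC: fraction of (case, control) pairs ranked correctly,
    counting ties as half a win (equivalent to the Mann-Whitney U statistic)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def loocv_accuracy(values, labels):
    """Leave-one-out cross-validation: hold out each sample in turn,
    fit a simple rule (midpoint of the class means) on the rest,
    and classify the held-out sample with that rule."""
    correct = 0
    n = len(values)
    for i in range(n):
        train_x = values[:i] + values[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        mean_pos = sum(x for x, y in zip(train_x, train_y) if y == 1) / train_y.count(1)
        mean_neg = sum(x for x, y in zip(train_x, train_y) if y == 0) / train_y.count(0)
        threshold = (mean_pos + mean_neg) / 2
        pred = 1 if values[i] >= threshold else 0
        correct += (pred == labels[i])
    return correct / n

# hypothetical marker values for cases (label 1) and controls (label 0)
cases = [0.9, 0.8, 0.7, 0.4]
controls = [0.3, 0.2, 0.6, 0.1]
print(auc(cases, controls))            # → 0.9375
print(loocv_accuracy(cases + controls, [1, 1, 1, 1, 0, 0, 0, 0]))  # → 0.75
```

The point of the hold-out structure is that each sample is scored by a rule fitted without it, so the resulting accuracy is not inflated by reusing the same data for fitting and evaluation.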
Purpose International guidelines vary in their recommendations on whether or not to reduce the therapeutic dose of low molecular weight heparins (LMWHs) in renal impairment. The use of anti-Xa monitoring as a basis for dose adjustments is also a matter of debate. As this may lead to variations in treatment policies, we aimed to study the treatment policies for therapeutically dosed LMWHs in renal impairment in Dutch hospitals. Methods An 11-item survey was distributed between June 2020 and March 2021 to hospital pharmacists representing Dutch hospital organisations. Primary outcomes were the dosing regimens of therapeutically dosed LMWHs in renally impaired patients. Secondary outcomes were the proportion of hospitals that used anti-Xa monitoring and the anti-Xa target ranges used. Results Responses were received from 56 of 69 (81%) Dutch hospital organisations; in each case a hospital pharmacist completed the survey. In these hospitals, 77 LMWH regimens were in use. In 76 of 77 (99%) regimens, a regular dose reduction was used at the start of treatment. Fifty-five of these hospitals used a dose reduction if the estimated glomerular filtration rate (eGFR) was < 50 ml/min and 17 used a dose reduction if eGFR was < 30 ml/min. Anti-Xa levels were not routinely monitored in 40% of regimens, while 22% monitored anti-Xa if eGFR < 50 ml/min, 27% if eGFR < 30 ml/min and 10% at other eGFR cut-off values. Target ranges of 1.0–2.0 IU/ml (once daily) and 0.5/0.6–1.0 IU/ml (twice daily) were used in 69% of regimens that included anti-Xa monitoring. Conclusion Treatment policies for therapeutically dosed LMWHs in renally impaired patients show substantial diversity. The most commonly used regimen was a regular dose reduction if eGFR < 50 ml/min, without anti-Xa monitoring.
Parent‐infant closeness and active parent participation in neonatal care are important for parent and infant health. This study aimed to give an overview of current neonatal settings and to gain an in‐depth understanding of facilitators of and barriers to parent‐infant closeness (zero‐separation) in 19 countries. Neonatal intensive care unit (NICU) professionals, representing 45 NICUs from a range of geographic regions in Europe and Canada, were purposefully selected and interviewed June–December 2018. Thematic analysis was conducted to identify, analyze and report patterns (themes) for parent‐infant closeness across the entire series of interviews. Parent‐infant separation during infant and/or maternity care is very common (42/45 units, 93%), despite the implementation of family integrated care (FICare) practices, including parent participation in medical rounds (17/45, 38%), structured education sessions for parents (16/45, 36%) and structured training for healthcare professionals (22/45, 49%). NICU professionals encountered four main themes of facilitators and barriers to parent‐infant closeness at and between the hospital, unit, staff, and family levels: Culture (jointly held characteristics, values, thinking and behaviors about parental presence and participation in the unit), Collaboration (the act of working together between and within different levels), Capacities (resources and policies), and Coaching (education to acquire and transfer knowledge and skills). Implementing parent‐infant closeness in the NICU is still challenging for healthcare professionals. Further optimization of neonatal care towards zero‐separation and parent‐infant closeness can be achieved by enforcing the ‘four Cs for Closeness’: Culture, Collaboration, Capacities, and Coaching.
Background The Dutch Working Party on Antibiotic Policy (SWAB), in collaboration with relevant professional societies, has updated its evidence-based guidelines on empiric antibacterial therapy of sepsis in adults. Methods Our multidisciplinary guideline committee generated ten population, intervention, comparison, and outcome (PICO) questions relevant to adult patients with sepsis. For each question, a literature search was performed to obtain the best available evidence, which was assessed using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) system. The quality of evidence for clinically relevant outcomes was graded from high to very low. In structured consensus meetings, the committee formulated recommendations as strong or weak. When evidence could not be obtained, recommendations were provided based on expert opinion and experience (good practice statements). Results Fifty-five recommendations on the antibacterial therapy of sepsis were generated. Recommendations on empiric antibacterial therapy choices were differentiated for sepsis according to the source of infection, the potential causative pathogen and its resistance pattern. One important revision was the distinction between low, increased and high risk of infection with Enterobacterales resistant to third-generation cephalosporins (3GRC-E) to guide the choice of empirical therapy. Other new topics included empirical antibacterial therapy in patients with a reported penicillin allergy and the role of pharmacokinetics and pharmacodynamics in guiding dosing in sepsis. We also established recommendations on the timing and duration of antibacterial treatment. Conclusions Our multidisciplinary committee formulated evidence-based recommendations for the empiric antibacterial therapy of adults with sepsis in the Netherlands.
Purpose Previous studies have shown a wide range of efficacy (29 to 71%) of mandibular advancement devices (MADs) in the treatment of obstructive sleep apnea (OSA). Currently, the ability to preselect suitable patients for MAD therapy based on individual characteristics related to upper airway collapsibility is limited. We investigated whether the use of a non-custom interim MAD during drug-induced sleep endoscopy (DISE) could be a valuable screening tool to predict MAD treatment outcome. Methods In a single-center prospective study including a consecutive series of patients with OSA, we compared DISE outcomes with a MAD in situ with polysomnography results after 3 months of using the same MAD that was used during DISE. Results In the 41 patients who completed the study, the median apnea–hypopnea index (AHI) was 16.0 events/h [IQR 7.4–23.4]. Respiratory outcomes on polysomnography, including apnea index (AI), total AHI, AHI in supine position, and oxygen desaturation index, all improved significantly after 3 months of MAD treatment. With complete resolution of the upper airway obstruction with the MAD in situ during DISE in supine position, patients were 6.3 times more likely to respond to MAD treatment than patients with a persisting complete obstruction, although this was not statistically significant (OR 6.3; 95% CI 0.9–42.7; p = 0.060). Conclusion A predictive value of an interim MAD used during DISE for MAD therapy outcome would be an important finding, since prediction of MAD therapy outcome is of great clinical and scientific interest. A study with a larger cohort should be performed to further investigate our findings.
Background Patients with a transient ischemic attack (TIA) or ischemic stroke are at increased risk of developing cognitive impairment in the subacute phase. At present, the effects of exercise on cognitive functioning following a TIA or stroke are not fully known. The purpose of this trial was to investigate the effect of exercise on global cognition. Methods The MoveIT trial is a single-centre, observer-blinded, randomized controlled trial involving a 1-year exercise intervention consisting of a 12-week group exercise program combined with three counselling visits to the physiotherapists over a 9-month period. The control group received standard care. The primary outcome was global cognitive functioning, assessed at one year using the Montreal Cognitive Assessment (MoCA). Secondary outcomes included cardiorespiratory fitness, cardiovascular profile, attainment of secondary prevention targets, anxiety, depression and fatigue at one and two years. Results The experimental group consisted of 60 patients and the control group of 59 patients. The mean age was 64.3 years and 41% were female. No between-group differences were found in global cognitive functioning (MD 0.7 out of 30, 95% CI −0.2 to 1.6) or in secondary outcome measures at 12 months. The only significant between-group difference was found for fatigue, in favour of the experimental group at 12 months (MD 0.6 out of 63, 95% CI 0.1 to 1.1). Conclusions No benefit of this exercise intervention was found for global cognition. Future studies need to focus on optimizing rehabilitation strategies for this vulnerable group of patients. Trial registration: http://www.trialregister.nl. Unique identifier: NL3721. Date of first registration: 06-03-2013.
Purpose There is growing evidence that patients with certain simple stable musculoskeletal injuries can be discharged directly from the Emergency Department (ED), without compromising patient outcome and experience. This study aims to review the literature on the effects of direct discharge (DD) of simple stable musculoskeletal injuries, regarding healthcare utilization, costs, patient outcome and experience. Methods A systematic review was performed in Medline, Embase, CINAHL, Cochrane Library and Web of Science using PRISMA guidelines. Comparative and non-comparative studies on DD of simple stable musculoskeletal injuries from the ED in an adult/paediatric/mixed population were included if reporting ≥ 1 of: (1) logistic outcomes: DD rate (proportion of patients discharged directly); number of follow-up appointments; DD return rate; (2) costs; (3) patient outcomes/experiences: functional outcome; treatment satisfaction; adverse outcomes; other. Results Twenty-six studies were included (92% conducted in the UK). Seven studies (27%) assessed functional outcome, nine (35%) treatment satisfaction, and ten (38%) adverse outcomes. A large proportion of studies defined DD eligibility criteria as injuries being minor/simple/stable, without further detail. ED DD rate was 26.7–59.5%. Mean number of follow-up appointments was 1.00–2.08 pre-DD, vs. 0.00–0.33 post-DD. Return rate was 0.0–19.4%. Costs per patient were reduced by €69–€210 (ranging from − 38.0 to − 96.6%) post-DD. Functional outcome and treatment satisfaction levels were ‘equal’ or ‘better’ (comparative studies), and ‘high’ (non-comparative studies), post-DD. Adverse outcomes were low and comparable. Conclusions This systematic review supports the idea that DD of simple stable musculoskeletal injuries from the ED provides an opportunity to reduce healthcare utilization and costs without compromising patient outcomes/experiences. 
To improve comparability and facilitate implementation/external validation of DD, future studies should provide detailed DD eligibility criteria, and use a standard set of outcomes. Systematic review registration number: 120779, date of first registration: 12/02/2019.
Objective Previous studies reported less prenatal healthcare consumption and more perinatal complications in women with a migrant background. Hence, in a country with free healthcare access, we investigated whether women with a migrant background differed from native Dutch women with respect to pregnancy complications and healthcare consumption, and in terms of associations with psychological distress. Methods We included 324 native Dutch women and 303 women with a migrant background who visited two hospitals in Amsterdam for antenatal care between 2014 and 2015. Participants completed the Edinburgh Postnatal Depression Scale, the Hospital Anxiety and Depression Scale, and sociodemographic questions. Complications and healthcare consumption during pregnancy were extracted from medical records. Regression analyses were used with adjustment for covariates. Results Except for gestational diabetes [adjusted OR = 3.09; 95% CI = (1.51, 6.32)], no differences were found between groups in perinatal complications [OR = 1.15; 95% CI = (0.80, 1.64)] or in healthcare consumption [OR = 0.87; 95% CI = (0.63, 1.19)]. Women with a migrant background reported more depressive symptoms [Cohen’s d = 0.25; 95% CI = (0.10, 0.41)], even after adjustment for socio-economic factors. Psychological distress was associated with more hospital admissions during pregnancy. When experiencing depressive symptoms, women with a migrant background had an increased risk of admission [OR = 1.11; 95% CI = (1.01, 1.21)]. Conclusions for Practice In a country with free healthcare access, this cohort study found no differences in pregnancy-related complications (except for gestational diabetes) or in healthcare consumption between women with a migrant background and native Dutch women. However, women with a migrant background experienced more depressive symptoms, and when depressed their risk of hospital admission increased. Additional research is warranted to improve healthcare for this population.
Aims The aim of this study was to explore the functional results in a fitter subgroup of participants in the Hip Fracture Evaluation with Alternatives of Total Hip Arthroplasty versus Hemiarthroplasty (HEALTH) trial to determine whether there was an advantage of total hip arthroplasty (THA) versus hemiarthroplasty (HA) in this population. Methods We performed a post hoc exploratory analysis of a fitter cohort of patients from the HEALTH trial. Participants were aged over 50 years and had sustained a low-energy displaced femoral neck fracture (FNF). The fittest participant cohort was defined as participants aged 70 years or younger, classified as American Society of Anesthesiologists grade I or II, independent walkers prior to fracture, and living at home prior to fracture. Multilevel models were used to estimate the effect of THA versus HA on functional outcomes. In addition, a sensitivity analysis of the definition of the fittest participant cohort was performed. Results There were 143 patients included in the fittest cohort. Mean age was 66 years (SD 4.5) and 103 were female (72%). No clinically relevant differences were found between the treatment groups in the primary and sensitivity analyses. Conclusion This analysis found no differences in functional outcomes between HA and THA within two years of displaced low-energy FNF in a subgroup analysis of the fittest HEALTH patients. These findings suggest that very few patients above 50 years of age benefit in a clinically meaningful way from a THA versus a HA early after injury. Cite this article: Bone Jt Open 2022;3(8):611–617.
Introduction Within the value-based healthcare framework, outcome data can be used to inform patients about (treatment) options and empower them to make shared decisions with their health care professional. To facilitate shared decision-making (SDM) supported by outcome data, a multicomponent intervention has been designed, including patient decision aids on the organisation of post-treatment surveillance (breast cancer), discharge location (stroke) and treatment modality (advanced kidney disease), and training on SDM for health care professionals. The SHared decision-making supported by OUTcome information (SHOUT) study will examine the effectiveness of the intervention and its implementation in clinical practice. Methods and analysis Multiple interrupted time series will be used to implement the intervention stepwise. Patients diagnosed with breast cancer (N=630), stroke (N=630) or advanced kidney disease (N=473) will be included. Measurements will be performed at baseline and at three (stroke), six and twelve (breast cancer and advanced kidney disease) months. Trends in outcomes will be measured over a period of 20 months. The primary outcome will be patients’ perceived level of involvement in decision-making. Secondary outcomes regarding effectiveness will include patient-reported SDM, decisional conflict, role in decision-making, knowledge, quality of life, preferred and chosen care, satisfaction with the intervention, healthcare utilisation and health outcomes. Outcomes regarding implementation will include the implementation rate and a questionnaire on the health care professionals’ perspective on the implementation process. Ethics and dissemination The Medical Research Ethics Committees United in Nieuwegein, the Netherlands, has confirmed that the Medical Research Involving Human Subjects Act does not apply to this study. Bureau Onderzoek & Innovatie of Santeon, the Netherlands, approved this study.
The results will contribute to insight into and knowledge of the use of outcome data for SDM, and can stimulate sustainable implementation of SDM. Trial registration numbers: NL8374, NL8375 and NL8376.
What is known and objective: Many severe intoxications occur with substances with no specific antidote, which is why methods of extracorporeal elimination represent a particularly useful and even critical component in their management. The purpose of this review is to summarize the accumulating evidence and clinical results from the application of CytoSorb hemoadsorption therapy in patients with severe intoxications. Comment: The technology represents a promising technique with an increasing number of publications in a variety of severe intoxication scenarios suggesting that early intervention might provide rapid substance removal with subsequent overall clinical improvement. What is new and conclusion: Given the tremendous challenges in performing prospective, randomized trials in this field, the strong safety profile of the device and the high acuity of these life-threatening situations, CytoSorb should be considered as a therapeutic option in severe intoxications, particularly when direct antidotes are not available. However, further clinical data are desirable to provide precise recommendations.
Pre-eclampsia (PE) affects 2–8% of pregnancies and is responsible for significant morbidity and mortality. The maternal clinical syndrome (defined by hypertension, proteinuria, and organ dysfunction) is the result of endothelial dysfunction. The endothelial response to increased levels of soluble FMS-like tyrosine kinase 1 (sFLT1) is thought to play a central role. sFLT1 is released from multiple tissues, binds VEGF with high affinity, and thereby antagonizes VEGF. Expression of soluble FLT1 variants is a result of alternative splicing; however, the mechanism is incompletely understood. We hypothesized that neuro-oncological ventral antigen 2 (NOVA2) contributes to this splicing. NOVA2 was inhibited in human umbilical vein endothelial cells (HUVECs) and multiple cellular functions were assessed. NOVA2 and FLT1 expression in the placenta of PE, pregnancy-induced hypertension, and normotensive controls was measured by RT-qPCR. Loss of NOVA2 in HUVECs resulted in significantly increased levels of sFLT1, but did not affect expression of membrane-bound FLT1. NOVA2 protein was shown to interact directly with FLT1 mRNA. Loss of NOVA2 was also accompanied by impaired endothelial functions such as sprouting; sprouting capacity could be restored by exogenous VEGF. We did not observe statistically significant regulation of NOVA2 or sFLT1 in the placenta, but we did observe a negative correlation between sFLT1 and NOVA2 expression levels. In conclusion, NOVA2 was found to regulate FLT1 splicing in the endothelium. Loss of NOVA2 resulted in impaired endothelial function, at least partially dependent on VEGF. In PE patients, we observed a negative correlation between NOVA2 and sFLT1.
Background The COVID-19 pandemic continues to overwhelm intensive care units (ICUs) worldwide, and improved prediction of mortality among COVID-19 patients could assist decision making in the ICU setting. In this work, we report on the development and validation of a dynamic mortality model specifically for critically ill COVID-19 patients and discuss its potential utility in the ICU. Methods We collected electronic medical record (EMR) data from 3222 ICU admissions with a COVID-19 infection from 25 different ICUs in the Netherlands. We extracted daily observations of each patient and fitted both a linear (logistic regression) and a non-linear (random forest) model to predict mortality within 24 h from the moment of prediction. Isotonic regression was used to re-calibrate the predictions of the fitted models. We evaluated the models in a leave-one-ICU-out (LOIO) cross-validation procedure. Results The logistic regression and random forest models yielded areas under the receiver operating characteristic curve of 0.87 [0.85; 0.88] and 0.86 [0.84; 0.88], respectively. The recalibrated model predictions showed a calibration intercept of −0.04 [−0.12; 0.04] and slope of 0.90 [0.85; 0.95] for the logistic regression model, and a calibration intercept of −0.19 [−0.27; −0.10] and slope of 0.89 [0.84; 0.94] for the random forest model. Discussion We presented a model for dynamic mortality prediction, specifically for critically ill COVID-19 patients, which predicts near-term mortality rather than in-ICU mortality. The potential clinical utility of dynamic mortality models (e.g., benchmarking, improving resource allocation and informing family members), as well as the development of models with more causal structure, should be topics for future research.
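The re-calibration step described above can be sketched with the pool-adjacent-violators algorithm (PAVA), the standard way to fit isotonic regression: raw model scores are sorted, a monotone non-decreasing curve is fitted through the observed 0/1 outcomes, and that curve becomes the calibration map. The scores and outcomes below are hypothetical, not data from the study:

```python
def pava(values):
    """Pool Adjacent Violators: least-squares non-decreasing fit.
    Each block stores [sum, count] and represents a pooled mean."""
    blocks = []
    for v in values:
        blocks.append([float(v), 1])
        # merge while the newest block's mean drops below its predecessor's
        # (compare means by cross-multiplication; counts are positive)
        while len(blocks) > 1 and blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fitted = []
    for s, c in blocks:
        fitted.extend([s / c] * c)
    return fitted

def isotonic_recalibrate(predictions, outcomes):
    """Map raw model scores to calibrated probabilities: sort by score,
    fit a monotone curve through the 0/1 outcomes, and read the fitted
    value back for each original sample."""
    order = sorted(range(len(predictions)), key=lambda i: predictions[i])
    fitted = pava([outcomes[i] for i in order])
    calibrated = [0.0] * len(predictions)
    for rank, i in enumerate(order):
        calibrated[i] = fitted[rank]
    return calibrated

# hypothetical raw mortality scores and observed 24-h outcomes
scores = [0.2, 0.9, 0.4, 0.7, 0.1]
outcomes = [0, 1, 0, 1, 0]
print(pava([1, 3, 2, 4]))                      # → [1.0, 2.5, 2.5, 4.0]
print(isotonic_recalibrate(scores, outcomes))  # → [0.0, 1.0, 0.0, 1.0, 0.0]
```

In practice the monotone map would be fitted on held-out predictions (here, the leave-one-ICU-out folds) and then applied to new scores, so the calibration intercept and slope reported above measure how well the recalibrated probabilities match observed event rates.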
Background. Several batteries have been developed for the cognitive assessment of people with multiple sclerosis (pwMS). However, all these tests have some limitations in general clinical practice and from a cross-cultural perspective. In this study, we aimed to validate a novel cognitive screening test, the Cross-Cultural Dementia screening test (CCD), in pwMS. Methods. Seventy-five participants with relapsing-remitting MS and 75 healthy controls were enrolled and completed a comprehensive neuropsychological battery and the CCD. Intergroup comparisons, effect sizes, and correlations with previously validated tests were calculated for a majority sample and for a pilot study of a minority sample. ROC curves were estimated, and random forest classification models were developed. Results. There were statistically significant differences between the cognitively impaired MS (MS-CI) group and healthy controls, and between the MS-CI and non-cognitively impaired MS groups, in all subtests of the CCD, with medium to large effect sizes. Correlations with standardized neuropsychological tests were moderate to high, supporting concurrent validity. These results were replicated in the minority sample. The random forest models showed very accurate classification using the CCD, and the test showed good psychometric properties compared with the Symbol Digit Modalities Test (SDMT). Conclusions. Our study validates the CCD for cognitive impairment screening in MS, showing advantages over other routinely used cognitive tests.
Background After latissimus dorsi transfer (LDT), an increase in scapulothoracic (ST) contribution to thoracohumeral (TH) elevation is observed when compared to the asymptomatic shoulder. It is not known which shoulder muscles contribute to this change in shoulder kinematics, and whether the timing of muscle recruitment is altered after LDT. The aim of this study was to identify which shoulder muscles, and what timing of muscle recruitment, are responsible for the increased ST contribution and shoulder elevation after LDT for a massive irreparable posterosuperior rotator cuff tear (MIRT). Methods Thirteen patients with a preoperative pseudoparalysis and a MIRT were recruited after LDT with a minimum follow-up of 1 year. 3D electromagnetic tracking was used to assess maximum active elevation of the shoulder (MAES) in both the LDT shoulder and the asymptomatic contralateral shoulder (ACS). Surface electromyography (EMG) tracked activation (% EMG max) and activation timing of the latissimus dorsi (LD), deltoid, teres major, trapezius (upper, middle and lower) and serratus anterior muscles. MAES was studied in forward flexion, scapular abduction and abduction in the coronal plane. Results In MAES, no difference in TH motion was observed between the LDT shoulder and the ACS (P = 0.300). However, glenohumeral motion during MAES was significantly lower in the LDT shoulder (F(1,12) = 11.230, P = 0.006). The LD % EMG max did not differ between the LDT shoulder and the ACS during MAES. A higher % EMG max was found for the deltoid (F(1,12) = 17.241, P = 0.001) and upper trapezius (F(1,10) = 13.612, P = 0.004) in the LDT shoulder during MAES. The middle trapezius showed a significantly higher % EMG max only for scapular abduction (P = 0.020; LDT 52.3 ± 19.4, ACS 38.1 ± 19.7). The % EMG max of the lower trapezius, serratus anterior and teres major did not differ for any movement type between the LDT shoulder and the ACS, and no difference in timing of recruitment was observed for any of the shoulder muscles.
Conclusion After LDT in patients with a MIRT and preoperative pseudoparalysis, the LD muscle did not alter its % EMG max during MAES when compared to the ACS. The cranial transfer of the LD tendon with its native % EMG max, together with the increased % EMG max of the deltoid, middle and upper trapezius muscles, could be responsible for the increased ST contribution. The increased glenohumeral joint reaction force could in turn increase active elevation after LDT in a previously pseudoparalytic shoulder.
Background Methotrexate (MTX) is an immunomodulatory drug for patients with Crohn’s disease. Erythrocyte MTX-polyglutamates (MTX-PG1-5) may be used for therapeutic drug monitoring (TDM), as MTX-PG is thought to mediate MTX’s efficacy. Information on determinants of the MTX-PG concentration in patients with Crohn’s disease is lacking. We aimed to identify clinical and biochemical determinants of the erythrocyte MTX-PG1-5 and MTX-PGtotal concentrations in patients with Crohn’s disease. Methods Adults with Crohn’s disease on methotrexate treatment who visited the outpatient clinic of Amsterdam UMC were included. Erythrocyte MTX-PGs were measured by tandem mass spectrometry. Results Nineteen patients were included, with a median duration of MTX use of 77 months (range 7–202). Twelve patients received MTX monotherapy, whereas 7 were on concomitant TNF-α inhibitors. The mean MTX dose was 15.5 mg (SD ± 2.8) and 12 (63%) patients used subcutaneous MTX. MTX-PG1-5 were successfully measured in 18 patients, showing substantial variability in the concentrations of MTX-PGtotal and the individual species. The median MTX-PGtotal was 117.1 nmol/L (range 46.4–258.7), with preferential accumulation of MTX-PG3 (43.1 nmol/L, range 15.3–96.1). Patients on subcutaneous compared to oral MTX had higher median MTX-PG4-5 levels (55 versus 9 nmol/L, p = 0.01). Higher age (β = 0.71) and lower estimated glomerular filtration rate (β = −0.52) were associated with a significantly higher MTX-PGtotal concentration (R² = 0.60, p = 0.001). Conclusion MTX-PG concentrations display considerable inter-individual variability. Higher MTX-PG accumulation is associated with subcutaneous administration, higher age, and lower renal function in patients with Crohn’s disease.
Background Subcutaneous (SC) vedolizumab is effective in inflammatory bowel diseases (IBD) when administered after induction with two infusions. Aim To assess the effectiveness, safety and pharmacokinetics of a switch from intravenous (IV) to SC maintenance vedolizumab in patients with IBD. Methods In this prospective cohort study, patients with IBD who had received ≥4 months of IV vedolizumab were switched to SC vedolizumab. We studied the time to discontinuation of SC vedolizumab, adverse events (AEs), changes in clinical and biochemical outcomes, and vedolizumab concentrations at baseline and at weeks 12 and 24. Results We included 135 patients, 82 with Crohn's disease (CD) and 53 with ulcerative colitis (UC). Eleven (13.4%) CD and five (9.4%) UC patients discontinued SC vedolizumab after a median of 18 (IQR 8–22) and 6 weeks (IQR 5–10), respectively. Four patients (all CD) switched to a different drug due to loss of response, nine switched back to IV vedolizumab due to adverse events, and three due to fear of needles. Common AEs were injection site reactions (n = 15) and headache (n = 6). Median clinical and biochemical disease activity remained stable after the switch. Median vedolizumab serum concentrations increased from 19 μg/ml at the time of the switch to 31 μg/ml 12 weeks after the switch (p < 0.005). Conclusions Switching from IV to SC vedolizumab maintenance treatment is effective in patients with CD or UC. However, 9% of patients switched back to IV vedolizumab due to adverse events or fear of needles.
Objective: To describe outcome after pancreatic surgery in the first six years of a mandatory nationwide audit. Background: Within the Dutch Pancreatic Cancer Group, efforts have been made to improve outcome after pancreatic surgery. These include collaborative projects, clinical auditing, and implementation of an algorithm for early recognition and management of postoperative complications. However, nationwide changes in outcome over time have not yet been described. Methods: This nationwide cohort study included consecutive patients after pancreatoduodenectomy and distal pancreatectomy from the mandatory Dutch Pancreatic Cancer Audit (January 2014-December 2019). Patient, tumor, and treatment characteristics were compared between three time periods (2014-2015, 2016-2017, and 2018-2019). Short-term surgical outcome was investigated using multilevel multivariable logistic regression analyses. Primary endpoints were failure to rescue and in-hospital mortality. Results: Overall, 5345 patients were included, of whom 4227 after pancreatoduodenectomy and 1118 after distal pancreatectomy. After pancreatoduodenectomy, failure to rescue improved from 13% to 7.4% (OR 0.64, 95%CI 0.50-0.80, P<0.001) and in-hospital mortality decreased from 4.1% to 2.4% (OR 0.68, 95%CI 0.54-0.86, P=0.001), despite operating on more patients with age >75 years (18% to 22%, P=0.006), ASA score ≥3 (19% to 31%, P<0.001) and Charlson comorbidity score ≥2 (24% to 34%, P<0.001). The rates of textbook outcome (57% to 55%, P=0.283) and major complications remained stable (31% to 33%, P=0.207), whereas complication-related intensive care admission decreased (13% to 9%, P=0.002). After distal pancreatectomy, improvements in failure to rescue from 8.8% to 5.9% (OR 0.65, 95%CI 0.30-1.37, P=0.253) and in-hospital mortality from 1.6% to 1.3% (OR 0.88, 95%CI 0.45-1.72, P=0.711) were not statistically significant. 
Conclusions: During the first six years of a nationwide audit, in-hospital mortality and failure to rescue after pancreatoduodenectomy improved despite operating on more high-risk patients. Several collaborative efforts may have contributed to these improvements.