Diagnosis and treatment of atopic dermatitis (AD) in chimpanzees are challenging. Validated allergy tests specific for chimpanzees are not available. Multifactorial management of AD is important. To the authors' best knowledge, successful management of AD has not previously been described in chimpanzees.
Background Optimizing return to work (RTW) after knee arthroplasty (KA) is becoming increasingly important due to the growing incidence of KA and poor RTW outcomes after KA. We developed the Back At work After Surgery (BAAS) clinical pathway for optimized RTW after KA. Since the effectiveness and costs of the BAAS clinical pathway are still unknown, an analysis of both is imperative. Method This protocol paper has been written in line with the standards of Standard Protocol Items: Recommendations for Interventional Trials. To assess the effectiveness and cost-effectiveness for RTW, we will perform a multicenter prospective cohort study with patients who decided to receive a total KA (TKA) or a unicompartmental KA (UKA). To evaluate the effectiveness of BAAS regarding RTW, a comparison to usual care will be made using individual patient data on RTW from prospectively performed cohort studies in the Netherlands. Discussion One of the strengths of this study is that the feasibility of the BAAS clinical pathway was tested firsthand. Also, we will use validated questionnaires and functional tests to assess the patient’s recovery using robust outcomes. Moreover, the intervention was performed in two hospitals serving the targeted patient group, which reduces selection bias and improves generalizability. A limitation of this study protocol is that the lead author has an active role as a medical case manager (MCM) in one of the hospitals. Additionally, we will use data from other prospective Dutch cohort studies to compare our findings regarding RTW to usual care. Since we will not perform an RCT, we will use propensity analysis to reduce bias due to possible differences between these cohorts. Trial Registration This study was retrospectively registered at clinicaltrials.gov (https://clinicaltrials.gov/ct2/show/NCT05690347, date of first registration: 19-01-2023).
Purpose: To estimate the diagnostic accuracy of circumpapillary retinal nerve fibre layer (RNFL) thickness and macular ganglion cell layer-inner plexiform layer (GCL-IPL) thickness measurements to discriminate an abnormal visual function (i.e. abnormal age-based visual acuity and/or visual field defect) in children with a newly diagnosed brain tumour. Methods: This cross-sectional analysis of a prospective longitudinal nationwide cohort study was conducted at four hospitals in the Netherlands, including the national referral centre for paediatric oncology. Patients aged 0-18 years with a newly diagnosed brain tumour and reliable visual acuity and/or visual field examination and optical coherence tomography were included. Diagnostic accuracy was evaluated with sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). Results: Of 115 patients included in the study (67 [58.3%] male; median age 10.6 years [range, 0.2-17.8 years]), reliable RNFL thickness and GCL-IPL thickness measurements were available in 92 patients (80.0%) and 84 patients (73.0%), respectively. The sensitivity for detecting an abnormal visual function was 74.5% for average RNFL thickness and 41.7% for average GCL-IPL thickness, at a specificity of 44.5% and 82.9%, respectively. The PPV and NPV were 33.0% and 82.6% for the average RNFL thickness and 57.1% and 82.2% for the average GCL-IPL thickness. Conclusion: An abnormal visual function was discriminated correctly by using the average RNFL thickness in seven out of ten patients and by using the average GCL-IPL thickness in four out of ten patients. The relatively high NPVs signified that patients with normal average RNFL thickness and average GCL-IPL thickness measurements had relatively high certainty of a normal visual function.
Introduction: Most studies about rib fractures focus on mortality and morbidity. Literature is scarce on long-term and quality of life (QoL) outcomes. Therefore, we report QoL and long-term outcomes after rib fixation in flail chest patients. Methods: A prospective cohort study of clinical flail chest patients admitted to six level 1 trauma centres in the Netherlands and Switzerland between January 2018 and March 2021. Outcomes included in-hospital outcomes and long-term outcomes, such as QoL measurements 12 months after hospitalization using the EuroQoL five dimensions (EQ-5D) questionnaire. Results: Sixty-one operatively treated flail chest patients were included. Median hospital length of stay was 15 days and intensive care length of stay was 8 days. Sixteen (26%) patients developed pneumonia and two (3%) died. One year after hospitalization the mean EQ-5D score was 0.78. Complication rates were low and included hemothorax (6%), pleural effusion (5%) and two revisions of the implant (3%). Implant-related irritation was commonly reported by patients (n = 15, 25%). Conclusions: Rib fixation for flail chest injuries can be considered a safe procedure with low mortality rates. Future studies should focus on quality of life rather than solely short-term outcomes.
A 48-year-old HIV-positive patient presented at the otorhinolaryngology department with a growing mass on the left side of his neck, fever and night sweats. Biopsy demonstrated a granulomatous, necrotizing inflammation. After extensive additional testing, PCR on lesion punctate material was positive for Chlamydia trachomatis, yielding a diagnosis of cervical lymphogranuloma venereum.
Background Dose reduction (DR) of adalimumab, etanercept and ustekinumab has proven to be (cost-)effective in psoriasis patients with low disease activity. Further implementation is needed to establish application of DR for eligible patients. Objectives To evaluate implementation of protocolized biologic DR in daily practice. Methods A pilot implementation study was performed in 3 hospitals over 6 months. By combining education and protocol development, involved healthcare providers (HCPs) were directed towards adoption of protocolized DR. DR of adalimumab, etanercept, and ustekinumab was achieved by stepwise injection interval prolongation. Implementation outcomes (fidelity, feasibility) were assessed. Factors for optimizing implementation were explored in interviews with HCPs. Uptake was measured in patients by chart review. Results The implementation strategy was executed as planned. Implementation fidelity was less than 100%, as not all provided tools were used across study sites. HCPs indicated that implementing protocolized DR was feasible, although time investment was needed. Identified additional factors for successful implementation included support for patients, uptake of DR into guidelines, and supportive electronic health record systems. During the 6-month intervention period, 52 patients were eligible for DR, of whom 26 (50%) started DR. The proposed DR protocol was followed in 22/26 patients (85%) on DR. Conclusion Additional staff for support, extra time during consultations, education on DR for HCPs and patients, and effective tools such as a feasible protocol can lead to more patients on biologic DR.
Background: Advanced low-grade ovarian carcinoma (LGOC) is difficult to treat. In several studies, high estrogen receptor (ER) protein expression was observed in patients with LGOC, which suggests that antihormonal therapy (AHT) is a treatment option. However, only a subgroup of patients respond to AHT, and this response cannot be adequately predicted by currently used immunohistochemistry (IHC). A possible explanation is that IHC takes only the receptor, and not the activity of the whole signal transduction pathway (STP), into account. Therefore, in this study, the authors assessed whether functional STP activity can be an alternative tool to predict response to AHT in LGOC. Methods: Tumor tissue samples were obtained from patients with primary or recurrent LGOC who subsequently received AHT. Histoscores of ER and progesterone receptor (PR) were determined. In addition, activity of the ER STP and of six other STPs known to play a role in ovarian cancer was assessed and compared with the STP activity of healthy postmenopausal fallopian tube epithelium. Results: Patients who had normal ER STP activity had a progression-free survival (PFS) of 16.1 months. PFS was significantly shorter in patients who had low and very high ER STP activity, with a median of 6.0 and 2.1 months, respectively (p < .001). Unlike ER histoscores, PR histoscores were strongly correlated with ER STP activity and thus with PFS. Conclusions: Aberrantly low and very high functional ER STP activity and low PR histoscores in patients with LGOC indicate decreased response to AHT. ER IHC is not representative of functional ER STP activity and is not related to PFS.
A better understanding of transcriptional evolution of IDH-wild-type glioblastoma may be crucial for treatment optimization. Here, we perform RNA sequencing (RNA-seq) (n = 322 test, n = 245 validation) on paired primary-recurrent glioblastoma resections of patients treated with the current standard of care. Transcriptional subtypes form an interconnected continuum in a two-dimensional space. Recurrent tumors show preferential mesenchymal progression. Over time, hallmark glioblastoma genes are not significantly altered. Instead, tumor purity decreases over time and is accompanied by co-increases in neuron and oligodendrocyte marker genes and, independently, tumor-associated macrophages. A decrease is observed in endothelial marker genes. These composition changes are confirmed by single-cell RNA-seq and immunohistochemistry. An extracellular matrix-associated gene set increases at recurrence and bulk, single-cell RNA, and immunohistochemistry indicate it is expressed mainly by pericytes. This signature is associated with significantly worse survival at recurrence. Our data demonstrate that glioblastomas evolve mainly by microenvironment (re-)organization rather than molecular evolution of tumor cells.
Objective: Routine urgent endoscopic retrograde cholangiopancreatography (ERCP) with endoscopic biliary sphincterotomy (ES) does not improve outcome in patients with predicted severe acute biliary pancreatitis. Improved patient selection for ERCP by means of endoscopic ultrasonography (EUS) for stone/sludge detection may challenge these findings. Design: A multicentre, prospective cohort study included patients with predicted severe acute biliary pancreatitis without cholangitis. Patients underwent urgent EUS, followed by ERCP with ES in case of common bile duct stones/sludge, within 24 hours after hospital presentation and within 72 hours after symptom onset. The primary endpoint was a composite of major complications or mortality within 6 months after inclusion. The historical control group was the conservative treatment arm (n=113) of the randomised APEC trial (Acute biliary Pancreatitis: urgent ERCP with sphincterotomy versus conservative treatment, patient inclusion 2013-2017) applying the same study design. Results: Overall, 83 patients underwent urgent EUS at a median of 21 hours (IQR 17-23) after hospital presentation and at a median of 29 hours (IQR 23-41) after start of symptoms. Gallstones/sludge in the bile ducts were detected by EUS in 48/83 patients (58%), all of whom underwent immediate ERCP with ES. The primary endpoint occurred in 34/83 patients (41%) in the urgent EUS-guided ERCP group. This was not different from the 44% rate (50/113 patients) in the historical conservative treatment group (risk ratio (RR) 0.93, 95% CI 0.67 to 1.29; p=0.65). Sensitivity analysis to correct for baseline differences using a logistic regression model also showed no significant beneficial effect of the intervention on the primary outcome (adjusted OR 1.03, 95% CI 0.56 to 1.90, p=0.92). 
Conclusion: In patients with predicted severe acute biliary pancreatitis without cholangitis, urgent EUS-guided ERCP with ES did not reduce the composite endpoint of major complications or mortality, as compared with conservative treatment in a historical control group. Trial registration number: ISRCTN15545919.
Purpose Cognitive functioning is increasingly assessed as a secondary outcome in neuro-oncological trials. However, which cognitive domains or tests to assess remains debatable. In this meta-analysis, we aimed to elucidate the longer-term test-specific cognitive outcomes in adult glioma patients. Methods A systematic search yielded 7098 articles for screening. To investigate cognitive changes in glioma patients and differences between patients and controls at ≥ one-year follow-up, random-effects meta-analyses were conducted per cognitive test, separately for studies with a longitudinal and cross-sectional design. A meta-regression analysis with a moderator for interval testing (additional cognitive testing between baseline and one-year post-treatment) was performed to investigate the impact of practice in longitudinal designs. Results Eighty-three studies were reviewed, of which 37 were analyzed in the meta-analysis, involving 4078 patients. In longitudinal designs, semantic fluency was the most sensitive test for detecting cognitive decline over time. Cognitive performance on MMSE, digit span forward, and phonemic and semantic fluency declined over time in patients who had no interval testing. In cross-sectional studies, patients performed worse than controls on the MMSE, digit span backward, semantic fluency, Stroop speed interference task, trail making test B and finger tapping. Conclusion Cognitive performance of glioma patients one year after treatment is significantly lower than the norm, with specific tests potentially being more sensitive. Cognitive decline over time occurs as well, but can easily be overlooked in longitudinal designs due to practice effects (as a result of interval testing). It is warranted to sufficiently correct for practice effects in future longitudinal trials.
Background: The main objective of this study is to provide the first characterization of the current research field of the clinical microbiome in LUTSs. Methods: First-of-its-kind scientometric insight into the historical development and structural state of the discipline is provided by a field analysis, mapping, and sub-analysis of articles for future research. On 22 December 2022, the entire Scopus database was searched without language or date restrictions. Search terms included "Chronic prostatitis", OR "Interstitial cystitis", OR "Lower urinary tract symptoms", OR "Lower urinary tract dysfunction", OR "Overactive bladder", OR "Incontinence", OR "Urolithiasis", OR "Urothelium", OR "Urine", OR "Urology", OR "urinary disorder", OR "Pathophysiology", OR "Benign prostatic hyperplasia", OR "Benign prostatic enlargement", AND "Microbiota", OR "Microbiome", OR "Urobioma", OR "Urobiota", OR "Microflora". The author and institutional data were transformed using the analytical tool Biblioshiny (a Shiny app for Bibliometrix), which took into account variations in author spelling as well as institutional naming and subgroups. Results: The specified search strategy located 529 documents from 267 sources published from 1981 to 2022. The average time since publication was 4.59 years. The authors with the most publications were Wolfe AJ and Brubaker I. The top three most collaborative networks were Loyola University Chicago, Loyola University Medical Center, and the University of California San Diego. The most frequently occurring words among the 50 nodes were: human, humans, nonhuman, female, adult, article, microbiology, microflora, microbiota, and controlled study. Frontiers in Cellular and Infection Microbiology and the International Urogynecology Journal, followed by Nature Reviews Urology, were the top three most relevant sources in microbiome research in urology. 
Conclusions: Attention to the evolution of scientific fields is one of the most crucial requirements for developing research policies and anticipating the scientific needs of researchers. The models, maps, and visualizations used in this research, which result from systematic analysis of scientific output in the world's most esteemed scientific journals, can help identify research gaps and future needs in microbiome research in urology.
Purpose of Review Bladder pain syndrome (BPS)/interstitial cystitis (IC) can be classified as either non-ulcerative or ulcerative, corresponding to the characteristic cystoscopic findings under hydrodistention. Promising therapeutic effects, including decreased bladder pain, have been reported from recent clinical trials using botulinum toxin A (BoNTA) for the treatment of BPS/IC. This review summarizes the current state of the literature on the underlying mechanisms of BoNTA therapy in BPS/IC as well as new forms of its application. Recent Findings BoNTA exerts its effect in the central nervous system via the afferent nerves as well as in the bladder wall. Besides the well-known effects of BoNTA on the nervous system, pain control as well as reduction of urinary urgency in BPS patients could be achieved by mast cell stabilization affecting histamine release and by modulation of the TRPV and PGE2 pathways, among other systems. In addition, new forms of BoNTA administration have focused on intravesical instillation of the drug in order to circumvent bladder wall injections. Hyperthermia, intravesical hydrogel, and liposomes have been studied as new ways of BoNTA application in BPS/IC patients. From the available studies, bladder instillation of BoNTA in combination with EMDA is the most promising and effective novel approach. Summary The most promising novel application methods for BoNTA in patients with BPS/IC are bladder instillations. Future research needs to establish whether bladder instillations of BoNTA with some form of bladder absorption enhancement, such as hyperthermia or EMDA, could replace BoNTA injections in patients with BPS/IC.
Background: In current practice, rates of locally recurrent rectal cancer (LRRC) are low due to the use of the total mesorectal excision (TME) in combination with various neoadjuvant treatment strategies. However, the literature on LRRC mainly consists of single- and multicenter retrospective cohort studies, which are prone to selection bias. The aim of this study is to provide a nationwide, population-based overview of LRRC after TME in the Netherlands. Patients and methods: In total, 1431 patients with nonmetastasized primary rectal cancer diagnosed in the first six months of 2015 and treated with TME were included from the nationwide, population-based Netherlands Cancer Registry. Data on disease recurrence were collected for patients diagnosed in these 6 months only. Competing risk cumulative incidence, competing risk regression, and Kaplan-Meier analyses were performed to assess incidence, risk factors, treatment, and overall survival (OS) of LRRC. Results: Three-year cumulative incidence of LRRC was 6.4%; synchronous distant metastases (LRRC-M1) were present in 44.9% of patients with LRRC. Distal localization, R1-2 margin, (y)pT3-4, and (y)pN1-2 were associated with an increased LRRC rate. No differences in LRRC treatment and OS were found between patients who had been treated with or without prior n(C)RT. Curative-intent treatment was given to 42.9% of patients with LRRC, and 3-year OS thereafter was 70%. Conclusions: Nationwide LRRC incidence was low. A high proportion of patients with LRRC underwent curative-intent treatment, and OS of this group was high in comparison with previous studies. Additionally, n(C)RT for primary rectal cancer was not associated with differences in treatment and OS of LRRC.
Background: Differentiation between uncomplicated and complicated postoperative wound drainage after arthroplasty is crucial to prevent unnecessary reoperation. Prospective data about the duration and amount of postoperative wound drainage in patients with and without prosthetic joint infection (PJI) are currently absent. Methods: A multicentre cohort study was conducted to assess the duration and amount of wound drainage in patients after arthroplasty. During 30 postoperative days after arthroplasty, patients recorded their wound status in a previously developed wound care app and graded the amount of wound drainage on a 5-point scale. Data about PJI in the follow-up period were extracted from the patient files. Results: Of the 1019 included patients, 16 patients (1.6 %) developed a PJI. Minor wound drainage decreased from the first to the fourth postoperative week from 50 % to 3 %. Both moderate to severe wound drainage in the third week and newly developed wound drainage in the second week after a week without drainage were strongly associated with PJI (odds ratio (OR) 103.23, 95 % confidence interval (CI) 26.08 to 408.57, OR 80.71, 95 % CI 9.12 to 714.52, respectively). The positive predictive value (PPV) for PJI was 83 % for moderate to heavy wound drainage in the third week. Conclusion: Moderate to heavy wound drainage and persistent wound drainage were strongly associated with PJI. The PPV of wound drainage for PJI was high for moderate to heavy drainage in the third week but was low for drainage in the first week. Therefore, additional parameters are needed to guide the decision to reoperate on patients for suspected acute PJI.
Purpose: Older patients with COVID-19 can present with atypical complaints, such as falls or delirium. In other diseases, such an atypical presentation is associated with worse clinical outcomes. However, it is not known whether this extends to COVID-19. We aimed to study the association between atypical presentation of COVID-19, frailty and adverse outcomes, as well as the incidence of atypical presentation. Methods: We conducted a retrospective observational multi-center cohort study in eight hospitals in the Netherlands. We included patients aged ≥ 70 years hospitalized with COVID-19 between February and May 2020. Atypical presentation of COVID-19 was defined as presentation without fever, cough and/or dyspnea. We collected data concerning symptoms on admission, demographics and frailty parameters [e.g., Charlson Comorbidity Index (CCI) and Clinical Frailty Scale (CFS)]. Outcome data included Intensive Care Unit (ICU) admission, discharge destination and 30-day mortality. Results: We included 780 patients, 9.5% (n = 74) of whom had an atypical presentation. Patients with an atypical presentation were older (80 years, IQR 76-86 years; versus 79 years, IQR 74-84, p = 0.044) and were more often classified as severely frail (CFS 6-9) compared to patients with a typical presentation (47.6% vs 28.7%, p = 0.004). Overall, there was no significant difference in 30-day mortality between the two groups in univariate analysis (32.4% vs 41.5%; p = 0.173) or in multivariate analysis [OR 0.59 (95% CI 0.34-1.0); p = 0.058]. Conclusions: In this study, patients with an atypical presentation of COVID-19 were more frail compared to patients with a typical presentation. Contrary to our expectations, an atypical presentation was not associated with worse outcomes.
Introduction Delirium is common among patients admitted to the Intensive Care Unit (ICU) and its impact on the neurocognitive and psychiatric state of survivors is of great interest. These new-onset or worsening conditions, together with physical alterations, are called Post Intensive Care Syndrome (PICS). Our aim is to update on the latest screening and follow-up options for psychological and cognitive sequelae of PICS. Method This narrative review discusses the occurrence of delirium in ICU settings and the relatively new concept of PICS. Psychiatric and neurocognitive morbidities that may occur in survivors of critical illness following delirium are addressed. Future perspectives for practice and research are discussed. Results There is no ‘gold standard’ for diagnosing delirium in the ICU, but two extensively validated tools, the Confusion Assessment Method for the ICU and the Intensive Care Delirium Screening Checklist, are often used. PICS complaints are frequent in ICU survivors who have suffered delirium and have been recognized as an important public health and socio-economic problem worldwide. Depression, anxiety, post-traumatic stress disorder and long-term cognitive impairment are recurrently exhibited. Screening tools for these deficits are discussed, as well as the suggestion of early assessment after discharge and at 3 and 12 months. Conclusions Delirium is a complex but common phenomenon in the ICU and a risk factor for PICS. Its diagnosis is challenging with potential long-term adverse outcomes, including psychiatric and cognitive difficulties. The implementation of screening and follow-up protocols for PICS sequelae is warranted to ensure early detection and appropriate management.
The disclosure of online test results (i.e., laboratory, radiology and pathology results) on patient portals can vary from immediate disclosure (in real time), via a delay of up to 28 days, to non-disclosure. Although a few studies have explored patient opinions regarding test result release, we have no insight into patients' actual preferences. To address this, we allowed patients to register their choices on a hospital patient portal. Our research question was: When do patients want their test results to be disclosed on the patient portal, and what are the reasons for these choices? We used a mixed methods sequential explanatory design that included 1) patient choices on preferred time delay to test result disclosure on the patient portal for different medical specialties (N = 4592) and 2) semi-structured interviews with patients who changed their mind on their initial choice (N = 7). For laboratory (blood and urine) results, 3530 (76.9%) patients chose a delay of 1 day and 912 (19.9%) patients chose a delay of 7 days. For radiology and pathology results, 4352 (94.8%) patients chose a delay of 7 days. Forty-three patients changed their mind about when they wanted to receive their results. By interviewing seven patients (16%) from this group, we learned that some participants did not remember why they made changes. Four participants wanted a shorter delay: to achieve transparency in health-related information and communication; to have time to process bad results; for reassurance; to prepare for a medical consultation; to monitor and act on deviating results to prevent worsening of their disease; and to share results with their general practitioner. Three participants extended their chosen delay to avoid disappointment about the content and anxiety from receiving incomprehensible information. Our study indicates that most patients prefer transparency in health-related information and want their test results to be disclosed as soon as possible.
Background In the previously reported SAPS trial (https://clinicaltrials.gov/ct2/show/NCT01139489), procalcitonin guidance safely reduced the duration of antibiotic treatment in critically ill patients. We assessed the impact of shorter antibiotic treatment on antimicrobial resistance development in SAPS patients. Materials and methods Cultures were assessed for the presence of multi-drug resistant (MDR) or highly resistant organisms (HRMO) and compared between PCT-guided and control patients. Baseline isolates from 30 days before to 5 days after randomization were compared with those from 5 to 30 days post-randomization. The primary endpoint was the incidence of new MDR/HRMO positive patients. Results In total, 8,113 cultures with 96,515 antibiotic test results were evaluated for 439 and 482 patients randomized to the PCT and control groups, respectively. Disease severity at admission was similar for both groups. Median (IQR) durations of the first course of antibiotics were 6 days (4-10) and 7 days (5-11), respectively (p = 0.0001). Antibiotic-free days were 7 days (IQR 0-14) and 6 days (0-13; p = 0.05). Of all isolates assessed, 13% were MDR/HRMO positive, and at baseline 186 (20%) patients were MDR/HRMO-positive. The incidence of new MDR/HRMO was 39 (8.9%) and 45 (9.3%) in PCT and control patients, respectively (p = 0.82). The time courses for MDR/HRMO development were also similar for both groups (p = 0.33). Conclusions In the 921 randomized patients studied, the small but statistically significant reduction in antibiotic treatment in the PCT group did not translate into a detectable change in antimicrobial resistance. Studies with larger differences in antibiotic treatment duration, larger study populations or populations with higher MDR/HRMO incidences might detect such differences.
309 Background: Treatment of locally advanced and metastatic esophagogastric cancer (EGC) largely depends on systemic treatment including taxanes or platinum compounds, agents which are known to cause chemotherapy-induced peripheral neuropathy (CIPN). To date, the extent of CIPN in EGC has not been investigated in real-world data. Methods: We identified patients with EGC from the Netherlands Cancer Registry (NCR) and Prospective Observational Cohort study of Oesophageal-gastric cancer Patients (POCOP) who underwent systemic treatment between 2016 and 2018 and completed at least one EORTC QLQ-CIPN20 questionnaire. Total CIPN scores and scores for the motor, sensory and autonomic scales (range 0-100) were calculated and analyzed over time for patients receiving chemoradiation (CRT), peri-operative chemotherapy (POC) and patients primarily treated with palliative chemotherapy (PC). To gain insight into the differences between CIPN score at baseline and after 3, 6, 9, 12, 18 and 24 months, we constructed linear mixed effect models. Results: We included 1052 EGC patients. 769 patients received CRT (response: 76% at baseline, 30% after 24 months), 143 received POC (response: 79% at baseline, 33% after 24 months) and 140 received PC (response: 74% at baseline, 28% after 12 months). Over time, neuropathy scores increased on all scales for patients receiving POC and PC, with a maximum increase of 16.5 points in total CIPN score after 6 months in the PC group compared to baseline. For patients receiving CRT, the increase in neuropathy scores was more subtle, with a maximum increase in total CIPN score of 4.3 points after 24 months compared to baseline. Results of mixed effect models showed a significant increase of CIPN scores over time in all three groups compared to baseline, persisting for as long as two years after the start of initial treatment. Conclusions: CIPN frequently occurs in EGC patients treated with chemotherapy, especially with PC. 
Due to the possible irreversibility of CIPN, it is important that clinicians regularly ask patients about neurotoxicity, adjust the dose of neurotoxic agents as needed, and that more research is conducted into strategies that may prevent the occurrence of CIPN. Results of mixed model analysis to assess changes in total CIPN score over time. Each model was adjusted for age, sex, performance status and number of comorbidities. The models for POC and PC were also adjusted for use of oxaliplatin.[Table: see text] *p-value <0.001 a Chemoradiotherapy b Peri-operative chemotherapy c Palliative chemotherapy; no estimates were calculated for 18 and 24 months due to the low number of responses.