Background and aims International endoscopy societies vary in their approach to credentialing individuals in endoscopic ultrasound (EUS) for independent practice, and there is no consensus on this process or its implementation. In 2019, the Joint Advisory Group on GI Endoscopy (JAG) commissioned a working group to examine the evidence relating to this process for EUS. The aim was to develop evidence-based recommendations for EUS training and certification in the UK. Methods Under the oversight of the JAG quality assurance team, a modified Delphi process was conducted which included major stakeholders from the UK and Ireland. A formal literature review was undertaken, initial questions for study were proposed, and recommendations for training and certification in EUS were formulated after rigorous assessment using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) tool; these were subjected to electronic voting to identify accepted statements. The statements were peer reviewed by JAG and relevant stakeholder societies before consensus on the final EUS certification pathway was achieved. Results 39 initial questions were proposed, of which 33 were deemed worthy of assessment and formed the key recommendations. The statements covered four key domains: definition of competence (13 statements), acquisition of competence (10), assessment of competence (5) and postcertification mentorship (5). Key recommendations include: (1) a minimum of 250 hands-on cases before an assessment of competency can be made, (2) attendance at the JAG basic EUS course, (3) completion of a minimum of one formative direct observation of procedural skills (DOPS) assessment every 10 cases to allow the learning curve in EUS training to be adequately studied, (4) competent performance in summative DOPS assessments and (5) a minimum 12-month period of mentorship to support new service providers.
Conclusions An evidence-based certification pathway has been commissioned by JAG to support and quality assure EUS training. This will form the basis for improving the quality of training and safety standards in EUS in the UK and Ireland.
Background In sub-Saharan Africa, the origins of asthma and the high prevalence of abnormal lung function remain unclear. In high-income countries (HICs), associations between birth measurements and childhood asthma and lung function highlight the importance of antenatal and early life factors in the aetiology of asthma and abnormal lung function in children. We present here the first study in sub-Saharan Africa to relate birth characteristics to both childhood respiratory symptoms and lung function. Methods Children attending schools in two socioeconomically contrasting but geographically close areas of Nairobi, Kenya, were recruited to a cross-sectional study of childhood asthma and lung function. Questionnaires quantified respiratory symptoms and preterm birth; lung function was measured by spirometry; and parents were invited to bring the child’s immunisation booklet containing records of birth weight and serial weights in the first year. Results 2373 children participated, 52% girls, median age (IQR) 10 years (8–13). Spirometry data were available for 1622. Child immunisation booklets were available for 500; birth weight and infant weight gain data were available for 323 and 494 children, respectively. In multivariable analyses, preterm birth was associated with the childhood symptoms ‘wheeze in the last 12 months’, OR 1.64 (95% CI 1.03 to 2.62), p=0.038, and ‘trouble breathing’, OR 3.18 (95% CI 2.27 to 4.45), p<0.001. Birth weight (kg) was associated with forced expiratory volume in 1 s z-score, regression coefficient (β) 0.30 (95% CI 0.08 to 0.52), p=0.008; FVC z-score, β 0.29 (95% CI 0.08 to 0.51), p=0.008; and restricted spirometry, OR 0.11 (95% CI 0.02 to 0.78), p=0.027. Conclusion These associations are in keeping with those in HICs and highlight antenatal factors in the aetiology of asthma and lung function abnormalities in sub-Saharan Africa.
Introduction We aimed to assess pericoronary adipose tissue (PCAT) density on computed tomography coronary angiography (CTCA) as a marker of inflammatory disease activity in coronary allograft vasculopathy (CAV). Methods PCAT density, lesion volumes, and total vessel volume-to-myocardial mass ratio (V/M) were retrospectively measured in heart transplant patients, age- and sex-matched controls, and patients with atherosclerosis. Results A total of 126 CTCAs were analysed from 94 heart transplant patients (mean age 49 [SD 14.5] years, 40% female). PCAT density was higher in transplant patients with CAV (n=40; -73.0 HU [SD 9.3]) than in those without CAV (n=86; -77.9 HU [SD 8.2]) and controls (n=12; -86.2 HU [SD 5.4]), p<0.01 for both (Fig-A). Unlike in patients with atherosclerotic coronary artery disease (n=32), CAV lesions were non-calcified, comprising mostly fibrous or fibrofatty tissue. V/M was lower in patients with CAV than without (32.4 mm³/g [SD 9.7] vs. 41.4 mm³/g [SD 12.3], p<0.0001). PCAT density and V/M improved the ability to predict CAV from AUC 0.75 to 0.85 when added to donor age and donor hypertension status (p<0.0001) (Fig-B). PCAT density above -66 HU was associated with a greater incidence of all-cause mortality (OR 18.0 [95% CI 3.25-99.6], p<0.01) and the composite endpoint of death, CAV progression, acute rejection (Fig-C), and coronary revascularization (OR 7.47 [95% CI 1.8-31.6], p=0.01) over 5.3 (SD 2.1) years. Conclusion Heart transplant patients with CAV have higher PCAT density and lower V/M than those without. Increased PCAT density is associated with adverse clinical outcomes. These CTCA metrics could be useful for CAV diagnosis and monitoring.
Introduction Inflammation and its resolution modulate ischaemic injury after myocardial infarction (MI). We evaluated somatostatin receptor 2 (SST2) PET/MRI using ⁶⁸Ga-DOTATATE for assessing post-infarct inflammation. Methods SST2 PET/MRI was performed at 2 weeks and 3 months after MI in a prospective longitudinal cohort study. Blood samples were taken for cardiac biomarkers, cell immunophenotyping and proteomic inflammatory markers. Cardiac function and viability were reassessed at 1 year by MRI. Results In 38 participants (mean age 60 [SD 9] years; 84% male), ⁶⁸Ga-DOTATATE maximum standardised uptake values (SUVmax) clearly differentiated infarcted myocardium, defined by late gadolinium enhancement MRI, from remote regions (infarct 2.4 vs. remote 1.4, p<0.0001; Fig.1a-c). Infarct SUVmax was increased by 0.3 in patients with C-reactive protein ≥2.0 (p=0.001). Each unit increase in SUVmax was associated with a 6 ms increase in MRI T2-oedema signal (p=0.03). When transformed to unit variance, each unit increase in troponin was associated with a 0.3 increase in SUVmax (p<0.0001); for NT-proBNP, the increase was 0.2 (p=0.0002). Infarct SUVmax was 22% lower (p<0.0001) after a mean of 104 (SD 22) days. However, baseline CCL25 was associated with persistence of the ⁶⁸Ga-DOTATATE signal (p=0.006). SST2 was co-expressed with CD68⁺ macrophages in myocardial biopsies of MI patients. In a preliminary analysis of the first 26 participants, there was a trend toward an inverse association between 3-month global SUVmax and 1-year left ventricular ejection fraction (r=-0.36, p=0.09). Conclusion SST2 PET/MRI tracks resolving inflammation after MI. Ongoing work will confirm links between residual infarct-related ⁶⁸Ga-DOTATATE signal, systemic immune activation, and longer-term myocardial injury.
Pilomatrixoma is a benign hair follicle tumour. Anetodermic changes overlying pilomatrixoma are rare. The aim of this study was to evaluate a case series of patients with a clinical diagnosis of anetodermic pilomatrixoma presenting to our Dermatology Department over a 5-year period. Eight cases were identified. The median age of onset was 21 years. All cases presented on the upper limbs and trunk with a solitary, rapidly evolving tumour, tender on palpation. The lesions had an erythematous, protuberant appearance with a wrinkled, atrophic surface. Underlying pilomatrixomas were firm, measuring 1–5 cm. Simple excision was carried out in seven cases without postoperative complications. In conclusion, anetodermic pilomatrixoma is a rare variant of this tumour, occurring more frequently on the upper body. It presents with identifiable features and should be differentiated from other skin tumours. Surgical removal is the gold standard treatment.
Spondylodiscitis is the commonest spinal infection, and pyogenic spondylodiscitis is the most common subtype. Whilst antibiotic therapy is the mainstay of treatment, some advocate that early surgery can improve mortality, relapse rates, and length of stay. Given that the condition carries a high mortality rate of up to 20%, the most effective treatment must be identified. We aimed to compare the mortality, relapse rate, and length of hospital stay of conservative versus early surgical treatment of pyogenic spondylodiscitis. All major databases were searched for original studies, which were evaluated using qualitative synthesis, meta-analysis, and influence and regression analyses. The meta-analysis, with an overall pooled sample size of 10,954 patients from 21 studies, found that pooled mortality was 8% in the early surgery subgroup versus 13% in patients treated conservatively. The mean proportion of relapse/failure was 15% in the early surgery subgroup versus 21% in the conservative treatment subgroup. Further, early surgical treatment, compared with conservative management, was associated with risk reductions of 40% in relapse/failure and 39% in mortality, and a reduction in length of hospital stay of 7.75 days per patient (p < 0.01). The meta-analysis demonstrated that early surgical intervention consistently outperforms conservative management in relapse/failure and mortality rates, and length of stay, in patients with pyogenic spondylodiscitis.
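As a quick plausibility check, the crude relative risk reductions implied by the pooled proportions above can be computed directly. This is only a rough check: the abstract's 40%/39% figures come from pooled meta-analytic estimates, not these crude proportions, so exact agreement is not expected (the function name below is illustrative, not from the study):

```python
# Sanity-check the crude risk reductions implied by the pooled proportions
# reported in the abstract. Rough check only: the paper's 40%/39% figures
# come from pooled meta-analytic estimates, not these crude proportions.

def relative_risk_reduction(p_treated: float, p_control: float) -> float:
    """Crude relative risk reduction: (control - treated) / control."""
    return (p_control - p_treated) / p_control

mortality_rrr = relative_risk_reduction(0.08, 0.13)  # early surgery vs conservative
relapse_rrr = relative_risk_reduction(0.15, 0.21)

print(f"crude mortality RRR: {mortality_rrr:.0%}")  # ~38%, close to the reported 39%
print(f"crude relapse RRR:  {relapse_rrr:.0%}")     # ~29%; reported pooled estimate is 40%
```

The gap between the crude 29% and the reported 40% relapse/failure reduction is expected, since meta-analytic pooling weights studies individually rather than aggregating raw event counts.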
Primary biliary cholangitis (PBC) is an immune-mediated inflammatory disorder of the interlobular bile ducts which in many cases leads to cirrhosis. In patients with PBC, inadequate biochemical response to first-line treatment with ursodeoxycholic acid (UDCA) identifies those at high risk of progressive liver disease. In this study, we used transcriptional profiling of peripheral immune cells to gain insight into the immunobiology of high- vs. low-risk disease. We performed bulk RNA-sequencing of monocytes, NK cells, CD4+ T cells, CD8+ T cells, and B cells isolated from the peripheral blood of 40 treatment-naïve PBC patients, 36 high-risk patients (ALP ≥1.67 times the upper limit of normal [ULN] despite treatment with UDCA), 32 low-risk patients (ALP <1×ULN on UDCA), and 32 matched controls. We used Weighted Gene Co-expression Network Analysis (WGCNA) to identify networks of co-expressed genes (“modules”) associated with high-risk, low-risk or any PBC, and the most highly connected genes (“hub genes”) within them. Finally, we performed Multi-Omics Factor Analysis (MOFA) of WGCNA modules to identify the principal axes of biological variation (“latent factors”) across all immune cell subsets. We identified modules associated with high-risk, low-risk, or any PBC (q < 0.05) in each PBMC subset. Hub genes and functional annotations suggested that: (1) CD4+ T cells, CD8+ T cells and monocytes are active in PBC patients irrespective of disease activity, with enrichment of genes involved in TNF, IL-2, IL-6, and IFNγ signalling, amongst other pathways; and (2) TNF signalling, implicated in all five cell types studied, is associated with high-risk compared with low-risk disease. Using MOFA, we identified one latent factor which bridged all cell types, was heavily weighted for modules enriched for TNF signalling, and showed a significant difference in high-risk compared with low-risk disease (q < 0.05).
Using this approach, we found evidence of pro-inflammatory signalling in all stages of PBC irrespective of high-risk or low-risk disease, and we identified TNF signalling as key in high-risk compared with low-risk disease. Abstract O10 Figure 1 Multi-Omics Factor Analysis (MOFA) of modules identified by weighted gene co-expression network analysis (WGCNA) for all cell types. Heatmap showing the fitted MOFA model, displaying the percentage of variance explained for each factor (rows) in each cell type. LF1 was associated with all cell types and contained modules enriched for TNF signalling in high-risk vs. low-risk disease. The analysis also identified latent factors unique to each cell type; for example, LF10 was unique to CD4+ T cells.
Non-alcoholic fatty liver disease (NAFLD) is now the most common cause of liver disease worldwide, and the second most common cause of liver transplantation in the UK. Transplantation is physiologically stressful and is associated with deterioration in cardiometabolic risk factors, as well as disease recurrence after 5 years. Infection is the leading cause of death in the first 5 years post-transplant; subsequently, cardiovascular disease and malignancy are the predominant causes of death. Here we retrospectively evaluate a single centre’s experience of assessment and management of cardiometabolic risk factors post-transplantation. This was a single-centre study of patients who underwent liver transplantation for NASH cirrhosis between January and December 2021. Data were collected retrospectively for patients’ height, weight, blood pressure, glycated haemoglobin (HbA1c) and estimated glomerular filtration rate (eGFR) at initial assessment, admission for transplant, and subsequent visits at 3, 6 and 12 months post-transplant. QRISK3, the prediction algorithm for cardiovascular risk, was calculated at assessment and 12 months post-operatively (https://qrisk.org). Results are shown as mean ± standard deviation; statistical analysis was carried out using GraphPad Prism v9. We identified 27 patients who had a liver transplant for NASH during this period. We noted a 6.8 kg/m2 increase in mean BMI from assessment to 1-year follow-up, and 2/26 patients had a new diagnosis of type 2 diabetes. There was a significant increase in blood pressure over the study period: mean systolic blood pressure 122 ± 15.9 vs 150 ± 27.5 mm Hg, p<0.005; mean diastolic pressure 64 ± 10.3 vs 77 ± 8.3 mm Hg, p<0.005. Fifty percent of patients had new-onset chronic kidney disease stage 3 (CKD3) post-transplant, and one patient died after 7 months due to a myocardial infarction.
We noted a significant rise in QRISK3 score 1 year post-transplant (mean 14% vs 27%, p=0.0024); 84.6% of patients at 1 year post-transplant had a QRISK3 score greater than 10%, meeting criteria for statin therapy. We would recommend that all patients undergoing liver transplantation for NASH are assessed for risk factors of cardiovascular disease and NASH recurrence at routine follow-up appointments, including a QRISK3 assessment. Statins are recommended for primary prophylaxis of major cardiovascular events in those with a QRISK3 score greater than 10%. With approximately 85% of patients meeting this criterion, we would advocate proactive assessment and initiation of therapy in all NASH patients post-transplant.
Introduction Chronic hepatitis B (CHB) can cause permanent liver damage and hepatocellular carcinoma (HCC), leading to liver transplantation. Both hepatitis B immunoglobulin (HBIG) and nucleos(t)ide analogues (NUCs) have been used to prevent CHB reactivation. However, the duration and end point of HBIG therapy remain controversial, which causes wide discrepancies in their use across transplant centres. We aimed to explore the current use of HBIG post-liver transplantation in the UK, the rate of CHB reactivation, its association with the duration and type of HBIG/NUCs used, and overall survival. Methods We conducted a retrospective, multicentre study through the British Association for the Study of the Liver (BASL); all 7 major UK transplant centres were invited, and 6 provided data and were included in this study. All patients undergoing liver transplantation for CHB from 2012 to mid-2023 were included. Demographics, indications for transplant, type of NUCs, HBIG duration, viral serology pre- and post-liver transplant, and overall survival were collected. Results 251 patients were included (78.8% male; median age 51). 68.5% had HBV mono-infection and 24.7% had HBV/HDV coinfection (n=62). Median UKELD was 52. The most common indication was decompensated cirrhosis (46.2%), followed by HCC (41.4%). 72.9% had undetectable HBV DNA, and 61.5% of those with delta coinfection had detectable HDV RNA at the time of transplant. Most patients received NUCs post-liver transplantation regardless of HBV DNA detectability (table 1). However, there was a vast discrepancy in HBIG doses in the first week post-transplant and in maintenance dosing thereafter. While 81% received HBIG post-liver transplantation (67.7% received HBIG at day 0 post-transplant), only 46.2% had maintenance HBIG doses (range 1 week to 10 years). Only 14 patients (5.6%) had reactivation post-liver transplantation; all had either coinfection with HDV, HCV or HIV, or HCC before the transplant.
Median survival was 54 months (range 0–155), with 25 deaths. Multivariate analysis showed no significant association between the number of HBIG doses in the first week post-transplant, maintenance HBIG, or type of NUCs used and reactivation or mortality. Abstract P135 Table 1 Patient characteristics of all subjects included in the analysis. Conclusion Excellent long-term outcomes for liver transplantation in CHB patients were seen with the help of HBIG and NUCs, preventing reactivation. The prolonged use of HBIG post-liver transplantation did not impact reactivation or survival. However, nearly half of CHB patients continue to receive prolonged maintenance HBIG doses. A national consensus on HBIG use post-liver transplantation needs to be implemented.
Liver disease is on the rise, which has created an urgent need for new treatments. An attractive therapeutic approach is the understanding and manipulation of the regenerative pathways of the liver. The regenerative capability of the liver depends on the nature of the injury. It is known that mild injury and hepatectomy induce regeneration driven by hepatocyte proliferation. However, hepatocyte proliferation is impaired during chronic injury, and secondary mechanisms of regeneration might exist. Indeed, studies in animal models have revealed several regenerative processes which might take place in chronic disease: (1) liver stem cell activation, (2) dedifferentiation/redifferentiation of cholangiocytes/hepatocytes, and (3) transdifferentiation between cholangiocytes and hepatocytes. There is little knowledge of the above processes in humans, and current knowledge is mainly derived from histopathology analyses. Here we aim to define the mechanisms behind epithelial plasticity in the diseased liver. To understand the liver’s response to chronic disease, we collected liver biopsies from just under 50 patients across the spectrum of non-alcoholic fatty liver disease and performed state-of-the-art 3D imaging and single-nucleus RNA sequencing (snRNA-seq). In-depth computational analysis has allowed us to dissect the cellular composition of the liver during the course of chronic liver disease. We further used liver organoids as an in vitro model to validate our snRNA-seq findings. The analysis revealed that disease progression is accompanied by tissue remodelling, loss of zonation and extensive ductular reaction. Further analyses captured, at both the transcriptomic and protein levels, the presence of cells sharing characteristics of cholangiocytes and hepatocytes, termed biphenotypic cells. Gene enrichment analysis revealed signalling pathways likely involved in their generation.
In vitro modelling using human-derived NAFLD liver organoids validated these molecular pathways and their importance in the generation of biphenotypic cells. In conclusion, we use snRNA-seq analysis to demonstrate the presence of a molecular pathway involved in liver epithelial cell plasticity during chronic liver disease. This study paves the way for the development of new therapies promoting tissue repair to improve organ function.
Background and Aims Hepatitis B (HBV) is the leading cause of hepatocellular carcinoma (HCC) worldwide. Immunotherapies to date have not achieved the long-term immune control required for functional cure. The display of HBV-derived antigenic peptides on HLA class I molecules is crucial for mounting antiviral immune responses. TAPBPR functions intracellularly as an MHC class I peptide exchange catalyst, promoting the loading of immunogenic peptides onto MHC-I as well as the release of suboptimal peptides. Following on from the use of recombinant soluble TAPBPR (sTAPBPR) to mediate exogenous peptide loading onto MHC class I at the cell surface, the Boyle lab demonstrated that recombinant TAPBPR-antibody fusion proteins (TAPBPR-Ab) can tether TAPBPR to a plasma membrane-expressed target protein, resulting in superior peptide exchange compared with sTAPBPR alone. This was observed at low peptide concentrations because the fusion protein remains bound to the cell surface, enabling sequential binding to multiple MHC molecules. TAPBPR-Ab promoted highly efficient loading of exogenous immunogenic peptides onto tumour cells, resulting in antigen-specific CD8+ T cell responses and killing of antibody-target-positive tumour cells in vivo. Here we investigate whether TAPBPR-Ab fusion proteins can be used to increase the immunogenicity of HBV-infected cells. Methods Recombinant TAPBPR fused to an αGFP nanobody (TAPBPR-GFPnb) and tethered to the plasma membrane was used to perform exogenous loading of fluorescently labelled HBV peptides on a panel of cells expressing the antibody target GFP and a single HLA class I allotype of choice (HLA-A*02:01, -B*08:01, -B*35:01 and -Cw*06:02). Results Superior peptide binding levels were observed on target cells treated with TAPBPR-GFPnb compared with cells treated with sTAPBPR or peptide alone.
This was observed on HLA class I allotypes for which TAPBPR is known to exhibit a preference, such as HLA-A*02:01 and HLA-B*08:01, as well as on the -B*35:01 and -Cw*06:02 allotypes. Peptide binding was not observed in cells lacking classical HLA class I expression, demonstrating that peptide loading occurs directly onto HLA class I. Peptide binding was negligible on TAPBPR-GFPnb-treated cells lacking the nanobody target, similar to cells treated with peptide alone. Conclusion The use of TAPBPR-Ab fusion proteins is a novel approach to decorate target-positive HBV-infected cells with immunogenic HBV peptides on a diverse range of HLA class I allotypes. These fusion proteins have demonstrated an ability to trigger antigen-specific immune responses even at low peptide concentrations and thus have exciting potential as a therapeutic tool for HBV functional cure, with limited off-target effects.
Background Response inhibition − or the ability to withhold a suboptimal response − relies on the efficacy of fronto-striatal networks and is impaired in neuropsychiatric disorders including addiction. Cortical paired associative stimulation (cPAS) is a form of transcranial magnetic stimulation (TMS) which can strengthen neuronal connections via spike-timing-dependent plasticity mechanisms. Here, we used cPAS targeting the fronto-striatal inhibitory network to modulate performance on a response inhibition measure in chronic alcohol use. Methods Fifty-five participants (20 patients with a formal alcohol use disorder (AUD) diagnosis (26–74 years, 6 [30%] female) and 20 matched healthy controls (HCs) (27–73 years, 6 [30%] female) within a larger sample of 35 HCs (23–84 years, 11 [31.4%] female)) underwent two randomized sessions of cPAS 1 week apart: right inferior frontal cortex stimulation preceding right presupplementary motor area stimulation by either 4 ms (excitation condition) or 100 ms (control condition); the Stop Signal Task (SST) was administered in both sessions. Results HCs showed decreased stop signal reaction time in the excitation condition (t(19) = −3.01, p = 0.007, CIs: −35.6 to −6.42); this facilitatory effect was not observed for AUD (F(1,31) = 9.57, p = 0.004, CIs: −68.64 to −14.11). Individually, rates of SST improvement were substantially higher for the healthy (72%) than the AUD (13.6%) group (OR: 2.33, p = 0.006, CIs: −3.34 to −0.55). Conclusion In line with previous findings, cPAS improved response inhibition in healthy adults by strengthening the fronto-striatal network through putative long-term potentiation-like plasticity mechanisms. Furthermore, we identified a possible marker of impaired cortical excitability, and thus diminished capacity for cPAS-induced neuroplasticity, in AUD, with direct implications for a disorder-relevant cognitive process.
Objectives Virtual reality (VR) might improve symptom management, but there is limited evidence regarding VR in palliative care. We evaluated the feasibility of VR and its impact on anxiety and pain for patients in a hospital palliative care consultation service. Methods Patients referred to a hospital specialist palliative care team with anxiety or pain were offered a VR intervention (a short audiovisual experience). Participants rated anxiety and pain on a 0–10 Likert severity scale pre- and post-intervention and completed an evaluation form. Change in symptom scores was analysed using parametric statistics. Results 28 participants used VR a total of 42 times with no adverse events. Mean pain score reduced by 29%, from 4.10 (SD=2.71) pre-intervention to 2.93 (SD=2.45) post-intervention (t(27)=5.150, p<0.001). Mean anxiety score reduced by 40%, from 4.43 (SD=2.56) to 2.65 (SD=2.24) (t(27)=5.058, p<0.001). Patients rated the experience on average 4.75/5, and all would recommend it to a friend. VR was described as absorbing and relaxing. Conclusion VR may improve anxiety and pain and was acceptable in this setting. Large-scale evaluation will generate important data on feasibility and implementation.
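The percentage reductions quoted above follow directly from the pre- and post-intervention means; a minimal sketch reproducing the arithmetic (variable names are illustrative):

```python
# Reproduce the percentage reductions in mean pain and anxiety scores
# from the pre/post means reported in the abstract.

def percent_reduction(pre: float, post: float) -> float:
    """Percentage fall from pre- to post-intervention mean."""
    return (pre - post) / pre * 100

pain = percent_reduction(4.10, 2.93)     # pre vs post mean pain score
anxiety = percent_reduction(4.43, 2.65)  # pre vs post mean anxiety score

print(f"pain reduced by {pain:.0f}%")     # ~29%, as reported
print(f"anxiety reduced by {anxiety:.0f}%")  # ~40%, as reported
```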
Aims Enteral feeding is commonly used to manage a variety of medical conditions in hospital. For people with diabetes, this can present a specific challenge for glucose management. To address gaps in our understanding of modern enteral feeding outcomes, and to help develop more specific guidance on maintaining glycaemic control, we conducted a national survey on the management of enteral feeding against the standards in the nationally adopted Joint British Diabetes Societies for Inpatient Care (JBDS) guidelines. Methods A questionnaire was developed using the 2018 JBDS guideline as a template. This questionnaire was sent by email to all 220 UK specialist diabetes teams, using the databases of Diabetes UK, the Association of British Clinical Diabetologists (ABCD), and the Diabetes Inpatient Specialist Nurse (DISN) UK Group. Results Twenty-six hospitals responded; 11 had guidelines for the management of insulin with enteral feeding. Three main feed regimens were used: continuous 24-hour feeding, a single feed with one break in 24 hours, or multiple feeds in 24 hours. Five insulin regimens were in common use: premixed insulin, isophane insulin, analogue basal insulin, variable rate intravenous insulin, or basal-bolus insulin. Overall glucose control was poor for all regimens and combinations. Continuous feeding showed better glucose control than a single feed with a break, mean (±SD) glucose 12.4 (5.6) vs 15.1 (6.9) mmol/L, p<0.005, but no group achieved optimal control. Conclusions Managing diabetes control during enteral feeding remains a challenge. Our survey showed that glucose control during this treatment is suboptimal.
Objectives: Stenting of malignant colonic obstruction is used as a bridge to surgery or as an alternative to surgical colostomy in a palliative setting. Current guidelines recommend stent placement as the first-line treatment for colonic obstruction in both curative and palliative settings. However, it is unclear whether the location of the malignant obstruction influences the outcome of the stenting procedure. The goal of this study was to compare the outcomes of colonic stenting between proximal and distal colonic strictures with regard to technical and clinical success and the risk of adverse events. Methods: A multi-center retrospective cohort comprised patients who underwent colonic stent placement at two tertiary hospitals between 2013 and 2021. Technical and clinical outcomes, stent type used, duration of post-procedural hospital stay, and complications were recorded. Results: A total of 148 patients who underwent colonic stenting were identified: 41 underwent stent placement in the proximal colon and 107 underwent distal stent placement. There was no difference in technical success (100% vs 96.3%, p = 0.209), clinical success (97.0% vs 89.6%, p = 0.199) or complications (24.4% vs 37.4%, p = 0.135). Conclusion: Technical and clinical success rates are high and do not differ between stent locations. There is no significant difference in complication rates between proximal and distal colonic stents.