Recent publications
Adolescents and young adults (AYA) with chronic rheumatic musculoskeletal diseases (RMDs) have multiple potential risk factors for compromised bone health. The effect of active inflammatory disease on bone resorption, together with malnutrition, reduced physical activity, delayed puberty, vitamin D deficiency and glucocorticoid therapy, can all result in impaired bone accrual. The impact of chronic RMDs on bone density can extend into adulthood and, coupled with the inevitable bone loss of ageing, may lead to increased fragility fractures throughout life. Although many tools can be used to assess bone health, the most useful and widely used technique by far is dual-energy X-ray absorptiometry (DXA).
Low bone mineral density (BMD) can be asymptomatic, hence AYAs with chronic RMDs may benefit from routine bone health screening. Specific measures include the use of vitamin D and calcium supplements, steroid-sparing medications and the promotion of weight-bearing exercise. This chapter addresses common risk factors for adverse bone health in AYAs with RMDs and explores the broad approach to prevention and management of bone fragility.
This review provides descriptive evidence for the potential sociodemographic risk factors of race/ethnicity, younger age and socioeconomic status, as well as evidence for the immigrant effect, in women with breast cancer (BC) across world regions and countries. We searched MEDLINE and the Web of Science for literature on epidemiology, incidence/mortality rates and social determinants, together with registry-based data on BC in women (code C50) from the GLOBOCAN 2022 database and the National Cancer Institute’s Surveillance, Epidemiology, and End Results (SEER) program (2024), and the reference lists of prior reviews. Globally, 1,959,256 new cases (26.7%) and 495,572 deaths (17.7%) were recorded in women aged <75 years in 2022. The age-standardized incidence rate (ASIR) of BC was highest among countries with a very high Human Development Index (HDI) (ASIR, 70.9), followed by those with a high HDI (ASIR, 44.4) and middle-to-low HDI levels (ASIR, 33.6–32.5), driven by affluent lifestyles and a higher incidence of infectious diseases and infection-associated cancers. In addition, younger BC women are statistically more likely than older ones to have pathogenic germline variants in BC susceptibility genes (BRCA1/2, TP53, PALB2). The descriptive epidemiology presented in this review should be of global value to clinicians, researchers and policymakers considering the implementation and implications of population-based BC screening programs.
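The ASIR figures quoted above come from direct age-standardisation: age-specific rates weighted by a standard world population. A minimal sketch of that calculation, with illustrative age bands, rates and weights (not GLOBOCAN's actual inputs):

```python
# Hedged sketch of direct age-standardisation, the calculation behind an
# age-standardized incidence rate (ASIR). The age bands, rates and
# standard-population weights below are illustrative placeholders.

def asir(age_specific_rates, standard_weights):
    """ASIR per 100,000: age-specific rates weighted by a standard
    population whose weights sum to 1."""
    return sum(r * w for r, w in zip(age_specific_rates, standard_weights))

rate = asir(
    age_specific_rates=[5.0, 60.0, 180.0, 250.0],  # per 100,000, by age band
    standard_weights=[0.40, 0.30, 0.20, 0.10],     # illustrative weights
)
print(rate)  # weighted average of the band-specific rates
```

Because the weights are fixed across populations, ASIRs are comparable between countries with different age structures, which is why GLOBOCAN reports them rather than crude rates.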
Introduction
Arboviruses are a diverse group of arthropod-borne pathogens and are emerging global public health threats with no approved therapeutics. Arboviruses are spreading rapidly, posing a health threat to UK Armed Forces (UKAF) service personnel (SP) through deployment to endemic regions. There are limited data on the burden of arboviral infections in UKAF SP.
Methods
A retrospective service evaluation of UKAF electronic healthcare records (eHRs) and statutory notifications to the Defence Public Health Unit was conducted. Cases with possible/confirmed dengue, chikungunya or Zika virus infections between 2005 and 2023 were included. eHRs were interrogated and trends analysed.
Results
Of 107 suspected infections between 2005 and 2023, 49 (45.8%) were laboratory-confirmed. Dengue fever was the most common (45/49), followed by chikungunya (3/49) and Zika (1/49) virus infections. The average yearly incidence of reported dengue infection increased from 0.51 cases per 100 000 UKAF SP in 2009–2011 to 3.85 cases per 100 000 SP in 2021–2023. 19/45 (42.2%) cases occurred during operational deployments and 24/45 (53.3%) during non-military activity. Dengue infection was most frequently acquired in Southeast Asia.
Using WHO clinical severity criteria, 33/45 (73.3%) had dengue with warning signs and 5 (11.1%) had severe dengue. 23/45 (51.1%) dengue cases were hospitalised (median length of stay 5 days, IQR 3, range 1–9). No dengue fatalities or medical discharges occurred. Occupational impact was significant, with a median of 11 days stood down (IQR 10, range 0–45); 3/19 (15.8%) cases on operations required aeromedical evacuation (AEROMED). One deployed case of chikungunya required AEROMED and a 35-day downgrade.
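The incidence figures above reduce to simple person-time arithmetic: cases divided by population-years at risk, scaled to 100,000. A minimal sketch, in which the force-strength figure is a hypothetical placeholder rather than the actual UKAF denominator used in the evaluation:

```python
# Hedged sketch: deriving a yearly incidence per 100,000 personnel.
# The force-strength figure below is a hypothetical placeholder, not the
# actual UKAF denominator used in the service evaluation.

def incidence_per_100k(cases: int, population: int, years: int) -> float:
    """Average yearly incidence per 100,000 person-years at risk."""
    return cases / (population * years) * 100_000

# e.g. 17 confirmed cases over a 3-year window in a (hypothetical)
# force of 147,000 service personnel
rate = incidence_per_100k(cases=17, population=147_000, years=3)
print(round(rate, 2))
```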
Conclusions
Reports of arboviral infections, particularly dengue, are increasing in UKAF personnel, presenting an emerging health threat. This has implications for UKAF provision of deployed diagnostics and dengue vaccination policy. The rapid spread of arboviruses outside their traditional geographical areas, including into Europe, necessitates further surveillance and requires diagnostic and therapeutic research.
Acute decompensated cirrhosis (DC) and acute-on-chronic liver failure are common reasons for hospital admission that have a high in-hospital mortality rate (10%–20%). Patients require a detailed assessment for precipitating factors and management of complications such as infections, ascites, acute kidney injury and hepatic encephalopathy. Multiple reports have demonstrated unwarranted variability in the care of patients with DC. In 2014, the British Society of Gastroenterology (BSG)/British Association for the Study of the Liver (BASL) DC care bundle (DCCB) was introduced to provide a structured approach for the management of patients with DC in the first 24 hours. Usage of the DCCB has been shown to improve care of patients with DC. However, despite evidence indicating the beneficial impact of the DCCB, overall usage across the UK was only 11.4% in a national audit. Our aim was to update the DCCB to incorporate recent advances in care, improve its usability and develop a strategy to improve its usage nationally. The updated bundle was developed by a multidisciplinary group of specialists from BSG, BASL and the Society for Acute Medicine, with the quality of evidence supporting the bundle recommendations assessed using the Grading of Recommendation Assessment Development and Evaluation tool. Proposed minimum standards for audit were also developed. Finally, a strategy was developed to promote usage of the bundle, including education/training at a national and local level, improved accessibility of the bundle, and promotion of frameworks at an institutional level to improve and monitor utilisation of the DCCB.
Peripheral blood stem cell (PBSC) donation is the primary procedure used to collect haemopoietic stem cells (HSCs) for transplantation in individuals with haematological malignancies. More than 90,000 HSC transplants take place globally each year, and there is an increasing need to guarantee HSC mobilisation, improve tolerability to apheresis, and optimise immune reconstitution. Currently, mobilisation of HSCs depends upon pharmacological agents, with donors inactive during their subsequent apheresis. A successful yield of HSCs is not always achieved, and greater efficiency of collection procedures would improve the donors’ safety and experience, along with the overall functioning of apheresis departments. The mobilisation of immune cells during bouts of exercise has been increasingly studied over the past 40 years. Exercise enriches peripheral blood with HSCs and immune cells such as cytolytic natural killer cells, and these may impact upon collection efficiency and patient outcomes following transplantation. Using exercise in conjunction with routine pharmaceutical agents may meet these needs. This article describes the impact of exercise on the quantity and engraftment potential of HSCs. Given that PBSC collections take on average 3–4 h per day per donor, and often consecutive days to complete, particular attention is paid to adopting interval exercise in this setting. Moreover, practical and safety considerations for allogeneic and autologous donors are discussed. ‘Intra-apheresis cycling’ is proposed as a feasible adjunctive strategy to evoke clinically significant improvements in the quality of the immune graft. Further research is needed to validate this concept in conjunction with routine mobilisation agents.
Objective
To explore the evidence for interventions that integrate child health and social care and support programmes and the impact they have on child health and wellbeing.
Data sources
The Cochrane Library, Ovid Medline, Ovid Embase, Ovid Emcare, Ovid Health Management Information Consortium (HMIC) database, Ovid Social Policy and Practice, ProQuest PsycINFO and EBSCOhost CINAHL.
Eligibility
Peer-reviewed original research that described an intervention integrating health care and social support or care interventions for children and young people (CYP) up to the age of 18 years in high-income countries. All databases were searched from inception to August 2023.
Data extraction and synthesis
16 studies were identified: 9 quantitative studies including 4 RCTs, 5 qualitative studies and 2 mixed methods studies. Studies were assessed for quality and a narrative review performed. Study heterogeneity meant a meta-analysis could not be completed.
Results
For clarity, we collated the identified studies by mode of delivery. In doing so we determined three main models of delivering integrated health and social care services. Targeted support for vulnerable groups, in which packages of interventions focussed on target populations, showed potential for decreasing the need for social support in the long term, but with limited evidence for reducing referrals into other services; these services were more successful in meeting specific objectives such as lowering rates of smoking and reducing repeat pregnancies. Collaborative health and social support, which typically collocated health and social care practitioners, demonstrated improved collaborative working but had little impact on workload, job satisfaction or service delivery. School-centred health and social care, based in educational facilities, improved some aspects of CYP wellbeing and physical health, but with concerns that it added to teacher workload.
Conclusions
Integrated health and social support programmes offer promising solutions to addressing health inequity in children and young people in underserved populations. However, more robust and consistent study designs are needed to guide researchers and policy makers in their implementation and evaluation.
PROSPERO registration
CRD42023399907
Transcatheter mitral leaflet repair is a non-surgical technique used to treat severe mitral regurgitation. The technique has matured significantly since its commercial introduction, and with device iteration and increasing operator experience, it is now an important treatment option for patients at higher risk for conventional mitral valve surgery. Randomised clinical trials have established the safety and efficacy of the technique in the treatment of primary and secondary mitral regurgitation, and its use was approved by the National Institute for Health and Care Excellence in 2019. This position statement summarises the clinical evidence and indications for the procedure and provides expert consensus on best practice in terms of patient selection, the procedure and post-procedure care. Standards are also described with respect to team composition, minimum case volume and collection of procedural and outcome data.
Objective
Unintentional parathyroid gland resection during total thyroidectomy can result in permanent hypoparathyroidism and lifelong replacement therapy. Near infrared autofluorescence (NIRAF) imaging may aid intraoperative identification and preservation of the parathyroid glands. This article aims to review NIRAF's effectiveness in the prevention of post‐operative hypoparathyroidism.
Design
Systematic review and meta‐analysis reported according to PRISMA guidelines.
Methods
The electronic databases of MEDLINE, Embase and Cochrane were searched in September 2024. Included articles were randomised controlled trials (RCTs) that studied the use of NIRAF vs. dissection with no intraoperative aids in thyroidectomy. Meta‐analysis was performed using a random‐effects model. Primary outcomes were postoperative hypocalcaemia and permanent hypoparathyroidism.
Results
Eight RCTs were included in the final analysis, comprising 1620 patients. Meta‐analysis revealed patients undergoing thyroidectomy using NIRAF had a reduced risk of both post‐operative hypocalcaemia (OR 0.56, 95% CI: 0.36–0.89, p = 0.01) and persistent hypoparathyroidism (OR 0.44, 95% CI: 0.22–0.89, p = 0.02).
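The pooled effect sizes above are odds ratios with confidence intervals constructed on the log scale, which is the per-study quantity a meta-analysis combines. A minimal sketch of that calculation, using invented 2×2 counts rather than data from any of the eight included RCTs:

```python
import math

# Hedged sketch: odds ratio and Wald-type 95% CI from a single 2x2 table.
# The counts below are hypothetical, not taken from any included RCT.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = events/non-events in the intervention arm;
    c/d = events/non-events in the control arm."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the usual Woolf formula
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, (lo, hi)

# e.g. 10/100 hypocalcaemia events with NIRAF vs 20/100 without (hypothetical)
or_, (lo, hi) = odds_ratio_ci(a=10, b=90, c=20, d=80)
print(round(or_, 2), (round(lo, 2), round(hi, 2)))
```

An OR below 1 with an upper confidence limit below 1, as reported for both primary outcomes, indicates a statistically significant risk reduction in the NIRAF arm.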
Conclusions
NIRAF use in thyroidectomy reduces the risk of post‐operative hypocalcaemia and post‐operative hypoparathyroidism.
Ketamine‐induced cystitis is an increasingly recognized complication associated with the addictive use of ketamine, a dissociative anesthetic. This article provides a comprehensive overview, focusing on its pathophysiology, clinical presentation, diagnosis, management strategies, and implications for addiction treatment. The British Association of Urological Surgeons consensus serves as a foundational reference for management, while additional literature is integrated to highlight the multifaceted nature of Ketamine Bladder and its impact on individuals with substance use disorders.
This report provides guidance for users of linear accelerator (linac) Manufacturer Integrated Quality Control (MIQC) tools. MIQC tools have been developed and introduced by radiotherapy linac vendors, and have the potential to improve both the quality and efficiency of linac Quality Control (QC). They usually utilise the Electronic Portal Imaging Device (EPID), but may acquire data from other sources, and automatically perform and analyse tests of various treatment machine QC parameters. The currently available systems meeting this definition are Varian Machine Performance Check (MPC), CyberKnife Automated Quality Assurance (AQA)/End-to-End (E2E), TomoTherapy Quality Assurance (TQA), and Elekta Machine QA (EMQA) (also known as AQUA). This guidance report covers the commissioning and implementation of MIQC. The guidance has been developed by a radiotherapy special interest group (RTSIG) working party on behalf of the Institute of Physics and Engineering in Medicine (IPEM). Recommendations within the report are derived from the experience of the working party members, existing guidance, literature, and a United Kingdom survey conducted in 2022 (Pearson et al 2023). Topics covered include developing an understanding of the QC system, independent review of MIQC, commissioning, implementation, ongoing QC and calibration, software upgrades and periodic review. The commissioning section covers detector commissioning, repeatability and reproducibility, baseline and tolerance setting, concordance with existing QC, sensitivity testing, cost-benefit analysis, and risk assessment methods. To offer practical guidance, case studies covering each aspect of commissioning are included. They are real-world examples or experiences from early adopters, each applied to a different example MIQC system. The examples will be directly applicable to users of that specific MIQC system, but also provide practical guidance on clinical implementation to users of the other systems.
Objectives
To report the British Association of Urological Surgeons (BAUS) consensus document on the assessment and management of post‐prostatectomy incontinence‐stress urinary incontinence (PPI‐SUI).
Methods
We conducted a contemporary literature search to identify the current evidence base. A guideline development group was formed by the Female, Neurological and Urodynamic Urology (FNUU) Section of BAUS to formulate and review the recommendations. Where a lack of evidence was identified, expert opinion of the FNUU Executive Committee and a modified Delphi approach was utilised.
Results
This consensus addresses several knowledge gaps in the current literature on PPI‐SUI, in addition to tackling areas not addressed by current international guidelines, e.g. prostate cancer survivorship. From the initial draft, the modified Delphi consensus methodology was applied to 65 statements split into seven broad categories: terminology, assessment, conservative management, surgical treatment, perioperative care, complication management, and follow‐up after PPI‐SUI surgery. The document is applicable to general and specialist urologists worldwide. After three rounds, consensus was achieved on 63/65 statements.
Conclusions
We provide a modified Delphi consensus on the assessment and management of PPI‐SUI to help guide and standardise the assessment and management pathway of these patients.
Objective: Obstructive sleep apnoea (OSA) can have a significant health burden in terms of sleep outcomes, systemic morbidity including cardiovascular and urological complications, and quality of life (QoL). Transoral robotic surgery (TORS) constitutes a novel option for patients where continuous positive airway pressure (CPAP) is not tolerated. This study aimed to assess OSA patients managed with TORS in terms of QoL and sleep outcomes.
Data sources: Systematic review using EMBASE, CINAHL, MEDLINE, and Cochrane electronic databases.
Review methods: Studies were identified assessing QoL (with validated QoL tools) and sleep outcomes in OSA patients managed with TORS. For the meta-analysis, mean difference (MD) was calculated using an inverse variance random-effects model.
Results: Four studies (252 patients) were included. Meta-analysis showed improvements in apnoea-hypopnea index (AHI) (MD = 20.01, p < 0.00001), Epworth Sleepiness Scale (ESS) (MD = 5.16, p < 0.00001) and lowest oxygen saturations (LSaO2) (MD = -7.05, p < 0.00001) following TORS. For QoL, there were improvements in voice, prostate and overactive bladder symptoms, erectile function, and overall QoL following TORS. Swallowing returned to baseline at 3 months. No major complications were reported, with all adverse events managed conservatively.
Conclusion: This systematic review and meta-analysis is the first in the existing literature to evaluate TORS as a treatment option for OSA across both sleep and QoL domains. Significant improvements were observed in both parameters following TORS. Whilst further research is needed, the current findings can assist clinicians and patients when it comes to clinical decision-making regarding personalised treatment options for a condition that carries a significant morbidity and QoL burden.
Objective
Keloid scars are extremely difficult to treat with current therapy options, such as surgical excision and steroids, and have high recurrence rates. Intralesional cryotherapy is a relatively new treatment modality that uses a double-lumen needle to freeze the scar from the core outwards. Evidence from the literature supports its use, with recurrence rates reported between 0% and 23%. The aim of this study was to assess patient-reported outcomes of keloid scar treatment with intralesional cryotherapy.
Method
All patients who had undergone intralesional cryotherapy, with a minimum follow-up period of six weeks were asked to complete a questionnaire. Patients were asked to rate the appearance of their scar and severity of their symptoms on a Visual Analogue Scale. They also reported any side-effects, complications and whether they would recommend the treatment.
Results
A total of 52 patients were included between 2017 and 2019. All patients reported an improvement in scar appearance and 91% of patients reported an improvement in pain, with an average 3.75-point reduction in pain scores. All patients would recommend the treatment. Hypopigmentation was the most frequently reported side-effect and was most common in patients with Fitzpatrick skin types V and VI. A second treatment was required by three patients; all three had keloid scarring on the anterior chest and had previously had multiple courses of steroids.
Conclusion
The findings of our study appear to support the use of intralesional cryotherapy for the treatment of keloid scarring, with high patient satisfaction rates.
Background
Long-term surviving liver transplant recipients can spontaneously develop operational tolerance, which allows them to completely discontinue their immunosuppression, but we lack validated tools to predict the likelihood of rejection following immunosuppression withdrawal. A previous clinical trial showed that a logistic regression algorithm including the transcript levels of a set of five genes in a liver biopsy could predict the success of immunosuppression withdrawal with high sensitivity and specificity.
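The five-gene test described above is a logistic regression over intragraft transcript levels. As a hedged sketch only — the gene names, coefficients and intercept below are invented placeholders, not the published model — the prediction step looks like:

```python
import math

# Hedged sketch of a logistic-regression score over five transcript levels.
# GENE names, COEFS and INTERCEPT are invented placeholders; the published
# model's genes and weights are not reproduced here.

GENES = ["GENE_A", "GENE_B", "GENE_C", "GENE_D", "GENE_E"]  # hypothetical
COEFS = [0.8, -1.1, 0.4, 0.6, -0.3]                         # hypothetical
INTERCEPT = -0.5                                            # hypothetical

def tolerance_probability(expression):
    """expression: dict mapping each gene to its normalised transcript level.
    Returns the predicted probability of operational tolerance."""
    z = INTERCEPT + sum(c * expression[g] for g, c in zip(GENES, COEFS))
    return 1 / (1 + math.exp(-z))  # logistic (sigmoid) link

p = tolerance_probability({g: 0.0 for g in GENES})
# with all predictors at zero, the score is just sigmoid(INTERCEPT)
```

A patient is called "biomarker-positive" by thresholding such a probability; the trial below tests whether that stratification actually predicts successful withdrawal.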
Objective
To determine if the use of a liver tissue transcriptional test of tolerance to stratify liver recipients prior to immunosuppression withdrawal accurately identifies operationally tolerant recipients and reduces the incidence of rejection, as compared with a control group in whom immunosuppression withdrawal is performed without stratification.
Design and methods
Prospective, multicentric, phase IV, biomarker-strategy design trial with a randomised control group in which adult liver transplant recipients were randomised 1 : 1 to either: (1) non-biomarker-based immunosuppression weaning (Arm A); or (2) biomarker-based immunosuppression weaning (Arm B).
Setting and participants
Adult liver transplant recipients ≥ 3 years post transplant (≥ 6 years if aged ≤ 50 years) with no history of autoimmunity or recent episodes of rejection, normal allograft function, and no significant histological abnormalities in a baseline screening liver biopsy, recruited from 12 transplant units in the United Kingdom, Germany, Belgium and Spain.
Intervention
Enrolled patients underwent a screening liver biopsy to exclude the presence of subclinical allograft damage. Eligible participants randomised to Arm A underwent gradual discontinuation of immunosuppression. Among participants allocated to Arm B, only those found to be biomarker-positive were offered immunosuppression withdrawal, while biomarker-negative participants remained on their baseline immunosuppression. Patients who completely discontinued immunosuppression and maintained stable allograft function underwent protocol liver biopsies at 12 and 24 months after immunosuppression withdrawal.
Main outcome measure
Development of operational tolerance, defined as the successful discontinuation of immunosuppression with maintenance of normal allograft status 12 and 24 months after immunosuppression withdrawal.
Results
One hundred and twenty-two patients were eligible to participate in the trial; 116 were randomised (58 to Arm A and 58 to Arm B), 80 initiated immunosuppression withdrawal and 34 were maintained on their baseline immunosuppression. Among the 80 patients who initiated withdrawal, 54 (67.5%) developed clinically apparent rejection, 22 (27.5%) successfully discontinued immunosuppression, 21 underwent a liver biopsy and 13 (16.3%) met the histological criteria of operational tolerance at 12 months after immunosuppression discontinuation. The transcriptional tolerance biomarker was not accurate at identifying patients meeting the operational tolerance criteria [odds ratio 1.466, 95% confidence interval (CI) 0.326 to 9.215; p = 0.744; sensitivity (Sn) 54%, specificity (Sp) 42%, positive predictive value 16%, and negative predictive value 81%, with an accuracy of 44%]. Owing to the poor diagnostic performance of the test, the trial was terminated prematurely following an interim analysis of the results. No patients lost their grafts as a result of rejection during the study.
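The diagnostic indices reported above (sensitivity, specificity, predictive values, accuracy) all derive from a single 2×2 confusion matrix of biomarker calls against observed tolerance. A sketch with illustrative counts chosen only to roughly reproduce the reported percentages — the trial's actual cell counts are not given here:

```python
# Hedged sketch: diagnostic-test metrics from a 2x2 confusion matrix.
# The tp/fp/fn/tn counts below are illustrative placeholders chosen to
# roughly match the reported percentages, not the trial's actual cells.

def diagnostics(tp, fp, fn, tn):
    """tp = biomarker-positive & tolerant, fp = positive & not tolerant,
    fn = negative & tolerant, tn = negative & not tolerant."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

m = diagnostics(tp=7, fp=36, fn=6, tn=26)
# sensitivity 7/13 ~ 54%, specificity 26/62 ~ 42%,
# ppv 7/43 ~ 16%, npv 26/32 ~ 81%, accuracy 33/75 = 44%
```

Note how a low PPV with a reasonable NPV, as reported, means a positive biomarker call carries little information while a negative call is moderately reassuring.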
Conclusions
In selected liver transplant recipients, immunosuppression withdrawal proved to be feasible, but was successful in a much lower proportion of patients than originally estimated. A previously validated liver tissue transcriptional biomarker test was not considered accurate in predicting the success of immunosuppression withdrawal.
Study registration
Current Controlled Trials ISRCTN47808000 and EudraCT 2014-004557-14.
Funding
This award was funded by the National Institute for Health and Care Research (NIHR) Efficacy and Mechanism Evaluation (EME) programme (NIHR award ref: 13/94/55) and is published in full in Efficacy and Mechanism Evaluation; Vol. 12, No. 3. See the NIHR Funding and Awards website for further award information.
Objectives
Uncertainty remains about many aspects of first-line treatment of autoimmune hepatitis (AIH).
Design
Systematic review with meta-analysis (MA).
Data sources
Bespoke AIH Endnote Library, updated to 30 June 2024.
Eligibility criteria
Randomised controlled trials (RCTs) and comparative cohort studies including adult patients with AIH, reporting death/transplantation, biochemical response (BR) and/or adverse effects (AEs).
Data extraction and synthesis
Data pooled in MA as relative risk (RR) under random effects. Risk of bias (ROB) assessed using Cochrane ROB-2 and ROBINS-1 tools.
Results
From seven RCTs (five with low and two with some ROB) and 18 cohort studies (12 moderate ROB, six high for death/transplant), we found lower death/transplantation rates in (a) patients receiving pred+/−aza (vs no pred): overall (RR 0.38 (95% CI 0.20 to 0.74)), in patients without symptoms (0.38 (0.19–0.75)), without cirrhosis (0.30 (0.14–0.65)), and with decompensated cirrhosis (RR 0.38 (0.23–0.61)), and (b) patients receiving pred+aza (vs pred alone) (0.38 (0.22–0.65)). Patients receiving higher (vs lower) initial pred doses had similar BR rates (RR 1.07 (0.92–1.24)) and mortality (0.71 (0.25–2.05)) but more AEs (1.73 (1.17–2.55)). Patients receiving bud (vs pred) had similar BR rates (RR 0.99 (0.71–1.39)), with fewer cosmetic AEs (0.46 (0.34–0.62)). Patients receiving mycophenolate mofetil (MMF) (vs aza) had similar BR rates (RR 1.32 (0.73–2.38)) and fewer AEs requiring drug cessation (0.20 (0.09–0.43)).
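The pooled relative risks above come from inverse-variance random-effects meta-analysis. A minimal DerSimonian–Laird sketch on the log-RR scale, with invented study estimates rather than the review's data:

```python
import math

# Hedged sketch of DerSimonian-Laird random-effects pooling on the
# log relative-risk scale. Study estimates below are invented placeholders.

def dersimonian_laird(log_rrs, variances):
    w = [1 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_star = [1 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# three hypothetical studies, each contributing log(RR) and its variance
rr, (lo, hi) = dersimonian_laird(
    log_rrs=[math.log(0.35), math.log(0.45), math.log(0.30)],
    variances=[0.10, 0.08, 0.15],
)
print(round(rr, 2), (round(lo, 2), round(hi, 2)))
```

When heterogeneity is low (Q below its degrees of freedom), the between-study variance is truncated to zero and the result coincides with the fixed-effect pooled estimate.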
Conclusions
Mortality is lower in pred-treated (vs untreated) patients, overall and in several subgroups, and in those receiving pred+aza (vs pred). Higher initial pred doses confer no clear benefit and cause more AEs. Bud (vs pred) achieves similar BR rates, with fewer cosmetic AEs. MMF (vs aza) achieves similar BR rates, with fewer serious AEs.
Fasting during the month of Ramadan is an obligatory religious practice for healthy adult Muslims. To complete a fast, individuals must abstain from eating, drinking, and taking medications from dawn to sunset. Individuals may be exempt from fasting during Ramadan on health grounds; however, some patients may still fast to fulfill their religious obligation, even if this means going against medical advice. Solid organ transplant recipients may have to follow strict fluid and electrolyte requirements, which could be challenging during Ramadan, leading to the concern that abstaining from fluid intake can cause prerenal acute kidney injury. Furthermore, transplant recipients must take their immunosuppression at prescribed intervals to preserve graft function; variability in drug levels runs the risk of graft rejection. Following a review of the current literature, a shared decision-making tool has been developed to assist clinicians in supporting patients who are motivated to fast during Ramadan. All recipients wishing to fast should undergo a risk assessment. Those in the low–moderate risk category may be able to fast safely following medication reviews and optimization of their immunosuppression regimens. In addition, they would benefit from monitoring of graft function, therapeutic drug levels, electrolytes, and additional parameters such as fluid status, weight, blood pressure, and concurrent management of comorbidities. Those stratified in the higher-risk categories should be encouraged to explore alternatives, such as Fidyah or winter fasting.
More liver transplants (LTs) are performed worldwide thanks to extended criteria donors (ECDs). This is paralleled by a presumed increased risk of allograft failure (AF) at 90 and 365 days. This study has been designed to portray LT practice worldwide and to investigate models of AF prediction and the impact of risk mitigation strategies for further improving graft and patient outcomes. This is a multicenter, international, non-competitive, observational, two-segment study of consecutive LTs over two periods (2017–2019 and 2022–2024). A steering committee of LT experts defined the study protocol. The prospective segment will enroll 750 patients from 15 high-volume LT centers (50 per center), and the retrospective segment will enroll 4200 patients from 56 LT centers (75 per center). The primary aim is to provide a snapshot of LT activity globally and to develop new algorithms for the timely prediction of AF at 90 and 365 days post-LT. The study also aims (1) to validate the existing predictive models and (2) to investigate the best time for re-transplantation, paying attention to differences in AF and ischemic cholangiopathy according to donor types and the mitigation strategies implemented in the various settings. Since the adoption of machine perfusion has increased in different proportions worldwide, models will be adjusted according to this parameter. Finally, retrospective and prospective data will be available for further stratification and modelling according to the degree of decompensation at transplant, gender match, postoperative complications and their management. This protocol was approved by the Fondazione Policlinico Universitario Agostino Gemelli IRCCS Ethics Committee (study ID: 4571) and the Institutional Review Board of the University of California, Los Angeles. The provisional study protocol was submitted to the main international scientific societies in the transplant field.
Results will be published in international peer-reviewed journals and presented at congresses.
Information
Address
Birmingham, United Kingdom