University Hospitals Birmingham NHS Foundation Trust
Recent publications
Falls among older adults pose a significant public health challenge, as they lead to severe outcomes such as fractures and loss of independence. Research has shown that training cognitive function and balance simultaneously, termed dual-task (DT) training, improves mobility and reduces fall risk in older adults. This study aims to evaluate the feasibility and acceptability of a blended supervised and self-directed technology-based DT training programme for older adults at high risk of falling. This is a single-arm, non-randomised feasibility study employing quantitative and qualitative methods. Fifty healthy adults aged 65 years or above will be recruited from NHS primary and secondary care pathways and from the community. Participants will undergo supervised cognitive and balance DT training for 12 weeks, followed by self-directed DT training for an additional 12 weeks. The cognitive training will be delivered using a commercial mobile application (app) available from the App Store or Google Play. The balance training will involve static (marching on the spot, tandem stand, hip abduction and extension, squats, tiptoe stand, and pendulum/sideways sway) and dynamic (figure-of-eight walk, walking forwards and backwards, lunges, functional reach, toe tapping, upper limb strength exercises, and side-steps/simple grapevine) exercises focused on improving balance, postural stability and strength. Feasibility outcomes will be recruitment, adherence, usage of the app, and attrition. Outcome measure data, collected at baseline and at 24 weeks, will include the Timed Up and Go (TUG) test (the likely primary outcome in any future trial), along with self-reported questionnaires assessing cognition, fear of falling, quality of life, healthcare service usage, and the self-reported number of falls.
Focus group interviews will be conducted with thirty participants and thirty healthcare professionals for in-depth exploration of the feasibility and acceptability of the DT training programme.
Purpose To evaluate whether the position of the transtibial centralisation tunnel (TTC), on the background of an anatomical transtibial pull-through root repair (ATPR), affects the tibiofemoral contact mechanics and meniscal extrusion for medial meniscus posterior root tears (MMPRT). Methods Meniscal extrusion and contact mechanics were measured using two-dimensional imaging and pressure films in 10 porcine knee joints. The posterior root was tested under six states: (1) intact; (2) MMPRT; (3) ATPR; (4) ATPR with TTC at the posterior horn (TTC-PH); (5) ATPR with TTC midway between the PH and the posterior border of the medial collateral ligament (MCL) (TTC-MID); and (6) ATPR with TTC behind the MCL (TTC-MCL). The testing protocol loaded knees with 200 N axial compression at four flexion angles (30°, 45°, 60° and 90°). At each angle and state, meniscal extrusion was measured as the difference between the meniscal position under load and that of the unloaded condition in the intact state. Contact area and pressure were recorded for all states at all angles and were analysed using a MATLAB programme. Results ATPR + TTC-PH led to a greater reduction in extrusion compared to both ATPR and ATPR + TTC-MCL at 60° and 90° (p < 0.02 and p < 0.05, respectively). ATPR + TTC-PH improved contact area compared to ATPR at 60° (p = 0.037) and 90° (p = 0.014), and to ATPR + TTC-MCL at 90° (p = 0.042). ATPR + TTC-MID improved contact area compared to ATPR at 90° (p = 0.035). ATPR + TTC-PH reduced peak contact pressure compared to ATPR at 45° (p = 0.046) and 60° (p = 0.019), and to ATPR + TTC-MCL at 60° (p = 0.040). The intact meniscus, TTC-PH and TTC-MID repair states performed similarly across all angles with regard to contact mechanics. Conclusion Combining ATPR with TTC-PH provides the most appropriate biomechanical properties in reducing extrusion and improving contact mechanics following an MMPRT in porcine knees. Level of Evidence Not applicable (laboratory study).
Preventing pressure injuries (PIs) remains the most effective way to reduce their burden. A key element of prevention is the assessment of PI risk. The study aimed to investigate whether guidance documents relevant to the United States (US) advocated for specific risk assessment recommendations. We conducted a systematic review of guidance documents published between 2010 and 2024. Embase, Medline, CINAHL, and four key organisational websites were systematically searched to retrieve relevant articles. Two independent reviewers screened the articles for inclusion. One reviewer extracted the data, and a second reviewer checked all extracted data. Three reviewers assessed the quality of the guidance documents using the Appraisal of Guidelines for Research & Evaluation (AGREE II) tool. A narrative synthesis was used to describe and summarise findings. Six clinical practice guidelines (CPGs) and eight other best practice recommendations were included. The median scores for most AGREE II domains were higher for CPGs than for other best practice recommendations. Risk assessment was consistently positioned as a critical first step in the prevention of PIs, emphasising its role in identifying at-risk individuals and informing targeted interventions. Although risk assessment was presented as a crucial step in PI prevention, there was no clear and unanimous recommendation for a specific risk assessment strategy across all guidance documents, either for the general population or for specific subgroups of patients in US healthcare settings. These findings suggest a need for national consensus on concepts, implementation, and language addressing PI risk assessment.
Introduction Recruitment and training are vital to maintaining the size, deployability and effectiveness of armed forces, but were threatened early in the COVID-19 pandemic. Reports suggested asymptomatic seroconversion was driving SARS-CoV-2 transmission in young adults. A potential association between lower vitamin D status and increased infection risk was also highlighted. We aimed to prospectively determine seroconversion and test the hypothesis that this would vary with vitamin D supplementation in representative populations. Methods Two cohorts were recruited from Yorkshire, Northern England. Infantry recruits received daily oral vitamin D (1000 IU for 4 weeks, followed by 400 IU for the remaining 22 weeks of training) alongside institutional countermeasures to facilitate ongoing training/co-habitation. Controls were recruited from an unsupplemented university population subject to social distancing and household restrictions. Venous blood samples (baseline and week 16) were assayed for vitamin D and anti-SARS-CoV-2 spike glycoprotein antibodies, with additional serology (weeks 4, 9, 12) by dried blood spot. The impact of supplementation was analysed on an intention-to-treat basis in volunteers completing testing at all time points and remaining unvaccinated against SARS-CoV-2. Variation in seroconversion with vitamin D change was explored across, and modelled within, each population. Results In the military (n=333) and university (n=222) cohorts, seroconversion rates were 44.4% vs 25.7% (p=0.003). At week 16, military recruits showed higher vitamin D levels (60.5±19.5 nmol/L vs 53.5±22.4 nmol/L, p<0.001), despite <50% supplementation adherence. A statistically significant effect of percentage change in vitamin D on seroconversion was observed in recruits (OR 0.991, 95% CI 0.984 to 0.997; p=0.005) but was not evidenced in the university cohort.
Conclusion Among unvaccinated populations, SARS-CoV-2 infection of infantry recruits was not reduced by institutional countermeasures compared with civilians subject to national restrictions. Vitamin D supplementation improved serum levels but, as implemented, did not have a clinically meaningful impact on seroconversion during military training.
Background Asthma is a heterogeneous disease characterized by overlapping clinical and inflammatory features. Objective This study aimed to provide insight into the systemic inflammatory profile in asthma, a greater understanding of asthma endotypes and the contribution of genetic risk factors to both. Methods 4205 patients with asthma aged 16–60 years were recruited from UK centers; serum cytokines were quantified in 708, including cytokines associated with Type 1, 2 and 17 inflammation. 3037 patients were genotyped for 25 single nucleotide polymorphisms associated with moderate-severe asthma. Results Serum cytokines associated with Th2 inflammation showed highly coordinated expression, for example IL-4/IL-5 (R² = 0.513). The upper quartile of the serum cytokine data identified that 43.7% of patients had high levels of multiple Th2 cytokines. However, the groups defined by serum cytokine profile were not clinically different. Childhood-onset asthma was characterized by elevated total IgE, allergic rhinitis and dermatitis. Exacerbation-prone patients had a higher BMI, more smoking pack-years, a higher asthma control questionnaire score and reduced lung function. Patients with blood eosinophils of >300 cells/µL had elevated total IgE and fewer smoking pack-years. None of these groups had a differential serum cytokine profile. The asthma risk alleles rs61816764 (FLG) and rs9303277 (IKZF3) were associated with childhood-onset disease (p = 2.67 × 10⁻⁴ and 2.20 × 10⁻⁷, respectively). No genetic variant was associated with cytokine levels. Conclusion Systemic inflammation in asthma is complex. Patients had multiple overlapping inflammatory profiles suggesting several disease mechanisms. Genetic risk factors for moderate-severe asthma confirmed previous associations with childhood onset of asthma.
Skin cancer is a global health crisis and a leading cause of morbidity and mortality worldwide. A leading driver of malignancy remains UV radiation, which induces various biomolecular changes. With shifting population behaviors, deficiencies in screening programs, reliance on self-presentation, climate change and an ageing world population, global incidence has been surging alarmingly. There is an urgent need for new technologies to achieve timely intervention through rapid and accurate diagnosis of skin cancer. Raman spectroscopy is emerging as a highly promising analytical technology for diagnostic applications, poised to outpace current costly, invasive and slow procedures, which are frequently hindered by varying sensitivity, specificity and lack of portability. Herein, progress is overviewed and consolidated across the medical and engineering disciplines, with a focus on the latest advances in traditional and emerging skin cancer diagnostics. Methods detecting structural and chemical responses are categorized, along with emerging chemo-biophysical sensing techniques. Particular attention is drawn to Raman spectroscopy as a non-invasive, rapid and accurate means of sensing molecular fingerprints in the dermatological matrix, with an additional focus on artificial intelligence as a decision-support tool; collectively, these lay the platform for the development and rapid translation of point-of-care diagnostic technologies for skin cancer to real-world applications.
Background Medical education employs diverse teaching strategies, including blending lecture-based learning, small-group teaching (SGT) and, increasingly, simulation-based learning. Nonetheless, limitations in clinical application and participation persist. Simulation via Instant Messaging for Bedside Application (SIMBA) complements these methods by simulating real-world clinical scenarios. This pilot study compares SIMBA's effectiveness with SGT in endocrine topics for medical and pharmacy students. Methods The SIMBA for students model was developed using Kern's six-step framework. SIMBA sessions, facilitated by trained moderators and senior experts, simulated outpatient consultations via WhatsApp. The study included SIMBA and SGT sessions from October 2020 to March 2022. Teaching effectiveness was assessed through post-session surveys and multiple-choice questions (MCQs). The study compared the MCQ scores and student satisfaction of the SIMBA, SGT and combined SIMBA + SGT cohorts. Results One hundred and thirty (103 medical and 27 pharmacy) students participated in 14 SIMBA sessions, and 150 students responded to the post-SGT survey, with 38 attending both. Median MCQ scores were higher post-SIMBA (75.0%) than post-SGT (60.0%) (p < 0.0001). No significant difference was observed between SIMBA and SIMBA + SGT scores, or between SGT and SIMBA + SGT scores. SIMBA sessions were perceived as enjoyable (89.2%), intelligible (90.8%) and engaging (81.5%), promoted new knowledge (90.0%) and enhanced comprehension (93.9%). Overall, 83.1% of students wanted SIMBA to complement SGT. Conclusions SIMBA demonstrated superior knowledge gain and student satisfaction compared to SGT. Its familiar technology and interactive format suit modern learning, offering a standardised and equitable experience. Integrating SIMBA into the curriculum could help overcome teaching limitations and better prepare students for clinical practice.
Introduction: Abnormal breathing patterns unexplained by pathophysiology are typically referred to using terms including chronic breathlessness syndrome or complex breathlessness. Often patients with these conditions are referred to physiotherapy for an assessment of this breathlessness, where some are diagnosed with breathing pattern disorder (BrPD) or dysfunctional breathing (DB). The condition seen in physiotherapy occurs in at least 10% of the general population, increasing to 29–40% with coexisting conditions. Inconsistency in the nomenclature and physiotherapy assessment reduces recognition of the condition and hinders development in this area. Aims of the study: To establish expert physiotherapists' consensus on terminology to describe this condition and provide guidance for its physiotherapy assessment. Participants and methods: The opinions and experiences of ten respiratory physiotherapists, nine other clinicians (doctors, nurses, and speech and language therapists), and five patients diagnosed with BrPD were explored in focus groups or interviews regarding the terminology used and assessment experience. A second, separate purposive sample of clinical expert physiotherapists (n = 11) took part in a nominal group technique (NGT) process to build consensus on the following questions: Question 1: What is your preferred term for this condition? Question 2: What are the most important assessment components to be included in all assessments? Results: One focus group (n = 10) and 14 interviews were completed. Framework analysis of the data from focus groups and interviews was undertaken and the results were shared with the participants in the nominal group. Consensus (71%) for the term breathing pattern disorder (BrPD) was achieved and an assessment guide was created.
Conclusion: With improved consistency in its description and assessment, the adoption of breathing pattern disorder may help to further develop clinical and research priorities in this area within physiotherapy services.
Background The transition from pre-clinical to clinical teaching is often a time of heightened anxiety for students. With the shift to bi-modal teaching during the pandemic, there was an opportunity to explore the use of 360-degree videos and virtual reality (VR) simulation teaching to enhance the educational experience and smooth the transition from pre-clinical to clinical teaching. The aim of this study was to understand students' perceptions of face-to-face and virtual simulation teaching during the recovery phase of the COVID-19 pandemic. Methods Two groups of students were recruited, all of whom were about to have their clinical introduction to the periodontology department. All 20 students received the current standard induction programme. One group (n = 7) received standard teaching only; the other group (n = 13) also received access to a 360-degree video and VR headset prior to standard teaching. Focus groups were then conducted with the students. A topic guide was developed and piloted. Focus groups were conducted online; audio was recorded and transcribed verbatim. Transcripts were analysed, and codes and themes were developed using thematic analysis as the framework for analysing the focus groups. Results The three key themes identified were: the importance of familiarity with the clinical environment, preparation prior to attending clinical sessions, and the benefit of practical experience. Conclusion This study demonstrates how 360-degree videos and VR technology may enhance dental education, provided they are implemented appropriately and at the correct time in training. Overall, students had a positive attitude towards using 360-degree videos and acknowledged their value in meeting a range of learning objectives, including infection control, IT training, and clinic orientation.
Background Care following transient ischaemic attack (TIA) and minor stroke is variable and often leaves patients feeling abandoned and uncertain. We developed a theoretically-informed, multifaceted intervention which comprised nurse-led, structured follow-up at 4 weeks after TIA/minor stroke to identify and address patient needs. This study evaluated the feasibility and acceptability of both the intervention and procedures to inform a future randomised controlled trial. Method We conducted a multicentre, randomised feasibility study with mixed-methods process evaluation (ISRCTN registry reference: ISRCTN39864003). We collected patient reported outcome measures (PROMs) at 1, 12 and 24 weeks and clinical data at baseline and 24 weeks. The process evaluation comprised qualitative interviews with a sub-sample, feedback questionnaires, and observations of intervention delivery. Results We recruited 54 patients over 12 months, achieving 90% of the target sample size (n = 60). PROMs return rates were 94.4% (51/54), 85.2% (46/54) and 71.1% (27/38) at 1, 12, and 24-weeks, respectively. Intervention fidelity was high and the intervention largely aligned with the theoretical underpinnings. The process evaluation illustrated how patients benefitted from the intervention through support they would not have received through usual care. This included direct referral or signposting to support services, information and education, actionable advice, and reassurance about and normalisation of recovery. The trial design was feasible and acceptable for both patients and clinicians. Conclusion Nurse-led, structured follow-up after TIA and minor stroke is feasible, acceptable and valued by patients and clinicians. Our intervention can identify and help address unmet needs. A definitive randomised trial to evaluate intervention effectiveness and cost-effectiveness is feasible and acceptable.
Introduction Few UK studies have explored the epidemiology of postoperative acute kidney injury after diverse types of elective major non-cardiac surgery. Fewer still have compared postoperative acute kidney injury risk factors with those of conditions, such as peri-operative myocardial injury, that might have similar pathophysiology. This study aimed to characterise postoperative acute kidney injury and its clinical consequences in elective major non-cardiac surgery, and to assess risk factors for postoperative acute kidney injury, including those related to peri-operative myocardial injury. Methods All elective major non-cardiac surgical episodes occurring between 2015 and 2020 were identified retrospectively. Patients without measured peri-operative renal parameters were not studied. Our primary outcome was the 7-day postoperative acute kidney injury rate, defined using Kidney Disease Improving Global Outcomes criteria. Multivariable logistic regression modelling was used to assess risk factors for postoperative acute kidney injury. Results Postoperative acute kidney injury occurred in 1334/13,790 (9.7%) episodes, with 663 (49.7%) occurring on day 1. Postoperative acute kidney injury was associated with increased peri-operative complications (OR 1.8, 95%CI 1.6–2.1, p < 0.001), unanticipated critical care admissions (OR 2.4, 95%CI 1.6–3.5, p < 0.001) and in-hospital mortality (OR 8.0, 95%CI 5.1–12.5, p < 0.001). Independent risk factors for postoperative acute kidney injury included: raised creatinine; hypertension; anaemia; platelet:lymphocyte ratio; heart rate; male sex; renin-angiotensin-aldosterone system blockade; and intra-abdominal surgery. Discussion Postoperative acute kidney injury is common and is associated with adverse outcomes. Prevalence peaks initially within the first 48 h, with a secondary rise seen from day 5 onwards, suggesting a different aetiology.
It is determined by a combination of patient and surgical risk factors, with the former relating to physiological, rather than chronological, renal age. In common with peri‐operative myocardial injury, postoperative acute kidney injury is independently associated with factors affecting autonomic tone and myeloid skewing.
Internationally, reference dosimetry for clinical proton beams largely follows the guidelines published by the International Atomic Energy Agency (IAEA) in TRS-398 Rev. 1 (2024). This approach yields a relative standard uncertainty of 1.7% (k = 1) on the absorbed dose to water determined under reference conditions. The new IPEM code of practice presented here enables the relative standard uncertainty on the absorbed dose to water measured under reference conditions to be reduced to 1.0% (k = 1). This improvement is based on the absorbed dose to water calibration service for proton beams provided by the National Physical Laboratory (NPL), the UK's primary standards laboratory. This significantly reduced uncertainty is achieved through the use of a primary standard level graphite calorimeter to derive absorbed dose to water directly in the clinical department's beam. This eliminates the need for the beam quality correction factors (kQ,Q0) required by the IAEA TRS-398 approach. The portable primary standard level graphite calorimeter, developed over a number of years at the NPL, is sufficiently robust to be usable in the proton beams of clinical facilities both in the UK and overseas. The new code of practice involves performing reference dosimetry measurements directly traceable to the primary standard level graphite calorimeter in a clinical proton beam. Calibration of an ionisation chamber is performed in the centre of a standard test volume (STV) of dose, defined here to be a 10 × 10 × 10 cm volume in water centred at a depth of 15 cm. Further STVs at reduced and increased depths are also utilised. The designated ionisation chambers are Roos-type plane-parallel chambers. This article provides all the necessary background material, formalism, and specifications of reference conditions required to implement reference dosimetry according to this new code of practice.
The Annexes provide a detailed review of ion recombination and how this should be assessed (Annex A1) and detailed work instructions for creating and delivering the STVs (Annex A2).
Objective To compare the effects of acepromazine and medetomidine pre-medication on surgical duration and complication rate. Study Design A randomised, prospective clinical study. Animal or Sample Population Thirty-two entire female dogs. Methods Thirty-two entire female dogs undergoing elective laparoscopic ovariectomy (lap-ove) were randomly assigned to pre-medication Group M (medetomidine, n = 20) or Group A (acepromazine, n = 12). The anaesthesia protocol was standardised and monitored by one anaesthetist. The surgeons were blinded to group allocation. Time was recorded at predetermined intra-operative points. Surgical difficulty was subjectively assessed intra-operatively, as well as objectively scored following review of the surgical recordings. Procedural complications were recorded. Results The mean ± SD surgical time from skin incision to removal of both ovaries was 11.8 ± 1.8 minutes and did not differ significantly between the two groups (M: 11.7 ± 1.6, A: 11.9 ± 2.3). Across the entire study population, surgical difficulty was assessed as 'easy' in 14 dogs (44%), 'medium' in 14 dogs (44%) and 'marked' in four dogs (12%), while the complication rate was 12.5% in both groups. There were no significant differences in difficulty or complication rate between groups. Conclusion Choice of pre-medication did not significantly affect any of the outcomes measured. Clinical Significance Acepromazine may be used for sedation as part of a balanced premedication protocol, as an alternative to medetomidine, in dogs undergoing laparoscopic ovariectomy without significantly increasing procedure time, surgical difficulty or complication rate.
There are no current stratified medicine options for STK11-deficient NSCLC. STK11 loss mediates mTORC activation, GLUT1 up-regulation and increased glycolysis. This metabolic reprogramming might represent a therapeutic vulnerability targetable with mTORC1/2 inhibition. In arm B2 of the National Lung Matrix Trial, 54 patients with NSCLC received vistusertib, of whom 49 were STK11-deficient (30 with KRAS mutation (B2D), 19 without (B2S)). Objective response (OR) and durable clinical benefit (DCB) rates with 95% credible intervals (CrI) were estimated from posterior probability distributions generated using Bayesian beta-binomial conjugate analysis. In B2D, 2 per-protocol patients obtained an OR (estimated true OR rate (95% CrI): 9.8% (2.4–24.3)). Estimates of the true DCB rate (95% CrI): B2D 24.4% (11.1–42.3), B2S 14.6% (3.6–34.7). Overall, vistusertib cannot be recommended in this context. Longitudinal ctDNA analysis demonstrates enrichment of SMARCA4 mutations post-treatment. In vitro studies show adaptive resistance to mTORC1/2 inhibition via AKT reactivation. (NCT02664935, ISRCTN38344105, EudraCT 2014-000814-73, 10 June 2015)
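The Bayesian beta-binomial conjugate analysis described in this abstract has a closed-form posterior. As a minimal illustrative sketch (not the trial's actual code, and assuming a uniform Beta(1, 1) prior, which the abstract does not specify), the true response rate and a credible interval can be estimated with only the Python standard library, using Monte Carlo sampling from the posterior:

```python
import random

def beta_binomial_posterior(successes, n, a0=1.0, b0=1.0,
                            draws=100_000, level=0.95, seed=1):
    """Conjugate update: a Beta(a0, b0) prior with `successes` events in
    n trials gives a Beta(a0 + s, b0 + n - s) posterior. The credible
    interval is estimated by Monte Carlo sampling (stdlib only)."""
    rng = random.Random(seed)
    a, b = a0 + successes, b0 + (n - successes)
    samples = sorted(rng.betavariate(a, b) for _ in range(draws))
    lo = samples[int(draws * (1 - level) / 2)]
    hi = samples[int(draws * (1 + level) / 2)]
    return a / (a + b), (lo, hi)  # posterior mean and credible interval

# Illustrative only: 2 objective responses among 30 per-protocol B2D
# patients; the prior used in the trial's analysis is not stated.
mean, (lo, hi) = beta_binomial_posterior(2, 30)
```

With a uniform prior this gives a posterior mean of 3/32 ≈ 9.4% and an interval close to the reported 2.4–24.3%; the abstract's 9.8% point estimate suggests a different prior or posterior summary was used in the actual analysis.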
Patients with myelodysplasia-related AML (MR-AML) are now genetically recategorized into three groups in the International Consensus Classification: AML with mutated TP53 (TP53-AML), AML with myelodysplasia-related gene mutations (MR-GM AML), and AML with myelodysplasia-related cytogenetic abnormalities (MR-CG AML). Moreover, TP53-AML is further subdivided by the presence or absence of a complex karyotype (TP53-mut CK and TP53-mut non-CK AML, respectively). Nonetheless, the relevance of this classification to transplantation outcomes is largely unknown. We analyzed the outcomes of patients with MR-AML undergoing allogeneic hematopoietic cell transplantation in first complete remission between 2010 and 2022 according to these genetic categories. Overall, 1152 patients were identified: 379 (33%), 328 (28%), 246 (21%), and 199 (17%) with MR-GM, TP53-mut CK, MR-CG, and TP53-mut non-CK AML, respectively. Median age was 60 years; the median year of transplant was 2020. Unrelated donors and reduced-intensity conditioning were used in 65% and 61% of cases, respectively. Outcomes differed markedly among the genetic categories, with an increasing relapse incidence (20.2%, 29.2%, 44.6%, and 57.6% at 2 years), and decreasing LFS (60%, 55.3%, 40.6%, and 20.2% at 2 years), overall survival (65.7%, 60.1%, 47.1%, and 24.5% at 2 years), and graft-versus-host disease-free, relapse-free survival (46.9%, 39.5%, 31.9%, and 13.2% at 2 years) in MR-GM, MR-CG, TP53-mut non-CK, and TP53-mut CK AML, respectively. These differences were confirmed in multivariate analysis (hazard ratios for LFS: 0.21, 0.33 and 0.61 for MR-GM, MR-CG, and TP53-mut non-CK, respectively, with respect to the reference TP53-mut CK AML group). This study confirms the strong impact of genetic grouping of MR-AML on transplant outcomes.
Background Survival rates after a diagnosis of cancer are improving. Poorly managed gastrointestinal (GI) side effects can interfere with delivery of curative cancer treatment. Long-term physical side effects of cancer therapy impinge on quality of life in up to 25% of those treated for cancer, and GI side effects are the most common and troublesome. Aim To provide comprehensive, practical guidance on the management of acute and chronic luminal gastrointestinal symptoms arising during and after treatment for cancer. Methods A multidisciplinary expert group, including patients treated for cancer, was divided into working parties to identify and synthesise recommendations for the optimal assessment, diagnosis and appropriate interventions for luminal GI side effects of systemic and local cancer therapies. Recommendations were developed using the principles of AGREE II reporting. Results 103 recommendations were agreed. The importance of the patient perspective and what can be done to support patients are emphasised. Key physiological principles underlying the development of GI toxicity arising from cancer therapy are outlined. Individual symptoms or symptom clusters are poor at distinguishing the underlying cause(s), and investigations are required if empirical therapy does not lead rapidly to significant benefits. Patients frequently have multiple GI causes for symptoms; all need to be diagnosed and optimally treated to achieve resolution. Investigations and management approaches now known to be ineffective or of questionable benefit are highlighted. Conclusions The physical, emotional and financial costs to individuals, their families and society from cancer therapy can be considerable. Identifying and signposting affected patients who require specialist services is the role of all clinicians.
Progress in the treatment of cancer increasingly means that patients require expert, multidisciplinary supportive care providing effective and safe treatment at every stage of the cancer journey. Development of such expertise should be prioritised, as should the education of health professionals and the public in what, when and how acute and chronic gastrointestinal symptoms and complications should be managed.
Traditionally, postnatal depression (PND) has been considered as depression in the first year after giving birth, although it has been argued that the 12-month cut-off may be somewhat arbitrary. Specialist perinatal mental health services in England have recently been extended to include women in their second year postpartum; however, there is no good estimate for the prevalence of PND beyond the first year. This review aimed to obtain the best estimate of the prevalence of PND in the second postpartum year. Eligible studies were those that assessed PND and provided a point prevalence using a validated screening tool or clinical diagnosis at least once beyond the first 12 months in women over the age of 18 years in any country. Studies were excluded if they only included women who were already depressed or had elevated depression scores at baseline. PubMed, Embase, Web of Science, CINAHL and PsycINFO were searched in January 2021 (and updated in February 2024) for studies that included the prevalence of PND beyond the first 12 postnatal months. Study quality was assessed using Cochrane's ROBINS-I and Risk of Bias 2 tools. Prevalence data were combined in meta-analysis using prediction intervals (PIs). A total of 6340 papers were found, and of these, 32 studies including 57,210 participants across 18 countries met the inclusion criteria and were meta-analysed. The prevalence of PND in the second year (13–24 months) was 15% (95% confidence interval [CI] 12%, 17%; 95% PI 4%, 30%) and similar to that in the first year, 16% (95% CI 14%, 19%; 95% PI 6%, 31%). Despite considerable heterogeneity, common in meta-analysis of prevalence studies, findings show that a similar proportion of women experience PND in the second year after birth.
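The prediction intervals reported in this review follow standard random-effects meta-analysis logic: unlike a confidence interval for the pooled mean, a prediction interval also incorporates the between-study variance (tau²). A minimal sketch (not the authors' code; all numeric inputs below are hypothetical, and a normal quantile is used in place of the t distribution and transformed scales typically used in practice):

```python
import math

def approx_prediction_interval(mu, se, tau2, z=1.96):
    """Approximate 95% prediction interval for the true effect in a new
    study: mu +/- z * sqrt(tau^2 + se^2). Real meta-analyses usually use
    a t quantile with k-2 degrees of freedom and pool prevalences on a
    transformed (e.g. logit) scale; plain proportions are shown here
    purely for illustration."""
    half_width = z * math.sqrt(tau2 + se ** 2)
    return mu - half_width, mu + half_width

# Hypothetical inputs, not taken from the review:
lo, hi = approx_prediction_interval(mu=0.15, se=0.013, tau2=0.004)
```

Because tau² adds to the squared standard error under the square root, the prediction interval is always at least as wide as the corresponding confidence interval, which is why the review's PIs (4–30%) are much wider than its CIs (12–17%).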
1,370 members
Dougall McCorry
  • neurosciences
Melanie Field
  • Renal Transplant Surgery and Vascular Access
Ewen A Griffiths
  • Upper Gastrointestinal Surgery
Adel H Mansur
  • Department of Respiratory Medicine
Susan Mollan
  • Department of Ophthalmology
Information
Address
Birmingham, United Kingdom