Recent publications
Background
Interactive electronic devices (IEDs) are ubiquitous in young children's lives. However, research on their impact on learning and development is still limited. The aim of this study was to understand the perspectives of early years practitioners (EYPs) and public health consultants (PHCs) on the use of IEDs in children aged 3–5.
Methods
Using purposive sampling techniques, we recruited four EYPs and two PHCs from children's nurseries and a government organisation in the northwest of England. Semi-structured interviews were used to collect data, which were audio-recorded, transcribed verbatim and anonymised. Data were analysed using reflexive thematic analysis.
Results
EYPs and PHCs noted that although IEDs could negatively impact child development and behaviour, they could also aid learning. EYPs expressed concerns about the impact of parents' own IED habits on children's communication and social skills. PHCs, meanwhile, stressed that substituting outdoor play with these devices could affect children's social and physical skills and reduce physical activity levels, which are crucial for development. Finally, both EYPs and PHCs agreed on the need to improve parents' and EYPs' knowledge and to develop interactive interventions that promote an understanding of how IEDs should be used with young children.
Conclusion
EYPs and PHCs acknowledge the potential advantages of using IEDs as a teaching tool for children. However, they have concerns about the long‐term effects on communication, social and physical skills and how children are impacted by their parents' use of these devices. To support policy statements, future research should offer further evidence of the benefits and harms of IED use.
The Cape Vulture (Gyps coprotheres) has the smallest range of any vulture species in Africa, Europe or Asia and is substantially impacted by anthropogenic factors because of its low productivity and long maturation time. Almost year-round presence at breeding colonies makes understanding Cape Vulture breeding behaviour essential for the species' conservation. Camera traps, a first for this species, were used to investigate the effect of time of day and temperature on presence and behaviours at nest sites. The number of vultures at a nest site was likely to be higher during the early morning and late afternoon, when temperatures were lower, with significantly fewer individuals present at higher temperatures in the days before and after laying. Attendance of at least one adult at the nest was recorded for 86.9% and 99.8% of time points in the days before and after laying, respectively. Almost-constant attendance during incubation may also be necessitated by predation pressure, and this study provides observations of possible predation pressure from White-necked Ravens (Corvus albicollis) at the colony. We demonstrate that camera trapping is an effective method of studying Cape Vulture breeding behaviour, improving understanding and allowing more informed conservation measures to be implemented. Because Cape Vulture nest attendance is affected by temperature, this species may be vulnerable to climatic changes and subsequent changes in predation pressure.
A variety of medical specialities undertake percutaneous drainage, but understanding of device performance outside radiology is often limited. Furthermore, current catheter sizing using the "French" measurement of outer diameter is unhelpful: it does not reflect the internal diameter and gives no information on flow rate. To illustrate this, and to improve catheter selection, notably for chest drainage, we assessed the variation in drain performance under standardised conditions. Internal diameters and flow rates of 6Fr.-12Fr. drainage catheters from 8 manufacturers were tested to the ISO 10555-1 standard: internal diameters were measured with Meyer calibrated pin-gauges, and flow rates were calculated over a period of 30 s after achieving steady state. Evaluation demonstrated a wide range of internal diameters for the 6Fr., 8Fr., 10Fr. and 12Fr. catheters: mean measurements were 1.49 mm (SD: 0.07), 1.90 mm (SD: 0.10), 2.43 mm (SD: 0.11) and 2.64 mm (SD: 0.03) respectively, and mean flow rates were 128 mL/min (SD: 37.6), 207 mL/min (SD: 55.1), 291 mL/min (SD: 36.7) and 303 mL/min (SD: 20.2) respectively. Variance was such that catheters of different nominal sizes overlapped: thin-walled 10Fr. drains performed better than 12Fr. "Seldinger" chest drains. Better understanding of drain characteristics and better declaration of performance data by manufacturers are required to allow optimum drain choice for individual patients and optimum outcomes.
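To make the sizing problem concrete, here is a minimal sketch (our illustration, not part of the study's analysis): French size fixes the outer diameter at 1 Fr = 1/3 mm, while idealised Hagen-Poiseuille flow scales with the fourth power of the internal radius, so wall thickness, not French size, drives flow. The values below reuse the mean internal diameters reported above; measured flows will deviate from this laminar scaling because they also depend on drain length and test conditions.

```python
# Hedged sketch (not the study's analysis) of why French sizing misleads.
def outer_diameter_mm(french: float) -> float:
    # French size measures *outer* diameter: 1 Fr = 1/3 mm.
    return french / 3.0

def relative_poiseuille_flow(inner_d_mm: float, ref_inner_d_mm: float) -> float:
    # Q is proportional to r**4 at fixed length, viscosity and pressure
    # drop; this idealised laminar scaling ignores drain length and geometry.
    return (inner_d_mm / ref_inner_d_mm) ** 4

# Mean internal diameters reported above (mm), keyed by French size.
for fr, d in [(6, 1.49), (8, 1.90), (10, 2.43), (12, 2.64)]:
    print(f"{fr}Fr: OD {outer_diameter_mm(fr):.2f} mm, ID {d:.2f} mm, "
          f"Poiseuille flow vs 6Fr ~ {relative_poiseuille_flow(d, 1.49):.1f}x")
```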
Hardware systems customized to the demands of graph neural network learning would promote efficient and strong temporal processing of graph-structured data. However, most amorphous/polycrystalline oxide-based memristors have unstable conductance regulation due to random growth of conductive filaments. In contrast, graph neural networks based on robust, epitaxial film memristors can markedly improve energy efficiency thanks to their high endurance and ultra-low power consumption. Here, robust epitaxial Gd:HfO2-based film memristors are reported and used to construct a weighted echo state graph neural network (WESGNN). Benefiting from the optimized epitaxial films, the high switching speed (20 ns), low energy consumption (2.07 fJ), multi-value storage (4 bits), and high endurance (10⁹ cycles) outperform most memristors. Notably, thanks to the appropriately dispersed conductance distribution (standard deviation = 7.68 nS), the WESGNN finely regulates the relative weights of the input nodes and the recursive matrix to achieve state-of-the-art performance on the MUTAG and COLLAB datasets for graph classification tasks. Overall, robust epitaxial film memristors offer nanoscale scalability, high reliability, and low energy consumption, making them energy-efficient hardware solutions for graph learning applications.
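For readers unfamiliar with echo state computation, the sketch below shows a generic echo-state-style update (a conceptual illustration under our own assumptions, not the WESGNN architecture itself): the memristor array would hold the input and recurrent weight matrices, and the scalars alpha and beta stand in for the "relative weights of input nodes and recursive matrix" that the paper tunes.

```python
# Generic echo-state-style reservoir update (conceptual sketch only).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 8, 64
W_in = rng.normal(size=(n_res, n_in))    # input weights (stored conductances)
W_res = rng.normal(size=(n_res, n_res))  # recurrent "recursive matrix"
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # echo state property

alpha, beta = 0.7, 0.3  # assumed relative input/recurrent weightings
x = np.zeros(n_res)
for u in rng.normal(size=(10, n_in)):    # a toy input sequence
    x = np.tanh(alpha * (W_in @ u) + beta * (W_res @ x))
print(x[:5])  # the reservoir state a readout layer would be trained on
```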
The effect of a first-order exothermic chemical reaction on the mixed flow in a vertical channel filled with a porous medium has been studied using the local thermal non-equilibrium model. A mathematical model including the exponential heat-generation term in the fluid phase and interphase heat-transfer terms in both the fluid and solid phases was considered. The dimensionless model was solved numerically using the MATLAB routine bvp4c for the energy equations, together with an in-house code for the momentum equation. The existence of dual solutions is reported for certain values of the Frank-Kamenetskii number; the corresponding temperature and velocity profiles are presented, together with the ranges over which the dual solutions exist.
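The dual-solution behaviour can be illustrated with the closely related one-dimensional Frank-Kamenetskii (Bratu) problem, θ'' + FK·e^θ = 0 with θ(0) = θ(1) = 0, which admits two solutions for FK below a critical value (about 3.51). The sketch below is our illustration only (the authors used MATLAB's bvp4c coupled with a momentum solver); it shows how a BVP solver lands on either branch depending on the initial guess.

```python
# Dual solutions of the Bratu / Frank-Kamenetskii problem (illustrative).
import numpy as np
from scipy.integrate import solve_bvp

FK = 1.0  # Frank-Kamenetskii number (assumed value, below critical ~3.51)

def ode(x, y):
    # y[0] = theta (temperature), y[1] = theta'
    return np.vstack([y[1], -FK * np.exp(y[0])])

def bc(ya, yb):
    return np.array([ya[0], yb[0]])  # theta = 0 at both walls

x = np.linspace(0.0, 1.0, 51)
for amplitude in (0.1, 4.0):  # guesses near the lower and upper branches
    y_guess = np.zeros((2, x.size))
    y_guess[0] = amplitude * np.sin(np.pi * x)
    sol = solve_bvp(ode, bc, x, y_guess)
    print(f"guess {amplitude}: converged={sol.success}, "
          f"max theta={sol.y[0].max():.3f}")
# Expected: max theta ~ 0.14 (lower branch) and ~ 4.1 (upper branch).
```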
Purpose: The present study investigated the effect of unpleasant salty or bitter tastes on cycling sprint performance and knee-extensor force characteristics in different fatigue states. Methods: Following a familiarization session, 11 trained male cyclists completed 3 experimental trials (salty, bitter, and water) in a randomized crossover order. In each trial, participants cycled at 85% of the respiratory compensation point for 45 minutes and then, after a 5-minute rest, completed a 1-minute sprint. Muscle-force characteristics were assessed using 2 knee-extensor maximal voluntary contractions immediately before, between, and after the cycling efforts. Participants mouth-rinsed and ingested 25 mL of test solution (salty, bitter, or water) immediately before each maximal voluntary contraction and the 1-minute sprint. Results: There were no significant differences in mean and peak power output during the 1-minute sprint between conditions (mean power: 528 [71] W, 524 [70] W, and 521 [80] W in the water, salt, and bitter conditions, respectively). Muscle-force production was impaired in all conditions after the heavy-intensity cycling, evidenced by a decline in maximum force production (P = .01, effect size = 0.32) and 100- to 200-millisecond impulse (P = .04, effect size = 0.27). However, there were no significant differences between conditions in maximal force or impulse measures at rest or after exercise. Conclusion: These data question whether unpleasant tastes can influence muscle-force production and do not support their use as an ergogenic aid for a cycling sprint performed under fatigued conditions.
Due to global blood shortages and restricted donor blood storage, focus has switched to the in vitro synthesis of red blood cells (RBCs) from induced pluripotent stem cells (iPSCs) as a potential solution. Many processes are required to synthesize RBCs from iPSCs, including the production of iPSCs from human or animal cells, differentiation of iPSCs into hematopoietic stem cells (HSCs), and culturing and maturation of the HSCs to make functional erythrocytes. Previous investigations of in vitro erythrocyte production have shown conflicting results: some studies have demonstrated substantial yields of functional erythrocytes, whereas others have observed low yields of enucleated cells. Before large-scale in vitro RBC production can be achieved, several challenges that have limited its clinical application must be overcome. These include optimizing differentiation techniques to manufacture vast amounts of functional RBCs, upscaling the manufacturing process, cost-effectiveness, and ensuring RBCs are produced under good manufacturing practice (GMP) before they can be used for therapeutic purposes.
Background: Handgrip strength (HGS) is an indicator of overall muscle health and is affected by impaired blood glucose levels. This review discusses the relationship between HGS and blood glucose levels and offers solutions to known problems in HGS and blood glucose regulation. Methods: This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses 2020 guidelines. Articles were sourced from Google Scholar and PubMed; 418 studies were screened, of which 19 were included. The Newcastle–Ottawa Scale was used to assess the risk of bias. Results: A relationship was observed between low HGS and high blood glucose levels. The suggested mechanisms involve insulin resistance, Caspase-3 activation, and mitochondrial effects. Sarcopenia emerged as an independent risk factor for impaired glucose control. Interventions including insulin administration and exercise have been proposed to preserve muscle mass. Conclusion: Resistance training and HGS exercises can be added to rehabilitation practices for managing diabetes mellitus. HGS measurements are vital for predicting muscle mass loss in clinical practice.
This study examines the experiences and unmet needs of caregivers of children with autism spectrum disorder in Nigeria. Autism spectrum disorder is highly prevalent in Nigeria and poses a heavy economic burden on society and on patients' families, who often face limited social interaction and stigma. Despite this, the unmet needs and psychosocial burdens of autism spectrum disorder on caregivers have been understudied in Nigeria. This study contributes evidence and raises interest in this area of research.
This qualitative study was conducted among twenty-three purposively selected caregivers. Questions from the PREPARE and Zarit Burden Interview tools were adapted for the interview and discussion guides. Data were collected from caregivers of pupils in selected special needs schools in Cross River State, Nigeria. Inductive and deductive approaches were used for the analysis in NVivo 20 Pro, and the socio-ecological model was used to generate the themes and quotes.
The study generated four themes and eleven sub-themes across the four levels of the socio-ecological model. Our findings showed that caregivers undergo significant emotional distress, disbelief, and fear at the early stage of diagnosis. Furthermore, families and friends had difficulty comprehending or accepting the children's diagnosis, which further created tension and misunderstanding. Socio-cultural challenges such as stigma and isolation were common.
Given the psychological demands and stigma attached to caregivers and children with autism, there is an urgent need for a tailor-made intervention addressing the interplay of individual, interpersonal, societal/institutional, and policy levels in Nigeria. Advocacy and awareness efforts led by caregivers should be strengthened across all levels of Nigerian society.
The growing use of Artificial Intelligence is having a major impact on society, with further impacts anticipated in the coming years and decades. There are individual differences in attitudes towards Artificial Intelligence, and it is important for scientists and others to be able to measure them. Individual differences in attitudes towards Artificial Intelligence may be associated with other major psychological or circumstantial factors, and understanding these associations is beneficial. It is also important to be able to track attitudes towards Artificial Intelligence over time. For this purpose, scientists have developed psychometric tools to measure attitudes towards Artificial Intelligence. This chapter provides an overview and evaluation of these tools, focusing on those that measure general attitudes towards Artificial Intelligence and yield quantitative measurements that can be analysed statistically. Semantic, methodological, and psychometric factors that the user should consider when choosing a suitable tool are discussed. The choice of measurement tool may depend on many researcher-driven considerations, including time, cost, and practical factors, but the quality and validity of the tool should be a major factor in this choice, as should a scale's ability to capture important dimensions in the data. We recommend that observed ambivalence about AI is best captured with a bi-dimensional AI attitudes scale.
Cardiology, as a medical specialty, addresses cardiovascular diseases (CVDs), a leading cause of global mortality. Nanomaterials offer transformative potential across key areas such as drug delivery, stem cell therapy, imaging, and gene delivery. In drug delivery, nanomaterials improve solubility, bioavailability, and targeting, reducing systemic side effects; examples include gas microbubbles, liposomal preparations, and paramagnetic nanoparticles, which show promise in treating atherosclerosis. Stem cell therapy benefits from nanotechnology through enhanced cell culture conditions and three-dimensional scaffolds that support cardiomyocyte growth and survival, with gold nanoparticles and PLGA-derived microparticles further improving stem cell viability. In imaging, nanomaterials enable advanced visualization techniques such as magnetic resonance imaging (MRI) with direct labeling and optical tracking via dye-conjugated nanoparticles. In gene delivery, polymeric nanocarriers such as polyethyleneimine, dendrimers, and graphene-based materials offer efficient non-viral alternatives, with magnetic nanoparticles showing promise in targeted applications. Ongoing research highlights the potential of nanomaterials to revolutionize CVD management by improving therapeutic outcomes and enabling precision medicine. These advancements position nanotechnology as a cornerstone of modern cardiology.
Estimating animal abundance has a key role to play in ecology and conservation, but survey methods are always challenged by imperfect detection. Among the techniques applied to deal with this issue, Double Observer (DO) is increasing in popularity due to its cost‐effectiveness. However, the effort of using DO for surveying large territories can be significant. A DO‐based survey method that allows accurate abundance estimations with reduced effort would increase the applicability of the method. This would have positive effects on the conservation of species which are challenging to survey such as mountain ungulates.
We used computer simulations based on real data and a field test to assess the reliability of the DO and of a new proposed survey procedure, the Double Observer Adjusted Survey (DOAS). DOAS is based on total block counts adjusted with some DO surveys conducted in a proportion of the total area only. Such DO surveys are then used to estimate detection probability with a mark‐recapture‐derived approach.
We found that full DO is much more accurate than simple block counts for abundance estimation. DOAS is a less demanding alternative to full DO and can produce comparable abundance estimates at the cost of slightly lower precision. However, in the DOAS, overall detectability must be estimated within a sufficient number of sites (around a quarter of the total) to obtain higher precision and avoid large overestimates.
Practical implications. DO methods could increase the reliability of abundance estimation in mountain ungulates and other gregarious species. Full DO in particular could allow researchers to obtain unbiased estimates with high precision, and its use is therefore suggested instead of block counts in wildlife monitoring. Given the high costs of full DO, the DOAS procedure could be a viable and cost-effective survey strategy to improve abundance estimates when resources are scarce.
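To make the DO logic concrete, here is a minimal sketch of the Lincoln-Petersen-style estimator that double-observer surveys typically rely on (our illustration under standard independence assumptions, not the authors' exact model): each observer's detections serve as a "capture" sample for the other, so detection probability, and hence abundance, can be estimated from the overlap.

```python
# Minimal double-observer abundance sketch (illustrative assumptions).
def double_observer_estimate(n1: int, n2: int, both: int) -> float:
    """n1, n2: groups detected by each observer; both: detected by both."""
    p1 = both / n2                           # detection prob. of observer 1
    p2 = both / n1                           # detection prob. of observer 2
    p_any = 1.0 - (1.0 - p1) * (1.0 - p2)    # detected by at least one
    n_seen = n1 + n2 - both                  # distinct groups seen overall
    return n_seen / p_any                    # Lincoln-Petersen-style estimate

# DOAS idea: estimate p_any from DO surveys on a subset of sites, then
# use it to correct single-round block counts over the rest of the area.
print(double_observer_estimate(n1=42, n2=38, both=30))  # ~53.2 groups
```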
Morbidity and mortality from preventable diseases are higher among ethnic minority groups than in the white population in the UK⁽¹⁾. Diet is a major modifiable disease risk factor and depends on several factors, including the affordability and availability of foods commonly consumed by these communities. As families bear the burden of inflation and rising food prices, and adapt by adjusting their spending, a widening of health inequality has been reported⁽²⁾.
Understanding of the impact of the cost-of-living crisis (COLC) on the dietary practices and health of ethnic minority populations is limited⁽²⁾. This study explored the impact of the COLC on dietary choices in the immigrant Nigerian community living in the UK.
Using a hermeneutic phenomenological design, purposive sampling, community networks, and snowballing were used to recruit participants from the Nigerian population in Manchester and London, selected by age, gender, and socio-economic factors. Following an informed consent process, fifteen one-to-one semi-structured telephone interviews were undertaken, collecting detailed information about culture, dietary choices, and the cost-of-living crisis.
Reflexive thematic analysis of the verbatim-transcribed audio recordings, following Braun and Clarke's six-step guide⁽³⁾, revealed that adaptive and coping strategies have been adopted. Participants reported what they now eat and the activities adopted to support eating: purchasing cheap, lower-quality, and reduced quantities of food; reducing food waste by preserving any excess; and bulk cooking or preparation to minimise energy usage. Other cost savings are being made by cutting down on social engagements (e.g., eating out and nightlife) and focusing on food "needs" rather than "wants". A reliance on social networks for support and assistance, the use of food banks, and skipping meals were also reported. One participant said: "This has affected our diet. So, it's been very challenging. I mean generally, everywhere the cost of things is high" (GregM61). Another said: "I just cook enough for two days so that I can save some energy and reduce my energy bills" (KhadijatF45). Illustrating the impact on health, one participant suggested: "When you have a cheap cost of meat and a cheap cost of carbohydrates, then you are eating badly" (EkongM51).
These findings cause concern, since the cheap, low-quality diets eaten by ethnic minority communities are known to be unhealthy and are high-risk factors for chronic diseases⁽⁴⁾. Long-term support for the individuals and groups most at risk from the COLC, and culturally appropriate strategies to support healthy eating on a budget, are paramount to prevent poor health outcomes in this group.
The UK population is ageing and becoming more ethnically diverse⁽¹⁾. Nutrition is a key modifiable determinant of healthy ageing, but there are few published data on dietary patterns in ethnic minority groups. Poor dietary habits among older adults from ethnic minority groups could be attributed to the cost of living, language barriers, age, and the availability of traditional foods⁽²⁾. As part of a larger research study to improve nutritional health in older adults (TANGERINE: nuTritional heAlth aNd aGeing in oldER ethnIc miNoritiEs), the aim of this study was to investigate vegetable intake in different older ethnic groups compared with a white (British) reference population.
We used food frequency questionnaire (FFQ) data drawn from Wave 2 (2010-2012) and Wave 13 (2021-2022) of Understanding Society, a UK household panel survey⁽³⁾. We calculated the proportions of vegetable intake by ethnic group for each wave, weighted for population representativeness, and used (weighted) logistic regression for intake (every day vs less than every day) to adjust for potential confounders. Data from the WHO food insecurity questionnaire in Wave 13 were used to evaluate ethnic group differences in food insecurity.
The percentage consuming vegetables at least every day fell between Waves 2 and 13 in all ethnic groups. At both waves, all ethnic groups except Indian had lower vegetable intake than the white (British) reference group. The age- and sex-adjusted odds ratios (OR) (95% confidence intervals) at Wave 2 were 0.60 (0.51, 0.71) for Caribbean, 0.67 (0.56, 0.79) for African, 0.36 (0.28, 0.44) for Pakistani, 0.78 (0.62, 0.98) for Bangladeshi and 1.10 (0.94, 1.28) for Indian. The differences could be largely explained by lower income and greater area deprivation for the Bangladeshi group, and less so for the Caribbean, African and Pakistani groups; results were similar for Wave 13. All ethnic groups except Indian had higher odds of greater food insecurity than the white (British) reference group, largely attributable to income and area deprivation: for example, the OR for the Pakistani group compared with the white (British) reference group decreased from 1.74 (1.18, 2.56) to 1.05 (0.70, 1.58) after adjustment. However, for the African group, the OR remained greater than the white reference population at 2.55 (1.73, 3.76) even after accounting for socioeconomic position.
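A minimal sketch of the weighted logistic-regression step described above is given below (our illustration on synthetic data; the variable names and weights are assumptions, not Understanding Society's actual fields). Exponentiating the fitted coefficients yields odds ratios of the kind reported.

```python
# Weighted logistic regression for daily vegetable intake (sketch).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "daily_veg": rng.binomial(1, 0.55, n),  # 1 = vegetables every day
    "ethnicity": rng.choice(["WhiteBritish", "Indian", "Pakistani"], n),
    "age": rng.uniform(50, 90, n),
    "sex": rng.choice(["M", "F"], n),
    "svy_weight": rng.uniform(0.5, 2.0, n),  # population weight
})

fit = smf.glm(
    "daily_veg ~ C(ethnicity, Treatment('WhiteBritish')) + age + sex",
    data=df,
    family=sm.families.Binomial(),
    freq_weights=np.asarray(df["svy_weight"]),
).fit()
print(np.exp(fit.params))      # odds ratios vs the white (British) group
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```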
The findings suggest differences in vegetable intake between ethnic groups that were maintained between 2010-2012 and 2021-2022 and may be explained to some extent by socioeconomic disadvantage. Whilst we used cross-sectional analyses of self-reported data, there remains a need for further large-scale studies of older ethnic groups using longitudinal and experimental designs that consider socioeconomic position, recognising the importance of heterogeneity and the need to analyse ethnic groups individually, rather than as a single group, when measuring dietary intake.
Extended reality (XR), which includes Virtual Reality (VR) and Augmented Reality (AR) technologies, has emerged as an innovative tool with potential applications in both the health and education sectors. For nutrition studies, XR technology offers alternative approaches to challenges in the prevention and management of chronic conditions such as overweight, obesity, diabetes, and cardiovascular diseases (CVD)⁽¹⁾. By simulating immersive environments and interactive experiences, XR technology presents an opportunity to influence dietary behaviours, modify eating habits, and improve health by reducing weight and body mass index (BMI). This review aims to explore the application of XR technology in nutrition studies, particularly focusing on diet-related non-communicable diseases (obesity, diabetes, and cardiovascular diseases), with an emphasis on managing relevant health outcomes.
A comprehensive search of multiple databases, including PubMed, Cochrane, Web of Science, Scopus, CINAHL, Medline, and ProQuest, was conducted (December to March 2024). Key search terms encompassed XR technology (e.g., virtual reality, augmented reality, Oculus, mobile app, immersive) in conjunction with terms for diet-related conditions (obesity, overweight, diabetes, CVD) and health outcomes (BMI). Following the search, duplicates were removed and articles were screened against predefined inclusion criteria. Data extraction focused on study details, participant characteristics, interventions, outcomes, and results.
845 articles were identified, and five met the inclusion criteria (3,4,5,6,7). These comprised one randomised cross-over study⁽⁴⁾ and four randomised controlled trials (1-3,5). Four of these studies explored VR-based approaches (2-5), while one used AR technology⁽¹⁾. The primary outcomes assessed across the studies focussed on the efficacy of VR and AR interventions in various domains, including portion size reduction, hunger, eating behaviour, food preferences, and weight management. Two studies showed improved portion size self-efficacy with VR interventions (2,5). One study reported that, although eating a virtual meal does not appear to significantly reduce hunger in healthy individuals, meal duration was significantly shorter for the virtual meal than for the real meal, leading to a higher eating rate⁽⁴⁾. XR interventions also showed potential to support optimal portion size selection and to reduce implicit food preferences (1,3).
XR interventions may be effective in addressing various aspects of eating behaviour and portion control. These findings suggest potential applications in nutrition education and obesity management, especially for those for whom technology usage is the norm, by offering innovative approaches for intervention and behaviour change. Further research in this domain is needed to elucidate the efficacy, feasibility, and long-term impact of XR/VR-based interventions in the prevention and management of chronic conditions.
Constructions of children's agency have been an influential and dominant arena for discussion since the emergence of the 'new' paradigm of childhood in the 1990s. Cross-disciplinary studies recognise the different social, cultural and temporal influences upon perceptions of childhood and acknowledge the impact of such constructions on how children's agency is understood and realised. Many of the definitions of agency reflect Article 12 of the United Nations Convention on the Rights of the Child, which states the child's right to be involved in decisions affecting them. However, as with other articles of the convention, Article 12 is prone to subjective adult interpretation predicated on assumptions of competence and capability, and subject to the same uneasy tension between participation, protection and provision which characterises the convention more broadly. Furthermore, the presumption that children's involvement in decision making is an indicator of agency is misleading. This paper argues that children's agency is a poorly defined concept, whose lack of clarity contributes to children being constrained as active change agents within and beyond contexts which directly affect them. Using the context of child language brokers, the paper argues that despite offering children the 'socio-culturally mediated capacity to act', brokering practices frequently take place in response to adult-determined objectives, rather than in contexts freely chosen by the child, potentially compromising their agentic potential. This paper draws upon the findings from Crutchley's doctoral thesis, which used the Biographical Narrative Interpretive Method to explore the retrospective narratives of adults who assumed the role of cultural and linguistic brokers during their childhoods.
Healthcare workers in clinical training are at high risk of stress and burnout due to the dual demands of clinical practice and academic study. The psychosocial clinical environment and personality traits contribute to healthcare workers' increased risk of psychological distress. The two-dimensional circumplex proposed by the complete state model of mental health suggests that people with symptoms of mental illness can nonetheless have mental well-being, and are seen as traversing the circumplex from floundering to flourishing. Poverty is a significant predictor of mental illness, and COVID-19 has been linked to poorer mental health. The WHO's Global Mental Health Report identifies three paradigm-shifting strategies for enhancing mental health, including elevating those who are affected and reshaping social and living conditions. The United Nations' action plan is to achieve sustainable health goals by 2030: to improve leadership and governance, responsive and integrated care, strategies for promotion and prevention, and the use of evidence and research to improve mental health.
[Figure: word cloud of perceived stress categories (Free Word Cloud Generator, n.d.)]
Note Mitchell (Perspect Psychiatr Care 57(3):1474–1481, 2021) [1] surveyed eighty-seven student nurses using the Perceived Stress Scale (PSS; Sheu et al. in Nurs Res 5(4):341–351, 1997 [2]). The questionnaire includes 29 Likert items ranging from 0 to 4 with six categories of perceived stress: nursing care of patients, academics and clinical staff, workload and assignments, peer group/daily life, low level of knowledge and skills, and the clinical environment. The ranked scores were converted into a word cloud for graphical representation in the figure presented above.
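As a hedged sketch of the word-cloud step described in the note (hypothetical category scores, not Mitchell's data; the `wordcloud` Python package is just one common way to render category frequencies):

```python
# Render ranked stress-category scores as a word cloud (illustrative).
from wordcloud import WordCloud
import matplotlib.pyplot as plt

# Hypothetical mean PSS category scores (0-4 scale), not the survey data.
scores = {
    "nursing care of patients": 3.1,
    "academics and clinical staff": 2.8,
    "workload and assignments": 2.6,
    "peer group / daily life": 2.0,
    "knowledge and skills": 1.7,
    "clinical environment": 1.5,
}
wc = WordCloud(width=800, height=400, background_color="white")
wc.generate_from_frequencies(scores)   # word size tracks category score
plt.imshow(wc, interpolation="bilinear")
plt.axis("off")
plt.show()
```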
Non-healing wounds cost the National Health Service over £5.6 billion annually in wound management. Skin allografts are used to treat non-healing wounds, ulcers and burns, offering the best protection against infection. To allow host cells to repopulate the graft and to avoid immunogenicity, cell components are removed through decellularisation. Decellularisation of human dermis has so far been performed in NHS Blood and Transplant using a combination of two enzymes (RNase T1 and the recombinant human DNase Pulmozyme®). This study aimed to validate a new method of removing DNA from donated dermis using a single enzyme, Benzonase, known for its effective DNA digestion. Skin samples were decellularised by removing the epidermis, lysing the dermal cells, removing cellular fragments with a detergent wash, and removing nucleic acids by nuclease incubation with either Benzonase or Pulmozyme + RNase T1. DNA quantification with PicoGreen, and histology on wax-embedded biopsies stained with DAPI and haematoxylin and eosin, were performed, together with in vitro toxicity tests on human osteosarcoma immortalised cells and skin fibroblasts and biomechanical (tensile) testing. The effectiveness of DNA digestion with the new methodology was comparable to the previous procedure: mean DNA removal was 99.9% (3.83 ng/mg) with Pulmozyme + RNase T1 and 99.8% (9.97 ng/mg) with Benzonase, and histology staining showed complete decellularisation following either method. Benzonase proved non-toxic to both cell lines used, and a one-way ANOVA showed no significant difference in either stress or strain between acellular dermal matrices decellularised with Benzonase or with Pulmozyme + RNase T1. Benzonase was thus able to decellularise dermis effectively after prior removal of the epidermis. It performed as well as the combination of Pulmozyme + RNase T1, but offers significant advantages in cost-effectiveness, procurement and storage, and it has been successfully used in the decellularisation of other tissues, making it well suited to tissue banking. Switching to this single combined DNase/RNase could have far-reaching consequences for the production of acellular human dermal matrix by NHSBT and for the treatment of patients requiring it.
Introduction
Primary healthcare (PHC) patient medical records contain Systematised Nomenclature of Medicine-Clinical Terms (SNOMED-CT) encoding information on diagnosis, demographics and veteran status. This study aimed to identify, analyse and compare the prevalence of type 2 diabetes, hypertension, dementia and smoking tobacco in veterans and non-veterans, including stratification by age and gender.
Methods
The authors partnered with 13 PHC practices with a combined population of 137 410 patients. Staff extracted matched veteran and non-veteran SNOMED-CT data from patient medical records, then sent the authors anonymised data in an amalgamated format between October 2023 and January 2024. Patients were from the same local communities, so social and environmental factors would be similar. Submitted information was entered into SPSS version 28 for analysis, which included descriptive and inferential statistics to assess statistical significance.
Results
In total, 5458 PHC electronic records were examined, comprising 2729 veterans and 2729 non-veterans demographically matched for age and gender. Each group contained 86.4% (N=2359) men and 13.6% (N=370) women; the mean age was 63.8 years (SD 17.7). Rates of hypertension were 20.9% in veterans compared with 17.6% in non-veterans (p=0.002), and type 2 diabetes mellitus was recorded in 8.3% of veterans compared with 6.4% of non-veterans (p=0.007). Dementia was recorded in 2.1% of veterans compared with 2.5% of non-veterans (p=0.32), and smoking in 11.8% of veterans compared with 10.6% of non-veterans (p=0.16).
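The between-group comparisons above are two-proportion tests. As a minimal sketch of the arithmetic (our illustration; the authors used SPSS), the hypertension contingency table can be reconstructed from the reported percentages and yields a p-value close to the reported 0.002.

```python
# Two-proportion chi-square check for the hypertension comparison.
from scipy.stats import chi2_contingency

n_per_group = 2729
vets_htn = round(0.209 * n_per_group)      # ~570 veterans with hypertension
nonvets_htn = round(0.176 * n_per_group)   # ~480 non-veterans
table = [[vets_htn, n_per_group - vets_htn],
         [nonvets_htn, n_per_group - nonvets_htn]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")  # p close to 0.002
```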
Conclusion
These results reveal that veterans were statistically more likely to be diagnosed with hypertension and diabetes. This study should assist in better understanding the healthcare needs of the veteran population and potentially inform better patient-centred care. However, the effective use of PHC patient medical records requires improved data quality, which in turn needs improved PHC staff knowledge, consistency in SNOMED-CT coding, better registration and coding of veterans' medical e-records, and better data transmission between the Defence Medical Services and PHC.
The disruptive potential of generative AI (GenAI) tools for academic labour is vast. Yet as we argue herein, such tools also represent a continuation of the inequities inherent to academia's prestige economy and of the intensified hierarchy and labour precarisation endemic to universities as prestige institutions. In a recent survey of n = 284 UK-based academics, reasons were put forward for avoiding GenAI tools. These responses surface concerns about automative technologies corrupting academic identity and inauthenticating scholarly practice; concerns that are salient to all who participate within and benefit from the work of scholarly communities. In discussing these survey results, we explore ambivalence about whether GenAI tools expedite the acquisition, or hasten the depletion, of the prestige demanded of academics, especially where GenAI tools are adopted to increase scholarly productivity. We also appraise whether, far from helping academics cope with a work climate of hyper-intensification, GenAI tools ultimately exacerbate their vulnerability, status-based peripheralisation, and self-estrangement.