Recent publications
The incidence of bleeding in patients with acute leukemia varies widely based on several factors, including the specific type of leukemia, the stage of the disease, the patient’s comorbidities, and the presence and severity of thrombocytopenia. Thrombocytopenia due to both bone marrow infiltration and chemotherapy; disseminated intravascular coagulation (DIC), more common in patients with acute promyelocytic leukemia (APL); impaired platelet function; and hypofibrinogenemia, due to consumption of fibrinogen during systemic activation of coagulation or to impaired synthesis caused by asparaginase, may alone or in combination influence the risk of bleeding. About 30–50% of patients will experience bleeding, which in about half of cases can be severe, with intracranial hemorrhage occurring in 3–6% of cases. Prevention and management of bleeding are mainly based on platelet transfusions, with a minimal target of 10 × 10⁹/L platelets, or higher in patients with active bleeding, DIC, or APL, and on the management of coagulopathy with fibrinogen replacement. The prophylactic use of antifibrinolytic agents is not recommended because of the lack of demonstrated efficacy, although these agents can be of benefit in selected situations (e.g., oral mucosal bleeding).
Background
Diabetes of the exocrine pancreas (DEP) is an underdiagnosed form of diabetes, most commonly caused by acute and chronic pancreatitis (CP). The contribution of incretin system dysfunction and the role of glucagon levels in the pathogenesis of DEP remain unclear. The aim of our study is to assess the secretion of glucagon-like peptide-1 (GLP-1), glucose-dependent insulinotropic peptide (GIP) and glucagon, along with the incretin effect, in individuals with and without CP. By comparing these parameters within the same glucose tolerance class, we seek to elucidate specific hormonal alterations that characterize DEP.
Methods
To pursue this aim, we conducted a cross‐sectional study on 32 patients with chronic pancreatitis (wCP) and 60 patients without chronic pancreatitis (w/oCP), who were administered an oral glucose tolerance test, a hyperglycemic clamp and a mixed meal test with measurement of glucose, insulin, C‐peptide, GLP‐1, GIP and glucagon.
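For context, the incretin effect is classically quantified by comparing the insulin (or C-peptide) response to oral glucose with the response to intravenous glucose at matched glycemia. The sketch below illustrates that classical AUC-based formula with made-up values; it is not the study's protocol or code, and the sampling times and hormone levels are purely illustrative.

```python
# Minimal sketch: classical incretin-effect estimate from paired insulin
# responses to oral glucose (OGTT) and matched intravenous glucose.
# Formula and variable names are illustrative, not the paper's protocol.

def trapezoid_auc(times, values):
    """Area under the curve by the trapezoidal rule."""
    return sum(
        0.5 * (values[i] + values[i + 1]) * (times[i + 1] - times[i])
        for i in range(len(times) - 1)
    )

def incretin_effect_pct(times, insulin_oral, insulin_iv):
    """Incretin effect (%) = 100 * (AUC_oral - AUC_iv) / AUC_oral."""
    auc_oral = trapezoid_auc(times, insulin_oral)
    auc_iv = trapezoid_auc(times, insulin_iv)
    return 100.0 * (auc_oral - auc_iv) / auc_oral

# Example with made-up sampling times (min) and insulin levels (pmol/L)
t = [0, 30, 60, 90, 120]
print(incretin_effect_pct(t, [40, 300, 420, 380, 260], [40, 180, 260, 240, 170]))
```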
Results
The comparison between individuals wCP and w/oCP showed worse beta-cell function and a lower incretin effect in the former, although incretin and glucagon levels were similar. Diabetes prevalence was higher in the wCP group than in the w/oCP group (56% vs. 33%). Thus, to evaluate the differences attributable to CP, we found it necessary to stratify individuals according to glucose tolerance class. After stratification, both groups showed similar beta-cell function, incretin effect, and incretin and glucagon secretion.
Conclusions
Incretin and glucagon levels and the incretin effect varied according to glucose tolerance, not according to the presence or absence of CP. Similar defects in incretin secretion and effect are responsible for diabetes development in individuals wCP and w/oCP.
Mitophagy is a well-characterized and redundant recycling system for damaged mitochondria and a marker of organelle quality (Picca et al., 2023). Yet, the assessment of mitophagy in vivo remains a challenge. The characterization of the endosomal-lysosomal pathways supporting endocytic trafficking has also provided invaluable insight into mitophagy signaling. The endocytic pathway has been implicated in preserving mitochondrial quality via the generation of mitochondria-derived vesicles (MDVs) and, as such, has been related to mitophagy tasks (Ferrucci et al., 2024). Altered mitophagy and MDV signaling accompany brain aging and neurodegenerative conditions (Ferrucci et al., 2024). However, how MDVs can best be characterized so as to be exploited as hallmarks of health and disease is debated. MDVs may serve as a link between dysfunctional mitophagy and the decline of cell homeostasis by shuttling and/or themselves being mitochondria-derived damage-associated molecular patterns. The latter, by instigating chronic low-grade inflammation, may support neuroinflammation and neurodegeneration (Ferrucci et al., 2024). Alternatively, MDVs may rescue the mitochondrial bioenergetics of neighbouring cells and favour neuronal health by transferring functional organelles. However, what defines one or the other role of MDVs, and whether the outcome is mediated by vesicle subpopulations released under different metabolic triggers, remains to be defined. Herein, we discuss MDVs as surrogate and more accessible measures of mitophagy. We also highlight the importance of addressing the challenges of MDV isolation and characterization to appreciate their signaling roles in neurodegeneration.
Background
This study aimed to compare clinical characteristics and response to monoclonal antibodies against CGRP (anti-CGRP mAbs) between patients who habitually used triptans (TRIPTANS group) and patients who were non-current users (NO-TRIPTANS group).
Methods
In this prospective cohort study, all consecutive outpatients treated with anti-CGRP mAbs for 12 months were included. Clinical data were collected at baseline and monthly: the number of monthly headache days (MHDs), the absolute number of analgesics taken (AMNs), the number of days with at least one analgesic (AMDs), and the Headache Impact Test-6 (HIT-6) and Migraine Disability Assessment (MIDAS) questionnaires. The outcomes were the differences between the TRIPTANS and NO-TRIPTANS groups (users or non-users of triptans in the 6 months before and during anti-CGRP mAb treatment) in MHDs and the other clinical variables during treatment. Response rates were assessed based on reductions in MHDs (≤ 25%, ≥ 50%, ≥ 75%).
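The response bands can be expressed as a simple categorization of the relative MHD reduction; the sketch below is illustrative only, with band labels and function names of our own choosing.

```python
# Illustrative sketch of the response bands used in the abstract:
# relative reduction in monthly headache days (MHDs) from baseline.
# Band labels and function names are ours, not the study's code.

def mhd_reduction(baseline_mhd: float, followup_mhd: float) -> float:
    """Relative reduction in MHDs (1.0 = headache-free)."""
    return (baseline_mhd - followup_mhd) / baseline_mhd

def response_band(reduction: float) -> str:
    if reduction >= 0.75:
        return ">=75% responder"
    if reduction >= 0.50:
        return ">=50% responder"
    if reduction <= 0.25:
        return "<=25% (non-responder)"
    return "intermediate (25-50%)"

print(response_band(mhd_reduction(24, 10)))  # ~58% reduction -> ">=50% responder"
```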
Results
A total of 336 patients treated with mAbs were included. At baseline, the NO-TRIPTANS group had more MHDs (24.7 ± 6.7) than the TRIPTANS group (21.8 ± 6.9), p < 0.001. Comparative and normalized analyses showed significantly lower MHDs in the TRIPTANS group, sustained throughout treatment. The MIDAS score was also significantly lower in the TRIPTANS group at months 3, 6, 9, and 12, and AMDs and AMNs were lower than in the NO-TRIPTANS group at most time points. The proportion of patients with a ≥ 50% reduction in MHDs was significantly higher in the TRIPTANS group at months 1 and 12.
Conclusions
This study showed greater effectiveness of anti-CGRP mAbs in habitual triptan users, possibly due to a common and/or synergistic mechanism of action.
Introduction: Low-grade appendiceal mucinous neoplasms (LAMNs) are rare tumors with nonspecific clinical features. Abdominal computed tomography (CT) is the primary diagnostic tool. Surgical resection is the standard treatment, requiring “en bloc” tumor removal to minimize the risk of rupture. Laparoscopic surgery is a safe and effective approach. Case Report: A 74-year-old female patient was admitted for evaluation of a pelvic mass. Physical examination revealed a non-tender abdomen with a palpable pelvic mass. Imaging showed significant thickening and dilation of the appendix. Laparoscopic surgery confirmed LAMN. The patient recovered well with no complications at follow-up. Conclusion: Low-grade appendiceal mucinous neoplasms are rare tumors with variable presentations. Abdominal CT and enteroscopy are crucial for diagnosis. Laparoscopic surgery offers a safe and effective treatment, ensuring favourable outcomes.
Abstract
Objectives: Lacerations are among the most widespread and common emergencies in children addressed by doctors in emergency room admissions. The aim of this study was to optimize the formulation of a pre-existing anaesthetic gel known as LAT gel (lidocaine 4%, adrenaline 0.05%, tetracaine 0.5% gel), in order to allow the paediatric emergency department of the Policlinico A. Gemelli (Rome) to improve the management of lacerations while avoiding the mandatory use of infiltrative anaesthesia. The study also aimed to assess the stability over time of the active ingredients (lidocaine, adrenaline, tetracaine) in the galenic preparation.
Methods: LAT gel is a formulation prepared using a poloxamer, Lutrol F127 (or Kolliphor P407), and two anaesthetics (lidocaine and tetracaine), together with adrenaline. The formulation was prepared in a Grade A laminar flow hood within a Grade B environment for microbial load. Batches were subjected to microbiological analysis by the hospital hygiene service as indicated in FU XII (“Official Italian Pharmacopoeia, XII edition”). The chemical stability of the gel was assessed by high-performance liquid chromatography.
Results: Analysis showed that the lidocaine, tetracaine and adrenaline content of the LAT gel remained constant when stored in a refrigerator at 2 to 8°C for up to 120 days. The absence of need for additional anaesthetic methods was considered a parameter of effectiveness. Eighty-three children were evaluated. In the cases recorded, 83.1% (69/83) of patients did not require additional anaesthesia. The patients who required additional anaesthesia (16.9%, 14/83) were those with deep and extensive lacerations. The results show that an effective, sterile, stable preparation with excellent applicability was obtained. Conclusion: This product lends itself to application to the wound surface with a better anaesthetic and haemostatic effect and, unlike injectable lidocaine, is an optimal treatment for increasing compliance in paediatric emergency rooms.
BACKGROUND
The use of biomarkers, such as the neutrophil-to-lymphocyte ratio (NLR) and the neutrophil-to-platelet ratio (NPR), has shown promise in evaluating early outcomes after medical, interventional, and surgical treatments. NLR has emerged as an indicator of systemic inflammation and physiological stress. NPR has emerged as a potential indicator of inflammation and thrombotic risk in the context of surgical and radiological procedures.
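Both biomarkers are plain ratios of routine blood-count values; a minimal sketch follows. Unit conventions, particularly for NPR, vary across studies, so the assumption here that all counts share the same units is ours, not the paper's.

```python
# The two biomarkers are plain ratios of routine blood-count values.
# Unit conventions for NPR vary across studies; this sketch assumes all
# counts are expressed in the same units (e.g., 10^9 cells/L).

def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-to-lymphocyte ratio."""
    return neutrophils / lymphocytes

def npr(neutrophils: float, platelets: float) -> float:
    """Neutrophil-to-platelet ratio."""
    return neutrophils / platelets

# Example: neutrophils 6.2, lymphocytes 0.9, platelets 85 (all 10^9/L)
print(f"NLR = {nlr(6.2, 0.9):.2f}, NPR = {npr(6.2, 85):.3f}")
```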
AIM
To analyze the correlation of NLR and NPR with the development of early post-liver transplantation (LT) complications after stratification for hepatocellular carcinoma diagnosis.
METHODS
Consecutive patients who underwent LT between January 2019 and December 2023 were enrolled. Data on hemoglobin concentration and the differential leukocyte count on postoperative days (POD) 0, 1, 3, and 5 were collected.
RESULTS
The dataset included 161 consecutive patients who underwent LT. Clavien-Dindo IV-V complications correlated well with NLR on POD 1 (P = 0.05), POD 3 (P < 0.001), and POD 7 (P < 0.001), and with NPR on POD 3 (P < 0.001). In addition, NPR on POD 3 correlated with the onset of 30-day hemorrhage (P = 0.009). Finally, 30-day mortality was significantly associated with NLR on POD 1 (P = 0.03) and POD 7 (P = 0.004), and with NPR on POD 7 (P = 0.004).
CONCLUSION
NLR and NPR are strongly correlated with Clavien-Dindo IV-V complications and 30-day post-LT mortality.
BACKGROUND
The CAR-OLT score predicts major adverse cardiovascular events 1 year after liver transplant (LT).
AIM
To test the hypothesis that the CAR-OLT score may help avoid cardiac stress tests in LT candidates.
METHODS
This retrospective single-center cohort study included all adult patients undergoing elective evaluation for first cadaveric donor orthotopic LT for liver cirrhosis with or without hepatocellular carcinoma at Fondazione Policlinico Universitario Agostino Gemelli Istituto di Ricerca e Cura a Carattere Scientifico in Rome, Italy. Cardiac contraindications for LT listing were defined after a center-specific cardiac workup, which included cardiac stress tests for most patients. The diagnostic accuracy of the CAR-OLT score was evaluated using the area under the receiver operating characteristic (AUROC) method.
RESULTS
A total of 342 LT candidates were evaluated between 2015 and 2019, with a moderate cardiovascular risk profile (37% diabetes, 34% hypertension, 22% obesity). Of these, 80 (23%) candidates underwent coronary angiography. Twenty-one (6%) candidates were given cardiac contraindications to LT listing, 48% of which were due to coronary artery disease. The CAR-OLT score predicted cardiac contraindications to LT listing with an AUROC of 0.81. The optimal cut-off for sensitivity was a CAR-OLT score ≤ 23, which showed a 99% negative predictive value for cardiac contraindications to LT listing. A total of 84 (25%) LT candidates with a CAR-OLT score ≤ 23 underwent 87 non-invasive cardiac tests and 13 coronary angiographies pre-listing, with estimated costs of approximately €48,000. The estimated savings per patient were €574.70 for the Italian National Health System.
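As a hedged illustration of this type of evaluation (synthetic data; only the ≤ 23 cutoff comes from the abstract), the sketch below computes the AUROC of a risk score and the negative predictive value of a low-risk rule with scikit-learn.

```python
# Hedged sketch of the reported evaluation: AUROC of a risk score and the
# negative predictive value (NPV) of a "CAR-OLT <= 23 rules out" decision.
# Data below are synthetic; only the cutoff (23) comes from the abstract.
from sklearn.metrics import roc_auc_score

scores = [12, 18, 25, 31, 22, 40, 15, 35, 20, 28]  # CAR-OLT scores
contraindicated = [0, 0, 0, 1, 0, 1, 0, 1, 0, 0]   # cardiac contraindication

auroc = roc_auc_score(contraindicated, scores)

# NPV of the low-risk rule: among patients with score <= 23, the fraction
# with no cardiac contraindication (true negatives / all test-negatives).
low_risk = [y for s, y in zip(scores, contraindicated) if s <= 23]
npv = low_risk.count(0) / len(low_risk)

print(f"AUROC = {auroc:.2f}, NPV at score <= 23 = {npv:.2f}")
```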
CONCLUSION
A CAR-OLT score ≤ 23 can identify LT candidates who can be safely listed without the need for cardiac stress tests, providing time and cost savings. These findings require external validation.
Background
Preservation of mobility independence is a primary goal in older adults with physical frailty and sarcopenia (PF&S). Interventions combining physical activity (PA) and nutritional counselling have been indicated as strategies for the management of this condition, although their effectiveness has not been confirmed in all investigations. A possible explanation for this uncertain scenario lies in the impact of adherence to PA interventions. Hence, the present study investigated the impact of adherence to PA sessions on the incidence of mobility disability in older adults with PF&S.
Methods
This is a secondary analysis of an evaluator-blinded, randomised controlled trial conducted at 16 clinical sites across 11 European countries from January 2016 to 31 October 2019. Participants were community-dwelling older adults (70+ years) with PF&S enrolled in the SPRINTT trial (NCT02582138). PF&S was operationalised as having a total score from 3 to 9 on the short physical performance battery (SPPB), low appendicular lean mass and ability to complete the 400-m walk test in < 15 min. Data from participants allocated to a multicomponent intervention (PA with technological support plus nutritional counselling) and a healthy ageing lifestyle education programme (control group) were analysed. Adherence to PA was assessed based on the number of weekly sessions attended. According to recommendations of the American College of Sports Medicine, adherence was categorised as below recommendations (< 2 sessions/week, BR), meeting recommendations (2–3 sessions/week, MR), and above recommendations (> 3 sessions/week, AR). The primary outcome was incident mobility disability, operationalised as incident inability to complete the 400-m walk test in < 15 min during up to 36 months of follow-up.
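To make the adherence categorisation and the time-to-event analysis concrete, here is a minimal sketch using the lifelines Cox proportional hazards model. Only the session cutoffs (< 2, 2–3, > 3 sessions/week) come from the study; the DataFrame, column names, and values are hypothetical, and this is not the trial's analysis code.

```python
import pandas as pd
from lifelines import CoxPHFitter

def adherence_category(sessions_per_week: float) -> str:
    """ACSM-based categories described in the abstract."""
    if sessions_per_week < 2:
        return "BR"  # below recommendations
    if sessions_per_week <= 3:
        return "MR"  # meeting recommendations
    return "AR"      # above recommendations

# Toy cohort: months of follow-up, mobility-disability indicator,
# and attended PA sessions per week (all values hypothetical).
df = pd.DataFrame({
    "months": [36, 24, 30, 36, 12, 34, 20, 28, 18, 36],
    "disability": [0, 1, 1, 0, 1, 0, 1, 1, 0, 0],
    "sessions_per_week": [3.5, 1.0, 1.5, 2.5, 0.5, 4.0, 2.0, 3.8, 1.2, 2.8],
})
df["adherence"] = df["sessions_per_week"].map(adherence_category)
df = pd.get_dummies(df, columns=["adherence"], drop_first=True, dtype=int)

cph = CoxPHFitter()
cph.fit(df.drop(columns="sessions_per_week"),
        duration_col="months", event_col="disability")
cph.print_summary()  # hazard ratios for the categories vs. the dropped level
```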
Results
Data from 1444 participants (mean age 79.3 years, 72.6% women) were analysed. In those with SPPB scores of 3–7, the MR and AR groups had a lower risk of mobility disability, in a dose-dependent manner, compared with controls [MR: HR (95% CI) 0.57 (0.41–0.78), p = 0.001; AR: HR (95% CI) 0.33 (0.23–0.46), p < 0.001] and with the BR group [MR: HR (95% CI) 0.48 (0.34–0.69), p < 0.001; AR: HR (95% CI) 0.27 (0.18–0.38), p < 0.001]. In those with SPPB scores of 8 or 9, the BR group had a higher risk of mobility disability than controls, and the MR and AR groups had a lower risk of mobility disability than the BR group.
Conclusions
In older adults with PF&S, adherence to PA recommendations is associated with lower incidence of mobility disability. This benefit depends on the degree of adherence as well as baseline physical performance.
Trial Registration: ClinicalTrials.gov NCT02582138
Introduction
Periprosthetic femur fractures around the hip represent a significant clinical challenge, particularly in patients aged 65 and older, as delayed treatment can lead to increased morbidity and mortality. Early surgical intervention is generally recommended to minimize complications and optimize functional recovery. This study aims to assess the impact of surgical timing on operative duration, hospitalization length, complication rates, and long-term survival in elderly patients with periprosthetic femur fractures.
Methods
This retrospective observational study followed the STROBE guidelines and the Declaration of Helsinki. Patients aged ≥ 65 years with periprosthetic femur fractures treated surgically between 2014 and 2022 were included. Patients were stratified into two groups based on surgical timing: Early Surgery (< 24 h) and Delayed Surgery (≥ 24 h). Frailty was assessed using the Clinical Frailty Scale, while the severity of the medical condition was evaluated with the National Early Warning Score. Primary outcomes included operative time, hospitalization length, complications, and survival. Statistical analyses were performed using t-tests, chi-squared tests, and Kaplan–Meier survival analysis, with significance at p < 0.05.
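As a rough illustration of the survival comparison described above (not the study's code), the sketch below fits a Kaplan–Meier curve and runs a log-rank test with the lifelines library; all follow-up times and event indicators are synthetic placeholders.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Follow-up (years) and death indicators per group (toy values)
early_t, early_e = [10, 9, 10, 8, 10, 7], [0, 0, 0, 1, 0, 0]
delayed_t, delayed_e = [3, 10, 5, 2, 8, 10], [1, 0, 1, 1, 1, 0]

kmf = KaplanMeierFitter()
kmf.fit(early_t, event_observed=early_e, label="Early surgery (<24 h)")
print(kmf.survival_function_)  # KM estimate over follow-up time

result = logrank_test(early_t, delayed_t,
                      event_observed_A=early_e, event_observed_B=delayed_e)
print(f"log-rank p = {result.p_value:.3f}")
```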
Results
Seventy-two patients were analyzed. The mean operative time was significantly shorter in the Early Surgery group (145 ± 49.6 min) compared to the Delayed Surgery group (176 ± 71.9 min; p < 0.001). Hospitalization duration was also reduced in the Early Surgery group (12.1 ± 5.98 days vs. 19.7 ± 13.2 days; p = 0.002). Survival analysis demonstrated significantly better long-term outcomes in the Early Surgery group (p = 0.036), with 10-year survival rates of 92.8% versus 68.2%.
Conclusion
Early surgical intervention (< 24 h) in elderly patients with periprosthetic femur fractures is associated with shorter operative time, reduced hospitalization, and improved long-term survival compared to delayed surgery. Prompt surgical management should be prioritized to enhance patient outcomes.
Purpose
This study investigated whether radiomic features extracted from [¹⁸F]FDG-PET scans acquired before and two weeks after neoadjuvant treatment, and their variation, provided prognostic parameters in locally advanced cervical cancer (LACC) patients treated with neoadjuvant chemo-radiotherapy (CRT) followed by radical surgery.
Methods
We retrospectively included LACC patients referred to our Institution from 2010 to 2016. [¹⁸F]FDG-PET/CT was performed before neoadjuvant CRT (baseline) and two weeks after the start of treatment (early). Radiomic features were extracted after semi-automatic delineation of the primary tumour, on baseline and early PET images. Delta radiomics were calculated as the relative differences between baseline and early features. We performed 5-fold cross-validation stratified for recurrence and cancer-specific death, integrating dimensionality reduction of the radiomic features and variable hunting with importance within the folds. After supervised feature selection, radiomic models with the best-performing features for each timepoint, as well as clinical models and combined clinico-radiomic models, were built. Model performances are presented as C-indices, for prediction of recurrence/progression (disease-free survival, DFS) and cancer-specific death (overall survival, OS).
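Two computational steps named above can be made concrete: delta radiomics as the relative difference between early and baseline feature values, and Harrell's C-index as the discrimination measure. The sketch below, with toy values and a lifelines utility, is illustrative rather than the authors' pipeline.

```python
# Sketch of two steps named above: (1) delta radiomics as the relative
# difference between early and baseline feature values, and (2) Harrell's
# C-index as the discrimination metric. Feature values and data are toy.
import numpy as np
from lifelines.utils import concordance_index

baseline = np.array([4.2, 10.5, 0.8])   # feature values before CRT
early = np.array([3.1, 12.0, 0.5])      # same features two weeks in

delta = (early - baseline) / baseline   # relative change per feature
print("delta features:", np.round(delta, 3))

# C-index: agreement between predicted risk and observed survival times
times = [12, 30, 45, 60, 76]             # months to event/censoring
events = [1, 1, 0, 1, 0]                 # 1 = recurrence observed
risk_scores = [0.9, 0.7, 0.3, 0.6, 0.1]  # higher = predicted higher risk

# concordance_index expects higher scores to mean longer survival,
# so pass the negated risk.
print("C-index:", concordance_index(times, [-r for r in risk_scores], events))
```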
Results
A total of 95 patients were included. With a median follow-up of 76.0 months (95% CI: 59.5–82.1), 31.6% of patients had recurrence/progression and 20.0% died of disease. None of the models could predict DFS (all C-indices ≤ 0.72). Model performances for OS were slightly better, with mean C-indices of 0.75 for both the radiomic and combined models based on early features, 0.79 and 0.78 for the radiomic and combined models derived from delta features, and 0.76 for the clinical models.
Conclusion
[¹⁸F]FDG-PET early and delta radiomic features could not predict DFS in patients with LACC treated with neoadjuvant CRT followed by radical surgery. Although slightly improved performances were observed for the radiomic and combined models in the prediction of OS compared with the clinical model, the added value of these parameters, and their inclusion in clinical practice, seems limited.
Chronic lymphocytic leukemia (CLL) cells may bear mutations in IGHV genes, with a 2% cutoff discriminating two subsets, unmutated (U)-CLL and mutated (M)-CLL, which follow different clinical courses. IGHV genes may also incorporate additional ongoing mutations, a phenomenon known as intraclonal diversification (ID). Here, through an original bioinformatic workflow for NGS data, we used the inverse Simpson index (iSI) as a diversity measure among IGHV sequences to dichotomize cases with different ID levels into IDhigh (iSI ≥ 1.2) vs. IDlow (iSI < 1.2), both in CLL (n = 983) and in other lymphoproliferative disorders (LPD; n = 127). In CLL, IDhigh cases accounted for 14.6% and were overrepresented in M-CLL (P = 0.0028), while higher percentages were documented in GC-derived LPD. In M-CLL (n = 396), IDhigh patients (n = 69) experienced a longer time to first treatment than IDlow patients (P = 0.015), and multivariate analyses (n = 299) confirmed ID as an independent variable. IGHV gene mutations of IDhigh cases had molecular signatures indicating ongoing activity of the AID/Polη-dependent machinery; consistently, IDhigh M-CLL expressed higher levels of AID transcripts than IDlow M-CLL (P = 0.012). In conclusion, we propose a robust NGS protocol to quantitatively evaluate ID in CLL, demonstrating that: (i) all CLL patients presented ID, although to various degrees; (ii) a high degree of ID has clinical relevance, identifying an M-CLL subset with a better outcome.
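The diversity measure used here is the standard inverse Simpson index over the frequencies p_i of distinct IGHV sequence variants within a clone, iSI = 1 / Σ p_i². A minimal sketch of its computation and the 1.2 dichotomization follows; the cutoff comes from the abstract, while the read counts are illustrative.

```python
# The diversity measure is the inverse Simpson index over the frequencies
# of distinct IGHV sequence variants within a clone: iSI = 1 / sum(p_i^2).
# The 1.2 cutoff follows the abstract; read counts below are illustrative.
from collections import Counter

def inverse_simpson(variant_counts):
    """Inverse Simpson index from per-variant read counts."""
    total = sum(variant_counts)
    return 1.0 / sum((c / total) ** 2 for c in variant_counts)

# Toy example: reads collapsed by unique IGHV sequence variant
reads = Counter({"variant_A": 900, "variant_B": 60, "variant_C": 40})
isi = inverse_simpson(reads.values())
label = "ID-high" if isi >= 1.2 else "ID-low"
print(f"iSI = {isi:.2f} -> {label}")  # iSI ~ 1.23 -> ID-high
```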
Background
Liver diseases are common in patients with inflammatory bowel disease (IBD). Little is known about how specialists perceive and manage liver enzyme abnormalities. This study investigates the current practice and educational needs of IBD specialists in the management of liver enzyme abnormalities.
Methods
A 22-question web-based survey was distributed to members of the Italian Group for the study of IBD, covering their demographics, workplace features, and approaches to managing liver enzyme abnormalities in IBD patients.
Results
The survey was completed by 205/439 (46.7%) respondents. The largest age group among respondents was over 45 years old (38.5%), and the majority worked in Northern Italy (61%). Most were gastroenterologists (86%) practicing in public hospitals (45%), with 21.5% having a defined referral pathway to a dedicated liver unit for IBD patients. Ninety-seven percent of physicians reported regular monitoring of transaminases, while 88% also monitored gamma-glutamyl transpeptidase and 76% alkaline phosphatase (ALP). In cases of abnormal enzyme levels, over 70% reported independently ordering additional diagnostic tests, with notable heterogeneity in the thresholds used to trigger further investigation. The conditions most frequently suspected in cases of mild transaminase elevations were metabolic dysfunction-associated steatotic liver disease (71%) and drug-induced liver injury (17%). A significant proportion of physicians (57%) considered their training in managing liver enzyme abnormalities adequate but acknowledged the need for further educational opportunities. The main barrier identified was the lack of specific guidelines and actionable flowcharts (62%).
Conclusion
This survey reveals heterogeneity in monitoring and management of liver enzyme abnormalities among IBD specialists. Most physicians recognize the need for improved training and specific guidelines.
Background
Muscle strength is one of the key components in the diagnosis of sarcopenia. The aim of this study was to train a machine learning model to predict reference values and percentiles for handgrip strength and chair‐stand test (CST), in a large cohort of community dwellers recruited in the Longevity check‐up (Lookup) 8+ project.
Methods
The Longevity check-up 8+ project is an ongoing initiative conducted in unconventional settings in Italy since 1 June 2015. Eligible participants were aged 18+ years and provided written informed consent. After a 70/20/10 split into training, validation and test sets, a quantile regression forest (QRF) was trained. Performance metrics were R-squared (R²), mean squared error (MSE), root mean squared error (RMSE) and mean Winkler interval score (MWIS) with 90% prediction coverage (PC). The 95% confidence intervals (CIs) of the metrics were calculated using a bootstrap approach. Variable contributions were analysed using SHapley Additive exPlanations (SHAP) values. Probable sarcopenia (PS) was defined according to the European Working Group on Sarcopenia in Older People 2 (EWGSOP2) criteria.
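Among the metrics listed, the Winkler interval score is the least familiar: for a central (1 − α) prediction interval [l, u], it equals the interval width plus a penalty of 2/α times the distance by which the observation falls outside. A minimal sketch of MWIS and empirical coverage follows, with toy data and variable names of our own; it is not the study's code.

```python
# Sketch of two interval metrics named above: empirical prediction coverage
# and the mean Winkler interval score (MWIS) for a central 90% interval
# (alpha = 0.10). The standard Winkler score penalises observations falling
# outside the interval; data below are toy values.
def winkler_score(y, lower, upper, alpha=0.10):
    width = upper - lower
    if y < lower:
        return width + (2.0 / alpha) * (lower - y)
    if y > upper:
        return width + (2.0 / alpha) * (y - upper)
    return width

def interval_metrics(ys, lowers, uppers, alpha=0.10):
    scores = [winkler_score(y, l, u, alpha) for y, l, u in zip(ys, lowers, uppers)]
    coverage = sum(l <= y <= u for y, l, u in zip(ys, lowers, uppers)) / len(ys)
    return sum(scores) / len(scores), coverage

# Handgrip strength (kg): observed values and predicted 5th-95th percentiles
mwis, cov = interval_metrics([32, 18, 45], [25, 20, 35], [40, 34, 52])
print(f"MWIS = {mwis:.1f} kg, coverage = {cov:.0%}")
```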
Results
Between 1 June 2015 and 23 November 2024, a total of 21 171 individuals were enrolled, of whom 19 995 were included in our analyses. In the overall population, 11 019 (55.1%) were female. Median age was 56 years (IQR 47.0–67.0). Five variables were included: age, sex, height, weight and BMI. After the train/validation/test split, 13 996 subjects were included in the training set, 4199 in the validation set and 1800 in the test set. For handgrip strength, the R² was 0.65 (95% CI 0.63–0.67) in the validation set and 0.64 (95% CI 0.62–0.67) in the test set. PCs were 91.5% and 91.2%, respectively. For the CST, the R² was 0.23 (95% CI 0.20–0.25) in the validation set and 0.24 (95% CI 0.20–0.28) in the test set. The PCs were 89.5% and 89.3%. Sex was the most influential variable for handgrip strength, and age for the CST. In the validation set, 23% of subjects in the first quartile for handgrip strength and 13% of subjects in the fourth quartile for the CST met the criteria for PS.
Conclusions
We developed and validated a QRF model to predict subject‐specific quantiles for handgrip and CST. These models hold promise for integration into clinical practice, facilitating cost‐effective and time‐efficient early identification of individuals at elevated risk of sarcopenia. The predictive outputs of these models may serve as surrogate biomarkers of the aging process, capturing functional decline.
Background
Over the last few years, there has been increasing attention to the involvement of the central nervous system in Duchenne muscular dystrophy (DMD). The aim of this study was to assess the spectrum of neurodevelopmental and mental disorders and possible required intervention in our cohort of 264 boys and adults with DMD.
Methods
We retrospectively analysed clinical notes and psychological assessments, including routinely performed cognitive tests and clinical observations. Intelligence quotients and site of mutations were also noted.
Results
103/264 individuals (39%) had symptoms compatible with one of the following diagnoses: attention deficit hyperactivity disorder (ADHD) (n=26), autism spectrum disorder (ASD) (n=11), depressive mood/disruptive mood dysregulation disorder (n=27), anxiety disorder (n=17), obsessive-compulsive disorder (n=2), or psychosis risk syndrome (n=7), while 13 had a more complex phenotype. ADHD and ASD were more frequent in childhood, emotional dysregulation during early adolescence, and psychosis and more severe phobias in older boys and adults. The risk of developing these disorders did not increase with the concomitant involvement of the dystrophin isoforms Dp140 and Dp71. Pharmacological treatment was suggested for 48 individuals but was started in only 24, as it was refused by the remaining 24 families.
Conclusions
Our findings confirm that neurodevelopmental and mental disorders are common in DMD and are likely to have a multifactorial nature. These findings support the need for disease-specific assessments and the need to increase awareness of the possible behavioural and social difficulties among families and healthcare professionals.
Skin and Soft Tissue Infections (SSTIs) are common in pediatric patients, accounting for nearly 25% of clinical visits. These infections range from mild to life-threatening and include a severe subset known as Acute Bacterial Skin and Skin Structure Infections (ABSSSI). Prompt diagnosis and appropriate antibiotic use are crucial for optimizing patient outcomes while minimizing adverse effects and antimicrobial resistance. However, empirical treatment often becomes necessary due to the lack of culture specimens, making local epidemiology and clinical presentation key factors in treatment decisions. This expert opinion paper aims to outline the “golden rules” for the management of SSTIs in children, focusing on achieving microbiological clearance, clinical improvement, and effective control of symptoms such as fever and pain, which significantly impact the child's well-being. These rules emphasize the principles of antimicrobial stewardship, recommending early diagnosis with appropriate laboratory tests, rational empiric therapy, and a prompt switch to targeted therapy based on microbiological findings, as well as proper fever and pain management. The paper also highlights the importance of a multidisciplinary approach for complex cases, optimal dosing, and effective communication with patients' families to improve treatment compliance. Furthermore, antibiotic therapy should be selected to reduce hospital stay and facilitate home-based continuity of care, while follow-up and strengthening of the hospital–community care network are critical for continuity of care after discharge. These recommendations aim to optimize the management of pediatric SSTIs by ensuring comprehensive care from initial diagnosis to post-discharge follow-up, promoting the rational use of antibiotics, and ultimately improving clinical outcomes and quality of life for children and their families.