Carla Carnovale’s research while affiliated with Università degli Studi di Milano-Bicocca and other places

Publications (139)


Distribution of acetylation status in relation to treatment-related events. The asterisk (*) indicates a statistically significant difference between the two groups, as determined by a chi-square test (χ² = 8.101, p = 0.044).
(A) Cumulative incidence curves for NAT2 acetylation status related to the development of ATDH, other adverse events, and treatment modifications. (B) Cumulative incidence of ATDH based on patient’s acetylation status over time. P: the p-value indicates the difference between the slow group and the rapid/intermediate group for each specific variable (Gray’s test). Abbreviations: ADRs = adverse drug reactions; ATDH = antituberculosis drug-induced hepatotoxicity; TMs = treatment modifications.
NAT2 Acetylation Status Predicts Hepatotoxicity During Antituberculosis Therapy: Cumulative Risk Analysis of a Multiethnic Cohort
  • Article
  • Full-text available

April 2025 · 7 Reads

Marco Schiuma · Sofia Dinegro · Vera Battini · [...] · Stefania Cheli

Antituberculosis drug-induced hepatotoxicity (ATDH) is a common adverse drug reaction that often requires treatment interruption, complicating tuberculosis management. The slow acetylator phenotype, characterized by reduced N-acetyltransferase 2 (NAT2) enzyme activity, is associated with an increased risk of hepatotoxicity, while rapid acetylators are associated with a higher risk of therapeutic failure. This study investigates the association between the NAT2 acetylation phenotype and ATDH occurrence, with an emphasis on its predictive value in a multiethnic population and its impact on the timing of ATDH onset. A retrospective observational study was conducted on tuberculosis patients treated at Luigi Sacco Hospital, Milan, Italy (July 2020–September 2023). NAT2 genotyping identified slow and rapid/intermediate acetylators. Cumulative incidence analysis and Fine–Gray competing risks regression models were used to assess ATDH risk and onset timing. Among 102 patients, 21.6% developed ATDH (16.7% slow acetylators and 4.9% rapid/intermediate acetylators). ATDH onset was significantly earlier in slow acetylators (median 0.5 vs. 2 months; interquartile range, IQR: 0.5–3 vs. 1.7–5.5). Slow acetylators were at higher risk of developing ATDH (sub-distribution hazard ratio, SHR = 3.05; 95% confidence interval, CI: 1.17–7.95; p = 0.02), even after adjusting for confounders. The NAT2 acetylation phenotype strongly influences ATDH risk and timing. Early identification of acetylator status may enable dose adjustments, enhancing treatment safety. These findings highlight the role of pharmacogenetics in optimizing antituberculosis therapy by improving efficacy and minimizing toxicity.
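
The cumulative incidence analysis described above can be approximated with the Aalen–Johansen estimator, which treats other adverse reactions or treatment modifications as competing events; the Fine–Gray regression itself is most commonly fitted with R's cmprsk package and is not shown here. The following is a minimal Python sketch under assumed file and column names (cohort.csv, time_months, event_code, nat2_status), none of which come from the study.

import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("cohort.csv")  # hypothetical file: one row per patient

for group, sub in df.groupby("nat2_status"):  # "slow" vs "rapid/intermediate"
    ajf = AalenJohansenFitter()
    # event_code: 0 = censored, 1 = ATDH, 2 = competing event (other ADR or treatment modification)
    ajf.fit(sub["time_months"], sub["event_code"], event_of_interest=1, label=str(group))
    # Cumulative incidence of ATDH at the end of follow-up for this group
    print(group)
    print(ajf.cumulative_density_.tail(1))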




Fig. 1 Case-crossover design: Each patient's 1-year timeline (blue line) depicts the period leading up to the diagnosis of acute myocardial infarction (AMI) (red triangle)
Fig. 2 Example of a network of diagnoses and drugs from 1 year leading up to the first episode of acute myocardial infarction for an individual patient. Each node represents either a drug or a diagnosis. The size of the node indicates its centrality degree, which measures the importance of the node within the network. The edges represent the type of pairs: drug-drug pair (blue lines), drug-diagnosis pair (green lines), and diagnosis-diagnosis pair (red lines). The thickness
Fig. 3 Signal identification from outliers
Network Analysis and Machine Learning for Signal Detection and Prioritization Using Electronic Healthcare Records and Administrative Databases: A Proof of Concept in Drug-Induced Acute Myocardial Infarction

January 2025 · 55 Reads · 2 Citations

Drug Safety

Background and Objective: Safety signals for potential drug-induced adverse events (AEs) typically emerge from multiple data sources, primarily spontaneous reporting systems (SRS), despite known limitations. Increasingly, real-world data (RWD) from sources like electronic health records (EHRs) and administrative databases are leveraged for signal detection. Although network analysis has shown promise in mapping relationships between clinical attributes for signal detection in SRS databases, its application in RWD from EHRs and administrative databases remains limited. This study aimed to evaluate the performance of network analysis in detecting safety signals within Italian administrative databases, using drug-induced acute myocardial infarction (AMI) as a proof of concept. Methods: A case-crossover design was employed to explore the association between drug exposure and AMI using the Healthcare Administrative Database of Mantova, Italy, from 2014 to 2018. Patients with their first AMI hospitalization were identified after a 365-day washout period to exclude prior hospitalizations. A network was constructed to analyse the relationships between prescribed drugs and diagnoses, represented as nodes, with undirected edges illustrating their interactions. For each AMI patient, all diagnoses and drugs recorded or redeemed within 365 days of the first AMI episode were identified, and various drug-diagnosis, drug-drug, and diagnosis-diagnosis pairs were generated. The frequency of these pairs was calculated, and three types of edge weights quantified the strength of connections. Outlier drug-AMI pairs were identified using a predictive score (F) based on frequency (C) and full edge weights (WF), with validation for known AMI associations. Signals were prioritized using the F score, C of AMI, and WF, analysed through k-means clustering to identify patterns in the data. Results: From 2014 to 2018, a total of 3,918 patients had AMI, with 4,686 AMI diagnoses. Of those, 2,866 had prescriptions in the year prior, totalling 498,591 prescriptions. A network analysis identified 2,968 unique nodes, revealing 529,935 diagnosis-diagnosis connections, 235,380 drug-diagnosis connections, and 102,831 drug-drug connections. The median number of connections (C) for drug nodes was 404 (Q1-Q3: 194-671), while for diagnosis nodes, it was 380 (Q1-Q3: 216-664). The median (Q1-Q3) WF was 11.8 (9-14), while the median F score across pairs was 0.1 (Q1-Q3: 0.1-0.3). A total of 249 potential safety signals were detected, with 63.4% aligning with known AEs. Among the remaining signals, 80 were prioritized, and five (terazosin, tamsulosin, allopurinol, esomeprazole, and omeprazole) emerged as the highest priority. Conclusions: Overall, our novel method demonstrates that network analysis is a valuable tool for signal detection and prioritization for drug-induced AEs based on EHRs and administrative databases.
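
As a rough illustration of the network-construction step only, the sketch below builds a co-occurrence graph of drugs and diagnoses recorded in the 365 days before each patient's first AMI and ranks drug-AMI edges with a simple frequency-normalised score. The score is a stand-in rather than the paper's F score, and the file name, column names, and "AMI" node label (events.csv, patient_id, code, kind) are assumptions made for the example.

from itertools import combinations
import networkx as nx
import pandas as pd

events = pd.read_csv("events.csv")  # hypothetical columns: patient_id, code, kind ("drug"/"dx")

G = nx.Graph()
for _, sub in events.groupby("patient_id"):
    codes = sub[["code", "kind"]].drop_duplicates().itertuples(index=False)
    for (c1, k1), (c2, k2) in combinations(codes, 2):
        if G.has_edge(c1, c2):
            G[c1][c2]["weight"] += 1  # C: number of patients sharing this pair
        else:
            G.add_edge(c1, c2, weight=1)
            G.nodes[c1]["kind"], G.nodes[c2]["kind"] = k1, k2

# Rank drug-AMI edges by a simple frequency-normalised score (placeholder for F),
# assuming the AMI diagnosis is coded as the node "AMI" in this toy dataset.
ami = "AMI"
drug_ami = [
    (nbr, d["weight"] / G.degree(nbr, weight="weight"))
    for nbr, d in G[ami].items()
    if G.nodes[nbr].get("kind") == "drug"
]
for drug, score in sorted(drug_ami, key=lambda x: -x[1])[:10]:
    print(drug, round(score, 3))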


NAT2 Slow Acetylator Phenotype as a Significant Risk Factor for Hepatotoxicity Caused by Antituberculosis Drugs: Results From a Multiethnic Nested Case-Control Study

December 2024 · 19 Reads · 2 Citations

Clinical Infectious Diseases

Background: Under standard therapies, the incidence of drug-induced liver injury (DILI) in patients with tuberculosis ranges from 2% to 28%. Numerous studies have identified risk factors for antituberculosis DILI; however, none have been conducted in a multiethnic real-world setting. The primary outcome of the current study was to identify the risk factors that best predict DILI in a multiethnic cohort. Methods: A nested case-control study was conducted in patients at the tuberculosis clinic of Luigi Sacco Hospital in Milan. Results: The study included 102 patients (mean age [SD], 45.6 [15.6] years). For each patient with hepatotoxicity, 2 controls were matched for sex, age, body mass index, tuberculosis/tuberculosis infection diagnosis, and index date. We found that N-acetyltransferase 2 gene (NAT2) slow acetylator status was the best independent predictor of DILI (odds ratio, 5.97 [95% confidence interval, 1.38–25.76]; P = .02). Conclusions: NAT2 genotype–guided dosing may help optimize antituberculosis drug treatment and prevent treatment failure. Clinical Trials Registration: ClinicalTrials.gov NCT06539455
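
For a matched design like this one (one case to two controls per matched set), conditional logistic regression is a common way to estimate the odds ratio while respecting the matching. The sketch below assumes hypothetical column names (set_id, dili, slow_acetylator) and is not the study's actual analysis code.

import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("nested_cc.csv")  # hypothetical file: one row per subject

model = ConditionalLogit(
    endog=df["dili"],              # 1 = DILI case, 0 = matched control
    exog=df[["slow_acetylator"]],  # further covariates can be added here
    groups=df["set_id"],           # identifier of the matched case-control set
)
res = model.fit()
print(np.exp(res.params))      # odds ratio
print(np.exp(res.conf_int()))  # 95% confidence interval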


Low-Dose Aspirin and Risk of Anaemia in Older Adults: Insights from a Danish Register-based Cohort Study

October 2024 · 39 Reads · 1 Citation

European Heart Journal - Quality of Care and Clinical Outcomes

Aims: To assess the risk of anaemia associated with low-dose aspirin (LDA) exposure in Danish older individuals in a real-world setting. Methods: Population-based cohort study conducted using Danish registers. The study population included older individuals (≥65 years) exposed to LDA between 2008 and 2013 for primary or secondary prevention of cardiovascular events. Over a five-year follow-up, outcomes included anaemia incidence based on haemoglobin values and hematinic deficiency incidence based on antianemic prescriptions. Results: Among the 313 508 individuals included in the study population, those exposed to LDA (n = 59 869, 19.1%) had a 9.6% incidence of hematinic deficiency, as determined by the use of antianemic treatment, with an incidence rate ratio of 9.11 (95% Confidence Interval, CI: 8.81-9.41) when compared to non-users of LDA (n = 253 639, 80.9%), who had an incidence of 3.7%. Anaemia determined by haemoglobin value measurements was observed in 5.9% of those exposed to LDA, with an incidence rate ratio of 7.89 (95% CI: 7.58-8.21) when compared to non-users of LDA. Approximately one in five individuals (n = 2 422, 21.5%) who experienced anaemia also experienced bleeding. Severe anaemia was observed in 1.3% of those exposed to LDA compared to 0.6% of those not exposed. Among the exposed, the reduction in haemoglobin and ferritin levels was associated with the severity of anaemia. Conclusion: These findings indicate that, in a real-world setting, anaemia can occur in 6 to 10 out of every 100 older LDA users during the first 5 years of treatment.
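
The incidence rate ratios quoted above compare event rates between LDA users and non-users; a crude rate ratio with a Wald 95% confidence interval can be computed as in the sketch below. The event counts and person-years are placeholders, not the study's data.

import math

def incidence_rate_ratio(events_exp, py_exp, events_unexp, py_unexp):
    """Crude incidence rate ratio (exposed vs. unexposed) with a Wald 95% CI."""
    irr = (events_exp / py_exp) / (events_unexp / py_unexp)
    se_log = math.sqrt(1 / events_exp + 1 / events_unexp)
    return irr, irr * math.exp(-1.96 * se_log), irr * math.exp(1.96 * se_log)

# Placeholder numbers for illustration only
print(incidence_rate_ratio(events_exp=5700, py_exp=250_000,
                           events_unexp=9400, py_unexp=1_150_000))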


Figure 2 Flowchart of the developed algorithm applied using the incident new-user study design to predict the therapeutic indication of redeemed antiseizure medications by older individuals with epilepsy. AEDs, antiepileptic drugs; TDM, therapeutic drug monitoring. Index date, first redeemed antiseizure medication following epilepsy hospitalization.
Figure 3 Performance of the developed algorithm for predicting the use of antiseizure medications for epilepsy. Step 3 = treatment patterns of antiseizure medications compatible with nonepileptic disorders; Step 5 = therapeutic substitution and add-on; Step 6 = epileptic surgical procedures and examinations; Step 6b = AEDs therapeutic drug monitoring; Total = overall algorithm, corresponding to step 7 (recorded therapeutic indication for redeemed antiseizure medications in the Danish National Prescription Registry). AEDs, antiepileptic drugs; N, number.
A new data-driven method to predict the therapeutic indication of redeemed prescriptions in secondary data sources: A case study on antiseizure medications users aged ≥ 65 identified in Danish registries.

May 2024 · 17 Reads

Objectives: We aimed to develop a new data-driven method to predict the therapeutic indication of redeemed prescriptions in secondary data sources, using antiepileptic drugs among individuals aged ≥65 identified in Danish registries. Design: This was an incident new-user register-based cohort study using Danish registers. Setting: The study setting was Denmark and the study period was 2005–2017. Participants: Participants included antiepileptic drug users in Denmark aged ≥65 with a confirmed diagnosis of epilepsy. Primary and secondary outcome measures: Sensitivity served as the performance measure of the algorithm. Results: The study population comprised 8609 incident new users of antiepileptic drugs. The sensitivity of the algorithm in correctly predicting the therapeutic indication of antiepileptic drugs in the study population was 65.3% (95% CI 64.4 to 66.2). Conclusions: The algorithm demonstrated promising properties in terms of overall sensitivity for predicting the therapeutic indication of redeemed antiepileptic drugs by older individuals with epilepsy, correctly identifying the therapeutic indication for 6 out of 10 individuals using antiepileptic drugs for epilepsy.
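
Sensitivity here is the share of prescriptions truly redeemed for epilepsy that the algorithm labels correctly; a minimal computation with a simple Wald confidence interval is sketched below using placeholder counts, not the registry data.

import math

def sensitivity_ci(true_pos, false_neg, z=1.96):
    """Sensitivity = TP / (TP + FN), with a simple Wald 95% confidence interval."""
    n = true_pos + false_neg
    sens = true_pos / n
    se = math.sqrt(sens * (1 - sens) / n)
    return sens, max(0.0, sens - z * se), min(1.0, sens + z * se)

# Placeholder counts for illustration only
print(sensitivity_ci(true_pos=5622, false_neg=2987))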


The REporting of A Disproportionality Analysis for DrUg Safety Signal Detection Using Individual Case Safety Reports in PharmacoVigilance (READUS-PV): Explanation and Elaboration

May 2024 · 186 Reads · 87 Citations

Drug Safety

In pharmacovigilance, disproportionality analyses based on individual case safety reports are widely used to detect safety signals. Unfortunately, the publication of disproportionality analyses lacks specific guidelines, often leading to incomplete and ambiguous reporting, and carries the risk of incorrect conclusions when data are not placed in the correct context. The REporting of A Disproportionality analysis for drUg Safety signal detection using individual case safety reports in PharmacoVigilance (READUS-PV) statement was developed to address this issue by promoting transparent and comprehensive reporting of disproportionality studies. While the statement paper explains in greater detail the procedure followed to develop these guidelines, this explanation paper presents the 14 items retained for the READUS-PV guidelines, together with an in-depth explanation of their rationale and bullet points to illustrate their practical implementation. Our primary objective is to foster the adoption of the READUS-PV guidelines among authors, editors, peer reviewers, and readers of disproportionality analyses. By enhancing the transparency, completeness, and accuracy of reporting, as well as the proper interpretation of results, the READUS-PV guidelines will ultimately facilitate evidence-based decision making in pharmacovigilance.
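
For readers unfamiliar with what such analyses compute: a common disproportionality measure is the reporting odds ratio (ROR), derived from a 2x2 table of report counts. The sketch below is a generic illustration with placeholder counts, not an example taken from READUS-PV itself.

import math

def reporting_odds_ratio(a, b, c, d):
    """a: reports with drug of interest and event of interest;
    b: drug of interest, other events;
    c: other drugs, event of interest;
    d: other drugs, other events."""
    ror = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return ror, ror * math.exp(-1.96 * se_log), ror * math.exp(1.96 * se_log)

# Placeholder counts for illustration only
print(reporting_odds_ratio(a=40, b=960, c=200, d=98_800))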


The Reporting of a Disproportionality Analysis for Drug Safety Signal Detection Using Individual Case Safety Reports in PharmacoVigilance (READUS-PV): Development and Statement

May 2024 · 149 Reads · 52 Citations

Drug Safety

Disproportionality analyses using reports of suspected adverse drug reactions are the most commonly used quantitative methods for detecting safety signals in pharmacovigilance. However, their methods and results are generally poorly reported in published articles, and existing guidelines do not capture the specific features of disproportionality analyses. We here describe the development of a guideline (REporting of A Disproportionality analysis for drUg Safety signal detection using individual case safety reports in PharmacoVigilance [READUS-PV]) for reporting the results of disproportionality analyses in articles and abstracts. We established a group of 34 international experts from universities, the pharmaceutical industry, and regulatory agencies, with expertise in pharmacovigilance, disproportionality analyses, and assessment of safety signals. We followed a three-step process to develop the checklist: (1) an open-text survey to generate a first list of items; (2) an online Delphi method to select and rephrase the most important items; (3) a final online consensus meeting. Among the panel members, 33 experts responded to round 1 and 30 to round 2 of the Delphi, and 25 participated in the consensus meeting. Overall, 60 recommendations for the main body of the manuscript and 13 recommendations for abstracts were retained by participants after the Delphi method. After some items were merged and the online consensus meeting was held, the READUS-PV guidelines comprise a checklist of 32 recommendations, organized into 14 items, for the reporting of disproportionality analyses in the main body text, and four items, comprising 12 recommendations, for abstracts. The READUS-PV guidelines will support authors, editors, peer reviewers, and users of disproportionality analyses based on individual case safety report databases. Adopting these guidelines will lead to more transparent, comprehensive, and accurate reporting and interpretation of disproportionality analyses, facilitating their integration with other sources of evidence.


Exploring the impact of co-exposure timing on drug-drug interactions in signal detection through spontaneous reporting system databases: a scoping review

April 2024 · 46 Reads · 1 Citation

Introduction: Drug-drug interactions (DDIs) are defined as the pharmacological effects produced by the concomitant administration of two or more drugs. To minimize false positive signals and ensure their validity when analyzing Spontaneous Reporting System (SRS) databases, it has been suggested that key pharmacological principles, such as temporal plausibility, be incorporated. Areas covered: The scoping review of the literature was conducted using MEDLINE from inception to March 2023. Included studies had to provide detailed methods for identifying DDIs in SRS databases. Any methodological approach and adverse event were accepted. Descriptive analyses were excluded, as we focused on automatic signal detection methods. The result is an overview of all the available methods for DDI signal detection in SRS databases, with a specific focus on the evaluation of the co-exposure time of the interacting drugs. It is worth noting that only a limited number of studies (n = 3) have attempted to address the issue of overlapping drug administration times. Expert opinion: Current guidelines for signal validation focus on factors like the number of reports and temporal association, but they lack guidance on addressing overlapping drug administration times, highlighting a need for further research and method development.
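
One concrete way to incorporate temporal plausibility is to compute how long two reported drug administration windows actually overlap before accepting a DDI signal. The sketch below is a generic illustration with made-up dates, not a method drawn from the reviewed studies.

from datetime import date

def coexposure_days(start_a, end_a, start_b, end_b):
    """Number of days the two administration periods overlap (0 if disjoint)."""
    latest_start = max(start_a, start_b)
    earliest_end = min(end_a, end_b)
    return max(0, (earliest_end - latest_start).days + 1)

# Example: drug A given 1-20 Jan, drug B given 15 Jan-10 Feb -> 6 overlapping days
print(coexposure_days(date(2023, 1, 1), date(2023, 1, 20),
                      date(2023, 1, 15), date(2023, 2, 10)))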


Citations (80)


... As an emerging field, there is also an increasing number of publications, some of which were written and became available even during the peer-review process for this article. Six such publications were identified that pertained to signal detection via advanced ML algorithms such as deep learning, reinforcement learning, and others (although these studies do not alter the main conclusions of this review) [54][55][56][57][58][59]. Next, we considered device surveillance out of scope given that there is little standardization across companies' safety organizations as to how device surveillance is performed (i.e., under or not under the umbrella of the drug safety function). ...

Reference:

Artificial Intelligence: Applications in Pharmacovigilance Signal Management
Network Analysis and Machine Learning for Signal Detection and Prioritization Using Electronic Healthcare Records and Administrative Databases: A Proof of Concept in Drug-Induced Acute Myocardial Infarction

Drug Safety

... With respect to the timing of adverse outcomes, the relevant guidelines and studies suggest the need for regular liver function monitoring to prevent or mitigate ATDH, but there is no consensus on the optimal timing for such monitoring [22][23][24], possibly because of a lack of evidence on the acetylation status of TB patients and the timing of ATDH onset. In our previous study [25], we provided evidence that the NAT2 acetylation status is significantly associated with the development of ATDH in a multiethnic cohort receiving standard anti-TB therapy. In particular, we demonstrated that slow acetylators are associated with a significantly higher risk of developing ATDH compared to rapid/intermediate acetylators. ...

NAT2 Slow Acetylator Phenotype as a Significant Risk Factor for Hepatotoxicity Caused by Antituberculosis Drugs: Results From a Multiethnic Nested Case-Control Study
  • Citing Article
  • December 2024

Clinical Infectious Diseases

... Thus, the findings are exploratory in nature. Second, reporting bias and underreporting are inherent limitations of spontaneous reporting databases like FAERS, which may affect the accuracy of data analysis due to subjective judgment differences, reporting tendencies, or incomplete data, potentially delaying the identification of drug risks [51][52][53] . Additionally, due to variations in reporting practices and healthcare systems, and considering that the U.S. is the primary contributor to FAERS reports, the generalizability of the findings to populations outside the U.S. requires further validation. ...

Conducting and interpreting disproportionality analyses derived from spontaneous reporting systems

Frontiers in Drug Safety and Regulation

... Other "classic" targets for deprescribing were apparently already well managed and did not require many changes in our population. As such, we rarely encountered inappropriate statin and aspirin use for the primary prevention of ischemic vascular disease in older people [32,33]; anticoagulant dose could be reduced in some residents but not stopped [34,35], and with the exception of metformin and metoclopramide, dose adjustments in response to renal impairment were also rarely necessary in spite of the 37.5% prevalence of renal impairment of stage 3 or more in our population. ...

Low-Dose Aspirin and Risk of Anaemia in Older Adults: Insights from a Danish Register-based Cohort Study
  • Citing Article
  • October 2024

European Heart Journal - Quality of Care and Clinical Outcomes

... This study employs a retrospective observational pharmacovigilance analysis utilizing the publicly accessible FAERS database. All methods and protocols of this study were conducted in compliance with the relevant guidelines and regulations outlined in "The Reporting of A Disproportionality Analysis for Drug Safety Signal Detection Using Individual Case Safety Reports in PharmacoVigilance (READUS-PV) 11 . Since the FAERS database is publicly accessible and patient records are anonymized, this study does not require informed consent or ethical approval. ...

The REporting of A Disproportionality Analysis for DrUg Safety Signal Detection Using Individual Case Safety Reports in PharmacoVigilance (READUS-PV): Explanation and Elaboration

Drug Safety

... Disproportionality analysis results were reported according to the READUS-PV statement (15), whereas logistic regression results adhered to the STROBE statement checklist (16) (Table SI and Table SII). All data analyses and processing were conducted using purposewritten R language (version 4.3.1) ...

The Reporting of a Disproportionality Analysis for Drug Safety Signal Detection Using Individual Case Safety Reports in PharmacoVigilance (READUS-PV): Development and Statement

Drug Safety

... This methodology was adjusted to incorporate the 75th percentile of patients with a specific diagnosis and their corresponding co-medications. Notably, EHRs and administrative databases include all prescribed medications, whereas individual case safety reports are often incomplete, presenting a significant challenge for SRS database analysis [2,53]. Indeed, our results illustrate the potential of network analysis to identify complex interactions between medications, clinical events, and patient characteristics, offering an innovative alternative to traditional SRS databases. ...

Timing Matters: A Machine Learning Method for the Prioritization of Drug-Drug Interactions Through Signal Detection in the FDA Adverse Event Reporting System and Their Relationship with Time of Co-exposure

Drug Safety

... This issue has been associated with the risk of relapse and disease progression, and it has been partially addressed with the introduction of typical long-acting injectable (LAI) antipsychotics [8]. The subsequent development of atypical LAI antipsychotics further improved the safety profile of these treatments, which are administered at extended intervals ranging from two weeks to six months [9,10]. Despite the advantages of LAIs, several barriers limit their widespread use, including challenges in defining the target patient population, uncertainty about the optimal timing for initiating LAI treatment, and the need for education and training for healthcare providers [11]. ...

Evaluating the 6-month formulation of paliperidone palmitate: a twice-yearly injectable treatment for schizophrenia in adults
  • Citing Article
  • March 2024

... 51 Recently, a robust investigation of Danish registers has assessed the risk of hospitalisation due to either minor or major cognitive impairment at 3 years associated with DPP-4i against GLP-1RAs in older adults with T2DM: only major, but not minor, cognitive impairment-related hospitalisation was significantly higher in DPP-4i users (HR 1.58; 95% CI 1.22, 2.06). 87 Similarly, our pharmacoepidemiological study of TriNetX US Collaborative Network EHRs 57 showed lower hazards of cognitive deficits (defined as a composite outcome of various International Classification of Diseases-10th Revision codes reflecting cognitive impairment) within the first year since starting the GLP-1RA semaglutide compared with the DPP-4i sitagliptin (n=46 772; HR 0.72; 95% CI 0.64, 0.80) and the sulfonylurea glipizide (n=38 412; HR 0.72; 95% CI 0.63, 0.81), but not the SGLT-2i empagliflozin (which is known to also have some neuroprotective effects). 88 These results are reassuring for patients with T2DM taking GLP-1RAs. ...

Comparing major and mild cognitive impairment risks in older type-2 diabetic patients: a Danish register-based study on dipeptidyl peptidase-4 inhibitors vs. glucagon-like peptide-1 analogues

Journal of Neurology

... The FAERS database can be accessed via the quarterly raw data files, which require extensive cleaning to ensure usability. Tools like the FDA Adverse Events Dashboard and the OpenVigil Project independently clean, validate, and process the data, leading to slight variations in data interpretation [8]. For this study, the OpenVigil 2.1 software package was used to query and analyse adverse event reports. ...

Enhancing Transparency in Defining Studied Drugs: The Open-Source Living DiAna Dictionary for Standardizing Drug Names in the FAERS

Drug Safety