Article

Artificial intelligence methods to predict chemotherapy-induced neutropenia in breast cancer patients.

Article
There has been growing investment in artificial intelligence (AI) interventions to combat the opioid-driven overdose epidemic plaguing North America. Although the evidence for the use of technology and AI in medicine is mounting, a number of ethical, social, and political implications need to be considered when designing AI interventions. In this commentary, we describe 2 key areas that will require ethical deliberation to ensure that AI is applied ethically with socially vulnerable populations such as people who use drugs: (1) perpetuation of biases in data and (2) consent. We offer ways forward to guide and provide opportunities for interventionists to develop substance use-related AI technologies that account for the inherent biases embedded within conventional data systems. This includes a discussion of how other data generation techniques (e.g., qualitative and community-based approaches) can be integrated within AI intervention development efforts to mitigate the limitations of relying on electronic health record data. Finally, we emphasize the need to involve people who use drugs as stakeholders in all phases of AI intervention development.