Digital Health Equity: Addressing Power,
Usability, and Trust to Strengthen Health Systems
Han Koehle1*, Clair Kronk2*, Young Ji Lee3*
1 Student Affairs Health Equity Initiative, University of California Santa Barbara, Santa Barbara,
California, USA
2 Center for Medical Informatics, Yale University School of Medicine, New Haven, Connecticut, USA
3 School of Nursing, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
* All authors contributed equally to this work. Authors are listed in alphabetical order by surname.
Summary
Background: Without specific attention to health equity consid-
erations in design, implementation, and evaluation, the rapid
expansion of digital health approaches threatens to exacerbate
rather than ameliorate existing health disparities.
Methods: We explored known factors that increase digital health
inequity to contextualize the need for equity-centered informatics.
This work used a narrative review method to summarize issues
about inequities in digital health and to discuss future directions
for researchers and clinicians. We searched literature using a
combination of relevant keywords (e.g., “digital health”, “health
equity”, etc.) using PubMed and Google Scholar.
Results: We have highlighted strategies for addressing medical
marginalization in informatics according to vectors of power such
as race and ethnicity, gender identity and modality, sexuality,
disability, housing status, citizenship status, and criminalization
status.
Conclusion: We have emphasized collaboration with user and
patient groups to define priorities, ensure accessibility and
localization, and consider risks in development and utilization of
digital health tools. Additionally, we encourage consideration of
potential pitfalls in adopting these diversity, equity, and inclusion
(DEI)-related strategies.
Keywords
Health equity; gender identity; informatics; health inequities;
healthcare disparities
Yearb Med Inform 2022:20-32
http://dx.doi.org/10.1055/s-0042-1742512
1 Introduction
Pervasive disparities in healthcare access
and health outcomes between populations
reflect the ways in which socioeconomic
power distribution shapes individual risks
and opportunities, including exposure to
violence, discrimination, and environmental
burdens; access to stable and safe housing,
food, and water; and access to appropriate
health services [1]. Health system biases
further compound broader, societal inequi-
ties; biases in health research and practice
contribute to reduced trust and alienation,
further reducing access even when services
are theoretically available [2–4]. Address-
ing health disparities requires major shifts
across all elements of health and healthcare.
Medical informaticians have a crucial role
to play in this larger effort, as digital health
represents a critical point of intervention. If
we do not effectively address bias and access
disparities in digital health, health gaps will
widen and become more difficult to ameliorate; by confronting these challenges directly, however, informaticians are uniquely positioned to help close these gaps.
The World Health Organization (WHO)
in its Global Strategy on Digital Health 2020-
2025 defines digital health broadly, including
virtual care, remote monitoring devices,
smart wearables, tools for data exchange and
sharing, artificial intelligence, and more. We
will address examples of how informatics
can improve or exacerbate health dispari-
ties through digital health tools like patient
portals, telehealth, and machine learning
algorithms, but it is crucial for all elements
of digital health to proactively address bias
and inequity in design and utilization [5].
The COVID-19 pandemic played a major
role in increasing usage and acceptance of
digital health in healthcare [6–8]. Invest-
ments in digital health companies were about
$5.4 billion in the first half of 2020 during
the pandemic [9]. The WHO recognizes that
digital health has the potential to reach more
people and provide them with access to
available health services [10, 11]. The U.S.
National Science and Technology Council
reported that digital health can save both
costs and time for patients, and increase their
access to health services [12]. The pandemic
has also spotlighted health inequity and on-
going failures to address equity in healthcare
and public health [13].
2 Methods
This work uses a narrative review method to
summarize issues about inequities in digital
health and to discuss future directions for
researchers and clinicians. We searched the
extant literature using a combination of rele-
vant keywords (e.g., “digital health”, “health
equity”, “bias”, etc. derived from author con-
sensus outline) using PubMed and Google
Scholar. This outline was developed through author consensus regarding the scope of digital health equity topics; as the scope shifted during the review process, a second round of consensus was sought. In
the first round of searches, we focused on
(1) access and barriers, (2) algorithmic bias,
(3) digital and nondigital health literacy, and
(4) surveillance and safety. In the second
round of searches, we focused on (1) equity
in digital health, (2) digital determinants of
health and digital health equity, (3) epistemic
justice in digital health, (4) aggregate data,
(5) individual data, and (6) user (or non-us-
er) experience. Searches were thus carried
out in the following form for PubMed and
for Google Scholar: “‘digital health equity’
OR ‘health surveillance’ OR ‘algorithmic
bias’…”, etc. based on the consensus topics
to the point of narrative, thematic saturation.
For PubMed, the following approximate
number of results were found in the first
round of searches, from 2019 to 2021:
“access and barriers AND digital health”
(n=224), “algorithmic bias” (n=40), “digital
health literacy OR nondigital health literacy”
(n=93), “surveillance AND safety AND
digital health” (n=24).
For the second round, the following
approximate number of results were found,
from 2019 to 2022: “digital health equity”
(n=14), “digital determinants of health”
(n=2), “epistemic justice” (n=30), “(aggregate
data OR individual data) AND digital health”
(n=13), and “user experience OR non-user
experience” (n=1,411). Responses including
additional sources from reviewers and editors
were also included. Unfortunately, Google Scholar results are personalized, meaning that searches run by different authors likely returned different results, making it difficult to report exact or even approximate result counts.
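For readers who wish to reproduce comparable counts programmatically, a minimal sketch follows; it assumes Biopython's Entrez utilities, a placeholder contact e-mail, and illustrative query strings, and it is not the exact procedure used for this review, which relied on the PubMed and Google Scholar web interfaces.

```python
# Minimal sketch: retrieving approximate PubMed result counts for the
# review's search themes. Assumes Biopython is installed; the e-mail
# address and query strings are placeholders, not the authors' exact ones.
from Bio import Entrez

Entrez.email = "reviewer@example.org"  # required by NCBI; placeholder only

queries = [
    '"digital health equity"',
    '"algorithmic bias"',
    '("digital health literacy" OR "nondigital health literacy")',
]

for query in queries:
    handle = Entrez.esearch(
        db="pubmed",
        term=query,
        datetype="pdat",   # filter on publication date
        mindate="2019",
        maxdate="2022",
        retmax=0,          # only the total count is needed
    )
    record = Entrez.read(handle)
    handle.close()
    print(f"{query}: {record['Count']} results")
```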
Articles were initially screened if they
were published in English and discussed the
topic of interest. Our first round of searches
took place in October and November of
2021, with the secondary round of searches
taking place in February 2022. We prior-
itized our selection toward peer-reviewed
manuscripts; however, we also included
select gray literature such as white papers
for the following reasons: (1) to include a
larger and more comprehensive diversity of
perspectives in the piece; (2) to recognize the
power dynamics which allow for publication
in peer-reviewed journals, and how that priv-
ilege may miss crucial perspectives; and (3)
to consider lived experience in relationship
to informatics. We also prioritized sources
based on recency, preferring sources pub-
lished from 2020 onward; however, select
sources published before that time were
considered if they included information not
covered by more recent literature.
3 Equity in Digital Health
As evidence mounts regarding the role of
structural power and oppression in shaping
individual and population health, the impli-
cations are clear for the ethical and practical
duty of all involved in health promotion
and health care to centrally address equity
and community solidarity in the design and
implementation of health policies and tools.
The Lancet and Financial Times Commission
on governing health futures 2030 urged
the adoption of a values-based framework
for governing health to ensure that digital
technologies support universal health ben-
efits and positive transformations. Their
framework focuses on addressing power
asymmetries, public trust, and universal pub-
lic health through practices grounded in the
foundational values of democracy, inclusion,
equity, human rights, and solidarity [14].
Centering equity in digital health means
balancing improved reach with increased
risk in digital health; for example, people
with highly stigmatized diagnoses may
be more able to access care in specialty
clinics if they can do so through telehealth
services, but data breaches also pose greater
risks for these patients [15]. It means en-
suring that tools meant to expand access
to care, like telehealth for hard-to-reach
populations, do not create a permanent
barrier to screenings and services that re-
quire in-person encounters and hands-on
examination [16]. It means ensuring that
digital tools collect and reference equitable
data sources. A recent Google app designed
to assist dermatologists in diagnosing skin
conditions quickly came under fire when us-
ers reported that it does not work on Black
or Brown skin, maintaining or reinforcing
existing disparities in representation of
skin conditions in dermatological teaching
[17, 18]. This demonstrates the need for
designing electronic health record (EHR)
systems that adequately reflect relevant
categorizations of experience in a given
community. When EHRs fail to adequately
reflect identities and experiences in a given
socioeconomic context, EHRs can present
a technological barrier to safety, confiden-
tiality, and appropriateness in healthcare
[19, 20]. Equity in digital health means
addressing structural power at-large as it
shapes individual health as well as data
collection, use, management, storage, and
sharing in the health system.
3.1 Structured Power Determines
Health Outcomes
The latter part of the 20th century and the early 21st century saw a paradigm shift in individ-
ual and public health from a primary focus on
individual behavior and genetic predetermi-
nation to an increasing focus on macro-level
power dynamics [21–23]. This includes ineq-
uities in exposure to environmental risks such
as pollutants; access to material resources such
as stable and appropriate housing, nutritious
food, and safe drinking water; exposure to war
and other violence; and access to epistemic
resources such as formal education and the
Internet. Health outcomes are mediated both
directly and indirectly by social standing; stig-
matized groups carry both the stress burden
of stigma and structural harms associated with
enacted stigma, violence, and discrimination
across all other areas of society. For this
reason, global health equity must consider
the role of systems that structure stigma on a
global level, such as colonization and white
supremacy [24–26]. These ideological and
political systems have shaped the distribution
of environmental pollutants, of housing, of
food, of violence, and of epistemic norms,
including concepts of demography, health and
illness, and who has authority to participate in
public health strategy.
3.2 Digital Determinants of
Health and Digital Health Equity
Digital health transforms the already-unequal
landscape of health determinants by empha-
sizing access to technology and digital litera-
cy. Access to Internet-capable mobile devices
varies widely between and within countries.
For example, the Pew Research Center re-
ported 100% mobile phone ownership among
adults in South Korea in 2018, versus 64%
in India in the same year [27]. Across the
global South, mobile users are more likely
to use multiple SIM cards, to pay for mobile
usage via prepaid rather than monthly plans;
to primarily use a mobile device that is owned
by the head of their household rather than
a personal mobile device; to use primarily
browser-based rather than app-based mobile
Internet; and to employ browsers that reduce
data consumption by hiding data-heavy ele-
ments from websites [28].
Patterns of smartphone and mobile In-
ternet adoption and use reflect individual
and regional socioeconomic power; digital
health design should consider global and
regional patterns of access and use. Ensuring
that health apps have a well-functioning and
data-frugal browser equivalent will expand
usability for users in the global South, users
in rural areas, and lower-income users. Mo-
bile health tools should account for multiple
users sharing a single device.
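As one illustration of the shared-device recommendation, the sketch below outlines a hypothetical per-person profile model for a mobile health tool installed on a single household device; the profile fields and PIN-based separation are our own assumptions for illustration, not a prescribed design.

```python
# Illustrative sketch: keeping health data separate for multiple users who
# share one device. Names and the PIN mechanism are hypothetical choices.
from dataclasses import dataclass, field
from hashlib import pbkdf2_hmac
from os import urandom


@dataclass
class Profile:
    display_name: str
    salt: bytes
    pin_hash: bytes
    records: list = field(default_factory=list)  # each profile's data kept apart


def hash_pin(pin: str, salt: bytes) -> bytes:
    # Key-stretched hash so a lost device does not trivially expose PINs.
    return pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)


def create_profile(name: str, pin: str) -> Profile:
    salt = urandom(16)
    return Profile(display_name=name, salt=salt, pin_hash=hash_pin(pin, salt))


def unlock(profile: Profile, pin: str) -> bool:
    return hash_pin(pin, profile.salt) == profile.pin_hash


# Usage: several household members share one phone but keep separate records.
profiles = [create_profile("Amina", "4821"), create_profile("Kwame", "7730")]
assert unlock(profiles[0], "4821") and not unlock(profiles[0], "0000")
```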
Within relatively wealthy countries,
access and use patterns also reflect socio-
economic disparities. For example, one study
showed that Pokémon Go, an augmented
reality-based mobile game, provides an
unequal number of spots in which users
can engage with the game depending on the neighborhood in which they play. Predominantly
Black and Hispanic neighborhoods in major
cities in the United States, such as Chicago
and New York, had fewer spots to play the
game than predominantly white and Asian
neighborhoods [29].
Digital health developers should also
consider family dynamics in their designs.
Such dynamics can be affected by various
factors including race or ethnicity, culture, and
social identity. For example, working women
with children may not be able to adopt advice
from stress-relief applications suggesting
spending time with family. These unrealistic
recommendations can cause guilt or increase
stress levels due to a perceived failure to fol-
low recommendations. Social identities and
roles should therefore be reflected in digital
health apparatuses [7, 30]. To reduce the gap,
diverse stakeholders should be involved and
compensated for their contributions from the
beginning of the development phase to reflect
their values and perspectives. For instance,
researchers could recruit ethnic minority or
stigmatized populations in order to reflect
their cultural values and perspectives with the
aim of targeted interventions [20, 31, 32].
A 2021 special issue of Global Policy
addressing digital technology and health
equity spotlights the ways that financial and
political power shape priorities in health
technology, including which technologies are
viewed as important and how functionality
may influence power structures [33]. Authors
of the issue highlighted the “gold rush”
ethos of digital technology, emphasizing
the inherent tension in goals between profit
motives and public health priorities [33].
Other authors specifically named problems
with “philanthrocapitalism” in digital health,
criticizing the reductionist and ineffective
approach of education-based interventions
like the Motech Global Mobile Health
Program [34]. They highlighted risks going
far beyond individual and collective health
including erosion of basic liberties, increase
of social conflict, wasted public funding,
and long-term harm to economic systems
[33]. Developers and institutional adopters
of digital health tools should proactively
and transparently assess for these risks in
collaboration with prospective user and/or
patient groups.
4 Epistemic Justice in
Digital Health
Global health care and policy are organized
around epistemic practices and norms that
are fundamentally entwined with the history
of European global colonization and white
supremacy. The result is a system of knowl-
edge production and sharing that habitually
enacts knowledge-based injustice, including
unjustly discounting the credibility and in-
terpretive frameworks of some knowledge
and knowledge-producers according to
structural prejudices in health knowledge
production and use [35, 36]. The effect of
this form of injustice includes persistent
assumptions, particularly by those situated
within the academic research organizations
in the Global North, that marginalized com-
munities lack the capacity to meaningfully
participate in research or policy develop-
ment; this renews the exclusion of their
perspectives [37]. Instead of centering par-
ticipation on those thought to be capable of
participating, participation should be viewed
as a basic human right, and where capacity to
participate is compromised, capacity should
be actively facilitated [35].
A key element of the epistemology of
health and health care is conceptualizing
human groups, including determining which
groups are medically relevant and naming
and defining those groups. Demography is
one epistemic framework for understanding
health significance in human groups. De-
mographic information is typically defined
as the statistical characteristics of human
populations including, but not limited to,
age, gender identity, ethnicity, education,
and employment status, among many others.
Demographics often form the basis of social
determinants of health (SDoH) frameworks,
which are in turn intimately connected to
health disparities research. However, it has
been noted that SDoH has “lost meaning
within systems of care because of misuse and
lack of context, and large social gradients in
health and clinical outcomes persist” [38].
For instance, race is oftentimes classified as
an SDoH when the actual SDoH is structural
racism. As Crear-Perry et al. note “[by] de-
fining the root causes of health inequities,
we can move the focus of intervention away
from individual blame and misguided theo-
ries of the biological basis of race and eth-
nicity… It is an economic, social, and moral
imperative that we center the experience of
the communities that are the most impacted
when we look for solutions” [38].
4.1 Reporting of Demographic
Information
Demography has historical and ongoing
entanglements with the eugenics movements
and eugenic ideologies, which require belief
in inherent differences between groups in
order to justify disproportionate benefit and
harm to different groups [39]. One risk of de-
mographic frameworks is that they tend
to encourage naturalizing health differences
rather than conceptualizing health differenc-
es between groups as reflective of structural
power and oppression. Yearby reimagines a
SDoH framework which is multi-layered
in approach, considering factors such as
discrimination, civic participation, incar-
ceration, and law [40]. Informaticians must
begin to grapple with these intertwined and
complex systems which are not fully rep-
resented in the health record, by becoming
fluent in social policy and public health,
and examining structural discrimination and
biases in all involved systems [40].
Collecting demographic information
means translating and flattening complex
individual identities and experiences into
universalized categories, often for the ease of
understanding of the academic Global North.
This privileges normative group categories
and models that are localized to racial, eth-
nic, gender, class, and religious perspectives.
Demographic tools in digital health must be
designed to accommodate epistemic local-
ization and feedback responsiveness. Digital
health developers should adopt a starting
assumption that demographic categories
may need to be localized and re-localized
to adapt to dynamic developments of both
social categories themselves and global
understandings of the health significance of
different types of social categories.
Research connecting demographics to
root causes of biases requires appropriate
recording and description. It requires trustworthiness not only in the patient-clinician relationship, but also in the patient-informatician, clinician-informatician, and community-clinician relationships, all connections which lead to better health outcomes [41–43]. Designing the best
questions and answers does not always mean
collecting the best data when training and
education are not present and persistent in
these relationships [20, 44]. It is also important to note significant differences in how those same relationships are shaped by mistrust of medical systems.
Many communities have specific histories of
abuse, neglect, and violence originating in
medical systems [45–47]. Others now claim
such histories with no such basis [48–50].
Treating both situations as anti-science and
anti-medicine on equal footing is a denial
of systematic and structural abuse and a
likening of that abuse to conspiracy theo-
ries. Modern medicine was built on white
supremacist frameworks and practices such
as involuntary experimentation on enslaved
Black people, forced sterilizations of Black
people and Indigenous peoples, and rel-
egation of structural racism to supposed
“genetic” differences based on scientific
racism [46, 51–60]. LGBTQIA+ patients are
regularly turned away at the door and refused
care, often legally, based on their sexual
orientation or gender identity [20, 61]. With
this reality in mind, why should marginalized
patients report demographic characteristics?
Of course, as with all health surveillance,
demographics can help elucidate larger pub-
lic crises: issues of racism, sexism, ableism,
homophobia, transphobia, and other forms
of discrimination. But that is just one step.
Determining how providers need to act to
counteract these large oppressive systems is
crucial to the future of healthcare.
However, even from the provider side,
there are significant issues of marginaliza-
tion based on demographic characteristics.
Marginalized providers face everything from
microaggressions to direct violence. Some
examples include an elderly white woman
telling a Black doctor not to “waste [their]
affirmative action” [62], a patient’s parent
reporting that she’s glad to have a “usual
straight” doctor instead of someone who is
gay [63], and patients threatening to shoot and
kill Asian nurses [64]. Racial discrimination
against hospital employees in the wake of
the COVID-19 pandemic led to several
high-profile, multi-million dollar lawsuits
[65–67]. On the other hand, research has
shown that when marginalized patients are
treated by people who look like them or have
their same experiences, patient outcomes
are better [68–70]. But if providers cannot
address the structural racism in their own
ranks, how can one even begin to tackle
persistent patient-side structural racism?
National Medical Association (NMA) President Leon McDougle noted
in an interview just how racism continues
to prevail in medical communities: “The
root cause is systemic racism dating back
to chattel slavery… This is a societal issue
that will require cross-sector investment and
collaboration to remedy” [71].
In recognition of structural failures in
trust and the continued vulnerability of
disclosures within health contexts, the
collection of individual patient or user data
should be structurally collaborative. Disclo-
sure should be prompted in a way that makes
clear why particular information is being
collected and how it will be used; disclosure
should be optional as much as possible; when
disclosure is a precondition of service, this
should be explained; and when data is used
on the user or patient’s behalf, this should
be made transparent. A core demonstration
of epistemic humility in global health is
trusting a patient or user to reasonably judge
whether a particular disclosure is safe and
facilitating their decision-making process
by indicating why a particular element is
relevant to their care.
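A minimal sketch of what such structurally collaborative disclosure could look like at the data-collection layer is given below; the class and field names, and the pronoun example, are hypothetical illustrations rather than an existing standard or product design.

```python
# Illustrative sketch: a disclosure prompt that records *why* an item is
# collected, whether it is optional, and how it will be used, so the patient
# or user can make an informed choice. All names here are hypothetical.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class DisclosurePrompt:
    item: str                    # what is being asked
    purpose: str                 # why it is being collected
    uses: List[str]              # how it will be used on the person's behalf
    optional: bool               # disclosure should be optional where possible
    precondition_of_service: Optional[str] = None  # explained when required


@dataclass
class DisclosureResponse:
    prompt: DisclosurePrompt
    value: Optional[str]         # None when the person declines to disclose
    declined: bool = False


pronoun_prompt = DisclosurePrompt(
    item="Pronouns",
    purpose="So staff can address you correctly in person and in letters.",
    uses=["clinic communications", "care team handoffs"],
    optional=True,
)

# A declined disclosure is a valid, first-class response, not missing data.
response = DisclosureResponse(prompt=pronoun_prompt, value=None, declined=True)
```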
4.2 Replicating (In)Equity in Design
One key area in which digital health influ-
ences health outcomes via epistemic justice
or injustice is gender and sexuality. Digital
health applications can allow LGBTQIA+ people to access care more privately and can act to
reduce stigma [72]. However, digital health
applications rarely consider gender equity in
their design [30]. Studies reported that most
existing applications do not adopt standards
including gender identity, assigned gender at
birth, or gender markers on health insurance
documents, and that they do not consider di-
versity in gender, sex, and sexual orientation
(GSSO) data [73, 74]. Work by Kronk et al.
[20], McClure et al. [75], and Davison et
al. [76] has surveyed the current landscape
of GSSO data in EHRs and provided newer
frameworks to reassess data collection
standards. Recommendations in this work
included an overhaul to the existing Health
Level 7 (HL7) sex and gender model, as well
as implementation of a two-step process (of
gender identity and assigned gender at birth
[AGAB]) in clinical contexts.
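As an illustration of the two-step approach described above, the sketch below models gender identity and assigned gender at birth as separate, optional items with room for free-text self-description; the value sets are abbreviated examples and the field names are our own, not a reproduction of the HL7 model or any specific EHR implementation.

```python
# Illustrative sketch of "two-step" gender data capture: gender identity and
# assigned gender at birth recorded as separate, optional items. Value sets
# are abbreviated examples only, not a published standard.
from dataclasses import dataclass
from typing import Optional

GENDER_IDENTITY_EXAMPLES = {
    "man", "woman", "non-binary", "another identity (self-described)",
    "prefer not to disclose",
}
ASSIGNED_GENDER_AT_BIRTH_EXAMPLES = {
    "female", "male", "intersex/another", "prefer not to disclose",
}


@dataclass
class TwoStepGenderRecord:
    gender_identity: Optional[str] = None           # step 1, patient-reported
    self_description: Optional[str] = None          # free text for local terms
    assigned_gender_at_birth: Optional[str] = None  # step 2, asked separately
    recorded_name: Optional[str] = None             # name used in communications
    pronouns: Optional[str] = None


record = TwoStepGenderRecord(
    gender_identity="non-binary",
    assigned_gender_at_birth="prefer not to disclose",
    recorded_name="Ash",
    pronouns="they/them",
)
```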
Furthermore, GSSO data fields have
been built using Eurocentric ideas of gender
identity and sexual orientation, which may
be different from those of non-Eurocentric
countries outside of the United States, Can-
ada, Australia, Germany, and France, for in-
stance [61]. As Kronk and Dexheimer point
out: “[a] small segment of non-Eurocentric
identities were described [using Eurocentric
terminology like] ‘transgender,’ ‘transsex-
ual,’ or ‘transvestite’... such as hijra being
described as ‘transsexuals’” [61]. In order
to disambiguate GSSO terminology, Kronk
created the GSSO ontology, containing over
14,000 terms on those topics [77]. However,
the terminology is currently only available
in English, and it also possesses a relatively
Eurocentric lens by virtue of its authorship.
Constructing more collaborative datasets
which consider multiple cultural as well as
linguistic perspectives and translating those
affirming terminologies into clinical care
through vocabulary standards are essential
for better care outcomes in trans and gen-
der-marginalized populations.
Inequity in design has also led to current
digital health systems that do not reflect
the circumstances of women, particularly
women from racial or ethnic minority back-
grounds and low-income women. National
data shows that Black adults have similar
rates of internet access as white adults, and
show the highest percentage of smartphone
ownership among race or ethnic groups
[78]. However, usage of digital health is
significantly low among women of color.
Black women showed a low enrollment
rate in digital pregnancy services, physical
activity applications, and digital health for
sexually transmitted diseases compared to
women from other race or ethnic groups.
Being excluded or not participating in digital
health can harm not only women’s health, but
also that of their families because women are
often responsible for their care [30]. These
gaps perpetuate existing sexual and gender
inequities [73]. Digital health should include
diverse groups from the beginning of the
development phase to reflect their values
and perspectives.
4.3 Health Information Standards
It is imperative that we collect data, design
algorithms, and evaluate applications equal-
ly and fairly by considering all possible
sources of bias. However, there is
no clear definition or standard of “fairness”
in machine learning algorithms [79]; thus,
it is difficult to measure the concept [80].
Furthermore, the disconnect between the
public and private sectors in digital health can
also lead to racial bias in algorithms used in
patient care. The U.S. Food and Drug Admin-
istration has highlighted that privately funded
machine learning algorithms used in health
care should have the same ethical standards as
those developed by publicly funded research
(e.g., the National Institutes of Health, USA).
Publicly funded research is usually peer
reviewed and evaluated by domain experts
who can determine whether the proposed
algorithms contain biases. Also, studies are
approved by their institutional review boards
(IRBs), which improves oversight of methods.
However, the private sector can face conflicts
between protecting intellectual property and
being transparent about algorithmic design and
inputs. Currently, there is no broadly agreed
upon standard for evaluating algorithm-based
systems, and there are no federal, state, or
local regulations governing the use of these
algorithms [80]. Regulators must understand
structural racism to evaluate commercialized
algorithms perpetuating racial bias and to
oversee data flows in the algorithm loop [80,
81]. Concepts of fairness in health informa-
tion must be developed through participatory
and equitable processes and not centered on
the epistemic perspective of researchers in
the Global North.
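To make concrete why “fairness” resists a single definition, the sketch below computes two common group-level metrics, selection-rate (demographic parity) difference and true-positive-rate difference, on an invented toy prediction set; the same predictions satisfy one criterion while clearly failing the other.

```python
# Toy illustration: two common group-fairness metrics can tell different
# stories about the same predictions. All data below are invented.
from collections import defaultdict


def group_rates(y_true, y_pred, groups):
    """Per-group selection rate and true positive rate."""
    stats = defaultdict(lambda: {"n": 0, "pred_pos": 0, "pos": 0, "tp": 0})
    for t, p, g in zip(y_true, y_pred, groups):
        s = stats[g]
        s["n"] += 1
        s["pred_pos"] += int(p == 1)
        s["pos"] += int(t == 1)
        s["tp"] += int(t == 1 and p == 1)
    return {
        g: {
            "selection_rate": s["pred_pos"] / s["n"],
            "tpr": s["tp"] / s["pos"] if s["pos"] else float("nan"),
        }
        for g, s in stats.items()
    }


# Groups A and B receive positive predictions at the same rate...
y_true = [1, 1, 0, 0, 0,   1, 1, 1, 1, 0]
y_pred = [1, 1, 0, 0, 0,   1, 1, 0, 0, 0]
groups = ["A"] * 5 + ["B"] * 5

rates = group_rates(y_true, y_pred, groups)
parity_gap = abs(rates["A"]["selection_rate"] - rates["B"]["selection_rate"])
tpr_gap = abs(rates["A"]["tpr"] - rates["B"]["tpr"])

# ...so demographic parity looks satisfied (gap = 0.0), yet group B's true
# cases are missed half the time (TPR gap = 0.5): equal opportunity fails.
print(rates)
print(f"demographic parity difference: {parity_gap:.2f}")
print(f"true positive rate difference: {tpr_gap:.2f}")
```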
5 Aggregate Data
5.1 Algorithmic Bias
Currently, many health systems are adopting
machine learning algorithms and software
to manage health using patient data such
as clinical information, socio-demographic
information, laboratory values or diagnostic
images [81, 82]. Although machine learning
algorithms hold great potential for reducing
health care cost and increasing the efficiency
of workflow, these algorithms can exacerbate
existing disparities and introduce unexpected
ones [83]. Biases can be reflected in various
stages of algorithm development, from col-
lecting data to designing and implementing
algorithms in clinical practice.
Vulnerable populations in health care, such as individuals marginalized due to sexual orientation or gender identity, Black and Latine¹ populations, and those with low socioeconomic status, experience significant baseline health disparities. Those pre-existing disparities and biases have the potential to be perpetuated by machine learning algorithms, reinforcing deeply rooted stigma and discrimination [86].
¹ We use the term Latine here instead of Latinx, as it has been called “a more organic alternative” to Latinx, being designed to work with the Spanish language, as it can be more easily pronounced and conjugated in Spanish than “Latinx” [84]. Additionally, the usage of “e” in Latine is native to gender-neutral words in Spanish, such as in the term “estudiante” [85]. However, we would like to emphasize that it is important to consider the terminology individuals utilize themselves in individual contexts.
Recently, Obermeyer et al. examined an
algorithm used in the U.S. health system that
identified patients needing high-risk care
management [87]. This study reported that
the algorithm contained racial bias in cases in
which race was self-reported. Furthermore,
those models can still have low performance
even when algorithms take racial and cultural
factors into account. Coley et al. (2021)
reviewed two algorithms that predict suicide
risk across racial and ethnic groups [88].
These algorithms showed different results
across racial and ethnic groups: they accu-
rately predicted the risk for white, Hispanic,
and Asian patients while they less accurately
predicted the risk for Black and American
Indian/Alaskan Native patients. The latter
groups did not report their race or ethnicity
in the records [88]. Furthermore, evidence
around the genome showed that the collected
dataset did not represent diverse racial and
ethnic groups [89, 90]; most of the genomic
databases were collected from people with
European ancestry. Once researchers devel-
op treatment strategies based on the biased
data, excluded populations such as Black
and Indigenous people may not experience
the same treatment efficacy, which could
lead to harmful outcomes. Thus, it is crit-
ical to improve accuracy and performance
of predictive models for disadvantaged
populations by ensuring their inclusion in
such models. To bridge the gap, there is a
need for collaboration via multidisciplinary
system development teams from diverse
backgrounds [81]. Otherwise, health dis-
parities will be perpetuated and further
embedded within society, leading to greater
health inequities [91].
As stated previously, there are no broad-
ly agreed upon standards for evaluating
algorithm-based systems [80]. Recently, re-
searchers proposed MINIMAR (MINimum
Information for Medical AI Reporting), a
new framework “describing the minimum
information necessary to understand in-
tended predictions, target populations, and
hidden biases, and the ability to generalize
these emerging technologies” [92]. This
framework can identify how data and infor-
mation are collected to train a model with
reduced biases and equity issues. Ideally this
new framework can be leveraged to improve
equity in AI models.
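As a concrete illustration of what minimum reporting could look like in practice, the sketch below records MINIMAR-style information about a hypothetical model as structured metadata. The field groupings loosely paraphrase the framework's reporting areas (study population and setting, patient demographics, model architecture, and model evaluation); the class name and all example values are our own invented placeholders rather than the published MINIMAR template.

```python
# Illustrative sketch: recording MINIMAR-style minimum information about a
# clinical prediction model as structured metadata. Field groupings loosely
# paraphrase the framework's reporting areas; the example values are invented.
from dataclasses import dataclass, field


@dataclass
class ModelReport:
    # Study population and setting
    data_source: str
    cohort_criteria: str
    # Patient demographics in the training data
    demographics_reported: dict = field(default_factory=dict)
    # Model architecture
    model_type: str = ""
    features: list = field(default_factory=list)
    intended_use: str = ""
    # Evaluation, transparency, and bias mitigation
    evaluation_metrics: dict = field(default_factory=dict)
    known_biases: list = field(default_factory=list)
    bias_mitigation: str = ""


report = ModelReport(
    data_source="single academic health system EHR, 2015-2020 (hypothetical)",
    cohort_criteria="adults with two or more primary care visits",
    demographics_reported={"age": "summary", "race_ethnicity": "self-reported",
                           "gender_identity": "not collected"},
    model_type="gradient-boosted trees",
    features=["diagnoses", "utilization", "labs"],
    intended_use="flag patients for care-management outreach",
    evaluation_metrics={"AUROC_overall": 0.81, "AUROC_by_group": "reported"},
    known_biases=["cost used as a proxy for need in the outcome label"],
    bias_mitigation="outcome relabeled to clinical events; subgroup audit",
)
```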
5.2 Surveillance and Safety
Mass health surveillance during the
COVID-19 pandemic has proven indis-
pensable, assisting public health institutions
and governments immensely with nearly
real-time decision-making capabilities.
However, these systems led to nearly uncon-
trollable surveillance creep, and have been
used by various countries to invade privacy
to an extreme degree, such as using facial
recognition to track infected persons [93],
all for the “greater good” [94].
Meanwhile, the security of health data appears to be increasingly compromised. Over the last two years, millions of health-related records, including sensitive information such as Social Security numbers, health conditions, and medication lists, have been exposed.
HIPAA Journal reports 642 data breaches
in the United States involving at least 500
records in 2020 alone, theoretically leaking
information equating to nearly 82% of the
U.S. population [95]. The sale of records on
the dark web can net up to $1,000 USD per
record, which can then be used for purposes
of extortion, coercion, and identity theft
[96]. Choi, Johnson, and Lehmann showed in 2019 that these data breaches are
associated with deterioration in timeliness
of care and patient outcomes [97]. But these
breaches have gone even further in directly
impacting outcomes: in 2021, an infant
allegedly died due to care issues related to
a hospital ransomware attack [98].
Social media platforms and mobile de-
vices have only increased vulnerabilities
and highlighted myriad issues with digital
systems. In 2021, a former Meta (previously
Facebook) employee leaked thousands of
documents, showcasing how Meta ampli-
fied the voices of the anti-vaccination move-
ments and other medical misinformation.
Imran Ahmed, of the Center for Countering
Digital Hate, noted that nothing was done
because “engagement is the only thing that
matters… [it] drives attention and attention
equals eyeballs and eyeballs equal ad rev-
enue” [99]. Additional documents clearly
showed that Meta knew that Instagram use
was strongly associated with depression,
anxiety, and eating disorders [100]. Infor-
maticians in academia and industry need to
be aware of these vulnerabilities, advocate
for more individual-level and system-wide
protections, and work to educate patients
and providers on how their information will
be used and to whom it is available[101],
especially when considering vulnerable
populations, such as adolescents [102].
For providers, this environment can present a difficult gap to bridge. Over 4,000 anti-vaccination
protesters clashed with police in Athens
in July 2021 [103]. Fake vaccinations and
vaccination documentation run rampant
[104]. “We must insist that trust hospitals…
be held accountable for their actions”,
one waste pickers’ advocate noted [105].
Medical providers sit squarely at the center of this chaos.
Providers cannot be apolitical actors, and
paths need to be opened for more equita-
ble patient, and community, advocacy by
providers [106].
6 Individual Data:
Confidentiality, Stigma, and
Criminalization
Balancing the importance of health surveil-
lance with security is critical to maintaining
public health. Such surveillance is neces-
sary to eliminate sources of health prob-
lems larger than just one person, including
pathogenic spread and behavior, workplace
hazards, housing conditions, and water
and air quality, among others. For example,
the water crisis in Flint, Michigan was ig-
nored and largely dismissed by authorities
until engineer Marc Edwards and pediatri-
cian Mona Hanna-Attisha showcased the
presence of water lead levels and its effects
on blood lead levels [107]. However, even
though extensive work showcased that lead
levels in Flint had been lowered to levels
safe for human consumption, public trust
had been broken: “The anger, the lack of
trust, it’s all justified,” Senator Jim Ananich
reported [107]. The very next year would
see one of the most infamous medical mis-
information movements in world history
that focused on vaccine resistance, one that
would generate upwards of $1.1 billion in
annual revenue for social media sites [108].
6.1 Provider-Side Reporting on
Health-Related Statuses or Conditions
On 1 September 2021 in Texas, Senate Bill 8
(SB8) went into effect, banning abortion
at around six weeks, part of a continued
assault on reproductive health rights. Six
months previously in Arkansas, House Bill 1570
(HB1570) effectively banned gender-affirm-
ing care for transgender youth, signaling a
mass introduction of anti-transgender bills
after the December 2020 Bell v Tavistock
case, which was only overturned in Septem-
ber 2021. Both acts effectively criminalized
every aspect of their respective areas of care:
making it illegal to provide the care itself,
resources concerning the care, and any as-
sistance related to administering that care.
With that in mind, transgender patients
may feel uncomfortable providing informa-
tion about gender-affirming medications,
preferring to engage stealthily in medical
encounters and to seek grey or black-market
alternatives. Individuals seeking abortion
services may have to cross state or national
borders for care. In an environment where a
person can be prosecuted for manslaughter
as the result of a miscarriage, handcuffed
and restrained while in labor, forced to
undergo Caesarian section or blood trans-
fusion, or charged under drug trafficking
statutes for “delivering drugs to an infant
through the umbilical cord,” discussing
medically salient information, or even
seeking out prenatal care, becomes a severe
safety issue [109–111]. How informaticians
present this information in systems can
exacerbate these problems.
Additionally, providers have been known
to attempt to cover their mistakes and
discriminatory actions, and to help other
providers do so as well. In 2020, a trans man
in the United Kingdom undergoing metoid-
ioplasty had vaginectomy performed without
consent, and another provider modified the
consent form afterward in an attempt to avoid
detection. The damage done, a fundamental
breach of provider-patient trust, resulted in
mild penalties, with one provider suspended
for five months and the other for one year
[112]. Cases of intersex genital mutilation
(IGM) are not much better: although the World Health Organization (WHO) depathologized transness in 2018, intersex conditions had “no end in sight for pathologisation” [113]. One account notes “[t]he tendency of
the medical profession to ‘cover its tracks’
through providing false information…
The mingling of damage both to intersex
people’s bodies, and to their core relation-
ships through… professional betrayal”
[114]. Informaticians become involved in
these processes by codifying these issues,
oftentimes in clinical code sets such as
SNOMED CT, and then those sets are used
by researchers who assume pathology. For
instance, until early 2022, SNOMED CT
codified “sodomy” as a disorder. Today,
SNOMED CT codes still pathologize trans-
gender people under the label of ‘gender
identity disorder’ despite calls to remove
such information, and the term transgender
still appears in problem lists [20, 115]. In
general, while informaticians can create
and enforce systems which are more ac-
countable, careful consideration should be
made in deciding what should and should
not be recorded, and who that recording
truly benefits. When it comes to patients
with disabilities, Dr. Lisa Iezzoni, a profes-
sor of medicine at Harvard Medical School,
reported in 2021 that 80% of physicians she
surveyed “viewed quality of life of people
with disabilities [as] worse than that of other
[nondisabled] people” and that only around
41% of physicians felt confident in their
ability to provide the same quality of care
to patients with disabilities as those without
[116]. Integrating disability considerations
into health care systems could potentially
help close this chasm. Mudrick et al. found
that embedding disability accommodation
needs within the EHR was useful in visit
planning, but that the structure needed to
be more flexible and more integrated with
existing EHR infrastructure, such as with
scheduling [117]. However, there has been
little, if any, research regarding how people
with disabilities feel about the current EHR
landscape, what they would want or not want
represented, or the relationship between that
representation and quality of care. As Turk
and McDermott noted in 2018, “[in] general,
there are few articles that focus on” disabled
populations [118]. More research is needed
in this domain, but it can certainly pull from
the extensive work of scholars in the fields of
disability studies and crip theory [119–121].
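A minimal sketch of the kind of structured accommodation record this line of work points toward, written so that a scheduling workflow could query it during visit planning, is shown below; the field names and checklist logic are hypothetical illustrations, not the instrument studied by Mudrick et al.

```python
# Illustrative sketch: patient-reported accommodation needs stored as
# structured data that visit scheduling can query. Field names and the
# checklist logic are hypothetical, not Mudrick et al.'s implementation.
from dataclasses import dataclass, field


@dataclass
class AccommodationNeeds:
    patient_id: str
    mobility: list = field(default_factory=list)        # e.g., wheelchair access
    communication: list = field(default_factory=list)   # e.g., ASL interpreter
    exam_equipment: list = field(default_factory=list)  # e.g., adjustable table
    scheduling_notes: str = ""                           # e.g., longer slot


def visit_checklist(needs: AccommodationNeeds) -> list:
    """Items the scheduler should arrange before the appointment."""
    checklist = needs.mobility + needs.communication + needs.exam_equipment
    if needs.scheduling_notes:
        checklist.append(needs.scheduling_notes)
    return checklist


needs = AccommodationNeeds(
    patient_id="example-123",
    communication=["ASL interpreter"],
    exam_equipment=["height-adjustable exam table"],
    scheduling_notes="book a 40-minute slot",
)
print(visit_checklist(needs))
```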
6.2 Effects of Data Breaches on
Patients
In areas where mental health-related stigma
is high, leaks and breaches of sensitive infor-
mation can be extremely lucrative for those
obtaining such information. Following a data
breach of Vastaamo in Finland, nearly 30,000
people were extorted, resulting in 25,000 po-
lice reports [15]. Familial abuse, histories of
rape, terminal conditions, suicidal thoughts
and more were released online for all to see
[15]. Retraumatization due to data breach-
es has been linked to anxiety, depression,
suicidal thoughts, and even post-traumatic
stress disorder (PTSD).
Release of information related to physical
illness has led inadvertently to similarly
bleak outcomes. In 2020, Colombian trans
woman Alejandra Monocuco was left to die
by paramedics after they learned she was
HIV-positive [122]. In 2018, a Honduran
trans woman seeking asylum, Roxana Her-
nandez, was left to die in ICE (U.S. Immi-
gration and Customs Enforcement) custody
after suffering from AIDS-related illness
and being refused treatment [123]. Suicidal
ideation and depression have been tied to
diagnosis of sexually-transmitted infections
(STIs) and stigma following infection [124].
Stigma and misinformation related to STIs
run rampant, and disclosure of health
information without consent could lead to
criminal prosecution in some cases.
Health information has also been used
illegally in intelligence efforts, seeding
public mistrust of public health programs.
For instance, a Pakistani physician allegedly
helped the CIA run a fraudulent hepatitis
vaccine program in order to obtain DNA
samples of Osama bin Laden, leading to
bin Laden’s execution by U.S. operatives.
This event, as described, violates medical
neutrality as outlined in the Geneva Con-
ventions, and led to exacerbation of mistrust
of medical systems in Pakistan [125]. The
U.S. arm of Save the Children, which legiti-
mately organized hepatitis B vaccinations in
Pakistan, was forced to evacuate the country.
Refusals of the polio vaccination spiked
and medical personnel became victims of
violent attacks [125]. Fake videos spread
like wildfire in 2019, claiming that polio
vaccines cause severe illness, leading to a
mob of 500 setting fire to a health clinic in
Peshawar [126].
6.3 Reporting of Omics-Related Data
With the advent of newer technologies, like
CRISPR-Cas9, concerns continue to mount.
In late 2019, a Chinese court sentenced He
Jiankui, a man who claimed to have created
the world’s first gene-edited babies using
CRISPR, to three years in prison for “ille-
gal medical practice” and fined him 3 million
yuan (USD$430,000) [127]. Even James
Wilson, the primary investigator involved
with the tragic death of Jesse Gelsinger,
has come to warn against reenacting the
“hyperaccelerated transition to the clinic”
of the 1990s [128]. From an informatics
standpoint, EHR infrastructure, interopera-
bility, standardization, quality assurance, and
privacy and data-security considerations are
necessary for bridging the gap toward more
ethical and equitable clinical trials research
in the wake of Gelsinger’s death [129, 130].
The growth of consumer DNA-analysis services over the past decade, such as 23andMe and Ancestry.com,
has led to numerous ethical and moral de-
bacles. In April 2018, law enforcement used
“genetic genealogy” approaches to identify
the so-called ‘Golden State Killer’ who
was last active in 1986 [131]. However, the
legal process was shaky, as police avoided a
requirement for a court order by uploading
sequence data cobbled together from old
crime scene samples. For instance, the 2014
wrongful identification of Michael Usry as a suspect based on a
partial match in a DNA database showcased
significant privacy concerns [131]. Further, because the U.S. Genetic Information Nondiscrimination Act (GINA) of 2008 does not cover long-term care, life, or disability insurance,
released genetic information could be used to deny such coverage [132–134].
This means that individuals need to carefully
weigh risks and benefits of genetic testing,
which includes direct-to-consumer sites
like 23andMe and Ancestry.com, as a test
result may be required to be disclosed to
insurers. For instance, in September 2015, a
36-year-old woman with no current medical
issues was denied life insurance because of
a positive BRCA1 gene result [133]. This
hurts patients twice over: denying necessary
financial protections and making prospective
care impossible. An individual whose sister
learned she had a BRCA gene put it best:
“This is not the calculation I want to be
doing when it comes to my health” [133].
Recording genetic-related data in the EHR
or in other medical systems may, in these
select circumstances, lead to worse health
outcomes for patients.
The benefits of measuring genetic in-
formation are undeniable, yet without firm
patient protections it stands to be exploited
by governments and corporations at the
expense of the health and well-being of the
individual. Additionally, from an informatics
perspective, the availability of genomic data
has far outpaced the capacity to analyze it
effectively; there is often a reluctance to
share data because of its sensitive nature, and
EHRs have not implemented mechanisms
to assist in data collection [135, 136]. Some
groups have attempted to integrate genetic
data into the EHR, while others have charac-
terized further issues with implementation,
naming the current barriers to implemen-
tation as lack of standards-compliant data
structures, lack of means for storage of such
data, and representation of such data on a
patient level [137, 138].
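As one illustration of what keeping provenance and consent attached to a genetic result could look like, a minimal, standards-agnostic sketch follows; the class, its fields, and the example values are assumptions for illustration and do not implement any existing EHR or FHIR genomics specification.

```python
# Illustrative sketch: a minimal container for a genetic finding that keeps
# provenance, population context, and sharing consent attached to the result.
# This is a standards-agnostic example, not an existing EHR or FHIR model.
from dataclasses import dataclass
from typing import Optional


@dataclass
class GeneticFinding:
    gene: str
    variant: str                       # e.g., an HGVS-style description
    interpretation: str                # e.g., "pathogenic", "uncertain"
    assay: str                         # how it was measured
    reference_populations: list        # datasets the interpretation relied on
    consent_for_research_sharing: bool
    consent_notes: Optional[str] = None


finding = GeneticFinding(
    gene="BRCA1",
    variant="c.68_69del (example)",
    interpretation="pathogenic",
    assay="targeted panel (hypothetical lab)",
    reference_populations=["gnomAD", "H3Africa (where available)"],
    consent_for_research_sharing=False,
    consent_notes="patient declined sharing beyond direct care",
)
```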
Bombard and Hayeems advanced the idea
that digital decision support tools broaden
“the reach and efficiency of genome medi-
cine by enabling easier access to testing and
counselling resources” while also noting
the importance of “a human touch” [139].
This led them to suggest a
hybrid digital model of human and computer
interaction. Importantly, the pair note that
“[t]he quality of care afforded by digital
solutions is only as good as the data input
into these systems… Existing biases may
therefore be reinforced by digital solutions,
disproportionately disadvantaging those
already marginalized by genomic medicine”
[139]. Landry et al. echo this statement stat-
ing that “[the] lack of diversity in genomic
research can affect the understanding of the
relationships between genes and disease in
unstudied populations including, erroneous
rare variant-disease associations in poorly
studied populations, and insufficient evi-
dence regarding the effect of variants on
disease in diverse populations” [140].
Often, informaticians, as end-users of
data collected elsewhere, are stuck in a
difficult situation. We need to look further
afield for equitable information, such
as including data from the Human Heredity
and Health in Africa (H3Africa) consortium,
or from the gnomAD population database
[141]. When no other data exist, we need to be clear about data biases in all of our work so that tools do not overstep their limitations,
and to make clear calls for continued eq-
uitable data collection. Finally, we need to
consider the context of contemporary and
historical mistreatment in data collection,
and to not discount the present reality of
people represented by data points.
7 User (or Non-User)
Experience
7.1 The Digital Divide
Due to the pandemic, many health services
and resources, such as telehealth, have
moved to the internet. Early studies in digital
health equity have focused on the “digital
divide”, the inequities between those who
have access and those who do not have ac-
cess to technologies [142]. Studies show that
people facing disadvantaged circumstances,
such as limited income to afford high-speed
internet and advanced mobile devices, are
unlikely to have equal access to digital health
[143–145]. This unintentional exclusion can
lead to further disadvantage, thus worsening
health inequity [145]. Developing countries
may have additional issues related to the digital
divide when health systems are under-re-
sourced and beholden to unsustainable
financing mechanisms. Equity of access to
digital health must be considered as part of
a complex system [146–148]. Even if people
have access to technology, digital health eq-
uity cannot be reached without the ability to
use the technology and make sense of digital
health applications [149].
7.2 Usability & Accessibility
Digital health resources can help facilitate
data-based decision making for patients and
providers. However, this requires patient and
provider–along with key others such as fam-
ily members and interpreters–to be fully able
to access and use these resources. A patient
portal, care platform, or other digital tool
must be accessible to users with intellectual
and communication-related disabilities as
well as their family members, interpreters,
or other key users who may have different
access needs than the patient [150]. Cur-
rently, patient portals are often inaccessible to users who rely on assistive technology and to users with communication-related and intellectual disabilities [20, 150]. Additionally, patient portals can create access barriers for trans users. Many such
patients may have legal gender markers that
are not represented in patient interfaces,
which can encourage stigmatizing treatment
by providers, billing errors, inappropriate
forms of address in procedurally gener-
ated communications, and worse health
outcomes associated with loss of trust and
avoidance of care.
7.3 Telemedicine and Remote-
Presence Health Care
The use of telehealth has dramatically
expanded during the COVID-19 pandemic
to reduce virus transmission and provide
low-cost services. Telehealth could mean
increased accessibility to healthcare by
reducing the time it takes to access care,
the cost of providing care, and the need for
patients and providers to share a physical lo-
cation. However, there is also potential to re-
inforce health inequities by reducing access
for people with disabilities and those with
less access to high-bandwidth technology
or digital literacy. A further risk is creating
persistent disparities in access to assessments
that are generally only available in-person;
for example, it is generally not possible to
assess for pneumonia by listening to lungs,
to measure blood pressure, or to assess fetal
heart rate in telehealth contexts [16]. If
telehealth is a central strategy for reducing
access barriers, this could mean that already
medically marginalized communities receive
care that routinely misses key assessments.
7.4 Digital Literacy
While telehealth can help reach patient
populations who are currently underserved,
including incarcerated populations and
rural populations, these groups often
lack access to high-speed internet, secure
devices, and digital literacy [16, 77–79].
Other groups that currently face structural
barriers to accessing high-quality care, like
older adults, marginalized ethnic and racial
groups, patients with low socioeconomic
status relative to their home countries, and
patients located in countries that are low-
and middle-income on a global scale, also
face digital literacy and access barriers [16].
Telehealth-based strategies must consider
these co-existing barriers.
In the same way that a provider in an
in-person appointment helps orient the
patient to the clinical environment by in-
dicating where to sit, what to expect, etc.,
the provider in a digital encounter must be
prepared to assist the patient in adopting
the new format or system and address any
apprehensiveness about the efficacy of tele-
health interventions [150, 151]. This could
mean providing patients with the opportunity
to make a test call in advance of their first
telehealth appointment to facilitate comfort
with the platform and process [152].
8 Potential Futures
It may be easy to look at the current health
equity landscape as irreparable, having been
built on hundreds of years of oppression,
marginalization, and discrimination. In this
work, we have emphasized collaboration
with user and patient groups to define pri-
orities, ensure accessibility and localization,
and consider risks in development and uti-
lization of digital health tools. Additionally,
we encourage consideration of potential
pitfalls in adopting these diversity, equity,
and inclusion (DEI)-related strategies.
When we think about creating a di-
verse, equitable, and inclusive informatics
landscape, the task is not simply the creation of a committee of marginalized persons who make recommendations to another, mostly indifferent entity. Several independent groups
have already put together such recommenda-
tions, which have been available for years. It
is not about only updating one’s language.
It is about making a material difference. As
Tatiana McInnis phrased it: “These words
[diversity, equity, and inclusion], and the
intentions they seek to express, are well
and good, yet they fall flat as [DEI] offices
fail and refuse to address systemic white
domination, anti-Blackness, misogyny or
any group-specific violence in their mission
statements” [153].
One significant problem with DEI of-
fices and organizations is that they expect
this work, which effectively retraumatizes
marginalized persons every day, to be free.
DEI is often built on a voluntary model, as a
second career that marginalized people are expected to take on, with the unspoken threat that things
will continue to be the way they are without
this uncompensated labor. In this sense, the
lives and labor of marginalized people are
treated as commodities to add to the product
environment of larger entities [154].
In one author’s experience, she
was told up-front that the DEI office was not
about creating long-lasting solutions. It was
about “quick wins” that make administrators
look good against the political background.
This conceptualization feels endemic to
DEI, especially at large organizations like
Google, where attempts to hold individu-
als and systems accountable led directly
to severe retaliation, as was the case with
Meredith Whittaker and Timnit Gebru. But
these individuals, as well as many others
like them, have not given up the fight for
equity. In 2017, Whittaker founded the AI
Now Institute with Kate Crawford at NYU,
and, in December 2021, Gebru launched the
Distributed Artificial Intelligence Research
Institute (DAIR).
In these cases, and numerous others, it is
made clear that these DEI entities, as McIn-
nis put it, “are spaces of impossibility; they
cannot do the things they are tasked with as
they are not empowered to hold community
members accountable when they fail to
uphold stated investments in equity… They
exist not to create systematic change but
as evidence that the work has already been
done” [153]. In fact, such dishonesty from organizations claiming to promote DEI has been found to heighten, rather than mitigate, the concerns of marginalized people [155].
Further, when implementing DEI strategies within medical informatics, it is crucial to be aware of these pitfalls so that approaches are effective and actually change culture. Interventions must not be centered on “quick wins” that make for PR-friendly headlines, but must instead confront power structures both within organizations and in society at large.
Transformative justice requires accountability at all levels. In the academic sphere, there is an evident lack of understanding, compassion, and forethought from administrators. It is not uncommon to see a list of straightforward demands for racial equity pushed aside in favor of a committee that can make recommendations but holds no real power. Often, real change comes only after a breaking point has been reached: graduate student unionization and strikes in the United States have proved as much. And if that is the way it has to be, then it will continue to be so.
However, it should be made clear that
equity in research is not the whole picture of
health equity. To quote one respondent cited
in Everhart et al. 2021: “I’m not interested in
research; I’m interested in services” [156].
Researching inequity and demonstrating its existence is only one piece of that puzzle; in most cases, it is already obvious that such inequities exist. Only a small minority of research actually attempts to reduce or eliminate them.
Open-source research is a single step: making our knowledge, which serves the general interest, freely available. We, as scientists and researchers, also need to be accountable for how that research is used. Too often, researchers scoff at this idea. A few years
ago, a question to this end popped up on
a well-known research website: “Does the
responsibility of researchers end with the
scientific publication of their findings?” The
very idea that this question has to be asked
is an abject failure of researcher education.
The responsibility of researchers only begins with publication. The ethical duties of research involve actively bettering the world around us, so researchers should keep in mind the societal and policy implications of their work, both within the work itself and in how that work is used afterward. Researchers need to be active collaborators with implementers and policymakers. The success of research should be judged not by its lead researcher’s h-index, but by its impact on society.
References
1. Solar O, Irwin A. A conceptual framework for
action on the social determinants of health. WHO
Document Production Service 2010. Available
from: https://www.who.int/sdhconference/
resources/ConceptualframeworkforactiononSDH_
eng.pdf [accessed March 9, 2022].
2. Dovidio JF, Penner LA, Albrecht TL, Norton
WE, Gaertner SL, Shelton JN. Disparities and
distrust: the implications of psychological
processes for understanding racial disparities
in health and health care. Soc Sci Med 2008
Aug;67(3):478-86.
3. van Ryn M, Saha S. Exploring unconscious bias
in disparities research and medical education.
JAMA 2011 Sep 7;306(9):995-6.
4. Sieck CJ, Sheon A, Ancker JS, Castek J, Cal-
lahan B, Siefer A. Digital inclusion as a social
determinant of health. NPJ Digit Med 2021 Mar
17;4(1):52.
5. Global strategy on digital health 2020-2025;
2021. Available from: https://www.who.int/docs/
default-source/documents/gs4dhdaa2a9f352b-
0445bafbc79ca799dce4d.pdf [accessed March
9, 2022].
6. Cheng C, Elsworth GR, Osborne RH. Co-design-
ing eHealth and Equity Solutions: Application
of the Ophelia (Optimizing Health Literacy and
Access) Process. Front Public Health 2020 Nov
20;8:604401.
7. Kozelka EE, Jenkins JH, Carpenter-Song E.
Advancing Health Equity in Digital Mental
Health: Lessons From Medical Anthropology
for Global Mental Health. JMIR Ment Health
2021 Aug 16;8(8):e28555.
8. Dankwa-Mullan I, Scheufele EL, Matheny ME,
Quintana Y, Chapman WW, Jackson G, et al. A
proposed framework on integrating health equity
and racial justice into the artificial intelligence
development lifecycle. J Health Care Poor Un-
derserved 2021;32(2):300-17.
9. Lyles CR, Adler-Milstein J, Thao C, Lisker S,
Nouri S, Sarkar U. Alignment of Key Stakehold-
ers’ Priorities for Patient-Facing Tools in Digital
Health: Mixed Methods Study. J Med Internet
Res 2021 Aug 26;23(8):e24890.
10. Peterson CB, Hamilton C, Hasvold P. From
innovation to implementation: eHealth in the
WHO European region, WHO Regional Office
for Europe, Copenhagen, Denmark; 2016.
11. Jones M, DeRuyter F, Morris J. The Digital
Health Revolution and People with Disabilities:
Perspective from the United States. Int J Environ
Res Public Health 2020 Jan 7;17(2):381.
12. Jones B, King PT, Baker G, Ingham T. COVID-19,
intersectionality, and health equity for indige-
nous peoples with lived experience of disability.
Am Indian Cult Res J 2020;44(2):71-88.
13. Shadmi E, Chen Y, Dourado I, Faran-Perach I,
Furler J, Hangoma P, et al. Health equity and
COVID-19: global perspectives. Int J Equity
Health 2020 Jun 26;19(1):104.
14. Kickbusch I, Piselli D, Agrawal A, Balicer R,
Banner O, Adelhardt M, et al; Secretariat of
the Lancet and Financial Times Commission.
The Lancet and Financial Times Commission
on governing health futures 2030: growing
up in a digital world. Lancet 2021 Nov
6;398(10312):1727-76.
15. Ralston W. They Told Their Therapists
Everything. Hackers Leaked It All, WIRED;
2021. Available from: https://www.wired.com/
story/vastaamo-psychotherapy-patients-hack-
data-breach/ [accessed January 7, 2022].
16. Weigel G, Ramaswamy A, Sobel L, Salganicoff
A, Cubanski J, Freed M. Opportunities and
Barriers for Telemedicine in the U.S. During
the COVID-19 Emergency and Beyond; 2020.
Available from: https://www.kff.org/womens-
health-policy/issue-brief/opportunities-and-
barriers-for-telemedicine-in-the-u-s-during-
the-covid-19-emergency-and-beyond/ [accessed
January 9, 2022].
17. Glatter R. Google Announces New AI
App To Diagnose Skin Conditions; 2021.
Available from: https://www.forbes.com/
sites/robertglatter/2021/05/21/google-
announces-new-ai-app-to-diagnose-skin-
conditions/?sh=3e575686e4a0 [accessed March
9, 2022].
18. Perlman KL, Klein EJ, Park JH. Racial disparities
in dermatology training: the impact on black
patients. Cutis 2020 Dec;106(6):300-1.
19. Ti L, Ho A, Knight R. Towards Equitable AI
Interventions for People Who Use Drugs: Key
Areas That Require Ethical Investment. J Addict
Med 2021 Apr 1;15(2):96-8.
20. Kronk CA, Everhart AR, Ashley F, Thompson
HM, Schall TE, Goetz TG, et al. Transgender
data collection in the electronic health record:
Current concepts and issues. J Am Med Inform
Assoc 2022 Jan 12;29(2):271-84.
21. Herriot M, Valentine N. Health in All Policies
as part of the primary health care agenda on
multisectoral action, 2018. https://www.who.
int/publications/i/item/WHO-HIS-SDS-2018.59
[accessed March 9, 2022].
22. Borde E, Hernández M. Revisiting the so-
cial determinants of health agenda from the
global South. Glob Public Health 2019 Jun-
Jul;14(6-7):847-62.
23. Dawes DE, Williams DR. The political determi-
nants of health, Johns Hopkins University Press,
Baltimore; 2020.
24. Daffé ZN, Guillaume Y, Ivers LC. Anti-Rac-
ism and Anti-Colonialism Praxis in Global
Health-Reflection and Action for Practitioners
in US Academic Medical Centers. Am J Trop
Med Hyg 2021 Jul 19;105(3):557-60.
25. Czyzewski K. Colonialism as a broader social
determinant of health. Int Indig Policy J 2011;
2(1). https://doi.org/10.18584/iipj.2011.2.1.5
26. Büyüm AM, Kenney C, Koris A, Mkumba L,
Raveendran Y. Decolonising global health:
if not now, when? BMJ Glob Health 2020
Aug;5(8):e003394.
27. Silver L. Smartphone Ownership Is Growing Rap-
idly Around the World, but Not Always Equally;
2019. Available from: https://www.pewresearch.
org/global/2019/02/05/smartphone-ownership-
is-growing-rapidly-around-the-world-but-not-
always-equally/ [accessed March 9, 2022].
28. Avle S, Quartey E, Hutchful D. Research on
Mobile Phone Data in the Global South: Op-
portunities and Challenges, In: Foucault Welles
B, González-Bailón S, editors. Oxf. Handb.
Networked Commun. Oxford University Press;
2020. p. 487–509. Available from: https://doi.
org/10.1093/oxfordhb/9780190460518.013.33
29. Brewer LC, Fortuna KL, Jones C, Walker R,
Hayes SN, Patten CA, et al. Back to the Fu-
ture: Achieving Health Equity Through Health
Informatics and Digital Health. JMIR Mhealth
Uhealth 2020 Jan 14;8(1):e14512.
30. Figueroa CA, Luo T, Aguilera A, Lyles CR.
The need for feminist intersectionality in
digital health. Lancet Digit Health 2021
Aug;3(8):e526-e533.
31. Silva AB, Assumpção AMBD, Andrade Filha
IGD, Regadas CT, Castro MCD, Silva CRA,
et al. Cross-cultural adaptation of the Zero
Mothers Die (ZMD App) in Brazil: contribut-
ing to digital health with the approach on care
centred for e-pregnant woman. Revista Brasile-
ira de Saúde Materno Infantil 2020;19:751-62.
Available from: https://doi.org/10.1590/1806-
93042019000400002
32. Anderson-Lewis C, Darville G, Mercado RE,
Howell S, Di Maggio S. mHealth Technology
Use and Implications in Historically Under-
served and Minority Populations in the United
States: Systematic Literature Review. JMIR
Mhealth Uhealth 2018 Jun 18;6(6):e128.
33. Storeng KT, Fukuda-Parr S, Mahajan M, Venkat-
apuram S. Digital Technology and the Political
Determinants of Health Inequities: Special
Issue Introduction. Global Policy 2021;12:5-11.
Available from: https://doi.org/10.1111/1758-
5899.13001.
34. Al Dahdah M. From Ghana to India, Saving the
Global South’s Mothers with a Digital Solu-
tion. Global Policy 2021;12:45–54. https://doi.
org/10.1111/1758-5899.12939
35. Bhakuni H, Abimbola S. Epistemic injustice in
academic global health. Lancet Glob Health 2021
Oct;9(10):e1465-e1470.
36. Koskinen I, Rolin K. Structural epistemic (in)
justice in global contexts. In: Global Epistemol-
ogies and Philosophies of Science. Routledge;
2021. p.115-25. Available from: https://doi.
org/10.4324/9781003027140-12
37. Walker M, Boni A. Epistemic justice, par-
ticipatory research and valuable capabilities.
In: Participatory research, capabilities and
epistemic justice. Cham: Palgrave Macmillan;
2020. p. 1-25. Available from: https://doi.
org/10.1007/978-3-030-56197-0_1.
38. Crear-Perry J, Correa-de-Araujo R, Lewis John-
son T, McLemore MR, Neilson E, Wallace M.
Social and Structural Determinants of Health
Inequities in Maternal Health. J Womens Health
(Larchmt) 2021 Feb;30(2):230-5.
39. Sear R. Demography and the rise, apparent fall,
and resurgence of eugenics. Popul Stud (Camb)
2021 Dec;75(sup1):201-20.
40. Yearby R. Structural Racism and Health Dispar-
ities: Reconfiguring the Social Determinants of
Health Framework to Include the Root Cause. J
Law Med Ethics 2020 Sep;48(3):518-26.
41. Nundy S, Oswald J. Relationship-centered care:
A new paradigm for population health manage-
ment. Healthc (Amst) 2014 Dec;2(4):216-9.
42. Petersen C. Patient informaticians: Turning
patient voice into patient action. JAMIA Open
2018 May 23;1(2):130-5.
43. Birkhäuer J, Gaab J, Kossowsky J, Hasler S,
Krummenacher P, Werner C, et al. Trust in
the health care professional and health out-
come: A meta-analysis. PLoS One 2017 Feb
7;12(2):e0170988.
44. Skorton DJ. How diversity training for health
care workers can save patients’ lives, USA Today.
(n.d.). Available from: https://www.usatoday.
com/story/opinion/2020/10/07/why-diversity-
training-medical-schools-can-save-patients-lives-
column/3635406001/ [accessed January 7, 2022].
45. Hamed S, Thapar-Björkert S, Bradby H, Ahlberg
BM. Racism in European Health Care: Structural
Violence and Beyond. Qual Health Res 2020
Sep;30(11):1662-73.
46. Nuriddin A, Mooney G, White AIR. Reckoning
with histories of medical racism and violence in
the USA. Lancet 2020 Oct 3;396(10256):949-51.
47. Scharff DP, Mathews KJ, Jackson P, Hoffsuem-
mer J, Martin E, Edwards D. More than Tuskegee:
understanding mistrust about research partici-
pation. J Health Care Poor Underserved 2010
Aug;21(3):879-97.
48. Hussain A, Ali S, Ahmed M, Hussain S. The An-
ti-vaccination Movement: A Regression in Mod-
ern Medicine. Cureus 2018 Jul 3;10(7):e2919.
49. Mylan S, Hardman C. COVID-19, cults, and
the anti-vax movement. Lancet 2021 Mar
27;397(10280):1181.
50. Jaiswal J, LoSchiavo C, Perlman DC. Disinfor-
mation, Misinformation and Inequality-Driven
Mistrust in the Time of COVID-19: Lessons
Unlearned from AIDS Denialism. AIDS Behav
2020 Oct;24(10):2776-80.
51. Richmond II SP, Grubbs V. How Abolition of
Race-Based Medicine Is Necessary to Amer-
ican Health Justice. AMA J Ethics 2022 Mar
1;24(3):E226-232.
52. Downs J. Maladies of empire: how colonialism,
slavery, and war transformed medicine. Cam-
bridge, Massachusetts: The Belknap Press of
Harvard University Press; 2021.
53. Washington HA. Medical apartheid: the dark
history of medical experimentation on Black
Americans from colonial times to the present.
1st pbk. Ed. New York: Harlem Moon; 2006.
54. Yearby R, Clark B, Figueroa JF. Structural
Racism In Historical And Modern US Health
Care Policy: Study examines structural racism
in historical and modern US health care policy.
Health Aff (Millwood) 2022;41(2):187-94.
55. Romano MJ. White Privilege in a White Coat:
How Racism Shaped my Medical Education. Ann
Fam Med 2018 May;16(3):261-3.
56. Wallace AA. Race and Medicine: How Modern
Medicine Has Been Fueled By Racism; 2020.
Available from: https://www.healthline.com/
health/modern-medicine-fueled-by-racism [ac-
cessed May 5, 2022].
57. Alang S, Hardeman R, Karbeah J, Akosionu O,
McGuire C, Abdi H, et al. White Supremacy and
the Core Functions of Public Health. Am J Public
Health 2021 May;111(5):815-819.
58. Geneviève LD, Martani A, Shaw D, Elger BS,
Wangmo T. Structural racism in precision med-
icine: leaving no one behind. BMC Med Ethics
2020 Feb 19;21(1):17.
59. Jones R, Crowshoe L, Reid P, Calam B, Curtis E,
Green M, et al. Educating for Indigenous Health
Equity: An International Consensus Statement.
Acad Med 2019 Apr;94(4):512-9.
60. Mosby I, Swidrovich J. Medical experimentation
and the roots of COVID-19 vaccine hesitancy
among Indigenous Peoples in Canada. CMAJ
2021 Mar 15;193(11):E381-E383.
61. Kronk CA, Dexheimer JW. An ontology-based
review of transgender literature: Revealing a
history of medicalization and pathologization.
Int J Med Inform 2021 Dec;156:104601.
62. Weeks LD. When the patient is racist, how should
the doctor respond? 2017. Available from: https://
www.statnews.com/2017/06/12/racism-bias-
patients-doctors/ [accessed January 7, 2022].
63. Gabrani A, Pal S. Physician and Gay: Am I Safe
at Work? Acad Med 2019 Jun;94(6):753-4.
64. Hohman M. “You’re in utter disbelief”: 3
Asian American health workers detail racial
harassment at work. 2021. Available from:
https://www.today.com/health/3-asian-american-
nurses-discuss-racial-harassment-work-t217549
[accessed January 7, 2022].
65. Melanson A. Bedford VA nurse from Dracut
alleges racial discrimination in lawsuit, The
Sun; 2021. Available from: https://www.
lowellsun.com/2021/04/25/bedford-va-nurse-
lawsuit-alleges-racial-discrimination/ [accessed
January 7, 2022].
66. Gooch K. Catholic Health workers sue New York
hospital for $2M, allege racial discrimination.
Beckers Hosp Rev 2021. Available from:
https://www.beckershospitalreview.com/legal-
regulatory-issues/catholic-health-workers-
sue-new-york-hospital-for-2m-allege-racial-
discrimination.html#:~:text=Six%20current%20
and%20former%20workers,colleagues%2C%20
according%20to%20court%20
documents.&text=The%20lawsuit%20was%20
filed%20Aug. [accessed January 7, 2022].
67. Goodman E. Suit claims racism at Stanford
Hospital – Employee says co-worker dressed as
a KKK member. Dly Post 2021. Available from:
https://padailypost.com/2021/09/22/suit-claims-
racism-at-stanford-hospital-employee-claims-
co-worker-dressed-as-a-kkk-member/ [accessed
January 7, 2022].
68. Alsan M, Garrick O, Graziani G. Does diver-
sity matter for health? Experimental evidence
from Oakland. American Economic Review
2019;109(12):4071-111.
69. Leggott K. Here’s Why LGBTQ Physicians Should
Self-Identify; 2020. Available from: https://
www.aafp.org/news/blogs/freshperspectives/
entry/20200303fp-lgbtqphysicians.html
[accessed October 9, 2021].
70. Huerto R. Minority Patients Benefit From Having
Minority Doctors, But That’s a Hard Match to Make;
2020. Available from: https://labblog.uofmhealth.
org/rounds/minority-patients-benefit-from-having-
minority-doctors-but-thats-a-hard-match-to-
make-0#:~:text=1%3A46%20PM-,Minority%20
Patients%20Benefit%20From%20Having%20
Minority%20Doctors%2C%20But%20That’s%20
a,medicine%20physician%20at%20Michigan%20
Medicine.&text=Editor’s%20note%3A%20
Information%20on%20the,being%20done%20
by%20investigators%20everywhere. [accessed
January 7, 2022].
71. Brownlee D. Why Are Black Male Doctors
Still So Scarce In America? Forbes 2020.
Available from: https://www.forbes.com/
sites/danabrownlee/2020/08/11/why-
are-black-male-doctors-still-so-scarce-in-
america/?sh=4bf038d827c2 [accessed January
7, 2022].
72. Gilbey D, Morgan H, Lin A, Perry Y. Effective-
ness, Acceptability, and Feasibility of Digital
Health Interventions for LGBTIQ+ Young Peo-
ple: Systematic Review. J Med Internet Res 2020
Dec 3;22(12):e20158.
73. Antonio M, Lau F, Davison K, Devor A, Queen
R, Courtney K. Toward an inclusive digital
health system for sexual and gender minorities
in Canada. J Am Med Inform Assoc 2022 Jan
12;29(2):379-84.
74. Chittalia AZ, Marney HL, Tavares S, Warsame L,
Breese AW, Fisher DL, et al. Bringing Cultural
Competency to the EHR: Lessons Learned Pro-
viding Respectful, Quality Care to the LGBTQ
Community. AMIA Annu Symp Proc 2021 Jan
25;2020:303-10.
75. McClure RC, Macumber CL, Kronk C, Grasso
C, Horn RJ, Queen R, et al. Gender harmony:
improved standards to support affirmative care
of gender-marginalized people through inclusive
gender and sex representation. J Am Med Inform
Assoc 2022 Jan 12;29(2):354-63. Erratum in: J
Am Med Inform Assoc 2021 Nov 25.
76. Davison K, Queen R, Lau F, Antonio M. Cultur-
ally Competent Gender, Sex, and Sexual Orienta-
tion Information Practices and Electronic Health
Records: Rapid Review. JMIR Med Inform 2021
Feb 11;9(2):e25467.
77. Kronk CA, Dexheimer JW. Development of the
Gender, Sex, and Sexual Orientation ontology:
Evaluation and workflow. J Am Med Inform
Assoc 2020 Jul 1;27(7):1110-5.
78. Joseph RP, Keller C, Adams MA, Ainsworth BE.
Print versus a culturally-relevant Facebook and
text message delivered intervention to promote
physical activity in African American women:
a randomized pilot trial. BMC Womens Health
2015 Mar 27;15:30.
79. Panch T, Mattie H, Atun R. Artificial intelligence
and algorithmic bias: implications for health
systems. J Glob Health 2019 Dec;9(2):010318.
80. Allen A, Mataraso S, Siefkas A, Burdick H, Bra-
den G, Dellinger RP, et al. A Racially Unbiased,
Machine Learning Approach to Prediction of
Mortality: Algorithm Development Study. JMIR
Public Health Surveill 2020 Oct 22;6(4):e22400.
81. Brogan, J. The Next Era of Biomedical Research:
Prioritizing Health Equity in The Age of Digital
Medicine. Voices in Bioethics 2021;7. Available
from: https://doi.org/10.52214/vib.v7i.8854
82. Char DS, Abràmoff MD, Feudtner C. Identifying
Ethical Considerations for Machine Learning
Healthcare Applications. Am J Bioeth 2020
Nov;20(11):7-17.
83. Ibrahim SA, Charlson ME, Neill DB. Big Data
Analytics and the Struggle for Equity in Health
Care: The Promise and Perils. Health Equity
2020 Apr 1;4(1):99-101.
84. Kamara D. Opinion | Latinx vs Latine; 2021.
Available from: https://tulanehullabaloo.
com/57213/intersections/opinion-latinx-vs-
latine/ [accessed March 9, 2022].
85. Why Latinx/e?, (n.d.). https://elcentro.colostate.
edu/about/why-latinx/ [accessed March 9, 2022].
86. Bompelli A, Wang Y, Wan R, Singh E, Zhou Y,
Xu L, et al. Social and behavioral determinants
of health in the era of artificial intelligence with
electronic health records: A scoping review.
Health Data Science 2021:1-19. Available from:
https://doi.org/10.34133/2021/9759016.
87. Obermeyer Z, Powers B, Vogeli C, Mullainathan
S. Dissecting racial bias in an algorithm used to
manage the health of populations. Science 2019
Oct 25;366(6464):447-53.
88. Coley RY, Johnson E, Simon GE, Cruz M,
Shortreed SM. Racial/Ethnic Disparities in the
Performance of Prediction Models for Death
by Suicide After Mental Health Visits. JAMA
Psychiatry 2021 Jul 1;78(7):726-34.
89. Haga SB. Impact of limited population diversity
of genome-wide association studies. Genet Med
2010 Feb;12(2):81-4.
90. Nwanaji-Enwerem JC, Jackson CL, Ottinger
MA, Cardenas A, James KA, Malecki KMC,
et al. Adopting a “Compound” Exposome
Approach in Environmental Aging Biomarker
Research: A Call to Action for Advancing Racial
Health Equity. Environ Health Perspect 2021
Apr;129(4):45001.
91. Fink-Samnick E. The Social Determinants of
Mental Health: Assessment, Intervention, and
Wholistic Health Equity: Part 2. Prof Case
Manag 2021 Sep-Oct 01;26(5):224-41.
92. Hernandez-Boussard T, Bozkurt S, Ioannidis
JPA, Shah NH. MINIMAR (MINimum Infor-
mation for Medical AI Reporting): Developing
reporting standards for artificial intelligence in
health care. J Am Med Inform Assoc 2020 Dec
9;27(12):2011-5.
93. Calvo RA, Deterding S, Ryan RM. Health
surveillance during covid-19 pandemic. BMJ
2020;369. https://doi.org/10.1136/bmj.m1373.
94. Williams SN, Armitage CJ, Tampe T, Dienes
K. Public attitudes towards COVID-19 contact
tracing apps: A UK-based focus group study.
Health Expect 2021 Apr;24(2):377-85.
95. Healthcare Data Breach Statistics, HIPAA J.
(n.d.). Available from: https://www.hipaajournal.
com/healthcare-data-breach-statistics/ [accessed
December 13, 2021].
96. Steger A. What Happens to Stolen Healthcare Data?
2019. Available from: https://healthtechmagazine.
net/article/2019/10/what-happens-stolen-
healthcare-data-perfcon [accessed January 7, 2022].
97. Choi SJ, Johnson ME, Lehmann CU. Data
breach remediation efforts and their implica-
tions for hospital quality. Health Serv Res 2019
Oct;54(5):971-80.
98. Collier K. Baby died because of ransomware
attack on hospital, suit says; 2021. Available
from: https://www.nbcnews.com/news/baby-
died-due-ransomware-attack-hospital-suit-
claims-rcna2465 [accessed January 7, 2022].
99. Klepper D, Seitz A. Facebook froze as an-
ti-vaccine comments swarmed users, 2021.
In: Lipschultz JH. Social Media and Political
Communication. Taylor & Francis; 2022.
100. Koenig D. Leaked Documents Show Facebook
Put Profit Before Public Good; 2021. Available
from: https://www.webmd.com/a-to-z-
guides/news/20211108/facebook-put-profit-
before-public-health#:~:text=Leaked%20
Documents%20Show%20Facebook%20Put%20
Profit%20Before%20Public%20Good,-By%20
Debbie%20Koenig&text=Nov.&text=The%20
files%20were%20leaked%20by,a%20
consortium%20of%20news%20organizations.
[accessed January 7, 2022].
101. Latulipe C, Mazumder SF, Wilson RKW, Talton
JW, Bertoni AG, Quandt SA, et al. Security and
Privacy Risks Associated With Adult Patient
Portal Accounts in US Hospitals. JAMA Intern
Med 2020 Jun 1;180(6):845-9.
102. Ip W, Yang S, Parker J, Powell A, Xie J, Morse
K, et al. Assessment of Prevalence of Adolescent
Patient Portal Account Access by Guardians.
JAMA Netw Open 2021 Sep 1;4(9):e2124733.
103. Fitz-Gibbon J. Greek protesters clash with
cops over COVID-19 vaccines. N.Y. Post 2021.
Available from: https://nypost.com/2021/07/25/
greek-protestors-clash-with-police-over-covid-
19-vaccines/ [accessed January 7, 2022].
104. Anti-Vaxxers Bribe Doctors for “Vaccination”
With Water, End Up With the Real Vaccine; 2021.
Available from: https://www.keeptalkinggreece.
com/2021/10/10/greece-fake-vaccinations-water-
real-vaccine/ [accessed January 7, 2022].
105. A Trust Betrayed: Barriers to Health Access in
India; 2020. Available from: https://unfoundation.
org/blog/post/trust-betrayed-barriers-health-
access-india/ [accessed March 9, 2022].
106. Jennings B, Duncan LL. Water Safety and Lead
Regulation: Physicians’ Community Health
Responsibilities. AMA J Ethics 2017 Oct
1;19(10):1027-35.
107. Robertson D. Flint Has Clean Water Now.
Why Won’t People Drink It? Politico 2020.
Available from: https://www.politico.com/news/
magazine/2020/12/23/flint-water-crisis-2020-
post-coronavirus-america-445459 [accessed
January 7, 2022].
108. Cockerell I. Anti-vaxxers make up to $1.1 billion
for social media companies; 2021. Available
from: https://www.codastory.com/waronscience/
social-media-profit-pandemic-antivax/ [accessed
January 7, 2022].
109. Coercive and Punitive Governmental Responses
to Women’s Conduct During Pregnancy;
(n.d.). Available from: https://www.aclu.org/
other/coercive-and-punitive-governmental-
responses-womens-conduct-during-
pregnancy#:~:text=Some%20women%20
were%20forced%20to,will%20suffer%20as%20
a%20result. [accessed January 7, 2022].
110. Thompson P, Cruz AT. How an Oklahoma
women’s miscarriage put a spotlight on racial
disparities in prosecutions; 2021. Available from:
https://www.nbcchicago.com/news/national-
international/how-an-oklahoma-womens-
miscarriage-put-a-spotlight-on-racial-disparities-
in-prosecutions/2674171/ [accessed January 7,
2022].
111. Moghe S. Woman shackled by police while in
labor settles with New York City; 2021. Avail-
able from: https://www.cnn.com/2021/04/21/us/
new-york-pregnant-woman-shackled-by-police-
settles/index.html [accessed January 7, 2022].
112. Dyer C. Doctors suspended after trans patient has
vagina removed without consent. BMJ 2020 Mar
3;368:m852.
113. WHO publishes ICD-11 – and no end in sight
for pathologisation of intersex people; 2018.
https://oiieurope.org/who-publishes-icd-11-and-
no-end-in-sight-for-pathologisation-of-intersex-
people/ [accessed January 7, 2022].
114. Gleeson J. Depathologising, Repathologising: the
WHO’s New Guidelines for Trans and Intersex
Healthcare; 2018. Available from: https://www.
versobooks.com/blogs/4136-depathologising-
repathologising-the-who-s-new-guidelines-for-
trans-and-intersex-healthcare [accessed January
7, 2022].
115. Ram A, Kronk CA, Eleazer JR, Goulet JL,
Brandt CA, Wang KH. Transphobia, encoded:
an examination of trans-specific terminology
in SNOMED CT and ICD-10-CM. J Am Med
Inform Assoc 2022 Jan 12;29(2):404-10.
116. Murez C. Too Many U.S. Doctors Biased
Against Patients With Disabilities: Study;
2021. Available from: https://www.webmd.
com/multiple-sclerosis/news/20210202/many-
doctors-biased-against-patients-with-disabilities
[accessed January 7, 2022].
117. Mudrick NR, Breslin ML, Nielsen KA, Swager
LC. Can disability accommodation needs stored
in electronic health records help providers pre-
pare for patient visits? A qualitative study. BMC
Health Serv Res 2020 Oct 16;20(1):958.
118. Turk MA, McDermott S. Do Electronic
Health Records support the complex needs of
people with disability? Disabil Health J 2018
Oct;11(4):491-2.
119. McRuer R. Crip theory: cultural signs of queer-
ness and disability, New York University Press,
New York, 2006. Available from: http://www.
MQU.eblib.com.AU/EBLWeb/patron/?target=-
patron&extendedid=P_865717_0 [accessed
March 11, 2022].
120. Mullaney C. Disability Studies: Foundations &
Key Concepts; 2019. Available from: https://
daily.jstor.org/reading-list-disability-studies/
[accessed March 11, 2022].
121. Davis LJ, editor. The disability studies reader.
5th ed. New York: Routledge, Taylor & Francis
Group; 2017.
122. Parsons V. Trans woman left to die with coronavirus
symptoms by paramedics who “refused to
treat her” because she had HIV. PinkNews
2020. Available from: https://www.pinknews.
co.uk/2020/06/01/alejandra-monocuco-trans-
woman-dead-coronavirus-hiv-positive-bogota/.
123. Garcia SE. Independent Autopsy of Transgender
Asylum Seeker Who Died in ICE Custody Shows
Signs of Abuse. NY Times 2018. Available
from: https://www.nytimes.com/2018/11/27/us/
trans-woman-roxsana-hernandez-ice-autopsy.
html [accessed January 7, 2022].
124. Wang S, Ni Y, Gong R, Shi Y, Cai Y, Ma J.
Psychosocial Syndemic of suicidal ideation: a
cross-sectional study among sexually transmit-
ted infection patients in Shanghai, China. BMC
Public Health 2020 Aug 31;20(1):1314.
125. Northam J. How The CIA’s Hunt For Bin
Laden Impacted Public Health Campaigns In
Pakistan; 2021. Available from: https://www.
npr.org/2021/09/06/1034631928/the-cias-hunt-
for-bin-laden-has-had-lasting-repercussions-for-
ngos-in-pakistan#:~:text=But%20the%20bin%20
Laden%20raid,and%20health%20workers%20we-
re%20targeted. [accessed January 7, 2022].
126. Morrish L. How fake videos unravelled Paki-
stan’s war on polio, 2020. Available from: https://
firstdraftnews.org/articles/how-fake-videos-
unravelled-pakistans-war-on-polio/ [accessed
January 7, 2022].
127. Cyranoski D. What CRISPR-baby prison
sentences mean for research. Nature. 2020
Jan;577(7789):154-5.
128. Rinde M. The Death of Jesse Gelsinger, 20
Years Later; 2019. Available from: https://www.
sciencehistory.org/distillations/the-death-of-jesse-
gelsinger-20-years-later [accessed March 9, 2022].
129. Mc Cord KA, Hemkens LG. Using electronic
health records for clinical trials: Where do we
stand and where can we go? CMAJ 2019 Feb
4;191(5):E128-E133.
130. Kahn JM, Gray DM 2nd, Oliveri JM, Wash-
ington CM, DeGraffinreid CR, Paskett ED.
Strategies to improve diversity, equity, and
inclusion in clinical trials. Cancer 2022 Jan
15;128(2):216-21.
131. Molteni M. The Creepy Genetics Behind the
Golden State Killer Case, WIRED; 2018.
Available from: https://www.wired.com/story/
detectives-cracked-the-golden-state-killer-case-
using-genetics/ [accessed January 7, 2022].
132. Can the results of direct-to-consumer genetic
testing affect my ability to get insurance?
(n.d.). Available from: https://medlineplus.
gov/genetics/understanding/dtcgenetictesting/
dtcinsurancerisk/ [accessed January 7, 2022].
133. Farr C. If You Want Life Insurance, Think Twice
Before Getting A Genetic Test; 2016. Available
from: https://www.fastcompany.com/3055710/
if-you-want-life-insurance-think-twice-before-
getting-genetic-testing [accessed January 7, 2022].
134. Goldstein L. Can an Insurance Company Use
My Medical/Genetic Information to Deny Me
Insurance Coverage? 2020. Available from:
https://www.linkedin.com/pulse/can-insurance-
company-use-my-medicalgenetic-deny-me-
leanne-goldstein/ [accessed January 7, 2022].
135. Horton RH, Lucassen AM. Recent developments
in genetic/genomic medicine. Clin Sci (Lond)
2019 Mar 5;133(5):697-708.
136. Powell K. The broken promise that under-
mines human genome research. Nature 2021
Feb;590(7845):198-201.
137. Williams MS, Taylor CO, Walton NA, Goehring-
er SR, Aronson S, Freimuth RR, et al. Genomic
Information for Clinicians in the Electronic
Health Record: Lessons Learned From the Clini-
cal Genome Resource Project and the Electronic
Medical Records and Genomics Network. Front
Genet 2019 Oct 29;10:1059.
138. Ayatollahi H, Hosseini SF, Hemmat M. Integrat-
ing Genetic Data into Electronic Health Records:
Medical Geneticists’ Perspectives. Healthc
Inform Res 2019 Oct;25(4):289-96.
139. Bombard Y, Hayeems RZ. How digital tools can
advance quality and equity in genomic medicine.
Nat Rev Genet 2020 Sep;21(9):505-6.
140. Landry LG, Ali N, Williams DR, Rehm HL,
Bonham VL. Lack Of Diversity In Genomic
Databases Is A Barrier To Translating Precision
Medicine Research Into Practice. Health Aff
(Millwood) 2018 May;37(5):780-5.
141. Mensah NE, Towards genomic equity; 2021.
Available from: https://www.genomicseducation.
hee.nhs.uk/blog/towards-genomic-equity/
[accessed March 9, 2022].
142. Cullen R. Addressing the digital divide. Online in-
formation review 2001;25:311-20. Available from:
https://doi.org/10.1108/14684520110410517.
143. Newman LA, Biedrzycki K, Baum F. Digital
technology access and use among socially and
economically disadvantaged groups in South
Australia. The Journal of Community Informat-
ics 2010:6(2).
144. Ewing S, Rennie E, Thomas J. Broadband policy
and rural and cultural divides in Australia. In:
Andreasson K, editor. Digital Divides: New
Challenges and Opportunities of e-Inclusion.
CRC Press; 2015. p. 107–24.
145. Freeman T, Fisher M, Foley K, Boyd MA, Ward
PR, McMichael G, et al. Barriers to digital health
services among people living in areas of socio-
economic disadvantage: Research from hospital
diabetes and antenatal clinics. Health Promot J
Austr 2021 Sep 12.
146. O’Neil S, Taylor S, Sivasankaran A. Data
Equity to Advance Health and Health Equi-
ty in Low- and Middle-Income Countries:
A Scoping Review. Digit Health 2021 Dec
22;7:20552076211061922.
147. McCool J, Dobson R, Whittaker R, Paton C.
Mobile Health (mHealth) in Low- and Mid-
dle-Income Countries. Annu Rev Public Health
2022 Apr 5;43:525-39.
148. Ahmed T, Rizvi SJR, Rasheed S, Iqbal M, Bhuiya
A, Standing H, et al. Digital Health and Inequal-
ities in Access to Health Services in Bangladesh:
Mixed Methods Study. JMIR Mhealth Uhealth
2020 Jul 21;8(7):e16473.
149. Rich E, Miah A, Lewis S. Is digital health care
more equitable? The framing of health inequal-
ities within England’s digital health policy
2010-2017. Sociol Health Illn 2019 Oct;41 Suppl
1(Suppl 1):31-49.
150. Valdez RS, Rogers CC, Claypool H, Trieshmann
L, Frye O, Wellbeloved-Stone C, et al. Ensuring
full participation of people with disabilities in an
era of telehealth. J Am Med Inform Assoc 2021
Feb 15;28(2):389-92.
151. Tadros E, Ribera E, Campbell O, Kish H, Ogden
T. A call for mental health treatment in incarcer-
ated settings with transgender individuals. Am
J Fam Ther 2020;48(5):495-508. Available from:
https://doi.org/10.1080/01926187.2020.1761273
152. A Message from the President, The Lamp 2020;2.
153. McInnis T. A Farewell Letter to DEI Work; 2020.
Available from: https://www.insidehighered.
com/views/2020/08/20/diversity-equity-and-
inclusion-offices-cant-be-effective-if-they-arent-
empowered [accessed January 7, 2022].
154. Scott KA, Elliott S. STEM diversity and inclu-
sion efforts for women of color: A critique of
the new labor system. Int J Gend Sci Technol
2019;11(3): 374-82.
155. Wilton LS, Bell AN, Vahradyan M, Kaiser CR.
Show Don’t Tell: Diversity Dishonesty Harms
Racial/Ethnic Minorities at Work. Pers Soc
Psychol Bull 2020 Aug;46(8):1171-85.
156. Everhart AR, Boska H, Sinai-Glazer H, Wil-
son-Yang JQ, Burke NB, LeBlanc G, et al. ‘I’m
not interested in research; i’m interested in ser-
vices’: How to better health and social services
for transgender women living with and affected
by HIV. Soc Sci Med 2022 Jan;292:114610.
Correspondence to:
Clair Kronk
Center for Medical Informatics
Yale School of Medicine
300 George Street, PO Box 208009
New Haven, CT 06520
USA
E-mail: clair.kronk@yale.edu