OECD Science, Technology and Industry Working Papers
2019/05
Responsible innovation
in neurotechnology
enterprises
Hermann Garden,
David E. Winickoff,
Nina Maria Frahm,
Sebastian Pfotenhauer
https://dx.doi.org/10.1787/9685e4fd-en
OECD Working Papers should not be reported as representing the official views of the
OECD or of its member countries. The opinions expressed and arguments employed are
those of the authors. Working Papers describe preliminary results or research in progress
by the author(s) and are published to stimulate discussion on a broad range of issues on
which the OECD works. Comments on Working Papers are welcomed, and may be sent to
Directorate for Science, Technology and Innovation, OECD, 2 rue André-Pascal, 75775
Paris Cedex 16, France.
Note to Delegations:
This document is also available on O.N.E under the reference code:
DSTI/STP/BNCT(2018)5/FINAL
This document, as well as any data and any map included herein, are without prejudice to
the status of or sovereignty over any territory, to the delimitation of international frontiers
and boundaries and to the name of any territory, city or area.
© OECD (2019)
You can copy, download or print OECD content for your own use, and you can include
excerpts from OECD publications, databases and multimedia products in your own
documents, presentations, blogs, websites and teaching materials, provided that suitable
acknowledgment of OECD as source and copyright owner is given. All requests for
commercial use and translation rights should be submitted to rights@oecd.org.
Foreword
This document is the result of analytical work on the opportunities and challenges of
implementing responsibility frameworks into neurotechnology translation at major brain
research initiatives and in the private sector. The report draws on: (1) the discussion at the
BNCT workshop “Minding Neurotechnology: delivering responsible innovation for health
and well-being”, 6-7 September 2018, Shanghai, People’s Republic of China (referred to as the “Shanghai Workshop” hereafter); and (2) commentaries by workshop participants.
The Shanghai Workshop was focused on exploring some of the unique ethical, legal, and
policy challenges raised by health-related applications of brain science and its integration
into cutting edge neurotechnologies. One key aim of this workshop was to provide a forum
for innovators to discuss strategies for delivering responsible innovation in
neurotechnology for health applications.
The BNCT Project “Neurotechnology and Society” (Programme of Work and Budget
2017-2018) and the Shanghai Workshop were supported by the Korea Legislation Research
Institute (KLRI), Korea, and by The Kavli Foundation, USA.
The workshop was supported and hosted by the China National Center for Biotechnology
Development, Beijing, People’s Republic of China, and by the Tongji University School
of Medicine, Shanghai, People’s Republic of China.
Table of contents
Foreword ................................................................................................................................................ 3
Key messages .......................................................................................................................................... 5
1. Introduction ....................................................................................................................................... 9
2. A public health priority and a market opportunity ..................................................................... 11
2.1. Enabling translational brain research .......................................................................................... 15
3. Ethical, legal and social challenges ................................................................................................ 18
4. Role of the private sector in neurotechnology governance .......................................................... 21
4.1. Key opportunities, risks, and barriers ......................................................................................... 21
4.2. The unique position of start-ups ................................................................................................. 23
5. Design standards and regulation .................................................................................................... 26
6. Opportunities in soft law................................................................................................................. 29
6.1. Development of principles .......................................................................................................... 29
6.2. Ethics and governance frameworks for neurotechnology ........................................................... 30
6.3. Emerging practices for responsible innovation in business settings ........................................... 33
Annex A. Bibliography ..................................................................................................................... 38
Annex B. ............................................................................................................................................... 45
Key messages
Novel neurotechnology offers significant potential for the promotion of health (understood here as a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity) and economic growth. Spearheaded by large national and international flagship initiatives in brain science and fuelled by a clear medical need, research both in the public and private sector has made considerable strides towards novel neurotechnology, services and markets.
At the same time, neurotechnology raises a range of unique ethical, legal, and policy
questions that potential business models will have to address. These questions include
issues of (brain) data privacy, the prospects of human enhancement, the regulation and
marketing of direct-to-consumer devices, the vulnerability of cognitive patterns for
commercial or political manipulation, and new inequalities in use and access. While some
of these issues are shared by other technology domains (e.g. gene editing or artificial
intelligence (AI)), neurotechnology is exceptional because of the close connection between
brain and cognition to human identity, agency, and accountability. Yet, it is also an
extremely diverse field of research and commercial activity, which requires a custom-
tailored approach to regulation based on the particular applications under consideration, e.g. their scope (invasive or non-invasive), the types of data produced, and the target audiences
envisioned.
While approaches for fostering “responsible innovation” have become more common in
the public sector, private sector frameworks are only beginning to emerge. The 2018 OECD
Shanghai Workshop “Minding neurotechnology: delivering responsible innovation for
health and well-being” brought together more than 120 leaders from 12 countries, representing government, companies, academia, venture capital, and insurance, to shed light on the benefits, challenges, and options for strengthening responsible innovation in the
private sector. The workshop yielded a number of important lessons about the interactions
between emerging neurotechnology innovators, policy makers, and civil society, both on
what is happening already and what is needed. It also revealed a number of important
insights into the potential role of the private sector for responsible innovation more
generally beyond neurotechnology. Among the key messages are:
It is time to re-think governance of neurotechnology. Brain research in the
public and private sector has made considerable progress towards novel
neurotechnology applications, both for clinical and non-clinical use. Innovators are
receiving significant public and media attention, occasionally mixing issues around
neurotechnology innovation with controversies in adjacent domains (such as gene
editing and AI). A highly heterogeneous international landscape of innovation
practices, regulation of nascent markets, and de-facto standards (e.g. through
industry self-regulation) is rapidly emerging, which creates uncertainty among
public and private sector actors.
Stakeholders in the public and private sector are looking for guidance. There
is an urgent need to develop shared frameworks for how novel neurotechnology
and associated data are used. New governance mechanisms will likely be required
to address how these technologies challenge our understanding of human agency,
identity, and the boundaries of normal human capacity; how to identify and
anticipate the broader impact of neurotechnology on society; and how the potential
of novel neurotechnology is communicated to the public to both inform and to avoid
hype. Moreover, guidance will be needed on how to conduct small-scale clinical
trials in situations where novel neurotechnological interventions might be invasive
and involve some (possibly unquantifiable) risk.
The private sector has an important role in the development of responsible
innovation practices in global markets. Companies – and especially start-ups –
are at the forefront of neurotechnology innovation. Responsible technology
development and effective governance must involve the private sector as a central
actor early on, especially in global contexts. At the same time, the private sector
has a key interest in demonstrating responsibility and integrity.
An explicit commitment to principles of responsible development upstream can
promote the trust and trustworthiness that are crucial for success. Responsible
design considerations early in the pipeline as part of the innovation process itself
can support the social robustness and acceptability of new products and services,
increase end-user trust, and ensure that innovation delivers for and with society.
Transparency is critical to build trust in the ways data will be collected, managed
and used. Experience with innovation trajectories in other emerging technologies
(e.g. nanotechnology) reveals that upstream engagement can be crucial for
identifying and mitigating public concerns early in the development process.
Companies are keenly aware that the entire neurotechnology business sector can be
harmed and public trust can be undermined by a single bad corporate actor in the field.
Tools and approaches for responsible governance of neurotechnology are
emerging. There has been considerable experimentation among companies about
how to address the unique social, ethical, and legal aspects raised by novel
neurotechnology, especially those related to the collection and use of ‘personal
brain data’.2 Emergent “good practices” in the private sector include for example
the appointment of advisory boards on ethical, legal and social questions; the
development of guidelines and principles; greater emphasis on responsible
technology transfer; and interest in socially responsible investment. Importantly,
many approaches known from the public sector do not easily translate to
companies. Start-up companies in particular lack the organizational and financial
resources, and face considerable pressures of speed and scale that tend to
discourage costly and slow deliberative exercises. Moreover, approaches from
other sectors do not easily translate to neurotechnology. A mix of soft and hard
governance tools (e.g. industry standards, regulatory processes) is needed for
different sectors and different applications. These should provide clear pathways
for developers that give certainty in routes to market as well as gaining societal
approval. Experience with other emerging technologies suggests opportunities in
including roles for researchers, clinicians, industry, governments, and civil society
in governance models. Frameworks such as Corporate Social Responsibility could
be enriched with approaches of Responsible Research and Innovation, and vice
versa.
2. ‘Personal brain data’ is information relating to the functioning or structure of the human brain of an identified or identifiable individual that includes unique information about their physiology, health, or mental states.
Sound regulation is key to enable robust innovation trajectories. Soft-law
measures and self-regulation are important building blocks of responsible
innovation. However, clear and better aligned regulatory frameworks are equally
needed to create certainty and ensure a high-level of user protection. Overall a
functional, bottom-up approach, starting with the assessment of the technical
peculiarities of different classes of applications, is to be preferred to the adoption
of broad and all-encompassing principles. Simplification of existing solutions, in particular in fields such as civil liability, ought to be pursued, including by replacing existing strategies with a risk-management approach. Standardisation and product safety regulation are also essential to ensure user protection and to set clear compliance criteria that developers need to abide by. Ethical guidelines, even when they reflect differences in culture, traditions and sensitivities, are not intended as a replacement for regulation, but they contribute to the development of a responsible research and innovation approach.
Standards are critical. Standards for neurotechnology innovation can help ensure
a positive impact on health and society. Harmonized terminology, processes and
standards not only enable investment in brain science and neurotechnology
development, they also form the basis for impartiality, equal treatment,
confidentiality, ethics, scientific integrity and transparency. International efforts on
the standardisation of neurotechnology system specification and interoperability
would help communication and collaboration across major brain research
initiatives and the private sector.
There are large potential gains to be derived from data sharing. International
collaboration in neurotechnology innovation should include a focus on sharing of
personal brain data. Significant cultural differences exist, and a diversity of
governance systems can complicate data sharing. The standardisation of personal
brain data collection, curation, and sharing will not only drive new discovery, but
will also be essential to obtain broader value from the data. Intellectual property
lies not in the data itself, but in the discoveries that can emerge from its analysis.
Privacy concerns will always have to be taken into account.
Public deliberation can contribute directly to value creation. Public
engagement is critical in the development of robust neurotechnology futures and
for a comprehensive governance approach. Innovation in neurotechnology must be
a collaboration between science and society: currently, the public is frequently
viewed through the lenses of knowledge deficits and trust deficits. There is a need
for a broader discussion to help define goals and elaborate scientific questions. Such
a discussion is critical for developing trust and trustworthiness with end users, and
can help tailor emerging technologies better to the needs of those they are designed
to help.
Investors play a key role in enabling responsible innovation. Investment is the
lifeblood of the start-up driven neurotechnology industry, without which
innovations cannot reach the marketplace. Questions of funding, public-private
partnerships, grants, and public markets play a key role for addressing challenges
of responsible innovation effectively. Guidance on “responsible investment” could
help support such efforts.
There would be utility in developing a set of international Principles.
‘Principles for Responsible Innovation in Neurotechnology’, such as those being
developed by the OECD Biotechnology, Nanotechnology, and Converging
Technologies Working Party (BNCT), could complement, inform, and harmonize
international guidance and norms. These Principles could support responsible
innovation in neurotechnology, help governments better assess the ethical, legal
and social issues (ELSI) of these technologies, and elicit policy responses that
maximize benefits while minimizing risks. They should not generalize across the
entire spectrum of neurotechnology and should be aimed at all actors in the
innovation process. Any movement towards Principles should recognize the
diversity of ethical values across countries and make acceptable accommodations,
yet identify common ground on which norms, standards and regulatory provision
can stand.
1. Introduction
Emerging neurotechnologies, defined as “devices and procedures that are used to access,
monitor, investigate, assess, manipulate, and emulate the structure and function of neural
systems” (Giordano, 2012[1]; OECD, 2017[2]), have the potential to radically change how we understand human cognition and behaviour. They also offer tremendous potential for the promotion of health, well-being, and innovation-driven economic growth. Non-invasive wearable devices that use EEG monitoring of cortical zones can help map and train brain activity and can steer machines through user-controlled brain-computer interfaces, which could be especially important for individuals with motor disabilities. They can also provide real-time feedback on current cognitive patterns and can deliver transcranial stimulation to modulate brain activity.
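To illustrate the kind of real-time feedback such wearables can provide, the minimal sketch below (in Python, using NumPy and SciPy) computes a toy relaxation score from the ratio of alpha to beta band power in a short EEG window. The sampling rate, frequency bands, and synthetic signal are illustrative assumptions and do not describe any specific commercial device.

    import numpy as np
    from scipy.signal import welch

    FS = 256  # assumed sampling rate (Hz) of a hypothetical single-channel EEG wearable

    def band_power(signal, fs, band):
        """Average spectral power of `signal` within a frequency band (Hz)."""
        freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
        lo, hi = band
        return psd[(freqs >= lo) & (freqs <= hi)].mean()

    def relaxation_index(eeg_window):
        """Toy feedback score: ratio of alpha (8-12 Hz) to beta (13-30 Hz) power.

        Higher values are often read as a more relaxed cortical state; real devices
        rely on far more elaborate, validated processing pipelines.
        """
        return band_power(eeg_window, FS, (8, 12)) / band_power(eeg_window, FS, (13, 30))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.arange(0, 4, 1 / FS)  # a 4-second analysis window
        # Synthetic signal: a 10 Hz alpha rhythm buried in broadband noise.
        window = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
        print(f"relaxation index: {relaxation_index(window):.2f}")

In a neurofeedback setting, a score of this kind would be recomputed on successive windows and displayed to the user; the same features could, in principle, also drive a simple brain-computer interface command.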
Neurotechnology is also increasingly becoming a data science, redefining what is possible
in terms of monitoring and intervention in clinical and non-clinical settings, with great
promise for improving mental health, well-being and productivity. Here, the convergence
between neuroscience, engineering, digitalisation, and AI is a key driver of innovation and
will disrupt existing practices as well as traditional boundaries between medical therapies
and consumer markets. For example, digital phenotyping technology as developed by the
company Mindstrong Health and others can help anticipate emerging mental health
problems through pattern recognition in cell phone usage, and launch targeted
interventions. AI-driven clinical software support tools, such as Predictix (an AI-driven approach to personalized medicine; Taliaz, Israel) or Aifred Health (which uses machine learning techniques to predict treatment efficacy; Aifred Health, Canada), can be used to personalize antidepressant medication and improve mental health treatments.
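As a rough illustration of the digital phenotyping idea described above, the sketch below trains a simple classifier on made-up phone-usage features to flag elevated risk. Everything here, including the feature names, the data, and the model choice, is hypothetical; it is not the method used by Mindstrong Health, Taliaz, or Aifred Health.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)

    # Hypothetical daily phone-usage features per person:
    # [typing speed (chars/s), night-time screen hours, messages sent, app switches/hour]
    X = rng.normal(loc=[3.0, 1.0, 40, 20], scale=[0.8, 0.6, 15, 8], size=(200, 4))
    # Made-up labels: 1 = later reported a depressive episode, 0 = did not.
    y = (X[:, 1] + 0.05 * X[:, 3] - 0.5 * X[:, 0] + rng.normal(0, 0.5, 200) > 0.5).astype(int)

    # A linear model is only a stand-in for the pattern-recognition step;
    # real digital-phenotyping pipelines are far richer and clinically validated.
    model = LogisticRegression().fit(X, y)

    new_user = np.array([[2.1, 3.5, 12, 35]])   # hypothetical recent usage pattern
    risk = model.predict_proba(new_user)[0, 1]
    print(f"estimated risk score: {risk:.2f}")  # could trigger a check-in prompt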
These developments are not neutral; they foreseeably have an impact on societies, for example on how human health and behaviour are judged and managed, and on which forms of medical intervention are considered legitimate. Neurotechnology therefore holds tremendous opportunities to improve health and well-being through innovation, but also raises questions about its responsible governance and use. These questions concern, for example, the possibility of human enhancement, changes to personality, and interventions in self-perception. Also, issues around unauthorized use and misuse of personal brain data have
become more tangible in the wake of recent privacy breaches in the social networking
community. Other governance issues are raised when products intended for clinical use are
used in non-therapeutic settings. In many of these questions, neurotechnology is unique in
part because of the close connection of the brain and cognition to human identity and
agency (Nuffield Council on Bioethics, 2013[3]).
Ethical, legal and social issues (ELSI) surrounding neurotechnology affect the entire
innovation pipeline, from fundamental brain research, cognitive neuroscience, and other
brain-inspired sciences (Jeong et al., 2019[4]; Greely, Ramos and Grady, 2016[5]; Salles
et al., 2019[6]) to questions of commercialization and marketing (e.g. direct-to-consumer
marketing of wearable, non-invasive applications based on claims about improvement of
cognitive performance and well-being) (Eaton and Illes, 2007[7]; Martinez-Martin and
Kreitmair, 2018[8]; Wexler, 2016[9]). The translation of neurotechnology into medical
settings raises yet another set of issues, e.g. around the protection of health data acquired
through neurodiagnostic devices, or the trust placed in medical assessment tools based on machine learning (Finlayson, Bowers and Ito, 2019[10]).
The private sector is a major driver of neurotechnology innovation, benefitting from large-
scale national or international brain research and technology initiatives. Companies and
investors hence play a key role for ensuring the responsible development and governance
of emerging technologies, alongside public sector research actors such as universities and
governments. Yet, tackling questions of responsible innovation at the interface between
public and private sector interests raises a number of challenges, as revealed by the OECD
Shanghai Workshop. Companies face very different constraints and environments for
research and development than public institutions, including an imperative of speed, scale,
and profitability. Data collection and sharing raise additional issues for many products and
services. Companies are facing heterogeneous and potentially rapidly changing regulatory
landscapes across countries and regions. Yet another challenge is how to mobilize investment so as to enable the responsible translation of cutting-edge neuroscience into markets, with a view beyond purely financial returns toward the public good.
Novel anticipatory frameworks and good practices for the responsible governance of novel
neurotechnology are beginning to emerge, both from the private and the public sector. In
the private sector in particular, this includes for example the appointment of advisory
boards on ethical, legal and social questions; the development of internal guidelines and
principles; greater emphasis on responsible technology transfer; and interest in socially
responsible investment. Principles for Responsible Innovation in Neurotechnology, as
currently under development by the OECD, could provide a reference for governments and
innovators for the responsible translation of brain research into products and markets.
2. A public health priority and a market opportunity
Mental health is an increasingly important public health concern in OECD countries and
beyond. Mental and neurological disorders cause great human suffering and are increasingly recognized as major causes of death and disability worldwide (Feigin et al., 2019[11]; James
et al., 2018[12]; Vos et al., 2016[13]) (see Figure 1). They often remain untreated and impose
significant economic and social welfare costs, elevating their importance to the highest
national and international policy levels. In Europe, mental illnesses (e.g. depression,
anxiety disorders and alcohol and other drug use disorders) alone affect more than one in
six people, with an estimated total cost of over EUR 600 billion in 2015 (OECD/EU,
2018[14]). The direct and indirect costs of mental health problems are significant, and can
amount to over 4% of GDP (Hewlett and Moran, 2014[15]). A report by the World Economic
Forum and the Harvard School of Public Health (2011[16]) estimated the global economic
costs of mental health conditions in 2030 at USD 6 trillion.
Figure 1. Years Lived with Disability (YLD, %) for some non-communicable diseases
Note: To estimate Years Lived with Disability (years of life lived with any short-term or long-term health loss,
YLD) for a particular cause in a particular time period, the number of incident cases in that period is multiplied
by the average duration of the disease and a weight factor that reflects the severity of the disease on a scale
from 0 (perfect health) to 1 (dead). Incidence: the number of new cases of a given disease during a given period
in a specified population. Neurological disorders are diseases of the central and peripheral nervous system (e.g.
epilepsy, Alzheimer’s disease and other dementias, cerebrovascular diseases including stroke, migraine and
other headache disorders, multiple sclerosis, Parkinson's disease, neuroinfections)
(http://www.who.int/features/qa/55/en/). Mental disorders comprise a broad range of problems generally
characterized by some combination of abnormal thoughts, emotions, behaviour and relationships with others
(e.g. schizophrenia, depression, intellectual disabilities and disorders due to drug abuse)
(http://www.who.int/mental_health/management/en/).
[Figure: bar chart of YLD (%) by cause. Musculoskeletal disorders: 15.91; Mental disorders: 14.41; Neurological disorders: 8.59; Diabetes and kidney diseases: 5.34; Chronic respiratory diseases: 5.25; Cardiovascular diseases: 4.19; Neoplasms: 0.91.]
Source: Institute for Health Metrics and Evaluation (IHME), USA (http://ghdx.healthdata.org/;
http://ghdx.healthdata.org/gbd-results-tool), accessed 25 April 2019. Year: 2017. List of countries:
http://ghdx.healthdata.org/countries. The Institute for Health Metrics and Evaluation (IHME) is an independent
global health research center at the University of Washington (USA). The Global Health Data Exchange
(GHDx) is a data catalog created and supported by IHME.
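In simplified form, the incidence-based YLD calculation described in the note to Figure 1 can be written as $\mathrm{YLD}_c = I_c \times D_c \times DW_c$, where $I_c$ is the number of incident cases of cause $c$ in the period, $D_c$ the average duration of the disease, and $DW_c$ the disability weight reflecting severity (0 for perfect health, 1 for death).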
As a public policy topic, mental health is strongly related to global demographic trends,
especially ageing populations in developed countries (United Nations, 2017[17]; World
Health Organization, 2013[18]). By 2050, the world population will likely grow to 9.8 billion
people, with one in five aged 60 years or older (United Nations, 2017[17]). In Japan, the
proportion of people older than 60 years already exceeds 30%. Many other countries in North America and the EU, as well as Chile, South Korea, and Australia, face similar issues
of demographic ageing (World Health Organization, 2015[19]).
Dementia is one of the main targets of mental health initiatives and research world-wide.
Dementia is a general term for progressive (usually age-related) decline in brain
functionality affecting memory, thinking, behaviour and emotion. Dementia affects 50
million people worldwide with an estimated worldwide cost in 2018 of USD 1 trillion
(including costs for informal care) (Alzheimer’s Disease International (ADI), 2018[20]). By
2050, it is estimated that 152 million people will be living with dementia (Alzheimer’s Disease International (ADI), 2018[20]). Currently there
is no cure for dementia and no effective treatment that can stop disease progression. Despite
remarkable discoveries in dementia research, drug development in Alzheimer’s disease and
other dementias has been marked by disappointments (Hodges, 2015[21]; Larson, 2018[22]).
Systems theory, precision pharmacology and medicine (Hampel et al., 2017[23]; Hampel
et al., 2019[24]), the convergence of engineering science and artificial intelligence (Ding
et al., 2019[25]), and the development of novel health technologies (OECD, 2017[26]) offer
powerful tools to better understand human brains and to close the treatment gap in
Alzheimer’s disease and other dementias. As part of the EU Human Brain Project (HBP), scientists are developing real-time simulations of large biological neural networks, with the aim of better understanding neural processing in the brain and shedding light on the pathological processes leading to disorders such as epilepsy and Alzheimer's disease (van Albada et al., 2018[27]).
Dementia and other mental health diseases are a major driver of current neuroscientific research and technology development, both in the public and the private sector. Studies suggest that a cognitive reserve (cognitive resilience) can help individuals tolerate more neurodegeneration with less functional decline and fewer psychiatric symptoms (Arenaza-Urquijo, Wirth and Chételat, 2015[28]; Livingston et al., 2017[29]). It could be argued that factors that potentially influence cognitive reserve, such as genetics and epigenetics,
education, social inclusion, and mental and physical stimulation, open up new avenues for
diagnosis, prevention, and therapy in Alzheimer’s disease and other dementias (Russ,
2018[30]; Weiler et al., 2018[31]). For example, AI-supported analysis of digital data from
smart phones could offer surrogates for laboratory-based neuropsychological assessment
(Dagum, 2018[32]), and initial studies indicate potential efficacy of deep brain stimulation
(DBS) in Parkinson's disease (Hickey and Stacy, 2016[33]; Limousin and Foltynie, 2019[34]).
However, the use of computerized cognitive training as an option for maintaining cognitive
function in normal aging has shown inconclusive results (Gates et al., 2019[35]).
Yet, neurotechnology comprises a much more expansive set of research and economic
activities. The growing interest in neurotechnology is linked to key industries such as
healthcare, education, information and communication technology, and law enforcement.
Beyond clinical applications, neurotechnology also has significant potential for the
development of direct-to-consumer (DTC) products and services, for example around the
self-monitoring of cognitive health and well-being, optimizing cognitive performance,
education, and communication technology (Ienca, Haselager and Emanuel, 2018[36]).
Table 1. Key patent filing locations.
Numbers of new patents filed 2008-2016 for health-related neurotechnology.
Priority country               2008   2009   2010   2011   2012   2013   2014   2015   2016   Total
United States                  1067   1092    994   1134   1113   1354    970    943    851    9518
China                           101     82    166    211    310    363    481    779    943    3436
Korea                            57     56     65     72     89     93    141    119    131     823
Japan                            57     75     67     68     49     76     78     84     49     603
Patent Co-operation Treaty       24     20     47     64     46     68     61     64     53     447
Russia                           18     27     26     26     30     27     26     49     76     305
Germany                          45     35     53     49     44     33     58     43     38     398
European Patent Office           33     24     26     20     49     42     33     54     43     324
United Kingdom                   25     11     13     15     20     29     28     35     45     221
Australia                        48     34      6     31     12     20     12     19     16     198
Note: This Table shows the numbers of patents filed 2008-2016 within the area of health-related
neurotechnology for each of the top 10 priority filing locations (United States, People’s Republic of China,
Korea, Japan, Patent Co-operation Treaty3, Russia, Germany, European Patent Office (EPO), United Kingdom,
Australia). Priority filing location: the patent authority in which the first registration took place. Key search
terms used for health-related neurotechnology: neuromodulation, neuroprosthetic, neurorehabilitation,
neurosensing, brain-computer interface, neuroimaging, mental health, mental disorders, neurological disorders,
diagnostics, therapeutics, health monitoring, prevention.
Source: The primary data source for this analysis was the Derwent World Patents Index™, as accessed via the
Derwent Innovation™ platform - both produced by Clarivate Analytics (June 2019).
Table 2. Key source of innovation countries.
Numbers of patents filed 2008-2016 for key source of innovation countries, based on inventor activity, for health-related neurotechnology.
Source of innovation country   2008   2009   2010   2011   2012   2013   2014   2015   2016   Total
United States                   921    917     ..    896    872   1102    795    753    675    7775
China                           109     72     ..    207    325    377    479    778    760    3224
Korea                            15     13     ..     18     60     95    142    117    129     612
Germany                          53     55     ..     82     61     57     72     69     54     555
Australia                        63     41     ..     76     53     53     50     51     54     463
Israel                           27     47     ..     58     55     47     34     51     30     375
Canada                           25     19     ..     16     34     57     34     31     29     279
Switzerland                      29     25     ..     25     24     39     40     31     28     286
Japan                            11     32     ..     33     22     40     34     38     28     262
France                           11     17     ..     21     16     30     53     43     31     239
Note: This Table shows the number of patents filed 2008-2016 within the area of health-related
neurotechnology for each of the key source of innovation countries (United States, People’s Republic of China,
Korea, Germany, Australia, Israel, Canada, Switzerland, Japan, France). Source of innovation countries:
address of inventor filing a patent. Key search terms used for health-related neurotechnology: neuromodulation,
neuroprosthetic, neurorehabilitation, neurosensing, brain-computer interface, neuroimaging, mental health,
mental disorders, neurological disorders, diagnostics, therapeutics, health monitoring, prevention.
3 https://www.wipo.int/pct/en/texts/articles/atoc.html
Source: The primary data source for this analysis was the Derwent World Patents Index™, as accessed via the
Derwent Innovation™ platform - both produced by Clarivate Analytics (June 2019).
Worldwide, a total of 16 273 patents in health-related neurotechnology were filed at key filing locations from 2008 to 2016 (see Table 1). In most countries the filings show increasing patent activity over the years, with the USA (9 518), People’s Republic of China (3 436), Korea (823), and Japan (603) as leading markets. Complementary to the key filing locations, the data shown in Table 2 provide information about the countries with high innovation (research) activity in health-related neurotechnology: United States, People’s Republic of China, Korea, Germany, Australia, Israel, Canada, Switzerland, Japan, and France.
Medical device companies such as Boston Scientific (USA), followed by Medtronic (USA/Ireland), Cochlear Limited (Australia), and Advanced Bionics (USA), are among the top patent applicants by invention volume, indicating dynamic activity of these assignees in the field of health-related neurotechnology. Academic institutions, such as the University of California (USA) and Tsinghua University (People’s Republic of China), are also noted among the top entities in this area (see Figure 2).
Within health-related neurotechnology, the following technological categories show high patent activity (total patent filings, 2008-2016): neuromodulation (10 375), neuroprosthetics (7 432), neuroimaging (1 854), neurosensing (1 768), neurorehabilitation (1 094), and brain-computer interfaces (574); see Figure 3. The relatively high patent activity for neuromodulation technologies confirms this category as an important area of innovation.
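As an illustration of how counts of this kind can be derived from raw patent records, the sketch below aggregates a hypothetical export of patent filings by priority country and year. The field names and the tiny inline dataset are invented for illustration; they do not reflect the actual Derwent World Patents Index schema or data.

    import pandas as pd

    # Hypothetical extract of patent records (not the Derwent schema).
    records = pd.DataFrame([
        {"priority_country": "United States", "priority_year": 2015, "category": "neuromodulation"},
        {"priority_country": "United States", "priority_year": 2016, "category": "neuroimaging"},
        {"priority_country": "China",         "priority_year": 2016, "category": "neuromodulation"},
        {"priority_country": "Korea",         "priority_year": 2015, "category": "brain-computer interface"},
    ])

    # Count filings per priority country and year, mirroring the layout of Table 1.
    by_country_year = (
        records.groupby(["priority_country", "priority_year"])
               .size()
               .unstack(fill_value=0)
    )
    by_country_year["Total"] = by_country_year.sum(axis=1)
    print(by_country_year)

    # Count filings per technology category, mirroring Figure 3.
    print(records["category"].value_counts())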
Figure 2. Numbers of patents filed by key applicants.
Total numbers of patents filed 2008-2016 for health-related neurotechnology.
Note: This Figure provides an analysis of the total number of patents filed 2008-2016 within the area of health-
related neurotechnology for each of the top 10 priority filing locations (United States, People’s Republic of
China, Korea, Japan, Patent Co-operation Treaty, Russia, Germany, European Patent Office (EPO), United
Kingdom, Australia). Priority filing location: the patent authority in which the first registration took place. Key
search terms used for health-related neurotechnology: neuromodulation, neuroprosthetic, neurorehabilitation,
neurosensing, brain-computer interface, neuroimaging, mental health, mental disorders, neurological disorders,
diagnostics, therapeutics, health monitoring, prevention.
Source: The primary data source for this analysis was the Derwent World Patents Index™, as accessed via the
Derwent Innovation™ platform - both produced by Clarivate Analytics (June 2019).
[Figure: horizontal bar chart (scale 0-1000 patents) of total filings by key applicants: Tsinghua University (PR China), University of California (USA), Globus Medical Inc (USA), MED-EL (Austria), Advanced Bionics (USA), Cochlear Limited (Australia), Medtronic (USA/Ireland), Boston Scientific (USA).]
Figure 3. Patent activity for selected types of health-related neurotechnology
Total numbers of new patents filed 2008-2016 for selected types of health-related neurotechnology.
Note: This Figure shows the total numbers of new patents filed of selected types of health-related
neurotechnology filed 2008-2016 for each of the top 10 priority filing locations (United States, People’s
Republic of China, Korea, Japan, Patent Co-operation Treaty, Russia, Germany, European Patent Office (EPO),
United Kingdom, Australia). Priority filing location: the patent authority in which the first registration took
place. One invention can fall into more than one category. Each category’s invention count is independent of
other categories.
Source: The primary data source for this analysis was the Derwent World Patents Index™, as accessed via the
Derwent Innovation™ platform - both produced by Clarivate Analytics (June 2019).
2.1. Enabling translational brain research
In recent years, neuroscience has experienced a massive increase in research activity
and funding through large-scale, national and trans-national brain research initiatives, such
as the EU Human Brain Project (HBP), the Japanese Brain/MINDS project, the Korea
Brain Initiative, the U.S. Brain Research through Advancing Innovative Neurotechnologies
(BRAIN) Initiative®, the emerging Australian Brain Initiative, and the China Brain
Project. These initiatives aim to shed light on the biological basis of mental and
neurological processes and disorders, and on how to define cognition, emotion and
consciousness. They are also a major driver of technology development, both through new
tools to understand the brain and through commercial applications arising from this
understanding (OECD, 2017[2]).
Table 3. Issues and opportunities in neurotechnology translation
Issue: Inadequate access to cutting-edge technology for clinical research in academic institutions.
Possible solution: Development of technology platforms, including public-private partnerships, that both invest in early-stage research and have the capacity to take discoveries to market.

Issue: Large neuroscience data sets require significant resources for validation, management, storage, and analysis.
Possible solution: Standardisation of data should start when the data are generated in order to optimize sharing and downstream use. Governments must support the infrastructure to manage and store valuable data over the longer term.

Issue: Complex and lengthy contract negotiations between public institutions and private entities in partnerships.
Possible solution: Guidance on how to simplify processes for research translation and collaboration with companies.

Issue: Stringent ownership of data and IP.
Possible solution: Sharing publicly funded neuroscience data in an open science environment and allowing IP on discoveries that develop novel ways to use the data.

Issue: Ethical, legal, and social issues for translation and technology use.
Possible solution: Development of guidelines and principles analysing how novel technologies impact individuals and society; implementation of frameworks of responsible innovation upstream.

Issue: Unrealistic expectations by users of neurotechnology and the broader public diminish trust.
Possible solution: Avoid hype and provide evidence-based information for experts and the public; foster stakeholder engagement and communication between research participants, patients, and members of the public.
Source: OECD Shanghai Workshop, September 2018.
Translational brain research, i.e. the application of novel neuroscientific or biological knowledge and clinical trials of novel techniques and therapies that address critical medical and health needs, is one goal of all of these flagship initiatives. Translational principles are reflected in project design in various ways. For example, governments use these initiatives to actively foster connectivity across diverse stakeholders and disciplines, at the national level and globally, in order to help transfer knowledge into novel neurotechnology. These flagship brain research and technology initiatives are large-scale,
complex, and heterogeneous endeavours, reflecting an understanding that brain science and
neurotechnology are platform technologies enabling broad applications to multiple
products, processes, and markets. They emphasize collaboration, openness, and
information sharing as important factors in realising opportunities and managing risks,
especially in novel, often disruptive technologies.
Brain research initiatives are also important vehicles for governments to shape the
neuroscience research agenda towards concrete policy goals in public health and social
well-being. Some initiatives include explicit considerations of how to integrate elements of
social responsibility and ethics into their technology transfer, business practices, research
and development (R&D), and corporate governance. In order for those technologies to be integrated into society, they need to be developed together with society, brought to markets, and disseminated broadly beyond the laboratory or company where they originated. Closer
collaboration between brain initiatives around the world will accelerate discovery and
innovation. These international, public-private collaborations could offer a ‘test bed’ for
new approaches to information sharing, intellectual property (IP) management, public
engagement, and incentivising open science and responsible innovation. Possible
roadblocks and solutions for the translation of research into products within brain initiatives
are summarized in Table 3.
Translational ambitions go beyond the scope of individual initiatives. The International
Brain Initiative (IBI) is a new global body that has formed to coordinate the activities of
the major brain initiatives around the globe.4 The vision of the IBI is to catalyse and advance ethical neuroscience research through international collaboration and knowledge sharing, uniting diverse ambitions to expand scientific possibility and disseminating discoveries for the benefit of humanity. Working groups have been formed to coordinate global neuroethics, compile an inventory of projects across the initiatives, and advance data sharing, tool and technology dissemination, education and training, and communication and public outreach. The IBI seeks to engage with governments, policy makers, and global organisations.
4 http://www.internationalbraininitiative.org/
3. Ethical, legal and social challenges
The 2018 OECD Shanghai Workshop “Minding neurotechnology: delivering responsible
innovation for health and well-being” showed that neurotechnology governance requires
serious engagement with the private sector to ensure responsible development in this
domain. Emerging neurotechnology products and services in neuroimaging, brain-
computer interfaces, and neurostimulation are raising questions for companies and
consumers alike, for example on the privacy of personal brain data, the reliability and
validity of automated cognitive assessment, and potential off-label and misuses of
neurotechnologies (Bowman et al., 2018[37]; Garden and Winickoff, 2018[38]; Kaebnick and
Gusmano, 2018[39]; Nuffield Council on Bioethics, 2013[3]; Müller and Rotter, 2017[40]).
The meeting also demonstrated that private sector actors are keenly aware of the need to
bring innovation processes into alignment with societal needs, values and expectations in
order to reap the full potential of their innovations.
The ethical, legal and social challenges surrounding these emerging technologies affect the
entire innovation pipeline, from fundamental brain science (e.g. acquiring informed
consent) to questions of commercialization and marketing (e.g. direct-to-consumer
marketing of wearable, non-invasive applications based on claims about improvement of
cognitive performance and well-being) (Eaton and Illes, 2007[7]; Martinez-Martin and
Kreitmair, 2018[8]; Wexler, 2016[9]). The translation of neurotechnology into medical
settings raises yet another set of issues, e.g. around the protection of health data acquired
through neurodiagnostic devices, or the trust placed in medical assessment tools based on machine learning (Finlayson, Bowers and Ito, 2019[10]).
The potential effects might be both more subtle and more transformative than anticipated
in crude visions of ‘mind control.’ For example, if benign forms of cognitive training or
neuro-stimulation enhance educational or other performance outcomes, they might create
implicit expectations by employers and society at large, and put at a disadvantage those who cannot afford them. Neurological information could also be reflected in insurance
rates, creating new strata of vulnerable populations.
Additional consequential effects might unfold at the interface between neuro and data
science. With respect to democracy and political participation, recent scandals such as those
involving political analytics company Cambridge Analytica and Facebook have revealed
the vulnerability of our political systems to concerted efforts of behavioural data gathering
and targeted manipulation of social media. Likewise, targeted advertising based on digital
phenotyping fuelled by big data and machine learning could, for example, enable retailers
to increase sales, including of unhealthy products such as cigarettes, alcohol, and high-
calorie foods, to those most susceptible to them (e.g. people with a propensity towards alcoholism
or addictive behaviour). This kind of manipulation of purchasing and consumption patterns
can have direct and significant impacts on public health and on the costs of maintaining
public welfare systems.
Other issues in the responsible development and use of neurotechnology include:
Digital footprint: the adoption of neurotechnology and other personal health
technologies in academia, clinical applications, and consumer markets, and recent
privacy and security breaches in the social networking community have raised
ethical, legal, and social questions about people’s digital footprints, data ownership,
storage, sharing, and validation (Greenberg, 2018[41]; Hernandez, 2018[42]). Ienca et
al. (2018[36]) argued that “creating an ecosystem that enables technological
innovation while making sure that citizens have control over their data is critical
for neurotechnology”.
Manipulation: neuroimaging and brain stimulation technologies are being used with growing potential in research and increasingly in the clinic. There is also significant potential for use in courts to test the veracity of testimony and for marketing
purposes (Smith, 2013[43]). Decoding brain activity, thoughts, and mental states
bears the risk of unauthorized monitoring, judgement, manipulation, and
discrimination (Poldrack, 2017[44]; Racine and Affleck, 2016[45]; Robillard and Illes,
2016[46]). It should be noted, however, that validating systems and providing the
evidence about the ‘truth of people’s thoughts’ in real world settings remains a
major challenge. Given that personal brain data are privacy-sensitive data types that
can potentially reveal predictive information about health status, mental states and
behaviour, and that the manipulation of brain activity via brain stimulation can
influence personal identity, neurotechnology also raises important implications from the perspective of human rights (Cabrera, Evans and Hamilton, 2014[47]; Ienca
and Andorno, 2017[48]).
Transparency: emerging technologies are not neutral and can impact and
fundamentally alter society. Drawing on work on AI, a “Meticulous Transparency”
assessment has been developed by Benrimoh et al. (2018[49]). The framework requires developers to provide details on intentionality, scope of use, data sources and bias control, human interpretability, the projected risks and benefits of the product, and monitoring and contingency plans for adverse events (a sketch of what such a declaration might look like follows this list).
Technology misuse: the complexity and disruptive potential of recent advances in
neurotechnology have raised public concerns about their potential misuse
(Bowman and Husbands, 2011[50]). Examples include the unsafe use of do-it-yourself (DIY) technology for cognitive enhancement, malicious ‘neuro-hacking’, and neuro-doping, all of which require discussion by all stakeholders (Aicardi et al., 2018[51]; Park, 2017[52]; Wexler, 2017[53]).
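To illustrate how the transparency dimensions listed above could be operationalized in practice, the sketch below encodes a hypothetical "meticulous transparency" declaration as a simple structured record. The field names and example values are illustrative assumptions, not part of the framework proposed by Benrimoh et al.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class TransparencyDeclaration:
        """Hypothetical structured record covering the transparency dimensions
        discussed above (intentionality, scope, data and bias, interpretability,
        risks and benefits, monitoring). Field names are illustrative only."""
        product_name: str
        intended_use: str                      # intentionality
        scope_of_use: str                      # e.g. clinical decision support only
        data_sources: List[str]
        bias_controls: List[str]
        human_interpretability: str
        projected_risks: List[str]
        projected_benefits: List[str]
        adverse_event_monitoring: str

    declaration = TransparencyDeclaration(
        product_name="ExampleNeuroApp",        # invented name
        intended_use="Support clinicians in selecting antidepressant treatment",
        scope_of_use="Adjunct to clinical judgement; not for standalone diagnosis",
        data_sources=["de-identified clinical trial data", "consented user data"],
        bias_controls=["performance audited across age and sex subgroups"],
        human_interpretability="Feature-level explanations shown to the clinician",
        projected_risks=["over-reliance by clinicians", "misclassification"],
        projected_benefits=["faster treatment selection", "fewer failed treatment trials"],
        adverse_event_monitoring="Quarterly review of outcomes and incident reports",
    )
    print(declaration.product_name, "-", declaration.intended_use)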
Will cognitive enhancement technologies be designed to maximize certain behaviours that
favour the interests of the most powerful players? Will commercial EEG or other self-
monitoring technologies be used to fuel a new “bio-advertising” market, where personal
biological measures are used to target products to people in more and more irresistible
ways? Will personal brain data exacerbate current tendencies of irresponsible and non-
transparent data collection and monopolization? These are the questions that drive current
discussions around regulatory scrutiny and responsible business development in
neurotechnology.
The potential of neurotechnologies to influence human behaviour in ways that society may
not be aware of should not be taken lightly, especially given that many of these technologies
are being developed with behaviour alteration as an explicit goal (i.e. those technologies
that are therapeutics for mental illness). This does not detract from their enormous potential
to benefit human health and well-being, but highlights the need for caution in the use of
technologies that exploit biases and motivations that can influence human thinking and
decision making. It will also be important to ensure that these technologies are
democratized, so that neurotechnologies and AI are not solely tools to be used by and to
benefit those who can afford their development.
Box 1. Opportunities and risks in human-computer interface technology
The development of human–computer interfaces and other cognitive technologies affects
innovation and productivity through many routes, for example, increasingly intelligent and
autonomous machines and systems, simulation-driven approaches to pre-clinical testing of
potential therapies, and predictive analytics of health data in personalized medicine
(OECD, 2017[26]). Human-computer interfaces draw on, for example, neuroscience,
software engineering, sensing technologies, and neuromorphic engineering (Bainbridge and
Roco, 2016[54]).
The digital transformation of industries and the health sector will further strengthen the ties
between human, machines, and algorithms. Enterprises increasingly rely on a mix of digital
technologies and automated systems for their productivity. In the clinical sphere, brain-computer interfaces (BCIs) offer important solutions to public health needs and for patients in neurorehabilitation (Abdulkader, Atia and Mostafa, 2015[55]; Wolpaw and Winter Wolpaw, 2012[56]). In addition, BCIs can be used to implicitly communicate information to a machine, allowing for neuroadaptive technology (Zander et al., 2016[57]). In that way, the interaction between human operators and machines becomes more natural and intuitive, and work becomes more productive (Zander and Kothe, 2011[58]).
Although techniques for human-computer interaction have become increasingly user-friendly, they still require the human operator to translate an original thought or intention into a sequence of small, explicit commands to the computer, which presents both a communication bottleneck and a source of potential error. New approaches to human-computer interfaces that preserve the resources of the human operator while enabling them to use the full potential of the machine could widen this bottleneck and minimize the risk of failure.
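As a toy illustration of the neuroadaptive idea sketched above, the snippet below maps a decoded mental-state estimate (e.g. workload inferred from a passive BCI) to an adjustment of the machine's level of automation, without any explicit user command. The decoding step is stubbed out and the threshold values are placeholders; a real system would rest on validated classifiers and careful human-factors design.

    def decode_workload(eeg_features):
        """Placeholder for a passive-BCI decoder returning workload in [0, 1].

        In a real neuroadaptive system this would be a validated classifier
        trained on the operator's EEG; here it is simply an average of features.
        """
        return min(max(sum(eeg_features) / len(eeg_features), 0.0), 1.0)

    def choose_automation_level(workload, current_level):
        """Adjust machine assistance implicitly, without explicit user commands."""
        if workload > 0.7:           # operator overloaded: take on more of the task
            return min(current_level + 1, 3)
        if workload < 0.3:           # operator has spare capacity: hand control back
            return max(current_level - 1, 0)
        return current_level         # otherwise leave the division of labour as is

    if __name__ == "__main__":
        level = 1
        for features in ([0.8, 0.9, 0.75], [0.5, 0.4, 0.6], [0.1, 0.2, 0.15]):
            level = choose_automation_level(decode_workload(features), level)
            print(f"workload features {features} -> automation level {level}")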
Even though significant advances have been made in this area by utilizing machine learning
for smart automation, risks and consequences of system fragility may increase and the
ability to anticipate system failures could diminish (Leveson, 2011[59]). Here, the merger
between human oversight and artificial intelligence (AI) in human-computer interfaces
could further promote a human-centric approach and increase the robustness of systems
through simultaneous and continuous learning (OECD/EU, 2018[14]; OECD, 2019[60]). In
this vein, Specker Sullivan and Illes (2018[61]) note that ethics capacities and reporting in
BCI research can be improved through (1) the explicit reflection on the value, goals, and
methods in human subjects study design, and (2) openness and transparency about ethics
practices in reporting.
4. Role of the private sector in neurotechnology governance
Both internal and external drivers have brought the issue of responsibility to the forefront
in neurotechnology businesses. Internally, some companies are already leading by example
by including social responsibility into their core vision of technology development and
establishing their own mechanisms. Interactions between researchers, companies,
regulators, and user and patient communities have been quite strong, including through the
activities of government agencies such as the U.S. National Institutes of Health (NIH). A very active academic community around neuroethics and science & technology studies (STS) has been part of many developments. Externally, neuroscience and neurotechnology have received growing attention internationally, most notably through large-scale flagship research initiatives such as the EU Human Brain Project (HBP) and the U.S. BRAIN Initiative. At the same time, recent public controversies around human enhancement and a broader wave of technology 'backlash' have raised the stakes for the prospects of this nascent sector. This has sparked further collaborations among companies and social scientists in the domain of AI ethics and the new field of 'Public Interest Technology'.
There is currently a window of opportunity to address ethical, legal, and social issues.
Neurotechnology is a relatively young field where many promising applications are still in
research and trial phases. “Upstream” engagement can help avoid costly design lock-ins
and reduce the need for costly adjustments at a later stage to ensure market compatibility
of emerging products and services (see Figure 6). What is more, there is growing awareness of and sensitivity to these issues in the neuroscience community. International
flagship projects such as the US BRAIN Initiative and the HBP are embracing the growing
interactions with neuroethicists and policy makers, which is representative of a general
desire among stakeholders to address potential issues of future applications early on.
4.1. Key opportunities, risks, and barriers
The OECD Shanghai Workshop underscored the potential opportunities arising for
companies from engaging with questions of responsibility. Neurotechnology companies
recognize that they can develop a competitive advantage by building a reputation as responsible technology leaders and demonstrating integrity. An explicit commitment to principles of responsible development of neurotechnologies "upstream" (i.e. responsible design considerations early in the pipeline, as part of the innovation process itself) can boost the social robustness and acceptability of new products and services, increase consumer trust, and ensure that innovation 'really matters to society' (Wilsdon and Willis, 2004[62]). Experience with innovation trajectories in other sectors (e.g. biotechnology or digital platforms) reveals that upstream engagement can be crucial for identifying and
mitigating public concerns early in the development process (Nuffield Council on
Bioethics, 2012[63]). Moreover, there is growing evidence that integrating a plurality of
perspectives upstream in the design of innovations will improve technology design, enable
new creative solutions, and facilitate trustworthy governance (Sutcliffe, 2011[64]).
Nevertheless, there are perceived risks associated with introducing responsibility
mechanisms into the innovation processes indiscriminately. Processes of public
deliberation and anticipatory governance, widely used in the public sector, tend to be time
and resource intensive, and hence can slow down R&D processes or stifle innovation.
Moreover, some companies consider attention to responsibility to lie outside their core mission of revolutionizing healthcare, education, or consumer entertainment, and secondary to the imperative of delivering shareholder value. Workshop participants thus emphasized that
responsibility tools must be carefully tailored to the needs and constraints of the private
sector. At the same time, participants recognized the risks in not addressing questions of
responsibility head-on. A lack of public debate and international standards might lead to a
race to the bottom in terms of regulatory control, or may encourage rogue behaviour that
can erode trust in an entire field through a single "kill event." How best to mobilize societal and regulatory engagement without stifling innovation is a central challenge.
A number of barriers currently prevent stakeholders from effectively addressing questions of responsibility in and with the private sector. First, established pathways for responsible innovation in public sector research (such as deliberative exercises or ethics boards) do not easily translate into the private sector. Second, the unique questions and societal implications of emerging neurotechnologies (such as concerns with human agency, brain privacy, or behavioural control) make it unlikely that tools and approaches mobilized in other technology domains will be directly applicable or effective. Third, many young, innovative companies, especially start-ups, tend to lack the time and resources to commit the necessary organizational capital. Instead, they are primarily bound by investors' demands for scale and returns, which skews incentive structures. Finally, public awareness of some of these issues is limited, so less public debate is happening than would be helpful.
A number of leading neurotechnology companies share a commitment to certain core
values that should guide research and development. These values include maximizing
social impact and health benefits; prioritizing safety and efficacy; committing to integrity,
honesty, and trustworthiness; emphasizing transparency and privacy protection; enabling
responsiveness to social concerns; and ensuring consistency between stated goals and actions (see
Figure 6). Companies recognize that competitive pressures and vested interests may limit
the extent of self-governance that can be expected from the private sector, which provides
a rationale for public-private engagement to develop adequate policies and oversight.
One-size-fits-all solutions for responsibility challenges are not possible: regulatory
approaches will have to be both context and application specific. The particular approach
taken will depend on the area of application, e.g. whether a technology is intended for
scientific research, medical use (prevention, diagnosis, therapy), or non-medical use (well-
being) as well as the envisioned user (e.g. a medical practitioner, a commercial end user, a
company). It also depends on the technology readiness level and the perceived level of risk.
Societal response and corresponding approach depend on the specific social, ethical,
demographic, cultural, and legal environments. Finally, the unique questions and societal
implications raised by novel neurotechnology (such as concerns with human agency, brain
privacy, or behavioural control) make it unlikely that tools and approaches mobilized in
other technology domains will be directly applicable or effective.
Figure 4. Implementing social responsibility into neurotechnology companies
Source: OECD Shanghai Workshop, September 2018.
4.2. The unique position of start-ups
Companies, especially innovative start-ups, face a very specific set of challenges in engaging with questions of social responsibility in their daily routines, as the workshop underscored.
These challenges include an imperative of speed and scale, dependence on investors for
company strategy, a lack of dedicated organizational capabilities to deal with responsibility
in R&D, and unclear regulatory contexts across countries. Business models that explicitly
take into account questions of social responsibility all the way from research to marketing
are still evolving.
(Content of Figure 4.)

Opportunities: build competitive advantage through a reputation as responsible leaders; boost social robustness, acceptability, and consumer trust; identify and mitigate public concerns early; ensure the (social) sustainability of products and services; enhance "product-market fit" by making innovations matter.

Risks: time and resource intensity; slowing down or stifling innovation processes; falling outside the company's core mission; a "race to the bottom" in regulation.

Challenges: no established frameworks for the private sector (public research frameworks not applicable); pressures of speed and scale, especially for start-ups; unique challenges of brain science and neurotechnology; limited organizational and financial resources; complexity of ethical, legal, and social questions; complexity of the global policy and regulatory landscape.

Solutions should be specific to: the area of application (e.g. research, prevention, diagnosis, therapy, or non-medical well-being); the technology user (e.g. direct-to-consumer, medical practitioner, company); technology readiness levels (technologies and applications with immediate impact vs. long-term future options); the risk level (base actions on risk assessment and develop risk management options in accordance with intended or possible unintended uses); and socio-cultural contexts (develop context-specific and inclusive solutions based on social, ethical, demographic, cultural, and legal contexts).

Stated company goals and values: promote health through beneficial applications; prioritize safety and efficacy; integrity and honesty; transparency and privacy protection; awareness of downstream/future impacts; responsiveness to social concerns; consistency between stated company goals and actions.

Drivers: emerging "good practices" from some companies; "responsibility" as core vision; close interactions between companies, regulators, academics, and society; lively neuroethics and STS (Science, Technology and Society) communities; large-scale flagship initiatives in brain science and technology with a societal component; controversy in other sectors (e.g. data privacy).
Established pathways for responsible innovation in public sector research, such as
deliberative exercises or ethics boards, operate on very different timescales or are not easily
brought into corporate R&D processes. Moreover, accountability structures differ.
However, many of the main goals of Responsible Innovation are identical for the public and private sectors: anticipating potential regulatory issues ahead of time, ensuring that research is conducted inclusively so as to benefit from diverse inputs and potential uses, and demonstrating the legitimacy and social licence of ongoing research and development activities.
While practices and tools for responsible innovation for established companies are still
emerging, even less is known about how start-ups or teams at pre-commercial incubation stages can adopt and implement socially responsible innovation methods and
business conduct. Yet, this start-up phase might be even more important for current
development in neurotechnology than a focus on larger firms. Start-ups are becoming some
of the most exciting venues for breakthrough tools in basic and clinical neuroscience, from
visualizing neuronal activity in the mouse brain to digital phenotyping in the clinic. With
this success, a new set of questions is emerging around ethical, legal, and social
implications of neurotechnology in the start-up world. How will data be shared in
companies that are protecting their intellectual property? This might include, for example,
a question about whether experience of early clinical experimentation can be shared (e.g.
through registries) in order to maximize research opportunities and to avoid repetition of
trials with negative results. How is privacy protected for clinical technologies sweeping up
vast amounts of individual neurological or behavioural data? How should governments
regulate software-based tools that are adapting continually? Who is responsible for
maintaining and updating technologies once they have been deployed, particularly in health
settings? What are the social implications of technologies that can monitor cognition and
behaviour? When does monitoring become surveillance?
The situation of start-ups poses a range of critical challenges for responsible innovation
routines. Here, the imperative of speed and scale is even more pronounced than for established firms, and the focus is primarily on creating a commercially viable product in
the first place. Start-ups usually do not have the size, organizational resources, or financial
means to tackle responsibility as a key issue. Moreover, investors play a key role in making
strategic decisions for start-ups and will have to be active players in the development of
responsible innovation mechanisms as well. Hype, unsubstantiated medical claims, potential misinterpretation, and off-purpose use pose significant risks to innovators and investors alike. Neurotechnology start-ups at the Shanghai Workshop emphasized that the investor chosen by the company should be in line with the company's values and business strategy, especially from a perspective of responsibility.
Another challenge is the difference in environment between start-ups and university-based
research. Start-up companies are often created by academic investigators who want to
commercialize discoveries made in a university laboratory and who are used to the
institutionalized ethics procedures in university settings. Yet, while the origin might be in
a university, the start-up culture is fundamentally different from the academic culture. In a start-up, the focus is more on rapid product development than on papers or robust
procedures; the development teams include engineers, designers, and data scientists;
timelines are often much shorter; and the culture encourages risk and failure.
There are at least three major challenges that start-ups face compared to academic labs
when developing neurotechnologies:
First, the start-up needs to be able to build a product that innovates, in the sense
that it offers a user something better than existing technology. Creating value and
addressing health needs should be at the heart of product development. This
requires not only great engineering and design but an eye to “product-market fit”,
which is an industry term for understanding the problem that needs to be solved.
Second, as with an academic lab, the start-up needs to raise funding to support
research and development. This usually depends on venture funds which come with
an expectation of a financial return on investment. This means that in addition to
creating an innovative product, the start-up needs to have a business model for
commercialization. Questions that are rarely asked in an academic lab, such as
“Who will pay for this?” and “How big is the market for this?”, are fundamental to
raising funds in a start-up.
Third, for clinical products, start-ups need to test their technologies in patients. In
contrast to academic labs, few start-ups have access to clinics. For the development
of clinical tools, start-ups need clinical partners who are willing to work with a
commercial entity while not necessarily sharing in the equity of the company.
Managing these public-private partnerships for research can become complicated
in an environment where universities want intellectual property. Moreover,
guidance will be needed on how to conduct small-scale clinical trials in situations
where novel neurotechnological interventions might be invasive and involve some
(possibly unquantifiable) risk.
From the standpoint of start-ups, where development can be rapid and iterative, a process
that includes users, developers, and investors in establishing guidelines will be critical. It
is not possible to foresee all the unintended consequences of novel technologies but
stakeholders can establish some fundamental principles that will guide their development.
Transparency, agency, and privacy protection are all essential elements for the ethical
development of neurotechnologies. User-centred design can help to translate these
elements into specific features of software and hardware, often referred to as “ethically
aligned design” (see next section). And a focus on empowering patients and families can
also guide how these features are deployed.
Engagement with consumers (e.g. patients, clinicians) and other stakeholders will also
ensure that innovative technologies meet their needs, and are more likely to be used and be
effective. A major reason innovative technologies fail to translate is that they do not provide end-users with the benefits they want, or are used in ways that were not anticipated by developers, potentially causing unanticipated harm. Most of all, for neurotechnologies to
be successful they need to gain and retain public trust in order to obtain a ‘social licence’
to operate. Some big technology companies and social media platforms are currently
experiencing a “techlash” as the public questions the motives and values of large tech
companies. Start-ups avoid some of this scrutiny but they still have the challenge of
ensuring public trust through ethical behaviour that is focused on empowering users.
5. Design standards and regulation
The development and appropriate use of emerging technologies are frequently supported by standards that ensure technological robustness, interoperability, and general compliance with well-defined criteria for safety and effectiveness. Standards therefore represent an important instrument for tackling questions of ethics and social responsibility, and can make a positive impact on science, technology development, and commercialisation. Governance frameworks for neurotechnology innovation should take into consideration standards for, e.g., safety, efficacy, and manufacturing, and compliance with existing data protection, intellectual property and medical device regulations, as well as fundamental human rights. However, given the low level of maturity of some neurotechnologies, there can also be some reluctance to establish strict standards in light of the multiple unknowns.
The question thus arises of how standards can be responsibly developed without unnecessarily slowing down the deployment of technology-based solutions. The IEEE5-sponsored working group on Neurotechnologies for Brain-Machine Interfacing, chaired by Ricardo Chavarriaga (Defitech Chair in Brain-Machine Interface, Center for Neuroprosthetics & Institute of Bioengineering, School of Engineering, École Polytechnique Fédérale de Lausanne, Switzerland), has been working on identifying the current state of and priorities for standardisation in this field. It has first highlighted the need to recognize that these technologies rest on the integration of multiple subsystems, often drawing on other emerging technologies including AI, the Internet of Things (IoT), intelligent robotics, and augmented/virtual reality.
Consequently, there is great heterogeneity in the level of standardisation of the elements that compose neurotechnologies. For instance, there is a rather high level of standardisation of the safety and biocompatibility of traditional sensing technologies and prosthetic devices. In contrast, there are practically no standards related to the system specification, interoperability, or benchmarking of the functional capabilities of these systems.
One of the clear priorities for standardisation concerns data management. The possibility
of widespread data sharing is important to promote new discoveries that reflect global
diversity and differences across populations. However, cultural differences exist, and a
diversity of governance systems can complicate data sharing (OECD, 2013[65]; OECD,
2014[66]). Moreover, the multiple ongoing projects on platforms to manage large quantities of data are being developed independently by separate entities (e.g. national brain agencies), without clear efforts to ensure compatibility across them.
In addition, data collection brings another priority area: the protection of the data and the privacy of individuals. Notably, this concern goes beyond neural technologies and should be addressed consistently for all data-intensive (AI-powered) activities. In particular, as stated in Article 25 of the EU General Data Protection Regulation (GDPR), manufacturers should ensure data protection by design and by default, meaning that both the hardware and the software have been designed from the ground up to be
5 https://www.ieee.org/
secure.6 It is thus important that current initiatives on governance and standards for
neurotechnologies are coherent with efforts on data-related emerging technologies.
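As a minimal, purely illustrative sketch of what "data protection by design and by default" can mean at the software level for neural data (the field names and the salted-hash pseudonymisation below are assumptions for illustration, not a prescribed or legally sufficient implementation), a device-side pipeline might minimise each record to the declared purpose and replace direct identifiers before anything leaves the device:

import hashlib
import secrets

def pseudonymise(subject_id: str, salt: bytes) -> str:
    # Replace the direct identifier with a salted hash; the salt is stored separately
    # under access control so re-identification requires authorised key material.
    return hashlib.sha256(salt + subject_id.encode()).hexdigest()

def minimise_record(raw_record: dict, allowed_fields: set) -> dict:
    # By default, keep only the fields strictly needed for the declared purpose.
    return {key: value for key, value in raw_record.items() if key in allowed_fields}

if __name__ == "__main__":
    salt = secrets.token_bytes(16)  # in a real system, managed by a dedicated key service
    raw = {
        "subject_id": "patient-042",
        "eeg_features": [0.12, 0.34, 0.56],      # derived features rather than raw signal
        "home_address": "...",                   # collected incidentally; not needed for the purpose
        "session_timestamp": "2019-05-01T10:00:00Z",
    }
    stored = minimise_record(raw, allowed_fields={"eeg_features", "session_timestamp"})
    stored["pseudonym"] = pseudonymise(raw["subject_id"], salt)
    print(stored)  # no direct identifiers leave the device by default

In a real product, the key material would sit behind access controls and the legal assessment of GDPR compliance would remain with the data controller; the sketch only illustrates the design posture.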
Given that neurotechnologies are constantly evolving, standards and regulation should be able to accommodate developments at different levels of technological maturity. Hence, instead of a monolithic set of rules, standardisation should be approached as a coherent set of widely agreed rules, ranging from community guidelines and field-specific good practices for technologies at an early stage of development to industry-established standards for more mature developments and products. Proper integration across different levels of standardisation can facilitate faster and safer development and technology transfer from research to industry. Correspondingly, different types of governance may apply to each stage of the development and deployment of neurotechnologies.
Importantly, a coherent approach should be taken to ensure prioritisation of ethical aspects, safety, subject protection, and respect for cultural differences. Principles of ethics-, privacy- and security-by-design should be thoroughly applied from the early stages of research and development. In the same way, it is also important to have coherent regulation across clinical and consumer-oriented neurotechnologies. The latter are expected to play an important role in reducing access costs and will increasingly be used in healthcare and wellness applications. It is thus important for consumer-oriented devices to comply with relevant standards in terms of safety, efficacy, and interoperability with clinical-grade equipment.
Given the fast development of these emerging technologies and the unavoidable uncertainty of their deployment in society, it is important to allow standardisation and governance mechanisms to evolve rapidly alongside new developments. One example of a flexible mechanism is the draft guidance document "Implanted Brain-Computer Interface (BCI) Devices for Patients with Paralysis or Amputation - Non-clinical Testing and Clinical Considerations" released by the US Food & Drug Administration (FDA) (2019[67]). It is a leapfrog guidance mechanism by which agencies and regulatory bodies can "share initial thoughts regarding emerging technologies that are likely to be of public health importance early in product development". Such guidance is thus non-binding but represents the current stance, allows for community feedback, and recognizes that recommendations are likely to change as technology evolves and more information becomes available. International efforts on this type of flexible, evolving recommendation would certainly help align current efforts in the development of neural technologies and strongly promote standardisation of system specification and interoperability.
In considering standardisation and soft law approaches, regulation should not be feared, for it does not necessarily impair or delay innovation. Quite the contrary: when well designed through an empirical and functional approach, it creates a clear framework that allows technological development and its economies to prosper (OECD, 2019[60]). This is the
perspective the European Commission adopted in its Communication of April 25th 2018 on
“Artificial Intelligence for Europe”7.
6 https://gdpr-info.eu/art-25-gdpr/
7 (2018). Communication from the Commission to the European Parliament, the European Council,
the Council, the European Economic and Social Committee and the Committee of the Regions.
Artificial Intelligence for Europe. Brussels, European Commission.
Indeed, absent specific intervention, uncertainty might persist with respect to a number of issues, such as who bears liability in case of a malfunction or accident involving the use of an application, in particular one intended to function in close cooperation with human beings. Different sets of rules could overlap, so simplification would certainly benefit the system. To this end, a risk-management approach (Bertolini, 2016[68]) could be conceived (as also contemplated by the European Parliament in its recommendations of 16 February 20178), whereby the party best positioned to identify and manage the risk, including through insurance, is held liable, without requiring the demonstration of an exact causal nexus.
Standardization and ex ante product safety regulation then play a central role in ensuring both high-quality product design (and thence users' protection) and a clear legal framework for businesses, allowing them to identify the requirements they need to abide by. Efforts to perfect this body of norms, as well as the development of internationally recognized technical standards, should be welcomed.
Regulation necessarily occurs at the national and regional levels. Aiming at the adoption of a global legal framework is largely unrealistic, and not necessarily beneficial in such technical matters. Indeed, if technology regulation occurred at the regional (not merely national) level, the development of competing systems could be beneficial, allowing alternative approaches to be tested without causing excessive fragmentation.
To summarize, standardisation and regulation should be consistent with the fact that it is impossible to resolve all uncertainties before these technologies are deployed in society. Therefore, there should be mechanisms to properly inform society about the risks and benefits they entail, as well as the possibility of unforeseen outcomes. In a complementary manner, developers are responsible for monitoring the impact of these technologies and should be ready to anticipate and react accordingly in case of negative outcomes. Proactive and
flexible mechanisms for standardisation and governance will play an important role in the
safe, responsible deployment of solutions based on brain science and neurotechnology.
8 (2017). European Parliament resolution of 16 February 2017 with recommendations to the
Commission on Civil Law Rules on Robotics (2015/2103(INL)), European Parliament.
6. Opportunities in soft law
At the Shanghai Workshop, the topic of soft law was explicitly discussed with respect to
neurotechnology. While regulation refers to a system of rules that identify permissible and
impermissible activities with sanctions or incentives to ensure compliance, soft law refers
to policy instruments with moral or political force but without legal enforceability.
Examples of soft law include private standards, general policies, guidelines, principles,
codes of conduct, and forums for transnational dialogue. The various instruments of soft
law might be well suited to the governance of emerging technologies where there is often
a need to operate at the global scale and where a flexible approach might be appropriate
given the uncertain trajectories.
6.1. Development of principles
OECD Principles for addressing pressing ethical, legal, societal, economic and cultural
challenges would be beneficial to support responsible advancement of novel
neurotechnology. These OECD Principles could help governments better assess the
impacts of neurotechnology and develop policy responses for reaping and sharing their
benefits.
One major challenge for developing international principles or guidelines for responsible
innovation in neurotechnology is the diversity of the field, both in terms of technologies
and scientific knowledge involved as well as different regulatory contexts. The
development of neurotechnology can involve contributions from different sectors, such as
neuroscience and data science, in which there are different practices, standards and
governance and regulatory requirements. Moreover, scientific advances and the emergence
of new applications can transfer rapidly into different jurisdictions and areas of application.
Common standards that nevertheless recognize different regulatory and cultural practices
can help secure more rapid and effective collaborations and transfer of technologies so that
clearer pathways can be found towards global dissemination and diffusion. This will
necessarily involve elements of responsible research (soft governance) at upstream stages
of the innovation process, and an understanding of regulatory conditions in areas of
application where there are more specific conditions attached to market entry (such as
medical devices). Far from being a barrier to innovation and development, engagement
with such governance processes, alongside public involvement at all stages, can help secure
public acceptability and clearer and more predictable routes through the innovation
pathway.
Soft law measures, such as the Ethics Guidelines for Trustworthy AI9 or the OECD Recommendation of the Council on Artificial Intelligence (OECD, 2019[60]), may not, and are not intended to, replace the need for a sound regulatory framework. Such instruments, broader and more general in their assumptions, scope and conclusions, may instead be useful to shape a culture of responsible research and innovation. Even from such a perspective, however, the development of alternative and competing models, reflecting different
9 Available at https://ec.europa.eu/digital-single-market/en/news/ethics-guidelines-trustworthy-ai,
adopted by the High-Level Expert Group on AI, appointed by the European Commission.
cultures, traditions and sensitivities, might be welcomed. Such diversity could support the testing of different solutions and help identify those to be preferred.
Principles and guidelines, even when reflecting differences in culture, traditions and sensitivities, should not be intended as a replacement for regulation, but should contribute to the development of a responsible research and innovation approach. The regulation of emerging technologies requires novel approaches that are bottom-up and functional. Indeed, it is necessary to gain an exact understanding of how the technologies function, divide them into classes according to their peculiarities, observe how they interact with already existing norms, and identify the issues they give rise to and the incentives extant rules provide to all players involved (Leenes et al., 2017[69]). The tendency towards indistinct exceptionalism should be countered with attentive empiricism, distinguishing reality from science fiction and inflated expectations.
It is the responsibility of governance bodies to align neurotechnology with democratic
principles such as individual freedom, equality of opportunity and citizen involvement in
public deliberation. A roadmap for democratizing neurotechnology should align innovation
in this domain with the principles of openness, transparency, avoidance of centralized
control, inclusiveness, and user-centeredness. The complexity of neurotechnology requires
adaptive and multi-level governance frameworks that promote and take into consideration
(Ienca and Andorno, 2017[48]):
Quality standards and guidelines for neurotechnology producers.
A calibrated balance between the freedom to innovate and the promotion of privacy and security.
The inclusion of diverse actors and perspectives in public deliberation.
The protection of fundamental human rights.
6.2. Ethics and governance frameworks for neurotechnology
Governance frameworks are necessary as a way to ensure standard reporting of the
capabilities and potential consequences of neurotechnologies, allowing societies to more
easily grapple with their implications and the appropriate responses. The right frameworks should enable innovation while helping steer it towards desired goals. Key requirements for frameworks for the governance of neurotechnology are developing standards and acquiring sufficient knowledge about the efficacy of neurotechnology. Successful governance frameworks should promote further research on standards for the safety and effectiveness of neurotechnology, and anticipate potential public health or individual risks. For instance, the evidence for the efficacy of several consumer neurotechnologies is currently inconclusive, and their grounding in solid scientific research is often loose (Ienca, Haselager and Emanuel, 2018[36]; Wexler and Reiner, 2019[70]).
The development of relevant frameworks is underway. In fact, some frameworks are already in use at both the research and design phases, and there are many models from other areas of emerging technology that offer excellent ideas for the private sector
(Coalition for Responsible Use of Gene Editing in Agriculture, 2017[71]; Knoppers et al.,
2014[72]; Marchant and Allenby, 2017[73]; OECD, 2019[60]).
Neurotechnology innovation represents a special case for ethics-based approaches, which have been widely used in other life science and technology domains. Neuroethics has evolved over the past 15 years. The field's early emphasis on the ethics of neuroscience and an
embedded neuroscience of the mind is shifting today towards impact, and towards the decisions and decision-making tools that end-users must embody in absorbing neurotechnology into their lives and into their societies.
To this end, six critical concepts underlie responsible innovation on the part of the
neurotechnology sector:
1. Neurotechnological exceptionalism, and the key role of neuroethics in guiding AIMs ([A] anticipate and articulate, [I] implement and integrate, and [M] monitor and measure): recognising the need for caution, anticipating ethical targets, integrating ethical benchmarks, and monitoring and measuring outcomes.
2. Scientific, procedural, and ethical reproducibility, ensuring all aspects of reproducible study design from the first elements of conceptualization to the furthest reaches of knowledge dissemination and exchange.
3. Given that there are varying cultural ecosystems, it is important to recognize
that balanced (rather than universal) principles will help support the definition
and integration of different ethical values from within different cultural
ecosystems.
4. A recognition of the role of internal self-regulation alongside external regulatory action, to promote a reflective, continuously adaptive, internally self-regulated moral code. This has advantages over direct regulatory oversight, which can be difficult to maintain in a way that remains up to date and relevant to the fast-paced developments in the neurotechnology sector.
5. The compatibility of standards and regulatory requirements across different sectors and jurisdictions can encourage responsible self-regulation while helping secure predictability in routes to market.
6. Continuous engagement with the public, policy makers and regulators is
necessary to secure acceptability of novel applications that are potentially
disruptive.
Under the umbrella of neuroethics, a number of frameworks and approaches are currently
being developed in academic, policy, and company settings. One emerging framework is
AIM ([A] anticipate and articulate, [I] implement and integrate, and [M] monitor and
measure), presented at the Shanghai Workshop by Judy Illes (Professor of Neurology and
Canada Research Chair in Neuroethics, University of British Columbia) as one possible
way forward (Illes, 2018[74]):
Anticipate and articulate: for neurotechnology to advance ethically, ethically
tenable values and goals must be predefined and articulated. These include the goals
of inclusivity discussed by Tan Lee (CEO and Founder, EMOTIV) and post-
mortem outreach discussed by Tom Insel (Co-founder and President, Mindstrong
Health) that bridge the leap from the laboratory to life. Inclusivity, much like Value
Sensitive Design (Friedman et al., 2013[75]), embraces the voices of all stakeholders, from inventors to end-users of all backgrounds and ages, early on in the design process and mitigates biases that can be introduced when developers work in professional,
gender, and cultural isolation. Post-mortem outreach can be understood to span the
full range of planning for the dissemination and sales of successful products, as
well as the sharing of knowledge about unsuccessful neurotechnological attempts,
whether those involve failed technical design, poor uptake, harm, or even a lack of financial interest from potential investors.
Implement and integrate: the focus of this aspect of the AIM framework is on
trust that is achieved when ethical targets are anticipated and values are well-
articulated. Ethical benchmarks are set, and strategies to maintain them are
implemented and integrated. This aim ensures a focus on proper planning and
execution of trials and testing of devices, with realistic recruitment and business
plans that ensure funding-to-completion and follow-up of participants as needed
(Eaton, Kwon and Scott, 2015[76]). For clinical trials, it ensures that they are stopped
only when results are clearly insignificant, there is harm to subjects from adverse
events, a deficient protocol design renders trial continuation futile, or there is
authentic inability to recruit human subjects. Adherence to this aim limits unethical
abandonment of trials midstream due, for example, to change in investor interests,
funding, shrinking research budgets, mergers and acquisitions, emergence of
competitive products, pressures to end unproductive programs, defensive
manoeuvres by competition, supply failures, or catastrophic events. It also
embraces efficacy and trial change when, for example, a device is modified to
improve its performance or to suit a new target population, or moves into the real-
world clinic or home setting.
Monitor and measure: the concepts of ethical reproducibility and informed risk over informed consent are key variables to measure and monitor (Anderson,
Eijkholt and Illes, 2013[77]). In animal testing, for example, ethical reproducibility
pertains to well-established requirements for reporting, and strategies used to select
models, procedures to mitigate pain, and approaches to minimize the numbers
required for robust experimental results. As proposed more explicitly for human
experimentation, this concept pertains to reporting of strategies to assure the
capacity of a prospective participant to consent to a study, especially in cases
involving the greater acquisition of cognitive autonomy (youth) or diminishing
cognitive capacity (older adults), steps to mitigate risks to individuals and third
parties, steps to maximize benefit in the short and long term, and steps to assure
justice and access. Communication is the key to reproducibility in this context. All stakeholders must be vested in measuring and monitoring success, failure, benefit, and harm, and must appreciate them in all the local or global environments in which they may occur. The internal desire for professional self-regulation and the outward communication of its results are expressions of integrity. However, it should be noted that self-regulation may be insufficient whenever fundamental rights are concerned. Moreover, self-regulation would not shield those who abide by it from possible liabilities, since legal principles still apply.
Various other approaches are currently being developed by established companies and
start-ups. For example, the Canadian start-up Aifred developed the "meticulous transparency" framework, which aims to support AI developers, civil society, and regulatory bodies in evaluating AI technologies for their capabilities and intentionality. Meticulous transparency provides stakeholders with a complete description of the purpose, scope of use, projected benefits and risks, and data sources of the AI application upstream of its development. "Meticulous transparency shifts the focus of ethical evaluation from the technology itself to instead why it is being built, and potential consequences." (Benrimoh
et al., 2018[49]).
Similar to an ethics board, the "meticulous transparency" framework requires AI developers to publicly document and explain: (1) the rationale for a project (intentions); (2) the methods (data sources and interpretability); and (3) the positive and negative impacts (consequences).
This framework has six evaluation steps, aimed at ensuring that developers consider the
full range of ethical concerns prior to commencing development:
1. A complete description of the purpose of the product. This refers to declaring the intentionality behind the project, focusing on consequences and purpose rather than simply technical specifications and capability.
2. Scope of use. This refers to defining where and with which populations the tool
should be operating.
3. Data sources and bias control. This refers to ensuring that the application is
being trained using appropriate data given the intended purpose and scope of
use, which is critical when attempting to reduce bias.
4. Human interpretability. This refers to having decisions explainable enough that
a human operator would be able to understand them. This does not mean perfect
explainability – just to a degree that is appropriate to the field. For example,
within medicine risk factors are often used from the literature when making
certain decisions, even when these risk factors are not completely understood.
This is similar to understanding the key input features of an AI model, even if
their high-level interactions with other features are difficult to explain.
5. Projected risks and benefits. This refers to a process of assessing risks and
benefits of the application, considered from many different perspectives (social,
economic, social justice etc.).
6. Monitoring and contingency plans for adverse events. Finally, makers of AI and neurotechnology products could be considered to have the same responsibility as drug developers, who must continue to monitor their products for adverse events and unintended consequences.
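By way of illustration only, such a declaration could be captured as a structured record that is completed and published before development starts; the schema and field names below are a hypothetical rendering of the six steps, not a format prescribed by Benrimoh et al. (2018[49]):

from dataclasses import dataclass, field
from typing import List

@dataclass
class MeticulousTransparencyDeclaration:
    purpose: str                    # 1. intentionality: why the product is being built
    scope_of_use: str               # 2. where and with which populations it should operate
    data_sources: List[str]         # 3. training data and bias-control measures
    interpretability: str           # 4. how decisions are made explainable to operators
    risks_and_benefits: List[str]   # 5. projected impacts from multiple perspectives
    monitoring_plan: str            # 6. adverse-event monitoring and contingency plans
    missing: List[str] = field(default_factory=list)

    def is_complete(self) -> bool:
        # A declaration is publishable only when every section has been filled in.
        self.missing = [name for name, value in vars(self).items()
                        if name != "missing" and not value]
        return not self.missing

if __name__ == "__main__":
    declaration = MeticulousTransparencyDeclaration(
        purpose="Support clinicians in choosing among treatment options",
        scope_of_use="Adults in outpatient care, under clinician supervision",
        data_sources=["De-identified clinical data; bias audit across demographic groups"],
        interpretability="Key input features reported alongside each recommendation",
        risks_and_benefits=["Benefit: faster decision support", "Risk: over-reliance by users"],
        monitoring_plan="",  # deliberately left empty to show the completeness check
    )
    print("Complete:", declaration.is_complete(), "| missing sections:", declaration.missing)

A completeness check of this kind makes it harder to begin development with the monitoring or risk sections left blank, which is the spirit of evaluating intent and consequences before a line of product code is written.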
6.3. Emerging practices for responsible innovation in business settings
Leading companies in, e.g. neurotechnology, machine-learning, robotics, and the various
digital technologies are well-positioned to identify and tackle critical issues by interacting
with researchers, users and investors alike. In fact, workshop participants emphasized that
“the science needs to be done right” to ensure ethical viability, governability, and social
desirability of emerging technologies and to manage expectations and hypes. However,
while many companies are ready to engage questions of responsibility head-on, they frequently lack the tools and frameworks to do so within their business settings. Recognising
that the social and ethical issues raised by the diversity of novel technologies fall squarely
in-between public and private sector responsibilities as part of the innovation process can
help ensure socially desirable outcomes and contribute to the robustness and sustainability
of products and services in this promising field.
Various good practices have already begun to emerge from within neurotechnology
companies as well as related technology domains such as AI, gene editing, nanotechnology,
or synthetic biology. An overview of the different ex-ante (pre-emptive) and ex-post
approaches by larger technology industries was provided at the Shanghai Workshop by
Gary Marchant (Regents Professor, Center for Law Science & Innovation, Arizona State
University) and is summarized in Table 4 (Marchant, 2016[78]).
Table 4. Examples of governance frameworks for emerging technologies at companies
Company-NGO Partnership: DuPont-EDF Nanotechnology Risk Framework (http://www.nanoriskframework.org/)
Responsible Use Guidelines: Coalition for Responsible Gene Editing (http://geneediting.foodintegrity.org/responsible-use-guidelines/)
Risk Mitigation Checklist: Ethical OS (https://ethicalos.org/)
Downstream Product Stewardship: Ginkgo Bioworks, Bayer Company (https://media.bayer.com/baynews/baynews.nsf/id/Bayer-Ginkgo-Bioworks-unveil-joint-venture-Joyn-Bio-establish-operations-Boston-West-Sacramento; https://spectrum.ieee.org/the-human-os/biomedical/ethics/synthetic-biology-behemoth-aims-to-police-its-own-industry)
Industry Best Practices: Future of Privacy Forum (https://fpf.org/2018/07/31/future-of-privacy-forum-and-leading-genetic-testing-companies-announce-best-practices-to-protect-privacy-of-consumer-genetic-data)
Public Engagement: Gene Drives (https://doi.org/10.1002/hast.808)
Request Government Regulation: Microsoft Facial Recognition Technology (https://blogs.microsoft.com/on-the-issues/2018/07/13/facial-recognition-technology-the-need-for-public-regulation-and-corporate-responsibility/)
Data Responsibility: IBM's Principles for Trust and Transparency (https://www.ibm.com/blogs/policy/trust-principles/)
Patent Licensing: Broad Institute Principles for Disseminating Scientific Innovation (https://www.broadinstitute.org/principles-disseminating-scientific-innovations; https://ssrn.com/abstract=2897574)
External Monitor: Volkswagen Compliance Monitor (https://www.reuters.com/article/us-volkswagen-emissions-monitor/u-s-monitor-seeks-more-transparency-from-vw-over-emissions-idUSKCN1LC0RW)
Certification Programmes: Responsible Care System, American Chemistry Council (https://responsiblecare.americanchemistry.com/Management-System-and-Certification/)
Source: Prof. Dr. Gary E. Marchant, Faculty Director and Regents Professor, Center for Law Science &
Innovation, Arizona State University, Tempe, USA; presentation at Shanghai Workshop, adapted.
A range of examples for such instruments and good practices that can help support
responsible innovation within the private sector were further elaborated by Sebastian
Pfotenhauer (Linde Professor of Innovation Research, Munich Center for Technology in
Society and TUM School of Management, Technical University of Munich) and Nina
Frahm (Munich Center for Technology in Society and Harvard Kennedy School):
Appoint responsible innovation officers and boards. Consistent attention to
questions of responsibility requires human resources and organizational capacity.
Several neurotechnology businesses, including the digital phenotyping start-up
Mindstrong Health, have recently appointed advisory boards consisting of domain
experts, regulatory experts, and social scientists. Such dedicated Responsible
Research and Innovation (RRI) or ethical, legal and social issues (ELSI) boards can
help the company monitor and support technology development in socially
desirable directions, and anticipate or co-shape emerging regulations. Moreover,
companies can attune their CSR and R&D departments to questions of responsible
innovation. Dedicated personnel can help bridge traditional organizational
boundaries. However, particularly for young (or small) innovation-oriented businesses, an external advisory board might be even more important than traditional CSR units.
Develop principles and guidelines. A number of organizations, including professional associations like the Institute of Electrical and Electronics Engineers (IEEE) and the Coalition for Responsible Gene Editing, have recently released principles or guidelines that focus on questions of responsible innovation. Principles may entail high-level management commitments for entire companies or projects; they may also be part of the identity of a start-up (e.g. the commitment to collect minimal data). They can combine elements of both "responsible development" and "responsible use" of novel technology. One challenge is that, at present, principles
tend to be rather abstract and hence hard to implement in concrete and measurable
ways. However, this challenge is not unique, but is shared by e.g. initiatives to
enhance sustainability. Difficult operationalization notwithstanding, principles can
serve as a useful moral compass.
Responsible technology transfer. The transition from the lab to the commercial
stage is a critical juncture for questions of responsibility (Eppinger and Tinnemann, 2014[79]; Gwizdała and Śledzik, 2017[80]). Many researchers and innovators take
great interest in the future use of their invention, even when realized without their
direct participation, e.g. through licensing agreements. University technology
transfer offices typically operate under incentive structures that emphasize numbers
of patents and licences, or the amount of licensing returns for the host institution.
Responsible transfer metrics could include considerations of social benefits and
impact (e.g. free licensing to developing countries), equity (e.g. patent pools), and
anticipatory governance (e.g. as part of business plans), and adjust incentives and
transfer contracts accordingly (e.g. required RRI boards for start-ups).
Strategic partnerships. Public-private partnerships have proven effective
instruments for providing public services or tackling societal challenges through
combined investments (OECD, 2015[81]; Roehrich, Lewis and George, 2014[82]).
Moreover, recent initiatives around public procurement of innovation have been used to create nascent markets and steer innovation activity in directions of public value as defined by governments through "challenges" and specific conditions. For
example, public procurement calls in robotics are currently aiming to address
infrastructural maintenance tasks such as sewer and bridge inspections.
Neurotechnology companies can seek to develop strategic partnerships and
alliances with research institutions, governmental and non-governmental
organizations to anticipate and tackle responsibility issues.
Socially responsible investment & certification. Especially for start-ups, key
decisions about company and marketing strategy are typically greatly shaped by
their investors. Conference participants suggested that careful selection of one’s
investors is instrumental for addressing questions of social responsibility. Recent
trends towards sustainable investments and “green bonds” might offer a model for
novel forms of “responsible investment” (Kurtz, 2009[83]). Such a development
could be supported by new standards or certifications. While the European Union has developed various RRI checklists and standards for responsible research and innovation in the public sector, these do not presently exist for private sector settings. Iatris and Schroeder (2016[84]) recently investigated how existing CSR standards and certifications (such as ISO 9001 or ISO 45001) could be mobilized to include aspects of RRI in a more comprehensive manner.
Diversify hiring practices. Many tech companies are increasingly hiring social
scientists and humanities experts to address a broader and perhaps more socially
conscious set of perspectives on innovation. This can help to provide more socially
inclusive perspectives on the benefits and risks of neurotechnology, and anticipate
potential controversies.
Use test beds and regulatory “sandboxes” to co-create technology and
regulation. Companies and innovation scholars are increasingly emphasizing the
need to develop innovations in real-world settings that can anticipate and respond
to use patterns, social uptake, concerns, and potential regulatory issues. Novel
instruments such as test-beds, living laboratories and regulatory sandboxes enable
testing in spatially confined, experimental settings prior to broader rollout,
frequently with some form of “co-creation.” These instruments can also be
employed to co-develop appropriate rules and regulations in tandem with the
technology, as currently seen in cases of autonomous driving and robotics. For
neurotechnology, there are opportunities to investigate applications with selected
populations (e.g. local mental health patients) together with the participation of
public bodies to gauge regulatory needs.
6.3.1. Corporate Social Responsibility (CSR)
In organizational terms, questions of responsibility are traditionally the domain of
Corporate Social Responsibility (CSR), an important source of soft law, which has
developed into a lively area of academic scholarship and diverse practice (Crane et al.,
2009[85]; Idowu and Louche, 2011[86]). Many medium and large enterprises have adopted
CSR practices and organizational units tasked with CSR. A wide range of international
standards, best practice, and instruments are available, including OECD Guidelines for
Multinational Enterprises (2011[87]) and the UN Guiding Principles on Business and Human
Rights (UN Human Rights, 2011[88]).
Yet, questions of responsible innovation, i.e. the social, ethical, and legal challenges arising
from the development of high-tech products, have largely remained outside the CSR
purview, which has been concerned more with matters of worker and human rights, local
communities, or environmental externalities, e.g. in the mining sector or globalized
manufacturing. Workshop participants remarked that the organisational barriers between
CSR and Research and Development units tend to be high. Moreover, many companies at
the forefront of disruptive innovations are start-ups that lack the organisational capacity,
resources, experience, or time to make CSR a priority. Yet, many CSR dimensions apply
equally to responsible innovation questions:
Externalities: like other forms of economic activity, innovation can create negative externalities and spillovers, such as in the democratic implications of digital social
media or autonomous driving. Anticipating, managing and potentially internalizing
these externalities will be a key issue of responsible innovation.
Public trust and social licence to operate: trust in the integrity and intentions of innovative companies, and in the technologies they develop, is key for business success and sustainability. As recent controversies around social media data leaks, gene patents, or automotive emissions testing underscored, a company's social licence to operate is a central asset of the business model of any innovative firm.
Socio-cultural embedding and regulation: like CSR practices, responsible
innovation practices will differ across countries and communities based on social
values, norms, economic conditions, and the political and institutional landscape.
International RI practices will have to balance the desire for uniform global
standards with socio-cultural idiosyncrasies.
Corporate scientific citizenship: good corporate citizenship entails using rights
and responsibilities for innovation with a view towards other citizens and the public
good.
Shareholder and stakeholder accountability: a key CSR debate has pitted advocates
of a narrow definition of value creation as shareholder value against advocates of a
broader accountability to all societal stakeholders. Similar arguments
apply to responsibility in innovation, where an investor’s interest in financial
returns (e.g. venture capital funding of a start-up) has to be balanced against broader
definitions of public value.
Annex A. Bibliography
Abdulkader, S., A. Atia and M. Mostafa (2015), “Brain computer interfacing: Applications and
challenges”, Egyptian Informatics Journal, Vol. 16/2, pp. 213-230,
http://dx.doi.org/10.1016/J.EIJ.2015.06.002.
[55]
Aicardi, C. et al. (2018), Opinion on ’Responsible Dual Use’, Human Brain Project,
https://sos-ch-dk-2.exo.io/public-website-production/filer_public/f8/f0/f8f09276-d370-4758-ad03-679fa1c57e95/hbp-ethics-society-2018-opinion-on-dual-use.pdf (accessed on 15 April 2019).
[51]
Alzheimer’s Disease International (ADI) (2018), World Alzheimer Report 2018 - The state of the
art of dementia research: New frontiers,
https://www.alz.co.uk/research/WorldAlzheimerReport2018.pdf (accessed on
24 September 2018).
[20]
Anderson, J., M. Eijkholt and J. Illes (2013), “Ethical reproducibility: towards transparent
reporting in biomedical research”, Nature Methods, Vol. 10/9, pp. 843-845,
http://dx.doi.org/10.1038/nmeth.2564.
[77]
Arenaza-Urquijo, E., M. Wirth and G. Chételat (2015), “Cognitive reserve and lifestyle: moving
towards preclinical Alzheimer’s disease”, Frontiers in Aging Neuroscience, Vol. 7, p. 134,
http://dx.doi.org/10.3389/fnagi.2015.00134.
[28]
Bainbridge, W. and M. Roco (2016), Handbook of Science and Technology Convergence,
Springer, http://dx.doi.org/10.1007/978-3-319-07052-0.
[54]
Benrimoh, D. et al. (2018), “Meticulous Transparency—An Evaluation Process for an Agile AI
Regulatory Scheme”, in Mouhoub, M. et al. (eds.), Recent Trends and Future Technology in
Applied Intelligence, Springer, Cham, http://dx.doi.org/10.1007/978-3-319-92058-0_83.
[49]
Bertolini, A. (2016), “Insurance and Risk Management for Robotic Devices: Identifying the
Problems”, Global Jurist, Vol. 16/3, http://dx.doi.org/10.1515/gj-2015-0021.
[68]
Bowman, D. et al. (2018), “The Neurotechnology and Society Interface: Responsible Innovation
in an International Context”, Journal of Responsible Innovation, Vol. 5/1,
http://dx.doi.org/10.1080/23299460.2018.1433928.
[37]
Bowman, K. and J. Husbands (2011), “Dual use issues in the life sciences: challenges and
opportunities for education in an emerging area of scientific responsibility.”, CBE life
sciences education, Vol. 10/1, pp. 3-7, http://dx.doi.org/10.1187/cbe.10-12-0150.
[50]
Cabrera, L., E. Evans and R. Hamilton (2014), “Ethics of the Electrified Mind: Defining Issues
and Perspectives on the Principled Use of Brain Stimulation in Medical Research and Clinical
Care”, Brain Topography, Vol. 27/1, pp. 33-45, http://dx.doi.org/10.1007/s10548-013-0296-
8.
[47]
Coalition for Responsible Use of Gene Editing in Agriculture (2017), Principles and Guidelines
for Responsible Use of Gene Editing in Agriculture,
http://geneediting.foodintegrity.org/wp-content/uploads/sites/2/2017/05/Gene-Editing_Final-Principles-Guidelines_Oct-2017.pdf
(accessed on 20 June 2019).
[71]
Crane, A. et al. (eds.) (2009), Socially Responsible Investment and Shareholder Activism, Oxford
University Press, http://dx.doi.org/10.1093/oxfordhb/9780199211593.003.0011.
[83]
Crane, A. et al. (eds.) (2009), The Corporate Social Responsibility Agenda, Oxford University
Press, http://dx.doi.org/10.1093/oxfordhb/9780199211593.003.0001.
[85]
Dagum, P. (2018), “Digital biomarkers of cognitive function”, npj Digital Medicine, Vol. 1/1,
p. 10, http://dx.doi.org/10.1038/s41746-018-0018-4.
[32]
Ding, Y. et al. (2019), “A Deep Learning Model to Predict a Diagnosis of Alzheimer Disease by
Using 18F-FDG PET of the Brain”, Radiology, Vol. 290/2, pp. 456-464,
http://dx.doi.org/10.1148/radiol.2018180958.
[25]
Eaton, M. and J. Illes (2007), “Commercializing cognitive neurotechnology—the ethical terrain”,
Nature Biotechnology, Vol. 25/4, pp. 393-397, http://dx.doi.org/10.1038/nbt0407-393.
[7]
Eaton, M., B. Kwon and C. Scott (2015), “Money and Morals: Ending Clinical Trials for
Financial Reasons”, Curr Topics Behav Neurosci, Vol. 19,
http://dx.doi.org/10.1007/7854_2014_337.
[76]
Eppinger, E. and P. Tinnemann (2014), “Technology Transfer of Publicly Funded Research
Results from Academia to Industry: Societal Responsibilities?”, in Responsible Innovation 1,
Springer Netherlands, Dordrecht, http://dx.doi.org/10.1007/978-94-017-8956-1_5.
[79]
Feigin, V. et al. (2019), “Global, regional, and national burden of neurological disorders, 1990–
2016: a systematic analysis for the Global Burden of Disease Study 2016”, The Lancet
Neurology, Vol. 18/5, pp. 459-480, http://dx.doi.org/10.1016/S1474-4422(18)30499-X.
[11]
Finlayson, S., J. Bowers and J. Ito (2019), “Adversarial attacks on medical machine learning”,
Science, Vol. 363/6433, http://dx.doi.org/10.1126/science.aaw4399.
[10]
Friedman, B. et al. (2013), “Value Sensitive Design and Information Systems”, Springer,
Dordrecht, http://dx.doi.org/10.1007/978-94-007-7844-3_4.
[75]
Garden, H. and D. Winickoff (2018), Issues in neurotechnology governance, OECD,
https://doi.org/10.1787/c3256cc6-en (accessed on 27 September 2018).
[38]
Gates, N. et al. (2019), “Computerised cognitive training for maintaining cognitive function in
cognitively healthy people in midlife”, Cochrane Database of Systematic Reviews, Vol. 3,
p. CD012278, http://dx.doi.org/10.1002/14651858.CD012278.pub2.
[35]
Giordano, J. (2012), Neurotechnology: premises, potential, and problems, CRC Press.
[1]
Greely, H., K. Ramos and C. Grady (2016), “Neuroethics in the Age of Brain Projects”, Neuron,
Vol. 92/3, pp. 637-641, http://dx.doi.org/10.1016/j.neuron.2016.10.048.
[5]
Greenberg, A. (2018), “Inside the Mind’s Eye: An International Perspective on Data Privacy
Law in the Age of Brain-Machine Interfaces”, SSRN Electronic Journal,
http://dx.doi.org/10.2139/ssrn.3180941.
[41]
Gwizdała, J. and K. Śledzik (2017), Risk Asymmetries in ’Open Science’ Concept: University
Technology Transfer Perspective,
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3035886 (accessed on 16 April 2019).
[80]
Hampel, H. et al. (2017), “A Precision Medicine Initiative for Alzheimer’s disease: the road
ahead to biomarker-guided integrative disease modeling”, Climacteric, Vol. 20/2, pp. 107-
118, http://dx.doi.org/10.1080/13697137.2017.1287866.
[23]
Hampel, H. et al. (2019), “The Alzheimer Precision Medicine Initiative”, Journal of Alzheimer’s
Disease, Vol. 68/1–24.
[24]
Hernandez, D. (2018), “Brain-Computer Interfaces: ‘The Last Frontier of Human Privacy’”,
Wall Street Journal, April 2018, https://www.wsj.com/articles/brain-computer-interfaces-the-last-
frontier-of-human-privacy-1524580522.
[42]
Hewlett, E. and V. Moran (2014), Making Mental Health Count: The Social and Economic Costs
of Neglecting Mental Health Care, OECD Health Policy Studies, OECD Publishing, Paris,
http://dx.doi.org/10.1787/9789264208445-en.
[15]
Hickey, P. and M. Stacy (2016), “Deep Brain Stimulation: A Paradigm Shifting Approach to
Treat Parkinson’s Disease.”, Frontiers in neuroscience, Vol. 10, p. 173,
http://dx.doi.org/10.3389/fnins.2016.00173.
[33]
Hodges, J. (2015), “A decade of discovery and disappointment in dementia research”, Nature
Reviews Neurology, Vol. 11/11, pp. 613-614, http://dx.doi.org/10.1038/nrneurol.2015.191.
[21]
Iatridis, K. and D. Schroeder (2016), “Applying Corporate Responsibility Tools to Responsible
Research and Innovation”, http://dx.doi.org/10.1007/978-3-319-21693-5_5.
[84]
Idowu, S. and C. Louche (2011), Theory and practice of corporate social responsibility,
Springer.
[86]
Ienca, M. and R. Andorno (2017), “Towards new human rights in the age of neuroscience and
neurotechnology”, Life Sciences, Society and Policy, Vol. 13/5,
http://dx.doi.org/10.1186/s40504-017-0050-1.
[48]
Ienca, M., P. Haselager and E. Emanuel (2018), “Brain leaks and consumer neurotechnology”,
Nature Biotechnology, Vol. 36/9, pp. 805-810, http://dx.doi.org/10.1038/nbt.4240.
[36]
Illes, J. (2018), Ethical Issues in BCI Research and Development: Aim to AIM.
[74]
James, S. et al. (2018), “Global, regional, and national incidence, prevalence, and years lived
with disability for 354 diseases and injuries for 195 countries and territories, 1990–2017: a
systematic analysis for the Global Burden of Disease Study 2017”, The Lancet,
Vol. 392/10159, pp. 1789-1858, http://dx.doi.org/10.1016/S0140-6736(18)32279-7.
[12]
Jeong, S. et al. (2019), “Korea Brain Initiative: Emerging Issues and Institutionalization of
Neuroethics”, Neuron, Vol. 101/3, pp. 390-393,
http://dx.doi.org/10.1016/j.neuron.2019.01.042.
[4]
Kaebnick, G. and M. Gusmano (2018), “Making Policies about Emerging Technologies”,
Hastings Center Report, Vol. 48/1, http://dx.doi.org/10.1002/hast.816.
[39]
Knoppers, B. et al. (2014), “A human rights approach to an international code of conduct for
genomic and clinical data sharing”, Hum Genet, Vol. 133, pp. 895-903,
http://dx.doi.org/10.1007/s00439-014-1432-6.
[72]
Larson, E. (2018), “Prevention of Late-Life Dementia: No Magic Bullet”, Annals of Internal
Medicine, Vol. 168/1, p. 77, http://dx.doi.org/10.7326/M17-3026.
[22]
Leenes, R. et al. (2017), “Regulatory challenges of robotics: some guidelines for addressing legal
and ethical issues”, Law, Innovation and Technology, Vol. 9/1, pp. 1-44,
http://dx.doi.org/10.1080/17579961.2017.1304921.
[69]
Leveson, N. (2011), Engineering a Safer World - Systems Thinking Applied to Safety, The MIT
Press, https://mitpress.mit.edu/books/engineering-safer-world.
[59]
Limousin, P. and T. Foltynie (2019), “Long-term outcomes of deep brain stimulation in
Parkinson disease”, Nature Reviews Neurology, Vol. 15/4, pp. 234-242,
http://dx.doi.org/10.1038/s41582-019-0145-9.
[34]
Livingston, G. et al. (2017), “Dementia prevention, intervention, and care”, The Lancet,
Vol. 390/10113, pp. 2673-2734, http://dx.doi.org/10.1016/S0140-6736(17)31363-6.
[29]
Marchant, G. (2016), “Advancing Resilience through Law”, in Linkov, I. and M. Florin (eds.),
Resource Guide on Resilience, IRGC, Lausanne: EPFL International Risk Governance
Center.
[78]
Marchant, G. and B. Allenby (2017), “Soft Law: New tools for governing emerging
technologies”, Bulletin of the Atomic Scientists, Vol. 73/2,
https://doi.org/10.1080/00963402.2017.1288447 (accessed on 27 September 2018).
[73]
Martinez-Martin, N. and K. Kreitmair (2018), “Ethical Issues for Direct-to-Consumer Digital
Psychotherapy Apps: Addressing Accountability, Data Protection, and Consent.”, JMIR
mental health, Vol. 5/2, p. e32, http://dx.doi.org/10.2196/mental.9423.
[8]
Müller, O. and S. Rotter (2017), “Neurotechnology: Current Developments and Ethical Issues.”,
Frontiers in systems neuroscience, Vol. 11, p. 93,
http://dx.doi.org/10.3389/fnsys.2017.00093.
[40]
Nuffield Council on Bioethics (2013), Novel neurotechnologies: intervening in the brain,
Nuffield Council on Bioethics, http://nuffieldbioethics.org/wp-
content/uploads/2013/06/Novel_neurotechnologies_report_PDF_web_0.pdf.
[3]
Nuffield Council on Bioethics (2012), Emerging biotechnologies: technology, choice and the
public good, http://nuffieldbioethics.org/ (accessed on 5 June 2019).
[63]
OECD (2019), Recommendation of the Council on Artificial Intelligence, OECD/LEGAL/0449,
OECD.
[60]
OECD (2017), “Neurotechnology and society: Strengthening responsible innovation in brain
science”, OECD Science, Technology and Industry Policy Papers, No. 46, OECD Publishing,
Paris, http://dx.doi.org/10.1787/f31e10ab-en.
[2]
OECD (2017), New Health Technologies: Managing Access, Value and Sustainability, OECD
Publishing, Paris, https://dx.doi.org/10.1787/9789264266438-en.
[26]
OECD (2015), “Public-private Partnerships in Biomedical Research and Health Innovation for
Alzheimer’s Disease and other Dementias”, OECD Science, Technology and Industry Policy
Papers, No. 20, OECD Publishing, Paris, https://dx.doi.org/10.1787/5js36rc8wwbt-en.
[81]
OECD (2014), “Unleashing the Power of Big Data for Alzheimer’s Disease and Dementia
Research: Main Points of the OECD Expert Consultation on Unlocking Global Collaboration
to Accelerate Innovation for Alzheimer’s Disease and Dementia”, OECD Digital Economy
Papers, No. 233, OECD Publishing, Paris, https://dx.doi.org/10.1787/5jz73kvmvbwb-en.
[66]
OECD (2013), The OECD Privacy Framework 2013,
http://www.oecd.org/sti/ieconomy/oecd_privacy_framework.pdf (accessed on 2 May 2019).
[65]
OECD (2011), OECD Guidelines for Multinational Enterprises, 2011 Edition, OECD
Publishing, Paris, https://dx.doi.org/10.1787/9789264115415-en.
[87]
OECD/EU (2018), Health at a Glance: Europe 2018: State of Health in the EU Cycle, OECD
Publishing, Paris/EU, Brussels, https://doi.org/10.1787/health_glance_eur-2018-en.
[14]
Park, K. (2017), “Neuro-doping: The rise of another loophole to get around anti-doping
policies”, Cogent Social Sciences, Vol. 115,
http://dx.doi.org/10.1080/23311886.2017.1360462.
[52]
Poldrack, R. (2017), “The risks of reading the brain”, Nature, Vol. 541,
https://www.nature.com/articles/541156a.pdf (accessed on 25 September 2018).
[44]
Racine, E. and W. Affleck (2016), “Changing Memories: Between Ethics and Speculation”, The
AMA Journal of Ethics, Vol. 18/12, pp. 1241-1248,
http://dx.doi.org/10.1001/journalofethics.2016.18.12.sect1-1612.
[45]
Robillard, J. and J. Illes (2016), “Manipulating Memories: The Ethics of Yesterday’s Science
Fiction and Today’s Reality”, The AMA Journal of Ethics, Vol. 18/12, pp. 1225-1231,
http://dx.doi.org/10.1001/journalofethics.2016.18.12.msoc1-1612.
[46]
Roehrich, J., M. Lewis and G. George (2014), “Are public–private partnerships a healthy option?
A systematic literature review”, Social Science & Medicine, Vol. 113, pp. 110-119,
http://dx.doi.org/10.1016/J.SOCSCIMED.2014.03.037.
[82]
Russ, T. (2018), “Intelligence, Cognitive Reserve, and Dementia”, JAMA Network Open,
Vol. 1/5, p. e181724, http://dx.doi.org/10.1001/jamanetworkopen.2018.1724.
[30]
Salles, A. et al. (2019), “The Human Brain Project: Responsible Brain Research for the Benefit
of Society.”, Neuron, Vol. 101/3, pp. 380-384,
http://dx.doi.org/10.1016/j.neuron.2019.01.005.
[6]
Smith, K. (2013), “Reading minds”, Nature, Vol. 502,
https://www.nature.com/polopoly_fs/1.13989!/menu/main/topColumns/topLeftColumn/pdf/502428a.pdf (accessed on 22 August 2017).
[43]
Specker Sullivan, L. and J. Illes (2018), “Ethics in published brain–computer interface research”,
Journal of Neural Engineering, http://dx.doi.org/10.1088/1741-2552/aa8e05.
[61]
Sutcliffe, H. (2011), A Report on Responsible Research & Innovation, prepared for DG
Research and Innovation, European Commission, https://ec.europa.eu/research/science-
society/document_library/pdf_06/rri-report-hilary-sutcliffe_en.pdf (accessed on
25 April 2019).
[64]
U.S. Food and Drug Administration (FDA) (2019), Implanted Brain-Computer Interface (BCI)
Devices for Patients with Paralysis or Amputation - Non-clinical Testing and Clinical
Considerations: Draft Guidance for Industry and Food and Drug Administration Staff,
https://www.regulations.gov (accessed on 15 April 2019).
[67]
UN Human Rights (2011), Guiding Principles on Business and Human Rights,
https://www.ohchr.org/documents/publications/GuidingprinciplesBusinesshr_eN.pdf
(accessed on 25 April 2019).
[88]
United Nations (2017), World Population Prospects: The 2017 Revision,
https://population.un.org/wpp/Publications/Files/WPP2017_KeyFindings.pdf (accessed on
23 September 2018).
[17]
van Albada, S. et al. (2018), “Performance Comparison of the Digital Neuromorphic Hardware
SpiNNaker and the Neural Network Simulation Software NEST for a Full-Scale Cortical
Microcircuit Model”, Frontiers in Neuroscience, Vol. 12, p. 291,
http://dx.doi.org/10.3389/fnins.2018.00291.
[27]
Vos, T. et al. (2016), “Global, regional, and national incidence, prevalence, and years lived with
disability for 310 diseases and injuries, 1990–2015: a systematic analysis for the Global
Burden of Disease Study 2015”, The Lancet, Vol. 388/10053, pp. 1545-1602,
http://dx.doi.org/10.1016/S0140-6736(16)31678-6.
[13]
Weiler, M. et al. (2018), “Cognitive Reserve Relates to Functional Network Efficiency in
Alzheimer’s Disease”, Frontiers in Aging Neuroscience, Vol. 10,
http://dx.doi.org/10.3389/fnagi.2018.00255.
[31]
Wexler, A. (2017), “The Social Context of ‘Do-It-Yourself’ Brain Stimulation:
Neurohackers, Biohackers, and Lifehackers”, Frontiers in Human Neuroscience, Vol. 11,
p. 224, http://dx.doi.org/10.3389/fnhum.2017.00224.
[53]
Wexler, A. (2016), “The practices of do-it-yourself brain stimulation: implications for ethical
considerations and regulatory proposals”, Journal of Medical Ethics, Vol. 42,
http://dx.doi.org/10.1136/medethics-2015-102704.
[9]
Wexler, A. and P. Reiner (2019), “Oversight of direct-to-consumer neurotechnologies”, Science,
Vol. 363/6424, pp. 234-235.
[70]
Wilsdon, J. and R. Willis (2004), See-through Science: Why public engagement needs to move
upstream, http://sro.sussex.ac.uk/47855/1/See_through_science.pdf (accessed on
12 October 2017).
[62]
Wolpaw, J. and E. Winter Wolpaw (2012), “Brain–Computer Interfaces: Something New under
the Sun”, in Wolpaw, J. and E. Winter Wolpaw (eds.), Brain–Computer Interfaces:
Principles and Practice, Oxford Scholarship Online,
http://dx.doi.org/10.1093/acprof:oso/9780195388855.003.0001.
[56]
World Economic Forum and Harvard School of Public Health (2011), The Global Economic
Burden of Non-communicable Diseases,
http://www3.weforum.org/docs/WEF_Harvard_HE_GlobalEconomicBurdenNonCommunica
bleDiseases_2011.pdf.
[16]
World Health Organization (2015), World report on ageing and health.
[19]
World Health Organization (2013), Global Action Plan for the Prevention and Control of
Noncommunicable Diseases 2013-2020, http://www.who.int (accessed on 23 September 2018).
[18]
Zander, T. and C. Kothe (2011), “Towards passive brain–computer interfaces: applying brain
computer interface technology to human–machine systems in general”, Journal of Neural
Engineering, Vol. 8/2, p. 025005, http://dx.doi.org/10.1088/1741-2560/8/2/025005.
[58]
Zander, T. et al. (2016), “Neuroadaptive technology enables implicit cursor control based on
medial prefrontal cortex activity.”, Proceedings of the National Academy of Sciences of the
United States of America, Vol. 113/52, pp. 14898-14903,
http://dx.doi.org/10.1073/pnas.1605155114.
[57]
Annex B.
Agenda OECD Workshop
“Minding Neurotechnology: delivering responsible innovation for health and well-being”
6-7 September 2018, Shanghai, People’s Republic of China
The objectives of the Shanghai Workshop are:
1. Promote a deeper dialogue among business leaders, investors, policy makers, social scientists,
and practitioner communities to enable desirable social outcomes and benefits of
neurotechnology.
2. Enrich current discussions of the social implications of neurotechnology on both short and
long-term time horizons by hearing from those engaged in bringing products to market.
3. Better understand how considerations of responsible innovation can improve the sustainability
of business models in novel neurotechnology.
Day One (Thursday, 6 September 2018)
08:30-09:00 Registration
Venue: Renaissance Shanghai Putuo Hotel
09:00-09:30 Welcome messages & introduction to workshop
Workshop Moderator: Prof. Dr. Jialin Charles Zheng, Professor of Regenerative Medicine and
Neuroscience, Dean, Tongji University School of Medicine, Shanghai, People's Republic of
China
Mr. Dominique Guellec, Head, Science and Technology Policy Division, Directorate for
Science, Technology and Innovation, OECD, Paris, France
Ministry of Science and Technology of the People’s Republic of China (MOST)
Dr. Xinmin Zhang, Director General, China National Center for Biotechnology Development
(CNCBD), People’s Republic of China
Prof. Dr. Jie Chen, President of Tongji University, Shanghai, People’s Republic of China
Mr. Ik-hyeon Rhee, President Korea Legislation Research Institute (KLRI), Republic of
Korea
09:30-09:55 Keynote
Ms. Tan Le, CEO, EMOTIV, San Francisco, USA
09:55-10:20 Keynote
Prof. Dr. Mu-ming Poo, Member of the Chinese Academy of Sciences, Director, Institute of
Neuroscience, Chinese Academy of Sciences, Director, CAS Center for Excellence in Brain
Science and Intelligence Technology, People’s Republic of China
10:20-10:50 Coffee break
10:50-12:40 Session 1
Neurotechnology innovation from the bottom up: strategies for product development at
major brain research initiatives
Chair: Prof. Dr. Linda Richards, Deputy Director (Research), Queensland Brain Institute, Australia
Panellists:
Dr. A. Lyric Jorgenson, National Institutes of Health (NIH), Deputy Director, Office of
Science Policy, Office of the Director, USA
Dr. Sung-Jin Jeong, Principal Researcher/Director, Neuronal Development and Disease
Department, Brain Research Policy Center Korea Brain Research Institute, Republic of
Korea
Dr. Dekel Taliaz, CEO & Co-Founder, Taliaz Ltd, Cofounder, Vice President, Tech
division of Israel Brain Technologies, Israel
Prof. Dr. Shigeo Okabe, Brain/MINDS Program Supervisor, Graduate School of Medicine,
The University of Tokyo, Japan
Prof. Dr. Qingming Luo, Vice President, Huazhong University of Science and Technology,
People’s Republic of China
This first session focuses on the translation of knowledge emerging from major brain research
initiatives into novel neurotechnologies for health and well-being. In order for those technologies
to be integrated into society, they need to be developed for markets and broadly disseminated
beyond the laboratory or company where they originated. Health innovation and technological
development are expressed goals of some major public funding efforts and national brain
initiatives, with company formation being imagined as one key to achieving those goals.
Discussion questions:
1. What are the current trends for neurotechnology innovation across the major ‘brain
initiatives’? What are the funding opportunities for the dissemination and translation of
research?
2. For the ‘brain initiatives’ seeking to spur innovation: what are best practices for attracting
investment, encouraging public-private sector collaboration, and translating research into
marketable products?
3. What mechanisms are in place to ensure spin-outs and future products meet ethical and
social standards?
12:40-13:40 Lunch
13:40-14:05 Keynote
Dr. Tom Insel, Co-Founder and President, Mindstrong Health, Palo Alto, CA, USA
14:05-14:20 Session lead-in: “Neurotechnology ventures”
Mr. Jordan P. Amadio, M.D., M.B.A., Neurosurgeon, Technology Innovator, Start-up Investor/
Strategist, Austin, Texas, USA
14:20-16:05 Session 2
Making innovation work: addressing the challenges of commercialisation in disruptive
technology
Chair: Mr. Jordan P. Amadio, M.D., M.B.A., Neurosurgeon, Technology Innovator, Start-up
Investor/ Strategist, Austin, Texas, USA
Panellists:
Dr. Graeme Moffat, VP of Scientific & Regulatory Affairs, MUSE, Toronto, Canada
Dr. David Benrimoh, CEO, Aifred Health, Montreal, Canada
Dr. Moonkyo Chung, Korea Technology Finance Corporation (KOTEC), Deputy Director,
Seocho Technology Appraisal Center, Republic of Korea
Dr. Oh-hyoung Kwon, Partner, FuturePlay, Republic of Korea
Ms. Yifei Fan, Business Development Manager, AXA Lab Asia, Shanghai, People's
Republic of China
Prof. Dr. Luming Li, Professor of Biomedical Engineering and Neuromodulation Technology,
Tsinghua University, Beijing, People’s Republic of China
Dr. Yunting Liu, Commercial & Strategy Director, Tencent Medical, People’s Republic of
China
Dr. Chris Thatcher, President and CEO, NeuroStar, USA
This session will focus on the formation and development of small and medium-sized enterprises
and their engagement with key partners: public research institutions and the private investment
sector. Panellists will discuss the current state of play in their technologies, business models and
challenges.
Discussion questions:
1. What are the unique challenges and opportunities for start-up companies and SMEs in
neurotechnology innovation in terms of, e.g., market size, investment, ethics, and
regulation?
2. What is the landscape of private investment in the arena of neurotechnology?
3. What is the role of academic entrepreneurs in the commercialisation of techno-creative
innovations?
16:05-16:25 Coffee break
16:25-16:40 Session lead-in
Prof. Dr. Guoyu Wang, Professor of Philosophy, Fudan University, People’s Republic of China
16:40-18:30 Session 3
Identifying gaps in neurotechnology governance: potential roles of the market and the
public sector to ensure ‘technology robustness’
Co-Chairs: Prof. Dr. Guoyu Wang, Professor of Philosophy, Fudan University, P.R. China; Mr.
John Clarkson, Senior Vice President and Chief Operating Officer, Ontario Brain Institute,
Toronto, Canada
Panellists:
Dr. Mariarosaria Taddeo, Research Fellow, Deputy Director, Digital Ethics Lab, Oxford
Internet Institute, University of Oxford, Turing Fellow, Alan Turing Institute, London, Oxford,
UK
Mr. Junkil Been, Co-founder, Chief Executive Officer, Neurophet, Republic of Korea
Dr. Marcello Ienca, Research Fellow, Health Ethics & Policy Lab, Department of Health
Sciences and Technology, ETH Zürich, Switzerland
Mr. Alex Ni, MBA, CPA, CMA, CTO, Avertus, Toronto, Canada
Dr. Laura Y. Cabrera, Assistant Professor, Neuroethics, Michigan State University, Center
for Ethics & Humanities in the Life Sciences, USA
Dr. Andrea Bertolini, Assistant Professor, Private Law, Dirpolis Institute, Adjunct Professor,
Private Law, University of Pisa, Italy
The third session will raise potential governance issues associated with emerging
neurotechnologies that deserve shared consideration given their public attention as well as
potential economic and social implications. Concerns about privacy and misuse of brain data
have become more tangible in the wake of recent privacy breaches in the social networking
community. Other governance issues are raised when products intended for clinical use are used
in non-therapeutic settings. Given the limited experiences with some novel neurotechnologies:
how can companies, investors, and insurers anticipate the potential unintended use, broader
societal effects, misperception and backlash? How do they engage the goal of “appropriate use”,
data privacy, and integrity in neurotechnologies?
Discussion questions:
1. Understanding the grey areas in neurotechnology: what are the key gaps, risks and
uncertainties within businesses, and at the intersection of the public and private sector?
2. Are governance tools such as consumer protection laws, liability rules, post-marketing
surveillance, and current ethical frameworks sufficient to promote public trust and
technology robustness?
3. What are the best practices to learn from “early adopters” that support technology
validation?
19:00 Dinner
Day Two (Friday, 7 September 2018)
08:45-09:00 Opening Day Two
Workshop Moderator: Dr. Pingping Li, Associate Professor, Deputy Director, Division of Public
Health, China National Center for Biotechnology Development (CNCBD), People’s Republic of
China
Comment
Prof. Dr. Gang Pei, Member of the Chinese Academy of Sciences, Former President of Tongji
University, Shanghai Institute of Biochemistry and Cell Biology, Chinese Academy of Sciences,
People’s Republic of China
09:00-09:15 Session lead-in: “Challenges in the governance of emerging technology”
Prof. Dr. Gary E. Marchant, Faculty Director and Regents Professor, Center for Law Science &
Innovation, Arizona State University, Tempe, USA
09:15-11:00 Session 4
Building responsible innovation: frameworks and best practices in the private sector
Chair: Prof. Dr. Judy Illes, Canada Research Chair in Neuroethics, Professor of Neurology,
Department of Medicine, Director, Neuroethics Canada, The University of British Columbia,
Vancouver, Canada
Panellists:
Prof. Dr. Karen Rommelfanger, Assistant Professor, Department of Neurology, Assistant
Professor, Department of Psychiatry and Behavioral Sciences, Emory University, Atlanta,
USA
Prof. Dr. Sebastian Pfotenhauer, Professor of Innovation Research - Innovation, Society &
Public Policy Group, Munich Center for Technology in Society, Technical University of
Munich, Germany
Dr. Xiaodong Tao, Vice President, IFLYTEK CO., LTD., President of iFLY Health, People's
Republic of China
Prof. Dr. Yizheng Wang, Researcher, Huashan Hospital, Fudan University, People’s
Republic of China
Dr. Tom Insel, Co-Founder and President, Mindstrong Health, Palo Alto, CA, USA
Ms. Tan Le, Founder, Chief Executive Officer, Emotiv, San Francisco, USA
Prof. Dr. Adrian Carter, Associate Professor, Head, Neuroscience and Society Group,
Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Australia
Prof. Dr. Ricardo Andrés Chavarriaga Lozano, Ecole Polytechnique Fédérale de Lausanne,
CNBI - Chair in Brain-Machine Interface, Geneva, Switzerland
In this session, panellists will focus on the modes through which ethics and social responsibility
can make a positive impact on brain research and neurotechnology development. A mixed group
of innovators, representatives from major ‘brain initiatives’, and other experts discuss how forms
of upstream responsibility can contribute to downstream profitability and health impact. Some
brain research initiatives and businesses within neurotechnology and related fields like AI have
sought to integrate elements of social responsibility and ethics into their technology transfer,
business practices, R&D, and corporate governance.
Discussion questions:
1. What are the strategic approaches and best practices to align disruptive neurotechnology
with societal needs? How can responsibility frameworks complement regulation and
support the robustness of products in markets?
2. What strategies are used by major brain initiatives and companies to help promote
transparency, trust, and positive societal outcomes?
3. How can ethical, legal, and social considerations of neurotechnology innovation strengthen
the ties between public research, investors, companies, and insurers?
11:00-11:20 Coffee break
11:20-12:30 Session 5
Exploring the potential role of policy makers in delivering responsible innovation for
health and well-being
Chair: Dr. David Winickoff, Senior Policy Analyst, Secretary, Working Party on Bio-, Nano- and
Converging Technologies (BNCT), Science and Technology Policy Division, OECD, Paris,
France
Panellists:
Dr. Françoise D. Roure, Chairperson of the Committee “Safety, Security and Risk”, French
Ministry of Economy and Finance High Council of Economy, Paris, France
Dr. Seunghye Wang, Research Fellow, Office of Global Legal Research, Korea Legislation
Research Institute, Republic of Korea
Prof. Dr. Xian-En Zhang, Principal Investigator, Institute of Biophysics, Chinese Academy of
Sciences, Former Director of the Basic Research Department, Ministry of Science &
Technology (MOST), People’s Republic of China
Dr. Isabella Beretta, Scientific Advisor International Research Organisations, Federal
Department of Economic Affairs, Education and Research EAER, State Secretariat for
Education, Research and Innovation SERI, Berne, Switzerland
Dr. A. Lyric Jorgenson, National Institutes of Health (NIH), Deputy Director, Office of Science
Policy, Office of the Director, USA
Mr. Hugh Whittall, Director at Nuffield Council on Bioethics, UK
Participants reflect on the potential role of policy makers and innovators in advancing responsible
innovation in neurotechnology. The OECD is developing Principles for responsible development
and use of novel neurotechnologies for health-related applications.
12:30-13:00 Summary, conclusions, and outlook
Dr. David Winickoff, Senior Policy Analyst, Secretary, Working Party on Bio-, Nano- and
Converging Technologies (BNCT), Science and Technology Policy Division, OECD, Paris
13:00 End of workshop