Matthew E. Gladden
Institute of Computer Science, Polish Academy of Sciences
Neural implants as gateways to digital-physical ecosystems and posthuman socioeconomic interaction
Introduction
For many employees, "work" is no longer something performed while sitting at a computer in an office. Employees in a growing number of industries are expected to carry mobile devices and be available for work-related interactions even when beyond the workplace and outside of normal business hours. In this article it is argued that a future step will increasingly be to move work-related information and communication technology (ICT) inside the human body through the use of neuroprosthetics, to create employees who are always "online" and connected to their workplace's digital ecosystems. At present, neural implants are used primarily to restore abilities lost through injury or illness; however, their use for augmentative purposes is expected to grow, resulting in populations of human beings who possess technologically altered capacities for perception, memory, imagination, and the manipulation of physical environments and virtual cyberspace. Such workers may exchange thoughts and share knowledge within posthuman cybernetic networks that are inaccessible to unaugmented human beings.
Scholars note that despite their potential benefits, such neuroprosthetic devices may create numerous problems for their users, including a sense of alienation, the threat of computer viruses and hacking, financial burdens, and legal questions surrounding ownership of intellectual property produced while using such implants. Moreover, different populations of human beings may eventually come to occupy irreconcilable digital ecosystems as some persons embrace neuroprosthetic technology, others feel coerced into augmenting their brains to compete within the economy, others might reject such technology, and still others will simply be unable to afford it.
In this text we propose a model for analysing how particular neuroprosthetic devices will either facilitate human beings' participation in new forms of socioeconomic interaction and digital workplace ecosystems or undermine their mental and physical health, privacy, autonomy, and authenticity. We then show how such a model can be used to create device ontologies and typologies that help us classify and understand different kinds of advanced neuroprosthetic devices according to the impact that they will have on individual human beings.
From Neuroprosthetic Devices to Posthuman
Digital-Physical Ecosystems
Existing Integration of the Human Brain with Work-Related
Digital-Physical Ecosystems
In recent decades the integration of the human brain with work-related digital ecosystems has grown stronger and increasingly complex. Whereas once employees were expected to use desktop computers during "working hours," for a growing number of
employees it is now expected that they be available for work-related interactions at all
times through their possession and mastery of mobile (and now, wearable) devices
(Shih 2004; Gripsrud 2012). Along this path of ever closer human-technological
integration, an emerging frontier is that of moving computing inside the human
body through the use of implantable computers (Koops & Leenes 2012; Gasson 2012;
McGee 2008).
The Potential of Neuroprosthetic Implants for Human
Enhancement
One particular type of implantable computer is a neuroprosthetic device (or neural implant) designed to provide a human being with some sensory, cognitive, or motor capacity (Lebedev 2014). Such neuroprostheses are currently used primarily for therapeutic purposes, to restore abilities that have been lost due to injury or illness. However, researchers have already developed experimental devices designed for purposes of human enhancement that allow an individual to exceed his or her natural biological capacities by, for example, obtaining the ability to perceive ultrasonic waves or store digitized computer files within one's body (Warwick 2014; Gasson 2012; McGee 2008).
Toward Posthuman Digital-Physical Ecosystems
The use of neuroprosthetics for purposes of human enhancement is expected to grow over the coming decades, resulting in a segment of the population whose minds possess unique kinds of sensory perception, memory, imagination, and emotional intelligence and who participate in social relations that are mediated not through the exchange of traditional oral, written, or nonverbal communication but by neurotechnologies that allow the sharing of thoughts and volitions directly with other human minds and with computers (McGee 2008; Warwick 2014; Rao, Stocco, Bryan, Sarma, Youngquist, Wu, & Prat 2014).
Until now, communicating a thought to another mind has required the thought to be expressed physically as a social action that is audible, visible, or tangible in nature; however, future neuroprosthetics may facilitate the exchange of ideas directly at the level of thought (Warwick 2014; Rao et al. 2014; Gladden 2015d), thereby allowing the creation of human networks that can be understood as either "supersocial" or "postsocial" in nature. Not only might such posthuman (Ferrando 2013) digital ecosystems be inaccessible to those who lack the appropriate form of neural augmentation, but even their very existence may be invisible to unmodified human beings.
In this text, we will often refer to such ecosystems as "digital" to emphasise the fact that they may utilize an immersive cyberspace or other artificial environment as a virtualized locus for socioeconomic interaction. However, it should be kept in mind
that any such virtual reality is always grounded in and maintained by the computational activity of electronic or biological physical substrates; thus technically, digital ecosystems should always be understood as "digital-physical" ecosystems.
The Need to Analyse Neuroprosthetics from Cybernetic,
Phenomenological, and Existentialist Perspectives
As abidirectional gateway, a neural implant not only aids one’s mind to reach out to
explore the world and interact with other entities; it may also allow external agents or
systems to reach into one’s mind to access–and potentially manipulate or disrupt–one’s
most intimate mental processes (Gasson 2012: 15–16). is makes it essential that
manufacturers who produce such devices, policymakers who can encourage or ban their
adoption, and users inwhom they will be implanted be able to understand the positive
and negative impacts of particular neuroprosthetic devices onindividual users. is calls
for the development of device ontologies and typologies for classifying and understand-
ing neuroprostheses that do not simply focus onthe devices’ technical characteristics
but which also consider auser’s lived experience of aneuroprosthetic device and which
integrate acybernetic analysis of "control and communication" (Wiener 1961) with phe-
nomenological and even existentialist perspectives (Gladden 2015d).
Existing Ontologies and Typologies of Neuroprosthetic Devices
Existing typologies for neuroprosthetics are primarily functional. For example, a neuroprosthetic device can be classified based on the nature of its interface with the brain's neural circuitry [sensory, motor, bidirectional sensorimotor, or cognitive (Lebedev 2014)], its purpose [for restoration, diagnosis, identification, or enhancement (Gasson 2012: 25)], or its location [non-invasive, partially invasive, or invasive (Gasson 2012: 14)]. Typologies have also been developed that classify a neuroprosthesis according to whether it aids its human user to interact with a real physical environment using his or her natural physical body, augments or replaces the user's natural physical body (e.g., with robotic prosthetic limbs), or allows the user to sense and manipulate some virtual environment (Gladden 2015b).
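To make these existing classification dimensions concrete, they can be rendered as simple enumerations. The following Python sketch is purely illustrative; the class and member names are our own assumptions and are not drawn from the cited typologies themselves.

```python
# Illustrative sketch (hypothetical names): the existing functional typologies
# for neuroprostheses expressed as enumerations.
from enum import Enum

class InterfaceType(Enum):
    """Nature of the interface with the brain's neural circuitry (cf. Lebedev 2014)."""
    SENSORY = "sensory"
    MOTOR = "motor"
    BIDIRECTIONAL_SENSORIMOTOR = "bidirectional sensorimotor"
    COGNITIVE = "cognitive"

class DevicePurpose(Enum):
    """Purpose of the device (cf. Gasson 2012)."""
    RESTORATION = "restoration"
    DIAGNOSIS = "diagnosis"
    IDENTIFICATION = "identification"
    ENHANCEMENT = "enhancement"

class Invasiveness(Enum):
    """Location or degree of invasiveness (cf. Gasson 2012)."""
    NON_INVASIVE = "non-invasive"
    PARTIALLY_INVASIVE = "partially invasive"
    INVASIVE = "invasive"
```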
Formulating Our Model for an Ontology of
Neuroprosthetics
Here we propose a model for classifying and understanding neuroprosthetic devices, especially in their role of integrating human beings into digital ecosystems, economies, and information systems. The model comprises two main dimensions, of which one (impact) is further subdivided into two sub-dimensions (new capacities and detriments).
Roles of the Human User
A neuroprosthetic device aects its human user as viewed onthree levels: 1) the
human being as asapient metavolitional agent, aunitary mind that possesses its
own conscious awareness, memory, volition, and conscience–or "metavolitionality"
(Gladden 2015d; Calverley 2008)–2) the human being as an embodied organism
that inhabits and can sense and manipulate a particular environment through
the use of its body; and 3) the human being as asocial and economic actor who
interacts with others to form social relationships and to produce, exchange, and
consume goods and services.
Impact: Potential New Capacities and Detriments
At each of these three levels, a neuroprosthetic device can create for its user new opportunities and advantages, new threats and disadvantages, or both. Typically a neuroprosthetic device creates new opportunities for its user to participate in socioeconomic interaction and informational ecosystems by providing some new cognitive, sensory, or motor capacity. Disadvantages may take the form of a new dependency on some external resource, the loss of a previously existing capability, a security vulnerability, or some other detriment. Because a neuroprosthetic device's creation of new capacities can be independent of its creation of detriments, these elements comprise two different dimensions; however, it is simpler to treat them as two sub-dimensions of a single larger dimension, the device's "impact".
Figure 1. Amultidimensional model of the impacts of neuroprosthetic devices onindividual users
oLoss of ownership of body and IP
oFinancial, technological, and social
dependencies
oSubjugation to external agency
oSocial exclusion and employment
discrimination
oVulnerability to hacking, data theft,
blackmail, or other crime
oNo control over sensory organs
oNo control over motor organs
oNo control over other bodily systems
oOther biological side-eects
oLoss of agency
oLoss of conscious awareness
oLoss of cognitional info security
oConating real and virtual experience
oConating true and false memories
oOther psychological side-eects
oEnhanced memory (engrams)
oEnhanced creativity
oEnhanced emotion
oEnhanced conscious awareness
oEnhanced conscience
oSensory enhancement
oMotor enhancement
oEnhanced memory (exograms)
oNew kinds of social relations
oCollective knowledge
oJob exibility and instant retraining
oEnhanced management of
technological systems
oEnhanced business decision-making
and monetary value
oQualications for specic roles
...sapient
metavolitional
agent
Impacts on the
human being
as...
...embodied
embedded
organism
...social and
economic actor
Potential Detriments Potential New Capacities
Impacts Captured byOur Model
Below we present specic capacities and detriments that neuroprosthetics are
expected to create for their users at the three levels of the human being as 1) sapient
metavolitional agent, 2) embodied embedded organism, and 3) social and economic
actor. ese items constitute abroad universe of expected possible impacts identi-
ed byscholars; any one neuroprosthesis may generate only asmall number of these
eects, ifany.
Impacts onthe User as Sapient Metavolitional Agent
Neuroprosthetic devices may aect their users’ cognitive processes inways that posi-
tively or negatively impact the ability of such persons to participate insocioeconomic
interaction and informational ecosystems.
New capacities provided byneuroprosthetics may include:
• Enhanced memory, skills, and knowledge stored within the mind (engrams). Building on current technologies tested in mice, future neuroprosthetics may offer human users the ability to create, alter, or weaken memories stored in their brains' natural memory systems in the form of engrams (Han, Kushner, Yiu, Hsiang, Buch, Waisman, Bontempi, Neve, Frankland, & Josselyn 2009; Ramirez, Liu, Lin, Suh, Pignatelli, Redondo, Ryan, & Tonegawa 2013; McGee 2008; Warwick 2014: 267). This could potentially be used not only to affect a user's declarative knowledge but also to enhance motor skills or reduce learned fears.
• Enhanced creativity. A neuroprosthetic device may be able to enhance a mind's powers of imagination and creativity (Gasson 2012: 23–24) by facilitating processes that contribute to creativity, such as stimulating mental associations between unrelated items. Anecdotal increases in creativity have been reported to result after the use of neuroprosthetics for deep brain stimulation (Cosgrove 2004; Gasson 2012).
• Enhanced emotion. A neuroprosthetic device might provide its user with more desirable emotional dynamics (McGee 2008: 217). Effects on emotion have already been seen in devices used, e.g., for deep brain stimulation (Kraemer 2011).
• Enhanced conscious awareness. Research is being undertaken to develop neuroprosthetics that would allow the human mind to, for example, extend its periods of attentiveness and limit the need for periodic reductions in consciousness (i.e., sleep) (Kourany 2013: 992–93).
• Enhanced conscience. One's conscience can be understood as one's set of metavolitions, or desires about the kinds of volitions that one wishes to possess (Calverley 2008; Gladden 2015d); insofar as a neural implant enhances processes of memory and emotion (Calverley 2008: 528–34) that allow for the development of the conscience, it may enhance one's ability to develop, discern, and follow one's conscience.
New impairments generated byneuroprosthetics at the level of their user’s internal
mental processes may include:
• Loss of agency. A neuroprosthetic device may damage the brain or disrupt its activity in a way that reduces or eliminates the ability of its human user to possess and exercise agency (McGee 2008: 217). Moreover, the knowledge that this can occur may lead users to doubt whether their volitions are really "their own", an effect that has been seen with neuroprosthetics used for deep brain stimulation (Kraemer 2011).
• Loss of conscious awareness. A neuroprosthetic device may diminish the quality or extent of its user's conscious awareness, e.g., by inducing daydreaming or increasing the required amount of sleep. A neuroprosthesis could potentially even destroy its user's capacity for conscious awareness (e.g., by inducing a coma) but
without causing the death of his or her biological organism (Gladden 2015d).
• Loss of information security for internal cognitive processes. A neuroprosthetic device may compromise the confidentiality, integrity, or availability of information contained within its user's mental activities (such as perception, memory, volition, or imagination), either by altering or destroying information, making it inaccessible to the user, or making it accessible to unauthorized parties (McGee 2008: 217; Gladden 2015d; Gladden 2015c).
oInability to distinguish areal from avirtual ongoing experience. If aneuroprosthe-
sis alters or replaces its user’s sensory perceptions, it may make it impossible for
the user to know which (if any) of the sense data that he or she is experiencing cor-
respond to some actual element of an external physical environment and which
are "virtual" or simply "false" (McGee 2008: 221; Gladden 2015d).
• Inability to distinguish true from false memories. If a neuroprosthetic device is able to create, alter, or destroy engrams within its user's brain, it may be impossible for a user to know which of his or her apparent memories are "true" and which are "false" (i.e., distorted or purposefully fabricated) (Ramirez et al. 2013).
• Other psychological side effects. The brain may undergo potentially harmful and unpredictable structural and behavioral changes as it adapts to the presence, capacities, and activities of a neuroprosthesis (McGee 2008: 215–16; Koops & Leenes 2012: 125, 130). These effects may include new kinds of neuroses, psychoses, and other disorders unique to users of neuroprosthetics.
Impacts onthe User as Embodied Embedded Organism
Interacting with an Environment
Neuroprosthetic devices may aect the ways inwhich their users sense, manipulate,
and occupy their environment through the interface of aphysical or virtual body. New
capacities provided might include:
• Sensory enhancement. A neuroprosthetic device may allow its user to sense his or her physical or virtual environment in new ways, either by acquiring new kinds of raw sense data or new modes or abilities for processing, manipulating, and interpreting sense data (Warwick 2014: 267; McGee 2008: 214; Koops & Leenes 2012: 120, 126).
• Motor enhancement. A neuroprosthetic device may give users new ways of manipulating physical or virtual environments through their bodies (McGee 2008: 213; Warwick 2014: 266). It may grant enhanced control over one's existing biological body, expand one's body to incorporate new devices (such as an exoskeleton or vehicle) through body schema engineering (Gladden 2015b), or allow the user to control external networked physical systems (such as drones or 3D printers) or virtual systems or phenomena within an immersive cyberworld.
• Enhanced memory, skills, and knowledge accessible through sensory organs (exograms). A neuroprosthetic device may give its user access to external data-storage sites whose contents can be "played back" to the user's conscious awareness through his or her sensory organs or to real-time streams of sense data that augment or replace one's natural sense data (Koops & Leenes 2012: 115, 120, 126). The ability to record and play back one's own sense data could provide perfect audiovisual memory of one's experiences (McGee 2008: 217).
New impairments generated by neuroprosthetics at the level of their users' physical or virtual bodily interfaces with their environments might include:
• Loss of control over sensory organs. A neuroprosthetic device may deny a user direct control over his or her sensory organs (Koops & Leenes 2012: 130). Technologically mediated sensory systems may be subject to noise, malfunctions, and manipulation or forced sensory deprivation or overload occurring at the hands of "sense hackers" (Gladden 2015c: 201–02).
• Loss of control over motor organs. A neuroprosthetic device may impede a user's control over his or her motor organs (Gasson 2012: 14–16). The user's body may no longer be capable, e.g., of speech or movement, or the control over one's speech or movements may be assumed by some external agency.
• Loss of control over other bodily systems. A neuroprosthetic device may impact the functioning of internal bodily processes such as respiration, cardiac activity, digestion, hormonal activity, and other processes that are already affected by existing implantable medical devices (McGee 2008: 209; Gasson 2012: 12–16).
• Other biological side effects. A neuroprosthetic device may be constructed from components that are toxic or deteriorate in the body (McGee 2008: 213–16), may be rejected by its host, or may be subject to mechanical, electronic, or software failures that harm its host's organism.
Impacts onthe User as Social and Economic Actor
Neuroprosthetic devices may aect the ways inwhich their users connect to, partici-
pate in, contribute to, and are inuenced bysocial relationships and structures and
economic networks and exchange. New capacities provided might include:
• Ability to participate in new kinds of social relations. A neuroprosthetic device may grant the ability to participate in new kinds of technologically mediated social relations and structures that were previously impossible, perhaps including new forms of merged agency (McGee 2008: 216; Koops & Leenes 2012: 125, 132) or cybernetic networks with utopian (or dystopian) characteristics (Gladden 2015d).
• Ability to share collective knowledge, skills, and wisdom. Neuroprosthetics may link users in a way that forms communication and information systems (McGee 2008: 214; Koops & Leenes 2012: 128–29; Gasson 2012: 24) that can generate greater collective knowledge, skills, and wisdom than are possessed by any individual member of the system (Wiener 1961: loc. 3070ff., 3149ff.; Gladden 2015d).
• Enhanced job flexibility and instant retraining. By facilitating the creation, alteration, and deletion of information stored in engrams or exograms, a neuroprosthetic device may allow a user to download new knowledge or skills or instantly establish relationships for use in a new job (Koops & Leenes 2012: 126).
• Enhanced ability to manage complex technological systems. By providing a direct interface to external computers and mediating its user's interaction with them (McGee 2008: 210), a neuroprosthesis may grant an enhanced ability to manage complex technological systems, e.g., for the production or provisioning of goods or services (McGee 2008: 214–15; Gladden 2015b).
• Enhanced business decision-making and monetary value. By performing data mining to uncover novel knowledge, executing other forms of data analysis, offering recommendations, and alerting the user to potential cognitive biases, a neuroprosthesis may enhance its user's ability to execute rapid and effective business-related decisions and transactions (Koops & Leenes 2012: 119). Moreover, by storing cryptocurrency keys, a neuroprosthesis may allow its user to store money directly within his or her brain for use on demand (Gladden 2015a).
New impairments generated byneuroprosthetic devices at the level of their users
socioeconomic relationships and activity might include:
• Loss of ownership of one's body and intellectual property. A neuroprosthetic device that is leased would not belong to its human user, and even a neuroprosthesis that has been purchased could potentially be subject to seizure in some circumstances (e.g., bankruptcy). Depending on the leasing or licensing terms, intellectual property produced by a neuroprosthetic device's user (including thoughts, memories, or speech) may be partly or wholly owned by the device's manufacturer or provider (Gladden 2015d; Gladden 2015c: 164).
• Qualifications for specific professions and roles. Neuroprosthetic devices may initially provide persons with abilities that enhance job performance in particular fields (Koops & Leenes 2012: 131–32) such as computer programming, art, architecture, music, economics, medicine, information science, e-sports, information security, law enforcement, and the military; as expectations for employees' neural integration into workplace systems grow, possession of neuroprosthetic devices may become a requirement for employment in some professions (McGee 2008: 211, 214–15; Warwick 2014: 269).
oCreation of nancial, technological or social dependencies.e user of aneuropro-
sthetic device may no longer be able to function eectively without the device
(Koops & Leenes 2012: 125) and may become dependent on its manufacturer
for hardware maintenance, soware updates, and data security and onspecial-
ised medical care providers for diagnostics and treatment relating to the device
(McGee 2008: 213). Auser may require regular device upgrades inorder to re-
main competitive insome jobs. High switching costs may make it impractical to
shi to acompetitor’s device aer auser has installed an implant and committed
to its manufacturers digital ecosystem.
• Subjugation of the user to external agency. Instead of merely impeding its user's ability to possess and exercise agency, a neuroprosthesis may subject its user to control by some external agency. This could occur, e.g., if the user's memories, emotions, or volitions were manipulated by means of the device (Gasson 2012: 15–16) or if the user joined with other minds to create a new form of social entity that possesses some shared agency (McGee 2008: 216).
• Social exclusion and employment discrimination. The use of detectable neuroprosthetics may result in shunning or mistreatment of users (Koops & Leenes 2012: 124–25). Users of advanced neuroprosthetics may lose the ability or desire to communicate with human beings who lack such devices, thereby fragmenting human societies (McGee 2008: 214–16; Warwick 2014: 271) and possibly weakening users' solidarity with other human beings (Koops & Leenes 2012: 127). Possession of some kinds of neuroprosthetic devices may exclude their users from employment in roles where "natural," unmodified workers are considered desirable or even required (e.g., for liability or security reasons).
• Vulnerability to data theft, blackmail, and extortion. A hacker, computer virus, or other agent may be able to steal data contained in a neuroprosthesis or use it to gather personal data (potentially including the contents of thoughts, memories, or sensory experiences) (McGee 2008: 217; Koops & Leenes 2012: 117, 130; Gasson 2012: 21; Gladden 2015: 167–68) that could be used for blackmail, extortion, corporate espionage, or terrorism.
Applying the Model: Toward a New Typology of Neuroprosthetics
As atest case, we can use this model to analyse one kind of neuroprosthetic device that
is expected to become available inthe future: acochlear implant with audio recording,
playback, upload, download, and live streaming capabilities (Koops &Leenes 2012;
McGee 2008; Gladden 2015d). Everything that its user hears would be recorded for
later playback ondemand. Instead of simply conveying the "real" sounds produced
bythe physical environment, those sounds can be augmented or replaced byother
audio that is stored inor transmitted to the device. Potential capacities and impair-
ments created for the user of such adevice are identied inFigure 2 below.
As can be seen from this example, the model does not yield a single quantitative "impact score" for each of the three levels but rather uses qualitative descriptions to capture a complex set of impacts. This model delineates a device ontology that can form the basis of further reflection on and analysis of a neuroprosthetic device's impact from cybernetic, phenomenological, and existentialist perspectives. By allowing neuroprosthetic devices with similar characteristics to be identified and grouped, it can also serve as the basis of new typologies for neurotechnologies.
Figure 2. The model applied to analyse impacts of a particular auditory neuroprosthesis

Impacts on the human being as sapient metavolitional agent:
• Potential detriments: conflation of "real" sounds from the environment, the playback of recorded audio, and live streaming of audio from a remote source; psychological effects of sensory overload, deprivation, or manipulation.
• Potential new capacities: a continuous internal "soundtrack" of music or sounds can be created to stimulate desirable cognitive activity and suppress undesirable activity.

Impacts on the human being as embodied embedded organism:
• Potential detriments: loss of control over auditory sense data to those directing the device; disruption of sensorimotor feedback loops due to lack of real sense data.
• Potential new capacities: playback ability grants perfect auditory memory; extension of the body by tapping into audio from remote microphones.

Impacts on the human being as social and economic actor:
• Potential detriments: hackers can eavesdrop on live audio from the user's implant or access recorded auditory experiences; the user could be forced to hear sounds (e.g., voices) designed to produce specific reactions or behaviors; some may refuse to speak with the user since all conversations are recorded; the user will be suspected of receiving secret aid or advice through the implant.
• Potential new capacities: the ability to receive live audio prompts may aid politicians, actors, news broadcasters, lecturers, etc.; a hands-free ability to play back audio notes or download reference material may aid surgeons, artists, drivers, soldiers, police, athletes, etc.; two or more persons can share their inner speech for forging joint experiences and communal decisions.
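As an illustration of how such a qualitative profile might be recorded, the hypothetical DeviceImpactProfile sketch introduced earlier can be populated with a few of the impacts listed in Figure 2. The code below is again only an assumed, non-authoritative sketch and presumes the earlier class and enum definitions.

```python
# Illustrative use of the earlier (hypothetical) DeviceImpactProfile and UserRole
# sketch, populated with a few impacts drawn from Figure 2.
profile = DeviceImpactProfile(
    device_name="cochlear implant with recording, playback, and streaming")

profile.add_capacity(UserRole.EMBODIED_EMBEDDED_ORGANISM,
                     "playback ability grants perfect auditory memory")
profile.add_capacity(UserRole.SOCIAL_ECONOMIC_ACTOR,
                     "live audio prompts may aid politicians, lecturers, etc.")
profile.add_detriment(UserRole.SAPIENT_METAVOLITIONAL_AGENT,
                      "conflation of real, recorded, and streamed audio")
profile.add_detriment(UserRole.SOCIAL_ECONOMIC_ACTOR,
                      "hackers can eavesdrop on live audio from the implant")

# The model yields qualitative groupings per role rather than a single impact score.
for role in UserRole:
    print(role.value,
          "| capacities:", profile.new_capacities.get(role, []),
          "| detriments:", profile.detriments.get(role, []))
```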
Conclusion
Ongoing advances inneuroprosthetics are expected to yield adiverse range of new
technologies with the potential to dramatically reshape a human being’s internal
mental life, his or her bodily existence and interaction with the environment, and his
or her participation insocial and economic networks and activity. e new capaci-
ties and impairments that such technologies provide may allow human beings to
physically and virtually inhabit digital ecosystems and interact socially inways so
revolutionary that they can best be understood as "posthuman."
The model developed in this text for understanding these impacts of neuroprosthetic devices is already being elaborated in the specific context of information security to provide a framework for future research and practice in that field (Gladden 2015c). By further refining and applying the model in other contexts, we hope that it will be possible for engineers, ethicists, policymakers, and consumers to better understand how particular kinds of neuroprosthetic devices may contribute to the development of new digital ecosystems that can be a powerful venue for the growth, liberation, and empowerment (or oppression and dehumanization) of the human beings of the future.
References
Calverley D.J., (2008), "Imagining a non-biological machine as a legal person", AI & SOCIETY 22(4), 523–37.
Cosgrove G.R., (2004), Session 6: Neuroscience, brain, and behavior V: Deep brain stimulation.
Meeting of the President’s Council on Bioethics, Washington, DC, June 24–25, 2004.
https://bioethicsarchive.georgetown.edu/pcbe/transcripts/june04/session6.html, accessed
June 12, 2015.
Ferrando F., (2013), "Posthumanism, transhumanism, antihumanism, metahumanism, and new materialisms: Differences and relations", Existenz: An International Journal in Philosophy, Religion, Politics, and the Arts 8(2), 26–32.
Gasson M.N., (2012), "Human ICT implants: From restorative application to human enhancement", in M.N. Gasson, E. Kosta, & D.M. Bowman (eds.), Human ICT Implants: Technical, Legal and Ethical Considerations, T. M. C. Asser Press, 11–28.
Gladden M.E., (2015), "Cryptocurrency with aconscience: Using articial intelligence to develop
money that advances human ethical values", Ethics inEconomic Life, Uniwersytet Łódzki,
May 8, 2015.
Gladden M.E., (2015), Cybershells, shapeshiing, and neuroprosthetics: Video games as tools for
posthuman ‘body schema (re)engineering.’ Ogólnopolska Konferencja Naukowa Dyskursy
Gier Wideo, AGH, Kraków, June 6, 2015.
Gladden M.E., (2015), e Handbook of Information Security for Advanced Neuroprosthetics. Indi-
anapolis: Synthypnion Academic.
Gladden M.E., (2015), Tachikomatic domains: Utopian cyberspace as a‘contingent heaven’ for
humans, robots, and hybrid intelligences. His Master’s Voice: Utopias and Dystopias
inAudiovisual Culture, Uniwersytet Jagielloński, March 24, 2015.
Gripsrud M. & Hjorthol R., (2012), "Working on the train: From 'dead time' to productive and vital time", Transportation 39(5), 941–56.
Han J.-H., Kushner S.A., Yiu A.P., Hsiang H.-W., Buch T., Waisman A., Bontempi B., Neve R.L., Frankland P.W., & Josselyn S.A., (2009), "Selective erasure of a fear memory", Science 323(5920), 1492–96.
Koops B.-J. & Leenes R., (2012), "Cheating with implants: Implications of the hidden information advantage of bionic ears and eyes", in M.N. Gasson, E. Kosta, & D.M. Bowman (eds.), Human ICT Implants: Technical, Legal and Ethical Considerations, T. M. C. Asser Press, 113–134.
Kourany J.A., (2013), "Human enhancement: Making the debate more productive", Erkenntnis 79(5), 981–98.
Kraemer F., (2011), "Me, myself and my brain implant: Deep brain stimulation raises questions of
personal authenticity and alienation", Neuroethics 6(3), 483–97.
Lebedev M., (2014), "Brain-machine interfaces: An overview", Translational Neuroscience 5(1),
99–110.
McGee E.M., (2008), "Bioelectronics and implanted devices", in B. Gordijn & R. Chadwick (eds.), Medical Enhancement and Posthumanity, Springer Netherlands, 207–224.
Ramirez S., Liu X., Lin P.-A., Suh J., Pignatelli M., Redondo R.L., Ryan T.J., & Tonegawa S., (2013), "Creating a false memory in the hippocampus", Science 341(6144), 387–91.
Rao R.P.N., Stocco A., Bryan M., Sarma D., Youngquist T.M., Wu J., & Prat C.S., (2014), "A direct brain-to-brain interface in humans", PLoS ONE 9(11).
Shih J., (2004), "Project time in Silicon Valley", Qualitative Sociology 27(2), 223–45.
Warwick K., (2014), "The cyborg revolution", Nanoethics 8, 263–73.
Wiener N., (1961), Cybernetics: Or Control and Communication in the Animal and the Machine, second edition, Cambridge, Massachusetts: The MIT Press (Quid Pro ebook edition for Kindle, 2015).
... Transhumanism postulates that human is a unique species that may be technologically augmented to provide its survival, new competencies, and development (Ranish & Sorgner, 2008). In line with these assumptions, implanting chips and prostheses can increase human performance or help people gain new skills (Gladden, 2016); in other words it can help humans become cyborgs. The term 'cyborg' was originally coined by Clynes and Kline (1960) when researching the prospectus of human space exploration. ...
Article
Not only does the art market include human and nonhuman creators, it also incorporates technologically augmented artists, called cyborgs. They use wearables, sensors, chips, and even new organs to process various stimuli, such as electromagnetic radiation, atmospheric pressure, or ultraviolet rays, to produce artworks. Little is known however, how the objects produced by them are perceived by the art recipients. This paper applies an experimental study with 373 non-experts in the field of art. The results show that the perceived value of the painting depends on the type of agent and on the context of the evaluation. The price of objects created by a human artist is significantly higher if the context cue is the price of the canvas made by humanoid robot than another human. People value cyborg's artwork similarly to human-generated artwork when contextual cue is human, and similarly to robot-generated artwork when contextual cue is humanoid robot.
... Insofar as future neuroprostheses incorporate such technologies, they may be less likely to simply support their hosts' biological agency; they might instead conceivably impair, override, transform, or replace it. This might be encountered, for example, with neuroprostheses that are controlled by computers possessing human-like cognitive abilities or are composed of biological components possessing their own biological agency distinct from that of their users (Rutten et al., 2007;Stieglitz, 2007;Gladden, 2016b). ...
Article
Full-text available
Previous works exploring the challenges of ensuring information security for neuroprosthetic devices and their users have typically built on the traditional InfoSec concept of the “CIA Triad” of confidentiality, integrity, and availability. However, we argue that the CIA Triad provides an increasingly inadequate foundation for envisioning information security for neuroprostheses, insofar as it presumes that (1) any computational systems to be secured are merely instruments for expressing their human users' agency, and (2) computing devices are conceptually and practically separable from their users. Drawing on contemporary philosophy of technology and philosophical and critical posthumanist analysis, we contend that futuristic neuroprostheses could conceivably violate these basic InfoSec presumptions, insofar as (1) they may alter or supplant their users' biological agency rather than simply supporting it, and (2) they may structurally and functionally fuse with their users to create qualitatively novel “posthumanized” human-machine systems that cannot be secured as though they were conventional computing devices. Simultaneously, it is noted that many of the goals that have been proposed for future neuroprostheses by InfoSec researchers (e.g., relating to aesthetics, human dignity, authenticity, free will, and cultural sensitivity) fall outside the scope of InfoSec as it has historically been understood and touch on a wide range of ethical, aesthetic, physical, metaphysical, psychological, economic, and social values. We suggest that the field of axiology can provide useful frameworks for more effectively identifying, analyzing, and prioritizing such diverse types of values and goods that can (and should) be pursued through InfoSec practices for futuristic neuroprostheses.
... 66 Moreover, the use of mnemocybernetic technologies to affect a particular person's memories (or even a human being's awareness of the possibility that such technologies may have been used) may create uncertainty for that individual about whether the information stored in his or her memories is true or false and an inability to trust any of his or her apparent memories. 67 The use of such technologies may also provide a mnemonic designer with the ability to access, interpret, and create an external record of (and perhaps even potentially 'export' in an automated fashion) memories stored within the human host's mind that he or she wishes to keep confidential. ...
Chapter
Full-text available
This chapter describes how responsibilities for planning and implementing information security practices and mechanisms are typically allocated among individuals filling particular roles within an organization. It then investigates the unique forms that these InfoSec roles and responsibilities can take when the focus of their activities is ensuring information security for advanced neuroprostheses and their human hosts.
Chapter
Full-text available
This chapter explores the way in which standard corrective and compensating security controls (such as those described in NIST Special Publication 800-53) become more important, less relevant, or significantly altered in nature when applied to ensuring the information security of advanced neuroprosthetic devices and host-device systems. Controls are addressed using an SDLC framework whose stages are (1) supersystem planning; (2) device design and manufacture; (3) device deployment; (4) device operation; and (5) device disconnection, removal, and disposal. Corrective and compensating controls considered include those relating to incident response procedures, mechanisms, and training; error handling capacities; failure mode capacities and procedures; and flaw remediation.
Chapter
Full-text available
This chapter explores the way in which standard detective security controls (such as those described in NIST Special Publication 800-53) become more important, less relevant, or significantly altered in nature when applied to ensuring the information security of advanced neuroprosthetic devices and host-device systems. Controls are addressed using an SDLC framework whose stages are (1) supersystem planning; (2) device design and manufacture; (3) device deployment; (4) device operation; and (5) device disconnection, removal, and disposal. Detective controls considered include those relating to the establishment of an integrated InfoSec security analysis team; use of all-source intelligence regarding component suppliers; integrity indicators; designing the capacity to detect medical emergencies; integrated situational awareness; establishment of account usage baselines; general monitoring and scanning; auditing of events; threat and incident detection; and proactive detection and analysis methods.
Chapter
Full-text available
This chapter explores the way in which standard preventive security controls (such as those described in NIST Special Publication 800-53) become more important, less relevant, or significantly altered in nature when applied to ensuring the information security of advanced neuroprosthetic devices and host-device systems. Controls are addressed using an SDLC framework whose stages are (1) supersystem planning; (2) device design and manufacture; (3) device deployment; (4) device operation; and (5) device disconnection, removal, and disposal. Preventive controls considered include those relating to security planning; risk assessment and formulation of security requirements; personnel controls; information system architecture; device design principles; memory-related controls; cryptographic protections; device power and shutoff mechanisms; program execution protections; input controls; logical access control architecture; authentication mechanisms; session controls; wireless and remote-access protections; backup capabilities; component protections; controls on external developers and suppliers; environmental protections; contingency planning; system component inventory; selection of device recipients and authorization of access; physical and logical hardening of the host-device system and supersystem; device initialization and configuration controls; account management; security awareness training; vulnerability analysis; operations security (OPSEC); control of device connections; media protections; exfiltration protections; maintenance; security alerts; information retention; and media sanitization.
Chapter
Full-text available
This text presents an introduction to neuroprosthetic devices and systems that explores both the state of the art of sensory, motor, and cognitive neuroprostheses that are currently in use as well as more sophisticated kinds of neuroprosthetic technologies that are being actively pursued or that are expected to be developed in the future. This overview takes us from the contemporary world of neuroprostheses that have been designed primarily for purposes of therapeutic treatment of medical disorders and the restoration of natural human abilities lost due to illness or injury to an emerging future world in which neuroprosthetic devices offer the possibility of augmenting and transforming the capacities of their users in such a way that they can perhaps best be described as ‘posthumanizing’ technologies.
Chapter
Full-text available
This text develops a model based on network topology that can be used to analyze or engineer the structures and dynamics of an organization in which neuroprosthetic technologies are employed to enhance the abilities of human personnel. We begin by defining neuroprosthetic supersystems as organizations whose members include multiple neuroprosthetically augmented human beings. It is argued that the expanded sensory, cognitive, and motor capacities provided by ‘posthumanizing’ neuroprostheses may enable human beings possessing such technologies to collaborate using novel types of organizational structures that differ from the traditional structures that are possible for unaugmented human beings. The concept of network topology is then presented as a concrete approach to analyzing or engineering such neuroprosthetic supersystems. A number of common network topologies such as chain, linear bus, tree, ring, hub-and-spoke, partial mesh, and fully connected mesh topologies are discussed and their relative advantages and disadvantages noted. Drawing on the notion of different architectural ‘views’ employed in enterprise architecture, we formulate a topological model that incorporates five views that are relevant for neuroprosthetic supersystems: the (1) physical and (2) logical topologies of the neuroprosthetic devices themselves; (3) the natural topology of social relations of the devices’ human hosts; (4) the topology of the virtual environments, if any, created and accessed by means of the neuroprostheses; and (5) the topology of the brain-to-brain communication, if any, facilitated by the devices. Potential uses of the model are illustrated by applying it to four hypothetical types of neuroprosthetic supersystems: (1) an emergency medical alert system incorporating body sensor networks (BSNs); (2) an array of centrally hosted virtual worlds; (3) a ‘hive mind’ administered by a central hub; and (4) a distributed hive mind lacking a central hub. It is our hope that models such as the one formulated here will prove useful not only for engineering neuroprosthetic supersystems to meet functional requirements but also for analyzing the legal, ethical, and social aspects of potential or existing supersystems, to ensure that the organizational deployment of neuroprosthetic technologies does not undermine the wellbeing of such devices’ human users or of societies as a whole.
Chapter
Full-text available
When designing target architectures for organizations, the discipline of enterprise architecture has historically relied a set of assumptions regarding the physical, cognitive, and social capacities of the human beings serving as organizational members. In this text we explore the fact that for those organizations that intentionally deploy posthumanizing neuroprosthetic technologies among their personnel, such traditional assumptions no longer hold true: the use of advanced neuroprostheses intensifies the ongoing structural, systemic, and procedural fusion of human personnel and electronic information systems in a way that provides workers with new capacities and limitations and transforms the roles available to them. Such use of neuroprostheses has the potential to affect an organization’s workers in three main areas. First, the use of neuroprostheses may affect workers’ physical form, as reflected in the physical components of their bodies, the role of design in their physical form, their length of tenure as workers, the developmental cycles that they experience, their spatial extension and locality, the permanence of their physical substrates, and the nature of their personal identity. Second, neuroprostheses may affect the information processing and cognition of neurocybernetically augmented workers, as manifested in their degree of sapience, autonomy, and volitionality; their forms of knowledge acquisition; their locus of information processing and data storage; their emotionality and cognitive biases; and their fidelity of data storage, predictability of behavior, and information security vulnerabilities. Third, the deployment of neuroprostheses can affect workers’ social engagement, as reflected in their degree of sociality; relationship to organizational culture; economic relationship with their employers; and rights, responsibilities, and legal status. While ethical, legal, economic, and functional factors will prevent most organizations from deploying advanced neuroprostheses among their personnel for the foreseeable future, a select number of specialized organizations (such as military departments) are already working to develop such technologies and implement them among their personnel. The enterprise architectures of such organizations will be forced to evolve to accommodate the new realities of human-computer integration brought about by the posthumanizing neuroprosthetic technologies described in this text.
Chapter
Full-text available
The discipline of enterprise architecture (EA) seeks to generate alignment between an organization’s electronic information systems, human resources, business processes, workplace culture, mission and strategy, and external ecosystem in order to increase the organization’s ability to manage complexity, resolve internal conflicts, and adapt proactively to environmental change. In this text, an introduction to the definition, history, organizational role, objectives, benefits, mechanics, and popular implementations of enterprise architecture is presented. The historical shift from IT-centric to business-centric definitions of EA is reviewed, along with the difference between ‘hard’ and ‘soft’ approaches to EA. The unique organizational role of EA is highlighted by comparing it with other management disciplines and practices. The creation of alignment is explored as the core mechanism by which EA achieves advantageous effects. Different kinds of alignment are defined, the history of EA as a generator of alignment is investigated, and EA’s relative effectiveness at creating different types of alignment is candidly assessed. Attention is given to the key dynamic by which alignment yields deeper integration of an organization’s structures, processes, and systems, which in turn grants the organization greater agility – which itself enhances the organization’s ability to implement rapid and strategically directed change. The types of tasks undertaken by enterprise architects are discussed, and a number of popular enterprise architecture frameworks are highlighted. A generic EA framework is then presented as a means of discussing elements such as architecture domains, building blocks, views, and landscapes that form the core of many EA frameworks. The role of modelling languages in documenting EA plans is also addressed. In light of enterprise architecture’s strengths as a tool for managing the deployment of innovative forms of IT, it is suggested that by adopting EA initiatives of the sort described here, organizations may better position themselves to address the new social, economic, and operational realities presented by emerging ‘posthumanizing’ technologies such as those relating to social robotics, nanorobotics, artificial life, genetic engineering, neuroprosthetic augmentation, and virtual reality.
Article
Full-text available
"Posthuman" has become an umbrella term to refer to a variety of different movements and schools of thought, including philosophical, cultural, and critical posthumanism; transhumanism (in its variations of extropianism, liberal and democratic transhumanism, among others); the feminist approach of new materialisms; the heterogeneous landscape of antihumanism, metahumanism, metahumanities, and posthumanities. Such a generic and all-inclusive use of the term has created methodological and theoretical confusion between experts and non-experts alike. This essay will explore the differences between these movements, focusing in particular on the areas of signification shared by posthumanism and transhumanism. In presenting these two independent, yet related philosophies, posthumanism may prove a more comprehensive standpoint to reflect upon possible futures.
Article
Full-text available
Cryptocurrencies like Bitcoin are offering new avenues for economic empowerment to individuals around the world. However, they also provide a powerful tool that facilitates criminal activities such as human trafficking and illegal weapons sales that cause great harm to individuals and communities. Cryptocurrency advocates have argued that the ethical dimensions of cryptocurrency are not qualitatively new, insofar as money has always been understood as a passive instrument that lacks ethical values and can be used for good or ill purposes. In this paper, we challenge such a presumption that money must be "value-neutral." Building on advances in artificial intelligence, cryptography, and machine ethics, we argue that it is possible to design artificially intelligent cryptocurrencies that are not ethically neutral but which autonomously regulate their own use in a way that reflects the ethical values of particular human beings – or even entire human societies. We propose a technological framework for such cryptocurrencies and then analyze the legal, ethical, and economic implications of their use. Finally, we suggest that the development of cryptocurrencies possessing ethical as well as monetary value can provide human beings with a new economic means of positively influencing the ethos and values of their societies.
Book
Full-text available
How does one ensure information security for a computer that is entangled with the structures and processes of a human brain – and for the human mind that is interconnected with such a device? The need to provide information security for neuroprosthetic devices grows more pressing as increasing numbers of people utilize therapeutic technologies such as cochlear implants, retinal prostheses, robotic prosthetic limbs, and deep brain stimulation devices. Moreover, emerging neuroprosthetic technologies for human enhancement are expected to increasingly transform their human users’ sensory, motor, and cognitive capacities in ways that generate new ‘posthumanized’ sociotechnological realities. In this context, it is essential not only to ensure the information security of such neuroprostheses themselves but – more importantly – to ensure the psychological and physical health, autonomy, and personal identity of the human beings whose cognitive processes are inextricably linked with such devices. InfoSec practitioners must not only guard against threats to the confidentiality and integrity of data stored within a neuroprosthetic device’s internal memory; they must also guard against threats to the confidentiality and integrity of thoughts, memories, and desires existing within the mind the of the device’s human host. This second edition of The Handbook of Information Security for Advanced Neuroprosthetics updates the previous edition’s comprehensive investigation of these issues from both theoretical and practical perspectives. It provides an introduction to the current state of neuroprosthetics and expected future trends in the field, along with an introduction to fundamental principles of information security and an analysis of how they must be re-envisioned to address the unique challenges posed by advanced neuroprosthetics. A two-dimensional cognitional security framework is presented whose security goals are designed to protect a device’s human host in his or her roles as a sapient metavolitional agent, embodied embedded organism, and social and economic actor. Practical consideration is given to information security responsibilities and roles within an organizational context and to the application of preventive, detective, and corrective or compensating security controls to neuroprosthetic devices, their host-device systems, and the larger supersystems in which they operate. Finally, it is shown that while implantable neuroprostheses create new kinds of security vulnerabilities and risks, they may also serve to enhance the information security of some types of human hosts (such as those experiencing certain neurological conditions).
Article
Brain-machine interfaces (BMIs) hold promise to treat neurological disabilities by linking intact brain circuitry to assistive devices, such as limb prostheses, wheelchairs, artificial sensors, and computers. BMIs have experienced very rapid development in recent years, facilitated by advances in neural recordings, computer technologies, and robotics. BMIs are commonly classified into three types: sensory, motor, and bidirectional, which subserve sensory, motor, and sensorimotor functions, respectively. Additionally, cognitive BMIs have emerged in the domain of higher brain functions. BMIs are also classified as noninvasive or invasive according to the degree of their interference with biological tissue. Although noninvasive BMIs are safe and easy to implement, their information bandwidth is limited. Invasive BMIs hold promise to improve the bandwidth by utilizing multichannel recordings from ensembles of brain neurons. BMIs have a broad range of clinical goals, as well as the goal of enhancing normal brain functions.
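The classification described above can be summarized in a short sketch such as the following; the enum and class names and the example devices are illustrative assumptions rather than a standard clinical taxonomy.

# Illustrative sketch of the BMI typology; names are assumptions, not a standard taxonomy.
from dataclasses import dataclass
from enum import Enum

class BMIFunction(Enum):
    SENSORY = "sensory"                # delivers artificial sensory input to the brain
    MOTOR = "motor"                    # decodes intended movement from brain activity
    BIDIRECTIONAL = "bidirectional"    # combines sensory and motor channels
    COGNITIVE = "cognitive"            # targets higher brain functions

class Invasiveness(Enum):
    NONINVASIVE = "noninvasive"        # e.g. scalp EEG: safe but limited bandwidth
    INVASIVE = "invasive"              # e.g. implanted electrode arrays: higher bandwidth

@dataclass
class BMI:
    name: str
    function: BMIFunction
    invasiveness: Invasiveness

examples = [
    BMI("motor-imagery EEG cursor control", BMIFunction.MOTOR, Invasiveness.NONINVASIVE),
    BMI("cochlear implant", BMIFunction.SENSORY, Invasiveness.INVASIVE),
]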
Article
We describe the first direct brain-to-brain interface in humans and present results from experiments involving six different subjects. Our non-invasive interface, demonstrated originally in August 2013, combines electroencephalography (EEG) for recording brain signals with transcranial magnetic stimulation (TMS) for delivering information to the brain. We illustrate our method using a visuomotor task in which two humans must cooperate through direct brain-to-brain communication to achieve a desired goal in a computer game. The brain-to-brain interface detects motor imagery in EEG signals recorded from one subject (the "sender") and transmits this information over the internet to the motor cortex region of a second subject (the "receiver"). This allows the sender to cause a desired motor response in the receiver (a press on a touchpad) via TMS. We quantify the performance of the brain-to-brain interface in terms of the amount of information transmitted as well as the accuracies attained in (1) decoding the sender's signals, (2) generating a motor response from the receiver upon stimulation, and (3) achieving the overall goal in the cooperative visuomotor task. Our results provide evidence for a rudimentary form of direct information transmission from one human brain to another using non-invasive means.
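A schematic sketch of this sender-to-receiver pipeline is given below. The function names, the toy thresholding decoder, and the simulated EEG features are placeholders introduced for illustration only; they are not the authors' implementation.

# Schematic sketch of the EEG-decode -> network -> TMS pipeline; all names are placeholders.
import random

def decode_motor_imagery(eeg_features, threshold=0.5):
    """Toy decoder: treat a mean feature value above the threshold as 'imagined movement'."""
    return sum(eeg_features) / len(eeg_features) > threshold

def transmit(decision):
    """Stand-in for sending the decoded decision to the receiver over the internet."""
    return decision

def deliver_tms_pulse():
    """Stand-in for triggering TMS over the receiver's motor cortex."""
    return "receiver presses the touchpad"

# One simulated trial of the cooperative task:
eeg_features = [random.random() for _ in range(256)]   # placeholder sender EEG features
if transmit(decode_motor_imagery(eeg_features)):
    print(deliver_tms_pulse())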
Article
In this article, I explore selected case studies of Parkinson's patients treated with deep brain stimulation (DBS) in light of the notions of alienation and authenticity. While the literature on DBS has so far neglected the issues of authenticity and alienation, I argue that interpreting these cases in terms of these concepts raises new issues not only for the philosophical discussion of the neuroethics of DBS, but also for the psychological and medical approach to patients under DBS. In particular, I suggest that the experience of alienation and authenticity varies from patient to patient under DBS. For some, alienation can be brought about by neurointerventions because the patients no longer feel like themselves; on the other hand, alienation can also be relieved by DBS, as other patients experience their state of mind under treatment as authentic and retrospectively regard their former lives without stimulation as alienated. I argue that further research on the relevance of authenticity and alienation to patients treated with DBS is needed in order to gain a deeper philosophical understanding and to develop appropriate evaluative criteria for the behavior of DBS patients.
Chapter
Human ICT implants such as cochlear implants and cardiac pacemakers have been in common clinical use for many years, forming intimate links between technology and the body. Such medical devices have become increasingly advanced in their functionality, with some able to modify behaviour by directly interacting with the human brain and others coming closer to restoring functionality that outperforms its natural counterpart. More recently, and somewhat more controversially, low-tech human ICT implants have been increasingly employed in healthy people, in non-therapeutic contexts. Applications typically focus on identification, such as VIP entry into nightclubs, automated payments, and controlling access to secure facilities. While reviewing the state of the art, this chapter makes the case that, given the desire of technology enthusiasts and self-experimenters to push boundaries, the cultural and societal changes driven by increasing familiarity, advances in medical technology, and the inevitable drift of medical technology to non-medical applications, this is clearly just the beginning for human enhancement using ICT implants.
Conference Paper
In a number of popular video games, the player character’s (originally human) body undergoes a temporary or permanent transformation to take on a radically different physical form, such as that of an animal, mythical creature, machine, or cloud of energy. In fantasy games, such a transformation might be caused by a magical spell, ability, or item; in science fiction games, the character’s body might be transformed through cybernetic augmentation, mind uploading, or ‘jacking in’ to experience cyberspace through a virtual avatar. In the real world, researchers have found that the human brain utilizes a ‘body schema’ to control the body and interpret sense data received through it, and that the brain displays a significant ability to update its body schema to reflect bodily changes resulting from growth, illness, injury, or the addition of prosthetic devices. However, it is unknown how dramatically a human body can be transformed before the brain loses its ability to communicate with and control it. This question of whether the human mind can interact with the world without the use of a human body has occupied philosophers from the times of Aquinas and Descartes through the present day. Here we argue that video games can play a crucial role in aiding us to solve this mystery – and thus in ascertaining the extent to which the reengineering of the form and function of the human body envisioned by many transhumanist and posthumanist thinkers may or may not be possible. We begin by suggesting that differences in how body transformation is depicted in fantasy versus science fiction games reveal game designers’ implicit insights into the limits of our brain’s ability to adapt to a changed body. We then argue that the sensorimotor feedback loop experienced while playing video games – which is not present in other media such as books or films – creates a unique opportunity to explore how greatly the human brain’s body schema can be extended or transformed to accommodate the possession of a radically non-human body. In this fashion, the designers and players of computer games are working at the frontiers of an emerging field of ‘body schema engineering.’ Their experiences will help humanity understand the extent to which it may or may not be possible to develop posthuman technologies such as xenosomatic prosthetics (which provide a human mind with the experience of possessing a body radically different from its natural human body), neosomatic prosthetics (which physically replace all of a person’s body apart from the brain with a synthetic housing that may or may not resemble a human body), and moioprosthetics (specialized neosomatic prosthetics that encase the human brain within a standardized ‘cyberbrain’ that can be easily swapped among different robotic ‘cybershells’ in the form of humanoid or animal bodies, vehicles, or buildings). Finally, we suggest that reflecting on computer gamers’ in-game experiences of possessing and utilizing non-human bodies can help us to anticipate and understand the novel psychological conditions – whether disorders or enhancements – that may result from the long-term use of body-altering neuroprosthetics. Through their exploitation of video games’ body-transforming capabilities, gamers can become pioneers and heralds of new posthuman ways of existing and interacting with reality.
Article
This paper looks at some of the different practical cyborgs that are realistically possible now. It first describes the technical basis for such cyborgs and then discusses the results from experiments in terms of their meaning, possible applications, and ethical implications. An attempt has been made to cover a wide variety of possibilities. Human implantation and the merger of biology and technology are important factors here. The article is not intended to be the final word on these issues, but rather to give an initial overview. Most of the experiments described are drawn from the author’s personal experience over the last 15 years.
Article
Human enhancement—the attempt to overcome all human cognitive, emotional, and physical limitations using current technological developments—has been said to pose the most fundamental social and political question facing the world in the twenty-first century. Yet, the public remains ill prepared to deal with it. Indeed, controversy continues to swirl around human enhancement even among the very best-informed experts in the most relevant fields, with no end in sight. Why the ongoing stalemate in the discussion? I attempt to explain the central features of the human enhancement debate and the empirical and normative shortcomings that help to keep it going. I argue that philosophers of science are especially well equipped to rectify these shortcomings, and I suggest that we may be deeply remiss if we don’t do so.