
Digital Threats to Humans and Society



This preprint summarises some of the threats of dual and primary uses of digital technology and calls for a new paradigm for the digital society of the future.
Digital Threats to Humans and Society, summarized by Dirk Helbing
We herewith acknowledge that digital technologies have great potential to contribute significantly to solutions of major global problems (including health- and sustainability-related ones). They are also fueling innovation and new economic sectors. Many documents have covered this and have described the business value and societal benefits.
This document, in contrast, will try to give an overview of threats and undesired side effects
of digital technologies, highlight potential misuses, dual uses, systemic risks, and potential
accidents. As it turns out, the damage may be large-scale and perhaps even bigger than the
damage that classical weapons may cause. It is not clear whether our societies are well
prepared to avoid or handle these risks. Many of the side effects (such as impacts on
democracy and human rights) may be hard to quantify and become visible only with
considerable delays.
Furthermore, note that, to non-insiders, some of the following may read a bit like science fiction. However, considering that the digital revolution accelerates exponentially and that military technology in some countries is decades ahead of publicly known business applications, it must be assumed that most of the technological developments discussed below are not just future possibilities, but already exist, which calls for proper attention.
It is well known that the digital revolution is highly disruptive. It not only affects business models, but transforms entire areas of society, which were formerly separate and managed according to different values, goals, and principles. Now, digital technologies are being used
to re-invent more or less everything, and all sectors of society. They are pervasive and fuel
disruptive innovations everywhere.
It must also be considered that data volumes and processing power have increased
dramatically. Systems have also become much more connected, and are organized as
networks of networks. While this implies many benefits, it also introduces new vulnerabilities,
such as systemic failures due to cascading effects (e.g. blackouts).
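The cascading-effect mechanism can be illustrated with a minimal sketch (a toy load-redistribution model in Python; the ring topology, loads, and capacities are invented for illustration and are not from the text):

```python
# Toy model (illustrative assumption): each node carries a load; when a node
# fails, its load is redistributed to its surviving neighbours, which may
# push them over capacity and trigger secondary failures -- a cascade.
def simulate_cascade(edges, load, capacity, initial_failure):
    neighbours = {}
    for a, b in edges:
        neighbours.setdefault(a, set()).add(b)
        neighbours.setdefault(b, set()).add(a)
    failed = {initial_failure}
    frontier = [initial_failure]
    load = dict(load)
    while frontier:
        node = frontier.pop()
        alive = [n for n in neighbours[node] if n not in failed]
        if not alive:
            continue
        share = load[node] / len(alive)  # redistribute the failed node's load
        for n in alive:
            load[n] += share
            if load[n] > capacity[n]:    # overload -> secondary failure
                failed.add(n)
                frontier.append(n)
    return failed

# A ring of 6 nodes, each running close to its capacity limit:
# a single initial failure topples the entire network.
edges = [(i, (i + 1) % 6) for i in range(6)]
load = {i: 0.9 for i in range(6)}
capacity = {i: 1.0 for i in range(6)}
print(sorted(simulate_cascade(edges, load, capacity, initial_failure=0)))
```

The same topology with generous capacity margins would contain the failure locally, which is the point: vulnerability comes less from the initial fault than from how tightly coupled and loaded the network is.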
Let us now briefly address various challenges that are emerging or already present. Given the page limits of this case study, we will keep well-known issues short, while discussing less well-known problems in greater detail.
Big Data and Data-Driven Society
By now it has become common to manage many systems in a “data-driven” way, using Big
Data and real-time analytics. The quality of the resulting solutions, however, depends on the
quality of the data and the ability to turn data into information, knowledge, and wisdom. This
is often complicated by issues such as the following:
Exact measurements "are obstructed by biases, randomness, turbulence, … chaos theory, … quantum mechanics, … undecidability …, and overfitting, to mention just some of the problems… Given the classification problem[s] of false positives, there are even cases where results deteriorate [when more measurements are made]".
Classification errors (such as "false positives" and "false negatives") are a widespread problem in statistical analyses, not just in the case of data samples that are biased (i.e. non-representative). In the case of predictive policing algorithms, for example, the "false positive" rate is often above 90%.
Many patterns in the data are insignificant and meaningless. They show up only by
coincidence. Therefore, the risk of “overfitting” is serious.
Correlations do not necessarily mean causation, and even if causation is given, it is
often difficult to determine whether A causes B, B causes A, or a third factor C
causes A and B.
Parameter fitting is only possible with a finite accuracy, but slightly different
parameters may lead to largely different outcomes, as phenomena such as
deterministic chaos, turbulence, and the related “butterfly effect” illustrate. In such
cases, the “sensitivity” of the model can be a problem.
A good fit to a dataset does not yet mean a validated model. This requires testing the model with different datasets without re-adjusting parameters. Only validated models,
however, can be assumed to have meaningful implications for different settings or to
predict future developments to some extent.
Sensitive models are not very suitable to predict the future. They imply a high degree
of uncertainty. (Still, they may reveal the instability of a system, which can be also
important to know.)
Complex, networked systems often imply “wicked problems” and unexpected
behaviors. Moreover, large variations (including “black swans”) may occur more often
than statistically expected based on a normal distribution.
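The sensitivity and "butterfly effect" points above can be demonstrated with the logistic map, a standard textbook example of deterministic chaos (the parameter r = 4 and the perturbation size are illustrative choices, not from the text):

```python
# The logistic map x_{t+1} = r * x_t * (1 - x_t) is fully deterministic,
# yet in its chaotic regime (r = 4) a tiny error in the initial condition
# grows exponentially until the long-term forecast is worthless.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-6)  # initial condition off by one millionth

print(abs(a[5] - b[5]))    # still tiny: the short-term forecast is fine
print(abs(a[40] - b[40]))  # typically of order 1: long-term forecast fails
```

Both runs use the identical model and parameters; only the measurement of the starting point differs, which is why such sensitive models imply a high degree of uncertainty even when the model itself is "correct".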
Digital “Crystal Ball”
Despite the issues mentioned above, large IT companies are increasingly using Big Data to
model real-world processes and systems. With massive surveillance, one even tries to
produce something like a digital “Crystal Ball”. In fact, this is what Palantir, for example, is
aiming to do.
However, there are also other companies, such as Recorded Future (apparently a spin-off of Google and the CIA). It seems that the World Economic Forum (WEF) has similar infrastructures. These may not only perform real-time analytics of what happens in almost any place in the world ("nowcasting"), but also try to predict the future ("forecasting"). The military has certainly built a "prediction machine" as well.
Artificial Intelligence
While Big Data is often being called the “new oil of the digital age”, the “digital motor” running
on it is Artificial Intelligence, which is based on machine learning algorithms.
Machine learning models today may aim to learn millions, billions, or even more parameters. However, the interactions between the system elements are often poorly represented (even in times of the Internet of Things). Besides, slow convergence (or a lack of convergence) of the learning algorithms may be an issue, particularly in dynamically changing environments. This can produce a bad system representation or wrong forecasts.
Surprisingly, simpler models can often have more predictive power than complicated ones (as the above issue of "over-fitting" illustrates). Overall, one can say by now that Big Data has not made "the scientific method obsolete", in contrast to what was famously claimed by Chris Anderson. According to today's knowledge, there are fundamental limits to the accuracy of modelling complex dynamical systems such as climate, life, behavior, or health.
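The over-fitting point can be made concrete with a small experiment (an illustrative sketch using NumPy; the linear ground truth, noise level, sample sizes, and polynomial degrees are assumptions made here for demonstration):

```python
import numpy as np

# Ground truth is a simple line; we only ever observe it with noise.
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 15)
x_test = np.linspace(0.0, 1.0, 101)
y_train = 2.0 * x_train + 1.0 + rng.normal(0.0, 0.2, x_train.size)
y_test = 2.0 * x_test + 1.0 + rng.normal(0.0, 0.2, x_test.size)

def fit_and_score(degree):
    # Least-squares polynomial fit on the training sample only.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

# A degree-12 polynomial "explains" the training noise almost perfectly,
# but that apparent accuracy does not carry over to fresh data.
for degree in (1, 12):
    print(degree, fit_and_score(degree))
```

The complicated model wins on the data it was fitted to and loses on data it has never seen, which is exactly why a good fit alone does not make a validated model.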
A further frequent criticism of AI algorithms is that they are like "black boxes". In other words, it is often not understandable how they come to their conclusions. In fact, as recent studies on discrimination against minorities, women, and people of color have shown, outcomes of AI systems need to be questioned. To counter these issues, huge efforts are now being made to work on trustworthy, explainable, and fair AI.
Surveillance Capitalism
All around the world, the collection of data has become a lucrative opportunity. Two kinds of
systems are typically distinguished: (1) societies with state-based surveillance and control
(such as China), and (2) societies where surveillance is assumed to be carried out mainly
through corporations (as in the USA). While labels such as “technological totalitarianism” are
often used for the former, the term “surveillance capitalism” represents mainly the latter.
Surveillance capitalism is perhaps best characterized by the following quote from Eric Schmidt,
the former Google CEO:
“With your permission, you give us more information about you, about your friends,
and we can improve the quality of our searches. We don't need you to type at all. We
know where you are. We know where you've been. We can more or less know what
you're thinking about.”
Both surveillance capitalism and state-run digital societies imply dangers for human rights and human dignity, in particular for privacy. However, it does not stop there. Typical
additional features are profiling, scoring, and targeting, as will be discussed below. Overall,
these developments are increasingly seen as potential threats to democracies. In particular,
terms such as “data dictatorship” try to warn of the dangers of data-driven behavioral
manipulation (see below).
War Room Approach
The management or control of systems, based on huge amounts of data, is typically done
through a control room, also often called a “war room” (which may utilize the “crystal ball”
approach mentioned above). This approach is used not only by secret services and the
military, but increasingly also for the operation of modern companies, supply chains, and
smart cities.
A data-driven, “technocratic” approach, however, has some limitations. For example, while
production facilities may maximize a certain goal function, cities and societies have multiple
competitive goals, which must all be addressed simultaneously to ensure a thriving social
system. If one wanted to optimize the future of planet Earth with technical means, however,
one would have to choose a goal function, even though there is no science to determine the
right one. For example, should it be GDP per capita, sustainability, life expectancy, or quality
of life? Whatever goal function is chosen, it will (have to) map the complexity of the system
to a one-dimensional function, which will oversimplify the system and neglect secondary and
tertiary goals. Therefore, sooner or later one would end up having new kinds of trouble.
Hence, running a society based on the paradigm of a data-empowered “benevolent dictator”
is expected to perform poorly.
A “military-style”, centralized top-down control approach would also undermine democracy,
which is based on principles such as diversity, participation and collective intelligence. Due
to the plurality of goals, cities and societies should, in fact, neither be run like businesses nor
machines. A co-evolutionary approach may outperform optimization, and a coordination
approach may outcompete control.
To illustrate the dangers of optimization, let us assume an AI system that would be tasked to
make the world sustainable. What if it concluded that the easiest way to reach sustainability
would be depopulation, even though a better future might exist for the people of the world?
This could trigger a horrible scenario (see below). To avoid such problems, it has been
demanded that “war rooms” be turned into “peace rooms”.
It is also important to consider that “evidence-based” decision-making can mean two things:
“fact-based” (determined via established scientific methods of verification and falsification) or
“data-driven” (based on measurement or estimates, projections, or forecasts). The two are
often not the same, but the former approach is increasingly replaced by the latter. This
creates new risks: A data-driven approach may be vulnerable to misinterpretation and bias
(see above), but also to manipulation or deception. For example, many political decisions in
response to Covid-19 were based on data or even on predicted data (in fact, often on
forecasts, which never materialized).
This could have caused various counter-productive results. Furthermore, a data-driven
approach is prone to hacking.
Cyber Threats (Cyber Vulnerability)
This paragraph is kept short, as the subject of “cyber threats” has already received a lot of
attention in recent years. New issues, however, arise, when adversaries use powerful AI for
their attacks, for example, to discover security gaps that can be used for “zero day exploits”.
Due to the exponential (or even faster) increase in cyber risks and personalized propaganda,
some people think the way the Internet is organized is now too vulnerable and outdated.
According to them, the current Internet should be replaced by a satellite-based system,
probably combined with quantum-encryption. This might reduce possibilities to manipulate
Internet contents and compromise data communication a lot. However, it would also imply a
large degree of control over a big share of information, by very few people, who will not
necessarily act in the best interest of all people. Also note that we may soon see light-based communication (LiFi), which offers much higher data transmission rates.
Profiling and Digital Doubles
Both companies and governments are collecting increasing amounts of data about individuals. This process is called "profiling", i.e. the creation of personal profiles. These
profiles are producing ever more detailed representations of people, objects, and planet
Earth. As the degree of detail increases, one speaks of “avatars” (animated representations)
or even “digital twins” (which assume that all relevant characteristics of the system of interest
are being digitally reproduced).
Such digital twins are being envisaged, for example, for companies and cities, but also for the entire planet, its inhabitants, their behaviors, health, bodies, and personalities. Clearly, this implies huge privacy issues, but not only: it makes everyone vulnerable to "hacking" (e.g. of emotions, thinking, behavior, and/or health; see below for more details).
World Simulation
The avatars and digital doubles would not only be data collections. Their behavior would
also be simulated and animated. That is, they would have a virtual life, which would allow for
digital “what … if” experiments, before a particular implementation is chosen. One such
platform created for "world simulation" is called "Sentient World". It has been used to produce a "military second Earth" and is perhaps the main reason for the mass surveillance revealed by Wikileaks (CIA Vault 7) and Edward Snowden (NSA). The platform seems to go back to Fortune 500 companies, who are using it for their strategic planning. However, "Sentient World" is apparently also being used to plan war operations and perform psychological operations (PsyOps), potentially also in peace times. (The way politics and the public think and talk about Covid-19 might be an example.) The tool should further be seen in the context of the controversial Information Dominance strategy, which seems to include the problematic concept of "Mind War".
Attention Economy and Nudging
In an information-rich age, it has been claimed that the scarcest resource may actually not be money, but attention. Whoever manages to catch our attention has the chance to influence our emotions, thinking and behavior. As only a small fraction of information is consciously
processed, we may also be manipulated in a subliminal way, for example by “nudging”.
While people are on the Internet or social media platforms, they are constantly being
exposed to nudges and advertisements. In the system of “surveillance capitalism”, there is
an auction-based market for such ads. In other words, everyone can bid to get the attention
of Internet users. This does not only consume a lot of their time. It also absorbs a lot of intellectual capacity that would probably be better used to solve the world's existential problems.
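The auction-based ad market mentioned above can be sketched in a few lines (a deliberately simplified second-price model; real ad exchanges add quality scores and other factors, and the bid figures are invented):

```python
def run_ad_auction(bids):
    """Second-price sealed-bid auction: the highest bidder wins the
    user's attention slot but pays only the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Three advertisers compete for a single impression of a user's attention.
print(run_ad_auction({"A": 2.50, "B": 1.75, "C": 0.90}))  # ('A', 1.75)
```

The second-price rule is commonly described for online advertising because it encourages truthful bidding; the key point for the text is simply that every moment of attention is sold to the highest bidder.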
Censorship and Propaganda
Attention economics (specifically the approach of absorbing a great share of people's attention capacity) can also be used for new forms of censorship and propaganda. Most search
engines, social media and Internet-based services are now personalized. This means that
algorithms increasingly decide what offer someone receives and what information he or she
sees. Algorithms also determine how many people see what kind of information, and who is
to see what. In this way, it is possible to determine where and how far certain information
spreads. This fact can be used to make some kinds of information (e.g. confidential or sensitive information) virtually invisible (even without deleting it), or to amplify other kinds of information.
Conformity and Distraction
In conclusion, the methods mentioned before can be used for distraction (which is why social
media have also been called “weapons of mass distraction”). However, they can also be
used to promote (forced) consensus ("Gleichschaltung"). Both ways of "social engineering" of communities (promoting cooperation and convergence, or conflict and divergence) have been reported (namely in Edward Snowden's JTRIG revelations).
Note that the promotion of a single perspective can undermine diversity and pluralism, which are important preconditions for innovation, societal resilience, and collective intelligence.
Hate Speech
The promotion of hate speech has a toxic effect, as it undermines trust, solidarity, and the
coherence of a community. Such a “divide et impera” strategy can undermine the basis of
any society. One might even talk about sedition ("Zersetzung"). Hate speech tends to spread because it attracts more attention. It emotionalizes exchanges on the Internet and thereby makes people spend more time on social media platforms, which is of
commercial interest. Note that a lot of hate speech comes from troll farms. Social bots and
language-generating AI systems such as GPT-3 may contribute to the problem.
Assuming always bad intent, however, gives an incomplete picture. Some digital visionaries seem to believe that it is right to decompose society into its "atoms", the individuals. This makes it easier to manipulate the behavior of people using Artificial Intelligence. In perspective, society would become a system in which Artificial Intelligence (superintelligent or not) would control the thinking, emotions, behaviors, and lives of individuals. For further
details see, for example, the sections on “targeting” and “transhumanism”.
Fake News and Disinformation
Overall, the attention economy makes it difficult to determine facts and to focus on them.
This undermines education, science, and dialogue, hence, the basis of modern democratic
societies, which believe in learning, insight, truth, enlightenment and a responsible, self-
determined life. It also creates new information asymmetries and, based on the principle of
“knowledge is power”, advantages for a small new digital elite. If this problem is not
addressed, soon, democracies may give way to a new kind of digitally based feudalism.
Targeting and Behavioral Manipulation
When nudging is combined with Big Data to personalize nudges, this is called "big nudging". The method is being used not only by military propaganda, but also for commercial "neuromarketing".
A major share of Artificial Intelligence capacity today is being used for behavioral
manipulation. It might also be misused to manipulate democratic elections, which has been
claimed for the Brexit vote and US elections 2016. As Cambridge Analytica insiders have
revealed, this military-style propaganda method has apparently been used in about 65 countries. Moreover, it appears these methods are being used to shape political systems
not only during elections. They are probably being applied as well to “socially engineer”
societies during peace times on an everyday basis.
Citizen Scores and Behavioral Control
Besides profiling and targeting, the instruments of digital societies also include the use of citizen scores. Multiple scores are in use. They encompass, for example, the "customer lifetime value" and the "social credit score" in China, which bears similarities with the "Karma Police" program of the British GCHQ.
The above scores are super-scores, which inappropriately condense the "value" or status of a person into a single number, which determines access to resources and services, as well as rights. Besides considering wealth or health, such scores may also be based on
behavior, using surveillance. As a consequence, they are instruments of behavioral control.
In this connection, the term "technological totalitarianism" is often being used. In fact, recently, the "social credit score" has come under heavy criticism, particularly in connection with the treatment of Uyghurs.
Digital Policing
A somewhat similar concept is digital policing. Many countries have tested or used predictive
policing programs, which try to predict future crimes based on past recorded crime patterns.
Somewhat similar to the movie "Minority Report", the goal is to intervene before crime happens. This might imply restrictions (such as "geofencing") for people who have actually never committed a crime. Also, prison sentences may depend on algorithmic assessments.
Digital policing has been criticized for its involvement of secret service-like activities
(surveillance) and the lack of separation from police action (i.e. executive function), but also
for its lack of transparency and democratic oversight. A further concern is systematic
discrimination against people of color and minorities. Moreover, the rate of false positives is
huge, often above 90%. This implies potentially a lot of arbitrariness in the action taken.
While lists of potential terrorists often contain millions of names, only 1 in 100,000 alarms seems to be related to a person who has actually committed an act of terror.
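The arithmetic behind such numbers is the classic base-rate effect, which can be checked directly (the sensitivity and per-person false-alarm rate below are illustrative assumptions; only the rarity of actual offenders echoes the 1-in-100,000 figure discussed above):

```python
def positive_predictive_value(prevalence, sensitivity, false_alarm_rate):
    # Bayes' rule: P(actual offender | alarm)
    true_alarms = prevalence * sensitivity
    false_alarms = (1.0 - prevalence) * false_alarm_rate
    return true_alarms / (true_alarms + false_alarms)

# Suppose a screening system catches 99% of true cases and wrongly flags
# only 1% of everyone else. If actual offenders are 1 in 100,000 people,
# the vast majority of alarms still point at innocent people:
ppv = positive_predictive_value(prevalence=1e-5, sensitivity=0.99,
                                false_alarm_rate=0.01)
print(ppv)  # about 0.001, i.e. roughly 999 out of 1000 alarms are false
```

Because the condition being screened for is so rare, even a highly accurate classifier is almost always wrong when it raises an alarm, which explains why such watchlists produce overwhelmingly false positives.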
Code Is Law
As processes in our economy, society and environment are increasingly monitored by the
Internet of Things and algorithmically managed and controlled, we are faced with a situation
that has been described as “code is law”.
According to this, processes in our society, access to goods and services, and what is
doable or not, are increasingly determined by algorithms (“code”). These add restrictions to
the action space, almost as if they were "laws of nature". Such restrictions interfere with freedom rights. Whereas people could previously decide to violate laws at the risk of punishment, if they had good reasons for this, this possibility may no longer exist in the future.
In the age of “industry 4.0”,
there is a danger that algorithmic approaches would be
inappropriately transferred from objects to subjects, from robots to people, and from
production to society. Such a data-driven approach misses out on many hardly measurable qualities of life that matter for humans, such as human dignity, freedom, creativity, and culture. Despite this, society might increasingly be automated and run like a machine, which
threatens to undermine diversity, innovation, societal resilience, and collective intelligence.
Cashless Society and Digital Command Economy
One framework in which the principle of “code is law” may play out particularly harsh is a
cashless society, which is being discussed as a possible element of the emerging “digital
command economy”.
Such developments may, intentionally or not, be promoted by the Sustainable Development Goals of the Agenda 2030 and by the Planetary/Global Health
Agendas. Here, the access to goods and services might be made dependent not only on
available budgets or previous consumption patterns (“too much meat”, “too many flights”...).
It could also be coupled across sectors (e.g. car rental may not be possible until the
apartment rent is paid), or coupled to behavior, as in the social credit score discussed
before. That is, criticizing the political system might impact access to critical resources.
A cashless society may also perform resource management based on the controversial
principles of triage. Rather than offering everyone equal rights of access, there could then be
privileged cases that would be prioritized, hopeless cases that would get no access to
certain resources at all, and other cases that might get access to what is left. In such a
context, total submission to what the system expects might become a precondition for
survival. The current discussion on whether unvaccinated people shall still be treated in
hospitals or triaged away demonstrates this trend quite clearly.
Electronic ID
One key element of security and control concepts is a unique, forgery-proof identity (even
though it is known from complexity science that “control variables” typically do not relate to
individual system components). In this connection, biometry has been a significant concept
(despite previous failed attempts to characterize good and bad people by their genes and
physiognomy). Fingerprints and face recognition are probably the most well-known features
used, but have been questioned due to database leaks, forgery, and privacy concerns.
Secret services use digital forensics (based on features of smart devices, software installed,
usage patterns, and location tracking). This allows them to identify people with extremely
high accuracy.
To enforce certain behavioral and consumption patterns, it seems that further kinds of
electronic IDs have been considered and explored, such as body-based IDs, using
nanoparticles. Project Jumpstart has apparently worked on such an e-ID. Probably, the id2020 consortium has also been working on such solutions. Furthermore, there are various relevant publications and patents pointing in similar directions.
While such solutions may serve the purpose of making people manageable and controllable
like things, they would simultaneously destroy the very essence of human dignity, which is
the foundation of democracies around the world and a key value protected by the UN
Human Rights Charter.
Internet of Bodies
The development of nanoparticle-based technologies obviously goes beyond the
development of e-IDs. Nanoparticles and nanobots can be used in medicine for diagnostics
(surveillance of body functions) and treatments (interference with body functions). They may
also be used for gene editing, at least in perspective.
Some of the related developments have been summarized under the label "Internet of Bodies". The World Economic Forum, one of the main promoters of Internet-of-Things-based Industry 4.0, has repeatedly pointed out that "The Internet of Bodies is here".
Unfortunately, there are many issues with using (or misusing) such technologies, and many
of them are unsolved. In fact, nanotechnology so far is largely unregulated.
It seems that calls for urgent political control of the Internet of Bodies have not been effective so far. This raises serious concerns, as nanoparticles can be absorbed by human bodies through food, water, air, drugs, and vaccines. Unfortunately, it appears that informed consent has often not been given in advance. In fact, many people have been exposed to non-natural nanoparticles, and some of them are toxic.
Neurotechnology: Reading and Controlling Minds
Recently, there has also been increasing news about the development of neurotechnologies.
So far, most of this news was focused on brain chips, as engineered by “Neuralink” and
other companies.
However, the technological development seems to be much further along, particularly in relation to military applications. In the meantime, researchers and engineers are apparently working on human-machine interfaces (HMIs), which are based on dispersing nanoparticles in the brain. The so-called "Obama Brain Project" has supported related kinds of research with several billion dollars.
Similar research is also performed in Europe, apparently based on substances such as
graphene oxide.
The pharma industry is certainly involved in these developments. Further relevant literature can be found under keywords such as "smart dust" and "neural dust".
“The Matrix”
The above technological developments threaten not only the freedom of thought (i.e. that nobody else would know one's thoughts), but also the freedom of will. Even though this sounds like
science fiction, there is a danger that it might become possible to manipulate people’s
minds, emotions, and thoughts to an extent that would seriously endanger the autonomy of
humans. In perspective, people may become part of (and controlled by) a giant hybrid
computer system.
Concerns about this have, in particular, been raised by a number of tech billionaires, who have suggested that reality might be a computer simulation. In societies based on the principle "code is law", this is certainly so to an ever increasing extent. Inspired by the
related movie series, such an augmented reality has often been framed as “The Matrix”. It
should be stated, however, that even weaker forms of manipulation such as “big nudging”
may produce effects reminiscent of what is called “The Matrix”, i.e. a digitally managed world
one might hardly be able to escape from.
Neurotechnologies offer attractive perspectives for corporations. While surveillance
capitalism seems to be restricted to surveilling people and inferring their thoughts, emotions,
and behaviors, the next stage of this data-driven system seems to go further. In this system, it would also be possible to steer the thoughts, emotions, and decisions of people, and to shape their ideas, memories, and values through computer-based control. In
fact, labs are already working on dream advertising, i.e. implanting dreams that would make
people buy certain products.
It is obvious that such an approach, which may be somewhat comparable to hypnosis, bears
great potential for misuse, against which people may not be able to defend themselves. This
includes deception that could be more realistic than deep fakes, and may go so far as to
involve people in crimes or accidents against their will.
Technological Convergence and Transhumanism
For those who argue for a technology-driven world and for doing everything that is doable, it
is clear that we would see a great technological convergence. This means that basically all technologies, including electrical, computational, neuronal, cognitive, genetic, information-, and nanotechnology-based ones, would merge. This development would also bring about an
eventual human-machine convergence. People would start upgrading themselves with
technological implants, turning them into superior cyborgs. Eventually, humans and
machines might even be hardly distinguishable. The proponents of this transhumanist idea
often believe that humans would, in fact, be replaced by non-biological forms of life.
Singularity and Superintelligence (“Digital God”)
According to Moore’s Law, processing power grows at an exponential, i.e. ever-accelerating
rate. If this trend continues, it is expected that the processing power of supercomputers
would eventually surpass the processing power of human brains. Shortly after that point, called the "singularity", transhumanists believe that a universal "superintelligence" would take over control of human affairs and the planet, and assume "God-like" power. In this system,
humans might become something like “cells” of a digitally connected “meta-body”, the brain
of which would be the previously mentioned superintelligent system. Humans might become
an integral part of this “superintelligent” system, possibly mainly executing its commands.
“Apocalyptic AI”
According to transhumanists, the aforementioned technological singularity would connect us with the entire world; it would come with a cognitive shift called "transcendence" and make us feel like Gods. Furthermore, some transhumanists identify this singular shift with apocalyptic elements of Judaic and Christian theology. The following quote from Robert M. Geraci's non-fiction book on "Apocalyptic AI" may give an impression:
“Apocalyptic AI authors promise that intelligent machines, our “mind children,” according to Moravec, will create a paradise for humanity in the short term but, in
the long term, human beings will need to upload their minds into machine bodies in
order to remain a viable life-form. The world of the future will be a transcendent
digital world; mere human beings will not fit in. In order to join our mind children in life
everlasting, we will upload our conscious minds into robots and computers, which will
provide us with the limitless computational power and effective immortality that
Apocalyptic AI advocates believe make robot life better than human life.”
Death by Algorithm
As stated before, according to this transhumanist ideology, “mere humans will not fit in” the world after the singularity, and would not remain a viable life-form. They would not be supported by the digital world of the future. Artificial Intelligence might, therefore, be given the power to decide about the life and death of people. With the increasing use of algorithms to make triage decisions, such a development already seems to be under way.
Furthermore, the discussion around denying hospital treatment to unvaccinated people suggests that vaccination against Covid-19 might effectively turn into something like an “entry ticket” to the transhumanist age in the making. At the same time, there is little or no evidence that the envisaged transhumanist future would actually be a viable form of life over an extended period of time (say, a thousand years). It must be warned that humanity may be engaging in an extremely risky and probably highly irresponsible experiment here.
Autonomous Weapons
It cannot be excluded that the developments described above may end with the death of many people. Depopulation as a result of the early deployment of technology, before proper testing could be done, seems to be a possibility. It is particularly concerning that effective precautions against the use of autonomous weapons do not seem to exist. Killer drones and killer robots are just two of many possible elements of future wars. It is conceivable that nanotechnology and perhaps even 5G, which is based on directed energy beams rather than diffusive radiation (in contrast to 4G), might be turned into weapons that could be used against the people of a country, also in a personalized way (e.g. based on scores). Perhaps more importantly, however, there seem to be only very few studies about the possible health impacts of nanoparticles in the human body interacting with electromagnetic radiation. The possibility of adverse effects or even weaponization should certainly be considered, and suitable precautions taken.
Hence, ABC (atomic, biological, and chemical) weapons are not the only weapons one needs to be concerned about. Even though there is little public information about this, digital and nanoparticle-based weapons might be more devastating than drone attacks, killer robots, or even nuclear blasts, while they could also be used by non-state actors.
Scenario Horribilis
Obviously, some of the above developments could come together and reinforce each other in a dangerous way. For example, given that the world financial crisis of the fiat currency system is far from solved, the possibility of a major financial crash and bankruptcy cascade is significant. The response to this situation and the resulting supply shortages might be the introduction of an Internet-of-Bodies-based e-ID, even though this would establish a totalitarian cashless society and eliminate human dignity. Due to the supply shortages, many people might die as a result of algorithm-based triage decisions using sensitive personal data. Clearly, measures must be taken to avert such scenarios, particularly as alternative solutions to sustainability challenges do exist.
Summary and Conclusions
Humanity is faced with a myriad of technological innovations, which are happening at an
exponentially accelerating pace. As outlined above, the following trends are observed:
1. A New Economy: Big Data, AI, Surveillance Capitalism, Attention Economy
(Profiling, Targeting, Digital Twins, e-IDs, Cashless Society)
2. A New Politics: Digital Censorship and Propaganda, Scoring, Behavioral
Manipulation (Big Nudging)
3. A New Legal System: Code is Law, Digital Policing (PreCrime), Social Credit
Score, Karma Police, Death by Algorithm
4. A New Human: Transhumanism, Mind Reading/Mind Control, Neurocapitalism
5. A New “God”: Singularity, Superintelligence, “The Matrix”, “Apocalyptic AI”
These developments can shatter the very foundations on which our society was built. Overall, they have not strengthened democratic institutions lately. On the contrary: there are unprecedented threats to freedom, democracy, dignity and human rights, peace, the right to life, and, in view of the emerging transhumanist trend, even the existence of humanity as we know it. Hence, these developments may dismantle our society and might be more dangerous to it than conventional war or terrorism. All traditional institutions of our society are under attack, or threatened by disruptive innovations. It is questionable, however, whether these revolutionary changes serve the interest of the great majority of people. At least, there does not seem to be sufficient democratic or political legitimacy for them at the moment, while financial gains do not seem to be a good basis for such grave decisions.
It must be stressed that the above developments are not without alternatives. Digital technologies can also be used in different, humane ways, without having to give up on environment- or health-related goals. However, this requires a different perspective, paradigm, and approach, and the proper decisions need to be taken soon. Even though better, self-determined human futures are conceivable (based, for example, on participatory resilience, socio-ecological finance, democratic capitalism, and digital democracy), politics and business seem rather late in implementing frameworks that would empower citizens and strengthen civil society, while supporting a symbiotic, sustainable relationship with nature. Most likely, however, it is not too late to change this.
References
Helbing, D. (2013). Globally networked risks and how to respond. Nature, 497(7447), 51-59. (accessed on Nov. 11, 2021)
Vasiliauskaite, V., Antulov-Fantulin, N. and Helbing, D. (May 2021). Some Challenges in Monitoring Epidemics, in print. (accessed on Nov. 9, 2021)
Quote from: Arcaute, E., Barthelemy, M., Batty, M., Caldarelli, G., Gershenson, C., Helbing, D., Moreno, Y., Ramasco, J. J.,
Rozenblat, C. and Sánchez, A. (Sep., 2021). Future Cities: Why Digital Twins Need to Take Complexity Science on Board,
Preprint. (accessed on Nov. 9, 2021)
Palantir Technologies. (accessed on Nov. 9, 2021)
Ashford, W. (Aug. 3, 2010) CIA and Google invest in high-tech crystal ball technology, (accessed on
Nov. 9, 2021)
World Economic Forum, Centre for Cybersecurity, (accessed
on Nov. 9, 2021)
Tucker, P. (Apr. 8, 2015) Can the military make a prediction machine?,
military-make-prediction-machine/109561/ (accessed on Nov. 9, 2021)
Anderson, C. (Jun. 23, 2008) The End of Theory: The Data Deluge Makes the Scientific Method Obsolete, (accessed on Nov. 9, 2021)
Pasquale, F. (2016) The Black Box Society (Harvard University Press, Cambridge, MA).
Raji, I.D., Gebru, T., Mitchell, M., Buolamwini, J., Lee, J. and Denton, E. (2020) Saving face: Investigating the ethical
concerns of face recognition auditing. AIES’20, Feb. 7-8, 2020, New York, NY, USA.
Zuboff, S. (2020) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power
(PublicAffairs, New York).
Saint, N. (Oct. 4, 2010) Google CEO: “We Know Where You Are. We Know Where You’ve Been. We Can More Or Less
Know What You Are Thinking About.”,
youve-been-we-can-more-or-less-know-what-youre-thinking-about-2010-10 (accessed on Nov. 9, 2021)
Helbing, D. and Pournaras, E. (2015) Society: Build digital democracy, Nature 527: 33-34, (accessed on Nov. 9, 2021)
Helbing D., Frey, B. S., Gigerenzer, G., Hafen, E., Hagner, M., Hofstetter, Y., van den Hoven, J., Zicari, R. V. and Zwitter
A. (Feb. 25, 2017) Will Democracy Survive Big Data and Artificial Intelligence?, (accessed on Nov. 9, 2021)
Helbing, D. and Seele, P. (2017) Turn war rooms into peace rooms. Nature 549: 458, (accessed on Nov. 9, 2021)
Johns Hopkins University and Medicine, Covid-19 Dashboard, (accessed on Nov. 9, 2021)
Dawkins, D. (Aug. 27, 2021). Jeff Bezos And Elon Musk Are Now Butting Heads Over The Small Satellite Internet Business.
small-satellite-internet-business/ (accessed on Nov. 11, 2021)
Arcaute, E., Barthelemy, M., Batty, M., Caldarelli, G., Gershenson, C., Helbing, D., Moreno, Y., Ramasco, J. J., Rozenblat, C.
and Sánchez, A. (Sep., 2021). Future Cities: Why Digital Twins Need to Take Complexity Science on Board. [Preprint.] (accessed on Nov. 9, 2021)
Baard, M. (Jun. 23, 2007) Sentient world: war games on the grandest scale, (accessed on Nov. 9, 2021)
Sterling, B. (Jun. 25, 2007) A Military Second Earth, (accessed on Nov. 9, 2021)
Faggella, D. (May 19, 2019) Sentient World Simulation and NSA Surveillance Exploiting Privacy to Predict the Future,
(accessed on Nov. 9, 2021)
Synthetic Environment for Analysis and Simulations,
Endsley, M., Jones, W. M. and LOGICON TECHNICAL SERVICES INC DAYTON OH (1997) Situation Awareness
Information Dominance & Information Warfare, (accessed on Nov. 9, 2021);
Miller, D. (2004) Information dominance: The philosophy of total propaganda control, (accessed on Nov.
9, 2021)
Thaler, R. H. and Sunstein, C. R. (2009) Nudge: Improving Decisions About Health, Wealth, and Happiness. Revised & Expanded
edition. New York: Penguin Books.
Greenwald, G. (Feb. 25, 2014) How Covert Agents Infiltrate The Internet To Manipulate, Deceive, And Destroy Reputations, (accessed on Nov. 9, 2021);
(Feb. 25, 2014) The Art of Deception: Training for a New Generation of Online Covert Operations, (accessed on
Nov. 9, 2021);
Bell, V. (Aug. 16, 2015) Britain’s ‘Twitter Troops’ have ways of making you think…,
intelligence-group (accessed on Nov. 9, 2021)
Helbing D., Frey, B. S., Gigerenzer, G., Hafen, E., Hagner, M., Hofstetter, Y., van den Hoven, J., Zicari, R. V. and Zwitter
A. (Feb. 25, 2017) Will Democracy Survive Big Data and Artificial Intelligence?, (accessed on Nov. 9, 2021)
Dooley, R. (Nov. 22, 2011) Brainfluence: 100 Ways to Persuade and Convince Consumers with Neuromarketing, (accessed on Nov. 9, 2021)
Kaiser, B. (2019). Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook
Broke Democracy and How It Can Happen Again (Illustrated edition). Harper.
Van der Boor, R. A. E., van Daalen, A. F. and Duistermaat, M. (Oct. 25, 2017) Behavioural change as the core of warfighting:
So now what?, (accessed on Nov.
9, 2021)
The Economist (Dec. 17, 2016) China invents the digital totalitarian state, (accessed on Nov. 9, 2021)
Gallagher, R. (Sep. 25, 2015) Profiled: From Radio to Porn, British Spies Track Web Users’ Online Identities, (accessed on Nov. 9, 2021)
Schirrmacher, F. (2015) Technologischer Totalitarismus. Suhrkamp,
totalitarismus-t-9783518074343 (accessed on Nov. 9, 2021)
Lederer, E. M. (Oct. 21, 2021) 43 countries criticize China at UN for repression of Uyghurs, (accessed on
Nov. 9, 2021)
Munk, T. B. (2017) 100,000 false positives for every real terrorist: Why anti-terror algorithms don’t work. First Monday, 22(9), (accessed on Nov. 9, 2021)
Lessig, L. (2006) Code: And Other Laws of Cyberspace, Version 2.0. 2. ed.,
Cyberspace-Version/dp/0465039146/ (accessed on Nov. 9, 2021)
Schwab, K. The Fourth Industrial Revolution. World Economic Forum,
revolution-by-klaus-schwab (accessed on Nov. 9, 2021);
Schwab, K. (Jan. 3, 2017) The Fourth Industrial Revolution,
Schwab/dp/1524758868/ (accessed on Nov. 9, 2021)
Helbing, D. (Oct. 30, 2015) The Automation of Society is Next: How to Survive the Digital Revolution, (accessed on Nov. 9, 2021)
Fuster, T (Nov. 18, 2017) Digitale Planwirtschaft führt zu einem orwellschen Schreckensszenario, (accessed on Nov. 9, 2021)
Andrews, M. (May 12, 2020) DOD Awards $138 Million Contract Enabling Prefilled Syringes for Future Covid-19 Vaccine,
for-future-covid-19/ (accessed on Nov. 9, 2021)
Burt, C. (Sep. 20, 2019) ID2020 and partners launch program to provide digital ID with vaccines, (accessed on
Nov. 9, 2021)
Römer K. (2004) Tracking Real-World Phenomena with Smart Dust. Berlin, Heidelberg: Springer, In: Karl H., Wolisz
A., Willig A. (eds) Wireless Sensor Networks. EWSN 2004. Lecture Notes in Computer Science, vol 2920, (accessed on Nov. 9, 2021);
Ehrlich, G., Fenster, M. (2021) Methods and systems of prioritizing treatments, vaccination, testing and/or activities while
protecting the privacy of individuals. U.S. Patent No. 11,107,588 B2, (accessed on Nov. 9, 2021)
Bhokisham, N. and van Arsdale, E. (Jun. 23, 2020) Internet of Bodies (IoB)- Using CRISPR to electrically connect with and
control the genome,
connect-with-and-control-the-genome (accessed on Nov. 9, 2021)
The Internet of Bodies Will Change Everything, for Better or Worse (Oct. 29, 2020), (accessed on
Nov. 9, 2021);
Lee, M., Boudreaux, B., Chaturvedi, R., Romanosky, S. and Downing, B. (2020) The Internet of Bodies: Opportunities, Risks,
and Governance. Santa Monica, CA: RAND Corporation,
internet-of-bodies.html (accessed on Nov. 9, 2021);
Lee, M., Boudreaux, B., Chaturvedi, R., Romanosky, S. and Downing, B. (2020) The Internet of Bodies: Opportunities, Risks,
and Governance,
(accessed on Nov. 9, 2021);
Maxwell, A. (Oct. 10, 2020) A Living, Breathing Internet of Things All Around You,
breathing-internet-of-bodies-all-around-you/ (accessed on Nov. 9, 2021);
Celik, A., Salama, K. N. and Eltawil, A. (2020) The Internet of Bodies: A Systematic Survey on Propagation Characterization
and Channel Modeling. TechRxiv. [Preprint.],
d_Channel_Modeling/12912752 (accessed on Nov. 9, 2021);
Celik, A., Salama, K. N. and Eltawil, A. (2020) The Internet of Bodies: A Systematic Survey on Propagation Characterization
and Channel Modeling. IEEE Communications Surveys and Tutorials. [Preprint.], (accessed on Nov. 9, 2021);
Gao, F., Chen, D-L., Wenig, M-H., Yang, R-Y. (2021) Revealing Development Trends in Blockchain-Based 5G Network
Technologies through Patent Analysis. Sustainability. 2021; 13(5):2548,
(accessed on Nov. 9, 2021)
Xiao, L. (Jan. 4, 2020) Tracking How Our Bodies Work Could Change Our Lives, (accessed on Nov. 10, 2021);
(Aug. 6, 2020) The Internet of Bodies Is Here: Tackling New Challenges of Technology Governance, (accessed
on Nov. 10, 2021);
Ly, J. U. (July 2020) Shaping the Future of the Internet of Bodies: New Challenges of Technology Governance. n.d., pp. 28. (accessed on Nov. 10, 2021)
Regulation of Nanotechnology, (accessed on Nov. 10, 2021)
Chauvet, Z. (Aug. 13, 2020, updated on Aug. 17, 2020) Last Call to Control the Internet of Bodies, (accessed on Nov. 10, 2021)
崔大祥, 高昂, 田静, 李雪玲, and 沈琦 (filed Sep. 27, 2020, and issued Jan. 15, 2021) Nano coronavirus recombinant
vaccine taking graphene oxide as carrier. China CN112220919A, (accessed on Nov. 10, 2021);
Chen, Y.-C., Cheng H-F., Yang Y.-C. and Yeh M.-K (2016) Nanotechnologies Applied in Biomedical Vaccines. Micro and
Nanotechnologies for Biotechnology. IntechOpen, (accessed on Nov. 10, 2021);
Von Andrian, U. H., Farokhzad, O. C., Langer, R. S., Junt, T., Moseman, E. A., Zhang, L., Basto, P., Iannacone, M., and Alexis,
F. (filed Mar. 15, 2013, and issued Jan. 10, 2017) Vaccine nanotechnology. United States US9539210B2, (accessed on Nov. 10, 2021)
Main, D. (Sep. 5, 2016) Potentially Toxic Magnetic Nanoparticle Pollution Has Been Found in Human Brains (accessed on Nov. 10, 2021);
Maher, B. A., Ahmed, I. A. M., Karloukovski, V., MacLaren, D. A., Foulds, P. G., Allsop, D., Mann, D. M. A., Torres-Jardón, R.
and Calderon-Garciduenas, L. (2016) Magnetite Pollution Nanoparticles in the Human Brain. Proceedings of the National
Academy of Sciences 113, no. 39: 10797-10801. (accessed on Nov. 10, 2021);
Flores, D. S. (2018) The Secret Program of US. Mind Control Weapons: Is It Developing in Latin America? International
Physical Medicine & Rehabilitation Journal, Volume 3, Issue 2.
us-mind-control-weapons-is-it-developing-in-latin-america.html (accessed on Nov. 10, 2021);
Flores, D. S. (2021) The secret program of US. mind control weapons: is it developing in Latin America?. Int Phys Med Rehab
J. 2018;3(2):145-146.
crime.html (accessed on Nov. 10, 2021)
Further links:
Abramson, D., Fu, D. and Johnson, J. E. (issued Mar. 26, 2020) Cryptocurrency System Using Body Activity Data. United
States US20200097951, (accessed on Nov. 10, 2021);
Bhokisham, N., VanArsdale, E., Stephens, K. T., Hauk, P., Payne, G. F. and Bentley, W. E. (2020) A Redox-Based
Electrogenetic CRISPR System to Connect with and Control Biological Information Networks. Nature Communications 11, no.
1: 2427. (accessed on Nov. 10, 2021);
Mahdi, M. N., Ahmad, A. R., Qassim, O. S., Natiq, H., Subhi, M. A. and Mahmoud, M. (2021) From 5G to 6G Technology:
Meets Energy, Internet-of-Things and Machine Learning: A Survey. Applied Sciences 11, no. 17: 8117. (accessed on Nov. 10, 2021);
European Technology Platform NetWorld2020 (2020) Smart Networks in the context of NGI, https://bscw.5g- (accessed on Nov. 10, 2021)
Pathak, B. A. and Keller III., W. J. (filed Mar. 13, 2014, and issued Jul. 5, 2016) DNA/nanoparticle complex enhanced radio
frequency transponder: structure of mark for detecting hybridization state and authenticating and tracking articles, method of
preparing the same, and method of the same. United States US9382579B2.
(DNA link) (accessed on Nov. 10, 2021);
Van Lune, H. and Bruggeman, J. J. (filed May 18, 2005, and issued Nov. 22, 2006) Luciferase assay system. European Union
EP1724359A1. (Luciferase) (accessed on Nov. 10, 2021)
Regalado A. (Sep. 30, 2014) Obama’s Brain Project Backs Neurotechnology. MIT Technology Review. (accessed on Nov. 10, 2021);
Alivisatos, A. P., Andrews, A. M., Boyden, E. S., Chun, M., Church, G. M., Deisseroth, K., Donoghue, J. P. et al. (March 26,
2013) Nanotools for Neuroscience and Brain Activity Mapping. ACS Nano 7, no. 3: 1850-1866. (accessed on Nov. 10, 2021);
Makin, S. (Aug. 8, 2016) ‘Neural Dust’ Could Enable a Fitbit for the Nervous System. Scientific American. (accessed on Nov. 10, 2021);
Costandi, M. (Mar. 24, 2016) Genetically Engineered ‘Magneto’ Protein Remotely Controls Brain and Behaviour. The Guardian. (accessed
on Nov. 10, 2021);
Wheeler, M. A., Smith, C. J., Ottolini, M., Barker, B. S., Purohit, A. M., Grippo, R. M., Gaykema, R. P. et al. (2016) Genetically
Targeted Magnetic Control of the Nervous System. Nature Neuroscience 19, no. 5: 756-761. (accessed on Nov. 10, 2021);
Schilling, D. R. (April 19, 2013) Knowledge Doubling Every 12 Months, Soon to Be Every 12 Hours. Industry Tap (blog). (accessed on Nov. 10, 2021);
Services from IBM, (accessed on Nov. 10, 2021);
IBM Global Technology Services (Jul., 2006) The toxic terabyte: How data-dumping threatens business efficiency,
(accessed on Nov. 10, 2021)
Smith, T. (Jul. 1, 2021) Elon Musk’s Neuralink Is Being Outperformed by This Spanish Graphene Startup, (accessed on Nov. 10, 2021);
INBRAIN Neuroelectronics Secures $17 Million in Series A Funding for First AI-Powered Graphene-Brain Interface (Mar. 30, 2021),
Series-A-Funding-for-First-AI-Powered-Graphene-Brain-Interface (accessed on Nov. 10, 2021);
Billing, M. (Oct. 26, 2020) The European Startups Hacking Your Brain Better than Elon Musk’s Neuralink, (accessed on Nov. 10, 2021);
Bramini, M., Alberini, G., Colombo, E., Chiacchiaretta, M., DiFrancesco, M. L., Maya-Vetencourt, J. F., Maragliano, L.,
Benfenati, F. and Cesca, F. (2018) Interfacing Graphene-Based Materials With Neural Cells. Frontiers in Systems
Neuroscience 12: 12. (accessed on Nov. 10, 2021);
Lee, H. -J (2021) Recent Progress in Radio-Frequency Sensing Platforms with Graphene/Graphene Oxide for Wireless Health
Care System. Applied Sciences 11, no. 5: 2291. (accessed on Nov. 10, 2021);
Marr, B. (Sep. 16, 2018) Smart Dust Is Coming. Are You Ready? Forbes. (accessed on Nov. 10, 2021);
Sutter, J. D. (May 3, 2010) ‘Smart Dust’ Aims to Monitor Everything, (accessed on Nov. 10, 2021);
Hoffman, T. (Mar 24, 2003) Smart Dust, (accessed on Nov.
10, 2021)
MIT Technology Review (Jul. 16, 2013) How Smart Dust Could Spy On Your Brain, (accessed on Nov. 10, 2021);
Seo, D., Carmena, J. M., Rabaey, J. M., Alon, E. and Maharbiz, M. M. (2013) Neural Dust: An Ultrasonic, Low Power
Solution for Chronic Brain-Machine Interfaces,
(accessed on Nov. 10, 2021);
Sharma, P., Moudgil, B. M., Walter G. A., Grobmyer, S. R, Santra, S., Jiang, H., Brown, S. C., Scott, E. W., Zhang, Q. and
Bengtsson, N. (filed Aug. 28, 2008, and issued Jan. 29, 2013) Multimodal nanoparticles for non-invasive bio-imaging. United
States US8361437B2. (accessed on Nov. 10, 2021);
Fuerst, O. and Rosenberg, Z. (filed Apr. 11, 2003, and issued Jan. 19, 2005) System for collecting storing presenting and
analyzing immunization data having remote stations in communication with a vaccine and disease database over a network.
European Union EP1497780A2. (accessed on Nov. 10, 2021);
Hogan, T. (filed Apr. 23, 2002, and issued Dec. 12, 2002) System and method for automatically recording animal temperature
and vaccination information. United States US20020188470A1. (accessed on
Nov. 10, 2021)
Burns, J. (Oct. 13, 2016) Elon Musk And Friends Are Spending Millions To Break Out Of The Matrix. Forbes.
(accessed on Nov. 10, 2021);
Hartmans, A. (Oct. 3, 2016) Tech Billionaires Are Asking Scientists for Help Breaking Humans out of the Computer Simulation
They Think They Might Be Trapped In. Business Insider.
humans-out-of-a-computer-simulation-2016-10 (accessed on Nov. 10, 2021)
Lesaja, S. and Palmer, X.-L (2020) Brain-Computer Interfaces and the Dangers of Neurocapitalism, (accessed on Nov. 10, 2021);
Pykett, J. (2013) Neurocapitalism and the New Neuros: Using Neuroeconomics, Behavioural Economics and Picoeconomics for
Public Policy. Journal of Economic Geography 13, no. 5: 845-869.
abstract/13/5/845/918547 (accessed on Nov. 10, 2021);
Griziotti, G. (2020) Neurocapitalism: Technological Mediation and Vanishing Lines. Minor Compositions;
Meckel, M. (2018) Mein Kopf Gehört Mir. Eine Reise Durch Die Schöne Neue Welt Des Brainhacking. Piper Verlag GmbH.
Stickgold, R., Zadra, A. and Haar, A. J. H. (Jun. 8, 2021) Advertising in Dreams Is Coming: Now What?, (accessed on Nov. 10, 2021)
McIntosh, A. and Siepmann, K. (Apr. 12, 2015) The Age of Transhumanist Politics Has Begun, (accessed on Nov. 10, 2021)
The Singularity is Near, (accessed on Nov. 10, 2021);
Kurzweil, R. (2006) The Singularity Is Near: When Humans Transcend Biology. New York: Penguin Books;
Kurzweil, R. (2000) The Age of Spiritual Machines: When Computers Exceed Human Intelligence. First Thus edition. New York,
NY: Penguin Books.
Kurzweil, R. (Oct. 1, 2015) Tiny Robots In Our Brains Will Make Us ‘Godlike’. HuffPost.
kurzweil-nanobots-brain-godlike_n_560555a0e4b0af3706dbe1e2 (accessed on Nov. 10, 2021)
Geraci, R. M. (2012) Apocalyptic AI: Visions of Heaven in Robotics, Artificial Intelligence, and Virtual Reality. Reprint edition.
Oxford: Oxford University Press.
Piper, K. (Jun. 21, 2019) Death by Algorithm: The Age of Killer Robots Is Closer than You Think. Vox. (accessed on Nov. 10, 2021);
Simanowski, R. (2018) The Death Algorithm and Other Digital Dilemmas. Translated by Jefferson Chase. Untimely Meditations.
Cambridge, MA, USA: MIT Press. (accessed on
Nov. 10, 2021);
Helbing, D. and Seele, P. (Nov. 26, 2020) Death by Algorithm? Project Syndicate. https://www.project-
(accessed on Nov. 10, 2021);
Helbing, D, Beschorner, T., Frey, B., Diekmann, A., Seele, P., Spiekermann-Hoff, S., Zwitter, A., van den Hoven, J. and
Hagendorff, T. (2021) Triage 4.0: On Death Algorithms and Technological Selection. Is Today’s Data-Driven Medical System
Still Compatible with the Constitution?, (accessed on Nov. 10, 2021)
Hao, K. (Apr. 23, 2020) Doctors Are Using AI to Triage Covid-19 Patients. The Tools May Be Here to Stay. MIT Technology
Review. (accessed on Nov. 10, 2021)
Surber, R. (2018) Artificial Intelligence: Autonomous Technology (AT), Lethal Autonomous Weapons Systems (LAWS) and
Peace Time Threats. Artificial Intelligence, n.d., 44. (accessed on Nov.
10, 2021);
Surber, R. (Feb. 2, 2018) Autonome Intelligenz ist nicht nur in Kriegsrobotern riskant. NZZ. (accessed on Nov. 10, 2021);
Nanotechnology: Dangers of Molecular Manufacturing. Center for Responsible Nanotechnology (CRN). (accessed on Nov. 10, 2021)