Tega Brain
THE ENVIRONMENT IS NOT A SYSTEM
APRJA Volume 7, Issue 1, 2018
ISSN 2245-7755
CC license: ‘Attribution-NonCommercial-ShareAlike’.
In late 2017, Microsoft’s chief environmental scientist, Lucas Joppa, announced AI for Earth, a new initiative to put artificial intelligence in the hands of those who are trying to “monitor, model and manage the earth’s natural systems”. AI for Earth gives environmental researchers access to Microsoft’s cloud platform and AI technologies, and, similar to recent initiatives by companies like Google and Planet Labs, it aims to integrate AI into environmental research and management.

It is obvious that Silicon Valley stands to profit handsomely from the uptake of AI in environmental research and management, as it has from the application of these methods in a diverse range of other fields. From urban design to the justice system, decision-making processes are being automated by data-driven systems. And in spite of a growing body of criticism of the limitations of these technologies,[1] the tech industry continues to promote them with the mix of solutionism and teleology that imbues Joppa’s words. He urges: “for every environmental problem, governments, nonprofits, academia and the technology industry need to ask two questions: ‘how can AI help solve this?’ and ‘how can we facilitate the application of AI?’” (Joppa)
This paper considers some of the limitations and possibilities of computational models in the context of environmental inquiry, specifically exploring the modes of knowledge production that they mobilize. As has been argued by authors like Katherine Hayles and Jennifer Gabrys, computation goes beyond just reading and representing the world. As a mode of inquiry it has a powerful world-making capacity, generating new pathways for action and therefore new conditions. “Computing computes.”[2] Computational metaphors are also pervasive as framing devices for complex realities, particularly in the context of research on the city, the human brain or human behavior.[3]
Historical computational attempts to model, simulate and make predictions about environmental assemblages both emerge from and reinforce a systems view of the world. The word eco-system itself stands as a reminder that the history of ecology is enmeshed with systems theory and presupposes that species entanglements are operational or functional. More surreptitiously, a systematic view of the environment connotes it as bounded, knowable and made up of components operating in chains of cause and effect. This framing strongly invokes possibilities of manipulation and control and implicitly asks: what should an ecosystem be optimized for?[4]

This question is particularly relevant at a time of rapid climate change, mass extinction and, conveniently, an unprecedented surplus of computing. As many have pointed out, these conditions make it tempting (and lucrative) to claim that neat technological fixes can address thorny existential problems.[5] This modernist fantasy is well and truly alive for proponents of the smart city, and even more dramatically in proposals for environmental interventions that threaten to commodify earth’s climate conditions, such as atmospheric engineering.[6]
What else does a systems view of the environment amplify or edit out? This discussion revisits several historic missteps in environmental measurement and modeling in order to pull focus on the epistemological assumptions embedded in a systems perspective. It then asks, what are other possibilities for ecological thought? Does AI have any potential to reveal environments in ways that escape the trappings of systems? Critical to my inquiry is the recent work of Anna Tsing and what she calls “the arts of noticing”. Tsing’s work offers a starting point for thinking outside of both a systems framework and assumptions of progress (17). Her perspective on ecology and the lifeworlds it describes unfolds and emerges through “encounters” (20) which bring together entities, transforming them in indeterminate ways. Might AI operate through modes of environmental encounter, or will it simply amplify “an informatics of domination” (Haraway 162)?

Figure 1: Seagrass in Tasmania, Australia. Credit: Tega Brain.
The poverty of numbers
A systems view of the environment, reinforced by computation, has numerous precedents, including 18th and 19th century attempts at scientific forest management. This early attempt at centralized ecosystem management through numerical modeling foreshadows the contemporary use of these approaches in the context of computation. James C. Scott traces how the introduction of centralized forestry required forests to be made legible in new ways.[7] Trees in forests were measured, quantified and modeled to optimize harvest and replanting for timber yield. Thus the fastest-growing species were replanted in felled areas, and trees became viewed as autonomous machines for producing wood. Those species not harvestable for timber – low-lying bushes, fungi and plants (Scott 13), as well as traditional ‘unofficial’ use of forests by local communities – were edited out of the system (Hölzl 436). These scientific or fiscal forests were managed with the assumption that complex species entanglements were irrelevant and could be treated as external to a system designed to efficiently transform trees into commodities. Yet after a couple of generations of felling and replanting, yields began to drop and the health of managed forests deteriorated (Scott 20). Viewing the forest as a factory oversimplified the reality of the relations and interdependencies of its species.
The scientific forest failed by its own criterion: timber yield. However, it is worth acknowledging that if yield had remained high while biodiversity declined, this history of sustainable environmental management would be remembered as a success, analogous to industrial agriculture. Tsing calls environments that are simplified and optimized to produce commodities “plantations” (435). The economic drivers of capitalism make crop yields the ultimate goal of agricultural landscapes, and shape how they are measured, modeled and manipulated. When a landscape is managed as a factory, its species become assets alienated from their lifeworlds,[8] like workers who fulfill HITs on Mechanical Turk with no bearing on each other or what they produce. When the asset can no longer be extracted, the landscape becomes a ruin and disappears from view, deemed worthless (Tsing 31). Both the plantation and the scientific forest are the results of numerical approaches to landscape management applied in the name of economics. They highlight that data collection and modeling practices are never neutral. Rather, they are contingent on decisions about what is deemed important or trivial in the eyes of the manager and are therefore profoundly driven by culture and economics, class and race.

Figure 2: Imaginary forest patch partitioned in 84 sections. Credit: Grünberger, G. (1788) Lehrbuch für den pfalzbaierischen Förster, Vol. 1 (München: Strobl), Figure 163 from Historicizing Sustainability: German Scientific Forestry in the Eighteenth and Nineteenth Centuries (Hölzl).
The fantasy of stability
In the twentieth century, the science of ecology emerged in dialogue with cybernetics and systems theory. There is a rich body of literature critiquing how these conditions influenced environmental research.[9] Cybernetics, first defined in the 19th century by André-Marie Ampère as “the science of governance”, was catalyzed as an interdisciplinary field by proponents like Norbert Wiener in the postwar decades.[10] It inspired ecologists to pursue questions of control and self-regulation in the context of species lifeworlds. Some early ecosystem diagrams were even realized in the style of circuitry.

Botanist Arthur Tansley was among the first to use the term “ecosystem” in 1935 to describe the “systematic” functioning of forest, grassland and wetland environments. He saw ecosystems as “the whole system (in the physical sense), including not only the organism-complex, but also the whole complex of physical factors forming what we call the environment of the biome [… these] are the basic units of nature” (299). Like the scientific foresters, Tansley proposed that ecosystems were made of discrete stable units, interacting in ways that tend towards a state of dynamic equilibrium. He also assumed that natural selection favors stability, that “systems that can attain the most stable equilibrium, can survive the longest” (Tansley 299). This idea of ecological equilibrium remains stubbornly influential, as does the idea of the environment as a unified “whole”. As philosophers like Bruno Latour and Timothy Morton discuss, the idea that the “natural world” exists in a balanced harmonious state that is then disrupted by humans reiterates the misconception that humans and environment are separate.[11]
Towards the late 1960s, Tansley’s assumption of ecosystem homeostasis was proving difficult to verify, even in ambitious large-scale ecosystem modeling projects enabled by the availability of computation. One such project was the Grasslands Biome, started in 1968 at Colorado State University. It was an unprecedented attempt to comprehensively model a grasslands ecosystem computationally and aimed to uncover new ecological principles (Kwa 1). Employing hundreds of full-time researchers, the project involved extraordinary methods of data collection as researchers tried to account for all forms of energy entering and leaving the system, attempting to quantify everything eaten and excreted by all organisms in the biome and then inputting this data into a mathematical model. Students and researchers would follow animals around the grasslands whispering into tape recorders. They would ‘collect’ animals and analyze their stomach contents by inserting probes into their digestive systems (Coupland). Soil microbiology was also studied, yet soil invertebrates and highly mobile species such as insects and birds remained frustratingly uncooperative in yielding information to researchers (Coupland 35).

Figure 3: Prominent 1960s biologist Howard Odum’s first presentation of an ecosystem using the symbolism and aesthetic of electric circuit diagrams. Image by Howard Odum, 1960, cited in Madison (218).
Despite this labor, the Grasslands model, like similar large-scale ecological modeling programs of the time, revealed very few new ecological principles. Deemed “too simplified biologically” despite implementing an unprecedented number of variables (Coupland 154), the model was built with an assumption of default equilibrium. Coupland argues that the Biome Model was simply “a sophisticated version of a cybernetic system […] and cast […] the ecologist in the role of systems engineer” (146). The project disproved its foundational hypothesis – that complex ecological realities can be reconciled with mathematical models and be described as abstracted structures of inputs and outputs. “The grandiose ideal of achieving total control over ecosystems, which around 1966 appealed so much to systems ecologists as well as politicians, was dismissed as a hyperbole” (Coupland 155).
Data collection and modeling practices remain shaped by what is considered typical or atypical, important and peripheral – summations of the boundary conditions of reality. However, making these assumptions is difficult. Even with the growing capacity of contemporary computing, it is dangerous to simply assume that more data equals more reality. An example of this is the story of how Joe Farman, a British geophysicist working for the British Antarctic Survey, first observed the destruction of the ozone layer. Farman maintained a single ground-based ozone sensor in the Antarctic throughout the 1960s and 1970s, and continued to do so in spite of the launch of NASA atmospheric monitoring satellites that collected vastly larger quantities of data (Vitello). When Farman’s sensor began to show a 40% drop in ozone levels in the early 1980s, he assumed it was damaged and replaced it, as NASA’s atmospheric models had reported no such change. After years of careful checking, Farman published this alarming result in Nature as the first observation of the destruction of the ozone layer due to human pollutants. Until then, this had been only a theoretical hypothesis.[12] How had NASA’s satellites missed such a marked change in ozone composition? One response from NASA suggests that their data processing software was programmed to discard readings that appeared to be outliers, thus ignoring the drastic changes that were occurring in ozone concentration (Farman). In this case, reality itself was an outlier and was assumed to be an error.
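To make the mechanism concrete, the following sketch shows how a quality-control filter calibrated on past variability can discard a genuine signal. It is an illustration of the general pattern rather than a reconstruction of NASA’s actual processing pipeline, and every value in it – the baseline ozone level, the noise, the size of the decline – is invented for the example.

```python
# A minimal sketch (not NASA's pipeline) of how an outlier filter tuned to
# historical variability can throw away a real change. Values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Two decades of "normal" springtime ozone readings, in Dobson units.
baseline = rng.normal(loc=300, scale=15, size=240)

# A genuine, abrupt decline of roughly 40% in the final years.
decline = rng.normal(loc=180, scale=15, size=36)
readings = np.concatenate([baseline, decline])

# Quality control calibrated on the historical record only: anything outside
# three standard deviations of the baseline is flagged as "bad data".
mean, std = baseline.mean(), baseline.std()
is_outlier = np.abs(readings - mean) > 3 * std

print(f"readings flagged and discarded: {is_outlier.sum()} of {len(readings)}")
print(f"share of the real decline thrown away: {is_outlier[-36:].mean():.0%}")
```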
Figure 4: Processing of replicate biomass samples, ready for drying and weighing, in the field laboratory at the CPER/Pawnee grassland site, Colorado, USA. Credit: Larry Nell, Colorado State University, July 1971.
The limits of machine learning
What if there were no cap on the amount of data produced from an environment for analysis? Could models be derived from datasets rather than built from theory, to avoid erroneous assumptions like those made in the Grasslands model? Could machine learning be adopted to deal with quantities of data beyond human comprehension and prevent any need for discarding outliers? Can these techniques produce a more robust representation of reality, free of human judgement?
These are the arguments made for machine learning. In 1959 Arthur Samuel defined machine learning as “the ability to learn without being explicitly programmed” (McCarthy). Rules are derived from patterns in large data sets, rather than programmed based on theoretical knowledge of underlying structures. “Correlation is enough. We can stop looking for models”, proclaimed Wired editor Chris Anderson in 2008, in an article titled “The End of Theory”. In other words, had the Grasslands model been derived through machine learning, energy flows through the ecosystem could have been estimated from correlations in the data, rather than by inputting data into a theoretical model hardcoded from hypotheses of ecosystem dynamics. Although this would have prevented erroneous assumptions like default homeostasis, it is important to acknowledge that machine learning substitutes one set of assumptions for another.
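The difference between the two approaches can be illustrated with a deliberately minimal sketch: one model whose key parameter is fixed in advance by hypothesis, and one whose parameters are derived from the measurements alone. This is a toy, not a reconstruction of the Grasslands model; the “energy transfer” figures and the linear form are assumptions made for the example.

```python
# An illustrative contrast between a theory-driven and a data-driven model.
# The quantities and the linear relationship are invented for the example.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical field measurements: plant biomass eaten (x) and energy
# passed on to herbivores (y), in arbitrary units.
biomass_eaten = rng.uniform(10, 100, size=50)
energy_transferred = 0.08 * biomass_eaten + rng.normal(0, 0.5, size=50)

# Theory-driven model: a transfer efficiency fixed in advance by hypothesis.
ASSUMED_EFFICIENCY = 0.10
theory_prediction = ASSUMED_EFFICIENCY * biomass_eaten

# Data-driven model: the relationship is fitted from the measurements alone.
slope, intercept = np.polyfit(biomass_eaten, energy_transferred, deg=1)
learned_prediction = slope * biomass_eaten + intercept

print(f"assumed efficiency: {ASSUMED_EFFICIENCY:.3f}")
print(f"efficiency learned from the data: {slope:.3f}")
# Assumptions remain either way: that these 50 samples represent the
# assemblage, and that the fitted pattern will continue to hold.
```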
Machine learning assumes that enough data can be collected to adequately represent and make predictions about reality. In the context of the environment, this is an enormous challenge given the very limited size of our existing datasets. Another significant assumption is that the past is indicative of the future. Yet as the sudden, unprecedented depletion of atmospheric ozone in the 1980s shows, this is not always the case. Similarly, climate change means our ability to make accurate predictions from our existing data is diminished. Many environmental datasets like precipitation records span 250 years at best, with the majority spanning a much shorter period.[13] From a geological point of view this is an absurdly small slice of time, and one in which the earth’s climate has been relatively stable. As the patterns, rhythms and cycles in both climatic and biological phenomena are drastically disrupted, it becomes increasingly difficult to make predictions based on this short, stable interval of climate data. William B. Gail calls this the coming of “a new dark age”, in which the accumulated observations of Earth’s irreducibly complex conditions are increasingly rendered obsolete. If machine learning approaches are to be adopted, it is important to recognize the limits of these methods.
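The stationarity assumption can be sketched in the same minimal style: a trend fitted to a century of stable, invented rainfall figures keeps predicting that stability even after the underlying regime has shifted. The numbers are made up for illustration and do not describe any real record.

```python
# A minimal sketch of the assumption that the past is indicative of the future:
# a model fitted to a stable record degrades once the regime it was trained on
# no longer holds. All values are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)

years = np.arange(1900, 2000)
# A century of roughly stable annual rainfall (mm): the "training" record.
stable_rainfall = rng.normal(loc=800, scale=40, size=years.size)

# Fit a linear trend to the stable past.
slope, intercept = np.polyfit(years, stable_rainfall, deg=1)

# The following decades: the regime shifts and rainfall declines steadily.
future_years = np.arange(2000, 2040)
shifted_rainfall = 800 - 8 * (future_years - 2000) + rng.normal(0, 40, size=future_years.size)

predicted = slope * future_years + intercept
error = np.abs(predicted - shifted_rainfall).mean()

print(f"trend learned from the stable century: {slope:+.2f} mm/year")
print(f"mean prediction error after the shift: {error:.0f} mm")
```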
Dreams of objectivity
Another prominent argument made for the use of AI methods is that data-driven approaches neutralize human decision-making by simply representing the world as it is. The proponents of AI for Earth also make these claims to objectivity: “Decisions about what actions to take will be easier to make — and less vulnerable to politicization — if we know what is happening on Earth, when and where. AI can help to provide that information.” (Joppa) However, in other realms, AI systems continue to reveal and confirm biases and structural inequalities rather than offering an easy pathway to their neutralization.

For example, defendant risk scoring systems designed to help judges make decisions to “deliver better outcomes to all who touch the justice system” (Equivant) have been shown to score black defendants at significantly higher risk of reoffense than white defendants with similar or worse criminal records (Angwin et al.). Systems like these should serve as warnings to other industries implementing automated decision-making, even in the name of environmental management. As theorist Françoise Vergès argues, “adaptation through technology or the development of green capitalism […] does not thoroughly address the long history and memory of environmental destruction […], nor the asymmetry of power.” Contemporary environmental challenges directly emerge from violent histories of colonialism, imperialism and the ongoing exploitation of marginalized communities or those living in the global South (Vergès). As such, there is no reason to suggest that AI technologies built and implemented by a cohort of wealthy white men in the US will in any way manage or distribute environmental resources in ways that are equitable for everyone.
Technologies will only ever provide partial fixes if they are not accompanied by shifts in perception and values, along with regulatory change that addresses histories of injustice and “the tradition of belief in progress” (Vergès). More efficient resource use in a system of deregulated capitalism is most likely to beget further resource use rather than net reduction. Microsoft seems to have it backwards in its mission statement “to empower every person and organization on the planet to achieve more”. Wasn’t the idea behind technologies of automation to empower us to achieve less? Or at least prompt a radical rethinking of what ‘more’ is? As Vergès argues, if these logics go unquestioned, mounting environmental challenges will not only continue to accelerate change in an already stressed biosphere, but also further augment environmental injustices.
If the environment is not a system, then what is it?
How else might we think of environments in lieu of the systems metaphor? Tsing offers the concept of assemblage, and here I build on her work, understanding environments as open-ended assemblages of non-humans, living and nonliving, entangled in ways of life.

Ecologists turned to assemblages to get around the sometimes fixed and bounded connotations of ecological ‘community.’ The question of how the varied species in a species assemblage influence each other — if at all — is never settled: some thwart (or eat) each other; others work together to make life possible; still others just happen to find themselves in the same place. Assemblages are open-ended gatherings. They allow us to ask about communal effects without assuming them. (Tsing 54)
Like Tsing, many authors have taken up the concept of assemblage to counter the simplification and abstraction connoted through the use of technological metaphors. Following Latour, to assume a system is also to surreptitiously assume “the hidden presence of an engineer at work”, a presence that suggests intention and that what we can see are parts of a unified whole (Some Advantages of the Notion of “Critical Zone” for Geopolitics, 3). Assemblage relieves us of this view, instead suggesting a collection of entities that may or may not exhibit systematic characteristics. The edges of an assemblage are fuzzy – modes of interaction are always shifting and agencies within them are heterogeneous. Katherine Hayles also invokes the term in her inquiry into cognition in complex human-technological entanglements, what she calls “cognitive assemblages” (Unthought 3). Hayles chooses assemblage over network, arguing that network conveys “a sense of sparse, clean materiality”, whilst assemblage offers “continuity in a fleshy sense, touching, incorporating, repelling, mutating” (118). She continues: “I want to convey the sense of a provisional collection of parts in constant flux as some are added and others lost. The parts are not so tightly bound that transformations are inhibited and not so loosely connected that information cannot flow between parts” (118). Similarly, I take up assemblage as an imperfect descriptor to avoid the hubristic assumptions of a systems view. Stating “I am studying a grasslands assemblage” instead of “I am studying a grasslands system” produces a remarkable shift in expectations and assumptions. This simple substitution dismantles subtle assumptions of fixed categories of knowledge, as well as assumptions that engineering and control are always possible. Instead it foregrounds uncertainty and acknowledges the unknowability of the world.
Rather than describing ecology through interactions or exchanges between entities, Tsing proposes that it emerges through encounters. For Tsing, encounters open new possibilities for thinking. They produce transformation and are therefore indeterminate (63). They are also non-human centered. There can be encounters between different species – say a mushroom and a pine tree – or between lifeforms and non-human materials. Components of a system are implied to be static discrete units, leaving out processes of contamination and transformation. For example, when predator-prey relations are described as transfers of energy between components in a system – say a walrus eats a mollusc – it is implied that the walrus remains unchanged by the encounter. Seeing the world as made up of individuals sealed off from one another allows for the assumption of stable categories, and makes the world easier to quantify through data, interpreted as pattern and codified as algorithm. The yield from a data-driven mode of knowledge production is obviously rich and wide-reaching, providing new insight into phenomena like climate change. And yet, as the story of Farman’s attention to the atmosphere shows, scaling and automating data collection processes can risk presuming too much stability in the world and blind us to transformations outside of assumed possibility spaces.
In this way “smartness”, in its current form, produces a kind of myopia. A smart city, home or environment contains networks of sensors automatically pinging data back to servers to train machine learning models of the world. Indeed, this is also Joppa’s pitch for AI for Earth: “AI systems can now be trained to classify raw data from sensors on the ground, in the sky or in space, using categories that both humans and computers understand, and at appropriate spatial and temporal resolution.” This statement is worthy of careful consideration. Firstly, how does one decide on an appropriate temporal resolution? In the case of the German forests, it took nearly a century to see that management methods were unsustainable, because the life rhythms of a tree are at a vastly slower tempo than those of human economies. Joppa also implies that the world can be revealed by how it appears through “raw sensor data”. Yet this implies that the sensors themselves are somehow neutral and overlooks the layers of human decision-making that have occurred in their production and installation.[14]
It can also be surprisingly difficult to resolve the world into clearly defined categories. And are these categories stable? Tsing’s argument that encounters produce transformation suggests that neat taxonomies will never fully accommodate the fluidity and uncertainty of the world. This is particularly apparent in plant systematics, where even the definition of species is contested and ever changing (Mayr). In trying to categorize plant specimens, a tension can emerge between how the specimen appears – its phenotype – and how it appears on a genetic level – its genotype. As genetic sequencing techniques have become cheaper and therefore more widely available, plant scientists sometimes find that the species indicated by the phenotype does not always match up to the genotype – a discovery that has caused many herbaria to be reorganized. However, even when identifying specimens on a purely genetic level, there is still dispute over how species are interpreted.[15]
Data-driven research methods necessitate the collection of huge quantities of data and, in doing so, they dismantle opportunities for paying close, specific attention to the world. These methods also tend to obscure the many other ways of building understanding. Also, perhaps intentionally, data collection increasingly acts to maintain the status quo. We use data to study problems that would be more effectively addressed through simple political action. The impetus to “study the problem” ad nauseam gives the appearance of addressing an issue while perfectly maintaining the present state of affairs.[16]
Amplifying encounters
How might we reciprocally illuminate the environment and balance our well-oiled capacity for imagining it from an all-conquering systems worldview? How might we elevate engagement through the specifics of encounter and narrative?
Ethnography is one possibility. Tsing’s study of the matsutake mushroom explores what can be learnt from a Japanese mushroom, a lifeform that cannot be cultivated and that thrives in highly disturbed forests. Through her ethnography she shows how close attention inevitably facilitates transformation. Tsing calls this “the arts of noticing”, tactics for thinking without either the abstraction produced by quantification or deeply held assumptions of progress. If we are “agnostic about where we are going, we might look for what has been ignored” (51). As Farman’s ozone research showed, paying close attention rather than outsourcing observation and interpretive capacities can reveal the world in different ways. In particular, attention can emphasize the indeterminacy and messiness of encounters outside of an engineering agenda. It can transform the observer, directly involving us in the weirdness of the world.
Could technologies like machine vision and remote sensing be used to amplify environmental encounters and the arts of noticing our ecological entanglements? The rise of digital naturalism sees the development of apps and initiatives that focus attention on the lifeforms in our various bioregions. Initiatives such as iNaturalist, Merlin Bird ID and eBird invite non-scientists to contribute environmental observations and use either crowd-sourced or “assisted identification” to identify species and build biodiversity databases. Assisted identification utilizes computer vision techniques to guide species identification from images by identifying broad categories and making suggestions. Through this process, the system is also gradually being trained, and over time will therefore make better suggestions. Many scientific institutions also hope that data-driven species identification can help to reduce the bottlenecks in identification processes, as human taxonomists are in short supply (Kim).
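The assisted-identification pattern can be sketched in a few lines of code. The classifier below is a stub standing in for a trained computer-vision model, and the sketch is not a description of how iNaturalist, Merlin Bird ID or eBird are implemented internally; it simply illustrates the loop in which software proposes ranked suggestions and a human observer confirms or corrects the final label, which in turn can become future training data.

```python
# A minimal sketch of an assisted-identification loop. The classifier is a
# stub in place of a real computer-vision model; labels and confidences are
# invented for illustration.
from dataclasses import dataclass

@dataclass
class Suggestion:
    label: str
    confidence: float

def classify(image_path: str) -> list[Suggestion]:
    """Stub for a trained image classifier returning ranked guesses."""
    return [
        Suggestion("Australian magpie (Gymnorhina tibicen)", 0.62),
        Suggestion("Pied butcherbird (Cracticus nigrogularis)", 0.21),
        Suggestion("Magpie-lark (Grallina cyanoleuca)", 0.09),
    ]

def assisted_identification(image_path: str, confirmed_label: str | None = None):
    """Suggest species, then record whichever label the observer settles on."""
    suggestions = classify(image_path)
    for s in suggestions:
        print(f"  {s.label}  ({s.confidence:.0%})")
    # The human observer has the final say; a confirmed label can later be
    # used to retrain the model and improve future suggestions.
    final = confirmed_label or suggestions[0].label
    source = "human-confirmed" if confirmed_label else "model-suggested"
    return {"image": image_path, "label": final, "source": source}

record = assisted_identification("observations/backyard_bird.jpg",
                                 confirmed_label="Australian magpie (Gymnorhina tibicen)")
print(record)
```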
It is also worth emphasizing that these apps do not purport to replace human identification but rather to facilitate human-computer collaboration to reach conclusions more quickly. This is significant, as it shows a way that AI can produce more meaningful environmental encounters rather than automate them away. This use case for AI also serves as a reminder that data can be much more than a material for building a simulation or instrumentalizing whatever is being measured. The act of data collection and collaborative identification can amplify encounters and, by extension, yield transformation, or what artist Jenny Odell calls “a certain dismantling in the mind.” In observing a local bird, and being assisted to identify it as a magpie, I’m learning and tuning my perception to the lifeworlds I inhabit: I’m subject to transformation.
Accounts of the scientific forest, the Grasslands Biome and Farman’s ozone observations mostly focus on the success or failure of the science – on whether these projects of observation or modeling succeeded or failed in revealing new patterns, on whether the resultant environmental models proved accurate, and, by extension, on whether they produced new possibilities for environmental management and manipulation. But telling these stories like this is telling them from a systems point of view. And what tends to get overlooked is how these are actually stories of environmental encounter through data collection. As encounters, they are also stories of transformation of both the environments and the humans involved. How did the meticulous observation of the environmental assemblages in question shift and transform the people studying them? In itself, this question rejects a false binary between human and environment. It acknowledges the instability of the observer and the tendencies of Western science to edit out intuition, emotion and philosophical recalibrations. The reciprocal transformation that occurs with attention and encounter, what Nobel Prize-winning geneticist Barbara McClintock called “getting a feeling for the organism”, is not only critical for formulating original scientific hypotheses but, more deeply, for questioning foundational assumptions about what is counted as knowledge and what we then expect knowledge to do.[17] Looking back on the early scientific forests and even on the more recent Grasslands Biome, it is difficult to speculate on how these projects changed the people involved. However, their stories remind us of the irreducibility of an unruly and complex environment. That as hard as we try to contain the world in neat technological metaphors, it will always leak out and transform us.
Notes
[1] See recent books Weapons of Math Destruction by Cathy O’Neil, Automating Inequality by Virginia Eubanks, Code and Clay, Data and Dirt: Five Thousand Years of Urban Media by Shannon Mattern, and the Machine Bias series published by ProPublica and written by Julia Angwin et al.
Figure 5: Deer observations made at the CPER/Pawnee grassland site, Colorado, USA. Credit: Animated GIF made from Adam Curtis’ documentary All Watched Over by Machines of Loving Grace.
[2] See Katherine Hayles (My Mother Was
a Computer, 7-31) and Jennifer Gabrys’
discussion in Program Earth (11).
[3] Sociologist Shannon Mattern warns of the “city as computer” model, arguing that it often hinders “the development of healthy, just, and resilient cities” (A City Is Not a Computer). Psychologist Robert Epstein highlights similar issues in the context of brain research, observing that historically, metaphors for cognition have always been drawn from the dominant technology of the time – hydraulics, springs and mechanics, electrical circuits and now computation. Epstein argues that the ubiquity of information processing metaphors in brain research may well be constraining the field by confining hypotheses and explanations to those that align with computational processes. These metaphors equally constrain approaches to environmental inquiry.
[4] This question is inspired by Shannon Mattern’s discussion of the city as a computer metaphor (A City Is Not a Computer).
[5] See Bratton et al. (9); Gabrys (230); Stengers (1000); and Szerszynski et al. (2818).
[6] See Temple on the planned atmospheric
tests scheduled to occur in the US in 2018.
[7] See James C. Scott’s well known account of scientific forestry in Seeing Like a State.
[8] I use the word ‘lifeworlds’ following Anna Tsing, who describes objects in capitalist exchange as being alienated and “torn from their lifeworlds” (121).
[9] Many authors discuss the influence of systems theory on ecology, such as Elichirigoity, Planet Management, and Latour, Some Advantages of the Notion of “Critical Zone” for Geopolitics. Some also consider the influence of cybernetics, such as Haraway, The High Cost of Information, and Jennifer Gabrys, Program Earth.
[10] See Wiener’s landmark 1948 book,
Cybernetics.
[11] Latour’s concept of “naturecultures”, introduced in the Politics of Nature, is an attempt to collapse a false binary between human concerns and nature. Morton builds on this in The Ecological Thought, which also rejects this bifurcation.
[12] The theory of ozone destruction was
published by Molina et al.
[13] See Simpson.
[14] See Gabrys; Bratton et al.
[15] See Fazekas for discussion of differences in species definitions. Hull discusses how these uncertainties have led to the concept of reciprocal illumination in plant systematics. This concept acknowledges the multiple methods for classifying and naming species.
[16] Now discontinued, The Human Project was an example of data collection in lieu of political action. The project planned to address issues of health, urban design and inequality by collecting huge volumes of data from 10,000 New Yorkers over 20 years.
[17] See Keller’s biography of McClintock’s
life.
Works cited
AI for Earth Grant, Microsoft, 2017. Web.
https://www.microsoft.com/en-us/research/
academic-program/azure-research-award-
ai-earth/. Accessed 10 Jan. 2018.
Ampère, André-Marie, Charles Augustin
Sainte-Beuve, and Émile Littré. Essai sur
la philosophie des sciences. Vol. 1. Paris:
Bachelier, 1834. Print.
Anderson, Chris. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” Wired magazine, 16 July 2008. Web. https://www.wired.com/2008/06/pb-theory/. Accessed 10 Feb. 2018.
Angwin, Julia, et al. “Machine Bias: There’s
Software Used Across the Country to
Predict Future Criminals. And it’s Biased
against Blacks.” ProPublica, May 23 (2016).
Print.
Bratton, Benjamin H., and Natalie Jeremijenko. “Latent Interfaces, Suspicious Images.” Situated Technologies Pamphlets 3 (2008). Print.
Curtis, Adam. All Watched Over by
Machines of Loving Grace. London: BBC,
2011. Film Series.
Coupland, Robert T., ed. Grassland
Ecosystems of the World: Analysis of
Grasslands and their Uses. Vol. 18.
Cambridge: Cambridge University Press,
1979. Print.
eBird, The Cornell Lab for Ornithology.
Web. https://ebird.org. Accessed 10 Feb.
2018.
Elichirigoity, Fernando. Planet Management:
Limits to Growth, Computer Simulation, and
the Emergence of Global Spaces. Evanston,
IL: Northwestern University Press, 1999.
Print.
Epstein, Robert. “The Empty Brain.” Aeon
(2016). Web. https://aeon.co/essays/your-
brain-does-not-process-information-and-it-
is-not-a-computer. Accessed 10 Feb. 2018.
Equivant, 2018. Web. http://www.equivant.com/about-us. Accessed 10 Feb. 2018.
Mayr, Ernst. “Species Concepts and Definitions.” Topics in the Philosophy of Biology. Dordrecht: Springer, 1976, pp. 353-371. Print.
Eubanks, Virginia. Automating Inequality: How High-tech Tools Profile, Police, and Punish the Poor. Basingstoke: Macmillan, 2018. Print.
Farman, Joseph C., Brian G. Gardiner, and
Jonathan D. Shanklin. “Large Losses of
Total Ozone in Antarctica Reveal Seasonal
ClOx/NOx Interaction.” Nature 315.6016
(1985): 207. Print.
Fazekas, Aron J., et al. “Are Plant Species
Inherently Harder to Discriminate than
Animal Species using DNA Barcoding
Markers?.” Molecular Ecology Resources
9.s1 (2009): 130-139. Print.
Gail, William B. “A New Dark Age Looms.”
New York Times, 19 April 2016. Web.
https://www.nytimes.com/2016/04/19/
opinion/a-new-dark-age-looms.html.
Accessed 10 Feb. 2018.
Gabrys, Jennifer. Program Earth:
Environmental Sensing Technology and
the Making of a Computational Planet.
Minneapolis: University of Minnesota Press,
2016. Print.
Hayles, N. Katherine. Unthought: The
Power of the Cognitive Nonconscious.
Chicago: University of Chicago Press, 2017.
Print.
Hayles, N. Katherine. My Mother was a
Computer: Digital Subjects and Literary
Texts. Chicago: University of Chicago
Press, 2010. Print.
Haraway, Donna. “The High Cost
of Information in Post-World War II
Evolutionary Biology: Ergonomics,
Semiotics, and the Sociobiology of
Communication Systems.” Philosophical
Forum. Vol. 13. No. 2-3 (1981). Print.
Haraway, Donna. Simians, Cyborgs and
Women: The Reinvention of Nature.
London: Free Association Books, 1991.
Hölzl, Richard. “Historicizing Sustainability: German Scientific Forestry in the Eighteenth and Nineteenth Centuries.” Science as Culture 19.4 (2010): 431-460. Print.
Hull, David L. Science as a Process: An
Evolutionary Account of the Social and
Conceptual Development of Science.
Chicago: University of Chicago Press, 2010.
Print.
iNaturalist, California Academy of Sciences.
Web. www.inaturalist.org. Accessed 10 Feb.
2018.
Joppa, Lucas N. “The Case for Technology Investments in the Environment.” Nature Comment, 19 December 2017. Web. https://www.nature.com/articles/d41586-017-08675-7. Accessed 10 Feb. 2018.
Keller, Evelyn Fox. A Feeling for the
Organism, 10th Anniversary Edition: The
Life and Work of Barbara McClintock.
Basingstoke: Macmillan, 1984. Print.
Kim, Ke Chung, and Loren B. Byrne.
“Biodiversity Loss and the Taxonomic
Bottleneck: Emerging Biodiversity Science.”
Ecological Research, 21.6 (2006): 794.
Print.
Kwa, C. “Modeling the Grasslands.”
Historical Studies in the Physical and
Biological Sciences, vol. 24, no. 1 (1993):
125-155. Print.
Latour, Bruno. “Some Advantages of the
Notion of “Critical Zone” for Geopolitics.”
Procedia Earth and Planetary Science, vol.
10 (2014): 3-6. Print.
Latour, Bruno. Politics of Nature. Harvard
University Press, 2004.
Machine Bias Series, ProPublica, 2018. Web. https://www.propublica.org/series/machine-bias. Accessed 15 Jan. 2018.
Madison, Mark Glen. “‘Potatoes Made
of Oil’: Eugene and Howard Odum
and the Origins and Limits of American
Agroecology.” Environment and History vol.
3, no. 2 (1997): 209-238. Print.
Mattern, Shannon. “A City Is Not a
Computer.” Places Journal (2017). Web.
https://placesjournal.org/article/a-city-is-not-
a-computer/. Accessed 1 Mar 2018.
Mattern, Shannon. Code and Clay, Data and
Dirt: Five Thousand Years of Urban Media.
Minneapolis: University of Minnesota Press,
2017. Print.
McCarthy, John and Feigenbaum, Edward
A., “Arthur Samuel: Pioneer in Machine
Learning,” AI Magazine, vol. 11, no. 3
(1990): 10-11. Print.
Merlin, The Cornell Lab. Web. http://merlin.
allaboutbirds.org/. Accessed 10 Mar. 2018.
Molina, Mario J., and F. Sherwood Rowland. “Stratospheric Sink for Chlorofluoromethanes: Chlorine Atom-catalysed Destruction of Ozone.” Nature 249.5460 (1974): 810. Print.
Morton, Timothy. The Ecological Thought.
Cambridge, Mass.: Harvard University
Press, 2010. Print.
Odell, Jenny. “Notes of a Bioregional
Interloper.” Open Space, San Francisco
Museum of Modern Art (2017). Web. https://
openspace.sfmoma.org/2017/10/notes-of-
a-bioregional-interloper/. Accessed 10 Feb.
2018.
O’Neil, Cathy. Weapons of Math
Destruction: How Big Data Increases
Inequality and Threatens Democracy. New
York: Broadway Books, 2017. Print.
Scott, James C. Seeing like a State: How
Certain Schemes to Improve the Human
Condition have Failed. New Haven,
Connecticut: Yale University Press, 1998.
Print.
Simpson, I. R., and P. D. Jones. “Analysis of UK Precipitation Extremes Derived from Met Office Gridded Data.” International Journal of Climatology vol. 34 no. 7 (2014): 2438-2449. Print.
Stengers, Isabelle. “The Cosmopolitical
Proposal.” Making Things Public:
Atmospheres of Democracy (2005): 994.
Print.
Szerszynski, Bronislaw, and Maialen
Galarraga. “Geoengineering Knowledge:
Interdisciplinarity and the Shaping
of Climate Engineering Research.”
Environment and Planning A 45.12 (2013):
2817-2824. Print.
Tansley, Arthur G. “The Use and Abuse of
Vegetational Concepts and Terms.” Ecology
vol. 16, no. 3, 1935: 284-307. Print.
Temple, James. “Harvard Scientists Moving Ahead on Plans for Atmospheric Geoengineering Experiments.” Technology Review vol. 24 (2017).
The Human Project. Web. https://www.
thehumanproject.org/. Accessed 10 Feb.
2018.
Tsing, Anna Lowenhaupt. The Mushroom
at the End of the World: On the Possibility
of Life in Capitalist Ruins. Princeton, NJ:
Princeton University Press, 2015.
Vergès, Françoise. “Racial Capitalocene: Is the Anthropocene Racial?” Verso Blog, 30 August 2017. Web. https://www.versobooks.com/blogs/3376-racial-capitalocene. Accessed 10 Feb. 2018.
Vitello, Paul. “Joseph Farman, 82, Is Dead;
Discovered Ozone Hole.” New York Times,
May 18, 2013. Web. https://www.nytimes.
com/2013/05/19/science/earth/joseph-
farman-82-is-dead-discovered-ozone-hole.
html. Accessed 10 Feb. 2018.
Wiener, Norbert. Cybernetics: Control
and Communication in the Animal and the
Machine. New York: Wiley, 1948. Print.