Big Data, Algorithmic Regulation, and the History of
the Cybersyn Project in Chile, 1971–1973
Katharina Loeber
Department of History, University of Cologne, 50923 Köln, Germany; kaloeber@gmail.com
Received: 21 December 2017; Accepted: 11 April 2018; Published: 13 April 2018


Abstract:
We are living in a data-driven society. Big Data and the Internet of Things are popular
terms. Governments, universities and the private sector make great investments in collecting and
storing data and also extracting new knowledge from these data banks. Technological enthusiasm
runs throughout political discourses. “Algorithmic regulation” is defined as a form of data-driven
governance. Big Data is said to offer brand new opportunities in scientific research. At the same time,
political criticism of data storage grows because of a lack of privacy protection and the centralization
of data in the hands of governments and corporations. Calls for data-driven dynamic regulation have
existed in the past. In Chile, cybernetic development led to the creation of Cybersyn, a computer
system that was created to manage the socialist economy under the Allende government 1971–1973.
My contribution will present this Cybersyn project created by Stafford Beer. Beer proposed the
creation of a “liberty machine” in which expert knowledge would be grounded in data-guided policy.
The paper will focus on the human–technological complex in society. The first section of the paper
will discuss whether the political and social environment can completely change the attempts of
algorithmic regulation. I will deal specifically with the development of technological knowledge
in Chile, a postcolonial state, and the relationship between citizens and data storage in a socialist
state. In a second section, I will examine the question of which measures can lessen the danger of
data storage regarding privacy in a democratic society. Lastly, I will discuss how much data-driven
governance is required for democracy and political participation. I will present a second case study:
digital participatory budgeting (DPB) in Brazil.
Keywords: Cybersyn; human–technological complex; data-driven society; postcolonial history; socialism
1. Introduction
Technological enthusiasts promote a process of building a global information economy, once
characterized by Bill Gates as “friction-free capitalism” (Mosco 2016). Big Data and cloud computing
create a global culture of knowledge. Information production accelerates in networks that link data
centers, devices, organizations and individuals. This culture is defined by the global expansion of networked data centers controlled by a handful of companies. The cloud and Big Data power informational capitalism
and enable a dominant way of knowing (Mosco 2016). Data collection, data generation, and algorithmic
governance are important topics in research, the economy and politics. According to Tim O’Reilly,
developments in data collection and storage make governments more efficient and adaptive. He coined the term “algorithmic regulation” for this form of data-driven governance (O’Reilly 2013). Big Data is said to offer brand
new opportunities in scientific research. At the same time, political criticism of data storage grows
because of a lack of privacy protection and the centralization of data in the hands of governments and
corporations. This discussion is now common in our society. Furthermore, I suggest that technological
development is part of a complex pattern of factors forming our society. This human–technological
complex will be the main point of discussion.
It is necessary to put Big Data and cloud computing in the context of political economy, society,
and culture, i.e., the human–technological complex or social–technological complex. In our society,
most developments in information technology aim to strengthen capitalism, although the world of
information technology (IT) is a turbulent one. However, Big Data, cloud computing, and algorithmic
regulation have deep historical roots (Medina 2015). A closer look at the past might show us how
technological development could work with different social, economic, and political prerequisites.
The history of cybernetics holds lessons for these present-day problems.
This contribution will deal with the following questions:
(1) Can a political environment completely change the attempts of algorithmic regulation?
(2) Which measures can weaken the dangers of data storage in a democratic society?
(3) How much data-driven governance do democracy and political participation need?
First, I will look at the term Big Data and its definition. Next, I will explore how a technological
system built in Chile during the 1970s—Project Cybersyn—addressed issues similar to those we
currently face in areas such as Big Data and algorithmic regulation. The paper will give an overview
of how algorithmic governance can be completely different in a political, geographic and historical
context different from that of northern capitalistic states. Realizing that technological investigation
depends on social conditions, we must look at the factors that concretely differ.
One has to keep in mind that the Cybersyn project has been discussed little in scientific discourses
since being destroyed in 1973. Only one major monograph deals with Project Cybersyn in detail: Eden Medina published her study “Cybernetic Revolutionaries” in 2011. She has also published several papers on individual aspects of the topic. Some papers and books by Raul Espejo, the Chilean informatics specialist who was part of the team that developed Cybersyn, also support further research on the project (Espejo and Reyes 2011). Technological development is part of a complex pattern of factors forming a
society’s technological complex. Nevertheless, a democratic society or a socialist government does
not automatically guarantee responsible handling of stored data. Regarding this fact, I will discuss
measures that might secure responsible handling of data and the possibility to handle Big Data in
a responsible and democratic way. Finally, I will deal with the relationship between the political
participation of people and the technological development of a social–technological complex. As an
example of civic participation, I chose participatory budgeting in Brazil. I chose this example because I
found some political similarities to the case of Chile at a local level during my research. In the 1990s,
the city of Porto Alegre in southern Brazil won international renown with its innovative policies.
The centerpiece of the new policies involved the population in planning the city budget: participatory
budgeting. The new method of drawing up the budget radically altered the relationship between the
city administration and civil society. In the last few years, so-called digital participatory budgets were
installed in different cities. Digital participatory budgeting (DPB) is an online space for discussions
with society on local budget allocation issues and priorities. Such platforms exist both as integral parts
of face-to-face participatory budgeting and as exclusively digital experiences (Matheus et al. 2010).
In this last section, I will discuss the role of digital support in the participatory budget. I will discuss
how technological development and political activism can flow in a social–technological system.
2. Discussion
2.1. Definition of the Term Big Data
First, we must look at Big Data. The term has become ubiquitous, but owing to its shared origins in academia, industry and the media, there is no single unified definition. Various
stakeholders provide diverse and often contradictory definitions. The lack of a consistent definition
introduces ambiguity and hampers discourse relating to Big Data and algorithmic regulation
(Ward and Barker 2013). To deal with the term in my discussion, I chose one of the most cited definitions, included in a 2001 meta report from Gartner. The report has since been co-opted as providing a key definition. Gartner proposed a threefold definition encompassing the so-called three
Vs: volume, velocity and variety. This is a definition rooted in magnitude. The report remarks upon the
increasing size of data, the increasing rate at which it is produced, and the increasing range of formats
and representations employed (Laney 2001). Anecdotally, Big Data is predominantly associated with
the two ideas of data storage and data analysis. These concepts are far from new. Therefore, we have
to deal with the question of how Big Data is notably different from conventional data-processing
techniques. For rudimentary insight into this question, one need look only at the term
Big Data. Big implies significance, complexity, and challenge. Unfortunately, the word big also invites
quantification and therein lies the difficulty in furnishing a unique definition (Ward and Barker 2013).
Nevertheless, the most important characteristic of Big Data sets is the inability of conventional data-processing application software to deal with them. The definition of Big Data varies not only with the volume, variety and velocity of the datasets, but also with the organization’s capacity to store, manage and employ the data (Magoulas and Lorica 2009).
Despite the range and differences in diverse definitions, there are some points of similarity
regarding the size, complexity and structure of Big Data.
The critical factors of Big Data are: (i) the volume of the datasets; (ii) the structure, behavior,
and permutations of the datasets; (iii) the tools and techniques that are used to process a sizable or
complex dataset (Ward and Barker 2013). This paper cannot provide an analysis of the available
literature on Big Data analytics. There are various Big Data tools, methods and technologies with
various applications and opportunities. There is one main problem: due to the rapid growth of Big
Data, research on solutions to handle and extract value and knowledge from these datasets is necessary
(Elgendy and Elragal 2014). We are far from finding these solutions. I suggest that the storage and
handling of Big Data sets is not only a technical problem but a political one.
I further suggest that Big Data is part of a “global culture of knowing” (Mosco 2016). As mentioned above, the “global culture of knowing” refers to the global expansion of networked data centers controlled by a handful of companies in today’s capitalistic society, i.e., informational capitalism.
Obviously, data collection, data generation, and algorithmic governance are part of our society.
Companies move their data into the cloud and collect masses of data from individuals who are storing
traces of their identities in the cloud. The cloud is also used by the military in strategic planning as
well as universities in transforming education. I already mentioned the danger of a lack of privacy
protection and the centralization of data in the hands of governments and corporations. Even the
scientific use of Big Data is critical, because of the volume and complexity of the datasets. At this point,
I come to my first question:
2.2. Can a Political Environment Completely Change the Attempts of Algorithmic Regulation?
This section starts to discuss the complex relationship between technology and society.
As stated above, the history of Big Data, cloud computing and algorithmic regulation is a long
one. Technological development not only exists in northern capitalistic states, but also in countries
of the so-called “global south” under different social, economic and political prerequisites. Next, I
explore how a technological system built in Chile during the 1970s—Project Cybersyn—addressed
issues similar to those we currently face in areas such as Big Data, cloud computing, and
algorithmic regulation.
If we move beyond technology and ideology, it is clear that we have already seen calls for
data-driven dynamic regulation in the past. For example, algorithmic regulation is highly reminiscent
of cybernetics, the interdisciplinary science that emerged in the aftermath of the Second World War.
Cybernetics moved away from linear understandings of cause and effect and toward investigations
of control through circular causality, or feedback. It influenced developments in areas as diverse
as cognitive science, air defense, industrial management and urban planning. It also shaped ideas
about governance (Medina 2015). Cybernetics can be defined as a scientific study of how humans,
animals and machines control and communicate with each other. It is a transdisciplinary approach for
exploring the structures, constraints and possibilities of regulatory systems. Norbert Wiener, the famous
Massachusetts Institute of Technology (MIT) mathematician, credited with coining the term cybernetics,
defined cybernetics in 1948 as “the scientific study of control and communication in the animal and the
machine” (Wiener 1948). The paper will show how the cybernetic project Cybersyn became a historical
case for the early use of so-called groupware and early “internet” communication—an early form of
the cloud.
The content of cybernetics varied according to geography and historical period. In the USA, early
work on cybernetics was often associated with defense; in Britain, it was associated with understanding
the brain (Medina 2015); in the Soviet Union cybernetics became a way to make the social sciences more
scientific and also contributed to the use of computers in a highly centralized economy (Mosco 2016).
In Chile, cybernetics led to the creation of a computer system some thought would further socialist
revolution (Medina 2011).
A discussion of Project Cybersyn requires a treatment of Stafford Beer and the so-called viable
system model (VSM) that he developed in the 1960s. Beer conducted groundbreaking work on the
application of cybernetic concepts to the regulation of companies. He believed that cybernetics and
operations research should drive action, whether in the management of a firm or governance on a
national scale. Norbert Wiener once told Stafford Beer that if he was the father of cybernetics, then
Beer was the father of management cybernetics. Correspondence, invitations to conferences, meetings
and friendships followed. Stafford was welcomed by many of the early pioneers and formed special
bonds with his mentors, Warren McCulloch, Ross Ashby and Norbert Wiener (Leonard 2002).
For Beer, computers in the 1960s and 1970s presented exciting new opportunities for regulation.
In 1967, he observed that computers could bring about structural transformations within organizations.
Organizations, which could be firms as well as governments, were linked to new communications
channels that enabled the generation and exchange of information and permitted dynamic decision
making. Beer proposed the creation of a so-called liberty machine, a system that operated in close to
real time, facilitated instant decision making, and shunned bureaucracy. The liberty machine should
also prevent top-down tyranny by creating a distributed network of shared information. Expert
knowledge would be grounded in data-guided policy instead of bureaucratic politics (Beer 1994).
Indeed, the liberty machine sounds a lot like the cloud, algorithmic regulation and data storage as
we understand it today. Examining Beer’s attempt to construct an actual liberty machine in the context
of political revolution—Project Cybersyn—further strengthens this comparison.
The VSM Stafford Beer developed in the 1960s is a model of the organizational structure of any
autonomous system capable of reproducing itself. A viable system is defined as any system organized
in such a way as to meet the demands of surviving in a changing environment. One of the prime
features of systems that survive is that they are adaptable. The VSM expresses a model for a viable
system, which is an abstracted cybernetic description that applies to any organization that is a viable
system and capable of autonomy (Beer 1972). The VSM was a result of Beer’s psychological research and his work on operations research. He hypothesized that there might be invariances in the behavior of individuals and that these invariances might also inform the peer group of individuals and even the total societary unit to which they belong. In the early 1950s, this theme constantly emerged in Beer’s operational research work in the steel industry. He used to refer to the
structure of “organic systems.” The quest became to know how systems are viable (Beer 1984). One
important thing to note about the cybernetic theory of organizations encapsulated in the VSM is that
viable systems are recursive. Viable systems contain viable systems that can be modeled using an
identical cybernetic description as the higher (and lower) level systems in the containment hierarchy
(Beer 1972).
What follows is a brief introduction to the cybernetic description of the organization encapsulated
in a single level of the VSM. According to Beer’s cybernetic model of any viable system, there are five
necessary and sufficient sub-systems interactively involved in any organism or organization that is
capable of maintaining its identity independently of other such organisms within a shared environment.
This “set of rules” will, therefore, apply to an organism such as a human being, or to an organization
consisting of human beings such as the state (Beer 1984). In broad terms, Systems 1–3 are concerned with the “Inside and Now” (Beer 1984) of the organization’s operations (Beer 1972). The first sub-system of
any viable system consists of those elements that produce it. These elements are themselves viable
systems (Beer 1984). Taking into account that my paper focuses on the Cybersyn project, it is useful
to look at the VSM as a state. According to Beer, at the limits, the citizens constitute System One
of the state. This hypothesis is limited by the fact that the citizens first produce communities and
companies, cities and industries, and other viable agglomerations, which are themselves all elements to
be included in the state (Beer 1984). Anytime two or more activities operate together, the possibility exists for them to get out of sync with each other or get in each other’s way. System 2 exists to coordinate common services for consistency and efficiency. It represents
the administrative channels and bodies that allow the primary activities in System 1 to communicate
with each other (Leonard 2009). System 2 allows System 3 to monitor and co-ordinate the activities
within System 1. System 3 represents the structures and controls that are put in place to establish the
rules, resources, rights and responsibilities of System 1 and to provide an interface with Systems 4 and
5 (Beer 1972).
System 4 is concerned with the “Outside and Future” (Beer 1984). The term describes strategic
responses to the effects of external, environmental and future demands on the organization. System
Four’s role is to observe the anticipated future environment and its own states of adaptiveness and act
to bring them into harmony (Leonard 2009).
System 5 is responsible for policy decisions within the organization as a whole to balance demands
from different parts of the organization and steer the organization as a whole. The “fiveness” of
the VSM was due to Beer’s efforts to establish the necessary and sufficient conditions of viability.
The number could have been different. What could not have been otherwise was the fact of the
logical closure of the viable system by System Five: only this determines an identity. Nominating
the components of System Five in any application is a profoundly difficult job because the closure
identifies self-awareness in the viable system. Beer often told the story of how President Salvador
Allende in 1972 told him that System Five, which Beer had been thinking of as Allende himself, was,
in fact, the people (Beer 1984).
In addition to the sub-systems that make up the first level of recursion, the environment is
represented in the model. The presence of the environment in the model is necessary as the domain
of action of the system. Algedonic alerts are alarms and rewards that escalate through the levels of
recursion when actual performance fails or exceeds capability, typically after a timeout (Beer 1972).
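To make the recursive structure concrete, the following is a minimal sketch in Python; all names, thresholds and the escalation rule are illustrative assumptions of mine, not taken from Beer's writings. A viable system contains further viable systems as its System 1, and an algedonic alert escalates through the levels of recursion when performance falls outside capability.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ViableSystem:
    """One node in the VSM's recursive containment hierarchy.

    Systems 2-5 are collapsed into a single capability threshold here;
    the sketch only illustrates recursion and algedonic escalation.
    """
    name: str
    capability: float                       # acceptable performance threshold
    parent: Optional["ViableSystem"] = None
    system_one: List["ViableSystem"] = field(default_factory=list)

    def embed(self, child: "ViableSystem") -> None:
        """Add a contained viable system (an element of this level's System 1)."""
        child.parent = self
        self.system_one.append(child)

    def report(self, performance: float) -> None:
        """Algedonic signal: escalate through the recursion levels
        whenever actual performance falls below capability."""
        if performance < self.capability and self.parent is not None:
            print(f"algedonic alert: {self.name} -> {self.parent.name}")
            self.parent.report(performance)

# Illustrative levels of recursion, loosely echoing Cybersyn's hierarchy:
corfo = ViableSystem("CORFO", capability=0.8)
branch = ViableSystem("textile branch", capability=0.8)
factory = ViableSystem("factory", capability=0.8)
corfo.embed(branch)
branch.embed(factory)
factory.report(0.5)   # escalates: factory -> textile branch -> CORFO
```

Every level is described by the same class, which is the sense in which viable systems contain viable systems that can be modeled with an identical cybernetic description.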
Throughout its development, the VSM was continuously tested and verified. The whole approach had its most significant and large-scale
application during 1971–1973 in Allende’s Chile (Beer 1984).
Project Cybersyn itself was an ambitious technological project tied to an ambitious political project.
It emerged in the context of Chile’s “peaceful road to socialism” (Medina 2015). Salvador Allende
had won the Chilean presidency in 1970 with a promise to build a different society. His political
program would make Chile a democratic socialist state with respect for the country’s constitution and
individual freedoms, such as freedom of speech and freedom of the press. Giving the state control
of Chile’s most important industries constituted a central plank in Allende’s platform but created
management difficulties. The government had limited experience in this area (Medina 2015).
The problem of how to manage these newly socialized enterprises led a young Chilean engineer
named Fernando Flores to contact Beer, the British cybernetician, and ask for advice. Flores worked
for the Corporación de Fomento de la Producción (CORFO), the government agency charged with the
nationalization effort. Together, Beer and Flores formed a team of Chilean and British engineers and
developed a plan for a new technological system that would improve the government’s ability to
coordinate the state-run economy (Medina 2015). There were several features of the project. The
use of the VSM was the most important one. The team identified 12 levels of recursion from the
individual worker to the country as a whole (Leonard 2009). Beer defined the Chilean state as a VSM
embedded in the “world of nations”. Project Cybersyn picked out Chilean industry as a VSM. The
Minister of Economy was equated with System 5 (Beer 1981). In practice, Cybersyn focused on the
levels of the product line, the sector, the branch and CORFO itself. For prototyping purposes, some
firms were modeled, and training was piloted for firms to provide meaningful worker information
and participation and to differentiate between their roles and knowledge bases and those of the
experts (Leonard 2009). The system would provide daily access to factory production data and a set
of computer-based tools that the government could use to anticipate future economic behavior. It
also included a futuristic operations room that would facilitate government decision-making through
conversation and better comprehension of data. Beer envisioned ways to both increase worker
participation in the economy and preserve the autonomy of factory managers, even with the expansion
of state influence. Beer gave the system the name Cybersyn in recognition of cybernetics, the scientific
principles guiding its development, and of synergy, the idea that the whole of the system was more than
the sum of its technological parts. The system worked by providing the government with up-to-date
information on production activity within the nationalized sector. Factory managers transmitted data
on the most important indices of production to the Chilean government on a daily basis. Regarding
hardware, the system relied on a national network of telex machines that connected the factories
to the central mainframe computer (Medina 2015).
These telex machines became a historical example of the early use of collaborative software or
groupware and early “internet” communication: using the system’s telex machines, the government
was able to guarantee the transport of food into the city with only about 200 trucks, overcoming the shortages caused by 40,000 striking truck drivers who blocked the access roads to Santiago in 1972.
The computer processed the production data and alerted the government agency in charge
of the nationalization effort (CORFO) if something was wrong. Project Cybersyn also included an
economic simulator, intended to give government officials an opportunity to play with different
policy alternatives and, through play, acquire a heightened sense of the relationship among the
various economic variables. It also included a futuristic operations room which was built in
downtown Santiago.
Agreeing with Eden Medina, I suggest that Cybersyn allows us to consider that algorithmic
governance can be completely different in a political, geographic and historical context different from
that of northern capitalistic states (Medina 2015). If we look at present-day forms of computerized
governance, for example, Amazon or Google, it becomes clear that technological development can be
built on very different approaches and can have different faces.
A general problem of technological development in our society is the technology-centered idea
of social change. Throughout Project Cybersyn, Beer repeatedly expressed frustration that Cybersyn
was viewed as a suite of technological fixes—an operations room, a network of telex machines, an
economic simulator or software to track production data—rather than a way to restructure Chilean
economic management. Beer was interested in understanding the system of Chilean economic
management and how government institutions might be changed to improve coordination. He viewed
technology as a way to change the internal organization of Chile’s government. Eden Medina, who
has investigated the case of Cybersyn, suggests thinking in terms of socio-technical systems instead (Medina 2015).
I would go one step further in the argument. At this point, the term social-ecological system (SES)
is appropriate. SESs are ecological systems intricately linked with and affected by one or more social
systems. Resilience is a crucial property of SES. Resilience is defined as the capacity of a system to
withstand or recover from shocks. These are complex adaptive systems characterized by cross-scale
interactions and feedback loops between ecological, social and economic components that often result
in the reorganization of these components and non-linear trajectories of change. Consistent with
Holling, hierarchies and adaptive cycles form the basis of ecosystems and social-ecological systems
across scales. Together they form a so-called panarchy that describes how a healthy system can invent
and experiment while being kept safe from factors that destabilize the system because of their nature or
excessive exuberance. Each level is allowed to operate at its own pace. At the same time, it is protected
from above by slower, larger levels but invigorated from below by faster, smaller cycles of innovation.
The whole panarchy is, therefore, creative as well as conservational. The interactions between cycles
combine learning with continuity. In this context, sustainability is an important factor. Sustainability
is defined as the capacity to create, test and maintain adaptive capability. Development in an SES
describes the process of creating, testing and maintaining opportunity. Sustainable development
combines the two and refers to the aim of fostering adaptive capabilities and creating opportunities
(Holling 2001).
Comparing the definition of an SES with Beer’s concept of a VSM, there are several similarities:
both theories focus on the stability of a dynamic system. An important question is how to manage
these systems to reach viability or resilience. Regarding the historical lessons of the Cybersyn
project, I suggest that technological development is part of a complex pattern of factors forming
a society—a human–technological complex. The role of technology in an SES is less discussed and
needs more analysis. The following is the first hypothesis of my paper: algorithmic regulation, together with cloud computing and Big Data, is part of a complex pattern of factors forming a society
(SES)—a human–technological complex.
Now I directly deal with the second important question:
2.3. Which Measures Can Weaken the Dangers of Data Storage in a Democratic Society?
Realizing that technological investigation depends on social conditions, we have to take a look at
the factors that differ concretely. If we examine technological development in our Western capitalistic
states, we face two main problems: first, we have to deal with a lack of privacy protection; second,
there is a lack of algorithmic transparency and democratic control. A democratic society or a socialist
government does not automatically guarantee responsible handling of stored data.
Responsible handling of data is not a technical problem but a social one. New
technological innovations such as smartphones, the increased use of data-driven analytics, and the
push to create smart cities and an Internet of Things all make the collection of data easier. At the same
time, these innovations permit the recording of increasing volumes of human and non-human activity.
Often, we adopt these data-gathering technologies before understanding their full ramifications.
To take an example from Great Britain: as prime minister, David Cameron tested the app Number
10 Dashboard on his iPad, which gave up-to-the-minute data about the UK’s economic and financial
health, including GDP, bank lending, jobs and property data, as well as polling data and Twitter feeds.
Polling data and Twitter feeds were defined as “political context” (Mosco 2016). One can easily imagine
that Twitter feeds did not raise the GDP, but the stored data could easily be abused for advertising or social control.
Such developments raise important questions about privacy and the extent to which we should
expect to forfeit our privacy so that an increasingly data-driven environment can function. Privacy
protection can mean the difference between a system that is centralized and abusive and one that can
protect and promote human freedom. Project Cybersyn might serve as a historical example. In critical
reflection, Beer overstated Cybersyn’s ability to promote freedom in Chile, but he did take pains to
counteract the system’s potential for abuse by including mechanisms to protect and preserve factory
autonomy. This protection was engineered into the system’s design. The government, for example,
could intervene in shop-floor activities only after the software detected a production anomaly and the
factory failed to resolve the anomaly within a set period (Medina 2015).
Let us take a look at the kind of data storage in Project Cybersyn. Beer proposed a system
called Project Cyberstride. The system would rely on data collected from state-controlled industries.
Cyberstride would use mainframe technology to make statistical predictions about future economic
behavior. The system updated these predictions daily based on new data arriving from the enterprises
(Medina 2011). Typically, this included data on raw materials and energy as well as data on worker
satisfaction. Worker satisfaction was measured by the percentage of workers present on a given
day. Operations research scientists conducted studies to determine the acceptable range of values for
each index, i.e., what would be considered normal and what would be considered cause for alarm.
Engineers from Chile and Britain developed statistical software to track the fluctuations in the index
data and signal if they were abnormal. The software also used statistical methods to predict the future
behavior of the factory and thus give government planners an early opportunity to address a potential
crisis (Medina 2015). This shows that privacy protection can be made a focus of technological design.
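As an illustration of this logic, here is a minimal sketch in Python; the moving-average smoothing, the index names and the fixed grace period are my own assumptions standing in for Cyberstride's actual statistical machinery, which was considerably more sophisticated. Each daily index is smoothed, compared against the acceptable range fixed by the operations research studies, and escalated to CORFO only if the factory fails to resolve the anomaly within the set period.

```python
from collections import deque

# Illustrative indices with acceptable ranges fixed in advance
# (the role played by the operations research studies).
ACCEPTABLE = {"absenteeism": (0.0, 0.15), "energy_use": (0.7, 1.3)}
GRACE_DAYS = 5    # assumed factory grace period before escalation
WINDOW = 7        # assumed smoothing window, in days

history = {index: deque(maxlen=WINDOW) for index in ACCEPTABLE}
days_abnormal = {index: 0 for index in ACCEPTABLE}

def process_daily_index(index: str, value: float) -> None:
    """Smooth the incoming daily value, flag it if it leaves the
    acceptable range, and escalate only after the grace period."""
    history[index].append(value)
    smoothed = sum(history[index]) / len(history[index])
    low, high = ACCEPTABLE[index]
    if low <= smoothed <= high:
        days_abnormal[index] = 0       # the factory resolved the anomaly
        return
    days_abnormal[index] += 1
    if days_abnormal[index] == 1:
        print(f"signal factory: {index} abnormal ({smoothed:.2f})")
    elif days_abnormal[index] > GRACE_DAYS:
        print(f"escalate to CORFO: {index} unresolved for "
              f"{days_abnormal[index]} days")

# Example: absenteeism drifts out of range and stays there.
for value in [0.10, 0.12, 0.30, 0.35, 0.40, 0.42, 0.45, 0.44, 0.43]:
    process_daily_index("absenteeism", value)
```

The point of the sketch is the order of events: the factory is signaled first and the center hears about the anomaly only after the factory has had its chance to act, which is how autonomy was engineered into the design.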
Another important measure to gain democratic control over technology is the development of
mechanisms for greater algorithmic transparency. Companies and government offices often couple
large datasets with forms of algorithmic decision-making whose inner workings are shielded from
public view. The use of those datasets is shielded from public view as well. We have limited knowledge
of how Facebook deals with our personal information or how Google constructs our personal filter
bubble of search results (Medina 2015). We have only a general—but not a complete—understanding
of the factors that go into our credit-rating systems. The use and coupling of datasets taken from cell
phones or computers in a criminal prosecution is a point of discussion regarding privacy protection
as well as algorithmic transparency. Medina claims the need to develop mechanisms for
greater algorithmic transparency and democratic control (Medina 2015). Obviously, research on this
point is necessary. Nevertheless, the problem of algorithmic transparency is more complex. To ensure
algorithmic transparency, the complexity of the algorithms should not exceed the potential user’s
capacity for understanding. We are still far from being able to guarantee that. The non-transparency
of software and algorithms is a problem that has existed for over 50 years; the discussion about algorithmic transparency did not begin with machine learning or deep learning. Software and algorithms
have been black boxes for over 50 years, even for programmers (Passig 2017). For Norbert Wiener,
the father of cybernetics, writing in 1960, one reason for the incomprehensibility of software was our
slowness in thinking. We can understand what a program does, but it may take so long that the
criticism comes too late (Wiener 1960). Stanislaw Lem wrote in 1964 that systems above a certain
degree of complexity are fundamentally unpredictable (Lem 1964). In 1967, Marvin Minsky called
attention to the interactions of individual processes, the proliferation of the code, and the effects of the
cooperation of several programmers (Minsky 1967). For Joseph Weizenbaum, it was the size of the
systems, the departure of the original programmers, and the passing of time, “because the systems are
a consequence of their history and history is lost” (Weizenbaum 1976).
There are several causes of this incomprehensibility. The first is bugs, that is, errors in the software and
hardware. They were no less frequent 50 years ago than they are today. We may now know more
precisely that they are inevitable. The problem cannot be solved by merely putting better people in
charge. The second cause is that code is not just what the individual programmer thinks up.
Even the simplest program relies on a set of foreign code that exceeds its own by orders of magnitude.
Embedded code libraries and frameworks often solve recurring tasks; software layers translate the
written code into a language that the processor can process; software displays, stores or transmits all
of this (Passig 2017).
Another question is what is meant by the repeatedly demanded “transparency” or “intelligibility”
of algorithms. There are different possibilities for interpreting these words. At the lowest level,
they are about whether the code is even visible. This is the case with open-source software, but usually
not in the commercial sector, regardless of whether machine learning is involved. However, suppose
the code is accessible. Does the requirement of transparency mean that it should be transparent for
laypersons in a reasonable time? This requirement can only be fulfilled with elementary code examples
(Passig 2017). Is it about the question of whether the result can be checked? There are computations
whose results are comparatively easy to check. In an introduction to scientific work from 1972, the following is
said about “programmable electronic desktop computers:” “It is important, however, that the program
is checked for accuracy. To do this, the results that the calculator prints out must be compared with
the results obtained in arithmetic with pencil and paper” (Seiffert 1972). However, as soon as the
computation becomes more complex, you cannot go far with pencil and paper.
The question of algorithmic transparency is complex and has a long history. Nevertheless, even if
it is impossible to have full transparency, it makes sense to demand more transparency. The political
environment might not change the complexity of algorithms or the fact that humans’ brains work
slower than computers, but we should not forget to ask who profits from a lack of transparency.
Here, too, Project Cybersyn offers important insights. Evgeny Morozov compares Beer’s Cybersyn
project to Michael Flowers’ suggestion of a concept of real-time data analysis allowing city agencies
to operate in a cybernetic manner (Flowers 2013). The appeal of this approach to bureaucrats would
be fairly obvious: like Beer’s central planners, they can be effective while remaining ignorant of the
causal mechanisms at play (Morozov 2014). Nevertheless, according to Medina, Beer was not ignorant
of the transparency problem. Beer believed Project Cybersyn would increase worker participation.
Workers were enabled to create the factory models that formed the basis of the Cybersyn software. This
integration allowed workers to connect intellectually to their work (Medina 2015). Beer argued that
technology could help integrate workers’ informal knowledge into the national planning process while
lessening information overload (Morozov 2014). Medina argues that it also did something else, which
Beer did not acknowledge. It gave them a way to understand how this form of data-driven regulation
worked. Theoretically, it allowed them to open up the black box of the computer and understand
the operation of the analytical processing taking place within it (Medina 2015). Nevertheless, Project
Cybersyn ended with the Pinochet coup in 1973.
We have to deal with the question of whether these measures discussed correspond with the
definition of Big Data we have seen before. Morozov called Cybersyn the “socialist origin of Big Data”
(Morozov 2014). I suggest it is not. Big Data as we know it signifies the storage of all data that it is
possible to store and to analyze. Morozov describes the method behind Big Data as follows: “[. . .] collect as much relevant data from as many sources as possible, analyze them in real time, and make
an optimal decision based on the current circumstances rather than on some idealized projection”
(Morozov 2014). Cybersyn would have relied on the same cybernetic principles (Morozov 2014). One
small problem in this context is the term “relevant”. Collecting data according to the Big Data principle
signifies obtaining all data possible, even if it appears irrelevant at first. Taking a look at the kind of
data stored and analyzed by Cybersyn described above, it becomes clear that not all kinds of data
were stored. Had it worked according to the Big Data principle, Cybersyn would have stored how often workers used the lavatory, built profiles with the help of their mobile phones, and so on. A comparison with Cameron’s app
Number 10 Dashboard makes the difference even clearer.
My second interim result is that Big Data is a product of our capitalistic society oriented toward
profits. For scientific research, we do not have to handle all the random data we get. The technological
development allows us to work with increasingly complex data. This advance is a great scientific
challenge, but the big Vs that I described are not necessarily useful for scientific analysis. Big Data
mainly works in favor of large companies using the information to sell their products, or it may be used by autocratic governments. Such data collection can only be valuable if organizations handling data support
the individuals they are addressing. In this case, the power of the individuals and democratic control
can increase considerably.
Now I am going to deal with my last question.
3. How Much Data-Driven Governance Do Democracy and Political Participation Need?
In addition to his work on Project Cybersyn, Stafford Beer asked how cybernetics might help the
state to respond quickly to the demands of the people. Given that television ownership in Chile had increased in the 1970s (from 123,000 to 500,000 sets), Beer proposed building a new form of real-time
communication via TV that would allow the people to communicate their feelings to the government
(Medina 2011). The increasing purchasing power in the first year of Allende’s government was a result
of his economic program. The government had developed a broad outline for structural reform. One
element was a Keynesian-based economic recovery plan initiated by the Minister for the Economy
Pedro Vuskovic, a sort of Chilean New Deal. It was based on the redistribution of wealth and an
attempt to partially freeze rising commodity prices (the 1970 price increase stood at 35%). If one
takes into account the salary increases introduced on 1 January 1971 and the bonuses and increases
in welfare benefits, the salaries of the lowest-paid workers and peasants may have risen by as much
as 100%. In consequence, a spending fever hit the lowest income groups, and industrial production
suddenly took off again (production increased by 10% a year in 1971 and 1972). Commercial activity
revived and unemployment dropped off. Nevertheless, all of these reforms provoked dissent, which
slowed down the social reconstruction of the country (Compagnon 2004).
Beer called his system Project Cyberfolk. He proposed building a series of so-called algedonic meters capable of measuring, in real time, how happy Chileans were with their government. These algedonic meters would not ask questions. The users simply moved a pointer on a
dial somewhere between total dissatisfaction and total happiness. Users could construct their own scale of happiness. This design reflected Beer’s attention to autonomy and broad participation.
These meters could be installed in any location with a television set. Cyberfolk was never realized. Beer
commissioned several prototype meters and used them in small-group experiments (Medina 2011).
Beyond the basic problem that 500,000 television sets in Chile were still a low number, it is easy to
imagine how a government could abuse such devices or how partisan groups might manipulate them.
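As a thought experiment only, the following sketch shows how such an algedonic channel might aggregate dial readings; it is entirely my reconstruction, since no Cyberfolk software survives to draw on. Each meter reports nothing but a pointer position, and the government sees a single aggregated mood value rather than any individual's answers, because no questions are asked.

```python
from statistics import mean

class AlgedonicChannel:
    """Aggregates raw dial positions (0.0 = total dissatisfaction,
    1.0 = total happiness) into one real-time mood signal.
    No questions are asked and no per-user records are kept."""

    def __init__(self) -> None:
        self.readings: list[float] = []

    def register(self, dial_position: float) -> None:
        # Each user constructs their own scale; we only clamp the range.
        self.readings.append(min(1.0, max(0.0, dial_position)))

    def national_mood(self) -> float:
        return mean(self.readings) if self.readings else 0.5

channel = AlgedonicChannel()
for reading in (0.9, 0.4, 0.7, 0.2):   # four household meters
    channel.register(reading)
print(f"aggregate mood: {channel.national_mood():.2f}")   # prints 0.55
```

The design choice worth noting is that the channel keeps no identities at all, which is consistent with Beer's stated attention to autonomy, though it would not by itself prevent the abuses discussed above.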
Looking at political uses of today’s cloud, it seems to have little to do with practicing democracy,
civic participation or activism. As an example, I refer to Barack Obama’s campaign’s use of
cloud computing and Big Data analyses in the 2012 presidential election to identify potential voters and deliver enough of them to the polls, exceeding many pundits’ expectations. The campaign built
more than 200 apps that ran on Amazon Web Services (Mosco 2016).
Obviously, political participation does not need Big Data. Nevertheless, as cybernetic history
shows, in a human–technological complex, technical measures and cloud computing can be useful to
support civic participation and activism. As an example of civic participation, I chose the participatory
budget (PB) in Brazil. I chose this example because there are some political similarities to the case of
Chile on a local level. Of course, a direct comparison is not possible. The establishment of the PB in
Porto Alegre was an example of a process of re-democratization that overtook Latin America in the
1980s. Many countries shared a common agenda regarding democracy and its institutions: they were
struggling to build or rebuild their democratic institutions with an agenda that focused mainly on
fighting corruption, improving access to government and strengthening governmental accountability.
These experiences are unusually diverse in countries characterized by deep-rooted political, social,
economic and regional disparities such as Brazil (Souza 2001). Nevertheless, in both cases we are dealing with the election of a socialist party and limited economic resources.
Brazil is an example of both re-democratization and decentralization. Regarding participation,
the 1988 constitution provided several mechanisms that allowed grassroots movements to take part in
some decisions and to oversee public matters, especially at the local level (Souza 2001). The financial
autonomy afforded to Brazilian municipalities under the 1988 constitution and the spoils system meant
that the mayor had discretion over a significant and guaranteed resource stream and was allowed
to make strategic senior appointments to support the development of participatory governance. PB
challenged the traditional role of the city councilors and the legislative branch (Cabannes 2004).
The PB was established in Porto Alegre by the incoming Workers’ Party mayor who had the
explicit intention of designing a participatory process that challenged the clientelism and corruption
endemic within Brazilian political culture and legitimized redistributive policies. In the 1990s, the
city of Porto Alegre in southern Brazil won international renown with its innovative policies. The
centerpiece of the new policy involved the population in planning the city budget: participatory
budgeting. One has to keep in mind that it was not only Porto Alegre, or other cities governed by
leftist parties, that embarked on a policy of increasing local revenue but also several municipalities
across Brazil (Souza 2001).
The main functions of the PB, especially in Porto Alegre or Belo Horizonte, are described
and analyzed by various authors (Sánchez 2002; Fedozzi 2001). Practices differed in cities and
communities over the years, but I regard as important for my study some common elements of direct
and representative democracy, as follows:
- Citizens received basic information about the city budget in meetings at the district level.
- Delegates selected from the people attending these meetings drew up a list of priorities for projects in the forthcoming budget, in consultation with the general public and the administration.
- The next step was for all those taking part to vote on assigning priorities to projects and to elect two delegates from each district to the Conselho do Orçamento Participativo (COP).
- Alongside the district assemblies, issue-related assemblies were also set up to handle city-wide topics; these nominated two delegates each to the COP, too.
- Following the directives from the district and issue-related assemblies, the COP drew up a draft budget plus investment plan and submitted these to the city council for assessment and final decision.
- The COP was also responsible for working out the rules for the process of planning the next budget. These rules incorporated allocation formulae developed specially to ensure even treatment of poor and rich districts. The structures of the participatory budget were largely developed autonomously; the intention was to revise these structures year by year.
One might ask what this example of civic activism, which demands face-to-face meetings, has
to do with the human–technological complex. First, we have to consider that personal meetings are
absolutely necessary for political activism and democratic life. However, in the last few years, so-called
digital participatory budgets were installed in different cities. Currently, some of the municipalities in
Brazil are going beyond the traditional processes of participation such as the PB and have started using
information and communication technologies (ICTs) such as the internet in these local participatory
processes, so-called digital participatory budgeting (DPB). DPB is an online space for discussions with
society on local budget-allocation issues and priorities. Such platforms exist both as an integral part of
face-to-face participatory budgeting as well as exclusively digital experiences (Matheus et al. 2010).
Digital participatory budgeting is more recent and, until 2016, was less frequently used in Brazil when
compared to face-to-face participatory budgeting. The first experiments took place in 2001 in the city
of Porto Alegre, Rio Grande do Sul, and in Ipatinga, Minas Gerais. Data from 2014 shows that 37 such
platforms were operating in Brazil at the time. Even if digital inequality varies across Latin American
countries and within different areas of those countries, innovations based on ICT tools have been
growing all over the region. Many scholars promote e-participation as very positive. The Innovations
for Democracy in Latin America (LATINNO) Project at the WZB Berlin Social Science Center, which
investigates democratic innovations that have evolved in 18 countries across Latin America since 1990,
has just completed a study of new digital institutional designs that promote e-participation aimed
at improving democracy. The LATINNO project is the first comprehensive and systematic source
of data on new forms of citizen participation evolving in Latin America. The research focused on
Brazil, Colombia, Mexico and Peru—countries with different political and social backgrounds, varied
population sizes and different levels of internet connectivity. The findings not only disclose common
patterns among these countries but also indicate trends that may reveal how digital democracy may
evolve in the region in coming years. Taken together, Brazil, Colombia, Mexico, and Peru created 206
innovations for e-participation between 2000 and 2016, 141 of which are still active in 2017. Brazil
and Mexico are the countries with the highest number of digital innovations (73 and 71, respectively),
followed by Colombia (32) and Peru (30). Given that Brazil and Mexico are both large countries with
large populations, one could expect their number of digital innovations to be higher. The number of
innovations created in 2015 is five times higher than the number implemented in 2010; about 90% of all cases were initiated after 2010. Although most new digital spaces for e-participation
are recent and new technologies quickly become old, the research group argues that innovations have
overall been demonstrating a reasonable level of sustainability (Pogrebinschi 2017). At this point, I
see another commonality with Project Cybersyn. Stafford Beer and his team had to work with only
one mainframe computer and a network of telex machines. Sustainability must be one of the main focuses in IT
development to ensure responsible handling of data. In this context, more thought has to be given
toward how we can extend the life of older technologies (Medina 2015).
The process of digital democratic innovation has intimately connected the concepts of electronic governance and electronic government, a theme still under construction owing to the novelty of the subject and area of study. The boundaries between these concepts are disputed: what some authors call electronic governance, others understand as electronic government, and vice versa.
Regardless of the theoretical discussions that are underway, electronic government and electronic
governance are widely seen as a tool with many possibilities to support changes in government
and even the transformation of society itself. There are fundamental differences between electronic
government and electronic governance. The concept of electronic government largely concerns the government’s relationship with society regarding the provision of services via electronic tools. These services might include the use of the internet, the intensive application of information technology in service delivery, and the electronic brokerage of the relationship between governments and citizens. That is, electronic government often focuses more on providing information than on interaction between government and society. The concept of electronic governance goes beyond the electronic service delivery and technological improvements in public administration that several authors describe for electronic government. According to Matheus, Ribeiro,
Vaz and de Souza, electronic governance extends to a broader issue. Electronic governance concerns
the relationship between citizens and the governmental body in the current context of democratization.
Governance has several implications, being understood as the state capacity to implement, efficiently
and effectively, the decisions that are taken. So, the concept of governance refers to decision-making
processes and government action, and its relationship with citizens. This relationship is based on
control and accountability, transparency of data and information, and participation in decision-making
both in the formulation and monitoring phases of public policies (Matheus et al. 2010).
Looking back at project Cybersyn again, I suggest that Stafford Beer explored electronic
government as well as electronic governance without being aware of these terms. Cybersyn would
have offered a service of providing information as well as the possibility of workers’ participation.
Cyberfolk, far from perfect, can be seen as an early concept of electronic governance.
Electronic governance in Brazil is somewhat troubled. Large parts of society, owing to social and cultural conditions, live without the benefits of technology. One has to consider the lack of infrastructure and services for those citizens who cannot pay (another similarity to the case of Chile). Nevertheless, in some areas technological feasibility, resources and skilled workers make such services possible, and deep social inequality is confronted by the power of social movements and civil society organizations (Matheus et al. 2010).
A case study from Matheus, Ribeiro, Vaz and de Souza presented the use of ICTs in participatory
budgeting on the following occasions: to monitor the participatory budget; to collect the proposals
that would be taken for voting in participatory budgeting; and to conduct the participatory budgeting vote
(Matheus et al. 2010).
As shown in Table 1, different cities use different methods of monitoring the PB via the internet.
The earliest digital measures of monitoring PBs can be found in the city of Porto Alegre, Brazil. Using
the internet to monitor the PB supports the promotion of social control. Besides, by allowing searches of the site’s database, PB monitoring enabled citizens to follow the process through information sent by email to the government via the PB site. According to the authors, the use of ICTs expands the number of people tracking and monitoring, since local citizens, PB representatives and community leaders monitor in person and disseminate information via the internet so that everyone has access to the progress of the PB. Only the city of Ipatinga was using the internet
to gather proposals for the participatory budgeting vote. In the other places, decisions are still taken
by the popular assemblies (Porto Alegre, Recife, Bella Vista and Miraflores) or by the city hall (Belo
Horizonte) (Matheus et al. 2010).
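Technically, the monitoring practice described above amounts to a municipality publishing structured progress records that anyone can search. The short sketch below illustrates that idea; the field names and sample data are hypothetical and are not drawn from the actual Porto Alegre system.

```python
# A sketch of internet-based PB monitoring: the municipality publishes
# structured progress records and citizens search them. The fields and
# sample data below are hypothetical illustrations.

pb_records = [
    {"district": "Centro",   "project": "school repair", "status": "in progress"},
    {"district": "Restinga", "project": "street paving", "status": "completed"},
]

def search_records(records, keyword):
    """Return all records whose fields mention the keyword (case-insensitive)."""
    kw = keyword.lower()
    return [r for r in records if any(kw in str(v).lower() for v in r.values())]

# Example: a citizen checks on paving works.
for hit in search_records(pb_records, "paving"):
    print(hit["district"], "-", hit["project"], "-", hit["status"])
```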
The reactions of citizens to ICT vary widely. In their 2014 analysis of the DPB experience in Belo
Horizonte, Júlio Cesar Andrade de Abreu and José Antonio Gomes de Pinho conclude that
democratic sentiments and the meaning of e-participation ranged from strong impulses of democratic
hope to disbelief in digital participation. The authors analyzed three editions of DPB in Belo
Horizonte, from 2006, 2008 and 2011. During this period, messages and manifestations of the
participating citizens were collected (De Abreu and de Pinho 2014).
This case study supports my last hypothesis: in an SES, groupware, cloud computing, or technical
development in general can support democratic participation if it is seen as part of a socio-technological
complex. Nevertheless, it may also be abused. Even more than Project Cybersyn, the PB and DPB and
their role in political participation show the complex pattern of political activism formed by
face-to-face assemblies, written papers, demonstrations, bureaucracy and technology. Looking at the
data collected in the different forms of digital participatory budgeting, the importance of responsible
handling of data has to be highlighted again. The datasets regarding PB might be "big", but they
contain little incidental data: monitoring the PB, and even online voting, needs neither the
participants' fitness-tracker data nor their Twitter feeds. Democratic participation might be
strengthened by technology, but never by Big Data in the sense of collecting everything possible.
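This data-minimization point can be stated in code: a DPB ballot needs only an anonymous token and the chosen proposal, nothing else. The sketch below is a hypothetical illustration of that principle under assumed names (Ballot, tally, the token scheme), not a description of any deployed system.

```python
# A sketch of data minimization in DPB online voting: a ballot carries only
# an anonymous single-use token and the chosen proposal. The token scheme
# and all names here are hypothetical.

from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Ballot:
    voter_token: str   # anonymous token; no name, location, or profile data
    proposal_id: str   # the only substantive datum the tally needs

def tally(ballots):
    """Count one vote per token; duplicate tokens are ignored."""
    seen, counts = set(), Counter()
    for b in ballots:
        if b.voter_token not in seen:
            seen.add(b.voter_token)
            counts[b.proposal_id] += 1
    return counts

votes = [Ballot("t1", "park"), Ballot("t2", "library"), Ballot("t1", "park")]
print(tally(votes))  # -> Counter({'park': 1, 'library': 1})
```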
Table 1. Cases of digital participatory budgeting (DPB).

| Case | Practice | Implication | City | Experience |
| --- | --- | --- | --- | --- |
| Monitoring of participatory budget (PB) | Municipality uses the internet to provide information | Social control, transparency | Porto Alegre | The experience started in 1995 |
| Collection of demands on the internet | People can indicate priorities of the PB via the internet | Motivation of people to participate | Ipatinga | The experience started in 2001 |
| Online voting I | Online voting is part of the traditional PB | Improvement of different kinds of participation | Recife | Since 2007 people have been able to vote by internet for the traditional PB |
| Online voting II | The online process is separated from the traditional PB | Improvement of different kinds of participation | Belo Horizonte | Around 25% of participants have voted by internet since 2006 |

Adapted from (Matheus et al. 2010).
4. Conclusions
Big Data, cloud computing and algorithmic regulation are not merely forms of data collection or
technological components. Discussions about technological developments should not be reduced to
technology: it is not about single technological components but about the social, political, economic
and technical together, about the role of technology in the complex pattern of society. SES theory
might help in interpreting this complex pattern. Ecological systems are linked not only to
social systems but to socio-technological systems. Technology can be seen as a factor that supports
or weakens the resilience of a socio-ecological system (SES) while it is itself influenced by other
factors of the SES. Looking at SES theory, it is important to keep in mind that society cannot be
reduced to a management system. I suggest this as a critical point regarding SES theory as well as
Beer's thinking of management as a system.
Cybersyn serves as a historical example of how algorithmic regulation can take shape in societies
other than northern capitalist states. Nevertheless, it had limitations. Planned to support companies and
workers’ participation, it may have disempowered workers by coding their knowledge in software
used by the central state (Medina 2015). Looking at Cyberfolk, which was never realized, there are
even more contradictions. It is easy to imagine how Cyberfolk could collect information from people
regarding Big Data defined above. Cyberfolk also leaves out the most important components of
political life and participation: communication, face-to-face meetings, and assemblies.
According to Judith Butler, assemblies of physical bodies have an expressive dimension that is
not reducible to speech, since their physical presence affects the outcome of their gatherings. Butler links
assembly with precarity by pointing out that a body suffering under conditions of precarity persists and
resists, and that mobilization brings out this dual dimension of corporeal life, just as assemblies make
visible and audible the bodies that require basic freedoms of movement and association. By enacting a
form of radical solidarity in opposition to political and economic forces, a new sense of "the people"
emerges that is interdependent, capable of grievances, precarious and persistent (Butler 2015). No
doubt technological inventions cannot replace face-to-face assemblies; no virtual space can replace real
space. A system like Cyberfolk would never have managed the political life of Chileans. Nevertheless,
information and communication technology can support citizens' movements. The case study of the
PBs and DPB may serve as an example of the interconnection of assemblies and information and
communication technologies. There is no central system regulating the PB; there are different technical
measures in the PBs, resulting in different forms of participation that support the discussions and voting
of the citizens. One might argue that, for economic reasons, technical equipment is not available to many
people. It is important to keep in mind that a predominance of digital innovation in participation
processes might deepen social division; this is the face of digital participation in capitalist societies.
Nevertheless, digital and technical support of face-to-face assemblies offers access to information
and possibilities for participation to many people. Placed in a socio-technological system, technical
innovation might be an important measure to support political activism and democratic participation.
Conflicts of Interest: The author declares no conflict of interest.
References
Beer, Stafford. 1972. Brain of the Firm. London: The Penguin Press.
Beer, Stafford. 1981. Brain of the Firm, 2nd ed. Chichester, New York, Brisbane and Toronto: John Wiley & Sons.
Beer, Stafford. 1984. The Viable System Model: Its Provenance, Development, Methodology and Pathology.
The Journal of the Operational Research Society 35: 7–25. [CrossRef]
Beer, Stafford. 1994. Designing Freedom. Chichester: Wiley.
Butler, Judith. 2015. Notes toward a Performative Theory of Assembly. Cambridge: Harvard University Press.
Cabannes, Yves. 2004. Participatory Budgeting: A Significant Contribution to Participatory Democracy.
Environment and Urbanization 16: 27–46. [CrossRef]
Compagnon, Olivier. 2004. Popular Unity: Chile, 1970–1973. In Encyclopedia of Labor History Worldwide. St. James
Press. Available online: https://halshs.archives-ouvertes.fr/halshs-00133348 (accessed on 5 December 2017).
De Abreu, Júlio Cesar Andrade, and José Antonio Gomes de Pinho. 2014. Sentidos e significados da participação
democrática através da Internet: Uma análise da experiência do Orçamento Participativo Digital. Revista de
Administração Pública 48: 821–46. [CrossRef]
Elgendy, Nada, and Ahmed Elragal. 2014. Big Data Analytics: A Literature Review Paper. Paper presented at
Advances in Data Mining. Applications and Theoretical Aspects: 14th Industrial Conference (ICDM 2014),
St. Petersburg, Russia, July 16–20. Lecture Notes in Computer Science. [CrossRef]
Espejo, Raul, and Alfonso Reyes. 2011. Organizational Systems: Managing Complexity with the Viable System Model.
Berlin and Heidelberg: Springer Science & Business Media.
Fedozzi, Luciano. 2001. Orçamento Participativo: Reflexões sobre a experiência de Porto Alegre. Porto Alegre:
Tomo Editorial.
Flowers, Michael. 2013. Beyond Open Data: The Data-Driven City. In Beyond Transparency: Open Data and the
Future of Civic Innovation. Edited by Brett Goldstein and Lauren Dyson. San Francisco: Code for America
Press, pp. 185–99.
Holling, Crawford S. 2001. Understanding the Complexity of Economic, Ecological, and Social Systems. Ecosystems
4: 390–405. [CrossRef]
Laney, Douglas. 2001. 3D Data Management: Controlling Data Volume, Velocity, and Variety. In Application
Delivery Strategies. Stamford: META Group.
Lem, Stanislaw. 1964. Summa Technologiae. First published in 1964. Minneapolis: University of Minnesota Press.
Leonard, Allenna. 2002. Stafford Beer: The Father of Management Cybernetics. Cybernetics and Human Knowing 9:
133–36.
Leonard, Allenna. 2009. The Viable System Model and Its Application to Complex Organizations. Systemic Practice
and Action Research 22: 223–33. [CrossRef]
Magoulas, Roger, and Ben Lorica. 2009. Introduction to Big Data. Release 2.0; Sebastopol: O’Reilly Media.
Matheus, Ricardo, Manuella Maia Ribeiro, Jose Vaz, and Cesar de Souza. 2010. Case Studies of Digital Participatory
Budgeting in Latin America: Models for Citizen Engagement. Paper presented at 4th International
Conference on Theory and Practice of Electronic Governance (ICEGOV 2010), Beijing, China, October
25–28.
Medina, Eden. 2011. Cybernetic Revolutionaries: Technology and Politics in Allende’s Chile. Cambridge: MIT Press.
Medina, Eden. 2015. Rethinking Algorithmic Regulation. Kybernetes 44: 1005–19. [CrossRef]
Minsky, Marvin. 1967. Why Programming Is a Good Medium for Expressing Poorly Understood and
Sloppily Formulated Ideas. In Design and Planning II. Computers in Design and Communication. Edited
by Martin Krampen and Peter Seitz. New York: Hastings House, pp. 117–21.
Morozov, Evgeny. 2014. Project Cybersyn and the Origins of the Big Data Nation. 3QuarksDaily, October 13.
Mosco, Vincent. 2016. To the Cloud: Big Data in a Turbulent World. New York: Routledge.
O’Reilly, Tim. 2013. Open Data and Algorithmic Regulation. In Beyond Transparency: Open Data and the Future
of Civic Innovation. Edited by Brett Goldstein and Lauren Dyson. San Francisco: Code for America Press,
pp. 289–300.
Passig, Kathrin. 2017. Fünfzig Jahre Black Box. Merkur, November 23.
Pogrebinschi, Thamy. 2017. Digital Innovation in Latin America: How Brazil, Colombia, Mexico, and Peru Have
Been Experimenting with E-Participation. Democracy and Democratization (Blog). June 7. Available online:
https://democracy.blog.wzb.eu/tag/e-democracy/ (accessed on 5 December 2017).
Sánchez, Félix Ruiz. 2002. Orçamento participativo: Teoria e practica. São Paulo: Cortez.
Seiffert, Helmut. 1972. Einführung in das Wissenschaftliche Arbeiten. Braunschweig: Vieweg.
Souza, Celina. 2001. Participatory Budgeting in Brazilian Cities: Limits and Possibilities in Building Democratic
Institutions. Environment and Urbanization 13: 159–84. [CrossRef]
Ward, Jonathan Stuart, and Adam Barker. 2013. Undefined By Data: A Survey of Big Data Definitions. arXiv.
Weizenbaum, Joseph. 1976. Computer Power and Human Reason: From Judgment to Calculation. New York: W.H.
Freeman & Co. Ltd.
Wiener, Norbert. 1948. Cybernetics, or Control and Communication in the Animal and the Machine. Cambridge:
MIT Press.
Wiener, Norbert. 1960. Some Moral and Technical Consequences of Automation. Science 131: 1355–58. [CrossRef]
[PubMed]
© 2018 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article
distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license
(http://creativecommons.org/licenses/by/4.0/).