Algorithmic Overdependence: Fostering Awareness through Digital Facilitation and (Re-) Construction


Abstract

In: Challenging Organisations and Society, Vol. 9, No. 1, 1541-1557, 2020.
Beware of Art: ARTificial Intelligence
Challenging Organizations and Society
Edited by Claudia Schnugg and Andrea Schueller
2020, Volume 9, Issue 1
reflective hybrids®

Claudia Schnugg, Andrea Schueller
Beware of Art: ARTificial Intelligence
Challenging Organizations and Society
page 1436
Elena Raviola
Artificial Intelligence and Creative
Work: Practice and Judgement,
Organizing and Structuring
page 1442
Elisabetta Jochim
The Opportunities of Artificial
Intelligence and Art for Creativity
and Society
page 1460
Claudia Schnugg
Collaborations of Art, Science
and Technology: Creating Future
Realities with Art and A.I.
page 1473
Sougwen Chung interviewed by
A reflection on Art, Artificial
Intelligence and Robots in Society
page 1492
Andrea Schueller
Fragments of the Future: Identity,
Art and the Artificial
page 1499
Paola Michela Mineo and Andrea Schueller in
Fragments as Media of Time
page 1531
Christian Stary, Claudia Schnugg
Algorithmic Overdependence:
Fostering Awareness through Digital
Facilitation and (Re-)Construction
page 1541
Johannes Braumann interviewed by
Why didn’t you stay until Sunday’s
page 1558
About the Authors
page 1562
Journal “Challenging Organisations and Society . reflective hybrids® (COS)”
COS is the first journal to be dedicated to the rapidly growing require-
ments of reflective hybrids in our complex 21st-century organisations
and society. Its international and multidisciplinary approaches balance
theory and practice and show a wide range of perspectives in and
between organisations and society. Being global and diverse in thinking
and acting outside the box are the targets for its authors and readers in
management, consulting and science.
Editor-in-Chief: Maria Spindler (AT)
Deputy Editors-in-Chief: Gary Wagenheim (CA), Tonnie van der Zouwen (NL)
Editorial Board: Ann Feyerherm (US), Ilse Schrittesser (AT), Maria Spindler (AT),
Chris Stary (AT), Gary Wagenheim (CA), Nancy Wallis (US), Tonnie van der
Zouwen (NL)
Guest Editors: Tom Brown (CA), Andrea Schueller (AT), Claudia Schnugg (AT)
Reviewers: François Breuer, Tom Brown, Silvia Ettl Huber, Jeff Haldeman, Ann
Feyerherm, Russell Kerkhoven, Larissa Krainer, Marlies Lenglachner,
Ruth Lerchster, Barbara Lesjak, Annette Ostendorf, Richard Pircher,
Ilse Schrittesser, Claudia Schuchard, Andrea Schüller, Maria Spindler,
Christian Stary, Martin Steger, Thomas Stephenson, Martina Ukowitz,
Gary Wagenheim, Nancy Wallis, Tonnie van der Zouwen
Proofreading: Deborah Starkey
Terms of Publication: Before publication authors are requested to assign copy-
right to “Challenging Organisations and Society . reflective hybrids®”.
Beginning one year after initial publication in “Challenging Organisations
and Society . reflective hybrids®” authors have the right to reuse their
papers in other publications. Authors are responsible for obtaining per-
mission from copyright holders for reproducing any illustrations, figures,
tables, etc. previously published elsewhere. Authors will receive an
e-mailed proof of their articles and a copy of the final version.
Disclaimer: The authors, editors, and the publisher take no legal responsibility
for errors or omissions that may be made in this issue. The publisher
makes no warranty, expressed or implied, regarding the material con-
tained herein.
Copyright: COS . reflective hybrids®, Vienna 2020
Christian Stary, Claudia Schnugg | Algorithmic Overdependence
Challenging Organisations and Society
Christian Stary, Claudia Schnugg
Algorithmic Overdependence: Fostering Awareness
through Digital Facilitation and (Re-)Construction
This contribution intends to raise awareness of connectedness in the continuous digitalization of society and organizations. It suggests points of reflection when being tracked by Internet-of-Things systems, which in turn encourage or discourage behavior. The question arises: How much digital facilitation is necessary and when does algorithmic overdependence dominate? Concerns related to the invasive expansion of digital technologies and their 'smartness' (through algorithms and artificial decision-making) to direct behaviors of all kinds can be represented and experienced by art installations. We suggest promoting constructive awareness by offering a scenario in such an installation. It allows subjects to experience algorithmic influence and subsequently encourages regaining control through individual capacity building for individually coherent (and transparent) design. The proposed installation enables new forms of governance based on experiential learning and digital artefacts for personal mastery of collective intelligence.
Keywords: Internet of Behavior, predictive analytics, artificial decision-making, behavior control, governance, opacity, digital literacy, design-integrated engineering, citizen participation, accountability
1. Caught in the Web of Behavior Due to Digital Intelligence?
According to the renowned research and advisory company Gartner, by 2023 individual activities will be tracked digitally by an "Internet of Behavior" to influence benefit and service eligibility for 40% of people worldwide. This Internet of Behavior will link a person digitally to their actions. For example, linking an image as documented by facial recognition with an activity such as purchasing a drink from a vending machine can be tracked digitally. The resulting understanding of an individual's or group's behavior can profit various actors. For example, not only will vending machine providers and drink producers track individuals' behavior for arranging their offering model; insurance companies will also track individuals' behavior for determining a corresponding pricing model.
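The linking of a recognized identity to tracked actions can be illustrated with a minimal sketch. All names, event sources, and query methods below are our own illustrative constructs, not part of any Gartner specification:

```python
from dataclasses import dataclass, field

@dataclass
class BehaviorProfile:
    """Toy 'Internet of Behavior' record linking a person to observed actions."""
    person_id: str
    events: list = field(default_factory=list)

    def track(self, source: str, action: str) -> None:
        # e.g. source="facial_recognition", action="bought_drink:cola"
        self.events.append((source, action))

    def actions_from(self, source: str) -> list:
        # Everything a given sensing channel has recorded about this person
        return [a for s, a in self.events if s == source]

profile = BehaviorProfile("person-042")
profile.track("facial_recognition", "entered_lobby")
profile.track("vending_machine", "bought_drink:cola")
profile.track("vending_machine", "bought_drink:cola")
# A drink producer or insurer could now query the linked behavior:
preferred = profile.actions_from("vending_machine")
```

Once actions are linked to a person in this way, any actor with access to the profile can derive preferences or pricing signals from it.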
Tracking and applying knowledge about behavior will not only link individuals to their preferred actions and always provide their preferred drinks in the vending machine, nor only help companies to create the best pricing models. The Internet of Behavior can also be used to encourage or discourage behavior. Algorithmic processing enables the navigation and synthesis of large amounts of data. By identifying patterns in all the data, it allows conclusions to be drawn in a time-efficient way: the tailored efficiency of algorithms can shift attention to a limited choice. For instance, observing an individual's behavior in a certain situation might provide the best inference results for similar situations but miss other opportunities when algorithms do not capture alternative viewpoints on activities. Results may differ among those individuals an algorithm deems to have certain properties, e.g., walking on the right side of a street. Contextual information, such as the street environment or the country of origin, can shed light on the meaning of artificial conclusions. 'The danger of such reliance on algorithms is that, despite the benefits and assumption that algorithms are efficient, logical, and data-driven and therefore unbiased, algorithms are not infallible and oftentimes carry biases of their own or of their creators' (Wei et al., 2017, p. 3).
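How pattern-based inference narrows the choice set can be shown with a deliberately naive toy. The `infer_next_action` helper is our illustrative construct, not an algorithm from the cited literature:

```python
from collections import Counter

def infer_next_action(observed_actions):
    """Naive pattern inference: predict whatever was seen most often.
    Time-efficient, but it can never propose an action outside the
    observed data, so alternative viewpoints simply disappear from
    the choice set -- and any bias in the data is reproduced."""
    if not observed_actions:
        return None
    return Counter(observed_actions).most_common(1)[0][0]

history = ["walk_right", "walk_right", "walk_left", "walk_right"]
prediction = infer_next_action(history)  # always the dominant pattern
```

The minority behavior ("walk_left") is never predicted, however legitimate it may be in context.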
Once providers or users depend too much on algorithm-generated information, they become algorithmically overdependent (Banker & Khetani, 2019). Then, information generation and exchange are increasingly handled autonomously by digital systems, leading humans to give up control unwittingly and lose track of process steps they are held accountable for.
The technological drivers of such developments are the Internet-based communication system and the increasing set of objects that can be connected, mutually and with people, by utilizing the Internet protocol stack. Internet technology operated as part of so-called Internet-of-Things (IoT) applications allows the connection not only of people but also of physical objects of various kinds. It enables the integration of various sensors, actuators, and objects so they can communicate directly with one another without human intervention (cf. Lin et al., 2017). Such digitalized objects can track the state of humans and their level of awareness (of their environment), and guide them to achieve their objectives, such as finding a location. Others can intervene in certain situations, e.g., stopping a car to prevent an accident. Information is collected by sensors and combined to trigger either actuator or human behavior. Originally passive or observing elements such as sensors can become active ones (Shaev, 2014).
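The sensor-to-actuator loop described above, operating without human intervention, might be sketched as follows. The function name, thresholds, and actuator commands are illustrative assumptions, not taken from any cited system:

```python
def car_controller(distance_m: float, speed_kmh: float) -> str:
    """Toy sensor-to-actuator rule: a distance sensor reading triggers
    the brake actuator directly, with no human in the loop
    (cf. the accident-prevention example in the text)."""
    if speed_kmh > 0 and distance_m < 5.0:
        return "brake"   # actuator command
    return "cruise"

# Sensor readings flow straight into actuator decisions:
commands = [car_controller(d, 50.0) for d in (20.0, 10.0, 4.0)]
```

The human only experiences the outcome; the decision itself happens entirely between sensor and actuator.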
The enrichment of communication and interaction between humans and artefacts of all kinds, linked with IoT-based networking developments, converges physical, digital and virtual elements 'to create smart environments that make energy, transport, cities and many other areas more intelligent' (Vermesan et al., 2013, p. 8). This intelligence is based on algorithmic decision-making tools, which are increasingly used by government and private bodies. Artificial decision-making is applied to various forms of data, often relying on the algorithmic analysis of personal information. As a result, a new wave of policy concerns has emerged (Zarsky, 2015), questioning the legitimacy of algorithmic decision-making and asking for accountability (cf. Binns, 2018; Hutchens et al., 2017; Bovens et al., 2014). So far, these concerns have not been satisfactorily resolved, although individual notice and consent has become commonplace – cf. the video camera sign when entering public places like train stations. Rather, troubling implications for democracy and human flourishing are expected when the self-interests of companies or public bodies determine the use of collected data and oversee their future use (Yeung, 2017).
Overdependence has nurtured the discussion of a scored society (cf. Citron & Pasquale, 2014) and algocracy (cf. Danaher, 2016). It underlines the need to understand overdependence on an analytical level, while on the basic level numerous reactions and actions occur to the detriment of individuals. On the analytical level, the nature of these concerns is linked to the way decision-making relies on biased and inaccurate datasets, the opacity of applied algorithms, the lack of thorough review, and/or the lack of opportunities to intervene from a design perspective.
An awareness of this analytical understanding of overdependence needs to be raised among all actor groups. We elaborate on algorithmic data processing and on (re)gaining control through active design intervention, and introduce the concept of an experiential learning support installation. By creating experiences, artistic installations in particular can help different audience groups to understand complex theoretical ideas and explore technological concepts through sensemaking construction (Schnugg, 2019). Entering the proposed installation provides a feeling of getting 'caught' in the Internet of Behavior by conveying a sense of opacity and lack of transparency. Based on data measured with IoT components interlinked with decision-making algorithms, the scenario also physically limits the person passing through. At the end, the results of the algorithmic interpretation of behavior, together with prescriptions, are generated and handed out to each person. These results are expected to trigger a demand for (re)gaining control of IoT system behavior. Hence, a second part of the installation includes a novel governance scheme with digital design facilities that allow for learning and exploration. In this way, the feeling of oppression is modulated towards actively engineering IoT spaces.
2. Algorithmic Overdependence – Opening Space for Intervention and
Algorithms form the core of machine intelligence, since their processing leads to computer-interpreted data and decisions. Those can be used to influence human behavior and to direct human coexistence. A recent example concerns social relations, which undergo significant changes in everyday life and sociality due to the pervasive and perpetual mediated presence of friends via social media (cf. Thulin et al., 2020): Not only the emergence of novel constraints of coupling with other interactors (e.g., becoming 'friends') and the recoupling of social interaction can be observed, but also modified rhythms of interaction in terms of increasing frequency and insistency. Both finally affect human foreground activities due to the continuous stream of online contacts, including their structuring. Such 'domestication' processes of digital media are based on role shifts. Individuals shift from being passive receivers and consumers of technology to highly active interactors. Novel forms of (social) networking driven by interactors' behavior shape technology's meanings, functions, and representations. The material artefact and its algorithmic capabilities shape individuals' sensemaking of digital systems, as well as how their actions affect individual sensemaking (Mesgari & Okoli, 2019).
But do algorithms incorporate these factors and categories of information? Wayingwe (2019, p. 6) explains:

'Conclusively, algorithms intend to present an avid manner in which artificial intelligence skills could be applied in organizational decision-making sections. However, its actual use to guarantee improvements in consideration to those who are both directly and indirectly affected by the resultant decisions is inevitably jeopardized by the variations in considerable factors so as to ensure a positive change (The New York Times, 2018, p. 19). It is evident that algorithmic approaches are entirely dependent on the users' mastery of computer skills such as coding, instructional discernment, and the capacity to execute the encoded guidelines (Danaher 2016, p. 256). Furthermore, overdependence on the algorithmic requirements deters user organizations and individuals to consider mental capacities, situational changes and the relevant needs of data contributors and decision-making beneficiaries. Therefore, algorithms can be improved by frequent changes and improvements in relation to the systemic requirements to give a sensible meaning to decision-making organizations and individuals.'
The ever-increasing application of algorithms to decision-making in a range of social contexts has prompted demands for algorithmic accountability: Accountable decision-makers must provide their decision subjects with justifications for their automated system's outputs (Binns, 2018). So far, it remains open what kinds of principles such justifications can be expected to appeal to. Moreover, accountability needs to be based on a common understanding of the concept. Bovens et al. (2014) explain the accountability of a party A to another party B as follows: A is accountable to B if A has an obligation to provide B with some justification regarding a certain conduct. If B finds A's justification to be inadequate, A may face some form of sanction. This has important implications for algorithmic decision-making and the actors involved.
Imagine a community that deploys an IoT surveillance system and is held accountable by a citizen who is denied access to a public service by the system. Accountability in this scenario might consist of a demand by the citizen that the community provide justification for the decision; a response from the community council with an account of how the surveillance system works and why it is appropriate for the context; and a final step in which the citizen either accepts the justification or rejects it, in which case the community council might have to revise or reprocess the decision with a human agent, or face some form of sanction. Such a situation serves well as input for experiencing algorithmic overdependence, particularly the impact of opaqueness with respect to directly affected stakeholders.

In our example, one way for the community council is to provide evidence of prior effective algorithmic decision-making, e.g., meeting public demands for security. It could also provide proof of methodological and/or scientific rigor in the development of algorithms for decision-making. Finally, (possibly) affected stakeholders could be invited to participate in explanatory features or, more proactively, in the redesign of the IoT application, and to co-create transparent algorithms for automated decision-making with the development team. Such a move not only avoids a posteriori resolving of misunderstandings and resulting conflicts, but also addresses a major challenge to accountability (cf. Zarsky, 2015). Allowing affected stakeholders to scrutinize and hold to account the exercise of algorithmic design of decision-making strengthens the commitment to share responsibility for dependence on algorithmic decision-making.
Tackling transparency as a problem for socially consequential mechanisms can concern several forms of opacity (cf. Burrell, 2016): '(1) opacity as intentional corporate or state secrecy, (2) opacity as technical illiteracy, and (3) an opacity that arises from the characteristics of machine learning algorithms and the scale required to apply them usefully'. We recognize that increasing technological literacy could help to uncover algorithmic decision-making and reflect on its purpose. Moreover, audit trails of the algorithmic process or interactive modeling allow individuals to gain a better understanding of how their actions impact the algorithmic response (Citron & Pasquale, 2014). Recognizing the distinct forms of opacity that may come into play in given applications is key to determining which technical and non-technical solutions can help to prevent harm.
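An audit trail of the kind mentioned above could, in the simplest case, be a wrapper that records every input-output pair of a decision function. The sketch below is our own minimal illustration, not an implementation from the cited literature; the decision rule is invented:

```python
import functools
import time

def audited(fn):
    """Wrap a decision function with an audit trail so affected
    individuals can later inspect how inputs mapped to outputs."""
    trail = []

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        trail.append({"timestamp": time.time(),
                      "inputs": {"args": args, "kwargs": kwargs},
                      "output": result})
        return result

    wrapper.audit_trail = trail  # inspectable record of every decision
    return wrapper

@audited
def grant_access(age: int, has_ticket: bool) -> bool:
    # Hypothetical eligibility rule for a public service
    return has_ticket and age >= 18

grant_access(17, True)
grant_access(30, True)
log = grant_access.audit_trail  # two entries linking inputs to outputs
```

Even this trivial log lets a decision subject ask which input produced a denial, which is the first step towards the accountability exchange sketched in the community-council example.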
Transparency can help restore accountability. Even when sophisticated algorithms are inherently opaque, algorithmic decisions should preferably become more understandable, either to be interpreted ex post or to be interpretable ex ante by responsible and affected stakeholders (cf. De Laat, 2018).
3. Immersive Experience and Facilitation Design for Re-Weaving the Web of Behavior
We term the suggested installation Digitized, as it starts with experiencing algorithmic overdependence based on IoT components and triggers the use of digital facilities for (re)designing IoT settings to regain control over digitized systems. Capacity building is driven by personal experience of algorithmic decision-making and by creating an understanding of IoT system components and their interplay. The desired outcome is an individual's (re)gained confidence in dealing with complex systems in an analytical and constructive way. Such experiential design is understood as artful in the context of business innovations (Cain, 1998) and can be connected to artistic elaboration of the installation. Through experience, it reduces the semantic gap between non-familiar systems or objects and affected stakeholders.
Fig. 1. Momentum-based experiential design
Figure 1 gives an overview of a possible instance of the Digitized installation, to be located in a usually crowded part of a university campus or a similar public place. The interactive experience is based on four momentums. Starting with conveying the feeling of oppression (Momentum 1) and that of overdependence on algorithmic decision-making for behavior en- or discouragement (Momentum 2), the momentums culminate in design-centered engineering of an IoT application when developing component understanding (Momentum 3) and behavior control (Momentum 4).
Fig. 2. Options-generating Momentum 1
Figure 2 gives an overview of the prepared topics in the Digitized tunnel, leading to a point of decision-making after experiencing the loss of individual control of behavior. Walking through the small entry, several options (right-hand side of Figure 2) become available. When aiming for encouragement (Momentum 2), Momentum 3 introduces design-centered engineering of an IoT application for regaining behavior control in Momentum 4. Each momentum is described in the following.
Momentum 1: Feeling of Oppression

Making the increasing, invisible restriction of behavior explicit: The interactive experience starts by passing through a tunnel that gets smaller, so that participants begin to feel uncomfortable, until at the end of the tunnel a small outlet is available. This needs to be passed to continue the interactive experience. The participants walk towards the end of this narrowing tunnel, leaving digital finger- and footprints, until they leave the tunnel through a small outlet with an algorithmic decision on their future behavior regulations displayed, making the invisible parts of the IoT system visible in terms of conclusive behavior prescriptions. Navigation and deep links to content and background information on the domestication and development of IoT systems are provided along the tunnel wall by IoT components, interactive stations, and QR codes. The visual, acoustic and spatial experience becomes more intense the more data is collected and the narrower the range of opportunities left by algorithmic decision-making becomes. Hence, the feeling of oppression is triggered through multiple channels, regardless of whether the behavior conforms to expected patterns or leads to regulating a participant's behavior.

Figure 3 shows the concept of the tunnel design. The tunnel is equipped with information and interactive stations on the IoT (i.e. the system context), physically showing some of the sensor systems. The tunnel system collects sensor data and processes them using decision-making algorithms on the behavior of each participant. Movements, time, navigation paths followed on the screens on the walls of the tunnel, etc. are recorded and reflected to the participant as part of that process.
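The decision step at the tunnel outlet might, in a much-reduced form, look like the following sketch. The sensor features, thresholds, and prescription texts are invented for illustration; the installation's actual algorithms are not specified here:

```python
def tunnel_decision(dwell_seconds: float, screens_touched: int) -> str:
    """Toy version of the tunnel's decision step: sensor data collected
    along the walls is condensed into a behavior prescription handed
    to the participant at the outlet. Thresholds are illustrative only."""
    if dwell_seconds > 120 or screens_touched > 5:
        return "flagged: reduce interaction, access to services limited"
    return "conforming: full service eligibility"

# Readings the tunnel might have recorded for one participant:
readings = {"dwell_seconds": 140.0, "screens_touched": 2}
prescription = tunnel_decision(**readings)
```

The point of the momentum is precisely that the participant sees only the prescription, not the rule, until the later momentums open the system up.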
Fig. 3. Structuring the tunnel experience (Momentum 1)
Momentum 2: Experiencing Algorithmic Overdependence

Confrontation with the system that incrementally restricts behavior based on artificial decision-making: Passing through the small exit, each participant receives information on how their behavior in the installation shapes and constrains future behavior, e.g., by restricting access to information, resources, services, social contacts and settings. It is a manifestation of algorithmic overdependence in an IoT environment for an individual who is part of a community. For instance, a student is denied access to certain services while being nudged to adapt to certain ways of behavior, such as booking courses earlier to individualize course designs. Figure 2 captures typical behavior patterns that can result from experiencing algorithmic decision-making. It shows that besides informed capacity building based on the intention to (re)gain control of IoT technologies, other strands of action can result from the tunneled experience.
Momentum 3: Regaining Control

Zooming out and zooming in, actively exploring the system: After having received the interpretation of their individual behavior data, the participants are guided to a learner-friendly location nearby to start actively (re-)designing an IoT application. This set of activities aims to explore a variety of design opportunities. Participants start from a digital baseline, i.e. the 'digital twin' of themselves in the installation. The digital twin is prepared on a tabletop system (Oppl & Stary, 2014) (see Figure 4). It represents all IoT elements the participants were able to experience through algorithmic decision-making in the Digitized tunnel in the form of abstract block elements and their relationships. In this way, participants can physically generate a model of IoT components, including sensor systems and software components processing collected data (see Figures 5 and 6). In addition, algorithms (encoded in hard- or software) can be decomposed to explain computational intelligence step by step.
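The tabletop model of abstract block elements and their relationships can be approximated by a small graph structure. The class and the component names below are illustrative assumptions, not taken from the installation or the tabletop system:

```python
class TwinModel:
    """Minimal stand-in for the tabletop 'digital twin': abstract block
    elements (IoT components) and directed relationships between them."""
    def __init__(self):
        self.blocks = set()
        self.links = []

    def add_block(self, name: str) -> None:
        self.blocks.add(name)

    def connect(self, source: str, target: str) -> None:
        # Only known blocks may be related, mirroring the tabletop rules
        assert source in self.blocks and target in self.blocks
        self.links.append((source, target))

    def downstream_of(self, block: str) -> list:
        # Which components receive this block's data?
        return [t for s, t in self.links if s == block]

twin = TwinModel()
for b in ("motion_sensor", "camera", "decision_logic", "display"):
    twin.add_block(b)
twin.connect("motion_sensor", "decision_logic")
twin.connect("camera", "decision_logic")
twin.connect("decision_logic", "display")
```

Tracing `downstream_of` from a sensor block makes visible which decision and output components its data eventually reaches, which is the kind of understanding the momentum aims to build.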
Fig. 4. Modeling the digital twin on a table-top system
Momentum 4: Self-efficient Digital Capacity Building

In-depth understanding leading to action: The model created in Momentum 3 needs to be instantiated by IoT elements and synchronized by a specific operation logic (an algorithmic decision-making procedure). For capacity building, so-called Nerd Trees (see Figure 5) have been designed (Stary & Kaar, 2020). They contain simple and combined IoT components. Participants can grab one or more IoT components, namely IoT elements (i.e. M5Stack© units), from each of the boards (layers) and compose applications according to their model. Since these components have inherent behavior, their coupling makes the IoT system operate according to participants' individual needs as represented in their model. The implementation allows monitoring of the generated data and the flow of information for decision-making. Figure 5 shows the top-down and bottom-up drivers to explore IoT systems and their components. The layered approach of the Nerd Tree supports middle-out capacity building, in particular for visitors who are familiar with combined sensor systems, including temperature and movement measurement, and who want deeper knowledge, either in technologies or in application development. The M5Stack control element contains various ways to plug in sensors and combine them to create intelligent application system behavior. It also provides basic functions for display, navigation, and control. For complex interaction, M5Stack applications can be operated from mobile phones. They reside on top of the Nerd Tree.
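The coupling of components with inherent behavior can be sketched as a pipeline. The element names and behaviors below are hypothetical; real M5Stack units expose richer APIs than this single-function abstraction:

```python
class Component:
    """Toy IoT building block with inherent behavior, loosely modeled
    on the Nerd Tree idea of composable (M5Stack-like) elements."""
    def __init__(self, name, behavior):
        self.name = name
        self.behavior = behavior  # a function: reading -> output

def compose(*components):
    """Couple components into a pipeline: each element's output
    feeds the next, so the composed system 'operates' as a whole."""
    def pipeline(value):
        for c in components:
            value = c.behavior(value)
        return value
    return pipeline

# Hypothetical elements: a sensor scaler, a threshold decider, a display.
sensor = Component("temp_sensor", lambda raw: raw / 10.0)          # raw -> °C
decider = Component("threshold", lambda celsius: celsius > 30.0)   # too hot?
display = Component("display", lambda hot: "fan on" if hot else "fan off")

app = compose(sensor, decider, display)
output = app(315)  # raw reading corresponding to 31.5 °C
```

Recomposing the same elements in a different order or with different deciders is exactly the kind of exploration the Nerd Tree layers invite.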
Fig. 5. Variety of IoT system components – A Nerd Tree
Figure 6 zooms into programming the behavior of M5Stack© applications using Blockly. On the left, the M5Stack control component with several sensors for securing access to rooms (including a keyboard for typing in keys) is illustrated. On the right, a screenshot of UIFlow (when programming in Blockly) is shown, processing an event and (re)acting based on recognized sensor data. The creation of IoT application behavior through Blockly is based on the language JavaScript, supporting block-based visual programming. According to its concept, Blockly features structured (de)composition of IoT components (represented as units) and handling of events in an IoT environment. In this way, not only can each block of the digital twin be mapped to one or more operational entities, but also the successive passing of information along algorithmic computations can be experienced and operated in real time.
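A Blockly/UIFlow program of the kind shown in Figure 6 is event-driven: a sensor event arrives and a handler reacts. The pattern can be mimicked as below; this is plain Python, not actual UIFlow/M5Stack API code, and the event name and key values are invented:

```python
# Minimal event-driven pattern mirroring a Blockly 'on event, do...' program.
handlers = {}

def on_event(name):
    """Register a handler block for a named sensor event."""
    def register(fn):
        handlers[name] = fn
        return fn
    return register

@on_event("keypad_input")
def check_key(value):
    # React to recognized sensor data: unlock the door on a matching key
    return "unlock" if value == "1234" else "deny"

def dispatch(name, value):
    """The runtime step: an incoming event triggers its handler."""
    return handlers[name](value)

result = dispatch("keypad_input", "1234")
```

Each registered handler corresponds to one block stack in the visual editor, which is why the block model maps so directly onto operational entities.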
Fig. 6. Design-oriented engineering utilizing M5Stack© elements,
and Blockly programming in UIFlow©
4. Conclusion
More and more data are captured through IoT sensor components, and often users as well as other adopters tend to depend too much on algorithmically generated information, so much so that they may even select inferior products and services to their own detriment or restrict their own freedom of movement. We refer to this as algorithm overdependence. Rather than 'surrendering to technology' in modern digital environments, we have suggested experiential design for stakeholders to develop an understanding of the complex systems and to create agency.

The proposed installation Digitized aims to trigger reflective practice for (concerned) stakeholders in continuously digitized environments. It promotes awareness by offering scenarios of concern and triggers to allow transparency and design for mutual use by users and providers of digitized systems. For the physical parts of the installation, a digital support system is available so participants can regain intelligent control.

Digitized is an individual, yet socially grounded and co-created, artistic protocol of space perception and (re)design. Based on the interactive experience of artificial decision-making, constructive interventions can be set through physical experience, even for intangible elements including algorithmic processing.
Due to its partly interactive character and educational elements, the installation enables active reflection on and testing of IoT, and develops methods of behavior capturing and regulation. Artistic mediation showcases anchors of digitization in different fields, ranging from explicit access control to indirect control of behavior. It uses audio, visual arts (drawing, video, visualization) and edutainment (crafting interventions, workshops). These elements will be explored in situ, and insights might vary from individual to individual.
References

Banker, S., & Khetani, S. (2019). Algorithm Overdependence: How the Use of Algorithmic Recommendation Systems Can Increase Risks to Consumer Well-Being. Journal of Public Policy & Marketing, 38(4), 500-515.

Binns, R. (2018). Algorithmic accountability and public reason. Philosophy & Technology, 31(4), 543-556.

Bovens, M., Goodin, R. E., & Schillemans, T. (2014). The Oxford Handbook of Public Accountability. Oxford: Oxford University Press.

Burrell, J. (2016). How the machine 'thinks': Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 2053951715622512.

Cain, J. (1998). Experience-Based Design: Toward a Science of Artful Business Innovation. Design Management Journal (Former Series), 9(4), 10-16.
Citron, D. K., & Pasquale, F. (2014). e scored society: Due process for automated
predictions. Wash. L. Rev., 89, 1, Symposium: Articial Intelligence. Available at:
Danaher, J. (2016). e threat of algocracy: Reality, resistance, and accommodation,
Philosophy & Technology, 29.3, 245-268.
Danaher, J., Hogan, M. J., Noone, C., Kennedy, R., Behan, A., De Paor, A., & Murphy, M.
H. (2017). Algorithmic governance: Developing a research agenda through the power of
collective intelligence. Big Data & Society, 4(2), 2053951717726554.
De Laat, P. B. (2018). Algorithmic decision-making based on machine learning from
Big Data: Can transparency restore accountability? Philosophy & Technology, 31(4).
Hutchens, B., Keene, G., & Stieber, D. (2017). Regulating the Internet of Things:
Protecting the "Smart" Home. University of Washington School of Law, UW Law Digital
Commons, Technology Law and Public Policy Clinic. Available at:
Lin, J., Yu, W., Zhang, N., Yang, X., Zhang, H., & Zhao, W. (2017). A survey on internet
of things: Architecture, enabling technologies, security and privacy, and applications.
IEEE Internet of Things Journal, 4(5), 1125-1142.
Martin, R. (2015). The Internet of Things (IoT): Removing the Human Element. Infosec
Writers, 28, 12. East Carolina University Target Publication:
Mesgari, M., & Okoli, C. (2019). Critical review of organisation-technology
sensemaking: towards technology materiality, discovery, and action. European Journal
of Information Systems, 28(2), 205-232.
Oppl, S., & Stary, C. (2014). Facilitating shared understanding of work situations using a
tangible tabletop interface. Behaviour & Information Technology, 33(6), 619-635.
Schnugg, C. (2019). Creating ArtScience Collaboration. Palgrave Macmillan, Cham.
Shaev, Y. (2014). From the sociology of things to the "Internet of things". Procedia -
Social and Behavioral Sciences, 149, 874-878. doi:10.1016/j.sbspro.2014.08.266.
Stary, C., Kaar, C. (2020). Design-Integrated IoT Capacity Building using Tangible
Building Blocks. Proc. 20th International Conference on Advanced Learning
Technologies (ICALT), IEEE, New York, in press.
The New York Times (2018). Even Imperfect Algorithms Can Improve the Criminal
Justice System: A way to combat the capricious and biased nature of human decisions.
Thulin, E., Vilhelmson, B., & Schwanen, T. (2020). Absent friends? Smartphones,
mediated presence, and the recoupling of online social contact in everyday life. Annals
of the American Association of Geographers, 110(1), 166-183.
Vermesan, O., & Friess, P. (2013). Internet of things: Converging technologies for smart
environments and integrated ecosystems. Aalborg: River Publishers.
Wayingwe, W. (2019). Are Algorithms Trustworthy? Skrillics, Inc. Downloaded 2.2.2020.
Wei, V., & Stephenson, T. (2017). Algorithmic Discrimination. White Paper, University
of Washington School of Law, UW Law Digital Commons, Technology Law and Public
Policy Clinic. Available at:
Yeung, K. (2017). ‘Hypernudge’: Big Data as a mode of regulation by design.
Information, Communication & Society, 20(1), 118-136.
Zarsky, T. (2016). The trouble with algorithmic decisions: An analytic road map to
examine efficiency and fairness in automated and opaque decision making. Science,
Technology, & Human Values, 41(1), 118-132.
About the Authors – COS Journal Issue 9.1
Johannes Braumann heads Creative Robotics at the University of Art and
Design Linz. He is co-founder of the Association for Robots in Architecture
and the main developer of the intuitive robot programming environment
KUKA | prc, which is used by more than 100 universities and 50 companies
worldwide. e focus of his work is the development of methods of robotics
for new user groups. ereby, Creative Robotics cooperates closely with the
Innovation Center Grand Garage and develops innovative robot processes
for (and with) SMEs and cra businesses.
Sougwen Chung is an internationally renowned artist and a pioneer in the
field of human-robot collaboration. In her work she artistically explores and
researches ways to work with machines and the potential of artificial intelligence
in creative cooperative processes. Chung has been artist-in-residence at
distinguished organizations like Nokia Bell Labs, is a former research fellow
at MIT’s Media Lab and was selected as the Woman of the Year in Monaco in
2019 for achievement in the Arts & Sciences.
Elisabetta F. Jochim is creative AI lead at Libre AI and co-founder at Cueva
Gallery. She has a background in Arts and Humanities and extensive experi-
ence in project management working with heterogeneous teams in dynamic
environments. Finding her passion in the intersection of technology and art,
she explores how artificial intelligence can enhance human creativity. Her
interests focus on digital aesthetics, human-computer interaction, human and
machine creativity, and society.
Paola Michela Mineo is an Italian visual artist: her research is rooted in re-
lational art, but she uses an interdisciplinary language that ranges from per-
formance art to photography, from the purest sculpture to installations. She
graduated in Architecture at the Polytechnic of Milan and Athens; she rein-
terprets the concept of human cast and fragment, transforming them from
an anatomical copy to real pieces of personal identity portraits. She has
exhibited her work in various museums, and is always committed to extracting
beauty from the darkest social realities.
For further information see:
Elena Raviola is Professor in Business and Design at the University of Gothenburg.
She is recipient of the Torsten and Wanja Söderberg Professorship in
Design Management at the Academy of Design and Crafts Gothenburg and
Director of the Business and Design Lab. Her research incorporates artificial
intelligence and design, and its implications for work processes, most importantly
on creative work. Her main research interest lies in understanding
the role of technology and other material artifacts in organizing professional
work, especially in news production. She was visiting researcher at Stanford,
Bocconi University, Harvard, and Sciences Po, and worked at Jönköping In-
ternational Business School and Copenhagen Business School.
Claudia Schnugg is an independent researcher and curator in the field of art
and science. Her work focuses on analyzing the effects of art in organizational
and social settings, including change processes and new technologies.
As an advocate of artscience collaboration, she has been the catalyst for numerous
projects. Claudia works with leading scientific institutions, tech corporations
and cultural partners. She researched at JKU in Linz, Copenhagen
Business School, UCLA Art|Sci Center+Lab, and at the European Southern
Observatory, Chile. She headed the Ars Electronica Residency Network and
was the first Artistic Director of Science Gallery Venice. Her most recent book is
"Creating ArtScience Collaboration" (2019).
Andrea Schueller is an independent business consultant, executive coach
and lecturer at various universities specializing in generative change and
transformation, organizational design, systemic identity, social innovation,
creative emergence. Over the years she has qualified in various fields and
applies her work, shapeshifting in different contexts, pursuing the red thread of
fostering embodied consciousness development through fresh presence and
holistic working designs. She is teaching trainer for Group Dynamics with
the OEGGO (Austrian Association of Group Dynamics & Organization
Consulting), which she chaired and where she served as a Board Member (2012-2018).
She is a co-founder of COS Collective.
See more:
Christian Stary is professor for Business Information Systems at the Uni-
versity of Linz, Austria. His research areas include Interactive Design of
Sociotechnical Systems, Business Process Management, Conceptual Model-
ling and Knowledge Management. He is responsible for several European re-
search projects, such as TwinTide, dealing with method transfer in UI design
and evaluation. He is member of the editorial board of international cross-
and interdisciplinary journals, among them UAIS published by Springer. He
is one of the founders and chair of the Competence Center on Knowledge
Management, the ICKM (Int. Council on Knowledge Management), and or-
ganizer of several academic events on interactive systems, business process
and knowledge management. He is also a co-founder of COS Collective.
Liselotte Zvacek is a management consultant, leadership coach and lecturer at
different universities in Austria; teaching trainer (train the trainer) of OEGGO
(Austrian Society of Group Dynamics and Organisational Development)
and member of the board of OEGGO (2000-02 and 2012-18); facilitator at the
Graduate School of Business of Stanford University (USA) 2011-15; member
of the faculty of the Hernstein Institute; member of NTL (National Training
Laboratories Institute, USA); photographer. She is a co-founder of COS Collective.
- Flow peer group (3 x 1d): your homebase for orientation, integration & individual learning
- Group in collective flow: deep dive generative group
- COS Conference active: engage on stage, show your intention and action for organisations & society
- Next New Action: assess your creative potential for leadership and consulting
- Whole System: co-creating new structures for collaboration; futuring, working with large groups and networks for transformational change
- Integrating somatic intelligence in high performance teams: awaken somatic intelligence for generative change
- Creating my Master's piece: Writers space, Photography & Film, or Freestyle (choose one or more, optional). Craft your ideas and developments and bring them into the world. Act!
COS Curriculum: Creators for Organisations & Society
25 days & 1d/8h coaching for master's piece

The COS-Certified Curriculum "Creating Organisations & Society"
New creations in Organisations & Society originate in the undivided source
of sensing, feeling, thinking. Acting from there, we make a difference. In this
curriculum you will touch the source, develop your inner world and come
out with new resources for action in the outer world. It's challenging for
you and others!
We designed the curriculum for mindful people who:
- Wish to live and work closer to their calling and aspiration.
- Desire to go on a journey of transformation and tangible action.
- Want to intentionally achieve better, more creative results in the organisations they own or work for.
- Change their surroundings collaboratively, mindfully and powerfully.
- Direct intention and generative power towards shared development.
- Enter uncharted territory.
Here and now modules address individual, group and organisational learning
spaces and offer learning on the spot in the here and now. You practice
presencing and learn how to intervene in the moment, here and now. This
is where immediate change happens.
Flow and grow together through action learning. You come closer to yourself,
develop ways to generatively hold your many facets, connect with others
in this way and manifest your actions from a fresh, supportive social
network. A learning through experiencing and acting, experiencing and acting.
Craft and manifest: During your learning journey you are continuously
crafting your own master's piece. This artistic, scientific or freestyle "piece
of work" is your gift and your challenge to yourself and to Organisations &
Society: the one you work or live in, or the one you are intending to create.
A project development, a new business idea, a book, a new way of working
and living.
Your calling triggers and shapes your learning journey throughout all
modules. We support you in making a pearl chain; your intentional learning
process is the pearl string. Beautiful!
COS Certied Curriculum:
Creators for Organisation & Society
For more information please contact:
Dr. Andrea Schueller:
Dr. Maria Spindler:
Costs approx.: € 5.600,– + VAT
Become a Friend & Member of COS!
Join the COS movement and become a Friend & Member of COS! COS is
a home for reective hybrids and a growing platform for co-creation of
meaningful, innovative forms of working & living in and for organizations
and society, between and beyond theory and practice. We invite you to
become an active member of COS.
Being a part of COS you have access to our products and happenings. As a
Friend & Member, you carry forward the COS intention of co-creating gen-
erative systems through mindful, fresh mind-body action. Let’s connect in
and for novel ways around the globe!
Access points for your participation & future contribution are:
- Mutual inspiration & support at the COS-Conference
- Development & transformation at COS-Creations Seminars
- Creative scientific publishing & reading between and beyond theory and practice
- COS LinkedIn Virtual Community
- And more …
The Friend & Membership fee is € 200,00 + 20 % VAT for 2 years.
Your 2 years COS Friend & Membership includes:
- Free access to all Volumes and all Issues of the COS online Journal:
- Conference fee discount of 25 %
- COS-Creations: special discount of 20 % for one seminar of your choice during the membership period.
Please send your application for membership to o
Join COS, a Home for Reflective Hybrids
The future is an unknown garment that invites us to weave our lives into
it. How these garments will fit, cover, colour, connect and suit us lies in our
(collective) hands. Many garments from the past have become too tight,
too grey, too something… and the call for new shapes and textures is
acknowledged by many. Yet changing clothes leaves one naked, half dressed
in between. Let's connect in this creative, vulnerable space and cut, weave
and stitch together.
Our target group is reflective hybrids: leaders, scientists, consultants, and
researchers from all over the world who dare to be and act complex. Multilayered
topics require multidimensional approaches that are, on the one
hand, interdisciplinary and, on the other hand, linked to theory and practice,
making the various truths and perspectives mutually useful.
If you feel you are a reflective hybrid you are very welcome to join our COS
movement, for instance by:
- Visiting our website:
- Getting in touch with COS-Creations, a space for personal & collective development, transformation and learning. Visit our website:
- Following our COS-Conference online:
- Subscribing to our newsletter: see
- Subscribing to the COS Journal: see
- Ordering single articles from the COS Journal:
- Becoming a member of our LinkedIn group: go to and type in "Challenging Organisations and Society . reflective hybrids"
- or contact Tonnie van der Zouwen: o
The Journal with Impact
The Journal “Challenging Organisations
and Society . reflective hybrids® (COS)”
is the first journal to be dedicated to the
rapidly growing requirements of reflective
hybrids in our complex 21st-century
organisations and society. Its international
and multidisciplinary approaches balance
theory and practice and show a wide
range of perspectives in and between
organisations and society.
Being global and diverse in thinking and
acting outside the box are the targets for
its authors and readers in management,
consulting and science.
ISSN 2225-1774