
Position Paper: Escaping Academic Cloudification to Preserve Academic Freedom

Abstract

Especially since the onset of the COVID-19 pandemic, the use of cloud-based tools and solutions, led by the ‘Zoomification’ of education, has attracted attention in the EdTech and privacy communities. In this paper, we take a look at the progressing use of cloud-based educational tools, often controlled by only a handful of major corporations. We analyse how this ‘cloudification’ impacts academics’ and students’ privacy and how it influences the handling of privacy by universities and higher education institutions. Furthermore, we take a critical perspective on how this cloudification may not only threaten users’ privacy, but ultimately also compromise core values like academic freedom: the dependency relationships between universities and corporations could impact curricula, while also threatening what research can be conducted. Finally, we take a perspective on universities’ cloudification in different Western regions to identify policy mechanisms and recommendations that can enable universities to preserve their academic independence without compromising on digitalization and functionality.
Privacy Studies Journal
ISSN: 2794-3941
Vol. 1, no. 1 (2022): 49-66
Tobias Fiebig, Martina Lindorfer, and Seda Gürses
Introduction
The onset of the COVID-19 pandemic led to a shift in our perception of digital technologies in teaching (EdTech). While, before the pandemic, digital teaching support was a feature, a plan, or something to do in ‘the future,’ COVID-19 immediately turned it into a necessity. Societal use of the Internet shifted in general;1 specific changes in academia and teaching organizations were described in the coinage of ‘The Zoomification of the Classroom.’2
As with all that is necessary, needs deemed less pressing in the situation may receive limited attention. What, we claim, was overlooked in the Zoomification of our classrooms were the significant implications for students’ and teachers’ privacy rights, and the severe implications for academic freedom. Digitalization in its current form follows the established pathways of surveillance capitalism3 and centralization,4 amassing control over what education means in the hands of a small set of major corporations.5 We furthermore claim that the COVID-19 pandemic was not the spark that led to the Zoomification of education, but rather a catalyst, allowing necessity to push aside doubts and accelerating an ongoing process of corporate-driven centralization.
To underline our points, we revisit the results of the white paper ‘Heads in the Clouds: Measuring the Implications of Universities Migrating to Public Clouds’.6 As their work is of a more technical nature, we first explore what they measured, and how they obtained these results. Subsequently, we summarize their core findings and explore what these mean for the privacy, security, and digital sovereignty of students and academics around the world. Finally, we conclude with an outlook on what digital sovereignty in education should mean, and which policy steps should be taken to retain it for academic institutions.
Background
In this section, we discuss background and terms necessary for the rest of the paper. We
first explore facets of privacy, most importantly, privacy as an individual right that an
individual exerts control over and provides consent for, and second, privacy compliance
as a mechanism used by organizations unable to provide reasonable privacy controls to
1 Anja Feldmann, Oliver Gasser, Franziska Lichtblau, Enric Pujol, Ingmar Poese, Christoph Dietzel, Daniel Wagner, et al., "A year in lockdown: how the waves of COVID-19 impact internet traffic," Communications of the ACM 64, no. 7 (Association for Computing Machinery, 2021): 101-108.
2 Mehdi Karamollahi, Carey Williamson, and Martin Arlitt, "Zoomiversity: a case study of pandemic effects on post-secondary teaching and learning," in Passive and Active Measurement, 23rd International Conference, PAM 2022, Virtual Event, March 28–30, 2022, Proceedings, eds. Oliver Hohlfeld, Giovane Moura, and Cristel Pelsser (Cham: Springer, 2022), 573-599.
3 Nick Srnicek, Platform Capitalism (Hoboken: Wiley & Sons, 2017).
4 Tobias Fiebig et al., "Heads in the Clouds: Measuring the Implications of Universities Migrating to Public Clouds," arXiv preprint arXiv:2104.09462 (2021).
5 Ben Williamson and Anna Hogan, "Pandemic Privatisation in Higher Education: Edtech and University Reform," Education International (2021).
6 Fiebig et al., "Heads in the Clouds."
individuals to still ‘do’ privacy. Thereafter, we discuss the history of organizational IT in
higher education, and take a look at what digital sovereignty means and should mean in
the context of universities.
Privacy Compliance & Individual Control
Privacy is an elusive term and comes with a myriad of facets and interpretations.1 In this work, we explore two facets of privacy: first, privacy in the context of an individual’s control over their own data, i.e., their ability to make conscious decisions about who handles their data for what purpose. This essentially boils down to an individual’s ability to provide informed consent for every processing of data related to themselves.2 This notion is also what end-users commonly understand as privacy.3
Second, we introduce privacy-by-compliance, which stems from the governance reality in which we find ourselves, shaped—in Europe—by the GDPR. In a privacy-by-compliance setting, an organization does not work towards providing its users with control over their data. Instead, the major objective is putting policies and contracts in place that ensure compliance with applicable privacy legislation and policies in the corresponding jurisdiction, independently of the question of whether users actually have control over their data.
Users’ control over their data may be limited by, e.g., having a technical choice whether to use a service, but facing real-world requirements that necessitate its use. As an example, imagine a user with only one supermarket in their vicinity reachable by foot; all other supermarkets require a car. Said supermarket now introduces an external Bluetooth surveillance service for customers to improve targeted advertising, i.e., a service that tracks customers’ phones’ Bluetooth broadcasts to identify if and how they move in a store.4 The user is ultimately free to choose to use this supermarket and consent to the tracking, or to go to any other supermarket that does not utilize such tracking. However, if the user does not have access to a car, there may be socio-economic circumstances preventing them from exercising their right to opt out of data processing by using another service.
Similarly, the supermarket may claim that the use of the external service hosted in—for the sake of argument—the U.S. serves their ‘legitimate interests.’ Furthermore, as they may hold a contract with the processing party—under Safe Harbour or any of its descendants, i.e., the subsequent agreements put into place when the previous one was considered illegal by the European Court5—they may claim that explicit consent from users is not even necessary, as—technically—their processing of personal data is compliant with the GDPR. Now, this argument certainly goes against the common perception of privacy control, and will most likely also not hold up when scrutinized in a court of law (as Safe Harbour itself did not).6 Yet, in the end, it first creates an illusion of compliance, which is deemed sufficient to satisfy legal requirements, and prevents users from asking too many questions.
1 Helen Nissenbaum, Privacy in Context (Stanford: Stanford University Press, 2009).
2 Anita L. Allen, "Privacy-as-data control: Conceptual, practical, and moral limits of the paradigm," Conn. L. Rev. 32 (2000): 861.
3 Kelly Caine and Rima Hanania, "Patients want granular privacy control over health information in electronic medical records," Journal of the American Medical Informatics Association 20, no. 1 (2013): 7-15.
4 Michael Kwet, "In Stores, Secret Surveillance Tracks Your Every Move," The New York Times, June 14, 2019, accessed May 30, 2022, https://www.nytimes.com/interactive/2019/06/14/opinion/bluetooth-wireless-tracking-privacy.html.
From this perspective, we also see how the lines between data control and data processing vanish when privacy-by-compliance is employed. In fact, by creating a framework that only provides technical control to users, a data controller also runs into the issue of not being able to exert control themselves. The contractual framework enables compliance but not user control, because it lacks feasible enforcement in case of contractual violations. Hence, this lack of feasible enforcement equally applies to the data controller when a data processor bound only by privacy-by-compliance is used; the controller has no reasonable means to ensure that a data processor does not take control of the data it is tasked to process. This may occur due to applicable laws, e.g., the CLOUD Act,7 due to an extensible chain of opaque sub-processors, e.g., a SaaS (Software-as-a-Service) provider ultimately using infrastructure supplied by Amazon and/or Microsoft, where the ultimate processor is not obvious, or due to a combination of both.

Both of these cases may seem hypothetical. Nevertheless, we revisit these points in Section 4, and see how universities fall exactly into the issues described above.
University IT: A Brief Summary
According to Fiebig et al.,8 IT in universities clusters into three distinct pillars: teaching, research, and administration. The most common item spanning these three pillars is certainly email, which is used to communicate with students, fellow researchers, and the administration alike. In addition, each pillar has dedicated resources and requirements. For example, research infrastructure may include a graphics card cluster for AI operations, or infrastructure for operating online services. Teaching infrastructure usually includes a Learning Management System (LMS), which allows teachers to conduct their courses, track students’ course progress, and sometimes even conduct examinations. Finally, the administration also has specific requirements, like human resource management applications, payment processing and billing systems, as well as infrastructure for handling student enrolment.9
5 Martin A. Weiss and Kristin Archick, US-EU Data Privacy: From Safe Harbor to Privacy Shield, Congressional Research Service, May 19, 2016.
6 Ibid.
7 Marcin Rojszczak, "CLOUD act agreements from an EU perspective," Computer Law & Security Review 38 (2020).
8 Fiebig et al., "Heads in the Clouds."
9 For a more comprehensive description of universities’ IT infrastructure, please refer to Section II of the paper by Fiebig et al.
Digital Sovereignty in Higher Education
Digital sovereignty has been one of the most commonly used terms in digital governance over the last couple of years.10 As with all popular terms, it is rather difficult to pinpoint exactly what it means. A common interpretation revolves around nation states’ ability to impose their own governance decisions—be it in terms of permissible content or other regulations—on digital systems under the reality of a global Internet.11 More critical voices, such as Avila Pinto,12 tie the matter of digital sovereignty to classical protectionism, and ultimately a form of ‘digital colonialism’.

Similarly, Fiebig & Aschenbrenner13 criticized the notion of digital sovereignty being centred around the creation of ‘own’ siloed systems14 and regulatory control,15 16 instead of taking a perspective on the independent ability to operate, repair, and rebuild digital infrastructure.17
However, universities are not nation states—despite often being state organizations—
especially not in the world of mostly free public education in central Europe. So, what do
we mean when we talk about digital sovereignty in higher education?
Essentially, the point of digital sovereignty in higher education concerns whether the digital infrastructure used by universities can negatively impact their purpose, which is usually the execution of independent research and independent teaching. This means that external parties usually should not decide which students a university admits, what content it teaches (within certain boundaries of accreditation etc.), and what scientific research it conducts. The conglomerate of these requirements forms what is usually understood as ‘academic freedom.’
Hence, when we talk about digital sovereignty being lost in higher education or academia, we are talking about a situation in which the way an organization’s digital infrastructure is operated allows an external party to taint its academic freedom, either in terms of research or education. For digital
sovereignty to be lost, this external party does not necessarily have to exercise that opportunity; the mere chance of it being exercised is sufficient for digital sovereignty to be lost.18
10 Julia Pohle and Thorsten Thiel, "Digital sovereignty," in Practicing Sovereignty: Digital Involvement in Times of Crises, eds. Bianca Herlo, Daniel Irrgang, Gesche Joost, and Andreas Unteidig (Bielefeld: transcript Verlag, 2021), 47-67.
11 Luciano Floridi, "The fight for digital sovereignty: What it is, and why it matters, especially for the EU," Philosophy & Technology 33, no. 3 (2020): 369-378.
12 Renata Avila Pinto, "Digital sovereignty or digital colonialism," SUR-Int’l J. on Hum Rts. 27 (2018): 15.
13 Tobias Fiebig and Doris Aschenbrenner, "13 propositions on an Internet for a ‘burning world,’" in Proceedings of the ACM SIGCOMM Joint Workshops on Technologies, Applications, and Uses of a Responsible Internet and Building Greener Internet (2022).
14 Arnaud Braud et al., "The road to European digital sovereignty with Gaia-X and IDSA," IEEE Network 35, no. 2 (The Institute of Electrical and Electronics Engineers, 2021): 4-5.
15 Huw Roberts et al., "Safeguarding European values with digital sovereignty: An analysis of statements and policies," Internet Policy Review (2021).
16 Benjamin Farrand and Helena Carrapico, "Digital sovereignty and taking back control: from regulatory capitalism to regulatory mercantilism in EU cybersecurity," European Security 31, no. 3 (2022): 435-453.
17 Fiebig and Aschenbrenner, "13 propositions."
The Pandemic Effect on Corporations and IT
The COVID-19 pandemic has significantly affected all aspects of society and commerce. Its effects on digital infrastructure range from how we use the Internet,19 over those running and providing digital infrastructure,20 to—as also found by Fiebig et al.—digital infrastructure in teaching and learning.21

In addition, the pandemic also impacted global supply chains,22 23 while home deliveries of commodities24 and food25 increased, leading to considerable growth for related companies. Thus, we observe an overall growth of corporations across sectors that provided services filling the gaps in terms of consumption and social interaction, while these shifts simultaneously feed back into human behaviour and desires.26
Measuring Cloudication
In this section, we provide background information on the work of Fiebig et al.27
Measuring Cloud Adoption
18 See also the argument by Fiebig and Aschenbrenner on digital sovereignty commonly being used wrong.
19 Anja Feldmann, Oliver Gasser, Franziska Lichtblau, Enric Pujol, Ingmar Poese, Christoph Dietzel, Daniel Wagner, et al., "The lockdown effect: Implications of the COVID-19 pandemic on internet traffic," Proceedings of the ACM Internet Measurement Conference (Association for Computing Machinery, 2020): 1-18.
20 Mannat Kaur et al., "'I needed to solve their overwhelmness': How system administration work was affected by COVID-19," 25th ACM Conference on Computer-Supported Cooperative Work and Social Computing (Association for Computing Machinery, 2022).
21 Karamollahi, Williamson, and Arlitt, "Zoomiversity."
22 Serpil Aday and Mehmet Seckin Aday, "Impact of COVID-19 on the food supply chain," Food Quality and Safety 4, no. 4 (2020): 167-180.
23 Remko van Hoek, "Research opportunities for a more resilient post-COVID-19 supply chain–closing the gap between research findings and industry practice," International Journal of Operations & Production Management 40, no. 4 (2020): 341-355.
24 Avinash Unnikrishnan and Miguel Figliozzi, "Exploratory analysis of factors affecting levels of home deliveries before, during, and post-COVID-19," Transportation Research Interdisciplinary Perspectives 10 (2021).
25 Diana Gavilan et al., "Innovation in online food delivery: Learnings from COVID-19," International Journal of Gastronomy and Food Science 24 (2021).
26 Toni D. Pikoos et al., "The Zoom effect: exploring the impact of video calling on appearance dissatisfaction and interest in aesthetic treatment during the COVID-19 pandemic," Aesthetic Surgery Journal 41, no. 12 (2021).
To measure universities’ adoption of cloud services, Fiebig et al. utilize data from the Domain Name System (DNS). The DNS is, essentially, like a phone book that allows computers to look up additional information for names. For example, when a user wants to access https://www.example.com, the DNS is used to look up the Internet Protocol (IP) address of www.example.com, so the user’s computer can establish a network connection to the server hosting www.example.com and retrieve content from that site. The DNS also provides further functions, for example, looking up which server is responsible for receiving email for a specific domain, or discovering specific services related to a domain.
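The ‘phone book’ lookup described above can be sketched in a few lines of Python, using only the standard library; `localhost` serves here simply as a name every machine can resolve without contacting an external DNS server:

```python
import socket

def resolve(hostname: str) -> str:
    """Look up the IPv4 address registered for a name, much like a
    browser does before connecting to a web server."""
    return socket.gethostbyname(hostname)

# The loopback name is resolvable on any machine, without network access.
print(resolve("localhost"))
```

Other record types, such as the MX records that name a domain’s mail servers, are not covered by the standard library and typically require a dedicated DNS library (e.g., dnspython).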
In their work, Fiebig et al. use a historical dataset reaching back to 2015, which essentially contains a global record of which names and associated information have been looked up by users. Please note that this does not refer to individual users; instead, it works on an aggregate of data that has been carefully processed to not include personally identifiable information.
Using this dataset, Fiebig et al. are able to investigate where sites under universities’ domains are hosted, whether the universities use a cloud-hosted learning management system or one of the large video chat solutions (Zoom etc.), and where they receive their email.
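The last step, attributing a DNS name to a provider, can be illustrated with a simple suffix match. Note that the provider suffixes below are our own illustrative picks, not the actual rule set used by Fiebig et al.:

```python
# Illustrative mapping of hosting-provider domain suffixes; the actual
# measurement study uses a more comprehensive set of matching rules.
PROVIDER_SUFFIXES = {
    ".mail.protection.outlook.com": "Microsoft",
    ".googlemail.com": "Google",
    ".google.com": "Google",
    ".amazonaws.com": "Amazon",
}

def classify_host(hostname: str) -> str:
    """Attribute a DNS name (e.g., the target of an MX record) to a
    cloud provider, or report it as self-hosted/other."""
    name = hostname.rstrip(".").lower()
    for suffix, provider in PROVIDER_SUFFIXES.items():
        if name.endswith(suffix):
            return provider
    return "other/self-hosted"

print(classify_host("uni-example.mail.protection.outlook.com."))  # Microsoft
print(classify_host("mx1.uni-example.de"))  # other/self-hosted
```

Applied to the mail-server names observed for each university domain over time, such a classification yields adoption trends of the kind summarized next.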
Core Findings
Here, for brevity, we only summarize the core findings presented by Fiebig et al.; for a comprehensive view of their results, we recommend consulting their paper. In summary, Fiebig et al.28 find:
1. A difference between regions: According to their measurements, there is a stark contrast in cloud adoption between traditionally Anglo-American-influenced academic systems—the U.S., the U.K., the Netherlands, and the THE Top 100—and continental European systems as found in Germany, France, Austria, and Switzerland. While the former group embraced the cloudification of universities’ IT even before the pandemic, the latter group is more cautious, and only a slight uptick in adoption was measurable during the pandemic.
2. The impact of the pandemic on cloud adoption was focused on video lecturing: While universities’ general cloud adoption shifted into public perception with the beginning of the pandemic, new adoptions mostly clustered around video communication and collaboration tools like Zoom, WebEx, and Microsoft Teams.
3. Policy and Privacy-by-Compliance have a major impact on cloud adoption: Fiebig et al. observe that cloud adoption for email hosting was limited in the Netherlands before mid-2018. Since then, a steady uptake of, especially, Microsoft-based email hosting can be observed. As Fiebig et al. note, this coincides with a letter published by the Dutch ministry of the interior, claiming that all privacy concerns regarding Microsoft’s services have been resolved for Dutch government organizations.29
28 Ibid.
Discussion
In this section, we revisit the privacy implications of cloudification, and assess how the current cloudification measured by Fiebig et al. impacts academic freedom as a whole.
Teachers’ and Students’ Privacy
As outlined in Section 2.1, privacy is often understood as one’s ability to freely determine who processes one’s own data for what purpose. However, in a university context, this freedom of decision can be severely limited by a student’s choice to pursue a certain career or field of study. If a university decides to, for example, outsource its LMS to a U.S.-based company hosting it in Amazon’s EC2 cloud, it could still offer students a choice to opt out of using the LMS. However, as experience shows,30 in these cases necessity will trump personal choice. Hence, much as in our supermarket example in Section 2.1, a student is restricted in their ability to make a free and independent choice concerning their privacy preferences. If they would prefer not to have their data processed by systems controlled by Amazon or another U.S.-based company, their only options are to come to terms with this practice, or to accept that they cannot attend a course or study at a specific university.
Privacy-by-Compliance
What Fiebig et al. observe in terms of cloud service adoption is that especially those regions ‘further along the path of cloudification’ accumulate a multitude of services from different vendors (even though most of them ultimately rely on one of the big providers of cloud infrastructure, i.e., Google, Amazon, and Microsoft). This makes it increasingly difficult for universities to offer their users—be it students, researchers, or teachers—fine-grained control over where and how their data is processed. At the same time, especially European institutions find themselves struggling with the implementation of data protection legislation.31 This may create an environment in which universities prioritize technical compliance with regulations over actual control. Common methods to create this ‘privacy-by-compliance’ include, for example, unspecific and broad privacy policies essentially covering any conceivable cloud service, while using contractual agreements with suppliers to outsource responsibility for data protection aspects. To further explore this subject, we recommend the reader take a look at their own institution’s privacy policy—if they can find it.
29 Ferd Grapperhaus and Kajsa Ollongren, "Verificatie op de uitvoering van het overeengekomen verbeterplan met Microsoft", accessed May 30, 2022, https://www.tweedekamer.nl/kamerstukken/brieven_regering/detail?id=2019Z13829&did=2019D28465.
30 Bart Custers, Simone van der Hof, and Bart Schermer, "Privacy expectations of social media users: The role of informed consent in privacy policies," Policy & Internet 6, no. 3 (2014): 268-295.
31 Vincenzo Mangini, Irina Tal, and Arghir-Nicolae Moldovan, "An empirical study on the impact of GDPR and right to be forgotten-organisations and users perspective," in Proceedings of the 15th International Conference on Availability, Reliability and Security (2020), 1-9.
As the main tool of privacy-by-compliance, universities’ privacy policies are an ideal place to investigate its prevalence.32 Coghlan et al.33 studied the privacy policies of 23 popular EdTech tools and found that universities often negotiate their own terms and conditions, which also impacts data processing. Thus, instead of focusing on the privacy policies of individual platforms, we also studied the publicly available privacy policies of each country’s top-three universities (THE Top 100, 21 universities, 46 documents) to identify how they communicate their cloud use. We find two types of documents: (1) privacy policies describing data collection/processing activities, and (2) data protection guidance (not publicly available for 4 universities).34
The public-facing documents we surveyed are exclusively focused on data controller and FERPA responsibilities,35 i.e., data and student records collected and processed by the universities using their own IT infrastructure. German universities stood out, with policies being detailed and emphasizing subject access rights. Still, despite the high cloud usage found by Fiebig et al.,36 we did not find one university that provides a comprehensive overview of what data is collected by and shared with these infrastructures. Instead, the data shared is summarized in broad terms like “platform usage and interaction data”, and is regularly hidden in auxiliary documents. While third-party cloud services used in websites, e.g., social media buttons, are mentioned regularly, references to third-party services used in university administration and operations were scarce. Some universities noted contractual agreements with third-party cloud providers to limit the purpose of data collection and processing, but not a single one provided further details on the implementation of these contracts. Hence, in summary, universities seem to approach the issue of a growing set of cloud dependencies by applying privacy-by-compliance.
Another aspect in this framework is the role of the student in this setup. As Fiebig et al. note, progressing cloudification may intersect with academic institutions’ increasingly developed self-understanding as economic entities, or rather, the encouragement of such positions by the academic system at large. The continuous influx of traditional management methods into academia—progress reports, Key Performance Indicators (KPIs), and a drive to ‘valorize’37 research, sometimes even included as a KPI—has been ongoing for several years, and has equally been criticized38 and applauded.39
32 Simon Coghlan, Tim Miller, and Jeannie Paterson, "Good proctor or ‘Big Brother’? AI Ethics and Online Exam Supervision Technologies," Philosophy & Technology (2021).
33 Ibid.
34 All documents we analysed are available online: https://github.com/headsinthecloud/universities.
35 U.S. Department of Education, "Family Educational Rights and Privacy Act (FERPA)", accessed November 11, 2022, https://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html.
36 Fiebig et al., "Heads in the Clouds."
A necessary cornerstone in the use of privacy-by-compliance is, however, the acceptance of users as a form of employee, i.e., as people hired or integrated into the organization for a purpose and use.40 This transforms their privacy concerns in the work environment from a private matter of their own into a simple question of organizational compliance, in which the organization can make decisions for them, as it is essentially just a decision for itself. There are arguments to be had on whether this perspective is valid—even for employees41—yet such a stance simplifies the process of creating privacy-by-compliance. Systems are there for a purpose; if usage is restricted to business-relevant activities only, there is far less private data to be handled.
We, the authors, obviously disagree with this perspective, especially in the context of universities and education. We argue that taking such a perspective of privacy-by-compliance, which includes the necessary leap of interpreting students as a form of employees of the university system, fundamentally conflicts with the idea of an academic environment enabling students to execute (and attain the ability to execute) free and independent thought.42 We would, in fact, go as far as claiming that education itself is one of the most private matters in our society. The ability to develop ideas is rooted in the ability to be wrong. Recording our learning progress—detailed and fine-grained—might make our learning errors a permanent record in cloud infrastructure outside of our control, or at least carries the threat of them becoming a permanent record. In turn, this ominous threat might inhibit the learning progress of students: cautious not to create a permanent record of themselves challenging the status quo or being out of their depth when exploring new fields and subjects, they may move towards safe and predictable options. In that sense, the effect is similar to how a threat of privacy violations and surveillance leads to a change in attitude, as people align their behavior with the expectation of being observed.43

Hence, in summary, we claim that if an academic organization attempts to implement privacy-by-compliance instead of leaving its students (and, to a degree, teachers) with the ability to control the spread of their data, it ultimately fails its own purpose.
37 Here, valorization, verb ‘to valorize’, refers to the process of successfully disseminating and promoting research results, especially converting research results into a tangible and monetary benefit for the organization, for example, by obtaining and selling patents, or by creating a start-up company rooted in research results.
38 Deborah Churchman, "Voices of the academy: academics’ responses to the corporatizing of academia," Critical Perspectives on Accounting 13, no. 5-6 (2002): 643-656.
39 Adrienne S. Chan and Donald Fisher, eds., The Exchange University: Corporatization of Academic Culture (Vancouver: UBC Press, 2009).
40 Sara Ahmed, What’s the Use?: On the Uses of Use (Durham, NC: Duke University Press, 2019).
41 Lothar Determann and Robert Sprague, "Intrusive monitoring: Employee privacy expectations are reasonable in Europe, destroyed in the United States," Berkeley Tech. LJ 26 (2011): 979.
42 Ahmed, What’s the Use? See also the Humboldtian ideal of education.
43 Nina Gerber, Paul Gerber, and Melanie Volkamer, "Explaining the privacy paradox: A systematic review of literature investigating privacy attitude and behavior," Computers & Security 77 (2018): 226-261.
Academic Freedom
In Section 2.3, we briefly discussed the meaning of digital sovereignty in the context of higher education. We now shift this discussion into the context of academic freedom. Fiebig et al.44 claim that the progressing cloudification of universities’ IT may ultimately threaten academic freedom. However, the underlying mechanics of how this comes to be, as well as the historical embedding, remain—to a degree—unclear in their work.
As with the issue of privacy-by-compliance, this boils down to the ultimate purpose of academia as a cradle of independent thought. Even though we acknowledge that this ideal is often betrayed by academics themselves, we use it as an assumption in our argument, making our claims within the framework of an ideal world.
A glowing and well-documented example of the corrective power of academia—and the corporate need to spend excessive resources on preventing truth from being acknowledged by society—is certainly the issue of lead pollution.45 Patterson, the first scientist to establish the age of the earth, also noticed that there was an apparent human-made poisoning of the environment by the then commonly leaded gasoline.46 Facing this discovery, especially oil and gas corporations expended significant resources to discredit Patterson and prevent his results from appearing, allegedly going as far as promising him nearly unlimited third-party funding if he would only vow not to pursue this line of research.47

Now, what enabled Patterson to continue his work was (a) academic freedom, and (b) his adversaries lacking a direct measure of exerting pressure. More boldly speaking, while oil and gas companies could try to buy him, and could fund research ‘disproving’ his findings ad infinitum, there was no lever to take something from him or his institution.
Cloudication and questionable funding resemble one another in that they challenge/
threaten scientic independence.48 As Fiebig et al.49 claim, there is, however, an inherent
dierence in the fact that cloudication gives corporations who operate in the heart of
academia a direct lever to inuence the academic discourse on the negative impact of said
corporations.50 They may, for example, put pressure on a university whose researchers
conduct work that is perceived by the corporation as a threat to itself.
44 Fiebig et al., "Heads in the Clouds."
45 We note that we could also use the human-made climate crisis currently ravaging our world as an
example here. However, for that incident sadly no common consensus on how bad the situation is has
been reached yet, even though several corporations have been caught—knowing how bad the state of
climate change is—trying to discredit climate researchers in order to sway public opinion their way.
Similar effects have also been observed around the tobacco industry.
46 Clair C. Patterson, "Contaminated and natural lead environments of man," Archives of Environmental
Health: An International Journal 11, no. 3 (1965): 344-360.
47 Neil Degrasse-Tyson, "The Clean Room," Cosmos: A Spacetime Odyssey. Fox Broadcasting, April 20,
2014.
48 Sylvia Rowe, Nick Alexander, Fergus Clydesdale, Rhona Applebaum, Stephanie Atkinson, Richard
Black, Johanna Dwyer et al., "Funding food science and nutrition research: financial conflicts and
scientific integrity," Nutrition Reviews 67, no. 5 (2009): 264-272.
49 Fiebig et al., "Heads in the Clouds."
50 Shoshana Zuboff, "Big other: surveillance capitalism and the prospects of an information civilization,"
Journal of Information Technology 30, no. 1 (2015): 75-89.
Imagine, for example, a university migrating its email infrastructure to Google. At
the moment, according to Fiebig et al., this concerns at least 10% of all U.S. R1/R2 universities.
Suppose, then, that this university conducts research that is not in the best interest
of Google. Its researchers may find that Google's contributions to the field of machine
learning are not benefiting society,51 they might describe how large language models are
severely biased and thus introduce harms to society,52 or they may simply find that Google
engages in unfair business practices.53 While, traditionally, Google would be able to exert
pressure only by, e.g., reducing third-party funding to this institution, it now has a
very direct lever. No law forces one organization to conduct business with another. In
a free market, even infrastructure providers—and there are many—are free to decide
with whom they want (and do not want) to work. Technically, Google could decide to
discontinue the business relationship regarding a cloud-hosted email solution with the
university. While, of course, the university could always start hosting its own systems
again, this comes with significant knowledge requirements,54 and that knowledge most certainly
migrated out of the institution as part of the cost-saving measures of outsourcing in the
first place.55 Furthermore, an email migration—even to another vendor—always incurs
significant costs and disruption of services, no matter how well it is executed. Of course,
this additional cost differs by the type of service being used, and ties closely to
the amount of data stored along with it. For example, a comparatively complex service
may be cheaper to migrate than a simple service relying on petabytes of data. At the
same time, for specific services the number of reasonable choices may be limited. When
it comes to enterprise-scale email, for example, choices are essentially limited to products
from Google and Microsoft. Similarly, the number of providers of Learning Management
Systems is limited, and—at the time of writing—all of these ultimately use Amazon's
cloud infrastructure to provide their services.
Hence, all of a sudden, Google could do something inflicting direct harm to punish an
institution, without even doing something illegal.56 The notion of this being sudden might
sound surprising here. After all, contractual agreements should have terms and conditions
that prevent their sudden termination. However, especially in business-to-business
interactions, these terms can turn out to be surprisingly short. Furthermore, quite recently,
Google actually used the issue of urgency to renegotiate contractual terms with several
51 Reddit, accessed May 30, 2022, https://www.reddit.com/r/MachineLearning/comments/uyra/d_i_dont_really_trust_papers_out_of_top_labs/.
52 Emily M. Bender, Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell, "On the
Dangers of Stochastic Parrots: Can Language Models Be Too Big?," in Proceedings of the 2021 ACM
Conference on Fairness, Accountability, and Transparency (Association for Computing Machinery, 2021),
610-623.
53 Brian William Jones, "The unlimited storage that Google promised my university is being discontinued,"
Twitter, accessed May 30, 2022, https://web.archive.org/web/20221129194157/https://twitter.com/bwjones/status/1490802506628145153.
54 Florian Holzbauer, et al., "Not that Simple: Email Delivery in the 21st Century," USENIX Annual
Technical Conference (2022).
55 Monica Belcourt, "Outsourcing—The benefits and the risks," Human Resource Management Review 16, no.
2 (2006): 269-279.
56 Please note, at this point, that Google is just a placeholder for any hypergiant providing services a
university may become dependent upon. The same argument stands for Microsoft, Oracle, Amazon,
Zoom, Facebook, Apple, and many more, some of which have already been caught in actions similar
to those described here.
major U.S. universities: After the universities had used the file storage that Google had
initially offered them unlimited and free for petabytes of data, Google quickly urged
them to renegotiate the terms for a significantly higher price.57, 58 Furthermore, such
considerations do not even account for the power dynamics, and especially the power imbalance
in terms of legal capabilities and funds.
In business-to-business activity, the least desirable outcome in case of a breach of contract
is a lengthy lawsuit, which may ultimately lead to a reasonable
restitution payment. However, in contrast to the potential gain of influence on a research
agenda, such a restitution payment is negligible for major corporations. Furthermore, in
comparison to the resources and stamina of hypergiants'59 legal departments, universities'
ability to defend themselves is, most likely, limited.
It is also important to note that such interactions have occurred before—although not on a
major scale. Zoom intervened in a seminar that was not aligned with its corporate
values,60 Facebook terminated researchers' private Facebook accounts,61 and Google
reportedly used an organization's dependence as a sales mechanic.62 Similarly, we have
seen how corporations with similar financial resources have tried, and keep trying, to fuel
climate disaster denial and discredit climate science for their own benefit.63
Ultimately, no matter where one stands on whether large cloud corporations would use
their market power to further their own gains—and we argue that as rational actors they
can be expected to do so—for academic sovereignty and freedom as outlined in Section
2.3, the mere chance that they could is already the worst-case scenario.
Controversial Content and Centralization
The aforementioned power of hypergiants extends beyond the academic context. As
Fiebig and Aschenbrenner note in their '13 Propositions on an Internet for a Burning
World,' the prevalence and commoditization of large-scale denial-of-service attacks have created
a situation in which independent hosting or self-hosting of content on the Internet has become
challenging. Thus, it is difficult for smaller agents to publish content on the Internet without
resorting to the infrastructure of major cloud providers, be it Amazon, Akamai,
or Cloudflare. Hence, the refusal of major cloud providers to 'protect' a site hosting speech
they do not agree with may effectively limit an entity's ability to share said speech. This
also means that a majority of hate and misinformation sites are hosted on major providers, as
57 Slashdot N.D.a, accessed November 11, 2022, https://hardware.slashdot.org/story/22/02/14/1433256/.
58 Slashdot N.D.b, accessed November 11, 2022, https://tech.slashdot.org/story/22/10/03/2327248/universities-adapt-to-googles-new-storage-fees-or-migrate-away-entirely.
59 'Hypergiants' is a term from the scientific field of network measurement. The term encompasses large
multi-national cloud and technology corporations like, for example, Amazon, Google, or Facebook.
60 NYU-AAUP Executive Committee, "Statement from the NYU-AAUP on Zoom Censorship Today,"
accessed May 30, 2022, https://academeblog.org/2020/10/23/statement-from-the-nyu-aaup-on-zoom-censorship-today/.
61 Barbara Ortutay, "Facebook shuts out NYU academics' research on political ads," accessed May 30,
2022, https://apnews.com/article/technology-business-5d3021ed9f193bf249c3af158b128d18.
62 Jones, "The unlimited storage."
63 Shannon Hall, "Exxon knew about climate change almost 40 years ago," Scientific American 26 (2015).
for example, Cloudare.64 As of recently, there was a discussion on whether Cloudare
should stop providing services to Kiwi Farms, a site conducting targeted harassment that
has been linked to at least three suicides.65
Conclusion and Recommendations
In this paper, we took a perspective on the findings of Fiebig et al. on the cloudification
of universities. We reiterated and expanded their arguments and further illuminated
the connection between privacy, the ability to control one's own data, education, and
academic freedom. In addition, we elaborated upon the argument of corporations using
positions of power to align researchers with their own interests, drawing on historic
examples. The major remaining question is: What can we, what can academia, what can
society, do to counteract these effects?
Fiebig et al. provided commonplace answers.66 They proclaim that universities should
organize and collaborate to build research and teaching infrastructure that is controlled
in a democratic and transparent manner by public institutions. While this argument
holds true in a tautological manner, it is also fairly naïve: The cloudification of universities
is driven by socio-economic circumstances and a desire for scale and growth. However,
as in other contexts, we might have to realize that eternal growth is not sustainable.67
Digitalization follows the idea of enabling more: more growth, more revenue,
more profit, more students, more research, more everything. Instead, the fundamental question
we have to ask ourselves is whether privacy and academic freedom in higher education
should become a matter of sustainable infrastructures. Hence, in addition to Fiebig et al.'s
recommendations, we demand not only public infrastructures for public services, but
sustainable infrastructures. We claim that, when infrastructures are truly sustainable, the questions
of privacy and academic freedom will solve themselves.
64 Catherine Han, Deepak Kumar, and Zakir Durumeric, "On the Infrastructure Providers That Support
Misinformation Websites," Proceedings of the International AAAI Conference on Web and Social Media
16 (2022).
65 Joseph Menn and Taylor Lorenz, "Under pressure, security firm Cloudflare drops Kiwi Farms website,"
Washington Post, September 3, 2022, accessed November 11, 2022, https://www.washingtonpost.com/technology/2022/09/03/cloudflare-drops-kiwifarms. Please note that the authors are strongly convinced
that this specific example, Kiwi Farms, is a harmful entity that was only allowed to remain
connected to the rest of the Internet by carefully exploiting a claim of free speech to hide its illegal
activity, i.e., by reframing targeted harassment as a matter of speech. Hence, while we ultimately
agree with Cloudflare's decision to terminate services for the site, and note the harm done by
Cloudflare's hesitation in reaching this conclusion, we also note the challenge for society created by a
private company being in a position to make that decision.
66 Fiebig et al., "Heads in the Clouds."
67 Donella H. Meadows, Dennis L. Meadows, Jørgen Randers, and William W. Behrens, "The limits to
growth," in Green Planet Blues, eds. Ken Conca and Geoffrey Dabelko (London: Routledge, 2018), 25-29.
Disclosure Statement
None of the authors have conflicts of interest regarding the subject matter of this work,
apart from being academics working in the system we describe.
Acknowledgements
Our work was enabled by the use of a self-hosted Nextcloud instance, Signal (hosted on
Amazon EC2), Google Scholar, Microsoft Office, and a self-hosted BigBlueButton instance.
Any opinions, findings, and conclusions or recommendations expressed in this material
are those of the authors and do not necessarily reflect the views of their host institutions.
Bibliography
Aday, Serpil, and Mehmet Seckin Aday. "Impact of COVID-19 on the food supply
chain." Food Quality and Safety 4, no. 4 (2020): 167-180.
Ahmed, Sara. What’s the use?: On the uses of use. Durham, NC: Duke University Press,
2019.
Allen, Anita L. “Privacy-as-data control: Conceptual, practical, and moral limits of
the paradigm.” Conn. L. Rev. 32 (2000): 861.
Avila Pinto, Renata. "Digital sovereignty or digital colonialism." SUR-Int'l J. on Hum
Rts. 27 (2018): 15.
Belcourt, Monica. "Outsourcing—The benefits and the risks." Human Resource Management
Review 16, no. 2 (2006): 269-279.
Bender, Emily M., Timnit Gebru, Angelina McMillan-Major, and Shmargaret
Shmitchell. “On the Dangers of Stochastic Parrots: Can Language Models Be
Too Big?.” In Proceedings of the 2021 ACM Conference on Fairness, Accountabil-
ity, and Transparency (Association for Computing Machinery, 2021): 610-623.
Braud, Arnaud, et al. “The road to European digital sovereignty with Gaia-X and
IDSA.” IEEE Network 35, no. 2 (The Institute of Electrical and Electronics
Engineers, 2021): 4-5.
Caine, Kelly, and Rima Hanania. “Patients want granular privacy control over health
information in electronic medical records.” Journal of the American Medical
Informatics Association 20, no. 1 (2013): 7-15.
Chan, Adrienne S., and Donald Fisher, eds. The exchange university: Corporatization of
academic culture. Vancouver: UBC Press, 2009.
Churchman, Deborah. “Voices of the academy: academics’ responses to the corpora-
tizing of academia.” Critical Perspectives on Accounting 13, no. 5-6 (2002): 643-
656.
Coghlan, S., T. Miller, and J. Paterson. “Good proctor or “Big Brother”? AI Ethics and
Online Exam Supervision Technologies.” Philosophy & Technology (2021).
Custers, Bart, Simone van der Hof, and Bart Schermer. “Privacy expectations of social
media users: The role of informed consent in privacy policies.” Policy &
Internet 6, no. 3 (2014): 268-295.
Degrasse-Tyson, Neil. “The Clean Room.” Cosmos: A Spacetime Odyssey. National
Geographic, Fox Broadcasting, April 20, 2014.
Determann, Lothar, and Robert Sprague. “Intrusive monitoring: Employee privacy
expectations are reasonable in Europe, destroyed in the United States.”
Berkeley Tech. LJ 26 (2011): 979.
Farrand, Benjamin, and Helena Carrapico. “Digital sovereignty and taking back con-
trol: from regulatory capitalism to regulatory mercantilism in EU cybersecu-
rity.” European Security 31, no. 3 (2022): 435-453.
Feldmann, Anja, Oliver Gasser, Franziska Lichtblau, Enric Pujol, Ingmar Poese,
Christoph Dietzel, Daniel Wagner, et al. "The lockdown effect: Implications
of the COVID-19 pandemic on internet traffic." Proceedings of the ACM Internet
Measurement Conference (Association for Computing Machinery, 2020): 1-18.
Feldmann, Anja, Oliver Gasser, Franziska Lichtblau, Enric Pujol, Ingmar Poese,
Christoph Dietzel, Daniel Wagner, et al. "A year in lockdown: how the waves
of COVID-19 impact internet traffic." Communications of the ACM 64, no. 7
(Association for Computing Machinery, 2021): 101-108.
Fiebig, Tobias, Seda Gürses, Carlos H. Gañán, Erna Kotkamp, Fernando Kuipers,
Martina Lindorfer, Menghua Prisse, and Taritha Sari. “Heads in the Clouds:
Measuring the Implications of Universities Migrating to Public Clouds.”
arXiv preprint arXiv:2104.09462 (2021).
Fiebig, Tobias, and Doris Aschenbrenner. “13 propositions on an Internet for a
‘burning world’.” Proceedings of the ACM SIGCOMM Joint Workshops on Tech-
nologies, Applications, and Uses of a Responsible Internet and Building Greener
Internet (2022).
Floridi, Luciano. "The fight for digital sovereignty: What it is, and why it matters,
especially for the EU." Philosophy & Technology 33, no. 3 (2020): 369-378.
Gavilan, Diana, Adela Balderas-Cejudo, Susana Fernández-Lores, and Gema Marti-
nez-Navarro. “Innovation in online food delivery: Learnings from COVID-
19.” International Journal of Gastronomy and Food Science 24 (2021): 100330.
Gerber, Nina, Paul Gerber, and Melanie Volkamer. "Explaining the privacy paradox:
A systematic review of literature investigating privacy attitude and behavior."
Computers & Security 77 (2018): 226-261.
Grapperhaus, Ferd, and Kajsa Ollongren. Verificatie op de uitvoering van het overeengekomen
verbeterplan met Microsoft. 2019. https://www.tweedekamer.nl/kamerstukken/brieven_regering/detail?id=2019Z13829&did=2019D28465, accessed
May 30, 2022.
Hall, Shannon. "Exxon knew about climate change almost 40 years ago." Scientific
American 26 (2015).
Han, Catherine, Deepak Kumar, and Zakir Durumeric. "On the Infrastructure Providers
That Support Misinformation Websites." Proceedings of the International
AAAI Conference on Web and Social Media 16 (Association for the
Advancement of Artificial Intelligence, 2022).
van Hoek, Remko. "Research opportunities for a more resilient post-COVID-19
supply chain–closing the gap between research findings and industry practice."
International Journal of Operations & Production Management 40, no. 4
(2020): 341-355.
Holzbauer, Florian et al., “Not that Simple: Email Delivery in the 21st Century.”
USENIX Annual Technical Conference (2022).
Jones, Brian William. "The unlimited storage that Google promised my
university is being discontinued." Twitter. https://web.archive.org/web/20221129194157/https://twitter.com/bwjones/status/1490802506628145153,
accessed May 30, 2022.
Karamollahi, Mehdi, Carey Williamson, and Martin Arlitt. "Zoomiversity: a case
study of pandemic effects on post-secondary teaching and learning." International
Conference on Passive and Active Network Measurement. Cham: Springer,
2022: 573-599.
Kaur, Mannat, Simon Parkin, Marijn Janssen, and Tobias Fiebig. "'I needed to solve
their overwhelmness': How system administration work was affected by
COVID-19." 25th ACM Conference on Computer-Supported Cooperative Work and
Social Computing (Association for Computing Machinery, 2022).
Kwet, Michael. "In Stores, Secret Surveillance Tracks Your Every Move." The New
York Times, June 14, 2019, https://www.nytimes.com/interactive/2019/06/14/opinion/bluetooth-wireless-tracking-privacy.html, accessed May 30, 2022.
Mangini, Vincenzo, Irina Tal, and Arghir-Nicolae Moldovan. "An empirical study on
the impact of GDPR and right to be forgotten-organisations and users perspective."
Proceedings of the 15th International Conference on Availability, Reliability
and Security (2020): 1-9.
Meadows, Donella H., Dennis L. Meadows, Jørgen Randers, and William W. Behrens.
"The limits to growth." In Green Planet Blues: Critical Perspectives on Global
Environmental Politics, edited by Ken Conca and Geoffrey Dabelko, 25-29.
Abingdon: Routledge, 2018.
Menn, Joseph, and Taylor Lorenz. "Under pressure, security firm Cloudflare drops
Kiwi Farms website." Washington Post, September 3, 2022, https://www.washingtonpost.com/technology/2022/09/03/cloudflare-drops-kiwifarms/,
accessed November 11, 2022.
Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social
Life. Stanford: Stanford University Press, 2009.
NYU-AAUP Executive Commiee. “Statement from the NYU-AAUP on Zoom Cen-
sorship Today.” Academe Blog, October 23, 2020. hps://academeblog.
org/2020/10/23/statement-from-the-nyu-aaup-on-zoom-censorship-today/,
accessed May 30, 2022.
Ortutay, Barbara. "Facebook shuts out NYU academics' research on political ads."
AP News, August 5, 2021, https://apnews.com/article/technology-business-5d3021ed9f193bf249c3af158b128d18, accessed May 30, 2022.
Paerson, Clair C. “Contaminated and natural lead environments of man.” Archives
of Environmental Health: An International Journal 11, no. 3 (1965): 344-360.
Pikoos, Toni D., Simone Buzwell, Gemma Sharp, and Susan L. Russell. "The Zoom
effect: exploring the impact of video calling on appearance dissatisfaction
and interest in aesthetic treatment during the COVID-19 pandemic." Aesthetic
Surgery Journal 41, no. 12 (2021): NP2066-NP2075.
Pohle, Julia, and Thorsten Thiel. “Digital sovereignty.” In Practicing Sovereignty: Dig-
ital Involvement in Times of Crises, eds. Bianca Herlo, Daniel Irrgang, Gesche
Joost, and Andreas Unteidig, 47-67. Bielefeld: transcript Verlag, 2021.
Reddit, hps://www.reddit.com/r/MachineLearning/comments/uyra/d_i_dont_
really_trust_papers_out_of_top_labs/, accessed May 30, 2022.
Roberts, Huw, Josh Cowls, Federico Casolari, Jessica Morley, Mariarosaria Taddeo,
and Luciano Floridi. “Safeguarding European values with digital sover-
eignty: An analysis of statements and policies.” Internet Policy Review (2021).
Rojszczak, Marcin. “CLOUD act agreements from an EU perspective.” Computer
Law & Security Review 38 (2020): 105442.
Rowe, Sylvia, Nick Alexander, Fergus Clydesdale, Rhona Applebaum, Stephanie
Atkinson, Richard Black, Johanna Dwyer, et al. "Funding food science and
nutrition research: financial conflicts and scientific integrity." Nutrition
Reviews 67, no. 5 (2009): 264-272.
Slashdot, N.D.a hps://hardware.slashdot.org/story/22/02/14/1433256/, accessed
November 11, 2022.
Slashdot N.D.b hps://tech.slashdot.org/story/22/10/03/2327248/universities-adapt-
to-googles-new-storage-fees-or-migrate-away-entirely, accessed November
11, 2022.
Srnicek, Nick. Platform Capitalism. Hoboken: Wiley & Sons, 2017.
U.S. Department of Education. Family Educational Rights and Privacy Act (FERPA).
https://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html, accessed
November 11, 2022.
Unnikrishnan, Avinash, and Miguel Figliozzi. "Exploratory analysis of factors affecting
levels of home deliveries before, during, and post-COVID-19." Transportation
Research Interdisciplinary Perspectives 10 (2021): 100402.
Weiss, Martin A., and Kristin Archick. “US-EU data privacy: from safe harbor to pri-
vacy shield.” Congressional Research Service May 19, 2016.
Williamson, Ben, and Anna Hogan. “Pandemic Privatisation in Higher Education:
Edtech and University Reform.” Education International (2021).
Zubo, Shoshana. “Big other: surveillance capitalism and the prospects of an infor-
mation civilization.” Journal of Information Technology 30, no. 1 (2015): 75-89.
Article
Digital sovereignty, and the question of who ultimately controls AI seems, at first glance, to be an issue that concerns only specialists, politicians and corporate entities. And yet the fight for who will win digital sovereignty has far-reaching societal implications. Drawing on five case studies, the paper argues that digital sovereignty affects everyone, whether digital users or not, and makes the case for a hybrid system of control which has the potential to offer full democratic legitimacy as well as innovative flexibility.