
COVID-19 Tracing App for Mobiles - The Fine Line Between Trust and Big Brother


Abstract

In this paper we add to the current debate on the use and design of COVID-19 tracing apps for mobiles. We build on our research on datafication technologies in the workplace and outline what a fruitful transdisciplinary dialogue around technology design options might look like in order to make trust and control tango.
COVID-19 TRACING APP FOR MOBILES - THE FINE LINE BETWEEN TRUST AND BIG BROTHER
Dr. Simon SCHAFHEITLE
(simondaniel.schafheitle@unisg.ch)
&
Prof. Dr. Antoinette WEIBEL,
University of St. Gallen
Since the announcement of the outbreak in March 2020, the corona crisis has continued to rear its
ugly head in Europe: epidemiologists and health experts unanimously point to the same (few)
options to fight the current situation. In brief: we must flatten the curve. More precisely, the
prevailing and evidence-based policy initiative, inspired by epidemiological scholarship, is that of social distancing. However, the voices of other experts are becoming increasingly loud. For instance, economists believe that the policy measures must come with an expiration date in order to prevent economic damage from spiraling out of control. Psychologists ask policy makers to
consider the possible consequences of a recession, including increases in social boredom, domestic
violence, depression, and suicide rates.
The Corona Crisis: A Wicked Problem.
It has become clear that the current situation is a wicked problem: a complex, ambiguous, and unprecedented challenge for which there are no patent remedies and for which solutions must be newly created. In contrast to a dilemma, where the costs and benefits of the various alternatives can be weighed sensibly ex ante, the effectiveness of a measure to solve a wicked problem only becomes apparent through sometimes courageous trial-and-error evaluation.
Applied to the current COVID-19 situation, the problem is wicked in a double sense. First, decision-makers tackle multilayered risks with unprecedented mitigation strategies; there are, for instance, no absolute rights or wrongs in managing public health while preventing economic damage. Second, decision-makers face political risks, because it is always possible that a promising and well-thought-out solution from the drawing board proves unsuccessful, or even has negative or "backfiring" consequences.
So, How Do We Deal With The Situation?
We propose that an effective corona mobile app requires transdisciplinary cooperation. In our view, two approaches are central to fighting the COVID-19 wicked problem with a tracing mobile app. First, "wicked problems" can only be solved through cooperation not only among various technological experts, but also with the individuals (users, or guinea pigs in this case) and the politicians who have to design, communicate, and implement the solutions. Second, such transdisciplinary cooperation must be based on mutual trust, as only trust enables stakeholders to discuss openly and to engage willingly in novel, unorthodox solutions. User trust, as we argue, is the pivotal touchstone of success.
In this light, there is debate about utilizing a mobile app to distinguish between the infected, the immune, and those who have not been infected yet. The app is envisioned to analyze mobile phone Bluetooth data in order to identify and break infection chains by labelling/tracking an individual's health status. At the same time, restrictive policy measures could then be relaxed. The implementation of a corona tracing mobile app is a novel and serious approach to solving the wicked problem.
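To make the envisioned mechanism more concrete, the following minimal Python sketch illustrates one way a decentralized, Bluetooth-based tracing approach can work in principle: phones broadcast short-lived random identifiers, remember the identifiers they overhear, and later compare them locally against identifiers voluntarily published by users who tested positive. The class, function, and parameter names (`TracingClient`, `record_contact`, `EPOCH_MINUTES`, and so on) are hypothetical illustrations and not part of any actual app discussed here.

```python
import os
import time

EPOCH_MINUTES = 15  # hypothetical rotation interval for ephemeral identifiers


def new_ephemeral_id() -> bytes:
    """Generate a fresh random identifier; it carries no name, number, or location."""
    return os.urandom(16)


class TracingClient:
    """Minimal sketch of a decentralized, privacy-preserving tracing client."""

    def __init__(self) -> None:
        self.own_ids: list[bytes] = []        # identifiers this phone has broadcast
        self.observed: dict[bytes, int] = {}  # overheard identifier -> contact time

    def broadcast_current_id(self) -> bytes:
        eid = new_ephemeral_id()
        self.own_ids.append(eid)
        return eid  # in a real app this would be sent over Bluetooth Low Energy

    def record_contact(self, overheard_id: bytes) -> None:
        # Store only the pseudonymous identifier and a coarse timestamp.
        self.observed[overheard_id] = int(time.time())

    def check_exposure(self, published_infected_ids: set[bytes]) -> bool:
        # Matching happens locally on the phone; no central register of contacts.
        return any(eid in published_infected_ids for eid in self.observed)


# Usage sketch: two phones meet, and one user later tests positive.
alice, bob = TracingClient(), TracingClient()
bob.record_contact(alice.broadcast_current_id())
published = set(alice.own_ids)        # Alice uploads only her own random identifiers
print(bob.check_exposure(published))  # True: Bob is warned, Alice remains anonymous
```

The point of the sketch is that infection chains can be interrupted without the identity or location of either user ever leaving the phone, which is exactly the design space in which trust is won or lost.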
Design Options Matter for Drawing the Fine Line Between Trust and Control.
Science can make a valuable contribution to solving this challenge. For more than two years, the Institute for Work and Employment Research at the University of St. Gallen has been conducting a transdisciplinary NRP75 Swiss National Science Foundation project (together with business ethics and technology law) to examine how algorithms and technical solutions can be designed in the workplace so that novel workplace technologies (so-called datafication technology controls) promote employees' trust in the employer, or at least do not damage such trust.
The central insight from this research work to date is that invasive technologies and a vibrant trust relationship can go hand in hand if designed correctly. A transdisciplinary design approach, however, is a necessary prerequisite.
As a reminder: the corona tracing mobile app is intended to enable a successful balance between economic interests on the one hand and the saving of human lives on the other. It would therefore benefit politicians to examine the realization of such an app in light of these findings, in a transdisciplinary dialogue.
Numerous studies from various disciplines show that, when seeking to enable a trusting relationship between the applying institution and the user, the design of the technology is decisive along five main facets: (1) the users' possibilities to participate in the control process (e.g., feedback channels on the functionality and scope of the technological system), as well as (2) sufficient and correct information for users about which data is collected and how it is used.
In this realm, so-called function creep should be carefully considered. Function creep refers to the sui generis ability of intelligent technologies to unfold new functionalities through their mere application, functionalities that were not in the inventor's mind. More precisely, function creep manifests here in whether the mobile app retains only tracing capabilities or drifts toward tracking functionalities; for users' trust in the technology and in decision-makers, this makes a huge difference that should not be neglected.
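To illustrate why this tracing-versus-tracking distinction matters for trust, the following hypothetical sketch (in the same illustrative Python as above, with invented field names) contrasts the minimal record a tracing-only app needs with the profile a tracking app would accumulate.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TracingRecord:
    """Tracing: just enough to warn about an encounter, nothing more."""
    ephemeral_id: bytes  # rotating random identifier of the other phone
    day: str             # coarse time bucket, e.g. "2020-04-14"


@dataclass(frozen=True)
class TrackingRecord:
    """Tracking: a movement and identity profile (the 'Big Brother' scenario)."""
    user_id: str         # persistent, re-identifiable identity
    latitude: float      # precise location ...
    longitude: float
    timestamp: float     # ... at an exact point in time
    health_status: str   # centrally stored label such as "infected"
```

A tracing-only design can break infection chains with the first record type alone; every additional field of the second type widens the gap between the communicated purpose and the data actually collected, and with it the risk of function creep.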
Furthermore, (3) the appropriateness of data collection plays an important role in a trust-promoting app design: the technology should tap only into data sources that are required to achieve the communicated purpose. Finally, the app's (4) programmed intelligence (i.e., whether it serves descriptive, predictive, or even prescriptive purposes) and (5) the accessibility and comparability of data and analyses may appear less obvious, yet they are no less decisive for a vibrant trust relationship.
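The distinction in facet (4) can likewise be made tangible with a small, purely hypothetical sketch of three escalation levels such an app could implement; the thresholds, durations, and function names below are invented for illustration only.

```python
def describe(contact_minutes: list[int]) -> str:
    # Descriptive: merely reports what was observed.
    return f"{len(contact_minutes)} risk contacts in the last 14 days"


def predict(contact_minutes: list[int]) -> float:
    # Predictive: derives a (made-up) infection risk score from exposure durations.
    return min(1.0, sum(contact_minutes) / 120)


def prescribe(risk: float) -> str:
    # Prescriptive: turns the score into an instruction, the most invasive level.
    return "self-quarantine and get tested" if risk > 0.5 else "no action required"


exposures = [15, 40, 70]  # contact durations in minutes (illustrative values)
print(describe(exposures))
print(prescribe(predict(exposures)))
```

Each step up this ladder collects no additional data, yet it shifts more decision authority to the system, which is why the app's programmed intelligence deserves the same scrutiny as its data collection.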
Urgent Questions, Unclear Answers.
Can society accept a mobile app that is capable of independently sending an intelligent warning signal when an infected and a non-infected person come too close? Or does it create a digital prejudice which,
instead of maintaining community trust, contributes to collective insecurity? Should anonymous
data be accessible to everyone at any time? Along these lines, another pressing question emerges:
what about the voluntary nature and the anonymity of data if non-infection or immunity
become the labor market's newest currency? In that respect, psychologists might anticipate a self-reinforcing spiral of distrust within society, which would ultimately break up the social contract. Additionally, lawyers would raise data protection issues, and ethicists might ask whether and how conflicting benefits could be weighed against each other, similar to the discussion that Dr. Wolfgang Schäuble started very recently.
Solutions to wicked problems, as already mentioned, require a trust-based and transdisciplinary dialogue and, thus, a joint design process similar to design thinking. A promising solution is one that is able to maintain the relationship of trust between the people and the state, ensuring that social peace is not damaged.
Chapter
No trust, no use? Trust is often touted as a critical success factor for the use of new technologies, especially when intelligent systems, so-called AI (artificial intelligence), are involved. These systems are finding their way into today's society more and more, both in private life (e.g., shopping with Amazon's AI-based smart speaker Alexa) and in professional or educational settings (e.g., intelligent systems intended to support personnel selection or learning processes). The use of smart, ubiquitous technologies increases uncertainty and skepticism, particularly among end users without technical expertise, and intensifies the tension between humans, machines, and society. The question of trust moves into the limelight. Is consumer trust the solution for fully exploiting the much-praised potential of artificial intelligence? It is not quite that simple. Ambiguities in definitions, language use, and measurement methods dilute our understanding of the relationship between trust and use. Across the various disciplines it has not been clearly established whether the use of new technologies, and of AI in particular, actually goes hand in hand with trust, for what purpose, and trust in whom: the manufacturers? The designers? How can trust and use be distinguished from one another? This contribution aims to demystify curious stories and claims about AI and trust in order, ultimately, to bring artificial intelligence into practice more effectively. This requires a transdisciplinary and, above all, audience-appropriate discourse on what intelligent systems are, and a differentiated examination of the role that trust might play. There are, after all, trust skeptics who vehemently argue that trust plays no role whatsoever in the context of AI. We argue that trust, especially in the end-user context, is an important variable that decisively influences not only the adoption of AI-based systems but also the way in which they are used. Using practical examples, we show that an appropriately calibrated level of trust not only leads to more efficient, safe, and synergetic interaction with AI-based or automated systems, but can even save lives.