Towards a Framework for Information Privacy
Thirty Ninth International Conference on Information Systems, San Francisco 2018 1
Towards a Framework for Information
Privacy in Complex Service Ecosystems
Short Paper
Christian Kurtz
University of Hamburg
Hamburg, Germany
christian.kurtz@uni-hamburg.de
Martin Semmann
University of Hamburg
Hamburg, Germany
martin.semmann@uni-hamburg.de
Wolfgang Schulz
Hans-Bredow-Institute
Hamburg, Germany
w.schulz@hans-bredow-institut.de
Abstract
Information privacy has gained visibility and rising awareness in society as well as media coverage due to the case of Cambridge Analytica and Facebook. This case demonstrates the extent of complex service ecosystems with a multitude of actors involved in actions that impact information privacy. As such ecosystems are nowadays ubiquitous, the implementation of the General Data Protection Regulation (GDPR) seeks to establish responsibility regarding actions taken by data processors. With this paper, we propose an analytical framework that builds on an analysis of privacy-invasive critical cases in complex service ecosystems. We applied a cross-impact matrix to systematically identify critical issues. Additionally, by visualizing data flows between actors, privacy-critical issues in service ecosystems become apparent. Building on these insights, privacy-related problem propositions are derived that lead to future design-oriented research directions. Thus, we propose a framework that helps scholars and practitioners to identify blind spots and privacy-critical issues in service ecosystems.
Keywords: Service Ecosystem, General Data Protection Regulation, Information Privacy,
Framework, Design Science
Introduction
A 2015 editorial in the Information Systems Journal stated that “recent technological changes are
generating additional privacy challenges beyond the existing landscape” (Belanger and Xu 2015, p. 575).
The newly publicized case of Cambridge Analytica’s misuse of Facebook user data is just one prominent
example of such challenges. About 87 million Facebook users were affected by the privacy-invasive access
of user data by an application that was published on Facebook (Frier 2018). This application was able to
access a wide variety of data from Facebook users and users’ friends, subsequently delivering the data to a
third actor, Cambridge Analytica. As the case shows, users nowadays face the challenge of maintaining their
privacy in the context of highly interconnected information processing. Currently, users often make only a single decision to use a service in general. However, technological advances have led to complex service
ecosystems that combine services from a multitude of service providers. Organizations implement other,
backend services in their services for various purposes. These include application performance
management, social network integration, or monetization of services through advertisements. A single user
decision to use a service can result in giving access to multiple parties who may process this data (Conger
et al. 2013; Razaghpanah et al. 2018). Information about a user is aggregated through many avenues of data
access, which is both highly complex and confusing for users. Where users previously faced one decision
about one organization, they now face a multitude of actors that are not necessarily even visible to the user.
These changes to complex services have led to problems regarding users’ privacy over time.
In this article we propose a framework that is appropriate to analyze privacy-critical issues associated with
these changes. As the case of Facebook demonstrates, no solution yet exists that fully covers all critical issues in protecting users' privacy. The problems that arise within digital, interconnected services have not yet
been sufficiently considered in the design of solutions. However, according to the framework of Privacy by
Design, privacy must be approached from a design-thinking perspective (Cavoukian 2009). However, without a precise description of a problem, the necessary requirements for a suitable solution cannot be
derived (Peffers et al. 2007). We develop a framework to analyze critical cases to identify privacy-related
problems, as we believe that privacy protection requires better solutions than those that already exist. Our
framework makes it possible to identify blind spots and enables an examination of privacy-critical issues
between multiple actors located in an interconnected service ecosystem. In the next step, organizations and
regulators may benefit from guidance on where to position privacy-based modifications in order to prevent
critical cases like that of Facebook. This knowledge is also indispensable because the General Data
Protection Regulation (GDPR) specifies that organizations bear responsibility with regard to actions taken
by “processors”, which also includes integrated backend services (Regulation 2016, Chapter 4 Article 25).
To perform data protection impact assessments, knowledge is required that cannot be easily produced for
complex service ecosystems (Regulation 2016, Chapter 4 Article 35). The same is true for the information
basis on which a data subject, a user, gives consent. In summary, we research privacy-critical issues of
multi-actor information-processing that are related to service ecosystems.
This paper begins with a theoretical framework that includes the foundations of service ecosystems and
information privacy in multi-actor relationships. We then derive a multi-actor perspective on service
ecosystems. Afterwards, in the research design section, we derive the framework for assessing multilateral, independent (design) decisions and consider their implications for privacy in critical cases. Subsequently, two critical cases are analyzed. Building on the results, we derive in the discussion the critical issues for privacy in those cases. These issues are used to deduce problem propositions that may act as the
basis of solution designs. The paper finishes with a conclusion and outlook for future research endeavors
that build on the results of this research-in-progress article.
Theoretical Framework
In general, the ways in which service is delivered have changed fundamentally in many respects and have
become a key driver in the information systems discipline (Böhmann et al. 2014). Research on service has
likewise shifted from focusing on single services towards systems of services (Böhmann et al. 2014;
Chandler and Lusch 2015; Vargo and Lusch 2011). Building on this systemic perspective, service ecosystems
are defined as “a relatively self-contained, self-adjusting system of resource-integrating actors connected
by shared institutional arrangements and mutual value creation through service exchange” (Lusch and
Vargo 2014, p. 24) that includes rules and norms (Vargo and Lusch 2016, p. 11). This definition implies a
dynamic, combined configuration across multiple actors that creates value for the beneficiary. Recently
published studies show the high dissemination of third parties in (mobile) services from a privacy-critical
perspective (Backes et al. 2016; Lerner et al. 2016; Meng et al. 2016; Razaghpanah et al. 2018; Vallina-
Rodriguez et al. 2016). Third parties have numerous points of access both on one device and across multiple
devices (Brookman et al. 2017; Buss 2015), and they can thus condense the information they gather on a
single user via unique identifiers such as IP addresses or user settings (Kurtz et al. 2016).
Early privacy studies mention practices of companies that are privacy-critical (Culnan 1993; Smith et al.
1996). Also, studies have examined the dimensions reflecting individuals’ concerns about organizational
practices that impact privacy (Smith et al. 1996). In the models subsequently developed with the intention
of examining user privacy decision-making, such practices were considered in user privacy risks and privacy
concerns (Dinev and Hart 2006; Malhotra et al. 2004; Nikkhah and Sabherwal 2017). In this context, the
conflict experienced by web-based service providers with regard to how much data to share with third
parties was examined, taking user privacy concerns into account (Gopal et al. 2018). However, decision-
making on the part of an individual implies both choice and consent, and today, with increasingly complex
trade-offs, these notions are no longer sufficient (Acquisti et al. 2015; Solove 2012). This complexity results in individuals sharing and providing their personal data without realizing it (Belanger and Xu 2015, p. 576).
For instance, smartphone operating systems do not enable users to view the way third-party applications
collect and share their data (Crossler and Bélanger 2017; Enck et al. 2014). These modifications in
information technology lead to confusing and unclear sharing of personal data (Acquisti et al. 2015, p. 509).
Incomplete and asymmetric information are reasons for privacy uncertainty among users, which results in
an inability to act in a self-interested manner (Acquisti et al. 2015; Crossler and Bélanger 2017).
The GDPR came into effect in May 2018 in the European Union; it remains to be seen to what extent the GDPR can address these problems. The GDPR is not based on a specific theory of privacy but sticks to the concept of using “personal data” as the starting point of regulation, trying to guarantee persons control over the processing of those data. Additionally, the GDPR includes elements of systemic
data protection (Tikkinen-Piri et al. 2018). This act aims to protect privacy in the digital world, thus relating
to service ecosystems, in the form of data protection and data regulation (Danezis et al. 2015). Our approach
can therefore also help to demonstrate to what extent the GDPR can address privacy issues in complex
ecosystems. We intend to assess the critical cases we detect based on the GDPR in further projects (Kurtz
et al. 2019). The evaluation of the cases on the basis of normative standards basically follows the same procedure as the analysis presented here but is an independent assessment that is not within the scope of this paper and thus not presented here. The results of that assessment will be shared with regulators and policy makers to create a real-world impact of our research and to find ways in which society can deal with the issues.
Figure 1. Adapted Model of Privacy in Ecosystems (based on (Conger et al. 2013))
In this article, we use the term “ecosystem” to indicate that actors interact not only with well-designed
information systems (IS) but also with systems of systems. Parts of these systems emerge in the
spontaneous interaction among actors. At some point, actors make (design) decisions that have
consequences for other actors, for instance for the scope of subsequent (design) decisions. Those systems can be understood in a broad sense that includes software and hardware architecture, the structuring of social procedures, and the interaction of all of these. These ecosystems have
consequences for the privacy of actors that have not yet been examined in detail. A model previously
published offers a system perspective on the interactions between an individual, an organization and an
integrated third organization which can influence an individual's privacy (Conger et al. 2013). Our focus is on
the interactions among these actors. To illustrate this, we have refined and modified the existing model
(Figure 1) to depict a part of the ecosystem that we will discuss in more detail below.
We name the actors as “user”, “frontend service”, and “backend service”. Furthermore, we consider the
actor “platform” to allow us to analyze the critical cases in more detail and reflect how services are provided
in a digital world. These actors occur not just once but several times, for instance, a frontend service
integrates various backend services as mentioned above. In line with this concept, our investigation does
not factor in the characteristics influencing the disclosure of personal information by the first party (privacy
calculus). Furthermore, we do not factor in the authors' understanding of the fourth party (which is considered an illegal entity, i.e., hackers). The actors and their actions take effect in service ecosystems.
Research Design
In the following, we develop an analytical framework that helps to specify the problems and phenomena in
these complex service ecosystems that current privacy research cannot yet entirely explain. The
service ecosystem is subject to problems that occur directly where the actors are located or at the interfaces
between them. To reveal these problems, we make use of critical cases. We understand “critical” in a way
that builds on the epistemological understanding of “critique”, that is, discussing issues with the intention
of preventing (unproductive) separation of single categories, fields, and practices of thought. To do that we
work with those established categories and extend them from a service ecosystem perspective. By shifting
the perspective towards multi-actor service systems, we are able to identify privacy related issues along
interfaces and distributed data processing. Thus, we contribute to the ongoing discourse about information
privacy. Our research will help to identify blind spots and specify problems that are results of path dependencies, sector- and layer-specificities, and disciplinary boundaries. This research is thus the first
step towards design science projects that can lead to practices and models that foster privacy aware service
ecosystem design and thus can support regulators in identifying inconsistencies of recent approaches.
A critical case is thus also a case for which no analytical framework existed that is capable of systematically locating privacy-critical issues in their entirety. This need has already been raised, as “[m]uch of the
research on information privacy has focused on individuals […]” (Belanger and Xu 2015, p. 576) as a
consequence of which existing privacy models cannot be used to adequately examine the critical cases.
There has already been a call for the examination of information privacy violations and of the organizational factors which lead to them (Belanger and Xu 2015, p. 576). Building on this call for research, we propose an
analytical framework for privacy-critical issues located in complex service ecosystems.
Different critical issues for privacy can become visible in the examination of the entire complex service
ecosystem. For this, as a method of analyzing the critical cases, we use a cross impact matrix to identify
privacy-related problems. This enables us to identify actions and their corresponding impacts in the
complex service ecosystem. Additionally, we can explore which effects the multilateral, independent (design) decisions of each actor within the service ecosystem cause. In the matrix, the actors
are arranged across the top of the matrix and down one side. This allows a simple and clear presentation of
both the actors and the cause-effect relationships among them. There can also be issues where the cause-effect relation only affects one actor. This arrangement of actors is necessary to create transparency and to
identify critical privacy issues. This is needed to specify the problems, forming the basis for designing suitable requirements for artifacts that protect individuals' privacy. Design is the act of creating an
explicitly appropriate solution to a defined problem, and in the process of designing, it is necessary to
specify the problem such that the solution can take into account its complexity (Peffers et al. 2007). Without
such problem specification, no solutions can be designed that take the problems into account to their full
extent. This problem specification is the first process step of the Design Science Process Model (Peffers et al. 2007), whose subsequent steps we will address in future work. In our analytical framework, the critical issues for
privacy are identified by the application of the cross-impact matrix. By the analysis of critical cases in this
cross-impact matrix we identify distinguishing as well as repeating patterns. Subsequently, this enables us
to derive problem propositions. Building on this methodology, our research process, shown schematically
in Figure 2, is divided into three steps: (1) apply critical cases to the cross-impact matrix, enabling us to
identify critical issues for privacy, (2) use these issues to derive privacy-related objectives and requirements
of a solution, and (3) finally derive suitable design solutions for users, policy makers, and industry.
Figure 2. Analytical Framework for Establishing Information Privacy in Complex Service
Ecosystems
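To make the matrix arrangement concrete, the sketch below encodes a cross-impact matrix as a nested mapping from acting actor to impacted actor. The actor roles and sample actions are paraphrased from the Facebook case discussed later; the data structure and function names are our own illustrative assumptions, not part of the framework itself.

```python
# Illustrative sketch of a cross-impact matrix (hypothetical encoding):
# actors appear both as rows (acting actor) and columns (impacted actor),
# and each cell collects the privacy-relevant actions linking the two.
ACTORS = ["User", "Platform", "Frontend Service", "Backend Service"]

# matrix[actor][impacted] -> actions of `actor` that impact `impacted`
matrix = {a: {b: [] for b in ACTORS} for a in ACTORS}

def record_action(actor: str, impacted: str, action: str) -> None:
    """Register an action of one actor that impacts another (or itself)."""
    matrix[actor][impacted].append(action)

# Sample entries, paraphrased from the Facebook case:
record_action("User", "User", "didn't carefully check privacy settings")
record_action("Platform", "User", "opt-out privacy settings for user's friends")
record_action("Frontend Service", "Backend Service",
              "user and friends' data actively transmitted")
record_action("Backend Service", "User",
              "uses the received data for targeted advertising")

def critical_cells(m):
    """Non-empty cells of the matrix: the candidate privacy-critical issues."""
    return [(a, b, acts) for a, row in m.items()
            for b, acts in row.items() if acts]

for actor, impacted, actions in critical_cells(matrix):
    print(f"{actor} -> {impacted}: {'; '.join(actions)}")
```

Reading off the non-empty cells corresponds to step (1) of the research process; steps (2) and (3) would then derive objectives and design solutions from these cells.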
In this article, which considers research that is still in progress, we conduct, to a limited extent, the first two
steps of the research process. We examine two different critical cases that show the range of application of
the analytical framework. The critical cases have their own peculiarities. Over the course of the whole research process, we will analyze multiple critical cases in the context of a full research paper. This allows us to sharpen our
analytical framework and to identify and specify problems and patterns that do not occur in the two critical
cases analyzed here.
Critical Case Analysis
“Facebook and Cambridge Analytica”
The data scandal of Facebook and Cambridge Analytica, with wide coverage in the media, also led to a hearing of Facebook's CEO Mark Zuckerberg before the United States House of Representatives Committee on Energy and Commerce (Rosenberg et al. 2018; Solon 2018; United States House of Representatives Committee on Energy and Commerce 2018). In this case, users of the platform Facebook used the frontend service “Your Digital Life” provided on the platform. By authorizing the requested data access rights for this frontend service, users disclosed their own data and also enabled Your Digital Life to access the data of their friends. Only 270,000 users directly used Your Digital Life; by contrast, the data of 87 million Facebook users in total was shared with the service (Frier 2018). Afterwards, Your Digital Life sold this data to a backend service, Cambridge Analytica, which used it to target users on Facebook (Rosenberg et al. 2018).
| Action of \ Impact on | User | Platform Facebook | Frontend Service Your Digital Life | Backend Service Cambridge Analytica |
|---|---|---|---|---|
| User | 4) Didn't carefully check Privacy Settings | | | |
| Platform Facebook | 1) Opt-out Privacy Settings for User's Friends | | 2) Enables Service Data Access on the Platform | |
| Frontend Service Your Digital Life | 3) Offers data-collecting Application | 5) Breaks the Policy | | 6) User and Friends' Data are actively transmitted |
| Backend Service Cambridge Analytica | 7) Use the received Data for Targeted Advertising | 8) Exploits Possibilities of Targeting on the Platform | | |

Data flow: Platform Facebook (A) → Frontend Service Your Digital Life (B) → Backend Service Cambridge Analytica (C) → Targeted Advertising on the Platform (D).

Table 1. “Facebook” - Problem Description
In this case, several flaws led to this large data breach. At the time of the case, users had to manually opt out of sharing their data with the services used by their friends (1). Facebook enabled external services on the platform to access the data of both users and users' friends (2). The frontend service Your Digital Life thus collected data (3) of Facebook users who did not carefully check their privacy settings (4). Your Digital Life then did not comply with Facebook's data policy (5) and transmitted the data to the backend service Cambridge Analytica (6). Based on this data, Cambridge Analytica delivered individualized advertisements (7) that exploited the targeting possibilities intended by the platform provider (8).

The data flows of this case (Table 1) show that the data is passed from the platform Facebook (A) to the frontend service Your Digital Life (B). Data are then actively transmitted to the backend service Cambridge Analytica (C), which used this data for targeted advertisements on Facebook (D). Such deceitful behavior could have been prevented if mechanisms had been implemented to limit data sharing as well as access to and forwarding of data on the platform.
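The chain A–D can also be read as a directed graph over the ecosystem's actors; traversing it makes explicit which actors ultimately hold data the user only ever disclosed to the platform. The encoding below is a hedged sketch of that reading (the graph representation is ours, not an artifact of the paper):

```python
from collections import deque

# Data flows of the Facebook case as a directed graph (illustrative encoding):
# platform -> frontend service -> backend service -> back onto the platform
# in the form of targeted advertising.
flows = {
    "Platform (Facebook)": ["Frontend Service (Your Digital Life)"],
    "Frontend Service (Your Digital Life)": ["Backend Service (Cambridge Analytica)"],
    "Backend Service (Cambridge Analytica)": ["Platform (Facebook)"],
}

def reachable(graph: dict, source: str) -> set:
    """All actors that can end up holding data originating at `source`."""
    seen, queue = set(), deque([source])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Although users consented only to the platform and one app, the data
# reaches every actor in this small ecosystem (including the platform
# again, as targeted advertising):
holders = reachable(flows, "Platform (Facebook)")
```

Visualizing such a graph is precisely what makes blind spots, i.e., actors reachable by data without the user's awareness, apparent.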
“AccuWeather and RevealMobile”
In the second critical case, the iOS application AccuWeather transmitted user device data to a backend service called RevealMobile, which approximated the user's location using this data even when the user had revoked the application's permission to access location services. This latter organization focuses on mobile marketing by using such data to segment user groups for advertising (RevealMobile 2017).
In this case, the application AccuWeather was available for download in the iOS AppStore after a review by Apple (1). The privacy statement declared that the application uses methods to approximate the user's location and that this data is transmitted to backend services (AccuWeather 2018). The backend services receiving this data were neither explicitly stated nor was a complete specification of these services included (2). Approximation of a user's location was possible because in iOS, access to a user's location is linked to technical access to GPS data (3). This is not visible for users who do not read the privacy statement closely. There, the technologies and the approximation actions of backend services were mentioned (4), but the backend services were not named explicitly. When the application was used, the platform iOS sent the wi-fi name, BSSID (Basic Service Set Identification, corresponding to the MAC address of the connected wireless access point), and Bluetooth status, which were used to approximate the location of a user's device (5). During a testing period lasting 36 hours, while the application was not in the foreground of the screen, the mentioned data was sent 16 times to the company RevealMobile (Strafach 2017). A study of RevealMobile states that the "[…] technology sits inside hundreds of apps […]” (6) and “[i]t turns the location data coming out of those apps into meaningful audience data […]" (7) (RevealMobile 2016, p. 2).
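The approximation technique works because BSSIDs of wireless access points are catalogued, together with their geographic positions, in large lookup databases. The sketch below illustrates the principle only; the BSSIDs, coordinates, and database are entirely made up, and real services aggregate databases covering millions of access points.

```python
# Hypothetical illustration of BSSID-based location approximation as
# described for the AccuWeather/RevealMobile case. All entries invented.
BSSID_LOCATIONS = {
    "aa:bb:cc:dd:ee:01": (53.5511, 9.9937),    # fictitious access point, Hamburg
    "aa:bb:cc:dd:ee:02": (37.7749, -122.4194), # fictitious access point, San Francisco
}

def approximate_location(bssid: str):
    """Return the (lat, lon) of the connected access point, if known.

    No GPS permission is involved: the BSSID alone, transmitted by the
    operating system, is enough to place the device near the access point.
    """
    return BSSID_LOCATIONS.get(bssid)

# The app's location permission was revoked, yet a transmitted BSSID
# still reveals the device's approximate position:
loc = approximate_location("aa:bb:cc:dd:ee:01")
```

This is why revoking GPS access alone does not stop location inference: any actor receiving the BSSID can perform the same lookup.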
| Action of \ Impact on | User | Platform iOS | Frontend Service AccuWeather | Backend Service RevealMobile |
|---|---|---|---|---|
| User | 4) Didn't carefully check the Privacy Statement | | | |
| Platform iOS | 3) Location Access only linked to GPS Data | | 1) App published | |
| Frontend Service AccuWeather | 2) Privacy Statement does not include Backend Services | | | 5) Transmission of Data via API to approximate Location |
| Backend Service RevealMobile | 7) Turns Data into Audience Data for Advertising | 6) Wide Distribution of RevealMobile on Platform | | |

Data flow: User's Device Data (A) → Application AccuWeather (B) → via API (C) → Backend Service RevealMobile (D) → Enrichment with Data of other Backend Services (E).

Table 2. “AccuWeather” - Problem Description

The data flows of this case (Table 2) differentiate it from the prior one. The user's device data (A) is passed on from the platform iOS to the application AccuWeather (B). Then, the data is passed from the platform via the API in the application (C) to the backend service RevealMobile (D). There, the data is enriched with data sources of other backend services to approximate the location of users (E) (Sapiezynski et al. 2015).

The main issue was that the user's denial of location access to the app is not consistent with the technical implementation. The operating system transmitted Wi-Fi BSSID data, which at first view seems non-privacy-related, to the application with its included backend services. In normal usage of iOS, users have no opportunity to consent to or reject this information flow. In the next step, this data is enriched with databases that include the locations of Wi-Fi networks. It then becomes possible for actors in the service ecosystem to approximate the device's location, despite the fact that the user denied access to his or her location. Such transmission of data could have been prevented by the platform if mechanisms had been implemented to regulate access to data that may become privacy-critical for users.

Discussion

The following offers an overview of the problem propositions, shown in Table 3, deduced from the privacy-critical issues in the analyzed cases. The propositions are linked to the different aspects of the cases. As the critical case analysis shows, service ecosystems exhibit several modes of action and impacts on privacy. Nevertheless, both analyzed cases share commonalities regarding the possibility to embed actors within the service ecosystem and process data of users. Further research and an extension of the critical
case analysis is thus needed to validate the identified issues. However, the identified problems relate to core mechanisms of the service ecosystems and are thus worthwhile to build on. As several problems are identified, we discuss only two problem propositions in detail to provide insights into the following design science projects. In total, these are only exemplary and initial problem propositions that require further enrichment by other cases.
The problem proposition “Overview of Backend Services” can be derived from both cases. It consists of the problem for users that no overview of backend services exists. It is not always possible to identify which
backend services are available in a given frontend service (second case). It is even more difficult to maintain
an overview of which frontend services used by a user are connected to which backend services.
The second problem proposition, “Format of Transparency”, highlights the problem of the complexity of privacy statements (Keith et al. 2018), which intensifies due to backend services emerging within services. In the statements, the interactions with backend services are mentioned and described. However, the backend privacy policies are thereby also implicitly accepted. Nevertheless, consent to privacy statements must be given in an informed manner, which seems very doubtful for such complex privacy statements.
| Action of \ Impact on | Users | Platforms | Frontend Services | Backend Services |
|---|---|---|---|---|
| Users | Check of Privacy Statements and Settings (FB4, AW4) | Careful Use of Platforms (FB1, FB5, AW3, AW6) | | |
| Platforms | Default Settings for Privacy Protection (FB1, FB5); Implementation of User Decisions (AW3) | | Control when Data leave Platform (FB2, FB3, FB6, AW1, AW5) | Control when Data leave Platform (FB6, FB8, AW5) |
| Frontend Services | Format of Transparency (FB4, AW2, AW4) | | | Interaction with Backend Services (FB6, AW5, AW7) |
| Backend Services | Overview of Backend Services (FB6, FB7, AW5, AW6, AW7) | | | |
Table 3. Derived Privacy-Related Problem Propositions
The two cases have data flows with different characteristics. The data flow models contain active data transmission on the part of the frontend service (“Your Digital Life”) or data transmission via an API (implemented in the application AccuWeather). The data flows may be classified differently. Relevant for this may be which prior actions took place before the privacy-critical data flow occurred. Such aspects will be considered in our subsequent research. Building on the identified problems (Figure 2), we will apply the Design Science
Research Process Model (Peffers et al. 2007) to develop socio-technical artifacts that aim at improving privacy in service ecosystems. Next, objectives of a solution are derived and instantiated by designing
solutions accordingly. This is followed by demonstrating and evaluating the artifacts. For policy makers, we
would work out the requirements for regulations to develop solutions to deal with the identified issues.
Conclusion and Outlook
According to the GDPR, the frontend service provider may be held responsible for implemented backend
services in the future. As digital services comprise modules of different backend services, issues arise regarding users' privacy as well as frontend service providers' responsibility (Kurtz et al. 2018). We
have developed a framework to specify the problems of multi-actor information-processing that are related
to service ecosystems. This is also essential for evaluating the GDPR and the responsibility of actors under
the assumption that the supervision of those actors is sufficient to control the privacy risks to users.
Moreover, within the GDPR are instruments that could be improved based on our framework, especially
instruments such as the data protection impact assessments (Regulation 2016, Chapter 4 Article 35).
In further developing this framework and the problem propositions briefly outlined herein, we expect that
the number of problem specifications will grow. We would substantiate this through analysis of additional
cases. To summarize, we would like to emphasize once again the necessity of such a framework. A complete
view of complex ecosystems is needed to protect privacy as the interview with the platform operations
manager at Facebook shows. Asked what kind of control Facebook had over the data given to
external services, he stated “Absolutely none. Once the data left Facebook servers there was not any control,
and there was no insight into what was going on” (Lewis 2018). We offer a framework by which such
privacy-critical problems can be specified. This is a necessary step to enable design of comprehensive
solutions for privacy, so that users and organizations are not affected by repeated and unnecessary failures.
Acknowledgements
This research was sponsored by the Hamburg Ministry of Science, Research and Equality in the project
Information Governance Technologies under the reference LFF-FV 34. We thank Tilo Böhmann for his
valuable feedback throughout the development of this paper.
References
AccuWeather. 2018. "Privacy Statement" Retrieved 03.06.2018. https://www.accuweather.com/en/privacy
Acquisti, A., Brandimarte, L., and Loewenstein, G. 2015. "Privacy and Human Behavior in the Age of
Information" Science (347:6221), pp. 509-514.
Backes, M., Bugiel, S., and Derr, E. 2016. "Reliable Third-Party Library Detection in Android and Its
Security Applications" Conference on Computer and Communications Security. pp. 356-367.
Belanger, F., and Xu, H. 2015. "The Role of Information Systems Research in Shaping the Future of
Information Privacy" Information Systems Journal (25:6), pp. 573-578.
Böhmann, T., Leimeister, J. M., and Möslein, K. 2014. "Service Systems Engineering" Business &
Information Systems Engineering (6:2), pp. 73-79.
Brookman, J., Rouge, P., Alva, A., and Yeung, C. 2017. "Cross-Device Tracking: Measurement and
Disclosures" Proceedings on Privacy Enhancing Technologies (2017:2), pp. 133-148.
Buss, J. 2015. "Cross-Device Advertising: How to Navigate Mobile Marketing’s Next Big Opportunity"
Journal of Digital & Social Media Marketing (3:1), pp. 73-79.
Cavoukian, A. 2009. "Privacy by Design" Information and Privacy Commissioner of Ontario, Canada.
Chandler, J. D., and Lusch, R. F. 2015. "Service Systems: A Broadened Framework and Research Agenda on Value Propositions, Engagement, and Service Experience" Journal of Service Research, pp. 6-22.
Conger, S., Pratt, J. H., and Loch, K. D. 2013. "Personal Information Privacy and Emerging Technologies"
Information Systems Journal (23:5), pp. 401-417.
Crossler, R. E., and Bélanger, F. 2017. "The Mobile Privacy-Security Knowledge Gap Model: Understanding
Behaviors" Proceedings of the 50th Hawaii International Conference on System Sciences.
Culnan, M. J. 1993. "'How Did They Get My Name?': An Exploratory Investigation of Consumer Attitudes toward Secondary Information Use" MIS Quarterly, pp. 341-363.
Danezis, G., Domingo-Ferrer, J., Hansen, M., Hoepman, J.-H., Metayer, D. L., Tirtea, R., and Schiffner, S. 2015. "Privacy and Data Protection by Design - from Policy to Engineering" arXiv preprint arXiv:1501.03726.
Dinev, T., and Hart, P. 2006. "An Extended Privacy Calculus Model for E-Commerce Transactions"
Information Systems Research (17:1), pp. 61-80.
Enck, W., Gilbert, P., Han, S., Tendulkar, V., Chun, B.-G., Cox, L. P., Jung, J., McDaniel, P., and Sheth, A.
N. 2014. "Taintdroid: An Information-Flow Tracking System for Realtime Privacy Monitoring on
Smartphones" ACM Transactions on Computer Systems (TOCS) (32:2), p. 5.
Frier, S. 2018. "Facebook Says There May Be More Cambridge Analytica-Sized Leaks." Retrieved 03.06.2018. Bloomberg. https://www.bloomberg.com/news/articles/2018-04-26/facebook-says-there-may-be-more-cambridge-analytica-sized-leaks
Gopal, R. D., Hidaji, H., Patterson, R. A., Rolland, E., and Zhdanov, D. 2018. "How Much to Share with Third Parties? User Privacy Concerns and Website Dilemmas" MIS Quarterly (42:1), pp. 143-164.
Keith, M. J., Frederickson, J. T., Reeves, K. S., and Babb, J. 2018. "Optimizing Privacy Policy Videos to
Mitigate the Privacy Policy Paradox" in Proceedings of the 51st Hawaii International Conference on
System Sciences. Hawaii.
Kurtz, A., Gascon, H., Becker, T., Rieck, K., and Freiling, F. 2016. "Fingerprinting Mobile Devices Using
Personalized Configurations" Proceedings on Privacy Enhancing Technologies (2016:1), pp. 4-19.
Kurtz, C., Semmann, M., and Böhmann, T. 2018. "Privacy by Design to Comply with GDPR: A Review on
Third-Party Data Processors" in: Americas Conference on Information Systems. New Orleans.
Kurtz, C., Wittner, F., Semmann, M., Schulz, W., and Böhmann, T. 2019. "The Unlikely Siblings in the GDPR
Family: A Techno-Legal Analysis of Major Platforms in the Diffusion of Personal Data in Service
Ecosystems" Proceedings of the 52nd Hawaii International Conference on System Sciences. Hawaii.
Lerner, A., Simpson, A. K., Kohno, T., and Roesner, F. 2016. "Internet Jones and the Raiders of the Lost
Trackers: An Archaeological Study of Web Tracking from 1996 to 2016" USENIX Security Symposium.
Lewis, P. 2018. "'Utterly Horrifying': Ex-Facebook Insider Says Covert Data Harvesting Was Routine" The
Guardian.
Lusch, R. F., and Vargo, S. L. 2014. Service-Dominant Logic: Premises, Perspectives, Possibilities.
Cambridge University Press.
Malhotra, N. K., Kim, S. S., and Agarwal, J. 2004. "Internet Users' Information Privacy Concerns (IUIPC): The Construct, the Scale, and a Causal Model" Information Systems Research (15:4), pp. 336-355.
Meng, W., Ding, R., Chung, S. P., Han, S., and Lee, W. 2016. "The Price of Free: Privacy Leakage in
Personalized Mobile in-Apps Ads" NDSS.
Nikkhah, H. R., and Sabherwal, R. 2017. "A Privacy-Security Model of Mobile Cloud Computing
Applications" in: Thirty Eighth International Conference on Information Systems.
Peffers, K., Tuunanen, T., Rothenberger, M. A., and Chatterjee, S. 2007. "A Design Science Research
Methodology for Information Systems Research" Journal of Management Inf. Sys. (24:3), pp. 45-77.
Razaghpanah, A., Nithyanand, R., Vallina-Rodriguez, N., Sundaresan, S., Allman, M., Kreibich, C., and Gill,
P. 2018. "Apps, Trackers, Privacy, and Regulators: A Global Study of the Mobile Tracking Ecosystem".
Regulation, General Data Protection Regulation. 2016. "Regulation (EU) 2016/679 of the European
Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the
Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46,"
Official Journal of the European Union (OJ) (59), pp. 1-88.
RevealMobile. 2016. "Using Mobile Location Data and Beacons to Measure Retail Shopping Behavior."
RevealMobile. 2017. "Revealmobile Website." Retrieved 15.08.2017. https://revealmobile.com
Rosenberg, M., Confessore, N., and Cadwalladr, C. 2018. "How Trump Consultants Exploited the Facebook
Data of Millions," in: The New York Times.
Sapiezynski, P., Stopczynski, A., Gatej, R., and Lehmann, S. 2015. "Tracking Human Mobility Using Wifi
Signals" Plos One (10:7).
Smith, H. J., Milburg, S. J., and Burke, S. J. 1996. "Information Privacy: Measuring Individuals' Concerns About Organizational Practices" MIS Quarterly (20:2), pp. 167-196.
Solon, O. 2018. "Facebook Says Cambridge Analytica May Have Gained 37m More Users' Data" in: The
Guardian.
Solove, D. J. 2012. "Introduction: Privacy Self-Management and the Consent Dilemma" Harvard Law Review (126).
Strafach, W. 2017. "Advisory: Accuweather iOS App Sends Location Information to Data Monetization Firm." Retrieved 21.08.2017. https://hackernoon.com/advisory-accuweather-ios-app-sends-location-information-to-data-monetization-firm-83327c6a4870
Tikkinen-Piri, C., Rohunen, A., and Markkula, J. 2018. "EU General Data Protection Regulation: Changes
and Implications for Personal Data Collecting Companies" Computer Law & Security Review (34:1),
pp. 134-153.
United States House of Representatives Committee on Energy and Commerce. 2018. "Testimony of Mark
Zuckerberg, Chairman and Chief Executive Officer, Facebook."
Vallina-Rodriguez, N., Sundaresan, S., Razaghpanah, A., Nithyanand, R., Allman, M., Kreibich, C., and Gill,
P. 2016. "Tracking the Trackers: Towards Understanding the Mobile Advertising and Tracking
Ecosystem" arXiv:1609.07190.
Vargo, S. L., and Lusch, R. F. 2011. "It's All B2b… and Beyond: Toward a Systems Perspective of the Market" Industrial Marketing Management (40:2), pp. 181-187.
Vargo, S. L., and Lusch, R. F. 2016. "Institutions and Axioms: An Extension and Update of Service-
Dominant Logic" Journal of the Academy of Marketing Science (44:1), pp. 5-23.