Thesis (PDF available)

Privacy in social networks – economic options for regulation

Abstract

The first concept of privacy was provided by Samuel D. Warren and Louis Brandeis in 1890 as “the right to be let alone”. At that time, the world was more than a century away from people voluntarily disclosing information and sharing data on a large scale via the Internet on social networks such as Facebook. Today, the business models of the major social networks are built on a thirst for their users’ personal data, which threatens user privacy. Information and power asymmetries hinder users from enforcing their privacy preferences. Furthermore, network effects and switching costs tie them to the market-leading networks. The dissertation at hand analyses the topic of privacy in social networks from an information systems and economic research viewpoint. It illustrates privacy factors in the social network environment and examines the related dynamics of user privacy. On this basis, the thesis analyses whether the status quo of privacy in social networks is economically inefficient or leads to inefficiency, and whether governmental regulation is required. Moreover, existing approaches to solving the privacy challenge in the social network business are assessed, and the most promising concepts are highlighted.
... Information was collected from phone log files, Wi-Fi logs, event logs, Bluetooth logs, and databases containing the browsing history. Snapchat was analyzed in [33] by Infosecurity Group and by Aji et al. [34] on two smartphones running Android and iOS. They acquired the data from the smartphone's internal memory through three extraction techniques: physical, logical, and file system. ...
Article
Full-text available
Smartphone users spend a substantial amount of time browsing, emailing, and messaging through different social networking apps. The use of social networking apps on smartphones has become a dominant part of daily life. This heavy usage has also resulted in a huge spike in cybercrimes such as social harassment, abusive messages, vicious threats, broadcasting of suicidal actions, and live coverage of violent attacks. Many such crimes are carried out through social networking apps; therefore, forensic analysis of the digital devices allegedly involved in a crime and of the social apps installed on them can help resolve criminal investigations. This research performs a forensic investigation of five social networking apps (Instagram, LINE, Whisper, WeChat, and Wickr) on Android smartphones. The central question behind the examination and tests is whether data resides in the internal storage of the device after these social networking apps are used. Data extraction and analysis are carried out using three tools: Magnet AXIOM, XRY, and Autopsy. In these experiments, a considerable amount of essential data was successfully extracted from the examined smartphone; forensic analysts can readily recover such data for the examination of a crime. Finally, the tools are compared on their ability to extract digital evidence from the device, and their performance is evaluated against NIST standards.
Article
Full-text available
A large-scale experiment during the 2010 U.S. Congressional Election demonstrated a positive effect of an online get-out-the-vote message on real-world voting behavior. Here, we report results from a replication of the experiment conducted during the 2012 U.S. Presidential Election. Although get-out-the-vote messages typically yield smaller effects during high-stakes elections due to saturation of mobilization efforts from many sources, a significant increase in voting was again observed. Voting also increased significantly among the close friends of those who received the message to go to the polls, and the total effect on the friends was likely larger than the direct effect, suggesting that understanding social influence effects is potentially even more important than understanding the direct effects of messaging. These results replicate earlier work and add to the growing evidence that online social networks can be instrumental in spreading offline behaviors.
Conference Paper
Providers of leading digital services follow a data-centric business model that enables them to offer their users highly beneficial, personalized services but also threatens the users’ privacy. These threats need to be addressed, not only to protect users’ privacy but also to establish stable and sustainable markets for digital services. Hence, approaches to privacy protection have to cater not only to users’ need for control but also to businesses’ desire for data collection and usage. This paper takes an economic perspective on the privacy threats in digital services and presents a framework for reconciling digital services and privacy. In particular, it conceptualizes the privacy threats in digital services as agency problems and discusses the feasibility of transferring classic approaches for addressing agency problems to the domain of privacy in digital services. The paper analyzes in detail the transferability of the concept of accountability to this domain and presents an accountability-centric framework for reconciliation, along with a requirements analysis for a technological implementation of the framework.
Article
We analyze platform competition where user data is collected to improve ad targeting. Considering that users incur privacy costs, we show that the equilibrium level of data provision is distorted and can be inefficiently high or low: if overall competition is weak or if targeting benefits are low, too much private data is collected, and vice versa. Further, we find that softer competition on either market side leads to more data collection, which implies substitutability between competition policy measures on the two market sides. Moreover, if platforms engage in two-sided pricing, data provision is efficient.
Article
Although understanding preferences for privacy is of great importance to economists, businesses, and politicians, little is known about the factors that shape an individual’s willingness to share personal data. This article presents four experimental studies with a total of 470 participants that help characterize individual preferences for sharing personal data while varying the characteristics of the potential recipients. We find that participants’ willingness to share personal data with anonymous recipients decreases with the number of recipients. However, social distance to the recipients and the amount of personal data a single recipient receives do not decrease the willingness to share personal data. Further, we provide a methodological insight by showing that verification of personal data is essential when eliciting privacy preferences.
Article
Several authors have compared the impact of Big Data to the environmental impact of industrialisation (Kuneva, 2009; Schneier, 2013; Hirsch, 2006). This analogy seems useful: like exhaust emissions and the use of chemical compounds, the omnipresent generation and subsequent use of personal data can affect individuals as well as society as a whole. Similarly to the effects of industrialisation, adverse side-effects of datafication on individuals, societies, and ecosystems may manifest themselves only years later and in seemingly unrelated contexts. Industrialised societies have enacted laws in response to the adverse environmental effects of industrialisation, a process in which theories from modern sociology and public awareness have played a significant role. Similarly, the European Union has enacted the General Data Protection Regulation to address the risks of processing personal data. But laws tend to come only after the negative impact has become sufficiently apparent, which has spurred the introduction of precaution- and discourse-based management of unknown risks. The GDPR appears to implement these risk management mechanisms less extensively than current environmental protection laws do. As with the effects of industrialisation, the side-effects of datafication cannot be entirely known in advance. Privacy is currently a prime concern, but datafication can also affect entire societies if it results in a “digital panopticon”. Considering the influence of Risk Society Theory and Normal Accident Theory, two important social theories concerning industrial hazards, this article proposes a number of areas in which future iterations of data protection law can be developed.
Chapter
The European data protection reform has resulted in a new regulation that will be effective from May 2018. This so-called General Data Protection Regulation contains specific provisions on data protection by design and on data protection by default. After briefly discussing related approaches such as “privacy by design”, we elaborate on how these provisions can be interpreted and sketch their potential impact on data processing in Europe and possibly beyond.
Conference Paper
Social Network Services (SNS) business models depend heavily on the gathering and analysis of user data to gain an advantage in the competition for advertising clients. Nevertheless, extensive collection and analysis of this data poses a threat to users’ privacy. From an economic perspective, it seems rational for Social Network Operators (SNOs) to ignore users’ desire for privacy. However, privacy-friendly services might have the potential to earn users’ trust, leading to an increased revelation of personal data. Addressing these issues, we examine the privacy problem in SNS in the context of competition between SNOs to investigate whether competition tends to enhance user privacy or whether it is the root of its violation. The paper therefore investigates the interconnectedness of market structure and privacy problems in SNS. After analysing the user and advertiser sides of SNS, the competitiveness of the market and its influence on user privacy are examined.