Thesis

Data Protection with Ethereum Blockchain

Abstract

Blockchain technology has been one of the most promising technologies of the past decade, with Ethereum and Bitcoin being the two most popular blockchains today; neither provides data protection and privacy by default. The former allows Decentralized Applications (DApps) to be built with minimal risk of downtime or censorship, and is the main focus of this dissertation. In 2016 the European Union approved the General Data Protection Regulation (GDPR), with severe penalties enforced since May 25th, 2018. It is considered a massive step toward protecting user data: it affects not only companies with offices in the EU but also organizations throughout the world that serve users from EU territories, and it stipulates key obligations for organizations handling user data while introducing new rights for individuals, such as the right to erasure. That right poses a major challenge for GDPR compliance in DApps, as blockchains such as Ethereum are immutable by design. This dissertation attempts to comply with the GDPR, and its conflicting right to erasure, by developing an Ethereum proof-of-concept DApp, DFiles, which also aims to provide some form of data privacy and protection: it lets users upload encrypted files and later download and decrypt them. It was developed with an Agile, iterative methodology using mainly decentralized technologies, such as the InterPlanetary File System (IPFS) and Ethereum smart contracts, plus a centralized component for user authentication, while adhering to Blockchain Software Engineering practices. Due to the GDPR's complexity, only selected parts are addressed, namely the rights to erasure, data portability, access and rectification.
DFiles' GDPR compliance was then evaluated through a statistical analysis of encrypted and unencrypted files uploaded to the DApp, measuring elapsed upload times and Ethereum transaction costs for files in four size categories: small (1KB-1MB), medium (1MB-20MB), large (20MB-200MB) and extra-large (200MB-2GB). Due to hardware limitations, however, the statistical analysis could only be performed for files up to 14.2MB. It concluded that transaction costs for unencrypted files are slightly, but not significantly, higher, whereas elapsed upload times for encrypted files are significantly higher overall. Data for files larger than 14.2MB was still recorded, showing that the last two elapsed upload times for unencrypted files up to 800MB are lower than the last two for encrypted files up to 14.2MB. In conclusion, encrypting files to comply with the GDPR's right to erasure is a valuable option only for small to medium files up to 14.2MB; beyond that, even without considering hardware encryption limitations, upload times tend to grow exponentially. Ethereum and the IPFS must advance to allow better privacy techniques. There have recently been major improvements to Ethereum and its smart contracts, and the world of DApp development changes at a fast rate; in the future, Ethereum may evolve to a newer version with enhanced privacy controls that allow complete GDPR compliance.
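The approach described above, encrypting files before upload so that erasure can be achieved by destroying the decryption key even though the ciphertext on an immutable medium cannot be removed, is often called crypto-shredding. A minimal stdlib-only sketch of the idea follows; the class and method names are invented for illustration (not DFiles' actual code), and the toy SHA-256 XOR keystream stands in for a real cipher such as AES-GCM.

```python
import hashlib
import secrets

def _keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. A real DApp would use AES-GCM.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

class CryptoShredStore:
    """Ciphertexts live on immutable, content-addressed storage (stand-in
    for IPFS); keys live in a mutable off-chain store and can be erased."""

    def __init__(self):
        self.immutable_storage = {}  # append-only, like IPFS
        self.key_store = {}          # mutable, off-chain

    def upload(self, file_id: str, data: bytes) -> str:
        key = secrets.token_bytes(32)
        ciphertext = encrypt(key, data)
        cid = hashlib.sha256(ciphertext).hexdigest()  # content identifier, like an IPFS CID
        self.immutable_storage[cid] = ciphertext
        self.key_store[file_id] = (cid, key)
        return cid

    def download(self, file_id: str) -> bytes:
        cid, key = self.key_store[file_id]
        return decrypt(key, self.immutable_storage[cid])

    def erase(self, file_id: str) -> None:
        # Right to erasure: destroy the key. The ciphertext remains on the
        # immutable medium but is rendered permanently unreadable.
        del self.key_store[file_id]
```

The design choice mirrors the dissertation's architecture: the immutable layer never has to forget anything, because only the off-chain key store determines whether a file is still readable.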
... Ethereum is one of the blockchain platforms with a decentralized nature and a smart contract capability with a high success rate [24]. A smart contract is like an electronic treaty: it is written in a programming language to run a set of agreed rules, stored in a decentralized manner, that write or retrieve data on the blockchain. ...
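The "agreed rules stored in a decentralized manner" idea can be sketched conceptually in plain Python (rather than Solidity); the `FileRegistry` class and its rule are invented for illustration, loosely mirroring a Solidity contract whose functions enforce conditions before mutating on-chain storage.

```python
class FileRegistry:
    """Toy analogue of a smart contract: callable rules over shared state.
    Writes append to a transaction log (the 'blockchain'); reads query the
    derived current state (like contract storage)."""

    def __init__(self):
        self.log = []       # append-only record of accepted transactions
        self.records = {}   # current state derived from the log

    def store_hash(self, owner: str, file_hash: str) -> None:
        # Agreed rule: a given owner may record a given hash only once.
        if (owner, file_hash) in self.records:
            raise ValueError("already recorded")
        self.log.append(("store", owner, file_hash))
        self.records[(owner, file_hash)] = True

    def is_recorded(self, owner: str, file_hash: str) -> bool:
        return (owner, file_hash) in self.records
```

On a real chain, every node replays the same log through the same rules, so all honest nodes converge on the same `records` state without trusting each other.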
Article
Data is the basis of any analysis or hypothesis about participant vote data in the Indonesian General Election. That data needs to be processed and secured so that its integrity is preserved, and participants' voting rights must be protected so that information is displayed correctly. The research problems are to create a system that protects the integrity of election data and a system that protects the voting rights of each voter. To address these problems, this research discusses a blockchain-based electronic voting information system that secures data integrity and protects participants' voting rights in a General Election. The system uses Ethereum as the blockchain, with Solidity as the programming language for building the smart contract, and is built on the Microsoft Windows platform. A consortium blockchain and biometric fingerprint authentication are used as the problem-solving method, and the waterfall model is used as the system development method. The result of this research is a proposed design for the e-voting system: a blockchain-based system that secures the integrity of the data in an election process and protects each voting right. Index Terms—Authentication; Blockchain; Data; Integrity; E-Voting; System
Chapter
With the advent of technology, the world saw the rise of blockchain technology because of its accessibility and its efficiency in managing transactions and the related records. According to IBM, blockchain is excellent for delivering information because it provides immediate, shareable and entirely transparent data kept on an immutable ledger that can only be viewed by permissioned network users. The most important aspect of blockchain is its distributed ledger technology. Byzantine Fault Tolerance (BFT), largely associated with distributed systems, is a property that allows a decentralised, trustless network to function even when some nodes are broken or malevolent. This paper explains the use of proof-of-work algorithms and BFT to handle Byzantine faults in blockchain technology, their effects, and the solutions to this problem.
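Proof-of-work, the mechanism the chapter discusses for tolerating Byzantine behaviour, makes rewriting history computationally expensive: a block is valid only if a nonce is found whose hash meets a difficulty target, which is hard to produce but trivial for any node to verify. A minimal sketch, with the hex-leading-zeros target as a simplifying assumption in place of Ethereum's real difficulty encoding:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Search for a nonce whose SHA-256 hash of (block_data + nonce)
    starts with `difficulty` leading zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int) -> bool:
    """Cheap check any node can run: one hash instead of thousands."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The asymmetry is the point: mining takes on the order of 16^difficulty attempts, while verification is a single hash, so a malicious minority cannot cheaply forge an alternative chain that honest nodes would accept.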
Conference Paper
Smart Contracts have gained tremendous popularity in the past few years, to the point that billions of US Dollars are currently exchanged every day through such technology. However, since the release of the Frontier network of Ethereum in 2015, there have been many cases in which the execution of Smart Contracts managing Ether coins has led to problems or conflicts. Compared to traditional Software Engineering, a discipline of Smart Contract and Blockchain programming, with standardized best practices that can help solve the mentioned problems and conflicts, is not yet sufficiently developed. Furthermore, Smart Contracts rely on a non-standard software life-cycle, according to which, for instance, delivered applications can hardly be updated or bugs resolved by releasing a new version of the software. In this paper we advocate the need for a discipline of Blockchain Software Engineering, addressing the issues posed by smart contract programming and other applications running on blockchains. We analyse a case study in which a bug discovered in a Smart Contract library, and perhaps "unsafe" programming, allowed an attack on Parity, a wallet application, causing the freezing of about 500K Ethers (about 150M USD, in November 2017). In this study we analyse the source code of Parity and the library, and discuss how recognised best practices could mitigate, if adopted and adapted, such detrimental software misbehaviour. We also reflect on the specificity of Smart Contract software development, which makes some of the existing approaches insufficient, and call for the definition of a specific Blockchain Software Engineering.
Article
EXECUTIVE SUMMARY The term 'customer value' is used within the marketing literature to portray both what is derived by the customer from the supplier and what is derived by the supplier from the customer. The latter property is now referred to as 'customer lifetime value' (CLV), but there is no agreement on a distinct name for the former. The author has therefore chosen the term 'Value for the Customer' (VC) to represent all similarly associated, demand-side notions of value. Recent investigations imply that VC is of increasing interest to marketers, both practicing and academic. However, although the notion of VC is not new, the marketing literature offers little evidence of any conceptual consensus. It remains an area of continuing ambiguity, subject to both empirical and speculative enquiry, but with no clear theoretical anchor. This paper attempts to provide such an anchor, primarily by exploring the current diversity of thought and then seeking to rationalize, clarify and classify extant ideas to create a coherent VC domain. Conceptual Framework and Research Method Bagozzi (1984) suggests that conceptual meaning within theory construction can be allocated in three ways: via attributional definition, via structural definition, and/or through dispositional definition. This paper adopts this same perspective and works to build theory accordingly. The research is entirely archival and uses two points of reference. The first, consisting entirely of non-marketing sources, was used to explore the semantic and ontological variety inherent within the broader value domain. The second, consisting of ninety (90) recent texts, primarily from the marketing literature but also from the fields of strategy and quality management, provided the means of analysing business-oriented perspectives.
Chapter
The Analytic Hierarchy Process (AHP) is a problem-solving framework. It is a systematic procedure for representing the elements of any problem. It organizes the basic rationality by breaking down a problem into its smaller constituent parts and then calls for only simple pairwise comparison judgments to develop priorities in each hierarchy.
Article
This paper proposes a novel framework for modeling the Value for the Customer, the so-called Conceptual Model for Decomposing Value for the Customer (CMDVC). This conceptual model is first validated through an exploratory case study in which the authors validate both the proposed constructs of the model and their relations. In a second step the authors propose a mathematical formulation for the CMDVC as well as a computational method. This enabled the final quantitative discussion of how the CMDVC can be applied and used in the enterprise environment, and the final validation by the people in the enterprise. Throughout this research we were able to confirm that the results of this novel quantitative approach to modeling the Value for the Customer are consistent with the company's empirical experience. The paper further discusses the merits and limitations of this approach, proposing that the model is likely to bring value in supporting not only contract preparation in an Ex-Ante Negotiation Phase, as demonstrated, but also the actual negotiation process, as finally confirmed by an enterprise testimonial.
Article
Decisions involve many intangibles that need to be traded off. To do that, they have to be measured alongside tangibles, whose measurements must also be evaluated as to how well they serve the objectives of the decision maker. The Analytic Hierarchy Process (AHP) is a theory of measurement through pairwise comparisons that relies on the judgements of experts to derive priority scales. It is these scales that measure intangibles in relative terms. The comparisons are made using a scale of absolute judgements that represents how much more one element dominates another with respect to a given attribute. The judgements may be inconsistent; how to measure inconsistency and, where possible, improve the judgements to obtain better consistency is a concern of the AHP. The derived priority scales are synthesised by multiplying them by the priority of their parent nodes and adding over all such nodes. An illustration is included.
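The derivation of priority scales from pairwise comparisons, and Saaty's consistency check, can be sketched as follows. This is an illustrative implementation using the common geometric-mean (row) approximation to the principal eigenvector, not code from the cited article; the function names are invented, and the random-index table is truncated to small matrix sizes.

```python
import math

def ahp_priorities(matrix):
    """Derive normalized priority weights from an n x n pairwise-comparison
    matrix using the geometric mean of each row (an approximation to the
    principal right eigenvector)."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

def consistency_ratio(matrix):
    """Saaty's CR: estimate lambda_max from A.w, compute the consistency
    index CI = (lambda_max - n) / (n - 1), and divide by the random index
    RI for matrices of size n (CR < 0.1 is conventionally acceptable)."""
    n = len(matrix)
    w = ahp_priorities(matrix)
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lambda_max = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lambda_max - n) / (n - 1)
    ri = {2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random indices
    return ci / ri if ri else 0.0
```

For a perfectly consistent matrix such as [[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1]], the weights come out proportional to 4:2:1 and the consistency ratio is zero.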
Article
Eight companies that were Process Effectiveness Network members of the Industrial Research Institute attempted to collectively determine the best practices of the Fuzzy Front End (FFE) of innovation. Comparing one company's processes to those of another proved insurmountable because there was neither a common language nor a clear and consistent definition of the key elements of the front end. As a result, the group developed a theoretical construct, defined as the New Concept Development (NCD) model, to provide a common language and insights into front end activities. The model consists of three key parts: five front end elements, the engine that powers the elements, and external influencing factors. Proficiency in the FFE was evaluated at 19 companies using the NCD model. Highly innovative companies were found to be more proficient in the FFE and in several elements of the NCD model.