Conference Paper

The Crisis of Standardizing DRM: The Case of W3C Encrypted Media Extensions

Author:
Harry Halpin

Abstract

The process of standardizing DRM via the W3C Encrypted Media Extensions (EME) Recommendation has caused a crisis for the W3C and potentially for other open standards organizations. While open standards bodies are, by definition, considered open to input from the wider security research community, EME ended up positioning civil society and security researchers who asked for greater protections actively against the W3C. This analysis covers the procedural issues in open standards at the W3C that allowed EME both to be standardized and to draw vigorous opposition from civil society. The claims of both sides are tested via technical analysis and quantitative analysis of participation in the Working Group. We include recommendations for future standards that touch upon some of the same issues as EME.


... Further increasing the likelihood of attacks, implementers are advised that Linked Data Proofs "are detached from the actual payload." The combination of these problems, where a variable number of signatures can be arbitrarily detached from and re-attached to messages, combined with the (possibly insecure) retrieval of unknown external documents, leaves RDF in general, and likely Verifiable Credential implementations that use RDF, vulnerable to the same kind of attacks that rendered XML digital signatures insecure in practice [33]. ...
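As a hedged illustration of this attack class (a minimal sketch, not code from the cited analysis), the following Python fragment shows why detachment is dangerous: a detached proof authenticates only the bytes it was computed over, so any surrounding context it is re-attached to goes unverified. The envelope layout, field names, and use of an HMAC in place of a real Linked Data signature are all assumptions for illustration.

```python
import hmac, hashlib

# Hypothetical issuer key; a real Linked Data Proof would use public-key
# signatures, but an HMAC suffices to show the structural problem.
SECRET = b"issuer-signing-key"

def canonicalize(claims: dict) -> bytes:
    # Stand-in for RDF canonicalization: any deterministic serialization.
    return "|".join(f"{k}={claims[k]}" for k in sorted(claims)).encode()

def sign_detached(claims: dict) -> bytes:
    # The proof covers only the canonicalized claims and travels separately.
    return hmac.new(SECRET, canonicalize(claims), hashlib.sha256).digest()

def verify_envelope(envelope: dict) -> bool:
    # The flaw this sketch illustrates: only envelope["claims"] is covered
    # by the detached proof; every other field is unauthenticated.
    expected = hmac.new(SECRET, canonicalize(envelope["claims"]),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, envelope["proof"])

claims = {"subject": "alice", "status": "negative"}
envelope = {"claims": claims, "proof": sign_detached(claims), "holder": "alice"}
assert verify_envelope(envelope)   # legitimate pairing verifies

envelope["holder"] = "mallory"     # proof re-attached to altered context...
assert verify_envelope(envelope)   # ...still verifies: the swap goes unnoticed
```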
... The problem is not just one of broken standards and an off-the-rails standardization process at the W3C. The conflict over DRM at the W3C had already demonstrated that the W3C standards process was prone to corporate capture and capable of being abused [16]. The underlying problem is that the cultish desire for a "self-sovereign" global identity system runs counter to privacy. ...
Preprint
Full-text available
Due to the widespread COVID-19 pandemic, there has been a push for ‘immunity passports’ and even technical proposals. Although the debate about the medical and ethical problems of immunity passports has been widespread, there has been less inspection of the technical foundations of immunity passport schemes. These schemes are envisaged to be used for sharing COVID-19 test and vaccination results in general. The most prominent immunity passport schemes have involved a stack of little-known standards, such as Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) from the World Wide Web Consortium (W3C). Our analysis shows that this group of technical identity standards is based on under-specified and often non-standardized documents that have substantial security and privacy issues, due in part to the questionable use of blockchain technology. One concrete proposal for immunity passports is even susceptible to dictionary attacks. The use of ‘cryptography theater’ in efforts like immunity passports, where cryptography is used to allay the privacy concerns of users, should be discouraged in standardization. Deployment of these W3C standards for ‘self-sovereign identity’ in use-cases like immunity passports could just as well lead to a dangerous form of identity totalitarianism.
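To make the dictionary-attack risk concrete (a hedged sketch of the general attack class, not the specific proposal analyzed in the paper; the names, dates, and hash construction below are hypothetical), consider a scheme that publishes a bare hash of low-entropy personal data:

```python
import hashlib
from itertools import product

# Hypothetical design: a passport scheme publishes H(name || birthdate)
# so verifiers can match credentials without storing personal data.
# Names and birthdates are low-entropy, so the hash falls to an offline
# dictionary attack.
def commit(name: str, birthdate: str) -> str:
    return hashlib.sha256(f"{name}:{birthdate}".encode()).hexdigest()

published = commit("alice example", "1987-06-15")   # value seen on a ledger

candidate_names = ["alice example", "bob sample", "carol demo"]
candidate_dates = [f"{y}-{m:02d}-{d:02d}"
                   for y in range(1940, 2010)
                   for m in range(1, 13)
                   for d in range(1, 29)]

# A few hundred thousand hashes: fractions of a second on a laptop.
for name, date in product(candidate_names, candidate_dates):
    if commit(name, date) == published:
        print("de-anonymized:", name, date)
        break
```

A salted, slow hash raises the attacker's cost but cannot fix the underlying problem: the input space of names and birthdates is small enough to enumerate.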
... Each of these stages is passed by consensus, with the understanding that it is Tim Berners-Lee, or a person he has expressly mandated for this purpose, who decrees that consensus has been reached on a given question and that it is time to move to the next stage, or who decrees on the contrary that consensus has not been found and that it is appropriate to return to a previous stage, or simply to abandon the project. The W3C process confers considerable power on Tim Berners-Lee, who can, in the last instance, take decisions contrary to the opinion expressed by the majority of the members of a Working Group, or rule while formal objections raised by one member or another remain unresolved (Halpin, 2017; Sire, 2017). For this reason, some observers do not hesitate to describe the founder of the Web as a "dictator for life" (Malcom, 2008). ...
... For this reason, some observers do not hesitate to describe the founder of the Web as a "dictator for life" (Malcom, 2008). Nevertheless, the W3C is an arena whose vocation is to be open and transparent, to guarantee the technical effectiveness of standards, to ensure the interoperability of infrastructures (Virili, 2003), and to prevent, through a distinctive patent policy, companies from taking part in the discussion with the sole aim of later making themselves indispensable by exploiting the patents they hold (Russel, 2003; Halpin, 2017). ...
Article
Abstract: We explain how the Semantic Web works and, in particular, how content producers can "discretize" information so that software such as search engines can then correlate it at the scale of the Web by generating ontologies. We compare the three syntaxes they can use: the Resource Description Framework, microdata, and microformats, as well as the arenas and standardization processes specific to each of these syntaxes. Drawing for our analysis on the operational theory of digital writing developed at the Université Technologique de Compiègne, we show how different "politics of meaning" confront one another on the Web, carried by actors whose objectives and interests differ. We analyze not merely the mechanism for describing and processing information, but also the back-and-forth between the individual who describes and edits information and the machine that processes and editorializes data.
... licenses are supplied through contracts with DRM companies [5]. This is a structure that makes it difficult for new DRM companies to enter the market, and there is the problem that the revenue-sharing method is not transparently disclosed [7]. This ...
... Today many entities, including Amazon, Apple, MIT, and Stanford University, have joined the W3C in the development of Web standards (https://www.w3.org), which are publicly available free of charge. However, this kind of standardization process also casts some doubt, as some of the largest Silicon Valley corporations, including Google and Microsoft, have joined the consortium and recommended debatable standards, such as EME (Halpin, 2017). ...
... Tools such as ProVerif are specialized in security and privacy properties, and so can be deployed at the very beginning of the design phase of new protocols, before or at the beginning of standardization. The widespread use of formal verification would not only increase security and privacy, but would allow the process of standardization to be less dependent on human reputation or the influence of powerful implementers, as has been claimed to be the case with controversial standards around DRM like the W3C Encrypted Media Extensions standard [19]. The process of standardization should be moved from being a social process to a scientific process, where the security and privacy of standard protocols can be proven via automated formal verification, with proofs subject to public review. ...
Conference Paper
The science of security can be set on firm foundations via the formal verification of protocols. New protocols can have their design validated in a mechanized manner for security flaws, allowing protocol designs to be scientifically compared in a neutral manner. Given that these techniques have discovered critical flaws in protocols such as TLS 1.2 and are now being used to re-design protocols such as TLS 1.3, we demonstrate how formal verification can be used to analyze new protocols such as the W3C Web Authentication API. We model W3C Web Authentication with the formal verification language ProVerif, showing that the protocol itself is secure. However, we also stretch the boundaries of formal verification by trying to verify the privacy properties of W3C Web Authentication given in terms of the same origin policy. We use ProVerif to show that without further mandatory requirements in the specification, the claimed privacy properties do not hold. Next steps on how formal verification can be further integrated into standards and the further development of the privacy properties of W3C Web Authentication is outlined.
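As a toy illustration of the origin-binding property at stake (a minimal sketch under assumptions, not the paper's ProVerif model; an HMAC stands in for WebAuthn's public-key signatures and all names are illustrative), the point is that the authenticator signs the requesting origin together with the challenge, so an assertion bound to one origin fails verification under any other:

```python
import os, hmac, hashlib

class Authenticator:
    """Toy authenticator holding one fresh credential per origin."""
    def __init__(self):
        self.keys = {}                        # origin -> credential key

    def register(self, origin: str) -> bytes:
        self.keys[origin] = os.urandom(32)    # fresh key per origin
        return self.keys[origin]              # shared with that origin's server

    def get_assertion(self, origin: str, challenge: bytes) -> bytes:
        # The signature binds both the server's challenge and the origin.
        return hmac.new(self.keys[origin], origin.encode() + challenge,
                        hashlib.sha256).digest()

def server_verify(key: bytes, origin: str, challenge: bytes,
                  assertion: bytes) -> bool:
    expected = hmac.new(key, origin.encode() + challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, assertion)

auth = Authenticator()
key = auth.register("https://example.com")
challenge = os.urandom(16)
assertion = auth.get_assertion("https://example.com", challenge)

assert server_verify(key, "https://example.com", challenge, assertion)
# An assertion bound to one origin is useless under any other origin,
# which is what defeats credential replay by phishing sites.
assert not server_verify(key, "https://evil.example", challenge, assertion)
```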
Chapter
This book presents a first comprehensive effort to explore the mechanics and fundamentals of global ICT standardization. It offers a comprehensive study of the legal rules governing ICT standardization; systematically analyses the governance and institutional features of some of the most prominent Standards Development Organizations; and presents qualitative empirical evidence on the implementation of these rules in practice. By evaluating legal and procedural rules in light of current practices and tendencies in the industry, the book explores the various options available for disciplining ICT standardization from the viewpoint of the applicable legislation, the judiciary, and the internal governance rules of Standards Development Organizations, and offers practical solutions on how to increase the legitimacy of ICT standards. Adding to previous theoretical approaches to the field of standardization from historical, legal, and political science perspectives, this book applies theoretical considerations to unexplored scenarios, offering a holistic picture of ICT standardization and providing a novel contribution to the field.
Chapter
Although blockchain is decentralized by design, today this technology is in practice in the hands of a few corporations; antitrust can be crucial in restoring blockchain's decentralization. Having provided a toolkit for navigating the blockchain ecosystem of Distributed Ledger Technologies (DLT), Section 3 investigates the relation between competition and blockchain. Is blockchain the real game-changer for today's centralized digital markets? Section 3 explores the exploitation of DLT as a tool to enforce antitrust principles more efficiently in today's tech-digital economy. In particular, it analyzes whether blockchain and DLT in general can be considered revolutionary progress also in the context of antitrust. Section 4 raises the question of whether blockchain raises antitrust concerns and how to deal with them. Technologies are not the driver of anticompetitive conduct, but forms of government oversight, such as antitrust, are fundamental for keeping people's trust in new advanced technologies.
Chapter
Due to the widespread COVID-19 pandemic, there has been a push for ‘immunity passports’ and even technical proposals. Although the debate about the medical and ethical problems of immunity passports has been widespread, there has been less inspection of the technical foundations of immunity passport schemes. These schemes are envisaged to be used for sharing COVID-19 test and vaccination results in general. The most prominent immunity passport schemes have involved a stack of little-known standards, such as Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) from the World Wide Web Consortium (W3C). Our analysis shows that this group of technical identity standards is based on under-specified and often non-standardized documents that have substantial security and privacy issues, due in part to the questionable use of blockchain technology. One concrete proposal for immunity passports is even susceptible to dictionary attacks. The use of ‘cryptography theater’ in efforts like immunity passports, where cryptography is used to allay the privacy concerns of users, should be discouraged in standardization. Deployment of these W3C standards for ‘self-sovereign identity’ in use-cases like immunity passports could just as well lead to a dangerous form of identity totalitarianism.
Article
Objective – The study aims to explore the dispute between accessibility and Digital Rights Management (DRM) for disabled people in accessing information on the Web. More specifically, this paper explores the challenges DRM has placed on them. Methodology/Technique – This study inspects the controversial interaction between accessibility and DRM in relation to disabled people's access to information on the Web, using document analysis from a socio-legal perspective. Further, the black-letter law of some widely known and internationally cited regulations is used in the investigation and discussion. Findings & Novelty – It is argued that the regulations which are beneficial to the accessibility of disabled people are overruled by DRM. More specifically, the challenges posed by DRM include: (1) negligence of disabled people's rights, (2) conflict with accessibility, and (3) ignorance of copyright-related exceptions. This is a cross-disciplinary study, probing the issue of disabled people in both legal studies, through relevant legislation, and information studies, through the topic of information access on the Web. It examines and analyses major regulations issued by leading organizations across the two disciplines. The findings of this study may be beneficial to knowledge and practice in bridging the gap in human rights to information access, particularly for disabled people, and it is argued that both information and legal professionals should be responsible for this. Type of Paper: Review
Chapter
For over a decade, standards bodies like the IETF and W3C have attempted to prevent the centralization of the Web via the use of open standards for ‘permission-less innovation.’ Yet today, these standards, from OAuth to RSS, seem to have failed to prevent the massive centralization of the Web at the hands of a few major corporations like Google and Facebook. We’ll delve deep into the lessons of failed attempts to replace DNS like XRIs, identity systems like OpenID, and metadata formats like the Semantic Web, all of which were recuperated by centralized platforms like Facebook as Facebook Connect and the “Like” Button. Learning from the past, a new generation of blockchain standards and governance mechanisms may be our last, best chance to save the Web.
Article
Full-text available
The World Wide Web Consortium (W3C), as a standard-setting organization for the World Wide Web, plays a very important role in shaping the Web. We focus on the ongoing controversy related to Encrypted Media Extensions (EME) and find that there was a serious lack of participation from people from non-Western countries. We also find a serious lack of gender diversity in the EME debate.
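As a minimal sketch of how such a participation analysis might be tallied (the records below are made up for illustration; a real study would need a defensible way to attribute country and gender), one can count contributions per group and report shares:

```python
from collections import Counter

# Hypothetical records standing in for scraped Working Group posts:
# (participant_id, country, inferred_gender).
posts = [
    ("p1", "US", "m"), ("p2", "US", "m"), ("p3", "UK", "m"),
    ("p4", "US", "f"), ("p5", "DE", "m"), ("p1", "US", "m"),
]

total = len(posts)
for label, index in (("country", 1), ("gender", 2)):
    counts = Counter(post[index] for post in posts)
    shares = {key: f"{count / total:.0%}" for key, count in counts.items()}
    print(label, shares)
# country {'US': '67%', 'UK': '17%', 'DE': '17%'}
# gender {'m': '83%', 'f': '17%'}
```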
Conference Paper
Full-text available
We give cryptographic schemes that help trace the source of leaks when sensitive or proprietary data is made available to a large set of parties. This is particularly important for broadcast and database access systems, where the data should be accessible only to authorized users. Such schemes are very relevant in the context of pay television, and easily combine with and complement the Broadcast Encryption schemes of A. Fiat and M. Naor [CRYPTO ’93, Lect. Notes Comput. Sci. 773, 480-491 (1994; Zbl 0870.94026)].
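As a hedged toy version of the idea (a sketch only; real traitor-tracing schemes such as the cited one are far more efficient than one ciphertext per subscriber), each subscriber holds a distinct personal key, so any key recovered from a pirate decoder identifies its source:

```python
import os, hashlib

def kdf(key: bytes, label: bytes) -> bytes:
    # Derive a per-purpose mask from a personal key (illustrative only).
    return hashlib.sha256(key + label).digest()

# Each subscriber receives a unique personal key at enrollment.
subscribers = {name: os.urandom(16) for name in ("u1", "u2", "u3")}
session_key = os.urandom(16)

# Broadcast header: the session key masked once per subscriber, so every
# authorized user can unwrap it with their own personal key.
header = {name: bytes(a ^ b
                      for a, b in zip(session_key, kdf(key, b"mask")[:16]))
          for name, key in subscribers.items()}

# Any subscriber can unwrap the session key from their header slot.
unwrapped = bytes(a ^ b for a, b in zip(header["u1"],
                                        kdf(subscribers["u1"], b"mask")[:16]))
assert unwrapped == session_key

def trace(leaked_key: bytes) -> str:
    # A personal key extracted from a confiscated pirate decoder points
    # directly at the subscriber who leaked it.
    for name, key in subscribers.items():
        if key == leaked_key:
            return name
    return "unknown"

print(trace(subscribers["u2"]))   # -> u2
```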
Conference Paper
Trusted computing relies on formally verified trusted computing platforms to achieve high security assurance. In practice, however, new platforms are often proposed without a comprehensive formal evaluation and explicitly defined underlying assumptions. In this work, we propose TRUSTFOUND, a formal foundation and framework for model checking trusted computing platforms. TRUSTFOUND includes a logic for formally modeling platforms, a model of trusted computing techniques, and a broad spectrum of threat models. It can be used to check platforms on security properties (e.g., confidentiality and attestability) and uncover the implicit assumptions that must be satisfied to guarantee the security properties. In our experiments, TRUSTFOUND is used to encode and model check two trusted platforms. It has identified a total of six implicit assumptions and two severe, previously unknown logic flaws in them.
Article
In the fall of 2005, problems discovered in two Sony-BMG compact disc copy protection systems, XCP and MediaMax, triggered a public uproar that ultimately led to class-action litigation and the recall of millions of discs. We present an in-depth analysis of these technologies, including their design, implementation, and deployment. The systems are surprisingly complex and suffer from a diverse array of flaws that weaken their content protection and expose users to serious security and privacy risks. Their complexity, and their failure, makes them an interesting case study of digital rights management that carries valuable lessons for content companies, DRM vendors, policymakers, end users, and the security community.

This paper is a case study of the design, implementation, and deployment of anti-copying technologies. We present a detailed technical analysis of the security and privacy implications of two systems, XCP and MediaMax, which were developed by separate companies (First4Internet and SunnComm, respectively) and shipped on millions of music compact discs by Sony-BMG, the world's second largest record company. We consider the design choices the companies faced, examine the choices they made, and weigh the consequences of those choices. The lessons that emerge are valuable not only for compact disc copy protection, but for copy protection systems in general.

Before describing the technology in detail, we will first recap the public events that brought the issue to prominence in the fall of 2005. This is necessarily a brief account that leaves out many details, some of which will appear later in the paper as we discuss each part of the technology. For a fuller account, see (10). The security and privacy implications of Sony-BMG's CD digital rights management (DRM) technologies first reached the public eye on October 31, 2005, in a blog post by Mark Russinovich (29). While testing a rootkit detector he had co-written, Russinovich was surprised to find an apparent rootkit (software designed to hide an intruder's presence (16)) on one of his systems. Investigating, he found that the rootkit was part of a CD DRM system called XCP that had been installed when he inserted a Sony-BMG music CD into his computer's CD drive. News of Russinovich's discovery circulated rapidly on the Internet, and further revelations soon followed, from us, from Russinovich, and from others. It was discovered that the XCP rootkit makes users'
Article
In response to piracy and online file trading, the music industry has begun to adopt technological measures, often referred to as digital rights management (DRM), to control the sale and distribution of music over the Internet. Previous economic analysis on the impact of DRM implementation has been overly simplistic. A careful analysis of copyright law and the microeconomic principles governing the music industry demonstrates that commentators have failed to account for factors relevant to the measure of social welfare within the music industry. This paper develops a more refined economic model that is better suited to accurately assessing how legal or technological changes like DRM will affect the music industry. Utilizing a refined economic model, the analysis suggests that the economic effects of implementing DRM technology are generally negative, albeit uncertain. While DRM implementation may inhibit piracy, facilitate price discrimination, and lower transactional costs, it will likely decrease social welfare by raising barriers to entry and exacerbating a number of existing market failures. Specifically, DRM implementation may facilitate the extension of monopoly pricing, decrease the amount of information available to potential music consumers, diminish the number of positive externalities, and raise artistic and informational barriers to entry into certain genres of music.
Article
Timing attacks are usually used to attack weak computing devices such as smartcards. We show that timing attacks apply to general software systems. Specifically, we devise a timing attack against OpenSSL. Our experiments show that we can extract private keys from an OpenSSL-based web server running on a machine in the local network. Our results demonstrate that timing attacks against network servers are practical and therefore security systems should defend against them.
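To illustrate the attack class in miniature (a hedged sketch, not the paper's RSA attack: OpenSSL leaked through data-dependent arithmetic, while this toy leaks through an early-exit comparison, with the per-byte delay exaggerated so the signal is measurable), consider:

```python
import time, hmac

SECRET = b"hunter2-token"   # hypothetical secret the server compares against

def insecure_equal(a: bytes, b: bytes) -> bool:
    """Byte-by-byte comparison that exits at the first mismatch, so its
    running time reveals how many leading bytes of a guess are correct."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
        time.sleep(0.0005)  # exaggerated per-byte cost to make the leak visible
    return True

def time_guess(guess: bytes) -> float:
    start = time.perf_counter()
    insecure_equal(SECRET, guess)
    return time.perf_counter() - start

# Recover the first secret byte: the guess sharing the longest prefix with
# the secret takes the longest to reject.
timings = {b: time_guess(bytes([b]) + b"\x00" * (len(SECRET) - 1))
           for b in range(256)}
print(chr(max(timings, key=timings.get)))   # prints 'h' with high probability

# The standard defense is a constant-time comparison:
assert hmac.compare_digest(SECRET, SECRET)
```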
Conference Paper
The desires for robust digital rights management (DRM) systems are not new to the commercial world. Indeed, industrial research, development and deployment of systems with DRM aspects (most notably crude copy-control schemes) have a long history. Yet to date the industry has not seen much commercial success from shipping these systems on top of platforms that support general-purpose computing. There are many factors contributing to this lack of acceptance of current DRM systems, but I see three specific areas of work that are key adoption blockers today and ripe for further academic and commercial research. The lack of a general-purpose rights expression/authorization language, robust trust management engines and attestable trusted computing bases (TCBs) all hamper industrial development and deployment of DRM systems for digital content. In this paper I briefly describe each of these challenges, provide examples of how the industry is approaching each problem, and discuss how the solutions to each one of them are dependent on the others.
Article
Various scenarios that describe the limitations of security by obscurity, the belief that code secrecy can make a system more secure, are discussed. In 1994, Blaze discovered a flaw in the Escrowed Encryption Standard that could be used to circumvent law-enforcement monitoring. Cases regarding flaws in the software secrecy of NASA and Diebold are also discussed. It is suggested that risk assessment, examination, and testing appropriate to deployment settings are fundamental to security assurance.
Article
Purpose – The purpose of this paper is to provide a review of developments in the USA related to digital rights management (DRM) through legal, technological, and market developments in recent years. Design/methodology/approach – This article summarizes recent developments in DRM in two areas. First is the legal landscape, including copyright law developments that apply to digital content and attempts to impose DRM technology through legislation and litigation. Second are recent advances in DRM-related technology and developments in digital content markets that are based on DRM. In both cases, USA developments are compared with the situation in Europe. Findings – Developments in American copyright law, DRM technology, and digital content markets exert heavy influences on the spread of DRM in Europe, but the legal and technological frameworks are not the same, giving rise to incompatibilities. Practical implications – DRM technologies need to evolve differently for European markets, because they need to exist in the context of fundamentally different copyright law frameworks (e.g., Private Copying) and incumbent technologies (e.g., Conditional Access television). Originality/value – European readers of this paper should gain an understanding of American DRM technology and its legal context, and how they influence developments in Europe.
DRM and HTML5: it’s now or never for the Open Web
  • H Halpin
Technological protection measures in the Copyright (Amendment) Bill
  • P Prakash
W3C process document (2016)
  • C McCathie Nevile
Security researchers: tell the W3C to protect researchers who investigate browsers. https://www.eff.org/deeplinks/2016/03/security-researchers-tell-w3c-protect-researchers-who-investigate-browsers
  • C Doctorow
FA Premier League: the broader implications for copyright licensing
  • B Batchelor
  • T Jenkins