Chapter

Network Neutrality: An Empirical Approach to Legal Interoperability

Abstract

The Internet is grounded in an open and interoperable architecture, giving rise to a quintessentially transnational environment. This global network of networks is, however, in natural tension with an international legal system based on mutually exclusive legal frameworks. Unlike electronic networks, which rely on shared technical standards whose main objective is to make different systems compatible, national juridical systems are based on essentially domestic rules, whose application to the online environment has the potential to fragment the Internet. The implementation of divergent domestic laws and regulations could indeed balkanise the global Internet, creating separate national intranets and potentially conflicting cyberspaces. It seems important, therefore, to encourage the development of harmonious rules across jurisdictions, thus fostering the compatibility of the legal systems penetrated by the Internet. Promoting a “legally interoperable” environment may be considered an instrumental step towards a better-functioning Internet ecosystem, in which new technologies can flourish and the free flow of information is not hindered by diverging national laws.

... We will use the concept of interoperability to arrive at an advanced theoretical model. Nevertheless, it should be recalled that interoperability in legal and social terms differs from interoperability as applied to systems (Belli and Foditsch 2016). Legal interoperability can occur in varying degrees, whereas technical interoperability is binary (yes or no). ...
Chapter
Cybercrime is not a new subject for the Brazilian authorities. A 2015 article indicates that Brazil is the second most vulnerable country to cybercrime (Muggah and Thompson 2015). It also indicates that Brazilian society lost between 4.1 and 4.7 billion US dollars to data theft and financial fraud. Most cybercrime targets were companies and citizens who had money in the bank. That article concludes that the Brazilian government and society should implement further reforms to fight cybercrime. But how is the country implementing these reforms so that they succeed? This chapter answers that question by conducting two assessments: of the most recent reforms and of those to come. In addition, it traces the current state of cybercrime in Brazil after the pandemic crisis, using multi-source data.
Article
This article claims that the Notice and Consent (N&C) approach is not efficient in protecting the privacy of personal data. On the contrary, N&C could be seen as a license to freely exploit the individual’s personal data. For this reason, legislators and regulators around the world have been advocating for different and more efficient safeguards, notably through the implementation of the Privacy by Design (PbD) concept, which is predicated on the assumption that privacy cannot be assured solely by compliance with regulatory frameworks. In this sense, PbD affirms that privacy should become a key concern for developers and organisations alike, thus permeating new products and services as well as the organisational modi operandi. Through this paper, we aim to uncover evidence of the inefficiency of the N&C approach, as well as the possibility of further enhancing PbD, in order to provide the individual with increased control over her personal data. The paper aims to shift the focus of the discussion from “take it or leave it” contracts to concrete solutions aimed at empowering individuals. As such, we are putting forth the Data Control by Design (DCD) concept, which we see as an essential complement to the N&C and PbD approaches advocated by data-protection regulators. The technical mechanisms that would enable DCD are currently available (for example, User Managed Access (UMA) v1.0.1 Core Protocol). We, therefore, argue that data protection frameworks should foster the adoption of DCD mechanisms in conjunction with PbD approaches, and privacy protections should be designed in a way that allows every individual to utilise interoperable DCD tools to efficiently manage the privacy of her personal data. After having scrutinised the N&C, PbD and DCD approaches, we discuss the specificities of health and genetic data, and the role of DCD in this context, stressing that the sensitivity of genetic and health data requires special scrutiny from regulators and developers alike.
In conclusion, we argue that concrete solutions allowing for DCD already exist and that policymakers should join efforts with other stakeholders to foster the concrete adoption of the DCD approach.
Article
Full-text available
The internet is a general-purpose network grounded on openness, decentralisation and interoperability. Such features have allowed innovation to flourish, lowering barriers to communication, participation and cooperation, thus empowering end-users. ‘General purpose’ means that the purpose for which the internet is used is not predefined by the operator but can be autonomously decided by the end-user. Accordingly, the network neutrality (NN) principle mandates non-discriminatory treatment of internet traffic to preserve the general-purpose nature of the internet, unleashing end-users’ creativity. This paper starts by exploring the NN debate, stressing that the NN rationale is to preserve an open and decentralised internet architecture, empowering end-users and protecting their rights. Subsequently, it argues that the combination of reduced data caps and zero-rating (ZR) schemes may create artificial scarcity, raise the price of the open internet and jeopardise the achievement of the NN rationale. It provides a taxonomy of ZR models and emphasises that several ZR practices might impose on the internet a centralised configuration that characterises less innovative networks, such as the Minitel. The phenomenon that I define as ‘Minitelisation’ of the internet consists of the shift from a user-centric, general-purpose network to one with predefined purposes, thereby creating passive consumers of predetermined services, rather than active internet users.
Article
Full-text available
This case study is part of an ongoing series developed in support of a larger text on interoperability by John Palfrey and Urs Gasser, Interop: The Promise and Perils of Highly Interconnected Systems (Basic Books, June 2012). The book is an extension of their 2007 study and paper, “Breaking Down Digital Barriers: When and How ICT Interoperability Drives Innovation” (Berkman Center Research Publication, 2007). Interop: The Promise and Perils of Highly Interconnected Systems focuses on the relationship between interoperability and innovation in the Information and Communication Technology (ICT) environment and beyond. Palfrey and Gasser seek to sharpen the definition of interoperability and identify its relevance for consumers, companies, governments, and the public by examining its driving forces and inhibitors, while considering how it can best be achieved, and why. Authored by research assistants at the Berkman Center for Internet & Society in close collaboration with the authors, the series examines real-world examples where interoperability has played or continues to play a decisive role — in ICT and other sectors, including transportation, retail, and energy. Each case explores how forces such as law, policy, technology, economic incentives, and market innovations drive, or in some cases inhibit, interoperability, and what industry stakeholders seek to achieve via interoperability. Some cases explore well-known innovations, such as the bar code and the UPC system, air traffic control, and the QWERTY keyboard layout. Others touch on more recent examples, including efforts to standardize cell phone chargers in the E.U., the evolution of electronic data interchange systems, and digital rights management (DRM) systems.
The nature of interoperability and its attendant challenges are also explored in less-traditional contexts, including commons-based knowledge creation models, which require large-scale collective efforts, and complex, next-generation models, like the Internet of Things, cloud computing, and the Smart Grid.
Article
Full-text available
Chris Marsden argues for a 'Middle Way' on net neutrality, a problem of consumer and media policy without easy answers that cannot be left to self-regulated market actors. His holistic solution considers ISPs' roles in the round, including their ‘Three Wise Monkeys’ legal liabilities for content filtering. Co-regulation is an awkward compromise between state and private regulation, with constitutionally uncertain protection for end-users and the appearance of a solution with only partial remedy against private censorship.
Article
Full-text available
A design principle aimed at choosing the proper placement of functions among the modules of a distributed computer system is presented. The principle, called the end-to-end argument, suggests that certain functions usually placed at low levels of the system are often redundant or of little value compared to the cost of providing them at a low level. Examples include highly reliable communications links, encryption, duplicate message suppression, and delivery acknowledgement. It is argued that low-level mechanisms to support these functions are justified only as performance enhancements.
Article
Full-text available
We discuss network neutrality regulation of the Internet in the context of a two-sided market model. Platforms sell broadband Internet access services to residential consumers and may set fees to content and application providers on the Internet. When access is monopolized, cross-group externalities (network effects) can give a rationale for network neutrality regulation (requiring zero fees to content providers): there exist parameter ranges for which network neutrality regulation increases the total surplus compared to the fully private optimum at which the monopoly platform imposes positive fees on content providers. However, for other parameter values, network neutrality regulation can decrease total surplus. Extending the model to a duopoly of residential broadband ISPs, we again find parameter values such that network neutrality regulation increases total surplus suggesting that network neutrality regulation could be warranted even when some competition is present.
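The cross-group externality driving this result can be sketched in standard two-sided-market notation. This is a generic illustration with assumed symbols, not the authors' exact specification: consumers' utility rises with the number of content providers on the platform, and vice versa, while the platform prices both sides.

```latex
% Stylized two-sided platform (illustrative; symbols are assumed, not the paper's).
% n_C consumers, n_A content providers; p_C, p_A the prices charged to each side.
\begin{aligned}
u_C &= v_C + \alpha\, n_A - p_C
      && \text{(consumer utility: values content variety, } \alpha > 0\text{)}\\
u_A &= v_A + \beta\, n_C - p_A
      && \text{(provider utility: values consumer reach, } \beta > 0\text{)}\\
\pi &= p_C\, n_C + p_A\, n_A
      && \text{(platform profit)}
\end{aligned}
```

In this stylized setting, network neutrality regulation corresponds to the constraint p_A = 0, so the platform recovers revenue only from the consumer side; the welfare comparison the abstract describes asks whether total surplus (consumer surplus plus provider surplus plus π) is higher under that constraint than at the platform's unconstrained optimum, which turns on the magnitudes of the externality parameters α and β.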
Chapter
The question of whether and how to protect the principle of network neutrality (“NN”) is currently one of the most hotly debated topics of Internet policy around the world. As the name may already suggest, NN is essentially a non-discrimination principle that applies to the transmission of Internet traffic. It prescribes that, in principle, all Internet traffic should be transmitted on an equal basis, or at least in a manner that does not favour or disfavour particular users, applications, content, services or devices. The need to protect NN through law and policy is widely perceived as a result of the discriminatory treatment of Internet traffic which some Internet providers have begun to engage in (BEREC 2012) while others have publicly announced their wish to do so. Such discriminatory treatment has the potential to restrict the freedom of Internet users to receive and impart information and use or run services and devices of their choice.
Article
The report explores some of the most crucial facets of Network Neutrality, underscoring its close relationship with the full enjoyment of end-users’ fundamental rights. The report also includes a proposal for a Model Framework on Network Neutrality that has been elaborated by the Dynamic Coalition through an open, inclusive and multi-stakeholder effort, in order to promote an efficient safeguard of the Net Neutrality principle in accordance with international human rights standards.
Article
The end-to-end argument was first put forward in the early 1980s as a core design principle of the Internet. The argument, as framed then, remains relevant and powerful: the fundamental architecture of the Internet endures, despite change in both underlying technologies and in applications. Nevertheless, the evolution of the Internet also shows the limits of foresight, and we now see that the Internet, the applications that run on top of the Internet, and the interpretation of the end-to-end argument itself have all greatly evolved. This paper concerns the evolution in thinking around the end-to-end argument, the design principles for modern Internet applications, and what the end-to-end argument has to do with application structure. We argue that while the literal reading of the early end-to-end argument does not directly offer guidance about the design of distributed applications and network services, there is a useful interpretation of the end-to-end argument that both informs the thinking about application design and gives a perspective on the end-to-end argument in general that is valid in today’s world.
Article
The best proposals for network neutrality rules are simple. They ban abusive behavior like tollboothing and outright blocking and degradation. And they leave open legitimate network services that the Bells and Cable operators want to provide, such as offering cable television services and voice services along with a neutral internet offering. They are in line with a tradition of protecting consumers’ rights on networks whose instinct is just this: let customers use the network as they please. No one wants to deny companies the right to charge for their services and charge consumers more if they use more. But what does need to be stopped is raw discrimination that is nothing more than a tax on innovation taken by government-supported corporations.
Article
This paper examines the concept of network neutrality in telecommunications policy and its relationship to Darwinian theories of innovation. It also considers the record of broadband discrimination practiced by broadband operators in the early 2000s.
Article
This essay addresses the fundamental questions of Internet governance: whether and how the architecture of the Internet should affect the shape and content of legal regulation of the global network of networks. Our answer to these questions is based on the concept of layers, the fundamental architectural feature of the Internet. Our thesis is that legal regulation of the Internet should be governed by the layers principle - the law should respect the integrity of layered Internet architecture. This principle has two corollaries. The first corollary is the principle of layer separation: Internet regulation should not violate or compromise the separation between layers designed into the basic architecture of the Internet. The second corollary is the principle of minimizing layer crossing, i.e., minimizing the distance between the layer at which the law aims to produce an effect and the layer directly affected by legal regulation. The essay argues that layers analysis provides a more robust conceptual framework for evaluating Internet regulations than does the end-to-end principle. The layers principle is supported by two fundamental ideas. The first idea is transparency: the fact that layer-violating regulations damage transparency, combined with the fact that Internet transparency lowers the cost of innovation, provides compelling support for the principle of layer separation: public Internet regulators should not violate or compromise the separation between layers designed into the basic architecture of the Internet. The second idea is fit: the fact that layer-crossing regulations result in an inherent mismatch between the ends such regulations seek to promote and the means employed implies that layer-crossing regulations suffer from problems of overbreadth and underinclusion; avoidance of these problems requires Internet regulators to minimize the distance between the layer at which the law aims to produce an effect and the layer directly targeted by legal regulation.
Finally, the essay provides a detailed discussion of several real or hypothetical layer-violating or layer-crossing regulations, including: (1) The Serbian internet interdiction myth, (2) Myanmar's cut-the-wire policy, (3) China's great firewall, (4) the French Yahoo case, (5) cyber-terrorism, (6) Pennsylvania's IP address-blocking child-pornography statute, (7) port blocking and peer-to-peer file sharing, and (8) the regulation of streaming video at the IP layer.
Book
A detailed examination of how the underlying technical structure of the Internet affects the economic environment for innovation and the implications for public policy. Today—following housing bubbles, bank collapses, and high unemployment—the Internet remains the most reliable mechanism for fostering innovation and creating new wealth. The Internet's remarkable growth has been fueled by innovation. In this pathbreaking book, Barbara van Schewick argues that this explosion of innovation is not an accident, but a consequence of the Internet's architecture—a consequence of technical choices regarding the Internet's inner structure that were made early in its history. The Internet's original architecture was based on four design principles: modularity, layering, and two versions of the celebrated but often misunderstood end-to-end arguments. But today, the Internet's architecture is changing in ways that deviate from the Internet's original design principles, removing the features that have fostered innovation and threatening the Internet's ability to spur economic growth, to improve democratic discourse, and to provide a decentralized environment for social and cultural interaction in which anyone can participate. If no one intervenes, network providers' interests will drive networks further away from the original design principles. If the Internet's value for society is to be preserved, van Schewick argues, policymakers will have to intervene and protect the features that were at the core of the Internet's success.
Making the EU work for people: roaming and the open internet
  • A Ansip
Protecting Human Rights through Network Neutrality: Furthering Internet Users’ Interest, Modernising Human Rights and Safeguarding the Open Internet
  • L Belli
  • M Van Bergen
Architectural Principles of the Internet, Request for Comments
  • B Carpenter
Permissionless Innovation – Openness, not Anarchy. Available at: http://www.internetsociety.org/blog/tech-matters
  • L Daigle
The Tao of IETF: A Novice’s Guide to the Internet Engineering Task Force
  • P Hoffman
Connecting Continents for Enhanced Multistakeholder Internet Governance. IGF 2014 Chair’s Summary
  • IGF Chair
Silenced: an international report on censorship and control of the Internet
  • D Banisar
How do transnational policy actors matter? Annual Meeting of the Research Committee 19 of the International Sociological Association
  • D Béland
  • M A Orenstein
Net neutrality: this is serious
  • T Berners-Lee
De la gouvernance à la régulation de l’Internet
  • L Belli
The Catenet Model for Internetworking, DARPA/IPTO, retrieved from https
  • V Cerf
Network neutrality: Words of power and 800-pound gorillas
  • D Clark
Declaration of the Committee of Ministers on network neutrality
  • CoE
1.5 lakh mails and counting: India lodges one of its biggest online protests over net neutrality
  • P K Jayadevan
Network neutrality: Guidelines for Internet neutrality
  • Nkom
Autronic AG v. Switzerland, Application no. 12726/87
  • ECtHR
Réguler le commerce électronique par la résolution des litiges en ligne: Une approche critique
  • T Schultz
Disegno di legge Costituzionale d’Iniziativa del senatore Campanella comunicato alla Presidenza il 10 luglio 2014. Introduzione dell’articolo 34-bis della Costituzione, recante disposizioni volte al riconoscimento del diritto di accesso ad internet
  • Senato della Repubblica
The path to a free and open internet
  • White House
Interop: The Promise and Perils of Highly Interconnected Systems
  • J Palfrey
  • U Gasser
Legal interoperability as a tool for combatting fragmentation
  • R Weber
Organic net neutrality
  • D Weinberger
Testimony before the House Committee on the Judiciary
  • T Wu
Internet Governance, Council of Europe Strategy
  • CoE
A discourse principle approach to net neutrality policymaking: a model framework and its application. Net neutrality compendium
  • L Belli
  • M Van Bergan
  • W Michael
The Internet Standards Process - Revision 3, Request for Comments
  • S Bradner
Ahmet Yıldırım v. Turkey. Application no
  • ECtHR
Network Neutrality: An Ongoing Regulatory Debate. 2nd Report of the Dynamic Coalition on Network Neutrality. Presented at the 9th United Nations Internet Governance Forum
  • L Belli
Autronic AG v. Switzerland, 22
  • ECtHR
Opinion of the European Data Protection Supervisor on net neutrality, traffic management and the protection of privacy and personal data
  • EDPS
Opinion of the European Data Protection Supervisor on the Proposal for a Regulation of the European Parliament and of the Council laying down measures concerning the European single market for electronic communications and to achieve a Connected Continent
  • EDPS
Permissionless Innovation - Openness, not Anarchy
  • L Daigle
EDPS Comments on DG Connect’s Public Consultation on “Specific Aspects of Transparency, Traffic Management and Switching in an Open Internet”
  • EDPS