Article

Abstraction, indirection, and Sevareid's Law: Towards benign computing

Authors:
Barath Raghavan

Abstract

Computing is one of the primary means by which we solve problems in society today. In this short paper we examine the implications of the primary techniques used in computer systems work, abstraction and indirection, and of Sevareid's Law, an epigram suggesting that our problem-solving instinct may be leading us astray. We explore the context of this dilemma and discuss instances in which it has arisen in the recent past. We then consider a few design options and changes to the normal mode of computer science practice that might enable us to sidestep the implications of Sevareid's Law.
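Since the abstract centers on abstraction and indirection as the primary techniques of computer systems work, a small illustrative sketch may help. The Python below is not from the paper; the class names and the replication scenario are invented. It shows an abstraction (a storage interface that hides implementation details) and an added layer of indirection (a wrapper that forwards calls to several backends without changing the interface callers see):

```python
# Illustrative sketch only; names and scenario are hypothetical.

class Storage:
    """Abstraction: callers see put/get, not disks, networks, or formats."""
    def put(self, key, value):
        raise NotImplementedError

    def get(self, key):
        raise NotImplementedError


class LocalStorage(Storage):
    """One concrete implementation behind the abstraction."""
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data[key]


class ReplicatedStorage(Storage):
    """Indirection: an extra layer that forwards each call to several
    backends, adding robustness (and machinery) while leaving the
    caller-facing interface unchanged."""
    def __init__(self, backends):
        self._backends = backends

    def put(self, key, value):
        for backend in self._backends:
            backend.put(key, value)

    def get(self, key):
        return self._backends[0].get(key)


store = ReplicatedStorage([LocalStorage(), LocalStorage()])
store.put("config", "v1")
print(store.get("config"))  # -> v1
```

Each such layer buys generality, but it is also new machinery that must itself be maintained, which is where the paper's concern with Sevareid's Law begins.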


... Alderson & Doyle [30] have argued that complexity arises in such systems in order to provide robustness to uncertainty in their environments; however, this complexity can also be a source of fragility, leading to a "robust yet fragile" tradeoff in system design. The need to scale out at all levels of the architecture, at both the level of distributed systems and at the macro system level, is also emphasized in the paper by Raghavan [16] for creating "benign systems", which are computer systems that are less likely to produce harmful impacts on the ecosystem and society. ...
... While ICT admittedly has many benefits, the unthoughtful use of technology can lead to unintended harmful side effects (e.g., when society becomes overly reliant on technology, it fails to function when technology is disrupted). As we chart out the approximate networking ecosystem, it will be a good time to base approximate networking on the strong architectural foundations of "benign computing" [16], which is focused on minimizing the harmful side effects of technology. In this regard, we can focus on making our approximate networking solutions scale out, fail well, and have open design at every level of their structure [16]. ...
Conference Paper
The Internet is the linchpin of modern society, around which the various threads of modern life weave. But being a part of the bigger energy-guzzling industrial economy, it is vulnerable to disruption. It is widely believed that our society is exhausting its vital resources to meet our energy requirements, and the cheap fossil fuel fiesta will soon abate as we cross the tipping point of global oil production. We will then enter the long arc of scarcity, constraints, and limits---a post-peak "long emergency" that may subsist for a long time. To avoid the collapse of the networking ecosystem in this long emergency, it is imperative that we start thinking about how networking should adapt to these adverse "undeveloping" societal conditions. We propose using the idea of "approximate networking"---which will provide good-enough networking services by employing contextually-appropriate tradeoffs---to survive, or even thrive, in the conditions of scarcity and limits.
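To make "good-enough networking services by employing contextually-appropriate tradeoffs" concrete, here is a minimal, hypothetical sketch. The service tiers and bandwidth thresholds are invented for illustration and do not come from the paper:

```python
# Hypothetical sketch: pick a "good-enough" service tier under scarcity.
# Tiers and thresholds are illustrative only.

TIERS = [
    # (service, kbps required)
    ("video", 1500),
    ("audio", 64),
    ("text", 8),
]

def choose_tier(available_kbps):
    """Return the richest service the available bandwidth can sustain."""
    for service, needed_kbps in TIERS:
        if available_kbps >= needed_kbps:
            return service
    return "store-and-forward"  # defer delivery when even text will not fit

print(choose_tier(2000))  # -> video
print(choose_tier(100))   # -> audio
print(choose_tier(2))     # -> store-and-forward
```

The point of the sketch is the shape of the decision, degrading gracefully rather than failing outright, not the particular tiers.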
... One approach to addressing the harms of future technologies is a "benign technology" perspective [69]. This perspective also arose out of computing; we present an adapted version here, expanded to encompass all technology: ...
Preprint
Full-text available
In the past several years, scientists have issued a series of warnings about the threats of climate change and other forms of environmental disruption. Here, we provide a scientists' warning on how technology affects these issues. Technology simultaneously provides substantial benefits for humanity, and also profound costs. Current technological systems are exacerbating climate change and the wholesale conversion of the Earth's ecosystems. Adopting new technologies, such as clean energy technologies and artificial intelligence, may be necessary for addressing these crises. Such transformation is not without risks, but it may help set human civilizations on a path to a sustainable future.
... al discusses material end-to-end dependencies for a self-sustainable internet in a paper on "internet quines" [10]. Another formative concept is Raghavan's benign computing, in which computing devices horizontally scale, fail well (in the case of intermittent power), and aspire toward open design [11]. ...
... Raghavan and Pargman [38] ask "what is the appropriate response to excessive sociotechnical complexity?", and suggest that the process of refactoring can be useful not only in computing but also for "simplifying large-scale sociotechnical systems while retaining all or most of their benefits". Raghavan [36] cites Sevareid's law, "The chief source of problems is solutions", discusses how "our problem-solving instinct may be leading us astray", and proposes "a set of principles for computing that is less likely to have unintended, harmful downsides" to ecosystems and human society. Swiss researcher Lorenz Hilty has grappled with the same issues over an extended period of time and discusses the relationship between Computing within Limits and the concepts of efficiency, sufficiency and self-sufficiency [21]. ...
... "It is possible to cling to hopes about human ingenuity and the success of large-scale engineering projects (carbon capture and storage, fusion power, massive scaling-up of renewable energy sources, geoengineering etc.) for much longer than it is possible to deny a reality of decreasing rates of return of limited resources. " [21] Similarly, Raghavan [24] responds to Kelly's attempts to find a middle ground between techno-utopia and technophobic [12] by arguing that beneficial computing is impossible as it is beset with temporary solutions that don't work, to problems that are in reality unsolvable. Raghavan suggests an alternative "benign computing" which is a rejection of the utopian notion of creating new technology that is strictly 'beneficial' or that advances 'development'. ...
Conference Paper
Full-text available
In computing there is a small but growing community who desire to make sense of the role of computing in a world with limits. This community has provided a much-needed critical perspective on what has otherwise been computing's contribution to a worsening world state, or at best a weak sustainability. But, by framing the biophysical and social environment as limited, there is a danger of adopting a negative and overly pessimistic approach with the effect of marginalising our message and contribution to computing. Previous attempts to address the tension between a limited world and a positive approach have foundered on concerns that a techno-utopia is not only unrealisable but that efforts to achieve it are exacerbating the problem. In this paper we explore the potential for an explicitly positive approach to computing within limits research: regenerative computing. We describe what regenerative computing within limits might look like and suggest a way forward. We expect this new approach to transform the computing and sustainability discourse, and empower the computing within limits community to become ambassadors of hope and regenerative sustainability.
... Unintended consequences are the staple of complex social systems; they follow unexpectedly from the nonlinear interactions between subsystems [11] and from our propensity to intervene in systems with our "solutions". Unfortunately, our problem-solving instinct also creates a number of follow-up problems, and networking systems (including future self-driving networks) are not immune to this tendency [40]. Systems thinking can help us anticipate and avoid the negative consequences of well-intentioned solutions. ...
Conference Paper
Along with recent networking advances (such as software-defined networks, network functions virtualization, and programmable data planes), the networking field, in a bid to construct highly optimized self-driving and self-organizing networks, is increasingly embracing artificial intelligence and machine learning. It is worth remembering that the modern Internet that interconnects millions of networks is a 'complex adaptive social system', in which interventions not only cause effects but the effects have further knock-on consequences (not all of which are desirable or anticipated). We believe that self-driving networks will likely raise new unanticipated challenges (particularly in the human-facing domains of ethics, privacy, and security). In this paper, we propose the use of insights and tools from the field of "systems thinking"---a rich discipline developing for more than half a century, which encompasses more realistic models of complex social systems---and highlight their relevance for studying the long-term effects of network architectural interventions, particularly for self-driving networks. We show that these tools complement existing simulation and modeling tools and provide new insights and capabilities. To the best of our knowledge, this is the first study that has considered the relevance of formal systems thinking tools for the analysis of self-driving networks.
... Unintended consequences are the staple of complex social systems; they follow unexpectedly from the nonlinear interactions between subsystems [8] and from our propensity to intervene in systems with our "solutions", regarding which Eric Sevareid, a news commentator, astutely noted that "the chief source of problems is solutions". Our problem-solving instinct also creates a number of follow-up problems, and networking systems (including future self-driving networks) are not immune to this tendency [32]. ...
Article
Full-text available
The networking field has recently started to incorporate artificial intelligence (AI), machine learning (ML), and big data analytics, combined with advances in networking (such as software-defined networks, network functions virtualization, and programmable data planes), in a bid to construct highly optimized self-driving and self-organizing networks. It is worth remembering that the modern Internet that interconnects millions of networks is a 'complex adaptive social system', in which interventions not only cause effects but the effects have further knock-on effects (not all of which are desirable or anticipated). We believe that self-driving networks will likely raise new unanticipated challenges (particularly in the human-facing domains of ethics, privacy, and security). In this paper, we propose the use of insights and tools from the field of "systems thinking"---a rich discipline developing for more than half a century, which encompasses qualitative and quantitative nonlinear models of complex social systems---and highlight their relevance for studying the long-term effects of network architectural interventions, particularly for self-driving networks. We show that these tools complement existing simulation and modeling tools and provide new insights and capabilities. To the best of our knowledge, this is the first study that has considered the relevance of formal systems thinking tools for the analysis of self-driving networks.
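As a taste of the quantitative tools the abstract invokes, the following is a minimal stock-and-flow model in the system-dynamics style: a reinforcing adoption loop checked by a balancing congestion loop. The scenario and all parameters are invented for illustration and are not from the paper:

```python
# Minimal stock-and-flow sketch in the system-dynamics style.
# Scenario and parameters are invented for illustration.

def simulate(steps=50, dt=1.0):
    users = 100.0       # stock: active users of a network service
    capacity = 1000.0   # fixed capacity (links, servers)
    history = []
    for _ in range(steps):
        congestion = users / capacity                 # degrades experience
        adoption = 0.10 * users * (1 - congestion)    # reinforcing loop
        churn = 0.02 * users * congestion             # balancing loop
        users += (adoption - churn) * dt
        history.append(users)
    return history

trace = simulate()
print(f"users after 50 steps: {trace[-1]:.0f}")
```

Even this toy model exhibits the knock-on behavior the abstract describes: an intervention that raises adoption also raises congestion, which feeds back to suppress adoption and increase churn.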
... The Internet of today is quite different from that of a few decades past along both axes: it has gone from partially centralized and democratic to distributed and feudal. Our aim is to move towards distributed and democratic: not to undo the necessary trend towards wide distribution, but to disperse control. Put another way, the scale-out design philosophy that has served us well in the design of systems over the past two decades must now be applied to the control of systems as well [39]. ...
Conference Paper
Full-text available
Today's Internet scarcely resembles the mythological image of it as a fundamentally democratic system. Instead, users are at the whims of a small number of providers who control nearly everything about users' experiences on the Internet. In response, researchers and engineers have proposed, over the past decade, many systems to re-democratize the Internet, pushing control over data and systems back to the users. Yet nearly all such projects have failed. In this paper we explore why: what are the goals of such systems and what has caused them to run aground?
... In most modern cloud-based services the systems are indeed administratively centralized (i.e., stored on systems owned by one company) even if many different physical computers are used in a data center to store the data. In systems terminology, the backend cloud systems in data centers are "scale out" but the administrative control adheres to the older, now-archaic model of "scale up", as we discuss in prior work [70]. The technical structure of these systems has a direct impact on the axes identified by Preist et al. and by Ekbia and Nardi: that of the cornucopian paradigm and of issues of labor. Labor is often provided by users for free while the company that owns the technical infrastructure harvests the monetary value that is created by this arrangement [44]. ...
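The scale-out versus scale-up contrast drawn in this excerpt can be stated in a few lines of code. The throughput model below is deliberately crude and hypothetical, intended only to illustrate the distinction: scale-up makes one machine bigger (and hits a ceiling), while scale-out adds machines at some coordination cost:

```python
# Crude, hypothetical illustration of scale-up vs. scale-out.

def scale_up(base_ops_per_sec, speedup):
    """One machine made bigger; bounded by the largest box you can buy."""
    return base_ops_per_sec * speedup

def scale_out(base_ops_per_sec, nodes, efficiency=0.9):
    """Many commodity machines; grows with node count, minus coordination overhead."""
    return base_ops_per_sec * nodes * efficiency

print(scale_up(100, 4))    # -> 400: limited by single-machine ceilings
print(scale_out(100, 16))  # -> 1440.0: grows by adding nodes
```

The excerpt's observation is that while the data plane of cloud services scales out in this sense, administrative control still follows the scale-up model: it is concentrated rather than dispersed.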
Conference Paper
There has been an increased interest in broader contexts from ecology and economics within the HCI community in recent years. These developments suggest that the HCI community should engage with and respond to concerns that are external to computing yet profoundly impact human society. In this paper we observe that taking these broader contexts into account yields a fundamentally different way to think about sustainable interaction design, one in which the designer's focus must be on a) ecological limits, b) creating designs and artifacts that do not further a cornucopian paradigm, and c) fundamental human needs. It can be hard to be responsive to these contexts in practical HCI work. To address this, we propose that the design rubric of disintermediation can serve as a unifying approach for work that aims to meet the ecological and economic challenges outlined in the literature. After discussing the potential use and impact of disintermediation, we apply this design rubric to several key application areas.
... "Bincam" in Thieme et al 2012 [27]). A more thoughtful, and constructive, treatment of this problem space is given by Raghavan [23] in his discussion of how Sevareid's Law ("the chief source of problems is solutions") can provide useful motivation to understand current technology design. However the importance of the device that is used to provide data to, and extract data from the individual, is underestimated. ...
Conference Paper
Full-text available
YAFR (Yet another futile rant) presents the smartphone: an unstoppable piece of technology generated from a perfect storm of commercial, technological, social and psychological factors. We begin by misquoting Steve Jobs and by being unfairly rude about the HCI community. We then consider the smartphone's ability to kill off competing technology and to undermine collectivism. We argue that its role as a Lacanian stain, an exploitative tool, and as a means of concentrating power into the hands of the few, make it a technology that will rival the personal automobile in its effect on modern society.
... Industrial societies are replete with complex sociotechnical systems that contribute to most of their complexity today. Many of these systems are marked by significant use of abstraction and indirection, two of the core principles that enable much of computing design [32]. By examining abstraction and indirection, and thinking about sociotechnical systems as we think about computing systems, we might be able to uncover a coarse understanding of the complexity of sociotechnical systems. ...
Conference Paper
Research in sociology, anthropology, and organizational theory indicates that most societies readily create increasingly complex societal systems. Over long periods of time, accumulated societal complexity bears costs in excess of benefits, and leads to a societal decline. In this paper we attempt to answer a fundamental question: what is the appropriate response to excessive sociotechnical complexity? We argue that the process of refactoring, which is commonplace in computing, is ideally suited to our circumstances today in a global industrial society replete with complex sociotechnical systems. We further consider future directions for computing research and sustainability research with the aim to understand and help decrease sociotechnical complexity.
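For readers unfamiliar with refactoring as practiced in computing, which the abstract proposes as a model for simplifying sociotechnical systems, here is a minimal Python example (invented for illustration): external behavior is preserved while a needless layer of indirection is removed:

```python
# Invented example: behavior-preserving removal of a needless layer.

# Before: a pass-through wrapper adds indirection without adding value.
class ReportHelper:
    def __init__(self, records):
        self._records = records

    def totals(self):
        return sum(self._records)

class ReportService:
    def __init__(self, records):
        self._helper = ReportHelper(records)

    def monthly_totals(self):
        return self._helper.totals()

# After refactoring: the wrapper is gone; behavior is unchanged.
class ReportServiceRefactored:
    def __init__(self, records):
        self._records = records

    def monthly_totals(self):
        return sum(self._records)

assert ReportService([1, 2, 3]).monthly_totals() == \
       ReportServiceRefactored([1, 2, 3]).monthly_totals()
```

The abstract's analogy is that sociotechnical layers, like the wrapper above, can sometimes be removed while retaining all or most of the benefits that once justified them.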
Conference Paper
The Internet stands atop an unseen industrial system required for its continued growth, operation, and maintenance. Its scale could not have been achieved without this reliance, and its dependencies---ranging from sophisticated manufacturing facilities to limited raw materials---make it vulnerable to supply-chain disruptions, which are more likely as human society faces global ecological limits. We introduce the concept of an Internet quine, a metaphor that represents a collection of devices, protocols, manufacturing facilities, software tools, and other related components that is self-bootstrapping and capable of being used (by engineers or autonomously) to reproduce itself and all the needed components of the Internet. In this paper, we study the nature of Internet quines and discuss how they could be built. We also attempt to identify a collection of such tools and facilities, and how small and inexpensive they can be made.
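The metaphor borrows the programming notion of a quine: a program whose output is its own source code. A classic minimal Python quine, shown here only to unpack the term:

```python
# A quine: running this program prints its own source (these two lines).
s = 's = %r\nprint(s %% s)'
print(s % s)
```

An Internet quine, by analogy, is a collection of devices, tools, and facilities sufficient to reproduce both itself and the components of the Internet.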
Conference Paper
Research on computing within limits explores the design of computing technologies that will be appropriate for a future where availability of resources is drastically reduced. In an effort to define the scope and goals of limits-aware computing, early papers discussed how such a future may come about, what challenges this future may present, and the kinds of technologies we should design given these scenarios. In this paper, we posit that these future challenges already exist today in their incipient forms. We propose that limits-aware computing research should focus on these problems to make a difference today while preparing for further future collapse.
Conference Paper
Full-text available
Recent years have seen a flurry of work on sustainable computing and sustainable HCI, but it is unclear whether this body of work adheres to a meaningful definition of sustainability. In this paper, we review four interlocking frameworks that together provide a rigorous foundation for what constitutes sustainability. Each consecutive framework both builds upon and can loosely be seen as a refinement of the previous framework. More specifically, we leverage prominent ecological thinking from outside of computer science to inform what sustainability means in the context of computing. To this end, we re-evaluate some recent results from the field of sustainable HCI and offer thoughts on further research in the field.
Book
As computation continues to move into the cloud, the computing platform of interest no longer resembles a pizza box or a refrigerator, but a warehouse full of computers. These new large datacenters are quite different from traditional hosting facilities of earlier times and cannot be viewed simply as a collection of co-located servers. Large portions of the hardware and software resources in these facilities must work in concert to efficiently deliver good levels of Internet service performance, something that can only be achieved by a holistic approach to their design and deployment. In other words, we must treat the datacenter itself as one massive warehouse-scale computer (WSC). We describe the architecture of WSCs, the main factors influencing their design, operation, and cost structure, and the characteristics of their software base. We hope it will be useful to architects and programmers of today's WSCs, as well as those of future many-core platforms which may one day implement the equivalent of today's WSCs on a single board.