Coding Freedom: The Ethics and Aesthetics of Hacking
... To advance empirically on this question, it is key to recognize that contemporary participatory processes are more a cause of the digitization of society than a consequence of it. Indeed, participatory processes are at the root of the 'hacker' movement, which was born in the late 1950s and has profoundly shaped the digital revolution [15,16]. Historically, the hacker movement has its deep roots in the emergence of a strong post-World War II counterculture, born out of the disillusionment of American intellectuals following the atomic bombings of Hiroshima and Nagasaki and, more broadly, out of a rejection of war [17]. ...
This article explores the role of hackathons for good in building a community of software and hardware developers focused on addressing global sustainable development goal (SDG) challenges. We theorize this movement as computational diplomacy: a decentralized, participatory process for digital governance that leverages collective intelligence to tackle major global issues. Analysing Devpost and GitHub data reveals that 30% of hackathons since 2010 have addressed SDG topics, employing diverse technologies to create innovative solutions. Hackathons serve as crucial kairos moments, sparking innovation bursts that drive both immediate project outcomes and long-term production. We propose that these events harness the neurobiological basis of human cooperation and empathy, fostering a collective sense of purpose and reducing interpersonal prejudice. This bottom–up approach to digital governance integrates software development, human collective intelligence and collective action, creating a dynamic model for transformative change. By leveraging kairos moments, computational diplomacy promotes a more inclusive and effective model for digital multilateral governance of the future.
This article is part of the theme issue ‘Co-creating the future: participatory cities and digital governance’.
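The 30% figure above comes from the authors' analysis of Devpost and GitHub data; their pipeline is not described here. As a purely illustrative sketch (an assumption on my part, not their method), a first pass at tagging hackathon descriptions with SDG themes might look like simple keyword matching:

```python
# Illustrative sketch only (not the authors' pipeline): tag hackathon
# descriptions with SDG themes via simple keyword matching.
import re

# Hypothetical, abbreviated keyword map; a real study would need a fuller taxonomy.
SDG_KEYWORDS = {
    "SDG 3 Good health": ["health", "disease", "vaccine", "mental health"],
    "SDG 4 Quality education": ["education", "learning", "literacy", "school"],
    "SDG 7 Clean energy": ["renewable", "solar", "energy efficiency"],
    "SDG 13 Climate action": ["climate", "carbon", "emissions"],
}

def sdg_tags(description: str) -> list[str]:
    """Return the SDG themes whose keywords appear in a hackathon description."""
    text = description.lower()
    return [
        sdg
        for sdg, words in SDG_KEYWORDS.items()
        if any(re.search(rf"\b{re.escape(w)}\b", text) for w in words)
    ]

projects = [
    "A solar-powered sensor network for rural schools",
    "A social app for sharing cat pictures",
]
tagged = [p for p in projects if sdg_tags(p)]
print(f"{len(tagged)}/{len(projects)} sample projects touch SDG topics")
```

A real analysis would require a richer SDG taxonomy and manual validation; the sketch is only meant to make the flavour of such keyword-based tagging concrete.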
... camcorders, social media, smartphones…) to fuel resistance and support protesters, but also the emergence of digitally native resistant coalitions such as Anonymous, has resulted in a burgeoning literature on hacking (e.g. Maxigas, 2012; Coleman, 2013; Toupin, 2016), hacktivism (Jordan, 2002; Milan, 2015), and digital resistance and digital activism more generally (see, among others, Karatzogianni & Michaelides, 2009; Fuchs, 2014; Treré, 2019). ...
Individuals and groups increasingly seek to resist the harms and risks of a data-driven society. This essay explores the possibility of individual and collective resistance vis-à-vis datafication, drawing on examples from across the globe. It shows how infrastructure, political agency, and tactics have changed in response to datafication. It reviews six resistance tactics, distinguishing between “defensive resistance” and “productive resistance”: self-defence, subversion, avoidance, literacy, counter-imagination, and advocacy campaigning. Investigating them offers insights on the ability of social actors to contribute to innovation in mobilising practices amidst intrusive surveillance.
... We are interested in practices that are closer to the messy entanglements of learning in the making. For instance, Michelle Cordy, in a thirty-second YouTube clip, draws a mind map for the book Coding Freedom (Coleman, 2013). The clip is a sped-up version of her drawing the map, referring to the book, using post-it notes, multi-coloured markers and so on. ...
This paper is about a slow hunch. A hunch that a modest interference in networked learning, that we have called public click pedagogy (PCP), may, in some instances, usefully open up a side of networked learning that is often glossed. Learning new material, developing new skills, making new discoveries can be complicated and messy. Few of us go from inexperienced to skilled or novice to master in anything like a simple, tidy or routine manner. We often learn more from our mistakes than our successes. We sometimes find ourselves in blind alleys or chasing down rabbit holes that appear to take us nowhere. What learners actually do when they try to come to terms with a new domain via formal or informal means tends to be secret learner business. What is commonly made visible is how successful they are in coming to terms with the domain, something which is judged by people who have demonstrated knowledge and expertise in the domain. Our hunch is that a modest exploration of secret learner business, by making public the fuzzy, pragmatic and messy business of learning, may work as a useful complement to those elements of learning already made public. The label PCP draws attention to three characteristics of this work: that it is made public, that 'aha' or click moments, which in a glossed account masquerade as the product of acute insight, are traced carefully, and that accounts of these practices may operate pedagogically for the learner and, perhaps, other learners. To explore the doing of networked learning we draw parallels with the doing of science as it has been studied by scholars in the field of science, technology, and society (STS). Looking at how scientists actually do science, STS scholars came to see that accounts of science as products of the scientific method glossed the messiness, noncoherence and fuzziness of what went on in the laboratory. To Bruno Latour, in the early days of STS, science was Janus-like, with two contradictory faces: science in the making and ready-made science. More recently, John Law and others have extended this line of argument to examine the performativity of noncoherence. Drawing on this work, we examine three empirical cases, one of which is the preparation of this paper. We trace the negotiation of the ideas and arguments, our learning in the making, and the noncoherences which we partially domesticate through dialogue.
... Openness has long been associated with the traits of classical liberalism: individualism, liberty, equality, competition, and free-market exchange (Tkacz, 2015). These values resonate within the libertarian culture of Silicon Valley, and found fertile ground in the open source faction of the F/OSS movement (Coleman, 2013). Likewise, the rapid and widespread adoption of open data is doubtless due in part to its conformity with reigning neoliberal visions of good governance. ...
Open data is increasingly being promoted as a route to achieve food security and agricultural development. This article critically examines the promotion of open agri-food data for development through a document-based case study of the Global Open Data for Agriculture and Nutrition (GODAN) initiative as well as through interviews with open data practitioners and participant observation at open data events. While the concept of openness is striking for its ideological flexibility, we argue that GODAN propagates an anti-political, neoliberal vision for how open data can enhance agricultural development. This approach centers values such as private innovation, increased production, efficiency, and individual empowerment, in contrast to more political and collectivist approaches to openness practiced by some agri-food social movements. We further argue that open agri-food data projects, in general, have a tendency to reproduce elements of “data colonialism,” extracting data with minimal consideration for the collective harms that may result, and embedding their own values within universalizing information infrastructures.
... This was in regard to the conversion of the Netscape web browser to open source (as Mozilla), which occurred immediately after its release, and the publication of The Open Source Definition (OSD) in 1998. In the early 2000s, numerous publications focused on economics and management (for example, Capiluppi & Michlmayr, 2007; Coleman, 2012; Lerner & Tirole, 2002; Weber, 2005). In this paper, we propose that since the heyday of open source in the 2000s, the internal circumstances and external environment surrounding open source have changed. ...
Since the emergence of free software in the 1980s, the belief that software should be free has been quite widely held. During this period, software developers took the leading role. As the 1990s began, there was a notable increase in the number of general users of open source software. Furthermore, developers and distro developers (who acted as both licensors and licensees, as well as users and developers) began to take a more active role in driving the direction of open source development. This period marked the emergence of the open source era, characterized by greater convenience for distro developers. However, in the 2010s, the importance of platforms increased and the previous control over licensing diminished. In other words, the history of free software and open source has been the history of how the parties holding the business initiative have changed in each period.
... These cultures foster creative and alternative explorations of cooking, eating, and dining that go beyond common concerns over food quality, sustainability, safety, and justice. The niche experiments inspired by molecular gastronomy, science, technological trends in crowdsourcing, and collaboration via social networks, as well as sensors and DIY electronics, embody the maker and hacker ethos with all its ambiguities (Coleman 2013). In these experiments the makers and hackers try to find low-cost, DIY, reverse-engineered, and inventive solutions to disrupt and/or customize the current systems of food production, consumption, distribution, and food politics. ...
... The movement to promote open-source software was, in the early 2000s, still the subject of vigorous debate, especially as two wings of the software community (one adopting an "open source" framing and the other embracing the language of "free software") both sought to increase the adoption and visibility of open licenses such as the GPL (Coleman, 2012). Since then, openly licensed software has become a routine element within a range of industries; in fact, it is a crucial element in massive markets ranging from internet servers to mobile phones, and this normalization has perhaps undermined the necessity of searches for terms related to it. ...
... We found that computer science students experience an ecosystem in which stereotypes manifest themselves in institutional and informal contexts, both online and offline (the songbook used during orientation weeks; the décor of the student-run bar; the classroom and social media). Humor is an essential way to release and exorcise anxiety and worries in the busy life of being a computer science student; humor plays a role in celebrating and defining geekiness (Coleman 2012), maintaining joking relationships across social groups (Radcliffe-Brown 1940) and, in some cases, publicly making feminist statements. This is consistent with recent anthropological research describing humor as an important, and understudied, mode of human cognition (Hsu 2016) which allows people to perceive a situation or phenomenon from multiple diverse perspectives. ...
We propose equity-focused institutional accountability as a set of principles to organize equity, inclusion, and diversity efforts in computer science organizations. Structural inequity and lack of representation of marginalized identities in computing are increasingly in focus in CSCW research – and research institutions as well as tech organizations are struggling to find ways to advance inclusion and create more equitable environments. We study humor in a computer science organization to explore and decode how negative stereotypes create unnecessary and avoidable barriers to inclusion and counter efforts to create a welcoming environment for all. We examine the humor embedded in sociomaterial artefacts, rituals, and traditions, and uncover the stereotyped narratives which are reproduced in formal and informal spaces. We argue that these stereotyped narratives both pose a risk of activating stereotype threat in members of marginalized groups, and of normalizing and reproducing ideas of who belongs in computer science. We situate and discuss the complexity of institutional accountability in the context of a traditionally participatory and collegial model of governance. As a way forward we propose three principles for an equity-focused approach to accountability in computer science organizations: 1) Examine organizational traditions and spaces to critically evaluate challenges for inclusion; 2) Normalize critical reflection in the core practices of the organization; 3) Diversify and improve data collection.
... In the past two decades, the OSS development paradigm has become increasingly common in software development. This trend emerged, to some extent, as a result of changes in organizational practices (e.g., adoption of agile and distributed development approaches; Abrahamsson, Salo, Ronkainen, & Warsta, 2002; Dingsøyr, Nerur, Balijepally, & Moe, 2012) but was also driven by the popularity of software freedom among developers and self-described hackers (Coleman, 2013). Furthermore, online platforms for "social coding" have gained popularity among both organizations and individuals, promoting the temporal and geographic distribution of collaborative work in open source projects (Yu, Yin, Wang, & Wang, 2014). ...
In this study, we explore the future of work by examining differences in productivity when teams are composed of only humans or both humans and machine agents. Our objective was to characterize the similarities and differences between human and human–machine teams as they work to coordinate across their specialized roles. This form of research is increasingly important given that machine agents are becoming commonplace in sociotechnical systems and playing a more active role in collaborative work. One particular class of machine agents, bots, is being introduced to these systems to facilitate both taskwork and teamwork. We investigated the association between bots and productivity outcomes in open source software development through an analysis of hundreds of project teams. The presence of bots in teams was associated with higher levels of productivity and higher work centralization in addition to greater amounts of observed communication. The adoption of bots in software teams may have tradeoffs, in that doing so may increase productivity, but could also increase workload. We discuss the theoretical and practical implications of these findings for advancing human–machine teaming research. This research characterizes the changing landscape of collaborative work brought on by the increasing presence of autonomous machine agents in teams. We report on a multidimensional analysis of work in software development projects examining differences between human‐only and human‐bot teams.
... The free software movement originally came out of Richard Stallman's acknowledgment that the practice of sharing software, with its hacker culture heritage (Coleman 2013), was being increasingly 'enclosed' by commercial ventures. In order to create a legally binding form of resistance to these new enclosures, Stallman created the GPL (General Public License), which postulated that the copyright on a given piece of software is allocated by default to the developer and that the developer can license their software to an unlimited number of people. ...
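To make the licensing mechanism described in the excerpt above concrete (an illustration, not something drawn from the cited work): a developer, who holds the copyright by default, places a program under the GPL by attaching the Free Software Foundation's standard notice to each source file, thereby licensing the code to every recipient. A minimal example:

```python
# Hypothetical source file showing how a developer (the default copyright
# holder) licenses a program under the GPL using the FSF's standard notice.
#
# Copyright (C) <year>  <name of author>
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation, either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program.  If not, see <https://www.gnu.org/licenses/>.

def main() -> None:
    print("This copy, and every copy passed on, carries the same freedoms.")

if __name__ == "__main__":
    main()
```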
In this article, we analyze the productive role of aesthetics in organizing technoscientific work. Specifically, we investigate how aesthetic judgments form and inform code-writing practices at a large web services company in Russia. We focus on how programmers express aesthetic judgments about code and software design in everyday practice and explore how language with positive and negative valences is deployed. We find that programmers label code as “beautiful” without defining or establishing agreement about the term and are thereby able to maintain different ideals of beauty within the same organization. However, by learning how to avoid what senior developers deem to be “ugly” code, developers become socialized into producing code with a similar style and logic that we describe as “not ugly.” The fieldwork suggests that aesthetic language can function simultaneously as a mechanism that supports professional diversity within an organization and as a tool for producing consistencies in software design. Studying manifestations of both positive and negative aesthetic language in technoscientific work provides insight into professional practices and the various roles aesthetic language can play in organizational life.
The book presents the concept of digital colonialism along two dimensions: subjective and objective. Subjective because, in addressing the formal subsumption of processes of production and consumption, the authors highlight that cognition, knowledge, and social behavior are constituted through the mediation of racist algorithms present throughout social relations. Objective because they show that this process depends on the exploitation of natural resources located in territories of the Global South, which renews racism and racialization. Written with the authors' experience of popular education, the book flows with a clarity that makes it easy to understand. By combining concepts from historical and dialectical materialism with the anticolonial literature, the study offers a broad and rich discussion of digital colonialism and how we can confront it.
How should we understand alternative social media and open-source technologies that seek to challenge the dominance of Big Tech? Are these ethical substitutes for monopolistic platforms and technological infrastructures, or "alternative" in the sense we might talk of alternative forms of culture? Here we offer a new perspective on these questions by conceptualizing alternative tech through Bourdieu's theories of cultural production and distinctive consumption. Building on the work of Holm, Coleman and others, we explore the "techno-critical disposition" through a case study of A Traversal Network of Feminist Servers (ATNOFS), arguing that this is manifested primarily as "critical craftiness," or hacker aesthetics in a critical register. Finally, we consider how ATNOFS represents a "distinctive" path to the wider adoption of alternative platforms, as well as how the techno-critical disposition may be reconfiguring legitimacy in the broader field of technology production.
When it comes to climate crisis research, current debates are increasingly thematizing the needs but also the challenges of collaborative, transdisciplinary work. Geophysical characterizations of climate change are increasingly deemed insufficient to respond to the challenges that vulnerable communities face worldwide. In this paper, I describe the work of studying‐while‐caring for an environmental data infrastructure in order to address this issue. I suggest framing “data management” anthropologically as a question of collective stewardship that is better conceived as a “knowledge infrastructure” (Edwards 2010) instead of a formal approach to automated data curation. To examine the sociotechnical blindspots of data management, I elaborate on the anthropological concept of “infrastructural blues” based on the data engineering work I conducted. For the conclusion, I discuss the concept of “common” as a substitute for “open” technologies and address the broader implications of the proposed shift toward community stewardship and self‐determination as guiding practices for socio‐environmental data governance.
Revisiting the history of the development of software and communication technologies, this article demonstrates that while the early techno-utopian theories have been balanced by more sombre approaches, the emancipatory potential of productions whose outputs do not take the commodity form deserves further theoretical reflection. Social form and value-form literature provides a way to rethink publicly financed activities and activities of software communities as a variety of social forms of wealth and productions within capitalist social formations. Public wealth, it is argued, is a useful umbrella concept to approach the forms of wealth in the sphere of software, media and communication. With digitally storable matter, due to its replicability at near zero cost, it is of utmost importance that the state provides an institutional framework, primarily for capital, but also for public wealth, to be coded. In this setting, legal form, its content and function play a key role in the contested reproduction between forms of public wealth and capital.
The open data movement is often touted as a sweeping strategy to democratize science, promote diverse data reuse, facilitate reproducibility, accelerate innovation, and much more. However, the potential perils of open data are seldom examined and discussed in equal measure to these promises. As we continue to invest in open data, we need to study the full spectrum of what open data facilitates in practice, which can then inform future policy and design decisions. This paper aims to address this gap by presenting an investigative digital ethnography of one contrarian community, anthropogenic climate change (ACC) skeptics, to describe how they process, analyze, preserve, and share data. Skeptics often engage in data reuse similar to conventional data reusers, albeit for unconventional purposes and with varying degrees of trust and expertise. The data practices of ACC skeptics challenge the assumption that open data is universally beneficial. These findings carry implications for data repositories and how they might curate data and design databases with this type of reuse in mind.
In recent years, the use of video call and video conference tools has continued to increase; due to the COVID-19 pandemic in particular, the use of video calls grew in the educational and work spheres, but also in the family sphere, because of the risks of contagion in face-to-face meetings. Throughout the world, many older people are affected by hearing loss. Auditory functional diversity can make it difficult to enjoy video calls. Using automatic captions might help these people, but not all video calling tools offer this functionality, and some offer it only in certain languages. We developed an automatic conversation captioning tool using Automatic Speech Recognition and Speech to Text, based on the free software tool Coqui STT. This automatic captioning tool is independent of the video call platform used and allows older adults, or anyone with auditory functional diversity, to enjoy video calls in a simple way. A transparent user interface was designed for our tool that overlays the video call window, and the tool allows users to easily change the text size, color, and background settings. It is also important to remember that many older people have visual functional diversity and could therefore have problems reading the text, so it is important that each person can adapt the text to their needs. An analysis was carried out with older people to assess the benefits of the interface, as well as some configuration preferences, together with a proposal to improve the way the text is displayed on the screen. Spanish and English were tested during the investigation, but the tool makes it easy to install dozens of additional languages based on models trained for Coqui STT.
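The abstract gives no implementation details; as a minimal sketch of the underlying speech-to-text step (assuming the Coqui STT Python bindings, installed as the `stt` package, expose a DeepSpeech-style `Model` API, and with placeholder model, scorer, and audio file names), batch transcription of a 16 kHz mono fragment might look like this:

```python
# Minimal sketch (not the authors' implementation): batch transcription with
# Coqui STT, assuming the `stt` Python package and a 16 kHz mono 16-bit WAV.
import wave

import numpy as np
from stt import Model  # pip install stt (Coqui STT bindings)

MODEL_PATH = "english.tflite"   # placeholder paths: any Coqui STT model/scorer
SCORER_PATH = "english.scorer"

def transcribe(wav_path: str) -> str:
    """Return a caption string for a 16 kHz, mono, 16-bit PCM WAV file."""
    model = Model(MODEL_PATH)
    model.enableExternalScorer(SCORER_PATH)  # optional language-model rescoring

    with wave.open(wav_path, "rb") as wav:
        assert wav.getframerate() == 16000 and wav.getnchannels() == 1
        audio = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)

    return model.stt(audio)

if __name__ == "__main__":
    print(transcribe("call_fragment.wav"))
```

A live captioning overlay as described above would additionally need continuous audio capture from the call and a transparent GUI layer; those parts are omitted here.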
Everyday life is increasingly structured by programmed algorithms and computer code, while everyday life and people in turn shape them. But where exactly can computer code be encountered ethnographically in its modes of operation? It is never one thing, but always many. In this contribution we examine different methodological approaches to computer code in its various dimensions and illustrate them with case examples. We distinguish the following five dimensions: a) code as text and its inscriptions, b) computer code in its performance and execution, c) programming as practice: developing, tinkering, debugging, and hacking, d) infrastructuring with and through computer code, and e) governance and governmentality: governing with and through computer code. These perspectives are neither exclusive nor free of overlap; rather, thinking the different dimensions of code together as a sociotechnical assemblage allows for a deeper understanding. We discuss which dimensions may be helpful for which research interests and equally emphasize the need to reflect on the perspectival decisions we make in the research process.
Algorithms pervade our lives, affect and enable behaviors, and produce social relations in ways that are frequently unnoticed, difficult to trace, and raise interesting questions around agency, responsibility, and ethics. For anthropologists looking at the creation of algorithms, the technical and precise definitions from computer science are often no longer sufficient. One of their tasks is to ethnographically trace how people describe and/or conceptualize algorithms and how they perceive algorithmic effects on their daily lives or if they think about such things at all. From examination of “source code” to a wider analysis of computational cultures and the emergent social effects of algorithmic thinking, future anthropological work will, no doubt, seek to localize and pluralize the many cosmo‐techniques that are present in the world of computation. Through ethnographic descriptions, anthropologists make a valuable contribution to in‐depth understandings of algorithmic cultures as they arise from the field.
Direct action is strategic political activity that is designed to draw attention to or change political and cultural relationships by bypassing extant political institutions and processes. Direct action uses physical and virtual means that are outside of existing political institutions; it is often used by groups that are prohibited from or have limited access to legal or full participation in institutionalized political activities. Examples of direct action include vigils, blockades, wildcat strikes, demonstrations, the occupation of buildings and other spaces, and the destruction of property.
Over the last fifteen years, the development of blockchain technologies has attracted a large volume of professional expertise, capital investment and media attention. This burgeoning sector of technology practices has coalesced around a few major initiatives (Bitcoin, Ethereum), but it is still moving at a fast pace and its configuration is evolving. If this sector is marked by a variety of technological protocols, financial arrangements and organizational forms, it is also, we would argue, a site of social effervescence. Parties, meet-ups, and the sorts of informal socializing which gather around events and networks of all kinds function to endow the blockchain sector with the characteristics of what, in cultural analysis, are often called “scenes”. The aim of this special issue is to examine the interest of the notion of scene for the analysis of blockchain practices. We argue that the notion of scene may be mobilized as a useful analytical framework not only for the study of blockchain practices, but for that of technology practices more generally. In this introductory article, we ask the following questions: how can the notion of scene contribute to the understanding of blockchain practices? And what sort of research agenda does the notion point to? In the following sections we first identify some “scenic” components in blockchain phenomena. Then we review how media discourses and academic scholarship have framed these phenomena to show that the scene perspective is undertheorized in the context of technology-related social groupings. Finally, we propose a framework to analyse the main dimensions of blockchain scenes, before presenting the contributions to the special issue. With this special issue, we aim to establish a research agenda around technology scenes at the junction of STS and cultural analysis.
Responsible artificial intelligence guidelines ask engineers to consider how their systems might harm. However, contemporary artificial intelligence systems are built by composing many preexisting software modules that pass through many hands before becoming a finished product or service. How does this shape responsible artificial intelligence practice? In interviews with 27 artificial intelligence engineers across industry, open source, and academia, our participants often did not see the questions posed in responsible artificial intelligence guidelines to be within their agency, capability, or responsibility to address. We use Suchman's “located accountability” to show how responsible artificial intelligence labor is currently organized and to explore how it could be done differently. We identify cross-cutting social logics, like modularizability, scale, reputation, and customer orientation, that organize which responsible artificial intelligence actions do take place and which are relegated to low status staff or believed to be the work of the next or previous person in the imagined “supply chain.” We argue that current responsible artificial intelligence interventions, like ethics checklists and guidelines that assume panoptical knowledge and control over systems, could be improved by taking a located accountability approach, recognizing where relations and obligations might intertwine inside and outside of this supply chain.
This special issue shows that the intersection of software and human systems is necessarily academic and applied. The collected articles make clear that software needs anthropological analyses while demonstrating that understanding the impact of software requires technological expertise. Software and human systems meet in specific programs, institutions, and histories. For anthropologists, this intersection requires confronting the software “as data” dilemma—we gain analytical purchase by treating software as another type of data, yet this approach can also mean losing sight of the algorithmic work that software accomplishes in our everyday lives. Software is not mere data but has a variety to it akin to the diversity that anthropology has long interrogated. This introduction advocates for an approach based in “Human-Computer Assembled Networks”—recognizing how software and human systems work through specific interfaces, assemble technologies and resources and human actors, and function through networks both human and algorithmic. Overall, the collected articles demonstrate an array of ways to do applied anthropology, ranging from ethnographic work on specific projects, the development of specific software informed by anthropology, and critical engagements with histories of software and technology. Finally, this special issue advocates for a translational approach that is specifically positioned to mediate between academic and applied approaches.
This paper tells the story of Decide Madrid (Decide), a civic tech platform designed by Madrid’s municipality in 2015 in the spirit of the autonomous and hacker philosophies that spearheaded the Spanish Occupy or 15M movement. We develop a biographical account of Decide to show how the design of the platform was modified over the course of four years to accommodate shifting ideas of how digital infrastructures can channel, both online and offline, the social energies and technical rationalities of political participation. In particular, we identify three design assemblages of democratic participation: the public, the libre and the commons, whose orientations and configurations sometimes share, and sometimes diffract, different logics, logistics and locations of where and how democracy ought to be activated in the digital age. We further show how over time these modulations cultivated a view of democratic culture as an experimental process.
This paper contributes to political economic debates on the digital economy and software production by attending to the 'hybridity' of economic forms in digital markets. It adds to a nascent literature on hybridity by going close to the processes of code production. Methodologically, we ambitiously combine in-situ ethnography with netnographic observation and qualitative interviews in a thickly multi-situated analysis of Danish proprietary app developers' use of the code repository GitHub, commonly associated with free and open source software (F/OSS) projects. By attending to the situated practices of everyday coding within 'app-centric media,' we show how proprietary developers engage in hybrid practices that both align with, but are also partly at odds with, the overarching frame of commercial exchanges in which they operate. We argue that these practices form part of four boundary-crossing, salient modes of valuation within Danish app development, which at once destabilize and maintain traditional boundaries between proprietary and F/OSS code. While our analysis concerns a Danish app economy, it serves to demonstrate how hybridity beyond commercial exchange forms a fundamental part of software materiality, practices and values within situated digital markets, and is thereby crucial to grasping the valuations and mechanisms at work in furthering the digital attention economy.
Digital technologies induce organised immaturity by generating toxic sociotechnical conditions that lead us to delegate autonomous, individual, and responsible thoughts and actions to external technological systems. Aiming to move beyond a diagnostic critical reading of the toxicity of digitalisation, we bring Bernard Stiegler’s pharmacological analysis of technology into dialogue with the ethics of care to speculatively explore how the socially engaged arts—a type of artistic practice emphasising audience co-production and processual collective responses to social challenges—play a care-giving role that helps counter technology-induced organised immaturity. We outline and illustrate two modes by which the socially engaged arts play this role: 1) disorganising immaturity through artivism, most notably anti-surveillance art, that imparts savoir vivre, that is, shared knowledge and meaning to counter the toxic side of technologies while enabling the imagination of alternative worlds in which humans coexist harmoniously with digital technologies, and 2) organising maturity through arts-based hacking that imparts savoir faire, that is, hands-on knowledge for experimental creation and practical enactment of better technological worlds.
Social movement impact on democracy has primarily been treated in two ways in the literature: the role of social movements in promoting democratization in the form of regime change; and a more recent literature on the ways social movements initiate democratic innovation in governing institutions and norm diffusion in already existing democracy. In this article, we argue that to fully understand social movement impact on democracy, we need to look beyond these two main approaches, as important as they are. Using the emblematic case of Spain’s 15-M pro-democracy movement to illustrate our conceptual proposal, we draw on existing literature to argue that social movements can impact democracy in several key arenas currently not sufficiently considered in the literature. We provide examples of democratic impact emerging from the experimentation around the central problematic of ‘real democracy’ in the ‘occupied squares’ to highlight several ways social movements’ democratic impact might be explored. We develop the concepts of hybridity and democratic laboratory to analyze these impacts and discuss their relation to contemporary theorizing about democracy and movement outcomes. We argue that adopting this broader approach to the democratic impact of social movements leads to a more nuanced understanding of movement outcomes and ‘success’.
The text diagnoses two opposing tendencies in the research on algorithms: the first abstracts and unites heterogeneous developments under the term "algorithm"; the second emphasizes specifics such as data sets, material conditions, software libraries, interfaces, and so on, thus dissolving that which algorithms apparently do into more fine-grained analyses. The text proposes a research perspective that resolves this tension by conceiving of algorithms as a relation between the abstract and the concrete, which makes it possible to capture both in their interdependence. This approach is informed by two motives: first, the necessity to connect detailed analyses of specific information technologies with general political concerns; and second, the application of recent feminist critiques of epistemology to the analysis of algorithms. The ensuing relational perspective on algorithms is connected to the genealogy of algorithmic technology before being demonstrated with regard to the mutually complementary relationships: algorithms-materiality, algorithms-data, algorithms-code, and algorithms-interfaces.
Hacking emerged in the 1960s at university computer science laboratories in the United States where researchers and engineers developed innovative techniques of creating and modifying computer software. The field of hacking has since expanded globally to encompass network intrusion, free and open‐source software, cybercrime and cyberterrorism, hacktivism, civic hacking, and culture hacking.
Scholars have theorized the use of chance processes in modern art in general, and in computer‐based art in particular, as the expression of an aesthetics of nonintention and authorial abnegation. Although writers of computer‐generated poetry in the United States make extensive use of computer‐based randomization, their creative adventures with computational indeterminacy do not lead them to endorse an aesthetics of nonintention or authorial abnegation. Rather, they cultivate different forms of authorial intentionality and pursue different and often‐conflicting goals. These tensions result from computational indeterminacy's affordances in relation to the different and often‐conflicting cultural currents that inform contemporary US liberal subjectivity. These currents include calls to advance social justice and inclusivity; an emphasis on self‐determination; a willingness to approach technology as an inscrutable, autonomous, and self‐determining form of creative agency; and the celebration of rarefied aesthetic abstraction and experimentation as a form of expressive freedom.
Enkel Collective was established in late 2014 in the small post-industrial port city of Fremantle, near Perth in Western Australia, during the tail end of a decade-long 'mining boom' in a region that relies heavily on primary industries for its economic survival. Not long afterward, my conversations with co-founder Adam Jorlen began, and during an interview in late December 2018, we discussed the journey to date. Here, I reflect on our conversation, sharing portions from a unique sociotechnical experiment that responded to growing precarity by building and sustaining an organized network.
On June 2, 2014, workers in Hyderabad celebrated the creation of the 29th State in India. Chanting “Victory for Telangana” in ‘HITEC City,’ a neighborhood constructed through ‘special economic zones’ (SEZ) and ICT parks (Information and Communication Technologies), computer engineers of the association Engineers for a New State (ENS) wore the colors of the “Party for Telangana”. Following strikes and agitated protests, members of ENS enjoyed the official recognition of the new state. Born in Telangana, and anticipating the economic benefits promised by the new political entity in formation, these engineers projected that they would also enjoy a revitalization of Telangana culture promised by local politicians. Based on 19 months of fieldwork research between 2012 and 2019, the ethnography focuses on the observation of Hindu rituals named “State Festivals” — Bonalu and Bathukamma — as they act upon other mediatized activities supporting high-tech development (conferences and hackathons) in Hyderabad. These rituals propose a cultural cohesion legitimized by the new state and supporting the emergence of a regional identity — a process which reinforced the privileged position of ICT and the types of skills that are sought after by this industry. The article suggests that the renewal of these state rituals by these engineers is integrated into a set of sociotechnical strategies for an engaged citizenship. These strategies shape but also emerge from the construction of ICT infrastructures for the new Telangana state.
It is often assumed that the interests of users and developers coincide, sharing a common goal of good design. Yet users often desire functionality that goes beyond what designers, and the organisations they work in, are willing to supply. Analysing online forums, complemented with interviews, we document how users, hackers and software developers worked together to discover and apply system exploits in hardware and software. We cover four cases: users of CPAP breathing assistance machines getting access to their own sleep data, 'hacking' the Nintendo Switch game console to run non-authorised software, end-users building their own insulin supply system, and farmers repairing their own agricultural equipment against suppliers' terms and conditions. We propose the concept of the 'gulf of interests' to understand how differing interests can create conflicts between end-users, designers, and the organisations they work in. This points us in the direction of researching further the political and economic situations of technology development and use.
In Anthropology in the Meantime Michael M. J. Fischer draws on his real world, multi-causal, multi-scale, and multi-locale research to rebuild theory for the twenty-first century. Providing a history and inventory of experimental methods and frameworks in anthropology from the 1920s to the present, Fischer presents anthropology in the meantime as a methodological injunction to do ethnography that examines how the pieces of the world interact, fit together or clash, generate complex unforeseen consequences, reinforce cultural references, and cause social ruptures. Anthropology in the meantime requires patience, constant experimentation, collaboration, the sounding-out of affects and nonverbal communication, and the conducting of ethnographically situated research over longitudinal time. Perhaps above all, anthropology in the meantime is no longer anthropology of and about peoples; it is written with and for the people who are its subjects. Anthropology in the Meantime presents the possibility for creating new narratives and alternative futures.
Opportunities are often conceptualized as 'openings' in a social movement's external environment which can reduce the cost of collective action, while threat is a force which increases the cost of inaction. These concepts were originally formulated to describe the political contexts of traditional offline movements; how, then, are they perceived and framed by activists in the digital environment of data? This study utilizes qualitative thematic coding to examine the collective action frames in four years of archival texts, from 2009 to 2012, of two highly adept data activist mobilizations: the Digital Rights movement and the Anonymous hacktivist collective. Analyses show that frames of opportunity appear with far less frequency than threat. For both cases, volatile opportunity frames regarding their online actions are very rare, especially in the Anonymous texts. The study concludes by suggesting that highly leveraging the affordances of digital technology and data may lead to political opportunities losing some of their perceived salience as a mobilizing force, while cultural factors around data activism may simultaneously result in an increased perception of threats. This study shows the importance of understanding how data mediate contemporary collective action and calls for further development and synthesis of the structural and cultural aspects of both opportunity and threat in social movement theory.
The purpose of this article is to discuss an ethnography of code, specifically code ethnography, a method for examining code as a socio-technical actor, considering its social, political, and economic dynamics in the context of digital infrastructures. While it can be applied to any code, the article presents the results of code ethnography application in the study of internet interconnection dynamics, having the Border Gateway Protocol (BGP) as code and two of the largest internet exchange points (IXPs) in the world as points of data collection, DE-CIX Frankfurt, and IX.br São Paulo. The results show inequalities in the flows of information between the global North and the global South and concentration of power at the level of interconnection infrastructure hitherto unknown in the context of the political economy of the internet. Code ethnography is explained in terms of code assemblage, code literacy, and code materiality. It demonstrates the grammar of BGP in context, making its logical and physical dimensions visible in the analysis of the formation of giant internet nodes and infrastructural interdependencies in the circulation information infrastructure of the internet.
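To give a concrete sense of the "grammar" the author refers to (an illustration with hypothetical sample values, not data or code from the study): a BGP announcement pairs an IP prefix with an AS path, the ordered list of autonomous systems through which the route propagated, and even a few lines of parsing make its logical structure visible.

```python
# Illustrative only: the textual "grammar" of a BGP route announcement.
# Each announcement pairs an IP prefix with an AS path; the rightmost ASN
# originates the route. Prefixes and ASNs below are hypothetical sample values.
ANNOUNCEMENTS = [
    ("203.0.113.0/24", "6939 26162 264409"),
    ("198.51.100.0/24", "1299 3356 28573"),
]

def parse_as_path(path: str) -> list[int]:
    """Split a textual AS path into a list of ASNs, origin last."""
    return [int(asn) for asn in path.split()]

for prefix, path in ANNOUNCEMENTS:
    asns = parse_as_path(path)
    print(f"{prefix}: origin AS{asns[-1]}, {len(asns)} hops via {asns[:-1]}")
```

Tracing which ASNs recur in such paths at collectors located at large IXPs is one simple way to see, computationally, the kind of interconnection asymmetries between the global North and South that the article describes ethnographically.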
In recent decades, many tech spaces have emerged worldwide to promote innovation. Based on ethnographic research, this article examines one such initiative in Brazil—a public laboratory of digital fabrication located in a low‐income neighborhood in the periphery of São Paulo. While scholars have exposed the neoliberal aspects of fablabs, this article aims to de‐center hegemonic understandings of innovation by attending to its situated practices. Analyzing the techno‐optimist aspirations and institutional legacies behind this laboratory, I explain how the US‐based fablab model was reconfigured in light of community concerns and previous Latin American experiments of digital inclusion. Against a monolithic image of tech collectives, I show how lab workers cultivated a diverse range of audiences and creative practices, specifically those of working‐class women. The article concludes with a call for more anthropological attention to overlooked tech practices as a means to imagine fairer and more solidary forms of innovation. [computing, development, digital inclusion, innovation, technology]
I explain the success of open source software (OSS) from the perspectives of connectivism and the language-action framework. Reproducibility engenders trust, which we build via conversations, and OSS practices help us learn how to be more effective at learning together, contributing to the same goal.
This is the story of how Unix users hacked the European data networks competition in the 1980s, by using and appropriating common resources for their own experimental and operational needs while planting the seeds of participatory digital culture, both within and outside of the European Union's tech policy frame, both in prominent roles and behind the scenes of the "standards wars". I analyze how they were acknowledged among international computer networks, focusing on EUNET (1982–1992), a pioneering Internet provider in Europe. I show that Unix culture, beyond technical arguments, was reframed through institutional relationships, when EUNET was courted by the EU tech programs to help develop a digital infrastructure for the knowledge sector. Eventually, the organization contributed to but also profited from this alliance, before turning towards the new business of computerized telecom networks, illustrating the transfers between knowledge institutions and the knowledge economy.
This article explores the rhetorical and practical underpinnings of technological autonomy. It draws on research conducted around the activities of a training organization that promotes the self-construction of agricultural machinery, the Atelier Paysan. It first shows a plurality of modes of existence of autonomy: an autonomy with political depth, advocated by the trainers, and a more pragmatic autonomy for the farmers. It then highlights the importance of the tools and supplies mobilized during training sessions, which constitute the infrastructure of any construction and maintenance activity. The article thus argues for greater attention to the material dimension of processes of autonomization.
To compensate for their lack of internet access, Cuban video game enthusiasts and programmers have built vast grassroots computer networks, the biggest of which, SNET (Street Network), at one point connected tens of thousands of households across Havana. This vernacular infrastructure generated not only new means of access but also new relations between people and fostered new political subjectivities. SNET is heavily shaped by a local cultural ideology of resolver , of collectively navigating resources and limitations in a context of scarcity. Using the metaphor of modding (modifying), a communal practice within gaming cultures that describes alterations by players or fans that change the look or functionality of a video game, we show how SNET makers were forced to constantly adapt to the shifting technical, political, and social frameworks in Cuba. Expanding anthropological theory on infrastructures that shows how breakdown in many parts of the world is a constitutive part of how people experience them, we argue that makers of human infrastructures such as SNET must not only deal with material breakdown but also navigate the breakdown of social relationships.