The escalating politics of 'Big Biology'
Javier Lezaun
Institute for Science, Innovation and Society, School of Anthropology and Museum Ethnography, University of
Oxford, 64 Banbury Road, Oxford, OX2 6PN, UK.
BioSocieties (2013) 8, 480–485. doi:10.1057/biosoc.2013.30; published online 21 October 2013
Constitutional Moments
As the history of twentieth-century science suggests, controversies about 'Big Science' are
vehicles for rethinking the relationship between science and society. Behind arguments
about size, scale or enhanced coordination, with all their managerial undertones, lie deep-
seated commitments to alternative visions of the political organization of knowledge.
The experiments in large-scale biology discussed in this collection are thus best seen as
'constitutional moments' (Jasanoff, 2011), occasions when the challenge of creating new and
extended research collectives forces an explication of the links between technical practices,
organizational architectures and ethical imaginaries.
As these articles make clear, however, constitutional moments in the contemporary life
sciences come in many shapes and forms. A healthy degree of scepticism towards claims of scale,
speed or novelty runs through this special section, and helps qualify many of the most
hyperbolic arguments about the nature and consequences of large biological research projects.
The very category of big biology, the editors point out, is in question from the start.
And yet, all the contributors describe processes of escalation (if not always the scaling up
of things). They are concerned, that is, with an amplification of the political reflexivity of
the scientific enterprise, brought about by the expansion, intensification or acceleration of
research efforts. What can we glean from these case studies, then, about the evolving constitution
of the life sciences in an era of large data infrastructures and a seemingly relentless push towards
greater and faster circulation of informational and biological materials? The following are some
open-ended reflections prompted by the many insights that crisscross this collection.
The Imperative to Share
A striking leitmotif of the large-scale biological research projects discussed in these articles is
the apparently generalized imperative to share. Whether it is data, mutant mice or DNA
sequences, these enterprises proclaim an economy of free and unfettered exchange, founded
on the willingness of researchers and institutions to put in circulation, often without direct or
explicit remuneration, the product of their work.
© 2013 Macmillan Publishers Ltd. 1745-8552 BioSocieties Vol. 8, 4, 480–485
This moralization of exchange is remarkable, and it is so central to the public self-image of
the contemporary life sciences that it is often taken for granted or accepted as an evidently
good thing. Our own inclination as social scientists to describe systems of exchange in the
sciences as 'moral economies' suggests a certain inability to fully unpack this category. What
do we mean by 'moral'? What is the underside of sharing? The articles lay bare some of the
qualifications and complexities that attend to these acts of apparent self-dispossession. One of
the most remarkable insights is that for something to become the object of such an economy of
free exchange, it has first to lose value; that the generalized obligation to share is supported by
very specific acts of devaluation.
For instance, in her analysis of contemporary initiatives for the production of mutant mice,
Gail Davies notes a diversity of sociological and spatial imaginaries, but a shared desire to
reduce the cost of research animals. In some cases, this is achieved by producing new
'economies of scale' through centralization and planning, as in the KOMP project. In others, it
is a matter of implementing new forms of 'distributed management' that include the
outsourcing or subcontracting of key components of the production process (mouse breeding,
embryo cryopreservation) to lower-cost countries. This is perhaps an obvious point, but it
bears emphasizing: the free dissemination of biological materials is here only possible once the
relevant object of circulation has been cheapened in significant ways. It is the capacity to add
(or restore) value to this now de-valued resource that creates the relevant asymmetries of
knowledge and power.
'Open source' models, like those of the BioBrick initiative in synthetic biology discussed by
Emma Frow, partake of this double act of value creation and suppression. 'As individual
biological parts typically have low value', Frow writes, describing the logic underpinning these
arrangements, 'there is little point in restricting access to them; promises of commodification
and value generation in synthetic biology lie further downstream, in the combination of
parts into biological devices and systems with useful properties' (Frow, this issue). Yet,
as she notes, the creation of low-value, freely exchangeable 'universal' tools requires stripping
biological parts of valuable traits, traits that mark the insertion of these objects
in local economies of research. The success of the 'open source' dream in synthetic biology
hinges on subordinating these local cultures of valuation to an overarching principle of free exchange.
A similar dynamic is at work in the data-intensive infrastructures described in several of
the articles. In fact, one can advance here a counter-intuitive definition of data. Rather than
being the critical commodity whose mere circulation adds value to, and constitutes the raison
d'être of, the network, data ought to be seen as the most radically de-valued entity, as the
very manifestation of worthlessness. It is only when these data are reinstated into specific
forms of labour and care (when data are collated, curated, interpreted and otherwise acted
upon) that such a thing acquires again the status of a meaningful and valuable asset.
I raise this point partly to call attention to an asymmetry in how the social sciences confront
the problem of value: we tend to pay a great deal of attention to the creation of value, much less
to its destruction. The articles in this collection invite us to embark on a study of the practices
of de-valuation in the contemporary life sciences (Leonelli, this issue; see also Muniesa, 2012).
What (and who) loses value in and through big biology? How are certain things and
capacities devalued (or prevented from acquiring new value) in the service of an economy of
'free access' and unrestricted circulation?
The Diminishing Outside
The editors conclude their introduction to the special section with an intriguing warning about
the 'diminishing outside' of large-scale biological research. As more and more actors and
resources are linked up in collaborative infrastructures, 'there is less and less scope to be left
outside its logics and reach, for scientists and perhaps for social scientists too' (Davies, Frow
and Leonelli, this issue).
In other words, how does large-scale research configure that which is not part of itself but is
nevertheless affected by the gravitational force exerted by these extraordinary concentrations
of resources? This question is crucial, as the editors suggest, for it is in that 'outside', in the
'other' of big biology, so to speak, that the capacity for radical critique will be founded.
Here it is useful again to read across the articles and between the concrete examples featured
in this collection. A theme that emerges from that reading is that large-scale biology often
understands its other (and the publics it purports to serve more generally) as an aggregate of
'users'; 'big biology' must be, first and foremost, user-friendly.
As Stephen Hilgartner recounts in his article, this was a critical rhetorical move in shoring
up the legitimacy of the Human Genome Project (HGP) against its early critics. The HGP
demarcated its scientific jurisdiction (and aligned it with the interests of 'ordinary biology') by
presenting itself as essentially a utility or service provider, an infrastructure able to produce
and disseminate 'raw data' for the benefit of the scientific community without imposing a
centralized hierarchy for the allocation of resources or the definition of relevant research
questions (see also Kevles, 1997).
The language of 'user-friendliness' has since become a hallmark of large-scale projects in the
life sciences. We know, however, that being a user is hardly an innocent epistemic or political
position, that it comes with its own constraints and commitments. The user is always
formatted in particular ways; 'usability' depends as much on the configuration of the user as
on the design of the product (cf. Oudshoorn and Pinch, 2003; Cupers, 2013).
The question could thus be put as follows: how does the 'user-friendliness' of large-scale
biology format its outside? The example of the HGP offers some historical perspective.
Contemporary post-genomic research practices have been deeply influenced by the HGP. The
HGP has achieved this influence not by imposing a particular model for the organization of
research across biology (the fear of its critics), or by answering or dissolving the long-standing
questions of 'ordinary biology' (the hope of its proponents). Rather, the HGP has configured
the post-genomic era by creating data sets and analytical instruments that have effectively
opened and foreclosed specific research horizons (Hilgartner, this issue). It has shaped our
collective imagination of what directions of enquiry are worth pursuing and which objects of
research are pragmatically tractable.
This tension between 'user-friendliness' and epistemic pluralism is explored in virtually all
the articles in this collection. There is clearly no a priori answer to the question of how a
particular 'big biology' project might impact the existing ecology of research problems and
practices. What we can do, as Sabina Leonelli argues in her contribution, is to turn this
question into a key criterion for the assessment of large-scale infrastructures in the life
sciences. That is, we can define scale not in terms of size or the amount of resources gathered
in one particular project, but, as she puts it, 'on the basis of the range and scope of the
biological and biomedical questions which it helps to address'. Such an analysis would open
the door to a consideration of the implications of new research infrastructures that goes
beyond deceptively uncontroversial references to the usability of their outputs.
The 'Stand-Ins' of Big Biology
There is another way of approaching the question of the 'outside' of large-scale research: an
'outside' that is not properly external to it, but that does nevertheless constitute a very
particular kind of other.
In their book The New Spirit of Capitalism, Luc Boltanski and Ève Chiapello describe
contemporary capitalist production as an economy organized around 'projects', rather
than firms or organizations, where value is created by an expansive logic of networking and
ever-greater connectivity. Their description resonates with many of the trends identified in
this collection: the 'project' as the relevant unit of organization and value creation, the
seeming obsolescence of traditional proprietary mechanisms, the devaluation or redefinition
of work in network environments, the premium on circulation and collaboration, and
so on.
In trying to describe the forms of exploitation characteristic of such an economy, Boltanski
and Chiapello identify the figure of the 'stand-in' (doublure). In a world where profit depends
on unfettered mobility across projects, they argue, exploitation takes the form of immobility.
The 'stand-ins' are those actors who must remain in situ while others circulate, and are thus
unable to capitalize on individual projects in order to enhance their own versatility and
mobility. They care for all that is rooted and situated in the nodes that make up the network,
and in so doing create the conditions for others to move and increase their own value. '[S]ome
people's immobility', Boltanski and Chiapello (2007) argue, 'is necessary for other people's
mobility' (p. 362).
Who are the 'stand-ins' of large-scale biological research? Certainly not the individuals
and collectives involved in 'little science', if such a thing still exists. It is rather those
who, despite being deeply involved in large-scale collaborative enterprises, fail to extract
the expected benefit from the intensification of circulations and the expansion of exchange.
'They are exploited', Boltanski and Chiapello (2007) write, 'in the sense that the role they
play as a factor in production does not receive the acknowledgment it merits; and that their
contribution to the creation of value added is not remunerated at the requisite level for its
distribution to be deemed fair' (p. 363). Far from being excluded from these expanding
networks of collaboration and sharing, these 'stand-ins' are essential to their workings:
their emplacement enables the user-friendliness and inter-operability of the network. Yet
they remain tied to specific locales, objects or practices, and as a result are unable to extract
the sort of mobility that constitutes the true surplus value of participation in these research networks.
To identify the 'stand-ins' of big biology we need to appreciate the forms of immobility that
attend to the human and (extending Boltanski and Chiapello's formulation) non-human
components of these networks. In other words, we need a vocabulary to re-specify what it
means to be or remain in situ in a world where mobility and circulation seem to be the measure
of all things.
For one, it is clear that immobility does not designate here a lack of spatial displacement.
Large-scale biological research draws on the work of many peripatetic actors (sample
collectors, instrument calibrators, IT support technicians, auditors) whose labours, however,
remain invisible in descriptions of the enterprise focused on laboratory production or data analysis.
Other figures of immobility can be glimpsed in other examples of 'big biology' discussed in
these articles, whether it is data curators, subcontracted lab technicians, or the research
subjects recruited to provide biological samples or donate their bodies for clinical research
(cf. Kelly, 2011). They are immobile in the sense that they are structurally prevented from
turning their involvement in any particular project into a dynamic of accumulation.
New Sociological Imaginaries
Finally, this collection is unique, I think, in its attention to the sociological imaginaries and
critical capacities that are embedded in (or excluded from) large-scale enterprises in the
contemporary life sciences. Because of their explicit attention to the creation of new collectives
(the process of political escalation to which I referred earlier), these endeavours often open up
spaces for the involvement of social scientists ('the main problems we have to deal with are
sociological', observes one of Gail Davies's informants). Several of the contributors note in
passing their own involvement in the governance or assessment of the projects they describe.
What to make of these opportunities to engage critically with big biology?
The background of any invitation to add critical value to large-scale biological research
is the experience gained through the HGP. To put it in simple terms: the HGP gave us ELSI. This
was a form of appraisal that operated by a sort of critical outsourcing: the societal evaluation
of genomics and its 'implications' was carried out by academics in their capacity as paid expert
advisers, often along strict disciplinary lines (the ethical, the legal and the social, typically in
that order).
The HGP and its ELSI model of subcontracted critique belong to an era dominated by
claims and counter-claims of genetic determinism. It has been exported to other areas of
emerging technoscience (most notably in our context, and not surprisingly, to synthetic
biology). Yet post-HGP 'big biology' is giving rise to new forms and parameters of critical
engagement. Jane Calvert, for instance, notes the absence of ELSI-like forums in systems
biology. Questions about the societal value of this new science are couched instead in the
language of 'grand challenges', a discourse that tends to follow top-down, policy-centric
paths, but often enlarges the range of issues under consideration.
This is just to suggest that, as Kaushik Sunder Rajan eloquently argues in his commentary,
social scientists will need to invent a different set of tactics of engagement in this post-genomic
world. For one, our own sociological imaginaries are bound to change as we confront
scientific enterprises driven by seemingly congenial understandings of complexity, relationality
or interdisciplinarity. This will hopefully be a world open to a proliferation of
constitutional experiments in the governance of scientific collaborations, in which the
boundary between academic inquiry and activist critique will be increasingly porous, and
where the definition of scientific and social matters of concern will be subject to the sort of
robust debate that should come with the creation of new research collectives.
About the Author
Javier Lezaun is James Martin Lecturer in Science and Technology Governance at the Institute
for Science, Innovation and Society, University of Oxford. He is currently directing
BioProperty, a research programme that explores the practices of exchange and appropriation
in the contemporary life sciences.
References
Boltanski, L. and Chiapello, E. (2007) The New Spirit of Capitalism. London: Verso.
Cupers, K. (2013) Use Matters: An Alternative History of Architecture. London: Routledge.
Jasanoff, S. (2011) Constitutional moments in governing science and technology. Science and Engineering Ethics 17(4): 621–638.
Kelly, A.H. (2011) Will he be there? Mediating malaria, immobilizing science. Journal of Cultural Economy 4(1): 65–79.
Kevles, D. (1997) Big science and big politics in the United States: Reflections on the death of the SSC and the life of the Human Genome Project. Historical Studies in the Physical and Biological Sciences 27(2): 269–297.
Muniesa, F. (2012) A flank movement in the understanding of valuation. The Sociological Review 59(s2).
Oudshoorn, N. and Pinch, T. (2003) How Users Matter: The Co-Construction of Users and Technologies. Cambridge, MA: The MIT Press.
485© 2013 Macmillan Publishers Ltd. 1745-8552 BioSocieties Vol. 8, 4, 480485
... Large-scale public-private consortia are increasingly being employed to facilitate translational research in the life sciences (Lezaun 2013). Contemporary examples include the Quebec Consortium for Drug Discovery (CQDM), several networks created under the 'Investments for the Future' programme of the French National Research Agency (ANR), and the Innovative Medicine's Initiative (IMI), a joint venture between the European Commission and the European federation of pharmaceutical industries and associations (Efpia). ...
... These flows are not unproblematic; they require work to cross the boundaries, for example between institutions, disciplines, or sectors. As Lezaun (2013) notes large collaborative ventures actually require scientific materials and data to be devalued by the scientists who produce them, in order to make them freely shareable with others who might, in other circumstances, be competitors. These factors, especially taken together, raise the question of whether the existing norms, patterns and traditions of collaboration in various life sciences fields are likely to support or hinder working in large-scale public-private consortia. ...
... This paper aims to contribute to the burgeoning social science literature on 'big biology' (Calvert 2010;Davies et al. 2013;Lezaun 2013;Vermeulen et al. 2013) by exploring the potential impact of large public-private consortia in the field of stem cell research. This will involve drawing on data from a series of qualitative interviews conducted with stem cell scientists working on the IMI StemBANCC project. ...
Full-text available
The rise of ‘big biology’ is bringing academic and industrial scientists together in large consortia to address translational challenges in the life sciences. In order to assess the impact of this change, this paper examines the existing norms and styles of collaboration in one high profile translational domain; stem cell research. Data is drawn from qualitative interviews with academic and industry scientists working in a large European stem cell research project. Respondents discussed what they perceived as the main benefits and risks of collaborative research, what styles of collaboration they were familiar with, and what collaborative work in stem cell science normally involves. A wide range of materials, data, and expertise can be exchanged during collaborative work. Informal collaborations are governed by an ethos of reciprocity and mediated by trust while formal project agreements can provide a safe space for sharing between unfamiliar partners. These characteristics make stem cell research well suited to pre-competitive public-private ventures but translation of new products to market may be more challenging.
... On the one hand, the scale of finance and resource investment in large-scale hiPSC banks positions hiPSC as a highly valued and promising technology. On the other, such open science efforts to build large-scale common scientific resources also requires devaluing of specific data and biomaterialsin this case hiPSC linesto allow them to be shared among scientists and companies who might otherwise be in competition (Lezaun, 2013;Meskus, 2018). How, then, do practitioners understand the value of recent investments in largescale biobanks for hiPSC and the worth that they produce? ...
... Producing this knowledge will take a considerable amount of time and labour. The apparent paradox of devaluing cell lines and data that have taken time and effort to produce by making them available on a not-for-profit basis (Lezaun, 2013) makes sense when it is understood that an open access resource can also act as a strategy for outsourcing this discovery work to the European academic community. ...
Full-text available
The StemBANCC consortium was a large European consortium bringing together scientists from academic institutions and the pharmaceutical industry. The aim of the consortium was to produce 1500 induced pluripotent stem cell (iPSC) lines from participants with a variety of common, complex diseases. These cell lines were intended to help develop iPSC as tools for screening small molecule drug candidates and modelling human diseases in vitro. The scale of investment in this and other biobanks presents iPSC as a valuable commodity. However, StemBANCC was also mandated to make its making cells and data available to European researchers on a not-for-profit basis. To understand how making this quantity of stem cells and data available is configured as a valuable and worthwhile investment for the consortium partners, research materials (project documents, scientific literature and interviews with scientists working on StemBANCC) were analysed using theoretical tools from Valuation Studies. Combining STS and economic sociology, valuation studies analyse how the worth of things, ideas and phenomena result from context-specific practices of assessment and evaluation; worth incorporates moral, financial, scientific, economic. social and cultural registers of value. In this sense, practitioners evaluate the worth of the StemBANCC cells and data at a variety of sites, from participant recruitment to online databases. This provides an alternative to biovalue and similar conceptual models for theorising the generation of value in the Life Sciences.
... This analytical frame was instrumental in uncovering monopolisation practices at play in research consortia, and led to the observation that in order to create value from the abundance of data within these scientific infrastructures, teams construct their assets as scarce. While contemporary big biology is marked by the generalised imperative to share data to create abundance (Lezaun, 2013, Lezaun andMontgomery, 2015), collaborative endeavours within research consortia are in fact built around forms of exclusions: exclusion of those outside the collaboration who do not own valuable properties that can be shared and exclusion of those inside the collaboration from using the 'shared' resources. This paper also contributes to discussions on valuation, providing both empirical and analytical grounding to the question of what gets valued and how in the knowledge economy. ...
Drawing upon ethnographic findings from an epigenetics research laboratory in the United Kingdom, this article explores practices of research collaborations in the field of epigenetics, and epigenomics research consortia in particular. I demonstrate that research consortia are key scientific infrastructures that enable the aggregation of masses of data deemed necessary for the production of results and the fostering of epistemic value. Building on Science and Technology Studies (STS) scholarship on value production, and the concept of asset, I show that the production of valuable research within epigenomics research consortia rests on the active organisation and management of abundance and scarcity. It involves shaping and standardising the masses of data gathered in consortia, while it also entails research teams enclosing their data within their laboratories’ walls. As they do so, research teams construct data into scarce and monopolised assets, which they can put to productive use in collaborative endeavours against a revenue. In addition to contributing empirical and critical insights into the ways epigenetics knowledge is formed and negotiated in specific research contexts, this article offers conceptual tools to examine and problematise knowledge production practices in data-intensive research more broadly. In particular, it points out that while contemporary big biology is marked by the generalised imperative to ‘share’ data and ‘open’ science, collaborative endeavours within research consortia are built around forms of exclusions.
... The underacknowledged caveat about mapping the omes is that they in theory encompass all possible biological interactions, but in practice they are mapped and related to particular bodies, places, and times. It is in this way that the 'big biology' postgenomics has come to signify does not necessarily mean that the local and the contextual does not matter Lezaun 2013;Rajan 2013). ...
Full-text available
For many geographers, postgenomics is a relatively new perspective on biological causality. It is a non-dualistic way to conceptualize DNA, genes and environment. It also presents an opportunity for a broad critical engagement with biology through geography’s insights into socionature and the fallacies of spatial inference. In postgenomics, mapping of the spatial and temporal contexts and circumstances surrounding DNA, rather than DNA sequence alone, has become prioritized. Consequently, scientific and economic value in postgenomics accrues through the enclosure and mapping of the ‘omes’. These include the more familiar epigenome and microbiome, but also the interactome, the phenome, and the exposome among many others. The omes represent the cartographic translation of biological spatialities that modify the outcomes of DNA sequence from within as well as from outside of human bodies. In this article, we show how postgenomics leverages this omic ontologicalization of space and puts it to productive use. Drawing upon recent studies of the human microbiome, we illustrate how problematic geographies of difference arise through the way this omic mapping unfolds.
... The underacknowledged caveat about mapping the omes is that they in theory encompass all possible biological interactions, but in practice they are mapped and related to particular bodies, places, and times. It is in this way that the 'big biology' postgenomics has come to signify does not necessarily mean that the local and the contextual does not matter Lezaun 2013;Rajan 2013). ...
... A number of scholars, particularly in the history and philosophy of biology, have studied the organization, movement, and reuse of data with databases (Ankeny and Leonelli, 2011;Bauer, 2008;Borgman, 2012;Hine, 2006;Leonelli, 2012Leonelli, , 2013a. In addition, scholars have examined the regulatory and institutional configurations in which the use and value of data develop (Jasanoff, 2002;Keating and Cambrosio, 2012) -increasingly in relation to the open data and open science movements (Borgman, 2012;Leonelli, 2013b;Lezaun, 2013) -as well as the standards and forms of objectivity that enable such uses and values (Beaulieu, 2001(Beaulieu, , 2004Frow, 2012;Mackenzie et al., 2013;Porter, 1995). ...
This ethnographic study, based on fieldwork at the Computational and Systems Medicine laboratory at Imperial College London, shows how researchers in the field of metabolomics--the post-genomic study of the molecules and processes that make up metabolism--enact and coproduce complex views of biology with multivariate statistics. From this data-driven science, metabolism emerges as a multiple, informational and statistical object, which is both produced by and also necessitates particular forms of data production and analysis. Multivariate statistics emerge as 'natural' and 'correct' ways of engaging with a metabolism that is made up of many variables. In this sense, multivariate statistics allow researchers to engage with and conceptualize metabolism, and also disease and processes of life, as complex entities. Consequently, this article builds on studies of scientific practice and visualization to examine data as material objects rather than black-boxed representations. Data practices are not merely the technological components of experimentation, but are simultaneously technologies and methods and are intertwined with ways of seeing and enacting the biological world. Ultimately, this article questions the increasing invocation and role of complexity within biology, suggesting that discourses of complexity are often imbued with reductionist and determinist ways of thinking about biology, as scientists engage with complexity in calculated and controlled, but also limited, ways.
Full-text available
The 21st century has witnessed the emergence of in silico reproduction alongside the familiar in vitro reproduction (e.g. IVF), as increasingly large and automatically‐generated data sets have come to play an instrumental role in assisted reproduction. The article addresses this datafication of reproduction by analysing time‐lapse embryo imaging, a key data‐driven technology for embryo selection in IVF cycles. It discusses the new forms of knowledge and value creation enabled by data‐driven embryo selection and positions this technology as a harbinger of a wider datafication of (reproductive) health. By analysing the new ways of seeing embryos with ‘in silico vision,’ the ‘data generativity’ of developing embryos and the patenting of embryo selection algorithms, I argue that this datafied method of embryo selection may not just result in more or less ‘IVF success,’ but also affects the conceptualisation and commercialisation of the assisted reproductive process. In doing so, I highlight how the datafication of reproduction both reflects and reinforces a consolidating trend in the fertility sector—characterised by mergers resulting in larger fertility chains, online platforms organising fertility care and expanded portfolios of companies aiming to cover each step of the IVF cycle.
What are the epistemological and political contours of evidence today? This introduction to the special issue lays out key shifts in the contemporary politics of knowledge and describes the collective contribution of the six papers as an articulation of what we describe as a 'new empiricism', exploring how earlier historical appeals to evidence to defend political power and decision-making both chime with and differ from those of the contemporary era. We outline some emerging empirical frontiers in the study of instruments of calculation, from the evolution of the randomized controlled trial (RCT) to the growing importance of big data, and explore how these methodological transformations intersect with the alleged crisis of expertise in the 'post-truth' era. In so doing, we suggest that the ambiguity of evidence can be a powerful tool in itself, and we relate this ambiguity to the ideological commitment and moral fervour that are elicited through appeals to, and the performance of, evaluation.
Big Data promises to heal many of the complex problems in healthcare. Departing from conventional ways of thinking about what constitutes relevant health data and how to analyze it, data-intensive resourcing entails different practices of collecting, sorting, circulating, and interpreting data, while making it available to multiple users. In the process, big data and the infrastructures installed to support it are creating new forms of value and reordering relationships, as distinctions between medical and nonmedical data, and between research and care, are blurred. This paper situates big data in healthcare within social-political and economic conditions in the U.S., and charts emerging assemblages in order to make visible the cultural work that big data does in healthcare.
This paper focuses on an unsettling example of experimental labour: the Human Landing Catch (HLC). The HLC is a cheap and reliable technique for producing data on mosquito densities in a defined area. It requires only a human volunteer to sit overnight with his legs exposed, a headlamp to spot mosquitoes, and a rubber tube and plastic cup to catch them as they come to feed on him. The HLC formed the central methodological and operational strategy of a malaria control programme that took place in Dar es Salaam, funded by the Bill and Melinda Gates Foundation. This paper analyses the epistemic and economic value of this experimental scenario by examining in detail the work it entails. In conceptualizing the different species of productivity associated with the HLC, of particular interest is the surprising fact of the volunteer's being there. The paper argues that the interplay of mobility and immobility offers a way to rethink the value of research within interlocking circulations of capital, science, mosquitoes and men.
The Human Genome Project and the physicists' project for a proton-antiproton accelerator were both launched by the federal government of the United States, yet took different directions at the end of the Cold War.
The sociological understanding of valuation often starts with an idea of value as something a thing has by virtue of how people consider it (that is, as socially constructed: a convention, a social representation, a projection). At some point, however, analysis also often draws a contrast between this sort of appraisal and some other type of value that the thing may have as a result of its own condition (what it costs, how it is made, with what kind of labour, money and materials, what it is worth in relation to objective standards and fundamental metrics). Dissatisfaction with this binary approach has been expressed in various quarters, but the pragmatist contribution of John Dewey provides a particularly useful resource with which to engage with the subject. This article reviews some aspects of this dissatisfaction, with a focus on the pragmatist idea of valuation considered as an action. I discuss this idea in relation to financial valuation, referring in particular to early pedagogical materials on corporation finance developed in the context of the professionalization of business administration. Finally, I elaborate on the usefulness of a pragmatist stance for understanding financial valuation today.
A century after the publication of Max Weber's The Protestant Ethic and the "Spirit" of Capitalism, a major new work examines network-based organization, employee autonomy and post-Fordist horizontal work structures.
Scholars in science and technology studies (STS) have recently been called upon to advise governments on the design of procedures for public engagement. Any such instrumental function should be carried out consistently with STS's interpretive and normative obligations as a social science discipline. This article illustrates how such threefold integration can be achieved by reviewing current US participatory politics against a 70-year backdrop of tacit constitutional developments in governing science and technology. Two broad cycles of constitutional adjustment are discerned: the first enlarging the scope of state action as well as public participation, with liberalized rules of access and sympathetic judicial review; the second cutting back on the role of the state, fostering the rise of an academic-industrial complex for technology transfer, and privatizing value debates through increasing delegation to professional ethicists. New rules for public engagement in the United States should take account of these historical developments and seek to counteract some of the anti-democratic tendencies observable in recent decades.
Cupers, K. (2013) Use Matters: An Alternative History of Architecture. London: Routledge.