Chalmers University
Against Bioethicalization: Value Disjunctures in the History of Fetal Research
May 2022 | Keywords: social theory, disease surveillance, valuations, infrastructures
This paper has two aims. First, it proposes to sensitize our analytical minds to what we dub “value disjunctures”: clashes, in practice, between different values. What happens if we highlight the periods and situations when versions of the world are pulled apart? Second, it aims to show that today’s bioethics can be read neither as a tale of the democratization of ethics, nor as a tale driven solely by ethical disasters. What we offer is a story of how the bioethical yardsticks of today became dominant in fetal research. Our sensitizing toolkit helps to shine a light on how bioethicalization is a historical process that intertwines what is good with which objects are seen as important, as well as how these objects are understood. Bioethicalization is partly a struggle over ethics, over which yardsticks for the good become salient, but also a struggle over which objects should be ethicalized, as well as over the nature of the ethicalized objects. The valuographical contribution highlights how all matters of value (the ethical, the epistemic, and the economic) are intertwined with changing fetal ontologies.
Lee, Francis, Isa Dussauge, and Solveig Jülich. 2022. "Against Bioethicalization: Value Disjunctures in the History of Fetal Research." SocArXiv.
Ontological Overflows and the Politics of Absence
Mar 2022 | Keywords: algorithms, social theory, disease surveillance, infrastructures, actor-network theory
This paper suggests that STS needs to start attending to what I dub ontological overflows. My argument is that the focus on construction and enactment stories in STS has led us to take over the matters of concern of our interlocutors. Our informants’ concerns and objects become our concerns and objects. I argue that we have taken for granted which objects should be attended to, cared for, and analyzed. Thus, our theories and methods have constituted a particular blindness to those objects that our informants do not care for—the objects at the edges of the network, the smooth rhizomatic spaces, the blank figures, the neglected things, the undiscovered continents, the plasma. The paper thus joins the ongoing discussion about the ontological politics of invisibility, partial knowledge, and fractionality, and asks how STS can attend to the making of the absent, weak, and invisible. What would happen if we started paying attention to these ontological overflows in practice? By tracing how multiple absences are produced, the paper shows the usefulness of caring for the othered objects, of following the making of alterity and otherness. The argument is that tracing ontological overflows opens up an understanding of how tangential objects are dis-assembled, and consequently allows us to trace how absence, alterity, and otherness are made in practice.
Lee, Francis. 2022. “Ontological Overflows and the Politics of Absence.” SocArXiv.
Detecting the unknown in a sea of knowns: health surveillance, knowledge infrastructures, and the quest for classification egress
Jan 2022 | Keywords: social theory, disease surveillance, valuations, infrastructures
The sociological study of knowledge infrastructures and classification has traditionally focused on the politics and practices of classifying things or people. However, actors’ work to escape dominant infrastructures and pre-established classification systems has received little attention. In response, this article argues that it is crucial to analyze not only the practices and politics of classification, but also actors’ work to escape dominant classification systems. The article has two aims. First, to make a theoretical contribution to the study of classification by proposing to pay analytical attention to practices of escaping classification, what the article dubs classification egress. This concept directs our attention not only to the practices and politics of classifying things, but also to how actors work to escape or resist classification systems in practice. Second, the article aims to increase our understanding of the history of quantified and statistical health surveillance. To this end, the article investigates how actors in health surveillance assembled a knowledge infrastructure for surveilling, quantifying, and detecting unknown patterns of congenital malformations in the wake of the Thalidomide disaster in the early 1960s. The empirical account centers on the actors’ work to detect congenital malformations and to escape the dominant nosological classification of diseases, the International Classification of Diseases (ICD), by replacing it with a procedural standard for the reporting of symptoms. Thus, the article investigates how actors deal with the tension between the-already-known-and-classified and the unknown-unclassified-phenomenon in health surveillance practice.
Lee, Francis. 2022. "Detecting the unknown in a sea of knowns: health surveillance, knowledge infrastructures, and the quest for classification egress." SocArXiv.
Sensing Salmonella: modes of sensing and the politics of sensing infrastructures
Aug 2021 | Keywords: social theory, disease surveillance, valuations, infrastructures
The intent of this chapter is twofold: First, it aims to add to the vocabulary for analyzing the politics of sensing infrastructures. Drawing on post-actor-network theory sensibilities, the chapter introduces the concept of style of inference in order to analyze the politics of how different sensing infrastructures apprehend the world (cf. Fujimura & Chou; Hacking). Paraphrasing Adrian Mackenzie: different sensing infrastructures have very different ways of navigating the steadily increasing tidal wave of data—and we need to understand how these differences are integrated with society at large (Mackenzie). An important argument of the chapter is thus that different styles of inference are more or less compatible with a wider political and organizational context. For example, web searches on flu symptoms are not fully trusted as evidence of flu outbreaks in the healthcare system; the style of inferring flu intensity is not stabilized. The chapter therefore contends that there is a need to understand how sensing infrastructures have different styles of inference, and how these are differently compatible with governmental action and politics. Thus, the argument is that different styles of inference are deeply implicated in a politics of sensing.
Lee, Francis. “Sensing Salmonella: Modes of Sensing and the Politics of Sensing Infrastructures.” In Sensing In/Security: Sensors as Transnational Security Infrastructures, edited by Nina Witjes, Nikolaus Pöchhacker, and Geoffrey C. Bowker, 97–131. London: Mattering Press, 2021.
Enacting the Pandemic: Analyzing Agency, Opacity, and Power in Algorithmic Assemblages
Nov 2020 | Keywords: algorithms, social theory, disease surveillance, infrastructures, bioscience, actor-network theory
This article has two objectives: First, the article seeks to make a methodological intervention in the social study of algorithms. Second, the article traces ethnographically how an algorithm was used to enact a pandemic, and how the power to construct this disease outbreak was moved around through an algorithmic assemblage. The article argues that there is a worrying trend to analytically reduce algorithms to coherent and stable objects whose computational logic can be audited for biases to create fairness, accountability, and transparency (FAccT). To counter this reductionist and determinist tendency, the article proposes three methodological rules that allow an analysis of algorithmic power in practice. Empirically, the article traces the assembling of a recent epidemic at the European Centre for Disease Prevention and Control—the Zika outbreak starting in 2015—and shows how an epidemic was put together using an array of computational resources, with very different spaces for intervening. A key argument is that we, as analysts of algorithms, need to attend to how multiple spaces for agency, opacity, and power open and close in different parts of algorithmic assemblages. The crux of the matter is that actors experience different degrees of agency and opacity in different parts of any algorithmic assemblage. Consequently, rather than auditing algorithms for biased logic, the article shows the usefulness of examining algorithmic power as enacted and situated in practice.
Lee, Francis. “Enacting the Pandemic.” Science & Technology Studies 34, no. 1 (2021): 65–90.
How should we theorize algorithms? Five ideal types in analyzing algorithmic normativities
Aug 2019 | Keywords: algorithms, social theory, infrastructures
The power of algorithms has become a familiar topic in society, media, and the social sciences. It is increasingly common to argue that, for instance, algorithms automate inequality, that they are biased black boxes that reproduce racism, or that they control our money and information. Implicit in many of these discussions is that algorithms are permeated with normativities, and that these normativities shape society. The aim of this editorial is twofold: First, it contributes to a more nuanced discussion about algorithms by discussing how we, as social scientists, think about algorithms in relation to five theoretical ideal types. For instance, what does it mean to go under the hood of the algorithm, and what does it mean to stay above it? Second, it introduces the contributions to this special theme by situating them in relation to these five ideal types. In doing so, the editorial aims to contribute to an increased analytical awareness of how algorithms are theorized in society and culture. The articles in the special theme deal with algorithms in different settings, ranging from farming, schools, and self-tracking to AIDS, nuclear power plants, and surveillance. The contributions thus explore, both theoretically and empirically, different settings where algorithms are intertwined with normativities.
Lee, Francis, and Lotta Björklund Larsen. “How Should We Theorize Algorithms? Five Ideal Types in Analyzing Algorithmic Normativities.” Big Data & Society 6, no. 2 (July 1, 2019).
Algorithms as folding: Reframing the analytical focus
Aug 2019 | Keywords: algorithms, social theory, infrastructures
This article proposes an analytical approach to algorithms that stresses operations of folding. The aim of this approach is to broaden the common analytical focus on algorithms as biased and opaque black boxes, and to instead highlight the many relations that algorithms are interwoven with. Our proposed approach thus highlights how algorithms fold heterogeneous things: data, methods and objects with multiple ethical and political effects. We exemplify the utility of our approach by proposing three specific operations of folding—proximation, universalisation and normalisation. The article develops these three operations through four empirical vignettes, drawn from different settings that deal with algorithms in relation to AIDS, Zika and stock markets. In proposing this analytical approach, we wish to highlight the many different attachments and relations that algorithms enfold. The approach thus aims to produce accounts that highlight how algorithms dynamically combine and reconfigure different social and material heterogeneities as well as the ethical, normative and political consequences of these reconfigurations.
Lee, Francis, Jess Bier, Jeffrey Christensen, Lukas Engelmann, Claes-Fredrik Helgesson, and Robin Williams. 2019. “Algorithms as Folding: Reframing the Analytical Focus.” Big Data & Society 6 (2).
Styles of Valuation: Algorithms and Agency in High-throughput Bioscience
Jul 2019 | Keywords: algorithms, valuations, infrastructures, bioscience
In science and technology studies today, there is a troubling tendency to portray actors in the biosciences as “cultural dopes” and technology as having monolithic qualities with predetermined outcomes. To remedy this analytical impasse, this article introduces the concept of styles of valuation to analyze how actors struggle with valuing technology in practice. Empirically, this article examines how actors in a bioscientific laboratory struggle with valuing the properties and qualities of algorithms in a high-throughput setting and identifies the copresence of several different styles. The question that the actors struggle with is what different configurations of algorithms, devices, and humans are “good bioscience,” that is, what do the actors perform as a good distribution of agency between algorithms and humans? A key finding is that algorithms, robots, and humans are valued in multiple ways in the same setting. For the actors, it is not apparent which configuration of agency and devices is more authoritative, nor is it obvious which skills and functions should be redistributed to the algorithms. Thus, rather than tying algorithms to one set of values, such as “speed,” “precision,” or “automation,” this article demonstrates the broad utility of attending to the multivalence of algorithms and technology in practice.
Lee, Francis, and Claes-Fredrik Helgesson. “Styles of Valuation: Algorithms and Agency in High-Throughput Bioscience.” Science, Technology, & Human Values, July 30, 2019.
Standardizing abnormality: congenital malformations and statistical disease surveillance
Nov 2018 | Keywords: disease surveillance, infrastructures, actor-network theory
This chapter investigates the birth of the Swedish register for congenital malformations, and the expectations that were ascribed to statistical disease surveillance during this time. The chapter focuses on different articulations of the belief in statistical knowledge and standardization of congenital malformations as an early warning system that could prevent new medical disasters.
Lee, Francis. 2018. "Standardizing abnormality: congenital malformations and statistical disease surveillance." Working Paper, November 2018.
Ordering society and nature: some elements of a sociology of algorithms
Oct 2018 | Keywords: algorithms, disease surveillance, valuations, infrastructures, actor-network theory
This article is an intervention in the sociology of classification and valuation. The article proposes four metaphors for analyzing how algorithms, modeling, or data practices shape practices of classification and valuation. Elaborating on classic work in actor-network theory, these metaphors highlight four ways in which algorithms can come to shape the practical ordering of society and nature. These metaphors urge the sociologist to pay attention to moments of algorithmic bifurcation, syncopation, absenting, and intervention. These moments highlight the intertwining of judgment and computation, the folding of time and space, the importance of absences, and the variable spaces for intervention afforded by algorithmic infrastructures. Using these metaphors, the article analyzes how the Current Zika State—the classification of the world into classes of disease intensity—was algorithmically assembled at the European Centre for Disease Prevention and Control.
Lee, Francis. 2018. "Ordering society and nature: some elements of a sociology of algorithms." Working Paper, October 2018.
Where is Zika? Four challenges of emerging knowledge infrastructures for pandemic surveillance
Sep 2017 | Keywords: algorithms, disease surveillance, infrastructures
Today, pandemics are increasingly known through novel and emerging digital knowledge infrastructures: for example, algorithms for disease classification, genetic and geographical information systems, as well as models of contagion, travel, or ecology. These digital knowledge infrastructures are constantly humming in different disease control organizations across the globe. In the West, the US CDC, the WHO, and the European CDC are endlessly monitoring their screens, attempting to detect the next big outbreak of disease. These knowledge infrastructures are increasingly reshaping our global knowledge about disease and pandemics. Through these tools, new disease patterns become objects of intervention, new outbreaks become visible, and new ways of classifying the world come into being. The general purpose of this paper is to inquire into how emerging knowledge infrastructures—such as algorithms and modeling—shape knowledge production about pandemics. In doing this, the paper speaks to at least two overarching problems: first, how disease surveillance is reshaped by these emerging knowledge infrastructures; second, how algorithms and modeling enter into processes of knowledge production more generally. In engaging with these questions through the lens of disease surveillance, the paper outlines four challenges in dealing with algorithmic and modeled knowledge production.
Lee, Francis. 2017. "Where is Zika? Four challenges of emerging knowledge infrastructures for pandemic surveillance." Governing by prediction: models, data, and algorithms in and for governance, Paris, 11-13 Sep.
Analyzing algorithms: some analytical tropes
Nov 2016 | Keywords: algorithms, social theory, infrastructures
Algorithms are everywhere. Hardly a day passes without reports on the increased digitalization and automation of society and culture. As we know, these processes are fundamentally based on algorithms (Kitchin 2012). Today, there is also a proliferation of research on the social aspects of algorithms: on census taking (Ruppert 2012), predicting and preventing crime (Ferguson 2017), credit assessment (DeVille & Velden 2015), pricing water (Ballestero 2015), machine learning (Burrell 2016), email spam filters (Maurer 2013), and dating services (Roscoe & Chillas 2014), to mention a few. The focus of these researchers has, in different ways, been algorithms and their profound impact (cf. Kockelman 2013). However, in this algorithmic world, it seems to us that we are moving in a landscape of familiar tropes: technological hype, determinism, and evil technology run wild.
Lee, Francis, and Lotta Björklund Larsen. 2016. "Analyzing algorithms: some analytical tropes." Second Algorithm Studies Workshop, Stockholm, Sweden, 23-24 feb.
A Sociology of Treason: The Construction of Weakness
Jan 2014 | Keywords: social theory, infrastructures, actor-network theory
The process of translation has both an excluding and an including character. The analysis of actor networks, the mobilization of alliances, and the construction of networks is a common and worthwhile focus. However, the simultaneous betrayals, dissidences, and controversies are often only implied in network construction stories. We aim to nuance the construction aspect of actor–network theory (ANT) by shining the analytical searchlight elsewhere, where the theoretical tools of ANT have not yet systematically ventured. We argue that we need to understand every process of translation in relation to its simultaneous process of treason, and to add antonyms for Callon’s problematization, interessement, enrollment, and mobilization. This enables us to describe powerlessness not as a state but as a process. Our case focuses on the network building around measures for disabled people in the construction of the Athens Metro during the period 1991–1993. The discussion highlights the efforts of disability organizations to intervene in the initial construction works of the metro project, and the simultaneous actions of the Greek government to exclude disability organizations from the design process and to disrupt the accessibility-metro actor network.
Galis, Vasilis, and Francis Lee. 2014. "A Sociology of Treason: The Construction of Weakness." Science, Technology, & Human Values 39 (1):154-179.