Södertörn University
Computational universalism, or, Attending to relationalities at scale
Feb 2025 | Keywords: algorithms, social theory, artificial intelligence
The social sciences and humanities have increasingly adopted computational terminology as the organizing categories for inquiry. By organizing research around vernacular computational objects (e.g., data, algorithms, or AI) and divided worldly domains (e.g., finance, health, and governance), scholars risk obscuring the universalizing practices and ambitions of computation. These practices seek to establish new relationalities at unprecedented scales, connecting disparate domains, circulating resources across boundaries, and positioning computational interventions as universally applicable. Drawing on intellectual traditions that inspect the fixity of universalizing claims, we problematize the easy adoption of computational categories and argue that they serve as epistemic traps that naturalize the expanding reach of computational universalism. Instead of accepting the hardened categories of our interlocutors, we propose attending to the partial, effortful, and often contested work of translation and commensuration that enables computational actors to position themselves as obligatory passage points across all domains.
Lee, Francis, and David Ribes. 2025. "Computational universalism, or, Attending to relationalities at scale." Social Studies of Science. https://doi.org/10.1177/03063127251345089
The ontological politics of synthetic data: Normalities, outliers, and intersectional hallucinations
Jan 2025 | Keywords: algorithms, social theory, artificial intelligence
Synthetic data is increasingly used as a substitute for real data for ethical, legal, and logistical reasons. However, the rise of synthetic data also raises critical questions about its entanglement with the politics of classification and the reproduction of social norms and categories. This paper aims to problematize the use of synthetic data by examining how its production is intertwined with the maintenance of certain worldviews and classifications. We argue that synthetic data, like real data, is embedded with societal biases and power structures, leading to the reproduction of existing social inequalities. Synthetic data can have crucial ontological consequences and contribute to the reproduction of social facts and categories, such as class, race, gender, or age. Through empirical examples, we demonstrate how synthetic data tends to highlight majority elements as the 'normal' and minimize minority elements, and that changes to the data structures that create synthetic data will inevitably result in what we term 'intersectional hallucinations.'
Lee, Francis, Saghi Hajisharif, and Ericka Johnson. 2025. "The ontological politics of synthetic data: Normalities, outliers, and intersectional hallucinations." Big Data & Society 12(1). https://doi.org/10.1177/20539517251318289
Reassembling Agency: Epistemic Practices in the Age of Artificial Intelligence
Jan 2025 | Keywords: algorithms, social theory, artificial intelligence, actor-network theory
This article reflects on how sociology can analyse the role of artificial intelligence (AI) in scientific practice without buying into the current AI hype. Drawing on sensibilities developed in actor-network theory (ANT), it introduces the concept of 'agencing' (agency as a verb) which refers to how scientists debate and configure human and machine agency. It suggests that we can come to a more nuanced understanding of the effects of AI in science by attending to actors' agencing practices. By discussing three ideal types of agencing, the article argues that AI should not be regarded as a rupture in the tooling and practices of science, but rather as a continuation of long-standing patterns of practice.
Lee, Francis. 2025. "Reassembling Agency: Epistemic Practices in the Age of Artificial Intelligence." Sociologisk Forskning 62(1-2): 43-58. https://doi.org/10.37062/sf.62.27824
Experiences of Digitized Valuation
May 2024 | Keywords: valuations, algorithms
This editorial introduces the second part of the theme issue on Digitizing Valuation. It explores how digitization is not simply a process of turning existing valuation instruments and practices into code, but rather can have unique implications for how social order is established, challenged, and maintained. The contributions in this issue examine the lived experiences of digitized valuation across different domains.
Lee, Francis, Andrea Mennicken, Jacob Reilley, and Malte Ziewitz. 2024. "Experiences of Digitized Valuation." Valuation Studies 11(1): 1-8. https://doi.org/10.3384/vs.2001-5992.2024.11.1.1-8
Caring for the Monstrous Algorithm: Attending to Wrinkly Worlds and Relationalities in an Algorithmic Society
Mar 2024 | Keywords: algorithms, social theory, artificial intelligence
This text proposes that we, social analysts of algorithms, need to develop a split vision for the algorithm-as-technological-object and the algorithm-as-assemblage in order to effectively attend to, analyze, and critique algorithms in society. The point of departure is that we need to distance ourselves from a simplified and reductive understanding of algorithms-as-objects, and care for them as part of a relational algorithmic assemblage. A simplified notion of algorithms is problematic for two reasons: First, it produces a reductive notion of the world where decision-makers point to algorithms-as-objects to simplify decisions about the world. Second, by taking a simplified and delineated object called 'algorithm' as the point of departure for analysis and critique in an algorithmic society, we risk producing technologically deterministic understandings of complex phenomena.
Lee, Francis. 2024. "Caring for the Monstrous Algorithm: Attending to Wrinkly Worlds and Relationalities in an Algorithmic Society." In Care in Times of (AI) Crisis, edited by Karl Palmås. Springer. https://doi.org/10.1007/978-3-031-52049-5_5
Ontological overflows and the politics of absence: Zika, disease surveillance, and mosquitos
Jan 2024 | Keywords: algorithms, social theory, disease surveillance, infrastructures, actor-network theory
This paper suggests that STS needs to start attending to what I dub ontological overflows. My argument is that the focus on construction and enactment stories in STS has led to us taking over the matters of concern of our interlocutors: our informants' concerns and objects become our concerns and objects. I argue that we have taken for granted which objects should be attended to, cared for, and analyzed. Thus, our theories and methods have constituted a particular blindness to those objects that our informants do not care for—the objects at the edges of the network, the smooth rhizomatic spaces, the blank figures, the neglected things, the undiscovered continents, the plasma. The paper thus joins the ongoing discussion about the ontological politics of invisibility, partial knowledge, and fractionality, and asks how STS can attend to the making of the absent, weak, and invisible. What would happen if we started paying attention to these ontological overflows in practice? By tracing how multiple absences are produced, the paper shows the usefulness of caring for the othered objects, of following the making of alterity and otherness. The argument is that the tracing of ontological overflows opens up an understanding of how tangential objects are dis-assembled, and consequently of how absence, alterity, and otherness are made in practice.
Lee, Francis. 2024. "Ontological overflows and the politics of absence: Zika, disease surveillance, and mosquitos." Science as Culture 33(3): 1-26. https://doi.org/10.1080/09505431.2023.2291046
Digitizing Valuation
Dec 2022 | Keywords: valuations, algorithms
This editorial introduces the theme issue on Digitizing Valuation. We suggest that digitization is not simply a process of turning existing valuation instruments and practices into code. Rather, digitizing valuations can have unique implications for how social order is established, challenged, and maintained. The contributions in this issue explore how digital technologies are reshaping practices of valuation across different domains, from algorithmic trading to digital health records.
Lee, Francis, Andrea Mennicken, Jacob Reilley, and Malte Ziewitz. 2022. "Digitizing Valuation." Valuation Studies 9(1): 1-10. https://doi.org/10.3384/VS.2001-5992.2022.9.1.1-10
Enacting the Pandemic: Analyzing Agency, Opacity, and Power in Algorithmic Assemblages
Nov 2020 | Keywords: algorithms, social theory, disease surveillance, infrastructures, bioscience, actor-network theory
This article has two objectives: First, the article seeks to make a methodological intervention in the social study of algorithms. Second, the article traces ethnographically how an algorithm was used to enact a pandemic, and how the power to construct this disease outbreak was moved around through an algorithmic assemblage. The article argues that there is a worrying trend to analytically reduce algorithms to coherent and stable objects whose computational logic can be audited for biases to create fairness, accountability, and transparency (FAccT). To counter this reductionist and determinist tendency, the article proposes three methodological rules that allow an analysis of algorithmic power in practice. Empirically, the article traces the assembling of a recent epidemic at the European Centre for Disease Prevention and Control—the Zika outbreak starting in 2015—and shows how an epidemic was put together using an array of computational resources, with very different spaces for intervening. A key argument is that we, as analysts of algorithms, need to attend to how multiple spaces for agency, opacity, and power open and close in different parts of algorithmic assemblages. The crux of the matter is that actors experience different degrees of agency and opacity in different parts of any algorithmic assemblage. Consequently, rather than auditing algorithms for biased logic, the article shows the usefulness of examining algorithmic power as enacted and situated in practice.
Lee, Francis. 2021. "Enacting the Pandemic: Analyzing Agency, Opacity, and Power in Algorithmic Assemblages." Science & Technology Studies 34(1): 65-90. https://doi.org/10.23987/sts.75323
How should we theorize algorithms? Five ideal types in analyzing algorithmic normativities
Aug 2019 | Keywords: algorithms, social theory, infrastructures
The power of algorithms has become a familiar topic in society, media, and the social sciences. It is increasingly common to argue that, for instance, algorithms automate inequality, that they are biased black boxes that reproduce racism, or that they control our money and information. Implicit in many of these discussions is that algorithms are permeated with normativities, and that these normativities shape society. The aim of this editorial is double: First, it contributes to a more nuanced discussion about algorithms by discussing how we, as social scientists, think about algorithms in relation to five theoretical ideal types. For instance, what does it mean to go under the hood of the algorithm and what does it mean to stay above it? Second, it introduces the contributions to this special theme by situating them in relation to these five ideal types. By doing this, the editorial aims to contribute to an increased analytical awareness of how algorithms are theorized in society and culture. The articles in the special theme deal with algorithms in different settings, ranging from farming, schools, and self-tracking to AIDS, nuclear power plants, and surveillance. The contributions thus explore, both theoretically and empirically, different settings where algorithms are intertwined with normativities.
Lee, Francis, and Lotta Björklund Larsen. 2019. "How should we theorize algorithms? Five ideal types in analyzing algorithmic normativities." Big Data & Society 6(2).
Algorithms as folding: Reframing the analytical focus
Aug 2019 | Keywords: algorithms, social theory, infrastructures
This article proposes an analytical approach to algorithms that stresses operations of folding. The aim of this approach is to broaden the common analytical focus on algorithms as biased and opaque black boxes, and to instead highlight the many relations that algorithms are interwoven with. Our proposed approach thus highlights how algorithms fold heterogeneous things: data, methods and objects with multiple ethical and political effects. We exemplify the utility of our approach by proposing three specific operations of folding—proximation, universalisation and normalisation. The article develops these three operations through four empirical vignettes, drawn from different settings that deal with algorithms in relation to AIDS, Zika and stock markets. In proposing this analytical approach, we wish to highlight the many different attachments and relations that algorithms enfold. The approach thus aims to produce accounts that highlight how algorithms dynamically combine and reconfigure different social and material heterogeneities as well as the ethical, normative and political consequences of these reconfigurations.
Lee, Francis, Jess Bier, Jeffrey Christensen, Lukas Engelmann, Claes-Fredrik Helgesson, and Robin Williams. 2019. "Algorithms as folding: Reframing the analytical focus." Big Data & Society 6(2).
Styles of Valuation: Algorithms and Agency in High-throughput Bioscience
Jul 2019 | Keywords: algorithms, valuations, infrastructures, bioscience
In science and technology studies today, there is a troubling tendency to portray actors in the biosciences as “cultural dopes” and technology as having monolithic qualities with predetermined outcomes. To remedy this analytical impasse, this article introduces the concept styles of valuation to analyze how actors struggle with valuing technology in practice. Empirically, this article examines how actors in a bioscientific laboratory struggle with valuing the properties and qualities of algorithms in a high-throughput setting and identifies the copresence of several different styles. The question that the actors struggle with is what different configurations of algorithms, devices, and humans are “good bioscience,” that is, what do the actors perform as a good distribution of agency between algorithms and humans? A key finding is that algorithms, robots, and humans are valued in multiple ways in the same setting. For the actors, it is not apparent which configuration of agency and devices is more authoritative nor is it obvious which skills and functions should be redistributed to the algorithms. Thus, rather than tying algorithms to one set of values, such as “speed,” “precision,” or “automation,” this article demonstrates the broad utility of attending to the multivalence of algorithms and technology in practice.
Lee, Francis, and Claes-Fredrik Helgesson. 2019. "Styles of Valuation: Algorithms and Agency in High-throughput Bioscience." Science, Technology, & Human Values.
Analyzing algorithms: some analytical tropes
Nov 2016 | Keywords: algorithms, social theory, infrastructures
Algorithms are everywhere. Hardly a day passes without reports on the increased digitalization and automation of society and culture. As we know, these processes are fundamentally based on algorithms (Kitchin 2012). Today, there is also a proliferation of research on the social aspects of algorithms: on census taking (Ruppert 2012), predicting and preventing crime (Ferguson 2017), credit assessment (Deville & Velden 2015), pricing water (Ballestero 2015), machine learning (Burrell 2016), email spam filters (Maurer 2013), and dating services (Roscoe & Chillas 2014), to mention a few. The focus of these researchers has in different ways been algorithms and their profound impact (cf. Kockelman 2013). However, in this algorithmic world, it seems to us that we are moving in a landscape where we find familiar tropes of technological hype, determinism, and evil technology run wild.
Lee, Francis, and Lotta Björklund Larsen. 2016. "Analyzing algorithms: some analytical tropes." Second Algorithm Studies Workshop, Stockholm, Sweden, 23-24 February.
Publications updated in Jan 2026