Chalmers University
Ontological Overflows and the Politics of Absence
Mar 2022 | Keywords: algorithms, social theory, disease surveillance, infrastructures, actor-network theory
This paper suggests that STS needs to start attending to what I dub ontological overflows. My argument is that the focus on construction and enactment stories in STS has led us to take over the matters of concern of our interlocutors: our informants’ concerns and objects have become our concerns and objects. I argue that we have taken for granted which objects should be attended to, cared for, and analyzed. Our theories and methods have thus constituted a particular blindness to those objects that our informants do not care for—the objects at the edges of the network, the smooth rhizomatic spaces, the blank figures, the neglected things, the undiscovered continents, the plasma. The paper thus joins the ongoing discussion about the ontological politics of invisibility, partial knowledge, and fractionality, and asks how STS can attend to the making of the absent, weak, and invisible. What would happen if we started paying attention to these ontological overflows in practice? By tracing how multiple absences are produced, the paper shows the usefulness of caring for the othered objects, of following the making of alterity and otherness. The argument is that tracing ontological overflows opens up an understanding of how tangential objects are dis-assembled, and consequently allows us to trace how absence, alterity, and otherness are made in practice.
Lee, Francis. 2022. “Ontological Overflows and the Politics of Absence.” SocArXiv. https://doi.org/10.31235/osf.io/e5ju9
Enacting the Pandemic: Analyzing Agency, Opacity, and Power in Algorithmic Assemblages
Nov 2020 | Keywords: algorithms, social theory, disease surveillance, infrastructures, bioscience, actor-network theory
This article has two objectives: First, the article seeks to make a methodological intervention in the social study of algorithms. Second, the article traces ethnographically how an algorithm was used to enact a pandemic, and how the power to construct this disease outbreak was moved around through an algorithmic assemblage. The article argues that there is a worrying trend to analytically reduce algorithms to coherent and stable objects whose computational logic can be audited for biases to create fairness, accountability, and transparency (FAccT). To counter this reductionist and determinist tendency, the article proposes three methodological rules that allow an analysis of algorithmic power in practice. Empirically, the article traces the assembling of a recent epidemic at the European Centre for Disease Prevention and Control—the Zika outbreak starting in 2015—and shows how an epidemic was put together using an array of computational resources, with very different spaces for intervening. A key argument is that we, as analysts of algorithms, need to attend to how multiple spaces for agency, opacity, and power open and close in different parts of algorithmic assemblages. The crux of the matter is that actors experience different degrees of agency and opacity in different parts of any algorithmic assemblage. Consequently, rather than auditing algorithms for biased logic, the article shows the usefulness of examining algorithmic power as enacted and situated in practice.
Lee, Francis. 2021. “Enacting the Pandemic.” Science & Technology Studies 34 (1): 65–90. https://doi.org/10.23987/sts.75323.
How should we theorize algorithms? Five ideal types in analyzing algorithmic normativities
Aug 2019 | Keywords: algorithms, social theory, infrastructures
The power of algorithms has become a familiar topic in society, media, and the social sciences. It is increasingly common to argue that, for instance, algorithms automate inequality, that they are biased black boxes that reproduce racism, or that they control our money and information. Implicit in many of these discussions is that algorithms are permeated with normativities, and that these normativities shape society. The aim of this editorial is twofold: First, it contributes to a more nuanced discussion about algorithms by discussing how we, as social scientists, think about algorithms in relation to five theoretical ideal types. For instance, what does it mean to go under the hood of the algorithm, and what does it mean to stay above it? Second, it introduces the contributions to this special theme by situating them in relation to these five ideal types. By doing this, the editorial aims to contribute to an increased analytical awareness of how algorithms are theorized in society and culture. The articles in the special theme deal with algorithms in different settings, ranging from farming, schools, and self-tracking to AIDS, nuclear power plants, and surveillance. The contributions thus explore, both theoretically and empirically, different settings where algorithms are intertwined with normativities.
Lee, Francis, and Lotta Björklund Larsen. 2019. “How Should We Theorize Algorithms? Five Ideal Types in Analyzing Algorithmic Normativities.” Big Data & Society 6 (2).
Algorithms as folding: Reframing the analytical focus
Aug 2019 | Keywords: algorithms, social theory, infrastructures
This article proposes an analytical approach to algorithms that stresses operations of folding. The aim of this approach is to broaden the common analytical focus on algorithms as biased and opaque black boxes, and to instead highlight the many relations that algorithms are interwoven with. Our proposed approach thus highlights how algorithms fold heterogeneous things: data, methods and objects with multiple ethical and political effects. We exemplify the utility of our approach by proposing three specific operations of folding—proximation, universalisation and normalisation. The article develops these three operations through four empirical vignettes, drawn from different settings that deal with algorithms in relation to AIDS, Zika and stock markets. In proposing this analytical approach, we wish to highlight the many different attachments and relations that algorithms enfold. The approach thus aims to produce accounts that highlight how algorithms dynamically combine and reconfigure different social and material heterogeneities as well as the ethical, normative and political consequences of these reconfigurations.
Lee, Francis, Jess Bier, Jeffrey Christensen, Lukas Engelmann, Claes-Fredrik Helgesson, and Robin Williams. 2019. “Algorithms as Folding: Reframing the Analytical Focus.” Big Data & Society 6 (2).
Styles of Valuation: Algorithms and Agency in High-throughput Bioscience
Jul 2019 | Keywords: algorithms, valuations, infrastructures, bioscience
In science and technology studies today, there is a troubling tendency to portray actors in the biosciences as “cultural dopes” and technology as having monolithic qualities with predetermined outcomes. To remedy this analytical impasse, this article introduces the concept styles of valuation to analyze how actors struggle with valuing technology in practice. Empirically, this article examines how actors in a bioscientific laboratory struggle with valuing the properties and qualities of algorithms in a high-throughput setting and identifies the copresence of several different styles. The question that the actors struggle with is what different configurations of algorithms, devices, and humans are “good bioscience,” that is, what do the actors perform as a good distribution of agency between algorithms and humans? A key finding is that algorithms, robots, and humans are valued in multiple ways in the same setting. For the actors, it is not apparent which configuration of agency and devices is more authoritative nor is it obvious which skills and functions should be redistributed to the algorithms. Thus, rather than tying algorithms to one set of values, such as “speed,” “precision,” or “automation,” this article demonstrates the broad utility of attending to the multivalence of algorithms and technology in practice.
Lee, Francis, and Claes-Fredrik Helgesson. 2019. “Styles of Valuation: Algorithms and Agency in High-Throughput Bioscience.” Science, Technology, & Human Values, July 30, 2019.
Ordering society and nature: some elements of a sociology of algorithms
Oct 2018 | Keywords: algorithms, disease surveillance, valuations, infrastructures, actor-network theory
This article is an intervention in the sociology of classification and valuation. The article proposes four metaphors for analyzing how algorithms, modeling, or data practices shape practices of classification and valuation. Elaborating on classic work in actor-network theory, these metaphors highlight four ways in which algorithms can come to shape the practical ordering of society and nature. These metaphors urge the sociologist to pay attention to moments of algorithmic bifurcation, syncopation, absenting, and intervention. These moments highlight the intertwining of judgment and computation, the folding of time and space, the importance of absences, and the variable spaces for intervention afforded by algorithmic infrastructures. Using these metaphors, the article analyzes how the Current Zika State—the classification of the world into classes of disease intensity—was algorithmically assembled at the European Centre for Disease Prevention and Control.
Lee, Francis. 2018. “Ordering society and nature: some elements of a sociology of algorithms.” Working paper, October 2018. Available at https://francislee.org/texts.html.
Where is Zika? Four challenges of emerging knowledge infrastructures for pandemic surveillance
Sep 2017 | Keywords: algorithms, disease surveillance, infrastructures
Today, pandemics are increasingly known through novel and emerging digital knowledge infrastructures: for example, algorithms for disease classification, genetic and geographical information systems, and models of contagion, travel, or ecology. These digital knowledge infrastructures are constantly humming in different disease control organizations across the globe. In the West, the US CDC, the WHO, and the European CDC are endlessly monitoring their screens, attempting to detect the next big outbreak of disease. These knowledge infrastructures are increasingly reshaping our global knowledge about disease and pandemics. Through these tools, new disease patterns become objects of intervention, new outbreaks become visible, and new ways of classifying the world come into being. The general purpose of this paper is to inquire into how emerging knowledge infrastructures—such as algorithms and modeling—shape knowledge production about pandemics. In doing this, the paper speaks to at least two overarching problems: first, how disease surveillance is reshaped by these emerging knowledge infrastructures; second, how algorithms and modeling enter into processes of knowledge production more generally. In engaging with these questions through the lens of disease surveillance, the paper outlines four challenges in dealing with algorithmic and modelled knowledge production.
Lee, Francis. 2017. “Where is Zika? Four challenges of emerging knowledge infrastructures for pandemic surveillance.” Paper presented at Governing by Prediction: Models, Data, and Algorithms in and for Governance, Paris, 11–13 September.
Analyzing algorithms: some analytical tropes
Nov 2016 | Keywords: algorithms, social theory, infrastructures
Algorithms are everywhere. Hardly a day passes without reports on the increased digitalization and automation of society and culture. As we know, these processes are fundamentally based on algorithms (Kitchin 2012). Today, there is also a proliferation of research on the social aspects of algorithms: on census taking (Ruppert 2012), predicting and preventing crime (Ferguson 2017), credit assessment (DeVille & Velden 2015), pricing water (Ballestero 2015), machine learning (Burrell 2016), email spam filters (Maurer 2013), and dating services (Roscoe & Chillas 2014), to mention a few. The focus of these researchers has, in different ways, been algorithms and their profound impact (cf. Kockelman 2013). However, in this algorithmic world, it seems to us that we are moving in a landscape where we find familiar tropes of technological hype, determinism, and evil technology run wild.
Lee, Francis, and Lotta Björklund Larsen. 2016. “Analyzing algorithms: some analytical tropes.” Paper presented at the Second Algorithm Studies Workshop, Stockholm, Sweden, 23–24 February.