Uppsala University
Analyzing algorithmic folds: Strategies for de-essentializing algorithms
May 2018 | Keywords: algorithms, social theory, infrastructures
In this paper, we argue that algorithms can be analyzed according to Deleuze’s notion of the baroque fold. The concept of folding describes an operation in which new proximities and juxtapositions are made between the exteriorities and interiorities of datasets, for example, as well as between pasts, presents, and futures. We use the fold to highlight the ability of algorithms to combine disparate data and data points to create new proximities between the universal, the normal, and the individual, such that, as the relations between them are remade, they too are refashioned. In six vignettes about contemporary uses of algorithms across settings, we show how conceiving of algorithms’ use of data as an operation of folding allows us to assess the variety of new juxtapositions and proximities that algorithms enable.
Lee, Francis, Jess Bier, Jeffrey Christensen, Lukas Engelmann, CF Helgesson, and Robin Williams. "Analyzing algorithmic folds: Strategies for de-essentializing algorithms." Working Paper. May 2018.
Styles of valuation: algorithms and agency in high throughput bioscience
Mar 2018 | Keywords: algorithms, valuations, infrastructures, bioscience
The biosciences are often proclaimed to be going through a data revolution based on high throughput technologies, online data sharing, and algorithmic tools for analysis. The aim of this article is to examine how actors’ valuations of different human/machine configurations are tied to broader struggles over what should count as good scientific practice in high throughput bioscience. In inquiring into the valuation of these different agential configurations, we identify four heuristic styles of valuation in the laboratory under study: a bioinformatic, a subjectivist, an experimentalist, and a trialist style. Specifically, the article traces how actors value two different algorithmic practices: randomization and normalization. Actors’ valuations of these two algorithms exemplify some broader tensions in current high-throughput biomedical research. The point is that an analysis of styles of valuation—with an emphasis on devices and practices—can shed light on broader shifts in the biosciences. What is deemed a good way of configuring agency is a matter of negotiating the yardsticks for value that actors deem salient in each situation.
Lee, Francis, and CF Helgesson. 2018. "Styles of valuation: algorithms and agency in high throughput bioscience." Working Paper.
Where is Zika? Four challenges of emerging knowledge infrastructures for pandemic surveillance
Sep 2017 | Keywords: algorithms, disease surveillance, infrastructures
Today, pandemics are increasingly known through novel and emerging digital knowledge infrastructures: for example, algorithms for disease classification, genetic and geographical information systems, and models of contagion, travel, or ecology. These digital knowledge infrastructures are constantly humming in disease control organizations across the globe. In the West, the US CDC, the WHO, and the European CDC endlessly monitor their screens, attempting to detect the next big outbreak of disease. These knowledge infrastructures are increasingly reshaping our global knowledge about disease and pandemics. Through these tools, new disease patterns become objects of intervention, new outbreaks become visible, and new ways of classifying the world come into being. The general purpose of this paper is to inquire into how emerging knowledge infrastructures—such as algorithms and modeling—shape knowledge production about pandemics. In doing so, the paper speaks to at least two overarching problems: first, how disease surveillance is reshaped by these emerging knowledge infrastructures; second, how algorithms and modeling enter into processes of knowledge production more generally. In engaging with these questions through the lens of disease surveillance, the paper outlines four challenges in dealing with algorithmic and modeled knowledge production.
Lee, Francis. 2017. "Where is Zika? Four challenges of emerging knowledge infrastructures for pandemic surveillance." Governing by prediction: models, data, and algorithms in and for governance, Paris, 11-13 Sep.
Analyzing algorithms: some analytical tropes
Nov 2016 | Keywords: algorithms, social theory, infrastructures
Algorithms are everywhere. Hardly a day passes without reports on the increasing digitalization and automation of society and culture. As we know, these processes are fundamentally based on algorithms (Kitchin 2012). Today, there is also a proliferation of research on the social aspects of algorithms: on census taking (Ruppert 2012), predicting and preventing crime (Ferguson 2017), credit assessment (Deville & Velden 2015), pricing water (Ballestero 2015), machine learning (Burrell 2016), email spam filters (Maurer 2013), and dating services (Roscoe & Chillas 2014), to mention a few. The focus of these researchers has in different ways been algorithms and their profound impact (cf. Kockelman 2013). However, in this algorithmic world, it seems to us that we are moving in a landscape of familiar tropes: technological hype, determinism, and evil technology run wild.
Lee, Francis, and Lotta Björklund Larsen. 2016. "Analyzing algorithms: some analytical tropes." Second Algorithm Studies Workshop, Stockholm, Sweden, 23-24 Feb.
Publications updated in May 2018