Drawing data together: Inscriptions, asylum, and scripts of security

Science, Technology, and Human Values, Online First.

Sarah Perret and Claudia Aradau (2023)

Data have become a vital device of border governance and security. Recent scholarship on the datafication of borders and migration at the intersection of science and technology studies and critical security studies has privileged concepts attuned to messiness, contingency, and friction such as data assemblages and infrastructures. This paper proposes to revisit and expand the analytical vocabulary of script analysis to understand what comes to count as data, what forms of data come to matter, and how “drawing data together” reconfigures power and agency at Europe’s borders. Empirically, we analyze controversies about the practices of asylum decision-making and age assessment in Greece. We show that the agency of “users” is unequally distributed through anticipations of subscription and dis-inscription, while asylum seekers are conscripted within security scripts that restrict their agency. Moreover, as a multiplicity of inscriptions are produced, migrants’ claims can be disqualified through circumscriptions of data and ascriptions of expertise.

Digital–Nondigital Assemblages: Data, Paper Trails, and Migrants’ Scattered Subjectivities at the Border. International Political Sociology, 17(3).
Lucrezia Canzutti & Martina Tazzioli (2023) 

This paper argues that the border regime works through entanglements of digital and nondigital data and of “low-tech” and “high-tech” technologies. It suggests that a critical analysis of the assemblages between digital and nondigital requires exploring their effects of subjectivation on those who are labeled as “migrants.” The paper starts with a critique of the presentism and techno-hype that pervade research on borders and technology, and points to the importance of analyzing historical continuities and ruptures in the technologization of the border regime. It then explores the assemblages of high-tech and low-tech technologies used for controlling mobility and investigates the imbrication of digital and nondigital records that migrants need to deal with and show not only at the border but throughout their journeys and, eventually, to obtain refugee status. The third section discusses migrants’ tactical uses of digital and nondigital records, their attempts to erase or reconstruct traces of their passages, and states’ oscillation between politics of identification and nonidentification. Finally, the fourth section questions the image of the “data double” and contends that, rather than a discrete digital subject, migrants’ digital traces generate scattered digital subjectivities that migrants themselves cannot fully access.

Neither opaque nor transparent: A transdisciplinary methodology to investigate datafication at the EU borders

Big Data & Society, 9(2), Open Access.

Ana Valdivia, Claudia Aradau, Tobias Blanke, & Sarah Perret (2022)

In 2020, the European Union announced the award of the contract for the biometric part of the new database for border control, the Entry Exit System, to two companies: IDEMIA and Sopra Steria. Both companies had been previously involved in the development of databases for border and migration management. While there has been a growing number of publicly available documents that show what kind of technologies are being implemented, for how much money, and by whom, there has been limited engagement with digital methods in this field. Moreover, critical border and security scholarship has largely focused on qualitative and ethnographic methods. Building on a data feminist approach, we propose a transdisciplinary methodology that goes beyond binaries of qualitative/quantitative and opacity/transparency, examines power asymmetries and makes the labour of coding visible. Empirically, we build and analyse a dataset of the contracts awarded by two European Union agencies key to its border management policies – the European Agency for Large-Scale Information Systems (eu-LISA) and the European Border and Coast Guard Agency (Frontex). We supplement the digital analysis and visualisation of networks of companies with close reading of tender documents. In so doing, we show how a transdisciplinary methodology can be a device for making datafication ‘intelligible’ at the European Union borders.

Algorithmic Reason: The New Government of Self and Other 

OUP, Oxford

Claudia Aradau and Tobias Blanke (2022)

Open Access, PDF of e-book available.

Are algorithms ruling the world today? Is artificial intelligence making life-and-death decisions? Are social media companies able to manipulate elections? As we are confronted with public and academic anxieties about unprecedented changes, this book offers a different analytical prism through which these transformations can be explored. Claudia Aradau and Tobias Blanke develop conceptual and methodological tools to understand how algorithmic operations shape the government of self and other. They explore the emergence of algorithmic reason through rationalities, materializations, and interventions, and trace how algorithmic rationalities of decomposition, recomposition, and partitioning are materialized in the construction of dangerous others, the power of platforms, and the production of economic value. The book provides a global transdisciplinary perspective on algorithmic operations, drawing on qualitative and digital methods to investigate controversies ranging from mass surveillance and the Cambridge Analytica scandal in the UK to predictive policing in the US, and from the use of facial recognition in China and drone targeting in Pakistan to the regulation of hate speech in Germany.

Asylum, Borders, and the Politics of Violence: From Suspicion to Cruelty

Global Studies Quarterly 2(2), Open Access, available at 

Claudia Aradau and Lucrezia Canzutti (2022)

Critical scholarship in international relations, border, and migration studies has analyzed the cultures of suspicion that underpin border practices and have increasingly reshaped the politics of asylum globally. Scholars have highlighted either the generalization of suspicion through the securitization of asylum or racialized and gendered continuities of colonial violence. We propose to understand the entanglements of continuity and discontinuity in the politics of asylum through “technologies of cruelty.” To conceptualize technologies of cruelty, we draw on Etienne Balibar's “topographies of cruelty” and Rita Laura Segato's “pedagogies of cruelty.” Empirically, the article argues that technologies of cruelty minimize, erase, undo, splinter, and devalue asylum seekers’ claims for protection in ways that objectify and dehumanize them. Methodologically, the argument is developed through an analysis of a corpus of asylum appeal decisions in the United Kingdom. Asylum appeals are a particularly important archive for the diagnosis of cruelty internationally, as they are both inscriptions of dominant knowledge and contestations over knowledge and claims to protection. They also allow us to trace how the politics of asylum is situated within global topographies of cruelty, shaped by technologies of cruelty and the deactivation of empathy.

Here you can find some of the publications from the project, as well as related research we have done over the past years.

The politics of (non-)knowledge at Europe's borders: Errors, fakes, and subjectivity

Review of International Studies, 1-20.

Claudia Aradau and Sarah Perret (2022)

From statistical calculations to psychological knowledge, from profiling to scenario planning, and from biometric data to predictive algorithms, International Relations scholars have shed light on the multiple forms of knowledge deployed in the governing of populations and their political effects. Recent scholarship in critical border and security studies has drawn attention to ‘the other side of knowledge’ and has developed a vibrant conversation with the emergent interdisciplinary field of ignorance studies. This article proposes to advance these conversations on governing through non-knowledge by nuancing the analysis of power/(non-)knowledge/subjectivity relations. Firstly, we expand the analysis of non-knowledge by attending to the problematisation of errors and fakes in controversies at Europe's borders. Errors have emerged in relation to border actors’ practices and technologies, while migrant practices, documentation, and narratives are deemed to be potentially ‘fake’, ‘fraudulent’, or ‘false’. Secondly, we explore how different subjectivities are produced through regimes of error/truth and fake/authenticity. We argue that there are important epistemic differences between ‘fake’ and ‘error’, that they are entangled with different techniques of power and produce highly differentiated subjectivities. Finally, we attend to how these subjectivities are enacted within racialised hierarchies and ask whether non-knowledge can be mobilised to challenge these hierarchies.

Book Reviews: Border Frictions: Gender, Generation and Technology on the Frontline (Karine Côté-Boucher (2020), Routledge, Abingdon)
Defense & Security Analysis, 37(3): 383-385

Sarah Perret (2021)


Algorithmic Surveillance and the Political Life of Error

Journal for the History of Knowledge, 2(1), Open Access.

Claudia Aradau and Tobias Blanke (2021)

Concerns with errors, mistakes, and inaccuracies have shaped political debates about what technologies do, where and how certain technologies can be used, and for which purposes. However, error has received scant attention in the emerging field of ignorance studies. In this article, we analyze how errors have been mobilized in scientific and public controversies over surveillance technologies. In juxtaposing nineteenth-century debates about the errors of biometric technologies for policing and surveillance to current criticisms of facial recognition systems, we trace a transformation of error and its political life. We argue that the modern preoccupation with error and the intellectual habits inculcated to eliminate or tame it have been transformed with machine learning. Machine learning algorithms do not eliminate or tame error, but they optimize it. Therefore, despite reports by digital rights activists, civil liberties organizations, and academics highlighting algorithmic bias and error, facial recognition systems have continued to be rolled out. Drawing on a landmark legal case around facial recognition in the UK, we show how optimizing error also remakes the conditions for a critique of surveillance.

Experimentality, surplus data and the politics of debilitation in borderzones 


Claudia Aradau (2020)

The use of digital devices and the collection of digital data have become pervasive in borderzones. Whether deployed by state or non-state actors, digital devices are rolled out despite intense criticism and controversy. Many scholars and public actors have criticised these experiments with digital technologies in borderzones. Yet, what exactly does it mean to conduct experiments in borderzones? In this article, I discuss different meanings of experiments and experimentation and propose to understand border governance through the prism of experimentality. Experimentality was initially formulated in the anthropological literature on the globalisation of clinical trials and, more recently, revisited in feminist science and technology studies. Drawing on this work, I argue that experimentality has become a rationality of governing in borderzones, which renders social relations continuously decomposable and recomposable by inserting mundane (digital) devices into the world. The introduction of various digital devices in Greece since 2015, starting with Skype for the pre-registration of asylum seekers, helps shed light on a particular form of governing through experiments without protocol. This form of experimentality has specific political effects for migrants’ lives. Firstly, experimentality builds upon and intensifies neoliberalism by rearranging rather than redressing precarity. In so doing, experimentality through digital devices produces debilitation rather than better connectivity or access to asylum. Secondly, migrants become not only subjects of surveillance, but subjects of extraction of ‘surplus data’ which entangles their lives into the circuits of digital platforms.

There is an elephant in the room: towards a critique on the use of fairness in biometrics

AI and Ethics, Open Access.

Ana Valdivia, Júlia Corbera Serrajòrdia & Aneta Swianiewicz (2022)

The proliferation of biometric systems in our societies is shaping public debates around their political, social and ethical implications. Yet, whilst concerns towards the racialised use of this technology have been on the rise, the field of biometrics remains unperturbed by these debates. Despite the lack of critical analysis, algorithmic fairness has recently been adopted by biometrics. Different studies have been published to understand and mitigate demographic bias in biometric systems, without analysing the political consequences. In this paper, we offer a critical reading of recent debates about biometric fairness and show its detachment from political debates. Building on previous fairness demonstrations, we prove that biometrics will always be biased. Yet, we claim algorithmic fairness cannot distribute justice in scenarios which are broken or whose intended purpose is to discriminate. By focusing on demographic biases rather than examining how these systems reproduce historical and political injustices, fairness has overshadowed the elephant in the room of biometrics.

Infrastructures of (non)knowledge: Fakes and fear at Europe's borders

in Technopolitics and the Making of Europe, edited by Nina Klimburg-Witjes and Paul Trauttmansdorff

Claudia Aradau (2023)

False identities, forged documents and fake narratives underpin discourses and practices of border control in Europe. Migrants are always under suspicion that they ‘fake’ their stories, suffering, passports, trauma or data. This chapter explores how infrastructures produce ways of knowing and not-knowing by drawing distinctions between and allocating people and things to categories of genuine and fake, authentic and inauthentic. By proposing to speak of infrastructures of (non)knowledge, I draw attention not just to ‘robust networks of people, artifacts, and institutions that generate, share, and maintain specific knowledge about the human and natural worlds' (Edwards, 2010), but also to how such networks enact boundaries between knowledge and non-knowledge. Detecting fakes requires devices, methods and experts that can deploy tests of authentication to decide which narratives, which documents and which embodied suffering are real or fake. As fakes are tied into the politics of fear, refugees continue to be under suspicion of fraud, of using fake documents and fake stories to gain access to asylum. Fake data, forged documents or narratives can be detected at any point and even lead to the revocation of refugee status.


The burden of data: Digital tools, evidence and credibility in asylum and immigration courts
SECURITY FLOWS Report, London: King's College London.
Thais Lobo and Claudia Aradau (2023)

This report delves into the shifting landscape of asylum proceedings and the intricate role digital technologies play in the lives of migrants. The report draws on findings from 916 appeals decided at the Upper Tribunal of the Immigration and Asylum Chamber (UTIAC) between 2007 and 2022 that mentioned one or more of 16 digital platforms. As the use of digital evidence becomes increasingly pivotal in protection claims, we shed light on the challenges migrants face and propose a set of recommendations for a fairer and more equitable system.
