AI as extraction
Materiality and power in Kate Crawford’s recent book...
What I found particularly compelling about Kate Crawford’s recent book Atlas of AI was its interest in thinking in very material and connected terms about the problems that artificial intelligence poses. Rather than allowing it to be otherworldly, Crawford effectively seeks to explore how AI is a deeply integrated part of the world itself, in terms of social structures but also in terms of environmental and ecological factors. The book starts with a story about a performing horse, and this is just one of many instances in which nature is foregrounded in surprising and often revealing ways. In Crawford’s account AI is embedded in all sorts of connections. This has the effect of blowing away any sense of immateriality that might have persisted thus far.
To bring about this radical materiality agenda, as we might think of it, the book’s chapters start with the dirt of earth and the grit of labor before moving into the heat and environmental degradation of data and the reductionism of classification. From there the book moves inwards to the tracking of affect and outwards to the structures of the state and the forces of power. Having grounded AI, the concluding chapter is followed by a coda that takes us into space.
As the chapter list suggests, this may be a book about AI and materiality, but it is also a book about AI and power. The point is that this focus on materiality is needed to bring about a stronger sense of how AI is part of power relations and social forces. As Crawford tellingly concludes, the book explores ‘the planetary infrastructures of AI as an extractive industry: from its material genesis to the political economy of its operations to the discourses that support its aura of immateriality and inevitability’. This closing reflection is indicative of the type of approach the book develops, bringing infrastructures together with operations and discourses.
The notion of extraction is a powerful one that does a significant amount of work in literally grounding the discussion of AI. AI is an industry, it is argued, that ‘depends on exploiting energy and mineral resources’. There is, inevitably, a spatial element to thinking of AI as an ‘extractive industry’. The result of such a phrase is that location becomes a key part of the analysis. Data centres, we are told, like the mining sector before them, ‘are far removed from major populations’. The materiality of AI brings with it an attention to the mapping of AI. This is more than an implicit point in the book: Crawford’s method is informed by the notion of the atlas. It is a methodological point that is explained in some detail, especially with regard to what it means to think in terms of the atlas and how an eye for topography enables AI to be incorporated into the world in more direct and specific ways.
For Crawford, an understanding of AI also needs to turn back upon the very label within which it is contained. There is, Crawford concludes, ‘much at stake in how we define AI, what its boundaries are, and who determines them: it shapes what can be seen and contested’. There is a strong sense in the book of Crawford’s concern with avoiding the distractions of myth-making and techno-posturing and trying, instead, to think about the agendas behind them, especially concerning the what and who of AI. The argument made here runs across the multi-scalar chapters of the book: there is a concentration of power when it comes to the development and ownership of AI. Crawford goes as far as to refer to ‘the empires of AI’. Such a concentration of AI has consequences for the future and for the way that these technologies develop, their objectives and the type of power associated with such developments. Crawford is pushing toward the connection of ‘issues of power and justice’.
Thinking in material terms such as these brings to the fore the substances of such connections. At the same time, trying to think in terms of the atlas is itself extractive of the relations that are often left in the substrate. The question this poses, then, is whether something as vast as AI can be thought of in these wide-ranging terms, or whether it simply becomes too hard to maintain this combination of materialities in the analysis. What this book does is to bring out the materiality of AI in ways that could well change understandings of the conditions from which AI emerges and may even, with its arguments about extraction, change what we understand AI to be in the first place.