Code Dependent: Living in the Shadow of AI by Madhumita Murgia - review by Carl Miller

Carl Miller

Guilt by Algorithm

Code Dependent: Living in the Shadow of AI

By Madhumita Murgia

Picador 320pp £20
 

It was in the barrio of Norte Grande, in the foothills of the Andes in northwest Argentina, that in 2018 a project was rolled out to try to predict who would become pregnant. Powerful machine learning, supported by one of the largest tech companies in the world, was deployed to identify the women most likely to have a child while still teenagers. 

The idea was to usher in a brave new world of digital welfare through smarter, targeted interventions. Yet, as you stroll through the streets of the city with Madhumita Murgia and one of the cast of characters in Code Dependent, Norma Gutiarraz, you learn that this isn’t what happened. In some areas, nearly three quarters of families were flagged up by the system. The needs of these families were painfully obvious to anyone who cared to visit: sanitation, health care, jobs, education. The scheme wasn’t explained to the participants, which led to its being shut down before almost any of the women had seen practical benefits. Oh, and the whole project – about pregnancy, you understand – had neglected to collect data about men. 

‘I wanted to find’, Murgia writes, ‘real-world encounters with AI that clearly showed the consequences of our dependence on automated systems.’ Her focus is not on the technical magic of artificial intelligence itself or the wunderkinds and tycoons who build it. Instead, leaving the bubble of Silicon Valley far behind, she wants to know what happens when AI is pulled into the messy tangle of normal life. 

Amsterdam is another city turning to AI, hoping that it will do good. The goal was to identify families that needed early intervention via the creation of a list, named Top400, of young people most likely to commit serious crime. And there too the story turns into a smorgasbord of unintended consequences. Joining the Top400 turned out to be more punitive than preventive. Young people suddenly found themselves with a target on their backs, the subjects of surveillance by the police and targets of recruitment by gangs. ‘The algorithm-generated lists were more than predictors. They were curses,’ Murgia writes. 

Through a series of vivid, dramatically diverse tableaus, Murgia does indeed situate AI in the rough and tumble of human society. What she often finds is the sheer messiness that ensues when you take powerful technology away from technical labs and think-tank ethical frameworks and mix it with, well, everything else: human ambition, superstition, inequalities, resilience, resistance. 

Most of Murgia’s tales, then, are cautionary. She meets a poet battling sexualised deepfakes and ‘nudes-for-hire’ services, an engineer of facial-recognition technology now doubting his creation and some of the one billion workers who survive primarily ‘at the behest of an app’. The writing is as much a travelogue and a series of pen portraits as anything else, and really comes alive when Murgia describes the people she encounters and the places she visits. Norma Gutiarraz has ‘liquid hazel eyes’. In Argentina, Murgia passes through an ‘otherworldly terrain of volcanoes, salt flats and technicolour lagoons’.

She and her subjects are most optimistic about the potential of AI in the field of health. At Chinchpada Christian Hospital in western India, where ‘pink bougainvillaea branches hang low over a red-brick wall’, she meets Dr Ashita Singh, who is involved in the fight against tuberculosis. Singh employs the qTrack app, which is powered by machine learning, to spot possible signs of tuberculosis in X-ray scans, using it as a second opinion to augment, not supplant, her own clinical expertise. Health care, Murgia says, is the ‘one area where the technology’ seems to have ‘genuinely life-changing potential’. It is also possibly one of the few areas in which there are already regulations in place to control AI properly – to mitigate potential harms, protect patient data and prevent conflicts of interest. 

The stories in Code Dependent are wonderfully varied, but their common threads, I think, come clearly into view. Wherever you are in the world, it is very often the least privileged whose lives have been most changed by AI. From Nairobi to Sofia, it is poor people who have the most contact with state services and about whom, therefore, the most data has been collected. It is the ‘global precariat’ who feed AI systems with data and who are identified, surveilled, detected and, sometimes, punished through these technologies. All of this raises a philosophical problem too. In those scenarios where AI is used to profile individuals or to target an intervention of some sort, it removes the autonomy of the people involved. They are turned from people capable of making choices into probabilities or clouds of risk. 

People aren’t merely passive objects of AI, however. Murgia comes across individuals finding ways of reclaiming their autonomy and independence. They include Armin Samii, an Uber driver who invented UberCheats, a program that, in a jujitsu pivot, uses the power of data against a tech giant. After employing it to find that 17 per cent of his six thousand trips had been underpaid, he released it, allowing other Uber drivers to check their own journey histories. 

The starkest story of resistance centres on Maya Wang, who uncovered the algorithmic system that underpins policing in the Xinjiang region of China. She found first the documentation and then the actual program, the Integrated Joint Operations Platform, being used by the police in the region. As she and her colleagues pulled the code apart, they exposed a system of surveillance that is grotesquely Kafkaesque. ‘There’s never been any other empire or government in the whole of human history’, Wang tells Murgia, ‘that has been able to exert such wide-ranging and deep surveillance on people.’ The system hoovers up vehicle data, DNA and biometric information, voice and gait profiles and mobile-phone records. Thirty-six suspicious ‘person types’ were identified in the process. Among the behaviours flagged up were ‘generally acting suspiciously’, having ‘unstable thoughts’ and ‘having improper [sexual] relations’. AI triggered travel restrictions and detention, turning daily life in Xinjiang into a ‘pressure cooker’, Murgia writes. 

To me, the most important revelation of Murgia’s book is the quiet drift of Big Tech companies into the public sector. They pop up throughout the book in places you do not expect: Microsoft in Argentina; Google in Mexico; AWS in India. The reason is the almost complete absence of AI capabilities in the public sector. There seems to be barely a government in the world that can train AI models itself, and this is changing the role of Big Tech in public life. To actually use AI, public-sector institutions all over the world have had to create relationships with ‘hyperscalers’ like Google, Microsoft, Amazon, Meta and Apple. Big Tech is helping to create the tech state.

As Murgia directs your gaze around the world to reveal how AI is actually changing lives, you inevitably come away with the sense that all of these varied influences are creeping closer to you too. As one gig worker says, ‘everybody feels safe, they have a nice job and this won’t affect them, just those poor Uber drivers. But this artificial intelligence, it will spread, and it’s coming for everyone.’
