The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World by Max Fisher - review by Carl Miller

Carl Miller

Are You Outraged Yet?

The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World

By Max Fisher

Quercus 352pp £20

‘By the time I landed in Myanmar, the soldiers were already throwing babies in fires,’ Max Fisher writes of his visit to the country in 2017. Houses burned. Rockets slammed into the walls of longhouses. A few years before, media restrictions had been lifted and Myanmar had hurtled into the digital age. While soldiers rampaged across the countryside, Facebook swirled with stories of Rohingya committing acts of cannibalism and smuggling weapons into Myanmar, accompanied by graphic images of bestiality and calls for beheadings. Each was shared tens of thousands of times, ‘pumping the national bloodstream with conspiracies and ultranationalist rage,’ Fisher states. From digital freedom to genocide: how did we get there? Untangling those threads is what Fisher sets out to do in The Chaos Machine.

To those who read books about tech and society, Fisher’s story starts in a familiar place. About as far away from Myanmar as you can get, we have Silicon Valley, a place whose origins are part hippy transcendentalist, part rapacious capitalist, home to a get-rich counterculture of engineers and dorm-room dropouts shunning social convention and feeding on and into the mythos that Silicon Valley is like nowhere else on earth.

New companies are created – that is what Silicon Valley does – and launched into a ceaseless and desperate war for attention. Google, Facebook, Twitter and the dozens upon dozens of others that have fallen by the wayside are all, ultimately, about their user bases. The ‘organizing incentive of all social media’, Fisher explains, is ‘attention’. Each company is on a journey of constant innovation and reinvention, seeking to win and keep winning the attention that keeps them alive.

Growth, then, is the animating ideal behind the platforms these companies build, and ‘persuasive technology’ is the means of achieving this. The research this technology is built on draws on everything from dopamine studies to behavioural psychology and addiction analysis. Features galore have emerged to keep our attention, from the endless scroll to dark patterns, but one is more important than all others, according to Fisher: recommendation algorithms. From an infinite pool of information, ‘recommender algos’ select what to serve up to you. On Facebook that means the news feed, on YouTube the recommended videos, on Twitter the curated timeline. And each algorithm is directed by powerful machines learning constantly what information will keep us on the platform the longest.

So the race for attention has shaped the products and these products have shaped what we see. This has created nothing less than ‘a wholly new era in the human experience’. Fisher’s point is that wiring platforms to grab attention has had a series of vast and ultimately ruinous consequences for the world we live in. For what the machines learned was that across an array of cultures and societies, more extreme content wins more engagement. The charge is not simply that YouTube and Facebook have allowed polarising, extremist, conspiracist, hateful material to persist on their platforms; it is also that they have actively pushed people’s attention towards it. Their products haven’t just reflected reality for us all but actually created it.

This is a story not only of technology but also of what happens when it commingles with fragile psychologies and vulnerable societies. The algorithms learned how to nourish dark and dangerous human impulses. People were pulled in as much by anomie as by hate – by ‘content’, Fisher writes, ‘that spoke to feelings of alienation, of purposelessness’. But the information they were fed recontextualised whatever personal hardships they were experiencing, presenting them as part of a wider process, one in which their identities were under threat. As a result, online spaces grew that rallied people to defend their threatened identities and to deify the people who went out to attack, even to kill, an often imagined enemy.

The results will have been seen by anyone who has spent time online: an algorithmically served diet of information triggering moral outrage, a hardening of resentment between in-groups and out-groups and a sense of having your own view of the world confirmed. In the United States, Fisher plots a path from Gamergate to Pizzagate, and then the transformation of a ‘wing of the Republican Party’ into a ‘millions-strong conspiracy cult’ in ‘the vanguard of a campaign to topple American democracy’.

At first, the only people who really understood what the platforms were doing were those who ran them. For the rest of us, realisation has come through a drip-feed of studies, disclosures by whistleblowers and the occasional mea culpa of a tech giant exec. Those running the tech giants were ignorant of the societies they were pushing their products into, including their own. ‘As the usage expands, it’s in every country, it’s in places in the world and languages and cultures we don’t understand,’ Fisher quotes Chris Cox, Facebook’s chief product officer, as saying in 2013. Ignorance is only part of the story, however. The other part involves wilful myopia. Fisher is careful to note the many warnings Facebook and YouTube received about what their algorithms were doing years before Myanmar’s genocide began.

Diagnosing the influence of the tech giants on society has become practically a genre in its own right, and the broad contours of Fisher’s argument will feel familiar in various parts to readers of, say, The Attention Merchants by Tim Wu, The Net Delusion by Evgeny Morozov and The Internet Is Not the Answer by Andrew Keen. None of the other books on this subject, however, traces quite as crisply or clearly as Fisher’s does a single line from attention-harvesting through recommender algorithms to radicalisation.

It is when Fisher reports first-hand on the effects of all this that the writing really crackles. He takes us, for instance, to a hospital in Maceió, Brazil, to show us doctors fighting not just the Zika virus but also anti-vax parents pulled down YouTube rabbit holes. Along a remote mountain road in Sri Lanka, Fisher brings us to a concrete house without running water but ‘bristling’ with smartphones. Inside lives a family who ‘could recite, verbatim, memes constructing an alternate reality of nefarious Muslim plots’. Sectarian extremists, famous on Facebook, send exactly the kind of content that keeps people on the platform. Posts show makeshift weapons and a list of targets. Two mosques are marked ‘tonight’, another two ‘tomorrow’. Sudarshana Gunawardana, Sri Lanka’s information chief, is reduced to begging Facebook to take action via its online submission box.

It is never Fisher’s argument, however, that the tech giants conjure hatred out of nothing, but rather that they exploit and worsen hatreds that already exist. In Myanmar, anti-Rohingya animus stretches back to the early 20th century. But it was only after 2012, when media restrictions were lifted, that communal tensions flared into genocide.

Fisher’s answer is simply to turn these chaos machines off, or at least the recommender algorithms that power them. It would create a less engaging internet, but also one with ‘fewer schoolteachers chased into hiding, fewer families burned alive in their homes … fewer lives ruined by undeserved infamy … fewer children deprived of lifesaving vaccines … Maybe even fewer democracies torn asunder by polarization, lies, and violence.’ Fisher’s book brings us face to face with chaos machines and their ruinous human consequences. I only wish the cure didn’t feel quite so unlikely.
