The algorithms that enrich Facebook’s shareholders and its founder and CEO Mark Zuckerberg have hijacked evolutionary quirks of human nature. The consequences for many communities around the world have been tragic.
We are creatures that live in social clusters, and natural selection has sculpted our neurology over thousands of generations to reward certain types of social stimuli and status cueing over others. We can’t help it. Facebook’s engagement metrics are the digital measure of a species that has learned that an individual’s chance of survival improves when the clan or the tribe is minded to ensure it.
The little rush of pleasure you feel at a well-received post is a post-modern artefact of your primordial ancestors’ gratification at scoring the prime cut of mammoth T-bone around the tribal barbecue pit as a reward for stepping up and really spearing the bejeebus out of that big boy out on the tundra.
There’s nothing arbitrary about the Facebook algorithm targeting your need for tribal reinforcement. When Facebook was first coded 17 years ago, Zuckerberg and his Harvard roomies had only the vaguest notion of what they were doing. Thousands of data mavens and behavioural scientists have introduced a lot more focus and rigour to the process.
Far from tempering the worst elements of its algorithms, Facebook has optimised them, with tragic consequences all over the world.
Atham-Lebbe Farsith, a waiter in a one-room curry house in Ampara, did not understand why the customer was so angry. All of the customers were angry. The man at the counter was waving his plate at Farsith, screaming at him in Sinhalese, as the other customers added their voices to his, complaining that Farsith had added something to the meal.
The Tamil-speaking waiter thought they might be angry because of the unfortunate lump of flour he could see quite clearly in the curry sauce. The flour was meant to thicken the gravy, but it had not broken down. He and his brother were not trained chefs after all. They’d worked a couple of brutal years as labourers in Saudi Arabia, long enough to earn the money needed to start up this small restaurant in their village in the green fields of the Gal Oya Valley, in southeastern Sri Lanka.
Farsith tried to ignore the angry diner, but the man kept shouting at him that he had ‘put it in’.
Well, they had, hadn’t they? Flour was in the recipe.
So in his poor, halting Sinhalese he answered, “I don’t know. Yes, we put?” The mob erupted, surging over and around the counter to beat seven kinds of hell out of the brothers, destroying the restaurant and rolling on to burn down the local mosque.
It was 26 February, 2018.
The riots would rage across the country for another two and a half weeks, injuring hundreds of people, and killing two.
In July that year, a thousand miles north of Ampara, five men passing through the Indian village of Rainpada were set upon by locals and beaten to death. The travellers had walked to the hamlet, hoping to barter for simple goods at the local markets. Four of them were members of one family, from a nomadic tribe in northeastern Maharashtra. They were kicked and punched and bludgeoned to actual pieces. There was so much blood and gore, splattered so extravagantly over the walls, the floor and even the ceiling of the village council office where the men had sought refuge, that nobody from the little village wanted to clean up after themselves. Eventually, labourers from a neighbouring township agreed to mop up the human offal for 5000 rupees.
Farsith and his brother hid for months from their neighbours in Sri Lanka. The village of Rainpada died because all the men ran away, fearful of being arrested for murder.
Kill all you see
Meanwhile, across the Bay of Bengal, two soldiers deserted from the Army of Myanmar. They gave themselves up to an insurgent group, which videoed a confession from each man.
The deserters, former Private Myo Win Tun and former Private Zaw Naing Tun, were handed off to agents of China’s Ministry of State Security. The Chinese, for their own inscrutable reasons, mostly involving a feud with a couple of Burmese generals, delivered the men to the International Criminal Court in the Hague.
There they would testify to war crimes and genocide, committed in 2017, on the direct orders of their superiors.
Go into the villages of the Rohingya and “Kill all you see,” they were told.
By the time Privates Myo Win Tun and Zaw Naing Tun, and their comrades from Battalions 353 and 565 of the Army of Myanmar were done, over twenty thousand Rohingya were dead and three-quarters of a million were fleeing on foot across muddy mountain tracks to Bangladesh.
According to a report by the Simon-Skjodt Center for the Prevention of Genocide at the United States Holocaust Memorial Museum, “Myanmar Army soldiers slit throats; burned victims alive, including infants and children; and beat civilians to death. State security forces opened fire on men, women, and children from land and helicopter gunships at close range and at a distance, killing untold numbers. Survivors from some villages also reported how soldiers slashed women’s breasts, hacked bodies to pieces, and beheaded victims.”
The common thread
Thousands of miles separated these sudden storms of bloodletting, and each was unique in its particulars and scale, but one individual ties them all together with a thick, golden thread. Your maximum Facebook friend, Mark Zuckerberg.
Each of the atrocities was exceptional and singular in the way they arose from the cultures and historical moments in which they germinated. Each was of a kind with long antecedent legacies of fear and loathing between rival tribes and belligerent cultures. But each was also enabled, amplified and, in the case of the Rohingya genocide, micro-managed and tactically iterated using the tools of a trillion-dollar company with a stated mission “to give people the power to share and make the world more open and connected.”
Facebook says it is all about bringing people together. To achieve this mission it has now plugged its fibre-optic data tubes into the reptilian hindbrains of 2.9 billion human beings with no other planet to escape to.
But in creating a maximum addressable eyeball market of half the world’s population (rising to nearly 100 per cent penetration in markets such as Sri Lanka and Myanmar) Mark Zuckerberg has gifted us the power to share toxic disinformation and weaponised paranoia, connecting some of the worst people ever and making the world more open to their violent derangements.
In each case above, and in dozens of other instances of lethal communal violence around the world, Facebook acted as both a catalysing and organising agent.
In Sri Lanka, which banned the network after a series of bomb attacks on Easter church services killed more than two hundred people, the curry house riot in Ampara was sparked by a viral Facebook post claiming that police had seized thousands of sterilisation pills from a Muslim pharmacist in the village. The pills, according to the conspiracy, were part of a cunning Islamist plot to depopulate Sri Lanka of its Sinhalese Buddhist inhabitants by chemically castrating the males.
It was, of course, a load of old bollocks. But Facebook’s algorithm does not discriminate against bollocks, old or new. Indeed, if the content is measurably ‘sticky’ and ‘sharable’ the algorithm gets very excited and starts pimping it out with extreme prejudice.
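To see why the bollocks wins, consider a deliberately crude sketch of engagement-weighted ranking. Nothing here is Facebook’s real code: the field names, weights and scores are invented for illustration. The point is structural — the score rewards shares and reactions, and no term anywhere asks whether the content is true.

```python
# Toy illustration of engagement-weighted feed ranking.
# NOT Facebook's actual algorithm: all names and weights are invented.
# Note what is missing: there is no truthfulness term at all.

def engagement_score(post):
    """Rank purely on interaction counts; accuracy plays no part."""
    return (post["shares"] * 5.0      # shares spread content furthest
            + post["comments"] * 2.0  # arguments in the comments count too
            + post["reactions"] * 1.0)

def rank_feed(posts):
    """Most 'engaging' posts first, whatever they contain."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "cat_photo",    "shares": 2,   "comments": 5,  "reactions": 40},
    {"id": "viral_rumour", "shares": 120, "comments": 80, "reactions": 300},
]

# An inflammatory rumour that provokes sharing and argument will
# outrank quieter, truer content by an order of magnitude.
feed = rank_feed(posts)
```

Under any scoring of this shape, outrage is simply better fuel than accuracy, which is the whole problem in two functions.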
The angry customer was possibly just an angry customer who’d seen the viral post before finding the flour blob in his dinner, sitting undissolved in the curry sauce exactly like one of those sneaky sterilisation pills would.
But he was just as likely a Sinhalese ultranationalist, with a director of phone-cam photography standing off to the side, capturing his performative outrage and scoring the absolute money shot when poor Farsith mumbled, “I don’t know. Yes, we put?”
The eighteen-second video of that confession blasted out across Sri Lankan Facebook within hours, guaranteeing that Ampara’s riots would go national.
It wasn’t an accident.
It was, by the calculations of Zuckerberg’s algorithm, super awesome content. The sort of thing that friends love to share and connect over, as they burn down the local mosque and beat a few waiters to death. Naturally, it also smashed all the KPIs for any social media strategist whose online campaign was designed to funnel potential followers towards that all-important call-to-action… to burn down the local mosque and beat a few waiters to death.
So too, a thousand miles north in the village of Rainpada.
For weeks before the five nomadic travellers arrived for market day, the village and surrounding hamlets had seethed with fearful rumours of ‘outsiders’ kidnapping local children and harvesting their organs, which is to say that Facebook and WhatsApp, the Zuckbot’s global messaging platform, were boiling with hot content.
There were memes. It mattered not to Facebook or its algorithm that some of the most popular memes featured images of children killed in the Syrian civil war, rather than locally farmed for their sweetmeats by some locavore Slender Man.
The images were powerful, engagement was high, and organic reach was off the scale, bro.
Facebook’s Vision Statement, which is slightly gushier than its Mission Statement, speaks of people using the platform “to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.”
This possibly explains Zuckerberg’s recent churlishness re all the Facebook hate.
The good folks of Rainpada, after all, were using their cheap Androids and discount data minutes to stay connected with friends and family, who were all about discovering what was going on in the world, especially as regards all that child abduction and organ harvesting they’d been seeing on the Book of Face and in their village and family WhatsApp group chats lately.
That stuff totally mattered to them and you better believe they were totes gonna murder the share button and express themselves about it. Same way they were totes gonna murder those five guys who wandered into town and made the mistake of offering a nine-year-old girl a boiled lolly as they asked her when the market was open.
It didn’t matter that the child abduction memes and murder videos metastasising in Rainpada’s group chats were fake and crudely so. They had social proof. The posts were sourced from friends and neighbours in the real.
The wanderers never had a chance. They thought they were walking into the village for market day, but they had instead stepped into a lethal trap, fashioned from the very real, but completely bogus terrors of one of the countless small, isolated, technologically illiterate and epistemically unsophisticated micro-demographics which when aggregated together make Mark Zuckerberg a billionaire.
But of course, Zuckerberg isn’t the only one who knows how to work those demos.
What separated the Rohingya genocide from the killings in Sri Lanka and northern India, wasn’t just scale. In Myanmar, the military had identified Facebook as a premium channel for delivering psywar effects directly into the eyeballs and the limbic systems of the country’s entire population. So deeply is Facebook dialled into Myanmar that 97 per cent of the country’s internet users have an account on the network. Many could be forgiven for mistaking the US platform for the whole of the web.
In Myanmar, Facebook is the internet.
Effectively, it is the whole world.
The military exploited the reach of the network, and the vulnerability of its users, to prepare the populace not just to accept a terror campaign that shaded the difference between ethnic cleansing and straight-up genocide, but to help actively prosecute it.
Up to 700 army operators, many trained in Russia’s information warfare facilities, were tasked with filling Myanmar’s newsfeeds, pages and groups with content designed to inflame hatred of the country’s Rohingya minority.
They deliberately operationalised the power of social proof that had turned so deadly in Rainpada – taking over popular entertainment accounts and beauty blogger pages to salt them with increasingly extreme material. When the military’s commanders decided in August of 2017 to move to the kinetic phase of operations, soldiers like Myo Win Tun and Zaw Naing Tun found themselves aided in the slaughter by local villagers, whipped into a fury by two years of micro-targeted, A/B-tested, high-engagement psywar posts.
The campaign was so extensive, and the evidence so undeniable that even Facebook didn’t bother trying to deny it.
It’s still booking ads there, of course.
Until recently, Zuckerberg’s go-to crisis management play was the teary-eyed apology, and a promise to do better. In Sri Lanka and India the promises involved ‘working with authorities’ and setting up quick reaction teams to address disinformation. In Myanmar, where the authorities were the problem, Alex Warofka, a product policy manager, simply said, “We agree that we can and should do more.”
Last week’s Congressional testimony by Frances Haugen, a Facebook manager gone rogue, had its own viral moments, mostly due to some scary reveals about the damage being done to young women by Instagram, and her explanation of how the platform preferences ‘engagement’ over everything – including the threat of mob violence and the subversion of democracy.
But Haugen’s testimony also called out Facebook’s mea culpas for all those burning villages and piled up corpses in places like Ampara and Rainpada and Myanmar. Three years after the lynchings, riots and massacres, Haugen revealed that Facebook still “spends almost all of its budget for keeping the platform safe only on English-language content.”
She revealed that the company continually lied about “the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages.” There was, she said, one base truth about Facebook. The only thing it cares about is growth.
“This is not simply a matter of certain social media users being angry or unstable, or about one side being radicalised against the other; it is about Facebook choosing to grow at all costs,” she said.
“The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people,” she said.
Because to make Facebook and Instagram and WhatsApp safer will mean unplugging the engagement engine.
The primal energies which have driven hundreds of incidents of mass violence, not just in Sri Lanka, India and Myanmar, but all over the world, are the motive power behind all the shares Craig Kelly used to get for his anti-vaccination posts. They are the reason hundreds of thousands of Americans died needlessly from COVID-19. They may yet re-elect Donald Trump. They are the subterranean explosions collapsing our faith in any sort of shared reality.
Critics argue the company should be broken up.
Perhaps the solution is to tap into the very human fears that Facebook itself exploits: its constituent pieces could be buried at a crossroads at midnight, and the ground salted so that nothing ever grows there again.