Now we know for sure that big tech peddles despair, we must protect ourselves | Zoe Williams

After the Molly Russell case, there can be no doubt about the risk or urgency: our anger must be put to good use

Now that the inquest into the awful death of Molly Russell in 2017 has delivered its findings, we have a new reality to adjust to. The teenager died from an act of self-harm, “while suffering depression and the negative effects of online content”. Her father described how she had entered “the bleakest of worlds”: content on self-harm and suicide was delivered in waves by Instagram and Pinterest, which simply left it to the algorithm. “Looks like you’ve previously shown an interest in despair: try this infinitely replenishing stream of fresh despair.”

Social media platforms deliberately target users with content, seeking attention and therefore advertising revenue: we knew that. This content can be extremely damaging: we knew that, too. But surely now that we’ve struggled, falteringly, towards the conclusion that it can be deadly, there can be no more complacency. These are corporations like any other, and it’s time to build on the consensus that they cause harm by regulating, as we would if they were producing toxic waste and pumping it into paddling pools.

People, parents especially, worry a lot about the digital age and its impact on teenagers, and a lot of those worries are nonsense: are they addicted to Fifa? Will Minecraft turn them into recluses or sever their connection with the natural world? Does Fortnite stop them reading books (in fact, yes, but some other time for that)? Sometimes you’ll get a useful correction from a specialist in addiction or adolescence, but there isn’t a coherent pushback from tech giants, because these anxieties create exactly the debate they need, amorphous and essentially luddite in character: what if today’s kids are less resilient than yesterday’s because they were raised in a world with different stimuli? If the real threat to kids is modernity itself, it can never be addressed, only discussed.

Underneath all that noise is a persistent drumbeat, an agenda now well known, pursued by methods that have been widely studied. Any platform that is free to use exists to maximise its advertising revenue, which means chasing watchers and watch-time. The algorithms suggesting content are not designed to prioritise quality or relevance, but to take an existing interest in any given user and direct them to more extreme versions of it. In Molly Russell’s case, the tragic outcome was that she was bombarded with ever more explicit explorations of misery, such that the coroner, Andrew Walker, said: “It would not be safe to leave suicide as a conclusion.” We cannot seal off a death from despair as an individual act when there are global corporations unrestrainedly marketing despair.

The problem goes far beyond young people: we can see the impact of algorithms in nativist politics all over the world, and in that regard youth is not the defining factor – indeed, the casual characterisation of youth as a state of vulnerability is its own blind alley. Nevertheless, there are two elements that make social media particularly influential on the young, and the behemoths of the field particularly culpable in their failure to address the problem. As Laura Bates notes in Men Who Hate Women, her detailed research into the “manosphere”, social media’s reach among Gen Z is astronomical: 85% of US teens use YouTube, 72% use Instagram, 51% still use Facebook. People also spend significantly more time watching content that has been recommended to them than content they have gone looking for: on YouTube, 70% of everything watched has been suggested by the site.

Adolescence is also, manifestly, a time of great intellectual as well as neurological plasticity, when you might easily want to know what an incel is without wanting to become one, or feel very keenly one day that the world is doomed, without being ready for your entire feed to become variations on the apocalypse. We can, and do, debate ad nauseam how a mature society supports the outer edges of youthful turbulence, from eating disorders to toxic masculinity, yet we allow the main media consumed by that generation to operate not just without any sense of responsibility or duty, but with a business model that foments every problem for profit.

The standards that social media companies set for themselves are curiously duplicitous, as well as being demonstrably insufficient. Last year, the staff of the Connecticut senator Richard Blumenthal set up a fake Instagram account posing as a 13-year-old girl interested in “extreme dieting”; it was immediately directed towards user accounts called “I have to be thin”, “Eternally starved” and “I want to be perfect”: evidence, the senator said, that the algorithm amplified harmful content by design. The platform’s response was that it was a sifting error – the site already had rules against the promotion of extreme dieting, and these accounts had slipped through them. But this doesn’t answer the central charge, which was not that the rules weren’t enforced well enough, but that the platform was actively advertising eating disorders to kids who showed an interest. CNN repeated the sting the following week, with the same results.

The online safety bill, expected to progress through parliament – although it may not be enacted until 2024 – addresses content that promotes self-harm and suicidal ideation, and would hand Ofcom the job of evaluating what is appropriate for under-18s. It is a useful waypoint, away from tech giants simply regulating themselves, but it is insufficient both practically and in spirit. There is no point in countries regulating one by one; the response needs to be international. And we should not waste time discussing what kind of suicidal ideation is appropriate for which age group. We need to ask more fundamental questions, starting further up the pipeline, about the moral responsibilities of mass publishing.

All this takes time, and youth is short: parents will be thinking they should control incoming influences themselves, that they don’t have time to wait for international initiatives and bills to progress. You can micromanage your kids’ consumption, be aware of the triggers everywhere – YouTube for toxic masculinity, TikTok for overwhelming climate anxiety, Instagram for eating disorders – and try to control it all yourself, and this will work for some. But constantly policing your children corrodes your relationship with them, destroying their trust and openness. I don’t want to turn into the internet jailer just so that Mark Zuckerberg can enjoy unfettered profit.

An air of pre-emptive defeatism hangs over this debate: a sense that it is too late to regulate social media, that the lie has travelled all the way round the world, and there is now no point in the truth getting its pants on. But that is a counsel of despair. We cannot afford despair.

But at the same time, the solution is not individual. The answer isn’t for a billion parents to surveil their children’s Instagram and Pinterest feeds. It is to build a consensus, which is as global as the platforms themselves, that some things are more important than profit, and regulate accordingly.

  • Zoe Williams is a Guardian columnist

  • In the UK and Ireland, Samaritans can be contacted on 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org.

  • Do you have an opinion on the issues raised in this article? If you would like to submit a letter of up to 300 words to be considered for publication, email it to us at guardian.letters@theguardian.com
