Molly Russell coroner calls for review of children’s social media access

Andrew Walker’s report says government should consider separate platforms for adults and children

The coroner at Molly Russell’s inquest has recommended that the government consider separate social media platforms for children and adults, as he called for a review of children’s access to online content.

The senior coroner Andrew Walker, who presided over the inquest into 14-year-old Molly’s death, has issued safety recommendations that focus on children’s access to social media content. Molly, from Harrow, north-west London, died in November 2017 after viewing large amounts of material related to suicide, depression, anxiety and self-harm on platforms including Instagram and Pinterest.

Walker issued a prevention of future deaths report, recommending that the government review the provision of internet platforms to children. As part of that review, he said, it should look at: separate sites for children and adults; checking a user’s age before they sign up to a platform; providing age-appropriate content to children; the use of algorithms to provide content; advertising to children; and parental or guardian access to a child’s social media account.

In a landmark conclusion to the inquest last month, Walker said social media had contributed to Molly’s death, stating that she had “died from an act of self-harm whilst suffering from depression and the negative effects of online content”.

The prevention of future deaths notice has been sent to Instagram’s owner, Meta, and Pinterest as well as two other platforms that Molly interacted with before her death: Snapchat and Twitter. The report has also been sent to the culture secretary, Michelle Donelan, and Ofcom, the UK communications regulator charged with overseeing the online safety bill. All parties who receive the report must respond by 8 December with details of the actions they propose to take, or explain why they are taking no action.

Walker said in the report that the government should consider setting up an independent body to monitor social media content and should consider legislation to protect children from harmful online material. The online safety bill, which is due to resume its progress through parliament, imposes a duty of care on tech platforms to protect children from harmful content. Walker said platforms should also consider self-regulation.

“Although regulation would be a matter for government, I can see no reason why the platforms themselves would not wish to give consideration to self-regulation,” he wrote.

Responding to the report, Molly’s father, Ian Russell, urged social media platforms to think “long and hard” about whether their services were safe for children and to take action before the online safety bill was introduced.

“We urge social media companies to heed the coroner’s words and not drag their feet waiting for legislation and regulation, but instead to take a proactive approach to self-regulation to make their platforms safer for their young users,” he said. “They should think long and hard about whether their platforms are suitable for young people at all.”

Russell added that the online safety bill should be introduced “as soon as possible”. Referring to the systems that repeatedly pushed harmful content at his daughter before her death, Russell called for tech bosses to face stronger criminal sanctions “if they fail to take action to curb the algorithmic amplification of destructive and extremely dangerous content or fail to remove it swiftly”.

William Perrin, an internet safety expert and trustee of the UK charity Carnegie, said Walker’s report contained recommendations that were mainly covered by the bill, such as giving Ofcom a role in monitoring how platforms deal with harmful content. However, Perrin said action had still not been taken, despite many interventions like Walker’s.

“This is yet another report, and a weighty one, that recommends action on harmful content,” he said. “But the government has yet to take that action. It’s all very well to say the online safety bill will do these things but it has still yet to be implemented.”

Beeban Kidron, a crossbench peer and child internet safety campaigner, said she did not support the concept of separate platforms for children and adults but added: “The coroner is entirely right that a child going online should be offered an age-appropriate experience.”

Merry Varney, who led the Russell family’s inquest team from the law firm Leigh Day, said: “The decision of HM senior coroner Walker to issue this report both to the government and social media companies is very welcome, and action to prevent further harm to children must be taken urgently.”

Donelan said her “thoughts will be with Molly’s family” when the bill returns to parliament shortly and that the coroner’s report matched provisions in the legislation.

She added: “What happened to Molly is heartbreaking, which is why I am considering the coroner’s report into her death so carefully. His recommendations tally with what our world-leading Online Safety Bill already delivers, which is an important step forward.”

A Meta spokesperson said: “We agree regulation is needed and we’ve already been working on many of the recommendations outlined in this report, including new parental supervision tools that let parents see who their teens follow and limit the amount of time they spend on Instagram.”

Pinterest said the coroner’s report would be “considered with care”, while Twitter and Snapchat’s parent company confirmed they had received the report. The Department for Digital, Culture, Media and Sport has been approached for comment.

• In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org. In the UK and Ireland, Samaritans can be contacted on 116 123 or by emailing jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org. You can contact the mental health charity Mind by calling 0300 123 3393 or visiting mind.org.uk

Dan Milmo, Global technology editor
