Instagram bans 'graphic' self-harm images after Molly Russell's death

Social media site announces action following criticism from British teenager’s father

Instagram has announced that it will ban all graphic self-harm images as part of a series of changes made in response to the death of British teenager Molly Russell.

The photo-sharing platform made the decision – which critics said was necessary but long overdue – in response to a tide of public anger over the suicide of the 14-year-old girl, whose Instagram account contained distressing material about depression and suicide.

After days of growing pressure on Instagram culminated in a meeting with health secretary Matt Hancock, the social network’s head Adam Mosseri admitted that the company had not done enough and said that explicit imagery of self-harm would no longer be allowed on the site.

“We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable,” Mosseri said. “We will get better and we are committed to finding and removing this content at scale.”

Molly’s father, Ian Russell, said he believed Instagram was partly to blame for her death. The family found material relating to depression and suicide when they looked at her account after she died.

Instagram announced a range of further measures, including removing non-graphic images of self-harm from the most visible parts of its app and website, in a move that appeared designed to draw a line under what has become a reputational crisis for the brand and its parent company, Facebook.

But critics said the changes should have been made long ago and remained sceptical that they would be enough to tackle a problem that, some said, had grown unchecked for a decade.

The NSPCC said Instagram had taken “an important step”, but that social networks were still falling short and that legislation would be necessary.

“It should never have taken the death of Molly Russell for Instagram to act,” said chief executive Peter Wanless. “Over the last decade, social networks have proven over and over that they won’t do enough.”

Wanless said it was not enough to wait until “the next tragedy strikes”, urging the government to act without delay and impose a duty of care on social networks, with tough punishments for those who fail to protect their young users.

Others said Facebook had consistently fallen short on self-harm and suicide across its online empire. “The company has failed to prioritise preventing self-harm,” said Jennifer Grygiel, a social media expert and assistant professor of communications at Syracuse University.

“At-risk individuals will not be safe until Facebook takes its role as a global corporation and communications platform more seriously. These changes should have been made years ago.”

Before the meeting, Hancock said: “Social media companies need to do more, in particular, to remove material that encourages suicide and self-harm, so I’m going to be asking other social media companies to act.

“I don’t want people to go on to social media and search for images about suicide to get directed to yet more of that sort of imagery. They need help to not post more about suicide.”

There have been longstanding concerns over how Instagram and other social networks handle content that could harm the mental health of their users, particularly young people, but the issue became urgent after Molly’s father said in an interview that Instagram had “helped kill my daughter”.

Mosseri accepted that the move was overdue. Asked in an interview with the Daily Telegraph why Instagram had taken so long to tackle the issue, he said: “We have not been as focused as we should have been on the effects of graphic imagery on anyone looking at content.

“That is something that we are looking to correct and correct quickly. It’s unfortunate it took the last few weeks for us to realise that. It’s now our responsibility to address that issue as quickly as we can.”

Speaking on BBC Radio 4’s PM programme, the digital minister, Margot James, said the government would “have to keep the situation very closely under review to make sure that these commitments are made real – and as swiftly as possible”.

Mosseri said some self-harm images would be allowed to remain on Instagram. “I might have an image of a scar and say, ‘I’m 30 days clean,’ and that’s an important way to tell my story,” he said.

“That kind of content can still live on the site but the next change is that it won’t show up in any recommendation services so it will be harder to find.”

Instagram’s decision comes as large social media companies such as Facebook, which owns Instagram, prepare to do battle with the British government over the future of internet regulation.

The government is considering imposing a mandatory code of conduct on tech companies, which could be accompanied by fines for non-compliance, prompting a substantial behind-the-scenes lobbying campaign by social media sites.

The culture secretary, Jeremy Wright, is due to unveil the government’s proposals at the end of this month, a deadline that has helped spur Facebook into swift action.

  • In the UK, Samaritans can be contacted on 116 123 or email jo@samaritans.org. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international suicide helplines can be found at www.befrienders.org.

Contributors

Sarah Marsh and Jim Waterson

