Instagram bans 'graphic' self-harm images after Molly Russell's death

Social media site announces action following criticism from British teenager’s father

Instagram has announced that it will ban all graphic self-harm images as part of a series of changes made in response to the death of British teenager Molly Russell.

The photo-sharing platform made the decision – which critics said was necessary but long overdue – in response to a tide of public anger over the suicide of the 14-year-old girl, whose Instagram account contained distressing material about depression and suicide.

After days of growing pressure on Instagram culminated in a meeting with the health secretary, Matt Hancock, the social network’s head, Adam Mosseri, admitted that the company had not done enough and said that explicit imagery of self-harm would no longer be allowed on the site.

“We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable,” Mosseri said. “We will get better and we are committed to finding and removing this content at scale.”

Molly’s father, Ian Russell, said he believed Instagram was partly to blame for her death. The family found material relating to depression and suicide when they looked at her account after she died.

Instagram announced a range of further measures, including the removal of non-graphic images of self-harm from the most visible parts of its app and website, which appeared designed to draw a line under what has become a reputational crisis for the brand and its parent company Facebook.

But critics said the changes should have been made long ago and remained sceptical that they would be enough to tackle a problem that some said had grown unchecked for a decade.

The NSPCC said Instagram had taken “an important step”, but that social networks were still falling short and that legislation would be necessary.

“It should never have taken the death of Molly Russell for Instagram to act,” said chief executive Peter Wanless. “Over the last decade, social networks have proven over and over that they won’t do enough.”

Wanless said it was not enough to wait until “the next tragedy strikes”, urging the government to act without delay and impose a duty of care on social networks, with tough punishments for those who fail to protect their young users.

Others said Facebook had consistently fallen short on self-harm and suicide across its online empire. “The company has failed to prioritise preventing self-harm,” said Jennifer Grygiel, a social media expert and assistant professor of communications at Syracuse University.

“At-risk individuals will not be safe until Facebook takes its role as a global corporation and communications platform more seriously. These changes should have been made years ago.”

Before the meeting, Hancock said: “Social media companies need to do more, in particular, to remove material that encourages suicide and self-harm, so I’m going to be asking other social media companies to act.

“I don’t want people to go on to social media and search for images about suicide to get directed to yet more of that sort of imagery. They need help to not post more about suicide.”

There have been longstanding concerns over how Instagram and other social networks handle content that could be harmful to the mental health of their users, particularly young people, but the issue became urgent after Molly’s father said in an interview that Instagram had “helped kill my daughter”.

Mosseri accepted that the move was overdue. Asked in an interview with the Daily Telegraph why Instagram had taken so long to tackle the issue, he said: “We have not been as focused as we should have been on the effects of graphic imagery on anyone looking at content.

“That is something that we are looking to correct and correct quickly. It’s unfortunate it took the last few weeks for us to realise that. It’s now our responsibility to address that issue as quickly as we can.”

Speaking on BBC Radio 4’s PM programme, the digital minister, Margot James, said the government would “have to keep the situation very closely under review to make sure that these commitments are made real – and as swiftly as possible”.

Mosseri said some self-harm images would be allowed to remain on Instagram. “I might have an image of a scar and say, ‘I’m 30 days clean,’ and that’s an important way to tell my story,” he said.

“That kind of content can still live on the site but the next change is that it won’t show up in any recommendation services so it will be harder to find.”

Instagram’s decision comes as large social media companies such as Facebook, which owns Instagram, prepare to battle with the British government over the future of internet regulation in the UK.

The government is considering imposing a mandatory code of conduct on tech companies, which could be accompanied by fines for non-compliance, prompting a substantial behind-the-scenes lobbying campaign by social media sites.

The culture secretary, Jeremy Wright, is due to unveil the government’s proposals at the end of this month, helping to spur Facebook into swift action.

  • In the UK, Samaritans can be contacted on 116 123 or email jo@samaritans.org. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international suicide helplines can be found at www.befrienders.org.

Contributors

Sarah Marsh and Jim Waterson

