Social media firms must face heavy fines over extremist content – MPs

An inquiry by the Commons home affairs committee condemns technology companies for failing to tackle hate speech

Social media companies are putting profit before safety and should face fines of tens of millions of pounds for failing to remove extremist and hate crime material promptly from their websites, MPs have said.

The largest and richest technology firms are “shamefully far” from taking action to tackle illegal and dangerous content, according to a report by the Commons home affairs committee.

The inquiry, launched last year following the murder of the Labour MP Jo Cox by a far-right gunman, concludes that social media multinationals are more concerned with commercial risks than public protection. Swift action is taken to remove content found to infringe copyright rules, the MPs note, but a “laissez-faire” approach is adopted when it involves hateful or illegal content.

Referring to Google’s failure to prevent paid advertising from reputable companies appearing next to YouTube videos posted by extremists, the committee’s report said: “One of the world’s largest companies has profited from hatred and has allowed itself to be a platform from which extremists have generated revenue.”

In Germany, the report points out, the justice ministry has proposed imposing financial penalties of up to €50m on social media companies that are slow to remove illegal content.

“Social media companies currently face almost no penalties for failing to remove illegal content,” the MPs conclude. “We recommend that the government consult on a system of escalating sanctions, to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.”

During its investigation, the committee found instances of terror recruitment videos for banned jihadi and neo-Nazi groups remaining accessible online even after MPs had complained about them.

Some of the material included antisemitic hate-crime attacks on MPs that had been the subject of a previous committee report. Material encouraging child abuse and sexual images of children was also not removed, despite being reported by journalists.

Social media companies that fail to proactively search for and remove illegal content should pay towards the costs of the police doing so, the report recommends, just as football clubs are obliged to pay for policing in their stadiums and surrounding areas on match days.

The government, the report says, should consider whether failure to remove illegal material is in itself a crime and, if not, how the law should be strengthened. The thrust of the committee’s arguments suggests social media companies need to be treated as though they are traditional publishers.

Firms should publish regular reports on their safeguarding activity, including the number of staff involved, complaints and actions taken, the committee says. It is “completely irresponsible” that social media companies are failing to tackle illegal and dangerous content and to implement even their own community standards, the report adds.

A thorough review is required of the legal framework controlling online hate speech, abuse and extremism to ensure that the law is up to date, the MPs conclude. “What is illegal offline should be illegal – and enforced – online.”

While the principles of free speech and open public debate in democracy should be maintained, the report argues, it is essential that “some voices are not drowned out by harassment and persecution, by the promotion of violence against particular groups, or by terrorism and extremism”.

Yvette Cooper, the Labour MP who chairs the home affairs committee, said: “Social media companies’ failure to deal with illegal and dangerous material online is a disgrace.

“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful.
“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve, yet they are failing to do so. They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe …

“It is blindingly obvious that they have a responsibility to proactively search their platforms for illegal content, particularly when it comes to terrorist organisations.”

Google, the parent company of YouTube, told the inquiry that it planned to extend its “trusted flagger” programme to identify terrorist propaganda and would invest in improving its alert procedures. It said that it had “no interest” in making money from extremist material.

Facebook also told MPs that it is reviewing how it handles violent videos and other objectionable material after a video of a murder in the United States remained on its service for more than two hours.

Google, Facebook and Twitter all refused to tell the committee how many staff they employ to monitor and remove inappropriate content.

Owen Bowcott Legal affairs correspondent
