Social media firms face big UK fines if they fail to stop sexist and racist content

Revised online safety bill proposes fines of 10% of revenue but drops harmful communications offence

Social media platforms that breach pledges to block sexist and racist content face the threat of substantial fines under government changes to the online safety bill announced on Monday.

Under the new approach, social media sites such as Facebook and Twitter must also give users the option of avoiding content that is harmful but does not constitute a criminal offence. This could include racism, misogyny or the glorification of eating disorders.

Ofcom, the communications regulator, will have the power to fine companies up to 10% of global turnover for breaches of the act. Facebook’s parent, Meta, posted revenues of $118bn (£99bn) last year.

A harmful communications offence has, however, been dropped from the legislation after criticism from Conservative MPs that it was legislating for “hurt feelings”.

Ministers have scrapped the provision on regulating “legal but harmful” material – such as offensive content that does not constitute a criminal offence – and are instead requiring platforms to enforce their terms and conditions for users.

If those terms explicitly prohibit content that falls below the threshold of criminality – such as some forms of abuse – Ofcom will then have the power to ensure they police them adequately.

Under another adjustment to the bill, big tech companies must offer people a way of avoiding harmful content on their platform, even if it is legal, through methods that could include content moderation or warning screens. Examples of such material include content that is abusive, or that incites hatred on the basis of race, ethnicity, religion, disability, sex, gender reassignment or sexual orientation.

However, firms will not be able to take down content or ban a user unless the circumstances for doing so are clearly set out in the terms of service. Users will also have to be offered a right of appeal to protect against arbitrary content removal or account bans.

The revival of the much-delayed attempt to rein in tech firms comes as Meta was fined €265m on Monday for a breach of data protection law after the personal details of more than 500 million people were published online.

The bill, which returns to parliament on 5 December after being paused in July, also contains new provisions on protecting children. Overall, the legislation imposes a duty of care on tech firms to shield children from harmful content, but the updated bill now includes provisions such as requiring social media companies to publish assessments of the dangers their sites pose to children. Sites that carry age limits – which for most big social media sites is 13 years old – will have to set out in their terms of service how they enforce them.

The culture secretary, Michelle Donelan, said an unregulated social media industry had “damaged our children for too long”. She added: “I will bring a strengthened online safety bill back to parliament, which will allow parents to see and act on the dangers sites pose to young people. It is also freed from any threat that tech firms or future governments could use the laws as a licence to censor legitimate views.”

The shadow culture secretary, Lucy Powell, said the government had “bowed to vested interests” by scrapping the legal but harmful provision.

“Removing ‘legal but harmful’ gives a free pass to abusers and takes the public for a ride. It is a major weakening, not strengthening, of the bill,” she said.

“The government has bowed to vested interests, over keeping users and consumers safe.”

Changes to the bill have been made in the face of warnings from Conservative MPs and some campaign groups that a prior version would encourage tech firms to be overcensorious and stifle freedom of speech.

Kemi Badenoch, the trade secretary and former Tory leadership contender, criticised the bill in July, stating: “We should not be legislating for hurt feelings.”

Her comments alluded to the harmful communications proposal in the draft bill, which made it an offence to send a message on social media intended to cause “psychological harm, amounting to at least serious distress”. This has now been dropped and the government will no longer repeal parts of two acts – the Malicious Communications Act and the Communications Act – that it was intended to replace.

Other changes to the bill include criminalising the encouragement of self-harm, a change introduced after the inquest into the death of 14-year-old Molly Russell, who died after viewing extensive amounts of harmful material on Instagram and Pinterest in 2017. Under the bill, which applies to all firms that host user-generated content, as well as search engines, tech companies must tackle illegal content such as child sexual abuse images and terrorist material.


Dan Milmo Global technology editor
