What powers will Ofcom have to regulate the internet?

Watchdog to oversee two areas: illegal content and harmful content

The government is to appoint Ofcom as an internet watchdog, giving it the ability to fine social media companies that do not protect users from harmful content. The culture secretary, Nicky Morgan, and the home secretary, Priti Patel, said Ofcom’s existing position as broadcasting regulator made it suitable to enforce rules to keep the internet safe.

What has happened today?

The government has responded to a consultation it launched last year about its plans to regulate the internet. The response fleshes out some of the proposals contained in the initial online harms white paper, most importantly by naming Ofcom as the regulator that would be given power over the net.

What will Ofcom have the power to do?

It will oversee two areas: illegal content and harmful content. For the former, it will make sure companies quickly take down illegal content – with a particular focus on terrorism and child abuse imagery – and prevent much of it being posted in the first place.

For the latter, Ofcom will primarily make sure social networks enforce their own terms and conditions. That means that if a social network says, for instance, that material promoting self-harm is banned, it will be required to take action to enforce that.

Why is the government censoring the internet?

The government argues that the two areas it covers suffer from a lack of regulation. For illegal content, social networks currently face an all-or-nothing approach to liability, where they are free from all penalties provided they are not seen to be actively supporting the content. The government wants the ability to use penalties to encourage speedy enforcement, and discourage companies from deliberately turning a blind eye to their own platforms.

For “harmful but not illegal content”, the government says it needs to act to protect children online, and wants to create a legal duty of care on the part of social networks to ensure they face penalties for harms their platforms cause.

What will this mean for me?

In the short term, little. As with the implementation of GDPR, the proposals are likely to lead to a flurry of small changes on the day the law comes into force, with the more meaningful changes taking years to work through regulators and courts.

Social networks have warned that anything that imposes requirements on them to take down content quickly runs the risk of encouraging them to remove false positives – material that is not actually infringing, but looks like it might be close. That means users of sites such as Instagram or YouTube might notice the platforms start becoming more censorious.

What if I run a website?

The government is keen to emphasise that the proposals only cover sites that allow user-generated content, and it estimates fewer than 5% of businesses will fall under the regulation. But those rules will cover sites big and small, and critics argue that some of the smaller companies may struggle to keep up with the requirements.

“The proposals ask British startups to police legal content and then find themselves punished when they make mistakes,” said Dom Hallas, the executive director of startup trade body Coadec. “It is a confusing minefield that can only benefit big companies with the resources and armies of lawyers to comply.”

But the government says any enforcement will be proportional to the size of the company, with a few other factors, such as the number of children who are users, taken into account.

How will sites know who is a child?

That is an open question. The government is leaning on an upcoming regulation from the Information Commissioner’s Office, the age-appropriate design code, to argue it is up to sites to work out whether they have younger users, and if so, to protect them from harmful content accordingly. But it also says it is examining the prospect of tools such as “age assurance” technology, initially developed for its abortive attempts to force purveyors of adult content to impose their own age gates online.

What are the penalties for failure?

We do not know, but the government says they will be “fair, proportionate and transparent”. The white paper suggested individual executives could be held to account for failures. The response says it is “essential that company executives are sufficiently incentivised to take online safety seriously”, but the government has not set out any specific policies and will not do so until it finalises its response in the spring.

Contributor

Alex Hern

The Guardian
