Will UK’s online safety bill protect children from adult material?

Legislation puts duty of care on tech firms to protect under-18s but does not mandate use of specific age-checking technology

The online safety bill, due to become law this year, imposes a duty of care on tech companies to protect children from harmful content. However, campaigners and peers are calling for the legislation’s provisions on pornography to be toughened. Here is what the act proposes to do about adult material.

Will the online safety bill prevent children from accessing pornography?

The bill requires all pornography websites, such as Pornhub, to ensure children do not encounter their content. This will require age-checking measures. The legislation refers to stringent age verification – checking a user’s age via government ID or an authoritative data source such as a person’s bank – as a means of doing so. Breaches of the act carry the threat of a fine of up to 10% of a company’s global turnover or, in extreme cases, of a website being blocked altogether.

What are the rules now?

MPs have described the legal approach to pornography in the UK as a “loose patchwork” comprising more than a dozen laws. Under the Obscene Publications Act it is a criminal offence to publish work deemed “obscene”, and under the Criminal Justice and Immigration Act it is illegal to possess an “extreme” pornographic image. It is also an offence to make, possess or distribute indecent images of a child.

The primary regulator of legal pornography offline is the British Board of Film Classification, which gives pornography age ratings – R18 for the most extreme but legal content, or 18 – but has no control over online content.

Ofcom, the communications watchdog, already has the power to regulate UK-based “video-sharing platforms” such as TikTok, Snapchat and OnlyFans. These platforms are required to protect under-18s from videos containing R18 material such as pornography.

The age appropriate design code was introduced in 2021 and is designed to prevent websites and apps from misusing children’s data. Under its terms, social media platforms would be breaching the code if their algorithms served adult material to under-18s.

How will pornography websites prevent children from accessing adult material?

Age verification has been a troublesome issue for the government. Age-checking for pornography was announced as a Conservative policy in 2015. However, plans to introduce a nationwide age verification system for online pornography were abandoned in 2019.

The bill will not mandate the use of specific technologies for age checking, although Ofcom will issue codes of practice on age assurance, the umbrella term for assessing the age of people online. Age verification is the term for the toughest measures, such as requiring proof of official ID.

One solution is to use age verification companies that vet a user’s age – via a range of methods including checking official ID or bank statements – and then notify the porn provider that the person wishing to access its service, who remains anonymous to the provider, is over 18.

Ofcom has said it will launch a consultation on protecting children from pornographic content – including on user-generated platforms such as OnlyFans – in the autumn.

Will children be protected from adult material on mainstream social media platforms?

The government has indicated that there will be clear instructions to mainstream social media sites and search engines to prevent children accessing pornographic content on their services. The bill requires sites to prevent children encountering what it terms “primary priority content”. Because it qualifies as a “user-to-user” service, subscription site OnlyFans is also covered by this part of the bill.

We will not know officially what counts as primary priority content until it is defined in a statutory instrument published after the bill becomes law. However, pornography is expected to be on that list: it was named as primary priority content by the previous culture secretary, Nadine Dorries, in a parliamentary statement last year. According to a timeline published by Ofcom, though, it could be more than 18 months after the bill is passed before these provisions come into effect.

Social media sites and legal pornography sites will also be required to shield all users from illegal pornography such as obscene content and child sexual abuse material.

What does the bill do about non-consensual image sharing?

The bill will update the law on sharing intimate images without someone’s consent. In England and Wales there will be a new “base offence”, under which it is an offence to share an intimate image of a person who has not consented and whom the perpetrator does not believe to have consented. Currently, such offences apply only if the image is shared in order to cause humiliation or distress.

The base offence will apply regardless of motivation, including sharing an image as a joke, for social status, for financial gain or “where there is no motivation at all”.

Contributor

Dan Milmo Global technology editor


Related Content

Minister refuses to rule out changes to UK online safety bill
Social media bosses who breach child safety rules may face jail if Ofcom given powers to prosecute
Kiran Stacey and Dan Milmo
13 Jan 2023, 9:59am

Porn sites are not doing enough to protect children, warns Ofcom
Regulator publishes first report into video-sharing platforms and says few put child safety before profits
Alex Hern
20 Oct 2022, 11:29am

Online harms bill: firms may face multibillion-pound fines for illegal content
Government sets out strict guidelines to govern removal of material promoting child sexual abuse and terrorism
Alex Hern
15 Dec 2020, 8:17am

Government criticised over renewed delay to online safety bill
Internet safety groups say withdrawal of proposed legislation from next week’s Commons schedule leaves children at continued risk
Alex Hern UK technology editor
27 Oct 2022, 5:25pm

Millions of porn videos will not be blocked by UK online age checks
Clause means children will be able to view content on social media and image-sharing sites
Damien Gayle
18 Oct 2018, 12:48pm

Online safety bill must protect adults from self-harm content, say charities
Samaritans among those calling for people of ‘all ages’ to be safeguarded from suicide and self-harm material
Dan Milmo Global technology editor
14 Oct 2022, 5:00am

Tumblr to ban all adult content
Microblogging site says move reflects responsibilities to different age groups
Jim Waterson Media editor
3 Dec 2018, 7:38pm

Fine tech companies that fail to protect children, Labour says
Party would establish new standalone internet regulator if it came to power in next election
Jim Waterson Media editor
6 Feb 2019, 12:01am

Adult online age used by third of eight- to 17-year-old social media users
Ofcom study covers Facebook, TikTok, Instagram, Snapchat, Twitter and YouTube, all of which have age limits of 13
Dan Milmo Global technology editor
10 Oct 2022, 11:01pm

Changes to online safety bill tread line between safety and appearing ‘woke’
Ministers drop ‘harmful communications’ offence with some arguing it was ‘legislating for hurt feelings’
Alex Hern
29 Nov 2022, 5:13pm