Watchdog cracks down on tech firms that fail to protect children

Sites must assess content for sexual abuse and suicide risk or face fines of up to £17m

Technology companies will be required to assess their sites for sexual abuse risks, stop serving self-harm and pro-suicide content, and block children from broadcasting their location, after the publication of new rules for “age-appropriate design” in the sector.

The UK Information Commissioner’s Office, which was tasked with creating regulations to protect children online, will enforce the new rules from autumn 2021, after a one-year transition period. From then on, companies that break the rules will face sanctions comparable to those under the GDPR, including fines of up to £17m or 4% of global turnover.

Companies that make services likely to be accessed by a child will have to take account of 15 principles designed to ensure their services do not cause harm by default. Those include:

  • a requirement to default privacy settings to high, unless there is a compelling reason not to;

  • orders to switch off geolocation by default, and to turn off visible location tracking at the end of every session;

  • a block on using “nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”;

  • a requirement on sites to uphold their stated terms, policies and community standards.

Elizabeth Denham, the information commissioner, said: “Personal data often drives the content that our children are exposed to – what they like, what they search for, when they log on and off and even how they are feeling.

“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children’s privacy must not be traded in the chase for profit.”

Andy Burrows, the NSPCC’s head of child safety online policy, said: “This transformative code will force high-risk social networks to finally take online harm seriously and they will suffer tough consequences if they fail to do so.

“For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content. It is now key that these measures are enforced in a proportionate and targeted way.”

The code is legally backed by a requirement in the Data Protection Act 2018 for the ICO to prepare a code of practice containing guidance “on standards of age-appropriate design of relevant information society services which are likely to be accessed by children”. Earlier drafts had faced criticism over the risk that the code could force the entire internet to be made child-safe, owing to ambiguity over whether any given site is likely to be accessed by children.

In the final version of the code, the ICO says it will take a “commonsense” approach to the question, but notes that “if your service is the kind of service that you would not want children to use in any case, then your focus should be on how you prevent access.

“If your service is not aimed at children but is not inappropriate for them to use either, then your focus should be on assessing how appealing your service will be to them.”

The initial focus of the code is likely to be large social media companies, including YouTube, TikTok and Snapchat, all of which have significant numbers of child users and, until now, few legal restrictions on how those users can be treated.

But the changes have not eased all fears. Dom Hallas, the co-founder of Coadec, the UK’s coalition of tech startups, called the rules a “textbook example of bad regulation that will entrench big companies”.

“The practical impact of the code will be that thousands of tech companies, from e-commerce to maps, have to build multiple versions of the same product with different sets of rules. Startups can’t afford to do this but big tech can.

“Many will say this code is a victory for kids but it will in fact restrict the startup services available to under-18s and create an internet for children designed by tech giants.”

In the US, the Children’s Online Privacy Protection Act requires under-13s to be given special treatment, which led to a $170m Federal Trade Commission settlement with YouTube last year.

Alex Hern, Technology editor
