Millions of sexually explicit videos will still be available online to children in the UK after new age verification rules come into force, due to a “commercial basis” clause that exempts social media and image-sharing websites.
Age verification (AV) regulations presented to parliament by the Department for Culture, Media and Sport (DCMS) last week do not cover websites where pornographic material makes up less than a third of the content and is provided free of charge.
It means blogging, social media and image-sharing services such as Imgur, Tumblr, Twitter and Reddit, which host vast quantities of pornographic content, will continue to be accessible without any age checks.
It comes after the government earlier this year delayed implementation of age verification for online porn – a key 2015 Conservative manifesto commitment that was made law by last year’s Digital Economy Act. The act requires sites serving pornographic content to British users from anywhere in the world to implement AV for users, or potentially be blocked by UK internet service providers.
Twitter and other social media had been identified as a particular problem for the effective implementation of the law.
According to the Online Pornography (Commercial Basis) Regulations, published by the DCMS last week, sites only have to implement AV measures if they are making pornography available on a commercial basis.
This does not apply if the material is available for free and “where it is reasonable for the age-verification regulator to assume that pornographic material makes up less than one third of the content of the material made available.”
Parliament is yet to set a date to debate the regulations, but they have already been criticised by groups campaigning both for and against AV.
The End Violence Against Women coalition said pornography encouraged young people to view women as sex objects and normalised harmful attitudes and behaviours. Its co-director, Rachel Krys, said: “We will be extremely disappointed if the government now fails to meet its commitment to restrict access to pornography.
“Wherever a child is online they should not be exposed to pornographic content and the government must ensure the porn industry and social media companies do everything they can to protect children.”
Jim Killock, the executive director of Open Rights Group, which has opposed the plans, called the definition “inconsistent and bizarre”. He said: “They haven’t explained how they define percentage. Are they going to count the number of pixels on the screen? What’s a third? It’s very wide and arbitrary and very difficult to define.
“It shows the underlying problem. They are not trying to catch everything, they are not trying to remove all porn, only some; and when that’s combined with fines and blocking of content – and therefore targeting users as well as companies – you end up with a policy that’s pretty incoherent.”
The development raises the prospect that the government will look for new ways to regulate online content. DCMS and the Home Office are already working on an internet safety white paper, to be published this winter, which will set out a number of legislative and non-legislative measures.
The NSPCC, which worked closely with the government in drafting the new rules, said it was calling for “robust regulation and fines” for social media sites that fail to block children from inappropriate content.
Andy Burrows, the charity’s associate head of child safety online, said: “The UK is going further than any other country to shield children from pornography. But we can’t be complacent, because the new regulation won’t be the silver bullet in blocking all online porn, for example on social networks like Twitter and Tumblr where porn is readily available but makes up less than 30% of their content.
“There will be a government review 18 months after the laws kick in on whether regulation needs to be tougher.”
A DCMS source said it had always been clear that AV was not a panacea, that it was focused on commercial pornographic websites, and that the government expected social media platforms to enforce their own terms and conditions to protect children.
A spokesman said: “Introducing age verification for commercial pornographic sites is a major step forward to protect children from easily accessible pornography. Our proposals mean that any websites marketing themselves as pornographic or for which pornography is a significant commercial driver will require age verification.”