An alliance of the world’s most powerful law enforcement agencies, including the FBI, Interpol and Britain’s National Crime Agency (NCA), has condemned Meta’s plans to encrypt direct messages on Facebook Messenger and Instagram, saying that doing so will weaken the ability to keep child users safe.
The Virtual Global Taskforce, made up of 15 agencies, is chaired by the NCA and also includes Europol and the Australian Federal Police among its membership. The VGT says it has spoken out owing to the “impending design choices” by Meta, which it believes could cause serious harm.
The decision to encrypt direct messages on the platforms, which would prevent anyone other than the intended recipient from reading the communications, “is an example of a purposeful design choice that degrades safety systems and weakens the ability to keep child users safe”, the alliance said.
The VGT praised Meta’s work with the American National Center for Missing and Exploited Children (NCMEC), which acts as a clearing house for reports of online child sexual abuse.
Meta reported more such cases to the NCMEC than any other provider, the alliance said, but it feared that lead would disappear once the messages were “end-to-end encrypted” (E2EE). “The VGT has not yet seen any indication from Meta that any new safety systems implemented post-E2EE will effectively match or improve their current detection methods,” it said.
Citing the case of David Wilson, who was jailed in 2021 for abusing 52 children, the VGT predicted similar arrests could prove impossible with encryption enabled. “The successful prosecution of Wilson and the resulting safeguarding of hundreds of children was possible because law enforcement were able to access the evidence contained within over 250,000 messages through Facebook. In an E2EE environment, it is highly unlikely this case would have been detected.”
In a statement, a Meta spokesperson said: “The overwhelming majority of Brits already rely on apps that use encryption. We don’t think people want us reading their private messages, so have developed safety measures that prevent, detect and allow us to take action against this heinous abuse, while maintaining online privacy and security.
“As we continue to roll out our end-to-end encryption plans, we remain committed to working with law enforcement and child safety experts to ensure that our platforms are safe for young people.
“In the case of David Wilson, we submitted Cybertips to authorities using both public and private information,” the spokesperson added. “We have developed detection systems using behavioural signals and other account activity that are not reliant on the content of private messages to identify malicious actors like David Wilson. It’s misleading and inaccurate to say that encryption would have prevented us from identifying and reporting accounts like David Wilson’s to the authorities.”
Plans to encrypt all messages on Meta’s platforms were first revealed in 2019, when the company was still called Facebook, but they have been delayed a number of times because of technical hurdles and regulatory pressure. In April 2021, the company announced the change would not happen until 2022 at the earliest, before pushing the deadline back to 2023 in November 2021.
“We’re taking our time to get this right,” Meta’s head of safety, Antigone Davis, said when the latest delay was announced. “As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.”
But in the years since, Meta has given few details on how the push to encrypt communications by default would avoid harsh trade-offs against its existing child safety practices. Davis said Meta would be able to detect abuse by using non-encrypted data, account information and reports from users.
Meta’s third messaging platform, WhatsApp, already uses a similar approach to tackle child sexual abuse but makes far fewer referrals to NCMEC as a result. In March 2022, the non-profit released data showing Facebook made 22m reports of online exploitation of children in 2021, while WhatsApp made just 1.3m. Other encrypted platforms with similar user bases made even fewer: Apple, which runs iMessage, reported just 160 cases.
The VGT’s intervention comes two days after WhatsApp joined forces with other encrypted messaging apps, including its rival Signal, to call on the UK government to protect E2EE in the online safety bill. The services hinted they might be forced to leave the UK, rather than obey the new law, if it in effect criminalised encryption.