Meta taskforce to combat trade of child sex abuse materials after damning report

The Stanford Internet Observatory documented how a web of social media accounts advertises and distributes abuse materials

Mark Zuckerberg’s Meta has set up a taskforce to investigate claims that Instagram is hosting the distribution and sale of self-generated child sexual abuse material, with the platform’s algorithms helping advertise illicit content.

The move by the Facebook parent comes after a report from the Stanford Internet Observatory (SIO) that found a web of social media accounts that appear to be operated by minors and openly advertise self-generated child sexual abuse material (SG-CSAM).

The SIO said that Instagram is “currently the most important platform for these networks”, with features such as recommendation algorithms and direct messaging helping to connect buyers and sellers of SG-CSAM.

The SIO said it acted on a tip from the Wall Street Journal, which detailed Instagram’s SG-CSAM problems, along with the SIO’s findings, in an investigation published on Wednesday.

The SIO reported that Instagram allowed users to search for terms that its own algorithms knew could be linked to SG-CSAM, showing a pop-up screen that warned “these results may contain images of child sexual abuse” but gave users the option to “see results anyway”. Instagram removed that option after being contacted by the Journal.

In a statement, a Meta spokesperson said the company had set up an internal taskforce to deal with the claims in the reports.

“We’re continuously exploring ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them,” said the spokesperson.

The SIO report follows a Guardian investigation in April that revealed how Meta is failing to report or detect the use of Facebook and Instagram for child trafficking. In response to the Guardian’s allegations at the time, a Meta spokesperson said: “The exploitation of children is a horrific crime – we don’t allow it and we work aggressively to fight it on and off our platforms.”

The SIO said its investigation found that large networks of social media accounts are openly advertising self-generated child sexual abuse material. It said Instagram’s popularity and “user-friendly interface” made it a preferred option among platforms.

“The platform’s recommendation algorithms effectively advertise SG-CSAM: these algorithms analyze user behaviours and content consumption to suggest related content and accounts to follow,” said the SIO.

The report said SG-CSAM may initially be shared voluntarily but can then circulate widely in public. It can also overlap with non-consensual intimate imagery, also referred to as “revenge porn”, and minors can be coerced into producing sexual content. The SIO added that in recent years SG-CSAM has increasingly become a commercial venture, including the posting of content “menus” online.

Researchers said they examined one network in particular, identifying 405 accounts advertising the sale of SG-CSAM on Instagram and 128 seller accounts on Twitter. They said 58 accounts within the Instagram follower network appeared to be content buyers. The accounts were referred to the National Center for Missing and Exploited Children (NCMEC), which processes reports of online child sexual exploitation from US tech platforms. The SIO report said that one month after they were reported to NCMEC, 31 of the Instagram seller accounts were still active, along with 28 of the likely buyer accounts. On Twitter, 22 of the 128 accounts identified in the report were still active. Twitter has been contacted for comment.

Meta said it had already addressed some of the investigation’s findings, saying in a statement that it had fixed a technical issue that prevented reports of SG-CSAM from reaching content reviewers and had updated guidance for those reviewers on identifying and removing predatory accounts. The Journal reported that an anti-paedophile activist was told by Instagram that an image of a scantily clad girl with a graphically sexual caption “does not go against our Community Guidelines”, and was advised to hide the account to avoid seeing its content.

Meta said in its statement it had also removed “thousands” of SG-CSAM-related search terms and hashtags on Instagram after researchers at the SIO found that paedophiles were searching under terms such as #pedobait and variations on #mnsfw (“minor not safe for work”).

Meta added that between 2020 and 2022 it dismantled 27 abusive networks, and that in January this year it disabled more than 490,000 accounts for violating its child safety policies.

The SIO report said industry-wide action was needed to tackle the problem.

Dan Milmo Global technology editor
