Social networks struggle to crack down on ‘incel’ movement

Analysis: despite action from large platforms and volunteer moderators, communities remain influential online

Despite years of strict moderation from the main social networks, the “incel” community remains as influential as it was in 2014, when a British-born 22-year-old killed six people on the streets of Isla Vista, California, motivated by his hatred of women.

The murders were an eerie parallel of the shootings in Plymouth last week. Both killers were radicalised on social media, where they posted extensively about their hatred of women and their feelings of despair over their lack of sexual activity.

But in the years since 2014, all the main social networks have acted against the movement. Reddit, which was once home to some of the largest incel communities on the internet, has spent much of the past two years enforcing policies that had previously been only loosely applied.

Subreddits such as r/incels and r/theblackpill have been banned for violating “sitewide rules regarding violent content”. The latter was a gathering point for users who described themselves as “blackpilled”, adherents of a philosophy loosely linked to the incel movement whose followers believe they have been awakened to the true miseries of modern life.

In other communities that could easily cross the line into violent extremism, volunteer moderators work hard to keep the conversation from veering into dark places. The Forever Alone subreddit, for instance, is “a place where people who have been alone most of their lives could come and talk about their issues”. Its 10 volunteer moderators do not work for Reddit, but enforce a set of rules that includes “be polite, friendly and welcoming” and a strict ban on “any incel references, slang or inference”.

The Reddit account of the Plymouth shooter was suspended on Wednesday, just hours before the attack, again for breaking the site’s content policy. A Reddit spokesperson said: “We take these matters very seriously. Our investigation is ongoing.”

Other platforms were slower to act. YouTube, where the shooter had an account and regularly posted vlog-style videos, took down his account on Friday, citing the platform’s “offline behaviour” policy. That policy is also relatively new: as recently as 2019, YouTube was criticised for not taking down content from users such as Tommy Robinson, who were careful to post only videos that stayed within the platform’s rules, even as their broader behaviour went far beyond what the service would allow.

“Our hearts go out to those affected by this terrible incident,” a YouTube spokesperson said. “We have strict policies to ensure our platform is not used to incite violence. In addition, we also have longstanding policies that prohibit those responsible for attacks like these from having a YouTube channel and have since terminated their channel from our platform.”

On Facebook, the incel movement isn’t banned outright. Only a small handful of designated “hateful ideologies”, including white supremacy and Nazism, are subject to such a ban. Many more movements are banned as designated “hateful organisations”, but that restriction does not apply to the leaderless incel movement. Instead, the site’s broader limitations on hate speech largely apply: content promoting hate on the basis of someone’s sex or gender is banned, as is any content promoting violence.

Despite action from large social networks, the incel community remains influential online. Sites with loose or nonexistent moderation policies, such as 4chan and 8kun, have sizeable cohorts, and smaller, dedicated forums are able to set their own moderation policies.

Alex Hern Technology editor
