Angry about Facebook censorship? Wait until you hear about the news feed

Peeved about Facebook’s curation of trending topics? Its news feed is reinventing censorship for a technological age, and humans need not apply

Bad news: Facebook is censoring the internet every day, warping your understanding of the world around you to benefit its corporate interests, and fundamentally changing the media landscape in a potentially apocalyptic fashion.

Good news: that has little to nothing to do with the fact that the human curators of its trending topics feature are a bit sniffy about linking to Breitbart News.

The most surprising thing about Facebook’s trending stories isn’t that the human editors behind them occasionally exercise their own judgement over which stories they do or don’t link to; it’s that even with humans working directly on the feature, it’s still awful. The output is so bad that I and many others assumed it must be entirely algorithmic: how else would you end up with bizarre gnomic statements like this, taken verbatim from the “science and technology” section of my feed today:

“Quebec, Canada: Parts of Province Experience Snowfall, Report Says”

But no. Facebook has an entire team of writers working on these statements, according to a report from Gizmodo, and those writers are apparently encouraged to focus on mainstream news sites such as the BBC and CNN over fringe right-wing outlets like Breitbart or Newsmax.

Facebook, for its part, denies censoring trending topics, saying that it wouldn’t even be “technically feasible” to do what the whistleblower alleged. That hasn’t stopped outrage at the report reaching the highest levels: Republican Senator John Thune spoke out on Tuesday, asking the site to explain itself.

If Thune is outraged about stories not appearing on trending topics – a small sidebar which has an unclear influence on web traffic, fails to shape discussion, and is buried on mobile devices – wait until he finds out about the news feed.

The jewel in Facebook’s crown is a hotbed of censorship. Don’t believe me? Try to post a picture of Aboriginal women in traditional dress – that is, topless – on your Facebook feed and see how long it lasts.

The company’s moderation team is notorious for its heavy-handed approach to topics like nudity, even as it also gets slated by governments worldwide for not removing and reporting content glorifying terrorism rapidly enough.

But being a community moderator at Facebook is a thankless task. The work, often outsourced to companies like Manila-based contractor TaskUs, is performed with little remuneration or training. And even the best-paid, highly skilled employees would struggle to extract a consistent plan of action from Facebook’s vague attempts at drawing up community standards.

Mark Zuckerberg talking in front of a chart showing the history of Facebook features such as the news feed and the like button.
Facebook’s news feed was born in 2006, when the social network was growing into a global phenomenon. Photograph: Justin Sullivan/Getty Images

Say what you like about moderation, though: at least you can see it happening. What seems so disturbing about the alteration of trending topics is that the sites which were kept off the list had no way of knowing that they had even had a chance. There are no reports feeding back why a curator decided to include or leave out a story. It’s an opaque system.

Except, of course, that we can speak to former curators of trending topics to find out what they did and didn’t post.

With the news feed, there’s no such luck. The algorithm that drives it makes just as many editorial choices as the trending topic curators, but you can’t interview it to ask why. It will never be fired and decide to speak out about its decisions under the cloak of anonymity. Instead, it just sits there, day in day out, totally dictating the content seen by more than a billion users of the biggest social network in the world.

Perhaps because of that, the majority of Facebook users don’t even realise that the news feed is edited at all. A 2015 study suggested that more than 60% of Facebook users are entirely unaware of any algorithmic curation of the news feed: “They believed every single story from their friends and followed pages appeared in their news feed”, the authors wrote.

The news feed algorithm takes in so many signals when deciding what should be promoted and what should be buried that it is likely no one person at Facebook could list them all. But we know some choices the algorithm makes: it promotes live video as much as possible, and pre-recorded video almost as heavily – although in both cases, only if the video is delivered through Facebook’s own platform.

It pushes articles that you spend a long time reading, as well as links posted by your closest friends, ahead of everything else. If you run a business page on the site, it will show your posts to only a tiny fraction of the people who have subscribed, and then ask for cash to show them to anyone else.
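To make that opacity concrete, here is a minimal, entirely hypothetical sketch of how signals like these might be combined into a single ranking score. The field names, weights and the score function are invented for illustration only; Facebook has never published its news feed code, and the real system involves far more signals than this.

```python
# Illustrative only: a made-up scoring function combining the kinds of
# signals described above. None of these weights come from Facebook.

from dataclasses import dataclass

@dataclass
class Post:
    is_live_video: bool           # live video gets the biggest boost
    is_native_video: bool         # pre-recorded video hosted on the platform
    expected_read_seconds: float  # predicted time the viewer will spend
    friend_closeness: float       # 0.0 (stranger) to 1.0 (closest friend)
    from_business_page: bool      # unpaid page posts are heavily throttled

def score(post: Post) -> float:
    """Combine hypothetical signals into one ranking score."""
    s = 0.0
    if post.is_live_video:
        s += 5.0
    elif post.is_native_video:
        s += 3.0
    s += min(post.expected_read_seconds, 120) / 30.0  # reward longer reads, capped
    s += 4.0 * post.friend_closeness                  # favour close friends
    if post.from_business_page:
        s *= 0.1                                      # throttle unpaid page reach
    return s

if __name__ == "__main__":
    posts = [
        Post(True, False, 0, 0.2, False),    # an acquaintance's live video
        Post(False, False, 90, 0.9, False),  # long article shared by a close friend
        Post(False, True, 45, 0.1, True),    # unpaid video from a business page
    ]
    for p in sorted(posts, key=score, reverse=True):
        print(round(score(p), 2), p)
```

Every number in that sketch is an editorial choice dressed up as arithmetic, which is precisely the point the next paragraphs make.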

These decisions don’t feel outrageous, because Facebook sells them under a veneer of neutrality. Articles with a longer read time aren’t promoted, the company says, because it made an editorial decision that you shouldn’t read short pieces; instead it’s because “the time people choose to spend reading or watching content they clicked on from news feed is an important signal that the story was interesting to them”. And so Facebook promotes stories with a high read time, because it wants the news feed to be full of “interesting” stories.

You could, of course, argue that the decision to focus on interesting stories, as opposed to important, or pleasing, or humorous ones is itself an editorial decision.

But that argument probably wouldn’t be very interesting. So no one would read it, because it wouldn’t show up on Facebook. Oh well.

Alex Hern
