Facebook considering ending restrictions on Covid misinformation

Social network turning to its ‘supreme court’ about decision, says head of global affairs, Nick Clegg

Facebook is turning to its “supreme court” to decide whether to end restrictions on Covid misinformation, more than two years after the company first started to take special action on posts promoting falsehoods about the disease.

The social network is considering changing the way it deals with such misinformation by, for example, labelling it as false or demoting it in algorithmic ranking, rather than simply removing it from the site. It wants to make the change now, according to its head of global affairs, Nick Clegg, “as many, though not all, countries around the world seek to return to more normal life”.

But in order to avoid making the wrong choice when “resolving the inherent tensions between free expression and safety”, Facebook will turn to its oversight board, the arms-length self-regulator set up in May 2020, to decide on what the future moderation policy should be.

“We are requesting an advisory opinion from the oversight board on whether Meta’s current measures to address Covid-19 misinformation under our harmful health misinformation policy continue to be appropriate,” Clegg said, “or whether we should address this misinformation through other means, like labeling or demoting it either directly or through our third-party fact-checking program.”

By requesting an opinion, Facebook is not committing to honour the judgment issued by the board, prompting some to question whether the site is simply seeking cover for a decision likely to be broadly unpopular with a large section of society whichever way it chooses.

In a statement, the oversight board said it had accepted Facebook’s request. “Meta [Facebook’s parent company] must send the board’s recommendations through its official policy development process and give regular updates on this, including through its newsroom,” the board added.

“While the board’s policy advisory opinion is not binding, Meta must provide a public response and follow-on actions within 60 days of receiving our recommendations.”

In its detailed request for an opinion, Meta said that the increased availability of authoritative guidance means that misinformation is less likely to be posted into an “information vacuum”. It also argued that the development of vaccines and therapeutic treatments, as well as the evolution of variants such as the Omicron strain, means that Covid-19 is less deadly than it used to be, and said that “public health authorities are actively evaluating whether Covid-19 has evolved to a less severe state”.

“Meta remains committed to combating Covid-19 misinformation and providing people with reliable information,” Clegg said in a statement. “As the pandemic has evolved, the time is right for us to seek input from the oversight board about our measures to address Covid-19 misinformation, including whether those introduced in the early days of an extraordinary global crisis remain the right approach for the months and years ahead.”
The oversight board is funded by Facebook, and its first four members, all of whom hold the title co-chair, were selected by the social network. The co-chairs and Facebook then picked the initial board of 20, which includes the former Guardian editor Alan Rusbridger, and over time intend to expand it to a board of 40, at which point Facebook says it will cease to be involved in selecting members. After more than two years, however, the board still numbers only 23 people.

Despite the close links, the board has had some clashes with Facebook. In 2021, it rejected an attempt by the social network to force it to decide – and take the political fallout over – whether to permanently block Donald Trump from the site, and later that same year it also overruled Facebook’s attempt to prevent it from passing judgment on a particular case, in an assertion of its own authority likened by observers to the US supreme court’s historic 1803 ruling in Marbury v Madison.


Alex Hern Technology editor