Early steps to trace and block the trolls who spread fake news | Open door | Paul Chadwick

For those journalists who seek truth and take seriously their role to facilitate democracy, it is imperative to expose deception and masquerading voices

Fakery has long infected public debate in democracies, especially during election campaigns, with effects hard to gauge.

In the UK in 1924, the fake “Zinoviev letter” fuelled a Russia scare different from today’s concerns. It is thought to have greatly harmed the then Labour government’s chances of re-election when the letter was reported as authentic in the Daily Mail just before polling day.

Technology has made the threat of fakery greater today. Widespread disclosures are no longer made only by mass media, whose claims are necessarily public and open to challenge. A kind of darkness now covers deceptions made possible by a mix of data-driven targeting and the social media news feeds of individuals sharing among their personal networks.

For those journalists who seek truth and take seriously their role to facilitate democracy, it is imperative to expose fake news and masquerading voices. Nothing is more potentially destructive of freedom under law than a community losing trust in the information with which public debate is nourished, public choices are made and public accountability is extracted.

Knowledge is gradually building about how Russian trolls have attempted to influence or disrupt public debate beyond Russia, using social media in particular. An important step in controlling the infection was Twitter’s recent disclosure to the US Congress of 2,752 accounts that Twitter has “tied to Russian actors”, as the House permanent select committee on intelligence put it.

Availability of the list of Twitter accounts allows other democratic institutions, especially traditional journalism organisations, to investigate the effects of those accounts, in particular any impact on their own published journalism and on the debates they host on their own platforms.

What, if anything, can be learned about the objectives and tactics of those who create and deploy the fakery?

Last week, the Guardian began answering that question. Using the list, it reported that the Russian “troll army” accounts had been cited more than 80 times across UK media. Two of the accounts have appeared in two different Guardian articles. A report last June about the LGBTI community’s fight back against online abuse necessarily drew heavily on social media. It has since been footnoted to disclose that one of the tweets quoted in the piece was drawn, unknowingly, from a fake account.

The second fake account mentioned in a Guardian article, @TEN_GOP, purported to represent the opinions of Tennessee Republicans. It had been promoted by members of President Trump’s inner circle, though I am not aware of any evidence that they knew it was a Russian troll account when they cited it. Last June the account was quoted in a Guardian live blog in a selection of conservatives’ responses to the US withdrawal from the Paris climate agreement.

I asked the Guardian’s technical experts to use the list to examine what had been happening in the more than 40m comments published online below articles since the beginning of 2016. The analysis is continuing, so I may have more to add. But early results, which show that relatively few of the accounts appeared, invite cautious analysis.
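As a rough illustration of what such a scan involves, here is a minimal sketch. It assumes the disclosed handles and the archived comments are available as plain text files; the filenames, and the idea that handles are stored with a leading @, are my assumptions, not a description of the Guardian’s actual tooling.

```python
# A minimal, hypothetical sketch of scanning archived comments for
# mentions of disclosed troll handles. Assumes handles.txt holds one
# handle per line (with a leading @) and comments.txt one comment per line.
import re

with open("handles.txt") as f:
    troll_handles = {line.strip().lower() for line in f if line.strip()}

# Twitter handles are 1-15 word characters after the @.
handle_pattern = re.compile(r"@\w{1,15}")

mentions = {}
with open("comments.txt") as f:
    for line_no, comment in enumerate(f, start=1):
        for handle in handle_pattern.findall(comment):
            if handle.lower() in troll_handles:
                mentions.setdefault(handle.lower(), []).append(line_no)

for handle, lines in sorted(mentions.items()):
    print(f"{handle}: {len(lines)} comment(s), first at line {lines[0]}")
```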

The more such work is publicised – prudently, for no one wants to assist trolls – the more readers are put on guard. I trust that other journalism organisations will conduct similar exercises and share their techniques and findings. I hope the social media giants, Facebook and Twitter, will promptly and routinely make public all fake accounts, advertisements and “news items” they find as they inquire into how their services have been manipulated for malign purposes. Excessive secrecy retards the democratic fightback.

It is important to note that those who made online comments in which they embedded one of the troll accounts did not necessarily know the account was fake. They may simply have agreed with its sentiments.

At the Guardian, six of the troll Twitter accounts have been found so far in below-the-line comments beneath eight articles published between June 2016 and 1 November 2017, the date the congressional committee released Twitter’s list.

The articles are in two broad categories: the US presidential election campaign in 2016 and Donald Trump’s first six months in office; and the conflict in Syria, where Russia is fighting on the side of the regime of Bashar al-Assad.

A comment beneath an article about the Women’s March on Washington, which coincided with Trump’s inauguration, claimed that “the organiser” of the march was campaigning for sharia law across the US, and cited as its lone source a Russian troll account. Below an article referring to Trump and Vladimir Putin meeting at the G20 summit in Hamburg last July, where violence embarrassed the German hosts, was a comment disparaging “extreme left anti-fascists” and embedding a trolling Twitter handle.

Other comments that embed fake accounts seem aimed at fomenting disdain, or worse, towards particular groups: for example, those who called out Trump for his behaviour towards women.

As others who have analysed Russian trolling have noted, the comments are sometimes unsteadily expressed, as if the writer were not a native English speaker. If the words and messaging associated with proven troll accounts can be aggregated in large quantities, perhaps patterns will emerge and help with prevention, not just reaction. In this fight, artificial intelligence can be harnessed by both sides.
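One very simple form that aggregation might take is sketched below, under assumed inputs; the filename is hypothetical, and real pattern analysis would go far beyond word counts.

```python
# A toy sketch of aggregating the language of comments known to embed
# troll accounts, to surface recurring terms. Assumes
# troll_comments.txt holds one such comment per line.
from collections import Counter
import re

term_counts = Counter()
with open("troll_comments.txt") as f:
    for comment in f:
        # Lowercase and keep simple alphabetic tokens only.
        term_counts.update(re.findall(r"[a-z']+", comment.lower()))

# The most frequent terms give a crude first signal of shared messaging.
for term, count in term_counts.most_common(20):
    print(f"{term:>15} {count}")
```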

Exposing destructive use of social media, including through its appearances in older media forums, can begin in earnest. Opportunities exist for people of goodwill to collaborate to make the work faster and more effective.

