On the other hand... | Tim Adams

In a world of social media and polarised opinion, the stakes are now even higher. Time for a listening revolution

In the past week, the spectacle of the American gun lobby facing down the bereaved families of Parkland, Florida, has been difficult to watch. So close to the latest tragedy, the insistence on the sanctity of the right to bear arms has looked not only wrong-headed but wildly perverse. The weight of evidence, which grows shooting by shooting, goes a long way towards proving that the second amendment has the diametrically opposite effect to the one in which its proponents place their faith: keeping families safe. The disconnect invites a question that seems increasingly insistent in our lives: on big issues, why is it so very hard for people to change their minds?

In the case of gun control, it is tempting to believe this is primarily a political question or even a financial one. In Wednesday’s emotive town hall debate in Miami, one young survivor of the shooting put that case directly to the Republican Florida senator Marco Rubio: the National Rifle Association had supported Rubio’s career to the tune of $3.3m; would he now refuse to take any more? For Donald Trump, intent on arming teachers, the monetary incentive looked even more telling: his campaign had benefited from a reported $21m of NRA funding.

But what of the tens of millions of Americans who didn’t have those incentives – those who contributed to the NRA pot, who still feel safer with a gun in the house? Why exactly aren’t they thinking again?

The kind of polarisation that finds its lethal extreme in the gun debate is not confined to it. The famous maxim attributed to John Maynard Keynes – “When the facts change, I change my mind. What do you do, sir?” – would appear to have fewer and fewer advocates. Given the wearying fever of debate around Brexit, you might expect there to have been a significant swapping of sides among voters, but the polling suggests that the voting pattern of June 2016 is remarkably, many would say insanely, robust.

Or take the example of Trump himself. In one of his most telling insights from the original campaign trail, the 45th president suggested that he “could stand in the middle of Fifth Avenue and shoot somebody and I wouldn’t lose any voters”. The past two years might suggest a certain plausibility in that sentiment.

Anecdotal evidence suggests that the reason for the stubborn polarising of opinion is the increasingly tribal nature of our debate. We seem to be forgetting what pragmatism and compromise and listening hard to the other point of view even feel like. Intransigence is confused with leadership. We seem increasingly in thrall to leaders who offer the seduction of unwavering solutions to complex and evolving problems. Activists on both sides of the political spectrum police allies and elected representatives for purity of thought and action, issuing anonymised threats to those who seek to express nuance or deviation.

For a while, it has seemed that WB Yeats’s century-old assertion that “the best lack all conviction, while the worst are full of passionate intensity” has taken on the status of prophecy. But what is behind this apparently growing reluctance to admit doubt?

The most persuasive place to look for the answer to that question is in the field of behavioural psychology. In recent years, the Enlightenment faith in human rationality, our capacity to form our opinions in accordance with evidence, has been undermined, in particular by the pioneering work of Daniel Kahneman and Amos Tversky. Though we are inundated by data, the Israeli duo observed, we remain demonstrably over-reliant on instinct and prone to emotional bias of various kinds in all our choices. “No one ever made a decision because of a number,” Kahneman observed. “They needed a story.” The best we can hope for, the pair suggested, is to recognise those flaws and biases and try to react accordingly.

Last year, another duo of behavioural psychologists offered a telling critique of that latter position, one that sheds some light perhaps on what can seem our growing aversion to flexibility of thought. Hugo Mercier, who works at a French research institute in Paris, and Dan Sperber, based at the Central European University in Budapest, asked in their book The Enigma of Reason why, if reason is a trait that evolved over millions of years, like walking or sight, it seems so very unfit for purpose.

Their answer to this question was a neat inversion. It is not that we have too much faith in our capacity for reason, they argued, but that we tend to think of it in the wrong context. They agreed with Kahneman’s conclusion that human reason is both biased and lazy – “biased because it searches constantly for reasons to support the reasoner’s point of view, lazy because it makes little effort to assess the quality of justifications and arguments it produces” – but, they suggested, reason is biased and lazy for a purpose. We are not randomly credulous. Presented with someone else’s argument, we’re adept at immediately spotting its limitations. Invariably, however, the positions we are myopic about are our own.

Because man is a uniquely social animal, it follows that reason evolved as a resolutely social attribute; if you “take reason out of the interactive context in which it evolved”, Mercier and Sperber say, “nothing guarantees it will yield adaptive results…” Reason is a brilliantly inventive advocate for our instincts; contrary to the idea that we do our best thinking alone, we approach the truth when our take on the world is confronted by those of all the rival advocates in our group. The checks and balances to our biases are the demands of co-operation. “For a wide variety of tasks, argumentation allows people to reach better answers,” Mercier and Sperber say. Remove that diversity of opinion and thought and all the inbuilt limitations of reason become entrenched. Mercier and Sperber accept that there are two principal constraints on changing our minds: confirmation bias and cognitive dissonance.

There have been many studies of the power of confirmation bias. Perhaps the most famous study of how it makes it hard for us to change our minds was conducted at Stanford University in 1980. In it, subjects were given information about a pair of firefighters and asked to judge their suitability for their chosen job. They were told a few incidental biographical details about the firefighters, but then informed that the true success or failure of the firefighters could best be predicted by something called the risky-conservative choice test (RCC test). They were asked to read the methodology that supported this test and to evaluate the firefighters accordingly.

After they had performed this task, those involved in the study were informed of the fact that the risky-conservative choice test itself and all of its results were entirely fictitious. It was fake news. Despite this knowledge, when asked to evaluate the firefighters again, those involved almost universally maintained their original evaluations.

The scientists behind the experiment concluded that “the results strongly support the hypothesis that even after the initial evidential basis for their beliefs had been totally refuted, people fail to make appropriate revisions in those beliefs”.

The researchers found a further surprising fact. If the people asked to evaluate the firefighters were encouraged to write down reasons for their original evaluation, they were even less likely than the others to change their opinion; ie, the act of writing itself seemed to hardwire their judgment and make it even more difficult to change.

Neuroscientists believe that there is a biological basis to this confirmation bias. When we find or share information that confirms our beliefs about the world, they suggest, we get a dopamine rush, similar to the one we get if we eat chocolate or fall in love. We tend to fear the prospect of losing that rush of certainty. If faced with a challenge to our understanding of the world, something that does not fit with our belief, our instinct is immediately to look for ways to assimilate or neuter that challenge.

This latter state is cognitive dissonance. Multiple experiments show that state to be uncomfortable and most of us do all we can to dispel it. Our identity is invested in the opinions and beliefs we hold; threaten those beliefs and you threaten self-understanding. The larger the discrepancy – the greater the potential embarrassment in acknowledging that we were wrong and everyone else was right about an issue – the greater the lengths we go to in order to confirm our bias, polarising us still further.

In her recent book The Influential Mind, Dr Tali Sharot, reader in cognitive neuroscience at University College London, examines some of the implications of these biases in our current social context. Speaking last week, she suggested that the stakes are higher these days for cognitive dissonance because being wrong – or being right – is a more public act for far more people than ever before.

“If you think about social media as a place where we express our choices and opinions and beliefs,” she said, “that simple fact of expressing them in itself will strengthen our belief.” Just by the public act of “clicking ‘like’, you immediately in fact make yourself more committed to that idea than you were a few seconds before”. And the smarter or more informed you consider yourself to be, the more likely you are to look for confirmation of that opinion. And the harder it is to admit you were wrong.

The difficulties in accepting contradictory evidence are further exacerbated, Sharot suggests, by our tendency to seek information not from the most reliable but from the most like-minded sources. In another recent study, one that she has not yet published, subjects were invited to interact with others, a group that took in both people with similar and people with dissimilar political views. They then took part in a test that involved some complex rules and had nothing to do with politics. Participants could see how the other people they had met performed at this test. When they had the opportunity to receive assistance in the task from other participants, however, they were significantly more likely to choose help from people who were like them in political terms, ahead of those who were shown to be most adept at the test.

Sharot’s book takes account of such polarity and looks at strategies to overcome it. Using the stubborn example of parents who persist in believing in a link between the MMR vaccine and autism, despite the absence of evidence, she points to the strategy of the immunisation team at UCLA, which devised a method to persuade parents to vaccinate their kids. The UCLA team decided not to try to argue about the thing they disagreed with the parents on, the supposed link to autism – however many figures they showed, it did not work. Instead, they focused on the detail of the deadly diseases – measles, mumps and rubella – against which the vaccine protects kids. That gave the parents a perception of control and vaccination rates improved threefold.

“We may have the data and we have the figures,” Sharot suggests, “but we need to convey that truth using what we know about how the mind works.”

Mercier and Sperber imply, in The Enigma of Reason, that our socially mediated habits of interaction ironically lead us to choose fewer of the properly social forums in which to employ our reason in the way it was intended. The example they cite for a properly functioning model of rationality is the great essayist Montaigne, for whom “the study of books [was] a languishing and feeble motion, whereas conversation teaches and exercises at once”.

A more contemporary example might be our most visible populariser of the scientific method, Professor Brian Cox, who argued to me that “the idea that people have a right to their opinion is obviously true in a free society. But you really do not have a right to have that opinion heard. The weighting has to be toward knowledge. But the point is for one reason or another many people don’t know how to change their mind. The whole point of science is that you have to be prepared – and delighted – to change your mind in the face of new evidence. That is the message that should be taught in schools.”

One of the most impressive social experiments in this latter “delight” was tellingly created by a sixth former, Kal Turnbull, in Inverness five years ago. Turnbull’s experiment is a forum on the Reddit platform called Change My View and it might have been tailored exactly to the requirements for effective social reasoning set out by Mercier and Sperber (or, indeed, Montaigne).

Turnbull was struck, as he prepared to go to university in Edinburgh, by the fact that he had spent his whole life “surrounded mainly by friends and family that thought pretty similarly”. He felt constricted by the narrowness of that and wondered: “If you have an opinion that you are interested in having changed, or want changed, how would you go about doing that? How do you find people who might challenge your ideas?”

Twitter, Facebook, even reading newspapers didn’t feel like a solution, he explains to me. “I guess what I was looking for was a place where you might offer an opinion up almost in a catered way” – as if to address a very precise and pressing need – “and people would come in and say, OK, you think this for these reasons; let’s see if we can pull that apart and change your view.”

Change My View now has 520,000 subscribers, who have signed up to the kind of open and vulnerable debate that is all too rare on the internet. Turnbull worked on promoting and developing the premise of the forum throughout university, where again, he says, he was struck by an unwillingness of his peers genuinely to confront their own beliefs.

“I had this frustration of why can’t we talk to each other a bit more interestingly? Where people are promoting ideas… [not just] to get some applause, a pat on the shoulder.”

Change My View lives and dies by the rules of engagement that Turnbull wrote to encourage the kind of “Socratic dialogue” he imagined. They have become a remarkable document.

Under the terms of Change My View, if you start a subject thread, which can be anything from “Black Panther was an extremely mediocre movie” to “Policies that lean toward the center of the ideological spectrum of a country are the best”, it is a kind of full-time commitment to a respectful give and take until some resolution is achieved.

If you are persuaded to change your view, you award the decisive opponent a delta, the mathematical symbol for change. To have your own view altered or successfully to alter someone else’s view are both counted as victories. Elon Musk, no less, endorsed the idea that the site is “perhaps the most civilised place on the web”.

The tone and rigour of the conversation that Turnbull’s site generates have made it the subject of studies at Columbia University and the Georgia Institute of Technology, as academics try to understand its appeal. A team from Cornell University, meanwhile, conducted research over two years into the kinds of posts that did most to achieve delta ratings and that successfully changed people’s views. The most persuasive qualities were variously a sudden shift in vocabulary, which indicated commentators bringing surprising points of view; longer and more detailed posts, full of specifics; and, perhaps counterintuitively, arguments that were most aware of their limitations, using the language of qualification and caveat.

Contrary to appearances, Turnbull suggests, the internet has not been a great forum for interaction. Change My View, he hopes, is almost going back to those Scottish Enlightenment coffee house ideas, where people would get together and share their finest thoughts and relish the opportunity to have them questioned and dismantled. “We are not a debating service, we are a conversational service,” he says. “Debate has this theatrical connotation where you are trying to win over the crowd and so on; whereas conversation is all about trying to understand, and to be encouraged to perhaps notice contradictions in your own positions.”

That other great prophet of our times, Alvin Toffler, author of Future Shock, suggested that “The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.”

As a case in point, it is worth looking at a current exchange, with 1,300 answers at the time of writing, which examines the proposition “Every argument against changing gun laws in the USA is unrealistic and illogical” and finds some subtle middle ground even among the diehards. It might not change your view of the NRA, but it might give you a new idea of the point of arguing on the internet.

• Tim Adams is an Observer writer
