Countering disinformation: feeling is believing

#CriticalThinking

Peace, Security & Defence

Chris Kremidas-Courtney

Senior Advisor at Defend Democracy, Lecturer at the Institute for Security Governance and former Senior Fellow at Friends of Europe.

Disinformation continues in full swing while governments and international organisations try their best to counter it. Disinformation helped to bring measles back to Europe and is already playing a role in global markets in response to the coronavirus outbreak. Meanwhile, disinformation campaigns by malign foreign actors are eroding people’s confidence in public institutions, elections, and each other.

Despite these concerns, many of the current proposals to counter disinformation rely on the same kinds of responses: education, teaching critical thinking, fact-checking, and technical solutions to contain its spread.

These solutions are all incomplete, as they ignore what we have learned from neuroscience and human behaviour. They are all too often aimed at how people think rather than how people feel. As Thomas Fuller put it, “Seeing is believing, but feeling is the truth”.

Recent research tells us that we all consider ourselves rational thinkers. Yet neuroscience also tells us that we mostly practise ‘motivated reasoning’. That is to say, our reasoning is actually laced with emotion. Not only are the two inseparable, but our feelings about people and ideas arise in a matter of milliseconds, much faster than our conscious thoughts.

Reason is slower and more deliberate, and it does not happen in an emotional vacuum. We are therefore susceptible to emotions that can bias our views from the beginning, especially on the topics that matter most to us.

Another dynamic associated with disinformation is the psychological concept of ‘cognitive closure’. When faced with uncertainty or anxiety, we often seek rapid clarity or closure, which makes us vulnerable to ideas which offer clear binary choices. In embracing this black-and-white worldview, people then aggregate into like-minded groups and often insulate themselves from alternative viewpoints.

When disinformation amplifies fear and anxiety, it taps into an emotional desire to reject ambiguity. Cognitive closure thus causes us to seek safety not only in certainty, but within a tribe of people who have reached the same quick conclusions. This does not necessarily occur because people are closed-minded, but because their survival reflexes, driven by emotion, have been hacked.

When it comes to why people choose to spread disinformation, we encounter yet another emotional dynamic: the desire for higher social status.

A 2018 MIT study found that, from 2006 to 2017, disinformation spread through Twitter farther and faster than the truth. In fact, false news was found to be 70% more likely to be retweeted than confirmed facts.

Among the reasons for this is that novelty attracts human attention. When information is novel, it is not only more interesting but also more valuable. Sharing it makes people feel their social status has risen, since they are “in the know.”

Once a false item gets shared often enough, more people will decide to share that same information. Through a dynamic called the illusory truth effect, this repetition makes the false information feel more true. And when something feels true, people are more likely to share it.

We all think we are discerning critical thinkers, and most of us believe it is other people who are more vulnerable to fake news. Unfortunately, none of us is immune to this dynamic. Science tells us that any of us can end up spreading disinformation.

Too often, we think we’re reasoning when we are actually rationalising. In the words of Jonathan Haidt, “We may think we’re being scientists, but we’re actually being lawyers.” Our reasoning too often serves as a means to a predetermined end as we try to win our ‘case’, and it is frequently coloured by our own biases.

So, what can be done? While fact-checking, technical solutions, and a wider understanding of critical thinking are a good start, we also need to acknowledge the emotional component if we are to stay a step ahead of the disinformation operators exploiting this vulnerability.

On the education front, in addition to critical thinking, we can teach new skills to limit the effects of disinformation, drawing on lessons learned from memory studies. Quite simply, encourage people to withhold judgment on new information they encounter. When they do, people are more likely to evaluate the information, giving the rational thought process time to catch up with the initial emotional signals.

When people take time to evaluate, they are less likely to accept (or share) any disinformation they have encountered. Essentially, we all need to slow down. When we do slow down, we do a better job of distinguishing false information from true.

So, if social media platforms required people to evaluate the information they are about to share, users might be less likely to pass on false or misleading content.

As for creating a more balanced mental and emotional approach to countering disinformation, there are two groups which are well equipped for this task: marketing experts and politicians.

Marketing and advertising professionals design campaigns to reach the heads and hearts of consumers. They constantly monitor which messages resonate and which ones don’t. And while some governments have tapped into this expertise to hone specific messaging efforts, most have yet to employ it against disinformation.

Politicians also have an innate sense of how to tap into people’s emotions as well as their thoughts to convince them to support a certain policy or to vote for them. Their entire campaign staffs are devoted to developing and deploying these kinds of messages.

In fact, the average parliamentary candidate’s campaign messages contain more potent emotional content than almost any of the counter-disinformation efforts being put to use currently in the West.

To leverage neuroscience effectively against disinformation and protect our societies from its destabilising effects, we may need to commit more resources (of the right kind) to achieve the results we are looking for. But we won’t know unless we take a hard look at our current approaches, a brutally honest look at our results, and engage the right kind of expertise so that we can better apply the lessons of science to countering disinformation.
