Developing effective responses to disinformation

#CriticalThinking

Digital & Data Governance

Eileen Culloty

Coordinator of the Ireland European Digital Media Observatory (EDMO) Hub, and Assistant Professor and Deputy Director of the Institute for Media, Democracy and Society at Dublin City University (DCU)

Creating effective strategies to combat online disinformation is a pressing objective, yet it is a complex endeavour fraught with conceptual, practical and regulatory challenges. The conceptual difficulties arise from divergent interpretations of the issue, which blur the line between disinformation, opinion and other problematic content such as hate speech. Practical obstacles stem from the sheer volume of content coursing through online platforms, which makes it difficult to establish and enforce fair and consistent moderation standards.

Concurrently, from a regulatory standpoint, concerns loom large over the legal, ethical and democratic consequences of curbing free expression and endowing platforms with potentially unchecked authority to decide what constitutes acceptability. Therefore, while there is a widespread consensus on the necessity of addressing disinformation, there remains considerable uncertainty regarding what actions can and should be taken.

By telling people about common manipulation techniques and false claims, they are more likely to recognise disinformation when they see it

As policymakers, tech firms, news outlets, educators and other stakeholders devise strategies to combat this issue, it is crucial to evaluate the evidence base supporting current approaches to countering disinformation. Proposed countermeasures take many forms. Some researchers divide them into four types:

  • laws and ethics, such as regulations and ethical guidelines;
  • technology, such as automated harmful content detection;
  • education, such as media literacy; and
  • psychology or behavioural sciences, such as pre-bunking and nudging.

At the same time, countermeasures can be conceptualised at a macro or system level or at a micro, individual level. System-level countermeasures aim to change the environment. For example, they might address the business model of online advertising or the attention economy of social media platforms, because both fund and enable disinformation. Other system-level actions might focus on embedding media literacy in the education system, to ensure that future generations understand disinformation and good information practices, or on ensuring that reliable news media are available to provide accurate information.

Unsurprisingly, system-level countermeasures are complex, long-term and potentially expensive. They are also subject to lobbying, as tech companies resist measures that would reduce their revenue. In some circumstances they can even be abused: anti-democratic governments, for example, have introduced repressive laws prohibiting the spread of disinformation.

Individual-level countermeasures, as the name suggests, focus on equipping individuals with the knowledge and tools to resist disinformation and to source reliable information. This may include media literacy, fact-checking, pre-bunking or content labelling. For example, during the COVID-19 pandemic, many social media platforms carried labels reminding people to seek out accurate information about vaccines. The challenge, of course, is that these approaches depend on individuals' awareness and willingness. Arguably, those most in need of support may be the least able, or least inclined, to seek it out.

The challenges associated with correcting misinformation after it has spread have prompted researchers to explore how to prevent people from falling for and sharing misinformation in the first place. For example, pre-bunking and media literacy are both pre-emptive in that they aim to help people recognise and reject disinformation.

Whereas a fact-check debunks a false claim after it has been published, pre-bunking aims to help individuals spot false claims before encountering them. The idea is simple: by telling people about common manipulation techniques and false claims, they are more likely to recognise disinformation when they see it. Researchers have explored how to use videos and games to develop pre-bunking knowledge and skills.

An effective strategy will require a commitment to tough system-level countermeasures, as well as individual-level initiatives

Media literacy has a broader goal: it aims not only to protect people from disinformation but also to empower them to create media and participate fully in our media-saturated world. Since 2007, UNESCO has championed ‘media and information literacy’ as an umbrella concept encompassing competencies in media literacy, information literacy, news literacy and digital literacy. Media and information literacy countermeasures are often conceived within formal education, but more recently there has been greater focus on providing media literacy through other settings, such as libraries.

These are early days for disinformation countermeasures. Researchers are investigating which approaches are most effective, social innovators are developing new initiatives, and many governments and funding bodies are supporting innovation. The EU has adopted a wide-ranging response to disinformation that emphasises regulation, access to platform data, fact-checking, media literacy and support for an independent news industry.

As reflected in the EU response, an effective strategy will require a commitment to tough system-level countermeasures, as well as individual-level initiatives. Disinformation is a complex challenge with no simple solution. It is worth remembering that disinformation is only a symptom. The ultimate goal is vibrant democratic societies where people can lead rich lives. Countering disinformation is a necessary step to securing that future.


The views expressed in this #CriticalThinking article reflect those of the author(s) and not of Friends of Europe.
