Author: Michael Bang Petersen, Professor of Political Science, Aarhus University, Denmark 

Executive Summary

  • Exposure to misinformation can change people’s views, but the impact is often limited because specific attitudes are rooted in larger worldviews that are difficult to change. Nonetheless, exposure to misinformation can have a number of other adverse consequences. It can divert people’s time and attention from more relevant information sources; it can deplete politicians’ and mainstream media’s resources because of a constant felt need to counter it; and, even if it does not change views, it can still sow confusion and uncertainty.
  • A small minority of users is responsible for the spread of most misinformation – the habitual “super-sharers”. These users are motivated by political activism rooted in difficult-to-change psychological dispositions such as anger and frustration toward specific political groups and actors.
  • Simple online interventions that target “super-sharers” are likely to fail. Dealing with these individuals will likely require genuine deradicalization programs targeting current “super-sharers” as well as actual policy reform to address the underlying frustrations motivating new “super-sharers” to enter social media.
  • Educational interventions are mainly likely to succeed if they focus on empowering the online audience who are occasionally exposed to misinformation as a consequence of the activities of the “super-sharers”. This involves a conceptual shift from focusing interventions on changing the motivations of habitual sharers to mitigating the effects of their actions on naive bystanders.
  • One well-known intervention is fact-checking. Fact-checking works at two levels: First, it incentivizes decision-makers to share more accurate information and, second, exposure to fact-checks reduces belief in and the sharing of misinformation. Yet, the effectiveness of fact-checking – sometimes referred to as debunking – is reduced by the fact that it is necessarily reactive rather than proactive (so-called prebunking); that misinformation producers work fast and in unexpected ways; and that research shows that repeated exposure to a fact-check is required to keep false beliefs from re-emerging.
  • A promising avenue is therefore to invest in prebunking interventions. These interventions, designed to (1) foster awareness of misinformation and (2) develop competences in detecting it, have been found to reduce the sharing of misinformation without necessarily reducing the sharing of valid information. Such prebunking interventions are thus oriented towards empowering audiences. An orientation towards empowerment is in line with general principles in risk communication research about how to motivate the public to deal with risks across domains.
  • Prebunking interventions should focus on establishing intellectual humility. Trust and humility are positively related to the sharing of and belief in reliable information. Generalized mistrust, in contrast, is positively related to the sharing of misinformation. It is key that interventions do not focus solely on facilitating a critical and suspicious mindset. Instead, interventions need to highlight the fallibility of users’ own intuitions and thereby foster humility.

The FFS thanks the institutions below for their support in the creation of this output.