How to fight fake news

A new study compares different approaches, including economic incentives, to 'vaccinate' against misinformation.


As fast as a virus, and potentially just as devastating: the term 'infodemic' was coined to describe the spread of fake, partially true or merely plausible news during the Covid-19 pandemic. According to some scholars of the phenomenon, the sheer volume of often conflicting information available online has increased the background noise and disorientated public opinion. The phenomenon itself has existed for some time, but its spread seems to have been accelerated by the advent of social media. What now seems certain is that misinformation that goes viral can influence public debate and political decisions on relevant social issues, such as vaccines, climate change, and health and social policies.
Much less clear is how to intervene to mitigate the phenomenon, and how to enable people to recognise misinformation and defend themselves against it. A study recently published in the journal Scientific Reports tested different approaches to encourage social media users to pay more attention to content before sharing it. We discussed the topic with Folco Panizza, a researcher at the IMT School for Advanced Studies Lucca and one of the authors of the research.

First of all, what is known about how widespread the phenomenon of scientific disinformation is?

Unfortunately, little data is available for Italy, unlike for other countries such as the United States, Canada, Australia and the United Kingdom, on which several studies already exist.

Regarding the extent and relevance of the problem, there is currently a heated debate among experts. Some claim that the phenomenon has been inflated and instrumentalised by certain political parties and that its real-world impact is small. But a significant portion of researchers thinks the opposite.

Measurements of information flows show that, indeed, most of the content disseminated online by news sources is neither misleading nor false. Most online content, however, is personal content, 'user-generated content' in technical jargon. Such content is often interesting, emotional, surprising and easy to understand, and it can conceal misinformation.

We therefore lack the data to quantify the spread of disinformation, with the risk that such content can go viral undisturbed. In essence, even a small amount of falsehood can generate a huge impact. This is especially true of scientific misinformation, both because the general population often lacks the competence to distinguish scientific from pseudoscientific content, and because demand for information on health and the environment is high, these being topics in which people often feel personally involved.

What approaches are used to study the phenomenon of disinformation?

One premise, which is also an initial difficulty, is that with few exceptions social media platforms do not collaborate with researchers to grant access to their data. The other difficulty is the platforms' lack of transparency about how they operate. We therefore look for alternative ways to study the phenomenon. One is to study user behaviour through interviews and bots, tools that however give us only a partial view of reality. The other is to recreate contexts parallel to social media by conducting experiments in a controlled environment.

And what indications does research give on how to intervene to solve the problem?

One of the most common interventions is debunking, i.e. correcting false beliefs derived from misinformation. Over the years, a substantial literature has developed on how this approach works at the mental level. The problem is that debunking is at a disadvantage from the start, because it comes after exposure to the false information and tries to correct it. In short, it chases the problem, and its application is limited to particular contexts.

This has prompted the search for different methods, which try to anticipate and prevent. Some of these interventions aim to stimulate social media users' attention, using visual or textual prompts that remind them to be careful about the content they encounter.
Another approach is digital literacy. It consists of explaining to users a few quick and easy 'tricks of the trade' for evaluating a piece of media content, and is considered highly effective.
Yet another example is what is known as inoculation. This type of intervention 'inoculates' users against misleading content by repeatedly exposing them to content with its typical characteristics. It is not clear, however, whether this kind of inoculation, just as in the case of vaccines, needs 'booster shots'.

Folco Panizza, researcher in behaviour and pro-social decision-making at the IMT School

Can you describe your research?

In our study, we tested two types of interventions against scientific misinformation. The main approach was literacy. Drawing on previous research that identified a series of strategies used by fact-checkers, the professionals who verify the truthfulness of news and information, we obtained a short list of easy-to-use suggestions. These are strategies we use every day without realising it, related for instance to how we use search engines. The fundamental mistake, when faced with new content whose source or background we do not know, is to let ourselves be dazzled by its appearance.

The most effective way to evaluate it is instead to completely ignore the site presenting it and to look for information elsewhere. Instead of a 'vertical' reading within the content itself, one should open several pages (or several browser tabs) to search for other sources: a 'horizontal' reading that examines the information from different points of view.

The second approach concerns the use of incentives, in this case monetary ones. We tried to stimulate greater awareness of content by rewarding correct answers. These incentives turned out to be very effective: the prospect of a reward for correctly evaluating a piece of content significantly increases the accuracy of judgement.

Providing the strategies, on the other hand, only seems to have an effect when one is unfamiliar with the source. This is not because the strategies do not work, but because providing them in such a condensed form may lead users to ignore them, since we often think we know better or consider them a waste of time. Both interventions work, although convincing users to pay more attention and seek more information is not so easy.

What implications might these results have for social media and media policies?

If we could replicate these experiments in real contexts through a dialogue with the platforms, it would be possible to deliver this information to users through easily implementable tools, such as pop-ups that appear when they try to share content. Unfortunately, social media companies have no incentive to test these kinds of interventions, even though some policies already exist in the European Union, along with a willingness to listen to researchers.
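As a purely illustrative aside: the study does not describe any implementation, but a share-time pop-up of the kind Panizza mentions could, in principle, look like the following minimal TypeScript sketch. All names here (confirmShare, SharePrompt, the wording of the tips) are hypothetical, not taken from the research or from any platform's API.

```typescript
// Hypothetical sketch of an attention prompt shown before a share action
// completes. The literacy tips paraphrase the 'horizontal reading' advice
// discussed in the interview; everything else is invented for illustration.

interface SharePrompt {
  message: string;  // reminder shown to the user
  tips: string[];   // condensed literacy suggestions
}

const defaultPrompt: SharePrompt = {
  message: "Before sharing: have you checked this content?",
  tips: [
    "Ignore how the page looks; check who is behind it.",
    "Open other tabs and search for the same claim elsewhere ('horizontal' reading).",
  ],
};

// Wraps a platform's share action: the content is shared only after the
// user has seen the reminder and confirmed.
async function confirmShare(
  share: () => Promise<void>,
  prompt: SharePrompt = defaultPrompt,
  ask: (text: string) => Promise<boolean> = async (text) => {
    // Placeholder for a real pop-up dialog; here we just log and accept.
    console.log(text);
    return true;
  },
): Promise<boolean> {
  const text = [prompt.message, ...prompt.tips.map((t) => `- ${t}`)].join("\n");
  const confirmed = await ask(text);
  if (confirmed) {
    await share();
  }
  return confirmed;
}

// Example usage with a dummy share action:
confirmShare(async () => console.log("Content shared."));
```

The point of the sketch is only that such an interposed prompt is technically trivial; as the interview notes, the obstacle is the platforms' willingness to deploy and test it, not the engineering.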

The proposed incentives could also be used in debunking activities, by recruiting and rewarding users themselves to evaluate information, an activity that has proved effective in many cases. We are conducting further research on other types of non-monetary incentives. For example, using games similar to those posted on social media by malicious actors, in which the user is challenged to identify information correctly, seems to work effectively.

Marco Maria Grande
