The majority opinion against fake news

Social media users 'listen' to the evaluations of others, even if they say they don't.


The countermeasures attempted against the circulation of disinformation on the Internet and social media are many and varied - from providing educational content to teaching strategies that stimulate critical thinking - with varying degrees of success. Among the tools thought to improve the quality of information circulating on social media is showing users how other people rate it. In technical jargon, this is called 'crowdsourced fact-checking', i.e. a check for veracity performed by a multitude of users.

A study conducted by Folco Panizza, a researcher at Scuola IMT, in collaboration with colleagues from other Italian and European universities, shows that knowing what others think about the accuracy of a given piece of content helps combat misinformation - people share less false content - but it also reveals a curious psychological effect: users do not realise they have been influenced and claim to have arrived at their assessment and decision independently.

To conduct the study, published in the journal Humanities & Social Sciences Communications, the researchers recruited around 1,000 participants, all Facebook users, randomly showing each of them one of ten posts containing scientific information on health and the environment that had previously been rated on a 1-6 truthfulness scale by another group of users. Seeing what others thought of the accuracy of the content influenced how the participants behaved: in eight out of ten cases, they rated the content of the posts as more correct and accurate.

"What the study suggests is that providing information, even in a complex form - for example, how many people are unsure, how many disagree, and so on - alerts users, who at that point hesitate or refrain from sharing the content altogether," Panizza notes. "The study confirms that the positive effect of showing users what others think also holds in the social media environment, confirming years of research in psychology. It is interesting to note, however, that those who saw these evaluations say they did not use them - that, in essence, they decided for themselves - a psychological effect that should be investigated and understood better."

Social platforms have run experiments in recent years to apply some of these strategies and see how well they work. Facebook CEO Mark Zuckerberg, for example, has discussed the possibility of introducing forms of crowdsourcing on Facebook, while in 2021 Twitter launched an experimental programme called Birdwatch, which allows users of the platform to add notes to posts they deem misleading. The new study confirms that such techniques can be useful. "It would be enough, for example, for a user who tries to share a piece of content to be shown - perhaps in a simplified graphical form - how often that content has been shared or judged true by others," Panizza adds. This information could help people understand what the majority thinks on a certain topic, but also how widespread dissent is, and thus help them reason and, perhaps, change their behaviour.
