Can 'intellectual humility' help medicine?

A research project studies interventions to foster communication between doctors and patients and improve health policies.


We experienced this in striking fashion during the pandemic, when a kind of dialogue of the deaf unfolded between experts and the rest of the world. Experts tended to accuse the public of not knowing, not understanding, and therefore of not heeding and not adhering to recommendations, from wearing a mask to getting vaccinated, with the urgency the moment demanded. The public, more or less consciously, blamed the experts for failing to recognise the needs, values and expectations of ordinary people (what risk is acceptable when vaccinating a healthy child against a disease?). If one thing became clear to everyone in hindsight, it is that, in Italy but not only there, communication did not work, and precisely at a time when correct interaction was crucial to the implementation of urgent health policies.

Traditionally, analyses of the interaction between experts and non-experts focus on the non-experts' shortcomings (their implicit biases, cognitive biases, and so on). Rarely do they consider the possible biases and shortcomings of experts in their role as communicators. As a growing body of research is highlighting, among the factors that compromise communication may be the willingness, or unwillingness, to admit that one's knowledge does not always allow one to assess contexts and situations correctly. Simply put, to admit that one can be wrong. This openness to novelty and readiness to change one's own opinion and to consider those of others has a name: 'intellectual humility'. Contrary to what the word humility commonly suggests, this is not so much a moral trait as a cognitive capacity: the ability to recognise the limits of one's knowledge and to realise if and when it is time to change one's mind.

The entire history of thought is studded with warnings against the dangers of intellectual conceit. The French philosopher Michel de Montaigne wrote that one of the plagues of man is the boasting of his knowledge. In more recent times, psychologists have been trying to understand why some people stubbornly cling to their beliefs even when presented with seemingly irrefutable evidence that they are wrong, while others seem more willing to revise their ideas and adopt new ones when circumstances require it. Researchers from different fields, from psychology to philosophy to the social sciences, are trying to revive and deepen this concept, convinced that it can help curb some of the negative phenomena of today's society, from the excessive polarisation of opinions to the communication problems between experts and the rest of society seen, for instance but certainly not exclusively, during the pandemic.

The MInD (Models, Inferences and Decisions) group of the MoMiLab research unit has just won a grant from the Templeton Foundation, an American foundation that supports research projects at the intersection of philosophy, theology and science, to study whether intellectual humility, and its positive effects, can be fostered by specific interventions, and whether it can lead to improvements in communication between doctors and patients. "What we are trying to understand is whether the concept of intellectual humility, which until now has mainly interested philosophers, can be studied with empirical methods and have repercussions for the practice of scientific communication," explains Gustavo Cevolani, professor of logic and philosophy of science at the IMT School. The starting point is to understand whether, and how, intellectual humility is related to a concept that has been studied far more from a psychological and cognitive point of view: the so-called illusion of understanding. The illusion of understanding is the gap between a person's actual understanding of a subject and their subjective assessment of that understanding. Put simply, it is the distance between what one actually knows and what one believes one knows. This distance can affect both the non-expert public, who may, for example, believe they have sufficient knowledge to assess how a vaccine works, and the experts themselves, for example doctors who believe they know, and take for granted, their patients' expectations of certain treatments or therapies. The MoMiLab researchers' project is to apply these concepts to medicine and healthcare, designing interventions that improve communication as well as health policy outcomes.

But how do we go from recognising that each of us, experts included, does not and cannot know everything, and can very often be wrong, to avoiding inappropriate behaviour on the part of a doctor, perhaps reducing antibiotic prescriptions in cases where they are not needed, or increasing a patient's satisfaction with the treatment offered? "We can try to do this by designing a specific study. The first step is to test experimentally whether reducing the illusion of understanding can help increase intellectual humility," explains Federica Ruzzante, a PhD student in cognitive, social and computational neuroscience at the IMT School who is working on the research project. "We have designed a test in several stages: first we will identify the topics on which the general population thinks it understands more than it actually does, and those on which doctors think they know the public's opinions and knowledge without actually knowing them well." A group of participants will be asked to estimate their understanding of certain health topics; their actual knowledge will then be tested with quizzes and, finally, they will be asked once again to rate their understanding. "The research tells us that when people are confronted with the need to retrieve information on a subject, for instance to answer a test, they realise that they may have overestimated their knowledge." "Experts will be tested in a very similar way, with questions such as 'how much do you think you understand people's attitudes towards antibiotics?', followed by requests for estimates, for example: 'out of 100 people, how many do you think are against the use of antibiotics?'," adds Folco Panizza, research fellow at the School and the project's second coordinator. "We will shortly begin selecting and preparing the materials, to see on which topics the gap between experts and laypeople is greatest."

"We expect," Ruzzante concludes, "that confronting people with their illusion of understanding may stimulate not only doubts about their own understanding of the specific topic, but also the development of a more general critical and self-critical attitude. A lesson in humility: intellectual humility, in short."

Chiara Palmerini
