Much of the time we're bombarded with massive amounts of information; no matter how hard we try, none of us can understand everything, and we must defer to others
We constantly make decisions about who to trust. Much of the time we're bombarded with massive amounts of information on all sorts of different subjects, from science and health to social issues, economics and politics.
But no matter how hard we try, or how brilliant we are, none of us can understand everything or correctly assess the risks associated with the issues affecting ourselves and our communities.
We have no choice but to defer to others, and the decisions we make about a person's or organisation's trustworthiness can play a huge part in our health and mental wellbeing. In some situations, such as whether to take a vaccine, it can be a matter of life or death.
During the pandemic, researchers conducted a series of large surveys investigating which factors were linked to vaccine hesitancy. One survey questioned Americans in five different states, another covered 23 countries, and a final one included over 120,000 respondents. They all found that trust in science was a key factor in determining whether people intended to be vaccinated.
But what influenced this trust in science? Researchers on 'epistemic trust', which is our trust in someone as a knowledgeable source of information, have identified the factors we use to determine trustworthiness: how we perceive an expert's level of expertise, integrity and benevolence (concern and care for society).
A recent study measured trust in science throughout the pandemic, and the factors affecting it. By analysing data from four surveys done at different points in time, involving over 900 respondents, the researchers found that trust in science increased substantially after the pandemic began, mainly due to positive assumptions about the scientists' expertise in their field.
In contrast, the most pronounced reason for distrusting the scientists was a perceived lack of benevolence, because scientists are often dependent on the funders of their research. So the researchers recommended that science communication emphasise the good intentions, values and independence of scientists.
In the UK, surveys reported a high level of trust towards scientists during the pandemic, compared to 52 per cent towards the government. Although no studies specifically investigated perceptions of the scientists' expertise, integrity and benevolence, hesitancy towards the vaccine was mainly caused by a lack of trust in the benefits of vaccination and concerns about future unforeseen side effects.
It's okay to say 'I don't know'
Many of us, whatever our field of work, fear that showing uncertainty can damage our image, and we may compensate by expressing overconfidence in an attempt to win trust. This strategy has been seen from universities when writing about the findings of academic research, and also from some public health officials when communicating to the public.
But some studies show that while confident advisors are judged more favourably, people do not necessarily distrust uncertain advice. In fact, when faced with an explicit choice, people were more likely to choose an advisor who provided uncertain advice (by giving a range of outcomes or probabilities, or saying that one event is 'more likely' than another) over an advisor who provided certain advice with no doubts.
It seems that advisors benefit from expressing themselves with confidence, but not from communicating false certainty.
In many situations, people are willing to trust those who can admit they don't have a definitive answer. Good news comes from recent experimental studies, which found that communicating uncertainty, and even admitting our mistakes, does little to damage trust and can even add to trustworthiness.
So, a failure in 'expertise' can be compensated for by higher integrity and benevolence. When communicating uncertainties in a transparent way, we are perceived as honest and willing to tell the truth.
There's a neurological basis
Another characteristic of trustworthiness is that it can also be weakened by what is known as 'guilt by association' (you can be judged by the company you keep), or 'moral contagion', the psychological mechanism behind that belief.
There's a saying that a spoonful of tar can spoil a barrel of honey. And in fact, the food analogy makes some sense.
It is believed that throughout evolution, our disgust mechanisms, which originally evolved to assess contamination and avoid disease from rotten or soiled food, also started to apply to social and moral judgements. Our disgust reaction when we are disgusted by people's untrustworthy behaviour is neurologically the same as our disgust reaction when food is off.
In support of this hypothesis, both disgust at food and moral judgement activate the same areas of the brain.
Interestingly, our disgust sensitivity (how easily we are disgusted) does indeed show a correlation with our level of distrust in others. In other words, if we are inclined to worry about pathogens on food, we'll also be inclined to have a lower level of social trust and feel that most people should be avoided.
But it is still unclear how this psychological process of 'moral contagion' can affect our trust towards the many organisations or individuals allegedly collaborating closely with each other, such as scientists, government, pharmaceutical corporations, universities and international bodies during the pandemic. In such a melting pot of organisations, it will depend on the groups we feel drawn to, and our personal sensitivities to particular kinds of misconduct.
In the current climate, any person or institution who genuinely wants to be trusted should work on communicating their expertise, honesty and benevolence, and encourage those they work with to do the same.
Dr Erik Gustafsson is a Senior Lecturer at the Department of Psychology in the Faculty of Science and Health.
This article is republished from The Conversation under a Creative Commons Licence.