People who think their opinions are superior to others' are most prone to overestimating their relevant knowledge and ignoring chances to learn more

By guest blogger Tom Stafford

We all know someone who is convinced their opinion is better than everyone else's on a topic – perhaps, even, that it is the only correct opinion to have. Maybe, on some topics, you are that person. No psychologist would be surprised that people who are convinced their beliefs are superior also think they are better informed than others, but this fact leads to a follow-on question: are such people actually better informed on the topics for which they are convinced their opinion is superior? This is what Michael Hall and Kaitlin Raimi set out to check in a series of experiments reported in the Journal of Experimental Social Psychology.

The researchers distinguish "belief superiority" from "belief confidence" (thinking your opinion is correct). Belief superiority is relative – it means thinking your opinion is more correct than other people's; the top end of their belief-superiority scale asks respondents to indicate that their view is "Totally correct (mine is the only correct view)".

The pair set out to find people who felt their beliefs on a variety of controversial political issues (such as terrorism and civil liberties, or wealth redistribution) were superior, and to check – using multiple-choice quizzes – how well informed they actually were on the topics about which they held these superiority beliefs.

Across five studies, Hall and Raimi found that the people with the highest belief superiority also tended to have the largest gap between their perceived and actual knowledge – the belief-superior consistently suffered from the illusion that they were better informed than they really were. As you might expect, those with the lowest belief superiority tended to underestimate how much they knew.

As well as measuring background knowledge, the researchers were interested in how people with belief superiority sought out new information relevant to that belief. They gave participants a selection of news headlines and asked them to select which articles they'd like to read in full at the end of the experiment. Categorising the headlines as either belief-congruent or belief-incongruent, the researchers observed that participants with higher belief superiority were more likely to select belief-congruent headlines. In other words, despite being badly informed relative to their self-perception, these participants chose to neglect sources of information that would have enhanced their knowledge.

Finally, and more promisingly, the researchers found some evidence that belief superiority can be dented by feedback. If participants were told that people with beliefs like theirs tended to score poorly on topic knowledge, or if they were directly told that their own score on the topic-knowledge quiz was low, this not only reduced their belief superiority, it also led them to seek out the kind of challenging information they had previously neglected in the headlines task (though the evidence for this behavioural effect was mixed).

The studies all involved participants recruited via Amazon's Mechanical Turk, allowing the researchers to work with large samples of Americans for each experiment. Their findings mirror the well-known Dunning-Kruger effect – Kruger and Dunning showed that, for domains such as judgements of grammar, humour or logic, the most skilled tend to underestimate their ability, while the least skilled overestimate it. Hall and Raimi's research extends this to the realm of political opinions (where an objective assessment of correctness is not available), showing that the belief that your opinion is better than other people's tends to be associated with overestimation of your relevant knowledge.

Overall the research presents a mixed picture. It shows, as others have, that our opinions are often not as justified as we believe – even the opinions we are most confident are better than other people's. On the other hand, it shows that people are responsive to feedback, and aren't driven solely by confirmation bias when they seek out new information. The final picture is of a human rationality that is flawed but correctable, not doomed.

Further reading: Is belief superiority justified by superior knowledge?

Post written by Tom Stafford (@tomstafford) for the BPS Research Digest. Tom is a psychologist from the University of Sheffield who is a regular contributor to the Mind Hacks blog. His latest book is For argument's sake: evidence that reason can change minds.
