It is well-attested that people are heavily biased when they evaluate arguments and evidence. They tend to assess evidence and arguments that are in line with their beliefs more favorably, and tend to dismiss those that are not. For instance, Taber and Lodge (2006) found that people consistently rate arguments congruent with their views on gun control and affirmative action as stronger than arguments incongruent with those views. In a further condition, participants could freely pick and choose which information to look at, and most actively sought out sympathetic, nonthreatening sources (e.g., proponents of gun control were less likely to read the anti-gun-control sources presented to them).
Such attitudes frequently lead to belief polarization. When we focus on just those pieces of information that confirm what we already believe, we become further and further entrenched in our earlier convictions. That's a bad state of affairs. Or is it? The argumentative theory of reasoning, put forward by Mercier and Sperber, suggests that confirmation bias and other biases aren't bugs but design features. They are bugs only if we consider reasoning to be the solitary process of a detached, Cartesian mind. Once we acknowledge that reasoning has a social function and origin, it makes sense to stick to one's guns and try to persuade the other party.
Like an invisible hand, the joint effects of biases will lead to better overall beliefs in individual reasoners who engage in social reasoning: "in group settings, reasoning biases can become a positive force and contribute to a kind of division of cognitive labor" (p. 73). Several studies support this view. For instance, some studies indicate that, contrary to earlier views, people who are right are more likely to convince others in argumentative contexts than people who merely think they are right. In these studies, people are given a puzzle with a non-obvious solution. It turns out that those who find the right answer do a better job of convincing the others, because the arguments they can bring to the table are better. But is there any reason to assume that this finding generalizes to debates about science, politics, religion, and other things we care about? That is doubtful.
Olivier Morin has a recent paper that questions this positive image of biased reasoning in social contexts. If everyone wants to persuade rather than to seek out the truth in a detached way, why would people listen to good arguments even in social contexts? Perhaps they do so for logical puzzles they have just learned about, but it is doubtful they would do so for things they deeply care about, like issues in politics and religion: "collective reasoning without the virtues of ingenuity is vulnerable--and no amount of civility can change this"*. Being nice to your opponent and hearing her out does not automatically mean you will make a good-faith effort to weigh the merits of her argument in a detached manner.
Earlier, I wrote about the problem of debating creationists in this context. Heavily polarized debates, such as those over vaccines, climate change, and gun control, are compelling illustrations that the invisible hand of argumentation does not work if we do not make a good-faith effort to step outside our comfort zone and hear the other party out. Debiasing approaches, as Morin observes, have had only very limited success.
Morin proposes that perhaps we should change our institutional contexts to minimize these bad effects. For instance, replication studies in science should be encouraged more. We should also take care to counter instances of epistemic injustice, where members of minority groups aren't given due credit as credible testifiers because of biases against them (e.g., in debates on police violence against African Americans). So if we want reasoning to work well in an argumentative context, we have to make sure that everyone with a relevant voice in the debate gets heard and duly acknowledged. Morin also advocates the enlightenment values of individual ingenuity: being a responsible evaluator of arguments who exhibits epistemic virtues such as detachment and thoroughness might be preferable to being a skillful arguer who tries to persuade her audience, although these characteristics need not be mutually exclusive.
* (Note) There has been a lot of talk in the blogosphere about the pros and cons of civility, its relationship to epistemic injustice, and academic freedom. I don't think that civility is a virtue by itself; however, it can be a consequence of other things that I do deem virtuous within a professional community, such as respect for people who are more junior and vulnerable than oneself, and interpreting the speech-acts of others charitably as a default stance.