Arguments, Cognition, and Science. André C. R. Martins

at interpreting the data. However, when the problem was a very controversial issue (e.g., gun control), people with better numeracy interpreted the data in ways that agreed with their initial points of view. That happened regardless of the real data. People with better numeracy would become even more polarized on the subject than people less well trained in mathematics. That suggests that being smarter can mean a stronger ability to distort reality to conform to one's own point of view. That might not be conscious, but, at least in those observations, being smarter didn't seem to lead to better judgments. Indeed, in a more recent study, Kahan and others (2017) have observed that polarization over politically charged issues such as gun control does not decrease with better numerical skills; quite the opposite. After looking at the same data, those more capable of analyzing data correctly, at least in principle, showed a stronger tendency toward more polarized opinions.

      Indeed, Stanovich and collaborators (2013) have made similar observations about what they called myside bias, our tendency to evaluate data in a biased way that supports our opinions. The size of the bias their subjects showed was not correlated to their intelligence. It should be no surprise by now to learn that Rollwage and others (2018) observed that the more radical people's positions on a topic, the less capable those people seem to be of estimating the accuracy of their own judgments. Anyone who has ever observed people with strong political opinions debating should find that conclusion quite easy to accept.

      The Reasons of Our Reason

      Being too confident and ignoring the actual chances we might be wrong seem to serve no practical purpose. Neither does allowing ourselves to be convinced by opinions that are clearly wrong. Those characteristics do not seem to be good heuristics; they interfere in our ability to find good answers. Is it possible that, despite those problems, we might gain something from those cognition errors?

      

      Hugo Mercier and Dan Sperber (Mercier and Sperber 2011) have proposed an answer to that question. We have always assumed our mental and verbal skills have evolved for the pursuit of truth. We can use them for that, after all. It is possible, however, that they have evolved for other purposes. Since we are social beings, Mercier and Sperber observed that this characteristic may have shaped how we reason. After all, we evolved in an environment where, if others believed what you said, you would have more power. That meant better chances at surviving. There must have been strong pressure to be able to argue well and convince people. Convincing could be an advantage regardless of the correctness of the reasoning or the conclusion. Having followers and believers can provide major advantages. Therefore, Mercier and Sperber proposed their Argumentative Theory of Reasoning (ATR). ATR states that our reasoning exists to make us competent at debating and convincing others, and that is often not the same as arriving at the right answer.

      The idea that we reason to make convincing arguments (that might turn out to be true or not) also seems to be applicable for children (Mercier 2011b) as well as other cultures (Mercier 2011a). We may still use our intellects to pursue correct answers to the problems we face. Finding better answers might also have contributed to shaping our reasoning to some degree. The evidence that a good part of our argumentation skill evolved to allow us to win arguments is, however, quite compelling.

      That idea provides an explanation for our overconfidence. Being confident of what we say is a better way to convince others than showing doubt. If we want to convince instead of being right, looking for ways to defend our points of view is a more effective strategy. Confirmation bias now makes sense. We do not need to find counterarguments for our opinions if we will not use them. Unless, of course, we are anticipating a debate and looking for ways to answer those counterarguments. Our so far unexplained biases start making sense if we do not reason for truth but for social reasons (Mercier and Sperber 2017).

      The strategies we use for convincing others might change from one individual to the next. Strategies that might be efficient at changing the mind of one person might fail with someone else. It might seem that different political orientations could be associated with differences in our brains. Political conservatives seem to be more structured in their decision making. They might have a greater need for order and closure. Meanwhile, liberals seem to tolerate ambiguity and new experiences better (Jost et al. 2003). Those observations seem to be associated with differences in the structure of their brains (Amodio et al. 2007). Quite interestingly, the same differences are already noticeable in the brain structure of young adults. Being a liberal seems to correspond to greater gray matter volume in the anterior cingulate cortex, while conservatives would have an increased right amygdala size (Kanai et al. 2011). Right now, it is not clear if those differences cause the political orientation, but the functions of those brain regions suggest that might be the case. The amygdala, for example, is responsible for fear regulation. A larger amygdala might suggest someone is more responsive to fear. The anterior cingulate cortex, in turn, is involved in monitoring uncertainty. In this case, a larger region might, in principle, allow a larger tolerance for uncertainty. On the other hand, recent experiments suggest that both groups might have no difference in the way they respond to threats (Bakker et al. 2019).

      Those observations do not mean, however, that one group reasons better or worse than the other. They might only represent a difference in preferences, which obviously exists between the groups. Still, the question of whether one group would reason in a more competent way did get raised in the literature. Dan Kahan tested that.

      While ideology influences which opinions we will trust, people with different ideologies seem to be influenced in the same way by new information (Kahan 2013). In his experiments, Kahan compared North American conservatives and liberals. Both sides showed the same tendency to fit reports about empirical evidence to their ideological positions. Both sides distorted the meaning of the evidence to support their own views.

      That distortion, we know now, is not a sign of stupidity—quite the opposite. Kahan also tested the cognitive abilities of the subjects, which showed a reversal of what would be expected if lack of intelligence were the cause of this alignment and distortion. Those who scored highest in his cognition test “were the most likely to display ideologically motivated cognition.” That led Kahan to propose his Expressive Rationality Thesis (ERT). ERT claims people process information in ways that promote their individual ends. It is a similar idea to the Argumentative Theory of Reasoning. Both ERT and ATR state that we do not naturally use our reasoning or argumentative skills to get closer to the truth. Instead, we use our brains and language to establish our identities, to convince our allies, or to agree with their positions.

      Those social purposes seem to be the main drivers of our mental skills. When deciding which expert was more reliable, people agreed more with experts who, taking into account their clothes, presence, type of beard, and so on, looked like someone who would share their points of view (Kahan 2010). We unconsciously manipulate information and discourse to advance our ideological positions. Those are, indeed, positions that are defined by and help to define the group to which we belong. We do not treat information that agrees with our opinions the same way we treat information that disagrees with us (Taber and Lodge 2006), and that happens not only on a conscious level, as a strategy, but also subconsciously. The same pattern is also observed when we make unconscious estimates (Gilead et al. 2018).

      People might treat their beliefs as if they were valued possessions, things they want to protect (Abelson and Prentice 1989). We might want to defend our beliefs regardless of whether they are correct. In some circumstances, correctness might not be an issue. We might just be talking about preferences, where there might be no right or wrong. But serious debate could benefit if each side were capable of, at least, understanding the arguments of the opposing view. That is often not the case. When we have strong views, for example, on issues like abortion (Luker 1985) or politics (Sears and Whitney 1973), many of us seem to be incapable of even considering ideas that are opposite to our beliefs.

      Consistency sounds like a good characteristic. Consistent people are considered reliable. If I start changing my expressed opinions too often, people will likely think I am either crazy or not very smart. Experimental evidence does show, as we have seen, that smart people are better at defending their points of view. That does not mean, unfortunately, that consistent people are more likely to be right. More intelligence could, in principle, help us make an impartial and solid analysis of available data. If that were
