The Research Experience. Ann Sloan Devlin



lake, make, rake, take, bike, hike, like, mike, … (k in the third position) generates far more possibilities; in fact, two times as many in a typical text (Tversky & Kahneman, 1973).
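      Position counts like these are easy to sketch in code. The following is an illustrative toy (not from the text): the word list is tiny and hand-picked, so the resulting counts are not real corpus frequencies.

```python
def position_counts(words, letter, pos=2):
    """Count words that start with `letter` vs. words that have it at
    index `pos` (the third letter when pos=2), mirroring the question
    Tversky and Kahneman asked about the letter k."""
    first = sum(1 for w in words if w.startswith(letter))
    third = sum(1 for w in words if len(w) > pos and w[pos] == letter)
    return first, third

# Tiny illustrative word list; a real test would use a representative corpus.
words = ["lake", "make", "rake", "take", "bike", "hike", "like", "kite", "king"]
print(position_counts(words, "k"))  # → (2, 7)
```

      Even in this toy list, k appears in the third position far more often than in the first, although words beginning with k come to mind more easily.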

      The availability heuristic emerges in research in many ways. For example, if we develop a questionnaire that first asks people to rate a list of items describing their university in terms of preference (e.g., food, school spirit, academics, career counseling, cost, and residence halls), and then we ask them an open-ended question about advantages and disadvantages of attending that university, the items from that initial list will be available in memory and will likely influence what people say in the open-ended question. If we had asked the open-ended question first, we might get a different set of responses. Thus, the order in which information is presented to participants may influence their responses and is related to the availability heuristic. Chapter 10 discusses one way to address this problem of availability by doing what is known as counterbalancing the order of presentation of materials. In complete counterbalancing, all possible orders of presenting the materials are included in the research approach.
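      Complete counterbalancing can be sketched in a few lines. This is an illustrative sketch only; the section names are hypothetical, not from an actual study.

```python
from itertools import permutations

# Hypothetical questionnaire sections, to be presented in every possible
# order so that no single order systematically shapes what is available
# in participants' memory.
sections = ["rating list", "open-ended question", "demographics"]

orders = list(permutations(sections))
for i, order in enumerate(orders, 1):
    print(i, " -> ".join(order))

# With k sections there are k! possible orders (3! = 6 here). Because k!
# grows quickly, complete counterbalancing is practical only for small k.
```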

      Counterbalancing: Presenting the treatment conditions in different orders to control for the influence of order effects, a potential confounding variable, in an experiment.

      Wason Selection Task: Logic problem in which you have to determine which of four two-sided cards need to be turned over to evaluate the stated hypothesis (e.g., if there is a vowel on one side there is an even number on the other).

      Humans Want to Confirm Hypotheses

      What we have available in memory influences us in other important ways, specifically when we think about ways to confirm our hypotheses rather than when we think of ways to disconfirm them. Figure 1.3 shows a well-known example of our preference for thinking about information in terms of the way it is presented: the Wason Selection Task (Wason, 1966, 1968). This task involves making decisions about two-sided cards. This task has many variations, but in one version (Figure 1.3), people are told the following: These cards have two sides: a letter of the alphabet on one side and a number on the other. Then people are told a “rule,” and their job is to make sure the rule is being followed. Here is the rule: If there’s a vowel on one side, there’s an even number on the other.

Four cards are labeled E, K, 4, and 7.

      Figure 1.3 Example of Wason Selection Task

      Then they are asked:

      Which card or cards do you have to turn over to make sure the rule is being followed?

      Why do people typically answer as they do? One reason is that they heard the statement, “If there’s a vowel …,” and so what do they see? They see a vowel (E). They have a vowel available (think availability heuristic), and it seems logical to investigate the other side of that card. And they are correct, at least to that point; they should turn over the E. But they must also turn over the 7 to make sure that there is no vowel on the other side of that card. People don’t do that; they don’t think to disconfirm the rule.

      Try This Now 1.1

      Before you read further, what card(s) did you select?

      People usually select E and frequently E in combination with K and 4; they hardly ever select 7.

      The Wason Selection Task demonstrates an important aspect of thinking related to research: humans have a much easier time thinking of ways to confirm information (think hypothesis) than of ways to disconfirm it. What comes far less easily is taking a disconfirmational strategy toward the hypothesis or theory, actively seeking evidence against it. Humans tend to exhibit what is known as confirmation bias in that we look for information that confirms our hypotheses. We also need to ask ourselves what situation(s) would be a good test to show that the hypothesis is incorrect.
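      The falsification logic of the task can be written down directly. This is a minimal sketch (not from the text): a card must be turned over exactly when its visible face could expose a violation of the rule.

```python
def must_turn(face: str) -> bool:
    """Return True if the visible face could expose a rule violation.

    Rule: if a card has a vowel on one side, it has an even number on
    the other. Only a visible vowel (the back might be odd) or a visible
    odd number (the back might be a vowel) can falsify the rule.
    """
    if face.isalpha():
        return face.lower() in "aeiou"  # vowel: back must be even
    return int(face) % 2 == 1           # odd number: back must not be a vowel

cards = ["E", "K", "4", "7"]
print([c for c in cards if must_turn(c)])  # → ['E', '7']
```

      Note that turning over the 4 can never falsify the rule: the rule says nothing about what must appear on the back of an even number, which is why choosing the 4 is a confirmation-seeking move.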

      Confirmation bias: Tendency to look for information that confirms our hypotheses.

      In the research findings of Kahneman and Tversky, you have seen that our cognitive processes are susceptible to a wide range of influences and biases. Even such respected researchers as Kahneman and Tversky may have been susceptible to the biases they studied. In the article “Voodoo Correlations Are Everywhere—Not Only in Neuroscience,” Klaus Fiedler (2011) argued that Tversky and Kahneman’s (1973) demonstration of the availability heuristic may have relied on an unrepresentative letter (K). Because the finding has not been replicated with many other letters of the alphabet (as Fiedler reported, citing the work of Sedlmeier et al. [1998]), using K may not have been a good test of Tversky and Kahneman’s hypothesis. In selecting their stimulus (K) intuitively, Fiedler explained, Tversky and Kahneman were fallible human beings: “Such an intuitive selection process will typically favor those stimuli that happen to bring about the expected phenomenon, making mental simulation an omnipresent source of bias in behavioral research” (Fiedler, 2011, p. 165).

      In other words, Fiedler (2011) argued that the authors (consciously or otherwise) selected a stimulus that was likely to prove their point. The larger message of this example is a cautionary tale: researchers, as cognitive animals, want to validate their hypotheses. Reinforcing what the Wason Selection Task shows, they seek to prove, not to disprove, and are likely to select stimuli that support their hypotheses rather than stimuli that challenge or refute them.

      How can humans guard against this common “affirming” behavior? Being aware that such errors are likely is the first step. Asking how one might disprove or refute the hypothesis is another step. Imagining the conditions under which a prediction would not hold is as important as identifying the conditions under which the prediction is likely to hold. In other words, ask yourself what evidence would counter the hypothesis.

      Revisit and Respond 1.1

       Explain what it means to say humans are limited information processors.

       Describe the concept of a schema and its adaptive and maladaptive implications for research.

       Define heuristics and give examples of representativeness and availability.

       Explain the Wason Selection Task and what it shows about the difference between confirming and disconfirming hypotheses.

      Other Problems in Thinking

      Several problems in thinking have been covered; let’s discuss a few more and, in the process, reinforce some of the information already presented. In Shermer’s (1997) Why People Believe Weird Things, Chapter 3 is titled “How Thinking Goes Wrong: Twenty-Five Fallacies That Lead Us to Believe Weird Things.” In that chapter, Shermer discussed four major categories of difficulties in how we think about evidence and data (Table 1.1). To illustrate these categories and the problems they present for our research, this chapter focuses on examples (see shading) in each category.

      Problems in Scientific Thinking: Theory Influences Observations

      As part of the category “Problems in Scientific Thinking,” Shermer listed “Theory influences observations” (1997, p. 46). What this statement means is that theory in some sense directs, shapes, or may even limit the kinds of observations humans make. Again, it is clear that we might limit ourselves because we look for a particular kind of behavior rather than being open to any kind of activity in the environment. Most people have never heard a peacock’s scream and would never guess that the sound they hear when visiting a suburb outside Los Angeles comes from that bird. Why? Because most of us think peacocks are birds that reside in captivity. But peacocks have roamed wild in some places (like Rolling Hills on the Palos Verdes Peninsula in California) for more than 100 years. We limit our choices to the most

