The Role of Cognitive Errors in the Drug Policy Debate

By David Hadorn

People often don't think clearly. So much is obvious to everyone, including drug-policy reform advocates who have experienced the muddy thinking too often characteristic of prohibitionists. However, few people realize that an entire body of scientific literature has accumulated showing how and why people's judgments and decisions so often go awry. Some knowledge of this literature might be useful for understanding how otherwise rational people (let's assume) can be so thick-headed.

Among the twenty or so major categories of cognitive error identified by research in judgment and decision making, two in particular are relevant to the drug policy debate: belief perseveration and illusory correlation. The joint action of these phenomena provides an alternative (or supplementary) explanation for the observed intransigence of prohibitionists (and others) in the face of evidence. Such intransigence is generally ascribed simply to ignorance or malevolence. But this is not the whole story.

In this paper I examine the nature of belief perseveration and illusory correlation, with special emphasis on the role of these factors in the drug policy debate, and attempt to discern possible courses of action for minimizing the effect of these errors in this arena.

Belief Perseveration

The processes by which people form, test, reinforce, modify, and (sometimes) abandon their beliefs are reasonably well understood. Among the key processes is the distinct tendency to persist in believing one's existing theories about the world even when faced with overwhelming contrary evidence:

"It appears that beliefs - from relatively narrow personal impressions to broader social theories - are remarkably resilient in the face of empirical challenges that seem logically devastating. Two paradigms illustrate this resilience. The first involves the capacity of belief to survive and even be strengthened by new data, which, from a normative standpoint, should lead to the moderation of such beliefs. The second involves the survival of beliefs after their original evidential bases have been negated." [1]

Drug policy reform advocates will immediately recognize these phenomena in their opponents. Prohibition-minded individuals seem remarkably resistant to altering their beliefs when confronted with contrary evidence. (It would be well, of course, to search for this problem in ourselves as well; but with almost all the evidence on our side, we can be relatively assured that our beliefs are not persisting in spite of evidence.)

A study conducted by Lord et al. [2] illustrates the kind of research used to analyze the problem of belief perseveration. These investigators identified people with clear views (one way or the other) concerning the effectiveness of capital punishment as a deterrent to crime. In a counterbalanced design, subjects were presented with purportedly authentic empirical studies which either supported or refuted their position. Subjects consistently rated the studies supporting their position as "more convincing" and "better conducted" than the studies opposing their beliefs.

"In fact, many individual subjects who had read both the results summary and the procedural details of the study that opposed their belief ultimately became more convinced of the correctness of that belief! No such effects occurred when the same results and procedures were read by subjects whose initial views were supported."

A more serious challenge to one's beliefs than the introduction of new evidence is the revelation that the original bases for those beliefs were completely spurious. Even under these circumstances, beliefs often persist. Several studies have explored this phenomenon; among the most illuminating is one by Anderson et al. [3], in which subjects were told that a functional relationship existed between how well firefighters perform in their jobs and their scores on a test of risk preference (i.e., risk-avoiding versus risk-seeking). Subjects were provided with scenarios in which the job performance of certain firefighters was presented along with their scores on the test.

The investigators found that presenting even a single pair of cases (i.e., one successful and one unsuccessful firefighter with appropriately discrepant scores) was sufficient for subjects to develop beliefs about the functional relationship of performance and test scores. Moreover, these beliefs:

". . . survived the revelation that the cases in question had been totally fictitious and the different subjects had, in fact, received opposite pairings of riskiness scores and job outcomes. Indeed, when comparisons were made between subjects who had been debriefed and those who had not been, it appeared that over 50% of the initial effect of the 'case history' information remained after debriefing."

Ross and Anderson explore a variety of cognitive mechanisms that might underlie the unwarranted persistence of our beliefs and social theories, including biased search, recollection, and assimilation of information; erroneous formation of causal explanations; and behavioral confirmation and "self-fulfilling" hypotheses, among others [1]. All of these mechanisms have been well studied and described, and none is subject to ready correction. The collective power of these errors is formidable.

The authors conclude that "attributional biases and other inferential shortcomings are apt not to be corrected but instead to be compounded by subsequent experience and deliberations." This is a discouraging situation for those who wish to change the minds of others by reference to evidence. With respect to the drug policy debate, the phenomenon of belief perseveration would appear to support the view that the goal of that debate is not (primarily) to change the minds of prohibitionists, but rather to influence those who are truly undecided. Arguing with prohibitionists, in other words, is often a waste of time.

On the other hand, beliefs sometimes do change. Ross and Anderson note that "even if logical or empirical challenges have less impact than might be warranted by normative standards, they may still get the job done." This is particularly likely if the contrary evidence is presented persistently so as to accumulate into an overwhelming body of data. So perhaps there is hope after all.

Moreover, a particularly effective method for changing people's beliefs exists, in the form of "vivid, concrete, first-hand experience" [1]. In the drug policy arena, such experiences might come, for example, through the revelation that a respected friend, family member, or colleague uses cannabis (or another drug), or when such an individual is arrested and jailed for drug use. In addition, revelatory experiences can sometimes be arranged in settings of powerful emotional appeal. In this regard, Ross and Anderson note that "the effectiveness of groups and leaders that accomplish dramatic political or religious conversions offer inviting targets of future research." Thus, proselytizing has its place, as many leaders of the drug policy reform movement are successfully demonstrating.

Illusory Correlation

The other major cognitive error of particular relevance to the drug policy debate is known as illusory correlation: "the tendency to see two things as occurring together more often than they actually do" [4]. Often this (mis)perceived co-occurrence then becomes the basis for erroneous causal inferences. For example, in the context of drug policy, illusory correlation frequently manifests in statements such as "I've seen lots of kids drop out of school after smoking pot," with the implication that smoking pot causes kids to drop out of school.

The phenomenon of illusory correlation was first studied in the early 1960s in the setting of word association tests, in which a series of strategically designed word pairs was briefly presented to test subjects [5]. Invariably, subjects would report that related words (e.g., lion, tiger) appeared together much more often than they actually did. This work was later extended to many real-world applications, including several interesting studies concerning psychologists' interpretation of patients' answers to projective tests. For example, based on a few paranoid people who drew large eyes during development of the Draw-a-Person test, psychologists believed for many years that the tendency to draw large eyes was correlated with (and perhaps caused by) paranoia [6]. This quaint notion has now been thoroughly debunked.

The basic structure of illusory correlation is grounded in signal detection theory, in which the presence of one attribute or event is considered a sign of another. At a crude level, all or most learning about the world consists of discerning which signals are associated with which (current or future) attributes or outcomes. Babies learn that a hand with a spoon coming toward their face is a sign that gustatory sensations will follow. Somewhat older children learn that (some) round green things taste sweet. And so forth.

Despite its apparent simplicity, the process of judging which things are inter-correlated is a difficult one. In general, people tend to make far more errors than "hits" in their judgments of correlation, especially when -- as is often the case -- their expectations, hopes, and prior theories interfere with objective coding and processing of data.

Even absent such biases, the process is a challenging one, in large part because it requires the construction (if only in one's mind) of a 2x2 table for each pair of signals and (putatively) corresponding attributes. The four resulting cells correspond to (1) signal present, attribute present; (2) signal present, attribute absent; (3) signal absent, attribute absent; and (4) signal absent, attribute present. For example, data in the first cell might correspond to people who have smoked cannabis (signal) and who later drop out of school (attribute), or to things that are round and green (signal) and that taste sweet (attribute).

In the parlance of signal detection theory, the four cells are called, respectively, true positive, false positive, true negative, and false negative attributions. (Similar language is used in epidemiology, medical testing, and many other settings.) Thus, a green marble would be a false positive case of the putative correlation between round-greenness and sweetness, and students who do not smoke cannabis and who do not drop out of school would be considered true negative instances with respect to the theory that cannabis causes kids to drop out of school. Those who smoked cannabis but did not drop out would be false-positive cases, while students who do not smoke cannabis but drop out "anyway" are false-negative cases.
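
To make the required bookkeeping concrete, the following sketch (in Python, with made-up counts chosen purely for illustration) builds such a 2x2 table for the cannabis-and-dropout example and computes the phi coefficient, a standard measure of association for 2x2 tables. Note that every one of the four cells enters the formula.

    import math

    # Hypothetical counts, for illustration only -- not real data.
    # Signal = smoked cannabis; attribute = dropped out of school.
    tp = 40    # cell 1: smoked, dropped out            (true positive)
    fp = 160   # cell 2: smoked, did not drop out       (false positive)
    tn = 640   # cell 3: didn't smoke, didn't drop out  (true negative)
    fn = 160   # cell 4: didn't smoke, dropped out      (false negative)

    def phi(tp, fp, tn, fn):
        """Phi coefficient for a 2x2 table, ranging from -1 to +1.
        All four cells appear in the formula, which is why no subset
        of cells can settle a question of correlation."""
        num = tp * tn - fp * fn
        den = math.sqrt((tp + fp) * (fn + tn) * (tp + fn) * (fp + tn))
        return num / den

    print(phi(tp, fp, tn, fn))  # 0.0 -- no association at all

The counts were chosen to make a point: the forty students in cell 1 are vivid and memorable, yet the dropout rate is 20% with or without the signal (40/200 versus 160/800), and the association is exactly zero.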

The major cognitive error responsible for producing illusory correlation is the strong tendency to focus almost exclusively on true positive cases, i.e., cell #1 in the above schema. Such cases, which are usually the most visible and impressive of any of the four cells, typically lead to the development of theories about correlation and causation -- which, once formed, are remarkably resistant to empirical challenge, as described earlier.

Focusing on a single cell of a 2x2 table is useless, or worse, for judging correlation. Even considering two or three of the four cells is inadequate; all four are necessary. For example, as discussed by Jennings et al. [7], if asked to test the theory that red-haired individuals are hot-tempered, most people would attempt to recall the red-haired individuals they have known and try to remember whether those individuals were hot-tempered (cells #1 and #2). More sophisticated "intuitive psychologists" would attempt to recall the hot-tempered people they have known and determine how many of these had red hair. But it would occur to very few people that the proportion of even-tempered blondes and brunettes is essential to the judgment task at hand.
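
The inadequacy of these partial strategies can be shown directly. The sketch below (hypothetical counts again) constructs two populations containing exactly the same number of hot-tempered redheads; once the remaining cells are filled in, one population shows a strong positive association between red hair and temper, the other a strong negative one.

    import math

    def phi(a, b, c, d):
        """Phi coefficient for a 2x2 table with cells:
        a = signal+, attribute+    b = signal+, attribute-
        c = signal-, attribute+    d = signal-, attribute-"""
        return (a * d - b * c) / math.sqrt(
            (a + b) * (c + d) * (a + c) * (b + d))

    # Two hypothetical populations with identical cell #1 counts:
    # thirty hot-tempered redheads apiece.
    pop_a = dict(a=30, b=10, c=20, d=140)  # most redheads hot-tempered
    pop_b = dict(a=30, b=90, c=70, d=10)   # most redheads even-tempered

    print(round(phi(**pop_a), 2))  #  0.58 -- positive association
    print(round(phi(**pop_b), 2))  # -0.61 -- negative association

Anyone who recalls only the thirty hot-tempered redheads cannot tell these two worlds apart; the even-tempered blondes and brunettes (cell d) are doing decisive work.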

Similar problems with illusory correlation frequently occur with respect to cause and effect, where the signal precedes the outcome attribute or event. For example, the efficacy of prayer (signal) is often attested to by people who testify (one way or the other) to the proportion of cases in which something prayed for does or does not occur (event). Fewer would attend to the proportion of cases in which something hoped-for came about without recourse to prayer. But "even the most sophisticated of intuitive psychologists would probably balk at the suggestion that data from the 'absent-absent' cell (i.e., favorable outcomes that were not prayed for and that did not occur) are indispensable for assessing the impact of prayers on worldly outcomes" [7].

Illusory correlation and belief perseveration are mutually reinforcing phenomena. Once an apparent correlation is observed (i.e., in the true positive cell), the mind is remarkably adept at developing theories about why the association was observed. In the case of cannabis smoking and dropping out of school, for example, the likely hypothesis would be that cannabis produces an "amotivational syndrome" or in some way impairs one's ability to function in school. This hypothesis might seem reasonable to many people (especially given the influence of government propaganda), and it is therefore inaugurated into one's repertoire of beliefs. The belief then perseveres despite the discrediting of its original evidential bases (as discussed earlier) because, well, the hypothesis still seems reasonable.

Discussion

The problems of belief perseveration and illusory correlation, like most cognitive errors, are remarkably resistant to correction. By the time one is past childhood it is probably too late to change the often-erroneous natural patterns and processes used to form judgments about correlation. No doubt the entire subject of cognitive errors is something that should be taught in school early on, together with basic rules of logical thinking. The absence of such subjects in our curricula is largely responsible, in my opinion, for the unwarranted success of many a demagogue in promoting ridiculous ideas.

In the drug policy arena, belief perseveration and illusory correlation clearly underlie much of the thinking manifested by prohibitionists, especially in terms of their "reasons" for believing that illegal drugs are harmful -- and that prohibition is likely to be a viable policy even if such drugs are, in fact, harmful. The mutually reinforcing nature of belief perseveration and illusory correlation should not be underestimated. Drug policy reform advocates must fully understand and appreciate these problems if corrective measures are to have any chance of being effective.

What form might these measures take? There are two major avenues for exploiting this knowledge. First, at a global level, it could be helpful simply to publicize and promulgate what is known about the existence and role of cognitive errors in general, and of belief perseveration and illusory correlation in particular. We could then invite our adversaries to scrutinize their thinking and belief systems for evidence of contamination by (at least) these two cognitive errors. We would promise to do the same. Members of each side could help the other by summarizing the evidence they believe is not being taken into account. Obviously such an invitation is not likely to be well received by die-hard PDFA types, but it would seem hard to resist (on intellectual grounds, at least) for anyone with a shred of respect for logical thinking.

On a less global level, drug-policy reform advocates should be alert for opportunities to debunk arguments based on "cell 1 myopia." For example, in response to the prototypical drug warrior's claim quoted earlier ("I've seen lots of kids drop out of school after smoking pot"), one might ask: "And what percentage of kids who smoke pot don't drop out of school? Are you sure the dropout rate among them isn't even lower than the rate among kids who don't smoke pot? Wouldn't that mean that smoking pot protects against dropping out of school? And what proportion of kids who drop out never smoked pot at all? Without knowing the answers to these questions, the relationship between smoking pot and dropping out, if any, is impossible to determine. Are you aware of that fact? If not, why not? If so, why are you putting forth manifestly invalid arguments?"

Or words to that effect. This line of attack (or defense) is likely to be particularly effective in cases, such as the one just alluded to, where empirical data does not exist to support or refute a particular position. Where such facts do exist, bringing them to light will often be more effective than pointing out the cognitive errors inherent in the prohibitionists' arguments. Ideally, one would like to do battle on both fronts, but this will not always be feasible. In such cases, judgment will be required to determine which approach is most likely to change one's opponent's mind.
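
For readers who want the arithmetic behind that line of questioning, a brief sketch (hypothetical numbers only) shows how the direction of the relationship is fixed by comparing the dropout rate among smokers with the rate among non-smokers -- a comparison that requires data from all four cells, not just the vivid cases in cell #1.

    # Hypothetical counts, for illustration only.
    smoked_dropped = 40       # the "lots of kids" a drug warrior has seen
    smoked_stayed = 360
    not_smoked_dropped = 100
    not_smoked_stayed = 500

    rate_smokers = smoked_dropped / (smoked_dropped + smoked_stayed)
    rate_others = not_smoked_dropped / (not_smoked_dropped + not_smoked_stayed)

    print(f"dropout rate, smokers:     {rate_smokers:.0%}")  # 10%
    print(f"dropout rate, non-smokers: {rate_others:.0%}")   # 17%

With these made-up numbers, the forty vivid cases coexist with a lower dropout rate among smokers than among non-smokers; the cell #1 anecdote, by itself, settles nothing.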

References

1. Ross L, Anderson CA. Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments. In Kahneman, Slovic, Tversky, eds., Judgment Under Uncertainty: Heuristics and Biases. Cambridge U. Press, 1982.

2. Lord CG, Ross L, Lepper MR. Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology 1979; 37: 2098-2110.

3. Anderson CA, Lepper MR, Ross L. Perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology 1980; 39: 1037-1049.

4. Chapman LJ, Chapman JP. Test results are what you think they are. In Kahneman, Slovic, Tversky, eds., Judgment Under Uncertainty: Heuristics and Biases. Cambridge U. Press, 1982.

5. Chapman LJ. Illusory correlation in observational report. Journal of Verbal Learning and Verbal Behavior 1967; 6: 151-155.

6. Chapman LJ, Chapman JP. Illusory correlation as an obstacle to the use of valid psychodiagnostic signs. Journal of Abnormal Psychology 1969; 74: 271-280.

7. Jennings DL, Amabile TM, Ross L. Informal covariation assessment: Data-based versus theory-based judgments. In Kahneman, Slovic, Tversky, eds., Judgment Under Uncertainty: Heuristics and Biases. Cambridge U. Press, 1982.
