In many instances, individuals like to focus their attention on information that confirms their beliefs or attitudes. If they feel that euthanasia is inappropriate, for example, they will often seek information that highlights the potential risks of this procedure. This inclination is often called a confirmation bias (Jonas, Schulz-Hardt, Frey, & Thelen, 2001; Wason, 1960, 1968) or a congeniality bias (Eagly & Chaiken, 2005). Other terms have been applied to similar phenomena, such as hypothesis locking, myside bias, and the Tolstoy syndrome, so named because the famous author described similar tendencies.
In other instances, individuals deliberately seek information that might discount their beliefs or attitudes. Specifically, when the principal motive of individuals is to form accurate beliefs, they will often seek this disconfirming information. When the principal motive of individuals is to defend their beliefs, perhaps to nullify negative emotions, they will tend to seek confirming information (e.g., Hart, Albarracin, Eagly, Brechan, Lindberg, & Merrill, 2009).
Sometimes, the term confirmation bias is restricted to this specific phenomenon--in which individuals select information that aligns with their attitudes, beliefs, or behaviors. Other authors apply this term more broadly, referring to instances in which individuals also underrate the importance of contradictory information or maintain their attitudes, beliefs, or behaviors despite conflicting information (e.g., Rassin, 2008).
Many striking instances of the confirmation bias were uncovered in the 1960s. In a typical study, participants first report or demonstrate some attitude, belief, or behavior. They might express their attitudes towards some issue or object, such as abortion. Alternatively, they might reach a decision, such as whether or not someone is guilty of a crime. In addition, they might report the extent to which they engage in some behavior, such as smoking.
Second, participants receive some alternative sources of information about this attitude, belief, or behavior. Usually, only a title or abstract of each source is presented. Nevertheless, the title or abstract is typically sufficient to decide whether the information reinforces or contradicts their attitude, belief, or behavior. Generally, half of the sources are consistent with their attitude, belief, or behavior, and half are inconsistent.
Participants then specify which sources of information they would like to read in more detail. The inclination to select only the sources that reinforce their attitude, belief, or behavior manifests a confirmation bias.
Some studies have examined the confirmation bias in the realm of beliefs. Adams (1961), for example, asked a sample of mothers whether they felt genetics or the environment primarily determines the development of children. Next, these mothers were granted the opportunity to decide which of two speeches, only one of which seemed to reinforce their belief, they would like to hear. Most of the mothers chose the speech that aligned with their belief.
Many factors affect the magnitude of this confirmation bias. Indeed, in some instances, a disconfirmation bias prevails.
To illustrate, various facets of attitude strength can affect the magnitude of confirmation biases. The confirmation bias, for example, seems to be pronounced if individuals are committed to these attitudes (e.g., Cotton, 1985; Frey, 1986). This bias could be adaptive, because otherwise individuals might shift their behavior too frequently (see Spreading of alternatives).
When individuals experience some form of threat, the confirmation bias tends to intensify. Jonas, Greenberg, and Frey (2003), for example, showed the confirmation bias is especially pronounced after individuals contemplate their mortality. Specifically, according to Terror Management Theory, when mortality is salient, individuals experience the need to connect themselves to an enduring cause or collective. As a consequence, they become more inclined to defend their worldviews. They will, thus, disregard information that refutes their prevailing attitudes or beliefs.
Similarly, as Jonas, Graupmann, and Frey (2006) showed, mood can also magnify or inhibit the confirmation bias. In particular, when individuals experience a positive mood, the confirmation bias dissipates: Individuals are often willing to embrace information that diverges from their expectations or beliefs. In contrast, when individuals experience a negative mood, the confirmation bias is amplified: Individuals are especially inclined to dislike information that diverges from their expectations or beliefs.
Specifically, in this study, participants reached a preliminary decision about which of five holiday destinations they would prefer. Next, they watched either a comedy or a documentary on boys, detained in a youth center, who had been sexually abused by guards. Finally, they were granted opportunities to read more information about the holiday destinations. If participants had watched the comedy, and therefore experienced a positive mood, they were more willing to read information they believed might conflict with their original choice of a holiday destination.
Presumably, when individuals feel positive and strong, they feel they can withstand the complexities that such unexpected or dissonant information entails. Indeed, they might perceive such information as an opportunity to develop. In contrast, when individuals feel negative, they do not feel they can withstand this complexity and dissonance.
Sometimes, messages are difficult to process, a state called disfluency (see fluency and the hedonic marker hypothesis). For example, the font may be difficult to read. Because of this disfluency, individuals feel they need to devote more effort to these messages. They are not as likely to rely on their preconceptions or other heuristics, and the confirmation bias, therefore, tends to subside.
This possibility was proposed and validated by Hernandez and Preston (2013). In one of their studies, for example, individuals indicated whether they support or oppose capital punishment. Next, they read a message that presented arguments that contradicted their opinions on this issue. The message was written in either a clear font or a hazy font. Finally, they answered questions that gauged their attitudes towards this issue. If the font was hazy and difficult to read, participants were especially likely to shift their opinions to align with the message; that is, the confirmation bias seemed to dissipate, but only when the font was hazy.
The magnitude of confirmation biases also varies across domains. To illustrate, a meta-analysis, undertaken by Hart, Albarracin, Eagly, Brechan, Lindberg, and Merrill (2009), showed the confirmation bias is especially pronounced when the issue relates to politics. Consistent with this finding, Republican voters are approximately 1.5 times more likely to report watching Fox News than are Democrat voters. In contrast, Democrat voters are almost 1.5 times more likely to watch CNN (The Pew Research Center for the People & the Press, 2006; cited by Hart, Albarracin, Eagly, Brechan, Lindberg, and Merrill, 2009).
Various personality traits might amplify or inhibit the confirmation bias (see Rassin, 2008, for a measure that represents susceptibility to confirmation biases). That is, several traits might curb the receptivity of individuals to contradictory information, such as right wing authoritarianism (Altemeyer, 1981).
Albarracin and Mitchell (2004), for example, assessed the extent to which individuals feel they could refute arguments that contradict their beliefs or attitudes. Individuals who felt they might not be able to refute these arguments were more inclined to disregard information that contradicts their attitudes or beliefs. Specifically, in this study, some individuals endorsed items such as "When trying to defend my point of view, I am not at all articulate" and "I am unable to defend my own opinions successfully". These individuals were more likely than other participants to select sources of information that reinforced their attitudes. Nevertheless, when they did encounter information that contradicted their initial attitudes, they were more persuaded by this information.
In some instances, individuals experience a motivation to form accurate beliefs or reasonable attitudes. In these contexts, they often seek information that contradicts their beliefs or attitudes--primarily to ensure their conclusions are sound.
To illustrate, if individuals need to justify their beliefs or attitudes in a forthcoming debate, the confirmation bias tends to abate. Individuals seek information that contradicts their beliefs or attitudes, primarily to anticipate possible rebuttals (Freedman, 1965).
Several features of the context can evoke this need to form accurate beliefs or reasonable attitudes. Jonas, Schulz-Hardt, and Frey (2005), for example, showed that the confirmation bias dissipates if individuals feel that somebody else will rely on their beliefs or attitudes. That is, when participants needed to advise a customer, they became more inclined to seek information that contradicted their initial opinion on some matter.
More generally, the importance of accuracy, and hence the orientation towards disconfirming information, tends to escalate when the decision of individuals is related to some important outcome, called outcome-relevant involvement. To illustrate, in a study conducted by Jonas and Frey (2003), some of the participants were informed they would win a prize if their decision was correct. These participants did not demonstrate a strong confirmation bias but, instead, sought disconfirming information, presumably to ensure their opinions were accurate.
If people feel distracted while they reach a preliminary decision, the confirmation bias later dissipates. In contrast, if people deliberate carefully or trust their initial instincts as they reach a preliminary decision, the confirmation bias is reinforced. Arguably, distracted individuals are more inclined to doubt their original decision and, therefore, are especially receptive to information that may diverge from this initial choice.
This possibility was proposed and validated by Fischer, Fischer, Weisweiler, and Frey (2010). In this study, participants imagined they were a manager. They received mixed reviews about one of their employees, Mr Miller. Their task was to decide whether to dismiss this person.
Immediately after they received this information, some participants completed a proofreading task for five minutes to distract their attention--and then needed to select their initial choice immediately. Some participants deliberated carefully for five minutes. Finally, some participants were told to trust their instincts and reach an immediate choice.
Next, all participants received more snippets of information about Mr Miller. They could then choose to read these snippets of information in more detail before updating their decision. Relative to participants who deliberated carefully or trusted their initial instincts, participants who were distracted during the five minutes were more inclined to read information that challenged their initial choice. For example, if they initially decided to dismiss Mr Miller, these individuals were willing to read favorable information about this person in more detail. Their confirmation bias diminished.
According to Fischer, Fischer, Weisweiler, and Frey (2010), this finding shows that distraction limited the extent to which people were certain of their initial decision, diminishing the confirmation bias. Yet, from the perspective of unconscious thinking theory, the distraction condition may have enabled unconscious processes to unfold. These processes could have underscored the complexities and contradictions in the information, motivating participants to seek more information.
Fischer, Jonas, Frey, and Schulz-Hardt (2005) showed that confirmation biases are less pronounced when participants must also undertake some other task concurrently. In other words, cognitive load attenuates the confirmation bias. This finding implies the confirmation bias might largely entail deliberate processes rather than automatic processes. Nevertheless, whether these biases depend on conscious or automatic processes might be contingent upon whether a defense or accuracy motivation is salient (Hart, Albarracin, Eagly, Brechan, Lindberg, & Merrill, 2009).
After individuals engage in a task that demands appreciable self-control and discipline, consuming resources from a limited supply of mental energy (see Ego depletion), they are more likely to demonstrate a confirmation bias. This finding, according to Fischer, Greitemeyer, and Frey (2008), implies that limitations in motivation, rather than cognition, elicit this confirmation bias. That is, individuals experience the inclination to read information that confirms their preferences. To override this inclination, and thus read information that counters their preferences, they need to mobilize their effort, consuming mental energy.
In this study, participants watched a series of video clips. A series of words appeared towards the bottom of the screen. To deplete limited resources, some participants were explicitly instructed to disregard these words--a task that demands careful discipline. The remaining participants received no instructions about these words.
Next, to assess the confirmation bias, participants received summaries of various articles, half of which contradicted the political orientation of these individuals. If participants had attempted to disregard the words, they were especially disinclined to read articles that diverged from their political orientation (Fischer, Greitemeyer, & Frey, 2008). Subsequent studies showed this effect of ego depletion cannot be ascribed to mood, cognitive load, or ego threat.
Finally, the extent to which individuals felt committed to their political orientation mediated this relationship (Fischer, Greitemeyer, & Frey, 2008). That is, after mental energy is depleted, individuals do not feel as motivated to shift their political orientation. They feel more committed to this orientation and, thus, become unwilling to consider information that contradicts their political orientation.
Hart, Albarracin, Eagly, Brechan, Lindberg, and Merrill (2009) enumerate a series of other possible factors that could amplify or inhibit the confirmation bias. According to these authors, a motivation to defend cognitions usually underpins this bias. If this motive is fulfilled through other means, such as self-affirmation, the confirmation bias should thus abate.
The confirmation bias was initially ascribed to cognitive dissonance theory, first propounded by Festinger (1957, 1958, 1964). In particular, when individuals entertain conflicting cognitions, they experience an unpleasant feeling, called dissonance. For example, this dissonance can arise if they reflect upon beliefs that contradict their behaviors. To prevent this state, individuals might prefer only to seek information, and thus entertain thoughts, that align with their existing beliefs, attitudes, or behaviors.
Westen, Blagov, Harenski, Kilts, and Hamann (2006), in their study of the neurophysiological underpinnings of these biases, provided some indirect evidence of this account. In this study, participants received two contradictory statements, articulated by George W Bush, John Kerry, or Tom Hanks. Next, they received a statement, from the same person, that seemed to reconcile these two statements. To illustrate, the first statement might demonstrate George Bush's support of Ken Lay, the former CEO of Enron, in 2000. The second statement might refer to the inclination of George Bush to criticize Enron. The third statement, which reconciled this conflict, emphasized how George Bush felt betrayed by Ken Lay and was genuinely shocked by the decline of Enron.
Two important findings emerged. First, after the third statement was presented, participants still perceived the remarks from the leader of the party they did not support as contradictory. Second, evaluations of these contradictions activated brain regions associated with emotional regulation, such as the amygdala, anterior cingulate cortex, the posterior cingulate, and the insula. These findings indicate the motivation to reject or avert contradictions, which might underpin the confirmation bias, is related to regions that regulate negative emotional states.
In addition, the dorsomedial frontal cortex was also activated. This region is associated with relating information to the self (D'Argembeau, Collette, Van der Linden, Laureys, Del Fiore, Degueldre, et al., 2005; Fossati, Hevenor, Graham, Grady, Keightley, Craik, et al., 2003) as well as the experience of sympathy (Decety & Chaminade, 2003). Presumably, this need to reconcile the conflicting information was evoked only when individuals identified with the person, activating a sense of self.
Several motives might underpin the inclination of individuals, at least in some circumstances, to seek information that refutes, rather than confirms, their attitudes, beliefs, and behavior. To illustrate, individuals often seek novelty--and disconfirming information is, sometimes, more likely to fulfill this motive (Sears, 1965).
Furthermore, individuals like to form accurate beliefs (Chaiken, Liberman, & Eagly, 1989). This need to maintain accuracy could also underpin many instances of disconfirmation bias (Jonas, Schulz-Hardt, & Frey, 2005).
When the evidence that generated some belief is invalidated or eradicated, individuals still tend to maintain this belief. The confirmation bias might explain this tendency (Ross & Anderson, 1982).
The approach that demonstrates this tendency is called the debriefing paradigm (see Kunda, 1999; Ross & Anderson, 1982). First, participants receive information that supports some argument or hypothesis. Second, the beliefs or attitudes of participants towards this hypothesis are assessed. Third, participants are told the original information was actually erroneous. Fourth, beliefs or attitudes are evaluated again. To some extent, the original attitudes and beliefs remain intact, even when the origins of these cognitions are invalidated.
To illustrate, in a study conducted by Ross, Lepper, and Hubbard (1975), participants completed a task in which they needed to distinguish between legitimate and fabricated suicide notes. Some participants were informed, at random, they were excellent at this task. These participants continued to believe they were better at this activity, even after they were told this feedback was in fact fabricated (Anderson, Lepper, & Ross, 1980).
Cognitive egocentrism is defined as the tendency of some individuals to prefer or favor their own perspective, even if that perspective is likely to be misguided. For example, they may assume that other individuals perceive the same visual features as they do, even if seated in a different spot. Such cognitive egocentrism may be related to interpersonal coldness. That is, if individuals dismiss the perspective of someone, they may not appreciate the needs and emotions of this person.
For example, in one study, conducted by Boyd, Bresin, Ode, and Robinson (2013), participants first completed a pair of tasks that putatively assess cognitive egocentrism. On each trial, they received a sound, such as a gun shot, over headphones and had to identify whether the sound was presented on the left or right side. In addition, a dot appeared below a horizontal line on a computer screen. Participants needed to specify the point on the line that represented the horizontal position of this dot. After completing this pair of tasks, participants indicated the degree to which they exhibit various traits, such as sympathetic or ruthless. Interestingly, some participants perceived the location of the dot as biased towards the sound. These participants were more likely to report elevated levels of coldness rather than warmth.
Presumably, the sound shifted the attention of individuals towards one side. If people exhibit cognitive egocentrism and overestimate the relevance of their perspective, they may perceive this side of space as more important. Consequently, they may feel the dot is also located towards this side. In other words, this bias towards the location of the sound may reflect cognitive egocentrism, and this tendency was associated with coldness rather than warmth.
Individuals often overestimate the extent to which some outcome or consequence was foreseeable. That is, after some event transpires, they like to believe they did, or would have, predicted this outcome, a tendency called the hindsight bias. The problem with this bias is that individuals may become too confident in their predictions and not accommodate a range of possibilities.
Wu, Shimojo, Wang, and Camerer (2013), for example, revealed a hindsight bias in the visual domain. One set of participants, called the performers, needed to decide whether blurred photos contained humans. Another set of participants, called the evaluators, needed to decide whether people could detect humans in the various photos--although, on some trials, either a clear version of this photo first appeared rapidly or a phrase, like "there is a human", appeared, indicating whether or not the photo included a human.
In general, the evaluators significantly overestimated the likelihood the performers would accurately determine whether the photos contained humans. But, this overestimation was observed only when evaluators first saw the clear image or, to a lesser extent, a phrase that indicated whether or not the photo included a human. Therefore, when evaluators knew the answer, they assumed the task would be easy.
This finding can be ascribed to a sampling bias. That is, after seeing the clear version, the evaluators perhaps directed their attention to properties of the blurred version that resembled the clear version. Because of this bias, they overestimated the similarity between these two versions. The blurred variant, therefore, did not seem too blurred. Indeed, as Wu, Shimojo, Wang, and Camerer (2013) showed in one of their studies, the distribution of gaze differed most appreciably between performers and evaluators when the hindsight bias was more pronounced. Likewise, when evaluators received information about the distribution of gaze in performers, the hindsight bias dissipated.
Adams, J. S. (1961). Reduction of cognitive dissonance by seeking consonant information. Journal of Abnormal and Social Psychology, 62, 74-78.
Albarracin, D., & Mitchell, A. L. (2004). The role of defensive confidence in preference for proattitudinal information: How believing that one is strong can sometimes be a defensive weakness. Personality and Social Psychology Bulletin, 30, 1565-1584.
Altemeyer, B. (1981). Right-wing authoritarianism. Winnipeg, Manitoba, Canada: University of Manitoba Press.
Anderson, C. A., Lepper, M. R., & Ross, L. (1980). Perseverance of Social Theories: The Role of Explanation in the Persistence of Discredited Information. Journal of Personality and Social Psychology, 39, 1037-1049.
Baron, J. (2000). Thinking and deciding (3rd ed.). New York: Cambridge University Press.
Betsch, T., Haberstroh, S., Glöckner, A., Haar, T., & Fiedler, K. (2001). The effects of routine strength on adaptation and information search in recurrent decision-making. Organizational Behavior and Human Decision Processes, 84, 23-53.
Boyd, R. L., Bresin, K., Ode, S., & Robinson, M. D. (2013). Cognitive egocentrism differentiates warm and cold people. Journal of Research in Personality, 47(1), 90-96. doi:10.1016/j.jrp.2012.09.005
Brannon, L. A., Tagler, M. J., & Eagly, A. H. (2007). The moderating role of attitude strength in selective exposure to information. Journal of Experimental Social Psychology, 43, 611-617.
Brock, T. C., Albert, S. M., & Becker, L. A. (1970). Familiarity, utility, and supportiveness as determinants of information receptivity. Journal of Personality and Social Psychology, 14, 292-301.
Chaiken, S., Liberman, A., & Eagly, A. H. (1989). Heuristic and systematic information processing within and beyond the persuasion context. In J. S. Uleman & J. A. Bargh (Eds.), Unintended thought (pp. 212-252). New York: Guilford Press.
Clarke, P., & James, J. (1967). The effects of situation, attitude intensity and personality on information-seeking. Sociometry, 30, 235-245.
Cotton, J. L. (1985). Cognitive dissonance in selective exposure. In D. Zillmann & J. Bryant (Eds.), Selective exposure to communication (pp. 11-33). Hillsdale, NJ: Erlbaum.
Cotton, J. L., & Hieser, R. A. (1980). Selective exposure to information and cognitive dissonance. Journal of Research in Personality, 14, 518-527.
D'Argembeau, A., Collette, F., Van der Linden, M., Laureys, S., Del Fiore, G., Degueldre, C., et al. (2005). Self-referential reflective activity and its relationship with rest: A PET study. Neuroimage, 25, 616-624.
Darley, J. M., & Gross, P. H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and Social Psychology, 44, 20-33.
Decety, J., & Chaminade, T. (2003). Neural correlates of feeling sympathy. Neuropsychologia, 41, 127-138.
Devine, P. G., Hirt, E. R., & Gehrke, E. M. (1990). Diagnostic and confirmation strategies in trait hypothesis testing. Journal of Personality and Social Psychology, 58, 952-963.
Eagly, A. H., Chen, S., Chaiken, S., & Shaw-Barnes, K. (1999). The impact of attitudes on memory: An affair to remember. Psychological Bulletin, 125, 64-89.
Eagly, A. H., & Chaiken, S. (2005). Attitude research in the 21st century: The current state of knowledge. In D. Albarracín, B. T. Johnson, & M. P. Zanna (Eds.), The handbook of attitudes (pp. 743-767). Mahwah, NJ: Erlbaum.
Ehrlich, D., Guttman, I., Schonbach, P., & Mills, J. (1957). Postdecision exposure to relevant information. Journal of Abnormal and Social Psychology, 54, 98-102.
Feather, N. T. (1963). Cognitive dissonance, sensitivity, and evaluation. Journal of Abnormal and Social Psychology, 66, 157-163.
Feather, N. T. (1969). Preference for information in relation to consistency, novelty, intolerance of ambiguity, and dogmatism. Australian Journal of Psychology, 21, 235-249.
Festinger, L. (1957). A theory of cognitive dissonance. Evanston, IL: Row, Peterson.
Festinger, L. (1958). The motivating effect of cognitive dissonance. In G. Lindzey (Ed.), Assessment of human motives. New York: Holt, Rinehart and Winston.
Festinger, L. (1964). Conflict, decision and dissonance. Stanford, CA: Stanford University Press.
Fischer, P., Fischer, J., Weisweiler, S., & Frey, D. (2010). Selective exposure to information: How different modes of decision making affect subsequent confirmatory information processing. British Journal of Social Psychology, 49, 871-881. doi: 10.1348/014466610X499668
Fischer, P., Jonas, E., Frey, D., & Schulz-Hardt, S. (2005). Selective exposure to information: The impact of information limits. European Journal of Social Psychology, 35, 469-492.
Fischer, P., Greitemeyer, T., & Frey, D. (2008). Self-regulation and selective exposure: The impact of depleted self-regulation resources on confirmatory information processing. Journal of Personality and Social Psychology, 94, 382-395.
Fischer, P., Schulz-Hardt, S., & Frey, D. (2008). Selective exposure and information quantity: How different information quantities moderate decision makers' preferences for consistent and inconsistent information. Journal of Personality and Social Psychology, 94, 231-244.
Fossati, P., Hevenor, S. J., Graham, S. J., Grady, C., Keightley, M. L., Craik, F., et al. (2003). In search of the emotional self: An fMRI study using positive and negative emotional words. American Journal of Psychiatry, 160, 1938-1945.
Freedman, J. L. (1965). Confidence, utility, and selective exposure: A partial replication. Journal of Personality and Social Psychology, 2, 778-780.
Frey, D. (1981). The effect of negative feedback about oneself and cost of information on preferences for information about the source of this feedback. Journal of Experimental Social Psychology, 17, 42-50.
Frey, D. (1982). Different levels of cognitive dissonance, information seeking, and information avoidance. Journal of Personality and Social Psychology, 43, 1175-1183.
Frey, D. (1986). Recent research on selective exposure to information. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 19, pp. 41-80). New York: Academic Press.
Frey, D., & Rosch, M. (1984). Information seeking after decisions: The roles of novelty of information and decision reversibility. Personality and Social Psychology Bulletin, 10, 91-98.
Frey, D., & Stahlberg, D. (1986). Selection of information after receiving more or less reliable self-threatening information. Personality and Social Psychology Bulletin, 12, 434-441.
Frey, D., Stahlberg, D., & Fries, A. (1986). Information seeking of high- and low-anxiety subjects after receiving positive and negative self-relevant feedback. Journal of Personality, 54, 694-703.
Frey, D., & Wicklund, R. (1978). A clarification of selective exposure: The impact of choice. Journal of Experimental Social Psychology, 14, 132-139.
Hart, W., Albarracin, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: A meta-analysis of selective exposure to information. Psychological Bulletin, 135, 555-588.
Hernandez, I., & Preston, J. L. (2013). Disfluency disrupts the confirmation bias. Journal of Experimental Social Psychology, 49(1), 178-182. doi:10.1016/j.jesp.2012.08.010
Hillis, J. W., & Crano, W. D. (1973). Additive effects of utility and attitudinal supportiveness in the selection of information. Journal of Social Psychology, 89, 257-269.
Holton, B., & Pyszczynski, T. (1989). Biased information search in the interpersonal domain. Personality and Social Psychology Bulletin, 15, 42-51.
Jonas, E., & Frey, D. (2003). Information search and presentation in advisor-client interactions. Organizational Behavior and Human Decision Processes, 91, 154-168.
Jonas, E., Graupmann, V., & Frey, D. (2006). The influence of mood on the search for supporting versus conflicting information: Dissonance reduction as a means of mood regulation? Personality and Social Psychology Bulletin, 32, 3-15.
Jonas, E., Greenberg, J. & Frey, D. (2003). Connecting terror management and dissonance theory: Evidence that mortality salience increases the preference for supporting information after decisions. Personality and Social Psychology Bulletin, 29, 1181-1189.
Jonas, E., Schulz-Hardt, S., & Frey, D. (2005). Giving advice or making decisions in someone else's place: The influence of impression, defense and accuracy motivation on the search for new information. Personality and Social Psychology Bulletin, 31, 977-990.
Jonas, E., Schulz-Hardt, S., Frey, D., & Thelen, N. (2001). Confirmation bias in sequential information search after preliminary decisions: An expansion of dissonance theoretical research on selective exposure to information. Journal of Personality and Social Psychology, 80, 557-571.
Klayman, J., & Ha, Y. W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94, 211-228.
Kleck, R. E., & Wheaton, J. (1967). Dogmatism and responses to opinion-consistent and opinion-inconsistent information. Journal of Personality and Social Psychology, 5, 249-252.
Kunda, Z. (1999). Social cognition: Making sense of people. Cambridge, MA: MIT Press.
Lavine, H., Lodge, M., & Freitas, K. (2005). Threat, authoritarianism and selective exposure to information. Political Psychology, 26, 219-244.
McFarland, S. G., & Warren, J. C. (1992). Religious orientations and selective exposure among fundamentalist Christians. Journal for the Scientific Study of Religion, 31, 163-174.
Miller, R. L. (1977). The effects of postdecisional regret on selective exposure. European Journal of Social Psychology, 7, 121-127.
Nemeth, C., & Rogers, J. (1996). Dissent and the search for information. British Journal of Social Psychology, 35, 67-76.
Olson, J. M., & Zanna, M. P. (1979). A new look at selective exposure. Journal of Experimental Social Psychology, 15, 1-15.
Poletiek, F. (2001). Hypothesis-testing behaviour. Hove, UK: Psychology Press.
Pyszczynski, T. A., Greenberg, J., & LaPrelle, J. (1985). Social comparison after success and failure: Biased search for information consistent with a self-serving conclusion. Journal of Experimental Social Psychology, 21, 195-211.
Rassin, E. (2008). Individual differences in the susceptibility to confirmation bias. Netherlands Journal of Psychology, 64, 87-93.
Rhine, R. J. (1967). The 1964 presidential election and curves of information seeking and avoiding. Journal of Personality and Social Psychology, 5, 416-423.
Rosen, S. (1961). Postdecision affinity for incompatible information. Journal of Abnormal and Social Psychology, 63, 188-190.
Rosenbaum, L. L., & McGinnies, E. (1973). Selective exposure: An addendum. The Journal of Psychology: Interdisciplinary and Applied, 83, 329-331.
Ross, L., & Anderson, C. A. (1982). Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 129-152). Cambridge, UK: Cambridge University Press.
Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32, 880-889.
Schulman, G. I. (1971). Who will listen to the other side: Primary and secondary group support and selective exposure. Social Problems, 18, 404-415.
Schulz-Hardt, S., Frey, D., Luthgens, C., & Moscovici, S. (2000). Biased information search in group decision making. Journal of Personality and Social Psychology, 78, 655-669.
Schwarz, N., Frey, D., & Kumpf, M. (1980). Interactive effects of writing and reading a persuasive essay on attitude change and selective exposure. Journal of Experimental Social Psychology, 16, 1-17.
Sears, D. O. (1965). Biased indoctrination and selectivity of exposure to new information. Sociometry, 28, 363-376.
Sears, D. O. (1966). Opinion formation and information preferences in an adversary situation. Journal of Experimental Social Psychology, 2, 130-142.
Sears, D. O., & Freedman, J. L. (1965). Effects of expected familiarity with arguments upon opinion change and selective exposure. Journal of Personality and Social Psychology, 3, 420-426.
Smith, S. M., Fabrigar, L. R., Powell, D. M., & Estrada, M. (2007). The role of information processing capacity and goals in attitude-congruent selective exposure effects. Personality and Social Psychology Bulletin, 33, 948-960.
Steele, C. M. (1988). The psychology of self-affirmation: Sustaining the integrity of the self. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 21, pp. 261-302). New York: Academic Press.
Thayer, S. (1969). Confidence and postjudgement exposure to consonant and dissonant information in a free choice situation. The Journal of Social Psychology, 77, 113-120.
Trope, Y., & Bassok, M. (1982). Confirmatory and diagnosing strategies in social information gathering. Journal of Personality and Social Psychology, 43, 22-34.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129-140.
Wason, P. C. (1968). Reasoning about a rule. Quarterly Journal of Experimental Psychology, 20, 273-281.
Westen, D., Blagov, P. S., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential Election. Journal of Cognitive Neuroscience, 18, 1947-1958.
Wu, D., Shimojo, S., Wang, S. W., & Camerer, C. F. (2012). Shared visual attention reduces hindsight bias. Psychological Science, 23, 1524-1533. doi: 10.1177/0956797612447817
Last Update: 6/30/2016