Bias Training
Collectively jumping to conclusions: Social information amplifies the tendency to gather insufficient data
Justin Sulik, Charles Efferson & Ryan McKay
Journal of Experimental Psychology: General, forthcoming
Abstract:
False beliefs can spread within societies even when they are costly and when individuals share access to the same objective reality. Research on the cultural evolution of misbeliefs has demonstrated that a social context can explain what people think, but not whether it also explains how people think. We shift the focus from the diffusion of false beliefs to the diffusion of suboptimal belief-formation strategies and identify a novel mechanism whereby misbeliefs arise and spread. We show that, when individual decision makers have access to the data-gathering behavior of others, the tendency to make decisions on the basis of insufficient evidence is amplified, increasing the rate of incorrect, costly decisions. We argue that this mechanism fills a gap in current explanations of problematic, widespread misbeliefs such as climate change denial.
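The amplification mechanism lends itself to a toy simulation: agents who gather little evidence decide first and are therefore disproportionately visible, so imitating the sample sizes one observes ratchets data gathering downward. The Python sketch below is our own illustration of that intuition, not the authors' model; the sample-size range, imitation weight, and coin bias are all invented for demonstration.

```python
# Toy simulation (not the authors' model) of socially amplified under-sampling.
# All parameters are invented for illustration.
import random

def decide_correctly(p_true, n_samples):
    """Guess whether a coin favors heads after n_samples flips; return correctness."""
    heads = sum(random.random() < p_true for _ in range(n_samples))
    return (heads > n_samples / 2) == (p_true > 0.5)

def run_population(social, n_agents=2000, imitation=0.5, p_true=0.6):
    n_correct, sample_sizes = 0, []
    for _ in range(n_agents):
        own_n = random.randint(2, 20)  # privately preferred sample size
        if social:
            # Agents who stopped early are the ones already visible; imitating
            # them can only pull one's own sample size down, never up.
            earlier = [n for n in sample_sizes if n < own_n]
            if earlier:
                observed = sum(earlier) / len(earlier)
                own_n = max(1, round((1 - imitation) * own_n + imitation * observed))
        sample_sizes.append(own_n)
        n_correct += decide_correctly(p_true, own_n)
    return n_correct / n_agents, sum(sample_sizes) / len(sample_sizes)

for social in (False, True):
    accuracy, mean_n = run_population(social)
    print(f"social={social}: accuracy={accuracy:.2f}, mean sample size={mean_n:.1f}")
```

Under these invented parameters, the social condition ends with smaller average samples and a higher rate of incorrect (and, in the paper's framing, costly) decisions.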
Correcting the Unknown: Negated Corrections May Increase Belief in Misinformation
Kevin Autry & Shea Duarte
Applied Cognitive Psychology, forthcoming
Abstract:
Corrections are not always effective at reducing belief in misinformation. Negated corrections, which state that a piece of information is not true, may be effective only at inhibiting information an observer has already encountered. We compared the effectiveness of negated corrections and replacements while manipulating initial exposure to a target concept. Subjects read one (Experiment 1) or six (Experiment 2) passages presenting a target concept (e.g., blue car) or not, followed by a negated correction (e.g., not blue), a replacement (e.g., red), or no correction, then answered open‐ended questions, which were scored for mentions of the target concept. When subjects were exposed to the target concept, negated corrections reduced mentions of the misinformation relative to no correction; however, when subjects were not exposed to the concept, negated corrections increased mentions relative to no correction. These results demonstrate that negated corrections can increase belief in misinformation when observers have not been exposed to the misinformation.
The Fragility of Experts: A Moderated-Mediation Model of Expertise, Expert Identity Threat, and Overprecision
Sanghoon Hoonie Kang & Jerry Kim
Academy of Management Journal, forthcoming
Abstract:
Experts play a crucial role in modern organizations, but evidence regarding the soundness and reliability of their decision-making is mixed and often contradictory. We develop and test a moderated-mediation model of expert decision-making linking expertise, identity threat, and overprecision to understand when and why experts offer overly precise judgments, and how they can cope with disconfirming feedback. We find support for this model in a series of lab experiments, which show that (a) experts are more likely than novices to double down and produce overly precise predictions following disconfirming feedback; (b) this feedback-induced overprecision by experts is mediated by perceived level of expert identity threat; (c) the source of the feedback matters for identity threat and overprecision; and (d) self-affirmation attenuates identity threat and reduces overprecision. We supplement these experimental findings by investigating experts’ responses to disconfirming feedback in two real-world settings: Major League Baseball umpiring and Chief Financial Officer predictions of stock market returns. Our model and results show that feedback can harm expert decision-making by leading experts to be overly precise in their judgment, challenging existing notions about the ability of expert decision-makers, and providing insight into when and why experts should be relied upon in organizational decisions.
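For readers unfamiliar with the statistical structure: a moderated-mediation model decomposes the effect of a predictor X (here, disconfirming feedback) on an outcome Y (overprecision) into an indirect path through a mediator M (identity threat), with the strength of the X→M path depending on a moderator W (e.g., expertise). A minimal simulation sketch with invented coefficients, not the paper's data:

```python
# Minimal moderated-mediation sketch. The data-generating coefficients below
# are invented for illustration; they are not the paper's estimates.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
expertise = rng.integers(0, 2, n)   # W: 0 = novice, 1 = expert
feedback = rng.integers(0, 2, n)    # X: 0 = none, 1 = disconfirming
# a-path: feedback raises identity threat, far more so for experts
threat = 0.2 * feedback + 0.8 * feedback * expertise + rng.normal(0, 1, n)
# b-path: identity threat raises overprecision
overprecision = 0.5 * threat + rng.normal(0, 1, n)

def ols(y, X):
    """Return OLS coefficients [intercept, *slopes]."""
    X = np.column_stack([np.ones(len(y))] + list(X))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Estimate the a- and b-paths separately within each expertise group
for w in (0, 1):
    m = expertise == w
    a = ols(threat[m], [feedback[m]])[1]
    b = ols(overprecision[m], [feedback[m], threat[m]])[2]
    print(f"expertise={w}: indirect effect (a*b) ≈ {a * b:.2f}")
```

The indirect effect a × b comes out several times larger for the simulated experts, mirroring the qualitative pattern described above.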
Humans rely more on algorithms than social influence as a task becomes more difficult
Eric Bogert, Aaron Schecter & Richard Watson
Scientific Reports, April 2021
Abstract:
Algorithms have begun to encroach on tasks traditionally reserved for human judgment and are increasingly capable of performing well in novel, difficult tasks. At the same time, social influence, through social media, online reviews, or personal networks, is one of the most potent forces affecting individual decision-making. In three preregistered online experiments, we found that people rely more on algorithmic advice relative to social influence as tasks become more difficult. All three experiments focused on an intellective task with a correct answer and found that subjects relied more on algorithmic advice as difficulty increased. This effect persisted even after controlling for the quality of the advice, the numeracy and accuracy of the subjects, and whether subjects were exposed to only one source of advice or to both sources. Subjects also tended to more strongly disregard inaccurate advice labeled as algorithmic compared to equally inaccurate advice labeled as coming from a crowd of peers.
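The abstract does not say how reliance was quantified, but advice-taking studies of this kind typically use the standard weight-of-advice (WOA) measure: the fraction of the distance from the initial estimate to the advice that the final estimate covers. A sketch of that measure (variable names are our own):

```python
def weight_of_advice(initial, advice, final):
    """Weight of advice (WOA) from the judge-advisor literature:
    0 = advice ignored, 1 = advice adopted wholesale."""
    if advice == initial:
        raise ValueError("WOA is undefined when advice equals the initial estimate")
    return (final - initial) / (advice - initial)

# Example: initial guess 100, algorithmic advice 140, revised guess 130
print(weight_of_advice(100, 140, 130))  # 0.75 -> strong reliance on the advice
```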
Shifting attention to accuracy can reduce misinformation online
Gordon Pennycook et al.
Nature, forthcoming
Abstract:
In recent years, there has been a great deal of concern about the proliferation of false and misleading news on social media. Academics and practitioners alike have asked why people share such misinformation, and sought solutions to reduce the sharing of misinformation. Here, we attempt to address both of these questions. First, we find that the veracity of headlines has little effect on sharing intentions, despite having a large effect on judgments of accuracy. This dissociation suggests that sharing does not necessarily indicate belief. Nonetheless, most participants say it is important to share only accurate news. To shed light on this apparent contradiction, we carried out four survey experiments and a field experiment on Twitter; the results show that subtly shifting attention to accuracy increases the quality of news that people subsequently share. Together with additional computational analyses, these findings indicate that people often share misinformation because their attention is focused on factors other than accuracy — and therefore they fail to implement a strongly held preference for accurate sharing. Our results challenge the popular claim that people value partisanship over accuracy, and provide evidence for scalable attention-based interventions that social media platforms could easily implement to counter misinformation online.
The left digit effect in a complex judgment task: Evaluating hypothetical college applicants
Andrea Patalano et al.
Journal of Behavioral Decision Making, forthcoming
Abstract:
A left digit effect has been broadly observed across judgment and decision‐making contexts ranging from product evaluation to medical treatment decisions to number line estimation. For example, $3.00 is judged to be a much greater cost than $2.99, and “801” is estimated strikingly too far to the right of “798” on a number line. Although the consequences of the effects for judgment and decision behavior have been documented, the sources of the effects are not well established. The goal of the current work is to extend investigations of the left digit effect to a new complex judgment activity and to assess whether the magnitude of the effect at the individual level can be predicted from performance on a simpler number skills task on which the left digit effect has also recently been observed. In three experiments (N = 434), adults completed a judgment task in which they rated the strength of hypothetical applicants for college admission and a self‐paced number line estimation task. In all experiments, a small or medium left digit effect was found in the college admissions task, and a large effect was found in number line estimation. Individual‐level variation was observed, but there was no relationship between the magnitudes of the effects in the two tasks. These findings provide evidence of a left digit effect in a novel multiattribute judgment task but offer no evidence that such performance can be predicted from a simple number skills task such as number line estimation.
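One simple way to formalize a left digit effect, borrowed from the inattention model of Lacetera, Pope, and Sydnor (2012) rather than from this paper, is to assume the leftmost digit is processed fully while the remaining digits are discounted by a factor θ. Applied to the number line example above (θ = 0.3 is purely illustrative):

```python
def perceived(x, theta=0.3):
    """Left-digit inattention model (Lacetera et al., 2012): the leftmost digit
    is processed fully, later digits are discounted by theta. The value of
    theta is illustrative, not estimated from this paper's data."""
    base = 10 ** (len(str(int(x))) - 1)   # place value of the leftmost digit
    left_part = (int(x) // base) * base
    return left_part + (1 - theta) * (x - left_part)

# 798 and 801 differ by 3, but the perceived gap is inflated across the
# hundreds boundary, pushing "801" too far to the right on a number line:
print(f"{perceived(801) - perceived(798):.1f}")  # 32.1 instead of 3
```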
Individual differences in the perception of probability
Mel Khaw, Luminita Stevens & Michael Woodford
PLoS ONE, April 2021
Abstract:
In recent studies of humans estimating non-stationary probabilities, estimates appear to be unbiased on average, across the full range of probability values to be estimated. This finding is surprising given that experiments measuring probability estimation in other contexts have often identified conservatism: individuals tend to overestimate low probability events and underestimate high probability events. In other contexts, repulsive biases have also been documented, with individuals producing judgments that tend toward extreme values instead. Using extensive data from a probability estimation task that produces unbiased performance on average, we find substantial biases at the individual level; we document the coexistence of both conservative and repulsive biases in the same experimental context. Individual biases persist despite extensive experience with the task, and are also correlated with other behavioral differences, such as individual variation in response speed and adjustment rates. We conclude that the rich computational demands of our task give rise to a variety of behavioral patterns, and that the apparent unbiasedness of the pooled data is an artifact of the aggregation of heterogeneous biases.
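The closing point, that pooled unbiasedness can mask offsetting individual biases, is easy to illustrate with the linear-in-log-odds family of probability distortions often used in this literature: an exponent γ < 1 compresses estimates toward 0.5 (conservatism), while γ > 1 pushes them toward the extremes (repulsion). The mixture below uses invented γ values, not the paper's estimates:

```python
# Linear-in-log-odds distortion: gamma < 1 compresses estimates toward 0.5
# (conservatism), gamma > 1 pushes them toward the extremes (repulsion).
import math

def llo(p, gamma):
    """Distorted probability estimate under a linear-in-log-odds transform."""
    logit = math.log(p / (1 - p))
    return 1 / (1 + math.exp(-gamma * logit))

gammas = [0.5, 0.5, 2.0, 2.0]   # two conservative and two repulsive subjects
for p in (0.1, 0.3, 0.7, 0.9):
    estimates = [llo(p, g) for g in gammas]
    pooled = sum(estimates) / len(estimates)
    print(f"p={p}: individuals=" + ", ".join(f"{e:.2f}" for e in estimates)
          + f"; pooled={pooled:.2f}")
```

Each simulated individual is clearly biased, yet the pooled average lands far closer to the true probability than either subgroup does on its own.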