When does good news make you less optimistic?

Later this month, Phil Fernbach is moving to Boulder, CO, to join the Leeds School of Business as a postdoc in the Center for Consumer Financial Decision Making. You may remember Phil from a few recent posts: he and I have been looking at how you can win The New Yorker’s Cartoon Caption Contest (post 1, post 2).

During his house-hunting trip, I asked him when good news makes people less optimistic. His answer:

Phil’s work is fascinating and highly counterintuitive. For me, it highlights the important role that attention plays in judgments and choices. Daniel Kahneman has a great quote that captures the role attention often plays in life:

Nothing in life is as important as you think it is, while you are thinking about it.

The difference between Danny’s quote and Phil’s finding is that the attributions accompanying a focus on weak evidence serve to undermine the importance of the very evidence being contemplated.

Finally, and more importantly, you should smell the milk before you drink it.

———————————————————————————————-

Fernbach, P. M., Darlow, A., & Sloman, S. A. (2011). When good evidence goes bad: The weak evidence effect in judgment and decision-making. Cognition, 119, 459–467.

An indispensable principle of rational thought is that positive evidence should increase belief. In this paper, we demonstrate that people routinely violate this principle when predicting an outcome from a weak cause. In Experiment 1, participants given weak positive evidence judged outcomes of public policy initiatives to be less likely than participants given no evidence, even though the evidence was separately judged to be supportive. Experiment 2 ruled out a pragmatic explanation of the result, that the weak evidence implies the absence of stronger evidence. In Experiment 3, weak positive evidence made people less likely to gamble on the outcome of the 2010 United States midterm Congressional election. Experiments 4 and 5 replicated these findings with everyday causal scenarios. We argue that this “weak evidence effect” arises because people focus disproportionately on the mentioned weak cause and fail to think about alternative causes.
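To see why the effect counts as a violation of rational belief updating, here is a minimal sketch (my own illustration, not from the paper) of the normative benchmark: under Bayes’ rule, any evidence that is even slightly more likely when the outcome will occur than when it won’t must raise the probability of the outcome. The function name and the specific numbers are hypothetical, chosen only to mimic "weak" evidence.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H | E) computed via Bayes' rule."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.30  # belief in the outcome before hearing any evidence
# Weakly supportive evidence: only slightly likelier if H is true (0.55 vs 0.50)
post = posterior(prior, 0.55, 0.50)

# Normatively, even this weak positive evidence must increase belief...
assert post > prior
# ...yet Fernbach et al.'s participants reported LOWER likelihood judgments
# after weak positive evidence than after no evidence at all.
```

The gap between this small normative increase and participants’ actual decrease is what the authors call the weak evidence effect.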