It’s always been clear that we like it when people are firm, not wishy-washy. John Kerry lost the 2004 presidential election in large part because he was derided for having changed his vote. Where common sense might tell us that changing your mind after serious consideration of new information is a good thing, and a sign of intelligence and reason, we have a bias working against that, which made the label “Flip-Flop!” stick like a “Kick me!” sign on his back.
This isn’t news, of course, and isn’t, in itself, interesting. What is interesting is that we have studies (you knew we would) that show that people consistently prefer confidence even when it’s consistently wrong:
The research, by Don Moore of Carnegie Mellon University in Pittsburgh, Pennsylvania, shows that we prefer advice from a confident source, even to the point that we are willing to forgive a poor track record. Moore argues that in competitive situations, this can drive those offering advice to increasingly exaggerate how sure they are. And it spells bad news for scientists who try to be honest about gaps in their knowledge.
Indeed, the studies explain our real-world experience with leaders in politics and business who make strong, confident statements about things they know nothing about; who make wild, confident predictions about things they can’t possibly predict with any hope of accuracy; who make bold, confident decisions, and then stand by them even when they turn into disasters.
The problem, though, is that honesty comes to be seen as weakness, and serious consideration of the facts becomes a liability. And that’s dangerous. It’s also how the sorts of people who are firmly confident in fantasy garner support. Religious fanatics, conspiracy theorists, denialists of AIDS and climate change and the Holocaust, are all vehemently confident. Scientists, at least when we’re being honest about it, must always admit to some amount of uncertainty, however small. We don’t know everything there is to know about evolution, about viruses, about cancer, or about the global climate... but when we admit to a gap in what we know, we fall victim to this effect. We are uncertain, so we lose ground.
An example of this is something I quoted in these pages last fall, about the Large Hadron Collider. The fear-mongers have latched onto the crazy notion that the LHC will create a black hole that will destroy the world. Scientists are as sure as we can be that this is ridiculous, and, “No, it can’t happen,” would not be an unreasonable way to answer the question. And, yet, physicist Janna Levin answers by saying, “Well, it’s interesting, ’cause you can never say ‘never,’ actually, and the best things you can say are that it’s incredibly, ridiculously, extremely unlikely that anything like that can happen. [...] But can I say it’s a physical impossibility? I can not.”
I like the way New Scientist ends the article, with Dr Moore demonstrating the very effect we’re talking about:
So if honest advice risks being ignored, what is a responsible scientific adviser to do? “It’s an excellent question, and I’m not sure that I have a great answer,” says Moore.
 Well, that and because he ran a crappy campaign. But, hey.