Tuesday, June 30, 2009



It’s always been clear that we like it when people are firm, not wishy-washy. John Kerry lost the 2004 presidential election in large part because he was derided for having changed his vote.[1] Where common sense might tell us that changing your mind after serious consideration of new information is a good thing, and a sign of intelligence and reason, we have something working against that instinct, which made the label “Flip-Flop!” stick like a “Kick me!” sign on his back.

This isn’t news, of course, and isn’t, in itself, interesting. What is interesting is that we have studies (you knew we would) that show that people consistently prefer confidence even when it’s consistently wrong:

The research, by Don Moore of Carnegie Mellon University in Pittsburgh, Pennsylvania, shows that we prefer advice from a confident source, even to the point that we are willing to forgive a poor track record. Moore argues that in competitive situations, this can drive those offering advice to increasingly exaggerate how sure they are. And it spells bad news for scientists who try to be honest about gaps in their knowledge.

Indeed, the studies explain our real-world experience with leaders in politics and business who make strong, confident statements about things they know nothing about; who make wild, confident predictions about things they can’t possibly predict with any hope of accuracy; who make bold, confident decisions, and then stand by them even when they turn into disasters.

Mission accomplished!

The problem, though, is that honesty is considered weakness, and serious consideration of facts becomes a bad thing. And that’s dangerous. It’s also how the sorts of people who are firmly confident in fantasy garner support. Religious fanatics, conspiracy theorists, denialists of AIDS and climate change and the Holocaust, are all vehemently confident. Scientists, at least when we’re being honest about it, must always admit to some amount of uncertainty, however small. We don’t know everything there is to know about evolution, about viruses, about cancer, or about the global climate... but when we admit to a gap in what we know, we fall victim to this effect. We are uncertain, so we lose ground.

An example of this is something I quoted in these pages last fall, talking about the Large Hadron Collider. The fear-mongers have latched onto a crazy notion that the LHC will create a black hole that will destroy the world. Scientists are as sure as we can be that this is ridiculous, and, “No, it can’t happen,” would not be an unreasonable way to answer the question. And, yet, physicist Janna Levin answers by saying, “Well, it’s interesting, ’cause you can never say ‘never,’ actually, and the best things you can say are that it’s incredibly, ridiculously, extremely unlikely that anything like that can happen. [...] But can I say it’s a physical impossibility? I can not.”

I like the way New Scientist ends the article, with Dr Moore demonstrating the very effect we’re talking about:

So if honest advice risks being ignored, what is a responsible scientific adviser to do? “It’s an excellent question, and I’m not sure that I have a great answer,” says Moore.

[1] Well, that and because he ran a crappy campaign. But, hey.


Thomas J. Brown said...

At my last job, one of the owners of the company was quite Catholic and one day said, "that's why I don't trust science – it keeps changing its mind. Science will say, 'this is how it is,' and then someone finds out that's wrong and science says, 'now this is how it is.' Why should I believe you now if you were wrong before?"

I just sat there aghast. My jaw had literally dropped. For a moment, I just didn't know what to say. Finally I replied:

"That's exactly why I trust science! Science says, 'this is how it is, this is how it is,' and then some new information comes to light and science says, 'oh, I was wrong before, but now I know better, and this is how it is.' Whereas religion says, 'this is how it is, this is how it is,' and then someone proves that wrong and religion just says, 'shut the fuck up.'"

He didn't really have a response for me.

Barry Leiba said...

Well, and his argument isn’t even right as far as it goes. Religion keeps "getting it wrong" and changing its mind too.

People used to believe in Astarte and Ba’al and Moloch. People used to believe in Isis and Osiris and Ra. Then we had Zeus, Poseidon, Aphrodite, and Apollo. The Romans worshipped Venus, Mars, Jupiter, and Neptune, and we’re not talking about planets. For the Norse, it was Odin, Thor, and Balder. The Jews settled on one God, but their batch of prophets was largely overtaken by a new, Christian set, headed by Jesus, variously considered God or God’s “son”. The Muslims switched up to Muhammad. We have the Buddha, of course, and there are millions who believe in Shiva or Vishnu or Brahma, along with a myriad of lesser gods, incarnations, and avatars.

We keep changing our minds, each time chuckling at the folly of the old ideas, and asserting that now we have the true answer. And at any one time — including now — there are countless different versions, some subtly different, some vastly so.

Your former colleague’s argument was ridiculous on its surface.

tzink said...

I agree with your take, and I can relate. A long time ago at work, I noticed that I often had to hedge my statements with "This *probably* will help with spam filtering..." or "... it should work this way, but there are a lot of variables..." Management doesn't like that.

I kept having to hear "Don't hedge," and "Stand behind your statements." Confidence was more important than accuracy.