Having written about the Northeast Conference on Science and Skepticism the other day, I’ve found myself thinking more about the “Why is it So Difficult to Be a Skeptic?” segment, and the part about explaining skepticism to others. And I thought I’d write some of those thoughts down.
At the core, it comes down to the explanation I cited in that earlier post: a skeptic is someone who considers the evidence before making a decision or believing something. That implies no particular ideology, no specific political party, and no predetermined point of view. A skeptic can be left-leaning, right-leaning, or straight down the middle. A skeptic can be a Democrat, a Republican, or a Libertarian. A skeptic can be an atheist, or a skeptic can believe in God — but the skeptic makes that choice after considering the evidence.
It’s certainly true that skeptics strongly lean toward atheism, and tend to be more left than right politically. I would say that tendency exists because that’s where the evidence leads, rather than for any other reason. Of course, everyone starts with some set of world views, and a skeptic is no different. The skeptic adjusts his world view as he analyzes new information.
When someone comes forth and says, “I have a new cure for diseases,” the skeptic does not say, “Bullshit!”, though that may be the image many people have of us. No, the skeptic says, “Do you? Show me,” and then the skeptic looks at what’s there. “A friend of mine says it worked for him,” might get a response of, “Mm, hm. What else?” Data from a controlled trial will carry more weight, and may elicit a nod and an “Ah, good!”
We will, naturally, compare what you’re offering with things we already know, and that’s where it might look like we’ve decided in advance. We haven’t, though: we’re just noting that your idea is very much like something the evidence has already shown to be wrong, so it will be that much more difficult to convince us — you have to get past the evidence that’s already there. A new homeopathic “cure” that’s substantially the same as all the others isn’t really new.
We’ll also bring in what we know in general, and use it as part of our skeptical analysis. If we can see how your idea might work, we could start with a more positive view of it than we would if the idea didn’t seem to make sense with respect to what we know about medicine or mechanics or physics, or whatever. If you approach me with a perpetual motion machine, you’ll have a steep hill to climb to convince me that it works, because I know something about, say, the combined effects of conservation of energy and friction.
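The perpetual-motion objection can be made concrete with a back-of-the-envelope energy balance (the notation here is mine, for illustration, not from any specific machine). Over one complete cycle that returns the machine to its starting state:

```latex
% Energy balance over one full cycle (machine returns to its initial state):
%   W_out = useful work delivered
%   W_in  = work supplied to the machine
%   Q_f   = energy lost to friction as heat
W_{\text{out}} = W_{\text{in}} - Q_f,
\qquad Q_f > 0
\quad\Rightarrow\quad
W_{\text{out}} < W_{\text{in}}
```

Since real bearings and air resistance guarantee some friction loss every cycle, each cycle delivers less work than it consumes, and a machine that runs forever with no input would need the impossible case of zero loss. That’s the prior evidence a perpetual-motion claim has to overcome.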
When someone says that the positions of the moon, stars, and planets at the time of one’s birth determine significant things about one’s life and personality, the skeptic does not say, “Bullshit!” — not the first time. The skeptic looks at the evidence. And the evidence shows that astrology does not work.
A skeptic will look for alternative explanations that fit the evidence. If we know that someone moved from one place to another without leaving tracks in the sand, one explanation may be that she flew. But that doesn’t mesh with what we know of how things work. Is there an alternative explanation? Perhaps wind took away the tracks. And if we have no explanation that’s both consistent with what we already know and explains what we’re seeing, we’re willing to accept that we don’t know the answer. If it’s important enough, we’ll keep looking until we find an answer that works.
Skepticism doesn’t only apply to things that are “fun” to deride. We’re not just skeptical of alternative medicine, paranormal activity, and pseudoscience. When someone says that human activity is causing damaging global climate change, we have three things to be skeptical of:
- The global climate is changing.
- Humans are causing it (or making it worse).
- It’s damaging.
But we know better than to reject anything new out of hand, without examining the evidence. After all, at one point Louis Pasteur said, “I have a new cure for diseases,” didn’t he? And then he showed that they worked.
There are a number of studies showing that astrology has no predictive value, and that any apparent effects come from confirmation bias. For example, there’s a double-blind test of astrology published in “Nature” in 1985. Unfortunately, it’s behind a paywall, but it’s worth a read if you can find someone with a subscription, or if you’re willing to pay for the article (I have a printed copy).