Saturday, September 10, 2011

Maybe we're all conspiracy theorists

Here's an interesting post by Matt Ridley, author of The Rational Optimist (which, unfortunately, I haven't read).

He seems like a rational guy, but he's also a conservative writing for The Wall Street Journal (a newspaper which has become even loonier since being purchased by Rupert Murdoch's News Corp., the parent company of Fox "News"). So see if you can spot where his bias creeps in.

(And yes, I'm biased, too. We all are. But if you think I'm wrong about this, tell me why.)
Michael Shermer, the founder and editor of Skeptic magazine, has never received so many angry letters as when he wrote a column for Scientific American debunking 9/11 conspiracy theories. Mr. Shermer found himself vilified, often in CAPITAL LETTERS, as a patsy of the sinister Zionist cabal that deliberately destroyed the twin towers and blew a hole in the Pentagon while secretly killing off the passengers of the flights that disappeared, just to make the thing look more plausible.

He tells this story in his fascinating new book, "The Believing Brain." In Mr. Shermer's view, the brain is a belief engine, predisposed to see patterns where none exist and to attribute them to knowing agents rather than to chance—the better to make sense of the world. Then, having formed a belief, each of us tends to seek out evidence that confirms it, thus reinforcing the belief.

This is why, on the foundation of some tiny flaw in the evidence—the supposed lack of roof holes to admit poison-gas cans in one of the Auschwitz-Birkenau gas chambers for Holocaust deniers, the expectant faces on the grassy knoll for JFK plotters, the melting point of steel for 9/11 truthers—we go on to build a great edifice of mistaken conviction.

I say "we" because, after reading Mr. Shermer's book and others like it, my uneasy conclusion is that we all do this, even when we think we do not. It's not a peculiarity of the uneducated or the fanatical. We do it in our political allegiances, in our religious faith, even in our championing of scientific theories. And if we all do it, then how do we know that our own rational rejections of conspiracy theories are not themselves infected with beliefs so strong that they are, in effect, conspiracy theories, too?

So far, so good, right? Michael Shermer is a brilliant man. I might also recommend How We Know What Isn't So (1991) by Thomas Gilovich. It's not just superbly entertaining; it will really make you think.

Gilovich talks about "cognitive illusions," which we're all prone to. Just as optical illusions aren't evidence that our eyesight is poor, cognitive illusions aren't evidence that we're stupid. In both cases, they're just the result of our biological structure, the means by which we apprehend the world. We're easily fooled by certain things.

All of us develop a worldview by which we try to make sense of our environment. After that, we find it very easy to believe things that fit that established worldview, and we find ways to ignore or dismiss things that don't. It's a natural tendency, and none of us is immune.

Yes, me, too. I try to recognize it and to compensate for it, but it's not that easy. It's just very, very easy to believe what we want to believe.

Science, too, tries to compensate for this natural tendency - a lot more successfully than most individuals. That's a huge part of what the scientific method is all about. And although scientists themselves are only human, the scientific method is easily the best way we've ever discovered of separating what's true from what feels good. Nothing else even comes close.

The result is that, when it comes to a scientific matter, the consensus of scientists who specialize in that particular field is far and away more likely to be correct than not. And if it is wrong in some respect, the scientific community itself will be the first to recognize that, and the consensus will change.

Note that what any individual scientist thinks isn't important. If you choose a particular scientist to believe, especially when his is a minority opinion, you're just choosing what you want to believe. Likewise, a scientist outside his field of expertise is really no better than a layman.

Anyway, Ridley eventually gets to this point:
But those are the easy cases. What about the harder ones?

Take climate change. Here is Mr. Shermer's final diagnostic of a wrong conspiracy theory: "The conspiracy theorist defends the conspiracy theory tenaciously to the point of refusing to consider alternative explanations for the events in question, rejecting all disconfirming evidence for his theory and blatantly seeking only confirmatory evidence to support what he has already determined to be the truth."

This describes many of those who strive to blame most climate change on man-made carbon dioxide emissions. Of course, they reply that it also describes those who strive to blame most climate change on the sun.

That's how belief systems work: On both sides, there is huge belief, buttressed by confirmation bias, and equally huge belief that the belief and the conspiracy are all on the other side. Rick Perry, Al Gore—each thinks the other is a mad conspiracy theorist who will not let the facts get in the way of prejudices. Maybe both are right.

See the problem? See his bias kicking in? (Or maybe it's just the bias of The Wall Street Journal, since this was written for that publication.)

Ridley presents Rick Perry and Al Gore as two sides of the same coin. Both have their belief systems, and both are biased. Each is presented as a possible example of a "mad conspiracy theorist who will not let the facts get in the way of prejudices."

But think about that. Al Gore accepts the overwhelming scientific consensus on climate change. Pretty much 100% of climatologists agree that global warming is occurring, and around 98% think that it's mostly caused by human activity.

A very few climate scientists disagree and think that human activity isn't the main cause. I might argue that they're regularly shown to be wrong, but maybe that's just my bias. It doesn't really matter anyway, because I don't need to make that argument. The only rational move here is to accept the scientific consensus. That's Al Gore's position.

Rick Perry, on the other hand, rejects global warming entirely. He claims that it's just a massive, worldwide fraud perpetrated by dishonest scientists greedy for research dollars. Apparently, oil, coal, and gas companies, flush with cash and paying their executives multimillion-dollar bonuses, can't outbid the munificent salaries that universities lavish on their science professors. Odd, isn't it?

And, of course, none of them would refuse to be bribed. I mean, of course you'd take money to betray everyone and everything you believe in. That's just self-evident, right? (You know, I think Rick Perry has just given us a glimpse into his own ethical system, and it's not pretty at all.)

Ridley is an "economic conservative," and he presents these two sides as equivalent. Al Gore, with his acceptance of the overwhelming consensus of climate scientists, is equivalent to Rick Perry, with his rejection of the consensus and his belief in a worldwide conspiracy of evil scientists. (Keep in mind that even the few climatologists who disagree with the consensus generally don't see a conspiracy. They just interpret things differently than the majority.)

Do you think those two stances are equivalent? Really? Do you think that both sides show evidence of a "mad conspiracy theorist who will not let the facts get in the way of prejudices"? Or do you think that Ridley, as a conservative, just couldn't bring himself to criticize Rick Perry while praising Al Gore?

I know what I think, but maybe I'm just biased. :)

PS. Thanks again to Jim Harris for the link.
