There's a natural human tendency to believe what we want to believe and dismiss any evidence that contradicts it. In fact, hearing evidence against our beliefs often reinforces them. Professional political liars know this, and take advantage of it. A bald-faced lie will not only be believed by many people, but attempts to correct the lie will just increase the effect. It's win-win for the liar.
The truth is sometimes a poor competitor in the marketplace of ideas - complicated, unsatisfying, full of dilemmas, always vulnerable to misinterpretation and abuse. - George F. Kennan
This is human nature,... and it's rather scary, don't you think? Of course, we see it every day in politics. Sarah Palin should have been laughed into insignificance for that "death panels" claim, along with every other Republican who gave it lip service. They all knew that it was a lie, and so did any rational observer. But it still worked politically. It wasn't even close to being true, but it fit the narrative they were pushing. In politics, that's success.
And studies of Fox "News" viewers, many of whom still believe that Saddam Hussein had something to do with the 9/11 attacks or that we really did find weapons of mass destruction in Iraq, have shown that these false beliefs are actually strengthened - greatly strengthened - when these people are shown evidence that they're false. Apparently, just reminding them of the circumstances when they formed their original beliefs serves to cement those false beliefs even further.
What ails the truth is that it is mainly uncomfortable, and often dull. The human mind seeks something more amusing, and more caressing. - H. L. Mencken
But it's not just politics. It's everywhere in human societies. Medical researchers find that their careful, evidence-based discoveries are disregarded by people who just "know" that they're wrong. And it's not only anti-vaccine activists who refuse to accept scientific findings - basically all such findings - that go against their preconceived notions, either. This is common, very common.
Climatologists struggle with global warming deniers. Biologists struggle with creationists. Pharmacologists struggle with homeopaths. But in all of these cases, the scientific consensus is clear. It's just that ordinary people refuse to accept the science. In fact, in one study, people were more likely to believe in ESP when they were told that scientists rejected it.
Truthiness is what you want the facts to be, as opposed to what the facts are. - Stephen Colbert
But don't get too smug. This is human nature, and it affects all of us. We are all likely to accept evidence that confirms our existing beliefs and dismiss that which doesn't. We pay more attention to data that confirms our biases, and we trust it more. When we've made up our minds, we resist changing our opinion, even when the evidence indicates that we should.
We pick sides - not just in politics - and we naturally root for our side, whatever that happens to be. It's not just on the conscious level, but unconsciously, too. It's very easy to believe what we want to be true and very easy to dismiss what we don't. That's true for every single one of us. So what can we do about it? Read further if you want my suggestions.
1) To begin with, it helps to be aware of the problem. I highly recommend How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life by Thomas Gilovich. It's superbly entertaining, as well as thought-provoking. Gilovich explains how this is a natural problem with human cognition, not a matter of intelligence. Just as optical illusions are not evidence that our eyesight is poor, so are cognitive illusions not evidence that we're stupid. They're simply a function of how our brains work.
Being aware of this probably won't entirely prevent it, but it can help. In particular, if we're aware of common cognitive illusions, we're more likely to recognize them when they happen to us. And we can try to be most skeptical about what we really want to be true. That does not mean disbelieving everything, but only making sure that the evidence is there, and that we haven't too quickly dismissed contrary data or arguments. Question yourself. Do you really have good reason for your beliefs?
The biggest difference between the world of science and everyday life in protecting against erroneous beliefs is that scientists utilize a set of formal procedures to guard against the sources of bias and error. - Thomas Gilovich
2) Understand the scientific method. You don't have to be a scientist yourself to understand why the scientific method is the best way we've yet discovered to distinguish truth from wishful thinking. And you should at least understand the point, that it's a method scientists have developed to overcome those natural human tendencies mentioned above. It's not perfect - nothing is - but it works.
Scientists, like everyone else, want to believe that they're right. As individuals, they'll often cling to a favored hypothesis and resist arguments against it. (They're a lot better at changing their minds than most of us, but scientists are still human.) But science is a group effort. Evidence must be replicable by multiple independent researchers. Even if you resist evidence that your theory is wrong, other scientists will eagerly show you - and everyone else - the error of your ways.
There are many hypotheses in science which are wrong. That's perfectly all right; they're the aperture to finding out what's right. Science is a self-correcting process. To be accepted, new ideas must survive the most rigorous standards of evidence and scrutiny. - Carl Sagan
And science reserves its highest accolades for those who overthrow established thinking. Yes, like every other field, there's an incentive to "go along to get along," but in science, there's an even greater incentive to be a heretic. Of course, you still have to be right. Right or wrong, you will have other scientists checking your work. But if you're right, you will be highly respected in the scientific community, with awards, accolades, and your pick of prestigious positions. That's a real incentive for a scientist to be a maverick.
So there are always contrarians in science. That's how science advances. But usually, they're wrong. Mainstream scientific thinking does change, but that's not the way to bet. After all, scientific thinking has been growing for centuries now, always tested by new generations of scientists. For us laymen, the only rational position on a scientific issue is to accept the consensus of the experts. They might be wrong, but they're far more likely to be right. And if you don't accept the consensus of the real experts on a scientific issue, then you're just picking sides based on what you want to be true. That's not rational.
Keep in mind a couple of things. First, since this is so important, I just want to emphasize once more that what any individual scientist says isn't necessarily valid. You can almost always find a "scientist" who agrees with you, whatever you believe. But an individual scientist's claims carry no real weight unless and until he can demonstrate to other scientists that he's right, and thereby change the scientific consensus. Picking a "tame" scientist to present contrary views is a common political tactic, but it's invalid.
Although the history of science contains numerous examples of an investigator's expectations clouding his or her vision and judgment, the most serious of these abuses are overcome by the discipline's insistence on replicability and the public presentation of results. - Thomas Gilovich
And second, this must be the consensus of scientists in their own particular field of expertise. Scientists are almost as likely as anyone else to hold loony opinions outside their own particular field. And when you get to cutting edge science, it's unlikely that non-specialists have the knowledge they need to really understand the issues. We're long past the time when a scientist could be a real Renaissance man (or woman), with an expert's knowledge in every field. Our total scientific knowledge has just grown too large for that.
So when it comes to climate change, for example, you shouldn't listen to Al Gore. He's not even a scientist, let alone a climatologist, and he's only one person. But you shouldn't listen to Fox "News," the big oil companies, or a scientist with a minority position, either. Doing any of those things is just picking what you want to be true, picking what agrees with "your side" in the political and social debate. That's not how a rational person determines the truth.
Instead, you should accept - tentatively, as all science is tentative - the views of the consensus of climatologists. They're not infallible, but they're far more likely to be right than anyone else. And this can apply to all questions of science, including - but not limited to - evolution, the age of the Earth, and the efficacy of homeopathic treatments.
No matter how many times a theory meets its tests successfully, there can be no certainty that it will not be overthrown by the next observation. This, then, is a cornerstone of modern natural philosophy. It makes no claim of attaining ultimate truth. In fact, the phrase "ultimate truth" becomes meaningless, because there is no way in which enough observations can be made to make truth certain and, therefore, "ultimate". - Isaac Asimov
3) Beware of pseudoscience. Scientists have been so successful in advancing our state of knowledge and bringing us life-saving and life-improving inventions that all sorts of people want to claim the backing of "science" for their pet beliefs. Unfortunately, they're not willing to use the rigorous methods of science. And so you get such unscientific things as "Christian Science" or claims that quantum mechanics (the "abracadabra" of the modern world) backs up the latest woo.
Real scientists are usually eager to combat these fakes, but laymen can still be confused (even scientists can be confused, when it comes to deliberate fraud). Look to scientific medical researchers to combat the claims of homeopathy and acupuncture. Look to evolutionary biologists to counter the claims of creationists. And whenever anyone who's not a theoretical physicist uses "quantum mechanics" as an explanation,... well, that tends to make a very good BS detector for rational people.
Inspect every piece of pseudoscience, and you will find a security blanket, a thumb to suck, a skirt to hold. - Isaac Asimov
4) Non-scientific matters are much more difficult. It's not that economists or historians or sociologists don't know more about their own field of expertise than we do. It's just that a clear consensus is often harder to find, because these professions can't run multiple, independent experiments to demonstrate, beyond a reasonable doubt, the truth.
Economists, for example, can study the Great Depression for evidence, but that evidence is more open to interpretation than scientific evidence. And it can't be duplicated under controlled conditions. Therefore, economists - being human - are more likely to interpret everything as support for their own favored beliefs. In history, sociology, and political science - to a greater or lesser extent - consensus is not impossible, but it's harder to come by.
So what do we laymen do? Well, first - as I mentioned above - be aware of the problem. Try to listen to multiple points of view. In these matters, too, pay attention to what the actual experts say. Economists might not agree, but you're far better off listening to the arguments of skilled economists, on both sides of an issue, than just some radio or TV personality. Economists, when it comes to economic issues, are far more likely to know what they're talking about, even when they disagree.
If possible, find a back-and-forth debate between experts. Even if it's just one economist replying to another, that will give you a better idea of where they actually disagree (and where they agree). If you understand that, you'll be in a better position to form your own opinions. If it's a policy issue, try to understand the point of the policy and the exact point of disagreement. Which side seems to have real evidence to back up their opinions?
All nonscientific systems of thought accept intuition, or personal insight, as a valid source of ultimate knowledge. ... Science, on the other hand, is the rejection of this belief, and its replacement with the idea that knowledge of the external world can only come from objective investigation - that is, by methods accessible to all. - Alan Cromer
Finally, accept that certainty is not possible. We just have to make the best decisions we can. However, we do need to learn from our mistakes (and from our successes, for that matter). If it's a personal financial question, write down your expectations and check back later to see if they've been met. Was it a good idea or a bad idea? After all, it's probably not the only financial question you'll ever need to decide.
If it's a matter of legislation, urge politicians to (1) specify exactly what the policy is supposed to accomplish in a particular time-frame (using precise, quantifiable measures of success and failure), and (2) include in the legislation a review of the policy at the end of that time, with specific attention to the original rationale and those quantifiable predictions of the proposal's effect. Far too often, policies continue to get funding simply because they've become the status quo, and the status quo always has backers.
Understand that we - all of us - can be wrong. There's no shame in being wrong. It happens. We want to avoid it, as much as possible, and we want to discover our errors as soon as possible - and to correct them. Following the above guidelines should help. Even so, you might sometimes make the wrong decisions (sometimes for the right reasons). Well, none of us is infallible.
Examine your mistakes. Did you make an error that could have been avoided? Maybe not. There are no guarantees. Sometimes we do everything right and it still ends up being the wrong decision. That's life. But in most cases, we probably just made a mistake - and likely a very human mistake.