Here's an article in New Scientist about how scientists need to do a better job of selling scientific findings to the American people. An excerpt:
[John] Holdren's prescription was a classic example of the "deficit model" of science communication, which assumes that mistrust of unwelcome scientific findings stems from a lack of knowledge. Ergo, if you provide more facts, scepticism should melt away. This approach appeals to people trained to treat evidence as the ultimate arbiter of truth. The problem is that in many cases, it just doesn't work.
Perversely, just giving people more information can sometimes polarise views and cause sceptics to harden their line. "We can preach the scientific facts as long as we want," says Dietram Scheufele, a specialist in science communication at the University of Wisconsin-Madison. "This is replicating the same failed experiment over and over again." ...
First, though, a bit of perspective. While some of the comments made recently by Rick Perry, Michele Bachmann and others may seem alarming, it's important to bear in mind the relatively narrow audience they were intended to reach.
This is presidential primary season, when candidates must appeal to the most ideologically committed voters to win their party's nomination. When Perry invoked Galileo in contending that "the science is not settled" on climate change, it was a message crafted to appeal to hard-core Republican voters and big-money donors within the oil and coal industries - not to the majority of Americans who accept that our planet is getting warmer and that human activities are largely to blame.
In fact, few objective measures support the idea that fundamentally anti-science ideology has taken hold in the US. Scientists are generally held in high public esteem, scientific knowledge shapes up fairly well compared to other nations, public interest is high and investment in research remains healthy. "You can't find a society that's more pro-science," argues Dan Kahan of Yale University.
OK, so America is very "pro-science," but this article suggests letting the military or the insurance industry "carry the climate message." Al Gore, apparently, was a terrible messenger because he's a politician. (So... what? We're supposed to object when a politician accepts science?) And, of course, no one in this supposedly "pro-science" nation is actually going to listen to scientists:
While it wasn't always so, US scientists tend to lean heavily towards the Democrats' camp - which helps explain why the idea of climatologists forming part of a liberal conspiracy to whip up alarm and keep federal research dollars flowing has become part of the climate deniers' narrative.
Sure, scientists have abandoned the Republican Party in droves. Only 6% of American scientists now consider themselves to be Republicans (not all of the others are Democrats; many are independents). But that's because the Republican Party has become completely anti-science.
For similar reasons, African Americans abandoned the GOP when the party initiated its "Southern strategy" of deliberately appealing to white racists. And much as Republicans would like to attract the Hispanic vote, their policy of whipping up hysteria about Hispanic immigrants makes that rather difficult, too.
So how can we convince the Republican Party to become more rational and more evidence-based, as well as less bigoted? Yeah, good question, huh? Is this really just a problem with framing?
For many scientists, talk of "framing" and "selling" ideas to the public sounds uncomfortably like misinformation through the dark art of spin. This misses the point, argue advocates of framing. It's possible to communicate accurately about science in the context of an engaging frame, they say.
This author, Peter Aldhous, may have a point. But I guess I'm skeptical. Scientists are never going to be as good at selling and spinning information as their opponents. For one thing, the truth matters to scientists. In the Republican Party, the end justifies the means. But that's never going to fly in the scientific community.
Even if a scientist does believe that the end justifies the means - and sometimes one does - the deception is quickly discovered by other scientists. After all, scientific results must be peer-reviewed and confirmed by independent research, and the surest way to make a name for yourself as a scientist is to demonstrate that some other scientist is wrong.
And what should we do when Democrats back real science, tell them to shut up? Sure, if Democrats are for something, Republicans will automatically be against it. (We saw this when the Democrats adopted the Republicans' own health care plan, at which point it instantly became "socialism" to the GOP.) But are we really going to discourage Democrats from supporting science in the faint hope that Republicans might start doing so?
Frankly, one of my problems with the Democrats is that they don't support science strongly enough. (Admittedly, this is one area where Barack Obama has done pretty well - though not, unfortunately, when it comes to global warming.)
Aldhous explains that education actually strengthens our cultural biases:
Hammering another nail into the coffin of the deficit model, Kahan's latest survey of more than 1500 US adults indicates that far from overcoming our cultural biases, education actually strengthens them. Among those with greater numeracy and scientific literacy, opinions on climate change polarised even more strongly.
Kahan's explanation is that we have a strong interest in mirroring the views of our own cultural group. The more educated we become, he argues, the better we get at making the necessary triangulation to adopt the "correct" opinions. On issues like climate change, for most people these cultural calculations trump any attempt to make an objective assessment of the evidence.
But that doesn't make any sense, does it? Why would scientists leave the Republican Party in such huge numbers if greater "scientific literacy" actually strengthens our cultural biases? After all, who would be more scientifically literate (and numerate) than scientists?
I don't know. The fact is, I don't think I want our scientists "framing" their findings. And I guess I doubt that increased scientific literacy is any part of the problem. I suspect that we're just not teaching what's really important in science - the scientific method.
To my mind, you can't be scientifically literate without understanding the scientific method. And if you understand the scientific method, you'll accept the scientific consensus on any issue - not as settled truth, since the scientific consensus is always provisional, but as the best answer we have so far. People - laymen, at any rate - who don't accept the consensus on those terms are not scientifically literate. It's as simple as that.
This reminds me of that debate about Fox "News" viewers, and the difference between being misinformed and ill-informed. (They are not necessarily ill-informed, since they know who the president and vice-president are. They're just misinformed, since they believe lies about them.)
Similarly, what does it really mean to be scientifically literate? It's not a command of facts, especially when people don't believe those facts anyway. The basis of scientific literacy is understanding the scientific method - what it is, how it works to correct bias, and why it's the best way we've ever discovered of determining the truth.