His post Why People Deny Science 11/14/2018 addresses people's susceptibility to pseudoscience more generally:
To those naïve to the challenges we face, at first the concept of science-based medicine seems obvious. Well of course you’re going to base medicine on science, what are the other options? At its core the idea is simple: medical practice should be informed by the best evidence we have available. In practice, this is complicated, because there are many ways to evaluate the evidence.
He gives more details on the results of a recent study of 244 subjects, Examining how people reason about controversial scientific topics (behind subscription).
Further, people find many ways to deny the science, either a specific scientific conclusion, or science itself. We hear these various excuses all the time – there are “other” ways of knowing, we don’t need science to know what works, I am the evidence, etc. Sometimes people respect science, but just get it wrong. They might misunderstand the placebo effect, overestimate the significance of studies, not understand the nature of p-hacking, or fail to realize the potential for self deception in less-than-rigorous studies.
Then there are those who will dismiss entire swathes of science out of hand. This is commonly done through an appeal to conspiracy theories, such as references to “Big Pharma”, or the notion that doctors lie to make money, that the system is broken and cannot be trusted, or even that science itself is broken.
There has been quite a bit of discussion recently about how people process political conspiracy theories and false or misleading reports from ideologically or politically partisan sources. One of the puzzlers has been the idea that refuting false information can make the person holding it even more convinced of the erroneous claim or conspiracy theory.
Kind of a discouraging thought for the reality-based community.
Novella, though, suggests that such a "backfire" phenomenon may not be so clear-cut as some writers suggest:
There are many studies showing that people will engage ad hoc in motivated reasoning, meaning that the conclusion comes first, and reasoning is used to justify the conclusion rather than determine the conclusion. There is inconsistent evidence for a possible backfire effect – which means not only rejecting evidence which contradicts a held belief, but strengthening that belief in the face of contradictory evidence.
He puts it even more strongly in Backfire Effect Not Significant Neurologica Blog 01/04/2018:
... more recent research suggests that the backfire effect may not exist, or at least is exceedingly rare. A recently published series of studies puts a pretty solid nail in the coffin of the backfire effect (although this probably won’t be the last word). ...
He focuses instead on the problem of what he calls "metacognition":
The backfire effect, however, is very specific. This occurs when people not only reject factual correction, but create counterarguments against the correction that move them further in the direction of the incorrect belief. It’s probably time for us to drop this from our narrative, or at least deemphasize it and put a huge asterisk next to any mention of it.
What all of this suggests is that people do not usually engage in metacognition – thinking about their own thinking. They may have a cognitive style that they tend to use, but otherwise they engage in whatever type of reasoning serves their purpose on any particular topic. They might vigorously defend the consensus of scientific opinion on one topic, then reject it on another citing a vague conspiracy, and dismiss it on a third without any real justification or by appealing to fallacious logic.
To counter this we cannot just teach science or explain what the evidence says. We need to teach critical thinking skills – which is metacognition. Critical thinking includes an understanding of all the various ways our thinking can go wrong. Just as important, however, is that critical thinking involves stepping back from our own cognitive process to examine it objectively, to make a sincere effort to consistently apply valid logic and the same fair and objective criteria for evidence. [my emphasis]