Educated But Misinformed: Why We Think We're Right When We're Not
You could argue that life is all about being right. We try to choose the right career, make the right assumptions, and say the right things, every moment of every day. And because being right matters so much to us, we sometimes do whatever it takes to stay that way, including defending ourselves when we’re wrong.
At Cornell University, David Dunning and his colleagues carry out ongoing research on the psychology of human wrongness. In their work, the researchers ask survey respondents whether they are familiar with certain technical concepts from physics, biology, politics, and geography. A fair number claim familiarity with genuine terms like “centripetal force” and “photon.” But they also claim some familiarity with concepts that are entirely bogus, such as “plates of parallax,” “ultra-lipid,” and “cholarine.” In one study, roughly 90 percent of participants claimed some knowledge of at least one of the nine fictitious concepts they were asked about. Most interesting of all, Dunning and his colleagues found that the better versed respondents considered themselves to be in a general topic, the more familiarity they claimed with the meaningless terms associated with it in the survey.
“It’s odd to see people who claim political expertise assert their knowledge of both Susan Rice (the national security adviser to President Barack Obama) and Michael Merrington (a pleasant-sounding string of syllables),” says Dunning, “but it’s not that surprising. For more than 20 years, I have researched people’s understanding of their own expertise—formally known as the study of metacognition, the processes by which human beings evaluate and regulate their knowledge, reasoning, and learning—and the results have been consistently sobering, occasionally comical, and never dull.”
So why do people misevaluate their own expertise? Dunning says it’s partly unavoidable.
“Logic itself almost demands this lack of self-insight: For poor performers to recognise their ineptitude would require them to possess the very expertise they lack,” he explains. “To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers—and we are all poor performers at some things—fail to see the flaws in their thinking or the answers they lack.”
But what’s curious, Dunning says, is that incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are “often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.”
In 1999, in the Journal of Personality and Social Psychology, Dunning and his former graduate student Justin Kruger published a paper documenting how, in many areas of life, incompetent people are simply incapable of recognising just how incompetent they are. They called this the Dunning-Kruger effect. And while it applies to nearly every area of life, from political debate to firearm safety, some of its most disturbing effects occur in the field of education.
When Education Fails to Educate
Over the years, Dunning has become convinced of one key, overarching fact about the ignorant mind: “One should not think of it as uninformed. Rather, one should think of it as misinformed.”
He describes a 2014 study by Tony Yates and Edmund Marek that tracked the effect of high school biology classes on 536 Oklahoma high school students’ understanding of evolutionary theory. The students were rigorously quizzed on their knowledge of evolution before taking introductory biology, and then again just afterward. Not surprisingly, the students’ confidence in their knowledge of evolutionary theory shot up after instruction, and they endorsed a greater number of accurate statements.
The trouble is that the number of misconceptions the group endorsed also shot up. For example, instruction raised the percentage of students strongly agreeing with the true statement “Evolution cannot cause an organism’s traits to change during its lifetime” from 17 to 20 percent, but it also raised the percentage strongly disagreeing from 16 to 19 percent. In response to the likewise true statement “Variation among individuals is important for evolution to occur,” instruction increased strong agreement from 11 to 22 percent, but strong disagreement also rose from 9 to 12 percent. Tellingly, the only response that uniformly went down after instruction was “I don’t know.”
Dunning laments: “Education fails to correct people who believe that vision is made possible only because the eye emits some energy or substance into the environment. It fails to correct common intuitions about the trajectory of falling objects. And it fails to disabuse students of the idea that light and heat act under the same laws as material substances. What education often does appear to do, however, is imbue us with confidence in the errors we retain.”
Dunning has observed the same effect in his own research.
In a 2013 study of physics students, Dunning and his colleagues presented participants with several problems about the trajectory of a ball. Some participants got perfect scores and seemed to know it; they were quite confident in their answers. Others did only moderately well and, again, seemed to know it, as their confidence dipped slightly. But the participants who did the worst were just as confident in their answers as the top performers. Looking only at confidence, it was often impossible to tell which students had gotten 100 percent right and which had gotten zero percent right.
Why? “Because both groups ‘knew something,’” Dunning says. “They knew there was a rigorous, consistent rule that a person should follow to predict the ball’s trajectory.” One group used the right Newtonian principle; the other used the wrong one, but stuck by it.
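The study’s actual problems aren’t reproduced here, but as a rough, hedged sketch of the kind of “rigorous, consistent rule” the top performers applied: for a ball launched with speed \(v_0\) at angle \(\theta\), ignoring air resistance (the symbols \(v_0\), \(\theta\), and \(g\) are illustrative assumptions, not details from the study), Newtonian mechanics predicts a single parabolic path:

\[
x(t) = v_0 \cos\theta \, t, \qquad y(t) = v_0 \sin\theta \, t - \tfrac{1}{2} g t^{2}
\]

Eliminating \(t\) gives \(y = x \tan\theta - \dfrac{g x^{2}}{2 v_0^{2} \cos^{2}\theta}\), a parabola. A student applying a different but internally consistent rule would predict a different path, and, as Dunning notes, could do so with just as much confidence.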
Driver’s education courses, particularly those aimed at handling emergency maneuvers, tend to increase, rather than decrease, accident rates. They do so, Dunning explains, because training people to handle, say, snow and ice leaves them with the lasting impression that they’re permanent experts on the subject. In fact, their skills usually erode rapidly after they leave the course. And so, months or even decades later, they have confidence but little leftover competence when their wheels begin to spin.
We tend to exhibit the Dunning-Kruger effect on a societal level as well.
In 1943, after a survey of 7,000 American college freshmen found that only six percent could identify the original 13 colonies (with some believing that Abraham Lincoln, “our first president,” “emaciated the slaves”), the New York Times lamented the nation’s “appallingly ignorant” youth. Six decades later, in 2002, a national test of fourth, eighth, and 12th graders produced similar results.
“The way we traditionally conceive of ignorance—as an absence of knowledge—leads us to think of education as its natural antidote,” Dunning says. But as his research shows, being misinformed can be just as dangerous as being uninformed.
Filling the Void
So how can we learn to recognise our own ignorance and misbeliefs, and help prevent misinformation from spreading to others? Is it even possible?
Scholars at the University of Bristol and the University of Western Australia have been studying why people come to hold misbeliefs and how to prevent them in future situations.
Stephan Lewandowsky of the University of Western Australia, who leads the research, says the main reason people believe false information is that accepting a statement as true takes less brain power than evaluating and rejecting it. Getting at the truth takes time and effort that people often aren’t willing to spend on issues that aren’t of immediate concern.
“The main reason that misinformation is sticky, according to the researchers, is that rejecting information actually requires cognitive effort,” he says. “Weighing the plausibility and the source of a message is cognitively more difficult than simply accepting that the message is true – it requires additional motivational and cognitive resources. If the topic isn’t very important to you or you have other things on your mind, misinformation is more likely to take hold.”
When we do take the time to thoughtfully evaluate incoming information, there are only a few features that we are likely to pay attention to: Does the information fit with other things I believe in? Does it make a coherent story with what I already know? Does it come from a credible source? Do others believe it?
“Misinformation is especially sticky when it conforms to our pre-existing political, religious, or social point of view,” Lewandowsky says. “Because of this, ideology and personal worldviews can be especially difficult obstacles to overcome.”
Even worse, Lewandowsky says, efforts to retract misinformation often backfire, paradoxically amplifying the effect of the erroneous belief.
Along with his colleagues, Lewandowsky is currently researching strategies to help people correct their misbeliefs and avoid ignorance in the future.
Here are a few things you can do to prevent misinforming yourself and others:
1. Open a lesson with the most common misbeliefs on a topic.
After that, show students the gaps created by those misbeliefs or the implausible conclusions they lead to. “Such an approach can make the correct theory more memorable when it’s unveiled, and can prompt general improvements in analytical skills,” Dunning says.
2. To correct misbeliefs, fill the void left behind.
If you’re going to tell students something is not true, quickly provide the information that is true. Otherwise, Dunning says, the correction won’t stick, because students will remember everything that was said except the crucial qualifier, “not.” Successfully eradicating a misbelief requires not only removing it, but also filling the void it leaves behind.
3. Play devil’s advocate.
“For individuals,” Dunning says, “the trick is to be your own devil’s advocate: to think through how your favoured conclusions might be misguided, to ask yourself how you might be wrong, or how things might turn out differently from what you expect.”
4. Consider the opposite.
Imagine a future in which a decision you made has turned out to be wrong, then consider the likeliest path that led to that failure.
5. Seek advice from others.
“Other people may have their own misbeliefs,” Dunning says, “but a discussion can often be sufficient to rid a serious person of his or her most egregious misconceptions.”
6. Recognise the limits of your knowledge.
“The built-in features of our brains, and the life experiences we accumulate, do in fact fill our heads with immense knowledge,” Dunning says. “What they do not confer is insight into the dimensions of our ignorance. As such, wisdom may not involve facts and formulas so much as the ability to recognise when a limit has been reached.”
7. Focus on the facts you want to highlight, rather than the myths.
If you’ve begun a lesson or study session by dispelling various misbeliefs, devote the rest of it to the so-called “positive” information, meaning the information that is true.
8. Make sure that the information you want people to take away is simple and brief.
The key phrase here is “take away.” Summarising the most important parts of your lesson whenever possible helps students remember them.
9. Consider your audience and the beliefs they are likely to hold.
No complicated psychoanalysis required. Just be aware of what your students know and don’t know, and how that may bias their beliefs.
10. Strengthen your message through repetition.
The brain likes repetition. Although some of us may be better able than others to remember something after hearing it once, most people need to hear a concept multiple times before it sinks in. Repeating a concept over time, say every few days or once a week, works especially well.