10 Common Learning Myths That Might Be Holding You Back
Science is constantly changing, and although we’ve come a long way since the days when it was widely believed that older people couldn’t learn new things, a number of learning myths have stood the test of time despite having no grounding in scientific research.
Tom Bennett—teacher, author, and director of ResearchED—points out that there are still too many unproven theories about learning that are taken as fact. He founded ResearchED to tackle these myths and contribute to greater research literacy in the education community.
“We have had all kinds of rubbish thrown at us over the last 10 to 20 years,” he says. “We’ve been told that kids only learn properly in groups. We’ve had people claiming that children learn using brain gym, people saying that kids only learn if you appeal to their learning style. There’s not a scrap of research that substantiates this, and, unfortunately, it is indicative of the really, really dysfunctional state of social science research that exists today.”
In fact, according to research from the Organisation for Economic Co-operation and Development (OECD), of the education policies that trillions of dollars are spent on around the world, only one in ten is actually evaluated.
So with this in mind, we thought we’d line up some of the most common learning myths of the 21st century and take a look at why they don’t hold water.
1. Re-reading and highlighting
When students prepare for a test, two of the most common things they do to commit the relevant information to memory are re-reading it and highlighting whatever they consider to be important.
However, a report published in the journal Psychological Science in the Public Interest showed that both of these study strategies are relatively ineffective. Passively reading the same text over and over again does little for comprehension or recall unless the readings are spaced out over time, and highlighting or underlining can even be detrimental if the wrong information is selected.
So if you want to make your study time count, check out these smart study tactics that are based on the latest brain research.
2. Students have different learning styles
You’ve probably heard about “learning styles” and how everyone has their preferred or ideal learning style, whether it’s visual, auditory, or kinesthetic. The theory is that some people learn better when they take in information by listening to it, while others learn more effectively when information is presented visually, and still others learn best through hands-on practice.
It’s so popular that a recent poll of head teachers at independent schools showed that over 85 percent believe in learning styles, 66 percent are using them in their schools, and some have even sent teachers on courses or hired external consultants.
But academics from the worlds of neuroscience, education, and psychology have been voicing their concerns about the popularity of this approach to teaching and learning. Systematic studies of learning styles have consistently found no evidence or very weak evidence to support the idea that matching the material to a student’s learning style is more effective.
3. You are either right or left brained
The idea that some people are right brained while others are left brained has been around for a while now. According to the theory, left-brained people are more logical, analytical, and methodical, whereas right-brained people are more creative and artistic.
But a 2013 study by scientists from the University of Utah analysed over 1,000 brains and found no evidence that people preferentially use the left or right hemisphere.
Of course, certain functions are processed more by one region of the brain than by others, a phenomenon known as lateralization. But we all use the whole brain, and the very fact that its regions are interconnected is what allows us to think both creatively and analytically.
4. The 10,000-hour rule
Journalist and author Malcolm Gladwell popularised the 10,000-hour rule, which is based on research by psychologist Anders Ericsson and holds that 10,000 hours of deliberate practice is enough to become world-class in your chosen field.
But although practice is certainly essential when you’re learning a new skill or studying a new topic, there’s no magic number of hours that will turn you into an expert or bring you to the proficiency level of a professional athlete or musician.
A Princeton study found that deliberate practice can only predict success in fields with stable structures where the rules never change, such as tennis, chess, or classical music. In less stable fields, mastery requires more than just practice.
So what’s the takeaway?
“There is no doubt that deliberate practice is important, from both a statistical and a theoretical perspective,” explains study co-author Brooke Macnamara. “It is just less important than has been argued. For scientists, the important question now is, what else matters?”
5. You should always stick with your first answer
Have you ever been advised not to change an answer on a multiple-choice test or exam once you’ve put it down? This advice is common in school and even college, and one study found that 75 percent of college students and 55 percent of instructors believe that changing an initial answer will lower the overall score.
Despite the popularity of this theory, research shows that reconsidering your answers isn’t such a bad idea. A review of 33 studies found that, on average, people who change their answers score higher on tests than those who don’t. So if you’ve got extra time and are having doubts about one of your answers, don’t be afraid to give it a second look.
6. Intelligence is fixed at birth
We tend to think of intelligence as something that you either have or don’t have, a belief known as a fixed mindset. However, a growing body of research shows that our IQ can increase over time. In fact, research on growth mindset by Stanford psychologist Carol Dweck shows that our beliefs about intelligence can actually affect our effort and, in turn, our performance.
So what can you do if you don’t have a growth mindset? Not to worry, it’s an area we could all stand to improve in, so check out these tips for developing a growth mindset.
7. Praising intelligence will motivate students
When we want to motivate our kids, students, or even employees, we often praise their ability and intelligence by saying things like “Wow, that’s so smart” or “You’re really good at this.” However, the same growth-mindset research by Stanford psychologist Carol Dweck found that this kind of praise can actually be counterproductive and discourage people from taking risks.
So what should we be praising if not ability or intelligence? Dweck’s research shows that praising effort and persistence is a much better way to motivate people to work hard and keep improving. This is because praising effort rather than ability helps promote the idea that intelligence is malleable, and that trying and failing is all part of the learning process.
So instead of being afraid to make mistakes and seem dumb, students come to see that their brain is like a muscle that needs to be strengthened, and that mistakes can actually help them reach their full potential.
8. We only use 10 percent of our brain
The popular theory that we only use 10 percent to 20 percent of our brain has been around for years now, and was even promoted by recent Hollywood movies like Lucy and Limitless, where the protagonists uncover a way to unlock the rest of their brain and end up with superhuman powers.
Unfortunately, as appealing as it is to imagine that we have untapped potential, this theory is nothing more than an urban legend. It seems to have originated with the 1936 self-help book “How to Win Friends and Influence People,” whose preface misquoted Harvard psychologist William James.
9. The learning pyramid
Although the learning pyramid myth was debunked a long time ago, it still lingers and is taken as fact by many teachers and students. The theory says that people remember 10 percent of what they read, 20 percent of what they hear, 30 percent of what they see, 50 percent of what they see and hear, 70 percent of what they say and write, and 90 percent of what they do or teach others.
But while this pyramid would be a useful tool if it were true, the problem with it is that it’s never actually been proven and the percentages given are pure fiction.
It’s unclear where the pyramid and numbers originated, but researcher, learning expert and instructional designer Will Thalheimer points out that if someone uses scientific verbiage, we’re more likely to believe it, which is probably why the learning pyramid is still widely accepted as fact.
“People do not necessarily remember more of what they hear than what they read. They do not necessarily remember more of what they see and hear than what they see,” he says. “The numbers are nonsense and the order of potency is incorrect.”
10. There are shortcuts to better learning
This is probably the biggest learning myth of all time, because every learning myth we’ve covered so far is tied to the idea that there’s a quicker way to commit new information to memory.
It’s understandable, of course, since learning is hard work and we’d all love to take a shortcut if we could. But despite all the learning fads that have come and gone, from mindfulness to brain training games and exercises, learning is and will always be a process. It requires time and effort, and is bound to feel difficult and uncomfortable at times.
So while an understanding of how the brain works can certainly help us study and learn more effectively, the bottom line is that there are no shortcuts. The next time someone tells you about an app or learning method that sounds too good to be true, take it with a pinch of salt: view its claims critically and look for the evidence behind them.
Want to learn how to read critically, check facts, and gain a more balanced perspective? Check out these tips for honing your fact-checking skills.