12 Facets of Education That Will Be Obsolete By 2025
Educational research is especially fertile right now, and efforts to integrate it into curricula over the next decade are going to leave some of us high and dry unless we start paying attention this second. Significant findings range from brain-based study habits to insights into the nature of intelligence and motivation. They also include glimpses into which of today’s factors will no longer be, well, factors come 2025. In this list we outline 12 of the primary facets of learning, instruction, and policy that won’t be around to see the next decade.
1. Traditional subjects
We’re living in an increasingly multi-disciplinary world, so why shouldn’t our majors reflect that? In the next ten years, we’ll start to see less Biology, Math, and English and more Big Data, Creative Studies, and Decision Sciences.
Finland is considering its most radical overhaul of basic education yet: abandoning teaching by subject for teaching “by phenomenon.” Traditional lessons such as English Literature and Physics are already being phased out for 16-year-olds at institutions in Helsinki.
Instead, the Finns are teaching “topics” such as the European Union, which encompasses learning languages, history, politics, and geography. No more of an hour of history followed by an hour of chemistry. The idea aims to eliminate one of the biggest gripes of students everywhere: “What is the point of learning this?” Now, each subject is anchored to the reason for learning it.
Along the same lines, at University College London, students can earn a degree in Arts and Sciences (BASc). That’s right: not a double major, but an interdisciplinary major. Students engage with interdisciplinary learning throughout the degree, taking subjects from the Humanities, Social Sciences, Sciences and Engineering in relevant and connected combinations.
“We’re determined to be just as science-y as we are arts-y,” says Dr Carl Gombrich, who directs the program. Gombrich acknowledges that defenders of the traditional single-subject approach might argue the new program lacks depth, but he says this just isn’t the case.
The program is very popular so far. “We’ve exceeded our expected applications by 25 percent, and applications for next year are up by a further 10 percent.”
Many institutions offer what are called “interdisciplinary studies,” but they tend to be viewed on the same level as “create your own major” programs. And until now, they have been neither popular amongst students nor valued by employers. As companies begin demanding job candidates with expertise in combining the arts and sciences, interdisciplinary degree programs will start replacing traditional majors as the preferred course of study. By the year 2025 we’ll be kissing single-subject degrees goodbye.
2. Letter grades
Why are they still here?
“Somewhere along the way, there became an unspoken agreement that grades are effective communicators of student learning,” writes educator Chris Crouch. “And somehow we as a society have taken this bait; hook, line and sinker. The variability of student grades from teacher to teacher, course to course, school to school, and state to state are so great, I can’t believe that we realistically put any stock in what they measure and what they communicate. At best they are an accurate snapshot of where a student is but they do not provide parents or students meaningful feedback for improvement or even growth.”
Alfie Kohn agrees. Grades, he says, do three damaging things: they diminish students’ interest in whatever they’re learning, create a preference for the easiest possible task, and reduce the quality of students’ thinking.
In one experiment, students who were told they’d be graded on how well they learned a social studies lesson had more trouble understanding the main point of the text than did students who were told that no grades would be involved. Even on a measure of rote recall, the graded group remembered fewer facts a week later (Grolnick and Ryan, 1987).
In 2008, the faculty at Stanford Law School voted to eliminate letter grades and replace them with four levels of achievement. The decision came after a long period of discussion among students and faculty that weighed issues such as collegiality, anxiety, and fairness. Leaders of the reform argued that shifting from the precision of letter grades to broader categories would reduce pressure and refocus students’ and professors’ energies on learning for the sake of learning.
American universities that have adopted a no-grade policy include Antioch University, Bennington College, Harvey Mudd College, New College of Florida, Reed College, and Sarah Lawrence College. In Australia, most universities still use the traditional system. But change is coming. In a 2014 interview, Australian Council for Educational Research chief executive Geoff Masters called the current grading system “poor at lifting the performance of the brightest students and those struggling to meet expected standards.” In the U.K. last month, the Department for Education announced a commission to look into an “Assessment Without Levels” approach.
Teachers and institutions are rising up against the old system, and before long the bubble is going to burst. There’s even a popular Twitter hashtag for supporters of the no-grading system: #TTOG (Teachers Throwing Out Grades). Supporters include Mark Barnes, Starr Sackstein, and Brian Aspinall.
3. Linear instruction
It doesn’t make much sense to teach a concept, or even a unit, and never bring it up again. For one thing, it doesn’t help students retain new material in the long run. For another, it’s not a very cohesive or integrated approach to presentation. Why should we begin a year with the Revolutionary War, end it with the Obama administration, and never mention the former again? The best instruction isn’t linear; it loops back on itself and helps students understand the relationships between seemingly disparate concepts.
What’s more, cognitive psychologists are telling us that retention depends heavily on spaced recall, meaning there’s no way you can reasonably expect a student to remember details from the beginning of the semester, let alone the beginning of the year, without strengthening those memories through regular recall.
4. Highlighting and reviewing
Although highlighting and underlining are common practices, says educator and psychologist Annie Murphy Paul, studies show they offer no benefit beyond simply reading the text.
“Some research even indicates that highlighting can get in the way of learning; because it draws attention to individual facts, it may hamper the process of making connections and drawing inferences.”
Nearly as bad is the practice of rereading, a common exercise that is far less effective than active techniques such as practice testing. In fact, highlighting, underlining, rereading and summarising are all rated by researchers as being of “low utility.”
What students will start doing instead is testing themselves. Research shows that the mere act of calling information to mind strengthens that knowledge and aids in future retrieval.
“While practice testing is not a common strategy, despite the robust evidence supporting it,” says Paul, “there is one familiar approach that captures its benefits: using flash cards.”
We’ll start to see a lot more research-backed study methods being used amongst students and supported by teachers over the next half-decade.
5. Tracking
In their book, 50 Myths and Lies That Threaten America’s Public Schools, David Berliner and Gene Glass explain that tracking, or separating students according to academic ability, provides little to no benefit for low-achieving students and, at best, modest academic benefits for high-achieving students:
“For the vast majority of those who are labeled in our schools as gifted and talented, or high-achieving, ability grouping by such attributes appears not to work as well as commonly thought.”
We will find an effective alternative to tracking. Maybe not this year, maybe not the next. But it won’t be long before it’s overturned once and for all.
6. High-stakes standardised tests
Parents, students, and educators around the country are asking serious questions about the number of tests students are taking and the reasons they’re taking them. Last year the U.S. Senate education committee held a hearing on the reauthorisation of the No Child Left Behind (NCLB) law and, specifically, on testing. The committee’s chairman, Lamar Alexander, has released a draft bill offering a lot more leeway to states in designing their own assessment systems. Although Education Secretary Arne Duncan and Democratic leaders in Congress are saying we must protect annual testing at all costs, new ideas are being put forward that would offer alternatives to the current model.
7. The Matthew effect
A study conducted in 2012 found that third-graders who lack proficiency in reading are four times more likely to become high school dropouts. A vicious cycle sets in: assignments increasingly require background knowledge and familiarity with “book words” (literary, abstract and technical terms), competencies that are themselves acquired through reading. Meanwhile, courses in science, social studies, history and even math come to rely more and more on textual analysis, so that struggling readers begin to fall behind in these subjects as well.
This phenomenon is called the Matthew Effect, and, like tracking, it’s going to fade away as advances such as personalised instruction and learning analytics help us understand the learning process better.
8. Learning styles theory
Learning styles theory, one of the most popular and widely adopted educational paradigms of the past century, is finally meeting its end. In a controversial report in the journal Psychological Science in the Public Interest (2009), Harold E. Pashler, a professor of psychology at the University of California, San Diego, wrote, “The contrast between the enormous popularity of the learning-styles approach within education and the lack of credible evidence for its utility is, in our opinion, striking and disturbing. If classification of students’ learning styles has practical utility, it remains to be demonstrated.” The report was based on an exhaustive review of the learning styles literature, aimed at finding empirical evidence for the validity of the theory.

The same goes for Multiple Intelligences theory, first proposed 30 years ago by Dr. Howard Gardner, professor of education at Harvard University. You just can’t call one person a visual learner when we are all, in fact, visual learners. The sooner we realise this, the better.
9. The genius factor
A growth mindset is the idea that, whatever level of talent you have, you can always develop further through hard work, good strategies, and mentorship from others. In a growth mindset, the main goal is to get smarter, to challenge your brain, make those new connections, and grow your abilities over time. And when you think that way, Stanford psychologist Carol Dweck reveals, it’s not as crucial to look smart or smarter than other people at every given moment.
On the flip side of that is the fixed mindset, or what Dweck has nicknamed the “culture of genius.” It’s the belief that your basic talents, abilities, and intelligence are fixed traits. You have a certain amount, and that’s that. Having to work hard at something means you’re just not as smart as other people.
“When kids are praised for their talent, their brilliance, it fosters more of a fixed mindset, which is ironic because the self-esteem movement told us that was going to make our kids confident and successful,” says Dweck. “And when kids struggle, it creates a fear of challenge, a fear of failure. We show in our research that kids lie about their poor performance if you have praised them for their intelligence.”
10. The right brain/left brain paradigm
Kurt Fischer, founding president of the International Mind, Brain, and Education (MBE) Society and director of the MBE graduate program at Harvard University, says: “This is total nonsense, unless you’ve had half of your brain removed.” The theory may have emerged from a misunderstanding of the split-brain work of Nobel Prize winner Roger Sperry, who noticed differences in the brain when he studied people whose left and right hemispheres had been surgically disconnected. Certainly, some of us are more “artsy” or more “science-y,” but in reality this has less to do with some predetermined anatomical distribution of brain power and more to do with environment and genetics.
11. The access issue
Of the estimated 7 billion people on Earth, 6 billion now have access to a working mobile phone. Africa, which had a mobile penetration rate of just 5% in the 1990s, is now the second largest and fastest growing mobile phone market in the world, with a penetration rate of over 60% and climbing.
The United Nations estimates 58 million children from ages 6 to 11 don’t attend school, a number that has remained stubbornly stagnant since the middle of the last decade. But with the issue of technology access seemingly solved, education will follow closely on its heels.
12. ‘Admission’ to college
In his new book, The End of College: Creating the Future of Learning and the University of Everywhere, Kevin Carey envisions a future in which “the idea of admission to college will become an anachronism, because the University of Everywhere will be open to everyone and educational resources that have been scarce and expensive for centuries will be abundant and free.”
This will be driven largely by advances in information technology.
“Whereas historically you went to college in a specific place and only studied with the other people who could afford to go [to] that place,” Carey says in a recent interview on National Public Radio, “in the future we’re going to study with people all over the world, interconnected over global learning networks and in organisations that in some cases aren’t colleges as we know them today, but rather 21st-century learning organisations that take advantage of all of the educational tools that are rapidly becoming available to offer great college experiences for much less money.”