12 Facets of Education That Will Be Obsolete By 2025

By Saga Briggs
May 19th, 2016


Educational research is especially fertile right now, and efforts to integrate it into curricula over the next decade are going to leave some of us high and dry unless we start paying attention this second. Significant findings range from brain-based study habits to insights into the nature of intelligence and motivation. They also include glimpses into which of today’s factors will no longer be, well, factors come 2025. In this list we outline 12 of the primary facets of learning, instruction, and policy that won’t be around to see the next decade.

1. Traditional subjects

We’re living in an increasingly multi-disciplinary world, so why shouldn’t our majors reflect that? In the next ten years, we’ll start to see less Biology, Math, and English, and more Big Data, Creative Studies, and Decision Sciences.

Finland is considering its most radical overhaul of basic education yet: abandoning teaching by subject for teaching “by phenomenon.” Traditional lessons such as English Literature and Physics are already being phased out for 16-year-olds at institutions in Helsinki.

Instead, the Finns are teaching “topics” such as the European Union, which encompasses learning languages, history, politics, and geography. No more of an hour of history followed by an hour of chemistry. The idea aims to eliminate one of the biggest gripes of students everywhere: “What is the point of learning this?” Now, each subject is anchored to the reason for learning it.

Along the same lines, at University College London, students can earn a degree in Arts and Sciences (BASc). That’s right: not a double major, but an interdisciplinary major. Students engage with interdisciplinary learning throughout the degree, taking subjects from the Humanities, Social Sciences, Sciences, and Engineering in relevant and connected combinations.

“We’re determined to be just as science-y as we are arts-y,” says Dr Carl Gombrich, who directs the program. Gombrich acknowledges that defenders of the traditional single-subject approach might argue the new program lacks depth, but he says this just isn’t the case.

The program is very popular so far. “We’ve exceeded our expected applications by 25 percent, and applications for next year are up by a further 10 percent.”

Many institutions offer what are called “interdisciplinary studies,” but they tend to be viewed on the same level as “create your own major” programs. And until now, they have been neither popular amongst students nor valued by employers. As companies begin demanding job candidates with expertise in combining the arts and sciences, interdisciplinary degree programs will start replacing traditional majors as the preferred course of study. By the year 2025 we’ll be kissing single-subject degrees goodbye.

2. Letter grades

Why are they still here?

“Somewhere along the way, there became an unspoken agreement that grades are effective communicators of student learning,” writes educator Chris Crouch. “And somehow we as a society have taken this bait; hook, line and sinker. The variability of student grades from teacher to teacher, course to course, school to school, and state to state are so great, I can’t believe that we realistically put any stock in what they measure and what they communicate. At best they are an accurate snapshot of where a student is but they do not provide parents or students meaningful feedback for improvement or even growth.”

Alfie Kohn agrees. Grades, he says, do three damaging things: they diminish students’ interest in whatever they’re learning, create a preference for the easiest possible task, and reduce the quality of students’ thinking.

In one experiment, students who were told they’d be graded on how well they learned a social studies lesson had more trouble understanding the main point of the text than did students who were told that no grades would be involved. Even on a measure of rote recall, the graded group remembered fewer facts a week later (Grolnick and Ryan, 1987).

In 2008, the faculty at Stanford Law School voted to eliminate letter grades and replace them with four levels of achievement. The decision came after a long period of discussion among students and faculty that weighed issues such as collegiality, anxiety, and fairness. Leaders of the reform argued that shifting from the precision of letter grades to broader categories would reduce pressure and refocus students’ and professors’ energies on learning for the sake of learning.

American universities that have adopted a no-grade policy include Antioch University, Bennington College, Harvey Mudd College, New College of Florida, Reed College, and Sarah Lawrence College. In Australia, most unis still use the traditional system. But change is coming. In a 2014 interview, Australian Council for Educational Research chief executive Geoff Masters called the current grading system “poor at lifting the performance of the brightest students and those struggling to meet expected standards.” In the U.K. last month, the Department for Education announced a commission to look into an “Assessment Without Levels” approach.

Teachers and institutions are rising up against the old system, and before long the bubble is going to burst. There’s even a popular Twitter hashtag for supporters of the no-grading system: #TTOG (Teachers Throwing Out Grades). Supporters include Mark Barnes, Starr Sackstein, and Brian Aspinall.

3. Teach-and-abandon

It doesn’t make much sense to teach a concept, or even a unit, and never bring it up again. For one thing, it doesn’t help students retain new material in the long run. For another, it’s not a very cohesive or integrated approach to presentation. Why should we begin a year with the Revolutionary War, end it with the Obama administration, and never mention the former again? The best instruction isn’t linear; it loops back on itself and helps students understand the relationships between seemingly disparate concepts.

What’s more, cognitive psychologists are telling us that retention depends heavily on spaced recall, meaning there’s no way you can reasonably expect a student to remember details from the beginning of the semester, let alone the beginning of the year, without strengthening those memories through regular recall.

4. Highlighting and reviewing

Although highlighting and underlining are common practices, says educator and psychologist Annie Murphy Paul, studies show they offer no benefit beyond simply reading the text.

“Some research even indicates that highlighting can get in the way of learning; because it draws attention to individual facts, it may hamper the process of making connections and drawing inferences.”

Nearly as bad is the practice of rereading, a common exercise that is much less effective than some of the better techniques you can use. In fact, highlighting, underlining, rereading and summarising are all rated by researchers as being of “low utility.”

What students will start doing instead is testing themselves. Research shows that the mere act of calling information to mind strengthens that knowledge and aids in future retrieval.

“While practice testing is not a common strategy, despite the robust evidence supporting it,” says Paul, “there is one familiar approach that captures its benefits: using flash cards.”
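For edtech-minded readers, the spacing logic behind practice testing can be sketched in a few lines of code. Below is a minimal, hypothetical Leitner-box scheduler (the article mentions flash cards generally, not this particular algorithm; the names `REVIEW_EVERY`, `schedule`, and `answer` are my own): cards answered correctly are promoted to boxes reviewed at longer intervals, while missed cards drop back to daily review.

```python
# Sketch of a Leitner flash-card scheduler (illustrative, not from the article).
# Cards start in box 1. A correct answer promotes a card one box; a wrong
# answer sends it back to box 1. Higher boxes are reviewed less often,
# which spaces out recall of well-known material.

REVIEW_EVERY = {1: 1, 2: 3, 3: 7}  # box number -> review interval in days

def schedule(card_boxes, day):
    """Return the cards due for review on a given day."""
    return [card for card, box in card_boxes.items()
            if day % REVIEW_EVERY[box] == 0]

def answer(card_boxes, card, correct):
    """Move a card between boxes based on whether it was answered correctly."""
    if correct:
        card_boxes[card] = min(card_boxes[card] + 1, 3)  # promote, cap at box 3
    else:
        card_boxes[card] = 1  # back to daily review
```

The point of the sketch is simply that self-testing with expanding intervals is easy to operationalise, whether on paper or in an app.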

We’ll start to see a lot more research-backed study methods being used amongst students and supported by teachers over the next half-decade.

5. Tracking

In their book, 50 Myths and Lies That Threaten America’s Public Schools, Berliner and Glass explain that tracking, or separating students according to academic ability, provides little to no benefit for low-achieving students and, at best, modest academic benefits for high-achieving students:

“For the vast majority of those who are labeled in our schools as gifted and talented, or high-achieving, ability grouping by such attributes appears not to work as well as commonly thought.”

We will find an effective alternative to tracking. Maybe not this year, maybe not the next. But it won’t be long before it’s overturned once and for all.

6. High-stakes standardised tests

Parents, students, and educators around the country are asking serious questions about the number of tests students are taking and the reasons they’re taking them. Last year the U.S. Senate education committee held a hearing on the reauthorisation of the No Child Left Behind (NCLB) law and, specifically, on testing. The committee’s chairman, Lamar Alexander, has released a draft bill offering a lot more leeway to states in designing their own assessment systems. Although Education Secretary Arne Duncan and Democratic leaders in Congress are saying we must protect annual testing at all costs, new ideas are being put forward that would offer alternatives to the current model.

7. The Matthew effect

A study conducted in 2012 found that third-graders who lack proficiency in reading are four times more likely to become high school dropouts. A vicious cycle sets in: assignments increasingly require background knowledge and familiarity with “book words” (literary, abstract and technical terms), competencies that are themselves acquired through reading. Meanwhile, courses in science, social studies, history and even math come to rely more and more on textual analysis, so that struggling readers begin to fall behind in these subjects as well.

This phenomenon is called the Matthew Effect, and it’s going to fade away with tracking as we start understanding the learning process better with things like personalised instruction and analytics.

8. Learning styles theory

Learning styles theory, one of the most popular and widely adopted paradigms of the century, is finally meeting its end. In a controversial report in the journal Psychological Science in the Public Interest (2009), Harold E. Pashler, a professor of psychology at the University of California at San Diego, wrote, “The contrast between the enormous popularity of the learning-styles approach within education and the lack of credible evidence for its utility is, in our opinion, striking and disturbing. If classification of students’ learning styles has practical utility, it remains to be demonstrated.” The report was based on an exhaustive study of learning styles literature, aimed at finding empirical evidence for the validity of the theory. The same goes for Multiple Intelligences theory, first proposed by Dr. Howard Gardner, professor of education at Harvard University, 30 years ago. You just can’t call one person a visual learner when we are all, in fact, visual learners. The sooner we realise this, the better.

9. The genius factor

A growth mindset is the idea that, whatever level of talent you have, you can always develop further through hard work, good strategies, and mentorship from others. In a growth mindset, the main goal is to get smarter, to challenge your brain, make those new connections, and grow your abilities over time. And when you think that way, Stanford psychologist Carol Dweck reveals, it’s not as crucial to look smart or smarter than other people at every given moment.

On the flip side of that is the fixed mindset, or what Dweck has nicknamed the “culture of genius.” It’s the belief that your basic talents, abilities, and intelligence are fixed traits. You have a certain amount, and that’s that. Having to work hard at something means you’re just not as smart as other people.

“When kids are praised for their talent, their brilliance, it fosters more of a fixed mindset, which is ironic because the self-esteem movement told us that was going to make our kids confident and successful,” says Dweck. “And when kids struggle, it creates a fear of challenge, a fear of failure. We show in our research that kids lie about their poor performance if you have praised them for their intelligence.”

10. The right brain/left brain paradigm

Kurt Fischer, founding president of the International Mind, Brain, and Education (MBE) Society and director of the MBE graduate program at Harvard University, says: “This is total nonsense, unless you’ve had half of your brain removed.” The theory may have emerged from a misunderstanding of the split-brain work of Nobel Prize winner Roger Sperry, who noticed differences in the brain when he studied people whose left and right brains had been surgically disconnected. Surely, some of us are more “artsy” or more “science-y,” but in reality this has little to do with some predetermined anatomical distribution of brain power and more to do with environment and genetics.

11. The access issue

Of the estimated 7 billion people on Earth, 6 billion now have access to a working mobile phone. Africa, which had a mobile penetration rate of just 5% in the 1990s, is now the second largest and fastest growing mobile phone market in the world, with a penetration rate of over 60% and climbing.

The United Nations estimates 58 million children from ages 6 to 11 don’t attend school, a number that has remained stubbornly stagnant since the middle of the last decade. But with the issue of technology access seemingly solved, education will follow closely on its heels.

12. ‘Admission’ to college

In his new book, The End of College: Creating the Future of Learning and the University of Everywhere, Kevin Carey envisions a future in which “the idea of admission to college will become an anachronism, because the University of Everywhere will be open to everyone and educational resources that have been scarce and expensive for centuries will be abundant and free.”

This will be driven largely by advances in information technology.

“Whereas historically you went to college in a specific place and only studied with the other people who could afford to go [to] that place,” Carey says in a recent interview on National Public Radio, “in the future we’re going to study with people all over the world, interconnected over global learning networks and in organisations that in some cases aren’t colleges as we know them today, but rather 21st-century learning organisations that take advantage of all of the educational tools that are rapidly becoming available to offer great college experiences for much less money.”

About 

Saga Briggs is Managing Editor of InformED. You can follow her on Google+ or @sagamilena

9 Responses

  1. Shelley Tapp says:

    Two years ago, at my university, we attempted to deal with some of these issues as we considered the future of colleges and universities. Some of these ideas met a great deal of resistance from faculty. One of the important factors, as I saw it, is the inter-relation of how students are taught in public schools and what methods will be effective for them when they reach college. So, it seems to me, that these changes must be made concurrently in higher education and public schools, and that even after a transformation, it would take some time to see the real consequences on student learning at both levels. I compare it to developing a vaccine for an illness: you can help the people who haven’t had the illness yet, but the vaccine will do little or nothing to help the people infected with the disease before the vaccine can be disseminated. So my question is, how much do we as educators believe that students being produced by the current system can respond to the new paradigms? I would like to hear others comment. Please forgive the vaccine metaphor; I did not mean to imply that either level of education equates to a disease. It was just the first metaphor I could think of that encapsulated my reservations, doubts, even agreement with the principles of this article — in other words, my ability to visualize what the new worldwide learning organization will look like and how students will respond to it.

    • Saga Briggs says:

      Great question, Shelley. I agree it’s a transition that requires time, and it may be unrealistic to expect concurrent change at both levels, at least right away. If you’d like, please add your comment to our Facebook post here: https://www.facebook.com/OC.InformED/. We’d love to hear other educators’ thoughts as well, and you make a really good point that I’m sure many of our followers would appreciate having a chance to respond to.

  2. colin lever says:

    A lot of interesting points here, but not all of them come with a solution. It is not enough just to say that ‘something’ will happen in the future (i.e. points 5 & 6). What we need are active solutions that can challenge the status quo now. Take a look at my book, Children in Need: Education, Wellbeing & the Pursuit of GDP, or my website as an example.

  3. Charles Booth says:

    This article resonates with me very strongly. I teach apprenticeships in the UK, and the model is in desperate need of updating and making relevant for newer subjects and industries such as digital marketing and creative media etc. Sometimes it is like we are travelling salesmen driving to a learner workplace for a short lesson and some form filling, spending too much time measuring and almost no time teaching.
    I’ve long suspected a few things that need changing but a lot of the things you’ve mentioned I wouldn’t have ever thought of. I’d love to hear more, if you could recommend some of your sources to me. I would love to start doing something about it… Charles

    • Daniel Wurm says:

      We have introduced a very different type of apprenticeship here in Australia, using cutting edge e-learning and smart-phone technology. Using our system, apprentices complete all their study and assessment on a real job, in real workplaces. It gives employers the tools they need to do the training themselves. That is the future of apprenticeships. Contact us for more info

  4. Jeremy says:

    Everything is great except for #8, learning styles theory. While perhaps an unproven concept, I personally feel I reach a much faster understanding through auditory channels than otherwise.

    I also have a bone to pick with #10, your denial of right brain / left brain theory. I strongly recommend Dan Pink’s A Whole New Mind if you’re curious about how the two “halves” of the brain (symbolically represented by the left and right hemispheres, though in reality located all throughout) differ in functionality. It’s not “artsy” and “sciencey,” but rather “analytical” and “synergistic”

    Other than that, great article that I may reference in my next piece.

    • Saga Briggs says:

      Thanks for your comment, Jeremy. I agree that evidence for both theories can be found in the real world—certainly, some people learn best through one channel vs another, and think more “analytically” vs “synergistically”—but I don’t think we should be suggesting to students that everyone fits neatly into one category or another. Because not everyone does. There is far too much variation between individuals to treat these theories as comprehensive representations of the way the brain operates. Sure, it’s powerful and useful if you discover that you do fit into one of these molds. But we shouldn’t be pointing to students and saying, “You’re an auditory learner, NOT a visual learner” or “You’re an analytical thinker, NOT a creative thinker.” Unfortunately this is the sort of black and white perspective these theories have led us to adopt.

    • Swara Patel says:

      Hey, I totally agree with you. While I was reading the article, I had the exact same opinion as yours 🙂

  5. Thabo says:

    Thank you very much Saga for this thoughtful and yet inspiring article. It clearly shows that education reform is revolutionary in nature and that fixation on certain educational approaches stifles growth, especially in countries which are on the receiving end of educational research from developed countries. I’m awed by the fact that for many years in my interaction with teachers on Edtech I’ve been stressing planning lessons with the aim of catering for various learning styles, especially during lesson activities.
