The world of education seems to be discovering for the first time that humans can produce coherent discourse. This is the last edition of The Daily Devil’s Dictionary for 2018.
Traditional educational practices have been under fire for some time. But what is the miracle technique we can now imagine to replace them?
According to the British Psychological Society’s publication, Research Digest, it’s something called self-explanation, which it describes as “a powerful learning technique,” citing a “meta-analysis of 64 studies involving 6000 participants.”
Here is today’s 3D definition:
Self-explanation:
The attempt of a learner to mimic a teacher in a classroom
Contextual note
Although the shifting sands of knowledge have been blown in all directions by the winds of what we might call “technology-induced educational climate change,” the practice of teaching and learning in classrooms and the philosophy that underlies it haven’t changed a great deal. The technology that has now invaded everyone’s lives has nevertheless pushed the world of education to realize that its traditional, fundamentally inert model of teaching and learning needs some serious refreshing.
In the traditional model, learners passively consume a body of received and formally structured knowledge and wisdom conveyed to them by authorized books via teachers. The teachers are usually assumed to be authorities themselves or at least faithful transmitters of authoritative discourse. To achieve a recognized result in the form of grades and diplomas, learners compete amongst themselves to prove their capacity to duplicate the discourse they are expected to absorb. This means that teachers spend a lot of time explaining things while assuming that their explanations are sufficient to provoke understanding.
Alas, human understanding rarely conforms to that model. Worse, the model can produce serious misunderstanding when, for example, the knowledge presented becomes fragmented and the deeper links and associations within the content taught remain unperceived and undeveloped.
But is self-explanation the answer? The article establishes that allowing learners to explain things to the best of their ability represents at the very least a methodological break from tradition, which is unquestionably true. But to make this work it would presumably require radically revising the whole psychological basis of teaching.
Our own research in teaching and training, drawing on the tradition of constructivist pedagogy, shows that self-expression — rather than simply self-explanation — is an essential ingredient of deep learning. Self-expression goes beyond explanation: it implies dialogue and research between multiple “selves” expressing themselves. This recognizes the importance of learning as a collaborative activity, fostering the capacity to understand the link between constructed discourse and the knowledge that is the object of the discourse, as well as its underlying logic, explicit and implicit.
This highlights the essential divergence between the traditions. The current industrial model promotes competitive learning, constantly assessing the comparative accomplishments of individual learners. Collaborative learning encourages the growth of a culture of understanding and practice among groups of learners.
Do these creative educational thinkers really feel it necessary to invent a rather cumbersome and ambiguous term — self-explanation (easily confused with self-justification) — to draw our attention to what has always been obvious in learning environments? Isn’t that how Socrates “taught” his philosophy, through mutually explanatory dialogue, which includes the permanent challenge to refine understanding and find the best ways of formulating it?
Historical note
The article reveals some of the terrifying fallacies that underlie traditional pedagogical thinking when it tells us: “The researchers checked and the benefits of self-explanation are not due to the technique simply leading to more time being spent in study.”
Let’s call this the myth of time spent. Education systems across the globe have sought to define the optimal number of hours of study, per week or in total, required to learn anything. Language schools typically promise to make customers fluent in 60, 90 or some other imaginary number of hours of study. And Malcolm Gladwell famously stepped in to claim — impressively but somewhat mistakenly — that anything well learned requires 10,000 hours of practice. Measuring learning outcomes by time spent makes sense for a culture that quantifies everything. It makes a lot less sense where serious learning is concerned.
The recent history of educational evolution has been contaminated by another trend that the article uncritically alludes to. For many digital educators, teaching essentially means reformatting knowledge into a system of choices (like an algorithm), the simplest (and stupidest) of which is the multiple-choice question. It makes things easy for those who believe that learning is essentially about distinguishing right and wrong facts or formulations.
This encourages the further fallacy that knowledge can be reduced to a series of single right answers. But learning, whether it’s mathematics or painting, always entails phenomena such as the perception of systemic logic, constructing and deconstructing cultural patterns of association and articulating ideas and relationships.
On this subject, the article summarizes one observation in the study: “They found some limited evidence that a multiple-choice format (in which the student chooses a range of explanations from a list) to be the least effective” [sic]. Apart from the mangled syntax, the point is so obvious — that multiple-choice is in fact the opposite of self-explanation — as to be not worth mentioning. But the entire field of education seems to be struggling with the idea that digital technology has imposed multiple-choice logic as the fundamental tool of learning.
So, here’s a final binary question: Is this the best we can expect from serious studies?
*[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news.]
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.