Researchers must treat their findings with ruthless objectivity to avoid dreaded “cargo cults.” And educators should consider doing the same.
Back in 1960, the great British filmmaker David Attenborough visited Tanna Island in the South Pacific to document the lives of the islanders. What he witnessed was both fascinating and bizarre.
Prior to the 1940s, the islanders had lived largely in splendid isolation, used stone-age technology, and fed themselves through subsistence farming. With the onset of WWII, that isolation evaporated as the Pacific became a major theater of war.
Tanna became a base of operations for the Allied forces and the island was quickly overrun with soldiers. For the first time, the islanders saw tinned food, chocolate, rifles, modern textiles, Jeeps, and propeller-driven aircraft. The soldiers traded some of this cargo for the cooperation of the islanders.
When the Pacific war ended in September 1945, the soldiers left the island and returned to their previous lives. The islanders of course remained, but for them life without the cargo was wanting.
Soon they began to strategize how they could make the cargo return. They thought hard about the behaviors and rituals of the soldiers and built what we might call a theory of change. They analyzed their current situation, compared it to that of the soldiers, and built an explanatory framework to theorize why the soldiers had access to the delicious cargo, while they did not.
The islanders concluded that the soldiers must have had very powerful ancestors, and that it was these ancestors who bestowed the gifts, which were sent from the spirit world. Ergo, if the islanders replicated the rituals of the soldiers, the cargo would again begin to flow. They cleared runways in the jungle, made rifles and telescopes from bamboo, and crafted radio headsets from coconuts. The islanders then formed into platoons and marched up and down the runways, regularly looking up in expectation that an aircraft would land, laden with precious cargo. But nothing happened.
The logic of the islanders could not be faulted, but it was built on flawed premises. Without an understanding of capitalism, mass production, aerodynamics, and the internal combustion engine, they were not well positioned to build a strong theory of change. Instead, they focused on the features of the world that they could see and assumed that everything they could see was everything there was. When the cargo didn't come, they assumed it was because they were marching wrong, because the runway was the incorrect shape, or because their salutes were not high or long enough.
The Nobel Prize-winning physicist Richard Feynman, in his 1974 commencement address at the California Institute of Technology, drew a powerful parallel between the thought processes of the Tanna Islanders and bad science, coining the term Cargo Cult Science.
Feynman cautioned that to avoid falling into the same Cargo Cult trap as the Tanna Islanders, scientific researchers must be willing to question their own theoretical assumptions and findings, and they must be able to investigate all possible flaws in an experiment or theory. He also argued that scientists should adopt an unusually high level of honesty, especially self-honesty: much higher than the standards of everyday life.
Guarding Against Our Education Cargo Cults
We can apply Feynman's Cargo Cult concept to education. Governments around the world collectively spend in the region of US$140 billion each year on teaching resources, education technology, curriculum materials, and teacher professional development. Yet we witness with despair that much of this investment is having insufficient impact. Too much is being invested in shiny things that look good but deliver little. We call these shiny things education cargo cults, and our core message is that education cargo cults must die.
The Tanna Islanders believed in their Cargo Cult not because THEY were mentally deficient but because we are ALL mentally deficient. If any one of us tried to reason through every situation from first principles, the world would become a strange, buzzing confusion, especially when those we meet often act irrationally. So we all develop belief systems to survive in our busy world: rules of thumb, or heuristics, that "get us through" and allow us to unconsciously traverse all manner of situations. But sometimes these beliefs run contrary to good practice.
Over the last 40 years, a growing catalogue of Cognitive Biases, glitches in our human operating system, has been documented and confirmed through laboratory experiments and psychometric testing.
The research suggests that these biases afflict all of us unless we have been trained to guard against them.
In the table below, we summarize some of the inherent biases that, if left unchecked, can result in educational Cargo Cults:
| Cognitive Bias Category | Description |
| --- | --- |
| Authority Bias | Tendency to attribute greater weight and accuracy to the opinions of an authority figure, irrespective of whether this is deserved, and to be influenced by it. EDUCATION: Don't be swayed by famous, titled gurus. Carefully unpick and test all of their assumptions, especially if they are making claims outside their specific area of expertise. Be particularly suspicious of anyone who writes and publishes a white paper [!!!] |
| Confirmation Bias; Post-Purchase Rationalization; Choice-Support Bias | The tendency to collect and interpret information in a way that conforms with, rather than opposes, our existing beliefs. When information is presented that contradicts current beliefs, this can transition into Belief Perseverance, i.e. where individuals hold beliefs that are utterly at odds with the data. EDUCATION: We tend to select education approaches, products, and services that accord with our world view and will often continue to do so, even when convincing evidence is presented that our world view may be distorted. Be prepared to go against the grain and to question sacred assumptions. |
| Observer Expectancy Effect; Observer Effect; Hawthorne Effect; Placebo Effect | The tendency for any intervention, even a sugar pill, to result in improved outcomes, mainly because everyone involved thinks the intervention will work, and this creates a self-fulfilling prophecy. EDUCATION: If educational 'sugar pills' can generate positive effect sizes, then well-crafted education 'medicines' should generate a double whammy of effect plus placebo turbo boost, so opt for the latter. |
| Ostrich Effect | The tendency to avoid monitoring information that might cause psychological discomfort. Originally observed in contexts where financial investors refrained from monitoring their portfolios during downturns. EDUCATION: Collect robust and regular data from a range of sources about the implementation of new interventions and analyze it ruthlessly. Collect evidence to know thy impact. |
| Anecdotal Fallacy | The tendency to take anecdotal information at face value and give it the same status as more rigorous data when making judgments about effectiveness. EDUCATION: Do not take spurious claims about impact at face value, and do not invest in training based on participant satisfaction testimonials alone. |
| Halo Effect | Tendency to generalize from a limited number of experiences or interactions with an individual, company, or product to make a holistic judgment about every aspect of the individual or organization. EDUCATION: Sometimes the whole is less than the sum of its parts. Just because an educational support organization has world-leading expertise in area A does not mean that it is also world leading in area B. |
| Not Invented Here | Tendency to avoid using a tried and tested product because it was invented elsewhere, typically claiming "but we are different here." EDUCATION: Be open to using and adapting existing expertise. Avoid reinventing the educational wheel, unless you work in terrain where wheels are useless [you probably don't]. |
| IKEA Effect | Tendency to have greater buy-in to a solution when the end-user is directly involved in building or localizing the product. EDUCATION: Make the effort to localize and adapt tested solutions. This will generate greater emotional buy-in than standardized deployment. |
| Bandwagon Effect; Illusory Truth Effect; Mere Exposure Effect | Tendency to believe that something works because a large number of other people believe it works. EDUCATION: It might work and it might not. Test all claims carefully and don't blindly join the bandwagon to keep up with the Joneses. |
| Clustering Illusion; Cherry Picking | Tendency to remember and overemphasize streaks of positive or negative data that cluster together within large parcels of random data, i.e. seeing phantom patterns. EDUCATION: Ask yourself: are the claims made by educational researchers or service providers based on longitudinal data with a consistent long-term pattern, or on a small snapshot that could have been cherry-picked? |
| Conservatism | The tendency to revise one's beliefs insufficiently when presented with information that contradicts our current beliefs. EDUCATION: If the evidence is robust, it just might be true. There was a time when people who declared that the earth wasn't flat were burned as heretics. But test all evidence and claims carefully. |
| Courtesy Bias | The tendency to give an opinion that is more socially palatable than our true beliefs. EDUCATION: Participant satisfaction scores from training events in some cultural contexts may be a grade or more higher than the scores people would give if they were less polite. |
| Law of the Instrument | If you have a hammer, everything looks like a nail. EDUCATION: Start with the problem or 'wicked issue' you are trying to solve, and then work backwards to find the right instruments, rather than searching for nails to bang. |
| Bike-shedding | The tendency to avoid complex projects (like world peace) in favor of projects that are simple and easy for most participants to grasp (like building a bike shed). EDUCATION: Don't be afraid to go after the real problems. Build a bike shed if the world really needs bike sheds. If it doesn't, then fix what needs fixing most. |
| Sunk Cost Fallacy | Tendency to continue with a project that is not bearing fruit, simply because so much has been invested in it already and withdrawal would be an admission of failure. EDUCATION: Review the implementation of new approaches regularly and set clear kill parameters/hurdles that must be met for the project to stay live. Ruthlessly prune anything that does not pass the hurdle test. |
To overcome these cognitive biases and fallacies, educators need to develop their logical-rational skills without losing their passion for teaching. Educators need not act like Tanna Islanders. This requires developing mindframes that keep an unrelenting focus on impact and prompt educators to continually ask themselves whether they are having the greatest possible impact.
This post is an excerpt from “Educational Cargo Cults Must Die” by John Hattie and Arran Hamilton.