How to Develop Immunity to Nonsense: Lessons From a New Science

Like the body, the mind has evolved its own immune system against problematic ideas. How can you make yours stronger? Be aware of your own fallibility. With others, be charitable, yet inquisitive. Commit to the truth, instead of giving in to lazy relativism. Let arguments change your opinions, not the other way around.

December 16, 2023 04:31 EDT

In less than a generation, we’ve managed to build an utterly bewildering information environment: social media. With nothing more than a tap on a screen, 5.3 billion of us can now plunge headlong into a swirling ocean of “viral” content.

Previous generations also contended with misinformation. False narratives, malicious gossip and political spin are as old as time. Before the advent of the scientific method, everyone marinated in a rich stew of fairy tales, myths and superstitions. Like us, our ancestors trusted quack cures and fell for conspiracy theories.

But now, quite suddenly, we find ourselves in a brave new world, one riddled with rabbit holes and confounded by clickbait. We have fake news and flame wars, cancel culture and contested speech norms, echo chambers and “alternative facts.” We’ve seen culture warriors weaponize Facebook and Twitter, science denial grow into a lucrative business, and conspiracy theories mutate into monstrous forms (QAnon). Is it any wonder so many of us are lost?

We all see others taken in by the BS. We think, “Children get faked out by fairy tales, but I’ve outgrown them. Voters are bamboozled by propaganda, but that’s them, not us. Followers of other religions are misled, but I practice the true faith.” Eventually, though, the more thoughtful among us think to ask pivotal questions. “Am I really so exceptional? Or am I, too, being played? Would I know if I was? Do I really know what I think I know? What misconceptions do I harbor?”

They say there’s a sucker born every minute, but in truth, we’re all born suckers. We’re fairly gullible by default, probably because our ancestors had to learn rapidly when young. That’s why children believe in the tooth fairy. The problem is that, even as adults, we remain strangely susceptible to evolved forms of nonsense; without guidance, we stay lost.

An unusual few, though, exhibit what I call “deep immunity.” These folks think differently. Somehow, they ward off troublesome information with ease and exhibit uncommonly sound judgment. They cultivate mental habits that can grow into something we could all use more of these days: the precious trait called wisdom.

But how do we cultivate these habits? Half a lifetime ago, I began studying the matter in earnest. Decades of research led me to an astonishing, transformative, but almost unknown fact: The human mind has an immune system of its own. Just as the body has a system for spotting and neutralizing infectious microbes, the mind has a system for spotting and shedding infectious ideas. So I wrote a book about it. The book helped launch an upstart science — what we in the business call cognitive immunology.

The field illuminates the workings of the mind’s defenses. It explains why these defenses sometimes break down and how we can fortify them against corruption. Critical thinking (CT), it turns out, is at best a haphazard approach to achieving misinformation immunity. CT is not enough. The good news? Outbreaks of viral nonsense are not inevitable. “Infodemics” can be prevented. The trick is to apply the science and proactively cultivate mental immunity.

In what follows, I distill an ocean of research into four actionable steps. Taking them should give your mind’s defenses an immediate boost. I hope, though, that the guide will also spur more ardent, long-term striving. If we keep these guidelines in mind and work patiently toward mastery, we can all grow substantially wiser.

Step 1: Shift your reference frame

Many of us default to a certain outlook on information. I call it the info consumer frame (or ICF). On this view, the infosphere is like a marketplace. (Hence the metaphor “marketplace of ideas.”) We humans are like shoppers: We browse the aisles looking for ideas that strike our fancy. Ideas are assumed to be like products that sit obediently on shelves. Meanwhile, our minds are like shopping carts — passive containers for the mental stuff we acquire. Learning is fundamentally a matter of filling your cart with information commodities; the ideas we “buy” into become our beliefs, and everyone is entitled to believe what they like.

This frame is pernicious. It breeds a sense of cognitive entitlement, exacerbates preexisting biases and obstructs higher-order thinking. In the information age, it is proving especially dangerous. Its influence is seen in the polarization that threatens the world’s democracies and the ideological entrenchment of today’s culture warriors. Whenever propaganda goes viral and incites unruly mobs, the invisible hand of the ICF is at work. In our time, we’re called to rethink this prevailing understanding of our relationship with ideas.

Happily, an alternative is taking hold in the sciences. Here’s the gist: Ideas are more like microbes than groceries. Bundles of information obey an evolutionary logic: The “fittest” tend to find hosts, survive and reproduce. Our minds host some that are good for us and others that are bad for us. Good ideas (roughly, true and useful ones) amount to mind-symbionts and bad ideas (the false or harmful ones) amount to mind-parasites.

Sometimes the latter — “infobugs” — proliferate at our expense. For example, beliefs about witchcraft have incited moral panics (Salem), extremist ideologies have inspired terrorist attacks (9/11) and fake news has galvanized sedition (the January 6 US Capitol attack). Some infobugs even induce us to spread them. Think of the clever but misleading meme that gets you to share it, or the religious notions that inspire proselytizing. Just as a virus can hijack a cell for its purposes, an ideology can hijack a mind for its “purposes.”

Call this the microbial ecosystem frame. Minds are not passive receptacles. They’re active, infection-prone contraptions cobbled together by natural selection. In fact, our minds co-evolved in a rich stew of ideas, many of them prone to replicate in spite of our best interests. Crucially, every one of us is susceptible to mind-infections. In fact, every one of us is infected. We play host to countless infobugs. Misconceptions, false assumptions, overgeneralizations, limiting beliefs, crippling doubts — all of these are, in a very real sense, mind-parasites. Minds teem with them, and precautions must be taken to keep them from running wild.

This frame has a key implication: Each and every one of us has a lot to learn — and unlearn. Much of what we think we know doesn’t truly amount to knowledge. Admit this, embrace the consequent humility, and you take an important step toward deep immunity.

Step 2: Have standards

We need shared cognitive standards. Otherwise, our beliefs become arbitrary. Opinions diverge, ideologies harden, and worldviews become irreconcilable. Historically, it works like this: Excuses that license irresponsible talk spread, sowing the seeds of mental decadence. Then, unaccountable talk proliferates, belief systems diverge and societies succumb to mistrust, division and conflict.

The outbreaks of irresponsible thinking in our time can be traced to ideas like these: “Our beliefs are fundamentally private and no one else’s concern”; “Everyone is entitled to their opinion”; “Values are fundamentally subjective”; “Articles of faith should not be questioned”; “Criticism is tantamount to the policing of thought.” A related idea — that “no one has standing to uphold standards” — is conveyed by the sneaky rhetorical question, “Who’s to say?”

Philosophers call this nexus of ideas “relativism,” and intellectual historians know that the appearance of such ideas presages periods of turmoil and civic decline. Why? Because they weaken the centripetal pull of objective evidence. Without reality-based cognitive standards, “the center cannot hold,” and “mere anarchy is loosed.”

Cognitive immunologists classify relativistic ideas as mental immune disruptors. People employ them to evade accountability norms. This subverts those norms, leading to cognitive dysfunction. Shedding the disruptors is thus one way to build mental immunity. If you haven’t already done so, I suggest renouncing the ones in quotation marks above.

Try this also: Apply the Golden Rule to the life of the mind. Ask yourself what cognitive standards you would have others observe, then hold yourself to those same standards. Want others to be honest? Be honest yourself. Want others to be fair-minded and persuadable? Make yourself fair-minded and persuadable. Are you troubled that others believe things they have no business believing? Then don’t believe things you have no business believing. Apply the “Law of the Gospels” to the world of information and — voila! — you get a rich and beneficial ethics of belief.

Norms of accountable talk are the cornerstone of human civilization. When they are generally observed, constructive means exist for resolving conflicts, and everyone benefits. When bad actors defy these norms, it chips away at the trust that makes cooperative living possible. Imagine a world where decayed norms of accountable talk afford no protection against malicious accusations. Imagine a rival employing unfounded allegations to get you locked up. You’d have no recourse. If that’s not the world you want, help strengthen the norms of accountable talk.

Also, dump the idea that it’s enough to have a good reason for whatever you want to do or believe. You can manufacture a serviceable reason for anything, so that standard is too lax. (I call this the Platonic standard, because it occupies center stage in two Platonic dialogues.) This standard encourages wishful thinking and rationalization. It also exacerbates confirmation bias.

The antidote is the Socratic standard: beliefs and decisions should be able to withstand tough questioning, including the objections of those who disagree. Standards like this give us a mechanism for resolving our differences with words. They also bring the defects of troublesome ideas to light and help us shed them. The true test of responsible belief is not, “Can I find a reason for this?” but, “Can it withstand questioning?”

You know how we update our antivirus software to protect our computers from the latest digital pathogens? We need to do the same with our brains. Here’s how. Learn how bad actors “hack” minds: how they play on fears, encourage wishful thinking and float seductive conspiracy theories. How they weaponize doubt, cultivate cynicism and compromise mental immune systems. Build your mental library of mind-viruses, fallacies and mental immune disruptors, and you’ll spot manipulative information more easily.

Step 3: Practice basic cognitive hygiene

Many of us dislike uncertainty, so we “tune out” our doubts. But cognitive immunology explains why this is a grave mistake. Doubts are quite literally the antibodies of the mind. The mind generates them to fight off problematic information. Learn to listen to them. Often, they’ll draw attention to an idea’s defects, thereby reducing the risk of mind-infection. Better yet, befriend your doubts: learn to enjoy their company and reap the benefits of next-level BS-detection.

Your mind also generates reasons. Sometimes, it does this to rationalize what it wants, but more often, it does this to draw your attention to a consideration that really does count for or against something. A basic principle of cognitive hygiene, then, is to give good reasons their due. Whether they count for your position or against it, credit them. Let them change your mind. (In practice, this often means letting them nudge your degree of confidence in something up or down a bit.)
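The advice to let good reasons nudge your confidence up or down has a well-known formal counterpart: Bayes’ rule, which says how much a given piece of evidence should shift your degree of belief. The article doesn’t prescribe this formula; the sketch below is just an illustrative analogy, with made-up numbers.

```python
def update_confidence(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: revise confidence in a claim after seeing new evidence.

    prior               -- your confidence in the claim before the evidence
    p_evidence_if_true  -- how likely this evidence is if the claim is true
    p_evidence_if_false -- how likely this evidence is if the claim is false
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# You start 70% confident in a claim, then meet evidence that is twice as
# likely to appear if the claim is false as if it is true.
posterior = update_confidence(0.7, 0.3, 0.6)
print(round(posterior, 2))  # 0.54 -- confidence drops, but not to zero
```

The point mirrors the prose: a good reason doesn’t have to flip your view outright; it just has to move your confidence by an appropriate amount.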

Willingness to yield to “better reasons” is the very heart of rational accountability, so submit to each and every relevant consideration that comes along. Often, there are important considerations on both sides of an issue; when this happens, reject the myopic foolishness of “either…or” and embrace the wisdom of “both…and.” Grown-ups can credit competing considerations.

You can strengthen the “muscle” at the core of your mind’s immune system by habitually yielding to evidence. Simply allow evidence to shape your outlook and your mind’s immune system will grow stronger. Push back against evidence (that is, defy reason on behalf of a favored position), and it will grow weaker. The research on this is, I believe, conclusive: Even small concessions to willful belief damage the mind’s immune system. By all means, be resolutely hopeful, but renounce willful believing.

When exploring contentious topics, it’s also important to sideline your identity. Here’s why: When people hitch their identity to a set of views, a phenomenon called “identity-protective cognition” kicks in. They begin experiencing legitimate challenges as threats, and they overreact. More precisely, the mind’s immune system overreacts. Immunologists call such immune-system overreactions “autoimmunity,” and autoimmune disorders of the mind exist too: when mere words trigger a heated response in you, you’re experiencing an unhealthy autoimmune reaction.

Good cognitive hygiene also requires that you practice subtractive learning. Most of us think of learning as adding to the mind’s knowledge stockpile. But it’s equally important to subtract out the stuff that doesn’t belong. Notice inconsistencies in your beliefs and take time to address them. (Usually, this means letting go of one of the inconsistent beliefs.) Fail to do this and inconsistencies will accumulate; your belief system will grow increasingly unreliable, and your capacity for sound judgment will degrade.

Step 4: Mind your mindset

It’s easy to slip into a mindset that compromises mental immune function. If you’re too trusting, a lot of bad information will get past your filters; if you’re unduly suspicious, good information will get caught in those same filters. You can be too gullible, but you can also be too cynical. You wouldn’t know it from all the emphasis we place on critical thinking, but you really can be too critical for your own good.

Critical thinking is mostly a fine thing, but the combative attitude of a culture warrior is corrosive of mental immune health. Culture warriors fixate on points that can be wielded as weapons against “them” — and become blind to considerations that weigh against “us.” Treat the space of reasons as a battlefield and you’ll develop an acute case of what psychologists call “myside bias.” This can fatally compromise your mind’s immune system. That’s why partisan zeal unhinges minds.

To avoid this fate, be curious, not critical. Maintain a collaborative spirit. Treat conversation partners as collaborators. Never wield reasons as weapons; instead, employ them as pointers meant to guide attention to relevant considerations. Don’t reason to win; reason to find out. I call this mode the way of inquiry: Make it your default mindset and, over time, you’ll achieve something akin to wisdom.

Of course, we do need to test each other’s ideas. Our mind-infections are largely invisible to us, so we need the help of others to spot them. It doesn’t help, though, if conversational idea-testing becomes contentious. Then, pride and fear interfere with falsehood removal. Two habits of mind can help here. First, think of challenges as opportunities, not threats. They’re opportunities to unlearn and should generally be welcomed. Master this, and you won’t overreact to cognitive conflict.

Second, convert your objections into clarifying questions. Even if the view at issue seems unworthy, approach it as something worth understanding. Show genuine interest. Be curious and patient. If the claim in question is problematic, ask for help understanding it. Do this, and the claimant will often discover its problematic qualities for themselves. Once you’ve won a person’s trust, you can place countervailing considerations alongside their reasons — “This is true too, right?” — but let them weigh up the pros and cons. And let them draw their own conclusions.

So there you have it: a four-step guide to developing mental immunity. To sum up: (1) shift your reference frame, (2) uphold standards of accountable talk, (3) practice sound cognitive hygiene and (4) mind your mindset. As you weed out misconceptions and replace them with understanding, your immunity will deepen. You’ll become less prone to mind-infections. As those around you do the same, they become less likely to infect you.

We can build herd immunity to cognitive contagion. Imagine a world where outbreaks of unreason are routinely nipped in the bud, where truculent ideologies are easily dissolved and pointless partisanship no longer frustrates human aspirations. Can we evolve such a world? Absolutely. With cognitive immunology to light the path, each of us just needs to do our part.
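The herd-immunity analogy can be made concrete with a toy epidemic model. The sketch below is my illustration, not the author’s; the numbers (a population of one million, a “reproduction number” of 2.5) are arbitrary assumptions. It shows the standard epidemiological result: once the immune fraction exceeds 1 - 1/R0, an outbreak fizzles instead of spreading.

```python
def outbreak_size(population=1_000_000, immune_fraction=0.0, r0=2.5):
    """Deterministic toy model of a spreading 'mind-virus'.

    Each case infects r0 * (susceptible share) new people per step.
    The outbreak dies out once that effective rate drops below 1.
    Returns the fraction of the population ultimately infected.
    """
    susceptible = population * (1 - immune_fraction)
    cases, total = 1.0, 1.0
    while cases >= 1.0 and susceptible > 0:
        new_cases = min(cases * r0 * susceptible / population, susceptible)
        susceptible -= new_cases
        total += new_cases
        cases = new_cases
    return total / population

print(outbreak_size(immune_fraction=0.0))  # most of the population infected
print(outbreak_size(immune_fraction=0.7))  # above 1 - 1/2.5 = 0.6: almost no one
```

The moral matches the paragraph above: nobody’s filters need to be perfect. If enough individuals resist a bad idea and decline to pass it along, it cannot find enough susceptible hosts to sustain an outbreak.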

[Anton Schauble edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.


