Heads We Win. Oops, It’s Tails.

Tom Flynn

Andy Norman, Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think, with a foreword by Steven Pinker. (New York, NY: HarperWave, 2021. ISBN 978-0-06-300298-2.) 397 pp. Hardcover, $29.99.

Brian T. Watson, Headed into the Abyss: The Story of Our Time, and the Future We’ll Face. (Swampscott, MA: Anvilside Press, 2019. ISBN 978-0-578-59411-8.) 294 pp. Paperback, $13.00.

These two books—which, as it happens, I read consecutively—offer opposing perspectives on the defining dilemma of our age. Faced with unprecedented and clearly existential crises—pandemics, climate change, social division, an explosion in mystical thinking, and so much more—a civilization with a fair past record of rising to its challenges now seems helpless. Are there solutions, or does night loom for us all?

Start on the Sunny Side

Let’s turn first to Andy Norman’s Mental Immunity. Richard Dawkins popularized parallels between clear thinking and human biology in his 1976 best-seller The Selfish Gene. There he introduced the notion that ideas perpetuate themselves in a genetic fashion, captured in the coinage meme. Mental Immunity may be the most audacious book to tread this path, this time linking clear thinking with the functioning of the human immune system.

A philosopher and humanist activist at Carnegie Mellon University (and a prolific Free Inquiry contributor), Norman charges that a deep-seated error in Western thought muddies our views of truth and hobbles the teaching of critical thinking. Norman invites us to view his critique not as a work of philosophy but as one in cognitive immunology. “Bad ideas are parasites,” he declares. “Not ‘analogous to parasites’ or ‘metaphorical parasites’ but actual parasites.” The good news is that “the art of bad idea removal is on the verge of becoming a science.”

Norman is not shy about his work’s potential benefits:

Imagine a world where cognitive immunologists design interventions— “immunotherapies”—that restore mental immune health. Where mental immune “boosters” and “mind vaccines” prevent epidemics of partisan thinking. Where people think more clearly, reason more collaboratively, and change their minds when reasons show them to be in the wrong. Imagine reason-giving dialogue flourishing and breaking down ideological barriers. Imagine “fixed” mindsets unfurling, like flowers freed from frost.

Some readers may find this too bold, but Norman manages to deliver on some, if not all, of his immunological rhetoric. And he does it while conducting a masterful exploration of epistemic philosophy. Many secular humanists will welcome his powerful refutation of intellectual relativism—to say nothing of his surprising demonstration that, contra what most of us think we know about Hume, one most certainly can derive an “ought” from an “is.” (The doctrine that one can’t, often associated with the label naturalistic fallacy, was long criticized by Free Inquiry founder Paul Kurtz. Kurtz’s humanism was rooted in ethical objectivism; I think he would have welcomed the arguments Norman offers here.)

A Reasonable Submission

Sound reasoning constitutes cognitive health, and for Norman that begins with “an act of submission, a willingness to follow good reasons where they lead. Negate that submission via an act of will, and thinking begins to go haywire.” Specifically, when you treat reality as you wish it were, not as it is, havoc will follow.1

Norman mounts a sweeping attack on intellectual subjectivism. “The idea that values are fundamentally subjective is profoundly dysfunctional,” he accuses. In a chart, he presents “Six Immune-Disruptive Ideas [his emphasis] that together give our culture an acquired immune deficiency”:

  • Beliefs are private.
  • Everyone’s entitled to their opinion.
  • Values are subjective.
  • I have no standing. (“Who’s to say?”)
  • Basic commitments can’t be rational.
  • Questioning core values is intolerant.

To the contrary, he argues, beliefs that drive our actions are inherently public and need to be subject to debate and scrutiny. No one is “entitled” to an opinion that’s demonstrably wrong. Values are objective: some are manifestly more conducive to human flourishing than others. If beliefs are publicly debatable, then no one lacks standing to examine them. Even our most basic commitments—and our reasons for holding them—are subject to public review (this in some ways restates the first point on this list). And there’s nothing intolerant about questioning core values; since they influence action so profoundly, we daren’t leave them unchallenged. Hear, hear.

Readers who know their American philosophy will guess what’s coming next: the epistemological debate between William Kingdon Clifford and William James. Clifford, the short-lived British mathematician and philosopher, is renowned for his bombastic 1877 essay “The Ethics of Belief.” It proposed a demanding ethics of knowledge, later called “evidentialism”: “It is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence.”

Clifford’s essay struck with such force that, nineteen years later, American philosopher-psychologist William James still felt obliged to respond with a lecture titled “The Will to Believe.” James described it as “a defense of our right to adopt a believing attitude in religious matters, in spite of the fact that our merely logical intellect may not have been coerced.” Because James was renowned as an early pragmatist, his approach to epistemology came to be labeled “pragmatic” as well.

Clifford and James were published together for decades and studied by generations of students. But Clifford has fallen into obscurity, while (in Norman’s words), “James’ thesis … has become a kind of meta-ideology: an all-purpose excuse for indulging in wishful thinking.”

Norman compares Clifford’s and James’s methods, sometimes placing a heavy thumb on the scale in favor of James. But it scarcely matters, because Norman concludes that neither Clifford nor James got it right. Instead, it is the American philosopher Charles Sanders Peirce who brought the evidentialism-pragmatism debate to its healthiest resolution, asking not “What should we believe?” but rather “By what method should we fix our beliefs?”

Peirce was urging us to subject beliefs both to traditional epistemic testing and to pragmatic how-well-do-these-beliefs-serve-us testing, thereby making idea appraisal more rigorous. Jamesian principles, moreover, imply that if a belief works for you, then it's true for you. Peirce, though, understood that you can't relativize truth in this way—not without subverting collaborative inquiry. He grasped that selfish and unaccountable believing plants the seeds of ideology and breeds irreconcilable differences. James appears not to have grasped these things.

It is to society’s detriment that of Clifford, James, and Peirce, it was James and his promiscuous epistemology that proved widely popular.2 “To date,” Norman writes with regret, “we’ve not achieved an understanding of basic belief that is tenable, well functioning, and wisdom conducive. Or, for that matter, conducive to cognitive immune health.”

What stopped us? The eleventh chapter of Mental Immunity gives a compelling answer.

Damn Those Greeks!

In a tour-de-force of philosophical historiography, Norman argues that Western epistemology went astray in ways that continue to empower dysfunctional thinking and make it unnecessarily difficult to teach effective critical thinking skills. He ascribes this profound error, naturally enough, to the Greeks.

Socrates demonstrated all too well that any dogmatic pronouncement could be challenged. “[T]he Socratic model makes skeptical debunking trivially easy, and conclusive demonstration all but impossible. It rigs the reasoning game in favor of challengers, making it unduly hard to secure sensible claims.” In other words, the Socratic method gives skeptical interlocutors so great an advantage that it licenses a bottomless skepticism. “[A] mental immune system that relied on the Socratic standard would be hyperactive.” (Again, one sees parallels with Paul Kurtz, who criticized the “nihilistic skepticism” proceeding from Socrates in his 1992 book The New Skepticism.)

Recognizing this shortcoming, Plato sought to provide for reliable positive knowledge by focusing on the verities of mathematics. Mathematical truths can be proven incontrovertibly. What’s more, new truths can be erected atop the foundation of those already proven. Can we approach morality and ethics in a similarly indisputable way? Plato thought so, but it turned out that non-mathematical propositions must be evaluated according to a criterion of reasonableness that admits an infinite regress of justifications. Plato’s model

overgeneralizes, and teaches us to demand a warrant for every belief and claim. This makes it like an autoimmune trigger: conducive to hyperactive immune response. If the body’s immune system attacked microbes indiscriminately, it would quickly kill us; how then can a mind that challenges ideas indiscriminately be healthy?

Western thought never resolved this dilemma. Philosopher after philosopher tried. Aristotle sought to stem the infinite regress of objections by appealing to first principles. In the thirteenth century, Thomas Aquinas sought to do it by requiring that first principles be underwritten by faith—a move disastrous for free inquiry. Next came Descartes, hoping he’d found an unchallengeable assertion in the self’s awareness of itself (“Cogito, ergo sum”). More important, he introduced a “gravitational metaphor” into epistemology that has vexed thinkers ever since: “the idea that knowledge needs a foundation only makes sense if, absent support, beliefs gravitate to their demise.” For their part, Locke and especially Hume sought to end the “endless search for stable grounding” by appealing to perception and the facts it reveals. But empiricism has its own problems, starting with the fact that our perceptions can be in error. Out of this confusion rises the unhealthful modern-day notion (hinted at by Hume and brought to full flower by G. E. Moore) that fact and value are “wholly distinct realms … separated by a yawning chasm.” Here lies the corrosive notion that an is can never justify an ought. (Here too, though Norman doesn’t mention it, lies the absurdity of Stephen Jay Gould’s notion of “non-overlapping magisteria.”)

Look, we’ve wound up back at Clifford’s evidentialism. To seek warrant for beliefs, Clifford counsels, we should look not just to perceptions but to evidence. Yet in our own time evidentialism faces a “strident backlash … capable of reversing centuries of progress,” Norman warns. Displaying a disapprobation for evidentialism that his earlier discussion only hinted at, Norman now associates that stance with a nihilism so toxic that even many evidentialists recoil from it, resorting to a sort of moral double-entry bookkeeping under which moral judgments have no evidential basis but are nonetheless strongly held on grounds of feeling. “No wonder people are anxious about evidentialism,” he grumbles. “Apparently, it saddles us with the conclusion that nothing really matters, not even ourselves. For many (me among them), this is too much to stomach. … For all its virtues, evidentialism is a formidable mental immune disruptor.”3

Another disruptor whose inclusion Free Inquiry readers will find unsurprising is religion. “[R]eligions appear to play a key role in weakening cognitive immune systems. … Religiosity correlates with anti-intellectualism and political intolerance.” Yet this critique is circumscribed; Norman would reform religion, not expunge it. “Imagine the world’s religions re-formed as communities of inquiry. Imagine them dispensing healthy attitude adjustments, unencumbered by magical thinking and baldly improbable metaphysics.” (The obligatory nod to John Lennon and Yoko Ono appears on the following page.) Norman may imagine faith communities developing in this way, but is it likely? Historically, churches that jettison magic and metaphysics seem to cede much of their emotional salience. They tend to suffer membership declines and find their appeal limited to boutique audiences. (In the United States, this fate has claimed most mainline and liberal Protestant churches.) I can’t help thinking that when religion is free to be itself and pursue its own (often mystical) interests, it’s far more likely to be part of the problem than the solution.

Summing Up

At the end of Mental Immunity’s epic journey, what is the way forward? “[W]e need to help everyone build an identity,” Norman proclaims, “where the will to find out predominates over the will to believe.” He sets forth an intriguing principle: “[P]resumptive claims are effectively immune to bare challenge.”4 They’re immune to bare challenge because (by definition) the burden of proof is on the challenger, and in such cases a bare challenge shouldn’t suspend entitlement.

Norman prescribes an immunological “serum” he calls the New Socratic Model: “[Y]ou can sometimes ward off a challenge without answering it. There are moves in the game that instead allow you to show that a challenge doesn’t need answering.” Norman calls such moves “indirect defendings,” and, modest as they seem, he views them as a conclusive answer to the helplessness in the face of radical skepticism that afflicted Socrates, Plato, and all their many successors. At last, Norman can present his long-awaited mind vaccine: “A belief is reasonable if it can withstand the challenges to it that genuinely arise.”

Norman challenges us to accept that a single, foundational change in the way most of us approach epistemology can cut the Gordian knot of our era’s dilemmas. Surely Norman has arrived at a useful refinement for the teaching of critical thinking. But is that enough to accomplish what such existential crises as climate change and societal standoff demand? There’s quite a gap between “useful” and “this will save the world,” and though I’m deeply intrigued, I’m not convinced that Norman has bridged that gap.

Still, Norman delivers an appeal near the end of the book that will speak to something deep in us all:

Which tale would you rather our descendants tell? The tale of how we clung to comforting beliefs and wrung our hands as unhinged thinking spread across the Internet, tearing cultures apart? Or the tale of how our generation got serious about mental immune health, developed and administered a mind vaccine, and brought about a second Enlightenment?

Time for the Dark Side

Which brings us to architect, cultural critic, and three-time FI contributor Brian T. Watson’s Headed into the Abyss. The book was published late in 2019, shortly before COVID-19 brought society to a standstill; the pandemic arguably denied it much of the attention that it might have been due. Bar none, it’s the most harrowing and comprehensive account I’ve yet read of how (paraphrasing Norman) we will cling to comforting beliefs and wring our hands as unhinged thinking spreads across the internet, tearing cultures apart. On Watson’s telling, it’s too late for any other outcome. It has been for decades.

Watson’s message was anticipated on the final page of Stephen Emmott’s telegraphic overpopulation jeremiad, Ten Billion (2013):

We urgently need to do—and I mean actually do—something radical to avert a global catastrophe. But I don’t think we will.

I think we’re fucked.

What Emmott said in stark white type on an all-black page, Watson spells out in merciless detail:

Our prospects for the future—say, 2030 to 2080—are very grim and the book analyzes how we would respond to the biggest forces and realities facing us if we were inclined to or able to. But because we will not be able to respond either fully or optimally, or in a sufficiently timely manner, I explain why our responses will be inadequate. … [O]ur prospects are bleak, I don’t see us fixing critical things that need to be fixed.

Abyss isn’t a book about overpopulation or climate change or dwindling resources. It’s about … well, everything. Watson examines ten primary forces: “capitalism, technology, the internet, politics, media, education, human nature, the environment, [and] human population.” He looks back nearly a half-century, finding in such developments as Ronald Reagan’s assault on the air traffic controllers’ union and the savings-and-loan crisis of the eighties direct precursors of the currents tearing society apart today. Capitalism is but the first of the ten forces he examines, and he sees it as moving society the wrong way on countless fronts: deregulation, the move from a near-lifetime employment model to a gig-economy model, the replacement of community-based in-person shopping by “placeless” online consumerism, and the capture of sector after sector by a handful of mammoth corporations. All these elements have elevated the very rich while alienating the poor and near-poor.

Then factor in the other nine forces.

“Today,” he writes, “the forces are working together in ways that foreclose effective intervention … [a]nd they are working on a planet that for the first time has eight billion people on it, and in a biosphere that has never been in worse shape.” Watson singles out the internet and the subculture to which it has given rise with special vehemence:

Perhaps the easiest way to grasp the futility and delusion of hope is to think about the internet. Because of its dynamics, we will remain unable to address our problems as long as it and social media exist. But what chance is there of turning off the web, eliminating connectivity, shunning social media, and halting the advance of artificial intelligence and data collection? There is no chance.5

“The danger that we face,” Watson concludes, “is nothing less than the unraveling of civilization across the planet.” He predicts that “it will be environmental chaos that spells the end of civilization as we know it” and that catastrophe “will occur in this century.” Watson’s chilling advice to readers is to take stock of the things that make life commodious—dependable power, abundant food and water, bearable weather, at least the appearance of civil order—and simply appreciate them, recognizing that few among us (and even fewer among our descendants) will see their like much longer.

I put down Headed into the Abyss with a dark sense that Watson is right. Humanity has committed so many grievous errors over the past half-century, in so many interdependent domains, that it’s hard to imagine we’re somehow going to save ourselves. (If anything, my review fails to capture the sheer, panoramic inevitability of Watson’s bleak account.)

Meanwhile Norman’s epistemological insights are powerful, but I find it difficult to accept that simply teaching a new way to think about truth, however brilliantly, can actually save humanity’s bacon.

Both books are eminently worth reading for the clarity and power with which each author pursues his respective vision. Enjoy them while we still have light to read by.

[1] Norman calls this ideology, a word he defines in solely negative terms—“an interlocking system of bad ideas.” He discourages using the word, as some do, as a neutral synonym for concepts such as worldview.

[2] Norman notes that part of the impact of James’s views reflected the license they gave for a later reinterpretation of religious beliefs based on their social benefit rather than their truth.

[3] Wow. Allow this contented evidentialist to lodge an objection. Yes, evidentialism leads to a certain form of nihilism, but I for one actually take comfort in the deflation of the importance of oneself and one’s life that it provides. To my mind, a rigorous secular humanism is best lived on the edge of the abyss. See my essay “The Big ‘M,’” FI, June/July 2007.

[4] A bare challenge is one that advances no further proposition but simply issues a demand such as “But why?”

[5] For more on this, see Watson’s article “The Internet, the Virus, and Reason” (FI, June/July 2020).


Tom Flynn (1955-2021) was editor of Free Inquiry, executive director of the Council for Secular Humanism, director of the Robert Green Ingersoll Birthplace Museum, and editor of The New Encyclopedia of Unbelief (2007).

