Our brains are incredible little mushboxes; they are unfathomably complex, powerful organs that grant us motor skills, logic, and abstract thought. Brains have bequeathed unto us humans just about every cognitive advantage, it seems, except for one little omission: the ability to adequately process the concept of long-term, civilization-threatening phenomena. They’ve proven miracle workers for the short-term survival of individuals, but the human brain sort of malfunctions when it comes to navigating wide-lens, slowly unfurling crises like climate change.
Humans have, historically, proven absolutely awful at, even incapable of, comprehending the large, looming—dare I say apocalyptic?—slow-burn threats facing their societies.
“Our brain is essentially a get-out-of-the-way machine,” Daniel Gilbert, a professor of psychology at Harvard, says in his university’s (decidedly less flashy) version of a TED talk. “That’s why we can duck a baseball in milliseconds.” That is, our brain seems to be programmed to react best to hard, certain information—threats that unfold over generations fail to trigger our reactionary instincts. “Many environmentalists say climate change is happening too fast,” Gilbert says. “No, it’s happening too slowly. It’s not happening nearly quickly enough to get our attention.”
It’s an unfortunate quirk of human psychology; it’s allowed us to outwit and outplay most other species around the globe—we’re smarter, more resourceful, more conniving—but it might also come to mean we won’t outlast them. There are currently a host of very real, very pressing, and very long-simmering crises on our plates: climate change, sure, but also biggies like mass extinction, biodiversity loss, and ocean acidification, which may take decades before they become full-blown, civilization-threatening calamities.
That’s why I’ve always bristled a bit at the post-colon header of Jared Diamond’s great book, Collapse: How Societies Choose to Fail or Succeed. What society, made up of humans capable of abstract thought, with fully developed brains, would actively choose to fail? “It’s been a good run, but seeing as how I am exhausted from all this rapaciousness and decadence, I hereby opt to Fall” -the Roman Empire.
Diamond’s work, published in 2005, before the post-Inconvenient Truth boom in climate change awareness, details the myriad ways that societies doom themselves, primarily through environmental misdeeds that should be ominously familiar to contemporary society, like deforestation, overfishing, and the ruination of farmland, as well as unsustainable social practices like slavery, over-taxation, and loss of trade partners.
The societies he examines—like that of Easter Island, which he argues perished almost solely because it gradually degraded the environment it relied upon for food and ship-making—can perhaps be said, in hindsight, to have had a choice to do things differently. But given what we know now about how the human brain is wired, it’s unclear that, even with the observable information available to slowly failing societies at the time—food stocks that seemed to be declining, fewer and fewer trees apparently available to construct shelter or vessels, farmland yields that kept on decreasing—their members would have been cognitively equipped to tackle existential threats that would prove bigger problems for their grandchildren than for themselves.
It seems perverse, but a load of evidence shows this to be true—our grey matter is set up to instruct us to cope with the here-and-now, and flails in the face of long, uncertain future threats.
In a survey of the research on how the brain processes climate change, the Guardian reports that psychologists, neurologists, and social scientists alike have demonstrated that our brains are programmed to respond to immediate, or “reliable,” inputs—the slow, gradual rise of global temperatures, accompanied by a host of difficult-to-predict impacts, is perhaps the antithesis of what we’re designed to react swiftly to. The oft-cited “marshmallow study,” in which children were shown to be willing to make sacrifices in the present for greater rewards (more mallows) in the future only if they knew for certain that the reward was forthcoming, that it was “reliable,” is evidence of said behavior. Kids who weren’t sure they’d be rewarded mostly weren’t willing to do shit.
We’re all kind of like children, when it comes to these decadal threats. We opt for instant gratification—guzzling oil, burning coal, razing forests, manufacturing plastic—in the face of what we perceive to be unreliability. We’re greedy little fossil-fueled Fausts.
Even Al Gore points to neuroscience to illuminate this obstacle. The science journal Nature, covering one of Gore’s speeches, explains the VP’s take: “Evolution, he said, had trained us to respond quickly and viscerally to threats. But when humans are confronted with ‘a threat to the existence of civilization that can only be perceived in the abstract’, we don’t do so well. Citing functional magnetic resonance imaging, he said that the connecting line between amygdalae, which he described as the urgency centre of the brain, with the neocortex is a one way street: emotional emergencies can spark reasoning, but not the other way around.”
Of course, the science itself is anything but unreliable. A staggering 97 percent of climate scientists agree that climate change is caused by humans, and while they certainly disagree on the precise fallout if we fail to change our ways, most are certain that billions of people, especially the world’s poor, will suffer.
That’s all but guaranteed. But what is it, the big difference between the slowly unfurling climate crises that are sure to eventually wipe out whole coastal communities and cities in arid places, and, say, the slowly unfurling nuclear crises that may or may not eventually wipe out whole metropolises and military bases?
“Why,” as the science journalist George Marshall writes, does the one “quicken the pulse, and the [other] induce widespread indifference?” Marshall is the author of Don’t Even Think About It: Why Our Brains Are Wired to Ignore Climate Change, and tries to explicate why, even though we’re aware of climate change, we never seem to list it as a pressing threat (we don’t—survey after survey finds global warming ranking extremely low among Americans’ priorities). The answer, he says, of course, lies in psychology.
“The primary reason is that our innate sense of social competition has made us acutely alert to any threat posed by external enemies,” he wrote in an op-ed last year. “In experiments, children as young as three can tell the difference between an accident and a deliberate attack. Climate change confounds this core moral formula: it is a perfect and undetectable crime everyone contributes to but for which no one has a motive.”
So the lack of anyone immediate to lay blame on (deep down, Americans know it’s largely our fault, even if we like to make ham-fisted attempts at blaming China) further complicates the picture.
Even if you live on the vanishing shores of Bangladesh, in the tinder-dry brush of the Australian outback, or the parched dust bowl of the American Southwest, your brain is not making the case that climate change is going to kill you. Storms might, wildfires might, drought might, the symptoms of a warming globe might. But climate change itself remains impersonal, an abstraction, and registers little need for urgent action.
On top of that, we’re afflicted with confirmation bias, which helps prevent those whose political ideologies are at odds with curtailing the behaviors that beget climate change, mass extinctions, or ocean acidification—tax-and-regulation-averse conservatives, say—from accepting that those things are happening at all. Entrenched conservatives actively seek out (the very flimsy) arguments that humans are not warming the atmosphere or acidifying the oceans, to confirm their bias that big government and taxes are anathema to a free society. (We all do this, on different issues.) So yes, there’s a cognitive reason that your uncle is still in denial about climate change, despite the vast sea of evidence washing up on his doorstep—and that so many politicians steadfastly refuse to accept the vast scientific evidence that human activity is warming the globe.
So is there any way to bridge this gap? How do we get the human brain to concern itself with slow-moving apocalypses? Science has been tackling this issue, relentlessly, for the past few years. There have been countless studies on the best way to frame messaging, on whether or not doomsaying is an effective motivator of action, or whether optimistic, “business-friendly” language can inspire cooperation without rocking the boat. Here’s an example: a study on the neuroscience of climate engagement that argues that “Moral cognitive neuroscience, and in particular the dual-process theory, indicates that up, close and personal harm triggers deontological moral reasoning, whereas harm originating from impersonal moral violations, like those produced by climate impacts, prompts consequentialist moral reasoning.”
Basically, it seems to be suggesting we embrace an ends-justify-the-means approach, and appeal to our innate moral imperative to act toward a greater good—even if that greater good is so far off as to seem uncertain to our survivalist brains. You can see why few of these papers have yet offered environmentalists, politicians, or scientists a silver bullet to rally the masses to action.
It’s deflating stuff. Climate change in particular has been called a “wicked problem”—even a “super wicked problem”—seemingly expertly concocted by some malevolent overlord to confound human abilities. Marshall reiterates that fighting climate change requires personal sacrifice now for fuzzy benefits much later: “The cognitive psychologist Daniel Kahneman, who won a Nobel prize for his studies of how irrationally we respond to such issues, sighed deeply when I asked him to assess our chances: ‘Sorry,’ he said, ‘I am deeply pessimistic. I see no path to success.’”
All this may be one reason we’re so enamored with apocalyptic fiction right now: maybe it relieves some of the abstract cognitive dissonance that comes with grappling with a real, far-off threat by resolving the crisis, immediately and violently, so we don’t have to think about it anymore. Combine our neurological deficiencies with our current political ones, and it’s easy to despair.
“If you were going to weaponize an issue to take advantage of the weak points in the American political system—to highlight all the blind spots, dysfunctions, and irrationalities—you would create climate change,” Ezra Klein wrote last year. “And then you would stand back and watch the world burn.”
At least all that burning should finally light up our brains.
Humans are capable of tackling long-term monster threats—look how Australia conquered its megadrought, for one—but perhaps only when it registers unambiguously as a threat to our brains. When that particular tipping point will hit is anyone’s guess; we’ve already stared down the barrel of climate change-fueled hurricanes, drought, and floods. It seems foolish to expect that we’re only another crisis away from cooperation. We may, as Diamond might say, already have chosen to fail. If we have, we’ll have no one to blame but our brains.