Let's face it, we're a mess.
Most of us know we're not doing enough to deal with our greatest threats. Every day, we witness the dumbest behavior, things that seem to fly in the face of our most basic survival instincts.
We ask, "Why???"
The human brain evolved over millions of years. Most psychologists agree it's good at responding to immediate threats.
It's terrible at responding to slow, gradual threats, even when they're far more important. As Brian Merchant writes in Vice, "Humans have, historically, proven absolutely awful, even incapable, of comprehending the large, looming... slow burn threats facing their societies." In Collapse, Jared Diamond chronicles how leaders of past civilizations failed to address clear dangers because it was easier to shrug them off and downplay them.
As it turns out, there's a lot wrong with the human brain. It's full of contradictory impulses that fight against our better reasoning.
Here's a list:
1) People don't take invisible threats seriously.
Our biggest threats now are invisible.
You can't see them.
As Harvard psychologist Daniel Gilbert explains, humans evolved to deal with immediate, visible, audible threats. We're not good at dealing with threats that don't trigger our senses. As Gilbert says, "We're very good at clear and present danger, like every mammal." Humans were well adapted for the ice age. Unfortunately, "we've got this great big brain so we could navigate our ancestral environment... and what did we do? We created an entirely new environment to which our brain is not perfectly adapted." In other words, our actions created new long-term threats that we can barely perceive.
Those threats are tiny, like viruses in the air and molecules in the atmosphere that heat the planet and kill our food sources.
According to Gilbert, humans don't respond well to gradual, abstract threats. We rely on our emotions to sense danger, and that's why so many people get worked up over terrorism or abortion. Those threats arouse feelings, and they stir up moral outrage. Threats like viruses or climate change should arouse our emotions, but they don't. You can't see or hear climate change.
You can't get morally outraged at a virus.
It doesn't have a face.
2) Everyone thinks, "It won't happen to me."
Psychologists have been studying unrealistic optimism bias for decades. Neil Weinstein was one of the first to write about this trait in a 1980 issue of the Journal of Personality and Social Psychology. As he wrote, "people tend to be unrealistically optimistic about the future." They overestimate the chance of good things happening and underestimate the chance of bad things happening. When you present someone with evidence about their odds, they almost always predict that they'll beat those odds. They think those odds apply to everyone else, but not themselves. It influences their behavior.
It can make them reckless.
Weinstein and his colleagues at several universities published an overview of unrealistic optimism in Current Directions in Psychological Science, looking at hundreds of studies. For example, 56 percent of people believe their risk of having a heart attack is lower than the average person's. People also demonstrate unrealistic optimism about their odds of experiencing or surviving natural disasters.
Here's where it gets weird:
These hundreds of studies on unrealistic optimism have found that the more control someone has over a situation, the more unrealistic their optimism becomes. They believe they can avoid something bad happening, even if they don't take those precautions. As Weinstein and his colleagues write, people become guilty of "transforming the comparative risk judgment into a personal risk judgment with no clear reference."
If you can avoid something bad by changing your behavior, it ironically makes you more inclined to believe you'll avoid it regardless of what you do. It gives you a false sense of control.
This trait doesn't stack well when it comes to threats. When everyone goes around thinking "it won't happen to me," they raise everyone's risk by avoiding or even refusing to take precautions. They take more risks than they should, and then they get angry when things go sideways.
Why do people think like this? According to the research, it's simple. People want to feel good about themselves.
They want to feel special.
3) People get high off ignoring threats.
It feels good to blow off warnings.
For a lot of people, it delivers a dopamine boost. It gives them a sense of power and control, however fleeting it might be.
Jack W. Brehm introduced the idea of reactance in his 1966 book, A Theory of Psychological Reactance. He argued that people have a natural tendency to resist perceived threats to their freedom and autonomy. They hate being told what to do, even if it’s for their own good. They're more worried about losing their freedom than protecting their health.
We see that every day now.
Psychologists have been studying reactance for almost 60 years, and they’ve learned a lot. The research has shown that the more you try to warn most people about threats, the more they resist.
They often get violent.
Reactance becomes an especially difficult problem in individualist societies like ours. It's hard to correct. Studies have shown that people in individualist cultures feel more threatened when someone they know argues in favor of anything they see as a violation of their personal freedom. People also demonstrate vicarious reactance. Basically, they feel an impulse to defend someone else's freedoms at the expense of the greater good.
There's a final irony here.
The research on reactance has observed that people do eventually take threats seriously, once they become absolute and unavoidable. That's when they start to panic and act mainly in their own selfish interests, disregarding everyone else around them, even family. They do irrational things like hoarding toilet paper and fighting over bottled water.
By then, it's too late.
They're screwed.
4) People shoot the messenger.
People tend to get punished for delivering bad news. Instead of being listened to, they're gaslit and pathologized. They're demonized.
They're labeled "unlikable."
Psychologists call this tendency spontaneous trait transference.
According to John Skowronski and Donal Carlston, spontaneous trait transference happens when "communicators are perceived as possessing the very traits they describe in others." Instead of believing warnings, people tend to transfer those negative traits to the person trying to warn them.
Dozens of subsequent experiments have confirmed this trend. If you're trying to warn someone about a threat, people will act like you're the threat instead of taking you seriously.
It's why leaders don't listen to their Cassandras.
Neither does the public.
5) People trust their gut too much.
The Nobel Prize-winning psychologist Daniel Kahneman has published several books and articles on decision-making, including the bestseller Thinking, Fast and Slow, where he examines the pitfalls of impulsive, intuitive reasoning.
Thinking with your gut gets you in trouble.
Intuitive thinking tends to work well when it guides someone toward caution and safety. It doesn't work well when it encourages them to make overconfident, overly optimistic decisions. As Kahneman has said during interviews, "People are overconfident in their judgments... they make plans and absolutely believe those plans will come through."
Western societies have glorified gut thinking and quick decisions. It has a terrible track record. It leads companies into bankruptcy and countries into war. You might wonder why we keep doing it.
There's a related mistake here, called survivorship bias. Over the last 20 years, corporate media outlets and financial magazines have pumped out endless articles examining the success of billionaires and influencers who built their empires on impulses and hunches. They've constructed an illusion that it's good to think with your gut, overlooking all the times it fails and ignoring all the other intellectual work that goes into success.
And so here we are, a hunch-based society.
It's failing.
6) People want to forget collective trauma.
You would think societies would try to remember important lessons from disasters and emergencies and learn from them.
Unfortunately, that doesn't always happen.
Just as often, societies engage in a willful forgetting of the immediate past. Social psychologists refer to it as collective amnesia.
Craig Stephen describes collective amnesia as “the tendency of societies to recognize a new potentially catastrophic threat, react to it for a few years, and then move on.” They don't deal with their threats. They don't develop effective plans or strategies. They don't prepare for next time.
They just get tired of talking about it.
Even worse, the elite members of society will often cultivate this collective amnesia through propaganda. It serves them.
As Alessandra Tanesini writes, “Communities often respond to traumatic events in their histories by destroying objects that would cue memories of a past they wish to forget.” As she explains, dominant members of a group “spread memory ignorance” in order to erase mistakes they made and suppress dissent among the ranks. This memory ignorance serves as “a form of self-deception or wishful thinking in the service of self-flattery."
Sometimes, these artifacts are the very tools everyone needs to protect themselves from ongoing threats. Instead of learning to associate them with safety and assurance, people learn to associate them with fear.
The elite make it that way.
7) People can adjust to almost anything.
In the 1990s, Daniel Pauly introduced a term called shifting baseline syndrome to describe how scientists and communities fail to perceive drastic changes over a long period. Each new generation accepts their current situation as normal, without considering what used to be normal. Basically, we accept something awful because we don't know any better.
Psychologists have another name for it:
Habituation.
Our baselines and attitudes toward what counts as "normal" can shift quickly as we get accustomed to familiar situations. The more you're exposed to something, the less you react to it. That's why exposure therapy sometimes helps patients overcome phobias.
With enough exposure, you can get used to almost anything.
It can work against us.
When someone becomes habituated to a threat, they'll stop responding the way they should. They'll stop protecting themselves. And they'll stop trying to do anything about it. They'll just accept it.
8) People defend what they consider normal.
Once someone adjusts to a horrible normal, they prefer it.
Change scares them.
Two social psychologists at Yale proposed a theory to explain why people go so far out of their way to defend horrible norms. They'll do it even when they know it doesn't work, and when it's hurting them.
It's called system justification theory.
According to John Jost and Mahzarin Banaji, we're wired to resist change. For most people, it's easier to convince yourself and everyone else to accept a bad situation than try to change it, no matter how terrible it gets. Members of a group will resort to desperate measures to defend the status quo. They do it to preserve social harmony and boost their own self-esteem. Since most of us play varying roles in perpetuating dominant socio-political norms, we all feel somewhat motivated to justify them to each other.
It makes us feel better.
The concept has inspired dozens of studies. Jost published a book about it with Harvard University Press in 2020.
There's also Kanter's law, named after Harvard Business School professor Rosabeth Moss Kanter.
Essentially, "everything looks like a failure in the middle." That's when people are most likely to abandon ambitious goals and revert to old, dysfunctional norms. They cave in to the illusion that what they had before was more effective or simpler, and they're often wrong.
They just don't see it.
9) Most people just want to fit in.
Our desire for social acceptance often overrides our survival instincts, short-circuiting our normal threat response.
Social psychologist Solomon Asch conducted a number of studies that found something unsettling. People care more about fitting in than being right. Nearly 75 percent of the participants in his conformity experiments gave at least one answer they knew was wrong on a simple vision test. Their only reason was that they didn't want to stand out. Overall, they conformed almost 40 percent of the time. Only 24 percent managed to resist every time.
Even during emergencies, people have a tendency to look for social confirmation before they do anything. Amanda Ripley describes this glitch in The Unthinkable: Who Survives When Disaster Strikes and Why. As she writes in a related article: “Large groups of people facing death act in surprising ways. Most of us become incredibly docile… Usually, we form groups and move slowly, as if sleepwalking in a nightmare.” In short, we don’t panic.
We turn into cattle.
John Leach describes this behavior as the "won't to live." Most people in any given emergency shut down or freeze up. They'll spend precious time gossiping and trying to get more information before they take action. Psychologists even have a term for this procrastination.
They call it milling.
In the Journal of Community & Public Health, Carl Ross explains how normalcy bias has hampered our approach to the pandemic.
As he writes, “we are sensitive to the perception of others viewing us as abnormal. Within social relationships, very few want to be seen as alarmist, overreactive or a fool because if they are wrong about a threat then they will be regarded as less credible in the future.” He goes on to state that “social shaming reinforces our normalcy bias. It’s not cool to overreact.”
10) The elites always panic.
The super rich seem to grasp all of these glitches, at least on an intuitive level. They exploit them for their personal gain.
The corporate media has fallen into a predictable pattern over the last several years, downplaying and dismissing threats instead of giving the public reliable and accurate information. The super rich don't care about saving people. Instead, they obsess over protecting their property from looters and securing the most resources for themselves. They don't want ordinary people to take action, because that threatens their own interests.
The sociologist Kathleen Tierney, along with Caron Chess and Lee Clarke, gave us a term for all this:
They call it elite panic.
Rebecca Solnit talks about elite panic in her book, A Paradise Built in Hell. As she writes, "It's a very paternalist orientation to governance. It's how you might treat a child." Unfortunately, history shows that our institutions often take the elite panic approach. It makes everything worse.
According to Solnit:
Elites tend to believe in a venal, selfish, and essentially monstrous version of human nature... They believe that only their power keeps the rest of us in line and that when it somehow shrinks away, our seething violence will rise to the surface.
When they panic, elites convince government authorities and media outlets to hide, delay, or downplay vital information. They mistakenly assume the public will turn into mobs if they know the truth.
Experts call these fears disaster myths.
As Lee Clarke explains, "Disaster myths are not politically neutral, but rather work systematically to the advantage of elites... because to acknowledge the truth of the situation would lead to very different policy prescriptions than the ones currently in vogue." Those different policy prescriptions would serve the greater good. They're simpler, and they're usually not very profitable.
As James B. Meigs writes, elite panic has routinely undermined our response to emergencies ranging from earthquakes to pandemics. It reinforces everyone's existing cognitive biases, and it encourages them to neglect or refuse commonsense precautions.
You see elite panic every time some columnist or contributor fixates on correcting and fact-checking warnings, criticizing caution as "fear," and insisting that everyone carry on as if everything's normal.
If you want to measure elite panic, just count the number of articles and news segments that tell you not to panic.
It's a good barometer.
Can we do better?
In theory, humans can work together and respond to threats like pandemics and global warming. But we have a lot of evolutionary baggage working against us, and the elite exploit these weaknesses to keep us divided.
This is what's holding us back. Regardless of where you land on the doomer spectrum, this answers your biggest question:
What's wrong with everyone?
Now you know.