In the 1950s, a psychologist did a little experiment.
He had 71 Stanford students perform extremely boring tasks, like turning pegs and loading spools onto trays. When they finished, he paid them either $1 or $20 to tell the next participant the tasks had been fun, then asked how they really felt about them. Something weird happened. The participants who got paid almost nothing described the tasks as much more fun and meaningful than the ones who earned $20.
Why?
As Lee McIntyre explains in Post-Truth, it's because "their ego was at stake." They were trying to preserve some sense of agency. "To reduce the dissonance, they altered their belief that the task had been boring." But if you got $20 out of it, the money was reason enough, and you felt no need to lie to yourself.
This same psychologist ran a number of studies where he asked people to do things they didn't want to do, like protesting for causes they didn't support. Again, he observed that people who did things they saw no value in would invent one. They would just make stuff up, and the pattern held in study after study. Building on these studies, McIntyre says it's a hallmark of human nature to "seek harmony between our beliefs, attitudes, and behavior." It makes us uncomfortable when they're out of sync.
The guy who conducted the experiments was Leon Festinger, who laid out his theory in a book called A Theory of Cognitive Dissonance, probably one of the most important books of the twentieth century.
It's 1960. Don Draper has to figure out how to sell cigarettes after a bombshell report lands saying tobacco causes cancer.
He says, "It's toasted."
Nobody understands what he's talking about at first. They call him crazy. All cigarettes are toasted, they say.
So what?
In that moment, Don Draper shows us what an evil genius he is. You don't have to go out of your way to lie to everyone. You just have to take advantage of their cognitive dissonance. They want a reason to keep lighting up. Knowing that cigarettes cause cancer makes them uncomfortable. So just give them a reason they already know.
Act like it matters.
If there's one thing elites have studied, it's the science and psychology of the lie. They're masters of deceit. They know how to get the public to go along with anything, and you don't need a conspiracy theory to understand why. They love money. The more money they accumulate, the more they want. They resort to ever more desperate measures.
There's an architecture to lying. It's not just about making stuff up. You have to make the lie appealing.
We live in a world of lies. The worse things get, the more lies we're going to hear, especially from our own institutions and people in positions of authority. They'll lie for many different reasons, but most importantly they'll lie because they want everyone to keep working and producing wealth for them, right up until the moment they die from a disease or in one of the many climate disasters scientists are predicting.
The least we can do is understand how lies work.
It matters.
You probably know that repetition plays a role in misinformation. There's an actual name for what happens. It's called the illusory truth effect. As an article in the Journal of Cognition explains, "Repetition generates the illusion of epistemic weight." If your brain sees something a lot, it's wired to give it more credibility. Decades of research on the illusory truth effect show that it works, time and again. People don't feel bad about spreading lies anymore, since "repeatedly reading misinformation might even reduce how unethical it feels to share that unambiguously false information on social media." Everyone from politicians to advertisers uses this tactic to convince the public to act against their own interests.
So how does the illusory truth effect work?
Well, humans respond to information based on fluency and priming. As the authors state, "Repetition leads to easier and more fluent processing." The more times you hear something, the more it makes sense and the more you remember it. After all, that's how your brain learns.
We learn through repetition.
There's another interesting component of the illusory truth effect, and it's called source dissociation. As the authors state, "people remember the semantic content but not its source." Basically, they don't remember when or where they first heard a piece of information. They'll attribute it to a more credible source after the fact, even if they're just guessing. Someone might've heard something from a friend or seen it on Reddit. If they can't remember, they'll tell you they read it in a scientific journal, because that sounds better.
Studies on the illusory truth effect have shown that you can get people to believe some outrageous things, like the claim that elephants run faster than cheetahs, simply by repeating it enough times. This work builds on the Asch conformity experiments, where most participants went along with obviously wrong answers at least once, just because the group did. Repetition and peer pressure can do horrible things.
Maybe you remember a couple of years ago, when millions of people were yelling about schools offering students litter boxes if they identified as cats. Nobody really believed it, but they had no trouble acting like they did. It wasn't about litter boxes. It was about attacking public education.
That's a good example.
There's a reason why corporations invest billions in advertising and messaging. They know they have to vary their repetition across a range of contexts to get everyone to believe... anything. That's also why billionaires and rich families have gobbled up newspapers, and why they fund "independent journalists" to go around touting their message. It's about surrounding people with variations on the same lie from a hundred angles.
It creates the illusion of truth.
You'd think small lies are the most effective. It's actually the opposite. You need to start with a big lie.
The big lies kill.
See, most people don't really care about the truth. That's not how our brains work. As a clinical psychologist at NYU explains, "We don't truly believe things, so much as provisionally accept information we find useful." People are happy to accept and promote lies if they benefit from them.
Lies are about what people want to be true.
If you're going to lie, it's better to focus your energy and attention on one big, central lie and craft an entire reality around it. One big lie helps organize the lesser lies into a coherent structure that people can accept. If you waste your time on little lies, they're easier to knock down.
As a bonus, big lies have a way of breaking people's minds. They can't believe you would make something like that up, so they accept it rather than admit that someone would invent something so outrageous and so clearly wrong. Opponents have a hard time fighting back, too. Once you've committed to something that's clearly false, evidence no longer works on you, and the sheer audacity of the lie tends to fluster anyone who challenges it.
In their book Big Liars, Christian Hart and Drew Curtis interviewed hundreds of the biggest liars in the world.
They identified a key component:
Confidence.
Good liars go to extreme lengths to make their misinformation seem not only believable but normal. They think about how many and what kind of details to include in their stories. They rehearse everything from their tone of voice to their body language and eye contact. But there's something even more important in their tool kit.
Every good liar tells the truth.
As Hart and Curtis point out, "Successful liars tend to embed their lies in a cloud of truth... they will likely bury a single lie in a vast jumble of truths, obscuring the deceit." They sneak in clear, simple lies.
Here's the anatomy of a good lie:
1) It's clear and simple.
2) It's either unnoticeable, or too big to fail.
3) It's repeated endlessly.
4) It's buried in truths.
Maybe you can see the political implications of all this. Maybe you've wondered why agencies like the CDC bother to manipulate excess mortality data and then explain it, when they could just outright falsify or even withhold it. Maybe you've wondered why Covid minimizers insist on reporting half-truths, or why media outlets continue publishing stories on "mystery illness." Maybe you've wondered why the government bothers to send us rapid tests that don't work very well, when they could send us nothing at all.
Well, those wouldn't be very good lies.
They have to look like they're trying.
That's a great lie.
On a global scale, it's cheaper and easier to control society through lies, including lies of omission and half-truths. As psychologists have told us, most people don't even want the truth. They want information that's useful to them, at least in the short term. They want excuses.
They want something that feels like the truth.
It's the same thing with climate change. Oil companies invested billions to convince the world they were planning to phase out fossil fuels when most of them were planning to accelerate production and consumption. "Net Zero" by 2050 became the biggest lie in history.
Because it's true.
We probably will reach net zero emissions by 2050, because global industrial civilization will collapse by then.
So Exxon is telling the truth.
Sort of.
Over the last four years, we've seen a number of big lies when it comes to Covid. Here they are, in order:
The flu is deadlier.
Kids won't get it.
You'll only get it once.
Immunity lasts for decades.
The vaccinated won't get it.
Everyone's going to get it.
It's mild.
Kids are sicker because of immunity debt.
Long Covid is a mental illness.
Every single one of these lies has exploited the public's cognitive dissonance. Hundreds of studies have shown the true nature of Covid and the threat it poses to all of us. Every single one of us knows at least one person, and probably several, living with post-Covid health problems. But most westerners, Americans in particular, don't like having to wear a mask or learn about air quality. They don't like having to change their habits or rethink society. They choose to believe obvious misinformation because it preserves their self-image.
They're just like those Stanford students who got paid $1 for doing a bunch of pointless work. They'll say anything to justify it, because none of them want to admit they've been played. They also don't want to admit they've likely ruined at least one person's life, probably a close friend's.
You can apply this same thought process to any moment in history when the bulk of society has chosen to accept outrageous propaganda rather than face the moral consequences of their actions.
That's the truth.
It's bad enough that people in positions of power are choosing to tell bigger and bigger lies to hide their catastrophic mistakes. It's scarier to see the psychological studies on how readily people will accept and perpetuate those lies to alleviate cognitive dissonance and preserve their own status and sense of self. That's what we're up against.
There's only one way to fight big lies. We have to repeat the truth just as often, just as loudly, with just as much conviction. We have to make the truth appealing, even funny. We have to do it every single day, even on the days when everything feels hopeless.
In fact, those are the days that matter most.
We need as many people as possible speaking the truth in every kind of way. We need protests. We need articles. We need podcasts. We need tweets. We need art. We need music. We need memes. We need the truth presented politely over the dinner table, and we need it shouted in the streets. We need it with a smile, and we need it dripping with sarcasm. We need it on t-shirts. We need it on billboards.
We need it all.
There's one last thing we can do, and it's going to be hard. We're going to have to figure out a way to offer forgiveness and amnesty to people who finally see the truth and stop fighting it. It's hard to get people on your side if they're scared of admitting they were wrong.
Of course, maybe it's not that hard. I don't think most of us are interested in revenge. We just want people to do the right thing when it comes to public health, climate change, and human rights.
It's either that, or we join the big lies. Or I guess we go ahead and build our little cabins in the mountains and wait.
We don't have many other options.
Do we?