There's a story about Zig Ziglar, the motivational speaker. A woman approached him at one of his seminars, begging him for help. She said she hated her job. Her boss was horrible. Her coworkers hated her.
What should she do?
Well, Zig told her to shut up and stop complaining. He said she sounded like the negative one. He told her to chant aphorisms into a mirror about how much she loved her job. She did. Lo and behold, she was cured.
I've got bad news for everyone.
Zig didn't help this woman. In fact, all he did was reinforce a cognitive bias that psychologists have been studying for decades. This bias explains a lot about the state of the world today. It explains why nobody listens to you about any of the threats we face now. It explains why they try to punish you, even when you’re right. It explains why telling the truth makes you the bad guy.
It’s getting worse.
In the 1990s, psychologists John Skowronski and Donal Carlston began noticing something strange: when someone tried to warn friends and family about a threat, listeners didn't believe the warning. Instead, they transferred the negative traits being described to the person doing the describing. Skowronski and Carlston confirmed the behavior in four studies. They defined this phenomenon as spontaneous trait transference, something that happens when "communicators are perceived as possessing the very traits they describe in others." The one who smelt it dealt it.
As Skowronski explains, "politicians who allege corruption by their opponents may themselves be perceived as dishonest" and "critics who praise artists may themselves be perceived as talented." If you describe someone as negative, unreliable, or dangerous, people tend to misremember it as a self-description.
Yes, they think you were talking about yourself.
Later, Rick Brown and John Bassili found that people transfer personality traits to inanimate objects like bananas. They do it without thinking. Yep, you can condition someone to believe that bananas are evil. And if you can convince someone bananas are evil, imagine what you can do with masks or vaccines.
It gets worse.
You know the phrase, don't shoot the messenger?
In 2019, a team of psychologists at Harvard led by Leslie John reviewed hundreds of studies and conducted eleven different experiments to explain why people punish someone for giving them bad news. They learned that people often reach for the quickest, easiest explanations for negative events in their lives, especially ones that preserve their self-image and group harmony.
As Leslie John and her colleagues write, "people are especially prone to attributing agency to others for negative outcomes." They also "attribute agency to those proximal to the event." There's nothing more proximal to an event than the first person to tell you what's going on. Once again, our shared psychology discourages us from warning each other about threats.
And so:
"Bad news messengers may be prime candidates in recipients' search for antagonists to cast in accounts of unwanted outcomes." Bad news also motivates people to come up with "fallacious" causal explanations "often generated effortlessly, seemingly automatically." They generate these fallacious explanations through poor reasoning "characterized by shallow, unconscious thought."
That's how we wind up with so many conspiracy theories. They're easier to swallow than the truth. They gratify us.
Fortunately, there’s a way to combat this trend. As Carlston and Skowronski explore in a 2005 study, “forcing participants to recall the target… just prior to trait judgments eliminated the transference effect.” You have to get them to visualize the person or thing you’re talking about and focus on that, not your experiences or perceptions. Try to sound as objective as possible. The more you talk about yourself in relation to the threat, the more they’ll link you to it.
It’s hard to pull off.
Like so many of the biases we observe, simply warning someone about spontaneous trait transference doesn’t help.
They do it anyway.
Spontaneous trait transference converges with a few other cognitive biases, like disconfirmation bias—something Leon Festinger documented when he studied a doomsday cult in the 1950s. Contrary to what you'd think, more evidence and information disproving a belief can make someone want to believe even harder than they did before. That's how denial works.
One of the earliest studies on this bias observed the reaction of high schoolers to compelling evidence debunking the divinity of Jesus. The high schoolers believed in Jesus more afterward, even when they admitted the evidence was convincing. That behavior keeps showing up.
Another major study in the Journal of Personality and Social Psychology found that when someone has a strong opinion on something, they accept all evidence in favor of their original belief at face value. They don't question it. They subject conflicting evidence to a much higher standard. Doing that allows them to reject or dismiss the counter-evidence. The more persuasive the evidence, the harder they try to reject it. They bend their minds into all kinds of shapes.
A 2020 article by researchers in Australia found something similar. After reading about Donald Trump's immoral and illegal acts, his supporters were more likely to focus on the wrongdoings of Trump's opponents. It didn't matter if they believed Trump had done anything wrong.
Education doesn’t make a difference. In fact, the more educated someone becomes, the more immune they think they are to bias. It winds up creating its own bizarre bias. I guess you could call it the college effect. Liberals do it, too. When presented with evidence they don’t like, they work extra hard to reject it, and that includes deflecting more attention toward their opponents.
Maybe you’ve noticed…
All of this research points to a larger understanding of how most people process information. It’s not based on logic. It’s based on social ties and emotion. Psychologists and political scientists call this motivated reasoning. We look for facts that suit our default emotional stance. That emotional stance derives from our social backgrounds. The ideas and opinions we hear repeated by those around us sound like the truest, most comforting ones.
Ideas from the outside?
That’s dangerous.
A 2010 study by Yale law professor Dan Kahan found that our cultural values and worldviews influence who we regard as experts and authorities in the first place. According to Kahan, you can classify most people as individualists or communitarians, and either hierarchical or egalitarian. Most people tend to reject the authority of experts who don't align with their cultural values and orientations to the world. The topic doesn't matter.
In Kahan's study, only 23 percent of hierarchical individualists would recommend a book by an expert who expressed belief in the urgency of climate change, even if that expert was a member of the National Academy of Sciences and a professor at an elite university. You can extrapolate from there. If someone's a hierarchical individualist, they're probably going to reject evidence that vaccines or masks work, along with air purifiers. They're more likely to dismiss warnings about climate change or collapse. They aren't going to listen to any experts who come off as too liberal or communitarian—or negative.
You can find hierarchical individualists among Democrats as well, on the more “moderate” side, and you know what it looks like.
Here’s the last nail:
Scientists have figured out that we apply fight-or-flight responses not just to physical threats, but to facts themselves. Arthur Lupia’s work on emotion and fear shows how politicians can manipulate the public’s fear to get them to accept policies they would otherwise reject. They just have to make the truth sound scary and complicated, and make the ideas they want look simple and reassuring.
People will hide from facts they don’t like.
You can see how this has played out with so many of our threats, from the climate crisis to airborne contagious diseases. Politicians on both sides have coordinated with the media to make the real solutions sound scary, complicated, and impractical. They’ve painted truth-tellers as fearmongers. This strategy has continued right up through today, when public health grifters like Vinay Prasad declare they would “happily” drink glass after glass of raw milk.
Well, he could.
He won’t.
People have a very bad habit of associating what’s comfortable and convenient with what’s safe and true. That fundamental flaw lies at the heart of most biases we uncover. People fear change more than anything.
So, that’s why it’s so hard to get people to listen to us. It’s easy to pretend nobody has to change their habits or routines. It’s easy to make them afraid of the truth, and it’s easy to get them to associate us with the things we’re trying to warn everyone about. Think about it for a minute, and it also explains why so many people see someone wearing an N95 mask and immediately assume that person is the sick one. In the plague days, people were more scared of the doctor than the disease. Plague masks? Creepy. Fleas and rats? Normal.
They’ve learned to associate protections with threats.
It’s ass-backward, but there it is.
To sum things up, people tend to attribute the bad news and negative events in their lives to those around them, often their friends and family. They don't do a good job of distinguishing between a threat and someone trying to warn them.
They get them mixed up.
This trait explains so much of what's going on now. It explains why the public gets angry at climate protestors instead of the oil executives who've ruined their future. It explains how university students have somehow wound up as the villains in so many people's eyes, instead of the governments sponsoring and committing genocide. It explains why you can't criticize billionaires or super-rich influencers without that incredibly annoying counterclaim:
"You're just jealous."
It explains why you get pathologized and called everything from a doomer to a snowflake for caring about anything but yourself. It explains why pretending to care looks better than actually caring.
Once you understand this glitch in human thinking, it has less power over you. Maybe there’s not much you can do to change someone’s mind in the short term. Maybe you just have to let them learn on their own. Sometimes, you have to leave the information with them and give them time to process it. In the meantime, you have to insulate yourself from them as much as possible. Sometimes, trying harder to convince someone of the truth doesn’t work.
We all feel the emotional turbulence of watching people we care about welcome harmful misinformation into their lives. We’ve all wondered how we wound up becoming the bad guy in all of this.
That’s how.