Trust can be a good thing in business: If, as an Apple employee in the Steve Jobs era, you believed in his brilliance as an innovator, you’d feel more committed to the decision to develop a tablet device that did not yet exist. Trust also makes for friction-free relationships by reducing transaction costs. Of course, some leaders, like Bernie Madoff, are worthy of distrust. The question is: When connecting the dots on a suspicion about leaders or groups in your workplace, how can you be sure you’re right?
“The human brain is really hardwired to seek out and overweight certain kinds of information,” says social psychologist Roderick Kramer. As he writes in a new paper, “Misconnecting the Dots: Origins and Dynamics of Outgroup Paranoia,” there are psychological factors at work — often in concert — that lead people to inflate or misconstrue suspicions into mistrust when it’s not warranted. Here are three types of misperceptions to be aware of:
- Overly personal construal of interaction: “People begin to read their own personal story into a situation,” says Kramer. “The reason I wasn’t invited to that meeting is because they all discussed it and actively decided to exclude me.”
- Sinister attribution error: “We often make paranoid attributions for benign behaviors,” he says. “A lot of us have experienced this around email. I send an email to my superior and they don’t get back to me right away. And I begin to ruminate about why — they’re mad at me, I’ve disappointed them, they’re punishing me — when in fact they may be busy and not even reading email.”
- Exaggerated perception of conspiracy: “This tends to be social in nature,” says Kramer. “My colleague didn’t get back to me, but come to think of it my boss didn’t either — suddenly I begin to put those pieces together and think, ‘Oh, I’m not going to get that promotion.’”
So how do you keep suspicions from spinning out of control while maintaining a healthy skepticism? “Just knowing the nature of these biases and the psychological factors that feed them allows you to begin to compensate for those,” says Kramer. “In a way, we aspire to help the brain make rational choices by understanding some of the ways it goes wrong.” Kramer offers these de-biasing strategies to avoid misconnecting the dots:
Be mindful of the impact of status.
Those with fewer resources or less power have a tendency toward hypervigilance, a psychological factor that can exacerbate misperceptions, says Kramer. “Lower status groups tend to look around vigilantly for any evidence to support their theory, because they have a lot to lose if they get it wrong.” In a study Kramer conducted on the graduate student–faculty relationship, for example, he found that graduate students spent far more time worrying about how well the relationship was going. “Not surprisingly, the faculty are busy thinking about the people they’re accountable to, not the lower status people,” he says.
Gather data like a scientist.
Once you think you’ve come to a conclusion on an issue, try to prove yourself wrong, says Kramer. A whole body of research suggests that people tend to seek confirmatory evidence to the exclusion of other information. “It’s a natural thing we do,” he says, “but a more rational approach is to work very hard to gather unbiased data, including information that might disconfirm your interpretation — scientists and doctors are trained to do this.”
Talk to the opposition.
Part of questioning your interpretation of the facts should include talking to experts who have alternative interpretations. “Conspiracy theorists tend to go to websites they agree with and share information with like-minded people,” says Kramer. But you have a better chance of getting it right if you constantly reassess your interpretation of the facts. “There is actually some wisdom in keeping track of what your enemies are doing,” he says.
Don’t let yourself be isolated.
Keeping suspicions to yourself, or confined to just a few friends who share your point of view, can fuel paranoia, says Kramer, who has studied leader paranoia and found that a common mistake, one made by presidents such as Richard Nixon and Lyndon B. Johnson, is becoming surrounded by yes-men. “It’s important to be sure you’re getting a panoply of information. You have to think about the social network you’re in — is it really serving you well?”
Roderick Kramer is the William R. Kimball Professor of Organizational Behavior at Stanford Graduate School of Business. The paper “Misconnecting the Dots” was recently published in the book Power, Politics, and Paranoia: Why People Are Suspicious of Their Leaders.