Book Review: The Bias That Divides Us: The Science and Politics of Myside Thinking
ETHOS Digital Issue 10, Mar 2023
Why People See the Same Facts and Come to Different Conclusions
The Bias That Divides Us: The Science and Politics of Myside Thinking by Keith Stanovich, The MIT Press (2021); 241 pp
You might think that when it comes to arguments and facts, the marketplace of ideas will separate the wheat from the chaff. You might also be persuaded that presenting people with the facts can help them reason past misinformation and forge consensus on reality.
If so, consider a 1954 study by psychologists Albert Hastorf and Hadley Cantril, in which they showed a film of a football game between Dartmouth College and Princeton University to students from those schools.1 The game had been rough—one player left the field with a broken nose and a concussion, another with a broken leg. Rule infractions abounded.
Hastorf and Cantril asked the students who had started the rough play: just over a third (36%) of Dartmouth students—compared with the vast majority (86%) of Princetonians—said Dartmouth had started it. In contrast, 2% of Dartmouth students, and none of the Princetonians, said Princeton had started the rough play. Furthermore, 13% of the Dartmouth students and 0% of Princeton students said the game had been “clean and fair”, while 42% of Dartmouth students and 93% of Princetonians said it had been “rough and dirty”. Watching the same film did not lead students from the two schools to a consensus. Instead, they reported wildly different takes on the game they had seen.
In his 2021 book The Bias That Divides Us: The Science and Politics of Myside Thinking, psychologist Keith Stanovich cites numerous other examples of people seeing the same evidence and coming to very different conclusions. Sometimes, seeing the same evidence drove people further apart: those on different sides of an issue became even more convinced of their prior opinions! Policymakers who want to know how people actually think about information, and who wish to understand why we are so divided on matters that appear settled by the facts, need to read this book.
We tend to invest our emotions and egos in our convictions, and to be especially resistant to making trade-offs against them.
Myside bias occurs not just in psychology labs, but everywhere in the real world. It occurs when people are more willing to impose something on others than to accept it themselves. It occurs when we assume, for instance, that high-reward activities carry low risk, which may explain why people “invest” in cryptocurrencies despite regulators’ warnings. It also occurs when people are more likely to disbelieve the benefits of activities they find immoral (e.g., capital punishment, stem cell research). It’s why people who smoke and drink are less likely to believe in the harmful effects of cigarettes and alcohol. And it explains why people can become trapped in “islands of false beliefs”, as Stanovich puts it, using their intelligence to rationalise their way ever deeper into conspiracies.
Perhaps nowhere is myside bias more evident than in political polarisation. We find it more cognitively taxing to evaluate evidence that conflicts with our existing beliefs. We exaggerate the strengths of evidence that supports our beliefs and the weaknesses of evidence that does not. Furthermore, we employ strategies to avoid having to critically assess these protected beliefs: “I am entitled to my opinion” and “don’t talk about religion and politics” are both ways of shielding our convictions from challenge.
This is compounded by the fact that we are more swayed by the usefulness of a belief than by its truth. This is especially so when holding the belief is key to our membership in a group and to maintaining the group’s cohesion. For example, someone deeply enmeshed in a wellness community might persist in professing the efficacy of homeopathy in the face of contrary evidence, because his membership in the community matters more to him than being factually and logically correct.
We are more swayed by the usefulness of a belief than its truth.
A world in which we think of political parties or ideologies as tribes, and in which belonging trumps being right, is ripe for myside bias to breed polarisation. Lilliana Mason’s research suggests that labelling a position “liberal” or “conservative” does more to predict whether liberals or conservatives will support it than any principle internal to the position itself. This explains why we can hold ideologically inconsistent and even contradictory positions across different issues. Indeed, psychologists find that liberals and conservatives switch the types of statistical evidence they find persuasive depending on the issue.
A world in which belonging trumps being right is ripe for myside bias to breed polarisation.
Another problem psychologists find is that while cognitive elites tend to be less susceptible to other cognitive biases, they are more susceptible to myside bias. More intelligent people are more likely to believe that they have reasoned their way to their conclusions, rather than acknowledge that those conclusions derive from the social groups to which they belong. No doubt, they are also better at rationalising their biases. Studies find that the ideological consistency of cognitive elites has increased over time, and that holding ideologically consistent positions across issues tends to be true only of people who are deeply involved in politics, or of the highly educated who are continuously immersed in high-level media sources.
At this point, readers might consider how current conditions conspire to accentuate the toxic effects of myside bias. In a time of heightened ambiguity, complexity, and uncertainty, it is becoming increasingly difficult for people to verify evidence empirically, and populists cast aspersions on the experts who can help us do so. As more of us live in cities without access to tight kinship networks or civic associations, we cleave more to the few sources of belonging that give us our identities. And as we retreat into the rage-inducing, clickbaity world of social media, we become less and less familiar with alternative ways of thinking about and experiencing the world. Hence, when we encounter a tweet from our ideological opposites, our ignorance of the contexts in which its ideas arose leads us to be incredulous at their seemingly alien views.
While cognitive elites tend to be less susceptible to other cognitive biases, they are more susceptible to myside bias.
Readers who are policymakers might, at this point, be despondent. But Stanovich’s final two chapters offer suggestions for the way forward. The penultimate chapter, which discusses the myside bias of one group of cognitive elites (university professors), could be instructive for adjacent groups such as policymakers. For instance, when designing surveys and polls, we should be careful not to infuse our questions with confounds: a survey question that asks people whether they would defer to religious authorities, but not whether they would defer to academic experts, might show conservatives but not liberals as valuing authority. We should also be less judgmental and more curious about the different worldviews of citizens who seem ‘irrational’ or who appear to ‘lack the facts’. As public officers, we should reflect on how we ourselves may exhibit behaviours and attitudes that we assume are exclusive to other groups, and recognise that we, like everyone else, tend to over-estimate how much thinking we have done to arrive at our own beliefs.
Additionally, just as scientists, who themselves may be biased, use the scientific method to detect errors and validate their conclusions, the rest of us can use decision-making processes to check our tendency to evaluate information in self-serving ways. Stanovich suggests several such processes. Chiefly, we could be wary of the tribal qualities of the parties and ideologies to which we subscribe, and become comfortable holding beliefs that are inconsistent with those of our confederates in these communities.
For instance, Stanovich warns against the “bad” kind of identity politics. Identity politics of the sort practised by Martin Luther King Jr and Mahatma Gandhi can be crucial in helping marginalised communities gain access to equal rights and protections. However, when identity politics is used to prioritise one group over another, and when those same identities are wielded against efforts to build a shared humanity, it fertilises myside bias and feeds polarisation.
Stanovich is a psychologist rather than a sociologist, and his book leaves one curious about the social practices we can adopt and nurture to reduce the toxic effects of myside bias. But it does suggest that a rich civic life, one in which we can be members of multiple communities, can help us forge the cross-cutting identities that give us many sides to belong to. It also highlights that engagements designed to help people understand and reconcile diverse worldviews, and to do so from their common identity as citizens, are vital for societal cohesion.
NOTE
- A. H. Hastorf and H. Cantril, “They Saw a Game: A Case Study”, The Journal of Abnormal and Social Psychology 49, no. 1 (1954): 129–134.