The Weird Ways Your Politics Affects Your Morals
This article was originally published at The Conversation. The publication contributed the article to Live Science's Expert Voices: Op-Ed & Insights.
When news breaks about the wrongdoings of our favorite politician, the other side inevitably argues that we have a scandal on our hands. We like to think that our superior grasp of logic is what enables us to reason through and reject the other side's concerns.
But a series of three studies I recently published suggests such decisions are not just the result of reasoning. Rather, feeling moral aversion toward political opponents compels us toward positions that help our team "win." This is true even if it means adopting positions with which we'd otherwise disagree.
Here's the effect in a nutshell: Imagine that you walked into an ice cream shop on Election Day. You discover that the shop is filled with supporters of the presidential candidate you oppose – people you find morally abhorrent. When you get to the front of the line, the worker tells you all of the other customers just ordered red velvet – normally your favorite flavor.
My studies demonstrated that when asked to order, you are likely to feel an urge to stray from your favorite flavor toward one you like less, politically polarizing an otherwise innocuous decision.
Whatever they think, think the opposite
To understand what's meant by "urge" here, it helps to understand the Stroop effect. In this classic experiment, people see a single word and are asked to name the color in which the word is printed. When the color and the word match – for example, "red" printed in red – the task is easy. When the color and the word are incongruent – for example, "red" printed in blue – the task is harder. People feel an impulse, or "urge," to read the word instead of naming its color. This urge interferes with the task, and what should be simple becomes oddly difficult.
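To make that interference concrete, here is a minimal sketch of a Stroop-style trial that can run in a terminal. It is only an illustration of the classic paradigm, not the software used in my studies; the color set, the ANSI-escape rendering and the timing approach are my own assumptions.

```python
import random
import time

# ANSI escape codes to color the printed word (assumes a color-capable terminal).
INKS = {"red": "\033[31m", "green": "\033[32m", "blue": "\033[34m"}
RESET = "\033[0m"

def stroop_trial(congruent: bool) -> float:
    """Show a color word in some ink and time how long naming the ink takes."""
    word = random.choice(list(INKS))
    ink = word if congruent else random.choice([c for c in INKS if c != word])
    start = time.monotonic()
    answer = input(f"Name the INK color: {INKS[ink]}{word.upper()}{RESET} > ")
    elapsed = time.monotonic() - start
    print("correct" if answer.strip().lower() == ink else f"wrong (ink was {ink})")
    return elapsed

if __name__ == "__main__":
    # Interleave congruent and incongruent trials. Incongruent trials
    # typically take longer; that slowdown is the interference, or "urge."
    conditions = [True, False] * 3
    random.shuffle(conditions)
    times = {True: [], False: []}
    for congruent in conditions:
        times[congruent].append(stroop_trial(congruent))
    for congruent, label in [(True, "congruent"), (False, "incongruent")]:
        mean = sum(times[congruent]) / len(times[congruent])
        print(f"mean {label} response time: {mean:.2f}s")
```

For most people, the incongruent trials come out measurably slower – the word gets read automatically even when reading it is not the task.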
A theory of morality put forth by Jonathan Haidt suggests that morals "blind" people to alternative viewpoints such that even considering the other side's opinions is taboo. With that theory in mind, I thought that moral aversion might be a social cause of unproductive urges similar to urges experienced in the Stroop task. That is, just as people in the Stroop task feel the impulse to incorrectly read the word, I thought that strong moral beliefs might cause people to feel impulses to make decisions that maximize their distance from people they believe have different morals.
How the test worked
Here's how I tested it:
I first had people do several Stroop trials to make them aware of what that urge to make an error feels like.
Next, I asked people six fairly trivial consumer choice questions, such as preference for car color (forest green vs. silver) or vacuum brand (Hoover vs. Dirt Devil).
Here's the twist: After answering each question, participants were told how a majority of other participants answered the same question. The identity of this majority group was random. It could be either a group that everyone belonged to (for example, Americans) or a more politically charged group (for example, Trump supporters, Clinton supporters or white supremacists).
Finally, I showed participants the questions a second time and asked them simply to restate their previous answers. I also asked participants to rate their urge to change each answer – similar to the urge to make an error in the Stroop test. A simplified sketch of this three-step procedure follows.
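The flow of a single participant's session can be sketched in a few lines. This is a simplified stand-in, not the actual experimental software; the urge-rating scale and the exact wording of the prompts are assumptions on my part, while the items and group labels are the ones described above.

```python
import random

# Two of the trivial consumer-choice items described above (hypothetical wording).
QUESTIONS = [
    ("Which car color do you prefer?", ["forest green", "silver"]),
    ("Which vacuum brand do you prefer?", ["Hoover", "Dirt Devil"]),
]
# The identity of the majority group was assigned at random from groups like these.
GROUPS = ["Americans", "Trump supporters", "Clinton supporters", "white supremacists"]

def run_participant() -> None:
    first_answers = {}
    # Phase 1: initial choices, each followed by randomized majority feedback.
    for prompt, options in QUESTIONS:
        answer = input(f"{prompt} {options} > ").strip()
        first_answers[prompt] = answer
        group = random.choice(GROUPS)      # who the majority is: random
        majority = random.choice(options)  # what that majority supposedly chose
        print(f"A majority of {group} answered: {majority}")
    # Phase 2: simply restate the earlier answer, then rate the urge to change it.
    for prompt, _ in QUESTIONS:
        restated = input(f"{prompt} (restate your earlier answer) > ").strip()
        urge = input("Urge to change your answer, 1 (none) to 7 (strong)? > ")
        print(f"urge={urge}, changed={restated != first_answers[prompt]}")

if __name__ == "__main__":
    run_participant()
```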
This should have been straightforward.
Participants were not asked to evaluate the majority answer or reconsider their opinion in any way. Still, just like the interference felt in the Stroop task, knowing the majority response caused people to feel an urge to give the wrong answer.
When participants belonged to the majority group, they reported heightened urges to make an error when they had previously disagreed with the majority. Despite just being asked to repeat what they said a moment ago on a fairly trivial opinion question, they felt a conformist urge.
Similarly, when participants had strong moral distaste for the majority group, they reported heightened urges to make an error when they agreed with the group. In other words, participants' initial responses were now morally "tainted," and, even for these rather inconsequential questions, they felt an urge to abandon that response and distance themselves from their opponents. This urge made the trivial task of stating their opinion again slightly more difficult.
'Hive mind' and passive effects
As America is now more ideologically divided than at perhaps any other point in recent history, these results illuminate two things about the psychology behind political polarization.
First, people might think they are able to use their reasoning to decide whether, say, a minimum wage increase will have positive or negative consequences. However, moral impulses have likely already nudged people toward disagreeing with their opponents before any deliberative thinking on the issue has begun.
Second, the effects observed here are likely a passive process. Participants did not want to feel urges to make an error in the Stroop task, and they likely did not want to feel urges to contradict their own opinions in my studies. The urges just happen as a result of a morality-driven psychology.
These results suggest that efforts to bring those on the fringes closer to the middle will likely fall on deaf ears. A more optimistic interpretation is that polarization might have its roots in unintentional partisan urges. While there is no shortage of moral issues that divide us, polarization does not necessarily result from the malice of those involved.
Randy Stein, Assistant Professor of Marketing, California State Polytechnic University, Pomona
This article was originally published on The Conversation. Read the original article.