I wrote this over the span of three days, finishing this morning in a Starbucks in Palo Alto. The inspiration for this essay comes from many hours of contemplating a single question: how do intelligent and rational people commit and justify evil actions? Enjoy.
It is indeed probable that more harm and misery have been caused by men determined to use coercion to stamp out a moral evil than by men intent on doing evil. – Friedrich Hayek, The Constitution of Liberty, 1960
How do you get thousands and thousands of people to justify the total extinction of another race of people? How do you have a country in the 20th century that continues to suppress the rights of women to the point of death for adultery? How do you get a nation to rally around a government that seeks to occupy and exploit an entire population of people? Better yet, how do you get a person or a group to be complicit in evil actions?
If one is to understand morality and moral progress, it is paramount to cultivate an understanding of moral regression as well. It is simple to say that honesty and compassion lead to a virtuous life, but honesty and compassion are often trumped by retribution and justice – and often for seemingly justified reasons. Let us not forget that what is ‘justice’ to the perpetrator is often ‘evil’ to the victim. What was ‘just’ to numerous Germans was gratuitous evil to millions of Jews – yet both thought they were on the right side of virtue.
Evil comes in all shapes and sizes. Throughout history, groups have been labeled ‘evil’ – the Nazis of Germany, the Hutus of the Rwandan genocide. Even serial killers and psychopaths have been handed the nomenclature of evil. But evil and moral regression are not reserved for psychopaths and Nazis. All humans have the propensity, under certain circumstances, to follow a herd toward evil acts. In fact, it is quite easy for normal, rational people to get sucked into the vortex of moral regression. In this brief essay, I try to answer several questions that center on how people come to the point of justifying evil – what I call the “machinery of evil” that tempts all of us. Lastly, I will show how we can understand someone who commits evil, and how society transcends moral regression.
How do people come to a point where they justify such evil?
Have you ever wondered how so many Nazi soldiers could have been complicit in such evil? One who comes to mind is Joseph Mengele, a physician and member of the team of doctors who performed experiments and decided who was destined for the gas chamber. The expedient answer is to place Mengele and all of Hitler’s henchmen in the category of psychopathic monsters who lack the ability to feel. But psychopathic they are not. In his book The Nazi Doctors, Robert Jay Lifton notes Mengele’s humanity. Mengele’s many-sidedness in Auschwitz was both part of his legend and a source of his desacralization. In the camp he could be a visionary ideologue, an efficiently murderous functionary, a “scientist” and even a “professor,” an innovator in several areas, diligent, and above all a physician who became a murderer. He reveals himself as a man and not a demon, a man whose multifaceted harmony with Auschwitz gives us insight into – and makes us pause before – the human capacity to convert healing into killing. Mengele was one of many men of Hitler’s regime who committed vicious crimes against humanity while also existing as loving husbands and fathers, possessed of love, empathy, and affection. They fell prey to a monstrous ideology that could have sucked in any of us, had we been in their shoes. To shed light on the human capacity to fall prey to monstrous ideology, we need to turn to empirical data.
Yale University social psychologist Stanley Milgram observed the range of moral flexibility in his famous “shock” experiments in the 1960s. How was it possible for educated, intelligent, and cultured human beings to commit mass murder? Milgram assigned his subjects to the role of “teacher” in what was purported to be research on the effects of punishment on learning. The protocol called for the subject to read a list of paired words to the “learner” (who was, in reality, a shill working for Milgram), then present the first word of each pair again, upon which the learner was to recall the second word. Each time the learner was incorrect, the teacher was to deliver an electric shock from a box with toggle switches in 15-volt increments, ranging from 15 volts all the way to 450 volts and featuring such labels as Slight Shock, Moderate Shock, Strong Shock, Very Strong Shock, Intense Shock, Extreme Intensity Shock, and Danger: Severe Shock, XXX.
Who do you think was most likely to go all the way and shock the learner to the extreme? Surprisingly – and counterintuitively – age, gender, occupation, and personality characteristics mattered little to the outcome. Similar levels of punishment were delivered by the young and the old, by males and females, and by blue-collar and white-collar workers alike. What mattered most were physical proximity and group pressure. The closer the learner was to the teacher, the weaker the shock the latter delivered. And when Milgram added confederates to encourage the teacher to administer ever more powerful shocks, most teachers complied; but when the confederates themselves rebelled against the authority figure’s instructions, the teacher was likewise disinclined to obey. Nevertheless, 100 percent of Milgram’s subjects delivered at least a “strong shock” of 135 volts.
In 2010, Dateline NBC aired a replication of Milgram’s experiment in which Michael Shermer and host Chris Hansen tested six subjects. An actor played the “learner,” and an authority figure applied pressure to the six naive subjects who controlled the switch delivering the shock. The results were astonishing: five of the six administered shocks, and three went all the way to the maximal electrical evil of 450 volts.
What this experiment tests is obedience to authority – and which factors increase or decrease the “teacher’s” likelihood of shocking the subject to the point of severe pain. As noted above, what mattered most were physical proximity and group pressure. The same pattern held for the Nazi soldiers: those who shot Jews at point-blank range displayed visceral revulsion, yet when complicity was more tacit – simply herding individuals into the shower (really a gas chamber) and then turning on the gas from a distance – defection was minimal. A sense of responsibility is further diminished when one is simply acting on orders.
Milgram’s interpretation of his data included what he called the “agentic state”: the condition a person is in when he sees himself as an agent carrying out another person’s wishes, and therefore no longer sees himself as responsible for his actions.
Subjects who are told that they are playing a role in an experiment are stuck in a no-man’s-land somewhere between an authority figure, in the form of a white-coated scientist, and a stooge, in the form of a defenseless learner in another room. They undergo a mental shift from being moral agents who make their own decisions (the autonomous state) to the ambiguous and susceptible state of being an intermediary in a hierarchy and therefore prone to unqualified obedience (the agentic state).
It is psychologically easy to ignore responsibility when one is only an intermediate link in a chain of evil action, far from its final consequences. Even Adolf Eichmann was repulsed when he first witnessed a camp killing, but to implement the Holocaust he had only to push papers across a desk. The SS guards who actually did the deed justified their behavior by claiming they were only acting on orders. Depending on the circumstances, perhaps any of us could become Nazis.
Milgram broke the process into two stages: “First, there is a set of ‘binding factors’ that lock the subject into the situation. They include such factors as politeness on his part, his desire to uphold his initial promise of aid to the experimenter, and the awkwardness of withdrawal. Second, a number of adjustments in the subject’s thinking occur that undermine his resolve to break with the authority. The adjustments help the subject maintain his relationship with the experimenter, while at the same time reducing the strain brought about by the experimental conflict.”
As Milgram later reflected, “I am forever astonished that when lecturing on the obedience experiments in colleges across the country, I faced young men who were aghast at the behavior of experimental subjects and proclaimed they would never behave in such a way, but who, in a matter of months, were brought into the military and performed without compunction actions that made shocking the victim seem pallid.” In reality, there are no bad apples, just bad barrels. In the 1960s – the heyday of belief in the blank slate, taken to mean that human behavior is almost infinitely malleable – Milgram’s data seemed to confirm the idea that degenerate acts are primarily the result of degenerate environments. Milgram believed that almost anyone placed in this agentic state could be pulled into evil one step at a time – in this case 15 volts at a time – until they were so far down the path there was no turning back.
The Machinery of Evil
Milgram’s studies raise the question: Do we have the same propensity to commit or participate in evil acts like the Holocaust? His results reveal that, given the right ingredients, creating a machinery of evil is possible. I believe there is a machinery of evil that supplies the building blocks for destructive behavior, and it lures us all – from gossip that ruins someone’s reputation to the justification of xenophobic and tribal actions. With the right ingredients we can easily diffuse responsibility and justify our actions. What exactly are those ingredients?
Deindividuation is the first ingredient in the machinery of evil. It involves taking people out of their normal circles, dressing them up in uniforms, and/or insisting that they be team players. The French sociologist Gustave Le Bon called this the “group mind,” in which people are manipulated through anonymity, contagion, and suggestibility. We see it today in religious (parochial) schools, the U.S. military, and even gangs. Once the process of deindividuation begins, the individual loses a sense of ‘self’ and starts to merge his or her identity with the group.
Dehumanization, the next ingredient, disavows the humanity of another person or group, either symbolically, through discriminatory language and objectifying imagery, or physically, through captivity, slavery, the infliction of bodily harm, systematic humiliation, and so on. Thus prisoners might have their heads shaved or be marked with a symbol (as in the concentration camps), or be labeled in harsh terms like cockroaches or rats (as Rwandan Tutsis and Jews were).
Most people are justifiably dumbfounded as to how and why so many Nazi soldiers could so easily comply with Hitler’s murderous regime. Compliance, the next ingredient, was laid bare at the Nuremberg trials, which revealed the psychology behind justifying the atrocities. Adolf Eichmann, one of the chief orchestrators of the Final Solution, offered the same defense when he was later tried in Jerusalem as his fellow Nazis had at Nuremberg: that he was innocent by virtue of the fact that he was merely following orders. Befehl ist Befehl – orders are orders – is now known as the Nuremberg defense, and it’s an excuse that seems particularly feeble in a case like Eichmann’s. This defense may sound like a cop-out, but a 1966 study by Charles Hofling buttresses the psychology of complying with “legitimate powers.”
The psychiatrist Charles Hofling arranged to have an unknown physician contact nurses by phone and order them to administer 20mg of a nonexistent drug called “Astrofen” to one of his patients. Not only was the drug fictional, it was not on the approved list of drugs, and the bottle was clearly labeled to indicate that 10mg was the maximum daily dose. Preexperimental surveys showed that when given this scenario as a purely hypothetical one, virtually all nurses and nursing students confidently asserted that they would refuse to obey the order. Yet when Hofling actually ran the experiment, twenty-one out of twenty-two nurses complied with the doctor’s orders, even though they knew it was wrong. Subsequent studies support this disturbing finding. For example, a 1995 survey of nurses revealed that nearly half admitted to having at some time in their careers “carried out a physician’s order that you felt could have had harmful consequences to the patient,” citing the “legitimate power” that physicians hold over them as the reason for their compliance. Recall Milgram’s shock experiment: when a person in a white lab coat appears to be a legitimate authority, our brain’s fear response kicks in at the thought of breaking ranks. We simply don’t want to break ranks when it is so much easier to comply.
Identification is the next ingredient. Within an ‘us vs. them’ mentality, each group identifies with its own. Known as “identification-based followership,” this deep sense of group identity is found in groups as varied as the KKK, militant Sunnis, and even Republicans and Democrats. Going back to Milgram’s experiment, the subjects’ willingness to shock (or not shock) learners at the bidding of an authority is better explained by whom they identify with: the experimenter and the scientific community he represents, or the learner and the general community he represents. At first, subjects identify with the experimenter and his worthy scientific program, but at 150 volts their identification begins to shift to the learner, who cries out, “Ugh!! That’s all!”
In addition to identification, the ingredient of conformity – through tribalism and loyalty – deepens an individual’s sense of moral confirmation. In a fascinating study by Gregory Berns at Emory University, subjects were asked to match rotated images of three-dimensional objects to a standard comparison object. Subjects were put into groups of four, but unbeknownst to them the other three were confederates who would intentionally select an obviously wrong answer. On average, subjects conformed to the group’s wrong answer forty-one percent of the time, and when they did, the areas of their cortex related to vision and spatial awareness became active. But when they broke ranks with the group, their right amygdala and right caudate nucleus lit up – areas associated with negative emotions. In other words, nonconformity can be an emotionally traumatic experience, which is why most of us don’t like to break ranks with our social norms.
The last ingredient in the “machinery of evil” is more subtle than the rest. We are easily susceptible to what social psychologists call pluralistic ignorance. Before defining it, it is best to show how Hans Christian Andersen’s story The Emperor’s New Clothes illustrates it. In the tale, fraudsters sew the emperor a suit made of air but persuade his court that only the “unworthy” cannot see it. Although the emperor and his ministers harbor private doubts, everyone fawns over the invisible threads to prove his own sophistication. The emperor dances down the street. His subjects – confused, unnerved, suspecting that everybody else “sees” a garment – enthusiastically play along. “How fine!” they shout. “What a perfect fit!” It takes a child, too young to understand this pageantry of decorum, to point out that the emperor is marching down the street buck naked.
This is pluralistic ignorance, technically defined as “the psychological state characterized by the belief that one’s private attitudes and judgments are different from those of others, even though one’s public behavior is identical.” More simply: it’s many people collectively praising a king’s robes while privately seeing only a naked monarch. It’s peer pressure dipped in irony. The evidence is strewn throughout history. Consider how many Nazi soldiers privately thought Hitler’s policies barbaric, yet believed their misgivings were theirs alone. The same was true of early-twentieth-century Jim Crow laws, which many supported publicly while deploring them privately. These examples suggest, rather frighteningly, that a group of people can engage in systematic bigotry even when a majority of them are not actually bigots. That’s not groupthink, where people think and act the same way when they get together. Rather, it’s group ignorance: people thinking one thing and doing another, because they are deluded about the majority’s real views and conform to that delusion. I’ll end this section with a quote from the Russian novelist Fyodor Dostoevsky: “Every man has reminiscences which he would not tell to everyone, but only to his friends. He has other matters in his mind which he would not reveal even to his friends, but only to himself, and that in secret. But there are other things which a man is afraid to tell even to himself, and every decent man has a number of such things stored away in his mind.”
Moral Regression: A Public Health Model
Moral regression is not about bad apples; it’s about bad barrels. I have used Hitler’s regime and its functionaries to illustrate, through scientific studies, the processes behind evil actions. All of us, given the right environment and genetic predispositions [both nature and nurture], can easily succumb to the most evil of actions.
I will conclude with a way to think about the difficult questions I have posed. One approach is to compare the medical model of disease to the public health model of disease. The medical model of evil treats it as if the locus of the contagion were inside each individual patient. In Western religion, sin is inside the individual; in the law, criminality is in the individual. The medical model demands that each infected person be treated, one by one, until no one shows further symptoms. Thus, evil is in their nature, and to eradicate the plague of evil we simply have to eliminate those with evil dispositions.
This paradigm served as the basis of the Inquisition, in which women were sentenced to be boiled in oil for such crimes as “sleeping with the devil.” Did this put a dent in evil? Hardly. What the witch hunts did was spread evil, in the form of barbaric, systemic violence against women, throughout much of Europe and North America for centuries.
By contrast, the public health model of evil accepts as given that we affect and infect one another: individuals are simply part of a larger disease vector that includes the many additional social psychological factors identified over the past century, factors any theoretical model must include to explain the often puzzling world of moral psychology. Our societal disease vector includes deindividuation, dehumanization, compliance, identification, conformity, tribalism, and pluralistic ignorance. These are some of the forces that lure rational, normal people toward moral regression.
We are unlikely ever to admit that we are regressing morally; rather, we justify the regression with some virtuous title. As Friedrich Hayek said, “It is indeed probable that more harm and misery have been caused by men determined to use coercion to stamp out a moral evil than by men intent on doing evil.”