In a 1962 letter, as a last-ditch effort for clemency, Holocaust organizer Adolf Eichmann wrote that he and other low-level officers were “forced to serve as mere instruments,” shifting the responsibility for the deaths of millions of Jews to his superiors. The “just following orders” defense, made famous in the post-WWII Nuremberg trials, featured heavily in Eichmann’s court hearings.
But that same year Stanley Milgram, a Yale University psychologist, conducted a series of famous experiments that tested whether “ordinary” folks would inflict harm on another person after following orders from an authoritative figure. Shockingly, the results suggested any human was capable of a heart of darkness.
Milgram’s research tackled whether a person could be coerced into behaving heinously, but new research released Thursday offers one explanation as to why.
“In particular, acting under orders caused participants to perceive a distance from outcomes that they themselves caused,” said study co-author Patrick Haggard, a cognitive neuroscientist at University College London, in an email.
In other words, people actually feel disconnected from their actions when they comply with orders, even though they’re the ones committing the act.
The study, published in the journal Current Biology, described this distance as people experiencing their actions more as “passive movements than fully voluntary actions” when they follow orders.
Researchers at University College London and the Université libre de Bruxelles in Belgium arrived at this conclusion by investigating how coercion could change a person’s “sense of agency,” a psychological phenomenon referring to the awareness that one’s own actions cause some external outcome.
More simply, Haggard described the phenomenon as flipping a switch (the action) to turn on a light (the external outcome). The interval between the action and its outcome is typically experienced as nearly simultaneous. Through two experiments, however, Haggard and his colleagues showed that people perceived a longer lapse between action and outcome when they acted under orders, even if the outcome was unpleasant. It’s as if you flip the switch, but it takes a beat or two for the light to come on.
“This [disconnect] suggests a reduced sense of agency, as if the participants’ actions under coercion began to feel more passive,” Haggard said.
Unlike Milgram’s classic research, Haggard’s experiments introduced an element missing from the original 1960s studies: actual shocks. Haggard said they used “moderately painful, but tolerable, shocks.” Milgram only pretended to deliver shocks of up to 450 volts.
In this test, the “agent” can shock or take money from the “victim,” either acting on orders or by their own choice. Image courtesy of Caspar et al., Current Biology (2016)
In Milgram’s experiments, 65 percent of his volunteers, described as “teachers,” were willing (sometimes reluctantly) to press a button that they believed delivered shocks of up to 450 volts to an unseen person, a “learner” in another room. Although pleas from the unknown person could be heard, including mentions of a heart condition, Milgram reported that his volunteers continued to shock the “learner” when ordered to do so. At no point, however, did anyone actually receive an electric shock.
“Milgram’s studies rested on a deception: Participants were instructed to administer ‘severe shocks’ to an actor, who in fact merely feigned being shocked,” Haggard said. “It’s difficult to ascertain whether participants are really deceived or not in such situations.”
After Yale received reams of Milgram’s documents in the 2000s, other psychologists who sifted through the notes more closely began to criticize the famous electric-shock study.
Gina Perry, author of “Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments,” found a litany of methodological problems with the study. Perry said Milgram’s experiments were far less controlled than originally thought and introduced variables that appeared to goose the numbers.
Haggard said his team’s study was more transparent. In the first experiment, participants — an “agent” and a “victim” — took turns delivering mild shocks or inflicting a financial penalty on each other. In some cases, a third person — an “experimenter” — sat in the room and gave orders on whether to inflict harm. In other cases, the experimenter looked away, and the agent acted of their own volition.
The result? Researchers measured a “small, but significant” increase in the perceived time between a person’s action and outcome when coercion was involved. That is, when people act “under orders,” they seem to experience less agency over their actions and outcomes than when they choose for themselves, Haggard said.
In a second experiment, the team explored whether the loss of agency could also be seen in the brain activity of subjects. Prior work had found that brain activity is dampened when people are forced to follow orders.
As before, subjects had to decide whether to shock a person, either under coercion or of their own volition, but this time they heard an audible tone while making the choice. The tone elicited a brain response that could be measured with an electroencephalogram (EEG) cap.
Haggard’s team found that brain activity in response to this tone was indeed dampened when participants acted under coercion. The team also used a questionnaire in the second experiment to collect explicit judgments from the volunteers, who reported that they felt less responsible when they acted under orders.
Haggard said his team’s findings do not legitimize the Nuremberg defense, and that anyone who claims they were “just following orders” ought to be viewed with skepticism.
But, “our study does suggest that this claim might potentially correspond to the basic experience that the person had of their action at the time,” Haggard said.
“If people acting under orders really do feel reduced responsibility, this seems important to understand. For a start, people who give orders should perhaps be held more responsible for the actions and outcomes of those they coerce,” he said.
H/T: PBS