Testing the Limits of Human Obedience

01.25.2016, by Sophie Richardot
Fifteen? Sixty? Four hundred volts? How strong an electric shock would you have administered to an innocent test subject if you had been ordered to? Social psychologist Sophie Richardot looks back at Stanley Milgram's famous experiment, the subject of the film "Experimenter."

How far are we capable of obeying? That is the question Stanley Milgram tried to answer in his famous experiments on obedience to authority, conducted at Yale University in the early 1960s. More precisely, the young social psychologist wanted to understand how ordinary individuals had been able to take part in the extermination of the Jews of Europe during World War II.

The learner yelled at 120 volts and stopped responding at 300 volts...

The basic experiment1 was the following: two people2 were invited into a laboratory, a volunteer subject and a confederate of Milgram's (presented as another subject). They were there to take part in a supposed experiment on learning and memory that placed a "teacher" opposite a "learner" in the presence of the experimenter. Rigged lots were drawn to assign the roles, always giving the role of "teacher" to the volunteer and that of "learner" to Milgram's confederate. The learner was then strapped into a chair and electrodes were attached to his wrists, while the "teacher" was installed in another room in front of a control panel with 30 switches, ranging from 15 to 450 volts. The "teacher" (the only genuinely naïve subject of the experiment) was to ask the learner memory questions and to inflict an increasingly intense electric shock after each error.3 The reactions of the learner (pre-recorded and fictional) were perfectly audible: he groaned at 75 volts, yelled at 120 volts, refused to continue at 150 volts, and wailed and then stopped responding at 300 volts. Each time the subject-teacher hesitated to press the punishment button, the experimenter told him to continue; if he still refused after the fourth prompting, the experiment was stopped. The results surprised Milgram himself: 62.5% of the participants continued until the end of the experiment and inflicted the maximum shock of 450 volts.
 

Experimenter
Stanley Milgram (played here by Peter Sarsgaard) in front of the machine intended to send electric shocks to a test subject during his famous experiment on obedience to authority figures.

What role did obedience to authority play during the Holocaust?

From his very first article on this experiment, Milgram asserted that the genocidal policy of the Shoah could not have been carried out without a large number of people consenting to obey their superiors. Yet the notion that those who took part in the Shoah were merely obeying orders, as they themselves claimed at their trials, stirred controversy in the early 2000s.
 

According to Goldhagen, explanations for obedience to authority (...) deny the importance of ideology and of the notion of the victim.

For the political scientist Daniel Goldhagen in particular, "sociopsychological" explanations in terms of obedience to authority or pressure to conform do not hold, primarily because they deny the importance of ideology and of the notion of the victim. These explanations also "do not conceive of the actors as human agents, as people with wills, but as beings moved solely by external forces or by transhistorical and invariant psychological propensities..."4 The historian Christopher Browning,5 on the contrary, refers to Milgram when analyzing the "motivations" of "ordinary" Germans who became killers—capable of shooting their victims—during the "Holocaust by bullets." He shows that obedience to authority figures sometimes played a decisive role in the unfolding of these murderous operations, alongside other factors such as antisemitic propaganda and the dehumanization of the victims.

In Milgram's experiment, the participants were faced with people they considered ordinary human beings. They did not have the impression of confronting beings stripped of their humanity (dehumanized, notably, in the wake of propaganda campaigns). The experiment therefore also offers keys for understanding what led men who recognized the humanity of their victims to take part nonetheless in a murderous undertaking aimed at wiping them out. This research also makes it possible to grasp how obedient individuals can become willing executioners,6 and it remains an invaluable resource for reflecting on the process of murderous conversion.

Why push subordinates to commit atrocities rather than force them?

Let us also dispel a potential misunderstanding: obedience, as Milgram conceived it, is not understood in its most common sense—an action carried out against one's will, automatically and under coercion—but much more subtly and, in certain respects, paradoxically, as being made to want to do what is asked of us. It is a form of consenting submission more than of pure coercion. Moreover, as we showed in an article published in 2014 in the journal Aggression and Violent Behavior,7 political, military, and police authorities are not mistaken on this point. When pushing their subordinates to commit atrocities, they generally prefer to lead them there gradually and, in a sense, to manipulate them rather than exert outright coercion, which is deemed less effective. They often formulate their orders in a more or less explicit way so as to give subordinates the feeling that they freely did what was expected of them, and to allow the superiors who nonetheless gave the orders to avoid assuming responsibility. More precisely, our study showed that in a democratic context, those who give orders tend to prefer vague, ambiguous, or implicit orders, which preserve an appearance of acceptability and legality, whereas in a dictatorial context they tend to favor explicit orders, while seeking to lessen their psychological impact on subordinates (orders leaving a choice, coded or fragmentary orders).

Experimenter
On the other side of the room, behind a one-way mirror, the test subject is actually a confederate who receives no shock at all and plays a recording of screams of pain.

When the experimenter looks away, people obey significantly less

To return to Milgram's work on obedience to authority, it is important to note that it is not limited to this single experiment and its results. Much more important is the researcher's demonstration, through a series of experimental variations, that obedience to authority fluctuates considerably according to the characteristics of the situation in which the subject is placed. For example, people obeyed even more when they could not hear the victim cry out, and much less when they were no longer under the experimenter's gaze. Such behavior is therefore not confined to a few isolated sadists, as we would like to believe. The perception of the situation in which one acts can clearly lead people, far more than we suspect, to act according to what is expected of them.

What do these mini acts of resistance reveal? A genuine desire to resist orders or a way to save face in one's own eyes? 

Furthermore, obedience and disobedience are not two strictly opposed phenomena. A recent article8 reveals that both the subjects who carried on with the experiment until the end and those who quit before show more or less explicit forms of behavioral resistance: through their silences, hesitations, and expressions of disapproval, and even their attempts to end the experiment, they express the extent to which they disagree with what is taking place. The film Experimenter, which skillfully weaves Milgram's life together with the history of his experiments, dramatizes these forms of resistance very well. The participants express their opposition to the experimenter in multiple ways... but they continue all the same. In certain cases we even see a genuine phenomenon of disjunction, with people seeming to disapprove of what they continue to do. Yet what do these mini acts of resistance reveal? A genuine desire to evade the demands of an authority figure? Or a way of saving face in one's own eyes, of finding a way to bear a trying obedience?

The higher the rate of obedience, the more people tend to challenge the experiment's "morality"

Milgram's experiment was replicated in numerous countries during the 1960s, 1970s, and 1980s, with more or less the same results.9 It is important to note, however, that it caused a scandal at the time: did experimentation justify deceiving subjects, and above all subjecting them to such intense stress? Was there not a risk of lasting effects? Could one play with people's emotions in this way? Many similar studies had been conducted during this period without prompting such criticism. Milgram used post-experiment follow-ups to show that the subjects were doing well, and had even said they were glad to have participated in the research, but in vain: the controversy continued. Why, then, did this experiment spark so much criticism? Was it not rather because its results are profoundly disturbing? A study conducted during the 1970s reflects this view: it shows that the higher the reported rate of obedience, the more inclined people are to judge Milgram's experiment immoral and threatening to the dignity and well-being of the participants.10
 

Experimenter
Stanley Milgram and Sasha Milgram (Winona Ryder in the film) visiting Solomon Asch and his wife. This pioneer of social psychology was one of Milgram's mentors at Harvard.

Despite the controversy, the experiment was replicated in the US in 2009

The polemics surrounding this question were so heated, and the ethical rules governing experimental research had become so strict, at least in the US, that from the 1980s onward a new replication appeared to be out of the question. It was therefore with surprise that, in 2009, the scientific community discovered in the journal American Psychologist a replication of Milgram's experiment conducted in the US.11

The experiment was adjusted: participants were told (...) that they were free to leave, and the maximum level of the (fake) electric shocks was lowered to 150 volts.

How did such a replication come about? Firstly, the context was propitious: the country was at war in Iraq, and the scandal that followed revelations that US soldiers had perpetrated acts of torture, notably at Abu Ghraib prison, had much to do with it. Torture ceased to be regarded as a relic of the past or as something foreign to democracies; it appeared instead as a potentiality inherent in the exercise of state power. The question of the conditions that can lead to destructive obedience once again became central to public debate. Secondly, the experiment was adjusted to make it more morally acceptable: the participants were carefully screened, they were reminded on multiple occasions that they were free to leave at any point, and the maximum level of the (fake) electric shocks was lowered to 150 volts (as opposed to Milgram's 450 volts); moreover, to spare the subjects too much stress after the experiment, they were quickly informed that the learner had not received any shocks.

In this new version of the experiment, 70% of participants obeyed the experimenter and were prepared to continue beyond 150 volts had they not been stopped, a figure nevertheless slightly lower than Milgram's at the same voltage (82.5%). These results raise questions. One may ask whether this is still an experiment on obedience. The insistence on notifying participants of their right to withdraw casts doubt on how a decision to stop should be interpreted. If the subject knows that he can leave the experiment whenever he chooses, how should we interpret his decision to end it despite the experimenter's promptings to continue? As a genuine act of disobedience to an authority figure, or as the simple exercise of a right to withdraw?
 

Experimenter
Stanley Milgram (pictured here in the film mingling with a crowd on a train platform) also conducted many experiments on conformity in groups.

Ready for prime time?

In France, an adaptation of Milgram's experiment to a fake television game show (La Zone Xtrême) was conducted in 2009.12 Its goal was to understand how far the power of television could go. The lab-coated experimenter was thus replaced by a television host and, in a notable difference, a studio audience was included in the set-up. The participants (who asked questions of a player, a confederate of the show) were told that they could withdraw from the game. However, the modifications made this right very difficult to exercise: the individuals faced a greater number of orders and were acting under the gaze of an audience that urged them to continue. This French version thus established a more constraining framework than the recent American replication, although interpreting its results, which are higher than Milgram's (81% continued up to 450 volts as opposed to 62.5%), is no easier. Should this be seen as a sign that our contemporaries are more inclined to obey authority? Certainly not. The game implied that the winner would take home a (fictitious) cash prize if he or she got to the end, and the audience encouraged the candidate to do so (shouting "cash! cash!"). It is therefore probable that some highly competitive people decided to continue the game, by whatever means, in order to win.

Are we more inclined today than in the past to disobey immoral orders from a legitimate authority (science, television, school, business, etc.)? This is doubtful, given how little these institutions do to promote acts of reasoned disobedience. Are members of the military truly encouraged to disobey an illegal order given in the field of operations, even though they have the legal capacity to do so? Do schools teach us to disobey wisely any more than before? Do we value constructive disobedience in the workplace and elsewhere? Are we always grateful to whistleblowers for speaking up? Far from it. Bank managers, civil servants, and professionals who decided to denounce the sometimes dangerous fraudulent practices they had witnessed have often seen their lives turned upside down: harassment, sidelining, dismissal, lawsuits, isolation.13

Agreeing without reservation, offering support without discussion, and obeying legitimate authorities unconditionally are still often taken for granted. That is probably not a good omen for our democracies.

The analysis, views and opinions expressed in this section are those of the authors and do not necessarily reflect the position or policies of the CNRS.

Footnotes
  • 1. Stanley Milgram, Obedience to Authority: An Experimental View (London: Tavistock, 1974).
  • 2. Men. Only one experiment was conducted with women, and it yielded the same result (65% obedience).
  • 3. The subject-teacher did not know that the learner-confederate was not receiving any electric shocks.
  • 4. S. Richardot, “L’apport de la psychologie sociale à la question de l’obéissance.” in A. Loez, N. Mariot (Eds.), Obéir / désobéir, les mutineries de 1917 en perspectives (Paris: La Découverte, 2008).
  • 5. C. Browning, Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland (reissue) (New York: HarperPerennial, 1998).
  • 6. S. Richardot, “L’apport de la psychologie sociale à la question de l’obéissance,” in A. Loez, N. Mariot (Eds.), Obéir / désobéir, les mutineries de 1917 en perspectives (Paris: La Découverte, 2008).
  • 7. S. Richardot, “'You know what to do with them': The Formulation of Orders and Engagement in War Crimes,” Aggression and Violent Behavior, 2014. 19(2): 83–90.
  • 8. M. M. Hollander, “The repertoire of resistance: Non-compliance with directives in Milgram’s ‘obedience’ experiments,” British Journal of Social Psychology, 2015. 54(3): 425-44.
  • 9. The basic experiment was reproduced 17 times in numerous countries (Italy, South Africa, West Germany, Jordan, Austria, and Spain).
  • 10. B. R. Schlenker and D. R. Forsyth, “On the ethics of psychological research,” Journal of Experimental Social Psychology, 1977. 13(4): 369–96.
  • 11. J. M. Burger, "Replicating Milgram: Would people still obey today?" American Psychologist, 2009. 64(1): 1–11.
  • 12. M. Eltchaninoff and C. Nick, L’expérience extrême (Paris: Don Quichotte éd., 2010).
  • 13. https://www.mediapart.fr/journal/france/080713/lanceur-dalerte-recherche...
