In 1961 and 1962, a series of experiments was carried out at Yale University. Volunteers were paid a small sum to participate in what they understood to be 'a study of memory and learning'. In most of the experiments, a white-coated experimenter took charge of two volunteers, one of whom was given the role of 'teacher' and the other that of 'learner'. The learner, who was strapped into a chair, was told he had to remember lists of word pairs; whenever he failed to recall one, the teacher was instructed to give him a small electric shock. With each incorrect answer the voltage rose, and the teacher was forced to watch as the learner moved from small grunts of discomfort to screams of agony.
What the teacher didn't know was that no current actually ran between his control box and the learner's chair, and that the 'learner' was in fact an actor who was only pretending to receive painful shocks. The real focus of the experiment was not the 'victim' but the reactions of the teacher pressing the buttons. How would the teachers cope with administering greater and greater pain to a defenseless human being?
The Milgram experiment is one of the most famous in psychology, written up in Stanley Milgram's 1974 book Obedience to Authority. Here we take a look at what actually happened and why the results are important.
Expectations and reality
If you are like most people, you would expect that at the first sign of genuine pain on the part of the person being shocked, you would want the experiment halted. After all, it is only an experiment. This is the response Milgram got when, separately from the actual experiments, he surveyed a range of people (psychiatrists, graduate students, psychology academics, middle-class adults) on how they believed the subjects would react in these circumstances. Most predicted that the subjects would not give shocks beyond the point where the other subject asked to be freed. These expectations were entirely in line with Milgram's own. But what actually happened?
Most subjects were very stressed by the experiment and protested to the experimenter that the person in the chair should not have to take any more. The logical next step would then be to demand that the experiment be terminated.
In reality, this rarely happened.
Despite their reservations, most people continued to follow the orders of the experimenter and to inflict progressively greater shocks. Indeed, as Milgram notes, “...a substantial proportion continue to the last shock on the generator”. This was so even when they could hear the cries of the other subject, and even when that person pleaded to be let out of the experiment.
How we cope with a bad conscience
Milgram’s experiments have caused controversy over the years; many people are simply unwilling to accept that normal human beings would act like this. Many scientists have tried to find holes in the methodology, but the experiment has been replicated around the world with similar outcomes. As Milgram notes, the results astonish people. They want to believe that the subjects who volunteered were sadistic monsters. However, he made sure the subjects covered a range of social classes and professions; they were 'normal' people put in unusual circumstances.
Why don't the subjects administering the 'shocks' feel guilty and simply opt out of the experiment? Milgram is careful to point out that most of his subjects knew that what they were doing was not right. They hated giving the shocks, especially when the victim was objecting to them. Yet even though they thought the experiment cruel or senseless, most were not able to extricate themselves from it. Instead they developed coping mechanisms to justify what they were doing. These included:
- Getting absorbed in the technical side of the experiment. People have a strong desire to be competent in their work. The experiment and its successful implementation became more important than the welfare of the people involved.
- Transferring moral responsibility for the experiment to its leader. This is the common “I was just following orders” defense found in any war crimes trial. The moral sense or conscience of the subject is not lost, but is transformed into a wish to please the boss or leader.
- Choosing to believe that their actions were necessary as part of a larger, worthy cause. Where in the past wars have been waged over religion or political ideology, in this case the cause was Science.
- Devaluing the person who is receiving the shocks: ‘if this person is dumb enough not to be able to remember the word pairs, they deserve to be punished’. Such impugning of intelligence or character is commonly used by tyrants to encourage followers to get rid of whole groups of people. They are not worth much, the thinking goes, so who really cares if they are eliminated? The world will be a better place.
Perhaps the most surprising of the above is Milgram's observation that the subject's sense of morality does not disappear but is reoriented, so that they feel duty and loyalty not to those they are harming but to the person giving the orders. The subject is not able to extricate themselves from the situation because – amazingly – it would be impolite to go against the wishes of the experimenter. The subject feels they have agreed to do the experiment, so pulling out would make them seem a promise-breaker.
The desire to please authority is seemingly more powerful than the moral force of the other volunteer's cries. When the subject does voice opposition to what is going on, he or she typically couches it in the most deferential terms – as Milgram described one subject: “He thinks he is killing someone, yet he uses the language of the tea table.”
From individual to 'agent'
Why are we like this? Milgram observed that the tendency of human beings to obey authority evolved for simple survival purposes. There had to be leaders and followers and hierarchies in order to get things done. Man is a communal animal, and does not want to rock the boat. Worse even than the bad conscience of harming others who are defenseless, it seems, is the fear of being isolated.
Most of us are taught from a very young age that it is wrong to hurt others needlessly, yet we spend the first twenty years of our lives being told what to do, so we get used to obeying authority. The experiments threw subjects right into the middle of this conflict. Should they 'be good' in the sense of not harming, or 'be good' in the sense of doing what they were told? Most subjects chose the latter – suggesting that our brains are hardwired to accept authority above almost all else.
The natural impulse not to harm others is dramatically altered when a person is put into a hierarchical structure. On our own we take full responsibility for what we do and consider ourselves autonomous, but once in a system or hierarchy we are more than willing to hand that responsibility over to someone else. We stop being ourselves, and instead become an 'agent' for someone or something else.
How it becomes easy to kill
Milgram was influenced by the story of Adolf Eichmann, whose job it was to engineer the deaths of six million Jews under Hitler. Hannah Arendt's book Eichmann in Jerusalem had argued that Eichmann was not really a psychopath, but an obedient bureaucrat whose distance from the actual death camps allowed him to order the atrocities in the name of some higher goal. Milgram's experiments confirmed the truth of Arendt's idea of the 'banality of evil' – that is, humans are not inherently cruel, but become so when cruelty is demanded by authority. This was the main lesson of his study: that “...ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process.”
Obedience to Authority can make for painful reading, especially the transcript of an interview with an American soldier who participated in the My Lai massacre in Vietnam. Milgram concluded that there was such a thing as inherent psychopathy, or 'evil', but that it was statistically uncommon. His alarm was more that the average person (his experiments included women too, who showed almost no difference in obedience from the men), if taken off the street and put into the right conditions, can do terrible things to other people – and not feel too bad about it.
This, Milgram notes, is the purpose of military training. The trainee soldier is put in an environment separate from normal society and its moral niceties and instead is made to think in terms of 'the enemy'. He or she is instilled with: a love of 'duty'; the belief that they are fighting for a great cause; and a tremendous fear of disobeying orders: “Although its ostensible purpose is to provide the recruit with military skills, its fundamental aim is to break down any residues of individuality and selfhood.” The trainee soldier is made to become an agent for a cause, rather than a freethinking individual, and herein lies his or her vulnerability to dreadful actions. Other people stop being human beings, and become 'collateral damage'.
Source: 50 Psychology Classics: Who We Are, How We Think, What We Do: Insight and inspiration from 50 key books (Nicholas Brealey Publishing).