There is a TV show called “What Would You Do?” that stages scenarios begging for intervention and records how, and whether, unsuspecting folks actually intervene. Some do and some don’t, but I’m not sure what the show “proves” beyond that. This post is not about that show but about how experiments show folks react in more structured situations, especially ones involving “authority.”
The impetus for this post is an episode of an old (1990s) TV series I recently saw on PBS. The series is called Discovering Psychology, and the episode is named “The Power of the Situation.” Although folks outside a situation often say they would “never” engage in certain behavior, and wonder how others could, the majority of people placed in certain situations will act in ways they never imagined.
Most folks are aware of one of the classic experiments in this area: the “electric shock” experiment. It was conducted by Yale psychologist Stanley Milgram in the early 1960s, shortly after the trial of Adolf Eichmann, who claimed in defense of his role in the Holocaust that he was only following orders. Just how far would folks go in the name of obedience to authority?
The subjects were asked to participate in an effort to improve other people’s memory. The “instructor” assured the subjects that administering increasingly strong electric shocks when folks answered a memory question incorrectly would promote memory improvement. The electric shock device was labeled both with voltage amounts and intensity levels, ending with “Danger: severe shock.” Of course, no actual electric shock was involved, but the “recipients” of the “shocks” acted as if they were in pain and begged for the experiment to be stopped.
Before the experiment, 14 psychologists predicted that most participants would not obey the “instructor’s” command to administer a shock beyond 150 volts (“strong”) and that only the “sadists” would go to the highest, potentially deadly, level of 450 volts. To the psychologists’ surprise, two-thirds of the participants obeyed the “instructor” and continued the “electric shocks” all the way to the highest level. (You can read more about this experiment here.)
In the early 1970s, the Stanford “prison” experiment showed what folks will do in a situation based on their “status.” Volunteers were psychologically tested and then randomly assigned roles as either a prisoner or a guard in a “prison” set up in a campus basement. The guards worked eight-hour shifts and then went about their normal lives. The prisoners remained in the “prison.”
It didn’t take long for the “guards” to step into character and treat the “prisoners” harshly, even when the testing showed a “guard” had pacifist leanings. And the “prisoners” became passive and docile, even when testing had shown them to be assertive. The “prisoners” became so stressed that the experiment, which had been scheduled to last two weeks, ended after just six days. When the “guards” were shown video of their behavior, they were surprised by their own actions.
These experiments show what can happen to an individual’s sense of “morality” when it is tested in a setting involving authority and group dynamics. The folks who feel morally superior to those on the TV show who fail to intervene in the staged scenario are, as the saying goes, engaging in Monday-morning quarterbacking.
Just as interesting was the “pilot vision” experiment. Air Force ROTC members were all given vision tests to measure their visual acuity. Then they were placed in a training flight simulator used by the Air Force, playing the role of a “spotter” reading markings on other aircraft that appeared in view. These “markings” were actually the same letters, at the same sizes, used in the vision test.
Half the “spotters” wore flight suits; the other half did not. Amazingly, 40% of the “spotters” in flight suits improved on their vision-test scores, while not one of the “spotters” without a flight suit showed improved vision. Simply “dressing for the role” produced a psychological impact with physical results.
I like to think that, since I reject authority and don’t care much for “social approval,” I am fairly resistant to the pressures social psychology describes. But don’t we all believe, “I’d never do something like that!”?
Maybe the radio show from the 1940s was right: Who knows what evil lurks in the hearts of men? The Shadow knows….
Here’s a link to the Discovering Psychology episode I mentioned, which includes footage from the “memory” and “prison” experiments.