In February 1999, four New York City police officers were on patrol in the Bronx when
they saw a young black man standing on a stoop. They thought he looked suspicious. When they
pulled over, he retreated into the doorway and began digging in his pocket. He kept digging
as the police shouted at him to show his hands; a few seconds later, the man, Amadou Diallo,
a 23-year-old immigrant from Guinea, was dead, hit by 19 of the 41 bullets that the
police fired at him. What Diallo was reaching for was his wallet. He was going for his ID
as he stood on the steps of his own apartment building.
Diallo's story, and the officers' fatal prejudgment of him, is recounted in Malcolm Gladwell's
2005 bestseller Blink. Gladwell, along with the social psychologists whose
work he draws on, explores Diallo's case as an example of that grey area between deliberate
violence and an accident, driven by non-conscious, or implicit, biases.
The officers did discriminate against Diallo, but the prejudice they acted on may have been
driven by something more subtle than simple hatred.
And that's an important thing to think about. Yes, there are lots of overtly bigoted people
and policies at work all over the world, but what we're interested in today is the more
insidious, non-conscious automatic bias, and how it can affect our behavior.
The fact is, our implicit biases affect the way we relate to others in a very real way.
Our race, gender, age, religion, or sexual orientation can make the difference between
whether we get a job or not, a fair paycheck, or a good rental, or whether we get randomly
pulled over or shot and killed for reaching for a wallet.
In the last two episodes, we've examined how we think about and how we influence one another,
but social psychology is also about how we relate to one another.
Like what factors might cause us to help another person, or harm them, or fear them? What are
the social, and cognitive, and emotional roots of prejudice, racism, and sexism, and how
do they shape our society? These are some of the aspects of ourselves that are the hardest
and most uncomfortable for us to explore, which is why they're so important to understand.
We've all been unfairly judged in our time, and let's not pretend that we haven't done
our fair share of uninformed judging too.
Like it or not, prejudice is a common human condition.
Prejudice just means "prejudgment." It's an unjustified, typically negative attitude toward
an individual or group. Prejudicial attitudes are often directed along
the lines of gender, ethnicity, socioeconomic status, or culture, and by definition, prejudice
is not the same thing as stereotyping or discrimination, although the three phenomena are intimately related.
People may distrust a female mechanic. That's a prejudicial attitude, but it's rooted in
a stereotype, or over-generalized belief about a particular group.
Although it's often discussed in a negative way, stereotyping is really more of a general
cognitive process that doesn't have to be negative. It can even be accurate at times.
Like, I have the stereotype that all crows have wings, injuries and birth defects aside.
And that happens to be true.
But on the negative end, your prejudice against female mechanics may be rooted in some inaccurate
stereotype about women's skills with a socket wrench.
And when stereotypical beliefs combine with prejudicial attitudes and emotions, like fear
and hostility, they can drive the behavior we call discrimination.
But a prejudiced person won't necessarily act on their attitude. Say you believe in the
stereotype that overweight people are lazy. You might then feel a prejudiced distaste
when you see someone who appears overweight.
But if you act on your prejudice, and, say, refuse to hire them for a job or don't let
them sit at your lunch counter, then you've crossed over into discriminating against them.
The former apartheid system of racial segregation in South Africa, the Nazis' mass killing of
Gypsies, Jewish people, and other groups, and centuries of bloodshed between Protestants
and Catholics, are all extreme examples of violent prejudice and discrimination.
The good news is that in many cultures, certain forms of overt prejudice have waned over time.
For example, in 1937 only one-third of Americans said that they'd vote for a qualified woman to
be president, while in 2007, that figure was up to nearly 90 percent.
But of course more subtle prejudices can still linger.
In the past, we've talked about dual-process theories of thought, memory, and attitudes,
and how, while we're aware of our explicit thoughts, our implicit cognition still operates
under the radar, leaving us clueless about its effect on our attitudes and behavior.
In the same way, prejudice can be non-conscious and automatic. And I mean it can be so non-conscious
that even when people ask us point-blank about our attitudes, we don't always give them
an honest answer, whether unwillingly or unknowingly.
Do you think that men are better at science than women? Or that Muslims are more violent
than Christians? Or that overweight people are unhealthy?
Our tendency to unwittingly doctor our answers to questions like these is why we have the
implicit association test, or IAT. The test was introduced in the late 1990s to try to
gauge implicit attitudes, identities, beliefs, and biases that people are unwilling or unable to report.
You can take the IAT online and measure your implicit attitudes on all kinds of topics,
from race, religion, and gender to disability, weight, and sexuality. It's basically a timed categorization task.
For example, the age-related IAT looks at implicit attitudes about older vs. younger
people. In it, you might be shown a series of faces, old and young, and objects, pleasant
and unpleasant, like pretty flowers vs. a pile of garbage.
You're then asked to sort these pictures, so you'd press the left key if you see a young
face or a pleasant object, and press the right key if you see an old face or an unpleasant
object. That's the stereotypic condition. Your keystrokes correspond to stereotypical
pairs; in this case, associating good stuff with youth and bad stuff with older age.
Then the test asks you to do the same thing in a counter-stereotypic condition, pressing
the left key if you see a young face or an unpleasant object and the right key if you
see an old face or a pleasant object.
The core of the test is your reaction time. Are you faster at sorting when you're working
with a stereotypical pairing than you are with counter-stereotypical pairings? If that's
the case, even though you may think you're unprejudiced, you've got an implicit association
between youth and goodness, which, as you might guess, may have some implications about
how you think and act toward older people.
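To make that scoring idea concrete, here's a rough Python sketch of the reaction-time comparison. The numbers are made up, and the real IAT uses a more elaborate scoring algorithm, so treat this as an illustration of the basic logic rather than the actual test.

```python
# Simplified illustration of the IAT's core logic: compare mean reaction
# times between the stereotypic and counter-stereotypic sorting conditions.
# (The published IAT uses a more elaborate D-score algorithm; the data
# below are invented for illustration.)
from statistics import mean

# Reaction times in milliseconds for correct trials in each condition
stereotypic_rts = [612, 580, 645, 601, 633]          # young+pleasant / old+unpleasant
counter_stereotypic_rts = [742, 801, 765, 779, 810]  # young+unpleasant / old+pleasant

def iat_effect(stereotypic, counter_stereotypic):
    """Positive values mean faster sorting of stereotype-consistent pairings,
    i.e., an implicit association between youth and 'good'."""
    return mean(counter_stereotypic) - mean(stereotypic)

effect_ms = iat_effect(stereotypic_rts, counter_stereotypic_rts)
print(f"Average slowdown in the counter-stereotypic condition: {effect_ms:.0f} ms")
```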
The test is widely used in research, and contrary to what some critics think, it's surprisingly
predictive of discriminatory behavior in all kinds of experimental settings.
So that's one way to measure subtle, implicit prejudice. But obviously, overt prejudice
is far from dead. That's why discrimination studies are prominent in social psychology
research, and they can also predict, sometimes with scary accuracy, how discrimination might
show up in broad social patterns, like wage inequality and job opportunity gaps.
For instance, a 2012 Yale study led by social psychologist Corinne Moss-Racusin demonstrated
that science faculty across the country systematically discriminated against female science students.
In a double-blind study, a representative sample of science faculty members were asked
to evaluate a fictional student's application for a lab-manager job.
When the applicant's name was Jennifer, instead of John, they viewed her as less competent,
were less likely to hire her, offered her less money, and were less likely to mentor her.
And this prejudice was even exhibited by women faculty members.
And that's an important point. People on both sides of the stereotype tend to respond similarly,
with the subjects of prejudice themselves often holding the same stereotypical implicit
attitudes or engaging in the same discriminatory behavior.
So when we say that stereotypes are pervasive, we mean pervasive.
Now it's all too easy to hold up examples of how people are prejudiced, but the real
root of the issue is why they are.
Here are a few possibilities:
For one, prejudices can come up as a way of justifying social inequalities. This happens
when people on both sides of the power and wealth spectrum start believing that people
get what they deserve, and they deserve what they get. This is called the just-world phenomenon.
Prejudices can also be driven by the "us vs. them," or as social psychologists often call
it, the ingroup-outgroup phenomenon. Whether you're in a soccer stadium, or the political
arena or school lunchroom, or, you know, in the comments of this video, dividing the world
into in-groups and out-groups definitely drives prejudice and discrimination.
But an in-group identity also gives its members the benefits of communal solidarity and a
sort of safety in numbers. This in-group bias, or tendency to favor your own group at the
expense of others, is powerful, even when it's totally irrational. One common social
psychology exercise on in-group favoritism involves dividing a class into two arbitrary
groups, say, those wearing sneakers and those not wearing sneakers. Each person sits with
his or her group and is told to list differences between themselves and the opposing group.
The lists usually start out pretty tame, but become more strident as they grow longer. Eventually,
you have sneaker-wearing kids saying that they're just smarter than the people without
sneakers. The kids who don't have sneakers say that the other kids are trashy and low-class.
Soon enough, each group has inflated itself and derided the opposing group, even though
the division between the two was essentially meaningless to begin with.
Little exercises like this illustrate the power of any ingroup-outgroup distinction
in creating conflict between groups, and that brings us to the psychological nature of conflict itself.
History is littered with examples of how the us vs. them mentality has fueled violence
and warfare, which is exactly what we'll be talking about next time.
Today, you learned about how prejudice, stereotyping, and discrimination affect how we interact
and relate to one another. You learned how prejudice can often be non-conscious and automatic
and how tools like the Implicit Association Test help reveal and measure it. We also looked
at the implications of the ingroup-outgroup phenomenon, and how it can lead to strong
in-group bias that often turns aggressive.
This episode of Crash Course Psychology was sponsored by Shane Barr, whose young adult
sci-fi adventure book, Reset, is available on Amazon.
Thanks for watching, especially to all of our Subbable subscribers who make Crash Course
possible. To find out how you can become a supporter or lead sponsor like Shane, just
go to Subbable.com/CrashCourse.
This episode was written by Kathleen Yale, edited by Blake de Pastino, and our consultant
is Dr. Ranjit Bhagwat. Our director and editor is Nicholas Jenkins, the script supervisor and sound
designer is Michael Aranda, and the graphics team is Thought Cafe.