How should we think about implicit biases? 

Good discussion of the strengths, limits and how they should be used:

A couple of years ago, during Merrick Garland’s confirmation hearing to become attorney general of the United States, one of the senators asked him about implicit bias: “Does it mean that I’m a racist … but I don’t know I’m a racist?” Mr. Garland responded by saying no, everyone has biases, and this doesn’t make you a racist.

This is a reasonable answer, but others would give a different one. Some people think research on implicit bias shows that, yes, in the words of the famous Avenue Q song: “Everyone’s a little bit racist.” The conclusion that everyone-is-racist (or at least every-majority-group-member-is-racist) is part of the public conversation, taught in schools, and pressed upon employees during diversity training.

Which side is right? Well, it’s complicated. We need to think about what these tests are really measuring.

The most famous implicit bias test is the Implicit Association Test – the IAT, which was developed by the psychologists Anthony Greenwald and Mahzarin Banaji. To get a sense of it, I encourage you to go online and try it out yourself. Here’s how it goes, taking as an example a test developed to explore implicit attitudes toward the young and the elderly: The subject watches a screen as either words or pictures flash by. The pictures are of either old faces or young faces, and the words are either positive (like “pleasant”) or negative (like “poison”). Then, for one set of trials, subjects are asked to press one key for either a young face or a positive word and another key for either an old face or a negative word. For another set of trials, it’s reversed: one key for a young face or a negative word and another key for an old face or a positive word.

The logic here is that if you have a positive association with youth and a negative one with the elderly, then your performance on young-positive/old-negative trials will be quicker than on young-negative/old-positive trials. And, in fact, people do find it more natural to associate young with positive and old with negative than the other way around.
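The scoring idea can be sketched numerically. As a rough illustration only (not the researchers’ exact algorithm), the widely used IAT “D score” scales the difference in mean response time between the two kinds of blocks by the pooled standard deviation of all trials; the latencies below are entirely hypothetical:

```python
import statistics

def iat_d_score(congruent_ms, incongruent_ms):
    """Simplified IAT D score: the difference in mean latency between
    the two blocks, scaled by the pooled standard deviation of all
    trials. A positive score means slower responses on the
    incongruent block (e.g. young+negative / old+positive)."""
    all_trials = congruent_ms + incongruent_ms
    pooled_sd = statistics.stdev(all_trials)
    return (statistics.mean(incongruent_ms)
            - statistics.mean(congruent_ms)) / pooled_sd

# Hypothetical reaction times, in milliseconds
congruent = [620, 580, 640, 605, 590]     # young+positive / old+negative
incongruent = [720, 690, 750, 700, 710]   # young+negative / old+positive

print(round(iat_d_score(congruent, incongruent), 2))  # → 1.77
```

A large positive score would be read as a stronger implicit preference for the young over the elderly; scores near zero indicate no measurable difference between the blocks.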

Such studies have been done with millions of people and have found the same pattern of negative associations when tested on attitudes toward gay, overweight and disabled people, and, most relevant to the question of racism, Black people. These effects are present even when questions about explicit attitudes find no bias and are often present even in subjects who belong to the group that is less favoured. People who take this test are often shocked, and their takeaway is often something like “I’m racist against Black people and never knew it.”

There is a lot of value to this work. It’s worth knowing that someone might not want to be biased, might wish to treat people equally, but nonetheless be influenced by psychological forces that are beyond their control.

But do measures like the IAT tap racism in a real sense? Here are three big qualifications to keep in mind:

First, these methods get a lot of play in the popular media, where they are often portrayed as racism-detectors. The worst example I ever saw was on the television show Lie to Me, where a crack team of investigators uses a muddled version of the IAT to determine which of a group of firemen has committed a hate crime. They find that one firefighter is slower than the rest to associate positive words such as principled with Black faces such as Barack Obama’s, and this settles it. “I’m not a racist,” he later protests. His interrogator snaps back: “You don’t think you are.”

In fact, the test is too unreliable to be used this way. Your score on the same test taken at different times can vary, and so the same person might prefer white people when tested on Monday and have no bias when tested on Tuesday. If you take the test and don’t like the result, just take it again.
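The reliability worry can be made concrete. Test-retest reliability is usually quantified as the correlation between the same people’s scores across two sessions; a minimal sketch, using made-up Monday and Tuesday D scores (nothing here is real IAT data):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation, the standard measure of test-retest
    reliability: +1 means scores line up perfectly across sessions,
    0 means the first session tells you nothing about the second."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical D scores for the same five people on two days
monday = [0.60, 0.10, 0.45, 0.90, 0.20]
tuesday = [0.15, 0.55, 0.40, 0.35, 0.70]

print(round(pearson_r(monday, tuesday), 2))  # → -0.7
```

If scores bounce around this much between sessions, any single test-taking tells you little about a stable trait of the person, which is the point of the criticism above.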

Second, it’s unclear that your score on the IAT can predict your actual behaviour. One meta-analysis finds that your score on the IAT provides very little insight into how you act toward people of other races. This is no surprise given the problem above – if your IAT score bounces around depending on when you take the test, how can it do a good job of predicting your behaviour in the real world?

Third, these biases might be unconscious in the sense that we don’t know how or when we are influenced by them, but it’s not like people don’t know they exist. When I list certain groups – Black people, the overweight and so on – nobody is surprised to hear that people (perhaps not themselves, but people in general) harbour biases against them.

So how should we think about implicit biases? One theory is that they might have nothing to do with negative attitudes toward a group – something which many people see as constitutive of racism. Instead, as the psychologists Keith Payne and Jason Hannay argue, measures such as the IAT tap our appreciation of regularities in the environment, including regularities in how people think about other people. In other words, tests like the IAT don’t measure attitudes, let alone bad attitudes – they pick up associations.

Such associations are everywhere: Given the environment I was raised in, I associate peanut butter with jelly, Ringo with George, O Canada with hockey games. I also associate airplane pilots with men and nurses with women. And I associate some groups, such as the young, with mostly good things and other groups, like the elderly, with mostly bad things. If my world were different, I would have different associations. Dr. Payne and Dr. Hannay conclude that we should think of implicit racial biases as “the natural outcome of a mind that generates associations based on statistical regularities, whenever that mind is immersed in an environment of systemic racism.”

Regardless of whether we see this recording of statistical generalizations as racism, we are left with a problem here. This is the tension between how we believe we should act and how we actually act. The first arises through reflection and is our considered view as to how we should treat people. The second is influenced by all sorts of forces, including all the associations, explicit and implicit, we carry about in our heads.

For some people, there is no clash at all. Consider certain findings about bias, such as that bidders on eBay tend to offer less money for a baseball card held by a Black hand than by a white one, or that judges are more likely to give a scholarship to a student who is a member of their political party. Some people, learning that they are biased in this way, will shrug and say it’s fine. It’s okay to discriminate. But some of us are at war with ourselves. We don’t want to be swayed by our associations and stereotypes. We want to be fair, and we see this as requiring us to treat people as individuals and ignore the categories they fall into.

You might think that the solution here is to try hard to be unbiased. Perhaps learning about and thinking about implicit biases can help us override them, just through force of will. Unfortunately, the evidence suggests otherwise. We are good at self-justification. We make choices that are shaped by prejudice and bias and convince ourselves that we were being fair and impartial.

My own view is that we do better when we construct procedures that override the biases we don’t want to have. If you’re choosing whom to hire and don’t think that race should matter, set up the situation in such a way that you don’t have this information about the people you are judging. This is the logic of procedures such as blind auditions. Or, from a different moral viewpoint, set up diversity requirements that explicitly take into account factors such as race so as to override the prejudices you’re trying to overcome. These are different solutions – and people have strong views about which is preferable – but the impetus is the same: to engineer processes to eradicate bias where we think that bias is wrong.

This is how moral progress happens more generally. We don’t typically become better merely through good intentions and force of will, just as we don’t usually lose weight or give up smoking just by wanting to and trying hard. But we are smart critters, and we can use our intelligence to manage our information and constrain our options, allowing our better selves to overcome those gut feelings and associations that we believe we would be better off without.

Paul Bloom is professor of psychology at the University of Toronto, and the Brooks and Suzanne Ragen Professor Emeritus of Psychology at Yale University. His latest book is Psych: The Story of the Human Mind.

Source: How should we think about implicit biases?

When Whites Just Don’t Get It, Part 6 – The New York Times

Kristof on some of the ongoing biases and their effects:

In one study, researchers sent thousands of résumés to employers with openings, randomly using some stereotypically black names (like Jamal) and others that were more likely to belong to whites (like Brendan). A white name increased the likelihood of a callback by 50 percent.

Likewise, in Canada researchers found that emails from stereotypically black names seeking apartments are less likely to get responses from landlords. And in U.S. experiments, when blacks and whites go in person to rent or buy properties, blacks are shown fewer options.

Something similar happens even with sales. Researchers offered iPods for sale online and found that when the photo showed the iPod held by a white hand, it received 21 percent more offers than when held by a black hand.

Discrimination is also pervasive in the white-collar world. Researchers found that white state legislators, Democrats and Republicans alike, were less likely to respond to a constituent letter signed with a stereotypically black name. Even at universities, emails sent to professors from stereotypically black names asking for a chance to discuss research possibilities received fewer responses.

Why do we discriminate? The big factor isn’t overt racism. Rather, it seems to be unconscious bias among whites who believe in equality but act in ways that perpetuate inequality.

Eduardo Bonilla-Silva, an eminent sociologist, calls this unconscious bias “racism without racists,” and we whites should be less defensive about it. This bias affects blacks as well as whites, and we also have unconscious biases about gender, disability, body size and age. You can explore your own unconscious biases in a free online test, called the implicit association test.

One indication of how deeply rooted biases are: A rigorous study by economists found that even N.B.A. referees were more likely to call fouls on players of another race. Something similar happens in baseball, with researchers finding that umpires calling strikes are biased against black pitchers.

If even professional referees and umpires are biased, can there be any hope for you and me as we navigate our daily lives? Actually, there is.

The N.B.A. study caused a furor (the league denied the bias), and a few years later there was a follow-up by the same economists, and the bias had disappeared. It seems that when we humans realize our biases, we can adjust and act in ways that are more fair. As the study’s authors put it, “Awareness reduces racial bias.”

That’s why it’s so important for whites to engage in these uncomfortable discussions of race, because we are (unintentionally) so much a part of the problem. It’s not that we’re evil, but that we’re human. The challenge is to recognize that unconscious bias afflicts us all — but that we just may be able to overcome it if we face it.

Source: When Whites Just Don’t Get It, Part 6 – The New York Times

So You Flunked A Racism Test. Now What?

More on the inbuilt biases and prejudices that we all have:

You’re probably at least a little bit racist and sexist and homophobic. Most of us are.

Before you get all indignant, try taking one of the popular implicit-association tests. Created by psychologists at Harvard, the University of Washington, and the University of Virginia, they measure people’s unconscious prejudice by testing how easy — or difficult — it is for the test-takers to associate words like “good” and “bad” with images of black people versus white people, or “scientist” and “lab” with men versus women.

These tests find that — regardless of how many Pride parades they attend or how many “This is what a feminist looks like” T-shirts they own — most people trust men over women, white people over minorities, and straight people over queer people. These trends can hold true regardless of the gender, race or sexuality of the test-taker. I’m from India, and the test found that I’m biased against Asian-Americans.

There is research indicating that these types of implicit prejudices may help explain why cops are more likely to shoot unarmed black men than to shoot unarmed white men, and why employers are more likely to hire white candidates than equally qualified black candidates.

….Perhaps more important than the lasting effects of this particular approach, Paller’s findings are proof that our implicit attitudes are malleable — and maybe, just maybe, it is possible for people to let go of prejudice for good, if they want to. But it won’t be easy.

“Adults have had years and years of exposure to stereotypes,” Paller says. And biases take hold early — studies have found that kids as young as 4 and 5 show racial and gender bias. “It can take a lot of effort to reverse that.”

Paller stresses that this is very preliminary research. To confirm the results, a lot more people have to be tested. “Plus, we still don’t know if changing people’s results on the implicit-bias test translates to them acting differently toward minorities in the real world,” he notes.

The bottom line: There’s no silver bullet, says Anthony Greenwald, a social psychologist at the University of Washington who helped develop the implicit-association test. At least not yet. “But I’m open-minded,” says Greenwald, who wasn’t involved in Paller’s study. “It will be interesting to see if these results can be reproduced.”

Greenwald, who perhaps understands more about bias than just about anyone, has taken the implicit-association test himself. His results haven’t budged over the years. He’s still biased along racial and gender lines, he says, “even though I really don’t like having these biases.”

And while such inbuilt bias may be hard to correct, correcting it starts with being more mindful of these associations and of our automatic thinking.

So You Flunked A Racism Test. Now What? : Code Switch : NPR.

The Science of Why Cops Shoot Young Black Men

Good in-depth article on the psychology and neurology of subconscious bias and how it is part of our automatic thinking and sorting:

Science offers an explanation for this paradox—albeit a very uncomfortable one. An impressive body of psychological research suggests that the men who killed Brown and Martin need not have been conscious, overt racists to do what they did (though they may have been). The same goes for the crowds that flock to support the shooter each time these tragedies become public, or the birthers whose racially tinged conspiracy theories paint President Obama as a usurper. These people who voice mind-boggling opinions while swearing they’re not racist at all—they make sense to science, because the paradigm for understanding prejudice has evolved. There “doesn’t need to be intent, doesn’t need to be desire; there could even be desire in the opposite direction,” explains University of Virginia psychologist Brian Nosek, a prominent IAT researcher. “But biased results can still occur.”

The IAT is the most famous demonstration of this reality, but it’s just one of many similar tools. Through them, psychologists have chased prejudice back to its lair—the human brain.

We’re not born with racial prejudices. We may never even have been “taught” them. Rather, explains Nosek, prejudice draws on “many of the same tools that help our minds figure out what’s good and what’s bad.” In evolutionary terms, it’s efficient to quickly classify a grizzly bear as “dangerous.” The trouble comes when the brain uses similar processes to form negative views about groups of people.

But here’s the good news: Research suggests that once we understand the psychological pathways that lead to prejudice, we just might be able to train our brains to go in the opposite direction.

The Science of Why Cops Shoot Young Black Men | Mother Jones.

And yes, I did take the Implicit Association Test (also available at UnderstandingPrejudice.org) and scored just as miserably as Chris Mooney, the author of this article. Very sobering, and I encourage all to take it.