SHANKAR VEDANTAM, HOST:
Hey there. Shankar here. Today we wanted to bring you a bonus episode. It's a story we reported several years ago. It's about race and bias and the power of culture to affect our behavior. A quick note before we get started - this story begins with the police shooting of a man named Terence Crutcher. And there's discussion of police violence against African Americans throughout the episode.
(SOUNDBITE OF ARCHIVED NPR BROADCAST)
VEDANTAM: On a September evening in 2016, Terence Crutcher's SUV stopped in the middle of a road in Tulsa, Okla. A woman saw him step out of the car. The doors of the car were open. The engine was still running. The woman called 911. Officer Betty Shelby was on her way to an unrelated incident when the call came in.
Terence was 40, African American, born and raised in Tulsa. He was a churchgoing man with four children. Betty was 42, white, a mother. She was born in a small town not far from Tulsa. In an ideal world, these two Oklahoma natives, close in age, ought to have had more to bring them together than hold them apart.
But on this evening, there was no small talk or friendly chatter. The police officer told Terence to take his hands out of his pockets. According to her attorney, he first complied. He then put his hands up in the air. Moments later, he put his hands back in his pockets.
By this point, multiple police officers had gathered and drawn their guns and Tasers. Overhead, a police chopper filmed events as they unfolded. From the video, it's hard to tell exactly what's happening on the ground, but an officer in the helicopter thinks Terence isn't cooperating.
(SOUNDBITE OF ARCHIVED RECORDING)
UNIDENTIFIED PERSON #1: Time for a Taser, I think. That looks like a bad dude, too - could be on something.
VEDANTAM: Moments later, one officer on the ground does fire a Taser. Betty Shelby fires her gun.
(SOUNDBITE OF ARCHIVED RECORDING)
BETTY SHELBY: Shots fired.
UNIDENTIFIED PERSON #2: Who?
UNIDENTIFIED PERSON #3: I have a 3-21. We have shots fired. We have one suspect down. We need EMSA here.
VEDANTAM: She kills Terence Crutcher. Later, police discover that he was unarmed. Soon, accusations are flying. Some said maybe the victim was high on drugs. Others said maybe the police officer was racist. At a press conference after the shooting, a journalist asked Scott Wood, Betty Shelby's attorney, about that.
(SOUNDBITE OF PRESS CONFERENCE)
UNIDENTIFIED JOURNALIST: Did him being a big black man play a role in her perceived danger?
SCOTT WOOD: No, him being a large man played a role in her being in danger. She's worked in this part of town for quite some time. And, you know, just the week before, she was at an all-black high school homecoming football game. She's not afraid of black people.
VEDANTAM: Terence Crutcher's sister, Tiffany, sees it very differently. She thinks her brother was shot because he was black.
(SOUNDBITE OF ARCHIVED RECORDING)
TIFFANY CRUTCHER: That big, bad dude was my twin brother. That big, bad dude was a father. That big, bad dude was a son. That big, bad dude was enrolled at Tulsa Community College just wanting to make us proud.
VEDANTAM: Betty Shelby's daughter, Amber, defended her mother.
(SOUNDBITE OF ARCHIVED RECORDING)
AMBER SHELBY: I am here to reveal to you the side of my mother that the public does not know. My mother is an incredible, supportive, loving and caring woman. She is a wife, a mother and a grandmother with a heart of gold. She has always fought for the underdog and stands up for the weak.
VEDANTAM: Betty Shelby was acquitted of manslaughter charges. Still, the tenor of the back-and-forth, the psychological accusations and psychological defenses is very revealing. When an incident like this occurs, we want to hear the story of what happened. We want to know what was going on in the mind of the shooter and the mind of the victim.
Was Terence Crutcher truly a threat? Did Betty Shelby dislike black people? What clues explain the behavior of these individuals? We home in, dig for facts and look for psychological explanations. But what if there's another way to think about what happened, one that has less to do with the individuals involved and more to do with the context in which the shooting occurred?
MAHZARIN BANAJI: What we're discovering here is that the individual mind sits in society. And the connection between mind and society is an extremely important one that should not be forgotten.
VEDANTAM: Individual behavior and the mind of the village, this week on HIDDEN BRAIN.
(SOUNDBITE OF MUSIC)
BANAJI: I'm Mahzarin Banaji.
VEDANTAM: Mahzarin is a psychology professor at Harvard. She's made a career out of studying the invisible.
BANAJI: For the past 30 years, I've been interested in studying those aspects of our minds that are hidden from our own conscious awareness.
VEDANTAM: Mahzarin's interest began in graduate school. She was teaching psychology at Yale and looking for a way to measure people's biases. There was debate over the right scientific method to do this. You could simply ask people their views, but because prejudice is a sensitive topic, you often don't get honest answers - or any answer at all.
BANAJI: You couldn't walk up to somebody and say, do you agree that Italians are lazy, and have them say yes or no. They'll just refuse to answer that question.
VEDANTAM: Our deep-seated discomfort about discussing prejudice was one hurdle for a researcher looking to study the phenomenon. Mahzarin realized there was another barrier. What if some forms of prejudice are so deeply buried that people don't even realize they harbor such bias?
BANAJI: Perhaps we behave in ways that are not known to our own conscious awareness, that we are being driven to act in certain ways not because we are explicitly prejudiced but because we may carry in our heads the thumbprint of the culture.
VEDANTAM: Was there a way to decipher this thumbprint and expose people's hidden biases? Eventually, Mahzarin, with the help of her mentor, Tony Greenwald, and then-graduate student Brian Nosek, developed a simple, ingenious test. It's called the Implicit Association Test, or the IAT. It's based on the way we group things in our minds.
BANAJI: When you say bread, my mind will easily think butter but not something unrelated to it.
VEDANTAM: Like, say, a hammer. Our brains, it turns out, make associations. And these associations can reveal important things about the way we think.
BANAJI: So the way the IAT works is to simply ask people to sort things. So imagine that you're given a deck of playing cards, and you're asked to sort all the red cards to the left and all the black cards to the right. I'll predict that it will take you about 20 seconds to run through that deck.
VEDANTAM: Next, Mahzarin says, shuffle the deck and re-sort the cards.
BANAJI: This time, I'd like you to put all the spades and the diamonds to one side and the clubs and the hearts to the other side. And what we'll find is that this will take you nearly twice as long to do. Why? Because the rule that your brain had learned - red and red go together, black and black go together - is no longer available to you.
VEDANTAM: Remember that in both scenarios, you are grouping two suits together. In the first scenario, you're grouping hearts with diamonds and clubs with spades. In the second scenario, you're grouping hearts with clubs and diamonds with spades. Because there's a simple rule for the first task - group red with red and black with black - that task is easy.
In the second scenario, you need an extra fraction of a second to think about each card. You can't follow a simple rule of thumb. Mahzarin and Tony and Brian had an important insight. These rules of association apply to many subjects, including the way we think about other human beings. So they created a new sorting task.
BANAJI: Sort, for me, faces of black people and bad things together, words like devil and bomb and vomit and awful and failure. Sort those on one side of the playing deck.
VEDANTAM: On the other side, put the faces of white people.
BANAJI: And words like love and peace and joy and sunshine and friendly and so on to the other side. This turns out to be pretty easy for us to do because, as my colleagues and I will argue, the association of white and good and black and bad has been made for us in our culture.
VEDANTAM: The test doesn't end there. After sorting white and good into one group and black and bad into another, you now have to do it again, this time grouping black with good and white with bad.
BANAJI: And when you try to do that, and when I try to do that, the data show that we will slow down, that we can't do it quite as fast because black and good are not practiced responses for us. They are not habitual responses for us. We have to exert control to make that happen because it doesn't come naturally and easily to us. That's the IAT.
VEDANTAM: By the way, if you're wondering whether the order of the tasks makes a difference, it doesn't. The researchers have presented the test to volunteers in different orders. It doesn't matter whether you first ask people to group black with bad or first ask them to group black with good.
In both cases, people are faster to associate white faces with positive words and black faces with negative words. Mahzarin thinks the IAT is measuring a form of bias that is implicit or unconscious. Mahzarin herself has taken the IAT many times. To her dismay, the tests show she has robust levels of unconscious bias.
BANAJI: My brain simply could not make the association of black with good as quickly as I could make the association of white with good. And that told me something. It told me it's not the IAT that's screwed up, it's my head that's screwed up.
VEDANTAM: Because the IAT is a timed test, the results can be precisely measured. Implicit bias, in other words, can be quantified. Most psychological tests are only available in the lab, but Mahzarin and her colleagues decided to do something radical. They put that test on the Internet. You can find it today at implicit.harvard.edu. Millions of people have taken this test. The data has been collected, shared, disseminated. The IAT is widely considered, today, to be the most influential test of unconscious bias.
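For readers curious how a timed sorting test becomes a number, here is a minimal sketch of the core idea behind the IAT's commonly used D score: the gap in mean response time between the incongruent and congruent blocks, scaled by the variability of all the responses. The response times below are invented, and real IAT scoring adds steps this sketch omits, such as error penalties and trimming of extreme trials.

```python
# A minimal sketch of IAT-style scoring. The D-score idea: how much slower
# are responses on the "incongruent" block (e.g., black+good / white+bad)
# than on the "congruent" block, relative to the pooled variability of all
# responses? The data below are invented; real scoring has more steps.

from statistics import mean, stdev

def iat_d_score(congruent_ms, incongruent_ms):
    """Positive values mean slower responses on the incongruent block."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

congruent = [612, 640, 588, 701, 655, 630]    # hypothetical times, in ms
incongruent = [748, 802, 765, 820, 779, 791]

print(f"D = {iat_d_score(congruent, incongruent):.2f}")
```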
(SOUNDBITE OF MUSIC)
VEDANTAM: As Mahzarin and Tony and Brian were developing the IAT, other researchers were developing different ways to measure bias. Psychologist Joshua Correll found himself diving into the field shortly after a black man was shot and killed in New York City in 1999. His name - Amadou Diallo.
JOSHUA CORRELL: Diallo was standing unarmed on the front stoop of his apartment building. And the police thought he looked suspicious, and they approached him. And they ended up shooting him. And the question that everybody was asking - and this was, I mean, something that people, you know, across the country were wondering about was, was he shot because he was black?
VEDANTAM: At the time, Joshua was starting graduate school.
CORRELL: I took that question pretty seriously. And we tried to figure out how we could test it in a laboratory.
VEDANTAM: Joshua and his colleagues eventually developed a video game. It was a pretty bad video game, but it did the trick.
CORRELL: It's more like a slideshow, where there are series of backgrounds that pop up on the screen. And then in one of those critical backgrounds, a person will suddenly appear. So we've got photographs of, say, 25 or so white men and 25 black men. And we photographed these guys holding a variety of different objects - cellphones, a can of Coke, a wallet, a silver pistol, a black pistol. And so we've just edited the photographs so that the person pops up in the background holding an object. And the player has to decide how to respond. And they're instructed, if the guy on the screen has a gun, you're - he's a bad guy, and you're supposed to shoot him. And you're supposed to do that as quickly as you possibly can.
VEDANTAM: What Joshua wanted to know was whether players would respond differently depending on the race of the target on the screen.
CORRELL: Say a black guy pops up holding a wallet, and a white guy pops up holding a wallet. What's the likelihood that the black guy gets shot, and the white guy doesn't?
VEDANTAM: If current events are any clue, you may guess the answer.
CORRELL: Here we're looking at, you know, the players responding to a target who's holding a wallet. And the correct decision is to say, don't shoot. And what we found is that they are faster to say don't shoot if the target is white rather than black.
VEDANTAM: The same held true for armed targets. Test takers were faster to shoot black targets, slower to shoot white ones. Now, you might think that Joshua would conclude that his test takers were just racist. But one important similarity between Joshua's test and Mahzarin's test is that they do not presume that the people with such biases have active animosity toward African Americans. These are not members of the Ku Klux Klan.
CORRELL: It was just exactly what we had predicted, and I guess both kind of hoped and feared, right? I mean, it's an exciting scientific moment, but it also suggests something kind of deeply troubling - that these participants who are presumably nice people with no bone to pick - they're not bigots. They're not angry at black people in any way. But what we saw in their data very, very clearly is a tendency to associate black people with threat and to shoot them more quickly.
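As an illustration of what Joshua's team would be measuring, here is a rough sketch of how one might summarize data from a shoot/don't-shoot task: mean reaction times on correct trials, split by the target's race and whether he was armed, along with error rates. The trial records are invented, and this shows the style of analysis, not the study's actual code or data.

```python
# A hedged sketch of summarizing shoot/don't-shoot trials. A response is
# correct when the player shoots an armed target or holds fire on an
# unarmed one. All trial records below are invented for illustration.

from collections import defaultdict
from statistics import mean

trials = [
    # (target_race, armed, responded_shoot, reaction_ms)
    ("white", False, False, 480), ("black", False, False, 540),
    ("white", True,  True,  520), ("black", True,  True,  470),
    ("white", False, True,  610), ("black", False, False, 555),
]

rts = defaultdict(list)               # correct-response reaction times
errors = defaultdict(lambda: [0, 0])  # race -> [wrong, total]

for race, armed, shot, ms in trials:
    errors[race][1] += 1
    if shot == armed:                 # correct decision
        rts[(race, armed)].append(ms)
    else:
        errors[race][0] += 1

for (race, armed), times in sorted(rts.items()):
    label = "armed" if armed else "unarmed"
    print(f"{race:5s} {label:7s} mean correct RT: {mean(times):.0f} ms")
for race, (wrong, total) in sorted(errors.items()):
    print(f"{race:5s} error rate: {wrong/total:.0%}")
```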
VEDANTAM: You might say that both of these psychological tests are just academic exercises. Do they say anything about how people behave in real life? Joshua Correll is very clear that his video game experiment cannot replicate real life. It's impossible, he says, to recreate in a lab the fear and stress that a real-world police confrontation can generate. The IAT, too, has been criticized for a somewhat hazy link between test results and real-world behavior. Hundreds of studies have been conducted looking at whether the IAT explains or predicts how people will act. The results have been mixed.
In some studies, unconscious racial bias on the test seems to predict how people will behave. Researchers found, for example, that doctors who score high in implicit bias are less likely to prescribe clot-busting heart drugs to black patients compared to white patients. But other studies, also looking at doctors and black and white patients, find no correlation between results on the bias test and actual behavior. This discrepancy bothers psychologist Phil Tetlock at the University of Pennsylvania. He is a critic of the IAT.
PHILIP TETLOCK: It's a test that is enormously intuitively appealing. I mean, I've never seen a psychological test take off the way the IAT has and - you know, has gripped the popular imagination the way it has because it just seems, on its surface, to be measuring something like prejudice.
VEDANTAM: Tetlock and other critics are concerned that just because someone shows bias on the IAT doesn't mean that they're going to act in biased ways in real life. If a test cannot predict how you're going to act, isn't it just an academic exercise?
TETLOCK: There is the question of whether or not people who score as prejudiced on the IAT actually act in discriminatory ways toward other human beings in real-world situations. And if they don't, if there is very close to zero relationship between those two things, what exactly is the IAT measuring?
(SOUNDBITE OF MUSIC)
VEDANTAM: It turns out, a lot. There's new evidence that suggests the IAT does, in fact, predict behavior. But to see it, you have to zoom out. You have to widen the lens to look beyond the individual and into the community.
ERIC HEHMAN: Hello, my name is Eric Hehman.
VEDANTAM: He's a psychology professor at McGill University. Eric got interested in the IAT as he was researching the use of lethal force in policing. He was trying to design a statistical model that would predict where in the United States people of color are disproportionately likely to be shot and killed by police. First, he needed some baseline data. This proved hard since the federal government does not require police departments to report deadly shootings by officers.
HEHMAN: We really had no idea about really basic questions, such as how often they were happening, where they're happening and who they were happening to.
VEDANTAM: But in 2015, some news outlets, including The Washington Post and the British newspaper The Guardian, began to compile their own database on police homicides in the United States. According to official terminology, these are called justifiable homicides.
HEHMAN: So what they were putting together was the most comprehensive list of these justifiable homicides in the United States.
VEDANTAM: Eric used this data to pinpoint where disproportionate police shootings of minorities were most likely. Then he turned to the IAT data. Eric suspected that if bias was a factor in police shootings, it was likely that implicit bias, rather than overt racism, was at play.
HEHMAN: Traditionally, the field has found that explicit biases predict behaviors that are under our conscious control, whereas implicit biases predict things that are a little bit more automatic and a little bit more difficult to control. And this is exactly the sort of behavior that we thought might be involved in police shootings.
VEDANTAM: People take the IAT anonymously, but they need to provide some information, like their race and where they live. With the millions of data points the IAT provided, Eric painted a map of bias across the United States. Some places seem to have lots of bias, others very little. Now he had two databases. He cross-referenced them to see if there was any connection between communities with disproportionate numbers of police shootings of minorities and communities showing high levels of implicit bias. A powerful correlation emerged.
HEHMAN: So we find that in communities in which people have more racial biases, African Americans are being killed more by police than their presence in the population would warrant.
VEDANTAM: Let me repeat this because it's important. In places where implicit bias in a community is higher than average, police shootings of minorities are also higher than average. Eric's analysis effectively pinpoints where police shootings are likely to happen. But here's what makes the finding crazy. Most people who take the IAT are not police officers.
HEHMAN: So we're predicting police behavior by not measuring police at all themselves.
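To make the cross-referencing concrete, here is a hedged sketch in the spirit of Eric's analysis, assuming pandas and invented placeholder numbers: average the IAT scores of test takers in each region, merge those averages with a measure of how disproportionately black residents are shot by police there, and correlate the two.

```python
# A sketch of a community-level analysis like the one described above.
# Every number, region name and column name here is an invented
# placeholder, not the study's actual data.

import pandas as pd

iat = pd.DataFrame({
    "region":  ["A", "A", "B", "B", "C", "C"],
    "d_score": [0.42, 0.38, 0.21, 0.25, 0.55, 0.49],  # individual IAT scores
})
shootings = pd.DataFrame({
    "region": ["A", "B", "C"],
    # shootings of black residents relative to their population share
    "disproportionality": [1.8, 1.1, 2.4],
})

# Average individual scores up to the community level, then correlate.
community_bias = iat.groupby("region", as_index=False)["d_score"].mean()
merged = community_bias.merge(shootings, on="region")
print(merged["d_score"].corr(merged["disproportionality"]))
```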
VEDANTAM: Coming up, we explore how a test can predict how people will behave even when they are not the people who take the test. Stay with us.
(SOUNDBITE OF MUSIC)
VEDANTAM: Psychologist Eric Hehman found a way to predict police behavior by comparing places that have high levels of implicit bias with places where police shootings of minorities are higher than average. Since police don't typically take the IAT, how could the IAT be predicting how police would behave? Eric thinks the test has tapped into the mind of the community as a whole.
HEHMAN: Say there's a neighborhood that's traditionally associated with threat or danger, and the people who live in that neighborhood have these associations between African Americans and threat or African Americans and danger. And these would be anybody in this community. This could be my mother or the person who lives down the street, not necessarily the police officers themselves.
But there is this idea that this attitude is pervasive across the entire area, and that when officers are operating in that area, they themselves might share that same attitude that might influence their behaviors in these split-second, challenging, life-and-death decisions.
VEDANTAM: Implicit bias is like the smog that hangs over a community. It becomes the air people breathe. Or, as Mahzarin might say, the thumbprint of the culture is showing up in the minds of the people living in that community. There are many examples of this idea that individual minds shape the community, and the community shapes what happens in individual minds.
Seth Stephens-Davidowitz is a data scientist who used to work at Google. We featured him on a recent episode of our show. In his book, "Everybody Lies," Seth explains how big data from Google searches can predict with great accuracy things like the suicide rate in a community or the chances that a hate crime will take place.
SETH STEPHENS-DAVIDOWITZ: We've shown that you can predict hate crimes against Muslims based on searches people make. People make very, very, very disturbing searches, searches such as, kill Muslims or, I hate Muslims. And these searches can predict on a given week how many hate crimes there will be against Muslims. But I think the right approach to this is not to target any particular individual - to show up at the door of any particular individual who makes these searches. But if there are many, many searches in a given week, it would be wise for police departments to put extra security around mosques because there is greater threat of these - of attacks.
VEDANTAM: In other words, what the Google search data is doing is effectively taking the temperature of an entire community. That's what you're really saying, that you're picking up on things that are in the ether, if you will, in the community that might not show up in the individual but are likely to show up in the aggregate.
STEPHENS-DAVIDOWITZ: Yeah. And I think you don't really know the reason that any particular person makes a search, right? Someone could be searching, kill Muslims because they're doing research or they're just curious about something or they made a mistake in their typing. There are a lot of reasons an individual can make these searches. But if twice as many people are making these searches, well, if I were a Muslim American, I'd want some extra security around my mosque, right?
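Seth's twice-the-baseline intuition is simple enough to sketch in a few lines. This toy version flags a week in which hostile searches run far above recent history; the counts and the doubling threshold are purely illustrative, not anyone's deployed system.

```python
# A toy version of the policy Seth sketches: act on the aggregate, not on
# any individual searcher. Weekly counts and the 2x threshold are invented.

from statistics import mean

weekly_hostile_searches = [14, 11, 16, 12, 13, 15, 31]  # invented counts

baseline = mean(weekly_hostile_searches[:-1])  # recent weekly average
this_week = weekly_hostile_searches[-1]

if this_week >= 2 * baseline:
    print(f"{this_week} searches vs. baseline {baseline:.0f}: "
          "consider extra security around likely targets")
```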
VEDANTAM: Asking whether implicit bias affects the behavior of every individual is a little like investigating everyone who types an offensive search term into Google. A lot of the time, you're going to find nothing. And yet, when you look at the search terms in aggregate, they can tell you with great precision which areas will see the most hate crimes. For her part, Mahzarin Banaji believes Eric's work is a key link between her psychological data on how individuals behave and sociological insights on how a community behaves.
BANAJI: What we're discovering here is that the individual mind sits in society. And the connection between mind and society is an extremely important one that should not be forgotten and that more than any other group of people, social psychologists owe it to the beginnings of their discipline to do both and to do it even-handedly, to be focused on the individual mind and to be talking about how that mind is both influenced by and is influencing the larger social group around her.
VEDANTAM: This is why, says Mahzarin, when a problem has spread throughout a community, when it has become part of the culture, you can't fix it by simply focusing on individuals.
BANAJI: One of the difficulties we've had in the past is that we have looked at individual people and blamed individual people. We've said if we can remove these 10 bad police officers from this force, we'll be fine. And we know as social scientists - and I believe firmly - that that is no way to change anything.
VEDANTAM: This new way of thinking about bias showed up in the last presidential election. Democrat Hillary Clinton said implicit bias probably played a role in police shootings.
(SOUNDBITE OF ARCHIVED RECORDING)
HILLARY CLINTON: I think implicit bias is a problem for everyone, not just police. I think unfortunately too many of us in our great country jump to conclusions about each other.
VEDANTAM: Republican Mike Pence, now vice president, bristled at the idea. He said that Clinton was calling cops racist.
(SOUNDBITE OF ARCHIVED RECORDING)
VICE PRESIDENT MIKE PENCE: When an African American police officer's involved in a police action shooting involving an African American, why would Hillary Clinton accuse that African American police officer of implicit bias?
TIM KAINE: Well, I guess I can't believe you are defending the position that there is no bias.
VEDANTAM: But as Mahzarin says, it's not quite right to think of people with implicit bias as harboring the kind of racial hostility we typically think of when we say someone is a racist. Small kids show implicit bias.
African Americans themselves show implicit bias against other African Americans. The test isn't picking up the nasty thoughts of a few angry outliers; it's picking up the thumbprint of the culture on each of our minds.
(SOUNDBITE OF MUSIC)
VEDANTAM: So what can we do? Mahzarin is skeptical of those who offer training courses that promise quick-fix solutions.
BANAJI: There are many people across the country who say that they offer such a thing called implicit bias training. And what they do is explain to large groups of people what might be going on that's keeping them from reaching their own goals and being the good people that they think they are. And my concern is that when I'm an old woman, I will look back at this time and think, why didn't I do something about this? Because I don't believe this training is going to do anything.
VEDANTAM: In Mahzarin's view, you can't easily erase implicit bias because you can't erase the effect of the culture when people are living day in and day out in that same culture. But she and others argue that there might be ways to prevent such biases from influencing your behavior.
Let's return to psychologist Joshua Correll. Remember, he created the shoot/don't-shoot video game that found test takers were more likely to shoot black targets than white ones. Many of Joshua's initial test takers were students. Eventually, he decided to see what would happen if police officers took the test, so he went to the Denver police.
CORRELL: We brought down a bunch of laptops and button boxes, a bunch of electronic equipment that we were using to do this study. And we would set it up in their roll call room. And it was just complete chaos and really, really fun. And some of the police really wanted nothing to do with this, but a huge number of them volunteered. And they wanted to talk with us afterwards.
VEDANTAM: At first, the police officers performed exactly the same as everyone else. Their levels of implicit bias were about the same as laypeople who've taken the test, both in response times and in mistakes. But when it came to the actual shooting of targets, the police were very different.
CORRELL: The police officers did not show a bias in who they actually shot.
VEDANTAM: Those earlier test takers - college students and other laypeople - displayed bias in their response times, their mistakes and whom they actually shot. Not so the police.
CORRELL: But whereas those stereotypes may influence the behavior of the college students and of you and me, the police officers are somehow able to exert control. So even though the stereotype, say, of threat may come to mind, the officer can overcome that stereotype and respond based on the information that's actually present in the scene, rather than information that the officer is bringing to it through his or her stereotypes.
VEDANTAM: Joshua wondered whether there were certain factors that might keep police officers from exerting this kind of cognitive control over their biases. He found, among other things, that sleep made a difference.
CORRELL: Those who were getting less sleep were more likely to show racial bias in their decisions to shoot. And again, that's just consistent with this idea that they are - they might be able to exert control, to use cognitive resources to avoid showing stereotypic bias in their decisions, but when those resources are compromised, they can't do it. And they could be compromised in a variety of ways. Sleep is just one way that we could compromise it.
VEDANTAM: This, once again, is evidence that you can't train people not to have unconscious bias, but as Joshua suggests, you can do things to make it less likely that people will be affected by their bias. To be clear, Joshua's experiments are laboratory experiments.
We know in real life that beat police officers do shoot people in error. Now, this could be because in a real-life encounter, stuff happens that makes it very difficult for you to actually think about what you're doing.
CORRELL: So on the street, when somebody pulls a gun on you, it's scary, right? Like, cops, when they're involved in these firefights, report some crazy, crazy psychological distortions because they're legitimately freaked out. If they think somebody poses a life and death threat, they may panic. And it may be hard to bring those cognitive resources online.
VEDANTAM: In several recent high-profile cases, as in the case of Terence Crutcher and Betty Shelby, police officers shoot people who are unarmed. Of course, officers do not always know whether someone is armed. It's only in hindsight that we know whether the officer was in real danger. Joshua's larger point is that police encounters can be inherently stressful. The uncertainty embedded in a confrontation can make it very difficult to think objectively.
To put it another way, if you're running a police department and want to reduce errors in shootings, it may be less useful to give cops a lecture on how they shouldn't be racist and more useful to build procedures that give cops an extra half second when they are making decisions under pressure. With practice and a bit of time to exercise conscious control, people can reduce the risk of falling prey to their implicit biases.
CORRELL: There is the potential to control it, right? The performance of the regular police officers or even people that we train in our lab suggests that people don't have to succumb to those stereotypic influences. They can exert control in certain circumstances.
VEDANTAM: Mahzarin Banaji has a similar solution. She thinks we need more of what she calls in-the-moment reminders. For example, it's been found that some doctors prescribe painkillers to white patients far more often than they do to black patients who are reporting exactly the same levels of pain. The only difference is the patient's skin color. This suggests that bias is at work.
Mahzarin says if the bias is implicit, meaning physicians are acting biased without intending to be biased, a timely reminder can help doctors exercise cognitive control over their unconscious associations.
BANAJI: You type in a painkiller that you want to prescribe to a patient into your electronic system while the patient is sitting next to you. And it seems, to me, quite simple that when you type in the name of any painkiller - let's say codeine - that a little graph pops up in front of you that says, please note, in our hospital system, we have noticed that this is the average amount of painkiller we give to white men. This is the average amount we give to black men for the same reported level of pain.
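A hypothetical version of the pop-up Mahzarin describes might look something like this. The drug, the patient groupings and the dose averages are invented placeholders; a real system would pull them from the hospital's own records.

```python
# A hypothetical sketch of an in-the-moment prescribing reminder. The
# averages below are invented; a real system would query hospital records.

AVERAGE_DOSE_MG = {  # hospital averages for the same reported pain level
    "codeine": {"white men": 42.0, "black men": 29.0},
}

def prescribing_reminder(drug):
    """Return a reminder string for the given drug, or None if no data."""
    stats = AVERAGE_DOSE_MG.get(drug.lower())
    if stats is None:
        return None
    lines = [f"Please note: for the same reported pain level, "
             f"our average {drug} doses are:"]
    lines += [f"  {group}: {dose:.0f} mg" for group, dose in stats.items()]
    return "\n".join(lines)

print(prescribing_reminder("codeine"))
```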
VEDANTAM: In other words, give doctors an opportunity to stop for a second and make a decision consciously and deliberately instead of quickly and automatically. This can reduce the effect of implicit bias. Psychology has spent many years understanding the behavior of individuals. But tools such as the IAT might give us a way to understand communities as a whole, maybe even countries as a whole.
BANAJI: We did a study some years ago - Brian Nosek led this particular project - in which we looked at gender stereotypes across many countries in the world. How strongly do we associate, you know, female with science and male with science? And then we looked at the performance of girls and boys roughly around eighth grade on some kind of a standardized test. And what we discovered is that the stronger the gender bias in the country, that is to say, the stronger the association of male with science in a country, the less well girls in that country did on that mathematics test.
That's very similar to the Hehman kind of result because we didn't measure the gender bias in the girls and the boys who took the test. We were measuring something at the level of a country in that case. And yet, it did predict something systematic about the difference in performance between boys and girls.
VEDANTAM: When we look at an event like a police shooting, we invariably seek to understand it at the level of individuals. If something bad happens, we think it has to be because someone had bad intentions. Implicit bias certainly does act on individuals, but it's possible that its strongest effects are at the level of a community as a whole.
This might be why some police shootings of African American men are carried out by African American police officers and why some physicians who are not prescribing pain medications to people of color might themselves be people of color. Individuals can do their part to limit the effects of bias on their behavior. But if you want to fix the bias itself? Well, that takes the whole village.
(SOUNDBITE OF MUSIC)
VEDANTAM: This episode of HIDDEN BRAIN was produced by Jenny Schmidt, Maggie Penman and Rhaina Cohen. It was edited by Tara Boyle. The music in today's show was composed by Ramtin Arablouei. Our team includes Parth Shah, Thomas Lu, Laura Kwerel, Cat Schuknecht and Lushik Wahba.
This week, our unsung hero is Caroline Drees, the senior director of field safety and security at NPR. In this time of coronavirus and nationwide protests, Caroline works tirelessly to make sure the people of NPR stay safe and healthy. Thank you, Caroline, for all that you do.
For more HIDDEN BRAIN, you can find us on Facebook and Twitter. If you liked this episode, please do share it with a friend. I'm Shankar Vedantam, and this is NPR. Transcript provided by NPR, Copyright NPR.