University of Pittsburgh

October 27, 2016

Science 2016: Implicit bias


Black female doctors are posting their photos under the hashtag #WhatDoctorsLookLike after Tamika Cross took to social media with her story of a Delta flight attendant’s skepticism when she responded to a call for doctors during an in-flight medical emergency.

“I’m sure many of my fellow young, corporate America working women of color can all understand my frustration when I say I’m sick of being disrespected,” wrote the 28-year-old Houston-based ob-gyn in an Oct. 9 Facebook post that has prompted thousands of responses.

The recent incident made the Science 2016 spotlight session “Implicit Bias and Other Hard-to-Measure Phenomena that Affect Our Perceptions and Actions” especially timely as a quartet of presenters examined the effects of unconscious stereotyping, and efforts to uproot it, particularly in academia and medicine.

Paula K. Davis, assistant vice chancellor for Health Sciences diversity, framed the issue in her presentation, “Presumed Incompetent.”

“Implicit bias is attitudes or stereotypes that affect our understanding, actions and decisions in an unconscious manner. They are pervasive. They affect every decision that we make in how we interact with others,” she said.

Everyone carries implicit biases with them every day, Davis said. “Even those who in their professional lives have committed to impartiality — judges, journalists.”

Implicit and explicit biases are related but are distinct mental constructs, she said. “There are things you may be very aware of, and they’re part of your functioning, but there also will be biases you carry that you really are not aware of and they have an equal impact.”

We tend to have implicit biases that favor groups we are part of, Davis explained, adding that implicit biases are malleable and, with effort, can be unlearned over time.

We underestimate the differences among members of groups outside our “in-group,” she said.

“When we categorize people, we accentuate the differences between groups. We favor groups we are part of and we will assume there is greater difference and diversity within the groups that we are part of,” Davis said. “And we tend to homogenize the groups that are different.”

We’re likely to see those who are not like us as less able and less competent. Likewise, we’re likely to recall others’ errors, yet forget those of our in-group, she said. We’re also less generous with others who are different, and we tend to be more aggressive toward them.

“Our brains are making these judgments all the time. And we don’t see things as they are — we see things as we are,” Davis said.

Davis showed how implicit biases can have an impact in the workplace.

In one study, faculty raters were given identical resumes for a hypothetical lab position, with the only difference being the applicants’ apparent gender. Both men and women judged applicants with male first names as more competent and proposed higher salaries, while suggesting that the women applicants needed more mentoring and molding.

Similarly, an examination of jobs in a variety of industries found that applicants whose names sounded more white had to send out fewer resumes to get callbacks. “Those who had names that sounded as though they were African American had to send out 15 resumes to 10 for the white applicants,” Davis said.

Additionally, applicants with a “white” name would get a similar number of callbacks as an African-American candidate with eight more years of experience, she said.

The Tamika Cross story is a recent example, but implicit biases aren’t solely a male-female, black-white issue.
Davis shared the stories of Chinese scientists accused by federal authorities of spying. What might have been an ordinary research collaboration resulted in federal charges against a Temple physics professor of Chinese heritage; his story is among those posted at www.scientistsnotspies.org.

“You see the kinds of biases that people carry with them lead to actions, and the impacts that they can have on individuals,” Davis said.

*

African-American women in academia often work in isolation “because they may be one of the few, and sometimes the only faculty person, like them in their department,” Davis said.

“They spend a lot of time and energy in addition to their research and teaching and publications in trying to manage the expectations that others have of them, or their perception of the expectations that others have of them,” she said.

“They’re far more likely to face challenges in the classroom to their authority, and they’re more likely to be tapped for service obligations,” she said. “They get tapped for every diversity committee, every diversity search, everything related to diversity and that adds another layer of burden to their professional existence.”

Those whose behavior is perceived as outside the mainstream risk having those perceptions affect student evaluations and peer interactions, she said.

Davis added that “women of color have to be very, very careful about their tone of voice, their facial expressions, their dress and also their language in the classroom, because any of those factors can have an impact on the perception of their competence.”

“African-American women and Latinas are often seen as different or exotic. Conversely they can be seen as aggressive, immoral, angry or inferior and unable to meet the requirements of either social engagement or the academic requirements of their positions.”

Asian women, “even though they might be perceived to be part of the model minority, can be viewed as subservient or nonconfrontational, exotic or cute,” she said.

“It’s made it very, very difficult in some cases for Asian women to be perceived as serious academics.”

Labeling may exclude female faculty from resources and from the informal social networks through which connections and access to resources flow, Davis said. That leaves them less powerful in policymaking, hiring and retention decisions.

*

Peer reviewers may be affected by implicit bias, she said, citing disparities in the number of R01 grants awarded to underrepresented minority researchers.

A working group on diversity in biomedical research found that peer reviewers may have an unconscious bias against applicants who have a background or research training pedigree that is different from their own, Davis said.

“We know that we are a very Ivy-centric group of individuals in the academy,” she said. “If you are not part of that top 20 institutions in NIH funding, you lose a little bit of gloss, 21-30 you lose a little bit more, and if you fall below that, your likelihood of being funded fell off greatly.”

This type of bias can result in poorer outcomes for applicants from diverse backgrounds, Davis said.

*

To combat implicit bias in the academy, “As individuals, first we have to know ourselves,” Davis said. She recommended taking the tests on Harvard University’s Project Implicit site (implicit.harvard.edu). “Once you become aware of what your biases are, then you have the opportunity to work against them.”
Davis recommended:

• Interact regularly with those you have biases against, so that you have an opportunity to retrain your brain about interacting with those individuals.

• Learn to recognize your triggers. “If you know you have a bias against young women who dress a particular way … If you know that’s a trigger, it’s something you have to work on,” she said.

• Slow down your decision-making. “Implicit biases rear their ugly heads when you are making very quick decisions,” she said.

• Evaluate decisions, particularly in groups, to ensure that everyone is being evaluated fairly and individual biases don’t creep in.

• Create structures to help mitigate biases: Create clear metrics and look at sources of ambiguity. Try to standardize decision-making as much as possible.

“Recognize that everyone makes mistakes,” Davis said. “But use those mistakes as an opportunity to learn something more about yourself and the way you interact with others.”

• At the unit level, supervisors need to be role models. “Model the behavior you want those who report to you to exhibit,” she said.

• Create a safe space for discussions about implicit bias in your unit.

• Provide training in emotional intelligence so that your staff and faculty have an opportunity to know about themselves and the decisions they’re making.

• Provide training in the effective delivery of feedback. “There are many cases where it’s not what you said, it’s how you said it that’s really a problem for the individual who’s hearing that feedback,” Davis said.

• Train in interpersonal conflict management “so that everyone is sure that conflicts are being mediated in a fair and equitable manner and not driven by bias.”

In addition to Harvard’s Project Implicit, Davis recommended the resources of Ohio State University’s Kirwan Institute for the Study of Race and Ethnicity and consultants Cook Ross Inc.

****

Presenter Amber E. Barnato, a faculty member in clinical and translational medicine in the School of Medicine, discussed her investigation into race-based biases and end-of-life decision-making. The study, albeit small, found subtle differences between how doctors behave when treating black patients and how they interact with white patients.

Barnato said that black people in America are more likely than whites to die in the hospital with life support in the ICU and less likely to receive hospice care.

While the majority of individuals — black or white — say they don’t want aggressive treatment in the event of a future life-threatening illness, blacks are slightly more likely than whites to say they would want such treatment, she said. Her research team hypothesized that physicians’ race-based biases may influence end-of-life decision-making, either directly or indirectly.

To look into whether this difference might be related to doctors’ biases, the team recruited 33 emergency medicine doctors, hospitalists and intensivists and put them in a simulated interaction with a hospitalized terminally ill “patient” accompanied by a family member. Most of the doctors were white, middle-aged and male, Barnato said; two were black, and 15 percent were female. The doctors were told the purpose of the study was to learn about how doctors make decisions for patients they don’t know.

In the scenario, the physicians were asked to come to the bedside of a 78-year-old man with metastatic gastric cancer who was short of breath. The patient’s blood pressure and oxygen levels were low and his heart rate was fast — vital signs that would raise a doctor’s concern, she said.

“These are the kinds of situations that bring on biases … emotion and time pressure,” Barnato said. “We were trying to put physicians in a situation where they were doing some of that fast thinking so that any of the behaviors that we observed would be attributable to them, and not to the patient.”

Each doctor saw two cases at the Peter M. Winter Institute for Simulation, Education and Research. In one, the patient and family member were portrayed by black actors; in the other case, the actors were white.

The doctors made similar treatment decisions and demonstrated the same verbal communication skills regardless of the patient’s race, Barnato said, but doctors’ nonverbal communication with the black patient was more distant.

“They did show less physical proximity, open posture, touch and gaze with the black patients,” possibly indicating unconscious bias, she said.

How might this influence whether greater end-of-life interventions are preferred? Nonverbal connections such as touching and eye contact help build rapport with patients, she explained. It’s possible that less rapport — particularly in such a time-pressured and high-stakes situation — could lead to less trust.

“And if I don’t trust you to have my best interest in mind, I might want more aggressive treatment,” she said. “It could lead to a vicious cycle where patients seem to prefer more life support and may get more life support but it might be a manifestation of some implicit biases.”

****

Despite huge advances in medicine, disparities remain, said Ann E. Thompson, vice dean of the School of Medicine, citing numerous studies documenting not only racial inequalities in medical care, but similar inequalities for women, sexual minorities and the elderly.

“The causes of this are enormously complex,” Thompson said, adding that implicit bias plays a role. “Provider behavior and provider decision-making is clearly one important documented contributor to these disparities.”

Implicit biases influence provider behavior and affect the extent to which patients trust their doctor — “and probably how likely they are to follow their advice,” she said in her presentation, “Toward Unequal Treatment: Implicit Bias and the Hidden Curriculum in Medical Education.”

Students enter medical school with the desire to do good, Thompson noted. “Students are very open about their fears that they’ll lose their capacity for empathy, that they’ll lose the joy of taking great care of patients and, with that, lose their professional identity. They really care about becoming their ideal,” she said.

Students arrive at medical school with their own culture and beliefs. “They bring their conscious and unconscious assumptions, their needs, emotions and expectations, and their set of skills. And then we, of course, bring ours,” Thompson said. “And neither side has a long history of really examining those beliefs, assumptions and expectations as carefully as they need to.”

Like anyone else, medical students — and their teachers — have their biases, she said. “If we’re not conscious of our own and our students’ biases, and we don’t take steps to mitigate them, we reinforce and perpetuate them.”

Recent data show that “students who heard negative comments about black patients from residents or attending physicians showed increased racial bias between their first and fourth years of training. And it’s likely that a similar phenomenon exists about other groups,” Thompson noted.

Biases — both explicit and unconscious — plus elements of institutional culture and ingrained behaviors help to generate “a ‘hidden curriculum’ that’s nowhere on the list of things we say we need to be teaching our students,” she said. “That not only has an impact on patient care, but it gets recycled when those students become teachers themselves or physicians and again bring those biases or assumptions to the next generation of students.

“Educating and encouraging reflection about unconscious bias is enormously important,” she said, noting that it must come from the top down. “The entire health care team needs to be involved. We need to recognize the impact of this on the severity of disease and the quality of care that we provide,” she said.

“One of the things we can do is to make sure that we encourage students to approach every interaction with an underprivileged or stereotyped group as an opportunity to act out their commitment to an egalitarian approach to good-quality care,” Thompson said. “If we can encourage people to make that goal a habit, we may actually trigger better care for minority patients.”

*

To move toward more equal treatment for all, a more diverse group must be recruited to join the medical school community, she said. “That applies to students, residents, faculty and staff.”

Thompson advised mentors to speak up to retain excellent students. “We need to let them know we want them to stay as residents,” she said. Likewise, it’s important to communicate the desire that excellent residents stay. “We need not to wait until they are ready to job-hunt, because by then it may be too late.”

In addition, “We need to recognize the systemic factors and make changes. Do our insurance plans make good care more difficult for some people? Do we sustain clinics and practices in underprivileged communities or are we completely focused on the bottom line?”

At the medical school, “We can make modest changes in the curriculum. We can increase exposure to counter stereotypical examples of minority groups. We can pay close attention to case studies and images that may perpetuate stereotypes. We can include implicit association tests in the recruitment,” she said.

At the administrative level, “we need to improve our recruitment and retention efforts. We need to have regular assessment of our institutional climate: What is the experience of our underrepresented or stereotyped groups?” she said.

“When unprofessional language or behavior is demonstrated, somebody needs to challenge it,” she said. Hearing negative remarks nourishes negative attitudes. “We need to have an institutional commitment to challenging comments like that when they’re made.

“If we refuse to challenge how racism and bias affect our practice, we’ll continue to contribute to health inequalities that remain unaddressed.”

*

Better research is needed on implicit bias and potential fixes, she maintained.

“If we can effectively address these issues, I think we have the potential to impact population health as powerfully as anything that we’re hearing (at Science 2016) about the exciting advances in the basic sciences. I think we’ll be far better poised to make effective use of those advances.”

Change must begin with the faculty, she said. “We need to teach faculty and then students about the history of bias and racism in medicine. We can encourage people to shine a light on themselves and really think about ‘what goes into my assessments and therapeutic decisions?’ People think of this as fluff, but I think this is really intrinsic to improving medical care,” Thompson said.

“We need to encourage discussions in academic medicine that look at the systemic features of our profession that perpetuate health disparities. These biases are malleable. We can overrule our mental habits and gut reactions. It’s not inevitable that these biases will control our behavior. But we have a lot of work to do.”

It’s hard work, she said. “But being conscious is a first step.”

****

Kathleen M. Blee, Distinguished Professor of Sociology and associate dean for graduate studies and research, offered solutions based on changes that have been made in the Dietrich School of Arts and Sciences with regard to faculty hiring and graduate school admissions.

Changing implicit bias may be hard, Blee said in her presentation, “Mitigating the Effects of Implicit Bias in Hiring and Admissions,” but “it turns out it’s not that difficult to start to change the outcomes of processes in which there’s implicit bias.”

What’s different is the goal: Outcomes can be changed “not by changing people’s inherent implicit bias, but changing the relationship of implicit bias to outcomes.”

She reiterated that implicit bias often conflicts with one’s beliefs. It is a natural process that has to do with rapid processing of information. “As people need to process a lot of stimuli coming in, we put it in boxes … It’s the basis of stereotyping. It’s not good or bad; it’s on its face neutral. It’s a way that cognitive processes work.”

Existing outside our awareness, “implicit bias affects our judgment, without us meaning to do so. That’s why it’s so difficult to eradicate because it’s not a conscious process,” Blee said.

Along with the sort of implicit bias that makes people favor their own groups, there’s another kind: “Mostly we think about it as preferring our own group, but it can be an equally strong or stronger positive bias toward replicating what we know about that position,” Blee said.

“In a department in which most people are white, people have implicit bias toward white applicants. That is true whether or not the person making the decision is themselves white,” she said.

It makes sense, she said: “When we’re trying to hire somebody or admitting somebody in the medical school or a graduate program, we’re looking for success. … Implicitly in our mind, what’s successful is people who have been successful. We have an implicit understanding of the kind of person who’s going to be successful, based on the kind of people we know who are successful — people like us — who are the majority occupants.”

Social science research tells us that implicit bias is not strongly correlated with beliefs: “People can have very strong social justice beliefs and a wealth of implicit bias,” she said. “Changing people’s minds actually has not that much effect on implicit bias.”

Similarly, it’s not well correlated with one’s own social position, she said. “Women have implicit bias toward men when they’re looking at occupations that are male dominated, almost the same as males do toward men. Nonwhites have almost as much of this bias favoring whites as whites do,” she said.

“Implicit bias also is stronger with ambiguity, it’s stronger with time pressure, stronger where there’s a lack of critical mass,” she said.

*

Search committees should be diverse, but because implicit bias isn’t well correlated with one’s own social category, simply diversifying search committee membership makes little difference, Blee said.

• “What is important in a search committee is developing explicit criteria. The more ambiguous the criteria, the greater the possibility and certainty of implicit bias,” she said. “Squishy criteria … ‘We want somebody who fits in well… who will be a good colleague … who’s really engaged,’ are laden with implicit bias,” Blee said. “Replace them with very explicit criteria.”

• Diversity and excellence are not competing goals, she said. It helps to have an explicit discussion at the beginning about “ways in which being a diverse place is itself a goal of excellence,” she said.

• Searches generate enormous amounts of time pressure. “One thing that really reduces implicit bias is any measure that reduces the time pressure,” she said. “It turns out that if the committee that’s reviewing applications simply stops twice: That they go through the first screening and they stop for five minutes and say ‘Is this who we want? Let’s just think about it.’ … Any stops you put, any considerations in the process, even if they’re very brief, change the outcome and make the outcome more diverse.”

• Having inclusive rather than exclusive strategies for evaluation results in a more diverse pool of final candidates, she said.

Rather than finding reasons to eliminate candidates, “you get a much more diverse pool if you look at every application and think for a minute: ‘What’s a good thing about this application?’ It may not be good enough and you might get rid of it,” she said, but stopping to think of a strong point may bring out a hidden strength that might not ordinarily be considered.

• Critical mass is a factor. “If you have multiple candidates in the same social category, you’re less likely to have implicit bias than if you have only one,” she said. In the Dietrich school, if a search committee comes up with a slate that is itself diverse, but has no critical mass — three men and one woman, three whites and one nonwhite, for instance — “We add another interviewee of that other category to create critical mass,” so that the candidate from the underrepresented group isn’t representing everything about that group, she said.

Most importantly, Blee said, “We insisted that the department develop explicit multiple criteria,” and that all applicants have to be evaluated on all the criteria. Evaluators also must state the basis for their evaluation, she said.

Graduate admissions are being handled in a similar manner, she said. Departments are asked to figure out what exactly they’re looking for in a candidate, ask for evidence and evaluate based on what they’re looking for, Blee said.

“Departments often look for candidates who have had research experience or whose mentor they know.” That may favor students from the Ivy League or from research schools.

Is research experience what makes someone a good student? “Probably not,” she said. Rather, “What is it that you’re looking for, for which you’re using research experience as an indicator?

“Are you looking for perseverance or grit? Somebody with experience pulling off a long-term project, someone who can follow orders?” she asked. “Figure out what that is. Tell the student what you’re looking for,” then ask them to explain why they’d be a good candidate.

This enables the student from Yale to cite lab experience, while the student from Podunk College can point to something else they have done that demonstrates the same qualities, Blee said.

“It’s mitigating the implicit bias of replicating ourselves and it’s getting closer to selecting students based on what we really want students to have,” she said.

Departments have absorbed this process into their culture, Blee said. “It’s been self-reinforcing because it’s changed the outcome,” she said.

“My sense is that people really want a more diverse faculty and graduate population,” she said, adding that they’ve been receptive to having a tool that allows them to create a more diverse pool of candidates.

“People like the idea that this is fixable. Changing one’s own implicit bias, as everyone has said, is a really hard slog. And that’s the gold standard. But here’s a middle ground that will move us in the right way while we’re doing the longer-term projects,” she said.

—Kimberly K. Barlow 

Filed under: Feature, Volume 49 Issue 5
