The Link Between Implicit Bias, Trust, and Neuroception

During my time as an executive with the Federal Bureau of Investigation (FBI) and the Mississippi Department of Human Services (MDHS), I was asked from time to time what I did at work.

I knew what they were asking. They were seeking an exciting, risk-taking, or juicy political story that might be worth sharing. I hated telling them that most of the time my job was navigating government bureaucracy. In fact, I'm pretty sure that in the FBI we put the "bureau" in "bureaucracy."

Instead, I told my friends that I spent my time “thinking big thoughts.”

I'm not sure that answer really resonated with anyone, but it was true. I spent a lot of time thinking. In the FBI, it was thinking about ideas, information, and investigations. At MDHS, it was thinking about ways to provide assistance to children and families in considerable need.

I’m still trying to think big thoughts.

Recently, my thoughts have been centered around three seemingly disparate ideas: how implicit bias works; how we learn to trust others; and how the Polyvagal Theory informs our understanding of bias and trust.

On May 3, 2021, National Public Radio (NPR) published an article by Rose Eveleth titled, “You’re Probably Not As Open-Minded As You Think. Here’s How To Practice.”[1]

In the article, Ms. Eveleth postulates that most people are not as open to new ideas as they think they are or would like to be. A primary reason is that it “can be hard to reconsider long-held beliefs, and even harder to question things you didn’t even know you believed in the first place.”[2]

She acknowledges that our brains do a lot of things without our conscious control. For example, we breathe without thinking about it. We make split-second decisions without thinking. And we often pick up ideas from around us without even knowing it.

Sometimes these unconsciously learned ideas, whether positive or negative, can spill over into hot topic areas such as race, gender, education, medicine, and religion.

In other words, we have implicit biases.

Which got me thinking. What are researchers saying about implicit bias and what should I know?

In a September 2019 article published in the Journal of Experimental Social Psychology, the authors state, “Implicit biases are associations and reactions that emerge automatically and often without awareness upon encountering a relevant stimulus.”[3]

Other researchers in the journal Current Problems in Pediatric and Adolescent Health Care stated that implicit biases can create either unfavorable or favorable opinions of others “based on age, gender, race/ethnicity, weight, and appearance.”[4]

In the law enforcement and criminal justice professions, we might form stereotypical impressions of people when we hear such words as “criminal” and “dangerous.” This impression could predispose someone to think they either “know those types of people” or that these types of people are “certainly guilty.”

In the health care profession, implicit bias may “cause providers to unintentionally make assumptions about their patients based on stereotypes, such as having a lower expectation for patients to comply with medication regimens or assuming a patient is exaggerating symptoms based on their socioeconomic status or racial/ethnic background.” This is particularly an issue when it comes to racial, religious, and refugee issues.[5]

Implicit bias can occur when we hold negative attitudes or feelings towards people who are not part of our "ingroup." You might think of an "ingroup" as individuals who most likely look like you, have a similar background to you, and with whom you regularly socialize. People who don't fit these categories are said to be in an "outgroup."

Are ingroups and outgroups inherently wrong? No, I don’t think so. Where it becomes troublesome is when we start thinking that our group is better than their group; when we start favoring people in our group over another group; or when we don’t hold people accountable for their implicit attitudes and behaviors.

At the heart of implicit bias is conscious awareness. If I were to describe the racist language and behavior used by some people, we would be shocked, angered, and appalled. We would understand such behavior is wrong and should be condemned. However, part of the struggle when it comes to addressing or condemning such language or behaviors is that, based on our childhoods, experiences, and environments, we often see "racist" or "bad" or "fill-in-the-blank" people as being in the "outgroup" and not in our "ingroup."

In other words, it’s always “them” and not “us.”

Research has shown that “people typically unconsciously hold more negative attitudes or feelings about racial/ethnic outgroup, compared with ingroup, members.”[6]

What’s more troubling is that researchers “found that people hold perpetrators less accountable for discriminatory behavior when it is attributed to their implicit, rather than explicit, attitudes.”

Unfortunately, that means we are more likely to give people a pass or the benefit of the doubt if we believe their actions and attitudes are unintentional (implicit) rather than intentional (explicit).

Look, my thoughts are not meant to focus primarily on issues of racial bias. I mention race as a prime example of how implicit bias can shape our views and opinions based on our childhood and experiences.

I’m equally concerned about how we see other important societal issues such as substance abuse, homelessness, mental illness, poverty, violence, abuse, neglect, and incarceration to name just a few.

If we as a human race are going to better address this problem – I hesitate to say "solve this problem" – we are going to need to better understand our implicit biases. It would help if we could develop a different perspective on the world and become more empathetic.

Researchers have a name for it: perspective taking. I know; the name doesn’t sound all that original. Nevertheless, we can either “imagine oneself in the situation of the other (i.e., Self-focus), or one can attempt to put oneself in the shoes of the other (i.e., Other-focus).”[7]

The trouble is that research shows taking another person's perspective does not help us "gain an accurate understanding of how they feel, even when verbal information is available."[8]

We may feel closer, but we might still be blind to how they feel.

What we are better at, as human beings, is reading a person’s facial expressions. In their expressions, we see the emotions and the intensity behind those emotions. With one major exception: we are more empathetic towards people in our “ingroup” than we are towards those in an “outgroup.”[9]

Once again, our implicit biases betray us if we haven't worked to overcome them.

And we can work towards overcoming them with practices such as: "exposure to individuals with [different experiences], increasing intergroup contact, education about implicit bias, accountability for bias, understanding other's viewpoints and increasing empathy, and self-monitoring."

It’s not easy. Often people can’t get beyond page one because, generally, no one likes to think of themselves as biased towards other groups of people. As soon as we start talking about race, religion, and recommendations we start building walls and defenses and counterarguments.

This means we fail to develop trust, which is key to our relationships and to addressing our differences.

Next week we will take a look at trust; how we develop, ascertain, and learn trust; and how implicit biases might affect our ability to trust others.

[1] Rose Eveleth, “You’re Probably Not As Open-Minded As You Think. Here’s How To Practice,” National Public Radio, May 3, 2021,

[2] Ibid.


[3] Natalie M. Daumeyer, Ivuoma N. Onyeador, Xanni Brown, and Jennifer A. Richeson, "Consequences of attributing discrimination to implicit vs. explicit bias," Journal of Experimental Social Psychology 84 (September 2019): 103812, 1-10.

[4] Jeanette Schnierle, Nicole Christian-Brathwaite, and Margee Louisias, "Implicit Bias: What Every Pediatrician Should Know About the Effect of Bias on Health and Future Directions," Current Problems in Pediatric and Adolescent Health Care 49 (2019): 34-44.


[6] Natalie M. Daumeyer et al.

[7] Jacob Israelashvili, Disa A. Sauter, and Agneta H. Fischer, "Different faces of empathy: Feelings of similarity disrupt recognition of negative emotions," Journal of Experimental Social Psychology 87 (March 2020): 103912, 1-14.


[9] Puma Kommattam, Kai J. Jonas, and Agneta H. Fischer, "Perceived to feel less: Intensity bias in interethnic emotion perception," Journal of Experimental Social Psychology 84 (September 2019): 103809, 1-7.