This is the full article version of the talk I gave at Hacking Diversity, one of the conferences at Hacker Summer Camp in Las Vegas (along with Black Hat, DEF CON and BSidesLV – check out our conference coverage right here!).
Over the past year or two as I’ve spoken about unconscious bias, I’ve been surprised to find that many people are unfamiliar with this term. Sure they might know what ‘unconscious’ means (not awake) and what ‘bias’ means (prejudiced or unfair), but they don’t quite see what this has to do with building a new product or technology in general.
So let’s start with a definition.
What Is Unconscious Bias?
Unconscious bias is not the same as blatant prejudice. For example:
You might believe that you’re the most liberal-minded person in the world – you’d never use derogatory language about people of color – until you pull up at a stop sign, see two black guys standing on the corner, and lock your door.
Recently someone recounted a story to me about sitting in a movie theater right behind a talkative guy – that was the story. The guy talked throughout the movie. But this person started his story with “I was sitting behind this Chinese guy in a movie theater….” I asked why he mentioned the talker's race since it had absolutely nothing to do with the story. He thought for a moment and then said, “You’re right. I don’t know why I mentioned that.”
Uber sent a promotional email about “Wife Appreciation Day,” which urged husbands to order UberEats to give their wives a break from cooking.
Even after situations like the ones described above, many people still fail to recognize that those actions or words stemmed straight from their unconscious bias. These people might have plenty of African-American friends or Asian colleagues or know husbands who cook – which makes it harder to see unconscious biases in yourself, because you truly believe that you are not racist or sexist or homophobic, etc.
It’s also easy to point to other explanations or rationales. But replay these – or any – situations in your mind and swap in two older white women for the two young black men, or a white guy for the Chinese guy. Would you still lock your car door? Would you still mention the talkative guy’s race?
There are more than 150 types of implicit or cognitive biases, and each is quite specifically defined.
Cognitive bias is a psychological concept in which "people use flawed judgement to assess data," and while unconscious bias is related, it is not precisely the same thing. For instance, you might hear "two political arguments and assign greater validity to one than the other. [You] will believe that [your] assessment is based on rigorous logic whereas in fact, [you are] simply drawn towards the argument that confirms beliefs [you] already hold." That's cognitive bias.
For the purpose of this article, I wanted to share the cognitive biases for context, but I will be discussing unconscious biases. Here are just a few examples:
Gender – Waiting for a nurse to see you…and then feeling surprised when a man walks into the room. Someone announces that they’re having a baby, and you ask what the gender is so that you know whether to buy a pink or blue onesie.
Race – Being surprised that an Asian person isn’t good at math.
Religion – Getting nervous when you see a woman in a burka or a man in a turban.
Sexual orientation – Assuming women have boyfriends and men have girlfriends. Saying you’re not homophobic, but that you’d just die if your kid was gay.
Age – The belief that you’re ‘over the hill’ after age 40. Believing that you should be married with kids, a mortgage and a flourishing career by the time you are 35 or there's something wrong with you.
Neurodiversity – Assuming that Autistic people cannot hold jobs or live on their own because they are less intelligent than neurotypicals. Asking an introvert why she can’t just come to the party and have a good time like everyone else.
Societal ‘norms’ – Calling a married couple selfish when they choose not to have kids. Hearing a man say that he is a stay-at-home dad…and thinking that he’s not a ‘real man’.
Language – Using mankind instead of humans/humankind, manmade instead of synthetic, using Mrs. and Miss to determine a woman's marital status instead of Ms. which is relationship-neutral, using wife beater instead of undershirt, using Indian giver, gyp, ghetto blaster or any other words and phrases that stereotype a group of people.
Does Diversity Training Work?
After the incident at a Philadelphia Starbucks in April of this year where two African American gentlemen were arrested after asking to use the restroom, city officials – not to mention the entire Internet – were demanding that Starbucks start providing diversity and inclusion training for all their employees.
But other people have been pessimistic about this kind of training, arguing that it doesn’t work because, by their very nature, unconscious biases are invisible. Jack Glaser, a Professor of Public Policy at the University of California, Berkeley, and Mike Noon, Professor of Human Resource Management at Queen Mary University of London, have both made this argument.
Case in point:
In 2013, Google created a 90-minute unconscious bias training program for its employees.
In 2015, Facebook created a site to make their unconscious bias training videos widely available.
In 2016, the U.S. Department of Justice announced its new Implicit Bias Training for personnel.
And yet, 5, 3, and 2 years later, discrimination based on gender, race, age, religion, sexual orientation and neurodiversity is still running rampant in these companies, not to mention the country in general:
In 2017, a Google employee wrote a 10-page anti-diversity memo in which he said that not as many women were as successful as men in the tech industry because women are neurotic.
In 2016, Facebook CEO Mark Zuckerberg publicly asked his employees to stop crossing out "black lives matter" and replacing it with "all lives matter" on the walls of MPK.
In 2017, the U.S. Department of Justice “reversed an Obama-era ruling that protected transgender employees from discrimination. All U.S. attorneys and federal officials were informed via a memo that individuals are not guaranteed protections based on gender identity under Title VII of the Civil Rights Act of 1964.”
So, does diversity training work? Not so much. Why not?
Why not? Because of something called...
The Reticular Activating System
The Reticular Activating System (also called reticular formation) is “a diffuse network of nerve pathways in the brainstem connecting the spinal cord, cerebrum and cerebellum, and mediating the overall level of consciousness.”
The brain processes 400 billion bits of information per second, but we are consciously aware of only 2,000 of those bits. We couldn’t possibly be consciously aware of all 400 billion bits or our minds would literally be blown. It would be far too overwhelming. And this is where the Reticular Activating System comes in.
Because you simply cannot pay attention to every single thing around you consciously, your RAS chooses which things – people, situations, words, etc. – to bring to your attention and lets your subconscious deal with the rest of the info coming at you.
That's why when you're in an airport where they're constantly announcing passengers' names, it's all white noise to you until they announce your name. Or why the day you purchase a maroon-colored Nissan Altima, suddenly you see maroon-colored Nissan Altimas everywhere. It’s not that everyone in town bought the exact same car on the exact same day as you – there have always been maroon Altimas on the road; it’s that this specific object is significant to you and therefore your RAS has brought this info from your subconscious to your conscious.
So if you believe that rich people are greedy and corrupt, you're right.
And if you believe that rich people are decent and generous, you're right.
In both cases, you can probably point to a dozen incidents that prove that you’re right. But what’s happening is that your RAS filters in only examples of rich people behaving dishonestly or immorally – and filters out all examples of rich people behaving honestly and generously.
Because you don’t see what’s being filtered out, you naturally assume that your belief is The Truth.
How does the RAS choose what to consciously process and what to ignore (well, not really ignore, but just subconsciously process)? It seeks information that reinforces your beliefs and it ignores information that contradicts your beliefs. And it filters everything around you through the rules and boundaries that you give it, and your beliefs shape those rules and boundaries.
I spoke to a guy once whose belief was “women cheat” (and therefore not to be trusted) because he had had three serious relationships and all three women wound up cheating on him. So to him that wasn’t a string of bad luck; he never questioned why he was continuing to attract women who were unfaithful. To him, this was a universal truth.
His unconscious bias was dictating his relationship pattern. Because if “women cheat” were a universal truth, this would happen to everyone, all the time, around the world.
Unconscious Bias and Technology
Now that we know what unconscious bias is, that we all carry some form of it, and how it works in the brain – what does any of this have to do with technology? So your grandparents use the term “lady doctor” – that’s pretty benign, right?
YouTube launched a video upload feature for their app and discovered that about 10% of videos were being uploaded upside-down. Turns out that Google engineers had designed the app for right-handed users, and therefore left-handed users had to rotate their phone 180 degrees – leading to upside-down videos.
Air bags and seat belts have historically been tested on male crash test dummies, which puts women at a 47% greater risk because they tend to be smaller than men and therefore sit closer to the steering wheel. When safety regulations were introduced in the car industry in the 1960s, the test dummies were male. It wasn’t until 2011 that the first female crash test dummy was used.
Clinical research has generally been based on male subjects. "This has been in spite of increasing awareness … [that] women and men differ in their susceptibility to and risk for many medical conditions, and they respond differently to drugs and other interventions. The close of the previous decade saw 8 out of 10 prescription drugs withdrawn from the U.S. market because they cause statistically greater health risks for women.”
Apple Health app lets users “monitor all of your metrics that you’re most interested in” – from calories to heart rate to blood alcohol content to daily intake of sodium. But there’s one thing that Apple Health doesn’t track: menstruation. As The Verge said: “is it really too much to ask that Apple treat women, and their health, with as much care as they've treated humanity’s sodium intake?”
Google’s algorithms. When you Google “three black teenagers” you get images of mug shots and when you Google “three white teenagers” you get clean-cut white kids having a good time. Googling “successful woman” comes up with images primarily of young, beautiful white women – and Oprah. Googling “successful man” displays images primarily of young, handsome white men. Google Doodles are white-male-centric: only 17% of doodles were about women and less than 5% honored women of color.
Heidi vs. Howard. In 2003, two professors ran an experiment where they shared a case study about a real-life successful entrepreneur, Heidi Roizen. The professors had half the students read Heidi’s story and the other half read "Howard's" story (which was the same exact case study with only the name and gender changed). Howard was perceived as the more appealing colleague while Heidi was seen as selfish and not someone you would want to hire or work for.
Research shows that female students do better than male students in math and science – when the exams exclude their names (and thus genders). Otherwise, teachers tend to overestimate the boys' abilities and underestimate the girls'.
Research from the National Center for Biotechnology Information (NCBI) shows that parents believe their sons are inherently more intelligent than their daughters. Parents Google “Is my son gifted” two and a half times as often as “Is my daughter gifted”, even though more girls than boys have been in gifted programs at school since 1976. And there are three times as many Google searches for “Is my daughter ugly" as for “Is my son ugly”.
I’ve heard plenty of people say that their young daughters just naturally became interested in princesses and makeup and their sons in robots and dinosaurs, and that they did not push them in that direction. But I’d bet money on the fact that they did indeed – just unconsciously. When you decorate your baby girl’s nursery in pink fairies and your baby boy’s nursery in blue trucks, you’re pushing them in that direction. When you call your daughter “baby girl” or "princess" and your son “little man” or "big boy" you’re pushing them in that direction.
Joy Buolamwini, MIT grad student, explained in her TED Talk "How I'm Fighting Bias in Algorithms" that "over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect." The facial recognition software recognized all the white-skinned people, but it couldn't "see" her dark-skinned face – unless she put on a white mask. Check out her 9-minute TED Talk:
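Her point about narrow training sets can be sketched in a few lines. In this toy model (the numbers and the single "brightness" feature are invented purely for illustration), a detector trained only on light-skinned examples rejects anything that "deviates too much from the established norm":

```python
# Toy sketch with invented numbers: each "face" is reduced to one
# feature -- average skin-tone brightness on a 0-255 scale.
training_faces = [200, 210, 190, 205, 215]  # only light-skinned examples

mean = sum(training_faces) / len(training_faces)
TOLERANCE = 40  # the detector accepts anything close to what it has seen

def detects_face(brightness):
    """'Detect' a face if it sits near the training distribution."""
    return abs(brightness - mean) <= TOLERANCE

print(detects_face(195))  # True  -- close to the training data
print(detects_face(70))   # False -- a darker face "deviates too much"
print(detects_face(240))  # True  -- a white mask brings it back in range
```

Nothing in the code is malicious; the bias comes entirely from what the training set did and did not contain.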
Google always blames these prejudices on their “neutral” algorithm, but it’s not the algorithm, it’s the human behind the algorithm.
Computers aren’t biased or opinionated, of course (except maybe my smartphone; one day while lying on the table it opened my dictionary app and looked up the word 'idiot', which I took as a personal jab), but every single piece of technology is programmed by a human, and humans have biases. And those unconscious biases get baked right into the technology.
So now think about machine learning: a human builds a product and programs its algorithm, and then, thanks to AI, that product can continue to “learn” and make decisions on its own. But much like a cancer cell – just a regular cell gone renegade – machine learning that was originally programmed by a fallible human takes on that human’s biases, biases the programmer isn’t even aware of having.
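A toy example makes the mechanism concrete. The "hiring records" and the stand-in "model" below are entirely made up; the point is only that a system trained on skewed historical decisions faithfully reproduces the skew:

```python
# Illustrative sketch (hypothetical data, not a real hiring system):
# a model trained on biased historical decisions reproduces that bias.
from collections import Counter

# Hypothetical records: (gender, hired?). The humans doing the
# hiring favored men, so the training data is skewed.
history = [("man", True)] * 80 + [("man", False)] * 20 \
        + [("woman", True)] * 40 + [("woman", False)] * 60

def train(records):
    """'Learn' the hire rate per group -- a stand-in for model training."""
    hired = Counter(g for g, h in records if h)
    total = Counter(g for g, h in records)
    return {g: hired[g] / total[g] for g in total}

def predict(model, gender):
    """Recommend a hire when the learned rate for the group exceeds 50%."""
    return model[gender] > 0.5

model = train(history)
print(predict(model, "man"))    # True  -- the historical bias survives
print(predict(model, "woman"))  # False -- identical candidate, different outcome
```

No line of this code mentions a preference for men, yet the trained model has one – which is exactly why "the algorithm is neutral" is not a defense.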
How Do You Make Unconscious Bias Conscious?
So how do you tackle an unconscious bias so that it doesn’t show up in the workplace, in consumer products and in algorithms? To start with, it’s going to take a helluva lot more than a 90-minute training video – or even a weekly training program.
Phillippa Lally, a health psychology researcher at University College London, published a study in the European Journal of Social Psychology examining how long it really takes to form a habit. It depends, of course, on the behavior, the person and the circumstances, but her study shows that it takes between 18 and 254 days (roughly 8 ½ months at the high end) for people to form a new habit and make it stick. And just to be clear – that’s 18-254 days in a row. If you miss a day, you have to start back at day one again, or else you won't give your brain a chance to forge new neural pathways.
To break a bad habit (which is essentially what an unconscious bias is – remember, it is a learned stereotype that is automatic, unintentional, deeply ingrained and influences our behavior), there are three general stages: Awareness, Acceptance, and Action.
Awareness – We tend to take actions based on the things that we consistently say to ourselves, so you have to be mindful about what you think on a daily basis. Your thoughts drive your emotions, your emotions drive your actions, and your actions drive your outcomes. That’s why we’ll never see true change by just altering our actions. If we don’t also change our thoughts, they will always lead us back to the same old actions, and thus the same old outcomes.
So start by setting an intention to simply observe your thoughts, your words, your actions, and the words and actions of others. Don’t judge – just notice. If you judge, you'll probably force the conscious awareness back down to the unconscious.
To help you become more aware, you can take Harvard’s Implicit-Association Test (IAT) to assess your own unconscious biases.
Acceptance – Most people, upon hearing a phrase like ‘wife beater’, will generally take one of two extremes: point an accusatory finger and yell “Sexist! Misogynist!” or wave a dismissive hand and say “Good grief, you’re too sensitive. He didn’t mean it!”
The third reaction – that the person’s unconscious bias caused them to say a stereotype without thinking – is harder to accept because it’s in that hazy, gray area. Of course it’s sometimes a clear, overt prejudice, but much of the time, it’s unconscious bias, particularly in a person whom you truly know to be a good guy or gal.
So accept that you are imperfect. Accept the discomfort of knowing that you have preconceived notions about certain people, groups of people, words or situations. Accept that you have some work to do, just like going to the gym to shed excess weight. And remember, accepting that something exists is not the same as condoning it.
Action – Once you’ve become aware, make the decision to stop using stereotyped words and stop making limiting declarations about entire groups of people. No, this is not a quick-fix pill, and yes, you will be required to do the work.
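That kind of vigilance over word choice can even be partly automated. Here's a minimal sketch of a wording check; the term list and suggested alternatives are illustrative only, drawn from the language examples earlier:

```python
# Minimal inclusive-wording check; the term map is illustrative, not exhaustive.
import re

# Hypothetical map of stereotyped terms to neutral alternatives.
ALTERNATIVES = {
    "mankind": "humankind",
    "manmade": "synthetic",
    "wife beater": "undershirt",
    "lady doctor": "doctor",
}

def suggest(text):
    """Return (term, alternative) pairs for any flagged phrases in text."""
    found = []
    for term, alt in ALTERNATIVES.items():
        if re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            found.append((term, alt))
    return found

print(suggest("A manmade lake, known to all mankind."))
# [('mankind', 'humankind'), ('manmade', 'synthetic')]
```

A check like this could run as a pre-commit hook or a docs linter – not to shame anyone, but to surface the unconscious choice long enough to make it conscious.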
Here's one of my favorite examples of diversity and inclusion in action. Earlier this year I was at a conference where author, activist and journalist Cory Doctorow gave the closing keynote. When he was done, he opened the floor for questions by saying, "So, if we could alternate between people who identify as women or non-binary and people who identify as men or non-binary, that would be great." (Scroll to 45:30 in the video below.)
What I loved about it was that he didn't make a big deal about it, he didn't explain why he was wording it this way, he didn't lecture the audience about it, he just showed by example.
So, watch your thoughts, accept that you have unconscious biases, consciously and vigilantly choose the words you say – and for the love of god, stop saying 'lady policeman'!
Only then will diversity training have any effect.