I don’t think it’s surprising news to say that we have a diversity and inclusion problem in tech. A quick glance at the demographics of surveys such as the State of JS or the State of CSS will quickly show how disproportionate our industry is. Progress is happening, slowly—but it’s often impeded by myths and incorrect assumptions. In this blog post, we’re going to take a look at some commonly heard arguments about why D&I isn’t important, relevant, or necessary and break them down based on facts.
This is one of the myths that gets thrown around most often to dismiss discussion around diversity in the industry. Because coding is thought to be based entirely on objectivity and logic, we often hear that an individual’s personal identity is irrelevant and that their skill is all that matters.
However, the facts show that teams with more diversity are more likely to consider all the various aspects of a product from their own lived experiences—aspects that may be overlooked or disregarded by homogenous groups.
A Harvard Business Review article states that diverse teams are more innovative, focus more on the facts, and are more careful with their decision-making. Homogenous group members tend to make assumptions—which are then (often incorrectly) validated when their peers (who have similar backgrounds and experiences) don’t disagree with or challenge those assumptions. However, having our opinions and assumptions challenged is crucial for both creating new ideas and distinguishing between opinion and fact.
Furthermore, teams with more diversity also show more empathy—a crucial part of designing software and applications that users will love. As stated in the book Mismatch: How Inclusion Shapes Design: “No degree of wearing a blindfold will ever be equivalent to the experience of being blind. The blindfold can actually give designers [and developers] a false sense of empathy, especially if they attempt to simulate disabilities without ever meeting or working alongside people with disabilities.”
When a team is composed entirely of people who are attempting to understand a specific situation or user group from an outsider’s perspective, the conclusions they draw are less likely to be accurate—a flaw that will inevitably be reflected in the features and products they build.
This is a particularly harmful and insulting myth that, unfortunately, is still perpetuated. In truth, there is no correlation between race, gender, sexual orientation, or any other element of diversity and candidate quality.
This misconception is often rooted in the truth that diverse teams do experience more conflict and friction. While this can feel frustrating in the short term, research has shown this is actually a feature—not a bug.
Consider the following excerpt from Technically Wrong by Sara Wachter-Boettcher:
“Dealing with outsiders causes friction, which feels counterproductive,” write researchers David Rock, Heidi Grant, and Jacqui Grey. But experiments have shown that this type of friction is actually helpful because it leads teams to push past easy answers and think through solutions more carefully. “In fact, working on diverse teams produces better outcomes precisely because it’s harder.”
Furthermore, our concept of what a necessary skillset looks like can, itself, be biased and incorrect. Often, our understanding of what “qualified” looks like is based heavily on comparison to our own experiences.
One common example of this is which colleges and universities we perceive to be “good” schools. An Ivy League education has long been associated with a higher quality of candidates—but admission to these schools is inherently less accessible to certain groups, and a degree from one of these institutions doesn’t guarantee that an individual will be high-achieving. Rather than placing heavy emphasis on credentials alone, we should focus on whether or not an employee has the necessary traits and skills to succeed in the workplace.
For years, we were told that the lack of women and members of underrepresented groups in our tech workforce was due to a failure to promote STEM (Science, Technology, Engineering and Math) subjects in our schools and make them accessible to these groups. If there was ever a grain of truth in this, however, we have most certainly addressed the problem. Groups such as Girls Who Code and Blacks In Technology (just to name a couple) have done excellent work in this area.
In truth, diverse candidates now exist in abundance. If these individuals are not applying to certain jobs, the issue is more likely related to the hiring process than the availability of the candidates. In fact, a USA Today article from 2014 states that “top universities turn out black and Hispanic computer science and computer engineering graduates at twice the rate that leading technology companies hire them”—and that was nearly a decade ago!
Many companies are still hiring based on “culture fit,” which can lead to the rejection of otherwise qualified candidates—even when the hiring pool itself is diverse. As Wachter-Boettcher puts it in Technically Wrong: “Regardless of how many women and underrepresented minorities study computer science, the industry will never be as diverse as the audience it’s seeking to serve—aka: all of us—if tech won’t create an environment where a wider range of people feel supported, welcomed, and able to thrive.”
Because we’re deep in the weeds of the work we do every day, it can be easy to lose sight of how what we create is being used. It’s a common misconception that our capacity to inflict harm is low or non-existent just because we’re not working on world-changing projects or cutting-edge technology. In truth, every interaction with our users—even the seemingly small ones—has the potential to cause significant damage and hurt.
Furthermore, every single one of us has inherent biases based on our own lived experiences. This isn’t a criticism or indication of failure; it doesn’t make us bad people. Rather, it is an unavoidable truth.
For example, consider Eric Meyer’s experience with the infamous Facebook Year in Review feature, which combed through his profile for most-liked photos over the last year and presented him with a cheery, celebratory slideshow of photos of his recently deceased 6-year-old daughter. On the surface, a Year in Review feature seems incredibly low-stakes and inherently harmless … and yet, for Eric, it was a “jarring,” “cruel” and deeply painful experience.
Our personal biases extend far beyond what we might first consider and can still cause substantial harm, even when (from our biased perspective) the stakes seem to be low. For those of us who have never experienced the loss of a child, the consideration of such a possibility probably wouldn’t even cross our minds. This is an unfortunate demonstration of the ways in which our own lived experiences bias our perceptions.
Eric Meyer went on to write Design for Real Life, co-authored with Sara Wachter-Boettcher, in which they state: “Moreover, whenever you tell yourself nobody would ever act a certain way or come to your site in certain conditions, that moment should raise a huge red flag in your head. Written on that flag, in block letters, should be the words UNSUBSTANTIATED ASSUMPTION.”
Our own biases will inevitably seep into the work we create; it’s unavoidable. Rather than pretending that it won’t happen, we need to factor this reality into our processes so we can minimize harm as much as humanly possible.
For a business to work, it needs to make money—there’s no arguing with the facts. However, many inaccessible or exclusionary decisions are made under the incorrect impression that only the “majority” group has an impact on the profit potential of a piece of software. In reality, the very concept of the “majority” group is, itself, a myth.
There is no truly “average” user; every user is unique and cannot be accurately defined by statistics. User personas, if not done with the utmost care and based on real data, become stereotypical and serve only to reinforce our own assumptions about who uses our product. Inclusive design and development will improve business naturally by creating innovative products that are easy to use and appeal to a large customer base.
In Design for Real Life, Meyer and Wachter-Boettcher make the argument that, “When we, the people who make digital products, don’t take stress cases into account, we miss out on designing for people who aren’t like us, people whose fears and challenges are different from our own. This can mean failing to reach—or even driving away—people who want and need digital products that fit their lives.”
By de-prioritizing the situations that we (likely incorrectly) perceive to be less important, we are actually losing potential users by telling them through our actions “This product isn’t for you.”
This can be a particularly hard one to contend with because it requires quite a bit of humility and vulnerability on our part—things which are challenging for all of us, but especially so in the workplace where we’re often told that we need to be confident, powerful and cutthroat in order to succeed.
Messing up and owning it is part of the deal. Especially when you’re just starting out, there’s a lot to learn. But it’s always better to make an effort, learn and grow than to abandon the idea of inclusivity altogether.
Kat Holmes puts it perfectly in Mismatch: How Inclusion Shapes Design:
"Inclusion is imperfect and requires humility. It’s an opportunity to be curious and approach challenges with a desire to learn. It teaches us new ways to adapt our solutions to what people need, which is sometimes different than how a designer [or developer] thought their solution would work.”
Kathryn and Alyssa hosted the stream Debunking Myths about D&I in Tech: Ethics & Bias in Computer Science on March 28, 2023. Check out the recording to see them, along with special guests Selam Moges, Deena McKay, Myriam Jessier and Gift Egwuenu, break down myths and take a look at both the challenges and opportunities of D&I in tech.
Kathryn Grayson Nanz is a developer advocate at Progress with a passion for React, UI and design and sharing with the community. She started her career as a graphic designer and was told by her Creative Director to never let anyone find out she could code because she’d be stuck doing it forever. She ignored his warning and has never been happier. You can find her writing, blogging, streaming and tweeting about React, design, UI and more. You can find her at @kathryngrayson on Twitter.