Image by Joe Woods

06.02.23

INVISIBLE WOMEN: EXPOSING DATA BIAS

What do smartphones, crash test dummies and a drug used to treat insomnia have in common? The answer may surprise you. But before we get to it, here’s another question to ponder: is the data that makes our world go around biased?


These two questions get to the heart of Caroline Criado Perez’s fascinating 2019 book Invisible Women: Exposing Data Bias in a World Designed for Men.


If you hadn’t already guessed, the items listed above are just three of many eye-opening examples of ‘data gaps’ that have led to designs which put women at a disadvantage. Criado Perez has plenty more examples – from tax systems and office culture to urban planning and disaster relief, the scale of the problem is staggering.


WHAT IS A DATA GAP?


The great thing about scientific research is that it’s completely objective – we can rely on it not to fall prey to the subjective biases that can afflict human decision-making. That’s the idea anyway. The truth, however, is a bit more complicated.


Take, for example, the case of the pulse oximeter. This small medical device clips onto a patient’s fingertip and gauges how much oxygen is in their blood by measuring light reflected from under their skin. Less light means more oxygen.


It’s a widely used device, and the results it delivers are potentially lifesaving. You’d like to think it would work equally well for everyone. But it doesn’t. The design of the pulse oximeter was based on data gathered from too narrow a range of skin tones. As a result, this important bit of medical kit is much more likely to misdiagnose people who have darker skin.


This is just one example of a ‘data gap’ – a sinkhole that opens up when data is incomplete. The effects of these often unintentional data gaps are everywhere and can range from the inconvenient to the life-threatening. I explored the issue in a recent iluli video.


Criado Perez’s book shines a light on gender data gaps and their impact on women. These are largely the result of women not being adequately represented in research studies, leading to decisions which fail to account for gender differences.


THE DEFAULT MALE


The smartphone, crash test dummy and insomnia drug I mentioned at the start are all examples of products that may have been designed to be universal, but ultimately cater better for men than for women.


Criado Perez points out that smartphones were designed to comfortably fit in a man’s hand rather than a woman’s. It might be no coincidence that Apple subsequently introduced a new smaller iPhone model in 2020.


When it comes to the crash test dummy, the results of the data gap have been catastrophic. Crash test dummies may have been intended to be non-gendered, but in reality the shape and size of the standard dummy are much more akin to a typical male body than a female one. This meant collision tests revealed far more about how a car’s design would protect a man in a crash than a woman. We now know that a woman involved in a car crash is 47% more likely to be seriously injured than a man.


The same issue has affected medical trials. It took 21 years for the US Food and Drug Administration to change the recommended dosage for women taking the insomnia drug Ambien. It transpired that women’s bodies were five times more likely than men’s to retain dangerous levels of the drug the next day. This oversight would have been picked up before the drug was approved if the trial data had been broken down by gender.


Essentially, when studies fail to account for gender differences, the needs of women tend to be overlooked in favour of a ‘default male’.


As Criado Perez explains, this tendency to see men as the default goes back centuries:


“The male experience, the male perspective, has come to be seen as universal, while the female experience - that of half the global population, after all - is seen as, well, niche.”

DATA GAPS IN TECH


Data gaps are rarely deliberate. They are often the product of institutions producing data that aligns with their own experience.


This may explain some of the issues in the tech sector, where women are significantly under-represented in the workforce.

In Invisible Women, we learn that:


“Women make up only a quarter of the tech industry’s employees and 11% of its executives. This is despite women earning more than half of all undergraduate degrees in the US, half of all undergraduate degrees in chemistry, and almost half in maths. More than 40% of women leave tech companies after ten years compared to 17% of men.”


There’s a good reason to single out the tech sector. Developments being pioneered in Silicon Valley have far-reaching consequences which will affect most of us at some point. And the very nature of how algorithms and artificial intelligence (AI) work means that biases get reinforced and amplified. Just consider Microsoft’s infamous AI chatbot Tay which, after being exposed to Twitter for a few hours, started posting like a sexist, racist troll.


Tay was swiftly taken offline, and many people saw the funny side. But there are many other examples where AI learns from biased data and exacerbates it in ways that are potentially more damaging and harder to spot. One study found that image recognition software which learned from a deliberately biased set of photographs ended up making even stronger sexist judgements – it was more than twice as likely to associate women with kitchens based purely on their gender.
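
To make that amplification mechanism concrete, here’s a minimal Python sketch. It has nothing to do with the study’s actual models or data – the 60/40 split and the ‘kitchen’ label are invented purely for illustration – but it shows how a classifier that simply optimises for accuracy on a skewed dataset can turn a modest imbalance into an absolute one:

```python
from collections import Counter, defaultdict

# Hypothetical training data: each record is (scene, gender of the person shown).
# The 60/40 split is made up for illustration: women appear in 60 of 100 kitchen photos.
train = [("kitchen", "woman")] * 60 + [("kitchen", "man")] * 40

# A deliberately naive classifier: predict the gender most often seen with each scene.
# A loss-minimising model tends to behave similarly when gender is the easiest signal.
counts = defaultdict(Counter)
for scene, gender in train:
    counts[scene][gender] += 1

def predict(scene: str) -> str:
    return counts[scene].most_common(1)[0][0]

# A 60/40 skew in the data becomes a 100/0 skew in the predictions:
# every kitchen photo is now assumed to show a woman.
predictions = [predict(scene) for scene, _ in train]
share = predictions.count("woman") / len(predictions)
print(f"Women in training data: 60% – women in predictions: {share:.0%}")
```

Real image classifiers are vastly more sophisticated, but the underlying incentive – lean on the easiest correlation in the training data – is the same.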


These data gaps create a vicious cycle where developments built on flawed research further hold back people who were under-represented in the data.


As Criado Perez puts it:


“Our current approach to product design is disadvantaging women. It’s affecting our ability to do our jobs effectively - and sometimes to even get jobs in the first place. It’s affecting our health, and it’s affecting our safety. And perhaps worst of all, the evidence suggests that when it comes to algorithm-driven products, it’s making our world even more unequal.”


So, what is the solution? The good news is that, when data gaps and inequalities are acknowledged and acted upon, positive change tends to happen pretty quickly. For instance, using more inclusive language in tech job recruitment ads has helped dramatically increase the number of women applying, while significant advances are being made in reducing AI bias through new algorithms designed to counter stereotyping.


Ultimately, building better businesses, better innovations and a better society relies on data and insights that represent all shapes, sizes and skillsets – not just us menfolk.
