Why are women invisible?
- Posted on April 23, 2019
- Estimated reading time 4 minutes
This article was originally published as a blog post on LinkedIn.
Let’s start with a simple test. Whip out your phone or fire up your search engine and look up “England football team”. Go on, try it.
What do the search results tell us? They show that the men’s team is the default: the results present the men’s side as England’s national football team.
Do women not play football in England? They do. Is there not an England women’s national football team? Yes, there is. And they are hugely successful on their own merits, too.
The (sad) fact of the matter is that to find information about the England women’s national football team, you need to add the word ‘women’ to your search. This is only one of many examples of the problem we face in today’s digital world: gender-biased data and a data gap in which men are treated as the default and women as a subcategory.
Surely this is not a revelation – Bill and Melinda Gates’ 2019 Annual Letter reflected the two inspirational figures’ shock at “how little data we have on women and girls.” The letter raised questions about what we know of women – and, not surprisingly, most if not all of us do not have the answers, because the data simply does not exist. And when data about women does exist, it is often skewed toward a narrow set of topics such as reproductive health or a handful of professions, which only reinforces traditional gender norms and stereotypes.
Coincidentally, for International Women’s Day my husband gave me Caroline Criado Perez’s “Invisible Women: Exposing Data Bias in a World Designed for Men” – an eye-opening book that details how women end up bearing the cost of bias when gender discrimination (stemming from their under-representation in data) is baked into our systems – from societies and communities to the workplace, the home, the economy and healthcare, and the list goes on. For example, researchers have found sex differences in every tissue and organ system in the human body; yet most disease research and clinical trials continue to be conducted on men or on male animals, resulting in drugs that are better suited to male bodies than to female ones. Some researchers even argue that including women is inadvisable because comparable historical data is lacking – which, as the author puts it, “adds insult to injury”.
Data is becoming a vital factor in our decisions – personal, business and policy-making – and datasets tend to exhibit gender stereotypes. When gender-biased data feeds artificial intelligence, self-learning algorithms trained on it magnify the negative impact and exacerbate inequalities rather than reduce them. This algorithmic bias can then affect downstream applications ranging from college admissions and job applications to law enforcement, healthcare response, insurance premiums and loan approval. In recruitment, for example, AI-based tools can screen a large pool of resumes and match candidates to specific job profiles based on their skillsets. Such a tool, however, defeats its own claim to neutrality if it was trained on resumes that came mostly from men, embedding an underlying gender bias in the system.
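To make the recruitment example concrete, here is a minimal sketch of how a screening model trained on biased historical hiring records simply reproduces that bias. Everything here is invented for illustration – the dataset, the skill scores and the toy “model” are hypothetical, not any real recruitment system.

```python
import random

random.seed(0)

# Hypothetical toy history: each record is (skill_score, is_woman, hired).
# The bias is baked in: women needed a higher skill score to be hired.
def make_history(n=1000):
    data = []
    for _ in range(n):
        skill = random.random()
        is_woman = random.random() < 0.5
        hired = skill > (0.7 if is_woman else 0.5)  # historical double standard
        data.append((skill, is_woman, hired))
    return data

# A naive "screening model": estimate the hire rate among past records
# that share the candidate's gender and a similar skill score.
def predict_hire_rate(history, skill, is_woman, tol=0.1):
    matches = [h for s, w, h in history
               if w == is_woman and abs(s - skill) <= tol]
    return sum(matches) / len(matches) if matches else 0.0

history = make_history()

# Two equally skilled candidates get very different scores, because the
# model has learned the historical bias rather than judging skill alone.
man_score = predict_hire_rate(history, 0.6, is_woman=False)
woman_score = predict_hire_rate(history, 0.6, is_woman=True)
print(f"equally skilled candidates: man={man_score:.2f}, woman={woman_score:.2f}")
```

The point of the sketch is that no line of the model mentions discrimination; the disparity comes entirely from the training data, which is why fixing the data matters as much as fixing the algorithm.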
Being exposed to the sexism in data can be a rude awakening, but it is no less an inspiration to take corrective measures from within – starting with our leaders and people, with the aspiration to effect change in technology and in the communities we live in. As technologists building AI-driven solutions, we need to ensure gender neutrality across the AI development lifecycle – from data collection, classification, algorithm development and training to rigorous testing and auditing of the finished solutions. A multi-disciplinary approach, in which social scientists work with technologists to build concepts of fairness and neutrality into algorithms, needs to be encouraged. A gender-diverse AI development team is another way to ensure that unconscious bias does not turn into algorithmic bias.
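As one illustration of what the testing-and-auditing step might involve, the sketch below computes per-group selection rates and a simple demographic-parity gap – one of several fairness metrics an audit could track. The group names, toy outputs and threshold are all hypothetical.

```python
# Hypothetical audit check: compare selection rates across groups
# ("demographic parity" -- one fairness metric among several).
def selection_rate(decisions):
    return sum(decisions) / len(decisions)

def parity_gap(decisions_by_group):
    rates = {g: selection_rate(d) for g, d in decisions_by_group.items()}
    return max(rates.values()) - min(rates.values()), rates

# Toy model outputs on a test set: 1 = shortlisted, 0 = rejected.
outputs = {
    "men":   [1, 1, 0, 1, 1, 0, 1, 1],
    "women": [1, 0, 0, 1, 0, 0, 0, 1],
}

gap, rates = parity_gap(outputs)
print(rates)                      # per-group shortlist rates
print(f"parity gap: {gap:.2f}")   # flag for review if above a chosen threshold
```

In practice an audit would run checks like this on every release of the model, alongside other metrics (equalized odds, calibration by group), since no single number captures fairness on its own.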
For this year’s International Women’s Day, I invited our teams to reflect on the problems we face with gender-biased data and the data gap concerning women. More importantly, I encouraged everyone to take a closer look at how we as professionals can influence the use of technology to bridge this data gap today, for a future that is less biased and more equitable.
Inclusion and diversity are fundamental to progress. We have so much more to accomplish, and it should start with making sure data is truly representative and AI is truly ethical.