Why are women invisible?

  • Posted on April 23, 2019
  • Estimated reading time 4 minutes
gender bias in the workplace

This article was originally published as a blog post on LinkedIn.

Let’s start with a simple test. Whip out your phone or fire up your search engine and look up “England football team”. Go on, try it.

What do the search results tell us? They show that the men’s team is the default: the results present the men’s side as England’s national football team.

Do women not play football in England? They do. Is there not an England women’s national football team? Yes, there is. And they are hugely successful by their own merits too.

The (sad) fact of the matter is that to find information about the England women’s national football team, you need to include the word ‘women’ in your search. This is only one of many examples of the problem we face in today’s digital world: gender-biased data and a data gap in which men are treated as the default and women as a subcategory.

Surely this is not a revelation – Bill and Melinda Gates’ 2019 Annual Letter reflected the two inspirational figures’ shock at “how little data we have on women and girls.” The letter raised questions about what we know of women – and, not surprisingly, most if not all of us probably do not have the answers, because the data simply does not exist. When data about women is present, it is skewed to reflect women’s limited association with topics such as reproductive health or specific professions, which only reinforces traditional gender norms and stereotypes.

Coincidentally for International Women’s Day, my husband gave me Caroline Criado Perez’s “Invisible Women: Exposing Data Bias in a World Designed for Men” – an eye-opening book that elaborates on how women end up bearing the cost of bias when gender discrimination (born of the lack of representation in data) is infused into our systems – from societies and communities to the workplace, home, economy and healthcare, and the list goes on. For example, researchers have found sex differences in every tissue and organ system in the human body; yet most disease research and clinical trials continue to be conducted on men or male animals, resulting in drugs that are better suited to male bodies than to female ones. Some researchers even argue that, given the lack of comparable historical data, including women is inadvisable. That, as the author suggests, “adds insult to injury”.

Data is becoming a vital factor in our decisions – personal, business and policy-making – and datasets tend to exhibit gender stereotypes. With data’s expanding use in artificial intelligence, gender-biased data and self-learning algorithms trained on it magnify the negative impact and exacerbate inequalities rather than reduce them. This algorithmic bias can then affect downstream applications ranging from college admissions and job applications to law enforcement, healthcare response, insurance premiums and loan approvals. In recruitment, for example, AI-based tools can screen a large pool of resumes and match candidates to specific job profiles based on their skillsets. Such tools, however, defeat the purpose of being neutral if the underlying system carries a gender bias because it was trained on resumes that came mostly from men.
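To make the recruitment example concrete, here is a toy sketch of how this happens. The data and scoring rule are entirely hypothetical (not any real screening product): a naive model learns word weights from past hiring decisions, and because the historical “hired” examples came mostly from men, a gender-correlated term like “women’s” ends up penalized.

```python
# Toy illustration (hypothetical data): a screening model trained on
# historically male-dominated hiring outcomes learns to penalize terms
# that correlate with women's resumes.
from collections import Counter

# Hypothetical training set: (resume text, was the candidate hired?).
training = [
    ("led software team captain chess club", True),
    ("built backend services executed projects", True),
    ("software engineer captain rugby team", True),
    ("software engineer women's chess club captain", False),
    ("led women's coding society built services", False),
]

hired_words = Counter()
rejected_words = Counter()
for resume, hired in training:
    (hired_words if hired else rejected_words).update(resume.split())

def word_score(word):
    # Naive weight: how much more often the word appears among hired
    # resumes than among rejected ones.
    return hired_words[word] - rejected_words[word]

def screen(resume):
    # Sum the per-word weights; higher means "recommend".
    return sum(word_score(w) for w in resume.split())

# Two candidates with identical skills; one mentions a women's organization.
print(screen("software engineer chess club captain"))          # 2
print(screen("software engineer women's chess club captain"))  # 0
```

The model never sees a gender field, yet it discriminates anyway: bias enters through proxies in the historical data, which is exactly why “neutral by construction” claims need auditing.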

Being exposed to the sexism in data can be a rude awakening, but it is no less inspiring as a prompt to take corrective measures from within – starting with our leaders and people, with the aspiration to effect change in technology and the communities we live in. As technologists building AI-driven solutions, we need to ensure gender neutrality across the AI development lifecycle – from data collection, classification, algorithm development and training to rigorous testing and auditing of these AI solutions. A multi-disciplinary approach, in which social scientists work with technologists to incorporate concepts of fairness and neutrality into algorithms, needs to be encouraged. A gender-diverse AI development team is another way to ensure that unconscious bias does not become algorithmic bias.
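One concrete form the “testing and auditing” step above can take is a fairness check on model outputs. The sketch below (hypothetical audit data and an assumed 0.2 review threshold, not a standard from any specific framework) computes the demographic parity gap: the difference in the rate at which a model recommends candidates across groups.

```python
# Minimal audit sketch (hypothetical data): compare the model's
# recommendation rates across groups (demographic parity).

# Hypothetical audit log: (group, model_recommended).
predictions = [
    ("F", True), ("F", False), ("F", False), ("F", False),
    ("M", True), ("M", True), ("M", True), ("M", False),
]

def selection_rate(group):
    # Fraction of candidates in this group that the model recommended.
    outcomes = [rec for g, rec in predictions if g == group]
    return sum(outcomes) / len(outcomes)

# Demographic parity gap: difference between group selection rates.
gap = abs(selection_rate("F") - selection_rate("M"))
print(f"selection rate F: {selection_rate('F'):.2f}")  # 0.25
print(f"selection rate M: {selection_rate('M'):.2f}")  # 0.75
print(f"parity gap: {gap:.2f}")                        # 0.50

# Assumed audit rule for this sketch: flag the model for human review
# if the gap exceeds 0.2.
needs_review = gap > 0.2
```

Demographic parity is only one of several fairness criteria (others weigh error rates per group), but even this simple check makes a hidden bias visible and measurable.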

For this year’s International Women’s Day, I invited our teams to reflect on the problems we face with gender-biased data and the data gap about women. More importantly, I encouraged everyone to take a closer look at how we as professionals can influence the use of technology to bridge this data gap today, for a future that is less biased and more equitable.

Inclusion and diversity are fundamental to progress. We have so much more to accomplish and it should start with making sure data is truly objective and AI is truly ethical.

Anna Di Silverio

Thank you, everyone, for your encouraging feedback. I am overwhelmed by your comments and inspired to see how all of us are driven by a shared purpose to create human impact.

June 12, 2019

Dominique Haft

This is so impactful, and I wonder how it translates to our business too - I'd be curious to see across Avanade what's being done to account for bias and ensure we are positioned as the "Leading Digital Innovator" for everyone (men, women, children, minorities, etc)...

May 15, 2019

sue holly-rodway

Great blog, passion underpinned by fact. That's what us girls do so well - shame the data doesn't always reflect that ;-)

May 15, 2019

Gui Schneider

Very interesting read, Anna, thank you for your insights. It's time to write algorithms that make us reflect on and reshape data instead of perpetuating gender-biased analytics.

May 1, 2019

Emily Warren

Thank you Anna, an enlightening article and a great reminder of why some of the seemingly small things can make a big difference when viewed in a broader context.

May 1, 2019

Tony Hinkley @TonyHinkley

It is worth remembering that this is only one form of bias which impacts us.  Dr. Safiya Noble talks about this more in her book "Algorithms of Oppression".  Whilst the title of the book is quite divisive, the concepts discussed are interesting and worthy of consideration.

May 1, 2019

Erica Fletcher

Anna, thank you for sharing your perspective on this!  It is true that in many places we default to the male perspective - a natural function of many upon many centuries of practice.  We have a great opportunity to change this through AI and other technologies, as well as in our own day to day practices.

April 30, 2019
