Is AI destined to fail the LGBTQIA+ Community?

February 11, 2022

Samuel Bailey, a Social Media Analyst at the Harvey Nash Group, explores the challenges surrounding equal representation in AI.

Underrepresentation remains a huge challenge in technology. This, of course, is not news to us. However, what are the unintended consequences for the development of emerging tech like AI?

There are huge efforts under way right now to bring more women into tech, and organizations like the LMF Network and YFYA are encouraging more women and people from minority groups into STEM.

Although inclusion cannot be used as a blanket term for everyone, unless you’re male, cisgender, heterosexual and white, you are likely not well represented in technology.

Big tech is not showing the way forward

Only 2.5% of Google’s entire workforce is Black, rising only narrowly to 4% at Facebook and Microsoft. According to the MIT Sloan Management Review, only 22% of AI professionals are women.

It is not a wild assumption that the same underrepresentation exists, or is worse, for the LGBTQIA+ community.

On one hand, part of the issue lies in who is building these AI systems. On the other, AI at its crux is code, and therefore at its root it is binary. Although technology can be, and in recent years has been, adapted to include as many types of people as possible, huge limitations remain in how it understands its users.

In June 2020, a crisis hit the AI community when a new tool designed to build high-resolution images from pixelated photos turned a photograph of former President Barack Obama into a photo of a white man. This was not a one-off: when the tool was tested on images of various Black, Hispanic and Asian individuals, the same result occurred.

If AI is unable to note key differences in race, how will it fare when faced with individuals of varying degrees of gender and sexual queerness?

Not only does facial recognition software pose a threat to LGBTQIA+ individuals, but speech recognition also follows an extremely cisgender-normative pattern, with deeper voices assumed to be male and higher voices assumed to be female.
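
To make that pattern concrete, here is a minimal sketch (a hypothetical toy in Python, not code from any real product) of how a single, assumed pitch threshold forces every voice into one of two labels:

    # Hypothetical toy classifier: one pitch threshold maps every speaker onto
    # "male" or "female", leaving no room for anyone outside that binary.
    PITCH_THRESHOLD_HZ = 165  # assumed cut-off between typical adult pitch ranges

    def infer_gender_from_pitch(mean_pitch_hz: float) -> str:
        # Deeper voice -> "male", higher voice -> "female"; no other outcome exists.
        return "male" if mean_pitch_hz < PITCH_THRESHOLD_HZ else "female"

    for speaker, mean_pitch in [("A", 110.0), ("B", 210.0), ("C", 168.0)]:
        # Speaker C sits right at the boundary, yet still receives a binary label.
        print(speaker, infer_gender_from_pitch(mean_pitch))

Any speaker whose voice does not match the assumption behind that threshold is simply mislabeled, which is exactly the cisgender-normative pattern described above.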

Unconscious bias

This unconscious bias exists within technology (the choice between Male, Female and Other on registration forms, for example) because the people who built the systems put it there in the first place. More work needs to be done to rebuild societal assumptions surrounding gender and the treatment of sexuality in tech. This will allow tech to advance beyond platforms and systems tailored to the white, heterosexual male.
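
As a small illustration (a hypothetical sketch, not a prescription for any particular product), compare a hard-coded Male/Female/Other field with a profile that treats gender as optional, self-described text:

    from dataclasses import dataclass
    from typing import Optional

    # The pattern described above: a fixed list that forces a choice.
    BINARY_PLUS_OTHER = ("Male", "Female", "Other")

    @dataclass
    class InclusiveProfile:
        username: str
        # Optional and self-described: the system stores what users say about
        # themselves instead of forcing them into a fixed list.
        gender: Optional[str] = None
        pronouns: Optional[str] = None

    print(InclusiveProfile(username="sam", gender="non-binary", pronouns="they/them"))

The design choice here is simply to stop encoding the assumption in the schema at all, rather than adding ever more categories to a fixed list.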

Queerness lies outside of the binary and is therefore harder to represent in systems built on ones and zeros. However, with the voices and experience of LGBTQIA+ people in these spaces, there may be an opportunity to open tech up to a wider range of groups.

Representation allows groups not just to see themselves within services and understand that those services are also made for them, but to step in when their community is under-represented, rather than losing the ability to advocate for their own rights and freedoms.

In an industry where education is such a core value, creating systems in which everyone is represented, and building products that are tailored to everyone, makes for a welcoming work environment in which everyone has a voice.
