One Green Bean highlights alarming gender bias in popular AI tool

By AdNews | 8 March 2023
 
Kat Thomas, Amber Abbot, Sophie Nicholson

One Green Bean has revealed how emerging generative AI image tools consistently under-represent women across senior professional roles.

The agency undertook a two-part experiment using Midjourney, an artificial intelligence platform that has surged in popularity in recent months. The program generates images from descriptive text prompts, drawing on training data scraped from an estimated five billion images on the web.

One Green Bean first asked the platform to generate images based on the job titles of three members of the global leadership team – managing director, executive creative director, and head of public relations EMEA. The results revealed a clear male gender bias. 

To experiment further, the team then ran the top 20 highest-paid jobs in the UK, as ranked by The Times newspaper, through the Midjourney platform. 88% of the images reinforced male gender stereotypes. From chief executive to locum consultant, tax partner to aircraft pilot, artificial intelligence revealed an overwhelming bias towards men.

Around one third of Australia's top jobs are filled by women, according to the Australian Government's Workplace Gender Equality Agency, demonstrating that AI appears to reflect a distorted reality.

Kat Thomas, founder and global executive creative director of One Green Bean, said: “There’s been huge hype around AI tools like ChatGPT and Midjourney. We’ve been deep in experimentation to understand their potential, but an eye-opening limitation became clear very quickly.

"A distinct gender bias is very evident, with favourability consistently skewing male. That’s not the only bias either. When you do include ‘woman’ in your key words, imagery tends to be sexualized – big boobs, unbuttoned shirts, pouting lips. Another huge bias is around diversity, the images these platforms generate overwhelmingly skew white, as well as male.

"Our industry is obsessed with artificial intelligence and whilst embryotic right now, its capacity is revolutionary. However, it’s not without its limitations and its bias against women is a significant hurdle these platforms need to overcome.

"They effectively hold a mirror up to society, demonstrating that ingrained cultural biases dictate the norms that machine intelligence currently relies on.”

Further research using Midjourney revealed gender bias across a huge variety of disciplines. When ‘International Tennis Star’ was typed into the platform, four shouting men appeared, with no sign of Serena Williams or Ash Barty. Other roles the team looked at included air traffic controller, paramedic, finance manager, police officer, train driver… all of which returned images of men.

