AI couldn't picture a woman like me - until now


Jessica Smith wearing a black swimming costume

When Jess Smith uploaded a photo of herself into an AI image generator this summer, she wasn't expecting a social experiment.

The former Australian Paralympic swimmer wanted to vamp up her headshot, so she uploaded a full-length photo of herself and stated specifically in her prompt that she was missing her left arm from below the elbow.

But ChatGPT couldn't create the image she was asking for. Despite various prompts, the results were largely the same - a woman with two arms, or one with a metal device to represent a prosthetic.

She asked the AI why it was so hard to create the image and it said it was because it didn't have enough data to work with.

"That was an important realisation for me that of course AI is a reflection of the world we live in today and the level of inequality and discrimination that exists," she says.

Smith recently tried to generate the image again on ChatGPT and was amazed to find it could now produce an accurate picture of a woman with one arm, just like her.

"Oh my goodness, it worked, it's amazing it's finally been updated," she tells the BBC. "This is a great step forward."

AI-generated image of a woman with shoulder-length blonde hair, a green t-shirt and her left arm missing from below the elbow

Jess Smith found that ChatGPT was able to generate this image of her recently

It might not sound like a big deal, but for millions of people with disabilities, this shift matters.

"Representation in technology means being seen not as an afterthought, but as part of the world that's being built," Jess says.

"AI is evolving, and when it evolves with inclusion at its core, we all benefit. This is more than progress in tech it's progress in humanity."

A spokesperson for OpenAI, the company behind ChatGPT, said it had recently "made meaningful improvements" to its image generation model.

They added: "We know challenges remain, particularly around fair representation, and we're actively working to improve this - including refining our post-training methods and adding more diverse examples to help reduce bias over time."

Two pictures of Naomi Bowman, a woman with glasses and short brown hair: on the left, the real photo of her; on the right, the image AI created that evened out her eye

AI edited Naomi's eye even though she did not ask for it to be changed

While Smith's disability is now reflected by AI, Naomi Bowman, who has sight in only one eye, is still experiencing a similar problem.

She asked ChatGPT to blur the background of a picture but instead it "changed my face completely and evened out my eyes".

"Even when I specifically explained that I had an eye condition and to leave my face alone; it couldn't compute," she says.

Naomi initially found it funny but says "it now makes me sad as it shows the inherent bias within AI".

She is calling for AI models to be "trained and tested in rigorous ways to reduce AI bias and to ensure the data sets are broad enough so that everyone is represented and treated fairly".

Some people concerned about AI's environmental impact have criticised the creation of images on ChatGPT.

Professor Gina Neff of Queen Mary University of London told the BBC that ChatGPT is "burning through energy", with the data centres used to power it consuming more electricity in a year than 117 countries.

Awkward conversations

Experts say bias in artificial intelligence often reflects the same blind spots that exist in wider society - and it's not just disabilities that are underrepresented.

Abran Maldonado, chief executive of Create Labs, a US-based company that builds culturally aware AI systems, says diversity in AI starts with who is involved in training and labelling the data.

"It's about who's in the room when the data is being built," he explains. "You need cultural representation at the creation stage."

Not everything on the internet is represented correctly, and Maldonado adds that if the people with lived experience are not consulted, AI will miss them.

One well-known example was a 2019 US government study which found that facial recognition algorithms were far less accurate at identifying African-American and Asian faces than Caucasian faces.

Despite living with one arm, Jess doesn't see herself as disabled, saying the barriers she faces are societal.

"If I use a public toilet and the tap has to be held down, that impacts my ability, not because I can't do it, but because the designer hasn't thought about me."

She believes there is a risk of the same oversight happening in the world of AI, systems and spaces built without considering everyone.

When Jess shared her original experience on LinkedIn, someone messaged her to say his AI app would create an image of a woman with one arm.

"I tried to create it and the same thing happened, I couldn't generate the image," she says.

She told the person, but they never replied - and she says that's typical of conversations around disability.

"The conversation is too awkward and uncomfortable so people back away."
