Solutions to AI image bias raise their own ethical questions

Matt Levin Oct 10, 2023
Text-to-image AI companies know that the pictures generated by their software are informed by images that are already available on the internet. But the best way to correct for that bias can get complicated. (Getty Images)

In a small conference room at Adobe’s San Francisco office, Alexandru Costin is very confidently letting me take command of his laptop.

“Text-to-image is very simple,” says Costin, Adobe’s vice president of generative artificial intelligence. “Go ahead and try us, try us.”

Costin is giving me a sneak peek at Adobe’s updated AI image generation tool, which is now part of Photoshop. The family of AI models that powers the tool is called Firefly.

Firefly’s image generation interface is familiar to anyone who has played around with DALL-E, Stable Diffusion, Midjourney or the handful of other AI image tools that have exploded in popularity over the past year.

I type in “Darth Vader applying suntan lotion to the Easter Bunny” in the prompt bar, and Firefly serves up four separate images. Each contains cartoony-looking versions of the Easter Bunny.

But instead of Darth Vader, three women, including two Black women, appear. The fourth panel contains some odd Father Time-looking character in the Vader spot.

“This might look like a failure. It’s an actual feature,” explains Costin.

Adobe doesn’t want to infringe on Disney’s intellectual property, so instead of Darth Vader, you get a random human.

Apparently AI wasn’t confusing Darth with “dark” or choosing women to lotion up the Easter Bunny because of some baked-in sexism. It was actually Adobe’s “de-biasing algorithm” at work.

“We generate diverse, gender diverse, skin-tone diverse content to basically represent fairly the population,” says Costin.

AI text-to-image generators have a well-documented bias problem. AI models are trained on images from the internet, so bias in, bias out. A recent experiment from Bloomberg on the image generator Stable Diffusion found that AI portraits of architects, doctors and CEOs skewed white and male, while images of cashiers and housekeepers skewed towards women of color. 

Adobe’s solution to the bias issue was to use data estimating the skin tone distribution of a Firefly user’s country, and to sample from that distribution for any human Firefly creates. In other words, if someone in the U.S. used Firefly to make an image of a doctor or a gardener, the chances that person would be a woman or have non-white skin would be roughly proportional to the percentage of women and people of color in the U.S.
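
Conceptually, the approach amounts to sampling a generated person’s attributes from a population-level distribution instead of letting the training data decide. Here is a minimal sketch of that idea in Python; the distribution tables, category labels and percentages are illustrative assumptions, since Adobe hasn’t published Firefly’s actual data or implementation.

```python
import random

# Hypothetical demographic tables for a U.S. user. These buckets and
# percentages are illustrative stand-ins; Firefly's real categories and
# country-level data are not public.
US_SKIN_TONE_DIST = {"light": 0.60, "medium": 0.25, "dark": 0.15}
US_GENDER_DIST = {"woman": 0.51, "man": 0.49}

def sample_person_attributes(skin_dist, gender_dist):
    """Draw a skin tone and gender for a generated human, independently
    of the prompt, so outputs roughly mirror the target population."""
    skin = random.choices(list(skin_dist), weights=list(skin_dist.values()))[0]
    gender = random.choices(list(gender_dist), weights=list(gender_dist.values()))[0]
    return skin, gender

# Any prompt that yields a person ("doctor", "gardener", "Supreme Court
# justice") would get attributes drawn the same way:
print(sample_person_attributes(US_SKIN_TONE_DIST, US_GENDER_DIST))
```

The key design choice in this sketch is that the draw ignores the prompt entirely; the demographics come from the country table alone.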

That solution sounds intuitive. But it also carries some unintended consequences.

Ask Firefly to generate an image of a 1960s Supreme Court justice, and there’s a good chance it’ll return a woman. But a woman wasn’t appointed to the Supreme Court until the 1980s.

“Is it a good or bad thing? I think there’s pros and cons,” said Costin.

The con could be making history seem less sexist or racist than it actually was. But the process of de-biasing AI image generators doesn’t just pose new, complicated questions about historical anachronisms.

It poses new, complicated questions about distorting our current, still very unequal reality.

In Firefly world, about 14% of doctors should be Black — the same percentage as the Black population in the U.S. But in the messy, unequal real world, only 6% of doctors are Black.

So, should AI images depict the world as it is? Or as it should be?

“That becomes almost like a philosophical question,” said Rumman Chowdhury, a Responsible AI Fellow at Harvard’s Berkman Klein Center for Internet & Society.

Chowdhury generally believes that AI should reflect the world we want to see. Considering the role tech companies envision AI playing in our day-to-day lives, leaving AI defaults as biased as the rest of the internet could reinforce biases in the real world.

Chowdhury used the example of someone using AI to generate images for a poster or online ad for a bartender position. If the “bartender” images it generates depict only white men, that poses a problem.

“If you’re looking at this job ad that says, ‘I want an ambitious bartender,’ but now it’s got this photo of like, presumably the guy you want and it doesn’t look like me? I guess I won’t bother (applying),” Chowdhury said.

The big problem with AI image generators is a lack of diversity in the photos used to train AI systems, Chowdhury said.

Which is why Jacques Bastien said he’s been getting emails from AI companies lately.

“There were companies that reached out to us specifically because they wanted access to our photos, essentially paying us to use our photos to train the AI,” Bastien said.

Bastien is the co-founder of Nappy.co, a stock photo website and photography studio devoted to capturing Black and brown people in everyday settings.

The inspiration for Nappy came from his frustration trying to find diverse photos on traditional stock websites. Bastien also does website design and had trouble finding images that looked like him to populate his design mockups.

“Black people drink coffee, right? And so we would go to those sites and type in coffee drinkers, we wouldn’t find anything,” Bastien said.

Bastien supports AI image companies trying to find more diverse training photos. Just not his photos.

“What we spent, you know, years acquiring and shooting and building, we’re not gonna just contribute that, just to be replaced tomorrow,” Bastien said.

Rather than selling the photos wholesale to one AI company, Bastien said he’s considering creating an API where anyone who wants to utilize his images in bulk would be charged.

But he can see a future where companies use AI to create fake Black and brown people for their marketing materials. Which raises another ethical question:

“When you go to Nappy library, it’s probably shot by a Black photographer that includes Black subjects. So however that went down, people got paid. Whereas the AI, probably not,” he said.

But Bastien says ethics are kind of beside the point. As a business owner, he wouldn’t blame another company for using the cheaper option. 
