
The flip side of FRT: profile creation

Facial recognition technology (FRT) has an ongoing problem with racial, gender and age bias.

But what about tools used to create profile images? Do they suffer from the same “White male” bias?

Jeremy Andrew Davis, who was diagnosed as autistic in 2022, tested MidJourney like this. His written prompt: “an autistic person,” along with the style keywords “lifelike, photoreal, photojournalism.”

He repeated the prompt to generate 148 images. (It’s a simple matter to repeat a prompt.) How many do you think presented as White male? What about their ages?

  • Two of the 148 presented as female.
  • Five of the 148 appeared to be older than 30.
  • No one smiled.
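A quick back-of-the-envelope check puts those counts in perspective (a minimal Python sketch based on Davis’s reported numbers, not part of his methodology):

```python
# Davis's reported counts out of 148 generated images
total = 148
female = 2
over_30 = 5

def pct(n, total=total):
    """Share of the total, as a percentage rounded to one decimal."""
    return round(100 * n / total, 1)

print(f"Presented as female: {pct(female)}%")      # about 1.4%
print(f"Appeared older than 30: {pct(over_30)}%")  # about 3.4%
```

In other words, roughly 98.6% of the images presented as male, and about 96.6% appeared to be 30 or younger.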

I created 12 images. All but one were clearly younger than 30 years old and all but one (White female) presented as White male.

No one smiled.

Sample output from MidJourney: four portraits generated for the prompt “an autistic person,” reflecting bias in the model’s training data.
K.E. Gill. Private communication, MidJourney, a text-to-image model. Sep. 28, 2023. https://www.midjourney.com

~~~

Header image, EFF (Flickr, CC)

Talk to me: Facebook | Mastodon | Twitter

