
Artificial Intelligence, It Turns Out, Has Racial Bias, Too


We are teachers and users of Photoshop and Photoshop Elements, so we appreciate the advancements in artificial intelligence that have improved our digital editing and organizing. We’ve written about Photoshop Elements’ artificial intelligence here and about facial recognition here. It’s time for us to reevaluate implicit bias in the tools we use to create . . . again. (We wrote about the dubious Shirley cards here.)

Humans Create Artificial Intelligence

It seems so obvious once someone says it, but, yes, humans create artificial intelligence, and they bring their biases with them as they do. What a shock! As in all things we do, the best way to minimize bias is to have all kinds of people working on it. Well, here’s another shock (NOT): it looks like companies aren’t that interested in giving women and people of color a seat at the table.

The New York Times has published a relevant article on how poorly companies minimize bias in their artificial intelligence and how knowingly they ignore the problem. Are we really surprised to learn that people have been fired from large companies for bringing this up? Yes, Google, we’re talking about you. We want to mention Timnit Gebru here for her work exposing racial bias in facial recognition. The upshot? It’s been a while since Photoshop Organizer mistook a gargoyle for a person, but now we will be more conscious of Organizer’s suggestions as we sort our photos. And it’s time we report any biases we notice as we use Adobe’s program.

The irony is that humans have managed to develop artificial intelligence that is all too capable of discerning race in facial recognition . . . to inhumane ends. Take the case of the Chinese government. Alfred Ng has reported how the Chinese government uses facial recognition to identify Uyghur Muslims and then commits atrocities against them. Read that CNET article here.

We want to create art that opens a space to discuss racial bias, not art that perpetuates it. But we realize that we bring our own baggage to the table, so we want to acknowledge our own biases as we work on our projects.

