Google Photos Identifies Black People as Gorillas. This Isn’t the First Time Facial Recognition Software Has Struggled With Black Features.
The words “facial recognition” have made me nervous more times than I can count: from the time I tried out a friend’s Xbox Kinect, to the various times I’ve downloaded an app that requires use of my iPhone’s camera or my MacBook’s built-in webcam. Even though I’ve never had a problem with facial recognition technology recognizing my skin tone, I’ve struggled, many times, to fit my wide nose and high forehead into facial presets. I’ve laughed when the latest make-up app puts lipstick on my nose and eyeshadow on my forehead. I’ve tilted my chin and cocked my head toward the tiny camera a million different ways, in hopes of making the face that shows up on my phone or laptop screen actually look like me. It usually takes several attempts before I get it right, if I get it right at all. I’ve uninstalled more applications than I can count.
Earlier this week, Brooklyn-based programmer and tech blogger Jacky Alciné noticed that Google Photos, a new program that launched in May, had mislabeled a picture of him and a friend as “gorillas,” and he expressed his disappointment on Twitter. The animal has long been used as a way to slur black people and other people of color, alluding to supposed physical similarities as evidence of being subhuman.
Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4
— diri noir avec banan (@jackyalcine) June 29, 2015
To Google’s credit, a senior engineer responded to Alciné quickly, within an hour and a half, and worked with him to fix the error. Alciné also received an apology.
@jackyalcine Thank you for telling us so quickly! Sheesh. High on my list of bugs you *never* want to see happen. ::shudder::
— Yonatan Zunger (@yonatanzunger) June 29, 2015
But while Google fixed the error, it can’t fix the factors that contributed to that error. We’ll most likely see this problem again, in some form or another. The real issue is steeped in the history of camera technology, which was never created for or tested on people of color. The lack of diversity within the tech industry exacerbates that problem.
“I understand how this happens, the problem is more so on the why. This is how you determine someone’s target market,” Alciné said of the issue, via Twitter.
This isn’t the first time a black person with a darker skin tone has struggled with facial recognition technology. Back in 2010, the Xbox Kinect came under fire after users with darker skin tones reported that they weren’t being read by the product’s recognition technology. While Microsoft maintained that lighting, not the technology, was the issue, lighting is a major reason many camera and video products struggle to identify the features of people with darker skin, and this problem persists. Users of HP webcams reported a similar issue back in 2009.
As Google senior engineer Yonatan Zunger points out, machine learning is hard, but it’s likely harder without input from a diverse group of individuals.