How Flawed Computer Data Creates Racist Algorithms

05.24.16

The idea of a computer being racist ought to sound absurd. A computer is a hunk of silicon that doesn’t form opinions, bigoted or otherwise. But that reasoning overlooks a crucial fact: Computers only understand the world through the data we give them. They are hunks of silicon, programmed by us, and we are inherently flawed. Which helps explain why, increasingly, that data is surfacing ugly truths about human nature that, if we’re not careful, might affect everything from how we identify photos to how we decide criminal sentences.

How Computers Become Racist

In 2009, Joz Wang bought a Nikon point-and-shoot camera for her parents. While trying it out, she noticed something odd: The camera’s face-detection software seemed to be malfunctioning, asking whether someone in the photo had blinked every time she took a picture. It wasn’t until her brother made a “bug-eyed” face and the warning vanished that she realized the camera literally didn’t recognize her facial structure.
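It’s worth pausing on how a camera ends up “learning” something like that. Here’s a minimal sketch in Python, with an invented “eye openness” feature and made-up numbers rather than anything from Nikon’s actual software, of how a detector trained on one population’s faces can misread another’s:

```python
# A deliberately tiny sketch of the failure mode, not Nikon's actual
# algorithm. The "eye openness" feature, the sample values, and the 60%
# cutoff are all invented for illustration.

def learn_blink_threshold(openness_scores):
    """Learn a 'blink' cutoff from training faces: anything below 60%
    of the average eye openness seen in training counts as a blink."""
    mean_openness = sum(openness_scores) / len(openness_scores)
    return 0.6 * mean_openness

# Training data drawn overwhelmingly from one group of faces...
training_scores = [0.80, 0.85, 0.78, 0.82, 0.79]
threshold = learn_blink_threshold(training_scores)

# ...then applied to everyone. An eye shape the model never saw scores
# below the learned cutoff and gets flagged, even though it's wide open.
for openness in [0.81, 0.45]:
    print(f"openness={openness:.2f} -> blink warning: {openness < threshold}")
```

Nothing in that sketch is malicious. The model simply defines “normal” as whatever dominated its training set, and everyone else becomes an error case.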

The tech industry loves algorithms and big data, but that data is generated entirely by human beings, and that means human failings and even outright hatred can surface quite easily. Until it was spotted and filtered out by Google, punching the world’s most inflammatory racist term into Google Maps would give you directions to the White House. HP also got in hot water when its face-tracking software was unable to see black people, a scenario the sitcom Better Off Ted had already played for laughs in an entire episode.

These incidents were warnings that ought not to have been ignored: learning algorithms are designed to observe us, and they can absorb the worst of our behavior just as easily as the best. Microsoft figured this out the hard way with Tay, a chatbot it debuted on Twitter that trolls taught to spew racist vitriol within a day. Learning algorithms are great, right up until they learn terrible behavior from us.
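The mechanics are simple enough to sketch. Tay’s real architecture wasn’t public, so this toy Python bot is only an illustration of the general problem: if a system learns from user input with no filter, its output is whatever its users decide to feed it.

```python
import random

# A toy parrot bot, not Microsoft's actual system: it learns by storing
# user messages verbatim and replaying them, with no filter in between.

class ParrotBot:
    def __init__(self):
        self.memory = ["hello!"]

    def learn(self, message):
        # Every input becomes a candidate future output.
        self.memory.append(message)

    def reply(self):
        return random.choice(self.memory)

bot = ParrotBot()
for message in ["people are great", "<coordinated abusive message>"]:
    bot.learn(message)

# With enough coordinated input, abuse dominates the pool the bot
# samples from, and the bot starts repeating it back.
print(bot.reply())
```

Scale that up to thousands of coordinated trolls and you get Tay’s meltdown: the bot wasn’t hateful, its teachers were.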

Racist Computers And Criminal Justice

Increasingly, the justice system is turning to algorithms that use past data to predict which defendants are a danger and which are likely to be rehabilitated. If that data were unbiased and complete, that would be one thing. Unfortunately, as anyone with even a passing awareness of race in America can tell you, the justice system has a long history of racism, and any algorithm trained on its records inherits that ugly past.
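To make the problem concrete, here’s a hypothetical sketch in Python, with invented numbers and no connection to any real risk-assessment tool, of how a score built on arrest records launders policing bias into a seemingly neutral prediction:

```python
# A hypothetical risk score, not any real court's model. The point:
# if one neighborhood is policed more heavily, its residents rack up
# more recorded arrests for the same behavior, and a score built on
# "number of priors" dutifully labels them higher-risk.

def risk_score(prior_arrests):
    """Toy model: more recorded arrests -> higher predicted risk."""
    return min(10, 2 * prior_arrests)

# Two people with identical conduct; only the policing around them differs.
histories = [("lightly policed area", 1), ("heavily policed area", 4)]
for area, priors in histories:
    print(f"{area}: {priors} priors -> risk score {risk_score(priors)}/10")
```

The score never looks at race directly. It doesn’t have to: the biased policing is already baked into the inputs, and the output dresses it up as math.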
