4 Times Technology Turned Racist

Computers do exactly as they are instructed. However, there have been times when a system’s output seemed rather racist.

[Related: Intel and IBM Set the Diversity Example for Technology, and For Us All]

Clearly, a machine can’t be racist. Machines are, however, built, designed, and coded by humans, so the flaws of mankind sometimes find their way into our information systems.

Here are four times when tech seemingly turned ‘racist’:

HP Facial-Recognition Webcams Fail to Recognize Black People:
A YouTube video went viral when two colleagues, a white woman and a black man, demonstrated how an HP webcam with facial-recognition capability tracked her lighter skin but failed to detect his darker skin tone.

Although the two made light of the subject in their video, many viewed the issue as the result of the lack of diversity in Silicon Valley companies.

In a statement to CNN at the time, HP acknowledged that “the cameras may have issues with contrast recognition in certain lighting situations.”

Microsoft AI Bot Goes on Racist Tirade:
Artificial intelligence (AI) is being explored in many new ways. Microsoft recently launched Tay, an AI chatbot designed to engage in conversation across social media with 18- to 24-year-olds.

Within hours, the chatbot began unleashing a stream of profane tweets.

Microsoft took Tay offline and issued a statement: “Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay…”

Google Photos Tags Black People as “Gorillas”:
Last year, Google rolled out its Photos app, which automatically tags images based on what it detects in them.

Soon after, a black programmer uploaded an image of himself and a friend, and Photos tagged them both as gorillas. He shared his discovery on Twitter.

Google engineers responded and addressed the issue.

Lack of Women of Color in Tech Stock Images:
While creating a website, Christina Morillo and Stephanie Morillo (no relation) noticed that there were few stock images of women of color in technology available online.

The two set about changing the image landscape. They put out a call for models on Twitter and through mailing lists to launch #WOCinTech Chat. The site features images of women of color in all sorts of technology roles.

Airbnb Discrimination Based on “Black-Sounding” Names:
In an experiment involving over 6,000 Airbnb listings, researchers sent rental inquiries using two sets of guest names for each listing: one traditionally African American, the other traditionally white. Requests from guests with black-sounding names were 16% less likely to be accepted by hosts.
