Contractors targeted homeless with ‘dark skin’ to train Google’s facial recognition

Contractors working for Google reportedly targeted homeless people with “dark skin” to help train a facial recognition system. As reported by our sister publication AI News, facial recognition algorithms have well-documented problems identifying people of colour, in large part because most training data sets lack diversity. Any responsible tech company will want its facial recognition technology to be equally accurate across society before it is deployed further in areas such as law enforcement, where even some police have voiced concerns that the technology may increase bias. However, in a bid to prevent bias in its own facial recognition algorithms, Google appears to have walked into another controversy, one that raises further questions about how to ethically gather training data.
