Some of the tags it gives to people are benign, like "psycholinguist" or "scientist," while other male subjects were labeled "rape suspect" and other less-than-flattering descriptions. Another troubling, stereotypical identification shows up for men who are bald or balding. Those individuals are more often than not labeled as "skinheads," although it depends on the images in that category and how closely the image you upload matches the various pictures in the database.
Obviously, if you're a balanced, open-minded person who believes you simply shouldn't judge a book by its cover, these frivolous labels probably won't concern you much. However, the bigger question of why the AI seems slanted toward levying these outrageous labels on people's faces is what might keep folks in the AI community up at night.
ImageNet Roulette was meant as a test case to show how politics propagate through technical systems, often without the creators of those systems being aware of it. The labels on the images come from Mechanical Turk workers (humans) who classified masses of images for very little money. Those who have delved into the ImageNet tag categories say that looking at the photos that go along with the tags hints at how the algorithm thinks.
For instance, many people labeled as "psycholinguists" tend to be white and have uploaded an image that looks like a faculty headshot, notes LifeHacker. Anyone willing to risk being offended can try the tool for themselves here. One of our staff members in the office was actually labeled a "skinhead" and, obviously, was highly offended.
"It reveals the deep problems with classifying humans - be it race, gender, emotions or characteristics. It's politics all the way down, and there's no simple way to 'debias' it." — Kate Crawford (@katecrawford) September 16, 2019
Facebook keeps facial recognition data on its users, and we recently talked about how to remove that data. However, at least Facebook only puts a name with a face and doesn't try to further categorize and stereotype individuals based on their looks.