Facebook Researchers Use AI Trickery To Hide People From Facial Recognition
Facebook has combined an “adversarial autoencoder” with a “trained-face classifier”. An autoencoder is an artificial neural network that learns a compressed representation of a dataset without supervision. Adversarial autoencoders were introduced in 2016 and are able to “match the aggregated posterior of the hidden code vector of the autoencoder with an arbitrary prior.” A classifier maps input data to discrete categories; this particular classifier assigns identities to faces in images and videos.
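The core idea of pairing an autoencoder with a frozen face classifier can be sketched as a combined loss: reconstruct the input faithfully while pushing the classifier's identity confidence down. The sketch below is a toy illustration only; the function names, the logistic stand-in for the classifier, and the weighting term `lam` are all assumptions, not Facebook's actual formulation.

```python
import numpy as np

def reconstruction_loss(x, x_hat):
    # Mean squared error: keep the output visually close to the input.
    return float(np.mean((x - x_hat) ** 2))

def identity_confidence(x_hat, w, b):
    # Stand-in for a frozen, pre-trained face classifier: a logistic
    # score of how strongly the output still matches the identity.
    z = float(x_hat @ w + b)
    return 1.0 / (1.0 + np.exp(-z))

def deid_loss(x, x_hat, w, b, lam=1.0):
    # Adversarial objective: low reconstruction error *and* low
    # identity confidence. Training the autoencoder to minimize this
    # makes "life harder" for the recognition network.
    return reconstruction_loss(x, x_hat) + lam * identity_confidence(x_hat, w, b)
```

In a real system both networks are deep and the classifier's gradients flow back into the autoencoder; the scalar `lam` trades off image fidelity against de-identification strength.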
Given an image of a person, the technology generates a mask and uses it to produce both a faithful version and a slightly warped version of the subject's face. These perturbed images can then be embedded frame by frame into a video.
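Applying a masked, bounded perturbation per frame might look like the following. This is a minimal sketch under stated assumptions: frames are arrays of values in [0, 1], and `mask` and `delta` stand in for the network's outputs (both are hypothetical names, not part of the published method).

```python
import numpy as np

def perturb_frame(frame, mask, delta, eps=0.03):
    # Clip the perturbation to a small budget `eps` so the warp stays
    # subtle, and confine it to the face region via `mask`.
    noise = np.clip(delta, -eps, eps) * mask
    return np.clip(frame + noise, 0.0, 1.0)

def deidentify_video(frames, mask, delta):
    # Process a video frame by frame; a real pipeline would track the
    # face so the mask follows it across frames.
    return [perturb_frame(f, mask, delta) for f in frames]
```

The per-pixel budget is what keeps the result looking "normal" to humans while still degrading machine recognition.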
The purpose of the technology is to make it more difficult for AI to recognize a person’s image or voice. Dr. Lior Wolf, a Facebook AI Research engineer and professor at Tel Aviv University, remarked, “So the autoencoder is such that it tries to make life harder for the facial recognition network, and it is actually a general technique that can also be used if you want to generate a way to mask somebody’s, say, voice or online behavior or any other type of identifiable information that you want to remove.”
At the moment, Facebook has no plans to deploy this new technology. The research is nonetheless significant given the proliferation of deepfakes. Dr. Hao Li at the University of Southern California recently stated that we are less than a year away from dealing with “perfectly real” deepfakes, and a variety of deepfake videos and images of politicians, celebrities, and others already exist.
Facebook also recently made it easier to delete the facial recognition data the social media site stores about you. If you delete this data, Facebook will no longer tag you or suggest that friends tag you in photos. Facebook claims it does not share your data with third parties, but the ability to delete that data is somewhat more reassuring.