All Eyes On Me: Researchers Develop Clever Way To Detect Otherwise Convincing Deepfakes

Image: GAN-generated faces used by the researchers
Deepfakes are growing in both popularity and quality, as the recent Tom Cruise deepfakes posted to TikTok show, and they raise concerns about what is and isn't real across various media formats. In response, researchers at the University at Buffalo have developed an AI system that can tell whether a face image is real or generated by exploiting a subtle telltale in the eyes.

One of the most popular ways of generating human faces is a generative adversarial network (GAN) model. As the researchers' paper explains, these models can "synthesize highly realistic human faces that are difficult to discern from real ones visually." In fact, the images you see above are among those the researchers used, and they came from thispersondoesnotexist.com, a website that generates faces using the StyleGAN2 model.
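For readers who want to inspect such faces themselves, a minimal Python sketch along the following lines can download a few samples for study. It assumes the requests library is installed and that the site still serves a freshly generated JPEG at its root URL for each request; the filenames and request headers are illustrative, not part of the researchers' work.

```python
import requests

# thispersondoesnotexist.com is assumed to return a freshly generated
# StyleGAN2 face for each request; a browser-style User-Agent is sent
# in case the site rejects bare clients.
URL = "https://thispersondoesnotexist.com"
HEADERS = {"User-Agent": "Mozilla/5.0"}

def download_samples(count=5):
    """Save a handful of GAN-generated face images for inspection."""
    for i in range(count):
        resp = requests.get(URL, headers=HEADERS, timeout=10)
        resp.raise_for_status()
        with open(f"gan_face_{i}.jpg", "wb") as f:
            f.write(resp.content)

if __name__ == "__main__":
    download_samples()
```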
Image: mismatched corneal reflections in GAN-generated faces
Though a GAN can generate faces that look realistic, small details that our brains filter out can still give them away to an AI. Specifically, the researchers trained their system on the reflective nature of the cornea. Because both eyes view the same scene from nearly the same position, the specular highlights reflected in the two corneas of a genuine photograph are usually very similar, but a GAN synthesizes each eye without enforcing that physical consistency, so the reflections in generated faces rarely match. If you look closely, you can tell that something is slightly off in the eye reflections of nearly every generated face.
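To make the idea concrete, here is a minimal sketch of this kind of consistency check, assuming the two corneal regions have already been cropped and aligned as grayscale arrays. The brightness threshold, crop handling, and scoring are illustrative assumptions, not the researchers' exact pipeline.

```python
import numpy as np

def highlight_mask(eye_gray, threshold=0.8):
    """Binarize the bright specular highlight in a normalized grayscale eye crop."""
    norm = (eye_gray - eye_gray.min()) / (np.ptp(eye_gray) + 1e-8)
    return norm > threshold  # illustrative threshold, not a tuned value

def reflection_similarity(left_eye, right_eye, threshold=0.8):
    """Intersection-over-union of the highlight masks from two aligned corneal crops.

    Real photos tend to score high (matching reflections in both eyes);
    GAN-generated faces tend to score low.
    """
    left = highlight_mask(left_eye, threshold)
    right = highlight_mask(right_eye, threshold)
    union = np.logical_or(left, right).sum()
    if union == 0:  # no visible highlight in either eye: the cue is unusable
        return 0.0
    inter = np.logical_and(left, right).sum()
    return inter / union

# Illustrative usage with random data standing in for real corneal crops:
rng = np.random.default_rng(0)
left_crop = rng.random((32, 32))
right_crop = rng.random((32, 32))
score = reflection_similarity(left_crop, right_crop)
print(f"highlight IoU: {score:.2f}")  # low scores suggest a generated face
```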

After training the AI, the researchers found the method to be quite effective at flagging generated images. The main failure cases were photos that are not frontal portraits and photos with no visible light source reflected in the eyes. The researchers also caution that the inconsistencies can be hidden with "non-trivial" manual post-processing, so a determined forger could work around the check.

Ultimately, the researchers hope that further training and investigation will resolve these limitations. Perhaps this technique, alongside other methods of detecting edited or generated images, can help sniff out deepfakes.