Light reflections in the eyes are the key to detecting deepfake videos

So far, deepfakes have been used for many sinister purposes, from political campaigns that spread disinformation to impersonating celebrities in various fields. As the technology advances, it becomes increasingly difficult to distinguish fake videos from real ones.

According to The Next Web, a new AI-based tool now offers a remarkably simple way to detect deepfake videos: by examining the light reflected in the eyes, it can determine whether footage is fake or real. The system was created by computer scientists, and in experiments on portrait images it detected deepfakes with 94% accuracy.

The tool detects deepfakes by analyzing the corneas, which have a mirror-like surface and produce distinctive reflection patterns when exposed to light. In a photo of a real face taken with a camera, the reflections in the two eyes are nearly identical, because both eyes see the same scene. Deepfake images created with GANs, however, usually fail to reproduce this consistency, so fake videos often contain discrepancies in the light reflections that can be detected with a bit of care.
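To make the idea concrete, here is a minimal sketch (not the researchers' actual pipeline) of how such a comparison could work in Python. It assumes the two eye regions have already been cropped, converted to grayscale with values in [0, 1], and aligned to the same size; it then takes the brightest pixels as a rough stand-in for the corneal highlight and measures how well the two highlight patterns overlap. The function names and the brightness threshold are illustrative assumptions.

import numpy as np

def specular_highlight_mask(eye_crop, brightness_threshold=0.9):
    # The brightest pixels in a grayscale eye crop roughly correspond to
    # the corneal specular highlight, i.e. the reflection of the light source.
    return eye_crop >= brightness_threshold

def reflection_similarity(left_eye, right_eye):
    # Compare the highlight masks of the two eyes with an
    # intersection-over-union (IoU) score in [0, 1]. Both crops are assumed
    # to be grayscale arrays already aligned and resized to the same shape.
    left_mask = specular_highlight_mask(left_eye)
    right_mask = specular_highlight_mask(right_eye)
    union = np.logical_or(left_mask, right_mask).sum()
    if union == 0:  # no detectable highlight in either eye
        return 0.0
    intersection = np.logical_and(left_mask, right_mask).sum()
    return float(intersection) / float(union)

A real photograph would tend to score close to 1, since both corneas reflect the same light sources, while a GAN-generated face would often score lower.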

Deepfake video detection tool

The AI examines these differences by mapping the face and analyzing the light reflected in each eyeball. It then produces a similarity score: if the video in question scores below the threshold, it is flagged as a deepfake. In other words, the tool compares the reflection properties of the two eyes and uses the resulting score to decide whether the video is real.
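Continuing the sketch above, the decision step could be as simple as comparing the similarity score against a calibrated threshold. The threshold value below is a placeholder assumption, not a figure from the research.

SCORE_THRESHOLD = 0.5  # hypothetical cut-off; in practice it would be
                       # calibrated on labelled real and GAN-generated portraits

def classify_portrait(left_eye, right_eye, threshold=SCORE_THRESHOLD):
    # Flag the portrait as a likely deepfake when the corneal reflections
    # of the two eyes disagree too much (low IoU score).
    score = reflection_similarity(left_eye, right_eye)
    label = "real" if score >= threshold else "likely fake"
    return label, score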

The system has proved very successful at spotting flaws in images from This Person Does Not Exist, a site that generates faces with the StyleGAN2 architecture. However, it comes with several limitations.

The most obvious weakness is that the tool depends on a light source being reflected in both eyes. Inconsistent reflection patterns can be fixed manually in post-processing, and the method fails outright if one eye is not visible in the image. It is also only effective on portrait images: if the subject is not looking at the camera, the tool cannot judge whether the video is genuine.

The researchers plan to study these problems and find solutions that make their new tool more robust. At present the system cannot reliably detect sophisticated deepfakes, but it is useful against simpler ones. Even so, having a tool today for checking whether a video is genuine is a good start, because deepfake technology is moving in a direction that bad actors will increasingly exploit.

