This new tool detects deepfakes by searching for heartbeat – DIY Photography


Deepfake videos are advancing at a rapid pace. They could be misused in all sorts of ways, so it’s no wonder that scientists are working hard on spotting them. A new tool developed at Binghamton University detects deepfakes by analyzing the person in them and looking for a sign of life that may not be visible to the naked eye: a heartbeat.

The tool is called FakeCatcher and it works by recognizing biological signals in videos of people. Binghamton University researchers teamed up with Intel to develop it and published their process and findings in a paper titled FakeCatcher: Detection of Synthetic Portrait Videos using Biological Signals. 

Basically, FakeCatcher uses photoplethysmography, the same principle behind fitness gadgets that measure your heart rate during physical activity. The technique is based on observing the subtle change in skin color as blood is pumped through it. Since your face shows the same color shifts, FakeCatcher relies on them to look for deepfakes. The tool analyzes these changes of color and motion in ordinary RGB videos. Even though they’re not visible to the naked eye, FakeCatcher is able to detect them and tell you whether the video is fake. The full paper is available if you’d like to go into more detail.
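FakeCatcher’s actual pipeline is considerably more involved, but the core remote-photoplethysmography idea can be illustrated with a minimal sketch: average the green channel over a face region in each frame (green carries the strongest pulse signal), then find the dominant frequency in the heart-rate band. The function name and the synthetic demo data below are hypothetical, not from the paper.

```python
import numpy as np

def estimate_pulse_bpm(frames, fps):
    """Estimate heart rate from an RGB video of a face region.

    frames: array of shape (T, H, W, 3); fps: frames per second.
    """
    # Mean green-channel intensity per frame, then remove the DC offset
    signal = frames[..., 1].mean(axis=(1, 2))
    signal = signal - signal.mean()

    # Dominant frequency via FFT, restricted to plausible heart rates
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)  # ~42–240 bpm
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # convert Hz to beats per minute

# Synthetic demo: 10 s at 30 fps with a faint 1.2 Hz (72 bpm) oscillation
fps = 30
t = np.arange(300) / fps
frames = np.full((300, 8, 8, 3), 120.0)
frames[..., 1] += 0.5 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(round(estimate_pulse_bpm(frames, fps)))  # → 72
```

A real video would first need face detection and skin-region tracking; a deepfake detector like FakeCatcher then checks whether such signals are consistent across facial regions and over time, which synthesized faces fail to reproduce.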

The researchers behind this project believe that deepfake videos could have a negative effect on society, law, and privacy. “Believing the fake video of a politician, distributing fake pornographic content of celebrities, fabricating impersonated fake videos as evidence in courts are just a few [real-world] consequences of deep fakes,” they write. 

There are indeed many dangers and possible misuses of fabricated videos and photos, so it’s no wonder that more and more companies are trying to combat them. Microsoft, Jigsaw, Google, and Twitter are only some of the companies that have introduced various methods of detecting fake imagery. But considering the fast development and increasing accuracy of deepfakes, I believe more solutions are yet to come.

[via Gizmodo] 
