YouTube's Newest Creator Tool Could Keep AI Deepfakes Off Of Your Feed

Along with the influx of new AI products and services, we've also seen an increase in "deepfakes." Some deepfakes aren't malicious at all, and are instead meant to add a comedic twist to a video. Others, though, are designed to trick people into falling for scams by impersonating figures they trust, as we've seen with the fake Elon Musk ads in recent years.

While deepfakes and impersonations have a place in media when they aren't pushing malicious products or scams, creators need some way to protect themselves, especially as deepfakes grow more convincing and the technology behind them continues to advance. That's why YouTube has launched a new system called Likeness Detection.

The system has actually been in testing for quite a while, YouTube says, but the company is finally rolling it out to a wider group of creators, letting them track any unsanctioned uses of their image and likeness across the site. The goal is to keep creators' voices and faces from being lifted and used to promote products they don't actually endorse, spread misinformation, and more.

Here's how it works

In a new video uploaded to the Creator Insider channel on YouTube, the company shared key details about how the system works and how creators can take advantage of it. Once creators sign up by heading to the Likeness tab and consenting to data processing, YouTube requires a photo ID and a brief selfie video. This helps YouTube's AI find videos that might be impersonating the creator.

YouTube says that once creators have set up their Likeness settings, they'll be able to see any videos detected as containing their likeness. They can then submit a removal request under the site's privacy guidelines. If the deepfakes are operating at a large enough scale, creators can also file a copyright request directly from the same page.

Additionally, if creators decide they no longer want to use Likeness Detection, they can drop out of the program at any time; YouTube will stop scanning videos 24 hours after the request is received. Hopefully, this setup will let creators uncover and strike back against any AI deepfakes using their likeness. With any luck, it means we'll finally see less AI deepfake slop on our homepages. This development comes just a week after YouTube pushed out major design changes to its app and website.
