Where else is deepfake technology being used?
Deepfake content has come a long way and is steadily entering the online mainstream, for better or worse. One example is the Tom Cruise deepfake TikTok account, which creates short-form videos convincingly showing the actor in all kinds of situations.
Users are often fooled into believing it's the real Tom, when it's actually a lookalike actor with Cruise's face digitally superimposed in post-production.
While this example is largely harmless and not intended to cause problems, similar deepfake technology is being used in more malicious ways.
Deepfake AI pornography is of particular concern. There is significant potential for explicit material to be created and shared without consent, as many adult websites are not properly regulated and seldom follow strict guidelines.
Deepfake material has also been created to confuse and misinform the public about the war in Ukraine. Content has been shared online that shows leaders of both countries ‘saying’ things they never did. The consequences of this could be far-reaching, and we’re likely to see more of it moving forward.
It’s not just visuals that can be manipulated, either. Audio deepfakes are gaining traction and are far more accessible. One service we’ve tried messing around with in the Thred office is UberDuck, which let us create entire old-school Eminem songs from scratch.
Another tool can generate animated faces from audio alone, which means we may soon see entire digital people built from scratch who never existed. The platform Replika is experimenting with this using chatbots, creating digital conversational characters for humans to engage with. Think of the plot of Her and you’re not far off.
https://www.youtube.com/watch?v=QhBWRh-h71g&ab_channel=40Hertz
Why could this be concerning?
The ethics of deepfake use are currently something of a mixed bag.
Right now we’re seeing mostly novel or gimmicky implementations of the technology, but it could easily become more commonplace and harder to identify very soon.
This brings a host of problems, as mentioned above, from explicit content to misleading political material. Imagine the potential ramifications if an unhinged world leader (of which there seem to be many) sees a deepfaked video threatening nuclear war.
I’m not saying Kendrick’s video is a sign that we’re heading down a doomed path, but it does demonstrate that deepfakes are becoming easier to create and are very effective. If seeing Kanye’s face superimposed on Kendrick’s body isn’t enough to give you nightmare fuel for weeks, I don’t know what is.
For now, bring on Kendrick’s album – it’ll no doubt be a classic, right?