AI fake porn is the latest thing on the internet that needs to go away

Remember in the 1990s when the Daily Sport (allegedly) photoshopped celeb heads onto Page 3 models’ bodies, only to feign outrage about the fake snaps?
Well, the 2018 version of that is AI pornography, and it goes a little further. And by that, we mean way too far.
Deepfakes are pornographic videos in which the performer's face has been replaced with a celebrity's, and there's an entire subreddit dedicated to them, with 15,000 subscribers.
The advanced face-swap videos take existing footage of celebrities like Daisy Ridley, Katy Perry and Taylor Swift and use machine-learning algorithms, bespoke software and a high-end GPU to create lifelike fakes.
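How do they work? The approach widely described in these communities pairs two autoencoders that share a single encoder: the shared encoder learns pose, lighting and expression from face crops of both people, while a separate decoder per identity learns to reconstruct that person's features. Below is a minimal PyTorch sketch of that idea; the layer sizes, the 64x64 crop size and all names are illustrative assumptions on our part, not FakeApp's actual code.

    # Illustrative sketch of the shared-encoder / two-decoder face-swap idea.
    # Layer sizes and the 64x64 input are assumptions, not FakeApp's design.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        """Shared encoder: compresses any aligned 64x64 face crop."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
                nn.ReLU(),
                nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, 256),               # latent code
            )

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        """Per-identity decoder: rebuilds one person's face from the code."""
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(256, 64 * 16 * 16)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
                nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
                nn.Sigmoid(),
            )

        def forward(self, z):
            h = self.fc(z).view(-1, 64, 16, 16)
            return self.net(h)

    encoder = Encoder()
    decoder_a = Decoder()  # trained only on faces of person A
    decoder_b = Decoder()  # trained only on faces of person B

    # Training reconstructs each identity through the SHARED encoder, e.g.:
    #   loss_a = mse(decoder_a(encoder(face_a)), face_a)
    #   loss_b = mse(decoder_b(encoder(face_b)), face_b)
    # The swap itself: encode a frame of A, decode with B's decoder, so
    # B's features appear with A's pose and expression.
    frame_of_a = torch.rand(1, 3, 64, 64)  # stand-in for a real video frame
    swapped = decoder_b(encoder(frame_of_a))

Run frame by frame over a clip, with the swapped crop pasted back into each frame, and you get the kind of video being shared. The quality depends heavily on how much training footage of both faces is available, which is why celebrities, with hours of public video, are the easiest targets.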
Named after their originator's Reddit username, deepfakes are taking on a new life after one poster published FakeApp, a free and easy-to-use desktop application.
The app's creator told Motherboard they plan to improve it to the point where it works at the touch of a single button.
“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” they said.
“Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”
The videos aren't limited to celebrities, either: Reddit posters are asking how to create fakes of their crushes instead.
Harder to spot the fakes
Worryingly, the videos are getting so good that it's becoming harder and harder to tell fake from real.
Deborah Johnson, Professor Emeritus of Applied Ethics at the University of Virginia, told Motherboard: “You could argue that what’s new is the degree to which it can be done, or the believability, we’re getting to the point where we can’t distinguish what’s real – but then, we didn’t before.
“What is new is the fact that it’s now available to everybody, or will be… It’s destabilising. The whole business of trust and reliability is undermined by this stuff.”
The potential uses for the tool extend far beyond pornography and are scary to think about. There are concerns it could be used to blackmail innocent people or to fabricate evidence against them.
What’s your view on Deepfakes? Share it with us @TrustedReviews on Twitter.