
AI Being Used To Create Fake Celebrity Porn


Reddit users are now using advanced machine learning technology to create fake celebrity pornography. They have been pasting faces of famous actresses and singers over existing lewd videos.

The activity isn't limited to pornography: users have also pasted Nicolas Cage's face onto actors in famous films, including James Bond movies.

What Is FakeApp? 

The technology behind these fake videos, which have used the likenesses of celebrities including Daisy Ridley, Katy Perry, and Taylor Swift, is fairly complex.

Called FakeApp, it is based on a deep learning algorithm created by Reddit user Deepfakes. To set it up, users need to download and configure Nvidia's CUDA framework to run the TensorFlow code. This means they need a GeForce GPU, which can be hard to obtain at a reasonable price because of demand from cryptocurrency mining rigs.

FakeApp's Reddit page contradicts this set-up process, however, saying the app can run the algorithm without installing Python or TensorFlow. It also says users without a capable GPU can rent cloud GPUs through services such as Google Cloud Platform.

To make a fake video, users first split the source video into individual frames. To graft on a new face, the software must be trained on a large number of photos of that face. In principle, the software can produce any kind of video, but the most popular use has been pornography.
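The face-swapping approach the article describes is widely understood to rest on a shared encoder with one decoder per identity: both people's faces are compressed through the same encoder, and swapping amounts to decoding one person's pose with the other person's decoder. Below is a deliberately tiny, purely linear sketch of that idea using NumPy; the data, dimensions, and variable names are all hypothetical stand-ins, not FakeApp's actual code or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "faces": flattened 8x8 images for two people, A and B
# (hypothetical stand-ins for the large photo sets the article mentions).
faces_a = rng.normal(size=(200, 64))
faces_b = rng.normal(size=(200, 64)) + 1.0  # B's faces have a different "style"

# Shared encoder: here just a fixed random projection to a small latent space.
W_enc = rng.normal(size=(64, 16)) / 8.0

def encode(x):
    return x @ W_enc

# Per-identity decoders, fit by least squares: latent vector -> that person's face.
lat_a, lat_b = encode(faces_a), encode(faces_b)
W_dec_a, *_ = np.linalg.lstsq(lat_a, faces_a, rcond=None)
W_dec_b, *_ = np.linalg.lstsq(lat_b, faces_b, rcond=None)

# The "swap": encode a frame of person A, then decode it with B's decoder,
# yielding a face with A's pose rendered in B's appearance.
frame_of_a = faces_a[0]
swapped = encode(frame_of_a) @ W_dec_b
print(swapped.shape)  # prints (64,)
```

Real deepfake software replaces the random projection and least-squares decoders with deep convolutional networks trained jointly, which is why training each clip takes hours on a GPU; this sketch only shows why the same latent code can be rendered as either identity.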

Using FakeApp, creating one short clip takes 8 to 12 hours, and the best results are frighteningly good. Not all results are good, though: some users ended up with clips in which the face replacement was completely botched.

Privacy Concerns

A major issue with the technology is that it is not limited to celebrity faces. Celebrities are common targets because so many images and videos of them are available for training. With the vast library of photos people post online, however, users could make such a video of virtually anybody.

The availability of the tools makes it easy for anyone to produce such videos. That could be a problem even for police, who could be fooled by one of these creations.

Another concern is fake news content. Fake news is an issue major tech companies are already contending with, after bots and websites were used to spread misinformation during the 2016 presidential election.

This AI technology would only make it harder to discern the difference between what's real and fake. People would need to take a closer look at videos that may prove to be hoaxes.

© 2018 Tech Times, All rights reserved. Do not reproduce without permission.
