YouTube
(Photo: Michal Jarmoluk | Pixabay) YouTube hosts deepfake videos that, according to Deeptrace, are "fairly innocuous." However, the majority of deepfake videos on the internet feature women whose likenesses have been inserted into pornographic scenes.

Deepfakes are on the rise. A study found that the number of AI-manipulated videos online almost doubled in the past year.

The Alarming Increase Of Deepfake Videos

The Amsterdam-based cybersecurity company Deeptrace found 14,698 deepfake videos on the internet during its most recent tally in June and July. For comparison, the tech start-up counted only 7,964 last December. That is an 84 percent increase in just seven months.

While concerns over deepfakes center on the technology's potential to mislead or lie to the public, especially during election season, the recently published Deeptrace report revealed that the vast majority (96 percent) of deepfake videos currently online are pornographic, and all of them feature women.

Nearly half of all pornographic deepfake videos feature American and British actresses, but the researchers also found a significant number that used the likenesses of South Korean singers and K-pop stars.

To monitor the prevalence of deepfakes on the internet, Deeptrace used automated tools to track communities where AI-manipulated content appears. These include Reddit, where the first known deepfake video was posted, and YouTube, where the researchers found the "fairly innocuous" manipulated videos.

The report added that more than 13,000 of the deepfake videos found during Deeptrace's most recent tally were hosted on just nine deepfake-specific porn sites. However, deepfake videos also appeared on eight of the top 10 porn sites.

Moreover, people and businesses offering to create deepfakes have started popping up. One service mentioned in the report asks for 250 photos of a person to create a deepfake video. Some charge as little as $3 per video.

"The debate is all about the politics or fraud and a near-term threat, but a lot of people are forgetting that deepfake pornography is a very real, very current phenomenon that is harming a lot of women," stated Henry Ajder, the head of research analysis at Deeptrace.

California Bans Political And Pornographic Deepfakes

In related news, California recently passed legislation making it illegal to create and distribute deepfakes that feature politicians. Assemblymember Marc Berman explained that he introduced AB 730 in response to fears that manipulated video or audio content would be used to spread misinformation ahead of the 2020 U.S. presidential election.

"Deepfakes distort the truth, making it extremely challenging to distinguish real events and actions from fiction and fantasy," he stated.

Governor Gavin Newsom also signed AB 602 into law, which allows California residents to sue anyone who uses their image to create sexually explicit content.
