AI-generated voices could be the next big thing in the voice-acting industry. However, as artificial intelligence continues to evolve, voice cloning is feared to be a bane for people who earn a living through dubbing and voice-over work.

Our fascination with AI has pushed the industry to explore lesser-known sides of voice acting, but that does not mean the technology works in favor of real voice actors (VAs).

AI Voice Cloning Can Be a Threat to Voice Actors

(Photo: Will Francis from Unsplash) Video game voice actors fear that AI voice clones could one day replace them.

In video games, we often hear our favorite characters deliver powerful monologues and dialogue. What is less obvious is that the hardworking people behind these iconic voices could someday lose their jobs because of AI.

One of the most famous voice actors in the gaming industry is Jennifer Hale, who voiced Kronika, the time- and space-manipulating villain in "Mortal Kombat 11." She also voices Ashe in "Overwatch 2."

In a phone interview with Gizmodo, Hale said she was concerned about the rapid growth of AI technology and urged her fellow VAs to unite against the dangers it poses.

Hale believes AI is being used to take voice actors' work without their consent. She describes it as "theft."

"We're all on the chopping block, and we have to get up, come together, and fight back, or we're going down. As actors, when we are hired, we have a certain shelf life in any given year, in any given decade," she said.

The same sentiment appeared in an article from The Hustle back in April 2023, which reported that voice acting is at risk of being overtaken by AI as the technology gets better at replicating human voices.

Related Article: Researcher Tests AI Voice Cloning to See If Their Mother Could Tell the Difference: They Couldn't

What's With AI Voice Synthesizing?

Using AI is not inherently bad if a gaming company only employs an AI voice synthesizer to edit or adjust a dubber's voice, as seen in Activision Blizzard's practice of relying on AI-assisted voice cloning.

As long as dubbers don't lose their jobs, using AI to streamline production is acceptable. It has been an effective tool in the industry for years, yet only now is it being recognized as a threat to voice acting.

The truth is that AI is also notoriously used by scammers. With deepfake technology, scammers can easily imitate the voice of an ordinary individual or even a celebrity, opening the door to impersonation and making online threats more rampant than before.

With some voice actors fearing that AI might replace them someday, it is now up to the industry to chart a different path or stick to traditional methods.

After all, VAs find comfort and a sense of job security in knowing they finished a project with their own voices, without any help from AI.

Read Also: New AI Voice Scam Analysis Claims 77% of Victims Lose Money! How Can You Be Safe?

Joseph Henry
