Nvidia recently showcased how developers are integrating its AI "digital human" tools to enhance video game characters' voices, animations, and dialogue generation. 

At the Game Developers Conference (GDC) 2024, the tech giant unveiled Covert Protocol, a playable tech demo showcasing how its AI tools enable non-playable characters (NPCs) to respond dynamically to player interactions, generating varied responses tailored to live gameplay.

Covert Protocol: A Unique Gameplay Experience

(Photo: Nvidia) Nvidia's demo lets developers create AI-generated game characters, adding backstories and modifying their traits through digital interaction.

In Covert Protocol, players assume the role of a private detective, engaging in tasks determined by interactions with AI-driven NPCs.  

If you're into gaming, you know how these characters typically operate: they offer only a limited set of scripted interactions. That changes once generative AI is incorporated into them.

Nvidia emphasizes that each playthrough offers a distinct experience, as players' real-time engagement influences the game's outcomes. 

John Spitzer, Nvidia's vice president of developer and performance technologies, highlights the potential of the company's AI technology in driving nuanced animations and conversational speech essential for fostering lifelike digital interactions.


Collaboration with Inworld AI and ACE Technology

Covert Protocol is a collaborative effort between Nvidia and Inworld AI, an AI gaming startup, leveraging Nvidia's Avatar Cloud Engine (ACE) technology. ACE, previously showcased in Nvidia's futuristic ramen shop demo, forms the backbone of Covert Protocol. 

While the demo primarily exhibits NPC voice lines, Inworld AI intends to release the source code of Covert Protocol to encourage wider adoption of Nvidia's ACE digital human technology among developers, per The Verge.

Advancements in Audio2Face Tech

Nvidia also demonstrated its Audio2Face technology in a sneak peek of the MMO World of Jade Dynasty, which showed a character lip-syncing to both English and Mandarin Chinese speech, highlighting Audio2Face's potential to support multilingual game development without manual character reanimation. Additionally, a video snippet of the action melee game Unawake illustrated how Audio2Face enhances facial animations during both cinematics and gameplay.

Implications for Game Developers and Voice Actors

While these tech demonstrations may intrigue game developers, particularly in diversifying character interactions and language support, the conversational aspect still presents challenges. NPCs in Covert Protocol exhibit limited resemblance to "real people," reminiscent of earlier Kairos demos. 

There's no doubt this aspect may raise concerns among video game voice actors about the potential impact of AI adoption on their roles and livelihoods in the voice acting (VA) sector.

Indeed, Nvidia's advancements in AI-driven NPC technology offer promising prospects for enriching gameplay experiences. However, the balance between innovation and maintaining the human element in gaming remains a topic of debate, especially concerning the evolving landscape of VA in the industry.


Joseph Henry
