Ubisoft Will Tap NVIDIA’s AI Tools to Build Chatty Game Characters

Tech giant NVIDIA’s stock price rose to an all-time high Monday as the chipmaker and AI tech developer announced advancements to its Avatar Cloud Engine (ACE)—and revealed that game publishers like Ubisoft have vowed to use it.

According to the announcement, made during CES today, game developers and publishers can now apply for early access to two of NVIDIA’s ACE Production Microservices offerings and deploy the tech in their games. The first, Audio2Face, uses AI to make game characters’ facial expressions more closely match any speech track. The second, Riva, uses AI to recognize players’ spoken input and power real-time conversations with game characters.

In addition to Ubisoft, Genshin Impact developer MiHoYo, Riot Games parent company Tencent, and Chinese game studios NetEase and Ourpalm have committed to using the new tech, according to NVIDIA. The company added that AI-forward firms like Convai, Charisma.AI, and UneeQ will be adopting ACE as well.

First announced back in March 2023, NVIDIA’s ACE previously made waves with an initial demo in which a generative AI-powered non-playable character (NPC) working at a cyberpunk ramen bar held a real-time conversation with a human player. Historically, game narrative designers have crafted NPC dialogue in advance and given players a limited set of responses. But with new generative AI tools like NVIDIA’s, players can chat with NPCs using their microphones—and hear the NPC’s response generated on the spot.

NVIDIA’s latest update to the ramen bar demo now shows two generative AI-powered NPCs having a conversation with each other—and every conversation is different, with the player joining as a third voice in the mix.

While the use of generative AI tools in game development may be on the rise, it remains to be seen whether these tools will actually make games better. Based on a quick review of a few demo videos, NVIDIA’s Audio2Face isn’t perfect just yet: character lips don’t exactly match what is being said. As a result, characters often look like they’re speaking a different language under a dubbed track rather than speaking organically.

NPCs can blink and move their eyebrows, but facial expressions still feel minimal in the ramen bar demo. Developers can, however, fine-tune NPC “emotions” on a sliding scale to decide how dramatic they’d like character reactions to be.

NVIDIA is pushing ahead with its AI offerings for game developers, but there’s already been substantial backlash within the game development community as game writers and voice actors raise alarms about potential job losses brought about by AI-generated art.

Xbox’s November announcement with Inworld AI further angered game developers, who called its AI gamedev tools “disrespectful and dangerous” to their professions.

AI has long been used in the gaming industry for other purposes, like “practice range” training and “bot modes” in shooter games. But the rise of generative AI has opened up a Pandora’s box—and NVIDIA isn’t the only one exploring its possibilities. Take, for instance, this Grand Theft Auto V mod that let AI-powered NPCs insult you—until GTA’s developer issued a copyright strike against the mod’s creator.

Edited by Ryan Ozawa.
