NVIDIA & Inworld AI demo on-device capabilities at Computex

Nathan Yu
June 02, 2024

During GDC 2024, Inworld AI worked with NVIDIA to develop Covert Protocol, a demo that unlocks social simulation game mechanics using Inworld’s AI agents. Inverse shared their impressions of the demo, stating “the potential is clearly there and we may already be in the fine-tuning stage.” 

Today at COMPUTEX, we’re unveiling the next phase of our work together: a hybrid deployment of Covert Protocol that brings NVIDIA ACE, a suite of technologies for bringing digital humans to life with generative AI, on device. Covert Protocol features several NVIDIA NIM microservices, such as NVIDIA Riva for automatic speech recognition (ASR) and NVIDIA Audio2Face, deployed on-device on NVIDIA RTX AI PCs for a low-latency player experience and lower deployment costs. 

Watch the replay of NVIDIA CEO and founder Jensen Huang’s COMPUTEX keynote to see the latest in ACE content.

Developing AI agents for game worlds

AI agents are autonomous systems capable of perceiving their environment and performing a wide range of functions. Beyond NPCs, agents in video games can serve as tools for procedural generation, dynamically creating vast landscapes, quests, and narratives. They can also function as adaptive difficulty systems, adjusting the game’s challenge level based on the player’s skill and preferences. Agents can also govern the behavior of objects and physics, adding fluid dynamics and movement to a living game world. AI agents in games encompass a vast spectrum of possibilities, facilitating unique player interactions and deepening engagement.
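To make one of these roles concrete, here is a minimal sketch of an adaptive difficulty agent that nudges challenge toward a target player success rate. The class, thresholds, and parameters are hypothetical illustrations, not part of Inworld’s or NVIDIA’s tooling.

```python
# Hypothetical sketch: an adaptive-difficulty "agent" that observes player
# outcomes and nudges the challenge toward a target win rate.
# All names and numbers are illustrative; this is not an Inworld or NVIDIA API.

from dataclasses import dataclass

@dataclass
class DifficultySettings:
    enemy_health_multiplier: float = 1.0
    enemy_accuracy: float = 0.5

class AdaptiveDifficultyAgent:
    """Nudges encounter difficulty toward a target player win rate."""

    def __init__(self, target_win_rate: float = 0.6, step: float = 0.05):
        self.target_win_rate = target_win_rate
        self.step = step
        self.wins = 0
        self.encounters = 0
        self.settings = DifficultySettings()

    def record_encounter(self, player_won: bool) -> DifficultySettings:
        self.encounters += 1
        self.wins += int(player_won)
        win_rate = self.wins / self.encounters
        if win_rate > self.target_win_rate:
            # Player is winning too often: raise the challenge slightly.
            self.settings.enemy_health_multiplier += self.step
            self.settings.enemy_accuracy = min(1.0, self.settings.enemy_accuracy + self.step)
        elif win_rate < self.target_win_rate:
            # Player is struggling: ease off.
            self.settings.enemy_health_multiplier = max(0.5, self.settings.enemy_health_multiplier - self.step)
            self.settings.enemy_accuracy = max(0.1, self.settings.enemy_accuracy - self.step)
        return self.settings

agent = AdaptiveDifficultyAgent()
print(agent.record_encounter(player_won=True))
```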

In Covert Protocol, Inworld’s AI agents are able to perceive, understand, and interact with the world around them, with the goal of enhancing the engagement and entertainment value of the game by simulating intelligent behavior and responding dynamically to player actions and the game world. Characters like Diego, NexaLife’s CEO and the conference keynote speaker, are capable of advanced reasoning, decision-making, adaptability, learning, perception, and communication – all leveraging Inworld’s multimodal AI systems.

Large language models are only a small component of the Inworld Engine. While an LLM can generate text from input prompts, the reality is that LLMs are still prone to game-breaking hallucinations, coherence issues, and unnatural tone within complex interactive narratives. These limitations underscore the need for a robust and adaptive framework that integrates multiple AI models to create truly immersive and engaging virtual worlds.
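As a rough illustration of what treating the LLM as one component can look like, here is a hypothetical sketch in which generated dialogue is only accepted after coherence and lore-grounding checks, with a safe fallback otherwise. The checks and names are our own, not a description of the Inworld Engine’s internals.

```python
# Hypothetical sketch: the LLM sits behind validation layers rather than
# driving dialogue on its own. Function names and checks are illustrative only.

import re

LORE_FACTS = {"company": "NexaLife", "keynote_speaker": "Diego"}

def llm_generate(prompt: str) -> str:
    # Stand-in for a real language-model call.
    return "Diego, NexaLife's CEO, greets you warmly before the keynote."

def is_coherent(candidate: str) -> bool:
    # Toy coherence check: non-empty, reasonably short, no repeated sentences.
    sentences = [s.strip() for s in re.split(r"[.!?]", candidate) if s.strip()]
    return 0 < len(sentences) <= 4 and len(set(sentences)) == len(sentences)

def is_grounded(candidate: str) -> bool:
    # Toy hallucination check: the named speaker must match the game's lore.
    return LORE_FACTS["keynote_speaker"] in candidate

def generate_dialogue(prompt: str, max_attempts: int = 3) -> str:
    for _ in range(max_attempts):
        candidate = llm_generate(prompt)
        if is_coherent(candidate) and is_grounded(candidate):
            return candidate
    return "Let's talk about the conference."  # safe fallback line

print(generate_dialogue("Player greets the keynote speaker."))
```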

Journey of an interaction within the Inworld Engine

One of the game objectives in Covert Protocol requires players to gain the trust of Diego by obtaining a conference badge. Let’s chart the journey of a player’s interaction with Diego to understand what’s happening under the hood as the Inworld Engine processes these inputs.  

Warning: this post contains spoilers!

In the video, we do a deep dive into the various models and systems powering interactions with our AI agents. In this scene, the player has found a conference badge lying around the hotel, and decides to re-approach Diego, who is the conference keynote speaker and has previously been dismissive. Watch how Diego’s demeanor changes after the player pretends to be a conference attendee, rather than a random tourist.

Cognition & Perception

  • The player’s voice input is captured on-device by NVIDIA Riva ASR, and the text transcript is passed to Inworld for inference. 
  • When the player introduces themselves as “Alex Miller” (the name on the conference attendee badge), this triggers a Mutation of Diego’s motivations and demeanor, making him more collegial and friendly with the player. Intent-based Mutations are also used to trigger changes to the Player Profile, from tourist to attendee.
  • To aid in agents’ cognition, we’ve added a Reasoning step to evaluate the interaction and dynamically update motivations. Reasoning allows AI agents to simulate human-like decision-making processes, engage in complex interactions, pursue their own goals, and create dynamic gameplay scenarios.
  • Strict Safety filters are enabled for topics like politics and religion, which the characters should not comment on.
  • Every interaction draws on Diego’s basic character sheet, which informs his perception of the game world and lore, as well as his core ego and persona (a simplified sketch of this loop follows the list). 
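In the hypothetical sketch below, an ASR transcript flows through intent detection, a Mutation, a Reasoning step, and a safety filter before a line is produced. Every class, field, and function name is our own illustration, not an Inworld Engine or NVIDIA Riva API.

```python
# Hypothetical sketch of the interaction loop described above. All class, field,
# and function names are illustrative, not Inworld Engine or NVIDIA Riva APIs.

from dataclasses import dataclass

@dataclass
class CharacterState:
    persona: str = "Diego, NexaLife CEO and conference keynote speaker"
    demeanor: str = "dismissive"
    player_profile: str = "tourist"

BLOCKED_TOPICS = ("politics", "religion")  # strict safety filter for these topics

def detect_intent(transcript: str) -> str:
    # Stand-in for intent classification on the ASR transcript.
    if "alex miller" in transcript.lower():
        return "claims_attendee_identity"
    return "small_talk"

def apply_mutation(state: CharacterState, intent: str) -> None:
    # Intent-based Mutations update the character's demeanor and the Player Profile.
    if intent == "claims_attendee_identity":
        state.demeanor = "collegial"
        state.player_profile = "attendee"

def reason(state: CharacterState) -> str:
    # A Reasoning step weighs current motivations before the response is produced.
    if state.player_profile == "attendee":
        return "welcome the attendee and talk up the keynote"
    return "politely brush off the tourist"

def handle_transcript(state: CharacterState, transcript: str) -> str:
    if any(topic in transcript.lower() for topic in BLOCKED_TOPICS):
        return "I'd rather keep this about the conference."  # safety deflection
    apply_mutation(state, detect_intent(transcript))
    goal = reason(state)
    # Every response also draws on the character sheet (persona, lore, ego).
    return f"[{state.demeanor}] {state.persona} decides to {goal}."

diego = CharacterState()
print(handle_transcript(diego, "Hi, I'm Alex Miller, here for the keynote."))
```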

Shifting character behavior on device in Covert Protocol

Inworld’s cognition and perception systems work together to output character behavior across multiple data streams. While every response returns a text string for dialogue, we also return the following parameters to orchestrate the full character performance, as sketched after the list below. 

  • NVIDIA Audio2Face and NVIDIA Riva ASR are running on-device to augment our character’s facial animation pipeline.
  • We pass Emotion parameters, so developers can sync these to client-side animations. As Diego warms up to the player, he will lean forward to indicate interest, for example.
  • Voices are synthesized at runtime in the cloud for expressive vocal performances.
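In the hypothetical sketch below, these streams are packaged into a single response payload and handled by a game client. The field names are our own and do not reflect Inworld’s actual message schema.

```python
# Hypothetical sketch of a multi-stream character response and a client-side
# handler. Field names are illustrative and do not reflect Inworld's schema.

from dataclasses import dataclass
from typing import Optional

@dataclass
class CharacterResponse:
    dialogue_text: str              # rendered as dialogue / subtitles
    emotion: str                    # e.g. "interest", synced to client-side animations
    emotion_strength: float         # 0.0-1.0, used to scale the animation blend
    voice_audio_url: Optional[str]  # cloud-synthesized voice clip streamed at runtime

def play_performance(response: CharacterResponse) -> None:
    # Illustrative client-side orchestration of the full character performance.
    print(f'Dialogue: "{response.dialogue_text}"')
    if response.emotion == "interest" and response.emotion_strength > 0.5:
        print("Triggering lean-forward animation")  # emotion-driven body language
    if response.voice_audio_url:
        print(f"Streaming voice audio from {response.voice_audio_url}")
        # On-device Audio2Face would then drive facial animation from this audio.

play_performance(CharacterResponse(
    dialogue_text="Alex! Good to see a fellow attendee before the keynote.",
    emotion="interest",
    emotion_strength=0.8,
    voice_audio_url="https://example.com/diego_line.wav",
))
```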

The case for on-device AI

At Inworld, we recognize that implementing sophisticated AI in games poses not only technical but also economic challenges. Traditionally, offloading AI processing to remote servers can be costly and may introduce latency that disrupts the player experience. While the Inworld Engine is optimized for real-time and cost efficiency, we also recognize the importance of giving developers control and flexibility over their games' performance and user experience. 

Covert Protocol is just one example of how developers can take advantage of hybrid deployments to integrate advanced AI capabilities into their games. As more powerful multimodal and language models become smaller and more efficient, the future of on-device AI for developers feels not just promising, but inevitable. 
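To make the hybrid idea concrete, here is a hypothetical routing sketch in the spirit of Covert Protocol’s split, with latency-sensitive microservices on the local GPU and heavier models in the cloud. The configuration format and endpoints are illustrative, not an ACE or Inworld API.

```python
# Hypothetical sketch of a hybrid deployment map in the spirit of Covert Protocol:
# latency-sensitive services run on the local RTX GPU, heavier ones in the cloud.
# The configuration format and endpoints are illustrative, not an ACE or Inworld API.

DEPLOYMENT = {
    "asr": "on_device",               # speech recognition on the RTX AI PC
    "facial_animation": "on_device",  # audio-driven facial animation on device
    "dialogue_model": "cloud",        # larger language models stay server-side
    "voice_synthesis": "cloud",       # expressive TTS rendered at runtime in the cloud
}

def endpoint_for(service: str) -> str:
    # Route each request to a local or remote endpoint based on the deployment map.
    if DEPLOYMENT.get(service) == "on_device":
        return f"http://127.0.0.1:8080/{service}"  # hypothetical local microservice endpoint
    return f"https://api.example.com/{service}"     # hypothetical cloud endpoint

for service in DEPLOYMENT:
    print(f"{service} -> {endpoint_for(service)}")
```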

In a world where AI agents go beyond dialogue and NPCs to shape procedural content, manage complex physics simulations, and adaptively adjust gameplay, on-device deployments become even more critical. We’re continuing to advance our capabilities there and in agentic applications of AI for gaming, and look forward to sharing our R&D in future experiences.

Attend our ‘making of’ Covert Protocol webinar! 

We're excited to bring you an exclusive webinar on June 11th, where we will pull back the curtain on how we created Covert Protocol, a demo of a next-gen gaming experience that unlocks social simulation and detective game mechanics using Inworld’s AI agents and NVIDIA ACE.

Webinar Details:

  • Title: Creating a Next-Gen Gaming Experience with NVIDIA & Inworld AI
  • Date: June 11, 2024
  • Time: 11:00 am - 12:00 pm PT
Get started with Inworld: get in touch to discuss signing up for the Inworld License.