New Matrix Awakens Unreal Engine Demo Lets Players Talk to NPCs in Real-Time

“Hey, why did The Matrix: Resurrections flop at the box office?” Replica Studios, developer of a leading artificial intelligence (AI) voice platform for video games, has released a modified version of The Matrix Awakens Unreal Engine 5 demo to showcase Smart NPC, a new plug-in that lets developers integrate NPCs powered by OpenAI or other AI language models into their games. The demo, now available for download (DirectX 12 is a listed requirement), lets players talk to any NPC using their microphone, and the NPCs respond with dialogue in real-time. Replica says its Smart NPC plug-in for Unreal Engine will be available later this year.

“At Replica, we believe AI Voice technology has two-fold potential for furthering narrative-heavy games: eliminating critical bottlenecks in the development process, enabling studios to tell ever bigger and immersive stories, and allowing for the creation of never seen before living-world gaming experiences that shape around the player in real-time,” said Shreyas Nivas, CEO of Replica Studios. “Smart NPCs will allow smaller studios who don’t have the resources to script and vocalise on scale to dream big and scale their games 10x or 100x into the epic adventure they’ve always wanted to make.”

The Matrix Awakens Replica Smart NPCs Demo Features

  • Talk to any NPC in real-time: Using your microphone, talk to NPCs and they will respond with dialogue in real-time.
  • AI voices with emotion: NPC responses utilise Replica’s range of emotions and adapt in real-time.
  • Scale unique NPC characters for your world: NPC backstories and context can be customised for a unique experience (not available in the demo release).
  • NPCs can talk to each other: Ambient NPC interaction is built in, so NPCs will also converse with each other intelligently.
  • Automated lip-sync and body gestures: A customised blend shape responds to phoneme timing output to create effortless lip-sync, and animation blueprints provide suitable body gestures.

From a Replica Studios press release:

In Replica’s Smart NPC Experience, AI-powered NPCs will dynamically respond to the player’s in-game voice in real time. Characters will change their dialogue, emotional tone and body gestures in reaction to how the player speaks to them. Replica Studios’ NPCs now have a broader range of emotions thanks to Replica’s recent Style Morphing update.

Game designers can also alter the information provided to the AI characters to instruct them to behave differently or have certain personalities, or incorporate their own AI language models to suit their project.

When Replica’s Smart NPC plugin launches for Unreal Engine later this year, game developers will be able to integrate Smart NPCs into their own projects and begin scaling up the characters in their games. The plugin combines AI language model smarts with Replica’s text-to-speech models and animates characters in real-time: it outputs sound phonemes and timelines to match an audio stream, uses a customised blend space for facial animations to power accurate lip-sync, and uses custom animation blueprints to send body gesture messages to NPCs during their listening, thinking and speaking phases. The result is a natural-looking and natural-behaving NPC that can surprise and delight players with thoughtful, funny, and provocative responses, depending on its personal context as well as the voice input captured directly from players’ microphones.
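The lip-sync step described here — phoneme timings from a text-to-speech stream driving mouth-shape (blend shape) weights — can be illustrated with a minimal, self-contained sketch. Nothing below is Replica's actual API: the `PhonemeEvent` struct, the `kVisemeFor` table, and the `VisemeAt` function are all hypothetical names invented for this example, which only shows the general technique of sampling a phoneme timeline at a given audio time to get a blended mouth-shape weight.

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

// Hypothetical phoneme event, as a TTS engine might emit alongside audio.
struct PhonemeEvent {
    double start;        // seconds into the audio stream
    double end;          // seconds into the audio stream
    std::string phoneme; // e.g. "AA", "M", "F"
};

// Hypothetical phoneme -> viseme (mouth shape) table; real pipelines map
// tens of phonemes onto a smaller set of visemes.
static const std::map<std::string, std::string> kVisemeFor = {
    {"AA", "open"}, {"M", "closed"}, {"F", "teeth_lip"},
};

// Sample the timeline at time t: return the active viseme and a weight
// that ramps in/out near phoneme boundaries so adjacent mouth shapes
// cross-fade instead of popping.
std::pair<std::string, double>
VisemeAt(const std::vector<PhonemeEvent>& timeline, double t) {
    const double blend = 0.05; // 50 ms cross-fade window at each boundary
    for (const auto& ev : timeline) {
        if (t < ev.start || t >= ev.end) continue;
        auto it = kVisemeFor.find(ev.phoneme);
        if (it == kVisemeFor.end()) return {"neutral", 0.0};
        double w = 1.0;
        if (t - ev.start < blend) {
            w = (t - ev.start) / blend; // ramping in
        } else if (ev.end - t < blend) {
            w = (ev.end - t) / blend;   // ramping out
        }
        return {it->second, w};
    }
    return {"neutral", 0.0}; // silence between phonemes
}
```

In a real Unreal Engine integration, a weight sampled like this would drive a morph target on the character's skeletal mesh each tick, while animation blueprints layer on the body gestures the press release mentions for the listening, thinking and speaking phases.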
