How Niantic and Liquid City pushed the bounds of what’s possible in gaming
Augmented reality was missing a key element: Dynamic characters
When Niantic released Pokémon GO in 2016, they transformed how people interacted with the world around them. Over the next year, the app was downloaded 650 million times and gamers collectively traveled an astounding 15.8 billion kilometers to participate in Pokémon battles. It remains the most successful augmented reality game of all time.
Niantic’s focus on novel game experiences is what made the company excited about the possibilities of generative AI. How much more immersive would their game worlds be if they were populated by AI NPCs that players could talk to? And what non-gaming augmented reality experiences could benefit from AI integrations?
Niantic wanted to be the first to find out. They organized an internal AI hackathon at their San Francisco headquarters in early 2023 and invited Inworld AI to participate. It was that collaboration that led Niantic to create the first AR experience that integrated AI NPCs: Wol.
As a pioneer of augmented reality technologies and experiences, Niantic has always prided itself on pushing the boundaries of game development. By driving the development of emerging mediums like location-based AR to tell stories in exciting new ways, they’ve created truly unique gaming experiences like Peridot, Pikmin Bloom, Ingress, and, more recently, Monster Hunter Now.
But even as they’ve succeeded in visually marrying the digital and physical worlds, they wanted to go further, deepening immersion by adding dynamic character interactions.
“We wanted players to be able to have an organic conversation with a character that felt immersive and drove user engagement,” said Maryam Sabour, who leads the business development team for Niantic’s platform.
Incorporating that interactivity was the challenge Niantic’s developers and game designers were tasked with during their AI hackathon. How could they make the world come to life around players with AI NPCs?
The prototype Niantic created during their hackathon took that challenge literally. They designed a character that was a living embodiment of San Francisco’s Ferry Building that people could talk to. In their personification of the historic landmark, the Ferry Building was a grizzled sailor capable of spouting historical facts about San Francisco and the building itself one minute – and joking about pigeons pooping on it the next.
For Niantic’s team, this felt like a huge step forward for AR. Not only were they able to map personality onto the world around them – but the Ferry Building was fun to talk to, something that would undoubtedly increase interactivity and immersion within AR experiences.
They decided to integrate Inworld’s generative AI technology into an experience as soon as possible. “With the release of the Quest Pro and the impending release of Quest 3 mixed-reality devices, we were in the process of optimizing Niantic’s 8th Wall WebAR platform to improve its metaversal deployment capabilities so developers could build once and seamlessly deploy an AR experience across mobile and Quest devices,” explained Maryam. “To showcase this capability, Niantic partnered with Liquid City to develop a cross-device WebAR experience that would showcase next-level immersion. This created the perfect opportunity to integrate Inworld’s GenAI.”
After partnering with Liquid City, an XR and AI design studio based in London, the idea for Wol, an AI-powered owl educator, came into focus. The aim was to create a character that could hold a conversation with players about redwood forests and act as their guide to the entire ecosystem. The experience needed to demonstrate the ability to build once and run on any device, a core feature of the 8th Wall platform. Specifically, it needed to run on mobile phones and the Meta Quest Pro.
Building a guided, curriculum-friendly AR experience powered by generative AI wouldn’t have been possible on such a short timeline had they started with a raw large language model (LLM) API, which would have meant architecting the backend and solving common LLM problems themselves. But with Inworld orchestrating multiple models, offering a suite of features customized to the needs of game developers, and delivering a scalable, real-time experience, Niantic could spend their time making the experience work rather than engineering the backend and building the features needed to power dynamic characters.
Working closely with the team at Inworld AI, Liquid City created a loveable guide that flies through a portal and into your AR space, an experience available via WebAR or mixed-reality headsets. Designed as a cartoon-like version of a Northern Saw-whet owl, Wol can talk about the forest’s plants and animals, the lifecycle of a redwood tree, and much more.
Throughout the two-month development process, the Liquid City and Niantic teams were grateful to be working with Inworld’s Character Engine for a number of key reasons – including the customer service they received. “Inworld was supportive,” said Maryam. “We did have some challenges but we had a direct communication line with the Inworld team which helped to unblock us.”
Dynamic personality generation
For Liquid City, working with AI characters through a Character Engine was a new experience. Rather than writing a script and creating dialogue trees, the team directed Wol as a ‘virtual actor,’ giving him guidance instead of specific lines. Wol was also designed to explain things in an engaging way rather than reciting facts verbatim. The team came to see that as the best thing about working with AI characters for educational use cases, since players might easily get bored if a character were explicitly trying to teach them.
“It's much more interesting to have a storyteller who is kind of fallible,” explained Keiichi Matsuda, Director at Liquid City. “Even with his animation and the way that Wol moves and talks, there's something kind of funny about Wol. He has so much charm.”
It was important to Liquid City that Wol didn’t act like a lecturer, said Keiichi. “Wol is actually more of a learning partner that you can discuss ideas with. He can have his own perspective.”
That’s one thing that impressed Keiichi most about Inworld’s Character Engine: “It’s not like we’re putting in text that Wol is then going to read out. We give Wol suggestions of things to talk about, building on existing knowledge, and then everything that Wol says is new every time.” In gaming use cases, that would make for infinite replayability. In educational ones? It keeps learners from getting bored.
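The idea of giving a character suggestions rather than lines can be illustrated with a minimal sketch. This is purely hypothetical and not Inworld’s actual API: the function name, persona string, and topic list are assumptions invented for illustration. The point is that the creator supplies a personality and topics to draw on, and the character’s generative model phrases every response fresh.

```python
# Hypothetical sketch (NOT Inworld's API): directing a character with a
# persona and topic suggestions instead of scripted dialogue lines.

def build_character_direction(persona: str, suggestions: list[str]) -> str:
    """Assemble the 'direction' handed to a generative character.

    Instead of fixed dialogue, the character receives a personality
    description plus topics it may draw on, so each response is
    generated anew rather than read from a script.
    """
    topics = "\n".join(f"- {t}" for t in suggestions)
    return (
        f"You are {persona}\n"
        "You may draw on these topics, but phrase everything in your own words:\n"
        f"{topics}"
    )

direction = build_character_direction(
    persona="Wol, a curious Northern Saw-whet owl guiding visitors through a redwood forest.",
    suggestions=[
        "the lifecycle of a redwood tree",
        "plants and animals of the forest floor",
    ],
)
print(direction)
```

Because the direction describes the character rather than scripting it, no two playthroughs produce identical dialogue, which is exactly the replayability benefit described above.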
Configurable Safety and Contextual Mesh
Anyone who has spent time talking with any AI chatbot knows that it’s possible to get off-topic or even talk about taboo subjects – something no one wants happening in a classroom or a video game.
“It was kind of nerve-wracking to put Wol in front of other people,” Keiichi admitted. “It's like sending your child out to talk to someone and hoping they don't say something stupid.”
Luckily, Inworld’s safety measures and Contextual Mesh were built to solve this problem. “One of the great things about Inworld is that it keeps the conversation coming back to Wol's core motivation – teaching you about the redwood forest,” Keiichi said.
Maryam agreed this was what stood out about Inworld’s Character Engine. “We could create interactions with a character centered around a very particular theme or niche – in this case, around redwood trees,” she said. “We made Wol an expert on redwood forests and he had ways to always bring the conversation back to that.”
Inworld’s Configurable Safety features also ensure that Wol can’t talk about inappropriate topics if a player tries to. “We appreciated the level of safeguarding Inworld offered to manage the boundaries of the conversations and interactions,” she added.
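The two mechanisms described above can be sketched in a toy form. This is a simplified illustration of the concepts, not Inworld’s implementation: the keyword sets and routing labels are invented for the example. One rule declines unsafe topics outright; another steers off-topic chat back toward the character’s core subject.

```python
# Toy illustration of the two ideas above (NOT Inworld's implementation):
# a blocklist for unsafe topics, and a "steer back" rule that returns
# off-topic conversation to the character's core subject.

CORE_TOPIC_KEYWORDS = {"redwood", "forest", "tree", "owl", "ecosystem"}
BLOCKED_KEYWORDS = {"violence", "gambling"}  # hypothetical examples

def route_player_message(message: str) -> str:
    """Classify a player message into one of three handling routes."""
    words = set(message.lower().split())
    if words & BLOCKED_KEYWORDS:
        return "refuse"       # safety layer declines the topic entirely
    if words & CORE_TOPIC_KEYWORDS:
        return "answer"       # on-topic: respond normally
    return "steer_back"       # off-topic: redirect toward the redwoods

print(route_player_message("How tall can a redwood tree grow?"))  # answer
print(route_player_message("tell me about gambling"))             # refuse
print(route_player_message("what's your favorite movie"))         # steer_back
```

A production system would of course use semantic classification rather than keyword matching, but the routing structure (refuse, answer, steer back) mirrors the behavior Keiichi and Maryam describe.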
Long-term memory and relationship building
Another benefit of Inworld’s Character Engine is AI characters’ ability to remember and adapt to players. “Wol’s personality is remarkably consistent,” explained Kylan Gibbs, Inworld’s Chief Product Officer. Over time, someone interacting with Wol will notice that the owl gets to know them. “Wol adapts the conversation. The experience becomes more personalized over time – something that has significant potential for deepening player engagement in gaming use cases.”
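Conceptually, long-term memory means facts learned about a player persist across sessions and shape later conversations. The sketch below is a simplified illustration of that idea, not Inworld’s engine; the class and the stored facts are invented for the example.

```python
# Simplified sketch of character long-term memory (an illustration of
# the concept, NOT Inworld's engine): facts learned about a player
# persist and can personalize later sessions.

class CharacterMemory:
    def __init__(self) -> None:
        self.facts: dict[str, list[str]] = {}

    def remember(self, player_id: str, fact: str) -> None:
        """Store a fact the character learned about a player."""
        self.facts.setdefault(player_id, []).append(fact)

    def recall(self, player_id: str) -> list[str]:
        """Retrieve everything the character knows about a player."""
        return self.facts.get(player_id, [])

memory = CharacterMemory()
memory.remember("player-1", "first visit to a redwood forest")
memory.remember("player-1", "asked about banana slugs")

# On the next session, the character can tailor its greeting:
known = memory.recall("player-1")
greeting = "Welcome back!"
if any("banana slugs" in fact for fact in known):
    greeting += " Still curious about banana slugs?"
print(greeting)  # prints: Welcome back! Still curious about banana slugs?
```

Real systems typically summarize or embed memories rather than storing raw strings, but the effect is the same: the conversation becomes more personalized the longer a player interacts with the character.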
Niantic launched Wol in May 2023 alongside the announcement of new 8th Wall tools to help developers build augmented reality experiences that easily run on both mobile devices and mixed reality headsets. Wol was greeted with excitement by the technology and gaming press, with The Verge calling it a ‘preview’ of what’s to come and outlets like GamesBeat and Pocket Gamer covering the launch.
“We created Wol to inspire developers to build their own WebAR content that makes use of the truly immersive capabilities of mixed reality passthrough while benefiting from the massive reach and scale of mobile,” said Tom Emrich, Director of Product Management at Niantic.
Maryam believes that the immersion AI characters add is the key to making AR experiences work. “Virtual characters and NPCs are a core part of gaming and augmented reality content. Giving these characters more intelligence in a scalable way increases the engagement from users while taking the immersion to a whole new level,” she said.
Keiichi, meanwhile, sees additional value in the combination of AR/AI. “I think one of the things that Wol does really well is make a really entertaining experience,” he explained. “Wol actually tries to provoke curiosity in people, so that users are driving the conversation. You're not being told a bunch of facts; you're actually able to be curious and direct the conversation in the direction you want to go.”
This is one of the key benefits of Inworld, according to Kylan. “One of the problems in education or gaming overall is if I have a specific interest or you want to talk more to a specific NPC, you can't go down that path. You have to stick to the curriculum or dialogue trees that are given to you,” he explained. “Wol respects what the player's actual interests are, which opens up education and gaming to be far more learner and player centric.”
Keiichi sees an impressive future for AR/AI characters. “Whether you're an expert in a subject, or a complete novice, it doesn't matter. You can interact with [an AI character like Wol], and he'll always have an answer for you,” he said. “It gives you opportunities for learning everywhere. This technology allows you to be able to have that in a really accessible and approachable way.”
That’s cause for excitement in educational use cases – and others. “We're really seeing a new kind of creativity and storytelling that's emerging, that's combining AI and mixed reality,” Keiichi added. “It's a whole new medium.”
That potential is why, after creating the Wol experience, Niantic worked with Inworld to create GenAI Modules and sample projects to make integrating Inworld easier for developers using Niantic’s 8th Wall Platform. Maryam is excited to see how their developers will use the technology in the future.
“Being able to really bring these characters to life is going to be extremely transformative,” she said. “Not just for education, for travel, for tourism – and you can even think about enterprise use cases like for learning and training. This can be deployed across many verticals.” She pointed to studies conducted at the Stanford Virtual Human Interaction Lab showing that when people are exposed to scenarios in VR and AR, the lessons learned tend to be retained for longer. That retention makes the technology particularly effective across these use cases.
Create your own AI NPCs
Creating your own AI character in Inworld’s no-code studio is simple. You can then easily integrate your character into popular game engines or web experiences.
Interested in building an experience with AI characters in augmented reality? With Inworld’s 8th Wall integration, developers can easily create truly dynamic interactive experiences they can deploy to multiple platforms.
“By combining WebAR with GenAI, developers are no longer creating single-use content,” explained Maryam. “Instead, GenAI and WebAR combine to create dynamic, evolving experiences that respond to end-user interaction, keeping users engaged and wanting more.”
Niantic is excited to see what developers will create. “In the last few months, we’ve seen 8th Wall developers create incredible experiences by combining various GenAI technologies with 8th Wall,” added Tom. “Now, we’ve introduced the first-ever GenAI Modules and sample projects using Inworld. These Modules provide developers with a hassle-free way to bring GenAI into WebAR experiences.”
Try out Inworld Studio today. For more advanced implementations, reach out to Inworld’s Partnership Team to discuss your needs.