How Inworld helped an AI game with 20 million players reach profitability

After losing money with other AI services, Death by AI switched to Inworld to scale

Genre: Social multiplayer
Platforms: Web-based, Discord
Release: May 2024
Players: 20 million
Gameplay: 3 million hours

Overview

Three days into the launch of their new game on Discord, Playroom CEO Tabish Ahmed knew they had a hit on their hands. Death by AI, their social survival native AI game, was seeing 700,000 daily users. Volume at that scale was beyond the company’s expectations. And that was just the beginning. 

In the first month, Death by AI had reached 10 million players. Three months later, that number had ballooned to 20 million players, with 3 million hours of gameplay and retention metrics that would make any studio jealous. Launching such a successful game should have been a cause for celebration – but instead, the scale of Death by AI’s success on Discord created significant challenges for Playroom.

A startup with a team of just six employees, Playroom (an a16z Speedrun portfolio company) had been around for less than a year and was focused on building a social multiplayer publishing platform when they launched Death by AI to take advantage of the popularity of casual multiplayer and native AI games. Death by AI came out of the team’s desire to create a new kind of game by leveraging AI to build more interactive game loops. They suspected that more interactivity in casual games would increase player immersion and maximize replayability.

With tens of millions of users, their hunch certainly proved correct – but it also saw them consume hundreds of millions of AI tokens in the first week. Just three days after launch, Tabish realized that using OpenAI and ElevenLabs, the services the game launched with, wasn’t financially sustainable given its unexpected popularity and replayability. What’s more, they were experiencing significant issues with AI latency and localization quality that were causing players in certain regions to churn. They needed to switch their AI models fast.

Luckily, they reached out to Inworld for help.

The challenge for native AI games

What does 10 million users translate to in AI usage for an AI-native, narrative-based game? For Playroom, it added up to a whopping 1.2 billion tokens in the first month using OpenAI’s GPT-3.5 and GPT-4 large language models (LLMs) and ElevenLabs’ text-to-speech (TTS) model.

Unlike games that use AI to supplement fixed storylines or add more interactivity to particular game mechanics, AI powers Death by AI’s entire core game loop. The native AI game is driven by an AI Game Master named Bob who gives players a perilous situation to escape from. Players then tell the AI how they plan to escape and the AI judges the player’s fate. 
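To make those token costs concrete, here is a minimal sketch of what an LLM-driven round like this can look like, using the OpenAI chat completions SDK the team launched with. The prompts, function names, and model choice are illustrative assumptions, not Playroom’s actual implementation.

```typescript
// Illustrative sketch of an LLM-driven "game master" round (not Playroom's code).
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// The game master invents a fresh perilous scenario at the start of each round.
async function generateScenario(): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: "You are Bob, a darkly funny AI game master." },
      { role: "user", content: "Invent a short, perilous scenario the players must escape." },
    ],
  });
  return completion.choices[0].message.content ?? "";
}

// Each player's free-text escape plan is judged by the model, so every round
// produces new, uncacheable generations - which is where the token costs add up.
async function judgeFate(scenario: string, plan: string): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: "You are Bob, the AI game master. Judge whether the plan survives, with humor." },
      { role: "user", content: `Scenario: ${scenario}\nPlayer's escape plan: ${plan}\nDid they survive?` },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```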

“Every round there is so much of the story being generated, both on the voice side of it and the text side so there is no opportunity for caching to save on costs,” Tabish explains. “It’s also the fact that the outcomes are so fresh that makes people keep coming back and playing. They don’t expect what’s going to happen.”

While one way to consume fewer tokens would be for Death by AI to tweak their experience to have fewer interactions generated in real-time, diluting the interactivity that made the native AI game successful would fundamentally change the gameplay. “It’s a short, fast game loop that amplifies and incorporates players’ creativity. It’s not just AI doing the work,” explained Tabish. “What’s exciting with this game is you’re using your ideas, and adding your sense of humor. That makes it really compelling to play.” 

So, Tabish had to find another way to make Death by AI sustainable. “After three days we were freaking out. I started asking my team how they can save costs on this while not reducing the quality,” he explained. “Over the weekend, we tried a number of ways to reduce costs and we quickly found that not every AI service is ready to scale. Not every service has the infrastructure to support it. A lot of our time was spent trying to talk to the infrastructure teams at these companies to get the right resources so we didn’t crash.”

As they explored alternatives, Tabish was concerned about Playroom’s future. “It took us a week and a half to explore different options and I kept looking at our bank account and thinking we’re a small team,” he said. “We had to make sure we didn’t burn through our runway.”

Playroom needed a solution that met their needs in four critical areas: 

  • Cost
  • Latency and scalability
  • Model quality
  • Localization 

But every service they tested struggled with at least one of those things. That is, until they tried Inworld.

Watch our webinar with Playroom to learn more about their journey.

Buy vs. Build

Before Playroom found their way to Inworld, they considered every other option – including hosting their own models when commercial solutions struggled to keep up with their traffic. “We even tried different services where we could have a Llama model hosted,” explained Tabish. “We tried other platforms but ran into issues with scaling.” At that point, he said, it became a build versus buy decision.

While Playroom had a core team of specialists, they had to weigh whether to invest time in doing the work themselves. “How much AI infrastructure should we build? Are we game developers building a game? Or are we here to build AI models?” he explained. Their small team was already stretched thin as they tested different services and worked manually to address the latency and scaling challenges the game’s growth was creating. They also needed specialized models for things like multilingual gameplay.

Ultimately, the team decided they wanted to focus on gameplay versus hacking the tech together and solving technical problems. “Playroom, as a company, started as an infrastructure company,” Tabish explained. “We needed to practice what we preach and use infrastructure that could help us move faster.”

Inworld was the perfect solution. Inworld’s comprehensive AI suite and its appeal as a singular partner for AI in games versus multiple solutions was one of the main selling points. Rather than having to set up the architecture across multiple APIs, Inworld had an out-of-the-box solution that was optimized for gaming and handled the orchestration between things like speech-to-text, a large language model, and a realistic text-to-speech model. Inworld also offered prompt support and AI gaming capabilities not available from other solutions. 
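For a sense of the orchestration this replaces, here is a rough sketch of the speech-to-text, LLM, and text-to-speech hand-off a small team would otherwise wire up and maintain across separate vendor APIs. The stubs and types are hypothetical placeholders, not Inworld’s API and not Playroom’s code.

```typescript
// Hypothetical DIY pipeline stitched across separate providers; each stub
// stands in for a different vendor API a team would have to integrate itself.
type AudioBuffer = Uint8Array;

async function speechToText(audio: AudioBuffer): Promise<string> {
  // e.g. call a hosted speech-to-text API here
  throw new Error("not implemented");
}

async function generateReply(playerText: string): Promise<string> {
  // e.g. call an LLM API with the game-master prompt here
  throw new Error("not implemented");
}

async function textToSpeech(reply: string): Promise<AudioBuffer> {
  // e.g. call a text-to-speech API here
  throw new Error("not implemented");
}

// Every hop adds latency, a separate rate limit, a separate failure mode, and a
// separate bill; that glue is what a single gaming-optimized provider removes.
async function handlePlayerTurn(audio: AudioBuffer): Promise<AudioBuffer> {
  const playerText = await speechToText(audio);
  const reply = await generateReply(playerText);
  return textToSpeech(reply);
}
```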

By working with Inworld, Playroom was able to integrate a Custom LLM Service API that immediately reduced deployment costs, gave them more control over model selection and performance, and offloaded the maintenance burden of ensuring that the game was optimized to manage their scale and traffic.  

Ultimately, Tabish’s key regret was that he didn’t use Inworld in the first place: “Because of Inworld’s high profile partnerships, I thought Inworld was primarily for AAA games. I was happy to find out that they could support our team.” 

Why Inworld?

Here are the key factors that convinced Playroom that Inworld was the right partner for them.

Cost

Cost was the biggest challenge that Playroom faced after their launch. “We needed to be able to offer this at scale and remain profitable,” Tabish said. “That meant getting per user cost down so we could become cash flow positive as soon as possible.”

Tabish is the first to admit that his team underestimated the costs of running Death by AI with OpenAI and ElevenLabs. “My team is obsessed with shipping fast. We don’t want to sit on an idea for a year and wait for it to go out,” he explained. “But that was our problem. We moved too fast and underestimated how much people would play this game – which meant we underestimated the costs.”

From Death by AI’s experience, Tabish feels strongly that the models studios use in early prototyping aren’t necessarily the models they should go into production with. “I think many people are exploring native AI games and are using OpenAI or ElevenLabs or different services for prototyping. But the pricing of many of these models doesn’t work for gaming,” he explained. “Their pricing makes sense for enterprise use cases where the interactions or generation needs are not significant or don’t need to be that fast. But in a gaming scenario, the pricing needs to be different.”

Inworld, in contrast, had a pricing model built for gaming, charging less than both the leading LLM competitor and the leading TTS competitor – while offering more features and better performance.

“We found a path to profitability with Inworld,” explained Tabish. “They helped us understand the base usage and keep costs per user low.”

Latency and scalability

Another challenge that Playroom faced before switching to Inworld was issues with outages and latency. 

“If you’re doing the scale we were doing, many of the services available today are likely to crash or experience significant latency,” explained Tabish. “Because those services will put your request in a queue since you don’t have a model dedicated to you.”

The latency was affecting the user experience. “In our game, there is a judging screen after a scenario is presented that tells players whether they survived,” Tabish said. “The copy on the waiting screen says, ‘Your fate has been sealed,’ and because it was taking so long to get the responses, it became a joke among players that, ‘Our fate has been sealed, but it will not be revealed.’”

When Playroom tried to fix it, they found they were overloading the services they were using. “If you looked behind the scenes, it was just a long queue of requests,” he explained. “We ended up spending a lot of time rebalancing and sending 20% of our traffic to different services. But that was taking a lot of our engineering time to maintain and balance our load for both our LLM and TTS APIs.” 
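As an illustration of the stopgap Tabish describes, here is a minimal sketch of weighted traffic splitting across multiple LLM or TTS endpoints. The provider names, URLs, and weights are hypothetical; Playroom’s actual routing logic isn’t public.

```typescript
// Hypothetical weighted router that spreads requests across providers to avoid
// overloading any single queue. Names, URLs, and weights are illustrative only.
interface ProviderRoute {
  name: string;
  weight: number; // fraction of traffic; weights should sum to 1.0
  send: (payload: unknown) => Promise<Response>;
}

// Pick a route at random, proportionally to its weight.
function pickRoute(routes: ProviderRoute[]): ProviderRoute {
  const r = Math.random();
  let cumulative = 0;
  for (const route of routes) {
    cumulative += route.weight;
    if (r < cumulative) return route;
  }
  return routes[routes.length - 1];
}

// Example: keep 80% of traffic on the primary provider and spill 20% to a
// backup, mirroring the kind of manual rebalancing the Playroom team describes.
const routes: ProviderRoute[] = [
  {
    name: "primary-llm",
    weight: 0.8,
    send: (p) => fetch("https://primary.example.com/v1/generate", { method: "POST", body: JSON.stringify(p) }),
  },
  {
    name: "backup-llm",
    weight: 0.2,
    send: (p) => fetch("https://backup.example.com/v1/generate", { method: "POST", body: JSON.stringify(p) }),
  },
];
```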

Tabish remembers feeling overwhelmed by the work involved. “I finally just asked – who does it all? Who can take all this work away from us? Who can work with us to maintain this?” he recalled. “That’s where Inworld came in and we worked together on all the use cases and scenarios to make sure there wouldn’t be latency or outages.” 

Inworld was able to improve Death by AI’s performance for two reasons. First, Inworld has some of the fastest models for gaming applications, with our TTS model delivering 38% lower latency than ElevenLabs. Second, Inworld works closely with clients to ensure we can always handle their load, flexibly scaling up resources and adjusting limits as needed. For example, Playroom’s traffic increased by 10 times over one weekend, and Inworld was able to scale our resources up and down to meet their needs.

Inworld can help Playroom deal with fluctuations in traffic

Model quality

With an already launched game, Playroom had to be careful that any changes to their models wouldn’t affect the quality of the experience. “The stakes were too high to choose the wrong partner. It was a game that was live already,” explained Tabish. “People had already tried the game and they liked it and had started forming a relationship with Bob, the Game Master. They thought he was funny.”

Initially, whenever Playroom tested a new model, they received immediate feedback from their players. “People in Discord would be asking if we changed something because it wasn’t funny anymore,” he shared. “We’d see that feedback within a couple minutes after we made the changes live. We were worried that transitioning to a lower quality model would impact our gameplay since the whole game loop is about AI making up funny stories.” What’s more, some models that Playroom tested when looking to switch from OpenAI had significant safety issues. 

For that reason, Tabish had a very high bar for Inworld. “I was brutal on Inworld in the beginning. I had expectations of no latency, localized languages, and a high quality LLM. But thankfully Inworld had all those things,” he recalled. “With Inworld we tested it and the quality and fun was still there but with better latency and lower cost.”

Localization

Another big challenge Playroom faced when evaluating models was finding one that could handle localization well. “We wanted to build something that could be played globally from day one,” Tabish explained. “Not all models can generate dialogue in different languages and, even if they can, the translated dialogue might not be good quality.”

Localization wasn’t working well with OpenAI models and that was causing churn in certain regions. “People were saying that it was funny in English but not funny in Chinese or Russian,” he explained. “I think one of the key issues when you launch a global game is making sure your AI models or service providers are actually going to work across the world. Not everyone does that well. That was a hard lesson for us.”

In comparison, Tabish has found Inworld’s Multilingual Support to be exactly what Playroom needs. Instead of relying on machine translation, Inworld focuses on AI models that can generate dialogue natively in a desired language because they have a broad enough dataset in that language. The result? Better quality dialogue – and funnier gameplay in all languages. 

Custom support with integration

While it wasn’t initially on Playroom’s shortlist of things the company was looking for from an AI partner, the hands-on support from Inworld’s team has greatly contributed to Playroom’s success. Unlike the hands-off experience of using an API from enterprise-focused companies like OpenAI or ElevenLabs, Inworld’s team took over considerable engineering, quality control, and model maintenance tasks from Playroom – all at a lower price point.  

“We worked really closely with the Inworld team on the integration,” Tabish explained. “A custom API was built by the Inworld team and from there it was fairly easy to connect to our game builder without any extra engineering on our side.”

Chris Covert, Inworld’s Director of Product Experiences, explained the process of working with Playroom. “We explored different models with them. We played with open source models, closed models, we discussed custom-trained models that were fine tuned on their data. We wanted to make sure that the quality, the reliability, and the performance of the solution was exactly right.”

Impact

Now that Death by AI is profitable and they have an AI partner who is able to handle the game's high traffic without constant maintenance work by the Playroom team, what’s next? 

The first thing on Playroom’s list is to add new game modes to Death by AI or create new native AI games that take advantage of some of Inworld’s gaming-native features, like long-term memory and the ability to build a relationship with a character over time.

“We came to Inworld with a very specific use case but we were excited to see they offer a whole suite of tools to create new kinds of games and gameplay,” Tabish explained. “We're now working on a new native AI game that’s in the same world as Death by AI but which has characters that learn from your interactions with the game and can adapt to them in real-time.”

Tabish loves how Inworld’s features are focused on adding more fun to games. “Inworld’s AI becomes smarter as the player plays the game. The character learns about the players throughout the game and then can make fun of them because of how they have been responding in the game so far,” he explained. “We’re experimenting with Inworld features to see how we can make our games even more fun. It’s allowing us to make more sophisticated games that are customized to the player.”

After his experience with other AI providers, Tabish wishes he’d reached out to Inworld much earlier. “If you’re thinking about creating an AI game and wondering if you should use Inworld or build your own service, I’d recommend having a conversation with the Inworld team sooner rather than later,” he said. “It’s important to plan for the potential scale of your game.”