A Return to the User

The channel that translates ideas into user value is lossy. The development processes of building, scaling, and evolving applications are parasitic: monsters that eat value before it reaches real users.

Because each process requires distinct tools (with their own internal development processes), there are compounding demands on effort, attention, and resources. With every step, focus is redirected away from user experience, more processes consume our efforts, and users end up with crumbs.

A bar chart demonstrating that building means turning ideas into code (~20% of effort), scaling means making that code work reliably for millions (~20% of effort), and evolving means continuously updating to meet changing expectations and new technologies (~60% of effort, compounding over time)

Today, the monsters of building, scaling and evolving lie between builders’ ideas and user experience.

Build cycles are rapidly shrinking: copilots turn ideas into code in minutes, and business processes are continuously refined and automated. The same AI that is conquering the initial build of products and businesses will next conquer scale. And when that happens, applications will organically grow with demand and adapt to new contexts and technical advancements.

Today, building takes months despite AI assistance. You scale to thousands of users and everything breaks. Hidden cloud provider rate limits crash the app under inference load. The models you need cost cents per interaction; cheaper models can't maintain context or personality. When new AI models are released, you spend weeks migrating prompts and retuning responses. Other bugs accumulate: users see infinite loading screens and restart constantly. Evolution means chasing the latest models and compatibility fixes. You're not sleeping, yet you can't realize a fraction of your original idea, let alone explore new ideas, while you watch your users end up with an unreliable app and another generic chatbot.

An illustration demonstrating the "monsters" of "building", "scaling" and "evolving" that lie between an idea and the ideal user experience

The monsters of building, scaling and evolving consume builders’ ideas and leave only remnants of potential value for users.

Tomorrow, we return to the user. 

Tomorrow, you specify your idea and define success metrics and constraints. The system constructs and deploys itself. The intelligent runtime handles everything from there. Infrastructure scales with demand, from tens to millions of users. Costs optimize automatically. Models and prompts are auto-configured. Bugs are identified and fixed. When new devices appear, backends update to support local inference. As better models are released, they're seamlessly deployed when experiments show improved outcomes. True evolution through cumulative usage and experience. Users feel like the application knows them. Personalized features emerge from user patterns and improve daily. Every interaction makes the system smarter. Every key decision and insight is shared with builders to inform their understanding. Builders stay focused on investigating new features, kicking off another build cycle and watching the application evolve.
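
To make the builder's half of that picture concrete, here is one way specifying an idea, success metrics, and constraints could look. This is a minimal sketch; the names and shape are illustrative assumptions, not an actual Inworld API.

```typescript
// Hypothetical builder specification. The builder declares intent, success
// metrics, and constraints; the runtime owns everything below this line.
// All names here are illustrative assumptions, not a published interface.
interface MetricTarget {
  name: string;                       // e.g. "day-7 retention", "p95 latency (ms)"
  target: number;
  direction: "maximize" | "minimize";
}

interface Constraints {
  maxCostPerInteractionUsd: number;   // cost ceiling per user interaction
  maxP95LatencyMs: number;            // responsiveness floor
  dataResidency?: string[];           // e.g. ["eu"] if user data must stay in-region
}

interface ApplicationSpec {
  idea: string;                       // natural-language description of the product
  successMetrics: MetricTarget[];     // what "better" means, so the runtime can optimize
  constraints: Constraints;           // hard boundaries the runtime must never cross
}

// Example: a language-learning companion defined purely by outcomes.
const spec: ApplicationSpec = {
  idea: "A conversational language tutor that adapts to each learner",
  successMetrics: [
    { name: "day-7 retention", target: 0.4, direction: "maximize" },
    { name: "p95 latency (ms)", target: 800, direction: "minimize" },
  ],
  constraints: {
    maxCostPerInteractionUsd: 0.02,
    maxP95LatencyMs: 1200,
    dataResidency: ["eu"],
  },
};
```

Everything beneath a declaration like this (infrastructure, model selection, cost tuning) would be owned by the runtime rather than by hand-written glue.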

An illustration demonstrating how living applications will return user experience to the center of their evolution

Tomorrow, living applications will return user experience to the center of applications’ evolution.

As we grapple with an acceleration of technological and social change, it feels important that the systems we develop implicitly grow to meet the breadth of human experience, rather than forcing a collapse to the mean. When software evolves through use, it becomes as diverse as humanity itself. Your needs matter, your context counts, your way of thinking shapes the experience. Your patterns become features. Your struggles simplify interfaces. Your repeated mistakes show where the interface fails. Your creative workarounds become tomorrow's shortcuts. Your innovations spread to others.

To make this tomorrow possible, Inworld is building an intelligent runtime: the layer that transforms static software into living systems. The last monster, maintaining and evolving, remains to be conquered. Our answer: living applications, software that evolves itself. Builders provide the DNA; user experience drives the evolution. Every interaction (a frustrated click, a delighted swipe, a confused pause) feeds the application's intelligence, and it evolves to better serve its users. You give it purpose: defining success, setting boundaries, encoding what matters most. What emerges is living software that belongs to everyone who touches it.

Scaling in an accessible ecosystem.

Much of our work at Inworld has been solving scale for AI applications while maintaining user experience. We've successfully helped teams of five scale to millions of users, partly enabled by an increasingly accessible ecosystem of components. The current cost dynamics of AI-enabled applications bias builders towards conformity. The vicious cycle is that to afford scale, you need volume; to get volume, you need broad appeal; to achieve broad appeal, you eliminate functionality that doesn't work for everyone. However, we are seeing the ecosystem make the right moves.

Providers are increasingly embracing flexibility and experimentation, and producing more efficient components that make costs accessible and allow real-time performance and local inference. Open weights and training code enable moving away from generic defaults and lock-step updates with model releases. Backend interoperability is emerging, unlocking flexible hardware-software combinations. New chip providers and inference stacks enable new cloud options, while more providers offer cost-effective solutions for real-time and local inference. If the ecosystem continues to emphasize accessibility, it will allow us to find more viable configurations to support unique user experiences at massive scale. Inworld will also do its part to democratize the necessary components.

Solving how applications evolve.

Living applications require infrastructure that supports continuous evolution based on users' real-time and long-term experience. Inworld exists to ensure consumer AI applications organically evolve to better serve their users. We've recognized that solving evolution requires:

An understanding of users and their context, in real time. Applications must capture and process every signal to truly understand users, their present context, and their constraints. They need to capture raw signals in every format, clean the data, run evaluations, transform streams into actionable insights, and maintain long-term context across preferences, progress, and patterns.
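
As one illustration of that pipeline, the sketch below folds raw interaction signals into a durable user context. The types, field names, and thresholds are assumptions made for the example, not a published interface.

```typescript
// Hypothetical shape of the signal pipeline: raw interaction events are
// normalized and folded into a long-lived user context.
type RawSignal =
  | { kind: "click"; target: string; at: number }
  | { kind: "utterance"; text: string; at: number }
  | { kind: "pause"; durationMs: number; at: number };

interface UserContext {
  preferences: Record<string, string>; // stable likes and dislikes
  progress: Record<string, number>;    // e.g. lesson level, streak length
  patterns: string[];                  // recurring behaviors worth acting on
}

// Fold one cleaned signal into the durable context. A real system would also
// run evaluations and stream processing; this only shows the reduction step.
function updateContext(ctx: UserContext, signal: RawSignal): UserContext {
  switch (signal.kind) {
    case "pause":
      // Long pauses after a prompt often indicate confusion worth tracking.
      return signal.durationMs > 5000
        ? { ...ctx, patterns: [...ctx.patterns, "hesitates-on-open-questions"] }
        : ctx;
    case "utterance":
      return {
        ...ctx,
        progress: { ...ctx.progress, turns: (ctx.progress.turns ?? 0) + 1 },
      };
    case "click":
      return ctx;
  }
}
```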

The ability to make changes to any part of the application configuration, at any time. Living applications require instant adaptation without human intervention. Nothing remains static. Every component must be learnable and changeable in near real time. Resources and model routes are allocated dynamically based on load, and every change is possible without a deployment.
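
A small sketch of what "changeable without deployments" could mean in practice: routing decisions expressed as plain data that the runtime can swap at any moment. Again, the names and values are illustrative assumptions, not an existing API.

```typescript
// Hypothetical routing table that can be replaced at runtime. Because routes
// are plain data, changing a model, provider, or budget is a configuration
// update, not a redeployment.
interface ModelRoute {
  model: string;             // e.g. "small-fast" or "large-accurate"
  provider: string;          // inference backend currently serving this route
  maxCostPerCallUsd: number;
}

interface RoutingPolicy {
  routes: Record<string, ModelRoute>; // keyed by task, e.g. "dialogue", "summarize"
  fallbackOrder: string[];            // tried in order when a route is overloaded
}

// Pick a route for a task given current load; degrade gracefully under pressure.
function pickRoute(policy: RoutingPolicy, task: string, loadFactor: number): ModelRoute {
  const primary = policy.routes[task];
  if (primary && loadFactor < 0.8) return primary;
  for (const key of policy.fallbackOrder) {
    const candidate = policy.routes[key];
    if (candidate) return candidate;  // first available fallback under heavy load
  }
  throw new Error(`no route available for task: ${task}`);
}
```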

An embedded intelligence that autonomously configures all logic and components to optimize for user experience. The runtime must self-evolve models and logic to improve user experience. Systems need to understand operational constraints and improvement paths, recognize and fix bugs and errors as they arise, and surface critical information and decisions to builders so they can refine their understanding and guide major changes. The app you use tomorrow is more engaging than the one you use today.
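
One hypothetical piece of that embedded intelligence is the promotion check for model experiments: a candidate replaces the current model only when measured outcomes improve and constraints still hold. The metric names and thresholds below are assumptions for illustration, not Inworld's actual decision logic.

```typescript
// Hypothetical promotion check for the experiment loop described above.
interface VariantResult {
  model: string;
  successMetric: number;        // e.g. task completion rate observed in the experiment
  p95LatencyMs: number;
  costPerInteractionUsd: number;
}

function shouldPromote(
  current: VariantResult,
  candidate: VariantResult,
  constraints: { maxP95LatencyMs: number; maxCostPerInteractionUsd: number },
): boolean {
  const withinConstraints =
    candidate.p95LatencyMs <= constraints.maxP95LatencyMs &&
    candidate.costPerInteractionUsd <= constraints.maxCostPerInteractionUsd;
  // Require a meaningful lift, not noise, before swapping models in production.
  const meaningfulLift = candidate.successMetric > current.successMetric * 1.02;
  return withinConstraints && meaningfulLift;
}
```

Gating promotion on both outcome lift and builder-defined constraints is what keeps "better models are seamlessly deployed" from silently trading user experience for cost or latency.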

As a community of builders, we set out to solve real human needs, and directed our attention to constructing systems that create and manage applications. Somewhere along the way, the intent got swallowed by the systems. Building is being solved and scaling is becoming accessible. If we now solve evolution, we will not only fix the channel between builders and users, but also enhance the fidelity of users' voices.

Kylan