
26 March 2025 • 6 minute read

AI-native experiences and the promise of magical personal fluidity

We’ve been working on a tremendously exciting AI-native digital product with the folks at MindMax Labs.

It’s been a fascinating and rewarding piece of work, not least because we’re discovering that AI-native experiences challenge some of the fundamental design patterns we’ve grown so accustomed to working with.

What’s been equally exciting is seeing those patterns come under scrutiny. So much of what arrives in the market today feels thoughtlessly churned out from a pattern library. This project has reminded us of the value of going back to first principles.

What do we mean by ‘AI-native experiences’?

By ‘AI-native’, we mean a digital product where artificial intelligence – typically an agent – is the foundational mode of interaction, as opposed to an add-on to existing functionality.

For MindMax, this takes the form of AI teachers and co-learners that transform the user experience from a linear process to a dynamic, conversational journey that feels very different to anything we’ve designed in the past.

At its best, we’ve seen this create what we’ve dubbed magical personal fluidity: moments when an AI seems to listen and adapt in real time to a user’s needs, context and even emotional state.

While I suspect we’ll become accustomed to these moments as AI becomes more familiar, right now they feel intuitive, delightful and, dare I say it, humane.

But to make these moments possible, we’re increasingly convinced that tried-and-tested approaches to interaction design will need reconsidering.

Six shifts

In our recent work, we’ve seen six shifts that challenge established patterns and have pushed us to rethink how digital experiences might behave.

1. Users learn the system → the system learns the user

Most software expects users to learn and adapt to its interface. In contrast, an AI-native system can learn the user – their tastes, goals and behaviours. Instead of relying on one-size-fits-all onboarding flows, AI agents can (see the sketch after this list):

  • Quickly understand the user’s intentions
  • Dynamically shape the experience
  • Anticipate needs to reduce friction and cognitive load
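
To make this shift concrete, here’s a minimal sketch, in Python, of what ‘the system learns the user’ might look like. Everything in it – the LearnerProfile fields, the inference rules, the step names – is our illustration, not MindMax’s actual implementation.

```python
# A sketch of 'the system learns the user': a profile the agent updates
# from observed interactions and then uses to shape what happens next.
# All names and rules here are illustrative, not a real implementation.
from dataclasses import dataclass, field


@dataclass
class LearnerProfile:
    goals: list[str] = field(default_factory=list)
    expertise: str = "unknown"  # inferred over time, e.g. "beginner" or "advanced"

    def observe(self, interaction: dict) -> None:
        """Update the profile from one interaction, not a one-off onboarding form."""
        if interaction.get("skipped_explanation"):
            self.expertise = "advanced"
        if goal := interaction.get("stated_goal"):
            self.goals.append(goal)


def next_step(profile: LearnerProfile) -> str:
    """Anticipate needs: shorten the path for users who have shown expertise."""
    if profile.expertise == "advanced":
        return "skip_intro_and_start_exercise"
    return "guided_walkthrough"


profile = LearnerProfile()
profile.observe({"skipped_explanation": True, "stated_goal": "learn statistics"})
print(next_step(profile))  # -> skip_intro_and_start_exercise
```

The inversion is the point: the profile is inferred from behaviour as it happens, rather than demanded up front.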

2. Static brand voice → dynamic personality

In traditional digital experiences, the aesthetic and tone of voice are typically derived from an underlying brand architecture and embedded into a design system. In AI-native experiences, it’s not just possible but desirable to give users control over these choices. In this world, users have the option to (sketched below):

  • Customise the style and tone of voice of their agent
  • Adjust the agent’s appearance
  • Co-create the brand experience
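
As a rough sketch of the idea – the helper and its settings are hypothetical – the user’s style choices can be rendered directly into the agent’s instructions rather than fixed by a brand guide:

```python
# A sketch of user-controlled personality: style choices made by the user
# are composed into the agent's system prompt. The settings and template
# are hypothetical, not MindMax's actual prompt.

def build_system_prompt(persona_name: str, tone: str, formality: str) -> str:
    """Compose the agent's instructions from user-chosen style settings."""
    return (
        f"You are {persona_name}, a learning companion. "
        f"Speak in a {tone} tone and keep your language {formality}. "
        "Adapt your examples to the learner's stated interests."
    )

# Two users, two very different 'brand experiences' from the same product:
print(build_system_prompt("Nova", "playful", "informal"))
print(build_system_prompt("Mentor", "calm", "precise"))
```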

3. Pre-determined flows → generative experiences

User journeys are usually defined in advance, guiding people through a fixed sequence of steps. While effective, pre-defined flows tend to mean that everyone follows a similar and often unnecessarily long journey, with only minor variations. Instead, AI can (illustrated below):

  • Generate tailored interaction paths on the fly
  • Adjust the level of detail to match the user’s preferences or expertise
  • Skip irrelevant steps to streamline the experience
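
Here’s a minimal sketch of the difference, with invented step names and relevance rules: the path is assembled per user rather than fixed in advance.

```python
# A sketch of a generative flow: instead of walking every user through the
# same fixed sequence, the agent assembles a path from the steps that are
# actually relevant. Step names and rules are illustrative only.

ALL_STEPS = ["welcome", "basics_refresher", "core_lesson", "worked_example", "quiz"]

def plan_flow(knows_basics: bool, wants_examples: bool) -> list[str]:
    """Generate a tailored path, skipping irrelevant steps."""
    steps = []
    for step in ALL_STEPS:
        if step == "basics_refresher" and knows_basics:
            continue  # skip what the user already knows
        if step == "worked_example" and not wants_examples:
            continue  # match the level of detail to their preference
        steps.append(step)
    return steps

print(plan_flow(knows_basics=True, wants_examples=False))
# -> ['welcome', 'core_lesson', 'quiz']
```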

4. Manual control → background intelligence

Conventional digital products tend to be session-based, meaning the experience pauses the moment the user stops engaging. When an AI agent is acting on your behalf, it can continue working in the background, shaping the experience even when you’re not there. Agents can (a sketch follows the list):

  • Proactively generate new content
  • Generate personalised recommendations and insights
  • Take anticipatory actions on your behalf
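
As a toy illustration – a real product would use a proper task queue or scheduler rather than a bare thread, and the content generation here is a placeholder – a background worker can keep preparing things between sessions:

```python
# A sketch of background intelligence: a worker that prepares content while
# the user is away, so something fresh is waiting when they return.
import threading
import time

pending_for_user: list[str] = []

def background_agent(interval_seconds: float) -> None:
    """Proactively generate recommendations between sessions (placeholder logic)."""
    while True:
        time.sleep(interval_seconds)
        pending_for_user.append("New practice set based on yesterday's quiz")

worker = threading.Thread(target=background_agent, args=(1.0,), daemon=True)
worker.start()

time.sleep(2.5)  # the user is away...
print(pending_for_user)  # ...and finds fresh content on their return
```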

5. Single modality → multi-modality

Advances in text-to-speech and voice recognition are opening new possibilities for audio-based interaction. For MindMax, this has allowed the introduction of conversational, Socratic learning dynamics, where students develop knowledge through natural dialogue. This approach allows learners to:

  • Engage actively through real-time questioning
  • Reflect and respond in their own words
  • Deepen understanding through guided conversation

6. Hardcoded logic → prompt-driven behaviour

In most software, the logic is fixed, baked into the codebase and invisible to users. In AI-native experiences, this logic doesn’t have to be static or hidden and can instead become part of the interface itself, made accessible through prompts. Rather than navigating a fixed system, users can now (sketched below):

  • Define how the AI interprets and responds
  • Adjust the behaviour of the agent
  • Modify reasoning on the fly
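
A simple sketch of the contrast, with hypothetical behaviour settings: the rules live in editable text rather than compiled code, so adjusting the agent’s behaviour is a matter of editing that text at runtime.

```python
# A sketch of prompt-driven behaviour: logic that would normally be
# hardcoded becomes an instruction the user can inspect and change.
# The settings and prompt wording are hypothetical.

behaviour = {
    "challenge_level": "gentle",
    "explain_reasoning": True,
}

def agent_instructions(settings: dict) -> str:
    """The agent's operating logic, expressed as editable prose rather than code."""
    lines = [f"Challenge the learner in a {settings['challenge_level']} way."]
    if settings["explain_reasoning"]:
        lines.append("Show your reasoning step by step before concluding.")
    return " ".join(lines)

print(agent_instructions(behaviour))

# The user adjusts the behaviour on the fly, with no redeploy needed:
behaviour["challenge_level"] = "rigorous"
print(agent_instructions(behaviour))
```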

The potential is here

Our recent work has convinced us that it’s naive to assume AI-native experiences can be accommodated within existing design patterns. Many of the conventions we’ve come to rely on, such as linear flows, static logic and top-down branding, begin to break down when interaction becomes dynamic, multi-modal and shaped in real time through prompting.

And yet, that’s exactly what makes this work so exciting. When it comes together, intelligent agents enable experiences that feel magical, personal and fluid, and hint at a potential new standard in user experience.

These shifts have far-reaching implications, but we’re especially excited by their potential in fields like healthcare, education and public services, where adaptability and empathy aren’t just nice-to-haves, they’re essential.

We’re keen to connect with others exploring these frontiers. If you’re working on, or even just thinking about, what AI-native experiences might make possible, we’d love to chat.
