March 21, 2025

The Evolution of UX Research: Digital Twins and the Future of User Insight

Introduction

User Experience (UX) research has always been about people. How they think, how they behave, what they need, and—just as importantly—what they don’t yet realise they need. Traditional UX methodologies have long relied on direct human input: interviews, usability testing, surveys, and behavioral observation. The assumption was clear—if you want to understand people, you have to engage with real humans.

But in 2025, that assumption is being challenged.

The emergence of digital twins and synthetic users—AI-powered simulations of human behavior—is changing how researchers approach user insights. These technologies claim to solve persistent UX research problems: slow participant recruitment, small sample sizes, high costs, and research timelines that struggle to keep pace with product development. The promise is enticing: instantly accessible, infinitely scalable users who can test, interact, and generate feedback without the logistical headaches of working with real participants.

Yet, as with any new technology, there are trade-offs. While digital twins may unlock efficiencies, they also raise important questions: Can they truly replicate human complexity? Where do they fit within existing research practices? What risks do they introduce?

This article explores the evolving role of digital twins in UX research—where they excel, where they fall short, and what their rise means for the future of human-centered design.

The Traditional UX Research Model: Why Change?

For decades, UX research has been grounded in methodologies that involve direct human participation. The core methods—usability testing, user interviews, ethnographic research, and behavioral analytics—have been refined to account for the unpredictability of human nature.

This approach works well, but it has challenges:

  1. Participant recruitment is time-consuming. Finding the right users—especially niche audiences—can be a logistical hurdle, often requiring specialised panels, incentives, and scheduling gymnastics.
  2. Research is expensive. Incentives, moderation, analysis, and recruitment all add to the cost. A single usability study can run into tens of thousands of dollars.
  3. Small sample sizes create risk. Budget and timeline constraints often mean testing with small groups, leaving room for blind spots and bias.
  4. Long feedback loops slow decision-making. By the time research is completed, product teams may have already moved on, limiting its impact.

In short: traditional UX research provides depth and authenticity, but it’s not always fast or scalable.

Digital twins and synthetic users aim to change that.

What Are Digital Twins and Synthetic Users?

While the terms digital twins and synthetic users are sometimes used interchangeably, they are distinct concepts.

Digital Twins: Simulating Real-World Behavior

A digital twin is a data-driven virtual representation of a real-world entity. Originally developed for industrial applications, digital twins replicate machines, environments, and human behavior in a digital space. They can be updated in real time using live data, allowing organisations to analyse scenarios, predict outcomes, and optimise performance.

In UX research, human digital twins attempt to replicate real users' behavioral patterns, decision-making processes, and interactions. They draw on existing datasets to mirror real-world users dynamically, adapting based on real-time inputs.

Synthetic Users: AI-Generated Research Participants

While a digital twin is a mirror of a real entity, a synthetic user is a fabricated research participant—a simulation that mimics human decision-making, behaviors, and responses. These AI-generated personas can be used in research scenarios to interact with products, answer questions, and simulate user journeys.

Unlike traditional user personas (which are static profiles based on aggregated research), synthetic users are interactive and capable of generating dynamic feedback. They aren’t modeled after a specific real-world person, but rather on a combination of user behaviors drawn from large datasets.

Think of it this way:

  • A digital twin is a highly detailed, data-driven clone of a specific person, customer segment, or process.
  • A synthetic user is a fictional but realistic simulation of a potential user, generated based on behavioral patterns and demographic characteristics.

Both approaches are still evolving, but their potential applications in UX research are already taking shape.

Where Digital Twins and Synthetic Users Fit into UX Research

The appeal of AI-generated users is undeniable. They can:

  • Scale instantly – Test designs with thousands of simulated users, rather than just a handful of real participants.
  • Eliminate recruitment bottlenecks – No need to chase down participants or schedule interviews.
  • Reduce costs – No incentives, no travel, no last-minute no-shows.
  • Enable rapid iteration – Get user insights in real time and adjust designs on the fly.
  • Generate insights on sensitive topics – Synthetic users can explore scenarios that real participants might find too personal or intrusive.

These capabilities make digital twins particularly useful for:

  • Early-stage concept validation – Rapidly test ideas before committing to development.
  • Edge case identification – Run simulations to explore rare but critical user scenarios.
  • Pre-testing before live usability sessions – Identify glaring issues before investing in human research.

However, digital twins and synthetic users are not a replacement for human research. Their effectiveness is limited in areas where emotional, cultural, and contextual factors play a major role.

The Risks and Limitations of AI-Driven UX Research

For all their promise, digital twins and synthetic users introduce new challenges.

  1. They lack genuine emotional responses.
    AI can analyse sentiment, but it doesn’t feel frustration, delight, or confusion the way a human does. UX is often about unexpected moments—the frustrations, workarounds, and “aha” realisations that define real-world use.
  2. Bias is a real problem.
    AI models are trained on existing datasets, meaning they inherit and amplify biases in those datasets. If synthetic users are based on an incomplete or non-diverse dataset, the research insights they generate will be skewed.
  3. They struggle with novelty.
    Humans are unpredictable. They find unexpected uses for products, misunderstand instructions, and behave irrationally. AI models, no matter how advanced, can only predict behavior based on past patterns—not the unexpected ways real users might engage with a product.
  4. They require careful validation.
    How do we know that insights from digital twins align with real-world user behavior? Without rigorous validation against human data, there’s a risk of over-reliance on synthetic feedback that doesn’t reflect reality.
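One way to make this validation concrete is to compare synthetic-user results against a small human baseline on the same tasks. The sketch below is purely illustrative: the task names, success rates, and the 15% divergence threshold are all invented, and a real validation would use your own study data and a proper statistical test.

```python
# Hypothetical sketch: sanity-checking synthetic-user feedback against a
# small human baseline. All figures are invented for illustration.

def max_divergence(human: dict, synthetic: dict) -> tuple[str, float]:
    """Return the task with the largest gap between human and
    synthetic success rates, and the size of that gap."""
    gaps = {task: abs(human[task] - synthetic[task]) for task in human}
    worst = max(gaps, key=gaps.get)
    return worst, gaps[worst]

# Fraction of participants completing each task (hypothetical data).
human_rates = {"checkout": 0.62, "search": 0.88, "account_setup": 0.45}
synthetic_rates = {"checkout": 0.91, "search": 0.85, "account_setup": 0.45}

task, gap = max_divergence(human_rates, synthetic_rates)
if gap > 0.15:  # arbitrary threshold for this sketch
    print(f"Investigate '{task}': synthetic users diverge by {gap:.0%}")
```

Where synthetic and human results diverge sharply (here, the hypothetical checkout task), that is exactly where synthetic feedback should not be trusted without human follow-up.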

A Hybrid Future: AI + Human UX Research

Rather than viewing digital twins as a replacement for human research, the best UX teams will integrate them as a complementary tool.

Where AI Can Lead:

  • Large-scale pattern identification
  • Early-stage usability evaluations
  • Speeding up research cycles
  • Automating repetitive testing

Where Humans Remain Essential:

  • Understanding emotion, frustration, and delight
  • Detecting unexpected behaviors
  • Validating insights with real-world context
  • Ethical considerations and cultural nuance

The future of UX research is not about choosing between AI and human research—it’s about blending the strengths of both.

Final Thoughts: Proceeding With Caution and Curiosity

Digital twins and synthetic users are exciting, but they are not a magic bullet. They cannot fully replace human users, and relying on them exclusively could lead to false confidence in flawed insights.

Instead, UX researchers should view these technologies as powerful, but imperfect tools—best used in combination with traditional research methods.

As with any new technology, thoughtful implementation is key. The real opportunity lies in designing research methodologies that harness the speed and scale of AI without losing the depth, nuance, and humanity that make UX research truly valuable.

The challenge ahead isn’t about choosing between human or synthetic research. It’s about finding the right balance—one that keeps user experience truly human-centered, even in an AI-driven world.

This article was researched with the help of Perplexity.ai. 

Author: Optimal Workshop

Related articles

5 ways to measure UX return on investment

Return on investment (ROI) is often the term on everyone’s lips when starting a big project, or even when reviewing a website. It’s especially popular with those who hold the purse strings. As UX researchers, we need to consider the ROI of the work we do and understand how to measure it.

We’ve lined up 5 key ways to measure ROI for UX research to help you get the conversation underway with stakeholders so you can show real and tangible benefits to your organization. 

1. Meet and exceed user expectations

Put simply, a product that meets and exceeds user expectations leads to increased revenue. When potential buyers can easily find and purchase what they’re looking for, they’ll complete their purchase and are far more likely to come back. The simple fact that users can finish their task increases sales and improves overall customer satisfaction, which in turn influences loyalty. Repeat business means repeat sales, which means increased revenue.

Creating, developing and maintaining a usable website is more important than you might think. And this is measurable: tracking website performance before and after UX research shows how directly the changes you make based on that research move the numbers.

Measurable: review the website (product) performance prior to UX research and after changes have been made. The increase in clicks, completed tasks and/or baskets will tell the story.
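The before/after arithmetic described above can be sketched in a few lines. All figures here are hypothetical; plug in your own analytics and research costs.

```python
# Minimal sketch of before/after UX ROI arithmetic.
# All numbers are hypothetical placeholders, not benchmarks.

def ux_roi(revenue_before: float, revenue_after: float,
           research_cost: float) -> float:
    """ROI as a ratio: (revenue gain - research cost) / research cost."""
    gain = revenue_after - revenue_before
    return (gain - research_cost) / research_cost

# E.g. monthly revenue rose from $100k to $130k after a $10k study.
roi = ux_roi(100_000, 130_000, 10_000)
print(f"ROI: {roi:.0%}")  # a 200% return on the research spend
```

The same shape works for any before/after metric you benchmark (completed tasks, baskets, conversions), as long as you can attach a dollar value to the gain.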

2. Reduce development time

UX research done at the initial stages of a project can lead to a reduction in development time of 33% to 50%! Less time developing means reduced costs (people and overheads) and a faster time to market. What’s not to love?

Measurable: This one is a little trickier, as the time (and cost) savings happen up front, aiding speed to market before execution. Internal stakeholder research after the live date can help you understand how the project went.

3. Ongoing development costs

And the double hitter? A product built with the user in mind from the start reduces the need to rehash or revisit it as quickly, cutting ongoing costs. Early UX research helps detect errors early in the development process, and fixing an error after development can cost a company up to 100 times more than dealing with the same error beforehand.

Measurable: Again, because UX research has saved time and money up front, this one can be difficult to track. Depending on your organization and previous projects, you could conduct internal research to understand how the project compares and estimate the time and cost savings.

4. Meeting user requirements

Did you know that 70% of projects fail due to a lack of user acceptance? This is often because project managers fail to properly understand user requirements. Early UX research gives you insight into your users, so you only spend time developing the functions they actually want, saving time and reducing development costs. Make sure you confirm those requirements through iterative testing. As always: fail early, fail often. Robust testing up front means that, in the end, you’ll have a product that meets the needs of the user.

Measurable: Where is the product currently? How does it perform? Set a benchmark up front and review post UX research. The deliverables should make the ROI obvious.

5. Investing in UX research leads to an essential competitive advantage

UX research tells you exactly what your customers want, need and expect from you, giving you a competitive advantage over other companies in your market. But be aware that more and more companies are investing in UX while customers grow ever more demanding: their expectations keep rising, they don’t tolerate bad experiences, and going elsewhere is an easy decision to make.

Measurable: This one is murky, but no less important. Knowing, understanding and responding to competitors helps keep you in the lead, and helps you develop products that meet and exceed user expectations.

Wrap up

Showing the ROI of the work we do is an essential part of getting key stakeholders on board with our research. It can be challenging to talk the same language, but ultimately we all want the same outcome: a product that works well for our users and delivers additional revenue.

For some continued reading (or watching, in this case), Anna Bek, Product and Delivery Manager at Xplor, explored the same concept in “How to measure experience” during her UX New Zealand 2020 talk, where she shares her perspective on UX ROI.


Why Understanding Users Has Never Been Easier...or Harder

Product, design and research teams today are drowning in user data while starving for user understanding. Never before have teams had such access to user information: analytics dashboards, heatmaps, session recordings, survey responses, social media sentiment, support tickets, and endless behavioral data points. Yet despite this volume of data, teams consistently build features users don't want and miss needs hiding in plain sight.

It’s a true paradox for product, design and research teams: more information has made genuine understanding more elusive. 

With all this data, teams feel informed. They can say with confidence: "Users spend 3.2 minutes on this page," "42% abandon at this step," "Power users click here." But what the data doesn't tell you is why.

The Difference between Data and Insight

Data tells you what happened. Understanding tells you why it matters.

Here’s a good example of this: Your analytics show that 60% of users abandon a new feature after first use. You know they're leaving. You can see where they click before they go. You have their demographic data and behavioral patterns.

But you don't know:

  • Were they confused or simply uninterested?
  • Did it solve their problem too slowly or not at all?
  • Would they return if one thing changed, or is the entire approach wrong?
  • Are they your target users or the wrong segment entirely?

One team sees "60% abandonment" and adds onboarding tooltips. Another talks to users and discovers the feature solves the wrong problem entirely. Same data, completely different understanding.

Modern tools make it dangerously easy to mistake observation for comprehension, but some aspects of user experience exist beyond measurement:

  • Emotional context, like the frustration of trying to complete a task while handling a crying baby, or the anxiety of making a financial decision without confidence.
  • The unspoken needs of users which can only be demonstrated through real interactions. Users develop workarounds without reporting bugs. They live with friction because they don't know better solutions exist.
  • Cultural nuances that numbers don't capture, like how language choice resonates differently across cultures, or how trust signals vary by context.
  • What users would do if you solved their problems differently. Data shows what users do within your current product; it doesn't reveal the new opportunities beyond it.

Why Human Empathy is More Important than Ever 

The teams building truly user-centered products haven't abandoned data; they've learned to combine quantitative and qualitative insights.

  • Combine analytics (what happens), user interviews (why it happens), and observation (context in which it happens).
  • Understanding builds over time. A single study provides a snapshot; continuous engagement reveals the movie.
  • Use data to form theories, research to validate them, and real-world live testing to confirm understanding.
  • Different team members see different aspects. Engineers notice system issues, designers spot usability gaps, PMs identify market fit, researchers uncover needs.

Adding AI into the mix also emphasizes the need for human validation. While AI can help significantly speed up workflows and can augment human expertise, it still requires oversight and review from real people. 

AI can spot trends humans miss, processing millions of data points instantly, but it can't understand human emotion, cultural context, or unspoken needs. It can summarize what users say, but humans must interpret what they mean.

Understanding users has never been easier from a data perspective. We have tools our predecessors could only dream of. But understanding users has never been harder from an empathy perspective. The sheer volume of data available to us creates an illusion of knowledge that's more dangerous than ignorance.

The teams succeeding aren't choosing between data and empathy, they're investing equally in both. They use analytics to spot patterns and conversations to understand meaning. They measure behavior and observe context. They quantify outcomes and qualify experiences.

Because at the end of the day, you can track every click, measure every metric, and analyze every behavior, but until you understand why, you're just collecting data, not creating understanding.


Radical Collaboration: how teamwork really can make the dream work

Natalie and Lulu have forged a unique team culture that focuses on positive outputs (and outcomes) for their app’s growing user base. In doing so, they turned the traditional design approach on its head and created a dynamic and supportive team. 

Natalie, Director of Design at Hatch, and Lulu, UX Design Specialist, recently spoke at UX New Zealand, the leading UX and IA conference in New Zealand hosted by Optimal Workshop, on their concept of “radical collaboration”.

In their talk, Nat and Lulu share their experience of growing a small app into a big player in the finance sector, and their unique approach to teamwork and culture which helped achieve it.

Background on Natalie Ferguson and Lulu Pachuau

Over the last two decades, Lulu and Nat have delivered exceptional customer experiences for too many organizations to count. After Nat co-founded Hatch, she begged Lulu to join her on their audacious mission: To supercharge wealth building in NZ. Together, they created a design and product culture that inspired 180,000 Kiwi investors to join in just 4 years.

Contact Details:

Email: natalie@sixfold.co.nz

LinkedIn: https://www.linkedin.com/in/natalieferguson/ and https://www.linkedin.com/in/lulupach/

Radical Collaboration - How teamwork makes the dream work 💪💪💪

Nat and Lulu discuss how they nurtured a team culture of “radical collaboration” when growing the hugely popular app Hatch, based in New Zealand. Hatch allows everyday New Zealanders to quickly and easily trade in the U.S. share market. 

The beginning of the COVID pandemic spelled huge growth for Hatch and caused significant design challenges for the product. This growth meant that the app had to grow from a baby startup to one that could operate at scale - virtually overnight. 

In navigating this challenge, Nat and Lulu coined the term radical collaboration, which aims to “dismantle organizational walls and supercharge what teams achieve”. Radical collaboration has six key pillars, which they discuss alongside their experience at Hatch.

Pillar #1: When you live and breathe your North Star

Listening to hundreds of their customers’ stories, combined with their own personal experiences with money, compelled Lulu and Nat to change how their users view money. And so, “Grow the wealth of New Zealanders” became a powerful mission statement, or North Star, for Hatch. The mission was to give people the confidence and the ability to live their own lives with financial freedom and control. Nat and Lulu express the importance of truly believing in the mission of your product, and how this can become a guiding light for any team. 

Pillar #2: When you trust each other so much, you’re happy to give up control

As Hatch grew rapidly, trusting each other became more and more important. Nat and Lulu state that sometimes you need to take a step back and stop fueling growth for growth’s sake. It was at this point that Nat asked Lulu to join the team, and Nat’s first request was for Lulu to be super critical about the product design to date - no feedback was out of bounds. Letting go, feeling uncomfortable, and trusting your team can be difficult, but sometimes it’s what you need in order to drag yourself out of status quo design. This resulted in a brief hiatus from frantic delivery to take stock and reprioritize what was important - something that can be difficult without heavy doses of trust!

Pillar #3: When everyone wears all the hats

During their journey, the team at Hatch heard lots of stories from their users. Many of these stories were heard during “Hatcheversery Calls”, where team members would call users on their sign-up anniversary to chat about their experience with the app. Some of these calls were inspiring, insightful, and heartwarming.

Everyone at Hatch made these calls – designers, writers, customer support, engineers, and even the CEO. Speaking to strangers this way was a challenge for some, especially since it was common to field technical questions about the business. Nevertheless, asking staff to wear many hats turned the entire team into researchers and analysts. By pushing themselves and their teammates outside their comfort zones, everyone was forced to see the whole picture of the business, not just their own little piece.

Pillar #4: When you do what’s right, not what’s glam

In an increasingly competitive industry, designers and developers are often tempted to consistently deliver new and exciting features. In response to rapid growth, rather than adding more features to the app, Lulu and Nat made a conscious effort to really listen to their customers to understand what problems they needed solving. 

As it turned out, filing overseas tax returns was a significant and common problem for their customers - it was difficult and expensive. So, the team at Hatch devised a tax solution. This solution was developed by the entire team, with almost no tax specialists involved until the very end! This process was far from glamorous and it often fell outside of standard job descriptions. However, the team eventually succeeded in simplifying a notoriously difficult process and saved their customers a massive headache.

Pillar #5: When you own the outcome, not your output

Over time Hatch’s user base changed from being primarily confident, seasoned investors, to being first-time investors. This new user group was typically scared of investing and often felt that it was only a thing wealthy people did.

At this point, Hatch felt it was necessary to take a step back from delivering updates to take stock of their new position. This meant deeply understanding their customers’ journey from signing up, to making their first trade. Once this was intimately understood, the team delivered a comprehensive onboarding process which increased the sign-up conversion rate by 10%!

Pillar #6: When you’re relentlessly committed to making it work

Nat and Lulu describe a moment when Allbirds wanted to work with Hatch to let ordinary New Zealanders take part in its IPO launch on the New York Stock Exchange. The task faced numerous tax and trade law challenges, and offering the service seemed like yet another insurmountable task. The team at Hatch nearly gave up several times during the project, but everyone was determined to get the feature across the line – and they did. As a result, New Zealanders were among the few regular investors outside the U.S. able to take part in the Allbirds IPO.

Why it matters 💥

Over four years, Hatch grew to 180,000 users who collectively invested over $1bn. Nat and Lulu’s success underscores the critical role of teamwork and collaboration in achieving exceptional user experiences. Product teams should remember that in the rapidly evolving tech industry, it's not just about delivering the latest features; it's about fostering a positive and supportive team culture that buys into the bigger picture.

The Hatch team grew to be more than team members and technical experts. They grew in confidence and appreciated every moving part of the business. Product teams can draw inspiration from Hatch's journey, where designers, writers, engineers, and even the CEO actively engaged with users, challenged traditional design decisions, and prioritized solving actual user problems. This approach led to better, more user-centric outcomes and a deep understanding of the end-to-end user experience.

Most importantly, through the good times and tough, the team grew to trust each other. The mission weaved its way through each member of the team, which ultimately manifested in positive outcomes for the user and the business.

Nat and Lulu’s concept of radical collaboration led to several positive outcomes for Hatch:

  • It changed the way they did business. Information was no longer held in the minds of a few individuals – instead, it was shared. People were able to step into other people's roles seamlessly. 
  • Hatch achieved better results faster by focusing on the end-to-end experience of the app, rather than by adding successive features. 
  • The team became more nimble – potential design/development issues were anticipated earlier because everyone knew what the downstream impacts of a decision would be.

Over the next week, Lulu and Nat encourage designers and researchers to get outside of their comfort zone and:

  • Visit the customer support team
  • Pick up the phone and call a customer
  • Challenge status quo design decisions. Ask, does this thing solve an end-user problem?
