June 6, 2024
4 min

Event Recap: Measuring the Value of UX Research at UXDX

Last week Optimal Workshop was delighted to sponsor UXDX USA 2024 in New York. The event brings together Product, Design, UX, CX, and Engineering professionals, and our team had an amazing time meeting customers, industry experts, and colleagues throughout the conference. This year, we also had the privilege of sharing some of our industry expertise by running an interactive forum on “Measuring the Value of UX Research” - a topic very close to our hearts.

Our forum, hosted by Optimal Workshop CEO Alex Burke and Product Lead Ella Fielding, was focused on exploring the value of User Experience Research (UXR) from both an industry-wide perspective and within the diverse ecosystem of individual companies and teams conducting this type of research today.

The session brought together a global mix of UX professionals for a rich discussion on measuring and demonstrating the effectiveness of UXR, and on the challenges facing organizations trying to tie it to tangible business value today.

The main topics for the discussion were:

  • Metrics that Matter: How do you measure UXR's impact on sales, customer satisfaction, and design influence?
  • Challenges & Strategies: What are the roadblocks to measuring UXR impact, and how can we overcome them?
  • Beyond ROI: UXR's value beyond financial metrics

Some of the key takeaways from our discussions during the session were: 

  1. The current state of UX maturity and value
    • Many UX teams don’t measure the impact of UXR on core business metrics; in the room, attendees who are not measuring the impact of their work outnumbered those who are.
    • Alex and Ella discussed with attendees the current state of UX research maturity and the ability to prove value across the organizations represented in the room. Most were still early in their UX research maturity, with only 5% considering themselves advanced in having research culturally embedded.
  2. Defining and proving the value of UX research
    • The industry doesn’t have clear alignment or understanding of what good measurement looks like. Many teams don’t know how to accurately measure UXR impact, or don’t have the tools or platforms to measure it - core roadblocks to proving UXR’s impact.
    • Alex and Ella discussed the challenges in defining and proving the value of UX research, with commonly cited benefits including getting closer to customers, innovating faster, de-risking product decisions, and saving time and money. However, the value of research is hard to quantify compared to other product metrics like lines of code or features shipped.
  3. Measuring and advocating for UX research
    • Where teams are measuring UXR today, there is a strong bias toward customer feedback, but little ability or understanding of how to measure impact on business metrics like revenue.
    • The most commonly used metrics for measuring UXR are quantitative and qualitative feedback from customers, as opposed to internal metrics like stakeholder involvement or tying UXR to business performance metrics (including financial performance).
    • Attendees felt that in organizations where research is more embedded, researchers spend significant time advocating for research and proving its value to stakeholders rather than just conducting studies. This included tactics like research repositories and pointing to the impact of past studies, as well as ongoing battles to shape decision-making processes.
    • One of our attendees highlighted that engaging stakeholders in defining key research metrics before running research was key for them in proving value internally.
  4. Relating user research to financial impact
    • Alex and Ella asked the audience whether anyone had examples of demonstrating the financial impact of research to justify investment in their team, and we heard some excellent examples proving that there are tangible ways to tie research outcomes to core business metrics, including the two below (a simple worked calculation follows this list):
      • Calculating time savings for employees from internal tools as a financial impact metric.
      • Measuring a reduction in calls to service desks as a way to quantify financial savings from research.
  5. Most attendees recognize the value in embedding UXR more deeply at all levels of their organization - but feel like they’re not succeeding at this today.
    • Most attendees feel that UXR is not fully embedded in their organization or culture, but that if it were, they would be more successful in proving its overall value.
    • Stakeholder buy-in and engagement with UXR, particularly from senior leadership, varied enormously across organizations, and wasn’t regularly measured as an indicator of UXR value.
    • In organizations where research was more successfully embedded, researchers had to spend significant time and effort building relationships with internal stakeholders before and after running studies. This took time and effort away from actual research, but ended up making the research more valuable to the business in the long run. 
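
To make the time-savings example above concrete, here is a minimal sketch of the kind of back-of-the-envelope calculation a team might run. The numbers and variable names are entirely hypothetical - they were not shared in the session - but the structure shows how a usability improvement to an internal tool can be expressed as an annual dollar figure.

```typescript
// Hypothetical illustration: estimating the annual financial impact of an
// internal-tool improvement identified through UX research.
const employees = 500;           // staff who use the internal tool (assumed)
const minutesSavedPerDay = 10;   // time saved per person per working day (assumed)
const workingDaysPerYear = 230;  // assumed working days per year
const loadedHourlyCost = 50;     // assumed fully loaded cost per hour, in USD

const hoursSavedPerYear =
  (employees * minutesSavedPerDay * workingDaysPerYear) / 60;
const annualSavings = hoursSavedPerYear * loadedHourlyCost;

console.log(`Hours saved per year: ${Math.round(hoursSavedPerYear).toLocaleString()}`);
console.log(`Estimated annual savings: $${Math.round(annualSavings).toLocaleString()}`);
// With these example inputs: roughly 19,167 hours and about $958,000 per year.
```

The same shape of calculation works for the service-desk example: multiply the reduction in call volume by the average cost of handling a call.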

With the large range of UX maturity and the democratization of research across teams, we know there’s a lot of opportunity for our customers to improve their ability to tie their user research to tangible business outcomes and embed UX more deeply in all levels of their organizations. To help fill this gap, Optimal Workshop is currently running a large research project on Measuring the Value of UX which will be released in a few weeks.

Keep up to date with the latest news and events by following us on LinkedIn.

Author: Amberlie Denny

Related articles


B is for belief: Optimal Workshop’s B Corp journey

There are over 5,000 certified B Corporations around the world, including new recruit Optimal Workshop, a New Zealand-based SaaS company. The ‘B’ in B Corp actually stands for ‘beneficial’, reflecting the founding vision behind the movement: ‘make business a force for good’. B Corp seeks to help companies balance purpose and profit, while also serving stakeholders by building a global community of like-minded organizations. We asked Andrew Mayfield, CEO, and Julie Reddish, Head of People and Culture at Optimal Workshop, about becoming a B Corp, the journey so far, and why it’s so important.

Why did Optimal decide to become a B Corp?

Andrew: I’ve been interested in aligning our reporting with our purpose and values for years, so becoming a B Corp felt like a huge and natural step in this direction. Our ethos of placing people at the heart of decisions extends to the way we treat our own people, not just our customers.  So I saw B Corp as a way of enshrining this thinking into the company and making sure these aspects were considered in future decision making.  

Sounds as if being a B Corp was already in line with your thinking.  What’s it mean in terms of action?  

Julie: One of the things we’ve always cared about is ‘how do we show our real commitment to people, to the environment, to sustainability, to doing ethical good work?’ Finding out B Corp existed as this global initiative, this reputable, recognized way of measuring yourself against other companies was compelling.  It was a way for us to metrify or codify ‘the optimal way’ of doing things in a way that does good as a company and does good in the world.  

Andrew: Practically it means writing up more policies to ensure the things we do for the good of our people and the planet, which we consider normal, are actually written down and therefore, in effect, protected. Our Code of Ethics and Business Conduct, Whistleblower policy, Breastfeeding and support policy, and Environmental sustainability policy are some that spring to mind.

Tell me about the actual journey to becoming a B Corp - what’s it been like?

Julie: Oh my gosh - it’s been a big project. Little did we know how much work it would take to get accredited! It requires six different categories of certification, each with 50 or 60 questions or areas you can gain points in, and you need a minimum of 80 points to get certification. A lot of the questions weren’t really applicable to us, so we really had to look closely and think, ‘what is there already that we do inside Optimal that actually equates to saying yes to that question?’ We had an awful lot of thinking to do about which questions to put our time and our money into.

Andrew: There were a lot of things we did that weren’t fully documented; that was the hard part. We didn’t have to change much of our actual behavior, to be honest. We have been taking a people-centered approach for years - our three values are Approachable, Bold and Curious. We had to write things down and make sure they were where people could find them. There wasn’t a lot we had to change to get our entrance score as such. If we want to continually improve each year then we will need to make continual changes for sure, like anyone trying to self-improve.

Julie:  We thought we were quite close, then there was this massive surge of interest and eight months to even see a consultant, then another six months before the verification process even started.  

That’s when the rubber really hit the road.  We were working away on different bits of the B Corp certification like there’s one for having an office set up for breastfeeding.  Do you have a lockable door? Do you have a place that is private? Does everyone know? Is it communicated? Do you have a policy on breastfeeding? You had to look at each of those things and make sure that you could back it up with evidence.  And that might be worth 0.2 points.  

Sounds like a detailed and rigorous journey - but also quite meaningful and actionable?

Julie: We already had thoughts on what we could do to make our organization great like sourcing local produce and local suppliers but the process of becoming a B Corp really flushed it out for us.  Some of the suggestions and categories were things we were already looking at within the bigger picture of being a good employer and being a good contributor to our communities. Going through the assessment helped us identify a whole other layer of things that we could and should be doing.

Beyond measuring female representation, what else could we be doing for diversity?  What about our indigenous representation? What could we be doing for people with disabilities?  It got us into deeper thinking about what diversity actually means. It’s pretty amazing.

What does being a B Corp mean for your employees?

Julie: As an employer it’s reaffirming a commitment to treating people well and to human-centered work practices. So the real nuts and bolts come down to individuals thinking about how they might get involved: if I see something I don’t feel is right, I call it out. I can also advocate for what is right.

Andrew: More and more, I think people are interested in working for companies that care about more than simply enriching their shareholders - that care about taking care of their team, their environment, and their broader impact, the change they seek to make on society, knowledge-sharing and all this sort of thing. People are more aware of this when choosing where to work, where to stay and just generally where to spend their time. We all have scarce time these days and strong choices to make, and it does play into where people choose to work.

Does this extend to customers?  What impact, if any, does being a B Corp mean for them?

Andrew:  B Corp certainly takes into consideration who you choose to use as suppliers so it becomes a bit recursive in that way. If our customers value the fact we’re a B Corp then they need to be thinking about choosing suppliers who are also B Corps - so it would gradually happen over time I’d imagine. 

Julie: It’s about thoughtful practices.  Not just following trends. It’s about what works, not what’s popular. 

What’s it feel like to be part of this global community called B Corp?

Julie:  I think it's a really cool company to be in.  To share our thinking, to share policies and resources with somebody who’s traveled that road before us, with its dragons and potholes, to actually follow in someone’s footsteps, but also make it our own ‘Optimal’ way.  

Andrew: Being part of a community of B Corps supporting each other with new ways to manage these obligations we choose to put on ourselves, to be better corporate citizens as such, is valuable. While there’s no desire to make it hard, sometimes it is hard to make sure you’re doing the right thing. It requires extra research and extra conscientiousness when making decisions, so sharing ideas and experiences, and feeling like you’re not the only one who’s been there, can help.

Becoming a B Corp is quite an achievement however the work doesn’t stop there does it?

Andrew: My understanding is the requirements get harder, and that’s a good thing. We can all get better. The biggest areas for us to improve are things like sharing information in decision-making; we’re already pretty transparent but haven’t formalized that, so there are things we can do there.

The next checkpoint is in three years and we’re expected to improve plus the requirements get more onerous - so we’d better improve!


Usability Experts Unite: The Power of Heuristic Evaluation in User Interface Design

Usability experts play an essential role in the user interface design process by evaluating the usability of digital products from a very important perspective - the users! Usability experts utilize various techniques such as heuristic evaluation, usability testing, and user research to gather data on how users interact with digital products and services. This data helps to identify design flaws and areas for improvement, leading to the development of user-friendly and efficient products.

Heuristic evaluation is a usability research technique used to evaluate the user interface design of a digital product against a set of ‘heuristics’ or ‘usability principles’. These heuristics are derived from a set of established principles of user experience design, attributed to the landmark article “Improving a Human-Computer Dialogue” published by usability pioneers Jakob Nielsen and Rolf Molich in 1990. The principles focus on the experiential aspects of a user interface.

In this article, we’ll discuss what heuristic evaluation is and how usability experts use the principles to create exceptional design. We’ll also discuss how usability testing works hand-in-hand with heuristic evaluation, and how minimalist design and user control impact user experience. So, let’s dive in!

Understanding Heuristic Evaluation


Heuristic evaluation helps usability experts to examine interface design against tried and tested rules of thumb. To conduct a heuristic evaluation, usability experts typically work through the interface of the digital product and identify any issues or areas for improvement based on these broad rules of thumb, of which there are ten. They broadly cover the key areas of design that impact user experience - not bad for an article published over 30 years ago!

The ten principles are:

  1. Error prevention: Well-functioning error messages are good, but instead of messages, can these problems be removed in the first place? Remove the opportunity for slips and mistakes to occur.
  2. Consistency and standards: Language, terms, and actions used should be consistent to not cause any confusion.
  3. Control and freedom for users: Give your users the freedom and control to undo/redo actions and exit out of situations if needed.
  4. System status visibility: Let your users know what’s going on with the site. Is the page they’re on currently loading, or has it finished loading?
  5. Design and aesthetics: Cut out unnecessary information and clutter to enhance visibility. Keep things in a minimalist style.
  6. Help and documentation: Ensure that information is easy to find for users, isn’t too large and is focused on your users’ tasks.
  7. Recognition, not recall: Make sure that your users don’t have to rely on their memories. Instead, make options, actions and objects visible. Provide instructions for use too.
  8. Provide a match between the system and the real world: Does the system speak the same language and use the same terms as your users? If you use a lot of jargon, make sure that all users can understand by providing an explanation or using other terms that are familiar to them. Also ensure that all your information appears in a logical and natural order.
  9. Flexibility: Is your interface easy to use, and is it flexible for users? Ensure your system can cater to users of all types, from experts to novices.
  10. Help users to recognize, diagnose and recover from errors: Your users should not feel frustrated by any error messages they see. Instead, express errors in plain, jargon-free language they can understand. Make sure the problem is clearly stated and offer a solution for how to fix it.

Heuristic evaluation is a cost-effective way to identify usability issues early in the design process (although it can be performed at any stage), leading to faster and more efficient design iterations. It also provides a structured approach to evaluating user interfaces, making it easier to identify usability issues. By providing valuable feedback on overall usability, heuristic evaluation helps to improve user satisfaction and retention.
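
As an illustration of that structured approach, here is a minimal sketch - our own, not an Optimal Workshop feature - of how an evaluator might record findings against the ten heuristics using the common 0–4 severity scale (0 = not a problem, 4 = usability catastrophe). The specific findings shown are made up for the example.

```typescript
// A hypothetical record of heuristic evaluation findings, rated for severity.
type Finding = {
  heuristic: string;   // which of the ten principles the issue violates
  location: string;    // where in the interface it was observed
  description: string; // what the evaluator saw
  severity: 0 | 1 | 2 | 3 | 4; // 0 = not a problem ... 4 = usability catastrophe
};

const findings: Finding[] = [
  {
    heuristic: "System status visibility",
    location: "Checkout page",
    description: "No loading indicator while payment is processed.",
    severity: 3,
  },
  {
    heuristic: "Error prevention",
    location: "Account settings",
    description: "Deleting a saved card happens immediately, with no confirmation.",
    severity: 2,
  },
];

// Sort the report so the most severe issues are discussed and fixed first.
const report = [...findings].sort((a, b) => b.severity - a.severity);
console.log(report);
```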

The Role of Usability Experts in Heuristic Evaluation

Usability experts play a central role in the heuristic evaluation process by providing feedback on the usability of a digital product, identifying any issues or areas for improvement, and suggesting changes to optimize user experience.

One of the primary goals of usability experts during the heuristic evaluation process is to identify and prevent errors in user interface design. They achieve this by applying the principles of error prevention, such as providing clear instructions and warnings, minimizing the cognitive load on users, and reducing the chances of making errors in the first place. For example, they may suggest adding confirmation dialogs for critical actions, ensuring that error messages are clear and concise, and making the navigation intuitive and straightforward.
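
As a rough sketch of what those suggestions can look like in code - our own illustration, with a hypothetical /api/projects endpoint and assumed confirmDialog and showError helpers - the snippet below confirms a destructive action before it runs and reports failures in plain language:

```typescript
// Minimal sketch of two error-prevention tactics: a confirmation step before a
// destructive action, and a clear, jargon-free error message with a next step.
// The endpoint and helper functions here are hypothetical, for illustration only.
declare function confirmDialog(message: string): Promise<boolean>; // assumed UI dialog
declare function showError(message: string): void;                 // assumed toast/banner

async function deleteProject(projectId: string): Promise<void> {
  const ok = await confirmDialog(
    `Delete project ${projectId}? This removes all of its studies and cannot be undone.`
  );
  if (!ok) return; // the user backed out, so the slip never happens

  try {
    const res = await fetch(`/api/projects/${projectId}`, { method: "DELETE" });
    if (!res.ok) throw new Error(`Server responded with ${res.status}`);
  } catch {
    // State the problem plainly and offer a way forward.
    showError("We couldn't delete this project. Check your connection and try again.");
  }
}
```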

Usability experts also use user testing to inform their heuristic evaluation. User testing involves gathering data from users interacting with the product or service and observing their behavior and feedback. This data helps to validate the design decisions made during the heuristic evaluation and identify additional usability issues that may have been missed. For example, usability experts may conduct A/B testing to compare the effectiveness of different design variations, gather feedback from user surveys, and conduct user interviews to gain insights into users' needs and preferences.

Conducting user testing with participants who represent actual end users as closely as possible ensures that the product is optimized for its target audience. Check out our tool Reframer, which helps usability experts collaborate and record research observations in one central database.

Minimalist Design and User Control in Heuristic Evaluation

Minimalist design and user control are two key principles that usability experts focus on during the heuristic evaluation process. A minimalist design is one that is clean, simple, and focuses on the essentials, while user control refers to the extent to which users can control their interactions with the product or service.

Minimalist design is important because it allows users to focus on the content and tasks at hand without being distracted by unnecessary elements or clutter. Usability experts evaluate the level of minimalist design in a user interface by assessing the visual hierarchy, the use of white space, the clarity of the content, and the consistency of the design elements. Information architecture (the system and structure you use to organize and label content) has a massive impact here, along with the content itself being concise and meaningful.

Incorporating minimalist design principles into heuristic evaluation can improve the overall user experience by simplifying the design, reducing cognitive load, and making it easier for users to find what they need. Usability experts may incorporate minimalist design by simplifying the navigation and site structure, reducing the number of design elements, and removing any unnecessary content (check out our tool Treejack to conduct site structure, navigation, and categorization research). Consistent color schemes and typography can also help to create a cohesive and unified design.

User control is also critical in a user interface design because it gives users the power to decide how they interact with the product or service. Usability experts evaluate the level of user control by looking at the design of the navigation, the placement of buttons and prompts, the feedback given to users, and the ability to undo actions. Again, usability testing plays an important role in heuristic evaluation by allowing researchers to see how users respond to the level of control provided, and gather feedback on any potential hiccups or roadblocks.

Usability Testing and Heuristic Evaluation

Usability testing and heuristic evaluation are both important components of the user-centered design process, and they complement each other in different ways.

Usability testing involves gathering feedback from users as they interact with a digital product. This feedback can provide valuable insights into how users perceive and use the user interface design, identify any usability issues, and help validate design decisions. Usability testing can be conducted in different forms, such as moderated or unmoderated, remote or in-person, and task-based or exploratory. Check out our usability testing 101 article to learn more.

On the other hand, heuristic evaluation is a method in which usability experts evaluate a product against a set of usability principles. While heuristic evaluation is a useful method to quickly identify usability issues and areas for improvement, it does not involve direct feedback from users.

Usability testing can be used to validate heuristic evaluation findings by providing evidence of how users interact with the product or service. For example, if a usability expert identifies a potential usability issue related to the navigation of a website during heuristic evaluation, usability testing can be used to see if users actually have difficulty finding what they need on the website. In this way, usability testing provides a reality check to the heuristic evaluation and helps ensure that the findings are grounded in actual user behavior.

Usability testing and heuristic evaluation work together in the design process by informing and validating each other. For example, a designer may conduct heuristic evaluation to identify potential usability issues and then use the insights gained to design a new iteration of the product or service. The designer can then use usability testing to validate that the new design has successfully addressed the identified usability issues and improved the user experience. This iterative process of designing, testing, and refining based on feedback from both heuristic evaluation and usability testing leads to a user-centered design that is more likely to meet user needs and expectations.

Conclusion

Heuristic evaluation is a powerful usability research technique that usability experts use to evaluate digital product interfaces based on a set of established principles of user experience design. After all these years, the ten principles of heuristic evaluation still cover the key areas of design that impact user experience, making it easier to identify usability issues early in the design process, leading to faster and more efficient design iterations. Usability experts play a critical role in the heuristic evaluation process by identifying design flaws and areas for improvement, using user testing to validate design decisions, and ensuring that the product is optimized for its intended users.

Minimalist design and user control are two key principles that usability experts focus on during the heuristic evaluation process. A minimalist design is clean, simple, and focuses on the essentials, while user control gives users the freedom and control to undo/redo actions and exit out of situations if needed. By following these principles, usability experts can create an exceptional design that enhances visibility, reduces cognitive load, and provides a positive user experience. 

Ultimately, heuristic evaluation is a cost-effective way to identify usability issues at any point in the design process, leading to faster and more efficient design iterations, and improving user satisfaction and retention. How many of the ten heuristic design principles does your digital product satisfy? 


The Power of Prototype Testing Live Training

If you missed our recent live training on Prototype Testing, don’t worry—we’ve got everything you need right here! You can catch up at your convenience, so grab a cup of tea, put your feet up, and enjoy the show.

In the session, we explored the powerful new features of our Prototype Testing tool, offering a step-by-step guide to setting up, running, and analyzing your tests like a seasoned pro. This tool is a game-changer for your design workflow, helping you identify usability issues and gather real user feedback before committing significant resources to development.


Here’s a quick recap of the highlights:

1. Creating a prototype test from scratch using images

We walked through how to create a prototype test from scratch using static images. This method is perfect for early-stage design concepts, where you want to quickly test user flows without a fully interactive prototype.

2. Preparing your Figma prototype for testing

Figma users, we’ve got you covered! We discussed how to prepare your Figma prototype for the smoothest possible testing experience. From setting up interactions to ensuring proper navigation, these tips ensure participants have an intuitive experience during the test. For more detailed instructions, check out our help article.

3. Seamless Figma prototype imports

One of the standout features of the tool is its seamless integration with Figma. We showed how easy it is to import your designs directly from Figma into Optimal, streamlining the setup process. You can bring your working files straight in, and resync when you need to with one click of a button.

4. Understanding usability metrics and analyzing results

We explored how to analyze the usability metrics, and walked through what the results can indicate on click maps and paths. These visual tools allow you to see exactly how participants navigate your design, making it easier to spot pain points, dead ends, or areas of friction. By understanding user behavior, you can rapidly iterate and refine your prototypes for optimal user experience.

Seeing is believing

Explore our tools and see how Optimal makes gathering insights simple, powerful, and impactful.