
67 ways to use Optimal for user research

User research and design don’t fail because teams don’t care – they fail because there’s rarely time to explore every option. When deadlines pile up, most teams default to the same familiar research patterns and miss opportunities to get more value from the tools they already have.

We’ve brought together practical, real-world ways to use Optimal – from tree testing and first-click testing to card sorting, surveys, prototype testing, and interviews. Some of these use cases are obvious, but many aren’t. All of them are designed to help teams move faster, reduce risk, and turn user insights into decisions stakeholders trust.

We’ve focused on quick wins and flexible examples you can adapt to your own context – whether you’re benchmarking navigation, validating early designs, improving conversion flows, prioritizing work, or proving the ROI of UX. You don’t need more tools or more processes. You just need smarter ways to use what you already have.

Let’s get into it.

Practical ways to use Optimal for user research and UX design

#1 Benchmark your information architecture (IA)

Without a baseline for your navigation or information architecture (IA), you can’t easily tell if any changes you make have a positive effect. If you haven’t done so yet, benchmark your existing website with tree testing now. Upload your site structure and get results the same day. Now you’ll have IA scores to beat each month. Easy.

#2 Find out precisely where people get lost

Watch video recordings of real people interacting with your sites with live site testing. Combine this with surveys and user interviews to understand where users struggled. You can also use the tree testing pietree to find out exactly where people are getting lost in your website structure and where they go instead.

#3 Start with one screenshot

If you’re not sure where to begin, take a screenshot of your homepage – or any page you think might have issues – and get going with first-click testing. Write up a list of things people might want to do when they land on this page and use these as your tasks. Surprise your colleagues with a maddening heatmap or video recordings showing where people actually clicked in response to your tasks, and where they struggled. Now you’ll have a better idea of which area of your site to focus on next.

#4 Test live sites during discovery

You can run live site testing as part of your discovery phase to baseline your live experiences and see how well your current site supports real user goals. Test competitors' sites to see how you stack up. You’ll quickly uncover opportunities to differentiate your site, all before a single wireframe is drawn. All that's required is a URL and then you're set to go. No code needed.

#5 A/B test your site structure

Tree testing is great for testing more than one content structure. It’s easy to run two separate tree testing studies, even more than two. It’ll help you decide which structure you and your team should run with, and it won’t take you long to set them up.

#6 Optimize sign-up flows

Discover how easy (or not) it is for users to navigate your sign-up experience to ensure it works exactly as intended. Create a live site or prototype test to identify any confusion or points of friction. You could also use this test to understand users' first impressions of your home or landing page. Where do they click first, and what information is valuable to them?

#7 Make collaborative design decisions

Use surveys, first-click tests, and card sorting to get your team involved and let their feedback feed your designs: logos, icons, banners, images, the list goes on. For example, by creating a closed image sort with categories, your team can group designs based on their preferences, giving you quick feedback to help you figure out where to focus your efforts.

#8 Do your (market) research

Get a better sense of your users’ and customers’ motivations with surveys and user interviews. You can also find out what people actually want to see on your website by conducting an image sort of potential products. By providing categories like ‘I would buy this’ and ‘I wouldn’t buy this’ to indicate their preferences for each item, you can figure out what types of products appeal to your customers.

#9 Measure customer satisfaction with surveys and interviews

The thoughts and feelings of your users are always important. A simple survey or user interview can help you take a deeper look at your checkout process, a recently launched product or service, or even the packaging your product arrives in. Your options are endless.

#10 Start testing prototypes

Companies that incorporate prototype testing in their design process can reduce development costs by 33%. Use prototype testing to ensure your designs hit the mark before you invest too heavily in the build. Build your own prototype with images in Optimal or import a Figma file. You can even test AI-generated prototypes from tools like Lovable or Magic Patterns by dropping the URL into live site testing.

#11 Crowdsource content ideas

Whether you’re running a blog or a UX conference, surveys can help you generate content ideas and understand any knowledge gaps that might be out there. Figure out what your users and attendees like to read on your blog, or what they want to hear about at your event, and let this feed into what you offer.

#12 Evaluate user flows

Sometimes a change in your product or service means you have to change how it’s presented to your existing customers. Ensure your customers understand the changes with prototype and live site testing. Identify issues with user flow, content, or layout that may confuse them. Discover which options they’re most likely to choose with the updates. Uncover what truly matters to your customers.

#13 Quantify the return on investment of UX

Some people, including UX Agony Aunt, define return on UX as time saved, money made, and people engaged. By attaching a value to the time spent completing tasks, or to successful completion of tasks, you can approximate an ROI or at least illustrate the difference between two options.
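As a sketch of that arithmetic (every figure and name below is an invented placeholder for illustration, not data from Optimal or any study), attaching a dollar value to time saved might look like:

```python
# Hypothetical back-of-envelope UX ROI: all figures are invented
# placeholders to illustrate the arithmetic, not real data.

def annual_value_of_time_saved(seconds_saved_per_task: float,
                               tasks_per_year: int,
                               hourly_rate: float) -> float:
    """Translate a per-task time saving into dollars per year."""
    hours_saved = seconds_saved_per_task * tasks_per_year / 3600
    return hours_saved * hourly_rate

# Suppose a redesign shaves 30 seconds off a task performed
# 50,000 times a year by users whose time is worth $40/hour.
value = annual_value_of_time_saved(30, 50_000, 40)
print(f"${value:,.0f} per year")  # -> $16,667 per year
```

Run the same numbers for each design option and the difference between the two totals is the value you can put in front of stakeholders.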

#14 Convince your stakeholders with highlight reels

User interviews are teeming with insights but can be time- and resource-intensive to analyze without automation. Use Optimal’s Interviews tool to capture key moments, reactions, and pain points with automated highlight reels and clips. These are perfect for storytelling, stakeholder buy-in, and keeping teams connected to who they’re building for.

#15 Prioritize upcoming work 

Survey your organization to build a list of ideas for upcoming work. Understand your audience’s priorities with card sorting to inform your feature development. Categorize your upcoming work ideas to decide collectively what’s best to take on next. Great for clarifying what the team considers the most valuable or pressing work to be done.

#16 Reduce content on landing pages to what people access regularly

Before you run an open card sort to generate new category ideas, you can run a closed card sort to find out if you have any redundant content. Say you wanted to simplify the homepage of your intranet. You can ask participants to sort cards (containing homepage links) based on how often they use them. You could compare this card sort data with analytics from your intranet and see if people’s actual behavior and perception are well aligned.

#17 Create tests to fit in your onboarding process

Onboarding new customers is crucial to keeping them engaged with your product, especially if it involves your users learning how to use it. You can set up a quick study to help your users stay on track with onboarding. For example, say your company provided online email marketing software. You can set up a first-click testing study using a photo of your app, with a task asking your participants where they’d click to see the open rates for a particular email that went out.

#18 Input your learnings and observations from a UX conference with qualitative insights

If you're lucky enough to attend a UX conference, you can now share the experience with your colleagues. Easily jot down ideas, quotes, and key takeaways in a Qualitative Insights project, and keep your notes organized by using a new session for each presenter. Bonus: if you’re part of a team, they can watch the live feed rolling into Qualitative Insights!


#19 Multivariate testing

Tree testing and first-click testing allow you to compare multiple versions of content structures, designs, or flows. You can also compare how users engage with different live websites in one study. This helps decide the best-performing option without guessing.

#20 Do some sociological research

Using card sorting for sociological research is a great way to deepen your understanding of how different groups may categorize information. For example, by looking at how young people group popular social media platforms, you can understand the relationships between them, and identify where your product may fit in the mix. Then, follow up with surveys or moderated interviews for deeper insights. 

#21 Test your FAQs page with new users

Your support section and knowledge base can be just as important as any other core action on your website. If your support site’s navigation and UX are lacking, support tickets – and the resources needed to handle them – will no doubt increase. Make sure your online support section is up to scratch. Here’s an article on how to do it quickly.

#22 Establish which tags or filters people consider to be the most important

Create a card sort with your search filters or tags as labels, and have participants rank them according to how important they consider them to be. Analytics can tell you half of the story (where people actually click), while a card sort gives you the other half: a better idea of what people actually think or want. Follow up with surveys or interviews to confirm insights.

#23 Figure out if your icons need labels

Figure out if your icons are doing their job by testing whether your users understand them as intended. Upload icons you currently use (or plan to use) in your interface to first-click testing, and ask your users to identify their meaning using post-task questions.

#24 Get straight to the aha! moments

Optimal Interviews gives you automated insights but you can also engage with AI Chat to dive deeper. Ask AI specific questions about a feature or process or request quotes or examples. Then, get highlight reels and clips to match.


#25 Improve website conversions

Make the marketing team’s day by making a fast improvement to some core conversions on your website. There are loads of ways to improve conversions for a checkout cart or signup form, but using first-click testing to try out ideas before going live with an A/B test can take mere minutes and give your B version a confidence boost. For deeper insights, try a live site test.

#26 Test your mobile experience or web app

As more and more people use their smartphones for apps and to browse sites, you need to ensure your mobile design gives users a great experience. Test your mobile site to make sure people aren’t getting lost in the mobile version of your site. If you haven’t got a mobile-friendly design yet, now’s the time to start designing it!

#27 Get automated transcripts

Have a number of interviews you need to transcribe quickly? Upload up to 20 interviews at once in Optimal Interviews and get automated transcripts, so you can spend less time on admin and more time digging into insights.

#28 Reduce the bounce rates of certain sections of your website

People jumping off your website and not continuing their experience is something (depending on the landing page) everyone tries to improve. Metrics like ‘time on site’ and ‘average page views’ show the value your whole website has to offer. Again, there are many ways to approach this, but one big reason people leave a website is not being able to find what they’re looking for. Use prototype testing or live site testing to watch users in action and understand where things break down.

#29 Test your website in different countries‍

No, you don’t have to spend thousands of dollars traveling to all these countries to test, although that’d be pretty sweet. You can recruit research participants from all over the world remotely, using our integrated recruitment panel. Start seeing how different cultures, languages, and countries interact with your website.

#30 Preference test

Whether you’re coming up with a new logo design, headline, featured image, or anything else, you can preference test it with first-click testing. Create an image that shows the two designs side by side and upload it to first-click testing. From there, you can ask people to click whichever one they prefer! If you want to track multiple clicks per task or watch recordings, use prototype testing instead.


#31 Test visual hierarchy with first-click testing

Use first-click testing to understand which elements draw users' attention first on your page. Upload your design and ask participants to click on the most important element, or whatever catches their eye first. The resulting heatmap will show you if your visual hierarchy is working as intended – are users clicking where you expect them to? This technique helps validate design decisions about sizing, color, positioning, and contrast without needing to build the actual page.


#32 Tame your blog or knowledge base

Get the tags and categories in your blog under control to make life easier for your readers. Set up a card sort and use all your tags and categories as card labels. Either use your existing ones or test a fresh set of new tags and categories.

#33 Use AI Chat for stakeholder-ready outputs

Use AI-powered chat to instantly reformat interview insights and fast-track deliverables for different audiences. Simply specify the details of the deliverable you would like. For example: “Turn this into a 3-sentence Slack summary (no citations).” or “Rewrite this as an exec-ready insight with a clear recommendation.”

#34 Validate the designs in your head

As designers, you’ve probably got umpteen designs floating around in your head at any one time. But which of these are really worth pursuing? Figure this out by using Optimal to test out wireframes of new designs before putting any more work into them.

#35 Optimize the support escalation flow

Understand how users navigate help resources, report issues, and conceptualize support categories, especially when they need to locate assistance quickly in time-sensitive situations.

#36 Improve your search engine optimization (SEO) with tree testing

Yes, a good IA improves your SEO. Tree testing helps you understand how people navigate throughout your site. It also helps search engines better understand and index your content, making it more discoverable and relevant in search results. Make sure people can easily find what they’re looking for, and you’ll start to see improvement in your search engine ranking.

#37 Prioritize features and get help with your roadmap

Find out what people think are the most important next steps for your team. Set up a survey or card sort and ask people to categorize items and rank them in descending order of importance or impact on their work. This can also help you gauge their thoughts on potential new features for your site, and for bonus points compare team responses with customer responses.

#38 Define your brand tone of voice

Use a card sort to understand how people perceive your brand, so you can shape or refine your brand personality, tone of voice, and style guidelines. Run this with stakeholders or your audience to uncover current perceptions and where they’d like your brand to go next.

#39 Run an Easter egg hunt using the correct areas in first-click testing

Liven up the workday by creating a fun Easter egg hunt in first-click testing. Simply upload a photo (like those really hard “spot the X” photos), set the correct area of your target, then send out your study with participant identifiers enabled. You can also send these out as competitions and have closing rules based on time, number of participants, or both.

#40 Test your home button

Would an icon or text link work better for navigating to your home page? Before you go ahead and make changes to your site, find out by setting up a first-click test.

#41 Improve team structure and clarify role expectations

Run a card sort, survey, or internal interviews to understand how responsibilities are perceived across different roles. Work with team leaders and managers to clarify role definitions, reporting lines, and decision-making authority. This helps uncover overlapping responsibilities and opportunities to streamline management and support team workflows.

#42 Improve ‘Buy now’ button and shopping cart visibility

If you’re running an e-commerce site, ease of use and a great user experience are crucial. To see if your shopping cart and checkout processes are as good as they can be, look into running a live site, prototype or first-click test.

#43 Website periodic health checks

Raise the visibility of good IA by running periodic IA health checks using tree testing and reporting the results. Proactively identifying structural issues early, and backing decisions with clear metrics, helps drive alignment and build confidence across stakeholders.

#44 Use heatmaps to get first impressions of designs

Heatmaps in our first-click testing tool are a great way of getting first impressions of any design. You can see where people clicked (correctly and incorrectly), giving you insights on what works and doesn’t work with your designs. Because it’s so fast to test, you can iterate until your designs start singing.

#45 Focus groups with interviews

Thinking of launching a new product, app or website, or seeking opinions on an existing one? Remote focus groups can provide you with a lot of candid information that may help get your project off the ground. They’re also dangerous because they’re susceptible to groupthink, design by committee, and tunnel vision. Use with caution, but if you do then upload your recordings to Interviews for automated insights! Find patterns across sessions and use AI Chat to dig deeper. Pay attention to emotional triggers.

#46 Gather opinions with surveys

Whether you want the opinions of your users or from members of your team, you can set up a quick and simple survey. It’s super useful for getting opinions on new ideas (consider it almost like a mini-focus group), or even for brainstorming with teammates.

#47 Prioritize content

Use a card sort to understand what content matters most to people, so you can plan what to write first. Ask participants which information is most useful or which tasks they do most often. You can also run this after a top tasks survey to help shape your long list of content.

#48 Test a new concept

Got an idea you want to sanity-check before investing more time? Use surveys, first-click testing, or prototype testing to see if people understand the concept and find it valuable. A quick test now can save a lot of rework later.


#49 Run an image card sort to organize products into groups

Adding images to each card lets you understand how your participants organize and label particular items. Very useful if you want to organize retail products and find out how other people would group them, given visual cues like shape, color, and other context.

#50 Guerrilla testing with first-click testing

For really quick first-click testing, take first-click testing on a tablet, mobile device or laptop to a local coffee shop. Ask people standing in line if they’d like to take part in your super quick test in exchange for a cup of joe. Easy!

#51 Test your search box

Case study by Viget: “One of the most heavily used features of the website is its keyword search, so we wanted to make absolutely certain that our redesigned search box didn’t make search harder for users to find and use.” Use first-click testing to test different variations. 

#52 Run a Net Promoter Score (NPS) survey

Optimal surveys give you plenty of question options, but one of the simplest ways to take the pulse of your product is an NPS survey, which tells you how likely customers are to recommend your product or brand. Use the out-of-the-box NPS question type to quickly understand customer sentiment and track it over time.
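For reference, the NPS arithmetic itself is simple: the score is the percentage of promoters (ratings of 9–10) minus the percentage of detractors (0–6). A minimal sketch (the sample ratings below are made up for illustration):

```python
def nps_score(ratings):
    """Compute a Net Promoter Score from 0-10 ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    toward the total but not the numerator.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Ten responses: 4 promoters, 4 passives, 2 detractors -> NPS 20
print(nps_score([10, 9, 9, 10, 8, 7, 8, 7, 6, 5]))  # -> 20
```

The score ranges from -100 (all detractors) to +100 (all promoters), which is why tracking it over time matters more than any single reading.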

#53 Run an empathy test

Empathy – the ability to understand and share the experience of another person – is central to the design process. An empathy test is another great tool to use in the design phase because it enables you to find out whether you are creating the right kind of feelings in your users. Take your design and show it to users. Provide them with a variety of words that could represent the design – for example “minimalistic”, “dynamic”, or “professional” – and ask them to pick out the words they think best suit their experience.

#54 Compare and test email designs

Drop your email designs into first-click testing to see which version people prefer and where they click first. Use these insights to refine your layout, hierarchy, and calls to action to improve engagement and conversions.

#55 Source-specific data with an online survey

Online survey tools can complement your existing research by sourcing specific information from your participants. For example, if you need to find out more about how your participants use social media, which sites they use, and on which devices, you can do it all through a simple survey questionnaire. Additionally, if you need to identify usage patterns, device preferences or get information on what other products/websites your users are aware of/are using, a questionnaire is the ticket.

#56 Make sure you get the user's first-click right

Like most things, read a little, and then it’s all about practice. We’ve found that people who get the first click correct are almost three times as likely to complete a task successfully. Get your first clicks right in tree testing and first-click testing and you’ll start seeing your customers smile.

#57 Destroy evil attractors in your tree

Evil attractors are those labels in your IA that attract unjustified clicks across tasks. This usually means the chosen label is ambiguous, or possibly a catch-all phrase like ‘Resources’. Read how to quickly identify evil attractors in the Destinations table of tree test results and how to fix them.


#58 Ensure accessibility and inclusion

Check how people with different physical, visual, or cognitive needs move through your content, and spot any areas that might slow them down or cause confusion. Use what you uncover to remove friction and support all users.

#59 Add moderated card sort results to your card sort

An excellent way of gathering valuable qualitative insights alongside the results of your remote card sorts is to run a moderated version of the sorts with a smaller group of participants. When you can observe and interact with your participants as they complete the sort, you’ll be able to ask questions and learn more about their thought processes and the reasons why they have categorized things in a particular way.

#60 Test your customers' perceptions of different logo and brand image designs

Understand how customers perceive your brand by creating a closed card sort. Come up with a list of categories, and ask participants to sort images such as logos and branded imagery.

#61 Run an open image card sort to classify images into groups based on the emotions they elicit

Are these pictures exhilarating, or terrifying? Are they humorous, or offensive? Relaxing, or boring? Productive, or frantic? Happy memories, or a deep sigh?

#62 Crowd-source the values you want your team/brand/product to represent

Card sorting is a well-established technique in the ‘company values’ realm, and there are some great resources to help you and your team brainstorm the values you represent. These ‘in-person’ brainstorm sessions are great, and you can run a remote closed card sort to support your findings. And if you want feedback from more than a small group of people (if your company has, say, more than 15 staff) you can run a remote closed card sort on its own. Use Microsoft’s Reaction Card Method as card inspiration.

#63 Test physical and digital experiences together

Use recorded videos and interviews to observe people interacting with physical products, kiosks, or mobile apps in real-world contexts. Record sessions, capture moments of friction, and bring those insights back into Optimal’s Interviews tool for automated insights.

#64 HR exercises to determine the motivations of your team

It’s simple to ask your team about their thoughts, feelings, and motivations with a survey. You can choose to leave participant identifiers blank (so responses are anonymous), or you can ask for a name/email address. As a bonus, you can set up a calendar reminder to send out a new survey in the next quarter. Duplicate the survey and send it out again!

#65 Designing physical environments

If your company has a physical environment that your customers visit, you can research new structures using a mixture of tools in Optimal. This especially comes in handy if your customers require certain information within the physical environment in order to make decisions. For example, picture a retail store. Are all the signs clear, and do they communicate the right information? Are people overwhelmed by the physical environment?

#66 Run an image card sort to organize your library

Whether it’s a physical library of books, or a digital drive full of ebooks, you can run a card sort to help organize them in a way that makes sense. Will it be by genre, author name, color or topic? Send out the study to your coworkers to get their input! You can also do this at home for your own personal library, and you can include music/CDs/vinyl records and movies!

#67 Use tree testing to refine an interactive phone menu system

Similar to how you’d design an IA, you can create a tree test to design an automated phone system. Whether you’re designing from the ground up, or improving your existing system, you will be able to find out if people are getting lost.

Practical ways to use Optimal for user research (and get value fast)

And that’s the list. This is not everything you can do with Optimal, but a solid reminder that meaningful user insights don’t have to be slow, heavy, or overcomplicated. Small, well-timed studies can uncover friction, validate decisions, and create momentum across teams.

Ready to get started?

Have a creative use case we missed? Let us know – we’re always learning from the ways our customers push research further, faster, and smarter.


Ella Stoner: A three-step-tool to help designers break down the barriers of technical jargon

Designing in teams with different stakeholders can be incredibly complex. Each person looks at projects through their own lens, and can potentially introduce jargon and concepts that are confusing to others. Simplicity advocate Ella Stoner knows this scenario all too well. It’s what led her to create an easy three-step tool for recognizing problems and developing solutions. By getting everyone on the same page and creating an understanding of what the simplest solution is, designers can create products with customer needs in mind.

Ella’s background

Ella Stoner is a CX Designer at Spark in New Zealand. She is a creative thought leader and a talented designer who has facilitated over 50 Human Centered Design Workshops. Ella and her team have developed a cloud product that enables businesses to connect with Public Cloud Services such as Amazon, Google and Azure in a human-centric way. She brings a simplicity-first approach to her work that is reflected in her UX New Zealand talk. It’s about cutting out complex details to establish an agreed starting point that is easily understood by all team members.

Contact Details:

You can find Ella on LinkedIn.

Improving creative confidence 🤠

Ella is confident that she is not the only designer who has felt overwhelmed by technical and industry-specific jargon in product meetings. For example, on Ella’s first day as a designer with Spark, she attended a meeting about an HSNS (High Speed Network Services) tool. Ella attempted to use context clues to predict what HSNS could mean. However, as the meeting went on, the technical and industry-specific jargon built up and Ella struggled to follow what was being said. At one point Ella asked the team to clarify this mysterious term:

“What’s an HSNS and why would the customer use it?” she asked. Much to her surprise, the room was completely silent. The team struggled to answer a basic question about a term that had appeared to be common knowledge during the meeting. There’s a saying: “Why do something simply when you can make it as complicated as possible?” This happens all too often – people and teams struggle to communicate with each other, and the result is projects and products that customers don’t understand and can’t use. Ella’s In A Nutshell tool is designed to cut through all that. It creates a base-level starting point that’s understood by all, cuts out jargon, and puts the focus squarely on the customer. It:

  • condenses down language and jargon to its simplest form
  • translates everything into common language
  • flips it back to the people who’ll be using it.

Here’s how it works:

First, you complete this phrase as it pertains to your work: “In a nutshell, (project/topic) is (describe what the project or topic is in a few words), that (state what the project/topic does) for (indicate key customer/users and why).” For this method to work, each of the four categories you insert must be simple and understandable. All acronyms, complex language, and technical jargon must be avoided. In a literal sense, anyone reading the statement should be able to understand what is being said “in a nutshell.” For instance, the HSNS tool from Ella’s story might become something like: “In a nutshell, HSNS is a high-speed network service that connects business sites for customers who need fast, reliable connectivity.” When you’ve done this, you’ll have a statement that can act as a guide for the goals your project aims to achieve.

Why it matters 🤔

Applying the “In A Nutshell” tool doesn’t take long. However, it's important to write this statement as a team. Ideally, it’s best to write the statement at the start of a project, but you can also write it in the middle if you need to create a reference point, or any time you feel technical jargon creeping in.

Here’s what you’ll need to get started:

  • People with three or more role types (this accommodates varying perspectives to ensure it’s as relevant as possible)
  • A way to capture text - i.e. whiteboard, Slack channel, Miro board
  • An easy voting system - i.e., thumbs up in a chat

Before you start, you may need to pitch the idea to someone in a technical role. If you’re feeling lost or confused, chances are someone else will be too. Breaking down the technical concepts into easy-to-understand and digestible language is of utmost importance:

  1. Explain the Formula to the team.
  2. Individually brainstorm possible answers for each gap for three minutes.
  3. Put every idea up on the board or channel and vote on the best one.

Use the most popular answers as your final “In a Nutshell” statement.

Side note: Keep all the options that come through the brainstorm. They can still be useful in the design process to help form a full picture of what you’re working on, what it should do, who it should be for etc.


Bear Liu: How visual thinking can improve communications in design workplaces

When Bear Liu was teaching himself design, he struggled to remember concepts since English wasn’t his first language. To help, he started doodling. By drawing pictures that related to what he was learning, he found he could not only remember them better but also understand and communicate more effectively. Ever since, he’s used the power of drawings and pictures to relay information in ways people can use.

Bear gives examples of how visual communication can help design workplaces to relay information in a more memorable and usable way. It may only seem like a minor change, but the difference can be significant.

Bear’s background 🎤

Bear Liu is a Product Designer at Xero, an online accounting platform that’s used all over the world. He’s also a Design Mentor at Springboard and CareerFoundry, and an Apple Award-Winning podcast host at BearTalk.

His background is in science education. As a self-taught designer, Bear has helped a raft of large and small businesses with digital products over the last 16 years. His clients come from diverse backgrounds and industries across the globe. Bear's professional passions also carry over into his hobbies. Outside of work he enjoys reading, drawing, and producing videos & podcasts on tech and design.

Contact Details:

You can find Bear on LinkedIn, or listen to his podcast, BearTalk.

Unleash your visual superpower as a communication pro 🦸🏻

When it comes to addressing business challenges, it is important to keep these three aspects in mind:

  1. Understanding - break down complex problems and solutions so everyone can understand.
  2. Memory - retaining information in your mind is difficult, even with note-taking.
  3. Communication - people relate to words differently, and the meaning of something can easily get lost in translation. This issue is more prevalent with remote work.

Bear Liu strongly believes that visual communication helps people understand, remember and communicate information more effectively. Why?

  • It helps to focus. Pictures remove distractions and draw attention to where it’s desired.
  • It’s a token. A picture is universal - a house or a smiley face means the same thing to people who speak different languages.
  • Most people are visual thinkers. Studies have found humans are hard-wired to process visual information faster. We are better at storing information in images, rather than numbers and letters.

But what if I can’t draw? This is a common issue Bear finds when talking to people about this. It’s not about the quality of the drawing itself; it’s about what it means. By delivering a message through a picture, it becomes understandable. Many of Bear’s drawings only ever remain in draft form. Even simple doodles can have meanings that make concepts clear.

In his design work at Xero, Bear has used drawing and sketches to great effect in a range of instances:

  • The accessibility tree was a complex, abstract system, but by drawing it (on a literal tree), and adding a few notes alongside it, the terminology became much more understandable.
  • Sketching how customers work made it easier to describe how Xero could help them. It was much more memorable than writing it out in paragraphs.
  • Wrapping the year in product design. A written summary of a year’s work is long-winded. Instead, Bear drew a tree and pinned key words, quotes and achievements to communicate the highlights.
  • UX terminology explanations can be difficult for those outside the industry to comprehend. Bear challenged himself to share 1-minute videos accompanied by simple drawings to help colleagues understand the terms, to rave reviews.
  • Sketching notes is a great alternative to writing notes at conferences or meetings. Presenters can draw to help audiences follow along, and people in the audience themselves can also sketch their own notes.

Why it matters  🔥

Bear has adapted visual thinking to his own product design process and has seen a noticeable improvement in communication as a result.

People are busy - their brains are packed with all sorts of information, and they’re easily distracted by other things they have on their minds. By delivering information in a way that helps them to focus on it, remember and understand it, designers can achieve their ultimate goals.

As Bear also notes, drawing is fun. It’s much more rewarding than using words, as well as much more effective.

Bear used the example of his talk at UX New Zealand 2023 as a great place to use a drawing. Rather than follow along with his message by scribbling notes the whole way through, those in the audience could capture the biggest lessons easily in one simple drawing.

  • First, Bear drew one stick figure to represent himself as a speaker. He drew three speech bubbles, where audience members could write the most notable points he said.
  • Then he drew another stick figure, which represented the audience member listening to him. They had three thought bubbles, which people could populate with their biggest takeaways from the speech.

That one simple drawing is a template that can be used in any speech or meeting to remember the key points.

Dan Dixon and Stéphan Willemse: HCD is dead, long live HCD

There is strong backlash about the perceived failures of Human Centred Design (HCD) and its contribution to contemporary macro problems. There seems to be a straightforward connection: HCD and Design Thinking have been adopted by organizations and are increasingly part of product/experience development, especially in big tech. However, the full picture is more complex, and HCD does have some issues.

Dan Dixon, UX and Design Research Director, and Stéphan Willemse, Strategy Director/Head of Strategy, both from the Digital Arts Network, recently spoke at UX New Zealand, the leading UX and IA conference in New Zealand hosted by Optimal Workshop, about the evolution and future of HCD.

In their talk, Dan and Stéphan cover the history of HCD, its use today, and its limitations, before presenting a Post HCD future. What could it be, and how should it be different? Dan and Stéphan help us to step outside of ourselves as we meet new problems with new ways of Design Thinking.

Dan Dixon and Stéphan Willemse bios

Dan is a long-term practitioner of human-centred experience design and has a wealth of experience in discovery and qual research. He’s worked in academic, agency and client-side roles in both the UK and NZ, covering diverse fields such as digital, product design, creative technology and game design. His history has blended a background in the digital industry with creative technology teaching and user experience research. He has taken pragmatic real-world knowledge into a higher education setting as well as bringing deeper research skills from academia into commercial design projects. Dan has been teaching and sharing these skills for the last 16 years in higher education, as well as through talks and workshops.

Stéphan uses creativity, design and strategy to help organizations innovate towards positive, progressive futures. He works across innovation, experience design, emerging technologies, cultural intelligence and futures projects with clients including Starbucks, ANZ, Countdown, TradeMe and the public sector. He holds degrees in PPE, Development Studies, Education and an Executive MBA. However, he doesn’t like wearing a suit and his idea of the perfect board meeting is at a quiet surf break. He thinks ideas are powerful and that his young twins ask the best questions about the world we live in.

Contact Details:

Email: dan.dixon@digitalartsnetwork.com

Find Dan on LinkedIn  

HCD IS DEAD, LONG LIVE HCD 👑

Dan and Stéphan take us through the evolving landscape of Human Centred Design (HCD) and Design Thinking. Can HCD effectively respond to the challenges of the modern era, and can we get ahead of the unintended consequences of our design? They examine the inputs and processes of design, not just the output, to scrutinize the very essence of design practice.

A brief history of HCD

In the 1950s and 1960s, designers began exploring the application of scientific processes to design, aiming to transform it into a systematic problem-solving approach. Later in the 1960s, design thinkers in Scandinavia initiated the shift towards cooperative and participative design practices. Collaboration and engagement with diverse stakeholders became integral to design processes. Then, the 1970s and 1980s marked a shift in perspective, viewing design as a fundamentally distinct way of approaching problems. 

Moving into the late 1980s and 1990s, design thinking expanded to include user-centered design, and the idea of humans and technology becoming intertwined. Then the 2000s witnessed a surge in design thinking, where human-centered design started to make its mark.

Limitations of the “design process”

Dan and Stéphan discuss the “design squiggle”, a concept that portrays the messy and iterative nature of design, starting chaotically and gradually converging toward a solution. For 20 years, beginning in the early 90s, this was a popular way to explain how the design process feels. However, in the past 10 years or so, efforts to teach and pass down design processes have become common practice. Here enter concepts like the “double diamond” and “pattern problem”, which seek to be repeatable and process-driven. These neat processes, however, demand rigid adherence to specific design methods, which can ultimately stifle innovation. 

Issues with HCD and its evolution

The critique of such rigid design processes, which developed alongside HCD, highlights the need to acknowledge that humans are just one element in an intricate network of actors. By putting ourselves at the center of our design processes and efforts, we already limit our design. Design is just as much about the ecosystem surrounding any given problem as it is about the user. A limitation of HCD is that we humans are not actually at the center of anything except our own minds. So, how can we address this limitation?

Post-anthropocentric design starts to acknowledge that we are far less rational than we believe ourselves to be. It captures the idea that there are no clear divisions between ‘being human’ and everything else. This concept has become important as we adopt more and more technology into our lives, and we’re getting more enmeshed in it. 

Post-human design extends this further by removing ourselves from the center of design and empathizing with “things”, not just humans. This concept embraces the complexity of our world and emphasizes how we need to think about the problem just as much as we think about the solution. In other words, post-human design encourages us to “live” in our design problem(s) and consider multiple interventions.

Finally, Dan and Stéphan discuss the concept of Planetary design, which stresses that everything we create, and everything we do, has the possibility to impact everything else in the world. In fact, our designs do impact everything else, and we need to try and be aware of all possibilities.

Integrating new ways of thinking about design

To think beyond HCD and to foster innovation in design, we can begin by embracing emerging design practices and philosophies such as "life-centered design," "society-centered design," and "humanity-centered design." These emerging practices have toolsets that are readily available online and can be seamlessly integrated into your design approach, helping us to break away from traditional, often linear, methodologies. Or, taking a more proactive stance, we can craft our own unique design tools and frameworks.

Why it matters 🎯

To illustrate how design processes can evolve to meet current and future challenges of our time, Dan and Stéphan present their concept of “Post human-centered design” (Post HCD). At its heart, it seeks to take what's great about HCD and build upon it, all while understanding its issues/limitations.

Dan and Stéphan put forward, as a starting point, some challenges for designers to consider as we move our practice to its next phase.

Suggested Post HCD principles:

  • Human to context: Moving from a human-centered to a context-centered or context-sensitive point of view.
  • Design process to design behavior: Not being beholden to design processes like the “double diamond”. Instead of thinking about designing for problems, we should design for behaviors.
  • Problem-solutions to interventions: Thinking more broadly about interventions in the problem space, rather than solutions to the problems.
  • Linear to dynamic: Understanding ‘networks’ and complex systems.
  • Repeated to reflexive: Challenging status quo processes and evolving with the challenges that we’re trying to solve.

The talk wraps up by encouraging designers to incorporate some of this thinking into everyday practice. Some key takeaways are: 

  • Expand your web of context: Don’t just think about things having a center, think about networks.
  • Have empathy for “things”: Consider how you might then have empathy for all of those different things within that network, not just the human elements of the network.
  • Design practice is exploration and design exploration is our practice: Ensure that we're exploring both our practice as well as the design problem.
  • Make it different every time: Every time we design, try to make it different, don't just try and repeat the same loop over and over again.

Kate Keep and Brad Millen: How the relationship between Product Owners and Designers can impact human-centered design

Working in a multi-disciplined product team can be daunting, but how can those relationships be built, and what does that mean for your team, your stakeholders, and the users of the product?

Kate Keep, Product Owner, and Brad Millen, UX Designer, both work in the Digital team at the Accident Compensation Corporation (ACC). They recently spoke at UX New Zealand, the leading UX and IA conference in New Zealand hosted by Optimal Workshop, about their experience working on a large project within an organization that was new to continuous improvement and digital product delivery.

In their talk, Kate and Brad discuss how they were able to pull a team together around a common vision, and three key principles they found useful along the way.

Background on Kate

Kate is a Product Owner working in the Digital team at ACC, and her team currently look after ACC’s Injury Prevention websites. Kate is also a Photographer, which keeps her eye for detail sharp and her passion for excellence alive. She comes from a Contact Centre background which drives her dedication to continuously search for the optimal customer experience. Kate and the team are passionate about accessibility and building websites that are inclusive for all of Aotearoa.

Contact Details:

Email address: kate.keep@acc.co.nz

Background on Brad

Brad is a Digital UX Designer in the Digital team at ACC. Before launching into the world of UX, Brad studied game design, which sparked his interest in the way people interact with, engage with and perceive products. This helped to inform his ethos that you’re always designing with others in mind.

How the relationship between Product Owners and Designers can impact human-centered design 👩🏻💻📓✍🏻💡

Brad and Kate preface their talk by acknowledging that they were both new to their roles and came from different career backgrounds when this project began, which presented a significant challenge. Kate was a Product Owner with no previous delivery experience, while Brad was a UX designer. To overcome these challenges, they needed to quickly figure out how to work together effectively.

Their talk focuses on three key principles that they believe are essential when building a digital product in a large, multi-disciplined team.

Building Trust-Based Relationships 🤝🏻

The first principle emphasizes the importance of building trust-based relationships. They highlight the need to understand each other's perspectives and work together towards a common vision for the customer. This can only be achieved by building a strong sense of trust with everyone on the team. They stress the value of open and honest communication - both within the team and with stakeholders.

Kate, as Product Owner, identified her role as being one of “setting the vision and getting the hell out of the way”. In this way, she avoided putting Brad and his team of designers in a state of paralysis by critiquing decisions all of the time. Additionally, she was clear from the outset with Brad that she needed “ruthless honesty” in order to build a strong relationship.

Cultivating Psychological Safety and a Flat Hierarchy 🧠

The second principle revolves around creating an environment of psychological safety. Kate explains that team members should feel comfortable challenging the status quo and working through disagreements without fear of ridicule. This type of safety improves communication and fast-tracks the project by allowing the team to raise issues without feeling they need to hide and wait for something to break.

They also advocate for a flat hierarchy where everyone has an equal say in decision-making. This approach empowers team members and encourages autonomy. It also means that decisions don’t need to wait for meetings, where juniors are scheduled to report issues or progress to seniors. Instead, all team members should feel comfortable walking up to a manager and, having built a relationship with them, flag what’s on their mind without having to wait. 

This combination of psychological safety and flat hierarchy, coupled with building trust, means that the team dynamic is efficient and productive.

Continuous Focus on the Customer Voice 🔊

The third principle centers on keeping the customer's voice at the forefront of the product development process. Brad and Kate recommend regularly surfacing customer feedback and involving the entire team in understanding customer needs and goals. They also highlight the importance of making customer feedback tangible and visible to all team members and stakeholders.

Explaining why the topic matters 💡

Kate and Brad’s talk sets a firm foundation for building positive and efficient team dynamics. The principles that they discuss champion empowerment and autonomy, which ultimately help multi-disciplined teams to gel when developing digital products. In practice, these principles set the stage for several key advantages.

They stress that building trust is key, not only for the immediate project team but for organizational stakeholders too. It’s just as crucial for the success of the product that all key stakeholders buy into the same way of thinking, i.e. trusting the expertise of the product design and development teams. Kate stresses that sometimes Product Owners need to absorb stakeholder pressure and take failures on the chin so that design teams can do what they do best.

That being said, Kate also realizes that sometimes difficult decisions need to be made when disagreements arise within the project team. This is when the value of building trust works both ways. In other words, Kate, as Product Owner, needed to make decisions in the best interest of the team to keep the project moving.

Psychological safety, in practice, means leading by example and providing a safe environment for people to be honest and feel comfortable enough to speak up when necessary. This can even mean being honest about what scares you. People tend to value this type of honesty, and it establishes common ground by encouraging team members (and key stakeholders) to be upfront with each other.

Finally, keeping the customer's voice front and center is important, not just as design best practice, but also as a way of keeping the project team grounded. Whenever the project experiences a bump in the road, or a breakdown in team communication, Kate and Brad suggest always coming back to the question, “What’s most important to the customer?”. Make user feedback accessible to everyone on the team. This means that the customer's voice can be present throughout the whole project, and no one, including key stakeholders, ever loses sight of the real-life application of the product. In this way, teams are consistently able to work with facts and insights rather than making assumptions that they think are best for the product.

What is UX New Zealand? 🤷

UX New Zealand is a leading UX and IA conference hosted by Optimal Workshop that brings together industry professionals for three days of thought leadership, meaningful networking, and immersive workshops.

At UX New Zealand 2023, we featured some of the best and brightest in the fields of user experience, research and design. A raft of local and international speakers touched on the most important aspects of UX in today’s climate for service designers, marketers, UX writers and user researchers.

These speakers are some of the pioneers leading the way and pushing the standard for user experience today. Their experience and perspectives are invaluable for those working at the coalface of UX, and together, there’s a tonne of valuable insight on offer. 

Usability Experts Unite: The Power of Heuristic Evaluation in User Interface Design

Usability experts play an essential role in the user interface design process by evaluating the usability of digital products from a very important perspective - the users! Usability experts utilize various techniques such as heuristic evaluation, usability testing, and user research to gather data on how users interact with digital products and services. This data helps to identify design flaws and areas for improvement, leading to the development of user-friendly and efficient products.

Heuristic evaluation is a usability research technique used to evaluate the user interface design of a digital product based on a set of ‘heuristics’ or ‘usability principles’. These heuristics are derived from a set of established principles of user experience design - attributed to the landmark article “Improving a Human-Computer Dialogue” published by web usability pioneers Jakob Nielsen and Rolf Molich in 1990. The principles focus on the experiential aspects of a user interface. 

In this article, we’ll discuss what heuristic evaluation is and how usability experts use the principles to create exceptional design. We’ll also discuss how usability testing works hand-in-hand with heuristic evaluation, and how minimalist design and user control impact user experience. So, let’s dive in!

Understanding Heuristic Evaluation

Heuristic evaluation helps usability experts to examine interface design against tried and tested rules of thumb. To conduct a heuristic evaluation, usability experts typically work through the interface of the digital product and identify any issues or areas for improvement based on these broad rules of thumb, of which there are ten. They broadly cover the key areas of design that impact user experience - not bad for an article published over 30 years ago!

The ten principles are:

  1. Error prevention: Well-functioning error messages are good, but instead of messages, can these problems be removed in the first place? Remove the opportunity for slips and mistakes to occur.
  2. Consistency and standards: Language, terms, and actions used should be consistent to not cause any confusion.
  3. Control and freedom for users: Give your users the freedom and control to undo/redo actions and exit out of situations if needed.
  4. System status visibility: Let your users know what’s going on with the site. Is the page they’re on currently loading, or has it finished loading?
  5. Design and aesthetics: Cut out unnecessary information and clutter to enhance visibility. Keep things in a minimalist style.
  6. Help and documentation: Ensure that information is easy to find for users, isn’t too large and is focused on your users’ tasks.
  7. Recognition, not recall: Make sure that your users don’t have to rely on their memories. Instead, make options, actions and objects visible. Provide instructions for use too.
  8. Provide a match between the system and the real world: Does the system speak the same language and use the same terms as your users? If you use a lot of jargon, make sure that all users can understand by providing an explanation or using other terms that are familiar to them. Also ensure that all your information appears in a logical and natural order.
  9. Flexibility: Is your interface easy to use, and is it flexible for users? Ensure your system can cater to users of all types, from experts to novices.
  10. Help users to recognize, diagnose and recover from errors: Your users should not feel frustrated by any error messages they see. Instead, express errors in plain, jargon-free language they can understand. Make sure the problem is clearly stated and offer a solution for how to fix it.

Heuristic evaluation is a cost-effective way to identify usability issues early in the design process (although it can be performed at any stage), leading to faster and more efficient design iterations. It also provides a structured approach to evaluating user interfaces, making it easier to identify usability issues. By providing valuable feedback on overall usability, heuristic evaluation helps to improve user satisfaction and retention.

The Role of Usability Experts in Heuristic Evaluation

Usability experts play a central role in the heuristic evaluation process by providing feedback on the usability of a digital product, identifying any issues or areas for improvement, and suggesting changes to optimize user experience.

One of the primary goals of usability experts during the heuristic evaluation process is to identify and prevent errors in user interface design. They achieve this by applying the principles of error prevention, such as providing clear instructions and warnings, minimizing the cognitive load on users, and reducing the chances of making errors in the first place. For example, they may suggest adding confirmation dialogs for critical actions, ensuring that error messages are clear and concise, and making the navigation intuitive and straightforward.

Usability experts also use user testing to inform their heuristic evaluation. User testing involves gathering data from users interacting with the product or service and observing their behavior and feedback. This data helps to validate the design decisions made during the heuristic evaluation and identify additional usability issues that may have been missed. For example, usability experts may conduct A/B testing to compare the effectiveness of different design variations, gather feedback from user surveys, and conduct user interviews to gain insights into users' needs and preferences.

Conducting user testing with participants who represent actual end users as closely as possible ensures that the product is optimized for its target audience. Check out our tool Reframer, which helps usability experts collaborate and record research observations in one central database.

Minimalist Design and User Control in Heuristic Evaluation

Minimalist design and user control are two key principles that usability experts focus on during the heuristic evaluation process. A minimalist design is one that is clean, simple, and focuses on the essentials, while user control refers to the extent to which users can control their interactions with the product or service.

Minimalist design is important because it allows users to focus on the content and tasks at hand without being distracted by unnecessary elements or clutter. Usability experts evaluate the level of minimalist design in a user interface by assessing the visual hierarchy, the use of white space, the clarity of the content, and the consistency of the design elements. Information architecture (the system and structure you use to organize and label content) has a massive impact here, along with the content itself being concise and meaningful.

Incorporating minimalist design principles into heuristic evaluation can improve the overall user experience by simplifying the design, reducing cognitive load, and making it easier for users to find what they need. Usability experts may incorporate minimalist design by simplifying the navigation and site structure, reducing the number of design elements, and removing any unnecessary content (check out our tool Treejack to conduct site structure, navigation, and categorization research). Consistent color schemes and typography can also help to create a cohesive and unified design.

User control is also critical in a user interface design because it gives users the power to decide how they interact with the product or service. Usability experts evaluate the level of user control by looking at the design of the navigation, the placement of buttons and prompts, the feedback given to users, and the ability to undo actions. Again, usability testing plays an important role in heuristic evaluation by allowing researchers to see how users respond to the level of control provided, and gather feedback on any potential hiccups or roadblocks.

Usability Testing and Heuristic Evaluation

Usability testing and heuristic evaluation are both important components of the user-centered design process, and they complement each other in different ways.

Usability testing involves gathering feedback from users as they interact with a digital product. This feedback can provide valuable insights into how users perceive and use the user interface design, identify any usability issues, and help validate design decisions. Usability testing can be conducted in different forms, such as moderated or unmoderated, remote or in-person, and task-based or exploratory. Check out our usability testing 101 article to learn more.

On the other hand, heuristic evaluation is a method in which usability experts evaluate a product against a set of usability principles. While heuristic evaluation is a useful method to quickly identify usability issues and areas for improvement, it does not involve direct feedback from users.

Usability testing can be used to validate heuristic evaluation findings by providing evidence of how users interact with the product or service. For example, if a usability expert identifies a potential usability issue related to the navigation of a website during heuristic evaluation, usability testing can be used to see if users actually have difficulty finding what they need on the website. In this way, usability testing provides a reality check to the heuristic evaluation and helps ensure that the findings are grounded in actual user behavior.

Usability testing and heuristic evaluation work together in the design process by informing and validating each other. For example, a designer may conduct heuristic evaluation to identify potential usability issues and then use the insights gained to design a new iteration of the product or service. The designer can then use usability testing to validate that the new design has successfully addressed the identified usability issues and improved the user experience. This iterative process of designing, testing, and refining based on feedback from both heuristic evaluation and usability testing leads to a user-centered design that is more likely to meet user needs and expectations.

Conclusion

Heuristic evaluation is a powerful usability research technique that usability experts use to evaluate digital product interfaces based on a set of established principles of user experience design. After all these years, the ten principles of heuristic evaluation still cover the key areas of design that impact user experience, making it easier to identify usability issues early in the design process, leading to faster and more efficient design iterations. Usability experts play a critical role in the heuristic evaluation process by identifying design flaws and areas for improvement, using user testing to validate design decisions, and ensuring that the product is optimized for its intended users.

Minimalist design and user control are two key principles that usability experts focus on during the heuristic evaluation process. A minimalist design is clean, simple, and focuses on the essentials, while user control gives users the freedom and control to undo/redo actions and exit out of situations if needed. By following these principles, usability experts can create an exceptional design that enhances visibility, reduces cognitive load, and provides a positive user experience. 

Ultimately, heuristic evaluation is a cost-effective way to identify usability issues at any point in the design process, leading to faster and more efficient design iterations, and improving user satisfaction and retention. How many of the ten heuristic design principles does your digital product satisfy? 
