
Harnessing AI for Customer Engagement in Energy and Utilities

In today's rapidly evolving utility landscape, artificial intelligence presents unprecedented opportunities to transform customer engagement strategies. However, for UX professionals in the energy and utilities sector, it's crucial to implement these technologies thoughtfully, balancing automation with the human touch that customers still expect and value.

Understanding AI's Role in Customer Engagement

The energy and utilities sector faces unique challenges: managing peak demand periods, addressing complex billing inquiries, and communicating effectively during outages. AI can help address these challenges by:

  • Managing routine inquiries at scale: Chatbots and virtual assistants can handle common questions about billing, service disruptions, or energy-saving tips, freeing human agents for more complex issues.
  • Personalizing customer communications: AI can analyze consumption patterns to deliver tailored energy-saving recommendations or alert customers to unusual usage (see the sketch after this list).
  • Streamlining service processes: Smart algorithms can help schedule maintenance visits or process service changes more efficiently.
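To make the second point concrete, here is a minimal sketch of how an unusual-usage alert might work, flagging a reading that deviates sharply from a customer's recent average. The threshold and field names are illustrative assumptions, not a description of any particular product:

```python
from statistics import mean, stdev

def flag_unusual_usage(daily_kwh: list[float], threshold: float = 2.0) -> bool:
    """Return True if the latest reading deviates sharply from recent history.
    Assumes at least a week of prior daily readings is available."""
    history, latest = daily_kwh[:-1], daily_kwh[-1]
    if len(history) < 7:
        return False  # not enough history to judge
    avg, spread = mean(history), stdev(history)
    return abs(latest - avg) > threshold * spread

# Example: a sudden spike on the last day triggers an alert
readings = [10.2, 9.8, 11.1, 10.5, 9.9, 10.7, 10.1, 24.3]
if flag_unusual_usage(readings):
    print("Notify customer: usage looks unusually high today.")
```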

Finding the Right Balance: AI and Human Interaction

While AI offers significant advantages, implementation requires careful consideration of when and how to deploy these technologies:

Where AI Excels:

  • Initial customer triage: Directing customers to the right department or information resource
  • Data analysis and pattern recognition: Identifying trends in customer behavior or service issues
  • Content creation foundations: Generating initial drafts of communications or documentation
  • 24/7 basic support: Providing answers to straightforward questions outside business hours

Where Human Expertise Remains Essential:

  • Complex problem resolution: Addressing unique or multifaceted customer issues
  • Emotional intelligence: Handling sensitive situations with empathy and understanding
  • Content refinement: Adding nuance, brand voice, and industry expertise to AI-generated content
  • Strategic decision-making: Determining how customer engagement should evolve

Implementation Best Practices for UX Professionals

As you consider integrating AI into your customer engagement strategy, keep these guidelines in mind:

  1. Start with clear objectives: Define specific goals for your AI implementation, whether it's reducing wait times, improving self-service options, or enhancing personalization.
  2. Design transparent AI interactions: Customers should understand when they're interacting with AI versus a human agent. This transparency builds trust and sets appropriate expectations.
  3. Create seamless handoffs: When an AI system needs to transfer a customer to a human agent, ensure the transition is smooth and context is preserved (a sketch of what that context might include follows this list).
  4. Continuously refine AI models: Use feedback from both customers and employees to improve your AI systems over time, addressing gaps in knowledge or performance.
  5. Measure both efficiency and effectiveness: Track not just cost savings or time metrics but also customer satisfaction and resolution quality.
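As an illustration of point 3, here is a minimal sketch of the kind of context a chatbot could pass to a human agent so the customer never has to repeat themselves. The structure and field names are assumptions for illustration, not a description of any specific platform:

```python
from dataclasses import dataclass, field

@dataclass
class HandoffPacket:
    """Illustrative payload passed from a chatbot to a human agent
    (field names are hypothetical)."""
    customer_id: str
    intent: str                       # e.g. "billing_dispute", "outage_report"
    transcript: list[str] = field(default_factory=list)
    collected_details: dict = field(default_factory=dict)

def hand_off_to_agent(packet: HandoffPacket) -> None:
    # In a real system this would push to a CRM or agent console queue.
    print(f"Transferring {packet.customer_id}: {packet.intent}")
    print(f"Details already collected: {packet.collected_details}")

hand_off_to_agent(HandoffPacket(
    customer_id="CUST-123",
    intent="billing_dispute",
    transcript=["Hi, my bill doubled this month", "I checked the meter myself"],
    collected_details={"account_verified": True, "billing_period": "2024-05"},
))
```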

Leveraging Optimal for AI-Enhanced Customer Engagement

Optimal's user insights platform can be instrumental in ensuring your AI implementation truly meets customer needs:

Tree Testing

Before implementing AI-powered self-service options, use Tree Testing to validate your information architecture:

  • Test whether customers can intuitively navigate through AI chatbot decision trees
  • Identify where users expect to find specific information or services
  • Optimize the pathways customers use to reach solutions, reducing frustration and abandonment

Card Sorting

When determining which tasks should be handled by AI versus human agents:

  • Conduct open or closed card sorting exercises to understand how customers naturally categorize different service requests
  • Discover which functions customers feel comfortable entrusting to automated systems
  • Group related features logically to create intuitive AI-powered interfaces that align with customer mental models

First-Click Testing

For AI-enhanced customer portals and apps:

  • Test whether customers can quickly identify where to begin tasks in your digital interfaces
  • Validate that AI-suggested actions are clearly visible and understood
  • Ensure critical functions remain discoverable even as AI features are introduced

Surveys

Gather crucial insights about customer comfort with AI:

  • Measure sentiment toward AI-powered versus human-provided services
  • Identify specific areas where customers prefer human interaction
  • Collect demographic data to understand varying preferences across customer segments

Qualitative Insights

During the ongoing refinement of your AI systems:

  • Capture qualitative observations during user testing sessions with AI interfaces
  • Tag and categorize recurring themes in customer feedback
  • Identify patterns that reveal opportunities to improve AI-human handoffs

Prototype Testing

When developing AI-powered customer interfaces for utilities:

  • Test early-stage prototypes of AI chatbots and virtual assistants to validate conversation flows before investing in full development
  • Capture video recordings of users interacting with prototype AI systems to identify moments of confusion during critical utility tasks like outage reporting or bill inquiries
  • Import wireframes or mockups of AI-enhanced customer portals from Figma to test user interactions with energy usage dashboards, bill payment flows, and outage reporting features

Looking Forward

As AI capabilities continue to evolve, the most successful utility companies will be those that thoughtfully integrate these technologies into their customer engagement strategies. The goal isn't to replace human interaction but to enhance it, using AI to handle routine tasks while enabling your team to focus on delivering exceptional service where human expertise, creativity, and empathy matter most.

By taking a balanced approach to AI implementation, supported by robust UX research tools like those offered by Optimal, UX professionals in the energy and utilities sector can create more responsive, personalized, and efficient customer experiences that meet the needs of today's consumers while preserving the human connection that remains essential to building lasting customer relationships.


Accelerate Study Creation with the New Study Flow

Inspired by insights from Optimal users, we’ve reimagined study creation to bring you a beautifully streamlined experience with the new Study Flow tab.


With the new Study Flow, you’ll: 

Enjoy faster study setup: The Messages & instructions and Questionnaire tabs are combined into a single tab - called Study Flow - for faster editing and settings customization.

✨ See it all at a glance: Easily visualize and understand the participant experience—from the welcome message to the final thank-you screen—every step of the way. 

🎯 Duplicate questions: Save time and quickly replicate questions for surveys, screening questions, and pre- and post-study questions. 

Experience enhanced UI: Enjoy a modern, clean design with intuitive updates that minimize scrolling and reduce mental load.

🗂️ Collapse and expand sections: Easily navigate studies by collapsing and expanding sections, making it easier to build out specific parts of your study.

This Study Flow tab is available across all Optimal tools, except for Qualitative Insights. 


What’s next?

We’re not stopping there. We have some significant improvements on the horizon designed to give you even greater flexibility and control.

Advanced logic: Enhanced logic is one of our most highly requested features, and we’re thrilled to introduce new capabilities to help you build your ideal study experience – available for surveys and other tools. We will first introduce “display logic”, allowing for the rules below (an illustrative sketch follows the list):

  • If answer is X for Question Y, then hide/show Question Z.
  • If answer is X for Question Y, then hide/show specific answer options.
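To make the rule structure concrete, here is a minimal sketch of how display-logic rules like these could be represented and evaluated. It is purely illustrative and does not reflect how Optimal implements the feature:

```python
from dataclasses import dataclass

@dataclass
class DisplayRule:
    """If the answer to `source_question` equals `trigger_answer`,
    show (or hide) `target` - a question or a specific answer option."""
    source_question: str
    trigger_answer: str
    target: str
    show: bool = True  # True = show target, False = hide target

def evaluate(rules: list[DisplayRule], answers: dict[str, str]) -> tuple[set[str], set[str]]:
    """Return the sets of targets to show and to hide, given the answers so far."""
    to_show, to_hide = set(), set()
    for rule in rules:
        if answers.get(rule.source_question) == rule.trigger_answer:
            (to_show if rule.show else to_hide).add(rule.target)
    return to_show, to_hide

rules = [DisplayRule("Q1: Do you use our mobile app?", "Yes", "Q2: How often do you open it?")]
print(evaluate(rules, {"Q1: Do you use our mobile app?": "Yes"}))
# ({'Q2: How often do you open it?'}, set())
```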

Customizable sections: Organize your questions into different sections to build a better study experience for your participants. For example, segment your questions into relevant groupings, such as demographics or product usage. With custom sections, you can add new sections, rename, reorder, duplicate, and move questions between different sections.* 

*Note: Questions cannot be moved to/from the screening questions section.

These upcoming features will empower you to easily create dynamic, tailored study experiences for different audiences - and gather more valuable insights.

Start exploring the new Study Flow now.


Best UX Research Methods for Every Phase of Product Development

What is UX research?

User experience (UX) research, or user research as it’s commonly referred to, is an important part of the product design process. Primarily, UX research involves using different research methods to gather information about how your users interact with your product. It is an essential part of developing, building and launching a product that truly meets the requirements of your users. 

UX research is essential at all stages of a product's life cycle:

  1. Planning
  2. Building
  3. Introduction
  4. Growth & Maturity

While there is no single right time to conduct UX research, it is best practice to gather information continuously throughout the lifetime of your product. The good news is that many UX research methods aren't tied to just one phase and can (and should) be used repeatedly. After all, there are always new pieces of functionality to test and new insights to discover. Below, we introduce best-practice UX research methods for each lifecycle phase of your product.

1. Product planning phase

While the planning phase is about creating a product that fits your organization, meets its needs, and fills a gap in the market, it's also about meeting the needs, desires and requirements of your users. Through UX research you'll learn which features are necessary to stay aligned with your users. And of course, user research lets you test your UX design before you build, saving you time and money.

Qualitative Research Methods

Usability Testing - Observational

One of the best ways to learn about your users and how they interact with your product is to observe them in their own environment. Watch how they accomplish tasks, the order they do things, what frustrates them, and what makes the task easier and/or more enjoyable for them. The data can be collated to inform the usability of your product, improve the intuitiveness of the design, and reveal what resonates with users.

Competitive Analysis

Reviewing products already in the market can be a great start to the planning process. Why are your competitors’ products successful, and how well do they perform for users? Learn from their successes and, even better, build on where they fall short to find your niche in the market.

Quantitative Research Methods

Surveys and Questionnaires

Surveys are useful for collecting feedback or understanding attitudes. You can use the learnings from your survey of a subset of users to draw conclusions about a larger population of users.
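How far those conclusions generalize depends on sample size. As a rough guide, the standard margin-of-error formula for a proportion can be sketched as follows; this is generic statistics, not an Optimal feature:

```python
import math

def margin_of_error(sample_size: int, proportion: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# Example: 58% of 200 respondents preferred the new layout
print(f"±{margin_of_error(200, 0.58):.1%}")  # roughly ±6.8%
```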

There are two types of survey questions:

Closed questions are designed to capture quantitative information. Instead of asking users to write out answers, these questions often use multi-choice answers.

Open questions are designed to capture qualitative information such as motivations and context. Typically, these questions require users to write out an answer in a text field.

2. Product building phase

Once you've completed your product planning research, you’re ready to begin the build phase for your product. User research studies undertaken during the build phase enable you to validate the UX team’s deliverables before investing in the technical development.

Qualitative Research Methods

Focus groups

Focus groups generally involve 5-10 demographically similar participants. The study is set up so that members of the group can interact with one another, and it can be carried out in person or remotely.


Besides learning about the participants’ impressions and perceptions of your product, focus group findings also include what users believe to be a product’s most important features, problems they might encounter while using the product, as well as their experiences with other products, both good and bad.

Quantitative Research Methods

Card sorting gives insight into how users think. Tools like card sorting reveal where your users expect to find certain information or complete specific tasks. This is especially useful for products with complex or multiple navigations and contributes to the creation of an intuitive information architecture and user experience.

Tree testing gives insight into where users expect to find things and where they’re getting lost within your product. Tools like tree testing help you test your information architecture.

Card sorting and tree testing are often used together. Depending on the purpose of your research and where you are at with your product, they can provide a fully rounded view of your information architecture.

3. Product introduction phase

You’ve launched your product, wahoo! And you’re ready for your first real life, real time users. Now it’s time to optimize your product experience. To do this, you’ll need to understand how your new users actually use your product.

Qualitative Research Methods

Usability testing involves testing a product with users. Typically it involves observing users as they try to follow and complete a series of tasks. As a result you can evaluate if the design is intuitive and if there are any usability problems.

User Interviews - A user interview is designed to get a deeper understanding of a particular topic. Unlike a usability test, where you’re more likely to be focused on how people use your product, a user interview is a guided conversation aimed at better understanding your users. This means you’ll be capturing details like their background, pain points, goals and motivations.

Quantitative Research Methods

A/B Testing is a way to compare two versions of a design in order to work out which is more effective. It’s typically used to test two versions of the same webpage, for example, using a different headline, image or call to action to see which one converts more effectively. This method offers a way to validate smaller design choices where you might not have the data to make an informed decision, like the color of a button or the layout of a particular image.
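For deciding whether the difference between versions A and B is meaningful rather than noise, a two-proportion z-test is one common, generic approach. A minimal sketch with illustrative numbers:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic comparing the conversion rates of two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Example: variant B converts 130/1000 visitors vs A's 95/1000
z = two_proportion_z(95, 1000, 130, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```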

First-click testing shows you where people click first when trying to complete a task on a website. In most cases, first-click testing is performed on a very simple wireframe of a website, but it can also be carried out on a live website.

4. Growth and maturity phase

If you’ve reached the growth stage, fantastic news! You’ve built a great product that’s been embraced by your users. Next on your to-do list is growing your product by increasing your user base and then eventually reaching maturity and making a profit on your hard work.

Growing your product involves building new or advanced features to satisfy specific customer segments. As you plan and build these enhancements, go through the same research and testing process you used to create the first release. The same holds true for enhancements as well as a new product build — user research ensures you’re building the right thing in the best way for your customers.

Qualitative research methods

User interviews will focus on how your product is working or if it’s missing any features, enriching your knowledge about your product and users.

This research allows you to test your current features, discover possibilities for additional features, and identify existing ones worth discarding. If your customers aren’t using certain features, it might be time to stop supporting them to reduce costs and help grow your profits during the maturity stage.

Quantitative research methods

Surveys and questionnaires can help gather information around which features will work best for your product, enhancing and improving the user experience. 

A/B testing during growth and maturity occurs within your sales and onboarding processes. Making sure you have a smooth onboarding process increases your conversion rate and reduces wasted spend — improving your bottom line.

Final Thoughts: Why Continuous UX Research Matters

UX research testing throughout the lifecycle of your product helps you continuously evolve and develop a product that responds to what really matters - your users.

Talking to, testing, and knowing your users will allow you to push your product in ways that make sense, with the data to back up decisions. Go forth and create the product that meets your organization's needs by delivering the very best user experience for your users.


Ready for take-off: Best practices for creating and launching remote user research studies

"Hi Optimal Work,I was wondering if there are some best practices you stick to when creating or sending out different UX research studies (i.e. Card sorts, Prototyye Test studies, etc)? Thank you! Mary"

Indeed I do! Over the years I’ve learned a lot about creating remote research studies and engaging participants. That experience has taught me a lot about what works, what doesn’t and what leaves me refreshing my results screen eagerly anticipating participant responses and getting absolute zip. Here are my top tips for remote research study creation and launch success!

Creating remote research studies

Use screener questions and post-study questions wisely

Screener questions are really useful for eliminating participants who may not fit the criteria you’re looking for but you can’t exactly stop them from being less than truthful in their responses. Now, I’m not saying all participants lie on the screener so they can get to the activity (and potentially claim an incentive) but I am saying it’s something you can’t control. To help manage this, I like to use the post-study questions to provide additional context and structure to the research.

Depending on the study, I might ask questions to which the answers might confirm or exclude specific participants from a specific group. For example, if I’m doing research on people who live in a specific town or area, I’ll include a location based question after the study. Any participant who says they live somewhere else is getting excluded via that handy toggle option in the results section. Post-study questions are also great for capturing additional ideas and feedback after participants complete the activity as remote research limits your capacity to get those — you’re not there with them so you can’t just ask. Post-study questions can really help bridge this gap. Use no more than five post-study questions at a time and consider not making them compulsory.

Do a practice run

No matter how careful I am, I always miss something! A typo, a card with a label in the wrong case, forgetting to update a new version of an information architecture after a change was made — stupid mistakes that we all make. By launching a practice version of your study and sharing it with your team or client, you can stop those errors dead in their tracks. It’s also a great way to get feedback from the team on your work before the real deal goes live. If you find an error, all you have to do is duplicate the study, fix the error and then launch. Just keep an eye on the naming conventions used for your studies to prevent the practice version and the final version from getting mixed up!

Sending out remote research studies

Manage expectations about how long the study will be open for

Something that has come back to bite me more than once is failing to clearly explain when the study will close. Understandably, participants can be left feeling pretty annoyed when they mentally commit to complete a study only to find it’s no longer available. There does come a point when you need to shut the study down to accurately report on quantitative data and you’re not going to be able to prevent every instance of this, but providing that information upfront will go a long way.

Provide contact details and be open to questions

You may think you’re setting yourself up to be bombarded with emails, but I’ve found that isn’t necessarily the case. I’ve noticed I get around 1-3 participants contacting me per study. Sometimes they just want to tell me they completed it and potentially provide additional information and sometimes they have a question about the project itself. I’ve also found that sometimes they have something even more interesting to share such as the contact details of someone I may benefit from connecting with — or something else entirely! You never know what surprises they have up their sleeves and it’s important to be open to it. Providing an email address or social media contact details could open up a world of possibilities.

Don’t forget to include the link!

It might seem really obvious, but I can’t tell you how many emails I’ve received (and have been guilty of sending out) that are missing the damn link to the study. It happens! You’re so focused on getting that delivery right that it becomes really easy to miss that final yet crucial piece of information.

To avoid this irritating mishap, I always complete a checklist before hitting send:

  • Have I checked my spelling and grammar?
  • Have I replaced all the template placeholder content with the correct information?
  • Have I mentioned when the study will close?
  • Have I included contact details?
  • Have I launched my study and received confirmation that it is live?
  • Have I included the link to the study in my communications to participants?
  • Does the link work? (yep, I’ve broken it before)

General tips for both creating and sending out remote research studies

Know your audience

First and foremost, before you create or disseminate a remote research study, you need to understand who it’s going to and how they best receive this type of content. Posting it out when none of your followers are in your user group may not be the best approach. Do a quick brainstorm about the best way to reach them. For example, if your users are internal staff, there might be an internal communications channel - such as an all-staff newsletter, intranet or social media site - where you can share the link and approach content.

Keep it brief

And by that I’m talking about both the engagement mechanism and the study itself. I learned this one the hard way. Time is everything and no matter your intentions, no one wants to spend more time than they have to. Even more so in situations where you’re unable to provide incentives (yep, I’ve been there). As a rule, I always stick to no more than 10 questions in a remote research study and for card sorts, I’ll never include more than 60 cards. Anything more than that will see a spike in abandonment rates and of course only serve to annoy and frustrate your participants. You need to ensure that you’re balancing your need to gain insights with their time constraints.

As for the accompanying approach content, short and snappy equals happy! In the case of an email, website, other social media post, newsletter, carrier pigeon etc, keep your approach spiel to no more than a paragraph. Use an audience appropriate tone and stick to the basics such as: a high level sentence on what you’re doing, roughly how long the study will take participants to complete, details of any incentives on offer and of course don’t forget to thank them.

Set clear instructions

The default instructions in Optimal Workshop’s suite of tools are really well designed and I’ve learned to borrow from them for my approach content when sending the link out. There’s no need for wheel reinvention and it usually just needs a slight tweak to suit the specific study. This also helps provide participants with a consistent experience and minimizes confusion allowing them to focus on sharing those valuable insights!

Create a template

When you’re on to something that works — turn it into a template! Every time I create a study or send one out, I save it for future use. It still needs minor tweaks each time, but I use them to iterate my template.

What are your top tips for creating and sending out remote user research studies? Comment below!


67 ways to use Optimal for user research

User research and design doesn’t fail because teams don’t care – it fails because there’s rarely time to explore every option. When deadlines pile up, most teams default to the same familiar research patterns and miss opportunities to get more value from the tools they already have.

We’ve brought together practical, real-world ways to use Optimal – from tree testing and first-click testing to card sorting, surveys, prototype testing, and interviews. Some of these use cases are obvious, but many aren’t. All of them are designed to help teams move faster, reduce risk, and turn user insights into decisions stakeholders trust.

We’ve focused on quick wins and flexible examples you can adapt to your own context – whether you’re benchmarking navigation, validating early designs, improving conversion flows, prioritizing work, or proving the ROI of UX. You don’t need more tools or more processes. You just need smarter ways to use what you already have.

Let’s get into it.

Practical ways to use Optimal for user research and UX design

#1 Benchmark your information architecture (IA)

Without a baseline for your navigation or information architecture (IA), you can’t easily tell if any changes you make have a positive effect. If you haven’t done so, benchmark your existing website with tree testing now. Upload your site structure and get results the same day. Now you’ll have IA scores to beat each month. Easy.

#2 Find out precisely where people get lost

Watch video recordings of real people interacting with your sites with live site testing. Combine this with surveys and user interviews to understand where users struggled. You can also use the tree testing pietree to find out exactly where people are getting lost in your website structure and where they go instead.

#3 Start with one screenshot

If you’re just not sure where to begin then take a screenshot of your homepage, or any page that you think might have some issues and get going with first-click testing. Write up a string of things that people might want to do when they find themselves on this page and use these as your tasks. Surprise all your colleagues with a maddening heatmap or video recordings showing where people actually clicked in response to your tasks or where they struggle. Now you’ll have a better idea of which area of your site to focus on for your next step.

#4 Test live sites during discovery

You can run live site testing as part of your discovery phase to baseline your live experiences and see how well your current site supports real user goals. Test competitors' sites to see how you stack up. You’ll quickly uncover opportunities to differentiate your site, all before a single wireframe is drawn. All that's required is a URL and then you're set to go. No code needed.

#5 A/B test your site structure

Tree testing is great for testing more than one content structure. It’s easy to run two separate tree testing studies, even more than two. It’ll help you decide which structure you and your team should run with, and it won’t take you long to set them up.

#6 Optimize sign-up flows

Discover how easy (or not) it is for users to navigate your sign up experience to ensure it works exactly as intended. Create a live site or prototype test to identify any confusion or points of friction. You could also use this test to understand users' first impressions of your home or landing page. Where do they click first and what information is valuable to them?

#7 Make collaborative design decisions‍

Use surveys, first-click tests, and card sorting to get your team involved and let their feedback feed your designs: logos, icons, banners, images, the list goes on... For example, by creating a closed image sort with categories, your team can group designs based on their preferences, giving you quick feedback to help you figure out where to focus your efforts.

#8 Do your (market) research

Get a better sense of your users’ and customers’ motivations with surveys and user interviews. You can also find out what people actually want to see on your website by conducting an image card sort of potential products. By providing categories like ‘I would buy this’ and ‘I wouldn’t buy this’ to indicate preferences for each item, you can figure out what types of products appeal to your customers.

#9 Customer satisfaction surveys with surveys and interviews

The thoughts and feelings of your users are always important. A simple survey or user interview can help you take a deeper look at your checkout process, a recently launched product or service, or even the packaging your product arrives in. Your options are endless.

#10 Start testing prototypes

Companies that incorporate prototype testing in their design process can reduce development costs by 33%. Use prototype testing to ensure your designs hit the mark before you invest too heavily in the build. Build your own prototype with images in Optimal or import a Figma file. You can even test AI-generated prototypes from tools like Lovable or Magic Patterns by dropping the URL into live site testing.

#11 Crowdsource content ideas

Whether you’re running a blog or a UX conference, surveys can help you generate content ideas and understand any knowledge gaps that might be out there. Figure out what your users and attendees like to read on your blog, or what they want to hear about at your event, and let this feed into what you offer.

#12 Evaluate user flows

Sometimes a change in your product or service means you have to change how it’s presented to your existing customers.  Ensure your customers understand the changes to your product or service with prototype and live site testing. Identify issues with user flow, content, or layout that may confuse them. Discover which options they’re most likely to choose with the updates. Uncover what truly matters to your customers.

#13 Quantify the return on investment of UX

Some people, including UX Agony Aunt, define return on UX as time saved, money made, and people engaged. By attaching a value to the time spent completing tasks, or to successful completion of tasks, you can approximate an ROI or at least illustrate the difference between two options.
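A hedged, back-of-the-envelope version of that arithmetic, with all numbers purely illustrative:

```python
def annual_time_savings(users: int, tasks_per_user: int,
                        seconds_saved_per_task: float, hourly_cost: float) -> float:
    """Rough annual value of time saved after a UX improvement (illustrative only)."""
    hours_saved = users * tasks_per_user * seconds_saved_per_task / 3600
    return hours_saved * hourly_cost

# Example: 50,000 users, 12 tasks a year, 30 seconds saved each, valued at $40/hour
print(f"${annual_time_savings(50_000, 12, 30, 40):,.0f}")  # $200,000
```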

#14 Convince your stakeholders with highlight reels

User interviews are teeming with insights but can be time and resource intensive to analyze without automation. Use Optimal's Interviews tool to capture key moments, reactions, and pain points with automated highlight reels and clips. These are perfect for storytelling, stakeholder buy-in, and keeping teams connected to who they’re building for.

#15 Prioritize upcoming work 

Survey your organization to build a list of ideas for upcoming work. Understand your audience’s priorities with card sorting to inform your feature development. Categorize your upcoming work ideas to decide collectively what’s best to take on next. Great for clarifying what the team considers the most valuable or pressing work to be done.

#16 Reduce content on landing pages to what people access regularly

Before you run an open card sort to generate new category ideas, you can run a closed card sort to find out if you have any redundant content. Say you wanted to simplify the homepage of your intranet. You can ask participants to sort cards (containing homepage links) based on how often they use them. You could compare this card sort data with analytics from your intranet and see if people’s actual behavior and perception are well aligned.

#17 Create tests to fit in your onboarding process

Onboarding new customers is crucial to keeping them engaged with your product, especially if it involves your users learning how to use it. You can set up a quick study to help your users stay on track with onboarding. For example, say your company provides online email marketing software. You can set up a first-click testing study using a photo of your app, with a task asking your participants where they’d click to see the open rates for a particular email that went out.

#18 Input your learnings and observations from a UX conference with qualitative insights

If you're lucky enough to attend a UX conference, you can now share the experience with your colleagues. You can easily jot down ideas, quotes and key takeaways in a Qualitative Insights project and keep your notes organized by using a new session for each presenter. Bonus: if you’re part of a team, they can watch the live feed rolling into Qualitative Insights!


#19 Multivariate testing

Tree testing and first-click testing allow you to compare multiple versions of content structures, designs, or flows. You can also compare how users engage with different live websites in one study. This helps decide the best-performing option without guessing.

#20 Do some sociological research

Using card sorting for sociological research is a great way to deepen your understanding of how different groups may categorize information. For example, by looking at how young people group popular social media platforms, you can understand the relationships between them, and identify where your product may fit in the mix. Then, follow up with surveys or moderated interviews for deeper insights. 

#21 Test your FAQs page with new users

Your support and knowledge base within your website can be just as important as any other core action on your website. If your support site is lacking in navigation and UX, this will no doubt increase support tickets and resources. Make sure your online support section is up to scratch. Here’s an article on how to do it quickly.

#22 Establish which tags or filters people consider to be the most important

Create a card sort with your search filters or tags as labels, and have participants rank them according to how important they consider them to be. Analytics can tell you half of the story (where people actually click), so the card sort can give another side: a better idea of what people actually think or want. Follow up with surveys or interviews to confirm insights.

#23 Figure out if your icons need labels

Figure out if your icons are doing their job by testing whether your users understand them as intended. Upload icons you currently use, or plan to use, in your interface to first-click testing, and ask your users to identify their meaning using post-task questions.

#24 Get straight to the aha! moments

Optimal Interviews gives you automated insights but you can also engage with AI Chat to dive deeper. Ask AI specific questions about a feature or process or request quotes or examples. Then, get highlight reels and clips to match.


#25 Improve website conversions

Make the marketing team’s day with a fast improvement to some core conversions on your website. There are loads of ways to improve conversions for a checkout cart or signup form, but using first-click testing to try out ideas before you launch a live A/B test can take mere minutes and give your B version a confidence boost. For deeper insights, try a live site test.

#26 Test your mobile experience or web app

As more and more people use their smartphones for apps and to browse sites, you need to ensure your mobile design gives your users a great experience. Test your mobile site to ensure people aren’t getting lost in the mobile version of your site. If you haven’t got a mobile-friendly design yet, now’s the time to start designing it!

#27 Get automated transcripts

Have a number of interviews you need to transcribe quickly? Upload up to 20 interviews at once in Optimal Interviews and get automated transcripts, so you can spend less time on admin and more time digging into insights.

#28 Reduce the bounce rates of certain sections of your website‍

People jumping off your website without continuing their experience is something (depending on the landing page) everyone tries to improve. Metrics like ‘time on site’ and ‘average page views’ show the value your whole website has to offer. There are many ways to tackle this, but one big reason people leave a website is not being able to find what they’re looking for. Use prototype testing or live site testing to watch users in action and understand where things break down.

#29 Test your website in different countries‍

No, you don’t have to spend thousands of dollars to travel to all these countries to test, although that’d be pretty sweet. You can run remote research with participants from all over the world, using our integrated recruitment panel. Start seeing how different cultures, languages, and countries interact with your website.

#30 Preference test

Whether you’re coming up with a new logo design, headline, featured image, or anything, you can preference test it with first-click testing. Create an image that shows the two designs side by side and upload it to first-click testing. From there, you can ask people to click whichever one they prefer!  If you want to track multiple clicks per task or watch recordings, use prototype testing instead.


#31 Test visual hierarchy with first-click testing

Use first-click testing to understand which elements draw users' attention first on your page. Upload your design and ask participants to click on the most important element, or what catches their eye first. The resulting heatmap will show you if your visual hierarchy is working as intended - are users clicking where you expect them to? This technique helps validate design decisions about sizing, color, positioning, and contrast without needing to build the actual page.
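If you export raw click coordinates, a simple way to sanity-check visual hierarchy is to bin first clicks into a coarse grid and see which regions dominate. This is a minimal sketch; the export format and normalized coordinates are assumptions:

```python
from collections import Counter

def click_grid(clicks: list[tuple[float, float]], cols: int = 3, rows: int = 3) -> Counter:
    """Bin normalized (x, y) first-click coordinates (0-1 range) into a rows x cols grid."""
    grid = Counter()
    for x, y in clicks:
        cell = (min(int(y * rows), rows - 1), min(int(x * cols), cols - 1))
        grid[cell] += 1
    return grid

clicks = [(0.12, 0.08), (0.15, 0.10), (0.80, 0.85), (0.14, 0.09)]
for (row, col), count in click_grid(clicks).most_common():
    print(f"row {row}, col {col}: {count} first clicks")
```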


#32 Tame your blog or knowledge base

Get the tags and categories in your blog under control to make life easier for your readers. Set up a card sort and use all your tags and categories as card labels. Either use your existing ones or test a fresh set of new tags and categories.

#33 Use AI Chat for stakeholder-ready outputs

Use AI-powered chat to instantly reformat interview insights and fast-track deliverables for different audiences. Simply specify the details of the deliverable you would like. For example: “Turn this into a 3-sentence Slack summary (no citations).” or “Rewrite this as an exec-ready insight with a clear recommendation.”

‍#34 Validate the designs in your head

As designers, you’ve probably got umpteen designs floating around in your head at any one time. But which of these are really worth pursuing? Figure this out by using Optimal to test out wireframes of new designs before putting any more work into them.

#35 Optimize the support escalation flow

Understand how users navigate help resources, report issues, and conceptualize support categories, especially when they need to locate assistance quickly in time-sensitive situations.

#36 Improve your search engine optimization (SEO) with tree testing

Yes, a good IA improves your SEO. Tree testing helps you understand how people navigate throughout your site. It also helps search engines better understand and index your content, making it more discoverable and relevant in search results. Make sure people can easily find what they’re looking for, and you’ll start to see improvement in your search engine ranking.

#37 Prioritize features and get some help for your roadmap

Find out what people think are the most important next steps for your team. Set up a survey or card sort and ask people to categorize items and rank them in descending order of importance or impact on their work. This can also help you gauge their thoughts on potential new features for your site, and for bonus points compare team responses with customer responses.

#38 Define your brand tone of voice

Use a card sort to understand how people perceive your brand, so you can shape or refine your brand personality, tone of voice, and style guidelines. Run this with stakeholders or your audience to uncover current perceptions and where they’d like your brand to go next.

#39 Run an Easter egg hunt using the correct areas in first-click testing

Liven up the workday by creating a fun Easter egg hunt in first-click testing. Simply upload a photo (like those really hard “spot the X” photos), set the correct area of your target, then send out your study with participant identifiers enabled. You can also send these out as competitions and have closing rules based on time, number of participants, or both.

#40 Test your home button

Would an icon or text link work better for navigating to your home page? Before you go ahead and make changes to your site, you can find out by setting up a first-click testing test.

#41 Improve team structure and clarify role expectations

Run a card sort, survey, or internal interviews to understand how responsibilities are perceived across different roles. Work with team leaders and managers to clarify role definitions, reporting lines, and decision-making authority. This helps uncover overlapping responsibilities and opportunities to streamline management and support team workflows.

#42 ‘Buy now’ button shopping cart visibility‍

If you’re running an e-commerce site, ease of use and a great user experience are crucial. To see if your shopping cart and checkout processes are as good as they can be, look into running a live site, prototype or first-click test.

#43 Website periodic health checks

Raise the visibility of good IA by running periodic IA health checks using tree testing and reporting the results. Proactively identifying structural issues early, and backing decisions with clear metrics, helps drive alignment and build confidence across stakeholders.

‍#44 Use heatmaps to get the first impressions of designs

Heatmaps in our first-click testing tool are a great way of getting first impressions of any design. You can see where people clicked (correctly and incorrectly), giving you insights on what works and doesn’t work with your designs. Because it’s so fast to test, you can iterate until your designs start singing.

#45 Focus groups with interviews

Thinking of launching a new product, app or website, or seeking opinions on an existing one? Remote focus groups can provide you with a lot of candid information that may help get your project off the ground. They’re also dangerous because they’re susceptible to groupthink, design by committee, and tunnel vision. Use with caution, but if you do then upload your recordings to Interviews for automated insights! Find patterns across sessions and use AI Chat to dig deeper. Pay attention to emotional triggers.

#46 Gather opinions with surveys

Whether you want the opinions of your users or from members of your team, you can set up a quick and simple survey. It’s super useful for getting opinions on new ideas (consider it almost like a mini-focus group), or even for brainstorming with teammates.

#47 Prioritize content

Use a card sort to understand what content matters most to people, so you can plan what to write first. Ask participants which information is most useful or which tasks they do most often. You can also run this after a top tasks survey to help shape your long list of content.

#48 Test a new concept

Got an idea you want to sanity-check before investing more time? Use surveys, first-click testing, or prototype testing to see if people understand the concept and find it valuable. A quick test now can save a lot of rework later.


#49 Run an image card sort to organize products into groups

You can add images to each card, allowing you to understand how your participants might organize and label particular items. This is very useful if you want to organize retail products and find out how other people would group them given visual cues such as shape, color, and other context.

#50 Guerrilla testing with first-click testing

For really quick results, take first-click testing on a tablet, mobile device or laptop to a local coffee shop. Ask people standing in line if they’d like to take part in your super quick test in exchange for a cup of joe. Easy!

#51 Test your search box

Case study by Viget: “One of the most heavily used features of the website is its keyword search, so we wanted to make absolutely certain that our redesigned search box didn’t make search harder for users to find and use.” Use first-click testing to test different variations. 

#52 Run a Net Promoter Score (NPS) survey

Optimal surveys give you plenty of question options, but one of the simplest ways to take the pulse of your product is an NPS survey to find out how likely customers are to recommend your product or brand. Use the out-of-the-box NPS question type to quickly understand customer sentiment and track it over time.
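NPS itself is simple arithmetic: the share of promoters (scores of 9-10) minus the share of detractors (scores of 0-6). A quick sketch of computing it from raw 0-10 responses:

```python
def net_promoter_score(scores: list[int]) -> float:
    """NPS = % promoters (9-10) minus % detractors (0-6), on a -100 to 100 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))  # 25.0
```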

#53 Run an empathy test

Empathy – the ability to understand and share the experience of another person – is central to the design process. An empathy test is another great tool to use in the design phase because it enables you to find out if you are creating the right kind of feelings with your user. Take your design and show it to users. Provide them with a variety of words that could represent the design – for example “minimalistic”, “dynamic”, or “professional” – and ask them to pick out the words they think best suit their experience.

#54 Compare and test email designs

Drop your email designs into first-click testing to see which version people prefer and where they click first. Use these insights to refine your layout, hierarchy, and calls to action to improve engagement and conversions.

#55 Source specific data with an online survey

Online survey tools can complement your existing research by sourcing specific information from your participants. For example, if you need to find out more about how your participants use social media, which sites they use, and on which devices, you can do it all through a simple survey questionnaire. Additionally, if you need to identify usage patterns, device preferences or get information on what other products/websites your users are aware of/are using, a questionnaire is the ticket.

#56 Make sure you get the user's first-click right

Like most things, read a little, and then it’s all about practice. We’ve found that people who get the first click correct are almost three times as likely to complete a task successfully. Get your first clicks right in tree testing and first-click testing and you’ll start seeing your customers smile.

#57 Destroy evil attractors in your tree

Evil attractors are those labels in your IA that attract unjustified clicks across tasks. This usually means the chosen label is ambiguous, or possibly a catch-all phrase like ‘Resources’. Read how to quickly identify evil attractors in the Destinations table of tree test results and how to fix them.


#58 Ensure accessibility and inclusion

Check how people with different physical, visual, or cognitive needs move through your content, and spot any areas that might slow them down or cause confusion. Use what you uncover to remove friction and support all users.

#59 Add moderated card sort results to your card sort‍

An excellent way of gathering valuable qualitative insights alongside the results of your remote card sorts is to run a moderated version of the sorts with a smaller group of participants. When you can observe and interact with your participants as they complete the sort, you’ll be able to ask questions and learn more about their thought processes and the reasons why they have categorized things in a particular way.

#60 Test your customers' perceptions of different logo and brand image designs

Understand how customers perceive your brand by creating a closed card sort. Come up with a list of categories, and ask participants to sort images such as logos and branded imagery into them.

#61 Run an open image card sort to classify images into groups based on the emotions they elicit

‍Are these pictures exhilarating, or terrifying? Are they humorous, or offensive? Relaxing, or boring? Productive, or frantic? Happy memories, or a deep sigh?

#62 Crowd-source the values you want your team/brand/product to represent

Card sorting is a well-established technique in the ‘company values’ realm, and there are some great resources to help you and your team brainstorm the values you represent. These ‘in-person’ brainstorm sessions are great, and you can run a remote closed card sort to support your findings. And if you want feedback from more than a small group of people (if your company has, say, more than 15 staff) you can run a remote closed card sort on its own. Use Microsoft’s Reaction Card Method as card inspiration.

#63 Test physical and digital experiences together

Use recorded videos and interviews to observe people interacting with physical products, kiosks, or mobile apps in real-world contexts. Record sessions, capture moments of friction, and bring those insights back into Optimal’s Interviews tool for automated insights.

#64 HR exercises to determine the motivations of your team

It’s simple to ask your team about their thoughts, feelings, and motivations with a survey. You can choose to leave participant identifiers blank (so responses are anonymous), or you can ask for a name/email address. As a bonus, you can set up a calendar reminder to send out a new survey in the next quarter. Duplicate the survey and send it out again!

#65 Designing physical environments

If your company has a physical environment that your customers visit, you can research new structures using a mixture of tools in Optimal. This especially comes in handy if your customers require certain information within the physical environment in order to make decisions. For example, picture a retail store. Are all the signs clear, and do they communicate the right information? Are people overwhelmed by the physical environment?

#66 Run an image card sort to organize your library

Whether it’s a physical library of books, or a digital drive full of ebooks, you can run a card sort to help organize them in a way that makes sense. Will it be by genre, author name, color or topic? Send out the study to your coworkers to get their input! You can also do this at home for your own personal library, and you can include music/CDs/vinyl records and movies!

#67 Use tree testing to refine an interactive phone menu system

Similar to how you’d design an IA, you can create a tree test to design an automated phone system. Whether you’re designing from the ground up, or improving your existing system, you will be able to find out if people are getting lost.

Practical ways to use Optimal for user research (and get value fast)

And that’s the list. This is not everything you can do with Optimal, but a solid reminder that meaningful user insights don’t have to be slow, heavy, or overcomplicated. Small, well-timed studies can uncover friction, validate decisions, and create momentum across teams.

Ready to get started?

Have a creative use case we missed? Let us know, we’re always learning from the ways our customers push research further, faster, and smarter.


Transforming digital experiences: Optimal Workshop's radical refresh

At Optimal Workshop, we've always prided ourselves on being pioneers in the field of UX research tools. Since our inception, we've been the go-to platform for UX researchers and designers looking to conduct card sorting, tree testing, and other critical user research activities. Our tools have helped shape the digital experiences of some of the world's leading companies.

The digital landscape is ever-changing, and with it, the needs of the professionals who create and optimize digital experiences. As we've grown and evolved alongside the industry, we've recognized the need to expand our reach and refine our value proposition to better serve not just UX researchers, but all professionals involved in delivering exceptional digital experiences.

That's why we're excited to announce a significant brand refresh and value pivot for Optimal Workshop. This evolution isn't just about a new coat of paint – it's a fundamental shift in how we position ourselves and the value we provide to our users and the broader digital experience community.

Why fix it if it ain’t broke? 💔🔨

Expanding Our User Base

While we've built a strong reputation among UX researchers, we've recognized that there's a broader audience of professionals who could benefit from our tools and expertise. Product managers, marketers, customer experience specialists, and others involved in shaping digital experiences often lack access to professional-grade user research tools or the knowledge to effectively implement them.

Our brand refresh aims to make Optimal Workshop more accessible and relevant to this wider circle of professionals, without losing the deep respect we've earned from UX experts.

Adapting to market changes

The UX research landscape has evolved significantly since we first entered the market. With the proliferation of prototyping tools and the increasing speed of digital product development, there's been a shift away from extensive upfront research. However, we firmly believe that incorporating the voice of the customer throughout the development process is more critical than ever.

Our pivot allows us to emphasize the importance of integrating user research and insights at various stages of the digital experience lifecycle, not just during initial design phases.

Leveraging our expertise

As pioneers in the field, we have accumulated a wealth of knowledge about best practices in UX research and digital experience optimization. Our brand refresh positions us not just as a tool provider, but as a trusted partner and thought leader in the industry. We're doubling down on our commitment to sharing knowledge, standardizing best practices, and elevating the work of exceptional practitioners in the field.

Meeting evolving business needs through effective UX

The UX industry is evolving rapidly, with increasing investment and a diverse range of roles getting involved. From junior designers leveraging AI to seasoned researchers with deep expertise, the landscape is rich with talent and tools. However, the true value lies in how effectively these resources translate into business outcomes.

Optimal Workshop recognizes that exceptional digital experiences are no longer just nice-to-have – they're critical for engagement, conversion, and overall business success. Our tools bridge the gap between UX insights and tangible ROI by:

  • Democratizing research: Enabling teams across experience levels to gather valuable user data quickly and efficiently.
  • Accelerating decision-making: Providing fast, actionable insights that reduce design iterations and time-to-market.
  • Enhancing team effectiveness: Facilitating collaboration and knowledge sharing between junior and senior team members.
  • Driving business value: Directly linking UX improvements to key performance indicators and bottom-line results.

In a landscape where basic UX practices are becoming table stakes, Optimal Workshop empowers organizations to go beyond the basics. We help teams leverage the full spectrum of UX expertise – from AI-assisted analysis to seasoned human insight – to create digital experiences that truly set businesses apart and deliver measurable returns on UX investment.

What's changing at Optimal Workshop 🐛🦋

1. Best-in-class research & insights platform

While our core tools remain a critical part of our offering, we're broadening our focus to position Optimal Workshop as a best-in-class research & insights platform for digital experience professionals. This means developing new tools, workflows, and integrations that cater to a wider range of use cases and user types.

2. Accessibility without compromise

We're committed to making our professional-grade tools more accessible to a broader audience without sacrificing the depth and rigor that our expert users expect. This means developing new onboarding experiences, creating more intuitive interfaces, and providing educational resources to help users at all levels get the most out of our platform.

3. Championing best practices

As part of our evolution, we're placing a greater emphasis on identifying, sharing, and standardizing best practices in digital experience research and optimization. Through case studies, partnerships with industry leaders, and our own thought leadership content, we aim to elevate the entire field of digital experience design.

4. Fostering a community of excellence

We're doubling down on our commitment to building and nurturing a community of digital experience professionals. This includes expanding our educational offerings, hosting more events and webinars, and creating opportunities for our users to connect and learn from each other. 

5. Emphasizing outcomes and ROI

We're aligning our messaging and product development with the real-world impact our users are seeking. That’s why the new names for our tools emphasize what each tool helps you achieve, rather than the technical methodology behind it. This outcome-focused approach helps users quickly identify which tool is right for their specific needs.

How our evolution benefits you 🚀

For our long-time users, rest assured that the tools and features you know and love aren't going anywhere, but their names are changing to plain English terms to be more approachable for professionals who may not have a background in UX research. In fact, our UXR platform is only going to get better as we invest in improvements and new capabilities. You'll also benefit from an expanded community of practitioners and a wealth of new resources to help you continue to grow and excel in your field.

For professionals who may be new to Optimal Workshop or to formalized user research in general, our refresh means it's easier than ever to get started with professional-grade tools and methodologies. We're here to support you at every step, from your first card sort to building a comprehensive, data-driven approach to optimizing digital experiences.

Join us in shaping tomorrow's digital experiences 🌟

This brand refresh and value pivot mark an exciting new chapter for Optimal Workshop. We're committed to continuing our tradition of excellence while expanding our reach and impact in the world of digital experiences.

As we move forward, we'll be rolling out new features, resources, and initiatives that align with our refreshed brand and value proposition. We're excited to partner with our users – both new and long-standing – to push the boundaries of what's possible in creating truly exceptional digital experiences.

Thank you for being part of the Optimal Workshop community. Whether you're a UX research veteran or just starting your journey in optimizing digital experiences, we're here to support you with the best tools, knowledge, and community in the industry. Together, we can shape the future of digital experiences and make the online world more intuitive, efficient, and enjoyable for everyone.

Onwards and upwards,

Alex Burke, CEO Optimal Workshop

