
Optimal Blog

Articles and Podcasts on Customer Service, AI and Automation, Product, and more


Latest


Ready for take-off: Best practices for creating and launching remote user research studies

"Hi Optimal Work,I was wondering if there are some best practices you stick to when creating or sending out different UX research studies (i.e. Card sorts, Prototyye Test studies, etc)? Thank you! Mary"

Indeed I do! Over the years I’ve learned a lot about creating remote research studies and engaging participants. That experience has taught me a lot about what works, what doesn’t and what leaves me refreshing my results screen eagerly anticipating participant responses and getting absolute zip. Here are my top tips for remote research study creation and launch success!

Creating remote research studies

Use screener questions and post-study questions wisely

Screener questions are really useful for eliminating participants who may not fit the criteria you’re looking for but you can’t exactly stop them from being less than truthful in their responses. Now, I’m not saying all participants lie on the screener so they can get to the activity (and potentially claim an incentive) but I am saying it’s something you can’t control. To help manage this, I like to use the post-study questions to provide additional context and structure to the research.

Depending on the study, I might ask questions to which the answers might confirm or exclude specific participants from a specific group. For example, if I’m doing research on people who live in a specific town or area, I’ll include a location based question after the study. Any participant who says they live somewhere else is getting excluded via that handy toggle option in the results section. Post-study questions are also great for capturing additional ideas and feedback after participants complete the activity as remote research limits your capacity to get those — you’re not there with them so you can’t just ask. Post-study questions can really help bridge this gap. Use no more than five post-study questions at a time and consider not making them compulsory.

Do a practice run

No matter how careful I am, I always miss something! A typo, a card with a label in the wrong case, forgetting to update a new version of an information architecture after a change was made — stupid mistakes that we all make. By launching a practice version of your study and sharing it with your team or client, you can stop those errors dead in their tracks. It’s also a great way to get feedback from the team on your work before the real deal goes live. If you find an error, all you have to do is duplicate the study, fix the error and then launch. Just keep an eye on the naming conventions used for your studies to prevent the practice version and the final version from getting mixed up!

Sending out remote research studies

Manage expectations about how long the study will be open for

Something that has come back to bite me more than once is failing to clearly explain when the study will close. Understandably, participants can be left feeling pretty annoyed when they mentally commit to complete a study only to find it’s no longer available. There does come a point when you need to shut the study down to accurately report on quantitative data and you’re not going to be able to prevent every instance of this, but providing that information upfront will go a long way.

Provide contact details and be open to questions

You may think you’re setting yourself up to be bombarded with emails, but I’ve found that isn’t necessarily the case. I’ve noticed I get around 1-3 participants contacting me per study. Sometimes they just want to tell me they completed it and potentially provide additional information and sometimes they have a question about the project itself. I’ve also found that sometimes they have something even more interesting to share such as the contact details of someone I may benefit from connecting with — or something else entirely! You never know what surprises they have up their sleeves and it’s important to be open to it. Providing an email address or social media contact details could open up a world of possibilities.

Don’t forget to include the link!

It might seem really obvious, but I can’t tell you how many emails I’ve received (and have been guilty of sending out) that are missing the damn link to the study. It happens! You’re so focused on getting the delivery right that it becomes really easy to miss that final yet crucial piece of information.

To avoid this irritating mishap, I always complete a checklist before hitting send:

  • Have I checked my spelling and grammar?
  • Have I replaced all the template placeholder content with the correct information?
  • Have I mentioned when the study will close?
  • Have I included contact details?
  • Have I launched my study and received confirmation that it is live?
  • Have I included the link to the study in my communications to participants?
  • Does the link work? (yep, I’ve broken it before)

General tips for both creating and sending out remote research studies

Know your audience

First and foremost, before you create or disseminate a remote research study, you need to understand who it’s going to and how they best receive this type of content. Posting it on social media when none of your followers are in your user group may not be the best approach. Do a quick brainstorm about the best way to reach them. For example, if your users are internal staff, there might be an internal communications channel, such as an all-staff newsletter, intranet or social media site, where you can share the link and approach content.

Keep it brief

And by that I’m talking about both the engagement mechanism and the study itself. I learned this one the hard way. Time is everything and no matter your intentions, no one wants to spend more time than they have to. Even more so in situations where you’re unable to provide incentives (yep, I’ve been there). As a rule, I always stick to no more than 10 questions in a remote research study and for card sorts, I’ll never include more than 60 cards. Anything more than that will see a spike in abandonment rates and of course only serve to annoy and frustrate your participants. You need to ensure that you’re balancing your need to gain insights with their time constraints.

As for the accompanying approach content, short and snappy equals happy! Whether it’s an email, website or social media post, newsletter, or carrier pigeon, keep your approach spiel to no more than a paragraph. Use an audience-appropriate tone and stick to the basics: a high-level sentence on what you’re doing, roughly how long the study will take participants to complete, details of any incentives on offer, and of course don’t forget to thank them.

Set clear instructions

The default instructions in Optimal Workshop’s suite of tools are really well designed and I’ve learned to borrow from them for my approach content when sending the link out. There’s no need for wheel reinvention and it usually just needs a slight tweak to suit the specific study. This also helps provide participants with a consistent experience and minimizes confusion allowing them to focus on sharing those valuable insights!

Create a template

When you’re on to something that works — turn it into a template! Every time I create a study or send one out, I save it for future use. It still needs minor tweaks each time, and I use those tweaks to iterate on my template. What are your top tips for creating and sending out remote user research studies? Comment below!


Empowering UX Careers: Designlab Joins Forces with Optimal Workshop

Optimal Workshop is thrilled to welcome Designlab as our newest education partner. This collaboration merges our strengths to provide innovative learning opportunities for UX professionals looking to sharpen their design skills and elevate their careers. 

The Power of a Design-First Education Partner

What makes Designlab unique is its exclusive focus on design education. For more than a decade, they have dedicated themselves to providing hands-on learning experiences that combine asynchronous, online lessons and projects with synchronous group sessions and expert mentorship. With a robust catalog of industry-relevant courses and an alumni network of over 20,000 professionals, Designlab is committed to empowering designers to make an impact at both individual and team levels.

What Designlab Offers for Experienced Designers

Designlab offers a range of advanced programs that support ongoing professional development. Some courses that might be interesting for our audience include:

  • Data-Driven Design: Gain confidence in your ability to collect and interpret data, justify design decisions with business impact, and win over stakeholders. 
  • Advanced Figma: Accelerate your design workflow and become a more efficient Figma user by learning tools like components, auto-layout, and design tokens. 
  • Strategic Business Acumen for Designers: Learn the foundational business knowledge and frameworks you need to influence strategy and get your design career to the next level.  
  • Advanced Usability and Accessibility: Strengthen your usability and accessibility skills, integrate universal design principles into your work, and improve advocacy for inclusivity in design.  

These courses ensure that experienced designers can enhance their technical and strategic skills to solve complex problems, lead projects, and design user-centered experiences.

Solutions for Design Teams

Designlab also offers solutions for design teams looking to upskill together. These solutions range from multi-seat enrollments in their courses to custom facilitation and training programs tailored to your team’s needs. By partnering with Designlab, companies ensure their teams are equipped with practical skills and a forward-thinking mindset to tackle design challenges effectively.

READ: Designing for Accessibility with The Home Depot

Special Offer for the Optimal Workshop Community

To celebrate this partnership, Optimal Workshop users can take advantage of a special discount—$100 off any Designlab course with the code OPTIMAL. Whether you’re looking to refine your skills or explore new areas of expertise, Designlab’s programs offer the perfect opportunity to invest in your professional growth.

Explore how Designlab’s offerings can help you level up your design career—whether it’s through mastering advanced tools, leveraging data more effectively, or becoming a more strategic thinker. With continuous learning at the heart of success in UX and product design, there’s no better time to start your journey with Designlab.

Unlock your potential and discover new possibilities with Designlab’s courses today. Use code OPTIMAL to save $100 on your next course and take the next step in your design career.


UXDX Dublin 2024: Where Chocolate Meets UX Innovation

What happens when you mix New Zealand's finest chocolate with 870 of Europe's brightest UX minds? Pure magic, as we discovered at UXDX Dublin 2024!

A sweet start

Our UXDX journey began with pre-event drinks (courtesy of yours truly, Optimal Workshop) and a special treat from down under - a truckload of Whittaker's chocolate that quickly became the talk of the conference. Our impromptu card sorting exercise with different Whittaker's flavors revealed some interesting preferences, with Coconut Slab emerging as the clear favorite among attendees!

Cross-Functional Collaboration: More Than Just a Buzzword

The conference's core theme of breaking down silos between design, product, and engineering teams resonated deeply with our mission at Optimal Workshop. Andrew Birgiolas from Sephora delivered what I call a "magical performance" on collaboration as a product, complete with an unforgettable moment where he used his shoe to demonstrate communication scenarios (now that's what we call thinking on your feet!).

Purpose-driven design

Frank Gaine's session on organizational purpose was a standout moment, emphasizing the importance of alignment at three crucial levels:

  • Company purpose
  • Team purpose
  • Individual purpose

This multi-layered approach to purpose struck a chord with attendees, reminding us that effective UX research and design must be anchored in clear, meaningful objectives at every level.

The art of communication

One of the most practical takeaways came from Kelle Link's session on navigating enterprise ecosystems. Her candid discussion about the necessity of becoming proficient in deck creation sparked knowing laughter from the audience. As our CEO noted, it's a crucial skill for communicating with senior leadership, board members, and investors - even if it means becoming a "deck ninja" (to use a more family-friendly term).

Standardization meets innovation

Chris Grant's insights on standardization hit home: "You need to standardize everything so things are predictable for a team." This seemingly counterintuitive approach to fostering innovation resonated with our own experience at Optimal Workshop - when the basics are predictable, teams have more bandwidth for tackling the unpredictable challenges that drive real innovation.

Building impactful product teams

Matt Fenby-Taylor's discussion of the "pirate vs. worker bee" persona balance was particularly illuminating. Finding team members who can maintain that delicate equilibrium between creative disruption and methodical execution is crucial for building truly impactful product teams.

Research evolution

A key thread throughout the conference was the evolution of UX research methods. Nadine Piecha's "Beyond Interviews" session emphasized that research is truly a team sport, requiring involvement from designers, PMs, and other stakeholders. This aligns perfectly with our mission at Optimal Workshop to make research more accessible and actionable for everyone.

The AI conversation

The debate on AI's role in design and research between John Cleere and Kevin Hawkins sparked intense discussions. The consensus? AI will augment rather than replace human researchers, allowing us to focus more on strategic thinking and deeper insights - a perspective that aligns with our own approach to integrating AI capabilities.

Looking ahead

As we reflect on UXDX 2024, a few things are clear:

  1. The industry is evolving rapidly, but the fundamentals of human-centered design remain crucial
  2. Cross-functional collaboration isn't just nice to have - it's essential for delivering impactful products
  3. The future of UX research and design is bright, with teams becoming more integrated and methodologies more sophisticated

The power of community

Perhaps the most valuable aspect of UXDX wasn't just the formal sessions, but the connections made over coffee (which we were happy to provide!) and, yes, New Zealand chocolate. The mix of workshops, forums, and networking opportunities created an environment where ideas could flow freely and partnerships could form naturally.

What's next?

As we look forward to UXDX 2025, we're excited to see how these conversations evolve. Will AI transform how we approach UX research? How will cross-functional collaboration continue to develop? And most importantly, which Whittaker's chocolate flavor will reign supreme next year?

One thing's for certain - the UX community is more vibrant and collaborative than ever, and we're proud to be part of its evolution. I’ve said it before and I’ll say it again: the industry has a very bright future.

See you next year! We’ll remember to bring more Coconut Slab chocolate next time - it seems we've created quite a demand!


The Power of Prototype Testing Live Training

If you missed our recent live training on Prototype Testing, don’t worry—we’ve got everything you need right here! You can catch up at your convenience, so grab a cup of tea, put your feet up, and enjoy the show.

In the session, we explored the powerful new features of our Prototype Testing tool, offering a step-by-step guide to setting up, running, and analyzing your tests like a seasoned pro. This tool is a game-changer for your design workflow, helping you identify usability issues and gather real user feedback before committing significant resources to development.


Here’s a quick recap of the highlights:

1. Creating a prototype test from scratch using images

We walked through how to create a prototype test from scratch using static images. This method is perfect for early-stage design concepts, where you want to quickly test user flows without a fully interactive prototype.

2. Preparing your Figma prototype for testing

Figma users, we’ve got you covered! We discussed how to prepare your Figma prototype for the smoothest possible testing experience. From setting up interactions to ensuring proper navigation, these tips ensure participants have an intuitive experience during the test. For more detailed instructions, check out our help article.

3. Seamless Figma prototype imports

One of the standout features of the tool is its seamless integration with Figma. We showed how easy it is to import your designs directly from Figma into Optimal, streamlining the setup process. You can bring your working files straight in, and resync when you need to with one click of a button.

4. Understanding usability metrics and analyzing results

We explored how to analyze the usability metrics, and walked through what the results can indicate on click maps and paths. These visual tools allow you to see exactly how participants navigate your design, making it easier to spot pain points, dead ends, or areas of friction. By understanding user behavior, you can rapidly iterate and refine your prototypes for optimal user experience.


67 ways to use Optimal for user research

User research and design can be tough in this fast-moving world. Sometimes we can get so wrapped up in what we’re doing, or what we think we’re supposed to be doing, that we don’t take the time to look for other options and other ways to use the tools we already know and love. I’ve compiled this list over the last few days (my brain hurts) by talking to a few customers and a few people around the office. I’m sure it's far from comprehensive. I’ve focused on quick wins and unique examples. I’ll start off with some obvious ones, and we’ll get a little more abstract, or niche, as we go. I hope you get some ideas flying as you read through. Enjoy!

#1 Benchmark your information architecture (IA)

Without a baseline for your information architecture, you can’t easily tell if any changes you make have a positive effect. If you haven’t done so already, benchmark your existing website with Tree testing now. Upload your site structure and get results the same day. Now you’ll have IA scores to beat each month. Easy.

#2 Find out precisely where people get lost

Use the pietree in Tree testing to find out exactly where people are getting lost in your website structure and where they go instead. You can also use First-click testing for this if you’re only interested in the first click, and let’s face it, that is where you’ll get the biggest bang for your buck.

#3 Start at the start

If you’re just not sure where to begin, take a screenshot of your homepage, or any page that you think might have some issues, and get going with First-click testing. Write up a string of things that people might want to do when they find themselves on this page and use these as your tasks. Surprise all your colleagues with a maddening heatmap showing where people actually clicked in response to your tasks. Now you’ll have a better idea of which area of your site to focus a tree test or card sort on for your next step.

#4 A/B test your site structure

Tree testing is great for testing more than one content structure. It’s easy to run two separate Tree testing studies, even more than two. It’ll help you decide which structure you and your team should run with, and it won’t take you long to set them up. Learn more.
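If you want a quick read on whether one structure genuinely outperforms the other (rather than the gap being noise), a two-proportion z-test on task success rates is one way to sanity-check the numbers. Below is a minimal Python sketch; the counts and the helper function are hypothetical illustrations, not a feature of any Optimal tool.

```python
from math import sqrt, erfc

def compare_success_rates(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test comparing task success between two tree tests."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)      # pooled success rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))                        # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical results: structure A, 41/60 task successes; structure B, 52/60.
p_a, p_b, z, p = compare_success_rates(41, 60, 52, 60)
print(f"A: {p_a:.0%}  B: {p_b:.0%}  z = {z:.2f}  p = {p:.3f}")
```

A small p-value suggests the difference is unlikely to be chance alone; with typical remote sample sizes, treat it as a sanity check rather than a verdict.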

#5 Make collaborative design decisions

Use Optimal Sort to get your team involved and let their feedback feed your designs: logos, icons, banners, images, the list goes on. By creating a closed image sort with categories where your team can group designs based on their preferences, you can get some quick feedback to help you figure out where you should focus your efforts.

#6 Do your (market) research

Card sorting is a great UX research technique, but it can also be a fun way to involve your users in some market research. Get a better sense of what your users and customers actually want to see on your website, by conducting an image sort of potential products. By providing categories like ‘I would buy this’, ‘I wouldn’t buy this’ to indicate their preferences for each item, you can figure out what types of products appeal to your customers.

#7 Customer satisfaction surveys with Surveys

The thoughts and feelings of your users are always important. A simple survey can help you take a deeper look at your checkout process, a recently launched product or service, or even the packaging your product arrives in. Your options are endless.


#8 Crowdsource content ideas

Whether you’re running a blog or a UX conference, Questions can help you generate content ideas and understand any knowledge gaps that might be out there. Figure out what your users and attendees like to read on your blog, or what they want to hear about at your event, and let this feed into what you offer.

#9 Do some sociological research

Using card sorting for sociological research is a great way to deepen your understanding of how different groups may categorize information. Rather than focusing solely on how your users interact with your product or service, consider broadening your research horizons to understand your audience’s mental models. For example, by looking at how young people group popular social media platforms, you can understand the relationships between them, and identify where your product may fit in the mix.

#10 Create tests to fit in your onboarding process

Onboarding new customers is crucial to keeping them engaged with your product, especially if it involves your users learning how to use it. You can set up a quick study to help your users stay on track with onboarding. For example, say your company provides online email marketing software. You can set up a First-click testing study using a screenshot of your app, with a task asking your participants where they’d click to see the open rates for a particular email that went out.


#11 Quantify the return on investment of UX

Some people, including UX Agony Aunt, define return on UX as time saved, money made, and people engaged. By attaching a value to the time spent completing tasks, or to successful completion of tasks, you can approximate an ROI or at least illustrate the difference between two options.
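As a purely illustrative sketch of that arithmetic (every number below is a made-up placeholder), attaching a dollar value to time saved might look something like this:

```python
# Back-of-the-envelope UX ROI: value of time saved on a task after a redesign.
# All inputs are hypothetical placeholders - swap in your own measurements.
seconds_saved_per_task = 45        # e.g. old flow took 120s, new flow takes 75s
tasks_per_user_per_month = 20
active_users = 5_000
hourly_value_of_time = 40.0        # what an hour of user/staff time is worth ($)
redesign_cost = 60_000.0           # research + design + build

hours_saved_per_year = (seconds_saved_per_task * tasks_per_user_per_month
                        * active_users * 12) / 3600
annual_value = hours_saved_per_year * hourly_value_of_time
roi = (annual_value - redesign_cost) / redesign_cost

print(f"Hours saved per year: {hours_saved_per_year:,.0f}")
print(f"Annual value of time saved: ${annual_value:,.0f}")
print(f"First-year ROI: {roi:.0%}")
```

Even rough numbers like these can turn "the new flow feels faster" into a figure a stakeholder can compare against the cost of the work.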


#12 Collate all your user testing notes using qualitative Insights

Making sense of your notes from qualitative research activities can be simultaneously exciting and overwhelming. It’s fun being out in the field and jotting down observations on a notepad, or sitting in on user interviews and documenting observations in a spreadsheet. You can now easily import all your user research and give it some traceability.


#13 Establish which tags or filters people consider to be the most important

Create a card sort with your search filters or tags as labels, and have participants rank them according to how important they consider them to be. Analytics can tell you half of the story (where people actually click), so the card sort can give another side: a better idea of what people actually think or want.

#14 Reduce content on landing pages to what people access regularly

Before you run an open card sort to generate new category ideas, you can run a closed card sort to find out if you have any redundant content. Say you wanted to simplify the homepage of your intranet. You can ask participants to sort cards (containing homepage links) based on how often they use them. You could compare this card sort data with analytics from your intranet and see if people’s actual behavior and perception are well aligned.

#15 Crowd-source the values you want your team/brand/product to represent

Card sorting is a well-established technique in the ‘company values’ realm, and there are some great resources online to help you and your team brainstorm the values you represent. These ‘in-person’ brainstorm sessions are great, and you can run a remote closed card sort to support your findings. And if you want feedback from more than a small group of people (if your company has, say, more than 15 staff) you can run a remote closed card sort on its own. Use Microsoft’s Reaction Card Method as card inspiration.

#16 Input your learnings and observations from a UX conference with qualitative insights

If you're lucky enough to attend a UX conference, you can now share the experience with your colleagues. You can easily jot down ideas, quotes, and key takeaways in a Reframer project and keep your notes organized by using a new session for each presenter. Bonus: if you’re part of a team, they can watch the live feed rolling into Reframer!


#17 Find out what actions people take across time

Use card sorting to understand when your participants are most likely to perform certain activities over the course of a day, week, or over the space of a year. Create categories that represent time, for example, ‘January to March’, ‘April to June’, ‘July to September’, and ‘October to December’, and ask your participants to sort activities according to the time they are most likely to do them (go on vacation, do their taxes, make big purchases, and so on). While there may be more arduous and more accurate methods for gathering this data, sometimes you need quick insights to help you make the right decisions.


#18 Gather quantitative data on prioritizing project tasks or product features

Closed card sorting can give you data that you might usually gather in team meetings or in Post-its on the wall, or that you might get through support channels. You can model your method on other prioritization techniques, including Eisenhower’s Decision Matrix, for example.

#19 Test your FAQs page with new users

Your support and knowledge base can be just as important as any other core section of your website. If your support site is lacking in navigation and UX, it will no doubt increase support tickets and the resources needed to handle them. Make sure your online support section is up to scratch. Here’s an article on how to do it quickly.

#20 Figure out if your icons need labels

Figure out if your icons are doing their job by testing whether your users understand them as intended. Upload icons you currently use, or plan to use in your interface, to First-click testing, and ask your users to identify their meaning using post-task questions.

#21 Give your users some handy quick tools

In some cases, users may use your website with very specific goals in mind. Giving your users access to quick tools as soon as they land on your website is a great way to ensure they are able to get what they need done easily. Look at your analytics for things people do often that take several clicks to find, and check whether they can find your ‘quick tool’ in a single click using First-click testing.

#22 Benchmark the IA of your competition

We all have competitors of some sort, and researchers also need to pay attention to what they get up to. Make life easy in your reporting by benchmarking their IA and then reviewing it each quarter for the board and leaders to be wowed with. It’s not a perfect comparison, since different sites and audiences have different flows, but you can also compare your success scores with theirs. It makes your work feel like the Olympics, with some healthy competition going on.

#23 Improve website conversions

Make the marketing team’s day by quickly improving some core conversions on your website. Now, there are loads of ways to improve conversions for a checkout cart or signup form, but using First-click testing to test out ideas before you launch a live A/B test takes mere minutes and gives your B version a confidence boost.

#24 Reduce the bounce rates of certain sections of your website

People jumping off your website and not continuing their experience is something (depending on the landing page) everyone tries to improve. Metrics like ‘time on site’ and ‘average page views’ show the value your whole website has to offer. Again, there are many different ways to do this, but one big reason for people jumping off the website is not being able to find what they’re looking for. That’s where our IA toolkit comes in.

#25 Test your website’s IA in different countries

No, you don’t have to spend thousands of dollars to travel to all these countries to test, although that’d be pretty sweet. You can remotely test with participants from all over the world using our integrated recruitment panel. Start seeing how different cultures, languages, and countries interact with your website.

#26 Run an empathy test (card sort)

Empathy – the ability to understand and share the experience of another person – is central to the design process. An empathy test is another great tool to use in the design phase because it enables you to find out if you are creating the right kind of feelings with your user. Take your design and show it to users. Provide them with a variety of words that could represent the design – for example “minimalistic”, “dynamic”, or “professional” – and ask them to pick out the words they think best suit their experience.

#27 Test visual hierarchy with first-click testing

Use first-click testing to understand which elements draw users' attention first on your page. Upload your design and ask participants to click on the most important element, or what catches their eye first. The resulting heatmap will show you if your visual hierarchy is working as intended - are users clicking where you expect them to? This technique helps validate design decisions about sizing, color, positioning, and contrast without needing to build the actual page.

#28 Take Qualitative Insights into the field

Get out of the office or the lab and observe social behaviour in the field. Use Qualitative Insights to input your observations on your field research. Then head back to your office to start making sense of the data in the Theme Builder.

#29 Use heatmaps to get the first impressions of designs

Heatmaps in our First-click testing tool are a great way of getting first impressions of any design. You can see where people clicked (correctly and incorrectly), giving you insights on what works and doesn’t work with your designs. Because it’s so fast to test, you can iterate until your designs start singing.

#30 Multivariate testing

Multivariate testing compares more than two versions of your studies and allows you to understand which version performs best with your audience. Use multivariate testing with Tree testing and First-click testing to find the right design on which to focus and iterate.

#31 Improve your search engine optimization (SEO) with tree testing

Yes, a good IA improves your SEO. Search engines want to know how your users navigate throughout your site. Make sure people can easily find what they’re looking for, and you’ll start to see improvement in your search engine ranking.

#32 Test your mobile information architecture

As more and more people use their smartphones for apps and to browse sites, you need to ensure your mobile design gives your users a great experience. Test the IA of your mobile site to ensure people aren’t getting lost in the mobile version of your site. If you haven’t got a mobile-friendly design yet, now’s the time to start designing it!

#33 Run an Easter egg hunt using the correct areas in first-click testing

Liven up the workday by creating a fun Easter egg hunt in first-click testing. Simply upload a photo (like those really hard “spot the X” photos), set the correct area of your target, then send out your study with participant identifiers enabled. You can also send these out as competitions and have closing rules based on time, number of participants, or both.

#34 Keystroke level modeling

When interface efficiency is important, you'll want to measure how much a new design can improve task times. You can actually estimate time saved (or lost) using some well-tested approaches that are based on average human performance for typical computer-based operations like clicking, pointing and typing. Read more about measuring task times without users.
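One of those well-tested approaches is the Keystroke-Level Model (KLM), which estimates expert task time by summing standard operator times. Here's a rough Python sketch; the operator values are commonly cited KLM averages and the two task breakdowns are hypothetical.

```python
# Keystroke-Level Model (KLM) sketch: estimate expert task time by summing
# standard operator times (seconds). Values are commonly cited KLM averages.
OPERATORS = {
    "K": 0.28,  # keystroke or key press (average typist)
    "P": 1.10,  # point the mouse at a target
    "B": 0.10,  # press or release a mouse button (a click is B + B)
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def klm_estimate(sequence):
    """Sum the operator times for a sequence like ['M', 'P', 'B', 'B']."""
    return sum(OPERATORS[op] for op in sequence)

# Hypothetical comparison: the old flow needs two clicks plus a typed filter,
# the new flow needs a single click.
old_flow = ["M", "P", "B", "B", "H"] + ["K"] * 6 + ["H", "M", "P", "B", "B"]
new_flow = ["M", "P", "B", "B"]

print(f"Old flow: {klm_estimate(old_flow):.1f}s")
print(f"New flow: {klm_estimate(new_flow):.1f}s")
print(f"Estimated saving per task: {klm_estimate(old_flow) - klm_estimate(new_flow):.1f}s")
```

Multiply that per-task saving by how often the task is performed and you have a defensible estimate to put alongside your usability findings.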

#35 Feature prioritization and get some help for your roadmap

Find out what people think are the most important next steps for your team. Set up a card sort and ask people to categorize items and rank them in descending order of importance or impact on their work. This can also help you gauge their thoughts on potential new features for your site, and for bonus points compare team responses with customer responses.

#36 Tame your blog

Get the tags and categories in your blog under control to make life easier for your readers. Set up a card sort and use all your tags and categories as card labels. Either use your existing ones or test a fresh set of new tags and categories.

#37 Test your home button

Would an icon or text link work better for navigating to your home page? Before you go ahead and make changes to your site, you can find out by setting up a first-click test.

#38 Validate the designs in your head

As designers, you’ve probably got umpteen designs floating around in your head at any one time. But which of these are really worth pursuing? Figure this out by using The Optimal Workshop Suite to test out wireframes of new designs before putting any more work into them.

#39 ‘Buy now’ button shopping cart visibility

If you’re running an e-commerce site, ease of use and a great user experience are crucial. To see if your shopping cart and checkout processes are as good as they can be, run a first-click test.

#40 IA periodic health checks

Raise the visibility of good IA by running periodic IA health checks using Tree testing and reporting the results. Management loves metrics and catching any issues early is good too!

#41 Focus groups with qualitative insights

Thinking of launching a new product, app or website, or seeking opinions on an existing one? Focus groups can provide you with a lot of candid information that may help get your project off the ground. They’re also dangerous because they’re susceptible to groupthink, design by committee, and tunnel vision. Use with caution, but if you do then use with Qualitative Insights! Compare notes and find patterns across sessions. Pay attention to emotional triggers.

#42 Gather opinions with surveys

Whether you want the opinions of your users or from members of your team, you can set up a quick and simple survey using Surveys. It’s super useful for getting opinions on new ideas (consider it almost like a mini-focus group), or even for brainstorming with teammates.

#43 Design a style guide with card sorting

Style guides (for design and content) can take a lot of time and effort to create, especially when you need to get the guide proofed by various people in your company. To speed this up, simply create a card sort to find out what your guide should consist of. Find out the specifics in this article.

#44 Improve your company's CRM system

As your company grows, oftentimes your CRM can become riddled with outdated information and turn into a giant mess, especially if you deal with a lot of customers every day. To help clear this up, you can use card sorting and tree testing to solve navigational issues and get rid of redundant features. Learn more.

#45 Sort your life out

Let your creativity run wild, and get your team or family involved in organizing or prioritizing the things that matter. And the possibilities really are endless. Organize a long list of DIY projects, or ask the broader team how the functional pods should be re-organized. It’s up to you. How can card sorting help you in your work and daily life?

#46 Create an online diary study

Whether it’s a product, app or website, finding out the long-term behaviour and thoughts of your users is important. That’s where diary studies come in. For those new to this concept, diary studies are a longitudinal research method, aimed at collecting insights about a participant’s needs and behaviors. Participants note down activities as they’re using a particular product, app, or website. Add your participants into a qualitative study and allow them to create their diary study with ease.

#47 Source-specific data with an online survey

Online survey tools can complement your existing research by sourcing specific information from your participants. For example, if you need to find out more about how your participants use social media, which sites they use, and on which devices, you can do it all through a simple survey questionnaire. Additionally, if you need to identify usage patterns, device preferences or get information on what other products/websites your users are aware of/are using, a questionnaire is the ticket.

#48 Guerrilla testing with First-click testing

For really quick first-click testing, take First-click testing on a tablet, mobile device or laptop to a local coffee shop. Ask people standing in line if they’d like to take part in your super quick test in exchange for a cup of joe. Easy!

#50 Ask post-task questions for tree testing and first-click testing

You can now set specific task-related questions for both Tree testing and First-click testing. This is a great way to dive deeper into the mushy minds of your participants. Check out how to use this new(ish) feature here!

#51 Start testing prototypes

Paper prototypes are great, but what happens when your users are scattered around the globe and you can’t invite them to an in-person test? By scanning or photographing your paper prototypes, you can use first-click testing to test them with your users quickly and easily. Read more about our approach here.

#52 Take better notes for sense making

Qualitative research involves a lot of note-taking. So naturally, to be better at this method, improving how you take notes is important. Reframer is designed to make note-taking easy but it can still be an art. Learn more.

#53 Make sure you get the user's first-click right

Like most things, read a little, and then it’s all about practice. We’ve found that people who get the first click correct are almost three times as likely to complete a task successfully. Get your first clicks right in tree testing and first-click testing and you’ll start seeing your customers smile.


#54 Run a cat survey. Yep, cats!

We’ve gained some insight into how people intuitively group cats, and so can you (unless you’re a dog person). Honestly, doing something silly can be a useful way to introduce your team to a new method on a Friday afternoon. Remember to distribute the results!


#55 Destroy evil attractors in your tree

Evil attractors are those labels in your IA that attract unjustified clicks across tasks. This usually means the chosen label is ambiguous, or possibly a catch-all phrase like ‘Resources’. Read how to quickly identify evil attractors in the Destinations table of tree test results and how to fix them.

#56 Affinity map using card sorts

We all love our Post-its and sticking things on walls. But sometimes you need something quicker and more accessible for people in remote areas. Try using card sorts for a distributed approach to making sense of all the notes. Plus, you can easily import any qualitative insights when creating cards in a card sort. Easy.

#57 Preference test with first-click testing

Whether you’re coming up with a new logo design, headline, featured image, or anything, you can preference test it with First-click testing. Create an image that shows the two designs side by side and upload it to First-click testing. From there, you can ask people to click whichever one they prefer!

#58 Add moderated card sort results to your card sort

An excellent way of gathering valuable qualitative insights alongside the results of your remote card sorts is to run a moderated version of the sorts with a smaller group of participants. When you can observe and interact with your participants as they complete the sort, you’ll be able to ask questions and learn more about their mental models and the reasons why they have categorized things in a particular way. Learn more.

#59 Test search box variations with first-click testing

Case study by Viget: “One of the most heavily used features of the website is its keyword search, so we wanted to make absolutely certain that our redesigned search box didn’t make search harder for users to find and use.”

#60 Run an image card sort to organize products into groups

Adding images to each card lets you understand how your participants might organize and label particular items. This is very useful if you want to organize retail products and find out how other people would group them given visual cues such as shape, color, and other context.

#61 Test your customers' perceptions of different logo and brand image designs

Understand how customers perceive your brand by creating a closed card sort. Come up with a list of categories, and ask participants to sort images such as logos and other branded images.

#62 Run an open image card sort to classify images into groups based on the emotions they elicit

Are these pictures exhilarating, or terrifying? Are they humorous, or offensive? Relaxing, or boring? Productive, or frantic? Happy memories, or a deep sigh?

#63 Run an image card sort to organize your library

Whether it’s a physical library of books, or a digital drive full of ebooks, you can run a card sort to help organize them in a way that makes sense. Will it be by genre, author name, color or topic? Send out the study to your coworkers to get their input! You can also do this at home for your own personal library, and you can include music/CDs/vinyl records and movies!

#64 HR exercises to determine the motivations of your team

It’s simple to ask your team about their thoughts, feelings, and motivations with a Questions survey. You can choose to leave participant identifiers blank (so responses are anonymous), or you can ask for a name/email address. As a bonus, you can set up a calendar reminder to send out a new survey in the next quarter. Duplicate the survey and send it out again!

#65 Designing physical environments

If your company has a physical environment that your customers visit, you can research new structures using a mixture of tools in The Optimal Workshop Suite. This especially comes in handy if your customers require certain information within the physical environment in order to make decisions. For example, picture a retail store. Are all the signs clear, and do they communicate the right information? Are people overwhelmed by the physical environment?

#66 Use tree testing to refine an interactive phone menu system

Similar to how you’d design an IA, you can create a tree test to design an automated phone system. Whether you’re designing from the ground up, or improving your existing system, you will be able to find out if people are getting lost.


#67 Have your research team categorize and prioritize all these ideas

Before you dig deeper into more of these ideas, ask the rest of the team to help you decide which one to focus on. Let’s not get in the way of your work. Start your quick wins and log into your account. Here’s a spreadsheet of this list to upload to card sort. Aaaaaaaaaaand that’s a wrap! *Takes out gym towel and wipes sweaty face.*
Got any more suggestions to add to this list? We’d love to hear them in our comments section — we might even add them to this list.


UX Insider: The value of qualitative research for business stakeholders

Every month we have informative “bite-sized” presentations to add some inspiration to your day. These virtual events allow us to partner with amazing speakers, community groups and organizations to share their insights and hot takes on a variety of topics impacting our industry 🚀

Do you want to learn ways to uplift qualitative researchers and value their skill sets as business assets?

In an effort to make “data-driven” decisions, business leaders look to research for guidance. However, there is often an implicit preference for quantitative research over qualitative research. Often, even if qualitative research is funded and the findings are valued, the qualitative researcher and their skill sets can feel under-appreciated at an organizational or business unit level.

Let’s uplift the qualitative researcher and honor the craft of qualitative research as a transferable skill set. In this talk we will discuss: 

  • Theories about why business leaders have a hard time thinking about qualitative research findings as “data”
  • Techniques for navigating the quant vs. qual conversation with non-research-minded stakeholders — with an emphasis on not pitting research methods against each other
  • The importance of modeling qualitative researcher behaviors in other business contexts
  • How thinking like a qualitative researcher can close organizational gaps and aid in consensus building
  • Tips for demonstrating the value of thinking and acting like qualitative researchers

Jennifer Long

Speaker Bio 🎤

Jennifer is a business generalist with UX research and information architecture chops. She spent six years at Factor, an information architecture consulting firm, where she most recently held the Chief of Staff role. Jennifer has an MBA, a certificate in UX design from the School of Visual Concepts in Seattle, and a Bachelor of Arts in Theatre. She strongly believes in building stakeholder consensus and adding depth to projects through careful exploration. She lives in Washington State near the U.S./Canadian border and loves hiking in the North Cascades with her family and their German Shepherd mutt.

Take a seat, invite your colleagues, and we hope to see you at our next UX Insider!


Seeing is believing

Explore our tools and see how Optimal makes gathering insights simple, powerful, and impactful.