May 26, 2016
4 min

Card descriptions: Testing the effect of contextual information in card sorts

The key purpose of running a card sort is to learn something new about how people conceptualize and organize the information that’s found on your website. The insights you gain from running a card sort can then help you develop a site structure with content labels or headings that best represent the way your users think about this information. Card sorts are in essence a simple technique, however it’s the details of the sort that can determine the quality of your results.

Adding context to cards in OptimalSort – descriptions, links and images

In most cases, each item in a card sort has only a short label, but there are instances where you may wish to add additional context to the items in your sort. Currently, the cards tab in OptimalSort allows you to include a tooltip description, a link within the tooltip description or to format the card as an image (with or without a label).

[Image: Adding descriptions and images in OptimalSort]

We generally don’t recommend using tooltip descriptions and links, unless you have a specific reason to do so. It’s likely that they’ll provide your participants with more information than they would normally have when navigating your website, which may in turn influence your results by leading participants to a particular solution.

Legitimate reasons that you may want to use descriptions and links include situations where it’s not possible or practical to translate complex or technical labels (for example, medical, financial, legal or scientific terms) into plain language, or if you’re using a card sort to understand your participants’ preferences or priorities.

If you do decide to include descriptions in your sort, it’s important that you follow the same guidelines that you would otherwise follow for writing card labels. They should be easy for your participants to understand and you should avoid obvious patterns, for example repeating words and phrases, or including details that refer to the current structure of the website.

A quick survey of how card descriptions are used in OptimalSort

I was curious to find out how often people were including descriptions in their card sorts, so I asked our development team to look into this data. It turns out that around 15% of cards created in OptimalSort have at least some text entered in the description field. In order to dig into the data a bit further, both Ania and I reviewed a random sample of recent sorts and noted how descriptions were being used in each case.

We found that of the descriptions we reviewed, 40% (6% of the total cards) contained text that should not have impacted the sort results. Most often, these cards simply had the card label repeated in the description (to be honest, we’re not entirely sure why so many descriptions are used this way! But it’s now on our roadmap to stop this from happening — stay tuned!). Approximately 20% (3% of the total cards) used descriptions to add context without obviously leading participants, while another 40% had descriptions that may well lead to biased results. On occasion, this included linking to the current content or using what we assumed to be the current top-level heading within the description.
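The share-of-total figures follow from multiplying each reviewed proportion by the overall 15% description rate; a quick sketch of that arithmetic (all numbers taken from the text above):

```python
# Roughly 15% of all cards created in OptimalSort had description text.
total_with_description = 0.15

# Of the reviewed descriptions: 40% repeated the label,
# 20% added neutral context, 40% were potentially leading.
shares = {"label repeated": 0.40, "neutral context": 0.20, "potentially leading": 0.40}

# Express each share as a percentage of ALL cards.
of_total = {use: round(share * total_with_description * 100, 1)
            for use, share in shares.items()}
print(of_total)  # {'label repeated': 6.0, 'neutral context': 3.0, 'potentially leading': 6.0}
```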

Use of card descriptions


Testing the effect of card descriptions on sort results

So, how much influence could potentially leading card descriptions have on the results of a card sort? I decided to put it to the test by running a series of card sorts to compare the effect of different descriptions. As I also wanted to test the effect of linking card descriptions to existing content, I had to base the sort on a live website. In addition, I wanted to make sure that the card labels and descriptions were easily comprehensible by a general audience, but not so familiar that participants were highly likely to sort the cards in a similar manner.

I selected the government immigration website New Zealand Now as my test case. This site, which provides information for prospective and new immigrants to New Zealand, fit the above criteria and was likely unfamiliar to potential participants.

Card descriptions

Navigating the New Zealand Now website

When I reviewed the New Zealand Now site, I found that the top level navigation labels were clear and easy to understand for me personally. Of course, this is especially important when much of your target audience is likely to be non-native English speaking! On the whole, the second level headings were also well-labeled, which meant that they should translate to cards that participants were able to group relatively easily.

There were, however, a few headings such as “High quality” and “Life experiences”, both found under “Study in New Zealand”, which become less clear when removed from the context of their current location in the site structure. These headings would be particularly useful to include in the test sorts, as I predicted that participants would be more likely to rely on card descriptions in the cases where the card label was ambiguous.

[Image: Card descriptions]

I selected 30 headings to use as card labels from under the sections “Choose New Zealand”, “Move to New Zealand”, “Live in New Zealand”, “Work in New Zealand” and “Study in New Zealand” and tweaked the language slightly, so that the labels were more generic.

[Image: Card labels]

I then created four separate sorts in OptimalSort:

Round 1: No description: Each card showed a heading only — this functioned as the control sort

[Image: Card with label only]

Round 2: Site section in description: Each card showed a heading with the site section in the description

[Image: Card with site section in description]

Round 3: Short description: Each card showed a heading with a short description — these were taken from the New Zealand Now topic landing pages

[Image: Card with short description]

Round 4: Link in description: Each card showed a heading with a link to the current content page on the New Zealand Now website

[Image: Card with link in description]

For each sort, I recruited 30 participants. Each participant could only take part in one of the sorts.

What the results showed

An interesting initial finding was that when we queried the participants following the sort, only around 40% said they noticed the tooltip descriptions and even fewer participants stated that they had used them as an aid to help complete the sort.

Participant recognition of descriptions


Of course, what people say they do does not always reflect what they do in practice! To measure the effect that different descriptions had on the results of this sort, I compared how frequently cards were sorted with other cards from their respective site sections across the different rounds.

Let’s take a look at the “Study in New Zealand” section mentioned above. Out of the five cards in this section, “Where & what to study”, “Everyday student life” and “After you graduate” were sorted pretty consistently, regardless of whether a description was provided or not. The following charts show the average frequency with which each card was sorted with other cards from this section. For example, in the control round, “Where & what to study” was sorted with “After you graduate” 76% of the time and with “Everyday student life” 70% of the time, but was sorted with “Life experiences” or “High quality” only 10% of the time each. This meant that the average sort frequency for this card was 42%.
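The “average sort frequency” metric can be reproduced directly from the pairwise co-occurrence percentages; a minimal sketch using the control-round numbers quoted in the text (the dictionary values are those quoted figures, not raw OptimalSort output):

```python
# Control round: how often "Where & what to study" was grouped with
# each other card from the "Study in New Zealand" section.
pairwise = {
    "After you graduate": 76,
    "Everyday student life": 70,
    "Life experiences": 10,
    "High quality": 10,
}

# Average co-occurrence with the rest of the section.
average = sum(pairwise.values()) / len(pairwise)
print(round(average))  # 42
```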

[Chart: Average sort frequency for “Study in New Zealand” cards]

On the other hand, the cards “High quality” and “Life experiences” were sorted much less frequently with other cards in this section, with the exception of the second sort, which included the site section in the description.

These results suggest that including the existing site section in the card description did influence how participants sorted these cards — confirming our prediction! Interestingly, this round had the fewest participants who stated that they used the descriptions to help them complete the sort (only 10%, compared to 40% in round 3 and 20% in round 4).

Also of note is that adding a link to the existing content did not seem to increase the likelihood that cards were sorted with other cards from the same section. Reasons for this could include that participants did not want to navigate to another website (due to time-consciousness in completing the task, or concern that they’d lose their place in the sort), or simply that it can be difficult to open a link from the tooltip pop-up.

What we can take away from these results

This quick investigation into the impact of descriptions illustrates some of the intricacies around using additional context in your card sorts, and why this should always be done with careful consideration. It’s interesting that we correctly predicted some of these results, but that in this case, other uses of the description had little effect at all. And the results serve as a good reminder that participants can often be influenced by factors that they don’t even recognise themselves!

If you do decide to use card descriptions in your card sorts, here are some guidelines that we recommend you follow:

  • Avoid repeating words and phrases, as participants may sort cards by pattern-matching rather than based on the actual content
  • Avoid alluding to a predetermined structure, such as including references to the current site structure
  • If it’s important that participants use the descriptions to complete the sort, mention this in your task instructions. It may also be worth asking a post-sort survey question to validate whether they used them

We’d love to hear your thoughts on how we tested the effects of card descriptions and the results that we got. Would you have done anything differently? Have you ever completed a card sort only to realize later that you’d inadvertently biased your results? Or have you used descriptions in your card sorts to meet a genuine need? Do you think there’s a case for making descriptions more obvious than just a tooltip, so that when they are used legitimately, most participants don’t miss this information?

Let us know by leaving a comment!

Author: Optimal Workshop

Related articles


Which comes first: card sorting or tree testing?

“Dear Optimal Workshop, I want to test the structure of a university website (well, certain sections anyway). My gut instinct is that it's pretty 'broken'. Lots of sections feel like they're in the wrong place. I want to test my hypotheses before proposing a new structure. I'm definitely going to do some card sorting, and was planning a mixture of online and offline. My question is about when to bring in tree testing. Should I do this first to test the existing IA? Or is card sorting sufficient? I do intend to tree test my new proposed IA in order to validate it, but is it worth doing it upfront too?" — Matt

Dear Matt,

Ah, the classic chicken or the egg scenario: Which should come first — tree testing or card sorting?

It’s a question that many researchers often ask themselves, but I’m here to help clear the air! You should always use both methods when changing up your information architecture (IA) in order to capture the most information.

Tree testing and card sorting, when used together, can give you fantastic insight into the way your users interact with your site. First of all, I’ll run through some of the benefits of each testing method.

What is card sorting and why should I use it?

Card sorting is a great method to gauge the way in which your users organize the content on your site. It helps you figure out which things go together and which things don’t. There are two main types of card sorting: open and closed.

Closed card sorting involves providing participants with pre-defined categories into which they sort their cards. For example, you might be reorganizing the categories for your online clothing store for women. Your cards would have all the names of your products (e.g., “socks”, “skirts” and “singlets”) and you also provide the categories (e.g.,“outerwear”, “tops” and “bottoms”).

Open card sorting involves providing participants with cards and leaving them to organize the content in a way that makes sense to them. It’s the opposite to closed card sorting, in that participants dictate the categories themselves and also label them. This means you’d provide them with the cards only — no categories.
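One way to picture the difference is in the shape of the data each method produces; a hypothetical sketch (the cards and categories are illustrative, not from a real study):

```python
# Closed sort: the researcher fixes the categories up front.
closed_result = {
    "Tops": ["singlets"],
    "Bottoms": ["skirts", "socks"],  # one participant's (debatable) grouping
    "Outerwear": [],
}

# Open sort: the participant both groups the cards AND names the groups,
# so category labels vary per participant.
open_result = {
    "Things I wear daily": ["socks", "singlets"],
    "Going-out clothes": ["skirts"],
}

# Every card is placed exactly once in either method.
cards = {"socks", "skirts", "singlets"}
assert {c for group in closed_result.values() for c in group} == cards
assert {c for group in open_result.values() for c in group} == cards
```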

Card sorting, whether open or closed, is very user focused. It involves a lot of thought, input, and evaluation from each participant, helping you to form the structure of your new IA.

What is tree testing and why should I use it?

Tree testing is a fantastic way to determine how your users are navigating your site and how they’re finding information. Your site is organised into a tree structure, sorted into topics and subtopics, and participants are provided with some tasks that they need to perform. The results will show you how your participants performed those tasks, if they were successful or unsuccessful, and which route they took to complete the tasks. This data is extremely useful for creating a new and improved IA.

Tree testing is an activity that requires participants to seek information, which is quite the contrast to card sorting — an activity that requires participants to sort and organize information. Each activity requires users to behave in different ways, so each method will give its own valuable results.

Should you run a card or tree test first?

In this scenario, I’d recommend running a tree test first in order to find out how your existing IA currently performs. You said your gut instinct is telling you that your existing IA is pretty “broken”, but it’s good to have the data that proves this and shows you where your users get lost.

An initial tree test will give you a benchmark to work with — after all, how will you know your shiny, new IA is performing better if you don’t have any stats to compare it with? Your results from your first tree test will also show you which parts of your current IA are the biggest pain points and from there you can work on fixing them. Make sure you keep these tasks on hand — you’ll need them later!

Once your initial tree test is done, you can start your card sort, based on the results from your tree test. Here, I recommend conducting an open card sort so you can understand how your users organize the content in a way that makes sense to them. This will also show you the language your participants use to name categories, which will help you when you’re creating your new IA.

Finally, once your card sort is done you can conduct another tree test on your new, proposed IA. By using the same (or very similar) tasks from your initial tree test, you will be able to see that any changes in the results can be directly attributed to your new and improved IA.

Once your test has concluded, you can use this data to compare against the tree test results for your original information architecture — hopefully the performance is much better now!
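A sketch of how that before-and-after comparison might be quantified, using a two-proportion z-test on task success counts (the counts below are purely illustrative, not from a real study):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Z-test for the difference between two task success rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 12/30 participants succeeded on the old IA, 24/30 on the new one.
z, p = two_proportion_z(12, 30, 24, 30)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If p is small, the improvement is unlikely to be noise from recruiting a different batch of participants.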


Decoding Taylor Swift: A data-driven deep dive into the Swiftie psyche 👱🏻‍♀️

Taylor Swift's music has captivated millions, but what do her fans really think about her extensive catalog? We've crunched the numbers, analyzed the data, and uncovered some fascinating insights into how Swifties perceive and categorize their favorite artist's work. Let's dive in!

The great debate: openers, encores, and everything in between ⋆.˚✮🎧✮˚.⋆

Our study asked fans to categorize Swift's songs into potential opening numbers, encores, and songs they'd rather not hear (affectionately dubbed "Nah" songs). The results? As diverse as Swift's discography itself!

Opening with a bang 💥

Swifties seem to agree that high-energy tracks make for the best concert openers, but the results are more nuanced than previously suggested. "Shake It Off" emerged as the clear favorite for opening a concert, with 17 votes. "Love Story" follows closely behind with 14 votes, showing that nostalgia indeed plays a significant role. Interestingly, both "Cruel Summer" and "Blank Space" tied for third place with 13 votes each.

This mix of songs from different eras of Swift's career suggests that fans appreciate both her newer hits and classic favorites when it comes to kicking off a show. The strong showing for "Love Story" does indeed speak to the power of nostalgia in concert experiences. It's worth noting that "...Ready for It?", while a popular song, received fewer votes (9) for the opening slot than might have been expected.

Encore extravaganza 🎤

When it comes to encores, fans seem to favor a diverse mix of Taylor Swift's discography, with a surprising tie at the top. "Slut!" (Taylor's Version), "exile", "Guilty as Sin?", and "Bad Blood (Remix)" all received the highest number of votes with 13 each. This variety showcases the breadth of Swift's career and the different aspects of her artistry that resonate with fans for a memorable show finale.

Close behind are "evermore", "Wildest Dreams", "ME!", "Love Story", and "Lavender Haze", each garnering 12 votes. It's particularly interesting to see both newer tracks and classic hits like "Love Story" maintaining strong popularity for the encore slot. This balance suggests that Swifties appreciate both nostalgia and Swift's artistic evolution when it comes to closing out a concert experience.

The "Nah" list 😒

Interestingly, some of Taylor Swift's tracks found themselves on the "Nah" list, indicating that fans might prefer not to hear them in a concert setting. "Clara Bow" tops this category with 13 votes, closely followed by "You're On Your Own, Kid", "You're Losing Me", and "Delicate", each receiving 12 votes.

This doesn't necessarily mean fans dislike these songs - they might just feel they're not well-suited for live performances or don't fit as well into a concert setlist. It's particularly surprising to see "Delicate" on this list, given its popularity. The presence of both newer tracks like "Clara Bow" and older ones like "Delicate" suggests that the "Nah" list isn't tied to a specific era of Swift's career, but rather to individual song preferences in a live concert context.

It's worth noting that even popular songs can end up on this list, highlighting the complex relationship fans have with different tracks in various contexts. This data provides an interesting insight into how Swifties perceive songs differently when considering them for a live performance versus general listening.
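Vote tallies like these fall out of a closed card sort naturally: count how many participants placed each song card in each category. A small sketch with illustrative placements (not the real study data):

```python
from collections import Counter

# Each participant's placements: card -> category (illustrative data).
participants = [
    {"Shake It Off": "Opener", "Delicate": "Nah", "Love Story": "Encore"},
    {"Shake It Off": "Opener", "Delicate": "Encore", "Love Story": "Opener"},
    {"Shake It Off": "Encore", "Delicate": "Nah", "Love Story": "Opener"},
]

# Tally votes per (card, category) pair.
votes = Counter((card, cat) for p in participants for card, cat in p.items())
print(votes[("Shake It Off", "Opener")])  # 2
```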

The Similarity Matrix: set list synergies ⚡

Our similarity matrix revealed fascinating insights into how fans envision Taylor Swift's songs fitting together in a concert set list:

1. The "Midnights" Connection: Songs from "Midnights" like "Midnight Rain", "The Black Dog", and "The Tortured Poets Department" showed high similarity in set list placement. This suggests fans see these tracks working well in similar parts of a concert, perhaps as a cohesive segment showcasing the album's distinct sound.

2. Cross-album transitions: There's an intriguing connection between "Guilty as Sin?" and "exile", with a high similarity percentage. This indicates fans see these songs from different albums as complementary in a live setting, potentially suggesting a smooth transition point in the set list that bridges different eras of Swift's career.

3. The show-stoppers: "Shake It Off" stands out as dissimilar to most other songs in terms of placement. This likely reflects its perceived role as a high-energy, statement piece that occupies a unique position in the set list, perhaps as an opener, closer, or peak moment.

4. Set list evolution: There's a noticeable pattern of higher similarity between songs from the same or adjacent eras, suggesting fans envision distinct segments for different periods of Swift's career within the concert. This could indicate a preference for a chronological journey through her discography or strategic placement of different styles throughout the show.

5. Thematic groupings: Some songs from different albums showed higher similarity, such as "Is It Over Now? (Taylor's Version)" and "You're On Your Own, Kid". This suggests fans see them working well together in the set list based on thematic or emotional connections rather than just album cohesion.
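A similarity matrix like the one described is typically built by counting, for each pair of cards, the proportion of participants who placed them in the same group; a minimal sketch with illustrative groupings:

```python
from itertools import combinations

# Each participant's sort: a list of groups (illustrative data).
sorts = [
    [{"Midnight Rain", "The Black Dog"}, {"Shake It Off"}],
    [{"Midnight Rain", "The Black Dog", "Shake It Off"}],
    [{"Midnight Rain"}, {"The Black Dog", "Shake It Off"}],
]

cards = sorted({card for sort in sorts for group in sort for card in group})

def same_group(sort, a, b):
    # True if this participant put both cards in one group.
    return any(a in group and b in group for group in sort)

# Pairwise similarity: share of participants who grouped each pair together.
similarity = {
    (a, b): sum(same_group(s, a, b) for s in sorts) / len(sorts)
    for a, b in combinations(cards, 2)
}
print(similarity)
```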

What does it all mean?! 💃🏼📊

This card sort data paints a picture of an artist who continually evolves while maintaining certain core elements that define her work. Swift's ability to create cohesive album experiences, make bold stylistic shifts, and maintain thematic threads throughout her career is reflected in how fans perceive and categorize her songs. Moreover, the diversity of opinions on song categorization - with 59 different songs suggested as potential openers - speaks to the depth and breadth of Swift's discography. It also highlights the personal nature of music appreciation; what one fan sees as the perfect opener, another might categorize as a "Nah".

In the end, this analysis gives us a fascinating glimpse into the complex web of associations in Swift's discography. It shows us not just how Swift has evolved as an artist, but how her fans have evolved with her, creating deep and sometimes unexpected connections between songs across her entire career. Whether you're a die-hard Swiftie or a casual listener, or a weirdo who just loves a good card sort, one thing is clear: Taylor Swift's music is rich, complex, and deeply meaningful to her fans. And with each new album, she continues to surprise, delight, and challenge our expectations.

Conclusion: shaking up our understanding 🥤🤔

This deep dive into the Swiftie psyche through a card sort reveals the complexity of Taylor Swift's discography and fans' relationship with it. From strategic song placement in a dream setlist to unexpected cross-era connections, we've uncovered layers of meaning that showcase Swift's artistry and her fans' engagement. The exercise demonstrates how a song can be a potential opener, mid-show energy boost, poignant closer, or a skip-worthy track, highlighting Swift's ability to create diverse, emotionally resonant music that serves various roles in the listening experience.

The analysis underscores Swift's evolving career, with distinct album clusters alongside surprising connections, painting a picture of an artist who reinvents herself while maintaining a core essence. It also demonstrates how fan-driven analyses like card sorting can be insightful and engaging, offering a unique window into music fandom and reminding us that in Swift's discography, there's always more to discover. This exercise proves valuable whether you're a die-hard Swiftie, casual listener, or someone who loves to analyze pop culture phenomena.


67 ways to use Optimal for user research

User research and design can be tough in this fast-moving world. Sometimes we can get so wrapped up in what we’re doing, or what we think we’re supposed to be doing, that we don’t take the time to look for other options and other ways to use the tools we already know and love. I’ve compiled this list over the last few days (my brain hurts) by talking to a few customers and a few people around the office. I’m sure it’s far from comprehensive. I’ve focused on quick wins and unique examples. I’ll start off with some obvious ones, and we’ll get a little more abstract, or niche, as we go. I hope you get some ideas flying as you read through. Enjoy!

#1 Benchmark your information architecture (IA)

Without a baseline for your information architecture, you can’t easily tell if any changes you make have a positive effect. If you haven’t done so already, benchmark your existing website with Tree testing now. Upload your site structure and get results the same day. Now you’ll have IA scores to beat each month. Easy.

#2 Find out precisely where people get lost

Use the pietree visualization in Tree testing to find out exactly where people are getting lost in your website structure and where they go instead. You can also use First-click testing for this if you’re only interested in the first click, and let’s face it, that is where you’ll get the biggest bang for your buck.

#3 Start at the start

If you’re just not sure where to begin, take a screenshot of your homepage, or any page that you think might have some issues, and get going with First-click testing. Write up a string of things that people might want to do when they find themselves on this page and use these as your tasks. Surprise all your colleagues with a maddening heatmap showing where people actually clicked in response to your tasks. Now you’ll have a better idea of which area of your site to focus a tree test or card sort on for your next step.

#4 A/B test your site structure

Tree testing is great for testing more than one content structure. It’s easy to run two separate Tree testing studies, even more than two. It’ll help you decide which structure you and your team should run with, and it won’t take you long to set them up. Learn more.

#5 Make collaborative design decisions

Use OptimalSort to get your team involved and let their feedback feed your designs: logos, icons, banners, images; the list goes on. By creating a closed image sort with categories where your team can group designs based on their preferences, you can get some quick feedback to help you figure out where you should focus your efforts.

#6 Do your (market) research

Card sorting is a great UX research technique, but it can also be a fun way to involve your users in some market research. Get a better sense of what your users and customers actually want to see on your website by conducting an image sort of potential products. By providing categories like ‘I would buy this’ and ‘I wouldn’t buy this’ to indicate their preferences for each item, you can figure out what types of products appeal to your customers.

#7 Run customer satisfaction surveys

The thoughts and feelings of your users are always important. A simple survey can help you take a deeper look at your checkout process, a recently launched product or service, or even the packaging your product arrives in. The options are endless.


#8 Crowdsource content ideas

Whether you’re running a blog or a UX conference, Questions can help you generate content ideas and understand any knowledge gaps that might be out there. Figure out what your users and attendees like to read on your blog, or what they want to hear about at your event, and let this feed into what you offer.

#9 Do some sociological research

Using card sorting for sociological research is a great way to deepen your understanding of how different groups may categorize information. Rather than focusing solely on how your users interact with your product or service, consider broadening your research horizons to understand your audience’s mental models. For example, by looking at how young people group popular social media platforms, you can understand the relationships between them, and identify where your product may fit in the mix.

#10 Create tests to fit in your onboarding process

Onboarding new customers is crucial to keeping them engaged with your product, especially if it involves your users learning how to use it. You can set up a quick study to help your users stay on track with onboarding. For example, say your company provided online email marketing software. You can set up a First-click testing study using a photo of your app, with a task asking your participants where they’d click to see the open rates for a particular email that went out.


#11 Quantify the return on investment of UX

Some people, including UX Agony Aunt, define return on UX as time saved, money made, and people engaged. By attaching a value to the time spent completing tasks, or to successful completion of tasks, you can approximate an ROI or at least illustrate the difference between two options.


#12 Collate all your user testing notes using qualitative Insights

Making sense of your notes from qualitative research activities can be simultaneously exciting and overwhelming. It’s fun being out in the field and jotting down observations on a notepad, or sitting in on user interviews and documenting observations in a spreadsheet. You can now easily import all your user research and give it some traceability.


#13 Establish which tags or filters people consider to be the most important

Create a card sort with your search filters or tags as labels, and have participants rank them according to how important they consider them to be. Analytics can tell you half of the story (where people actually click), while the card sort can give you the other half: a better idea of what people actually think or want.

#14 Reduce content on landing pages to what people access regularly

Before you run an open card sort to generate new category ideas, you can run a closed card sort to find out if you have any redundant content. Say you wanted to simplify the homepage of your intranet. You can ask participants to sort cards (containing homepage links) based on how often they use them. You could compare this card sort data with analytics from your intranet and see if people’s actual behavior and perception are well aligned.

#15 Crowd-source the values you want your team/brand/product to represent

Card sorting is a well-established technique in the ‘company values’ realm, and there are some great resources online to help you and your team brainstorm the values you represent. These ‘in-person’ brainstorm sessions are great, and you can run a remote closed card sort to support your findings. And if you want feedback from more than a small group of people (if your company has, say, more than 15 staff) you can run a remote closed card sort on its own. Use Microsoft’s Reaction Card Method as card inspiration.

#16 Input your learnings and observations from a UX conference with qualitative insights

If you’re lucky enough to attend a UX conference, you can now share the experience with your colleagues. You can easily jot down ideas, quotes and key takeaways in a Reframer project and keep your notes organized by using a new session for each presenter. Bonus: if you’re part of a team, they can watch the live feed rolling into Reframer!


#17 Find out what actions people take across time

Use card sorting to understand when your participants are most likely to perform certain activities over the course of a day, week, or over the space of a year. Create categories that represent time, for example, ‘January to March’, ‘April to June’, ‘July to September’, and ‘October to December’, and ask your participants to sort activities according to the time they are most likely to do them (go on vacation, do their taxes, make big purchases, and so on). While there may be more arduous and more accurate methods for gathering this data, sometimes you need quick insights to help you make the right decisions.


#18 Gather quantitative data on prioritizing project tasks or product features

Closed card sorting can give you data that you might usually gather in team meetings or in Post-its on the wall, or that you might get through support channels. You can model your method on other prioritization techniques, including Eisenhower’s Decision Matrix, for example.

#19 Test your FAQs page with new users

Your support and knowledge base can be just as important as any other core section of your website. If your support site is lacking in navigation and UX, this will no doubt increase support tickets and drain resources. Make sure your online support section is up to scratch. Here’s an article on how to do it quickly.

#20 Figure out if your icons need labels

Figure out if your icons are doing their job by testing whether your users understand them as intended. Upload icons you currently use, or plan to use in your interface, to First-click testing, and ask your users to identify their meaning by making use of post-task questions.

#21 Give your users some handy quick tools

In some cases, users may visit your website with very specific goals in mind. Giving them access to quick tools as soon as they land is a great way to ensure they can get what they need done easily. Look at your analytics for things people do often that take several clicks to find, and check whether they can find your ‘quick tool’ in a single click using First-click testing.

#22 Benchmark the IA of your competition

We all have competitors of some sort, and researchers need to pay attention to what they get up to. Make your reporting easier by benchmarking their IA, then reviewing it each quarter to wow the board and leadership. It’s not a perfect comparison, since different sites and audiences have different flows, but you can also compare your success scores with theirs. A little healthy competition makes your work feel like the Olympics.

#23 Improve website conversions

Make the marketing team’s day with a fast improvement to some core conversions on your website. There are loads of ways to improve conversions for a checkout cart or signup form, but using First-click testing to test out ideas before you launch a live A/B test can take mere minutes and give your B version a confidence boost.

#24 Reduce the bounce rates of certain sections of your website

People jumping off your website and not continuing their experience is something (depending on the landing page) everyone tries to improve. Metrics like ‘time on site’ and ‘average page views’ show the value your whole website has to offer. Again, there are many different ways to tackle this, but one big reason people jump off a website is not being able to find what they’re looking for. That’s where our IA toolkit comes in.

#25 Test your website’s IA in different countries

No, you don’t have to spend thousands of dollars to travel to all these countries to test, although that’d be pretty sweet. You can remotely recruit and test participants from all over the world using our integrated recruitment panel. Start seeing how different cultures, languages, and countries interact with your website.

#26 Run an empathy test (card sort)

Empathy – the ability to understand and share the experience of another person – is central to the design process. An empathy test is another great tool to use in the design phase because it enables you to find out if you are creating the right kind of feelings in your users. Take your design and show it to users. Provide them with a variety of words that could represent the design – for example “minimalistic”, “dynamic”, or “professional” – and ask them to pick out the words they think best suit their experience.

#27 Test visual hierarchy with first-click testing

Use first-click testing to understand which elements draw users' attention first on your page. Upload your design and ask participants to click on the most important element, or what catches their eye first. The resulting heatmap will show you if your visual hierarchy is working as intended: are users clicking where you expect them to? This technique helps validate design decisions about sizing, color, positioning, and contrast without needing to build the actual page.

#28 Take Qualitative Insights into the field

Get out of the office or the lab and observe social behaviour in the field. Use Qualitative Insights to input your observations on your field research. Then head back to your office to start making sense of the data in the Theme Builder.

#29 Use heatmaps to get the first impressions of designs

Heatmaps in our First-click testing tool are a great way of getting first impressions of any design. You can see where people clicked (correctly and incorrectly), giving you insights on what works and doesn’t work with your designs. Because it’s so fast to test, you can iterate until your designs start singing.

#30 Multivariate testing

Multivariate testing compares more than two versions of a study, allowing you to understand which version performs best with your audience. Use multivariate testing with Tree testing and First-click testing to find the right design on which to focus and iterate.

#31 Improve your search engine optimization (SEO) with tree testing

Yes, a good IA improves your SEO. Search engines want to know how your users navigate throughout your site. Make sure people can easily find what they’re looking for, and you’ll start to see improvement in your search engine ranking.

#32 Test your mobile information architecture

As more and more people use their smartphones for apps and browsing, you need to ensure your mobile design gives users a great experience. Test the IA of your mobile site to ensure people aren’t getting lost in the mobile version of your site. If you haven’t got a mobile-friendly design yet, now’s the time to start designing it!

#33 Run an Easter egg hunt using the correct areas in first-click testing

Liven up the workday by creating a fun Easter egg hunt in first-click testing. Simply upload a photo (like those really hard “spot the X” photos), set the correct area of your target, then send out your study with participant identifiers enabled. You can also send these out as competitions and have closing rules based on time, number of participants, or both.

#34 Keystroke level modeling

When interface efficiency is important you'll want to measure how much a new design can improve task times. You can actually estimate time saved (or lost) using some well-tested approaches that are based on average human performance for typical computer-based operations like clicking, pointing and typing. Read more about measuring task times without users.
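The estimation approach described above can be sketched with the Keystroke-Level Model. The operator times below are commonly cited averages from the KLM literature; the task sequences are illustrative assumptions, not measurements of any real interface:

```python
# Minimal Keystroke-Level Model (KLM) sketch. Operator times (seconds)
# are widely cited averages; real analyses tune them to the context.
OPERATORS = {
    "K": 0.20,  # keystroke or button press (average typist)
    "P": 1.10,  # point the mouse at a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def estimate_seconds(sequence: str) -> float:
    """Sum operator times for a sequence like 'MHPKK'."""
    return round(sum(OPERATORS[op] for op in sequence), 2)

# Hypothetical comparison: an old flow (think, grab mouse, point,
# click, type 5 characters) vs. a redesigned one-click shortcut.
old_flow = "MHP" + "K" * 6
new_flow = "MHPK"
saved = estimate_seconds(old_flow) - estimate_seconds(new_flow)
print(f"Estimated time saved per task: {saved:.2f}s")
```

Multiply the per-task saving by how often the task is performed and you get a quick, defensible estimate of what a redesign is worth before testing it with users.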

#35 Prioritize features and get help with your roadmap

Find out what people think are the most important next steps for your team. Set up a card sort and ask people to categorize items and rank them in descending order of importance or impact on their work. This can also help you gauge their thoughts on potential new features for your site, and for bonus points compare team responses with customer responses.

#36 Tame your blog

Get the tags and categories in your blog under control to make life easier for your readers. Set up a card sort and use all your tags and categories as card labels. Either use your existing ones or test a fresh set of new tags and categories.

#37 Test your home button

Would an icon or text link work better for navigating to your home page? Before you go ahead and make changes to your site, you can find out by setting up a first-click test.

#38 Validate the designs in your head

As a designer, you’ve probably got umpteen designs floating around in your head at any one time. But which of these are really worth pursuing? Figure this out by using The Optimal Workshop Suite to test wireframes of new designs before putting any more work into them.

#39 ‘Buy now’ button shopping cart visibility

If you’re running an e-commerce site, ease of use and a great user experience are crucial. To see if your shopping cart and checkout processes are as good as they can be, run a first-click test.

#40 IA periodic health checks

Raise the visibility of good IA by running periodic IA health checks using Tree testing and reporting the results. Management loves metrics and catching any issues early is good too!

#41 Focus groups with qualitative insights

Thinking of launching a new product, app or website, or seeking opinions on an existing one? Focus groups can provide you with a lot of candid information that may help get your project off the ground. They’re also dangerous because they’re susceptible to groupthink, design by committee, and tunnel vision. Use with caution, but if you do then use with Qualitative Insights! Compare notes and find patterns across sessions. Pay attention to emotional triggers.

#42 Gather opinions with surveys

Whether you want the opinions of your users or from members of your team, you can set up a quick and simple survey using Surveys. It’s super useful for getting opinions on new ideas (consider it almost like a mini-focus group), or even for brainstorming with teammates.

#43 Design a style guide with card sorting

Style guides (for design and content) can take a lot of time and effort to create, especially when you need to get the guide proofed by various people in your company. To speed this up, simply create a card sort to find out what your guide should consist of. Find out the specifics in this article.

#44 Improve your company's CRM system

As your company grows, oftentimes your CRM can become riddled with outdated information and turn into a giant mess, especially if you deal with a lot of customers every day. To help clear this up, you can use card sorting and tree testing to solve navigational issues and get rid of redundant features. Learn more.

#45 Sort your life out

Let your creativity run wild, and get your team or family involved in organizing or prioritizing the things that matter. And the possibilities really are endless. Organize a long list of DIY projects, or ask the broader team how the functional pods should be re-organized. It’s up to you. How can card sorting help you in your work and daily life?

#46 Create an online diary study

Whether it’s a product, app or website, finding out the long-term behaviour and thoughts of your users is important. That’s where diary studies come in. For those new to this concept, diary studies are a longitudinal research method, aimed at collecting insights about a participant’s needs and behaviors. Participants note down activities as they’re using a particular product, app, or website. Add your participants into a qualitative study and allow them to create their diary study with ease.

#47 Source specific data with an online survey

Online survey tools can complement your existing research by sourcing specific information from your participants. For example, if you need to find out more about how your participants use social media, which sites they use, and on which devices, you can do it all through a simple survey questionnaire. Additionally, if you need to identify usage patterns, device preferences or get information on what other products/websites your users are aware of/are using, a questionnaire is the ticket.

#48 Guerrilla testing with First-click testing

For really quick first-click testing, take First-click testing on a tablet, mobile device or laptop to a local coffee shop. Ask people standing in line if they’d like to take part in your super quick test in exchange for a cup of joe. Easy!

#50 Ask post-task questions for tree testing and first-click testing

You can now set specific task-related questions for both Tree testing and First-click testing. This is a great way to dive deeper into the mushy minds of your participants. Check out how to use this new(ish) feature here!

#51 Start testing prototypes

Paper prototypes are great, but what happens when your users are scattered around the globe and you can’t invite them to an in-person test? By scanning (or taking a photo of) your paper prototypes, you can use first-click testing to test them with your users quickly and easily. Read more about our approach here.

#52 Take better notes for sense making

Qualitative research involves a lot of note-taking. So naturally, to be better at this method, improving how you take notes is important. Reframer is designed to make note-taking easy but it can still be an art. Learn more.

#53 Make sure you get the user's first-click right

Like most things, read a little, and then it’s all about practice. We’ve found that people who get the first click correct are almost three times as likely to complete a task successfully. Get your first clicks right in tree testing and first-click testing and you’ll start seeing your customers smile.


#54 Run a cat survey. Yep, cats!

We’ve gained some insight into how people intuitively group cats, and so can you (unless you’re a dog person). Honestly, doing something silly can be a useful way to introduce your team to a new method on a Friday afternoon. Remember to distribute the results!


#55 Destroy evil attractors in your tree

Evil attractors are those labels in your IA that attract unjustified clicks across tasks. This usually means the chosen label is ambiguous, or possibly a catch-all phrase like ‘Resources’. Read how to quickly identify evil attractors in the Destinations table of tree test results and how to fix them.

#56 Affinity map using card sorts

We all love our Post-its and sticking things on walls. But sometimes you need something quicker and accessible for people in remote areas. Try out using Card Sorts for a distributed approach to making sense of all the notes. Plus, you can easily import any qualitative insights when creating cards in card sort. Easy.

#57 Preference test with first-click testing

Whether you’re coming up with a new logo design, headline, featured image, or anything, you can preference test it with First-click testing. Create an image that shows the two designs side by side and upload it to First-click testing. From there, you can ask people to click whichever one they prefer!

#58 Add moderated card sort results to your card sort

An excellent way of gathering valuable qualitative insights alongside the results of your remote card sorts is to run a moderated version of the sorts with a smaller group of participants. When you can observe and interact with your participants as they complete the sort, you’ll be able to ask questions and learn more about their mental models and the reasons why they have categorized things in a particular way. Learn more.

#59 Test search box variations with first-click testing

Case study by Viget: “One of the most heavily used features of the website is its keyword search, so we wanted to make absolutely certain that our redesigned search box didn’t make search harder for users to find and use.”

#60 Run an image card sort to organize products into groups

You can add an image to each card, allowing you to understand how your participants organize and label particular items. This is very useful if you want to organize retail products and find out how other people would group them given visual context such as shape and color.

#61 Test your customers' perceptions of different logo and brand image designs

Understand how customers perceive your brand by creating a closed card sort. Come up with a list of categories, and ask participants to sort images such as logos and other branded imagery.

#62 Run an open image card sort to classify images into groups based on the emotions they elicit

Are these pictures exhilarating, or terrifying? Are they humorous, or offensive? Relaxing, or boring? Productive, or frantic? Happy memories, or a deep sigh?

#63 Run an image card sort to organize your library

Whether it’s a physical library of books, or a digital drive full of ebooks, you can run a card sort to help organize them in a way that makes sense. Will it be by genre, author name, color or topic? Send out the study to your coworkers to get their input! You can also do this at home for your own personal library, and you can include music/CDs/vinyl records and movies!

#64 HR exercises to determine the motivations of your team

It’s simple to ask your team about their thoughts, feelings, and motivations with a Questions survey. You can choose to leave participant identifiers blank (so responses are anonymous), or you can ask for a name/email address. As a bonus, you can set up a calendar reminder to send out a new survey in the next quarter. Duplicate the survey and send it out again!

#65 Designing physical environments

If your company has a physical environment that your customers visit, you can research new layouts using a mixture of tools in The Optimal Workshop Suite. This especially comes in handy if your customers rely on information within the physical environment to make decisions. For example, picture a retail store. Are all the signs clear? Do they communicate the right information? Are people overwhelmed by the space?

#66 Use tree testing to refine an interactive phone menu system

Similar to how you’d design an IA, you can create a tree test to design an automated phone system. Whether you’re designing from the ground up, or improving your existing system, you will be able to find out if people are getting lost.


#67 Have your research team categorize and prioritize all these ideas

Before you dig deeper into more of these ideas, ask the rest of the team to help you decide which one to focus on. We won’t get in the way of your work. Log into your account and start on your quick wins. Here’s a spreadsheet of this list to upload to card sort. Aaaaaaaaaaand that’s a wrap! *Takes out gym towel and wipes sweaty face.*
*Got any more suggestions to add to this list? We’d love to hear them in our comments section, and we might even add them to the list!

Seeing is believing

Explore our tools and see how Optimal makes gathering insights simple, powerful, and impactful.