March 20, 2023
4 min

Using User Engagement Metrics to Improve Your Website's User Experience

Optimal Workshop

Are your users engaged with your website? The success of your website will largely depend on your answer. After all, engaged users are valuable users; they keep coming back and will recommend your site to colleagues, friends, and family. So, if you’re not sure whether your users are engaged, consider looking into your user engagement metrics.

User engagement can be measured using a number of key metrics provided by website analytics platforms. Metrics such as bounce rate, time on page, and click-through rate all provide clues to user engagement and therefore overall website user experience.

This article will help you understand user engagement and why it’s important to measure. We’ll also discuss how to apply user engagement insights to improve website success. Combining a little bit of data with some user research is a powerful thing, so let’s get into it.

Understanding User Engagement Metrics 📐

User engagement metrics provide valuable insight for both new and existing websites. They should be checked regularly as a sort of ‘pulse check’ for website user experience and performance. So, what metrics should you be looking at? Website metrics can be overwhelming; there are hundreds if not thousands to analyze, so let’s focus on four:

Bounce rate


Measures the percentage of users who visit just one page on your site before leaving. If your bounce rate is high, it suggests that users aren’t finding the content relevant, engaging, or useful. It points to a poor initial reaction to your site: users are arriving, making a judgment about your design or content, and then leaving.
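
To make the calculation concrete, here’s a minimal sketch with made-up numbers; in practice your analytics platform reports this figure for you, so treat this as a back-of-the-envelope illustration only:

```python
# Hypothetical session counts for one month
single_page_sessions = 4_200   # sessions that viewed exactly one page, then left
total_sessions = 10_000        # all sessions in the same period

bounce_rate = single_page_sessions / total_sessions * 100
print(f"Bounce rate: {bounce_rate:.1f}%")  # -> Bounce rate: 42.0%
```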

Time on page


Calculated as the difference between the time a person lands on a page and the time they move on to the next one. It indicates how engaging or relevant individual pages on your website are. Low time on page figures suggest that users aren’t getting what they need from a certain page, either in terms of the content, the aesthetics, or both.
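
For illustration, here’s a small sketch of that calculation over a single hypothetical visit; the page paths and timestamps are invented, and note that the final page of a visit has no ‘next’ view to measure against:

```python
from datetime import datetime

# Hypothetical page views from one visit: (page, time the view started)
pageviews = [
    ("/pricing",  datetime(2023, 3, 20, 10, 0, 0)),
    ("/features", datetime(2023, 3, 20, 10, 2, 30)),
    ("/signup",   datetime(2023, 3, 20, 10, 3, 10)),
]

# Time on page = gap between landing on a page and moving to the next one
for (page, start), (_, next_start) in zip(pageviews, pageviews[1:]):
    seconds = (next_start - start).total_seconds()
    print(f"{page}: {seconds:.0f} seconds")
# -> /pricing: 150 seconds
# -> /features: 40 seconds
```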

Click-through rate


Click-through rate compares the number of times someone clicks on your content to the number of impressions you get (how many times an internal link or ad was viewed). The higher the rate, the better the engagement and performance of that element. User experience design can influence click-through rates through copywriting, button contrast, heading structure, navigation, etc.
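
Again with invented numbers (a hypothetical internal promo banner), the calculation itself is simple division:

```python
# Hypothetical figures for an internal promo banner
clicks = 180         # times the banner was clicked
impressions = 6_000  # times the banner was shown

click_through_rate = clicks / impressions * 100
print(f"Click-through rate: {click_through_rate:.1f}%")  # -> 3.0%
```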

Conversion rate


Conversion rates are perhaps the pinnacle of user engagement metrics. Conversion rate is the percentage of users who complete specific tasks you define. Conversion rates are therefore dictated by your goals, which could include form submissions, transactions, etc. If your website has high conversion rates, you can be fairly confident that it’s matching your users’ needs, requirements, and expectations.
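
As a sketch, assuming your defined goal is a newsletter sign-up and using made-up visitor counts:

```python
# Hypothetical figures for one goal you define yourself
newsletter_signups = 250   # visitors who completed the goal
unique_visitors = 10_000   # unique visitors in the same period

conversion_rate = newsletter_signups / unique_visitors * 100
print(f"Conversion rate: {conversion_rate:.1f}%")  # -> 2.5%
```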

But how do these metrics help? Well, they don’t give you an answer directly. The metrics point to potential issues with website user experience. They guide further research and subsequent updates that lead to website improvement. In the next section, we’ll discuss how these and others can support better website user experiences.

Identifying Areas for Improvement 💡

So, you’ve looked at your website’s user engagement metrics and discovered some good, and some bad. The good news is, there’s value in discovering both! The catch? You just need to find it. Remember, the metrics on their own don’t give you answers; they provide you direction.

The ‘clues’ that user engagement metrics provide are the starting point for further research. Remember, we want to make data-driven decisions. We want to avoid making assumptions and jumping to conclusions about why our website is reporting certain metrics. Fortunately, there are a bunch of different ways to do this.

User research data can be gathered using both qualitative and quantitative research techniques. Insights into user behavior and needs can reveal why your website might be performing in certain ways.

Qualitative research techniques

  • Usability test – Test a product with people by observing them as they attempt to complete various tasks.
  • User interview – Sit down with a user to learn more about their background, motivations and pain points.
  • Contextual inquiry – Learn more about your users in their own environment by asking them questions before moving onto an observation activity.
  • Focus group – Gather 6 to 10 people for a forum-like session to get feedback on a product.

Quantitative research techniques

  • Card sorts – Find out how people categorize and sort information on your website.
  • First-click tests – See where people click first when tasked with completing an action.
  • A/B tests – Compare 2 versions of a design in order to work out which is more effective.
  • Clickstream analysis – Analyze aggregate data about website visits.
  • Tree testing – Test your site structure using text-only categorization and labels.

The type of research depends on what question you want to answer. Being specific about your question will help you choose the right research technique(s) and will ultimately determine the quality of your answer. If you’re serious about website improvement, identify problem areas with user engagement metrics, then investigate how to fix them with user research.

Optimizing Content and Design

If you have conducted user research and found weak areas on your website, there are many things to consider. Three good places to start are navigation, content, and website layout. Combined, these have a huge impact on user experience and can be leveraged to address disappointing engagement metrics.

Navigation


Navigation is a crucial aspect of creating a good user experience: it connects pages and content, allowing users to find what they need. Navigation should be simple and easy to follow, with important information and actions at the top of menus. Observing the results of card sorting, tree testing, and user testing can be particularly useful in website optimization efforts. You may find that search bars, breadcrumb trails, and internal links can also help overcome navigation issues.

Content


Are users seeing compelling or relevant content when they arrive on your site? Is your content organized in a way that encourages further exploration? Card sorting and content audits are useful in answering these questions and can help provide you with the insights required to optimize your content. You should identify what content might be redundant, out of date, or repetitive, as well as any gaps that may need filling.

Layout


A well-designed layout can improve the overall usability of a website, making it easier for users to find what they're looking for, understand the content, and engage with it. Consider how consistent your heading structures are and be sure to use consistent styling throughout the site, such as similar font sizes and colors. Don’t be afraid to use white space; it’s great at breaking up sections and making content more readable.

An additional factor related to layout is mobile optimization. Mobile-first design is necessary for apps, but it should also factor into your website design. How responsive is your website? How easy is it to navigate on mobile? Is your font size appropriate? You might find that a poor mobile experience is negatively impacting your user engagement metrics.

Measuring Success 🔎

User experience design is an iterative, ongoing process, so it’s important to keep a record of your website’s user experience metrics at various points of development. Fortunately, website analytics platforms will provide you with historic user data and key metrics, but be sure to keep a separate record of the improvements you make along the way. This will help you pinpoint which changes impacted which metrics.

Define your goals and create a website optimization checklist that monitors key metrics on your site. For example, whenever you make an update, ensure bounce rates don’t exceed a certain threshold in the days that follow, check that your conversion rates are performing as they should be, and check that your time on site hasn’t dropped. Be sure to compare metrics between desktop and mobile too.
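
If it helps to make the checklist idea concrete, here’s a minimal sketch of a post-update check; the metric names, thresholds, and current values are all hypothetical, and in practice you’d pull the current figures from your analytics platform rather than hard-coding them:

```python
# Hypothetical post-update check against the thresholds in your optimization checklist
checklist = {
    "bounce_rate_pct":     {"current": 58.0, "limit": 55.0, "higher_is_worse": True},
    "conversion_rate_pct": {"current": 2.1,  "limit": 1.8,  "higher_is_worse": False},
    "time_on_page_sec":    {"current": 62.0, "limit": 45.0, "higher_is_worse": False},
}

for name, m in checklist.items():
    if m["higher_is_worse"]:
        breached = m["current"] > m["limit"]   # e.g. bounce rate climbing past its ceiling
    else:
        breached = m["current"] < m["limit"]   # e.g. conversions dropping below their floor
    status = "investigate" if breached else "ok"
    print(f"{name}: {m['current']} (threshold {m['limit']}) -> {status}")
# -> bounce_rate_pct: 58.0 (threshold 55.0) -> investigate
# -> conversion_rate_pct: 2.1 (threshold 1.8) -> ok
# -> time_on_page_sec: 62.0 (threshold 45.0) -> ok
```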

Users’ needs and expectations change over time, so keep an eye on how new content is performing. For example, which new blog posts have attracted the most attention? What pages or topics have had the most page views compared to the previous period? Tracking such changes can help reveal what your users are currently engaged with, and will help guide your user experience improvements.

Conclusion 🤗

User engagement metrics allow you to put clear parameters around user experience. They allow you to measure where your website is performing well, and where your website might need improving. Their main strength is in how accessible they are; you can access key metrics on website analytics platforms in moments. However, user engagement metrics on their own may not reveal how and why certain website improvements should be made. In order to understand what’s going on, you often need to dig a little deeper.

Time on page, bounce rate, click-through rate, and conversion rate are all great starting points for understanding your next steps toward website improvement. Use them to define where further research may be needed. Not sure why your average pages per session is only two? Try first-click testing to see where users head first and where they hit a dead end. Is your bounce rate too high? Conduct a content audit to find out whether your information is still relevant, or look into navigation roadblocks. Whatever the question, keep searching for the answer.

User engagement metrics will keep you on your toes, but that’s a good thing. They empower you to make ongoing website improvements and ensure that users are at the heart of your website design. 


Related articles

Content design for startups: how to work lean, have maximum impact, and get all the high-fives

When you have a small design team or none at all, how do you ensure that your content is consistent, has the right tone, and is captivating? It can be difficult, but it doesn’t have to be! Julia Steffen, Principal Content Designer at Varis, spoke at UX New Zealand, the leading UX and IA conference in New Zealand hosted by Optimal Workshop, about how startups can achieve impactful content and delight users. 

In her talk, Julia shares her most useful tips, tricks, and rules of thumb to ensure meaningful content design. She also shares some helpful tools to achieve maximum efficiency.

Julia Steffen bio 🎤

Julia has worked in content for 10+ years at St.Jude, Wunderman Thompson, MetaLab, and Grubhub. She is based in the United States and is the Principal Content Designer at Varis.

Contact Details:

Email address: julia.steffen@govaris.com

You can find Julia on LinkedIn

Content design for startups - How to work lean, have maximum impact, and get all the high-fives ✋🏽✋🏻✋🏿

Why should you care about content design? Julia argues that “content design is product success”. Because Julia specifically talks about content design in relation to startups, she focuses on how to achieve the best results possible with a small, lean team. To that end, Julia discusses four must-haves for content design:

  1. Voice
  2. Tools for efficiency
  3. Words in the experience
  4. Ways to check, test, and perfect your words

Voice 🎙️

Why is your company’s voice important? Voice tells your users who you are, creates meaningful connections, and provides valuable signals that convey whether or not your company is deserving of trust. Choosing the voice for your startup begins with a competitor audit. Documenting who you compete against and how you might want to differentiate your startup is crucial to finding your corner of the market. For example, is your voice welcoming, gentle, and positive, or are you more formal and technical?

User research can also be really helpful when determining and monitoring your voice. Involve your research team and learn what does and doesn’t delight your audience when it comes to your messaging.

It’s also important to map your voice to your startup’s values. Be sure to connect to your mission and your brand. Julia sums up product voice as:

Product voice = your values + space to differentiate + what research tells you

So, when you find your voice, where can you lean into it? There are several key areas or moments that provide opportunities to share your unique voice, such as:

  • Notifications: Emails, SMS, and in-app messages are a great place to delight customers
  • Success states: Celebrate with your users in your voice (and remove any anxiety that may be there)
  • Empty states: They aren’t just a chance to educate, they’re a chance to add some interest or fun (or to mask a UX issue).
  • Placeholder text: If a field is well labeled, you can use this section to bring joy and reduce a user’s anxiety.
  • Onboarding: You never get a second chance to make a first impression. Make it count!

Tools for efficiency ⚒️

To remain lean and efficient as a startup, one of the best things you can do is create a style guide. This helps to keep your content and voice consistent. For example, what pronouns do you use in your interface, do you capitalize certain words, etc? There is actually a lot to consider here, so Julia points viewers to various resources that allow you to copy and paste, such as Quinn Keast’s Product Language Framework.

A glossary or language bank is also important. Record branded words, terms that you never use, and terms that you’ve heard your users say organically. This helps to ensure that you’re using language that resonates with your audience and language that reduces cognitive load as much as possible.

Pro tip: Use the Writer app with Figma. This integration helps to ensure that your style guide is actually used! It includes your style guide and glossary so that you’re being consistent as you work. You can also use the Hemingway app or Grammarly to look out for passive voice, hard-to-parse sentences, and overall readability.

Words in the experience – writing for content design 📝

The first thing Julia points out when approaching writing is the need to be user-focused. This might seem obvious to UX practitioners, but word selection can be nuanced, and subtle changes can be powerful. For example, instead of writing “[Your company] introduces a new feature”, think about how you can change the statement to be more about what the new feature means for the user, rather than your company. Here are a few rules of thumb to help refine your writing.

  • Clarity over cleverness. Unless you’re clear and the message is understood by your user, even the best jokes and wittiest phrases in the world will be wasted.
  • Write like you’re having a conversation with your Grandmother. Be clear and don’t use too much jargon.
  • Think like the best content designers. Writing is a process and there are several things to consider, such as the purpose of your copy, the context that it’s being read, and what emotion the reader might be feeling at that moment, etc. Julia offers the Microcopy Canvas as a useful tool for startups, which is a helpful writing template/worksheet created by Jane Ruffino.

Ways to check, test, and perfect your words 👀

Julia suggests that design reviews are the perfect place to sense-check your words and content. Review your designs intentionally and through a content lens. Again, the Microcopy Canvas can be a useful tool when conducting this step, helping to ensure you have considered the right tone and achieved your purpose with your words.

Following a design review process, it’s important to test for clarity and affinity. Conduct user tests frequently to ensure your words and content are clear, understood, and hitting the mark in the intended way.

Finally, make sure your content goals are recorded in your dashboards. Be accountable to your own success measures, KPIs, and OKRs (Objectives and Key Results). Some metrics that help track success are:

  • Onboarding flows
  • Notification metrics
  • Feature adoption
  • Conversion rates

If you’re falling short on some metrics, review your content and try to figure out where words can be sharpened to be clearer, more friendly, or less technical, for example. Then, feed this information into your prioritization and planning. What changes are going to have the most impact on your product’s success? What changes are quick wins? 

Why it matters 🤯

Julia’s talk is important for UX and content designers, particularly those working in startup environments, as it highlights the critical role of content design in achieving product success. The content you share, the voice and tone you adopt, and the clarity of your communication all add to the user’s overall experience with your product. Investing time in your content is critical and, as Julia explains, it doesn’t have to put too much stress on your team’s workload. If that time isn’t invested, however, you may find yourself with poor content delivering poor experiences, resulting in high customer attrition.

Efficiency, therefore, should be a focus for startups wanting to achieve great content design without being weighed down. Julia offers pragmatic advice on maintaining consistency through tools like style guides and language banks and by leveraging apps like Hemingway and Grammarly. Tools like these are incredibly helpful when streamlining processes and ensuring a cohesive and polished user interface. 

At the end of the day, Julia stresses the impact that content design has on user experiences and encourages startups to pay close attention to content in ways that are achievable for small teams.

How we created a content strategy without realizing it

Tania Hockings is the Senior Digital Content Advisor at ACC and has a passion for creating content that caters to users (not the business). Co-presenter and co-author Amy Stoks is a Senior Experience Designer at PwC’s Experience Centre and possesses a love of empathy and collaboration. Ahead of their presentation at UX New Zealand 2017, Tania and Amy share their experience creating a content strategy while working on a project for the Accident Compensation Corporation (ACC’s no-fault scheme helps pay the cost of care for everyone in New Zealand if they’re injured in an accident).

It’s a truth universally acknowledged that before you start writing content you’re supposed to have a content strategy. Three months out from launch of the new acc.co.nz beta site, we did not have one. Nor did we have a lot of time, very much resource, or a completed content audit.

However, we did have:

  • Some pretty good design principles, based on user research
  • A list of the 37 priority tasks that our users needed to do on the acc.co.nz website
  • One content writer (Tania) and one part-time content consultant (Amy)
  • A deadline
  • Freedom to do whatever we needed to do to get content live for the beta launch.

Here’s a quick look into how we created a content strategy for acc.co.nz without actually realizing it.

Content principles are a great starting point

We needed more direction than those 37 tasks to get writing, so inspired by our design principles, we wrote some equivalent principles for the content. We decided to start with the tried and tested principles already in play by Govt.nz and GOV.UK — we didn’t have time to reinvent the wheel. We ended up with eight principles for how we would write our content:

  1. Do less: only create what is needed and what we can maintain
  2. Always have an evidence-based user need: we know why a piece of content is needed and have the evidence to back it up
  3. Ask for constant feedback: we use feedback and analytics to understand what people want and need to know, as well as what they don’t
  4. Provide a consistent customer experience: our voice is the same across all platforms and our content is accessible for every kind of user
  5. Create seamless customer journeys: no dead ends or broken links, no unnecessary information
  6. Improve how ACC writes: we build good relationships with content contributors and proactively offer content support and training
  7. Ensure transparent ownership for all content: every piece of content has an owner, a business group and a digital advisor assigned
  8. Accept that not everything has to live on acc.co.nz: other channels share ACC content, so if ACC isn’t the source of truth for information we don’t put it on acc.co.nz

We made a checklist of what would and wouldn’t live on acc.co.nz according to the principles...and that was pretty much it. We really didn’t have time to do much else because the design of the site was running ahead of us. We also needed to get some content in front of users at our weekly testing sessions.

Sometimes you’ve just gotta get writing

We got stuck into writing those 37 tasks using our pair writing approach, which was also an experiment, but more on that in our UX New Zealand talk. While we wrote, we were living and breathing the content principles: we introduced them to our internal subject experts while we were writing and constantly referred back to the principles to help structure the content.

After the beta launch, we had a few more content writers on the team and a bit of time to breathe (but not much!). We actually wrote the principles down and put them into a visual briefing pack to give to the subject experts ahead of our pair writing sessions. This pack covered:

  • our principles
  • the goal of the project
  • the process
  • the usual best practice webby stuff.

As we wrote more content, the briefing pack and our process evolved based on what we learned and feedback from our subject experts about what was and wasn’t working.

During the same brief intermission, we also had a chance to revisit the content strategy. However, in practice we just did a brainstorm on a whiteboard of what the strategy might be. It looked like this:

[Image: whiteboard brainstorm of the draft content strategy]

And it stayed like that for another six months. We can’t remember if we ever looked at it much, but we felt good knowing it was there.

Seriously, we really need a content strategy...don’t we?

We finally got to the end of the project. The launch date was looming, but still no content strategy. So we booked a room. Three of us agreed to meet to nut it out and finally write our formal content strategy. We talked for a bit, going around in circles, until we realized we’d already done it. The briefing pack was the content strategy: less a formal document and more a living set of principles for how we had been working and would continue to work.

Would we do it again?

Yeah, we would. In fact, the ACC digital team is already following the same approach on a new project. Content principles are key: they’re simple, practical to socialize and easy to stick to. We found it really valuable to evolve the strategy as we learned more from user research and subject matter expertise.

Of course, it wasn’t all rosy — these projects never are! Some governance and muscle behind what we were doing would have really helped. We found ourselves in some intense stakeholder meetings where we were the first line of defence for the content principles. Unsurprisingly, not everybody agrees with doing less! But we’re pretty sure that having a longer strategy formalized in an official document still wouldn’t have helped us much.

The next piece of work for the digital team at ACC is defining that governance and building a collaborative process to design digital products within the organization. The plan is to run informal, regular catch-ups with internal stakeholders to make sure the content strategy is still relevant and working for ACC’s users and the organization itself.

If you remember anything from this blog post, remember this:

Treat your content strategy less like a formal document and more like a working hypothesis that changes and grows as you learn. A formal document might make you feel good, but it’s likely no one is reading it.

Whatever format you choose for your content strategy/working hypothesis, make sure you get it signed off and endorsed by the people who matter. You’ll need backup in those inevitably tense project meetings!

The acc.co.nz content strategy looks awesome these days — very visual and easy to read. Tania always has a copy in her notebook and carries it with her everywhere. If you’re lucky enough to run into her on the streets of Wellington, she might just show it to you.

Want to hear more? Come to UX New Zealand!

If you'd like to hear more about designing content, plus a bunch of other cool UX-related talks, head along to UX New Zealand 2017 hosted by Optimal Workshop. The conference runs from 11-13 October including a day of fantastic workshops, and you can get your tickets here.

The Role of Usability Metrics in User-Centered Design

The term ‘usability’ captures how usable, useful, enjoyable, and intuitive a website or app is perceived to be by its users. By its very nature, usability is somewhat subjective. But what we’re really looking for when we talk about usability is how well a website can be used to achieve a specific task or goal. Using this definition, we can analyze usability metrics (standard units of measurement) to understand how well a user experience design is performing.

Usability metrics provide helpful insights before and after any digital product is launched. They help us form a deeper understanding of how we can design with the user front of mind. This user-centered design approach is considered best practice in building effective information architecture and user experiences that help websites, apps, and software meet and exceed users’ needs.

In this article, we’ll highlight key usability metrics, how to measure and understand them, and how you can apply them to improve user experience.

Understanding Usability Metrics

Usability metrics aim to understand three core elements of usability, namely: effectiveness, efficiency, and satisfaction. A variety of research techniques offer designers an avenue for quantifying usability. Quantifying usability is key because we want to measure and understand it objectively, rather than making assumptions.

Types of Usability Metrics

There are a few key metrics that we can measure directly if we’re looking to quantify effectiveness, efficiency, and satisfaction. Here are four common examples, followed by a quick worked illustration:

  • Success rate: Also known as ‘completion rate’, success rate is the percentage of users who were able to successfully complete the tasks.
  • Time-based efficiency: Also known as ‘time on task’, time-based efficiency measures how much time a user needs to complete a certain task.
  • Number of errors: Sounds like what it is! It measures the average number of errors per user when performing a given task.
  • Post-task satisfaction: Measures a user's general impression or satisfaction after completing (or not completing) a given task.
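
To show how these figures come together, here’s a minimal sketch that computes all four metrics above from one hypothetical five-participant usability test; the results and the 1–5 satisfaction scale are invented for illustration only:

```python
# Hypothetical results from a 5-participant usability test of a single task.
# Each record: whether the task was completed, time taken (seconds),
# errors made, and post-task satisfaction rating (1-5).
results = [
    {"completed": True,  "seconds": 95,  "errors": 0, "satisfaction": 5},
    {"completed": True,  "seconds": 130, "errors": 1, "satisfaction": 4},
    {"completed": False, "seconds": 240, "errors": 3, "satisfaction": 2},
    {"completed": True,  "seconds": 110, "errors": 0, "satisfaction": 4},
    {"completed": True,  "seconds": 150, "errors": 2, "satisfaction": 3},
]

n = len(results)
success_rate = sum(r["completed"] for r in results) / n * 100
avg_time_on_task = sum(r["seconds"] for r in results) / n
avg_errors = sum(r["errors"] for r in results) / n
avg_satisfaction = sum(r["satisfaction"] for r in results) / n

print(f"Success rate: {success_rate:.0f}%")               # -> 80%
print(f"Average time on task: {avg_time_on_task:.0f} s")  # -> 145 s
print(f"Average errors per user: {avg_errors:.1f}")       # -> 1.2
print(f"Average satisfaction: {avg_satisfaction:.1f}/5")  # -> 3.6/5
```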

How to Collect Usability Metrics


Usability metrics are outputs from research techniques deployed when conducting usability testing. Usability testing in web design, for example, involves assessing how a user interacts with the website by observing (and listening to) users completing defined tasks, such as purchasing a product or signing up for newsletters.

Conducting usability testing and collecting usability metrics usually involves:

  • Defining a set of tasks that you want to test
  • Recruitment of test participants
  • Observing participants (remotely or in-person)
  • Recording detailed observations
  • Follow-up satisfaction survey or questionnaire

Tools such as Reframer are helpful for conducting usability tests remotely, and they enable live collaboration between multiple team members, which is extremely handy when trying to record and organize all of those insightful observations! Using paper prototypes is an inexpensive way to test usability early in the design process.

The Importance of Usability Metrics in User-Centered Design

User-centered design challenges designers to put user needs first. This means in order to deploy user-centered design, you need to understand your user. This is where usability testing and metrics add value to website and app performance; they provide direct, objective insight into user behavior, needs, and frustrations. If your user isn’t getting what they want or expect, they’ll simply leave and look elsewhere.

Usability metrics identify which parts of your design aren’t hitting the mark. Recognizing where users might be having trouble completing certain actions, or where users are regularly making errors, are vital insights when implementing user-centered design. In short, user-centered design relies on data-driven user insight.

But why hark on about usability metrics and user-centered design? Because at the heart of most successful businesses is a well-solved user problem. Take Spotify, for example, which solved the problem of dodgy, pirated digital files being so unreliable. People liked access to free digital music, but they had to battle viruses and fake files to get it. With Spotify, for a small monthly fee, or the cost of listening to a few ads, users have the best of both worlds. The same principle applies to user experience - identify recurring problems, then solve them.

Best Practices for Using Usability Metrics

Usability metrics should be analyzed by design teams of every size. However, there are some things to bear in mind when using usability metrics to inform design decisions:

  • Defining success: Usability metrics are only valuable if they are being measured against clearly defined benchmarks. Many tasks and processes are unique to each business, so use appropriate comparisons and targets; usually in the form of an ‘optimized’ user (a user with high efficiency).
  • Real user metrics: Be sure to test with participants that represent your final user base. For example, there’s little point in testing your team, who will likely be intimately aware of your business structure, terminology, and internal workflows.
  • Test early: Usability testing and subsequent usability metrics provide the most impact early on in the design process. This usually means testing an early prototype or even a paper prototype. Early testing helps to avoid any significant, unforeseen challenges that could be difficult to rewind in your information architecture.
  • Regular testing: Usability metrics can change over time as user behavior and familiarity with digital products evolve. You should also test and analyze the usability of new feature releases on your website or app.

Remember, data analysis is only as good as the data itself. Give yourself the best chance of designing exceptional user experiences by collecting, researching, and analyzing meaningful and accurate usability metrics.

Conclusion

Usability metrics are a guiding light when it comes to user experience. As the old saying goes, “you can’t manage what you can’t measure”. By including usability metrics in your design process, you invite direct user feedback into your product. This is ideal because we want to leave any assumptions or guesswork about user experience at the door.

User-centered design inherently relies on constant user research. Usability metrics such as success rate, time-based efficiency, number of errors, and post-task satisfaction will highlight potential shortcomings in your design. Subsequently, they identify where improvements can be made, AND they lay down a benchmark to check whether any resulting updates addressed the issues.

Ready to start collecting and analyzing usability metrics? Check out our guide to planning and running effective usability tests to get a head start!
