Optimal Blog

Articles and Podcasts on Customer Service, AI and Automation, Product, and more


My journey running a design sprint

Recently, everyone in the design industry has been talking about design sprints. So, naturally, the team at Optimal Workshop wanted to see what all the fuss was about. I picked up a copy of The Sprint Book and suggested to the team that we try out the technique.

In order to keep momentum, we identified a current problem and decided to run the sprint only two weeks later. The short notice was a bit of a challenge, but in the end we made it work. Here’s a rundown of how things went, what worked, what didn’t, and lessons learned.

A sprint is an intensive, focused period of time in which a product or feature is designed and tested, with the goal of knowing whether or not the team should keep investing in developing the idea. The idea needs to be either validated or invalidated by the end of the sprint. This saves time and resources further down the track, because the team can pivot early if the idea doesn’t float.

If you’re following The Sprint Book, you might have a structured five-day plan that looks like this:

  • Day 1 - Understand: Discover the business opportunity, the audience, the competition, the value proposition and define metrics of success.
  • Day 2 - Diverge: Explore, develop and iterate creative ways of solving the problem, regardless of feasibility.
  • Day 3 - Converge: Identify ideas that fit the next product cycle and explore them in further detail through storyboarding.
  • Day 4 - Prototype: Design and prepare prototype(s) that can be tested with people.
  • Day 5 - Test: User testing with the product's primary target audience.
Design sprint cycle: with a design sprint, a product doesn't need to go full cycle for the team to learn about the opportunities and gather feedback.

When you’re running a design sprint, it’s important that you have the right people in the room. It’s all about focus and working fast; you need the right people around in order to do this without hitting blockers down the path. Team, stakeholder and expert buy-in is key — this is not a task just for the design team!

After getting buy-in and picking out the people who should be involved (developers, designers, product owner, customer success rep, marketing rep, user researcher), these were my next steps:

Pre-sprint

  1. Read the book
  2. Panic
  3. Send out invites
  4. Write the agenda
  5. Book a meeting room
  6. Organize food and coffee
  7. Get supplies (Post-its, paper, Sharpies, laptops, chargers, cameras)

Some fresh smoothies for the sprinters made by our juice technician

The sprint

Due to scheduling issues we had to split the sprint over the end of the week and the weekend. Sprint guidelines suggest you hold it from Monday to Friday, which is a nice block of time, but we had to run Thursday to Thursday with the weekend off in between, and that actually worked really well. We’re all self-confessed introverts and, to be honest, the thought of spending five solid days workshopping was daunting. About two days in we were exhausted; we went away for the weekend and came back on Monday feeling sociable, recharged, and ready to examine the work from the first two days with fresh eyes.

Design sprint activities

During our sprint we completed a range of different activities, but here’s a list of some that worked well for us. You can find out more information about how to run most of these over at The Sprint Book website, or check out some great resources over at Design Sprint Kit.

Lightning talks

We kicked off our sprint by having each person give a quick five-minute talk on one of the topics listed below. This gave us all an overview of the whole project, and since we each had to present, each of us became the expert in our area and engaged with the topic (rather than just listening to one person deliver all the information).

Our lightning talk topics included:

  • Product history - where we’ve come from, so the whole group has an understanding of who we are and why we’ve made the things we’ve made.
  • Vision and business goals - (from the product owner or CEO) a look ahead not just of the tools we provide but where we want the business to go in the future.
  • User feedback - what have users been saying so far about the idea we’ve chosen for our sprint. This information is collected by our User Research and Customer Success teams.
  • Technical review - an overview of our tech and anything we should be aware of (or a look at possible available tech). This is a good chance to get an engineering lead in to share technical opportunities.
  • Comparative research - what else is out there, how have other teams or products addressed this problem space?

Empathy exercise

I asked the sprinters to participate in an exercise so that we could gain empathy for those who are using our tools. The task was to pretend we were one of our customers who had to present a dendrogram to some of our team members who are not involved in product development or user research. In this frame of mind, we had to talk through how we might start to draw conclusions from the data presented to the stakeholders. We all gained more empathy for what it’s like to be a researcher trying to use the graphs in our tools to gain insights.

How Might We

In the beginning, it’s important to be open to all ideas. One way we did this was to phrase questions in the format: “How might we…” At this stage (day two) we weren’t trying to come up with solutions — we were trying to work out what problems there were to solve. ‘We’ is a reminder that this is a team effort, and ‘might’ reminds us that it’s just one suggestion that may or may not work (and that’s OK). These questions then get voted on and moved into a workshop for generating ideas (see Crazy 8s). You can read more detailed instructions on how to run a ‘How might we’ session on the Design Sprint Kit website.

Crazy 8s

This activity is a super quick-fire idea generation technique. The gist of it is that each person gets a piece of paper folded into eight sections and has eight minutes to come up with eight ideas (really rough sketches). When time is up, it’s all pens down and the rest of the team gets to review each other’s ideas. In our sprint, we gave each person Post-it notes and paper, and set the timer for eight minutes. At the end of the activity, we put all the sketches on a wall (this is where the art gallery exercise comes in).

Mila our data scientist sketching intensely during Crazy 8s

A close up of some sketches from the team

Art gallery/Silent critique

The art gallery is the place where all the sketches go. We give everyone dot stickers so they can vote and pull out key ideas from each sketch. This is done silently, as the ideas should be understood without needing explanation from the person who made them. At the end of it you’ve got a kind of heat map, and you can see the ideas that stand out the most. After this first round of voting, the authors of the sketches get to talk through their ideas, then another round of voting begins.

Mila putting some sticky dots on some sketches

Bowie, our head of security/office dog, even took part in the sprint...kind of.

Usability testing and validation

The key part of a design sprint is validation. For one of our sprints we had two parts of our concept that needed validating. To test one part we conducted simple user tests with other members of Optimal Workshop (the feature was an internal tool). For the second part we needed to validate whether we had the data to continue with this project, so we had our data scientist run some numbers and predictions for us.

Our remote worker Rebecca dialed in to watch one of our user tests live
"I'm pretty bloody happy" — Actual feedback.

Challenges and outcomes

One of our key team members, Rebecca, was working remotely during the sprint. To make things easier for her, we set up two cameras: one pointed at the whiteboard, the other focused on the rest of the sprint team sitting at the table. Next to them, we set up a monitor so we could see Rebecca.

Engaging in workshop activities is a lot harder when you’re working remotely. Rebecca got around this by completing the activities on her own and taking photos to send to us.

For more information, read this great Medium post about running design sprints remotely

Lessons

  • Lightning talks are a great way to have each person contribute up front and feel invested in the process.
  • Sprints are energy intensive. Make sure you’re in a good space with plenty of fresh air, comfortable chairs, and a breakout area. We like to split the five days up so that we get a weekend break.
  • Give people plenty of notice to clear their schedules. Asking busy people to take five days from their schedule might not go down too well. Make sure they know why you’d like them there and what they should expect from the week. Send them an outline of the agenda. Ideally, have a chat in person and get them excited to be part of it.
  • Invite the right people. It’s important that you get the right kind of people from different parts of the company involved in your sprint. The role they play in day-to-day work doesn’t matter too much for this. We’re all mainly using pens and paper and the more types of brains in the room the better. Looking back, what we really needed on our team was a customer support team member. They have the experience and knowledge about our customers that we don’t have.
  • Choose the right sprint problem. The project we chose for our first sprint wasn’t really suited for a design sprint. We went in with a well defined problem and a suggested solution from the team instead of having a project that needed fresh ideas. This made the activities like ‘How Might We’ seem very redundant. The challenge we decided to tackle ended up being more of a data prototype (spreadsheets!). We used the week to validate assumptions around how we can better use data and how we can write a script to automate some internal processes. We got the prototype working and tested but due to the nature of the project we will have to run this experiment in the background for a few months before any building happens.

Overall, this design sprint was a great team bonding experience and we felt pleased with what we achieved in such a short amount of time. Naturally, here at Optimal Workshop, we're experimenters at heart and we will keep exploring new ways to work across teams and find a good middle ground.


Arts, crafts and user feedback: How to engage your team through creative play

Doing research is one difficult task — sharing the results with your team is another. Reports can be skim-read, forgotten and filed away. People can drift off into a daydream during slideshow presentations, and others may not understand what you’re trying to communicate.

This is a problem that many research teams encounter, and it made me think a lot about how to make the wider team really engage with user feedback. While we at Optimal Workshop have a bunch of documents and great tools like Intercom, Evernote and Reframer to capture all our feedback, I wanted to figure out how to make it fun and engaging for people to read what our users tell us.

How can we as designers and researchers better translate our findings into compelling insights and anecdotes for others to embrace and enjoy? After some thought and a trip to the craft store, I came up with this workshop activity, and it was a hit with the team.

Crafting feedback into art

Each fortnight we’ve been taking turns running a full-company activity instead of doing a full-company standup (check-in). Some of these activities included pairing up and going for a walk along the waterfront to talk about a challenge we’re currently facing, or a goal we each have. For my turn, I came up with the idea of an arts and crafts session to get the team more engaged in reading some of our user feedback.

Before the meeting, I asked every team member to bring one piece of user feedback that they found in Intercom, Evernote or Reframer. This feedback could be positive, such as “Your customer support team is awesome”; a suggestion, such as “It would be great to be able to hover over tags and see a tooltip with the description”; or negative (an opportunity), such as “I’m annoyed and confused with how recruitment works”.

This meant that everyone in the team had to dig through the systems and tools we use and look for insights (nuggets) as their first task. It also helped the team gain an appreciation for how much data and feedback our user researchers had been gathering.

A photo of the feedback art hung up on the walls of the office

Once we all had a piece of feedback each, I told everyone they would spend the next half hour doing arts and crafts. They could use whatever they could find to create a poster, postcard, or visual interpretation of their insight. I provided colored card, emoji stickers, stencils, printed-out memes, glitter and glue.

For the next 30 minutes I stood back and watched everybody grinning and talking about their posters. The best thing was that they were actually sharing their pieces of feedback with one another! We had devs, marketing, design, operations and finance all participating, which meant that people from all kinds of departments had a chance to read feedback from our customers.

More feedback art on the walls

At the end of the meeting we created a gallery in the office and we all spent time reading each poster because it was so fun to see what everyone came up with. We also hung up a few of these in spots around the office that get a lot of foot traffic, so that we can all have a reminder of some of the things our customers told us. I hope that each person took something away with them, and in the future, when working on a task they’ll remember back to a poster and think about how to tackle some of these requests!

Steve and Ania making a mess and crafting their posters

How to run a creative play feedback workshop

Time needed: 30 minutes

  • Insights: Print off a pile of customer insights or encourage the team to find and bring in their own. Have backups, as some might be hard to turn into posters.
  • Tools: Scissors, glue sticks, blue tack for creating the gallery.
  • Crafts: Paper, pens, stickers, stencils, googly eyes (anything goes!)

Another poster decorating the walls of Optimal HQ

Interested in other creative ways to tell stories? Our User Researcher Ania shares 8 creative ways to share your user research. If you do something similar in your team, we’d love to hear about it in the comments below!


Squirrel shoes, yoga and spacesuits: My experience at CanUX 2017

One of the great things about being in the UX field is the UX community: so many inspiring and generally all-round awesome people who are passionate about what they do. What happens when you get a big bunch of those fantastic people all in one place? You can practically watch the enthusiasm and inspiration levels rise as people talk about ideas, experiences and challenges, and how they can help each other solve problems.

Luckily for us, there are lots of events dedicated to getting UX people together in one place to share, learn, grow, and connect. CanUX is one of those, and I was fortunate enough to be in Ottawa to speak at this year’s event.

The crowd at CanUX settling in for the start of the conference.

CanUX is Canada’s big annual UX event. Started eight years ago by volunteers who wanted to create an opportunity for the Canadian UX community to get together, it’s grown from a small event to a major conference with top speakers from around the world. Still run by two of the original volunteers, CanUX has kept its focus on community, which comes through clearly in how it’s organized. From the day of the week and time of year it’s held, through to details such as a yoga class at the venue to kick off the second day of the conference, there are countless touches, small and large, that encourage people to go along, to meet others, and to catch up with old friends from previous CanUX conferences.

Aware that there are natural energy lulls in conferences, as people’s brains fill up with inspiration and knowledge, the CanUX team have a regular MC, Rob Woodbridge. This is a man who bounds across the stage, encourages (and actually gets!) audience participation, swears, cracks jokes, and generally seems to have a lot of fun while being the most effective MC I have ever encountered. (Naturally, he’s a bit controversial — some people love to hate him for that unbridled enthusiasm. Either way, he sets the tone for passionate, engaging presentations!)

The crowd at CanUX had a fair few chances to interact with speakers onstage.

With all the attention to detail around the rest of the conference, it’s not surprising that the same care is shown to the conference programme. All of the main presenters are seen ahead of time by one of the organizers and then invited to CanUX. A small number of short presentation slots are set aside for an open call for submissions, to help encourage newer speakers. Presentations are chosen to cover research, design and IA topics, with both practical and inspirational talks in each.

The talks themselves were fantastic, covering everything from the challenges of designing spacesuits for NASA, to tips for overcoming the challenges of being the lone UX person in a company, to the future of robotics in services, and how to get design systems up and running in a large organization. Two of the themes that came through strongest for me this year were inclusivity and empathy — for all of the wonderfully diverse people in the world, and also for the people we often forget to take the time to understand and empathize with: our peers and our colleagues.


I feel very privileged to have been able to be involved in a conference that was so full of passion and dedication to UX, and to share the stage with so many inspiring people. The topic for my presentation was a subset of the outcomes of qualitative research I have been doing into who UX people are; in particular, the different types of challenges we face depending on our roles, the type of team we are in, our experience level, and (if reasonably new to UX) where our UX knowledge comes from. My talk seemed to be well received (yay!) — although some of the enthusiasm may have been due to the shoes with squirrel heels I was wearing, which got a lot of attention!


Overall, CanUX was the best organized and most thoughtful conference I’ve ever attended. The passion that the volunteer organizers have for the UX field comes through clearly, and really helps build community. Here’s hoping I’m lucky enough to get back to Ottawa for another one!


5 ways to increase user research in your organization

Co-authored by Brandon Dorn, UX designer at Viget.

As user experience designers, making sure that websites and tools are usable is a critical component of our work, and conducting user research enables us to assess whether we’re achieving that goal. Even when we want to incorporate research, however, certain constraints may stand in our way.

A few years ago, we realized that we were facing this issue at Viget, a digital design agency, and we decided to make an effort to prioritize user research. Almost two years ago, we shared initial thoughts on our progress in this blog post. We’ve continued to learn and grow as researchers since then and hope that what we’ve learned along the way can help your clients and coworkers understand the value of research and become better practitioners. Below are some of those lessons.

Make research a priority for your organization

Before you can do more research, it needs to be prioritized across your entire organization — not just within your design team. To that end, you should:

  • Know what you’re trying to achieve. By defining specific goals, you can share a clear message with the broader organization about what you’re after, how you can achieve those goals, and how you will measure success. At Viget, we shared our research goals with everyone at the company. In addition, we talked to the business development and project management teams in more depth about specific ways that they could help us achieve our goals, since they have the greatest impact on our ability to do more research.
  • Track your progress. Once you’ve made research a priority, make sure to review your goals on an ongoing basis to ensure that you’re making progress and share your findings with the organization. Six months after the research group at Viget started working on our goals, we held a retrospective to figure out what was working — and what wasn’t.
  • Adjust your approach as needed. You won’t achieve your goals overnight. As you put different tactics into action, adjust your approach if something isn’t helping you achieve your goals. Be willing to experiment and don’t feel bad if a specific tactic isn’t successful.

Educate your colleagues and clients

If you want people within your organization to get excited about doing more research, they need to understand what research means. To educate your colleagues and clients, you should:

  • Explain the fundamentals of research. If someone has not conducted research before, they may not be familiar or feel comfortable with the vernacular. Provide an overview of the fundamental terminology to establish a basic level of understanding. In a blog post, Speaking the Same Language About Research, we outline how we established a common vocabulary at Viget.
  • Help others understand the landscape of research methods. As designers, we feel comfortable talking about different methodologies and forget that that information will be new to many people. Look for opportunities to increase understanding by sharing your knowledge. At Viget, we make this happen in several ways. Internally, we give presentations to the company, organize group viewing sessions for webinars about user research, and lead focused workshops to help people put new skills into practice. Externally, we talk about our services and share knowledge through our blog posts. We are even hosting a webinar about conducting user interviews in November and we'd love for you to join us.
  • Incorporate others into the research process. Don't just tell people what research is and why it's important — show them. Look for opportunities to bring more people into the research process. Invite people to observe sessions so they can experience research firsthand or have them take on the role of the notetaker. Another simple way to make people feel involved is to share findings on an ongoing basis rather than providing a report at the end of the process.

Broaden your perspective while refining your skill set

Our commitment to testing assumptions led us to challenge ourselves to do research on every project. While we're dogmatic about this goal, we're decidedly un-dogmatic about the form our research takes from one project to another. To pursue this goal, we seek to:

  • Expand our understanding. To instill a culture of research at Viget, we've found it necessary to question our assumptions about what research looks like. Books like Erika Hall’s Just Enough Research teach us the range of possible approaches for getting useful user input at any stage of a project, and at any scale. Reflect on any methodological biases that have become well-worn paths in your approach to research. Maybe your organization is meticulous about metrics and quantitative data, and could benefit from a series of qualitative studies. Maybe you have plenty of anecdotal and qualitative evidence about your product that could be better grounded in objective analysis. Aim to establish a balanced perspective on your product through a diverse set of research lenses, filling in gaps as you learn about new approaches.
  • Adjust our approach to project constraints. We've found that the only way to consistently incorporate research in our work is to adjust our approach to the context and constraints of any given project. Client expectations, project type, business goals, timelines, budget, and access to participants all influence the type, frequency, and output of our research. Iterative prototype testing of an email editor, for example, looks very different than post-launch qualitative studies for an editorial website. While some projects are research-intensive, short studies can also be worthwhile.
  • Reflect on successes and shortcomings. We have a longstanding practice of holding post-project team retrospectives to reflect on and document lessons for future work. Research has naturally come up in these conversations, and many of the things we’ve discussed in them are what you’re reading right now. As an agency with a diverse set of clients, it’s been important for us to understand what types of research work for what types of clients, and when. Make sure to take time to ask these questions after projects. Mid-project retrospectives can be beneficial too, especially on long engagements, even though it’s hard to see the forest when you’re in the weeds.

Streamline qualitative research processes 🚄

Learning to be more efficient at planning, conducting, and analyzing research has helped us overturn the idea that some projects merit research while others don't. Remote moderated usability tests are one of our preferred methods, yet, in our experience, the biggest obstacle to incorporating these tests isn't the actual moderating or analyzing, but the overhead of acquiring and scheduling participants. While some agencies contract out the work of recruiting, we've found it less expensive and more reliable to collaborate with our clients to find the right people for our tests. That said, here are some recommendations for holding efficient qualitative tests:

  • Know your tools ahead of time. We use a number of tools to plan, schedule, annotate, and analyze qualitative tests (we're inveterate spreadsheet users). Learn your tools beforehand, especially if you're trying something new. Tools should fade into the background during tests, which Reframer does nicely.
  • Establish a recruiting process. When working with clients to find participants, we’ll often provide an email template tailored to the project for them to send to existing or potential users of their product. This introductory email contains a screener that asks a few project-related demographic or usage questions and provides us with participant email addresses, which we use to follow up with a link to a scheduling tool. Once this process is established, the project manager ensures that the UX designer on the team has a regular flow of participants. The recruiting process doesn’t take care of itself – participants cancel, or reschedule, or sometimes don’t respond at all – yet establishing an approach ahead of time allows you, the researcher, to focus on the research in the midst of the project.
  • Start recruiting early. Don't wait until you've finished writing a testing script to begin recruiting participants. Once you determine the aim and focal points of your study, recruit accordingly. Scripts can be revised and approved in the meantime.

Be proactive about making research happen 🤸

As a generalist design agency, we work with clients whose industries and products vary significantly. While some clients come to us with clear research priorities in mind, others treat it as an afterthought. Rare, however, is the client who is actively opposed to researching their product. More often than not, budget and timelines are the limiting factors. So we try not to make research an ordeal, but instead treat it as part of our normal process even if a client hasn't explicitly asked for it. Common-sense perspectives like Jakob Nielsen’s classic “Discount Usability for the Web” remind us that some research is always better than none, and that some can still be meaningfully pursued. We aren’t pushy about research, of course, but instead try to find a way to make it happen when it isn't a definite priority.

World Usability Day is coming up on November 9, so now is a great time to stop and reflect on how you approach research and to brainstorm ways to improve your process. The tips above reflect some of the lessons we’ve learned at Viget as we’ve tried to improve our own process. We’d love to hear about approaches you’ve used as well.


How we created a content strategy without realizing it

Tania Hockings is the Senior Digital Content Advisor at ACC and has a passion for creating content that caters to users (not the business). Co-presenter and co-author Amy Stoks is a Senior Experience Designer at PwC’s Experience Centre and possesses a love of empathy and collaboration. Ahead of their presentation at UX New Zealand 2017, Tania and Amy share their experience creating a content strategy while working on a project for the Accident Compensation Corporation (ACC’s no-fault scheme helps pay the cost of care for everyone in New Zealand if they’re injured in an accident).

It’s a truth universally acknowledged that before you start writing content you’re supposed to have a content strategy. Three months out from launch of the new acc.co.nz beta site, we did not have one. Nor did we have a lot of time, much resource, or a completed content audit. However, we did have:

  • Some pretty good design principles, based on user research
  • A list of the 37 priority tasks that our users needed to do on the acc.co.nz website
  • One content writer (Tania) and one part-time content consultant (Amy)
  • A deadline
  • Freedom to do whatever we needed to do to get content live for the beta launch.

Here’s a quick look into how we created a content strategy for acc.co.nz without actually realizing it.

Content principles are a great starting point

We needed more direction than those 37 tasks to get writing, so inspired by our design principles, we wrote some equivalent principles for the content. We decided to start with the tried and tested principles already in play by Govt.nz and GOV.UK — we didn’t have time to reinvent the wheel. We ended up with eight principles for how we would write our content:

  1. Do less: only create what is needed and what we can maintain
  2. Always have an evidence-based user need: we know why a piece of content is needed and have the evidence to back it up
  3. Ask for constant feedback: we use feedback and analytics to understand what people want and need to know, as well as what they don’t
  4. Provide a consistent customer experience: our voice is the same across all platforms and our content is accessible for every kind of user
  5. Create seamless customer journeys: no dead ends or broken links, no unnecessary information
  6. Improve how ACC writes: we build good relationships with content contributors and proactively offer content support and training
  7. Ensure transparent ownership for all content: every piece of content has an owner, a business group and a digital advisor assigned
  8. Accept that not everything has to live on acc.co.nz: other channels share ACC content, so if ACC isn’t the source of truth for information we don’t put it on acc.co.nz

We made a checklist of what would and wouldn’t live on acc.co.nz according to the principles...and that was pretty much it. We really didn’t have time to do much else because the design of the site was running ahead of us. We also needed to get some content in front of users at our weekly testing sessions.

Sometimes you’ve just gotta get writing

We got stuck into writing those 37 tasks using our pair writing approach, which was also an experiment (but more on that in our UX New Zealand talk). While we wrote, we were living and breathing the content principles: we introduced them to our internal subject experts while we were writing and constantly referred back to the principles to help structure the content.

After the beta launch, we had a few more content writers on the team and a bit of time to breathe (but not much!). We actually wrote the principles down and put them into a visual briefing pack to give to the subject experts ahead of our pair writing sessions. This pack covered:

  • our principles
  • the goal of the project
  • the process
  • the usual best practice webby stuff.

As we wrote more content, the briefing pack and our process evolved based on what we learned and on feedback from our subject experts about what was and wasn’t working.

During the same brief intermission, we also had a chance to revisit the content strategy. In practice, though, we just brainstormed what the strategy might be on a whiteboard. It looked like this:

[image1.jpg: whiteboard brainstorm of the content strategy]

And it stayed like that for another six months. We can’t remember if we ever looked at it much, but we felt good knowing it was there.

Seriously, we really need a content strategy...don’t we?

We finally got to the end of the project. The launch date was looming, but there was still no content strategy. So we booked a room. Three of us agreed to meet to nut it out and finally write our formal content strategy. We talked for a bit, going around in circles, until we realized we’d already done it. The briefing pack was the content strategy: less a formal document and more a living set of principles for how we had worked and would continue to work.

Would we do it again?

Yeah, we would. In fact, the ACC digital team is already following the same approach on a new project. Content principles are key: they’re simple, practical to socialize, and easy to stick to. We found it really valuable to evolve the strategy as we learned more from user research and subject matter expertise.

Of course, it wasn’t all rosy (these projects never are!). Some governance and muscle behind what we were doing would have really helped. We found ourselves in some intense stakeholder meetings where we were the first line of defence for the content principles. Unsurprisingly, not everybody agrees with doing less! But we’re pretty sure that having a longer strategy formalized in an official document still wouldn’t have helped us much.

The next piece of work for the digital team at ACC is defining that governance and building a collaborative process for designing digital products within the organization. The plan is to run regular, informal catch-ups with internal stakeholders to make sure the content strategy is still relevant and working for ACC’s users and the organization itself.

If you remember anything from this blog post, remember this:

  • Treat your content strategy less like a formal document and more like a working hypothesis that changes and grows as you learn.
  • A formal document might make you feel good, but it’s likely no one is reading it.
  • Whatever format you choose for your content strategy/working hypothesis, make sure you get it signed off and endorsed by the people who matter. You’ll need backup in those inevitably tense project meetings!

The acc.co.nz content strategy looks awesome these days: very visual and easy to read. Tania always has a copy in her notebook and carries it with her everywhere. If you’re lucky enough to run into her on the streets of Wellington, she might just show it to you.

Want to hear more? Come to UX New Zealand!

If you'd like to hear more about designing content, plus a bunch of other cool UX-related talks, head along to UX New Zealand 2017 hosted by Optimal Workshop. The conference runs from 11-13 October including a day of fantastic workshops, and you can get your tickets here.


How to do a content audit (and why)

Ah, content audits! If you need to work on a website redesign, information architecture revamp, or a site migration, one of the first things you’ll need to do is a content audit.

Most likely, a website redesign project will need some amount of reorganization because users can’t find anything on the site. First, you’ll need to know who your users are and find out what the users of your website are doing or looking for. You’ll conduct some preliminary interviews to ask your project stakeholders what they think their users are looking for. Maybe you’ll get to ask actual users of your website.

Then you’ll realize that you could also find out whether the website correlates with what your users expect to be able to find. If you only knew what was on your website in the first place...

You’ll need to do the content audit. You can’t avoid it now. The spreadsheet comes out. You start your inventory, add the metadata, and add columns to make sense of cells. It’s arduous, and it’s worth it. Once you do the content audit, you’ll be able to conduct card sorting and affinity mapping to find out what your users are looking for.

It reads like a children’s book, doesn’t it? “And then content lived happily ever after...” But content audits are iterative, not a once-per-project activity. Like a routine checkup or a yearly exam, a content audit is essential to keep content relevant and valuable to your users.

Conducting a content audit is one of the first steps in putting together a card sort. You need words on a card, right? Your content audit is where you’ll find them. Incidentally, “how to do a content audit” articles are plentiful, but this one focuses on how a card sort makes use of a content audit.

What is a content audit?

Content audits start off as inventories. They tend to be massive spreadsheets that contain, among other things, metadata about the content you’re keeping track of. For example, an inventory may start as a list of companies, occupations, or cities. Then someone may come along and start collecting empirical metadata around these things. The list becomes a list of the top global brand companies, the best occupations for 2017, or the most dangerous cities in the world.

Maybe you’re keeping track of what books you have, what you’ve read, and what you haven’t. Maybe you’re keeping an inventory of your kitchen pantry. Maybe you’re collecting a list of movies and films you should watch or keeping a bucket list of places you want to visit.

Consider the scope of these three inventories:

  • A full content inventory. A complete listing of all site content, including pages, images, videos, and PDFs. In a kitchen taxonomy, this includes everything in the kitchen (books, recipe binders, kitchen equipment, refrigerated items).
  • A partial content inventory. A subset listing of content slicing across the site. For example, most popular, site hierarchy, or items used within a defined period of time. A partial kitchen inventory would cover everything used in the past 6 months.
  • A content sample. A listing of example content from the site. For instance, a specific category or location. A content sample of kitchen inventory could cover the pantry or the spice cabinet.

Quantitative content inventories

Content inventories are the quantitative kind. The purpose of this list is to know how much content you have, and how many of each different kind. There are 12 countries I want to visit in Asia. I’ve already visited 10 of the 50 states in the United States. There are over 800 pages in the five websites that I’m consolidating.

Content inventories give you numbers to work with and provide a current state of affairs before you go making changes. You’ll be able to refer to this when you talk to content owners.
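The counting a quantitative inventory does can be sketched in a few lines of Python. The records and the "type" field here are invented for illustration, not taken from any real inventory:

```python
from collections import Counter

# A hypothetical page-level inventory; in practice this would come
# from a spreadsheet or a site crawl.
inventory = [
    {"url": "/claims", "type": "page"},
    {"url": "/claims/form.pdf", "type": "pdf"},
    {"url": "/levies", "type": "page"},
    {"url": "/videos/intro.mp4", "type": "video"},
    {"url": "/levies/rates.pdf", "type": "pdf"},
]

# How much content there is in total, and how many of each kind.
total = len(inventory)
totals_by_kind = Counter(item["type"] for item in inventory)
```

Those two numbers are exactly what you bring to a conversation with content owners: "we have N pages, M PDFs, and K videos" is a current state of affairs, before any judgment about quality.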

Qualitative content inventories

Content audits focus on the qualitative. You add an evaluation of the content (which is qualitative) to that initial simple inventory. The most common is ROT analysis: redundant, obsolete, and trivial. Another type of analysis looks at tone and voice. Kristina Halvorson, in her book Content Strategy for the Web, chunks qualitative data into six groups: usability, knowledge level, findability, actionability, audience, and accuracy.

The information you collect in your audit depends on what you want to know. If “easy to understand” is a KPI that you or the business wants to measure, then you’ll want to include readability scores. Every content audit is custom-fit for the purposes of each project.

In essence, content audits are lists of things you want to track, plus your assessment of each thing, whatever that thing is. Things can be physical content as well as digital. Are those spices too old and should they be thrown out? Is that travel destination in the midst of political turmoil and should it be taken off the list (for now)? Is that movie now available for streaming? How many of those 800 pages are worth keeping or updating, and how many could we archive and take offline?
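A ROT pass is essentially a set of heuristics applied to each row of the inventory. Here is a minimal sketch in Python; the field names, records, and thresholds (two years for "obsolete", 50 pageviews for "trivial") are all invented for illustration, and a real audit would tune them per project:

```python
from datetime import date

# Hypothetical inventory rows with the metadata a ROT pass needs.
inventory = [
    {"url": "/claims/how-to-claim", "last_updated": date(2017, 3, 1), "pageviews": 5400},
    {"url": "/about/history-1974", "last_updated": date(2012, 6, 10), "pageviews": 12},
    {"url": "/claims/make-a-claim", "last_updated": date(2016, 1, 5), "pageviews": 300},
]

def rot_flags(page, today=date(2017, 9, 1), stale_days=730, low_traffic=50):
    """Attach simple obsolete/trivial heuristics to one inventory row."""
    age = (today - page["last_updated"]).days
    return {
        **page,
        "obsolete": age > stale_days,               # not updated in ~2 years
        "trivial": page["pageviews"] < low_traffic,  # barely visited
    }

audit = [rot_flags(p) for p in inventory]
flagged = [p["url"] for p in audit if p["obsolete"] or p["trivial"]]
```

Redundancy is harder to automate (it needs a human to judge that two pages say the same thing), which is part of why audits stay qualitative even when the inventory is generated by a tool.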

Content audits are pivotal documents

Content audits are living documents. They need to be updated on a regular basis to maintain a certain level of content quality and relevance. They are pivotal documents, shared across various disciplines and used for various purposes.

Search engine optimization (SEO) tools create site crawls that capture page titles, URLs, page elements, and position within a site hierarchy. They are spreadsheets that look and feel deceptively like content inventories. And they essentially are.

Content audits are converging:

  • SEO specialists conduct SEO content audits to identify thin content, accessibility, indexability, duplicate content and such.
  • Content strategists and information architects conduct inventories and audits to determine what content exists, where it lives, when it was last updated, and who owns it.
  • Taxonomists mine content inventories for categories and content terminology.
  • Search analysts collect keywords to supplement site search.

Content audits are pivotal documents with many different uses. Someone adds site analytics to the document, then readability scores, then BOOM! There are now even more ways to pivot the table: top landing pages, top pageviews, highest bounce rate, high word count, low word count, oldest content, newest content. Where do you want to start?
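Each of those pivots is just a different sort over the same rows. A stdlib-only sketch, with hypothetical analytics columns bolted onto the audit:

```python
# Hypothetical audit rows once analytics have been added; the column
# names (pageviews, bounce_rate, word_count) are illustrative only.
audit = [
    {"url": "/claims", "pageviews": 9800, "bounce_rate": 0.31, "word_count": 420},
    {"url": "/levies", "pageviews": 2100, "bounce_rate": 0.74, "word_count": 1900},
    {"url": "/news/old-press-release", "pageviews": 40, "bounce_rate": 0.92, "word_count": 260},
]

# Three "pivots" on the same data: same rows, different questions.
top_pages = sorted(audit, key=lambda r: r["pageviews"], reverse=True)
worst_bounce = sorted(audit, key=lambda r: r["bounce_rate"], reverse=True)
longest = sorted(audit, key=lambda r: r["word_count"], reverse=True)
```

The point is that the audit spreadsheet itself doesn't change; every discipline that adds a column adds another way to slice it.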


