
Optimal Blog

Articles and Podcasts on Customer Service, AI and Automation, Product, and more


Latest


Squirrel shoes, yoga and spacesuits: My experience at CanUX 2017

One of the great things about being in the UX field is the UX community. So many inspiring and generally all-round awesome people who are passionate about what they do. What happens when you get a big bunch of those fantastic people all in the one place? You can practically watch the enthusiasm and inspiration levels rise as people talk about ideas, experiences and challenges, and how they can help each other solve problems.

Luckily for us, there are lots of events dedicated to getting UX people together in one place to share, learn, grow, and connect. CanUX is one of those, and I was fortunate enough to be there in Ottawa to speak at this year’s event.

The crowd at CanUX settling in for the start of the conference.

CanUX is the annual big UX event for Canada. Started 8 years ago by volunteers who wanted to create an opportunity for the Canadian UX community to get together, it’s grown from a small event to a major conference with top speakers from around the world. Still run by two of the original volunteers, CanUX has kept its focus on community, which comes through clearly in how it’s organized. From the day of the week and time of year it’s held, through to details such as a yoga class at the venue to kick off the second day of the conference, there are countless details, small and large, that encourage people to go along, to meet others, and to catch up with old friends from previous CanUX conferences.

Aware that there are natural energy lulls in conferences, as people’s brains fill up with inspiration and knowledge, the CanUX team have a regular MC, Rob Woodbridge. This is a man who bounds across the stage, encourages (and actually gets!) audience participation, swears, cracks jokes, and generally seems to have a lot of fun while being the most effective MC I have ever encountered. (Naturally, he’s a bit controversial — some people love to hate him because of that unbridled enthusiasm. But either way, he sets the tone for passionate, engaging presentations!)

The crowd at CanUX had a fair few chances to interact with speakers onstage.

With all the attention to detail around the rest of the conference, it’s not surprising that the same care is shown to the conference programme. All of the main presenters have been seen speaking by one of the organizers before being invited to CanUX. A very small number of short presentation slots are set aside for an open call for submissions, to help encourage newer speakers. Presentations are chosen to cover research, design and IA topics, with both practical and inspirational talks in each.

The talks themselves were fantastic, covering everything from the challenges of designing spacesuits for NASA and tips for overcoming the challenges of being the lone UX person in a company, to the future of robotics in services and how to get design systems up and running in a large organization. Two of the themes that came through strongest for me this year were inclusivity and empathy — for all of the wonderfully diverse people in the world, and also for people we often forget to take the time to understand and empathize with: our peers and our colleagues.


I feel very privileged to have been able to be involved in a conference that was so full of passion and dedication to UX, and to share the stage with so many inspiring people. The topic for my presentation was a subset of the outcomes of qualitative research I have been doing into who UX people are; in particular, the different types of challenges we face depending on our roles, the type of team we are in, our experience level, and (if reasonably new to UX) where our UX knowledge comes from. My talk seemed to be well received (yay!) — although some of the enthusiasm may have been due to the shoes with squirrel heels I was wearing, which got a lot of attention!


Overall, CanUX was the best organized and most thoughtful conference I’ve ever attended. The passion that the volunteer organizers have for the UX field comes through clearly, and really helps build community. Here’s hoping I’m lucky enough to get back to Ottawa for another one!


5 ways to increase user research in your organization

Co-authored by Brandon Dorn, UX designer at Viget.

As user experience designers, making sure that websites and tools are usable is a critical component of our work, and conducting user research enables us to assess whether we’re achieving that goal or not. Even if we want to incorporate research, however, certain constraints may stand in our way.

A few years ago, we realized that we were facing this issue at Viget, a digital design agency, and we decided to make an effort to prioritize user research. Almost two years ago, we shared initial thoughts on our progress in this blog post. We’ve continued to learn and grow as researchers since then and hope that what we’ve learned along the way can help your clients and coworkers understand the value of research and become better practitioners. Below are some of those lessons.

Make research a priority for your organization

Before you can do more research, it needs to be prioritized across your entire organization — not just within your design team. To that end, you should:

  • Know what you’re trying to achieve. By defining specific goals, you can share a clear message with the broader organization about what you’re after, how you can achieve those goals, and how you will measure success. At Viget, we shared our research goals with everyone at the company. In addition, we talked to the business development and project management teams in more depth about specific ways that they could help us achieve our goals, since they have the greatest impact on our ability to do more research.
  • Track your progress. Once you’ve made research a priority, make sure to review your goals on an ongoing basis to ensure that you’re making progress and share your findings with the organization. Six months after the research group at Viget started working on our goals, we held a retrospective to figure out what was working — and what wasn’t.
  • Adjust your approach as needed. You won’t achieve your goals overnight. As you put different tactics into action, adjust your approach if something isn’t helping you achieve your goals. Be willing to experiment and don’t feel bad if a specific tactic isn’t successful.

Educate your colleagues and clients

If you want people within your organization to get excited about doing more research, they need to understand what research means. To educate your colleagues and clients, you should:

  • Explain the fundamentals of research. If someone has not conducted research before, they may not be familiar or feel comfortable with the vernacular. Provide an overview of the fundamental terminology to establish a basic level of understanding. In a blog post, Speaking the Same Language About Research, we outline how we established a common vocabulary at Viget.
  • Help others understand the landscape of research methods. As designers, we feel comfortable talking about different methodologies and forget that that information will be new to many people. Look for opportunities to increase understanding by sharing your knowledge. At Viget, we make this happen in several ways. Internally, we give presentations to the company, organize group viewing sessions for webinars about user research, and lead focused workshops to help people put new skills into practice. Externally, we talk about our services and share knowledge through our blog posts. We are even hosting a webinar about conducting user interviews in November and we'd love for you to join us.
  • Incorporate others into the research process. Don't just tell people what research is and why it's important — show them. Look for opportunities to bring more people into the research process. Invite people to observe sessions so they can experience research firsthand or have them take on the role of the notetaker. Another simple way to make people feel involved is to share findings on an ongoing basis rather than providing a report at the end of the process.

Broaden your perspective while refining your skill set

Our commitment to testing assumptions led us to challenge ourselves to do research on every project. While we're dogmatic about this goal, we're decidedly un-dogmatic about the form our research takes from one project to another. To pursue this goal, we seek to:

  • Expand our understanding. To instill a culture of research at Viget, we've found it necessary to question our assumptions about what research looks like. Books like Erika Hall’s Just Enough Research teach us the range of possible approaches for getting useful user input at any stage of a project, and at any scale. Reflect on any methodological biases that have become well-worn paths in your approach to research. Maybe your organization is meticulous about metrics and quantitative data, and could benefit from a series of qualitative studies. Maybe you have plenty of anecdotal and qualitative evidence about your product that could be better grounded in objective analysis. Aim to establish a balanced perspective on your product through a diverse set of research lenses, filling in gaps as you learn about new approaches.
  • Adjust our approach to project constraints. We've found that the only way to consistently incorporate research in our work is to adjust our approach to the context and constraints of any given project. Client expectations, project type, business goals, timelines, budget, and access to participants all influence the type, frequency, and output of our research. Iterative prototype testing of an email editor, for example, looks very different than post-launch qualitative studies for an editorial website. While some projects are research-intensive, short studies can also be worthwhile.
  • Reflect on successes and shortcomings. We have a longstanding practice of holding post-project team retrospectives to reflect on and document lessons for future work. Research has naturally come up in these conversations, and many of the lessons we’ve discussed there are the ones you’re reading right now. As an agency with a diverse set of clients, it’s been important for us to understand what types of research work for what types of clients, and when. Make sure to take time to ask these questions after projects. Mid-project retrospectives can also be beneficial, especially on long engagements, even though it’s hard to see the forest when you’re still in the weeds.

Streamline qualitative research processes 🚄

Learning to be more efficient at planning, conducting, and analyzing research has helped us overturn the idea that some projects merit research while others don't. Remote moderated usability tests are one of our preferred methods, yet, in our experience, the biggest obstacle to incorporating these tests isn't the actual moderating or analyzing, but the overhead of acquiring and scheduling participants. While some agencies contract out the work of recruiting, we've found it less expensive and more reliable to collaborate with our clients to find the right people for our tests. That said, here are some recommendations for holding efficient qualitative tests:

  • Know your tools ahead of time. We use a number of tools to plan, schedule, annotate, and analyze qualitative tests (we're inveterate spreadsheet users). Learn your tools beforehand, especially if you're trying something new. Tools should fade into the background during tests, which Reframer does nicely.
  • Establish a recruiting process. When working with clients to find participants, we’ll often provide an email template tailored to the project for them to send to existing or potential users of their product. This introductory email will contain a screener that asks a few project-related demographic or usage questions, and provides us with participant email addresses, which we use to follow up with a link to a scheduling tool. Once this process is established, the project manager will ensure that the UX designer on the team has a regular flow of participants. The recruiting process doesn’t take care of itself — participants cancel, or reschedule, or sometimes don’t respond at all — yet establishing an approach ahead of time allows you, the researcher, to focus on the research in the midst of the project.
  • Start recruiting early. Don't wait until you've finished writing a testing script to begin recruiting participants. Once you determine the aim and focal points of your study, recruit accordingly. Scripts can be revised and approved in the meantime.

Be proactive about making research happen 🤸

As a generalist design agency, we work with clients whose industries and products vary significantly. While some clients come to us with clear research priorities in mind, others treat it as an afterthought. Rare, however, is the client who is actively opposed to researching their product. More often than not, budget and timelines are the limiting factors. So we try not to make research an ordeal, but instead treat it as part of our normal process even if a client hasn't explicitly asked for it. Common-sense perspectives like Jakob Nielsen’s classic “Discount Usability for the Web” remind us that some research is always better than none, and that some can still be meaningfully pursued. We aren’t pushy about research, of course, but instead try to find a way to make it happen when it isn't a definite priority.

World Usability Day is coming up on November 9, so now is a great time to stop and reflect on how you approach research and to brainstorm ways to improve your process. The tips above reflect some of the lessons we’ve learned at Viget as we’ve tried to improve our own process. We’d love to hear about approaches you’ve used as well.


How we created a content strategy without realizing it

Tania Hockings is the Senior Digital Content Advisor at ACC and has a passion for creating content that caters to users (not the business). Co-presenter and co-author Amy Stoks is a Senior Experience Designer at PwC’s Experience Centre and possesses a love of empathy and collaboration. Ahead of their presentation at UX New Zealand 2017, Tania and Amy share their experience creating a content strategy while working on a project for the Accident Compensation Corporation (ACC’s no-fault scheme helps pay the cost of care for everyone in New Zealand if they’re injured in an accident).

It’s a truth universally acknowledged that before you start writing content you’re supposed to have a content strategy. Three months out from the launch of the new acc.co.nz beta site, we did not have one. Nor did we have a lot of time, much resource, or a completed content audit. However, we did have:

  • Some pretty good design principles, based on user research
  • A list of the 37 priority tasks that our users needed to do on the acc.co.nz website
  • One content writer (Tania) and one part-time content consultant (Amy)
  • A deadline
  • Freedom to do whatever we needed to do to get content live for the beta launch.

Here’s a quick look into how we created a content strategy for acc.co.nz without actually realizing it.

Content principles are a great starting point

We needed more direction than those 37 tasks to get writing, so inspired by our design principles, we wrote some equivalent principles for the content. We decided to start with the tried and tested principles already in play by Govt.nz and GOV.UK — we didn’t have time to reinvent the wheel. We ended up with eight principles for how we would write our content:

  1. Do less: only create what is needed and what we can maintain
  2. Always have an evidence-based user need: we know why a piece of content is needed and have the evidence to back it up
  3. Ask for constant feedback: we use feedback and analytics to understand what people want and need to know, as well as what they don’t
  4. Provide a consistent customer experience: our voice is the same across all platforms and our content is accessible for every kind of user
  5. Create seamless customer journeys: no dead ends or broken links, no unnecessary information
  6. Improve how ACC writes: we build good relationships with content contributors and proactively offer content support and training
  7. Ensure transparent ownership for all content: every piece of content has an owner, a business group and a digital advisor assigned
  8. Accept that not everything has to live on acc.co.nz: other channels share ACC content, so if ACC isn’t the source of truth for information we don’t put it on acc.co.nz

We made a checklist of what would and wouldn’t live on acc.co.nz according to the principles...and that was pretty much it. We really didn’t have time to do much else because the design of the site was running ahead of us. We also needed to get some content in front of users at our weekly testing sessions.

Sometimes you’ve just gotta get writing

We got stuck into writing those 37 tasks using our pair writing approach, which was also an experiment, but more on that in our UX New Zealand talk. While we wrote, we were living and breathing the content principles: we introduced them to our internal subject experts while we were writing and constantly referred back to the principles to help structure the content.

After the beta launch, we had a few more content writers on the team and a bit of time to breathe (but not much!). We actually wrote the principles down and put them into a visual briefing pack to give to the subject experts ahead of our pair writing sessions. This pack covered:

  • our principles
  • the goal of the project
  • the process
  • the usual best practice webby stuff.

As we wrote more content, the briefing pack and our process evolved based on what we learned and feedback from our subject experts about what was and wasn’t working.

During the same brief intermission, we also had a chance to revisit the content strategy. However, in practice we just did a brainstorm on a whiteboard of what the strategy might be. It looked like this:

[Photo: our whiteboard brainstorm of the draft content strategy]

And it stayed like that for another six months. We can’t remember if we ever looked at it much, but we felt good knowing it was there.

Seriously, we really need a content strategy...don’t we?

We finally got to the end of the project. The launch date was looming, but still no content strategy. So we booked a room. Three of us agreed to meet to nut it out and finally write our formal content strategy. We talked for a bit, going around in circles, until we realized we’d already done it. The briefing pack was the content strategy. Less a formal document and more a living set of principles of how we had and would continue to work.

Would we do it again?

Yeah, we would. In fact, the ACC digital team is already following the same approach on a new project. Content principles are key: they’re simple, practical to socialize and easy to stick to. We found it really valuable to evolve the strategy as we learned more from user research and subject matter expertise.

Of course, it wasn’t all rosy — these projects never are! Some governance and muscle behind what we were doing would have really helped. We found ourselves in some intense stakeholder meetings where we were the first line of defence for the content principles. Unsurprisingly, not everybody agrees with doing less! But we’re pretty sure that having a longer strategy formalized in an official document still wouldn’t have helped us much.

The next piece of work for the digital team at ACC is defining that governance and building a collaborative process to design digital products within the organization. The plan is to run informal, regular catch-ups with internal stakeholders to make sure the content strategy is still relevant and working for ACC’s users and the organization itself.

If you remember anything from this blog post, remember this:

Treat your content strategy less like a formal document and more like a working hypothesis that changes and grows as you learn. A formal document might make you feel good, but it’s likely no one is reading it.

Whatever format you choose for your content strategy/working hypothesis, make sure you get it signed off and endorsed by the people who matter. You’ll need back-up in those inevitably tense project meetings!

The acc.co.nz content strategy looks awesome these days — very visual and easy to read. Tania always has a copy in her notebook and carries it with her everywhere. If you’re lucky enough to run into her on the streets of Wellington, she might just show it to you.

Want to hear more? Come to UX New Zealand!

If you'd like to hear more about designing content, plus a bunch of other cool UX-related talks, head along to UX New Zealand 2017 hosted by Optimal Workshop. The conference runs from 11-13 October including a day of fantastic workshops, and you can get your tickets here.


How to do a content audit (and why)

Ah, content audits! If you need to work on a website redesign, information architecture revamp, or a site migration, one of the first things you’ll need to do is a content audit.

Most likely a website redesign project will need some amount of re-organization because users can’t find anything on the site. First, you’ll need to know who your users are. You’ll find out what it is that the users of your website are doing or looking for. You’ll conduct some preliminary user interviews to ask your project stakeholders what they think their users are looking for. Maybe you’ll get to ask actual users of your website. Then you’ll realize that you could also find out if the website correlates with what your users expect to be able to find. If you only knew what was on your website in the first place...

You’ll need to do the content audit. You can’t avoid it now. The spreadsheet comes out. You start your inventory, add the metadata, and add columns to make sense of cells. It was arduous, and it was worth it. Once you do the content audit, you’ll be able to conduct card sorting and affinity mapping to find out what your users are looking for.

It reads like a children’s book, doesn’t it? “And then content lived happily ever after...”

But content audits are iterative, not a once-in-a-project-cycle activity. Like a routine checkup or a yearly exam, a content audit is essential to keep content relevant and valuable to your users. Conducting a content audit is one of the first steps in putting together a card sort. You need words on a card, right? Your content audit is where you’ll find them.

Incidentally, “how to do a content audit” articles are plentiful, but this one focuses on how a card sort makes use of a content audit.

What is a content audit?

Content audits start off as inventories. They tend to be massive spreadsheets that contain, among other things, metadata about the content you’re keeping track of. For example, an inventory may start as a list of companies, occupations, or cities. Then someone may come along and start collecting empirical metadata around these things. The list becomes a list of the top global brand companies, the best occupations for 2017, or the most dangerous cities in the world.

Maybe you’re keeping track of what books you have, what you’ve read, what you haven’t read. Maybe you’re keeping an inventory of your kitchen pantry. Maybe you’re collecting a list of movies and films you should watch or keeping a bucket list of places you want to visit.

Consider the scope of these three inventories:

  • A full content inventory. A complete listing of all site content, including pages, images, videos, and PDFs. In a kitchen taxonomy, this would include everything in the kitchen: books, recipe binders, kitchen equipment, refrigerated items.
  • A partial content inventory. A subset listing of content slicing across the site. For example, most popular, site hierarchy, or items used within a defined period of time. A partial kitchen inventory would cover everything used in the past 6 months.
  • A content sample. A listing of example content from the site. For instance, a specific category or location. A content sample of kitchen inventory could cover the pantry or the spice cabinet.

Quantitative content inventories

Content inventories are the quantitative kind. The purpose of this list is to know how much content you have, and how many of each different kind. There are 12 countries I want to visit in Asia. I’ve already visited 10 of the 50 states in the United States. There are over 800 pages in the five websites that I’m consolidating.

Content inventories give you numbers to work with and provide a current state of affairs before you go making changes. You’ll be able to refer to this when you talk to content owners.
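If your inventory already lives in a spreadsheet, those numbers are easy to pull out programmatically. Here’s a minimal sketch; the file name and column names are hypothetical, assuming an export with url and content_type columns:

```python
import csv
from collections import Counter

# Tally pages by content type from a hypothetical inventory export.
# Assumes inventory.csv has at least 'url' and 'content_type' columns.
with open("inventory.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

counts = Counter(row["content_type"] for row in rows)

print(f"Total items: {len(rows)}")
for content_type, count in counts.most_common():
    print(f"{content_type}: {count}")
```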

Qualitative content audits

Content audits focus on the qualitative. You add an evaluation of the content (which is qualitative) to that initial simple inventory. The most common is ROT analysis: redundant, obsolete, and trivial. Another type of analysis looks at tone and voice. Kristina Halvorson, in her book Content Strategy for the Web, chunks qualitative data into six groups: usability, knowledge level, findability, actionability, audience, and accuracy.

The information you collect in your audit depends on what you want to know. If “easy to understand” is a KPI that you or the business wants to measure, then you’ll want to include readability scores. Every content audit is custom-fit for the purposes of each project.

In essence, content audits are lists of things you want to track and your assessment of that thing — whatever that thing is. Things can be physical content as well as digital. Are those spices too old and should be thrown out? Is that travel destination in the midst of political turmoil and should it be taken off the list (for now)? Is that movie now available for streaming? How many of those 800 pages are worth keeping or updating, and how many could we archive and take offline?
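If readability is one of the qualities you decide to track, you can calculate a rough score for each piece of content and record it as another audit column. Below is a small sketch using the standard Flesch reading-ease formula; the syllable counter is a crude heuristic, so a dedicated library (textstat, for example) would give better results:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count each run of vowels as one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    # Flesch reading ease: higher scores mean easier reading.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

sample = "ACC helps pay for care if you are injured. Read this page to learn how to claim."
print(round(flesch_reading_ease(sample), 1))
```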

Content audits are pivotal documents

Content audits are living documents. They need to be updated on a regular basis to maintain a certain level of content quality and relevance. They are pivotal documents, shared across various disciplines, used for various purposes.

Search engine optimization (SEO) tools create site crawls that capture page titles, URLs, page elements, and position within a site hierarchy. They are spreadsheets that look and feel deceptively like content inventories. And they essentially are.

Content audits are converging:

  • SEO specialists conduct SEO content audits to identify thin content, accessibility, indexability, duplicate content and such.
  • Content strategists and information architects conduct inventories and audits to determine what content exists, where it lives, when it was last updated, and who owns it.
  • Taxonomists mine content inventories for categories and content terminology.
  • Search analysts collect keywords to supplement site search.

Content audits are pivotal documents that have many different uses. Someone adds site analytics to the document, then readability scores, then BOOM! There are now even more ways to pivot the table — top landing pages, top pageviews, highest bounce rate, high word count, low word count, oldest content, newest content — where do you want to start?
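For example, with the combined audit loaded into pandas (the file and column names below are hypothetical), each of those views is a single sort or pivot away:

```python
import pandas as pd

# Load the combined audit (inventory + analytics + scores) from a hypothetical export
# with 'url', 'section', 'pageviews', 'bounce_rate' and 'word_count' columns.
audit = pd.read_csv("audit.csv")

# Top landing pages by pageviews.
print(audit.nlargest(10, "pageviews")[["url", "pageviews"]])

# Pages most likely to need attention: high bounce rate, low word count.
print(audit.sort_values(["bounce_rate", "word_count"], ascending=[False, True]).head(10))

# Pivot: average bounce rate per site section.
print(audit.pivot_table(index="section", values="bounce_rate", aggfunc="mean"))
```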


Designing for conversational user interfaces: A Q&A with Stratis Valachis

Stratis Valachis, senior user experience designer at Aviva’s Digital Innovation Garage, took some time out of his busy schedule to answer some questions about designing for conversational user interfaces (CUI). Learn more about his processes for research and design for CUI, what he thinks the future will look like, and some of the biggest challenges he’s faced while designing for CUI.

Stratis will be speaking at MUXL2017, the third annual conference on mobile user experience in London, on the 10th of November at City, University of London. Using case studies through talks and workshops, the conference will cover core UX principles as well as emerging topics such as AI (chatbots), VR/AR and IoT.

What does the research and design process for conversational interfaces look like?

Like any design project, you should always start by identifying user needs and real problems. Research how users solve that problem currently, and then evaluate for which use cases you can remove friction and enhance the experience by utilizing a conversational interface. Don’t try to chat-ify or voice-ify your product just because it’s a cool trend. In many ways conversational interfaces (CUIs), both voice and visual, have more usability constraints than a traditional GUI. For example, it’s hard to interrupt the conversation to recover from errors, you can’t easily skim through information, progress is linear, and you very often need to rely on recall.

Users make conscious compromises about which type of interface they want to use. This means that a solution utilizing a CUI needs to offer an obvious benefit for your chosen use case, otherwise users won’t use your product. That’s why special emphasis should be placed on early research about the context in which users will use your product and on why a CUI could provide a better experience.

When you begin the design phase, a good practice would be to craft a personality for your interface. Studies have shown that because humans are empathetic, they will assign human character attributes to your CUI anyway, so it’s better to make sure this is defined through design. This works really well for platforms like Google Home and Facebook Messenger, which make it clear to the user that each product built on them is a different entity from the default assistant. Some channels like Alexa, though, don’t make that distinction clear. In these cases, you need to make sure that the character of your CUI doesn’t significantly deviate from the personality of the default assistant, otherwise you’ll mess with users’ mental models and create confusion. For example, when you’re ordering an Uber with Alexa, it’s Alexa that speaks back to you: “Alexa, ask Uber for a ride.” “Sure, there’s an Uber less than a minute away, would you like me to order it?” On Google Home, by contrast, the Google Assistant makes it clear that it passes you over to Uber: “Hi, I’m Uber, how can I help?”

After you define the personality, start drafting out the core experience of your product. If you’re working on a visual CUI, type the conversation down like a screenplay. If you’re working with voice, act the dialogue out with your colleagues and use voice simulators to see how it feels in the channel you’re designing for. This will make it easier to decide the direction you’d like to follow and will also help you initiate conversations with stakeholders.

At this stage, you will be ready to start designing your user flows to define the functionality at a granular level. Again, understanding context is crucial. Make sure you think of the different scenarios in which users will interact with your product and the ways they’re likely to phrase their input. User testing is key for this.
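To make the “screenplay” and user-flow steps concrete, here is a toy sketch (ours, not Stratis’s) of how a drafted happy-path dialogue might be captured as a small state machine so the team can read it aloud and iterate on it; the intents and wording are invented for illustration:

```python
# Toy dialogue flow for a hypothetical ride-booking skill, modelled as a
# state machine so the drafted screenplay can be walked through and revised.
FLOW = {
    "start": {
        "prompt": "Hi, I can book you a ride. Where would you like to go?",
        "expects": "destination",
        "next": "confirm",
    },
    "confirm": {
        "prompt": "A car can be there in about two minutes. Shall I book it?",
        "expects": "yes_no",
        "next": {"yes": "booked", "no": "start"},
    },
    "booked": {
        "prompt": "Done! Your driver is on the way.",
        "expects": None,
        "next": None,
    },
}

def walk(flow, state="start"):
    """Print the drafted dialogue so the team can read it aloud like a screenplay."""
    while state:
        step = flow[state]
        print(f"BOT: {step['prompt']}")
        if step["expects"] is None:
            break
        # In a real prototype the user's reply would come from an NLP service;
        # here we simply follow the happy path.
        state = step["next"]["yes"] if isinstance(step["next"], dict) else step["next"]

walk(FLOW)
```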

What are some of the biggest challenges you've faced designing for CUI?

Setting the right expectations for users. That applies to both visual and voice interfaces. There’s a gap between the mental model users have of what most AI products with conversational interfaces can do, and what they are actually capable of doing. That’s a common pattern I’ve seen in user testing sessions, even with users who had previous experience in the conversational channel being tested. As a designer, your challenge is to make the affordances and constraints clear in a way that feels like a natural part of the conversation and mitigates disappointment from unrealistic expectations.

Another challenge is trying to cater for all the different ways people will phrase their requests. The key here is to invest time and resources in user research and NLP (natural language processing) services. If you feel that this is out of scope for your project, you may consider limiting the options for your users, as trying to guide them to say things in a certain way will not work. Good examples of this are Facebook Messenger bots, which now allow developers to remove the input field entirely from the experience in order to prevent users from making requests that can’t be supported.
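As an illustration of constraining input, a Messenger bot can also offer quick replies so users tap a supported option instead of typing free text. The payload below is a rough sketch of what that looked like in the 2017-era Messenger Send API; field names should be checked against the current platform documentation:

```python
# Rough sketch of a Messenger Send API payload offering quick replies,
# so users pick a supported option rather than typing an unsupported request.
# Field names are based on the 2017-era Messenger Platform; verify against current docs.
message = {
    "recipient": {"id": "<PSID>"},  # page-scoped user ID, filled in at runtime
    "message": {
        "text": "What would you like to do?",
        "quick_replies": [
            {"content_type": "text", "title": "Get a quote", "payload": "GET_QUOTE"},
            {"content_type": "text", "title": "Track a claim", "payload": "TRACK_CLAIM"},
        ],
    },
}
# This dict would be POSTed to the Send API endpoint with the page access token.
```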

How do you think CUI is going to change the way designers and researchers do their work?

It might require designers and researchers to slightly alter some techniques they're using (for example thinking aloud during user testing doesn't work with voice interfaces) but the fundamentals will stay the same. You still need to focus on understanding the problem, explore different solutions through divergent thinking, converge, develop and continuously iterate based on user feedback. The exciting thing is that these new technologies significantly expand our toolbox and offer new interesting ways to solve problems for our users.

What improvements to this kind of technology do you wish to see? How would you like this technology to progress in the future?

I would like to see a more widespread integration of voice interfaces with visuals and GUI interaction patterns. A good example of the benefits of this approach is Amazon's Fire TV. Users can converse with the system via voice when it's more efficient than the alternative interaction options (for example, searching for a movie) but use their remote control to interact with visual UI elements for tasks that would be tedious to perform through voice. For example, selecting a movie cover to reveal descriptive text and then skimming through it helps you gauge whether the plot is interesting faster than if you had to consume this information through a conversation. This hybrid approach utilizes the best of each world to create a stronger experience. I think we will see this type of interface a lot more in the future. Think of Iron Man and J.A.R.V.I.S.

Any advice for young designers and researchers hoping to get into this part of the industry?

Invest time in learning best practices for crafting good dialogue. It's a crucial skill for designers in this field. Google and Amazon's design guidelines are a good starting point. This doesn't mean you should omit training and improving your knowledge in usability for traditional interfaces. Most of the principles are time-proof and channel agnostic and will help you greatly with conversational interfaces.Another thing you should make sure you do is stay up to date with the latest trends. The technology evolves very fast so you need to stay ahead of curve. Attend meetups, work on personal projects and participate in hackathons to practice and learn from the experts.As long as you're really passionate about the field, there will be plenty of opportunities for you to get involved and contribute. We're still in the early stages of mainstream adoption of the technology, so we have the chance to make significant impact on the evolution of the field and shape best practices for years to come, which is really exciting!


New to Reframer: Tag groups and custom colors

We’ve been working hard over the last few weeks on some nifty little changes to the way you can use, create and manage tags in Reframer. Today, we’re happy to announce that all these changes are now live. Yay!

Now you can group your tags and add custom colors. Learn a little bit about how these new features work and how you can use them below.

More flexibility with your tags

We made these changes so that the notetaking and analysis stages of your research projects are simpler. Previously, Reframer tags were all the same shade of blue and organized alphabetically — not always the easiest way to find the tags you’re looking for. Now you can add as many tag groups as you like to your Reframer sessions to make managing and finding your tags much quicker and easier. Even better, you can also apply colors to these groups. Pick from our selected palette of colors, or choose a custom one using a hex color code.

Grouping your tags gives you the freedom to organize your research observations in a way that makes sense for your project. Here are some of the many ways you can group and color your research observations:

  • Based on sentiment (happy, sad, angry, confused etc.)
  • Based on products
  • Based on tasks — for example, testing a certain section or feature of a website
  • Based on devices used
  • Based on defining personas
  • Based on period of time/time of day

These new tag features give you a lot of flexibility — each group doesn’t have to contain only one color. You can add multiple colors within one group (for example, within a ‘sentiment’ group you could select red for ‘angry’, green for ‘happy’, etc.).

For a bit more guidance on how to write effective research observations in Reframer, check out this article we wrote for UX Mastery.

We hope you like these new tag features for Reframer. As always, if you have any feedback please let us know at support@optimalworkshop.com. Alternatively, comment below with some of the ways you’ve used these new features in your research.

