"Simple to use. Easy for participants to access and complete. But most importantly - the auto-magical analysis"
Paul is the Lead User Experience Designer for the enterprise division of MYOB, a market-leading Australasian company building desktop and cloud-based business software. Paul has been collaborating with partners in Norway and Sweden on a series of OptimalSort studies to refine the usability of MYOB’s stand-out enterprise product: a solution that allows large companies to run the whole business from the cloud (including sales, finance, CRM, HR, logistics and IT).
Here’s how they’re doing it.
It’s my job to make our existing enterprise products as user-friendly as possible, and to infuse our product development cycle with a strong sense of who it’s all for.
Our product can span a customer’s entire business, so we design for a huge range of people — from mobile and office-based salespeople and relationship managers, to logistics and payroll administrators, IT personnel, executive managers, and, well, everyone else.
I work in three teams: locally in our Auckland office with our product managers, software developers and delivery managers; with the rest of the UX team across our Melbourne and Sydney offices; and with our business partners in the US/Russia (Acumatica) and Norway/Sweden (Visma).
In essence, I want to uphold the user-centered design philosophy so that user adoption of our products is as high as possible.
Our project was to look into the efficiency of the labelling and navigation of MYOB Advanced, our stand-out cloud enterprise product.
As part of our ongoing UX practice, we ran a series of usability studies on the product, and discovered that the top-level IA — the main menu that leads people to 40 key modules — just wasn’t working for people as well as we wanted it to.
We wanted to find out from our customers how they thought we should structure the product — after all, it’s them who will be interacting with it every day. We knew OptimalSort would make getting this data easy.
We also had important demographic questions to ask, and OptimalSort took care of the whole process, from the open card sort to the end survey. Having everything integrated in one package (rather than, say, using Google Forms for the survey) made it easier for us to keep the information together, and much simpler for participants, who were guided through the process from beginning to end.
As far as study set-up went, all we had to do was create a new project, write some intro text, add our module labels to the cards, and then add the questions. We got data from 40 participants on how we could structure the top-level menu. Our goal was to compare a user-generated menu with our existing product’s menu, so we explained to participants why we needed their help. I think any customer would get a kick out of being asked for their perspective on a product’s navigation.
We also recruited more participants than we usually do for in-person usability studies, because we were aiming for statistical significance. Our participants were existing customers and partners, and we already knew the majority from previous user testing, so contacting them by email was simple.
Almost as soon as results started coming in, we got stuck into exploring OptimalSort’s results platform. We got great insights from the dendrograms, similarity matrices, and so on. We could easily see common groupings, and we quickly found out what the consensus was on particular structures. From the results we came up with ideas for grouping and labelling the information in ways the majority of our users agreed on.
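To give a sense of what a similarity matrix captures: it records, for each pair of cards, the percentage of participants who placed them in the same group. A minimal sketch of that calculation (the module labels, group names, and data below are purely illustrative, not MYOB’s actual cards or OptimalSort’s internals):

```python
from itertools import combinations

# Hypothetical card-sort results: each participant maps cards to a group.
sorts = [
    {"Invoices": "Finance", "Payroll": "HR", "Leave": "HR"},
    {"Invoices": "Money", "Payroll": "Money", "Leave": "People"},
    {"Invoices": "Finance", "Payroll": "HR", "Leave": "HR"},
]

cards = sorted(sorts[0])

# Similarity = % of participants who placed a pair of cards together.
similarity = {}
for a, b in combinations(cards, 2):
    together = sum(1 for s in sorts if s[a] == s[b])
    similarity[(a, b)] = round(100 * together / len(sorts))

for pair, pct in sorted(similarity.items()):
    print(pair, f"{pct}%")
```

Pairs with high percentages (here, Leave and Payroll) are the ones participants agree belong together, which is what the dendrograms then cluster.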
There were also lots of little touches we noticed, like being able to see how many attempts participants had made (and given up on), which made it easy to follow up with them. We could also see at a glance who we were still waiting on!
We were able to share our results in a couple of different ways, depending on who was involved: by sharing a private link, or by adding the data visualizations to presentations.
I think being able to present dendrograms, similarity matrices and PCA plots gave our stakeholders confidence that the research was trustworthy and valuable.