Meera Pankhania: From funding to delivery - Ensuring alignment from start to finish
It’s a chicken and egg situation when it comes to securing funding for a large transformation program in government. On one hand, you need to submit a business case and, as part of that, you need to make early decisions about how you might approach and deliver the program of work. On the other hand, you need to know enough about the problem you are going to solve to ensure you have sufficient funding to understand the problem better, hire the right people, design the right service, and build it the right way.
Now imagine securing hundreds of millions of dollars to design and build a service, but not feeling confident about what the user needs are. What if you had the opportunity to change this common predicament and influence your leadership team to carry out alignment activities, all while successfully delivering within the committed time frames?
Meera Pankhania, Design Director and Co-founder of Propel Design, recently spoke at UX New Zealand, the leading UX and IA conference in New Zealand hosted by Optimal Workshop, on traceability and her learnings from delivering a $300 million Government program.
In her talk, Meera helps us understand how to use service traceability techniques in our work and apply them to any environment - ensuring we design and build the best service possible, no matter the funding model.
Background on Meera Pankhania
As a design leader, Meera is all about working on complex, purpose-driven challenges. She helps organizations take a human-centric approach to service transformation and helps deliver impactful, pragmatic outcomes while building capability and leading teams through growth and change.
Meera co-founded Propel Design, a strategic research, design, and delivery consultancy in late 2020. She has 15 years of experience in service design, inclusive design, and product management across the private, non-profit, and public sectors in both the UK and Australia.
Meera is particularly interested in policy and social design. After a stint in the Australian Public Service, Meera was appointed as a senior policy adviser to the NSW Minister for Customer Service, Hon. Victor Dominello MP. In this role, she played a part in NSW’s response to the COVID pandemic, flexing her design leadership skills in a new, challenging, and important context.
From funding to delivery: ensuring alignment from start to finish 🏁🎉👏
Meera’s talk explores a fascinating case study within the Department of Employment Services (Australia), where a substantial funding investment of around $300 million set the stage for a transformative journey. This funding supported the delivery of a revamped Employment Services Model, which had the goal of delivering better services to job seekers and employers, and a better experience for the providers operating within the system. The project had a focus on aligning teams prior to delivery, which resulted in a huge amount of groundwork for Meera.
Her journey involved engaging various stakeholders within the department, including executives, to understand the program as a whole and what exactly needed to be delivered. “Traceability” became the watchword for this project, which Meera lays out in three phases:
Phase 1: Aligning key deliverables
Phase 2: Ensuring delivery readiness
Phase 3: Building sustainable work practices
Phase 1: Aligning key deliverables 🧮
Research and discovery (pre-delivery)
Meera’s work initially meant conducting extensive research and engagement with executives, product managers, researchers, designers, and policymakers. Through this process, a common theme was identified – the urgent (and perhaps misguided) need to start delivering! Often, organizations focus on obtaining funding without adequately understanding the complexities involved in delivering the right services to the right users, leading to half-baked delivery.
After this initial research, some general themes started to emerge:
Assumptions were made that still needed validation
Teams weren’t entirely sure that they understood the user’s needs
There was no holistic understanding of how much research and design was needed
The conclusion of this phase was that “what” needed to be delivered wasn’t clearly defined. The same was true for “how” it would be delivered.
Traceability
Meera’s journey heavily revolved around the concept of “traceability” and sought to ensure that every step taken within the department was aligned with the ultimate goal of improving employment services. Traceability meant having a clear origin and development path for every decision and action taken. This is particularly important when spending taxpayer dollars!
So, over the course of eight weeks (which ultimately stretched much longer), the team combed through documents in an effort to bring everything together and make sense of the program as a whole. This involved planning, user journey mapping, and testing and refinement.
Documenting Key Artifacts
Numerous artifacts and documents played a crucial role in shaping decisions. Meera and her team gathered and organized these artifacts, including policy requirements, legislation, business cases, product and program roadmaps, service maps, and blueprints. The team also included prior research insights and vision documents which helped to shape a holistic view of the required output.
After combing through the program documents and laying everything out, it became clear that there were a lot of gaps and a LOT to do.
Prioritising tasks
As a result of these gaps, a process of task prioritization was necessary. Tasks were categorized based on a series of factors and then mapped against things like user touchpoints, pain points, features, business policy, and technical capabilities.
This then enabled Meera and the team to create Product Summary Tiles. Each product team had its own summary tile ahead of a series of planning sessions, giving them as much context (provided by the traceability exercise) as possible to help with planning. Essentially, these tiles provided teams with a comprehensive overview of their projects: what their users need, what certain policies require them to deliver, and so on.
Phase 2: Ensuring delivery readiness 🙌🏻
Meera wanted every team to feel confident that they weren’t doing too much or too little in order to design and build the right service, the right way.
Standard design and research check-ins were well adopted, which was a great start, but Meera and the team also built a Delivery Readiness Tool to assess a team’s readiness to move forward with a project. The tool includes questions related to the development phase, user research, alignment with the business case, consideration of policy requirements, and more. Ultimately, it ensures that teams have considered all necessary factors before progressing further.
Phase 3: Building sustainable work practices 🍃
As the program progressed, several sustainable work practices emerged which Government executives were keen to retain going forward.
Some of these included:
ResearchOps Practice: The team established a research operations practice, streamlining research efforts and ensuring that ongoing research was conducted efficiently and effectively.
Consistent Design Artifacts: Templates and consistent design artifacts were created, reducing friction and ensuring that teams going forward started from a common baseline.
Design Authority and Ways of Working: A design authority was established to elevate and share best practices across the program.
Centralized and Decentralized Team Models: The program showcased the effectiveness of a combination of centralized and decentralized team models. A central design team provided guidance and support, while service design leads within specific service lines ensured alignment and consistency.
Why it matters 🔥
Meera's journey serves as a valuable resource for those working on complex design programs, emphasizing the significance of aligning diverse stakeholders and maintaining traceability. Alignment and traceability are critical to ensuring that programs never lose sight of the problem they’re trying to solve, both from the user and organization’s perspective. They’re also critical to delivering on time and within budget!
Traceability key takeaways 🥡
Early Alignment Matters: While early alignment is ideal, it's never too late to embark on a traceability journey. It can uncover gaps, increase confidence in decision-making, and ensure that the right services are delivered.
Identify and audit: You never know what artifacts will shape your journey. Identify everything early, and don’t be afraid to get clarity on things you’re not sure about.
Conducting traceability is always worthwhile: Even if you don’t find many gaps in your program, you will at least gain a high level of confidence that your delivery is focused on the right things.
Delivery readiness key takeaways 🥡
Skills Mix is Vital: Assess and adapt team member roles to match their skills and experiences, ensuring they are positioned optimally.
Not Everyone Shares the Same Passion: Recognize that not everyone will share the same level of passion for design and research. Make the relevance of these practices clear to all team members.
Sustainability key takeaways 🥡
One Size Doesn't Fit All: Tailor methodologies, templates, and practices to the specific needs of your organization.
Collaboration is Key: Foster a sense of community and collective responsibility within teams, encouraging shared ownership of project outcomes.
At Optimal Workshop, we're dedicated to building the best user research platform to empower you with the tools to better understand your customers and create intuitive digital experiences. We're thrilled to announce some game-changing updates and new products that are on the horizon to help elevate the way you gather insights and keep customers at the heart of everything you do.
What’s new…
Integration with Figma 🚀
Last month, we joined forces with design powerhouse Figma to launch our integration. You can import images from Figma into Chalkmark (our click-testing tool) in just a few clicks, streamlining your workflow and giving you the insights to make decisions based on data, not hunches and opinions.
What’s coming next…
Session Replays 🧑‍💻
With session replay you can focus on other tasks while Optimal Workshop automatically captures card sort sessions for you to watch in your own time. Gain valuable insights into how participants engage and interpret a card sort without the hassle of running moderated sessions. The first iteration of session replays captures the study interactions, and will not include audio or face recording, but this is something we are exploring for future iterations. Session replays will be available in tree testing and click-testing later in 2024.
Reframer Transcripts 🔍
Say goodbye to juggling note-taking and hello to more efficient ways of working with Transcripts! We’re continuing to add more capability to Reframer, our qualitative research tool, which now includes the importing of interview transcripts. Save time and reduce human errors and oversights by importing transcripts, then tagging and analyzing observations, all within Reframer. We’re committed to building on transcripts with video and audio transcription capability in the future, and we’ll keep you in the loop on when to expect those releases.
Prototype testing 🧪
The team is fizzing to be working on a new Prototype testing product designed to expand your research methods and help you test prototypes easily from the Optimal Workshop platform. Testing prototypes early and often is an important step in the design process, saving you time and money before you invest too heavily in the build. We are working with customers on delivering the first iteration of this exciting new product. Stay tuned for Prototypes coming in the second quarter of 2024.
Workspaces 🎉
Making Optimal Workshop easier for large organizations to manage teams and collaborate more effectively on projects is a big focus for 2024. Workspaces are the first step towards empowering organizations to better manage multiple teams and their projects. Projects will allow greater flexibility over who can see what, encouraging working in the open and collaboration, alongside the ability to make projects private. The privacy feature is available on Enterprise plans.
Questions upgrade❓
Our survey product Questions is in for a glow up in 2024 💅. The team is enjoying working with customers, collecting and reviewing feedback on how to improve Questions, and will be sharing more on this in the coming months.
Help us build a better Optimal Workshop
We are looking for new customers to join our research panel to help influence product development. From time to time, you’ll be invited to join us for interviews or surveys, and you’ll be rewarded for your time with a thank-you gift. If you’d like to join the team, email product@optimalworkshop.com
In the last few years, the influence of AI has steadily been expanding into various aspects of design. In early 2023, that expansion exploded. AI tools and features are now everywhere, and there are two ways designers commonly react to it:
With enthusiasm for how they can use it to make their jobs easier
With skepticism over how reliable it is, or even fear that it could replace their jobs
Google UX researcher Clara Kliman-Silver is at the forefront of researching and understanding the potential impact of AI on the future of design. This is a hot topic on the radar of many designers as they grapple with what the new normal is, and how it will change things in the coming years.
Clara’s background
Clara Kliman-Silver spends her time studying design teams and systems, UX tools and designer-developer collaboration. She’s a specialist in participatory design and uses generative methods to investigate workflows, understand designer-developer experiences, and imagine ways to create UIs. In this work, Clara looks at how technology can be leveraged to help people make things, and do it more efficiently than they currently are.
In today’s context, that puts generative AI and machine learning right in her line of sight. The way this technology has boomed in recent times has many people scrambling to catch up: to identify the biggest opportunities and to understand the risks that come with it. Clara is a leader in assessing the implications of AI. She analyzes both the technology itself and the way people feel about it to forecast what it will mean in the future.
What role should artificial intelligence play in the UX design process? 🤔
Clara’s expertise in understanding the role of AI in design comes from significant research and analysis of how the technology is currently being used and how industry experts feel about it. AI is everywhere in today’s world, from home devices to tech platforms and industry-specific tools. In many cases, AI automation is used for productivity, where it can speed up processes with subtle, easy-to-use applications.
As mentioned above, the transformational capabilities of AI are met with equal parts enthusiasm and skepticism. The way people use AI, and how they feel about it, is important, because users need to be comfortable with the technology before it can make a difference. The question of what value AI brings to the design process is ongoing. On one hand, AI can help increase the efficiency of systems and processes. On the other hand, it can exacerbate problems if the user’s intentions are misunderstood.
Access for all 🦾
There’s no doubt that AI tools enable novices to perform tasks that, in years gone by, required a high level of expertise. For example, film editing was previously a manual task, where people would literally cut rolls of film and splice them together on a reel. It was something only a trained editor could do. Now, anyone with a smartphone has access to iMovie or a similar app, and they can edit film in seconds.
For film experts, digital technology allows them to speed up tedious tasks and focus on more sophisticated aspects of their work. Clara hypothesizes that AI is particularly valuable when it automates mundane tasks. AI enables more individuals to leverage digital technologies without requiring specialist training. Thus, AI has shifted the landscape of what it means to be an “expert” in a field. Expertise is about more than being able to simply do something - it includes having the knowledge and experience to do it for an informed reason.
Research and testing 🔬
Clara performs a lot of concept testing, which involves assessing the perceived value of an approach or method. Concept testing helps in scenarios where a solution may not address a problem or where the real problem is difficult to identify. From a recent survey, Clara describes two predominant benefits designers experienced from AI:
Efficiency. Not only does AI expedite the problem solving process, it can also help efficiently identify problems.
Innovation. Generative AI can innovate on its own, developing ideas that designers themselves may not have thought of.
The design partnership 🤝🏽
Overall, Clara says UX designers tend to see AI as a creative partner. However, most users don’t yet trust AI enough to give it complete agency over the work it’s used for. The level of trust designers have sits on a continuum, depending on the nature of the work and the context of what they’re aiming to accomplish. Other factors, such as where the tech comes from, who curated it, and who’s training the model, also influence trust. For now, AI is largely seen as a valued tool, and there is cautious optimism and tentative acceptance for its application.
Why it matters 💡
AI is potentially one of the biggest game-changers of our generation in how people work. Although AI has widespread applications across sectors and systems, there are still many questions about it. In the design world, systems like DALL-E allow people to create AI-generated imagery, and auto layout in various tools allows designers to iterate more quickly and efficiently.
Like many other industries, designers are wondering where AI might go in the future and what it might look like. The answer to these questions has very real implications for the future of design jobs and whether they will exist. In practice, Clara describes the current mood towards AI as existing on a continuum between adherence and innovation:
Adherence is about how AI helps designers follow best practice
Innovation is at the other end of the spectrum, and involves using AI to figure out what’s possible
The current environment is extremely subjective, and there’s no agreed best practice. This makes it difficult to recommend a certain approach to adopting AI and creating permanent systems around it. Both the technology and the sentiment around it will evolve through time, and it’s something designers, like all people, will need to maintain good awareness of.
With UX research so closely tied to product success, setting up a dedicated research practice is fast becoming important for many organizations. It’s not an easy process, especially for organizations that have had little to do with research, but the end goal is worth the effort.
But where exactly are you supposed to start? This article provides 6 key things to keep in mind when setting up a research practice, and should hopefully ensure you’ve considered all of the relevant factors.
1) Work out what your organization needs
The first and most simple step is to take stock of the current user research situation within the organization. How much research is currently being done? Which teams or individuals are talking to customers on an ongoing basis? Consider if there are any major pain points with the current way research is being carried out or bottlenecks in getting research insights to the people that need them. If research isn't being practiced, identify teams or individuals that don't currently have access to the resources they need, and consider ways to make insights available to the people that need them.
2) Consolidate your insights
UX research should be communicating with nearly every part of an organization, from design teams to customer support, engineering departments and C-level management. The insights that stem from user research are valuable everywhere. Of course, the opposite is also true: insights from support and sales are useful for understanding customers and how the current product is meeting people's needs.
When setting up a research practice, identify which teams you should align with, and then reach out. Sit down with these teams and explore how you can help each other. For your part, you’ll probably need to explain the what and why of user research within the context of your organization, and possibly even explain at a basic level some of the techniques you use and the data you can obtain.
Then, get in touch with other teams with the goal of learning from them. A good research practice needs a strong connection to other parts of the business with the express purpose of learning. For example, by working with your organization’s customer support team, you’ll have a direct line to some of the issues that customers deal with on a regular basis. A good working relationship here means they’ll likely feed these insights back to you, in order to help you frame your research projects.
By working with your sales team, they’ll be able to share issues prospective customers are dealing with. You can follow up on this information with research, the results of which can be fed into the development of your organization’s products.
It can also be fruitful to develop an insights repository, where researchers can store any useful insights and log research activities. This means that sales, customer support and other interested parties can access the results of your research whenever they need to.
When your research practice is tightly integrated with other key areas of the business, the organization is likely to see innumerable benefits from the insights-to-product loop.
3) Figure out which tools you will use
By now you’ve hopefully got an idea of how your research practice will fit into the wider organization – now it’s time to look at the ways in which you’ll do your research. We’re talking, of course, about research methods and testing tools.
We won’t get into every different type of method here (there are plenty of other articles and guides for that), but we will touch on the importance of qualitative and quantitative methods. If you haven’t come across these terms before, here’s a quick breakdown:
Qualitative research – Focused on exploration. It’s about discovering things we cannot measure with numbers, and often involves speaking with users through observation or user interviews.
Quantitative research – Focused on measurement. It’s all about gathering data and then turning this data into usable statistics.
All user research methods are designed to deliver either qualitative or quantitative data, and as part of your research practice, you should ensure that you always try to gather both types. By using this approach, you’re able to generate a clearer overall picture of whatever it is you’re researching.
Next comes the software. A solid stack of user research testing tools will help you to put research methods into practice, whether for the purposes of card sorting, carrying out more effective user interviews or running a tree test.
There are myriad tools available now, and it can be difficult to separate the useful software from the chaff. Here’s a list of research and productivity tools that we recommend.
Tools for research
Here’s a collection of research tools that can help you gather qualitative and quantitative data, using a number of methods.
Treejack – Tree testing can show you where people get lost on your website, and help you take the guesswork out of information architecture decisions. Treejack makes it easy to set up and run tree tests, and pairs this with in-depth analysis features.
dScout – Imagine being able to get video snippets of your users as they answer questions about your product. That’s dScout. It’s a video research platform that collects in-context “moments” from a network of global participants, who answer your questions either by video or through photos.
Ethnio – Like dScout, this is another tool designed to capture information directly from your users. It works by showing an intercept pop-up to people who land on your website. Then, once they agree to take part, it runs them through some form of research.
OptimalSort – Card sorting allows you to get perspective on whatever it is you’re sorting and understand how people organize information. OptimalSort makes it easier and faster to sort through information, and you can access powerful analysis features.
Reframer – Taking notes during user interviews and usability tests can be quite time-consuming, especially when it comes time to analyze the data. Reframer gives individuals and teams a single tool to store all of their notes, along with a set of powerful analysis features to make sense of their data.
Chalkmark – First-click testing can show you what people click on first in a user interface when they’re asked to complete a task. This is useful, as when people get their first click correct, they’re much more likely to complete their task. Chalkmark makes the process of setting up and running a first-click test easy. What’s more, you’re given comprehensive analysis tools, including a click heatmap.
Tools for productivity
These tools aren’t necessarily designed for user research, but can provide vital links in the process.
Whimsical – A fantastic tool for user journeys, flow charts and any other sort of diagram. It also solves one of the biggest problems with online whiteboards – finicky object placement.
Descript – Easily transcribe your interview and usability test audio recordings into text.
Google Slides – When it inevitably comes time to present your research findings to stakeholders, use Google Slides to create readable, clear presentations.
4) Figure out how you’ll track findings over time
With some idea of the research methods and testing tools you’ll be using to collect data, now it’s time to think about how you’ll manage all of this information. A carefully ordered spreadsheet and folder system can work – but only to an extent. Dedicated software is a much better choice, especially given that you can scale these systems much more easily.
A dedicated home for your research data serves a few distinct purposes. There’s the obvious benefit of being able to access all of your findings whenever you need them, which means it’s much easier to create personas if the need arises. A dedicated home also means your findings will remain accessible and useful well into the future.
When it comes to software, Reframer stands as one of the better options for creating a detailed customer insights repository as you’re able to capture your sessions directly in the tool and then apply tags afterwards. You can then easily review all of your observations and findings using the filtering options. Oh, and there’s obviously the analysis side of the tool as well.
If you’re looking for a way to store high-level findings – perhaps if you’re intending to share this data with other parts of your organization – then a tool like Confluence or Notion is a good option. These tools are basically wikis, and include capable search and navigation options too.
5) Where will you get participants from?
A pool of participants you can draw from for your user research is another important part of setting up a research practice. Whenever you need to run a study, you’ll have real people you can call on to test, ask questions and get feedback from.
This is where you’ll need to partner with other teams, likely sales and customer support. They’ll have direct access to your customers, so make sure to build a strong relationship with these teams. If you haven’t made introductions yet, it can be helpful to put together a one-page sheet of information explaining what UX research is and the benefits of working with your team.
You may also want to consider getting in some external help. Participant recruitment services are a great way to offload the heavy lifting of sourcing quality participants – often one of the hardest parts of the research process.
6) Work out how you'll communicate your research
Perhaps one of the most important parts of being a user researcher is taking the findings you uncover and communicating them back to the wider organization. By feeding insights back to product, sales and customer support teams, you’ll form an effective link between your organization’s customers and your organization. The benefits here are obvious. Product teams can build products that actually address customer pain points, and sales and support teams will better understand the needs and expectations of customers.
Of course, it isn’t easy to communicate findings. Here are a few tips:
Document your research activities: With a clear record of your research, you’ll find it easier to pull out relevant findings and communicate these to the right teams.
Decide who needs what: You’ll probably find that certain roles (like managers) will be best served by a high-level overview of your research activities (think a one-page summary), while engineers, developers and designers will want more detailed research findings.